Anoncheg<p>Part2: <a href="https://techhub.social/tags/dailyreport" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>dailyreport</span></a> <a href="https://techhub.social/tags/negativesampleing" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>negativesampleing</span></a> <a href="https://techhub.social/tags/sampling" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sampling</span></a> <a href="https://techhub.social/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://techhub.social/tags/recsys" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>recsys</span></a> <br> the target word w and the negative samples.</p><p>As binary classification: negative sampling transforms<br> the problem into a series of binary classification tasks,<br> in which the model learns to distinguish between positive<br> and negative samples.</p><p>Example: take the sentence "The dog is playing with a bone."<br> With a context window wide enough to cover the sentence,<br> the positive samples for the target word "dog" would be:<br>- ("dog", "The")<br>- ("dog", "is")<br>- ("dog", "playing")<br>- ("dog", "with")<br>- ("dog", "a")<br>- ("dog", "bone")</p><p>Negative samples: ("dog", "car"), ("dog", "apple"),<br> ("dog", "house"), ("dog", "tree")</p><p>Objective for the pair ("dog", "bone"): log σ(v_dog⋅v_bone) +<br> log σ(−v_dog⋅v_car) + log σ(−v_dog⋅v_apple) +<br> log σ(−v_dog⋅v_house) + log σ(−v_dog⋅v_tree)</p>
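The objective above can be sketched numerically. A minimal NumPy sketch, assuming toy 4-dimensional embeddings with made-up random values; the names `v_dog`, `negs`, and `sgns_loss` are illustrative, not from any library:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(v_target, v_pos, v_negs):
    """Skip-gram negative-sampling objective for one (target, context) pair:
    log sigma(v_pos . v_target) + sum_k log sigma(-v_neg_k . v_target).
    All log-sigmoid terms are negative; training maximizes the total."""
    pos = np.log(sigmoid(np.dot(v_pos, v_target)))
    neg = sum(np.log(sigmoid(-np.dot(v_n, v_target))) for v_n in v_negs)
    return pos + neg

# Toy embeddings standing in for "dog", "bone", and the four negatives
# ("car", "apple", "house", "tree"); values are random, for illustration only.
rng = np.random.default_rng(0)
v_dog = rng.normal(size=4)
v_bone = rng.normal(size=4)
negs = [rng.normal(size=4) for _ in range(4)]

print(sgns_loss(v_dog, v_bone, negs))
```

Each negative pair contributes log σ(−v_neg⋅v_target), pushing the model to score random words as non-contexts, while the single positive term pulls true context words closer.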