A question about sampling #1

Open
Li-Zheng-94 opened this issue Apr 12, 2022 · 1 comment
@Li-Zheng-94

Thanks for your great work.

I have a question about sampling.

In your paper, you use hard negative (HN) and soft negative (SN) sampling to select negative items.
"In HN, the negative item in each triplet is selected as the closest to the anchor in a batch."
"SN refers to picking the furthest negative item to the anchor within the batch."

The sampling methods described in the paper seem to be the opposite of what the code implements. Is there something wrong in this part of the code?

elif metric_samples == 'hard':
    ...
elif metric_samples == 'soft':
    ...
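
For reference, here is a minimal sketch of the two strategies as I understand them from the paper (only an illustration, assuming cost_s is a batch-by-batch distance matrix; the function name is hypothetical, not from the repo):

import torch

def pick_negative_indexes(cost_s, metric_samples):
    # cost_s: (batch, batch) pairwise distances; row i holds the
    # distances from anchor i to every item in the batch.
    if metric_samples == 'hard':
        # Closest item to the anchor: k=2 and [:, 1] skip index 0,
        # which would be the anchor's own (smallest) entry.
        return torch.topk(cost_s, 2, dim=1, largest=False)[1][:, 1]
    elif metric_samples == 'soft':
        # Furthest item from the anchor within the batch.
        return torch.topk(cost_s, 1, dim=1, largest=True)[1][:, 0]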

@AndresPMD
Owner

@Li-Zheng-94
Thanks for showing interest in our work.

According to the documentation at https://pytorch.org/docs/stable/generated/torch.topk.html, if largest is False then the k smallest elements are returned, i.e., the items closest in distance to the anchor. This is what we use in HARD NEGATIVE sampling.

You can see it clearly here:
elif metric_samples == 'hard':
    hard_neg_indexes = (torch.topk(cost_s, 2, dim=1, largest=False)[1][:, 1],
                        torch.topk(cost_im, 2, dim=1, largest=False)[1][:, 1])
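
To illustrate with a toy example (not from the repo; it assumes cost_s holds pairwise distances with zeros on the diagonal):

import torch

# Toy 3x3 distance matrix: row i holds distances from anchor i to each batch item.
cost = torch.tensor([[0.0, 0.3, 0.9],
                     [0.3, 0.0, 0.5],
                     [0.9, 0.5, 0.0]])

# k=2 smallest per row; index 0 of the result is the anchor itself (distance 0),
# so [:, 1] keeps the second smallest: the closest *other* item in the batch.
hard_neg = torch.topk(cost, 2, dim=1, largest=False)[1][:, 1]
print(hard_neg)  # tensor([1, 0, 1]) -> nearest non-self item per anchor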

Let me know if I understand the question correctly.
