
about match_embds function #33

Open
xiexiaozheng opened this issue Aug 1, 2024 · 1 comment
@xiexiaozheng

@zhang-tao-whu Hello, I have a question about the tracker part, specifically the match_embds function: why is the cosine similarity computed from only one sample in the batch, as shown in the following code?
```python
def match_embds(self, ref_embds, cur_embds):
    #  embeds (q, b, c)
    ref_embds, cur_embds = ref_embds.detach()[:, 0, :], cur_embds.detach()[:, 0, :]  # only one sample in a batch
    ref_embds = ref_embds / (ref_embds.norm(dim=1)[:, None] + 1e-6)
    cur_embds = cur_embds / (cur_embds.norm(dim=1)[:, None] + 1e-6)
    cos_sim = torch.mm(ref_embds, cur_embds.transpose(0, 1))
    C = 1 - cos_sim

    C = C.cpu()
    C = torch.where(torch.isnan(C), torch.full_like(C, 0), C)

    indices = linear_sum_assignment(C.transpose(0, 1))
    indices = indices[1]
    return indices
```

@KaihongLi

The training batch size is 1 (each batch contains 5 images for training), so indexing the single sample with `[:, 0, :]` already covers the whole batch.
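
For illustration, here is a minimal, self-contained sketch (not the repository's code; the shapes and the standalone function name are assumptions) showing that with a batch dimension of 1, slicing `[:, 0, :]` keeps all `q` query embeddings, and the Hungarian assignment then matches current queries to reference queries by cosine distance:

```python
import torch
from scipy.optimize import linear_sum_assignment

def match_embds_single_batch(ref_embds, cur_embds):
    # Both tensors have shape (q, b, c) with b == 1 during training,
    # so [:, 0, :] keeps all q query embeddings of the only sample.
    ref_embds = ref_embds.detach()[:, 0, :]
    cur_embds = cur_embds.detach()[:, 0, :]

    # L2-normalize so the dot product equals cosine similarity.
    ref_embds = ref_embds / (ref_embds.norm(dim=1, keepdim=True) + 1e-6)
    cur_embds = cur_embds / (cur_embds.norm(dim=1, keepdim=True) + 1e-6)

    # Cost matrix: 1 - cosine similarity, shape (q_ref, q_cur).
    C = 1 - torch.mm(ref_embds, cur_embds.transpose(0, 1))
    C = torch.nan_to_num(C.cpu(), nan=0.0)

    # Hungarian matching on the transposed cost (q_cur, q_ref);
    # the column indices give, for each current query, its matched reference query.
    _, indices = linear_sum_assignment(C.transpose(0, 1))
    return indices

# Hypothetical usage: 100 queries, batch size 1, 256-dim embeddings.
q, b, c = 100, 1, 256
ref = torch.randn(q, b, c)
cur = torch.randn(q, b, c)
print(match_embds_single_batch(ref, cur).shape)  # (100,)
```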
