The result after the relu activation function isn't used in grp-dsod. #35

Open
zjuzuhe opened this issue Mar 30, 2018 · 1 comment

zjuzuhe commented Mar 30, 2018

Thanks for sharing the code of grp-dsod. I read the code and found that the result of the relu function isn't used in this part:

def global_level(net, from_layer, relu_name):
    fc = L.InnerProduct(net[relu_name], num_output=1)
    sigmoid = L.Sigmoid(fc, in_place=True)
    att_name = "{}_att".format(from_layer)
    sigmoid = L.Reshape(sigmoid, reshape_param=dict(shape=dict(dim=[-1])))
    scale = L.Scale(net[att_name], sigmoid, axis=0, bias_term=False, bias_filler=dict(value=0))
    relu = L.ReLU(scale, in_place=True)  # this top is never assigned into net or consumed below
    residual = L.Eltwise(net[from_layer], scale)
    gatt_name = "{}_gate".format(from_layer)
    net[gatt_name] = residual
    return net

relu = L.ReLU(scale, in_place=True)
Is this a mistake, or was it deliberately discarded?
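
For context, pycaffe's NetSpec serializes only the layers reachable from tops that were assigned into the net, so a layer whose top is never assigned or consumed downstream is silently dropped from the generated prototxt. A minimal sketch of that behavior, assuming a standard pycaffe install (the layer names here are illustrative, not from the repository):

import caffe
from caffe import layers as L, params as P

net = caffe.NetSpec()
net.data = L.Input(shape=dict(dim=[1, 3, 8, 8]))
conv = L.Convolution(net.data, num_output=4, kernel_size=3)
relu = L.ReLU(conv, in_place=True)  # top is never assigned into net...
net.out = L.Pooling(conv, pool=P.Pooling.MAX, kernel_size=2)  # ...and this consumes conv, not relu

# The generated prototxt contains Input, Convolution, and Pooling,
# but no ReLU layer, since nothing reachable from net.out refers to it.
print(net.to_proto())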

szq0214 (Owner) commented Mar 31, 2018

Hi @zjuzuhe, thanks for pointing this out. Actually, we did not use relu following the scale operation in global-level attention. I will remove or comment out this line soon. I'm also not sure whether it is helpful to use relu here.
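
The stated fix amounts to dropping the dead line. A hypothetical sketch of both options against the snippet above (neither is the repository's committed code):

# Option 1: remove the dead line, keeping the residual sum over the scale top.
residual = L.Eltwise(net[from_layer], scale)

# Option 2 (hypothetical): actually apply relu before the residual sum,
# which requires the eltwise to consume the ReLU top rather than the Scale top.
relu = L.ReLU(scale, in_place=True)
residual = L.Eltwise(net[from_layer], relu)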
