In file MSNet.py, while creating (or defining) the graph for the model, I noticed that in the function that creates a residual block, the activation layer is applied before the convolution. Is that a mistake, or am I missing something, considering we don't normally apply activation before convolution?
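For context, the ordering I expected is the conventional post-activation residual block from the original ResNet, roughly as below (a minimal PyTorch sketch for illustration only; MSNet.py may use a different framework, kernel sizes, and channel widths):

```python
import torch
import torch.nn as nn

class PostActBlock(nn.Module):
    """Conventional (post-activation) residual block: conv -> BN -> ReLU."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))  # convolution first, activation after
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)                 # activation also follows the addition
```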
> In file MSNet.py, while creating (or defining) the graph for the model, I noticed that in the function that creates a residual block, the activation layer is applied before the convolution
Have been scratching my head on this for some time now; a 'Pre-Activation ResNet' might be the intention, as explained here:
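If that is the intent, applying the activation before the convolution is deliberate: the BN -> ReLU -> conv ordering keeps the identity path free of any non-linearity. A minimal PyTorch sketch of a pre-activation block, assuming 3x3 convolutions with matching channel counts (the actual layers in MSNet.py may differ):

```python
import torch
import torch.nn as nn

class PreActBlock(nn.Module):
    """Pre-activation residual block: BN -> ReLU -> conv (He et al., 2016)."""
    def __init__(self, channels):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.conv1(self.relu(self.bn1(x)))    # activation BEFORE the convolution
        out = self.conv2(self.relu(self.bn2(out)))
        return out + x                              # clean identity path, no activation after the add


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    print(PreActBlock(64)(x).shape)  # torch.Size([1, 64, 32, 32])
```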