
update. #1

Open · wants to merge 254 commits into master
Conversation

sanchittechnogeek
Owner

Updating the repository! 4:00pm 4/6/2022 IST

infinnie and others added 30 commits June 20, 2017 04:30
Create neural-networks-1.md
The L2 norm for `w_2 = [0.25, 0.25, 0.25, 0.25]` comes out to `0.5`, not `0.25`.
Isn't the regularizer we are using for W the squared L2 norm? Because if we were to use the regular L2 norm, we would need to take the root over the entire sum of squares (i.e. R(W) = ||W||^2, without the root). But we are not doing that. It is just a small detail, I know, but it has been bugging me. Best, peglegpete
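A quick sketch of the distinction the two comments above are debating, using the example vector from the thread (plain Python, no framework assumed): the squared L2 norm of `w_2` is `0.25`, while the plain L2 norm takes the square root of that sum and gives `0.5`.

```python
import math

# Example weight vector from the discussion above.
w2 = [0.25, 0.25, 0.25, 0.25]

# Squared L2 norm: the sum of squared entries, i.e. the usual
# L2 regularization penalty R(W) = sum_i w_i^2 (no square root).
squared_l2 = sum(w * w for w in w2)  # 4 * 0.0625 = 0.25

# Plain L2 norm: the square root of that sum.
l2 = math.sqrt(squared_l2)  # sqrt(0.25) = 0.5

print(squared_l2, l2)  # 0.25 0.5
```

So both comments are consistent: `||w_2|| = 0.5`, while the regularizer, being the squared norm, sums to `0.25`.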
You had a few broken links to the old scipy wiki and one internal reference with the wrong name.
Lets --> Let's, right at the beginning of this paragraph.