Update README.md
goldsborough committed Mar 6, 2018
1 parent e4e6713 commit 89a40da
Showing 2 changed files with 10 additions and 1 deletion.
README.md: 3 changes (2 additions, 1 deletion)
@@ -9,7 +9,8 @@ There are a few "sights" you can metaphorically visit in this repository:
 - Build C++ and/or CUDA extensions by going into the `cpp/` or `cuda/` folder and executing `python setup.py install`,
 - JIT-compile C++ and/or CUDA extensions by going into the `cpp/` or `cuda/` folder and calling `python jit.py`, which will JIT-compile the extension and load it,
 - Benchmark Python vs. C++ vs. CUDA by running `python benchmark.py {py, cpp, cuda} [--cuda]`,
-- Run gradient-checks on the code by running `python grad_check.py {py, cpp, cuda}`.
+- Run gradient checks on the code by running `python grad_check.py {py, cpp, cuda} [--cuda]`.
+- Run output checks on the code by running `python check.py {forward, backward} [--cuda]`.

## Authors

check.py: 8 changes (8 additions, 0 deletions)
@@ -24,6 +24,11 @@ def check_equal(first, second, verbose):
         np.testing.assert_allclose(x, y, err_msg="Index: {}".format(i))


+def zero_grad(variables):
+    for variable in variables:
+        variable.grad.zero_()
+
+
 def check_forward(variables, with_cuda, verbose):
     baseline_values = python.lltm_baseline.LLTMFunction.apply(*variables)
     cpp_values = cpp.lltm.LLTMFunction.apply(*variables)
@@ -44,6 +49,8 @@ def check_backward(variables, with_cuda, verbose):
     (baseline_values[0] + baseline_values[1]).sum().backward()
     grad_baseline = [var.grad for var in variables]

+    zero_grad(variables)
+
     cpp_values = cpp.lltm.LLTMFunction.apply(*variables)
     (cpp_values[0] + cpp_values[1]).sum().backward()
     grad_cpp = [var.grad for var in variables]
@@ -53,6 +60,7 @@ def check_backward(variables, with_cuda, verbose):
     print('Ok')

     if with_cuda:
+        zero_grad(variables)
         cuda_values = cuda.lltm.LLTMFunction.apply(*variables)
         (cuda_values[0] + cuda_values[1]).sum().backward()
         grad_cuda = [var.grad for var in variables]
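The reason this commit inserts `zero_grad()` calls between the baseline, C++, and CUDA backward passes is that PyTorch accumulates gradients into `.grad` rather than overwriting them; without resetting, the later comparisons would be made against the summed gradients of all previous runs. A minimal standalone sketch of that behavior (assumes only that PyTorch is installed; the names here are illustrative, not from the repo):

```python
import torch

# A leaf tensor whose gradient we will inspect after each backward pass.
x = torch.ones(3, requires_grad=True)

(x * 2).sum().backward()
first = x.grad.clone()        # gradient of 2*x w.r.t. x is 2 everywhere

(x * 2).sum().backward()      # .grad is accumulated, not replaced
accumulated = x.grad.clone()  # now twice the single-pass gradient

x.grad.zero_()                # what zero_grad() in check.py does per variable
(x * 2).sum().backward()
reset = x.grad.clone()        # back to the single-pass gradient

print(first.tolist(), accumulated.tolist(), reset.tolist())
```

This is why each gradient comparison in `check_backward` is only valid if the variables' `.grad` buffers are zeroed before the next implementation's backward pass.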
