
ONNX Zoo's basic MNIST model fails #178

Open
robinvanemden opened this issue May 24, 2020 · 1 comment

robinvanemden commented May 24, 2020

ONNC produces incorrect output when compiling the MNIST model v1.3 from the ONNX model zoo.

Building the inference runtime as described in the backend guide and running it on test_data_set_1 results in:

> ./inference mnist.input onnc-runtime-service.weight

[-0.044384, 0.010354, 0.074052, 0.020479, -0.131909, 0.145801, -0.053591, -0.047789, 0.084736, -0.057945, ]

When compiling directly from source files I obtain:

gcc -fno-exceptions -I./include/ -I /onnc/onnc/include  \
./src/client-lib.c ./src/onnc-runtime-core.c  ./src/client-app.c  ./src/onnc-runtime-service.c \
/onnc/onnc/lib/Runtime/operator/randomnormal.c \

... etc, all c operator files ... 

/onnc/lib/Runtime/operator/gather.c  \
-o ./inference -lm
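Since the command above links every operator source individually, a shell glob over the operator directory can stand in for the elided list (a sketch only; it assumes all operator `.c` files live under `/onnc/onnc/lib/Runtime/operator/`, the path used above, which may differ on your install):

```shell
gcc -fno-exceptions -I./include/ -I /onnc/onnc/include \
    ./src/client-lib.c ./src/onnc-runtime-core.c \
    ./src/client-app.c ./src/onnc-runtime-service.c \
    /onnc/onnc/lib/Runtime/operator/*.c \
    -o ./inference -lm
```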

> ./inference mnist.input onnc-runtime-service.weight

[-2427.208984, -319.598022, 1552.176880, 111.066208, 1734.641357, -1012.644592, -1362.241211, 1107.857178, -280.093781, 117.853188, ]

Both results are incorrect; the expected output defined in output_0.pb reads:

[5041.8887, -3568.878, -187.82423, -1685.797, -1183.3232, -614.42926, 892.6643, -373.65845, -290.2623, -111.176216]

(A directly compiled SqueezeNet works fine, which seems to indicate the problem is MNIST-specific.)

@robinvanemden robinvanemden changed the title Gcc MNIST compilation differing from cmake compilation, both differing from expected result MNIST compilation fails May 24, 2020
@robinvanemden robinvanemden changed the title MNIST compilation fails ONNX Zoo MNIST compilation fails May 24, 2020
@robinvanemden robinvanemden changed the title ONNX Zoo MNIST compilation fails ONNX Zoo's basic MNIST model fails May 25, 2020
@ajaya1274

- We have built the inference runtime as described in the backend guide.
- It works properly for the AlexNet model and displays the expected output on test_data_set.
- But when we run `./inference mnist.input onnc-runtime-service.weight` (i.e., the MNIST inference), it gives a segmentation fault (core dump).

Could you kindly share the steps for creating the MNIST inference binary so that we can make the required changes and reproduce the issue?
