
Different detection result on localhost and the server #39

Open
SnoopyDevelops opened this issue Jan 6, 2023 · 5 comments

Comments

@SnoopyDevelops

SnoopyDevelops commented Jan 6, 2023

Tested with the second sample from ChatGPT, and the detection result on localhost does not match the server.

Test result from https://openai-openai-detector.hf.space/: [screenshot]

Test result with the roberta-base model on localhost: [screenshot]

Test result with the roberta-large model on localhost: [screenshot]

@ononotofu

@SnoopyDevelops Did you ever make progress on this? Seeing the same issues here.

@SnoopyDevelops
Author

Unfortunately, no

@CoconutMacaroon

I can confirm this as well: the server result differs from both the -base and -large models.

@jnousis

jnousis commented Mar 27, 2023

Was anyone able to get consistent results between localhost and the website? I tried both models and still get different results.

@NotTheDr01ds

NotTheDr01ds commented Apr 27, 2023

The trick to getting the same results is apparently to use the same Python dependencies. The model being used on the Hugging Face Space really is the same roberta-base that we've been downloading and using.

I've written up full instructions in this Ask Ubuntu answer. After using the Hugging Face Space's Dockerfile as a guide, my local version gives the same results as https://openai-openai-detector.hf.space/. Thanks to @CoconutMacaroon for confirming this.
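
For reference, a quick way to compare environments is to print the versions of the packages that most influence the detector's scores and check them against whatever the Space's Dockerfile/requirements pin. This is a minimal sketch, not part of the original answer, and it assumes the local detector runs on torch and transformers:

import sys

import torch
import transformers

# Print the versions most likely to change detector scores so they can be
# compared against the versions used by the Hugging Face Space.
print("python:      ", sys.version.split()[0])
print("torch:       ", torch.__version__)
print("transformers:", transformers.__version__)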

Note that there are currently a few known issues with the Hugging Face OpenAI version.

The first two of these, at least (the submitted text is not URL-encoded in the GET request, and very long inputs produce an over-long request), are fixed by a change to detector/index.html. Instead of:

req.open('GET', window.location.href + '?' + textbox.value, true);

Use:

const maxCharacters = 16300;
req.open('GET', `?${encodeURIComponent(textbox.value)}`.slice(0,maxCharacters), true);

Thanks to @makyen (Stack Overflow Mod) for assistance with this.
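
To check whether a local instance now matches the hosted Space, something like the following sketch can send the same URL-encoded, length-capped text to both endpoints and compare the scores. The local port and the JSON field names (real_probability / fake_probability) are assumptions based on the upstream detector server, so adjust them to your setup:

import json
from urllib.parse import quote
from urllib.request import urlopen

# Same idea as the index.html fix above: URL-encode the text and cap its length.
MAX_CHARACTERS = 16300
TEXT = "Paste the same ChatGPT sample here."

ENDPOINTS = {
    "hosted": "https://openai-openai-detector.hf.space/",
    "local": "http://localhost:8080/",  # assumed port for the local detector
}

query = quote(TEXT)[:MAX_CHARACTERS]
for name, base in ENDPOINTS.items():
    with urlopen(base + "?" + query) as resp:
        result = json.loads(resp.read().decode())
    # Field names follow the upstream detector's JSON response; treat them as an assumption.
    print(name, result.get("real_probability"), result.get("fake_probability"))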

I have a Space with these changes that you can use as a basis for your local copy (using the Dockerfile and cloning with git per the Ask Ubuntu answer mentioned above).

Keep in mind that, just like the OpenAI version, you won't be able to use it from that Space URL; you'll need to use the top-level URL instead. Again, it's mainly for you to clone locally (or as a new Space if desired); it's going to be really slow at times on the free CPU tier.
