
[BUG] Token count displaying as 0 when using Langchain auto-instrumentor with Vertex #1202

Open
millers1389 opened this issue Jan 15, 2025 · 7 comments
Assignees
Labels
bug (Something isn't working) · instrumentation: langchain · instrumentation (Adding instrumentations to open source packages) · language: python (Related to Python integration)

Comments

@millers1389

millers1389 commented Jan 15, 2025

Describe the bug
The token count displays as 0 when using the LangChain auto-instrumentor with Vertex.
The correct token counts appear in the LLM output, but not in the token count field of the Phoenix UI.

To Reproduce
Steps to reproduce the behavior:

  1. Add the LangChain auto-instrumentor to a LangChain application that uses Vertex as the LLM provider
  2. Run a sample query
  3. Observe that the token count for the trace in Phoenix is 0

Expected behavior
The correct token count for each trace should be displayed.

Screenshots
See comments below

Additional context
Colab for reproducing

@millers1389 millers1389 added bug Something isn't working triage Issues that require triage labels Jan 15, 2025
@trevor-laviale-arize

When using the LangChain auto-instrumentor in a LangChain application with Vertex, the token counts show as 0. I think it's because the tokens are logged in a different format than the OpenInference semantic conventions expect.
[Screenshot: 2025-01-15 at 1:16 PM]
[Screenshot: 2025-01-15 at 1:15 PM]
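To illustrate the suspected mismatch, here is a minimal sketch of translating Vertex-style usage metadata into the OpenInference token-count attribute names. The Vertex key names (`prompt_token_count`, `candidates_token_count`, `total_token_count`) and the exact attribute strings are assumptions based on the public specs, not code from the instrumentor itself:

```python
# OpenInference semantic-convention attribute names for token counts
LLM_TOKEN_COUNT_PROMPT = "llm.token_count.prompt"
LLM_TOKEN_COUNT_COMPLETION = "llm.token_count.completion"
LLM_TOKEN_COUNT_TOTAL = "llm.token_count.total"


def map_vertex_usage(usage_metadata: dict) -> dict:
    """Translate Vertex-style usage metadata into OpenInference span attributes.

    Hypothetical helper: if the instrumentor never performs a mapping like
    this for Vertex responses, the UI would fall back to 0 for each count.
    """
    return {
        LLM_TOKEN_COUNT_PROMPT: usage_metadata.get("prompt_token_count", 0),
        LLM_TOKEN_COUNT_COMPLETION: usage_metadata.get("candidates_token_count", 0),
        LLM_TOKEN_COUNT_TOTAL: usage_metadata.get("total_token_count", 0),
    }


# Example payload shaped like Vertex's usage_metadata
attrs = map_vertex_usage(
    {"prompt_token_count": 12, "candidates_token_count": 34, "total_token_count": 46}
)
print(attrs["llm.token_count.total"])  # 46
```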

@axiomofjoy
Contributor

Thanks for reporting, @millers1389! Looks like that Colab is private; I have requested access.

@millers1389
Author

Just granted access! Let me know if you have any other questions!

@axiomofjoy
Contributor

Thanks @millers1389 !

@axiomofjoy axiomofjoy transferred this issue from Arize-ai/phoenix Jan 15, 2025
@github-project-automation github-project-automation bot moved this to 📘 Todo in phoenix Jan 15, 2025
@dosubot dosubot bot added the language: python Related to Python integration label Jan 15, 2025
@axiomofjoy axiomofjoy added the instrumentation Adding instrumentations to open source packages label Jan 15, 2025
@mikeldking mikeldking added instrumentation: langchain and removed triage Issues that require triage labels Jan 15, 2025
@nate-mar nate-mar self-assigned this Jan 16, 2025
@nate-mar
Contributor

Hi @millers1389 -- can you grant me access to the notebook as well? Thanks!

@nate-mar
Contributor

Can you also print out the package versions of your dependencies?
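For reference, one way to gather the requested versions is with the standard library's `importlib.metadata`. The package names below are examples of the likely relevant dependencies, not a confirmed list:

```python
from importlib.metadata import version, PackageNotFoundError


def get_versions(names):
    """Return the installed version for each package, or 'not installed'."""
    out = {}
    for name in names:
        try:
            out[name] = version(name)
        except PackageNotFoundError:
            out[name] = "not installed"
    return out


# Example package names -- adjust to match your environment
for pkg, ver in get_versions(
    ["langchain", "langchain-google-vertexai", "openinference-instrumentation-langchain"]
).items():
    print(f"{pkg}=={ver}")
```

Pasting the output of this (or `pip freeze`) into the thread would pin down which versions reproduce the bug.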

@nate-mar nate-mar moved this to In Progress in Instrumentation Jan 16, 2025
@nate-mar nate-mar moved this from In Progress to Done in Instrumentation Jan 17, 2025
@nate-mar nate-mar closed this as completed by moving to Done in Instrumentation Jan 17, 2025
@github-project-automation github-project-automation bot moved this from 📘 Todo to ✅ Done in phoenix Jan 17, 2025
@nate-mar nate-mar reopened this Jan 17, 2025
@github-project-automation github-project-automation bot moved this from ✅ Done to 👨‍💻 In progress in phoenix Jan 17, 2025
@nate-mar
Contributor

Accidentally closed this.

@nate-mar nate-mar moved this from Done to In Progress in Instrumentation Jan 17, 2025
Projects
Status: In Progress
Status: 👨‍💻 In progress
Development

No branches or pull requests

5 participants