[BUG] Prompts are not displayed correctly #1186
Hey @kripper - thanks for your note - it looks like you might be using openLLMetry, which follows different span conventions. If you'd like to try our LiteLLM instrumentation, please check out the OpenInference instrumentation: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-litellm. Thanks for your feedback. We're hoping we can land on a good set of conventions that all backends support!
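For anyone landing here, a minimal sketch (not from this thread) of wiring the OpenInference LiteLLM instrumentor to a self-hosted Phoenix instance; the endpoint and model name below are illustrative placeholders:

```python
# Sketch: route LiteLLM spans through OpenInference conventions to Phoenix.
# Assumes `arize-phoenix-otel` and `openinference-instrumentation-litellm`
# are installed; the endpoint/model below are placeholders, not from the thread.
import litellm
from phoenix.otel import register
from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Export OTLP traces to a locally running Phoenix collector.
tracer_provider = register(endpoint="http://localhost:6006/v1/traces")

# Instrument LiteLLM so prompts/completions are recorded with the
# OpenInference semantic conventions that Phoenix knows how to render.
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
```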
Thanks @mikeldking.
With these settings, I'm getting this error:
Unfortunately not - that integration was added by the LiteLLM team and it can only export telemetry to https://app.arize.com/, not Phoenix. Let me dig into their proxy integration a bit more (BerriAI/liteLLM-proxy#17). In the meantime you can use any of our client-side integrations listed here (https://docs.arize.com/phoenix/tracing/integrations-tracing); that might be the easiest unblock. Thank you for your patience.
I would like to, but I'm afraid we are dealing with sensitive data in the prompts that cannot be exposed online.
Interesting. Will do. We'll look into it. Cc @nate-mar
Is it possible to view the prompts as Markdown instead of escaped JSON strings, which are hard on the eyes?
This is a trace of "vertex_ai/gemini-2.0-flash-exp" via the LiteLLM OTEL callback.
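For reference, a rough sketch of how such a trace is typically emitted with LiteLLM's built-in OTEL callback; the exporter and endpoint values are assumptions and may not match the reporter's setup or the current LiteLLM version:

```python
# Sketch: emit traces via LiteLLM's OpenTelemetry callback.
# Env var names follow LiteLLM's OTEL docs as I recall them and may differ
# across versions; the endpoint value is an assumed local Phoenix collector.
import os
import litellm

os.environ["OTEL_EXPORTER"] = "otlp_http"                        # OTLP over HTTP
os.environ["OTEL_ENDPOINT"] = "http://localhost:6006/v1/traces"  # placeholder endpoint

# Register the built-in OTEL callback so each completion call is traced.
litellm.callbacks = ["otel"]

response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash-exp",
    messages=[{"role": "user", "content": "Hello"}],
)
```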