
[BUG] Prompts are not displayed correctly #1186

Open · kripper opened this issue Dec 21, 2024 · 5 comments

Labels: bug (Something isn't working) · enhancement (New feature or request) · question (Further information is requested)

kripper commented Dec 21, 2024

[screenshot: prompts rendered as escaped JSON strings in the trace view]
Is it possible to view the prompts as Markdown instead of escaped JSON strings, which are hard on the eyes?

This is a trace of "vertex_ai/gemini-2.0-flash-exp" via the LiteLLM OTEL callback.
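
For reference, a minimal sketch of the setup that produces a trace like this, assuming LiteLLM's generic "otel" callback and a local Phoenix instance. The OTEL_* variable names follow LiteLLM's docs, and the /v1/traces path is an assumption here:

import os

# Assumed LiteLLM OTEL settings; names taken from LiteLLM's docs.
os.environ["OTEL_EXPORTER"] = "otlp_http"
os.environ["OTEL_ENDPOINT"] = "http://127.0.0.1:6006/v1/traces"  # local Phoenix

import litellm

# The generic OTEL callback emits OpenLLMetry-style span attributes,
# which is what shows up as escaped JSON in the screenshot above.
litellm.callbacks = ["otel"]

response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash-exp",
    messages=[{"role": "user", "content": "Hello"}],
)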

@kripper added the enhancement (New feature or request) and triage (Issues that require triage) labels Dec 21, 2024
@dosubot bot added the question (Further information is requested) label Dec 21, 2024
@kripper changed the title from "[QUESTION] View prompts markdown" to "[QUESTION] View prompts correctly" Dec 22, 2024
@kripper changed the title from "[QUESTION] View prompts correctly" to "[BUG] Prompts are not displayed correctly" Dec 22, 2024
mikeldking (Contributor) commented

Hey @kripper - thanks for your note - it looks like you might be using OpenLLMetry, which follows different LLM semantic conventions. We are participating in the GenAI semantic-conventions working group, but we currently don't support LiteLLM instrumentation from Traceloop.

If you'd like to try our LiteLLM instrumentation, please check out the OpenInference instrumentation: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-litellm

Thanks for your feedback. We're hoping we can land on a good set of conventions that all backends support!
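
For anyone landing here, a minimal sketch of wiring that OpenInference LiteLLM instrumentation to a local Phoenix instance. The package and class names come from the linked repo; the endpoint path is an assumption:

# pip install openinference-instrumentation-litellm opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

import litellm
from openinference.instrumentation.litellm import LiteLLMInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Send OTLP/HTTP spans to a locally running Phoenix (default port 6006).
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces"))
)

# Patch litellm so completions emit OpenInference-convention spans,
# which Phoenix renders as readable prompts instead of escaped JSON.
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

response = litellm.completion(
    model="vertex_ai/gemini-2.0-flash-exp",
    messages=[{"role": "user", "content": "Hello"}],
)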

@mikeldking removed the triage (Issues that require triage) label Dec 27, 2024
kripper (Author) commented Jan 8, 2025

Thanks @mikeldking.
I'm a little confused.
Does that mean that we can use this setting for Phoenix instead?

litellm_settings:
  callbacks: ["arize"]

environment_variables:
    ARIZE_SPACE_KEY: "default"
    #ARIZE_API_KEY: ""
    ARIZE_ENDPOINT: "http://127.0.0.1:6006/v1" <--- Is this endpoint correct?
    ARIZE_HTTP_ENDPOINT: "http://127.0.0.1:6006/v1"

With these settings, I'm getting this error:

Traceback (most recent call last):
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/sdk/trace/export/__init__.py", line 360, in _export_batch
    self.span_exporter.export(self.spans_list[:idx])  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/http/trace_exporter/__init__.py", line 189, in export
    return self._export_serialized_spans(serialized_data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/http/trace_exporter/__init__.py", line 159, in _export_serialized_spans
    resp = self._export(serialized_data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/http/trace_exporter/__init__.py", line 133, in _export
    return self._session.post(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/sessions.py", line 575, in request
    prep = self.prepare_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/sessions.py", line 484, in prepare_request
    p.prepare(
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/models.py", line 367, in prepare
    self.prepare_url(url, params)
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/models.py", line 438, in prepare_url
    raise MissingSchema(
requests.exceptions.MissingSchema: Invalid URL 'None': No scheme supplied. Perhaps you meant https://None?
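
For context on the traceback: requests stringifies whatever URL it is given, so an exporter constructed with endpoint=None ends up POSTing to the literal string "None". That suggests the "arize" callback never picked up the endpoint variables above and passed None through — a reading of the trace, not a confirmed diagnosis. A two-line repro:

import requests

# Raises requests.exceptions.MissingSchema: Invalid URL 'None': No scheme
# supplied. Perhaps you meant https://None? — the same error as above.
requests.post(None)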

mikeldking (Contributor) commented

(quoting the config question and traceback from kripper's previous comment)

Unfortunately not - that integration was added by the LiteLLM team, and it can only export telemetry to https://app.arize.com/, not Phoenix.

Let me dig into their proxy integration a bit more (BerriAI/liteLLM-proxy#17)

In the meantime, you can use any of our integrations found here on the client side (https://docs.arize.com/phoenix/tracing/integrations-tracing); that might be the easiest way to get unblocked.

Thank you for your patience.

kripper (Author) commented Jan 9, 2025

> In the meantime, you can use any of our integrations found here on the client side

I would like to, but I'm afraid we are dealing with sensitive data in the prompts that cannot be exposed online.
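
For what it's worth, Phoenix itself can run entirely on your own machine, so client-side instrumentation need not send prompts anywhere online; a minimal sketch assuming the arize-phoenix Python package:

# pip install arize-phoenix
import phoenix as px

# Serves the Phoenix UI and OTLP collector locally; nothing leaves the host.
session = px.launch_app()
print(session.url)  # typically http://localhost:6006/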

mikeldking (Contributor) commented

(quoting kripper's concern about sensitive prompt data)

Interesting. Will do. We'll look into it.

Cc @nate-mar

@github-project-automation bot moved this to 📘 Todo in phoenix Jan 10, 2025
@mikeldking transferred this issue from Arize-ai/phoenix Jan 10, 2025
@mikeldking assigned nate-mar and unassigned mikeldking Jan 10, 2025
@dosubot bot added the bug (Something isn't working) label Jan 10, 2025
@nate-mar moved this to Todo in Instrumentation Jan 17, 2025
@nate-mar moved this from Todo to In Progress in Instrumentation Jan 17, 2025