docs(js): update vercel and langchain examples add readme for openinference-core #1083
Merged

Commits (5):

- 58dfaf0 docs(examples): update vercel ai sdk example (Parker-Stafford)
- e8ad19d remove package-lock (Parker-Stafford)
- 30f9ad8 docs(core): add documentation to readmes for openinference-core (Parker-Stafford)
- 008ca0b remove console log (Parker-Stafford)
- 340a4d9 readme prettiers (Parker-Stafford)
`next.config.ts` (new file):

```typescript
import { NextConfig } from "next";

const nextConfig: NextConfig = {};

export default nextConfig;
```
# OpenInference Core

[![npm version](https://badge.fury.io/js/@arizeai%2Fopeninference-core.svg)](https://badge.fury.io/js/@arizeai%2Fopeninference-core)
This package provides OpenInference Core utilities for LLM Traces.

## Installation

```bash
npm install @arizeai/openinference-core # npm
pnpm add @arizeai/openinference-core # pnpm
yarn add @arizeai/openinference-core # yarn
```
## Customizing Spans

The `@arizeai/openinference-core` package offers utilities to track important application metadata, such as sessions and users, using context attribute propagation:

- `setSession`: specify a session ID to track and group multi-turn conversations
- `setUser`: specify a user ID to track different conversations with a given user
- `setMetadata`: add custom metadata that can provide extra information to support a wide range of operational needs
- `setTag`: add tags to filter spans on specific keywords
- `setPromptTemplate`: record the prompt template used, along with its version and variables; useful for prompt template tracking
- `setAttributes`: add multiple custom attributes at the same time

> [!NOTE]
> All `@arizeai/openinference` auto instrumentation packages will pull attributes off of context and add them to spans.
### Examples

`setSession`

```typescript
import { context } from "@opentelemetry/api";
import { setSession } from "@arizeai/openinference-core";

context.with(setSession(context.active(), { sessionId: "session-id" }), () => {
  // Calls within this block will generate spans with the attributes:
  // "session.id" = "session-id"
});
```
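The other setters follow the same pattern. For example, `setUser` — a sketch assuming it accepts a `{ userId }` object mirroring `setSession`'s shape:

```typescript
import { context } from "@opentelemetry/api";
import { setUser } from "@arizeai/openinference-core";

// Assumption: setUser takes the active context plus a { userId } object,
// analogous to setSession above.
context.with(setUser(context.active(), { userId: "user-id" }), () => {
  // Calls within this block will generate spans with the attributes:
  // "user.id" = "user-id"
});
```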
Each setter function returns a new active context, so they can be chained together.

```typescript
import { context } from "@opentelemetry/api";
import { setAttributes, setSession } from "@arizeai/openinference-core";

context.with(
  setAttributes(setSession(context.active(), { sessionId: "session-id" }), {
    myAttribute: "test",
  }),
  () => {
    // Calls within this block will generate spans with the attributes:
    // "myAttribute" = "test"
    // "session.id" = "session-id"
  },
);
```
Additionally, they can be used in conjunction with the [OpenInference Semantic Conventions](../openinference-semantic-conventions/).

```typescript
import { context } from "@opentelemetry/api";
import { setAttributes } from "@arizeai/openinference-core";
import { SemanticConventions } from "@arizeai/openinference-semantic-conventions";

context.with(
  setAttributes(context.active(), {
    [SemanticConventions.SESSION_ID]: "session-id",
  }),
  () => {
    // Calls within this block will generate spans with the attributes:
    // "session.id" = "session-id"
  },
);
```
If you are creating spans manually and want to propagate context attributes you've set to those spans as well, you can use the `getAttributesFromContext` utility. You can read more about customizing spans in our [docs](https://docs.arize.com/phoenix/tracing/how-to-tracing/customize-spans).

```typescript
import { getAttributesFromContext } from "@arizeai/openinference-core";
import { context, trace } from "@opentelemetry/api";

const contextAttributes = getAttributesFromContext(context.active());
const tracer = trace.getTracer("example");
const span = tracer.startSpan("example span");
span.setAttributes(contextAttributes);
span.end();
```
## Trace Config

This package also provides support for controlling settings like data privacy and payload sizes. For instance, you may want to keep sensitive information from being logged for security reasons, or you may want to limit the size of base64-encoded images to reduce payload size.

> [!NOTE]
> These values can also be controlled via environment variables. See more information [here](https://github.com/Arize-ai/openinference/blob/main/spec/configuration.md).

Here is an example of how to configure these settings using the OpenAI auto instrumentation. Note that all of our auto instrumentations will accept a traceConfig object.

```typescript
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

/**
 * Everything left out of here will fall back to
 * environment variables, then defaults.
 */
const traceConfig = { hideInputs: true };

const instrumentation = new OpenAIInstrumentation({ traceConfig });
```
we removed package locks from examples