Conversation

@k11kirky (Contributor) commented Jan 9, 2025

Initial PR for LLM Observability

  • Support for OpenAI SDK chat completions with OpenAI and AsyncOpenAI
  • Example file to generate data, usage example below

Example data can be created by exporting the following env vars and running the example script:

export POSTHOG_PROJECT_API_KEY=<your-project-key>
export POSTHOG_PERSONAL_API_KEY=<your-personal-key>
export OPENAI_API_KEY=<your-openai-key>

python3 llm_observability_examples.py
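
Beyond the example script, here is a minimal usage sketch of the wrapped client added in this PR. The import path, client constructor, and model name are assumptions for illustration only; the posthog_distinct_id and posthog_properties keyword arguments come from the diff discussed below.

import os

import posthog
from posthog.ai import OpenAI  # assumed import path for the wrapped client

# Assumed configuration attributes; the values are the env vars exported above.
posthog.project_api_key = os.environ["POSTHOG_PROJECT_API_KEY"]
posthog.personal_api_key = os.environ["POSTHOG_PERSONAL_API_KEY"]

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[{"role": "user", "content": "Say hello"}],
    posthog_distinct_id="example-user",  # optional; falls back to a generated UUID
    posthog_properties={"example": "llm_observability"},  # extra event properties
)
print(response.choices[0].message.content)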

@k11kirky changed the title from "WIP: DO NOT MERGE - add llm observability to python sdk" to "WIP: Add llm observability to python sdk" on Jan 10, 2025
@k11kirky changed the title from "WIP: Add llm observability to python sdk" to "Feat: Add llm observability to python sdk" on Jan 10, 2025
@k11kirky requested review from Twixes and skoob13 on January 10, 2025 19:06
"$ai_model_parameters": get_model_params(kwargs),
"$ai_input": kwargs.get("messages"),
"$ai_output": {
"choices": [
Contributor commented:

Do we want to send an object here instead of the array with choices directly? We've already flattened the structure with additional fields like input/output tokens, so this might be the output itself.

@k11kirky (PR author) replied:

Although for most use cases "choices" has only one option, the user can ask the LLM for multiple completions, in which case the choices array represents n options for a response. If we remove the object and just return the array, it would look more like the messages array representing a history.
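
For context, a hedged sketch of the two property shapes being discussed. The $ai_input/$ai_output keys come from the snippet above; the message contents are hypothetical.

# Current shape in this PR: an object wrapping the choices array.
properties_current = {
    "$ai_input": [{"role": "user", "content": "Hi"}],
    "$ai_output": {
        "choices": [
            {"role": "assistant", "content": "Hello!"},
            {"role": "assistant", "content": "Hey there!"},  # n > 1 completions
        ]
    },
}

# Alternative raised in the review: the choices array as the output itself.
properties_flattened = {
    "$ai_input": [{"role": "user", "content": "Hi"}],
    "$ai_output": [
        {"role": "assistant", "content": "Hello!"},
        {"role": "assistant", "content": "Hey there!"},
    ],
}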

@k11kirky marked this pull request as ready for review on January 10, 2025 22:39
    posthog_properties: Optional[Dict[str, Any]] = None,
    **kwargs: Any,
):
    distinct_id = posthog_distinct_id or uuid.uuid4()
Member commented:

How about UUIDv7?

@k11kirky (PR author) replied:

We use v4 in the rest of the repo, so trying to keep it consistent.
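
For context, a minimal sketch of the fallback under discussion, matching the line in the diff above; the helper name is hypothetical and only the stdlib uuid module is used. (uuid.uuid7() is not available in the standard library until Python 3.14, so staying with UUIDv4 also avoids an extra dependency.)

import uuid

def resolve_distinct_id(posthog_distinct_id=None):
    # Hypothetical helper: fall back to a random UUIDv4 when no distinct_id is passed.
    return posthog_distinct_id or uuid.uuid4()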

@k11kirky merged commit 66101c9 into master on Jan 11, 2025
2 checks passed
@k11kirky deleted the feat/llm-observability-v0.1 branch on January 11, 2025 01:34