4 changes: 2 additions & 2 deletions .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "0.2.0-alpha.31"
}
".": "0.2.0-alpha.32"
}
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1 +1 @@
configured_endpoints: 13
configured_endpoints: 14
14 changes: 14 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,20 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## 0.2.0-alpha.32 (2024-10-31)

Full Changelog: [v0.2.0-alpha.31...v0.2.0-alpha.32](https://github.com/openlayer-ai/openlayer-python/compare/v0.2.0-alpha.31...v0.2.0-alpha.32)

### Features

* **api:** manual updates ([#360](https://github.com/openlayer-ai/openlayer-python/issues/360)) ([4641235](https://github.com/openlayer-ai/openlayer-python/commit/4641235bf842a5d6d132870517aa1ac523867fc9))


### Bug Fixes

* **docs:** remove old examples from next branch ([534b732](https://github.com/openlayer-ai/openlayer-python/commit/534b73224f9adb3b287fac1f4abd285eed65c047))
* **docs:** ruff linting issues ([728a7dc](https://github.com/openlayer-ai/openlayer-python/commit/728a7dc71ddb0edb1f8cfa7c0d6889801d1486a0))

## 0.2.0-alpha.31 (2024-10-07)

Full Changelog: [v0.2.0-alpha.30...v0.2.0-alpha.31](https://github.com/openlayer-ai/openlayer-python/compare/v0.2.0-alpha.30...v0.2.0-alpha.31)
52 changes: 28 additions & 24 deletions CONTRIBUTING.md
@@ -2,9 +2,13 @@

### With Rye

We use [Rye](https://rye.astral.sh/) to manage dependencies so we highly recommend [installing it](https://rye.astral.sh/guide/installation/) as it will automatically provision a Python environment with the expected Python version.
We use [Rye](https://rye.astral.sh/) to manage dependencies because it will automatically provision a Python environment with the expected Python version. To set it up, run:

After installing Rye, you'll just have to run this command:
```sh
$ ./scripts/bootstrap
```

Or [install Rye manually](https://rye.astral.sh/guide/installation/) and run:

```sh
$ rye sync --all-features
@@ -31,25 +35,25 @@ $ pip install -r requirements-dev.lock

## Modifying/Adding code

Most of the SDK is generated code, and any modified code will be overridden on the next generation. The
`src/openlayer/lib/` and `examples/` directories are exceptions and will never be overridden.
Most of the SDK is generated code. Modifications to code will be persisted between generations, but may
result in merge conflicts between manual patches and changes from the generator. The generator will never
modify the contents of the `src/openlayer/lib/` and `examples/` directories.

## Adding and running examples

All files in the `examples/` directory are not modified by the Stainless generator and can be freely edited or
added to.
All files in the `examples/` directory are not modified by the generator and can be freely edited or added to.

```bash
```py
# add an example to examples/<your-example>.py

#!/usr/bin/env -S rye run python
```

```
chmod +x examples/<your-example>.py
```sh
$ chmod +x examples/<your-example>.py
# run the example against your api
./examples/<your-example>.py
$ ./examples/<your-example>.py
```

## Using the repository from source
@@ -58,8 +62,8 @@ If you’d like to use the repository from source, you can either install from g

To install via git:

```bash
pip install git+ssh://[email protected]/openlayer-ai/openlayer-python.git
```sh
$ pip install git+ssh://[email protected]/openlayer-ai/openlayer-python.git
```

Alternatively, you can build from source and install the wheel file:
@@ -68,29 +72,29 @@ Building this package will create two files in the `dist/` directory, a `.tar.gz

To create a distributable version of the library, all you have to do is run this command:

```bash
rye build
```sh
$ rye build
# or
python -m build
$ python -m build
```

Then to install:

```sh
pip install ./path-to-wheel-file.whl
$ pip install ./path-to-wheel-file.whl
```

## Running tests

Most tests require you to [set up a mock server](https://github.com/stoplightio/prism) against the OpenAPI spec to run the tests.

```bash
```sh
# you will need npm installed
npx prism mock path/to/your/openapi.yml
$ npx prism mock path/to/your/openapi.yml
```

```bash
rye run pytest
```sh
$ ./scripts/test
```

## Linting and formatting
@@ -100,14 +104,14 @@ This repository uses [ruff](https://github.com/astral-sh/ruff) and

To lint:

```bash
rye run lint
```sh
$ ./scripts/lint
```

To format and fix all ruff issues automatically:

```bash
rye run format
```sh
$ ./scripts/format
```

## Publishing and releases
37 changes: 26 additions & 11 deletions README.md
@@ -32,7 +32,7 @@ client = Openlayer(
api_key=os.environ.get("OPENLAYER_API_KEY"),
)

data_stream_response = client.inference_pipelines.data.stream(
response = client.inference_pipelines.data.stream(
inference_pipeline_id="182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
config={
"input_variable_names": ["user_query"],
@@ -47,11 +47,11 @@ data_stream_response = client.inference_pipelines.data.stream(
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}
],
)
print(data_stream_response.success)
print(response.success)
```
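The `timestamp` values this PR changes (`1620000000` → `1610000000`) are hardcoded Unix epoch seconds used for documentation purposes. In real usage you would likely generate the timestamp at inference time rather than hardcode it; a minimal sketch, not part of the diff:

```python
import time

# Unix timestamp in whole seconds, matching the integer format of the
# `timestamp` field in the README examples
timestamp = int(time.time())
print(timestamp)
```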

While you can provide an `api_key` keyword argument,
@@ -75,7 +75,7 @@ client = AsyncOpenlayer(


async def main() -> None:
data_stream_response = await client.inference_pipelines.data.stream(
response = await client.inference_pipelines.data.stream(
inference_pipeline_id="182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
config={
"input_variable_names": ["user_query"],
@@ -90,11 +90,11 @@ async def main() -> None:
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}
],
)
print(data_stream_response.success)
print(response.success)


asyncio.run(main())
@@ -142,7 +142,7 @@ try:
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}
],
)
@@ -203,7 +203,7 @@ client.with_options(max_retries=5).inference_pipelines.data.stream(
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}
],
)
@@ -244,7 +244,7 @@ client.with_options(timeout=5.0).inference_pipelines.data.stream(
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}
],
)
@@ -300,7 +300,7 @@ response = client.inference_pipelines.data.with_raw_response.stream(
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}],
)
print(response.headers.get('X-My-Header'))
@@ -335,7 +335,7 @@ with client.inference_pipelines.data.with_streaming_response.stream(
"output": "42",
"tokens": 7,
"cost": 0.02,
"timestamp": 1620000000,
"timestamp": 1610000000,
}
],
) as response:
@@ -425,6 +425,21 @@ We take backwards-compatibility seriously and work hard to ensure you can rely o

We are keen for your feedback; please open an [issue](https://www.github.com/openlayer-ai/openlayer-python/issues) with questions, bugs, or suggestions.

### Determining the installed version

If you've upgraded to the latest version but aren't seeing any new features you were expecting then your python environment is likely still using an older version.

You can determine the version that is being used at runtime with:

```py
import openlayer
print(openlayer.__version__)
```

## Requirements

Python 3.7 or higher.

## Contributing

See [the contributing documentation](./CONTRIBUTING.md).
10 changes: 10 additions & 0 deletions api.md
@@ -38,6 +38,16 @@ Methods:

# Commits

Types:

```python
from openlayer.types import CommitCreateResponse
```

Methods:

- <code title="post /projects/{projectId}/versions">client.commits.<a href="./src/openlayer/resources/commits/commits.py">create</a>(project_id, \*\*<a href="src/openlayer/types/commit_create_params.py">params</a>) -> <a href="./src/openlayer/types/commit_create_response.py">CommitCreateResponse</a></code>

## TestResults

Types:
1 change: 1 addition & 0 deletions examples/tracing/anthropic/anthropic_tracing.ipynb
@@ -39,6 +39,7 @@
"outputs": [],
"source": [
"import os\n",
"\n",
"import anthropic\n",
"\n",
"# OpenAI env variables\n",
5 changes: 2 additions & 3 deletions examples/tracing/azure-openai/azure_openai_tracing.ipynb
@@ -39,7 +39,6 @@
"outputs": [],
"source": [
"import os\n",
"import openai\n",
"\n",
"# Azure OpenAI env variables\n",
"os.environ[\"AZURE_OPENAI_ENDPOINT\"] = \"YOUR_AZURE_OPENAI_ENDPOINT_HERE\"\n",
@@ -66,10 +65,10 @@
"metadata": {},
"outputs": [],
"source": [
"from openlayer.lib import trace_openai\n",
"\n",
"from openai import AzureOpenAI\n",
"\n",
"from openlayer.lib import trace_openai\n",
"\n",
"azure_client = trace_openai(\n",
" AzureOpenAI(\n",
" api_key=os.environ.get(\"AZURE_OPENAI_API_KEY\"),\n",
1 change: 1 addition & 0 deletions examples/tracing/groq/groq_tracing.ipynb
@@ -64,6 +64,7 @@
"outputs": [],
"source": [
"import groq\n",
"\n",
"from openlayer.lib import trace_groq\n",
"\n",
"groq_client = trace_groq(groq.Groq())"
12 changes: 6 additions & 6 deletions examples/tracing/mistral/mistral_tracing.ipynb
@@ -61,6 +61,7 @@
"outputs": [],
"source": [
"import mistralai\n",
"\n",
"from openlayer.lib import trace_mistral\n",
"\n",
"mistral_client = trace_mistral(mistralai.Mistral(api_key=\"YOUR_MISTRAL_AI_API_KEY_HERE\"))"
@@ -115,10 +116,7 @@
" \"content\": \"What's the meaning of life?\",\n",
" },\n",
" ]\n",
")\n",
"\n",
"for chunk in stream_response:\n",
" print(chunk.data.choices[0].delta.content)"
") "
]
},
{
@@ -127,7 +125,9 @@
"id": "2654f47f-fadd-4142-b185-4d992a30c46a",
"metadata": {},
"outputs": [],
"source": []
"source": [
"chunks = [chunk.data.choices[0].delta.content for chunk in stream_response]"
]
}
],
"metadata": {
@@ -146,7 +146,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.19"
"version": "3.9.18"
}
},
"nbformat": 4,
@@ -39,6 +39,7 @@
"outputs": [],
"source": [
"import os\n",
"\n",
"import openai\n",
"\n",
"# OpenAI env variables\n",
@@ -127,9 +128,10 @@
"metadata": {},
"outputs": [],
"source": [
"from openlayer.lib import trace_openai_assistant_thread_run\n",
"import time\n",
"\n",
"from openlayer.lib import trace_openai_assistant_thread_run\n",
"\n",
"# Keep polling the run results\n",
"while run.status != \"completed\":\n",
" run = openai_client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)\n",
1 change: 1 addition & 0 deletions examples/tracing/openai/openai_tracing.ipynb
@@ -39,6 +39,7 @@
"outputs": [],
"source": [
"import os\n",
"\n",
"import openai\n",
"\n",
"# OpenAI env variables\n",