Fancy ReadMe updates with Grammar Fixes and Badging. #23

# Chronology

[![PyPI version](https://badge.fury.io/py/chronological.svg)](https://badge.fury.io/py/chronological)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python Versions](https://img.shields.io/pypi/pyversions/chronological.svg)](https://pypi.org/project/chronological/)

Chronology is a powerful library designed to simplify the development of complex language-powered applications using OpenAI's GPT-3 language model. Developed by OthersideAI, Chronology offers an intuitive interface to streamline your workflow with GPT-3.

## Features

- **Asynchronous Calls**: Make multiple GPT-3 prompts asynchronously, enabling parallel processing.
- **Prompt Management**: Easily create, modify, and chain prompts.
- **Complex Systems**: Chain prompts together, using outputs from one or more prompts as inputs to others, facilitating rapid development of intricate systems.

## Installation

Chronology is available on PyPI and supports Python 3.6 and above.

To install Chronology, run:

```sh
pip install chronological
```

### Dependencies

Chronology requires the following packages:

- [`openai-api`](https://github.com/openai/openai-python)
- [`python-dotenv`](https://pypi.org/project/python-dotenv/)
- [`loguru`](https://github.com/Delgan/loguru)
- [`asyncio`](https://docs.python.org/3/library/asyncio.html) (part of the Python standard library in 3.6+)

The setup files additionally require:

- [`colorama`](https://pypi.org/project/colorama/)

## Usage

After installing the package, create a `.env` file at the root of your project and add your OpenAI API key:

```env
OPENAI_API_KEY="YOUR_API_KEY"
```

You now have a few options: you can use ChronologyUI to generate chains, or you can use the API directly.

### Using ChronologyUI

You can use [ChronologyUI](https://github.com/OthersideAI/chronology-ui) to generate chains. Here is a [Loom video](https://www.loom.com/share/47cb8d328ebd446db4d98ea1c0cac2c7?sharedAppSource=personal_library) demonstrating how to use the UI with the Python [`chronology`](https://github.com/OthersideAI/chronology) package.

### Using the API Directly

#### `main`

The `main` function is an async function that holds your business logic. Invoke this logic by passing it as an argument to `main`. **Required.**

**Example:**

```python
# You can name this function anything you want; the name "logic" is arbitrary
async def logic():
    # Call Chronology functions, awaiting those that are marked await
    prompt = read_prompt('example_prompt')
    completion = await cleaned_completion(
        prompt,
        max_tokens=100,
        engine="davinci",
        temperature=0.5,
        top_p=1,
        frequency_penalty=0.2,
        stop=["\n\n"]
    )

    print(f'Completion Response: {completion}')

    # Run additional custom logic
    for i in range(4):
        print("hello")


# Invoke the Chronology main function to run the async logic
main(logic)
```

#### `fetch_max_search_doc`

**Must be awaited.**

Fetch the document with the highest score using the OpenAI API Search.

**Options:**

- `min_score_cutoff`: If the highest score is less than this value, `None` is returned. Defaults to `-1`.
- `full_doc`: Return the entire response with the highest score (`[doc, doc.index, doc.score]`) without extracting the document for you. Defaults to `False`.

**Example:**
```python
result = await fetch_max_search_doc(query, docs, min_score_cutoff=0.5, full_doc=True)
```

#### `raw_completion`

**Must be awaited.**

Wrapper for the OpenAI API completion, returning the raw result from GPT-3.

**Example:**

```python
result = await raw_completion(prompt, max_tokens=100)
```

#### `cleaned_completion`

**Must be awaited.**

Wrapper for the OpenAI API completion, returning the whitespace-trimmed result from GPT-3.

**Example:**
```python
result = await cleaned_completion(prompt, max_tokens=100)
```
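The only difference from `raw_completion` is the whitespace handling. Conceptually (an assumption based on the description above, not the library's source), the cleaning step behaves like Python's `str.strip()`:

```python
# Illustration only: the "cleaning" amounts to trimming surrounding whitespace
raw = "\n\n  The generated answer.  \n"
cleaned = raw.strip()
print(cleaned)  # The generated answer.
```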

#### `gather`

**Must be awaited.**

Run methods in parallel (they don't need to wait for each other to finish). Method arguments must be async.

**Example:**

```python
results = await gather(
    fetch_max_search_doc(query1, docs),
    fetch_max_search_doc(query2, docs)
)
```
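This mirrors the behavior of Python's standard `asyncio.gather` (a comparison based on the description above, not a claim about the library's internals). A self-contained stdlib sketch of the same pattern, with `fake_fetch` as a stand-in coroutine:

```python
import asyncio

async def fake_fetch(query):
    # Stand-in for a slow I/O-bound call such as a GPT-3 request
    await asyncio.sleep(0.01)
    return f"result for {query}"

async def run():
    # Both coroutines are awaited concurrently rather than one after another
    return await asyncio.gather(fake_fetch("q1"), fake_fetch("q2"))

results = asyncio.run(run())
print(results)  # ['result for q1', 'result for q2']
```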

#### `read_prompt`

Read a text file from the `prompts/` directory. Pass only the file name, not the extension (e.g. `prompts/hello-world.txt` → `read_prompt('hello-world')`).

**Example:**

```python
prompt_text = read_prompt('hello-world')
```
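Under the hood this is just a file read from the `prompts/` directory. A plain-Python equivalent (an illustrative sketch, not the library's source):

```python
import tempfile
from pathlib import Path

def read_prompt_sketch(name, base="prompts"):
    # prompts/hello-world.txt -> read_prompt_sketch('hello-world')
    return Path(base, f"{name}.txt").read_text()

# Demo against a temporary directory standing in for prompts/
with tempfile.TemporaryDirectory() as tmp:
    Path(tmp, "hello-world.txt").write_text("Hello, world!")
    text = read_prompt_sketch("hello-world", base=tmp)
    print(text)  # Hello, world!
```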

#### `add_new_lines_start`

Add N new lines to the start of a string.

#### `add_new_lines_end`

Add N new lines to the end of a string.
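Both helpers are simple string utilities. They behave like the following plain-Python equivalents (illustrative reimplementations based on the descriptions above; the `_sketch` names and exact signatures are assumptions, not the library's API):

```python
def add_new_lines_start_sketch(text, count):
    # Prepend `count` newline characters to the string
    return "\n" * count + text

def add_new_lines_end_sketch(text, count):
    # Append `count` newline characters to the string
    return text + "\n" * count

print(repr(add_new_lines_start_sketch("prompt", 2)))  # '\n\nprompt'
print(repr(add_new_lines_end_sketch("prompt", 2)))    # 'prompt\n\n'
```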

#### `append_prompt`

Append new content to the end of a string.

#### `prepend_prompt`

Add new content to the start of a string.
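Like the newline helpers, `append_prompt` and `prepend_prompt` amount to string concatenation. A plain-Python sketch (the `_sketch` names and argument order are assumptions based on the descriptions, not the library's actual signatures):

```python
def append_prompt_sketch(prompt, new_content):
    # Append new content to the end of the prompt string
    return prompt + new_content

def prepend_prompt_sketch(prompt, new_content):
    # Prepend new content to the start of the prompt string
    return new_content + prompt

print(append_prompt_sketch("Hello", ", world"))   # Hello, world
print(prepend_prompt_sketch("world", "Hello, "))  # Hello, world
```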

#### `set_api_key`

Set your OpenAI API key within the code.

## Contributing

Chronology and ChronologyUI are both open source, and we welcome any contributions and feedback.

### Open Bounties

- [ ] Adding all fields accepted by the OpenAI Python API to Chronology
- [ ] Creating a test suite that calls different length chains
- [ ] Enhancing `fetch_max_search_doc` with smarter minimum-score logic
- [ ] Optimizing `gather` to run faster using [threads](https://docs.python.org/3/library/asyncio-task.html#running-in-threads)

## Learn More

Chronology is the backbone of [OthersideAI](https://OthersideAI.com). We use it to chain prompt calls and asynchronously call GPT-3. Our application is highly complex with many steps; Chronology lets us parallelize those steps, significantly cutting down the time it takes to generate an email.

To learn more about OthersideAI, check out:
- [Our Homepage](https://www.othersideai.com/)
- [Our Twitter](https://twitter.com/othersideai)

Contact us at: [email protected]

# Quick Setup Script

First install `colorama`; the other modules used below (`os`, `subprocess`, `platform`, and `sys`) are part of the Python standard library and need no installation:

```bash
pip install colorama
```

Once `colorama` is installed, paste the script below into VSCode or your IDE of choice and run it. It creates the project directory, installs the required packages, and prints verbose progress output as it goes.

```python
import os
import platform
import subprocess
import sys

from colorama import init, Fore, Style

init(autoreset=True)

# Define the project structure and files
project_name = "chronology_project"
env_file_content = 'OPENAI_API_KEY="YOUR_API_KEY"\n'
requirements = ["chronological", "openai", "python-dotenv", "loguru", "colorama"]

# Create the project directory and its starter files
def create_project_structure():
    print(f"{Fore.BLUE}Creating project structure...{Style.RESET_ALL}")

    # Create project directory
    os.makedirs(project_name, exist_ok=True)

    # Create .env file
    with open(os.path.join(project_name, '.env'), 'w') as f:
        f.write(env_file_content)
    print(f"{Fore.GREEN}Created .env file.{Style.RESET_ALL}")

    # Create a simple main.py file
    main_py_content = '''
# --- Main.py File for Chronology ---
from colorama import init, Fore, Style

init(autoreset=True)

def main():
    print(f"{Fore.GREEN}Project setup complete.{Style.RESET_ALL}")

if __name__ == "__main__":
    main()
'''
    with open(os.path.join(project_name, 'main.py'), 'w') as f:
        f.write(main_py_content)
    print(f"{Fore.GREEN}Created main.py file.{Style.RESET_ALL}")

# Install the required packages with pip
def install_packages():
    print(f"{Fore.BLUE}Installing packages...{Style.RESET_ALL}")
    for package in requirements:
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
        print(f"{Fore.GREEN}Installed {package}.{Style.RESET_ALL}")

# OS-specific setup entry points (currently identical on every platform)
def windows_setup():
    print(f"{Fore.BLUE}Running Windows setup...{Style.RESET_ALL}")
    create_project_structure()
    install_packages()

def macos_setup():
    print(f"{Fore.BLUE}Running macOS setup...{Style.RESET_ALL}")
    create_project_structure()
    install_packages()

def linux_setup():
    print(f"{Fore.BLUE}Running Linux setup...{Style.RESET_ALL}")
    create_project_structure()
    install_packages()

# Main script execution
if __name__ == "__main__":
    current_os = platform.system()
    if current_os == "Windows":
        windows_setup()
    elif current_os == "Darwin":
        macos_setup()
    elif current_os == "Linux":
        linux_setup()
    else:
        print(f"{Fore.RED}Unsupported OS: {current_os}{Style.RESET_ALL}")

    print(f"{Fore.GREEN}All set! Your project '{project_name}' has been created and is ready to use.{Style.RESET_ALL}")
```


## GitHub Project Editing
- OthersideAI.com
- [Graham Waters](https://github.com/grahamwaters)


### Summary of Changes
- Updated dependencies to include links for `colorama`.
- Refined explanations and added structure for better readability.
- Ensured all usage examples and installation instructions are clear and consistent.