# SETUP.md

It is highly recommended that you follow these instructions to make sure you have all dependencies and API keys set up as expected.

## Environment

You can install your local environment with conda (recommended) or pip. The environment configurations for conda, pip, and poetry are provided. They have all been tested on macOS. Please note that if you choose pip as your installation tool, you might need to install additional system dependencies.

If you have any problems with the environment, please raise an issue that includes the error you got. If you feel confident that you have found an improvement, please go ahead and create a pull request.

On Windows, some people have experienced difficulties with conda and pip (because of readline and ncurses). If that's the case for you, please have a look at WSL or use the Docker installation. Some people on Windows have reported that they needed to install the Microsoft C++ Build Tools.

## Pandoc

For pip and poetry, make sure you have pandoc installed on your system. On macOS, use brew:

```bash
brew install pandoc
```

On Ubuntu or Debian Linux, use apt:

```bash
sudo apt-get install pandoc
```

On Windows, you can use an installer.
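
Alternatively, if you happen to use the Chocolatey package manager on Windows, something like this should also work:

```bash
choco install pandoc
```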

## A working compiler

You should also ensure that your system has a compiler installed, for example like this (on Linux or WSL):

```bash
sudo apt install gcc g++
```

On macOS, you can use Homebrew:

```bash
brew install gcc
```

On Windows, please install Microsoft C++ Build Tools.

## Conda

This is the recommended method for installing dependencies. Please make sure you have Anaconda installed.

First, create the environment for the book, which contains all the dependencies:

```bash
conda env create --file langchain_ai.yaml --force
```

The conda environment is called langchain_ai. You can activate it as follows:

```bash
conda activate langchain_ai
```
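
To check that the environment works, you can try importing LangChain, for example:

```bash
python -c "import langchain; print(langchain.__version__)"
```

If this prints a version number, the core dependency is installed correctly.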

Please note that on Windows, you have to remove the ncurses and readline dependencies (the two lines) from the .yaml file. I needed to install the [Microsoft C++ Build Tools](https://visualstudio.microsoft.com/visual-cpp-build-tools/) (which can take about 20 GB of disk space).
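
The entries to remove should look roughly like this (the exact names and version pins in langchain_ai.yaml may differ):

```yaml
  - ncurses    # remove on Windows
  - readline   # remove on Windows
```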

## Pip

Pip is the default dependency management tool in Python. With pip, you should be able to install all the libraries from the requirements file:

```bash
pip install -r requirements.txt
```

If you are working with a slow internet connection, you might see a timeout with pip (this can also happen with conda). As a workaround, you can increase the timeout setting like this:

```bash
export PIP_DEFAULT_TIMEOUT=100
```
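
Alternatively, you can pass the timeout to a single pip invocation instead of exporting the environment variable:

```bash
pip install --timeout 100 -r requirements.txt
```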

## Docker

There's a Dockerfile for the environment as well. The image builds the environment and starts a Jupyter notebook server. To use it, first build the image and then run it:

```bash
docker build -t langchain_ai .
docker run -it -p 8888:8888 langchain_ai
```

You should be able to find the notebook in your browser at http://localhost:8888.
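
If you want the changes you make in the notebooks to persist outside the container, you can additionally mount the repository into it; the target path below is only an assumption, so check the Dockerfile for the actual working directory. Depending on the image's Jupyter configuration, you may also need to copy the access token printed in the container logs.

```bash
# /app is an assumed working directory; adjust it to match the Dockerfile
docker run -it -p 8888:8888 -v "$(pwd)":/app langchain_ai
```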

## Poetry

Make sure you have poetry installed. On Linux and macOS, you should be able to install the dependencies with:

```bash
poetry install --no-root
```

This reads the pyproject.toml file and installs all dependencies.
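
Poetry installs into its own virtual environment; to run commands inside it, prefix them with poetry run, for example (assuming Jupyter is among the installed dependencies):

```bash
poetry run jupyter notebook
```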

## Setting API keys

Following best practices regarding security, I am not committing my credentials to GitHub. You might see import statements mentioning a config.py file, which is not included in the repository. This module has a function set_environment() that sets all the keys as environment variables like this:

Example config.py:

```python
import os

def set_environment():
    os.environ['OPENAI_API_KEY'] = 'your-api-key-here'
```

Obviously, you'd put your real API credentials here. Depending on the integration (OpenAI, Azure, etc.), you need to add the corresponding API keys. The OpenAI API key is the one used most often across the code.
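
In the notebooks, the keys are then set by importing this module, roughly like this (a minimal sketch, assuming config.py is on the Python path, for example in the repository root):

```python
from config import set_environment

# sets OPENAI_API_KEY (and any other keys you add) as environment variables
set_environment()
```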

You can find more details about API credentials and setup in Chapter 3 of the book, *Generative AI with LangChain*.