Make sure you have uv installed (see the official installation instructions on their website). We're targeting Python 3.11 for now because of its newer language features.
uv sync
export SQLITE_PATH="./db.sqlite3" # Define where sqlite DB will be stored
uv run alembic upgrade head # Run migrations
If you're deploying this to a custom domain, set the ALLOWED_ORIGINS environment variable to a comma-separated list of origins, for example: export ALLOWED_ORIGINS="https://example.com,https://app.example.com"
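As a rough illustration of what a comma-separated ALLOWED_ORIGINS value looks like once parsed, here is a minimal sketch. The splitting logic below is an assumption for illustration only; the exact parsing DataLine performs may differ.

```python
import os

# Hypothetical example value; in a real deployment you'd export this in your shell.
os.environ.setdefault("ALLOWED_ORIGINS", "https://example.com,https://app.example.com")

# Split the comma-separated variable into a list of origins for CORS configuration.
origins = [o.strip() for o in os.environ["ALLOWED_ORIGINS"].split(",") if o.strip()]
print(origins)
```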
Currently, the environment can be set up through the settings page on the frontend.
You only need an OpenAI API key to start using DataLine. You may optionally use Langsmith to record logs for your LLM flow.
Note: enabling Langsmith sends the graph state to Langsmith for logging, and the graph state includes your private query results. Only enable this feature if you are comfortable sharing that data with Langsmith. We use it mainly for debugging during development.
Run migrations if needed:
uv run alembic upgrade head
You can then run uvicorn to start the backend:
# don't forget to specify your SQLITE path
# export SQLITE_PATH="./db.sqlite3"
uv run uvicorn dataline.main:app --reload --port=7377
To run tests: uv run pytest . -vv
When adding new migrations with Alembic, remember to wrap your statements in the following PRAGMA commands if the migration touches tables involved in foreign key relationships:
PRAGMA foreign_keys=OFF;
-- Your migration commands here
PRAGMA foreign_keys=ON;
The equivalent in an Alembic migration:
# disable foreign key checking
op.execute("PRAGMA foreign_keys=OFF;")
# ... your migration operations here ...
# re-enable foreign key checking
op.execute("PRAGMA foreign_keys=ON;")
This ensures that foreign key constraints are properly handled during the migration process.
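To see why this matters, the following standalone sqlite3 demo (not DataLine code) shows that with enforcement on, dropping a referenced table fails, while turning it off lets a migration drop and rebuild the table freely:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    PRAGMA foreign_keys=ON;
    CREATE TABLE parent (id INTEGER PRIMARY KEY);
    CREATE TABLE child (id INTEGER PRIMARY KEY,
                        parent_id INTEGER REFERENCES parent(id));
    INSERT INTO parent VALUES (1);
    INSERT INTO child VALUES (1, 1);
""")

# With enforcement on, dropping the referenced table fails: SQLite's implicit
# DELETE of the parent rows violates the child's foreign key.
try:
    conn.execute("DROP TABLE parent")
    drop_failed = False
except sqlite3.IntegrityError:
    drop_failed = True

# With enforcement off, a migration can drop and recreate the table freely,
# then re-enable checking when it's done.
conn.execute("PRAGMA foreign_keys=OFF")
conn.execute("DROP TABLE parent")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO parent VALUES (1, 'rebuilt')")
conn.commit()
conn.execute("PRAGMA foreign_keys=ON")
print(drop_failed)  # True
```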
They say if you ship something you're proud of, you've shipped too late.
Currently, some raw SQL from the very early MVP remains and is being replaced with SQLAlchemy queries. The LLM querying code is also fairly non-generic and hard to extend, so it will soon be replaced with an "Agent" implementation.
Set up the pre-commit hooks:
uv run pre-commit install
docker run -p 1433:1433 -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=My_password1' -d chriseaton/adventureworks:latest
DSN: mssql://SA:My_password1@localhost/AdventureWorks?TrustServerCertificate=yes&driver=ODBC+Driver+18+for+SQL+Server
Build & Run image locally: bash ./scripts/postgres_img_with_sample_data.sh
DSN: postgres://postgres:dvdrental@localhost:5432/dvdrental
docker run -p 3306:3306 -d sakiladb/mysql
DSN: mysql://sakila:[email protected]:3306/sakila