You want to upload data to AIND's Cloud Storage platform on AWS. Write access to the `aind-open-data` bucket requires authentication; please reach out to AIND Scientific Computing for access.
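Before attempting an upload, it can help to confirm that AWS credentials are visible to your environment. A minimal sketch (the helper name is our own, and credentials may also live in `~/.aws/credentials`, so a `False` here does not necessarily mean you lack access):

```python
import os

def has_aws_env_credentials() -> bool:
    """Return True if the standard AWS credential environment variables are set."""
    return bool(
        os.environ.get("AWS_ACCESS_KEY_ID")
        and os.environ.get("AWS_SECRET_ACCESS_KEY")
    )

print(has_aws_env_credentials())
```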
Install directly from PyPI. We recommend installing into a virtual environment or conda environment.

```bash
pip install aind-data-transfer-lite
```

You can interact with AIND Data Transfer Lite in three ways:
- Launch the GUI from Python (e.g., in an IDE) for interactive use.
- Run the standalone executable, which requires no Python installation.
- Use Python scripts or the command-line interface to perform data uploads programmatically.
1. Ensure dependencies are installed.
2. Either:
   - Open the file `src/aind_data_transfer_lite/ui.py` in VS Code and click "Run" in the upper right-hand corner.
   - Or run the following in the terminal:

     ```bash
     python -m aind_data_transfer_lite.ui
     ```
You should see a window titled "AIND Data Transfer Lite" appear.
During an upload job, high-level progress and status messages are displayed in the Output panel of the UI. For full, detailed logs (including validation steps and upload diagnostics), refer to the terminal where the application was launched.
For users who don’t want to install Python or dependencies, a standalone executable is available from the GitHub Releases page. Each release includes a pre-built executable that can be run directly.

Note: Executables are currently built and uploaded to each release manually by a maintainer.

Note: The standalone executable currently requires the AWS CLI to be installed in order to perform uploads.
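Since uploads depend on the AWS CLI, a quick way to confirm the `aws` executable is on your `PATH` (the helper name is our own):

```python
import shutil

def aws_cli_available() -> bool:
    """Return True if the `aws` executable can be found on PATH."""
    return shutil.which("aws") is not None

print(aws_cli_available())
```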
```python
import os
from pathlib import Path

from aind_data_transfer_lite.models import JobSettings
from aind_data_transfer_lite.upload_data import UploadDataJob

# Assuming running from same directory as this README file
cwd = os.getcwd()
behavior_path = Path(cwd) / "tests" / "resources" / "behavior_data"
ecephys_path = Path(cwd) / "tests" / "resources" / "ecephys_data"
metadata_path = Path(cwd) / "tests" / "resources" / "metadata_dir"

modality_directories = {
    "behavior": behavior_path,
    "ecephys": ecephys_path,
}
metadata_directory = metadata_path

job_settings = JobSettings(
    dry_run=True,
    modality_directories=modality_directories,
    metadata_directory=metadata_directory,
    s3_bucket="aind-open-data-dev",
)
job = UploadDataJob(job_settings=job_settings)
job.run_job()
```

The same upload can be run from the command line. On Linux/macOS:

```bash
python -m aind_data_transfer_lite.upload_data \
  --metadata_directory "./tests/resources/metadata_dir" \
  --modality_directories '{"behavior": "./tests/resources/behavior_data", "ecephys": "./tests/resources/ecephys_data"}' \
  --dry_run "True"
```

On Windows PowerShell:

```powershell
python -m aind_data_transfer_lite.upload_data `
  --metadata_directory "./tests/resources/metadata_dir" `
  --modality_directories '{\"behavior\": \"./tests/resources/behavior_data\", \"ecephys\": \"./tests/resources/ecephys_data\"}' `
  --dry_run "True"
```

For code development, clone the repo and install as
```bash
pip install -e ".[dev]"
```

This section is intended for maintainers and contributors preparing a release. End users should download pre-built executables from GitHub Releases.
Developers can build the standalone executable using PyInstaller, which automatically detects the required dependencies for the GUI and generates a working executable without any manual modification to the `.spec` file. The generated `.spec` file is committed to the repository to ensure builds are reproducible across environments.
1. Activate your development environment.
2. Ensure PyInstaller is installed:

   ```bash
   pip install pyinstaller
   ```

3. Run:

   ```bash
   pyinstaller \
     --onefile \
     --name aind-data-transfer-lite-ui \
     --windowed \
     src/aind_data_transfer_lite/ui.py
   ```

This command will:
- Generate `aind-data-transfer-lite-ui.spec`
- Build a working executable
- Populate the `dist/` directory
If you want to build the executable locally using the existing .spec file:
1. Activate your development environment.
2. Ensure PyInstaller is installed:

   ```bash
   pip install pyinstaller
   ```

3. Navigate to the repository root and build using the included spec file:

   ```bash
   pyinstaller aind-data-transfer-lite-ui.spec
   ```

The executable will appear in the `dist/` folder.
There are several libraries used to run linters, check documentation, and run tests.
- Please test your changes using the coverage library, which will run the tests and log a coverage report:

  ```bash
  coverage run -m unittest discover && coverage report
  ```

- Use interrogate to check that modules, methods, etc. have been documented thoroughly:

  ```bash
  interrogate .
  ```

- Use flake8 to check that code is up to standards (no unused imports, etc.):

  ```bash
  flake8 .
  ```

- Use black to automatically format the code into PEP standards:

  ```bash
  black .
  ```

- Use isort to automatically sort import statements:

  ```bash
  isort .
  ```

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:
```
<type>(<scope>): <short summary>
```
where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:
- build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
- ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
- docs: Documentation only changes
- feat: A new feature
- fix: A bug fix
- perf: A code change that improves performance
- refactor: A code change that neither fixes a bug nor adds a feature
- test: Adding missing tests or correcting existing tests
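The pattern above can be checked mechanically, e.g. in a pre-commit hook. A sketch using the types listed (the regex and helper are illustrative, not part of this repository):

```python
import re

# Types from the list above; scope is optional, summary is mandatory.
COMMIT_RE = re.compile(
    r"^(build|ci|docs|feat|fix|perf|refactor|test)"  # type
    r"(\([^)]+\))?"                                  # optional (scope)
    r": .+"                                          # short summary
)

def is_conventional(message: str) -> bool:
    """Check the first line of a commit message against the pattern."""
    first_line = message.splitlines()[0] if message.strip() else ""
    return COMMIT_RE.match(first_line) is not None

print(is_conventional("feat(pencil): add 'graphiteWidth' option"))  # True
print(is_conventional("updated some files"))                        # False
```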
The table below, from semantic release, shows which commit message gets you which release type when semantic-release runs (using the default configuration):
| Commit message | Release type |
|---|---|
| `fix(pencil): stop graphite breaking when too much pressure applied` | Patch Fix Release |
| `feat(pencil): add 'graphiteWidth' option` | Minor Feature Release |
| `perf(pencil): remove graphiteWidth option`<br><br>`BREAKING CHANGE: The graphiteWidth option has been removed. The default graphite width of 10mm is always used for performance reasons.` | Major Breaking Release<br>(Note that the `BREAKING CHANGE:` token must be in the footer of the commit) |
