CV_Zebrafish

Overview

Computer-vision toolkit for analyzing zebrafish movement data exported from DeepLabCut, computing metrics such as fin angles, movement timing, speed, and behavior classification. The project ships a PyQt UI, a calculation pipeline, validation utilities, Plotly-based visualizations, and historical parity checks against the legacy “Bruce” workflow.

Repository Layout

.
├── app.py                   # PyQt entry point
├── webengineDemo.py         # Plotly demo
├── assets/
│   ├── images/                       # UI illustrations and icons
│   └── sample_data/                  # Placeholder for bundled DLC runs
├── configs/
│   ├── defaults/                     # Reserved for auto-generated configs
│   └── samples/
│       ├── csv/                      # Curated DeepLabCut exports
│       └── jsons/                    # BaseConfig + autogenerated variants
├── docs/
│   ├── architecture/calculations/    # Implementation notes + comparisons
│   ├── howtos/{ui,validation}/       # Step-by-step guides
│   └── product/{meeting_minutes,...} # Product docs and presentations
├── legacy/
│   ├── codes/                        # Bruce pipeline + configs/results
│   ├── dlc/, input_data/             # Historical DLC artifacts
│   └── reports/                      # Output CSVs, QA assets
├── requirements/validation/          # Narrow dependency set for validators
├── src/cvzebrafish/
│   ├── core/
│   │   ├── calculations/
│   │   ├── config/configSetup.py     # Config discovery helpers
│   │   ├── parsing/Parser.py         # DeepLabCut CSV parser
│   │   └── validation/
│   ├── data/
│   │   ├── repositories/
│   │   └── sources/
│   ├── platform/
│   │   ├── __init__.py
│   │   ├── paths.py
│   │   └── __pycache__/
│   ├── ui/
│   │   ├── __init__.py
│   │   ├── .gitignore
│   │   ├── __pycache__/
│   │   ├── components/
│   │   ├── resources/
│   │   └── scenes/
│   └── viz/
│       ├── adapters/
│       ├── export/
│       └── figures/
├── tests/
│   ├── unit/core/
│   ├── integration/, e2e/ (stubs)
│   └── conftest.py + shared fixtures
├── AGENTS.md, contributing.md, LICENSE, environment.yml, requirements.txt
└── package*.json                     # Front-end prototype scaffolding

Core Functionality Highlights

  • Parsing & Config Loading: src/cvzebrafish/core/parsing/Parser.py maps DLC CSV columns into structured point dictionaries (a rough parsing sketch follows this list), while core/config/configSetup.py cascades through local/sample config locations and backfills missing files from BaseConfig.json.
  • Calculation Pipeline: core/calculations/ orchestrates kinematic metrics (angles, yaw, bout detection).
  • Validation Utilities: core/validation/json_verifier.py and csv_verifier.py enforce schema integrity, check DLC CSV structure/ranges, and scaffold configs. Minimal dependencies live in requirements/validation/requirements.txt.
  • Visualization & Export: viz/adapters/output_adapter.py and viz/figures/outputDisplay.py turn calculation outputs into Plotly charts plus CSV/HTML artifacts.
  • PyQt UI Flow: app.py boots the multi-scene UI defined under src/cvzebrafish/ui/scenes (Landing, CSV/JSON inputs, Config generators, Calculation/Graph viewer, Validators) and reuses shared widgets from ui/components/.
  • Data Ingestion & Persistence: src/cvzebrafish/data/repositories and src/cvzebrafish/data/sources normalize DLC outputs into SQLite (CV_Zebrafish.db), track file hashes, and record ingestion runs for reproducibility (see the ingestion sketch after this list).
  • Legacy Parity & Docs: legacy/ mirrors the prior code path so regression comparisons and the documentation in docs/architecture/.../comparison_legacy_vs_new.md stay grounded.
  • Automated Tests: tests/unit/core exercises the parser and metrics against golden CSV fixtures; pytest is configured via conftest.py.
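
As a rough illustration of the parsing step, the snippet below folds a DeepLabCut CSV (with its standard scorer/bodyparts/coords header rows) into per-frame point dictionaries using plain pandas. It is a sketch of the general idea only; the function name and return structure are placeholders, not the actual API of Parser.py.

# Illustrative sketch only: the real mapping lives in
# src/cvzebrafish/core/parsing/Parser.py; names below are assumptions.
import pandas as pd


def load_dlc_points(csv_path: str) -> list[dict]:
    """Map a DLC export into one {bodypart: (x, y, likelihood)} dict per frame."""
    # Standard DLC CSVs carry a three-row header and a frame-index column.
    df = pd.read_csv(csv_path, header=[0, 1, 2], index_col=0)

    frames = []
    for _, row in df.iterrows():
        points = {}
        # Columns are (scorer, bodypart, coord) triplets in x/y/likelihood order.
        for scorer, bodypart, _ in df.columns[::3]:
            points[bodypart] = (
                float(row[(scorer, bodypart, "x")]),
                float(row[(scorer, bodypart, "y")]),
                float(row[(scorer, bodypart, "likelihood")]),
            )
        frames.append(points)
    return frames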
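
The ingestion bookkeeping amounts to hashing each input file and logging the run. Below is a minimal sketch of that pattern using only the standard library; the table and column names are hypothetical, and the real schema and repository classes live under src/cvzebrafish/data/.

# Sketch of the ingestion-bookkeeping idea only; names are hypothetical.
import hashlib
import sqlite3
from datetime import datetime, timezone


def record_ingestion(db_path: str, csv_path: str) -> str:
    """Hash a DLC export and log the run so repeated ingests stay traceable."""
    with open(csv_path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()

    # The connection context manager commits the transaction on success.
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS ingestion_runs (
                   file_path TEXT, sha256 TEXT, ingested_at TEXT)"""
        )
        conn.execute(
            "INSERT INTO ingestion_runs VALUES (?, ?, ?)",
            (csv_path, digest, datetime.now(timezone.utc).isoformat()),
        )
    return digest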

Python Environment

You can use Conda or a virtual environment created with python -m venv.

Option A: pip / venv

python -m venv .venv
# Windows
.venv\Scripts\activate
# macOS / Linux
source .venv/bin/activate
pip install -r requirements.txt

Option B: Conda

conda env create -f environment.yml
conda activate dlc

Deactivate the environment with deactivate (venv) or conda deactivate when you finish.

Running the App

From the repository root:

python app.py

The UI walks through CSV/JSON selection, validation, configuration tweaks, calculation runs, and graph review.

Plot Preview Dependencies

  • The Graph Viewer scene renders Plotly figures to PNG via Kaleido. Install it alongside the UI dependencies with pip install kaleido so in-app dot plots appear correctly.
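
To confirm Kaleido is wired up before launching the UI, you can render a throwaway figure from a Python shell. This is just a smoke test, unrelated to the app's own figures:

# Quick Kaleido check: writing PNG via Plotly requires the kaleido package.
import plotly.graph_objects as go

fig = go.Figure(go.Scatter(x=[0, 1, 2], y=[1, 3, 2], mode="markers"))
fig.write_image("kaleido_check.png")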

Useful CLI Utilities

  • pytest to execute the unit suites
  • python -m cvzebrafish.core.validation.json_verifier <config.json> (or run without arguments to be prompted for a path)
  • python -m cvzebrafish.core.validation.csv_verifier to be prompted for a DLC CSV and run structural checks

License

This project is licensed under the MIT License; see the LICENSE file for details.
