CenikLab/Translational-buffering-ML


Translational-buffering-ML

Adapted from the LightGBM (LGBM) model described in "Predicting the translation efficiency of messenger RNA in mammalian cells"; the original code is available.

Environment Setup

  1. Create the conda environment:
    conda env create -f environment.yml --prefix ./TE_classic_ML_env/
  2. Activate the conda environment:
    conda activate ./TE_classic_ML_env/

Data

Training data is already provided in ./data/.

If you are generating the training data yourself:

  • Place/symlink appris_human_v2_selected.fa and appris_mouse_v2_selected.fa in ./data/.
  • Place/symlink human_all_biochem_feature_no_len.csv in ./biochem_and_struct_data if training with biochemical data.
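The symlinking steps above can be sketched as follows. This is a minimal example, not the repository's own script: the `touch` lines stand in for the real FASTA files you have downloaded, so replace them with your actual download paths.

```shell
# Sketch: link the APPRIS FASTA files into ./data/ as described above.
# The touch commands create placeholder files for illustration only;
# point the ln targets at your real downloads instead.
mkdir -p ./data
touch appris_human_v2_selected.fa appris_mouse_v2_selected.fa
ln -sf "$PWD/appris_human_v2_selected.fa" ./data/
ln -sf "$PWD/appris_mouse_v2_selected.fa" ./data/
```

Using `ln -sf` keeps the large FASTA files in one place and makes the links idempotent to re-run.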

Training

Training examples can be found in experiments.py. Use the -e human_corr argument to train the human correlation model (add the -s flag to save the trained models).
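A hypothetical invocation combining the flags above (it assumes experiments.py sits at the repository root and is run inside the activated environment; the -e and -s flags are the ones documented above):

```shell
# Train the human correlation model and save the resulting models.
# Assumes the TE_classic_ML_env conda environment is active.
python experiments.py -e human_corr -s
```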

Predicting

See the original code repository for prediction examples.
