This project was completed by a group of five students as part of a research class during the Spring 2024 semester, in collaboration with our industry partner, GE Research.
- Python
- PyTorch
- PyBullet
- Hugging Face (LeRobot)
- OpenAI Gymnasium
- Gazebo3D
- XML
- RoboHive
- Ubuntu
- Windows
  - Must use WSL (Windows Subsystem for Linux)
- macOS
Please open GE_Research_Project_Part_1.ipynb and run each cell as outlined in the file. Ensure you are using Python >=3.10.0 to run the file and that all dependencies can be properly installed.
- Install MuJoCo and pyenv (Python version management)
  - Install MuJoCo
  - Install pyenv
Linux/WSL:

```shell
sudo apt update
sudo apt-get install -y build-essential zlib1g-dev libffi-dev libssl-dev liblzma-dev libbz2-dev libreadline-dev libsqlite3-dev
curl https://pyenv.run | bash
echo -e 'export PATH="$HOME/.pyenv/bin:$PATH"\neval "$(pyenv init --path)"\neval "$(pyenv init -)"\neval "$(pyenv virtualenv-init -)"' >> ~/.bashrc
source ~/.bashrc
```

NOTE: the `echo` command adds pyenv to `.bashrc` so it is initialized every time you open a terminal.
macOS:

```shell
brew install pyenv
brew install pyenv-virtualenv
```
- Install a known working version of Python

```shell
pyenv install 3.9.18
```
- Clone the repo

```shell
git clone https://github.com/purdue-mars/VIP-GEAI
cd VIP-GEAI/02-block-pick-and-place
```
- Create a virtual environment, then install the repo and its dependencies

```shell
pyenv virtualenv 3.9.18 pick_n_place
pyenv activate pick_n_place
pip3 install -e .
```
Goal: get the robot to pick up the brick in the right bin and place it within the target green zone in the left bin by following a precomputed trajectory open-loop.
Below is a video of our demo:
GEResearch.-.Robohive.x.OpenAI.Simulation.mp4
To run the starter script:

```shell
cd pick_n_place
python3 part_1/main.py
```
NOTE: Initially, the robot will just go to the target green zone.

Update the `part_1/main.py` file with the following steps:
- STEP 1.1: Generate Task-relevant Poses
- STEP 1.2: Create Joint-position trajectory following the task-relevant poses
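The two steps above might be sketched as follows. All waypoint values and the helper name below are illustrative stand-ins, not the actual solution; in `main.py` the poses come from the scene, and robohive's `generate_joint_space_min_jerk` performs comparable interpolation.

```python
def min_jerk_interp(q_start, q_goal, horizon, dt):
    """Joint-space minimum-jerk interpolation between two joint configs.
    Uses the classic profile s(tau) = 10*tau^3 - 15*tau^4 + 6*tau^5,
    which starts and ends with zero velocity and acceleration."""
    n_steps = int(round(horizon / dt))
    traj = []
    for k in range(n_steps + 1):
        tau = k / n_steps
        s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
        traj.append([qs + s * (qg - qs) for qs, qg in zip(q_start, q_goal)])
    return traj

# STEP 1.1: task-relevant joint configurations (hypothetical values)
q_home     = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
q_pregrasp = [0.3, -0.5, 0.2, -1.8, 0.0, 1.4, 0.8]

# STEP 1.2: joint-position trajectory following those poses
trajectory = min_jerk_interp(q_home, q_pregrasp, horizon=2.0, dt=0.02)
```

Chaining one such segment per waypoint (pregrasp, grasp, lift, place) yields the full open-loop trajectory the robot plays back.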
Helpful tips:
- On the left sidebar of the MuJoCo renderer, click:
  - Rendering > Frame > Site|Body|Geom to toggle rendering of the coordinate frames
  - Rendering > Label > Site|Body|Geom to toggle rendering of the labels
- Helpful utils are provided for you in the robohive library:
- quat2mat: convert quaternion to rotation matrix
- mat2quat: convert rotation matrix to quaternion
- generate_joint_space_min_jerk: generate joint space trajectory given start and end joint angles
This part utilizes motion planning algorithms to create collision-free trajectories that complete the pick-and-place task, using the Hugging Face LeRobot library.
Below is a video of us training our model with LeRobot:
GE.Research.-.LeRobot.Machine.Learning.Simulation.mp4
Often, a robotic system does not have perfect perception of the object pose in the real world. How can the arm still perform robust pick-and-place using only raw images as input?
Get an introduction to learning techniques commonly found in robotics:
- reinforcement learning
- behavior cloning
We plan to implement these techniques in future iterations of this project.
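As a toy illustration of behavior cloning — supervised learning of a policy from demonstration state–action pairs — here is a one-dimensional linear policy fit by gradient descent. This is an assumed, minimal sketch, not the LeRobot API; LeRobot trains neural, image-conditioned policies on the same principle.

```python
def behavior_clone(states, actions, lr=0.1, epochs=1000):
    """Fit a 1-D linear policy a = w*s + b to demonstration pairs by
    minimizing mean squared error with gradient descent."""
    w, b = 0.0, 0.0
    n = len(states)
    for _ in range(epochs):
        grad_w = sum((w * s + b - a) * s for s, a in zip(states, actions)) * 2 / n
        grad_b = sum((w * s + b - a) for s, a in zip(states, actions)) * 2 / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Demonstrations from an "expert" whose policy is a = 2*s + 1
demo_s = [0.0, 1.0, 2.0, 3.0]
demo_a = [2 * s + 1 for s in demo_s]
w, b = behavior_clone(demo_s, demo_a)  # converges to roughly w=2, b=1
```

The same recipe scales up by swapping the linear map for a neural network and the 1-D state for camera images.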
- Thank you to Shaopeng Liu @ GE Research for meeting with us to go over our work throughout the semester!