## Introduction
This ROS package contains a visual-inertial-leg odometry (VILO) for the Unitree A1 and Go1 robots. Our goal is to provide a compact, low-cost, long-term position sensing suite for legged robots (the sensing solution has only one IMU, one stereo camera, and leg sensors; total cost < $1000).
The focus of this work is adding body velocity calculated from leg joint sensors and calibrating potential kinematic parameter errors to improve accuracy when used in VILO. This odometry uses the optimization framework of VINS-Fusion, one of the most popular visual-inertial odometry systems. In addition to VINS's image and IMU measurement models, we add a special contact preintegration term. It achieves less than 1% position estimation drift on various datasets. More details of the theoretical contribution can be found in our recent paper:
Here are two videos comparing the performance of the VILO and VINS-Fusion:
## Installation
Use Docker and the VSCode "Remote - Containers" extension. A Dockerfile that configures the development environment is provided in .devcontainer/Dockerfile.
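If you prefer to build the image by hand instead of letting the Remote - Containers extension build it automatically, something along these lines should work (a sketch; the image tag `vilo-dev` is just an example name):

```shell
# build the development image from the provided Dockerfile (run from the repository root)
docker build -f .devcontainer/Dockerfile -t vilo-dev .
```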
During Docker image compilation, the system memory may be exhausted if your computer has less than 16 GB of RAM. Please add 8-16 GB of swap space following [this tutorial (3. Adding swap file)](https://www.thegeekdiary.com/how-to-add-swap-space-in-linux/).
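For reference, the swap setup boils down to commands like the following (a sketch assuming an 8 GB swap file; adjust the size to your needs):

```shell
# allocate and enable an 8 GB swap file
sudo fallocate -l 8G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# verify that the swap space is active
free -h
```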
The state estimation result can be visualized using Rviz. If the user's computer already has ROS installed, open Rviz directly and load the config file config/rviz/vilo_rviz_config.rviz. If the user's computer does not have ROS installed, Rviz can be run inside the Docker container. However, in order to forward the Rviz interface to the host computer, a series of nontrivial settings is needed to get the graphics driver and xhost properly configured. Please refer to [this tutorial](https://github.com/ShuoYangRobotics/A1-QP-MPC-Controller/blob/main/README.md#setup) for more information.
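For example, on a host with ROS installed, the config can be loaded from the command line (a sketch; run from the repository root):

```shell
# open Rviz with the provided display configuration
rosrun rviz rviz -d config/rviz/vilo_rviz_config.rviz
```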
## Demo Datasets
A Google drive folder https://drive.google.com/drive/folders/13GsFDaBkDrslOl9BfE4AJnOn3ECDXVnc contains several datasets for testing the VILO. Download them to \${PATH_TO_CODE_REPO}/bags. Since we directly map the code into a location inside the Docker container (/root/vilo_ws), if we download dataset rosbags to
Connect to the remote container and make sure you have the bags in /root/vilo_ws/src/vilo/b
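For example, on the host (a sketch; adjust the source path to wherever your browser saved the downloads):

```shell
# place the downloaded rosbags where the container expects them
mkdir -p ${PATH_TO_CODE_REPO}/bags
mv ~/Downloads/*.bag ${PATH_TO_CODE_REPO}/bags/
```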
1. street.bag. We use the following command to run this bag:
```shell
roslaunch vilo run_street_bag_vilo.launch
```
2. campus.bag
```shell
roslaunch vilo run_campus_bag_vilo.launch
```
3. track.bag
```shell
roslaunch vilo run_track_bag_vilo.launch
```
4. outdoor_snow.bag. The bag contains sensor data collected during the snow walking run shown in the second video above.
5. indoor_with_ground_truth_1.bag. The robot moves forward and back quickly. Ground truth data is
Note that rosbag playback should be slowed down on slow computers; otherwise the VILO cannot finish its computation in time. In the launch files we play the bags at 0.5x the original speed.
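If you play a bag by hand instead of through a launch file, the same slowdown can be applied with rosbag's rate option (a sketch; run from the directory containing the bags, and lower the rate further on slower machines):

```shell
# play a bag at half the recorded speed
rosbag play -r 0.5 street.bag
```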
## Examine Results
While the algorithm is running, a number of ROS topics containing estimation results are published. Moreover, some of the estimation results are saved as CSV files in the output/ folder.
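A quick way to inspect both is sketched below (substitute a real topic name from the info list in the next section; the output/ path is relative to wherever the node writes it):

```shell
# list the currently published topics and watch one of them
rostopic list
rostopic echo /some_estimation_topic   # placeholder name, replace with an actual topic
# look at the CSV files accumulated so far
ls output/*.csv
```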
## Custom Sensor Setup
The VILO only works properly when sensor topics are received correctly and all sensor transformations are set properly.
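A few generic ROS commands can help verify this (a sketch; the topic names below are placeholders, not the actual names used by this package, and the transformation check only applies if the extrinsics are published on TF):

```shell
# confirm each sensor topic is arriving at the expected rate
rostopic hz /your_imu_topic        # placeholder
rostopic hz /your_camera_topic     # placeholder
# dump the TF tree to frames.pdf to inspect the sensor transformations
rosrun tf view_frames
```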
From the info list, the most important topics are
One caveat when users create the same topics is that the joint velocities generated directly by the A1 robot are very noisy. My controller differentiates joint angles to generate joint velocities.