Everything has been migrated to:
https://github.com/dfl-rlab/dddmr_navigation
We've just integrated Gazebo models of the Unitree Go2 with the DDDMR Navigation Stack, unlocking true 3D navigation for simulation and testing. Using the latest Go2 quadruped robot combined with our stack, you can explore navigation capabilities that go far beyond traditional 2D navigation frameworks.
Jump in, simulate, and experience features that Nav2 alone can't achieve: multi-level mapping, ramp navigation, and obstacle handling in complex environments.
Let's play Go2 using DDDMR navigation
![]() Airy as the belly button!?
![]() DDDMR SLAM
Let's play Airy using DDDMR navigation
Note
The DDDMR Navigation Stack is designed to solve issues that Nav2 is not able to handle, such as multi-layer floor mapping and localization, path planning in stereo structures, and perception marking and clearing in a 3D point cloud map.
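To make the marking-and-clearing idea concrete, here is a minimal, hypothetical sketch of a 3D voxel obstacle layer. This is not DDDMR's implementation; the voxel size and the set-based grid are illustrative choices only.

```python
# Minimal illustration of "marking and clearing" in a 3D voxel map.
# Sensor returns mark voxels as occupied; voxels later observed to be
# free (e.g. via raytracing) are cleared again. Illustrative only.

VOXEL = 0.1  # voxel edge length in meters (assumed value)

def to_key(p, voxel=VOXEL):
    """Quantize a 3D point (x, y, z) to an integer voxel key."""
    return tuple(int(c // voxel) for c in p)

class VoxelLayer:
    def __init__(self):
        self.occupied = set()  # keys of voxels currently marked as obstacles

    def mark(self, points):
        """Insert obstacle returns into the 3D grid."""
        self.occupied.update(to_key(p) for p in points)

    def clear(self, points):
        """Remove voxels observed to be free."""
        self.occupied.difference_update(to_key(p) for p in points)

layer = VoxelLayer()
layer.mark([(0.05, 0.02, 1.23), (2.0, 0.0, 0.4)])  # two obstacle hits
layer.clear([(0.07, 0.01, 1.21)])                  # same voxel seen free again
```

The key point is that both marking and clearing operate in full 3D, so an obstacle on a ramp above the robot does not pollute the plane below it.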
![]() Multilevel map
![]() Obstacle avoidance on ramps
![]() Navigating while mapping
![]() Semantic segmentation and navigation (stay tuned)
DDDMR navigation (3D Mobile Robot Navigation) is a navigation stack that allows users to map, localize, and autonomously navigate in 3D environments.
The figure below shows a comparison between a 2D navigation stack and DDD (3D) navigation. Our stack is a total solution for a mobile platform navigating in 3D environments. There are plenty of advantages to choosing DDD navigation:
The standard procedure for DDD mobile robots is the same as for 2D mobile robots, making it easy for 2D navigation stack users to transition to DDD navigation:
- Map the environment and refine the map using our packages and tools.
- Turn off mapping and use MCL to localize the robot by providing an initial pose.
- Send a goal to the robot; the robot will calculate the global plan and avoid obstacles using the local planner.
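The localization step above amounts to handing the localizer an initial pose. As a hedged sketch in pure Python, the snippet below assembles the fields of a `PoseWithCovarianceStamped`-like message (field layout mirrors the ROS 2 `geometry_msgs` convention; the actual topic name and message type used by dddmr_mcl_3dl are assumptions to be checked against that package):

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar heading (radians) to a quaternion (x, y, z, w)."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def make_initial_pose(x, y, z, yaw, frame_id="map"):
    """Build the fields of an initial-pose message.

    In a real ROS 2 node this dict would become a
    geometry_msgs/PoseWithCovarianceStamped published to the
    localizer's initial-pose topic (topic name is an assumption;
    check dddmr_mcl_3dl for the actual interface).
    """
    qx, qy, qz, qw = yaw_to_quaternion(yaw)
    return {
        "header": {"frame_id": frame_id},
        "pose": {
            "position": {"x": x, "y": y, "z": z},
            "orientation": {"x": qx, "y": qy, "z": qz, "w": qw},
        },
    }

# Robot at (1, 2, 0) facing +y (90 degrees) in the map frame.
pose = make_initial_pose(1.0, 2.0, 0.0, math.pi / 2)
```

Note that unlike the 2D case, the initial pose carries a meaningful `z`, which is what lets MCL localize on the correct floor of a multi-level map.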
DDD navigation no longer suffers from difficult terrain, for example ramps in factories or wheelchair-accessible slopes.
DDD navigation has been well tested in many field deployments and is built on cost-effective hardware: a 16-line lidar, an Intel NUC or Jetson Orin Nano, and a consumer-grade IMU. We are trying to make the solution as affordable as possible.
We would like to thank the contributors and users of Navigation and Navigation2; DDD navigation stands on your shoulders!
I have a robot, but where do I start? Click me to see the beginner's guide
- Mapping: https://github.com/dfl-rlab/dddmr_navigation/tree/main/src/dddmr_lego_loam
- Localization: https://github.com/dfl-rlab/dddmr_navigation/tree/main/src/dddmr_mcl_3dl
- Perception: https://github.com/dfl-rlab/dddmr_navigation/tree/main/src/dddmr_perception_3d
- Global planner: https://github.com/dfl-rlab/dddmr_navigation/tree/main/src/dddmr_global_planner
- Local planner: https://github.com/dfl-rlab/dddmr_navigation/tree/main/src/dddmr_local_planner
- Move base: https://github.com/dfl-rlab/dddmr_navigation/tree/main/src/dddmr_p2p_move_base

![]() 3D mapping
![]() 3D global planning
![]() 3D local planning
![]() 3D navigation
![]() Support for various sensors (Unitree G4)
![]() Support for various sensors (Depth Camera)