The LiDAR odometry responds very slowly, and there is ghosting in the local map. #7

Open
woshinidiesb opened this issue Jan 3, 2025 · 8 comments


@woshinidiesb

Dear author, hello. I'm very grateful for your previous responses, which helped me understand why LIO didn't work properly. However, I'm now encountering two new problems:

1. When the vehicle's forward acceleration is relatively large, the LIO responds very slowly: GPS shows the vehicle has already traveled several meters while the LIO estimate only starts to creep forward. I have checked the IMU and LiDAR data and run Fast_LIO2 on the same data without finding any abnormality, so I can't figure out what causes this. Looking at your CAT data, this behavior shouldn't occur at these speeds. I hope you can help me solve this problem.

2. I have reviewed the source code and paper of Fast_LIO2, and it seems Fast_LIO2 uses feature-less matching while Fast-LIMO uses point-to-plane matching. Why is it designed this way? Point-to-plane matching appears less robust than Fast_LIO2's feature-less matching. I compared the two algorithms on a section going from uphill onto flat road and found that Fast-LIMO drifts severely at high speed, but Fast_LIO2 doesn't, and I don't know why.

@woshinidiesb
Author

General config

fast_limo:
  ros__parameters:
    topics:
      input:
        lidar: /sensing/lidar/c16_front/pointcloud_raw
        imu: /sensing/imu/imu_data
      # output:

gps: # GPS config
  topic: /sensing/gnss/pose_with_covariance
  #frame: gps
  active: true
tf:
  active: true

frames: # output frames (the transform between frames will be broadcasted)
  world: map
  body: base_link

num_threads: 20 # OpenMP threads (will be equal to $(nproc) if it is set higher)

sensor_type: 1  # LiDAR type (0: OUSTER \ 1: VELODYNE \ 2: HESAI \ 3: LIVOX)

debug: true     # fill useful intermediate pcl (deskewed, final_raw...) for visualizing purposes
verbose: true   # print debug/performance board

estimate_extrinsics: false   # continuous estimation of LiDAR-IMU extrinsics 
time_offset: true           # whether to take into account a possible sync offset between IMU and LiDAR (set to true if they are not properly in sync)
end_of_sweep: true         # whether the sweep reference time is w.r.t. the start or the end of the scan (only applies to VELODYNE/OUSTER)

calibration:  # automatic IMU calibration (if all set to false, no calibration will be done)
  gravity_align: true     # estimate gravity vector
  accel: true             # estimate lin. accel. bias
  gyro: true              # estimate ang. vel. bias
  time: 5.0               # [s] time to estimate (during this time, the robot must be at stand still)



extrinsics: # w.r.t baselink [SI]
  imu:
    t: [ 1.2, 0.35, 0.8 ]
    R: [ 0.,  -1.,  0.,
          1.,  0.,  0.,
          0.,  0.,  1. ]
  lidar:
    t: [ 1.78,  -0.13,  1.45 ]
    R: [ 1.,  0.,  0.,
          0.,  1.,  0.,
          0.,  0.,  1. ]


intrinsics:
  accel:
    bias: [ 0.01, 0.01, 0.01 ]  # [m/s^2]
    sm:   [ 1.,  0.,  0.,
            0.,  1.,  0.,
            0.,  0.,  1. ]
  gyro:
    bias: [ 0.00045, 0.00051, -0.01 ]  # [rad/s]

filters:
  cropBox: # prism crop
    active: true
    box:
      min: [ -1.5, -1.5, -1.8 ]  # [m]
      max: [ 1.5, 1.0, 2.0 ]     # [m]

  voxelGrid:
    active: true
    leafSize: [ 0.2, 0.2, 0.2 ]

  minDistance: # sphere crop
    active: true
    value: 3.5  # [m]

  FoV: # crop field of view
    active: true
    value: 360.0  # [deg]
  
  rateSampling: # quick downsample
    active: true
    value: 8

iKFoM:  # Iterative Kalman Filter on Manifolds lib
  MAX_NUM_ITERS: 10                # max num+1 of iterations of the KF (limits comp. load)
  MAX_NUM_MATCHES: 5000           # max num of matches to account for when computing Jacobians (limits comp. load)
  MAX_NUM_PC2MATCH: 10000         # max num of points to consider when matching with map (limits comp. load)
  LIMITS: 0.01                 

  Mapping:
    NUM_MATCH_POINTS: 5           # num of points that constitute a match
    MAX_DIST_PLANE: 2.0           # [m] max distance between points of every match
    PLANES_THRESHOLD: 5.0e-2      # [m] threshold to consider if match points are part of a plane 
    LocalMapping: false            # whether to keep a fixed size map (moving with the robot) or a limitless map (when active, it limits the memory in use)
    
  iKDTree: # Incremental KD Tree lib
    balance: 0.6      # tree balancing param
    delete: 0.3       # tree deletion param
    voxel: 0.1        # downsampling value
    bb_size: 400.0   # [m] Local Map's bounding box dimension
    bb_range: 150.0   # [m] Local Map's bounding box moving range (if the robot is closer than bb_range to any local map edge, the BB will move)

  covariance:
    gyro: 0.0001 
    accel: 0.01
    bias_gyro: 0.00001
    bias_accel: 0.0001
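
A quick side note on the extrinsics block above: assuming the rotation is given as imu-to-base_link (as the imu2baselink field in Common.hpp suggests), R = [0 -1 0; 1 0 0; 0 0 1] means the IMU x axis points along base_link +y, i.e. the IMU is mounted rotated 90 degrees about z. A minimal Eigen sketch to sanity-check that mapping (variable names are illustrative, not fast_limo's):

// Sanity-check sketch for the imu extrinsics above (illustrative only).
#include <Eigen/Dense>
#include <iostream>

int main() {
    Eigen::Matrix3f R_imu2base;
    R_imu2base << 0.f, -1.f, 0.f,
                  1.f,  0.f, 0.f,
                  0.f,  0.f, 1.f;
    const Eigen::Vector3f t_imu2base(1.2f, 0.35f, 0.8f);

    // A vector along the IMU x axis, expressed in base_link, should point
    // along +y if the mounting assumption above is right.
    const Eigen::Vector3f x_imu(1.f, 0.f, 0.f);
    std::cout << (R_imu2base * x_imu).transpose() << std::endl;  // prints: 0 1 0

    // Homogeneous transform as stored in fast_limo's Extrinsics struct.
    Eigen::Matrix4f T_imu2base = Eigen::Matrix4f::Identity();
    T_imu2base.block<3, 3>(0, 0) = R_imu2base;
    T_imu2base.block<3, 1>(0, 3) = t_imu2base;
    return 0;
}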

@woshinidiesb
Author

The above is my configuration file. I'm using the ROS 2 Humble branch.

@woshinidiesb
Author

By the way, this LiDAR is from an uncommon brand. Its point cloud timestamp field is named time and its type is double, so the Velodyne timestamp type in the Common header has to be changed to double in order to parse the point times correctly.
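
To illustrate why the field type matters: the per-point times drive the motion compensation, roughly as in the sketch below (illustrative only, not fast_limo's actual deskew code; pose_at is a hypothetical stand-in). If the times are parsed with the wrong type, every point gets re-projected with a wrong pose and the scan smears, which shows up as ghosting in the local map.

// Illustrative sketch: how a per-point relative time (seconds since the sweep
// reference) is typically used to deskew a scan. Not taken from fast_limo;
// pose_at is a hypothetical helper (identity here just so the sketch runs).
#include <Eigen/Dense>
#include <vector>

struct RawPoint {
    Eigen::Vector3f xyz;
    double time;  // seconds since the sweep reference, as parsed from the cloud
};

static Eigen::Isometry3f pose_at(double /*stamp*/) {
    return Eigen::Isometry3f::Identity();  // stand-in for the estimator's pose
}

std::vector<Eigen::Vector3f> deskew(const std::vector<RawPoint>& scan,
                                    double scan_start, double ref_stamp) {
    const Eigen::Isometry3f T_ref_inv = pose_at(ref_stamp).inverse();
    std::vector<Eigen::Vector3f> out;
    out.reserve(scan.size());
    for (const auto& p : scan) {
        // Pose of the sensor at the instant this particular point was captured.
        const Eigen::Isometry3f T_p = pose_at(scan_start + p.time);
        out.push_back(T_ref_inv * (T_p * p.xyz));  // express the point at ref time
    }
    return out;
}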

@woshinidiesb
Author

/*
Copyright (c) 2024 Oriol Martínez @fetty31

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see https://www.gnu.org/licenses/.
*/

#ifndef FASTLIMO_COMMON_HPP
#define FASTLIMO_COMMON_HPP

#ifndef HAS_CUPID
#include <cpuid.h>
#endif

#define FAST_LIMO_v "1.0.0"

// System
#include <cstdint>
#include <string>
#include <sstream>
#include <vector>
#include <sys/times.h>

#include <stdio.h>
#include <stdlib.h>

// Eigen
#include <Eigen/Dense>

template <typename T>
std::string to_string_with_precision(const T a_value, const int n = 6)
{
    std::ostringstream out;
    out.precision(n);
    out << std::fixed << a_value;
    return out.str();
}

// FASTLIOv2
#include "IKFoM/use-ikfom.hpp"
#include "ikd-Tree/ikd_Tree/ikd_Tree.h"

// Boost
#include <boost/format.hpp>
#include <boost/make_shared.hpp>
#include <boost/circular_buffer.hpp>
#include <boost/algorithm/string.hpp>
#include <boost/range/adaptor/indexed.hpp>
#include <boost/range/adaptor/adjacent_filtered.hpp>
#include <boost/range/adaptor/filtered.hpp>

// PCL
#define PCL_NO_PRECOMPILE
#include <pcl/filters/crop_box.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/statistical_outlier_removal.h>
#include <pcl/io/pcd_io.h>

namespace fast_limo {
enum class SensorType { OUSTER, VELODYNE, HESAI, LIVOX, UNKNOWN };

// MODULES

class Localizer;
class Mapper;

// OBJECTS

class State;
class Plane;
class Match;

// UTILS

struct Config;

// STRUCTURES

struct Point {
    Point(): data{0.f, 0.f, 0.f, 1.f} {}
    Point(float x, float y, float z): data{x, y, z, 1.f} {}

    PCL_ADD_POINT4D;
    float intensity;
    union {
        std::uint32_t t;   // (Ouster) time since beginning of scan in nanoseconds
        double time;       // (Velodyne) time since beginning of scan in seconds
        double timestamp;  // (Hesai) absolute timestamp in seconds
                           // (Livox) absolute timestamp in (seconds * 10e9)
    };
    EIGEN_MAKE_ALIGNED_OPERATOR_NEW

} EIGEN_ALIGN16;

struct Extrinsics{
    struct SE3 {
        Eigen::Vector3f t;
        Eigen::Matrix3f R;
    };
    SE3 imu2baselink;
    SE3 lidar2baselink;
    Eigen::Matrix4f imu2baselink_T;
    Eigen::Matrix4f lidar2baselink_T;
};

struct IMUmeas{
    double stamp;
    double dt; // defined as the difference between the current and the previous measurement
    Eigen::Vector3f ang_vel;
    Eigen::Vector3f lin_accel;
    Eigen::Quaternionf q;
};

}

POINT_CLOUD_REGISTER_POINT_STRUCT(fast_limo::Point,
(float, x, x)
(float, y, y)
(float, z, z)
(float, intensity, intensity)
(std::uint32_t, t, t)
(double, time, time)
(double, timestamp, timestamp))

typedef fast_limo::Point PointType;
typedef pcl::PointXYZ MapPoint;
typedef std::vector<pcl::PointXYZ, Eigen::aligned_allocator<pcl::PointXYZ>> MapPoints;
typedef std::vector<PointType, Eigen::aligned_allocator<PointType>> LocPoints;

typedef std::vector<fast_limo::Match> Matches;
typedef std::vector<fast_limo::Plane> Planes;
typedef std::vector<fast_limo::State> States;

#endif


@fetty31
Owner

fetty31 commented Feb 28, 2025

Hi @woshinidiesb ,

Regarding the FAST_LIOv2 featureless matching that you mention I don't think you got that right. FAST_LIOv2 uses an Extended Kalman Filter (eKF) architecture in order to estimate the ego position of the robot. This eKF uses a kinematic model for state propagation and point-to-plane matching as the measurement model. Thus, FAST_LIOv2 is actually minimizing point-to-plane distances in order to solve the problem. Fast-LIMO is doing exactly the same procedure, with the only difference that there's no accumulation process. Instead, a multithreaded approach is defined with the purpose of reducing as much as possible the estimation latency and offering an alternative to exploit LiDAR drivers that publish at much higher frequencies (e.g. not waiting for the sensor to finish a whole rotation).
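
To make the measurement model concrete, here is a minimal sketch of the point-to-plane residual that both filters minimize (illustrative only, written against plain Eigen rather than the library's actual types):

// Illustrative sketch of the point-to-plane measurement model used in the
// iterated Kalman update (not the library's exact code; types simplified).
#include <Eigen/Dense>

struct PlaneFit {
    Eigen::Vector3f n;  // unit normal of the plane fit to the nearest map points
    float d;            // offset, so n.dot(q) + d = 0 for any point q on the plane
};

// Residual for one match: signed distance of the transformed scan point to its
// plane, r = n^T (R p + t) + d, with (R, t) the current pose estimate. The
// iterated KF repeats the update until the state correction becomes small
// (cf. the LIMITS parameter in the config above).
inline float point_to_plane_residual(const Eigen::Matrix3f& R,
                                     const Eigen::Vector3f& t,
                                     const Eigen::Vector3f& p,
                                     const PlaneFit& plane) {
    return plane.n.dot(R * p + t) + plane.d;
}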

@woshinidiesb
Author

> Hi @woshinidiesb ,
>
> Regarding the FAST_LIOv2 featureless matching that you mention I don't think you got that right. FAST_LIOv2 uses an Extended Kalman Filter (eKF) architecture in order to estimate the ego position of the robot. This eKF uses a kinematic model for state propagation and point-to-plane matching as the measurement model. Thus, FAST_LIOv2 is actually minimizing point-to-plane distances in order to solve the problem. Fast-LIMO is doing exactly the same procedure, with the only difference that there's no accumulation process. Instead, a multithreaded approach is defined with the purpose of reducing as much as possible the estimation latency and offering an alternative to exploit LiDAR drivers that publish at much higher frequencies (e.g. not waiting for the sensor to finish a whole rotation).

Thank you for your reply. I think I understand the principle now, but I still don't know why the velocity estimate has such a large error. Could you please help me with that?
