Add pose task to IK #54
base: main
Conversation
As documented in https://github.com/ami-iit/element_motion-generation-with-ml/issues/111 and https://github.com/ami-iit/element_motion-generation-with-ml/issues/113, the pose task implementation is complete and, based on my tests, seems to work properly.
@davidegorbani @evelyd @Giulero @GiulioRomualdi @claudia-lat what is the status of this PR?
Sorry, I started the review and never published the comments.
@@ -505,8 +581,7 @@ bool HumanIK::advance()
{
    Eigen::Matrix4d basePose; // Pose of the base
    Eigen::VectorXd initialJointPositions; // Initial positions of the joints
    basePose.setIdentity(); // Set the base pose to the identity matrix
    m_system.dynamics->setState({basePose.topRightCorner<3, 1>(), toManifRot(basePose.topLeftCorner<3, 3>()), m_calibrationJointPositions});
    m_system.dynamics->setState({m_basePose.topRightCorner<3, 1>(), toManifRot(m_basePose.topLeftCorner<3, 3>()), m_calibrationJointPositions});
Here we reset the base pose to the identity so that the subject returns to the initial position; I would keep this possibility, at least behind an option in the configuration file.
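Something like the following could gate it (a minimal sketch only: the `reset_base_pose_during_calibration` parameter name and the `paramHandler` access are hypothetical, not the actual implementation):

```cpp
// Hypothetical boolean option read during configuration; name is illustrative only.
bool resetBasePose{true};
paramHandler->getParameter("reset_base_pose_during_calibration", resetBasePose);

// Start from the stored base pose and optionally reset it to the identity,
// so that the subject returns to the initial position when the option is set.
Eigen::Matrix4d basePose = m_basePose;
if (resetBasePose)
{
    basePose.setIdentity();
}

m_system.dynamics->setState({basePose.topRightCorner<3, 1>(),
                             toManifRot(basePose.topLeftCorner<3, 3>()),
                             m_calibrationJointPositions});
```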
Addressed in 2c61a3e
if (verticalForce > m_FloorContactTasks[node].verticalForceThreshold)
{ // if the vertical force is greater than the threshold, set the foot height to 0
    I_H_link.translation(Eigen::Vector3d(I_H_link.translation().x(), I_H_link.translation().y(), linkHeight));
I don't understand why the previous implementation was replaced; like this, when the foot is in contact and the task is active, I think the foot can slide on the ground, since the position along the x and y directions is updated at every step.
The problem, I think, is that there is some sensor mismatch: if you don't allow the foot to slide on the ground, the error builds up over time and results in really weird-looking behavior. For context:
Original logic (3794efa) | My logic (2c61a3e)
--- | ---
orig_floor_contact_task_baf-2025-03-17_14.21.45.mp4 | my_floor_contact_task_baf-2025-03-17_14.29.28.mp4
If you have another idea to alleviate the problem, I can test it, but so far this is what has seemed to work best.
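For reference, the difference between the two behaviours in code (a sketch only; the latched contact position in the commented-out branch is hypothetical and does not reproduce 3794efa verbatim):

```cpp
if (verticalForce > m_FloorContactTasks[node].verticalForceThreshold)
{
    // Current logic (2c61a3e): only pin the height, so x and y keep following
    // the kinematics and the sensor mismatch cannot accumulate into a fixed anchor.
    I_H_link.translation(Eigen::Vector3d(I_H_link.translation().x(),
                                         I_H_link.translation().y(),
                                         linkHeight));

    // Latched alternative (roughly what the original logic did): store the planar
    // position at first contact and reuse it while the contact persists. This
    // prevents sliding but lets the mismatch build up over time.
    // I_H_link.translation(Eigen::Vector3d(m_latchedContactPosition.x(),
    //                                      m_latchedContactPosition.y(),
    //                                      linkHeight));
}
```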
I think it's better if we discuss this f2f.
src/IK/src/InverseKinematics.cpp
Outdated
@@ -451,15 +534,15 @@ bool HumanIK::calibrateAllWithWorld(std::unordered_map<int, nodeData> nodeStruct
    Eigen::Matrix3d IMU_R_link = (m_OrientationTasks[node].calibrationMatrix * data.I_R_IMU).rotation().transpose()
                                 * iDynTree::toEigen(m_kinDyn->getWorldTransform(m_OrientationTasks[node].frameName).getRotation());
    m_OrientationTasks[node].IMU_R_link = BipedalLocomotion::Conversions::toManifRot(IMU_R_link);
    m_OrientationTasks[node].calibrationMatrix = secondaryCalib * m_OrientationTasks[node].calibrationMatrix;
    m_OrientationTasks[node].calibrationMatrix = toManifRot(initialBasePose.topLeftCorner<3, 3>()) * secondaryCalib * m_OrientationTasks[node].calibrationMatrix;
Also in this case, could you please add an option such that the user can choose whether to premultiply the calibration matrix by the initial base orientation?
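For example, something along these lines (a sketch; the option name and the `paramHandler` call are assumptions about where it would be read):

```cpp
// Hypothetical flag, read once during configuration; name is illustrative only.
bool premultiplyByInitialBaseOrientation{false};
paramHandler->getParameter("premultiply_calibration_with_base_orientation",
                           premultiplyByInitialBaseOrientation);

// Always apply the secondary calibration; optionally rotate it into the
// initial base frame when the option is enabled.
auto& calibration = m_OrientationTasks[node].calibrationMatrix;
calibration = secondaryCalib * calibration;
if (premultiplyByInitialBaseOrientation)
{
    calibration = toManifRot(initialBasePose.topLeftCorner<3, 3>()) * calibration;
}
```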
Addressed in 2c61a3e
I see you implemented it only for the gravity task; could you please implement the same logic for the orientation task as well?
Yep, my bad. Done in 5b50148
I added the option to include an SE3 task in the IK, so that we can use position trackers if desired. Bindings are also added.
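For context, a rough usage sketch of feeding such a pose task from a position tracker (the `updatePoseTask` method name and the node id are hypothetical; the actual API added in this PR may differ):

```cpp
#include <manif/manif.h>

// 'ik' is assumed to be an already configured HumanIK instance with the
// pose (SE3) task enabled for the given node.
manif::SE3d I_H_tracker = manif::SE3d::Identity(); // measured tracker pose in the inertial frame

// Hypothetical call updating the SE3 task target before advancing the IK.
ik.updatePoseTask(/*node=*/10, I_H_tracker);
ik.advance();
```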