Add pose task to IK #54

Open · evelyd wants to merge 12 commits into main
Conversation

evelyd (Contributor) commented on Oct 16, 2024:

I added the option to include an SE3 task in the IK, such that we can use position trackers if desired. Bindings are also added.
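For reference, a minimal sketch of how an SE3 (pose) task is typically configured with the BipedalLocomotion IK; the frame name and gains below are illustrative placeholders, not values from this PR.

// Illustrative sketch only: an SE3 task tracks both the position and the
// orientation of a frame (e.g. a frame driven by a position tracker).
#include <BipedalLocomotion/IK/SE3Task.h>
#include <BipedalLocomotion/ParametersHandler/StdImplementation.h>
#include <iDynTree/KinDynComputations.h>
#include <memory>

auto makePoseTask(std::shared_ptr<iDynTree::KinDynComputations> kinDyn)
{
    namespace blf = BipedalLocomotion;
    auto handler = std::make_shared<blf::ParametersHandler::StdImplementation>();
    handler->setParameter("robot_velocity_variable_name", "robot_velocity");
    handler->setParameter("frame_name", "Pelvis"); // placeholder frame
    handler->setParameter("kp_linear", 10.0);      // placeholder gain
    handler->setParameter("kp_angular", 10.0);     // placeholder gain

    auto task = std::make_shared<blf::IK::SE3Task>();
    task->setKinDyn(kinDyn);
    task->initialize(handler);
    return task;
}

// At every step the measured pose (and, optionally, velocity) becomes the
// setpoint, e.g.: task->setSetPoint(I_H_frame, I_v_frame);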

evelyd requested a review from davidegorbani on Oct 16, 2024
evelyd self-assigned this on Oct 16, 2024
evelyd added the enhancement and phase:implementation labels on Oct 16, 2024
evelyd (Contributor, Author) commented on Nov 13, 2024:

As documented in https://github.com/ami-iit/element_motion-generation-with-ml/issues/111 and https://github.com/ami-iit/element_motion-generation-with-ml/issues/113, the pose task implementation is complete and, based on my tests, it seems to work properly.

traversaro (Contributor) commented:
@davidegorbani @evelyd @Giulero @GiulioRomualdi @claudia-lat what is the status of this PR?

davidegorbani (Collaborator) left a comment:
Sorry, I started the review and never published the comments.

@@ -505,8 +581,7 @@ bool HumanIK::advance()
 {
     Eigen::Matrix4d basePose; // Pose of the base
     Eigen::VectorXd initialJointPositions; // Initial positions of the joints
-    basePose.setIdentity(); // Set the base pose to the identity matrix
-    m_system.dynamics->setState({basePose.topRightCorner<3, 1>(), toManifRot(basePose.topLeftCorner<3, 3>()), m_calibrationJointPositions});
+    m_system.dynamics->setState({m_basePose.topRightCorner<3, 1>(), toManifRot(m_basePose.topLeftCorner<3, 3>()), m_calibrationJointPositions});
davidegorbani (Collaborator):
Here we reset the base pose to the identity so that the subject returns to the initial position; I would keep this possibility, at least as an option in the configuration file.

evelyd (Contributor, Author):
Addressed in 2c61a3e
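A sketch of the kind of switch being discussed, assuming a hypothetical boolean option (the name reset_base_pose_to_identity is illustrative, not necessarily the one adopted in 2c61a3e):

// Hypothetical sketch, inside HumanIK (m_basePose, m_system and
// m_calibrationJointPositions are existing members; paramHandler stands for
// the configuration handler). When the flag is enabled, the calibration reset
// uses the identity base pose, so the subject returns to the initial position.
bool resetToIdentity{false};
paramHandler->getParameter("reset_base_pose_to_identity", resetToIdentity);

Eigen::Matrix4d basePose = m_basePose;
if (resetToIdentity)
{
    basePose.setIdentity();
}
m_system.dynamics->setState({basePose.topRightCorner<3, 1>(),
                             toManifRot(basePose.topLeftCorner<3, 3>()),
                             m_calibrationJointPositions});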

Comment on lines +323 to +325

if (verticalForce > m_FloorContactTasks[node].verticalForceThreshold)
{ // if the vertical force is greater than the threshold, set the foot height to 0
    I_H_link.translation(Eigen::Vector3d(I_H_link.translation().x(), I_H_link.translation().y(), linkHeight));
davidegorbani (Collaborator):
I don't understand why the previous implementation was replaced; like this, when the foot is in contact and the task is active, I think the foot can slide on the ground, since the position in the x and y directions is updated at every step.

evelyd (Contributor, Author):
So the problem, I think, is that there is some sensor mismatch: if you don't allow the foot to slide on the ground, the mismatch builds up over time and results in really weird-looking behavior. For context:

Original logic (3794efa): orig_floor_contact_task_baf-2025-03-17_14.21.45.mp4
My logic (2c61a3e): my_floor_contact_task_baf-2025-03-17_14.29.28.mp4

If you have another idea to alleviate the problem, I can test it, but so far this is what has seemed to work best.
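For clarity, the two strategies under discussion, sketched side by side; the PR variant mirrors the snippet quoted above, while the latched variant is a reconstruction of the original logic with hypothetical names (FloorTask, footInContact, lockedTranslation), not an exact quote.

#include <Eigen/Dense>
#include <manif/SE3.h>

struct FloorTask // illustrative stand-in for the per-node task data
{
    double verticalForceThreshold{30.0}; // placeholder threshold [N]
    bool footInContact{false};
    Eigen::Vector3d lockedTranslation{Eigen::Vector3d::Zero()};
};

// PR logic: pin only the height; x and y keep following the sensors, so a
// small sensor mismatch is absorbed instead of accumulating.
void pinHeightOnly(manif::SE3d& I_H_link, const FloorTask& task,
                   double verticalForce, double linkHeight)
{
    if (verticalForce > task.verticalForceThreshold)
    {
        I_H_link.translation(Eigen::Vector3d(I_H_link.translation().x(),
                                             I_H_link.translation().y(),
                                             linkHeight));
    }
}

// Reconstruction of the original logic: latch the full position at first
// contact; this prevents sliding but lets any sensor mismatch build up.
void latchFullPosition(manif::SE3d& I_H_link, FloorTask& task,
                       double verticalForce)
{
    if (verticalForce > task.verticalForceThreshold)
    {
        if (!task.footInContact)
        {
            task.lockedTranslation = I_H_link.translation();
            task.footInContact = true;
        }
        I_H_link.translation(task.lockedTranslation);
    }
    else
    {
        task.footInContact = false;
    }
}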

davidegorbani (Collaborator):

I think it's better if we discuss this f2f.

@@ -451,15 +534,15 @@ bool HumanIK::calibrateAllWithWorld(std::unordered_map<int, nodeData> nodeStruct
     Eigen::Matrix3d IMU_R_link = (m_OrientationTasks[node].calibrationMatrix * data.I_R_IMU).rotation().transpose()
                                  * iDynTree::toEigen(m_kinDyn->getWorldTransform(m_OrientationTasks[node].frameName).getRotation());
     m_OrientationTasks[node].IMU_R_link = BipedalLocomotion::Conversions::toManifRot(IMU_R_link);
-    m_OrientationTasks[node].calibrationMatrix = secondaryCalib * m_OrientationTasks[node].calibrationMatrix;
+    m_OrientationTasks[node].calibrationMatrix = toManifRot(initialBasePose.topLeftCorner<3, 3>()) * secondaryCalib * m_OrientationTasks[node].calibrationMatrix;
davidegorbani (Collaborator):
Also in this case, could you please add an option so that the user can choose whether to premultiply the calibration matrix by the initial base orientation?

evelyd (Contributor, Author):
Addressed in 2c61a3e
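A sketch of the kind of option being requested, again with a hypothetical parameter name (use_initial_base_orientation is illustrative only):

// Hypothetical sketch, inside HumanIK::calibrateAllWithWorld; secondaryCalib,
// initialBasePose and m_OrientationTasks are the names from the diff above,
// and paramHandler stands for the configuration handler.
bool useInitialBaseOrientation{false};
paramHandler->getParameter("use_initial_base_orientation", useInitialBaseOrientation);

auto calib = secondaryCalib * m_OrientationTasks[node].calibrationMatrix;
if (useInitialBaseOrientation)
{
    // Premultiply by the initial base orientation only when requested.
    calib = toManifRot(initialBasePose.topLeftCorner<3, 3>()) * calib;
}
m_OrientationTasks[node].calibrationMatrix = calib;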

davidegorbani (Collaborator):
I see you implemented this only for the gravity task; could you please implement the same logic also for the orientation task?

evelyd (Contributor, Author):
Yep, my bad. Done in 5b50148
