IMU frame transformations #757

Open
AndyLV opened this issue May 20, 2022 · 3 comments

@AndyLV

AndyLV commented May 20, 2022

Hello everybody,

I noticed some problems when fusing IMU+GNSS data in the following situation:
My IMU provides its data with the z-axis pointing downwards (NED). According to my understanding, robot_localization expects the z-axis upwards (ENU). Hence I created an additional TF frame which is rolled by 180° with respect to the other frames. Unfortunately this did not work - among other things, the gravity compensation increases the absolute value of the acceleration to ~2*9.81. I had a look at the source code and found, e.g., the following part:
https://github.com/cra-ros-pkg/robot_localization/blob/melodic-devel/src/ros_filter.cpp
lines 2661-2677. There you already describe that, e.g., a yaw angle of 90 degrees will swap roll and pitch. What I do not understand is why you limit the corresponding transformation to yaw only. (The roll angle is ignored, and hence so is the sign flip of the x and z axes.)

Was this intended, or is this a bug? If you want to make the code more universal for arbitrary TF frames, I can list the pieces of code in ros_filter.cpp and navsat_transform.cpp that I needed to modify in order to get results that at least look more correct than with the original code.

Thank you very much for your feedback and kind regards,

Andy

@ayrton04
Collaborator

According to my understanding, robot-localization expects the z-axis upwards (ENU). Hence I created an additional TF frame which is rolled by 180° with respect to the other frames.

It's been literally years since I used IMUs in a setting where all three Euler angles mattered, but I think that when r_l is said to assume your IMU data is in the ENU frame, it means the data needs to be in the ENU frame before you pass it to the filter, not that you will provide a transform to make it so.
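
For concreteness, here is a minimal sketch of what converting NED-convention data to ENU upstream of the filter could look like. It assumes the device reports its orientation as NED-world to FRD-body (x forward, y right, z down); the names are illustrative only and not part of r_l:

```cpp
#include <cmath>

#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>

// Fixed rotations used by the usual NED->ENU conversion:
// a 180 deg rotation about (1,1,0)/sqrt(2) maps the NED world axes onto ENU,
// and a 180 deg rotation about X maps an FLU body frame onto FRD.
static const tf2::Quaternion kNedToEnu(M_SQRT1_2, M_SQRT1_2, 0.0, 0.0);  // (x, y, z, w)
static const tf2::Quaternion kFluToFrd(1.0, 0.0, 0.0, 0.0);

// Orientation reported as NED-world -> FRD-body becomes ENU-world -> FLU-body.
tf2::Quaternion orientationNedToEnu(const tf2::Quaternion& q_ned)
{
  return kNedToEnu * q_ned * kFluToFrd;
}

// Body-frame vector quantities (angular velocity, linear acceleration) just
// get their axes remapped from FRD to FLU: x stays, y and z flip sign.
tf2::Vector3 bodyVectorFrdToFlu(const tf2::Vector3& v_frd)
{
  return tf2::Vector3(v_frd.x(), -v_frd.y(), -v_frd.z());
}
```

With something like this running upstream of the filter (e.g. in a small republisher node), the filter never sees NED data and no extra TF frame is needed for the frame convention itself.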

There you describe already that e.g. a yaw angle of 90 degree will swap roll and pitch. What I do not understand why you limit the corresponding transformation to yaw only?

The transformation on those lines is not being used to actually transform the IMU data. It's being used to determine which part of the EKF's state vector to update. For example, if we just want to use the roll angle from the sensor, but the sensor is mounted sideways, then the final EKF state will have its pitch angle updated. The updateVector is just a boolean vector that dictates which parts of the EKF's state vector are updated.
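
As a simplified illustration of that idea (not the package's exact code; the names here are made up), remapping the orientation flags through a yaw-only mounting rotation might look like this:

```cpp
#include <cmath>

#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Vector3.h>

// Remap which of the (roll, pitch, yaw) update flags apply to the state,
// given only the yaw of the sensor's mounting rotation. A 90 deg mounting
// yaw swaps the roll and pitch flags, as described above.
void remapOrientationFlags(double mountYaw,
                           const bool flagsSensor[3],  // roll, pitch, yaw requested by the user
                           bool flagsState[3])         // roll, pitch, yaw actually updated
{
  tf2::Matrix3x3 yawOnly;
  yawOnly.setRPY(0.0, 0.0, mountYaw);

  const tf2::Vector3 flags(flagsSensor[0] ? 1.0 : 0.0,
                           flagsSensor[1] ? 1.0 : 0.0,
                           flagsSensor[2] ? 1.0 : 0.0);
  const tf2::Vector3 rotated = yawOnly * flags;

  for (int i = 0; i < 3; ++i)
  {
    flagsState[i] = std::abs(rotated[i]) > 0.5;
  }
}
```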

And I believe in this instance (i.e., when we care about what part of the state vector is getting updated), only the yaw matters for the transformation that would make that determination:

  • If the ENU IMU's mounting orientation is neutral, then the code in question will make sure that whatever state member variables are updated are the ones the user specified in the sensor config (i.e., the IMU's orientation variables are directly measuring the same orientation angles in the robot's world frame).
  • If the ENU sensor's mounting orientation is such that it's rotated in yaw only w.r.t. the robot, then as I said above, there will be ramifications for the roll and pitch angles of the sensor, which will become swapped (with possible sign inversions) in the robot's world frame.
  • However, IMUs are unique sensors, in that if you flip the ENU-frame IMU on its side such that Y in the IMU frame is aligned with Z in the world frame (ignoring signs for the moment), the yaw value coming from the sensor does not equate to pitch of the vehicle. Likewise, if you stand it up such that the IMU's X axis is aligned with the world frame's Z axis, then the yaw value from the sensor should not be used as the roll for the robot's world frame. Roll and pitch in the IMU are derived from the gravity vector, whereas yaw is computed using the magnetometer, and rolling and pitching the sensor (at least the IMU I had access to at the time) makes no difference to the yaw value (see the sketch after this list).
  • Now contrast this to non-IMU orientation data, such as what we might get from a motion capture system. If the coordinate frame of that system has a different orientation than the robot's world frame, then it will be the case that a change in yaw in the mocap frame could equate to a roll or pitch angle in the robot's world frame.
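
To make the third bullet concrete: roll and pitch on such a device come straight from the measured gravity (specific force) vector via the standard accelerometer tilt formulas, while yaw comes from a separate source (the magnetometer). A sketch, assuming the accelerometer reads roughly +9.81 on its z-axis when the sensor is level:

```cpp
#include <cmath>

// Standard accelerometer tilt formulas: roll and pitch follow the physical
// sensor axes because they are derived from the gravity vector alone. Yaw is
// not derived from gravity at all, which is why tilting the sensor onto its
// side does not turn its yaw output into the vehicle's pitch or roll.
void tiltFromAccelerometer(double ax, double ay, double az,
                           double& roll, double& pitch)
{
  roll = std::atan2(ay, az);
  pitch = std::atan2(-ax, std::sqrt(ay * ay + az * az));
}
```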

The actual transformation of the IMU data is done here, although your question is valid in that context as well. I believe that logic may be incorrect (I don't think you can rotate the offsets in the orientations as individual Euler angles like that), but I think the reason behind it is the same.
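
For what it's worth, composing the measurement with the mounting rotation as whole rotations, rather than offsetting roll, pitch, and yaw one axis at a time, would look something like this (a sketch of the general idea, not a drop-in fix for that code):

```cpp
#include <tf2/LinearMath/Quaternion.h>

// Apply the sensor's mounting rotation to the measured orientation as a
// single rotation. Per-axis Euler offsets are only equivalent to this when
// the mounting rotation is about a single axis.
tf2::Quaternion orientationOfBaseInWorld(const tf2::Quaternion& q_sensor_in_world,
                                         const tf2::Quaternion& q_sensor_in_base)
{
  // R(base in world) = R(sensor in world) * R(sensor in base)^-1
  return q_sensor_in_world * q_sensor_in_base.inverse();
}
```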

TL;DR, I'd accept a PR to change these, provided that what I described above is supported, and the PR in question doesn't break that functionality in order to satisfy/fix one use case. I think IMUs will still need special treatment, owing to the fact that the yaw value seems to be unaffected (other than a sign inversion) by rolling or pitching the sensor. I no longer have an IMU that I can use to verify that, so I'll have to rely on people who have one.

@hidmic
Contributor

hidmic commented Jun 20, 2023

I think it's the second time in the past 5 years or so that I've stumbled upon this detail. It always pops up when mounting IMUs in unorthodox ways.

However, IMUs are unique sensors, in that if you flip the ENU-frame IMU on its side such that Y in the IMU frame is aligned with Z in the world frame (ignoring signs for the moment), the yaw value coming from the sensor does not equate to pitch of the vehicle. Likewise, if you stand it up such that the IMU's X axis is aligned with the world frame's Z axis, then the yaw value from the sensor should not be used as the roll for the robot's world frame. Roll and pitch in the IMU are derived from the gravity vector, whereas yaw is computed using the magnetometer, and rolling and pitching the sensor (at least the IMU I had access to at the time) makes no difference to the yaw value.

That really clarifies the rationale for these transforms, thanks. Though I must say I find it odd. Regardless of how orientation is estimated, it should be reported between consistent orthonormal frames. If that estimation is somehow restricted by implementation details, such as those described above, that's not something that robot_localization should or even can address. As it stands, not applying the full rotation between base and IMU frames precludes many mounting configurations from working at all.

FWIW I've never worked with a device behaving as you describe, but I've mostly worked with cheap MEMS IMUs, so I wouldn't take my personal experience all that seriously.

@ayrton04
Collaborator

I wrote the package using Microstrain IMUs, and they absolutely behaved this way for me, at least from what I recall. I no longer have access to one.

My experience with IMUs is that their behaviour is inconsistent (across manufacturers) at best. Having said that, I haven't really revisited that logic in a long time. I also don't really plan to, as I'm trying to get people to adopt fuse instead.

https://github.com/locusrobotics/fuse
https://docs.ros.org/en/melodic/api/fuse_doc/html/
