Measuring joint orientation values is not trivial, because it requires some solid math. Don't be afraid, though! After reading this article, you'll be able to calculate the orientation of each joint with a single line of C# code!
Sounds good? Let’s get started.
Prerequisites
To run the code and samples provided in this guide, you’ll need the following:
- Kinect for XBOX v2 sensor with an adapter (or Kinect for Windows v2 sensor)
- Kinect for Windows v2 SDK
- Windows 8.1 or higher
- Visual Studio 2013 or higher
- A dedicated USB 3 port
Let’s do the Math…
Kinect reports each joint orientation as a quaternion: a set of 4 values, X, Y, Z, and W.
The Kinect SDK encapsulates the quaternion in a structure called Vector4. We need to transform this quaternion (Vector4) into a set of 3 numeric values.
Using the Orientation quaternion, we can calculate the rotation of the joint around the X, Y, and Z axes.
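For reference, these are the standard quaternion-to-Euler (Tait-Bryan) conversion formulas that the methods below implement, where X, Y, Z, and W are the quaternion components:

pitch = atan2(2(WX + YZ), 1 - 2(X² + Y²))
yaw = asin(2(WY - ZX))
roll = atan2(2(WZ + XY), 1 - 2(Y² + Z²))

Each result is in radians, so the code multiplies by 180/π to convert it to degrees.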
Pitch: rotating around the X-axis
The rotation around the X axis is called Pitch. Here is how to measure it:
public static double Pitch(this Vector4 quaternion)
{
    double value1 = 2.0 * (quaternion.W * quaternion.X + quaternion.Y * quaternion.Z);
    double value2 = 1.0 - 2.0 * (quaternion.X * quaternion.X + quaternion.Y * quaternion.Y);

    double pitch = Math.Atan2(value1, value2);

    // Convert from radians to degrees.
    return pitch * (180.0 / Math.PI);
}
Yaw: rotating around the Y-axis
The rotation around the Y axis is called Yaw. Here is how to measure it:
public static double Yaw(this Vector4 quaternion)
{
    double value = 2.0 * (quaternion.W * quaternion.Y - quaternion.Z * quaternion.X);

    // Clamp to [-1, 1] so floating-point drift cannot push the
    // value outside the domain of Math.Asin.
    value = value > 1.0 ? 1.0 : value;
    value = value < -1.0 ? -1.0 : value;

    double yaw = Math.Asin(value);

    return yaw * (180.0 / Math.PI);
}
Roll: rotating around the Z-axis
The rotation around the Z axis is called Roll. Here is how to measure it:
public static double Roll(this Vector4 quaternion)
{
    double value1 = 2.0 * (quaternion.W * quaternion.Z + quaternion.X * quaternion.Y);
    double value2 = 1.0 - 2.0 * (quaternion.Y * quaternion.Y + quaternion.Z * quaternion.Z);

    double roll = Math.Atan2(value1, value2);

    return roll * (180.0 / Math.PI);
}
Using the code
Here is the complete code. All you have to do is add the following C# file to your Kinect project.
using System;

using Microsoft.Kinect;

namespace LightBuzz.Vitruvius
{
    /// <summary>
    /// Provides extension methods for transforming quaternions to rotations.
    /// </summary>
    public static class JointOrientationExtensions
    {
        /// <summary>
        /// Calculates the rotation of the specified quaternion around the X axis (pitch).
        /// </summary>
        /// <param name="quaternion">The orientation quaternion.</param>
        /// <returns>The rotation in degrees.</returns>
        public static double Pitch(this Vector4 quaternion)
        {
            double value1 = 2.0 * (quaternion.W * quaternion.X + quaternion.Y * quaternion.Z);
            double value2 = 1.0 - 2.0 * (quaternion.X * quaternion.X + quaternion.Y * quaternion.Y);

            double pitch = Math.Atan2(value1, value2);

            // Convert from radians to degrees.
            return pitch * (180.0 / Math.PI);
        }

        /// <summary>
        /// Calculates the rotation of the specified quaternion around the Y axis (yaw).
        /// </summary>
        /// <param name="quaternion">The orientation quaternion.</param>
        /// <returns>The rotation in degrees.</returns>
        public static double Yaw(this Vector4 quaternion)
        {
            double value = 2.0 * (quaternion.W * quaternion.Y - quaternion.Z * quaternion.X);

            // Clamp to [-1, 1] to stay within the domain of Math.Asin.
            value = value > 1.0 ? 1.0 : value;
            value = value < -1.0 ? -1.0 : value;

            double yaw = Math.Asin(value);

            return yaw * (180.0 / Math.PI);
        }

        /// <summary>
        /// Calculates the rotation of the specified quaternion around the Z axis (roll).
        /// </summary>
        /// <param name="quaternion">The orientation quaternion.</param>
        /// <returns>The rotation in degrees.</returns>
        public static double Roll(this Vector4 quaternion)
        {
            double value1 = 2.0 * (quaternion.W * quaternion.Z + quaternion.X * quaternion.Y);
            double value2 = 1.0 - 2.0 * (quaternion.Y * quaternion.Y + quaternion.Z * quaternion.Z);

            double roll = Math.Atan2(value1, value2);

            return roll * (180.0 / Math.PI);
        }
    }
}
Then, in your main C# file, import the following namespace:
using LightBuzz.Vitruvius;
And, finally, specify the joint whose orientation you want to measure, and call the Pitch, Yaw, and Roll methods:
var orientation = body.JointOrientations[JointType.ElbowLeft].Orientation;
var rotationX = orientation.Pitch();
var rotationY = orientation.Yaw();
var rotationZ = orientation.Roll();
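In case you're wondering where the body object comes from, here is a minimal sketch of a typical frame-arrived handler. The field and handler names are illustrative; it assumes you have opened a BodyFrameReader elsewhere in your project:

private Body[] _bodies = null;

private void BodyReader_FrameArrived(object sender, BodyFrameArrivedEventArgs e)
{
    using (var frame = e.FrameReference.AcquireFrame())
    {
        if (frame == null) return;

        if (_bodies == null)
        {
            _bodies = new Body[frame.BodyCount];
        }

        frame.GetAndRefreshBodyData(_bodies);

        foreach (var body in _bodies)
        {
            if (body == null || !body.IsTracked) continue;

            var orientation = body.JointOrientations[JointType.ElbowLeft].Orientation;

            // Print the left-elbow rotation in degrees.
            Console.WriteLine("Pitch: " + orientation.Pitch() +
                              ", Yaw: " + orientation.Yaw() +
                              ", Roll: " + orientation.Roll());
        }
    }
}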
Download the code on GitHub
Supported joints
Measuring the rotation of the human body joints is not a trivial job. Unfortunately, Kinect does not provide orientation values for the Head, Hands, and Feet. These are the skeleton's end joints, with no child bone to define a direction, so it's extremely difficult to measure their orientation accurately (see the guard sketch after the list below). The orientation accuracy for the rest of the joints is pretty good. The supported joints are the following:
- Neck
- SpineShoulder
- SpineBase
- ShoulderLeft/ShoulderRight
- ElbowLeft/ElbowRight
- WristLeft/WristRight
- HipLeft/HipRight
- KneeLeft/KneeRight
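If you call the methods above on one of the unsupported joints, you'll simply get 0 for all three angles. Here is a small guard you can use, assuming the unsupported joints report an all-zero quaternion (worth verifying against your SDK version):

var q = body.JointOrientations[JointType.Head].Orientation;

// End joints (head, hands, feet) typically report an all-zero quaternion.
bool hasOrientation = q.X != 0f || q.Y != 0f || q.Z != 0f || q.W != 0f;

if (hasOrientation)
{
    Console.WriteLine("Head pitch: " + q.Pitch());
}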
The method described in this post will help you in most use-case scenarios, such as simple games. If you need higher accuracy (e.g. for healthcare apps), you'll need more complex algorithms that involve a lot of custom coding. LightBuzz has a lot of experience with these kinds of algorithms and could help you with your project.
Summary
In this blog post, you've learnt how to easily measure the rotation of the human body joints around the X, Y, and Z axes. 'Til the next time, keep Kinecting!
Download Vitruvius now
Hello Eng. Vangos Pterneas,
Thank you so much for your help!
I want to ask: will measuring joint rotation help in detecting face turns?
If I want to know whether the face is turned right or left, can measuring joint rotation help?
I tried to detect it using the (neck, spine shoulder, right shoulder) angle, and it gave me mostly the same value in all three situations (face front, face turned right, face turned left). I think that even if I change the angle to (head, spine shoulder, right shoulder), it will not give a different answer.
So what do you think? Will measuring joint rotation help?
Hello Aisha. Thanks for your comment. This is the best way to determine the rotation of the head. In the code below, I am assuming you have a non-null Body and a non-null Face object:
var head = body.Joints[JointType.Head].Position;
var nose = face.Nose;

// The head position projected to the depth (Z) of the nose.
var tmp = new CameraSpacePoint
{
    X = head.X,
    Y = head.Y,
    Z = nose.Z
};

// The angle between the head-to-nose and head-to-tmp vectors
// increases as the face turns left or right.
var rotation = head.Angle(nose, tmp);
Let me know if that helped you.
Hello again Eng. Vangos,
I'm so thankful for your support.
Extra question: do I need to set up the face frame the way you did here "https://vitruviuskinect.com/hd-face/" or here "https://pterneas.com/2014/12/21/kinect-2-face-basics/", or is there a simpler way?
I'm measuring some body angles at the same time as the face rotation.
So, can I add it somewhere here:
_reader = _sensor.OpenMultiSourceFrameReader(FrameSourceTypes.Color | FrameSourceTypes.Depth | FrameSourceTypes.Infrared | FrameSourceTypes.Body);
_reader.MultiSourceFrameArrived += Reader_MultiSourceFrameArrived;
_playersController = new PlayersController();
_playersController.BodyEntered += UserReporter_BodyEntered;
_playersController.BodyLeft += UserReporter_BodyLeft;
_playersController.Start();
Hi Aisha. I suggest you use the HD Face. The HD Face provides the face point coordinates in the 3D space, so it’s better for measuring the angles. Also, the HD Face provides more points.
You can check the Face Points demo of the Vitruvius Unity package.
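Here is a rough sketch of reading a 3D face point with the HD Face API. The _faceModel and _faceAlignment fields are placeholders you would create (new FaceModel(), new FaceAlignment()) when you set up the reader:

private void FaceReader_FrameArrived(object sender, HighDefinitionFaceFrameArrivedEventArgs e)
{
    using (var frame = e.FrameReference.AcquireFrame())
    {
        if (frame == null || !frame.IsFaceTracked) return;

        frame.GetAndRefreshFaceAlignmentResult(_faceAlignment);

        // The 3D vertices of the face model, in camera space.
        var vertices = _faceModel.CalculateVerticesForAlignment(_faceAlignment);
        var noseTip = vertices[(int)HighDetailFacePoints.NoseTip];
    }
}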
Hello Vangos,
I get an error when debugging:
_sensor = KinectSensor.GetDefault();

if (_sensor != null)
{
    _reader = _sensor.OpenMultiSourceFrameReader(FrameSourceTypes.Color | FrameSourceTypes.Depth | FrameSourceTypes.Infrared | FrameSourceTypes.Body);
    _reader.MultiSourceFrameArrived += Reader_MultiSourceFrameArrived;

    _faceSource = new HighDefinitionFaceFrameSource(_sensor);
    _faceReader = _faceSource.OpenReader();
    _faceReader.FrameArrived += FaceReader_FrameArrived;

    _playersController = new PlayersController();
    _playersController.BodyEntered += UserReporter_BodyEntered;
    _playersController.BodyLeft += UserReporter_BodyLeft;
    _playersController.Start();

    _sensor.Open();
}
The error is: An unhandled exception of type ‘System.InvalidOperationException’ occurred in Microsoft.Kinect.Face.dll
Additional information: This API has returned an exception from an HRESULT: 0x80070002
It points to the line: _faceSource = new HighDefinitionFaceFrameSource(_sensor);
I hope you can help me. Thanks!
Hi Aisha. I think the error occurs because the Kinect SDK cannot find the Face libraries. Please add the following line under Project → Properties → Build Events → Post-build event command line. This command copies the necessary face-tracking database files next to your executable (if you build for x86, point to the x86 NuiDatabase folder instead):
xcopy "C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0\ExtensionSDKs\Microsoft.Kinect.Face\2.0\Redist\CommonConfiguration\x64\NuiDatabase" "NuiDatabase" /e /y /i /r
Hi Vangos, I recently bought the Vitruvius package and it works perfectly.
But is it possible to calculate the rotation of the wrist and the hip?
Thank you very much, David! I'm glad you enjoy the product. The rotation of the wrist is tricky because Kinect does not provide reliable orientation information for the end joints (head, wrists, feet).
However, you can estimate the rotation in terms of other joints.
For example, to measure the rotation of the hip, you could check the angle between the hip, the spine, and the knee. To measure the rotation of the wrist, you could measure the angle between the wrist, the hand, and the Z axis. You can do this with the help of the Angle() method, as in the sketch below.
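Here is a rough sketch of the hip case, assuming the Angle() extension takes the vertex point first, as in the head-rotation snippet above:

// Estimate the left-hip rotation as the angle formed at the hip
// by the spine and the knee (the joint choice is illustrative).
var hip = body.Joints[JointType.HipLeft].Position;
var spine = body.Joints[JointType.SpineBase].Position;
var knee = body.Joints[JointType.KneeLeft].Position;

var hipRotation = hip.Angle(spine, knee);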
Thank you so much for your fast reply!
I am not an expert in Unity (about 1 year of experience).
1) Could you explain the Angle() method in more detail?
2) I want to calculate the shoulder angle and rotation. Is this possible with Vitruvius?
Hi David. I have passed your request to our Support team and we’ll get back to you via email!
Hello Vangos. Thank you for your blog. Could I ask you some questions? Recently, I have tried to use quaternions to calculate the rotation angles of human joints in order to control a robot. But when I compare the joint angle values of a person and of the robot performing the same action, there is a great difference. At first I suspected a problem with the conversion formula or with Kinect, but after many tests I found that wasn't it. Could it be that the human joint rotation angles are simply different from the robot's? Or does the coordinate system need to be converted? Do you have any ideas?
Hello. I do not have any information about how the robot operates. However, I know for sure that Kinect does not provide correct orientations for the head, hands, and feet. So that may be your issue here.