Many Kinect developers need to build apps that calculate complex human body measurements. Today I’ll show you how to use Vitruvius to measure angles in 3D and 2D space.
(Notably, Vitruvius is featured on the official Microsoft Kinect website and on Channel9.)
This is what you’ll be able to do after reading the following tutorial:
What is an angle?
An angle is formed by two rays that share a common endpoint. To measure an angle, all you need is 3 points in the 3D or 2D space: a starting point, a middle point (the vertex), and an end point.
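To make the math concrete, here is a minimal sketch of the underlying calculation, written in plain Python rather than the article’s C# so it runs anywhere; the point values are made up for illustration:

```python
import math

def angle(start, center, end):
    """Angle in degrees at `center`, formed by the rays toward `start` and `end`."""
    a = tuple(s - c for s, c in zip(start, center))
    b = tuple(e - c for e, c in zip(end, center))
    dot = sum(x * y for x, y in zip(a, b))
    mags = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mags))))

# Straight up vs. straight right from the origin forms a right angle.
print(angle((0.0, 1.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # 90.0
```

This is the standard arccosine-of-the-dot-product formula; any library that measures joint angles from three points does something equivalent.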
The official Kinect SDK includes the following built-in types of points, so, first, let me give you some background information:
1) CameraSpacePoint
A CameraSpacePoint is a set of coordinates in the 3D space. It has X, Y, and Z values, all measured in meters. The X value represents the horizontal distance of the point, measured from the left side of the field-of-view. The Y value represents the vertical distance of the point, measured from the top of the field-of-view. The Z value represents the distance between the point and the sensor plane. The CameraSpacePoint is the only Kinect point that gives us information about the 3D world.
2) ColorSpacePoint
A ColorSpacePoint is a set of coordinates in the 2D Color space. The Color space is a 1920×1080 frame. A ColorSpacePoint only has two coordinates: the X coordinate defines the distance from the left edge of the frame, and the Y coordinate defines the distance from the top of the frame.
3) DepthSpacePoint
A DepthSpacePoint is, again, a set of coordinates in the 2D Depth space. The Depth space is a 512×424 frame. A depth point has X and Y values, just like the color points. The Depth space is equivalent to the Infrared space, so you can use the same type of points for both.
What about the joints?
You already know that Kinect’s power is the ability to calculate the positions of 25 human body joints. A Joint is a structure that includes:
- The position in the 3D space
- The type/name of the joint
- The tracking accuracy
The position of a joint in the 3D space is expressed as a CameraSpacePoint. To properly map a 3D position to its 2D equivalent, you need to use coordinate mapping.
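To illustrate why mapping is needed at all, here is a simplified pinhole-camera sketch in plain Python. This is not the SDK’s actual mapping (the real thing uses the sensor’s calibrated CoordinateMapper); the focal lengths and principal point below are made-up illustrative numbers:

```python
# Simplified pinhole-camera sketch of 3D -> 2D mapping. The real Kinect SDK
# uses a calibrated CoordinateMapper; fx/fy/cx/cy below are invented values,
# not the sensor's actual intrinsics.
def project(x, y, z, fx=1000.0, fy=1000.0, cx=960.0, cy=540.0):
    """Project a 3D point (meters) to a 2D pixel on a 1920x1080 frame."""
    px = cx + (x / z) * fx
    py = cy - (y / z) * fy  # image Y grows downward
    return px, py

print(project(0.0, 0.0, 2.0))  # a point straight ahead lands at the image center
```

The key intuition: the same 3D point lands on different pixels in the Color and Depth frames, which is exactly why the SDK exposes a dedicated mapping API instead of a fixed formula.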
Measuring an angle with Vitruvius
Vitruvius contains some handy extension methods that will help you measure an angle using CameraSpacePoints, ColorSpacePoints, or DepthSpacePoints. First, you need to include the proper namespace in your C# file:
using LightBuzz.Vitruvius;
We assume that you already know how to get a Body object from a BodyFrame. Let’s say that you want to measure the angle on your elbow. This angle is formed by 3 joints — the shoulder, the elbow, and the wrist:
Joint shoulder = body.Joints[JointType.ShoulderLeft];
Joint elbow = body.Joints[JointType.ElbowLeft];
Joint wrist = body.Joints[JointType.WristLeft];
The middle point of the angle is, of course, the elbow. The starting point of the angle is the shoulder. The end point of the angle is the wrist. So, here you are:
double angle = elbow.Angle(shoulder, wrist);
The Angle method is an extension method, included in Vitruvius. This way, you can calculate any angle for any human body joint!
The above code is equivalent to the following:
double angle = elbow.Position.Angle(shoulder.Position, wrist.Position);
Need to measure the same angle in the Color or Depth space? No problem! Vitruvius includes another handy extension method, called ToPoint. Here’s how you can use it:
var shoulderColor = shoulder.ToPoint(Visualization.Color);
var elbowColor = elbow.ToPoint(Visualization.Color);
var wristColor = wrist.ToPoint(Visualization.Color);
double angle = elbowColor.Angle(shoulderColor, wristColor);
Similarly, you can calculate the angle in the Depth/Infrared space:
var shoulderDepth = shoulder.ToPoint(Visualization.Depth);
var elbowDepth = elbow.ToPoint(Visualization.Depth);
var wristDepth = wrist.ToPoint(Visualization.Depth);
double angle = elbowDepth.Angle(shoulderDepth, wristDepth);
Simple, huh?
The same principles apply to HD Face points, too!
CameraSpacePoint nose = face.Nose;
CameraSpacePoint cheekLeft = face.CheekLeft;
CameraSpacePoint cheekRight = face.CheekRight;
double angle = nose.Angle(cheekLeft, cheekRight);
Note: the sample projects included with Vitruvius also contain a handy Arc control. Using the Arc control, you can visualize the angle on top of a body very easily.
Why do you need this?
Besides the obvious use (measuring an angle), such mathematical calculations are extremely helpful when you need to compare the relative positions of a few points: gesture detection, sign language recognition, or even facial expressions!
Supported Platforms
The above code (including angle measurements and Arc controls) is supported in the following platforms, frameworks, and engines:
- Unity3D
- Windows Presentation Foundation (.NET 4.5+)
- Windows Store (WinRT)
Get Vitruvius
As you see, Vitruvius is helping innovative companies create Kinect apps fast. Vitruvius simplifies Kinect development, so you can now focus on what’s really important: your app, your research, and your customers.
Where are your tutorials/videos for getting started with Vitruvius? Right now, the only code samples I can find are on your front page advertisements and they’re missing context.
Hi Justin. Here you are: http://vitruviuskinect.com/getting-started-unity/
This seems like a very promising tool, but there’s no documentation. The included ‘Documentation’ in the free version is empty, and no documentation is provided online.
Hi Brian. The documentation is available online in the following URLs:
The redirect leads to a 404 Page Not Found. Could you please fix this, because there is no documentation available?
Hi Konstantina. The documentation is available here. What is the page you are trying to access?
Hi Vangos…
May I ask about this code :
var shoulderColor = shoulder.ToPoint(Visualization.Color);
var elbowColor = elbow.ToPoint(Visualization.Color);
var wristColor = wrist.ToPoint(Visualization.Color);
double angle = elbowColor.Angle(shoulderColor, wristColor);
Where does Visualization come from?
Hi Suzanne. Visualization is an enumeration that is part of Vitruvius. It is used as a flag to determine the output type. If you use Visualization.Color, the points will be created for a 1920×1080 canvas. If you use Visualization.Depth or Visualization.Infrared, the points will be created for a 512×424 resolution.
Let me know if that information helped you 🙂
Hello,
I hope you can help me. You need this code for skeleton tracking:
using (var frame = reference.BodyFrameReference.AcquireFrame())
{
if (frame != null)
{
var bodies = frame.Bodies();
_userReporter.Update(bodies);
Body body = bodies.Closest();
if (body != null)
{
viewer.DrawBody(body);
angle.Update(body.Joints[_start], body.Joints[_center], body.Joints[_end], 60);
tblAngle.Text = ((int)angle.Angle).ToString();
}
}
}
and calculate the angle.
My question is: can I track just part of the body (ShoulderRight, ElbowRight, WristRight) and not the complete body? If yes, how do I do it?
Hello Klara. Kinect can only track the whole body. However, you can draw only the joints and lines you need. It’s not required to draw everything. For example, if you only need to display 3 joints, you can use the following code:
XAML:
<Viewbox>
<Grid Width="1920" Height="1080">
<Image Name="camera" />
<Canvas Name="canvas">
<Ellipse Name="ellipseShoulder" Width="10" Height="10" Fill="Blue" />
<Ellipse Name="ellipseElbow" Width="10" Height="10" Fill="Blue" />
<Ellipse Name="ellipseWrist" Width="10" Height="10" Fill="Blue" />
<Line Name="line1" Stroke="Blue" StrokeThickness="5" />
<Line Name="line2" Stroke="Blue" StrokeThickness="5" />
</Canvas>
</Grid>
</Viewbox>
C#
if (body != null)
{
var shoulder = body.Joints[JointType.ShoulderRight].Position.ToPoint(Visualization.Color);
var elbow = body.Joints[JointType.ElbowRight].Position.ToPoint(Visualization.Color);
var wrist = body.Joints[JointType.WristRight].Position.ToPoint(Visualization.Color);
if (!float.IsInfinity(shoulder.X) && !float.IsInfinity(shoulder.Y) &&
!float.IsInfinity(elbow.X) && !float.IsInfinity(elbow.Y) &&
!float.IsInfinity(wrist.X) && !float.IsInfinity(wrist.Y))
{
Canvas.SetLeft(ellipseShoulder, shoulder.X - ellipseShoulder.Width / 2);
Canvas.SetTop(ellipseShoulder, shoulder.Y - ellipseShoulder.Height / 2);
Canvas.SetLeft(ellipseElbow, elbow.X - ellipseElbow.Width / 2);
Canvas.SetTop(ellipseElbow, elbow.Y - ellipseElbow.Height / 2);
Canvas.SetLeft(ellipseWrist, wrist.X - ellipseWrist.Width / 2);
Canvas.SetTop(ellipseWrist, wrist.Y - ellipseWrist.Height / 2);
line1.X1 = shoulder.X;
line1.Y1 = shoulder.Y;
line1.X2 = elbow.X;
line1.Y2 = elbow.Y;
line2.X1 = elbow.X;
line2.Y1 = elbow.Y;
line2.X2 = wrist.X;
line2.Y2 = wrist.Y;
}
}
Let me know if that information helped you 🙂
Hey,
thanks, it works. But it is less accurate than viewer.DrawBody(body), and there is a fault in the if-condition (double can’t be converted to float).
Otherwise it is ok 🙂
Thank you. Please check the updated code in my previous comment.
Yes, i had the same idea. 🙂
but,
if (!float.IsInfinity(shoulder.X) && !float.IsInfinity(shoulder.Y) &&
!float.IsInfinity(elbow.X) && !float.IsInfinity(elbow.Y) &&
!float.IsInfinity(wrist.X) && !float.IsInfinity(wrist.Y))
{
….
}
I get the error CS1503:
Argument ‘1’: cannot convert from “double” to “float”
That code should work in WPF and Unity. In case you are using WinRT, replace “float.IsInfinity” with “double.IsInfinity” and it should be OK.
Super 🙂
Hey,
I have one question about calculating the angle.
Can you tell me what formula you use to calculate it?
Vector3D a (x, y, z) = Elbow – Shoulder
Vector3D b (x, y, z) = Wrist – Shoulder
angle = arccos((a · b) / (|a| * |b|))
Do you use this one?
*Correction: b = Wrist – Elbow
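The corrected formula (with b measured from the Elbow, and a dot product rather than a cross product inside the arccos) can be checked numerically. A plain-Python sketch with made-up joint coordinates, describing an arm bent at a right angle:

```python
import math

def sub(p, q):  return tuple(pi - qi for pi, qi in zip(p, q))
def dot(a, b):  return sum(ai * bi for ai, bi in zip(a, b))
def norm(a):    return math.sqrt(dot(a, a))

# Made-up camera-space coordinates (meters) for a right-angled elbow pose.
shoulder, elbow, wrist = (0.0, 0.4, 2.0), (0.0, 0.1, 2.0), (0.3, 0.1, 2.0)

a = sub(shoulder, elbow)   # ray from the elbow toward the shoulder
b = sub(wrist, elbow)      # ray from the elbow toward the wrist (not the shoulder)
angle = math.degrees(math.acos(dot(a, b) / (norm(a) * norm(b))))
print(round(angle, 1))  # 90.0 for this pose
```

Both vectors must start at the vertex joint (the elbow); that is exactly the correction noted above.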
Hi Klara,
You can simply use the Angle method:
var start = body.Joints[JointType.ShoulderLeft];
var center = body.Joints[JointType.ElbowLeft];
var end = body.Joints[JointType.WristLeft];
var angle = center.Angle(start, end);
Source code
Yes, it was, but what’s behind it? Which calculation?
Hi Klara. You mean this code? Source.
I have a question, if you don’t mind.
Can I use the angle calculation to figure out whether the hand (arm) is near to the mouth or far from the mouth?
Hello Hanan. Sure, you can check whether the hand is near to or far from the mouth by using the following Vitruvius code:
const float threshold = 0.1f; // 10 cm - change this
var hand = body.Joints[JointType.HandRight].Position; // or JointType.HandLeft
var mouth = face.Mouth;
var distance = hand.Length(mouth);
if (distance < threshold) { /* Hand is close to the mouth. */ }
else { /* Hand is far from the mouth. */ }
Let me know if that helped you (you can check the Face sample to learn how to acquire the Face object).
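The check in the reply above reduces to the Euclidean distance between two 3D points. A plain-Python sketch of the same idea (the coordinates are made up; only the threshold value comes from the reply):

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points (meters)."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

THRESHOLD = 0.1  # 10 cm, same tunable threshold as in the reply above

hand  = (0.05, 0.30, 1.20)   # made-up hand position
mouth = (0.02, 0.35, 1.22)   # made-up mouth position

d = distance(hand, mouth)
print("near mouth" if d < THRESHOLD else "far from mouth")  # near mouth
```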
Can somebody help me find the angle between the Right Elbow and Right Knee?
Can anybody provide sample code?
I don’t know which extensions to use; I need a proper tutorial.
Hello. We can definitely help you. An angle needs 3 points:
– Right Elbow
– Right Knee
What is the third one? Could you probably send a picture of the angle?
I got the code MathExtensions.cs from you……
Now my doubt is how to start coding in C#.net..
My objective is to find the Angle using the joints as follows..
1. Right Wrist, Right Elbow, Right Shoulder: elbow angle
2. Right Hip, Right Knee, Right Ankle: knee angle
How to include the extensions in Visual Studio?
Please give a complete tutorial..
Here’s how you can include Vitruvius in your Project:
1) Open the downloaded Vitruvius folder and navigate to the Assemblies folder.
2) Open the WPF or WinRT folder, based on the type of project you want to create. If you don’t know what type of project you have, it’s most probably WinRT.
3) Open Visual Studio and create a new WPF or WinRT project, targeting .NET 4.5+.
4) Right-click the References item and select “Add new reference”.
5) Browse to the folder you opened in Step #2.
6) Select LightBuzz.Vitruvius.dll.
7) In your .cs file, include the following reference:
using LightBuzz.Vitruvius;
That’s it. You can now access all of the Vitruvius extensions.
For example:
var shoulder = body.Joints[JointType.ShoulderRight];
var elbow = body.Joints[JointType.ElbowRight];
var wrist = body.Joints[JointType.WristRight];
var hip = body.Joints[JointType.HipRight];
var knee = body.Joints[JointType.KneeRight];
var ankle = body.Joints[JointType.AnkleRight];
var angle1 = elbow.Angle(shoulder, wrist);
var angle2 = knee.Angle(hip, ankle);
using System.Windows;
using LightBuzz.Vitruvius;
using Microsoft.Kinect;
namespace AngleMeasurement
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : Window
{
public MainWindow()
{
InitializeComponent();
var shoulder = body.Joints[JointType.ShoulderRight];
var elbow = body.Joints[JointType.ElbowRight];
var wrist = body.Joints[JointType.WristRight];
var hip = body.Joints[JointType.HipRight];
var knee = body.Joints[JointType.KneeRight];
var ankle = body.Joints[JointType.AnkleRight];
var angle1 = elbow.Angle(shoulder, wrist);
var angle2 = knee.Angle(hip, ankle);
}
}
}
Getting the following error: The Name ‘body’ doesn’t exist in the current context..
I want to use the arc tool to display the above two angles in live mode…
What’s the code? I’ve to type..
Thanks in advance..
Hello. What you asked for is in the Samples folder.
1) Open the downloaded Vitruvius folder and navigate to the *Samples* folder.
2) Open the WPF or WinRT folder, based on the type of project you want to create.
3) Open the .sln file using Visual Studio.
4) Check the AnglePage.xaml.cs.
It’s a complete example, using the Arc control and the Angle extension method.
Dear sir, I know what you mean; I’ve already explored the code of AnglePage.xaml.cs,
but the code you have given and the code inside AnglePage.xaml.cs are totally different.
Please tell me the reason for this error
Getting the following error: The Name ‘body’ doesn’t exist in the current context..
using System.Windows;
using LightBuzz.Vitruvius;
using Microsoft.Kinect;
namespace AngleMeasurement
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : Window
{
public MainWindow()
{
InitializeComponent();
var shoulder = body.Joints[JointType.ShoulderRight];
var elbow = body.Joints[JointType.ElbowRight];
var wrist = body.Joints[JointType.WristRight];
var hip = body.Joints[JointType.HipRight];
var knee = body.Joints[JointType.KneeRight];
var ankle = body.Joints[JointType.AnkleRight];
var angle1 = elbow.Angle(shoulder, wrist);
var angle2 = knee.Angle(hip, ankle);
}
}
}
The code I typed in my previous comment should be placed inside the Reader_MultiSourceFrameArrived method. This is where you retrieve the Body object.
Dear sir, thanks for your help, but now I’ve one final doubt..
Please help me..
using (var frame = reference.BodyFrameReference.AcquireFrame())
{
if (frame != null)
{
var bodies = frame.Bodies();
_userReporter.Update(bodies);
Body body = bodies.Closest();
if (body != null)
{
var shoulder = body.Joints[JointType.ShoulderRight];
var elbow = body.Joints[JointType.ElbowRight];
var wrist = body.Joints[JointType.WristRight];
var hip = body.Joints[JointType.HipRight];
var knee = body.Joints[JointType.KneeRight];
var ankle = body.Joints[JointType.AnkleRight];
var angle1 = elbow.Angle(shoulder, wrist);
var angle2 = knee.Angle(hip, ankle);
viewer.DrawBody(body);
angle.Update(body.Joints[JointType.ShoulderRight], body.Joints[JointType.ElbowRight], body.Joints[JointType.WristRight], 100);
angle.Update(body.Joints[JointType.HipRight], body.Joints[JointType.KneeRight], body.Joints[JointType.AnkleRight], 100);
tblAngle.Text = ((int)angle.Angle).ToString();
//viewer.DrawBody(body);
//angle.Update(body.Joints[_start], body.Joints[_center], body.Joints[_end], 100);
//tblAngle.Text = ((int)angle.Angle).ToString();
}
}
}
}
void UserReporter_BodyEntered(object sender, PlayersControllerEventArgs e)
{
}
void UserReporter_BodyLeft(object sender, PlayersControllerEventArgs e)
{
viewer.Clear();
angle.Clear();
tblAngle.Text = "-";
}
}
}
With the code above, I was able to get only the knee angle; the code that shows the elbow angle gets overwritten.
Do I need to create a separate method for the elbow angle as well?
How can I get both angles simultaneously?
Hello. You’ll simply need to add a second Arc control in your XAML file.
<controls:KinectAngle x:Name="angle1" Opacity="0.5" />
<controls:KinectAngle x:Name="angle2" Opacity="0.5" />
…and update both controls accordingly:
angle1.Update(body.Joints[JointType.ShoulderRight], body.Joints[JointType.ElbowRight], body.Joints[JointType.WristRight], 100);
angle2.Update(body.Joints[JointType.HipRight], body.Joints[JointType.KneeRight], body.Joints[JointType.AnkleRight], 100);
While displaying the various angles on the screen at the same time, is there a better way than the code below?
Please also suggest how to store those values in real time in .csv format.
Please send your code to support@lightbuzz.com, since it’s not displayed in the comments.
Hey, can you tell me
How big is the error/deviation of Kinect 2?
If you calculate an angle of 90 degrees, what is the error or standard deviation, for example at a distance of 2 m?
Hi Klara. It depends on the type of the angle. For example, the shoulder angles are more accurate than the knee angles, regardless of the deviation.
And compared to reality?
As long as a joint is visible (and e.g. not hidden by another body part), the measurements are very close to the reality (cm or mm level accuracy).
Thanks for your answer.
If I have an error of 1 cm in a point (shoulder angle calculated with SpineShoulder, Shoulder, Elbow), then I get an angle with an error of about 4 degrees. Is this right?
It should be normal. You could smooth the values across time to create a more accurate perception of the results.
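One simple way to smooth the values across time, as suggested above, is an exponential moving average over successive readings. A Python sketch (this is not part of Vitruvius; the readings are made up):

```python
def smooth(readings, alpha=0.3):
    """Exponentially smooth a stream of noisy angle readings.
    Lower alpha = smoother but laggier output."""
    value = None
    for reading in readings:
        value = reading if value is None else alpha * reading + (1 - alpha) * value
        yield value

noisy = [90.0, 94.0, 87.0, 92.0, 89.0]   # made-up per-frame elbow angles
print([round(v, 1) for v in smooth(noisy)])  # [90.0, 91.2, 89.9, 90.6, 90.1]
```

Notice how the smoothed stream stays near 90 despite per-frame jitter of several degrees; the trade-off is a small lag behind genuine movement.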
Hi Vangos, can you please tell me how to include my ankle angle? Currently, all other joint angles are captured except the ankle (I’m using Unity + the Vitruvius pack + Kinect v2). I would also like to capture the two eyes.
Hello Muthug. Thanks for using Vitruvius! What ankle angle do you need? You can use the Angle extension method to measure the angle between 3 points. If you need the rotation information, then use the joint orientation property:
var orientation = body.Joints[JointType.AnkleLeft].Orientation;
This is a quaternion actually.
To get the position of the eyes, simply use the Face extension methods (check the Face sample inside the Avateering Unity scene):
var eyeLeft = face.EyeLeft;
All of the Face properties (like EyeLeft, EyeRight, Jaw, etc) are CameraSpacePoints, just like the original joint positions.
Let me know if you need any additional help.
Vangos, I’m using your sample Unity project. I edited the Avateering sample scene, calling only the skeleton model. The skeleton mirrors whatever action I do and shows all joint angles except the ankle angle. Which file do I edit to make the skeleton model show the ankle angle too? (Only 6 angles are shown in your sample scene file; please help me display the 7th angle, marked yellow in the uploaded image:
http://pictub.club/image/HAPZm
Hello Muthug,
You can display whichever angle you need. Here’s how:
* Launch the Unity editor and open the Angle scene.
* From your Hierarchy view, select the “Angle Sample”.
* Navigate to the Inspector window and find the Joint Peaks array.
* Modify the start/center/end joints of an array element or add a new one.
Done! (I have sent a picture to your email, too).
Vangos
Hello Vangos, I would like to ask whether it is possible to use Vitruvius not live, but to parse depth videos from datasets, calculate the angles among the joints within each frame, and store them.
Hello, Argie. Sure! Using the Video Recording and Playback of Vitruvius, you can replay the 3D data, acquire every single frame, and calculate the angles without connecting a Kinect sensor.
Hello,
Thanks for your support. I have some questions:
1. Is there an optimal way to choose specific skeletal angles to detect a gesture?
2. How many angles do I need to measure to detect an exact gesture? Is there a limit?
3. Is it possible to find the angle of RightShoulder, SpineBase, and RightKnee? I tried it, but it didn’t work. (I want to detect the angle of a bending human back, i.e. “bowing”.)
4. I couldn’t understand the “arc” clearly. Is it only for display purposes, or does it count in the calculations? I always set it to 50!
Thanks again
Hello, Aisha. Thank you for your message. Check my comments below:
1) It depends on the gesture. The more complex a gesture, the more angles you’ll need to consider. What gesture are you trying to measure?
2) You can measure as many angles as you like, depending on the gesture.
3) You can do the following:
var chest = body.Joints[JointType.SpineShoulder].Position;
var waist = body.Joints[JointType.SpineBase].Position;
var verticalAxis = new CameraSpacePoint { X = waist.X, Y = 0f, Z = waist.Z };
var angle = waist.Angle(chest, verticalAxis);
4) The Arc control is for display purposes. You can specify the points, as well as its radius. So, if you set it to 50, the radius of the arc would be 50 points.
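The vertical-axis trick in answer 3 is worth unpacking: the synthetic point shares the waist’s X and Z but has Y = 0, so the ray from the waist toward that point is vertical, and the angle between it and the waist-to-chest ray captures the forward lean. A plain-Python check with made-up coordinates:

```python
import math

def angle(start, center, end):
    """Angle in degrees at `center` between the rays toward `start` and `end`."""
    a = tuple(s - c for s, c in zip(start, center))
    b = tuple(e - c for e, c in zip(end, center))
    dot = sum(x * y for x, y in zip(a, b))
    mags = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mags))))

waist         = (0.0, 0.1, 2.0)
chest_upright = (0.0, 0.5, 2.0)
chest_bowing  = (0.0, 0.4, 1.7)       # torso leaning 45 degrees toward the sensor
vertical = (waist[0], 0.0, waist[2])  # same X/Z as the waist, Y = 0

print(round(angle(chest_upright, waist, vertical)))  # 180 -> standing straight
print(round(angle(chest_bowing, waist, vertical)))   # 135 -> bowed by 45 degrees
```

Standing straight yields 180 degrees (chest directly above, vertical point directly below); the deeper the bow, the smaller the angle.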
Hi Vangos,
thanks for this great tool! I have been playing around with it, and I tried to display the different joint angles right next to the actual joints in the live picture (like the picture of the athlete here on your webpage 😉 )
Unfortunately, I do not know how this can be achieved. Right now I can only show the angles on a canvas on top of the picture, like in your sample code. Do you have an idea to help me out?
Cheers,
Ben
Hi Ben. Thank you for your comment. You can definitely achieve the same results. It is not clear to me what problems you are facing. Could you please send us a picture of the results so far? Thank you!
Thanks, Vangos! I just sent you a mail with the picture for clarification 🙂
Hi,
Thank you for your Help!
Can I draw this (var angle = waist.Angle(chest, verticalAxis)) in the skeleton view, like my joint drawing in the Update method? Or can I only calculate this angle?
[…] one of my previous blog posts, I showed you how to measure joint angles using Kinect and C#. Today, we’ll dive into a more complex topic: in this article, you are going to learn how to […]
Hello Vangos,
I’m completely new to Visual Studio and Unity. I am looking to measure the angle arm and the shoulder. Can you provide a detailed tutorial or help me out on how to get started?
Hello Srick. To use Kinect with Unity3D, you need to download the Academic or Premium version of our software. Among others, the Unity samples include an angle calculation demo.
The C# code is identical to the one presented in this blog post:
double angle = elbow.Angle(shoulder, wrist);
Hello again
I’m so thankful for your service.
I would like to ask: if I want to define several gestures by body-joint angles, I think angle data from one person is not enough (true or false?). So I took 5 angle values (the angles that define my gestures) from 10 different persons for each gesture.
1. Are 10 persons enough?
2. Do I need a data-mining tool to search for the correct gesture among the several gestures of my app?
3. Before using the angle data in the data-mining tool, do I need to do some calculation on the data, or can I use it directly?
We would do all of that if we defined a gesture by joint positions; do we need to do it if we define the gesture by joint angles?
Thank you!
Hello Aisha. Vitruvius will let you acquire the values (e.g. angles/rotations) via an easy API. The way you’ll use the data depends on the use-case scenario.
You can measure the angles for as many joints and as many people as you like. Kinect supports up to 6 people simultaneously.
Dear Vangos,
Thanks for creating this tool, it can be of great use for me. I have some start-up problems though.
When I add the right reference, and add “using LightBuzz.Vitruvius;”, Visual Studio doesn’t recognize the references
used in the code above; the same would happen when you leave out “using LightBuzz.Vitruvius;”.
Do you know what the problem could be?
Thanks in advance
I’m working in WPF, I’m not sure if this is the way to go for calculating angles, but it might be important.
Hi David. To use the LightBuzz.Vitruvius.dll, you need to target .NET framework 4.5 or higher. Also, you need to install the official Microsoft Kinect SDK v2. Vitruvius for WPF relies on the Microsoft.Kinect.dll, which is part of the Visual Studio assembly list.
Let me know if that worked for you!
Hey Van,
Superb library for my robotics work. 🙂
Actually, I need your help to move a robotic hand through X, Y, Z angles.
Requirements:
- the number of servos required for each hand;
- the angles (shoulder, elbow, wrist) to fetch and assign to each servo.
—
Thanks
Hi Abhay. Thank you for your comment. Vitruvius can definitely help you measure the angles between the bones. For example:
var shoulder = body.Joints[JointType.ShoulderRight].Position;
var elbow = body.Joints[JointType.ElbowRight].Position;
var wrist = body.Joints[JointType.WristRight].Position;
var angle = elbow.Angle(shoulder, wrist);
You can also ignore an Axis by providing the Axis name as a parameter to the Angle() method.
I am not familiar with the framework you are using, so it is up to you to make the proper transformations and apply the motion to the servo motors.
Hello Vangos,
I hope you are fine. In the examples of the Angle page of Vitruvius, the angle is calculated as follows:
angle1.Update(body.Joints[_start1], body.Joints[_center1], body.Joints[_end1], 50);
but in the documentation, the center parameter is the first one:
public static double Angle(
this Joint center,
Joint start,
Joint end
)
So which one is correct? Will that make any difference?
I developed my project like the example, not like the documentation.
Is all my data wrong?
Hi Aisha. The Update method is part of the angle visual element. The angle visual element is a XAML User Control that displays an Arc on a 2D Canvas. The parameters are the joints you want to visualize (e.g. Shoulder – Elbow – Wrist).
The Angle() method is the one that calculates the actual value of the angle in the 3D space. The Angle() method is not related to XAML — it’s simply doing the Math 🙂
So, the Angle() method is doing the calculations and the Arc control is visualizing the result. You can use one of them or both. This is totally up to you.
Oh, I see. First, thanks for your fast response.
Q1: Where in the Angle page example is the real angle actually calculated?
Q2: Is what I’m doing in my project correct? I mean, is my data correct when I collect it only from the XAML user control, or would it be more accurate to use the real method?
Hi Aisha. Please, check my comments below:
Q1: Where in the Angle page example is the real angle actually calculated?
The Arc control is simply a “higher-level” visualization control. Internally, it uses the same angle-calculation methodology.
Q2: Is what I’m doing in my project correct? Is my data correct when collected only from the XAML user control?
Yes, it is totally fine.
Hi Vangos,
I have a question: I have already calculated the angles, but how can I draw the arc on the body skeleton and display it?
Hello Elvis. You can use the built-in AngleArc control (LightBuzz.Vitruvius.Controls.dll). You may also refer to the AnglePage.xaml file to see the control in action.
Its usage is fairly simple:
...
xmlns:controls="clr-namespace:LightBuzz.Vitruvius.Controls;assembly=LightBuzz.Vitruvius.Controls"
...
<controls:KinectAngle x:Name="angle1" Opacity="0.5" Fill="Green" />
In case you are using Unity, the Angle Sample scene includes an Arc prefab for you to use.
Hi Vangos. If I am not using Unity, can I also use the built-in AngleArc control?
Hello Elvis. Sure, the AngleArc control is also available in XAML. You need to use:
using LightBuzz.Vitruvius.Controls;
I have a question: I don’t know what your DrawBody function does, so I drew a new one, but my XAML is not similar to yours. I drew an angle, and it doesn’t appear on the skeleton; it is separate.
You need to specify the AngleArc points. You can use the 2D positions (e.g. Visualization.Color) of the target joints.
Oh, I mean: I output the display in the image, and the AngleArc does not match the skeleton. How can I use the 2D positions of the target joints to control the AngleArc?
That should work:
var shoulder = body.Joints[JointType.ShoulderLeft].Position;
var elbow = body.Joints[JointType.ElbowLeft].Position;
var wrist = body.Joints[JointType.WristLeft].Position;
var radius = 20.0;
angleArc.Update(shoulder, elbow, wrist, radius);
I am sorry, my explanation was wrong; can I have your email? I would like to send a photo of how my AngleArc is separate from the skeleton.
You can contact our support team: https://vitruviuskinect.com/support
Hi Vangos
How can I resize the image view?
I have tried many times, but it goes back to the normal size.
thanks
Hello David. Are you using Vitruvius 4 or 5?
Thank you, Vangos, for your reply. I am using Vitruvius 5,
but thanks, I already did it 😀
Awesome 🙂
Hello again!
I’m so thankful for your service.
I would like to ask four questions:
1. I want to measure the angle of bowing. I tried to use your code as follows, but I do not understand why it works. Can you help me explain it?
What are verticalAxis and Y = 0f? How do the three variables measure the bowing angle?
/***************************************************************/
var chest = body.Joints[JointType.SpineShoulder].Position;
var waist = body.Joints[JointType.SpineBase].Position;
var verticalAxis = new CameraSpacePoint { X = waist.X, Y = 0f, Z = waist.Z };
var angle = waist.Angle(chest, verticalAxis);
/***************************************************************/
2. How do I measure that part’s angle of rotation when I rotate my upper arm (the part between the shoulder and the elbow)? Can you show me the code?
3. I tried to use your code (var orientation = body.Joints[JointType.AnkleLeft].Orientation;) but it gives the error: “‘Microsoft.Kinect.Joint’ does not contain a definition for ‘Orientation’ and no extension method ‘Orientation’ accepting a first argument of type ‘Microsoft.Kinect.Joint’ could be found (are you missing a using directive or an assembly reference?)”
4. How do I get the X, Y, Z coordinates of a joint, such as WristRight.X, WristRight.Y, and WristRight.Z?
Hello Jeff. Please, check my comments below:
Thanks for your reply!
My second question is how to get the rotator cuff’s angle when my shoulder (rotator cuff) is rotating.
Can you show me the code? I am not sure you can understand what I am describing, so how can I send a picture to you? To your email, or…?
Thank you very much!
Hello Jeff. Feel free to send your support requests to support@lightbuzz.com.
Thank you very much! I have sent an email to you; please check it!
Hello George,
Can Vitruvius be used on a Linux system? I want to move this project to Linux; what should I do?
Thank you!
Yours sincerely,
Jeff
Hello Jeff. This is Vangos, founder of Vitruvius. Vitruvius is based on Windows binaries and middleware. Currently, the body-tracking middleware is only available on Windows.
Hello George,
I am so sorry for disturbing you again! I have found a problem: Kinect v2 cannot identify the bone angle information when the Kinect camera shakes. My Kinect, mounted on a handcart, cannot identify the information when the handcart goes across an uneven road. Can you help me solve this problem?
Thank you!
Yours sincerely,
Jeff
Hello Jeff. This is Vangos again. Kinect was designed by Microsoft to remain still; body tracking will not work if the device is shaking. The device should be positioned on a stable surface.
Hello, how can I verify that the angles are real, and what is the margin of error?
Hello Juan. The angles are calculated in the 3D space. The accuracy depends on the tracking accuracy of the joints. To check the tracking accuracy of a joint, simply use:
float confidence = body.Joints[JointType.ShoulderLeft].Confidence;
// 0.0 == Low confidence
// 1.0 == High confidence
Thank You!
Hi Vangos,
I’m researching spasticity rehabilitation in arms, and I’ve used some methods of Vitruvius to obtain, for example, the elbow angle. I’ve modified other methods to obtain the X, Y, Z coordinates of a joint in real time and show them on screen. I wanted to ask you if I can use this in my final project report. Of course, I will cite the work correctly.
I’ve read the License Agreement, but it’s not clear to me.
Thank you, this tool has been very useful for me!
Sure, you can definitely do that! I’m glad Vitruvius helped you with your research!
Hi Vangos,
Thank you for this wonderful tool. I am completely new to C# and was able to calculate and display the elbow angle from your code. But how can I calculate the angle just from the x, y, z coordinates of the joints? For example: If I have x, y, z coordinates (shoulder, elbow and wrist) of a skeleton, how could I calculate the elbow angle just from these joint coordinates?
Unfortunately, I do not know how this can be achieved. Could you please help me out?
Thanks,
Luke
Hello Luke. The Angle() method can be overloaded to receive 3 CameraSpacePoint (or Vector3D) arguments. A CameraSpacePoint (Vector3D) is simply a pair of X/Y/Z values. All you need to do is construct your CameraSpacePoint/Vector3D structs and pass them to the Angle() method.
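For reference, the math behind such an Angle() call is plain vector geometry: form the two vectors that share the middle point, then take the arc-cosine of their normalized dot product. Here is a self-contained sketch that computes the angle from raw X/Y/Z values without Vitruvius (the helper name and parameter layout are just illustrative):

```csharp
using System;

static double AngleBetween(
    double sx, double sy, double sz,   // start point (e.g. shoulder)
    double cx, double cy, double cz,   // center point (e.g. elbow)
    double ex, double ey, double ez)   // end point (e.g. wrist)
{
    // Vectors from the center point to the start and end points.
    double ax = sx - cx, ay = sy - cy, az = sz - cz;
    double bx = ex - cx, by = ey - cy, bz = ez - cz;

    double dot = ax * bx + ay * by + az * bz;
    double lenA = Math.Sqrt(ax * ax + ay * ay + az * az);
    double lenB = Math.Sqrt(bx * bx + by * by + bz * bz);

    // Angle between the two bone vectors, converted to degrees.
    return Math.Acos(dot / (lenA * lenB)) * 180.0 / Math.PI;
}
```

Feeding the three joint positions into a helper like this should give the same inner angle that the Vitruvius Angle() extension reports.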
Hi Vangos,
Thank you for the reply. It worked with the method which you said :). I used Update() method to find the angle and it works. But with Angle() method, I couldn’t make it work. Could you please tell me where I am going wrong? Below is my code using Update() and Angle() method:
Vector3D vector1 = new Vector3D(-0.11, 0.3, 1.7);
Vector3D vector2 = new Vector3D(-0.1, 0.1, 1.5);
Vector3D vector3 = new Vector3D(0.2, 0.1, 1.5);
angleElbow.Update(vector1, vector2, vector3, 25);
//double angle = elbow.Angle(shoulder, wrist);
double angle = vector2.Angle(vector1, vector3);
Hi Luke. There should be a third argument. The Angle() method needs 3 points to measure their angle.
You mean it should be as below? But I am getting error when I do so.
double angle = vector2.Angle(vector1, vector2, vector3);
tblAngle.Text = ((int)angle).ToString(); //to print the angle
Sorry for the inconvenience
This should work:
CameraSpacePoint start = new CameraSpacePoint
{
X = vector1.X,
Y = vector1.Y,
Z = vector1.Z
};
CameraSpacePoint center = new CameraSpacePoint
{
X = vector2.X,
Y = vector2.Y,
Z = vector2.Z
};
CameraSpacePoint end = new CameraSpacePoint
{
X = vector3.X,
Y = vector3.Y,
Z = vector3.Z
};
double angle = center.Angle(start, end);
Hello Vangos, now it works with the CameraSpacePoint. Thanks a lot 🙂
Well, I have one more doubt. I have calculated the elbow angle with the shoulder, elbow, and wrist joints. I am getting the inner elbow angle of the right arm (115 degrees), as in the Athlete image at the top. But how did you calculate the outer elbow angle (325)? Is there some other function for it?
I also calculated the shoulder angle with the SpineShoulder, Shoulder, and Elbow joints, but here I am getting the outer angle. I am not sure if you understand which angle I mean. I am a bit confused. Could you please clarify this?
Hello Vangos, I have a question regarding the project configuration and the target processor. I want to use the angle calculations in my project. My base project works with the project configuration “Debug” and the target processor “Any CPU”. I have tried the angle calculations in “Debug” and “x64”. But as this is not compatible with my base project settings, I am getting an error and am not able to use it in my project. What should I do to make it work with my base project settings? It would be a great help to me.
Thank You
Hello Marco. The LightBuzz.Vitruvius.Samples.WPF demo solution works with x64 architecture. What is the error message you are receiving?
Hi Vangos, thanks for the reply. I was using the AnglePage example from the Vitruvius WPF KinectV2 samples. It works with the project configuration “Debug” and the target processor “x64”. But as my base project is built with the project configuration “Debug” and the target processor “Any CPU”, I am not able to use the AnglePage example in my project and get this error:
Severity Code Description Project File Line Suppression State Warning There was a mismatch between the processor architecture of the project being built “MSIL” and the processor architecture of the reference “LightBuzz.Vitruvius, Version=1.0.0.0, Culture=neutral, processorArchitecture=AMD64”, “AMD64”. This mismatch may cause runtime failures. Please consider changing the targeted processor architecture of your project through the Configuration Manager so as to align the processor architectures between your project and references, or take a dependency on references with a processor architecture that matches the targeted processor architecture of your project.
Is there any way I can also make it work with the project configuration “Debug” and the target processor “Any CPU”, like my base project?
Thank you
The default configuration of Vitruvius is “Any CPU”. Switching to “Any CPU” should work just fine.
Vitruvius is compiled with the “Any CPU” configuration. If you download the package from scratch, are you able to run the Angle sample in Debug – Any CPU?
Hi Vangos, I have tried switching to the “Any CPU” configuration, but then I get this error; it only works when I switch to x64. Unfortunately, I am not able to get it to work.
I have downloaded the package from Git and used the AnglePage sample. As my base project is in Debug – Any CPU, I couldn’t use it. So I started with the skeleton-tracking source code from your website and then did the angle calculations without using Vitruvius.
Hi, Vangos. I was able to get the Yaw, Roll, and Pitch of a joint using your JointOrientationExtensions.cs code.
It seems to give each angle correctly, but I was wondering what exactly the reference axes for these angles are.
Quaternion math is too complicated for me to understand, and based on the answers at the Microsoft Community Forum, it is stated that:
– Bone direction (Y, green) – always matches the skeleton.
– Normal (Z, blue) – joint roll, perpendicular to the bone.
– Binormal (X, orange) – perpendicular to the bone and the normal.
Per that description, if the Z axis (normal) is perpendicular to the bone and the X axis is perpendicular to the bone and the normal, how am I supposed to find them, since it is not specified which direction to pick out of the infinitely many lines perpendicular to the bone direction?
Thank you.
Hello Harry. What is the link to the forum thread you are mentioning?
https://social.msdn.microsoft.com/Forums/en-US/a87049b5-7842-4c17-b776-3f6f4260c801/how-to-interpret-jointorientation-data
It’s Kevin’s reply in the above link.
The reason I’m looking into your JointOrientation code is that I want to find the pronation/supination (also known as arm roll) angle of the forearm. My understanding is that the ‘roll’ angle at the WristLeft/WristRight joint might give me this information. Am I interpreting the ‘roll’ angle here correctly?
Thanks.
You could do that, indeed. However, the result would be quite jittery. You would better use a combination of measurements, instead. For example, you could consider the position of the thumb joint, too.
Thank you for your reply.
I tried an experiment using your code to read the joint orientation of the wrist and elbow, but the ‘roll’ angle obviously doesn’t give the pronation/supination angle of the forearm. I was wondering if there are other ways to compute Yaw, Roll, and Pitch to fix this? I was thinking maybe it has something to do with the order of computing Yaw, Roll, and Pitch.
I would not expect accurate numbers, as this is a limitation of the device (too few data to properly track the rotation of the wrist). Instead, consider checking the angle between the wrist and the thumb joint.
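A rough sketch of that suggestion, using the same Vitruvius Angle() extension as the elbow example (the choice of the right arm and of the elbow/wrist/thumb joints is just an assumption; treat the result as a noisy proxy, not a true pronation/supination measurement):

```csharp
// Approximate forearm roll from the thumb position relative to the wrist.
CameraSpacePoint elbow = body.Joints[JointType.ElbowRight].Position;
CameraSpacePoint wrist = body.Joints[JointType.WristRight].Position;
CameraSpacePoint thumb = body.Joints[JointType.ThumbRight].Position;

// Angle at the wrist between the forearm and the thumb direction.
// This value changes as the forearm pronates/supinates, though it
// will be jittery because the thumb joint tracking is noisy.
double rollProxy = wrist.Angle(elbow, thumb);
```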
Hello Vangos, I would like to ask if it is possible to integrate measuring the distance between two joints into the AnglePage, such as the distance between the left hand and the head. And how can I draw the joint distance on the body skeleton and display it? I’m looking forward to your reply. Thank you.
Hello Shawn. Sure, you can integrate any functionality into the Angle page. You can check this MSDN thread for adding a line to your XAML code. The distance measurement would be calculated in the FrameArrived event of your C# code.
Thanks for your fast response, but I have two doubts.
1. I have integrated the code to measure the distance, but the distance is not displayed.
using (var frame = reference.BodyFrameReference.AcquireFrame())
{
    if (frame != null)
    {
        var bodies = frame.Bodies();
        _playersController.Update(bodies);

        Body body = bodies.Closest();

        if (body != null)
        {
            var joint1 = body.Joints[JointType.KneeLeft];
            var joint2 = body.Joints[JointType.KneeRight];
            var distance = MathExtensions.Length(joint1.Position, joint2.Position);

            viewer.DrawBody(body);

            angle1.Update(body.Joints[_start1], body.Joints[_center1], body.Joints[_end1], 50);
            angle2.Update(body.Joints[_start2], body.Joints[_center2], body.Joints[_end2], 50);

            tblAngle1.Text = ((int)angle1.Angle).ToString();
            tblAngle2.Text = ((int)angle2.Angle).ToString();
        }
    }
}
}

void UserReporter_BodyEntered(object sender, PlayersControllerEventArgs e)
{
}

void UserReporter_BodyLeft(object sender, PlayersControllerEventArgs e)
{
    viewer.Clear();
    angle1.Clear();
    angle2.Clear();

    tblAngle1.Text = "-";
    tblAngle2.Text = "-";
}
}
2. In the XAML, I get a fixed line segment. It does not follow the movement of the two joints.
Please help me, thank you.
In your C# code, you need to update the X1, X2, Y1, and Y2 values of the Line XAML object. The X1, X2, Y1, and Y2 values are the coordinates of the start/end points of the line.
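A minimal sketch of that update inside the FrameArrived handler, assuming the XAML contains a Line element named distanceLine and that a KinectSensor field named _sensor is available (both names are hypothetical):

```csharp
// Map the two tracked 3D joint positions to the 2D color space,
// then move the XAML line so it follows the joints each frame.
CameraSpacePoint p1 = body.Joints[JointType.KneeLeft].Position;
CameraSpacePoint p2 = body.Joints[JointType.KneeRight].Position;

ColorSpacePoint c1 = _sensor.CoordinateMapper.MapCameraPointToColorSpace(p1);
ColorSpacePoint c2 = _sensor.CoordinateMapper.MapCameraPointToColorSpace(p2);

// distanceLine is a <Line x:Name="distanceLine" ... /> element in the XAML.
distanceLine.X1 = c1.X;
distanceLine.Y1 = c1.Y;
distanceLine.X2 = c2.X;
distanceLine.Y2 = c2.Y;
```

Because this runs on every frame, the line segment tracks the joints instead of staying fixed.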