A Shaky Start to Motion Controls

Blog / Tanya Frank / December 6, 2017

The Trouble with Motion

Realistic visualisation of human movement in a virtual environment is crucial for user engagement. In a recently completed DSTIL project, we were tasked with mirroring the real-life arm movements of a human presenter with a set of digital arms on a large screen. The arms were also required to interact with various objects at predetermined locations within the scene, complicating the task. To add further challenge, we had little say in the development platform, Unreal Engine 4 (UE4), which we had not used beyond small-scale projects. This blog post describes our experiences in finding a motion control solution for a challenging project with a short timeline.

Looking for Solutions but Finding Problems

Our initial idea was to use some form of specialised motion-control device to track the movement of the user’s entire arm, paired with existing animations for specific object interactions to allow for accurate hand-to-object placement. While there are a number of potential options for consumer-level motion control, we were limited by a very strict set of requirements. We looked at the following devices:

  • Leap Motion
  • PlayStation Move
  • Oculus Touch
  • HTC VIVE Controllers
  • Microsoft Kinect 2
  • Myo Armband

While they’re all super fun to play around with, only one device ended up fitting the specifications required: the Microsoft Kinect 2. As the project was being developed within UE4, we utilised the plugin Kinect4Unreal (K4U) to bridge the gap between the Microsoft Kinect SDK and UE4.

K4U allows the developer to either match a full skeleton from the Kinect 2 to the equivalent Mannequin within UE4, or to be more specific and match individual joints to bones directly. Since we did not need a full skeleton representation, we attempted to match joints individually. Initial attempts correctly mapped arm movements to their digital equivalents, but a number of quirks arose. Basic gestures and movements looked fairly realistic; however, too much movement could cause the digital arms to deform unnaturally or jerk around significantly. At the other end of the scale, no movement at all still produced a jittering effect which made the arms look unnaturally shaky. Additionally, the Kinect 2 is engineered for a typical lounge room environment where users face directly toward the sensor – in the presentation environment the sensor was mounted high up and further away than the ideal case, causing some jitter in the resulting pose, especially in in-between joints such as the elbows.

Mapping individual joints to their corresponding bones often resulted in conflicting and unstable arm positions. There were no constraints placed on a bone relative to the bones attached on either side of it. This created the appearance of unnatural movement as the bones attempted to bridge the gap between the joint positions, regardless of realistic human movement capabilities.

While investigating the random and unnatural movements of the arm, we discovered that the default smoothing functionality (present within the Kinect 1 SDK) had been removed by Microsoft in the Kinect 2 SDK. This was to allow developers to implement their own smoothing without being ‘locked’ into a single approach.

K4U implements a simple smoothing function that can be utilised. This, however, did not appear to be appropriate for the more precise movements required in our scenario. It became clear that this was a problem we’d have to fix ourselves.
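To make the idea of joint smoothing concrete, here is a minimal sketch of one common approach: a single exponential filter applied per joint. This is our own illustration rather than the filter the Kinect 1 SDK shipped or the one K4U provides, and the smoothing factor is a placeholder you would tune against latency.

```cpp
// Minimal sketch of per-joint exponential smoothing, assuming joint
// positions arrive as simple 3D vectors once per frame. This is a plain
// single-exponential filter, not the Kinect 1 SDK's own filter; the
// alpha value is illustrative and would need tuning.
struct Vec3 { float x, y, z; };

class JointSmoother {
public:
    explicit JointSmoother(float alpha) : alpha_(alpha) {}

    Vec3 Update(const Vec3& raw) {
        if (!initialised_) {
            smoothed_ = raw;
            initialised_ = true;
            return smoothed_;
        }
        // Blend the new sample with the previous estimate; a lower alpha
        // means heavier smoothing (less jitter, more latency).
        smoothed_.x = alpha_ * raw.x + (1.0f - alpha_) * smoothed_.x;
        smoothed_.y = alpha_ * raw.y + (1.0f - alpha_) * smoothed_.y;
        smoothed_.z = alpha_ * raw.z + (1.0f - alpha_) * smoothed_.z;
        return smoothed_;
    }

private:
    float alpha_;
    Vec3 smoothed_{};
    bool initialised_ = false;
};
```

In practice one instance per tracked joint (with a separate alpha for noisier joints such as the elbows) is enough to take the edge off the jitter, at the cost of a small amount of lag.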

Not Quite Solutions…

To address this problem, we attempted a number of different solutions and found that each had issues which prevented us from adopting it.

  1. Reimplement the default Microsoft smoothing from the Kinect 1 SDK to work with the Kinect 2 SDK.
    After discovering the lack of joint smoothing functionality in the Kinect 2 SDK, we attempted to port the Kinect 1 code to the Kinect 2. Unfortunately, this was not as simple as we had anticipated, and we were unable to directly convert the code within the limited time and resources available for the project.
  2. Create a new Unreal 4 Skeleton with constraints placed on specific bones.
    Creating a biped skeleton that allowed specific constraints to be placed on individual bones did not work as intended: one arm would display correctly, while the movements of the other were reversed. Video games often use complicated constraint rigs to blend character skeletons between animation and physics-driven movement; however, the time required to set up sufficiently accurate constraints, and thus solve the mirroring issue, was beyond the resources of this project.
  3. Attempt to constrain variables within UE4 blueprints.
    UE4 provides the Blueprints Visual Scripting system, a node-based way of implementing gameplay features from within the editor. Blueprints are intended as an alternative to implementing gameplay features in C++, although the two can be used together if desired. It quickly became apparent that using Blueprints to constrain the arm variables would be time and resource consuming and susceptible to many flaws: the number of nodes required would produce a large, clunky Blueprint file that would quickly become difficult to maintain (a rough sketch of the kind of per-joint clamping involved follows this list).
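As an illustration of what ‘constraining the arm variables’ means, the following C++ sketch clamps a single joint angle to a fixed range. The elbow limits shown are invented for illustration, not anatomical data; in Blueprints, this handful of lines becomes several nodes per axis, multiplied across every joint of both arms.

```cpp
// Rough illustration of per-joint clamping. The limits are hypothetical
// values chosen for the example, not measured human ranges of motion.
#include <algorithm>

struct JointLimits {
    float minDegrees;
    float maxDegrees;
};

// Clamp a single joint angle (in degrees) to its allowed range.
float ConstrainJointAngle(float angleDegrees, const JointLimits& limits) {
    return std::clamp(angleDegrees, limits.minDegrees, limits.maxDegrees);
}

// Example usage: keep a hypothetical elbow flexion between 0 and 150 degrees.
// const JointLimits kElbowFlexion{0.0f, 150.0f};
// float safeElbow = ConstrainJointAngle(rawElbow, kElbowFlexion);
```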

Inverse Problem Solving

Attempting multiple solutions and finding them inadequate is always demoralising, especially as the time left on the project seeped away. We needed a breakthrough, and fast. So we came at the problem from the opposite direction: rather than posing every joint of the virtual arms to follow the user’s arms, what if we used just the user’s hands to pose the virtual arms? Enter Inverse Kinematics (IK).

Rather than specifying every joint individually, we used the Kinect 2 to find the positions of the right and left hands only. The remaining joint and bone positions were determined using IK relative to the hand positions. IK determines the angles and positions of joints based on the desired pose of the character. The shoulder positions were set at fixed points on the character object, and the character’s arm model constrained the size and shape of the arm itself; whenever the character’s hand positions updated, this provided the desired pose, allowing the IK calculations to take place and the arm to be displayed correctly. This ensured that each bone would not move in an unnatural way relative to its connected neighbours.
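To show the underlying idea, here is a minimal, self-contained sketch of an analytic two-bone IK solve in plain C++. It is not UE4’s own IK node or K4U code: the Vec3 type, the bend hint and the bone lengths are illustrative stand-ins for whatever the engine actually provides.

```cpp
// Hedged sketch of analytic two-bone IK: given a fixed shoulder, a target
// hand position from the Kinect, the two bone lengths, and a hint for
// which way the elbow should bend, solve for the elbow and hand positions.
#include <algorithm>
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    float Dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    float Length() const { return std::sqrt(Dot(*this)); }
    Vec3 Normalised() const { float l = Length(); return {x / l, y / l, z / l}; }
};

struct ArmPose {
    Vec3 elbow;
    Vec3 hand;
};

ArmPose SolveTwoBoneIK(const Vec3& shoulder, const Vec3& target,
                       float upperArmLen, float forearmLen,
                       const Vec3& bendHint) {
    Vec3 toTarget = target - shoulder;
    // Clamp the reach so the target always stays attainable by the two bones.
    float dist = std::clamp(toTarget.Length(),
                            std::fabs(upperArmLen - forearmLen) + 1e-4f,
                            upperArmLen + forearmLen - 1e-4f);
    Vec3 dir = toTarget.Normalised();

    // Law of cosines: distance from the shoulder to the elbow's projection
    // onto the shoulder-to-hand line, and the elbow's height off that line.
    float proj = (upperArmLen * upperArmLen - forearmLen * forearmLen + dist * dist)
                 / (2.0f * dist);
    float height = std::sqrt(std::max(0.0f, upperArmLen * upperArmLen - proj * proj));

    // Push the elbow off the line in the direction of the bend hint.
    Vec3 side = bendHint - dir * bendHint.Dot(dir);
    if (side.Length() > 1e-4f) side = side.Normalised();

    ArmPose pose;
    pose.elbow = shoulder + dir * proj + side * height;
    pose.hand = shoulder + dir * dist;
    return pose;
}
```

Because the bone lengths are baked into the solve, the elbow can never stretch or drift away from the arm the way the directly mapped joints did; the worst case for a noisy hand sample is a slightly wrong, but still plausible, elbow position.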

This solution also had the advantage of simplifying the transition between user-controlled arm movements and pre-generated animations. UE4 offers blending functionality that determines how to transition from one animation to another, and for blending purposes the Kinect-driven IK arm movement is treated as just another animation. This allowed us to use an Animation State Machine to realistically transition between the user-controlled IK movements and the pre-generated animations.
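The essence of that blend is a per-joint interpolation between the IK-driven pose and the animation pose over a short window. The sketch below illustrates the idea in plain C++; UE4’s state machine and blend nodes do this for real, and the 0.25-second blend time and Vec3 type here are purely illustrative.

```cpp
// Minimal sketch of blending from a Kinect-driven IK pose into a
// pre-generated animation pose over a fixed window. Values are illustrative.
#include <algorithm>

struct Vec3 { float x, y, z; };

Vec3 Lerp(const Vec3& a, const Vec3& b, float t) {
    return {a.x + (b.x - a.x) * t,
            a.y + (b.y - a.y) * t,
            a.z + (b.z - a.z) * t};
}

struct PoseBlender {
    float elapsed = 0.0f;
    float blendSeconds = 0.25f;  // how long the transition takes

    // Call once per frame for each joint; returns the blended position.
    Vec3 Blend(const Vec3& ikJoint, const Vec3& animJoint, float deltaSeconds) {
        elapsed = std::min(elapsed + deltaSeconds, blendSeconds);
        float alpha = elapsed / blendSeconds;  // 0 = pure IK, 1 = pure animation
        return Lerp(ikJoint, animJoint, alpha);
    }
};
```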

Moving On

Although we spent a lot of time evaluating devices and solutions, we learnt a lot about how to approach this problem, and about user engagement in virtual reality software more generally. The ultimate solution provided the realism our client required, while also easing secondary requirements such as animation blending. If you’re on a tight timeline for delivery of a project involving motion control features, consider using IK.


Header image courtesy of Max Pixel.

Thanks to Joost Funke Küpper, Stuart Cameron, and Shannon Pace for reviewing this post and providing suggestions.