NewtonVR: Physics-based interaction on the Vive (Part 1)

Hey everyone, my name’s Nick, and I’m a Virtual Reality Developer at Tomorrow Today Labs, where we’re currently working on an unannounced title using the Unity3D game engine. Like most everyone working in Virtual Reality, I'm constantly running into new and unique challenges and experimenting to find the best solutions. When an experiment turns out to be especially interesting or useful, I'll share it here with the community.

We’ll be following up this post in just a few days: the Tomorrow Today Labs interaction system is going up on GitHub later this week, free to use under a Creative Commons license as a token of our appreciation to the community.

** Quick edit: Part 2 is located here with a GitHub release and brief readme.

Problem

We recently decided to rebuild our interaction system, which acts as the glue connecting player input from the Vive controllers to objects in game. For an interaction system, our initial methods felt very… un-interactive. Objects under interaction were exempt from physics. External forces were meaningless. Position was set absolutely. We wanted players to feel like they were using their hands to pick up objects; instead, it felt like players were replacing their hands with objects. It became obvious that if we wanted to create a really dynamic VR experience, physics-based interaction was the way to go.

Under the old interaction system, when a player tried to pick up or interact with an object with the Vive controller, that object would be parented to the controller and made kinematic. Through parenting, the position and rotation of the object were matched to the position and rotation of the player’s controller. When a player dropped an object it would be re-parented to the original parent. This led to a number of problems.

Old System - Position and Rotation Set Via Parenting

Since picked-up objects were kinematic, forces from gravity, collisions, and hinges no longer had any effect. The held object could still apply forces to other rigidbodies, but if it hit a wall or any other object lacking a non-kinematic rigidbody, the held object would clip right through. And if you dropped the interactable while it was clipping through a wall, it would end up stuck there. The world was suddenly non-Newtonian. Equal and opposite reactions were not observed. The mass and velocity of an object under interaction meant nothing.

Until we have perfect, constant force feedback (hello holodeck), convincing the player that objects in game have actual mass and presence will be challenging. The easy way to deal with the technical limitations is to say: “Ok player, we can’t prevent you from breaking our game by walking through walls or putting items in walls, but we can pretend you didn't do it”. However, I believe a game world where physics are always present and always working is much more compelling, even critical, in VR.

Creating a functional (and believable) virtual reality is about consistency. We want the player to put on the headset and immediately be able to unconsciously understand the laws that govern that world. Game physics don’t necessarily need to be identical to earth physics, but if we want the player to think critically about these virtual laws, to use and subvert them to solve problems, then we have to make sure we apply these laws consistently across the game.

Old System - Inconsistent World Interaction
Notice how the tennis racket is interacting with the ping pong balls. Sometimes the racket correctly knocks the balls around, other times the racket seems to pass right through them.

protected void OnAttach(VRHand hand)
{
    …
    // Disable physics on the object and snap it to the controller via parenting.
    this.GetComponent<Rigidbody>().isKinematic = true;
    this.transform.parent = hand.transform;
    this.transform.forward = hand.transform.forward;
    this.transform.position = hand.transform.position;
    …
}

Wanting a physics-based interaction system is all well and good, but there are some serious hurdles to consider:

1) As objects under interaction would no longer clip through colliders, we had to deal with the scenario of a virtual object being blocked while a player’s real hand continues moving. Suddenly the player’s hand is disconnected from the object. And what about external forces on the object?

  • Do we provide a visual clue to remind the player that they are still holding the object? Like a skeletal hand with the item’s outline, à la Surgeon Simulator?
  • Or do we immediately knock the object out of the player’s hand, like what happens in Job Simulator?

2) What if the object under interaction is attached by a hinge (like a switch, door handle, knob)? What happens if the player moves the controller in a way that would be impossible for an object to follow?

  • Either the player remains interacting with the control at a distance and we provide a visual clue.
  • Or the player loses interaction, as though the object slipped from their grasp.

Physics is Hard

Even though we wanted a physics-based interaction system, we didn’t limit ourselves when experimenting. We tried both physics-based and non-physics-based solutions for a large variety of scenarios.

  • Unattached Interactables: Our game is full of interactable items that can be picked up and moved freely.

Our solution was to create a new position or rotation every update and use interpolation to smooth out the movement.

Position and Rotation via Interpolation

protected void UpdatePosition(VRHand hand)
{
    …
    // Step toward the hand's position and rotation, then lerp/slerp the
    // rigidbody toward those targets to smooth out the movement.
    Vector3 toHandPos = Vector3.MoveTowards(this.transform.position, HeldHand.transform.position, 1);
    Quaternion toHandRot = Quaternion.RotateTowards(this.transform.rotation, HeldHand.transform.rotation, 1);
    RigidBody.MovePosition(Vector3.Lerp(RigidBody.transform.position, toHandPos, Time.deltaTime * AttachedPositionMagic));
    RigidBody.MoveRotation(Quaternion.Slerp(RigidBody.transform.rotation, toHandRot, Time.deltaTime * AttachedRotationMagic));
    …
}
  • Attached Interactables: Some interactables were attached in place using a hinge (doors, hatches, windows, etc).

Since these objects follow a single path, changing the position programmatically might move the object to an invalid position. On the flip side, if you know the path, magnitude, and starting point, it’s relatively easy to figure out the end point. By measuring the relative position of the hand to the starting point, we can figure out what percentage of the total path the object has traveled, and interpolation does the rest of the work.

Fixed Object Interaction via Interpolation

…
// How far along its arc has the hand dragged the object?
float currentAngle = Vector3.Angle(startDir, lockedHandDir);
float pull = currentAngle / DegreesOfFreedom;
sphericalMapping.value = pull;
if (repositionGameObject)
{
    if (inBounds)
    {
        // Interpolate between the start and end orientations by that percentage.
        RotationPoint.rotation = Quaternion.Slerp(start.rotation, end.rotation, pull);
    }
}
…

In the animation above, when the player began interacting with the hatch, the hatch was made kinematic and its rotation was driven by the preceding code segment. When the interaction ended, the hatch became non-kinematic again and physics took back over. That way, the player could slam the hatch closed, or let go and watch physics drag it down.
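As a rough sketch of that hand-off (the callback names here are illustrative, not our actual API), beginning and ending an interaction only needs to flip the kinematic flag:

protected void OnBeginInteraction(VRHand hand)
{
    // While held, we drive the hatch's rotation directly, so physics is suspended.
    this.GetComponent<Rigidbody>().isKinematic = true;
}

protected void OnEndInteraction(VRHand hand)
{
    // On release, physics takes back over: gravity can drag the hatch down again.
    this.GetComponent<Rigidbody>().isKinematic = false;
}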

There is a caveat: We haven’t used configurable joints yet, or joints whose rotation is not limited to a single axis, and I’m not sure how, or if, slerp could be used for that purpose.

Use Physics

Our preferred solution was always an entirely physics-based approach, in which the object under control always remains non-kinematic, but this brought significant challenges of its own.

First, if we are going to move the object purely through physics (force, acceleration, or velocity), which quantity do we use, and how do we apply it?

Unity gives the user multiple options:

1) I can change the velocity of any Rigidbody directly. Unity’s documentation discourages this, as it bypasses acceleration and can result in unnatural-looking motion, but it was important for us to try every possible solution.

2) I can also apply a force or torque to a Rigidbody with AddForce(). Unity will apply this force vector according to the chosen ForceMode (see the sketch after this list):

  • Force (Mass * Acceleration): a continuous force that accounts for mass
  • Acceleration (Velocity / Time): a continuous acceleration that ignores mass
  • VelocityChange (Position / Time): an instant velocity change that ignores mass
  • Impulse (Mass * Velocity): an instant force impulse that accounts for mass
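To make those options concrete, here’s a minimal sketch (the method name is hypothetical, and in practice you would pick a single mode rather than stacking all four):

protected void ApplyToHand(VRHand hand)
{
    Vector3 toHandPos = (hand.transform.position - this.transform.position);

    // Option 1: set the velocity directly, bypassing acceleration entirely.
    AttachedRigidbody.velocity = toHandPos;

    // Option 2: let PhysX integrate the vector according to the chosen ForceMode.
    AttachedRigidbody.AddForce(toHandPos, ForceMode.Force);          // continuous, scaled by mass
    AttachedRigidbody.AddForce(toHandPos, ForceMode.Acceleration);   // continuous, ignores mass
    AttachedRigidbody.AddForce(toHandPos, ForceMode.Impulse);        // instant, scaled by mass
    AttachedRigidbody.AddForce(toHandPos, ForceMode.VelocityChange); // instant, ignores mass
}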

Second, if we use forces, how do we go about measuring and calculating the vectors? If we can’t come up with a magnitude and direction each frame that matches the player’s intention, then the game is unplayable.

Our Solution

In the end, we came to a surprising solution. Instead of applying forces, we found that directly changing the velocity of the interactable was the best way to simulate picking up an object and still allow that object to be affected by outside forces.

We originally tried to move the object’s position and rotation via AddForce() with ForceMode.Force. The base vector was the difference between the position of the hand and the object under interaction, so the direction of the vector always pointed at the hand.

Position and Rotation via AddForce() w/ ForceMode.Force
(How a Jedi plays tennis)

protected void UpdateForceOnControl(VRHand hand)
{
    // A force vector pointing from the object toward the hand, scaled by a magic number.
    Vector3 toHandPos = (hand.transform.position - this.transform.position);
    AttachedRigidbody.AddForce(toHandPos * AttachedPositionMagic, ForceMode.Force);
}

The problem with this approach was that the magnitude of our vector was very small, because players only ever try to pick up objects they can already reach; the object is usually already overlapping the hand. As a workaround we used a multiplier. I’m not a fan of magic numbers, but it was the fastest solution in this case.

This did not generate the desired results. Items would fly around as the position of the object converged with the hand. Gravity would pull the object down, increasing the distance, until the distance was great enough to slingshot the item back toward the hand.

Then we tried applying the force with ForceMode.Acceleration.

protected void UpdateForceOnControl(VRHand hand)
{
    // Estimate the controller's acceleration and scale it by the hand's mass.
    Vector3 force = hand.GetComponent<Rigidbody>().mass * this.GetComponent<VelocityEstimator>().GetAccelerationEstimate();
    AttachedRigidbody.AddForce(force * AttachedPositionMagic, ForceMode.Acceleration);
}

The problem here was that forces were only applied when the player’s controller was moving. If the player moved quickly, the resulting force would be large; if the player moved slowly, the resulting force would be small.

These results make sense, but what happens if the player is not moving at all? In that case no force is applied, so the object simply falls under the effect of gravity. And the vector was still too small; we still needed a magic number.

And just to see what the result would be, we decided to try changing the velocity directly. The vector would still be the difference between the interactable’s position and the hand’s position, but we would not be applying any forces using AddForce().

Position and Rotation Set via Rigidbody Velocity

// Point the velocity straight at the hand, scaled by our magic number.
PositionDelta = (AttachedBy.transform.position - InteractionPoint.position);
this.Rigidbody.velocity = PositionDelta * AttachedPositionMagic * Time.fixedDeltaTime;

This turned out to be a huge success. Items picked up would quickly move towards the controller, and would follow the controller with a slight delay. When the item position matched the controller position, the velocity was set to zero. There was no need to continually add forces, and then counteract those forces with additional forces (causing the item to shake).

Every Unity Rigidbody has a velocity, which should usually not be modified directly, but because we needed the object under interaction to move smoothly towards the player’s controller and stop when it got there, modifying the velocity made sense. It also meant that the object could be affected by external forces, yet would continue to move towards the player unless the controller moved outside of the interaction trigger faster than the velocity could be updated.

Although a direct velocity change is unrealistic because it does not take mass into account, this makes perfect sense from the player's perspective. A player never feels the weight of a virtual object, and the speed they can move an object is determined only by how fast they can swing the controller.

Quaternion Quagmire

Rotations are quaternions, and quaternion subtraction is not meaningful (nor is it defined through operators in Unity), so we cannot simply take (Hand Rotation - Object Rotation) and end up with the rotational difference. The other issue is that angular velocity is a vector quantity, not a quaternion, meaning we actually have to calculate the angular velocity.

Our solution to the problem of rotation finally came from representing the quaternion rotation (from our starting orientation to our ending orientation) in an angle-axis format.

We knew that we could find the rotation quaternion between two orientation quaternions (GameObject.transform.rotation is an orientation quaternion) by multiplying the ending orientation by the inverse of the current orientation (RotationDelta = AttachedHand.transform.rotation * Quaternion.Inverse(this.transform.rotation);).

If we did this every frame, we would end up with a small rotation delta, a quaternion representing the rotation of that frame. We could get the angle-axis representation from here (RotationDelta.ToAngleAxis(out angle, out axis)).

Using the Angle and the Axis, we could calculate the three-dimensional angular velocity: Axis was our unit vector, Angle was the angular displacement, and the time was the duration of the physics step (Time.fixedDeltaTime * angle * axis). Strictly speaking, angular velocity is displacement divided by time, but in practice the magic multiplier you’ll see below absorbs the scaling and the degree-to-radian conversion.

At this point we still had major problems.

1) The calculated rotations would sometimes be the "long way around". That is, if the desired rotation was only 10 degrees off the starting rotation, we saw situations where the object would rotate 350 degrees to get there.

2) If we increased the speed of rotation, we saw a corresponding increase in this really terrible vibration that would occur right at the end of the rotation, right when the orientation of the object had nearly matched the orientation of the controller.

It took a while to figure it out, but when we got there, the solutions were not complicated.

Our first problem was related to the fact that our rotation only ever occurred in one direction. Technically, angular velocity is a pseudovector, with the direction of rotation specified by the right-hand rule. This meant that while the magnitude of our calculated angular velocity was correct, the sign was not. The solution was to look at the magnitude of the angular displacement: if the angle was greater than 180 degrees, we knew it would be faster to reverse the direction of rotation, so we simply subtracted 360 degrees from the angle, flipping the sign of the vector and making the rotation occur in the other direction.

The second problem was related not to our math, but to Unity. All rigidbodies have a max angular velocity. In order to have our rotation occur fast enough, we used a large multiplier, but this also caused the angular velocity to go far beyond the default max angular velocity for that rigidbody. The solution was to simply raise the max angular velocity during the instantiation of an interactable item.
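The fix itself is a one-liner at setup time, something like the sketch below (the exact cap is arbitrary; anything comfortably above the speeds you need will do):

protected void Awake()
{
    // Unity clamps angular velocity (maxAngularVelocity defaults to 7 radians/sec),
    // which was the source of the end-of-rotation vibration. Raise the cap so our
    // large rotation multiplier isn't clipped.
    this.GetComponent<Rigidbody>().maxAngularVelocity = 100f;
}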

// The frame's rotation delta as a quaternion...
RotationDelta = AttachedHand.transform.rotation * Quaternion.Inverse(this.transform.rotation);
PositionDelta = (PickupTransform.position - this.transform.position);
// ...converted to an angle-axis representation.
RotationDelta.ToAngleAxis(out angle, out axis);
// Take the short way around if the rotation is more than a half turn.
if (angle > 180)
    angle -= 360;
this.Rigidbody.angularVelocity = (Time.fixedDeltaTime * angle * axis) * AttachedRotationMagic;
this.Rigidbody.velocity = PositionDelta * AttachedPositionMagic * Time.fixedDeltaTime;

Equal and Opposite

More importantly, interactions between objects still take mass into account. If the player wishes to push a large mass in game, they’ll either have to use a mass of equal or greater value, or quickly accelerate a small mass into the larger one. Pushing a box with a balloon is not nearly as effective as using another box. None of these interactions had to be manually implemented; they’re handled by Unity and PhysX because we use a physics-based interaction system.

Mass and Interaction - Hitting a Box with a Balloon

Mass and Interaction - Hitting a Box with a Box

Final Feelings

When we started down the path of an interaction system fully grounded in physics I don't think any of us on the team realized how much time and effort it would require. It was also surprising just how right using velocity to control objects in VR feels. Ultimately, having VR interaction done through physics has given us the freedom to move forward in implementing interactables in the game knowing that no hacks or gimmicks will be required for them to act in a believable and consistent way in the game world.

Interaction plays such a huge role in virtual reality, and can be such a gripping experience, that we think it can be an integral part of many virtual reality titles.

We'll be posting our interaction code to GitHub in the near future so that other people can use it freely. We very much believe in the promise of Virtual Reality to change the way people interact not just with games, but with technology. And we believe that, as a community, we succeed together.

AUTHOR

Nick Abel

Nick believes Virtual Reality is the medium of poets and dreamers. He believes in the power of coffee, rain, and games to spark the imagination. He believes in the Oxford comma.
