For an overview of the project, click here.

This post outlines the work that has been completed thus far.

  1. Unreal Engine 4.14 was selected as the development environment for this project, partly because it was the environment recommended by the course in which this project was initially developed, and partly because of my relative lack of experience with creating VR content. The HTC Vive was selected as the VR platform because it provided the most robust and reliable player tracking information.
  2. After working through some basic tutorials in modelling, coding (Blueprints) and animation in Unreal Engine 4.14, it was determined that iKinema’s real-time IK (inverse kinematics) middleware plugin for Unreal Engine was required to drive the dynamically generated movement of the NPCs. This plugin allowed the transform (position and orientation) information of the Vive’s two hand controllers and the VR headset to dynamically animate each of the NPCs.
    Example of iKinema being used to drive a full-body VR experience. Source:
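Conceptually, the plugin turns each tracked device into a goal for the IK solver: the headset drives the head/neck, and each controller drives a hand. The types, function names and neck-offset calibration below are illustrative assumptions, not iKinema’s actual API:

```cpp
#include <cassert>

// Minimal stand-ins for the engine's math types; iKinema's real API differs.
struct Vec3 { float x, y, z; };

struct EffectorGoal {
    Vec3 position;  // world-space target handed to the IK solver
    float weight;   // 0 = ignore this effector, 1 = fully constrain it
};

// Offset a tracked transform into an effector goal (e.g. headset -> neck).
inline EffectorGoal MakeGoal(Vec3 tracked, Vec3 offset) {
    return { { tracked.x + offset.x, tracked.y + offset.y, tracked.z + offset.z }, 1.0f };
}

// The three tracked devices drive the three upper-body effectors.
struct UpperBodyGoals { EffectorGoal head, leftHand, rightHand; };

inline UpperBodyGoals BuildUpperBodyGoals(Vec3 hmd, Vec3 left, Vec3 right,
                                          Vec3 neckOffset) {
    return { MakeGoal(hmd, neckOffset),
             MakeGoal(left, {0.0f, 0.0f, 0.0f}),
             MakeGoal(right, {0.0f, 0.0f, 0.0f}) };
}
```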
  3. The standard Unreal Engine humanoid model (from Unreal’s third-person level template) was used for both the player’s avatar and the NPCs. To use this model with the iKinema plugin, it had to be modified in 3D modelling software (3ds Max).
    Modifying the default model in 3ds Max (L) and adding the iKinema rig (R)
  4. Unreal’s VR level template was used as the starting point for the initial version of this project. The modified humanoid model was brought into the project at this time and the iKinema animation rig was attached to it. The first development step was adding a body to the player avatar: rather than being a disembodied viewpoint with two disembodied hands (the default in Unreal’s VR level template), the player would see a full, moving body when they looked down. This was also vital for the proposed project, since both players would need visible avatars.
    Unreal 4.14.3 default VR template with disembodied hands (L) and hands attached to avatar’s body (R)
  5. For the player’s body, the head of the model had to be removed and animated separately so that it was visible only from an external viewpoint; otherwise, the head would block the player’s view. 3D modelling software (3ds Max) was used to separate the head from the body. The hands from the default VR template were removed (since they were not attached to any body) and replaced with the hands from the model. 3D animation software (MotionBuilder 2017) was used to animate the model’s hands and fingers so that they opened and closed when the player pulled the trigger buttons on the hand controllers.
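The trigger-driven hand animation can be sketched as a blend between an open pose and a closed pose. The project drives this through MotionBuilder animation assets; the linear mapping and function names below are illustrative assumptions:

```cpp
#include <cassert>

// Map the Vive trigger axis (reported 0..1) to a finger-curl blend weight.
// 0 = open hand, 1 = closed fist. Clamped defensively.
inline float TriggerToCurl(float trigger) {
    return trigger < 0.0f ? 0.0f : (trigger > 1.0f ? 1.0f : trigger);
}

// Blend a single finger joint between its "open" and "closed" keyframe angles.
inline float BlendJointAngle(float openDeg, float closedDeg, float curl) {
    return openDeg + (closedDeg - openDeg) * curl;
}
```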
  6. A mirror was temporarily added to the game for testing purposes.
  7. Room-scale navigation was added so that the avatar’s head and body tracked the player’s position in the real world.
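At its simplest, room-scale navigation offsets the avatar by the headset’s position within the tracked play area. A minimal sketch, with hypothetical types and names:

```cpp
#include <cassert>

// Illustrative 2-D position on the floor plane.
struct Vec2 { float x, y; };

// The avatar's world position is the tracking-space origin plus the
// headset's local offset inside the play area.
inline Vec2 AvatarWorldPos(Vec2 trackingOrigin, Vec2 headsetLocal) {
    return { trackingOrigin.x + headsetLocal.x, trackingOrigin.y + headsetLocal.y };
}
```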
  8. At this stage, the iKinema plugin enabled the transform information of the hand controllers and headset to animate every part of the avatar’s body (including the hips and knees) except the feet. The player could wave their hands, touch their head and even crouch in the physical world, and the avatar would accurately reflect those movements. However, if the player walked in physical space, the avatar would slide around the virtual world, since the system had no way of tracking the player’s feet. To solve this problem, the default walk and strafe animations from Unreal Engine’s third-person level template were modified and used to animate the avatar’s feet, producing the illusion of walking when physically moving around the room.
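One way to pick the walk or strafe cycle is to classify the avatar’s room-scale velocity relative to its facing direction. The threshold and gait names below are assumed values for illustration, not the template’s actual assets:

```cpp
#include <cassert>
#include <cmath>

// Gait categories matching the available locomotion animations (illustrative).
enum class Gait { Idle, WalkForward, WalkBack, StrafeLeft, StrafeRight };

// Rotate the world-space velocity into the avatar's local frame (yaw only),
// then choose the animation whose direction dominates.
inline Gait ClassifyGait(float velX, float velY, float yawRad,
                         float threshold = 0.05f) {
    float fwd   =  std::cos(yawRad) * velX + std::sin(yawRad) * velY;
    float right = -std::sin(yawRad) * velX + std::cos(yawRad) * velY;
    if (std::fabs(fwd) < threshold && std::fabs(right) < threshold)
        return Gait::Idle;
    if (std::fabs(fwd) >= std::fabs(right))
        return fwd > 0.0f ? Gait::WalkForward : Gait::WalkBack;
    return right > 0.0f ? Gait::StrafeRight : Gait::StrafeLeft;
}
```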
  9. Approximately two minutes of transform information from the hand controllers and headset were then stored in a movement buffer.
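A movement buffer like this is naturally a fixed-capacity ring buffer. The sketch below stores position-only samples and assumes the Vive’s 90 Hz refresh, so two minutes is roughly 10,800 frames; both are illustrative choices, not the project’s actual implementation:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// One frame of tracking data (positions only, for brevity).
struct PoseSample {
    float head[3], leftHand[3], rightHand[3];
};

// Fixed-capacity ring buffer: once full, new samples overwrite the oldest.
class MovementBuffer {
public:
    explicit MovementBuffer(std::size_t capacity) : data_(capacity) {}

    void Push(const PoseSample& s) {
        data_[next_] = s;
        next_ = (next_ + 1) % data_.size();
        if (count_ < data_.size()) ++count_;
    }

    // Read relative to the oldest stored frame (index 0 = oldest).
    const PoseSample& At(std::size_t i) const {
        std::size_t oldest = (count_ == data_.size()) ? next_ : 0;
        return data_[(oldest + i) % data_.size()];
    }

    std::size_t Size() const { return count_; }

private:
    std::vector<PoseSample> data_;
    std::size_t next_ = 0, count_ = 0;
};
```

For two minutes at 90 Hz, the buffer would be constructed as `MovementBuffer buffer(120 * 90);`.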
  10. Ten different NPCs were added to the level, along with a basic AI system that allowed them to randomly explore it. A simple AI behavior was also briefly implemented that allowed the NPCs to turn and look at the player when within a certain proximity.
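The proximity look-at behavior reduces to a distance test plus a yaw toward the player. The 300-unit radius below is an assumed value, not one taken from the project:

```cpp
#include <cassert>
#include <cmath>

// True when the player is within the NPC's look-at radius (floor plane only).
inline bool ShouldLookAtPlayer(float npcX, float npcY,
                               float playerX, float playerY,
                               float radius = 300.0f) {
    float dx = playerX - npcX, dy = playerY - npcY;
    return std::sqrt(dx * dx + dy * dy) <= radius;
}

// Yaw (radians) the NPC must face to look at the player.
inline float LookAtYaw(float npcX, float npcY, float playerX, float playerY) {
    return std::atan2(playerY - npcY, playerX - npcX);
}
```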
  11. Dynamic animation of the NPCs’ upper bodies was implemented, based on the transform information stored in the movement buffer. Each NPC would still randomly walk around the level, but its head and hands would now move based on playback from a random spot in the movement buffer.
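Playing the shared buffer back from a different spot per NPC can be sketched as an index offset into the buffer; this helper is hypothetical, and how each NPC’s start offset is randomized is left out:

```cpp
#include <cassert>
#include <cstddef>

// Map an NPC's local frame counter onto the shared movement buffer, starting
// from that NPC's own (randomly chosen) offset and wrapping at the end.
inline std::size_t PlaybackIndex(std::size_t startOffset, std::size_t frame,
                                 std::size_t bufferSize) {
    return (startOffset + frame) % bufferSize;
}
```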
  12. A player navigation system was implemented, allowing the user to move around the level by swinging their arms as if they were walking.
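Arm-swing locomotion typically derives forward speed from how fast the controllers are moving. The gain and speed cap below are assumed tuning values, not the project’s real numbers:

```cpp
#include <cassert>
#include <cmath>

// Forward speed proportional to the average vertical speed of the two hand
// controllers, capped at a maximum walk speed.
inline float ArmSwingSpeed(float leftHandVelZ, float rightHandVelZ,
                           float gain = 2.0f, float maxSpeed = 300.0f) {
    float swing = (std::fabs(leftHandVelZ) + std::fabs(rightHandVelZ)) * 0.5f;
    float speed = swing * gain;
    return speed > maxSpeed ? maxSpeed : speed;
}
```

The player would then be moved along their headset’s facing direction at this speed each frame.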