We have always interacted with the digital world in a 2-dimensional manner. The way we interface with 3D scenes is usually limited to a screen. Recently, that has changed; we can now experience 3D scenes naturally. We can step into a scene, move around, grab objects and even make creations that we experience as if they were real. It might be a shock to some people, but VR is no invention of the 21st century. It dates back to the 1960s, when Ivan Sutherland invented the ‘Sword of Damocles’, which is regarded as the first VR and AR device. Decades of research have brought cutting-edge technology to this concept, and exciting, innovative products have sprung up in recent years. One significant recent improvement made available to consumer VR devices is 6DoF (six degrees of freedom) tracking on standalone headsets.

How VR works.

VR headsets, or Head-Mounted Displays (HMDs), are the devices we use to experience virtual reality. These gadgets use an interplay of hardware and software to provide full immersion in the world we’re interacting with.

  • Optics - The display is the primary component of an HMD. The headset ensures that no matter where the user’s head turns, the display stays positioned right in front of the user’s eyes. The display takes up the entire field of view of the user, or at least ensures that whatever is displayed is always in the user’s field of view. Headsets achieve stereoscopic 3D by feeding a slightly different image to each eye, which is more or less how we see in the real world: each eye sees everything from a slightly different vantage point, and imitating this in VR creates the illusion of depth. A set of Fresnel lenses enables the eyes to focus on a display that sits very close to them. A common problem faced by headset manufacturers is that interpupillary distance (IPD), the distance between the centres of the pupils, varies from person to person and affects your ability to focus on the displays right in front of you. Manufacturers solve this by adding a slider that adjusts the screens to fit your IPD.

  • Tracking - Tracking is what separates VR from other experiences. When you turn your head, that rotation is applied to your field of view in the VR scene. The Inertial Measurement Unit (IMU) is the device in VR headsets that makes this possible. The IMU reports orientation and acceleration using a combination of accelerometers, gyroscopes and a magnetometer.
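As a rough illustration of the stereoscopy described above, the two eye views can be thought of as two cameras offset sideways by half the IPD each. Unity’s built-in VR support sets this up automatically, so you never write this yourself; the sketch below, with class and field names of our own invention, only shows the idea.

```csharp
using UnityEngine;

// Illustrative sketch only: Unity's VR support renders the stereo pair
// for you. This shows the underlying idea of offsetting two cameras by
// half the interpupillary distance (IPD) each.
public class StereoRigSketch : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    public float ipdMetres = 0.064f; // average human IPD, roughly 64 mm

    void Awake()
    {
        // Shift each eye camera sideways from the head centre.
        leftEye.transform.localPosition = new Vector3(-ipdMetres / 2f, 0f, 0f);
        rightEye.transform.localPosition = new Vector3(ipdMetres / 2f, 0f, 0f);
    }
}
```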

Okay, let’s create a VR experience.

Enough talk. Let’s make an entertaining app to experience VR. Virtual reality has many applications outside entertainment, but we won’t dwell on those in this article. Let’s create a VR app that gives users the thrill of flying. For this article, we’ll build the game for an Oculus Quest headset.

Dangerous Sports Club

Dangerous Sports Club banner

High concept.

This game is about base jumping from tall structures in the user’s selected city.

How do we achieve this?

We’ll use the Unity game engine to make our app. Unity makes it easy to create games for a variety of platforms. In this case, we’re targeting the Oculus Quest, which runs the Android OS. We won’t be writing any Java or Kotlin code, though; since we’re using Unity, we’ll write C#.

For the buildings and structures in whatever city the user selects, we’ll use Mapbox to populate the game world. There are other options available, such as the Google Maps SDK for Unity, and OpenStreetMap coupled with plugins to render its data into Unity. The Google Maps SDK is great, but it may be difficult to get your hands on the full version, as the demo is very limited. OpenStreetMap is totally free, but it suffers from a lack of building data in underdeveloped countries. Mapbox presents buildings with good nine-sliced textures out of the box. Ensure you have the following installed on your development machine:

  • Unity (minimum version 2017)
  • Android SDK (minimum API level 24)

Basic setup.

Create a new Unity project, ensuring 3D is selected. Let’s configure the project for our target platform; the Oculus Quest runs the Android OS. In Unity, go to File > Build Settings and switch the platform to Android. Change Texture Compression to ASTC. Remember to uncheck Development Build for your final build, as it may impact performance.

Build Settings
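If you prefer scripting your project configuration, the same Build Settings can be applied through Unity’s editor API. This is a sketch; the class name and menu path are our own, and the file must live in an Editor folder to compile.

```csharp
using UnityEditor;

// Sketch: apply the Build Settings steps above from an editor script.
// Place this file in an Editor folder (e.g. Assets/Editor).
public static class QuestBuildConfig
{
    [MenuItem("Tools/Configure Build For Quest")]
    public static void Configure()
    {
        // Switch the active platform to Android.
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);

        // Use ASTC texture compression, as recommended for Quest.
        EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;

        // Make sure Development Build is off for release builds.
        EditorUserBuildSettings.development = false;
    }
}
```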

Click Player Settings > XR Settings. Check Virtual Reality Supported and add Oculus to the Virtual Reality SDKs list. Under the Other Settings section, set Minimum API Level to 24. Oculus requires a minimum of API level 19, but the Mapbox library, which we’ll be adding in the next section, requires API level 24.

Player Settings
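These Player Settings can also be set from an editor script. The sketch below uses Unity’s editor API as it existed around Unity 2017 (the `virtualRealitySupported` property was later deprecated); the class name and menu path are ours, and adding Oculus to the Virtual Reality SDKs list is still done in the XR Settings UI.

```csharp
using UnityEditor;

// Sketch: apply the Player Settings steps above from an editor script.
// Place this file in an Editor folder (e.g. Assets/Editor).
public static class QuestPlayerConfig
{
    [MenuItem("Tools/Configure Player For Quest")]
    public static void Configure()
    {
        // Enable Unity's built-in VR support. Add "Oculus" to the
        // Virtual Reality SDKs list in XR Settings as described above.
        PlayerSettings.virtualRealitySupported = true;

        // Mapbox requires at least Android API level 24.
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel24;
    }
}
```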

We still have more configuration to do to set up our project for the Oculus Quest, but for now, let’s design our scene so we at least have something we can see in the Unity Editor and/or Game window. Ignore the missing Oculus package warning for now; we’ll fix that soon.


  • Since we decided to use Mapbox, go ahead and download the Mapbox SDK for Unity from the Mapbox website.
  • Follow the instructions to download the SDK for Unity. It requires you to create an account to get your API access token.
  • When you’re done downloading, head back to Unity and click Assets > Import Package > Custom Package, then select the Mapbox package to install.
  • Once the import is complete, a Mapbox setup dialog opens. If you don’t see the dialog, go to Mapbox > Setup to open it. Enter your API token in the field provided.
  • Click on Data Explorer sample scene to load it up in the Editor. Close the setup dialogue.
  • Save the scene as “SportsCity” under the “Assets/Scenes” folder.

Mapbox’s Data Explorer scene is a good place to start as it is closer to the look we want, so we have fewer customizations to fiddle around with. Press play at the top of the Unity window to get a feel of how the scene looks. It loads an area in (let’s say) New York. We can change the coordinates but let’s stick with the default as this area has enough tall structures for us to use. Click play again to end.
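If you do want to load a different area, the map’s centre can be changed from code as well as in the Inspector. The sketch below assumes the scene’s map object uses the SDK’s `AbstractMap` component and its `SetCenterLatitudeLongitude` method; verify both against your Mapbox SDK version, and note the coordinates are illustrative.

```csharp
using UnityEngine;
using Mapbox.Unity.Map;
using Mapbox.Utils;

// Sketch: re-centre the Mapbox map from a script. Check the exact API
// against the version of the Mapbox SDK for Unity you imported.
public class MapRecenter : MonoBehaviour
{
    public AbstractMap map; // assign the scene's map object in the Inspector

    void Start()
    {
        // Illustrative latitude/longitude for Lower Manhattan.
        map.SetCenterLatitudeLongitude(new Vector2d(40.7077, -74.0110));
    }
}
```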

Manhattan, New York. Loaded with Mapbox Unity SDK

I have removed the Points of Interest labels from the scene. Check the Mapbox documentation to learn how to do this and other customization settings, as the focus of this article is not Mapbox.

Oculus VR.

Now it’s time to integrate the Oculus utilities into our project so we can navigate and interact with the scene using the Oculus headset and controllers. The Oculus package, which includes scripts, prefabs, and other resources, supplements Unity’s built-in VR support. It includes an interface for controlling VR camera behaviour, a first-person control prefab, a unified input API for controllers, advanced rendering features, object-grabbing and haptics scripts for Touch, debugging tools, and more.

  • In Unity, switch to the Asset Store pane, search “Oculus Integration” and download the package.
  • Click import when the download is complete. Ensure that all components of the package are selected for import. Update any Oculus plugin you’re prompted to install. You may need to restart Unity to complete the process.
  • Re-open the project if you restarted Unity.
  • Click Oculus > Tools > Remove AndroidManifest.xml, then Oculus > Tools > Create store-compatible AndroidManifest.xml. This generates (or refreshes) the AndroidManifest.xml under Assets/Plugins/Android. Alternatively, you can add the line below to the manifest manually to make the project Quest-compatible.
<uses-feature android:name="android.hardware.vr.headtracking" android:version="1" android:required="true" />

Now we have both the Mapbox and Oculus packages in our project, so we can access the scene loaded by Mapbox with our Oculus hardware. It’s not automatic, though; we still have some work to do. What translates the user’s motion in real life into motion within the game? The camera in the game scene is what lets the user view the game world. In a regular FPS game, the camera’s view is rendered to a flat screen, and the user controls the camera (their view) with buttons on the screen, a controller or a keyboard. In VR, the user changes their view by moving their head. Scripts attached to the Unity camera GameObject take readings in real time from the headset’s Inertial Measurement Unit (IMU) and copy them onto the camera in the game scene; that is, the camera object is controlled by head tracking. We don’t need to write these scripts from scratch, as this functionality is already available in Unity’s built-in VR support, and also in the Oculus Integration package, which gives more access to the hardware.
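To make the idea concrete, such a head-tracking script boils down to copying the tracked head pose onto a transform every frame. Unity’s built-in VR support already does this for the camera when Virtual Reality Supported is enabled, so you normally never write this yourself; the sketch below uses Unity’s `InputTracking` API (namespace `UnityEngine.XR` from Unity 2017.2 on) and a class name of our own.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch only: Unity's built-in VR support applies head tracking to the
// camera automatically. This shows what that amounts to each frame.
public class HeadTrackingSketch : MonoBehaviour
{
    void Update()
    {
        // The runtime computes this pose from the IMU (and, on 6DoF
        // devices like the Quest, from positional tracking as well).
        transform.localRotation = InputTracking.GetLocalRotation(XRNode.Head);
        transform.localPosition = InputTracking.GetLocalPosition(XRNode.Head);
    }
}
```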

An excellent way to start is to drag an OVRPlayerController prefab into the scene.

  • In the Project pane, open Assets > Oculus > VR > Prefabs and drag an OVRPlayerController prefab into the Hierarchy pane. It adds an OVRPlayerController GameObject into the scene.
  • Activate Quest support by selecting OVRPlayerController > OVRCameraRig in the Hierarchy pane. You will see a “Target Devices” section at the top of the Inspector for the OVRManager component. Switch the only element from “Gear Vr Or Go” to “Quest”.

At this point, we no longer need the camera that existed in our scene before we added the OVRPlayerController. OVRPlayerController has a child GameObject, OVRCameraRig, whose pose is controlled by head tracking. Therefore, disable the MainCamera GameObject.

When the scene loads, the user should be able to get to the top of the building they want to dive from. Remember, the user can walk around the scene (in 6DoF mode), but the user’s natural play area is limited. It would be unreasonable to require the user to walk equal distances in the real world to cover long distances in the game world. So, in addition to walking, for a full title it would be a good idea to add waypoints the user can point at with the hand controller to teleport or dash to. We won’t cover that in this article.

To move the player to the roof of a building, I added a gem at the floor-level entrance; the player collides with it to move up. OVRPlayerController has a CharacterController component, which can trigger collisions with other colliders.

Collider at a building entrance

using UnityEngine;

public class BuildingEntranceGem : MonoBehaviour
{
    public Transform buildingRoofDestination;

    // Update is called once per frame
    void Update()
    {
        transform.Rotate(0, 10 * Time.deltaTime, 0);
    }

    // OnTriggerEnter is called when another collider enters this trigger
    private void OnTriggerEnter(Collider other)
    {
        if (!other.gameObject.CompareTag("Player"))
            return;

        // Disable the CharacterController while teleporting; otherwise it
        // overrides the position change on its next move.
        CharacterController characterController = other.GetComponent<CharacterController>();
        characterController.enabled = false;
        other.transform.position = buildingRoofDestination.position;
        characterController.enabled = true;
    }
}
A smooth screen fade is needed to mask this position change. Once at the top, the user can move off the roof to start a dive.
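One way to implement the fade is a full-screen black UI Image whose alpha is animated in a coroutine around the teleport. The Oculus Integration package also ships its own OVRScreenFade utility, which you may prefer; the sketch below is our own, and the class and field names are illustrative.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Sketch: fade to black, teleport, fade back in. Assign a full-screen
// black Image (on a canvas in front of the camera) in the Inspector and
// start FadeTeleport from the gem's trigger instead of setting the
// position directly.
public class ScreenFadeTeleport : MonoBehaviour
{
    public Image fadeImage;
    public float fadeSeconds = 0.25f;

    public IEnumerator FadeTeleport(Transform player, Vector3 destination)
    {
        yield return Fade(0f, 1f);     // fade to black
        player.position = destination; // move while the screen is dark
        yield return Fade(1f, 0f);     // fade back in
    }

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeSeconds; t += Time.deltaTime)
        {
            float a = Mathf.Lerp(from, to, t / fadeSeconds);
            fadeImage.color = new Color(0f, 0f, 0f, a);
            yield return null;
        }
        fadeImage.color = new Color(0f, 0f, 0f, to);
    }
}
```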

Build to a Quest.

Connect your Quest to your computer via a USB cable. In Build Settings, ensure your device is selected under Run Device. Click Build And Run.
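If Build And Run doesn’t see the headset, you can check the connection and sideload a previously built APK with adb from the Android SDK. The APK path below is an example; you choose it when building.

```shell
# List connected devices; the Quest appears once you allow
# USB debugging from inside the headset.
adb devices

# Install (or reinstall with -r) a built APK onto the Quest.
adb install -r Builds/SportsCity.apk
```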


Optimization is a critical part of VR development. Unlike on some other platforms, it’s best to optimize early and often in VR, rather than leaving it to a later stage of development. Testing regularly on the target device is also essential. The Oculus developer documentation is an excellent resource for working out what settings to use for your Oculus Quest app.