Darshan Shankar

Startup founder, software engineer, Y Combinator alum. Working on a new Virtual Reality startup.

Preparing your DK2 VR application for Oculus CV1

The consumer release of the Oculus Rift is coming soon – in months, not years! Here’s how you can prepare your VR application for the CV1 by testing with a DK2.

Note: let’s assume things out of our control – like drivers and the Oculus runtime – don’t improve significantly (they certainly will).

 DK2 vs CV1 rendering

The Oculus DK2 has a 1920x1080 display that refreshes at 75fps. However, your VR application is rendering to a much higher resolution eye texture (or render target): 2364x1461. That’s a 1.66x render scale over the native display resolution. We do this to ensure a 1:1 texel to pixel resolution on the display after the distortion warp.

Unlike the DK2, the Oculus CV1 has a 2160x1200 display fixed at 90fps, by default rendering 400 million pixels per second compared to the DK2’s 259 million pixels per second. Using the same 1.66x render scale, I estimate the CV1 has a default...
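As a back-of-envelope check, the throughput numbers above can be reproduced in a few lines. This sketch assumes the 1.66x render scale applies to total pixel count (which is what makes the post's figures line up):

```python
# Rough pixel-throughput arithmetic for DK2 vs CV1, assuming the 1.66x
# render scale multiplies total pixel count (an assumption of this sketch).

def pixels_per_second(width, height, render_scale, refresh_hz):
    # Pixels the app renders each second into the scaled eye texture.
    return width * height * render_scale * refresh_hz

dk2 = pixels_per_second(1920, 1080, 1.66, 75)  # ~258 million, matching ~259M
cv1 = pixels_per_second(2160, 1200, 1.66, 90)  # ~387 million, roughly 400M
print(f"DK2: {dk2 / 1e6:.0f}M px/s, CV1: {cv1 / 1e6:.0f}M px/s")
```

The CV1 figure lands near 400 million pixels per second, about 1.5x the DK2's workload before accounting for the jump from 75fps to 90fps frame deadlines.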

Continue reading →


A Brief Overview of Lighting in Unity 5

Entire books could be written about lighting in Unity 5, but here is a brief mishmash of notes from the past few months of work in Unity 5. I’ve split it into 6 sections:

  • Forward vs. Deferred Rendering
  • Realtime vs. Baked Lighting
  • Engine Lighting Tools
  • Lighting Scenarios
  • Performance Tips
  • Post Processing

Most of this is applicable to any content made with Unity, not just virtual reality applications. This is a fairly high-level overview but it does assume some prior knowledge of basic graphics & lighting concepts. Let’s begin!

 Forward vs. Deferred Rendering Paths

When in doubt, go with a forward rendering path. It’s ideal for VR applications for several reasons: forward rendering has a low upfront performance cost, it’s easier on the CPU with fewer draw calls, it runs well on mobile VR devices, and it supports anti-aliasing as well as translucent materials. Here’s a quick overview to...

Continue reading →


VR camera rotation without nausea: a counterintuitive discovery

By now, this is obvious advice to any VR developer: you can positionally move a VR camera using a traditional gamepad thumbstick, but any form of yaw rotation (mouselook or thumbstick rotation) causes extreme player discomfort.

That being said, I found an edge case where rotation works smoothly, without any player discomfort.

 The environment

The player is floating in space, with a large, rotating planet in front of them. There are thousands of objects on the surface of the planet (trees, buildings). As you would expect, when the planet rotates, the objects on the planet surface rotate along with it.

This poses an engineering problem. Each object on the planet surface incurs a draw call – this is very expensive for...

Continue reading →


12 performance tricks for optimizing VR apps in Unity 5

VR applications are far more computationally intensive than their non-VR counterparts, making performance optimization a crucial task. If you’re targeting mobile devices like the GearVR, this becomes even more important.

These are the performance metrics you should try to hit:

  • Roughly 50 draw calls per eye (Unity 5 more accurately refers to these as SetPass calls).
  • Under 50K-100K vertices & 50K-100K polygons in your scene.

Here’s a bag of simple tricks you can use to hit those numbers:

 Static batching

You probably have a ton of static geometry in your scene, such as walls, chairs, lights, and other meshes that never move. Mark them as static in the editor. Be sure to also mark them as lightmap static in order to get baked lightmap textures. Instead of incurring a draw call for each individual object, objects marked as static can be batched into one combined mesh.
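As a toy model of why this matters (a sketch of the idea, not Unity's actual batching logic): unbatched, every object costs its own draw call, while static objects that share a material can collapse into a single call.

```python
# Toy model of static batching. Each object normally costs one draw call;
# static objects sharing a material merge into one combined mesh / one call.
# Material names here are made up for illustration.
scene = ["wall_mat", "wall_mat", "chair_mat", "wall_mat", "lamp_mat"]

unbatched_calls = len(scene)       # one call per object
batched_calls = len(set(scene))    # one call per unique material
print(unbatched_calls, batched_calls)
```

Five objects drop to three calls here; in a real scene with hundreds of props sharing a handful of materials, the savings are far larger.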

Static batching has one crucial...

Continue reading →


Developing cross-platform VR apps for Oculus Rift and GearVR

I’ve spent the past 2 months developing a virtual reality application for PC (Oculus Rift) and mobile (Samsung GearVR). I used Unity 5 and the Oculus PC (0.5.0.1-beta) and Mobile (0.5.1) SDKs. Here are some lessons learned that will save you time.

 Oculus PC and Mobile SDK can be used together

In Unity, you can use the two SDKs simultaneously. The Mobile SDK is under OVR>Moonlight (fun fact: Project Moonlight was the internal codename for the GearVR project). They share many resources, including the OVRCamera prefabs.

For example, if you use an OVRCameraRig in your scene, no additional work is needed for it to function properly on PC or on mobile. It Just Works™

 Set up separate Unity project folders

Unity stores platform-specific data, like compressed textures, in /Library. Unity’s texture compression runs as a slow, single-threaded process. Large textures like lightmaps can...
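The excerpt cuts off here, but one common way to keep two platform-specific project folders (each with its own /Library cache) pointing at the same content is to symlink a shared Assets folder into each project. A sketch with made-up paths – not necessarily the exact setup the post goes on to describe:

```shell
# Hypothetical layout: PC and GearVR projects share one Assets folder via
# symlinks, so each project keeps its own platform-specific /Library cache.
mkdir -p Shared/Assets MyAppPC MyAppGearVR
ln -s "$(pwd)/Shared/Assets" MyAppPC/Assets
ln -s "$(pwd)/Shared/Assets" MyAppGearVR/Assets
ls -l MyAppPC/Assets MyAppGearVR/Assets
```

This way, switching build targets never forces a full re-import, because each project's Library only ever holds one platform's compressed textures.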

Continue reading →