VR Dashboard Development

Introduction

Software development is a team sport. Various methodological tools and practices, such as daily stand-up meetings in agile, are applied to ensure effective teamwork.

Idea

The idea is to create a VR platform with a whiteboard that participants can draw on. They can meet online, feel as if they are working in the same room, and communicate via a whiteboard that lets them draw using hand movements and gestures.

Technical Aspects

There is a set of features required to make this idea real. Here is the minimal list:

  • VR headset integration for:
    • rendering an image specifically for stereoscopic vision
    • motion tracking
    • navigation in the virtual environment
  • Real-time texture generation for regularly updating the drawing on the whiteboard
  • A web server for persisting the state of the virtual environment, which includes:
    • concurrent access to the state storage
    • fast networking for real-time state synchronization with clients
  • A skybox
  • A set of static objects to diversify the environment
Pic 1. Main scene

VR Integration

We chose the Oculus Quest 1 as the VR headset. There are ready-made libraries for it in Unity3D: the Oculus Integration Package suits our needs, and in particular we took the OVRCameraRig prefab from it. This prefab is a camera optimized for Oculus stereoscopic rendering, and its child elements let us track and render hand movements and gestures. However, we do not add it to the project on its own: instead, we use OVRPlayerController, which contains the camera rig and also handles navigation and movement in the virtual environment.

Pic 2. Prefabs for Oculus VR
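
A natural next step is to take the right-hand anchor exposed by OVRCameraRig and use it as the source of the drawing ray in the next section. Here is a minimal sketch assuming the standard Oculus Integration API; the class and field names are our own, not from the original project:

using UnityEngine;

public class MarkerRaySource : MonoBehaviour
{
    // Assigned in the Inspector from the OVRPlayerController hierarchy.
    [SerializeField] private OVRCameraRig _cameraRig;

    // The position and direction used as the raycast source below.
    public Vector3 RaySourcePosition => _cameraRig.rightHandAnchor.position;
    public Vector3 RaySourceForward => _cameraRig.rightHandAnchor.forward;
}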

Real-time Texture Generation

Though we now have the environment and can navigate it in VR, nothing happens yet when the virtual hands and the whiteboard interact. We need two mechanisms to make this feature work:

  • Raycasting to detect where a ray cast from the hand intersects the whiteboard
  • A custom texture shader to regularly update the drawing on the whiteboard

Raycasting

Raycasting is a technique for determining the intersection point of a cast ray and an object. In our case, we need the intersection point of a ray cast from the 3D coordinate of the avatar's right hand with the surface of the whiteboard. If the intersection happens (and the distance traveled by the ray is short enough), we get the UV coordinate of the intersection point on the whiteboard and then update the whiteboard texture at that point. For the raycasting itself, we use the Physics class from the UnityEngine library:

// Defined as a class field to avoid allocations on every frame.
private readonly RaycastHit[] _raycastResults = new RaycastHit[NHits];

// Called when the user presses the trigger on a controller.
int resultsCount = Physics.RaycastNonAlloc(
    raycastSourcePosition,
    raycastSourceForwardVector,
    _raycastResults,
    raycastLength,
    raycastLayerMask);
for (int i = 0; i < resultsCount; i++)
{
    RaycastHit hit = _raycastResults[i];
    // Here we check whether the hit object is the whiteboard.
}
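
When the hit object is the whiteboard, the UV coordinate of the intersection is available via RaycastHit.textureCoord (this requires the board to have a MeshCollider). Below is a minimal sketch of updating the drawing texture at that point; _drawingTexture and _markerColor are illustrative names, not taken from the original project:

// Convert the UV coordinate of the hit into pixel coordinates.
Vector2 uv = hit.textureCoord;
int x = (int)(uv.x * _drawingTexture.width);
int y = (int)(uv.y * _drawingTexture.height);

// Paint the pixel and upload the change to the GPU.
_drawingTexture.SetPixel(x, y, _markerColor);
_drawingTexture.Apply();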

Custom Texture Shader

The drawing itself is made possible by the custom texture shader. We start with two textures: the texture of the board surface itself and a texture that records all the movements made by the marker. The shader takes these two textures and combines them pixel by pixel, producing the final texture.

Pic 3. Shaders configuration
Pic 4. Shader itself in the editing window
  • If the pixel in the marker texture is white, the marker color is chosen; otherwise, the pixel of the board surface texture is kept
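
The shader itself runs on the GPU, but its per-pixel rule is simple enough to sketch in C# for clarity; this is an illustration of the logic, not the actual shader code:

// CPU-side illustration of the shader's per-pixel combination rule.
Color Combine(Color boardPixel, Color drawingPixel, Color markerColor)
{
    // White in the drawing texture marks a point touched by the marker.
    bool isDrawn = drawingPixel == Color.white;
    return isDrawn ? markerColor : boardPixel;
}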

Web Server

The Unity app is a client application for a single user. To enable interaction between remote users, we need a web server: it stores the overall scene state for all users and serves as a synchronization point. Three components provide this feature:

  • An ASP API as the main access point for clients
  • UDP streams for the fastest state transmission
  • Orleans to store the state with concurrent access

ASP API

In this case, the ASP application serves as the main, readily available access point for clients. It is not used for storing the state or for real-time client communication; those tasks are handled by Orleans and the UDP streams, respectively.
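
As a sketch, such an entry point can be a plain ASP.NET Core controller; the route and the response shape below are illustrative assumptions rather than the project's actual API:

using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/session")]
public class SessionController : ControllerBase
{
    // Returns the UDP endpoint a client should use for state synchronization.
    // The host and port values here are hypothetical examples.
    [HttpGet("connect")]
    public IActionResult Connect()
    {
        return Ok(new { Host = "vr.example.com", Port = 9000 });
    }
}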

UDP Streams

There are a huge number of networking libraries for .NET based on different protocols, but in our case the choice comes down to TCP versus UDP. We opted for UDP to get the fastest possible communication: UDP is the standard choice for this kind of real-time application, although TCP might be sufficient up to a certain scale.
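
Here is a minimal sketch of the client-side sending path using .NET's UdpClient; the server address and the payload format are made-up examples:

using System.Net.Sockets;
using System.Text;

// Created once per session; the endpoint is an assumption for illustration.
var udpClient = new UdpClient();
udpClient.Connect("vr.example.com", 9000);

// Send a drawing update; a real payload would use a compact binary format.
byte[] payload = Encoding.UTF8.GetBytes("draw 0.42 0.73");
await udpClient.SendAsync(payload, payload.Length);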

Orleans

Orleans is used as the framework for building the state-storing server with concurrent access. The framework is built on the actor model, and its actors (called grains) provide synchronization of concurrent calls out of the box.
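
A minimal sketch of what a whiteboard grain might look like; the interface and method names are our own illustration, not the project's actual code:

using System.Collections.Generic;
using System.Threading.Tasks;
using Orleans;

// One grain instance per whiteboard, identified by a string key.
public interface IWhiteboardGrain : IGrainWithStringKey
{
    Task AddStroke(float u, float v);
    Task<IReadOnlyList<(float U, float V)>> GetStrokes();
}

public class WhiteboardGrain : Grain, IWhiteboardGrain
{
    // Orleans serializes calls to a single grain, so no explicit locking is needed.
    private readonly List<(float U, float V)> _strokes = new();

    public Task AddStroke(float u, float v)
    {
        _strokes.Add((u, v));
        return Task.CompletedTask;
    }

    public Task<IReadOnlyList<(float U, float V)>> GetStrokes() =>
        Task.FromResult<IReadOnlyList<(float U, float V)>>(_strokes);
}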

Conclusion

This is how we built the simplest MVP application for VR meetings around a whiteboard. The example shows that even a minimal solution draws on a wide range of technologies from different areas, which is why it requires the participation of several people from the very beginning.

About us

We at Akvelon Inc love cutting-edge technologies in mobile development, blockchain, big data, machine learning, artificial intelligence, computer vision, and many others. This article is based on our experience from several of Akvelon’s AR/VR projects which are described on our official website.
