VR Dashboard Development

Dmitry Petukhov
Jun 30, 2021 · 6 min read


Introduction

Software development is a team sport. Teams rely on various methodological tools and practices, such as daily stand-up meetings in agile, to keep that teamwork going.

Sometimes a richer format is needed. For instance, teams occasionally gather for live meetings at a whiteboard to brainstorm ideas or discuss software architecture, giving participants an opportunity to express their thoughts through visual communication.

There are some tools that help hold daily stand-up meetings online, and even fewer that support such remote meetings in a visual communication environment. So we decided to create a fully immersive VR app that simulates live whiteboard meetings in a remote format.

Idea

The idea is to create a VR platform with a whiteboard that participants can draw on. They can meet online, feel as if they were working in the same room, and communicate through a whiteboard that lets them draw with hand movements and gestures.

Technical Aspects

A set of core features is required to make this idea real. Here is the minimal list:

  • 3D Engine for building and rendering the virtual environment.
  • VR headset integration for:
    • rendering an image specifically for stereoscopic vision
    • motion tracking
    • tracking navigation in the virtual environment
  • Real-time texture generation to keep the drawing on the whiteboard up to date.
  • A web server for persisting the state of the virtual environment, including:
    • concurrent access to the state storage
    • quick networking for real-time state synchronization with clients

As the 3D engine we chose Unity3D: it is easy to get started with yet powerful, has a well-developed community, and offers a free license for non-commercial projects. To build the virtual environment itself, we used a standard scene from the free asset:
https://assetstore.unity.com/packages/3d/environments/landscapes/rpg-poly-pack-lite-148410.

This asset includes:

  • Terrain
  • Skybox
  • A set of static objects for diversifying the environment

We also added the board itself and some tools with the help of free models:
https://assetstore.unity.com/packages/3d/props/clipboard-137662
https://assetstore.unity.com/packages/3d/props/office-supplies-low-poly-105519
https://poly.google.com/view/cYQOKE7Wd7D

We got the following basic scene:

Pic 1. Main scene

VR Integration

We chose the Oculus Quest 1 as the VR headset. Ready-made libraries for it exist in Unity3D: the Oculus Integration Package suits our needs, and in particular we took the OVRCameraRig prefab from it. This prefab is a camera optimized for Oculus stereoscopic rendering, and its child elements let us track and render hand movements and gestures. However, instead of adding it to the project on its own, we go with OVRPlayerController, which also handles navigation and movement within the virtual environment.

Pic 2. Prefabs for Oculus VR
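
As a quick illustration of how the integration is used from scripts, here is a minimal sketch (assuming the Oculus Integration Package is imported) that polls the right-hand controller for trigger presses and reads its pose, which later serves as the origin of the drawing ray. The class and field names are ours for this example, not code from the project:

using UnityEngine;

// Minimal sketch: poll the right controller each frame.
// OVRInput is provided by the Oculus Integration Package.
public class MarkerController : MonoBehaviour
{
    void Update()
    {
        // True while the index trigger on the right controller is held down.
        bool isDrawing = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger,
                                      OVRInput.Controller.RTouch);

        if (isDrawing)
        {
            // Controller pose in tracking space; the drawing ray starts at the
            // hand position and points forward (see the raycasting section below).
            Vector3 handPosition = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
            Quaternion handRotation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);
        }
    }
}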

Real-time Texture Generation

Although we now have the environment and can navigate it in VR, nothing happens yet when the virtual hands interact with the whiteboard. We need two mechanisms to make this feature real:

  • Raycasting to define the point of a hand and a whiteboard contact
  • Custom texture shader to update the drawing on the whiteboard regularly

Raycasting

Raycasting is a technique for determining the intersection point of a cast ray and an object. In our case, we need the intersection of a ray cast from the right hand of the 3D avatar with the surface of the whiteboard. If such an intersection occurs (and the distance traveled by the ray is short enough), we get the UV coordinate of the intersection point on the whiteboard and then update the whiteboard texture at that point. For the raycasting itself we use the Physics class from the UnityEngine library:

// Defined as a class field to avoid allocations on every frame.
RaycastHit[] _raycastResults = new RaycastHit[NHits];

// Called when the user presses the trigger on a controller.
int resultsCount = Physics.RaycastNonAlloc(
    raycastSourcePosition,
    raycastSourceForwardVector,
    _raycastResults,
    raycastLength,
    raycastLayerMask);
for (int i = 0; i < resultsCount; i++)
{
    RaycastHit hit = _raycastResults[i];
    // Here we check whether the raycast hit the board.
}
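
Once a hit on the board is detected, the UV coordinate is available on the hit result. Here is a minimal sketch of how the touched point could be written into the marker texture; the field name _markerInputTexture is our assumption for this example, and RaycastHit.textureCoord requires the board to use a MeshCollider:

// Inside the loop above, once we know the hit object is the board:
// hit.textureCoord gives the UV coordinate of the intersection point.
Vector2 uv = hit.textureCoord;
int x = (int)(uv.x * _markerInputTexture.width);
int y = (int)(uv.y * _markerInputTexture.height);

// Mark the touched texel; the shader later blends it into the board texture.
_markerInputTexture.SetPixel(x, y, Color.white);
_markerInputTexture.Apply();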

Custom Texture Shader

The drawing itself works thanks to a custom texture shader. We start with two textures: a texture of the board surface itself and a texture that records all the strokes made by the marker. The shader takes these two inputs and combines them pixel by pixel to produce the final texture.

Here are the shader configurations:

Pic 3. Shaders configuration

The shader itself is edited in Unity with the help of a visual editor based on the diagrams:

Pic 4. Shader itself in the editing window

Here you can see how the shader chooses the color for the final whiteboard texture based on the MarkerInput texture map:

  • If the pixel is black, the value is taken from the board texture
  • If the pixel is white, the marker color is chosen
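
For clarity, the same per-pixel rule can be written down in plain C# (a CPU-side illustration of what the shader does for each texel, not code taken from the project):

// Per-pixel blend rule, illustrated on the CPU.
// boardColor  - color sampled from the board texture
// markerMask  - color sampled from the MarkerInput texture (black or white)
// markerColor - the currently selected marker color
Color Blend(Color boardColor, Color markerMask, Color markerColor)
{
    // White in the mask means the marker has touched this texel.
    return markerMask.r > 0.5f ? markerColor : boardColor;
}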

Web Server

The Unity app is a client application for a single user. To enable interaction between remote users, we need a web server. Its job is to store the overall scene state for all users and to act as a synchronization point. Three components provide this:

  • An ASP.NET app with an API for establishing the initial connection
  • UDP streams for the fastest possible state transmission
  • Orleans for storing the state with concurrent access

ASP API

Here, the ASP.NET app serves only as the main, readily available entry point for clients. It is not used as the server that stores the state, nor as the server that maintains real-time communication with clients.
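
As a sketch, this entry point could be a single controller that hands the client a session identifier and the coordinates of the UDP endpoint. The controller name, the ConnectionInfo type, and the endpoint values below are illustrative assumptions, not the project's actual API:

using Microsoft.AspNetCore.Mvc;

// Illustrative entry point: the client calls this once over HTTP,
// then switches to the UDP stream for real-time updates.
[ApiController]
[Route("api/connect")]
public class ConnectController : ControllerBase
{
    [HttpGet]
    public ActionResult<ConnectionInfo> Connect()
    {
        return new ConnectionInfo
        {
            SessionId = System.Guid.NewGuid(),
            UdpHost = "vr.example.com", // placeholder host
            UdpPort = 9000              // placeholder port
        };
    }
}

public class ConnectionInfo
{
    public System.Guid SessionId { get; set; }
    public string UdpHost { get; set; }
    public int UdpPort { get; set; }
}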

UDP Streams

There are a huge number of networking libraries built on different protocols in .NET, but for our case the choice comes down to TCP versus UDP. We went with UDP to get the fastest possible communication: UDP is generally the standard choice for this kind of application, although TCP might be sufficient up to a certain scale.
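
A minimal sketch of the client side of such a stream, using the standard UdpClient from System.Net.Sockets; the payload format and the server address are assumptions made only for illustration:

using System.Net;
using System.Net.Sockets;
using System.Text;

// Illustrative UDP sender: pushes a serialized stroke update to the server.
var client = new UdpClient();
var serverEndpoint = new IPEndPoint(IPAddress.Parse("203.0.113.10"), 9000); // placeholder address

// In a real app the payload would be a compact binary message;
// a UTF-8 string is used here only to keep the sketch short.
byte[] payload = Encoding.UTF8.GetBytes("stroke:0.42,0.73,#FF0000");
client.Send(payload, payload.Length, serverEndpoint);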

Orleans

Orleans is used as the framework for building the state-storing server with concurrent access. The framework is built on the actor model, and its actors (called grains) synchronize concurrent calls out of the box.
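
For illustration, a whiteboard session could be modeled as a single grain that serializes all state mutations; the grain name and methods below are assumptions for this sketch, not the project's actual types:

using System.Collections.Generic;
using System.Threading.Tasks;
using Orleans;

// Each whiteboard session is one grain; Orleans guarantees that calls
// to a single grain instance are processed one at a time.
public interface IWhiteboardGrain : IGrainWithGuidKey
{
    Task AddStroke(string stroke);
    Task<IReadOnlyList<string>> GetStrokes();
}

public class WhiteboardGrain : Grain, IWhiteboardGrain
{
    private readonly List<string> _strokes = new List<string>();

    public Task AddStroke(string stroke)
    {
        _strokes.Add(stroke);
        return Task.CompletedTask;
    }

    public Task<IReadOnlyList<string>> GetStrokes()
    {
        return Task.FromResult<IReadOnlyList<string>>(_strokes);
    }
}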

Conclusion

This is how we built the simplest MVP application for VR meetings at a whiteboard. The example shows that even a minimal solution draws on a wide range of technologies from different areas, which in turn requires several people to be involved from the very beginning.

At the same time, ready-made libraries covered most aspects of the project, and many of them are VR-specific. In other words, the key technologies are already available as libraries and are being exercised daily on similar projects. Overall, building VR applications is becoming more and more feasible.

About us

We at Akvelon Inc love cutting-edge technologies in mobile development, blockchain, big data, machine learning, artificial intelligence, computer vision, and many others. This article is based on our experience from several of Akvelon’s AR/VR projects which are described on our official website.


If you would like to work with our strong Akvelon team — please see our open positions.

Written in collaboration with Stanislav Akeldov.
