Integration of Intel RealSense depth cameras with the Oculus Quest standalone VR headset using Unity.
This project consolidates the knowledge gained while attempting to power & operate a RealSense D435i camera directly from an Oculus Quest VR headset.
At the time of this attempt (5th-12th of June 2019), no prior work on this specific problem was publicly available, and the solution would not have been possible without the valuable contributions of members of the Intel development team.
The discussion that led to the solution is available here, for future reference.
The aim of this repository is to describe the steps necessary to achieve the Quest-RealSense integration inside Unity 2019.x, as well as to provide a sample Unity project showcasing the method.
George Adamopoulos
23rd of June 2019
The project was tested in the following environments.
Development environment:
Label | Info |
---|---|
Operating System & Version | Windows 10 1803 |
Language | C# |
Unity Version | 2019.1.4f1 |
Graphics API | OpenGLES3 |
Scripting API Version | .NET 4.x |
RealSense Depth Camera Information:
Label | Info |
---|---|
Camera Model | D435i |
Firmware Version | 5.11.6.200 + |
SDK Version | 2.22.0 + |
Oculus Quest Information:
Label | Info |
---|---|
Headset Model | Quest (May 2019) |
Headset Version | 256550.6170.5 + |
Unity Package Version | 1.36 + |
This guide assumes that the necessary steps for setting up the Unity development environment for the Oculus Quest have been completed, as described in the official Oculus Quest documentation: [1] [2]
Before proceeding to the next steps, the project should be able to produce a working Android .apk build, which runs on the Quest without issues.
This guide assumes that the latest Intel RealSense Unity wrapper package has been imported successfully into Unity 2019.x without errors.
Before proceeding to the next steps, the project should be able to run one of Intel's provided example Scenes in Unity's Play Mode, provided a RealSense device is connected to an appropriate USB port of a Windows machine. More information in the official realsense repository.
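As a quick sanity check in Play Mode, a small script like the following sketch can log whether a RealSense device is detected. It assumes the wrapper's `Context.QueryDevices()` API; the class name `DeviceCheck` is just an example:

```csharp
using Intel.RealSense;
using UnityEngine;

// Example sanity-check script: logs the number of connected RealSense
// devices on startup, using the wrapper's Context API.
public class DeviceCheck : MonoBehaviour
{
    void Start()
    {
        using (var ctx = new Context())
        using (var devices = ctx.QueryDevices())
        {
            Debug.Log($"RealSense devices found: {devices.Count}");
        }
    }
}
```

Attach it to any GameObject in one of the sample Scenes; a count of 0 usually indicates a connection or driver issue.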
Navigate to Unity's Project Settings > Player > Other Settings and ensure that the Scripting Backend is set to Mono. According to this reply, the RealSense library does not support Unity's IL2CPP scripting backend, at least at the time of writing.
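If you prefer to apply this setting from an Editor script instead of the Project Settings UI, a sketch using Unity's `PlayerSettings` API (placed in an `Editor` folder; the menu path and class name are arbitrary) could look like this:

```csharp
using UnityEditor;

// Editor-only sketch: sets the Android scripting backend to Mono,
// mirroring the Project Settings > Player > Other Settings change.
public static class AndroidBackendSetup
{
    [MenuItem("Tools/Set Android Scripting Backend To Mono")]
    public static void SetMonoBackend()
    {
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android, ScriptingImplementation.Mono2x);
    }
}
```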
In general, in order to allow a Unity project to access the RealSense cameras when targeting a platform other than Windows, the appropriate wrappers for this platform need to be built as Native Plugins first, and included in the Unity project.
In this case, because we are targeting Android (the OS of the Oculus Quest), we will have to build the librealsense.aar plugin from the provided Android Java source code, based on the official guidelines.
In my experience, building from the Windows Command Prompt as an Administrator, using the `gradlew assembleRelease` command, proved to be the most straightforward and least error-prone way:
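Concretely, the build amounts to running the Gradle wrapper from the Android wrapper directory (the directory path is assumed from the repository layout; keep the `<librealsense_root_dir>` placeholder pointed at your local clone):

```shell
:: From an elevated Windows Command Prompt
cd <librealsense_root_dir>\wrappers\android
gradlew assembleRelease
```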
A successful build should take around 10 minutes on a decent machine, and look like this:
If the build is successful, the generated .aar file will be located in `<librealsense_root_dir>/wrappers/android/librealsense/build/outputs/aar`.
The generated librealsense.aar file should be placed inside your Unity project, in the `Assets/RealSenseSDK2.0/Plugins` directory, alongside Intel.RealSense.dll and librealsense2.dll. A successful setup should look like this:
Note: A big shout-out to ogoshen for generously providing the solution of this next step!
Now that all the libraries are in place, before actually being able to access the RealSense camera, we need a C# script that performs two crucial jobs:
- Initializes the Java class RsContext
- Ensures that Android camera permissions are explicitly requested from the user, if they have not already been granted.
Attaching the following script to any GameObject in your Scene will ensure that both operations are executed at the start of your application:
```csharp
using UnityEngine;

public class AndroidPermissions : MonoBehaviour
{
#if UNITY_ANDROID && !UNITY_EDITOR
    void Awake()
    {
        // Explicitly request camera permission if it has not been granted yet.
        if (!UnityEngine.Android.Permission.HasUserAuthorizedPermission(UnityEngine.Android.Permission.Camera))
        {
            UnityEngine.Android.Permission.RequestUserPermission(UnityEngine.Android.Permission.Camera);
        }

        // Initialize the RsContext Java class with the current Unity activity.
        using (var javaUnityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var currentActivity = javaUnityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var rsContext = new AndroidJavaClass("com.intel.realsense.librealsense.RsContext"))
        {
            Debug.Log(rsContext);
            rsContext.CallStatic("init", currentActivity);
        }
    }
#endif
}
```
As stated in the original discussion, if you are using any other XR mode apart from Multi-Pass Stereo, Geometry Shaders will not work on the Quest.
This means that if you try to load an example Unity project, such as the PointCloudDepthAndColor scene from the Unity samples, where the PointCloudMat material assigned to the PointCloudRenderer component uses the Custom/PointCloudGeom shader (a geometry shader) by default, you will get the following error:

```
OPENGL NATIVE PLUG-IN ERROR: GL_INVALID_OPERATION: Operation illegal in current state
```
Switching the shader of this material to the simple Custom/PointCloud shader should work like a charm!
Alternatively, you can switch your XR mode to Multi-Pass stereo.
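If you prefer to perform the shader swap from code rather than in the Inspector, a minimal sketch could look like the following. It assumes the material lives on the same GameObject's Renderer, and that the Custom/PointCloud shader is actually included in the build (otherwise `Shader.Find` returns null at runtime); the class name is hypothetical:

```csharp
using UnityEngine;

// Example helper: replaces the geometry shader with the simple
// Custom/PointCloud shader at runtime, so the scene renders on the Quest.
public class PointCloudShaderFix : MonoBehaviour
{
    void Start()
    {
        var simpleShader = Shader.Find("Custom/PointCloud");
        if (simpleShader != null)
        {
            GetComponent<Renderer>().material.shader = simpleShader;
        }
        else
        {
            Debug.LogWarning("Custom/PointCloud shader not found; make sure it is included in the build.");
        }
    }
}
```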