This content originally appeared on DEV Community and was authored by kurohuku
Prepare camera output
Before displaying the current time on the watch overlay, let's try displaying the camera output of a simple 3D scene.
Prepare scene
In the hierarchy window, create the following game objects.
- Camera
- 3D Object > Cube
- Light > Directional Light
Arrange the objects so that the camera captures the cube. A rough arrangement is fine.
Rotate the cube
We want the camera output to be animated, so create a script that rotates the cube. Create Rotate.cs inside the Scripts folder, then copy the following code.
using UnityEngine;
public class Rotate : MonoBehaviour
{
void Update()
{
// Rotate 0.5 degrees around the Y axis every frame.
transform.Rotate(0, 0.5f, 0);
}
}
Attach Rotate.cs to the Cube object in the scene.
Run the program and check that the cube is rotating.
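By the way, the rotation above advances a fixed 0.5 degrees per rendered frame, so its speed depends on the frame rate. If you prefer a frame-rate-independent rotation, a minimal variant of Rotate.cs could scale the angle by Time.deltaTime (the 30-degrees-per-second value below is just an illustrative choice, not something this tutorial requires):

using UnityEngine;
public class Rotate : MonoBehaviour
{
    // Rotation speed in degrees per second (illustrative value).
    public float degreesPerSecond = 30f;

    void Update()
    {
        // Scaling by Time.deltaTime keeps the speed constant regardless of frame rate.
        transform.Rotate(0, degreesPerSecond * Time.deltaTime, 0);
    }
}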
Now the test scene setup is done.
Add camera reference
Add a camera member variable to WatchOverlay.cs.
public class WatchOverlay : MonoBehaviour
{
+ public Camera camera;
private ulong overlayHandle = OpenVR.k_ulOverlayHandleInvalid;
[Range(0, 0.5f)] public float size;
[Range(-0.2f, 0.2f)] public float x;
[Range(-0.2f, 0.2f)] public float y;
[Range(-0.2f, 0.2f)] public float z;
[Range(0, 360)] public int rotationX;
[Range(0, 360)] public int rotationY;
[Range(0, 360)] public int rotationZ;
...
Set the scene camera to the camera variable of the WatchOverlay object in the Unity Inspector.
Remove image file code
Since we will use the camera output instead, remove the code that loads the image file.
private void Start()
{
InitOpenVR();
overlayHandle = CreateOverlay("WatchOverlayKey", "WatchOverlay");
- var filePath = Application.streamingAssetsPath + "/sns-icon.jpg";
- SetOverlayFromFile(overlayHandle, filePath);
SetOverlaySize(overlayHandle, size);
ShowOverlay(overlayHandle);
}
Create Render Texture
Create a new RenderTextures folder inside Assets in the Project window. Right-click the RenderTextures folder > Create > Render Texture to create a new render texture asset, and name it WatchRenderTexture. We will write the scene camera output to this render texture asset.
Write camera output to the RenderTexture asset
In the Hierarchy, click the Camera object and open the Inspector. Drag WatchRenderTexture from the Project window onto the camera's Target Texture property in the Inspector. The camera output is now written into the WatchRenderTexture asset.
RenderTexture setting
In the Project window, click the WatchRenderTexture asset and open the Inspector. Change Size to 512 x 512.
Optional: When you want to create RenderTexture by script
If you want to create the RenderTexture in code, without creating a RenderTexture asset, you can write it as follows.
public class WatchOverlay : MonoBehaviour
{
public Camera camera;
+ private RenderTexture renderTexture;
private ulong overlayHandle = OpenVR.k_ulOverlayHandleInvalid;
...
private void Start()
{
InitOpenVR();
overlayHandle = CreateOverlay("WatchOverlayKey", "WatchOverlay");
+ // Set camera.targetTexture to write the camera output to the render texture.
+ renderTexture = new RenderTexture(512, 512, 16, RenderTextureFormat.ARGBFloat);
+ camera.targetTexture = renderTexture;
SetOverlaySize(overlayHandle, size);
ShowOverlay(overlayHandle);
}
...
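If you create the RenderTexture from a script like this, you may also want to release its GPU memory when it is no longer needed, for example inside the existing OnDestroy(). A minimal sketch (not part of the original tutorial code):

private void OnDestroy()
{
    // Sketch: free the GPU resources of the script-created render texture.
    if (renderTexture != null)
    {
        renderTexture.Release();
    }
    // Existing shutdown call from the earlier parts of this tutorial.
    ShutdownOpenVR();
}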
Add Render Texture variable
Add a member variable to store the RenderTexture.
public class WatchOverlay : MonoBehaviour
{
public Camera camera;
+ public RenderTexture renderTexture;
private ulong overlayHandle = OpenVR.k_ulOverlayHandleInvalid;
...
Open the WatchOverlay game object's Inspector, and set the WatchRenderTexture asset to the renderTexture variable.
Wait for texture creation
The render texture is created lazily on the GPU, so it may not exist yet during the first few frames. Skip drawing until renderTexture.IsCreated() returns true.
private void Update()
{
var leftControllerIndex = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.LeftHand);
if (leftControllerIndex != OpenVR.k_unTrackedDeviceIndexInvalid)
{
var position = new Vector3(x, y, z);
var rotation = Quaternion.Euler(rotationX, rotationY, rotationZ);
SetOverlayTransformRelative(overlayHandle, leftControllerIndex, position, rotation);
}
+ if (!renderTexture.IsCreated())
+ {
+ return;
+ }
+
+ // add render code here
}
Get a native texture pointer
The texture we pass to OpenVR must be a native texture of the underlying graphics API, such as DirectX or OpenGL, that runs beneath Unity's API layer.
We can get a pointer to this low-level texture data with Unity's GetNativeTexturePtr().
Call GetNativeTexturePtr() in Update() to get the texture reference that will be passed to the OpenVR API.
private void Update()
{
var leftControllerIndex = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.LeftHand);
if (leftControllerIndex != OpenVR.k_unTrackedDeviceIndexInvalid)
{
var position = new Vector3(x, y, z);
var rotation = Quaternion.Euler(rotationX, rotationY, rotationZ);
SetOverlayTransformRelative(overlayHandle, leftControllerIndex, position, rotation);
}
if (!renderTexture.IsCreated())
{
return;
}
+ var nativeTexturePtr = renderTexture.GetNativeTexturePtr();
}
Optional: Sync rendering thread
For performance reasons, the GetNativeTexturePtr() documentation recommends not calling it every frame. However, we call it in Update() in this tutorial, because setting the texture on the overlay without syncing to the rendering thread sometimes crashes the program.
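If you do want to avoid the per-frame call, one possible workaround is to cache the pointer once the render texture exists. This is only a sketch under the assumption that the render texture is never recreated at runtime; cachedNativeTexturePtr and GetCachedNativeTexturePtr are names introduced here for illustration, not part of the tutorial code:

private IntPtr cachedNativeTexturePtr = IntPtr.Zero;

private IntPtr GetCachedNativeTexturePtr()
{
    // Query the native pointer only once, after the texture exists on the GPU.
    if (cachedNativeTexturePtr == IntPtr.Zero && renderTexture.IsCreated())
    {
        cachedNativeTexturePtr = renderTexture.GetNativeTexturePtr();
    }
    return cachedNativeTexturePtr;
}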
Create an OpenVR texture
Create a texture variable of the OpenVR texture type Texture_t.
private void Update()
{
var leftControllerIndex = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.LeftHand);
if (leftControllerIndex != OpenVR.k_unTrackedDeviceIndexInvalid)
{
var position = new Vector3(x, y, z);
var rotation = Quaternion.Euler(rotationX, rotationY, rotationZ);
SetOverlayTransformRelative(overlayHandle, leftControllerIndex, position, rotation);
}
if (!renderTexture.IsCreated())
{
return;
}
var nativeTexturePtr = renderTexture.GetNativeTexturePtr();
+ var texture = new Texture_t
+ {
+ eColorSpace = EColorSpace.Auto,
+ eType = ETextureType.DirectX,
+ handle = nativeTexturePtr
+ };
}
Set the graphics API type in eType. Since DirectX is the default graphics API in this tutorial's environment, we will assume the graphics API is DirectX from now on.
Set the native texture pointer in handle. The camera output is written into this texture.
Optional: If you want to support APIs other than DirectX
You can detect the current graphics API with SystemInfo.graphicsDeviceType.
// GraphicsDeviceType is defined in the UnityEngine.Rendering namespace.
switch (SystemInfo.graphicsDeviceType)
{
case GraphicsDeviceType.Direct3D11:
texture.eType = ETextureType.DirectX;
break;
case GraphicsDeviceType.Direct3D12:
texture.eType = ETextureType.DirectX12;
break;
case GraphicsDeviceType.OpenGLES2:
case GraphicsDeviceType.OpenGLES3:
case GraphicsDeviceType.OpenGLCore:
texture.eType = ETextureType.OpenGL;
break;
case GraphicsDeviceType.Vulkan:
texture.eType = ETextureType.Vulkan;
break;
}
You can check the program behavior with APIs other than Direct3D11 by unchecking Project Settings > Player > Other Settings > Auto Graphics API for Windows, and adding the desired graphics API to Graphics APIs for Windows.
https://docs.unity3d.com/Manual/GraphicsAPIs.html
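As a sketch of how this detection could plug into the overlay code, you could wrap it in a small helper and use it when building the Texture_t. GetOverlayTextureType is a name introduced here for illustration only:

private ETextureType GetOverlayTextureType()
{
    // Map Unity's active graphics API to the matching OpenVR texture type.
    switch (SystemInfo.graphicsDeviceType)
    {
        case GraphicsDeviceType.Direct3D11:
            return ETextureType.DirectX;
        case GraphicsDeviceType.Direct3D12:
            return ETextureType.DirectX12;
        case GraphicsDeviceType.OpenGLES2:
        case GraphicsDeviceType.OpenGLES3:
        case GraphicsDeviceType.OpenGLCore:
            return ETextureType.OpenGL;
        case GraphicsDeviceType.Vulkan:
            return ETextureType.Vulkan;
        default:
            throw new Exception("Unsupported graphics API: " + SystemInfo.graphicsDeviceType);
    }
}

// Usage when creating the OpenVR texture:
// texture.eType = GetOverlayTextureType();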
Draw texture to the overlay
Draw the texture to the overlay with SetOverlayTexture() (see the OpenVR wiki for details). Pass the Texture_t texture we created above.
private void Update()
{
var leftControllerIndex = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.LeftHand);
if (leftControllerIndex != OpenVR.k_unTrackedDeviceIndexInvalid)
{
var position = new Vector3(x, y, z);
var rotation = Quaternion.Euler(rotationX, rotationY, rotationZ);
SetOverlayTransformRelative(overlayHandle, leftControllerIndex, position, rotation);
}
if (!renderTexture.IsCreated())
{
return;
}
var nativeTexturePtr = renderTexture.GetNativeTexturePtr();
var texture = new Texture_t
{
eColorSpace = EColorSpace.Auto,
eType = ETextureType.DirectX,
handle = nativeTexturePtr
};
+ var error = OpenVR.Overlay.SetOverlayTexture(overlayHandle, ref texture);
+ if (error != EVROverlayError.None)
+ {
+ throw new Exception("Failed to draw texture: " + error);
+ }
}
Run the program. Make sure the camera output is shown on the overlay.
Flip vertical
The camera output is vertically flipped, so we will have to flip it back.
This happens because the V-axis direction of the texture UV system is flipped between Unity and DirectX. Unity’s V-axis faces upwards while DirectX’s faces downwards.
https://docs.unity3d.com/Manual/SL-PlatformDifferences.html
There are several ways to fix this issue. Here we use OpenVR's SetOverlayTextureBounds() to flip the V-axis (see the OpenVR wiki for details): swapping vMin and vMax makes OpenVR sample the texture's V axis in reverse, which cancels out the flip.
private void Start()
{
InitOpenVR();
overlayHandle = CreateOverlay("WatchOverlayKey", "WatchOverlay");
+ var bounds = new VRTextureBounds_t
+ {
+ uMin = 0,
+ uMax = 1,
+ vMin = 1,
+ vMax = 0
+ };
+ var error = OpenVR.Overlay.SetOverlayTextureBounds(overlayHandle, ref bounds);
+ if (error != EVROverlayError.None)
+ {
+ throw new Exception("Failed to flip texture: " + error);
+ }
SetOverlaySize(overlayHandle, size);
ShowOverlay(overlayHandle);
}
This flips the texture vertically. Run the program and check that the texture is now displayed the right way up.
Optional: Support for APIs other than DirectX
In this tutorial, we assume the graphics API is DirectX, so we always flip the texture vertically.
If you want to support other APIs, check graphicsDeviceType and branch accordingly, for example "if graphicsDeviceType is OpenGL, don't flip the texture", similar to the earlier section "Optional: If you want to support APIs other than DirectX".
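A rough sketch of that idea (an illustrative example, not the tutorial's final code) could guard the flip in Start() like this:

// Sketch: only flip the V axis when the active graphics API needs it.
// OpenGL's V axis already matches Unity's, so no flip is applied there.
var isOpenGL = SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLCore
            || SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES2
            || SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES3;
if (!isOpenGL)
{
    var bounds = new VRTextureBounds_t { uMin = 0, uMax = 1, vMin = 1, vMax = 0 };
    var error = OpenVR.Overlay.SetOverlayTextureBounds(overlayHandle, ref bounds);
    if (error != EVROverlayError.None)
    {
        throw new Exception("Failed to flip texture: " + error);
    }
}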
Create a canvas to display the current time
We have displayed the camera output on the overlay. Next, let's create the watch itself.
Remove the Cube and the Directional Light; we will not use them any more.
Open the Camera inspector, and click Reset on the Transform component to reset the position to (0, 0, 0).
Create the following objects in the scene.
- UI > Canvas
- UI > Text - TextMeshPro (as a child of the Canvas object)
When you create a TextMeshPro object for the first time, Unity shows a dialog asking to import TMP Essentials. Click the Import TMP Essentials button, then close the dialog after importing.
Open the Canvas Inspector and change Render Mode to Screen Space - Camera. Then drag the Camera object from the Hierarchy to the Render Camera property.
Open the Text (TMP) object inspector, and center the text vertically and horizontally with Alignment.
Enter "00:00:00" as the text.
Select the Camera object, and set Clear Flags to Solid Color.
Click the Background color, and make sure A (Alpha) is 0.
This makes the camera background transparent and displays the current time only.
Select the Canvas object and set the Plane Distance to 10. This makes the canvas easier to edit in the Editor.
Select Text (TMP), click the anchor preset (the square at the top left of the Rect Transform component), and choose the bottom-right option with the blue crossed arrows (stretch in both directions).
Set Left, Top, Right, and Bottom to 0.
Scroll down in the Inspector and set the Font Size of the TextMeshPro - Text (UI) component to 70.
Run the program and check that the placeholder text is displayed on the left wrist.
Adjust the position and font size if it is off.
Activate the watch
Create a new script Watch.cs inside the Scripts folder, and copy the following code.
using UnityEngine;
using System;
using TMPro;
public class Watch : MonoBehaviour
{
private TextMeshProUGUI label;
void Start()
{
// Cache the TextMeshPro label attached to this object.
label = GetComponent<TextMeshProUGUI>();
}
void Update()
{
// Rebuild the time string every frame from the current system clock.
var hour = DateTime.Now.Hour;
var minute = DateTime.Now.Minute;
var second = DateTime.Now.Second;
label.text = $"{hour:00}:{minute:00}:{second:00}";
}
}
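As a small aside, the three separate DateTime.Now reads above can, in rare cases, straddle a second boundary. An equivalent single-call variation (just a sketch; the version above works fine for this tutorial) is:

// Read the clock once and format it as HH:mm:ss in a single call.
label.text = DateTime.Now.ToString("HH:mm:ss");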
Attach Watch.cs to the Text (TMP) object in the scene.
Run the program. It should display the current time.
Organize code
Organize WatchOverlay.cs.
Flip overlay vertically
Move the flipping code into FlipOverlayVertical().
private void Start()
{
InitOpenVR();
overlayHandle = CreateOverlay("WatchOverlayKey", "WatchOverlay");
- var bounds = new VRTextureBounds_t
- {
- uMin = 0,
- uMax = 1,
- vMin = 1,
- vMax = 0
- };
- var error = OpenVR.Overlay.SetOverlayTextureBounds(overlayHandle, ref bounds);
- if (error != EVROverlayError.None)
- {
- throw new Exception("Failed to flip texture: " + error);
- }
+ FlipOverlayVertical(overlayHandle);
SetOverlaySize(overlayHandle, size);
ShowOverlay(overlayHandle);
}
...
+ private void FlipOverlayVertical(ulong handle)
+ {
+ var bounds = new VRTextureBounds_t
+ {
+ uMin = 0,
+ uMax = 1,
+ vMin = 1,
+ vMax = 0
+ };
+
+ var error = OpenVR.Overlay.SetOverlayTextureBounds(handle, ref bounds);
+ if (error != EVROverlayError.None)
+ {
+ throw new Exception("Failed to flip texture: " + error);
+ }
+ }
Draw RenderTexture
Move the drawing code into SetOverlayRenderTexture().
private void Update()
{
var leftControllerIndex = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.LeftHand);
if (leftControllerIndex != OpenVR.k_unTrackedDeviceIndexInvalid)
{
var position = new Vector3(x, y, z);
var rotation = Quaternion.Euler(rotationX, rotationY, rotationZ);
SetOverlayTransformRelative(overlayHandle, leftControllerIndex, position, rotation);
}
- var nativeTexturePtr = renderTexture.GetNativeTexturePtr();
- var texture = new Texture_t
- {
- eColorSpace = EColorSpace.Auto,
- eType = ETextureType.DirectX,
- handle = nativeTexturePtr
- };
- var error = OpenVR.Overlay.SetOverlayTexture(overlayHandle, ref texture);
- if (error != EVROverlayError.None)
- {
- throw new Exception("Failed to draw texture: " + error);
- }
+ SetOverlayRenderTexture(overlayHandle, renderTexture);
}
...
+ private void SetOverlayRenderTexture(ulong handle, RenderTexture renderTexture)
+ {
+ if (!renderTexture.IsCreated()) return;
+
+ var nativeTexturePtr = renderTexture.GetNativeTexturePtr();
+ var texture = new Texture_t
+ {
+ eColorSpace = EColorSpace.Auto,
+ eType = ETextureType.DirectX,
+ handle = nativeTexturePtr
+ };
+ var error = OpenVR.Overlay.SetOverlayTexture(handle, ref texture);
+ if (error != EVROverlayError.None)
+ {
+ throw new Exception("Failed to draw texture: " + error);
+ }
+ }
Final code
using UnityEngine;
using Valve.VR;
using System;
public class WatchOverlay : MonoBehaviour
{
public Camera camera;
public RenderTexture renderTexture;
private ulong overlayHandle = OpenVR.k_ulOverlayHandleInvalid;
[Range(0, 0.5f)] public float size;
[Range(-0.2f, 0.2f)] public float x;
[Range(-0.2f, 0.2f)] public float y;
[Range(-0.2f, 0.2f)] public float z;
[Range(0, 360)] public int rotationX;
[Range(0, 360)] public int rotationY;
[Range(0, 360)] public int rotationZ;
private void Start()
{
InitOpenVR();
overlayHandle = CreateOverlay("WatchOverlayKey", "WatchOverlay");
FlipOverlayVertical(overlayHandle);
SetOverlaySize(overlayHandle, size);
ShowOverlay(overlayHandle);
}
private void Update()
{
var position = new Vector3(x, y, z);
var rotation = Quaternion.Euler(rotationX, rotationY, rotationZ);
var leftControllerIndex = OpenVR.System.GetTrackedDeviceIndexForControllerRole(ETrackedControllerRole.LeftHand);
if (leftControllerIndex != OpenVR.k_unTrackedDeviceIndexInvalid)
{
SetOverlayTransformRelative(overlayHandle, leftControllerIndex, position, rotation);
}
SetOverlayRenderTexture(overlayHandle, renderTexture);
}
private void OnApplicationQuit()
{
DestroyOverlay(overlayHandle);
}
private void OnDestroy()
{
ShutdownOpenVR();
}
private void InitOpenVR()
{
if (OpenVR.System != null) return;
var error = EVRInitError.None;
OpenVR.Init(ref error, EVRApplicationType.VRApplication_Overlay);
if (error != EVRInitError.None)
{
throw new Exception("Failed to initialize OpenVR: " + error);
}
}
private void ShutdownOpenVR()
{
if (OpenVR.System != null)
{
OpenVR.Shutdown();
}
}
private ulong CreateOverlay(string key, string name)
{
var handle = OpenVR.k_ulOverlayHandleInvalid;
var error = OpenVR.Overlay.CreateOverlay(key, name, ref handle);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to create overlay: " + error);
}
return handle;
}
private void DestroyOverlay(ulong handle)
{
if (handle != OpenVR.k_ulOverlayHandleInvalid)
{
var error = OpenVR.Overlay.DestroyOverlay(handle);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to dispose overlay: " + error);
}
}
}
private void SetOverlayFromFile(ulong handle, string path)
{
var error = OpenVR.Overlay.SetOverlayFromFile(handle, path);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to draw image file: " + error);
}
}
private void ShowOverlay(ulong handle)
{
var error = OpenVR.Overlay.ShowOverlay(handle);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to show overlay: " + error);
}
}
private void SetOverlaySize(ulong handle, float size)
{
var error = OpenVR.Overlay.SetOverlayWidthInMeters(handle, size);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to set overlay size: " + error);
}
}
private void SetOverlayTransformAbsolute(ulong handle, Vector3 position, Quaternion rotation)
{
var rigidTransform = new SteamVR_Utils.RigidTransform(position, rotation);
var matrix = rigidTransform.ToHmdMatrix34();
var error = OpenVR.Overlay.SetOverlayTransformAbsolute(handle, ETrackingUniverseOrigin.TrackingUniverseStanding, ref matrix);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to set overlay position: " + error);
}
}
private void SetOverlayTransformRelative(ulong handle, uint deviceIndex, Vector3 position, Quaternion rotation)
{
var rigidTransform = new SteamVR_Utils.RigidTransform(position, rotation);
var matrix = rigidTransform.ToHmdMatrix34();
var error = OpenVR.Overlay.SetOverlayTransformTrackedDeviceRelative(handle, deviceIndex, ref matrix);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to set overlay position: " + error);
}
}
private void FlipOverlayVertical(ulong handle)
{
var bounds = new VRTextureBounds_t
{
uMin = 0,
uMax = 1,
vMin = 1,
vMax = 0
};
var error = OpenVR.Overlay.SetOverlayTextureBounds(handle, ref bounds);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to flip texture: " + error);
}
}
private void SetOverlayRenderTexture(ulong handle, RenderTexture renderTexture)
{
if (!renderTexture.IsCreated()) return;
var nativeTexturePtr = renderTexture.GetNativeTexturePtr();
var texture = new Texture_t
{
eColorSpace = EColorSpace.Auto,
eType = ETextureType.DirectX,
handle = nativeTexturePtr
};
var error = OpenVR.Overlay.SetOverlayTexture(handle, ref texture);
if (error != EVROverlayError.None)
{
throw new Exception("Failed to draw texture: " + error);
}
}
}
Finally, we have displayed the current time on the left wrist. In the next part, we will create a dashboard settings screen to switch which hand the overlay is displayed on.