Package org.lwjgl.ovr

Class OVRLayerEyeFovDepth

  • All Implemented Interfaces:
    java.lang.AutoCloseable, NativeResource, Pointer

    public class OVRLayerEyeFovDepth
    extends Struct
    implements NativeResource
    Describes a layer that specifies a monoscopic or stereoscopic view, with depth textures in addition to color textures. This is typically used to support positional time warp. This struct is the same as OVRLayerEyeFov, but with the addition of DepthTexture and ProjectionDesc.

    ProjectionDesc can be created using TimewarpProjectionDesc_FromProjection.
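    As context for what that helper captures: ovrTimewarpProjectionDesc holds the three projection-matrix entries (M[2][2], M[2][3] and M[3][2]) that timewarp needs to relate depth-buffer samples to view-space depth in meters. The sketch below is illustrative only; the class and field names are stand-ins, not the LWJGL API:

    ```java
    // Illustrative stand-in for ovrTimewarpProjectionDesc; the real struct
    // holds three floats named Projection22, Projection23 and Projection32.
    final class TimewarpProjectionDesc {
        float projection22; // M[2][2]
        float projection23; // M[2][3]
        float projection32; // M[3][2]
    }

    public class TimewarpDesc {

        // Sketch of what TimewarpProjectionDesc_FromProjection extracts from a
        // row-major 4x4 projection matrix: the entries needed to map depth
        // samples back to linear view-space depth.
        static TimewarpProjectionDesc fromProjection(float[][] m) {
            TimewarpProjectionDesc d = new TimewarpProjectionDesc();
            d.projection22 = m[2][2];
            d.projection23 = m[2][3];
            d.projection32 = m[3][2];
            return d;
        }

        public static void main(String[] args) {
            // Relevant entries of a typical right-handed perspective projection.
            float[][] proj = {
                { 1f, 0f,  0f,    0f },
                { 0f, 1f,  0f,    0f },
                { 0f, 0f, -1f, -0.2f },
                { 0f, 0f, -1f,    0f }
            };
            TimewarpProjectionDesc d = fromProjection(proj);
            System.out.println(d.projection22 + " " + d.projection23 + " " + d.projection32);
        }
    }
    ```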

    Three options exist with respect to mono/stereo texture usage:

    • ColorTexture[0] and ColorTexture[1] contain the left and right stereo renderings, respectively. Viewport[0] and Viewport[1] refer to ColorTexture[0] and ColorTexture[1], respectively.
    • ColorTexture[0] contains both the left and right renderings, ColorTexture[1] is NULL, and Viewport[0] and Viewport[1] refer to sub-rects within ColorTexture[0].
    • ColorTexture[0] contains a single monoscopic rendering, and Viewport[0] and Viewport[1] both refer to that rendering.
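    For the shared-texture case (the second option above), the two viewports typically address the left and right halves of the single texture. A minimal sketch with stand-in types (the real ovrRecti/ovrSizei/ovrVector2i come from OVR_CAPI.h, mirrored by LWJGL's OVRRecti; the classes below are not the LWJGL API):

    ```java
    // Stand-in for ovrRecti (assumption: same Pos/Size semantics as
    // OVR_CAPI.h; not the LWJGL OVRRecti class).
    final class Recti {
        int x, y, w, h;
        Recti(int x, int y, int w, int h) { this.x = x; this.y = y; this.w = w; this.h = h; }
        @Override public String toString() { return x + "," + y + " " + w + "x" + h; }
    }

    public class SharedTextureViewports {

        // One swap-chain texture holds both eyes, so Viewport[0] and
        // Viewport[1] select side-by-side sub-rects of it.
        static Recti[] splitViewports(int texWidth, int texHeight) {
            int eyeWidth = texWidth / 2;
            return new Recti[] {
                new Recti(0,        0, eyeWidth, texHeight), // left eye: left half
                new Recti(eyeWidth, 0, eyeWidth, texHeight)  // right eye: right half
            };
        }

        public static void main(String[] args) {
            for (Recti vp : splitViewports(2048, 1024)) {
                System.out.println(vp);
            }
        }
    }
    ```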

    Member documentation

    • Header – Header.Type must be LayerType_EyeFovDepth
    • ColorTexture[ovrEye_Count] – ovrTextureSwapChains for the left and right eye respectively. The second one may be NULL in the cases described above.
    • Viewport[ovrEye_Count] – specifies the ColorTexture sub-rect UV coordinates. Both Viewport[0] and Viewport[1] must be valid.
    • Fov[ovrEye_Count] – the viewport field of view
    • RenderPose[ovrEye_Count] – specifies the position and orientation of each eye view, with position specified in meters. RenderPose will typically be the value returned from ovr_CalcEyePoses, but can be different in special cases if a different head pose is used for rendering.
    • SensorSampleTime – specifies the timestamp when the source OVRPosef (used in calculating RenderPose) was sampled from the SDK. Typically retrieved by calling GetTimeInSeconds around the instant the application calls GetTrackingState. The main purpose for this is to accurately track app tracking latency.
    • DepthTexture[ovrEye_Count] – depth texture for depth composition with overlays. Must map 1:1 to the ColorTexture.
    • ProjectionDesc – specifies how to convert DepthTexture information into meters


     struct ovrLayerEyeFovDepth {
         ovrLayerHeader Header;
         ovrTextureSwapChain ColorTexture[ovrEye_Count];
         ovrRecti Viewport[ovrEye_Count];
         ovrFovPort Fov[ovrEye_Count];
         ovrPosef RenderPose[ovrEye_Count];
         double SensorSampleTime;
         ovrTextureSwapChain DepthTexture[ovrEye_Count];
         ovrTimewarpProjectionDesc ProjectionDesc;
     }