Package org.lwjgl.ovr

Class OVRLayerEyeFov

  • All Implemented Interfaces:
    java.lang.AutoCloseable, NativeResource, Pointer

    public class OVRLayerEyeFov
    extends Struct
    implements NativeResource
    Describes a layer that specifies a monoscopic or stereoscopic view. This is the kind of layer that's typically used as layer 0 to SubmitFrame, as it is the kind of layer used to render a 3D stereoscopic view.

    Member documentation

    • Header – Header.Type must be LayerType_EyeFov
    • ColorTexture – ovrTextureSwapChains for the left and right eye respectively. The second may be NULL.
    • Viewport – specifies the ColorTexture sub-rect UV coordinates. Both Viewport[0] and Viewport[1] must be valid.
    • Fov – the viewport field of view
    • RenderPose – specifies the position and orientation of each eye view, with the position specified in meters. RenderPose will typically be the value returned from _CalcEyePoses, but can be different in special cases if a different head pose is used for rendering.
    • SensorSampleTime – specifies the timestamp when the source OVRPosef (used in calculating RenderPose) was sampled from the SDK. Typically retrieved by calling GetTimeInSeconds around the instant the application calls GetTrackingState. The main purpose for this is to accurately track app tracking latency.


     struct ovrLayerEyeFov {
         ovrLayerHeader Header;
         ovrTextureSwapChain ColorTexture[ovrEye_Count];
         ovrRecti Viewport[ovrEye_Count];
         ovrFovPort Fov[ovrEye_Count];
         ovrPosef RenderPose[ovrEye_Count];
         double SensorSampleTime;
     }