Package org.lwjgl.ovr

Class OVRUtil


  • public class OVRUtil
    extends java.lang.Object
    Native bindings to the libOVR utility functions.
    • Method Detail

      • novr_Detect

        public static void novr_Detect​(int timeoutMilliseconds,
                                       long __result)
        Unsafe version of: ovr_Detect
      • ovr_Detect

        public static OVRDetectResult ovr_Detect​(int timeoutMilliseconds,
                                                 OVRDetectResult __result)
        Detects Oculus Runtime and Device Status.

        Checks for Oculus Runtime and Oculus HMD device status without loading the LibOVRRT shared library. This may be called before Initialize to help decide whether or not to initialize LibOVR.

        Parameters:
        timeoutMilliseconds - a timeout, in milliseconds, to wait for an HMD to be attached, or 0 to poll
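
        A minimal pre-initialization check, sketched under the assumption that the OVRDetectResult accessors IsOculusServiceRunning() and IsOculusHMDConnected() follow LWJGL's generated struct API:

        import org.lwjgl.ovr.OVRDetectResult;

        import static org.lwjgl.ovr.OVRUtil.ovr_Detect;

        public class DetectExample {
            public static void main(String[] args) {
                OVRDetectResult detect = OVRDetectResult.calloc();
                try {
                    // Wait up to 500 ms for an HMD to be attached; pass 0 to poll instead.
                    ovr_Detect(500, detect);
                    System.out.println("Oculus service running: " + detect.IsOculusServiceRunning());
                    System.out.println("Oculus HMD connected:   " + detect.IsOculusHMDConnected());
                } finally {
                    detect.free(); // frees the Java-side struct allocation
                }
            }
        }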
      • novrMatrix4f_Projection

        public static void novrMatrix4f_Projection​(long fov,
                                                   float znear,
                                                   float zfar,
                                                   int projectionModFlags,
                                                   long __result)
        Unsafe version of: ovrMatrix4f_Projection
      • novrTimewarpProjectionDesc_FromProjection

        public static void novrTimewarpProjectionDesc_FromProjection​(long projection,
                                                                     int projectionModFlags,
                                                                     long __result)
        Unsafe version of: ovrTimewarpProjectionDesc_FromProjection
      • novrMatrix4f_OrthoSubProjection

        public static void novrMatrix4f_OrthoSubProjection​(long projection,
                                                           long orthoScale,
                                                           float orthoDistance,
                                                           float HmdToEyeOffsetX,
                                                           long __result)
        Unsafe version of: ovrMatrix4f_OrthoSubProjection
      • ovrMatrix4f_OrthoSubProjection

        public static OVRMatrix4f ovrMatrix4f_OrthoSubProjection​(OVRMatrix4f projection,
                                                                 OVRVector2f orthoScale,
                                                                 float orthoDistance,
                                                                 float HmdToEyeOffsetX,
                                                                 OVRMatrix4f __result)
        Generates an orthographic sub-projection.

        Used for 2D rendering, Y is down.

        Parameters:
        projection - the perspective matrix that the orthographic matrix is derived from
        orthoScale - equal to 1.0f / pixelsPerTanAngleAtCenter
        orthoDistance - equal to the distance from the camera in meters, such as 0.8m
        HmdToEyeOffsetX - the offset of the eye from the center
        __result - the calculated projection matrix
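
        A sketch of deriving a 2D sub-projection from a perspective matrix built with ovrMatrix4f_Projection. The FOV tangents, the pixelsPerTanAngleAtCenter value and the eye offset are placeholder assumptions, and the ovrProjection_ClipRangeOpenGL constant is assumed to be declared in this class:

        import org.lwjgl.ovr.OVRFovPort;
        import org.lwjgl.ovr.OVRMatrix4f;
        import org.lwjgl.ovr.OVRVector2f;

        import static org.lwjgl.ovr.OVRUtil.*;

        public class OrthoSubProjectionExample {
            public static void main(String[] args) {
                OVRFovPort  fov        = OVRFovPort.calloc().UpTan(1.0f).DownTan(1.0f).LeftTan(1.0f).RightTan(1.0f);
                OVRMatrix4f projection = OVRMatrix4f.calloc();
                OVRVector2f orthoScale = OVRVector2f.calloc();
                OVRMatrix4f ortho      = OVRMatrix4f.calloc();
                try {
                    // Perspective matrix the orthographic sub-projection is derived from (placeholder symmetric FOV).
                    ovrMatrix4f_Projection(fov, 0.1f, 1000.0f, ovrProjection_ClipRangeOpenGL, projection);

                    float pixelsPerTanAngleAtCenter = 600.0f; // placeholder; normally taken from the eye render target
                    orthoScale.x(1.0f / pixelsPerTanAngleAtCenter)
                              .y(1.0f / pixelsPerTanAngleAtCenter);

                    // 2D layer 0.8 m from the camera, eye offset 0.032 m from the head center (placeholders).
                    ovrMatrix4f_OrthoSubProjection(projection, orthoScale, 0.8f, 0.032f, ortho);
                } finally {
                    fov.free(); projection.free(); orthoScale.free(); ortho.free();
                }
            }
        }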
      • novr_CalcEyePoses

        public static void novr_CalcEyePoses​(long headPose,
                                             long HmdToEyePose,
                                             long outEyePoses)
        Unsafe version of: ovr_CalcEyePoses
      • ovr_CalcEyePoses

        public static void ovr_CalcEyePoses​(OVRPosef headPose,
                                            OVRPosef.Buffer HmdToEyePose,
                                            OVRPosef.Buffer outEyePoses)
        Computes offset eye poses based on the headPose in OVRTrackingState.
        Parameters:
        headPose - indicates the HMD position and orientation to use for the calculation
        HmdToEyePose - can be OVREyeRenderDesc.HmdToEyePose returned from GetRenderDesc. For monoscopic rendering, use a position vector that is the average of the two position vectors for each eye.
        outEyePoses - if outEyePoses are used for rendering, they should be passed to SubmitFrame in OVRLayerEyeFov::RenderPose or OVRLayerEyeFovDepth::RenderPose
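
        A minimal helper sketch; the head pose and the two-element HmdToEyePose buffer are assumed to be obtained elsewhere (the accessor chain mentioned in the comment is an assumption about the LWJGL struct API):

        import org.lwjgl.ovr.OVRPosef;

        import static org.lwjgl.ovr.OVRUtil.ovr_CalcEyePoses;

        public final class EyePoseHelper {

            /**
             * @param headPose     head pose, e.g. OVRTrackingState.HeadPose().ThePose()
             * @param hmdToEyePose two-element buffer copied from each eye's OVREyeRenderDesc.HmdToEyePose()
             */
            public static OVRPosef.Buffer calcEyePoses(OVRPosef headPose, OVRPosef.Buffer hmdToEyePose) {
                OVRPosef.Buffer eyePoses = OVRPosef.calloc(2); // index 0 = left eye, index 1 = right eye
                ovr_CalcEyePoses(headPose, hmdToEyePose, eyePoses);
                // Pass the poses to SubmitFrame via OVRLayerEyeFov::RenderPose; call eyePoses.free() when done.
                return eyePoses;
            }
        }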
      • novr_GetEyePoses

        public static void novr_GetEyePoses​(long session,
                                            long frameIndex,
                                            boolean latencyMarker,
                                            long HmdToEyePose,
                                            long outEyePoses,
                                            long outSensorSampleTime)
        Unsafe version of: ovr_GetEyePoses
      • ovr_GetEyePoses

        public static void ovr_GetEyePoses​(long session,
                                           long frameIndex,
                                           boolean latencyMarker,
                                           OVRPosef.Buffer HmdToEyePose,
                                           OVRPosef.Buffer outEyePoses,
                                           @Nullable
                                           java.nio.DoubleBuffer outSensorSampleTime)
        Returns offset eye poses in outEyePoses, computed from the predicted head pose.

        This is a thread-safe function where the caller should increment frameIndex with every frame and pass that index, where applicable, to functions called on the rendering thread. Assuming outEyePoses are used for rendering, they should be passed as part of OVRLayerEyeFov. The caller does not need to worry about applying HmdToEyePose to the returned outEyePoses variables.

        Parameters:
        session - an ovrSession previously returned by Create
        frameIndex - the targeted frame index, or 0 to refer to one frame after the last time SubmitFrame was called
        latencyMarker - specifies that this call is the point in time where the "App-to-Mid-Photon" latency timer starts from. If a given ovrLayer provides "SensorSampleTimestamp", that will override the value stored here.
        HmdToEyePose - can be OVREyeRenderDesc.HmdToEyePose returned from GetRenderDesc. For monoscopic rendering, use a position vector that is the average of the two position vectors for each eye.
        outEyePoses - the predicted eye poses
        outSensorSampleTime - the time when this function was called. May be NULL, in which case it is ignored.
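
        A per-frame usage sketch, assuming a valid ovrSession from Create and a two-element HmdToEyePose buffer filled from GetRenderDesc elsewhere:

        import java.nio.DoubleBuffer;

        import org.lwjgl.BufferUtils;
        import org.lwjgl.ovr.OVRPosef;

        import static org.lwjgl.ovr.OVRUtil.ovr_GetEyePoses;

        public final class FrameEyePoses {

            private final OVRPosef.Buffer eyePoses     = OVRPosef.calloc(2);
            private final DoubleBuffer    sensorSample = BufferUtils.createDoubleBuffer(1);

            private long frameIndex;

            /** Call once per frame on the rendering thread. */
            public OVRPosef.Buffer update(long session, OVRPosef.Buffer hmdToEyePose) {
                frameIndex++; // increment every frame and pass the same index to SubmitFrame
                ovr_GetEyePoses(session, frameIndex, true, hmdToEyePose, eyePoses, sensorSample);
                // eyePoses.get(0)/get(1) go into OVRLayerEyeFov::RenderPose;
                // sensorSample.get(0) goes into the layer's SensorSampleTime.
                return eyePoses;
            }
        }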
      • novrPosef_FlipHandedness

        public static void novrPosef_FlipHandedness​(long inPose,
                                                    long outPose)
        Unsafe version of: ovrPosef_FlipHandedness
      • ovrPosef_FlipHandedness

        public static void ovrPosef_FlipHandedness​(OVRPosef inPose,
                                                   OVRPosef outPose)
        Tracking poses provided by the SDK come in a right-handed coordinate system. If an application is passing in Projection_LeftHanded into Matrix4f_Projection, then it should also use this function to flip the HMD tracking poses to be left-handed.

        While this utility function is intended to convert a left-handed OVRPosef into a right-handed coordinate system, it will also work for converting right-handed to left-handed since the flip operation is the same for both cases.

        Parameters:
        inPose - a pose that is right-handed
        outPose - the pose that is requested to be left-handed (can be the same pointer as inPose)
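
        A small sketch of an in-place flip, relying on the note above that inPose and outPose may be the same struct:

        import org.lwjgl.ovr.OVRPosef;

        import static org.lwjgl.ovr.OVRUtil.ovrPosef_FlipHandedness;

        public final class Handedness {

            /** Flips the handedness of a tracking pose in place. */
            public static OVRPosef flip(OVRPosef pose) {
                ovrPosef_FlipHandedness(pose, pose); // the flip is symmetric, so it converts either direction
                return pose;
            }
        }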
      • novr_ReadWavFromBuffer

        public static int novr_ReadWavFromBuffer​(long outAudioChannel,
                                                 long inputData,
                                                 int dataSizeInBytes,
                                                 int stereoChannelToUse)
        Unsafe version of: ovr_ReadWavFromBuffer
        Parameters:
        dataSizeInBytes - size of the buffer in bytes
      • ovr_ReadWavFromBuffer

        public static int ovr_ReadWavFromBuffer​(OVRAudioChannelData outAudioChannel,
                                                java.nio.ByteBuffer inputData,
                                                int stereoChannelToUse)
        Reads an audio channel from Wav (Waveform Audio File) data.

        Input must be a byte buffer representing a valid Wav file. Audio samples from the specified channel are read, converted to float in the range [-1.0f, 1.0f], and returned through OVRAudioChannelData.

        Supported formats: PCM 8b, 16b, 32b and IEEE float (little-endian only).

        Parameters:
        outAudioChannel - output audio channel data
        inputData - a binary buffer containing valid Wav file data
        stereoChannelToUse - audio channel index to extract (0 for mono)
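
        A sketch of loading a Wav file from disk and extracting channel 0; the file path is a placeholder, and the OVRAudioChannelData accessors are assumed to mirror the C struct fields (SamplesCount, Frequency):

        import java.nio.ByteBuffer;
        import java.nio.file.Files;
        import java.nio.file.Paths;

        import org.lwjgl.BufferUtils;
        import org.lwjgl.ovr.OVRAudioChannelData;

        import static org.lwjgl.ovr.OVRUtil.*;

        public class ReadWavExample {
            public static void main(String[] args) throws Exception {
                byte[] bytes = Files.readAllBytes(Paths.get("notification.wav")); // placeholder path
                ByteBuffer wav = BufferUtils.createByteBuffer(bytes.length);
                wav.put(bytes);
                wav.flip();

                OVRAudioChannelData channel = OVRAudioChannelData.calloc();
                int result = ovr_ReadWavFromBuffer(channel, wav, 0); // channel 0 (left, or mono)
                if (result < 0) { // negative ovrResult values indicate failure
                    channel.free();
                    throw new IllegalStateException("ovr_ReadWavFromBuffer failed: " + result);
                }

                System.out.println("Samples: " + channel.SamplesCount() + " @ " + channel.Frequency() + " Hz");

                ovr_ReleaseAudioChannelData(channel); // releases the sample memory allocated by the SDK
                channel.free();                       // releases the Java-side struct
            }
        }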
      • novr_GenHapticsFromAudioData

        public static int novr_GenHapticsFromAudioData​(long outHapticsClip,
                                                       long audioChannel,
                                                       int genMode)
        Unsafe version of: ovr_GenHapticsFromAudioData
      • ovr_GenHapticsFromAudioData

        public static int ovr_GenHapticsFromAudioData​(OVRHapticsClip outHapticsClip,
                                                      OVRAudioChannelData audioChannel,
                                                      int genMode)
        Generates playable Touch Haptics data from an audio channel.
        Parameters:
        outHapticsClip - generated Haptics clip
        audioChannel - input audio channel data
        genMode - mode used to convert the audio channel data to Haptics data. Must be:
        HapticsGenMode_PointSample
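
        A follow-on sketch converting a previously read audio channel into a haptics clip; the ovrHapticsGenMode_PointSample constant is assumed to be declared in this class:

        import org.lwjgl.ovr.OVRAudioChannelData;
        import org.lwjgl.ovr.OVRHapticsClip;

        import static org.lwjgl.ovr.OVRUtil.*;

        public final class HapticsFromAudio {

            /** Converts an audio channel read with ovr_ReadWavFromBuffer into a Touch haptics clip. */
            public static OVRHapticsClip toHapticsClip(OVRAudioChannelData audioChannel) {
                OVRHapticsClip clip = OVRHapticsClip.calloc();
                int result = ovr_GenHapticsFromAudioData(clip, audioChannel, ovrHapticsGenMode_PointSample);
                if (result < 0) { // negative ovrResult values indicate failure
                    clip.free();
                    throw new IllegalStateException("ovr_GenHapticsFromAudioData failed: " + result);
                }
                // Callers must eventually call ovr_ReleaseHapticsClip(clip) and clip.free().
                return clip;
            }
        }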
      • novr_ReleaseAudioChannelData

        public static void novr_ReleaseAudioChannelData​(long audioChannel)
        Unsafe version of: ovr_ReleaseAudioChannelData
      • ovr_ReleaseAudioChannelData

        public static void ovr_ReleaseAudioChannelData​(OVRAudioChannelData audioChannel)
        Releases memory allocated for ovrAudioChannelData. Must be called to avoid a memory leak.
        Parameters:
        audioChannel - pointer to an audio channel
      • novr_ReleaseHapticsClip

        public static void novr_ReleaseHapticsClip​(long hapticsClip)
        Unsafe version of: ovr_ReleaseHapticsClip
      • ovr_ReleaseHapticsClip

        public static void ovr_ReleaseHapticsClip​(OVRHapticsClip hapticsClip)
        Releases memory allocated for ovrHapticsClip. Must be called to avoid a memory leak.
        Parameters:
        hapticsClip - pointer to a haptics clip
      • ovr_GetEyePoses

        public static void ovr_GetEyePoses​(long session,
                                           long frameIndex,
                                           boolean latencyMarker,
                                           OVRPosef.Buffer HmdToEyePose,
                                           OVRPosef.Buffer outEyePoses,
                                           @Nullable
                                           double[] outSensorSampleTime)
        Array version of: ovr_GetEyePoses