The SenseGlove Unity Plugin is built upon our native C# API, and allows one to interface with SenseGlove Devices. The latest stable version is v2.3.1, compiled 2022-06-13 with Unity version 2017.4.30f1. You can download it from GitHub. If you’re not sure where to begin, refer to our “Getting started” guide.
Our Unity Plugin is compatible with Windows, Linux and Android (Oculus Quest 2, Pico Neo). It is not dependent on any 3rd party XR plugins (such as SteamVR or Oculus). The XR rig will need to be linked to the SenseGlove scripts, a tutorial for which is available here.
The SenseGlove Unity Plugin requires the SenseCom software to interface with SenseGlove devices on Windows and Linux. This software comes with the Unity Plugin download. Once SenseCom has been installed and run at least once, the SenseGlove Unity Plugin will start it automatically when your program runs.
At the core of the plugin is the SG_HapticGlove script that, when placed in a scene, allows one to retrieve hand tracking data and send haptic commands to one glove. You’ll need two of these in your scene: one for the right hand and one for the left hand. Unity projects using this script are compatible with both the SenseGlove Nova and the SenseGlove DK1 (the exoskeleton glove).
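As a rough sketch, a script that reads tracking data from both gloves might look like the following. Note that the `IsConnected` property and the `GetHandPose` signature shown here are assumptions based on the plugin’s naming conventions; check the SG_HapticGlove source in your plugin version for the exact members.

```csharp
using UnityEngine;
using SG; // SenseGlove Unity Plugin namespace (verify against your plugin version)

public class GloveExample : MonoBehaviour
{
    // Assign one SG_HapticGlove per hand via the Inspector.
    public SG_HapticGlove leftGlove;
    public SG_HapticGlove rightGlove;

    void Update()
    {
        // Hypothetical accessors: retrieve the latest hand pose if the glove is connected.
        if (rightGlove != null && rightGlove.IsConnected
            && rightGlove.GetHandPose(out SG_HandPose pose))
        {
            // Do something with the tracking data, e.g. log or drive a hand model.
            Debug.Log("Received a new right-hand pose this frame.");
        }
    }
}
```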
SenseGlove provides a pre-rigged 3D Hand Model composed of several ‘layers’, each of which is responsible for a part of interacting with the virtual world: these layers handle animation, (force) feedback, grabbing, calibration, gesture recognition, hand physics, etc. Each of these layers is linked by the SG_TrackedHand script, which determines their execution order. Each layer is optional, and can be either disabled or deleted entirely. The layers can also be used individually, without an SG_TrackedHand, but you’ll then need to link their components manually via the Inspector.
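For example, disabling a single layer at runtime could be sketched as follows. The `handPhysics` field name is an assumption for illustration; the actual layer references on SG_TrackedHand may be named differently in your plugin version.

```csharp
using UnityEngine;
using SG; // SenseGlove Unity Plugin namespace (verify against your plugin version)

public class DisableHandPhysics : MonoBehaviour
{
    // Assign the SG_TrackedHand of one hand via the Inspector.
    public SG_TrackedHand trackedHand;

    void Start()
    {
        // Hypothetical field: the physics layer component linked by SG_TrackedHand.
        // Deactivating its GameObject turns the layer off; the other layers keep working.
        if (trackedHand != null && trackedHand.handPhysics != null)
        {
            trackedHand.handPhysics.gameObject.SetActive(false);
        }
    }
}
```

Deleting the layer’s GameObject from the Hand Prefab achieves the same result permanently.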
Unity Plugin Features
Interface with SenseGlove Nova and DK1 Exoskeleton through the SG_HapticGlove class.
A flexible Hand Prefab with multiple layers, each of which can be toggled or deleted to suit your simulation.
An SG_User to link two hands together, and hide the Hand Models of unconnected gloves.
SG_VR_Rig scripts to link any 3rd party VR Plugins (SteamVR, Oculus, Unity XR) to the SenseGlove Hands.
An optional “Physics-Based” Grab Script; pick up objects between your thumb and fingers, or fingers and the hand palm.
An SG_Interactable script that can be extended to create any kind of interaction in VR.
Basic SG_Grabable script that represents an object that can be picked up and placed in various ways, with one or more hands at the same time.
DropZones and SnapZones to detect SG_Grabable objects, used to quickly create assembly tasks in VR.
Hand Detection zones to detect the hand’s rigidbody, used to create simple buttons or interaction zones.
A “Calibration Scene” to include at the start, which guides users through calibration for optimal tracking. Alternatively, you can either activate the calibration Layer or use the SenseCom software.
Assign Force-Feedback properties to any GameObject through the SG_Material script. The force-response is based on collider penetration (how far inside the collider we are).
Hand Physics Bodies to push objects around or prevent the hands from passing through virtual tables.
Optional Passthrough layer that prevents fingers from going into virtual objects.
Scripts to deform the (visual) mesh of an object when you squeeze it.
Access to the underlying C# classes for extended functionality (Accessing raw sensor data, device info).
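To illustrate the SG_Material behaviour described in the list above, the force response based on collider penetration can be sketched as a simple linear model: force rises with how far a fingertip has entered the collider, clamped to the device’s 0–100% range. This is a hand-written illustration of the concept, not the plugin’s actual implementation; SG_Material exposes its response parameters via the Inspector.

```csharp
using UnityEngine;

// Conceptual sketch of a penetration-based force response (not SenseGlove's actual code).
public static class ForceResponseSketch
{
    /// Returns a force level in percent (0..100) for a given penetration depth,
    /// where maxDepth is the depth at which the material feels fully rigid.
    public static float ForceLevel(float penetrationDepth, float maxDepth)
    {
        if (maxDepth <= 0f) { return 0f; }
        return Mathf.Clamp01(penetrationDepth / maxDepth) * 100f;
    }
}
```

A soft material would use a large `maxDepth` (force builds up slowly), while a rigid one would use a small `maxDepth` (force ramps to 100% almost immediately).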
What is not possible (yet)
These features are not available in the current Unity Plugin, but might become available at later releases.
Computer-Vision (Optical) tracking of the Nova Glove is still in development for Vive Pro. Expected release near the end of 2022.
OpenVR Integration: It’s not yet possible to access SenseGlove tracking through OpenVR (SteamVR / Unity XR).
Controller support: (VR) controllers cannot be used to interface with the scripts in the SenseGlove Unity Plugin, though our SG_TrackedHand can take input from any device that implements the IHandPoseProvider interface.
Windows Mixed Reality (WMR) platform support.
Basic “Hinge / Door” and “Slider / Drawer” scripts are available, but buggy. Reworking them for Unity Plugin v2.3.
Surface Texture Haptics (sliding the hand across the surface lets you feel vibrations / bumps) is currently not possible.
Sliding objects in the hand based on the grip strength is currently not possible.
Integration with Unity’s Animator system for hand animation, though our HandPose can output finger flexions as values between 0 and 1.
macOS support - this has to do with SenseCom’s back end rather than Unity. When SenseCom becomes compatible with macOS, so will the Unity Plugin.
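Although direct Animator integration is not yet available, the normalized flexion values mentioned above can be fed into an Animator by hand. The sketch below assumes a `GetNormalizedFlexion` accessor returning five 0–1 values (thumb through pinky) and an Animator float parameter named "IndexFlexion"; both names are placeholders, so check SG_HandPose and your own Animator Controller for the actual ones.

```csharp
using UnityEngine;
using SG; // SenseGlove Unity Plugin namespace (verify against your plugin version)

public class FlexionToAnimator : MonoBehaviour
{
    public SG_TrackedHand trackedHand; // provides the latest hand pose
    public Animator handAnimator;      // Animator with one float parameter per finger

    void Update()
    {
        // Hypothetical accessors: pose retrieval and per-finger flexion values (0..1).
        if (trackedHand != null && trackedHand.GetHandPose(out SG_HandPose pose))
        {
            float[] flexions = pose.GetNormalizedFlexion(); // [thumb, index, middle, ring, pinky]
            handAnimator.SetFloat("IndexFlexion", flexions[1]);
        }
    }
}
```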