VR and Remote use different algorithms to generate gaze data. Please refer to our ‘Eye Tracking’ page.
- VR: 9 points. It takes 20-30 seconds on average.
- Remote: 7 points. It takes 14 seconds on average.
However, it may take longer if the camera cannot find your pupil because your eye is outside the camera's field of view.
We have applied it to the Gear VR and DeeponM2.
It's possible to integrate it with other VR HMDs, but this requires additional development.
Please contact us for more information.
It mostly depends on the AP (application processor) of the device. The minimum requirement is USB 3.0 and a Snapdragon AP.
However, the frame rate varies by device specification.
Below is a list of our tested devices. The eye-tracking FPS is 50~60 on these devices.
- Galaxy Tab S6
- Galaxy Tab S5 Lite
- Galaxy Tab S5e
- LG G8
If you use other device model, please contact us.
Fitting the device may be difficult if the user's face is too small or the user cannot control their head movement freely.
If you share detailed information about your project targeting infants or elderly people, we will provide additional guidance.
- VR: No, but we can provide an add-on prescription lens.
- Remote: Yes. However, depending on the camera angle, tracking quality may degrade if the glasses frame covers your pupil or the lenses have a dark tint.
- VR: The left and right eye images are each 640×480.
- Remote: The full image size is 1280×720.
- VR/Remote: We send you the updated SW file manually.
- Gaze Analysis: It updates automatically because it is a web-based solution.
You can utilize two data metrics.
- Face missing: Detects whether the camera has found the user's face.
- OutOfScreen: True = gaze on the screen, False = gaze out of the screen.
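As a minimal sketch of how these two metrics might be used together, the snippet below filters raw gaze samples down to usable on-screen points. The field names (`face_missing`, `out_of_screen`) are illustrative assumptions, not the SDK's actual identifiers.

```python
# Hypothetical gaze-sample filtering using the two metrics above.
# Per the FAQ's mapping, out_of_screen == True means the gaze IS on the screen.

def valid_on_screen_samples(samples):
    """Keep only samples where the face was found and the gaze is on screen."""
    return [
        s for s in samples
        if not s["face_missing"] and s["out_of_screen"]
    ]

samples = [
    {"x": 100, "y": 200, "face_missing": False, "out_of_screen": True},   # usable
    {"x": 0,   "y": 0,   "face_missing": True,  "out_of_screen": False},  # no face
    {"x": 300, "y": 150, "face_missing": False, "out_of_screen": False},  # off-screen
]
print(len(valid_on_screen_samples(samples)))  # 1
```

Dropping face-missing and off-screen samples before analysis avoids counting invalid frames toward dwell time or heatmaps.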
- VR: The distance between the IR camera and the pupil is very close because the TrueGaze VR Kit attaches inside the VR HMD.
- Remote: The 6.5 cm standard TrueGaze Remote Kit tracks at distances between 30 cm and 60 cm. However, the range can be extended to 1 m or longer with a custom-made kit. If you are interested in a custom-made Remote Kit, please contact us.
1.5 ms is the latency of the eye-tracking algorithm computation; latency for data transmission and reception is excluded.
No. iOS devices use a different interface for connecting to our Remote Kit, which does not allow USB OTG.
We provide a Fixation API. It distinguishes fixation data from saccade data.
We are planning to develop more APIs, so we would appreciate feedback on which APIs you would like to use for your service.
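To illustrate what "distinguishing fixation from saccade" generally means, here is a sketch of dispersion-threshold (I-DT) classification, a common technique for this task. This is not SeeSo's Fixation API implementation; the function name and thresholds are assumptions for demonstration only.

```python
# Illustrative dispersion-threshold (I-DT) fixation detection.
# A fixation is a run of gaze points that stays within a small spatial window;
# everything else is treated as saccade.

def classify_fixations(points, max_dispersion=30.0, min_length=3):
    """Label each (x, y) gaze point 'fixation' or 'saccade'.

    max_dispersion: max (x-range + y-range) of a fixation window, in pixels.
    min_length: minimum number of samples for a window to count as a fixation.
    """
    def dispersion(win):
        xs = [p[0] for p in win]
        ys = [p[1] for p in win]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    labels = ["saccade"] * len(points)
    i = 0
    while i + min_length <= len(points):
        j = i + min_length
        if dispersion(points[i:j]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered.
            while j < len(points) and dispersion(points[i:j + 1]) <= max_dispersion:
                j += 1
            for k in range(i, j):
                labels[k] = "fixation"
            i = j
        else:
            i += 1
    return labels

# Two tight clusters separated by one fast jump (the saccade sample):
gaze = [(100, 100), (102, 101), (99, 100), (250, 250),
        (400, 400), (402, 401), (399, 400)]
print(classify_fixations(gaze))
```

Running this on the sample trace labels the two clusters as fixations and the jump point as a saccade.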
The main reason for locating the camera at the bottom is to capture the user's pupil image well.
Our algorithm is not affected by camera location, so the camera can be installed at the top of the device screen. However, a top-mounted camera is less likely to capture the user's pupil.
Please contact us. We will arrange a demonstration so you can experience our solution.
We support 2D image/video, and 360° image/video.
No. Please use the VIVE or VIVE Focus integrated with the TrueGaze VR Kit & SDK.
If you want to use other VR HMD model, please contact us.
No. We support iOS, Android, and the Unity Engine (Android).
Please check our documentation (https://docs.seeso.io/).
Each time a user opens your app counts as a session. To prevent overcharging, we limit billing to a maximum of 10 sessions per user per day.
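The billing rule above can be sketched as a simple per-user, per-day counter. The class name, cap constant, and counting logic here are illustrative assumptions, not SeeSo's actual billing code.

```python
# Illustrative session counting: each app open is one session, and at most
# 10 sessions per user per day are billable (the cap stated in the FAQ).

from collections import defaultdict

MAX_BILLABLE_SESSIONS_PER_DAY = 10

class SessionCounter:
    def __init__(self):
        # (user_id, date) -> number of sessions opened that day
        self._counts = defaultdict(int)

    def record_open(self, user_id, date):
        """Count one session; return True if it is billable, False if capped."""
        self._counts[(user_id, date)] += 1
        return self._counts[(user_id, date)] <= MAX_BILLABLE_SESSIONS_PER_DAY

counter = SessionCounter()
billable = [counter.record_open("alice", "2024-01-01") for _ in range(12)]
print(billable.count(True))  # 10 -- opens beyond the cap are not billed
```

Even if a user opens the app twelve times in one day, only the first ten sessions count toward billing.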
SeeSo utilizes the front-facing RGB camera for eye tracking.
For accurate gaze analysis, we recommend a minimum of 20 lux (roughly a dim hotel hallway).