Amid all the VR excitement, one under-recognized area of development is user input. How do you control and manipulate what’s on the screen? The same question applies to AR.

Of course, most high-end HMDs have hand controllers, but they vary in sophistication: everything from the Vive’s motion-tracked wands to the Rift’s (delayed) Touch controllers to Google Daydream’s recently unveiled Wii-like remote.

These will all develop fitting use cases… but what about hands-free input? That’s the area Eyefluence is addressing, as founder & CEO Jim Marggraff discussed on a recent episode of the Everything VR/AR Podcast (full episode embedded below).

The company focuses (excuse the pun) specifically on eye tracking. The technique has been used for years in market research and SEO design, such as tracking where people look on a search results page. In VR, it will be integrated directly into headsets as a standard interface tool.

If that sounds familiar, it’s because eye tracking is an evolution of the current method of manipulating VR environments: “dwell time.” In that scheme, a reticle fixed at the center of your view is steered by head tracking, and a push or click is emulated when it rests on one spot long enough.
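To make the pattern concrete, here’s a minimal sketch of dwell-time selection in TypeScript. It assumes a per-frame update loop and a caller that reports which target (if any) the head-tracked reticle is over; the onFrame signature and the one-second threshold are illustrative assumptions, not any particular headset’s API.

```typescript
const DWELL_THRESHOLD_MS = 1000; // how long the reticle must rest on a target

let dwellStart: number | null = null;    // when the current dwell began
let currentTarget: string | null = null; // target id under the reticle

function onFrame(timestampMs: number, hitTarget: string | null): void {
  if (hitTarget !== currentTarget) {
    // The reticle moved to a new target (or off all targets): reset the timer.
    currentTarget = hitTarget;
    dwellStart = hitTarget === null ? null : timestampMs;
    return;
  }
  // The reticle has stayed put; emulate a click once the threshold is reached.
  if (dwellStart !== null && timestampMs - dwellStart >= DWELL_THRESHOLD_MS) {
    console.log(`activate: ${currentTarget}`);
    dwellStart = null; // suppress repeat clicks until the reticle leaves and returns
  }
}
```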

Eye tracking takes that to another level, allowing far more control through the direct movement (and dwell time) of your eyes. This can be applied to everything from gaming, to helping people with severe paralysis communicate, to emergency responders whose hands are occupied.
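A hedged sketch of why this is more than an incremental step: the dwell loop above can stay exactly the same, and only the source of the pointing ray changes, from head pose to eye-tracker output. The EyeTracker interface and getGazeRay() below are hypothetical stand-ins, not a real SDK.

```typescript
interface GazeRay {
  origin: [number, number, number];    // eye position in world space
  direction: [number, number, number]; // normalized gaze direction
}

// Hypothetical eye-tracker surface; a real headset SDK would expose
// something comparable, but this exact shape is an assumption.
interface EyeTracker {
  getGazeRay(): GazeRay; // combined binocular gaze ray
}

// Return the id of whatever the eyes are pointing at. Feeding this into
// onFrame() above swaps head-tracked dwell for eye-tracked dwell with no
// other changes, which is why eye tracking reads as an evolution of the
// existing technique rather than a wholly new paradigm.
function pickTarget(
  tracker: EyeTracker,
  raycast: (ray: GazeRay) => string | null
): string | null {
  return raycast(tracker.getGazeRay());
}
```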

“Powerful hardware, coupled with a new interaction model will allow them to navigate, activate, pan, zoom and scroll with their eyes,” said Marggraff of potential AR uses for firefighters. “The effect this has on ability to save lives and improve basic performance is vast.”

Hear the full episode here.