Today we take touchscreens for granted and they no longer seem like a big deal, but not long ago they were a technological revolution that thousands of people found difficult to get used to. Now they are the norm, and a device without a touchscreen seems technologically outdated.
As always happens in this sector, companies and researchers are working to find the next big way for us to interact with smart devices. Voice has emerged as a very viable option: with virtual assistants like Siri or Alexa, we can now ask for numerous tasks to be done for us without lifting a finger.
But technology always aspires to more, and even if it seems impossible right now, a growing number of projects let us interact with these devices without saying or moving anything at all, using only our gaze.
That is how Apple described the new feature it presented last Wednesday to mark Global Accessibility Awareness Day, designed so that people with disabilities are not excluded from technology.
The Eye Tracking tool, powered by artificial intelligence (AI), allows users to navigate their iPhone or iPad with their gaze alone. To do this, it uses the device's front-facing camera together with on-device machine learning to interpret what the user wants to do just by looking.
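Apple has not said how Eye Tracking works under the hood, but its public ARKit framework gives a rough idea of what camera-based gaze estimation can look like. The following Swift sketch is purely illustrative: the GazeTracker class is our own invention, it assumes a device with a TrueDepth front camera, and it shows the general technique rather than Apple's actual implementation.

```swift
import ARKit

// A minimal sketch of camera-based gaze estimation using ARKit face
// tracking. Illustrative only; not Apple's Eye Tracking implementation.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth-capable front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called every frame with updated anchors from the front camera.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is ARKit's rough estimate, in face-local
            // coordinates, of the point the two eyes converge on.
            let target = face.lookAtPoint
            // A real system would project this through the camera
            // transform onto screen coordinates; here we just log it.
            print("Estimated gaze target:", target)
        }
    }
}
```

From an estimate like this, a system can work out which on-screen element the user is looking at, which is the building block the rest of the feature needs.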
Undoubtedly, what is most striking, and what will be a huge achievement on Apple's part if it works well, is that it requires no extra devices or accessories to function. That is very different from the technology of Neuralink, Musk's company, which has enabled a man to control a screen with his thoughts, but only after implanting a chip in his head, and, as we told you last week, failures have already been reported.
Users can navigate through the elements of an app and use Dwell Control to activate each one, accessing additional functions such as physical buttons, swipes, and other gestures with their eyes alone.
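Apple does not document the internals of Dwell Control either, but the underlying idea, triggering an action when the gaze rests on one spot long enough, is simple to sketch. In the Swift snippet below, the DwellDetector class, the 0.8-second threshold, and the 40-point radius are all hypothetical values chosen for illustration, not Apple's.

```swift
import Foundation
import CoreGraphics

// A minimal sketch of dwell-based activation, fed by a hypothetical
// stream of gaze points in screen coordinates.
final class DwellDetector {
    private let dwellTime: TimeInterval = 0.8  // how long the gaze must rest
    private let radius: CGFloat = 40           // how still it must stay (points)
    private var anchor: CGPoint?
    private var anchorTime: Date?
    var onDwell: ((CGPoint) -> Void)?          // fires when a dwell completes

    // Feed one gaze sample per frame.
    func update(with point: CGPoint, at time: Date = Date()) {
        if let a = anchor, hypot(point.x - a.x, point.y - a.y) <= radius {
            // Gaze is still near the anchor; fire once the dwell time elapses.
            if let t = anchorTime, time.timeIntervalSince(t) >= dwellTime {
                onDwell?(a)
                anchorTime = nil  // don't re-fire until the gaze moves away
            }
        } else {
            // Gaze moved: restart the dwell timer at the new location.
            anchor = point
            anchorTime = time
        }
    }
}
```

In a real interface, the callback would hit-test the dwell point against the element under it and activate a tap, a button press, or a swipe, which is the kind of interaction Apple describes for Dwell Control.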