NVIDIA Maxine Tested: You Can Finally Look People in the Eye When Video Conferencing (Without Actually Doing It)

Look at me when I talk to you

At Xataka we got an exclusive demo of the technology. For this we used a small internal application running on a laptop with its integrated webcam (an external webcam works just as well).
The application shows a split screen: the left side captures our face at all times, tracking how we move it and, above all, where we look. From there, artificial intelligence does the work of "creating", through augmented reality, eyes that always look at the camera.
NVIDIA's technology can emulate blinks and "reposition" the eyes and gaze when we look at other areas, and even when we turn our face slightly.
In one of our tests we pushed the system a bit and looked completely off-screen to see how it behaved; most of the time it adapted without problems, although occasionally it struggled for a moment.
When we did that, our eyes didn't appear to look slightly up or down (which is what would happen without this technology); instead they always appeared to look straight ahead, as if we were looking directly into our interlocutor's eyes.
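For the curious, this is roughly where such a step would sit in a capture pipeline. Below is a minimal sketch in Python with OpenCV; the redirect_gaze function is purely hypothetical, a stand-in for whatever Maxine's effect actually does to each frame (NVIDIA has not published that interface here).

```python
import cv2

def redirect_gaze(frame):
    # Hypothetical stand-in for Maxine's eye contact effect: a real
    # implementation would detect the eyes, estimate the gaze direction
    # and re-render the eyes looking at the camera.
    return frame  # placeholder: returns the frame unchanged

cap = cv2.VideoCapture(0)  # integrated or external webcam, as in the demo
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corrected = redirect_gaze(frame)
    # Split view like the demo app: raw capture on the left,
    # gaze-corrected output on the right.
    cv2.imshow("eye contact (sketch)", cv2.hconcat([frame, corrected]))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```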
We were also able to test one of the small companion applications of this development, which generated a mesh of our face and rendered a mask on the right side that mimicked our gestures and facial movements.
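That mesh demo can be pictured the same way: given per-frame facial landmarks, the mask is just those points redrawn on an empty panel next to the live image. The estimate_face_mesh function below is again hypothetical, standing in for the face-mesh feature in NVIDIA's SDK.

```python
import cv2
import numpy as np

def estimate_face_mesh(frame):
    # Hypothetical stand-in for the SDK's face-mesh feature: a real
    # implementation would return the (x, y) pixel positions of the
    # tracked mesh vertices for this frame.
    return []

def mask_panel(frame):
    # Draw the tracked mesh as a green-dot "mask" on a black panel,
    # mirroring the split-screen demo: real face left, mask right.
    mask = np.zeros_like(frame)
    for x, y in estimate_face_mesh(frame):
        cv2.circle(mask, (int(x), int(y)), 1, (0, 255, 0), -1)
    return cv2.hconcat([frame, mask])
```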
Audio effects: these artificial intelligence algorithms are designed to improve audio quality, for example by removing noise or room echo (a minimal illustration of the idea follows after the next item).
Augmented reality effects: this is where face, gesture and pose tracking comes into play, allowing the movements we make with our mouth or eyes to be recognized. The features on offer include the creation of a precise mesh of our face and the eye contact feature ("Eye Contact") that we were able to test.
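As promised above, here is a minimal illustration of the noise-removal idea. To be clear, this is not NVIDIA's algorithm (Maxine's audio effects are learned models); it is the classic spectral-gating baseline: estimate a noise floor, then mute frequency bins that don't rise above it.

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_gate(audio, sample_rate, noise_seconds=0.5, margin_db=6.0):
    # Classic spectral gating, far simpler than Maxine's learned models,
    # but it shows what "removing noise" means at the signal level.
    freqs, times, spec = stft(audio, fs=sample_rate, nperseg=1024)
    # Assume the first `noise_seconds` of the clip contain only background noise.
    noise_profile = np.abs(spec[:, times < noise_seconds]).mean(axis=1, keepdims=True)
    threshold = noise_profile * 10 ** (margin_db / 20)
    gate = np.abs(spec) > threshold  # keep only bins louder than the noise floor
    _, cleaned = istft(spec * gate, fs=sample_rate, nperseg=1024)
    return cleaned
```

Feed it a mono waveform whose first half second is pure room noise and it returns a copy with that noise gated out; Maxine's models pursue the same goal with far more finesse, and in real time.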
The question, of course, is when we will see this kind of improvement in our day-to-day lives. After the technical demo we had a question-and-answer session with Alex Qi, one of the people in charge of the Artificial Intelligence Software group at NVIDIA.
As Qi explained, the SDKs are already ready for companies and developers to use. It therefore remains to be seen whether platforms such as Zoom, Teams or Skype will integrate them into their services, the first key step toward being able to enjoy this option.
Qi told us that in any case they are still working on improving the eye contact feature, which naturally poses challenges: eye color, hair covering part of the face and eyes, and lighting conditions can all be problematic in some scenarios.
Even so, the software is able to work in all these conditions, and what remains is a matter of perfecting the algorithms. They work without problems if the user wears glasses, for example, although reflections in the lenses (if any) can pose yet another challenge.
The truth is that the state of this technology makes us wish it were available as soon as possible, but there are no estimated dates. That friend of mine, by the way, would be overjoyed.
