Videoconferencing, podcasts and webinars grew in popularity during the pandemic years of 2020 and 2021 as remote working became part of the new normal. With the pandemic now in the rearview mirror, video communication technologies have shown no sign of slowing down.
What has been amusing to me is that despite the prevalence of video communication, very little attention has been paid to how unflattering we appear on camera, often through weak, low-resolution webcams. Bad lighting is undoubtedly a big problem on video calls made from home. The sub-HD webcams built into most laptops, even high-end ones, don’t help.
Without the production assets available in a professional television studio, politicians, celebrities, and industry experts often come across poorly when interviewed remotely from their homes.
Regular videoconferencing calls from home are particularly vulnerable to an “amateur hour” look and feel, especially during a formal presentation, where wandering eyes (i.e., not looking directly into the webcam) can distract the viewer.
The location of the webcam is responsible for this unwanted effect, as the camera is usually integrated into the top of the laptop panel or mounted on a separate stand that is difficult to position in front of a desktop display.
Because typical videoconferencing on a desktop or laptop PC lacks proper teleprompter functionality, which is complex, bulky, and expensive, it is nearly impossible to read speaker notes while avoiding the annoying phenomenon of a terrible webcam angle that looks up or down your nose.
Is there a quick fix for the eye contact problem?
There are a few ways to mitigate this problem in a typical desktop or laptop home setup. However, these approaches are stopgaps at best and do not eliminate the problem.
Some companies offer small external webcams, often without an integrated microphone to reduce the size of the device, that allow placement in the center of your screen, in front of any text content you may be reading or the viewing window of the video app you are using.
These cameras attach with a thin wire that wraps and clips over the top of the display. This way, you look directly into the webcam and can still see most, though not all, of the presentation or text you’re presenting.
Yet another method uses a clear piece of acrylic plastic that lets you mount almost any webcam and hook it over the top of the display so that the webcam is suspended in front of the display’s center point.
The advantage of this approach is that it frees you up to use your preferred webcam. The downside is that the webcam and the acrylic mount often obscure a good portion of the screen, making it less useful as a teleprompter alternative.
Down the road, we may see laptop and PC displays with webcams integrated behind the LCD panel, invisible to the user. While this would be an ideal solution to the problem I described above, the downside is that these specialized displays would cost a lot, and most manufacturers will hesitate to offer them because of the price elasticity implications.
AI can fix eye contact issues easily and cost-effectively.
The idea of using artificial intelligence to reduce or eliminate eye contact problems during videoconferencing calls is not new. When done correctly, AI can eliminate the need to purchase the expensive teleprompting equipment that television studios use or to resort to some of the gimmicky methods mentioned above.
The challenge with employing AI to perform eye contact correction on the fly (live) or even in a recorded scenario is that it requires processor horsepower to do the heavy lifting.
Apple has had this capability integrated into its iPhone silicon for a few years now. Many users don’t know that Apple’s FaceTime app includes eye contact correction (which can be turned off), which makes your eyes appear to look directly at the camera even when your gaze is on the screen, regardless of the iPhone’s orientation.
Eye Contact settings in Apple’s FaceTime app
Microsoft has also joined the AI party to fix eye contact problems. Last year, it announced that it would add an eye contact correction capability to Windows 11 by leveraging the power of Qualcomm’s Arm and neural processing unit (NPU) silicon to enhance video and audio in meetings, with features that include subject framing, background noise suppression, and background blur.
Many of these features are already available on Microsoft’s Surface Pro X device, which uses an Arm chip. Moreover, Microsoft will be deploying this functionality more widely this year on compatible models from major PC OEMs.
Nvidia Broadcast with Eye Contact
Nvidia’s Broadcast app, which works on a wide range of Nvidia discrete graphics cards, is a robust AI tool that improves video calls and communications on x86-based PCs. Last week, Nvidia enhanced the utility in version 1.4 to support its implementation of Eye Contact, which makes it appear as though the subject within a video is looking directly at the camera.
The new eye contact effect adjusts the speaker’s eyes to recreate eye contact with the camera. This capability is achieved by harnessing the AI horsepower in Nvidia’s GPUs to accurately capture and align the gaze.
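To make the mechanics more concrete, here is a minimal sketch of the first step such an effect has to perform: locating the iris and measuring how far the gaze deviates from center. This is not Nvidia’s actual code, whose internals aren’t public; it is a hypothetical illustration that assumes Python with the OpenCV and MediaPipe packages installed, and a production effect would go further by re-synthesizing the eye region to remove the measured offset.

```python
import cv2
import mediapipe as mp

# refine_landmarks=True adds the iris landmarks (indices 468-477).
face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=True, max_num_faces=1, refine_landmarks=True)

def gaze_offset(frame_bgr):
    """Return a horizontal gaze offset in [-1, 1] for one eye,
    where 0 means the iris sits centered between the eye corners."""
    results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None  # no face found in this frame
    lm = results.multi_face_landmarks[0].landmark
    outer, inner, iris = lm[33], lm[133], lm[468]  # one eye's corners and iris
    span = inner.x - outer.x
    if not span:
        return None
    # Normalize the iris position along the corner-to-corner axis.
    return ((iris.x - outer.x) / span - 0.5) * 2.0

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
cap.release()
if ok:
    print("horizontal gaze offset:", gaze_offset(frame))
```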
The new Eye Contact effect in Nvidia Broadcast 1.4 moves the speaker’s eyes to simulate eye contact with the camera. (Image credit: Nvidia)
The advantage of Nvidia’s approach is that the capability isn’t limited to a single videoconferencing platform or app. Apple supports its eye contact correction only in the iPhone’s FaceTime app. However, I wouldn’t be surprised if Apple expands this capability to macOS users later this year via its Continuity Camera feature.
In addition, Nvidia Broadcast provides a Vignette effect that is on par with what many Instagram app users experience. Combined with a subtle background blur, it produces an AI-simulated bokeh look on your webcam, instantly boosting visual quality.
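The technique behind this kind of blur is straightforward to sketch: segment the person from the background, blur everything else, and blend the two. The snippet below illustrates the general approach using MediaPipe’s selfie-segmentation model on the CPU; it is an assumption-laden stand-in, not Nvidia’s GPU-accelerated implementation.

```python
import cv2
import numpy as np
import mediapipe as mp

# model_selection=1 picks the landscape-oriented segmentation model.
segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)

def blur_background(frame_bgr, kernel=55):
    """Keep the person sharp and Gaussian-blur everything else.
    kernel must be odd; larger values approximate a stronger bokeh."""
    mask = segmenter.process(
        cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)).segmentation_mask
    mask3 = np.dstack([mask] * 3)  # 1-channel mask -> 3 channels
    blurred = cv2.GaussianBlur(frame_bgr, (kernel, kernel), 0)
    # The mask is ~1.0 over the person and ~0.0 over the background.
    return (frame_bgr * mask3 + blurred * (1.0 - mask3)).astype(np.uint8)

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
cap.release()
if ok:
    cv2.imwrite("blurred.jpg", blur_background(frame))
```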
Replacing background images on videoconferencing calls is nothing new. Still, Nvidia’s approach will likely deliver better quality as it harnesses the power of its graphics cards, which are optimized for video content creation and gaming.
Closing thoughts
The Eye Contact feature in Nvidia’s Broadcast app is currently in beta form and not yet ready for deployment. Like any beta feature, it will suffer from the inevitable glitches, and we should delay a formal judgment of its quality until a production version is made available.
Furthermore, Nvidia Broadcast isn’t just a run-of-the-mill app; it is also an open SDK whose features can be integrated into third-party apps, opening up interesting new ways for those applications to take direct advantage of Nvidia Broadcast’s functionality.
Despite this, I am baffled by some of the adverse reactions that have appeared over the past few years around the prospect of using AI to correct eye contact. Some technical analysts use phrases such as “creepiness factor” to characterize this feature in the most unflattering way possible.
In fact, the capability would invite many, perhaps deserved, jokes if the resulting effect looked unnatural and artificial. However, the “creepy” designation sounds over the top and pretentious. The same objection could be made about using makeup or deploying advanced equipment that corrects audio deficiencies during video calls. Apps like TikTok or Instagram wouldn’t exist without filters, which produce images that are far from creepy, in my opinion.
Like it or not, videoconferencing has survived as one of the positive outcomes of the post-pandemic world. Using technology that facilitates more productive, compelling and impactful video calls is something we should welcome, not scorn.
As someone who produces a weekly video podcast and recognizes the potential of reducing or even eliminating wandering eyes, which in turn can offer teleprompter-like advantages, I look forward to testing this much-needed capability in the weeks to come.