Like many of you, when “Star Trek: The Next Generation” (TNG) came out, I was fascinated by the holodeck concept. For many of us, this became the bar for the upcoming metaverse or 3D web implementation.

The holodeck was a photorealistic virtual environment built on the concept of hard light, which could render solid objects out of light (that's a thing in the Star Trek universe) to provide entertainment for the crew. Starships that spend months or years away from their home port need some form of entertainment. Large Soviet-era submarines carried swimming pools for the same reason.

Although not new, the concept of creating a virtual world never became real outside of games built primarily for entertainment. There were simulations for more practical purposes, such as military training, going back decades, but only a small minority of people ever experienced them. That limited exposure, coupled with what were then substantial cost constraints, kept the show's writers from taking the technology where it logically should have gone. The error is obvious now that we are actively working to recreate holodeck-like experiences.

Let's take a look at how TNG got the holodeck technology wrong, or at least didn't deploy it as widely as it would be deployed in real life. Then we'll close with our product of the week, a phone and smartwatch service from Gabb Wireless that will keep your kids, and maybe some of our older adults, safer.

Simulation-to-interface optimization

The problem with TNG's holodeck technology didn't occur to me while watching the show, either initially or on rewatching. It came to me while watching various keynotes at Lenovo's virtual Tech World event last week.

Lenovo has arguably the best suite of tools for creating commercial interfaces into constructs like the metaverse. It showed a set of deep relationships with core technology providers that should help the company execute where mixed reality is used, such as holodeck-like, VR-based video conferencing offerings. Unlike Meta's prototypes, these products include legs.

Lenovo's tools include a variety of glasses and conference/huddle-room offerings built around superior avatars: participants are scanned in real time with a 3D scanner, creating an experience far closer to a holodeck than the approach pioneered by Facebook, which uses cartoon-like characters.

It is somewhat similar to the holographic Doctor in "Star Trek: Voyager," who, with a unique mobile emitter, could not only exit areas equipped with holo-emitters but could leave Voyager entirely.

In several episodes of both series, the holodeck not only recreated the bridge and control interfaces of various ships but also fooled participants into thinking they were not in the holodeck at all.

So, what’s wrong?

Well, if you could build anything with hard light, including people, why would you have fixed interfaces on a ship, and why would you be limited to a live crew?

How the Metaverse Could Change Human-to-Machine Interfaces

We've often talked about how the big AI revolution will eliminate the need to learn how to use tech-based tools. As we've seen with AI-based artists and writers, users just need to be able to describe the result they want. If they want a paper on a particular topic, they summarize the assignment, and the AI generates the written result. Or they describe what they want in a picture, and, again, the AI creates it.

Now fast forward hundreds of years in time to the “Star Trek” stories.

Doesn't this mean that human-machine interfaces throughout the Enterprise would be hard light-based, dynamically changing to address both the unique needs of the operator and the situation, and potentially redundant because the AI would already be doing automatically much of what the crew does?

Physical Drones vs. Hard Light Human Digital Twins

"Star Trek: Discovery" recently demonstrated the use of drones, and TNG had androids like Data. But why would you need massive staffing levels on starships if you could create digital crew members indistinguishable from humans?

Also, if you can create complex objects virtually, why wouldn't you have control interfaces that adapt to the situation rather than being fixed? And given that you can place crew almost anywhere, why would you put them on the top deck, in a vulnerable position at the skin of the ship, instead of in a central, armored position within the ship?

I'm pointing this out because, often, with new technology, we first emulate how we used to do things. Then, over time, we break from those old constructs and eventually adapt around the new technology. As we move into the metaverse, we talk about the concept of digital twins, but what if we only need the twin and not the actual physical device?

For example, if you're working from within the metaverse, your interface can be whatever you imagine it to be, assuming the supporting hardware is capable enough. You wouldn't need to build an actual office, cubicle, or even a PC. All of them could be rendered virtually, and metaverse technology would connect what you did in the metaverse to an accurate result in the real world.

Let's say you are writing a paper, attending a meeting, doing research, or even creating a new product. Metaverse technology could provide options you wouldn't have in the real world, plus a better interface with other technologies, like 3D printers, that can take what you imagine and create it in the real world.

In your personal life, you might live in a small house, but in the metaverse you could have a giant digital mansion that requires far less upkeep than a real one. You could remodel it just by describing what you want changed: no contractors, little cost, and no long-term mess as a result.

Wrapping up: Lenovo may be the first to grok it

The reason I called out Lenovo is that it's the only multinational vendor aggressively productizing a metaverse-ready tool kit, and it has the most breadth, from AR- to VR-based industrial headsets.

Lenovo's CTO, Yong Rui, was also the most candid about the various elements and directions of this technology. He explained that the metaverse is evolving not into a fixed place but into a hybrid of the physical and virtual worlds, one that is both very different from and very similar to "Star Trek," and that is where the technology is now going.

Rui spoke of four technical layers: orienting the user, describing the virtual environment, improving the realism in textures, and the rules governing the environment or object semantics.

Now think of the application on warships, vehicles, or factories. You could use non-physical, virtual, and voice-based interfaces that would never wear out or require repair or maintenance (beyond the electronics), and that could be changed without moving users from where they are working, even if they are working from home.

This is the future I think we are headed for, with Lenovo in a leading position as a solutions provider. It would transform the future "Star Trek" represented from large ships crewed by humans into much smaller ships with virtual interfaces, crewed primarily by advanced digital AI constructs, potentially eliminating the dangers for red-shirt wearers.

In short, this future is closer than what's going on inside the "Star Trek" holodeck suggests, and the sooner the industry gets it right, the faster our progress will be toward the hybrid virtual world Lenovo talked about last week.

Technical Product of the Week

Gabb Phone

I don't think kids should be given unsupervised smartphones. There are plenty of predators hunting for kids, and kids usually aren't trained or equipped to deal with these bad actors. At a Qualcomm mobile event last week, the company showcased a service and device solution from Gabb Wireless that I found fascinating.

Speaking of the company's motivation, one of the executives told the story of his young daughter and inappropriate photos sent to her by an older boy, which clearly harmed her.

While most products designed to protect children focus on the needs of parents, meaning children often refuse to use them, Gabb combines features that serve both the child who uses the device and the parent who wants to keep that child safe.

Content on the phone is protected, curated, and under parental control, but there are still plenty of games and distractions for the child to access. The music service doesn't just blank out bad words; it excludes songs with inappropriate language entirely.

Using gamification, Gabb has also created phone and smartwatch apps that encourage exercise and promote positive behavior. Plus, the phone protects the child from predators and from content that could harm them if viewed unsupervised.

In the end, I think children are not only our most important resource, but our most valuable resource as well, and a service that protects them is an obvious candidate for my Product of the Week.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.