While Netscape didn’t invent the Internet or HTML, it was the company that made the Internet real. Netscape ran with Tim Berners-Lee’s HTML and was instrumental in turning it into something that would change the world.

Last week at Siggraph, Nvidia’s opening keynote identified Universal Scene Description (USD), developed by Pixar, a Disney subsidiary, as the HTML equivalent for the metaverse. Given that Pixar wouldn’t exist without Steve Jobs, it’s like putting Pixar where Berners-Lee was and Nvidia where Netscape was; but unlike Netscape, Nvidia is very well run and knows how to pick its battles.

Nvidia also talked about the future of the metaverse, where avatars will become browser-like, creating a whole new level of human/machine interface. It also announced the concept of neural graphics, which relies heavily on AI to create more realistic metaverse graphical elements with far less work.

This week let’s talk more about what happened at Siggraph — and how Nvidia and Disney can, and should, demonstrate their strengths at the forefront of the Metaverse.

Then we’ll close with our product of the week: HP’s halo product, the Dragonfly laptop, which has just reached its third generation. Halo products showcase the full capabilities of their maker and draw people to the brand, and this one is well positioned against the best from Apple.

Metaverse and Disney

I’m a former Disney employee and I can’t think of any other company on the content side that would be a better base for building the Metaverse.

Disney has always been about fantasy and trying to make magic real. While the firm has had problems maintaining its innovative leadership over the years, it still out-attracts its peers across all age groups, especially youth, with both magical physical places to visit and film content.

It is tempting to think that the concept of the multiverse, as illustrated by the Marvel universe (also owned by Disney), could easily become a metaverse creation, and that as the metaverse moves into the consumer market, Disney could become an even more powerful driver of this new technology for entertainment.

That’s a long way of saying that, given its relationship with USD and entertainment, Disney may be the best-positioned media company to take advantage of this new paradigm and turn its version of the metaverse into something truly amazing. Imagine the potential of metaverse Disney parks that kids can enjoy from their homes during extreme weather events, pandemics or wars.

Nvidia’s One Metaverse Movement

Right now, the metaverse is a mess. It appears that companies like Meta and Google are creating walled-garden experiences much like those CompuServe and AOL offered at the dawn of the Internet, experiences the market ultimately did not want.

The reason those walled-garden efforts didn’t survive is that no single company can meet the needs of every user. Once they gave way to the open Internet, the technology really took off, and AOL and CompuServe largely faded into history.

Nvidia CEO Jensen Huang is a big believer in the metaverse. He refers to it as Web 3.0 – the successor to Web 2.0 (the Internet as we know it today, with its shift to the cloud and user-generated content). This concept of a general metaverse, with elements you can move between seamlessly, requires a great deal of standardization and advancement in physical interfaces like VR goggles.

Huang addressed this during the keynote, speaking of massive advances in headset technology that will eventually bring VR goggles much closer to the size and weight of reading glasses, making them less tedious and annoying. However, recalling the industry’s problems with 3D glasses, it will still need to address consumers’ overwhelming dislike of prosthetic interfaces if the effort is to reach its full potential.

One of the most interesting parts of the presentation was the concept of neural graphics: graphics enhanced significantly by AI, which reduces the cost and increases the speed of scanning things in the real world and turning them into mirror images in the virtual world. At the event, Nvidia presented about 16 papers on neural graphics, two of which won awards.

Building on Pixar’s Universal Scene Description, Huang explained how, once these virtual elements are created, they can be linked via AI to ensure they remain in sync with the real world, enabling complex digital twins that can be used for extremely precise simulation for both business and entertainment purposes.
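To make the USD discussion more concrete, here is a rough sketch of what a scene can look like in Pixar’s human-readable .usda text format. The prim names and values below are invented for illustration; a real digital twin would reference detailed meshes, materials and physics data:

```usda
#usda 1.0
(
    doc = "Illustrative fragment only; prim names and values are invented"
)

def Xform "FactoryFloor"
{
    def Sphere "SensorHousing"
    {
        double radius = 0.5
        double3 xformOp:translate = (2.0, 0.0, 1.0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because every prim is typed and addressable by a path (here, /FactoryFloor/SensorHousing), tools from different vendors can open, layer and edit the same scene, which is what makes USD a plausible HTML analog for the metaverse.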

This made me wonder how long it will be before an avatar of Huang, rather than Huang himself, delivers the keynote. Given the progress in avatar realism and emotion, there will come a time when avatars are far better at such presentations than humans.

On that point, Huang introduced a concept called Audio2Face, which combines a voice track with an avatar to create realistic facial expressions that convey emotion and are often indistinguishable from those of a human actor.

To do this realistically, Nvidia mapped facial muscles and then let the AI learn how people manipulate those muscles for different emotions, including the ability to edit those emotions after the fact. I have no doubt that the kids of tomorrow will have a lot of fun with this, and that it will create some deeply murky issues we will need to address in the future.

Audio2Face, MDL (a content definition language) and NeuralVDB, which can reduce file sizes by up to 99%, together create a pattern of increased resolution and realism while reducing the overall cost of the effort.

Back to Disney: this technology could allow the company to create more compelling streaming and theatrical content while reducing its production budget, which would be huge for its top and bottom lines.

Finally, Huang talked about a cloud publishing service for Avatars called Omniverse ACE. This could potentially open up a market for avatar creation, which in itself could be a highly profitable new tech industry.

Wrapping Up

With its tremendous stake in USD and content that spans age groups, Disney is in a unique position to benefit from our move into the metaverse.

However, the technology company to watch in this space is Nvidia, which is at the forefront of creating this Web 3.0 metaverse that will fast-forward the Internet as we know it and provide us with amazing new experiences – and undoubtedly new problems we haven’t identified yet, much like the Internet did.

In their respective fields, both Nvidia and Disney are forces of nature, and betting against either company has proven unwise. Together, they are creating a metaverse that will surprise, entertain and help solve global problems like climate change.

What is being built for the metaverse is simply amazing.

We are at the forefront of another technological revolution. When it is done, the world will become a blend of the real and the virtual and will be forever changed once again.

Technical Product of the Week

HP Elite Dragonfly G3

Halo products are expensive and somewhat exclusive offerings that often show what a company can do, regardless of price.

The HP Elite Dragonfly G3 is the third generation of this Halo product, and it’s a relatively affordable showcase of HP’s laptop capabilities.

Lighter than most of its competitors, including the MacBook, sporting the latest 12th Gen Intel Core processors, and promising up to 22 hours of battery life (video), this 2.2-pound laptop is an impressive piece of kit.

HP Elite Dragonfly G3 Notebook

HP Elite Dragonfly G3 | image credit: HP

Interesting features include a mechanical privacy shutter for the 5MP front-facing camera that is activated electronically from the keyboard.

The laptop comes in a unique Slate Blue finish that I think looks awesome. This latest generation was designed for the new hybrid world many of us now live in, where we work from home but sometimes have to go into the office.

It has Wi-Fi 6E for better wireless connectivity and supports 5G WAN for times when Wi-Fi is either too insecure or simply unavailable.

The Elite Dragonfly G3 has a distinctive 3:2 aspect ratio instead of the more typical widescreen display. The latter may be better for films, but 3:2 is better for work, and laptops in this class are expected to focus more on content creation than on entertainment. The taller screen also enabled a large touchpad that includes a fingerprint reader for security.

The ports on this unit, which has a 13.5-inch display, are surprisingly complete for one of the thinnest laptops I’ve tested. In addition to two USB-C Thunderbolt ports, it has a full-size USB port and a full-size HDMI port, both of which are unusual, though not unheard of, in a laptop this small and light.

HP Elite Dragonfly G3 Ports

HP Elite Dragonfly G3 Right-Side Ports | image credit: HP

The product is relatively durable, using a magnesium/aluminum frame that is largely from recycled metals and designed to be recycled again as the laptop gets older.

In conclusion, it is potentially one of the most secure laptops in its class, with the Wolf Pro Security option for those who want extra protection. Interestingly, starting at just $2,000, the Wolf Security edition is also one of the most affordable.

I was at the launch of HP’s first Dragonfly laptop and I am very impressed with this offering which is my product of the week. I’m going to hate giving this laptop back.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.

This week is Siggraph 2022, where Nvidia will deliver one of the main keynotes.

While the metaverse is mostly discussed on the consumer side, it has industrial uses outside of gaming, which has effectively included metaverse-like instances for years. At Siggraph, Nvidia will talk about its leadership in integrating AI into the technology, the creation and application of digital twins, and successes in major new robotic factories, such as the one made by BMW with the help of Siemens.

But what I find even more interesting is that, as metaverse tools like Nvidia’s Omniverse become more consumer-friendly, the ability to use AI and human digital twins will let us create our own worlds, worlds where we dictate the rules and where our AI-powered digital twins can simulate real people and animals.

At that point, I expect we’ll need to learn what it means to be gods of the worlds we’ve created, and I doubt we’re at all prepared, both for the addictive nature of such products and for the way these virtual metaverse worlds could form the basis of our own digital immortality.

Let’s explore the capabilities of the Metaverse this week, then we’ll end with our product of the week: the Microsoft Surface Duo 2.

Siggraph and the AI-Powered Metaverse

If you’ve played multiplayer video games like Warcraft, you’ve experienced a rudimentary form of the metaverse. You’ve also found that objects that behave as they do in the real world – like doors and windows that open, leaves that move with the wind, and people who act like people – largely don’t exist there yet.

With the introduction of digital twins and physics through tools like Nvidia’s Omniverse, this is changing, so that reality-dependent simulations, such as those used to develop autonomous cars and robots, work accurately and reduce potential accidents without putting real humans or animals at risk, because those accidents initially happen in a virtual world.

At Siggraph, Nvidia will talk about the metaverse’s current capabilities and its near future, where, for a time, the money and the greatest capabilities will be tied to industrial, not entertainment, use.

For those purposes, the need to make an observer feel as though they are in the real world is largely reduced, outside of simulations intended to train people. But training humans is also a goal of simulations, and creating human digital twins will be an important step forward in our ability to use the AI of the future to handle the ever-increasing amount of repetitive and annoying portions of our workloads.

It is my belief that the next major breakthrough in human productivity will be the ability of regular people to create digital twins of their own that can perform an increasing number of tasks autonomously. Auto-fill is a very, very early milestone on this path that will eventually allow us to create virtual clones of ourselves that can cover for us or significantly increase our reach.

Nvidia is at the forefront of this technology. For anyone wanting to know what the metaverse can do today, attending the Siggraph keynote, at least virtually, should be on your to-do list.

But if we project 20 or so years into the future, given the enormous pace of development in this space, our ability to immerse ourselves in virtual worlds will increase, as will our ability to create them: worlds where physics as we know it is optional, and where we can choose to put ourselves in “God Mode” and walk through the virtual spaces we’ve created as their ultimate rulers.

Immersion Is Important

While we’ll have intermediate stages using prosthetics with more advanced forms of haptics to make us feel immersed in these virtual worlds, it’s efforts like Elon Musk’s to create direct brain-machine interfaces that will make the real difference.

By connecting directly to the brain, we should be able to create experiences that are indistinguishable from the real world and place us far more realistically in these alternate realities.

Meta Reality Labs is researching and developing haptic gloves to bring the sense of touch to the metaverse of the future.

Yet, as we gain the ability to create these worlds ourselves, configuring these connections to provide more realism (e.g., experiencing pain in battle) will be optional, allowing us to walk through encounters as if we had superpowers.

Human Digital Twins

One of the biggest problems with video games is that NPCs, no matter how good the graphics, use very limited scripts. They don’t learn, they don’t change, and they’re barely more capable than the animatronics at Disneyland.

But with the creation of human digital twins, we will gain the ability to populate the created world with more realistic citizens.

Imagine being able to offer the digital twin you create to others and license it for use in the companies or games they build. These NPCs would be based on real people, would respond more realistically to change, could potentially learn and grow, and would not be tied to your gender or even your physical appearance.

For example, what about a talking dragon based on your digital twin? You could also populate a created metaverse world with a large number of clones altered to look like a diverse population, including animals.

Practical applications will include everything from virtual classrooms with virtual teachers to police and military training with virtual partners against virtual criminals – all based on real people, providing the ability to train with an unlimited number of realistic scenarios.

For example, for a police officer, one of the hardest things to train for is a domestic disturbance. These encounters can go sideways in any number of ways. I know of instances where a police officer stepped in to protect an abused spouse and was then attacked by that same spouse, who suddenly decided to defend her husband from the officer.

Today I read a story about a rookie who approached a legally armed citizen on his own property. The officer tried to subdue a civilian who had broken no laws and was nearly shot; the civilian was fully prepared to kill the officer had it come to that. The officer was fired for it, but he could have died.

Being able to train in situations like this can help ensure the safety of both the civilian and the officer.

Wrapping Up: God Mode

Anyone who has ever played a game in God Mode knows that it destroys much of the game’s value. Yes, you can burn through the game in a fraction of the time, but it’s like buying a book and then just reading a comprehensive summary, spoilers and all. Most of the fun of a game is figuring out puzzles and working through challenges.

“Westworld” explored what might happen if virtual people, created to emulate humans, found out they had been abused. To be realistic, these creations would need to emulate the full range of pain, suffering and emotion, and it’s more than a remote possibility that they could go off script.

However, another possibility is that people fully immersed in God Mode may lose the ability to differentiate between what they can do in the virtual world and in the real one. That could result in some very bad behavior in the real world.

I think we will find a clear delineation between those who want to create viable worlds and treat them benevolently, and those who want to create worlds that let them explore their darkest fantasies and hidden desires.

This could be a way to determine whether someone has the right personality to be a leader, since it would be easy to abuse power in a virtual world, and a tendency to abuse power should be a big red flag for anyone going into management.

We’re still decades away from this potential, but we should start thinking now about the limits of using this technology for entertainment, so that we don’t create a significant group of people who can’t tell real people from the virtual ones they mistreat in the twisted worlds they will build.

What kind of metaverse god would you be?

Technical Product of the Week

Surface Duo 2

I’ve been using the Surface Duo 2 for several months now, and it remains my favorite phone. I’m amazed at how many people have come up to ask me about it and then, once I shared what it can do, said they wanted to buy one.

The phone has huge advantages when consuming emails with attachments or links. The attachment or link opens on the other screen without interrupting the flow of reading the email that delivered it. The same applies when opening a website that requires two-factor authentication: the authentication app appears on the other screen, so you don’t have to go back and find the screen you were working on, again preserving the workflow.

For reading, it holds like a book, with two virtual pages, one on each screen. While I thought I might have problems watching videos because of the gap between the screens, I have been watching videos across both screens for some time now, and unlike with dual-screen monitors, where I find the separation annoying, the gap doesn’t bother me at all.

Ideally, this phone works best with a headset or smartwatch, such as the Apple Watch, so you can talk and listen, as it’s awkward to hold this form factor up to your head. However, many of us use the speakerphone feature on our smartphones anyway, and the Surface Duo 2 works fine that way.

In the end, I think this shows why a revolutionary device – as the iPhone was and the Surface Duo 2 is – needs a lot of smart marketing so that people really understand the benefits of a different design. Otherwise, they won’t get it.

Remember that the iPhone design, which emulated the earlier, failed LG Prada phone, was backed by a lot of marketing, while the Prada, despite its initially strong luxury brand, was not.

Nonetheless, Microsoft’s Surface Duo 2 remains my favorite smartphone. This is really awesome – and my product of the week.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.

For most of us, the metaverse is mostly hype about the promise of a new internet we can explore virtually. As currently implemented, the metaverse is reminiscent of the pre-Netscape Internet: a group of very different and unique efforts that look more like walled gardens than today’s open Internet.

Implementations range from useful – like those built on Nvidia’s Omniverse – to promises of “something” from Meta (formerly known as Facebook) that, at least for now, mostly disappoint. I suspect that disappointment stems more from inflated expectations than from any sluggishness on Meta’s part. This is a common problem with new technologies: expectations run ahead of reality, and people are then underwhelmed by the results.

Now, with last week’s announcement of the Metaverse Standards Forum, it looks like the industry is finally addressing the metaverse’s bigger problem: the lack of interoperability and Internet-like standards that could allow a far more seamless future metaverse.

Let’s talk this week about how important this movement is. Then we’ll close with our product of the week: a mobile solar solution that could help avoid the ecological and power-outage problems that states like California and Texas can expect as climate change makes their electric grids less reliable.

The Current Metaverse

Currently, the metaverse isn’t so much one thing as it is a lot of things.

The most advanced version of the metaverse today is Nvidia’s Omniverse. The tool is used to design buildings, train autonomous robots (including autonomous cars), and form the foundation for Earth-2, which is designed to better simulate and predict the weather, both to provide early warning of major weather events and to design potential responses to global climate change.

While many people think the metaverse will grow to replace the Internet, I doubt it will. The Internet organizes information relatively efficiently, and moving from a text interface to a VR interface could slow down data access without any offsetting benefit.

The Metaverse is best for simulation, emulation, and especially for tasks where the use of virtual environments and machine speed can solve critical problems more quickly and accurately than existing alternatives. For those tasks, it is already proving itself valuable. While it will likely develop into something more like the holodeck in “Star Trek” or the virtual world depicted in the movie “The Matrix,” it hasn’t yet.

What We Need Now

What we can do now is create photorealistic environments that can be explored virtually. But we can’t yet make realistic digital twins of humans to populate the metaverse. We can’t yet instrument the human body so you can experience the metaverse as if it were real, and our primary interface, VR goggles, is big and bulky, making the 3D glasses the market previously rejected look good by comparison.

These problems are not cheap or easy to fix. If they had to be solved separately for each metaverse instance, the evolution of the metaverse, and our experience of it, would be decades, not years, away.

What is needed is the level of cooperation and collaboration that built the Internet, now focused on building the metaverse, and that is exactly what emerged last week.

A Who’s Who of Founding Members

The formation of the Metaverse Standards Forum directly addresses this interoperability and standards problem.

Meta and Nvidia are both on board, along with a who’s who of tech companies – except for Apple, a firm that generally wants to go it alone. Heavy hitters like Microsoft, Adobe, Alibaba, Huawei, Qualcomm and Sony are participating, along with Epic Games (whose metaverse promise is a future where you can play in a digital twin of your home, school or office).

Existing standards groups including the Spatial Web Foundation, the Web3D Consortium and the World Wide Web Consortium have also joined.

Hosted by the Khronos Group, membership in the MSF is free and open to any organization, so expect companies from many industries to be listed. Forum meetings are expected to begin next month.

This effort should significantly increase the pace of progress for the metaverse and make it more useful for more things, from the industrial applications Nvidia is deploying successfully today to a future where we can use it for everything from entertainment and gaming to creating our own digital twins and, potentially, digital immortality.

Wrapping Up: The Metaverse Grows Up

I hope that the formation of the Metaverse Standards Forum will accelerate the development of the Metaverse and move it towards a common concept that can interoperate between providers.

While I don’t believe it will ever replace the Internet, I do think it could evolve into an experience in which, over time, we live and play for much of our lives, potentially enriching those lives significantly.

I envision virtual vacations, more engaging remote meetings, and video games that are more realistic than ever, all due to better collaboration and an effort to set standards that will benefit the mixed reality market as a whole.

The metaverse is coming and, thanks to the Metaverse Standards Forum, it will arrive faster and better than it otherwise would have.

Technical Product of the Week

Sesame Solar Nanogrid

Those of us who live in states where electricity has become unreliable due to global warming and poorly planned electrical grids expect some serious problems in extreme weather.

Companies and institutions have generator backups, but gas and diesel shortages are on the rise. So not only are these generators likely to be unreliable when used for extended periods, they are anything but green and would exacerbate the very climate-change problem they are supposed to mitigate.

Sesame Solar has an institutional solution to this problem: a large solar-generating trailer that also carries a hydrogen fuel cell to generate electricity at night or on cloudy days.

The trailer can also process and filter local water, which can relieve residents from weather or crisis-related water shortages.

Sesame Solar appears to do a better job of mitigating power outages without producing the greenhouse gases that would exacerbate the problem. As a result, the Sesame Solar Nanogrid is my product of the week.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.