The metaverse is still being shaped and defined, and it means different things to different people. Get ready for mixed reality worlds, virtual music and 360-degree news reporting.

Despite no-one really knowing what form it will eventually take, the metaverse is already big business. And, as with every new and exciting technology, it’s arguably better to be on board the train – even if it’s slow and you’re not entirely sure of its destination – than stuck at the station.

There is already huge investment taking place, with entire countries betting millions on the metaverse’s potential. South Korea and Spain are offering grants to kickstart the nascent industry – to the tune of $177 million and $4 million respectively. The city of Shanghai alone has allocated $1.5 billion in order to cultivate a metaverse industry, estimated to be worth 350 billion yuan ($52 billion) by the end of 2025.

Despite these bold investments, we’re currently at the foundations stage when it comes to building the metaverse. An open standards forum has only just been formed, not only to help ensure interoperability, but to prevent any one company from attempting to own the space. It’s still early days and those companies dipping their toes into the metaverse often have different ideas of what it is and how it will benefit them.

The Vizrt metaverse is Unreal

Norwegian real-time visualisation company Vizrt has yet to fully embrace the metaverse as a buzzword marketing tool. But it’s been creating virtual worlds and digital doubles for several years. It came to many broadcasters’ attention by being the first to generate a ‘holographic’ live interviewee for CNN during the 2008 US presidential election.

Its suite of production tools is well known in the industry, and the company’s experience in rendering real-time visuals, combined with digital doubles and data-driven assets, places it at the confluence of broadcast imagery and virtual worlds. Viz Engine 5, for example, boasts tight integration with Unreal Engine 5 and enables the system to recreate huge environments using Unreal’s new Nanite virtualised geometry system.

As Gerhard Lang, Chief Technology Officer at Vizrt, explains: “When you’re talking about a large-scale metaverse environment, one of the promotions that we did for the Unreal 5 integration was the Washington Mall, which is a huge landscape with lots of elements, trees and buildings and streets and water and so on. This would have been a tedious task in previous versions of Unreal, because the level-of-detail generation that you needed to do, and the optimisations to make sure the shots you’re trying to render hold up for broadcast, would have been excessive. With Nanite this becomes way easier.”

One of the promotions Vizrt did for Unreal 5 was this digital version of the Washington Mall
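
Lang’s point is that, before Nanite, artists had to author and manage discrete level-of-detail (LOD) meshes by hand. As a rough illustration of that traditional workflow – not Vizrt’s or Epic’s actual code – here is a minimal sketch of distance-based LOD selection; the thresholds and mesh names are invented for the example.

```python
# Minimal sketch of classic distance-based LOD selection -- the manual workflow
# that Nanite's virtualised geometry is designed to replace. The thresholds and
# mesh file names below are illustrative, not taken from Viz Engine or Unreal.

# Pre-authored detail levels for one building asset:
# (maximum camera distance in metres, mesh file)
BUILDING_LODS = [
    (50.0,  "building_lod0.fbx"),   # full detail, for close-up shots
    (200.0, "building_lod1.fbx"),   # reduced polygon count
    (800.0, "building_lod2.fbx"),   # coarse silhouette for distant views
]

def pick_lod(camera_distance_m: float) -> str:
    """Return the mesh that should be rendered at a given camera distance."""
    for max_distance, mesh in BUILDING_LODS:
        if camera_distance_m <= max_distance:
            return mesh
    return BUILDING_LODS[-1][1]     # beyond the last threshold, keep the coarsest mesh

print(pick_lod(35.0))    # building_lod0.fbx
print(pick_lod(500.0))   # building_lod2.fbx
```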

However, he’s candid about the integration of Unreal 5, saying, “It’s just an option or an opportunity, but this is not the centre of what we would consider our building blocks of a metaverse [to be]. There’s more to it, especially when you think about collaboration. Part of a true metaverse is how you distribute the content that needs to be rendered at different places in an efficient way, and how you keep that synchronised. So, part of the whole equation is the asset management behind it. And I think we’ve been pioneering that – along with virtual studios – for many, many years.”

Lang suggests that a true metaverse experience will require substantial compute power, with a significant increase in cloud computing capabilities. “The bandwidth that you need needs to be super high,” he adds, “and the latency for this to be generated remotely and transferred back to you is also crucial – if this is too high, the experience becomes bad.”
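
To put rough numbers on that trade-off, the sketch below adds up the main contributors to perceived delay when a scene is rendered remotely: one frame of render time, the network round trip, and video encode/decode. The frame rate, codec times and 50ms tolerance are illustrative assumptions, not Vizrt figures.

```python
# Back-of-envelope latency budget for remotely rendered graphics. All figures
# are illustrative assumptions, used to show why round-trip time matters.

def remote_render_latency_ms(frame_rate_hz: float,
                             network_rtt_ms: float,
                             encode_decode_ms: float) -> float:
    """Rough end-to-end delay: one frame of cloud render time, plus the
    network round trip, plus video encode/decode on either end."""
    render_ms = 1000.0 / frame_rate_hz
    return render_ms + network_rtt_ms + encode_decode_ms

BUDGET_MS = 50.0   # assumed tolerance before interaction starts to feel sluggish

for label, rtt_ms in [("edge data centre", 8.0), ("distant cloud region", 90.0)]:
    total = remote_render_latency_ms(frame_rate_hz=60, network_rtt_ms=rtt_ms,
                                     encode_decode_ms=10.0)
    verdict = "fine" if total <= BUDGET_MS else "too slow"
    print(f"{label}: {total:.1f} ms -> {verdict}")
```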

“I think the metaverse has become such a catch-all term, that it starts to become a bit meaningless… Everything we do [at Condense] is about making experiences that can be done right now.” - Nick Fellingham, Condense

“It will also require some significant changes in how the broadcasters are working and how they are capturing video. Volumetric video is the tool of choice if you’re thinking about a studio environment, where you want to extract that person completely from where they are and then transport them in the right spot in the virtual studio that is rendered either in the cloud or your home PC. But I think going forward that might be a market that is totally worth exploring.”

Condense sees a musical metaverse

Condense is a Bristol-based startup, with the express intent of bringing live performances into the metaverse. It recently hosted a holo-booth at the Glastonbury festival, where attendees could dance in a makeshift motion capture stage and have their 3D avatars appear in a virtual world in real time.

“I think the metaverse has become such a catch-all term that it starts to become a bit meaningless,” suggests Nick Fellingham, CEO of Condense. “But for us, we’re just really focused on stuff that can be done today. And everything we do is about making experiences that can be done right now. Obviously, we have a vision for the future, but we don’t need to worry too much about what it’s going to be like in 10 or 15 years’ time and try and make these grand predictions. With our technology, you can stream real-life events into 3D applications today.”

Condense’s system employs an array of depth-sensing cameras – such as LiDAR or time-of-flight devices – to capture both the form and movement of a performer, distribute the data over the Internet, and recreate the performer as a polygonal 3D mesh, with support for Unity, Unreal and 3D web applications.

“Basically, what we end up with is exactly what game developers are used to,” says Fellingham. “It’s got lighting information, it’s got different information about the material properties. And as time goes on, we believe that you can use the material engines that exist inside video games, to get closer and closer to real life.”
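
Fellingham’s description suggests each streamed frame carries geometry plus the lighting and material inputs a game engine already understands. The sketch below shows one plausible shape for such a per-frame payload; the field names and byte estimates are assumptions for illustration, not Condense’s actual wire format.

```python
# Sketch of what one frame of streamed volumetric video might carry: a mesh
# plus PBR-style material inputs. Field names and sizes are assumptions for
# illustration only, not Condense's real format.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MaterialProperties:
    base_colour_texture: bytes      # compressed albedo map of the captured performer
    roughness: float = 0.8          # standard PBR inputs a game engine can light directly
    metallic: float = 0.0

@dataclass
class VolumetricFrame:
    timestamp_ms: int
    vertices: List[Tuple[float, float, float]]   # 3D positions of the mesh
    triangles: List[Tuple[int, int, int]]        # indices into `vertices`
    material: MaterialProperties

def payload_estimate_bytes(frame: VolumetricFrame) -> int:
    """Rough per-frame size: 12 bytes per vertex, 12 per triangle (32-bit
    indices), plus the texture -- useful for budgeting network bandwidth."""
    return (12 * len(frame.vertices)
            + 12 * len(frame.triangles)
            + len(frame.material.base_colour_texture))
```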

When you think about the broadcast industry and its application to the metaverse, it’s easy to think of 2D video being screened in the virtual world, but Fellingham believes this paradigm is redundant. “I think that the metaverse in its purest form is a 3D simulation that gives us a more natural way to consume the Internet. That’s the way I think about it. And given that 2D video doesn’t fit, 2D video will be phased out [in favour of volumetric video].

“It is early, nascent technology, but it’s the end state – there’s nowhere we go after this. We don’t suddenly end up needing 4D video. In fact, I like to think about it like this: even if we have the technology to pipe this content directly into our brains, it will still be 3D video going in, because that’s how our brains are designed, and that’s what feels natural for us. It’s an inevitability, and I want to be looking back in 20 years’ time saying our company contributed to the acceleration towards this new medium.”

The Adobe metaverse

Adobe, once a staunchly 2D-oriented company that held sway over illustration, desktop publishing and video editing, has for some time been slowly building its armoury of 3D applications. It began with the development of Adobe Dimension in 2017, followed by the acquisition of French software developer Allegorithmic in 2019, which brought the apps Substance 3D Painter and Substance 3D Designer into the Adobe fold. Since then Adobe has added 3D Sampler and 3D Stager to the suite, and its newest tool – Substance 3D Modeler – is currently in beta and soon to be released.

Substance 3D Modeler is a ‘digital clay’-type tool, enabling the creation of 3D assets using a combination of organic and hard-surface sculpting tools, and can be operated via a VR headset alongside a traditional desktop mouse or tablet.

As the real world is filled with manufactured products, so the world of the metaverse needs to be populated with virtual assets: buildings, vehicles, props, clothing… As big brands look to capitalise on the marketing potential of the metaverse, it’s no surprise that Adobe is seeing signups from the likes of Hugo Boss, Salomon, Coca-Cola, Unity, Nvidia and NASCAR.

The same asset created in the design process can – with a few tweaks – be used across multiple digital channels

“The emergence of the metaverse presents exciting opportunities for the fashion world,” said Sebastien Berg, head of digital excellence at Hugo Boss. “And Adobe Substance 3D tools are an integral part of our approach as we execute on our plan to develop 80% of our collections on a digital basis by the end of this year.”

The benefits of this digital pipeline are obvious: the same asset created in the design process can – with a few tweaks – be used in ads and marketing, deployed in AR experiences, dropped into metaverse locations or sold in virtual stores.
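
As a toy illustration of that “author once, publish everywhere” idea – and emphatically not Adobe’s Substance pipeline – the sketch below re-exports one master asset into the formats a web/AR viewer, a marketing render and a real-time scene might each expect, using the open-source trimesh library; the file names are hypothetical.

```python
# Toy version of the single-source asset pipeline: load one master model and
# re-export it for several channels. Uses the open-source trimesh library;
# file names are hypothetical and this is not Adobe's actual tooling.
import trimesh

# Master asset authored by the design team (hypothetical path).
master = trimesh.load("sneaker_master.glb", force="mesh")

# The identical geometry feeds several downstream channels.
master.export("sneaker_webar.glb")       # lightweight copy for web/AR experiences
master.export("sneaker_marketing.obj")   # handed to the rendering team for ads
master.export("sneaker_store.ply")       # dropped into a virtual-store build
```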

One might argue that Adobe was a little late to the ‘3D’ party, but its timing with regard to the advent of the metaverse could not be better. It’s also telling how much R&D it appears to be doing in AR experiences and retail tools.

Switching onto the TRT TV metaverse

At its Metaverse and Broadcasting forum in June, Türkiye’s public broadcaster TRT announced its TRT Metaverse project. The company – in conjunction with metaverse-oriented agency ME and ILLUSORR – is aiming to launch the world’s first public broadcasting metaverse platform.

The TRT Metaverse auditorium is an important gathering space for live events, where hosts and panellists present and engage live with audience members

The platform is designed as a virtual environment and immersive experience, and is a first step in transitioning the broadcast industry into the metaverse. Accessible across most devices – from VR headsets to mobile devices – the platform will host a range of virtual spaces, each serving a different purpose: auditorium, gallery, studio, retail and broadcast content. The viewer’s avatar can be guided around the futuristic 3D surroundings to watch live acts, attend events, examine artworks, shop, choose media channels to view, or just mingle in the lounge and communicate with others via voice and text chat.

The ability to network with other people and form bonds is a key pillar of the project, and stems from Türkiye’s historical location where east meets west. Another important aspect is the news studios, where people can engage directly with news content, selecting a specific country from a spinning globe, interacting directly with presenters, and being transported virtually to locations worldwide.

The market for virtual real estate was valued at $500 million in 2021, and looks to double in 2022.

As Faisal U-K, co-founder of ILLUSORR, explains: “Imagine instead of having to go on Twitter to see news or watch it on TV, you can enter the metaverse with your avatar, click on a certain channel, and be immersed in a live 360-degree feed in Rwanda, Egypt or Ukraine. You are there with reporters, experiencing the news in real time. Those are the kind of immersive experiences the metaverse is going to introduce to the conversation.”

Sports Metaverse

Backed by the likes of ex-YouTube CEO Chad Hurley, NBA champion Andrew Bogut and rapper Nas, Sports Metaverse is a digital world devoted entirely to sports. Created by Sports Icon, it’s intended as a place for sports fans to gather, chat and show off their NFTs. Future plans include mini-games, streamed sporting events, betting, retail stores, virtual stadiums and immersive experiences with the sports stars themselves.

Sports Metaverse has just completed its first land sale, in which plots of virtual real estate were available for purchase using the Ether cryptocurrency. With each land lot purchased, the owner gets a fan cave, complete with social areas, NFT shelving, pool table, TV and so on. Once bought, land can also be sublet to tenants.
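
Ownership of plots like these is typically recorded on-chain as an ERC-721 (NFT) token. The sketch below shows how such a record could be queried with the web3.py library; the RPC endpoint, contract address and plot ID are placeholders, not Sports Metaverse’s actual contract details.

```python
# Checking who owns a virtual land parcel, assuming it is an ERC-721 token.
# The endpoint, contract address and token ID are placeholders for illustration.
from web3 import Web3

# Minimal ABI fragment for the standard ERC-721 ownerOf() call.
ERC721_OWNER_OF_ABI = [{
    "name": "ownerOf",
    "type": "function",
    "stateMutability": "view",
    "inputs": [{"name": "tokenId", "type": "uint256"}],
    "outputs": [{"name": "owner", "type": "address"}],
}]

w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))        # placeholder node
land = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",          # placeholder contract
    abi=ERC721_OWNER_OF_ABI,
)

plot_id = 42                                                        # hypothetical land plot
print("Plot owner:", land.functions.ownerOf(plot_id).call())
```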

The Doggies are an NFT collection in The Sandbox comprised of 10,000 programmatically generated Snoop Doggs

The market for virtual real estate is already remarkably buoyant, with names like Snoop Dogg, Samsung and PwC piling into a market which was valued at $500 million in 2021, and looks to double in 2022.

And on the subject of Snoop Dogg, he’s building his own private metaverse – the Snoopverse. Created in The Sandbox (a decentralised, community-driven virtual world), it features a digital twin of his mansion in Diamond Bar, California, hosts members-only parties and acts as a venue for music concerts. And if that sounds like a crazy celebrity vanity project, consider that someone recently paid $450,000 for the plot of land next to Snoop’s virtual abode. Even in the metaverse, it’s all about location, location, location.

Panasonic MeganeX headset

VR headsets will be essential in realising the immersive nature of the metaverse, yet there are still only a handful of players in the consumer arena, including Meta, HTC (with its Vive range), HP and Sony. Having changed its name from Facebook to Meta, the company has left little doubt about where Mark Zuckerberg is placing his bets, and the latest headset in its Quest line is the Meta Quest Pro, a high-end follow-up to the Meta Quest 2.

Like its popular predecessor, the new headset offers both virtual and augmented reality, although rumours suggest it boasts a spec that puts it firmly in the professional and business category, with a price tag of around $800. The Meta Quest Pro is due in September 2022.

One of the newer entrants to the market is Shiftall – a subsidiary of Panasonic – with its ultra-lightweight MeganeX headset. At just 250g it’ll certainly be one of the most portable headsets out there, yet will still manage to deliver 5.2K 10-bit HDR images at 120Hz using micro OLED panels. The MeganeX is SteamVR compatible, with more platforms available after launch. It will join a number of other accessories aimed at VR and metaverse users, including the HaritoraX full-body motion tracking system and the PebbleFeel wearable, which cools or heats the wearer depending on the immersive experience.
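
Those display numbers also underline Lang’s earlier point about bandwidth. Assuming “5.2K” refers to roughly 2560 x 2560 pixels per eye across two panels – an assumption about the marketing figure rather than a confirmed spec – a quick calculation shows the raw, uncompressed pixel throughput involved.

```python
# Rough arithmetic on what a "5.2K", 120Hz, 10-bit headset implies for raw
# pixel throughput. Assumes roughly 2560 x 2560 pixels per eye, which is an
# assumption about the marketing figure, not a confirmed spec sheet.

WIDTH, HEIGHT = 2560, 2560        # assumed per-eye resolution
EYES = 2
REFRESH_HZ = 120
BITS_PER_PIXEL = 3 * 10           # 10-bit HDR per colour channel

pixels_per_second = WIDTH * HEIGHT * EYES * REFRESH_HZ
raw_gbit_per_second = pixels_per_second * BITS_PER_PIXEL / 1e9

print(f"{pixels_per_second / 1e9:.2f} gigapixels per second")    # ~1.57
print(f"{raw_gbit_per_second:.0f} Gbit/s before compression")    # ~47
```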

For more about the metaverse and the latest technology innovations that are powering it, discover IBC365’s metaverse articles.