An industry view of presence and the metaverse

Published: Thu, 03/31/22

Presence News

from the International Society for
Presence Research
 

An industry view of presence and the metaverse

March 31, 2022


[The extent to which the views and concerns of entertainment industry leaders parallel those of presence scholars is striking, as demonstrated in this blog post from SMPTE (the Society of Motion Picture and Television Engineers). As in academic work, the terminology, the relationships among concepts, and the predictions for the speed and nature of technology evolution vary. But the same basic ideas are there, which suggests that academia and industry would both benefit from more communication and collaboration. –Matthew]

Meet the Metaverse

By Michael Goldman
March 29, 2022

In Chaitanya Chinchlikar’s opinion, the newest digital business buzzword—Metaverse—is a label that has all sorts of potential implications for the entertainment industry. Boiled down, he suggests the concept fits into the overarching goal that filmed entertainment content creators have pursued for generations—“to draw the viewer closer and closer into the content itself.”

Conceptually, the Metaverse involves allowing human beings to interact in realtime, both with each other and with content, for business, entertainment, education, pleasure, and, in some cases, mischief, entirely within a digital universe made of networks of inter-connected immersive virtual worlds. More simply, where entertainment is concerned anyway, “it is essentially the answer to the question, what’s next in content experiences,” Chinchlikar adds, meaning it is the logical next step in the evolution and practical use of cinematic virtual reality concepts for entertainment, taken to a new and exciting level.

Chinchlikar is the chief technology officer and head of emerging media for Whistling Woods International, a film and creative arts educational institute based in India. He is also a member of the SMPTE Board of Governors and actively involved in SMPTE’s Rapid Industry Solutions initiative, currently focused on the on-set virtual production revolution. He says there are multiple layers to what is possible with the concept, starting with the business-focused notion of taking immersive methods, tools, and ideas used for entertainment and expanding them into practical use for normal, everyday tasks and interactions, an idea that certainly includes new virtual methods for collaboration, including for content creation.

But when it comes to content itself, Chinchlikar insists the Metaverse is the logical next step in the industry’s exploration of the idea of immersion.

“If you study content over the last 50 to 60 years, almost everything we have done technologically has been to draw the viewer closer and closer into the content,” he says. “We have tried to make content more immersive, whether by making screens really large, better-quality projection, more lifelike colors, stereoscopy, digital surround sound, spatial immersive audio, and even physically moving and rattling a viewer’s seat. In other words, we have always tried to draw the audience into the content as much as cinema-style viewing would allow.

“Now, VR comes along, and that is the first opportunity where the technology itself lets us literally immerse the viewer inside the content by blocking out the real world and creating a virtual world that draws the attention of two of their primary senses—sight and sound. So, the Metaverse is essentially the next step of what you would say is ‘Cinematic VR’—you ‘do’ things in the real world while watching and experiencing what happens in the virtual world, usually by wearing a headset. By creating virtual worlds and having people use avatars to go inside them and interact with other people’s avatars, the Metaverse is letting us take those next steps.”

Chinchlikar therefore believes the Metaverse will eventually become “the fourth content consumption platform” available to consumers.

“We have cinema, we have regular television, we have OTT or digital streaming, and now we have immersive—whether you call it VR or the Metaverse,” he explains. “Cinema involves captive community viewing; television is non-captive family viewing, since you are typically doing it in your own home; OTT/streaming is non-captive individual viewing, and now we have the fourth one—captive individual viewing, but with extensive immersive qualities. That gives it a certain kind of impact that none of the first three can offer—it stimulates the immersion sense in your mind to the fullest.”

He further argues that “immersion is a sense” in and of itself, albeit one with different layers to it.

“There are multiple types of immersion,” he relates. “You have technical immersion, like being immersed in a swimming pool. You have cognitive immersion, where your mind is immersed, and cognitive immersion has been used by filmmakers for decades to create more impactful content. Cognitive immersion is further broken up into many parts. First, there is spatial immersion, where you see a movie or image and literally feel like you are part of the visual you are seeing. There is also temporal immersion, where you feel like you are actually part of the storyline you are consuming. Content with a strong temporal immersion focus has the most impact when it comes to twists or sudden changes in storylines that you’re not expecting, because your mind is literally part of what is going on. We also have spatio-temporal immersion, which is both. And finally, there is emotional immersion, where you have a major emotional reaction to what you are seeing.

“Immersive content in the Metaverse essentially takes these kinds of immersion and targets them to impact the mind as much as possible.”

Chinchlikar cautions that this form of interaction with a consumer’s psyche and emotions comes with potential hazards and may require safeguards as this form of content becomes more viable. He points to recent news coverage of an incident in the UK in which a woman charged that she was sexually harassed inside Meta’s (formerly Facebook’s) virtual platform—meaning, essentially, that her avatar was virtually groped by the avatars of strangers. Such incidents are frightening, to be sure, and obviously constitute virtual bullying, but they were likely not anticipated when the technology was developed; what to do about them is one of many areas of concern that need to be addressed as the technology matures, Chinchlikar opines.

“Things like this make the Metaverse very tricky, and we need to get around to dealing with such possibilities,” he says. “Obviously, we have to make sure that such experiences don’t negatively impact the mind too much or give people PTSD. And then, you also have to make sure there is little or no biological or physical strain put on the body, whether to the eyes, ears, or the neck from wearing a [weighted headset or helmet]. Also, we can’t misuse the six degrees of freedom that immersive experiences allow you. Someone with a history of vertigo, for instance, is likely to throw up with certain [virtual] experiences. So, there are safeguards that content creators need to keep in mind regarding the Metaverse. But the Metaverse itself is neither good nor bad. It’s like a knife. Give a knife to a surgeon, and he can perform life-saving surgery. Give it to a murderer, and he will kill someone.

“So, it is really up to content creators to ensure two things. First, that the Metaverse is not dangerous biologically or socially, and second, that content in the Metaverse engages people for the full duration for which they have asked for their attention, as opposed to offering a kind of pointless experience. Further, while it is not an absolute must, a lot of the success of engagement in the Metaverse will depend on how realistic-looking [photo-real] the content can be made.”

Bandwidth is another concern when it comes to building a virtual realm where high-end imagery, audio, and other key data can travel seamlessly down communication pipelines in realtime.

“A lot of companies can take a 360-video feed coming in, break it up into boxes or tiles, and do field-of-view-based rendering, so that whatever your eyes [are directed at] will be rendered in high quality, and everything [outside your field of primary focus] as you turn your head will be rendered in low quality,” he explains. “But current typical bandwidth doesn’t easily allow two-way, realtime interaction with [high-end rendering], so it is pretty much one way for right now. But we are seeing people stream live 6K or even 8K VR content over 4G networks—the Olympics last summer were streamed that way in VR, for instance. And there are a lot of innovative companies, like Tiled Media, that are already putting out interesting apps for high-quality VR streaming.”
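To make the tiling idea concrete, here is a minimal sketch of field-of-view-based quality selection, assuming a 360-degree equirectangular frame divided into a fixed grid of tiles. The grid size, the 100-degree field-of-view threshold, and all function names are illustrative assumptions, not anything described by Chinchlikar or used by any particular streaming vendor.

```python
import math

# Illustrative sketch only: divide a 360-degree equirectangular frame into
# a grid of tiles and pick a bitrate per tile based on the viewer's gaze.
# Tile counts, the FOV threshold, and the quality labels are assumptions.

TILE_COLS, TILE_ROWS = 8, 4    # 8x4 grid of tiles covering the full sphere
FOV_DEGREES = 100              # tiles within this angle of the gaze get high quality

def tile_center(col, row):
    """Yaw/pitch (degrees) of a tile's center on the equirectangular grid."""
    yaw = (col + 0.5) / TILE_COLS * 360.0 - 180.0      # -180 .. 180
    pitch = 90.0 - (row + 0.5) / TILE_ROWS * 180.0     #  90 .. -90
    return yaw, pitch

def angular_distance(yaw1, pitch1, yaw2, pitch2):
    """Great-circle angle (degrees) between two viewing directions."""
    y1, p1, y2, p2 = map(math.radians, (yaw1, pitch1, yaw2, pitch2))
    cos_angle = (math.sin(p1) * math.sin(p2) +
                 math.cos(p1) * math.cos(p2) * math.cos(y1 - y2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

def select_tile_qualities(gaze_yaw, gaze_pitch):
    """Return {(col, row): 'high' | 'low'} for the next segment request."""
    qualities = {}
    for row in range(TILE_ROWS):
        for col in range(TILE_COLS):
            yaw, pitch = tile_center(col, row)
            near_gaze = angular_distance(gaze_yaw, gaze_pitch, yaw, pitch) <= FOV_DEGREES / 2
            qualities[(col, row)] = "high" if near_gaze else "low"
    return qualities

# Example: viewer looking slightly up and to the right of straight ahead.
print(select_tile_qualities(gaze_yaw=30.0, gaze_pitch=10.0))
```

In a real player, the selected labels would drive which tile streams are requested for the next segment, so most of the available bandwidth goes to the tiles the viewer is actually looking at, which is the trade-off Chinchlikar describes.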

By the same token, great positives can emerge from the proper and controlled utilization of these virtual platforms, he suggests.

“Imagine teaching Egyptian history,” he suggests. “How great would it be to have a [child] put on a headset and have them walk around an ancient Egyptian city, see and hear people interact, examine [surfaces], and then walk down and see people dressed a certain way and meet a minister and a senator. You can put them on top of a pyramid and see a city below being formed. So, now, you are teaching them civics and urban planning, among other things. The idea is deep learning—deep impact of the content on the user. There are also all kinds of new business models for live sports and other events that can come out of it, like selling experiences to virtually sit at the [50-yard-line] at the Super Bowl, for instance.”

As Metaverse technology evolves, so will the need to sort out what kinds of standards should be developed and implemented to go along with it, Chinchlikar emphasizes. He suggests standardization work in the Metaverse will have to be “very specialized” moving forward, and adds that he is less concerned about traditional technical standardization work on things like interoperability, audio, video, and compression. All of that is “rather normal,” he points out, compared with the other challenges posed by a deep dive into immersive content. There, he suggests, the issues of security, safety, and privacy will need to be addressed to make industry-standard approaches in this new realm viable.

“You are wearing a display about three inches from your eyes, so the kind of biological safety that display needs to carry is unlike anything anybody has ever experienced before,” he says. “It’s almost the same as testing [medications] before releasing them. They have to know what the impacts will be.

“The second most important thing is going to be privacy. All VR headsets do iris tracking—they track your pupil dilation. Cognitive research will tell you that when the pupil is dilated the most, that is when the biggest amount of information [is captured]. We will have streaming VR services that are advertising based, and so, advertisers are going to [want this information]. They are going to start insisting on having their ads displayed when the customer’s pupil is dilated the most. Is that an invasion of privacy—trying to reach a consumer when you know he or she is at their emotional best or emotional worst? What should you be able to advertise to people who are in that state? Because this is a form of bio-hacking. It’s not just a Google or a Facebook tracking your phone to see what kind of stores you go into so they can tailor ads to [your interests]. This is hacking you when you are emotionally the most perceptive, feeding you advertising or political information during that time, because they have [data] that tells them when you are in that state. Only the consumer and the company whose [equipment the consumer is using] should know your [preferences], but will they start keeping a record of it?

“So, privacy is going to take on a whole new meaning when it comes to immersive content in the Metaverse. That’s why I think we will need some standards or rules there for sure.”
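As a purely hypothetical illustration of why he treats the pupil-tracking scenario above as a privacy issue rather than a routine analytics question, the short sketch below flags moments when a pupil-diameter signal rises well above its rolling baseline. The sample values, window size, threshold, and function name are invented, and no real headset SDK or advertising system is involved; the point is only that, once an application has this data stream, keying behavior to a viewer's most receptive moments takes very little machinery.

```python
from statistics import mean

# Hypothetical illustration only: flag samples where pupil diameter rises
# well above a rolling baseline. The data, window size, and threshold are
# invented; no real eye-tracking SDK is used here.

def dilation_peaks(pupil_mm, baseline_window=30, threshold=1.15):
    """Return indices where pupil diameter exceeds the rolling baseline by 15%."""
    peaks = []
    for i, value in enumerate(pupil_mm):
        start = max(0, i - baseline_window)
        baseline = mean(pupil_mm[start:i]) if i > start else value
        if baseline > 0 and value / baseline >= threshold:
            peaks.append(i)
    return peaks

# Fabricated 1 Hz samples (millimeters): mostly steady, with a brief spike.
samples = [3.1, 3.0, 3.2, 3.1, 3.0, 3.1, 3.9, 4.1, 4.0, 3.2, 3.1, 3.0]
print(dilation_peaks(samples))   # -> [6, 7, 8], the indices of the spike
```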

Chinchlikar adds that there will be other learning curves for artists and content creators generally before they can properly exploit this new medium—a medium that he estimates is only in “version 1.0” of its evolution.

“This is the first frameless medium for filmmakers,” he says. “So some core entertainment concepts will need to be rewritten as content creators no longer have the benefit of the frame. [Filmmakers] will also need to learn to direct your attention to certain areas. That’s a new skill for everybody—screenwriters, audiographers, directors, editors, cinematographers. People have to learn what ‘directing your attention’ even means for this medium. They will have to learn a lot more about cognitive science to [really master it].”

Still, Chinchlikar adds, no matter the obstacles or concerns, the genie is out of the bottle, so to speak, and the Metaverse is only going to grow and evolve in ways that he expects will fundamentally change how entertainment content is made and experienced.

“We know VR headsets will eventually get better,” he says. “Right now, it is sort of like having a shoebox on your head. It needs to get sleeker, lighter weight, higher quality, and be able to do that with no biological disturbances to the user. This has already started to happen. Just look at the difference between the earlier [virtual reality product] HTC Vive and the new HTC Vive Flow to see what a massive change in form factor, weight, and quality has already happened in a short time. When almost all immersive world devices are nothing more than lightweight wearable sunglasses, you will start to see the VR content consumption experience explode. And I have no doubt it will happen because this is, as I noted, a fourth avenue for people to create and therefore monetize content, and that should really enable media companies. And as Cinematic VR content consumption explodes, so will the use of the Metaverse.”

In any case, Chinchlikar expects cinematic VR tools to allow content creators to “expand their offerings and consumption” in the Metaverse. In particular, he points out that certain kinds of experiences are more successful in this medium than others, and those include what he foresees as a short-film renaissance of sorts, VR style.

“I expect the Metaverse to help the industry create new overlaps and blends of content it can monetize,” Chinchlikar relates. “It will finally allow the industry to monetize short films, for instance. The immersive experience is [best suited] for a 20-to-30-minute short fiction experience, more than a two-hour film because of [how intense it is]. I could be wrong, but that is my hypothesis. So, as we go further with this, I expect maybe five years from now, we will have four nice, smooth, parallel streams of content available to consumers.”

Earlier this year, Chinchlikar participated in a SMPTE panel with other industry experts that explored some of the needs, rules, and changes in content creation required for vibrant virtual production work—which you can view here. In 2021, he also took part in two panel discussions at both of that year’s VR/AR Association’s Global Summits on virtual production. You can find those discussions here and here.


 
 

Managing Editor: Matthew Lombard

The International Society for Presence Research (ISPR) is a not-for-profit organization that supports academic research related to the concept of (tele)presence. ISPR Presence News is available via RSS feed, various e-mail formats and/or Twitter. For more information please visit us at http://ispr.info.