It has been over five decades since Japanese roboticist Masahiro Mori developed a theory describing the eerie or uneasy feeling people experience in response to humanoid robots that closely, but not perfectly, resemble human beings.
Labeled the “uncanny valley” by Mori in 1970, the phenomenon has stood the test of time with more recent examples of creepiness filtering into the burgeoning fields of artificial intelligence, photorealistic computer animation, virtual reality, augmented reality, and increasingly lifelike androids.
Photo shows new 3D avatars Microsoft Corp. announced in November 2021 for launch in the first half of 2022 for immersive meetings on Microsoft Teams as it enters the metaverse race. (Photo courtesy of Microsoft Corp.)(Kyodo)
But what happens on the other side of the valley, as resemblance to humans is perfected? Some researchers worry that as "trusted" virtual humans become indistinguishable from real people, we open ourselves to more manipulation by platform providers. In other words, our responses while still in the uncanny valley, as creepy as they can be, could be a good thing — a kind of self-defense mechanism.
Japanese roboticist Masahiro Mori. (Photo courtesy of Masahiro Mori)(Kyodo)
Mori, now 94, a professor emeritus of the Tokyo Institute of Technology who retired in 1987, originally plotted his uncanny valley hypothesis in a graph, showing an observer’s emotional response against the human likeness of a robot.
He stated that as a robot’s appearance is made more humanlike, there is a growing affinity for it but only up to a point beyond which the person experiences a reaction of extreme disgust, coldness, or even fear, shown by a plunge into the valley.
But as the robot becomes more indistinguishable from a real person, positive emotions of empathy similar to human-to-human interaction emerge once more. The disconcerting void between “not-quite-human” and “perfectly human” is the uncanny valley.
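The shape Mori described can be sketched numerically. The function below is purely illustrative — the curve, the valley's location near 80 percent human likeness, and its depth are assumptions chosen for demonstration, not measured data from Mori's graph:

```python
import math

def affinity(likeness: float) -> float:
    """Illustrative uncanny valley curve (not empirical data).

    `likeness` runs from 0.0 (clearly artificial) to 1.0 (perfectly
    human). Affinity rises with likeness, but a narrow dip centered
    near 0.8 pulls the response negative, mirroring the plunge in
    Mori's 1970 graph before recovery at full human likeness.
    """
    rising = likeness                                          # growing affinity
    valley = 1.6 * math.exp(-((likeness - 0.8) ** 2) / 0.005)  # the dip
    return rising - valley

# Affinity climbs, plunges below zero near 0.8, then recovers:
curve = [round(affinity(x / 10), 2) for x in range(11)]
print(curve)
```

Plotting such a curve reproduces the qualitative story: steady gains in affinity, a sudden plunge for the "not-quite-human," and recovery at "perfectly human."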
With tech companies led by Mark Zuckerberg’s Meta Platforms Inc. staking a claim on the creation of a metaverse — viewed as the internet’s next iteration “where people can work and socialize in a virtual world” — some experts say the uncanny valley graph is just as pertinent in immersive environments, including in VR and AR.
While we have become accustomed to interacting with “low-fidelity versions of human faces going back to the early days of TV,” we will have the ability to project photorealistic humans in 3D virtual worlds before the end of this decade, Louis Rosenberg, a 30-year veteran of AR development and CEO of Unanimous AI, recently told Kyodo News in an interview. How will we determine what is real?
“Personally, I believe the greatest danger of the metaverse is the prospect that agenda-driven artificial agents controlled by AI algorithms will engage us in ‘conversational manipulation’ without us realizing that the ‘person’ we are interacting with is not real.”
Supplied image shows an English version of Japanese roboticist Masahiro Mori’s original graph of the “uncanny valley,” which shows movement amplifies the emotional response of subjects. The word “familiarity” in the graph is now translated as “affinity.” (Copyright Indiana University Associate Professor Karl F. MacDorman)(Kyodo)
In a corporate-controlled metaverse featuring “virtual product placement,” we could easily think we are simply having a conversation with a person like ourselves, causing us to drop our defenses. “You won’t know what was manipulated to serve the agenda of a paying third-party and what is authentic.”
This is dangerous because “the AI agent that is trying to influence us could have access to a vast database about our personal interests and beliefs, purchasing habits, temperament, etc. So how do we protect against this? Regulation,” Rosenberg said.
Mori himself has said designers should stop before the first peak of the uncanny valley and not “risk getting closer to the other side,” where robots — and now, by extension AI or AR — become indistinguishable from humans.
Applying his theory to the virtual world of the metaverse, he recently told Kyodo, "If the person (in the real world) understands that the space they are in is imaginary, I do not think this presents a problem, even if it is creepy."
But if the person is unable to distinguish reality from a virtual world, this itself will be a problem, he said, adding that the “bigger issue” is if bad actors misuse the technology for malicious purposes, comparing it to a sharp implement that can either be used as “as a ‘dagger’ to kill or a ‘scalpel’ to save someone.”
In her research, Rachel McDonnell, an associate professor in Creative Technologies at the School of Computer Science and Statistics at Trinity College Dublin, poses the question, “Should we tread softly across the uncanny valley” with virtual humans?
She says while virtual humans have almost reached photorealism, “their conversational abilities are still far from a stage where they are convincing enough to be mistaken for a real human converser.”
A longtime proponent of making virtual humans more realistic, she says the biggest dangers now are “AI-driven video avatars or deepfake videos, where convincing videos can be created of one human, driven by the motion and speech of another.”
But she adds: “Transparency around how avatars and videos are created will help overcome some of the ethical challenges around privacy and misrepresentation.” She gives an example of attaching a watermark to distinguish deepfakes from authentic video content.
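The watermarking idea can be illustrated with a toy sketch. The scheme below hides a tag in the least significant bits of raw pixel bytes; the function names and the LSB approach are my own illustrative choices, not a method from McDonnell's work, and real provenance systems rely on far more robust cryptographic techniques:

```python
def embed_watermark(pixels: bytes, mark: bytes) -> bytes:
    """Hide `mark` in the least significant bits of `pixels`.

    Toy illustration only: each bit of the watermark overwrites the
    lowest bit of one cover byte, leaving the image visually intact.
    """
    bits = [(byte >> i) & 1 for byte in mark for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover data too small for watermark")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite lowest bit only
    return bytes(out)

def extract_watermark(pixels: bytes, length: int) -> bytes:
    """Read `length` watermark bytes back out of the low bits."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit] & 1)
        out.append(byte)
    return bytes(out)
```

A tag embedded this way survives copying but not re-encoding, which is one reason production systems prefer signed metadata over simple bit-level marks.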
Rosenberg, meanwhile, outlines various forms of regulation to keep the metaverse safe, such as informing users when they are engaging with a virtual persona.
“It could be that they are all required to dress a certain way, indicating they are not real, or have some other visual clue. But, an even more effective method would be to ensure that they actually don’t look quite human as compared to other users.”
That is, regulation could ensure virtual humans trigger the “uncanny valley” response deep within our brains, he said. “This is the most effective path because the response within us is visceral and subconscious, which would protect us most effectively from being fooled.”
Meta, the social media giant formerly known as Facebook that has rebranded to focus on the metaverse, has been under fire in recent years for spreading disinformation, mishandling users’ data, and using algorithms that end up sowing discord and distrust on the internet, where users cling to their own “facts.”
On Dec. 9, Meta launched the cartoonlike Horizon Worlds to people 18 and older in the United States and Canada as Zuckerberg’s first attempt at his vision of an “embodied internet,” where avatars of real people will share a virtual space.
Christoph Bartneck, an associate professor at the University of Canterbury in New Zealand, says that the metaverse, a name taken from the 1992 sci-fi novel “Snow Crash” by Neal Stephenson, is not a new concept, and for now, merely fiction.
“It is a sign of a lack of originality that Facebook resorts to promise another virtual world. It seems like a gigantic distraction maneuver to take our attention away from all the bad influence that Facebook and its products have on society,” he said.
In 2021, Meta announced it would spend at least $10 billion on its metaverse division to create AR and VR hardware, software, and content. Other tech companies, including Microsoft and video game and software developer Epic Games, have jumped on the bandwagon, while Nike Inc. has launched Nikeland, featuring virtual sneakers, on video-game platform Roblox.
Unanimous AI’s Rosenberg says making the metaverse seem “uncanny,” i.e., not quite real, is easier than we think. “It turns out very small changes can make a big difference” by focusing on how our perception assigns authenticity to experiences.
British design and manufacturing company Engineered Arts’s Ameca is described as “the perfect platform to develop interaction between us humans and any metaverse or digital realm.” A recently unveiled AI robot with remarkably humanlike facial expressions, it appears astonished to be “awake” — perplexed and eerily amused.
Mark Zuckerberg announces the new name Meta of Facebook company during a Facebook Connect livestream displayed on a laptop screen in this illustration photo taken on Oct. 28, 2021. (NurPhoto/Getty/Kyodo)
“In the metaverse, the simplest thing — like how a virtual persona’s eyes move, or hair moves, or even just the speed of their motion (do they take longer to move than an actual human?) is enough to make them seem deeply unreal,” Rosenberg said, adding that regulation should require that artificial agents be distinguishable from others since this would be easy to achieve.
McDonnell, meanwhile, says she is still optimistic that realistic virtual humans will make a positive impact on society in a future metaverse. The benefits include preserving users' privacy in sensitive situations, such as for whistleblowers or witnesses testifying in court, and helping people overcome phobias, racial bias, and even conditions such as post-traumatic stress disorder.
“There is a huge potential for the use of virtual humans for good,” she said.
In experiments, her research team found that participants in survival task games "generally trusted" virtual agents that suggested a ranking of objects vital for survival in hypothetical crash scenarios, "but small manipulations of the agents' facial expressions or voice could influence the level of trust," she said.
The notion of the uncanny valley as a defense mechanism dates back to Mori, who in 1970 called it a "self-preservation instinct," protecting us not from lifeless objects that appear different from us but from things that are "exceedingly similar, such as corpses and related species," Karl F. MacDorman, an associate professor in the School of Informatics and Computing at Indiana University, noted.
As for Mori, who has said he never intended the uncanny valley to be a rigorous scientific theory but more a caveat for robotic designers, his message about the metaverse is simple.
“I hope (those) involved in creating it will make something healthy for the happiness of humanity,” Mori said.
Vilnius, Lithuania, Aug. 08, 2022 (GLOBE NEWSWIRE) — The FishVerse team is very excited to announce the launch of its revolutionary fishing ecosystem inside the metaverse. Built on blockchain technology, FishVerse is a truly decentralized AAA-type mobile fishing game where players can experience the closest thing possible to real fishing.
FishVerse was created by the MG Labs gaming studio, an experienced game development company that aims to make web3 games enjoyable and accessible for everyone. Since its establishment in 2021, the developer, which specializes in AAA-type blockchain games, has already launched MetaShooter, the first decentralized blockchain-based hunting metaverse. The game is live on Steam, the world's biggest gaming platform, and counted more than 8,000 users within the first month of its alpha launch.
While announcing FishVerse, the company explained its mission to create an “ultra-realistic, one-of-a-kind web3 game, where millions of fishing and P2E enthusiasts can enjoy playing from any device or corner of the globe.”
Besides offering a platform for gamers and fishing enthusiasts to enjoy their gaming experience to the fullest, the company also offers monetization opportunities. Players can earn passive income by catching and utilizing NFT fish, competing in tournaments, completing missions, building businesses, and more.
The development team understands the importance of the fishing market: it is enormous, fishing is one of the top outdoor activities in the world, and anglers are passionate about it. FishVerse intends to give them a unique decentralized fishing experience with the ability to earn while enjoying their beloved hobby.
How to Get Started
The team has ensured that prospective users won't struggle to use the new platform. To get started, each user is expected to do the following:
Create a user account on the fisher dashboard. After signing up, deposit some FVS tokens.
Create your character, either male or female. Take advantage of the customization option to make your character stand out with unique features and NFTs.
Finally, get a fishing ticket. With the NFT fishing ticket, you can start your fishing journey and earn rewards as you progress.
If you want to take your fishing to the next level, go ahead and:
Purchase NFT equipment. These gaming assets will help you to progress in The FishVerse more effectively.
Get an NFT tournament ticket. The ticket will allow you to access tournaments and compete for a prize fund.
Purchase NFT assets. Within the game ecosystem, you can start a business by providing in-demand services for other players to increase your revenue.
In the fishing metaverse, the developers offer tons of incentives to make the game not only exciting but also rewarding.
Players can receive tokens whenever they catch fish, complete trophy collections, or finish missions. They can also upgrade their equipment and participate in tournaments where successful participants can earn prize funds.
On offer are realistic games where participants can complete various exciting and rewarding missions and challenges that will hone their fishing skills and boost their morale.
You can also build businesses by offering a wide range of services to other players or hosting activities that lighten their moods and contribute to a memorable fishing experience in the metaverse.
Land Ownership and Other Opportunities
During your fishing expeditions, other players will need to recharge their stamina or their boats, as well as to get additional equipment or have it fixed. There's no better way to monetize that demand than by investing in ecosystem property. Purchase different tiers of NFT land in the metaverse and gain a place for fish breeding and a dock to recharge your fishing boat or those of other players. You'll also have a place to showcase your fish trophies.
You can also own an NFT repair shop where you’ll assist other players to fix their broken fishing equipment as well as manufacture fodder and fish bait for personal use. In your shop, you can personalize your fishing rod and any other NFTs.
And the best part is that this whole detailed ecosystem is powered by the FVS token, the main fuel of The FishVerse engine, alongside other token utilities that keep the economy healthy.
Workshops and Programs Will Discuss and Showcase the Innovations that Deliver Content to the Metaverse
CHICAGO, Aug. 8, 2022 /PRNewswire/ — SIGGRAPH 2022 delves into the evolution and advancement of technology for the metaverse. Several programs across the conference will explore the current state and future possibilities of this virtual universe. Conversations and workshops will assist attendees in creating content for the metaverse, while leading experts touch on metaverse-related ideas ranging from discussions on interoperability and workshops on 3D modeling to the potential challenges of this environment. The 49th annual conference will run 8–11 August in person, with on-demand sessions available virtually 25 July–31 October 2022.
The metaverse, once considered a hypothetical virtual world, is now an immersive network of 3D worlds focused on social interaction. Advancements in technologies, including augmented and virtual reality, have made it easier to connect users in both these remote and 3D worlds. But how does an interactive experience affect one’s life in the metaverse? What will happen to how people socialize, work, learn, and play? How does it affect various industries such as art, gaming, fashion, healthcare, or movies?
“I believe SIGGRAPH 2022 is the ideal place to talk about the metaverse, discuss what is happening now, and exchange ideas as to what is possible,” said Munkhtsetseg Nandigjav, SIGGRAPH 2022 Conference Chair. “Our community is made up of creators and innovators. We want to further these discussions and promote how this experience will potentially change lives. We contribute to the variations and transformations occurring in this virtual world, and we can empower these creators to make an impact for the better.”
Highlights that will cover the metaverse include:
[Frontiers Workshop] Challenges to Unlock the Metaverse: Haptics, Gaze, Prototyping Tools, & More!
Contributors: Pedro Lopes, University of Chicago; Michael Nebeling, University of Michigan; Shan-Yuan Teng, University of Chicago; Mark Billinghurst, Empathic Computing Lab, The University of Auckland; Yudai Tanaka, University of Chicago
Advances in augmented and virtual reality have paved the way for a new type of user interface that can connect users remotely via spatial interactions: the metaverse. This workshop will take a deep dive into some of the roadblocks to unlocking the potential of the metaverse, including integrating haptic sensations, integrating gaze and attention into user interfaces, and accelerating the prototyping of metaverse experiences.
[Panel] Privacy, Safety, and Wellbeing: Solutions for the Future of AR and VR
Moderator: Callie Holderman, Snap Inc.; Panelists: Eakta Jain, University of Florida; Michael Running Wolf, Northeastern University; and Liv Erickson, Mozilla
There have been many discussions about the metaverse as a construct and how it will be populated, while issues such as privacy and safety have received far less attention. This panel touches on these topics.
[Birds of a Feather] The Web3D Ecosystem and the Metaverse
Contributor: Anita Havele, Web3D Consortium
This session features a discussion on how technology contributes to the metaverse. From interactive real-time 3D to mixed reality and humanoid animation, everything done in 3D is significant to an open metaverse. See the scaling expertise in 3D, modeling and simulation, geospatial, augmented reality, and web audio toward an open, interoperable metaverse.
[Courses] Building the Open Metaverse: Part I
Contributors: Patrick Cozzi, Cesium; Marc Petit, Epic Games; Neal Stephenson, Lamina1; Rev Lebaredian, NVIDIA; Natalya Tatarchuk, Unity; Steve May, Pixar Animation Studios
This session features an introduction to the concepts and building blocks for the open metaverse, covering the current state and potential future directions, including 3D-first computing, interoperability, game engine ecosystems, the evolution of content creation, and scaling users and worlds. The themes of openness and collaboration are woven throughout all the topics.
[Immersive Pavilion] Journal of My Journey: Seamless Interaction in Virtuality and Reality with Digital Fabrication and Sensory Feedback
Contributors: Miguel Ying Jie Then, Ching Lui, Yvone Tsai Chen, Zin Yin Lim, Ping Hsuan Han, National Taipei University of Technology
Journal of My Journey explores the possibilities of integrating seamless interactions in virtuality and reality. The choices users make in the virtual world can be output to the real world, enhancing the connection between reality and the virtual world.
Access to the various metaverse presentations and workshops at SIGGRAPH 2022 is available in person and online. Learn more and register for the conference at s2022.SIGGRAPH.org/register.
About ACM, ACM SIGGRAPH, and SIGGRAPH 2022
ACM, the Association for Computing Machinery, is the world’s largest educational and scientific computing society, uniting educators, researchers, and professionals to inspire dialogue, share resources, and address the field’s challenges. ACM SIGGRAPH is a special interest group within ACM that serves as an interdisciplinary community for members in research, technology, and applications in computer graphics and interactive techniques. The SIGGRAPH conference is the world’s leading annual interdisciplinary educational experience showcasing the latest in computer graphics and interactive techniques. SIGGRAPH 2022, the 49th annual conference hosted by ACM SIGGRAPH, will take place as a hybrid event, with live events 8–11 August at the Vancouver Convention Centre and virtual content available starting 25 July through 31 October.
Today’s video focuses on Unity Software (U) and four things I am keeping my eye on as an investor during its earnings on Tuesday, August 8, after the market closes. The past few months have been a rollercoaster for Unity, from a merger announcement to an issue with its monetization solutions that caused the company to revise its guidance. Check out the short video to learn more, consider subscribing, and click the special offer link below.
*Stock prices used were the pre-market prices of August 8, 2022. The video was published on August 8, 2022.
Jose Najarro has positions in Unity Software Inc. The Motley Fool has positions in and recommends Unity Software Inc. The Motley Fool has a disclosure policy. Jose is an affiliate of The Motley Fool and may be compensated for promoting its services. If you choose to subscribe through his link, he will earn some extra money that supports his channel. His opinions remain his own and are unaffected by The Motley Fool.