Mixed Reality - The Future Of Computing?

Alex Kipman knows hardware very well. Since joining Microsoft 16 years ago, he has been the primary inventor on more than a hundred patents, including the groundbreaking Xbox Kinect motion-sensing technology, which paved the way for some of the features of his latest creation, a holographic 3D headset called HoloLens.

But today, sitting in his office at Microsoft headquarters in Redmond, Washington, Kipman isn't talking about hardware. He is discussing the relationship between humans and machines from a broader philosophical perspective. Whether we interact with machines through screens or through things we wear on our heads, for him it is all just a "moment in time."

Brazilian-born Kipman, a Technical Fellow in Microsoft's Windows and Devices Group, enthusiastically explains that a key benefit of technology is its ability to displace time and space. He cites the example of "mixed reality" (MR), Microsoft's term for the blending of the real world with computer-generated graphics. According to Kipman, it will one day seamlessly combine augmented and virtual reality. Among the most striking features of MR, he says, is its potential to unlock "superpowers of displacement" in the real world.

People place particular value on the feeling you get when you physically share space with another person. That is why Alice Bonasio of Fast Company decided to interview Kipman face to face. "But if you could have this kind of interaction without spending the time to travel," says Kipman, "life would be much more interesting." What follows is in his own words.

“My daughter can interact with her brothers in Brazil every weekend, and my employees don't need to travel the world to do their jobs,” he continues. “With the advent of artificial intelligence, this conversation could keep going even when I am no longer here. One day you and I will talk while you are on Mars and I have been dead for a hundred years. Our job as technologists is to accelerate the future and to constantly ask how to do it.”

Microsoft is banking on mixed reality to help us get to that future. And here we come back to hardware. The availability of the right device at the right price will help determine whether consumers embrace mixed reality (although devices alone are unlikely to start an MR revolution, as VR has shown). While HoloLens is the only standalone holographic computer on the market (unlike the Oculus Rift or HTC Vive, it doesn't need to be tethered to an external device), the $3,000 headset serves more as a proof of concept than a consumer product.

Now Microsoft wants to change that. This fall the company is launching Windows Mixed Reality headsets, its first major attempt to bring the concept to the general public. While these devices still fall short of a perfect hybrid of augmented and virtual reality, they incorporate key HoloLens features - such as advanced tracking and mapping - at a more affordable price of $300-500. The headsets will come in various forms from a range of hardware partners including Dell, HP and Samsung, and will let users create 3D spaces that can be personalized with media, apps, browser windows and more.

According to Microsoft, shipping a platform that lets anyone at all build their own digital world is the first step toward that leap into the world of tomorrow. “If you believe, as we do, that mixed reality is the inevitable next wave of computing, you have to embrace productivity, creativity, education and a whole new range of entertainment, from casual to hardcore gaming,” says Kipman.

Improving mixed reality

Kipman isn't the only one optimistic about mixed reality. California-based startup Avegant is working on a platform that renders detailed 3D images by layering multiple focal planes, which the company calls "light field" technology. “The applications are endless,” says Avegant CEO Joerg Tewes. “From designers and engineers who manipulate 3D models directly with their hands, to medical professors who illustrate various heart conditions for their students on a virtually living model. At home, users could surround themselves with virtual shelves of their favorite content. Mixed reality lets people interact directly with their ideas instead of with screens and keyboards.”

To do all this, mixed reality devices must render virtual imagery that appears indistinguishable from the real world and interacts with it seamlessly. According to Professor Gregory Welch, a computer scientist at the University of Central Florida, most of the technology developed so far has not yet reached that equilibrium. "Mixed reality is especially difficult because there is no hiding the imperfection of the virtual next to the striking purity of the real."

He and his colleagues found that, in some cases, the relatively wide view of the real world that HoloLens lets through can actually undermine the all-important sense of presence. While a person with healthy vision has a field of view of about 210 degrees, the HoloLens display covers only about 30 degrees at the center of it. In experiments by Welch and his team, that gap between the real scene and the augmented one reduced the sense of immersion and presence.

“This means that if you look at a virtual person in front of you (as was the case in our experiment), you will only see part of him floating in space,” Welch says. “You have to move your head up and down to 'paint in' your perception of him, since you cannot see the whole person at once unless you look at him from far away (at which point he appears smaller). The problem is that your brain constantly sees the 'normal' world around it, and this 'overwrites' many of the perceptions you might otherwise have.”
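A rough back-of-the-envelope sketch of that trade-off (the 30-degree field of view is the figure cited above; the 1.8-metre height is an assumed example, and the angle is treated simply as the limiting angular extent of the display): basic trigonometry gives the distance at which a virtual person would just fit entirely inside the display.

```python
import math

def min_distance_to_fit(height_m: float, fov_deg: float) -> float:
    """Distance at which an object of the given height just fits
    inside a display with the given angular field of view."""
    half_angle = math.radians(fov_deg / 2)
    return (height_m / 2) / math.tan(half_angle)

# A 1.8 m tall virtual person and the ~30-degree field of view
# cited for HoloLens above (assumed values, for illustration only):
print(round(min_distance_to_fit(1.8, 30.0), 2))  # ~3.36 metres
```

At roughly 3.4 metres the whole figure finally fits in view, and at that distance it occupies only a small part of the scene - exactly the "see him whole, but smaller" compromise Welch describes.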

Welch goes on to explain that in the demos we see today with HoloLens or Apple's ARKit, for example, virtual objects can be anchored to a flat surface, but beyond basic shape and visual appearance the software usually knows nothing about an object's important physical characteristics - its weight, center of mass and behavior, or the surface it sits on - let alone any real-world activity taking place around it.

“If I accidentally roll a couple of virtual dice off the edge of a real table, they won't 'fall' when they reach the edge, or bounce the way you would expect given what they are made of and what the floor is made of,” he explains.
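A minimal sketch of the kind of behavior Welch has in mind (hypothetical code, not any actual MR SDK): the application itself has to simulate gravity so that a virtual die sliding past the edge of a real table actually drops to the floor.

```python
from dataclasses import dataclass

GRAVITY = -9.81  # m/s^2

@dataclass
class Die:
    x: float          # horizontal position (m)
    y: float          # height above the floor (m)
    vx: float         # horizontal velocity (m/s)
    vy: float = 0.0   # vertical velocity (m/s)

def step(die: Die, table_edge_x: float, table_height: float, dt: float = 0.01) -> None:
    """Advance the die by one time step. While it is over the table it
    slides along the surface; past the edge, gravity takes over."""
    die.x += die.vx * dt
    on_table = die.x <= table_edge_x and abs(die.y - table_height) < 1e-6 and die.vy == 0.0
    if not on_table:
        die.vy += GRAVITY * dt
        die.y = max(0.0, die.y + die.vy * dt)  # stop at the floor

# A die sliding toward the edge of a 0.75 m high table whose edge is at x = 1.0 m:
die = Die(x=0.9, y=0.75, vx=0.5)
for _ in range(200):
    step(die, table_edge_x=1.0, table_height=0.75)
print(round(die.x, 2), round(die.y, 2))  # it has left the table and landed on the floor
```

Real mixed reality software would additionally need the mapped geometry of the actual table and floor, plus material properties, to make that fall and bounce look convincing.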

In a paper Welch co-authored with Professor Jeremy Bailenson, director of the Virtual Human Interaction Lab (VHIL) at Stanford University, they set out research findings showing that virtual content is valued far more highly when it exhibits the behaviors we expect of physical objects in the real world.

“In our lab, we're starting to use HoloLens to understand the relationship between the augmented reality experience and the subsequent psychological relationship to the physical space itself,” says Bailenson. His experiments show, for example, that virtual people who "walk like ghosts" through real objects, rather than going around or trying to avoid them, are perceived as less "real" than those who obey the laws of physics.

Advances in mixed reality are likely to make headsets cheaper and lighter, but it is also possible that at least some of our future interactions with this technology won't involve wearable electronics at all. Spatial augmented reality (SAR), for example, which Welch helped develop many years ago, uses projectors to change the appearance of the physical objects around you - the material of a table, the color of a couch - with no glasses at all.

“Of course SAR won't work in every situation, but when it does, it is convincing and effortless,” Welch says. “There is something magical about the world changing around you while you are wearing nothing at all - no headset, no phone, nothing. You simply exist in a physical world that is effectively changing around you.”

Virtual collaboration tool in the real world

Nonny de la Peña, founder and CEO of Emblematic, helped establish VR as a reporting and storytelling tool. Often called the “godmother of virtual reality,” she believes immersive technology is the closest thing we have to transporting an audience - that is, putting it right inside the story. She believes HoloLens has the potential to increase the quality and depth of our understanding of the world, thanks in part to volumetric capture, which builds a 3D model of a subject using multiple cameras and a green screen. “Microsoft started offering a high level of realism with volumetric capture, and it was immediately picked up by journalists,” says de la Peña. Emblematic's own creation, After Solitary, is an award-winning documentary made in partnership with PBS and the Knight Foundation that used the technique to convey the trauma of years spent in solitary confinement.

The biggest change mixed reality promises is that content will no longer be tied to any particular device. MR uses building blocks - real-world objects and computer-generated ones - to create environments that people enter and interact with. In this context, devices become windows through which you look into and access these worlds, rather than repositories of your personal content, as your smartphone is today.

In these shared real/virtual environments, Kipman notes, our relationship with computing shifts from personal to communal - from devices that hold your own personalized content to shared spaces of creativity mediated by technology. Kipman thinks this has profound implications for how we will build applications in the future. If, for example, you create a virtual statue and place it as a hologram on top of a table in your living room, another person with a mixed reality device will see your statue when they enter the room, and can move it if they want, because the content lives not on your device but in the environment itself, which defines the objects - both real and virtual - that inhabit it.
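As a conceptual sketch only (hypothetical names, not Microsoft's actual APIs), the shift Kipman describes can be pictured as the room, rather than the headset, owning the list of holograms; any device that enters simply reads and writes that shared state.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class Hologram:
    name: str
    position: Tuple[float, float, float]  # coordinates in the room's frame, not the device's
    owner: str

@dataclass
class SharedRoom:
    """A shared space that owns its content; devices are just viewers."""
    holograms: Dict[str, Hologram] = field(default_factory=dict)

    def place(self, holo: Hologram) -> None:
        self.holograms[holo.name] = holo

    def move(self, name: str, new_position: Tuple[float, float, float]) -> None:
        # Anyone present in the room may move an object; the change is
        # visible to every device that looks into this room.
        self.holograms[name].position = new_position

# One person places a statue on the living-room table; a visitor later moves it.
living_room = SharedRoom()
living_room.place(Hologram("statue", position=(1.0, 0.75, 2.0), owner="alex"))
living_room.move("statue", new_position=(0.5, 0.75, 2.0))
print(living_room.holograms["statue"].position)  # (0.5, 0.75, 2.0)
```

The key point of the sketch is that hologram positions are expressed in the room's coordinate frame, so every visiting device resolves the same statue to the same spot on the same table.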

“These concepts require rethinking the operating system for a mixed reality context,” says Kipman. “You have to build a foundation, from silicon to cloud architecture, to make the move from personal computing to collaborative computing. It will take time,” he smiles.

Ilya Khel