Microsoft's HoloLens 2: a $3,500 mixed reality headset for the factory floor




I'm in a tiny room in a basement somewhere on Microsoft's Redmond, Washington campus, wearing an early version of the HoloLens 2 headset. In front of me is a real ATV that's missing a bolt. Not quite at the corner of my vision, but definitely off to the side, I can see a glowing indicator pointing toward a bin with the right bolts. I walk over, lean down to get a closer look at the shelf, and grab one.

Back at the ATV, a holographic guide card hovers above it, telling me what to do and showing exactly where the bolt should go. After a few minutes, I've managed to fix the problem, guided along by holograms. I tap a holographic button on the guide to close the instructions.

This kind of demo is becoming routine for tech journalists like me. But if you read the description above closely, you'll find three key technical innovations hiding in plain sight.

Here they are: I could see a hologram off to the side because the field of view in which holograms can appear is much larger than before. I leaned over without worrying about an uncomfortable headset shifting around, because it's better balanced on my head. And I pressed a button simply by pressing it, because I didn't need to learn a convoluted gesture to use the HoloLens 2.

Those three things may not sound all that remarkable to you, but that's exactly the point. Microsoft needed to make the HoloLens feel far more natural if it wants people to actually use it, and it's getting there.

There's one more notable thing: even though it was a demo, I was playing the role of a worker, because the HoloLens 2 is made for workers, not consumers.


The Microsoft HoloLens 2 is available for preorder today for $3,500 and should ship later this year. However, Microsoft has decided to sell it only to companies that want to deploy the headset to their employees. For now, Microsoft isn't even announcing a developer kit version of the HoloLens 2.

Compared to the HoloLens we first saw four years ago, the second version is better in nearly every important way. It's more comfortable, has a much wider field of view, and is better at detecting real, physical objects in the room. It incorporates new components such as the Azure Kinect sensor, an ARM processor, eye-tracking sensors, and a completely different display system.

It has built-in speakers, a flip-up visor, and can see what your hands are doing more precisely than before. There's an 8-megapixel front-facing camera for video conferencing, it's capable of full six-degrees-of-freedom tracking, and it charges over USB-C. In short, it's packed with new technology. But after four years, that shouldn't be a surprise.

The biggest complaint about the first HoloLens was simple: you could only see holograms in a relatively small box directly in front of you. Turn your head even a little and they would vanish from view. Worse, their edges would get cut off even when you were looking straight at them. It was like peering at a digital world through a tiny rectangle.

The HoloLens 2 has a field of view twice as large as before. It doesn't fill your entire field of vision; there are still hard edges where holograms get cut off. But it's big enough now that you don't feel constantly hemmed in by a letterbox. Microsoft claims each eye effectively has the equivalent of a 2K display in front of it, but that's better understood as a metaphor than as a precise spec. The actual specification is a "holographic density of 47 pixels per degree," which means the pixel density is high enough to let you read 8-point fonts.
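To see why "2K per eye" is best read as shorthand, here's a rough back-of-the-envelope calculation. The per-eye field-of-view angles below are illustrative assumptions, not figures from Microsoft; only the 47 pixels-per-degree density comes from the article.

```python
# Rough back-of-the-envelope relating angular resolution to pixel counts.
# The field-of-view figures are illustrative assumptions, not official specs.

PIXELS_PER_DEGREE = 47          # Microsoft's stated "holographic density"

def pixels_for_fov(h_fov_deg, v_fov_deg, ppd=PIXELS_PER_DEGREE):
    """Return the horizontal/vertical pixel counts implied by a given
    field of view at a fixed pixels-per-degree density."""
    return round(h_fov_deg * ppd), round(v_fov_deg * ppd)

if __name__ == "__main__":
    # Hypothetical per-eye field of view in degrees; the real values may differ.
    h, v = pixels_for_fov(43, 29)
    print(f"~{h} x {v} pixels per eye")   # ~2021 x 1363 under these assumptions
```

At angles in that ballpark, 47 pixels per degree works out to roughly 2,000 pixels across, which is where the "2K" shorthand comes from.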

Usually, when a tech product gets better specs, it happens through brute-force iteration: faster processors, bigger batteries, more RAM, and so on. But that strategy wouldn't work for the HoloLens 2's display. It needed to get lighter, not heavier. So Microsoft had to adopt a completely different type of display technology.


Lasers and mirrors

Laser-based displays have become the go-to approach for face-worn computers. Intel's Vaunt project used lasers, as did North's Focals smart glasses. Although Microsoft uses some of the same basic components, it has taken them in a different direction and pushed much further in developing what they can do.

The HoloLens 2's lasers shine into a set of mirrors that oscillate as fast as 54,000 cycles per second so the reflected light can paint a display. Together, those two pieces form the basis of a microelectromechanical system (MEMS) display. It's very difficult to do, but the trickiest part of a MEMS display is getting the image it paints into your eyes.
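As a rough illustration of why that mirror speed matters, here's a small sketch relating the oscillation rate to scan lines per frame. The refresh rate and the assumption that a line is drawn on each sweep direction are hypothetical; only the 54,000 cycles-per-second figure comes from the article.

```python
# Illustrative arithmetic only: how a fast-scanning MEMS mirror translates into
# drawn image lines. Refresh rate and bidirectional sweeping are assumptions.

MIRROR_HZ = 54_000        # fast-axis mirror oscillation (figure from the article)
REFRESH_HZ = 120          # assumed display refresh rate
LINES_PER_CYCLE = 2       # assume a line is painted on each sweep direction

lines_per_frame = MIRROR_HZ * LINES_PER_CYCLE / REFRESH_HZ
print(f"~{lines_per_frame:.0f} scan lines per frame")   # ~900 under these assumptions
```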

One solution, used by companies such as North, is to add a holographic film to the lens that reflects the image directly onto your retina. That has plenty of downsides, a tiny image and low resolution among them. But the really problematic part is simply making sure the display lands right in your eye. The North glasses have to be custom-fitted to you, and the picture can disappear entirely if they slip out of alignment.

Microsoft didn't want any of those problems, so it stuck with the same approach as the first HoloLens: waveguides. These are the pieces of glass in front of your eyes, carefully etched so they can direct the holograms into your eyes. The HoloLens 2's waveguides are now lighter because Microsoft sandwiches together two plates of glass instead of three.

When you put the whole system together (lasers, mirrors, and waveguide), you get a brighter display with a wider field of view that doesn't need to be precisely aimed at your eyes to work. Zulfi Alam, general manager for Optics Engineering at Microsoft, argues that Microsoft is well ahead with this system and that waveguides are definitively the right direction for mixed reality. "There is no competition for the next two or three years that can approach this level of fidelity in the waveguides," he says.

Want a wider field of view? Simple: just increase the angle of the mirrors that reflect the laser light. A wider angle means a bigger image.

Want brighter images? Simple again. Lasers, to put it mildly, have light to spare. You do have to account for the waveguides losing a ton of light along the way, but the displays I saw were set to 500 nits and looked plenty bright. Microsoft thinks it could push them much brighter in the final version, depending on power draw.

Want to see holograms without being custom-fitted to your headset? Simple again. The waveguide doesn't require adjustment or specific measurements. You can just put the headset on and get going. It also sits far enough in front of your eyes that you can comfortably wear whatever glasses you need.

Simple, simple, simple, right? In truth, it's devilishly complex. Microsoft had to create an entirely new etching system for the waveguides. It had to figure out how to direct light to the right place in the waveguides, photon by photon. "We simulate each photon from the laser," Alam says. The light from the lasers isn't just reflected; it's split into multiple colors and multiple "pupils" in the display system and then "reconstituted" at the right spot on the waveguides. "Every photon is calculated where it's supposed to go," Alam says. That takes a ton of computing power, so Microsoft had to develop custom silicon to do all the calculations about where the photons should go.

And although alignment is much more forgiving with the waveguide, that doesn't mean it's perfect. That's why there are two tiny cameras on the nose bridge, pointed at your eyeballs. They let the HoloLens 2 automatically measure the distance between your pupils and adjust the image accordingly. These cameras also let the HoloLens 2 adjust the image vertically if the headset is tilted or if your eyes aren't perfectly level. (They aren't, sorry.)
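Conceptually, the adjustment works something like the sketch below: estimate where each pupil sits, derive the interpupillary distance (IPD), and shift each eye's image to match. This is only an illustration of the idea, not Microsoft's calibration code; the data structure and the numbers are made up.

```python
# Conceptual sketch only, not the actual HoloLens calibration pipeline.

from dataclasses import dataclass

@dataclass
class PupilEstimate:
    x_mm: float   # horizontal position relative to the headset centerline
    y_mm: float   # vertical position

def ipd_and_offsets(left: PupilEstimate, right: PupilEstimate):
    """Return the measured IPD and per-eye horizontal render offsets."""
    ipd = right.x_mm - left.x_mm
    # Center the stereo pair on the midpoint between the two pupils.
    midpoint = (left.x_mm + right.x_mm) / 2
    return ipd, (left.x_mm - midpoint, right.x_mm - midpoint)

ipd, (left_off, right_off) = ipd_and_offsets(
    PupilEstimate(-31.5, 0.4), PupilEstimate(32.0, -0.2)  # hypothetical readings
)
print(f"IPD = {ipd:.1f} mm, eye offsets {left_off:+.2f} / {right_off:+.2f} mm")
```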

A sort of free benefit of these cameras is that they can also scan your retinas to log you into the HoloLens 2 securely. It runs Windows, after all, and therefore supports Windows Hello. They also track where you're looking, which enables some of the user interactions I'll get to below.



A MEMS mirror under a high-speed camera.
GIF: Microsoft

Then there's power: lasers, oscillating mirrors, and custom chips to handle the computation all have to eat into the battery. But Alam tells me that despite all that, the system still needs less energy than the alternatives. The mirrors oscillate at their resonant frequency, so it takes less energy to keep them moving, like the fastest metronomes of all time. Lasers are also less wasteful than LEDs, and the custom silicon can be optimized for its specific task.

"Our evolution is towards a form factor that is really glasses," says Alam, "and all these steps are important in this journey."

All of this technology is impressive, but I don't want to oversell the image quality. What I used was not a finished product. I saw a slight halo around some holograms, and they occasionally jittered a little. Most of the features that rely on the nose-bridge eye scanning hadn't been turned on yet. Still, compared to the first HoloLens, what I saw crossed a line: from "a great demo I'd use for 20 minutes and then get bored with" to "I could see people using this for a few hours if the software were genuinely useful."

But if anyone is going to wear a headset "for a few hours," it has to be comfortable enough to keep on in the first place.


Alex Kipman, Technical Fellow for AI and Mixed Reality, Microsoft.

Comfort zone

Here's how you put on the HoloLens 2: you slip it on like a baseball cap, turn a dial on the back to tighten the headband, and then you start seeing holograms. That's it.

It's a lot less fiddly than the last HoloLens or any other face-worn computer I've tried. Thanks to all the work on the display system, you can skip the extra "fuss with the position until you can see the image" step. The body of the thing is simpler, too. It's a single band that stays put with minimal pressure on the back of your head and your forehead. (There's an optional top strap if you need it.)

All of that is nice but useless if the headset is uncomfortable to wear. And even though I never wore it for more than 20 minutes at a stretch, I think it could go much longer.

Microsoft has a human factors lab where it loves showing off its collection of dummy heads and high-speed cameras. Carl Ledbetter, senior director of design for the Microsoft device design team, walked me through the prototypes and hardware Microsoft tried on the way to the final product. He explained how Microsoft experimented with different designs and materials, eventually landing on carbon fiber to save weight.



"The reality is [we have to] suitable for children, adults, men, women and different ethnic groups around the world. Everybody's head is different, "he says. Microsoft has a database of about 600 heads allowing to follow the shape of the skull, the depth of the eye, the size and the relative position of the nasal bridge, as well as other variants. Ledbetter's team attached sensors to people's necks to measure muscle fatigue to ensure that the center of gravity was in place.

The result is that the HoloLens 2 has a more forgiving, flexible fit. It simply does a better job of accommodating basic human and physical realities. You can flip the visor up so it's out of your view and make eye contact without taking the headset off. The memory foam pad that rests on your forehead is removable and washable. The thermals have been completely redesigned so heat is carried away from your head.

All of this really helps, but the most important thing Microsoft did was move the center of gravity to right behind your ears rather than above your eyes. The HoloLens 2 isn't much lighter than the original HoloLens; it feels lighter because it's balanced more naturally on your head. That balance makes a huge difference. Its weight is less noticeable and should put less strain on your neck.

Ledbetter shifted the weight by literally moving the heavy parts: the main processor and the battery now sit in a module at the back of the headset, with wires running through the headband up to the display and components at the front. That processor, by the way, is an ARM-based Qualcomm Snapdragon 850, which matters because it addresses another fundamental human reality: we hate it when batteries die, and we hate plugging things in. An ARM processor means the headset can get by with a smaller battery.

The original HoloLens ran Windows on an Intel processor. Since then, Microsoft has put considerable effort into making Windows run well on ARM. Those efforts are slowly paying off on laptops, but Intel still dominates on machines where raw speed usually matters more to users than battery life. More broadly, there has been tension with Intel, which hasn't delivered the low-power chips mobile devices require. Intel even reportedly pressured Microsoft to keep the Surface Go on its chips.

So why ARM in the HoloLens 2? Alex Kipman, who is responsible for the entire HoloLens project, says that "ARM rules in battery-operated devices. The ARM decision became quite easy. If you are going to be on battery, [it's] hard to find a product that doesn't run on ARM today."

When I point out that there are plenty of battery-powered Windows laptops running on Intel chips, he pushes back. "Intel doesn't even have a SoC [system on a chip] right now for these types of products that run on battery. They had one; the previous version [of the HoloLens] had Cherry Trail, which they discontinued. This decision is obvious."


For workers, not consumers

The HoloLens 2 is being sold only to businesses, not to consumers. It's designed for what Kipman calls "front-line workers": people in auto shops, factories, operating rooms, and out in the field doing repairs. It's for people who work with their hands and find it hard to fold a computer or smartphone into their daily work. Kipman wants to replace the grease-stained Windows 2000 PC sitting in the corner of the shop. It's roughly the same call Google has made with Google Glass.

"If you think about 7 billion people around the world, people like you and me – knowledge workers – are by far the minority," he says. For him, the workers who will use it are "maybe people who repair our jet engine. Maybe it's the people who are in a retail space. Maybe it's the doctors who operate you in an operating room. "

He goes on to say that it's for "people who have been, in a sense, overlooked or haven't had access to technology [in their hands-on jobs], because PCs, tablets, and phones don't really lend themselves to those experiences."

Fair enough. It fits squarely with Microsoft's newer focus on serving business needs rather than chasing consumer products. That was one of my takeaways when I interviewed CEO Satya Nadella last year, and it still holds today. As I wrote then, it's "a different kind of Microsoft than the one we're used to thinking about. It's a little less flashy, yes, but it has the virtue of being much more likely to succeed."

Besides, Kipman contends that even the HoloLens 2 isn't good enough to be a mainstream consumer product. "It's the best, the high-water mark of what can be achieved in mixed reality, and I'm here to tell you it's still not a consumer product," he says, then continues:

Why is it not a consumer product? It's not as immersive as you want it to be. It's more than twice as immersive as the previous one, [but it's] still not immersive enough for a consumer on the street to use it. It's still not comfortable enough… I'd say that until these things are much more immersive than the most immersive product, much more comfortable than the most comfortable product, and down to $1,000 or less, I think people are fooling themselves that these products are ready.

Kipman says Microsoft hasn't participated in the consumer hype cycle for these kinds of products. "We were not the company that hyped up VR. We are certainly not the company that hyped up AR. And since we merged the two into mixed reality along with our AI efforts, we haven't hyped that up either."

That's not entirely true. We've seen plenty of Microsoft demos showing off games, including Minecraft, and other consumer applications for the HoloLens. So this move to the business market absolutely is a pivot.

But it's a pivot that fits Microsoft's corporate strategy. And just because it's no longer positioned as a consumer product doesn't mean it isn't an important one, a product Microsoft seems committed to and is actively building software for.


A better interface on your face

The first HoloLens forced users to learn fiddly gestures with names like "air tap" and "bloom." You had to make those very specific hand motions because that's all the first HoloLens' sensors could detect and understand.

The HoloLens 2 can detect and understand much more, thanks to a new array of room-scanning sensors called Azure Kinect. "Kinect" because that's Microsoft's brand for cameras that can digitize rooms; "Azure" because apparently everything the company does these days ties into its cloud computing service somehow, and also to signal that this is a business product, not an Xbox add-on.

"HoloLens 1 is only a big mesh. It's like dropping coverage on the real world, "says Kipman. "With HoloLens 2, we move from spatial mapping to semantic understanding of spaces. You understand what a couch is, a human sitting on the couch, what is the difference between a window and a wall. "

I can't speak to how well the Kinect actually identifies objects, since Microsoft didn't demonstrate that for us, but in theory it works because the Azure Kinect sees the room at a higher resolution and because it's connected to cloud computing services that help it understand what things are.

There is one area where I can say for sure the higher fidelity is real: it recognizes my hands and what they're doing far more readily. It can track up to 25 points of articulation across both hands in space, which means you no longer have to use the air tap gesture to interact with holograms.
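To give a sense of what joint-level tracking buys you over canned gestures, here's a minimal sketch of detecting a "pinch" from fingertip positions. It is not the HoloLens hand-tracking API; the joint names, coordinates, and threshold are assumptions for illustration.

```python
# Conceptual sketch, not the actual HoloLens API: given a set of tracked hand
# joints, detect a simple "pinch" by checking the distance between the thumb
# tip and the index fingertip.

import math

def distance(a, b):
    return math.dist(a, b)   # a, b are (x, y, z) positions in meters

def is_pinching(joints, threshold_m=0.02):
    """Return True when thumb tip and index tip are closer than the threshold."""
    return distance(joints["thumb_tip"], joints["index_tip"]) < threshold_m

# Hypothetical frame of tracked joints (only two of the ~25 shown).
frame = {"thumb_tip": (0.10, 1.20, 0.45), "index_tip": (0.11, 1.21, 0.45)}
print(is_pinching(frame))   # True: the fingertips are about 1.4 cm apart
```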



Resizing a hologram with a natural gesture. The video does not show the real field of view.
Image: Microsoft

In one demo, I walked around a room looking at various holograms set up on tables. When I reached out my hands, a box appeared around each one, with small handles on the edges and corners. I could simply reach out, grab the whole box, and move the hologram. I could also grab one edge to rotate it, or two to resize it. When there was a button, I could just stick out my finger and push it. I doubt it's precise enough to, say, let you type on a virtual QWERTY keyboard, but it's still a big step up from the first generation.

Eye tracking also figures into how you interact with holograms. The HoloLens 2 can tell where you're currently looking and use that as a kind of user interface. There were demos where I simply stared at a little bubble to pop it into holographic fireworks, but the most useful was an auto-scroller. The closer my gaze got to the bottom of the page, the faster the text scrolled, and it stopped when I looked back up.
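A minimal sketch of how an auto-scroller like that might map gaze to scroll speed is below. It's purely conceptual; the dead zone and speed values are invented, not taken from Microsoft's implementation.

```python
# Conceptual sketch of the auto-scroll behavior described above.

def scroll_speed(gaze_y_norm, max_lines_per_sec=4.0, dead_zone=0.6):
    """gaze_y_norm: 0.0 = top of the panel, 1.0 = bottom.
    Returns lines per second; zero while the gaze stays above the dead zone."""
    if gaze_y_norm <= dead_zone:
        return 0.0
    # Ramp linearly from zero at the dead zone to max speed at the very bottom.
    return max_lines_per_sec * (gaze_y_norm - dead_zone) / (1.0 - dead_zone)

for y in (0.3, 0.7, 0.95):
    print(f"gaze at {y:.2f} -> {scroll_speed(y):.2f} lines/sec")
```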

I haven't seen the full top-level user interface, so I don't know whether that's changing. But one thing absolutely isn't: it still runs Windows. It uses the shared Windows OneCore code, which means you won't get a traditional Windows desktop shell, but you can run any Universal Windows app on it. It also has the drivers needed to connect a keyboard and mouse over Bluetooth if you really want to.

Chaitanya Sareen, group program manager for Microsoft Mixed Reality, says the team is trying to "make the machine work around the person rather than the other way around." Sareen calls this kind of interaction "instinctual" rather than "intuitive," because it builds on what we already do with real objects in the world. "Is anyone born saying, 'There's going to be a close button [in the upper corner of a window]'? No," he says. "Many of the interfaces we use are learned."

Sareen is still working through the details of what the UI will be, but the goal is to lean on the natural gestures you picked up as a toddler instead of making you learn a brand-new interface language.

Microsoft is also making new software tools available to developers. One of the most important, Dynamics 365 Guides, will be a mixed reality app with templates for creating instructions for repairing real-world objects like that ATV. The other tools lean on Microsoft's cloud services. One of them, Azure Remote Rendering, lets the HoloLens offload some of the rendering work to the cloud. It exists because the HoloLens 2 can only store and render a limited amount of detail locally, something like a basic 3D model of an engine. With remote rendering, detail can be streamed from the cloud in real time, letting you display potentially endless levels of detail and letting you model and interact with the smallest parts of a holographic machine.
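The underlying idea is a hybrid renderer that decides, per model, whether the headset can handle it locally or whether frames should come from the cloud. Here's a toy sketch of that decision; the triangle budget is an assumed number, and this is not the Azure Remote Rendering API.

```python
# Illustrative sketch of the hybrid-rendering idea, not Microsoft's actual API.

LOCAL_TRIANGLE_BUDGET = 500_000   # assumed ceiling for on-device rendering

def choose_renderer(model_triangles: int) -> str:
    """Decide where a model should be rendered given its complexity."""
    if model_triangles <= LOCAL_TRIANGLE_BUDGET:
        return "render on the headset"
    return "stream frames from a cloud GPU, composite locally"

print(choose_renderer(120_000))      # simple part: on device
print(choose_renderer(80_000_000))   # full engine CAD model: from the cloud
```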

Finally, there's Azure Spatial Anchors, which lets you pin holograms to real-world locations. At a basic level, it's not that different from what Apple and Google are already doing in augmented reality: letting multiple devices see and interact with the same virtual object. Microsoft's ambitions, however, go much further: to build the infrastructure for a world-scale set of holograms and to give developers tools to use that infrastructure across multiple platforms, including iOS and Android.

Pulling that off takes more than GPS location and object recognition. Kipman talks a lot about distinguishing between identical, boring conference rooms that sit in the same spot on different floors. Tracking position in space using optics alone is notoriously hard: walk in a loop around a building and your estimated position will drift, so the computer won't put your end point back at your starting point. He was a little vague about how Microsoft has solved these problems, but said it's actively working on them.
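The drift problem he's describing is easy to see in a toy simulation: add a little noise to every step of a walk around a closed loop, and the estimated end point no longer lands on the start point. The noise level below is an arbitrary assumption.

```python
# Toy illustration of odometry drift, not Microsoft's tracking code.

import random

random.seed(0)
position = [0.0, 0.0]
# A closed square loop: the true end point is exactly the starting point.
steps = [(1, 0)] * 20 + [(0, 1)] * 20 + [(-1, 0)] * 20 + [(0, -1)] * 20

for dx, dy in steps:
    # Each measured step is off by a small amount of noise (assumed ~2 cm sigma).
    position[0] += dx + random.gauss(0, 0.02)
    position[1] += dy + random.gauss(0, 0.02)

error = (position[0] ** 2 + position[1] ** 2) ** 0.5
print(f"loop-closure error: {error:.2f} m")
```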


Alex Kipman thinks we're on the cusp of a new era of computing. The first computers arrived with open architectures; then came phones with walled-garden app stores. Kipman says he intends to keep the HoloLens open: it works with Microsoft's cloud services, but also with other ecosystems. The HoloLens and Azure, he says, are "loosely coupled, but tightly aligned."

I could quibble with his summary of computing history and point out that there's no shortage of opportunistic claims to openness, but the bigger point stands: Microsoft thinks mixed reality is going to be a big deal.

Understanding Microsoft's plans lately has required a lot more jargon than it used to. With the HoloLens 2 in particular, expect plenty of talk about "time to value" (how quickly a worker can do something useful after an employer buys the device) and about intelligence at the edge and in the cloud.

For regular consumers, all this talk creates a kind of cognitive dissonance. Despite Kipman's protests to the contrary, there is plenty of hype around the HoloLens 2; it's just aimed at businesses. Some of it is well deserved. I think the HoloLens 2 is a technical marvel. The fact that it isn't sold as a consumer device doesn't mean it isn't an important piece of technology, one that could change our sense of what a computer should be.

But we're used to consumer electronics companies racing to bring technical marvels like this to store shelves, translating the hype into gadgets in our pockets and on our heads.

For the HoloLens 2, the hype isn't about personal technology. It's strictly business.


