Those aren't the big unsolved problem with augmented reality. They're all problems VR systems can already solve with enough money and transistors behind them.
The AR big problem is displaying dark. You can put bright things on the display, but not dark ones. Microsoft demos their VR systems only in environments with carefully controlled dim lighting. The same is true of Meta. Laster just puts a dimming filter in front of the real world to dim it out so the overlays show up well.
Is there any AR headgear which displays the real world optically and can selectively darken the real world? Magic Leap pretended to do that, but now it appears they can't. You could, of course, do it by focusing the real scene onto a focal plane, like a camera, using a monochrome LCD panel as a shutter, and refocusing the scene to infinity. But the optics for that require some length, which means bulky headgear like night vision glasses. Maybe with nonlinear optics or something similarly exotic it might be possible. But if there were an easy way to do this, DoD would be using it for night vision gear.
> The AR big problem is displaying dark. You can put bright things on the display, but not dark ones.
There is actually a solution (EDIT: nope, see replies), but it's tricky: if you stack two sheets of polarizing filter and rotate one relative to the other, they pass all light at 0 degrees of rotation and block nearly all light at 90 degrees. It's like special paper that adjusts brightness depending on how much it's rotated relative to the sheet behind it. https://www.amazon.com/Educational-Innovations-Polarizing-Fi...
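For intuition, the transmission of the rotated pair follows Malus's law. A minimal sketch, assuming ideal linear polarizers and ignoring the fixed ~50% loss the first polarizer imposes on unpolarized light:

```python
import math

def pair_transmission(theta_deg: float) -> float:
    """Relative transmission through a second ideal linear polarizer
    rotated theta degrees from the first (Malus's law):
    1.0 = fully open at 0 degrees, 0.0 = fully blocked at 90 degrees."""
    return math.cos(math.radians(theta_deg)) ** 2

for angle in (0, 30, 45, 60, 90):
    print(f"{angle:3d} deg -> {pair_transmission(angle):.2f}")
```

So a mechanical rotation of 0-90 degrees gives a smooth dimming range, which is exactly the "pixel" behavior the idea below needs.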
So you could imagine cutting a series of circular polarizing filters and using them as "pixels". If you had a grid of 800x600 of these tiny filters, and a way to control them at 60 fps, you'd have a very convincing way of "displaying dark" in real time.
It'd require some difficult R&D to be viable. Controlling 800x600 = 480,000 tiny surfaces at 60 fps would take some clever mechanics, to put it mildly. Maybe it will never be practical, but at least there's theoretically a way to do it.
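A back-of-envelope on the control side (the 8-bit dimming depth is an assumption) shows the electronics are trivial; it's the mechanics of actually rotating half a million surfaces that's hard:

```python
pixels = 800 * 600            # 480,000 shutter "pixels"
fps = 60
updates_per_sec = pixels * fps  # rotations to command every second

bits_per_pixel = 8            # assume 256 dimming levels per pixel
bandwidth_mbit = updates_per_sec * bits_per_pixel / 1e6

print(f"{updates_per_sec:,} updates/s, {bandwidth_mbit:.1f} Mbit/s")
```

That's well under the bandwidth of an ordinary display link, so the R&D problem is mechanical actuation, not data.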
A minor problem with this approach is that the polarizing filter may affect the colors behind it. But humans are very good at adapting to a constant color overlay, so it might not be an issue.
The problem with that solution is optical, I believe. It would work if you were able to put such a filter directly on your retina, but when you put it earlier in the path of the light, before images are focused, you cannot selectively block individual pixels as they appear on your retina. As a result, the dark spots will look blurry.
(Also, if the pixels are dense enough I imagine you'll get diffraction.)
Here is Michael Abrash's better explanation:
>“But wait,” you say (as I did when I realized the problem), “you can just put an LCD screen with the same resolution on the outside of the glasses, and use it to block real-world pixels however you like.” That’s a clever idea, but it doesn’t work. You can’t focus on an LCD screen an inch away (and you wouldn’t want to, anyway, since everything interesting in the real world is more than an inch away), so a pixel at that distance would show up as a translucent blob several degrees across, just as a speck of dirt on your glasses shows up as a blurry circle, not a sharp point. It’s true that you can black out an area of the real world by occluding many pixels, but that black area will have a wide, fuzzy border trailing off around its edges. That could well be useful for improving contrast in specific regions of the screen (behind HUD elements, for example), but it’s of no use when trying to stencil a virtual object into the real world so it appears to fit seamlessly.
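Abrash's "translucent blob several degrees across" falls out of simple geometry: with the eye focused at infinity, a point occluder near the eye blocks rays across the whole pupil, so it smears into a disc. A rough sketch (the pupil diameter is an assumed typical indoor value):

```python
import math

pupil_mm = 4.0       # assumed typical indoor pupil diameter
occluder_mm = 25.4   # LCD shutter pixel about 1 inch from the eye

# Small-angle geometry: the angular diameter of the blur disc is
# roughly pupil diameter / occluder distance, in radians.
blur_deg = math.degrees(pupil_mm / occluder_mm)
print(f"~{blur_deg:.1f} degrees of blur")
```

About nine degrees, which matches Abrash's "several degrees across" and explains why per-pixel occlusion an inch from the eye can't produce sharp edges.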
What about Near-Eye Light Field Displays[1][2]? From what I've seen those look to have promise in solving some focus problems and some of the problems with how cumbersome most VR/AR displays are. As a bonus, they can correct for prescriptions.
The answer is a higher-resolution screen plus some clever adaptive optics and software. The problem is that even 8K screens do not come close to the required resolution... and you also want a fast refresh rate.
"We made black with light. That's not possible, but it was two artists on the team that thought about the problem, and all of our physicists and engineers are like "well you can't make black, it's not possible." but they forgot that what we're doing - you know the whole world that you experience is actually happening in [the brain], and [the brain] can make anything." - Rony Abovitz in 2015.
No, they're just putting out enough light to override the background, then showing dimmer areas for black. If they have a display with enough intensity and dynamic range, they can pull this off. Eye contrast is local, not absolute, so this works within the dynamic range of the human eye.
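A back-of-envelope illustration of that local-contrast trick (all luminance numbers here are assumptions for illustration):

```python
leak = 100.0           # nits of real-world light leaking through the optics
display_max = 2000.0   # nits the display can add on top of the leak

white_region = leak + display_max  # brightest overlay area
black_region = leak                # display off; the leak sets the floor

contrast = white_region / black_region
print(f"{contrast:.0f}:1 local contrast")
```

Because the eye judges brightness relative to the immediate surround, a region 21x dimmer than its neighbors reads as near-black, even though 100 nits is nowhere near physically dark.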
> No, they're just putting out enough light to override the background, then showing dimmer areas for black
Right, that's how all existing HMD systems work - but generally not with the whole FOV re-rendered, so it's not so cut and dried.
Note that such a system doesn't give you black ever. It gives you muddy [insert lens color on the grey spectrum].
The case you describe ends up as a practically opaque lens that replicates the real, non-virtual environment. So you might as well just use VR with camera pass-through at that point.
I don't know. It's similar. I wonder what the problems are with using it, then?
One idea that comes to mind is that a regular screen leaks light. If you adjust the brightness on your laptop as low as it will go, then display a black image, there's still a massive difference between what you see there vs when the screen is completely powered off. But if you take two sheets of polarizing filter and stick them in front of your screen, you can get almost perfect blackness. That's why I thought it was a different idea, since the difference is so dramatic. You can block almost all the light, whereas a regular screen doesn't seem to be able to get that kind of contrast.
Let me be clearer: Being able to show black is super important for AR. For one thing, it's a pain in the ass to read text on a transparent HMD, because you never know what colors will be behind the letters. You can make some educated guesses, and put up your own background colors for the text, but since everything has opacity, it'll always matter what's physically behind the "hologram".
Yes, yes, it's still not a hologram, but the popularized version of holograms (from things like Star Wars) is still the best way to think about AR displays.
If you can show SOME black, text legibility becomes a lot easier. Everything can look way better, even if the world always shines through enough to see it.
If you can show PURE black, enough to ACTUALLY obscure the world, now you can erase stuff. Like indicator lights, panopticon optics, and people.
Right. Pictures of what people see through the goggles seem to be either carefully posed against dark backgrounds (Meta [1]) or totally fake (Magic Leap [2], Microsoft [3]). It's amazingly difficult to find honest through-the-lens pictures of what the user sees. When you do, they're disappointing.
The part that's surprising to me is how instantly popular the startup became with so little information. Was this "demo" they gave so damn good that all the investors (some really reputable ones, such as Google) started throwing money at it without doing their research to see how real it was?
Perhaps growth just isn't a great metric for future growth. For example, tons of food-delivery and pet-product startups have exponential growth early on and evaporate a bit later when the product ceases to be sold at cost and a newer competitor appears.
>Is there any AR headgear which displays the real world optically and can selectively darken the real world?
They've said they have a solution, and it's more optical illusion than optics. They don't darken the real world, but will make you perceive something similar.
That's one approach - just add a camera, mix the camera image with the graphics, and feed it to a VR headset. Works, but it's more intrusive for the user than the AR enthusiasts want.
The main issue with this approach is that the video pipeline adds dozens of milliseconds of latency, and it becomes awkward to interact with the physical environment. You couldn't play AR pong for example.
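A rough sketch of why that pipeline latency hurts interaction (both numbers below are assumptions for illustration):

```python
latency_s = 0.050         # assumed ~50 ms end-to-end camera-to-display delay
head_speed_deg_s = 100.0  # a moderate head turn

# While the head turns, the displayed world lags behind the real one
# by (angular velocity) x (latency):
error_deg = head_speed_deg_s * latency_s
print(f"~{error_deg:.1f} degrees of lag during the turn")
```

Five degrees of misregistration during an ordinary head turn is why fast hand-eye tasks like the "AR pong" example break down on high-latency pass-through.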
They are called LCD screens. The problem is having a high-resolution one. (Focus problems can be solved with adaptive optics, measuring the refractive state of the eye's lens via an IR laser.)