Apple Glasses: What we expect, what we think we know


Looking at Apple's other wearable devices can show you where Apple Glasses are headed.

Scott Stein / CNET

First came VR. Then came a wave of expensive AR headsets full of promise for a wild, mixed-reality world. Now Apple finally seems to be preparing its own smart glasses, seven years after Google Glass and four years after the debut of the Oculus Rift. The reports stretch back several years, including a story broken by CNET's Shara Tibken in 2018.

Apple has waited in the wings all this time without a headset of its own, even though the company's AR ambitions have been clear and well telegraphed on the iPhone and iPad for years. Every year, Apple makes significant strides with its AR tools on iOS.

I've used more AR and VR headsets than I can remember, and I've been tracking the whole landscape for years. In many ways, the logical path for a future Apple AR headset should be clear just from studying the pieces that have already been laid out. Apple just acquired the VR media-streaming company NextVR, and it previously acquired AR headset lens maker Akonia Holographics.

I have my own thoughts on what the long-rumored headset might be, and so far the reports line up with them. Much like the Apple Watch, which arrived among many other smartwatches and offered features I'd already seen in other forms, Apple's glasses may not be a big surprise if you've been following the AR/VR landscape lately.

Remember Google Glass? How about Snap's Spectacles? Or HoloLens, or Magic Leap? Facebook is working on AR glasses too, as are Snap and Niantic. The field could get crowded quickly.

Here's where Apple is likely headed based on what's been reported, and how the company could avoid the pitfalls of previous platforms.

North's future Focals smart glasses will look very normal. Apple could take a similar path.

North

Normal glasses first, with a normal name

Getting people to wear an AR headset is hard. I find it hard to remember to pack smart glasses, and to find space to carry them. Most of them don't support my prescription, either.

Apple has always touted the Apple Watch, first and foremost, as a good watch. I expect the same from its glasses. If Apple makes prescription glasses and offers them, Warby Parker-style, in seasonal frames at its Apple Stores, that might be enough for people, provided they look good.

From there, Apple can layer in AR features and let newcomers ease into them. Augmented reality is strange, potentially off-putting, and people need to figure out how much of it is right for them. The original Apple Watch was designed to be glanced at for five seconds at a time. Perhaps the same idea will apply to Apple's AR features.

Apple Glass is the reported name for these glasses. That's not surprising: the watch is the Apple Watch, the TV box is the Apple TV. Apple could have gone the "Air" route with something like "AirFrames," but I wonder whether these things will end up tethered in some way.

Cheaper than you think?

A new report from Apple leaker Jon Prosser says a product called Apple Glass will start at $499, plus extra for prescription lenses. That could still push the price beyond what I pay for my glasses, but it stays in a realm that isn't crazy. HoloLens and Magic Leap cost thousands of dollars, but they aren't aimed at ordinary consumers at all. VR headsets range from $200 to $1,000; the Oculus Quest's $400-to-$500 price seems like a good settling point. The original iPad started at $500. The Apple Watch is about the same. If the glasses are an accessory meant to be used with a watch, AirPods and an iPhone, Apple can't make them too expensive.

iPhone-powered

Qualcomm's AR and VR plans have telegraphed the next wave of headsets: many of them will be driven by phones. Phone-powered headsets can be lighter, carrying only cameras and key sensors on board to track movement and capture information, while the phone does the heavy lifting without draining the headset's battery.

Apple's star device is the iPhone, and it's loaded with advanced chipsets that can do plenty of AR and computer-vision computation. It could already handle powering an AR headset now; imagine what it could do in another year or two.

Apple's iPhone is likely to be the engine.

Angela Lang / CNET

A world of QR codes, and maybe location-aware objects

Reports of QR codes in an upcoming iOS 14 AR app, which would launch 3D experiences when codes are scanned at physical locations such as Starbucks, are corroborated by Prosser's report. Apple Glass would scan these codes and use them to launch AR experiences quickly.

The idea of QR codes triggering AR isn't new: Nintendo's 3DS launched in 2011 with a pack of AR cards that worked with its built-in AR games, too.
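
None of this has been confirmed by Apple, but as a rough illustration of the mechanics, here is a minimal Swift sketch of how an iPhone app today could spot a QR code in a camera frame with Apple's Vision framework and use its payload to pick which 3D experience to load. The launchARExperience function and the idea of keying experiences off the payload string are hypothetical stand-ins, not anything Apple has described.

```swift
import Vision
import CoreVideo

// Hypothetical hook into whatever system the app uses to fetch and show a 3D scene.
func launchARExperience(for payload: String) {
    print("Would load the AR experience keyed by: \(payload)")
}

// Detects barcodes (including QR codes) in one camera frame,
// for example ARKit's ARFrame.capturedImage.
func scanForARCodes(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectBarcodesRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])

    for observation in request.results ?? [] {
        if let barcode = observation as? VNBarcodeObservation,
           let payload = barcode.payloadStringValue {
            // The decoded payload would identify the experience tied to this location.
            launchARExperience(for: payload)
        }
    }
}
```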

Maybe QR codes will help AR work faster in the "dumb," unconnected world. Apple's latest iPhones have a mysterious U1 chip that could be used to improve accuracy when placing AR objects, and to find other U1-equipped Apple devices more quickly. Reports of Tile-like trackers arriving later this year, which could be located through an iPhone app using AR, could extend to Apple's glasses too. If all of Apple's objects recognize one another, they could act as beacons around the home. The U1 chip could become an indoor navigation tool for extra precision.
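
Apple hasn't said how glasses would tap the U1, but iOS 14's NearbyInteraction framework already exposes U1 ranging to apps, which gives a feel for the general idea. Here's a minimal sketch that ranges against one peer device; the exchange of discovery tokens over your own networking channel, and what glasses would actually do with the distance and direction, are left as assumptions.

```swift
import Foundation
import NearbyInteraction

// A minimal U1 ranging session against a single peer device.
final class PeerRanger: NSObject, NISessionDelegate {
    private let session = NISession()

    // `peerToken` comes from the other device's NISession.discoveryToken and must be
    // exchanged over your own channel (Bluetooth, MultipeerConnectivity, a server, etc.).
    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly as the two U1 chips measure each other.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let nearest = nearbyObjects.first else { return }
        if let distance = nearest.distance {
            // Distance in meters; direction is a unit vector when line of sight allows it.
            print("Peer is \(distance) m away, direction: \(String(describing: nearest.direction))")
        }
    }
}
```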

Apple’s latest iPad has the sensor technology it needs

Apple has invested heavily in camera arrays that can sense the world at short and long range. The front-facing TrueDepth camera on every Face ID iPhone since the iPhone X is like a shrunken Microsoft Kinect: it scans a few feet away and senses 3D information accurately enough for secure face scanning. The newer rear lidar sensor on the 2020 iPad Pro can scan much farther, several meters away. That's the range glasses would need.

The iPad Pro's lidar scanner is more for sensing depth than scanning fine detail on real-life objects, according to developers: the array of points it pings the world with is relatively coarse, but it's good enough to mesh the environment and map a space, picking out furniture, people and more. That lidar sensor array is reportedly destined for Apple's AR eyewear, and that makes sense. The iPad Pro and the next iPhone could end up serving as live development kits for the glasses' sensors; there is already code in iOS 13 pointing to stereoscopic software called "Starboard," and reports suggest iOS 14 support for a handheld remote controller.
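
We don't know what the glasses' software stack will look like, but the coarse environment meshing described above is already available to developers on the lidar-equipped iPad Pro through ARKit's scene reconstruction. Here's a minimal sketch of the kind of code a "live development kit" app might start from; on hardware without lidar it simply falls back to plain world tracking.

```swift
import ARKit

// Sets up world tracking with lidar-based scene reconstruction when available.
final class MeshMapper: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Only lidar-equipped devices (2020 iPad Pro and later) support mesh reconstruction.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        session.delegate = self
        session.run(config)
    }

    // ARKit delivers the environment as coarse mesh anchors: walls, floors, furniture.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            print("New mesh chunk with \(mesh.geometry.faces.count) faces")
        }
    }
}
```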

How advanced will the visuals be?

Will Apple push the bleeding edge of realistic holographic AR, or aim for style and a few key functions and build from there? Almost certainly the latter. The first Apple Watch was full of features but still lacked important things other watches had, such as GPS and cellular connectivity. So did the first iPhone, which lacked an app store, 3G and GPS. Apple tends to launch new products that do a few important things very well.

High-end mixed-reality headsets like the HoloLens 2 and Magic Leap, which display sophisticated 3D effects, are heavy. Smaller, more normal-looking smart glasses like North Focals or the Vuzix Blade are more like Google Glass; they present heads-up bits of information on a flat 2D display.

There aren't many lightweight AR headsets yet, but that will change. Plug-in glasses like the Nreal Light show Magic Leap-like 3D graphics and run off a phone. That's closer to what Apple is likely to make.

Apple's dual displays could surpass the competition and offer better image quality for their size. We've already seen ordinary-looking eyeglass lenses that embed waveguides to make images appear to float unobtrusively. And over time, Apple could have more sophisticated hardware in store.

AirPods went from absurd to essential. Can Apple do the same for glasses?

Sarah Tew / CNET

Look to AirPods for ease of use, and for audio augmented reality

I've thought about how AirPods, with their instant comfort and odd design, were an early experiment in making Apple hardware worn on our faces feel acceptable and normal. AirPods are expensive compared with the wired earbuds that come in the box, but they're also utilitarian. They're casual. Apple Glass needs to feel the same way.

AirPods could also start to include spatially aware audio, surfacing information from nearby locations that could prompt someone to put on their glasses. Maybe the two will work together. Immersive audio is casual, and we do it all the time. Immersive video is harder and not always needed. I can see AR working with an audio-first approach, like pings. Apple Glass could perform the world-scanning spatial awareness that would allow spatial audio to work.
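
The audio-first pings described here don't need new hardware at all: AVAudioEngine's environment node can already place a sound in 3D space around the listener. Below is a minimal sketch, assuming a mono "ping.caf" file bundled with the app; how glasses or AirPods would feed head tracking into the listener's orientation is pure speculation.

```swift
import AVFoundation

// Plays a mono sound positioned in 3D space relative to the listener.
final class SpatialPinger {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()

    func ping(at position: AVAudio3DPoint) throws {
        // Hypothetical bundled asset; spatialization needs a mono source.
        let url = Bundle.main.url(forResource: "ping", withExtension: "caf")!
        let file = try AVAudioFile(forReading: url)

        engine.attach(environment)
        engine.attach(player)
        engine.connect(player, to: environment, format: file.processingFormat)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)

        // Head-related transfer function rendering gives the "it's over there" effect.
        player.renderingAlgorithm = .HRTFHQ
        player.position = position          // e.g. two meters to the listener's right
        environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }
}
```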

Somehow, watches and glasses need to work together for navigation and other applications.

Scott Stein / CNET

The Apple Watch and AirPods could be the glasses' best friends

Apple already has a collection of wearables that connect to the iPhone, and both make sense with glasses. AirPods could pair for audio (though maybe the glasses will have their own Bose Frames-like audio, too), while the watch could serve as a helpful remote control. The Apple Watch has already acted as a remote at times, for the Apple TV or the iPhone camera. Apple's glasses could also look down at a watch and virtually extend its display, surfacing extra features that hover around it like a halo.

The Apple Watch could also provide something that's hard to get from hand gestures or a touch-sensitive frame on a pair of glasses: haptics. A rumble of feedback on the watch could lend a tactile response to virtual things, maybe.
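
None of this is announced, but the plumbing for a watch-as-remote with haptic feedback already exists in watchOS. As a hedged sketch, the watch-side code below sends a hypothetical "select" command to the paired iPhone over WatchConnectivity and plays a click haptic; what a glasses app on the phone would do with that message is an assumption.

```swift
import WatchConnectivity
import WatchKit

// Watch-side: acts as a tiny remote control for an app running on the paired iPhone.
final class GlassesRemote: NSObject, WCSessionDelegate {
    private let session = WCSession.default

    func activate() {
        guard WCSession.isSupported() else { return }
        session.delegate = self
        session.activate()
    }

    // Hypothetical "select" command; the phone-side app decides what it means.
    func sendSelect() {
        session.sendMessage(["command": "select"], replyHandler: nil, errorHandler: nil)
        // Haptic tap so the wrist confirms the virtual press.
        WKInterfaceDevice.current().play(.click)
    }

    // Required delegate callback.
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
}
```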

Could the Qualcomm-Apple reconciliation also be about XR?

Qualcomm and Apple are working together again on future iPhones, and I don't think it's only about modems. 5G is the headline feature for phones, no doubt. But it's also a killer ingredient for next-gen AR and VR. Qualcomm has been exploring how remote rendering could let 5G phones and connected glasses stream cloud-rendered content and location data. Eventually the glasses could stand alone and use 5G for heavy computing, much as the Apple Watch eventually worked over cellular.

Qualcomm chipsets are in almost every standalone AR and VR headset I can think of (Oculus Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Apple's hardware will likely have to interact with some of the Qualcomm-based XR gear that emerges, too.

There are other AR devices out there, like the Nreal Light. Apple should open up iOS to work with them, as it does with fitness trackers and smartwatches.

Sarah Tew / CNET

Expect the iPhone to support other VR and AR devices, too

While Apple Glass may be Apple's biggest focus, that doesn't mean there can't be, or shouldn't be, competitors. There are plenty of smartwatches and fitness trackers that work with the iPhone, for example. What's frustrating for those trackers and watches is how they're locked into more limited interactions with iOS than the Apple Watch is. The same could end up being true down the road if connected VR and AR headsets are allowed to work with future iOS updates. That's the direction Qualcomm is taking with its mobile chips, and Google's Android will most likely follow.

Launch date: Still could be a year away

New Apple products tend to be announced months before they arrive, sometimes even longer. The iPhone, Apple Watch, HomePod and iPad all followed this path. Prosser's report says the first announcement could come alongside the next iPhone this fall, at the kind of standard in-person Apple event that was planned pre-coronavirus (which probably won't happen). Even then, actual availability might not come until 2021. That lines up with Shara Tibken's report from back in 2018.

Developers will need a long head start to get used to building for Apple's eyewear, and to make apps work and flow with whatever design guidelines Apple sets. Apple Glass sounds like the culmination of years of acquisitions, hiring and behind-the-scenes drama, but it might not arrive as fast as you think.




