Apple’s VR glasses are reportedly designed to depend on another device, offloading the more processor-heavy tasks to a connected iPhone or Mac.
The company is said to be working on a chip dedicated to the glasses: a system on a chip (SoC) that lacks some capabilities found in the company’s other processors.
The new chip does not contain the Apple Neural Engine, which handles artificial intelligence and machine learning tasks. Instead, it was designed to be better than traditional chips at transmitting and receiving data wirelessly, and at compressing and decompressing video.
This makes sense if the glasses are designed to stream data from another device rather than do the heavy processing themselves.
The chip is designed to be as energy efficient as possible for maximum battery life; removing unused components from the chip and streaming data from another device both serve that goal.
And for wearable tech, like a watch or a set of glasses, there’s always been a delicate balance between battery life, performance, and capability.
The original Apple Watch handed over many tasks to a connected iPhone, but the company later made the watch’s internal processor powerful enough to handle many of them itself.
Apple Glasses require an iPhone connection to function
Bloomberg reported in 2020 that earlier versions of the glasses were meant to work with a separate stationary hub, until Jony Ive stepped in and argued they should be self-sufficient; Tim Cook sided with Ive.
The glasses contain their own CPU and GPU, and may be able to connect to a phone or tablet, or even work in a basic standalone mode.
Some of the company’s devices, such as the Apple Watch, can still perform basic tasks in low battery mode.
The device reportedly contains an unusually large image sensor, roughly the size of a spectacle lens, which has proven difficult to manufacture. It is designed to capture high-resolution image data of the user’s surroundings for augmented reality.
This can be useful given that the glasses are both a virtual and augmented reality device.
Virtual reality requires obscuring the user’s view of the outside world, while augmented reality requires that the user be able to see it.
The image sensor may be intended to provide a view of the user’s surroundings from inside the glasses, similar to the passthrough view Oculus headsets offer, though perhaps at higher quality.
Rumors about Apple working on an augmented reality device have circulated for years, and the idea remains in focus.
Well-known analyst Ming-Chi Kuo has predicted that we may see the helmet-shaped glasses in 2022, but the new report says the chips for the glasses will not be ready for mass production for at least another year.
The sleeker eyewear model could appear as early as 2023, though Kuo expects it in mid-2025.
Unlike the helmet-shaped glasses, the sleeker model is rumored to be exclusively an augmented reality device.