Analyst Ming-chi Kuo says that Apple’s rumoured mixed reality headset will use 3D sensors for improved hand tracking, as reported by MacRumors and 9to5Mac. The headset reportedly has four sets of 3D sensors, rather than the single set used for Face ID, which should make it more accurate than the TrueDepth camera system.
Kuo says the structured light sensors can detect objects as well as “dynamic detail change” in the hands, much as Face ID reads facial expressions to generate Animoji.
Capturing hand gestures in this detail could make interaction between humans and machines more natural. For instance, a virtual balloon in your hand could float away when the sensors detect that your fist is no longer clenched. Kuo also says the sensors will be able to detect objects up to 200 per cent further away than the iPhone’s Face ID currently can.
Meta’s Quest headsets also support hand tracking, though it isn’t a core feature of those devices and relies on monochrome cameras. Kuo’s report doesn’t mention whether Apple’s headset will also ship with physical controllers. According to Bloomberg, Apple was testing hand tracking for the headset back in January.
Kuo also gave a few more details about what to expect from Apple’s first headset. The first model is expected to weigh between 300 and 400 grammes (0.66 to 0.88 pounds). A second-generation model with a new battery system and a faster processor is reportedly due in 2024.
Kuo says the headset will launch sometime next year, with Apple expecting to sell about three million units in 2023, suggesting that early adopters will be prepared to pay a high price for it.