News

You can use the Apple Vision Pro even if you get motion sickness: Here's how

It is even more immersive this way.

Highlights

  • Apple Vision Pro uses the R1 chip to process data from 12 cameras, 5 sensors, and 6 mics.
  • This data is processed to reduce motion sickness while using the headset.
  • The reduction is done in real-time, and is personalised for each user based on their eye, head, and hand movements.
Apple Vision Pro

Moving to hardware products, Apple announced the revolutionary Apple Vision Pro at WWDC. The Vision Pro headset extends content beyond the boundaries of a traditional display. It uses micro-OLED technology to pack 23 million pixels into two displays. It is powered by Apple's M2 chip alongside a new R1 chip. The device has twelve cameras, five sensors, and six microphones. The Vision Pro is priced at $3,499 (Rs 2,88,752 approx.) and will be available early next year.



Apple filed over 5,000 patents while developing the Vision Pro, and one of them shows how the headset is designed to reduce motion sickness. That means even users who can’t wear other VR/AR headsets because of motion sickness may be able to use Apple’s headset comfortably.

The patent shows that most of the headset’s motion sickness reduction happens at the processor level. The Vision Pro’s R1 chip processes data from the device’s 12 cameras, 5 sensors, and 6 mics, and Apple has tuned it to reduce motion sickness while improving overall immersion at the same time.

How does Apple Vision Pro motion sickness reduction work?

The European Patent Office has published an Apple patent detailing how the Vision Pro’s motion sickness reduction works. The headset adjusts the image outside the user’s foveated gaze zone, i.e. in their peripheral vision, to make the experience more comfortable and immersive. Conventional AR/VR headsets simply render a black border around the edges to keep you focused on the content in front of you.

However, Apple’s headset instead adjusts the contrast and image outside the user’s central field of vision. The headset’s audio is also tuned to keep the user focused and reduce motion sickness. This combination of audio-visual cues helps Apple improve immersion and make the Vision Pro more user-friendly.
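The idea of softening the image away from where the user is looking, rather than cutting to a hard black border, can be sketched roughly as follows. This is a minimal illustration of the general technique, not Apple's actual implementation; the function name, the fovea radius, and the falloff values are all illustrative assumptions.

```python
def peripheral_dimming(pixel_angle_deg, gaze_angle_deg,
                       fovea_radius_deg=10.0, falloff_deg=25.0):
    """Hypothetical sketch: reduce contrast for pixels far from the
    gaze point. All names and thresholds are illustrative, not Apple's."""
    eccentricity = abs(pixel_angle_deg - gaze_angle_deg)
    if eccentricity <= fovea_radius_deg:
        return 1.0  # full contrast inside the foveated gaze zone
    # Smoothly fade contrast toward the periphery instead of the hard
    # black border used by conventional headsets.
    t = min(1.0, (eccentricity - fovea_radius_deg) / falloff_deg)
    return 1.0 - 0.5 * t  # never fully black, which preserves immersion

print(peripheral_dimming(5, 0))   # inside the fovea -> 1.0
print(peripheral_dimming(40, 0))  # far periphery -> 0.5
```

The key design choice the patent hints at is the smooth falloff: the periphery is de-emphasised gradually rather than cut off, which is why the experience can feel more immersive, not less.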

But it doesn’t end there: the R1 chip personalises this process for each user. According to the patent, the headset tracks your eye movements, head position, and hand gestures to personalise the experience. It tracks six degrees of freedom (6DoF), following user movements and adjusting the cues in real time.
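A 6DoF pose is simply three translational and three rotational values per frame. The sketch below shows, under stated assumptions, how such a pose stream could drive a per-user comfort cue: the faster the head moves, the stronger the cue. The class, weights, and sensitivity scaling are hypothetical illustrations, not values from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class Pose6DoF:
    # Three translational and three rotational degrees of freedom.
    x: float; y: float; z: float            # position (metres)
    pitch: float; yaw: float; roll: float   # orientation (degrees)

def motion_magnitude(prev: Pose6DoF, cur: Pose6DoF) -> float:
    """Combine translational and rotational change into one score.
    The weights are illustrative assumptions, not Apple's values."""
    dt = math.dist((prev.x, prev.y, prev.z), (cur.x, cur.y, cur.z))
    dr = (abs(cur.pitch - prev.pitch) + abs(cur.yaw - prev.yaw)
          + abs(cur.roll - prev.roll))
    return dt + 0.01 * dr  # treat 1 m of translation like 100 deg of rotation

def comfort_cue_strength(magnitude: float, user_sensitivity: float) -> float:
    """Stronger comfort cues (e.g. more peripheral dimming) for faster
    motion, scaled by a per-user sensitivity tuned over time."""
    return min(1.0, magnitude * user_sensitivity)
```

For example, a quick 10-degree head turn with a small translation would yield a modest cue strength for a typical user, but a higher one for a user flagged as motion-sensitive.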

What is the role of the R1 chip and M2 chip in Apple Vision Pro?

It may sound simple, but fitting the M2 chip into a mixed-reality headset is a big deal. The M2 is accompanied by the new R1 chip. Apple calls the Vision Pro its first “spatial computer”, hence the desktop-class M2, which runs all the tasks on the headset. Thanks to the M2 chip, you can use it like a Mac or an iPad in mixed reality.

Coming to the R1, this chip processes everything you see through the headset. All the data from the Vision Pro’s 12 cameras, 5 sensors, and 6 mics goes through this chip. The R1 creates and maintains the immersion, reduces motion sickness, and translates your commands to the M2 chip for processing.
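The division of labour described above can be sketched as a simple producer/consumer split: one component handles the raw sensor stream in real time, and only distilled user intents reach the general-purpose chip. Everything below is a hypothetical illustration of that architecture; the class names, the frame fields, and the command format are invented for this sketch.

```python
from queue import Queue

class R1Sketch:
    """Hypothetical: fuses camera/sensor/mic input, forwards user intents."""
    def __init__(self, to_m2: Queue):
        self.to_m2 = to_m2

    def process_frame(self, cameras, sensors, mics):
        # Real-time work stays on this side: tracking, comfort cues, etc.
        gaze = cameras["eye"]                # where the user is looking
        command = ("tap", gaze) if sensors["hand_pinch"] else None
        if command:
            self.to_m2.put(command)          # only distilled intents cross over
        return command

class M2Sketch:
    """Hypothetical: general-purpose side that runs apps on received intents."""
    def __init__(self, from_r1: Queue):
        self.from_r1 = from_r1

    def step(self):
        return self.from_r1.get_nowait() if not self.from_r1.empty() else None

q = Queue()
r1, m2 = R1Sketch(q), M2Sketch(q)
r1.process_frame({"eye": (0.4, 0.6)}, {"hand_pinch": True}, mics=[])
print(m2.step())  # -> ('tap', (0.4, 0.6))
```

The point of the split is latency: the sensor stream never has to wait on app logic, which is plausibly why Apple dedicates a separate chip to it.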

While the Vision Pro will not be available until next year, it is impressive how much horsepower Apple has packed into this little device.

  • Published Date: June 12, 2023 11:49 AM IST