Your favorite feature of the AirPods Pro? Transparency Mode may be the best option for those of us who must maintain focus while listening to music. And thankfully, Apple has improved Transparency Mode to the point that it really could be worth purchasing a pair for the sake of using them.
How does Transparency Mode work on AirPods?
If you have a pair of AirPods Pro or AirPods Max, Transparency Mode uses their built-in microphones to pipe ambient sound into your ears. When the tips are properly sized for your ears, it won't feel like you're wearing them at all: turn Transparency Mode off, and outside sound drops back to the muffled state typical of a well-sealed earbud.
This function is ideal for situations in which you want to listen to music or a podcast, but you also need to be able to pay attention to your surroundings and the people in them. If you have Transparency Mode enabled, you can hold a conversation without removing your headphones.
However, this setting never took the volume of ambient noise into consideration: your AirPods would pipe any sudden, loud sound (a siren, say, or a car horn) straight into your ears at full force. This has happened to me on a few occasions, and trust me, it is no picnic.
Use Adaptive Transparency to protect your hearing.
The new AirPods Pro 2 also include Adaptive Transparency. This version of Transparency Mode attenuates only sounds louder than 85 dB, preserving everything else at its natural volume. The effect is like constantly carrying a sound mixer in your ear, one that equalizes the many noises you encounter. With the new H2 chip, Apple claims these sounds are processed 48,000 times per second, so there's almost no lag.
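Apple hasn't published how its limiter actually works, but the behavior described above can be sketched as a simple hard cap: anything at or under 85 dB passes through untouched, and anything louder is turned down just enough to land at 85 dB. The function name and the hard-knee behavior below are my own illustration, not Apple's implementation; the "48,000 times per second" figure corresponds to making this decision once per sample at a 48 kHz rate.

```python
# Rough mental model of Adaptive Transparency (not Apple's actual
# algorithm): measure the ambient level and, if it exceeds 85 dB,
# apply just enough attenuation to bring it back down to 85 dB.

THRESHOLD_DB = 85.0  # the cap Apple cites for Adaptive Transparency

def limiter_gain(level_db: float) -> float:
    """Linear gain to apply to the ambient feed for a given level.

    At or below the threshold, sound passes through at unity gain;
    above it, gain is reduced so the output lands at 85 dB.
    (dB = 20 * log10(gain), so a 20 dB excess maps to a gain of 0.1.)
    """
    if level_db <= THRESHOLD_DB:
        return 1.0
    return 10 ** ((THRESHOLD_DB - level_db) / 20)

# A 60 dB conversation is untouched; a 105 dB siren is cut to a tenth
# of its amplitude, which brings it down to the 85 dB ceiling.
print(limiter_gain(60.0))   # → 1.0
print(limiter_gain(105.0))  # → 0.1
```

A real implementation would also smooth the gain changes over time to avoid audible pumping, but the core idea is just this threshold test repeated for every sample.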
True, Apple isn’t the pioneer in developing Adaptive Transparency or anything like it. Although Bose’s QuietComfort Earbuds II have a feature similar to Apple’s, PCMag says it isn’t as well developed.
Al Griffin, an editor at TechRadar, wore AirPods Pro 2 to a show. According to him, the concert peaked at 114.7 decibels, but with the help of Adaptive Transparency he was only exposed to sounds up to 85 decibels. Griffin says he could not hear the music clearly through earplugs but could through his AirPods, even though the AirPods are far more expensive. He admits that wearing AirPods Pro 2 to a performance is not ideal, but he plans to do so anyhow.
However, it appears that you may not need to purchase a new set of AirPods Pro in order to take advantage of Adaptive Transparency. Apple’s upcoming iOS 16.1 beta adds Adaptive Transparency support to the first-gen AirPods Pro, even though they lack the second-gen H2 processor that Apple claims powers the functionality in the AirPods Pro 2.
Assuming the feature even works on the first-generation AirPods Pro, I expect it to perform less smoothly: since the H1 chip isn't powerful enough to process sounds 48,000 times per second, the noise reduction will perhaps kick in a little later. But if it means I don't have to spend an additional $250 on a new set of AirPods, I'll be satisfied getting most of Adaptive Transparency's advantages on my current headphones.
Perhaps Apple will bring Adaptive Transparency to the AirPods Max as well, if it's bringing it to the original AirPods Pro. We won't know for sure until iOS 16.1 is released to the public.