In just about a week and a half, the mystery behind a surprisingly spartan invitation, historical locations, and unusual buildings will be revealed, and we’ll finally know what Apple’s new smartphone looks like.
Among the many rumors that surround the event, one of the more interesting is that the upcoming new iPhone models may contain a range of environmental sensors that could make Apple’s mobile handset much more aware of the world around it, and open up a realm of new possible applications in the process.
Let’s start with the technology that would probably have the largest number of applications: a barometric sensor, such as the ones made by German industrial giant Bosch. A sensor of this kind would let iOS apps measure the ambient air pressure at the iPhone’s current location.
With a bit of calibration to account for local conditions, this sensor could be used to estimate the altitude of the handset (and, presumably, of its owner), enabling precise positioning inside buildings, reportedly a major focus of Apple’s recent mapping efforts. Combined with other technologies like iBeacons and GPS, altitude detection could make it easy to navigate large buildings without getting lost or having to stop for directions.
Even without calibration, the sensor would be capable of detecting elevation changes, which could turn it into a very useful tool in the creation and management of walking and cycling maps, where changes in incline can mean the difference between a leisurely stroll and an unexpected cardio workout.
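As a rough illustration, the pressure-to-altitude conversion described above can be sketched with the international barometric formula, the same approximation that appears in the datasheets for Bosch’s pressure sensors. The sea-level reference pressure below is the standard-atmosphere value of 1013.25 hPa; in practice it would need to be calibrated to local weather conditions, which is why an uncalibrated sensor is only good for relative elevation changes:

```python
def pressure_to_altitude(pressure_hpa: float,
                         sea_level_hpa: float = 1013.25) -> float:
    """Estimate altitude in meters from ambient pressure using the
    international barometric formula. Without a locally calibrated
    sea_level_hpa, the absolute value drifts with the weather, but
    differences between two readings still track elevation changes."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

Near sea level, a drop of 1 hPa corresponds to roughly 8 meters of altitude gain, which is comfortably enough resolution to tell apart the floors of a building (a typical story is about 3 meters).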
Is it just me, or is it hot in here?
Two other possible environmental technologies that could find their way into an iPhone are temperature and humidity sensors. These would be harder to use, because they are likely to suffer from interference from the user’s body and from the iPhone’s own internal electronics: the temperature recorded inside your pocket is likely to be very different from what you would measure holding your handset out in the street on a February morning, and a hot CPU is likely to affect both temperature and humidity readings inside the phone’s case.
Still, there are probably some relatively simple heuristics that could be applied in software to help improve the reliability of the measurements taken by these sensors. For example, if the screen is on and user input is being detected, it’s likely that the phone isn’t in your pocket, and GPS data could be used to determine with a good amount of certainty whether you are indoors or not.
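A minimal sketch of that kind of heuristic might look like the following. Every input here is an illustrative signal, not a real iOS API; the thresholds are invented for the example:

```python
def is_trustworthy_sample(screen_on: bool, user_active: bool,
                          cpu_load: float, has_gps_fix: bool) -> bool:
    """Decide whether a raw temperature/humidity sample is likely to
    reflect the ambient environment rather than a pocket or a hot CPU.
    All inputs are hypothetical signals, not actual iOS APIs."""
    if not (screen_on and user_active):
        return False  # handset is probably in a pocket or a bag
    if cpu_load > 0.8:
        return False  # internal heat is likely skewing the sensor
    if not has_gps_fix:
        return False  # can't judge indoors vs. outdoors; discard
    return True
```

A real implementation would presumably average many samples that pass such a filter rather than trusting any single reading.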
The data generated by these sensors could be very useful in conjunction with Apple’s upcoming HomeKit technology: by monitoring the conditions inside your house, iOS could help manage your heating to keep you comfortable wherever you are. In fact, in a not-so-distant future in which wearable technologies and home automation are commonplace, these sensors would provide “hyperlocal” information that could make it possible to manage your home’s climate at a very fine level, so that, for example, only the room you’re in is heated, cutting the energy wasted in the process.
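Reduced to its essentials, that room-by-room scenario is a small piece of logic. The room names, occupancy flags, and temperature readings below are all invented for illustration, and nothing here reflects HomeKit’s actual API:

```python
def rooms_to_heat(occupancy: dict, temperatures: dict,
                  target_c: float = 21.0) -> set:
    """Return the set of rooms whose heating should be switched on:
    only rooms that are currently occupied *and* below the target
    temperature. Inputs stand in for hypothetical wearable/sensor
    data, not a real HomeKit interface."""
    return {room for room, occupied in occupancy.items()
            if occupied and temperatures.get(room, target_c) < target_c}
```

For example, with the living room occupied at 18 °C and an empty bedroom at 16 °C, only the living room would be heated.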
And let’s not forget that the scale at which iOS operates opens up a number of other interesting applications. Considering that there are hundreds of millions of iPhone and iPad users, the introduction of environmental sensors might one day turn each device into a miniature weather station, allowing scientists to measure climate in places, and at a level of detail, hitherto impossible to achieve.
Of course, this is all still wild speculation at this point; given the possibilities, however, I will definitely keep my fingers crossed that, come September 9, our iPhones will be more aware of their surroundings than ever.