You put it up to your ear and the screen goes blank. Turn it on its side, and the screen rotates. Walk outside and it can find your location, point a compass in the correct direction and post your geostatus on Google Latitude. Point it at a Twitter user in the real world, and you can see his or her status on a pop-up screen.
While not every feature on the iPhone is unique, the way Apple has implemented some of those features is. And the iPhone serves as the base for tens of thousands of third-party applications, many of which make use of its built-in features in innovative ways.
Although Apple wouldn't comment for this story, preferring not to divulge too much about how the 'magic' works, we've dug around and compiled information from various sources.
To take a closer look at how some of the technology inside the iPhone works, read on.
Screen coating protects against dirt
Earlier generations of the iPhone were dirt magnets, leaving the screen smudged up after normal use, but the new iPhone 3GS has a protective coating that wards off oily grime. Amazingly, it works — the 3GS is relatively smudge-free, although it does still pick up dust easily.
Gadget blog Gizmodo posted a helpful description from Bill Nye the Science Guy about the oleophobic coating that Apple engineers stuck on the new model’s glass. It is an anti-bonding agent, similar to car wax sealants that cause water to bead, rather than pool, on your car’s roof, Nye explained.
The same effect happens with Apple’s polymer coating, preventing the grime from bonding, building and leaving telltale smudges behind.
Screen dims automatically to save power
When you have not used your iPhone for a while, the screen dims to save power. In addition, as you use the phone, brightness adjusts depending on whether you are in sunlight or darkness.
This isn't just a timed event driven by an internal clock that knows when the sun goes down. The iPhone uses an ambient light sensor, as The New York Times' David Pogue explained in an illuminating 2007 column.
Light sensors scan for electromagnetic intensity
On the iPhone, a chemical-based photocell changes its power setting depending on an infrared reading. A photocell is better at reading environmental data than a simple on-off light sensor you might have installed above your garage — photocells can detect sustained differences, knowing when you have just passed your hand over the sensor versus entered a dark room or walked outside.
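That "sustained difference" idea can be sketched in a few lines of Python. This is not Apple's code; the function name, window size, and threshold are invented for illustration, and the readings are normalized light levels between 0 (dark) and 1 (bright).

```python
def should_adjust(readings, window=5, threshold=0.3):
    """Return True only when a change in light level is sustained.

    A brief dip (a hand passing over the sensor) fills only part of
    the recent window; a real change (walking into a dark room)
    fills all of it.
    """
    if len(readings) <= window:
        return False
    baseline = readings[0]
    recent = readings[-window:]
    # Every recent reading must differ from the original baseline
    return all(abs(r - baseline) > threshold for r in recent)

# Hand waved over the sensor: one dark sample, then bright again
print(should_adjust([0.9, 0.9, 0.1, 0.9, 0.9, 0.9]))   # False
# Walking into a dark room: the readings stay low
print(should_adjust([0.9, 0.1, 0.1, 0.1, 0.1, 0.1]))   # True
```

The same debouncing pattern shows up anywhere a sensor reading should trigger a change only when it persists.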
Capacitive touch screen
On its iPhone site, Apple has revealed some clues about how the touch screen works. A transparent panel beneath the screen glass senses your touch as a disturbance in an electrical field, and a controller passes that reading along to the operating system.
In other words, your finger changes the electrical charge at a point on the panel, which in turn tells the operating system where you touched the screen and which actions have been triggered.
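The position math can be illustrated with a toy model. This is a hypothetical sketch, not Apple's implementation: it assumes the controller exposes a small grid of per-cell capacitance changes, and it estimates the touch point as the charge-weighted centroid of those cells.

```python
def touch_centroid(grid):
    """Estimate the touch point from a grid of capacitance deltas.

    Each cell holds the change in measured capacitance; the finger's
    position is taken as the weighted average of the affected cells.
    Returns (x, y) in cell coordinates, or None if nothing changed.
    """
    total = sx = sy = 0.0
    for y, row in enumerate(grid):
        for x, delta in enumerate(row):
            total += delta
            sx += x * delta
            sy += y * delta
    if total == 0:
        return None          # no touch detected
    return (sx / total, sy / total)

# A finger pressing near column 2, row 1 of a tiny 3x4 sensor grid
grid = [
    [0.0, 0.1, 0.2, 0.0],
    [0.0, 0.3, 0.9, 0.1],
    [0.0, 0.1, 0.2, 0.0],
]
print(touch_centroid(grid))   # about (1.79, 1.0)
```

Because the centroid falls between cells, the reported position can be finer than the sensor grid itself, which is one reason capacitive screens feel precise.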
Every touch screen phone uses a similar method, but what sets the iPhone apart is how quickly the iPhone OS responds to swipes, pinches and finger presses: so fast that a burgeoning market has grown up for high-quality iPhone games that some say rival even the mighty Nintendo DS and PlayStation Portable.
Proximity sensor blanks screen automatically
When you move the iPhone to your ear, it automatically blanks the screen. This saves battery power, since you don’t need the screen while talking on the phone (normally), and it also prevents you from touching an onscreen key — such as the one to end a call — by mistake.
Based on our own hands-on testing with the device, there are at least two (and possibly three) infrared sensors located near the ear speaker. Like most proximity sensors in smartphones, they emit an infrared beam and watch for a reflection from anything within about a half-inch of the phone. Test this by placing an object over one of the sensors: the screen goes blank, then reappears when you move the object away.
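The decision logic behind that behavior can be sketched simply. This is an illustrative guess, not Apple's code: it assumes a normalized reflected-IR reading (0 = nothing near, 1 = fully blocked) and uses two thresholds, a common trick called hysteresis, so the screen doesn't flicker when the reading hovers near a single cutoff.

```python
def proximity_screen_state(reflection, screen_on,
                           on_thresh=0.2, off_thresh=0.6):
    """Decide whether the screen should be on for one sensor reading."""
    if screen_on and reflection > off_thresh:
        return False   # something (an ear) is close: blank the screen
    if not screen_on and reflection < on_thresh:
        return True    # obstruction gone: wake the screen
    return screen_on   # in between: keep the current state

# Lifting the phone to an ear, holding it there, then lowering it
state = True
for reading in [0.1, 0.7, 0.7, 0.4, 0.1]:
    state = proximity_screen_state(reading, state)
    print(state)       # True, False, False, False, True
```

Note the fourth reading (0.4) keeps the screen blank even though it is below the blanking threshold; that gap between the two thresholds is what absorbs the jitter.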
Going full tilt
Jon Peddie, a consumer tech analyst, explains how the accelerometer works. “The accelerometer knows where the center of the earth is,” Peddie says. “The trick is to sense a change.”
Peddie explains that the first accelerometers had tiny magnetic cylinders that could slide from inside one very tightly wound transformer to another nearby, causing the signal in one to go down while the signal in the other went up. These were called linear variable differential transformers (LVDTs).
Today, like much modern electronics, the iPhone employs micro-electromechanical systems (MEMS). These devices have tiny polysilicon arms (3 microns thick and 125 to 150 microns long) with small hammer-like masses on the end. The arms act like springs and hold the MEMS structure above a substrate. Acceleration deflects the arms from their center position, and just as in the old electromechanical devices, the movement of that tiny mass is detected, in this case by capacitors, and a signal is generated.
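The capacitive readout described above can be reduced to one formula. In a typical MEMS design (this is a generic sketch, not the specific part in the iPhone), the proof mass sits between two fixed plates; acceleration pushes it toward one plate, raising one capacitance and lowering the other, and the normalized difference is proportional to the deflection. The sensitivity constant here is an arbitrary assumed scale.

```python
def mems_acceleration(c1, c2, sensitivity=9.8):
    """Convert a differential capacitance reading to acceleration.

    c1, c2: capacitances (any consistent units) between the proof
    mass and the two fixed plates. sensitivity: assumed output in
    m/s^2 at full deflection.
    """
    return sensitivity * (c1 - c2) / (c1 + c2)

print(mems_acceleration(1.0, 1.0))   # 0.0 - at rest, plates balanced
print(mems_acceleration(1.2, 0.8))   # positive - mass pushed one way
```

Using the ratio rather than the raw difference makes the reading insensitive to manufacturing variation in the absolute capacitance, which is tiny to begin with.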
Shake to shuffle and more
What we know from hands-on testing is that the iPhone shake feature uses the accelerometer. When it senses the phone has moved (in this case, from side to side), the OS fires an API event, such as shuffling to a random song. The accelerometer is sensitive enough, says Peddie, to tell the difference between a shake and simply turning the phone on its side to signal landscape mode.
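One plausible way to make that distinction (a guess at the general technique, not Apple's actual algorithm) is to count strong back-and-forth reversals in the acceleration along one axis: a shake produces several rapid sign flips, while a tilt into landscape is one smooth change. The thresholds below are invented for illustration.

```python
def is_shake(samples, min_reversals=3, magnitude=5.0):
    """Distinguish a shake from a slow tilt.

    samples: accelerations along one axis in m/s^2, gravity removed.
    A reversal is a sign flip where the new reading is also strong.
    """
    reversals = 0
    for prev, cur in zip(samples, samples[1:]):
        if abs(cur) > magnitude and prev * cur < 0:
            reversals += 1
    return reversals >= min_reversals

print(is_shake([8, -9, 7, -8, 9, -7]))   # True  - rapid reversals
print(is_shake([0, 1, 2, 3, 4, 5]))      # False - smooth tilt
```

Requiring both a sign flip and a large magnitude keeps small hand tremors from registering as shakes.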
GPS finds your current location
As long as you are outside and not stuck in a tunnel or inside an office complex, your iPhone can find your exact location using an internal GPS. Like most devices and smartphones with built-in GPS, the receiver in the iPhone can read a signal from a series of orbiting satellites — about three dozen of them — that transmit a near-constant signal.
The iPhone receiver reads timing data from the satellites and figures out, based on how long each transmission took, its distance to each satellite, and then calculates your location. Each additional satellite signal narrows the estimate; once the iPhone acquires signals from at least three satellites, the receiver can trilaterate your position, and a fourth pins down altitude as well.
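The geometry behind this can be shown with a flattened, two-dimensional toy version (real GPS works in three dimensions and must also solve for the receiver's clock error, which this sketch ignores). Distance comes from signal travel time at the speed of light; the fix comes from intersecting the distance circles, which reduces to a small linear system.

```python
import math

def satellite_distance(travel_time_s, c=299_792_458.0):
    """Distance to a satellite from the signal's travel time."""
    return c * travel_time_s

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Find the point at distances r1, r2, r3 from points p1, p2, p3.

    Subtracting the circle equations pairwise cancels the squared
    unknowns, leaving two linear equations in x and y.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A receiver at (3, 4) measures its distance to three "satellites"
p = trilaterate_2d((0, 0), 5.0,
                   (10, 0), math.hypot(7, 4),
                   (0, 10), math.hypot(3, 6))
print(p)   # approximately (3.0, 4.0)
```

With only two satellites the circles intersect at two points, which is why the third signal is needed to settle on one answer.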
Apps that use the iPhone microphone
According to the developers of both apps, Leaf Trombone and Vocoder can interpret the intensity of a sound wave — essentially, a vibration that moves through the molecules in air — and then trigger a related sound. With Vocoder, for example, as you blow softly you can play a note on a piano that is equally soft, and as you blow harder the sound becomes louder. Leaf Trombone reads the sound waves and creates matching pitches depending on where you move the leaf shown on the screen. Both apps use the iPhone microphone to read the sound waves.
Leaf Trombone reads the intensity of the sound wave over the microphone.
“We are leveraging audio signal-processing algorithms to analyze the audio stream from the microphone,” says Ge Wang, Leaf Trombone’s creator and assistant professor at the Center for Computer Research in Music and Acoustics at Stanford University. “The algorithm tracks the energy level of the audio input and conditions that signal into something that can be readily used” for blowing into the microphone.
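Wang's description of tracking and conditioning the energy level maps onto a standard signal-processing pattern. The sketch below is a generic illustration of that pattern, not Smule's actual code: it computes the root-mean-square (RMS) energy of each short frame of audio samples, then low-pass filters the result so breath noise drives loudness smoothly rather than jumping frame to frame.

```python
import math

def energy_envelope(samples, frame=4, smooth=0.5):
    """Track the energy of an audio stream, frame by frame.

    samples: raw amplitudes in [-1, 1]. Returns one smoothed
    energy value per frame.
    """
    env, level = [], 0.0
    for i in range(0, len(samples) - frame + 1, frame):
        rms = math.sqrt(sum(s * s for s in samples[i:i + frame]) / frame)
        # Exponential smoothing "conditions" the raw energy signal
        level = smooth * level + (1 - smooth) * rms
        env.append(level)
    return env

# Blowing softly, then harder: the envelope rises with the input
print(energy_envelope([0.1] * 8 + [0.8] * 8))
```

The smoothed envelope is what then gets mapped to a note's volume; real apps use much larger frames (hundreds of samples) at audio sample rates.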
Augmented reality apps show pop-up data
These tools depend on several iPhone functions. According to the developer, the Nearest Tube app reads your phone's GPS location, finds nearby train stations in London, then reads compass and accelerometer data to work out which direction you're pointing the phone so it can show you which way each station lies. OpenGL technology, essentially a graphics engine for drawing pixels, renders the Tube data in real time. The app draws the overlays on top of the camera view so they appear as pop-ups.
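The core calculation in any such overlay can be sketched as follows. This is a generic illustration, not Nearest Tube's code: it computes the compass bearing from the phone to a station with the standard great-circle bearing formula, then compares it with the phone's heading to decide where on screen (if anywhere) the label belongs. The 60-degree field of view is an assumed value.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing from the phone's position to a target, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def overlay_offset(station_bearing, heading, fov=60):
    """Horizontal screen position for the label: 0 is the left edge,
    0.5 the center, 1 the right edge; None if outside the camera view."""
    diff = (station_bearing - heading + 180) % 360 - 180
    if abs(diff) > fov / 2:
        return None
    return 0.5 + diff / fov

# A station due east of the phone, with the phone also facing east
b = bearing_deg(51.5, -0.1, 51.5, -0.05)
print(overlay_offset(b, 90))   # near 0.5 - label at screen center
```

As the compass heading changes, the label slides across the screen, which is exactly the effect that makes the overlays feel anchored to the real world.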
John Brandon is a veteran of the computing industry, having worked as an IT manager for 10 years and a tech journalist for another 10. He has written more than 2,500 feature articles and is a regular contributor to Computerworld.