It’s the talk of consumer electronics and gaming. Depending on who you ask, 3D could change the way we watch TV and play computer games, but there’s one thing everyone seems to agree on: who wants to wear those goofy glasses?
It turns out you might not have to. At this week’s CeBIT IT fair, engineers are showing a new breed of screen that projects a 3D image towards the viewer’s eyes so glasses aren’t required.
Glasses select which of two images each eye sees, so the right eye gets one image and the left eye the other, using either color filters or shutters synchronized with the screen. In the new displays that separation is done instead by a panel of tiny lenses sitting in front of the screen.
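To make the idea concrete (this is a generic illustration, not how any of the products below implement it): a lenticular display typically interleaves the left-eye and right-eye images column by column, and the lens panel then steers alternating columns toward each eye. A minimal sketch in Python, with toy pixel values:

```python
def interleave_columns(left, right):
    """Interleave two equally sized images column by column.

    left, right: 2D lists of pixels (rows of columns).
    Even output columns come from the left-eye image, odd columns
    from the right-eye image; the lens panel in front of the screen
    then directs each set of columns to the matching eye.
    """
    out = []
    for left_row, right_row in zip(left, right):
        row = [left_row[c] if c % 2 == 0 else right_row[c]
               for c in range(len(left_row))]
        out.append(row)
    return out

# One-row toy images: columns labelled by source image.
left = [["L0", "L1", "L2", "L3"]]
right = [["R0", "R1", "R2", "R3"]]
print(interleave_columns(left, right))  # [['L0', 'R1', 'L2', 'R3']]
```

Real panels slant the lenses and spread more than two views across them, but the column-interleaving principle is the same.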
The basic technology isn’t new, but in the past this type of screen projected the 3D image to a single spot in space, and the viewer had to be in that sweet spot to see it. Move to another position and all you saw was a blurred image, and if you watched the screen with a friend, someone was out of luck. With the new displays, though, that’s changing.
Sunny Ocean Studios has developed a panel that can be fitted to a standard display, sending out a stereoscopic image to 64 positions around the screen.
“This means you have a very large area to view, you can run around and see a nice 3D display,” said Armin Grasnick, managing director of the Singapore-based company. “Normally you have a few, just five or eight or nine angles, but now we have 64 and it’s very easy to catch a 3D effect.”
Sunny Ocean hopes to sell the panels to display makers.
A similar panel has been developed by Germany’s SeeFront, but it projects an image to a single point. However, that point can be moved around, and by watching the viewer with a camera the system makes constant adjustments to the image so it follows the viewer’s head movements.
“We have a camera here as part of the display and the camera is looking at you,” said Christoph Grossman, founder and CEO of the company. “It determines your position in space in terms of X, Y, Z coordinates and this information is passed on to an algorithm running in the computer in the background and is taken into account to give you the best 3D image to your current position, so you can move around freely and still see a great 3D image.”
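SeeFront hasn’t published its algorithm, but the general idea of turning a tracked head position into an image adjustment can be sketched as follows. Here the viewer’s X, Z coordinates (sideways and distance from the screen) are mapped to a horizontal shift of the rendered stereo image; all function names and constants are hypothetical:

```python
import math

def view_shift_px(x_mm, z_mm, screen_width_mm=510.0, max_shift_px=32):
    """Map a tracked head position to a horizontal image shift in pixels.

    x_mm: viewer's sideways offset from the screen centre, in mm.
    z_mm: viewer's distance from the screen, in mm.
    The viewer's angle off the screen axis is normalised against the
    half-angle subtended by the screen at that distance, then scaled
    to a maximum pixel shift. Constants are illustrative only.
    """
    angle = math.atan2(x_mm, z_mm)                   # off-axis angle
    half_fov = math.atan2(screen_width_mm / 2, z_mm)  # screen half-angle
    frac = max(-1.0, min(1.0, angle / half_fov))      # clamp to [-1, 1]
    return round(frac * max_shift_px)

# A viewer dead ahead needs no adjustment:
print(view_shift_px(0.0, 600.0))    # 0
# A viewer level with the screen's right edge gets the full shift:
print(view_shift_px(255.0, 600.0))  # 32
```

A real system would also use the Y coordinate and re-render the views rather than merely shifting them, but the loop is the same: camera, position estimate, image update.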
SeeFront is demonstrating the system on a MacBook computer with input from the existing camera in the machine.
Perhaps the most impressive system was a screen from Germany’s Fraunhofer Institute. It has a panel of narrow, cylindrical lenses in front of the screen, which separates the two interleaved images and directs one to each eye.
“Of course you want to move,” said Bernd Duckstein, a research associate at Fraunhofer’s Heinrich Hertz Institute. “So on top of the screen we have two cameras, which detect the position of the viewer’s eyes and according to the position of the eyes, the lenticular plate is moving in front of the flat-panel screen so the channels can follow the eyes of the user.”
The panel moves by up to three millimeters to adjust to the viewing position, he said.
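The geometry behind that adjustment is simple similar triangles: a viewer displaced sideways needs the lens plate shifted by roughly the eye offset scaled by the ratio of the lens-to-pixel gap to the viewing distance. A hedged sketch, with the gap value and 3 mm total travel treated as illustrative assumptions rather than Fraunhofer’s actual figures:

```python
def plate_offset_mm(eye_x_mm, viewer_z_mm, lens_gap_mm=2.0, travel_mm=3.0):
    """Offset (mm) to slide a lenticular plate so its channels track the eyes.

    eye_x_mm: midpoint of the eyes, sideways from screen centre, in mm.
    viewer_z_mm: viewing distance in mm.
    By similar triangles the required shift is about
    eye_x * lens_gap / viewer_z, where lens_gap is the spacing between
    the lens plate and the pixels. The result is clamped to the plate's
    mechanical travel. All values are hypothetical.
    """
    needed = eye_x_mm * lens_gap_mm / viewer_z_mm
    half = travel_mm / 2
    return max(-half, min(half, needed))

print(plate_offset_mm(0.0, 600.0))    # 0.0 — viewer centred, no movement
print(plate_offset_mm(300.0, 600.0))  # 1.0 — small shift re-aims the channels
```

The key point the numbers illustrate is why such tiny travel suffices: because the plate sits only millimetres from the pixels, a sub-millimetre shift swings the viewing channels across a large angle at normal viewing distance.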
Additionally, infrared cameras above the screen watch for hand gestures, letting users manipulate on-screen objects and control software without actually touching the screen.
The technologies are more complex than those coming to market this year but judging from the reaction at CeBIT, they’re definitely worth keeping both eyes on.