Editor’s Note: The following article is reprinted from Network World.
University of Utah researchers and programmers are creating an iPhone application that will let users edit massive image files containing hundreds of gigabytes of data. When you get bored with that, you can use another of their recently released applications to virtually dissect a real human corpse.
It’s easy to see the latter being a big hit at Halloween parties this year.
Still in development, ViSUS on the iPhone will let you edit, view and zoom in on such objects as very high-resolution, large-scale CT scans, satellite images and geographic imagery from Google Earth. The phone is only a visualization platform: all those gigabytes of image data reside on high-powered servers elsewhere and are streamed to the iPhone for rendering.
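The idea resembles a tiled map viewer: the phone asks the server only for the region, and the resolution, that its screen can actually display. A minimal Swift sketch of that client side follows, assuming a hypothetical HTTP endpoint and query parameters (the article does not describe ViSUS’s actual streaming protocol):

    import Foundation

    // A rectangular region of a huge source image, in source-pixel coordinates.
    struct ImageRegion {
        let x: Int, y: Int, width: Int, height: Int
    }

    // Fetches only the pixels needed to fill the phone's screen from a remote
    // image server. The endpoint and parameter names are hypothetical; the point
    // is that the multi-gigabyte image never has to leave the server.
    func fetchTile(of region: ImageRegion,
                   screenWidth: Int, screenHeight: Int,
                   from server: URL) async throws -> Data {
        var components = URLComponents(url: server, resolvingAgainstBaseURL: false)!
        components.queryItems = [
            URLQueryItem(name: "x", value: String(region.x)),
            URLQueryItem(name: "y", value: String(region.y)),
            URLQueryItem(name: "w", value: String(region.width)),
            URLQueryItem(name: "h", value: String(region.height)),
            // Ask the server to downsample to the screen's resolution so the
            // reply is a few hundred kilobytes, not hundreds of gigabytes.
            URLQueryItem(name: "out_w", value: String(screenWidth)),
            URLQueryItem(name: "out_h", value: String(screenHeight)),
        ]
        let (data, _) = try await URLSession.shared.data(from: components.url!)
        return data   // decode into a UIImage or GPU texture for display
    }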
The university is turning into a hotbed of high-definition iPhone image applications, all released in the last few months and available on Apple’s App Store via iTunes.
The others, which all run natively on the iPhone with their attendant data, are:
• ImageVis3D Mobile: Lets you import 3D images of medical CT or MRI scans, or anything else, onto your iPhone and quickly display, rotate, and manipulate them (a simplified volume-rendering sketch follows this list).
It’s based on a desktop/laptop application originally developed by the University’s Scientific Computing and Imaging Institute (SCI), which specializes in software for visualization, scientific computing and image analysis.
“Rendering the data on the iPhone is the really incredible thing about this,” says Tom Fogal, a software developer with SCI who co-wrote the PC-based software with Jens Krueger, a German computer scientist. Krueger wrote the iPhone version.
“It demonstrates the progress in hardware and [in] software algorithms gleaned from about 25 years of research,” Fogal says. “Ten years ago, few people would consider doing what ImageVis3D does without a million-dollar supercomputer. And now, we do it on something that fits in the palm of your hand.”
SCI is considering porting the application to other mobile platforms, including tablets, but they “have not yet found a device we’d like to target,” Fogal says.
ImageVis3D Mobile is a free application that appeared in the App Store in September.
• AnatomyLab: A series of images lets you study a real cadaver through 40 stages of an actual dissection. It’s designed for anatomy students, many of whom don’t have access to cadavers in anatomy labs.
The application grew out of an anatomy textbook project by biology Professor-Lecturer Mark Nielsen and Ph.D. student Shawn Miller, who created a DVD for the textbook showing the sequential dissection of a real body. They decided to turn it into an iPhone application, and Nielsen asked his son Scott, a physics major at Utah, to write the code. The application sells for $10 and went on sale in July.
• My Body: A scaled-down version of AnatomyLab, intended for the general public. It’s $@ and has been in the App Store since August.
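As noted in the ImageVis3D Mobile entry above, the core job there is turning a 3D grid of scan intensities into a 2D picture on the screen. ImageVis3D itself renders on the GPU; the maximum-intensity projection below is only a simplified CPU sketch of that general idea, in Swift, with an assumed voxel layout:

    import Foundation

    // A scalar volume such as a CT or MRI scan: a 3D grid of 8-bit intensities,
    // stored x-fastest, then y, then z (an assumed, but common, layout).
    struct Volume {
        let width: Int, height: Int, depth: Int
        let voxels: [UInt8]

        func value(x: Int, y: Int, z: Int) -> UInt8 {
            voxels[(z * height + y) * width + x]
        }
    }

    // Collapses the volume into a 2D image by keeping the brightest voxel along
    // each line of sight (a maximum-intensity projection). This is not how
    // ImageVis3D works internally; it merely shows the shape of the problem:
    // a 3D voxel grid in, a 2D pixel grid out.
    func maximumIntensityProjection(of volume: Volume) -> [UInt8] {
        var image = [UInt8](repeating: 0, count: volume.width * volume.height)
        for y in 0..<volume.height {
            for x in 0..<volume.width {
                var brightest: UInt8 = 0
                for z in 0..<volume.depth {
                    brightest = max(brightest, volume.value(x: x, y: y, z: z))
                }
                image[y * volume.width + x] = brightest
            }
        }
        return image   // wrap in a CGImage or upload as a texture to display
    }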
ViSUS will take the iPhone’s imaging capabilities to a whole new level.
Today the software, originally written for workstations and PCs by Valerio Pascucci, an SCI Institute faculty member, is a powerful 3D visualization program that can deal with massive data sets. One application has been to combine it with tools designed specifically for climate change researchers and meteorologists.
When released for the iPhone, ViSUS will marry the iPhone’s big, bright screen to back-end server power, enabling mobile users to render and navigate very large, very detailed images.
The iPhone has a screen resolution of 480 x 320 pixels. According to the university, the best of today’s high-definition TV sets has an image resolution of 1,920 x 1,080 pixels. But ViSUS can handle an image resolution of 200,000 by 200,000 pixels.
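To put those figures in perspective: the iPhone’s screen holds 480 x 320 = 153,600 pixels and an HDTV frame about 2.1 million, while a 200,000 x 200,000 image contains 40 billion pixels. Assuming a typical three bytes per uncompressed pixel, that works out to roughly 120 gigabytes of raw data, which is why the full image stays on the server rather than on the phone.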
Streaming the images to the iPhone will let users view an entire image at lower resolution, or zoom in on portions of it at higher resolution. The university says ViSUS handles the images faster, and with less processing power, than other software such as Google Earth.
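One common way to implement that trade-off is a multi-resolution pyramid, in which the requested region is downsampled just enough to fit the screen. The Swift sketch below illustrates that general scheme; ViSUS’s actual hierarchical data format is not described in the article, so the power-of-two pyramid here is an assumption:

    import Foundation

    // Picks how aggressively to downsample a requested region so the data sent
    // to the phone never exceeds what its screen can show.
    func downsampleLevel(regionWidth: Int, regionHeight: Int,
                         screenWidth: Int, screenHeight: Int) -> Int {
        var level = 0
        var w = regionWidth, h = regionHeight
        // Halve the region's resolution until it fits on the screen.
        while w > screenWidth || h > screenHeight {
            w /= 2
            h /= 2
            level += 1
        }
        return level
    }

    // Viewing the full 200,000 x 200,000 image on a 480 x 320 screen asks the
    // server for level-10 data (downsampled by 2^10 = 1,024), about 195 x 195
    // pixels instead of 40 billion.
    let overview = downsampleLevel(regionWidth: 200_000, regionHeight: 200_000,
                                   screenWidth: 480, screenHeight: 320)
    // Zooming in on a 1,000 x 1,000 detail needs only level 2 (a 250 x 250 reply).
    let detail = downsampleLevel(regionWidth: 1_000, regionHeight: 1_000,
                                 screenWidth: 480, screenHeight: 320)
    print(overview, detail)   // prints: 10 2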