People who read our reviews, whether in print or online, are a discerning bunch; if your letters to us are any indication, you hold us to some pretty high standards of accuracy and ethics. That’s great, since we hold ourselves to those same standards.
But occasionally we get a letter that lets us know the reviews process is, for some readers, as clear as mud. If you’ve ever wondered what goes into a review (how we choose products to review, how closely we work with the manufacturers, and how we handle less-than-positive reviews), read on.
How do you decide which products to review?
For each issue, we choose 25 or 30 of the products we think you’re most likely to consider purchasing (or just want to read about). We also shoot for a good balance of hardware and software and a mix of product categories.
Although we focus on products that will appeal to Mac professionals, we also slip in a just-for-fun product every once in a while. One longtime reviewer remembers, back in the magazine’s early days, debating whether or not to review an astrology program; the debate ended when someone pointed out that all six people involved in the discussion were Geminis. The review was printed, and the author received a thank-you note six months later from the program’s developer, who used the proceeds from her booming sales to pay off her student loans and finance her graduate studies.
Product selection isn’t usually cause for debate, though; it’s simply a matter of choosing products that are recent releases or updates with significant changes from earlier versions, and that are likely to interest a substantial number of our readers.
Oddly, getting the products in our hands for review is sometimes more difficult than selecting them. For example, it’s my job as reviews editor to convince PR folks at particularly suspicious companies that we’re not, in fact, trying to scam free software or hardware (besides, we return the hardware, and even the software if they request that we do so). And sometimes I have to coax them to send me products for a roundup review, over their objections that their product “really isn’t comparable to anything else on the market.” Fortunately, most companies are great about sending us evaluation copies or units as soon as their products are released; it’s relatively rare that a company is so reluctant to fulfill our requests that we must resort to purchasing the product ourselves.
Another problem is that we frequently receive the Windows version of software. For example, one developer of TCP/IP software insisted that we review his product. After much discussion about various hardware requirements, he sent us (you guessed it) the Windows version. Further discussions with this developer went something like this:
“We can’t run this; it’s for a PC. We need the Macintosh version.”
“But we only have the one version. It’s compatible with all computers.”
“We told you we were running a Macintosh 8100.”
“Right, I have that in my notes. Why won’t it run?”
“Because a Macintosh is not a PC.”
“What are you talking about? I sent you a 3 1/2-inch disk!”
“Yes, but you sent a PC disk. We need the Mac version.”
“The version I sent you runs on all computers.”
“No, it doesn’t run on a Macintosh.”
“That’s news to me. Who makes the Macintosh?”
Do you ever review beta versions?

Nope. Although we often preview significant new products elsewhere in the magazine, we’re careful to review and rate only products that arrive in shrink-wrapped boxes or that you can purchase online. We want to experience what you, the purchaser, will experience, so we don’t review betas, golden masters, or CDs burned specially for us.
And we have to be vigilant about this rule. Not long ago, we received a very final-looking CD in a shrink-wrapped box and proceeded to install the program. It turned out to be a beta version, the only indication being the words “beta installer” that flashed on the screen during installation. We postponed the review until we received a verifiably final version.
How involved are the companies in the reviews process?
This is one of the most sensitive areas we have to deal with. The editor needs to work closely with the companies: to get the product, to get the information we need to do a fair evaluation, and to make sure we have our facts straight once the review is written (though we never indulge in the all-too-common practice of forwarding advance copies of reviews to the vendors for feedback). But we also need to make sure the reviewers limit their contact with vendors so that their objectivity isn’t compromised.
One well-known software firm invited the press, including potential reviewers, to attend an on-site presentation on the new version of one of its popular programs. We didn’t send a reviewer, fearing his objectivity might be compromised in this case. When our review appeared, we heard strenuous objections from the company that the author could not have been qualified, on the grounds that he hadn’t attended the presentation (or received the $5,000 honorarium given to attendees).
Reviewers do have some contact with the companies, however. Sometimes it’s anonymous, as when the author is checking on the company’s technical support; other times (such as when the product won’t even function well enough for the author to test it properly) the reviewer will identify himself to a product manager and explain the problem. The result may be a negative review, but a much improved product when it’s next updated.
Do you avoid running negative reviews?
We select products for review based on many factors, but not on whether or not we think the product will fare well. We believe we’re doing our readers a service by warning them about products that are buggy, that don’t compare well against competing products, or that could actually do damage.
The makers of such products sometimes take the opportunity to fix the problems and resubmit their products for review; others take a different tack. For example, one of our contributing editors reviewed a product and gave it the worst possible rating. “The development team was very upset and brought me into their lab to show me how hard they had tried. I talked to the developers, saw their detailed architectural documents, met with the head of the project, and even played around in their state-of-the-art user-analysis setup, complete with one-way mirrors, video cameras on the mouse and keyboard, and extensive test scripts. ‘But it doesn’t work,’ I said. They agreed, but couldn’t understand why, since they had tried very hard. Shortly after that, the product was killed.”
Fortunately, such cases are the exception rather than the rule. We don’t enjoy printing negative reviews, and we don’t relish seeing companies go under. But we take our obligation to give our readers honest, unbiased evaluations seriously.
The next time you read a review in these pages, you’ll know a bit more about what’s happening behind the scenes…and the battles we may have waged to bring it to you.
LINDA COMER is a Senior Associate Editor of Reviews, and finished just behind Leonardo DiCaprio in People’s “50 Sexiest People” contest.