
    Consumer Reports auto testing and reliability: Top six myths busted

    Consumer Reports News: November 04, 2011 02:23 PM

    With the recent release of our annual car reliability data, there was widespread coverage of the findings that showed Ford stumbled with some new cars, Chrysler made dramatic improvements, and the Japanese automakers dominated the top 10 brands, joined notably by Volvo. However, not all media outlets seemed to understand or explain our methodologies. So, let's clarify a few key points around the reliability survey and auto testing, busting those myths before they spread further.

    1. The same people who test toasters and coffee test cars.
      Nope. While we take nothing away from our colleagues who do test toasters and coffee, Consumer Reports' Auto Test Division has its own dedicated staff and facility. Most of us worked in the automotive industry before coming here, including stints at General Motors, Land Rover, Ford, Nissan, and Pirelli. Everyone who tests cars has attended high-performance track driving schools, and some staff members even teach them. Many of us have racing experience as well.

    2. Consumer Reports recommends Toyotas without even driving them.
      False again. In order for CR to recommend a car, several things must happen: CR has to buy the car and test it; the car must score high enough in our tests; it must receive average or better predicted reliability based on our annual survey results; and it must perform adequately in Insurance Institute for Highway Safety (IIHS) and National Highway Traffic Safety Administration (NHTSA) safety tests, if tested. For some redesigned vehicles, CR will predict above-average reliability if the outgoing model was consistently very reliable. It has always been this way, and it always will be.

    3. Consumer Reports' reliability data is biased because it only surveys subscribers.
      We're proud of our reliability data. It is based on the largest survey of its kind, with over one million responses. That covers quite a variety of people, as evidenced by the critical letters from subscribers we get every day. The responses also cover more than 300 domestic and imported models, the majority of all the new and used models available in the past 10 years. Do we brainwash people to, say, like certain cars that do well in our tests? We don't. And if we did, it sure doesn't seem to be working: owners of some high-scoring cars in our tests reported problems with their vehicles. In addition, manufacturers rarely dispute our reporting on their vehicles' reliability, which suggests they aren't surprised by the results.

    4. The wife of the director of the Auto Test Division works for (insert car company name here).
      Part of working at Consumer Reports is filling out detailed conflict-of-interest statements, which are reviewed by an independent auditor. These cover both investments and family income. Such a conflict would have to be resolved, by selling the investment or reassigning the staffer, as a condition of employment. Plus, as a matter of policy, our staff can't accept the gifts or trips that are lavished upon the automotive press as routine business with each new model introduction.

    5. Consumer Reports is biased against or for American (or European, or Asian) cars.
      It is true that there is a bias here at Consumer Reports. We are biased in favor of safe, reliable, fuel-efficient vehicles that are enjoyable to drive. To us, it doesn't matter one bit if the car is from a domestic manufacturer or a foreign one, or if it is built in North America or South Korea. The tests are the same for all vehicles, and the results speak for themselves.

    6. Consumer Reports hates cars.
      As I said above, we're enthusiasts here at the test track. Most of us own "fun" cars on the side. We think the same attributes that make a car fun to drive add to dynamic safety. But we also believe that performance doesn't override practical considerations, including fuel economy. After all, we're scoring cars for the daily commute—not for track days or off-roading. Since we're fully independent, we don't take advertising and we don't score borrowed press cars. We can call the shots the way we see them.

      Sometimes the Internet masses agree, sometimes they don't. We come from the unique perspective of experienced engineers with access to over 100 cars a year on a dedicated track and consistent roads. And we're not shooting from the hip; our results come from a jury of engineers and testers using over 50 documented tests, thousands of miles on each test car, and months of everyday use. Each piece of data has been reviewed and checked. Since not every reader has that background, we don't expect everyone to agree with us all the time.

    After all, Sam Sifton, outgoing food critic for The New York Times, recently wrote that all criticism is an argument. No truer words were written. But there's little argument over the truth behind the falsehoods outlined above.

    Tom Mutchler

