Listeners to my radio show know that, in general, I disagree with pretty much everything Consumer Reports says about cars. It is my opinion that since they do not accept advertising dollars and rely on subscription fees, they do their best to grab headlines and sensationalize issues.
Now comes the 2013 Consumer Reports Annual Reliability survey. The big story in this year’s report is a big drop in the rating of the Honda Accord V6. The question is, whom do you believe? Ask Honda Accord V6 owners and I am betting you will get a much different answer; I suspect you will find they absolutely love their cars.
So why do I put no credibility in this list? The answer lies in Consumer Reports’ own information. According to their website:
Consumer Reports’ expert team of statisticians and automotive engineers used the survey data to predict reliability of new 2014 models. Predicted reliability is Consumer Reports’ forecast of how well models currently on sale are likely to hold up.
WAIT! This is a prediction, not an actual study of reliability like the J.D. Power and Associates study? Consumer Reports looks at problems with vehicles over the past three model years, in other words the 2011, 2012, and 2013 models. This is the basis for their predictions about 2014 models.
So let me get this straight…we all know that Ford had problems with their Microsoft Sync system in years past, and we also know that the problem was corrected in 2012. So that means the 2014 Fords should be near the bottom of the list? This is a HUGE flaw in the Consumer Reports method and makes the entire list bogus.
Predictions are not science. Let’s call it what it is: A GUESS. I have seen cars that start out with a few problems, many very minor, that turn out to be some of the best vehicles ever made, and the opposite is true as well. Again, these things make the Consumer Reports list a joke.
Here is another issue that bears looking at, again from Consumer Reports’ own website: Consumer Reports surveys our magazine and website subscribers each year to ask about any serious problems they’ve had with their vehicles in the preceding 12 months.
In the real world, if a vehicle these days has a “serious problem”, which is rare in itself, you can bet it will be corrected by the following month, much less the following year.
One issue I have with both Consumer Reports and the J.D. Power surveys is that they are not weighted by severity. A blown engine counts the same as a broken ashtray. Given that, why pay attention to either of them?
I have no doubt that Toyota and Honda deserve to be at or near the top of any list dealing with reliability. I drive a lot of cars and talk to a ton of people who listen to my radio show, and having Ford at the bottom of the Consumer Reports list is ridiculous. That is my completely unbiased opinion.
Consumer Reports panned the 2012 Honda Civic, which has turned out to be a great car and a favorite of car buyers nationally. Then they turned around and bashed the Toyota Prius C, which dealers can’t keep on the lot because they sell so fast. According to Consumer Reports’ own surveys of their paid subscribers, the Prius C got the best rating of any car made, yet Consumer Reports stands by their poor review of the Prius C. Doesn’t make a lot of sense, does it?
Next problem: a car must be at least “average” in past reliability to make the list. So the vehicles at the bottom of their list get bad publicity, but by their own methodology, cars that scored worse than average are never mentioned at all. Where is the justice in that?
For years I have heard the saying “follow the money.” That would seem to be the case here for sure. They are quick to announce they don’t take ads in the magazine, but they sure do on their website, and let’s face it…sensationalized headlines sell magazines.
Just in case you forgot, the February 2007 issue of Consumer Reports stated that only two of the child safety seats it tested for that issue passed the magazine’s side-impact tests. The National Highway Traffic Safety Administration, which subsequently retested the seats, found that all of those seats passed the corresponding NHTSA tests at the speeds described in the magazine report.
The Consumer Reports article reported that the tests simulated the effects of collisions at 38.5 mph. However, the tests that were completed in fact simulated collisions at 70 mph. Consumer Reports stated in a letter from its president, Jim Guest, to its subscribers that it would retest the seats. The article was removed from the CR website, and on January 18, 2007, the organization posted a note on its homepage about the misleading tests.