Inside Consumer Reports: Controversies surrounding the MacBook Pro and HomePod

By Stephen Silver

Enthusiasts have questioned Consumer Reports' assessments of Apple's products for many years. After touring the organization's test facilities, AppleInsider talked to Consumer Reports leaders about the controversies of the past.

On May 17, AppleInsider paid a visit to the Consumer Reports offices and testing facility in Yonkers, N.Y., where we got an inside look at how the organization arrives at its conclusions about products.

When we sat down with the Consumer Reports decision-makers, one of the topics of discussion was the organization's past assessments of Apple products, including the controversies over its conclusions about the 2016 MacBook Pro in late 2016 and early 2017, and the HomePod in early 2018.

"Biased for consumers"

Everyone we spoke with at Consumer Reports adamantly denied that the organization has been unfair to Apple, arguing that its mission and methodology are entirely analytical and data-driven.

Accusations of bias or unfairness toward certain companies are, a Consumer Reports spokesperson noted, "not a unique complaint." In fact, there have been times when Consumer Reports has been accused of bias in the other direction -- of liking Apple and its products too much.

"We believe that we have certainly been fair when it comes to Apple products and we haven't treated them any differently than anyone else,'" said Jennifer Shecter, CR's Senior Director, Content Impact & Corporate Outreach. "We test more than 7,000 products and services every single year, and I deal with every company who has an issue and they say well, why do you like Apple so much more than you like my product.' or you have other companies who say "why don't you like [mine].'"

"But this not a question of preference," said Shecter. "This is a question of scientific evaluation and comparative testing. It is not about preference, that does not come into play into our testing."

"We treat every company and every product the same way. And it's been like that since 1936," Shecter added. "We deal with Apple products and Apple as a company the same way we deal with every other company. Empirically speaking, we evaluate every product as it comes, and every piece of feedback from the manufacturer as it comes, and that is our process. So we have had robust dialogue with Apple, throughout the processes, and we are responsive to feedback that we get from companies, too."

According to Consumer Reports' director of communications Barrie Rosen, the organization currently recommends Apple products in the categories of smartwatches, smartphones, tablets, streaming media devices, smart speakers, laptops, and all-in-one desktop computers -- just about every category in which Apple offers products.

As far as device usage goes, while touring Consumer Reports' offices, we did note that the desktop computers are PCs. However, we did see quite a few MacBooks on desks there, and when we met with the company personnel in a conference room, at least two had MacBooks in front of them. As for smartphones, we noticed about a 50/50 split between iPhones and Android devices in the offices.

Apple and Consumer Reports

As for Apple and Consumer Reports' relationship, they do have one, and judging by the team's comments about it, it's somewhere between cordial and cooperative.

"We have an ongoing dialogue with Apple," Shecter said. "Whenever we have a story that's coming out that we're going to call Apple out -- just like we would any other manufacturer -- we call them. And when they have questions, we answer them.

Consumer Reports says that its direct dealings with Apple involve all sorts of different Apple personnel. Sometimes the testers talk to marketing people, sometimes product managers, and occasionally even engineers.

"It might start with a product manager," Maria Rerecich, director of the electronics testing team, said. "But we've often wind up in dialogue with the engineers on the product, because they have specific technical questions that they need to be asking and hearing our answers to. We'll go engineer-to-engineer at that point.

Shecter added that communications between the two companies usually happen over the phone, and that in-person meetings with Apple are less common, mostly because the two are headquartered on opposite coasts.

"As we do, we invite manufacturers to come and visit our facility. We're always eager to go to other manufacturers' facilities," she said.

"I think that when we've been in dialogue with them it's been good faith on both sides," said Shecter. "There have been times when we've had to be in constant contact if there's a story coming out, or if there's more stories coming out, but every time they have a question we answer their questions, truly, honestly, and responsively."

The MacBook Pro Story

One of those times dialogue was needed came after the release of Consumer Reports' assessment of the 2016 MacBook Pro.

As AppleInsider reported in December of 2016, CR, upon the release of the newly refreshed MacBook Pro line, declined to recommend the new machines -- the first time it had not recommended a MacBook Pro. The decision was reversed the following month, after lengthy engagement with Apple and the release of a software fix.

The problem, it turned out, was that Consumer Reports had turned off browser caching prior to running the test, exposing a battery bug that Apple later fixed. Apple said in public comments at the time that this was "not a setting used by customers and does not reflect real-world usage."

The team at Consumer Reports gave us their side of what happened in that product assessment. Their engagement with Apple involved extensive communications over the Christmas holiday that year.

"We run battery life tests on laptops," Maria Rerecich said. "And the way we run the battery life test is, we are doing a website cycle over Wi-Fi. We don't run those live over Wi-Fi, we have a server with the webpages stored on it so it's consistent and our testing has to be the same from time to time. So we don't want to encounter the vagaries of someone's website being down. So we have the webpages stored, we run those over wi-fi from our server, to the test machines, and we're basically cycling through webpages."

"We start with the computers fully charged, we set them all to the same brightness level, so again it's consistent across. So just let em run overnight, and then some, depending on [how long they last]," added Rerecich. "We log that recording, and basically once the laptop dies the watch stops, so we can tell exactly when it's petered out. We run that once, we record the results, and we run it a second time. we're looking for consistency within five or ten percent."

"We set them up the same as much as possible, we go with default settings largely, but have turned the cache off, and the reason we turn the cache off is because we don't want the website to be sitting in cache on the machine because then it's not exercising properly," Rerecich continued.

"In the case of the MacBook Pro, that year [2016], when we run the two trials of the same test, we might get 14 hours and 14.1 hours. We may get 16 hours and then 16.3 hours. If it's more than that 5 percent variation, we'll run it a third time and see where it falls. The case of the MacBook, we got 10 hours, then we got 4 hours. We never see a variation like that. This was completely unusual. And the numbers told us that there's something going on. We were not able to evaluate that battery as a result, because what's the right number?' And we felt that based on that testing, there was something that was not explained, and we could not recommend the model as a result. We got in touch with Apple and we had a lot of back and forth with Apple."

"We got in touch with them beforehand and asked them for a comment," Shechter said. "Because we said we're seeing this issue, and it's gonna impact our ability to recommend the product. And what they told us for the story was they didn't know why that would be. They didn't understand why we were getting the results we were getting. So we asked them and they gave us a comment for the story. We published our story and then there had more questions about trying to figure out what the problem could be. So that was over Christmas, the year before [last], and we probably talked to them every single day of that Christmas holiday."

Eventually, CR gave their log of the test to Apple, who determined what had happened.

"Because you're doing it the way you are doing it, there's a bug being triggered in Safari," Shechter said they were told by Apple. "When you turn the cache off, this bug is being triggered. And while we respectfully disagree with you that this is a real-world test, or that this is the best indicator, they said you know, we want to make sure to them the recommendation seemed important, and so we went back and forth, and they decided to issue an update for the bug, and so they issued it, and we re-ran the tests, and it worked. So we were able to recommend it."

Shecter added that CR's policy is to revisit a product assessment after it has denied a recommendation for a specific reason, once the issue behind that reason has been fixed. But it will not retest products that have already been recommended. If it did, she said, "we'd be re-testing products all the time and we'd never even get to new products."

Consumer Reports sees the back and forth with Apple as an instance in which they and a manufacturer worked together to deliver a positive result for consumers.

The HomePod issue

Another rather contentious issue was Consumer Reports' assessment this past February of the HomePod speaker, "The Apple HomePod Sounds Good, but Other Smart Speakers Sound Better." Many, including AppleInsider, took issue with CR's determination that the HomePod fell below such competitors as the Sonos One and Google Home Max. Our editorial also questioned how Consumer Reports had reached such a determination in what appeared to be a single day.

CR responded to that editorial that week, concluding with an invitation to visit their facilities.

During that visit, the CR staffers further filled us in on what happened when they wrote about the HomePod. The assessment was, they said, a "first look" -- a collection of initial impressions that Consumer Reports often publishes for certain high-profile product releases, including most iPhone and Galaxy devices. First look assessments will also sometimes take a look at new, highly-touted features -- such as the Face ID that debuted in the iPhone X -- and at whether the product does the things it advertises.

"Because consumers will be very interested in a phone like that or a phone like the flagship Samsung phones or things of that nature, we will try to get results out the first day," Recevich said. "So we'll do this first look type of thing, with really big, important products that people are buzzing about. Because consumers want to know, with Consumer Reports- everyone else is talking about it, why aren't you talking about it? So that's why we did that for the HomePod -- not because it was Apple, but because these are the types of products that people are [interested in] -- when the Apple Watch first came out, we did that. But we do it for other products as well.

"Because the HomePod was sort of a hot product and we knew a lot of people were interested in what it was like and how it was. We did what was called a first look," Recevich continued. "In this case it wasn't a rented sample because we didn't have that available, but the first day we got them in, we did some testing, so that we could give an impression about it."

The first looks are usually presented as "early test results." Consumer Reports has published day-of-release first looks of the iPhone X, the iPhone 8 and iPhone 8 Plus, the 2016 Apple Watch, and the Samsung Galaxy S8. CR does not appear to have published first looks upon the releases of the Sonos One or Google Home Max.

"Typically on a first day, those first looks that we do, we'll get the product in, we'll rush it into the lab." said Rerecich. "We'll do the testing that we'll accomplish on the first day, and then we're gonna get together with editorial and talk about what we saw, what the results were, we'll ask questions, and we'll refine what that first day story is about."

"We did a listening test. We actually ran through our entire audio quality evaluation that day, and that was the basis of the first look, along with how it is to work with, and things like that." Amplified Rerecich. "It was just comparing the one column, which I believe is all we did on the first day. whereas the ranges that came out, combined everything else. For a speaker like that, the sound quality is the highest-weighted, it's the most important, and the other ones affected it. So, the ranking that came out in the ratings could have been slightly different from that first day, because that first day was just one component of the test.

Rerecich added that speaker testing is not just about sound quality, but also about ease of use, features, and overall versatility. Only after all of those evaluations is the full report issued.
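CR does not publish its category weights, but conceptually the final rating is a weighted combination in which, as Rerecich describes, sound quality dominates for a speaker. A minimal sketch of how such a weighted score behaves, using entirely hypothetical weights and subscores of our own invention:

```python
# Hypothetical weights: CR does not publish its actual ones. They reflect
# Rerecich's point that sound quality is the highest-weighted component
# for a smart speaker.
WEIGHTS = {"sound_quality": 0.6, "ease_of_use": 0.2, "features": 0.1, "versatility": 0.1}

def overall_score(subscores: dict) -> float:
    """Weighted average of category subscores, each on a 0-100 scale."""
    return sum(WEIGHTS[cat] * subscores[cat] for cat in WEIGHTS)

# A first look judged on sound quality alone can rank differently from the
# final rating once the lower-weighted categories are folded in.
speaker_a = {"sound_quality": 88, "ease_of_use": 65, "features": 60, "versatility": 62}
speaker_b = {"sound_quality": 85, "ease_of_use": 80, "features": 78, "versatility": 75}
print(overall_score(speaker_a))  # 78.0 -- better sound, lower overall
print(overall_score(speaker_b))  # 82.3 -- edged out on sound, wins overall
```

This is why, in Rerecich's telling, the first-day impression built only on the audio column could diverge slightly from the ranking in the published ratings.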

Jerry Beilinson, Consumer Reports' content development team leader and electronics editor for the Consumer Reports magazine, argued that the assessment, written by CR's Allen St. John, wasn't actually all that negative.

"Very good sound, far better than most other speakers, one of the best. The argument seems to be that [we said] it wasn't the best speaker that was ever released. It's recommended, it got a very good rating, and was with very similar rating numbers, edged out by two similarly high-end competitors."

A question of aesthetics

The Consumer Reports officials did point out one thing: their testing process does not formally take into account the look and aesthetics of a product. And design, it should go without saying, has historically been very important to Apple.

"We don't care much about aesthetics," Mark Connelly, director of testing said during our tour of the facilities.

"We'll describe them sometimes," said Rerecich. "Maybe for a washing machine that's not so important, but for something like a speaker that you might have in your living room, you might care what it looks like. We don't rate it and say this one looks better than that one,' unless it's unusual in some way."

"Just to be clear, those are not part of the numerical rating, they don't get calculated as part of it," said Jerry Beilinson. "What Consumer Reports can do for you is give you comparisons you can't make yourself and judgments you can't easily make yourself. So when you're in the store, what is the performance in terms of speed of this laptop? Well I'm not going to look at it or read the spec but and necessarily know what is the battery life if a consumer is looking at a product they can tell whether they like the way it looks, they don't really need ConsumerReports or another outside authority to tell them whether they think the blue one or red one is better-looking."

"That's not a way in which a consumer might be lacking in information they need to make the purchase," said Beilinson. "What we're really trying to do, primarily, is fill consumers in with information that they need to be able to make good choices."

One side or another

We appreciate that Consumer Reports opened its doors to discuss testing methodology, making the entire process a little more transparent. Judging by our commenters, AppleInsider readers, like car aficionados before them, have had a trust issue with the organization. Trust, once lost, is hard to regain.

In the case of "antennagate," the iPhone 4 signal controversy, the public-facing information was about the signal attenuation when the phone was held, with the rest of the story behind a paywall. A paywall is any publication's right, as bills do have to get paid, but the information that propagated wasn't the whole story -- the group did recommend the iPhone in that paid report.

With the 2016 MacBook Pro, Apple's and Consumer Reports' accounts of the matter still differ. CR has now clarified its stance to us, and Apple has said nothing beyond noting that the disabled cache is not a consumer setting -- so the whole story for users lies somewhere in between. We still feel that this should have been addressed with Apple before publication of the initial test results, though.

Regarding the more recent HomePod audio comparison, AppleInsider still believes that publishing a comparison the same day the product was released wasn't handled well. While a product can be analyzed in a vacuum fairly rapidly, we do not believe the kind of meaningful comparison Consumer Reports aims to make between complex products like the HomePod and the Google Home Max can be completed, from start to publication, in a matter of hours.

This all said, there are a few things we conclude from our visit to Consumer Reports. Having viewed their testing process and met with their team, we are confident that they do not harbor a purposeful anti-Apple agenda, nor is there any sort of conspiracy against Apple afoot behind the CR walls. Their complete testing and evaluation process is conducted with integrity and in good faith.

However, there may very well be something about CR's analytical, numbers-driven process that clashes with the design-heavy Apple ethos, and makes their conclusions about Apple products different from those of more traditional reviewers. Even so, this hasn't stopped them from recommending most of Apple's lineup.