While boasting an "Octa Core" Application Processor option and an extremely high resolution display, Samsung's new Galaxy Note 4 falls flat in running GPU intensive apps and games--particularly in comparison to Apple's iPhone 6 Plus.
The first benchmarks showing off the actual performance users will get from the new Note 4 highlight that Samsung appears to be making the wrong engineering choices, and that the conventional wisdom about the advantages Samsung gains from designing its own chips and operating its own fabs is also wrong.
Bad Engineering Choices
Samsung has pushed screen technology ahead of its own processor capabilities, resulting in extremely poor performance in high definition.
In an apparent effort to win the "spec war," Samsung began aggressively cranking up device resolutions on its most expensive premium flagships after Steve Jobs demonstrated iPhone 4's Retina Display back in 2010. Prior to that, Samsung was internally focused on smaller devices, not larger, higher resolution displays.
In contrast, Apple has only changed its flagship iPhone resolutions every two years since, making iPhone 5 taller and the new iPhone 6 & 6 Plus both larger and more pixel dense in the move to "Retina HD" displays.
The most obvious result has been that iOS app developers have had a much easier time managing the changes in resolution, so they can focus on new apps and features rather than testing across a broad range of configurations. That's apparent in the fact that nearly all new apps and games appear for iOS first, and only arrive on Android later after they've proven to be broadly popular in the App Store.
However, there's also another problem: by pushing resolution numbers so fast (and without any regard for whether having more pixels actually makes a discernible, qualitative difference), Samsung has pushed screen technology ahead of its own processor capabilities, resulting in extremely poor performance in high definition.
Earlier, AppleInsider noted that Apple's own leap to a Retina HD 1080p screen on iPhone 6 Plus resulted in graphics that were in some cases slower at their native resolution than last year's iPhone 5s: rendering a challenging OpenGL ES 3.0 3D scene dropped frame rates from 24.4 to 19 fps.
Samsung's own even-higher resolution Note 4 and its equally high resolution Galaxy S5 flagship both turn in benchmarks far lower than Apple's new 6 Plus--and less than half that of last year's iPhone 5s. In terms of fps, the latest benchmarks show that Samsung's new Exynos-powered Note 4 drops down to 10.5 fps--almost half that of iPhone 6 Plus--in the same test.
Looking at the fairly decent, low level theoretical scores of the GPUs Samsung uses (combined with much higher clock rates and more RAM), it appears that the company's devotion to extremely high resolution numbers is a spec list checkmark (rather than a real feature that benefits users) and is a primary contributing reason for poor real life scores in rendering 3D OpenGL scenes.
In other words, the chips Samsung is choosing to use could theoretically match Apple's latest iPhones if they were not also driving tons of additional pixels that contribute little to no benefit to users. Think of it as a reasonably powerful engine installed into a monster truck with massive wheels it can barely turn.
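The monster-truck analogy can be put in rough numbers. The sketch below uses the panel resolutions and the 19 fps / 10.5 fps GFXBench figures cited above; the assumption that frame rate scales inversely with pixel count at a fixed fill-rate budget is a deliberate simplification for illustration, not a real GPU model.

```python
# Illustrative arithmetic only: assume frame rate scales inversely with
# pixel count when the GPU's fill-rate budget is held fixed.

NOTE4_RES = (2560, 1440)        # Galaxy Note 4 QHD panel
IPHONE6P_RES = (1920, 1080)     # iPhone 6 Plus 1080p panel

note4_pixels = NOTE4_RES[0] * NOTE4_RES[1]         # 3,686,400 pixels
iphone_pixels = IPHONE6P_RES[0] * IPHONE6P_RES[1]  # 2,073,600 pixels

ratio = note4_pixels / iphone_pixels  # ~1.78x more pixels to draw per frame

# If the Note 4 had exactly the iPhone 6 Plus's measured 19 fps of rendering
# throughput but had to spread it across 78% more pixels, we'd predict:
predicted_fps = 19.0 / ratio

print(f"{ratio:.2f}x pixels -> ~{predicted_fps:.1f} fps predicted (measured: 10.5)")
```

The naive prediction lands within a fraction of a frame of the measured 10.5 fps score, which is consistent with the argument that the extra pixels, not the GPU itself, are where the performance goes.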
Samsung itself has been marketing Note 4 to less sophisticated buyers as a "for the colorful" device along the lines of Apple's iPhone 5c ads, in a series of "Love Notes" spots.
However, even for a consumer device, Samsung appears to have picked the wrong screen resolution for Note 4, given the horsepower of its own Exynos 5 Octa Core Application Processor, or even Qualcomm's Snapdragon 805, which Samsung will use in most international markets.
Of course, at the same time there are also a variety of other Android devices with the same 1080p resolution as iPhone 6 Plus, and they don't score as well either. That's a fact we earlier blamed on Google's Android, particularly its shoddy implementation of OpenGL that squanders the capabilities of faster chips with more cores and more available RAM.
Apple's iPhone 6 A8 GPU destroys Galaxy S5, HTC One M8, Moto X & Nexus 5 /w fewer, slower cores & much less RAM $AAPL pic.twitter.com/SMgPqoYYrC
-- Daniel Eran Dilger (@DanielEran) October 1, 2014
Samsung falling behind in Application Processors
Samsung is still scrambling to bring its Note 4 to market in the wake of iPhone 6 Plus, but initial GPU details of the Note 4 are already available on Kishonti Informatics' GFXBench website.
What they show is not just an underpowered leap to an even-higher resolution screen; they also show that Samsung continues to license basic ARM Mali graphics for the Exynos Application Processors it puts into its own premium devices, while expecting consumers to pay the same price as for Apple's iPhones for a less powerful, less responsive device.
At the same time, Samsung is marketing its Exynos 5 chip as "Octa Core," as if the number of cores were a meaningful frame of reference in terms of power or capacity. The CPU (also a stock ARM design) is only intended to use four cores at a time: there are two sets of cores, four that run at top power and four baby cores that coast along very efficiently when the device is in standby. Running all eight doesn't even make sense.
Specifically, the Note 4's Exynos 5433 is an ARM "big.LITTLE" design that pairs together sets of four A15 and four A7 cores, each pair designed to work at different clock speeds. Calling Samsung's Exynos "8 core" is like calling a truck "4 wheel drive" when it can effectively only power two wheels at once.
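The cluster behavior described above can be sketched in a few lines. This is a simplified model of the "four cores at a time" scheme the article describes, where work runs on either the A15 cluster or the A7 cluster; the load threshold is invented purely for illustration and is not a real Exynos scheduler parameter.

```python
# Simplified sketch of ARM big.LITTLE cluster switching as described above:
# the SoC runs EITHER its four "big" A15 cores or its four "LITTLE" A7
# cores, never all eight at once. The 40% threshold is hypothetical.

def active_cluster(load_percent):
    """Pick which four-core cluster handles the current workload."""
    if load_percent > 40:            # heavy work: wake the A15 cluster
        return "big (4x Cortex-A15)"
    return "LITTLE (4x Cortex-A7)"   # light or standby work stays on the A7s

for load in (5, 35, 80):
    print(f"load {load:3d}% -> {active_cluster(load)}")
```

Under this scheme, "eight cores" on the spec sheet never translates into eight cores of simultaneous compute, which is the point of the truck analogy.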
In contrast, both Apple and Qualcomm have purposely avoided ARM's stock big.LITTLE architecture in their own Cyclone A7/A8 or Krait Snapdragon core designs, both of which use fewer cores and more advanced core management to deliver better performance at lower power consumption than the stock ARM technology that Samsung is using.
The Y of Exynos
However, the reason Samsung develops its own Exynos Application Processors is so it can eventually replace Qualcomm; currently that's not possible because Qualcomm holds patents on CDMA, LTE and other advanced carrier technology.
Both Apple and Samsung use Qualcomm's baseband chips to handle wireless modem features while their own proprietary Ax or Exynos Application Processors run the rest of the phone or tablet.
At some point, Samsung (and likely Apple, too) will want to integrate its own baseband modem into its own Application Processors rather than paying Qualcomm for a separate chip, but that's currently not feasible. Samsung is experimenting with Qualcomm-free Exynos designs in limited markets, in products that either use Intel's LTE baseband chips or Samsung's own modem-integrated packages.
This fragment of the market (excluding most major markets in North America, Japan and Korea) is what Samsung addresses with its Exynos-powered Galaxy devices, with the rest of the world getting the same brands of Galaxy devices powered by Qualcomm Snapdragon chips. This creates a smoke and mirrors marketing charade, letting Samsung trumpet features of its "Octa core" Galaxy phones while actually shipping something completely different.
Read Samsung's marketing doublespeak press release that Shara Tibken loosely edited for CNET and you get a sense of how well this sort of muddled, non-specific mess of billowing specifications works to confuse rank and file tech journalists incapable of critical review.
On the other hand, when Samsung shipped defective Exynos Galaxy S4 devices last summer with what AnandTech described as "a broken implementation of the CCI-400 coherent bus interface" with "implications [that] are serious from a power consumption (and performance) standpoint," the news was rarely reported, in part because "neither ARM nor Samsung LSI will talk about the bug publicly, and Samsung didn't fess up to the problem at first either - leaving end users to discover it on their own."
Last summer, iSuppli reported that Samsung's Galaxy S4 equipped with its own Exynos 5 Octa was substantially more expensive to build than the North American version of the same phone shipping with a Qualcomm Snapdragon (and both were estimated to be more expensive than iPhone 5).
Samsung's problems with defective Exynos designs (despite using off-the-shelf ARM technology), paired with chips that are not only more expensive to build but also lack the economies of scale that Apple's A7 and A8 have enjoyed across tens of millions of iPhones and iPads, make it easy to understand not only why Apple is more profitable, but also why its Application Processor technology is evolving faster than Samsung's.
Apple gets Ax series for effort
Samsung's design choices to use ARM's inferior Mali GPU and ARM's big.LITTLE CPU architecture are informative because Samsung isn't struggling (like HTC) to gain access to chip fabs or silicon design expertise. Thanks in part to its decade of partnerships with Apple to develop chips for high volume iPods and iPhones, Samsung is now one of the top chip fabs in existence.
However, having the ability to build the best chips doesn't mean Samsung has the desire to. That's no doubt a contributing reason why Apple began building its own in-house chip design team around five years ago. The "A4" used in iPhone 4 and the original iPad was the first major delivery.
Since then, Apple has rapidly outpaced the rest of the mobile chip design world. Using its volume sales of iOS devices to drive investment in better and better chips leveraging economies of scale, Apple has managed to deploy the first 64-bit mobile processors in a volume product, which also happened to be the highest volume product of last year.
While Samsung ships samples of Exynos devices in some markets, the majority of its smartphone and tablet volumes pay for the development of Qualcomm Snapdragon chips. Yet even Qualcomm insiders noted last year that in its move to 64 bits, "Apple kicked everybody in the balls with this. It's being downplayed, but it set off panic in the industry."
Other chip makers (including many who were sitting on advanced technology) have been left behind in the mobile space because they partnered with hardware makers who couldn't sell their chips. Every generation of Nvidia's Tegra chips, for example, has been installed in loser products ranging from Microsoft's Zune HD to KIN to Surface and Nvidia's own Shield.
Intel and AMD have made very little progress in courting business from mobile device makers, and TI's OMAP processor family was abandoned when the company pulled out of the consumer mobile industry (after powering a series of low volume flops including Amazon's Kindle Fire, the Nook, BlackBerry PlayBook and the Google-Samsung cobranded Galaxy Nexus).
Pedal to the Metal
On top of the economies of scale driving (and financing) the rapid advancement of Apple's Ax series of Application Processors, the company has also developed its own Metal API as a higher-performance alternative to the more general purpose, cross-platform OpenGL ES for graphics and OpenCL for general computing on a GPU.
Because Samsung doesn't standardize on a single Application Processor family (using both its own Exynos and Qualcomm's Snapdragon chips in the same models) or even a single GPU architecture (using a mix of ARM Mali, PowerVR, and Adreno GPUs across even its latest devices), it can't replicate Apple's Metal in a way that would benefit its own products.
Metal is already seeing adoption just days after iOS 8 became available to consumers; top App Store games have been ported to Metal before even getting to Android.
Using Metal, developers can achieve higher frame rates (and animate more details at any target frame rate) on the same hardware, allowing games on iPhone 6 Plus to further outpace competing devices in its category, widening the nearly 2x performance gap it already enjoys over Samsung's Galaxy Note 4 in generic OpenGL benchmarks.
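One way to see why a thinner API raises frame rates is through the per-frame time budget: the less time the driver burns per draw call, the more actual rendering fits in each frame. The per-call overhead figures below are hypothetical, chosen only to illustrate the arithmetic; they are not measured Metal or OpenGL ES numbers.

```python
# Back-of-envelope sketch: how per-draw-call API overhead limits the number
# of draw calls that fit into one frame. Overhead values are hypothetical.

FRAME_BUDGET_MS = 1000.0 / 60        # ~16.7 ms per frame at 60 fps

def max_draw_calls(overhead_ms_per_call, budget_ms=FRAME_BUDGET_MS):
    """Draw calls that fit in the frame budget at a given per-call cost."""
    return int(budget_ms / overhead_ms_per_call)

heavy_api_calls = max_draw_calls(0.10)  # hypothetical heavier driver overhead
thin_api_calls = max_draw_calls(0.01)   # hypothetical thinner-API overhead

print(f"Heavy-API overhead: ~{heavy_api_calls} draw calls per frame")
print(f"Thin-API overhead:  ~{thin_api_calls} draw calls per frame")
```

An order-of-magnitude drop in per-call overhead yields an order-of-magnitude jump in the draw calls a game can issue per frame, which is the kind of headroom Metal is designed to reclaim from the driver.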