ARM’s mission to improve gaming on mobile
The mobile game market is growing fast, riding the smartphone and tablet revolution to greater heights with every passing year. We’re seeing a resurgence of classic titles as ports and homages on Android. As computing power increases, the scope for high-end gaming improves. There are many barriers still to overcome, but mobile gaming is on the up.
According to Gartner, mobile games were worth over $9 billion in 2012 and $13 billion in 2013, and they're on track to hit $17 billion this year. Next year, worldwide revenue of $22 billion will eclipse the PC game market for the first time. Recent research from Flurry revealed that 525 million Android devices across the globe engage in mobile gaming every month.
With its technology in 95% of today's smartphones, it's clear that ARM has a big part to play. Luckily the company has a storied history in the gaming world that dates back to the BBC Micro. ARM processors powered the ill-fated 3DO console, the first mobile phone to feature a game (Snake on the Nokia 6110), and a string of handhelds including the GBA, DS, 3DS, and PlayStation Vita.
We spoke to Ed Plowman, ARM’s Director of Solution Architecture, about what challenges lie ahead for smartphone gaming and how ARM has been working to meet them.
Limited power and bandwidth
“The inconvenient truth is that we can probably build more on silicon than we can turn on in one go,” Ed explains. “Working out how we can make the techniques employed on high-end gaming platforms accessible on mobile hardware without blowing power budgets is a real challenge.”
Delivering a good experience for gamers on mobile devices requires careful use of battery life. We want good performance, but not if it sucks down too much juice and causes our smartphones to overheat.
“Traditionally the way games have been developed for the higher end of the market, there’s a lot of brute force techniques, so while performance is an issue even to someone working on a high-end PC, it has a different meaning for us,” says Ed.
If we look back five years or so, we find a typical gaming PC would pull down 650 watts, compared to 100 watts for the PS3 and Xbox 360 when they first came out. The consoles were reduced to around 80 watts after revision cycles and components were shrunk down. That compares to 7.5 watts for a typical high-end tablet and 4 or 5 watts for a high end handset. It’s quite a discrepancy.
“Mobile GPUs caught up with consoles in terms of raw compute capability last year,” he explains. “We hit the point where we’re at parity with the PS3 and Xbox 360, but we’re still three to four years behind on the evolutionary cycle when it comes to bandwidth.”
This isn’t just about power consumption. Heat dissipation is also a major headache for mobile hardware. It’s a problem that anyone who plays graphically intensive games on their Android smartphone will be well aware of.
What has ARM been doing?
A number of people like Ed work at ARM on improving the gaming experience on smartphones, and they’ve found some very clever ways to do it.
One area they’ve worked on is texture compression, where the industry has long lacked a single widely adopted standard.
“When we started examining where content was going and where the pressures were inside the system for mobile, we found texture operations were particularly expensive, so we developed the ASTC (Adaptive Scalable Texture Compression) system to put us in a position where we weren’t just solving today’s problems,” Ed explains. “DirectX Texture Compression does a good job of compressing color, but what about normals, what about displacement maps, what about non-correlated channel maps, things like alphas and luminescence? How do you start compressing those things?”
They created an encoding system that can encapsulate all of this data and gives you a very wide range of bit-rate options, everything from 16-bit down to 1-bit in all the formats you could ever want.
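To get a feel for how a block-based scheme like ASTC trades quality against size: every ASTC block occupies a fixed 128 bits, so the bit rate per texel is determined entirely by how many texels a block covers. A minimal sketch (the 2D footprint list follows the Khronos ASTC specification; the helper name is our own):

```python
# Every ASTC block is 128 bits; the bits-per-texel rate is set
# purely by the block footprint (2D footprints per the Khronos
# ASTC specification).
ASTC_BLOCK_BITS = 128

FOOTPRINTS_2D = [
    (4, 4), (5, 4), (5, 5), (6, 5), (6, 6), (8, 5), (8, 6),
    (8, 8), (10, 5), (10, 6), (10, 8), (10, 10), (12, 10), (12, 12),
]

def bits_per_texel(width, height):
    """Bit rate for a given ASTC block footprint."""
    return ASTC_BLOCK_BITS / (width * height)

for w, h in FOOTPRINTS_2D:
    print(f"{w}x{h}: {bits_per_texel(w, h):.2f} bpp")
```

Larger footprints spread the same 128 bits over more texels, which is how the format scales smoothly from high-quality to very aggressive compression.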
Getting more from tile-based rendering
The Mali GPU is a tile-based rendering system. In fact, 90% of the mobile market is owned by three different GPU vendors, all using tile-based rendering systems of one form or another. Naturally, ARM has been looking to maximize the potential of these systems, and to that end it introduced extensions to the OpenGL ES specification.
“Rather than looking to resolve information to a frame buffer all the time and constantly push it out and read it back in again,” explains Ed, “we allow the developer to dynamically allocate on a shader-by-shader basis the storage that represents the pixel in the tile buffer so they can divide it up whichever way they want.”
This enables multi-pass rendering via direct control of the storage underneath the pixel in tile memory.
“What you can do is overlay it so in each pass of each draw call, you can store auxiliary information as well as the pixel’s current depth and stencil, and you can divide the space it normally uses to represent that information whatever way you want,” Ed enthuses. “Then in subsequent passes, without stepping off the tile, you haven’t written any of this information back to memory; you’ve saved yourself those multiple render-target write-backs and reads, you’ve saved all of that effort going out to memory, and you can do the final passes while it’s still on the tile. It saves an awful lot of energy and an awful lot of bandwidth, getting you more performance for less power.”
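The bandwidth saving Ed describes is easy to estimate. The numbers below are hypothetical (a 1080p framebuffer with a four-target RGBA8 G-buffer, a common deferred-shading layout), but they illustrate why keeping intermediate results in tile memory matters:

```python
# Back-of-envelope estimate (hypothetical numbers) of the DRAM
# traffic saved by keeping intermediate render targets on-tile.
WIDTH, HEIGHT = 1920, 1080   # assumed 1080p framebuffer
GBUFFER_TARGETS = 4          # assumed deferred G-buffer layout
BYTES_PER_PIXEL = 4          # RGBA8 per target

pixels = WIDTH * HEIGHT

# Classic multi-pass deferred rendering: write each G-buffer
# target out to DRAM, then read every target back in for the
# lighting pass (write + read = factor of 2).
off_tile = pixels * GBUFFER_TARGETS * BYTES_PER_PIXEL * 2

# On-tile: intermediates never leave tile memory; only the
# final composited color is written out once.
on_tile = pixels * BYTES_PER_PIXEL

mb = 1024 * 1024
print(f"off-tile traffic per frame: {off_tile / mb:.1f} MB")
print(f"on-tile traffic per frame:  {on_tile / mb:.1f} MB")
```

Under these assumptions the on-tile approach moves an eighth of the data per frame, and at 60 fps that difference compounds into a very large saving in both bandwidth and energy.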
Trying to balance power demands with performance and maximize the compute capability that you have at any one point in time is something that ARM is also doing at a system level with big.LITTLE technology.
Not just for the high-end
“Always the goal is to squeeze more out of what’s there, to cater for multiple SKUs, not just the high-end, bring those facilities down as quickly as possible so you can get them into the market,” says Ed.
This philosophy seems to match Google’s ambitions for Android and the commitment to improving the experience on lesser hardware, instead of constantly focusing on the flagships.
“The aim is unification across the product, offering a solution to every part of the market with as much commonality as possible, which makes it easier for an OEM to move up the food chain, leveraging ARM IP,” Ed explains. “Commonality amongst everything makes the transition easier.”
It’s an approach that clearly resonates with the growing list of ARM partners.
Is cloud gaming a viable alternative?
As the demands on mobile hardware grow, we can’t help wondering if cloud gaming is a viable alternative to squeezing more out of existing tech. Wi-Fi and cellular data networks are getting faster and faster, so why not run games on remote hardware and stream the action? It’s an area that Ed has already investigated, and he explains the pitfalls.
“The experience would be variable because of the way networks are built. Big carriers’ networks are built for distributed time-division multiplex loads, not for everyone going on at once and demanding consistently high bandwidth,” he says, and it’s not the only problem.
“Because of the way common video codecs work, synthetic images don’t compress well. Codecs are designed to cope with natural images, which have more random components; information grouped into macroblocks with little variation leads to blocking artifacts,” he explains, “…image quality is never stable, and codecs hate fast-moving objects.”
This is all before you consider the business issues around scalability, or the extra power drawn and heat generated compared to a normal data center because of the need for GPUs.
Where Ed does see potential is running the game engine in the cloud, but having the device do the rendering. It’s an area sparking plenty of research right now.
An exciting future
Talk turns to what the next big thing in gaming might be and there’s plenty on the mobile horizon. Virtual Reality is set to finally take off with a raft of consumer products approaching release. The direction it will take and the potential impact on mobile is tough to predict, but there’s some crossover into visual computing.
“The ability to reuse GPU performance for other things, like processing of video input, image data, anti-shake technology, 3D mapping using camera input, good head tracking, and the ability to overlay 3D mapping on the real world…” Ed’s clearly excited by the possibilities.
He also mentions panoptic cameras and the rise of 4K as inevitable trends for mobile, and expresses surprise that a low-cost Android console has yet to take off.
We round off the interview talking about Khronos, a not-for-profit group dedicated to “creating open standards for the authoring and acceleration of parallel computing, graphics, dynamic media, computer vision and sensor processing on a wide variety of platforms and devices.”
Ed was a founding member and has served as treasurer for the past seven years. When they started the conversation about better graphics, the Nokia 6110 was state of the art. It’s come a long way since then. Pragmatism convinced a number of mobile OS developers of the need for common standards. Symbian was an early leader, and by the time Google brought Android to market, support for OpenGL ES was a natural choice.
Open standards help to level the playing field and push everyone forward. They break down barriers to entry and encourage innovation and healthy competition. ARM has been very supportive of Khronos from the start, and of open standards in general; as Ed says, “ARM is all about standardization.”
The end result for us as consumers is greater choice at lower prices. Game on.