
Lighting, console-level graphics & ARM - 5 things developers need to know

Over one-third of the smartphones in the world use an ARM-based GPU, and every developer should know how best to use ARM's technology when making 3D games.
By
May 6, 2015

If you have ever seen a 1980s science fiction movie, or if you have ever played a 1980s computer game, then you will understand when I say that computer graphics have come a long way in the last few decades. At the dawn of the computer graphics age it was all about wireframes and simple texture mapping. Now we live in the era of photorealistic rendering made possible by shaders and advanced lighting techniques.

The challenge for 3D game makers, and for GPU designers, is to find ways to create the most realistic rendering of a scene while using the smallest amount of computing power. The reason is that 3D games, even those on Android devices, run at high frame rates, ranging from 25 frames per second (fps) right up to 60 fps. In other words, at 60 fps the GPU has less than 1/60 of a second (around 16.7 milliseconds) to turn a huge amount of graphics data into a realistic rendering of a scene.

The quicker the objects, shadows, lighting, and reflections can be rendered, the greater the fps. And high frame rates mean smooth gameplay. Quick render times also mean that game designers can create increasingly complex scenes, something which further adds to the realism.

1. ARM isn’t just a CPU designer

The vast majority of smartphones and tablets use processors with ARM designed CPU cores, but ARM doesn’t just design CPU cores, it also designs GPUs. In fact, over 50% of all Android tablets and over 35% of smartphones have ARM designed GPUs. Marketed under the brand name “Mali,” the GPU finds its way into almost every category of smartphone, including high-end devices. The Samsung Galaxy S6 uses an Exynos 7420 SoC with eight ARM designed CPU cores and the ARM Mali-T760MP8 GPU.


For game designers the popularity of the Mali GPU means it is essential that games are tested and optimized for it. As you would expect, ARM provides a comprehensive set of developer tools. These include the Mali Graphics Debugger, which lets developers trace OpenGL ES and OpenCL API calls in their application and understand, frame by frame, their effect, helping to identify possible issues; the OpenGL ES Emulator, which helps with the development and testing of next-generation OpenGL ES 3.1 applications via PC emulation; and the Mali Offline Compiler, a command line tool that translates vertex, fragment and compute shaders written in the OpenGL ES Shading Language (ESSL) into binary shaders for execution on Mali GPUs.
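To make that last tool more concrete, here is a minimal ESSL fragment shader of the kind an offline shader compiler works on. The file name, uniforms and varyings are purely illustrative; the point is simply to show the sort of arithmetic and texture work such a tool counts and translates.

```glsl
// simple.frag -- an illustrative ESSL (OpenGL ES 2.0) fragment shader.
precision mediump float;

uniform sampler2D u_diffuseMap;   // base colour texture
uniform vec3 u_lightColor;        // light colour and intensity

varying vec2 v_texCoord;          // interpolated texture coordinate
varying float v_diffuseFactor;    // per-vertex N.L computed in the vertex shader

void main()
{
    // One texture instruction plus a handful of arithmetic instructions --
    // the kind of work an offline compiler turns into a Mali binary shader.
    vec4 base = texture2D(u_diffuseMap, v_texCoord);
    gl_FragColor = vec4(base.rgb * u_lightColor * v_diffuseFactor, base.a);
}
```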

If you want to see what is possible with ARM’s GPU specific tools then I recommend reading Profiling Epic Citadel via ARM DS-5 Development Studio, which shows how these tools can be used for performance analysis and optimization.

2. ARM will soon release an Unreal Engine 4 plugin for its Mali Offline Compiler

During GDC ARM demonstrated an upcoming Unreal Engine 4 plugin for its Mali Offline Compiler. It will allow you to analyze materials and get advanced mobile statistics while previewing the number of arithmetic, load & store and texture instructions in your code. Here is a demo of the new plugin:

This type of tool is important because it gives game makers what they need to port games from the console/PC space to mobile. Typically content on the Xbox/PS3 is rendered at 720p, but the Google Nexus 10 displays games at 2.5K. The challenge for game makers is to maintain a high level of gaming experience while optimizing for the power budget of a mobile device.

3. ARM is developing new GPU techniques

The engineers at ARM do more than design GPUs; they also help create and develop some of the latest 3D graphics techniques. The company recently demonstrated a new rendering technique for creating dynamic soft shadows based on a local cubemap. The new demo is called Ice Cave and it is worth watching before reading further.

If you aren’t familiar with cubemaps, cube mapping is a technique that has been supported by GPUs since 1999. It allows 3D designers to simulate the large surrounding area that encompasses an object without straining the GPU.

If you want to place a silver candlestick in the middle of a complex room, you can create all the objects that make up the room (including the walls, flooring, furniture, light sources, etc.) plus the candlestick, and then fully render the scene. But for gaming that is slow, certainly too slow for 60 fps. So if you can offload some of that rendering so that it happens during the game design phase, that will help improve speed. And that is what a cubemap does. It is a pre-rendered scene of the six surfaces that make up a room (i.e. a cube): the four walls, the ceiling and the floor. This render can then be mapped onto shiny surfaces to give a good approximation of the reflections that can be seen on the surface of the candlestick.
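In shader terms the lookup is straightforward. The sketch below (an ESSL fragment shader with illustrative uniform and varying names, not taken from any particular engine) reflects the view direction about the surface normal and uses the result to sample the pre-rendered cubemap:

```glsl
// Fragment shader sketch: approximating reflections on a shiny surface
// by sampling a pre-rendered cubemap of the surroundings.
precision mediump float;

uniform samplerCube u_envCubemap;  // the six pre-rendered faces of the room
uniform vec3 u_cameraPosWorld;     // camera position in world space

varying vec3 v_posWorld;           // fragment position in world space
varying vec3 v_normalWorld;        // surface normal in world space

void main()
{
    // View direction from the camera to the surface point.
    vec3 viewDir = normalize(v_posWorld - u_cameraPosWorld);

    // Reflect it about the surface normal and use the result to
    // look up the pre-rendered surroundings.
    vec3 reflDir = reflect(viewDir, normalize(v_normalWorld));
    gl_FragColor = textureCube(u_envCubemap, reflDir);
}
```

Because the environment was rendered offline, the only runtime cost here is the reflect() and a single cubemap texture fetch.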


Since the pre-rendered cubemap includes the view in every possible direction, it doesn’t matter where the camera is in the scene: the GPU can still simulate the reflections, and doing so is much quicker than rendering the whole scene. The approach has seen several major developments over recent years, with significant refinements in 2004 and 2010.

The Ice Demo shows off a new local cubemap technique. Sylwester Bala and Roberto Lopez Mendez, from ARM, developed the technique when they realized that by adding an alpha channel to the cubemap it could be used to generate shadows. Basically, the alpha channel (the level of transparency) represents how much light can enter the room. If you want to read the full technical explanation of how this new technique works then check out this blog: Dynamic Soft Shadows Based on Local Cubemap. Below is a short walk-through of the Ice Cave demo by Sylwester:
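In shader terms the idea looks roughly like the sketch below. This is a condensed illustration based on the blog post's description, not ARM's actual code: the uniform names and the simple bounding-box "local correction" are assumptions made for the example.

```glsl
// Sketch: soft-shadow lookup from a local cubemap whose alpha channel
// stores how much light enters the room from each direction.
precision mediump float;

uniform samplerCube u_shadowCubemap;   // pre-rendered environment, occlusion in alpha
uniform vec3 u_lightPosWorld;          // light position in world space
uniform vec3 u_cubemapPosWorld;        // position the cubemap was captured from
uniform vec3 u_bboxMin;                // bounding box of the room
uniform vec3 u_bboxMax;

varying vec3 v_posWorld;               // fragment position in world space

// "Local correction": intersect the ray from the fragment towards the light
// with the room's bounding box, then build the lookup vector from the
// cubemap's capture position to that intersection point.
vec3 localCorrect(vec3 dir, vec3 posWorld)
{
    vec3 invDir = 1.0 / dir;
    vec3 t1 = (u_bboxMax - posWorld) * invDir;
    vec3 t2 = (u_bboxMin - posWorld) * invDir;
    vec3 tFar = max(t1, t2);
    float t = min(min(tFar.x, tFar.y), tFar.z);
    return (posWorld + dir * t) - u_cubemapPosWorld;
}

void main()
{
    vec3 toLight = normalize(u_lightPosWorld - v_posWorld);
    vec3 lookup = localCorrect(toLight, v_posWorld);

    // Alpha encodes how much light gets in from that direction:
    // 1.0 for an opening such as a window, 0.0 for a solid wall.
    float lightVisibility = textureCube(u_shadowCubemap, lookup).a;

    // A plain grey surface, darkened where geometry blocks the light.
    gl_FragColor = vec4(vec3(0.8) * lightVisibility, 1.0);
}
```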

It is also possible to get an even better experience by combining the cubemap shadows with the traditional shadow map technique, as this demo shows:

4. Geomerics is an ARM company

Lighting is an important part of any visual medium including photography, videography and 3D gaming. Film directors and game designers use light to set the mood, intensity and atmosphere of a scene. At one end of the lighting scale is Utopian science fiction lighting, where everything is bright, clean and sterile. At the other end of the spectrum (sorry, bad pun) is the dark world of horror or suspense. The latter tends to use low lighting and lots of shadows, punctuated by pools of light to grab your attention and draw you in.

There are many different types of light source available to game designers including directional, ambient, spotlight and point light. Directional light is far away like sunlight, and as you know sunlight casts shadows; ambient lighting casts soft rays equally to every part of a scene without any specific direction, as a result it doesn’t cast any shadows; spotlights emit from a single source in a cone shape, like on the stage in a theater; and point lights are your basic real-world light sources like light bulbs or candles – the key thing about point lights is that they emit in all directions.
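In a fragment shader these light types mostly differ in how the light direction and attenuation are computed. The sketch below (illustrative names; spotlights omitted for brevity) combines an ambient term, a directional light and a point light:

```glsl
// Sketch: diffuse lighting from ambient, directional and point lights.
precision mediump float;

uniform vec3 u_dirLightDir;      // direction the "sun" shines in (world space)
uniform vec3 u_dirLightColor;
uniform vec3 u_ambientColor;     // ambient: same everywhere, casts no shadows
uniform vec3 u_pointLightPos;    // e.g. a candle or a light bulb
uniform vec3 u_pointLightColor;

varying vec3 v_posWorld;
varying vec3 v_normalWorld;

void main()
{
    vec3 n = normalize(v_normalWorld);

    // Directional light: parallel rays, intensity depends only on the angle.
    float dirDiffuse = max(dot(n, -normalize(u_dirLightDir)), 0.0);

    // Point light: emits in all directions and falls off with distance.
    vec3 toPoint = u_pointLightPos - v_posWorld;
    float dist = length(toPoint);
    float pointDiffuse = max(dot(n, toPoint / dist), 0.0) / (1.0 + dist * dist);

    vec3 lighting = u_ambientColor
                  + u_dirLightColor * dirDiffuse
                  + u_pointLightColor * pointDiffuse;

    gl_FragColor = vec4(lighting, 1.0);
}
```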

Simulating all this lighting in 3D games can be GPU intensive. But like cubemaps, there is a way to shortcut the process and produce a scene that is good enough to fool the human eye. There are several different ways to create realistic lighting without all the hard work. One way is to use a lightmap bake. Created offline, like a cubemap, it gives the illusion that light is being cast onto an object, but the baked light won’t have any effect on moving objects.
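A baked lightmap is typically just a second texture sampled with its own set of UVs. A minimal ESSL sketch (names illustrative) looks like this:

```glsl
// Sketch: combining a pre-baked lightmap with the albedo texture.
precision mediump float;

uniform sampler2D u_albedoMap;
uniform sampler2D u_lightmap;    // baked offline, like a cubemap

varying vec2 v_uvAlbedo;
varying vec2 v_uvLightmap;       // unique, non-overlapping UVs used for the bake

void main()
{
    vec3 albedo = texture2D(u_albedoMap, v_uvAlbedo).rgb;
    vec3 bakedLight = texture2D(u_lightmap, v_uvLightmap).rgb;

    // Static geometry gets the baked lighting almost for free at runtime;
    // moving objects are unaffected by it, which is the limitation noted above.
    gl_FragColor = vec4(albedo * bakedLight, 1.0);
}
```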

Another technique is “bounce lighting”, where game designers add light sources at strategic positions in order to simulate global illumination. In other words, a new light source is added at the point where light would be reflected; however, it can be hard to achieve physical correctness using this method.


A third option is to use Enlighten from Geomerics. Enlighten takes the pre-baked lightmap approach one step further by using a unique and highly optimized runtime library that generates lightmaps in real time. The lightmap is created on the CPU during gameplay and is then added to the rest of the direct lighting on the GPU.

This means the lightmap technique can now be applied to moving objects. When combined with offline lightmaps, only the lights and materials that need to be updated at runtime will use any CPU time.

The result is a technique that doesn’t only apply to mobile games, but one that can scale up to PC and consoles.

The subway demo below shows Enlighten in action. Note how, during the “dynamic translucency” part of the demo, some walls are destroyed, allowing light to pass where it was previously partially blocked, yet the indirect lighting remains consistent. This all happens in real time and is not something pre-rendered just to create the demo.

5. Enlighten 3 includes a new lighting editor

To achieve such great lighting, Geomerics has released a new lighting editor called Forge. It has been specifically developed for the needs of Android game artists, and provides an immediate “out of the box” experience. It is also an important tool for “integration engineers,” as Forge serves as a model example and practical reference for integrating Enlighten’s key features into any in-house engine and editor.

One of the really useful features of Forge is that it provides the ability to import and export the lighting configurations you have set up for your scenes. This is particularly useful for defining certain lighting conditions or environments and then simply sharing them (via export) across your other levels/scenes.