## The Rendering Pipeline

OpenGL ES keeps track of these three matrices. Each time we set one of the matrices, it will remember it until we change that matrix again. In OpenGL ES speak, this is called a state. OpenGL ES keeps track of more than just the matrix states, though; it also keeps track of whether we want it to alpha-blend triangles, whether we want lighting to be taken into account, which texture should be applied to our geometry, and so on. In fact, OpenGL ES is one huge state machine. We set its current state,...

## Playing Around Practically

The possibilities, even with this simple model, are endless. Let's extend our little CannonTest so we can actually shoot a cannonball. Here's what we want to do: as long as the user drags a finger over the screen, the cannon will follow it. That's how we'll specify the angle at which we'll shoot the ball. As soon as we receive a touch-up event, we'll fire a cannonball in the direction the cannon is pointing. The initial velocity of the cannonball will be a combination of the cannon's direction...
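The angle-to-velocity conversion described above boils down to basic trigonometry. Here's a minimal sketch; the class and method names are illustrative, not taken from the book's code:

```java
public class CannonMath {
    // Returns {vx, vy} for a given barrel angle (in degrees) and muzzle speed.
    public static float[] velocity(float angleDegrees, float speed) {
        float radians = (float) Math.toRadians(angleDegrees);
        return new float[] {
            (float) Math.cos(radians) * speed,  // horizontal component
            (float) Math.sin(radians) * speed   // vertical component
        };
    }
}
```

With the cannon pointing straight up (90 degrees), the ball gets a purely vertical initial velocity; at 0 degrees, a purely horizontal one.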

## Becoming a Registered Developer

Android makes it really easy to publish your application on the official Android Market. All you have to do is register as a developer for a one-time fee of US$25. Depending on the country you live in, this developer account will allow you to put up free and/or paid applications (see Chapter 1 for a list of countries you can sell applications from). Google is working hard to expand the number of countries you can sell applications to and from. To register an account, visit and follow the...

## Camera in 2D

We again set up the view frustum for our 2D world with

```java
gl.glOrthof(x, x + FRUSTUM_WIDTH, y, y + FRUSTUM_HEIGHT, 1, -1);
```

Figure 8-20 shows what that means.

Figure 8-20. Moving the frustum around

With this we simply specify the bottom-left corner of our view frustum in world space. This is already sufficient to implement a freely movable 2D camera. But we can...

## Streaming Music

Small sound effects fit into the limited heap memory an Android application gets from the operating system. Bigger audio files containing longer music pieces don't. For this reason we need to stream the music to the audio hardware, which means we only read in a small chunk at a time, enough to decode it to raw PCM data and throw that at the audio chip. That sounds intimidating. Luckily, there's the MediaPlayer class, which handles all that business for us. All we need to do is point it at...

## Z-buffer Precision and Z-fighting

It's always tempting to abuse the near and far clipping planes to show as much of our awesome scene as possible. We put a lot of effort into adding a ton of objects to our world, after all, and that effort should be visible. The only problem with this is that the z-buffer has a limited precision. On most Android devices, each depth value stored in the z-buffer has no more than 16 bits; that's 65,536 different depth values at most. So instead of setting the near clipping plane distance to 0.00001...
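The precision loss is easy to demonstrate numerically. The sketch below uses the standard perspective depth mapping (window depth 0 at the near plane, 1 at the far plane) quantized to 16 bits; with a near plane of 0.00001, objects at distances 50 and 51 land in the same depth bucket and will z-fight, while a near plane of 1 keeps them apart. The class name and exact mapping are illustrative assumptions, not the book's code:

```java
public class DepthPrecision {
    // Window-space depth in [0,1] for an eye-space distance z, given near
    // plane n and far plane f, quantized to a 16-bit z-buffer bucket.
    public static int quantizedDepth(double z, double n, double f) {
        double d = f * (z - n) / (z * (f - n)); // 0 at near plane, 1 at far plane
        return (int) (d * 65535.0);             // 16-bit bucket
    }
}
```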

## AndroidInput: The Great Coordinator

The Input implementation of our game framework ties together all the handlers we just developed. Any method calls will be delegated to the corresponding handler. The only interesting part of this implementation is where we choose which TouchHandler implementation to use, based on the Android version the device is running. Listing 5-11 shows you the implementation, called AndroidInput.

Listing 5-11. AndroidInput.java, Handling the Handlers with Style

```java
package ...

import android.content.Context;
import ...
```

## In Practice

We'll now go through all the necessary steps to get lighting to work with OpenGL ES. Along the way we'll create a few little helper classes that make working with light sources a bit easier. We'll put those in the com.badlogic.androidgames.framework.gl package. As with all OpenGL ES states, we first have to enable the functionality in question. We do that with this:

```java
gl.glEnable(GL10.GL_LIGHTING);
```

Once enabled, lighting will be applied to all objects we render. We'll have to specify the light sources and materials as well as...

## Screens and Transitions

We are now able to define our screens and transitions. We'll follow the same formula we used in Mr. Nom: we'll have a main screen with a logo; PLAY, HIGHSCORES, and HELP menu items; and a button to disable and enable sound. We'll have a game screen that will ask the player to get ready and handle the running, paused, game-over, and next-level states gracefully. The only new addition to what we used in Mr. Nom will be the next-level state of the screen, which will be triggered once Bob hits the castle....

## Specifying Triangles

Next up, we have to figure out how we can tell OpenGL ES about the triangles we want it to render. First let's define what a triangle is made of: a triangle is made of three points. Each point is called a vertex. A vertex has a position in 3D space. A position in 3D space is given as three floats, specifying the x-, y-, and z-coordinates. A vertex can have additional attributes, such as a color or texture coordinates (which we'll talk about later). These can be represented as floats as well....
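Because OpenGL ES wants vertex data in native memory rather than in a Java float array, the vertices typically end up in a direct FloatBuffer. A minimal sketch of that setup, with arbitrary placeholder coordinates:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class TriangleVertices {
    // Builds a direct FloatBuffer holding one triangle: three vertices with
    // x/y coordinates only. The coordinate values are arbitrary.
    public static FloatBuffer makeTriangle() {
        float[] vertices = {
            0f,   0f,    // vertex 1
            319f, 0f,    // vertex 2
            160f, 479f   // vertex 3
        };
        ByteBuffer byteBuffer = ByteBuffer.allocateDirect(vertices.length * 4);
        byteBuffer.order(ByteOrder.nativeOrder()); // match the native byte order
        FloatBuffer floatBuffer = byteBuffer.asFloatBuffer();
        floatBuffer.put(vertices);
        floatBuffer.flip(); // rewind so the data is read from position 0
        return floatBuffer;
    }
}
```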

## Texture Atlas: Because Sharing Is Caring

Up until this point we have only ever used a single texture in our programs. What if we want to render not only Bob, but other superheroes, enemies, explosions, or coins as well? We could have multiple textures, each holding the image of one object type. But OpenGL ES wouldn't like that much, since we'd need to switch textures for every object type we render (e.g., bind Bob's texture, render Bobs, bind the coin texture, render coins, etc.). We can do better by putting multiple images into a...
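An atlas entry is just a pixel rectangle converted to texture coordinates in the range [0,1]. A hedged sketch of that conversion; the names and the atlas size used in the example are assumptions:

```java
public class TextureRegionMath {
    // Converts a pixel rectangle inside an atlas image to texture coordinates.
    // Returns {u1, v1, u2, v2}: the top-left and bottom-right corners in [0,1].
    public static float[] region(float x, float y, float width, float height,
                                 float atlasWidth, float atlasHeight) {
        return new float[] {
            x / atlasWidth, y / atlasHeight,                      // top-left
            (x + width) / atlasWidth, (y + height) / atlasHeight  // bottom-right
        };
    }
}
```

For a 32x32 region at (32, 0) in a 64x64 atlas, this yields u/v coordinates (0.5, 0) to (1, 0.5).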

## Going Full Screen

Before we dive headfirst into drawing our first shapes with the Android APIs, let's fix something else. Up until this point, all our activities have shown their title bars. The notification bar was visible as well. We'd like to immerse our players a little bit more by getting rid of those. We can do that with two simple calls:

```java
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                     WindowManager.LayoutParams.FLAG_FULLSCREEN);
```

The first call gets rid of the activity's title bar. To make the activity go full-screen and thus eliminate the notification...

## Measuring Frame Rate

BobTest provides a perfect example to start with some optimizations. Before we can do that, though, we need a way to assess performance. Manual visual inspection (doh, it looks like it stutters a little) is not precise enough. A better way to measure how fast our program performs is to count the number of frames we render per second. If you remember Chapter 3, we talked about something called the vertical synchronization, or vsync for short. This is enabled on all Android devices that are on...
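Counting frames per second amounts to incrementing a counter each frame and emitting it once a full second of wall-clock time has passed. The sketch below takes timestamps as parameters so it stays deterministic and testable; in a real game loop you'd pass System.nanoTime(). Names are illustrative:

```java
public class FpsCounter {
    private long startTime;
    private int frames;
    private int lastFps;

    public FpsCounter(long nowNanos) {
        startTime = nowNanos;
    }

    // Call once per rendered frame. Returns the frames counted over the last
    // full second, or -1 while a second is still in progress.
    public int logFrame(long nowNanos) {
        frames++;
        if (nowNanos - startTime >= 1_000_000_000L) {
            lastFps = frames;
            frames = 0;
            startTime = nowNanos;
            return lastFps;
        }
        return -1;
    }
}
```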

## Mobile Gaming Is Different

Gaming was already huge way before the likes of the iPhone and Android started to conquer this market segment. However, with those new forms of hybrid devices, the landscape has started to change. Gaming is no longer something for nerdy kids. Serious businesspeople have been caught playing the latest trendy game on their mobile phones in public, newspapers pick up stories of successful small game developers making a fortune on mobile phone application markets, and established game publishers...

## The UI Assets

We'll again create our UI assets relative to some target resolution. Our game will be run in landscape mode, so we simply choose a target resolution of 480x320 pixels. The screens in Figure 12-4 already show all the elements we have in our UI: a logo, different menu items, a couple of buttons, and some text. For the text we'll reuse the font we used in Super Jumper. We've already done the compositing of all these things in previous games, and you've learned that putting them into a texture atlas...

## GLSurfaceView: Making Things Easy Since 2008

The first thing we need is some type of View that will allow us to draw via OpenGL ES. Luckily, there's such a View in the Android API. It's called GLSurfaceView, and it's a descendant of the SurfaceView class, which we already used for drawing the world of Mr. Nom. We also need a separate main loop thread again so that we don't bog down the UI thread. Surprise: GLSurfaceView already sets up such a thread for us! All we need to do is implement a listener interface called GLSurfaceView.Renderer and...

## Frameworks and Engines

If you bought this book with a little prior game development knowledge, you may have wondered why I didn't choose to use one of the many frameworks available for Android game development. Reinventing the wheel is bad, right? I want you to firmly understand the principles. Although this may be tedious at times, it will pay off in the end. With the knowledge you gained here, it will be so much easier to pick up any precanned solution out there, and it is my hope that you'll recognize the advantage...

## Android's Features and Architecture

Android is not just another Linux distribution for mobile devices. While you develop for Android, you're not all that likely to meet the Linux kernel itself. The developer-facing side of Android is a platform that abstracts away the underlying Linux kernel and is programmed via Java. From a high-level view, Android possesses several nice features: an application framework providing a rich set of APIs to create various types of applications. It also allows the reuse and replacement of components...

## Accessing the External Storage

While assets are superb for shipping all our images and sounds with our application, there are times when we need to be able to persist some information and reload it later on. A common example would be high scores. Android offers many different ways of doing this: you can use an application's shared preferences, a small SQLite database, and so on. All these options have one thing in common: they don't handle large binary files all that gracefully. Why would we need that anyway? While...
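As a sketch of the kind of persistence this section leads up to, here's a minimal save/load of a single high score using plain java.io. On Android the File would point into the external storage (e.g., somewhere under Environment.getExternalStorageDirectory()); here the caller supplies any writable File so the logic stands on its own:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class HighscoreFile {
    // Writes a single high-score value as text to the given file.
    public static void save(File file, int score) throws IOException {
        try (Writer w = new OutputStreamWriter(
                new FileOutputStream(file), StandardCharsets.UTF_8)) {
            w.write(Integer.toString(score));
        }
    }

    // Reads the high-score value back; assumes save() wrote the file.
    public static int load(File file) throws IOException {
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8))) {
            return Integer.parseInt(r.readLine().trim());
        }
    }
}
```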

In Chapter 2 we had a brief look at all the folders an Android project has. We identified the assets and res folders to be the ones we can put files in that should get distributed with our application. When we discussed the manifest file, I told you that we're not going to make use of the res folder, as it implies restrictions on how we structure our file set. The assets directory is the place to put all our files, in whatever folder hierarchy we want. The files in the assets folder are exposed...

## Broad Phase and Narrow Phase Collision Detection

We still don't know how to check for collisions between our objects and their bounding shapes. There are two phases of collision detection. Broad phase: in this phase we try to figure out which objects can potentially collide. Imagine having 100 objects that could each collide with each other. We'd need to perform 100 x 100 / 2 overlap tests if we chose to naively test each object against each other object. This naive overlap testing approach is of O(n²) asymptotic complexity, meaning it would take...
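The quoted numbers follow from counting pairs: testing each of n objects against every other object once requires n(n-1)/2 overlap tests, which for 100 objects is 4,950 (roughly the 100 x 100 / 2 estimated above). A sketch of that count, together with the axis-aligned rectangle overlap test each pair would run (names are illustrative):

```java
public class BroadPhaseMath {
    // Number of pairwise overlap tests when naively checking every object
    // against every other object exactly once.
    public static int naivePairTests(int n) {
        return n * (n - 1) / 2;
    }

    // Axis-aligned rectangle overlap test; rectangles are given by their
    // lower-left corner plus width and height.
    public static boolean overlaps(float x1, float y1, float w1, float h1,
                                   float x2, float y2, float w2, float h2) {
        return x1 < x2 + w2 && x1 + w1 > x2
            && y1 < y2 + h2 && y1 + h1 > y2;
    }
}
```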

## Game Design Building Blocks

Game design building blocks: the leftmost rectangle is our screen, roughly the size of my Nexus One's screen. That's where we'll place all the other elements. The next building blocks are two buttons that we'll use to control the snake. Finally, there's the snake's head, a couple of tail parts, and a piece it can eat. I also wrote out some numbers and cut them out. Those will be used to display the score. Figure 3-12 illustrates my vision of the initial playing field. Figure...

## The Game Screen

We are nearing the completion of Super Jumper. The last thing we need to implement is the game screen, which will present the actual game world to the player and allow the player to interact with it. The game screen consists of five subscreens, as shown in Figure 9-2. We have the ready screen, the normal running screen, the next-level screen, the game-over screen, and the pause screen. The game screen in Mr. Nom was similar to this; it only lacked a next-level screen, as there was only one level. We will...

## The Android FileIO Class

The original FileIO interface was lean and mean. It only contained three methods: one to get an InputStream for an asset, another to get an InputStream for a file on the external storage, and a third that returns an OutputStream for a file on the external storage. In Chapter 4 you learned how to open assets and files on the external storage with the Android APIs. Listing 5-1 shows the implementation of the FileIO interface we'll use, based on the knowledge from Chapter 4. Listing 5-1....

## Using the Sprite Batcher Class

Let's incorporate the TextureRegion and SpriteBatcher classes in our cannon example. I copied the TextureAtlas example and renamed it SpriteBatcherTest. The classes contained in it are called SpriteBatcherTest and SpriteBatcherScreen. The first thing I did was get rid of the Vertices members in the screen class. We don't need them anymore, since the SpriteBatcher will do all the dirty work for us. Instead I added the following members:

```java
TextureRegion cannonRegion;
TextureRegion ballRegion;
...
```

The Bitmap class will become our best friend. We load a bitmap from a file by using the BitmapFactory singleton. As we store our images in the form of assets, let's see how we can load an image from the assets directory:

```java
InputStream inputStream = assetManager.open("bob.png");
Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
```

The Bitmap class itself has a couple of methods that are of interest to us. First we want to get to know its width and height in pixels:

```java
int width = bitmap.getWidth();
int height = bitmap.getHeight();
```

The next thing we...

## Getting the Screen Resolution and Coordinate Systems

In Chapter 2 we talked a lot about the framebuffer and its properties. Remember that a framebuffer holds the colors of the pixels that get displayed on the screen. The number of pixels available to us is defined by the screen resolution, which is given by its width and height in pixels. Now, with our custom View implementation, we don't actually render directly to the framebuffer. But since our View spans the complete screen, we can pretend it does. In order to know where we can render our game...


## Texture Class

To reduce the code needed for subsequent examples, I wrote a little helper class called Texture. It will load a bitmap from an asset and create a texture object from it. It also has a few convenience methods to bind the texture and dispose of it. Listing 7-8 shows the code.

Listing 7-8. Texture.java, a Little OpenGL ES Texture Class

```java
package ...

import java.io.IOException;
import java.io.InputStream;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import ...
```

## Putting It All Together

So how do we integrate all this with a separate rendering thread, as well as with the activity life cycle? The best way to figure this out is to look at some actual code. Listing 4-16 shows you a complete example that performs the rendering in a separate thread on a SurfaceView.

Listing 4-16. The SurfaceViewTest Activity

```java
package com.badlogic.androidgames;

import android.app.Activity;
import android.content.Context;
import android.graphics.Canvas;
import android.os.Bundle;
import ...
```

## AndroidAudio, AndroidSound, and AndroidMusic: Crash, Bang, Boom

We designed three interfaces in Chapter 3 for all our audio needs: Audio, Sound, and Music. Audio is responsible for creating Sound and Music instances from asset files. Sound lets us play back sound effects stored completely in RAM, and Music streams bigger music files from disk to the audio card. In Chapter 4 you learned which Android APIs we need to implement this. We start off with the implementation of AndroidAudio, as shown in Listing 5-2. Listing 5-2. AndroidAudio.java, Implementing the...

## Setting the Volume Controls

If you possess an Android device, you will have noticed that when you press the volume up and down buttons, you control different volume settings depending on what application you are currently in. In a call you control the volume of the incoming voice stream. In the YouTube application you control the volume of the video's audio. On the home screen you control the volume of the ringer. Android has different audio streams for different purposes. When we play back audio in our game, we use...

## Putting It Together

To round this section out, let's put all this together via a nice GLGame and Screen implementation. Listing 7-5 shows the complete example.

```java
package com.badlogic.androidgames.glbasics;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import ...

public class FirstTriangleTest extends GLGame {
    @Override
    public Screen getStartScreen() {
        return new FirstTriangleScreen(this);
    }
}
```

The FirstTriangleTest class derives from GLGame, and thus has to implement the Game.getStartScreen() method....

## Sign Your Game's APK

After you have successfully registered as an official Android developer, it's time to prepare your application for publishing. In order to publish your application, you have to sign the APK file. Before we do that we should make sure everything is in place. Here's a laundry list of things to do before signing the application: Remove the android:debuggable attribute from the <application> tag in your manifest file. In the <manifest> tag you'll find the android:versionCode and android:...

## The Action Mask and More Event Types

Next we have to get the pure event type, minus the additional pointer index that is encoded in the integer returned by MotionEvent.getAction(). We just need to mask the pointer index out:

```java
int action = event.getAction() & MotionEvent.ACTION_MASK;
```

OK, that was easy. Sadly, you'll only understand it if you know what that pointer index is and that it is actually encoded in the action. What's left is to decode the event type as we did before. I already said that there are a few new event types, so...
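The decoding can be illustrated without Android itself, since it's pure bit masking. The constants below mirror MotionEvent's documented encoding (low 8 bits: event type; next 8 bits: pointer index); treat them as assumptions of this sketch rather than platform source code:

```java
public class TouchActionDecoder {
    // Values mirroring Android's MotionEvent encoding (assumed here).
    public static final int ACTION_MASK = 0xff;
    public static final int ACTION_POINTER_INDEX_MASK = 0xff00;
    public static final int ACTION_POINTER_INDEX_SHIFT = 8;

    // The pure event type, with the pointer index masked out.
    public static int eventType(int action) {
        return action & ACTION_MASK;
    }

    // The pointer index encoded in the upper byte of the action.
    public static int pointerIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }
}
```

For example, a second finger going down with pointer index 1 would arrive as (1 << 8) | eventType, and the two methods pull the halves apart again.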

## Simple Camera System

In the last example we saw a hint of how we could implement a camera system in 3D. We used glTranslatef() to push down the complete world by 2 units on the y-axis. Since our camera is fixed to be at the origin, looking down the negative z-axis, this approach gives the impression that the camera itself was moved up by 2 units. All the objects are still defined with their y-coordinates set to zero. It's like the classic saying: "If the mountain will not come to the prophet, the prophet will go...

## The Main Menu Screen

This is the screen that is returned by SuperJumper.getStartScreen(), so it's the first screen the player will see. It renders the background and UI elements and simply waits there for us to touch any of the UI elements. Based on which element was hit, we either change the configuration (sound enabled/disabled) or transition to a new screen. Listing 9-6 shows the code.

Listing 9-6. MainMenuScreen.java, The Main Menu Screen

```java
package com.badlogic.androidgames.jumper;

import ...
```

## The Spatial Hash Grid

Our cannon will be bounded by a rectangle of 1x1 meters, the cannonball will have a bounding rectangle of 0.2x0.2 meters, and the targets will each have a bounding rectangle of 0.5x0.5 meters. The bounding rectangles are centered on each object's position to make our lives a little easier. When our cannon example starts up, we'll simply place a number of targets at random positions. Here's how we could set up the objects in our world:

```java
Cannon cannon = new Cannon(0, 0, 1, 1);
DynamicGameObject...
```
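The core of a spatial hash grid is mapping a position to a cell ID. A hedged sketch of that mapping, assuming square cells covering world space starting at the origin; the cell size and grid width used in the example are illustrative parameters, not values from the book:

```java
public class SpatialHashMath {
    // Computes the ID of the grid cell a point falls into. Cells are square,
    // cellSize units wide, laid out row by row starting at the world origin.
    public static int cellId(float x, float y, float cellSize, int cellsPerRow) {
        int cellX = (int) Math.floor(x / cellSize);
        int cellY = (int) Math.floor(y / cellSize);
        return cellX + cellY * cellsPerRow;
    }
}
```

Objects whose bounding rectangles span a cell border would simply be inserted into every cell they touch.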

## Running and Debugging Android Applications

Once we've written the first iteration of our application code, we want to run and test it to identify potential problems or just be amazed at its glory. We have two ways we can achieve this We can run our application on a real device connected to the development PC via USB. We can fire up the emulator that is included in the SDK and test our application there. In both cases we have to do a little bit of setup work before we can finally see our application in action. Before we can connect our...

## Normalized Device Space and the Viewport

Once OpenGL ES has figured out the projected points of a triangle on the near clipping plane, it can finally translate them to pixel coordinates in the framebuffer. For this, it must first transform the points to so-called normalized device space. This equals the coordinate system depicted in Figure 7-2. Based on these normalized device space coordinates, OpenGL ES calculates the final framebuffer pixel coordinates via the following simple formulas:

pixelX = ((norX + 1) / 2) × (viewportWidth - 1)
pixelY = ((norY + 1) / 2) × (viewportHeight - 1)
...
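Those formulas translate directly into code. A small sketch, under the assumption that the mapping scales by the viewport size minus one, as in the formulas above:

```java
public class ViewportMath {
    // Maps a normalized device x-coordinate in [-1, 1] to a framebuffer
    // pixel x-coordinate for the given viewport width.
    public static float pixelX(float norX, int viewportWidth) {
        return (norX + 1) / 2 * (viewportWidth - 1);
    }

    // Same mapping for the y-coordinate.
    public static float pixelY(float norY, int viewportHeight) {
        return (norY + 1) / 2 * (viewportHeight - 1);
    }
}
```

With a 480-pixel-wide viewport, norX = -1 maps to pixel 0 and norX = 1 maps to pixel 479.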

## The Game Screen Class

There's only one more screen to implement. Let's see what that screen does. As defined in Mr. Nom's design in Chapter 3, it can be in one of four states: waiting for the user to confirm that he's ready, running the game, waiting in a paused state, and waiting for the user to click a button in the game-over state. In the ready state we simply ask the user to touch the screen to start the game. In the running state we update the world, render it, and also tell Mr. Nom to turn left and right when...
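The four states and their transitions form a tiny state machine. A hypothetical sketch, with transition methods named after the events the paragraph describes (the real screen class would hold much more, of course):

```java
public class GameScreenStates {
    public enum State { READY, RUNNING, PAUSED, GAME_OVER }

    private State state = State.READY;

    public State getState() { return state; }

    // A touch in the ready state starts the game.
    public void touch()    { if (state == State.READY)   state = State.RUNNING; }
    // Pausing and resuming toggle between running and paused.
    public void pause()    { if (state == State.RUNNING) state = State.PAUSED; }
    public void resume()   { if (state == State.PAUSED)  state = State.RUNNING; }
    // Losing is only possible while the game is running.
    public void gameOver() { if (state == State.RUNNING) state = State.GAME_OVER; }
}
```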

## The WorldRenderer Class

It simply uses the SpriteBatcher we pass to it in the constructor and renders the world accordingly. Listing 9-23 shows the beginning of the code.

Listing 9-23. Excerpt from WorldRenderer.java, Constants, Members, and Constructor

```java
package com.badlogic.androidgames.jumper;

import ...

public class WorldRenderer {
    static final float FRUSTUM_WIDTH = 10;
    static final float FRUSTUM_HEIGHT = 15;
    GLGraphics glGraphics;
    World world;
    Camera2D cam;

    public WorldRenderer(GLGraphics ...
```

## Implementing a Vector Class

We want to create an easy-to-use vector class for 2D vectors. Let's call it Vector2. It should have two members, for holding the x and y components of the vector. Additionally it should have a couple of nice methods that allow us to do the following:

- Multiply the vector components with a scalar
- Measure the length of a vector
- Calculate the angle between a vector and the x-axis

Java lacks operator overloading, so we have to come up with a mechanism that makes working with the Vector2 class less...
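A minimal sketch of what such a Vector2 class might look like; chainable methods stand in for the operator overloading Java lacks, and this is an illustration rather than the book's full class:

```java
public class Vector2 {
    public float x, y;

    public Vector2(float x, float y) {
        this.x = x;
        this.y = y;
    }

    // Multiplies both components by a scalar; returns this for chaining.
    public Vector2 mul(float scalar) {
        x *= scalar;
        y *= scalar;
        return this;
    }

    // Length of the vector (Pythagoras).
    public float len() {
        return (float) Math.sqrt(x * x + y * y);
    }

    // Angle between the vector and the positive x-axis, in degrees in [0, 360).
    public float angle() {
        float angle = (float) Math.toDegrees(Math.atan2(y, x));
        if (angle < 0) angle += 360;
        return angle;
    }
}
```

Chaining makes expressions like new Vector2(1, 2).mul(2) read almost like vector arithmetic.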

## Vectors in 3D

In Chapter 8 we discussed vectors and their interpretation in 2D. As you might have guessed, all the things we discussed there hold in 3D space as well. All we do is add one more coordinate to our vector, namely the z-coordinate. The operations we looked at for vectors in 2D can be easily transferred to 3D space. We specify vectors in 3D with a statement like this: v = (x, y, z). Addition in 3D is carried out as follows:

c = a + b = (a.x, a.y, a.z) + (b.x, b.y, b.z) = (a.x + b.x, a.y + b.y, a.z + b.z)

Subtraction works exactly the...

## Reading the Accelerometer State

A very interesting input option for games is the accelerometer. All Android devices are required to contain a three-axis accelerometer. We talked about accelerometers a little bit in the last chapter. We'll generally only poll the state of the accelerometer. So how do we get that accelerometer information? You guessed correctly: by registering a listener. The interface we need to implement is called SensorEventListener, which has two methods:

```java
public void onSensorChanged(SensorEvent event);
public ...
```

## An Euler Camera Example

We now want to use the EulerCamera class in a little program. We want to be able to rotate it up and down and left and right based on swiping the touchscreen with a finger. We also want it to move forward when a button is pressed. Our world should be populated by a couple of crates. Figure 11-10 shows you the initial setup of our scene. Figure 11-10. A simple scene with 25 crates, a point light, and an Euler camera in its initial position and orientation. The camera will be located at (0, 1, 3). We...
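The camera's look direction follows from its yaw and pitch with basic trigonometry. A sketch, assuming OpenGL's convention that the default view direction (yaw and pitch both zero) is the negative z-axis; sign conventions vary between implementations, so treat this as one possible choice:

```java
public class EulerMath {
    // Unit direction vector for a camera with the given yaw (rotation around
    // the y-axis) and pitch (rotation around the x-axis), both in degrees.
    public static float[] direction(float yaw, float pitch) {
        double yawRad = Math.toRadians(yaw);
        double pitchRad = Math.toRadians(pitch);
        return new float[] {
            (float) (-Math.sin(yawRad) * Math.cos(pitchRad)),
            (float) Math.sin(pitchRad),
            (float) (-Math.cos(yawRad) * Math.cos(pitchRad))
        };
    }
}
```

Moving the camera forward then just means adding a scaled copy of this direction vector to its position.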

## The Concept of Binding Vertices

So, is there anything else we can improve? Let's look at our current present() method one more time, with glRotatef() and glScalef() removed:

```java
public void present(float deltaTime) {
    GL10 gl = glGraphics.getGL();
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    for (int i = 0; i < NUM_BOBS; i++) {
        gl.glLoadIdentity();
        gl.glTranslatef(bobs[i].x, bobs[i].y, 0);
        bobModel.draw(GL10.GL_TRIANGLES, 0, 6);
    }
}
```

That looks pretty much optimal, doesn't it? Well, in fact it is not optimal. First, we can also move the...

## The Camera2D Class

Let's put all this together into a single class. We want it to store the camera's position, the standard frustum width and height, and the zoom factor. We also want a convenience method that sets the viewport (we'll always use the whole screen) and the projection matrix correctly. Additionally we want a method that can translate touch coordinates to world coordinates. Listing 8-15 shows our new Camera2D class. Listing 8-15. Camera2D.java, Our Shiny New Camera Class for 2D Rendering package import import...
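The touch-to-world conversion flips the y-axis (screen y grows downward, world y upward), scales by the frustum size and zoom, and offsets by the camera position. A hedged, framework-free sketch of that method's math; all names are illustrative:

```java
public class Camera2DMath {
    // Converts a touch point in screen pixels (origin top-left, y pointing
    // down) to world coordinates for a camera centered at (camX, camY).
    // Returns {worldX, worldY}.
    public static float[] touchToWorld(float touchX, float touchY,
                                       int screenWidth, int screenHeight,
                                       float frustumWidth, float frustumHeight,
                                       float zoom, float camX, float camY) {
        float worldX = touchX / screenWidth * frustumWidth * zoom
                       + camX - frustumWidth * zoom / 2;
        float worldY = (1 - touchY / screenHeight) * frustumHeight * zoom
                       + camY - frustumHeight * zoom / 2;
        return new float[] { worldX, worldY };
    }
}
```

With a 480x320 screen, a 4.8x3.2 frustum, zoom 1, and the camera at the frustum center, the bottom-left pixel maps back to the world origin.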

## An Example

Let's create an example called AnimationTest with a corresponding screen called AnimationScreen. As always, we'll only discuss the screen itself. We want to render a number of cavemen, all walking to the left. Our world will be the same size as our view frustum, which has the size 4.8x3.2 meters (this is really arbitrary; we could use any size). A caveman is a DynamicGameObject with a size of 1x1 meters. We will derive from DynamicGameObject and create a new class called Caveman, which will store...
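Picking a caveman's current walking frame reduces to integer division on the time the object has spent in its state. A sketch of that keyframe selection for a looping animation (names are illustrative):

```java
public class AnimationMath {
    // Index of the frame to display for a looping animation, given the time
    // spent in the current state and the duration of a single frame.
    public static int keyFrameLooping(float stateTime, float frameDuration, int frameCount) {
        return (int) (stateTime / frameDuration) % frameCount;
    }
}
```

With four frames of 0.2 seconds each, the animation wraps back to frame 0 every 0.8 seconds.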

## A First Example Using Translation

What can we use this for? Say we want to render 100 Bobs at different positions in our world. Additionally we want them to move around on the screen and change direction each time they hit an edge of the screen (or rather a plane of our parallel projection view frustum, which coincides with the extents of our screen). We could do this by having one large Vertices instance that holds the vertices of the 100 rectangles (one for each Bob) and recalculate the vertex positions each frame. The easier...

## Removing Unnecessary State Changes

So let's look at the present() method of BobTest and see what we can change. Here's the snippet for reference (I added the FPSCounter in, and we also use glRotatef() and glScalef()):

```java
public void present(float deltaTime) {
    GL10 gl = glGraphics.getGL();
    gl.glViewport(0, 0, glGraphics.getWidth(), glGraphics.getHeight());
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
    gl.glEnable(GL10.GL_TEXTURE_2D);
    bobTexture.bind();
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    for (int i = 0; i < NUM_BOBS; i++) {
        gl.glLoadIdentity();
        gl.glTranslatef(bobs[i].x, bobs[i]...
```

## The WorldRenderer Class

Let's recall what we have to render in 3D:

- Our ship, using the ship model and texture and applying lighting.
- The invaders, using the invader model and texture, again with lighting.
- Any shots on the playfield, based on the shot model, this time without texturing but with lighting.
- The shield blocks, based on the shield block model, again without texturing but with lighting and transparency (see Figure 12-3).
- Explosions, instead of the ship or invader model, in case the ship or an invader is...

## Playing Sound Effects

In Chapter 3 we discussed the difference between streaming music and playing back sound effects. The latter are stored in memory and are usually no longer than a few seconds. Android provides us with a class called SoundPool that makes playing back sound effects really easy. We can simply instantiate a new SoundPool instance as follows:

```java
SoundPool soundPool = new SoundPool(20, AudioManager.STREAM_MUSIC, 0);
```

The first parameter defines how many sound effects we can play simultaneously at most. This...

## Coping with Different Aspect Ratios

Replica Island performs a cheap but very useful magic trick in order to deal with the aspect-ratio problem. The game was originally designed for everything to fit on a 480x320-pixel screen, including all the sprites (e.g., the robot and the doctor), the tiles of the world, and the UI elements (e.g., the buttons at the bottom left and the status info at the top of the screen). When the game is rendered on a Hero, each pixel in the sprite bitmaps maps to exactly one pixel on the screen. On a Nexus...

## Android Graphics Serving Our Drawing Needs

The Graphics interface we designed in Chapter 3 is also pretty lean and mean. It will draw pixels, lines, rectangles, and Pixmaps to the framebuffer. As discussed, we'll use a Bitmap as our framebuffer and direct all drawing calls to it via a Canvas. It is also responsible for creating Pixmap instances from asset files. We'll thus also need an AssetManager again. Listing 5-12 shows the code for our implementation of that interface, AndroidGraphics. Listing 5-12. AndroidGraphics.java...

## Specifying Per Vertex Color

In the last example we set a global default color for all vertices we draw via glColor4f(). Sometimes we want more granular control (e.g., we want to set a color per vertex). OpenGL ES offers us this functionality, and it's really easy to use. All we have to do is add RGBA float components to each vertex and tell OpenGL ES where it can find the color for each vertex, similar to how we told it where it can find the position for each vertex. Let's start by adding the colors to each vertex...

## The Sprite Batcher Class

As already discussed, a sprite can be easily defined by its position, size, and texture region (and optionally, its rotation and scale). It is simply a graphical rectangle in our world space. To make things easier we'll stick to the convention of the position being in the center of the sprite and the rectangle constructed around that center. Now, we could have a Sprite class and use it like this:

```java
Sprite bobSprite = new Sprite(20, 20, 0.5f, 0.5f, bobRegion);
```

That would construct a new sprite with its...

We already know how to do that on Android:

```java
Bitmap bitmap = BitmapFactory.decodeStream(...);
```

Here we load Bob in an RGB888 configuration. The next thing we need to do is tell OpenGL ES that we want to create a new texture. OpenGL ES has the notion of objects for a couple of things, such as textures. To create a texture object, we can call the following method:

```java
GL10.glGenTextures(int numTextures, int[] ids, int offset)
```

The first parameter specifies how many texture objects we want to create. Usually we...

## KeyboardHandler: Up, Up, Down, Down, Left, Right

The KeyboardHandler has to fulfill a couple of tasks. First it must hook up with the View from which keyboard events are to be received. Next it must store the current state of each key for polling. It must also keep a list of KeyEvent instances, which we designed in Chapter 3 for event-based input handling. Finally it must properly synchronize all this, as it will receive events on the UI thread while being polled from our main game loop, which is executed on a different thread. Quite a lot of...

## The First Person or Euler Camera

The first-person or Euler camera is defined by the following attributes:

- The field of view in degrees.
- The viewport aspect ratio.
- The near and far clipping planes.
- A position in 3D space.
- An angle around the y-axis (yaw).
- An angle around the x-axis (pitch). This is limited to the range -90 to 90 degrees. Think how far you can tilt your own head and try to go beyond those angles! I'm not responsible for any injuries.

The first three attributes are used to define the perspective projection matrix. We did this already with calls to...

## The Simulation Classes

Before we can dive into the game screen we need to create our simulation classes. We'll follow the same pattern as in Mr. Nom, with a class for each game object and an all-knowing superclass called World that ties together the loose ends and makes our game world tick. We'll need classes for Bob, squirrels, platforms, springs, and coins. Bob, squirrels, and platforms can move, so we'll base their classes on the DynamicGameObject we created in the last chapter. Springs and coins are static, so those will derive from the GameObject class. The...

## Implementing Super Jumper

Implementing Super Jumper will be pretty easy. We can reuse our complete framework from the previous chapter and follow the architecture we had in Mr. Nom on a high level. This means we'll have a class for each screen, and each of these classes will implement the logic and presentation expected from that screen. Besides that, we'll also have our standard project setup with a proper manifest file, all our assets in the assets folder, an icon for our application, and so on. Let's start with our...

## Bounding Sphere Overlap Testing

(Figure: Making the bounding sphere smaller to better fit an object)

Listing 11-13. Sphere.java, a Simple Bounding Sphere

```java
package ...;

public class Sphere {
    public final Vector3 center = new Vector3();
    public float radius;

    public Sphere(float x, float y, float z, float radius) {
        this.center.set(x, y, z);
        this.radius = radius;
    }
}
```

That's the same code as in the Circle class. All we changed is the vector holding the center, which is now a Vector3 instead of a Vector2. Let's also extend our OverlapTester class with methods to check for...
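The corresponding overlap test is worth spelling out: two spheres overlap if the squared distance between their centers is no bigger than the squared sum of their radii, which avoids a square root. A minimal, self-contained sketch (using plain floats instead of our Vector3 class):

```java
// Sphere-sphere overlap: compare squared center distance against the
// squared sum of the radii (no square root needed).
public class SphereOverlap {
    public static boolean overlapSpheres(float x1, float y1, float z1, float r1,
                                         float x2, float y2, float z2, float r2) {
        float dx = x1 - x2, dy = y1 - y2, dz = z1 - z2;
        float distSquared = dx * dx + dy * dy + dz * dz;
        float radiusSum = r1 + r2;
        return distSquared <= radiusSum * radiusSum;
    }
}
```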

## Texture Atlas to the Rescue

That's all the graphical assets we have in our game. Now, we already talked about how textures need to have power-of-two widths and heights. Our background image and all the help screens have a size of 320x480 pixels. We'll store those in 512x512-pixel images so we can load them as textures. That's already six textures. Do we create separate textures for all the other images as well? No. We create a single texture atlas. It turns out that everything else fits nicely into a single 512x512-pixel...

## Working Around a Bug in FloatBuffer

The reason for this isn't obvious at all. Our SpriteBatcher puts a float array into a direct ByteBuffer each frame when we call Vertices.setVertices(). The method boils down to calling FloatBuffer.put(float[]), and that's the culprit of our performance hit here. While desktop Java implements that FloatBuffer method via a real bulk memory move, the Harmony version calls FloatBuffer.put(float) for each element in the array. And that's extremely unfortunate, as that method is a JNI method, which has a...
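A workaround in the spirit of this discussion is to sidestep FloatBuffer.put(float[]) entirely: reinterpret each float as its raw int bits and bulk-put those through an IntBuffer view of the same direct ByteBuffer. The class and method names here are illustrative, not the book's exact listing:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;

// Workaround sketch: convert floats to raw int bits and bulk-put them
// via an IntBuffer view of the direct ByteBuffer, avoiding the
// element-wise FloatBuffer.put(float) path.
public class VerticesSketch {
    public static FloatBuffer fillVertices(float[] vertices, int[] tmpBuffer) {
        ByteBuffer byteBuffer = ByteBuffer.allocateDirect(vertices.length * 4);
        byteBuffer.order(ByteOrder.nativeOrder());
        IntBuffer intBuffer = byteBuffer.asIntBuffer();

        // Reinterpret each float as an int; the bit pattern is preserved.
        for (int i = 0; i < vertices.length; i++)
            tmpBuffer[i] = Float.floatToRawIntBits(vertices[i]);
        intBuffer.put(tmpBuffer, 0, vertices.length);

        // A FloatBuffer view of the same bytes sees the original floats.
        FloatBuffer floatBuffer = byteBuffer.asFloatBuffer();
        floatBuffer.position(0);
        floatBuffer.limit(vertices.length);
        return floatBuffer;
    }
}
```

The tmpBuffer int array is reused across calls so the conversion itself produces no garbage per frame.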

## How Lighting Works

Let's think about how lighting works for a moment. The first thing we need is a light source to emit light. We also need an object that can be lit. Finally, we need a sensor, like our eyes or a camera, which will catch the photons that are sent out by the light source and reflected back by the object. Lighting changes the perceived color of an object depending on the following:

- The light source's color or intensity
- The light source's position and direction relative to the lit object
- The object's...

## Light Sources

We are surrounded by all kinds of light sources. The sun constantly throws photons at us. Our monitors emit light, surrounding us with that nice blue glow at night. Light bulbs and headlights keep us from bumping or driving into things in the dark. OpenGL ES allows you to create four types of light sources:

- Ambient light: Ambient light is not a light source per se but rather the result of photons coming from other light sources bouncing around in our world. All these stray photons combined make...

## Perspective Projection: The Closer, the Bigger

Until now we have always used an orthographic projection, meaning that no matter how far an object is from the near clipping plane, it will always have the same size on the screen. Our eyes show us a different picture of the world. The further away an object is, the smaller it appears to us. This is called perspective projection, and we've already talked about it a little in Chapter 4. The difference between an orthographic projection and a perspective projection can be explained by the shape...

## More Primitives: Points, Lines, Strips, and Fans

When I told you that OpenGL ES was a big, nasty triangle-rendering machine, I was not being 100 percent honest. In fact, OpenGL ES can also render points and lines. Best of all, these are also defined via vertices, and thus all of the above also applies to them (texturing, per-vertex colors, etc.). All we need to do to render these primitives is use something other than GL10.GL_TRIANGLES when we call glDrawArrays() or glDrawElements(). We can also perform indexed rendering with these primitives,...

## Alpha Blending: I Can See Through You

Alpha blending in OpenGL ES is pretty easy to enable. We only need two method calls:

gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);

The first method call should be familiar; it just tells OpenGL ES that it should apply alpha blending to all triangles we render from this point on. The second method is a little bit more involved. It specifies how the source and destination color should be combined. If you remember what we discussed in Chapter 3, the way a...
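Per color channel, that blend function boils down to a single equation: result = source * srcAlpha + destination * (1 - srcAlpha). A tiny sketch that makes the arithmetic explicit (a hypothetical helper, not part of OpenGL ES itself):

```java
// What GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA means for one channel:
// the source color is weighted by its alpha, the destination (what's
// already in the framebuffer) by one minus that alpha.
public class BlendSketch {
    public static float blendChannel(float src, float dst, float srcAlpha) {
        return src * srcAlpha + dst * (1 - srcAlpha);
    }
}
```

With srcAlpha = 1 the source fully replaces the destination; with srcAlpha = 0 the destination shows through untouched.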

## Abstracting the World of Mr. Nom: Model-View-Controller

If you are a long-time coder, you've probably heard about design patterns. They are more or less strategies to design your code given a scenario. Some of them are academic, and some have uses in the real world. For game development, we can borrow some ideas from the Model-View-Controller (MVC) design pattern. It's often used by the database and web community to separate the data model from the presentation layer and the data manipulation layer. We won't strictly follow this design pattern, but...

## LookAt Camera

The second type of camera usually found in games is a simple look-at camera. It is defined by the following:

- An up vector. Think of this as an arrow coming out of the top of your skull, pointing in the direction of the top of your skull.
- A look-at position in space, or alternatively a direction vector. We'll use the former.
- A near and far clipping plane distance.

The only difference from the Euler camera is the way we encode the orientation of the camera. In this case we specify the orientation...

## The TextureRegion Class

Since we've worked with texture regions already, it should be straightforward to figure out what we need. We know how to convert from pixel coordinates to texture coordinates. We want to have a class where we can specify pixel coordinates of an image in a texture atlas, and that then stores the corresponding texture coordinates of the atlas region for further processing (e.g., when we want to render a sprite). Without further ado, Listing 8-16 shows our TextureRegion class. Listing 8-16....
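The conversion itself can be sketched in a few lines. This hypothetical stand-in (not Listing 8-16 itself) just divides pixel coordinates by the texture's size to get normalized u/v coordinates for the region's two corners:

```java
// Sketch of a TextureRegion-style class: given pixel coordinates of a
// region inside an atlas texture, precompute normalized coordinates
// (u1, v1) for the top-left and (u2, v2) for the bottom-right corner.
public class TextureRegionSketch {
    public final float u1, v1, u2, v2;

    public TextureRegionSketch(float textureWidth, float textureHeight,
                               float x, float y, float width, float height) {
        this.u1 = x / textureWidth;
        this.v1 = y / textureHeight;
        this.u2 = this.u1 + width / textureWidth;
        this.v2 = this.v1 + height / textureHeight;
    }
}
```

Doing the division once in the constructor means rendering code never has to repeat it per frame.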

## Defining an Android Application: The Manifest File

An Android application can consist of a multitude of different components:

- Activities: These are user-facing components that present a UI to interact with.
- Services: These are processes that work in the background and don't have a visible UI. A service might be responsible for polling a mail server for new e-mails, for example.
- Content providers: These components make parts of your application data available to other applications.
- Intents: These are messages created by the system or applications...

## Defining the Game World

The classic chicken-and-egg problem haunts us again. You learned in the last chapter that we have a correspondence between world units (e.g., meters) and pixels. Our objects are defined physically in world space: bounding shapes and positions are given in meters, velocities in meters per second. The graphical representations of our objects are defined in pixels, though, so we have to have some sort of mapping. We overcome this problem by first defining a target resolution for our...

## Hello World, Android Style

With our development environment set up, we can now finally create our first Android project in Eclipse. The ADT plug-in installed a couple of wizards for us that make the creation of new Android projects really easy. There are two ways to create a new Android project. The first one works by right-clicking in the Package Explorer view (see Figure 2-4) and selecting New > Project from the pop-up menu. In the new dialog, select Android Project under the Android category. As you can see, there are a lot of other...

## Indexed Vertices: Because Reuse Is Good for You

Up until this point, we have always defined lists of triangles, where each triangle has its own set of vertices. We have actually only ever drawn a single triangle, but adding more would not have been a big deal. There are cases, however, where two or more triangles can share some vertices. Let's think about how we'd render a rectangle with our current knowledge. We'd simply define two triangles that would have two vertices with the same positions, colors, and texture coordinates. We can do...
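For the rectangle case, the idea looks like this: four shared vertices plus six indices describing the two triangles, instead of six full vertices. A minimal sketch (positions only; the names are illustrative):

```java
// Indexed rendering of a rectangle: four shared vertices and six
// indices forming two triangles. Vertices 0 and 2 are reused.
public class IndexedRectSketch {
    // x, y per vertex: a unit rectangle.
    public static final float[] VERTICES = {
        0, 0,   // 0: bottom-left
        1, 0,   // 1: bottom-right
        1, 1,   // 2: top-right
        0, 1    // 3: top-left
    };
    // Triangle 1: 0-1-2, triangle 2: 2-3-0.
    public static final short[] INDICES = { 0, 1, 2, 2, 3, 0 };
}
```

With per-vertex position, color, and texture coordinates, the two reused vertices save a third of the vertex data for every rectangle we draw.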

## World and Model Space

To understand how this works, we have to literally think outside of our little orthographic view frustum box. Our view frustum is in a special coordinate system called the world space. This is the space where all our vertices are going to end up eventually. Up until now, we have specified all vertex positions in absolute coordinates relative to the origin of this world space (compare with Figure 7-5). What we really want is to make the definition of the positions of our vertices independent from...

## Little Trigonometry

Let's turn to trigonometry for a minute. There are two essential functions in trigonometry: cosine and sine. Each takes a single argument: an angle. We are used to specifying angles in degrees (e.g., 45 or 360). In most math libraries, trigonometry functions expect the angle in radians, though. We can easily convert between degrees and radians with the following equations:

degreesToRadians(angleInDegrees) = angleInDegrees / 180 * pi
radiansToDegrees(angleInRadians) = angleInRadians / pi * 180

Here, pi is our beloved...
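As code, the two conversions are one-liners. A small sketch (the class name is illustrative):

```java
// The two angle conversions from the text:
// degreesToRadians(a) = a / 180 * pi, radiansToDegrees(a) = a / pi * 180.
public class AngleUtils {
    public static float degreesToRadians(float angleInDegrees) {
        return angleInDegrees / 180f * (float) Math.PI;
    }

    public static float radiansToDegrees(float angleInRadians) {
        return angleInRadians / (float) Math.PI * 180f;
    }
}
```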

## Defining the Projection Matrix

The next thing we need to define is the projection matrix. As we are only concerned with 2D graphics in this chapter, we want to use a parallel projection. How do we do that? We already discussed that OpenGL ES keeps track of three matrices: the projection matrix, the model-view matrix, and the texture matrix (which we'll continue to ignore). OpenGL ES offers us a couple of specific methods to modify these matrices. Before we can use these methods, however, we have to tell OpenGL ES which matrix...

## Narrow Phase

Once we are done with the broad phase, we have to check whether the bounding shapes of the potentially colliding objects overlap. I mentioned earlier that we have a couple of options for bounding shapes. Triangle meshes are the most computationally expensive and cumbersome to create. It turns out that we can get away with bounding rectangles and bounding circles in most 2D games, so that's what we'll concentrate on here. Bounding circles are the cheapest way to check whether two objects...
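The two cheap tests mentioned above can each be sketched in a few lines: circle-circle via squared distances, and axis-aligned rectangle-rectangle via interval overlap on both axes. This is an illustrative stand-in for our OverlapTester methods, using plain floats instead of our shape classes:

```java
// Narrow-phase overlap tests for the two cheap 2D bounding shapes.
public class OverlapSketch {
    // Circles overlap if the squared center distance is at most the
    // squared sum of the radii (no square root needed).
    public static boolean overlapCircles(float x1, float y1, float r1,
                                         float x2, float y2, float r2) {
        float dx = x1 - x2, dy = y1 - y2;
        float radiusSum = r1 + r2;
        return dx * dx + dy * dy <= radiusSum * radiusSum;
    }

    // Axis-aligned rectangles, given by lower-left corner plus extents,
    // overlap if their intervals overlap on both the x- and y-axis.
    public static boolean overlapRects(float x1, float y1, float w1, float h1,
                                       float x2, float y2, float w2, float h2) {
        return x1 < x2 + w2 && x1 + w1 > x2 &&
               y1 < y2 + h2 && y1 + h1 > y2;
    }
}
```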

## AndroidGame: Tying Everything Together

Our little game development framework is nearly complete. All we need to do is tie together the loose ends by implementing the Game interface we designed in Chapter 3, using the classes we created in the previous sections of this chapter. Here's a list of responsibilities:

- Perform window management. In our context, that means setting up an activity and an AndroidFastRenderView, and handling the activity life cycle in a clean way.
- Use and manage a WakeLock so that the screen does not get...

## AndroidFastRenderView: Loop, Stretch, Loop, Stretch

The name of this class should already give away what lies ahead. In the last chapter we discussed using a SurfaceView to perform continuous rendering in a separate thread that could also house our game's main loop. We developed a very simple class called FastRenderView, which derived from the SurfaceView class; we made sure it played nicely with the activity life cycle, and we set up a thread in which we constantly rendered to the SurfaceView via a Canvas. We'll reuse this FastRenderView class and...

## The Wavefront OBJ Format

We will implement a loader for a subset of this format. Our loader will support models that are composed of triangles only and optionally may contain texture coordinates and normals. The OBJ format also supports storing arbitrary convex polygons, but we won't go into that. Whether you simply find an OBJ model, or create your own, just make sure that it is triangulated, meaning that it's composed of triangles only. The OBJ format is line-based. Here are the parts of the syntax we are going to...
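To make the line-based nature concrete, here's a tiny, hypothetical parser for just the "v x y z" case; a real loader handles "vt", "vn", and "f" lines the same way, token by token:

```java
// Minimal sketch of parsing one OBJ vertex-position line ("v x y z").
public class ObjLineSketch {
    public static float[] parseVertex(String line) {
        // Split on one or more whitespace characters.
        String[] tokens = line.trim().split("\\s+");
        if (!tokens[0].equals("v") || tokens.length < 4)
            throw new IllegalArgumentException("not a vertex line: " + line);
        return new float[] {
            Float.parseFloat(tokens[1]),
            Float.parseFloat(tokens[2]),
            Float.parseFloat(tokens[3])
        };
    }
}
```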

## Continuous Rendering in the UI Thread

All we've done up until now is set the text of a TextView when needed. The actual rendering has been performed by the TextView itself. Let's create our own custom View whose sole purpose is to let us draw stuff to the screen. We also want it to redraw itself as often as possible, and we want a simple way to perform our own drawing in that mysterious redraw method. Although this may sound complicated, in reality Android makes it really easy for us to create such a thing. All we have to do is...


## Brief History of Android

Android was first publicly noticed in 2005, when Google acquired a small startup called Android, Inc. This fueled speculation that Google wanted to enter the mobile space. In 2008, the release of version 1.0 of Android put an end to all speculation, and Android became the new challenger on the mobile market. Since then, it's been battling it out with already established platforms such as iOS (then called iPhone OS) and BlackBerry, and its chances of winning look rather good. Because Android is...

## Drawing Pixels

The first thing we want to know is how to draw a single pixel. That's done with the following method: Canvas.drawPoint(float x, float y, Paint paint). Two things to notice immediately are that the coordinates of the pixel are specified with floats, and that the Canvas doesn't let us specify the color directly, but instead wants an instance of the Paint class from us. Don't get confused by the fact that we specify coordinates as floats. Canvas has some very advanced functionality that actually...

## Texture Coordinates

To map a bitmap to a triangle, we need to add so-called texture coordinates to each vertex of the triangle. What is a texture coordinate? It specifies a point within the texture (our uploaded bitmap) to be mapped to one of the triangle's vertices. Texture coordinates are usually 2D. While we call our positional coordinates x, y, and z, texture coordinates are usually called u and v or s and t, depending on the circle of graphics programmers you are a part of. OpenGL ES calls them s and t, so that's...

## The Settings Screen

The settings screen lets the player change the input method as well as enable or disable audio. We indicate this with three different icons (see Figure 12-4). Touching either the hand or the tilted phone will enable the respective input method. The icon for the currently active input method will have a gold color. For the audio icon, we do the same as in the previous games. The choices of the user are reflected by setting the respective boolean values in the Settings class. We also make sure...

## The MultiTouchHandler

For multitouch handling, we have a class called MultiTouchHandler, as shown in Listing 5-10.

Listing 5-10. MultiTouchHandler.java: More of the Same

```java
package ...;

import java.util.ArrayList;
import java.util.List;

import android.view.MotionEvent;
import android.view.View;
...

public class MultiTouchHandler implements TouchHandler {
    boolean[] isTouched = new boolean[20];
    int[] touchX = new int[20];
    int[] touchY = new int[20];
    Pool<TouchEvent> touchEventPool;
    List<TouchEvent> touchEvents = new ArrayList...
```

## The Font Class

We are going to use bitmap fonts to render arbitrary ASCII text. We already discussed how this works on a high level, so let's look at the code in Listing 9-4.

Listing 9-4. Font.java, a Bitmap Font-Rendering Class

```java
package ...;

public class Font {
    public final Texture texture;
    public final int glyphWidth;
    public final int glyphHeight;
    public final TextureRegion[] glyphs = new TextureRegion[96];
    ...
```

The class stores the texture containing the font's glyphs, the width and height of a single glyph, and an array of TextureRegions, one for...
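The glyph lookup behind this class is worth making explicit: printable ASCII starts at the space character (code 32), so a character's glyph index is simply c - ' ', with the 96 glyphs covering codes 32 through 127. A tiny sketch (a hypothetical helper, not part of Listing 9-4):

```java
// Mapping a character to one of the font's 96 glyph regions.
public class GlyphIndexSketch {
    public static int glyphIndex(char c) {
        int index = c - ' ';
        if (index < 0 || index > 95) return -1;  // not a printable ASCII glyph
        return index;
    }
}
```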

## Processing Single Touch Events

When we processed clicks on a button in Chapter 2, we saw that listener interfaces are the way Android reports events to us. Touch events are no different. Touch events are passed to an OnTouchListener interface implementation that we register with a View. The OnTouchListener interface has only a single method: public abstract boolean onTouch(View v, MotionEvent event). The first argument is the View that the touch events get dispatched to. The second argument is what we'll dissect to get the...

## Mipmapping

If you've played around with our previous examples and let the cube move farther away from the camera, you might have noticed that the texture starts to look grainy and full of little artifacts the smaller the cube gets. This effect is called aliasing, a prominent effect in all types of signal processing. Figure 11-8 shows the effect on the right side and the result of applying a technique called mipmapping on the left side. Figure 11-8. Aliasing artifacts on the right; the results of...


## The Pool Class: Because Reuse Is Good for You

What's the worst thing that can happen to us as Android developers? World-stopping garbage collection! If you look at the Input interface definition in Chapter 3, you'll find the methods getTouchEvents() and getKeyEvents(). These return lists of TouchEvents and KeyEvents. In our keyboard and touch event handlers, we'll constantly create instances of these two classes and store them in lists internal to the handlers. The Android input system fires a lot of those events when a key is pressed or a...
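The idea can be sketched as a small generic pool: a factory creates instances when the free list is empty, and free() hands instances back instead of letting them become garbage. This is a minimal sketch in the spirit of the chapter's Pool class; the names PoolObjectFactory, newObject(), and free() follow that spirit but aren't guaranteed to match the book's listing exactly:

```java
import java.util.ArrayList;
import java.util.List;

// Generic object pool: recycle instances instead of feeding the GC.
public class Pool<T> {
    public interface PoolObjectFactory<T> {
        T createObject();
    }

    private final List<T> freeObjects;
    private final PoolObjectFactory<T> factory;
    private final int maxSize;

    public Pool(PoolObjectFactory<T> factory, int maxSize) {
        this.factory = factory;
        this.maxSize = maxSize;
        this.freeObjects = new ArrayList<>(maxSize);
    }

    // Reuse a pooled instance if available, otherwise allocate a new one.
    public T newObject() {
        if (freeObjects.isEmpty())
            return factory.createObject();
        return freeObjects.remove(freeObjects.size() - 1);
    }

    // Hand an instance back; silently dropped once the pool is full.
    public void free(T object) {
        if (freeObjects.size() < maxSize)
            freeObjects.add(object);
    }
}
```

Capping the free list with maxSize keeps the pool from hoarding memory after an input burst.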

## Processing Key Events

After the insanity of the last section, we deserve something dead simple. Welcome to processing key events. To catch key events, we implement another listener interface, called OnKeyListener. It has a single method called onKey(), with the following signature: public boolean onKey(View view, int keyCode, KeyEvent event). The View specifies the view that received the key event, the keyCode argument is one of the constants defined in the KeyEvent class, and the final argument is the key event itself,...

## GLGame: Implementing the Game Interface

In the previous chapter, we implemented the AndroidGame class, which ties together all the submodules for audio, file I/O, graphics, and user input handling. We want to reuse most of this for our upcoming 2D OpenGL ES game, so let's implement a new class called GLGame that implements the Game interface we defined earlier. The first thing you will notice is that we can't possibly implement the Graphics interface with our current knowledge of OpenGL ES. Here's a surprise: we won't implement it....

## Using the OBJ Loader

To demonstrate the OBJ loader, I've rewritten the last example and created a new test called ObjTest along with an ObjScreen. I copied over all the code from the previous example and only changed a single line in the constructor of ObjScreen: cube = ObjLoader.load(glGame, "cube.obj"); So, instead of using the createCube() method (which I removed), we are now directly loading a model from an OBJ file called cube.obj. I created a replica of the cube we previously specified programmatically in createCube() in...