The UI System

July 2, 2010

I decided to take a break from the rendering/physics side of things and work on the UI system. The UI system is based on a custom scripting system, embedded in the project as data files that cannot be downloaded or modified in the final game, to avoid issues with Apple. All the UI in the game – both 2D and 3D – will use this data-driven system.

I’ve also implemented the text system, which uses a bitmap font for fast rendering, so that text can be properly layered with the UI and rendered onto 3D UI surfaces. You’ll be able to freely interact with both 2D and 3D UIs, including panels directly in the 3D world. These panels can cause things to happen in the world (doors opening, platforms moving, etc.) and will be a primary means of interacting with it.

Here are some example screenshots showing the UI system with both 2D and 3D UIs.

These shots were taken directly from the iPad. Notice the color of the bottom buttons changing as I press them, which also changes the text. The first screenshot shows the default state; then I press the left, middle and right buttons.

This screenshot simply shows the 3D UI from a different angle, so you can see that it really is the UI being properly rendered in the 3D world, on the wall.

And finally the same level and UI on the iPhone. Everything works seamlessly between the iPad and iPhone versions.

The UI scripts are exactly the same for 2D and 3D – in fact they are interchangeable. When a UI is created, the running instance is either a 2D UI (such as the game HUD) or attached to a model in the world as a 3D UI. Unlike games such as Doom 3, the UI is set up with iPhone/iPad control in mind – essentially as a touch interface. This means you’ll be able to drag elements around (if enabled by the script), use multi-touch where appropriate, use iPhone-like scrolling instead of scroll bars, and so on.

Finally, the UI panels and game systems will be able to communicate, allowing UI scripts to affect game state, game state to affect the UI, and data saved in one level to be accessible from any other level. A simple example would be using a panel in one level to turn on a panel in another, or disabling all the panels in an area when the power is off.
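To give an idea of how cross-level communication like this might look, here is a minimal sketch of a shared game-state table that both UI scripts and game code could read and write. All names here (GameState, SetFlag, GetFlag, the "sector7_power" key) are illustrative assumptions, not the actual engine API:

```cpp
#include <map>
#include <string>

// Hypothetical sketch: a persistent flag table shared between UI scripts
// and game systems, surviving across level loads.
class GameState {
public:
    void SetFlag(const std::string& key, int value) { m_flags[key] = value; }

    int GetFlag(const std::string& key, int fallback = 0) const {
        auto it = m_flags.find(key);
        return it != m_flags.end() ? it->second : fallback;
    }

private:
    std::map<std::string, int> m_flags;  // persists for the whole session
};
```

A panel in one level would call `SetFlag("sector7_power", 1)`; panels elsewhere would query the same key on creation and disable themselves if it reads 0.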


Control Protocol Work Resumes

June 26, 2010

After having to take a break from Control Protocol, I have recently resumed the project. I’m focused on completing a part of the game that I can use to show off (through a gameplay video and screenshots) and talk about the game. That’s when I’ll announce the game properly and give people a better idea of what Control Protocol will be.

One area of focus has been honing the look of the target area of the game and getting to the point where I can start putting together the real levels. To that end I’ve implemented enhancements to the 3D sector system, proper model support, and enhancements to the lighting system.

I implemented support for FBX files, which are converted to a binary format for use on the device. Blender can export FBX files, which I then run through a converter, and the final files are loaded quickly by the game. Currently materials, positions, normals and texture coordinates are exported, with animation data coming soon.
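As a sketch of why a binary intermediate format loads quickly, here is a hypothetical header layout of the kind a converter like this might emit. The field layout is purely illustrative, not the actual file format:

```cpp
#include <cstdint>

// Hypothetical sketch of a converted model file header; the real format
// is not documented here, so these fields are assumptions.
struct ModelHeader {
    uint32_t magic;          // file identifier / version check
    uint32_t vertexCount;
    uint32_t indexCount;
    uint32_t materialCount;
};

// Tightly packed vertex data (position, normal, uv) would follow the
// header, so the game can read it straight into a vertex buffer with a
// single fread instead of parsing text at load time.
```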

The lightmaps were showing color artifacts in bright areas – the result of clipping colors to fit within the limited precision of 32-bit textures (8 bits per channel). The other problem was that bright areas could never exceed the base texture color, which resulted in flat, weird-looking lighting if I made the light sources too bright. The usual solution is HDR lightmaps, but full HDR textures aren’t workable for an iPhone/iPad game.

When saving the lightmaps, I look at the brightest color channel – we’ll call it ‘L’. If L is less than or equal to 1.0, I leave the color alone and set the alpha to 0 (no compression). If L is greater than 1.0, I divide the RGB color by L and store L - 1.0 in the alpha channel (clamped at 1.0). When rendering the lightmap, the final lighting color is then lightmap.rgb * (1 + lightmap.a). This means bright areas can have up to double the texture brightness and color is preserved much better. All of the clipping artifacts simply disappear. 🙂
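The encode/decode pair described above can be sketched as follows (a minimal CPU version, assuming floating-point lightmap values before they are quantized to 8 bits):

```cpp
#include <algorithm>
#include <cmath>

struct RGBA { float r, g, b, a; };

// Sketch of the overbright encoding: colors brighter than 1.0 are
// normalized by their maximum channel L, and the excess (L - 1.0)
// is stored in alpha, clamped so brightness tops out at 2x.
RGBA EncodeLightmapTexel(float r, float g, float b) {
    float L = std::max(r, std::max(g, b));
    if (L <= 1.0f)
        return { r, g, b, 0.0f };                    // no compression
    return { r / L, g / L, b / L,
             std::min(L - 1.0f, 1.0f) };             // overbright in alpha
}

// At render time: final = lightmap.rgb * (1 + lightmap.a)
float DecodeChannel(float c, float a) { return c * (1.0f + a); }
```

For example, a texel of (1.5, 0.75, 0.3) stores as (1.0, 0.5, 0.2) with alpha 0.5, and decoding recovers the original values exactly – so hue is preserved and nothing clips until brightness exceeds 2.0.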

You can see this in action here (cropped iPad resolution – taken in the editor):

What’s interesting is that I can use the alpha channel of the lightmap as a bloom mask, which means that on higher-end phones (and the iPad) I can apply an HDR-like bloom effect. Only pixels truly brighter than 1.0, or emissive ones, will glow. I haven’t implemented this on the iPhone yet, but I’ll show the results in the editor to give you an idea of what it’ll look like:

Right now I’m also working on the data-driven UI system, which will allow interactive UI panels in the world similar to Doom 3. This will be used for puzzles and other in-world interactivity using a touchscreen interface. This system will also be used for the HUD, any status/inventory screens, the title screen and any other UI elements. After that I’ll start putting together some of the game UI elements.

Initial Level Work

April 12, 2010

Having completed multi-bounce radiosity support in the editor, I’ve started working on the level design and art for the first area I’m going to show. One of the things I’ve discovered with my radiosity solution is that area lights and textured lights work very well, providing much more natural-looking lighting than the point lights I’ve shown up until now. The way it works is that emissive pixels in textures have a certain range of alpha values, and these pixels are automatically treated as area lights when generating the lightmaps, at no extra cost. The plan is to do all the static lighting this way – whenever I want a light source, I simply apply a texture with emissive pixels to some of its surfaces.
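A rough sketch of how emissive texels could be gathered into light samples during lightmap generation. The alpha threshold and the idea of using the texel’s uv as a stand-in position are assumptions for illustration; the real bake would map each texel onto its surface in world space:

```cpp
#include <vector>

struct Texel { float r, g, b, a; };
struct AreaSample { float pos[3]; float rgb[3]; };

const float kEmissiveMin = 0.9f;  // assumed alpha range marking emissive pixels

// Texels whose alpha falls in the emissive range become point samples of
// an area light, so textured light sources cost nothing extra at bake time.
std::vector<AreaSample> GatherEmissiveSamples(const std::vector<Texel>& tex,
                                              int w, int h) {
    std::vector<AreaSample> samples;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            const Texel& t = tex[y * w + x];
            if (t.a < kEmissiveMin) continue;
            // Position left at the texel's uv for this sketch; a real bake
            // would transform it onto the lit surface in world space.
            samples.push_back({ { (x + 0.5f) / w, (y + 0.5f) / h, 0.0f },
                               { t.r, t.g, t.b } });
        }
    return samples;
}
```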

Another thing I should mention is that areas of the game will initially have the main power disabled, which means that certain things won’t work, such as doors and other machines, until you figure out how to turn the power back on. This means that there will be multiple lighting states – emergency power only and full lighting being the most common states.

Below you can see a hallway with emergency power only, lit using only emissive pixels from the floor texture (these screenshots are from the editor, but it looks very similar on the device itself):

The level geometry will get more complex, of course, and the ceiling texture needs some tweaking to make the tiling less obvious. Still, you can see that “emergency power” has a moody, low-light look, while remaining bright enough that the player can still see easily.

Next time I’ll show lighting with the power enabled, along with more interesting level geometry. 🙂

Lightmaps III

April 10, 2010

The radiosity system is finally working in the editor. I’ll give a brief overview of how the system works.

As discussed previously, the level geometry is automatically unwrapped. Then direct lighting – from non-shadow-casting lights – is rendered into the lightmaps on the GPU. The position and normal of each sample point (lightmap texel) are also rendered, for use later.

Next, the position and normal maps rendered during the previous phase are downloaded into system memory. The alpha channel is used as a mask, so that texels with no geometry associated with them are ignored. A hemicube is then rendered on the GPU for each valid texel, and a combined cosine and distortion correction map is multiplied with the unrolled hemicube – the result is normalized later on the CPU.

Finally, three additional GPU passes are applied, each summing 8×8 blocks of pixels and writing them into a render target 1/8 the size in each dimension. The final result is a single 1×1 texture in GPU memory, which is downloaded to the CPU and divided by a normalization factor – the lighting contribution over the entire hemicube should have a combined weight of 1.0.
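The reduction passes can be illustrated with a CPU equivalent (a sketch only – the real work happens in fragment shaders, and I’m assuming a square source whose width is a power of 8):

```cpp
#include <cstddef>
#include <vector>

// One reduction pass: sum each 8x8 block into a target 1/8 the size.
std::vector<float> ReducePass(const std::vector<float>& src, size_t width) {
    size_t dstW = width / 8;
    std::vector<float> dst(dstW * dstW, 0.0f);
    for (size_t y = 0; y < width; ++y)
        for (size_t x = 0; x < width; ++x)
            dst[(y / 8) * dstW + (x / 8)] += src[y * width + x];
    return dst;
}

// Repeat until a single value remains; the caller then divides by the
// hemicube normalization factor.
float SumHemicube(std::vector<float> img, size_t width) {
    while (width > 1) {
        img = ReducePass(img, width);
        width /= 8;
    }
    return img[0];
}
```

A 512×512 unrolled hemicube collapses in exactly three passes (512 → 64 → 8 → 1), which matches the three GPU passes described above.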

This can be repeated, using the previous pass’s resulting lightmaps to generate the hemicubes. Each pass is a full radiosity bounce, so only 3 or 4 are necessary for maximum quality (beyond that, contributions become too small to be noticeable). The last step is to sum the direct lighting and all the indirect lighting passes into the final lightmaps used on the iPhone and iPad.

Here is an example, captured directly from the iPad, of a single bounce. Note that textures have been disabled to make the effect easier to see:

The iPhone version looks the same – except smaller and with a different aspect ratio. Note that the small lights on the floor are direct lights, but the effect on the walls and ceiling is caused by the indirect light. Here is an image from a previous post to compare:

Notice how much flatter the lighting is, with no reflected light on the walls and ceiling.

There are a variety of improvements I still plan on making, but I’m at the stage now where I need to start building some real environments in order to fully tweak the lighting. So next up I’ll start working on an area of the game (the second area you’ll see – the first one requires some additional technology). I hope to put together a gameplay prototype soon and then announce this project properly. 🙂

Universal App

April 4, 2010

I may not have mentioned this before, but I intend Control Protocol to be a universal app. The idea is to develop the game for all the iDevice platforms simultaneously, with scaled graphics and effects based on the platform. Now that I have my iPad, I was able to run the current app on the real device. Fortunately, because I properly handle resolution, everything worked with almost no code changes – though more will be necessary to load the correct assets and data.

In addition I’ve been working on the GPU-based radiosity solver for the PC level editor. Once that is done, I’ll work on the light grid, so that dynamic and instanced objects can be lit properly. Finally, I’ve started modeling the main character type; I’ll show that later once it’s further along.

Here are some screens from the test level on the iPad – the same test level as before until some more work gets done on the editor side:

Lightmaps Part II

March 27, 2010

Continuing my lightmap work from the last post, I now have lightmaps working in the iPhone app. I had to fix some texture coordinate issues, get multi-texturing working in the iPhone engine, and modify the shaders. I also forgot to uncomment some code when taking the last post’s pictures, so the lighting only took the attenuation into account, not the surface normals.

Lightmaps are applied with a 2× multiplier, which allows the lighting hotspots to actually brighten the textures and gives the surfaces more contrast.
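The 2× multiplier works like this (a per-channel sketch; in the engine this would be a line in the fragment shader, with the clamp to 1.0 happening implicitly in an LDR framebuffer):

```cpp
#include <algorithm>

// final = albedo * lightmap * 2: a lightmap value of 0.5 reproduces the
// texture exactly, while values above 0.5 brighten it, so hotspots can
// exceed the base texture color.
float ApplyLightmap(float albedo, float lightmap) {
    return std::min(albedo * lightmap * 2.0f, 1.0f);
}
```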

These screenshots were taken directly off the iPhone:

Next I have to add soft shadows and finally implement the radiosity solver.

Lightmaps Part I

March 25, 2010

I’ve been modifying the level editor to support lightmaps. The tool unwraps the level geometry while keeping a nearly constant texel-to-worldspace ratio, so that the edges of different faces have the same texel density – limiting errors and producing consistent quality throughout the level. The tool then packs the surfaces into a single lightmap texture per level. Right now I’m only rendering direct lighting into the lightmaps, using the GPU, but I’ll be adding full GPU-assisted radiosity support next, along with local light shadows, emissive textures (emissive surfaces will automatically emit light) and in-editor light painting.
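The constant texel-to-worldspace ratio amounts to sizing every face’s lightmap patch from its world-space extents against a single global density. A minimal sketch (the density value is an assumption, and a real unwrapper would also pack these patches into the atlas):

```cpp
#include <algorithm>
#include <cmath>

const float kTexelsPerUnit = 4.0f;  // assumed global lightmap density

// Size one axis of a face's lightmap patch from its world-space extent,
// so every face in the level ends up with the same texel density.
int PatchDim(float worldExtent) {
    return std::max(1, (int)std::ceil(worldExtent * kTexelsPerUnit));
}
```

Because adjacent faces derive their resolution from the same density, edges shared between faces see matching texel sizes, which is exactly what keeps the lighting quality consistent across seams.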

Here is the generated lightmap – 512×512 for now. It is a small test level; the extra space will be used by real levels.

All screenshots below are taken from the level editor. Lightmap support will be added to the game (on the iPhone, obviously) by the next update. Diffuse textures have been disabled so the lighting is clearly visible.

After rendering the lightmap, I apply a hole-filling pass to the texture (also done on the GPU); otherwise seams are visible. These are caused by limited rasterization precision when generating the lightmaps and by bilinear filtering reading outside the primitive when rendering the lightmapped surfaces.
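Hole filling of this kind is essentially a dilation pass. A CPU sketch of one such pass (the GPU version would do the same per-pixel work in a shader; using alpha as the coverage mask is an assumption consistent with the masking described earlier):

```cpp
#include <vector>

struct Texel { float r, g, b, a; };  // a > 0 means the texel is covered

// Each uncovered texel takes the average of its covered 8-neighbors, so
// bilinear filtering at primitive edges reads plausible colors instead of
// the black background.
void FillHoles(std::vector<Texel>& img, int w, int h) {
    std::vector<Texel> out = img;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            if (img[y * w + x].a > 0.0f) continue;
            float r = 0, g = 0, b = 0;
            int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    const Texel& t = img[ny * w + nx];
                    if (t.a > 0.0f) { r += t.r; g += t.g; b += t.b; ++n; }
                }
            if (n) out[y * w + x] = { r / n, g / n, b / n, 1.0f };
        }
    img = out;
}
```

Running it once grows every covered region by one texel; a couple of iterations is typically enough to cover the bilinear filter footprint.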

Lightmap seams (no hole filling):

After seam filling: