Tag Archives: 3d

Low-Poly Modelling Screencast!

It’s been a while since I last posted something to Ye Olde Blog, so time to fix that.

A few days ago someone linked to the low-poly work of Kenneth Fejer on Google+ and apart from being awesome it gave me the idea to try and play around with the style. Not only was it low poly work, but the textures had an almost pixel-art appeal to them which I found intriguing.

You see, I’m a programmer most of the time, but I do enjoy doing arty things, which some may say is unusual for a programmer; when you’re making all of this stuff on your own, it certainly helps. I don’t think my models are the best by any stretch of the imagination, but they’re certainly passable. The one thing I’ve never been satisfied with is my ability to texture them, which is why this pixel-art style appeals to me: it’s a lot easier (at least for me) to create pixel art. Continue reading

Retina Display, Open GL, and You!

Over the long weekend, I took some time to code support for the iPad and iPhone 4’s Retina Display into Red Nova. It was a fairly painless process, after Paul Pridham pointed me in the right direction.

If you are using OpenGL for your app on the iPhone and an orthographic projection for your 2D bits, you shouldn’t have to change much of your code to get it “Retina Ready”.

Setting the View Scale Factor

First, in your Open GL view class, add the following to your init code to tell the OS that you want to display your graphics at 960×640:

if([[UIScreen mainScreen] respondsToSelector: NSSelectorFromString(@"scale")])
	if([self respondsToSelector: NSSelectorFromString(@"contentScaleFactor")])
		self.contentScaleFactor = [[UIScreen mainScreen] scale];

If you run your app at this point you should see the content running in the corner of the screen because your GL viewport is still running at 480×320. So that brings us to…

Setting up the Open GL Viewport

Note: If you haven’t already, you may want to make a wrapper function that returns the current device scale factor. I did, and it’s very useful in any code that needs to display your graphics at the right size.

Anyway, this is the next and pretty much final step.

Find your call to glViewport and modify it thusly, using the scale factor wrapper from the note above (I’ll call it deviceScaleFactor() here):

	glViewport(0, 0, 480 * deviceScaleFactor(), 320 * deviceScaleFactor());

Obviously change the width and the height to correspond to your app’s layout.

If the rest of your code sets up your projections (2D and 3D) based on 480×320, then when you compile and run your app you should get glorious retina display goodness! You will of course have to adjust your 2D bitmapped assets (fonts and images) to reflect the higher resolution of the display, but the end result is that your app thinks in 480×320 (as far as I can tell, this is how Apple manages it with Cocoa Touch, amusingly enough) but displays at 960×640!

Blender Export Scripts

Inspired by the idea behind iDevBlogADay I have decided to continue to blog more, as I enjoy writing and sharing useful (and not so useful) information.

Today I’m going to talk about adapting Blender for use in game development, specifically for exporting textured 3D models from Blender for use in our games.

I am assuming you know how to make Blender work and have created a mesh with a UV texture map; the problem you’re having is specifically getting that mesh out of Blender. You could of course use one of the built-in exporters, but this way is more educational, and sometimes it is good to have control over the format your mesh data is stored in. The file format described here is pretty basic, but hopefully it gives you enough working knowledge of Blender’s guts to use as a springboard to create awesomer things. One other thing to take into consideration is that Blender and OpenGL use slightly different coordinate spaces, but that is not hard to work around.
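As a quick illustration of that coordinate difference: Blender is Z-up, while OpenGL scenes are conventionally Y-up, so one common fix is to swizzle each vertex on export. Here's a minimal sketch in plain Python (the function name and the particular axis convention are mine; pick whatever matches your engine):

```python
def blender_to_opengl(x, y, z):
    """Convert a Z-up Blender coordinate to a Y-up OpenGL one.

    One common convention: Blender's Y (depth) becomes OpenGL's -Z
    (into the screen), and Blender's Z (up) becomes OpenGL's Y.
    """
    return (x, z, -y)

# e.g. blender_to_opengl(1.0, 2.0, 3.0) -> (1.0, 3.0, -2.0)
```

You could apply this to each vertex (and normal) as you write it out, or do the swizzle at load time instead; either way works as long as you’re consistent.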

Anyway, the language of choice for Blender is Python. If you like tabs, you will like Python. Much like how LISP was invented by parenthesis fetishists, I suspect that Guido van Rossum has a thing for tabs.

So, let’s get crackin’! Fire up your favorite text editor and do the following:

#!BPY
"""
Name: 'MyMeshExport'
Blender: 248
Group: 'Export'
Tooltip: 'Export a MyMesh File'
"""
import Blender
import bpy

What this does is tell Blender that it’s a Blender Python script named “MyMeshExport” that runs on at least Blender 2.48 (you can change this as you see fit) and goes in the “Export” group. The imports hook into a bunch of Blender-specific stuff, including the scene.

Next, we’re going to specify the “write” function for the script. This is what gets run when you pick this script from the export menu.

def write(filename):
	out = file(filename, "w")
	sce = bpy.data.scenes.active
	ob = sce.objects.active
	mesh = ob.getData(mesh=1)

What this does is open a file for writing and hook it to out, take the currently active scene and assign it to sce, take the currently active object and assign it to ob, and then assign that object’s mesh data to mesh. This assumes you selected your object before exporting, and that the object only has one mesh.

Next we’ll write how many vertexes and faces our mesh has into the file. This will make life easier for our file loader:

	out.write('%i Vertexes\n' % (len(mesh.verts)))
	out.write('%i Faces\n' % (len(mesh.faces)))

Next we insert vertex co-ordinates:

	for vert in mesh.verts:
		out.write('v %f %f %f\n' % (vert.co.x, vert.co.y, vert.co.z))

This goes through each vertex in the mesh and inserts “v x y z” where x, y, and z represent the co-ordinates of the vertex.

Finally we write in information for each face in the mesh, including what vertexes make up the face, the vertex normals for that face, and the UV coordinates.

	for face in mesh.faces:
		out.write('f')
		for vert in face.v:
			out.write(' %i' % (vert.index))
		out.write('\n')
		# One normal line per vertex (this assumes triangulated faces):
		out.write('n %f %f %f\n' % (face.no[0], face.no[1], face.no[2]))
		out.write('n %f %f %f\n' % (face.no[0], face.no[1], face.no[2]))
		out.write('n %f %f %f\n' % (face.no[0], face.no[1], face.no[2]))
		for uv in face.uv:
			out.write('uv %f %f\n' % (uv[0], uv[1]))
	out.close()

Blender.Window.FileSelector(write, "Export")

This is where the per-polygon information is spat out into your file. For each face it outputs “f v1 v2 v3” where v1, v2, and v3 are the indexes of the vertexes we exported previously.

Next we output the normals. In this simple example we’re going to assign the face normal to each of the vertex normals, because whatever Blender thinks the “smooth” per-vertex normals are has no basis in reality whatsoever. If you want smooth normals, you will have to compute them yourself. The normals are encoded as “n fn1 fn2 fn3” on three separate lines (one per vertex), where fn1, fn2, and fn3 are the components of the face normal.
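If you do decide to compute smooth normals yourself, the usual approach is to average the normals of every face that touches a vertex and then renormalize. Here’s a rough sketch of that idea in plain Python (not the Blender API; the function name and data layout are mine):

```python
import math

def smooth_normals(verts, faces, face_normals):
    """Average each vertex's adjacent face normals, then renormalize.

    verts:         list of (x, y, z) vertex positions (only the count is used)
    faces:         list of vertex-index tuples
    face_normals:  one (x, y, z) normal per face
    """
    acc = [[0.0, 0.0, 0.0] for _ in verts]
    for face, normal in zip(faces, face_normals):
        for vi in face:
            for axis in range(3):
                acc[vi][axis] += normal[axis]
    result = []
    for n in acc:
        # Guard against zero-length sums (e.g. an unused vertex).
        length = math.sqrt(n[0]**2 + n[1]**2 + n[2]**2) or 1.0
        result.append((n[0] / length, n[1] / length, n[2] / length))
    return result
```

A fancier version would weight each face’s contribution by its area or by the corner angle at the vertex, but a plain average already looks far better than Blender’s values did for me.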

Finally we output the UV coordinates of the texture map. For each vertex in the face we get “uv uv1 uv2” where uv1 and uv2 are the UV coordinates of that vertex.
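Putting all of that together, a single textured triangle exported this way would come out looking something like this (the values are illustrative):

```
3 Vertexes
1 Faces
v 0.000000 0.000000 0.000000
v 1.000000 0.000000 0.000000
v 0.000000 1.000000 0.000000
f 0 1 2
n 0.000000 0.000000 1.000000
n 0.000000 0.000000 1.000000
n 0.000000 0.000000 1.000000
uv 0.000000 0.000000
uv 1.000000 0.000000
uv 0.000000 1.000000
```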

The last line in the file tells Blender to open its File Selector window and call ‘write’.

Once you save your file with an imaginative name like ‘myfileexport.py’, you can copy it to wherever Python keeps its scripts. This is different under different platforms, obviously, though you can get that information here, which, incidentally, I discovered after writing all this is practically identical to my post, except I cover exporting UV coordinates. Possibly because this is where I learned all this over a year ago and subsequently forgot. Mea culpa.

Anyway, let’s finish this up. You’ve put your script into your scripts folder; now you need to tell Blender it’s there. Open the Scripts window and choose Scripts > Update Menus. Then go back to your scene, select your object, and go to File > Export > MyMeshExport to save your new mesh for inclusion in your game.
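To round things off, here’s a sketch of what the loader side might look like for this format (plain Python, nothing Blender-specific; the function name is mine, and it assumes a well-formed file):

```python
def load_mymesh(lines):
    """Parse the exporter's output into vertex, face, normal and UV lists.

    lines: any iterable of text lines, e.g. an open file object.
    """
    verts, faces, normals, uvs = [], [], [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == 'v':
            verts.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == 'f':
            faces.append(tuple(int(p) for p in parts[1:]))
        elif parts[0] == 'n':
            normals.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == 'uv':
            uvs.append(tuple(float(p) for p in parts[1:3]))
        # The "N Vertexes" / "N Faces" header lines fall through here;
        # a real loader (in C, say) would use them to preallocate buffers.
    return verts, faces, normals, uvs
```

In an actual game you’d load this into flat arrays suitable for gl*Pointer calls, but the parsing is the same idea.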

Optimizing for OpenGL ES

If you’re trying to get your OpenGL ES code to run really fast on the iPhone, you have to take one very important piece of information into account: writing to memory on the iPhone is expensive. To that end, I just gained another 10+ fps in my 3D engine by precomputing vertex, normal, and texture coordinate buffers for all my 3D models instead of writing to a generic buffer in my drawing loop as I had been doing. So when I make my gl*Pointer calls, I simply point at the buffers in my model object.

It means I’m using up a tiny bit more memory per model (not per object, as they reference the models) but the performance increase is amazing and totally worth it.
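The shape of that change, sketched in Python for brevity (the real code is C and OpenGL ES; the class and names are mine):

```python
class Model:
    """Flattens vertex data once at load time instead of every frame."""

    def __init__(self, verts, faces):
        # Precompute the flattened buffer ONCE, up front. On the iPhone
        # this is exactly the memory write we want out of the drawing
        # loop; glVertexPointer can then point straight at this buffer.
        self.vertex_buffer = []
        for face in faces:
            for vi in face:
                self.vertex_buffer.extend(verts[vi])

def draw(model):
    # The per-frame "draw" only reads the prebuilt buffer; no writes.
    return len(model.vertex_buffer) // 3  # vertices submitted (3 floats each)

# One triangle, flattened once at load time:
m = Model([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], [(0, 1, 2)])
# draw(m) -> 3
```

Multiple objects sharing the same Model just reference the same buffer, which is why the memory cost is per model rather than per object.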


Well, I got my iPod Touch on Wednesday, so I spent about an hour getting a provisioning profile from Apple and setting up Xcode to work with it. I started an Open GL ES project, copied my 3D engine code into it, and just kept beating it with a sledgehammer until it compiled. It was actually a fairly easy process overall.

I had already replaced all my immediate mode OpenGL calls with vertex arrays, but I discovered much to my horror that I was still using some unsupported calls. I cleaned that up and I figured out how to install GLU for the iPhone in the simulator and iPhone OS SDKs as well.

Then I realized that the iPhone’s GPU only supports power-of-two textures… oh, great shades of 1998! I consulted the intertubes and found out how to use some GraphicsServices calls to scale my textures to a more appropriate size. This is in no way optimal, mind you! Once everything is working properly as is, I’ll get a start on optimizing my textures and the like, but this is a great start.

The next night I decided to start running my game code on a thread instead of using the timer method, as the timer creates a lot of overhead. If you end up having trouble with this, I discovered that for some reason you should initialize your GL buffers in the main thread instead of the game thread; I’m thinking it’s some sort of resource synchronization issue between the main thread and the forked thread.

So right now I sit with a working 3D engine, and my next step is to capture touch input on the main thread and start pushing it into an event queue for my game thread to nom on. I suspect I will need a blocking mechanism while the game thread is doing its thing.
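The queue I have in mind is the standard producer/consumer shape. Sketched in Python for brevity (the real thing will be Objective-C/C, and all the names here are mine), with the blocking and locking handled by the queue itself:

```python
import queue
import threading

events = queue.Queue()  # thread-safe; handles the locking for us

def main_thread_touch(x, y):
    # Producer: the UI thread pushes a touch event and returns immediately.
    events.put(('touch', x, y))

def game_thread_step():
    # Consumer: the game thread drains whatever input has arrived
    # since the last frame, without blocking if the queue is empty.
    handled = []
    while True:
        try:
            handled.append(events.get_nowait())
        except queue.Empty:
            break
    return handled

# Usage: a touch arrives from another thread, then one locally;
# the game thread picks both up in order on its next step.
t = threading.Thread(target=main_thread_touch, args=(10, 20))
t.start()
t.join()
main_thread_touch(3, 4)
```

In the C version the same idea is a mutex-protected linked list, with the game thread optionally waiting on a condition variable when it wants to block for input.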