Feb 22, 2011

Progress Report: Windowing System

At the moment I'm working on the windowing system for the UI. Simple things like Buttons and Checkboxes and all the other standard UI controls we take for granted in Windows or other platforms need to be developed from scratch when you're working outside the platform's standard UI framework, which is exactly what happens when you use a graphics library like DirectX or OpenGL.

So far I've implemented the Window control and the Button control, which is a surprising amount of work once you have to develop a control hierarchy and a way for controls to receive mouse up/down events. It's not overly complex work, but it's one of those situations where you need to put in a whole lot of work before you can get anything remotely functional.
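
As a rough illustration of what that involves, here's a minimal sketch of a control hierarchy with mouse event routing. The classes are hypothetical and not Bulldog's actual code; the idea is simply that each control keeps a list of children and forwards mouse-down events to whichever child contains the point.

using System.Collections.Generic;
using System.Drawing;

public class Control
{
    public Rectangle Bounds;                          // position relative to the parent
    public List<Control> Children = new List<Control>();

    // Returns true if the event was handled somewhere in the hierarchy.
    public virtual bool OnMouseDown(Point p)
    {
        foreach (var child in Children)
        {
            if (child.Bounds.Contains(p))
            {
                // Translate into the child's local coordinates and recurse.
                var local = new Point(p.X - child.Bounds.X, p.Y - child.Bounds.Y);
                if (child.OnMouseDown(local))
                    return true;
            }
        }
        return false;                                 // a plain control ignores the click
    }
}

public class Button : Control
{
    public override bool OnMouseDown(Point p)
    {
        // A real button would fire a Click event here.
        return true;
    }
}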

Hand in hand with the windowing system is the skinning system. Since we're dealing with a 3D world, at its simplest level the visual aspect of a control is a mesh that you then slap a texture on. The skinning system manages that process in such a way that you can resize the control without having to redesign the texture in a paint program and still have it look correct.

The textures I'm currently using for the windowing system are 64x64 pixels, and an example of one I'm using for window controls is below:

Source texture for window control

If you take this window texture and naively stretch it to the size of a 300x200 pixel window this is the result:

EWWWW... UGLY!

Simply stretching the image doesn't look very good at all. We could solve the problem by redrawing the window texture at 300x200 in Photoshop, and the result would look nice, but then we'd need to draw a texture for every window that might pop up in the application, and we'd lose the ability to dynamically size a window based on its content without running into the same issue.

A solution to this problem is to break the original image down into a 3x3 grid and then tile each component (corner, edge, centre) using different methods. Stretching is still an issue for the edge and centre parts if the texture is not homogeneous in the direction of stretching, so instead of stretching those parts, I tile them. The layout/tiling scheme works as follows:

  • Corners: no tiling.
  • Edges: tiling in 1 axis only.
  • Centre: tiling in both axes.


Tiling Scheme (image scaled to 4x normal size)

In my windowing system I use control points located at (15,15) and (47,47), meaning the edges are all 16 pixels and the centre is a 32x32 pixel block.
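
For reference, here's a small sketch of how those two control points could split the 64x64 skin into the nine source rectangles. The helper class is hypothetical; only the control point values come from above.

using System.Drawing;

public static class NineSliceSource
{
    const int C1 = 15;     // first control point
    const int C2 = 47;     // second control point
    const int Size = 64;   // skin texture size

    public static Rectangle[,] SourceRects()
    {
        // Column/row boundaries derived from the control points:
        // 0..15, 16..47 and 48..63.
        int[] x = { 0, C1 + 1, C2 + 1, Size };
        int[] y = { 0, C1 + 1, C2 + 1, Size };

        var rects = new Rectangle[3, 3];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                rects[row, col] = Rectangle.FromLTRB(x[col], y[row], x[col + 1], y[row + 1]);

        return rects;   // corners are 16x16, edges 16x32 / 32x16, the centre 32x32
    }
}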

Internally, the skinning system actually breaks the source texture into 4 sub-textures so that each part can tile correctly; otherwise the tiling would repeat the entire texture rather than just the part we want. In addition, as a consequence of the 3x3 layout, the mesh for each control is more complex, consisting of 18 triangles instead of just two.
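
To make that concrete, here's a hypothetical sketch (not the engine's actual code) of the destination layout for a resized control such as the 300x200 window above. Each of the nine cells is drawn as a quad of two triangles, which is where the 18 triangles come from, and the tileable cells simply let their texture coordinates run past 1.0 so the sampler's wrap mode repeats their sub-texture.

using System.Drawing;

public static class NineSliceLayout
{
    const int Corner = 16;   // corner size taken from the 64x64 skin

    public static Rectangle[,] DestRects(int width, int height)
    {
        // Corners keep their size; edge and centre cells absorb the rest.
        int[] x = { 0, Corner, width - Corner, width };
        int[] y = { 0, Corner, height - Corner, height };

        var cells = new Rectangle[3, 3];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                cells[row, col] = Rectangle.FromLTRB(x[col], y[row], x[col + 1], y[row + 1]);

        return cells;   // 9 quads x 2 triangles = 18 triangles per control
    }

    // How many times a tileable cell repeats its tile in one axis, e.g. the
    // centre of a 300x200 window is 268 pixels wide over a 32x32 tile,
    // which is 8.375 repeats.
    public static float Repeats(int destSize, int tileSize)
    {
        return (float)destSize / tileSize;
    }
}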

Below is an example of the finished result as it would appear in the app:

The tiled window.

The beauty of having the look of a control derived from a simple 64x64 texture is that it's relatively easy to alter the look of the entire UI of the application, which is a great benefit when you're in the early stages of development and haven't settled on a final look and feel.

As I mentioned at the start of the post, I've gotten the Button and Window controls working, and will probably spend a bit more time tidying them up and then move on to drag and drop functionality, which is necessary for the inventory screen.

Feb 15, 2011

Development Schedule and Bad Lists

As I mentioned in a previous post I hope to have something releasable by the end of the year. In order for that to happen I've had to cut features.

These features aren't lost forever; they have simply been moved from the "essential" list to the "wishlist". As the name suggests, the essential items are absolutely required for Bulldog to be playable, whereas the wishlist items are all the cool and interesting features that I want to include to make Bulldog stand out and be fun to play, but that will delay the first releasable version.

My plan from day one has been to release as early as possible and then release updates as time goes by, rather than work in isolation for 5 years and try to suddenly release something amazing. I'm hoping that the first release will show enough promise and get people sufficiently interested in what I'm trying to achieve in the long term, that I can build up a community around Bulldog, which will help guide the development and let me know what works and doesn't. Well, that's the plan at least. :)

Since the wishlist items are a long way off, I'll ignore them for now and focus on the "essential" list, which looks as follows:

//TODO: Inventory Screen / Management
//TODO: Food & Water
//TODO: Danger.. ie Mobs
//TODO: Title Screen
//TODO: Save/Load World & Zones - partially implemented.
//TODO: Character Models
//TODO: items spawning + management
//TODO: Item Models/graphics
//TODO: Windowing System
//TODO: Picking System
//TODO: Ability to move character
//TODO: Ability to issue orders
//TODO: Ability to attack stuff
//TODO: Ability to loot stuff
//TODO: Mobs models
//TODO: Mobs AI

It doesn't look like many items, but each of them actually has many sub-tasks that need to happen before it can be completed. For example, the first entry about the inventory screen will require that I complete the windowing system, then design and build the inventory screen, implement the ability to drag and drop items into containers, come up with and design in-game items, and design the graphics for the icons being used.

Interestingly, I just noticed that since I wrote that list over two months ago I haven't been able to cross any items off the list. I've actually made progress on Bulldog, but on supporting tasks, which aren't captured in that list.

I've been told that you're more likely to be motivated to complete a list of tasks if you can see some tangible progress on the list. The idea is that you take a big task and break it down into more manageable sub-tasks, which you can tick off and feel good about your progress. So for example, you're more likely to finish cleaning your house if the task list has items such as "vacuum the floor", "wash the windows", "do the dishes", etc. than just "clean the entire house".

So it's pretty obvious to me now that the list above is a bad list. Not just bad for motivation, but also bad for tracking progress on the project overall. If I wanted to estimate my overall progress and come up with a percentage-complete figure, I'd pretty much have to take a guess based on what I've done. I can't actually use any of what is still left to be done to help guide my estimate, as the list is woefully vague on the actual details. The end result is likely to be a false impression of how much work is left, which may lead to schedule slip.

So I think my next task is to go away and break down the list into more manageable tasks, which will also serve the purpose of forcing me to think about and define what each of those tasks actually entails.

Feb 1, 2011

Procedural Texture Generation

As I mentioned in the introductory post, Bulldog is going to feature procedural content. This will cover things like terrain, random encounters and even textures.

At present, the only procedural content I've built for Bulldog is a texture generator class which generates textures from pre-defined templates. These templates specify a series of operations performed sequentially to produce the desired output. The generator is currently fairly naive and produces rudimentary results, but I hope to improve its capabilities over time with more advanced synthesis techniques.
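
As a rough sketch of what such a template might look like (the names and structure here are mine, not necessarily the actual class), a template can be little more than an ordered list of operations that each transform the pixel buffer in place:

using System;
using System.Collections.Generic;

public struct HslPixel
{
    public float H, S, L;
}

public class TextureTemplate
{
    // Each operation receives the pixel buffer and a random source.
    readonly List<Action<HslPixel[,], Random>> operations =
        new List<Action<HslPixel[,], Random>>();

    public TextureTemplate Then(Action<HslPixel[,], Random> operation)
    {
        operations.Add(operation);
        return this;   // allows templates to be built up fluently
    }

    public HslPixel[,] Generate(int size, Random rng)
    {
        var pixels = new HslPixel[size, size];
        foreach (var operation in operations)   // run the steps in order
            operation(pixels, rng);
        return pixels;
    }
}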

I've found that RGB is not a particularly good colour system for texture modelling, as you rarely want to adjust the redness, blueness or greenness of a particular pixel. You're more likely to want to adjust the brightness or some other perceptual property. So I use the HSL colour system, which consists of Hue, Saturation and Lightness. This allows me to adjust the texture in a much more "natural" way.
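
Since the texture ultimately has to be converted back to RGB before it can be displayed, a conversion like the following is needed at the end of the pipeline. This is the standard textbook HSL-to-RGB formula, not code lifted from Bulldog; hue is in degrees, saturation and lightness are in the 0-1 range.

using System;

public static class HslColour
{
    public static (byte R, byte G, byte B) ToRgb(float h, float s, float l)
    {
        float c = (1f - Math.Abs(2f * l - 1f)) * s;            // chroma
        float x = c * (1f - Math.Abs((h / 60f) % 2f - 1f));
        float m = l - c / 2f;

        float r, g, b;
        if      (h < 60)  (r, g, b) = (c, x, 0f);
        else if (h < 120) (r, g, b) = (x, c, 0f);
        else if (h < 180) (r, g, b) = (0f, c, x);
        else if (h < 240) (r, g, b) = (0f, x, c);
        else if (h < 300) (r, g, b) = (x, 0f, c);
        else              (r, g, b) = (c, 0f, x);

        return ((byte)((r + m) * 255), (byte)((g + m) * 255), (byte)((b + m) * 255));
    }
}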

As an example of how the texture generation process works, consider a sand texture. Sand is a granular solid made up of tiny bits of rock. The grains are vaguely uniform in size, but vary in colour. I've modelled this in my texture generator by taking a sand-like base colour, then applying random noise to each pixel to adjust the hue, saturation and brightness accordingly. The image below shows the different parts that are generated by the texture generator class and combined to form the final texture.
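
A hypothetical version of that sand step might look like the following; the base colour and noise amplitudes are made up for illustration, and nextGaussian stands for any source of normally distributed values with mean 0 and standard deviation 1, such as the generator sketched at the end of this post.

using System;

public static class SandOp
{
    public static void Apply(float[,] h, float[,] s, float[,] l, Func<double> nextGaussian)
    {
        const float baseH = 45f, baseS = 0.35f, baseL = 0.70f;   // sandy yellow base colour

        for (int y = 0; y < h.GetLength(1); y++)
        {
            for (int x = 0; x < h.GetLength(0); x++)
            {
                h[x, y] = baseH + (float)nextGaussian() * 4f;     // slight hue variation
                s[x, y] = baseS + (float)nextGaussian() * 0.03f;  // slight saturation variation
                l[x, y] = baseL + (float)nextGaussian() * 0.05f;  // per-grain brightness variation
            }
        }
    }
}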


Uniform vs Gaussian noise

The random noise function I used in the examples above is called Gaussian Noise. This is noise where the probability distribution follows the "normal" distribution. To generate Gaussian Noise, you need a random number generator that will produce random numbers that are biased in such a way that they follow the normal distribution, rather than a uniform distribution.

The random number generator in .NET (found in System.Random) is typical of most random number functions in modern languages in that it returns random numbers with a uniform probability distribution. This means that each value in the chosen output range has an equal probability of being returned. If you were to graph the probability of each value in the range appearing, the result would be pretty much a horizontal line.

The "normal" probability distribution on the other hand, has a symmetric, bell-shaped curve. Values around the mean have a higher probability of being output, and values at the very edges of the range have a much lower probability. I won't go into any more depth on the topic, for further reading look at something like
this wikipedia article.

Below are two examples of noise generated using Uniform and Gaussian Noise:

Uniform Noise / Gaussian Noise

At first glance it appears as though the uniform noise has a wider range, as it has significantly darker and lighter pixels than the Gaussian noise. However, if you look closely at the Gaussian noise you can see that there are some pixels that are quite dark and some that are quite light, but the majority are somewhere in between, i.e. clustered around the mean value. Gaussian noise is a closer match for real-world applications, as the normal distribution appears very frequently in the real world.

For those who are interested, to generate the Gaussian random numbers I wrote a random number generator based on the Box-Muller Transform. It uses a standard uniformly distributed random number generator and then remaps the values so the output follows the normal distribution. The actual implementation I chose uses the Marsaglia Polar Method, which eliminates the need for the sin() and cos() functions.

An interesting quirk of this algorithm is that it needs 2 independent uniform random numbers to work, but it also produces 2 independent normally distributed random numbers as output. So in my implementation I cache the second value and return it on the next call without needing any further computation.
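
A generator along those lines might look roughly like this; the class name and API are mine, but the rejection loop and the cached second value follow the description above.

using System;

public class GaussianRandom
{
    readonly Random uniform = new Random();
    double? cached;                    // the spare value from the last pair

    // Returns a normally distributed value with mean 0 and standard deviation 1.
    public double NextGaussian()
    {
        if (cached.HasValue)
        {
            double spare = cached.Value;
            cached = null;
            return spare;              // second value of the pair, no maths needed
        }

        double u, v, s;
        do
        {
            // Two independent uniform values in (-1, 1), rejected until they
            // fall inside the unit circle (and aren't both zero).
            u = uniform.NextDouble() * 2.0 - 1.0;
            v = uniform.NextDouble() * 2.0 - 1.0;
            s = u * u + v * v;
        } while (s >= 1.0 || s == 0.0);

        double factor = Math.Sqrt(-2.0 * Math.Log(s) / s);
        cached = v * factor;           // cache the second value for the next call
        return u * factor;
    }
}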