Quick Update

Been a while since I’ve posted anything on this blog. So here’s a quick update about what’s been going on recently:

  • Finished my final project in school, “Earthrune: The Demae’n Wildwood” (website pending @ http://earthrune.simiansoftwerks.com)
  • Graduated from DigiPen Institute of Technology – Singapore with the honor of representing the school as its Valedictorian of the class of 2014
  • Got hired by Ubisoft Singapore as a Junior Programmer, currently specializing in Engine Programming
  • Shipped my first AAA title (albeit joining the project *quite* late): Assassin’s Creed Rogue
  • Living the game development dream!

As for my current activities, I’ve been experimenting with a lot of engine programming techniques I’ve been picking up while studying the Assassin’s Creed codebase as an educational exercise. I’m itching to either complete a usable engine or join an open-source game engine project (maybe Ogre?).

I probably won’t update this blog as “frequently” as I used to, although I do have a lot more free time now, so maybe I should? Let’s see what the future has in store first!

Whoo.. Progress update!

It’s been a lonnnnnng time since I’ve posted anything. Studying at DigiPen really takes all the time away from you. :(

Awesome HDR with tone mapping (Uncharted 3), also featuring gamma correction

However, since I’m free now, here are some screenshots from my latest game project. :D We’re really aiming big with this game; real hardcore “current” gen stuff (not “next” gen, mind you).

This game is still in development and I’m sure it’ll be awesome when it’s done. I played the role of Technical Director/Graphics Programmer on the project, leading all technical development with an iron grip (heh heh). But I am truly blessed with an awesome team that works fantastically together. Thanks for everything, guys!

Swarm Trailer

Check out the game I’ve just completed! It was a submission for a school project and the result of 3 months of hard work!

A Robust C++ Interpolation Animation System

I’ve recently spent a great deal of time working on a 2D renderer/game engine as a hobby, and came across the need to add interpolation to my animation system. Studying other animation solutions like GreenSock (for Flash) and Tweener, I was inspired by their relative ease of use: hash tables to modify attributes of the interpolated object and specifying the tweening effect as a parameter make tweening a quick and painless process.

e.g. in GreenSock:

TweenLite.to(mc, 1, {x: 100, y: 100, ease: Back.easeOut})

Of course, such a solution is not possible to replicate completely in C++ due to its limited support for reflection. But before I begin discussing a viable solution for the C++ language, here’s a basic explanation of what interpolation is! According to Wikipedia, interpolation is a method of constructing new data points within the range of a discrete set of known data points. As far as multimedia/real-time programming is concerned, interpolation is mainly used to generate the in-betweens of two points for animation subsystems (or anything that requires some kind of movement, really). Extrapolation is an extension of interpolation where points are calculated beyond the range of known values; it is often used for predictive movement algorithms in online games and, unfortunately, is beyond the scope of this article.

Now, the most basic interpolation formula goes as such:

Position = Initial Position + (Final Position - Initial Position) * Interpolation Range
where Interpolation Range is a value from 0 to 1.

The rate at which the Interpolation Range changes controls what “type” of interpolation it is. For example, if the Interpolation Range increases steadily (a constant increase), the interpolation is described as linear. The Interpolation Range is usually driven by time, hence producing the effect of animation (movement over time).

Here are some visualizations of interpolation range:
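To make the difference concrete, here’s a tiny standalone sketch (my own toy illustration, not part of the system described below) that evaluates the formula from 0 to 100 with a linear range versus a quadratic (ease-in) range:

#include <cstdio>

int main()
{
    const float start = 0.0f;
    const float end = 100.0f;

    for (float range = 0.0f; range <= 1.0f; range += 0.25f)
    {
        float linearPos    = start + (end - start) * range;           // constant rate of change
        float quadraticPos = start + (end - start) * (range * range); // starts slow, speeds up

        std::printf("range %.2f -> linear %6.2f, quadratic ease-in %6.2f\n",
                    range, linearPos, quadraticPos);
    }

    return 0;
}

At range 0.5 the linear version is already at 50 while the quadratic ease-in is only at 25, which is exactly the “slow start” feel you expect from an ease-in.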

With that explanation out of the way, let’s get into the real implementation details! The system relies heavily on the use of function pointers to specify the interpolation type. This method produces:

  • clean and simple looking code
  • a unified interface for interpolation
  • easily extensible framework for user animations

Firstly, the function pointer declaration:

typedef float (*AnimationType)(float);

We will implement the different types of animation using the AnimationType declaration as the function signature. The parameter represents the unmodified time value (0-1), and the return value is the modified time value (the interpolation value). This gives us a common prototype for defining functions that calculate interpolation based on time.

The next step in implementing the system is to come up with some basic interpolation functions (using the AnimationType prototype).

float linear(float time)
{
    return time;
}

float quadraticIn(float time)
{
    return time * time;
}

float quadraticOut(float time)
{
    return time * (2 - time);
}

// you can simply add more types here in the future and not feel the repercussions!
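For example, here are a couple of extra easing curves written against the same AnimationType prototype. These are my own additions, not part of the set above, and the sine version assumes <cmath> is included:

float cubicIn(float time)
{
    // even slower start than quadraticIn
    return time * time * time;
}

float sineInOut(float time)
{
    // eases in and out using half a cosine wave: 0 -> 0, 0.5 -> 0.5, 1 -> 1
    return 0.5f * (1.0f - std::cos(time * 3.14159265f));
}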

On their own, these AnimationType functions are not very useful, so the next step is to create the unified interface that uses them.

float interpolate(AnimationType animationType, float start, float end, float time)
{
    return start + (end - start) * animationType(time);
}

// additionally you can choose to add helpers for commonly interpolated types such as vectors and colors:
Vector3 interpolate(AnimationType animationType, const Vector3& start, const Vector3& end, float time)
{
    return Vector3(
        interpolate(animationType, start.x, end.x, time),
        interpolate(animationType, start.y, end.y, time),
        interpolate(animationType, start.z, end.z, time));
}

Color interpolate(AnimationType animationType, const Color& start, const Color& end, float time)
{
    return Color(
        interpolate(animationType, start.r, end.r, time),
        interpolate(animationType, start.g, end.g, time),
        interpolate(animationType, start.b, end.b, time),
        interpolate(animationType, start.a, end.a, time));
}

And lastly, using the system would be as simple as converting your current time to a range of 0 – 1 (currentTime/totalTime) and then passing in the various parameters:

float time = currentTime/totalTime;
Fireball.position = interpolate(&linear, Dragon.position, Hero.position, time);
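If you want the animation to drive itself frame by frame, one possible wrapper is a small tween struct that accumulates elapsed time and clamps the normalized value. This is just a rough sketch of my own (the Tween name, its fields and the dt parameter are made up for illustration), built on top of the interpolate helpers above:

struct Tween
{
    AnimationType type;
    Vector3 start;
    Vector3 end;
    float elapsed;
    float duration;

    Vector3 update(float dt)
    {
        elapsed += dt;

        float time = elapsed / duration;
        if (time > 1.0f)
            time = 1.0f; // clamp so we never overshoot the end value

        return interpolate(type, start, end, time);
    }
};

// usage (each frame):
// Tween fireballTween = { &quadraticOut, Dragon.position, Hero.position, 0.0f, 2.0f };
// ...
// Fireball.position = fireballTween.update(dt);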

Hope this has been useful! Feel free to comment on how this system can be improved in the comments section. :)

PCSX Android – CD-ROM and UI

Here’s another quick update on the status of PCSX Android. I have been busy at work on the emulator for the past few days and have managed to create a simple ISO CD-ROM plugin! The plugin loads and runs correctly in the PC version of PCSX and has even been reported to work correctly in the Android version! :D

I’ve also finished my own implementation of a file browser (to select the ISO file) so that the CD-ROM plugin can load it.

The next step in porting the emulator will be to port the P.E.Op.S. OpenGL plugin… but first, I’ve got to familiarize myself with setting OpenGL up on Android (which is kinda weird, since the initialisation code lives in the SDK, not the NDK).
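For anyone curious about that split: the GL surface/context has to be created on the Java (SDK) side, typically with a GLSurfaceView.Renderer, and its callbacks then cross into native code through JNI. Here’s a rough sketch of what the native (NDK) side of that bridge could look like; the package and class names (com.example.pcsx.NativeRenderer) are entirely made up for illustration and are not the actual project layout:

// hypothetical native side of the bridge, built with the NDK
#include <jni.h>
#include <GLES/gl.h>

extern "C" {

JNIEXPORT void JNICALL
Java_com_example_pcsx_NativeRenderer_nativeInit(JNIEnv* env, jobject obj)
{
    // one-time GL state setup; the context itself was created on the Java side
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

JNIEXPORT void JNICALL
Java_com_example_pcsx_NativeRenderer_nativeDrawFrame(JNIEnv* env, jobject obj)
{
    // called from GLSurfaceView.Renderer.onDrawFrame on the Java side;
    // this is where a GPU plugin's draw path would eventually hook in
    glClear(GL_COLOR_BUFFER_BIT);
}

} // extern "C"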

Lastly, I am looking for a skilled assembly programmer who can write ARM assembly, preferably with knowledge of Android and how to compile assembly to work with the NDK. Knowing how emulators work will be a plus too! If you’re interested in helping to port PCSX to Android, feel free to contact me at ze_bransay(at)hotmail(dot)com.

p.s. I’ve set up a Google Code project at: http://code.google.com/p/pcsx-android!

PCSX Android Update

It has come to my attention that people have actually noticed my efforts to port PCSX to Android. Let me remind the community that this is still only a hobby project until I can get more people to help/support it! I am an intermediate programmer (nowhere near as good as the people who actually programmed PCSX) and, by myself, will not be able to quickly port the emulator or solve every performance issue that might crop up.

So here’s an update on all the work that needs to be done and what I’ve done so far:

  • Compile PCSX with Android NDK
  • Write Android specific system calls for PCSX
  • Test base system. (I have a Nexus One)
  • Compile CDROM plugin with Android NDK
  • Test with PCSX Android
  • Compile GPU plugin with Android NDK
  • Test with PCSX Android (At this point, some games should load to menu)
  • Compile PAD plugin with Android NDK
  • Test with PCSX Android (Hopefully, games will be playable now)
  • Compile SPU plugin with Android NDK
  • Test with PCSX Android (Games with sound should now be playable)
  • Performance tweaking, testing and bug fixing (This will take time accordingly)

So as you can see, there is much to do. I have a basic understanding of emulators and how they work, but coming from a game programming background, I am not 100% adept at low-level programming and will therefore make progress at a slow rate.

As a side note, I’m hoping that I can get the emulator working without using the PSX BIOS (HLE seems to be working fine on PCs).

An update… finally!

Okay, so I’ve been really busy lately: first with my new Google Nexus One (which I’m loving), secondly with adding features and refining the development pipeline of Project Warpy, and lastly with finishing up some stuff for work (today was a major milestone/deadline). I find it extremely important to establish a good production pipeline for any game project in general, as this makes sure content is continuously added to the game. Of course, for this to work, all bottlenecks that prevent the flow of content from the creators (artists/programmers/designers/sound engineers) into the game must be settled with the utmost priority!

Anyway… I’m also currently working on porting PCSX (a PSX emulator) to the Android platform. I’ve never actually ported anything before, so this is more of a hobby for me than anything else. After checking out the (really advanced) source code for the emulator, I can only say that the developers planned the cross-platform elements very well and I am generally impressed by it! I’ve also realised that debugging native code on the Android platform is seriously a pain in the ass. XD

Anyway, I will be updating this blog more frequently from now on! Thanks for all the comments on the character physics article (apparently it’s getting lots of hits from Google. o_O).

Happy coding! :)

Taking a break from game programming…

… and starting on application programming! :D With all the research and core systems tested and proven in Project Warpy, it’s time to start creating the toolset for the designers to use. I’ll be developing these tools in C#/.NET, and despite my lack of experience in application programming in C#, it wasn’t hard to adapt (mostly due to my experience in VB.NET).

Been pretty busy with work lately (and collecting loot in World of Warcraft). I’ve been going down to the creative company for tips on web design on a daily basis, and frankly I’m quite sick of it here. The internet is slow, the place is in the middle of nowhere and I’m bumming around not doing much most of the time.

DockPanel Suite in action

Ah well, at least I got a chance to check DockPanel Suite out. It’s frigging awesome: with a bit of drag and drop in C#’s form designer, you can have an application that looks like VS.NET’s docking system. Take a look at the screenshot above!

Finishing up a couple of things in the office and heading off… damn, looks like I’m stuck here for the rest of the week. :(

Skeletal Animation in 2D (Part 2: Construction)

Okay so here’s an update on my implementation on skeletal animation in 2D… Basically… it works! :D

Above are some screenshots showing skeletal animation working hand in hand with the scenegraph I programmed for Project Warpy. These are some of the first development screenshots I’ve actually posted; although I’m not supposed to reveal gameplay, I should be able to release screenshots like these as long as they don’t.

Okay, so first things first: how does the skeletal animation actually work?

  1. Create skeletal animation in Maya
  2. Export to .X format
  3. Parse .X format with XNA’s X importer
  4. Custom processor to serialize bones and animations
  5. Parse bone transformations and convert to scenegraph
  6. Animation controller updates the bone nodes
  7. Interpolate between keyframes
  8. Attach sprites to bone nodes

Voila! This technique closely mimics rigid binding and requires some knowledge of matrices and hierarchical transformations.

So before you can get any bone animation working in 2D, you’ll need some kind of hierarchical system (in my case, a scenegraph). I create “nodes” that contain position, scale and rotation, and use the following formula to obtain a transformation (in world space).

transform = Matrix.Identity;
transform *= Matrix.CreateScale(new Vector3(scale.X, scale.Y, 1));
transform *= Matrix.CreateRotationZ(rotation);
transform *= Matrix.CreateTranslation(new Vector3(position.X, position.Y, 0));
transform *= parent.transform;

transform represents the transformation matrix of the current node in world space. It is created by multiplying the local scale, rotation and translation, followed by the parent’s transformation. This is done down the hierarchical chain (with the parent multiplying its transformation first, followed by its children) just prior to rendering. The result is nodes that translate, scale and rotate according to their parent’s transformation; in layman’s terms, when the parent moves, the child node moves along with it.

Because bones are usually stored with transformations relative to their parent bone, it’s easy to adapt them to such a system. Therefore, the only thing I do in the model processor is read in the bones and animations and decompose each matrix into translation, rotation and scale (as used later in the application, as described above).

// process the bindpose matrices..
for (int i = 0; i < bindPose.Count; i++)
{
    Vector3 trans, scal;
    Quaternion rot;

    bindPose[i].Decompose(out scal, out rot, out trans);

    // length of the quaternion's axis (vector) part
    float rs = (float)Math.Sqrt(rot.X * rot.X + rot.Y * rot.Y + rot.Z * rot.Z);

    // make sure we don't get a divide-by-zero error.
    if (rs == 0)
        rs = 1;

    // recover a signed rotation angle about the Z axis (the bones are assumed
    // to rotate only in the 2D plane, so the axis is effectively +Z or -Z)
    float z = rot.Z / rs;
    float angle = 2.0f * (float)Math.Acos((double)rot.W) * z;

    this.bindPose.Add(new Transform(new Vector2(trans.X, trans.Y), new Vector2(scal.X, scal.Y), angle));
}

I also store a list of parents for each bone, so I know how to parent them after they are loaded:

IList<BoneContent> bones = MeshHelper.FlattenSkeleton(skeleton);

// a root bone's parent is not in the bone list, so IndexOf returns -1 for it
foreach (BoneContent bone in bones)
    hierarchy.Add(bones.IndexOf(bone.Parent as BoneContent));

Once these values are stored the skeleton simply has to be reconstructed using the nodes in the scenegraph.

// contains a list of attached joints.
joints = new List<SkeletonJoint>();

// constructs the bones straight away!
for (int i = 0; i < data.bindPose.Count; i++)
{
    SkeletonJoint joint = new SkeletonJoint();
    joint.position = data.bindPose[i].translation;
    joint.rotation = data.bindPose[i].rotation;
    joint.scale = data.bindPose[i].scale;
    joints.Add(joint);
}

// go through that list again to parent joints..
for (int i = 0; i < data.hierarchy.Count; i++)
{
    // check if this is the root joint..
    if (data.hierarchy[i] == -1)
    {
        addChild(joints[i]);
    }
    else
        joints[data.hierarchy[i]].addChild(joints[i]);
}

Now that the skeleton is built (in your scenegraph), the only thing left to do is animation! I will be releasing an article describing keyframe animation and interpolation in part 3 of this series. Hope you enjoyed reading this.

p.s. This article has been left intentionally vague; sorry for not publishing complete code samples, but there shouldn’t be a problem implementing this if you get the theory! :P

Input Action Mapping

One interesting way to generalize input in games is known as action mapping. Basically, instead of reacting to keys (or buttons) being pressed or released, objects react to actions being “pressed” and “released”. The largest benefit of this method comes when your project requires input from multiple devices (or needs to support multiple devices in the future).

Actions are generated from device handlers and propagated to game objects for evaluation. Device handlers handle input device events (such as key press/release) and create the corresponding action.

This worked especially well when developing Project Warpy for the XBOX/PC as debugging on the PC requires keyboard input, while the XBOX only accepts controller input.

Here’s a quick snippet on how you can implement this into your own games:

enum Action
{
    MOVE_LEFT,
    MOVE_RIGHT,
    JUMP
};

class KeyboardHandler
{
public:
    void onPress(char key)
    {
        if (key == LEFT_ARROW)
            gameWorld.propagateKeyPress(MOVE_LEFT);
        if (key == RIGHT_ARROW)
            gameWorld.propagateKeyPress(MOVE_RIGHT);
    }

    void onRelease(char key)
    {
        if (key == LEFT_ARROW)
            gameWorld.propagateKeyRelease(MOVE_LEFT);
        if (key == RIGHT_ARROW)
            gameWorld.propagateKeyRelease(MOVE_RIGHT);
    }
};

Of course, this is not very useful if you only need to support one type of input device. But… say I wanted to add support for the XBOX 360 controller.

class ControllerHandler
{
public:
    void onPress(Button button)
    {
        if (button == DPAD_LEFT)
            gameWorld.propagateKeyPress(MOVE_LEFT);
        if (button == DPAD_RIGHT)
            gameWorld.propagateKeyPress(MOVE_RIGHT);
    }

    void onRelease(Button button)
    {
        if (button == DPAD_LEFT)
            gameWorld.propagateKeyRelease(MOVE_LEFT);
        if (button == DPAD_RIGHT)
            gameWorld.propagateKeyRelease(MOVE_RIGHT);
    }
};

This works for any device (or device library, for that matter) as long as propagateKeyPress and propagateKeyRelease are called in the right places. And because the game objects react to the Action enum, no extra coding is required on the game object side.
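To round out the picture, here’s a rough sketch of what that receiving side could look like. GameWorld, GameObject and Player are hypothetical names of my own for illustration (not code from Project Warpy): the world simply forwards each action to its objects, and every object decides for itself what the action means.

#include <vector>

class GameObject
{
public:
    virtual ~GameObject() {}

    // default to ignoring actions; objects override what they care about
    virtual void onActionPress(Action action) {}
    virtual void onActionRelease(Action action) {}
};

class Player : public GameObject
{
public:
    Player() : velocityX(0.0f), speed(5.0f) {}

    virtual void onActionPress(Action action)
    {
        if (action == MOVE_LEFT)
            velocityX = -speed;
        if (action == MOVE_RIGHT)
            velocityX = speed;
        if (action == JUMP)
            jump();
    }

    virtual void onActionRelease(Action action)
    {
        if (action == MOVE_LEFT || action == MOVE_RIGHT)
            velocityX = 0.0f;
    }

private:
    void jump() { /* ... */ }

    float velocityX;
    float speed;
};

class GameWorld
{
public:
    void propagateKeyPress(Action action)
    {
        for (GameObject* object : objects)
            object->onActionPress(action);
    }

    void propagateKeyRelease(Action action)
    {
        for (GameObject* object : objects)
            object->onActionRelease(action);
    }

private:
    std::vector<GameObject*> objects;
};

The device handlers never need to know which concrete objects exist, and the game objects never need to know which physical device produced the action.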