You know what’s really hard in game development? User interfaces. That may not seem obvious: graphics, physics, advanced AI, procedural generation – these all rely on pretty advanced concepts, while user interfaces generally do not. But the fact is, I’ve seen good existing libraries for all of those things; I’ve never seen a good GUI library. Neither have any of my fellow developers. Every game GUI system out there is weird, quirky, slow, hard to use and/or badly coded – sometimes all of the above. Here are the GUI systems commonly used with the Unity3D engine.


Unity3D has a built-in system called UnityGUI. It’s pretty weird: the whole interface is created in code, and controls are function calls. Here’s what I mean: usually, GUI systems have a notion of a “control”: a button, a line of text, a checkbox, etc. These controls are created either in some kind of editor or directly in code, like this:

var button = new Button();
button.Width = 100;
button.Height = 30;
button.Caption = "Click me";
button.Click += DoStuff;  // subscribe a click handler

Then this “button” is placed somewhere, like in a window, and the GUI system takes care of drawing it in the right place, checking for clicks, etc. You can still change the button after it’s created, e.g. resize it or change its color.

In UnityGUI, though, there is no explicit button. Instead, the code calls a function that draws a button on screen immediately, and also checks for clicks. This makes the code dead simple:

// Inside OnGUI(): draws the button and returns true during the frame it's clicked.
if (GUI.Button(new Rect(10, 10, 100, 30), "Click Me"))
    DoStuff();

This is how UnityGUI looks.

But while this is simple and easy to code, this approach has many problems. First, since there is no button object, there’s no easy way to change its size, text, or anything else dynamically. Second, it requires that the code that draws the button and the code that runs on a click be specified in one place. This is called “tight coupling” in programmers’ parlance, and it’s a big no-no: it causes code to become tangled, intertwined, and ultimately unmaintainable. And third, since all GUI elements are redrawn from scratch every frame with this approach, it tends to be quite slow.
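For contrast, here’s a minimal sketch of the retained-mode alternative in plain C#. The Button class here is hypothetical – it just stands in for what a retained-mode library would provide – but it shows how events let the GUI-building code and the game logic live in completely different places:

```csharp
using System;

// Hypothetical retained-mode API; names are illustrative, not from a real library.
class Button
{
    public int Width, Height;
    public string Caption;
    public event Action Click;                  // handlers can be attached anywhere
    public void SimulateClick() => Click?.Invoke();
}

class Example
{
    // GUI-building code: knows nothing about what clicking actually does.
    static Button MakeBuyButton() =>
        new Button { Width = 100, Height = 30, Caption = "Buy" };

    static void Main()
    {
        // Game logic lives in a completely different place – loose coupling.
        var buy = MakeBuyButton();
        buy.Click += () => Console.WriteLine("Purchased!");
        buy.SimulateClick();
    }
}
```

With immediate mode, the `if (Button(...)) DoStuff();` call glues drawing and behavior together; here the two halves only meet through the event.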

UnityGUI is, in fact, so bad that even Unity Technologies’ own employees advise against using it. It does have its uses, though: when you need to draw lots of mostly independent controls and don’t care much about visual prettiness or performance, UnityGUI really shines. That’s never the case in games, but it’s exactly what’s needed for game tools. My editors for procedural generation, game objects, settings, and lots of other stuff all use UnityGUI, and I love it; but it can’t be used for the actual game.
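For instance, a quick tool panel in UnityGUI might look something like this – a sketch using Unity’s real GUILayout API, though the specific fields and the Regenerate method are made up for illustration:

```csharp
using UnityEngine;

// A throwaway settings panel of the kind used for in-house tools:
// a handful of independent controls, rebuilt from scratch every frame.
public class SettingsPanel : MonoBehaviour
{
    string seed = "12345";   // illustrative tool state
    bool   debugDraw;

    void OnGUI()
    {
        GUILayout.Label("Procedural generation settings");
        seed      = GUILayout.TextField(seed);
        debugDraw = GUILayout.Toggle(debugDraw, "Draw debug gizmos");
        if (GUILayout.Button("Regenerate"))
            Regenerate();
    }

    void Regenerate() { /* rebuild the world from the seed */ }
}
```

Ten lines, no layout files, no control objects to keep in sync – which is exactly why it’s so pleasant for tools and so wrong for a game’s actual interface.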


This is how NGUI looks.

Another GUI system commonly used with Unity is NGUI. It’s a relatively nice system, based on the common approach of creating the various controls in an editor (it basically reuses the Unity level editor). However, it doesn’t cut it for my needs. NGUI is built on the premise that all controls are created and laid out in advance, and the game might show or hide them, perhaps with animations, but not create new ones. Not that dynamic creation is impossible in NGUI – it’s just not really thought out. In the same vein, NGUI offers only the most basic ways of laying out (positioning) controls on screen. That’s enough for many interfaces, but Xenos is going to need more: I plan lots of different windows – inventory, character and NPC information, dialogs, crafting tables, etc. – and NGUI’s simple layouts are not enough. Also, NGUI costs $95; it’s not much, but it’s still an investment.

Other systems

There are some other GUI systems usable with Unity out there, but they’re less widely used and probably less functional. There are also two really advanced alternatives: Flash and HTML/JavaScript GUIs (basically, integrating a whole separate renderer into Unity). These are nice, but they require capabilities of the Unity3D Pro license, which I don’t have (it costs $1500, and I’m not willing to spend that much just yet).

This leaves me no choice but to create my own GUI system. Now, I can’t really hope to best literally everyone out there; most probably, my GUI system will turn out to be bad too. What I do hope to achieve is a system that is good enough in the areas that really matter for this game, even if it’s shitty in some less important ones. Also, having written the GUI system myself means I’ll probably understand it really well and be able to fix things relatively easily. That’s not guaranteed, though.


XGUI is the name of the system I ended up with. It’s not fully finished yet, but all the big stuff is in. I can create windows and controls using a visual editor. I can automatically generate “glue” code that makes using these windows easy and decouples control creation from use. I can create and change anything dynamically. I have a small but effective library of simple controls that can be combined into complex interfaces. And I have an advanced automated layout system that adapts to the target resolution.
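I haven’t shown XGUI’s actual generated code here, but to give a feel for the idea, here’s a self-contained sketch of what such “glue” could look like – every name in it is invented for illustration, and the real generated code may differ:

```csharp
using System;
using System.Collections.Generic;

// Stand-ins for the GUI library's control and window types.
class Control { public event Action Click; public void SimulateClick() => Click?.Invoke(); }
class Window
{
    readonly Dictionary<string, Control> controls = new Dictionary<string, Control>();
    public void Add(string name) => controls[name] = new Control();
    public Control Find(string name) => controls[name];
}

// The kind of "glue" class a generator could emit for a window named "Inventory":
// typed members instead of string lookups scattered through game code.
class InventoryWindow
{
    public Control CloseButton { get; }
    public InventoryWindow(Window root) { CloseButton = root.Find("Close"); }
}

class Example
{
    static void Main()
    {
        var root = new Window();
        root.Add("Close");                    // normally done by the visual editor

        var inv = new InventoryWindow(root);  // generated glue wires it up once
        inv.CloseButton.Click += () => Console.WriteLine("closing inventory");
        inv.CloseButton.SimulateClick();
    }
}
```

The point of generating this by machine is that game code only ever sees typed members like CloseButton, so control creation and control use stay decoupled.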

Basically, what’s missing is drag-and-drop support (it’s pretty easy to add) and a system for editing and playing GUI animations.
And, of course, there’s no nice artwork yet to actually show off the interface: for now, the whole GUI consists of differently colored boxes. I hope to enlist an actual artist’s help for this, so stay tuned. Meanwhile, here’s how the GUI looks now.

[Screenshots: xgui-editor, xgui-ingame]
