Resurrected Entertainment

Archive for the 'Game Development' category

DOS Software Development Environment

December 12, 2014

I love writing software for Microsoft’s DOS. I didn’t cut my teeth programming on this platform; that was done on an Atari 800 XL. However, it was on DOS that I was first exposed to languages like C and assembly language, which sparked a torrid love affair with programming that lasts to this day. The focus of this post is DOS software development and remote debugging.

If you have done any development for iOS or Android, then you have already been using remote debugging — unless you are some kind of masochist who still clings to device logging even when it is not necessary. The basic concept is that a programmer steps through a program from a debugger client on one machine, while the program itself executes under a debug server running on another machine.

The really cool part of this technology is that it’s available for all sorts of platforms, including DOS! Using the right toolchain, we can initiate a remote debugging session from one platform (Windows XP in this case) and debug our program on another machine which is running DOS! The client program can even have a relatively competent UI. For this project, the toolset we are going to use is available through the OpenWatcom v1.9 project, and the tools found inside that wonderful package will allow us to write 16-bit or 32-bit DOS applications and debug them on an actual DOS hardware target! In addition, we can apply the same technique with the server hosted inside a customized DOSBox emulator, which is also really cool since it allows you to debug your code more easily on the road.

The first scenario is the one I prefer, since it is the faster of the two approaches, but before we get into the details of how to set this up, let’s consider some of the broader requirements.

You’ll need two machines for scenario number one. The DOS machine must be network enabled, meaning it should have a network interface card and a working packet driver. I would recommend testing your driver with tools like SSH for DOS, or the PC/TCP networking software originally sold by FTP Software. In order to use the OpenWatcom IDE, you’ll need a Windows machine. I use VirtualBox and a Windows XP Professional installation; my host machine is a MacBook Pro running Mac OS X 10.7.5 with 4 GB of RAM.

The second scenario involves using the same virtual machine configuration, but running the DOSBox emulator within that environment. You will need to use this version of the DOSBox emulator, which has built-in network card emulation. They chose to emulate an NE2000 compatible card for maximum compatibility, and also because the original author of the patch was technically familiar with it. After installation, you’ll need to associate a real network card with the emulated one, and then load up the right packet driver (it comes bundled with the archive).

For reference, the network interface card I am using on the DOS machine (together with its packet driver) is listed below:

  • D-Link DFE-538TX

These are the steps I have used to initiate a remote debugging session on the DOS machine:

  • Using Microsoft’s LAN Manager, I obtain an IP address. For network resolution speed and simplicity, I have configured my router to assign a static IP address based on the MAC address of my network card. Below are the AUTOEXEC.BAT and CONFIG.SYS configurations for my network:
    
    AUTOEXEC.BAT
    @REM ==== LANMAN 2.2a == DO NOT MODIFY BETWEEN THESE LINES == LANMAN 2.2a ====
    SET PATH=C:\LANMAN.DOS\NETPROG;%PATH%
    C:\LANMAN.DOS\DRIVERS\PROTOCOL\TCPIP\UMB.COM
    rem - By Windows 98 Network - NET START WORKSTATION
    LOAD TCPIP
    rem - By Windows 98 Network - NET LOGON michael *
    @REM ==== LANMAN 2.2a == DO NOT MODIFY BETWEEN THESE LINES == LANMAN 2.2a ====
    
    CONFIG.SYS
    DEVICEHIGH=C:\LANMAN.DOS\DRIVERS\PROTMAN\PROTMAN.DOS /i:C:\LANMAN.DOS
    DEVICEHIGH=C:\LANMAN.DOS\DRIVERS\ETHERNET\DLKRTS\DLKRTS.DOS
    DEVICEHIGH=C:\LANMAN.DOS\DRIVERS\PROTOCOL\TCPIP\NEMM.DOS
    DEVICEHIGH=C:\LANMAN.DOS\DRIVERS\PROTOCOL\TCPIP\TCPDRV.DOS
  • Load the D-Link Packet driver
  • I load a TSR program, built from a Turbo Assembler module, which can kill the active DOS process. I do this because the TCP server provided with OpenWatcom v1.9 does not always exit cleanly and will often lock up your machine. In the end, your packet driver may not be able to recover anyway, and you will need to reboot the machine unless you can find a way to unload and reinitialize it. Incidentally, the packet driver does have a means to unload itself, but when I attempt to do so after the process has been killed, it reports that it cannot be unloaded. The irony of the situation will make you laugh too, I am sure.
  • I navigate to my OpenWatcom project directory and start the TCP server, which uses the packet driver and the active IP address to start the service. The service then waits for a client connection; in my case, the client is initiated from my Windows XP virtual machine using the OpenWatcom Windows IDE.
    • Ensure that the values for “sockdelay” and “datatimeout” are both “9999”, and make sure the “inactive” value is “0” in your WATTCP.CFG file. Even though the documentation says that a value of “0” for the “datatimeout” field means essentially no timeout, I did not find that to be the case. The symptom of the timeout can be observed when you launch the debug session from the OpenWatcom IDE and see the message “Session started” on your DOS machine, but then the IDE reports that the debug session terminated.
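For reference, the relevant lines of my WATTCP.CFG ended up looking like this (only the timeout-related fields are shown; the rest of the file, such as the IP configuration, is omitted):

```
sockdelay = 9999
datatimeout = 9999
inactive = 0
```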

These are the steps for the DOSBox emulator running within the Windows XP guest installation:

  • Install the special network enabled build of DOSBox mentioned above;
  • Fire up the NE2000 packet driver (c:\NE2000 -p 0x60);
  • Start the TCP service
    • Note that I configured a static IP address on my router using the Ethernet address reported by the packet driver. You will not be able to ping that address successfully until the TCP server is running in DOSBox. While the process worked, the time it took for the session to be established and the delay between debug commands were monstrously slow (45-90 seconds to establish the connection, for example), which made this solution unusable for me.

While working on a project, it can be really useful to create the assets on a modern machine and then automatically deploy them to the DOS machine without needing to perform a lot of extra steps. It can also be useful to have the freedom to edit or tweak the data on the DOS machine without needing to synchronize it manually. The solution which came immediately to mind was a Windows network share. This is possible in DOS via the Microsoft LAN Manager software product and has been discussed in a previous post.

Building Wolfenstein 3D Source Code

June 16, 2014

Way back on Feb 6, 2012, id Software released the source code to Wolfenstein 3D — 20 years after it had already been written. The source code release does not come with any support or assets from the originally released game. In fact, id Software is still selling this title on various Internet stores like Steam. I played around with a DOS port of the DOOM source code quite some time ago, but I had never bothered to try and build its ancestral project. Until now!

As it turns out, it’s actually quite straightforward, with only a minor hiccup here and there. The first thing you’ll need is a compiler, that almighty piece of software that transforms your poorly written slop into a form that the operating system can feed to the machine. For this project, the authors settled on Borland C++ v3.0, but the project is 100% compatible with v3.1. I don’t know whether more recent compilers from Borland are compatible with the project files, or whether the code would produce viable targets with them, so good luck if you decide to make your own roads.

As per the details in the README file, there are a couple of object files you will want to make sure don’t get deleted when you perform a clean within the IDE:

  • GAMEPAL.OBJ
  • SIGNON.OBJ

You can open up the pre-built project file in the Borland IDE, and after tweaking the locations for the above two files, you should be able to build without any errors. The resulting executable can then be copied into a working test directory where all of the originally released assets are located; I believe my assets were from the 1.2 release.

There are also a few resource files you must have in order for the compiled executable to find all of the right resources. According to legend, the various asset files were pulled from a sprinkling of source formats and assembled into “WL6” resource files. A utility called I-Grab, available via the TED5 editor utility, produced header (.H) and assembler (.EQU) files from that resource content, which allowed the game to refer to assets by constant indices once the monolithic WL6 resource files were built. The definition files (.EQU and .H) carry a generated comment at the top which confirms part of that legend.
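To give a flavour of the mechanism, here is a hypothetical sketch of what such a generated header could look like; the constant names and index values below are invented for illustration and are not the actual I-Grab output:

```c
/* Hypothetical sketch of an I-Grab-style generated header. The names and
   values here are invented; the real generated files follow the same
   pattern. Each constant is an index into the monolithic WL6 resource
   data, so the engine can refer to a chunk by name rather than by a
   magic number. */
enum graphicnums {
    H_BJPIC      = 3,   /* title-screen portrait       */
    C_CURSOR1PIC = 10,  /* menu cursor, frame 1        */
    C_CURSOR2PIC,       /* menu cursor, frame 2 (= 11) */
    L_GUYPIC     = 30   /* level-intro sprite          */
};
```

If the indices in a header like this drift out of sync with the assembled WL6 files, the engine happily loads the wrong chunks, which is exactly the class of visual ailment described below.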

The tricky part in getting the game to run properly revolves around which resource files are being used by the current code base. The code refers to specific WL6 resource files, but locating those resource files using public releases of the game can be very tricky because those generated files have changed an unknown number of times. Luckily, someone has already gone through the trouble of making sure the graphics match up with the indices in the generated files. The files have conveniently been assembled and made available here:

After unpacking, you’ll need to copy those to the test directory holding the registered content for the game. Note that without the right resource files, the game will not look right and will suffer from a variety of visual ailments, such as B.J. Blazkowicz’s head being used as a cursor in the main menu, or failing to see any content when a level is loaded.

2D Scrolling and EGA Support

May 29, 2014

Originally, I was using a Matrox Millennium PCI graphics card in my DOS gaming box, but I found the 2D scrolling performance to be somewhat lacking, as games like The Lost Vikings would shear while playing. The card also has virtually no EGA graphics mode support, which was important to me since I wanted to run games like Crystal Caves from Apogee, and I also wanted the option of writing my own graphics programs for this video mode.

Enter the S3 ViRGE (Virtual Reality Graphics Engine).

Interestingly, this was S3’s first attempt at a 3D graphics accelerator card. The performance was somewhat lower than expected, however, making the card only slightly faster than the best software renderers at the time, and equal to those renderers when anything other than the simplest 3D techniques were used. Because of the card’s poor performance, it was dubbed the “World’s First Graphics Decelerator” by critics in the graphics and gaming communities.

I own the “DX” model of this card, which is somewhat more performant than its predecessor, but I didn’t buy the card for how well it could render 3D graphics, so that matters very little to me. I bought it for how well it could accelerate 2D graphics and for its support for lower-end video modes, and thus far it is very impressive.

The Ouya is coming, but are you excited enough?

June 19, 2013

Ok, so there isn’t a whole lot to be excited about in terms of a revolutionary hardware platform, but I think the people behind the Ouya have made a nice machine. Yes, it runs Android, and no, it’s not a phone. Unfortunately, there are a number of people making that comparison, and yes, there are Android phones out there with better hardware, but the Ouya is not a phone. It’s a stable Android platform which game makers can target like a console. This is something that has dogged the Android development environment for years, because of the rapid and large-scale changes to the Android SDK after every major version, and the hardware offerings themselves. It’s hard work to ensure your latest Android game runs flawlessly on the numerous bits of Android jetsam floating about in the market, and wouldn’t it be easier if the hardware platform didn’t change on a weekly basis? The Ouya tries to solve this problem as well as giving Android gamers a better gaming environment than a phone, which I am sure you think is pretty special, though it does lack a bit of flexibility in the input and output arena.

It’s also a beautiful platform for indie developers looking to release cool software for your living room; I am sure there are lots of opportunities just waiting there for creative programmers looking to take a slice of the Android pie. Think about it: what other options were there for you to release software on a computing device in the living room? The Xbox 360, the PS3, the Wii U? These are great platforms, but I would hardly call them accessible to very small development companies. Not to mention, these platforms are mostly interested in releasing games. There is the odd piece of software like Netflix, but the bulk of third-party offerings is in the entertainment market.

My Book Library

February 13, 2013

I like to read software development texts when I have the time, and over time, I have developed a sizeable game and graphics programming library. I also have a below average capacity to remember important details, although I am usually fairly good at remembering the big picture. Because of this, I often need to return to a text when I am involved in developing a new algorithm or researching information on existing topics. I have long desired a means by which I can add my books to a software library, and then have that library automatically index all of my books based on topical reference points.

When an author creates an index for their book, it is usually a tedious process of reading through the manuscript and finding the keywords and concepts they want to index, then poring over the manuscript looking for instances of those keywords and recording page numbers and related topics. There is software which can help with keyword indexing, but concepts are a little more tricky to isolate and cross-reference. In either case, the result of all of this painstaking labour is an index which should prove useful to the reader.
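As a toy sketch of the keyword half of that job (my own illustrative code, not any real indexing product; the function name and page model are invented), a keyword indexer boils down to scanning each page for a term and recording where it appears:

```c
/* Toy keyword indexer: scan an array of "pages" (strings) for a keyword
   and record the 1-based page numbers where it occurs, the way a
   back-of-book index generator would. Concepts, as opposed to keywords,
   are the hard part and are not attempted here. */
#include <string.h>

/* Writes matching page numbers into out[] (caller guarantees capacity)
   and returns how many pages contained the keyword. */
int index_keyword(const char *pages[], int npages,
                  const char *keyword, int out[])
{
    int hits = 0;
    for (int i = 0; i < npages; i++)
        if (strstr(pages[i], keyword) != NULL)
            out[hits++] = i + 1;
    return hits;
}
```

The searchable library I describe above would layer concept cross-references and surrounding context on top of a pass like this, which is where it stops being a toy.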

I want to take it a step further and make those indexes searchable electronically, along with extra metadata, so that if I am researching a topic like collision detection, for instance, I can see which books in my library talk about it (and possibly the number of pages dedicated to the topic), along with a paragraph surrounding each hit to provide some context. Google Books is starting to provide such a service, but it’s mostly centred around indexing the brief cover summary and then offering related books for sale based on what it found. At a bare minimum, this service would require the electronic versions of the books, which are much more prevalent these days but which many authors or publishers do not want to share with Google. Do you know of a service which does a better job than Google Books?

Game Development Tip

November 29, 2010

If you’re going to create a game, and that game allows for in-game loads of saved games, then please place the load menu after the save menu if you’ve designed an in-game menu like LOZ: Ocarina of Time or Ys: Book I & II for the Nintendo DS. When you think about it, the number of times your players are going to save a game greatly outnumbers the number of times they are going to load a previously saved game while in the middle of playing. I would recommend against putting the load option in an in-game menu altogether and just leaving it on the main title screen. However, there are cases where that may be annoying to the player, especially if your game is slow to exit back to the title screen.

What other in-game pet peeves can you think of when playing your favourite game?

Edit: I forgot to mention that the main reason I think the menu should be in reverse order is so that the player does not accidentally load a game when they meant to save a game. When you’re thinking about other things before the heat of a boss battle, the order of the screens is quite easy to overlook. You can just imagine the filth erupting from the gaming dungeon when that happens.

The PC Gaming Onion

July 9, 2010

I have been caught in a diagnose, debug, and repair cycle for a few days now when using my PC. It’s been a frustrating experience so far, and I have yet to arrive at a stable platform to play games, which is the reason why I began this epic quest in the first place. This particular problem has been quite nasty, and all the usual tricks and secret handshakes aren’t working. I have had to systematically replace and diagnose each component of my system by placing everything from software services to my set of DRAM sticks under the microscope.

On the one hand, I do enjoy problem solving, so I could sometimes put the frustration aside and concentrate on the problem (although I wanted to throw the machine out the window yesterday). However, problem solving was not my goal here. Since I’m on vacation right now, I want to play games, not fix computer problems. Like everything else in the desktop computer market, the complexity has risen to the point where an average computer user must treat their computer like a mystical black box. You might as well throw a big fat “No Serviceable Components Inside” sticker on the side of the case. Unless it’s something simple, like an unplugged monitor or a misaligned video card, I would expect most people to throw their hands up in frustration and start filing through their list of contacts looking for the local computer geek.

You don’t need to understand everything about everything; you need to comprehend just enough so that you can effectively peel back a layer or two and fix your problem, or at least diagnose it. Of course, if it were that easy, I would be playing Bionic Commando right now. The hard part is dealing with all of these hardware and software layers, and trying to find that mystical needle in the haystack.

I would like to see a better solution to the problem of diagnosing a system crash. It’s a hard problem, and one that cannot be fixed without first changing the system, reducing the number of layers or at least fixing those layers to a known stack. The problem is that layers provide a certain kind of freedom to hardware and software engineers. They allow them to ignore a lot of the inner workings of a system and concentrate more on what they want to build, be it a game or a piece of tax software, so as much as I feel like getting rid of them right now, those layers are here to stay. However, we do not need all of those layers in all circumstances, and some people have customized their desktop configurations so that only the necessary layers are used when performing a task. These customizations are very high level; the user typically does not have a lot of control over the low-level pieces of the operating system (unless you’re using Linux, but if your goal is to play games, then you’re not using Linux anyway). I don’t think these kinds of profiles are a good idea either, since they create an even bigger nightmare while testing or debugging a failing product, and much of the configuration is beyond the understanding of a typical user or support personnel.

What the desktop needs is a specialized mode, which essentially brings some of the benefits of a console to the PC arena. This mode must be supported by the operating system and consist of a limited but complete set of layers, so that game programmers can continue to write great games. That’s it; nothing else should be included. By reducing the number of layers and providing appropriate diagnostic tools for reporting on hardware configurations or problems, game development companies could target much more typical configurations while testing, and support personnel could diagnose problems faster.

Products like DirectX for Microsoft Windows were designed to solve that problem, but over time the development platform began to stray with the introduction of more layers and thus more complexity. Games which utilize DirectX are nice, but simply providing DirectX is not a complete solution, since there are so many services still running on the system, which have nothing to do with the game or the game development framework. These services can cause problems, add complexity, and serve to hide the real culprit, until we arrive at the situation I am dealing with today.

Black Art of 3D Game Programming

September 20, 2009

So you’re an aspiring game writer, now what are you going to do about it? You could get a job as a summer intern at a local game company and work for free doing all sorts of tedious tasks no one else wants to do. Or you could hit eBay, do yourself a huge favour, and find this text. Before you write back about the publication date: yes, it was published in 1995. No, it does not talk about OpenGL or DirectX. Phew, now that we have that out of the way, let’s talk about what this book can teach you.

This book’s greatest strength is how it dabbles in a number of good topics without being totally useless. A number of texts spread themselves across too many topics while not providing enough material to do something fun or useful. That exercise is often left to the reader. Ha! A book like that in my house will pay a little visit to the recycling bin or get banished to the library, never to be read again. Other books tend to steamroll over you with too much theory of the mathematical kind. While I’m all for roots and quadratics, aspiring young programmers wanting to enter the field of game development shouldn’t necessarily go whole hog on their first book.

That being said, they will need some basic skills to get started, and matrix math is a fundamental skill for playing around in 3D. Fortunately for the junior programmer, it’s not the most difficult skill to learn. One of the black arts the book teaches you is how to master 2D and 3D transformations using matrix algebra. It doesn’t go into quaternions for gimbal-lock-free rotations, but since the reader is probably just starting out, it doesn’t really matter either. They will be able to make do without all of the fancy topics getting in the way of actually learning something useful.

The book does show its age in some chapters where the topic of conversation is how to take control of various video modes, but I think it’s still a worthwhile read. You can still do this kind of programming if your recreational operating system of choice is FreeDOS or some other version of DOS, even on a modern PC. You could even use Windows 95 or 98 for that matter; just get it to boot using the version of DOS which ships with all of that GUI crap. Trust me, you’ll be better off in the long run, plus you can wow your friends with your amazing latch register skillz.

Qt for Games – Niblet #5

December 12, 2008

I spent some time trying to place a QGLWidget into a QGraphicsScene. Apparently, this is not a supported use of the QGraphicsScene class. One Trolltech engineer suggested creating a viewport using a QGLWidget instead of the typical QWidget (default). Neither approach worked and both ended up suspending the program on my Mac. There goes option number two for OpenGL sprites in the game.

Qt for Games – Niblet #4

Before I began my investigation into the suitability of Qt for games, I wanted to use Qt’s new QGraphicsItem framework alongside the OpenGL support. I was curious to see if an OpenGL object could be wrapped by a QGraphicsItem and rendered into the same view. The short answer is yes, but the result is not very pretty or practical. Basically, it comes down to a fundamental issue with OpenGL and sharing rendering contexts with other engines. OpenGL doesn’t allow this by design, but you could use a QGLWidget and encapsulate that in a QGraphicsItem implementation. The result is very heavyweight, not particularly elegant, and doesn’t easily support animation of the OpenGL shape. With the latest version of the Qt API, you can render a QWidget object into an OpenGL rendering context, but not the other way around (without using a container like QGLWidget).

This is fine and not detrimental to the project, so we’ll continue using the new graphics framework minus the features relating to OpenGL. This gives us plenty of options for rendering, but 3D will be set aside for now (unless I get a spark of genius). We may revisit it later to explore different ideas, like the intro screen or credits; however, we will need to find out how well Qt supports direct pixel rendering. I am very familiar with the various image classes, but whether they can be used in an efficient rendering pipeline is still an unknown.

I’m trying to be realistic here. Niblet is not an overly complex idea, but I would like to use compositing for a variety of effects and some effects like fire would need to be rendered using fast pixel access.