Resurrected Entertainment

Microsoft LAN Manager

July 25, 2007

After I had finished assembling my RetroBox, I was forced to make a decision about which operating system to use. I chose Windows 98 because I needed to be able to access my network. I also wanted to be able to boot into a DOS shell, but I didn’t want to dual-boot the machine. By writing a few options into the CONFIG.SYS and AUTOEXEC.BAT files, the user is presented with a small set of choices, one of which drops them into the shell. When I need to access files on my network, or if I want to print something, I choose the option which boots me into Windows. My RetroBox is not a terribly fast beast, and it takes a while to boot and log in.
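For reference, a boot menu of this kind looks roughly like the sketch below. The menu labels and driver paths are placeholders from my own setup, and it assumes BootGUI=0 has been set in MSDOS.SYS so that Windows only starts when WIN is actually called.

CONFIG.SYS:

[menu]
menuitem=WIN98, Boot into Windows 98
menuitem=DOS, Boot to a plain DOS prompt
menudefault=DOS, 10

[WIN98]
DEVICE=C:\WINDOWS\HIMEM.SYS

[DOS]
DEVICE=C:\WINDOWS\HIMEM.SYS

AUTOEXEC.BAT:

REM %CONFIG% holds the name of the menu item chosen above
GOTO %CONFIG%

:WIN98
WIN
GOTO END

:DOS
REM Fall through to a DOS prompt

:END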

To help alleviate this problem, I installed Microsoft LAN Manager in my DOS environment. Now I can access my network files and print through the network with ease. The process was a little frustrating, and your experience may vary. I am providing this information just in case you’re considering tossing the whole project and moving to Tibet to become a monk (if you’re already a monk living in Tibet, then you may be considering moving to Canada to become a Mountie).

The very first step would be to obtain a network card. For compatibility reasons, a popular 10/100 Mb card would be optimal. I have compiled a list of network card drivers which are bundled with Microsoft LAN Manager (which I will now be calling LAN Man to save some typing). Even if your network card is not on the list (mine wasn’t), you may still be able to obtain a driver from the company’s web site. My card is a D-Link “DFE-538TX Rev. D” and the earliest driver posted on the website was for Windows 98. After decompressing the file, however, I found a driver specifically for LAN Man which saved me from having to create a custom NIF file. I will explain how LAN Man uses these files a little later.

Once the physical card is installed, unpack your installation files and run SETUP.EXE to get started. For many people, the setup process should be pain-free. My setup was not so problem-free, however. I suppose it was due to the fact that I was running the Windows 98 shell, and not a pure MS-DOS, PC-DOS, or FreeDOS installation. I believe the problem stemmed from the fact that the Windows 98 shell doesn’t like the ‘$’ wildcard. If you take a look at the SETUP.INF file, you’ll notice a number of files ending in the dollar sign. Within the file, a program like NetBEUI would be referenced like this:

DRIVERS\PROTOCOL\NETBEUI\NETBEUI.ex$

Under MS-DOS, this would match the file NETBEUI.EX_. Under the Windows 98 shell, it does not seem to match it at all, so the installer complains that it cannot find the file. What I did to correct this problem was to simply change the wildcard to the ‘_’ character and presto! The installer was able to see the file and continue with the installation.

Because my card was not on the list of available drivers, I needed to add it to the list myself, which meant making an entry in the SETUP.INF file so it could find the driver files (<DRIVER NAME>.DOS and PROTOCOL.INI) and the Network Information File (NIF). After making an additional entry in the SETUP.INF file and copying the files to the right locations (under DRIVERS\ETHERNET), the setup program correctly enumerated the card name and I was able to select it from the installation menu.

After these changes, the install carried on happily, but complained at the very end that it could not find the “NETWKSTA.” file. There is no “NETWKSTA” file; it’s just a directory, the eight-character shorthand for Network Work Station. I don’t know why it was complaining, but NETWKSTA.EXE is an important program and needs to be installed so you can use network resources like shared files and printers. This is also easily corrected after completing the installation. Just copy the file:

\MSLANMAN.DOS\NETWKSTA\NETWKSTA.500 to
\MSLANMAN.DOS\NETPROG\NETWKSTA.EXE.

Don’t copy it from the CD-ROM, because that file is compressed and won’t run. That file will end in an underscore, so it won’t match the name I listed above.
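In other words, from a DOS prompt the fix boils down to a single command (assuming everything was installed on drive C:):

COPY C:\MSLANMAN.DOS\NETWKSTA\NETWKSTA.500 C:\MSLANMAN.DOS\NETPROG\NETWKSTA.EXE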

During the network setup portion of the installer, if you choose to use DHCP then you don’t need to specify an IP address or subnet mask. For log-in credentials, just supply a user name without a password if you don’t have an MS LAN Manager or Windows NT authentication service set up. When it asks you for a domain name and you don’t know what that means, just enter a dummy name or leave it blank (I didn’t try leaving it blank; the installer may refuse to continue). One item to keep in mind: a Windows domain is not the same thing as a workgroup, but I used my workgroup name anyway since I knew it wouldn’t matter. Eventually, if I choose to set up a domain, I will change the domain name at that time.

After moving the commands inserted into my CONFIG.SYS and AUTOEXEC.BAT files around to suit my own taste, I rebooted the machine and ran the NET.EXE program. Using this software, I could enter the UNC path to one of my machines:

\\MACHINE1

Go to the View menu and select “View available network resources”. It will present you with a dialog which allows you to assign a shared folder to a local resource like a drive letter. I was able to read from and write to the folder (provided the permissions are set appropriately on the share) using the new drive letter (F: in this case).

Please note: you won’t be able to mount a printer or a shared folder if the share name is greater than eight characters! It will simply complain that it cannot find the network resource.
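If you prefer to skip the menus, the same mapping can be done from the NET command line. The share name GAMES below is just an example from my own setup, and, per the note above, keep share names to eight characters or fewer:

REM Map the share named GAMES on MACHINE1 to drive F:
NET USE F: \\MACHINE1\GAMES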

Getting Dirty with DOS DOOM

July 15, 2007

The story behind the source code release goes something like this: In December of 1997, id Software released the source code for Doom to much fanfare and adulation. After only a few short weeks, web sites started popping up and modified (or modded) versions started to appear on bulletin board systems. Before the source code was even released, there were already tools available on the Internet for modifying the existing graphics and levels, or for creating your own levels. These tools operated on the storage format for the game’s resources, which included maps, graphics, sound effects, etc. These resource files were called .WAD or .IWAD files. WAD does not stand for anything in particular, but you could choose to think of it like a wad of gum after eating a burrito. It contains all sorts of unrelated bits in it, like green pepper and cheese, but together they form a chewy and cohesive whole. Yummy.
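For the curious, the layout of a WAD file is refreshingly simple. Here is a rough sketch of the header and directory entries as C structures (the field names follow the original Doom source; all values are little-endian):

#include <stdint.h>

/* WAD file header: the magic string is "IWAD" for the main game data
   or "PWAD" for patch WADs created by the community. */
struct wad_header {
    char    identification[4];  /* "IWAD" or "PWAD", not NUL-terminated */
    int32_t numlumps;           /* number of entries ("lumps") in the file */
    int32_t infotableofs;       /* file offset of the lump directory */
};

/* One directory entry per lump (a map, texture, sound effect, etc.). */
struct wad_dirent {
    int32_t filepos;            /* offset of the lump's data within the file */
    int32_t size;               /* size of the lump in bytes */
    char    name[8];            /* lump name, padded with NULs */
};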

Modifying the actual game engine allows for much greater control over how the game operates. Despite this newfound ability, many programmers or modders chose to leave the original code base intact, in order to give their users the freedom to play the older levels designed by the masters at id Software. I believe this should be an important creed for any aspiring Doom hacker to follow, as it will allow your audience to experience the many thousands of levels available for immediate download. Just in case your levels fail to impress your core audience, such as Mother.

This project is intended to document what you need in order to build the DOSDoom (another port of Doom) source code on your machine. We chose to use DOSDoom, instead of the original code base, because the original requires a fair amount of modification to get it building with the development tools and environments available on an MS-DOS compatible machine. This brings me to my next point: you’re going to need a compatible DOS operating system installed somewhere in your house if you want to compile this project. Windows 95 actually works quite well too, as the two are basically one and the same in many respects. Since you’re a member of the classic gaming community, I trust this shouldn’t be a problem for those of you who are still reading.

DOSDoom brings a nice set of options to your existing Doom game: mouse support, the “look” feature, CD audio, alpha blending, custom resolutions, and a whole lot more! Trust me, when you’re finished setting it up, you’ll think it’s almost like a new game.

Ok, first things first: you’re going to need the source code. It also helps to have a pre-built copy of DOSDoom on hand for comparison, and as a chance to try it out just for fun. Personally, I love this version of Doom and will typically opt to play it over the original, unless we’re talking about the PlayStation port or the Mac OS X port I wrote about a while ago; then I could be persuaded to switch platforms for a while. Of course, before either version will work, you need an original copy of Doom. Please respect copyright laws and get yourself a legal copy; it’s not that expensive when you consider what you’re getting in return. Now would be a good time to unzip the pre-built copy of DOSDoom into your Doom game directory.

For those of you who weren’t born with a debugger in your bonnet, there are a few pieces you need to have in place before you start unpacking the source code for DOSDoom. First, you need to understand the process by which program code gets transformed into something you can actually use. However, we’re not going to write yet another treatise on “How To Program In Such And Such A Language,” although for your reference, the core programming technologies used in this project are called C and Assembly Language. Learning to become a programmer is an interesting but terribly long ordeal. I don’t recommend it unless you’re planning on making it a serious hobby or even a profession. Choose to play the game instead; you’ll probably be happier.

Since you’re still reading this article, I think it’s fair to assume that you’re mildly interested in the topics presented thus far, so let’s get a few terms under our belt before we begin. There is one general phase which must take place before you can use the source code provided, and that phase is called compilation. A compiler is a tool for translating source code written in a high-level language, such as the C programming language, into a lower-level language more suitable for native execution within the hardware environment you are using. From a software development perspective, writing a good compiler is one of the most difficult and rewarding experiences a programmer could choose to undertake. However, simply using a compiler is usually no more difficult than learning any other moderately complex tool. Yes, there are usually a plethora of options and fancy doodads available for any aspiring geek, but many of them are rarely used and can be ignored most of the time. The compiler used for this project is part of a free development environment called DJGPP. DJGPP was primarily constructed by a man named DJ Delorie with plenty of help from the open source community. It contains numerous utilities, including a DOS port of GCC, the GNU Compiler Collection.

Assembly Language, on the other hand, is much less abstract and tends to be a lot closer to a machine’s level of understanding. Assembly source does not need to be compiled in the same rigorous way; instead, it is translated into machine language almost directly, which is why students of computer science will often write simple assemblers for course projects. How “almost” that translation is can vary from assembler to assembler. DOSDoom’s source code contains only a couple of assembly language modules, and they can safely be ignored unless you’re planning on making low-level modifications to the game’s rendering engine. For this project, we’ll be using GCC’s assembler, called GAS, for all of our assembly needs. There is one item we wanted to mention when using GAS assembly code: it does not follow Intel’s coding conventions for instruction mnemonics; it uses the AT&T style instead. For a great book on coding for the Linux platform (which typically uses GCC as the default compiler/assembler toolkit), pick up a copy of Professional Assembly Language by Richard Blum (ISBN: 0-7645-7901-0).
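To illustrate the difference, here is the same instruction written both ways. In AT&T syntax the operand order is reversed, registers are prefixed with ‘%’, immediate values with ‘$’, and the instruction carries a size suffix:

mov eax, 4      ; Intel syntax: destination first
movl $4, %eax   # AT&T (GAS) syntax: source first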

Now that we’re all a little familiar with the basic process, here is the complete list of packages needed in order to compile DOSDoom:

  • DJGPP (the compiler and its related GNU utilities)
  • RHIDE (optional)
  • The DJ File Packer (optional)
  • Make
  • The Allegro Game Library
  • CSDPMI (a DPMI server)

Now what are these extra packages all about anyway, eh? They’re all required too, except for RHIDE, which is a nifty Integrated Development Environment (IDE) very much like Borland’s early C/C++ IDE, and the DJ File Packer, which is used to shrink the size of the resulting executable into something more manageable. The DPMI server is required by every 32-bit protected-mode application for DOS (in this case DOSDoom) and gets launched automatically by the client application – just make sure it’s available via the PATH environment variable or in the application’s home directory. The item listed only as Make is a set of tools used to easily build applications by managing their dependencies, although the syntax used to create Make files is anything but intuitive. Last, but certainly not least, is the Allegro Game Library originally written by Shawn Hargreaves. Allegro is used to fill some of the gaps left by id Software when they released the source code for Doom. It’s a great library and can save you loads of time when you’re trying to write an application. All of these tools or libraries are relatively complex and deserve your attention if you want to make any useful contributions to this code base.

All but the last package, CSDPMI, must be unzipped into the same directory; CSDPMI goes into the Doom game directory instead. To help illustrate where everything goes, here’s a look at our build directory:

contrib
allegro
bin
doc
gnu
include
info
lib
man
manifest
projects
share
tmp

Once the packages have been decompressed, modify the DJGPP.BAT file to suit your own directory organization. For example, my batch file looks like this:

@echo off
set PATH=c:\source\dosdoom\djgpp\bin;%PATH%
set DJGPP=c:\source\dosdoom\djgpp\djgpp.env

Execute the batch file whenever you open a new console window (when using Windows 95), or after you’ve booted into DOS. Now, go into the Allegro directory and find the file named makefile. Remove the bit of text “-Werror” from the file using your favourite text editor. This is a compiler option which treats warnings as errors and halts the compilation process when one is encountered; normally, it’s not a bad idea, but for the sake of simplicity we’ll remove it for now. The next step is to build the Allegro library by typing the command make.

Depending on the speed of your machine, the compilation process may take a while, and you’ll see several messages displayed on screen. If you downloaded everything from this web site, you should experience no errors (provided you removed the compiler option mentioned above), although a few compiler warnings will make their appearance now and then. Once the Allegro library has been built, enter your DOSDoom source code directory and type make again. This will build your DOSDoom executable file. You’ll need to copy this file (obj/dosdoom.exe) into your Doom game folder (where the main executable file doom.exe is found). Run the file dosdoom.exe and enjoy!
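Assuming the directory layout shown earlier, the whole build boils down to a handful of commands. The paths below are from my own setup, and C:\SOURCE\DOSDOOM\SRC and C:\DOOM are just placeholders for wherever you unpacked the DOSDoom source code and installed Doom:

REM Set up the DJGPP environment for this console session
CALL C:\SOURCE\DOSDOOM\DJGPP\DJGPP.BAT

REM Build the Allegro library first (after removing -Werror from its makefile)
CD C:\SOURCE\DOSDOOM\DJGPP\ALLEGRO
MAKE

REM Then build DOSDoom itself
CD C:\SOURCE\DOSDOOM\SRC
MAKE

REM Copy the result into your Doom game directory and run it
COPY OBJ\DOSDOOM.EXE C:\DOOM
CD C:\DOOM
DOSDOOM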

Remember: When making modifications to DOSDoom, please acknowledge all of the contributors who have made DOSDoom into what it is today. Without the tireless efforts from these people, you would have a lot of work on your plate before you even got started implementing your own vision of what you want Doom to be.

Exult v1.2 API Documentation

July 10, 2007

Since I couldn’t find any on their site, I’ve churned through the Exult engine using Doxygen to produce a usable set of API documentation. It’s a good reference to get a handle on their overall architecture in case you feel like diving a little more deeply.
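If you’d rather generate the documentation yourself, a bare-bones Doxygen configuration along these lines will grind through the source tree and spit out HTML. The project name, input path, and output directory are placeholders for wherever you unpacked the Exult source:

# Minimal Doxyfile -- run with: doxygen Doxyfile
PROJECT_NAME     = "Exult 1.2"
INPUT            = ./exult-1.2
RECURSIVE        = YES
EXTRACT_ALL      = YES
GENERATE_HTML    = YES
GENERATE_LATEX   = NO
OUTPUT_DIRECTORY = ./exult-api-docs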

Introducing CircleMUD

July 7, 2007

During the early 1990s, the Internet was fairly new to most people. At the university I attended for a time, we were using a PC-DOS based environment which had Novell’s Netware stitched on top of it. At the time, I had a small number of Usenet news groups I liked to read every morning, and I stumbled across a post by an individual in alt.rec.gaming or the like. This was before the days of spam and prolific pornography, so when a person posted a message with a subject line like “Check out this cool site!”, you had no reservations whatsoever about following the link.

The link he provided wasn’t a URL which could be viewed in a web browser, but looked something like this:

telnet://chaosrealm.darkmatter.org:4000/

Until I started paying to go to school, the only experience I had with a public network in the late 1980s was through local Bulletin Board Systems. I could e-mail people on other BBSes through something called FidoNet. Those systems would use a BBS mailer for the users to create their message (or it could be uploaded using a pre-formatted text file), and then copy it over to the FidoNet servers for transmission using specialized protocols. The mailing address was rather complicated and needed to have a series of names, symbols, and numbers affixed to it if you ever expected it to reach its destination. Due to my own inquisitive nature and two very understanding parents, I was able to set up and experiment with computer communication. As a result, I was somewhat more experienced with networking before entering university than your typical freshman, but I had never used a telnet client before. It was certainly a mystery that needed solving. Little did I know what worlds waited for me on the other side. Nowadays, there is a bare-bones telnet client on virtually every desktop. Under the Xandros operating system, the program is called “telnet.” Likewise for Microsoft Windows and Mac OS X.
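Connecting is simply a matter of handing the host name and port number to the client; using the address from that old post as an example:

$ telnet chaosrealm.darkmatter.org 4000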

After a bit of research, I discovered what I needed to access the site, so I started poking at the client program. A friend of mine saw what I was doing and thought it looked interesting, so he pulled up a chair and logged into his own terminal. What we saw was disappointing at first. It was essentially a blank screen with a small title surrounded by credits centered in the middle and a login prompt near the bottom left-hand corner. I knew neither of us had an account, so I thought our little adventure would probably end right then and there with a message like “Invalid login” or “Permission denied.” Instead, we received a prompt asking “Are you sure you want to use this name?” Tentatively, I declined and sat staring at the screen for a few seconds. I was familiar with the concept of a “handle” or nickname from the BBS systems I frequented. I remember not wanting to use the name I had entered, which was probably something unoriginal like my first name. I don’t remember the name I chose and it’s probably for the best. Little did I know, the interesting questions would soon follow.

The server accepted my new name and prompted me for a password. It then asked if I wanted to be “[M]ale, or [F]emale?” During my role-playing sessions, I had never considered being a woman, even though friends of mine tended to flip-flop between the sexes just for fun from time to time. I guess it just wasn’t something I even thought about. My teenage philosophy was black and white: how could I be a fearless knight of the realm with breasts? However, faced with the same opportunity on my computer, I did pause for the briefest of moments. Anonymity is not something easily overlooked by most people. Not wanting to appear unmanly in front of a friend, I quickly pressed the ‘M’ key. To this day, I have never played as someone from the opposite side of the gender coin. I eventually realized it didn’t really have an appeal for me. If I was going to invest the time creating a character in this new world, I needed to take it seriously and not go gallivanting around the realm trying to fool everyone into thinking this body is real and I know how to use it. Besides, I really don’t think I would be able to do a very good job of it. I would probably end up over-acting and annoy everyone in the game while fooling no one at the same time. And then there’s the whole psychological impact of sustaining an Internet female persona…

Depending on the site you visit, the questions don’t necessarily end after you choose your sex. There can be questions about your race, occupation (warrior, cleric, mage, or dentist), attributes, and a description for your character, just to name a few. Before you start hammering away at the keyboard, it’s important to realize these details will be visible to other people. The site I visited that day was classified as a MUD, which stands for Multi-User Dungeon/Dimension. There are many other classifications of multi-user software environments, such as MOOs, MUCKs and so on, but I have only been interested in dungeons. A MUD is a world tailored in such a way that the user feels like they’ve been transported into an interactive book. All of the details in a MUD are described by words; there are no sounds, graphics, or vibrating joysticks. The illusion is helped along by talented authors and other player characters. Just like actors and actresses, most people who enter a MUD want to remain in character while they’re on-line, so information about their personal lives is usually not revealed. If they want to elaborate on their real life, then they usually initiate a private chat or use special abbreviations to show they are speaking out of character.

The software hosting the MUD is stateful. It knows when you’re logged in and when you’re not. When you’re naughty and when you’re nice. By your command or when you log off, it can record vital details about your character like the equipment you’re carrying, the money you’ve accumulated, and other pertinent details. Anyone familiar with games like Dungeons & Dragons will feel right at home in this universe. In fact, it’s often a fun alternative to playing face-to-face, especially when your role-playing amigos live far away. MUDs essentially run by themselves, but they can also be directed in real-time by controlling characters often called wizards, gods, or heroes. These characters have permissions to modify the world and initiate quests. Ordinary players can sometimes attain these positions by achieving a very high-level character ranking and then being offered a promotion, or by simply becoming more involved in the MUD’s day-to-day operations. Sometimes it’s just easier to befriend one of the wizards and beg for an opportunity to prove yourself worthy. Just try not to cry if they refuse.

Wizards are usually programmers who typically interact with the MUD on a technical level. They usually have a character which is used to log on to the server in order to test the deployment of a new feature. It’s inadvisable to anger these people because they can eject you from the MUD very easily. Their characters on the MUD are not ordinary players; they usually cannot be killed and they often make themselves invisible to ordinary mortals. This is the kind of power one can abuse. Do what you like on your own server, but I can guarantee no one will want to visit your realm if you abuse your abilities as a Wizard.

Back in the heyday of MUDding, there were literally thousands of these servers up and running on the Internet. And as a player, you had your pick of the litter. Lo and behold, though, the majority of those servers were rehashes of existing worlds. There were a few bright stars among them, but it was a challenge to find a truly original MUD with a decent up-time. If you’re going to spend the time to create and manage a MUD, then I suggest you think of an original theme first and then come up with an interesting environment. It could be used as an interesting spot where you and your friends and family can hang out instead of the typical chat room.

Interaction on the MUD is accomplished via text commands you feed to the server hosting the game. These commands are transferred over the Internet via the telnet protocol, and the response is sent back to your client in a timely manner. So, what commands are available for you to use? It depends on the server software, but they are very similar to the commands used by many classic adventure games. At some point in their history, games like Leisure Suit Larry and King’s Quest all sported a command interface where you typed what you wanted the character to do. For example, commands like “take bottle” or “look at painting” were commonplace. MUD commands were pretty much the same, except they could usually handle more complex sentences and drew on a larger set of available commands. There is also the multi-player aspect to consider. A MUD needs a much more extensive set of commands used for communication like “shout,” “talk,” “whisper,” or “emote.” These all have different effects based on your permissions and proximity to other players. For example, some MUDs do not allow just anyone to shout a message, since it will usually be heard by everyone currently logged into the MUD. Players in the past have used this feature to be very naughty.

Due to their popularity, you would expect there to be a lot of MUD software available on the Internet, and you would be right in thinking so. From this point onward, we’ll be studying an implementation called CircleMUD. CircleMUD was created by a man named Jeremy Elson in 1993 and is a worthy extension to the DikuMUD codebase, which was written by Katja Nyboe, Tom Madsen, Hans Henrik Staerfeldt, Michael Seifert and Sebastian Hammer in 1990. It is stable, well programmed, well documented, and certainly one of the more popular servers available today.

Before we delve too deeply and greedily, I think it’s worth talking about the software license. You can use the software free of charge, but you must comply with the license agreement, which basically has three requirements. First, you can’t make any money off of CircleMUD. That also includes soliciting funds or accepting donations. Second, you must give the authors credit for their hard work. And lastly, you must comply with the DikuMUD license. All of these details can be found in the documents accompanying the distribution.

Building a RetroBox

Playing games through emulators or virtualization software is one thing; playing them on a real box, powered by real hardware, is quite another. The tactile sensation, the whir of the power supply, and the smell of electro-static dust crud slowly wafting through the ventilation are enough to send chills down my spine. Seriously, I can’t stand it.

Besides all of those tangible qualities, there is also the very real issue of game compatibility to consider. They may not run perfectly in your emulated environment. On my primary machine, for example, DOSBox does not run any of my computationally expensive games very well, such as Wolfenstein, DOOM, Hexen, etc – basically, any 3D FPS game. VMWare may also not produce the sound very accurately (assuming I can even find a way to get audio to work in FreeDOS on the machine I am using). I also found it enjoyable building the machine and locating the parts for it. My local used hardware store was a great help, as was raiding my own closets for any spare pieces lying around. This little venture also helped to ferret out any unneeded parts for charitable donations.

As a benchmark, I used Ultima VII: Black Gate and Serpent Isle, since those two games had very demanding audio and memory requirements. The games also required a decent processing speed, mouse support, and a modern video card supporting VGA graphics modes. When I say “very demanding,” I mean when compared to the technology of the day. If you were to use a machine built with today’s technology, it would simply be too fast. Many games wouldn’t operate correctly and it would be next to impossible to find MS-DOS compatible drivers.

Hardware

The two most important pieces of hardware to consider would be your motherboard and your sound card. You may have noticed I neglected to mention the processor and video card. If you pick a motherboard from the same era as Ultima VII, it most likely won’t have a CPU socket (although my RetroBox does and provides a Socket 7 interface), so you would typically be stuck with the on-board processor. The video card is less important, although I would make sure it has a reasonably high number of video modes, and optionally, high-quality video output that suits your taste and matches the capabilities of your monitor. You should also try and get a PCI video card, since a 16-bit ISA bus can be a bottleneck for the Pentium processors of the early to mid 1990s. The last video card I used in this machine was a Trident Super VGA card. I loved that card, but when I upgraded to a new card sporting a PCI interface, the difference in speed was very obvious.

Should you choose to build a machine yourself, I have provided a list of components I used to build mine. You will probably want to alter the list to suit availability and personal preference, but if you choose to build it to my specifications, I can guarantee you the machine will run Ultima VII perfectly:

  • GMB-486SPS 80486 PCI Green Motherboard
  • Intel P133 Microprocessor
  • Sound Blaster 16 ISA Sound Card
  • Matrox G400 PCI Video Card
  • Yamaha 4x4x16 CD-ROM Drive
  • Generic PCI Network Interface Card
  • Microsoft Compatible Mouse
  • 40 GB Hard Drive
  • 3 1/2″ Floppy Drive
  • 250W Power Supply

The size of the hard drive is somewhat arbitrary. Forty GBs is a lot of space for a DOS installation. I could install every game in my collection, including Windows 3.1, compilers, word processors, and anything else I could find around the house, and it still wouldn’t fill half the hard drive, which is why I partitioned the drive and installed Windows 95 as well. I did this more for network connectivity, so that I could copy files back and forth easily (although I will probably forgo the Windows installation and use an Ethernet packet driver and SSH/SCP to move files around). Just remember that hard drives available today will probably not be compatible with older motherboards. The BIOS or your operating system may not recognize your disk, or it may refuse to utilize most of the space.

Remember, you should probably work out an effective way for you to get files from your home network to your RetroBox. It could be more difficult to find a motherboard capable of interfacing with USB devices, and even if you did find one, you would need to install Windows with USB support.

TIP: Some earlier versions of Windows 95 did not support USB devices, so be careful if you’re purchasing a copy on eBay. Another alternative to Windows would be to install FreeDOS – it has USB support, although I am not sure if it would recognize the hardware on all motherboards. The OS is built for old and new machines, so I’m guessing luck is on your side.

Software

I use a number of lightweight drivers and utilities to make my RetroBox compatible with Ultima VII. The CD-ROM, mouse, and sound card all require drivers. I wanted a lightweight driver and device extension software for my Yamaha drive, so I chose to go with the generic CD-ROM driver provided by Mitsumi, called MTMCDAI.SYS, and extension software called SHSUCDX.EXE, written by Jason Hood and John McCoy. For the sound card I use the standard Sound Blaster drivers and related software. For the mouse, I chose something a lot smaller than the typical Microsoft compatible mouse driver. The mouse driver is called CuteMouse; it’s open source and optimized for size. It’s only 6 KBs compared to the Microsoft driver which is 56 KBs – a huge savings!
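To give you an idea of how these pieces fit together, here is roughly what the relevant lines look like on my machine. The paths, the /D: driver name, and the BLASTER settings are specific to my setup, so treat them as placeholders:

CONFIG.SYS:

REM Generic Mitsumi CD-ROM device driver; /D: names the driver for the extension software
DEVICE=C:\DRIVERS\MTMCDAI.SYS /D:MSCD001

AUTOEXEC.BAT:

REM CD-ROM extension (a small MSCDEX replacement); it must reference the same /D: name
SHSUCDX /D:MSCD001
REM CuteMouse driver
CTMOUSE
REM Tell DOS games where to find the Sound Blaster (address, IRQ, DMA, high DMA, type)
SET BLASTER=A220 I5 D1 H5 T6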

However, the best piece of software I found was a program called UMBPCI. It only works with specific CPUs, but if you have an Intel P2, P3, Celeron, or Xeon, you’re covered, since they are all L2 cacheable. Processors which do not work are the 486/SX/DX/DX2/DX4/SLC, 386/SX/DX, 286/SX, and 8086. If you own one of those processors, then you may want to try HIRAM.EXE, which works in a similar way to UMBPCI. So, what does UMBPCI actually do? Here’s a concise explanation taken from the documentation (it has been edited for clarity):

UMBPCI.SYS extends Microsoft’s HIMEM.SYS by supplying the ‘Request XMS-UMB’ function. Microsoft’s EMM386.EXE does the same thing, when loaded with the ‘NOEMS,’ ‘HIGHSCAN,’ or ‘RAM’ parameters in CONFIG.SYS.

UMBPCI.SYS creates UMBs (Upper Memory Blocks) using the existing system memory intended to be used as Shadow RAM, but disabled by default, only in the C800-EFFF range, not using the address range B000-B7FF. That particular memory area is normally used for monochrome video (used by older graphics adapters), not for BIOS (ROM) extensions, therefore X86 chipsets cannot enable shadow RAM within this region. UMBPCI.SYS enables this memory and disables its write protection.

UMBPCI.SYS takes only 240 bytes of conventional RAM (224 bytes of code + a 16-byte environment), while providing up to 629 KB of free conventional memory, provided you’re loading all the devices/drivers/TSRs into high memory!

Microsoft’s EMM386.EXE creates UMBs from the computer’s physical XMS (eXtended Memory Specification) by virtually remapping XMS to the upper memory area using the Memory Management Unit (MMU) of 386 and higher CPUs. It needs an additional 150 KB of XMS, 4 KB of low memory and 7 KB of UMA (Upper Memory Area) when loaded. It also switches the CPU into protected mode, which tends to be slower, because it’s necessary to use the MMU. UMBPCI.SYS leaves the CPU in real mode, for better compatibility and faster performance.

Basically what this boils down to is the ability to load all of your drivers and software to a region other than conventional memory. This makes games like Ultima VII, which is a complete memory hog, very happy indeed.
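In practice, that means a CONFIG.SYS along these lines. The paths are placeholders from my own setup, and HIMEM.SYS has to load first since UMBPCI only extends it:

DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\UMBPCI\UMBPCI.SYS
DOS=HIGH,UMB
REM Anything loaded with DEVICEHIGH here (or LH/LOADHIGH in AUTOEXEC.BAT) now lands in upper memory
DEVICEHIGH=C:\DRIVERS\MTMCDAI.SYS /D:MSCD001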

For some games, I also use a utility called PENTSLOW.COM which slows down a Pentium class CPU by disabling branch prediction, the V-pipeline, and the internal cache. This produces a more even gradient for speed changes, unlike some software driven tools (essentially TSR programs which simply loop) that produce choppy or ineffective speed reductions. I also down-clock my CPU in the BIOS from 133 MHz to 100 MHz, which helps Ultima VII cope with the faster processor. You may also wish to use a disk caching tool like SMARTDRV.EXE, which can help improve the performance of disk intensive software.

For your benefit, I have made all of my batch files, configuration files, drivers and software available to you. Let me know how your RetroBox works out!

The Hydra Development Kit

July 5, 2007

I’m very excited. I just received the new game development kit from Nurve Networks which includes the Hydra game console, development book, RAM expansion module (extra), storage expansion module (extra), SX-Key programmer (extra), keyboard, mouse, experimenter board, and software. A friend of mine asked why I enjoyed using the kit despite the obvious limitations when compared to development kits and hardware available today. I rambled something out since I was on my way to a meeting, but after further consideration I find it a hard question to answer.

I’m a huge experimenter. I love the possibilities new hardware and software bring to the table. It doesn’t need to be fancy, it just needs to do something interesting or expose possibilities in order to capture my imagination. I’m not expecting to produce commercial quality games with the Hydra development kit, and despite what the documentation and marketing hype imply, it will not show an aspiring programmer how to produce games on today’s hardware consoles. The technology, development methodologies, and tools are so different, you could spend years writing software for the Hydra game console and be completely lost when asked to write software for an Xbox 360.

So why does the kit exist at all? I don’t know what the sales numbers are for Nurve Networks, but if this kit existed when I was a teenager, I would have given or done anything to get my hands on it. That’s right, anything! I would have taken on an extra paper route to earn enough money to buy it. I would have done every house chore my parents could dream up, if only they would consider buying one for my birthday. Even today, I have a genuine technical interest in the device, and something at a more fundamental level which drives my desire to learn and create.

I’m a professional programmer by day. I can afford expensive and powerful development platforms, but I do not tend to use them in my spare time. The reason may be similar to why an artist chooses to use acrylic paints on a cloth canvas instead of water-based paints on a paper canvas. It’s more than just a personal preference; it’s an attraction to the art itself. There is a certain intimacy with the hardware which is impossible to achieve using high-level production tools. Sure, the compiler may allow you to graft in a few lines of assembly code, but it’s not necessary most of the time and can produce problems later, which is why most companies frown on the practice of introducing assembly code into software for no good reason.

The art of programming can be enjoyed by any skilled software developer using almost any tool or language. That in itself can provide great satisfaction to the designer if everything works out, but the synthesis of hardware and software can produce a masterpiece not often experienced by programmers today, and I suppose the elation which follows is what attracts me to development kits like the Hydra.

VMWare for Games?

June 29, 2007

Just for kicks, I decided to try and install FreeDOS into the free version of VMWare server yesterday. The installation went very nicely and before long I was mucking around in the console. I decided to try and play a game in this environment. Having used FreeDOS before, I knew it would work with the operating system, but I was unsure how playable it would be under VMWare.

The first game I tried was Blake Stone: Aliens of Gold and it ran quite well. I was encouraged so I decided to try a more complex game. I chose DOSDoom because I love the game and it seemed like the next logical step. It also ran well. To try out your own game, just run an ISO generation tool on the installed directory, and attach the ISO to your VMWare guest. If you’re running Xandros, use the ‘mkisofs’ command:

$ mkisofs -o /tmp/doom/cd.iso /tmp/doom/DOOM/

Assuming your installation directory is located under ‘/tmp/doom/DOOM’, this command will quickly generate an ISO file. To make it easier on yourself, I would just throw all of your favourite games into one ISO, burn the CD, archive it (you can edit it later if you want to add more games), and use that whenever you feel an itch to play.

This technology is a fabulously free alternative to building your own RetroBox, but maybe not as much fun to put together.

Emulation Strain

June 28, 2007

A friend of mine wrote an article a few days ago suggesting that emulation may meet its end in the not-so-distant future because the computing power necessary to emulate the new hardware used by today’s consoles would be too vast. I agree with much of what he says, but I still have reservations about this statement for a couple of reasons.

First, until recently, consoles were always lagging behind computers in terms of pure processing power. Programmers and hardware engineers have always been able to do more with less on static systems like the Nintendo 64 or Turbo Grafx 16. The consoles had specialized hardware in them which could accelerate common operations used by games. Thus, the console stayed with exactly the same hardware during the lifetime of the product but often continued to look good in the eyes of the consumer because of these extra hardware features. Computer hardware, on the other hand, continued to get better, but because of complex operating systems like Microsoft Windows, the computer games often had to play catch-up. For example, during the late 1980s and 1990s, a side-by-side comparison would have been difficult at the best of times because the consoles often ran completely different games from the kind played by computer enthusiasts. Not to mention, the computer often had to contend with many more software layers than a barebones console. It was, and still is today, impossible for a computer to run a game without some overhead from the operating system, hardware drivers, other applications, etc. In stark contrast, consoles usually had a bare metal operating system with a spartan software architecture; essentially nothing more than a boot loader and perhaps a small firmware BIOS.

In a bizarre twist of evolutionary electronics, consoles are becoming more and more like computers, complete with powerful operating systems (albeit dumbed down and with minimal functionality exposed). Admittedly, these machines do have powerful hardware, but then again, so do modern computers. Another twist is that the software running on these consoles (i.e., the games) is becoming more and more compatible with the software used by modern day computers. With multi-core processors, hyper-complex accelerator cards, and virtualized hardware environments, computers running emulation software could map a number of hardware features used by today’s consoles more easily than ever before.

So what’s stopping emulation from taking off? There are a few big, perhaps insurmountable, issues to contend with before we start playing Xbox 360 games on a Mac. First, the games are so much larger, and the software needed to make them work is equally large, which means if you want to run those games on systems which weren’t designed to run them in the first place, you’ll need to handle the hardware issues as well as the software requirements. For example, on an Xbox 360 there are Microsoft libraries which handle mundane things like drawing menus and windows and handling kernel events, plus DirectX libraries for input control and graphics acceleration; all of these will need to be available on the target platform. So, you had better start coding now, and maybe with a bit of luck, you’ll finish before you’re dead. The second big issue is security. It’s complex. It’s big. And depending on the system, it could be distributed throughout the software and hardware in a tangled mess designed to keep people like you from writing software like ZSNES.

Bottom line: if you want to play PS3 or Xbox 360 games while you can still hold the controller, then you had better fork out the cash for a new or used system, because those clever teenagers who wrote the early generation of emulation software won’t be able to make a dent in the new console market.

Simulation or Emulation?

April 19, 2007

When people in the gaming community speak of emulation, they are usually referring to the emulation of a hardware console or a computer system. This software allows them to play games on hardware systems which were never engineered to allow it. Imagine playing Super Mario Kart with your friends through the Internet on your Mac OS X, Xandros, or Microsoft Windows computer. Impossible? Google the ZSNES or SNES9X project for more details.

First of all, there seems to be some confusion over the use of the terms “emulation” and “simulation” on the Internet. It seems many people (myself included) are referring to programs, such as DOSBox or ZSNES, as emulators. Usually, I abhor ignorance (unless it applies to me somehow), but thankfully it’s not simply a misused term stemming from the ignorant, adolescent population on the Internet.

Part of the confusion may come from people in the hardware industry referring to simulators as software that precisely mimics hardware, and emulators as pieces of hardware that are precisely controlled by software. Reread that last sentence, since there is a distinct difference in their meaning. For people in the software industry, the terms are a little different. When a programmer attempts to duplicate hardware components through software, it’s often referred to as emulation. Simulation, on the other hand, can be used to mimic the actions of a piece of hardware or software, but the internal workings may be completely different from the original product. It may look and act like an Atari computer, but it’s not really using any of the core components of that machine. Within the realm of software, simulators can also apply to programs which simulate completely different systems, such as organic or chemical interactions, economic or disaster forecasts, or the weather.

I’m going to digress for a moment and talk about a simulation used to provoke the global warming hysteria we are all experiencing today. Simulations can be as simple as simulating the periodic movement of a pendulum, or as complex as a weather system. Simulations can also involve complex interactions between other systems, and for these intricate systems to be effective at predicting or postulating outcomes, they must rely entirely on our flawless understanding of how all of these systems work. It’s funny (and sad) to read about simulators which can accurately predict the effect of human pollution on the weather system. These are probably two of the most complex environmental systems anyone is attempting to simulate today. It’s no surprise that even the most sophisticated simulation models used today don’t work, but various environmental groups and political leaders are running with the “evidence” as if it were iron-clad proof. The only way they got the simulation to output the desired result was to modify the data going into the system. Now, I’m sure we’re not all scientists here, but it should be fairly obvious that manipulating data to validate your own hypothesis isn’t exactly following the scientific method. I don’t want to get off on a tangent here, but it’s a good example of simulators being used today and of the unexpected complexities that can arise during the development of a simulator.

Most of the time, programmers would not attempt to simulate an entire machine, but rather couple the simulation with emulated components. The reason is simple: it’s a lot more work to simulate an entire machine than it is to emulate one. In a simulation, none of the original software for the hardware platform can be used, which means it too must be simulated. Looking at the other side of the coin, we see that pure hardware emulation is also not very practical to implement through software. How does one emulate a hardware bus, for example? The bus is purely an electrical component and has little to do with the purpose of the overall system. Yes, hardware engineers need to implement bus systems, but programmers attempting to emulate the machine can forget about it entirely. Simulation is often used in place of arcane hardware constructs which are part of the electrical system, or those pieces which simply cannot be emulated. It acts like a bridge between discrete or unrelated pieces within an emulator so that the emulator can function as a cooperative unit.

When writing an emulator for a video game console, it’s vital to get as much information about the hardware as possible. Doing so will hopefully give you the big picture, and make it easier for you to divide your work into component groups. In a simple system containing only one central processing unit, for example, you will need to emulate the operation of that chip. This includes instruction sets, registers, any available memory caches, and operands, which constitute the general make-up of most microprocessors. Advanced microprocessors necessitate advanced emulation, but not all components of a microprocessor need to be emulated in order to maintain accuracy. For example, optimization features used by the chip can essentially be omitted for a perfectly functioning system, unless those pieces modify the data in a peculiar way or they contain a bug which affects the software’s presentation or operation on the original hardware.
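To make that a little more concrete, the heart of nearly every CPU emulator is a fetch-decode-execute loop. Here is a toy sketch in C; the registers and opcode numbers are invented for illustration and don’t correspond to any real chip:

#include <stdint.h>

/* Register file for an imaginary 8-bit CPU with a 64 KB address space. */
struct cpu {
    uint16_t pc;                /* program counter */
    uint8_t  a;                 /* accumulator */
    uint8_t  memory[65536];
};

/* Execute one instruction: fetch the opcode, decode it, then carry out its effect. */
static void cpu_step(struct cpu *c)
{
    uint8_t opcode = c->memory[c->pc++];        /* fetch */
    switch (opcode) {                           /* decode */
    case 0x01:                                  /* LDA #imm: load the next byte into A */
        c->a = c->memory[c->pc++];
        break;
    case 0x02: {                                /* STA abs: store A at a 16-bit address */
        uint16_t lo = c->memory[c->pc++];
        uint16_t hi = c->memory[c->pc++];
        c->memory[(uint16_t)(lo | (hi << 8))] = c->a;
        break;
    }
    default:                                    /* unimplemented opcode: ignore it */
        break;
    }
}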

Another component group needing particular attention would be the memory subsystem used by the console. The memory architecture used could be as simple as a linear array of addresses or as non-intuitive as paged memory or memory-mapped input/output. The solution chosen for emulating memory should probably favor reading memory over writing memory, since the former occurs a lot more frequently. Some systems, such as the Nintendo Entertainment System, allowed clever engineers to essentially extend the available memory through the cartridge. They used bank switching within the cartridge to overcome the addressing issue normally encountered when a system attempts to access more memory than it was originally designed to address.
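A common way to handle both concerns is to route every access through small read and write helpers and keep a table of pointers to the currently selected banks; switching banks then amounts to swapping a pointer. Here is a rough sketch, where the bank size, the number of banks, and the mapper register address are all made up for illustration:

#include <stdint.h>

#define BANK_SIZE 0x2000                      /* 8 KB banks (an arbitrary choice) */
#define NUM_SLOTS 8                           /* 8 slots cover a 64 KB address space */

static uint8_t  ram[4 * BANK_SIZE];           /* internal RAM */
static uint8_t  cart_rom[32 * BANK_SIZE];     /* cartridge ROM, bigger than the CPU can see */
static uint8_t *slot[NUM_SLOTS];              /* which bank is visible in each 8 KB window */

static void mem_init(void)
{
    /* Point the lower half of the address space at RAM and the upper half
       at the first few ROM banks; a real system defines its own layout. */
    for (int i = 0; i < 4; i++)
        slot[i] = &ram[i * BANK_SIZE];
    for (int i = 4; i < NUM_SLOTS; i++)
        slot[i] = &cart_rom[(i - 4) * BANK_SIZE];
}

static uint8_t mem_read(uint16_t addr)
{
    /* Reads happen constantly, so this path stays as cheap as possible. */
    return slot[addr / BANK_SIZE][addr % BANK_SIZE];
}

static void mem_write(uint16_t addr, uint8_t value)
{
    if (addr == 0xFFFF) {
        /* Hypothetical mapper register: writing here selects which ROM bank
           appears in the last slot, mimicking cartridge bank switching. */
        slot[NUM_SLOTS - 1] = &cart_rom[(value & 0x1F) * BANK_SIZE];
        return;
    }
    if (addr / BANK_SIZE < 4)                 /* only the RAM slots are writable */
        slot[addr / BANK_SIZE][addr % BANK_SIZE] = value;
}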

I could go on and on, as there are many other aspects to consider before writing an emulator, such as debugging or functions related to emulation as an application, but they are outside the scope of this little write-up.

Wii Virtual Console

April 18, 2007

So, I downloaded my first Virtual Console game the other day and immediately backed it up to an SD card. The process worked flawlessly, but it took a little while for the Wii to save the game. I thought at first it had crashed because the actual game data wasn’t particularly large (about 1 MB for the raw ROM data). After popping the card into a reader for my Mac, it turns out the code/data for the game had grown on the SD card to about 5 MBs. That’s quite an increase, and I began to wonder why it had grown in size by a factor of five. There are quite a few opinions on the Internet as to what accounts for the increase in size, and the possible reasons I give you here are no different, but I tried to sift through the possibilities and come up with the three most likely reasons:

  • It’s very likely the Wii uses a symmetric encryption algorithm to protect the data, since asymmetric encryption wouldn’t really add to the security of the system in an environment where both the private and public keys would need to be stored locally in system memory. If they wanted to use asymmetric encryption, they would need to move the private key to a remote server in order for the system to be effective. However, one of the design goals for Nintendo probably would have been to avoid a situation where a constantly available connection to an Internet server was required. In any case, the encryption algorithm used may inflate the size of the data being encrypted, or they may pad the file for the algorithm to be more effective.
  • In order for Nintendo and third-party developers to release self-contained Virtual Console games, it’s likely the emulation software is included with every game. Otherwise, if the emulator was part of the console’s on-board software library, and they decided to patch the on-board emulator sometime in the future, then they would need to re-test every released game with the new emulator. Ouch. Using a pre-installed emulator would also prevent developers from making game-specific tweaks to the emulation environment.
  • Finally, the game probably includes extra information for the console to display when you’re at the Wii main menu. For example, it would need a picture and maybe some music or sound effects for the game when shown on the main screen. If this information wasn’t stored within the file, then the Wii would need to decrypt the file every time you accessed the main menu. Given the strength of the encryption algorithm, it’s very likely the process is computationally expensive, and if you owned several Virtual Console games it would need to decrypt all of them before you could see the little preview window.

What do you think?