Resurrected Entertainment

Archive for the 'PC' category

The QuickBASIC and the 0xDEAD

November 2, 2011

QuickBASIC is very similar to QBasic, since the latter is just a stripped-down version of the former. QuickBASIC had a few features QBasic did not: it could compile programs and libraries, it allowed for more than one module, and it could create larger programs. It also had an annoying bug where the memory management model used by the interpreter differed from the one used by the compiler; this was a problem when your program worked in the debugger, which ran on the interpreter, but failed once it was actually compiled and run from the shell.

QBasic shipped with MS-DOS 5 and consisted of only an interpreter, meaning it translated small chunks of code into machine code as they were executed (I believe it cached the translated portions so it wouldn’t need to translate them again). QuickBASIC was more advanced and had a compiler as well as an interpreter, which allowed it to translate and optimize the machine code it generated before you ran it. It had only one dependency during compilation, and that was the QB.LIB library. Using the QuickBASIC IDE or the command line, you could compile multiple modules into a single target, which could be a library or an executable program.
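If I remember the toolchain correctly, a two-module build from the DOS prompt looked something like the following (the file names are made up, and treat the exact switches as an assumption; it has been a while):

BC /O MAIN.BAS;
BC /O HELPERS.BAS;
LINK MAIN.OBJ+HELPERS.OBJ, MYAPP.EXE;

The /O switch told BC to produce objects for a stand-alone executable, rather than one that needed the BRUN run-time module present at launch.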

QuickBASIC was handy for prototyping and demos. I didn’t need to do many of these for my own projects, although I did a lot of experimentation with network interrupts before porting those routines and programs to C. I mainly used QuickBASIC for projects and demos at school and eventually college; many of those were also written in C due to course requirements.

While in college, I wrote a chat application for DOS that ran over a local area network, before the concept became popular. It used the SPX protocol for some parts and the IPX protocol for others. It supported only peer-to-peer communication with one other person, but it served as a demo for simple network communication, and many of the students used it quite regularly.
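For the curious, the first step in any of these programs was detecting the IPX driver. The sketch below is from memory, assuming QuickBASIC was started with QB /L so the interrupt library (QB.QLB) is loaded, and using the installation check documented in Ralph Brown’s list (INT 2Fh, AX=7A00h); verify the details before relying on it:

REM $INCLUDE: 'QB.BI'

DIM inregs AS RegTypeX, outregs AS RegTypeX

inregs.ax = &H7A00              ' multiplex interrupt: IPX installation check
inregs.ds = -1: inregs.es = -1  ' -1 = let the routine use the current segments
CALL INTERRUPTX(&H2F, inregs, outregs)

IF (outregs.ax AND &HFF) = &HFF THEN    ' AL = FFh means the driver is resident
    PRINT "IPX driver present"
    ' ES:DI now holds the far entry point used for the actual IPX services
    PRINT "Entry point: "; HEX$(outregs.es); ":"; HEX$(outregs.di)
ELSE
    PRINT "No IPX driver loaded"
END IF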

In other programs, I was using Novell Netware interrupts for communication, broadcasts, and client machine discovery. I loved it and found coding these applications fun and exciting; my copy of Ralph Brown’s interrupt list was well used during this time. I think my interest in programming these network applications stemmed from my days playing DOOM over similar networks. Basically, I just loved the potential for interactivity, which is a tad ironic since I was a fairly quiet programmer back then. A few of my friends became interested in the software I was writing, and one day I decided to play a few practical jokes on my fellow students. I created a custom program for sending messages via the Netware API. These messages could be broadcast to every machine on the network, a group of machines, or a specific machine. When a machine received one of these messages, it would display a dialog and show the user a message. It was a pretty simple concept, but many people were not aware their machines could even do that, or what it meant to be on a network at all.
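I no longer have the source, but the heart of that program was a single DOS call: NetWare’s message services on INT 21h, AH = E1h, with subfunction 00h (Send Broadcast Message). The sketch below is reconstructed from Ralph Brown’s list; the buffer layout in particular is my best recollection of the documented format, not verified code:

REM $INCLUDE: 'QB.BI'

DIM inregs AS RegTypeX, outregs AS RegTypeX
DIM request AS STRING * 128, reply AS STRING * 128

' Assumed request layout (per my reading of the interrupt list):
'   WORD  length of the data that follows
'   BYTE  00h = Send Broadcast Message
'   BYTE  connection count, then one byte per target connection number
'   BYTE  message length, then the message text itself
msg$ = "YOU ARE TYPING TOO LOUDLY. PLEASE BE GENTLE."
body$ = CHR$(0) + CHR$(1) + CHR$(1) + CHR$(LEN(msg$)) + msg$  ' one target: connection 1
MID$(request, 1) = MKI$(LEN(body$)) + body$
MID$(reply, 1) = MKI$(126)          ' reply buffer leads with its own capacity

inregs.ax = &HE100                  ' AH = E1h: NetWare message services
inregs.ds = VARSEG(request): inregs.si = VARPTR(request)
inregs.es = VARSEG(reply): inregs.di = VARPTR(reply)
CALL INTERRUPTX(&H21, inregs, outregs)

Since the message text is just whatever you supply, there was no need to include the sender’s machine name the way the stock tools did.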

Simply using the network software which came with Netware wasn’t an option, since the message dialogs produced by those tools contained the name or address of the machine from which the message originated. According to Steve Wozniak, co-founder of Apple Computer and well-known prankster, the most important element of any gag is not getting caught. We needed to customize the message being sent so that it included only the details we wanted to send, and the only way to do that was to write a custom program. This was fairly easy since I had been writing software on Netware for several months, and when the packets were sent off through the network, our machine name was carefully omitted and our address forged.

We used the program to send funny or confusing messages to some of the students. No profanity or crude humour, mind you… well, nothing too crude anyway. My goal was always to get the user to believe what the machine was telling them. I convinced one student her computer didn’t like the way she typed; her keystrokes were always too hard or too fast for the computer’s liking. Several computers were being used by students to view pornography, so I did my best to make them feel uncomfortable in a public setting. Many of them believed they were being watched by the network administrators, which could have been true, although network monitoring software was much more limited at the time. Anyway, these people quickly shuffled out of the room red-faced, hoping not to get caught on their way out. I still get a chuckle out of it even now, when I think about it.

In a way, I am kind of saddened by the complexity of today’s operating systems. Writing the same software on modern machines would be far more difficult, mostly because of new operating system and network security features. Networks just aren’t managed in the same way anymore, so a programmer’s ability to exploit one so directly has disappeared. I don’t particularly disapprove; it’s just not as easy to have a little fun.

Lume: Puzzle Game (Part I)

May 15, 2011

This is the first installment of (presumably) a multi-part puzzle game series. The first version was fairly short but had a nice, not-too-in-your-face musical score. I guess that’s fairly typical for this type of game, as you’re often spending a lot of time in one particular scene. My advice: don’t pay too much attention when the game tells you there is nothing of interest on Grandpa’s bookshelf. I used information in those books to unlock the tool cabinet under the sink. Personally, I find the “find the hidden number” games to be annoying most of the time, so I hope there aren’t too many of them in the sequel.

The PC Gaming Onion

July 9, 2010

I have been caught in a diagnose, debug, and repair cycle with my PC for a few days now. It’s been a frustrating experience so far, and I have yet to arrive at a stable platform for playing games, which is why I began this epic quest in the first place. This particular problem has been quite nasty, and all the usual tricks and secret handshakes aren’t working. I have had to systematically replace and diagnose each component of my system, placing everything from software services to my DRAM sticks under the microscope.

On the one hand, I do enjoy problem solving, so I could sometimes put the frustration aside and concentrate on the problem (although I wanted to throw the machine out the window yesterday). However, problem solving was not my goal here. Since I’m on vacation right now, I want to play games, not fix computer problems. Like everything else in the desktop computer market, the complexity has risen to the point where an average computer user must treat their computer like a mystical black box. You might as well throw a big fat “No Serviceable Components Inside” sticker on the side of the case. Unless it’s something simple, like an unplugged monitor or a misaligned video card, I would expect most people to throw their hands up in frustration and start filing through their list of contacts looking for the local computer geek.

You don’t need to understand everything about everything; you need to comprehend just enough to peel back a layer or two and fix your problem, or at least diagnose it. Of course, if it were that easy, I would be playing Bionic Commando right now. The hard part is dealing with all of these hardware and software layers, and trying to find that mystical needle in the haystack.

I would like to see a better solution to the problem of diagnosing a system crash. It’s a hard problem, and one that cannot be fixed without first changing the system: reducing the number of layers, or at least fixing those layers into a known stack. The problem is that layers provide a certain kind of freedom to hardware and software engineers. They allow them to ignore a lot of the inner workings of a system and concentrate on what they want to build, be it a game or a piece of tax software, so as much as I feel like getting rid of them right now, those layers are here to stay. However, we do not need all of those layers in all circumstances, and some people have customized their desktop configurations so that only the necessary layers are used when performing a task. These customizations are very high level; the user typically does not have a lot of control over the low-level pieces of the operating system (unless you’re using Linux, but if your goal is to play games, then you’re not using Linux anyway). I don’t think these kinds of profiles are a good idea either, since they create an even bigger nightmare when testing or debugging a failing product, and much of the configuration is beyond the understanding of a typical user or support person.

What the desktop needs is a specialized mode which essentially brings some of the benefits of a console to the PC arena. This mode must be supported by the operating system and consist of a limited but complete set of layers, so that game programmers can continue to write great games. That’s it; nothing else should be included. By reducing the number of layers and providing appropriate diagnostic tools for reporting on hardware configurations or problems, game development companies could target much more typical configurations while testing, and support personnel could diagnose problems faster.

Products like DirectX for Microsoft Windows were designed to solve that problem, but over time the development platform began to stray with the introduction of more layers and thus more complexity. Games which utilize DirectX are nice, but simply providing DirectX is not a complete solution, since there are so many services still running on the system which have nothing to do with the game or the game development framework. These services can cause problems, add complexity, and serve to hide the real culprit, until we arrive at the situation I am dealing with today.

Bionic Commando 3D

July 6, 2010

I downloaded this title from Steam a few nights ago and have been struggling with frustrating system lock-up problems ever since. I went through the usual suspects: the game and its patches, video card drivers, sound drivers, GPU temperature, CPU temperature (overheating usually causes a reboot rather than a freeze, but I investigated just to be sure). Here is what I observed and tried:

1. The game would freeze in seemingly random locations – a reboot was required to recover
2. Eventually my desktop froze as well, which seemed to indicate it wasn’t the game
3. I tried three different video card driver versions: AMD Catalyst 10.6, 10.5, and 10.4
4. I tried a brand new video card using a different chipset (nVidia), then subsequently returned it to the store
5. I tried posting to the Bionic Commando message board with a DirectX dump (useless; apparently the game is too old)
6. I made sure my fans were running at peak efficiency
7. I cried (only a little)

Then it occurred to me that it could be Steam causing the crash. So, I switched Steam to offline mode, since it is not possible to play Bionic Commando without the Steam client running (when the game is purchased through Steam). Lo and behold, it was a miracle. No freezes. For your reference, here are a few of my vital stats in case someone from Steam wanders by:

Windows 7 64-bit Professional
Intel Core2 Quad CPU with 4 GB of RAM
ATI Radeon 3850 with 512 MB of VRAM
Steam API v009; built June 30, 2010 14:43:46

Here’s the system information Steam gathered.

Dragon Age – Data Read Error

July 1, 2010

It has been a while since Dragon Age first appeared on the market. My wife and I have been playing off and on since the release, and we are close to completing the game, definitely within the last 25%. To make a long and frustrating story short, I managed to destroy our operating system installation one dark and stormy night (NTFS file system gone, EXT3 in its place, with additional files installed on top for good measure), on the very drive which held our Dragon Age and Fallout 3 games. Two very long games, so no small amount of investment. Obviously, there was much drama and hand waving.

Long story short (didn’t we already do this part?), I recovered most of the save game files, reinstalled the operating system (Windows 7, 64-bit), and installed all of those nasty dependencies. Now I was ready to install the game. I gingerly placed my Dragon Age game disc in the DVD drive, ran the auto-installer, and… whammo! A big ol’ data read error in my face. Now isn’t that just great. Insolent hardware! I went through the process several more times (copying contents from the disc to the hard drive, disabling Data Execution Prevention for the installer, etc.) with no success; the same error appeared in seemingly random locations during the install. However, there is a silver lining, and I eventually did manage to get a fully functional installation. Would you like to know how?

SCRATCH HERE TO REVEAL SECRET

I simply copied the disc’s contents from another machine’s DVD drive (an iMac in this case) to my Windows 7 box, ran the installer, and it worked. No fuss, no muss. Except for all of the fuss and muss I mentioned above. Please feel free to send me money if you find this helpful.

The Revolutionary Guide to Bitmapped Graphics

December 29, 2009

This is another book from my library that I have decided to look back on to see if there are any useful tidbits for programmers today. As with most technical books more than ten years old, there is usually an abundance of information about specific technologies which are no longer in popular use, or which are still present in one form or another but whose means of access have changed dramatically. I personally believe that many of these books can give the novice programmer a background not taught in universities and colleges, and will certainly give them an edge when working on limited or older machines.

The book does talk about the video hardware of the period and delves deep into the programmatic underpinnings of accessing the display and creating custom video modes. I found some of the discussions noteworthy, but if you really want a thorough explanation, you may want to investigate the Zen of Graphics Programming or the Graphics Programming Black Book. It also includes a short assembly language primer, which is very typical for these books, since many of the routines were coded in that language. The primer is brief but may be a nice refresher for those who haven’t gotten their hands dirty in a couple of years.

I’ve made a list of what is still useful for work you may be doing today – unless you’re one of the lucky few who get to maintain software written in 1994. Your mileage will vary, as some of the techniques are really just short introductions to much larger fields like digital image processing (DIP) and morphing. The book even has a short introduction to 3D graphics, which seems to have been slapped on at the end because the publisher wanted “something on 3D” so they could put it on the cover.

  • It provided color space introductions, conventions, and conversions for the following spaces: CIE, CMY, CMYK, HSV, HLS, YIQ, and RGB. Most of the conversions go both ways (to and from RGB space), although the CMY/K conversion calculations are only provided from RGB space (a quick sketch of that case follows this list).
  • Dithering and half-toning, followed by a chapter on printing. I think the authors mentioned Floyd-Steinberg in there somewhere, but it wasn’t a full discussion.
  • Fading in the YIQ and HLS color spaces. I’m not sure why they didn’t provide one for the RGB space, but it could very well be on the bundled CD-ROM.
  • It introduces the reader to a few algorithms for primitive shape drawing and clipping, like Bresenham line drawing and Sutherland-Cohen clipping. It also includes discussions and examples for ellipses, filled polygons, and B-spline curves.
  • Extensive discussions of the GIF, JPEG, TGA, PCX, and DIB graphics file formats, although these tended to be higher-level than what would be useful for someone implementing a decoder for any one of them (with the possible exception of PCX). Associated algorithms like LZW and RLE are also explained as they are used by encoders for these formats.
  • The topic of fractals and chaotic systems was a little out of place, but was a little more extensive than the chapter on 3D. It did explain the concept of an L-system fractal, and even provided a generator for it. When supplied with a configuration file, the generator could produce fractals like the von Koch curve. It briefly touched on the Harter-Heighway dragon fractal and introduced the Mandelbrot and Julia sets, but didn’t delve into chaos theory, even though I’m sure one of the authors desperately wanted to.
  • Related to the discussion of fractals was the section on generated landscapes via the midpoint displacement method. While not a landscape per se, the authors digressed a bit to talk about cloud generation as well.
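As promised, here is the CMY/K case. This is the standard textbook formula written out in QuickBASIC for flavor, not the book’s own listing (assume normalized channel values between 0 and 1):

' Convert RGB to CMYK: complement each channel, then pull the
' common gray component out as K (simple undercolor removal).
SUB RGBtoCMYK (r!, g!, b!, c!, m!, y!, k!)
    c! = 1! - r!: m! = 1! - g!: y! = 1! - b!
    k! = c!                          ' K is the smallest of C, M, Y
    IF m! < k! THEN k! = m!
    IF y! < k! THEN k! = y!
    c! = c! - k!: m! = m! - k!: y! = y! - k!
END SUB

Going the other way is just as short (R = 1 - C - K, and so on), which is presumably why the authors didn’t bother printing it.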

The book finally managed to get around to the reason I bought it in the first place many years ago, which was the all-too-brief chapter on DIP techniques. It quickly introduced and provided code for algorithms like the Laplace filter, as well as popular effects like emboss, blur, diffuse, and interpolation. The treatment was very light, so the reader will not walk away with a solid understanding of any of the example code, other than trivial effects like pixelate or crystallize.
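For a sense of scale, effects like emboss and blur are both just 3×3 convolution passes over the image. The sketch below is my own minimal version operating on a grayscale array, not the book’s listing:

' Apply a 3x3 convolution kernel kern!() to a grayscale image img%()
' (values 0-255), writing the result into out%(). Edge pixels are skipped.
SUB Convolve3x3 (img%(), out%(), kern!(), w%, h%, bias%)
    FOR y% = 1 TO h% - 2
        FOR x% = 1 TO w% - 2
            acc! = 0
            FOR ky% = -1 TO 1
                FOR kx% = -1 TO 1
                    acc! = acc! + kern!(ky% + 1, kx% + 1) * img%(x% + kx%, y% + ky%)
                NEXT kx%
            NEXT ky%
            v% = INT(acc!) + bias%   ' bias recenters signed output (128 for emboss)
            IF v% < 0 THEN v% = 0
            IF v% > 255 THEN v% = 255
            out%(x%, y%) = v%
        NEXT x%
    NEXT y%
END SUB

An emboss kernel is mostly zeros with a -1 and a +1 in opposite corners and a bias of 128; a box blur is all ninths with no bias. The difference between one effect and the next is just the nine numbers you feed in.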

Zen of Graphics Programming (2nd Edition)

September 11, 2009

There are certain books on my shelf which I will never give up. I donate many to the library, but some have a staying power I don’t often see, and I tend to covet them when I do. Michael Abrash’s aforementioned text is one of those books. It’s a technical book cover to cover; there are a lot of hairy details which are fun to read about, and you would be hard pressed to find code listings your mother could understand (unless she too is a programmer, lucky you!).

The book was written in 1996 for a whopping retail price of $62.99 (CAN). That was a lot of money for a young man entering his second year of university. Like many people, I bought the book during a time when browsing the table of contents on the Internet was simply not an option. Most of the time, I couldn’t even look inside a physical copy because my book store simply didn’t stock the books I was interested in buying. I often needed to special order these texts using a behind-the-scenes method, that is, if I was lucky enough to find a distributor in their corporate directory. I fondly remember spelling out the book titles carefully, since the nice women working behind the counter didn’t have a clue what I was talking about. Most of the time they must have thought I was speaking a foreign tongue, because they would often ask me to repeat it, or reply seconds later with a vaguely stunned expression.

What I needed to rely on was the opinions and recommendations of graphics and games programmers who posted comments through various Usenet news groups, BBS systems, and the very rare face-to-face conversation. This particular book was published by The Coriolis Group and featured a wealth of information on programming computer graphics using an Intel microprocessor and a video card sporting multiple video modes, including compatibility with the elusive and fabled Mode X.
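Mode X, for those who missed it, is essentially mode 13h with the VGA’s chain-4 addressing turned off, so all four planes of video memory become accessible. From my memory of Abrash’s description, the commonly published unchaining sequence for the 320×200 variant is only a few port writes; verify the register values against the book before using them:

' Start from ordinary mode 13h (SCREEN 13 issues the INT 10h call for us),
' then unchain it into planar addressing.
SCREEN 13
OUT &H3C4, &H4: OUT &H3C5, &H6     ' Sequencer Memory Mode: disable chain-4
OUT &H3D4, &H14: OUT &H3D5, &H0    ' CRTC Underline Location: doubleword off
OUT &H3D4, &H17: OUT &H3D5, &HE3   ' CRTC Mode Control: byte addressing

After this point QBasic’s own PSET no longer understands the screen layout; pixels are written by selecting planes through the Map Mask register, which is exactly the sort of tedium Abrash’s chapters walk you through. Mode X proper (320×240, square pixels) additionally needs new CRTC timing values.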

Being a book on graphics programming in the 1990s, it centers around producing efficient drawing code. It tackles the more basic primitives, shows how to decompose the traditional rendering algorithms into something which executes very quickly, and then progressively reviews that optimized code to squeeze even more cycles out of your overworked CPU. One of the best features of the book is that it’s mostly a book on algorithmic optimization, rather than code optimization, which is why it has staying power even today. For those interested in the pastime of code reduction, the author has written another book called the Zen of Assembly Language, which is freely available as a digital download (or was the last time I looked).

Three-dimensional computer graphics was a popular topic at the time and still is to this day, although advances in this area of computer science have rendered (pun intended) some of the earlier topics almost obsolete. Of interest to those studying the rendering pipelines used by classic games like Doom and Quake are the topics around the interpretation and optimization of BSP trees. BSP is a fancy acronym standing for Binary Space Partitioning, which is another fancy way of describing how to carve up space into optimal chunks, and then place those chunks into a data structure like a binary tree. The reason Doom ran as fast as it did was not because the developers wrote some of their modules in assembly language; it was because the BSP compilers produced an optimally pruned set of data. Less data to process meant more cycles to render those beautiful walls, ceilings, and stairs we have all come to love and adore.
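The elegant part is how small the traversal itself is. A classic painter’s-algorithm walk asks, at each node, which side of the node’s dividing plane the camera is on, then recurses into the far side first so nearer geometry paints over it. Here’s a sketch in BASIC using parallel arrays, since QuickBASIC has no pointers; the array names and the CameraSide%/DrawWalls helpers are my own invention, not id’s code:

' Assumes DIM SHARED frontKid%(), backKid%() holding child node
' indices (-1 = no child), plus two helpers:
'   FUNCTION CameraSide% (n%)  ' > 0 if the camera is in front of node n's plane
'   SUB DrawWalls (n%)         ' draws the walls lying on node n's plane
SUB DrawBSP (n%)
    IF n% = -1 THEN EXIT SUB
    IF CameraSide%(n%) > 0 THEN
        DrawBSP backKid%(n%)       ' far side first...
        DrawWalls n%               ' ...then this node's own walls...
        DrawBSP frontKid%(n%)      ' ...and the near side paints over the rest
    ELSE
        DrawBSP frontKid%(n%)
        DrawWalls n%
        DrawBSP backKid%(n%)
    END IF
END SUB

Doom actually walks the tree front to back and clips against what has already been drawn, which avoids overdraw entirely, but the recursion has the same shape.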

I find this book to be especially useful, however, when you need to return to the fundamentals. With the advent of small, portable devices, programmers working on these products have needed to return to their roots, or if they are new to the industry, to open their minds and think about the construction of 2D & 3D engines in a way they never have before. Most of the engines used by novice programmers entering the field of computer graphics will be wrappers around this functionality, abstracting the core mechanics and shielding the developer from the tedious and structural bits of the engine.

Indeed, this is probably for the best, most of the time, if productivity is what your company needs. It never hurts to return to your roots every now and then, and let your mind flourish in an area rarely experienced by most technical professionals.

Fallout 3: The Waters of Life Bug

December 7, 2008

Right around the point where you finish the Waters of Life quest, one of the waypoints in the main quest, the game crashes at different points, much to the frustration of Fallout players everywhere. I do not know of a fix for Xbox 360 players, but for PC players, the debug console may save you a lot of time and headaches. The basic strategy is to turn off the AI engine before the game crashes (you’ll need to experiment), progress to a new scene, area, or event, then enable it again. We had to do this a couple of times: once during our approach to the Jefferson Memorial (we reactivated it once we teleported to Janice Kaplinski), and again before we turned the valve to unblock the pipes (your father’s last request). To teleport to Janice we followed these steps:

  1. Press the tilde (~) and disable the game’s AI with the “tai” console command.
  2. Enter the command: player.moveto 00019FC6. This will teleport you to Janice Kaplinski.
  3. Talk to her, Dr. Li, and any other scientists, if you want.
  4. Turn the AI on again with the same “tai” command.

Good luck.

Spoiler: Fallout 3 has bugs. Sorry.

November 4, 2008

We’ve been playing Fallout 3 for a few days now and it seems to have a few problems, but nothing too major so far. Over the last week or so, I have read a number of posts and articles complaining about the stability of the game. Well, my response to these people is to just relax and try to fix the problem with the debug console – Xbox and PS3 users are out of luck, I’m afraid.

With a game of this size and complexity, you must allow for the initial round of bugs, even ones related to the main quest. Bethesda simply does not employ enough people to cover the same amount of ground as the community. As an example, how many people do you think have purchased a copy of the game since it was released? I’m lazy, but let’s just say 100,000 copies were sold. As is the case for almost every piece of software, users really do make the best testers: there are a lot of bored people in the world, and everything they do in the game is new and intriguing to them, so they tend to poke and prod at the software in ways the game development company never expected. Do you honestly think Bethesda should try to compete with that many play testers? In order to field a testing team which could compete with a group that size, they would need to charge $1000 for the game. Yeah, no thanks. I’ll live with my slightly crippled software until they come out with a patch release.

When you think about the number of things which could have gone wrong, the bugs I have seen floating around the Internet are annoying, but not outside what I would have expected from a game like Fallout 3.

QBasic

December 5, 2007

When I mention QBasic to some people, they immediately think I’m talking about QuickBASIC. The two products, however, are a little different. They were both created by Microsoft, but QuickBASIC is basically a superset of QBasic. QBasic has an interpreter and an editor built as one package; I hesitate to call it an IDE since your projects could only use one module at a time. It was also limited in the amount of memory available to the program and the amount of memory available to the editor. I experienced the latter problem only once, while creating a game involving viruses and robots (I didn’t get around to naming it); the editor just started losing lines of code I had written and was behaving erratically. Eventually, I became frustrated and moved on to other and presumably smaller projects.

QBasic made its first appearance with MS-DOS 5.0. It came with a few example programs and games. One of these games was called Nibbles. I love this simple game, even to this day. It’s a little similar to games like Centipede, although the game itself is far too simple to make a reasonable comparison. The goal for each level is to gobble up the numbers that appear in random locations. Each time one of those numbers gets consumed, your “snake” grows a little longer. You have to avoid running into the walls of the level, which get more complicated as you progress, and you must not run into yourself. As you attain higher levels, your snake becomes faster and faster. The game’s speed was never synchronized with the system clock, so if you played it on a machine made today, it would move around so quickly as to render the game unplayable.
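The fix when porting is straightforward: gate each update on the wall clock instead of running flat out. In QBasic that can be as crude as the sketch below (which ignores TIMER wrapping back to zero at midnight):

' Run the game at a fixed rate regardless of CPU speed.
CONST StepSeconds = .1               ' one snake move every tenth of a second

last! = TIMER
DO
    ' ... read input, move the snake, redraw ...
    DO WHILE TIMER - last! < StepSeconds
    LOOP                             ' burn time until the next step is due
    last! = last! + StepSeconds
LOOP UNTIL INKEY$ = CHR$(27)         ' Esc quits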

I have often thought a game like Nibbles would make an excellent project for practicing your porting skills on other platforms. It is sufficiently interesting to make the project worthwhile, and it could be adapted to play well using almost any input device. It could also be rendered using a simple text mode, just like the QBasic version, or you could enhance it in a graphics mode using imagery, vector graphics, etc.

QBasic also introduced the concept of functions and other forms of structured programming. GW-BASIC could only remember and execute your program if each line was prefixed by a number, or it would run instructions right away if you were working in immediate mode. As you added and removed lines from your code, there were commands (RENUM, for one) to renumber your lines when you ran out of room.

The concept of a function didn’t really exist in GW-BASIC; instead, it allowed you to jump to a particular line number using commands like GOTO or GOSUB. The latter was more like a function, since you could jump to a specific region of code and then return when it had finished. GW-BASIC also supported one-line DEF FN functions, which were handy for calculations. Although QBasic could still use line numbers, it encouraged named labels and procedures instead. Despite my work with Amiga Basic, I still preferred the old way, since I had been doing it for so long. It took me a while to adjust to the new program structure, so I purchased a new QBasic book after upgrading and essentially dove in head first.
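The difference is easy to show. In GW-BASIC, a reusable routine was a numbered region you jumped into, with DEF FN covering the one-liners:

10 DEF FNCUBE(X) = X * X * X
20 GOSUB 100
30 END
100 PRINT FNCUBE(3): RETURN

In QBasic, the same idea becomes a named procedure you can call from anywhere, with its own local variables:

SUB ShowCube (x)
    PRINT x * x * x
END SUB

(These fragments are mine, not from either manual, but they capture the shift in style.)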

Interrupts would become increasingly important for me in the future, but at the time I knew little about them. QBasic had no direct interface for handling interrupts, but it could trap events driven by the system’s timer:

ON TIMER(n) GOSUB MySubRoutine
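Here’s a minimal, complete example of that trap in action; the handler runs roughly once per second while the main loop goes about its business:

' Event trap: after TIMER ON, QBasic checks between statements whether
' n seconds have elapsed and, if so, GOSUBs to the handler label.
ON TIMER(1) GOSUB Tick
TIMER ON

DO
    ' the main program keeps running; the trap interrupts it once per second
LOOP UNTIL INKEY$ = CHR$(27)         ' Esc quits
END

Tick:
PRINT "tick... "; TIME$
RETURN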

Having no facility for manipulating interrupts meant there were no functions to gather information from input devices like the mouse. Despite this seemingly major failing, all was not lost. While QBasic could not handle input from the mouse directly, you could make use of a machine-code subroutine (invoked with CALL ABSOLUTE) to get the information you needed, like position and button states. You could use techniques like this to gather information from other devices as well.

QBasic also introduced me to one of the greatest time-saving features ever created: the debugger. A debugger can be a separate program or a feature within an IDE which allows you to trace through your program and examine variables and addressable data as the software executes. One of the core features of a debugger is the ability to set a break-point at a specific addressable location that corresponds to a precise line within your source code. Before debugging, I was tracing through my program by hand and using PRINT commands to dump the contents of a variable. Even today, there are professional programmers who don’t use a debugger, either by choice or for lack of one, and choose to examine how their software operates by sending information to a log file or an output stream of some sort.