Resurrected Entertainment

Archive for the 'Reflections' category

8-bit Christmas

December 27, 2014

I recently finished the book “8-bit Christmas” by Kevin Jakubowski. It was a very enjoyable read, with many similarities to my own childhood and just as many differences; it was fun to read another person’s perspective on that period. My brothers and I all wanted a Nintendo after we played one at a relative’s house during the Christmas break. The trouble, of course, was that we would have to wait until the next Christmas to get one. Luckily, I had a birthday in July!

Are we overemphasizing the importance of old games?

July 28, 2013

I visit the site 1up.com fairly regularly. I enjoy their articles much more than those on other sites, and I find they have a good perspective on retro-gaming content as well. Sometime last year, they wrote up a Top 100 Games of All Time list, which is nothing new and seems to have been done to death in the game journalism industry. Their angle was a little different, though, since they wrote a full retrospective review of each game on the list, rather than just a short synopsis of why the game was important. One of the games reviewed was, of course, Wolfenstein 3D. In the article, they explain that Wolfenstein 3D created all the basic elements of any FPS game made today, such as items being positioned in front of the player character, armour, tiered weapon systems, and enemies which require different combat strategies to finish them off. I personally love this game and some of the knock-offs too, like Blake Stone: Aliens of Gold, and I think a number of the core mechanics present in FPS games were popularized with that game. In the article, however, they imply that FPS games made today owe much to Wolf 3D and that it defined basically everything an FPS could be:

“id Software set these standards back in 1992, and since then, very few FPSes have veered from these fundamental ideas.”

This assertion, I think, overemphasizes the game’s importance. Yes, modern FPSes do share some basic elements with it, much like every good cookie has flour and sugar as ingredients, and credit should be given to those who breached the market first, as well as to the inventiveness of the individuals at the time. However, saying that today’s FPSes have veered very little since then is like saying action movies continue to have action in them, but are really just boring rip-offs of the very first action film, The Great Train Robbery of 1903. Once a genre is created, there will, by definition, be situations and themes in common with other products in that category. But to say that very little has been introduced to expand the FPS genre since then is to ignore everything that makes these games different.

Since I am a programmer by trade, and one who specializes in game science and computer graphics, I know how technically different games made today are from those made in 1992. For technical reasons alone, they should rarely (if ever) be compared to earlier titles, because to do so diminishes the achievement of today’s games. There is so much more technical effort, detail, and expense in making a game like Call of Duty than Wolfenstein 3D, even when you account for shifts in the technology gap. The latter was created by fewer than five people, and according to Wikipedia:

“The game’s development began in late 1991… and was released for DOS in May, 1992.”

There’s not a lot of sophistication that can be put into a game in six months; the folks at id Software were shooting to fill a void in the marketplace, much like Apple did with the iPod. They depended heavily on being first to market. Sure, a certain amount of inventiveness was needed for both products to be successful, and having a Nazi theme surely helped to popularize the title, but the bar tends to be much lower when you are the first to arrive. For these kinds of products, timing is just as important as any other attribute. You don’t need to be the best; in fact, various markets are filled with examples of better products, released only a couple of months later, which did not survive simply because they were not first to fill the void.

Whereas, games like Call of Duty or Mass Effect take years of development on teams of 50-150 people, and typically have features like:

  • Complex stories and nuances of game play;
  • Interesting characters and custom look and feel to main protagonists;
  • Large variety of multi-faceted equipment and combat scenarios;
  • Full orchestral music and top of the line sound effects;
  • Physics based environments and realistic interactive environments;
  • Sophisticated level of detail within very large environments;
  • AI driven characters which can act cooperatively or competitively in a variety of environments;
  • Player-created or influenceable story arcs;
  • Dynamic level construction, with structures that can be destroyed and collapsed at will;
  • Multi-player scenarios with large groups of people around team based objectives;
  • Facial animation and expression techniques which map to voice samples;

The list can go on from there, but you get the idea. Comparing one of these games to Wolfenstein 3D is like comparing a bicycle to something James Bond would drive and saying they are basically the same because they are both vehicles…

Treasure your Console

June 26, 2013

Ouya Console

There are some people who are collectors, who love to hoard and collect things of all types, and others who get rid of things as soon as they stop using them. There are still others who shoot them on sight when playing Mass Effect 2. Unfortunately, video game consoles tend to be traded, given away, or discarded more often than not as soon as the next big thing comes along. The fact is that for many of us, it is simply not practical to hold onto every piece of hardware that crosses our path, even though we may want to.

I enjoy playing games on the console hardware, but I also enjoy playing those same games through an emulator, because some of the console’s quirks can irritate me after a while. I tend to be a pragmatist when it comes to gaming nostalgia; generally speaking, if I don’t use or enjoy the console or the games available for it, then I will get rid of it. The reasons for doing this are almost always centred around clutter; I relish my hobby more if I can find and use the games I enjoy more easily. I almost never throw these systems away, although a friend of mine will attest to an unfortunate incident involving a fully functional Atari ST computer and monitor going into the trash in a moment of weakness. Thankfully, these incidents are very rare.

These bits of retro hardware are full of little issues for the modern gamer. They don’t work very well on new televisions, they sometimes emit a musty smell, their hardware eventually deteriorates to the point where certain components need to be replaced, they lack save state functionality, and they often require bulky things like cartridges in order to do something useful. Some of these problems can be addressed: the console hardware can be modified to support new output technologies like component or RCA or VGA, the failing hardware components can be replaced with higher quality modern electronics, new cartridges are often available that use flash or SD Card storage and, of course, they can be cleaned.

However, I am of the opinion that certain pieces of gaming hardware and software are more than just the sum of their parts, and are worth hanging onto because they represent something greater, even if they don’t work that well out of the storage box. We are part of a very large body of people who were brought up with iconic video game characters like Mario and Zelda in our daily lives. Many other elements of video game culture, such as the people involved, the companies behind the platforms, and the art, music, and sound effects, have permeated our society over the last five decades. Many of these systems have long and varied histories, most of which are very interesting from a cultural and technological standpoint. They are the reason you are using a smart phone right now, and they are also the reason your television has a web browser. There are numerous books published on video game history and technology, available through various on-line retailers or perhaps directly through an author’s own web site, if you desire something fun and interesting to read. For those systems you found fun to play, try hanging onto them for a while: your kids may enjoy using them, and you may certainly enjoy regaling them with interesting historical tidbits as they develop an appreciation for the devices they use.

Launch birthdays for the *Sega 32X and **Nintendo DS!

November 21, 2011

* North American launch date in 1994 for the 32X
** North American launch date in 2004 for the Nintendo DS

The Sega 32X was designed to breathe new life into the aging Mega Drive and Genesis consoles, which were being ripped apart by Nintendo’s juggernaut, the SNES. Sadly, it didn’t do so well and quickly evaporated from store shelves. Despite the poor reception, it still has a birthday, so make sure you play your favorite title tonight. It shouldn’t be too hard to pick one, given the limited selection of good titles for that platform. It’s probably for the best that we don’t mention the Sega CD either… well, despite my obvious misgivings about the system, I know it is near and dear to more than a few people. So Sega fans, on this happy day, I salute you!

The Nintendo DS needs no introduction, since many of you have an incarnation of that device sitting in your house right now. With over 149 million units sold worldwide, I think it could be considered a success. Let’s do a little math, shall we?

149,000,000 x $150 + (149,000,000 x (6 * $40)) = $58,110,000,000

That’s an average price of $150 per unit, with each unit having an average of 6 games which sell for an average of $40 each. Let’s assume that 70% of that amount went to pay for everything from manufacturing to licensing, marketing, and employee salaries. We’re still talking close to $20 billion in the clear.

So Nintendo Corporation, while you sit upon your mountain of money contemplating this special day, we the bottom dwellers of society raise our filthy hands in a formal salute!

Thomson-Davis Editor (TDE)

November 6, 2011

TDE - PE of Choice

The Thomson-Davis Editor, or TDE, was the first programmer’s editor to ever grace my hard drive. A programmer’s editor (PE) can be somewhat different from a typical text editor used for typing up README files or other user-level documents. A good PE will usually have a large assortment of features which make the job of editing source code a little easier. Ironically, these editors can be some of the most obtuse software installed on a desktop system. To use them effectively, you must memorize several obscure key combinations and commands. Once committed to memory, these commands can be very powerful, allowing you to perform several complex editing and searching operations.

As an aside, the usability goals for an editor within an IDE, such as Borland C++, seem to be the exact opposite of the goals set out by the authors of many PEs. The editor within an IDE is almost universally easy to use, while performing the same tasks within a PE requires practice and a certain degree of research. From the viewpoint of a novice looking into the dark world of a seasoned, and perhaps a little cynical, UNIX programmer, this seems a little odd: the interface of an IDE can do so many different tasks (and thus has so many opportunities to botch things up), whereas a PE tends to be targeted at one task: editing source code. Surely, with fewer features, it must be easier to craft something which is easier to use? It’s almost as if the author of a PE is trying to compensate for a lack of development features by throwing in additional editing complexity. For example, how many standard forms of regular expression syntax does your PE support? If you answered anything other than “as many as I want, since I can simply write a plug-in to support it,” then you’re not using the right editor.

I stumbled across TDE while browsing the download area for a local BBS. It included the source code for the program which was written in C, and it seemed to have a variety of interesting features; it certainly had more technical features than the QuickC editor I was using at the time. I guess I gravitated towards complexity at the time, and I quickly grew to love TDE and began modifying the source code to suit my needs. If I were to use the latest version of TDE today, I’m betting a lot of those hacked-in features would already be implemented.

The code for TDE was a great learning experience for me. It was organized fairly well, so it made for relatively easy modifications and creative hacks. I lost the source code for my modified version during The Great Hard-drive Crash of the mid-1990s. For some odd reason, I didn’t have a single back-up, which was particularly strange since most of my favourite projects were copied onto a floppy disk at some point. Frustratingly, I think I still have a working copy of the original 3.X code on disk! Anyway, the source code had excellent implementation details like fancy text and syntax handling, decorated windows, and reasonably tidy data structures. I don’t remember using any code from the editor in a future project, but I certainly took away a number of ideas. Doubly-linked lists may not seem like a big deal to me now, but I hadn’t read that many books at that point, so glimpses of real implementations using these data structures were very cool and inspiring. I found the windowing classes and structures particularly interesting, since some of the windows were used for configuration, others for editing, and some for help. This was the first abstraction of a windowing system within an application I had run across. It was beautiful, and it gave me a lot of ideas for HoundDog – a future project which I used to track the contents of recordable media like floppies and CD-ROMs, so that I could find that one file or project quickly without needing to rely on floppy labels.

The QuickBASIC and the 0xDEAD

November 2, 2011

QuickBASIC is very similar to QBasic, since the latter is just a stripped-down version of the former. QuickBASIC had a few capabilities QBasic did not: it could compile programs and libraries, it allowed for more than one module, and it could create programs which were larger in size. It also had an annoying bug where the memory management model in the interpreter was different from the one used by the compiler; this was a problem when your program worked in the debugger, which used the interpreter, but not when the program was actually compiled and run from the shell.

QBasic shipped with MS-DOS 5 and consisted of only an interpreter, meaning it translated small chunks of code into machine code as they were executed (I believe it did cache the translated portions, so it wouldn’t need to translate them again). QuickBASIC was more advanced and had a compiler as well as an interpreter, which allowed it to translate and optimize the machine code it generated before you ran it. It had only one dependency during compilation, and that was the QB.LIB library. Using the QuickBASIC IDE or the command line, you could compile multiple modules into a single target which could be a library or an executable program.

QuickBASIC was handy for prototyping and demos. I didn’t need to do many of these for my own projects; although, I did a lot of experimentation with network interrupts before porting those routines and programs to C. QuickBASIC was mainly used for projects and demos at school and eventually college; many of them were also written in C due to a few course requirements.

While in college, I wrote a chat application for DOS that worked over a local area network, before the concept became popular. It used the SPX protocol for some parts and the IPX protocol for others. It supported only peer-to-peer communication, and only with one other person, but it served as a demo for simple network communication, and many of the students used it quite regularly.

In other programs, I was using Novell Netware interrupts for communication, broadcasts, and client machine discovery. I loved it and found coding these applications fun and exciting; my Ralph Brown interrupt list was well used during this time. I think my interest in programming these network applications stemmed from my days playing DOOM over similar networks. Basically, I just loved the potential for interactivity, which is a tad ironic since I was a fairly quiet programmer back then. A few of my friends became interested in the software I was writing, and one day we decided to play a few practical jokes on my fellow students. We created a custom program for sending messages via the Netware API. These messages could be broadcast to every machine on the network, to a group of machines, or to a specific machine. When a machine received one of these messages, it would display a dialog showing the user the message. It’s a pretty simple concept, but many people were not aware their machines could even do that, or what it meant to be on a network.

Simply using the network software which came with Netware wasn’t an option, since the message dialogs produced by those tools contained the name or address of the machine from which the message originated. According to Steve Wozniak, co-founder of Apple Computer and well-known prankster, the most important element of any gag is not getting caught. We needed to customize the message being sent so that it included only the details we wanted to send, and the only way to do that was to write a custom program. This was fairly easy, since I had been writing software on Netware for several months; when the packets were sent off through the network, our machine name was carefully omitted and our address forged.

We used the program to send funny or confusing messages to some of the students. No profanity or crude humour, mind you… well, nothing too crude anyway. My goal was always to get the user to believe what the machine was telling them. I convinced one student her computer didn’t like the way she typed; her keystrokes were always too hard or too fast for the computer’s liking. She actually phoned the administrator’s office and asked them for a computer which didn’t complain so much! Several computers were being used by students to view pornography, so I did my best to make those users feel uncomfortable in a public setting. Many of them believed they were being watched by the network administrators, which could have been true (although network monitoring software was generally never used, or not available). Anyway, these people quickly shuffled out of the room red-faced, hoping not to get caught on their way out. I still get a giggle out of it even now, when I think about it.

In a way, I am kind of saddened by the complexity of today’s operating systems. Writing the same software on a modern machine would be far more difficult, mostly because of new operating system features and application stacks. Networks just aren’t managed in the same way anymore, so a programmer’s ability to exploit one so directly has disappeared. That being said, I don’t particularly disapprove; it’s just not as easy to have a little fun.

Microsoft Quick C Compiler

December 21, 2010

Quick C Compiler

When I first came in contact with this compiler, I was just starting high school and eager for the challenges ahead (except for the material which didn’t interest me — basically the non-science courses). When I went to pick my courses for the year, I noticed a couple which taught computer programming. The first course, which was a pre-requisite for the second, taught BASIC, while the second course taught C programming. At this point in my life, I was an old hand at BASIC, so I breezed through the first course. The second course intrigued me much more. I was familiar with C programming from my relatively brief experience with the Amiga, but I had a lot left to learn. My high school didn’t use the Lattice C compiler, but a Microsoft C compiler instead. I located the gentleman who taught the course, and he pointed me to a book called Microsoft C Programming for the PC, written by Robert LaFore, and to the Microsoft QuickC compiler software. I had a job delivering newspapers at the time, so I could just barely afford the book using salary and tips saved from two weeks of hard time ($50 at the time), but the compiler was just too expensive. So I did what any highly effective teenager would do: I dropped really big hints around the house (including the location and price of the compiler package I wanted) until my parents purchased a copy for me on my birthday.

There are a number of differences between the BASIC and C programming languages. One of the more obscure ones lies in how C deals with special variables that can hold memory addresses. These variables are called pointers, and they are an integral part of the syntax and functionality of the language. BASIC did have a few special functions which could accept addresses of locations in memory – I’m thinking of CALL and USR specifically, although there were others. However, a variable holding an address was the same as one holding any other number, since BASIC lacked a distinct pointer type. The grammar of the C language is also much more complex than BASIC’s; it has special characters and symbols to express program scope and perform unary operations, which introduced me to the concept of coding style. When a programmer first learns a particular style of coding, it can turn into a religion, but I hadn’t been exposed to the language long enough to form an opinion. That would come later, and then be summarily discarded once I had more experience.

There were libraries of all sorts which provided functionality for working with strings, math functions, standard input and output, file functions, and so on. At the time, I thought C’s handling of strings (character data) was incredibly obtuse. Basically, I thought the need to manage memory was a complete nuisance. BASIC never required me to free strings after I had declared them, it just took care of it for me under the hood. Despite the coddling I received, I was familiar with the concept of array allocations since even BASIC had the DIM command which dimensioned array containers; re-allocation was also somewhat familiar because of REDIM. However, there were many more functions and parameters in C related to memory management, and I just thought the whole bloody thing was a real mess. The differences between heap and stack memory confused me for a while.

There were many features of the language and compiler I did enjoy, of course. Smaller and snappier programs were a huge improvement over the somewhat sluggish software produced by the QuickBASIC compiler and the BASIC interpreter. The compiled C programs didn’t depend on any run-time libraries either, even though there was probably a way to statically link the QuickBASIC modules together. Pointers were powerful and loads of fun to use, especially once I learned the addresses for video memory, which introduced me to concepts like double buffering when I began learning about animation. Writing directly to video memory sounds pretty trivial to me right now, but it was intoxicating at the time. I was more involved in game programming by then, and these techniques allowed me to expand into areas I had never considered. They allowed for flicker-free animation, lightning-fast ASCII/ANSI window rendering via my custom text windowing library, and special off-screen manipulations that let me easily zip buffers around on the screen. A number of interesting text rendering concepts came from a book entitled Teach Yourself Advanced C in 21 Days by Bradley L. Jones, which is still worth reading to this day.

At around this time, I also started to learn about serial and network communications; the latter didn’t happen until my last year of high school. Basically, I wanted to learn how to get my computers to talk to one another. It all started when I became enchanted by the id Software game DOOM, which allowed you to network a few machines together and play against each other in vicious, winner-takes-all death-match combat. Incidentally, games like Doom, Wolfenstein 3D, and Blake Stone: Aliens of Gold led me down another long, winding path: 3D graphics, but that didn’t happen until a few months later. Again, the book store came to the rescue by providing me with a book entitled C Programmer’s Guide to Serial Communications by Joe Campbell. I was somewhat familiar with programming simple software which could use a modem for communication, since BASIC supported this functionality through the OPEN statement, but I knew very little about the specifics. Once I dug into the first few chapters, I knew that was all going to change.

NetFlix, Stream Movies for $57.99 per Month (Canada only)

November 6, 2010

NetFlix Logo

So, we decided to give NetFlix a try a few weeks ago, just to get a feel for the experience; the platform of choice was our PS3. We were asking ourselves whether this was a technology we would want to use to rent films that weren’t on our “must see” or “must own” list. I like to watch movies — a lot. After a long day at work, sometimes I am just mentally exhausted, which compels me to find sources of entertainment which are either completely different from what I was working on during the day, or a more passive form of recreation that doesn’t involve too much thought. I find movies a great way to relax, and so I often partake in movie rentals at our local video store, sometimes as many as three or four times per week if I’m having a busy time at work. Most of the movies I rent tend to be fairly entertaining, but they are generally not movies I would prefer to own, so the idea of streaming is very appealing to me, since it eliminates the need to visit the mall. However, I prefer to think of the streaming business as a complement to existing movie distribution networks and not as a replacement.

I am living in an area now which is a tad on the remote side, so finding a decent video rental store is next to impossible, and the store I did find only carries a very small number of titles, most of them still on VHS cassettes. When my wife first talked to me about NetFlix, I must admit I was fairly reserved about the idea. I was mostly concerned about movie selection, but movie quality was also high on the list. After using it for a little less than a month, I must give the service 5 / 5 stars. I did see some minor quality problems in some movies, and the service had a bandwidth hiccup or two (probably not their fault), but overall I am very pleased with the video and audio quality and the movie selection process. The NetFlix application on my PS3 is very easy to use, with snappy response times and a rating system tuned to your personal tastes. The rating system does take time to learn what you like to watch, so it’s important to rate a film after you watch it.

The service costs $7.99 per month, with the first month being free at the time of this writing. A standard-definition movie which runs for 2 hours consumes approximately 1.8 GB of data, while a high-definition movie of the same length chews through approximately 3.0 GB. I don’t know what the consumer breakdown is for Rogers’ service plans, but one of their plans gives you a 60 GB transfer limit (upload + download) per month. Overage costs vary with your plan; ours costs us $2 for every GB over the 60 GB limit, up to a maximum of $50 per month. We live in a 5 member house, and four of us work in the technology sector, which means our tendency for data consumption is usually higher than most families’, except those families with one or more tech-savvy teenagers doing strange things in their basement. What this usually means is that we tend to consume more than half our available transfer limit just doing our day-to-day activities for work or recreation, which leaves less than 30 GB of wiggle room. That leaves us with the capacity for watching approximately 11 standard-definition movies, or 5 high-definition ones, without getting charged extra. Since I prefer to watch some types of films in high definition, my film roster is looking fairly small for an entire month, so naturally I’m going to enter into overage costs, rent more movies, or just do without.

I personally know a few families who use NetFlix mostly to provide entertainment for their children. I wonder how often they run into overage costs, since kids love to watch movies repeatedly, and whether the service saves them money compared to renting or buying those videos. For those types of users, it would be nice if NetFlix offered a download-and-record option so that movies do not need to be repeatedly streamed. Those cached movies could have an expiration date based on the last time the movie was viewed (and not on when it was downloaded), since NetFlix supports the policy of watching a movie as many times as you like. As usual, it would also behoove Rogers to offer more competitive Internet service plans (compared to the plans available in the USA), and it would behoove the CRTC to step out of the picture and stop meddling with what should be a privately run business sector. But now I’m really dreaming.

Digital Download Dilemma

October 29, 2010

I was reading an article on slashgamer.net saying that the PS4 was not going to support digital-download-only distribution for games, and the writer was almost chastising the company for not embracing the future. Why, exactly, would we strive for this? I do not want an external server to hold the one and only back-up copy of the software I buy — if their service goes down for any reason (and it will go down permanently at some point), who do you think is going to be left holding the bag? Not to mention the Internet service provider problem. While a lot of us in North America have high-speed connections, many of us do not have “unlimited” download contracts (especially Canadians), and for those who do have them, download rates tend to be somewhat atrocious when you consider that the content you want to download is several gigabytes in size. In some cases, you will start your download today and maybe play your game tomorrow, assuming your connection stays healthy.

If you take the angle that there will be less packaging, and that production costs will therefore come down and publishers will offer games at a lower price, then you may want to think twice. Consider iTunes, for example: the music they offer is pretty much on par with the retail price of a physical compact disc, if you multiply the price you pay per song by an average number of songs per album (12-14). Less packaging may not translate into a cheaper price for the consumer, but it certainly does translate into larger margins for the publisher.

With collector’s editions becoming more popular, I was hoping that a trend towards better and more interesting packaging would be upon us. Think Baldur’s Gate, Space Quest, or the Ultima games. Beautiful works of art and a real pleasure to own.

DOOM II: Let the obsession begin. Again.

December 23, 2008

[Taken from the back of the box]

The wait is over. In your hot little hands, you hold the biggest, baddest Doom ever – DOOM II: Hell on Earth! This time, the entire forces of the netherworld have overrun Earth. To save her, you must descend into the stygian depths of Hell itself!

Battle mightier, nastier, deadlier demons and monsters. Use more powerful weapons. Survive more mind-blowing explosions and more of the bloodiest, fiercest, most awesome blastfest ever!

The 3-D modeling and texture bit-mapping technologies push the envelope out to the max. The graphics, animation, sound effects and gameplay are so virtually realistic, they’re unbelievable!

Play DOOM II solo, with two people over a modem, or with up to four players over a LAN (supporting IPX protocol). No matter which way you choose, get ready for adrenaline-pumping, action-packed excitement that’s sure to give your heart a real workout.

It must be tough to keep reading – what with your hands trembling in sheer anticipation. So we’ll stop talking now to let you take this box to the counter.

DOOM II. It’s time to get obsessed again.