Archive for December, 2008
DOOM II: Let the obsession begin. Again.
December 23, 2008
[Taken from the back of the box]
The wait is over. In your hot little hands, you hold the biggest, baddest Doom ever – DOOM II: Hell on Earth! This time, the entire forces of the netherworld have overrun Earth. To save her, you must descend into the stygian depths of Hell itself!
Battle mightier, nastier, deadlier demons and monsters. Use more powerful weapons. Survive more mind-blowing explosions and more of the bloodiest, fiercest, most awesome blastfest ever!
The 3-D modeling and texture bit-mapping technologies push the envelope out to the max. The graphics, animation, sound effects and gameplay are so virtually realistic, they’re unbelievable!
Play DOOM II solo, with two people over a modem, or with up to four players over a LAN (supporting IPX protocol). No matter which way you choose, get ready for adrenaline-pumping, action-packed excitement that’s sure to give your heart a real workout.
It must be tough to keep reading – what with your hands trembling in sheer anticipation. So we’ll stop talking now to let you take this box to the counter.
DOOM II. It’s time to get obsessed again.
Categories: DOS, Reflections
No Comments »
Computer Virus Research
December 21, 2008
As part of being a well-rounded programmer, I dabble in all sorts of technical things. One of my areas of interest is computer virus research. In the last thirty years, I have witnessed a large number of changes to this industry, and I find myself compelled to write a little bit about it today after reading about a couple of courses offered at the University of Calgary.
As it exists today, computer virus defense is a wide collection of software programs and support networks offered to companies and users for the sole purpose of protecting their data from loss, damage, or theft caused by a myriad of small computer programs called computer viruses. These programs must have the ability to replicate (producing either an exact copy of themselves or an enhanced version), and they often carry a payload. The means by which a computer virus can replicate are complicated and often depend on details of the operating system. In addition to preventing virus outbreaks, anti-virus software is also used to help prevent service outages and ensure a general level of stability. In other words, these vendors are selling security, or at least one form of it, since security in general is a very large net which cannot be cast by only one program. As an aside, be aware of the tools you are using for anti-virus protection; with some research and a little education, it’s often not necessary to purchase these programs in the first place.
I am currently reading Peter Szor’s book entitled The Art of Computer Virus Research and Defense (ISBN-10: 0321304543). I have almost finished the text, and I have found the book incredibly informative: it is filled with illustrations and summaries of all sorts of computer virus deployment scenarios, technical information about individual strains, and historical notes on how the programs evolved and the mistakes made by both researchers and virus writers.
Even though I have the skills and the opportunity to do so, I have never written a computer virus for the purpose of deployment, nor do I ever wish to, but I can tell you that writing an original computer virus is challenging work; writing a simple virus is easy. Isolating, debugging, and analyzing a virus is also interesting work, albeit somewhat more tedious. Both jobs require similar skill sets: detailed knowledge of, and low-level access to, a specific system.
I used to posit that the best virus writers would be the people who have taken it upon themselves to write the anti-virus software. After all, the best way to ensure the success of a business built on computer virus defense is to construct viruses that can be easily and quickly disarmed by your software. Much to the disappointment of conspiracy theorists, this is probably not the case, since fellow researchers would easily link a premature inoculation to a subsequent virus outbreak if it happened too often to be mere coincidence. However, if your business were based on quick and successful virus resolutions, then timely outbreaks followed by timely cures would seem to solidify the business model. Personally, I think anti-virus researchers are kept busy enough with “naturally” occurring strains that the industry needs no manual jump start. That could change as users and technology platforms become more advanced, although the more probable route is the disappearance of the anti-virus industry altogether; then again, we live in a messy world, and there may be opportunities for those wanting to leave their mark, even in the face of futuristic technology gambits.
Computer virus writers are plagued, somewhat ironically, by numerous problems in deploying their masterpieces. A computer virus can be written generically so that it spreads to a wider variety of hosts, or it can be written for a specific environment, with requirements on the hardware or software being used. Dependencies on software libraries, operating system components, hardware drivers, and even specific types of hard disks are both liabilities and advantages for a virus. They are liabilities because dependencies limit the scope of infection, so the virus spreads more slowly; at the same time, they often enable the virus to replicate in the first place, since the virus may be exploiting known vulnerabilities or opportunities within these pieces to deliver its payload or as a means to spread.
Virus research, writing, and defense is a fascinating topic. Unfortunately, I find the pomposity, and to some degree the absurdity, in various branches of the industry to be laughable and a little scary at times. In case you haven’t heard, the University of Calgary is offering a course on computer virus research. While I find this to be a refreshing take on education, my hopes are quickly dashed when I read the requirements and the Course Lab Layout (warning: PDF monster). Do they think their students are secret agents working in a top secret laboratory? Of course they do; why else would there be security cameras installed in the room, and why else would access to the course syllabus be restricted? Well, I’ve got news for the committee who approved the layout of the lab, and who probably approves the students who can attend the course: computer viruses are just pieces of software. That’s right, they’re just software. They don’t have artificially intelligent brains, they can’t get into your computer through the power lines, and they are quite a bit less complicated than your average word processor. This means that any programmer with the desire and a development environment can write a virus, trojan, or any other form of malware. They don’t need to take your course and they don’t need access to your Big Brother Lab.
The absurdity of protecting information which is already publicly available, and has been for decades, makes me want to laugh out loud and strangle someone at the same time. It’s rather disturbing, and I really don’t like the idea of closing doors on knowledge, even if the attempt is futile. The University of Calgary’s computer science department should be ashamed of perpetuating such ignorance within a learning institution, and I am truly disappointed by how bureaucratic such systems have become.
Update 12-29-2008: To respond to a verbal conversation I had with a couple of people: I understand why the university placed the security restrictions on the program; they want to validate the program and make it appear legitimate to the community and their peers. That’s fine, but at the same time, it must be acknowledged that the secret to mounting a successful defense against viral software and Internet-based attacks is shared knowledge and open avenues for information. Understandably, this information will flow both ways, but the virus writer will gain nothing they do not already possess (except the knowledge that we know what they are doing), while the general public may be a little more aware of the problem than they would be without it.
Indeed, using viral kits and small customization programs can make viral programming easy for the layman or immature programmer, but we shouldn’t be locking away information about these techniques or programming practices simply because the result is something undesirable or easy to dispense. There are real opportunities to learn and disseminate this knowledge today, and the bigger the audience, the larger the opportunities for successful anti-viral software and general consumer awareness which will combine to create the most effective vaccine of all: knowledge.
Categories: Programming, Reflections
No Comments »
Bionic Commando Rearmed
I started playing this game last week, though it had been sitting on my Xbox 360 for about a month before I had a chance to get to it (my wife and I had been deep into Fallout 3 and had little time for anything else). I have to say that I am impressed so far. The fine people at Capcom and GRIN have put their collective heads together and created a new experience around a classic title.
The training rooms are perhaps the most significant departure from the original. While the new visual look and great audio soundtracks do not change the gameplay experience, the training exercises are designed to enhance the game with a public ranking system and add a competitive edge for those who would not be completely satisfied by Bionic Commando’s mission based levels.
For those who like item and monster records in their games, and I am one of those people, this game adds them to a genre which has traditionally shied away from such features. GRIN has also revised how you hack the enemies’ communication systems. Instead of simply selecting the “hack” option and then waiting to see if you have been detected, Rearmed starts a mini-game where you need to direct a ball to various target spaces in a cubic grid.
However, the most important thing the GRIN team has done is ensure that it still feels like Bionic Commando. I hope you give it a try and let me know what you think about the remake.
Categories: Games, Retro, Xbox 360
No Comments »
Core Memory
December 19, 2008
Just finished another classic computing book entitled Core Memory: A Visual Survey of Vintage Computers (ISBN: 0811854426), written by John Alderman and photographed by Mark Richards. While I enjoyed the photography in this book much more than in Digital Retro, I found both the context of the photographs and the accompanying text somewhat lacking in substance. The photographs are certainly colorful, but I do not find so many pictures of wire bundles and wiring trunks especially interesting. While it is interesting to see the wiring complexity of one or two of these machines, it would have been more intriguing to see various parts of the machines, and to have those parts labeled.
I have not had the pleasure of using one of these machines, let alone putting one together. I can readily identify electronic components, but without context for the photograph, it’s just a jumble of wires or components with no discernible purpose, however pretty they may appear on camera. The book would also have been a great opportunity to provide more technical details on each machine: sample machine code or instruction sets, screenshots of running software (assuming the machine could even be turned on), and so forth. It would have been fascinating to have a detailed list of primary technical components and their functions for each machine. Since some of these machines occupied so much territory, it would also have been informative to include a layout diagram of a typical installation. Don’t get me wrong, I enjoyed this book and all of the machines are photographed very well, but I think it would have been better to have smaller spreads and more annotated pictures.
Categories: Books, Retro
No Comments »
Digital Retro
December 13, 2008
I just finished reading the book Digital Retro: The Evolution and Design of the Personal Computer, written by Gordon Laing (ISBN-10: 078214330X). Each machine in the book gets a four-page spread showing different angles of the various pieces making up the computer. I must admit that I did not find the images particularly engaging for the most part. They were excellent photographs, no doubt about it, but I don’t think showing the back/sides/top of a monitor and keyboard is the best way to go about creating a visual history that will resonate with your intended audience. I would have been more engaged reading about the various quirks, poring over screenshots of popular software titles and the operating systems of choice, seeing working code from the most popular programming languages for each platform, and even getting a close look at the internals (for the inquisitive types who weren’t afraid to void their warranties).
Pictures of the units themselves abound, but most are of poor quality, so a well-lit photograph goes a long way toward documenting what these computers looked like. However, once you have taken a photo of the front and back, there really isn’t much left of the exterior that is interesting (not counting the few machines which actually took advantage of the third dimension and had features on the sides of the unit). With all of the pictures, there was little room left for text, and the text contained little more than a summary one could pull off a Wikipedia page. To be fair, the summaries are concise and some columns are filled with interesting tidbits. My favorite is on the last page, within the section which documents the NeXT Cube. The page is almost completely filled by the monitor for the system, but there is an excellent piece of information which describes how Steve Jobs acquired the computer graphics division of Lucasfilm, the group that became Pixar. Basically, George Lucas was in a bind after an expensive divorce and needed to raise about $30 million in capital. After a few failed deals with other buyers, Lucas eventually accepted Steve’s low-ball figure of $10 million. Needless to say, the purchase eventually paid huge dividends: the studio released the enormously successful film Toy Story, which grossed over $362 million worldwide, and produced a number of large films after it. Top all of that off with a lucrative IPO, and Mr. Jobs was sitting on a mountain of money which probably has its own zip code.
I find the book to be an excellent coffee table piece, and the columns do make for a quick and easy reference, but I would have enjoyed it a lot more if it had gone the extra mile and shown me things most books never touch upon.
Categories: Books, Retro
No Comments »
Qt for Games – Niblet #5
December 12, 2008
I spent some time trying to place a QGLWidget into a QGraphicsScene. Apparently, this is not a supported use of the QGraphicsScene class. One Trolltech engineer suggested creating a viewport using a QGLWidget instead of the typical QWidget (the default). Neither approach worked, and both ended up hanging the program on my Mac. There goes option number two for OpenGL sprites in the game.
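For the record, here is roughly what the suggestion boils down to. This is a minimal Qt 4 sketch rather than my actual Niblet code, but a program shaped like this is where the hang showed up for me:

    #include <QApplication>
    #include <QGraphicsScene>
    #include <QGraphicsView>
    #include <QtOpenGL/QGLWidget>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        QGraphicsScene scene;
        scene.addText("Hello from the scene");

        QGraphicsView view(&scene);
        // The suggestion: swap the view's default QWidget viewport for a
        // QGLWidget so the whole scene is composited through OpenGL.
        view.setViewport(new QGLWidget(QGLFormat(QGL::SampleBuffers)));
        view.show();

        return app.exec();
    }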
Categories: Game Development
No Comments »
Before I began my investigation into the suitability of Qt for games, I wanted to use Qt’s new QGraphicsItem framework alongside the OpenGL support. I was curious to see if an OpenGL object could be wrapped by a QGraphicsItem and rendered into the same view. The short answer is yes, but the result is not very pretty or practical. Basically, it comes down to a fundamental issue with OpenGL sharing rendering contexts with other engines. OpenGL doesn’t allow this by design, but you could use a QGLWidget and encapsulate that in a QGraphicsItem implementation. The result is very heavyweight, not particularly elegant, and doesn’t easily support animation of the OpenGL shape. With the latest version of the Qt API, you can render a QWidget object into an OpenGL rendering context, but not the other way around (without using a container like QGLWidget).
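For the curious, Qt 4.4’s QGraphicsScene::addWidget() is the stock way to express that encapsulation; it wraps the widget in a QGraphicsProxyWidget, and a hand-rolled QGraphicsItem amounts to the same idea. A minimal sketch, not something I would ship:

    #include <QApplication>
    #include <QGraphicsProxyWidget>
    #include <QGraphicsScene>
    #include <QGraphicsView>
    #include <QtOpenGL/QGLWidget>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        QGraphicsScene scene;

        // Wrap a GL widget in a proxy item so it lives inside the scene.
        // Every embedded QGLWidget drags in its own rendering context,
        // which is exactly what makes this approach so heavyweight.
        QGLWidget *gl = new QGLWidget;
        QGraphicsProxyWidget *proxy = scene.addWidget(gl);
        proxy->setPos(10, 10);

        QGraphicsView view(&scene);
        view.show();

        return app.exec();
    }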
This is fine and not detrimental to the project, so we’ll continue using the new graphics framework minus the features relating to OpenGL. This still gives us plenty of options for rendering, but 3D will be set aside for now (unless I get a spark of genius). We may revisit it later to explore different ideas, like the intro screen or credits; however, we will need to find out how well Qt supports direct pixel rendering. I am very familiar with the various image classes, but whether they can be used in an efficient rendering pipeline is still an unknown.
I’m trying to be realistic here. Niblet is not an overly complex idea, but I would like to use compositing for a variety of effects, and some of them, like fire, would need to be rendered using fast pixel access.
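To make the pixel-access question concrete, here is the shape of the test I have in mind. This is only a sketch; the renderNoise function and its red-channel noise are stand-ins for a real effect like fire:

    #include <QImage>
    #include <QPainter>
    #include <QtGlobal>

    // Fill an off-screen buffer one pixel at a time, then blit it with
    // QPainter. If this is fast enough per frame, effects like fire are
    // feasible without OpenGL.
    void renderNoise(QPainter *painter, int w, int h)
    {
        QImage buffer(w, h, QImage::Format_ARGB32);

        for (int y = 0; y < h; ++y) {
            // scanLine() exposes one row of raw ARGB pixel data.
            QRgb *row = reinterpret_cast<QRgb *>(buffer.scanLine(y));
            for (int x = 0; x < w; ++x)
                row[x] = qRgb(qrand() % 256, 0, 0); // red noise placeholder
        }

        painter->drawImage(0, 0, buffer);
    }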
Categories: Game Development
2 Comments »
Fallout 3: The Waters of Life Bug
December 7, 2008
Right around the point where you finish the Waters of Life quest, one of the waypoints in the main quest, the game crashes at various points, much to the frustration of Fallout players everywhere. I do not know of a fix for Xbox 360 players, but for PC players, the debug console may save you a lot of time and headaches. The basic strategy is to turn off the AI engine before the game crashes (you’ll need to experiment), progress to a new scene, area, or event, then enable it again. We had to do this a couple of times: once during our approach to the Jefferson Memorial (we reactivated it once we teleported to Janice Kaplinski), and again before we turned the valve to unblock the pipes (your father’s last request). To teleport to Janice we followed these steps:
- Press the tilde key (~) to open the console, then disable the game’s AI with the “tai” command.
- Enter the command: player.moveto 00019FC6. This will teleport you to Janice Kaplinski.
- Talk to her, Dr. Li, and any other scientists, if you want.
- Turn the AI on again with the same “tai” command.
Categories: Games, PC
1 Comment »