Resurrected Entertainment

Archive for the 'Books' category

Book Review

May 8, 2023

Let’s talk about this book written by Bob Flanders and Michael Holmes in 1993 entitled “C LAB NOTES”.

The title hints that this may be a book about programming in the C language. Spoiler: it is not a book about how to program in C, even though it uses the C language. Instead, it provides several interesting examples of how to tackle real-world problems (at least, problems that existed in 1993). Here is a rundown of the topics explored:

  1. Setting the system clock via modem and the United States Naval Observatory’s cesium clock.
  2. Collecting program statistics on interrupts used, video modes, and disk access.
  3. Running programs on other nodes via Novell NetWare; sending messages is also explored.
  4. Interacting with laser printers.
  5. Phone dialing using a modem.
  6. Synchronizing directory contents.
  7. Automatically retrieving the latest files from multiple directories.
  8. Analyzing disk structure, such as clusters and sectors.
  9. Managing your appointments with a custom calendar.

This was such a fascinating book back in the day. I had limited income to spend on expensive computer programming texts, and many of the programming books available in my local bookstore tended to focus on abstract problems that served to introduce foundational concepts. It was so refreshing to see these sorts of problems being explored. Even today, so much literature is written around how a specific technology works, but so few books are written where the author simply assumes you know something about the technologies being used. If authors used those assumptions constructively, they could dive straight into interesting problems. I would imagine it might be more difficult to find a publisher who would back a book like this today; they might look at the material and say it would be too hard for most readers to understand, and therefore would not sell particularly well. They would probably be right on both counts.

In any case, I think we can all breathe a collective sigh of relief that advanced books like these are being quickly phased out and replaced by much more mainstream titles. After all, once a programmer knows how to sort a linked list or draw a simple scene in OpenGL or Unity, there is really nothing else worth exploring.

Making Chemistry Fun

March 23, 2013

There once was a time when chemistry was interesting, then it became boring, then it turned interesting again. This pretty much follows my experience with the subject, which can be mapped accurately to something like this:

1. Received first chemistry set (interesting).
2. Took chemistry in high school (boring).
3. Reading about chemistry independently (interesting).

I have read about the experiences others have had with chemistry in high school, and I am extremely envious. Sure, they learned the typical academic formulations, like chemical equations and how to determine the molar concentration of a solution, but they also did experiments that were interesting and showed off the true nature of the periodic elements. I think the most interesting experiment I did in high school was determining the acidity of a solution. Was it an acid or a base? Hmm. Well, that’s all very interesting the first time you do it, but it’s not interesting enough to talk about acids vs. bases for a week, then watch a demonstration of how to measure acidity, then participate in a lab about measuring acidity. There’s just not enough beef there to sustain my curiosity, and the whole subject of acids and bases could have been wrapped up in a couple of days at most.

Meanwhile, my counterparts in other schools around the globe got to connect with chemistry on an entirely different level. They got to participate in experiments which were incredibly interesting and practical. Learning the foundations of chemistry is more than simply memorizing the periodic table and studying chemical equations; it’s about connecting everyday chemistry with the elements that make up our world. It’s about seeing how chlorine gas reacts with pretty much everything, learning about the beauty and misunderstood dangers of mercury, or learning how tin foil is made. There are so many interesting facets of chemistry, and I had to wait years before learning about them. Because of this void, I think there are four books every high school chemistry teacher should read before attempting to teach a class:

1. The Golden Book of Chemistry Experiments

This book is out of print and fetches a pretty mean price tag on Amazon or eBay, but you can get the PDF version for free from the author, and since the copyright was not renewed the book is now in the public domain, so you can get a print version if you like.

2. Theo Gray’s Mad Science: 51 Experiments You Can Do At Home–But Probably Shouldn’t
3. The Elements: A Visual Exploration of Every Known Atom in the Universe
4. The Disappearing Spoon: And Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements

Learning chemistry in high school should be about the information presented in these books and others of their ilk. It should be about capturing the minds of your students so that when the formality of theory is presented, they have an engaged and enlightened perspective from which to pursue it.

The Revolutionary Guide to Bitmapped Graphics

December 29, 2009

This is another book from my library that I have decided to look back on, to see if there are any tidbits still useful to programmers today. As with most technical books more than ten years old, there is usually an abundance of information about specific technologies which are no longer in popular use, or which are still present in one form or another but whose means of access have changed dramatically. I personally believe that many of these books can give the novice programmer a background not taught in universities and colleges, and will certainly give them an edge when working on limited or older machines.

The book does talk about the video hardware of that time period and delves deep into the programmatic underpinnings of accessing the display and creating custom video modes. I found some of the discussions noteworthy, but if you really want a thorough explanation, you may want to investigate the Zen of Graphics Programming or the Graphics Programming Black Book. It also includes a short assembly language primer, which is very typical for these books, since many of the routines were coded in that language. The introduction is short but may be a nice refresher for those who haven’t gotten their hands dirty in a couple of years.

I’ve made a list of what is still useful for work you may be doing today – unless you’re one of the lucky few who get to maintain software written in 1994. Your mileage will vary, as some of the techniques are really just short introductions to a much larger field, like digital image processing (DIP) and morphing. The book even had a short introduction to 3D graphics, which seemed to be slapped on at the end because the publisher wanted “something on 3D” so they could put it on the cover.

  • It provided color space introductions, conventions, and conversions for the following spaces: CIE, CMY, CMYK, HSV, HLS, YIQ, and RGB. Most of the conversions go both ways (to and from RGB space), although CMY/K conversion calculations are only provided from RGB space.
  • Dithering and half-toning, followed by a chapter on printing. I think the authors mentioned Floyd-Steinberg in there somewhere, but it wasn’t a full discussion.
  • Fading in the YIQ and HLS color spaces. I’m not sure why they didn’t provide one for the RGB space, but it could very well be on the bundled CD-ROM.
  • It introduces the reader to a few algorithms for primitive shape drawing and clipping, like Bresenham line drawing and Sutherland-Cohen clipping. It also included discussions and examples for ellipses, filled polygons, and b-spline curves.
  • Extensive discussions of graphics file formats for GIF, JPEG, TGA, PCX, and DIB, although these tended to be higher-level than what would have been useful for someone implementing a decoder for any one of these formats (with the possible exception of PCX). Associated algorithms like LZW and RLE are also explained, as they are used by encoders of these formats.
  • The topic of fractals and chaotic systems was a little out of place, but was a little more extensive than the chapter on 3D. It did explain the concept of an L-system fractal, and even provided a generator for it; when supplied with a configuration file, it could produce fractals like the von Koch curve. It briefly touched on the Harter-Heighway dragon fractal and introduced the Mandelbrot and Julia sets, but didn’t delve into chaos theory, even though I’m sure one of the authors desperately wanted to do so.
  • Related to the discussion of fractals was the section on generated landscapes via the midpoint displacement method. While not a landscape per se, the authors digressed a bit to talk about cloud generation as well.
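
The colour-space material is easy to get a feel for. As a minimal sketch of my own (the struct name and the 0–255 channel convention are mine, not the book’s), here is the RGB-to-CMYK direction that the book covers:

```c
#include <stdint.h>

/* RGB -> CMY is a per-channel complement; CMYK additionally pulls the
 * shared grey component out into a separate black (K) channel.
 * Channel values run from 0 to 255. */
typedef struct { uint8_t c, m, y, k; } cmyk_t;

cmyk_t rgb_to_cmyk(uint8_t r, uint8_t g, uint8_t b)
{
    uint8_t c = 255 - r, m = 255 - g, y = 255 - b;
    uint8_t k = c < m ? (c < y ? c : y) : (m < y ? m : y); /* min(c, m, y) */
    return (cmyk_t){ c - k, m - k, y - k, k };
}
```

Pure red (255, 0, 0) comes out as (0, 255, 255, 0), and black collapses entirely into the K channel.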

The book finally managed to get around to the reason I bought it in the first place many years ago, which was the all too brief chapter on DIP techniques. It quickly introduced and provided code for algorithms like the Laplace filter, as well as popular effects like emboss, blur, diffuse, and interpolation. The treatment was very light, so the reader will not walk away with a solid understanding of any of the example code, other than trivial effects like pixelate or crystalize.
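
For the curious, a Laplace filter really does boil down to a few lines of C: a 3×3 kernel summed over a pixel’s neighbourhood. This is my own sketch, not the book’s listing, and it assumes an 8-bit grayscale image stored row-major and an interior pixel (no edge handling):

```c
/* Apply the 3x3 Laplacian kernel at interior pixel (x, y) of a
 * grayscale image; w is the image width in pixels. The response is
 * zero over flat regions and large in magnitude near edges. */
int laplace_at(const unsigned char *img, int w, int x, int y)
{
    static const int kernel[3][3] = { {  0, -1,  0 },
                                      { -1,  4, -1 },
                                      {  0, -1,  0 } };
    int sum = 0;
    for (int j = -1; j <= 1; j++)
        for (int i = -1; i <= 1; i++)
            sum += kernel[j + 1][i + 1] * img[(y + j) * w + (x + i)];
    return sum;
}
```

Effects like blur or emboss follow the same pattern with a different kernel, which is why the book could cover so many of them so quickly.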

Black Art of 3D Game Programming

September 20, 2009

So you’re an aspiring game writer, now what are you going to do about it? You could get a job as a summer intern at a local game company and work for free doing all sorts of tedious tasks no one else wants to do. Or you could hit eBay, do yourself a huge favor, and find this text. Before you write back about the publication date: yes, it was published in 1995. No, it does not talk about OpenGL or DirectX. Phew, now that we have that out of the way, let’s talk about what this book can teach you.

This book’s greatest strength is how it dabbles in a number of good topics without being totally useless. A number of texts provide a lot of information about too many topics, while not providing enough material to do something fun or useful; that exercise is often left to the reader. Ha! A book like that in my house will pay a little visit to the recycling bin or get banished to the library, never to be read again. Other books tend to steamroll over you with too much theory of the mathematical kind. While I’m all for roots and quadratics, aspiring young programmers wanting to enter the field of game development shouldn’t necessarily go whole hog on their first book.

That being said, they will need some basic skills to get started, and matrix math is a fundamental skill for playing around in 3D. Fortunately for the junior programmer, it’s not the most difficult skill to learn. One of the black arts the book teaches you is how to master 2D and 3D transformations using matrix algebra. It doesn’t go into quaternions for gimbal-lock-free rotations, but since the reader is probably just starting out, that doesn’t really matter; they will be able to make do without all of the fancy topics getting in the way of actually learning something useful.

The book does show its age in some chapters where the topic of conversation is how to take control of various video modes, but I think it’s still a worthwhile read. You can still do this kind of programming if your recreational operating system of choice is FreeDOS or some other version of DOS, even on a modern PC. You could even use Windows 95 or 98 for that matter; just get it to boot into the version of DOS which ships underneath all of that GUI crap. Trust me, you’ll be better off in the long run, plus you can wow your friends with your amazing latch register skillz.

Zen of Graphics Programming (2nd Edition)

September 11, 2009

There are certain books on my shelf which I will never give up. I donate many to the library, but some have a staying power I don’t often see and tend to covet when I do. Michael Abrash’s aforementioned text is one of those books. It’s a technical book cover to cover: there are a lot of hairy details which are fun to read about, and you would be hard pressed to find code listings your mother could understand (unless she too is a programmer, lucky you!).

The book was written in 1996 for a whopping retail price of $62.99 (CAN). That was a lot of money for a young man entering his second year of university. Like many people, I bought the book during a time when browsing the table of contents on the Internet was simply not an option. Most of the time, I couldn’t even look inside a physical copy because my bookstore simply didn’t stock the books I was interested in buying. I often needed to special-order these texts using a behind-the-scenes method, that is, if I was lucky enough to find a distributor in their corporate directory. I fondly remember spelling out the book titles carefully, since the nice women working behind the counter didn’t have a clue what I was talking about. Most of the time they must have thought I was speaking a foreign tongue, because they would often ask me to repeat it, or reply seconds later with a vaguely stunned expression.

What I needed to rely on were the opinions and recommendations of graphics and games programmers, posted through various Usenet newsgroups, BBS systems, and the very rare face-to-face conversation. This particular book was published by The Coriolis Group and featured a wealth of information on programming computer graphics using an Intel microprocessor and a video card sporting multiple video modes, including compatibility with the elusive and fabled Mode X.

Being a book on graphics programming in the 1990s, it centers around producing efficient drawing code. It tackles the more basic primitives, decomposing the traditional rendering algorithms into something which can execute very quickly, and then progressively reviews that optimized code to try and squeeze even more cycles out of your overworked CPU. One of the best features of the book is that it’s mostly a book on algorithmic optimization, instead of code optimization, which is why it has staying power even today. For those interested in the pastime of code reduction, the author has written another book called the Zen of Assembly Language, which is freely available as a digital download (or was the last time I looked).

Three-dimensional computer graphics was a popular topic at the time and still is to this day, although advances in this area of computer science have rendered (pun intended) some of the earlier topics almost obsolete. Of interest to those studying the rendering pipelines used by classic games like Doom and Quake are the topics around the interpretation and optimization of BSP trees. BSP is a fancy acronym standing for Binary Space Partitioning, which is another fancy way of describing how to carve up space into optimal chunks, and then place those chunks into a data structure like a binary tree. The reason Doom ran as fast as it did was not because the developers wrote some of their modules in assembly language; it was because the BSP compilers produced an optimally pruned set of data. Less data to process meant more cycles to render those beautiful walls, ceilings, and stairs we have all come to love and adore.
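
The traversal itself is surprisingly small. Here is a hypothetical 2D sketch of the painter’s-algorithm walk over a BSP tree (the names and the line-based partition are my own simplifications, not Abrash’s code): classify the camera against the node’s partition, draw the far subtree, then the node, then the near subtree.

```c
/* Each node splits the plane by a line a*x + b*y = d. Drawing the far
 * side first means nearer geometry is painted over farther geometry. */
typedef struct bsp_node {
    double a, b, d;                 /* partition line: a*x + b*y = d */
    int id;                         /* id of the polygon at this node */
    struct bsp_node *front, *back;  /* subtrees on each side */
} bsp_node;

static void bsp_draw(const bsp_node *n, double cx, double cy,
                     void (*emit)(int id))
{
    if (!n) return;
    int camera_in_front = n->a * cx + n->b * cy > n->d;
    bsp_draw(camera_in_front ? n->back : n->front, cx, cy, emit); /* far */
    emit(n->id);                                                  /* node */
    bsp_draw(camera_in_front ? n->front : n->back, cx, cy, emit); /* near */
}

/* Tiny demo: root splits on x = 0 with one leaf polygon on each side.
 * Returns the draw order packed as decimal digits (id + 1 each). */
static int demo_order;
static void record(int id) { demo_order = demo_order * 10 + (id + 1); }

int bsp_demo(void)
{
    bsp_node front_leaf = { 1, 0,  2, 1, 0, 0 };  /* line x =  2, id 1 */
    bsp_node back_leaf  = { 1, 0, -2, 2, 0, 0 };  /* line x = -2, id 2 */
    bsp_node root       = { 1, 0,  0, 0, &front_leaf, &back_leaf };
    demo_order = 0;
    bsp_draw(&root, 5.0, 0.0, record);  /* camera at x = 5, front side */
    return demo_order;                  /* ids 2, 0, 1 -> digits 3,1,2 */
}
```

With the camera on the front side of the root, the demo draws the back leaf, then the root, then the front leaf, exactly the back-to-front order a painter’s algorithm needs.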

I find this book to be especially useful, however, when you need to return to the fundamentals. With the advent of small, portable devices, programmers working on these products have needed to return to their roots, or, if they are new to the industry, to open their minds and think about the construction of 2D & 3D engines in a way they never have before. Most of the engines used by novice programmers entering the field of computer graphics will be wrappers around this functionality, abstracting the core mechanics and shielding the developer from the tedious and structural bits of the engine.

Indeed, this is probably for the best most of the time, if productivity is what your company needs. But it never hurts to return to your roots every now and then, and let your mind flourish in an area rarely experienced by most technical professionals.

Racing the Beam

May 16, 2009

I just finished another great book the other day, entitled Racing the Beam: The Atari Video Computer System by Montfort and Bogost. It’s an inside look at some of the development challenges and solutions involved in writing games for the Atari VCS. This is a unique machine and is often considered one of the most difficult for a programmer to cut their teeth on. With 128 bytes of RAM and an average ROM size of 2, 4, or 8K, you must fight tooth and nail for every byte used by your software. What lengths do some programmers go to in order to skimp and save on bytes? Ever thought about using the same byte for both an opcode and a piece of data? Ever thought about feeding the opcodes and operands found in the code segment of your program into a pseudo-random number generator, or using them to produce a rendering effect, because you didn’t have the spare space in ROM to place this stuff into the data segment? Well, neither did I until I read this text. Along with little gems like this, the book has a number of interesting tips and tricks about the how and why of software development for the Atari 2600.

The book centres itself around the idea of a platform, and how the constraints and peculiarities of a system can affect how a game is presented. Game adaptation, especially when you’re trying to port software from one hardware architecture to another, is a very important topic when you’re trying to maintain the look or feel of a game. Sometimes neither is possible, and you’re forced to go your own way and come up with something completely different.

A word of caution, though. This book will not teach you how to write software for the 2600 system. It is not a technical reference by any means, nor does it advertise itself as one. However, I would heartily recommend this title to anyone thinking about producing a game for that system, or those of us with an inner geek needing to be satisfied.

I love the idea behind this series of “platform” books, as I have often wished for such books to be written and have even contemplated writing one myself just to fill the void. One of the most useful parts of this book is the reference section, which can lead you to all sorts of new and interesting articles, books, or projects. I do hope the next book contains a bit more technical detail while keeping the various bits of historical data and interesting character references which really help to tie the why and the how of the topics together.

Core Memory

December 19, 2008

Just finished another classic computing book entitled Core Memory: A Visual Survey of Vintage Computers (ISBN: 0811854426), written by John Alderman and photographed by Mark Richards. While I enjoyed the photography in this book much more than in Digital Retro, I found both the context of the photographs and the accompanying text somewhat lacking in substance. Although the photographs are colorful, I do not find so many pictures of wire bundles and wiring trunks especially interesting. While it is interesting to see the complexity in wiring for one or two of these machines, it would have been more intriguing to see various parts of the machines, and to have those parts labeled.

I have not had the pleasure of using one of these machines, let alone putting them together. I can readily identify electronic components, but without context for the photograph, it’s just a jumble of wires or components with no discernible purpose, however pretty they may appear on camera. It would also have been a great opportunity to provide more technical details on each machine: sample machine code or instruction sets, screenshots of running software (assuming the machine could even be turned on), and whatnot. It would have been fascinating to have a detailed list of primary technical components and their functions for each machine. Since some of these machines occupied so much territory, it would also have been informative to have a common layout diagram with a typical installation. Don’t get me wrong, I enjoyed this book, and all of the machines are photographed very well, but I think it would have been better to have smaller spreads and more annotated pictures.

Digital Retro

December 13, 2008

I just finished reading the book Digital Retro: The Evolution and Design of the Personal Computer, written by Gordon Laing (ISBN-10: 078214330X). Each machine in the book gets a four-page spread, showing different angles of the various pieces making up the computer. I must admit that I did not find the images particularly engaging for the most part. They were excellent photographs, no doubt about it, but I don’t think showing the back/sides/top of a monitor and keyboard is the best way to go about creating a visual history that will resonate with your intended audience. I would have been more engaged reading about the various quirks, poring over screenshots of popular software titles and the operating systems of choice, working code from the most popular programming languages for the platform, and even getting a close look at the internals (for the inquisitive types who weren’t afraid to void their warranties).

Pictures of the units themselves abound, but most are of poor quality, so a well-lit photograph goes a long way toward documenting what these computers looked like. However, once you have taken a photo of the front and back, there really isn’t much left of the exterior that is interesting (not including the few machines which actually took advantage of the third dimension and had features on the sides of the unit). With all of the pictures, there was little room left for text, and the text contained little more than a summary one could pull off a Wikipedia page. To be fair, the summaries are concise, and some columns are filled with interesting tidbits. My favorite is on the last page, within the section which documents the NeXT Cube. The page is almost completely filled by the monitor for the system, but there is an excellent piece of information which describes how Steve Jobs acquired the computer graphics division of Lucasfilm, the group that became Pixar. Basically, George Lucas was in a bind after an expensive divorce and needed to raise about $30 million in capital. After a few failed attempts with other buyers, Lucas eventually accepted Steve’s low-ball figure of $10 million. Needless to say, the purchase eventually paid back huge dividends, since Pixar released the enormously successful film Toy Story, which grossed over $362 million worldwide, and went on to produce a number of other large films. Top all of that off with a lucrative IPO, and Mr. Jobs was sitting on a mountain of money which probably had its own zip code.

I find the book to be an excellent coffee table reference, and the columns do make for a quick and easy reference, but I would have enjoyed the book a lot more if it had gone the extra mile and showed me things most books never touch upon.

The Spectacular Rise and Fall of Commodore

July 3, 2008

I finished the book entitled “On the Edge: The Spectacular Rise and Fall of Commodore” a few weeks ago, and I wanted to share some insight into the book’s content.

Back in the late seventies and early eighties, there was a lot of heavy and dirty competition between the fledgling computer companies of that era. Because many of the hardware engineers and executives from different companies knew each other, the competition often descended into personal attacks of a kind not often seen today. This aggression is evident in the books written by these people, or by the journalists trying to capture the tension of the personal computer revolution. Brian Bagnall’s book reveals many interesting tidbits of information, although some were easier to extract than others, as the stories often overlapped and flip-flopped. In certain chapters the text feels very biased, and the passages from former employees are often rife with bitterness and frustration seeping from their words. These emotions actually helped propel me through the book, since it seemed like I had something invested in finishing it, just as they invested their time and even their lives in building those wonderful machines. Some of the people involved (like Chuck Peddle) were obviously still licking their wounds even after three decades. They’ll claim indifference, but if the quotes are at all accurate, there’s a seething tension which becomes obvious as the book plays out. You will walk away from this book feeling a much deeper connection to your favourite Commodore machine, be it an Amiga 2000, Commodore 64, or Commodore PET, to name just a few. I would recommend this book to anyone interested in the technological developments of this time period and the people who made them happen.

PGP: Source Code and Internals

April 28, 2008

My copy finally arrived the other day, and I am elated. Now, before you make another hasty buying decision based solely on my opinion, and on a single line of text, there are a couple of things worth mentioning about the book. First and foremost, the book does lack a few structural elements, like plot, and little things like paragraphs. Having read about the history of Philip Zimmermann and his toil with the U.S. government, I already knew this before I purchased the book.

It does contain source code. Lots of code. The entire PGP program, in fact, including the project files. The book was sent to print because the digital privacy laws the government was attempting to enact at the time did not cover the printed page. If all you want is the source code, you can simply find it on the Internet. Looking for a specific version, especially an older version, is more difficult and may be fraught with export restrictions.

The story behind the printing of the book is a fascinating history lesson and one we should all be concerned about, even those of us living in other countries, since we all know governments are not terribly adept at learning from their mistakes.