Friday, October 28, 2011

The Spectralyzer





The Spectralyzer was an App available for iOS.


Medical Physics



From the age of 14 (and a bit younger) I spent most of my school holidays in the Medical Physics Electronics/Mechanical workshop of Mount Vernon Hospital in Northwood, UK.

Their main reason for being was to look after the Radiation Therapy Cancer Treatment department.

Radiotherapy Linear Accelerators.

Linear particle accelerators are not just used in medical treatment; they are also used in physics research.

There are linear accelerators competing with the LHC, which uses the same technology in a circle rather than a straight line.

Or, as one of my colleagues used to explain it, "a drain pipe with a light bulb at the end".

Most people never see one of these machines.

Patients normally see them like this:-



We would have to service, and more importantly fix, these lethal machines, and I ended up installing them during "holidays" whilst doing my Electronics degree, so to me they look more familiar like this:-



Either way, keeping these machines going meant the engineers I worked with were capable of tackling just about any problem.

Some other work I ended up being involved in was studying nerve regrowth following a cut (in particular wrist cuts), developing a battery-powered TENS machine that was used on patients, and building an impedance-measuring device that monitored blood flow in newly grafted skin.

Cyber K'nex

As Merlin Mann has pointed out, nerds love building robots. It was therefore almost unfair to get the challenge of turning the K'nex system into the Cyber K'nex robotic system, designed to be a more "fun" toy to interact with than Lego Mindstorms.

I contributed to the hardware design. I then wrote, in assembly language, the multi-tasking software that ran on the low-cost two-processor chip: it read the instruction keys, decoded the remote, played sounds, controlled the motors and flashed the lights. Other people did the detailed hardware design and wrote the software for the different models that went with the different keys.
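The structure was essentially a handful of small jobs sharing one main loop. The original was hand-written assembly, so the following is only a rough sketch of that cooperative round-robin idea in C, with entirely hypothetical task names:

```c
#include <stdint.h>

/* Hypothetical task list: each task does a small slice of work and returns.
   The real Cyber K'nex firmware was hand-written assembly, so this is only
   an illustration of the cooperative round-robin idea, not the actual code. */
typedef void (*task_fn)(void);

static void read_instruction_keys(void) { /* poll the key contacts       */ }
static void decode_remote(void)         { /* sample and decode IR pulses */ }
static void update_sounds(void)         { /* feed the next sound samples */ }
static void drive_motors(void)          { /* refresh motor outputs       */ }
static void flash_lights(void)          { /* step the light patterns     */ }

static const task_fn tasks[] = {
    read_instruction_keys,
    decode_remote,
    update_sounds,
    drive_motors,
    flash_lights,
};

int main(void)
{
    for (;;) {                                          /* main loop never exits */
        for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
            tasks[i]();                                 /* each task runs briefly and yields */
    }
}
```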



At one point in the development the entire team (15-20 people) were in all weekend for an end-of-project panic. The office resembled a scene from Blade Runner, with out-of-control K'nex robots everywhere and a bunch of adults who had all abandoned their families for the weekend to meet the important Christmas deadline.

It was a compelling product to work on.

Some people have even designed their own models.

King Arthur again

In 2002, for the second time in my career, I found myself working on a product about King Arthur.

A fantastic spec and prototype game arrived, designed by the brilliant Reiner Knizia. The team sat down with Reiner and played the game.

It is a very engaging game.



The game play specifications were complex and large, and the brand new tag/board reading technology also needed developing.

I worked with the hardware team on the core design and component/chip selection, leading to the decision to use a completely new, low-cost microprocessor that I felt could handle all the requirements. And this time I could develop the product in 'C' (a favorite language of mine), not just in assembly language, unlike some other products (for example this one).

The product had many systems to be developed on this one chip. It therefore effectively required me to develop, once again, my own embedded operating system. An example was the board interface and player recognition system: a combination of extra digital hardware for the board reading, and a minimal analogue hardware / DSP system (the DSP written in assembly language, mostly running on interrupts) for identifying the different players.

On the same chip I also implemented the game-play software system, with some help from other software engineers implementing my specs. To accommodate the limited ROM space I developed it as a virtual machine written in 'C', with large parts of the game play contained within data - this helped compress the game play to fit within the ROM space available.
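To give a flavour of the approach (the real instruction set is not documented here, so the opcodes and script below are purely illustrative), a tiny byte-code interpreter in 'C' might look something like this, with the game play itself held as data in ROM:

```c
#include <stdint.h>
#include <stdio.h>

/* Purely illustrative opcodes - the real game's instruction set was different. */
enum { OP_END, OP_SAY, OP_ADD_SCORE, OP_JUMP_IF_ZERO };

static const char *messages[] = { "Welcome to Camelot", "A quest begins" };
static int score;

/* Game play expressed as data: a small byte-coded script kept in ROM. */
static const uint8_t script[] = {
    OP_SAY, 0,            /* print message 0                 */
    OP_ADD_SCORE, 5,      /* award 5 points                  */
    OP_JUMP_IF_ZERO, 8,   /* if score == 0, skip next opcode */
    OP_SAY, 1,            /* print message 1                 */
    OP_END
};

static void run(const uint8_t *pc)
{
    for (;;) {
        switch (*pc++) {
        case OP_END:          return;
        case OP_SAY:          puts(messages[*pc++]);   break;
        case OP_ADD_SCORE:    score += *pc++;          break;
        case OP_JUMP_IF_ZERO: { uint8_t target = *pc++;
                                if (score == 0) pc = script + target; } break;
        }
    }
}

int main(void) { run(script); return 0; }
```

A few bytes of script can stand in for what would otherwise be pages of compiled 'C', which is where the ROM saving comes from.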

I was able to develop the game software, and have others bug-test it, by cross-compiling and generating a PC version.

I also supervised the development of a pseudo-random number generator that was quantified in order to meet Reiner's specification requirements.
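The actual generator and the numbers in Reiner's spec aren't reproduced here, but as an example of the kind of generator whose behaviour can be fully quantified, a simple linear congruential generator has a provable period and distribution:

```c
#include <stdint.h>
#include <stdio.h>

/* Example only: a 32-bit linear congruential generator (Numerical Recipes
   constants). Its full 2^32 period and uniformity are easy to state and
   check, which is the sense in which such a generator can be "quantified".
   The generator actually shipped in the game may well have been different. */
static uint32_t prng_state = 1;

static uint32_t prng_next(void)
{
    prng_state = prng_state * 1664525u + 1013904223u;
    return prng_state;
}

/* Map to a small range, e.g. a die roll of 1..6 (the tiny modulo bias over
   a 2^32 range is negligible for a sketch like this). */
static int roll_die(void)
{
    return (int)(prng_next() % 6u) + 1;
}

int main(void)
{
    for (int i = 0; i < 10; i++)
        printf("%d ", roll_die());
    putchar('\n');
    return 0;
}
```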

Along the way I also developed a testing methodology I called "Monkey Testing". This emulated game play in the same way as the "Shakespeare project", except that I used pseudo-random numbers to press random keys, ran this on both the PC and the target hardware, and had each generate log files that I could compare. Found a few bugs this way...
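In outline, and with hypothetical function names standing in for the real game interface, a monkey test run amounted to something like this:

```c
#include <stdint.h>
#include <stdio.h>

/* Sketch of the "Monkey Testing" idea: feed the game the same pseudo-random
   key presses on both the PC build and the target build, log what happens,
   and diff the two logs. Function names here are hypothetical. */

#define NUM_KEYS 8

static uint32_t seed = 12345;                /* same seed on PC and target */

static uint32_t prng(void)                   /* any shared PRNG will do    */
{
    seed = seed * 1664525u + 1013904223u;
    return seed;
}

/* Stand-ins for the real game interface. */
static void game_press_key(int key) { (void)key; /* inject a key into the game loop */ }
static int  game_state_hash(void)   { return (int)(seed & 0xff); /* dummy value standing in for a real game-state summary */ }

int main(void)
{
    FILE *log = fopen("monkey.log", "w");
    if (!log) return 1;

    for (int step = 0; step < 100000; step++) {
        int key = (int)(prng() % NUM_KEYS);         /* pick a random key     */
        game_press_key(key);                        /* press it              */
        fprintf(log, "%d %d %d\n",                  /* log step, key, state  */
                step, key, game_state_hash());
    }
    fclose(log);                 /* then diff the PC log against the target log */
    return 0;
}
```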

There is more to the story...

Saturday, September 17, 2011

Space Football




Back when the Nintendo SNES was brand new, I developed some sound utilities and software for Space Football.

This was my introduction to coding in my bedroom through the night...

Head Related Transfer Function (HRTF)


After Space Football, I was asked to join Argonaut Software, initially to carry out research into 3-D sound, in tandem with work Argonaut were doing with Nintendo on a virtual reality headset.

This research involved running the Head Related Transfer Function (HRTF) software on dedicated Digital Signal Processing cards in the PC. I learnt a lot about how our ears and brains (may) perceive sounds and the direction they are coming from.

As every person's ear has a different shape, in the ideal case, each person has their own HRTF measured.

This was achieved by putting small microphones in a person's ears and then recording each microphone's response to real clicks (impulses) generated around the subject from different locations in space.

This then creates a 3-D impulse response map.

To use the HRTF to position a sound in 3-D over headphones, you convolve your source sound with the measured impulse response from that location in space (one impulse response per ear).
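In signal-processing terms, that convolution means each output sample is a sum of recent input samples weighted by the impulse-response taps, done once for each ear. A deliberately naive sketch in C (a real implementation would use FFT-based or block convolution):

```c
#include <stddef.h>

/* Naive direct convolution of a mono source with one ear's measured impulse
   response.  The output length is src_len + hrir_len - 1. */
static void convolve(const float *src, size_t src_len,
                     const float *hrir, size_t hrir_len,
                     float *out)
{
    for (size_t n = 0; n < src_len + hrir_len - 1; n++) {
        float acc = 0.0f;
        for (size_t k = 0; k < hrir_len; k++) {
            if (n >= k && (n - k) < src_len)
                acc += hrir[k] * src[n - k];    /* weighted sum of past input */
        }
        out[n] = acc;
    }
}

int main(void)
{
    /* Tiny made-up example: 4-sample source, 3-tap impulse response.
       To place a sound, run this once with the left-ear response and once
       with the right-ear response measured from the desired direction. */
    const float src[4]  = { 1.0f, 0.5f, 0.25f, 0.0f };
    const float hrir[3] = { 0.9f, 0.3f, 0.1f };
    float out[4 + 3 - 1];
    convolve(src, 4, hrir, 3, out);
    return 0;
}
```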

Of interest was that the HRTF maps we had were of other people (we did not have our own taken), which meant that initially people were not very good at pointing to where a sound was being placed by the software. Over time users "learnt" the other people's HRTFs and became better at pointing to the intended location of the sound.

The first King Arthur, and Dolby Surround




After doing the work on the Head Related Transfer Function, I was moved to writing the software and utilities for the newly formed sound department at Argonaut.

The first game I worked on was a SNES game called King Arthur's World.

Towards the completion date, Jez San (Argonaut founder) asked us to try and make space for Dolby Surround encoded samples.

Instead I spotted a hack that effectively meant that, by controlling the volume registers on the SNES's dedicated sound chip, the game software could control a sound's position in 3-D.
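The original code isn't reproduced here, but the general idea can be sketched: the sound chip's per-voice left/right volume registers are signed, and a Dolby Surround decoder steers out-of-phase left/right content to the rear speaker, so flipping the sign of one channel's volume moves a voice towards the back of the room. The exact original implementation may have differed, and the register-write helper below is hypothetical:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical stand-in for poking the sound chip's voice volume registers. */
static void write_dsp_voice_volume(int voice, int8_t vol_left, int8_t vol_right)
{
    printf("voice %d: VOL(L)=%d VOL(R)=%d\n", voice, vol_left, vol_right);
}

/* pan: -1 = hard left, +1 = hard right.  depth: 0 = front, 1 = rear.
   Moving towards the rear flips the sign of one channel, which a Dolby
   Surround decoder interprets as out-of-phase (surround) content. */
static void position_voice(int voice, float pan, float depth)
{
    float left  = (1.0f - pan) * 0.5f;                    /* simple pan law      */
    float right = (1.0f + pan) * 0.5f;
    float right_signed = right * (1.0f - 2.0f * depth);   /* fade to anti-phase  */

    write_dsp_voice_volume(voice,
                           (int8_t)(left * 127.0f),
                           (int8_t)(right_signed * 127.0f));
}

int main(void)
{
    position_voice(0, 0.0f, 0.0f);   /* centre front */
    position_voice(0, 0.0f, 1.0f);   /* centre rear  */
    return 0;
}
```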


The hack was of virtually no use in King Arthur's World itself, as it was a 2-D scrolling game, but the weather sound effects came from all around you, and the opening chords spun round the room...


We then had Dolby come and visit us, and in the end King Arthur was the first game to carry the official Dolby Surround Sound Licence.

This resulted in Dolby promoting the game for us, and I was interviewed for the Autumn 1993 edition of Home Entertainment magazine.

Ren and Stimpy SNES




Perhaps not the world's best game... unlike most other TV/Film spin out games... Sounds OK however :)

FX Fighter





FX Fighter was the showcase for what was, at the time, a cutting-edge 3-D graphics engine developed at Argonaut that went on to be used in several other successful games.

The main new feature this had in the sound drivers was real-time sound decompression algorithms, which meant that the sound drivers actually used less CPU power than playing uncompressed 16-bit samples. This was mainly because the decompression routines (assembly language, of course) took virtually no time, yet each sample was ~4 bits of data rather than 16, so only about a quarter as much data was being shifted - and, as we all know, memory access can be a bigger bottleneck to code execution than the instructions themselves.
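The exact codec isn't described here, so the decoder below is only a generic 4-bit ADPCM-style sketch of how such schemes work: each nibble is a signed difference code, scaled by a step size that adapts as it goes, and added to the previous output sample.

```c
#include <stdint.h>
#include <stdio.h>

/* Simplified 4-bits-per-sample ADPCM-style decoder, illustrative only: the
   codec actually used in FX Fighter is not documented in this post.  Big
   codes grow the step, small ones shrink it, so the decoder tracks both
   quiet and loud passages while moving only a nibble per sample. */
static int16_t predictor = 0;     /* last decoded sample       */
static int     step      = 16;    /* current quantisation step */

static int16_t decode_nibble(uint8_t nibble)       /* nibble is 0..15 */
{
    int delta = (nibble & 0x7) * step;             /* magnitude           */
    if (nibble & 0x8)                              /* top bit = sign      */
        delta = -delta;

    int sample = predictor + delta;
    if (sample >  32767) sample =  32767;          /* clamp to 16-bit     */
    if (sample < -32768) sample = -32768;
    predictor = (int16_t)sample;

    /* Adapt the step size: larger codes mean we were undershooting. */
    if ((nibble & 0x7) >= 4) step = step * 2;
    else                     step = (step * 3) / 4;
    if (step < 1)    step = 1;
    if (step > 2048) step = 2048;

    return predictor;
}

int main(void)
{
    /* Two samples per byte: decode a tiny made-up compressed buffer. */
    const uint8_t compressed[] = { 0x31, 0x7C, 0x02 };
    for (size_t i = 0; i < sizeof compressed; i++) {
        printf("%d ", decode_nibble(compressed[i] >> 4));
        printf("%d ", decode_nibble(compressed[i] & 0x0F));
    }
    putchar('\n');
    return 0;
}
```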

Creature Shock





Creature Shock was in part the result of Silicon Graphics machines, some very talented graphics artists, coders and sound designers, and the CD ROM revolution. It was computer graphics like we had not really seen before. We rather cruelly came to refer to the game style this spawned as the "Watch 'em up" :)

It was my baptism into developing sound drivers, interrupt handlers and optimised assembly language coding on the PC, and was therefore pretty much a nightmare.

It also involved having to write separate drivers for what was beginning to be an explosion of sound cards competing with the Sound Blaster.

In fact, it was worse than this, as it was just at the beginning of the 32-bit protected mode days (wow, imagine being able to allocate a whole 1 Meg of memory, in one chunk - unheard of!), which made the main game code a lot easier.

However, if you were writing interrupt handlers in assembly language (that was me then), you had to write both 16- and 32-bit code, and it was a nightmare, leading to too many all-nighters, including one spent on the phone to Intel trying to figure out why some of their chips in certain machines went bang when we were switching modes...

Vortex




Ah, Vortex, finally a 3-D SNES game that could properly use my Dolby Surround hack.

Justin's soundtrack has to be one of the best ever on the SNES.