Thursday, December 19, 2013

Manic Karts




In 1995 I was persuaded to leave Argonaut and join Manic Media. The Manic team were working on the follow-up to the successful and addictive Super Karts, called Manic Karts.

One of the first tasks was to build a Recording Studio.

Therefore, as well as developing the sound software required, I project-managed the conversion of a barn into a 16-track digital, 16-track analogue project recording studio, complete with an acoustically designed and treated control room, live room, vocal booth, chill-out area, and kit (a DDA QMR mixing desk and lots of outboard/MIDI gear).




And then we went on to make some music, with a variety of people including the much-missed Jo Bruce (founder of the Afrocelts) and Joe Strummer.

Friday, October 28, 2011

The Spectralyzer





The Spectralyzer was an app available for iOS.


Medical Physics



From the age of 14 (and a bit younger) I spent most of my school holidays in the Medical Physics Electronics/Mechanical workshop of Mount Vernon Hospital in Northwood, UK.

Their main reason for being was to look after the Radiation Therapy Cancer Treatment department.

Radiotherapy Linear Accelerators.

Linear particle accelerators are not just used in medical treatment; they are also used in physics research.

There are linear accelerators competing with the LHC, which uses the same basic technology bent into a ring rather than a straight line.

Or, as one of my colleagues used to explain it, "a drain pipe with a light bulb at the end".

Most people never see one of these machines.

Patients normally see them like this:-



We had to service and, more importantly, fix these lethal machines, and I ended up installing them during "holidays" whilst doing my Electronics degree, meaning they look more familiar to me like this:-



Either way, keeping these machines going meant the engineers I worked with were capable of tackling just about any problem.

Some other work I ended up being involved in was studying nerve regrowth following a cut (in particular wrist cuts), developing a battery-powered TENS machine that was used on patients, and building an impedance-measuring device that monitored blood flow in newly grafted skin.

Cyber K'nex

As Merlin Mann has pointed out, nerds love building robots. It was therefore almost unfair to get the challenge of turning the K'nex system into the Cyber K'nex robotic system, designed to be a more "fun" toy to interact with than Lego Mindstorms.

I contributed to the hardware design. I then wrote, in assembly language, the multitasking software that ran on the low-cost two-processor chip: it read the instruction keys, decoded the remote control, played sounds, controlled the motors, and flashed the lights. Other people designed the rest of the hardware and wrote the software carried on the different keys for the different models.
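The original code was assembly on a very constrained part, so nothing below is the shipped source; but as a rough sketch, a cooperative round-robin scheduler of the kind such toys use might look like this in C (all the task names are illustrative assumptions):

```c
#include <stdint.h>

typedef void (*task_fn)(void);

static void read_keys(void)     { /* poll the instruction keys       */ }
static void decode_remote(void) { /* sample the remote-control input */ }
static void play_sound(void)    { /* feed the next audio sample      */ }
static void drive_motors(void)  { /* update the motor PWM outputs    */ }
static void flash_lights(void)  { /* step the LED pattern            */ }

static task_fn tasks[] = {
    read_keys, decode_remote, play_sound, drive_motors, flash_lights
};

/* Cooperative round-robin: every task runs briefly and returns,
   so keys, remote, sound, motors and lights all stay "live". */
void scheduler(void)
{
    for (;;) {
        for (uint8_t i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
            tasks[i]();   /* each task must return quickly */
    }
}
```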



At one point in the development the entire team (15-20 people) were in all weekend for an end-of-project panic. The office resembled a scene from Blade Runner, with out-of-control K'nex robots everywhere and a bunch of adults who had all abandoned their families for the weekend to meet the important Christmas deadline.

It was a compelling product to work on.

Some people have even designed their own models.

King Arthur again

In 2002, for the second time in my career, I found myself working on a product about King Arthur.

A fantastic spec and prototype game arrived, designed by the brilliant Reiner Knizia. The team sat down with Reiner and played the game.

It is a very engaging game.



The gameplay specifications were large and complex, and the brand-new tag/board-reading technology also needed developing.

I worked with the hardware team on the core design and component/chip selection, leading to the decision to use a completely new, low-cost microprocessor that I felt could handle all the requirements. And this time I could develop the product in 'C' (a favorite language of mine), not just in assembly language, unlike some other products (for example this one).

The product had many systems to be developed on this one chip, so it effectively required me to develop, once again, my own embedded operating system. An example was the board interface and player recognition system: a combination of extra digital hardware for the board reading, and a minimal analogue-hardware/DSP system (written in assembly language, mostly running on interrupts) for identifying the different players.
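The actual recognition scheme isn't described here, but to give a flavour of "minimal analogue hardware plus interrupt-driven DSP": an ADC interrupt could feed samples into a Goertzel filter tuned to one player's signature frequency. Everything below — the names, the use of a tone detector at all — is an illustrative assumption, and a real low-cost chip would use fixed-point arithmetic rather than float:

```c
#include <stdint.h>
#include <math.h>

#define N_SAMPLES 128          /* samples per detection block */

static float coeff;            /* 2*cos(2*pi*k/N) for the target bin */
static float q1, q2;           /* filter state */
static uint16_t n;             /* samples accumulated in this block */
static volatile float power;   /* detector output, updated per block */

void goertzel_init(float target_hz, float sample_hz)
{
    float k = roundf((float)N_SAMPLES * target_hz / sample_hz);
    coeff = 2.0f * cosf(6.2831853f * k / (float)N_SAMPLES);
    q1 = q2 = 0.0f;
    n = 0;
}

/* Called from the ADC interrupt with each new sample. */
void on_adc_sample(float sample)
{
    float q0 = coeff * q1 - q2 + sample;
    q2 = q1;
    q1 = q0;
    if (++n == N_SAMPLES) {    /* end of block: compute tone power */
        power = q1 * q1 + q2 * q2 - coeff * q1 * q2;
        q1 = q2 = 0.0f;
        n = 0;
    }
}
```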

On the same chip I also implemented the gameplay software system, with some help from other software engineers implementing my specs. To accommodate the limited ROM space I developed it as a virtual machine written in 'C'. Large parts of the gameplay were contained within data, which helped compress the gameplay to fit within the limited ROM space available.
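The real instruction set isn't public, so the opcodes below are made up; but a minimal bytecode interpreter of the general kind described looks like this in C — game logic becomes compact ROM data, with a small dispatch loop and hooks out to native routines:

```c
#include <stdint.h>

enum { OP_END, OP_PUSH, OP_ADD, OP_JUMP_IF_ZERO, OP_CALL_NATIVE };

typedef void (*native_fn)(void);

static void beep(void) { /* e.g. trigger a sound effect */ }

static const native_fn native_table[] = { beep };

/* Dispatch loop: fetch an opcode, act on it.  Game logic lives in
   the byte array, not in native code, which is what saves ROM. */
static void vm_run(const uint8_t *code)
{
    int16_t stack[16];
    uint8_t sp = 0;
    uint16_t pc = 0;

    for (;;) {
        switch (code[pc++]) {
        case OP_END:
            return;
        case OP_PUSH:                 /* push an 8-bit literal */
            stack[sp++] = code[pc++];
            break;
        case OP_ADD:                  /* pop two values, push the sum */
            sp--;
            stack[sp - 1] = (int16_t)(stack[sp - 1] + stack[sp]);
            break;
        case OP_JUMP_IF_ZERO:         /* pop; branch if it was zero */
            sp--;
            pc = (stack[sp] == 0) ? code[pc] : (uint16_t)(pc + 1);
            break;
        case OP_CALL_NATIVE:          /* hook out to a C routine */
            native_table[code[pc++]]();
            break;
        }
    }
}

int main(void)
{
    /* "push 2, push 3, add, call native routine 0, end" */
    static const uint8_t program[] = {
        OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_CALL_NATIVE, 0, OP_END
    };
    vm_run(program);
    return 0;
}
```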

I was able to develop the game software, and have others bug-test it, by cross-compiling it to generate a PC version.

I also supervised the development of a pseudo-random number generator that was quantified in order to meet Reiner's specification requirements.
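The shipped generator and the exact statistical requirements aren't given here, but a tiny, ROM-friendly generator of the sort used on such chips is a Galois LFSR — easy to implement and easy to quantify (period, per-outcome bias). This sketch is an assumption, not the actual generator:

```c
#include <stdint.h>

static uint16_t lfsr = 0xACE1u;   /* any non-zero seed */

/* 16-bit Galois LFSR, taps at bits 16,14,13,11: maximal period
   of 65535 before the sequence repeats. */
uint16_t prng_next(void)
{
    uint16_t lsb = lfsr & 1u;
    lfsr >>= 1;
    if (lsb)
        lfsr ^= 0xB400u;
    return lfsr;
}

/* e.g. a "die roll" for game logic; the modulo bias here is one of
   the things you would quantify against the spec. */
uint8_t roll_d6(void)
{
    return (uint8_t)(prng_next() % 6u) + 1u;
}
```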

Along the way I also developed a testing methodology I called "Monkey Testing". This emulated game play in the same way as the "Shakespeare project", except I used pseudo-random numbers to press random keys, ran this on both the PC and the target hardware, and had each generate log files that I could compare. Found a few bugs this way...
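In outline the harness is simple: seed the same PRNG on both builds, feed random key presses to the game, and log a state fingerprint each step so the two logs can be diffed — the first differing line is the first divergence. A self-contained sketch, where press_key() and the state hash are hypothetical stand-ins for the real game:

```c
#include <stdio.h>
#include <stdint.h>

#define N_KEYS  12u        /* assumed key count */
#define N_STEPS 100000u

/* Same LFSR as sketched above; the seed must match on both builds. */
static uint16_t lfsr = 0xACE1u;
static uint16_t prng_next(void)
{
    uint16_t lsb = lfsr & 1u;
    lfsr >>= 1;
    if (lsb) lfsr ^= 0xB400u;
    return lfsr;
}

/* Stand-ins for the real game: press a key, fingerprint the state. */
static uint32_t state = 1u;
static void press_key(uint8_t key)    { state = state * 31u + key; }
static uint32_t game_state_hash(void) { return state; }

int main(void)
{
    for (uint32_t step = 0; step < N_STEPS; step++) {
        uint8_t key = (uint8_t)(prng_next() % N_KEYS);
        press_key(key);
        /* One line per step: diffing PC vs target logs pinpoints
           the first step at which behaviour diverges. */
        printf("%lu %u %08lx\n", (unsigned long)step, key,
               (unsigned long)game_state_hash());
    }
    return 0;
}
```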

There is more to the story...

Saturday, September 17, 2011

Space Football




Back when the Nintendo SNES was brand new, I developed some sound utilities and software for Space Football.

This was my introduction to coding in my bedroom through the night...

Head Related Transfer Function (HRTF)


After Space Football, I was asked to join Argonaut Software, initially to carry out research into 3-D sound in tandem with work Argonaut were doing with Nintendo on a virtual reality headset.

This research involved running the Head Related Transfer Function (HRTF) software on dedicated Digital Signal Processing cards in the PC. I learnt a lot about how our ears and brains (may) perceive sounds and the direction they are coming from.

As every person's ears have a different shape, in the ideal case each person has their own HRTF measured.

This was achieved by putting small microphones in a person's ears and then recording the microphones' responses to real clicks (impulses) generated at different locations in space around the subject.

This then creates a 3-D impulse response map.

To use the HRTF to position a sound in 3-D via headphones, you convolve your source sound with the measured impulse response from that location in space.
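In code terms that is just a convolution per ear. A minimal time-domain sketch in C (a real-time system would use FFT-based fast convolution, and the buffer names here are hypothetical):

```c
#include <stddef.h>

/* Direct time-domain convolution of a mono source with one ear's
   measured head-related impulse response (HRIR).  Run it twice,
   once per ear, to get the two headphone channels.
   out must hold src_len + hrir_len - 1 samples. */
void hrir_convolve(const float *src, size_t src_len,
                   const float *hrir, size_t hrir_len,
                   float *out)
{
    for (size_t i = 0; i < src_len + hrir_len - 1; i++) {
        float acc = 0.0f;
        for (size_t k = 0; k < hrir_len; k++) {
            if (i >= k && i - k < src_len)
                acc += hrir[k] * src[i - k];
        }
        out[i] = acc;
    }
}
```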

Of interest was that the HRTF maps we had were of other people (we did not have our own taken), which initially meant that people were not very good at pointing to where a sound was being placed by the software. Over time users "learnt" the other people's HRTFs and became better at pointing to the intended location of the sound.