Nokia N-Gage – another “Handheld Hero”

I posted an image of this on my Dribbble account the other day along with a similar bit of writing, but I thought I should really add a blog post about it here too, as it’s one of my favourite devices, plus I’m trying to write a bit more frequently on my blog again.

The Nokia N-Gage is one of what I call the “Handheld Heroes”: 12 handheld devices that I consider to be icons of their time, each symbolic of a particular point of technological change in the last 30–40 years (ok, the pencil is quite a bit older than that!). The N-Gage was released in 2003, so it’s over 20 years old at the time of writing this(!).

The N-Gage was an amazing device and in many ways was ahead of its time. Long before Pokemon Go was a thing, Nokia explored the idea of location-based / location-aware games, but mobile connectivity was very limited, very slow and very expensive at that time. I recall thinking about how amazing it would be to walk around with a device and have a constant connection to the internet, but it was a long way off from the ubiquity of 4G / 5G today. So whilst there were a few location-based games for the N-Gage, this lack of affordable mobile data meant they had limited appeal.

Nokia Push was one of the coolest initiatives that they tried. There were two aspects to it: one for skateboarding and one for snowboarding. Both involved attaching sensors to the skateboard or snowboard and then tracking telemetry such as rotations, flips, height and speed. For skateboarding the intention was that you could compete with someone in a different location in the world; both skaters could see from their phones which tricks the other had done:

Nokia Push skateboarding promo video

For snowboarding the intention was to track telemetry such as height, speed, rotations, impact etc:

Nokia Push snowboarding promo video

So the N-Gage wasn’t successful in location-based gaming, but it definitely had some of the seeds of what was yet to come, and Nokia Push explored that even further. Yep, the N-Gage was a weird device to make calls on (especially the first version, see “Sidetalking!”), but talking wasn’t really what it was intended for, and I loved mine. What’s not to love about a phone that can play Tony Hawk’s Pro Skater?

Is it Cake?

It’s interesting seeing the speed at which “AI”-based generative media has been developing over the last few years (Midjourney, Stable Diffusion, etc.). OpenAI recently announced “Sora”, their new text-to-video AI model, which can generate videos up to one minute long.

The example videos in their showreel on YouTube are pretty impressive, so go take a look if you haven’t done so already:

There’s definitely a bit of an “uncanny valley” quality about some of them though. They made me think of the Netflix series “Is it Cake?”, where contestants have to make a cake that looks like a real object, with the aim of fooling the judges, who have to try to pick the fake, cake-based item out of a lineup alongside three real versions of the object.

This image shows several tool bags on plinths, one of the tool bags is actually a cake made to look like a real tool bag.

These cake versions of real objects most often look incredible and are created using amazing cake-making techniques and edible materials; they really do look like realistic edible sculptures.

In the show the judges are not allowed to view the objects up close but only from about 15–20 feet / 4.5–6 metres away. At that distance it is much harder to notice the subtle inconsistencies, e.g. not-so-straight edges or odd surface textures (or smell!), that might give away the illusion. But if they were allowed to get close up, the giveaways would likely be much more apparent.

This image shows three people standing next to podiums, the people are judges on the Netflix show "Is it cake?"

It feels a bit like this with a lot of generative AI content too. Viewed broadly – especially on a small device screen – the images look incredible, but if you look closely and carefully you can spot some of the same not-so-straight edges and / or unusual textures (and sometimes extra fingers / legs!) that make you think, “That’s a cake!”.

Over time, though, I’m sure it is going to get increasingly difficult to tell the difference between these and real images / video as the subtle giveaways, such as soft / fuzzy edges and extra limbs, are reduced. However, even with the current issues, it already presents a big challenge when it comes to evaluating the authenticity of the images and videos we see online.

OpenAI does make a clear statement when it comes to the “safety” of their tools and aims to prevent them from being used to create content that is hateful or contains misinformation, but the challenge will come when these types of models become more widely accessible to companies / organisations who don’t hold to these higher standards. It’s certainly going to be a bit of a wild west out there.


(Some of the AI stuff also reminded me of this old “Mr Soft” TV advert for Trebor mints too!).

Happy 40th Birthday to the Macintosh!

I first used a Macintosh when I went to college in 1989; I had just left school and started an art & design course. The college was a little behind in technology, so even though it was 1989 and there were more modern Macs available, these machines were a bit older – probably SEs and SE/30s. Even though they were old, they were amazing; I remember using MacPaint to draw with the various tools and then printing the results out on the Apple LaserWriter printer. In hindsight it is obviously so low-tech compared to now, but it was incredible.

Prior to using the Mac I had used computers at home; as far back as about 1981 we had a Commodore VIC-20 and then a Commodore 64. I had tried doing things on those, entering programs from computer magazines and attempting to write BASIC programs myself. I remember thinking at one point that it would be great if there were an easier way to work with computers – nothing specific, just that typing text into programs seemed hard and long-winded. So when I first used a Mac in 1989, it was as if a light came on, and it all made so much sense that this was how computers should work.

iPhone view of 2023

Here is my annual “iPhone view” video for 2023. Basically, I compile all the images that were taken directly with my iPhone, screenshotted, or copied onto it in 2023, and then make a video out of them. This year’s video is 3 minutes and 37 seconds long.

As with last year’s video, I played around with GarageBand to make an audio track to go with it; this way I can make the video and audio whatever length I want and get them to work together. I kind of like the way it came out.

There are now 14 years of “iPhone view” videos in my “iPhone view…” playlist.

Make Something Wonderful

This was released back in April this year (2023) and I’d meant to link to it at the time. If you haven’t already seen it, do check out “Make Something Wonderful”, a collection of speeches and writing from Steve Jobs published by the Steve Jobs Archive:

The best way to understand a person is to listen to that person directly. And the best way to understand Steve is to listen to what he said and wrote over the course of his life. His words—in speeches, interviews, and emails—offer a window into how he thought. And he was an exquisite thinker.

Laurene Powell Jobs

There is some really interesting writing and some great photos too. It’s available to view on the website as well as in book form via the Apple Books store, but it’s definitely best viewed on the website, as the reading experience has been really nicely done by the team at Jony Ive’s agency LoveFrom.