We start this week’s links sent to our studio mailing list with one from Jack, that combines three subjects close to our heart – robotics, short-run manufacture by small companies, and small companies talking about trying to do new things so others can learn from it. This blog from Modular Robotics is a wonderful insight into all three.
More British Transport ephemera from Matt Webb, on how cars and cities chat to each other using magnets:
In Southampton when I was growing up, we had one of the world’s first adaptive traffic routing systems, where the traffic light delays would alter dynamically depending on traffic. At night they would switch to an “on demand” model: the main route at a crossroads would remain green, while the minor route would only go green 15 seconds after a car had approached it.
Since this was detected by an induction loop, and since an induction loop only registers a signal when metal is *moving* over it, if you sat in a static car waiting for the traffic light to turn from red to green again, it would never reactivate. (I sat for 15 minutes once, still at a red light on a minor route, to make sure it didn’t come on by accident. It turned green as soon as I moved the car just a little bit.)
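The night-time logic described above is simple enough to sketch in a few lines. This is a hypothetical toy, not Southampton’s actual controller – the event format and the `minor_route_green` function are invented for illustration – but it captures the two behaviours in the anecdote: only a *moving* car registers, and the green phase follows the detection by a fixed delay.

```python
# Toy sketch of the "on demand" night-time logic: the minor route only goes
# green a fixed delay after a *moving* car is detected, because a stationary
# car induces no signal in the loop.

DELAY_AFTER_DETECTION = 15  # seconds, per the anecdote above

def minor_route_green(events, now):
    """events: list of (time, is_moving) detections from the induction loop.
    Returns True if the minor route should be green at time `now`."""
    # A stationary car never registers: only moving detections count.
    moving = [t for t, is_moving in events if is_moving]
    if not moving:
        return False
    # Green once the delay has elapsed since the last moving detection.
    return now >= max(moving) + DELAY_AFTER_DETECTION

# A car rolls up at t=0: the light goes green at t=15...
print(minor_route_green([(0, True)], 14))   # False
print(minor_route_green([(0, True)], 15))   # True
# ...but a car that was already stationary over the loop never triggers it.
print(minor_route_green([(0, False)], 100))  # False
```

Which is exactly why sitting motionless at that red light for 15 minutes achieved nothing, and nudging the car forward worked instantly.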
Here’s a nice hack:
Ambulances had *moving* parts underneath them, to trigger a stronger signal in the induction loop. In Southampton, this was used to identify emergency vehicles approaching the lights, and preferentially turn traffic lights green a hop or two ahead of them on likely routes.
Some people build these into their cars to get better treatment by these routing systems.
Finally, all that remains is Alice’s recommendation of a Chrome Experiment video by the band OK Go & Pilobolus, which allows us to put their best feet forward in order to say happy Friday from all of us in the studio…
It is rearing its head in our work, and in work and writings by others – so I thought I would give it another airing.
The talk at Glug London bounced through some of our work and our collective obsession with Mary Poppins, so I’ll cut to the bit about the Robot-Readable World. Rather than try to reproduce the talk, I’ll embed the images I showed that evening, but embellish and expand on what I was trying to point at.

Robot-Readable World is a pot to put things in – one I first started filling back in 2007 or so.
At Interesting back then, I drew a parallel between the Apple Newton’s sophisticated, complicated hand-writing recognition and the Palm Pilot’s approach of getting humans to learn a new way to write, i.e. Graffiti.
The connection I was trying to make was that there is a deliberate design approach that makes use of the plasticity and adaptability of humans to meet computers (more than) half way.
Connecting this to computer vision and robotics I said something like:
“What if, instead of designing computers and robots that relate to what we can see, we meet them half-way – covering our environment with markers, codes and RFIDs, making a robot-readable world”
“The Cambrian explosion was triggered by the sudden evolution of vision” in simple organisms: “active predation became possible with the advent of vision, and prey species found themselves under extreme pressure to adapt in ways that would make them less likely to be spotted. New habitats opened as organisms were able to see their environment for the first time, and an enormous amount of specialization occurred as species differentiated.”
In this light (no pun intended) the “Robot-Readable World” imagines the evolutionary pressure of those three billion (and growing) linked, artificial eyes on our environment.
[it is an aesthetic…] Of computer vision, of 3D printing; of optimised, algorithmic sensor sweeps and compression artefacts. Of LIDAR and laser speckle. Of the gaze of another nature on ours. There’s something in the Kinect-hacked photography of NYC’s subways that we’ve linked to here before that smacks of the viewpoint of that other next nature, the robot-readable world. The fascination we have with how bees see flowers, revealing the animal link between senses and motives. That our environment is shared with things that see with motives we have intentionally or unintentionally programmed them with.

The things we are about to share our environment with are themselves born out of a domestication of inexpensive computation, the ‘Fractional AI’ and ‘Big Maths for trivial things’ that Matt Webb has spoken about this year (I’d recommend starting at his Do Lecture).
We’re in a present, after all, where a £100 point-and-shoot camera has the approximate empathic capabilities of an infant, recognising faces and modifying its behaviour accordingly.

And where the number one toy last Christmas was a computer-vision eye that can sense depth and movement, detect skeletons, and is a direct descendant of techniques and technologies used for surveillance and monitoring.
As Matt Webb pointed out on Twitter last year:

Ten years of investment in security measures funded and inspired by the ‘War On Terror’ have led us to this point, but what has been left behind by that tide is domestic, cheap and hackable.
Kinect hacking has become officially endorsed and, to my mind, the hacks are more fun than the games that have been published for it.
Greg Borenstein, who scanned me with a Kinect at Foo Camp, is currently writing a book for O’Reilly called ‘Making Things See’.

It is a companion in some ways to Tom Igoe’s “Making Things Talk”, his handbook to injecting behaviour into everyday things with Arduino and other hackable, programmable hardware.
“Making Things See” could be the beginning of a ‘light-switch’ moment for everyday things with behaviour hacked into them. For things with fractional AI, fractional agency – to be given a fractional sense of their environment.
The way the world is fractured from a different viewpoint, a different set of senses from a new set of sensors.
Perhaps it’s the suspicious look from the fella with the moustache that nails it.
And it’s a thought that was with me while I wrote that post that I want to pick at.
The fascination we have with how bees see flowers, revealing the animal link between senses and motives. That our environment is shared with things that see with motives we have intentionally or unintentionally programmed them with.
“What we see of the real world is not the unvarnished world but a model of the world, regulated and adjusted by sense data, but constructed so it’s useful for dealing with the real world.
The nature of the model depends on the kind of animal we are. A flying animal needs a different kind of model from a walking, climbing or swimming animal. A monkey’s brain must have software capable of simulating a three-dimensional world of branches and trunks. A mole’s software for constructing models of its world will be customized for underground use. A water strider’s brain doesn’t need 3D software at all, since it lives on the surface of the pond in an Edwin Abbott flatland.”
“Middle World – the range of sizes and speeds which we have evolved to feel intuitively comfortable with – is a bit like the narrow range of the electromagnetic spectrum that we see as light of various colours. We’re blind to all frequencies outside that, unless we use instruments to help us. Middle World is the narrow range of reality which we judge to be normal, as opposed to the queerness of the very small, the very large and the very fast.”
At the Glug London talk, I showed a short clip of Dawkins’ 1991 RI Christmas Lecture “The Ultraviolet Garden”. The bit we’re interested in starts about 8 minutes in – but the whole thing is great.
In that bit he talks about how flowers have evolved to become attractive to bees, hummingbirds and humans – all occupying separate sensory worlds…
Which leads me back to…


What’s evolving to become ‘attractive’ and meaningful to both robot and human eyes?
Also – as Dawkins points out
The nature of the model depends on the kind of animal we are.
That is, to say ‘robot eyes’ is like saying ‘animal eyes’ – the breadth of speciation in the fourth kingdom will lead to a huge breadth of sensory worlds to design within.
One might look for signs in the world of motion-capture special effects, where Zoe Saldana’s chromakey acne and high-viz dreadlocks – which transform her into an alien giantess in Avatar – could morph into fashion statements alongside Beyoncé’s chromasocks…

Or Takashi Murakami’s illustrative QR codes for Louis Vuitton.

That such a bluntly digital format as the QR code can be appropriated by a luxury brand like LV is notable in itself.
Diego’s project “With Robots” imagines a domestic scene where objects, furniture and the general environment have been modified for robot senses and affordances.

Another recent RCA project, this time from the Design Products course, looks at fashion in a robot-readable world.
Thorunn Arnadottir’s QR-code beaded dresses and sunglasses imagine a scenario where pop stars inject payloads of their own marketing messages into the photographs taken by paparazzi via readable codes, turning the parasites into hosts.
But such overt signalling to the distinct and separate senses of humans and robots is perhaps too clean-cut an approach.
Computer vision is a deep, dark specialism with strange opportunities and constraints. The signals that we design towards robots might be both simpler and more sophisticated than QR codes or other 2d barcodes.
Those QR ‘illustrations’ are gaining attention because they are novel. They are cheap, early and ugly computer-readable illustration, one side of an evolutionary pressure towards a robot-readable world. In the other direction, images of paintings, faces, book covers and buildings are becoming ‘known’ through the internet and huge databases. Somewhere they may meet in the middle, and we may have beautiful hybrids such as http://www.mayalotan.com/urbanseeder-thesis/inside/
In our own work with Dentsu, the Suwappu characters are being designed to be attractive and cute to humans, and meaningful to computer vision.
Their bodies are being deliberately gauged to register with a computer vision application, so that they can interact with imaginary storylines and environments generated by the smartphone.
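How might a toy’s body “register” with a vision app? As a much-simplified, hypothetical sketch (the real Suwappu pipeline is not described here, and the tiny `frame`, `template` and `best_match` below are invented for illustration), one of the oldest techniques is template matching: slide a known pattern over the camera frame and report where the difference is smallest.

```python
# Toy template matching: find where a known 2x2 pattern best fits inside a
# small grayscale "frame", by minimising the sum of squared differences.
# Real computer-vision apps use far more robust features than raw pixels.

def best_match(frame, template):
    """Return the (row, col) offset minimising sum of squared differences."""
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            score = sum(
                (frame[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

frame = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(best_match(frame, template))  # (1, 1)
```

Designing a character so that it is easy for this sort of matching to lock onto – high contrast, distinctive shapes from many angles – while staying cute to human eyes is precisely the both-audiences design problem the paragraph above describes.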
Back to Dawkins.
Living in the middle means that our limited human sensoriums and robots’ specialised, superhuman senses will overlap, combine and contrast.
Wavelengths we can’t see can be overlaid on those we can – creating messages for both of us.

SVK wasn’t created for robots to read, but it shows how UV wavelengths might be used to create an alternate hidden layer to be read by eyes that see the world in a wider range of wavelengths.
Timo and Jack call this “Antiflage” – a made-up word for something we’re just starting to play with.
It is the opposite of camouflage: the markings and shapes that attract and beguile robot eyes that see differently to us – just as Dawkins describes the strategies that flowers and plants have built up over evolutionary time to attract and beguile bees and hummingbirds – existing in a layer of reality complementary to the one we humans sense and are beguiled by.
And I guess that’s the recurring theme here – that these layers might not be hidden from us just by dint of their encoding, but by the fact that we don’t have the senses to detect them without technological enhancement.
And while I present this as a phenomenon, and dramatise it a little into being an emergent ‘force of nature’, let’s be clear that it is a phenomenon to design for, and with. It’s something we will invent, within the frame of the cultural and technical pressures that force design to evolve.
That was the message I was trying to get across at Glug: we’re the ones making the robots, shaping their senses, and the objects and environments they relate to.
Hence we make a robot-readable world.
I closed my talk with this quote from my friend Chris Heathcote, which I thought goes to the heart of this responsibility.
There’s a whiff in the air that it’s not as far off as we might think.
The Robot-Readable World is pre-Cambrian at the moment, but perhaps in a blink of an eye it will be all around us.
Some sad news today: over the next few weeks, we’ll be shuttering Schooloscope, and wrapping up our journey into UK schools.
There’s more detail on the Schooloscope blog – but here, I wanted to shout out thanks to the team: our friends at 4iP, our collaborators and the folks we’ve met along the way, everyone at BERG, and especially Kari, Tom Armitage and Matt Brown (Tom and Matt have since moved on). It’s a wonderful site, something I deeply believe is needed – thank you all for being with us on this journey.
As is the custom (at least when I can find it in time), we begin our weekly all-hands meeting with a burst of the theme to Battle Of The Planets and a fact from wikipedia about the week number.
Alex is back in the studio after an exciting incident involving his bike, his shoulder, a road and one of London’s characteristically-careful and considerate drivers. Good to have him back and on the mend. He’s working this week on Barry, and his monitor is full of incredibly-detailed, pixel-perfect illustrations and type. It’s looking lovely.
Denise is also working on Barry all week, but more on the service and product design aspects as well as the overall direction of the thing. She’s been working with the printers on the prep for the next run of SVK – so if you missed out first time round, make sure to sign up at http://getsvk.com for news about the next print run!
Joe’s working on Suwappu phase 2, concentrating on UI and visual design to collaborate with Nick on building.
Nick’s planning out some of the Suwappu tech architecture, ramping up on a Uinta project in terms of research and prep, doing more architecture work on Barry, and also descending into some embedded code darkness.
James is deep in Schooloscope tweaks and data-munging. He’s also on Barry spec-writing and development, integrating his work with Alice’s and turning the IA into code, refactoring as he goes. Busy fella.
Alice is stuck into Barry work with James and Alex. Barry’s been cracking along as a result, with lots of progress and intermittent whoops from the team.
Schulze is mainly writing with Timo this week – on Chaco and some other projects. Timo, back from holiday, meanwhile is planning for Chaco and Uinta outputs, doing some treatments for scripts for a couple of other things that are bubbling away. He’s also switched on his scholarly mind-modules – doing research for an article he’s writing which I’m quite excited about.
Kari’s doing a bit of SVK customer service, writing some documentation and pursuing year-end company financials stuff with MW.
Simon’s negotiating a lot of complexity this week. There’s SVK reprint planning, Uinta workshop planning, Chaco planning, Schooloscope and Suwappu project management, writing user-stories for Barry, finalising phase two of Dimensions and generally keeping us all honest and pointing in the right direction. At this rate, his trip to Burning Man at the end of the month is going to seem like a walk in the park.
Matt Webb wasn’t at all hands as he was out chasing down an exciting lead that might resurrect some old inventions… But he sent a telegram to be read out, framing the week for him as “Meetings-y”. He’s going to be on top of Schooloscope and company financials apart from that.
For me, this is a bit of a pause-for-breath week. I’m prepping for some new projects for Uinta, and ongoing work for Chaco – but also hoping to be doing a bit of thinking and writing here on the blog – while my giant metaphorical plastic sit-in log slowly ratchets up the incline, before the next steep drop of the future-flume we call BERG…
Initial thinking and brainstorming about cheap, ubiquitous, mundane technologies leads to fantastic leaps as the participants draw on the whiteboard.
As always there are dead ends and flights of fancy – but, as always – there are a couple of intriguing combinations and mutant products that have an itchy promise to them…
The mutating, morphing quality of drawing our hopeful monster objects on the whiteboard…
Always contrasts interestingly with the more procedural, mechanical evolutionary drawing produced by tables of post-it-pixels…
On the second day, we deployed our secret weapon!
We were lucky enough to have Durrell Bishop of the mighty Luckybite join us, and set us all an incredible brief for the day – design a mouse trap, and a ghost trap…
We’d asked the group to think about their favourite traps overnight, and come back with a drawing.
My favourite, I think, was this diagram of the boulder trap in Raiders of the Lost Ark.
So much peril encapsulated in a stick figure!
The day saw the group tangle with the realities of catching mice, and then swap to the more symbolic, reality-shaping nature of designing a ghost-trap.
Some favourites – from many – include…
Jill’s self-composting mouse-trap
Rafa’s CCTV gargoyle ghost-trap
Peter’s ghost-traps, including the awesome ‘Dark Sucker’, which we hope he builds…
And… Nora’s Black Cat/White-Cat ghost-trap service
Fantastic fun, and everyone produced really excellent, surprising stuff.
Thanks again to Liz Danzico, Qing Qing Chen and, of course, the group who attended the workshop and threw themselves into it so fully in the NYC heat…
Finally – I had great fun one of the afternoons taking photos of the group with an iPhone and a magnifying glass while they drew…
With at least a quarter of the studio gone on any given day this week, it’s been a bit quiet on the studio email list, but there have still been a few gems popping up.
Alice sent round a link to “The Flashed Face Effect”: the phenomenon that normal faces flashing by look monstrous when you’re viewing them with your peripheral vision. It’s another one of those fascinating things the brain does, and scientists still aren’t really sure why.
Denise pointed out Bill DeRouchey’s SXSW presentation on The History of the Button. It’s a fascinating walk through the past century looking at how buttons developed, what they signified, where we’ve gotten to now and where things might be going.
Matt Jones found the utterly delightful Rapping Paper. I’d be tempted to just frame the Run DMC “It’s Tricky” paper and hang it on my wall.
Nick pointed us to Bacon Ipsum, for when your Lorem Ipsum needs to be a little meatier. Simon countered with his friend Katie’s Vegan Ipsum for those among us who eschew meat and meat products.
Another last minute entry from Jones: his friend Steve Murray created “Forty Fords”, a tribute to Harrison Ford in commemoration of his 40th credited big screen appearance.
And finally, just for fun (we do quite like a bit of fun round here after all), I will leave you on this lovely Friday with the inimitable Beaker, doing an impressive multi-dubbed video rendition of Ode To Joy. That is, until it all goes a bit… err… badly.
Talk to Me, MoMA’s new exhibition about design and the communication between people and objects opened this week at the Museum of Modern Art in NYC. We’re very proud to be a part of a show that pulls together so many potent strands of contemporary design:
New branches of design practice have emerged in the past decades that combine design’s old-fashioned preoccupations—with form, function, and meaning—with a focus on the exchange of information and even emotion. Talk to Me explores this new terrain, featuring a variety of designs that enhance communicative possibilities and embody a new balance between technology and people, bringing technological breakthroughs up or down to a comfortable, understandable human scale.
There are physical interactive products such as David Rose’s ever-impressive connected medicine container Glowcaps, the exquisitely crafted musical interfaces Monome and Tenori-on, the empowering iOS payment interface Square and the characterful and playful Tengu, alongside popular apps like Talking Carl and Wordlens.
For such a broad exhibition it is great to see all of the works curated and presented with such thought and attention to the quality of each piece.
The exhibition takes place in the MoMA Special Exhibitions Gallery, from 24 July until 7 November 2011. Thanks to Paola Antonelli and the Talk to Me team for the excellent and patient work in putting this all together.
Week 320, and there are almost as many people in the studio as there were when I started here, back in week two hundred and seventy something. Most people are out of the office for some reason or another, which makes tea rounds a lot easier than usual. It’s gonna be a slightly empty office next week too, as myself, Jack, Matt Jones and Simon are off to the US for a few days of workshops. We’re currently listening to Hot Sauce Committee Part Two. I’ve been making people listen to David Rodigan’s Thursday night sets on Radio 2 as well.
I’ll start with the people not here. Jack & Matt Jones are currently in the US presenting something a great deal of the team have been working on for the last few months. I’m hoping Matt Webb is currently having a holiday and not working too much. Timo’s working out of the office this week, after going to the opening of Talk to Me at MoMA with Jack & Jones where a few of our recent projects are being exhibited, and working on Chaco related bits.
Joe is off today, but has been working on some Chaco related stuff, and is now working on a document trying to define our design process as a company – speaking to myself and Denise to try and solidify our ways of attacking design challenges going forward. It’s a hard thing to put on paper but will be fantastic to see progress, and also an essential thing to have sorted as we continue to grow as a company.
The back room (more commonly known as Statham) currently contains the mighty brains of Andy, Nick, Alice & James Darling – concentrating mostly on all things Weminuche. Andy did a bit of office tidying at the end of last week after a crazy few days. Denise is working with James on some IA and I’m working with Alice on some graphical elements. James is also doing some work on Schooloscope. I’ve been talking to Nick about cars in between his work on Weminuche & Chaco.
The mighty Simon Pearson is doing his usual brilliant job of herding our flocks of projects and making them work properly. He’s also been working with Kari running the customer service for SVK. He’s on a lot of conference calls trying to make things happen – and working with Denise on Suwappu. Kari’s only in for a few days this week, but keeping the office running like a well oiled machine as usual.
That’s it – I’m keeping weeknotes short this week. Super busy as usual but strangely quiet, at the same time.
Just discussing the fact we’re a Mattless office for the day (they’re both in NY). It’s not something that happens often, and so to soothe this slight unease, it feels fitting that we start Friday links with a BBC archive of the Moon landings, from Matt Webb. It has “lots of telly clips from the past 42 years”, including interviews, episodes of Panorama and a very young looking Patrick Moore, hosting The Sky at Night.
You might well have seen this by now, as it’s been quite widely discussed online, but Andy sent around this post on fake Chinese Apple stores earlier this week, titled ‘Are you listening, Steve Jobs?’ It’s quite extraordinary – and a little bit ‘uncanny valley’, if it’s possible to use that in this context. It’s almost right, but you can sense something’s up.
We had a bit of discussion about the Window to the World concept, from Toyota, a link discovered via @antimega. In the end we got slightly sidetracked by the comments. The fury at the child’s parents for not buckling her into a seatbelt, and the wrath of others for bringing that up. Feel free to dream about the future, but make sure you sweat the details.
James and Nick discussed Tubetap, an app that enables you to apply for a refund from Transport for London at the tap of a button – or several buttons. And on that note, Fix My Transport from MySociety is in beta testing at the moment – but for commuters like me, on multiple forms of public transport – looks like it could be great. (Also, it has the best tagline ever.)
Nick also found a bunch of Little Fellas over on Craftzine. You can’t always spot them in the wild, so why not make one? I’d like to see them applied to the LittleDog Robot mentioned last week.
So, it’s week 319. I’m pretty sure you know this already but just in case: 319 is a Smith Number, and perhaps more importantly, according to Wikipedia, the name of a song by Prince that cannot be found on Spotify.
The office is as busy as ever this week but, dare I say it, slightly quieter in terms of volume. Completely coincidentally, I’m sure, Matt Jones and Jack are in New York this week, taking workshops and visiting clients. They’ve sent word back home via a Google Hangout – which sounds like it was a pain to set up, but once it got going felt like “quite a nice informal way to video chat”. They’re getting a lot done over there, but we also have a sneaking suspicion they could be having too much fun.
It feels like a very collaborative week. We’ve just had an ‘all hands meeting’, and several people mentioned ‘being a sounding board for…’, which is one of the nice things about working here. It’s easy to ask opinions of others – and everyone is interested in everyone else’s ‘stuff’. There’s a lot of respect for the knowledge others have, even if the boundaries of people’s particular work disciplines are blurred.
In literal terms this week, the project code named ‘Chaco’ is taking up time from Joe, Simon, Andy and Nick. Each person is playing a very different role.
Weminuche is occupying the minds of Alex and Alice, James and myself, with a bit of extra time from Andy and Nick. Alex and Alice are working closely together on APIs and design and I’m working on IA with James, who very patiently listens to my latest master plan and either agrees or pokes my ideas with a big stick to see if and when they fall apart.
James is also working on Schooloscope, with some help from Nick. As well as Chaco, Simon is also working on some SVK customer service, and planning for new Suwappu and Dimensions stages.
Matt Webb is trying to go on holiday, but has managed to book himself into a hotel 3 blocks from the location of a client meeting with Jack and Matt J. We’ll see how well he manages to avoid them. If you’re in NY and spot him on the street, for heaven’s sake don’t mention work.