This website is now archived. To find out what BERG did next, go to

Search results for 'suwappu'

Suwappu in Designs Of The Year 2012

Suwappu at Designs Of The Year, Design Museum

Suwappu – the augmented-reality toy we invented with Dentsu London – is a nominee this year in the Digital category of the Designs Of The Year show at London’s Design Museum.


It’s in great company – with other nominees in the category such as the Kinect, the Guardian’s iPad app (which we also consulted on, with Mark Porter and the brilliant internal team at the paper), High Arctic by UVA and others.

The Suwappu certainly get around a bit – here they are last year, when they went to Pop!Tech with me to speak about toys, play and learning in a Robot-Readable World.

Suwappu at Pop!Tech

And last year they also lived for a while at MoMA, at the Talk To Me exhibit

We worked with Dentsu London from their original idea to bring them to life through model-making and animation, and then build working prototype software on the cutting-edge of what’s possible in computer-vision on smartphones.

It’s great to have partnerships like this that can rapidly get all the way from a strategic idea – ‘What if toys were a media channel?’ – through to working, real things that can be taken to market.

That’s our favourite thing!

Of course – it’s a lovely bonus when they get recognised in a wider cultural context such as MoMA or the Design Museum.

As well as making our own products, we spend most of our time in the studio working closely in partnership with clients to create new things for them – making strategy real through research, design, making and communication.

Do get in touch if you and your company would like to work with us this way.

Suwappu app prototype – toys, stories, and augmented reality

You may remember Suwappu, our toy invention project with Dentsu — those woodland creatures that talk to one another when you watch them through your phone camera. You can see the film – the design concept – here or (and now I’m showing off) in New York at MoMA, in the exhibition Talk to Me.

Here’s the next stage, a sneak peek at the internal app prototype:

Direct link: Suwappu app prototype video, on Vimeo.

It’s an iPhone app which is a window to Suwappu, where you can see Deer and Badger talk as you play with them.

Behind the scenes, there’s some neat technology here. The camera recognises Deer and Badger just from what they look like — it’s a robot-readable world but there’s not a QR Code in sight. The camera picks up on the designs of the faces of the Suwappu creatures. Technically this is markerless augmented reality — it’s cutting-edge computer vision.
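(For the technically curious: the actual recognition pipeline belongs to Zappar and isn’t public, but the general shape of appearance-based matching can be sketched in miniature. In this toy sketch – entirely illustrative, not anyone’s real implementation – binary feature descriptors are matched by Hamming distance against stored descriptors for each character. Here the descriptors are just small integers standing in for the hundreds of ORB/FAST-style codes a real system would extract per frame.)

```python
# Toy sketch of markerless recognition: match binary feature descriptors
# from a camera frame against stored descriptors for each character,
# and report whichever character matches best. Descriptor extraction
# itself is assumed; integers stand in for real binary feature codes.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two descriptors."""
    return bin(a ^ b).count("1")

def count_matches(frame_desc, character_desc, max_dist=2):
    """Count frame descriptors within max_dist bits of any stored one."""
    return sum(
        1 for f in frame_desc
        if any(hamming(f, c) <= max_dist for c in character_desc)
    )

def recognise(frame_desc, characters, min_matches=2):
    """Return the best-matching character name, or None if too few matches."""
    best, best_score = None, 0
    for name, desc in characters.items():
        score = count_matches(frame_desc, desc)
        if score > best_score:
            best, best_score = name, score
    return best if best_score >= min_matches else None

characters = {
    "Deer":   [0b10110010, 0b01101100, 0b11100001],
    "Badger": [0b00011101, 0b11010110, 0b00101011],
}
frame = [0b10110011, 0b01101100, 0b11111111]  # a noisy view of Deer
print(recognise(frame, characters))  # -> Deer
```

A real markerless system does far more (scale- and rotation-invariant keypoints, geometric verification, pose estimation), but the core loop – extract, match, threshold – is the same shape.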


And what’s also neat is that the augmented reality is all in 3D: you suddenly see Deer as inside a new environment, one that moves around and behind the toy as you move the phone around. It’s all tabletop too, which is nicely personal. The tabletop is a fascinating place for user interfaces, alongside the room-side interfaces of Xbox Kinects and Nintendo Wiis, the intimate scale of mobiles, and the close desktop of the PC. Tabletop augmented reality is play-scale!

But what tickles us all most about Suwappu is the story-telling.

Seeing the two characters chatting, and referencing a just-out-of-camera event, is so provocative. It makes me wonder what could be done with this story-telling. Could there be a new story every week, some kind of drama occurring between the toys? Or maybe Badger gets to know you, and you interact on Facebook too. How about one day Deer mentions a new character, and a couple of weeks later you see it pop up on TV or in the shops?

The system that it would all require is intriguing: what does a script look like, when you’re authoring a story for five or six woodland creatures, and one or two human kids who are part of the action? How do we deliver the story to the phone? What stories work best? This app scratches the surface of that, and I know these are the avenues the folks at Dentsu are looking forward to exploring in the future. It feels like inventing a new media channel.

Suwappu is magical because it’s so alive, and it fizzes with promise. Back in the 1980s, I played with Transformers toys, and in my imagination I thought about the stories in the Transformers TV cartoon. And when I watched the cartoon, I was all the more engaged for having had the actual Transformers toys in my hands. With Suwappu, the stories and the toys are happening in the same place at the same time, right in my hands and right in front of you.

Here are some more pics.


The app icon.


Starting the tech demo. You can switch between English and Japanese.


Badger saying “Did I make another fire?” (Badger has poor control over his laser eyes!)


Deer retweeting Badger, and adding “Oh dear.” I love the gentle way the characters interact.

You can’t download the iPhone app — this is an internal-only prototype for Dentsu to test the experience and test the technology. We’re grateful to them for being so open, and for creating and sharing Suwappu.

Thanks to all our friends at Dentsu (the original introduction has detailed credits), the team here at BERG, and thanks especially to Zappar, whose technology and smarts in augmented reality and computer vision have brought Suwappu to life.

Read more about the Suwappu app prototype on Dentsu London’s blog, which also discusses some future commercial directions for Suwappu.

Suwappu: Toys in media

Dentsu London are developing an original product called Suwappu. Suwappu are woodland creatures that swap pants, toys that come to life in augmented reality. BERG have been brought in as consultant inventors, and we’ve made this film. Have a look!

Suwappu is a range of toys, animal characters that live in little digital worlds. The physical toys are canvases upon which we can paint worlds; through a phone (or tablet) lens we can see into the narratives, games and media in which they live.

Dentsu London says:

We think Suwappu represents a new kind of media platform, and all sorts of social, content and commercial possibilities.

Each character lives in different environments: Badger lives in a harsh and troubled world, Deer lives in a forest utopia, Fox in an urban garden, Tuna in a paddling pool of nicely rendered water. The worlds also contain other things, such as animated facial expression, dialogue pulled from traditional media and Twitter, and animated sidekick characters.

Suwappu Deer and Tuna

The first part of this film imagines and explores the Suwappu world. Here we are using film to explore how animation and behaviours can draw out character and narrative in physical toy settings. The second part is an explanation of how Suwappu products might work, from using animal patterns as markers for augmented reality, to testing out actual Augmented Reality (AR) worlds on a mobile phone.

Suwappu real-time AR tests

We wanted to picture a toy world that was part-physical, part-digital and that acts as a platform for media. We imagine toys developing as connected products, pulling from and leaking into familiar media like Twitter and YouTube. Toys already have a long and tenuous relationship with media, as film or television tie-ins and merchandise. It hasn’t been an easy relationship. AR seems like a very apt way of giving cheap, small, non-interactive plastic objects an identity and set of behaviours in new and existing media worlds.

Schulze says:

We see the media and animation content around the toys as almost episodic, like comic books. Their changing characters, behaviours and motivations played out across different media.

Toys are often related, as merchandise, to their screen-based counterparts. Although as products toys have fantastic charm and an awesome legacy, they feel muted in comparison to their animated mirror selves on the big screen. As we worked with Dentsu on the product and brand space around the toys, we speculated on animated narratives to accompany the thinking and characters developed.

In the film, one of the characters makes a reference to dreams. I love the idea that the toys, in their physical form, dream their animated televised adventures in video. When they awake into their plastic prisons, they half-remember the super-rendered, full-motion freedoms and adventures from the world of TV.

Each Suwappu character can be split into two parts, and each half can be swapped with any other, resulting in a new hybrid character. Each character has its own personality (governed by its top half) and ‘environment’ (dictated by its bottom half). This allows the creatures to visit each other’s worlds, and opens up experimentation with the permutations of the characters’ personalities and the worlds that they inhabit. It’s possible to set up games and narratives based on the ways that the characters and their pants are manipulated.

Suwappu 3D registration

This is not primarily a technology demo, it’s a video exploration of how toys and media might converge through computer vision and augmented video. We’ve used video both as a communication tool and as a material exploration of toys, animation, augmented reality and 3D worlds. We had to invent ways of turning inanimate models into believable living worlds through facial animation, environmental effects, sound design and written dialogue. There are other interesting findings in the exploration, such as the way in which the physical toys ‘cut out’ or ‘occlude’ their digital environments. This is done by masking out an invisible virtual version of the toy in 3D, which makes for a much more believable and satisfying experience, and something we haven’t seen much of in previous AR implementations.
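(The occlusion trick is worth making concrete. Here’s a minimal sketch – nothing like the real renderer, which works in 3D on the GPU – of the same idea on a single scanline: an invisible ‘phantom’ model of the toy is written into the depth buffer only, so virtual scenery fails the depth test behind the toy and the real camera pixels show through. All the names and values below are made up for illustration.)

```python
# One-scanline sketch of phantom-object occlusion in AR compositing:
# virtual scenery is drawn over the camera image only where it is
# nearer than the invisible depth-only model of the physical toy.

INF = float("inf")

def composite(camera_row, phantom_depth, scenery):
    """Composite one scanline of virtual scenery over camera pixels.

    camera_row    -- real camera pixels
    phantom_depth -- depth of the invisible toy model per pixel (INF = no toy)
    scenery       -- per-pixel virtual fragments as (pixel, depth), or None
    """
    out = list(camera_row)
    for x, frag in enumerate(scenery):
        if frag is None:
            continue
        pixel, depth = frag
        # Draw scenery only if it is nearer than the phantom toy;
        # otherwise the real toy (the camera pixel) occludes it.
        if depth < phantom_depth[x]:
            out[x] = pixel
    return out

camera  = ["cam"] * 5
phantom = [INF, 2.0, 2.0, 2.0, INF]  # toy occupies pixels 1-3 at depth 2
scenery = [("tree", 3.0), ("tree", 3.0), ("bird", 1.0), None, ("tree", 3.0)]

print(composite(camera, phantom, scenery))
# -> ['tree', 'cam', 'bird', 'cam', 'tree']
# The tree behind the toy is hidden; the bird flies in front of it.
```

In a real pipeline this is the classic depth-only pass: render the phantom model with colour writes disabled, then render the virtual environment with depth testing on.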

We all remember making up stories with our toys when we were young, or our favourite childhood TV cartoon series where our toys seemed to have impossible, brilliant lives of their own. Now that we have the technology to have toys soak in media, what tales will they tell?

BERG in Fast Company’s list of the world’s 50 most innovative companies

I’m super proud to see BERG at #44 in Fast Company’s list of the World’s 50 Most Innovative Companies.

It’s an incredible list to be on! Apple, Facebook, Google and Amazon, Square and Twitter are all there — and to pick just a few of the others, Jawbone, Polyvore, and Kickstarter.


Little Printer gets a shout out — our product for the home that prints you a miniaturised personal newspaper, daily.

It also seems like BERG is the only design consultancy on the list. We spend 50% of our time collaborating with clients, people like Intel and the Guardian, prototyping and producing everything from cute toys that talk in augmented reality to the first magazine (and magazine platform) for the iPad.

We’re currently planning our work across March and April. If you’d like to work with us, inventing and innovating, do drop me a line. The email is

In the meantime here’s our profile in Fast Company’s list: #44 Berg – For wildly imagining the marriage of the digital and physical worlds.

Thanks FastCo, and thanks team!

Week 344

Factoid of the week: the year 344 was a leap year starting on a Sunday. As is 2012. How about that.

Week 344 in the BERG studio has a lot of to-ing and fro-ing. Joe rejoined the studio (back from the US trip with Jones & Webb) on Tuesday. Jones stayed in the US for a couple of extra days but has just arrived back in the studio, straight from the airport. What can I say, he is hardcore. Webb was at CES in Las Vegas yesterday (we can’t wait to get his report) and continues his US mini-tour in San Francisco today. James Darling is still on a tropical beach somewhere. Other BERG folk have been out to see GPs and osteos, track down packages, run various errands, etc. At the same time, we’ve had a number of visits from clients/partners and also have several contractors spending time in the studio this week. So it’s still felt like the busy, buzzing hub that it usually is.

Let me say a quick word about two people who have been mentioned in passing in previous weeknotes without much other explanation as to who they are. Phil Wright is a contractor who has been helping us out with the development of Little Printer since April of last year. He spends most days in the studio and has his own desk and everything, so although he remains on contract status, he feels like part of the regular BERG team. Helen Rogers joined us for two afternoons a week at the beginning of December to start training to take over for me as our Studio Manager when I go on maternity leave at the beginning of February. From this week she’s up to four afternoons a week, and from the start of February, she’ll be four full days a week. It’s been a treat to work with her thus far as she is super clever and catches onto everything so quickly. It’s nice knowing that the studio will be in very competent hands when I step away in a few weeks. Watch for more info about her to show up in the Studio section of the website soon!

As for the rest of the BERGians, this week Simon is doing some rounding off of project costs for 2011 and looking at capacity planning for 2012, leading some workshops on the continued development and future of Little Printer, coordinating various bits of Uinta projects that we have on the go, and working through the final issues that still need to be resolved in the new studio. In case you missed it, he also posted adverts for two new positions that we’re looking to hire for. If you’re interested in working for BERG, please do have a look to see if either of those describes you!

Nick has been working on the technical architecture for BERG Cloud, thinking about chips and font rendering for Little Printer and doing some work on the Suwappu app.

Joe has been catching up on what he missed being out for a week and getting his feet back under him. He’s mainly working on integrating animation in a couple of Uinta projects.

Denise is still very generously handling most of the enquiries that come in about BERG Cloud and Little Printer. She’s also continuing work on the UI and IA for the internet side of Little Printer.

Alex has the fun job of developing the brief for the Little Printer packaging and unboxing experience. He’s still doing some work on Uinta this week and is also helping to make the new studio a happier, more accommodating place with a functional doorbell and signage.

Alice is also involved in the font rendering work for Little Printer and is doing some early stage investigative work into dev tools for people who want to create their own publications for Little Printer.

Timo is working on a Uinta animation brief and is also doing some shooting for a 90 second test pilot. I’m sure more will be revealed about that in good time, but it’s potentially pretty exciting.

Andy is making good use of our CitySpring courier account, sending various components hither and yon. He’s also having conversations about what should be printed on the back of Little Printer. I suppose most people don’t really think too much about the copy on the back of their electronics, but it turns out it’s pretty important.

As for me, I have been doing all the usual financial admin, trying to wrap up some last bits of business around moving studio, ordering office supplies, handling all the general (i.e. non-Little Printer or BERG Cloud) enquiries that come in to the studio, etc. Today I get to teach Helen how to run the quarterly VAT return. (Exciting stuff, eh?) And I’ve been getting kicked in the ribs (from the inside) pretty much the whole time I’ve been typing this. Maybe that second cup of tea wasn’t such a great idea after all…

Gardens and Zoos

This is a version of a talk that I gave at the “In Progress” event, staged by ‘It’s Nice That‘ magazine.

It builds on some thoughts that I’ve spoken about at some other events in 2011, but I think this version puts it forward in the way I’m happiest with.

Having said that, I took the name of the event literally – it’s a bit of a work-in-progress, still.

It might more properly be entitled ‘Pets & Pot-plants’ rather than ‘Gardens & Zoos’ – but the audience seemed to enjoy it, and hopefully it framed some of the things we’ve been thinking about and discussing in the studio over the last year or so, as we’ve been working on projects looking at the near-future of connected products.

And – with that proviso… Here it is.

Let me introduce a few characters…

This is my frying pan. I bought it in Helsinki. It’s very good at making omelettes.

This is Sukie. She’s a pot-plant that we adopted from our friend Heather’s ‘Wayward Plants‘ project, at the Radical Nature exhibit at the Barbican (where “In Progress” is!)

This is a puppy – we’ll call him ‘Bruno’.

I have no idea if that’s his name, but it’s from our friend Matt Cottam’s “Dogs I Meet” flickr set, and Matt’s dog is called Bruno – so it seemed fitting.

And finally, this is Siri – a bot.

And, I’m Matt Jones – a designer and one of the principals at BERG, a design and invention studio.

There are currently 13 of us – half-technologists, half-designers, sharing a room in East London where we invent products for ourselves and for other people – generally large technology and media companies.

This is Availabot, one of the first products that we designed – it’s a small connected product that represents your online status physically…

But I’m going to talk today about the near-future of connected products.

And it is a near-future, not far from the present.

In fact, one of our favourite quotes about the future is from William Burroughs: When you cut into the present, the future leaks out…

A place we like to ‘cut into the present’ is the Argos catalogue! Matt Webb’s talked about this before.

It’s really where you see Moore’s Law hit the high-street.

Whether it’s toys, kitchen gear or sports equipment – it’s getting hard to find consumer goods that don’t have software inside them.

This is a near-future where the things around us start to display behaviour – acquiring motive and agency as they act and react to the context around them, according to the software they have inside them and, increasingly, the information they get from (and publish back to) the network.

In this near-future, it’s very hard to identify the ‘U’ in ‘UI’ – that is, the User in User-Interface. It’s not so clear anymore what these things are. Tools… or something more.

Of course, I choose to illustrate this slightly-nuanced point with a video of kittens riding a Roomba that Matt Webb found, so you might not be convinced.

However, this brings us back to our new friends, the Bots.

By bot – I guess I mean a piece of software that displays a behaviour, that has motive and agency.

Let me show a clip about Siri, and how having bots in our lives might affect us [Contains Strong Language!]

Perhaps, like me – you have more sympathy for the non-human in that clip…

But how about some other visions of what it might be like to have non-human companions in our lives? For instance, the ‘daemons’ of Philip Pullman’s ‘His Dark Materials’ trilogy. They are you, but not you – able to reveal things about you and reveal things to you. Able to interact naturally with you and each other.

Creatures we’ve made that play and explore the world don’t seem that far-fetched anymore. This is a clip of work on juggling robot quadcopters by ETH Zurich.

Which brings me back to my earlier thought – that it’s hard to see where the User in User-Interfaces might be. User-Centred Design has been the accepted wisdom for decades in interaction design.

I like this quote that my friend Karsten introduced me to, by Prof Bertrand Meyer (coincidentally a professor at ETH), that might offer an alternative view…

A more fruitful stance for interaction design in this new landscape might be that offered by Actor-Network Theory?

I like this snippet from a formulation of ANT based on work by Geoff Walsham et al.

“Creating a body of allies, human and non-human…”

Which brings me back to this thing…

Which is pretty unequivocally a tool. No motive, no agency. The behaviour is that of its evident, material properties.

Domestic pets, by contrast, are chock-full of behaviour, motive, agency. We have a model of what they want, and how they behave in certain contexts – as they do of us, we think.

We’ll never know, truly of course.

They can surprise us.

That’s part of why we love them.

But what about these things?

Even though we might give them names, and have an idea of their ‘motive’ and behaviour, they have little or no direct agency. They move around by getting us to move them around, by thriving or wilting…

And – this occurred to me while doing this talk – what are houseplants for?

Let’s leave that one hanging for a while…

And come back to design – or more specifically – some of the impulses beneath it. To make things, and to make sense of things. This is one of my favourite quotes about that. I found it in an exhibition explaining the engineering design of the Sydney Opera House.

Making models to understand is what we do as we design.

And, as we design for slightly-unpredictable, non-human-centred near-futures we need to make more of them, and share them so we can play with them, spin them round, pick them apart and talk about what we want them to be – together.

I’ll just quickly mention some of the things we talk about a lot in our work – the things we think are important in the models and designs we make for connected products. The first one is legibility: that the product or service presents a readable, evident model of how it works to the world on its surface; that there is legible feedback, and you can quickly construct a theory of how it works through that feedback.

One of the least useful notions you come up against, particularly in technology companies, is the stated ambition that the use of products and services should be ‘seamless experiences’.

Matthew Chalmers has stated (after Mark Weiser, one of the founding figures of ‘ubicomp’) that we need to design “seamful systems, with beautiful seams”

Beautiful seams attract us to the legible surfaces of a thing, and allow our imagination in – so that we start to build a model in our minds (and appreciate the craft at work, the values of the thing, the values of those that made it, and how we might adapt it to our values – but that’s another topic)

Finally – this guy – who pops up a lot on whiteboards in the studio, or when we’re working with clients.

B.A.S.A.A.P. is a bit of an internal manifesto at BERG, and stands for Be As Smart As A Puppy – and it’s something I’ve written about at length before.

It stems from something robotics and AI expert Rodney Brooks said: that if we put the fifty smartest people in a room for fifty years, we’d be lucky if we made AIs as smart as a puppy.

We see this as an opportunity rather than a problem!

We’ve made it our goal to look to models of intelligence and emotional response in products and services other than emulating what we’d expect from humans.

Which is what this talk is about. Sort-of.

But before we move on, a quick example of how we express these three values in our work.

“Text Camera” is a very quick sketch of something that we think illustrates legibility, seamful-ness and BASAAP neatly.

Text Camera is about using the inputs and inferences the phone senses around it to ask a series of friendly questions that help make clearer what it can sense and interpret. It reports back on what it sees in text, rather than through a video feed.

Let me explain one of the things it can do as an example. Your smartphone camera has a bunch of software to interpret the light it’s seeing around you – in order to adjust the exposure automatically.

So, we look to that and see if it’s reporting ‘tungsten light’ for instance, and can infer from that whether to ask the question “Am I indoors?”.

Through the dialogue we feel the seams – the capabilities and affordances of the smartphone – and start to make a model of what it can do.
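(The Text Camera sketch itself is an internal prototype, so here’s an illustrative miniature of the idea instead – the signal names and thresholds below are made up, but the shape is the same: turn the camera’s automatic inferences into friendly questions.)

```python
# Sketch of the Text Camera idea: map raw, machine-ish camera readings
# (white balance, exposure, face detection) to conversational questions
# that make the device's sensing legible. All keys/thresholds are
# hypothetical, for illustration only.

def questions(sensed: dict) -> list:
    """Return the friendly questions suggested by the sensed signals."""
    asked = []
    if sensed.get("white_balance") == "tungsten":
        asked.append("Am I indoors?")          # tungsten light suggests indoors
    if sensed.get("exposure", 0) > 0.8:
        asked.append("Is it dark in here?")    # long exposure suggests low light
    if sensed.get("faces", 0) > 0:
        asked.append("Is someone there?")      # face detector fired
    return asked

print(questions({"white_balance": "tungsten", "faces": 1}))
# -> ['Am I indoors?', 'Is someone there?']
```

The point isn’t the inference itself – phones already do all this silently – but surfacing it as dialogue, so the seams become something you can read.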

So next, I want to talk a little about a story you might be familiar with – that of…

I hope that last line doesn’t spoil it for anyone who hasn’t seen it yet…

But – over the last year I’ve been talking with a lot of people about a short scene in the original 1977 Star Wars movie ‘A New Hope’, where Luke and his Uncle Owen are attempting to buy some droids from the Jawas that have pulled up outside their farmstead.

I’ve become a little obsessed with this sequence – where the droids are presented like… Appliances? Livestock?

Or more troublingly, slaves?

Luke and Uncle Owen relate to them as all three – at the same time addressing them directly, aggressively and passive-aggressively. It’s such a rich mix of ways that ‘human and non-human actors’ might communicate.

Odd, and perhaps the most interesting slice of ‘science-fiction’ in what otherwise is squarely a fantasy film.

Of course Artoo and Threepio are really just…

Men in tin-suits, but our suspension of disbelief is powerful! Which brings me to the next thing we should quickly throw into the mix of the near-future…

This is the pedal of my Brompton bike. It’s also a yapping dog (to me at least)

Our brains are hard-wired to see faces – it’s part of a phenomenon called ‘pareidolia’.

It’s something we’ve talked about before on the BERG blog, particularly in connection with Schooloscope. I started a group on Flickr called “Hello Little Fella” to catalogue my pareidolic excesses (other face-spotting groups are available).

This little fella is probably my favourite.

He’s a little bit ill, and has a temperature.


The reason for this particular digression is to point out that one of the prime materials we work with as interaction designers is human perception. We try to design things that work to take advantage of its particular capabilities and peculiarities.

I’m not sure if anyone here remembers the Apple Newton and the Palm Pilot?

The Newton was an incredible technological attainment for its time – recognising the user’s handwriting. The Palm instead forced us to learn a new type of writing (“Graffiti”).

We’re generally faster learners than our technology, as long as we are given something that can be easily approached and mastered. We’re more plastic and malleable – what we do changes our brains – so the ‘wily’ technology (and its designers) will seize upon this and use it…

All of which leaves me wondering whether we are working towards Artificial Empathy, rather than Artificial Intelligence in the things we are designing…

If you’ve seen this video of ‘Big Dog’, an all-terrain robot by Boston Dynamics – and you’re anything like me – then you flinch when its tester kicks it.

To quote from our ‘Artificial Empathy’ post:

Big Dog’s movements and reactions – its behaviour in response to being kicked by one of its human testers (about 36 seconds into the video above) – is not expressed in a designed face, or with sad ‘Dreamworks’ eyebrows, but in pure reaction – which uncannily resembles the evasion and unsteadiness of a just-abused animal.

Of course, before we get too carried away by artificial empathy, we shouldn’t forget what Big Dog is primarily designed for, and funded by…

Anyway – coming back to ‘wily’ tactics, here’s the often-referenced ‘Uncanny Valley’ diagram, showing the relationship between ever-more-realistic simulations of life, particularly humans and our ‘familiarity’ with them.

Basically, as we get ever closer to trying to create lifelike-simulations of humans, they start to creep us out.

It can perhaps be most neatly summed up as our reaction to things like the creepy, mocapped synthespians in the movie Polar Express…

The ‘wily’ tactic then would be to stay far away from the valley – aim to make technology behave with empathic qualities that aren’t human at all, and let us fill in the gaps as we do so well.

Which brings us back to BASAAP – which, as Rodney Brooks pointed out, is still really tough.

Bruno’s wild ancestors started to brute-force the problem of creating artificial empathy and a working companion-species relationship with humans through the long, complex process of domestication and selective-breeding…

…from the first time these kinds of eyes were made towards scraps of meat held at the end of a campfire, somewhere between 12,000 and 30,000 years ago…

Some robot designers have opted to stay on the non-human side of the uncanny valley, notably perhaps Sony with AIBO.

Here’s an interesting study from 2003 that hints a little at what the effects of designing for ‘artificial empathy’ might be.

We’re good at holding conflicting models of things in our heads at the same time it seems. That AIBO is a technology, but that it also has ‘an inner life’.

Take a look at this blog, where an AIBO owner posts its favourite places, and laments:

“[he] almost never – well, make it never – leaves his station these days. It’s not for lack on interest – he still is in front of me at the office – but for want of preservation. You know, if he breaks a leg come a day or a year, will Sony still be there to fix him up?”

(One questioner after my talk asked: “What did the 25% of people who didn’t think AIBO was a technological gadget report it to be?” – Good question!)

Some recommendations of things to look at around this area: the work of Donna Haraway, esp. The Companion Species Manifesto.

Also, the work of Cynthia Breazeal, Heather Knight and Kacie Kinzer – and the ongoing LIREC research project that our friend Alexandra Deschamps-Sonsino is working with, which is looking to studies of canine behaviour and companionship to influence the design of bots and robots.

In science-fiction there’s a long, long list that could go here – but for now I’ll just point to the most-affecting recent thing I’ve read in the area, Ted Chiang’s novella “The Lifecycle of Software Objects” – which I took as my title for a talk partly on this subject at UX London earlier in the year.

In our own recent work I’d pick out Suwappu, a collaboration with Dentsu London as something where we’re looking to animate, literally, toys with an inner life through a computer-vision application that recognises each character and overlays dialogue and environments around them.

I wonder how this type of technology might develop hand-in-hand with storytelling to engage and delight – while leaving room for the imagination and empathy that we so easily project on things, especially when we are young.

Finally, I want to move away from the companion animal as a model, back to these things…

I said we’d come back to this! Have you ever thought about why we have pot plants? Why do we have them in the corners of our lives? How did they get there? What are they up to?!?

(Seriously – I haven’t managed yet to find research or a cultural history of how pot-plants became part of our home life. There are obvious routes through farming, gardening and cooking – but what about ornamental plants? If anyone reading this wants to point me at some they’d recommend in the comments to this post, I’d be most grateful!)

Take a look at this – one of the favourite finds of the studio in 2011 – Sticky Light.

It is very beautifully simple. It displays motive and behaviour. We find it fascinating and playful. Of course, part of its charm is that it can move around of its own volition – it has agency.

Pot-plants have motives (stay alive, reproduce) and behaviour (grow towards the light, shrivel when not watered) but they don’t have much agency. They rely on us to move them into the light, to water them.

Some recent projects have looked to augment domestic plants with some agency – Botanicalls by Kati London, Kate Hartman, Rebecca Bray and Rob Faludi equips a plant not only with a network connection, but a Twitter account! Activated by sensors, it can report to you (and its followers) whether it is getting enough water. Some voice, some agency.

(I didn’t have time to mention it in the talk, but I’d also point to James Chambers’ evolution of the idea with his ‘Has Needs’ project, where an abused pot-plant not only has a network connection, but the means to advertise for a new owner on Freecycle…)
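(The Botanicalls pattern – sensor reading in, plant-voiced message out – is simple enough to sketch. This is not the real Botanicalls firmware, and the thresholds are invented; it just illustrates how little logic it takes to give a plant a legible voice.)

```python
# Illustrative sketch of a Botanicalls-style plant voice: a raw
# soil-moisture reading becomes a status message the plant could
# post to its followers. Thresholds are hypothetical.

DRY, COMFORTABLE, SOAKED = 300, 600, 900  # made-up sensor thresholds

def plant_status(moisture_reading: int) -> str:
    """Translate a raw soil-moisture reading into a postable message."""
    if moisture_reading < DRY:
        return "URGENT! Water me!"
    if moisture_reading < COMFORTABLE:
        return "Could use a drink."
    if moisture_reading < SOAKED:
        return "Thanks, I'm fine."
    return "You over-watered me."

print(plant_status(150))  # -> URGENT! Water me!
print(plant_status(700))  # -> Thanks, I'm fine.
```

Legible motive (stay watered), limited behaviour (a handful of messages), a little agency (it speaks up unprompted): exactly the in-between territory the talk is circling.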

Here’s my botanical, which I chose to call Robert Plant…

So, much simpler systems than people or pets can find places in our lives as companions. Legible motives, limited behaviours and agency can elicit response, empathy and engagement from us.

We think this is rich territory for design as the things around us start to acquire means of context-awareness, computation and connectivity.

As we move from making inert tools – that we are unequivocally the users of – to companions, with behaviours that animate them – we wonder whether we should go straight from this…

…to this…

Namely, straight from things with predictable and legible properties and affordances, to things that try to have a peer-relationship with us – speaking with a human voice and making great technological leaps to relate to us that way, but perhaps in danger of entering the uncanny valley.

What if there’s an interesting space to design somewhere in-between?

This in part is the inspiration behind some of the thinking in our new platform Berg Cloud, and its first product – Little Printer.

We like to think of Little Printer as something of a ‘Cloud Companion Species’ that mediates the internet and the domestic, that speaks with your smartphone, and digests the web into delightful little chunks that it dispenses when you want.

Little Printer is the beginning of our explorations into these cloud-companions, and BERG Cloud is the means we’re creating to explore them.

Ultimately we’re interested in the potential for new forms of companion species that extend us. A favourite project for us is Natalie Jeremijenko’s “Feral Robotic Dogs” – a fantastic example of legibility, seamful-ness and BASAAP.

Natalie went to communities near reclaimed land that might still have harmful toxins present, and taught workshops where cheap robot dogs (remember Argos?) that could be bought for $30 or so were opened up and hacked to accommodate new sensors.

They were reprogrammed to seek the chemical traces associated with lingering toxins. Once released by the communities, they ‘sniff’ the toxins out, waddling towards the highest concentrations – an immediate, tangible and legible visualisation of problem areas.
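The dogs’ ‘sniffing’ is essentially hill-climbing on a sensed concentration field. Here’s a minimal sketch of that behaviour – the plume function and step logic are invented for illustration, nothing like the actual hacked firmware:

```python
# Toy 'sniff and waddle': step toward whichever neighbouring spot smells
# stronger, stop when nowhere nearby does. The plume is invented.

def concentration(x):
    """Pretend toxin plume along a line, peaking at x = 3.0."""
    return 1.0 / (1.0 + (x - 3.0) ** 2)

def waddle_to_peak(x, step=0.1, iters=200):
    """Follow the local concentration gradient to a (local) maximum."""
    for _ in range(iters):
        left, right = concentration(x - step), concentration(x + step)
        if max(left, right) <= concentration(x):
            break  # nowhere nearby smells stronger: we've found a hotspot
        x = x - step if left > right else x + step
    return x
```

Like the real dogs, this only finds local hotspots – which is exactly the tangible, legible behaviour that made the project work.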

Perhaps most important was that the communities themselves were the ones taught to open the toys up and repurpose their motives and behaviour – giving them agency over the technology, and over evidence they could build themselves.

In the coming world of bots – whether companions or not – we have to attempt to maintain this sort of open literacy. And it is partly the designer’s role to increase its legibility: not only to beguile and create empathy, but to allow a dialogue.

As Kevin Slavin said about the world of algorithms growing around us: “We can write it but we can’t read it.”

We need to engage with the complexity and make it open up to us.

To make evident, seamful surfaces through which we can engage with puppy-smart things.

As our friend Chris Heathcote has put so well:

Thanks for inviting me, and for your attention today.

FOOTNOTE: Auger & Loizeau’s Domestic Robots.

I didn’t get the chance to reference the work of James Auger & Jimmy Loizeau in the talk, but their ‘Carnivorous Domestic Entertainment Robots’ project deserves study.

From the project website:

“For a robot to comfortably migrate into our homes, appearance is critical. We applied the concept of adaptation to move beyond the functional forms employed in laboratories and the stereotypical fictional forms often applied to robots. In effect creating a clean slate for designing robot form, then looking to the contemporary domestic landscape and the related areas of fashion and trends for inspiration. The result is that on the surface the CDER series more resemble items of contemporary furniture than traditional robots. This is intended to facilitate a seamless transition into the home through aesthetic adaptation, there are however, subtle anomalies or alien features that are intended to draw the viewer in and encourage further investigation into the object.”

And on robots performing as “Companion Species”:

”In the home there are several established object categories each in some way justifying the products presence through the benefit or comfort they bring to the occupant, these include: utility; ornament; companionship; entertainment and combinations of the above, for example, pets can be entertaining and chairs can be ornamental. The simplest route for robots to enter the home would be to follow one of these existing paths but by necessity of definition, offering something above and beyond the products currently occupying those roles.”

James Auger is currently completing his PhD at the RCA on ‘Domestication of Robotics’ and I can’t wait to read it.

Week 333

It’s a drizzly day in London and I have cold forearms.
Alex, Jones and Jack are in Uinta workshops this week, so the office feels a bit empty and Jones’ iconic eyebrows are missing from my view across the desk.

This week Simon is shepherding, doing a bit of re-planning, pinging off emails and ushering the rest of us into the right places at the right time with his characteristic patience and charm.

Kari is still doing ‘the usual’, a lot of putting things into spreadsheets. This week she is also writing documentation for new financial admin procedures, which I can only hope is more exciting than it sounds.

Nick has his fingers and also some toes in many pies (dexterous feet) this week. He’s working with Joe on Uinta, with James, Phil, Andy and I on Weminuche, applying some polish to Suwappu, moving more google accounts from one place to another, and doing a bit of Schooloscope migration.

Denise is making some very beautiful things for Barry, which I can’t wait to see in the world.

Joe is working on Uinta, making some truly gorgeous looking animations, and swinging his arms around a lot.

James is working on Weminuche with Alex. Right now he is looking at something complicated in OmniGraffle and tapping his face thoughtfully.

I am also working with Denise on Barry. Taking pictures from dropbox and making them into real things.

Matt Webb is thinking about January, doing his regular catch ups with the team, financial stuff and meetings.

Andy is thinking about process and working on Barry. Something must be afoot because every time the doorbell goes he jumps out from Statham and runs to the door to collect whatever the postman has brought. What’s he building back there?

Timo is working with Jack on Chaco stuff. He is also pulling together a script for Uinta work and writing a proposal.

The rain has stopped, and Alex and Jones have just arrived back, laden with coffee and fun things for us all to look at.


Week 330

Fact of the week: 330 is the number of dimples in a British golf ball according to Wikipedia.

So having been on holiday all of last week, I’m only just catching up with what’s been going on in the studio and am still not quite sure I’ve sussed it all out. Since I’m the one that actually makes up the blog rota, though, I have only myself to blame for assigning myself to write weeknotes the week after I’ve been on holiday. Trust me, I won’t do that again.

When I asked Matt Webb what last week was like, his summary was, “There was drama.” Unsurprisingly, when much of your work is dependent on the whims and wishes and ever-changing timelines of clients, things can go a bit pear-shaped. As regular readers of this blog will know, however, BERG is not a company that has all its eggs in one basket, so one client throwing us for a loop doesn’t completely knock us off balance. Nevertheless, there has been drama and so we’re having to deal with that.

In terms of what various folks are up to this week…

Matt Jones, Jack and Alex (who celebrated his one-year anniversary at BERG today!) are doing work on Uinta in preparation for a presentation on Thursday. Matt & Jack are also spending lots of time doing general company planning and re-planning and thinking about sales. Alongside all that, Jack has his fingers in Barry design. He’s also looking at lots of documents.

Barry is also occupying Alex, Denise, Alice, James, Nick, Matt Webb and Andy. Besides following up with various partners & suppliers and chasing China for quotes, Andy is specifically doing some reflecting on the last year of Barry development. (Side note: Because of my role and the fact that I don’t work on Fridays – thus missing weekly studio demos – I only get to see very small slivers of the progress that’s been made on Barry. I do catch a whiff of the excitement and anticipation around it every now and then, though. And I can say with some confidence: it’s going to be pretty spectacular, people.)

Simon, Alice, James and Nick had a meeting at the pub yesterday to make some decisions around Barry and are now working to implement those. There are still a number of near-term decisions to be made, though. Since Nick, James and Alice are now sitting in the main room of the studio where I am (having previously been based in Statham next door), I’m overhearing lots more about the code and technology underlying Barry. Most of the time I have no idea what it means, but it’s rather fun to eavesdrop on anyway and to see them working to solve problems together.

Joe is mostly working on Uinta this week, fleshing out the system underneath the recently approved “look and feel”. He and Simon will be spending some time out of the studio this week meeting with partners who are doing some work on that project alongside us.

Simon is, as usual, skillfully balancing multiple projects and clients and partners – this week it’s mostly Chaco, Uinta, Barry and Suwappu with the occasional random bit of SVK and other stuff thrown in. It involves lots and lots of post-it notes. Which is causing me a bit of anxiety because he’s out this afternoon and the wind is blowing the post-it notes around and I have no idea what sort of system he’s organised them into and if you come back and all your carefully assembled post-it notes are out of order, Simon, I apologise. Blame our need for fresh air.

And as well as dealing with the drama stirred up last week, Matt Webb is planning, thinking about finances & sales and having lots of coffee with various people. I hope for his sake some of that is decaf.

Timo’s holiday has stretched into this week and we’re looking forward to welcoming him back tomorrow.

As for me, I’m working furiously to catch up with all the bookkeeping, replying to lots of general studio correspondence, booking travel, updating spreadsheets, doing customer service for SVK (it’s not too late to buy one!) and chasing non-responsive suppliers and overdue invoices. And eating cake. There seems to be lots of cake this week.

I hope that wherever you are, whatever you’re doing, there’s cake there too. Have a good week!

Week 322

It’s week 322 here at BERG, and I have been left in charge of weeknotes.

322 has a sort of interesting Wikipedia page; under the “Technology” header it says:

 “The first dependable representation of a horse rider with paired stirrups was found in China in a Jin Dynasty tomb.”

Looking at the page for the number 322, we find that unlike last week’s notably boring entry, 322 is actually quite good. The sort of number you would be pleased to find seated at your table at a wedding. Of course 322 would be too modest to tell you all at once, but as the Wikipedia entry points out:

 “322 is a sphenic, nontotient, untouchable, hard number. It is also seen as a Skull and Bones reference of power”

Keeping these facts in mind, what is everyone up to this week?

Alex and Matt Jones are planning Uinta. This involves conference calls and Alex saying “yeahhhh, brilliant” a lot. Alex is also continuing with Barry work and mending his busted shoulder. As Alex shares bits of Barry design for us all to ponder, I look around Statham, which is papered in drawings and work taking shape, and think how brilliant it is that I get to work here.

Denise is continuing with Barry, steering the project and working through the tiny details of how everything happens with James.

Along with talking through the difficult stuff with Denise, James is also planning the next Barry sprint with Simon, and continuing to code on Barry. James also has two new pairs of trousers, which I am very pleased to see.

This week Simon has his project managing fingers in many pies; Chaco, BBC Dimensions 1 and 2, Barry and continuing to roll along the SVK reprint.

Joe is on Suwappu phase 2, and working with Nick on making.

Jack is working on Suwappu and overseeing the continuing work on Barry.

Timo is writing proposals and working on Chaco sketching with Matt Jones.

Matthew Webb is touching many different projects, in the way he does. Guiding the direction of things at a very high level as well as getting down into the decisions about atoms that crop up. He is also doing ‘finances’ which I am unable to explain further, though I suspect it’s paperwork.

Kari is also on the finance admin, apparently the second week of the month is always finance admin heavy. She’s also doing the housework involved with the end of the financial year, which probably means more paperwork.

Nick is working on the technical side of Barry with James and me, as well as starting the technical tippy tappy on Suwappu 2, following Joe’s designs.

And that concludes my very first week notes. What say you, internet? week notes? or weak notes?

The Robot-Readable World


I gave a talk at Glug London last week, where I discussed something that’s been on my mind at least since 2007, when I last talked about it briefly at Interesting.

It is rearing its head in our work, and in work and writings by others – so thought I would give it another airing.

The talk at Glug London bounced through some of our work, and our collective obsession with Mary Poppins, so I’ll cut to the bit about the Robot-Readable World, and rather than try and reproduce the talk I’ll embed the images I showed that evening, but embellish and expand on what I was trying to point at.

Robot-Readable World is a pot to put things in, something that I first started putting things in back in 2007 or so.

At Interesting back then, I drew a parallel between the Apple Newton’s sophisticated, complicated hand-writing recognition and the Palm Pilot’s approach of getting humans to learn a new way to write, i.e. Graffiti.

The connection I was trying to make was that there is a deliberate design approach that makes use of the plasticity and adaptability of humans to meet computers (more than) half way.

Connecting this to computer vision and robotics I said something like:

“What if, instead of designing computers and robots that relate to what we can see, we meet them half-way – covering our environment with markers, codes and RFIDs, making a robot-readable world”
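The asymmetry behind that thought is easy to show: decoding a crisp, designed marker takes a few lines of code, while recognising an unmarked scene is a research problem. A toy illustration – the 3×3 ‘marker’ format here is entirely invented:

```python
# Decode an invented 3x3 black(1)/white(0) fiducial marker into a 9-bit ID.
# Real systems (QR codes, ARToolKit-style tags) add orientation detection
# and error correction, but the core idea is the same: the world is
# deliberately shaped to meet the robot half-way.

def decode_marker(grid):
    """Read a 3x3 grid of 0/1 cells as a 9-bit integer, row-major."""
    value = 0
    for row in grid:
        for cell in row:
            value = (value << 1) | (cell & 1)
    return value
```

So a marker with black corners top-left and bottom-right, `[[1,0,0],[0,0,0],[0,0,1]]`, reads as ID 257 – no scene understanding required.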

After that I ran a little session at FooCamp in 2009 called “Robot readable world (AR shouldn’t just be for humans)” which was a bit ill-defined and caught up in the early hype of augmented reality…

But the phrase and the thought has been nagging at me ever since.

I read Kevin Kelly’s “What technology wants” recently, and this quote popped out at me:

Three billion artificial eyes!

In zoologist Andrew Parker’s 2003 book “In the blink of an eye” he outlines ‘The Light Switch Theory’.

“The Cambrian explosion was triggered by the sudden evolution of vision in simple organisms… active predation became possible with the advent of vision, and prey species found themselves under extreme pressure to adapt in ways that would make them less likely to be spotted. New habitats opened as organisms were able to see their environment for the first time, and an enormous amount of specialization occurred as species differentiated.”

In this light (no pun intended) the “Robot-Readable World” imagines the evolutionary pressure of those three billion (and growing) linked, artificial eyes on our environment.

It imagines a new aesthetic born out of that pressure.

As I wrote in “Sensor-Vernacular”

[it is an aesthetic…] Of computer-vision, of 3d-printing; of optimised, algorithmic sensor sweeps and compression artefacts. Of LIDAR and laser-speckle. Of the gaze of another nature on ours. There’s something in the kinect-hacked photography of NYC’s subways that we’ve linked to here before, that smacks of the viewpoint of that other next nature, the robot-readable world. The fascination we have with how bees see flowers, revealing the animal link between senses and motives. That our environment is shared with things that see with motives we have intentionally or unintentionally programmed them with.

The things we are about to share our environment with are born themselves out of a domestication of inexpensive computation, the ‘Fractional AI’ and ‘Big Maths for trivial things’ that Matt Webb has spoken about this year (I’d recommend starting at his Do Lecture).

And, as he’s also said before – it is a plausible, purchasable near-future that can be read in the catalogues of discount retailers as well as the short stories of speculative fiction writers.

We’re in a present, after all, where a £100 point-and-shoot camera has the approximate empathic capabilities of an infant, recognising us and modifying its behaviour based on facial recognition.

And where the number one toy last Christmas was a computer-vision eye that can sense depth and movement, detect skeletons, and is a direct descendant of techniques and technologies used for surveillance and monitoring.

As Matt Webb pointed out on twitter last year:

Ten years of investment in security measures funded and inspired by the ‘War On Terror’ have led us to this point, but what has been left behind by that tide is domestic, cheap and hackable.

Kinect hacking has become officially endorsed and, to my mind, the hacks are more fun than the games that have been published for it.

Greg Borenstein, who scanned me with a Kinect at FooCamp, is at the moment writing a book for O’Reilly called ‘Making Things See’.

It is in some ways a companion to Tom Igoe’s “Making Things Talk”, a handbook on injecting behaviour into everyday things with Arduino and other hackable, programmable hardware.

“Making Things See” could be the beginning of a ‘light-switch’ moment for everyday things with behaviour hacked into them. For things with fractional AI and fractional agency – to be given a fractional sense of their environment.

Again, I wrote a little bit about that in “Sensor-Vernacular”, and the above image by James George & Alexander Porter still pins that feeling for me.

The way the world is fractured from a different viewpoint, a different set of senses from a new set of sensors.

Perhaps it’s the suspicious look from the fella with the moustache that nails it.

And it’s a thought that was with me while I wrote that post that I want to pick at.

The fascination we have with how bees see flowers, revealing the animal link between senses and motives. That our environment is shared with things that see with motives we have intentionally or unintentionally programmed them with.

Which leads me to Richard Dawkins.

Richard Dawkins talks about how we have evolved to live ‘in the middle’, and how our sensorium defines our relationship to this ‘Middle World’:

“What we see of the real world is not the unvarnished world but a model of the world, regulated and adjusted by sense data, but constructed so it’s useful for dealing with the real world.

The nature of the model depends on the kind of animal we are. A flying animal needs a different kind of model from a walking, climbing or swimming animal. A monkey’s brain must have software capable of simulating a three-dimensional world of branches and trunks. A mole’s software for constructing models of its world will be customized for underground use. A water strider’s brain doesn’t need 3D software at all, since it lives on the surface of the pond in an Edwin Abbott flatland.”

“Middle World — the range of sizes and speeds which we have evolved to feel intuitively comfortable with — is a bit like the narrow range of the electromagnetic spectrum that we see as light of various colours. We’re blind to all frequencies outside that, unless we use instruments to help us. Middle World is the narrow range of reality which we judge to be normal, as opposed to the queerness of the very small, the very large and the very fast.”

At the Glug London talk, I showed a short clip of Dawkins’ 1991 RI Christmas Lecture “The Ultraviolet Garden”. The bit we’re interested in starts about 8 minutes in – but the whole thing is great.

In that bit he talks about how flowers have evolved to become attractive to bees, hummingbirds and humans – all occupying separate sensory worlds…

Which leads me back to…

What’s evolving to become ‘attractive’ and meaningful to both robot and human eyes?

Also – as Dawkins points out

The nature of the model depends on the kind of animal we are.

That is, to say ‘robot eyes’ is like saying ‘animal eyes’ – the breadth of speciation in the fourth kingdom will lead to a huge breadth of sensory worlds to design within.

One might look for signs in the world of motion-capture special effects – where Zoe Saldana’s chromakey acne and high-viz dreadlocks, which transform her into an alien giantess in Avatar, could morph into fashion statements alongside Beyoncé’s chromasocks…

Or Takashi Murakami’s illustrative QR codes for Louis Vuitton.

That such a bluntly digital format as a QR code can be appropriated by a luxury brand like LV is notable by itself.

Since the talk at Glug London, Timo found a lovely piece of work featured on BLDGBLOG by Diego Trujillo-Pisanty, a student on the Design Interactions course at the RCA that I sometimes teach on.

Diego’s project “With Robots” imagines a domestic scene where objects, furniture and the general environment have been modified for robot senses and affordances.

Another recent RCA project, this time from the Design Products course, looks at fashion in a robot-readable world.

Thorunn Arnadottir’s QR-code beaded dresses and sunglasses imagine a scenario where pop stars inject payloads of their own marketing messages into the photographs taken by paparazzi via readable codes – turning the parasites into hosts.

But, such overt signalling to distinct and separate senses of human and robots is perhaps too clean-cut an approach.

Computer vision is a deep, dark specialism with strange opportunities and constraints. The signals that we design towards robots might be both simpler and more sophisticated than QR codes or other 2d barcodes.

Timo has pointed us towards Maya Lotan’s work from Ivrea back in 2005. He neatly frames what may be the near-future of the Robot-Readable World:

Those QR ‘illustrations’ are gaining attention because they are novel. They are cheap, early and ugly computer-readable illustration, one side of an evolutionary pressure towards a robot-readable world. In the other direction, images of paintings, faces, book covers and buildings are becoming ‘known’ through the internet and huge databases. Somewhere they may meet in the middle, and we may have beautiful hybrids such as

In our own work with Dentsu – the Suwappu characters are being designed to be attractive and cute to humans and meaningful to computer vision.

Their bodies are being deliberately gauged to register with a computer vision application, so that they can interact with imaginary storylines and environments generated by the smartphone.

Back to Dawkins.

Living in the middle means that our limited human sensoriums and robots’ specialised, superhuman senses will overlap, combine and contrast.

Wavelengths we can’t see can be overlaid on those we can – creating messages for both of us.

SVK wasn’t created for robots to read, but it shows how UV wavelengths might be used to create an alternate hidden layer to be read by eyes that see the world in a wider range of wavelengths.

Timo and Jack call this “Antiflage” – a made-up word for something we’re just starting to play with.

It is the opposite of camouflage: markings and shapes that attract and beguile robot eyes that see differently to us – just as Dawkins describes the strategies that flowers and plants have built up over evolutionary time to attract and beguile bees and hummingbirds, existing in a layer of reality complementary to the one that we humans sense and are beguiled by.

And I guess that’s the recurring theme here – that these layers might not be hidden from us just by dint of their encoding, but by the fact that we don’t have the senses to detect them without technological enhancement.

I say a recurring theme as it’s at the core of the Immaterials work that Jack and Timo did with RFID – looking to bring these phenomena into our “Middle World” as materials to design with.

And while I present this as a phenomenon, and dramatise it a little into being an emergent ‘force of nature’, let’s be clear that it is a phenomenon to design for, and with. It’s something we will invent, within the frame of the cultural and technical pressures that force design to evolve.

That was the message I was trying to get across at Glug: we’re the ones making the robots, shaping their senses, and the objects and environments they relate to.

Hence we make a robot-readable world.

I closed my talk with this quote from my friend Chris Heathcote, which I thought goes to the heart of this responsibility.

There’s a whiff in the air that it’s not as far off as we might think.

The Robot-Readable World is pre-Cambrian at the moment, but perhaps in a blink of an eye it will be all around us.

This thought is a shared one – that has emerged from conversations with Matt Webb, Jack, Timo, and Nick in the studio – and Kevin Slavin (watch his recent, brilliant TED talk if you haven’t already), Noam Toran, James Auger, Ben Cerveny, Matt Biddulph, Greg Borenstein, James George, Tom Igoe, Kevin Grennan, Natalie Jeremijenko, Russell Davies, James Bridle (who will be giving a talk this October with the title ‘Robot-readable world’ and will no doubt take it to further and wilder places far more eloquently than I ever could), Tom Armitage and many others over the last few years.

If you’re tracking the Robot-Readable World too, let me know in comments here – or the hashtag #robotreadableworld.

Week 321

It’s week 321.

As is the custom (at least when I can find it in time), we begin our weekly all-hands meeting with a burst of the theme to Battle Of The Planets and a fact from wikipedia about the week number.

It turns out that the entry about 321 is extremely boring, and no-one apart from me remembers Ted Rogers. So we’ll go straight into what’s happening this week.

Alex is back in the studio after an exciting incident involving his bike, his shoulder, a road and one of London’s characteristically-careful and considerate drivers. Good to have him back and on the mend. He’s working this week on Barry, and his monitor is full of incredibly-detailed, pixel-perfect illustrations and type. It’s looking lovely.

Denise is also working on Barry all week, but more on the service and product design aspects as well as the overall direction of the thing. She’s been working with the printers on the prep for the next run of SVK – so if you missed out first time round, make sure to sign up for news about the next print run!

Joe’s working on Suwappu phase 2, concentrating on UI and visual design, collaborating with Nick on the build.

Nick’s planning out some of the Suwappu tech architecture, ramping up on a Uinta project in terms of research and prep, doing more architecture work on Barry, and also descending into some embedded code darkness.

James is deep in Schooloscope tweaks and data-munging. He’s also doing Barry spec-writing and development, integrating his work with Alice’s and turning the IA into code, refactoring as he goes. Busy fella.

Alice is stuck into Barry work with James and Alex. Barry’s been cracking along as a result, with lots of progress and intermittent whoops from the team.

Schulze is mainly writing with Timo this week – on Chaco and some other projects. Timo, back from holiday, meanwhile is planning for Chaco and Uinta outputs, doing some treatments for scripts for a couple of other things that are bubbling away. He’s also switched on his scholarly mind-modules – doing research for an article he’s writing which I’m quite excited about.

Kari’s doing a bit of SVK customer service, writing some documentation and pursuing year-end company financials stuff with MW.

Simon’s negotiating a lot of complexity this week. There’s SVK reprint planning, Uinta workshop planning, Chaco planning, Schooloscope and Suwappu project management, writing user-stories for Barry, finalising phase two of Dimensions and generally keeping us all honest and pointing in the right direction. At this rate, his trip to Burning Man at the end of the month is going to seem like a walk in the park to him.

Matt Webb wasn’t at all hands as he was out chasing down an exciting lead that might resurrect some old inventions… But he sent a telegram to be read out, framing the week for him as “Meetings-y”. He’s going to be on top of Schooloscope and company financials apart from that.

For me, this is a bit of a pause-for-breath week. I’m prepping for some new projects for Uinta, and ongoing work for Chaco – but also hoping to be doing a bit of thinking and writing here on the blog – while my giant metaphorical plastic sit-in log slowly ratchets up the incline, before the next steep drop of the future-flume we call BERG…




‘Talk to Me’ at MoMA

Talk to Me, MoMA’s new exhibition about design and the communication between people and objects opened this week at the Museum of Modern Art in NYC. We’re very proud to be a part of a show that pulls together so many potent strands of contemporary design:

New branches of design practice have emerged in the past decades that combine design’s old-fashioned preoccupations—with form, function, and meaning—with a focus on the exchange of information and even emotion. Talk to Me explores this new terrain, featuring a variety of designs that enhance communicative possibilities and embody a new balance between technology and people, bringing technological breakthroughs up or down to a comfortable, understandable human scale.

There is an enormous amount of work that we value and admire across the exhibition. A range of games from the experimental Passage, Chromaroma and Sharkrunners to Little Big Planet, SimCity and Spore. It’s great to see Usman Haque’s Pachube alongside other sensor networks and platforms such as Homesense and Botanicalls.

There are physical interactive products such as David Rose’s ever-impressive connected medicine container Glowcaps, the exquisitely crafted musical interfaces Monome and Tenori-on, the empowering iOS payment interface Square and the characterful and playful Tengu, alongside popular apps like Talking Carl and Wordlens.

There’s a wide range of mapping work, from the early and potent They Rule to Prettymaps, Legible London, Ushahidi and Walking Papers. And there is plenty of work that defies classification such as Camille Scherrer’s The Haunted Book, Kacie Kinzer’s wonderfully simple and affective social Tweenbots and Keiichi Matsuda’s Augmented (hyper) Reality.

BERG has seven works in the show. The bendy maps Here and There, the interactive exploration of scale BBC Dimensions, the films made with the Touch project exploring the qualities of touch and RFID: Nearness and Immaterials: Ghost in the Field, our collaborations with Dentsu London on Media Surfaces: Incidental Media, The Journey and the augmented toys Suwappu.

For such a broad exhibition it is great to see all of the works curated and presented with such thought and attention to the quality of each piece.

The exhibition takes place in the MoMA Special Exhibitions Gallery, from 24 July until 7 November 2011. Thanks to Paola Antonelli and the Talk to Me team for the excellent and patient work in putting this all together.

Week 320

Week 320, and there are almost as many people in the studio as there were when I started here, back in week two hundred and seventy something. Most people are out of the office for some reason or another, which makes tea rounds a lot easier than usual. It’s gonna be a slightly empty office next week too, as Jack, Matt Jones, Simon and I are off to the US for a few days of workshops. We’re currently listening to Hot Sauce Committee Part Two. I’ve been making people listen to David Rodigan’s Thursday night sets on Radio 2 as well.

I’ll start with the people not here. Jack & Matt Jones are currently in the US presenting something a great deal of the team have been working on for the last few months. I’m hoping Matt Webb is currently having a holiday and not working too much. Timo’s working out of the office this week, after going to the opening of Talk to Me at MoMA with Jack & Jones, where a few of our recent projects are being exhibited, and working on Chaco-related bits.

Joe is off today, but has been working on some Chaco-related stuff, and is now working on a document trying to define our design process as a company – speaking to Denise and me to try to solidify our ways of attacking design challenges going forward. It’s a hard thing to put on paper, but it will be fantastic to see progress, and it’s also an essential thing to have sorted as we continue to grow as a company.

The back room (more commonly known as Statham) currently contains the mighty brains of Andy, Nick, Alice & James Darling – concentrating mostly on all things Weminuche. Andy did a bit of office tidying at the end of last week after a crazy few days. Denise is working with James on some IA and I’m working with Alice on some graphical elements. James is also doing some work on Schooloscope. I’ve been talking to Nick about cars in between his work on Weminuche & Chaco.

The mighty Simon Pearson is doing his usual brilliant job of herding our flocks of projects and making them work properly. He’s also been working with Kari running the customer service for SVK. He’s on a lot of conference calls trying to make things happen – and working with Denise on Suwappu. Kari’s only in for a few days this week, but keeping the office running like a well-oiled machine as usual.

That’s it – I’m keeping weeknotes short this week. Super busy as usual but strangely quiet, at the same time.

Week 319

So, it’s week 319. I’m pretty sure you know this already but just in case, 319 is a Smith Number, and perhaps more importantly, according to Wikipedia, the name of a song by Prince that cannot be found on Spotify.

The office is as busy as ever this week but, dare I say it, slightly quieter in terms of volume. Completely coincidentally, I’m sure, Matt Jones and Jack are in New York this week, taking workshops and visiting clients. They’ve sent word back home via a Google Hangout – which sounds like it was a pain to set up, but once it got going felt like “quite a nice informal way to video chat”. They’re getting a lot done over there, but we also have a sneaking suspicion they could be having too much fun.

It feels like a very collaborative week. We’ve just had an ‘all hands meeting’, and several people mentioned ‘being a sounding board for…’, which is one of the nice things about working here. It’s easy to ask opinions of others – and everyone is interested in everyone else’s ‘stuff’. There’s a lot of respect for the knowledge others have, even if the boundaries of people’s particular work disciplines are blurred.

In literal terms this week, the project code named ‘Chaco’ is taking up time from Joe, Simon, Andy and Nick. Each person is playing a very different role.

Weminuche is occupying the minds of Alex and Alice, James and myself, with a bit of extra time from Andy and Nick. Alex and Alice are working closely together on APIs and design and I’m working on IA with James, who very patiently listens to my latest master plan and either agrees or pokes my ideas with a big stick to see if and when they fall apart.

James is also working on Schooloscope, with some help from Nick. As well as Chaco, Simon is also working on some SVK customer service, and planning for new Suwappu and Dimensions stages.

Matt Webb is trying to go on holiday, but has managed to book himself into a hotel three blocks from the location of a client meeting with Jack and Matt J. We’ll see how well he manages to avoid them. If you’re in NY and spot him on the street, for heaven’s sake don’t mention work.

Week 318

It’s a little difficult to work out exactly what’s going on in Week 318 because THERE IS SO MUCH. (“Let me explain. No, there is too much. Let me sum up.” It feels like that.)

Projects that are on the go or bubbling up again this week include:
Chaco x3
Barry / Weminuche
Suwappu (with Dentsu London)
Here and Then

All Hands on Tuesday morning was a little head spinning as fourteen different people reeled off all the things they were working on this week. Most folks have their hands in more than one project at a time. Simon and Matt Webb, because of the nature of their jobs, are each trying to pay attention to at least five different projects over the course of the week if not all in one day. Lord bless ’em.

A major project on the boards this week is a Chaco presentation which Matt Jones and Jack will be bringing to New York at the end of the week. Alex and Timo are both contributing their respective talents to that along with Matt and Jack.

In addition to that, Timo is working on creating films about the different Chaco projects. Every now and then he points his camera at something for a while, moves some things around and points his camera again. And then goes back to his computer to make it into magic. Joe is helping out by contributing animations.

There’s plenty of ongoing work on the various Chaco projects. Nick is tweaking software so that it can be shipped and just work. (Seems like a worthy use of time to me: I like it when things just work.) Andy and Simon are both doing a lot of liaising with our external collaborators, some of whom are literally on the other side of the world.

Now that Shuush is in the world and getting some attention, Alice is working on making some tweaks to that. Most of her time is being spent on Barry, though, so she has temporarily relocated to Statham 2 which could also be called The Barry War Room. Also working on Barry / Weminuche this week are Alex, Denise, James, Nick and Andy.

Tom and Alex are pushing Dimensions closer and closer to a deliverable thing. Tom has swapped places with Alice for the week and it’s been very nice to have him in the main room. He’s a lot more talkative than I thought.

And the Suwappu project with Dentsu (carrying on from this) is kicking off this week. That’s another project where we have third party collaborators. Just keeping track of all the external collaborators is a job in itself around here – mostly down to Simon.

Schooloscope has been a tad neglected of late due to available hands to work on it, but it’s getting a bit of a polish this week thanks to James.

And now that SVK is in the world and, for the last week, has been landing in customers’ hands, we’ve moved on to the Customer Support phase – which is how I’ve been spending most of my week, with lots of help from Simon and Matt Webb. (They are angels, really.) On the one hand, it’s frustrating that there are glitches and things need remedying, but on the other hand, most requests for customer service are accompanied by exclamations of delight at the comic itself, the packaging, the overall product, etc. It’s very gratifying to hear from so many people who really, really like a thing we did. (Note: if you didn’t manage to grab a copy during the 48 hours that it was on sale before selling out, add your email address to be notified when the second printing becomes available!)

Phew! I’m sure there’s stuff that I missed, but I think that’s probably an adequate summary.

It’s Thursday morning and it’s actually kind of sunny outside and Matt Jones is playing Django Reinhardt on the studio stereo. Happy Bastille Day!

Week 304

In another file on my laptop I’ve got the notes I was intending to write here. They’re all about how to structure risk in projects in order to leave the maximum room for unintended invention.

Whatever. I’m in no state to finish them. I’ve been eight timezones away for the past week with Matt J, Jack and Timo (who has joined us as a creative director!), I just flew back in, and my head feels like it’s full of bees.

I don’t know if you’ve ever played the game Canabalt. You should, it’s fun. That’s what the week has been like. Wake up early, read new resumes for the project manager position and decide whether to interview or not. Up, shower, downstairs. Drink coffee, respond to latest changes in a contract being negotiated, write a response to a media request, arrange meetings for next week, finalise another contract. Finish coffee, off to workshops, back, catch up on emails from the day, out for dinner, bed.

While we were on the road, Timo and Jack launched Suwappu with Dentsu London, our first gig as official “consultant inventors.” And toys too! Sweet.

So with so many people away it was quiet in the studio. Nick tells me it was industrious, and that’s the truth. I spent my first hour back in today being shown what’s been going on. Ads for the comic, testing the online shop and fulfilment, continuing mini breakthroughs and prototyping in our own new product development, etc. Lots!

It’s very sunny here too. A good Spring day in London.

I’ll see more at 4pm at Friday demos. Between that and Tuesday All Hands, there’s a nice rhythm to the week.

I’m not sure when we’ll all be back in the studio at the same time again. I think Jack may be off to New York next week, to kick off an engagement that will see us through the next six months or so. But we also have someone new starting on Monday, and then these two positions we’re just starting interviews for, and then, gosh, we’re out of desks. I don’t really want to move, but it does mean, on top of everything else going on, I’m now also looking around for new premises. Somewhere in Shoreditch with a bit of character, a bit of room to grow, a quiet room, a meeting room, and a space to run workshops and make films. I’m getting on the estate agent train. Do let me know if you have any ideas or serendipitous opportunities.

I’m looking forward to travel pausing for a bit, and having everyone back in the same room. There have been lots of changes recently, and the Room – which in my head I’ve started capitalising, Room not room – is nothing if not a culture – a particular stance to design and the world, and shared values – a way to work which is beautiful, popular and inventive – and a network of people in which ideas transmit, roll round and mutate, and come back in new forms and hit you in the back of the head. The Room is what it’s all about. It’s a broth that requires more investment than we’ve been giving it recently. So, yeah, that.

It’s the end of week 304 folks.
