

Thanks Steve.

We probably wouldn’t be doing what we do if Steve Jobs hadn’t done what he did.

To mark his passing this week, I asked the studio for their stories of first contact with Apple.

Kari:

In 1983 when I was 10 years old, my school got about 10 shiny new Apple IIe computers and decided that we should learn a little basic programming. They divided us into two groups. Half of the class learned Basic, and the other half – the group that I was in – learned Logo. I remember feeling like we were pretty important and very cool for learning how to do computer programming. And I loved that I could change a few letters and numbers and make the turtle do different things. It gave me a great sense of autonomy.

Alex:

The thing about buying an Apple product is that you’re sold an experience. It’s not all about the industrial design of the product, or the UI, or anything else, it’s the Russian doll effect of unpeeling layers of a sealed box and feeling like you’ve bought something really special, which is something I’d never really experienced with a consumer product until I bought my first iBook at uni. If you can make people smile before they’ve even lifted the product out of the box, I think you’re almost halfway there.

Denise:

My first introduction to the world of computing was at home. We had a Commodore PET and graduated to the BBC Micro. I have only one sibling, so aside from a few fights over who got to play Space Invaders next, we were lucky enough to be able to use a computer when we wanted to. It was never a big deal; there was no fear involved for six-year-old me.

Two things changed this. First, the BBC moved from the front room to my father’s study. Home computing was work computing, not a plaything. Second, we started ‘doing computers’ at school. This meant looking over the shoulders of 30 other kids at the one, perhaps two, computers available. It meant a fight to get to the front—a fight I wasn’t that fussed about joining. (I’d already seen a computer anyway).

And so I didn’t touch a computer again until university in the early 90s, by which time I’d learnt to fear them. We weren’t trained in ‘desktop publishing’ on my degree course. Yet again, scheduling educational computer time involved a bit of a fight, and so I squared up to a Mac at last, and wondered how the hell I was going to get the file I’d just made off the desktop and on to a disk.

After a split second’s thought, I picked up the picture of the file and put it on the picture of a disk. Seemed obvious. Was obvious. Worked. Turned out that using computers was really easy.

Better than that, it meant that when I said yes to my first design job as a penniless graduate, and was handed a magazine to design and a deadline for the end of the week, I had a tool to use. Not quite as easy as using a pencil, but not so far off.

At the end of the month, I got paid. Thanks Steve, for feeding me.

Joe:

One of my first encounters with a Mac was back in 1996 during a High School ‘Design & Technology’ lesson. I think it was a Macintosh LC 580. Anyway, it was reasonably new but already grubby with the greasy fingerprints of overzealous teenagers. I turned it on and the screen flickered with light. A blinking apparition of a disk appeared alongside a question mark. I remember taking this to mean that it had crashed so I reached back, opened my hand and struck the monitor really, really hard. The next thing I remember was a jolt of shock as the class teacher screamed at me from across the room. I got sent out. Apparently the question mark was normal.

The second and more favourable memory was the first time I saw pictures of the first iPhone on the internet. I remember looking at it and thinking that I was seeing the device that featured in so many of my sci-fi fuelled childhood dreams. A screen that you could hold in your palm. That would show you videos. That would let you communicate with your friends. Find your way around. Somehow Apple had made real something that I could only conjure up in my imagination and it felt magic to be part of the generation that got to see it happen.

Simon:

1994, using a Macintosh II in IT at secondary school. My only prior exposure to computers had been Apricot and Amstrad (we had a CPC6128 with a colour screen). I was amazed by the tiny size and loading speed. No tapes! It was the first computer I’d ever used with more than one word processing font.

Matthew:

The first Mac I owned was an LC475, and that pizza box unit is still at my mum’s house, the insides chewed away by mice, which is what happens if you leave computers hanging around in houses in the countryside. I loved that computer: I made fanzines and I made art. I connected to my first BBS, through a 2400 baud modem I bought from a classified ad at the back of a magazine. When I connected, that very first time, when I saw the future and everything changed, pivoted on the spot and pointed towards a much larger, very different future, the stereo was playing Pink Floyd’s A Momentary Lapse of Reason. The standout single from that album is called “Learning to Fly.”

The first Mac I saw was at the house of my dad’s friend, a man named Dave, and he had one of those all-in-one Macs, and I was very young. This was a long time before the LC475 that I had, and the strong memory of seeing it was the reason I would, when I was older, get that LC475. I was amazed at three things: the GUI, which looked like pen-and-ink draftsmanship; that the power button was on the keyboard instead of being on the back of the box, and that the keyboard was a separate, independent thing; and that there was no computer: there was just the screen, the thing that you used. I couldn’t believe it. I looked at it for a long time.

Jack:

I don’t really care what anyone other than Steve does.

Timo:

‘Bloody Steve’ we used to shout, as Finder windows lost their positions, as CDs were teased with paperclips or the screens of Titanium laptops fell off. Schulze & I have lambasted Jobs over the years for faults in OSes and problems with Apple hardware, but that was before we realised how hard it is to do even basic hardware and software properly at scale. My respect for him grew enormously as Apple moved from computing and swallowed up ever more industries that I cared about. Perhaps the most significant thing I have learnt from Jobs is that his products were absolutely his politics.

My story:

It was 1986, I was 14, and working at Harris Printers in my hometown in South Wales after school every day and most Saturdays. I cleaned, collated, folded, packed print. Sometimes I got to make litho plates on the big Agfa camera, and sometimes I got to use the ancient treadle-powered letterpress. I sometimes got to do layout with Letraset and typesetting galleys from an IBM golfball printer. I convinced the owner that something called DeeTeePee was the next big thing, and we should buy a Mac SE/30, QuarkXPress and Adobe Illustrator 1.0. I think with the Radius display and LaserWriter Plus it must have come to about £20k or so. It was an incredible machine. What you saw was what you did was what you got. You moved things on a screen that seemed real, not abstractions. My only computer experiences, like most kids’ till then, had been a BBC Model B or a VIC-20: abstract and arcane. This was something that everyone, in the printers, in my family, my school friends – everyone – could see was different. It was when I first felt I wouldn’t have to choose between technology and art.

Thanks Steve.

Suwappu app prototype – toys, stories, and augmented reality

You may remember Suwappu, our toy invention project with Dentsu — those woodland creatures that talk to one another when you watch them through your phone camera. You can see the film – the design concept – here or (and now I’m showing off) in New York at MoMA, in the exhibition Talk to Me.

Here’s the next stage, a sneak peek at the internal app prototype:

Direct link: Suwappu app prototype video, on Vimeo.

It’s an iPhone app which is a window to Suwappu, where you can see Deer and Badger talk as you play with them.

Behind the scenes, there’s some neat technology here. The camera recognises Deer and Badger just from what they look like — it’s a robot-readable world but there’s not a QR Code in sight. The camera picks up on the designs of the faces of the Suwappu creatures. Technically this is markerless augmented reality — it’s cutting-edge computer vision.
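
To give a flavour of how appearance-based recognition like this can work – a generic sketch only, not Zappar’s actual technology, and the reference image filenames and thresholds below are invented – here is roughly how a toy’s face might be matched against a camera frame using ordinary feature matching:

```python
# A minimal sketch of markerless recognition: match a camera frame against
# reference photos of each character's face using ORB features in OpenCV.
# Illustration only; a real AR system would also estimate the toy's 3D pose
# so the virtual environment can be drawn around and behind it.
import cv2

MIN_GOOD_MATCHES = 25  # below this we say "toy not seen" (arbitrary threshold)

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Reference photos of each character's face, taken once in advance.
references = {}
for name in ("deer", "badger"):
    image = cv2.imread(f"{name}_face.jpg", cv2.IMREAD_GRAYSCALE)
    _, descriptors = orb.detectAndCompute(image, None)
    references[name] = descriptors

def recognise(frame):
    """Return the toy most likely visible in a BGR camera frame, or None."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, frame_descriptors = orb.detectAndCompute(grey, None)
    if frame_descriptors is None:
        return None
    best_name, best_score = None, 0
    for name, ref_descriptors in references.items():
        matches = matcher.match(ref_descriptors, frame_descriptors)
        good = [m for m in matches if m.distance < 40]  # keep close matches
        if len(good) > best_score:
            best_name, best_score = name, len(good)
    return best_name if best_score >= MIN_GOOD_MATCHES else None
```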

Suwappu-20111006-008

And what’s also neat is that the augmented reality is all in 3D: you suddenly see Deer as inside a new environment, one that moves around and behind the toy as you move the phone around. It’s all tabletop too, which is nicely personal. The tabletop is a fascinating place for user interfaces, alongside the room-side interfaces of Xbox Kinects and Nintendo Wiis, the intimate scale of mobiles, and the close desktop of the PC. Tabletop augmented reality is play-scale!

But what tickles us all most about Suwappu is the story-telling.

Seeing the two characters chatting, and referencing a just-out-of-camera event, is so provocative. It makes me wonder what could be done with this story-telling. Could there be a new story every week, some kind of drama occurring between the toys? Or maybe Badger gets to know you, and you interact on Facebook too. How about one day Deer mentions a new character, and a couple of weeks later you see it pop up on TV or in the shops?

The system that it would all require is intriguing: what does a script look like, when you’re authoring a story for five or six woodland creatures, and one or two human kids who are part of the action? How do we deliver the story to the phone? What stories work best? This app scratches the surface of that, and I know these are the avenues the folks at Dentsu are looking forward to exploring in the future. It feels like inventing a new media channel.

Suwappu is magical because it’s so alive, and it fizzes with promise. Back in the 1980s, I played with Transformers toys, and in my imagination I thought about the stories in the Transformers TV cartoon. And when I watched the cartoon, I was all the more engaged for having had the actual Transformers toys in my hands. With Suwappu, the stories and the toys are happening in the same place at the same time, right in my hands and right in front of you.

Here are some more pics.

Suwappu-20111006-001

The app icon.

Suwappu-20111006-002

Starting the tech demo. You can switch between English and Japanese.

Suwappu-20111006-004

Badger saying “Did I make another fire?” (Badger has poor control over his laser eyes!)

Suwappu-20111006-009

Deer retweeting Badger, and adding “Oh dear.” I love the gentle way the characters interact.

You can’t download the iPhone app — this is an internal-only prototype for Dentsu to test the experience and test the technology. We’re grateful to them for being so open, and for creating and sharing Suwappu.

Thanks to all our friends at Dentsu (the original introduction has detailed credits), the team here at BERG, and thanks especially to Zappar, whose technology and smarts in augmented reality and computer vision have brought Suwappu to life.

Read more about the Suwappu app prototype on Dentsu London’s blog, which also discusses some future commercial directions for Suwappu.

BERG at PopTech and MoMA in October

We’re really pleased to announce that we’re participating in a couple of incredible events in the next few weeks.

Schulze and I will be at the MoMA “Talk To Me” symposium on the 19th October in NYC, alongside an array of intimidating design superbrains like Slavin, Kati London, Revital Cohen, Natalie Jeremijenko and Bjarke Ingels.

MoMA 'Talk to Me' Exhibition opening

We’ll be giving a short talk and participating in a panel discussion about “Translating Worlds”.

We imagine it will not be about learning Klingon or Na’vi, but who knows.

Can’t wait.

Later that same week, I’m going to be speaking at PopTech in Camden, Maine.

This is incredibly exciting to me as I’ve long admired and wanted to attend this particular event – the diversity of speakers and subject matter that is critically and pragmatically addressed has always been top-notch.

And they have, let’s face it, a cracking logo.

Former colleagues from Nokia and the RCA Design Interactions course – Jan Chipchase and Daisy Ginsberg – are also on the speaker roster, and I believe I’m going to be in a session on the ‘Future of UI’…

Week 330

Fact of the week: 330 is the number of dimples in a British golf ball according to Wikipedia.

So having been on holiday all of last week, I’m only just catching up with what’s been going on in the studio and am still not quite sure I’ve sussed it all out. Since I’m the one that actually makes up the blog rota, though, I have only myself to blame for assigning myself to write weeknotes the week after I’ve been on holiday. Trust me, I won’t do that again.

When I asked Matt Webb what last week was like, his summary was, “There was drama.” Unsurprisingly, when much of your work is dependent on the whims and wishes and ever-changing timelines of clients, things can go a bit pear-shaped. As regular readers of this blog will know, however, BERG is not a company that has all its eggs in one basket, so one client throwing us for a loop doesn’t completely knock us off balance. Nevertheless, there has been drama and so we’re having to deal with that.

In terms of what various folks are up to this week…

Matt Jones, Jack and Alex (who celebrated his one-year anniversary at BERG today!) are doing work on Uinta in preparation for a presentation on Thursday. Matt & Jack are also spending lots of time doing general company planning and re-planning and thinking about sales. Alongside all that, Jack has his fingers in Barry design. He’s also looking at lots of documents.

Barry is also occupying Alex, Denise, Alice, James, Nick, Matt Webb and Andy. Besides following up with various partners & suppliers and chasing China for quotes, Andy is specifically doing some reflecting on the last year of Barry development. (Side note: Because of my role and the fact that I don’t work on Fridays – thus missing weekly studio demos – I only get to see very small slivers of the progress that’s been made on Barry. I do catch a whiff of the excitement and anticipation around it every now and then, though. And I can say with some confidence: it’s going to be pretty spectacular, people.)

Simon, Alice, James and Nick had a meeting at the pub yesterday to make some decisions around Barry and are now working to implement those. There are still a number of near-term decisions that need to be made, though. Since Nick, James and Alice are now sitting in the main room of the studio where I am (having previously been based in Statham next door), I’m overhearing lots more about the code and technology underlying Barry. Most of the time I have no idea what it means, but it’s rather fun to eavesdrop on anyway and to see them working to solve problems together.

Joe is mostly working on Uinta this week, fleshing out the system underneath the recently approved “look and feel”. He and Simon will be spending some time out of the studio this week meeting with partners who are doing some work on that project alongside us.

Simon is, as usual, skillfully balancing multiple projects and clients and partners – this week it’s mostly Chaco, Uinta, Barry and Suwappu with the occasional random bit of SVK and other stuff thrown in. It involves lots and lots of post-it notes. Which is causing me a bit of anxiety because he’s out this afternoon and the wind is blowing the post-it notes around and I have no idea what sort of system he’s organised them into and if you come back and all your carefully assembled post-it notes are out of order, Simon, I apologise. Blame our need for fresh air.

And as well as dealing with the drama stirred up last week, Matt Webb is planning, thinking about finances & sales and having lots of coffee with various people. I hope for his sake some of that is decaf.

Timo’s holiday has stretched into this week and we’re looking forward to welcoming him back tomorrow.

As for me, I’m working furiously to catch up with all the bookkeeping, replying to lots of general studio correspondence, booking travel, updating spreadsheets, doing customer service for SVK (it’s not too late to buy one!) and chasing non-responsive suppliers and overdue invoices. And eating cake. There seems to be lots of cake this week.

I hope that wherever you are, whatever you’re doing, there’s cake there too. Have a good week!

Friday links: first-person music videos, biological lightpainting and synthesis…

Alex shared this music video by Biting Elbows. Imagine what would happen if The Office met Peep Show and Doom. The first-person perspective makes this really engaging.

Matt W shared another first-person video from Cinnamon Chasers. Dark and compelling:

Jones shared this beautiful biological lightpainting:

Nick shared Kevin Karsch‘s work on inserting synthetic objects into still images.

Jones also shared Slate’s Robottke experiment. How easily could you be replaced by a robot?

And finally, though you’ve probably seen it already, this is some quantised dubstep dancing that Matt Webb sent round early in the week. If this could be synthesised, it’d make a great music visualisation.

Happy weekends, all (and an especially toasty one if you happen to be in the UK).

Week 329

Following a super-busy week last week, we’re barely pausing for breath in the studio.

Alice has her headphones in, whizzing up some JavaScript which will eventually be part of Dimensions 1. Alex, Matt Jones, Jack and Joe are busy sketching for Uinta, and all but Joe will be out of the studio having workshops with them next week. Joe and Jack are also continuing to hone the latest Chaco work.

Nick has skilfully managed to migrate all of us (and our various email setup preferences) to Google for Domains. It’s a fiddly and time-consuming task, but switching all our email and calendars is already making all our lives easier – especially mine. Being able to automatically see calendars reliably makes juggling the commitments of this group of busy folk that much easier.

James, Alice, Andy, Nick, Jack and Matt W are all working on various parts of Weminuche and Barry. I’m also paying close attention to these projects as we focus on what we need to talk about and deliver imminently. Andy’s making and requesting quotes, Jack is poking and pondering, James is refactoring, Alice is rendering, Matt W is communicating and numerating, and Nick is adminning. I think I just invented a new word.

Like many companies our size, we are more than the sum of permanent folk here in the studio. We work with a burgeoning group of occasional co-conspirators, and at this precise moment we’re working with a great number. A lot of my time is spent planning the work we do with our partners, involving the right people, putting in place the necessary documents. There’s a lot of that going on this week as we kick off another raft of making, primarily for Chaco and Uinta.

The latest project with Dentsu London is wrapping up this week, and Matt will be writing about that shortly.

Our new overspill studio is set up and is a hive of productivity. Now we have extra space, it’s easier for us to think through making. We can set up our prototypes and experiments permanently so we can revisit, tweak, tinker and revise without having to pack down and set up each time. It’s a very good thing.

Timo, Kari and Denise are all away having a rest. In the meantime their desks have inevitably been occupied by people and things.

The design behind How many really

How big really is now just over a year old, released just before I started work at BERG, and I still find myself totally engaged with the simplicity of the concept. It’s a solid, easy to digest punch of information that translates unknown quantities into something instantly recognisable. How many really is the second part of the experiment, and I was tasked with working on the design. This is a little write up of the design process.

We started off from a workshop Webb & Jones had run with the BBC to kick off the initial concept of examining quantity. James Darling, Matt Brown and I spent a week whiteboarding, sketching and iterating, to try and nail down some initial ideas.

The first thought was about the variables we could use to convey changes in quantity. Time, movement, zoom & scale were all identified as being potentially useful.

We started to construct sentences that could tell a story, and break down into portions to allow new stories to slot in.

Looking at splitting grids into sections to show different variables.

We thought a bit about avatars, and how to use them in visual representations of data, in this case combining them with friends’ names and stories.

Looking at combining avatars with ‘bodies’. Bird suits, vehicles, polaroids.

An early narrative concept, setting up the story early on and sending you through a process of experience. We thought about pushing bits of stories to devices in real time.

After a bit more crunching and sketching, we broke everything down into two routes:

  • Scale – influenced by Powers of Ten, used to compare your networks to increasing sizes of numbers,
  • Grouping / snapping – used to take your contacts and run them through a set of statistics, applying them personally to historical events and comparing them against similar events in different times.


What became clear after the sketching was the need to show a breadcrumb trail of information, to give the user a real sense of their scale compared to the numbers we were looking at. Eames’ Powers of Ten video achieves this – a set of steps, with consistent visual comparisons between each step.

Perfect for showing the relevance of one thing in relation to the next, or a larger collective group. But the variation in the stories we’d be showing meant that we didn’t want bespoke graphics for each individual scenario. We tested out a quick mockup in Illustrator using relatively sized, solid colour squares.

Despite the lack of rich textures and no visual indicators of your current position in the story, the impact was there. We added Facebook / Twitter avatars for signed in states, and worked on a colour palette that would sit well with BBC branding.

The next problem was dealing with non-signed in states. How many really was always designed to work with social networks, but we wanted it to be just as relevant with no Facebook or Twitter credentials – for classrooms, for example. We took a trip to the V&A to view the Isotype exhibition that was on at the time.

 

That’s 85-year-old iconography and infographic design that looks as relevant today as it did back then. A real sense of quantity through simple pictograms. Completely fantastic. We set about designing a stack of Isotype-influenced icons to work with the site when users weren’t signed into their social networks.

And the icons in context…

We used a bit of Isotype inspiration for the organisation of the grouping stories – evenly spaced grids of icons or avatars.

The rest of the site was intended to stay consistent with How big really. We used photography in place of bespoke graphics for the story panels, as the graphical output varies for each user.

How many really is an entirely different beast to How big really. Rather than each dimension being a solid, one-shot hit, the value is in backing up simple visuals with interesting narratives. We spent almost as much time on the written aspect of stories as we did on the aesthetics and interaction. I hope it gives a little context to numbers and figures we often take for granted. Please do have a browse around!

Friday Links

Video Game in a Box by Teague Labs. Delightful.

dextr + telly

danw’s and Toby Barnes’ collection of glanceable displays and devices (above is dextr).

The reconstruction of what a person sees by measuring brain activity:

The left clip is a segment of the movie that the subject viewed while in the magnet. The right clip shows the reconstruction of this movie from brain activity measured using fMRI. The reconstruction was obtained using only each subject’s brain activity and a library of 18 million seconds of random YouTube video.

Physical graffiti that beautifies (via @urbnscl)

Time-lapse taken from the front of the International Space Station. WHOA.

Week 328

I’m writing these notes on the bus back to London. We have a rota to write weeknotes – everyone has a turn – and after months of teasing people when they don’t put notes up early in the week, on Tuesday after All Hands, it’s a little shoddy to have left it till Friday myself.

It’s been an eventful week!

  • Tuesday, How Many Really? launched, a website with the BBC that shows you populations from significant historical times compared to your own social network.
  • We put out a short video product sketch of clocks for robots.
  • On Wednesday, we started furnishing our new overspill office. It’s just over the road, and all Chaco work is shifting over there. It’s far from optimal to split the room like this, but we’re really packed in at the moment, and it’s stifling. So pending the big studio move (we received, and agreed, top-level numbers this week too), we have the overspill office on a short-term lease. The upside here is that James and Alice, who sit in Statham, can sit in the main room with everyone else. But it’s going to be weird and difficult and we’ll have to pay a lot of attention to the split to make sure it’s not damaging. An aside: somehow the office has come to be called “BERG 9 Overspill Area.”
  • And on Thursday, Nick consolidated all our email, calendars and what-not on Google Apps for Business. Our IT has all been a bit organic up till now – the less polite way of saying it would be haphazard and often broken – so this is a good step.

To stick with that Thursday Google consolidation for a second… it’s a shame to no longer allow the variety of email systems and calendar applications that we had before, but there’s a huge benefit in having everyone use the same tools, and having all the tools built by the same company. There’s some kind of network effect — a benefit in sharing protocols and originators.

I don’t like the word “ecosystem” because it feels like something else: maybe all the Google tools align in the same crystal lattice. I choose a crystal because electrons move freely and quickly in the regularity of structure, pausing when they have to cross boundaries where the grain of the lattice changes.

So I have these various lattices in which I live my electronic life. There’s Google for email, calendar, apps etc. Apple for phone, music, photos, and my place of work — my desktop, truly, and my metaphorical pens and paper too really. HDMI at home, which is the lattice that music and video travels through on the way to the speakers and projector after it leaves the Apple lattice. Since I standardised on HDMI (upgrading a couple of bits of kit, discarding handfuls and handfuls of interim convertors and cables), my home AV is way simpler and I get to play music and watch telly without having to remember what switch needs to be turned to whatever setting.

Consolidations of protocol. I don’t know, there’s something in this I want to think about more.

Let me say a little about what projects are on.

Our two Uinta projects are gathering momentum. Simon, Joe, and Matt J were in Brighton on Monday for a meeting with potential collaborators.

Chaco continues, and is well underway. Jack and Timo have been filming this week, and Timo has a shiny new iMac on his desk dedicated to editing and doing maths on pixels. The projects (Chaco is a family of projects) are huge and ambitious, and we’re going to need a broader team to pull them off — in part, that’s what my blog post this week about vacancies was for. (And I encourage you to have a read! There’s hardware and software and all sorts of things we’re interested in.)

Weminuche – the platform – and Barry – the first instance of it – continue too, and almost everyone is involved in some way or another. Looking down my list from All Hands, Denise, Alice, Nick, Alex, Simon, James, Timo, Jack and I all mentioned time spent on that. I’m being cryptic I know. But part of the purpose of these weeknotes is personal, so I can look back one day and remember “ah, that was what we were doing in week such-and-such” and memories will come flooding back. So I pair people names and project names in order to drop a future-anchor into the here and now.

As if that wasn’t enough, SVK second print run sales continue, and we continue to debug our fulfilment and customer service processes. I’m proud of SVK, as an internally run project. It’s hard to push work into the world when you don’t have a client because the temptation is to wait until perfection. But unless you get out into the world, all that work is for nothing anyway, and the experience of making your work public is so transformational to a project that you have to leave time and room to understand and build on that transformation. Launch unfinished, I say! Easy to say, hard to do. Mother birds push their chicks out of the nest before they can fly, but who’s going to learn to fly when somebody’s bringing food to you the whole time? Mother birds must feel horrible. Baby birds must resent them.

And there’s also some more work with Dentsu that I cannot wait to show you. Not long now.

This week I’ve started trying to think about two areas I’m not in the habit of thinking about: sales, and process.

We have a very simple sales model at the moment. We sell the product of our time and thinking. But there’s something in the vague area of long-term research projects, product partnerships, IP, that kind of thing. I don’t know, to be honest. My commercial sense is very undeveloped. All I know is that I know very little, and I see in front of me a broad, grey, undifferentiated space. So I want to work on that, feel it out, and get to better understand commercial reality.

The other area I want to get deeper on is process. I feel very naive around process right now. I observe that we’re a design company, with a design culture built over 6 years, yet we’re having to cultivate a new engineering culture that sits within it and alongside it, and the two have different crystal grains. It’s good that they do — engineering through a design process can feel harried and for some projects that does not lead to good outcomes. And vice versa. But it throws up all kinds of questions for me: do we really want two domains of engineering and design; what is the common protocol – the common language – of engineering culture, and indeed of our design culture; how do these lattices touch and interact where they meet; how do we go from an unthought process to one chosen deliberately; how is change (the group understanding of, and agreement with a common language) to be brought about, and what will it feel like as it happens.

Again, not things I have much experience with.

And again, as I did last time, I read back over my weeknotes and wonder whether it’s worth thinking about these kinds of things. You know, I’ve just spent an hour’s bus ride noodling about things that maybe only make sense when a company is 100s of people, rather than a dozen plus change. Couldn’t I have spent my time replying to email?

I genuinely don’t know. Other people appear to construct and grow companies without any need for this kind of abstract introspection. Maybe it’d be better to pick the first thing that seems to work and just go for it.

Then again, maybe not. My metric for thinking about this is: ensuring the best possible environment for happiness and invention. Happiness feels like the easier of the two to work towards. It’s more easily identifiable, if not always easy to reach. Invention I see only out of the corner of my eye. Mostly you can only identify invention in retrospect. It is rare and fragile. Keeping hold of that feels worth a bus ride’s thinking.

Product sketch: Clocks for Robots

As a studio we have recently been quite preoccupied with two themes. One is new systems of time and place in interactive experiences. The second is the emerging ecology of new artificial eyes – “The Robot Readable World”. We’re interested in the markings and shapes that attract the attention of computer vision, connected eyes that see differently to us.

We recently met an idea which seems to combine both, and thought we’d talk about it today – as a ‘product sketch’ in video, hopefully to start a conversation.

Our “Clock for Robots” is something from this coming robot-readable world. It acts as dynamic signage for computers. It is an object that signals both time and place to artificial eyes.

It is a sign in a public space displaying dynamic code that is both here and now. Connected devices in this space are looking for this code, so the space can broker authentication and communication more efficiently.

BERG-Clocks-20110909-005

The difference between fixed signage and changing LED displays is well understood for humans, but hasn’t yet been expressed for computers as far as we know. You might think about those coded digital keyfobs that come with bank accounts, except this is for places, things and smartphones.
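
As a rough illustration of that comparison – a sketch of the general idea only, not the implementation behind the prototype; the place ID, shared secret and time window below are all invented – a sign could derive its rotating code from the place it marks and the current time, much like a bank keyfob does:

```python
# A sketch of a "clock for robots" code: a short value derived from a place
# and the current 30-second window. A service that knows the place's secret
# can verify it, confirming a phone really had line of sight "here and now".
import hashlib
import hmac
import time

TIME_WINDOW_SECONDS = 30  # how often the displayed code changes

def clock_code(place_id, shared_secret, now=None):
    """The short code the sign should currently display for this place."""
    if now is None:
        now = time.time()
    window = int(now // TIME_WINDOW_SECONDS)
    message = ("%s:%d" % (place_id, window)).encode()
    digest = hmac.new(shared_secret, message, hashlib.sha256).hexdigest()
    return digest[:8]  # short enough to render as a glanceable pattern

def verify_sighting(place_id, shared_secret, seen_code):
    """Accept the current window and the previous one, to allow a little lag."""
    now = time.time()
    valid = {clock_code(place_id, shared_secret, now),
             clock_code(place_id, shared_secret, now - TIME_WINDOW_SECONDS)}
    return seen_code in valid
```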

Timo says about this:

One of the things I find most interesting about this is how turning a static marking like a QR code into a dynamic piece of information somehow makes it seem more relevant. Less of a visual imposition on the environment and more part of a system. Better embedded in time and space.

In a way, our clock in the cafe is kind of like holding up today’s newspaper in pictures to prove it’s live. It is a very narrow, useful piece of data, which is relevant only because of context.

If you think about RFID technology, proximity is security, and touch is interaction. With our clocks, the line-of-sight is security and ‘seeing’ is the interaction.

BERG-Clocks-20110909-011

Our mobiles have changed our relationship to time and place. They have radio/GPS/wifi so we always know the time and we are never lost, but it is all wobbly and bubbly, and doesn’t have the same obvious edges we associate with places… it doesn’t happen at human scale.


^ “The bubbles of radio” by Ingeborg Marie Dehs Thomas

Line of sight to our clock now gives us a ‘trusted’ or ‘authenticated’ place. A human-legible sense of place is matched to what the phone ‘sees’. What if digital authentication/trust was achieved through more human scale systems?

Timo again:

In the film there is an app that looks at the world but doesn’t represent itself as a camera (very different from most barcode readers for instance, that are always about looking through the device’s camera). I’d like to see more exploration of computer vision that wasn’t about looking through a camera, but about our devices interpreting the world and relaying that back to us in simple ways.

BERG-Clocks-20110909-008

We’re interested in this for a few different reasons.

Most obviously perhaps because of what it might open up for quick authentication for local services. Anything that might be helped by my phone declaring ‘I am definitely here and now’ e.g., as we’ve said – wifi access in a busy coffee shop, or authentication of coupons or special offers, or foursquare event check-ins.

BERG-Clocks-20110909-007

What if there were tagging bots searching photos for our clocks…

…a bit like the astrometry bot looking for constellations on Flickr?

But there are lots of directions this thinking could be taken in. We’re thinking about it being something of a building block for something bigger.

Spimes are an idea conceived by Bruce Sterling in his book “Shaping Things” where physical things are directly connected to metadata about their use and construction.

We’re curious as to what might happen if you start to use these dynamic signs for computer vision in connection with those ideas. For instance, what if you could make a tiny clock as a cheap solar powered e-ink sticker that you could buy in packs of ten, each with its own unique identity, that ticks away constantly. That’s all it does.

This could help make anything a bit more spime-y – a tiny bookmark of where your phone saw this thing in space and time.

Maybe even just out of the corner of its eye…

As I said – this is a product sketch – very much a speculation that asks questions rather than a finished, finalised thing.

We wanted to see whether we could make more of a sketch-like model, film it and publish it in a week – and put it on the blog as a stimulus to ourselves and hopefully others.

We’d love to know what thoughts it might spark – please do let us know.


Clocks for Robots has a lot of influences behind it – including but not limited to:

Josh DiMauro’s Paperbits
e.g. http://www.flickr.com/photos/jazzmasterson/3227130466/in/set-72157612986908546
http://metacarpal.net/blog/archives/2006/09/06/data-shadows-phones-labels-thinglinks-cameras-and-stuff/

Mike Kuniavsky:

Warren Ellis on datashadows

Bruce Sterling: Shaping Things

Tom Insam‘s herejustnow.com prototype and Aaron Straup Cope’s http://spacetimeid.appspot.com/, http://www.aaronland.info/weblog/2010/02/04/cheap/#spacetime

We made a quick-and-dirty mockup with a Kindle and http://qrtime.com
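
That mockup isn’t published, but a minimal version of the same idea – with a made-up place ID and refresh interval – is just a loop that re-encodes “this place, right now” as a QR code for a display to show:

```python
# Regenerate a QR code every half minute encoding "this place, right now",
# saved as an image a connected display (a Kindle, in the original mockup)
# could show. The place ID and interval here are invented for illustration.
import time
import qrcode  # pip install qrcode[pil]

PLACE_ID = "berg-cafe-window"
REFRESH_SECONDS = 30

while True:
    timestamp = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
    qrcode.make(PLACE_ID + "|" + timestamp).save("clock.png")
    time.sleep(REFRESH_SECONDS)
```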

BERG-Clocks-20110912-016
