
Blog posts tagged as 'looking'

Swiping through cinema, touching through glass

The studio is continually interested in the beautiful and inventive stuff that can happen when you poke and prod around the edges of technology and culture. Mag+ emerged from a shared curiosity between Bonnier and BERG about reading on tablets, while Making Future Magic emerged from experiments with light painting and screens.

Early last year we were experimenting with product photography for a retail client pitch. We wondered how we could use cinematic techniques to explore product imagery.

Watch the video of our experiments on Vimeo.

What would happen if instead of a single product image or a linear video, we could flick and drag our way through time and the optical qualities of lenses? What if we had control of the depth of field, focus, lighting, exposure, frame-rate or camera position through tap and swipe?

Swiping through cinema

This is a beautiful 1960s Rolex that we shot in video while pulling focus across the surface of the watch. On the iPad, the focus is then under your control: the focal point changes to match your finger as it taps and swipes across the object. Your eye and finger are in the same place; you are in control of the locus of attention.

Jack originally explored focus navigation (with technical help from George Grinsted) in 2000, and now Lytro allow ‘tap to focus’ in pictures produced by the ‘light field camera’.

The lovely thing here is that we can see all of the analogue, optical qualities such as the subtle shifts in perspective as the lens elements move, and the blooming, reflection and chromatic aberrations that change under our fingertips. Having this optical, cinematic language under the fine control of our fingertips feels new; it’s a lovely, playful, explorative interaction.

Orson Welles’ Deep Focus.

Cinematic language is a rich seam to explore: what if we could adjust the exposure to get a better view of highlights and shadows? Imagine this was diamond jewellery, and we could control the lighting in the room. Or we could experiment with aperture, going from the deep focus of Citizen Kane through to the extremely shallow focus used in Gomorrah, where the foreground is separated from the environment.

Cold Dark Matter by Cornelia Parker.

What if we dropped or exploded everyday objects under super high frame-rate cinematography, and gave ourselves a way of swiping through the chaotic motion? Lots of interesting interactions to explore there.

Touching through glass

This next experiment really fascinated us. We shot a glass jar full of thread bobbins rotating in front of the camera; on the iPad you can swipe to explore these beautiful, intricate, colourful objects.

There is a completely new dimension here, in that you are both looking at a glass jar and touching a cold glass surface. The effect is almost uncanny: a somewhat realistic sense of touch has been re-introduced into the cold, smooth iPad screen. We’re great fans of Bret Victor’s brilliant rant on the problems of the lack of tactility in ‘pictures under glass’, and in a way this is a reinforcement of that critique: tactility is achieved through an uncanny visual reinforcement of touching cold glass. This one really needs to be in your hands to be fully experienced.

And it made us think, what if we did all product photography that was destined for iPads inside gorgeous Victorian bell jars?

Nick realised this as an app on a first-generation iPad:

Each of the scenes in the Swiping through Cinema app is made up of hundreds (and in some cases thousands) of individual images, each extracted from a piece of real-time HD video. It is the high-speed manipulation of these images that creates one continuous experience, and this has only become possible relatively recently.
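The post doesn’t spell out how those frames were extracted, but as a rough sketch of the general approach in modern Swift, using AVFoundation’s AVAssetImageGenerator (an illustration of the idea rather than the pipeline the app actually used):

```swift
import AVFoundation
import UIKit

/// Pull a fixed number of evenly spaced still frames out of a video clip,
/// so they can later be scrubbed through as an image sequence.
/// Illustrative only; not the pipeline described in the post.
func extractFrames(from videoURL: URL, count: Int) throws -> [UIImage] {
    let asset = AVURLAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Ask for exact frame times rather than the nearest keyframe.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let duration = asset.duration.seconds
    var frames: [UIImage] = []
    for i in 0..<count {
        let seconds = duration * Double(i) / Double(max(count - 1, 1))
        let time = CMTime(seconds: seconds, preferredTimescale: 600)
        let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
        frames.append(UIImage(cgImage: cgImage))
    }
    return frames
}
```

In practice the frames would be written to disk as numbered images rather than held decoded in memory, which is where the constraints described below come in.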

During our time developing Mag+, we learnt a great deal about using images on tablets. With the first-generation iPad, you needed to pay careful attention to RAM use, as the system could kill your app for being excessively greedy, even after loading only a handful of photographs. We eventually created a method which would allow you to smoothly animate any number of full-screen images.
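The exact method isn’t described, but a common way to stay within those limits is to keep frames on disk, decode them on demand, and let a small cache evict older frames as new ones are requested. A minimal sketch under that assumption (the class and names here are illustrative):

```swift
import UIKit

/// Keeps only a small window of decoded full-screen frames in memory.
/// Frames are decoded from disk on demand; NSCache evicts older entries
/// when the count limit is reached or the system is under memory pressure.
final class FrameCache {
    private let frameURLs: [URL]           // numbered image files on disk
    private let cache = NSCache<NSNumber, UIImage>()

    init(frameURLs: [URL], maxDecodedFrames: Int = 12) {
        self.frameURLs = frameURLs
        cache.countLimit = maxDecodedFrames
    }

    func frame(at index: Int) -> UIImage? {
        let key = NSNumber(value: index)
        if let cached = cache.object(forKey: key) { return cached }
        guard frameURLs.indices.contains(index),
              let image = UIImage(contentsOfFile: frameURLs[index].path) else { return nil }
        cache.setObject(image, forKey: key)
        return image
    }
}
```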

With that code in place, we moved on to establishing a workflow which would allow us to shoot footage and preview it within the app in a matter of minutes. We also consciously avoided filling the screen with user interface elements, which means that the only interaction is direct manipulation of what you see on-screen.
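A minimal sketch of what that kind of direct manipulation might look like, assuming the footage has been flattened into numbered frame files (the class below is illustrative, not the app’s actual code): the horizontal position of the finger is mapped straight onto a frame index, so the point of focus, or the rotation of the jar, stays under the fingertip.

```swift
import UIKit

/// A full-screen image view scrubbed by touch alone: no sliders or buttons,
/// the frame shown is chosen by where the finger is on screen.
final class ScrubView: UIImageView {
    private let frameURLs: [URL]   // e.g. frame_0001.jpg ... frame_0400.jpg

    init(frameURLs: [URL]) {
        self.frameURLs = frameURLs
        super.init(frame: .zero)
        isUserInteractionEnabled = true   // UIImageView ignores touches by default
        contentMode = .scaleAspectFill
        addGestureRecognizer(UIPanGestureRecognizer(target: self,
                                                    action: #selector(handlePan(_:))))
        if let first = frameURLs.first {
            image = UIImage(contentsOfFile: first.path)
        }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard bounds.width > 0, !frameURLs.isEmpty else { return }
        // Map the horizontal touch position (0...1) onto a frame index.
        let fraction = max(0, min(1, gesture.location(in: self).x / bounds.width))
        let index = Int(fraction * CGFloat(frameURLs.count - 1))
        // In practice the frame would come from a small cache like the one
        // above rather than being decoded from disk on every pan event.
        image = UIImage(contentsOfFile: frameURLs[index].path)
    }
}
```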

With the Retina display on the third-generation iPad, we’re really excited by the prospect of being able to move through super crisp and detailed image sequences.

We’re really excited about re-invigorating photographic and cinematographic techniques for iPads and touchscreens, and finding out how to do new kinds of interactive product imagery in the process.

Robot Readable World. The film.

I recently cut together a short film, an experiment in found machine-vision footage:

Robot readable world from Timo on Vimeo.

As robots begin to inhabit the world alongside us, how do they see and gather meaning from our streets, cities, media and from us? The robot-readable world is one of the themes that the studio has been preoccupied by recently. Matt Jones talked about it last year:

“The things we are about to share our environment with are born themselves out of a domestication of inexpensive computation, the ‘Fractional AI’ and ‘Big Maths for trivial things’ that Matt Webb has spoken about.”

and

“‘Making Things See’ could be the beginning of a ‘light-switch’ moment for everyday things with behaviour hacked into them. For things with fractional AI, fractional agency – to be given a fractional sense of their environment.”

This film uses found footage from computer vision research to explore how machines are making sense of the world. From a very high-level and non-expert viewing, it seems that machines have a tiny, fractional view of our environment, one that sometimes echoes our own human vision and sometimes doesn’t.

For a long time I have been struck by just how beautiful the visual expressions of machine vision can be. In many research papers and Siggraph experiments that float through our inboxes, there are moments with extraordinary visual qualities, probably quite separate from and unintended by the original research. There is something about the crackly, jittery, yet often organic, insect-like or human quality of a robot’s interpretation of the world. It often looks unstable and unsure, and occasionally mechanically certain and accurate.

Of the film Warren Ellis says:

“Imagine it as, perhaps, the infant days of a young machine intelligence.”

The Robot-Readable World is pre-Cambrian at the moment, but machine vision is becoming a design material alongside metals, plastics and immaterials. It’s something we need to develop understandings and approaches to, as we begin to design, build and shape the senses of our new artificial companions.

Much of our fascination with this has been fuelled by James George’s beautiful experiments, Kevin Slavin’s lucid unpacking of algorithms and the work (above) by Adam Harvey developing a literacy within computer vision. Shynola are also headed in interesting directions with their production diary for the upcoming Red Men film, often crossing over with James Bridle’s excellent ongoing research into the aesthetics of contemporary life. And then there is the work of Harun Farocki in his Eye / Machine series that unpacks human-machine distinctions through collected visual material.

As a sidenote, this has reminded me that I was long ago inspired by Paul Bush’s ‘Rumour of true things’, which is ‘constructed entirely from transient images – including computer games, weapons testing, production line monitoring and marriage agency tapes’ and is ‘a remarkable anthropological portrait of a society obsessed with imaging itself’. This found-footage tactic is fascinating: the process of gathering and selecting footage is an interesting R&D exercise, and cutting it all together reveals new meanings and concepts. Something to investigate as a method of research and communication.
