The studio is continually interested in the beautiful and inventive stuff that can happen when you poke and prod around the edges of technology and culture. Mag+ emerged from a curiosity from both Bonnier and BERG about reading on tablets while Making Future Magic emerged from experiments with light painting and screens.
Early last year we were experimenting with product photography for a retail client pitch. We wondered how we could use cinematic techniques to explore product imagery.
Watch the video of our experiments on Vimeo.
What would happen if instead of a single product image or a linear video, we could flick and drag our way through time and the optical qualities of lenses? What if we had control of the depth of field, focus, lighting, exposure, frame-rate or camera position through tap and swipe?
Swiping through cinema
This is a beautiful 1960s Rolex that we shot in video while pulling focus across the surface of the watch. On the iPad, the focus is then under your control: the focal point changes to match your finger as it taps and swipes across the object. Your eye and finger are in the same place; you are in control of the locus of attention.
Jack originally explored focus navigation (with technical help from George Grinsted) in 2000, and now Lytro allow ‘tap to focus’ in pictures produced by the ‘light field camera‘.
The lovely thing here is that we can see all of the analogue, optical qualities, such as the subtle shifts in perspective as the lens elements move, and the blooming, reflections and chromatic aberrations that change under our fingertips. Having this optical, cinematic language under the fine control of our fingers feels new; it’s a lovely, playful, explorative interaction.
Orson Welles’ Deep Focus.
Cinematic language is a rich seam to explore. What if we could adjust the exposure to get a better view of highlights and shadows? Imagine this were diamond jewellery, and we could control the lighting in the room. Or we could experiment with aperture, going from the deep focus of Citizen Kane through to the extremely shallow focus used in Gomorrah, where the foreground is separated from the environment.
Cold Dark Matter by Cornelia Parker.
What if we dropped or exploded everyday objects under super high frame-rate cinematography, and gave ourselves a way of swiping through the chaotic motion? There are lots of interesting interactions to explore there.
Touching through glass
This next experiment really fascinated us. We shot a glass jar full of thread bobbins rotating in front of the camera; on the iPad you can swipe to explore these beautiful, intricate, colourful objects.
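A rotation shot like this loops back on itself, so swipe-scrubbing can wrap around seamlessly. A minimal sketch of that idea, as an illustration only (the frame count, swipe sensitivity and all names are ours, not taken from BERG’s app):

```python
# Sketch: circular scrubbing through a looping rotation sequence.
# Assumes the jar was shot over exactly one full revolution, so
# frame 0 follows the last frame; modulo arithmetic makes the
# swipe wrap around seamlessly in either direction.

class CircularScrubber:
    def __init__(self, frame_count, pixels_per_frame):
        self.frame_count = frame_count          # frames in one revolution
        self.pixels_per_frame = pixels_per_frame  # swipe sensitivity
        self.position = 0.0                     # fractional frame position

    def drag(self, dx_pixels):
        """Advance by a horizontal swipe delta; wraps past either end."""
        self.position = (self.position + dx_pixels / self.pixels_per_frame) % self.frame_count
        return self.current_frame()

    def current_frame(self):
        return int(self.position) % self.frame_count

# 720 frames over one revolution, 4 finger-pixels per frame step:
scrubber = CircularScrubber(720, 4.0)
scrubber.drag(40)    # swipe right: 10 frames forward → frame 10
scrubber.drag(-80)   # swipe left past frame 0: wraps to frame 710
```

The wrap-around is what sells the illusion: the jar simply keeps turning under your finger, with no visible start or end to the footage.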
There is a completely new dimension here, in that you are both looking at a glass jar and touching a cold glass surface. The effect is almost uncanny: a somewhat realistic sense of touch has been reintroduced to the cold, smooth iPad screen. We’re great fans of Bret Victor’s brilliant rant on the lack of tactility in ‘pictures under glass‘, and in a way this reinforces that critique: tactility is achieved through an uncanny visual reinforcement of touching cold glass. This one really needs to be in your hands to be fully experienced.
And it made us think, what if we did all product photography that was destined for iPads inside gorgeous Victorian bell jars?
Nick realised this as an App on a first-generation iPad:
Each of the scenes in the Swiping through Cinema app is made up of hundreds (and in some cases thousands) of individual images, each extracted from a piece of real-time HD video. It is the high-speed manipulation of these images that creates one continuous experience, and this has only become possible relatively recently.
During our time developing Mag+, we learnt a great deal about using images on tablets. With the first-generation iPad, you needed to pay careful attention to RAM use, as the system could kill your app for being excessively greedy, even after loading only a handful of photographs. We eventually created a method which would allow you to smoothly animate any number of full-screen images.
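The post doesn’t describe the method itself, but one common way to animate more frames than fit in RAM is to decode frames on demand and hold only a bounded, most-recently-used window in memory. A purely illustrative sketch of that idea (the decoder here is a stand-in, not real image decoding):

```python
# Sketch of one common answer to the memory problem: a bounded
# least-recently-used cache of decoded frames. This is an
# illustration of the general technique, not BERG's actual method;
# decode_frame just fabricates a placeholder payload.

from collections import OrderedDict

class FrameCache:
    def __init__(self, capacity, decoder):
        self.capacity = capacity      # max decoded frames held in RAM
        self.decoder = decoder        # function: index -> decoded frame
        self.cache = OrderedDict()    # index -> frame, oldest first

    def frame(self, index):
        if index in self.cache:
            self.cache.move_to_end(index)       # mark as recently used
            return self.cache[index]
        decoded = self.decoder(index)
        self.cache[index] = decoded
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)      # evict least recently used
        return decoded

def decode_frame(index):
    """Stand-in for decoding one image of the sequence from disk."""
    return f"frame-{index}"

cache = FrameCache(capacity=8, decoder=decode_frame)
for i in range(20):        # scrub forward through 20 frames...
    cache.frame(i)
len(cache.cache)           # ...while never holding more than 8 in memory
```

Because scrubbing is spatially local (the finger moves through neighbouring frames), a small cache like this gets a high hit rate while keeping RAM use flat however long the sequence is.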
With that code in place, we moved onto establishing a workflow which would allow us to shoot footage and be able to preview it within the app in a matter of minutes. We also consciously avoided filling the screen with user interface elements, which means that the only interaction is direct manipulation of what you see on-screen.
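The post doesn’t name the tools in that shoot-to-preview workflow, but the extraction step could plausibly be done with ffmpeg, turning a clip into the numbered image sequence the app scrubs through. A sketch that just builds the command line (ffmpeg is our assumption; the width and paths are illustrative):

```python
# Sketch: one plausible shoot-to-preview extraction step, dumping
# every frame of a clip as a numbered JPEG an app can scrub through.
# ffmpeg is an assumed tool choice; the post doesn't name one.

def extract_frames_cmd(video_path, out_dir, width=1024):
    """Build an ffmpeg command that writes one JPEG per video frame."""
    return [
        "ffmpeg",
        "-i", video_path,                # input clip from the shoot
        "-vf", f"scale={width}:-1",      # resize to the target screen width
        "-q:v", "2",                     # high JPEG quality
        f"{out_dir}/frame_%04d.jpg",     # frame_0001.jpg, frame_0002.jpg, ...
    ]

extract_frames_cmd("rolex_focus_pull.mov", "frames")
```

Run through a command like this, a clip is ready for the app as soon as the files hit disk, which is what makes a minutes-long shoot-to-preview loop feasible.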
With the Retina display on the third-generation iPad, we’re really excited by the prospect of being able to move through super crisp and detailed image sequences.
We’re really excited about re-invigorating photographic and cinematographic techniques for iPads and touchscreens, and finding out how to do new kinds of interactive product imagery in the process.
11 Comments and Trackbacks
1. Greg Borenstein said on 8 March 2012...
Beautiful results. A classmate and I played around with a system like this a while back for documenting circuits, which are fundamentally 3D and so quite hard to label in a flat image without mixing up holes and wires. We used Processing.js and an html video tag to scrub through a video of a circuit rotating on a lazy susan so you could view it from all sides: http://urbanhonking.com/ideasfordozens/2010/09/29/360_degree_interactive_camera/ and we were working on an interactive system for annotating the parts of the circuit so you could have rotating labels follow each part as you scrubbed. I think there’s lots of possibility for annotation of 3D objects in this format. Imagine providing assembly or refill instructions for Little Printer using this form.
I seem to remember that we stole the rotating technique from some Apple product photography on their site. Our results were totally low res and not nearly as beautiful as what you’re showing here, though.
It’s interesting that you refer to these as “cinematographic” techniques. I’ve also been paying a lot of attention to a form of animated GIF people are calling “cinemagraphs” http://cinemagraphs.com is an example. By mixing frozen photography with selected motions they get a powerful poetic effect. Using motion/stillness contrast to aim your eye as well as focus/defocus. And just this week Microsoft Research released an app for automating the creation of these (which they call Cliplets): http://research.microsoft.com/en-us/um/redmond/projects/cliplets/
Something about the current state of the web and our devices are causing us to melt interactivity and movies into our still images.
2. PA said on 14 March 2012...
Nice.
Came across Arqball Spin last night and thought you might find it interesting. It’s an app that creates 3D models of spinning objects by filming them with an iPhone. Here’s a video of one of the founders demoing it at SXSW.
Cheers.
3. PA said on 14 March 2012...
http://arqspin.com
http://youtu.be/tQ3iwmC3NMY
(links didn’t appear, sorry.)
4. SJ said on 14 April 2012...
Does no one remember QuickTime VR?
Funny how the iPad is making us repeat the multimedia and CD-ROM age, even down to HD versions of the more impressive tech at the time.
Trackback: Interactive and Interaction 16 March 2012
[…] I still get very excited however when I see agencies like BERG London exploring and converging techniques to produce lovely playful interactions. They have used cinematic techniques including depth of field, focus and exposure to allow the user using simple swipes to explore the qualities of a product. Read the full blog post over at berglondon.com […]
Trackback: BERG Uses Cinematic Techniques To Create Tactile Product Photos [Video] @PSFK 17 April 2012
[…] BERG use cinematic techniques and swiping iPad functions to cast new light on product photography, inspiring new ways of showcasing items and non-linear visual storytelling. What would happen if instead of a single product image or a linear video, we could flick and drag our way through time and the optical qualities of lenses? What if we had control of the depth of field, focus, lighting, exposure, frame-rate or camera position through tap and swipe? […]
Trackback: Inspiration | Rachael Johnston 24 April 2012
[…] way wanted it to look to Paul; having the user be able to scroll around a 3D view. This led us to this project, which is absolutely […]
Trackback: Thedropnyc – Video: BERG – Swiping through cinema, Touching through glass - Thedropnyc 25 May 2012
[…] further explanation of what BERG exactly did follows here. We cannot wait for such a shopping experience on the iPad in the […]
Trackback: Futurism Friday: Directly Manipulating Video Objects 3 January 2013
[…] Movies. In the article, writer Mark Wilson is blown away by a project the design consultancy BERG has created. They came up with a way to directly manipulate objects in videos using surprisingly […]
Trackback: Rachael's Blog » Inspiration 20 February 2013
[…] way wanted it to look to Paul; having the user be able to scroll around a 3D view. This led us to this project, which is absolutely […]
Trackback: Motive Photography | Gasket 3 March 2013
[…] post titled Swiping Through Cinema, Touching Through Glass pulls together a number of threads of thought that are fascinating, but it’s the atmosphere […]