Every Surface is a Screen

Since the start of the 21st century, projection and pixel mapping have developed as new ways to integrate video content into all kinds of live performances, from experimental dance to large-scale public celebrations.

Image: Sydney Opera House, projection mapped during the Vivid Sydney festival, 2013.

In Edinburgh in the summer of 2001, at the international entertainment lighting colloquium Showlight (Q30645), a novel product was previewed. Catalyst combined a high-power video projector, fitted with a unique moving-mirror head that allowed it to point in any direction, with software that controlled video content in real time via DMX, the lighting control protocol (Q3957). For the first time, a video projector could be controlled in the same way as other lights: the video content could be resized, stretched, colour tinted, started, stopped, sped up and slowed down, faded in and out, and overlaid in multiple layers, all from a standard lighting console. As far as the console and the lighting operator were concerned, Catalyst was just another (very complicated) moving light.
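To make the idea concrete, here is a minimal sketch in Python of how a lighting console 'sees' a media-server video layer: as a run of DMX channels, each a value from 0 to 255. The channel layout below is invented for illustration; Catalyst's actual DMX personality was different and varied between versions.

```python
# Hypothetical DMX channel map for one media-server video layer.
# The assignments are illustrative only, not Catalyst's real layout.

def decode_layer(dmx: bytes, base: int) -> dict:
    """Decode an 8-channel slice of a DMX frame into layer parameters."""
    ch = dmx[base:base + 8]
    return {
        "intensity": ch[0] / 255,            # fade in / out
        "clip":      ch[1],                  # which video file to play
        "speed":     (ch[2] - 128) / 128,    # -1.0 (reverse) .. +1.0
        "scale":     0.5 + ch[3] / 128,      # resize, 0.5x .. 2.5x
        "red":       ch[4] / 255,            # colour tint
        "green":     ch[5] / 255,
        "blue":      ch[6] / 255,
        "opacity":   ch[7] / 255,            # mix when layers overlay
    }

# The console simply writes levels into these slots, exactly as it
# would for the pan/tilt/colour channels of a moving light.
universe = bytearray(512)            # one DMX universe: 512 channels
universe[0:8] = [255, 3, 192, 128, 255, 200, 200, 255]
params = decode_layer(bytes(universe), base=0)
```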

It turned out that the real power of Catalyst was in using the software to feed content to standard video projectors. For lighting designers wanting to control all the visuals, particularly for live music concerts, clubs and similar applications, it brought video into the same technical system, and the same workflow, as lighting. If you wanted to match the colour tint of the video to the lighting, you could: you simply programmed Catalyst in the same way you programmed a moving light. For video designers, almost everything could now be done in real time. Rather than editing and re-rendering video files whenever content had to be rescaled or colour corrected, the change could be made live. Video design could be as responsive as lighting and sound design. Following Catalyst's success, other manufacturers started to make their own versions, and a new product category was born: the media server.

One of the features of media servers was that they allowed video content to be distortion-corrected, counteracting the keystone effect of the projector hitting the screen at an angle, something that previously required difficult graphical or photographic techniques. Beyond this simple correction, media servers facilitated more complex ways of relating the video content to the surface it was projected on, a technique known as projection mapping.

The first projection mapping was analogue. In 1969, Disney created the Haunted Mansion ride at Disneyland, whose singing busts, known as the 'Grim Grinning Ghosts', were created by filming head-shots of five singers and projecting the footage onto three-dimensional sculptures of their faces. On a larger scale, the 1986 musical Time featured a huge head of the actor Laurence Olivier, animated by a projected film of the actor. For projection mapping onto buildings, where the content related to the features of the façade, elaborate systems were developed using layers of large-format film, with one layer masking out the others as the film scrolled through the projector. The result was effects such as fish seeming to swim between the pillars of a building, or appearing only in its windows.
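Geometrically, the distortion correction mentioned above is a projective warp: the operator drags the four corners of the image until it sits square on the surface. The sketch below shows the standard 'direct linear transform' maths behind such a corner-pin adjustment; it is a generic illustration, not any particular media server's implementation, and the corner coordinates are made up.

```python
# Corner-pin / keystone correction as a 3x3 homography, solved with
# the standard direct linear transform (DLT). Illustrative only.
import numpy as np

def corner_pin(src, dst):
    """Homography H such that H @ [x, y, 1] maps each src corner to dst."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The smallest singular vector of the system gives the solution.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

# Corners of a 1920x1080 video frame ...
src = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
# ... and where they must land so the image looks square on the
# surface (coordinates invented for the example).
dst = [(40, 12), (1880, 60), (1900, 1040), (25, 1070)]
H = corner_pin(src, dst)   # applied to every pixel to pre-distort the frame
```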

With the advent of digital video and media servers, all of these effects became easier to achieve and could be more sophisticated. Today, projection mapping is used to animate buildings for festivals, public events and son et lumière performances, as well as on stage for theatre, opera, dance and concerts. By integrating the media server control with other stage systems, video content can track the movement of objects and people on stage. For the 2004 musical The Woman in White, the stage scenery of moving and revolving walls was painted grey, with projected scenes for each location created using games software. The video tracked the moving walls, so the images appeared to be 'painted on'.

Taking this a step further, Troika Ranch's 2006 dance production 16 [R]evolutions used an infra-red camera to track the motion of the dancers, so the video content could respond to their movements on stage. In one sequence, white bars projected on the back wall outlined the maximum extent of the dancers' moves, so a dancer appeared to be pushing the bars away, as if to make space to dance in.

Catalyst was not just the first media server, opening up the possibilities of projection mapping. Its creator, Richard Bleasdale, also added a feature that output DMX data based on video content, and the result was another new technique: pixel mapping. The colour and brightness of any pixel in a video could be used to control the colour and brightness of a light on stage. Suddenly, video could be used to make rich, organic, non-repeating lighting effects which would previously have taken many hours to programme on a traditional lighting console. Pixel mapping has many uses, but it is perhaps most frequently seen in television light entertainment shows, controlling individual lights, strips of LEDs built into the set, and so on (C.10).
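The core of pixel mapping is simple enough to sketch: each fixture on stage is assigned a pixel coordinate in the video frame, and its colour-mixing DMX channels are driven by that pixel's colour on every frame. The fixture positions, addresses and frame source below are invented for illustration.

```python
# A minimal pixel-mapping loop: sample the video frame at each
# fixture's assigned pixel and write the colour to its DMX address.
# Fixture positions and addresses are hypothetical.

def pixel_map(frame, fixtures, universe):
    """frame: 2D grid of (r, g, b) tuples; fixtures: (x, y, dmx_address)."""
    for x, y, addr in fixtures:
        r, g, b = frame[y][x]
        universe[addr:addr + 3] = [r, g, b]   # one RGB fixture = 3 channels
    return universe

# A 4x3 test 'video frame' and three LED fixtures mapped onto it.
frame = [
    [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]
    for _ in range(3)
]
fixtures = [(0, 0, 0), (1, 1, 3), (3, 2, 6)]   # (pixel x, pixel y, DMX address)
universe = pixel_map(frame, fixtures, [0] * 512)
```

Run once per video frame, the rig becomes, in effect, a very low-resolution display, which is why the resulting effects feel organic and never repeat.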

Since the start of the 21st century, the technologies of projection and pixel mapping have developed as remarkable new ways to integrate media content into live performances and events. The results can be seen in almost every sector of performance, from opera to theme parks.
