Animation – green screen


Working at NUA as the Animation and Sound technician, my process test this week was to go through green screen from beginning to end.

Through this I would be able to test out the new Dragonframe to see what features had been updated, or perhaps changed, and make sure that I am always up to date.

It would also ensure that my green screen setup was as good as possible for an upcoming project with the first years, and let me brush up on using After Effects for the post-production.

So I grabbed one of our walk-cycle armatures, borrowed some doll's clothes from my children and went into the depths of Animation Studio 1.


Destined for stardom!

The key point for green screen is to light the background and foreground almost separately. Obviously, in the reduced space of an animation studio this is a little more difficult, as you can't get a lot of space in between, but the two basic lights, a flo-light (floodlight) and a kick light to pick the model out from the background, are a good place to start.


A flo-light (floodlight) at the top to try and light the background evenly, then two dedolights and a kick from the back to try and distinguish foreground from background

As you can see, the result has harsh light from the spot, which you need, but adding diffusion will soften the harsh shadows, and we want as few of those as possible.


The fabulous Dedolights let you easily attach diffusion material (or gels) directly onto the barn doors with an easy-to-use tiny clamp

This lessened the shadows and gave me a result I was fairly happy with, although in an ideal world the green screen would have two flo-lights on it, for more even coverage.


Softer shadows with diffusion, but I did have to turn up the Dedolight a little to compensate

Ready to film, I then turned to the new Dragonframe, and to be honest there's not a lot of difference from version 3. The interface is slightly smarter, but for the students it will mean an easy transition to the latest version. That was a must, as we had new cameras (Canon 1300Ds) waiting to be installed that would only work with Dragonframe 4.

A short jerky walk-cycle later (it's been a while) and I had my character in the middle of the stage, ready to react to a blue polystyrene box that the students have been using, so that my armature (and the action) could stay in the middle.

Disaster struck at this point in the proceedings too…



His ankle joint broke, but as with all good English actors, we carried on!

The resulting video is not my finest work: the clamp rig is really too big and heavy for this small armature character, and there's a terrible jerk where his ankle breaks, but the reaction works well, and I like the character that the little blue box has… In my head it's a very lively puppy that growled to stop my man in his tracks, then, once beckoned, turns into a slobbering excited mess when he gets a hug and a kiss…

It’s amazing what my imagination adds, now to see if I can add a little post-production magic to help anyone else see it too!

When using Dragonframe you can export either video or stills, but you must remember to conform your take if you want to discard any re-shot or deleted frames, because when you bring an image sequence into AE it can pick up those dud frames.

Also make sure your frame rate is correct. Again, if you lengthen or hold frames on the X-sheet, you will need to conform your take for those changes to take effect, so that your image sequence reflects your timed animation from Dragonframe.
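Since dud or missing frames only surface once the sequence is inside AE, it can be worth sanity-checking the exported folder first. A minimal Python sketch of that check; the `take_0001.tif` naming is a hypothetical example, so adjust the pattern to match your own exports:

```python
import os
import re

def find_sequence_gaps(folder, ext=".tif"):
    """Return the frame numbers missing from a numbered image sequence.

    Assumes filenames end in a frame number before the extension,
    e.g. take_0001.tif (a hypothetical naming scheme).
    """
    pattern = re.compile(r"(\d+)" + re.escape(ext) + r"$")
    frames = sorted(
        int(m.group(1))
        for name in os.listdir(folder)
        if (m := pattern.search(name))
    )
    if not frames:
        return []
    present = set(frames)
    # Any number between the first and last frame with no file is a gap.
    return [n for n in range(frames[0], frames[-1] + 1) if n not in present]
```

An empty list means the export is contiguous; anything else means frames went missing between Dragonframe and the folder you are about to import.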

Leaving the animation studio behind I headed up to the Media Lab to get started in After Effects.

Once you've set up a regular 1080p workspace and composition, bringing in an image sequence is really simple: click on your first image and After Effects will pick up all of the TIFFs in that folder, in sequence, and 'pre-comp' them together as a single piece of media, which for animation from Dragonframe is exactly what you want.

Then drag this TIFF sequence down onto your pre-set composition timeline and resize it to fit. This is why you should always set up the comp first, not just plonk your content onto the timeline: the comp will take its size from the media, and who knows what size it might end up, which then leads to rendering/processing problems.
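That resize-to-fit step is just a uniform scale, the smaller of the two axis ratios between comp and media. A quick back-of-the-envelope helper in plain Python (not an After Effects API; the 1080p defaults are an assumption):

```python
def fit_scale(media_w, media_h, comp_w=1920, comp_h=1080):
    """Uniform Scale value (percent) that fits the media inside the comp
    without cropping or distortion: take the smaller of the two axis
    ratios so that neither dimension overflows the frame."""
    return round(min(comp_w / media_w, comp_h / media_h) * 100, 2)
```

For example, a 5184 x 3456 DSLR still fits a 1080p comp at a scale of 31.25%, the height being the limiting axis.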

I like to use a garbage matte before applying the Keylight effect, as it cuts down how much green the effect has to process, and with my small setup I knew the corners were going to need taking out. So, although it's a laborious process, I stepped through all of the frames, altering the mask slightly to allow for model movement. It is lovely when you don't have to move it for a few frames!

Then I could move on to adding the Keylight 1.2 effect… It does a fab job, and this is where you can really see any shortfall in your green screen technique, and there were some very particular areas in this test! The best tips I would give are clipping the black and white points (in the settings area of the effect) and using the alpha preview to see exactly what is black and white. I had a bit of spill on both the box and the white clothing, which left parts of my characters slightly see-through, but a bit more subtle tweaking of the advanced black and white settings got it beautifully crisp.
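That black/white clipping can be sketched numerically. The following is a crude difference matte, not Keylight's actual maths; the green-ness measure and the threshold defaults are illustrative assumptions:

```python
import numpy as np

def green_matte(rgb, clip_black=0.2, clip_white=0.8):
    """Crude green-screen matte over a float RGB image in [0, 1].

    A pixel is 'green' to the extent that its green channel exceeds the
    brighter of red and blue; the matte (alpha) is the inverse of that.
    clip_black / clip_white mimic Keylight's Clip Black and Clip White
    controls: matte values at or below clip_black become fully
    transparent, at or above clip_white fully opaque, which is what
    crushes slightly see-through areas back to solid.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = np.clip(g - np.maximum(r, b), 0.0, 1.0)
    alpha = 1.0 - greenness
    # Rescale so the clip points map to 0 and 1, then clamp.
    alpha = (alpha - clip_black) / (clip_white - clip_black)
    return np.clip(alpha, 0.0, 1.0)
```

Pure green keys out completely, pure white stays fully opaque, and anything in between gets pushed towards one end by the clip points.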

Now to put a simple background in to see how it was all doing.

Et voilà! It's OK, and it's nice to see it in a situation away from green, or black. It was a really good exercise to go through before the next first-years' project: Dragonframe 4 is still as easy to use, and After Effects has many different and powerful ways of keying.

To add to the methods above, you could also: clone-stamp in AE to remove the pins, which I did a bit, though it makes a crazy number of layers; add some 3D lighting to perk up the character; colour-correct the background and animation to make them feel more cohesive; track eyes/features onto the characters in 3D space; and use layers more cleverly to give a sense of perspective.

However, what I wanted to do was give my little blue-box puppy a bit of life. I didn't want perfect, which would be my normal style (something in Illustrator with beautiful clarity of line); I was after a more Mr Messy feel, and although I don't use it a lot, I knew that TVPaint would give it a really nice, organic, free feel.

So I rendered a low-res copy out of AE and used this as a background layer in TVPaint, then got my Wacom tablet out and let my imagination go a little wild!

I can practically hear the excited slobbering doggy noises, so at some point, I will return to this project and add some sound…



Critical Evaluation


‘Reality Augmented Reality’

Tracey Tutt

Masters Project – Critical Evaluation

MA Moving Image and Sound 2014


For my final project I wanted to create an engaging experience utilising Augmented Reality to bring an extinct animal back to life in an interactive installation.

Through my research into immersive spaces, both historical (trompe l'oeil, the phantasmagoria, Pepper's ghost) and contemporary (the James May app at the Science Museum, the Digital Revolution show at the Barbican, Google Glass), it was clear that with each newly emerging technology, inventors and artists would seek different ways to use the newly discovered to make all-encompassing works.

Their aim was to delight in playing with the senses of those that came to see, to hold them captive by what they had produced, to amaze and wow the crowds with each new version of the spectacle.

Many of the Augmented Reality examples I have seen in my research (my preferred software platform, Aurasma, has been used by Harrods, the BBC, Disney and Universal) tend to be flippant games or marketing frippery, and every time I see them I'm disappointed that this fabulous way of personally interacting with people is so commercial.

My aim is to use all of my technical skills, and the plethora of new digital technology that is available, to make a beautiful, immersive and meaningful piece of interactive art, basing it on the very real problem of extinction.

The Great Bustard is still the heaviest living bird that can fly, but they died out in the UK in the 1830s. I saw the information label in the Norwich Castle Museum, where they house an impressive case of these large birds, and when I saw it I wanted to bring them out of the case and back to 'life' through the magic of today's technology.

The Great Bustard label from the Norwich Castle Museum

Although my virtual reintroduction of the bird to East Anglia only exists in this installation, there is an actual re-introduction of the Great Bustard on Salisbury Plain by a very small, but dedicated group.

Chapter 1 – Reality Augmented

In my previous self-negotiated unit I had created a purely digital, device-led experience using a 3D CGI version of a Great Bustard which I laboured over intensively, and although the feedback was really positive, I felt its focus was too narrow on its own. The intention had been for it to accompany a museum exhibit. As a digital artist I sometimes forget the very visceral stimuli of the real world. I also realised that if this were a live public project, I would be collaborating with professional 3D artists (not struggling with it myself), as the object (the CGI) would not be the end result.

Being an avid museum fan myself, I am always fascinated by being able to gaze upon an original artefact, and I realised that the interest in seeing the actual objects as well as interacting with them was part of the direction that my final project should take.

I needed to include a physical object within my installation, and the obvious choice was a model of a great bustard. Having a life size sculpture of the Great Bustard in my installation was important to give the physical context of this enormous bird. All of your senses are instantly engaged with its very presence and there was no better way to convey this than with an accurate model.

Although my installation is underpinned by technology, putting a physical object in my space will give it more meaning. Digital screens are all-pervasive, and we are so used to the architectural furniture of the 'flat black' that I wanted to ensure I brought a spatial dynamic back in, so that once more we become inquisitive about this new shape we have been presented with.

It will also draw the intrigued viewer closer, and when they choose to interact, to step forward, they will move unwittingly onto my pressure mat, which, once triggered, plays the animation of the Great Bustard taking off and flying across the wall space right in front of them. This gives the visitor a chance to be part of my virtual awakening and re-introduction.

Whilst looking around at what current practitioners are doing with art and technology I made sure I visited the Barbican exhibition, Digital Revolution.

“This immersive and interactive exhibition brings together for the first time a range of artists, filmmakers, architects, designers, musicians and game developers, all pushing the boundaries of their fields using digital media.” (Barbican press release 2014)

Minimaforms Petting Zoo at the Digital Revolution exhibition at the Barbican

Many of the pieces in the exhibition lacked the interactivity I had expected, and although beautiful or clever, with no interactivity apparent, the viewer would soon move on to the next piece. To illustrate the point: when I was at the Barbican, the Minimaforms Petting Zoo, with its fabulous digital pet snakes, was supposed to be interactive. I wanted to see it in action, but after three attempts, and watching many other visitors also try, the piece just didn't respond to those wanting to play, and I observed a lot of people's disappointment. The same was true of another piece, with beautifully machined pyramid-shaped mechanical instruments and cutting-edge graphics but no responsiveness to external stimuli, limiting its playfulness and appeal.

Making the sculpture interactive, actually responding to the person or people within the space, is of paramount importance to the success of the piece: when you can feel that you have an impact on a piece of art or sculpture, a passive event becomes an experience.

An Oxford dictionary definition of 'to interact' is "to act in such a way as to have an effect on each other". Because I chose to leave my sculpture plain white, so that its size is the foremost impression, I have also given myself a perfect surface to project my animation onto, again reinforcing the physical interaction and focus of the piece.

Reactions from those who have seen the sculpture so far have been intrigue, interest and a genuine request for a bit more information, usually along the lines of "is it life-size?" All of them seemed to agree it was a conversation starter, and they were fascinated to see it in situ with the animation running. This means I have a great starting point from which to talk about the very real subject of extinction and create digital art with meaning.

The animation itself came through a process of mark making and testing with different media and techniques. I wanted it to have texture and movement within the paper, I wanted it to be beautiful, not a stylized graphical illustration, which is my normal safe digital art practice. I wanted the art to feel as though it moved, to be bold and free, with high expressiveness of line capturing the movement of the physical action.

Showing the movement on one of my animation frames

To make the texture as exciting as possible, I discovered through more experimentation that using found feathers in the animation frames made marks so particular that I could not replicate, or better, them by hand. It seemed fitting to be using them to animate flight.

To truly be augmented art.

“No one could fault the advances in technology on display, but the art that has emerged out of that technology? Well, on this showing, too much of it seems gimmicky, weak and overly concerned with spectacle rather than meaning, or making a comment on our culture.” (Sooke 2014)

I also wanted to envelop the viewer with sound, and while researching the birds' previous habitats in Norfolk I came upon the impact of environmental issues: the Brecks used to be a vast heathland covering parts of Norfolk and Suffolk, but few truly wild pieces of this heathland remain. It is heavily protected and access is limited, but the audio recordings are a lovely reminder of what a summer's day can be, relaxing and peaceful, and they put the viewer exactly where the Great Bustard would have roamed wild.

Chapter 2 – Augmented Reality

I saw a group of museum innovators in Belgium (the meSCH project) inserting a mobile device into a wooden loupe/magnifying glass, which they gave to their visitors to view more in-depth information on any object with the little loupe symbol on its label.

Their research had shown that their first iteration of using digital devices to show more information, where they had simply handed visitors an unadorned iPad with all controls (volume, on/off, etc.) available, ended badly, as people were confused about which buttons to press, or not to press, on the devices they were given.

“Still, many visitors were reluctant to pick up an iPad. The installation did not have a clear interface and many people are not familiar with AR yet… when encountering a piece of technology they have used before outside of the museum space, visitors will try to use that technology like they have used it before “, (Van der Vaart 2014)

The digital 'Loupe' prototype from the meSCH project

Their second, 'loupe' approach helped overcome many of the obstacles that people perceive around digital devices: you held it in a natural way and used it to look at things just as you would a real magnifying glass. I wanted to use this 'soft' approach to having a digital device in my space, and because I was basing my project around the heaviest flying bird, a pair of binoculars seemed the natural choice.

I was lucky enough to get a short interview with David Waterhouse (Curator of the current ‘Wonder of Birds’ Exhibition at the Norwich Castle Museum) and I demoed my AR binoculars to him. He liked the intuitive form factor and could see a very real use for overlaying all sorts of information on exhibits with them, such as his future Mammoth project, to see what it would have looked like, and conversely to see the skeletal structure of existing taxidermy pieces.

David Waterhouse using my AR binoculars

He also found the use of the technology in your pocket (i.e. an iPhone) appealing, as it's personal and you can use it to find out more when you want to. This is exactly the point of using your own smartphone to discover extra layers, and it is the pleasure of exhibition technology that I am trying to communicate through my work.

“Those who run museums know that the people walking around their buildings are already spending an inordinate amount of time using their phones… So it only makes sense to find ways to turn phones into storytelling tools that can bring the inanimate to life. Or shift time. Or add layers of knowledge.” (Rieland, 2012)

Giving the visitors a variety of ‘artworks’ to look at through the binoculars presents an opportunity to show all of the different ways you could impart more knowledge or information, from simple overlaying of explanatory video, to interactive screens where you choose from a menu, and make your own decisions about what you would like to explore.

Chapter 3 – Reality Augmented Reality

Putting the physical and the virtual together into an interactive space to present a seamless experience has – at times – seemed a far too ambitious project for just one person.

However, many of the peripheral items that I have created, such as the sculpture, artwork and associated literature, already exist in most heritage or educational establishments.

My aim was to show the world that you can overlay these existing items in a non-destructive way. It would have been ideal to bring in a stuffed Great Bustard, but as there are (at the time of writing) only 14 living Great Bustards in the UK, on the Salisbury re-introduction site, they are too rare and special, and specimens are highly prized amongst collectors should one come onto the market.

I could also have exhibited a different set of drawings, but in this case – as in most exhibitions – the items on display should have a connection with each other.

With the audio soundscape, cohesive theme and relevant literature, I hope to engage viewers and pique their interest in the plight of the Great Bustards and the struggles that David Waters (Great Bustard Group founder) and his team face, but also to inform them by overlaying information on every item through Augmented Reality.

I wanted to prove it could be done, to show museum professionals and academics that if I were to work alongside them for future projects, we could engage the viewer on many more levels, learning would become more of an experience. Overlaying the existing exhibit with more content using the device in their pocket and appropriate extra information would enlighten the viewer there and then directly in front of the item they want to learn more about.


I have taken a contemporary issue (extinction of species) and devised a complex and immersive strategy for making the viewer of the installation consider the physical, visual, sonic and aesthetic loss that such extinction creates, filling the space where the bird should be with replacement sensory experiences.

This is the key to making my installation a success, encouraging people to take part, whether virtually, by accessing the extra layers of information in my printed items through Augmented Reality, or with their physical curiosity providing the reaction with my sculpture.

But the sculpture is not the outcome of my final piece, nor is the animation, the sound, or even the Augmented Reality art that people can take away; it's the layering of them all together, one over the other, over the other. It's proving that the interaction of these layers is where the future lies for storytelling in museums, art galleries and schools: not just through one medium, but through them all, and the power of interaction.

If we were all one-dimensional how boring would that be?


Barbican Press Release (2014) An Immersive Exhibition of Art, Design, Film, Music and Videogames.

Beck, J. (2004) Animation Art: From Pencil to Pixel, the World of Cartoon, Anime, and CGI. London: Harper Design.

Colson, R. (2007) The Fundamentals of Digital Art. Switzerland: AVA Publishing.

Dobson, T. (2006) The Film Work of Norman McLaren. Eastleigh: John Libbey Publishing.

Leslie, E. (2004) Hollywood Flatlands: Animation, Critical Theory and the Avant-Garde. London: Verso.

Mann, G. (2011) Flight Take-Off. Cast glass. Held at Norwich Castle Museum (acquired 2012).

Grau, O. (2007) Media Art Histories. Cambridge, MA: MIT Press.

Paul, C. (2008) Digital Art. London: Thames and Hudson (World of Art series).

Rieland, R. (2012) Augmented Reality Livens Up Museums. Accessed January 2014.

Rodgers, P. and Smyth, M. (2010) Digital Blur: Creative Practice at the Boundaries of Architecture, Design and Art. Oxford: Libri Publishing.

Rose, G. (2006) Visual Methodologies: An Introduction to the Interpretation of Visual Materials (2nd ed.). London: Sage.

Sooke, A. (2014) Digital Revolution, Barbican Centre, review: 'gimmicky'. Accessed June 2014.

Van der Vaart, M. (2014) Using Augmented Reality in the Museum. 23 April. Accessed 3 June 2014.

Xu, W. (2012) Drawing in the Digital Age. Indiana: Wiley-Blackwell.

McLaren, N. (1938) Love on the Wing. Accessed 14/03/2014.

Arctic Monkeys (2013) I Wanna Be Yours. Accessed 14/03/2014.

David Waterhouse, the wonder of birds


I was lucky enough to get a chance to meet with David Waterhouse and talk about his ‘Wonder of Birds’ exhibition currently running at the Norwich Castle Museum.

Wonder of Birds at the Norwich Castle Museum

I was interested to know why he had chosen the pieces he had and whether he had looked at more modern technology within this show, trying to gauge if my work would be suitable in this kind of environment.

It took him four years to curate the whole show, and he wanted to use new technology and computers as an extra way of interacting and layering information, but constraints on time and budget meant he really needed to concentrate on the pieces first and foremost. Not having an extra pair of hands, or a technologist to concentrate on that side of things, meant it didn't happen for this project.

But, up in the rotunda with the regimental museum section, they have introduced touchscreens to explain more of the exhibits that you can see. David told me that at one time they used to have museum interpreters working in the different sections and they would act as guides for the pieces, and they hoped that these touchscreens would be used in a similar way.

I then went on to show David my AR binoculars, which fascinated him, and we discussed which parts of my project could have real-world use; he felt the binoculars were the strongest element that could translate across into exhibitions.

David Waterhouse using my AR binoculars

He loved the fact that inside the shiny binoculars was just an old iPhone, which meant the experience was accessible to everyone via the device in their own pockets. He could see them being used for looking inside animals, seeing the skeleton over the stuffed animal, or seeing what it once would have looked like over the bones.

He also agreed with me that the recognisable form factor – binoculars – meant that you instinctively knew what to do with them, which was my hope!

David had also looked at the art of labelling and had read some research about the distance between an article and its label: the further the label is from the object, the more it disconnects the information. This made sense; if it took you a long time to find the corresponding text, you may well have lost interest or seen something else in the meantime!

I think this is where AR has a real bonus: you're right there, and so is the information…

Touchscreens in the Regimental Museum section at Norwich Castle

After our interview I went up to visit the screens in the rotunda. They look great, and there is a vast amount of information on them, beautifully presented, but when I sat to observe people interacting with the space, everyone enjoyed looking at the objects in the glass cases, and stood and looked at the screens, but apart from children (and me) nobody touched them. One girl I saw quickly swiped back and forth over the timeline, but was called away by her mum to look in the case… The screens sit on a wall facing the objects, but of course you have to sit with your back to the objects to use them, and then you're sat right in front of a wall with a great photo on it, blown up to cover the entire wall, looking only at the screen…

The information on them is also quite dry: wonderfully detailed, in many different layers, but with no one to click on them… a shame… They have done exactly what David was referring to, and disassociated the information from the objects through physical distance.


Digital Revolution and the V&A


Digital Revolution at the Barbican until 14th September


I had desperately wanted to get down to London to see this digital show, but with deadlines looming this was the only date available. With college workshops shut on a Friday, the only things left for me to do were the writing and visiting this show, so with four hours on the train to concentrate on my critical evaluation, it seemed a perfect opportunity to marry the two.



The exhibition is broken up into sections.


Although it was fascinating to see all of the old technology, I had hoped for more from this section. I recognised quite a few games and consoles, such as an old Spectrum and the cream-coloured Macs with floppy disk drives, but it wasn't much of a revolution.

Quantel Paintbox, 1981, predecessor of the Wacom Tablet, revolutionised the way graphics were produced

My frame for the Johnny Cash Project

The We Create section was more what I was expecting: you could submit your art to the Johnny Cash Project, and interact with robotic birds made from recycled phones by contacting them on an old dial phone.


The information about Inception and Gravity was interesting, but the way they presented it, letting you access the behind-the-scenes layers of Inception, was of more interest to me. They looked to be using Leap Motion… a wonderful little device which can track five fingers of movement in 3D space. The transition through the layers of information was really fluid and very responsive, and made it very easy to jump in and use.

Using the Leap Motion to scroll through the layers used in the making of Inception

Using one of the oldest illusions in the world, the inverted shape, to give that 3D effect

The project was OK; it was just a platform with fancy animatronics to control the individually designed pyramid instruments, and the cleverest part was the use of the inverted shape to give the illusion that his eyes and face were following you around the room, but it's a very old trick.


Chris Milk's 'State of Play' is a really impressive interactive art piece; this is exactly what I expected from the Digital Revolution show, and it looked spectacular. The movement was fluid, and although it all happens quite quickly, you really engage with your shadow and what happens to it across the three stages. Very reactive, and fully immersive in the massive space.

Dev Art was full of more quirky pieces; I wasn't sure whether I was contributing to the art for some of them, but the keyboard radio was quite fascinating.

Dev Art area

Digital Futures included Lady Gaga's dress and a skirt you could put pre-generated LED light images onto (iMiniSkirt).

Again, the indie games section was interesting, but not what I'd call innovative.

Minimaforms Petting Zoo

The Minimaforms Petting Zoo was only disappointing because I didn't see a single person successfully interact with the creatures; they looked cool though.



Umbrellium was a trance experience in a smoky underbelly space and felt like being at the end of a quiet rave. Viewed through the plexiglass window while we had our pep talk, it looked like a zombie movie: people entranced, moving slowly about with their arms outstretched to the light.

Marshmallow Laser Feast forest

The Marshmallow Laser Feast tree installation was a work on an immense scale, and it looked amazing. Gently moving through the 'trees' and giving each trunk a good push made pleasing tones, and I really enjoyed watching the laser lights on the roof dance about alongside their relaxing notes.

Overall I was slightly disappointed with the Barbican show, but on the other hand I was very interested to see that my peer Andy Logie's art and sound piece would fit straight in, and with a few tweaks, so would mine.

Andy had his Forum exhibition on Thursday and it was brilliant. It worked well, looked fantastic on the enormous screens they have in the Fusion screen at the Forum, and I thoroughly enjoyed interacting with his piece.

Andy Logie's piece 'Bound'

Although it made me think a little more about mine: would my piece be as engaging? It's a very quick shot, firing the flight animation; will it hold the viewer for more than a moment? How do I get across the meaning behind it, i.e. this is what AR could do for you, and you already have the device in your pocket!

The V&A palindrome sign

I managed to squeeze in a quick dash to the V&A to see their interactive tables…

Interactive material tables in the furniture section at the V&A

The V&A furniture collection has introduced touchscreens with information beside the objects, but they are just so dry, very similar to the screens at Norwich Museum; even though they are right by the object, they feel strangely disconnected and are uninteresting to click on.

The materials interactive tables are also disappointing. Although you have the added interest of tactility, with samples of the different materials scattered around the table's edge, the content that comes up is just like a page from the internet, and again it's a very dry way of interacting.

Different media/materials are on the outside of the table

The way it functions is also slightly awkward, as you need to hold your hand over the little hole in each piece of wood or metal sample. If you remove your hand before the content has loaded, it can stall and disappear; conversely, if you do want to read the other pages, hovering over the object for the pre-determined amount of time feels like an eternity. I would like to have seen the first page come up much quicker, and then to control the speed, and which page I am viewing, with the more intuitive hand swipes and gestures that we are used to using.

The holes which you need to cover in order for the interactivity to work

It’s a very large area for not much happening.

Rapid Response Collection at the V&A

However, the Rapid Response collecting area, which I stumbled upon, was a really pleasant surprise.


“The museum collected the objects in this gallery in direct response to important moments in the recent history of design and manufacturing”

Flappy bird and the nude shoe

An eclectic collection of a dozen objects, including the app 'Flappy Bird', a wearable terminal and an Oculus Rift headset.

Oculus Rift in the Rapid Response collection at the V&A

Great to see such an established museum building a collection out of headline tech and social change.

Disobedient Objects

Disobedient Objects is one of the featured shows at the V&A currently, and it was interesting to see this very politically motivated exhibition on one side of the beautiful reception area, with the classical statues in their grand space just opposite.




MAX msp and the pressure mat switch – the hard part #maxmsp #pressuremat

1 Comment

After successfully wiring up the pressure mat and feeling very pleased as I have never wired anything, ever…

It was over to MAX msp. I dutifully read through the tutorials and they are great: Max comes with at least 20 or so basic tutorials, and these all have working 'patches' – files that you can open, look at, play with and alter to get a feel for making your own and to understand the syntax it uses.

There seemed to be many people using MAX for audio, video and interactive projects; the forums were active and lively, and I felt confident enough to start my own patch.

What I needed to program was a starting video that plays and loops until the pressure mat switch is activated, at which point a second video plays through to its end, then playback returns to the original, still-looping video.

Doesn’t sound too difficult does it, but I have been testing and trying on and off from the beginning of July to try and get this to work…

I have googled with as many different search terms as you like, trying to see if anyone else has ever wired a pressure mat directly into a computer and MAX msp, but there is no one out there – or at least no one has ever written about doing it successfully. Last night I'd almost decided it wasn't possible: I couldn't get the video to loop and return, the pressure mat wasn't working, and my idea for a simple interaction was looking doomed.

I had been able to find a number of people using an Arduino to interface with MAX, but at this point I didn't want to start with another purchase and more software to learn!

I also had doubts about the pressure mat itself, it had come with 4 wires… which ones made the circuit? No paperwork came in the box, I guess you’re supposed to know what you’re doing…

And of course I had wired it to the plug before checking the live circuit.

I had to find an expert… fortunately Phil, one of the MRC technicians, was among the people who had recommended MAX for interactivity in the first place, so with promises of coffee and/or cake (Earl Grey, black, for future reference!) I managed to get an hour with him.

We started with the patch that my husband and I had been co-working on the night before (I'd even roped him in too – him more than me, as I was about ready to give up at this point). Phil was kind enough to say we were on the right lines, but that we needed to input the videos differently, using a bang or a button rather than reading in the file to loop or play it. Then he tackled the return to the original video: we had one video switching to another on a click, just not by reading the end of the video to trigger the return to video 1.

We had looked at the delay function, but Phil suggested using the pipe command. We had calculated the length of the clip with 'length', but this gave an odd number that, when worked into the pipe function, returned to the original video – yay! – but before the end of the clip had actually played. Phil set about working out the fps and milliseconds needed, as the 'length' value of 2720 was obviously wrong. While he was looking through the reference material, I spotted a 'duration' function listed as returning milliseconds – just what we wanted – and when he put it into the patch, it worked!
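Outside Max, the behaviour we ended up with is really just a tiny state machine: video 1 loops until the mat fires, video 2 plays once, and a timer equal to its duration in milliseconds (like the pipe delay fed by 'duration') switches back to the loop. Here is a hedged Python sketch of that logic – purely an illustration, not the actual Max patch, and the state and event names are my own:

```python
def next_state(state, event):
    """Return the playback state after an event.

    States: 'loop'  - video 1 looping
            'play2' - video 2 playing through once
    Events: 'mat'   - pressure mat stepped on
            'end2'  - video 2 reached its end (fired after its
                      duration in ms, like Max's pipe delay)
    """
    if state == "loop" and event == "mat":
        return "play2"          # mat trigger starts video 2
    if state == "play2" and event == "end2":
        return "loop"           # end of clip returns to the loop
    return state                # anything else is ignored


# walk through the sequence described above
state = "loop"
state = next_state(state, "mat")    # step on the mat: video 2 plays
state = next_state(state, "mat")    # stepping again changes nothing
state = next_state(state, "end2")   # clip ends: back to the loop
```

The key point Phil spotted is that the return has to be driven by the clip's real duration, not by a user action.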

This was amazing, I don’t think Phil will realise just how brilliant it was to see this working, for me…

With that working, he turned to the pressure mat. Of course, the first thing he did was check which of the four wires made the live loop… it wasn't the pair I had hooked up!

So if you ever buy one of these pressure mats from Maplin: the active wires that make the contact loop are the two on the inside of the mat. Mine actually came with a bit of the plastic casing missing, but no diagram, so here's one I made earlier!


The red wires make the active loop.

I connected the right wires up to the extended wires and we plugged the now working jack back into the microphone socket.

He put the adc~ object into the patch, but it didn't register anything, so we looked at the audio input options. Here we found the input wasn't on and wasn't defaulting to the correct device; after a bit of jiggery-pokery with the audio-in on my laptop's control panel we got a signal.

Phil had put what looked like a volt meter into MAX so we could read the base voltage and see how it changed when the mat was stepped on, then added a greater-than value which would activate the change in state. This worked well, but if you stepped off before the video finished it would return to the looping video – not good, as I wanted the whole video to play – so Phil added a 'gate' which closed the activation while the video was playing.
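The threshold-plus-gate chain can also be sketched in Python – again just an illustration of the logic, not the real adc~ or gate objects, and the class and names are made up:

```python
class MatTrigger:
    """Sketch of the mat signal chain: threshold test plus a gate
    that stays closed while the second video is playing."""

    def __init__(self, threshold):
        self.threshold = threshold  # like the greater-than value in the patch
        self.gate_open = True       # like the 'gate' object Phil added

    def sample(self, level):
        """Feed one input level; return True if it fires the video."""
        if self.gate_open and level > self.threshold:
            self.gate_open = False  # close the gate during playback
            return True
        return False

    def video_finished(self):
        self.gate_open = True       # re-arm once the clip has ended


mat = MatTrigger(threshold=0.5)
# stepping on (0.8) fires once; the next reading is gated out
fired = [mat.sample(v) for v in (0.1, 0.8, 0.9)]
mat.video_finished()                # end of clip re-opens the gate
```

Without the gate, stepping off the mat mid-clip would drop the signal below the threshold and cut the video short, which is exactly the problem Phil's gate solved.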

To put it bluntly, Phil is awesome, and it all now works – bar the fullscreen, which I will sort out when it's on the Mac I'll use for the show, as that differs between Macs and PCs (I'm working on a PC for these tests).

So just in case you ever fancy doing something like this yourself, here is a screenshot of the patch!


and here it is working!!!!

USB lighting for Digital space

Leave a comment

I will be mounting two larger pieces of AR art on the wall as the initial focus for the participant, but many more will be available in the form of postcards, business cards and a demo booklet, which I am inviting people to buy for the very modest sum of £2.50 – to take my art away with them and play at home!

To display the two larger artworks, though, I need to source lighting for that side of my installation, as it is split into two sides: the 'reality augmented' side, where my sculpture and projection mapping will take place, and the 'augmented reality' corner, where my binoculars will sit on a plinth with my printed documents and the two artworks.

I have gone down the route of USB-powered LED lighting, as I need lights with a low heat output – they will be on for 7–8 hours – and constantly powered rather than relying on batteries.

USB Plug In 5LED Light

I found a 5-LED light stick with a positionable neck, but needed to extend the tiny USB lead, as its primary use is with a computer or laptop. With USB leads you do have to make sure they are under 5 metres long, or you get reduced power output and need a repeater cable; fortunately I only needed 2 metres, so was pretty sure it would be okay.

I also wanted to source a dual USB plug to reduce the number of sockets needed, which seemed sensible as USB lights are only low voltage. Masterplug do a reliable version with the added bonus of a through socket, so I could actually power all of my plinth's needs (2 USB sockets and an ordinary plug socket) through this one plug – I may still need that option!


I wanted to test that this light source was bright enough, and that with everything plugged in there was no reduction in power in situ, so I took all the cables into my space earlier…


So, in this picture we have the two LED stick lights, each plugged into its own 3-metre USB male-to-female extension, plugged into the dual USB power socket of the Masterplug, in the exact position they will occupy in my installation space.

I then stuck a test image onto the wall with Blu-Tack and attached one of the lights to the wall, to test whether the light was strong enough for my phone to trigger the Aurasma content in this low-light situation.

and it was…

The low light in this corner also did not spill over into the other area of my installation where my projection will be.



3D Binocular Augmented Reality Viewer… done #augmentedreality #AR #aurasma

Leave a comment





Remember these?


The central divider, which turned them into a 3D viewer, was a sticking point for viewing Augmented Reality, so I had to find a way to remove it without breaking the rest of the plastic surround.

I was very disappointed to discover that the plastic they had used was in fact very strong – a craft knife wasn't even going to make a dent in the rigid structure.

I went down into the 3D workshop to see what tools they might have that could be of use. I thought a curved hacksaw blade might do the trick, but it just wouldn't work, as there would be no room to draw the blade back and forth any useful distance…

I then asked if they had any heavy-duty 'snips' – I remembered using tin snips in previous making ventures and them cutting tin well. Luckily Jim did have a pair, although he didn't think they would get through the thick plastic.

He gave it a go and they went through the plastic more easily than he thought they would! Brilliant… I sat down to do it myself and found it really wasn't as easy as Jim made it look; my feeble little hands struggled to make the snips cut any sort of distance, so I resorted to taking tiny little nibbles out of the middle divider.

This was still really hard, and of course it took three times as long – about an hour and a half to get down the full length of the binoculars, as the blisters on my fingers will attest!



Eventually, I reached a point where nothing of the divider was visible when looking through the eyepieces, so I turned to a large-handled rasp to file away all of the little ragged edges.
Pop in your iPod and hey presto – AR binoculars!

These will be used in my installation to demo my Augmented Reality booklet content and postcards.

I will be preloading the ipod with my own Aurasma channel – tracey tutt – so that all of my printed materials come to life when viewed through the AR Binoculars.

The idea behind making them binoculars comes from a desire to introduce viewing devices in a soft way. Rather than having an obvious iPhone or Android smartphone sat in front of you – which could confuse, frighten or just irritate the viewer – I wanted it to be simple: pick up a tactile object, binoculars, and do the natural thing with it, which is to look through the eyepieces.



Putting the Bustard in it’s place #greatbustard

Leave a comment


The Bustard sculpture is now practically finished… I could probably sand and re-plaster indefinitely, but I've decided to see how it looks in the space.

Taking it up the stairs is helped by it being light and still in two pieces.

I know the area I've been given and have a few options for its orientation.


On the right, it’s a good size in the space.


I borrowed a leftover label from the (now shut) BA show to place on the base to represent the label reproduction from the Norwich Museum…


I want visitors to lean into the sculpture to try and read the label, thereby triggering the animated projection which will cross the wall…

Looking at the space, I'm wondering if it's possible to animate over one wall and around the corner, to fill the walls a little more. Now that I see the sculpture in the space, I realise it's not going to take long to cover the one wall I originally thought of, purely because I'm working life-size…


This is what you could see on approach as they are building a wall on the left and another to the right, which the plinth represents…


So would people just peek in, or would they come into the space? They would need to come through it to reach the other installation, which makes a bit of a path through 'my' area – but I do need it triggered.


on the left



This side could make people look around the edge wall and then just move on. I can obviously mark up the mat with 'step on me' or similar, but it would be nice if it were a bit more unexpected and natural.

I need to know whether I'm getting a false flat wall put in over either existing wall, or whether I have to deal with the sockets, the radiator and the door with its glass window.

The space left behind by the new right-hand walls makes a lovely corner, with handy plug sockets for my Augmented Reality plinth, but it is totally bisected by foot traffic…


On the other hand, plenty of the spaces I saw in the BA show you had to walk through, so maybe I shouldn't worry.

I could maybe ask for the entrance to the 2nd installation to be at the other end, but that would leave a scarily big space!

The one thing I definitely do not notice when it is placed in the space are the lumps, bumps and plaster imperfections that I have been a little obsessed with while working close up in the 3D workshop.

It’s decided, no more sanding, move onto the next bit…




Smoothing dilemma – a real ‘head’ache

Leave a comment

Being able to leave the sculpture over a couple of days to really dry off is helpful, but the problems I face getting a reasonably smooth surface are quite apparent now it’s dried.





The head and neck area are the worst affected, and the photos of the face above were taken after I had spent a whole morning sanding the lumpiest bits. Sanding in turn brings out the fluff of the bandage – a real downside of the modroc sculpting method: if you don't get it smooth on application, sanding reveals the material. Had I applied traditional plaster instead, I would have been able to sand as much as I liked (well, down to the polystyrene former), but that would have made the model an awful lot heavier, and the thicker skin would have impacted on my original carving.

My next step will be to apply another layer of modroc along with a skim of plaster, trying to work a smooth surface as I go, now that I understand a little more of how the modroc behaves. The plaster skim needs to go on at the same time so that it adheres to the still-damp modroc surface.

I need to make sure I don’t apply too much plaster as this will totally cancel out the benefits and reasons that I used the modroc in the first place…


Getting plastered all over #greatbustard


After finishing the tail in plaster it was time to get on with the largest area, and what I hoped would be the easier section, the large flat sides of the Bustard.

Using slightly larger pieces of the modroc bandage I make my way across one of the sides, working quickly to try and get the smoothest result…

I turn the Bustard sculpture gently onto its side so that I can see both the bottom and the top, trying not to leave any nasty bits of unsmoothed bandage where I can't quite reach. It is pretty tricky, as the benches are high and the Bustard is big even on its side, and after taking all morning to cover it I have left a couple of dodgy edges where you can clearly see the bandage.



The ridges of the modroc itself are also still showing through and the occasional tiny ball of polystyrene is still managing to wreck any chances I have of a smooth finish. (images are hard to take in the plaster room because of the fluorescent lighting, so apologies for the stripy photos!)

One extra morning used up…

The next morning I get, it's onto the other side. I try to randomise the layering of the modroc as much as possible, but I'm still not able to smooth it as much as I'd hoped, and the little pieces of modroc string falling off, coupled with the polystyrene bits in the mix, are driving me crazy!

But I get the other side done and the underneath covered.






Time is up, but my Bustard sculpture is fully plastered!

It’s looking really nice coated in plaster, but I will be unable to present it in this state, so I will need to continue longer than I had hoped, on trying to get a more polished finish.

