This module has seen me beta test my Master's project: researching current uses of Augmented Reality, designing a trigger image/floor graphic, and creating a 3D model of a Great Bustard. Coupled together and processed through an Augmented Reality app platform, the final outcome gives the viewer the chance to see, up close and in 360 degrees, an animal that until recently was extinct on British soil, on any mobile device they have to hand.

Extra experience, extra understanding, and at the viewer's own pace. I want to help bring a very modern movement into the local museum, giving visitors a realistic sight-and-sound experience and a chance to interact with the virtual exhibit.

This is my own personal journey to bring more understanding and an extra layer of information to an extinct exhibit in a dry and dusty cabinet in the museum. From the first time I realised the Great Bustard no longer existed in the UK, I had a vision of bringing it alive.

In the initial stages of AR development I looked at Blippar and Layar, both big-money commercial players, but Aurasma has become my chosen Augmented Reality platform: it's free, easy to use, has an advanced action set, and I have been experimenting with it since its launch nearly two years ago.

There really is nothing out there quite like my intended finished Master's piece; the closest thing in existence that I have found is the James May app at the Science Museum. It uses similar features: a trigger image which, when triggered, makes 3D content appear through the app interface on a mobile device.

[Image: JamesMay]

In this case James May appears standing on top of the plinth over the trigger image and talks to you about the object you are standing next to. For me this is very much a vehicle for James May; the content is not great and relies on you being able to listen to what he is saying. At the time I tested the app I didn't have any headphones and consequently couldn't hear a single thing he said, so I had this fabulously modelled digital James May gesturing and opening and closing his mouth. I couldn't interact with him, no extra information appears on the screen, and there is no distinction between one trigger and another: they are all the same, and it relies on you telling the app the correct exhibit. With what I intend to create, you'll be able to read and see more content, learn the history, see related video clips, interact with the model, and get it to fly right around the display case. It could also very simply be expanded and rolled out with individual trigger images and new content for different exhibits. So although the James May app comes closest, it was a real missed opportunity to let the technology really shine.

Looking at other exhibits in the Science Museum and the Natural History Museum, and watching how people interact with the objects, I know I need to be very careful and clever about how I communicate and implement this technology. People look, maybe take a photo, and move on; school children might try to draw it, but most pass through quite quickly unless there's a button to press. I didn't see a single person accessing the James May app in the few hours I was there. Most people don't seem to know about it before they come, and they don't have the facility, patience or data allowance to download it on site, so it's tricky. When I met Daniel Brightman, the Interactive Designer at the Natural History Museum, he showed me their auditorium, in which each visitor is given their own device, preloaded with the right app and software, which at the appropriate moment in the presentation shows a 3D model.

When I talked with Dave Patten, Head of New Media at the Science Museum, he told me he would like to use more seamless interfaces such as Google Goggles and digital contact lens technology. Although those are a little further in the future, they are currently beta testing interactive glass on a display case, which overcomes a lot of obstacles and really could change the way people interact with exhibits.

Starting from scratch, I needed to create a trigger image, so I took the Norwich Castle Museum logo and combined it with the Great Bustard Group logo in Illustrator. This gave me a highly identifiable and hopefully intriguing icon/graphic from which to trigger my 3D Aura.

[Image: A3Trigger]

This icon would work in conjunction with advertising, leaflets, and promotional press on launch to create a cohesive visual campaign; this piece would constitute the floor graphic. On testing in situ, though, I found A3 to be far too small in front of the Great Bustard case, and I propose that if it were launched it should be at least A1.

[Image: TT_NCM2]

The next phase was to gather as much visual research for my 3D model as possible. As well as taking full advantage of the display in the Norwich Castle Museum and taking many observational photos, I have been in contact with the Great Bustard Group, who put me in touch with their professional photographer, Dave Kjaer. He has been cataloguing the project since the re-introduction in 2004 and has been kind enough to offer me full access to his entire back catalogue.

Creating 3D models in Autodesk Maya, a piece of software I have never used before, has been an intense and steep learning curve; just watching the basic tutorials took most of a week.

I gave myself the seemingly simple task of building the basic shape of the Great Bustard and animating its legs within Maya, but because the Great Bustard is an animal, all of the shapes had to be custom made to look organic enough. I did create a 2D animation of the Great Bustard's walk cycle in Flash, but it proved too tricky to translate into Maya to be useful; what would have been more worth my while is an onion-skin illustration of the movement.

From the photos and research images I produced two linear outline illustrations which, when placed into Maya as reference planes, gave me an accurate guide for starting out with the 3D shapes.

[Images: SS_IllustratorGBref, refplanes_maya]

Once I had learnt the controls, I found building the model to be relatively straightforward: not easy, but mostly successful. The tail needs more work, and at this point I don't feel confident enough to work on the wings, but I know that going through this process will enable me to make an improved model, given time.

[Image: GB_MAYA_tail_6]

When I took the static outcome to Jon Maxwell at the Museum, he seemed impressed with the technology working and my model appearing over the trigger image. But as he was looking at it I knew that it needed to move and offer more information, rather than just be a static model; otherwise it would be a little like the James May app: pretty, but pretty useless.

[Image: JM_NCM_2]

However, animating the model turned into a real problem for me. Incompatibility between the college and home PCs meant I was unable to use at home the files from college, on which I had created the lion's share of the Maya work; home was where I had the correct exporter for Aurasma, the next stage of this project. I had to take the decision to strip back my idea of using joints to animate in Maya, as the tutorials would have you do, so that I could test the rest of the process. It may just have been the differing file types and working on two very different computers, but being realistic about what I can achieve in the limited time for this module has let me produce a simpler model which proves the concept works in this initial phase of development.

Looking at what I have managed to make, I am really pleased with the outcome. I have had so many technical hurdles to get over: not just learning the industry-standard, and therefore very complex, 3D modelling and animation software, but then taking the result onto one of the most advanced Augmented Reality platforms, with all of its own requirements and compatibility problems.

Although my output appears small on the face of it, what has gone into it has been a mammoth effort. I have gained a good footing on the road of a lifelong desire, to learn Maya and 3D digital animation, and looking around at what is presently in use, I realise I am pushing the boundaries and testing processes and uses at the leading edge of this creative movement.

The knowledge I have gained putting my Master's project through this beta-testing stage is absolutely invaluable. I have made some great connections and am really excited by where my work could take me; the potential really is limitless.
