Animation – green screen


Working at NUA as the Animation and Sound technician, my process test this week was to go through green screen from beginning to end.

Through this I would be able to test out the new Dragonframe, to see which features had been updated or changed, and make sure I stay up to date.

It would also let me make sure my green screen setup was as good as possible for an upcoming project with the first years, and brush up on using After Effects for the post-production.

So I grabbed one of our walkcycle armatures, borrowed some doll’s clothes from my children and went into the depths of Animation Studio 1.


Destined for stardom!

The key to green screen is to light the background and the foreground almost separately. Obviously, in the reduced space of an animation studio this is a little more difficult, as you can’t get much distance between the two, but two basic lights – a flo-light (floodlight) on the background and a kick light to pick the model out from it – are a good place to start.


A flo-light (floodlight) at the top to try and light the background evenly, then two dedolights and a kick from the back to try and distinguish foreground from background

As you can see, the result has harsh light from the spots – which you need – but it also casts hard shadows, and we want as few of those as possible, so the next step is to add diffusion to soften them.


The fabulous dedolights let you attach diffusion material (or gels) directly onto the barn doors with a tiny, easy-to-use clamp

This lessened the shadows and gave me a result I was fairly happy with, although in an ideal world the green screen would have two flo-lights on it for a more even spread.


Softer shadows with diffusion, but I did have to turn up the dedolight a little to compensate

Ready to film, I then turned to the new Dragonframe, and to be honest there’s not a lot of difference from version 3: the interface is slightly smarter, and for the students it will mean an easy transition to the latest version. That was a must, as we had new cameras (Canon 1300Ds) waiting to be installed that would only work with Dragonframe 4.

A short, jerky walk cycle later – it’s been a while – and I had my character in the middle of the stage, ready to react to a blue polystyrene box the students have been using, so that my armature (and the action) could stay centre frame.

Disaster struck at this point in the proceedings too…


ouch!

His ankle joint broke, but as with all good English actors, we carried on!

The resulting video is not my finest work: the clamp rig is really too big and heavy for this small armature character, and there’s a terrible jerk where his ankle breaks, but the reaction works well and I like the character the little blue box has… In my head it’s a very lively puppy that growls to stop my man in his tracks, then, once beckoned, turns into a slobbering excited mess when it gets a hug and a kiss…

It’s amazing what my imagination adds, now to see if I can add a little post-production magic to help anyone else see it too!

When using Dragonframe you can export either video or stills, but you must remember to conform your take if you want to discard any re-shot or deleted frames; otherwise, when you bring the image sequence into AE, it can pick up those dud frames.

Also make sure your frame rate is correct: again, if you lengthen or hold frames on the X-sheet, you will need to conform your take for those changes to take effect, so that your image sequence reflects your timed animation from Dragonframe.

Leaving the animation studio behind I headed up to the Media Lab to get started in After Effects.

Once you’ve set up a regular 1080p workspace and composition, bringing in an image sequence is really simple: click on your first image and After Effects will pick up all of the TIFFs in that folder, in sequence, and bring them in together as a single piece of media – which, for animation from Dragonframe, is exactly what you want.

Then drag this TIFF sequence down onto your pre-set composition timeline and resize it to fit – this is why you should always set up the comp first rather than just plonking your content onto the timeline, as the comp will otherwise take its size from the media, and who knows what size it might end up, which then leads to rendering/processing problems.
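
If you end up doing this regularly, the same import-and-resize steps can be scripted. This is just a rough ExtendScript sketch of the idea (File > Scripts in After Effects), not a polished tool – the file path, frame rate and comp settings below are placeholders for whatever your own project uses:

    // Rough sketch: import a TIFF sequence and drop it into a 1080p comp.
    var io = new ImportOptions(new File("~/renders/walkcycle_0001.tif")); // first frame of the sequence (placeholder path)
    io.sequence = true;                                   // treat the whole folder of TIFFs as one clip
    var footage = app.project.importFile(io);
    footage.mainSource.conformFrameRate = 12;             // conform to the rate you shot at in Dragonframe (example value)
    var comp = app.project.items.addComp("Walkcycle", 1920, 1080, 1.0, 10, 25); // set the comp up first
    var layer = comp.layers.add(footage);
    // Scale the layer to fit the comp, rather than letting the comp take its size from the media.
    layer.property("ADBE Transform Group").property("ADBE Scale").setValue([
        100 * comp.width / footage.width,
        100 * comp.height / footage.height
    ]);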

I like to use a garbage matte before applying the Keylight effect, as it cuts down how much green the effect has to process, and with my small setup I knew the corners were going to need taking out. So, although it’s a laborious process, I stepped through all of the frames, altering the mask slightly to allow for model movement. It is lovely when you don’t have to move it for a few frames!

Then I could move on to adding the Keylight 1.2 effect… it does a fab job, and this is where you can really see any shortfall in your green screen technique – and there were some very particular problem areas in this test! The best tips I would give are clipping the black and white points (in the Screen Matte settings of the effect) and using the alpha preview to see exactly what is black and what is white. I had a bit of spill on both the box and the white clothing which at first I couldn’t seem to sort out, leaving parts of my characters slightly see-through, but a bit more subtle tweaking of the black and white clips in the advanced settings got it beautifully crisp.

Now to put a simple background in to see how it was all doing.

Et voilà – it’s OK. It’s nice to see it in a setting away from green or black, and it was a really good exercise to go through before the next first-year project. Dragonframe 4 is still as easy to use, and After Effects has many different and powerful ways of keying.

To add to the techniques above, you could also: clone stamp in AE to remove the pins (which I did do a bit, but it creates a crazy number of layers); add some 3D lighting to perk up the character; colour correct the background and animation to make them feel more cohesive; and track eyes/features onto the characters in 3D space, using layers more cleverly to give a sense of perspective.

However, what I wanted to do was give my little blue-box puppy a bit of life – but I didn’t want perfection, which would be my normal style: something in Illustrator with beautiful clarity of line. I was after a more Mr Messy feel, and although I don’t use it a lot, I knew that TVPaint would give me a really nice, organic, free feel.

So I rendered a low-res copy out of AE and used it as a background layer in TVPaint, then got my Wacom tablet out and let my imagination go a little wild!

I can practically hear the excited slobbering doggy noises, so at some point, I will return to this project and add some sound…

tbc…


Playful politics in Photoshop


Couldn’t resist a quick play in Photoshop, recasting the image of Theresa May as the ultimate baddy!


Kodak Ektra – a perfect match?


Christmas is coming early!

I have been following the development of this new camera phone from Kodak, and today I got my pre-order notification.

I already have a fantastic camera on my phone (it’s a Sony Z5), but I really wanted to check out this innovative and rather cool-looking Kodak offering.

Pre-ordering gives me a free case – I chose the natty tan version – and it should arrive by mid-December. I will be giving it a thorough workout, as I have started a project cataloguing the seasons in my local park, which I just happen to pass through every morning.


https://www.flickr.com/photos/traceytutt/


Currently I am using my Sony phone or my portable, on-the-go Fuji X30, but I’m always on the lookout for what could be the perfect combination of snapper and phone.

Full specs don’t seem to be out for the Kodak yet – the physical dimensions in particular – but the headline numbers are in.

Spec details:

  • Camera: 21MP main camera with Kodak non-reflective lens coating, f/2.0 aperture, optical image stabilisation and autofocus; 13MP front-facing camera; phase detection autofocus (PDAF) and HDR imaging; 4K video capture.

  • Internals: Helio X20 deca-core processor, 3000mAh battery, 3GB RAM, 32GB memory (expandable with microSD cards).

I will be interested to see how it feels in the hand, being quite a bit thicker, and I’ve also gleaned that the internal camera may well be the same as my current Sony’s… watch this space…


Creative Coding Week 1 & 2 #creativecoding


FutureLearn have some fantastic free online courses (MOOCs), and I have been immersing myself in their Creative Coding one at every opportunity. Although I’m a couple of weeks behind the latest modules, I can learn at any time, so I’ll catch up or just finish at my own pace – therein lies the beauty and flexibility of the concept!

Use computer programming as a creative discipline to generate sounds, images, animations and more, with this free online course.

https://www.futurelearn.com/courses/creative-coding

Coding

In the first two weeks I have drawn my name in a lovely little interactive drawing canvas (see above), and we have been introduced to great interactive and digital artists using Processing and creative code within their art.
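
The course itself uses Processing, but the same sort of interactive drawing canvas can be sketched in a few lines of p5.js, Processing’s JavaScript cousin. This is my own rough equivalent of the idea, not the course code, and it assumes the p5.js library is loaded on the page:

    // A tiny drawing canvas: drag the mouse to draw, press 'c' to clear.
    function setup() {
      createCanvas(600, 400);
      background(240);
    }

    function draw() {
      if (mouseIsPressed) {
        stroke(20, 120, 80);                     // a suitably green pen
        strokeWeight(6);
        line(pmouseX, pmouseY, mouseX, mouseY);  // join the previous point to the current one
      }
    }

    function keyPressed() {
      if (key === 'c') background(240);          // wipe the canvas and start again
    }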

I am hoping that I can spot a link between these kinds of basic interactions and my own interactive sculptures or artworks, as I still want people to interact, not be passive, within galleries and museums.

Daniel Rozin particularly interests me, and one of his latest works is fascinating to watch…

A list from the course of artists, designers and researchers in interaction design.

Adventures in Zoetrope Animation


First things first: remember when you got your mathematics tin set at school, played with it all and thought ‘I’ll never need this…’? Well, if you want to have a go at making a zoetrope, you’re going to need to fire up your maths-student brain, find a compass and remember what pi is!

you will need a compass…

I am putting together a set of resources for an introduction to animation I will shortly be presenting, in line with the teaching course I am currently studying for, and I wanted the students to create an animation and understand keyframes, movement and persistence of vision (the effect by which our brains blend a rapid sequence of still images into continuous motion).

Now, in the short time I have, they won’t be able to create a full-blown animation, so I’ll be guiding them through a real staple of basic animation: a walk cycle consisting of just 12 frames, running cyclically.

I don’t have a fancy animation rostrum hooked up to a massive projector, but I do need to be able to show the class the results almost instantly, so I hit on the idea of putting those frames into a zoetrope viewing device so that they can all have a go and see what happens with the movement they create.

Cutting the base circle

I’m pretty handy with a scalpel, so I dug out some foamboard to make the basic structure of the zoetrope itself.

I started with the size of frame I wanted them to draw on, because I didn’t want it to be too small an area, then worked backwards, calculating a regular space in between each frame, and ended up with a strip 670mm long and 70mm tall.

This is where you need your pi and your compass: the 670mm strip has to wrap around the base circle, so it is the circumference. Divide it by pi to get the diameter of the circle you need for the base, halve that to get the radius, set your pair of compasses to it, draw your circle and cut!
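
To put numbers on that: 670mm ÷ π ≈ 213mm for the diameter of the base, so the compasses need setting to a radius of just under 107mm. (And with 12 frames around the strip, each frame plus its gap gets 670 ÷ 12 ≈ 56mm of the length.)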

Admittedly it didn’t quite fit on the first cut – I put this down to the very worn compasses I eventually found in my daughter’s room, under some books – but it was larger than it needed to be, so I re-trimmed a sliver and it fit!

Zoetrope and base now fitted after a little re-trim

Using a thin ribbon of double sided sticky tape around the bottom provided a good snug fit

The outside wall I added is 670mm x 140mm, laminated and cut, with the frames and slit marks printed on one side and solid black on the outside, so I have a register for my animation and a template for cutting the thin viewing slits.


You can see the slit holes have been cut out in this shot

I also needed to work out how to get it to spin. This was something I mulled over, looking at other approaches, but I didn’t have any ‘lazy Susan’ bearings as one guide suggested, and I didn’t like the twizzle-it-in-your-hand method often used in other how-tos.

Searching for another method, I looked around my desk for inspiration and found a DVD case – one of those for 100 discs, with a long spindle. Playing with it, I discovered that the discs, when spun, quite happily turned and kept moving fairly easily – aha! I had found a really cheap, easy way to make my zoetrope spin.

CD stuck to the bottom of the base

I stuck one disk to the bottom of the base and added a few padding layers of foam board to bring the height of the zoetrope up…

cutting extra padding discs of foamboard

The extra layers added to the spindle, and a free-turning CD on top which would spin against the one on the base

I experimented with having one or two extra CDs underneath, and found that two worked best, giving a smoother turn.

Finished and in testing with my walkcycle animation

All was working and it spun fairly well – I would like to improve this, but cost and time are against me. The last obstacle was that my line-drawn animation just didn’t show up when spun, whereas another 12-frame cycle I had, made of solid black shapes, worked really well – so out with the felt tips to colour mine in and hey presto… zoetrope resource… done!

Below you’ll find a link to my PDF templates so you can have a go too!

zoetrope template copy

Initial GoPano 360 video tests #360 #gopanomicro


So, I’ve bought my GoPano Micro adaptor (not many left!), conned my husband into ‘needing’ an iPhone 5 (which just happens to fit the GoPano Micro :)) and shot two pieces of 360 footage – nothing exciting, just me wandering round my house and outside – but I needed some test footage to move forward with…

This is a screengrab from the inside shot.

Screengrab from my GoPano 360 video test shoot

I love that when you’re viewing it online you can turn around and zoom with keyboard controls.

  1. Some problems are easy to spot before I can develop it further. First, I’m quite prominent in the frame, and staring at myself is not what I want to do – solution: test different ways of holding the camera and GoPano setup.
  2. The light needs to be good, as the phone auto-corrects as we move through the space and it doesn’t cope well with indoor lighting.
  3. Finally, quality – is it up to scratch? This can only really be tested when I go a step further and see whether I can view this video in my Google Cardboard VR glasses…

Lastly, can I take it into a programme and add interactivity?

What I really want is a piece where the viewer can drive the movement, similar to Fibrum’s Rollercoaster, where you start the ride by focusing your gaze on the ‘go’ lever within the environment.

A new addition to the recently updated Google Cardboard-compatible apps is ‘Titans of Space’.

Titans of Space® is a short guided tour of a few planets and stars, the point of which is to give the player a sense of scale of just how big these planets and stars are compared to each other.

In-game visual from Titans of Space

Again this uses the mechanic of a virtual crosshair: focusing your gaze on a trigger rewards you with a reaction in game – very clever stuff…

Ideally, I would be able to wander around the created environment just by turning my head, opening doors or entering corridors with this virtual crosshair as my controller.

So I need to trial different capture methods to minimise my presence in the resulting video, and test the video from the GoPano site with Google Cardboard to see if it’s compatible… I will report back soon!

Creating 360 Video – on a budget? #360video


Since making my own Google Cardboard I have been looking at how I can put my own content into it.

My DIY Google Cardboard

The easiest and quickest way is to take a full 360-degree panorama photo (known as a photosphere) using the Google Camera app, which comes pre-installed on Google devices and can now be downloaded onto any Android device.

Google Camera App Screenshot

This makes a really quick and interesting image to view in the Google Cardboard viewer: you select the photo viewer from the list when you open the Google Cardboard app, and it takes you straight to your last photosphere image.

It’s very eerie viewing a fully immersive scene – and it really works well – when you’re not in that particular place. The first shot I took was of my office, and when I viewed it back on different occasions I found it quite disorientating: the day I shot it was sunny, so putting myself back into that moment on a cloudy day, with it being so lifelike, made ‘coming out’ of it quite impressive.

360 photosphere in my office

It’s not perfect: I actually have two computers and screens on my desk, but because of the limitations of the camera/app/360 stitching, it appears I only have one, as a portion has been overlapped. On the whole, though, it really puts you there, in the other space. The Google Cardboard photo viewer automatically works out the side-by-side binocular view, and the refresh rate when turning around is practically real time – I didn’t notice any lag whatsoever.

When I showed this to my children (my testers for anything), they thought it was great and wanted to take photospheres in every room, so that they could sit in a different room and view the other room… I 360’d our kitchen and they then wanted to show all their friends that they could be ‘in the kitchen’ whilst anywhere else in the house. Great fun.

It’s very effective, but, I want to do 360 video.

So, investigating this, I came across a wonderful app (to be used with Google Cardboard) made by a very clever company called Jaunt, featuring a song by Paul McCartney: not only have they produced 360 video that you can look all around, but the sound is also 360… Put on a good pair of headphones, download this app and see the future of music videos!

https://play.google.com/store/apps/details?id=com.jauntvr.preview.mccartney&hl=en_GB

But how would I replicate this, on a budget and without access to the latest gizmos…?

GoPano produce a large and expensive lens add-on for pro cameras, costing $500, but I have found they also do a micro version for just $29, which a UK company sells for £24 (http://www.red-door.co.uk/pages/productpages/gopano-micro-iphone.html) – a great price that will help me experiment. However, it only fits the iPhone 4, 4s and 5, which I don’t have, and there seems to be no update planned, which leads me to believe it’s either not very good or not very popular; so I’d need to get my hands on a phone pretty soon to trial it.

Fortunately, their purchase page links to users’ own uploaded 360 videos. This one was my favourite: http://www.gopano.com/video/MTM2NzI – these guys have so much energy and are really having fun with the 360 video, and because it’s shot from a fixed position it works well. This video of a walkaround behind the scenes at a Red Bull bike day, http://www.gopano.com/video/MjE1MDk, shows the problem with holding the device – look behind you and there’s a giant arm! One more video shows a shortcoming of this lovely little device, which has big ideas but an output I’m not sure is up to a high-quality standard: it’s an acoustic music piece, and if you look around you can very clearly see dirty marks or dinks in the reflective surface, which greatly affect the quality of the video. The footage also seems a little soft, and I’m not sure whether that is because the iPhone versions it works with don’t have great video quality anyway… http://www.gopano.com/video/MjIzNTE

So, for about £100, I could get some 360 video, but I would love to make it slightly interactive, as in the Fibrum VR Rollercoaster, where to make the experience start you have to concentrate your gaze on a red lever.

Fibrum Rollercoaster App Screenshot

I also don’t know whether you can view this type of 360 video in Google Cardboard; this article has some good references.

Taken from http://www.chioka.in/tag/google-cardboard/

Devices that can capture 360-degree panoramas:

  • GoPano – A special lens attachable to an iPhone that allows you to take panoramas and panoramic videos. It works by having a 360-degree lens that bends the light into the iPhone camera. Works for iPhone only. 360 degrees horizontally only.

  • Kogeto – A company manufacturing the Dot, Lucy, and Jo. They are successive versions of a special lens attachable to an iPhone or Android device to take panoramas and panoramic videos. Similar to GoPano, 360 degrees horizontally only.

  • BubbleScope – Another attachable lens for the iPhone for capturing panoramas and panoramic videos. Similar to GoPano, 360 degrees horizontally only.

I need to follow up a few leads from these links, then look at software that can take in 360-degree video, make it interactive and output it in a form that Google Cardboard can use. It looks like it is possible: this firm makes an SDK which is currently free to play with, though more reading is required in the depths of the small print to see which file types it is compatible with… http://www.panframe.com/

So yes, you can take 360 video on a small budget, with a few limitations, but it should be good enough to test my proof of concept for interactive live 360 video. When my GoPano Micro turns up, I’ll report back.

I decided to go for the GoPano, as the video from the BubbleScope looks particularly poor at first glance and the Kogeto seems to be very proprietary – I haven’t been able to find any video to view as yet. As an aside, both of these 360 add-ons sit flat against the phone, so you cannot get rid of the black box of the phone in the resulting video – see the BubbleScope still below.

BubbleScope still

 

Online learning – Begin Programming: Build Your First Mobile Game @futurelearn


I have been looking into available (and free) online courses for more in-depth programming, and I came across this one from FutureLearn.

Learn basic Java programming by developing a simple mobile game that you can run on your computer, Android phone, or tablet.

https://www.futurelearn.com/courses/begin-programming

This is only a seven-week course, originally run by Reading University, with videos, projects and quizzes.

I am playing catch-up – one of the benefits of online learning is that although I missed the original start date, I can still join as long as the course is running – but I am only up to week 4 and it’s just got really complex!

The first week saw me struggle to download the Android development environment and the Java development kit, plus find a suitable Android device I could test on. You can use the built-in emulator, but it doesn’t give you the tactile feedback – or pleasure – of seeing the green screen and animated elements that you yourself have coded.

We’ve gone through code constructs, data types and variables, and conditional statements, and we are now starting arrays and loops. The content is good quality – very dry, but the subject matter does get intense and serious quite quickly.

I am struggling with the exceptionally mathematically-minded way this particular coding syntax is set out. I understand more than I can write myself, which is useful, and I can see where I’ve gone wrong in the code – such as the extra { that broke my game – but when let loose to add in whole new sections it’s hard. Still, I will get to the end and hope to gain more insight.

I am finding that I need more than the recommended 3 hours a week, but I can just about cope with that; the best thing about this way of learning is that I can always go back and restart, rewind the video, or try again tomorrow…

The FutureLearn forum for the course tries to encourage you to join in and share, but again it feels flat – perhaps they could schedule a live Google Hangout session and get some real interaction going!

Musical Code


More and more intelligent and dynamic interactions can happen within today’s browsers and networks. When I visited Digital Revolution at the Barbican in London last year, there was an audio piece by Zach Lieberman called Play the World, where you could play international radio stations on a piano: each key makes the system listen to radio around the world to find a station playing that particular note, then feeds that station to the speakers.

This mock-up is from Zach Lieberman’s DevArt page, which has all of the information from the project.

Connections between live tweets and graphic interfaces have been around a long time (Visible Tweets, TweetBeam and more), but I discovered that those clever audio tinkerers have also got tweets to play music!

Although The Listening Machine is no longer live, it has archived a few excerpts from different times of the day, and they make for interesting listening, as each has its own tempo and feel…


So, with some clever coding, you can interact with live comments. This leaves me with a question: could you do this with a physical interaction, using something like a Kinect or a Leap Motion, so that instead of a physical key, a gesture controls the trigger?
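
I haven’t tried it yet, but as a very rough sketch of the idea in JavaScript – standing in for a Kinect or Leap Motion with the pointer position, and using the Web Audio API so that ‘entering’ a trigger zone plays a note – it might look something like this:

    // Minimal sketch: a pretend gesture (the pointer crossing into the right-hand
    // fifth of the window) triggers a short Web Audio note. A real sensor would
    // supply hand positions instead of pointer events.
    const audio = new AudioContext();
    window.addEventListener('pointerdown', () => audio.resume(), { once: true }); // browsers need a tap/click before audio will play

    function playNote(frequency) {
      const osc = audio.createOscillator();
      const gain = audio.createGain();
      osc.frequency.value = frequency;
      gain.gain.value = 0.2;
      osc.connect(gain);
      gain.connect(audio.destination);
      osc.start();
      osc.stop(audio.currentTime + 0.3);   // a short blip
    }

    let armed = true;
    window.addEventListener('pointermove', (e) => {
      const inTriggerZone = e.clientX > window.innerWidth * 0.8;
      if (inTriggerZone && armed) {
        playNote(440);    // A4 - a real gesture could be mapped to pitch instead
        armed = false;    // don't retrigger until the pointer leaves the zone
      } else if (!inTriggerZone) {
        armed = true;
      }
    });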

found via @MetaMusical @ConversationEDU @olliebown – https://theconversation.com/explainer-interactive-composition-33594

The Power of Processing #processing #nordevcon #rumyra


Last Friday I went to NorDevCon (http://www.nordevcon.com/), and the most inspirational talk was Ruth John’s, on using code to control web APIs. Ruth works for The Lab at O2, doing UX, design and front-end coding.

She embarked on a journey back into her VJing past of tape and video (heavy on the Thundercats) and wondered whether she could recreate the entire experience using code loaded directly into a web browser.

She started with a ‘simple’ CSS animation and showed us the stages she went through, adding more functionality each time, until she eventually had video dynamically loading in time to the beat of a music track (also dynamically loaded) – she even had her browser reacting to noise input from us!

These web APIs are out there: people are animating, drawing and interacting with their browsers, and the APIs are being developed and improved, with new functionality added day by day.

Ruth John’s slides from NordevCon

http://rumyras-talks.herokuapp.com/web-vs-native-nordevcon/#/

You need to have a play with this… the animating beats are amazing and got me excited about more code interactivity.

Direct link to the fun demos – http://dancing.rumyra.com/

Ruth recommended this – The Web Audio API O’Reilly book by Boris Smus is free to read online!
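
To get a feel for how a browser can ‘listen to the room’ the way Ruth’s demo did, here is a minimal Web Audio sketch of my own (not her code): it grabs the microphone and boils the input down to a single loudness value per frame, which you could then use to drive CSS animations, canvas drawing or video cuts:

    // Ask for the microphone, analyse it, and react to the loudness each frame.
    const audio = new AudioContext();
    document.addEventListener('click', () => audio.resume(), { once: true }); // browsers may keep audio suspended until a click

    navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
      const source = audio.createMediaStreamSource(stream);
      const analyser = audio.createAnalyser();
      analyser.fftSize = 256;
      source.connect(analyser);                 // analyse only - nothing is routed to the speakers

      const bins = new Uint8Array(analyser.frequencyBinCount);

      function tick() {
        analyser.getByteFrequencyData(bins);
        const loudness = bins.reduce((a, b) => a + b, 0) / bins.length; // rough 0-255 average
        document.body.style.opacity = 0.5 + loudness / 512;            // crude visual reaction to noise
        requestAnimationFrame(tick);
      }
      tick();
    });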

I spoke to Ruth afterwards, asking whether she had tried connecting a Leap Motion to the browser, and she thought it would be possible – so not only can images and video load dynamically, they could also be controlled by gesture.

When I explained my interest and where I was in my research – right at the beginning – she recommended JavaScript as my language to get going with.

Alongside JavaScript for mobile/web interactivity, I also wanted to look into Processing; it too can draw, animate and interact within your browser. I don’t know whether the two are compatible, but I will endeavour to find out – lots of things to play with and look at!

processing examples

https://processing.org/examples/

Books

Getting Started with Processing

Form+Code in Design, Art, and Architecture
