Creating 360 Video – on a budget? #360video


Since making my own Google Cardboard I have been looking at how I can put my own content into it.

My DIY Google Cardboard

The easiest and quickest way is to take a full 360° panorama photo (known as a photosphere) using the Google Camera app, which comes pre-installed on Google devices and can now be downloaded onto any Android device.

Google Camera App Screenshot

This makes for a quick and interesting image to view in the Google Cardboard viewer: select it from the list when you open the Cardboard app and it takes you straight to your most recent photosphere.

Viewing a fully immersive scene is very eerie, and it works especially well when you are not actually in that place. The first shot I took was of my office, and viewing it back on later occasions was quite disorientating: it was sunny on the day I shot it, so putting myself back into that moment on a cloudy day, and having it feel so lifelike, made 'coming out' of it quite impressive.

360 photosphere in my office

It's not perfect: I actually have two computers and screens on my desk, but because of the limitations of the camera, app and 360 capture, one portion has been overlapped so much that it appears I only have one. On the whole, though, it really does put you there, in the other space. The Google Cardboard photo viewer automatically works out the side-by-side binocular view, and the refresh rate when turning around is practically real time; I didn't notice any lag whatsoever.

When I showed this to my children (my testers for anything), they thought it was great and wanted to take photospheres in every room, so that they could sit in a different room and view the other room… I 360’d our kitchen and they then wanted to show all their friends that they could be ‘in the kitchen’ whilst anywhere else in the house. Great fun.

It's very effective, but I want to do 360 video.

So, investigating this, I came across a wonderful app (to be used with Google Cardboard) made by a very clever company called Jaunt, featuring a song with Paul McCartney. Not only have they produced 360 video that you can look all around, but the sound is 360 too… Put on a good pair of headphones, download this app and see the future of music videos!

https://play.google.com/store/apps/details?id=com.jauntvr.preview.mccartney&hl=en_GB
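I don't know how Jaunt have actually built their 360 audio, but the Web Audio API hints at the principle in a browser: a PannerNode places a sound in 3D space around a listener, and as the listener's orientation changes the sound appears to move around you. This is purely my own illustrative sketch of that idea, not anything taken from the Jaunt app:

// Rough sketch of '360 sound': position a source in 3D space and slowly
// rotate the listener, so with headphones on the tone drifts around you.
// This is only my own illustration of the principle, not Jaunt's method.
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

var panner = audioCtx.createPanner();
panner.setPosition(0, 0, -5); // the sound source sits a little way in front

var osc = audioCtx.createOscillator(); // stand-in for a real music track
osc.frequency.value = 330;
osc.connect(panner);
panner.connect(audioCtx.destination);
osc.start();

// Pretend the listener is slowly turning on the spot; in a real headset this
// orientation would come from the phone's sensors.
var angle = 0;
setInterval(function () {
  angle += 0.05;
  audioCtx.listener.setOrientation(
    Math.sin(angle), 0, -Math.cos(angle), // facing direction
    0, 1, 0                               // 'up' vector
  );
}, 50);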

But how would I replicate this, on a budget and without access to the latest gizmos…?

GoPano produce a large and expensive lens add-on for a pro camera, costing $500, but I have found they also do a micro version for just $29, which a UK company sells for £24 (http://www.red-door.co.uk/pages/productpages/gopano-micro-iphone.html). That's a great price and will help me experiment, but it only fits the iPhone 4, 4s and 5, none of which I have, and there seems to be no updated version, which leads me to believe it's either not very good or not very popular. Either way, I'd need to get my hands on a suitable phone pretty soon to trial it.

Fortunately, their purchase page links to users' own uploaded 360 videos. This one was my favourite (http://www.gopano.com/video/MTM2NzI): these guys have so much energy and are really having fun with 360 video, and because the camera stays in a fixed position it works well. This walkaround video from behind the scenes of a Red Bull bike day shows the problem with holding the device (http://www.gopano.com/video/MjE1MDk): look behind you and there's a giant arm! One more video shows a shortcoming of this lovely little device, which has big ideas but whose output I'm not sure is up to a high-quality standard. It's an acoustic music piece, and if you look around you can very clearly see dirty marks, or dinks, in the reflective surface, which greatly affect the quality of the video. It also seems a little soft, though I'm not sure whether that's because the iPhone models it works with don't have great video quality anyway: http://www.gopano.com/video/MjIzNTE

So for about £100 I could get some 360 video, but I would love to make it slightly interactive, as in the FiBrum VR Rollercoaster, where you have to concentrate your view on a red lever to make the ride start.

Fibrum Rollercoaster App Screenshot

I also don't know whether you can view this type of 360 video in Google Cardboard; this article has some good references:

Taken from http://www.chioka.in/tag/google-cardboard/

Devices that can capture 360-degree panoramas:

  • GoPano – A special lens attachable to iPhone that allows you to take panoramas and panoramic videos. It works by having a 360-degree lens that bends the light into the iPhone camera. Works for iPhone only. 360 degrees horizontally only.

  • Kogeto – A company manufacturing the Dot, Lucy, and Jo. They are successive versions of a special lens attachable to iPhone or Android to take panoramas and panoramic videos. Similar to GoPano, 360 degrees horizontally only.

  • BubbleScope – Another attachable lens to iPhone for capturing panoramas and panoramic videos. Similar to GoPano, 360 degrees horizontally only.

I need to follow up a few leads from these links, then look at software that can take in 360-degree video, make it interactive and output it in a form that Google Cardboard can use. It looks like it is possible: this firm make an SDK which is currently free to play with, though more reading is required in the depths of the small print to see which file types it is compatible with… http://www.panframe.com/
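I haven't got into the PanFrame SDK itself yet, so this is purely my own assumption about how these players generally work: the unwrapped (equirectangular) video is mapped onto the inside of a sphere and you look around from the centre. A minimal sketch using three.js (loaded separately via a script tag) rather than PanFrame, with the video file name just a placeholder:

// Wrap an equirectangular 360 video around the inside of a sphere and render
// it from the centre. A real Cardboard player would also render a side-by-side
// stereo view and drive the camera from the phone's orientation sensors.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var video = document.createElement('video');
video.src = '360-test.mp4'; // placeholder: whatever the unwrapped GoPano output ends up being called
video.loop = true;
video.muted = true;
video.play();

var texture = new THREE.VideoTexture(video);
var sphere = new THREE.Mesh(
  new THREE.SphereGeometry(500, 60, 40),
  new THREE.MeshBasicMaterial({ map: texture, side: THREE.BackSide }) // show the inside faces
);
scene.add(sphere);

(function animate() {
  requestAnimationFrame(animate);
  // camera rotation would be driven by dragging or device orientation in a real viewer
  renderer.render(scene, camera);
})();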

So yes, you can take 360 video on a small budget, with a few limitations, but it should be good enough to test my proof of concept for interactive live 360 video. When my GoPano Micro turns up, I'll report back.

I decided to go for the GoPano, as the video from the BubbleScope looks particularly poor at first glance and the Kogeto seems to be very proprietary; I haven't been able to find any of its video to view as yet. As an aside, both of these 360 add-ons sit flat against the phone, so you cannot get rid of the black box of the phone in the resulting video (see the BubbleScope still below).

BubbleScope still

 

Online learning – Begin Programming: Build Your First Mobile Game @futurelearn


I have been looking into available (and free) online courses for more in-depth programming and I came across this one by Future Learn.

Learn basic Java programming by developing a simple mobile game that you can run on your computer, Android phone, or tablet.

https://www.futurelearn.com/courses/begin-programming

This is only a seven-week course, originally run by Reading University, with videos, projects and quizzes.

I am playing catch-up. One of the benefits of online learning is that although I missed the original start date, I can still join while the course is running, but I am only up to week 4 and it has just got really complex!

The first week saw me struggling to download Android Studio and the Java development kit, plus finding a suitable Android device I could test on. You can use the built-in emulator, but it doesn't give you the tactile feedback, or the pleasure, of seeing the green screen and animated elements that you yourself have coded.

We've gone through code constructs, data types and variables, and conditional statements, and we are now starting arrays and loops. The content is good quality, if very dry, and the subject matter gets quite intense and serious quite quickly.

I am struggling with the exceptionally mathematical way this particular coding syntax is set out. I understand more than I can write myself, which is useful, and I can see where I've gone wrong in the code, such as the extra { that broke my game, but when let loose to add whole new sections it's hard. Still, I will get to the end and hope to gain more insight.

I am finding that I need more than the recommended three hours a week, but I can just about cope with that. The best thing about this way of learning is that I can always go back and restart, rewind the video, or try again tomorrow…

The FutureLearn forum for the course tries to encourage you to join in and share, but again, it feels flat; perhaps they could schedule a live Google Hangout session and get some real interaction going!

Musical Code


More and more intelligent and dynamic interactions can happen within today's browsers and networks. When I visited the Digital Revolution exhibition at the Barbican in London last year, there was an audio piece by Zach Lieberman called Play the World, where you could play international radio stations on a piano. Each key makes the system listen to radio stations around the world to find one playing that particular note, then feeds that station out to the speakers.

This mock-up is from Zach Lieberman's DevArt page, which has all of the information from the project.

Connections between live tweets and graphical interfaces have been around a long time (Visible Tweets, TweetBeam and more), but I discovered that those clever audio tinklers have also got tweets to play music!

Although The Listening Machine is no longer live, it has archived a few excerpts from different times of the day, and they make for interesting listening as they each have their own tempo and feel…

listeningmachine
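The Listening Machine's mapping is obviously far more sophisticated (and I only know it from the outside), but the basic trick of turning a stream of text into notes is only a few lines of Web Audio. In this toy sketch of my own, word length picks the pitch, so each tweet gets its own little melody:

// Toy 'tweets to music': schedule one oscillator note per word, with the
// pitch driven by word length. The mapping is entirely made up - it just
// shows the shape of the idea. Call playTweet() from a click handler if the
// browser refuses to start audio on its own.
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

function playTweet(text) {
  var words = text.split(/\s+/);
  var noteLength = 0.25; // seconds per word
  words.forEach(function (word, i) {
    var osc = audioCtx.createOscillator();
    var gain = audioCtx.createGain();
    // map word length onto roughly one octave above 220Hz
    osc.frequency.value = 220 * Math.pow(2, Math.min(word.length, 12) / 12);
    gain.gain.value = 0.2;
    osc.connect(gain);
    gain.connect(audioCtx.destination);
    var start = audioCtx.currentTime + i * noteLength;
    osc.start(start);
    osc.stop(start + noteLength * 0.9);
  });
}

playTweet('those clever audio tinklers have got tweets to play music');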

So with some clever coding you can interact with live comments. This leads me to the question: could you do this with a physical interaction, using something like a Kinect or a Leap Motion, so that instead of a physical key, a gesture controls the trigger?
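To sketch what a gesture 'key' might look like (purely hypothetical and untested, assuming the LeapJS library is on the page and the Leap service is running): a key-tap gesture in mid-air plays a note, with the pitch taken from where you tapped.

// A mid-air 'piano key': LeapJS reports keyTap gestures, and the Web Audio
// API plays the note. The position-to-pitch mapping is my own invention.
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

function playNote(frequency) {
  var osc = audioCtx.createOscillator();
  var gain = audioCtx.createGain();
  osc.frequency.value = frequency;
  osc.connect(gain);
  gain.connect(audioCtx.destination);
  gain.gain.setValueAtTime(0.3, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 0.5);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.5);
}

Leap.loop({ enableGestures: true }, function (frame) {
  frame.gestures.forEach(function (gesture) {
    if (gesture.type === 'keyTap') {
      var x = gesture.position[0];    // roughly -200..200 mm across the device
      playNote(220 + (x + 200) * 2);  // ~220Hz on the left up to ~1020Hz on the right
    }
  });
});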

found via @MetaMusical @ConversationEDU @olliebown – https://theconversation.com/explainer-interactive-composition-33594

The Power of Processing #processing #nordevcon #rumyra


Last Friday I went to NorDevCon (http://www.nordevcon.com/) and the most inspirational talk was Ruth John's, on using code to control web APIs. Ruth works for The Lab at O2, doing UX, design and front-end coding.

She embarked on a journey back into her VJing past, when she used tape and video (heavy on the Thundercats), and wondered whether she could recreate the entire experience using code loaded directly into a web browser.

She started with a 'simple' CSS animation and showed us the stages of development she went through, adding more functionality until she eventually had video dynamically loading in time to the beat of a music track (also dynamically loaded). She even had her browser reacting to noise input from us!

These web APIs are out there: people are animating, drawing and interacting within their browsers, and the APIs are being developed, improved and extended day by day.
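None of this is Ruth's actual code, but the 'browser reacting to noise' trick turns out to be surprisingly little Web Audio: grab the microphone, feed it through an AnalyserNode and use the level to drive something on the page (the 'beat-box' element here is just a placeholder div of my own):

// Make an element on the page pulse with the microphone level.
navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var source = audioCtx.createMediaStreamSource(stream);
  var analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  source.connect(analyser);

  var data = new Uint8Array(analyser.frequencyBinCount);
  var box = document.getElementById('beat-box'); // any element you want to react

  (function update() {
    requestAnimationFrame(update);
    analyser.getByteFrequencyData(data);
    var level = data.reduce(function (a, b) { return a + b; }, 0) / data.length;
    box.style.transform = 'scale(' + (1 + level / 128) + ')';
  })();
});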

Ruth John's slides from NorDevCon

http://rumyras-talks.herokuapp.com/web-vs-native-nordevcon/#/

You need to have a play with this… the animating beats are amazing and got me excited about more code interactivity.

Direct link to the fun demos – http://dancing.rumyra.com/

Ruth recommended this – The Web Audio API O’Reilly book by Boris Smus is free to read online!

I spoke to Ruth afterwards, asking whether she had tried making a Leap Motion interact with the browser, and she thought it would be possible; so not only can images and video load dynamically, they could also be controlled by gesture.

When I explained my interest and where I was in my research (right at the beginning), she recommended JavaScript as the language to get going with.

But alongside JavaScript for mobile and web interactivity, I wanted to look into Processing. It can also draw, animate and interact within your browser; I don't know whether the two are compatible, but I will endeavour to find out. Lots of things to play with and look at!
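As a first taste of the drawing side, here's the sort of thing Processing makes easy, written with p5.js (the JavaScript flavour of Processing) on the assumption that staying in JavaScript keeps it closest to the web APIs above: a trail of circles follows the mouse, sized by how fast you move it.

// A first Processing-style doodle in p5.js: setup() runs once, draw() runs
// every frame, and mouseX/mouseY give you interaction for free.
function setup() {
  createCanvas(640, 360);
  background(20);
  noStroke();
}

function draw() {
  // fade previous frames slightly so the trail slowly disappears
  fill(20, 20, 20, 10);
  rect(0, 0, width, height);

  // circle follows the mouse, sized by how fast it is moving
  var speed = dist(mouseX, mouseY, pmouseX, pmouseY);
  fill(200, 120, 255);
  ellipse(mouseX, mouseY, 10 + speed, 10 + speed);
}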

processing examples

https://processing.org/examples/

Books

Getting Started with Processing: A Hands-On Introduction

Form+Code in Design, Art, and Architecture (Design Briefs)

Interactive Code & Art #creativecoding


In all of the projects I have undertaken, my coding knowledge has helped me through: past experience of creative and fluid CSS design, JavaScript and Flash variables, and being able to dissect HTML and plugins. My biggest struggle was with Max MSP (https://cycling74.com/), as it was a totally new language and, as usual, I was trying to do something with it that had never been done before (link to my research on Max pages), but my determination drove me to hack, tweak and cheat until I got my desired result.

My Leap Motion testing also needs a specific language before I can start to develop my own content, and in the past I have used whatever came with the software that was recommended. So I wanted to look at programming from a different perspective: can it, in its purest form, create art and interaction, and how easy is that?

I found the video below, which covers a wide variety of coding for visualisation: Cinder, Processing and the RGBD Toolkit (which uses a Kinect to make amazing video effects), and shows the very powerful ways artists can harness code. But I don't want to just 'code' and have virtual art or sculptures; I want to make it more interactive and perhaps physical. Could it be plugged into live, reactive projects? How easy is it to translate this into a physical object, through 3D printing, manipulation and processing?

Can you use a Leap Motion, an AR experience or an Oculus Rift to help generate the code, and could it be wireless?

I’m going to systematically look through the beginnings of this type of code and see if I can apply it to my kind of art…

The text below comes directly from PBS Off Book

Programming plays a huge role in the world that surrounds us, and though its uses are often purely functional, there is a growing community of artists who use the language of code as their medium. Their work includes everything from computer generated art to elaborate interactive installations, all with the goal of expanding our sense of what is possible with digital tools. To simplify the coding process, several platforms and libraries have been assembled to allow coders to cut through the nitty-gritty of programming and focus on the creative aspects of the project. These platforms all share a strong open source philosophy that encourages growth and experimentation, creating a rich community of artists that share their strategies and work with unprecedented openness.

Articulate vs Captivate


I love playing with new technology, and with that comes an aptitude for testing new software. In my role as a Multimedia Developer I have been tasked with looking at new e-learning software for City College.

I have been assessing and researching two of the leading e-learning software brands, using the free 30-day trial of each. These were narrowed down from a shortlist of four or five (iSpring, Lectora, Elucidat) by talking to other professionals and through online research.

Although I have used Captivate 5 for a while now, I was keen to see what Captivate 8 had in store, and I had previously trialled the first version of Articulate, so I felt I had a good starting position for an evaluation for my company.

Articulate vs Captivate Comparison

Adobe Captivate import choices:

  • For a straight PowerPoint import you can check each slide you wish to bring in
  • All of the PowerPoint timings and bulleted text animations come in with it
  • You cannot edit the text in Captivate, but you can edit the linked PowerPoint file
  • PowerPoint opens in a pop-up window so that you can adjust text, etc.

Every mouse click that advanced the PowerPoint file has been kept, and the slides have their individual timings and reveals included, but nothing is editable within Captivate itself.

You can also open a PowerPoint file unlinked, but again, if you want to edit the text, images or timing, you can only do so in Captivate's 'PPT' pop-out edit area.

So although everything imports from PowerPoint beautifully, I still have to copy and paste the individual items if I want to make them interactive or correct a spelling mistake.

Articulate

The text and images are all immediately editable with Articulate.

With the separate layers you can easily see all of the individual elements. Some of the timings are present, and the audio has come through.

Working in Captivate

Editing images is a much simpler operation in Storyline, as all of the options appear on a right-click.

Once you have found the image editing section in Captivate, you find a more limited set of functions.

Conclusion

                                          Articulate   Captivate
Importing projects from PowerPoint             5            2
General ease of use                            4            3
Quiz questions and options                     4            4
Image editing                                  4            3
Recording & editing a screen simulation        4            3
Customisation                                  5            3
Output                                         4*           4*
Active online community                        5            2
PPT conversion time                          3 hrs        5 hrs

 

Captivate has a more limiting, PowerPoint-like feel to it: basic adjustments, more rigid boundaries and a harder learning curve, plus it lacks the same level of online community support.

Articulate has a better user interface, is quicker for putting simple quizzes together and making minor adjustments within them, and produces e-learning that looks better straight away without having to delve too deep into the settings; it is therefore quicker to pick up.

Another plus for Articulate is the very active and lively online community where they promote sharing of new templates and enhancements that will enable the user to focus on the learning design aspect rather than the software obstacles.

I have looked at many comparisons and reviews of Captivate and Articulate, and each platform has its devoted fans, leading me to believe that both are fully capable pieces of software. However, I found Articulate the quickest to get going with, providing the best import from PowerPoint and many ready-to-go, good-looking templates with varied uses.

In my evaluation and testing, Articulate also needed less time to produce a better-looking product, with a supplied PowerPoint as the starting point for both test projects.

Using Articulate it took me 3 hours to convert the project into a simple quiz and screen simulation whereas working with Captivate took me 5 hours.

Articulate had the edge in almost every aspect of my testing, and it is the software I would recommend.

 

This is my shortened report; if you would like to read the full report, you can view it here (Word document).

History Conversion Articulate vs Captivate Conclusion – TT

Google Cardboard – VR out of a box!


So at last I had enough time to put together the DIY Google Cardboard VR headset. I had previously purchased one off eBay, but it was so poorly cut out and made that I couldn't even fit an old iPod Touch into it. What it did get me, however, was the lenses and the NFC chip, which are actually quite tricky to get hold of.

wpid-dsc_0090.jpg

The first step was to cut out the paper printouts and make sure they would all fit onto my lovely bit of cardboard; I found that the regular corrugated stuff was not very usable.

wpid-dsc_0092.jpg

This is the lovely thin but firm cardboard I rescued from a Magimix box, just the right type of stuff.

wpid-dsc_0102.jpg

Unlike this rubbish that I bought from eBay…

wpid-dsc_0093.jpg

Tools needed included plenty of blades and my trusty scalpels…

wpid-dsc_0091.jpg

plus some good old spray mount; don't you just love how it covers everything in a fine mist of stickiness 🙂

wpid-dsc_0094.jpg

Safety ruler at the ready and I start with the complicated section to hold in the lenses.

wpid-dsc_0095.jpg

Straight sections are a breeze but the circular areas look impossible to get smooth.

wpid-dsc_0096.jpg

Looking pretty good, although it does take me 45 minutes to cut out all the fiddly bits; I am very pleased that I haven't lost my knife skills.

wpid-dsc_0099.jpg

Another hour sees all of the areas cut and ready for assembly. Unfortunately I don't have instructions for how it all fits together or which way round the lenses go, so a bit of YouTube surfing ensues…

wpid-dsc_0104.jpg

Lenses and NFC chip ready to go in, but where?

wpid-dsc_0105.jpg

This is where the NFC chip for Google Cardboard goes!

wpid-dsc_0107.jpg

Add a bit of double-sided tape to keep the lenses in position and squeeze it together.

wpid-dsc_0108.jpg

Put it into the cut-out slots and it all fits together nicely. Add a rubber band and some double-sided Velcro and it's finished, although there seems to be a fatal flaw: my phone can just slip out of either side. Hmm, I will have to look at an updated design for that bit…

 

Space in the side for my phone to slide out!

Unfortunately, because my box is not plain cardboard, it looks like I now have a Magimix VR food viewer, but hey, let's give it a go!

wpid-dsc_0112.jpg

wpid-dsc_0111.jpg

wpid-dsc_0109.jpg

The first experience I want to try with GC is the Paul McCartney and Jaunt 360 app that I have already downloaded onto my phone.

http://mashable.com/2014/11/20/paulvmccartney-vr-app/

When I tried this without the GC it was amazingly clever, as the sound moves around as you turn, and with good headphones on it is mightily impressive.

Then I had a look at what's available within the Google Cardboard app itself on the Play Store. It has a few things; one of the nicest was 'Windy Day', a cute little animated 360 film about a mouse with a big hat on a windy day. The funniest thing about this was that I was obviously facing the wrong way and didn't realise there was a character 'stood' behind me; I was just looking at the falling leaves!

The next demo was of sculptures that you could look all the way around: very nice, but not very immersive…

https://play.google.com/store/apps/details?id=com.google.samples.apps.cardboarddemo&hl=en

I then started searching for a 360 rollercoaster demo and plumped for FiBrum's offering, which, once I realised I needed to stare at the 'go' lever, was very cool; in fact I was almost glad when it finished. Very, very clever.

http://fibrum.ru/index_en.html

I also had a look at another offering from Jaunt, Kaiju Fury, which wasn't very inspiring, but there are lots of things out there to play with.

I will start looking at things from a slightly different angle with my newly built Google Cardboard goggles, and I need to put them together with my Leap Motion for some truly immersive visual feasts!

So yes, it was definitely worth the wait, and you cannot appreciate the experience without having a go, so I highly recommend making a pair for yourself. If you don't have three hours to put one together, I would buy an official version from one of the four big companies that sell them, such as DODOcase or Unofficial Cardboard; the link below takes you to the Google page that explains a little more and links to the makers' sites.

https://www.google.com/get/cardboard/get-cardboard.html

Give it a go!

Dubai 360 – interactive 360 degree timelapse experiences


Dubai 360

http://dubai360.com/

This amazing project turns 360 timelapse imagery into interactive experiences available from anywhere in the world (a project launched by Sheikh Hamdan back in August).

Fascinating…

Dubai360 was shot with four perfectly synchronized Canon 1Dx cameras.

Lenses on the cameras were Canon EF 8-15mm f4 L USM fisheye zooms.

Over 88,000 photographs were taken over the course of the 30 hours of shooting, with one set of photographs captured every 5 seconds. The numbers stack up: 30 hours at one set every 5 seconds is about 21,600 sets, and with four cameras that is roughly 86,400 frames. These photographs were then stitched using Kolor Autopano Video into around 22,000 separate panoramas to create the source frames for the video you can experience above.

And for the ultimate wow, look at this footage of Sheikh Hamdan standing on top of the Burj.

Leap Motion, a first look


There are a few different ways to use gesture as a controller, rather than a physical button-pressing one.

The Leap Motion is a lovely little device, and promises much.

 

 

Leap Motion

Leap Motion next to a pen so you can see the size

Leap Motion

I wanted to see if it could deliver on its claim of a new way of interacting with the world.

The first thing after unboxing was to have a play in the recently updated Leap Motion playground with some of the v2 apps.

As you can see from the video, it's amazing when it works: it intuitively takes your hand movements and interprets them into a 3D space where you can pick up and play with virtual objects.

The Leap Motion getting started zone

But almost as soon as I'm out of the 'playground' area I stumble over the recurring problem of coding the damn thing. Even before that, I have to choose my language… Where's the helpful button that says, 'Don't know which coding environment to use because it scares you witless? Click here and we'll help.'

I have no idea which development environment I’m going to be able to manage with, but I am always willing to have a look if I get a bit of help.

This is quite a common theme when trying to make art interactive: the code behind the technology is almost prohibitive, and I know from experience that you can go a long way down a complex coding track only to discover that it would actually have been better to do it a different way, in a different coding environment. Not being a coder, this is tricky. I envy the people at Aparna Rao's studio, as they have tech people who turn their ideas into reality by looking after the back end while they create…

But back to the Leap: I have to dive into the code, so I plump for the JavaScript option, hoping that my small amount of Flash scripting might help.
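The very first thing I'm trying is nothing more than the LeapJS 'hello world' of logging what the controller sees (assuming leap.js is included on the page and the Leap service is running); if numbers scroll past when I wave my hand about, the plumbing works.

// Log hand and finger data from every Leap Motion frame.
Leap.loop(function (frame) {
  if (frame.hands.length > 0) {
    var pos = frame.hands[0].palmPosition; // [x, y, z] in millimetres above the device
    console.log(
      'hands: ' + frame.hands.length +
      ', fingers: ' + frame.fingers.length +
      ', palm at ' + pos[0].toFixed(0) + ', ' + pos[1].toFixed(0) + ', ' + pos[2].toFixed(0)
    );
  }
});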

to be continued…

 

How not to place a pressure mat! fringe foul-up


The Undercroft is a great place to put my interactive sculpture, but I knew I needed to replace my pressure mat, as it had seemed quite worn out and unresponsive after my MA show. So I duly ordered a new one and, feeling pleased I'd been organised, put it in place and was gratified to see it work much more smoothly.

But on Monday night, when I was rewiring my sound (it had only been playing in mono, though that hadn't affected it too much in the echoey space), I noticed the new mat was not functioning properly; in fact I pretty much needed to jump on it to make it trigger the animation.

As I wondered what on earth had happened, I noticed a small tear in the cover and, placing my hand over it, discovered a sharp protrusion underneath. I looked under the cover in case a stone had got in (no), then lifted the pressure mat to find this!

wpid-dsc_0075.jpg

Blimey…

No wonder the mat was being unresponsive: this bit of piling, which they would have used to reinforce the concrete, had stabbed all the way through the mat and out of the cover as well… sheesh… so much for being organised and ordering a lovely new pressure sensor for the Norwich Fringe!

wpid-dsc_0079.jpg

look what it did!

wpid-dsc_0077.jpg

and out the other side…

What a proverbial pain in the rear end…

It's so disappointing, but this is why I want to look at more gesture-based control. I know that can still go wrong, but the physical mat doesn't take kindly to being used like this.

I ordered yet another mat from Maplin, which arrived this morning, so I was able to install it for today's exhibition; it's so pleasing when it works.

When I met up with Andy Logie the other day, we talked about the possibility of that type of control using a Kinect. Andy seemed to think it was doable, but we both agreed that the technical coding side of these things just drives us potty!
