Mike Robbins as Keynote Speaker at Festival de Cine Creative Commons Bogotá


On September 24th, Mike Robbins is the opening act at the Festival de Cine Creative Commons in Bogotá, Colombia. In addition to his keynote talk A dónde nos llevará la (r)evolución del cine y el audiovisual, which translates roughly to “Where the film and audiovisual (r)evolution will take us”, Mike will also present a Master Class on The Digital Environment and Film: Web Development and New Platforms.

Visit http://bogota.festivaldecine.cc/ to find out more about the upcoming Festival de Cine Creative Commons.

Webby Awards: Vote for A Short History of the Highrise!

It’s been a whirlwind two weeks for A Short History of the Highrise – landing the top honour for interactive documentary at the World Press Photo Multimedia Awards, then a Peabody Award (wow!), and now TWO nominations plus an honourable mention in the Webby Awards People’s Voice competition: Web – Best Use of Photography; Online Film & Video – Best Use of Interactive Video; and Web – Best Use of Video or Moving Image (Honouree).

Show your love for A Short History of the Highrise and VOTE NOW!

[Image: Webby Awards – A Short History of the Highrise]

A Short History of the Highrise Won a Peabody Award!

A Short History of the Highrise has landed another coveted award: as of this morning, the interactive documentary directed by Kat Cizek and produced by the NFB and NYTimes has been honoured with a Peabody Award!

The University of Georgia’s 73rd Annual Peabody Awards were handed out to a record 46 winners, chosen from almost 1,100 entries fighting it out for the title of best electronic media of 2013.

[Image: A Short History of the Highrise – NYTimes hero image]

Helios and the Tribeca Film Institute Hackathon Collide at CERN

Our fearless Creative Technologist, Mike Robbins, is off on an envy-inducing experience at the Tribeca Film Institute’s “Story Matter” Hackathon at the world-renowned CERN facility in Geneva, Switzerland. This is the first international summit in the Tribeca Hackathon series, and its goal is to fuse the disparate realms of new media stories with advanced technology and science.

[Photo: the Large Hadron Collider at CERN]

Photo: Let’s pretend that Mike is the one in the hard hat and he has security clearance to get that close to the Large Hadron Collider. 

We’ll have a full report on all things CERN from Mike when he returns from the front lines of next-level-mind-melting-non-linear-team-building-interactive-storytelling, so for now we’ll leave you with this enigmatic quote from the Tribeca Film Institute:

“Each team’s task is to broaden the field of interactive storytelling by collaborating to create non-linear media works that illuminate those often hidden stories behind science, within data, and on the forefront of discovery. The projects will be informed by the breadth and depth of CERN’s expertise in scientific disciplines, which include particle, nuclear and high-energy physics on topics ranging from the Standard Model to supersymmetry, as well as grid computing and experiments on such things as a possible link between cosmic rays and cloud formation.”

And in Mike’s own words:

“CERN is equal parts The Prisoner, Dharma Institute, Riven, and industrial park.”


Helios at SXSW 2014 – A long day’s journey into interactivity

[Image: SXSW]

Helios will be out on the front lines of the 2014 edition of the massive interactive scrum known as SXSW Interactive. Our intrepid UK producer, Sarah Arruda, will be there supporting OFFSHORE, a feature-length interactive documentary co-directed by Brenda Longfellow and Helios and nominated for the Interactive Award for Activism.

Ably assisted by UK filmmaker DK Cooper, Sarah will be reporting back on a regular basis on all the thrills, spills, wheels and deals to be found when you cram 30,000 geeks, entrepreneurs, dreamers and screamers together in the space of a couple of days. It’s a rough, grim job, but someone has to do it.

Vai com Deus, Sarah.

Follow us at @heliosdesignlabs

Design by Code: The Hive-Mind of Git

[Image: a beehive]

The word for today is: Git.

Like the increasingly ubiquitous battered MacBook, Git has become one of those iconic entities that straddle the divide between geek-chic fashion accessory and indispensable tool for the betterment of mankind. Even so, the average person in the street might not know what Git is, so here’s a quick explanation.

Git itself is a distributed version-control system: a way of tracking, sharing and merging changes to code. Its online face, GitHub, is a kind of social network that allows developers to communicate via the agency of shared code. The basic GitHub building block is the repository, or repo, which is a storage unit for code, media assets, instructions and examples. GitHub users can create a new repo, contribute to an existing repo, follow a cool repo, steal from a genius repo. GitHub is a collaborative tool, allowing many to work together in harmony, ideas to travel from upstairs to downstairs to the other side of the world, and concepts to mutate and evolve over time and use. It brings out the bee in us.

Illustrating the Git hive-mindset, here is one of our repos, built by one of our very own super-programmers, Iain Campbell. Born out of the requirements for immersive and interactive soundscapes in a number of current projects, the audio mixer repo contains a JavaScript code library that welds the Web Audio API into a web-based sound design platform.

With this audio mixer we can dynamically add multiple audio tracks to a web interface and control and animate their volume, when they play and stop, and their stereo position (left, right, and, for the stoner crowd, forward and backward) – and soon, real-time, non-destructive effects such as reverb, pitch-shift and delay. And for the other super-programmers out there, it plugs nicely into an Angular framework.
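
To give a flavour of the plumbing involved, here is a minimal sketch of the underlying Web Audio API moves – not our mixer repo’s actual interface, and the function names are ours for illustration only:

```javascript
// Minimal Web Audio sketch: one track with volume and 3D position.
// Illustrative only -- not the audio mixer repo's actual API.
const ctx = new AudioContext();

// Fetch and decode an audio file into a playable buffer.
async function loadTrack(url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  return ctx.decodeAudioData(arrayBuffer);
}

// Play a buffer through a gain node (volume) and a panner (position).
function playTrack(buffer, { volume = 1, x = 0, z = 0 } = {}) {
  const source = ctx.createBufferSource();
  source.buffer = buffer;

  const gain = ctx.createGain();
  gain.gain.value = volume;

  const panner = ctx.createPanner();
  panner.positionX.value = x; // left/right
  panner.positionZ.value = z; // forward/backward, for the stoner crowd

  source.connect(gain);
  gain.connect(panner);
  panner.connect(ctx.destination);
  source.start();

  return { source, gain, panner }; // handles for animating things later
}
```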

But here’s the thing: this code actively and openly incorporates the code of others. A lot of the core of our library is based on the work of Kevin Ennis, who maintains a great web audio mixer repo here. The animation library uses something called Tween.js, written and maintained by Soledad Penadés, which in turn draws on the fundamental easing algorithms of Robert Penner. Paul Irish has classic things to say about web-browser animation loops. And hopefully, our code can be of some use, somewhere, to some sweaty, desperate designer. We’ve panicked so you don’t have to.
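
Put together, fading a track in over two seconds with Tween.js inside a requestAnimationFrame loop might look something like this (a sketch that assumes the gain node from the snippet above, and that the Tween.js library is loaded):

```javascript
// Fade a track in over 2 seconds, easing per Robert Penner's curves.
// Assumes `gain` from the Web Audio sketch above and a loaded Tween.js.
const state = { volume: 0 };

new TWEEN.Tween(state)
  .to({ volume: 1 }, 2000)
  .easing(TWEEN.Easing.Quadratic.Out)
  .onUpdate(() => { gain.gain.value = state.volume; })
  .start();

// The classic Paul Irish-style animation loop that drives the tween.
function animate(time) {
  requestAnimationFrame(animate);
  TWEEN.update(time);
}
requestAnimationFrame(animate);
```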

The social nature of Git and GitHub underlines and supports the co-dependence of the creative and technical halves of a modern design and development ecosystem. Making code sharable shortens the path between concept and realization. Do Shorter means Do More, which ultimately (or hopefully) means Do Cooler Stuff.

Some other cool repos to check out (Git pun!):

Three.js (amazing WebGL 3D rendering engine and codebase)

Bitcore (tap into the Bitcoin network, create your own Bitcoin app)

Node (is the New Black of Backends)

Skyjack (autonomously seek out, hack, and wirelessly take full control over any other drones within wireless or flying distance, creating an army of zombie drones under your control)

Howler (a web audio library, quite similar to ours)

So get to it, people! The hive-mind awaits.


Design By Code: 17000 Islands and hot-house programming

[Image: the 17000 Islands Voronoi interface]

17000 Islands (17k) is an interactive experiment in documentary image-making, a collaboration between film-makers Thomas Østbye and Edwin, interactive producer Paramita Nath, and ourselves. Centered around Thomas and Edwin’s film explorations of Indonesia’s Taman Mini Theme Park, the interactive piece also serves as a hot-house environment for some of our more extreme code experiments.

17k enables people to assemble and edit video, audio and text content via a linear, timeline-based editor and/or a more abstract, Voronoi-based, very-non-linear, wtf-type of interface. Words pretty much fail to describe that part of the experience, but we can reasonably describe how these editing platforms plug into the real-time, server-based video rendering system we built for this project.

Our system depends on something called Node.js. In bluntly technical terms, Node is a platform for server-side JavaScript applications. Explaining exactly why Node is cool is beyond the scope of this article, but we can say that it provides an ideal conduit for communication between web browser and web server through a two-way messaging system called WebSockets. And why is this communication important? Well, it’s all about context and relations.
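
As a rough sketch of what that two-way channel can look like on the server side (this assumes Socket.IO, a popular WebSocket library for Node; the event names are invented for illustration and are not 17k’s actual protocol):

```javascript
// server.js -- minimal Node + Socket.IO sketch, illustrative only.
// npm install socket.io
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  // The browser pushes an edit description over the socket...
  socket.on('render', (edit) => {
    console.log('render request:', edit);
    // ...hand `edit` to the FFMPEG/Sox/ImageMagick pipeline here,
    // then stream progress back to the browser as the render runs:
    socket.emit('progress', { percent: 0 });
  });
});
```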

[Image: the 17000 Islands timeline editor]

Our aforementioned video editor lives within the user’s web browser, but it is editing files that reside on a remote web server. And it is not just files on this remote server, but also a set of programs that run independently of the user’s own computer, OS and installed applications. Much in the same way that Google Docs allows one to access word processing, spreadsheets and the like without installing MS Office, our use of Node allows access to video rendering without installing Final Cut.

Node collects user information such as which media elements have been selected, where they are placed on the timeline and what their in and out points are, and sends it to our server-based application, which in turn wields three Swiss-army tools: FFMPEG for video and sound, Sound eXchange (SoX for short) for sound mixing, and ImageMagick for text and still graphics. These programs have been around forever. They are amazingly powerful, and they are controlled via a geek-fest of command lines. Perfect for the job at hand!
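
On the browser side of that conversation, the outgoing message might carry something like the following. This is a hypothetical shape, with made-up filenames and fields, not 17k’s real format:

```javascript
// Browser side: a hypothetical edit description, illustrative only.
const socket = io('http://localhost:3000'); // Socket.IO client connection

const edit = {
  clips: [
    // timelineStart: where the clip sits; in/out: trim points (seconds)
    { src: 'clip_a.webm', timelineStart: 0.0, in: 2.0, out: 9.0 },
    { src: 'clip_b.webm', timelineStart: 7.0, in: 0.0, out: 12.0 }
  ],
  texts: [
    { body: '17000 islands', timelineStart: 1.0, duration: 4.0 }
  ]
};

socket.emit('render', edit); // off to the Node server for rendering
```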

[Image: a 17000 Islands render in progress]

Here’s the recipe:

Start with ImageMagick. This graphics library takes our text inputs and converts them into a set of transparent PNGs, one per text block. Apart from font management and text positioning, this step is relatively easy and straightforward. Set aside for later.
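
Driven from Node, that step could look roughly like this. The convert invocation is standard ImageMagick; the point size and filenames are placeholders:

```javascript
// Rasterize one text block into a transparent png with ImageMagick.
const { execSync } = require('child_process');

function renderTextPng(text, index) {
  // `label:` renders a string; `-background none` keeps it transparent.
  // (Real code would shell-escape `text` before interpolating it.)
  execSync(
    `convert -background none -fill white -pointsize 48 ` +
    `label:"${text}" text_${index}.png`
  );
}

renderTextPng('17000 islands', 0); // -> text_0.png, set aside for later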

Next, SoX. This one is a little more complex, as we have to mix multiple tracks of audio: sound from the video tracks, sound effects, recorded voice. The first step is to create a silent audio clip the length of the entire composition. The next is to trim each audio clip to its start and end points, and then do a mixdown of all the clips on top of the silent clip. Set aside for later.
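
In command-line terms, roughly (standard SoX invocations, with illustrative filenames and timings):

```javascript
// The three SoX moves: silence bed, trims, mixdown.
const { execSync } = require('child_process');

// 1. A stereo silent clip as long as the whole composition (say 30s).
execSync('sox -n -r 44100 -c 2 silence.wav trim 0 30');

// 2. Trim each clip: `trim <start> <duration>` in seconds.
execSync('sox narration.wav narration_cut.wav trim 2.5 6.5');

// 3. Mix everything down on top of the silence bed.
execSync('sox -m silence.wav narration_cut.wav mixdown.wav');
```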

And now it’s time for FFMPEG. This is a lot more complex still, as now we have to mix multiple tracks of video, fly in the text PNGs and place the audio. First we trim all the required video clips to size. Then we use FFMPEG’s overlay filter to mash all the sliced videos and text PNGs together, along with the final audio. FFMPEG can render out this composition at SD resolution on our creaky old Ubuntu server at around 30 frames per second (render times slow considerably if we output at HD, but we can still do it if required, and if whoever is rendering has patience). We can output to pretty much any format supported by the codecs available on the server. For now, it’s WebM.
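
And a sketch of that FFMPEG pass, using real options but an illustrative filter graph; the project’s actual shell script assembles these commands dynamically:

```javascript
// Trim, overlay a text png, add the mixdown, encode to webm.
const { execSync } = require('child_process');

// Trim: seek 2s into the source and keep 7s, copying streams (fast,
// but it cuts on keyframes; re-encode instead for frame accuracy).
execSync('ffmpeg -y -ss 2 -i clip_a.webm -t 7 -c copy clip_a_cut.webm');

// Composite text_0.png over the video from t=1s to t=5s, mux in the
// SoX mixdown, and encode to webm (VP8 video, Vorbis audio).
execSync(
  'ffmpeg -y -i clip_a_cut.webm -i text_0.png -i mixdown.wav ' +
  `-filter_complex "[0:v][1:v]overlay=0:0:enable='between(t,1,5)'[v]" ` +
  '-map "[v]" -map 2:a -c:v libvpx -c:a libvorbis out.webm'
);
```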

And here’s the shell script we built to handle all our command-line inputs. It’s specific to the 17k project, but if you’re interested in building your own server-side video rendering system, I’m sure it will save you some time scouring the internet!

If you have made it this far, massive props and all that jazz. As noted above, the codebase we’ve described is very specific to this project, but it’s not hard to see how the methods and processes it uses and exposes can be extrapolated to other instances. We’ve already started extrapolating, so stay tuned for the next installment of Design By Code, or How Helios Came To Love The Command Line.