New Media in Israel

Last Friday, Feb. 24, Helios creative technologist Mike Robbins was lucky enough to be in Tel Aviv as part of the Copro New Media Track, a daylong panel on digital storytelling, VR, volumetric filmmaking, and how to fund all this craziness. There is a lot going on here: a hothouse of creative energy, innovation and cool work. Looking forward to being back later this summer for the film festival itself.

In association with the Directors Guild, Editors Guild and the Israeli Documentary Filmmakers Forum.

Design by Code: WebVR, WebGL and Volumetric Imagery

The first in a series of posts about some of our recent experimental work with WebVR and WebGL, towards an idea of volumetric imagery.

Admixture: the action of adding an ingredient to something else.

Admixture describes much of the Helios work process. As visual designers, we have come to see design and code as something almost like a lens through which we can look into spaces, real or imagined, in ways potentially different from the ones we have come to expect.

Maybe it’s perversity, but we’ve always been interested in creating motion with still imagery, and depth with flatness.

Often this work starts with experiments, visual essays that in and of themselves have no narrative or meaning beyond abstractions like: “How can we make dimensional spaces with non-dimensional elements like still photos, particles, or even … math!!??”


Our attempts at answering questions like these might not pay the bills directly, an important consideration for a design studio like ours, but they do help us make sense of technologies that impact the way we design and tell stories.

An emerging technology for us (and for a lot of other people, for that matter) is WebVR. WebVR is a form of virtual reality that uses a web-browser, WebGL and JavaScript to provide an immersive experience across any number of platforms and formats, from the 2D “flat web” on a laptop or tablet, to Cardboard or Gear on mobile, to full stereo VR for those with an HTC Vive and a powerful gaming system.


This is our take on “Hello World”, a classic coding exercise used by programmers to demonstrate the most basic syntax and structure of a programming language. In our version, sheets of Perlin noise do nothing more than drift and collide and stretch off into the distance.
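
To give a flavour of what that looks like in practice, here is a minimal sketch in Three.js, not the project code itself: the “Perlin noise” is stood in for by layered sine waves, and everything else (sizes, counts, camera placement) is our own placeholder.

```javascript
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 1000);
camera.position.set(0, 5, 40);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Cheap stand-in for Perlin noise: a couple of layered sines is enough
// to see the drifting-sheet effect.
function pseudoNoise(x, y, t) {
  return Math.sin(x * 0.3 + t) * Math.cos(y * 0.2 + t * 0.7) * 2.0;
}

// One "sheet": a grid of points whose height we displace every frame.
const sheet = new THREE.PlaneGeometry(60, 60, 80, 80);
const points = new THREE.Points(
  sheet,
  new THREE.PointsMaterial({ color: 0xffffff, size: 0.15 })
);
points.rotation.x = -Math.PI / 2.2; // lay it down so it stretches into the distance
scene.add(points);

renderer.setAnimationLoop((time) => {
  const t = time * 0.001;
  const pos = sheet.attributes.position;
  for (let i = 0; i < pos.count; i++) {
    pos.setZ(i, pseudoNoise(pos.getX(i), pos.getY(i), t));
  }
  pos.needsUpdate = true;
  renderer.render(scene, camera);
});
```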

The experience as a whole might not be much in the story-telling department, but it does introduce the glimmerings of how light, patterns and movement can work together to create a sense of space, filled with volume.

Put on your Cardboard or Gear if you have one, or just your imagination if you don’t, and check it here.

Let’s get representational and see what happens when we start mixing sheets of photographic information into our clouds of Perlin noise.


Check it here.

Again, a simple experiment in space and volume created entirely out of flat images and particles. Somehow it seemed boring to just create a 3D box and place a code camera inside. This example uses trig functions to build a composite photo-and-particle mosaic around our code camera. As we shift our point of view, so shift the image planes, re-focussing on a central point behind the camera.
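
A hedged sketch of that trig idea, again in Three.js rather than the original code; the image files, radius and plane count are placeholders:

```javascript
import * as THREE from 'three';

// Assumes a camera and renderer set up as in the previous sketch,
// with the camera sitting at the origin.
const scene = new THREE.Scene();
const loader = new THREE.TextureLoader();
const RADIUS = 30; // distance of the image planes from the code camera
const COUNT = 12;  // number of photos in the mosaic ring

for (let i = 0; i < COUNT; i++) {
  const angle = (i / COUNT) * Math.PI * 2;
  const plane = new THREE.Mesh(
    new THREE.PlaneGeometry(10, 7),
    new THREE.MeshBasicMaterial({ map: loader.load(`photo-${i}.jpg`) }) // placeholder images
  );
  // Trig functions place each plane on a circle around the camera...
  plane.position.set(Math.cos(angle) * RADIUS, 0, Math.sin(angle) * RADIUS);
  plane.lookAt(0, 0, 0); // ...and turn it to face inward, toward the viewer
  scene.add(plane);
}
```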

From this still abstract, primitive and somewhat fractured reality, possibilities start to present themselves. There is a recognizable space one can move through, but how does that space become more recognizable?

One easy solution would be to use 360-degree imagery, in the form of panoramic spherical projections, either as stills or video. We’ve done this already in the OffshoreVR project, where we created a dark, ominous oil rig by mixing CGI imagery, 3D meshes and live-action film. Yet for all the immersion offered by the experience, there remains a sense of separation from one’s surroundings, of being inside a sphere (which of course the viewer is), of a lack of volume.

Another, not-so-easy solution would be to build an entire scene as a 3D model, but a well-known visual paradox arises in this direction: the harder one strives for reality, the less real the intended experience becomes.

Enter photogrammetry: the combination of photo imagery and particles, or rather, a process through which photo imagery becomes particles, and through which code places these particles, and the volume they create, onto a mobile screen.

That’s for another day, so stay tuned for the next post! Until then, a quick example of a point cloud made from 57 photos, an hour of rendering, and a dash of JavaScript.
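
For the curious, a hedged sketch of the “dash of JavaScript” part: once the photogrammetry software has rendered the photos into a point-cloud file, Three.js can draw every vertex as a coloured point. The file name and loader path are placeholders, not our production assets.

```javascript
import * as THREE from 'three';
import { PLYLoader } from 'three/examples/jsm/loaders/PLYLoader.js';

const scene = new THREE.Scene(); // camera/renderer as in the earlier sketches

// Load the photogrammetry output and draw every vertex as a coloured point.
new PLYLoader().load('point-cloud.ply', (geometry) => {
  const cloud = new THREE.Points(
    geometry,
    new THREE.PointsMaterial({ size: 0.01, vertexColors: true })
  );
  scene.add(cloud);
});
```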


Wrangling the Booker API for The Ten Spot

We recently launched a rebranded site for The Ten Spot, a beauty bar with locations across Canada. The new design rolled out on The Ten Spot’s tenth anniversary, trading their previous dark palette for a light, airy vibe.


At Helios, WordPress is our CMS of choice and—at The Ten Spot’s request—we created a child theme to piggyback on the popular Avada WordPress theme. Avada comes chock-full of customization settings that did a lot of the heavy lifting, so the bulk of our development energy went into The Ten Spot’s big ask: letting guests book appointments and buy gift cards through their website.

Previously, when a user hit the “book now” button, they were redirected to a third-party web app hosted by Booker.com. It worked, but The Ten Spot wanted a seamless look and feel. This meant coding up a custom front end that interacted with the Booker.com API.

The first step, before writing any code, was to submit a detailed workflow diagram to Booker. It had to illustrate each step in the UX (user experience), along with all the requests that our app would make to the Booker database. The PDF we submitted ended up being thirteen pages long and included everything from user login/signup to appointment confirmation to checkout for gift card purchases. That document made it clear that this section of the website was an entire app in itself.

About five minutes after we sent off the workflow document, we got an email back from Justin, our support rep, walking us through every single API call for every view. I don’t know how he was able to read our document, figure out which of the 400+ API methods to use where, and send back a detailed description for each step in that short a time. (Justin and I would get to know each other very well over the course of this project.) He also set up a test environment for The Ten Spot account. It was time to roll up our sleeves and dig in.

There were a lot of choices to be made about structure, languages, potential frameworks, etc. Ultimately, no extra libraries or plugins were added. We went with what WordPress has in its core and created a simple JavaScript front end that communicates with the server via AJAX.
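
To give a flavour of the approach, a hedged sketch of the front-end half (the action and method names below are placeholders, not the production code): the browser never talks to Booker directly, but posts to WordPress’s admin-ajax.php, where a PHP handler holds the API credentials and forwards the request.

```javascript
// Front-end helper: every Booker operation goes through a single WordPress
// AJAX endpoint, using the jQuery that WordPress already ships with.
// 'tenspot_booker' is a hypothetical wp_ajax action name, not the real one.
function bookerRequest(method, params) {
  return jQuery.post('/wp-admin/admin-ajax.php', {
    action: 'tenspot_booker',
    method: method,
    params: JSON.stringify(params),
  });
}

// e.g. fetch open appointment slots for a (hypothetical) location id:
// bookerRequest('FindAvailableTimeSlots', { locationId: 123 }).then(renderSlots);
```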

Working with the Booker API has its pros and cons.

On one hand, the documentation leaves a lot to be desired. The docs require a login that expires extremely quickly, so credentials need to be typed in pretty much every time a developer wants to look something up. Once the correct page is accessed, none of the query parameters have descriptions, which creates a lot of guesswork.

Also frustrating is that the Booker system requires users to be associated with a specific location. That is fine, but there is no way to query for the location a user selected. So to do certain tasks, like updating their password, the guest needs to remember and enter the location they used when they first signed up.

On the other hand, Booker support—always Justin—would respond to any question at lightning speed. It got to the point where I would just email as soon as I hit any snag. In any case, the 400+ API methods mentioned earlier do give clients the flexibility to integrate the Booker system in almost any way they would like. On thetenspot.com, guests are able to book, manage and cancel their appointments without leaving the site.

The launch went very well aside from a couple of DNS hiccups. The only issues we’ve run into since have been due to edge cases in legacy user accounts.

I think this project ended up being a bit bigger than expected, but we’re very happy with how it turned out.

Making Movies: Design and Code and the Web Browser


There are often several stories within a story, or about a story. Take the recently released interactive documentary The Deeper They Bury Me as an example. This is a 20-minute online piece featuring the words and thoughts of the late Herman Wallace. One could tell a story about the miscarriage of justice and racial inequality, or speak of the creative collaborations that have gone on over the years to bring this story to light, or we could talk about web-browsers.

The Web Browser?

Perhaps it’s not the sexiest thing in the world to talk about the role the web-browser plays in the interactive story-telling and design process, but let’s strap on our Nerd Goggles and go for it.

We (Helios) see the web-browser as a platform for making movies. Maybe not the kind of movies that you might see in the theatre (yet), but an experience that is cinematic all the same. These are definitely NOT web-sites. Web-sites have buttons and forms and data and users. The interactive movies we like to make have cameras, edits, pacing, directors, and, instead of users, an audience.

The Browser is a Visual Compositor.

The modern web-browser, i.e. one capable of dealing with a decent set of HTML5 APIs, is the Adobe After Effects of the internet, capable of real-time compositing. Granted, there’s not much of a graphical interface for the designer; in this case, it’s a good text editor, some nifty JavaScript libraries gleaned from GitHub, some mad Google search skills, and some like-minded nerds close at hand. Clunky perhaps, but with a certain amount of luck and perseverance, one can indeed pull off some pretty cool things.

In the case of The Deeper They Bury Me, a cool thing was to layer transparent video on top of interactive 360 panoramas. We started with one of the oldest tricks in the book: a video split into two halves, the top half as the colour or RGB channel, the bottom half as the mask or alpha channel. You could use canvas to render the transparency. The operative code would look something like this:
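
```javascript
// A sketch of the split-frame canvas technique (illustrative, not our exact
// production code): draw each frame to an offscreen canvas, read the pixels
// back, and copy the bottom half's luminance into the top half's alpha.
const video = document.querySelector('video');
const output = document.querySelector('canvas'); // visible, half the video's height
const buffer = document.createElement('canvas'); // offscreen scratch canvas
const octx = output.getContext('2d');
const bctx = buffer.getContext('2d');

video.addEventListener('loadedmetadata', () => {
  buffer.width = output.width = video.videoWidth;
  buffer.height = video.videoHeight;
  output.height = video.videoHeight / 2;
});

function drawFrame() {
  const w = output.width, h = output.height;
  bctx.drawImage(video, 0, 0);
  const rgb = bctx.getImageData(0, 0, w, h);   // top half: colour
  const matte = bctx.getImageData(0, h, w, h); // bottom half: luminance matte
  for (let i = 3; i < rgb.data.length; i += 4) {
    rgb.data[i] = matte.data[i - 3];           // matte's red channel becomes alpha
  }
  octx.putImageData(rgb, 0, 0);
  requestAnimationFrame(drawFrame);
}
video.addEventListener('play', () => requestAnimationFrame(drawFrame));
```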

But, alas, this adds a huge processing overhead, as getImageData is computationally expensive, and things grind to a halt when the videos approach full screen. But we can do it using WebGL. Most people who think about WebGL (a strangely small number, to be truthful) think of it as a means of providing 3D rendering in a web-browser, as popularized by the JavaScript library Three.js. But it’s super useful for 2D rendering as well. WebGL uses “shaders”, small programs executed on a computer’s graphics card that define where a shape sits on screen and what colour each of its pixels gets. They usually work in pairs.

A vertex shader tells us where:
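
```glsl
// An illustrative vertex shader for this trick (a sketch, not our exact
// code): position the corners of a full-screen quad and pass UVs along.
attribute vec2 position;
varying vec2 vUv;

void main() {
  vUv = 0.5 * (position + 1.0);            // map clip space (-1..1) to UVs (0..1)
  gl_Position = vec4(position, 0.0, 1.0);  // "where": the quad fills the viewport
}
```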

A fragment shader tells us what colour goes where, as dictated by the vertex shader:
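
```glsl
// An illustrative fragment shader: sample colour from the top half of the
// split video frame and alpha from the bottom half.
precision mediump float;
uniform sampler2D uVideo; // the split-frame video texture (assumed Y-flipped)
varying vec2 vUv;

void main() {
  vec2 rgbUv   = vec2(vUv.x, 0.5 + vUv.y * 0.5); // top half: RGB
  vec2 matteUv = vec2(vUv.x, vUv.y * 0.5);       // bottom half: matte
  float alpha  = texture2D(uVideo, matteUv).r;   // matte luminance becomes alpha
  gl_FragColor = vec4(texture2D(uVideo, rgbUv).rgb, alpha);
}
```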

The result is then printed to screen with JavaScript via this library here. True, your laptop fan will go crazy. But it looks beautiful.

The intent of this apparent technical hocus-pocus is to blur the lines between what is video, what is still imagery, what is text, and what is code. The important thing here is that together, all these things react to audience input, via mouse or touch, and the movement one sees on screen is a result of that interaction.

The Browser is an Audio Mixer.

A large part of the narrative drive of The Deeper They Bury Me is provided by the voice of Herman Wallace himself, recorded over the course of a number of phone calls, his primary contact with the outside world. Most web-sites are silent, but our interactive movies rely almost as much on sound as they do on visuals to provide story and emotional impact. And quite simply, without Herman’s voice, The Deeper They Bury Me would not make much sense.

But people will hear more than just a voice. The audio, like the visuals, is a layered composition of foreground and background elements: the phone call, sounds of outside and inside, scraps of jazz, laughter and voices. Once again, the modern web-browser can help us out here, in the form of our old friend the Web Audio API, which allows us to load in and play any number of audio tracks, assign each track a volume and a stereo position, and loop it or end it. We can connect our audio mixer to the visuals and audience interaction. And you can too (steal this code).

Everything audio starts like this:
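
```javascript
// The canonical starting point: an AudioContext, the hub that every
// source, gain and panner node eventually connects to.
const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();
```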

This snippet doesn’t really do much, but the important thing to note here is that code likes to talk with other code. Our interactive 360 panorama is essentially a code camera, controlled by mouse or swipe, that provides a first-person point of view. It outputs tracking information to the audio mixer, which is made entirely out of code, and which then positions each track accordingly in the stereo mix. In effect: 3D, interactive, immersive sound. And once again, your little laptop, with its fan blazing away and its latest version of Chrome, is making movies.
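
To make that concrete, a hedged sketch of one channel of such a mixer; the function and parameter names are ours, not the project’s:

```javascript
// One channel strip of a hypothetical code mixer: a looping buffer source
// feeding a gain node and a stereo panner. Assumes the audioCtx created above.
async function loadTrack(url) {
  const response = await fetch(url);
  const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;
  const gain = audioCtx.createGain();
  const panner = audioCtx.createStereoPanner();
  source.connect(gain).connect(panner).connect(audioCtx.destination);
  source.start();
  return { gain, panner };
}

// As the code camera turns, re-position a track in the stereo field, e.g.:
// track.panner.pan.value = Math.sin(cameraHeading - soundBearing);
```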

The Browser as Web App.

Let’s get really gone.

For a project like The Deeper They Bury Me, we want to distance ourselves from the “web-site” paradigm, where one selects an item from a menu in order to leave one page and navigate to another. This is where an application framework like AngularJS comes into play. Now our Nerd Goggles are glowing with fiendish intensity!

Angular describes itself as a “Superheroic Javascript MVC Framework”. It goes well beyond the scope of this post to discuss Models and Views and Controllers; we’ll leave that to the superheroes. But the important thing to take away is that Angular allows us to create a user journey through what is called a single-page application. This means, in effect, that stuff comes to the viewer, rather than the viewer going to the stuff. So “navigation” can become “transition”, and the audience can stay inside the experience, with a distinct deep-linked URL for every stop along the way. We can also control time. One of the important aspects of The Deeper They Bury Me is that it lasts for exactly 20 minutes, with the intent of providing a story-arc in a non-linear story-telling experience (a form that often lacks story arcs).
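
A hedged sketch of what that looks like with AngularJS’s ngRoute; the module, route and template names are ours, not the project’s:

```javascript
// Each stop in the experience is a deep-linkable route inside a single-page
// application; moving between them is a transition, not a page load.
angular.module('interactiveDoc', ['ngRoute'])
  .config(function ($routeProvider) {
    $routeProvider
      .when('/scene/:id', {
        templateUrl: 'scene.html',
        controller: 'SceneCtrl'
      })
      .otherwise({ redirectTo: '/scene/intro' });
  })
  .controller('SceneCtrl', function ($scope, $routeParams) {
    $scope.sceneId = $routeParams.id; // drives which scene transitions in
  });
```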

What Does It All Mean?

Code, like design, like story-telling, should never be an end in itself, but should support the other elements that go into making an experience. And as creators we should learn from what we have created. A lot of what we put into The Deeper They Bury Me are the inelegant products of painfully learned lessons, but they also plant the seeds for other projects moving forward, and give us the groundwork to do other cool things in the browser, like facial recognition, virtual reality (WebVR), encryption, file sharing, and astronomical data visualization.

But.

At the end of this story, it all comes back to the world we live in, to a real person, with a real voice, with real things to say, without whom none of the above would mean anything.

Thank you Herman Wallace, wherever you are.


Disaster Resilience Journal wins Digital Communication Award in Berlin

Check it here. Quietly nestled among the bling for the mega-campaigns of the likes of Audi, Rolls-Royce, Nestlé, Barclays and IBM is the Disaster Resilience Journal, winner of the 2015 NGO award, trying hard to look out of place but most deserving at the same time. Andy Channelle from the IFRC, along with Jiannis Sotiropoulos, represented the many who made it happen.


After the Storm at Sheffield Docfest

After the Storm is off to the UK as part of the Sheffield DocFest Interactive Exhibition.

After the Storm is an interactive documentary short created by Alabama film-maker Andy Grace and ourselves (Helios Design Labs) that documents the before, during and after of the 2011 tornado touchdown that devastated much of Andy’s hometown of Tuscaloosa. The piece is a sort of audio-visual letter to imagined future disaster survivors, hinging on the concept that one’s experience of catastrophe in any form is at once intensely personal and universal on a basic, primal level.

Sheffield DocFest’s interactive program has grown over the last couple of years from cameo appearance to major role. This is the third year running that a project we’ve worked on has been included in the festival, previous collaborations being 17,000 Islands (with Thomas Østbye and Edwin), OFFSHORE (with Brenda Longfellow and Glen Richards) and Short History of Highrise (with Kat Cizek and a cast of thousands).

Hopefully the weather is better this year.
