For this year’s Beaufort Street Festival I worked on some projection mapping, bringing the Alexander Buildings to life. Good times were had by all! Sam Price and Andrew Buckley also animated with coordination and production support from Emilia Jolakoska at Siamese. The music was by NAIK.
I started work on the project in Perth and continued working remotely after relocating to London, and I’ve managed to find a (sped-up) video recording of it, below! I believe it was recorded by Roly Skender, who also supplied additional tech support, as did Kat Black from VJZoo and Sohan Ariel Hayes. Seriously great team!
Andrew used photo modelling to generate a sweet digital copy of the building, which I then animated on and around as necessary. I wrote a script to bake sequences directly to the UVs, which were then edited. Being able to play back the sequence directly on the building meant Andrew could then align and render the final output in Nuke relatively easily during projection tests.
The render (left) was one of my tests in Lagoa using an asset from a recent Gatecrasher print campaign. The Lagoa project is available here. I was primarily interested in the accuracy of the physically-based lighting, and the results were impressive – especially as they come from software that runs in your browser.
There’s still a long way to go, but recent innovations like Lagoa, Clara.io and Autodesk Remote are getting me excited and thinking pretty hard about the future of studio virtualisation and its place in our increasingly globalised industry.
I was asked by The Penguin Empire to recreate my Perth City Superdolly for an official City of Perth TVC (well – 3x30s & 3x15s cutdowns). In addition to consulting on processes for the other shots, I somehow ended up DOP’ing my own shot (!!). It’s the closing hero shot in the vid below, which repeats across each of the three TVCs:
The challenge was a little bigger on this one – I couldn’t get away with the grungy look of the original, the output quality needed to be finer (all timelapses shot at 5K), and the move involved a speed ramp down to a specific location, arriving at a specific time of day – down to the minute. After some hefty calculations and a bit of 3D previz I was feeling a little apprehensive about some of the tolerances! I headed out, gear in hand, to my 2km-long “dolly track”, guerilla style. I used a Canon 5D, an EF 70-200mm lens gaffa’d to about 75mm, a monopod, a tripod and a stopwatch.
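For the curious, the timing maths boils down to working backwards from the arrival time – here’s a rough sketch in Python with purely hypothetical numbers (not the actual figures from the shoot):

```python
from datetime import datetime, timedelta

def plan_timelapse(arrival, capture_hours, shot_seconds, fps=25):
    """Work backwards from a target arrival clock time to a start time
    and an exposure interval that yields exactly the stills needed."""
    frames = int(shot_seconds * fps)           # stills required for the shot
    interval = capture_hours * 3600 / frames   # seconds between exposures
    start = arrival - timedelta(hours=capture_hours)
    return start, interval, frames

# Hypothetical: a 10 s shot at 25 fps covering 2 h of real time,
# arriving at exactly 19:42 - so 250 stills, one every 28.8 s,
# starting at 17:42 on the dot.
start, interval, frames = plan_timelapse(datetime(2012, 11, 3, 19, 42), 2, 10)
```

The tight part in practice is that any drift in the interval compounds across hundreds of exposures, which is where the stopwatch earned its keep.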
My crew for the shoot
Swan River Dolphin
Projected city lights
I came back to the computerbox after 3 takes on separate nights. The shot required a lot of massaging to stabilise exposure and movement, plus dustbusting and rotoscoping out unwanted buoys, planes, birds, tree branches and boats. Director Rob Forsyth requested some stylistic tweaks – predominantly to remove the clouds and shift the nightfall earlier. Removing the clouds entirely would have required the whole city to be roto’d, which wasn’t possible in the timeframe, so I pulled a luma key instead and used it to knock the clouds back and blur them out to subtle them up.
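For anyone unfamiliar, a luma key just builds a soft matte from brightness – here’s a minimal per-pixel sketch in plain Python (the thresholds and knock-back amount are illustrative, not the values used on the shot):

```python
def luma(r, g, b):
    # Rec. 709 luma weights
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def luma_key(pixel, lo, hi):
    """Soft matte: 0 below lo, 1 above hi, linear ramp in between."""
    y = luma(*pixel)
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def knock_back(pixel, neutral, lo=0.6, hi=0.9, amount=0.7):
    """Pull bright (cloudy) values toward a neutral colour by matte * amount."""
    m = luma_key(pixel, lo, hi) * amount
    return tuple(c * (1 - m) + n * m for c, n in zip(pixel, neutral))
```

The blur would then be applied through the same matte, so only the keyed cloud regions get softened.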
Shifting the nightfall, though – that could be done! I graded the daylight out to transition to night as the move comes to rest. The tail of the shot gave me building lights over black night, which I tracked and projected onto some stand-in geometry for the move. I added the projected lights over the graded plate and night magically arrived earlier. The final shot starts at “real” daytime, moves to “fake” nighttime, and is back to “real” nighttime by the end.
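The comp logic itself is simple at heart – a hedged per-pixel sketch of the crossfade-plus-lights idea in plain Python (values illustrative):

```python
def grade_to_night(day, night, t):
    """Linear crossfade between the day plate and the night grade,
    driven by normalised move progress t in [0, 1]."""
    return tuple(d * (1 - t) + n * t for d, n in zip(day, night))

def add_lights(base, lights):
    """Additive comp of the projected building lights over the graded plate,
    clamped to keep values legal."""
    return tuple(min(1.0, b + l) for b, l in zip(base, lights))
```

In the real shot the “t” curve was tied to the speed ramp, so the grade lands exactly as the move settles.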
I’ve been playing around with Lagoa, a collaborative cloud-based 3D platform. I’ve been finding it pretty nifty so far and can definitely see it fitting into my pipeline – most likely when working on direct-to-agency print jobs.
I created the image above in Lagoa to demonstrate the use of mesh lights in the platform – they’re definitely easier to set up than in most other packages. In Lagoa, the process is as follows:
Select your object
In the properties pane on the right, under ‘light emission’ check ‘enable’
Adjust the intensity and colour to suit
Be impressed as it swiftly renders on the cloud, with fine quality output by default
Nice! If you’d like to check out my Lagoa project, you can access it here. For contrast, to do the same in mental ray you’d need to do something like this:
Select your object
Create & apply an mia_material_x
Plug a light surface node into the ‘additional color’ slot under ‘advanced’ on the new material
Adjust the settings on the light surface so that it actually contributes to Final Gather
Turn on FG for the render
Tweak your FG and light surface settings until it smooths out & looks nice
Of course, it’s all horses for courses – I’ve always liked the power and control that mental ray gives you, but it’s not always necessary. I’ll keep you updated on further Lagoa pipeline developments.
I recently supervised a motion reference shoot on 3 x 5Ds for the film at Siamese, and all went swimmingly! Ballerina Meg Parry performed an excellent Russian ballet sequence for us, choreographed by the equally talented Sam Fox. I wish I could show more of the materials and look dev – it’s starting to look very cool behind the scenes!
The sequence is currently being edited and will be imported into Maya as image planes from 3 angles. From there the animators will be able to (loosely) roto the dance, as we require a slight stop-motion treatment to the movement and the character is missing a foot, after all. Can’t wait to see how it turns out, I’ll post up something if / when I can! (Probably not until release, sadface)
By now we’re probably all up to speed on the debate surrounding the state of the VFX industry, particularly post-Oscars. For anyone who isn’t, VFX Soldier has made some thoughtful, objective posts on the topic.
I’m not at all convinced that unionisation is the answer, but it is something that affects myself and many of my colleagues and friends (see image, left – as a side note, some of your greens are a bit iffy, guys and girls). I may be wrong on the unionisation issue, though; I’m curious to see what a workable model would look like and am looking forward to participating in the discussions moving forward.
Meanwhile, VES has posted an open letter calling for greater subsidies for Hollywood – like others, I’d say this would only exacerbate the problem and work as a short-term, short-range solution at best.
If there were an easy solution it would have already been found, but it’s a debate I’m keen to contribute more to in the hope that all stakeholders can be running successful businesses supported by satisfied employees.
Direct-to-agency via Tim, Doni and Simon at Gatecrasher – cheers guys! Tim supplied the audio, trees, houses and bike artwork, and being a small job I handled the rest myself – except for the titles, which were bumped on at John Cheese during finishing. Apparently the client’s only final feedback was “I love it!”. Looks like everybody wins – it makes me happy.
We’ve been experimenting around the Qantm office with various forms of tracking – mocap suits down in the green screen room, various implementations of the Kinect, and then this:
Pictured: Blake vs. faceshift and live markerless facial tracking. The RGB + depth video (left) coming from the Kinect sensor feeds into a calibrated model of Blake (centre). The blend deformers on the calibrated Blake model drive the blends on the dog (right) – live! I’d have to say I was very impressed – the nuancing was great and included eye-look direction and blinks. It sometimes slipped a little and the lips didn’t always close correctly (UPDATE: apparently the latest version of the software has fixed this), but overall it was clean and fast.
I later had the data feeding live into our own custom models inside Maya, where it could be recorded and baked directly. You’re still going to have to do some cleanup with a process like this, but it’s a very solid and exciting solution! There are also some pretty nifty things you could do with it video-art-wise once it’s in Maya, using your own facial expressions as an input control for pretty much anything.
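Conceptually the hand-off is just a weight-remapping step – here’s a hedged plain-Python sketch (the shape names are invented for illustration; faceshift’s actual channel names and the Maya hookup aren’t shown):

```python
# Hypothetical mapping from captured blendshape channels on the calibrated
# human model to differently named shapes on the dog rig.
RETARGET = {
    "jawOpen": "dog_jawOpen",
    "blinkLeft": "dog_blink_L",
    "blinkRight": "dog_blink_R",
    "smile": "dog_snarl",
}

def retarget_weights(source_weights, mapping=RETARGET, gain=1.0):
    """Copy source blendshape weights onto the target shape names,
    scaled by gain and clamped to the legal [0, 1] range."""
    return {mapping[name]: max(0.0, min(1.0, w * gain))
            for name, w in source_weights.items() if name in mapping}
```

Channels without a mapping entry simply drop out, which is also where you’d catch the slips mentioned above before baking.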
The super awesome Beaufort St Festival is coming up, so I was excited to be able to help out with the TVC for it. Lead animator Andrew Buckley did a great job bringing it all together. I lent some VFX Supe / TD / consulting / production assistance, including creating a versatile rig for the people cards and painting a few of them myself.
All of the people cards were actually the same piece of geometry, with two sliders above it – one to swap out the background cardboard texture and one for the foreground person illustration. I used an embedded cutout matte to punch out the ripped edges. This enhanced workflow and directability, as changes to the texture on a card could be quickly dialled in and new textures easily added.
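The slider logic amounts to little more than an index into a texture list – a tiny illustrative sketch in Python (file names invented for the example):

```python
# Hypothetical texture lists; each slider picks one entry per card.
BACKGROUNDS = ["cardboard_plain.png", "cardboard_stained.png", "cardboard_torn.png"]
PEOPLE = ["skater.png", "busker.png", "dancer.png"]

def card_textures(bg_slider, person_slider):
    """Clamp each slider into range and return the (background, foreground)
    texture pair for a card; adding a texture just extends a list."""
    bg = BACKGROUNDS[max(0, min(bg_slider, len(BACKGROUNDS) - 1))]
    fg = PEOPLE[max(0, min(person_slider, len(PEOPLE) - 1))]
    return bg, fg
```

Because every card shares the one rig, re-dressing a shot is just a matter of changing slider values rather than swapping geometry.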
I used projection maps and some tricky light-swapping on the skateboard ramp shot to have the cardboard properly interact with the strong sunrise spec highlights on the road. We used mental ray to render. All turned out well!