Meet the Studio: Jellyfish Pictures

Established in 2001, Jellyfish provides a full range of visual effects, animation and motion graphics services and has earned BAFTA, Emmy and VES Award recognition for its work. We spoke with Jellyfish Pictures’ Matt Plummer, who handles both 2D Lead and TD responsibilities out of the studio’s Brixton location in South London. ShotGrid has been part of Jellyfish’s workflow for a while, but Matt recently integrated it more deeply using ShotGrid’s Pipeline Toolkit to automate repetitive tasks, free artists’ time to create, and enable more iterations.

Tell us about Jellyfish and the type of projects you work on.

Jellyfish has two sides. The Noho studio on Margaret Street mostly handles live-action film, TV and advertising projects, with credits including “Rogue One: A Star Wars Story,” “Black Mirror” and “Outlander.” Our Brixton studio specializes in animation, largely for children’s TV. We’re currently working on “Dennis and Gnasher Unleashed (Dennis the Menace),” along with “Bitz & Bob” for the BBC. Last year we worked on a kids’ series called “Floogals,” which is now available in the US. We also just opened a third studio – the Oval office, which is an extension of the Brixton studio.


What content creation tools do you use in-house?

We are entirely Maya-based, with a little bit of Mudbox, MARI and ZBrush. For compositing we use NUKE; motion graphics is done in After Effects and editorial in Premiere.


How did you first hear about ShotGrid?

I’ve been a compositor for many years, so I’d used ShotGrid for shot management at previous studios. When I came to Jellyfish, they were using it for basic production tracking, but I pushed for using it more heavily and adopting Pipeline Toolkit as our pipeline configuration. Now I write a lot of applications and systems for publishing through ShotGrid, then handling the review process through RV and ShotGrid.

What prompted you to move towards the technical side of VFX?

It was kind of organic. I learned Python through Codecademy two years ago. I joined Jellyfish as a lead compositor for “Floogals” and, during the project, wrote the comp pipeline. That’s since expanded into setting up Pipeline Toolkit across every project and many parts of the pipeline.


Which aspect do you enjoy more? Or is there a comparison?

I think they complement each other. For example, part of what we were doing on “Floogals” was assembling bash comps automatically through ShotGrid. Lighting would publish passes to ShotGrid, and a template system I’d coded would read those passes and try to create a composite from them. If you’ve got 150 shots in an episode, that level of automation lets you create a first version super quickly. I’m quite surprised how much I’ve enjoyed the programming side of VFX, but it’s a completely different satisfaction from what you’d get from comping.
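The core of that bash-comp idea can be sketched in a few lines of Python. This is an illustrative mock-up, not Jellyfish’s actual code: the template names, dict fields and pass names are all assumptions, standing in for what a real ShotGrid publish query would return.

```python
# Hypothetical sketch of a "bash comp" template system: a comp template
# lists the expected lighting passes in merge order, and published passes
# (shaped like a simplified ShotGrid query result) are matched against it
# so a first-version comp can be assembled automatically.

COMP_TEMPLATE = ["beauty", "shadow", "ambient_occlusion", "specular"]

def assemble_merge_order(published_passes):
    """Return the publishes that match the template, in template order.

    `published_passes` is a list of dicts, e.g.
    {"name": "shadow", "path": "/renders/sh010_shadow.exr"}.
    """
    by_name = {p["name"]: p for p in published_passes}
    # Keep template order; silently skip passes that weren't published,
    # so a partial publish still yields a usable first comp.
    return [by_name[n] for n in COMP_TEMPLATE if n in by_name]

passes = [
    {"name": "specular", "path": "/renders/sh010_spec.exr"},
    {"name": "beauty", "path": "/renders/sh010_beauty.exr"},
]
order = assemble_merge_order(passes)
print([p["name"] for p in order])  # → ['beauty', 'specular']
```

In a real pipeline, each entry in the returned list would drive the creation of a Read node and a Merge in NUKE; the point is that the ordering logic lives in one template rather than being rebuilt by hand for 150 shots.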


How does your background as an artist inform your work as a TD?

Having experience using the application you’re writing software for is always going to be a huge help. Instead of trying to interpret what someone might want, you know exactly what works best for their needs.


ShotGrid is a lifesaver for every project… I can’t imagine doing another season of “Floogals” without ShotGrid; we wouldn’t be able to do it.


What’s a recent project that ShotGrid was particularly useful for?

ShotGrid is a lifesaver for every project. If you have to deliver 52 episodes with 150-200 shots per episode, and you’ve got animation, lighting and compositing, it’s immensely helpful to be able to track the statuses of the assets used in those shots, the actual progress of the shots, which asset is used on which shot, and to generally automate processes for those shots and assets. Having ShotGrid means it’s all organized and it’s all easy. Instead of opening a Maya file and exporting an OBJ every time it’s needed, I can write a ShotGrid application so that a right click automatically publishes those OBJs. I can’t imagine doing another season of “Floogals” without ShotGrid; we wouldn’t be able to do it.
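To make the “which asset is used on which shot” bookkeeping concrete, here is a toy sketch of the kind of index ShotGrid maintains for you. The field names and entity shapes are assumptions for illustration, simplified from what a real `shotgun_api3` query would return.

```python
# Illustrative only: build a "which asset is used on which shot" index
# from a flat list of publishes, the way ShotGrid tracks asset/shot
# relationships across hundreds of shots per episode.

def assets_by_shot(publishes):
    """Map each shot code to the set of asset names published into it."""
    index = {}
    for pub in publishes:
        index.setdefault(pub["shot"], set()).add(pub["asset"])
    return index

publishes = [
    {"shot": "ep01_sh010", "asset": "Floogal_Fleeker"},
    {"shot": "ep01_sh010", "asset": "Garden_Set"},
    {"shot": "ep01_sh020", "asset": "Floogal_Fleeker"},
]
print(sorted(assets_by_shot(publishes)["ep01_sh010"]))
# → ['Floogal_Fleeker', 'Garden_Set']
```

At 52 episodes and 150-200 shots each, maintaining this by hand in spreadsheets is exactly the kind of work that breaks down; querying it from a tracked database is what makes the automation above possible.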

What are your favorite features of ShotGrid?

We started out by using ShotGrid Pipeline Toolkit just for Maya, with our own pipeline for NUKE; in the last six months or so we’ve added Pipeline Toolkit for NUKE and Photoshop, and soon we’ll be adding MARI. Pipeline Toolkit provides a lot of useful features for free that remove tasks you’d normally have to work out yourself – where work files are located on disk, how publishing works, generating QuickTimes for publishes and what’s actually being published. It handles all the little things really well, which is especially great for someone like me who is still learning Python: I only need to configure the high-level bits and pieces rather than code them manually. I think my favorite bit about ShotGrid is that you have so much access to its inner workings through Pipeline Toolkit and the Python API. You’ve got a lot of power to make it do what you want. It’s made my life a lot easier for sure; that’s my favorite feature.
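The “where work files are located on disk” configuration he mentions lives in Pipeline Toolkit’s central templates file. A minimal fragment might look like the following – the key names and folder layout here are invented for illustration, not Jellyfish’s actual configuration:

```yaml
# Fragment of a Toolkit templates.yml (illustrative layout only)
keys:
    Shot:
        type: str
    version:
        type: int
        format_spec: "03"

paths:
    # Where a compositor's NUKE work file lives on disk
    nuke_shot_work: 'shots/{Shot}/comp/work/{Shot}_comp_v{version}.nk'
```

Once a path is described this way, Toolkit apps resolve and create it consistently everywhere, which is the “configure the high-level bits rather than code them manually” point above.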


How much effort do you focus on building out the pipeline?

It depends on what you’re setting that pipeline up to do. In terms of just installing Pipeline Toolkit and getting the basics, it’s super easy. Building custom applications depends on the difficulty of what you’re trying to make them do, but if I can learn Python and get a full studio pipeline working, pretty much anyone can do it.


By automating parts of your pipeline or your general workflow, you’re freeing artists’ time to create.


Would you encourage up-and-coming artists to learn Python?

If you’re using NUKE or Maya, Python is the programming language to learn. I’d encourage people to at least have a very basic knowledge. If I had learned Python earlier, it would have saved me a lot of time. I didn’t realize the time I could have saved. I think almost everyone’s had to do a simple but tedious task at some point and thought, ‘a robot should be doing this.’ Learning to code is how you build that robot. By automating parts of your pipeline or your general workflow, you’re freeing artists’ time to create.
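A concrete example of that “robot”: renaming a render sequence so frame numbers are zero-padded and sort correctly, a classically tedious by-hand task. The filenames and padding width are made up for the example.

```python
import re

def pad_frame(filename, width=4):
    """Zero-pad the trailing frame number in a render filename,
    e.g. 'shot_1.exr' -> 'shot_0001.exr'."""
    return re.sub(
        r"(\d+)(\.\w+)$",
        lambda m: m.group(1).zfill(width) + m.group(2),
        filename,
    )

names = ["shot_1.exr", "shot_12.exr", "shot_103.exr"]
print([pad_frame(n) for n in names])
# → ['shot_0001.exr', 'shot_0012.exr', 'shot_0103.exr']
```

A dozen lines like these, wrapped in an actual rename loop, replace an afternoon of clicking – and both NUKE and Maya expose a Python interpreter, so the same skill transfers directly into the tools artists already use.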


What’s a typical day in your life?

I start the morning by looking at my task list, then tackle the biggest problem or the largest new piece of pipeline, or design a tool around a short-term problem, and proceed from there. Most of my day is currently spent programming, but I’m also involved in meetings to organize workflow, pipeline and on-set supervision for upcoming TV shows. It’s a mixture of pipeline coding and preparing for new shows’ compositing. There’s also normally fried chicken once a week – my current favorite is from a place called Chicken Liquor.


What inspires you creatively?

I’ve found that I’m strongest when responding to other people’s work, not just in visual arts but in writing as well. That serves as a sort of springboard for my own ideas. I don’t think artists can live in a vacuum; art, whether music, film or other visual media, is always a reflection of either the world around someone or the influence of their environment.

How do you unwind?

I try to listen to three new albums each week. I grew up listening to metal, rock and some electronic music, but in the last five to six years, I’ve branched out to almost anything. I think virtual reality (VR) and the future of that technology is super interesting too, so I try to keep up to date on it. I also read an awful lot of books, mostly science fiction but not exclusively. And I play guitar, but horribly.


What are your general predictions for VR?

People are only going to be content staring at flat movie screens for so long. I don’t think VR will replace film, but there are logical evolutions of visual media that involve VR, or VR might be a stepping stone towards whatever comes next. It’s tough to predict its level of success because the entry point is so expensive at the moment, but I think it’s fair to say there’s a future for immersive digital content. I don’t know if that’s within 10 years or 100 years, but it’s great fun.


What led you to visual effects?

It started out as a hobby. During university, I had a friend who directed short films and I became his FX guy. I was constantly messing around with software and making things explode in his films. I was also part of an online community called FXhome, which is now HitFilm. While earning a Digital Arts degree, I discovered there was a VFX industry in London – I had naively assumed it was all in Hollywood. I got a job as a runner at a production studio and learned NUKE in my spare time. Eventually, I got a job as a roto artist working on “Prince of Persia” and it sort of went from there. I kind of got lucky really.


What’s your advice for someone wanting to get into VFX?

A big part of your daily job is knowing software, but software will only get you so far, so learn technique. Also, work out exactly what you want to do in VFX and then specialize, whether it’s in compositing, editing, animation, modeling, lighting, etc. These are all components of a very large pipeline, and it’s good to know early on that you should focus on one aspect of it rather than trying to be the whole show, because that’s how most big studios work. Try to get involved with global or local VFX communities so you have an understanding of your prospective employment environment. It can also be helpful to seek out a trainee finder or skillset recruitment program that will place you into a facility.

Matt Plummer, 2D Lead and TD at Jellyfish Pictures