The Qualcomm Developer Network's September Developer of the Month is Claude Dareau from Framestore. Claude is a lead programmer at Framestore, and his work centers on the development of rendering solutions for AR, VR, and XR applications.
Framestore is a visual effects company with offices in the United Kingdom, United States, and Canada. Their work spans film, advertising, and evolving media platforms such as VR. Claude and his team at Framestore have been developing on devices powered by the Qualcomm® Snapdragon™ 835 Mobile Platform, such as the Oculus Quest, to allow Framestore’s visual effects teams to work their magic on virtual experiences.
How was Framestore started?
Framestore was founded in London in 1986, with work covering images in commercials, music videos, and television graphics. The computer graphics (CG) department was set up in 1992 and has been expanding ever since, now covering award-winning work in film, commercials, and TV. In 2001 Framestore brought their talents to the Harry Potter world with work on Harry Potter and the Philosopher’s Stone, and they have subsequently worked on all eight films in the series as well as the various spin-offs. Framestore continues to work on some of the most successful movies and franchises, with a few Oscar wins along the way!
The Immersive team of which I am a part was set up in New York City in 2012 and has since expanded to house development teams in four sites across the globe. Over its short existence it has proved itself over and over again, with creative solutions to unusual development problems, winning a slew of awards in the process.
What can you tell us about the products you develop?
Framestore is known globally for our visual effects. We have a proud history of creating extraordinary images and scenes for some of Hollywood’s biggest pictures, collecting numerous industry awards along the way. Our work is also seen across the advertising industry, bringing magic to the small screens that surround us every day.
Image provided by Framestore
I work in the Immersive team making real time XR (VR/AR) applications and experiences. Often this work is driven by existing relationships with the film or advertising industries. The products are often promotional pieces tied to large, well known IPs. For example, our last project was an AR location-based experience tied to the premiere of the final season of Game of Thrones. So, tight and inflexible deadlines are a common theme in our products. 4D and ‘experiential’ work is also a huge part of what we do, whether it is VR in a hydraulic rig or retrofitting a school bus with bleeding-edge technology for a trip to Mars.
The Immersive Technology team is global. Each location has its particular strengths and one of the other common features of the products that we work on in New York City is that we tend to concentrate on mobile platforms. Previous projects have seen us working with technologies from Qualcomm Technologies, Inc. (QTI) and the Snapdragon 835 reference HMD, Oculus Quest, and most recently Magic Leap. We are currently working on an Oculus Quest/Go/GearVR title that we are super excited about.
Where does your team get inspiration?
We are in an environment where most of the people around us are working on beautiful pre-rendered visuals, so obviously our main focus is to push the visual quality as much as we can. Not that interactivity and gameplay should be ignored, but we have to be honest and admit that most of our projects are not deep experiences that the user will spend long periods of time with. Rather they are experiences that people spend a short amount of time with, and need to make an instant impact. The flip side of that is when we do get a project that is a deeper experience, we can really appreciate that aspect of it.
Image provided by Framestore
Who are your technology heroes?
My technology heroes are all people that I have worked with who have had a lasting impact on me because of their attention to detail and shared passion for sweating the small stuff. Whenever I need inspiration or motivation to dive deeper into a problem and really deliver something that I would be happy to use, I think of those people and how they would not be satisfied with results that are merely "good enough".
What does the future of your industry look like in 10 years?
I have this crazy idea that pre-rendered content will be gone. You will still be able to go to the theatre to watch a movie, but it will be rendered in real time on a PC in the projection room. That will open up huge possibilities for dynamic storytelling, but also huge challenges.
On the VR side of things, cables will be long gone. Rendering on wireless HMDs will probably be a mixture of streaming high-fidelity content rendered in the cloud with perhaps some content rendered on device. Hopefully we will have information about the user beyond just the position of the controllers they are holding. VR content will likely mirror the progression of video games, becoming more social and offering deeper storytelling experiences.
Head-mounted AR might just be a viable proposition. The potential for AR is incredible, and at some point it will be just about everywhere and in everything. However, the obstacles it needs to overcome to get to the point where it is a viable consumer proposition are so much more significant than those that VR faces.
What are some development tools and resources you can’t live without?
A good GPU profiler. Accurate timing that takes us beyond guesswork and emergency feature cutting at the end of a project is vital, especially when working with tight deadlines. A great GPU profiler also reveals the workings of the hardware itself and allows us to gain a deeper insight into exactly what is happening when we submit that draw call. PC profilers are arguably at that point already, but with the sort of tiled GPUs that we see in mobile devices, I don’t feel that I am getting the same quality of information out of the currently available profilers.
How are you using the QTI technology in your products?
Innovations like foveated rendering and multi-view rendering in the Snapdragon 835 are huge for us. They allow us to push the visual quality on devices using this hardware far beyond what we could achieve without it. Just as important is the fact that we see the benefits of these innovations with very little work on our part.
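For context, multi-view rendering is commonly exposed on mobile GPUs through the `GL_OVR_multiview` OpenGL ES extension, which lets a single draw call produce both eye views instead of rendering the scene twice. Below is a minimal, generic vertex-shader sketch of the idea; the uniform and attribute names are illustrative, not taken from any Framestore project:

```glsl
#version 300 es
#extension GL_OVR_multiview : require

// Declare that this shader renders two views (left and right eye) per draw.
layout(num_views = 2) in;

// One view-projection matrix per eye (illustrative name).
uniform mat4 u_viewProj[2];

in vec4 a_position;

void main() {
    // gl_ViewID_OVR is 0 or 1, selecting the matrix for the current eye.
    gl_Position = u_viewProj[gl_ViewID_OVR] * a_position;
}
```

On the application side, both eye layers of a texture array are attached with `glFramebufferTextureMultiviewOVR`, and the driver broadcasts each draw call across the views, which is why engines see the benefit with very little extra work.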
Do you plan on using QTI technologies on future projects? If so, how?
Our current project will deliver on Oculus Quest sometime around the end of the year. I would love to follow that up with another project on the platform so that we can take the lessons learned and use them to push the hardware even further. The VR space is evolving more rapidly than anything I have been involved in before and it is always tempting to chase the latest new thing. But looking back at my days in video games, the best and most rewarding work was always produced when you had a deeper understanding of the hardware you were using.
Qualcomm Snapdragon is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.