Here in Orlando, we’re a hub for military simulation, with the University of Central Florida Research Park, Full Sail University, and the DAVE School right in our backyard. All the FX houses are incorporating some type of scanning and digitizing, such as character motion capture, body scanning, and, in today’s discussion, show-set capture and digitization.
In big blockbuster movies you’ll see scenes where the character escapes death by the skin of his teeth and blows up half a historical landmark in the process… how do they get away with that? It’s costly to build sets and blow them up: just ask the producers of Waterworld. The idea is that these one-of-a-kind environments get star treatment digitally. Using 3D laser scanners and photogrammetry tools to capture reality, the modeling team can put an environment together in a fraction of the time and with far greater detail and accuracy than ever before.
So, how are FX studios using the data now?
I’ve been lucky to have worked with a few studios, and I’m now supporting a local customer, EA Tiburon, the design team behind games like Madden and Tiger Woods, right here in Orlando. Together, we’ve solved lots of technical issues relating to how the industry currently consumes point clouds — and identified where the tooling still falls short.
The data collection side seems to be the easiest part these days with simplified scanning workflows. The real issues lie in downstream software processing. In EA’s scenario, they’ve developed some amazing tools themselves… and continue to refine this process. Check it out. Here’s an older clip of Remnant Studios in CA working with our first-gen scanner. They aren’t the only ones using scanning; here’s Activision scanning in Florida.
Another “power user” is Scott Metzger, an accomplished visual effects artist, who shares a nice technical video on his use of 3D scanners in conjunction with the dizzying array of conversion and modeling programs used to produce such high-quality digital media.
So, there are benefits to this renaissance of companies embracing scanning as their primary reality-capture tool… the trickle-down effects of these big shops with big budgets force the hand of the major design packages to play in the point cloud sandbox. Autodesk acquired Alice Labs a while back, which could one day give users “unlimited” point cloud visualization capability in their native packages, including CAD. “Unlimited” is not a word we scanner-heads are used to hearing, but there’s some truth to this. With multi-core processors and true 64-bit hyper-threaded applications, this is fast becoming a reality (pun intended).
A complementary yet stand-alone product is Click-VR, a real-time 3D rendering application that lets you work with the cloud in a “pre-rendered” format. Those who are not in the 3D animation biz may be scratching their heads, but you know that short stretch of time when the cloud goes fuzzy while you spin around in your 3D viewer? Yeah, that’s “rendering”… pre-rendered means there is no lag and no regeneration time, with full point cloud display and access. Most importantly, with further tweaking and the right workflow, this can be used as a true simulation tool, like this example.
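To make the “fuzzy cloud” idea concrete: most viewers draw only a decimated subset of the points while you orbit the camera, then refill the full cloud once you stop moving, which is the regeneration lag described above. A “pre-rendered” workflow effectively skips that decimation step. Here’s a minimal, hypothetical sketch of that interactive-decimation logic (the function name, point counts, and budget are all illustrative, not from any actual viewer):

```python
import numpy as np

# Stand-in for a dense scan: two million random XYZ points.
rng = np.random.default_rng(0)
cloud = rng.uniform(-1.0, 1.0, size=(2_000_000, 3))

def points_to_draw(cloud, interacting, budget=100_000):
    """Return the subset of points a viewer would draw this frame.

    While the user is orbiting (interacting=True), draw at most
    `budget` points via crude uniform decimation; once the camera
    is still, draw the full cloud -- that refill is the "fuzzy"
    regeneration moment a pre-rendered display avoids.
    """
    if not interacting or len(cloud) <= budget:
        return cloud  # full detail when the camera is still
    step = len(cloud) // budget  # keep every Nth point while moving
    return cloud[::step]

moving = points_to_draw(cloud, interacting=True)
still = points_to_draw(cloud, interacting=False)
print(len(moving), len(still))  # decimated vs. full counts
```

Real viewers use smarter spatial structures (octrees, level-of-detail tiles) than a stride slice, but the trade-off is the same: fewer points per frame while moving, full density at rest.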
As we speak, many of the sports stadiums across the country are being scanned to build a portfolio for these game companies.
How cool would it be to play HALO with a game map being your local park or your office? Hmmm… sounds like a nice service offering to me. Do you think the corporate execs at Activision, Ethers might be thinking the same thing (wink, wink)?