

It sounded like artists were forced to pose their rigs by typing numbers into a spreadsheet, then hitting a 'calculate animation' button, which would run some kind of batch process (complete with a modal-blocking progress dialog that runs for a few minutes) that goes through the entire timeline and bakes out the effect of those parameter changes on the whole shot, before they could finally play back the cached animation (or even see the new pose of their character) with the changes applied. Oh, and this baked animation was still done using low-poly proxy models - and even then, if they tried to animate more than one character at a time, the system would simply crash (presumably from not having enough RAM to store all the data for both rigs).

From clips seen in the video above though, the situation seems a bit less dire. It seemed that they could in fact still move controls about interactively (see the shot where they've got an animator supposedly playing around with how open Hiccup's mouth is), and the animation baking seemed to be running at about 1 frame every 1-2 seconds (unless, of course, we just caught it mid-update). At one point, it also looked like one animator was actually animating/working in Maya instead (I do wonder whether it's true that some guys animated in Maya in the past. Would that mean that there was some kind of exporter/importer to get the animation back out of Maya into their system?)

As far as the rigs go, while the initial videos I saw all seemed to suggest that they had moved away from placing actual controls on any of the rigs, it seems instead that they simply have them hidden but still selectable - exactly like some of the rigs I was playing around with a few years ago, which were also inspired by some experiments from Keith Lango back in the day with his 'Otto' rig IIRC.

Now, away from all the conjecture and looking more at stuff we actually know for certain:
- This whole system is powered by an intensely multithreaded depsgraph engine, which they've published a paper on (I've posted a link to this before, but it might be paywalled), along with some publicly available course notes which mostly cover the same material (with a greater focus on the lessons they learned when trying to deploy this stuff in production).
- We know that they report getting 12 fps out of their new system for interactive posing.
- Blender's new depsgraph draws heavily on this work, with tweaks so that it works with what we've got.
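To make the depsgraph idea a bit more concrete, here's a minimal sketch of the core trick behind a multithreaded dependency graph engine: nodes whose dependencies are all satisfied can be evaluated concurrently, wave by wave. This is purely my own illustration (the graph, node names, and `evaluate` function are all made up), not DreamWorks' or Blender's actual code:

```python
# Minimal sketch of multithreaded depsgraph evaluation (my own illustration,
# not the actual engine): nodes with no unmet dependencies form a "wave"
# that can be evaluated in parallel.
from concurrent.futures import ThreadPoolExecutor

def evaluate(graph, eval_node):
    """graph: {node: set of dependency nodes}; eval_node: per-node work fn."""
    remaining = {n: set(deps) for n, deps in graph.items()}
    done = []
    with ThreadPoolExecutor() as pool:
        while remaining:
            # Every node whose dependencies are all complete is ready now.
            ready = [n for n, deps in remaining.items() if not deps]
            if not ready:
                raise ValueError("cycle in dependency graph")
            list(pool.map(eval_node, ready))  # one parallel wave
            for n in ready:
                del remaining[n]
                done.append(n)
            for deps in remaining.values():
                deps.difference_update(ready)
    return done  # a valid topological order

# Hypothetical rig graph: deformation needs both skeleton and shape keys.
rig = {"skeleton": set(), "shapes": set(),
       "deform": {"skeleton", "shapes"}, "render": {"deform"}}
order = evaluate(rig, lambda node: None)
```

Here "skeleton" and "shapes" can run in parallel, then "deform", then "render" - the more independent branches a rig has (multiple characters, independent limbs), the more the threads have to chew on, which is presumably where the reported 12 fps comes from.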
