Roadtesting Rokoko's Smartgloves and Coil Pro
Matt Estela fires up this motion capture kit from Rokoko for a test run of their Smartgloves and Coil Pro.

You may know Matt Estela from his Houdini activities, or his incredible CG resource CGWiki. Matt is currently a Senior Houdini Artist at Google and previously worked in VFX and animation at Animal Logic, Dr. D Studios and Framestore CFC. Matt likes tinkering with new tech, so I asked him to road test Rokoko's Smartgloves and Coil Pro, two motion capture offerings from the company. Here's how Matt broke down the tools. (And, yes, befores & afters' ON THE BOX series is back!)

TL;DR: it's good

It captures fingers nicely, position tracking with the Coil Pro works great, calibration is fast, the capture software is easy to use and exports very clean FBXs. It's a little pricey as a full package, but worth it all things considered, and Rokoko support is great.

My background

While I'm known as a minor Houdini celebrity in very small social circles, I actually started in 3D as a character animator in 1999/2000. It took about a year to realise I didn't have the patience or dedication for it and I moved into more technical roles, but 24 years later I still love and appreciate quality animation. My move into tech, plus my background as a failed animator, meant that when Ian offered these gloves to review I jumped at the chance.

Tech background

Broadly, mocap tech falls into several categories:

- Dedicated optical
- Dedicated IMU
- Machine learning
- Adaptation of smartphones and VR headsets

At its core, mocap needs to know where a joint is in 3D space. Optical uses multiple cameras to identify bright dots on a suit, triangulates where those dots are based on all those cameras, and then calculates an absolute position for each dot. While optical is very accurate, it is also very expensive; these systems require special high speed cameras, ideally as many as possible, with associated dedicated infrared lighting, which all need to be carefully calibrated in a dedicated performance space.

IMU (Inertial Measurement Unit) systems like Rokoko's don't directly solve the absolute position of joints, but calculate it from acceleration. Cast your mind back to high school physics, and remember how position, velocity and acceleration are linked: velocity is the change in position over time, and acceleration is the change in velocity over time. IMU sensors measure acceleration, and you can run those high school equations in reverse; integrate acceleration to get velocity, integrate velocity to get position. Because IMUs are self-contained they don't require cameras, meaning they don't suffer from the occlusion issues of optical systems. While IMU systems are not as accurate as optical, they are substantially cheaper.
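To make that integration idea concrete, here's a minimal sketch in plain Python (not Rokoko's actual solver) that double-integrates accelerometer samples; even a tiny constant sensor bias turns into visible positional drift within seconds:

```python
import numpy as np

# Dead reckoning from an IMU: integrate acceleration to get velocity,
# integrate velocity to get position. Plain physics, not Rokoko's solver.
dt = 1.0 / 100.0                      # 100 Hz sensor samples
t = np.arange(0.0, 5.0, dt)           # five seconds of "capture"

true_accel = np.zeros_like(t)         # the hand is actually perfectly still
sensor_bias = 0.02                    # m/s^2, a tiny calibration error
measured_accel = true_accel + sensor_bias

velocity = np.cumsum(measured_accel) * dt   # acceleration -> velocity
position = np.cumsum(velocity) * dt         # velocity -> position

print(f"apparent drift after 5 seconds: {position[-1]:.2f} m")
# Roughly 0.25 m of drift from a 0.02 m/s^2 bias: this is why claps end
# up intersecting, and why an absolute position reference is so valuable.
```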
Machine learning has been a recent addition to the space, where models guess the pose of a human based on training data. They produce adequate results for real time use, but achieving the quality required for film and games requires offline processing in the cloud, which can be a concern for some.

The final category is adapting smartphones and VR headsets. Both have cameras and IMU sensors on board, and also increasingly feature on-board machine learning for hand tracking and pose estimation. Quality is variable, and they are limited to motions that can be comfortably done while holding a phone or wearing a headset.

Smartgloves

In 2020 Rokoko launched the Smartgloves, one of the first commercial systems to offer hand tracking at a reasonable price point without requiring the skills of a dedicated motion capture facility. It also offered the ability to integrate with the Smartsuit to provide an all-in-one solution for body and hand mocap.

I had the chance to test these gloves shortly after launch. My experience with mocap at that point was a handful of optical systems for some university research projects, and dabbling with some smartphone systems for facial capture and early VR apps for hand and head position capture. This put me in an interesting space; I hadn't tried any IMU systems, and so was judging the gloves based on experience with the optical body capture and VR hand capture systems mentioned above.

I tested them for a couple of weeks, and my initial verdict was "oh, they're ok I guess". The gloves did exactly what they were designed to do, capture fingers, but as someone who talks with their hands a lot, my expectation was that the gloves would capture full hand gestures, which, if you think about it, means understanding what the wrists and elbows are doing for full Italian style gesticulation silliness.

Further, because I was only wearing the gloves (and clothes, c'mon), it was natural to try and focus on hand centric animation: clapping, typing, steepling fingers etc. Again, the gloves in their original format aren't really meant to do this. Think about the limitations of IMU: there's no way to know where the hands are relative to each other, and they can't detect if you're really perfectly still or moving veerrryyy slowly at a constant velocity.

This all manifests as drift. Do a clap, for example, and very quickly the hands end up intersecting each other. Hands on a desk will slide away, the overall body pose starts to rotate, and so on. If your needs are broad body gestures maybe this is fine, especially for VTubers and similar fields where high accuracy isn't an issue.

At its core, IMU on its own is incapable of the accuracy needed to capture hand gestures. Again back to high school physics: that process of acceleration -> velocity -> position is affected by sensor accuracy and the limits of real time calculation. The numbers aren't perfect, the sensors aren't perfect, meaning results drift. There are ways to compensate for this, e.g. the Smartsuit understands the biomechanics of a human skeleton to make educated guesses of where the feet should be and how knees should bend, and if paired with the gloves, can drastically improve the quality of the hand tracking. But without the suit, and without other sensor data, two handed gestures would always be difficult.

Rokoko themselves of course know about the limitations of IMU, and had plans to augment this.

Coil Pro

Fast forward a few years, and Rokoko released the Coil Pro. This uses another technology, EMF (electromagnetic fields), in conjunction with IMU to calculate worldspace positions. It promised results like the worldspace positions of optical, without the occlusion issues of optical, and especially without the cost of optical.

Rokoko mentioned this was coming soon back in 2020, time passed, I forgot. In 2024 they got in touch again (unsurprisingly, getting it to market took longer than expected) and asked if I'd be interested in trying again. Of course I was.
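To get a feel for why adding an absolute worldspace reference helps, here's a toy complementary-filter style sketch. This is a generic sensor fusion illustration with made-up numbers, not Rokoko's actual EMF algorithm: a drifting dead-reckoned position is periodically pulled back towards an absolute (but noisy, lower-rate) position fix.

```python
import numpy as np

# Toy fusion of a drifting dead-reckoned position with an absolute (but
# noisy, lower-rate) worldspace fix. A generic complementary-filter idea,
# not Rokoko's actual EMF math; a real solver would also correct velocity
# (e.g. with a Kalman filter) and work in 3D.
rng = np.random.default_rng(0)
dt, steps = 1.0 / 100.0, 500          # five seconds at 100 Hz

bias = 0.02                           # m/s^2 accelerometer bias
velocity = 0.0
imu_only = 0.0                        # pure dead reckoning
fused = 0.0                           # dead reckoning + absolute fixes

for step in range(steps):
    velocity += bias * dt             # the hand is actually stationary
    imu_only += velocity * dt
    fused += velocity * dt

    # Every 10th sample an absolute position reading arrives (here: the
    # true position 0.0 plus 5 mm of noise) and we blend halfway to it.
    if step % 10 == 0:
        fix = 0.0 + rng.normal(0.0, 0.005)
        fused += 0.5 * (fix - fused)

print(f"IMU only: {imu_only:.3f} m of drift")
print(f"fused:    {fused:.3f} m of drift")
```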
Setup

The coil arrived, about the size of a small guitar practice amp. Special mention has to be made for the unboxing process, an amusing bit of showmanship:

The install process was pretty straightforward; connect it via USB to your computer to register it, then disconnect it. It doesn't require a permanent connection to your computer, only power (which is also delivered via USB), so it's easy to move to a convenient location.

A new pair of Smartgloves also arrived with the Coil, and they needed to be registered and have firmware updates. This took longer than expected, mainly because I'm an idiot who didn't read the manual carefully enough; the gloves need to be updated one at a time. Special shout-out to Rokoko support, who were very helpful and logged in to my machine remotely to identify issues. Everyone at Rokoko pointed out I wasn't getting special treatment; this is the level of service they offer to all their customers.

Once the gloves were updated and registered, the final setup step was binding the gloves to your avatar within the Rokoko software. By default the gloves float free in virtual 3D space, which worked, but the results felt a little drifty and strange. Again my dumb fault for not reading the manual; support advised me to bind the gloves to a full body avatar, despite not having a full Smartsuit. Suddenly the result was a lot more accurate. My understanding is that when linked this way, the software can use biomechanics to make better estimates of the wrist and arm positions, leading to a much more accurate result.

In use

With everything set up, I was impressed at how invisible the process became. Previous mocap tests with optical and smartphone/VR headset systems constantly reminded me of their limitations; occlusion with optical will guess strangely, and ML systems will often make a plausible but incorrect guess of limb locations. With the Smartgloves and Coil I never got these glitches; it just feels like an on screen mirror of my actions.

Calibration is very straightforward; hit a neutral pose, hold it for 3 seconds, done. Calibration for optical systems has taken a lot longer. Once calibrated, hit record, do a motion, then stop recording. You can review the action immediately, and re-record if you choose.

Exporting to FBX is very easy, and the files loaded into Houdini without issues.
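For what it's worth, bringing the exported take into Houdini can also be scripted; below is a minimal sketch using Houdini's Python API. The file path is a placeholder, and if your Houdini version's importFBX call differs, the File > Import > FBX menu does the same job.

```python
import hou

# Minimal sketch: import a Rokoko FBX export into the current Houdini
# scene. The path is a placeholder for wherever Studio exported the take.
fbx_path = "/path/to/exports/newsreader_take01.fbx"

# importFBX returns the subnet it created plus any import messages.
subnet, messages = hou.hipFile.importFBX(fbx_path)
print(subnet.path(), messages)

# The subnet contains the skeleton and baked animation, ready to be
# blended with other takes or brought into a KineFX/retargeting setup.
```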
Example: Testing a laptop

Many times I've had ideas for little animations, but I'd get 10% into the process, get bored, stop. Similarly I'd have ideas for stuff that I might film (I was one of the dweebs who got one of the first video capable DSLRs thinking I'd be the next Gondry), but again the effort to finish something was just too much.

Once the gloves were set up and I could see my avatar on screen, I started testing scenarios in realtime: how cartoony could I move my arms, how did the body adjust based on the hand positions, how well was typing captured. Quickly I improvised a scenario where testing a laptop goes wrong, the tester reacts, panics. I could let it record, try a few different takes, play it back. It was the ideal blend of animation output, but with the spontaneity of improv and acting.

The limitations of doing full body capture with only gloves led to some fun problem solving. How would the character enter and exit? I couldn't capture walking easily, but maybe they'd be on a Segway? Again I could test quickly, export a clip, blend it and the previous take in Houdini, be inspired, try something else. Here's the end result:

Example: Newsreader

A friend was curious about the gloves, so I asked what sort of motion they might record. He said a newsreader talking to camera, shuffling papers, that sort of thing.

Out of curiosity I timed how long it took from getting the suggestion to having it play back in Houdini; it was 5 minutes. 5 minutes! Amazingly, 2 minutes of that time was waiting for a Rokoko software update. The result was perfectly usable, glitch free, no need for cleanup. That's pretty amazing.

What surprised me with this test was how the Rokoko software animated the body, even though I was only recording motion via the Smartgloves. The software uses the hand data to estimate what the rest of the body is doing; it's possible to split off the hands in a DCC (see the sketch below), but not only was the body estimation pretty good, it had way more character than I expected.
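As a rough idea of what "splitting off the hands" might look like on an Object-level FBX import in Houdini, here's a sketch that keeps the captured hand and finger keyframes and strips everything else. The node path and joint name fragments are assumptions; match them to whatever your export actually contains.

```python
import hou

# Rough sketch of "splitting off the hands": keep the captured hand and
# finger keyframes, strip the estimated body animation so it can be
# replaced or re-blocked. Path and joint names are assumptions.
rig = hou.node("/obj/newsreader_take01")        # hypothetical import subnet
HAND_KEYWORDS = ("hand", "thumb", "index", "middle", "ring", "little", "pinky")

for node in rig.allSubChildren():
    name = node.name().lower()
    if any(key in name for key in HAND_KEYWORDS):
        continue                                # keep hand/finger animation
    for parm in node.parms():
        parm.deleteAllKeyframes()               # drop estimated body motion
```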
Comparing to alternatives

Full disclosure: as stated earlier I'm not a mocap person, so what follows is the result of some quick Google/YouTube research.

The main competitors appear to be Manus and StretchSense. A key differentiating factor is that Manus and StretchSense are designed to be used with another mocap system, while Rokoko are pushing a unified body+gloves package. This makes direct comparisons a little tricky. All three systems track fingers, but to get accurate hand positions where collisions matter, all need augmentation; Rokoko via the Coil, Manus and StretchSense via an optical system like an OptiTrack. If the Manus or StretchSense gloves are paired with an IMU system like Xsens, their ability to track claps and other two handed gestures will be limited.

Cost is relevant here too. The Smartgloves and Coil combination is cheaper than either of the top of the line options from Manus or StretchSense, and those two options would still require another system to do accurate positional tracking. There are analogies to be made here to the Mac vs PC world; Rokoko are pushing a single Apple style ecosystem, while the others are modular and designed to work with a few different systems.

Moving away from dedicated systems, there's the option of using a Quest headset to track hands. The Glycon app is cheap and cheerful, but suffers the same issues of occlusion; if the cameras on the headset can't see your fingers, it will guess what your fingers are doing, often incorrectly. The location of the cameras means your most neutral hands-by-sides idle pose is not tracked well. Further, while a mocap suit+gloves setup is designed to handle extreme motion, a VR headset is not, so you're limited to gestures you can do comfortably and safely while wearing a high tech shoebox on your face.

The final alternative is keyframing hand and finger animation manually. Hand animation is invisible when done well, but draws attention to itself when done poorly. Like faces, our brains are so tuned to the behaviour of hands that we spot bad hand animation immediately. To get hand animation comparable to the results I captured, even for relatively simple idle and keepalive animations, would take hours to keyframe. If you require lots of hand animation, that time (and artist cost) adds up quickly.

As a very rough matrix of options, the full list looks like this:

Other thoughts

It was interesting chatting with Jakob Balslev, the CEO of Rokoko. It reminded me of the old adage about the difference between theory and practice: in theory there is no difference, but in practice there is. The basic theory of using IMU and EMF for motion capture makes sense, but the engineering work required to make it happen, to get to manufacture, to hit a price point, is huge. Hardware is way harder to develop than most of us take for granted. Jakob quipped, "we would probably never have started on it if we knew how hard it would be, but now we are glad we did!"

It was also interesting to hear how lessons from each product informed the next, so the gloves are better than the suit, and the Coil is better than both. The tricky part is they're all meant to work together, an interesting balancing act. Rokoko definitely seem to love the work they do, and are constantly refining their products with both software and firmware updates.

Conclusion

As I said at the start, it's good. It solves many of the issues that exist with older or cheaper hand setups, while avoiding the cost of more advanced setups. I was impressed that while all mocap gloves are expected to only track fingers and maybe some wrist rotation, I was able to get some fun and plausible full body mocap with very expressive arm animation. If your mocap needs are largely finger and hand based, and occlusion issues with AI or Quest setups have bothered you, the Smartgloves and Coil are an ideal solution.

Bonus: testing with a full suit at UTS ALA

I'd been chatting with friends at UTS ALA who had the Smartsuit and Smartgloves. As far as we could tell, the studio space is made of magnets, iron filings and Van de Graaff machines; as a result the system never worked as well as they hoped. Alex Weight, the creative lead at ALA, is a very experienced film character animator and director, and found that while the system might have been ok for, say, a VTuber, it wasn't at the level he needed for previs; hands would drift through each other too easily, and legs would kick out at strange angles, no matter how much calibration or wifi adjustment they made.

Rokoko were happy for me to pop down with the Coil and test. Given their previous results the team at ALA were a little sceptical, but after the Coil was set up, the difference was astonishing. Practically no drift, and the worldspace positions of the hands were remarkably clean. We got increasingly ambitious with tests, so holding props, picking up a chair, leaning a hand on a desk all worked exactly as expected. I know that Rokoko are working on a big firmware upgrade for the suit that will improve results with the Coil further still.

Do you have a product (hardware or software) that you'd like to see in befores & afters' On The Box? Send us an email about it.