Live Link Face vs. MobileCap: Usage and Statistics

Live Link Face for effortless facial animation in Unreal Engine — capture performances for MetaHuman Animator to achieve the highest-fidelity results, or stream facial animation in real time from your iPhone or iPad for live performances.

Capture facial performances for MetaHuman Animator:
- MetaHuman Animator uses Live Link Face to capture performances on iPhone, then applies its own processing to create high-fidelity facial animation for MetaHumans.
- The Live Link Face iOS app captures raw video and depth data, which is ingested directly from your device into Unreal Engine for use with the MetaHuman plugin.
- Facial animation created with MetaHuman Animator can be applied to any MetaHuman character in just a few clicks.
- This workflow requires an iPhone (12 or later) and a desktop PC running Windows 10/11, as well as the MetaHuman Plugin for Unreal Engine.

Real-time animation for live performances:
- Stream ARKit animation data live to an Unreal Engine instance via Live Link over a network.
- Visualize facial expressions in real time with live rendering in Unreal Engine.
- Drive a 3D preview mesh, optionally overlaid on the video reference on the phone.
- Record the raw ARKit animation data and front-facing video reference footage.
- Tune the capture data to the individual performer and improve facial animation quality with rest-pose calibration.

Timecode support for multi-device synchronization:
- Select from the iPhone system clock, an NTP server, or use a Tentacle Sync to connect with a master clock on stage.
- Video reference is frame-accurate, with embedded timecode for editorial.

Control Live Link Face remotely with OSC or via the MetaHuman Plugin for Unreal Engine:
- Trigger recording externally so actors can focus on their performances.
- Capture slate names and take numbers consistently.
- Extract data for processing and storage.

Browse and manage the captured library of takes:
- Delete takes within Live Link Face, or share them via AirDrop.
- Transfer takes directly over the network when using MetaHuman Animator.
- Play back captured video on the phone.
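The OSC remote-control workflow above can be sketched in a few lines of Python. This is a minimal stdlib-only example, assuming the `/RecordStart` (slate, take) address documented for Live Link Face's OSC interface; the phone IP and port below are placeholders for your own setup, not values from this page.

```python
# Minimal OSC 1.0 message encoder (stdlib only) for remote-triggering a take
# in Live Link Face. The /RecordStart address and (slate, take) argument order
# follow Epic's Live Link Face OSC documentation; verify against your version.
import socket
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad bytes to the 4-byte boundary OSC requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message: padded address, type-tag string, then arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)      # big-endian int32
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode("ascii"))   # padded OSC string
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _pad(address.encode("ascii")) + _pad(tags.encode("ascii")) + payload

def trigger_take(phone_ip: str, slate: str, take: int, port: int = 8000) -> None:
    """Send /RecordStart over UDP to the app's OSC listener."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message("/RecordStart", slate, take), (phone_ip, port))

# Usage (placeholder address): trigger_take("192.168.1.50", "SceneA", 3)
```

Because OSC rides on plain UDP, the same encoder can drive any OSC-controllable device on the capture network, which is what makes external take-triggering practical on a live stage.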
  • Apple App Store
  • Free
  • Graphics & Design

Store Ranking

- -

Instead of buying an expensive and cumbersome motion capture suit, or tediously animating by hand, record 3D animations with nothing more than your iOS device! Powered by ARKit, MobileCap allows for full body 14-point tracking in real time from your device's camera. Once you've completed a capture, make refinements with a comprehensive set of export options, preview your adjustments in the app, and then export the capture as an FBX animation file that can be used in all common 3D modeling programs and game engines, or as a CSV spreadsheet full of rotation data. Use a LiDAR enabled device such as the iPhone 12 Pro or iPad Pro 2020 or later for more accurate tracking. Note: Body tracking functionality requires an A12 Bionic processor. This is limited to the iPhone Xs and Xs Max, iPhone XR, iPad mini 5th generation, iPad Air 3rd generation, iPad Pro (2018), and any newer devices. However, all other app functionality can be evaluated on other devices.
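MobileCap's CSV export of rotation data could be consumed as sketched below. This is a hypothetical illustration: the column names (`frame`, `joint`, `rx`, `ry`, `rz`) are assumptions for the sake of example, since the app's actual CSV schema is not specified on this page.

```python
# Hypothetical sketch of grouping per-joint Euler rotations from a
# rotation-data CSV such as MobileCap exports. The column names used here
# (frame, joint, rx, ry, rz) are assumed; check the app's real schema.
import csv
import io
from collections import defaultdict

def load_rotations(csv_text: str) -> dict:
    """Return {joint: [(frame, (rx, ry, rz)), ...]} with frames sorted."""
    tracks = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        angles = (float(row["rx"]), float(row["ry"]), float(row["rz"]))
        tracks[row["joint"]].append((int(row["frame"]), angles))
    for frames in tracks.values():
        frames.sort()
    return dict(tracks)

sample = """frame,joint,rx,ry,rz
0,hip,0.0,90.0,0.0
1,hip,1.5,89.0,0.0
0,knee_l,10.0,0.0,5.0
"""
# load_rotations(sample)["hip"] -> [(0, (0.0, 90.0, 0.0)), (1, (1.5, 89.0, 0.0))]
```

Grouping by joint and sorting by frame gives per-bone animation curves, which is the shape most retargeting or plotting tools expect.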
  • Apple App Store
  • Free
  • Graphics & Design

Store Ranking

- -

Live Link Face vs. MobileCap: Ranking Comparison

Comparison of ranking trends for Live Link Face and MobileCap over the past 28 days

Rank

No data available

Live Link Face vs. MobileCap Rankings by Country/Region

Comparison of ranking trends for Live Link Face and MobileCap over the past 28 days

All categories

No data available

Graphics & Design

App
Top country/region
Rank
#130
- -
- -

Live Link Face vs. MobileCap

January 8, 2025