Live Link Face vs MobileCap Usage & Stats

Live Link Face for effortless facial animation in Unreal Engine: capture performances for MetaHuman Animator to achieve the highest-fidelity results, or stream facial animation in real time from your iPhone or iPad for live performances.

Capture facial performances for MetaHuman Animator:
- MetaHuman Animator uses Live Link Face to capture performances on iPhone, then applies its own processing to create high-fidelity facial animation for MetaHumans.
- The Live Link Face iOS app captures raw video and depth data, which is ingested directly from your device into Unreal Engine for use with the MetaHuman plugin.
- Facial animation created with MetaHuman Animator can be applied to any MetaHuman character in just a few clicks.
- This workflow requires an iPhone (12 or above), a desktop PC running Windows 10/11, and the MetaHuman Plugin for Unreal Engine.

Real-time animation for live performances:
- Stream ARKit animation data live to an Unreal Engine instance via Live Link over a network.
- Visualize facial expressions in real time with live rendering in Unreal Engine.
- Drive a 3D preview mesh, optionally overlaid on the video reference on the phone.
- Record the raw ARKit animation data and front-facing video reference footage.
- Tune the capture data to the individual performer and improve facial animation quality with rest-pose calibration.

Timecode support for multi-device synchronization:
- Select from the iPhone system clock, an NTP server, or use a Tentacle Sync to connect with a master clock on stage.
- Video reference is frame-accurate, with embedded timecode for editorial.

Control Live Link Face remotely with OSC or via the MetaHuman Plugin for Unreal Engine:
- Trigger recording externally so actors can focus on their performances.
- Capture slate names and take numbers consistently.
- Extract data for processing and storage.

Browse and manage the captured library of takes:
- Delete takes within Live Link Face and share via AirDrop.
- Transfer directly over the network when using MetaHuman Animator.
- Play back the captured video on the phone.
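The remote-control workflow described above can be sketched in a few lines of Python. The example below hand-assembles an OSC packet with only the standard library and sends it to the phone over UDP. The `/RecordStart` address taking a slate string and an int32 take number, and port 8000 as the app's listen port, are assumptions drawn from Epic's Live Link Face OSC documentation; treat this as a minimal sketch, not a definitive client.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII bytes, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def record_start(slate: str, take: int) -> bytes:
    """Build an OSC message asking Live Link Face to start recording a take."""
    return (
        osc_string("/RecordStart")   # OSC address pattern (assumed from Epic's docs)
        + osc_string(",si")          # type tags: one string, one int32
        + osc_string(slate)          # slate name
        + struct.pack(">i", take)    # take number, big-endian int32
    )

def send(packet: bytes, phone_ip: str, port: int = 8000) -> None:
    """Send the packet to the phone over UDP, OSC's usual transport."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, (phone_ip, port))

# Example: slate "sceneA", take 3 (replace the IP with your phone's address).
# send(record_start("sceneA", 3), "192.168.1.50")
```

A matching `/RecordStop` message would end the take; in practice a library such as python-osc handles this encoding for you, but the hand-rolled version makes the wire format visible.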
  • Apple App Store
  • Free
  • Graphics & Design

Store ranking


Instead of buying an expensive and cumbersome motion capture suit, or tediously animating by hand, record 3D animations with nothing more than your iOS device! Powered by ARKit, MobileCap allows for full-body 14-point tracking in real time from your device's camera. Once you've completed a capture, make refinements with a comprehensive set of export options, preview your adjustments in the app, and then export the capture as an FBX animation file that can be used in all common 3D modeling programs and game engines, or as a CSV spreadsheet full of rotation data.

Use a LiDAR-enabled device such as the iPhone 12 Pro or iPad Pro 2020 or later for more accurate tracking.

Note: Body tracking functionality requires an A12 Bionic processor or newer. This limits it to the iPhone Xs and Xs Max, iPhone XR, iPad mini (5th generation), iPad Air (3rd generation), iPad Pro (2018), and any newer devices. However, all other app functionality can be evaluated on other devices.
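The CSV export mentioned above lends itself to quick post-processing in any 3D pipeline. The sketch below loads per-frame joint rotations from such a file and converts them from degrees to radians; note that the column names (`hip_x` and so on) and the degrees-per-column layout are hypothetical, since MobileCap's exact CSV schema is not documented here.

```python
import csv
import io
import math

def load_rotations(csv_text: str) -> list[dict[str, float]]:
    """Parse per-frame joint rotations (assumed degrees) into radians.

    Assumes a header row of joint channels, e.g. hip_x,hip_y,hip_z,...
    followed by one row of values per captured frame.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {channel: math.radians(float(value)) for channel, value in row.items()}
        for row in reader
    ]

# Tiny illustrative file with a single (hypothetical) hip joint, two frames.
sample = "hip_x,hip_y,hip_z\n0,90,180\n45,0,0\n"
frames = load_rotations(sample)
# frames[0]["hip_y"] is roughly pi/2 (~1.5708)
```

From here the rotation values could be remapped onto a skeleton in a DCC tool or game engine, alongside the FBX route the description mentions.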
  • Apple App Store
  • Free
  • Graphics & Design

Store ranking


Live Link Face vs. MobileCap ranking comparison

Compare the ranking trend of Live Link Face over the last 28 days with that of MobileCap.

Rank chart: no data available

Live Link Face vs. MobileCap ranking comparison by country

Compare the ranking trend of Live Link Face over the last 28 days with that of MobileCap.

All categories

No data available

Graphics & Design

App | Top country | Ranking
- | - | #130
- | - | - -


January 8, 2025