(1)

Real-time Motion Capture Facial Animation

Catarina Runa Miranda

Verónica Costa Orvalho

(2)

Overview

•  Introduction

•  MoCap Fundamental Science

•  Facial MoCap Tracking

•  MoCap Facial Animation

•  MoCap VR Methods

•  Contributions

•  Conclusion


(3)

Introduction

•  Main results

•  Motivation

•  Problem Statement

•  Goal

•  Framework


(4)

VIDEO


(5)

Goal

Research and develop methods for non-expert users to recognize facial movements non-intrusively and map them to a 3D character on-the-fly

(6)

Motivation

•  LIFEisGAME and VERE projects’ goal:

Markerless and Real-time Facial Animation of 3D characters using off-the-shelf hardware


(7)

Problem Statement

Realistic facial animation is labor-intensive and expert-dependent.

Ryse: Son of Rome, Crytek (2013)

(8)

Problem Statement

MoCap facial animation solutions are not suitable for the general user: expensive setups, complex calibrations, and no compatibility with VR environments.

Graham Fyffe et al., Driving High-Resolution Facial Scans with Video Performance Capture (2014)

(9)

Framework


(10)

Framework

Independent solutions for real-time MoCap facial animation

(11)

Define which features need to be tracked and mapped to the 3D character

(12)

MoCap Fundamental Science

•  Face Image Task: Self-perception of Facial Features

•  Real-time Emotion Recognition


(13)

Study 1:

Face Image Task

To understand how individuals perceive their own facial structure by evaluating their knowledge of the position of key facial features

(14)

Face Image Task:

Experiment overview

50 participants indicated the location of key features (red) of their own face relative to an anchor point (green).

(15)

Face Image Task:

Conclusions

•  Humans' spatial perception of their own face, measured over 11 key features, is poor

•  High loadings on the upper face are accompanied by low loadings on the lower face, or vice versa

→ feeds into MoCap Tracker and VR MoCap Methods

[Figure: average horizontal (x) and vertical (y) error]

(16)

Study 2: Real-time Emotion Recognition

Which facial features characterize the six universal emotions + real-time geometric feature extraction and emotion classification

(17)

Real-time Emotion Recognition

Geometrical Features Extraction method

Eccentricity features ∈ [0, 1]
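The slides name the eccentricity features without showing the computation. A minimal sketch of the idea, assuming each feature is the eccentricity of an ellipse fitted to a group of landmarks (mouth, eyes, eyebrows); the landmark coordinates below are hypothetical:

```python
import numpy as np
import cv2

def eccentricity(points: np.ndarray) -> float:
    """Fit an ellipse to 2D landmarks (>= 5 points) and return its
    eccentricity, which lies in [0, 1]: 0 for a circle, approaching 1
    as the ellipse flattens toward a line segment."""
    (cx, cy), (d1, d2), angle = cv2.fitEllipse(points.astype(np.float32))
    major, minor = max(d1, d2), min(d1, d2)
    return float(np.sqrt(1.0 - (minor / major) ** 2))

# Hypothetical mouth landmarks, neutral vs. smiling
mouth_neutral = np.array([[0, 0], [2, 1], [4, 1.2], [6, 1], [8, 0],
                          [6, -1], [4, -1.2], [2, -1]])
mouth_smile = mouth_neutral * [1.2, 0.6]  # wider, flatter -> higher eccentricity
print(eccentricity(mouth_neutral), eccentricity(mouth_smile))
```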

(18)

Real-time Emotion Recognition

Geometrical Features Extraction method

Linear features
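Again as a sketch: assuming the linear features are distances between chosen landmark pairs, normalized by a reference distance so they are invariant to face scale and camera distance. The indices and pairs below are hypothetical:

```python
import numpy as np

def linear_features(lm, pairs, ref_pair):
    """Distances between selected landmark pairs, divided by a reference
    distance (e.g., inter-ocular) for scale invariance."""
    ref = np.linalg.norm(lm[ref_pair[0]] - lm[ref_pair[1]])
    return np.array([np.linalg.norm(lm[i] - lm[j]) / ref for i, j in pairs])

# Hypothetical indices: 0-1 = eye corners (reference), 2 = eyebrow, 3 = upper eyelid
lm = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 6.0], [5.0, 2.0]])
print(linear_features(lm, pairs=[(2, 3)], ref_pair=(0, 1)))  # eyebrow-eyelid gap
```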

(19)

Real-time Emotion Recognition

Results

Comparison with state of the art methods:


(20)

Real-time Emotion Recognition

Conclusions

Our geometric method:

•  Allows real-time feature extraction

•  Recognizes the 6 universal emotions with 94% accuracy

•  Presents higher accuracy than state-of-the-art methods

→ feeds into VR MoCap Methods
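The slides report the accuracy but not the classifier on these pages. A hedged end-to-end sketch on synthetic data — the SVM choice, feature dimensionality, and data are all assumptions, not the thesis' reported setup:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Synthetic stand-in: 10 geometric features (eccentricity + linear) per sample,
# with a different mean per emotion so the toy problem is learnable.
X = np.concatenate([rng.normal(loc=k, scale=1.0, size=(50, 10))
                    for k in range(len(EMOTIONS))])
y = np.repeat(np.arange(len(EMOTIONS)), 50)

clf = SVC(kernel="rbf", C=1.0)
print(cross_val_score(clf, X, y, cv=5).mean())  # accuracy on the synthetic data
```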

(21)

To track unique facial features while reducing user manual intervention

(22)

Facial MoCap Tracking

•  Background

•  Methodology

•  Results

•  Conclusions


(23)

Background

Equipment-based

•   Intrusive

•   Expert dependent

•   Time consuming

•   Offline fine-tuning

Beyond: Two Souls, Quantic Dream (2013)

(24)

Background

Markerless

•  Less intrusive

•  Manual and tedious calibrations

•  Model fitting in each frame limits the facial movements detected

•  Locates only semantic facial features (eyes, mouth, nose, etc.)

•  Not compatible with persistent partial occlusions

Chen Cao et al., Displaced dynamic expression regression (2014)

(25)

Goal

Markerless tracking of unique facial feature movements, such as cheek or forehead movements and asymmetrical movements, using off-the-shelf hardware.

(26)

Hypothesis

To prove that we can use the sensitivity of Optical Flow algorithms to track subtle and unique facial movements

(27)

Method


(28)

Calibration: Stabilization methods

BME: Baseline Movement Estimation
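The slide names BME without details. One possible reading, as a sketch only: during calibration the user holds a neutral face for a short window, and per-landmark jitter is recorded as a movement baseline to discount at runtime. Everything below is an assumption, not the thesis' definition:

```python
import numpy as np

def baseline_movement(neutral_frames):
    """Estimate a per-landmark movement baseline from a short window of
    neutral-face frames: mean position plus jitter (std. deviation),
    usable later as a noise floor when deciding whether a landmark moved."""
    stack = np.stack(neutral_frames)        # shape (T, N, 2)
    return stack.mean(axis=0), stack.std(axis=0)

def moved(landmarks, base_mean, base_std, k=3.0):
    """A landmark counts as moving only if it leaves its baseline band."""
    return np.abs(landmarks - base_mean) > k * base_std
```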

(29)

Calibration: Stabilization methods

Zone-based stabilization loads a hierarchy of facial zones and landmarks that define a certain facial model.

J.M. Saragih et al., Real-time avatar animation from a single image (2011)

(30)

Runtime: Tracking movements

Optical Flow + Zone-based stabilization

1)  Update landmarks: Optical Flow updates each landmark X, constrained by the zone containing X and X's influence area

[Figure: zone limits (black line) and influence ratio (blue)]
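A sketch of this runtime step under stated assumptions: OpenCV's pyramidal Lucas-Kanade stands in for the Optical Flow tracker, and a single clamping radius per landmark stands in for the zone hierarchy and influence areas described on the slides:

```python
import numpy as np
import cv2

def track_step(prev_gray, gray, pts, zone_centers, influence_r):
    """One tracking step: Lucas-Kanade optical flow proposes new landmark
    positions; a zone-based check then clamps any point that drifted
    outside its zone's influence radius back onto the boundary, treating
    flow failures the same way (a simplified failure check)."""
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, pts.reshape(-1, 1, 2).astype(np.float32), None,
        winSize=(21, 21), maxLevel=3)
    new_pts, status = new_pts.reshape(-1, 2), status.ravel()
    for i, p in enumerate(new_pts):
        offset = p - zone_centers[i]
        dist = np.linalg.norm(offset)
        if status[i] == 0 or dist > influence_r[i]:
            new_pts[i] = zone_centers[i] + offset / max(dist, 1e-6) * influence_r[i]
    return new_pts
```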

(31)

Runtime: Zone-based stabilization

2) Failure check: the hierarchy structure is maintained

(32)

Results


(33)

Conclusions

Our method:

•  Allows unsupervised real-time tracking of uncommon facial features, such as cheek movements

•  Performs less accurately than recent state-of-the-art methods under extreme environmental changes or when more than one participant is present

(34)

To automatically transfer the tracked movements to the 3D character, creating facial animation

(35)

MoCap Facial Animation

•  Background

•  Methodology

•  Results

•  Conclusions


(36)

Background

•  Example-based algorithms: Digital Ira


Graham Fyffe et al. Driving High-Resolution Facial Scans with Video Performance Capture (2014)

(37)

Background

•  Example-based algorithms:

Chen Cao et al., Displaced dynamic expression regression (2014)

Long user-dependent calibrations + model learning → blendshape rig → limited facial expressions

(38)

Goal

To create a mapping method that adapts to the user's chosen MoCap tracking algorithm and reduces user-dependent calibration requirements.

(39)

Mapping method


(40)

Mapping method: Calibration

1) Global + local transform between the MoCap tracking space and the 3D character's rig

Different spaces: 2D landmarks live in the MoCap tracking space, 3D landmarks in the 3D character's space
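The slides don't spell the transform out. As a minimal sketch, assuming the global part is a 2D similarity transform estimated from corresponding landmarks during calibration (the local, per-zone refinement is omitted here):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity (scale + rotation + translation) mapping
    tracking-space landmarks onto the character's landmark positions."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    scale = np.sqrt((dst_c ** 2).sum() / (src_c ** 2).sum())  # norm-ratio scale
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)                 # Procrustes rotation
    r = (u @ vt).T
    t = dst.mean(0) - scale * src.mean(0) @ r.T
    return scale, r, t

# Runtime then applies it to every tracked landmark: mapped = scale * pts @ r.T + t
```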

(41)

Mapping method

Calibration

2) Hashtable connecting each 3D landmark to a vertex in the 3D character's mesh (both the 3D landmarks and the mesh vertices live in the 3D character's space)
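A small sketch of this second calibration step, under the assumption that "connection" means nearest-vertex correspondence; the plain dict stands in for the slides' hashtable:

```python
import numpy as np

def landmark_vertex_table(landmarks_3d, mesh_vertices):
    """Link each 3D landmark to the index of its nearest vertex in the
    character's mesh, built once at calibration time."""
    table = {}
    for i, lm in enumerate(landmarks_3d):
        table[i] = int(np.argmin(np.linalg.norm(mesh_vertices - lm, axis=1)))
    return table
```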

(42)

Mapping method: Runtime

From 2D landmarks in the MoCap tracking space to the rig's bones in the 3D character's space (different topology):

+ Apply the global and local transform

+ Geometric mapping between vertices and bones

+ Calculate the intensity of each bone's movements to create the animation

(43)

Mapping method

Runtime: Geometric Mapping

Geometric mapping: correspondence between vertices (in the mesh) and the bones in the 3D character's rig.

For each vertex:

1)  Main-bone translation: the main bone's 2D translation is the associated landmark's movement scaled by the sum of the weights of the bone and its children, added to the main bone's initial position

(44)

Mapping method

Runtime: Geometric Mapping

2)  Secondary-bone translation: each secondary bone's 2D translation is the main bone's translation scaled by the sum of the weights of the secondary bone and its children, added to the secondary bone's initial position
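The slides give the mapping as labeled terms rather than a full formula, so the sketch below encodes one consistent reading of them; the data layout (rest positions, weight lists, bone names) is assumed for illustration:

```python
import numpy as np

def bone_translations(landmark_delta, bones):
    """One reading of the geometric mapping: the main bone moves with the
    associated landmark's displacement scaled by the summed skin weights
    of the bone and its children; each secondary bone then follows the
    main bone's translation, scaled by its own summed weights."""
    main = bones["main"]
    main_t = landmark_delta * sum(main["weights"])      # main-bone 2D translation
    positions = {"main": main["rest"] + main_t}
    for name, sec in bones["secondary"].items():
        positions[name] = sec["rest"] + main_t * sum(sec["weights"])
    return positions

bones = {"main": {"rest": np.zeros(2), "weights": [0.6, 0.3]},
         "secondary": {"jaw_l": {"rest": np.array([1.0, 0.0]), "weights": [0.2]}}}
print(bone_translations(np.array([0.5, -0.2]), bones))
```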

(45)

Mapping method

Runtime: Animation

Calculates the intensity of deformation produced by the translation of each bone.

(46)

Mapping method

Runtime: Animation

Output: geometric mapping

(47)

Mapping method

Runtime: Animation

Rig setup by artist

(48)

Mapping method

Runtime: Animation


(49)

Mapping method

Runtime: Animation

Bone animation = Animation 2 × (0.4·b) + Animation 3 × (0.7·c)
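The slide's example reads as a weighted sum of example animations; a sketch under that assumption, where b and c are the mapping intensities from the previous step and the values below are made up:

```python
import numpy as np

def blend_bone_animation(poses, weights):
    """Weighted blend of per-animation bone translations, in the spirit of
    the slide's example: Animation 2 with weight 0.4*b plus Animation 3
    with weight 0.7*c."""
    return sum(w * np.asarray(p) for p, w in zip(poses, weights))

b, c = 0.5, 0.8  # hypothetical mapping intensities
print(blend_bone_animation([[1.0, 0.0], [0.0, 2.0]], [0.4 * b, 0.7 * c]))
```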

(50)

Results


(51)

Conclusions

Geometric mapping algorithm:

•  Adapts to different MoCap tracking systems

•  Allows real-time animation without complex calibrations

•  Reproduces asymmetrical facial movements

(52)

To create MoCap tracking systems compatible with VR environments

(53)

MoCap VR Methods

•  Background

•  Methodology

•  Results

•  Conclusions


(54)

Background

•  VR scenario: Persistent partial occlusions

Hao Li et al., Facial performance sensing head-mounted display (2015): hardware-based tracking, FACS calibration to blendshape animation

(55)

Goal

To create methods that estimate facial expressions of the upper part of the face and predict emotions using movements tracked from the bottom of the face.

(56)

MoCap VR methods


(57)

MoCap VR methods

Persistent Partial Occlusions


(58)

MoCap VR methods

Assessing Facial Expressions

Geometric features extraction (see the sketch below)
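The slides state the goal (predict upper-face motion from the visible lower face under an HMD) without the model. A hedged sketch on synthetic data, using ridge regression purely as a stand-in for whatever estimator the thesis uses:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic stand-in: 8 geometric features of the visible lower face
# (e.g., mouth eccentricity and normalized distances) per frame, with a
# hidden linear relation to 2D eyebrow displacement plus noise.
lower_face_feats = rng.normal(size=(200, 8))
eyebrow_disp = (lower_face_feats @ rng.normal(size=(8, 2))
                + 0.1 * rng.normal(size=(200, 2)))

# With the upper face occluded by the HMD, the regressor estimates
# eyebrow movement from lower-face features alone.
model = Ridge(alpha=1.0).fit(lower_face_feats, eyebrow_disp)
print(model.predict(lower_face_feats[:1]))
```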

(59)

Statistical Validation



(61)

Results


(62)

Conclusions

MoCap VR methods:

•  Make generic MoCap tracking systems compatible with persistent partial occlusions in VR environments

•  Predict the six universal emotions

•  Estimate eyebrow movements

(63)

Contributions

•  MoCap Facial Animation pipeline: MoCap Tracking, MoCap Mapping, VR MoCap Tracking, Facial Animation

•  Facial Features studies

•  FdMiee Database

Pipeline properties:

•  Real-time

•  Modular

•  Non-intrusive

•  Reduced manual intervention

•  Usable by non-experts

•  Off-the-shelf hardware

(65)

Dissemination

•  2 articles accepted

•  3 articles submitted

•  1 Eurographics course submitted

•  2 best idea/concept awards

•  1 EU Project Workshop

•  5 invited talks


(66)

Conclusion

•  Facial Features studies: face perception, facial behaviors, psychology of emotions

•  MoCap Tracking: Optical Flow to track unique facial traits; biometrics; security

•  MoCap Mapping: adaptive animation algorithms; user-friendly applications

•  VR MoCap Tracking: VR tracking of emotions and facial expressions; hardware-free approach; learning algorithms for expression prediction

(70)

Take-home message

Facial Animation created by anyone for everyone!

Thank you!


(71)

FdMiee’s protocol

Protocol to create facial databases under a wide range of environment and behavior changes

FdMiee database:

•  6 participants

•  3 capture systems

•  6 fixed parameters
