
Method for Recognizing Expression Considering Fuzzy Based on Optical Flow

1115084

March 5, 2009


Abstract

Method for Recognizing Expression Considering Fuzzy Based on Optical Flow

Masashi Onishi

When humans communicate with each other, the face plays three roles: it is a display that expresses the person's individuality, an indicator that reflects the person's state of mind at that moment, and a messenger that conveys the intention to be transmitted. Its role as a medium that conveys feelings, above all through facial expression, is especially important. For this reason, recognizing facial expression is an important problem for computers, particularly in the field of HCI (human-computer interaction). Recent research has mostly divided a subject's expression into several categories and recognized which category an observed expression belongs to. However, an expression also has a strength, and this strength must be taken into account: a system that can recognize the strength of an expression can serve as a much more natural communication medium. In this research, after describing a method for extracting features for expression recognition and a technique for classifying expressions, a technique for indicating the strength of an expression is proposed. Optical flow is used as the feature. Functions that represent the strength of an expression are estimated, and the strength of the expression is indicated using these functions.

Key words: Strength of Expression, Optical Flow, Fitting


1 Introduction

1.1 Background

1.1.1 Facial expression in face-to-face communication

In face-to-face communication, Mehrabian reported that the words themselves account for only about 7% of the message conveyed, while vocal tone accounts for about 38% and facial expression for about 55% [1]. The face is thus a central medium for conveying feelings [2], and recognizing facial expression has attracted much attention [3].

The face serves not only as a display of the person's individuality but also as an indicator of the person's state of mind and a messenger of intention, and its expressive role has been studied widely [4].

1.1.2 Category-based expression recognition

Much of the existing work on automatic expression recognition divides a subject's expression into several categories and decides which category an observed expression belongs to [2]. Two issues arise in such systems: first, how features are extracted from the face image sequence, and second, how the extracted features are classified into expression categories.

1.1.3 FACS

Ekman and Friesen proposed the Facial Action Coding System (FACS) [5], which describes facial behavior in terms of 46 Action Units (AUs); an observed expression is coded as a combination of AUs. Several studies have recognized expressions by detecting AUs automatically [6][7].

1.2 Purpose of this research

Most previous work classifies an observed expression into one of several categories [2][8][9]. An expression, however, also has a strength, and a system that can estimate how strongly an expression appears can act as a more natural communication medium. This research therefore proposes a technique for representing the strength of an expression, using optical flow as the feature and functions fitted to the observed data to indicate that strength.

1.3 Organization of this thesis

Chapter 2 describes the features used for expression recognition and the functions used to represent the strength of an expression. Chapter 3 describes the experimental setup and the collected expression data. Chapter 4 presents the results of the fitting and of the recognition experiment, Chapter 5 discusses them, and Chapter 6 concludes this thesis.

2 Method

2.1 Features

Optical flow computed between consecutive frames of the facial image sequence is used as the feature for recognizing an expression and its strength (Fig. 2.1).
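The thesis states that optical flow is used as the feature for expression recognition. The following is a minimal sketch of how such a per-frame motion feature could be extracted with OpenCV: dense optical flow between consecutive frames is estimated with Farneback's method and reduced to its mean magnitude. The choice of Farneback's algorithm, the parameter values, and the mean-magnitude reduction are illustrative assumptions, not the exact procedure of this thesis.

```python
import cv2
import numpy as np

def flow_magnitude_series(video_path):
    """Return one motion value per frame transition of a face video.

    Dense optical flow between consecutive grayscale frames is estimated
    with Farneback's method, and each flow field is reduced to the mean
    magnitude of its vectors.
    """
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError("cannot read video: " + video_path)
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    magnitudes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Arguments: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # Reduce the H x W x 2 flow field to a single motion-strength value.
        magnitudes.append(np.linalg.norm(flow, axis=2).mean())
        prev_gray = gray
    cap.release()
    return np.asarray(magnitudes)

# Hypothetical usage: x = flow_magnitude_series("subject_A_expr1.avi")
```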

2.2 Functions representing the strength of expression

From the image sequence, n samples x_i are obtained, each with a corresponding value y_i, giving n data points (x_i, y_i). To represent the strength of an expression, three candidate functions are considered: a sigmoid function (Eq. 2.1), a linear function (Eq. 2.2), and a saturating exponential function that approaches y = 1 (Eq. 2.3).

\[ f(x) = \frac{1}{1 + \exp[-\alpha (x - \beta)]} \tag{2.1} \]

\[ f(x) = ax + b \tag{2.2} \]

\[ f(x) = 1 - \exp[-\gamma x] \tag{2.3} \]

2.3 Least-squares fitting

Given the n data points (x_i, y_i), the parameters of each candidate function, namely alpha and beta in Eq. (2.1), a and b in Eq. (2.2), and gamma in Eq. (2.3), are estimated by the method of least squares [10]: the parameters are chosen so as to minimize the sum of squared errors J defined in Eq. (2.4).

\[ J = \sum_{i=1}^{n} \left[ f(x_i) - y_i \right]^2 \tag{2.4} \]
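As a concrete illustration of the least-squares fitting described in Section 2.3, the sketch below fits the three candidate functions of Eqs. (2.1) to (2.3) to data points (x_i, y_i) with scipy.optimize.curve_fit and reports the error J of Eq. (2.4) for each. The parameter names, initial guesses, and the use of curve_fit are assumptions made for illustration; the thesis does not specify which optimizer was used.

```python
import numpy as np
from scipy.optimize import curve_fit

# Candidate functions for the strength of an expression (Eqs. 2.1-2.3).
def sigmoid(x, alpha, beta):           # Eq. (2.1)
    return 1.0 / (1.0 + np.exp(-alpha * (x - beta)))

def linear(x, a, b):                   # Eq. (2.2)
    return a * x + b

def saturating_exp(x, gamma):          # Eq. (2.3)
    return 1.0 - np.exp(-gamma * x)

def fit_all(x, y):
    """Fit each candidate function by least squares and return (params, J)."""
    candidates = {
        "sigmoid (2.1)": (sigmoid,        [0.05, float(np.median(x))]),
        "linear (2.2)":  (linear,         [0.01, 0.0]),
        "1 - exp (2.3)": (saturating_exp, [0.05]),
    }
    results = {}
    for name, (f, p0) in candidates.items():
        popt, _ = curve_fit(f, x, y, p0=p0, maxfev=10000)
        J = np.sum((f(x, *popt) - y) ** 2)   # Eq. (2.4)
        results[name] = (popt, J)
    return results

# Example with synthetic data (illustrative only).
if __name__ == "__main__":
    x = np.arange(0, 200, dtype=float)
    y = 1.0 - np.exp(-0.04 * x) + np.random.normal(0, 0.05, x.size)
    for name, (popt, J) in fit_all(x, y).items():
        print(name, np.round(popt, 3), round(float(J), 3))
```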

3 Experiment

3.1 Recording environment

Facial image sequences of four expressions were recorded from the subjects. Figures 3.1 and 3.2 show the recording setup: a Victor GZ-MG77 camera (Table 3.2) mounted on a SLIK ABLE 400 DX-LE tripod (Table 3.3).

Table 3.2 Camera (Victor GZ-MG77)

1/3.9-inch 2.18-megapixel CCD (effective 1.23 M for video, 2.00 M for stills), F1.2-2.0 lens, f = 3.8-38 mm, MPEG-2 video with Dolby Digital audio, NTSC, recording to HDD or SD card.

Table 3.3 Tripod (SLIK ABLE 400 DX-LE)

Maximum height 1.55 m, minimum height 0.33 m, weight 2.48 kg.

Table 3.4

1 mm/m = 0.0527, 5.25 mm/m = 0.2999

3.2 Expression strength labels

Each recorded expression was assigned a strength on a six-level scale from 0 to 5 (Table 3.5).

3.3 Processing of the recorded sequences

The procedure applied to the recorded sequences is illustrated in Fig. 3.5.

3.4 Evaluation of the fitting

The number of data points n used for fitting is varied, and each fit is evaluated by the sum of squared errors J defined in Eq. (2.4) (Fig. 3.6). Sections 3.5 and 3.6 describe the remaining details of the experimental procedure.

4 Results

4.1 Fitting results

The number of data points n used for fitting was varied, and the resulting fit quality for each of the four expressions is shown in Tables 4.1 to 4.4; the best n for each expression is summarized in Table 4.5.

Table 4.1 Fit quality versus n (expression 1)

  n     value
  1     0.287
  5     0.489
 10     0.559
 20     0.632
 50     0.686
100     0.704
200     0.706
400     0.650
        0.632

Table 4.2 Fit quality versus n (expression 2)

  n     value
  1     0.303
  5     0.399
 10     0.401
 20     0.438
 50     0.513
100     0.577
200     0.636
400     0.658
        0.657

Table 4.3 Fit quality versus n (expression 3)

  n     value
  1     0.634
  5     0.713
 10     0.727
 20     0.751
 50     0.770
100     0.771
200     0.758
400     0.729
        0.700

Table 4.4 Fit quality versus n (expression 4)

  n     value
  1     0.400
  5     0.438
 10     0.470
 20     0.541
 50     0.584
100     0.576
200     0.529
400     0.474
        0.458

Table 4.5 Best n and corresponding value for each expression

  n     value
200     0.706
100     0.771
400     0.658
 50     0.584

With the best n for each expression, the parameters obtained by fitting each candidate function are shown in Tables 4.6 to 4.9.

Table 4.6 Fitted parameters (expression 1)

Eq. (2.1)   alpha = 0.07, beta = 42.0
Eq. (2.2)   a = 0.02, b = 0.06
Eq. (2.3)   gamma = 0.04

Table 4.7 Fitted parameters (expression 2)

Eq. (2.1)   alpha = 0.06, beta = 23.0
Eq. (2.2)   a = 0.01, b = 0.39
Eq. (2.3)   gamma = 0.06

Table 4.8 Fitted parameters (expression 3)

Eq. (2.1)   alpha = 0.04, beta = 54.0
Eq. (2.2)   a = 0.01, b = 0.22
Eq. (2.3)   gamma = 0.03

Table 4.9 Fitted parameters (expression 4)

Eq. (2.1)   alpha = 0.03, beta = 92.0
Eq. (2.2)   a = 0.01, b = 0.06
Eq. (2.3)   gamma = 0.02

The fitting errors of the three candidate functions for each expression are compared in Table 4.10, and the data points and fitted curves are shown in Figs. 4.1 to 4.12.

Table 4.10 Fitting error of each candidate function

                Eq. (2.1)   Eq. (2.2)   Eq. (2.3)
expression 1      0.134       0.124       0.147
expression 2      0.155       0.157       0.162
expression 3      0.194       0.200       0.243
expression 4      0.111       0.105       0.091

[Figures 4.1 to 4.12: data points and fitted curves for each expression]

The function representing the strength of an expression should take the value 0 before the expression appears and should approach y = 1 as the expression reaches its maximum. Of the three candidates, the saturating exponential of Eq. (2.3) has both properties, and it was therefore adopted as the strength function (Fig. 4.13).

4.2 Recognition experiment

Using the adopted strength function, an expression recognition experiment was carried out with ten subjects. The results are summarized in Tables 4.11 to 4.13.

Table 4.11 Recognition rate for each expression (%)

66.67    40.00    26.67    86.67

Table 4.12 Recognition rate (%)

80.00    70.00    50.00    70.00
20.00    30.00    40.00    20.00
20.00    70.00    90.00   100.00

Table 4.13 Recognition rate for each subject (%)

Subject   expr. 1   expr. 2   expr. 3   expr. 4    Mean
A           66.67     66.67      0.00     66.67    50.00
B           33.33      0.00     33.33    100.00    41.67
C           33.33     66.67     66.67     66.67    58.33
D           66.67    100.00     33.33    100.00    66.67
E           66.67     33.33      0.00    100.00    50.00
F           66.67     33.33      0.00     66.67    41.67
G          100.00      0.00     33.33     66.67    50.00
H           33.33      0.00    100.00    100.00    58.33
I          100.00     33.33      0.00    100.00    58.33
J          100.00     33.33      0.00    100.00    58.33

The individual results for each subject are shown in Figs. A.1 to A.8 in Appendix A.
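Tables 4.11 and 4.13 report recognition rates per expression and per subject. As a sketch of how such tables can be tabulated, the code below assumes the raw outcomes are available as a subjects x expressions x trials array of correct/incorrect flags; this data layout and the variable names are hypothetical and are used only for illustration.

```python
import numpy as np

def recognition_rates(outcomes):
    """Tabulate recognition rates from raw trial outcomes.

    outcomes : bool array of shape (n_subjects, n_expressions, n_trials),
               True where the presented expression was recognized correctly.
    Returns (per_expression, per_subject, per_subject_mean) in percent.
    """
    outcomes = np.asarray(outcomes, dtype=float)
    per_expression = 100.0 * outcomes.mean(axis=(0, 2))   # cf. Table 4.11
    per_subject = 100.0 * outcomes.mean(axis=2)           # cf. Table 4.13 body
    per_subject_mean = per_subject.mean(axis=1)           # cf. Table 4.13 "Mean"
    return per_expression, per_subject, per_subject_mean

# Illustrative use with random data: 10 subjects, 4 expressions, 3 trials each.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    outcomes = rng.random((10, 4, 3)) < 0.6
    expr, subj, subj_mean = recognition_rates(outcomes)
    print(np.round(expr, 2))
    print(np.round(subj_mean, 2))
```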

5 Discussion

Table 4.10 compares the fitting errors of the three candidate functions. Because the strength of an expression should start from 0 before the expression appears and should approach y = 1 as it reaches its maximum, the saturating exponential of Eq. (2.3) was adopted as the function representing expression strength (Fig. 4.13); the difference between the fitted value f(x_i) and the observed value y_i indicates how well the function accounts for each sample. Tables 4.12 and 4.13 show that the recognition rate varies considerably with the condition and between subjects, and the rate for one of the expressions remained as low as 26.67%, so there is still room for improving the proposed representation.

6 Conclusion

In this thesis, a method for representing the strength of a facial expression was proposed. Optical flow was used as the feature, functions representing the strength of an expression were fitted to the data observed for four expressions, and the strength of an expression was indicated using the fitted function. Improving the recognition rate remains as future work.

Acknowledgments

References

[1] Vol. 85, No. 9, pp. 680-685 (2002).
[2] Vol. 85, No. 10, pp. 766-771 (2002).
[3] Vol. 85, No. 12, pp. 936-941 (2002).
[4] Vol. 86, No. 1, pp. 54-61 (2003).
[5] P. Ekman and W. V. Friesen, Pictures of Facial Affect, Human Interaction Laboratory, Univ. of California Medical Center, San Francisco (1976).
[6] G. Donato, M. S. Bartlett, J. C. Hager, P. Ekman, and T. J. Sejnowski, "Classifying Facial Actions," IEEE Trans. PAMI, Vol. 21, No. 10, Oct. 1999.
[7] Y. Tian, T. Kanade, and J. F. Cohn, "Recognizing Action Units for Facial Expression Analysis," IEEE Trans. PAMI, Vol. 23, No. 2, Feb. 2001.
[8] Vol. J81-D-2, No. 6, pp. 1150-1159 (1998).
[9] 20, p. 325.
[10] (2004).

A Appendix

Figures A.1 to A.8 show the individual results for each subject: Figs. A.1, A.3, A.5, and A.7 cover subjects A to E, and Figs. A.2, A.4, A.6, and A.8 cover subjects F to J.