
A study on estimation of ring size using a palm image (Heisei 27 academic year bachelor's thesis) 1160375 February 26, 2016

[Japanese abstract: the extracted text is unrecoverable; it reports the same 90% agreement as the English abstract.] i

Abstract
A study on estimation of ring size using a palm image
Hideki YOSHIKAWA
Recently, surprise marriage proposals are often aired in TV dramas, but in reality it is difficult to find out a girlfriend's ring size without being noticed. In this study, we propose a system that estimates ring size from a palm image taken at a known distance with a smartphone. The input RGB image is converted to grayscale and binarized with a fixed threshold. The palm region is detected as the largest white area. The angle of each finger is then determined by robust line estimation on the finger contours. The image of the finger selected by the user is rotated to be parallel to the Y axis and cropped to a single-finger image. After that, the length of the white run is measured on each horizontal line, and the finger width is taken as the maximum of those lengths. The width in the image is converted into the actual finger width, and finally the ring size is estimated. We confirmed a 90% match between the sizes estimated by our system and the sizes the users preferred.
key words: Ring, Smartphone, Photogrammetry, Robust estimation ii

[Table of contents, list of figures and list of tables (pp. iii-vii): the chapter and section titles were lost in extraction. The surviving structure: Chapter 1 (Secs. 1.1-1.2, pp. 1-2), Chapter 2 (Secs. 2.1-2.2.3, pp. 3-8), Chapter 3 (Secs. 3.1-3.9, pp. 9-23), Chapter 4 (Secs. 4.1-4.5, pp. 24-29), Chapter 5 (p. 30), references, Appendix A (p. 33); Figs. 2.1-2.4, 3.1-3.15, 4.1-4.4 and A.1; Tables 4.1 and 4.2.]

[Chapter 1 (Secs. 1.1 and 1.2, pp. 1-2): the body text was lost in extraction.]

[Chapter 2, Sec. 2.1 (p. 3): the body text was lost in extraction apart from a citation of the ring-size reference [1].]

2.2.1 cites the JIS ring-size standard [2]. 2.2.2: adjacent ring sizes differ by about 0.33 mm in inner diameter, so a measurement precision of roughly 0.11 mm, a third of one size step, is required; one pixel must therefore correspond to at most about 0.11 mm on the palm. The camera's actual focal length is recovered from the 35 mm-equivalent focal length recorded in EXIF [3]: a 50 mm-equivalent lens has a diagonal angle of view of about 46 degrees [4], the full-frame diagonal is 43.27 mm, and an APS-C sensor has roughly 0.65 times that diagonal (so about a 31 mm lens on APS-C corresponds to 50 mm equivalent). [Only these figures survive; the connecting prose was lost in extraction.] 4

[Sec. 2.2.2 continued, pp. 5-6; the prose and Figs. 2.1-2.3 were largely lost in extraction.] With $f_e$ the 35 mm-equivalent focal length, $f$ the actual focal length, $\theta$ the diagonal angle of view, $h$ and $w$ the image height and width, and $s_i$, $s_h$, $s_w$ the diagonal, height and width of the sensor:

$$\tan^{-1}\left(\frac{43.27}{2f_e}\right) = \frac{\theta}{2} \qquad (2.1)$$
$$2f\tan\frac{\theta}{2} = s_i \qquad (2.2)$$
$$\frac{h\,s_i}{\sqrt{h^2+w^2}} = s_h \qquad (2.3)$$
$$\frac{w\,s_i}{\sqrt{h^2+w^2}} = s_w \qquad (2.4)$$

From the pinhole model (Fig. 2.3), a length $P$ on the sensor, the focal length $f$, the actual length $x$ and the shooting distance $L$ satisfy

$$P : f = x : L \qquad (2.5)$$

so the actual length is $x = PL/f$. 6
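As a sketch of this conversion: the helpers below compute the diagonal angle of view from the EXIF 35 mm-equivalent focal length and turn a pixel length into millimetres at a known shooting distance. Only the 43.27 mm full-frame diagonal and the proportion P : f = x : L come from the thesis; the function names and the simplification of working directly along the image diagonal are assumptions.

```python
import math

def diagonal_fov_rad(f_equiv_mm):
    """Diagonal angle of view theta from the 35mm-equivalent focal
    length (Eq. 2.1); 43.27 mm is the full-frame sensor diagonal."""
    return 2 * math.atan(43.27 / (2 * f_equiv_mm))

def pixels_to_mm(length_px, diag_px, f_equiv_mm, distance_mm):
    """Convert a pixel length to millimetres on the object plane at a
    known shooting distance, via the pinhole proportion (Eq. 2.5)."""
    theta = diagonal_fov_rad(f_equiv_mm)
    # Length of the image diagonal projected onto the object plane
    diag_mm = 2 * distance_mm * math.tan(theta / 2)
    return length_px / diag_px * diag_mm
```

A 50 mm-equivalent lens gives a diagonal angle of view of about 46 degrees, matching the figure quoted in Sec. 2.2.2.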

2.2.3: [prose largely lost in extraction] Fig. 2.4 plots the real-world size of one pixel; for an iPhone 5s, one pixel stays below the required 0.11 mm when shooting from about 20-30 cm. A depth camera was also considered as a way to obtain the distance: the Kinect v2 measures depth over 0.5 m to 8.0 m with a resolution on the order of millimetres [5]. [The surviving figures suggest this is too coarse for the 0.11 mm requirement, so the known-distance smartphone photograph was adopted.] 8

[Chapter 3 (pp. 9-10): Sec. 3.1 describes the two stages of the proposed method and Sec. 3.2 presents the overall processing flow in Fig. 3.1; the body text was lost in extraction.] 10

3.3 Wrist removal
The input image for the proposed system should ideally be a photograph in which the whole open hand fits inside the frame and the camera is held parallel to the hand. An example input image is shown in Fig. 3.2.

Fig. 3.2 Example input image

Giving a photograph like Fig. 3.2 as input yields high-accuracy results. However, since the proposed system only needs the palm, the region below the wrist is unnecessary. Depending on the shooting conditions, shirt sleeves or a wristwatch may also be captured and generate a large amount of noise at the later fixed-threshold binarization stage. The bottom quarter of the input image is therefore painted black, removing the parts the system does not need. The resulting image is shown in Fig. 3.3. After this removal, processing moves on to binarization. 11

3.4 Binarization

Fig. 3.3 After wrist removal

To detect candidate hand regions, a threshold is set on the brightness of the grayscaled input image, and pixels at or above it are drawn in white as hand-region candidates.

Before this can be applied, a threshold that detects the hand region properly for a variety of input images must be chosen. To set it, several thresholds are first applied to a single input image and the degree of loss is compared against the original; a threshold with only slight loss and few false detections is adopted for the proposed system. The criterion for an appropriate threshold is that, when the image is inspected by eye, loss is slight and false detections are few. The comparison is shown in Fig. 3.4. As a note on Fig. 3.4, to make the lost parts easy to see, regions detected as hand are shown with the original image and lost parts in pale green. 12

Fig. 3.4 Comparison of loss and noise for different thresholds

The comparison shows that thresholds of about 40-50 give few false detections and comparatively slight loss. At a threshold of 20, dark areas are detected and the detected region swells beyond the hand. This procedure is repeated for each of the images used in the later evaluation experiment, settling on a threshold with few false detections and slight loss. As a result, one of the ten images produced a large number of false detections at a threshold of 40 that could not be eliminated even by the later false-detection removal step, while a threshold of 50 showed a large reduction in false detections; the proposed system therefore binarizes with the threshold set to 50. The results of binarizing with thresholds 40 and 50 are shown in Fig. 3.5. After binarization, processing moves on to detecting the largest white region by labeling. 13
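The wrist removal of Sec. 3.3 and the fixed-threshold binarization of Sec. 3.4 can be sketched in pure Python, with images as nested lists of brightness values; in practice an image library would be used, and the function names here are illustrative.

```python
def mask_wrist(gray, fraction=0.25):
    """Sec. 3.3: paint the bottom quarter of the image black so the
    wrist, sleeves and watches cannot add noise later."""
    h = len(gray)
    cut = h - int(h * fraction)
    return [row[:] if y < cut else [0] * len(row)
            for y, row in enumerate(gray)]

def binarize(gray, threshold=50):
    """Sec. 3.4: pixels at or above the brightness threshold become
    white (255) hand-region candidates; the thesis settled on 50
    after comparing thresholds of 20, 40 and 50."""
    return [[255 if v >= threshold else 0 for v in row] for row in gray]
```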

3.5 [Largest-white-region detection by labeling, p. 14; most prose lost in extraction.]

Fig. 3.5 Binarization with threshold 40 (left) and threshold 50 (right)

The white regions of the binary image are labeled with 8-connectivity, and only the largest region, taken to be the hand, is kept. The result is shown in Fig. 3.6. 14
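The largest-white-region selection of Sec. 3.5 can be sketched with an 8-connected flood fill standing in for a labeling pass; this pure-Python form and the function name are illustrative, not the thesis implementation.

```python
from collections import deque

def largest_white_region(binary):
    """Keep only the largest 8-connected white (255) component, assumed
    to be the hand; smaller white blobs are treated as noise."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] != 255 or seen[sy][sx]:
                continue
            # Breadth-first flood fill collecting one component
            comp, q = [], deque([(sy, sx)])
            seen[sy][sx] = True
            while q:
                y, x = q.popleft()
                comp.append((y, x))
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] == 255 \
                                and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
            if len(comp) > len(best):
                best = comp
    out = [[0] * w for _ in range(h)]
    for y, x in best:
        out[y][x] = 255
    return out
```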

3.6 [Fingertip detection, pp. 15-16; prose partly lost in extraction.] For each contour point $P_2(P_{2x}, P_{2y})$, with $P_1(P_{1x}, P_{1y})$ and $P_3(P_{3x}, P_{3y})$ its neighbours taken along the contour [6][7], the angle at $P_2$ is computed as (Fig. 3.7):

$$C_1 = (P_{1x} - P_{2x},\ P_{1y} - P_{2y}) \qquad (3.1)$$
$$C_2 = (P_{3x} - P_{2x},\ P_{3y} - P_{2y}) \qquad (3.2)$$
$$c_1 = \frac{C_1}{\|C_1\|} \qquad (3.3)$$
$$c_2 = \frac{C_2}{\|C_2\|} \qquad (3.4)$$
$$\theta = \arccos(c_1 \cdot c_2) \qquad (3.5)$$

15

Fig. 3.7 Angle at a contour point from three points

Contour points whose angle θ is below 60 degrees are taken as fingertip candidates, and the sign of the cross product of C₁ and C₂, compared with 0, separates fingertips from the valleys between fingers. The result is shown in Fig. 3.8. [Most of the surrounding prose was lost in extraction.] 16
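Equations (3.1)-(3.5) and the fingertip/valley test can be sketched as follows. The convention that a positive cross product marks a fingertip is an assumption here: which sign marks a tip depends on the contour's winding direction.

```python
import math

def contour_angle(p1, p2, p3):
    """Angle in degrees at contour point p2 between the vectors to its
    neighbours p1 and p3 (Eqs. 3.1-3.5)."""
    c1 = (p1[0] - p2[0], p1[1] - p2[1])
    c2 = (p3[0] - p2[0], p3[1] - p2[1])
    dot = (c1[0] * c2[0] + c1[1] * c2[1]) \
        / (math.hypot(*c1) * math.hypot(*c2))
    # Clamp against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def is_fingertip(p1, p2, p3, max_angle=60.0):
    """Fingertip candidate: angle below 60 degrees and (assumed
    convention) positive cross product of C1 and C2; the opposite
    sign marks a valley between fingers."""
    c1 = (p1[0] - p2[0], p1[1] - p2[1])
    c2 = (p3[0] - p2[0], p3[1] - p2[1])
    cross = c1[0] * c2[1] - c1[1] * c2[0]
    return contour_angle(p1, p2, p3) < max_angle and cross > 0
```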

Fig. 3.8 Detected fingertips and valleys

[Sec. 3.6 continued, p. 17; prose largely lost in extraction. Five fingertips are obtained, and the two valley points adjacent to the finger chosen by the user are identified from their x coordinates.] 17

3.7 [Finger angle estimation, pp. 18-20; most prose lost in extraction.] 3.7.1: lines are fitted to the two sides of the finger contour by robust estimation, following the fingertip-tracking approach of Handy AR [8] (Figs. 3.9, 3.10). 3.7.2: the finger's angle is determined from the fitted lines (Figs. 3.11, 3.12). 20

3.8 [Rotation and width measurement, pp. 21-22; most prose lost in extraction.] Using the angle estimated in Secs. 3.6-3.7, the image of the finger selected by the user is rotated to be parallel to the Y axis and cropped to a single-finger image; each horizontal line of the result is then scanned, and the finger width is taken as the maximum white-run length (Figs. 3.13-3.15). 22
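The width scan, as also described in the abstract (take the length of the white run on each horizontal line of the rotated, cropped finger image, then the maximum over lines), can be sketched as:

```python
def finger_width_px(finger_img):
    """Longest horizontal run of white (255) pixels over all rows of
    the axis-aligned finger image, taken as the width in pixels."""
    best = 0
    for row in finger_img:
        run = 0
        for v in row:
            run = run + 1 if v == 255 else 0
            if run > best:
                best = run
    return best
```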

3.9 [Ring size estimation, p. 23; most prose lost in extraction.] The width in pixels is converted to the actual finger width using the known shooting distance via the pinhole relation (2.5), and the result is mapped to a JIS ring size. 23

Chapter 4: Evaluation (p. 24). 4.1 Measurement accuracy: line segments 1 cm, 2 cm, 3 cm and 4 cm long were photographed from a distance of 20 cm and measured by the system. The results are shown in Table 4.1.

Table 4.1 Measurement accuracy (column labels reconstructed)
  actual (mm)   measured (mm)   error (mm)   error (px)
  10.00         10.32           +0.32        +5
  20.00         20.65           +0.65        +9
  30.00         31.05           +1.05        +14
  40.00         41.31           +1.31        +18

Every measurement overshoots, by roughly 4.5 px per 10 mm. [Surrounding prose partly lost in extraction.] 24

[Sec. 4.1 continued, p. 25.] The measured value is corrected as follows. With $D_m$ the measured length, $p_e$ the predicted pixel error, $s$ the pixel size on the sensor, $L$ the shooting distance, $f$ the focal length, $D_e$ the error converted to actual length and $D_c$ the corrected value (Fig. 4.1):

$$\frac{4.5}{10} D_m = p_e \qquad (4.1)$$
$$p_e\, s\, L = D_e\, f \qquad (4.2)$$
$$D_m - D_e = D_c \qquad (4.3)$$

25
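The correction of Sec. 4.1 (predict the pixel error from the roughly 4.5 px per 10 mm trend, convert it to millimetres on the object plane, and subtract) can be sketched as one function. The parameter names follow the surviving symbols, but since the derivation's prose was lost, treat the exact signature as an assumption.

```python
def corrected_size_mm(d_m, s, L, f):
    """Correct the systematic overshoot of Table 4.1: predict the pixel
    error p_e from the ~4.5 px per 10 mm trend (Eq. 4.1), convert it to
    millimetres with pixel pitch s, shooting distance L and focal
    length f (Eq. 4.2), and subtract it (Eq. 4.3)."""
    p_e = 4.5 / 10.0 * d_m      # Eq. 4.1
    d_e = p_e * s * L / f       # Eq. 4.2
    return d_m - d_e            # Eq. 4.3
```

With s·L/f of about 0.071 mm per pixel, this reproduces Table 4.2: a measured 10.32 mm corrects to roughly 9.99 mm.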

Table 4.2 Corrected measurements
  D (mm)   D_m (mm)   D_e (mm)   D_c (mm)
  10.00    10.32      0.33        9.99
  20.00    20.65      0.66       19.98
  30.00    31.05      1.00       30.04
  40.00    41.31      1.33       40.04

Correcting by the predicted error brings every measurement within about 0.05 mm of the true length. 4.2 [Evaluation-experiment setup; prose lost in extraction apart from: subjects photographed their palms from 20 cm with an iPhone 5s (Fig. 4.3 referenced).] 26

[Secs. 4.3-4.4, p. 27: the body text was lost in extraction; it describes the evaluation with 10 subjects and 4 fingers each, 40 measurements in total, and refers to Table 4.2.] 27

4.5., +2, +3, +4.. 4.3. 4.3 3 ( ),. ( ), ( )., 40 36, 90%. +1,, 3. 4.5, 90%.,,,,.,. 28

[Sec. 4.5 continued, p. 29: Fig. 4.4 shows a failure case off by -4 sizes; the discussion, which mentions a 10:9 versus 10:8 ratio, was lost in extraction.] 29

Chapter 5: Conclusion (p. 30). [Most prose lost in extraction.] The proposed system estimates ring size from a palm image taken at a known distance with a smartphone; the estimated sizes agreed with the subjects' preferred sizes in 90% of cases. Remaining issues include the fixed binarization threshold of 50, which does not suit all shooting conditions, and the evaluation's reliance on a single device, the iPhone 5s. 30

[Acknowledgments, p. 31: text lost in extraction.]

References
[1] KATSUKI, ring size chart, http://www.katsuki21.co.jp/ringsize/, accessed 2016/2/1.
[2] [Japanese reference on ring sizing; author and title lost in extraction], pp. 8, 2012.
[3] [camera reference; title lost in extraction], http://www.antaresdigicame.org/photo gallery/camera/camera105.html, accessed 2016/1/27.
[4] [reference on 35 mm-equivalent focal length; title lost in extraction], http://diji1.ehoh.net/contents/35mm.html, accessed 2016/2/26.
[5] [book on the KINECT for Windows SDK, Kinect for Windows v2; authors and title lost in extraction], pp. 64, 2015.
[6] [thesis involving an HMD; details lost in extraction], 2011.
[7] Atagan memo, http://ataganmemo.blogspot.jp/2012/08/blog-post 31.html, accessed 2016/2/5.
[8] Taehee Lee, et al., "Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking," IEEE ISWC 2007, 2007.
32

Appendix A (Fig. A.1, p. 33): [content lost in extraction.]