Activity and State Recognition without Body-Attached Sensor Using Microwave Doppler Sensor

Masatoshi Sekine,1 Kurato Maeno1 and Masanori Nozaki1

To spread context-aware services, it is necessary to develop technology that automatically recognizes users' activities and states and their surroundings. A Doppler sensor is one of the important sensors used to recognize them. Doppler sensors are arranged in the environment, and they can detect motions without being attached to the user's body, unlike acceleration sensors. In this paper, we automatically recognize several activities and states using a microwave Doppler sensor. In our method, feature quantities are extracted from the sensed data, and the user's activities and states are then recognized by a support vector machine (SVM), a machine learning algorithm. In addition, we propose a technique for recognizing approaching and separating states using two signals that are obtained from one sensor and differ in phase. In evaluation experiments, the average recognition success rates of our methods exceeded 90 percent.

1. Introduction

[Fig. 1: Principle of the Doppler sensor: the transmitted wave is reflected by the observation target, and a signal is output according to the frequency difference between the transmitted and reflected waves.1)]

1 Corporate R&D Center, Oki Electric Industry Co., Ltd.

(c) 2009 Information Processing Society of Japan
2. Related Work

Activity recognition from acceleration data using SVM has been studied 1), and gesture recognition has been surveyed 2),3). Motion can also be sensed without body-attached sensors, for example through spectral and spatial analysis of WLAN RSSI 4), and Doppler-based approaches, including one-handed gesture recognition with ultrasonic Doppler sonar, have been reported 5)-7).

3. Activity and State Recognition Method

3.1 System Configuration

The Doppler sensor used in this work is the K-band MIC module NJR4261JB0916 8), which transmits a 24.11 GHz microwave. Its output is amplified by a factor of up to 1000, and the resulting voltage signal lies in the range 0 V to 5 V.
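The output frequency scales with the target's radial velocity through the standard Doppler relation; the equation itself does not survive in the extracted text, so the worked numbers below are ours, for the 24.11 GHz carrier:

```latex
f_d = \frac{2 v f_0}{c}, \qquad
f_d \approx \frac{2 \times 1\,\mathrm{m/s} \times 24.11 \times 10^{9}\,\mathrm{Hz}}
             {3 \times 10^{8}\,\mathrm{m/s}} \approx 160.7\,\mathrm{Hz}.
```

A target moving at 1 m/s thus produces a beat of roughly 161 Hz, so frequency components up to 500 Hz correspond to radial velocities of about 3.1 m/s.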
[Fig. 2: System configuration: a Doppler sensor node (Doppler sensor, amplifier with gain up to 1000, low-pass filter with cutoff frequency of about 1 kHz, and A/D converter (USB data logger)) connected to a desktop PC running the data logger software, feature extraction, and a machine learning engine that outputs the recognition result.]

[Figs. 3, 4: Photographs of the sensor node and the NJR4261J module.]

The voltage signal is sampled at 10 kHz, and each analysis window contains 16384 samples (1.6384 s). The 0 V to 5 V signal is digitized with 16-bit resolution and transferred to the PC over USB. The spectrum of each window is computed by the fast Fourier transform (FFT) and divided into ten frequency bands: 0-5 Hz, 5-10 Hz, 10-15 Hz, 15-20 Hz, 20-50 Hz, 50-75 Hz, 75-100 Hz, 100-150 Hz, 150-200 Hz, and 200-500 Hz. The spectral energy in each band gives one feature, for a total of ten features per window. For classification we use the support vector machine (SVM) implementation in Weka (version 3.7.0) 9).

[Figs. 5-8: Example voltage waveforms of the sensed signals (time axis 0 to 1.6 s).]
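The band-energy feature extraction described above can be sketched as follows; summing one-sided FFT magnitudes per band is our assumption, since the exact aggregation used in the paper is not recoverable from the extracted text.

```python
import numpy as np

FS = 10_000          # sampling rate [Hz], as in the paper
N = 16_384           # samples per analysis window (1.6384 s), as in the paper

# The ten frequency bands used as features [Hz].
BANDS = [(0, 5), (5, 10), (10, 15), (15, 20), (20, 50),
         (50, 75), (75, 100), (100, 150), (150, 200), (200, 500)]

def band_power_features(signal):
    """Return the 10-dimensional feature vector: summed one-sided
    FFT magnitude in each frequency band."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in BANDS])

# Example: a synthetic 60 Hz component concentrates in the 50-75 Hz band.
t = np.arange(N) / FS
x = np.sin(2 * np.pi * 60 * t)
feats = band_power_features(x)
```

For real data, `x` would be one 1.6384 s window of the logged voltage signal.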
[Figs. 9-16: Example voltage waveforms of the sensed signals for each activity and state (time axis 0 to 1.6 s).]

3.2 Recognition Experiment

We evaluated the recognition of the activities and states using the ten spectral features and the SVM classifier. The average recognition success rate was 92.1 percent (Table 1); the lowest per-class rate was 77 percent.

3.3 Discussion

[Figs. 21-25: Spectra of the sensed signals for the evaluated activities and states.]
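The paper trains its SVM in Weka 3.7.0; as a rough stand-in, the same pipeline can be sketched with scikit-learn (our substitution, using synthetic placeholder features rather than real sensor data):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for the 10 band-energy features of two activities.
X = rng.normal(size=(40, 10))
y = np.repeat([0, 1], 20)
X[y == 1, 0] += 3.0          # pretend class 1 has more low-frequency energy

# Standardize the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
train_acc = clf.score(X, y)
```

In practice, the feature vectors from successive 1.6384 s windows would replace `X`, with one activity/state label per window.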
[Figs. 17, 18: Example voltage waveforms (time axis 0 to 1.6 s).]

[Table 1: Per-class recognition results; success rates range from 77.0 percent to 100 percent, with an average of 92.1 percent.]

[Figs. 19-25: Spectra of the sensed signals; the panels show components from the 0-5 Hz band up to the 150-200 Hz band.]

4. Recognition of Approaching and Separating States

4.1 Principle
The Doppler sensor provides two output signals whose phases differ by 90 degrees, and the sign of this phase difference reverses depending on whether the target is approaching or separating from the sensor.

4.2 Decision Method

Fig. 26 shows the two output voltage signals v1(t) and v2(t) in the approaching case, where one signal leads the other by 90 degrees. Let ti (i = 1, 2, ...) denote the times at which v1(t) and v2(t) cross, and let Vcross(ti) denote the voltage at each crossing point. For consecutive crossings ti and ti+1, comparing v1(ti+1) with v2(ti), or equivalently Vcross(ti+1) with Vcross(ti), shows whether the crossing voltage has risen or fallen. In the approaching case of Fig. 26, the crossing voltage changes in one direction from t1 to t2, from t3 to t4, and from t5 to t6, and in the opposite direction from t2 to t3 and from t4 to t5.

[Figs. 26, 27: The two 90-degree phase-shifted output signals and their crossing points in the approaching and separating cases.]
In the separating case (Figs. 27, 28), the phase relationship between v1(t) and v2(t) is reversed, so the alternating pattern of changes in Vcross between consecutive crossings tj and tj+1 is also reversed. The state is therefore decided once every unit time Tunit from the sequence of crossing voltages: for each pair of consecutive crossings tk and tk+1, the difference Vcross(tk+1) - Vcross(tk) is compared with a threshold Vth (> 0), and the pattern of differences exceeding the threshold determines whether the motion is approaching or separating.

[Fig. 29: Flowchart of the decision procedure: within each unit time Tunit, the crossing points of v1(t) and v2(t) are extracted, each difference Vcross(tk+1) - Vcross(tk) is tested against Vth, and the accumulated results yield the approaching or separating decision.]
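The crossing-point bookkeeping of the paper is only partially recoverable here, but for a quadrature pair the same decision can be sketched via the slope of the unwrapped phase of (v1, v2); which slope sign corresponds to "approaching" depends on the sensor wiring and is an assumption in this sketch.

```python
import numpy as np

def direction(v1, v2):
    """Classify motion direction from two Doppler outputs with a
    90-degree phase difference: the slope of the unwrapped phase of
    the quadrature pair gives the sign of the target's velocity."""
    phase = np.unwrap(np.arctan2(v2 - v2.mean(), v1 - v1.mean()))
    return "approaching" if phase[-1] > phase[0] else "separating"

# Example: a 40 Hz Doppler tone on a 2.5 V offset; v2 lags v1 by 90 degrees.
fs, f = 10_000, 40.0
t = np.arange(4096) / fs
v1 = 2.5 + np.sin(2 * np.pi * f * t)
v2 = 2.5 + np.sin(2 * np.pi * f * t - np.pi / 2)
result = direction(v1, v2)   # swapping v1 and v2 reverses the decision
```

Unlike the paper's crossing-point test, this formulation uses every sample, but both rely on the same 90-degree phase relationship between the two outputs.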
Table 2: Recognition results for approaching and separating states.

  condition   total decisions   correct   rate (%)
  ( )              12701          7849      61.8
  ( )              13257          8308      62.7
  ( )               1119          1119     100
  ( )               1563          1563     100

4.3 Evaluation Experiment

We evaluated the decision method of Section 4.2 with approaching and separating trials. The signals were sampled at 10 kHz, and the decision was made once every unit time Tunit with a fixed threshold Vth.

4.4 Results

As shown in Table 2, the success rates were 61.8 and 62.7 percent under the first two conditions and 100 percent under the other two.

5. Conclusion

We proposed a method that recognizes a user's activities and states from microwave Doppler sensor data using spectral features and an SVM, and a method that recognizes approaching and separating states from two phase-shifted output signals, and evaluated both experimentally.

References

1) He, Z.-Y. and Jin, L.-W.: Activity recognition from acceleration data using AR model representation and SVM, Proc. of International Conference on Machine Learning and Cybernetics, pp.2245-2250 (July 2008).
2) SPECIAL, CQ (Oct. 2006) (in Japanese).
3) Mitra, S. and Acharya, T.: Gesture Recognition: A Survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, Vol.37, No.3, pp.311-324 (May 2007).
4) Muthukrishnan, K., Lijding, M.E.M., Meratnia, N. and Havinga, P.J.M.: Sensing motion using spectral and spatial analysis of WLAN RSSI, Proc. of the 2nd European Conference on Smart Sensing and Context (EuroSSC 2007) (Oct. 2007).
5) Proc. of the 70th National Convention of IPSJ, 2ZD-4, pp.3-193 to 3-194 (March 2008) (in Japanese).
6) Kalgaonkar, K. and Raj, B.: One-handed gesture recognition using ultrasonic Doppler sonar, Proc. of IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (April 2009).
7) (Jun. 2009) (in Japanese).
8) New Japan Radio Co., Ltd.: http://mc.njr.co.jp/jpn/technical/sensor_3.html
9) Weka 3 - Data Mining with Open Source Machine Learning Software in Java, http://www.cs.waikato.ac.nz/ml/weka/