Information Science Special Experiments I -- Discriminant Analysis Using Tensorflow --


1 (Title slide) Discriminant Analysis Using Tensorflow 1 / 39

2 Installing Tensorflow I
Tensorflow is used from Python here (Python itself is installed via Anaconda). Installation steps:
1 Open the Anaconda Prompt.
2 From the Anaconda Prompt, create a dedicated environment:
(base) C:\Users\komori>conda create -n tensorflow python=3.5
3 Activate the tensorflow environment:
(base) C:\Users\komori>activate tensorflow
4 Install Tensorflow with pip (pip = "pip installs Packages"):
(base) C:\Users\komori>pip install --ignore-installed --upgrade tensorflow cp35-cp35m-win_amd64.whl
5 Start IDLE:
(base) C:\Users\komori>idle
2 / 39
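Once the install finishes, a quick smoke test can be run from IDLE or the python prompt. This is only a minimal sketch, assuming a Tensorflow 1.x wheel was installed as above:

import tensorflow as tf

print(tf.__version__)             # should report a 1.x version for this tutorial
hello = tf.constant("Hello, TensorFlow")
with tf.Session() as sess:        # TF 1.x API: graphs are executed inside a Session
    print(sess.run(hello))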

3 What is Tensorflow I
Tensorflow is a machine-learning library developed by Google. Its core is written in C++ and it is used here through its Python API; computations are expressed as data (tensors) flowing through a graph, which is where the name comes from. 3 / 39

4 What is Tensorflow II (figure) 4 / 39

5 Tensor I
A tensor is a multidimensional array; Tensorflow is named after tensors flowing through a graph. NumPy examples:
>>> import numpy as np                    # NumPy array library for Python
>>> M1 = np.array([1,2,3])                # 1-D array (vector)
>>> M1
array([1, 2, 3])
>>> M1.ndim                               # number of dimensions of M1
1
>>> M1.shape                              # shape of M1
(3,)
>>> M2 = np.array([[1,2,3],[4,5,6]])      # 2-D array (matrix)
>>> M2
array([[1, 2, 3],
       [4, 5, 6]])
>>> M2.ndim                               # number of dimensions of M2
2
>>> M2.shape                              # shape of M2
(2, 3)
5 / 39

6 Tensor II
A 3-D tensor (an array of matrices):
>>> M3 = np.array([[[1,2,3],[4,5,6]],[[7,8,9],[10,11,12]],[[1,1,1],[1,1,1]],[[2,2,2],[2,2,2]]])
>>> M3
array([[[ 1,  2,  3],
        [ 4,  5,  6]],
       [[ 7,  8,  9],
        [10, 11, 12]],
       [[ 1,  1,  1],
        [ 1,  1,  1]],
       [[ 2,  2,  2],
        [ 2,  2,  2]]])
>>> M3.ndim                               # number of dimensions of M3
3
>>> M3.shape                              # shape of M3
(4, 2, 3)
6 / 39

7 Tensor III (figure) 7 / 39

8 Programming in Tensorflow I
A Tensorflow program has two phases:
1 Build a dataflow graph.
2 Run the graph in a Tensorflow session.
node: an operation (op) that produces tensors
edge: carries tensors between operations
8 / 39
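As an illustration of the two phases, here is a minimal sketch (Tensorflow 1.x assumed): the first lines only build the graph, and nothing is computed until the session runs it.

import tensorflow as tf

x = tf.constant(2.0)           # phase 1: define ops and tensors (no computation yet)
y = x * 3.0 + 1.0
with tf.Session() as sess:     # phase 2: the session executes the graph
    print(sess.run(y))         # 7.0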

9 Programming in Tensorflow II
Kinds of tensors (everything in the graph is a tensor!):
constant: a tensor with a fixed value
Variable: a stateful tensor (trainable or non-trainable); a Variable w must be initialized with sess.run(w.initializer) or sess.run(tf.global_variables_initializer())
placeholder: a tensor whose value is supplied when the session runs, via feed_dict={ }
Note: the older tf.initialize_all_variables was deprecated in 2017.
9 / 39
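The three kinds of tensors can be combined in one small sketch (illustrative values, Tensorflow 1.x assumed):

import tensorflow as tf

c = tf.constant(1.0)                 # constant: fixed value
v = tf.Variable(2.0)                 # Variable: stateful, must be initialized
p = tf.placeholder(tf.float32)       # placeholder: value supplied at run time
out = c + v * p
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())   # initializes v
    print(sess.run(out, feed_dict={p: 3.0}))       # 7.0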

10 Running Tensorflow I
Activate the tensorflow environment before using Tensorflow:
C:\Users\komori\Desktop>activate tensorflow
After activation the prompt shows (tensorflow). Python can then be started directly:
(tensorflow) C:\Users\komori\Desktop>python
or through IDLE (the integrated development environment):
(tensorflow) C:\Users\komori\Desktop>idle
A script a.py can also be run from the prompt with: python a.py
10 / 39

11 Session I
A session executes the operations in the graph and evaluates tensors.
>>> import tensorflow as tf       # import tensorflow as tf
# Build a graph.
>>> a = tf.constant(5.0)
>>> b = tf.constant(6.0)
>>> c = tf.constant(7.0)
>>> d = a * b + c
# Launch the graph in a session.
>>> sess = tf.Session()
# Evaluate the tensor d.
>>> print(sess.run(d))
37.0
>>> sess.close()                  # close the session
>>> print(d)                      # d itself is only a node (tensor), not a value
Tensor("mul:0", shape=(), dtype=float32)
11 / 39

12 Visualization with Tensorboard I
>>> import tensorflow as tf                # import tensorflow as tf
# Build a graph.
>>> a = tf.constant(5.0, name="a")
>>> b = tf.constant(6.0, name="b")
>>> c = tf.constant(7.0, name="c")
>>> d = a * b + c
>>> tf.summary.scalar("d", d)              # record d as a scalar summary
# Launch the graph in a session.
>>> sess = tf.Session()
>>> writer = tf.summary.FileWriter("./log", sess.graph)   # write the graph to the log directory
>>> sess.close()                           # close the session
>>> writer.close()                         # close the writer, then start tensorboard
12 / 39

13 Visualization with Tensorboard II
Start tensorboard from the activated environment:
(tensorflow) C:\Users\komori\Desktop>tensorboard --logdir=./log
To stop it, look up its process ID (PID) and kill the process:
C:\Users\komori\Desktop>tasklist              # find the PID, e.g. 2528
C:\Users\komori\Desktop>taskkill /F /PID 2528
C:\Users\komori\Desktop>help taskkill         # help for taskkill
13 / 39

14 Iris Data I
Install pandas:
C:\Users\komori\Desktop>pip install pandas
pip: "Pip installs Packages" / "Pip installs Python"
pandas: from "panel data" (x_it, i = 1,...,n, t = 1,...,T)
Read the iris data:
>>> import pandas as pd
>>> iris = pd.read_csv("iris100.csv")
>>> iris.iloc[0:3,]               # iloc = integer location
   Sepal.Length  Sepal.Width  Petal.Length  Petal.Width Species
1                                                        setosa
2                                                        setosa
3                                                        setosa
(numeric values omitted)
14 / 39
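As a small follow-up sketch (assuming iris100.csv has the five columns shown above), the feature columns and the species labels can be separated like this; the same iloc selection reappears in logistic.py later.

import pandas as pd

iris = pd.read_csv("iris100.csv")
features = iris.iloc[:, 0:4]        # the four numeric measurement columns
species = iris["Species"]           # the class label column
print(features.shape)               # (100, 4)
print(species.unique())             # the species present in the file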

15 Iris Data II
Inspect the iris DataFrame (index, columns, shape):
>>> iris.index
Int64Index([  1,   2,   3,   4,   5,   6,   7,   8,   9,  10,  11,  12,
             13,  14,  15,  16,  17,  18,  19,  20,  21,  22,  23,  24,
             25,  26,  27,  28,  29,  30,  31,  32,  33,  34,  35,  36,
             37,  38,  39,  40,  41,  42,  43,  44,  45,  46,  47,  48,
             49,  50,  51,  52,  53,  54,  55,  56,  57,  58,  59,  60,
             61,  62,  63,  64,  65,  66,  67,  68,  69,  70,  71,  72,
             73,  74,  75,  76,  77,  78,  79,  80,  81,  82,  83,  84,
             85,  86,  87,  88,  89,  90,  91,  92,  93,  94,  95,  96,
             97,  98,  99, 100],
           dtype='int64')
>>> iris.columns
Index(['Sepal.Length', 'Sepal.Width', 'Petal.Length', 'Petal.Width',
       'Species'], dtype='object')
>>> iris.shape
(100, 5)
15 / 39

16 Iris Data III
Install matplotlib (as with pandas):
C:\Users\komori\Desktop>pip install matplotlib
Scatter plot of the iris data:
>>> import matplotlib.pyplot as plt      # import the pyplot module of matplotlib as plt
>>> plt.scatter(iris.iloc[0:50,0], iris.iloc[0:50,1])      # rows 1-50
>>> plt.scatter(iris.iloc[49:100,0], iris.iloc[49:100,1])  # rows 50-100
>>> plt.xlabel(iris.columns[0])
>>> plt.ylabel(iris.columns[1])
>>> plt.show()
16 / 39

17 Iris Data (scatter plot) I (figure: Sepal.Length vs. Sepal.Width) 17 / 39

18 tf.matmul I
Matrix multiplication with matmul:
>>> import tensorflow as tf
>>> import numpy as np
>>> X = np.array([[1,2,3,4],[5,6,7,8]])
>>> W = np.array([1,1,-1,-1])     # W: 1 x 4
>>> W.shape = (4,1)               # reshape W to 4 x 1
>>> A = tf.matmul(X,W)
>>> with tf.Session() as sess:    # note the colon
        print(sess.run(A))
[[-4]
 [-4]]
The with block closes the session automatically (no explicit sess.close() needed).
18 / 39

19 Updating a Variable I
tf.assign, update.py:
import tensorflow as tf
a = tf.Variable(0, name="a")
b = tf.assign(a, a+1)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize all tf.Variable objects
    print(sess.run(b))
    for _ in range(3):            # "_" is used when the loop variable itself is not needed
        print(sess.run(b))
tf.assign(ref, value): Update ref by assigning value to it.
19 / 39

20 Fetching I
sess.run() can fetch several tensors at once, fetch.py:
import tensorflow as tf
a = tf.constant(1)   # a constant in the default graph
#a = 1               # a plain Python value would also work here
b = 2
c = 3
d = a + b
e = a * b + c
with tf.Session() as sess:
    res1, res2 = sess.run([d, e])   # fetch d and e together
    print(res1, res2)
fetch: pass a list [ ] of tensors to sess.run().
20 / 39

21 Graphs I
graph.py:
import tensorflow as tf
g = tf.Graph()                    # create a new graph g
with g.as_default():              # Define operations and tensors in g.
    a = tf.constant(1)            # tensor a lives in graph g
    b = 2
    c = 3
    d = a + b
    e = a * b + c
A = tf.constant(30)               # tensor A lives in the default graph
with tf.Session(graph=g) as sess:
    print(sess.run([d, e]))
with tf.Session(graph=tf.get_default_graph()) as sess:
    print(sess.run(A))
The default graph is obtained with tf.get_default_graph().
21 / 39

22 placeholder I
placeholder.py:
import tensorflow as tf
import numpy as np
x = tf.placeholder(tf.int32, shape=(3, 3))   # value supplied at run time
y = tf.matmul(x, x) + 1
with tf.Session() as sess:
    A = np.random.randint(10, size=(3, 3))   # random 3x3 integer matrix
    print(A)
    print(sess.run(y, feed_dict={x: A}))     # feed A into the placeholder x
Related NumPy generators: np.random.randint(), np.random.rand(), np.random.normal(), np.random.binomial().
22 / 39
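For reference, a quick sketch of the NumPy random generators listed above (the arguments here are only illustrative):

import numpy as np

print(np.random.randint(10, size=(2, 2)))       # integers in [0, 10)
print(np.random.rand(2, 2))                     # uniform on [0, 1)
print(np.random.normal(0.0, 1.0, size=3))       # normal with mean 0, sd 1
print(np.random.binomial(n=10, p=0.5, size=3))  # binomial counts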

23 tf.reduce_sum I
reduce.py:
import tensorflow as tf
x = tf.constant([[1, 1, 1], [1, 1, 1]])
a1 = tf.reduce_sum(x)
a2 = tf.reduce_sum(x, 0)
a3 = tf.reduce_sum(x, 1)
a4 = tf.reduce_sum(x, [0, 1])
with tf.Session() as sess:
    x, a1, a2, a3, a4 = sess.run([x, a1, a2, a3, a4])
    print("x=%s, a1=%s, a2=%s, a3=%s, a4=%s" % (x, a1, a2, a3, a4))
Output: x=[[1 1 1] [1 1 1]], a1=6, a2=[2 2 2], a3=[3 3], a4=6
The second argument of reduce_sum is the axis to sum over. %s formats a value as a string (cf. %d for integers). cf. tf.reduce_prod.
23 / 39
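A companion sketch for tf.reduce_prod, mentioned at the end of the slide (same pattern as reduce.py, Tensorflow 1.x assumed):

import tensorflow as tf

x = tf.constant([[1, 2, 3], [4, 5, 6]])
with tf.Session() as sess:
    print(sess.run(tf.reduce_prod(x)))      # 720: product of all elements
    print(sess.run(tf.reduce_prod(x, 0)))   # [ 4 10 18]: product down each column
    print(sess.run(tf.reduce_prod(x, 1)))   # [  6 120]: product along each row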

24 Logistic Regression on the Iris Data I
logistic.py:
import numpy as np
import pandas as pd
import tensorflow as tf
iris = pd.read_csv("iris100.csv")         # read the iris data
data = iris.iloc[:,0:4]                   # the four feature columns
X = tf.placeholder(tf.float32, [None, 4])
W = tf.Variable(tf.zeros([4, 1]))
w0 = tf.Variable(tf.zeros(1))
Fx = tf.matmul(X, W) + w0                 # linear score Fx
Fx = tf.reshape(Fx, shape=[100])          # flatten Fx to a vector of length 100
prob = tf.sigmoid(Fx)                     # map Fx to a probability
y = tf.concat([tf.zeros(50), tf.ones(50)], 0)   # iris labels
Notes: tf.concat([t1, t2], axis) concatenates along axis; tf.sigmoid computes y = 1 / (1 + exp(-x)); y: setosa = 0, versicolor = 1.
24 / 39
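To see what this graph computes, here is a plain-NumPy sketch of the same formulas (the feature matrix below is random stand-in data, not the iris file; with W and w0 at zero every fitted probability is 0.5):

import numpy as np

X = np.random.rand(100, 4)                      # stand-in for the 100x4 iris features
W = np.zeros((4, 1))
w0 = np.zeros(1)
Fx = X.dot(W) + w0                              # linear score, shape (100, 1)
prob = 1.0 / (1.0 + np.exp(-Fx.reshape(100)))   # sigmoid, shape (100,)
print(prob[:3])                                 # all 0.5 while W and w0 are zero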

25 Logistic Regression on the Iris Data II
logistic.py (continued):
likelihood = tf.reduce_sum(y*tf.log(prob) + (1-y)*tf.log(1-prob))
loss = -likelihood
learning_rate = 0.001                     # step size
train = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)
likelihood is the log-likelihood, so minimizing loss = -likelihood maximizes it. A squared-error loss (tf.reduce_mean(tf.square(y-prob)), with learning_rate = 0.1) could be used instead. Gradient descent updates the weights with step size η:
W ← W − η ∂loss/∂W
Optimizers: GradientDescentOptimizer (plain gradient descent), AdagradOptimizer (AdaGrad), MomentumOptimizer (momentum), AdamOptimizer (Adam).
25 / 39
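To make the update rule concrete, here is a hand-written NumPy sketch of a single gradient-descent step for this loss (random stand-in features; Tensorflow's optimizer performs this differentiation automatically):

import numpy as np

eta = 0.001
X = np.random.rand(100, 4)                       # stand-in features
y = np.concatenate([np.zeros(50), np.ones(50)])  # labels as in the slide
W, w0 = np.zeros(4), 0.0
prob = 1.0 / (1.0 + np.exp(-(X.dot(W) + w0)))
grad_W = X.T.dot(prob - y)                       # d(-loglik)/dW
grad_w0 = np.sum(prob - y)                       # d(-loglik)/dw0
W -= eta * grad_W                                # the update W <- W - eta * dloss/dW
w0 -= eta * grad_w0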

26 Logistic Regression on the Iris Data III
Running the session, logistic.py (continued):
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize the Variables
    for i in range(100):                         # 100 training steps
        sess.run(train, feed_dict={X: data})
        if i % 10 == 0:                          # report every 10 steps
            print("%5d:(loss,w,w0)=(%f, %s,%.4f)" %
                  (i, sess.run(loss, feed_dict={X: data}),
                   sess.run(tf.reshape(W, shape=[4])),
                   sess.run(w0)))
    print("%d:prob=%s" % (i, sess.run([prob[0:50], prob[49:100]], feed_dict={X: data})))
One training step is sess.run(train, feed_dict={X: data}).
26 / 39

27 Logistic Regression on the Iris Data IV
Output of the final print at step 99: prob = [array([...], dtype=float32), array([...], dtype=float32)], the fitted probabilities for rows 1-50 and rows 50-100 (numeric values omitted).
27 / 39

28 Name Scopes (name scope, name space) I
tf.name_scope groups related nodes so that the Tensorboard graph stays readable; it is used as a with block:
with tf.variable_scope("probability"):
    with tf.name_scope("x"):
        X = tf.placeholder(tf.float32, [None, 4])
    with tf.name_scope("w"):
        W = tf.Variable(tf.zeros([4, 1]))
    with tf.name_scope("w0"):
        w0 = tf.Variable(tf.zeros(1))
    with tf.name_scope("fx"):
        Fx = tf.matmul(X, W) + w0
        Fx = tf.reshape(Fx, shape=[100])
    with tf.name_scope("prob"):
        prob = tf.sigmoid(Fx)
The full name of W can be checked with W.name.
28 / 39

29 Name Scopes (name scope, name space) II
>>> print(W.name, w0.name)
probability/w/Variable:0 probability/w0/Variable:0
The trailing :0 is the output index of the op. An op with two outputs gets indices :0 and :1:
>>> a, b = tf.nn.top_k([1, 4, 2], 1)   # top-k values and their indices
>>> a.name
'TopKV2_2:0'
>>> b.name
'TopKV2_2:1'
29 / 39

30 Summaries (summary) I
tf.summary records quantities for Tensorboard:
with tf.name_scope("summary"):
    tf.summary.scalar("loss", loss)
    tf.summary.scalar("mean.w", tf.reduce_mean(W))
    tf.summary.histogram("hist.w", W)
merged = tf.summary.merge_all()   # merge all summaries into one op (evaluated with sess.run)
30 / 39

31 Summaries (summary) II
Writing the log (logistic.writer.py):
log_dir = 'C:\\Users\\komori\\Desktop\\log'       # path of the log directory
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())   # initialize the Variables
    writer = tf.summary.FileWriter(log_dir, sess.graph)  # write the graph to the log
    print(sess.run([loss, y], feed_dict={X: data}))
    for i in range(100):                           # 100 training steps
        summary, _ = sess.run([merged, train], feed_dict={X: data})
        writer.add_summary(summary, i)             # append the merged summary to the log
    writer.close()                                 # close the writer
Then start Tensorboard:
(tensorflow) C:\Users\komori\Desktop>tensorboard --logdir="./log"
31 / 39

32 Tensorboard screen (scalars) I (figure) 32 / 39

33 Tensorboard screen (distributions) I (figure) 33 / 39

34 Tensorboard screen (histograms) I (figure) 34 / 39

35 Tensorboard screen (graph) I (figure) 35 / 39

36 Tensorboard screen (graph, detail) I (figure) 36 / 39

37 Wrap-up I
Tensorflow basics were covered: building and running graphs, and fitting a logistic regression to the iris data, with the data handled through pandas (the Python counterpart of R data frames).
37 / 39

38 Full Listing of logistic.writer.py I
logistic.writer.py:
import numpy as np
import pandas as pd
import tensorflow as tf

iris = pd.read_csv("iris100.csv")
data = iris.iloc[:, 0:4]
with tf.variable_scope("probability"):
    with tf.name_scope("x"):
        X = tf.placeholder(tf.float32, [None, 4])
    with tf.name_scope("w"):
        W = tf.Variable(tf.zeros([4, 1]))
        WW = tf.Variable(tf.zeros([4, 1]))
    with tf.name_scope("w0"):
        w0 = tf.Variable(tf.zeros(1))
    with tf.name_scope("fx"):
        Fx = tf.matmul(X, W) + w0
        Fx = tf.reshape(Fx, shape=[100])
    with tf.name_scope("prob"):
        prob = tf.sigmoid(Fx)
    with tf.name_scope("label"):
        y = tf.concat([tf.zeros(50), tf.ones(50)], 0)
38 / 39

39 Full Listing of logistic.writer.py II
logistic.writer.py (continued):
with tf.name_scope("loss"):
    likelihood = tf.reduce_sum(y*tf.log(prob) + (1-y)*tf.log(1-prob))
    loss = -likelihood
    learning_rate = 0.001
    train = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)
with tf.name_scope("summary"):
    tf.summary.scalar("loss", loss)
    tf.summary.scalar("mean.w", tf.reduce_mean(W))
    tf.summary.histogram("hist.w", W)
merged = tf.summary.merge_all()
log_dir = 'C:\\Users\\komori\\Desktop\\log'
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter(log_dir, sess.graph)
    print(sess.run([loss, y], feed_dict={X: data}))
    for i in range(100):
        summary, _ = sess.run([merged, train], feed_dict={X: data})
        writer.add_summary(summary, i)
    writer.close()
39 / 39
