■ Using the matmul function.
x_data
- creates a 5x3 array (5 samples, 3 features each)
y_data
- creates a 5x1 array (one target value per sample)
hypothesis = tf.matmul(X, W) + b
- performs matrix multiplication of X (5x3 as fed here, shape [None, 3]) and W (3x1), then adds b. A small sketch of this equivalence follows below.
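The point of matmul is that it replaces the element-wise hypothesis of Lab 04-1 (w1*x1 + w2*x2 + w3*x3 + b) with a single operation. A minimal sketch (not part of the original code; the weight and bias values are made up for illustration) checks this with NumPy:

# Sketch: tf.matmul(X, W) + b is the compact form of the element-wise
# hypothesis w1*x1 + w2*x2 + w3*x3 + b from Lab 04-1.
import numpy as np

X = np.array([[73., 80., 75.]])      # 1x3: one sample with three features
W = np.array([[1.], [2.], [3.]])     # 3x1: weight column vector (illustrative values)
b = 10.                              # bias (illustrative value)

print(X @ W + b)                     # matrix form        -> [[468.]]
print(73.*1. + 80.*2. + 75.*3. + b)  # expanded form      -> 468.0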
# Lab04-2 Multi-variable linear regression
import tensorflow as tf
tf.set_random_seed(777)  # for reproducibility

x_data = [[73., 80., 75.],
          [93., 88., 93.],
          [89., 91., 90.],
          [96., 98., 100.],
          [73., 66., 70.]]
y_data = [[152.],
          [185.],
          [180.],
          [196.],
          [142.]]

# placeholders for a tensor that will be always fed.
X = tf.placeholder(tf.float32, shape=[None, 3])
Y = tf.placeholder(tf.float32, shape=[None, 1])

W = tf.Variable(tf.random_normal([3, 1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis
hypothesis = tf.matmul(X, W) + b

# Simplified cost/loss function
cost = tf.reduce_mean(tf.square(hypothesis - Y))

# Minimize
optimizer = tf.train.GradientDescentOptimizer(learning_rate=1e-5)
train = optimizer.minimize(cost)

# Launch the graph in a session.
sess = tf.Session()
# Initializes global variables in the graph.
sess.run(tf.global_variables_initializer())

for step in range(2001):
    cost_val, hy_val, _ = sess.run(
        [cost, hypothesis, train], feed_dict={X: x_data, Y: y_data})
    if step % 10 == 0:
        print(step, "Cost: ", cost_val, "\nPrediction:\n", hy_val)
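After training, the same hypothesis node can be reused to predict scores for new inputs. A hedged sketch (the input scores below are made up, and it assumes the session sess from the code above is still open):

# Ask the trained model for predictions (illustrative inputs, assumes `sess`
# and the trained variables from the code above are still available).
print("Your score will be ",
      sess.run(hypothesis, feed_dict={X: [[100., 70., 101.]]}))
print("Other scores will be ",
      sess.run(hypothesis, feed_dict={X: [[60., 70., 110.], [90., 100., 80.]]}))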