■ Comparing manually computed Minimizing cost with TensorFlow's Minimizing cost
gvs = optimizer.compute_gradients(cost, [W])
- Returns the gradients as a list of (gradient, variable) pairs for the given variables.
apply_gradients = optimizer.apply_gradients(gvs)
- Builds the graph op that updates W using those gradients.
sess.run(apply_gradients)
- Runs apply_gradients to actually update the value of W.
```python
# Lab03-X Minimizing Cost tf gradient
# This is optional
import tensorflow as tf
tf.set_random_seed(777)  # for reproducibility

# tf Graph Input
X = [1, 2, 3]
Y = [1, 2, 3]

# Set wrong model weights
W = tf.Variable(5.)

# Linear model
hypothesis = X * W

# Manual gradient
gradient = tf.reduce_mean((W * X - Y) * X) * 2

# cost/loss function
cost = tf.reduce_mean(tf.square(hypothesis - Y))

# Minimize: Gradient Descent Magic
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(cost)

# Get gradients
gvs = optimizer.compute_gradients(cost, [W])
# Optional: modify gradient if necessary
# gvs = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gvs]

# Apply gradients
apply_gradients = optimizer.apply_gradients(gvs)

# Launch the graph in a session.
sess = tf.Session()
# Initializes global variables in the graph.
sess.run(tf.global_variables_initializer())

for step in range(100):
    print(step, sess.run([gradient, W, gvs]))
    sess.run(apply_gradients)  # Same as sess.run(train)
```
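The manual `gradient` line can be checked without TensorFlow: for cost = mean((W·X − Y)²), the derivative with respect to W is 2 · mean((W·X − Y) · X), which is exactly what `compute_gradients` returns and `apply_gradients` subtracts (scaled by the learning rate). A minimal NumPy sketch of the same update loop, assuming the same data and learning rate as above (this is an illustrative re-implementation, not part of the original lab code):

```python
import numpy as np

X = np.array([1., 2., 3.])
Y = np.array([1., 2., 3.])
W = 5.0   # deliberately wrong starting weight, as in the TF example
lr = 0.01

for step in range(100):
    # Same formula as the TF graph: d(cost)/dW = 2 * mean((W*X - Y) * X)
    grad = 2 * np.mean((W * X - Y) * X)
    # What apply_gradients does for plain gradient descent: W <- W - lr * grad
    W -= lr * grad

print(W)  # converges toward the correct weight 1.0
```

Since Y = X here, the true minimum is W = 1, and both this loop and the TensorFlow version approach it at the same rate.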