I recently started studying TensorFlow. While working through some exercises, a question came up. As far as I know, there are two ways to define hidden layers:
By using tf.layers.dense to define a fully connected layer, e.g.:
layer_1 = tf.layers.dense(X, 512, activation=tf.nn.relu)
layer_2 = tf.layers.dense(layer_1, 256, activation=tf.nn.relu)
By using tf.add(tf.matmul(X, W), b), a direct matrix multiplication, to define a layer, e.g.:
w1 = tf.Variable(tf.random_normal([in_size, out_size]))
b1 = tf.Variable(tf.random_normal([out_size]))
w2 = tf.Variable(tf.random_normal([out_size, out_size_2]))
b2 = tf.Variable(tf.random_normal([out_size_2]))
layer_1 = tf.add(tf.matmul(x, w1), b1)
layer_1 = tf.nn.relu(layer_1)
layer_2 = tf.add(tf.matmul(layer_1, w2), b2)
layer_2 = tf.nn.relu(layer_2)
I tried both ways to build a multilayer NN, and both work. My question: is there any difference between them? My guess: in approach 2, W and b can be monitored in TensorBoard, since they are explicitly defined.
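On the other hand, as far as I can tell, tf.layers.dense creates its own kernel and bias variables internally, and they seem to show up in the trainable-variables collection, so maybe they can be monitored too. Here is a minimal sketch of what I mean, assuming TF 1.x (the input shape and layer names are just made up for the example):

import tensorflow as tf

X = tf.placeholder(tf.float32, [None, 784])  # hypothetical input size
layer_1 = tf.layers.dense(X, 512, activation=tf.nn.relu, name="fc1")
layer_2 = tf.layers.dense(layer_1, 256, activation=tf.nn.relu, name="fc2")

# tf.layers.dense created variables (fc1/kernel, fc1/bias, etc.) behind
# the scenes; they appear in the trainable-variables collection and can
# be summarized for TensorBoard just like explicitly defined W and b:
for var in tf.trainable_variables():
    tf.summary.histogram(var.op.name, var)

If I understand correctly, printing tf.trainable_variables() shows those fc1/kernel and fc1/bias variables, but I'm not sure whether that makes the two approaches fully equivalent.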
Any feedback is appreciated. Thanks!