Computer Vision News - June 2018

For demonstration purposes, you can use the functions above to construct the simplest fully connected (FC) layer. In the function below, variable_summaries is called for the weights and the biases, while for the activation values themselves we are interested only in a histogram:

def nn_layer(input_tensor, input_dim, output_dim, layer_name, act=tf.nn.relu):
    with tf.name_scope(layer_name):
        # This Variable will hold the state of the weights for the layer
        with tf.name_scope('weights'):
            weights = weight_variable([input_dim, output_dim])
            variable_summaries(weights)
        with tf.name_scope('biases'):
            biases = bias_variable([output_dim])
            variable_summaries(biases)
        with tf.name_scope('Wx_plus_b'):
            preactivate = tf.matmul(input_tensor, weights) + biases
            tf.summary.histogram('pre_activations', preactivate)
        activations = act(preactivate, name='activation')
        tf.summary.histogram('activations', activations)
        return activations

When you open TensorBoard in your browser, it looks like the images below (to learn how to run TensorBoard, follow the link): for example, the accuracy plot, and the histograms of the preactivations and activations.

[Images: accuracy curve; histograms for preactivations and activations]

Tool for Deep Learning in TensorFlow and Keras
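The variable_summaries helper called above is not shown in this excerpt. As a sketch of what it likely does, the version below follows the pattern from the official TensorFlow summaries tutorial of the time (mean, standard deviation, min/max scalars plus a histogram per variable), written against the TensorFlow 1.x API; treat the exact statistics as an assumption about the original code.

```python
import tensorflow.compat.v1 as tf  # 1.x graph-mode API; plain `tensorflow` in 2018


def variable_summaries(var):
    """Attach a set of summaries to a Tensor for TensorBoard visualization.

    Sketch of the helper used in the article, assuming the statistics from
    the official TensorFlow summaries tutorial.
    """
    with tf.name_scope('summaries'):
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean', mean)
        with tf.name_scope('stddev'):
            stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        tf.summary.scalar('stddev', stddev)
        tf.summary.scalar('max', tf.reduce_max(var))
        tf.summary.scalar('min', tf.reduce_min(var))
        tf.summary.histogram('histogram', var)
```

Each scalar shows up under TensorBoard's Scalars tab and the histogram under the Histograms/Distributions tabs, grouped by the enclosing name scopes.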
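For the summaries to appear in TensorBoard at all, they must be merged into a single op, evaluated during training, and written to an event file that TensorBoard then reads. The following is a minimal end-to-end sketch in TensorFlow 1.x style (run through the tf.compat.v1 shim so it also works under TensorFlow 2); the layer shape, input values, and temporary log directory are illustrative assumptions, not the article's code.

```python
import tempfile

import tensorflow.compat.v1 as tf  # 1.x graph-mode API; plain `tensorflow` in 2018

tf.disable_eager_execution()

# A tiny stand-in layer with two histogram summaries (shapes are illustrative).
x = tf.placeholder(tf.float32, [None, 4], name='x')
with tf.name_scope('layer'):
    weights = tf.Variable(tf.truncated_normal([4, 2], stddev=0.1), name='weights')
    tf.summary.histogram('weights', weights)
    activations = tf.nn.relu(tf.matmul(x, weights), name='activation')
    tf.summary.histogram('activations', activations)

merged = tf.summary.merge_all()  # one op that evaluates every registered summary
logdir = tempfile.mkdtemp()      # stand-in for a real log directory

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter(logdir, sess.graph)  # also logs the graph
    summary = sess.run(merged, feed_dict={x: [[1., 2., 3., 4.]]})
    writer.add_summary(summary, global_step=0)  # in training, call this each step
    writer.close()

# Then, from a terminal, point TensorBoard at the same directory:
#     tensorboard --logdir=<logdir>
# and open the printed URL (by default http://localhost:6006) in a browser.
```

In a real training loop you would run the merged op alongside the train step and call add_summary with the current global step, so TensorBoard can plot the statistics over time.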
