I want to get the output of the fc6 layer in the slim VGG-16 network. Is there a good way to do that?
Actually, I think I have figured out one possible solution; could someone confirm that it is correct?
The output of the fc6 layer is the result of the ReLU op, i.e., the activation function. By listing the graph's operations with tf.get_default_graph().get_operations(), I found that this op is named vgg_16/fc6/Relu. So, is the result of tf.get_default_graph().get_tensor_by_name('vgg_16/fc6/Relu:0') what I want?
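Yes, fetching an activation by its tensor name works. Here is a minimal sketch that demonstrates the same pattern on a toy graph rather than the full VGG-16 (so it runs without model weights); the scope name fc6 and identity weights are assumptions for illustration, and tf.compat.v1 is used for portability with TensorFlow 2.x installs:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    x = tf.constant([[-1.0, 2.0, -3.0, 4.0]])
    # Toy stand-in for fc6: a linear layer followed by ReLU. In slim,
    # layers apply ReLU by default, which is why the op ends up being
    # named '<scope>/Relu' in the graph.
    with tf.variable_scope('fc6'):
        w = tf.get_variable('weights', initializer=tf.eye(4))
        pre = tf.matmul(x, w)
        act = tf.nn.relu(pre)  # op name: 'fc6/Relu'

# Fetch the activation tensor purely by name, as proposed.
fc6_out = graph.get_tensor_by_name('fc6/Relu:0')

with tf.Session(graph=graph) as sess:
    sess.run(tf.global_variables_initializer())
    result = sess.run(fc6_out)

print(result)  # negative inputs zeroed out by the ReLU
```

Note that with slim's vgg_16 you may not need name lookup at all: the function returns an end_points dict whose entries (e.g. the key 'vgg_16/fc6') expose each layer's output, which avoids hard-coding op names that can shift if the model is built under a different scope.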