https://towardsdatascience.com/using-tf-print-in-tensorflow-aa26e1cff11e
After reading it, you should understand how the tf.Print() function works and how to use it.
But what if you also want to know the timestamp of a node/operation at actual run time?
The answer is quite simple. For instance, if you want to print your loss value together with a timestamp, just do it as follows:
<<< my training code >>>
...
...
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=label)
loss = tf.reduce_mean(loss, name='cross_entropy_loss')
# tf.Print returns `loss` unchanged and prints the listed tensors
# as a side effect each time the graph executes this node
loss = tf.Print(loss, [loss, tf.timestamp()], message='timestamp_cost:')
# (optional) check that the tf.Print op is actually in the graph
print(loss)
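Note that you must use the tensor returned by tf.Print() (here it is reassigned to loss); if the returned tensor is never consumed, the node is not executed and nothing is printed. A minimal plain-Python analogy of this identity-op-with-side-effect pattern (the tap_print helper below is hypothetical, not a TensorFlow API):

```python
import time

def tap_print(value, tensors, message=''):
    """Hypothetical stand-in for tf.Print: return `value` unchanged,
    printing `tensors` prefixed by `message` as a side effect."""
    print(message + ''.join('[{}]'.format(t) for t in tensors))
    return value

loss = 2.1157
# Like tf.Print, the result must be used downstream, or the print never fires.
loss = tap_print(loss, [loss, time.time()], message='timestamp_cost:')
```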
First, you will see the tf.Print op's Tensor information on screen (stdout):
Tensor("tower1/Print:0", shape=(), dtype=float32, device=/device:GPU:1)
Next, the loss value and the timestamp will be printed at every step (exactly when depends on your training code):
timestamp_cost:[2.11570501][1534406561.0370209]
timestamp_cost:[2.10749817][1534406561.148946]
timestamp_cost:[2.18681598][1534406561.261364]
timestamp_cost:[2.15023327][1534406561.3735371]
timestamp_cost:[2.07061148][1534406561.4861519]
timestamp_cost:[2.09767246][1534406561.5999889]
timestamp_cost:[2.14620495][1534406561.714628]
timestamp_cost:[2.0326376][1534406561.827477]
timestamp_cost:[1.99513125][1534406561.939805]
timestamp_cost:[1.9811722][1534406562.0516469]
timestamp_cost:[2.05018902][1534406562.1656229]
timestamp_cost:[2.03314352][1534406562.2777109]
timestamp_cost:[1.96346831][1534406562.3898611]
timestamp_cost:[1.98018146][1534406562.504113]
...
...
...
So, the first value comes from the loss function and the second from tf.timestamp(). That's all.
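tf.timestamp() reports the time in seconds since the Unix epoch, so the second column of the log above can be converted to a readable date with Python's standard library:

```python
from datetime import datetime, timezone

# First timestamp from the log output above (seconds since the Unix epoch).
ts = 1534406561.0370209
when = datetime.fromtimestamp(ts, tz=timezone.utc)
print(when.isoformat())  # 2018-08-16 08:02:41 UTC, plus microseconds
```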