I'm new to TensorBoard and learning to use it by following the tutorial, which went well: TensorBoard worked as expected.
Referring to that tutorial, I wrote my own code in a Jupyter notebook to train a logical-AND model:
```python
%load_ext tensorboard
import datetime
import tensorflow as tf
import numpy as np

log_folder = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")

x_train = np.asarray([[0, 0], [0, 1], [1, 0], [1, 1]], np.float32)
y_train = np.asarray([0, 0, 0, 1], np.float32)

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(1, activation=tf.nn.sigmoid)
])

def custom_loss(y, a):
    return -(y * tf.math.log(a) + (1 - y) * tf.math.log(1 - a))

model.compile(loss=custom_loss, optimizer='SGD', metrics=['accuracy'])
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_folder, histogram_freq=1)
model.fit(x_train, y_train, epochs=2000, verbose=0, callbacks=[tensorboard_callback])
```
The training goes well (though the results could still use some improvement).
However, TensorBoard shows nothing when I run:
```python
%tensorboard --logdir log_folder
```
What is the key to making TensorBoard work here?
You're just using the IPython magic wrong: you need to put a dollar sign in front of the variable name so IPython expands it to its value (see e.g. How to pass a variable to magic `run` function in IPython).
```python
%tensorboard --logdir $log_folder
```
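To see why this matters, note that a line magic receives everything after its name as a raw string; without the `$`, the literal text `log_folder` is passed as the directory name. A minimal sketch (using a hypothetical `fake_magic` helper to stand in for the magic's argument handling):

```python
log_folder = "logs/fit/20240101-120000"  # example value

def fake_magic(line):
    # A line magic only sees the raw text after the magic name;
    # here we just pull out the last token as the "logdir" argument.
    return line.split()[-1]

# %tensorboard --logdir log_folder -> the magic sees the literal name
assert fake_magic("--logdir log_folder") == "log_folder"

# %tensorboard --logdir $log_folder -> IPython substitutes the value first
assert fake_magic("--logdir " + log_folder) == "logs/fit/20240101-120000"
```

So without the `$`, TensorBoard dutifully watches a directory literally named `log_folder`, which doesn't exist, and shows nothing.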
To explore further, try pretending you're working in the future (at least as far as the date goes), and add a cell like this:
```python
# A run stamped one day ahead, so it appears as a separate, later run
log_folder_future = "logs/fit/" + (
    datetime.datetime.now() + datetime.timedelta(days=1)
).strftime("%Y%m%d-%H%M%S")

# Parent directory that contains both timestamped run folders
up_dir = './' + '/'.join(log_folder_future.split('/')[:-1])

model.compile(loss=custom_loss, optimizer='SGD', metrics=['accuracy'])
tensorboard_callback_future = tf.keras.callbacks.TensorBoard(log_folder_future, histogram_freq=1)
model.fit(x_train, y_train, epochs=2500, verbose=0, callbacks=[tensorboard_callback_future])
```
and call it like this:
```python
%tensorboard --logdir $up_dir
```
and you end up with both runs shown side by side in TensorBoard.
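As a side note, the `split`/`join` trick above just strips the last path component to get the parent directory; it is equivalent to `os.path.dirname`. A quick sketch with a hypothetical run folder name:

```python
import os

log_folder_future = "logs/fit/20240102-120000"  # hypothetical run dir

# The answer's one-liner: drop the last path component, prefix with './'
up_dir = './' + '/'.join(log_folder_future.split('/')[:-1])
assert up_dir == './logs/fit'

# Same result via the standard library (on POSIX-style paths)
assert up_dir == os.path.join('.', os.path.dirname(log_folder_future))
```

Pointing `--logdir` at this parent directory is what lets TensorBoard pick up every timestamped subfolder as its own run.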
For more on TensorBoard directory structures and multiple runs, see this page.
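The key idea from that page: TensorBoard treats each subdirectory of `--logdir` as a separate run, which is exactly why the timestamped folder names work. A small filesystem sketch (hypothetical run names, no TensorFlow needed):

```python
import os
import tempfile

# Simulate the layout the callback produces: one timestamped folder per run
root = tempfile.mkdtemp()
for run in ("20240101-120000", "20240102-120000"):
    os.makedirs(os.path.join(root, "logs", "fit", run))

# Pointing TensorBoard at logs/fit would list each subfolder as a run
runs = sorted(os.listdir(os.path.join(root, "logs", "fit")))
assert runs == ["20240101-120000", "20240102-120000"]
```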