Mastering TensorFlow: Using TensorBoard, Callbacks, and Model Saving in Keras (Day 16)

TensorFlow and Keras provide powerful tools for building, training, and evaluating deep learning models. In this blog post, we will explore three essential techniques:

  • Using TensorBoard for visualization
  • Utilizing callbacks to enhance model training
  • Saving and restoring models

Using TensorBoard for Visualization

TensorBoard is an interactive visualization tool that helps you understand your model’s training dynamics. It allows you to view learning curves, compare metrics between multiple runs, and analyze training statistics.

Installation


!pip install -q -U tensorflow tensorboard-plugin-profile
  

Setting Up Logging Directory

We need a directory to save our logs. This directory will contain event files that TensorBoard reads to visualize the training process.


from pathlib import Path
from time import strftime

def get_run_logdir(root_logdir="my_logs"):
    return Path(root_logdir) / strftime("run_%Y_%m_%d_%H_%M_%S")

run_logdir = get_run_logdir()
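
Each call produces a new timestamped subdirectory, so every training run gets its own entry in TensorBoard and separate runs can be compared side by side. For example (the exact path is hypothetical; the timestamp depends on when you run it):

print(run_logdir)  # e.g. my_logs/run_2024_05_01_10_30_00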
  

Saving and Restoring a Model

Keras allows you to save the entire model (architecture, weights, and training configuration) to a single file or a folder.

Saving a Model


# Save the whole model in the TensorFlow SavedModel format (a folder)
model.save("my_keras_model", save_format="tf")
  

Loading a Model


model = tf.keras.models.load_model("my_keras_model")
  

Saving Weights Only


model.save_weights("my_weights.h5")
model.load_weights("my_weights.h5")
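
Note: the snippets above follow the older TF 2.x conventions. Recent Keras releases (Keras 3, bundled with TensorFlow 2.16+) expect the native .keras single-file format and a .weights.h5 suffix for weight files; a minimal sketch under that assumption (filenames are illustrative):

model.save("my_keras_model.keras")                          # native single-file Keras format
model = tf.keras.models.load_model("my_keras_model.keras")
model.save_weights("my_model.weights.h5")                   # Keras 3 requires the .weights.h5 suffix
model.load_weights("my_model.weights.h5")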
  

Using Callbacks

Callbacks in Keras allow you to perform actions at various stages of training (e.g., saving checkpoints, early stopping).
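
In addition to the built-in callbacks shown below, you can write your own by subclassing tf.keras.callbacks.Callback. A minimal illustrative sketch (the class name and printed message are ours, not part of the original code):

import tensorflow as tf

class ValLossPrinter(tf.keras.callbacks.Callback):
    # Illustrative custom callback: prints the validation loss at the end of each epoch
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        val_loss = logs.get("val_loss")
        if val_loss is not None:
            print(f"Epoch {epoch + 1}: val_loss = {val_loss:.4f}")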

ModelCheckpoint

Saves the model at the end of each epoch (here only its weights, since save_weights_only=True), so progress is not lost if training is interrupted.


checkpoint_cb = tf.keras.callbacks.ModelCheckpoint("my_checkpoints.weights.h5", save_weights_only=True)
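
If you only want to keep the best model seen on the validation set, a common variant is to save the full model with save_best_only=True (the filename is illustrative and assumes a Keras version that supports the native .keras format):

checkpoint_cb = tf.keras.callbacks.ModelCheckpoint("best_model.keras", save_best_only=True)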
  

EarlyStopping

Stops training when the monitored metric (val_loss by default) has not improved for patience consecutive epochs; restore_best_weights=True rolls the model back to the weights from its best epoch.


early_stopping_cb = tf.keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)
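
By default EarlyStopping watches val_loss; if you would rather track validation accuracy, a hedged variant (not part of the original code) looks like this:

early_stopping_cb = tf.keras.callbacks.EarlyStopping(monitor="val_accuracy", mode="max", patience=10, restore_best_weights=True)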
  

TensorBoard Callback

Logs training and validation metrics for TensorBoard; profile_batch=(100, 200) additionally profiles batches 100 through 200 so you can inspect performance in the Profiler tab.


tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir=run_logdir, profile_batch=(100, 200))
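
Once training starts writing logs, you can launch the TensorBoard UI. In a Jupyter or Colab notebook the magic commands below work; from a shell, running tensorboard --logdir=./my_logs does the same:

%load_ext tensorboard
%tensorboard --logdir=./my_logs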
  

Full Code with Techniques Applied

Here is the full code incorporating all the techniques discussed:


# Install TensorFlow and TensorBoard plugin
!pip install -q -U tensorflow tensorboard-plugin-profile

# Import necessary libraries
import tensorflow as tf
import matplotlib.pyplot as plt
import pandas as pd
from pathlib import Path
from time import strftime

# Load Fashion MNIST dataset
fashion_mnist = tf.keras.datasets.fashion_mnist
(X_train_full, y_train_full), (X_test, y_test) = fashion_mnist.load_data()

# Split data into training, validation, and test sets
X_train, y_train = X_train_full[:-5000], y_train_full[:-5000]
X_valid, y_valid = X_train_full[-5000:], y_train_full[-5000:]

# Scale pixel values to the 0-1 range
X_train, X_valid, X_test = X_train / 255.0, X_valid / 255.0, X_test / 255.0

# Class names for Fashion MNIST
class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat", 
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

# Display the first few images and labels
plt.figure(figsize=(10,10))
for i in range(25):
    plt.subplot(5, 5, i + 1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(X_train[i], cmap=plt.cm.binary)
    plt.xlabel(class_names[y_train[i]])
plt.show()

# Set random seed for reproducibility
tf.random.set_seed(42)

# Build the model
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=[28, 28]),
    tf.keras.layers.Dense(300, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax")
])

# Compile the model
model.compile(loss="sparse_categorical_crossentropy", 
              optimizer="sgd", 
              metrics=["accuracy"])

# Define log directory for TensorBoard
def get_run_logdir(root_logdir="my_logs"):
    return Path(root_logdir) / strftime("run_%Y_%m_%d_%H_%M_%S")
run_logdir = get_run_logdir()

# Define callbacks
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint("my_checkpoints.weights.h5", save_weights_only=True)
early_stopping_cb = tf.keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir=run_logdir, profile_batch=(100, 200))

# Train the model with callbacks
history = model.fit(X_train, y_train, epochs=30, 
                    validation_data=(X_valid, y_valid), 
                    callbacks=[checkpoint_cb, early_stopping_cb, tensorboard_cb])

# Start TensorBoard (run these two lines in a Jupyter notebook or Colab cell)
# %load_ext tensorboard
# %tensorboard --logdir=./my_logs

# Evaluate the model on the test set
test_loss, test_acc = model.evaluate(X_test, y_test)
print(f"Test accuracy: {test_acc:.4f}")

# Plot training and validation accuracy and loss
def plot_learning_curves(history):
    pd.DataFrame(history.history).plot(figsize=(8, 5))
    plt.grid(True)
    plt.gca().set_ylim(0, 1)  # set the vertical range to [0-1]
    plt.show()

plot_learning_curves(history)

# Make predictions
y_pred = model.predict(X_test)

# Plot the first 25 test images, their predicted labels, and the true labels.
# Color correct predictions in blue and incorrect predictions in red.
plt.figure(figsize=(10,10))
for i in range(25):
    plt.subplot(5, 5, i + 1)
    plt.xticks([])
    plt.yticks([])
    plt.grid(False)
    plt.imshow(X_test[i], cmap=plt.cm.binary)
    predicted_label = class_names[y_pred[i].argmax()]
    true_label = class_names[y_test[i]]
    color = 'blue' if predicted_label == true_label else 'red'
    plt.xlabel(f"{predicted_label} ({true_label})", color=color)
plt.show()
  

This blog post covers the essential techniques of using TensorBoard, callbacks, and model saving/restoring in Keras with TensorFlow. These techniques will enhance your model training process and provide valuable insights into your model’s performance. Happy coding!

Check the results in the screenshots below:

Just run the code in Google Colab to see your model in action, but first let's look at some images produced by the code above, which builds on the functional Keras model discussed in our previous post.
