Logits

Use a fully-connected layer to extract multiclass logits from the CNN.

Chapter Goals:

  • Obtain the logits for each digit class

A. Multiclass logits

Since an MNIST image can show any of 10 possible digits, we use a fully-connected layer with 10 neurons to obtain one logit per digit class. The logits are the output of the model_layers function.
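As a quick illustration, a 10-neuron Dense layer maps each feature vector in a batch to 10 logits, one per digit class. The feature vectors here are random stand-ins for what the CNN's earlier layers would produce:

```python
import tensorflow as tf

# Made-up batch of 4 feature vectors (in the real model these come
# from the convolutional, pooling, and dense layers)
features = tf.random.normal([4, 1024])

# One neuron per digit class: each output row holds 10 logits
logits = tf.keras.layers.Dense(10, name='logits')(features)
print(logits.shape)  # (4, 10)
```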

The rest of the model follows the standard format for multiclass classification:

  • Softmax applied to the logits to convert them into per-class probabilities
  • The labels are one-hot vectors, where the "hot index" corresponds to the digit in the MNIST image
  • Softmax cross entropy to calculate loss
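These steps can be sketched with made-up logits for a batch of two images whose true digits are 3 and 7 (the values below are illustrative only):

```python
import tensorflow as tf

# Made-up logits for 2 images; each row has 10 values, one per digit
logits = tf.constant([[0.1, 0.1, 0.1, 2.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1],
                      [0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 3.0, 0.2, 0.2]])

# One-hot labels: the "hot index" is the true digit (3 and 7)
labels = tf.one_hot([3, 7], depth=10)

# Softmax converts logits into per-class probabilities (each row sums to 1)
probs = tf.nn.softmax(logits)

# Softmax cross entropy computes the per-image loss from the raw logits
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
```

Note that the cross-entropy function takes the raw logits, not the softmax probabilities; it applies the softmax internally for numerical stability.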

Time to Code!

In this chapter, we obtain the logits from the previous chapter's dropout output.

We use a final fully-connected layer to obtain our logits, which we return as the output of our function.

Set logits equal to tf.keras.layers.Dense with self.output_size as the output size and name equal to 'logits', applied to dropout as the input.
Then return logits.

import tensorflow as tf

class MNISTModel(object):
    # Model Initialization
    def __init__(self, input_dim, output_size):
        self.input_dim = input_dim
        self.output_size = output_size

    # CNN Layers
    def model_layers(self, inputs, is_training):
        reshaped_inputs = tf.reshape(
            inputs, [-1, self.input_dim, self.input_dim, 1])
        # Convolutional Layer #1
        conv1 = tf.keras.layers.Conv2D(
            filters=32,
            kernel_size=[5, 5],
            padding='same',
            activation='relu',
            name='conv1')(reshaped_inputs)
        # Pooling Layer #1
        pool1 = tf.keras.layers.MaxPool2D(
            pool_size=[2, 2],
            strides=2,
            name='pool1')(conv1)
        # Convolutional Layer #2
        conv2 = tf.keras.layers.Conv2D(
            filters=64,
            kernel_size=[5, 5],
            padding='same',
            activation='relu',
            name='conv2')(pool1)
        # Pooling Layer #2
        pool2 = tf.keras.layers.MaxPool2D(
            pool_size=[2, 2],
            strides=2,
            name='pool2')(conv2)
        # Dense Layer
        hwc = pool2.shape.as_list()[1:]
        flattened_size = hwc[0] * hwc[1] * hwc[2]
        pool2_flat = tf.reshape(pool2, [-1, flattened_size])
        dense = tf.keras.layers.Dense(
            1024, activation='relu', name='dense')(pool2_flat)
        # Apply Dropout
        dropout = tf.keras.layers.Dropout(rate=0.4)(dense, training=is_training)
        # Logits Layer
        logits = tf.keras.layers.Dense(
            self.output_size, name='logits')(dropout)
        return logits
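As a sanity check, the completed model can be run on a random batch to confirm it produces one logit per class for each image. A condensed standalone copy of the class is repeated below so the sketch runs on its own; the shapes follow from MNIST's 28x28 images and 10 digit classes:

```python
import tensorflow as tf

class MNISTModel(object):
    def __init__(self, input_dim, output_size):
        self.input_dim = input_dim
        self.output_size = output_size

    def model_layers(self, inputs, is_training):
        reshaped_inputs = tf.reshape(
            inputs, [-1, self.input_dim, self.input_dim, 1])
        conv1 = tf.keras.layers.Conv2D(
            filters=32, kernel_size=[5, 5], padding='same',
            activation='relu', name='conv1')(reshaped_inputs)
        pool1 = tf.keras.layers.MaxPool2D(
            pool_size=[2, 2], strides=2, name='pool1')(conv1)
        conv2 = tf.keras.layers.Conv2D(
            filters=64, kernel_size=[5, 5], padding='same',
            activation='relu', name='conv2')(pool1)
        pool2 = tf.keras.layers.MaxPool2D(
            pool_size=[2, 2], strides=2, name='pool2')(conv2)
        hwc = pool2.shape.as_list()[1:]
        pool2_flat = tf.reshape(pool2, [-1, hwc[0] * hwc[1] * hwc[2]])
        dense = tf.keras.layers.Dense(
            1024, activation='relu', name='dense')(pool2_flat)
        dropout = tf.keras.layers.Dropout(rate=0.4)(dense,
                                                    training=is_training)
        logits = tf.keras.layers.Dense(
            self.output_size, name='logits')(dropout)
        return logits

# 28x28 MNIST images, 10 digit classes
model = MNISTModel(input_dim=28, output_size=10)
images = tf.random.normal([2, 784])  # batch of 2 flattened images
logits = model.model_layers(images, is_training=False)
print(logits.shape)  # (2, 10): one logit per digit class per image
```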
