
Day 23: Building a Neural Network

Let's look at the coffee roast data again:


Temperature (Celsius)   Duration (minutes)   y
200                     17                   1
120                      5                   0
425                     20                   0
212                     18                   1


from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

layer_1 = Dense(units=3, activation="sigmoid")
layer_2 = Dense(units=1, activation="sigmoid")

model = Sequential([layer_1, layer_2])

Instead of manually taking the data and passing it to layer_1, then taking the activations from layer_1 and passing them to layer_2, we can tell TensorFlow to take layer_1 and layer_2 and string them together to form a neural network.
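For contrast, here's a minimal sketch of that manual wiring, calling each layer on the previous layer's output yourself. It reuses layer_1 and layer_2 from the snippet above, with one made-up example just to show the shapes:

import numpy as np

x_example = np.array([[200.0, 17.0]])   # a single example, values from the table above

a1 = layer_1(x_example)   # hidden-layer activations, shape (1, 3)
a2 = layer_2(a1)          # output activation (the prediction), shape (1, 1)

With Sequential, TensorFlow handles this chaining for us.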

import numpy as np

x = np.array([[200.0, 17.0],        # 4x2 matrix: (temperature, duration) per row
              [120.0,  5.0],
              [425.0, 20.0],
              [212.0, 18.0]])

y = np.array([1, 0, 0, 1])          # target labels from the table above

If you want to train the neural network on this data, all you need to do is call two functions:

model.compile(...)        # more on this next week

and then model.fit(x, y), which tells TensorFlow to take the neural network created by sequentially stringing together layer_1 and layer_2 and train it on the data x and y.
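Here's a rough sketch of what those two calls could look like together; the loss function and the epochs value are placeholders I've filled in for illustration, not choices made in this post:

from tensorflow.keras.losses import BinaryCrossentropy

model.compile(loss=BinaryCrossentropy())   # loss choice is a placeholder for now
model.fit(x, y, epochs=10)                 # epochs=10 is an arbitrary example value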


model.predict(x_new) carries out forward propagation and inference for you, using the neural network that you built with Sequential and compiled above.
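For example, with a hypothetical x_new holding two new roasting settings (the values are made up purely for illustration):

x_new = np.array([[210.0, 15.5],
                  [150.0,  8.0]])      # hypothetical new (temperature, duration) pairs

predictions = model.predict(x_new)     # forward propagation only; sigmoid outputs between 0 and 1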


By convention, this is how we'll usually write the code:

model = Sequential([
    Dense(units=3, activation="sigmoid"),
    Dense(units=1, activation="sigmoid")])

Side Note:

Throughout the course, the term inference tends to be used when discussing forward propagation. To clarify, forward propagation and inference are related concepts in the context of neural networks, but they are not exactly the same:

Forward propagation:

  • The process of passing input data through a neural network to compute predictions or activations for a given set of input examples.

  • It occurs during both the training and inference phases.

  • During training, forward propagation calculates predictions, and these predictions are compared to the actual target values (the ground-truth labels y) to compute a loss or error. This loss is then used to update the model's parameters through back-propagation (more on this in the next course: Deep Learning).

  • During inference, forward propagation is used to make predictions on new, unseen data. The model's learned parameters are fixed, and it only performs the forward pass to make predictions.
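To make the forward pass concrete, here is a minimal NumPy sketch of forward propagation through the same 2-layer architecture. The weights and biases are random placeholders rather than trained values, so only the mechanics matter, not the outputs:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

x = np.array([[200.0, 17.0],
              [120.0,  5.0],
              [425.0, 20.0],
              [212.0, 18.0]])                 # the same 4x2 input as above

# Placeholder parameters; in practice these come from training
W1, b1 = np.random.randn(2, 3), np.zeros(3)   # layer_1: 2 features -> 3 units
W2, b2 = np.random.randn(3, 1), np.zeros(1)   # layer_2: 3 units -> 1 output

a1 = sigmoid(x @ W1 + b1)    # hidden-layer activations, shape (4, 3)
a2 = sigmoid(a1 @ W2 + b2)   # output activations (predictions), shape (4, 1)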

Inference:

  • Inference is the process of using a trained neural network to make predictions on new, unseen data.

  • During inference, the model's parameters are fixed and not updated.

  • The main goal of inference is to obtain predictions or classifications for input data, and it does not involve updating model weights or computing gradients.
