Building the model
The main architectural difference between this regression model and the classification models we built previously lies in how we construct the network's last layer. Recall that in a classic scalar regression problem, such as the one at hand, we aim to predict a continuous variable. To implement this, we avoid using an activation function in our last layer and use only a single output neuron.
The reason we forgo an activation function is that we do not want to constrain the range of values this layer can output. Because the final layer is purely linear, the network is free to learn to predict a continuous scalar value, just as we want it to:
from keras.layers import Dense
from keras.models import Sequential

model = Sequential()
# The 13 input features feed into two hidden layers of 26 units each
model.add(Dense(26, activation='relu', input_shape=(13,)))
model.add(Dense(26, activation='relu'))
model.add(Dense(12, activation='relu'))
# Single output neuron with no activation: a purely linear layer,
# so the network can predict any continuous scalar value
model.add(Dense(1))
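To train this network, we would compile it with a loss suited to scalar regression. The snippet below is a minimal sketch: mean squared error is a standard regression loss, while the choice of optimizer, the mean absolute error metric, and the x_train and y_train arrays (assumed to have shapes (n_samples, 13) and (n_samples,)) are illustrative assumptions rather than settings taken from this section:

# MSE penalizes large errors quadratically; MAE reports the average
# error in the same units as the target, which is easier to interpret
model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])

# Hypothetical training call on standardized features:
# model.fit(x_train, y_train, epochs=80, batch_size=16)

Because the output layer is linear, MSE compares the raw predicted scalar directly against the target value, with no squashing of the prediction range.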