Regularization Techniques (L1/L2/Dropout)
- Prevent overfitting by adding an L1/L2 weight penalty or dropout layers:
```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.regularizers import l2

model = Sequential([
    # L2 penalty shrinks this layer's weights toward zero
    Dense(128, activation='relu', kernel_regularizer=l2(0.001), input_shape=(20,)),
    Dropout(0.5),  # randomly zero 50% of activations during training
    Dense(64, activation='relu'),
    Dropout(0.5),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')
```
- Dropout randomly zeroes a fraction of neuron activations during training (50% here), making the model less dependent on any specific path through the network and helping it generalize better; Keras disables dropout automatically at inference time.
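The snippet above applies only an L2 penalty. As a minimal sketch (penalty strengths are illustrative, not tuned), the same `kernel_regularizer` argument also accepts an L1 or a combined L1+L2 penalty via `tensorflow.keras.regularizers.l1` and `l1_l2`; L1 pushes weights to exact zeros (sparsity), while L2 only shrinks them smoothly:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.regularizers import l1, l1_l2

# Illustrative penalty strengths; tune them on validation data.
model_l1 = Sequential([
    Dense(128, activation='relu',
          kernel_regularizer=l1(0.001),  # L1: encourages sparse weights
          input_shape=(20,)),
    Dense(64, activation='relu',
          kernel_regularizer=l1_l2(l1=0.001, l2=0.001)),  # combined L1+L2 penalty
    Dense(1)
])
model_l1.compile(optimizer='adam', loss='mse')
```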