- Use custom weight initializers for better convergence, especially in deep networks:

```python
from tensorflow.keras.initializers import HeNormal
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

# He-initialized weights pair well with the ReLU activations used here.
model = Sequential([
    Dense(128, activation='relu', kernel_initializer=HeNormal(), input_shape=(20,)),
    Dense(64, activation='relu', kernel_initializer=HeNormal()),
    Dense(1),
])
model.compile(optimizer='adam', loss='mse')
```
- For ReLU-based networks, He initialization tends to speed convergence: it scales the initial weight variance to 2/fan_in, compensating for the roughly half of the activations that ReLU zeroes out, so signal variance stays stable through deep stacks (see the sketch after this list).
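As a point of reference, Keras documents `HeNormal` as a truncated normal draw with `stddev = sqrt(2 / fan_in)`, i.e. a special case of the general `VarianceScaling` initializer. The minimal sketch below, assuming a standard TensorFlow 2.x install, expresses the same rule through `VarianceScaling` and compares the sampled standard deviations against the nominal value; the `fan_in = 128` width is a hypothetical choice for the check.

```python
import numpy as np
from tensorflow.keras.initializers import HeNormal, VarianceScaling

fan_in = 128  # hypothetical input width, chosen only for this check

# HeNormal is documented as VarianceScaling with scale=2.0,
# mode='fan_in', distribution='truncated_normal'.
he = HeNormal(seed=0)
vs = VarianceScaling(scale=2.0, mode='fan_in',
                     distribution='truncated_normal', seed=0)

w_he = he(shape=(fan_in, 64)).numpy()
w_vs = vs(shape=(fan_in, 64)).numpy()

# Both samples should cluster near sqrt(2 / fan_in); truncation pulls
# the observed std slightly below the nominal value.
print(np.std(w_he), np.std(w_vs), np.sqrt(2.0 / fan_in))
```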