Weight Initialization

  • Use custom weight initializers for better convergence, especially in deep networks:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.initializers import HeNormal

# He initialization on the ReLU layers; the output layer keeps the default.
model = Sequential([
    Dense(128, activation='relu', kernel_initializer=HeNormal(), input_shape=(20,)),
    Dense(64, activation='relu', kernel_initializer=HeNormal()),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')
```
  • For ReLU-based networks, He initialization scales each layer's initial weights to have standard deviation √(2 / fan_in), which keeps the variance of activations roughly constant through ReLU layers and typically speeds up convergence (see the sanity check after this list).
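
As a quick sanity check, here is a minimal sketch (the layer sizes are arbitrary, chosen only for illustration) that samples a He-normal weight matrix and compares its empirical standard deviation to the theoretical √(2 / fan_in). Keras draws He-normal samples from a truncated normal, so the measured value comes out slightly below the target.

```python
import numpy as np
from tensorflow.keras.initializers import HeNormal

fan_in = 128                        # arbitrary input width for illustration
init = HeNormal(seed=42)
weights = init(shape=(fan_in, 64)).numpy()

# Empirical std vs. the He target sqrt(2 / fan_in) ≈ 0.125;
# the truncated normal used by Keras lands a bit under the target.
print(f"empirical std: {weights.std():.4f}")
print(f"target std:    {np.sqrt(2.0 / fan_in):.4f}")
```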
