what are activation functions, losses and optimisers? | python code: ReLU, sigmoid and tanh

Published: 17 March 2025
on the channel: Data Science Made Easy

what are activation functions?
what is the ReLU activation function?
what is the sigmoid activation function?
what is the tanh activation function?
python code for the ReLU, sigmoid and tanh activation functions (see the sketch below)
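
A minimal NumPy sketch of the three activations covered in the video (the exact code shown on screen may differ):

import numpy as np

def relu(x):
    # ReLU: max(0, x), zeroes out negative inputs
    return np.maximum(0, x)

def sigmoid(x):
    # sigmoid: squashes inputs into (0, 1)
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # tanh: squashes inputs into (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("relu:   ", relu(x))
print("sigmoid:", sigmoid(x))
print("tanh:   ", tanh(x))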
what are losses?
what is MSE loss?
what is cross entropy loss?
what is categorical cross entropy loss?
python code for MSE, cross entropy and categorical cross entropy loss (see the sketch below)
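
A minimal NumPy sketch of the three losses, assuming binary 0/1 labels for cross entropy and one-hot labels with predicted probabilities for categorical cross entropy (the exact code in the video may differ):

import numpy as np

def mse(y_true, y_pred):
    # mean squared error: average of squared differences
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # binary cross entropy; eps guards against log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # categorical cross entropy over one-hot labels and class probabilities
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7])
print("mse:", mse(y_true, y_pred))
print("cross entropy:", binary_cross_entropy(y_true, y_pred))

onehot = np.array([[1, 0, 0], [0, 1, 0]], dtype=float)
probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print("categorical cross entropy:", categorical_cross_entropy(onehot, probs))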
what are optimisers?
what is the Adam optimiser?
what is the gradient descent optimiser?
what is the momentum optimiser?
what is the Nesterov Accelerated Gradient (NAG) optimiser?
python code for gradient descent, momentum, NAG and Adam optimisers (see the sketch below)
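
A minimal sketch of one update step for each optimiser on the toy objective f(w) = w**2; the learning rate and momentum/beta values here are illustrative assumptions, not necessarily the video's settings:

import numpy as np

def grad(w):
    # gradient of the toy objective f(w) = w**2
    return 2 * w

def gradient_descent(w, lr=0.1):
    # plain gradient descent: step against the gradient
    return w - lr * grad(w)

def momentum(w, v, lr=0.1, beta=0.9):
    # momentum: accumulate a velocity term to smooth updates
    v = beta * v + grad(w)
    return w - lr * v, v

def nag(w, v, lr=0.1, beta=0.9):
    # NAG: evaluate the gradient at the look-ahead point w - lr*beta*v
    v = beta * v + grad(w - lr * beta * v)
    return w - lr * v, v

def adam(w, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first- and second-moment estimates of the gradient
    g = grad(w)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

w = 5.0
print("gradient descent:", gradient_descent(w))
w_m, v_m = momentum(w, 0.0)
print("momentum:", w_m)
w_n, v_n = nag(w, 0.0)
print("nag:", w_n)
w_a, m_a, v_a = adam(w, 0.0, 0.0, t=1)
print("adam:", w_a)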

#loss
#optimizer
#activation
#relu
#sigmoid
#tanh
#mse
#crossentropyloss
#categoricalcrossentropy
#adam
#gradientdescent
#nesterovacceleratedgradient
#pythonprogramming
#pythoncoding
#python