
azizsdcn

My assignments for Self Driving Car Nanodegree

Terms I heard for the first time:

- Stochastic Gradient Descent
- Entropy
- Softmax
- Rectified Linear Unit (ReLU)

A Rectified Linear Unit (ReLU) is a type of activation function defined as f(x) = max(0, x): it returns 0 if x is negative, otherwise it returns x unchanged. TensorFlow provides the ReLU function as tf.nn.relu(), as shown below.
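A minimal sketch of tf.nn.relu in action; the input values are arbitrary examples, and the snippet assumes TensorFlow 2.x eager execution rather than the course's original session-based code.

```python
import tensorflow as tf

# Example inputs mixing negative, zero, and positive values (arbitrary).
x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])

# ReLU applies f(x) = max(0, x) element-wise:
# negatives become 0, non-negatives pass through unchanged.
y = tf.nn.relu(x)

print(y.numpy())  # [0. 0. 0. 2. 5.]
```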