Update CM20315_Gradients_II.ipynb

This commit is contained in:
Pietro Monticone
2023-11-30 16:36:35 +01:00
parent ef28d848df
commit 6b2f25101e


@@ -32,7 +32,7 @@
"source": [
"# Gradients II: Backpropagation algorithm\n",
"\n",
-"In this practical, we'll investigate the backpropagation algoritithm. This computes the gradients of the loss with respect to all of the parameters (weights and biases) in the network. We'll use these gradients when we run stochastic gradient descent."
+"In this practical, we'll investigate the backpropagation algorithm. This computes the gradients of the loss with respect to all of the parameters (weights and biases) in the network. We'll use these gradients when we run stochastic gradient descent."
],
"metadata": {
"id": "L6chybAVFJW2"
@@ -53,7 +53,7 @@
{
"cell_type": "markdown",
"source": [
-"First let's define a neural network. We'll just choose the weights and biaes randomly for now"
+"First let's define a neural network. We'll just choose the weights and biases randomly for now"
],
"metadata": {
"id": "nnUoI0m6GyjC"
@@ -178,7 +178,7 @@
{
"cell_type": "markdown",
"source": [
-"Now let's define a loss function. We'll just use the least squaures loss function. We'll also write a function to compute dloss_doutpu"
+"Now let's define a loss function. We'll just use the least squares loss function. We'll also write a function to compute dloss_doutpu"
],
"metadata": {
"id": "SxVTKp3IcoBF"
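
The last hunk mentions a least squares loss and a helper computing the derivative of that loss with respect to the network output. The notebook's actual implementation is not shown in this diff; a minimal sketch of what such a pair of functions might look like (function names and the use of NumPy are assumptions, not taken from the commit):

```python
import numpy as np

def least_squares_loss(net_output, y):
    # Sum of squared differences between network output and targets
    # (hypothetical implementation; the notebook's version is not in this diff)
    return np.sum((net_output - y) ** 2)

def d_loss_d_output(net_output, y):
    # Derivative of the least squares loss with respect to the network
    # output, used as the starting point of the backward pass
    return 2.0 * (net_output - y)
```

The backward pass would start from `d_loss_d_output` and propagate this gradient back through the layers to obtain gradients for every weight and bias.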