Created using Colab

This commit is contained in:
udlbook
2024-12-16 16:06:30 -05:00
parent 9b71ac0487
commit dd9a56d96b

@@ -67,7 +67,7 @@
"# Set seed so we always get the same random numbers\n",
"np.random.seed(0)\n",
"\n",
-"# Number of layers\n",
+"# Number of hidden layers\n",
"K = 5\n",
"# Number of neurons per layer\n",
"D = 6\n",
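For context, the parameter lists this diff indexes can be built as follows. This is a minimal sketch, not the notebook's exact code: the input/output sizes `D_i` and `D_o` and the standard-normal initialization are illustrative assumptions.

```python
import numpy as np

# Set seed so we always get the same random numbers
np.random.seed(0)

K = 5            # number of hidden layers
D = 6            # neurons per hidden layer
D_i, D_o = 1, 1  # illustrative input/output sizes (not specified in the diff)

# With K hidden layers there are K+1 weight matrices and K+1 bias vectors,
# indexed 0..K -- hence the loops later in the diff run over range(K+1)
all_weights = [None] * (K + 1)
all_biases = [None] * (K + 1)
all_weights[0] = np.random.normal(size=(D, D_i))   # input -> first hidden layer
all_biases[0] = np.random.normal(size=(D, 1))
for layer in range(1, K):
    all_weights[layer] = np.random.normal(size=(D, D))  # hidden -> hidden
    all_biases[layer] = np.random.normal(size=(D, 1))
all_weights[K] = np.random.normal(size=(D_o, D))   # last hidden layer -> output
all_biases[K] = np.random.normal(size=(D_o, 1))
```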
@@ -114,7 +114,7 @@
{
"cell_type": "markdown",
"source": [
-"Now let's run our random network. The weight matrices $\\boldsymbol\\Omega_{1\\ldots K}$ are the entries of the list \"all_weights\" and the biases $\\boldsymbol\\beta_{1\\ldots K}$ are the entries of the list \"all_biases\"\n",
+"Now let's run our random network. The weight matrices $\\boldsymbol\\Omega_{0\\ldots K}$ are the entries of the list \"all_weights\" and the biases $\\boldsymbol\\beta_{0\\ldots K}$ are the entries of the list \"all_biases\"\n",
"\n",
"We know that we will need the preactivations $\\mathbf{f}_{0\\ldots K}$ and the activations $\\mathbf{h}_{1\\ldots K}$ for the forward pass of backpropagation, so we'll store and return these as well.\n"
],
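A forward pass matching the quantities named in that markdown cell might look like the sketch below. The ReLU activation and the choice to treat $\mathbf{h}_0$ as the network input are assumptions for illustration, not taken from the diff.

```python
import numpy as np

def ReLU(preactivation):
    # Rectified linear unit: zero out negative entries
    return np.clip(preactivation, 0, None)

def forward_pass(net_input, all_weights, all_biases):
    K = len(all_weights) - 1      # number of hidden layers
    all_f = [None] * (K + 1)      # preactivations f_0..f_K (stored for backprop)
    all_h = [None] * (K + 1)      # activations; h_0 holds the input, h_1..h_K follow
    all_h[0] = net_input
    for layer in range(K + 1):
        all_f[layer] = all_biases[layer] + all_weights[layer] @ all_h[layer]
        if layer < K:
            all_h[layer + 1] = ReLU(all_f[layer])
    # f_K is the network output; f and h are kept for the backward pass
    return all_f[K], all_f, all_h
```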
@@ -299,7 +299,7 @@
"delta_fd = 0.000001\n",
"\n",
"# Test the derivatives of the bias vectors\n",
-"for layer in range(K):\n",
+"for layer in range(K+1):\n",
" dl_dbias = np.zeros_like(all_dl_dbiases[layer])\n",
" # For every element in the bias\n",
" for row in range(all_biases[layer].shape[0]):\n",
@@ -323,7 +323,7 @@
"\n",
"\n",
"# Test the derivatives of the weights matrices\n",
-"for layer in range(K):\n",
+"for layer in range(K+1):\n",
" dl_dweight = np.zeros_like(all_dl_dweights[layer])\n",
" # For every element in the bias\n",
" for row in range(all_weights[layer].shape[0]):\n",
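The loops changed above implement a finite-difference gradient check over all K+1 parameter sets. The pattern, shown here in isolation for the bias vectors with a hypothetical `compute_loss` callable standing in for the notebook's loss evaluation, is roughly:

```python
import numpy as np

def finite_difference_bias_grads(all_biases, compute_loss, delta_fd=1e-6):
    """Estimate dloss/dbias for every bias vector by forward differences.

    `compute_loss` is a hypothetical zero-argument callable that evaluates
    the network loss at the current parameter values.
    """
    # K hidden layers give K+1 bias vectors, so iterate over range(K+1)
    K = len(all_biases) - 1
    grads = []
    base_loss = compute_loss()
    for layer in range(K + 1):
        dl_dbias = np.zeros_like(all_biases[layer])
        # For every element in the bias
        for row in range(all_biases[layer].shape[0]):
            all_biases[layer][row] += delta_fd      # perturb one element
            dl_dbias[row] = (compute_loss() - base_loss) / delta_fd
            all_biases[layer][row] -= delta_fd      # restore it
        grads.append(dl_dbias)
    return grads
```

An analytic gradient from backpropagation can then be compared entry-by-entry against these estimates; iterating only `range(K)` would silently skip the check on the final layer's parameters, which is what this commit fixes.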