From ecbe2f1051f5c3128d0f077fa2678801f6eececd Mon Sep 17 00:00:00 2001
From: Pietro Monticone <38562595+pitmonticone@users.noreply.github.com>
Date: Mon, 30 Oct 2023 18:08:36 +0100
Subject: [PATCH] Update 12_1_Self_Attention.ipynb

---
 Notebooks/Chap12/12_1_Self_Attention.ipynb | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/Notebooks/Chap12/12_1_Self_Attention.ipynb b/Notebooks/Chap12/12_1_Self_Attention.ipynb
index d1e9fe5..3a26f94 100644
--- a/Notebooks/Chap12/12_1_Self_Attention.ipynb
+++ b/Notebooks/Chap12/12_1_Self_Attention.ipynb
@@ -153,7 +153,7 @@
   {
     "cell_type": "markdown",
     "source": [
-      "We'll need a softmax function (equation 12.5) -- here, it will take a list of arbirtrary numbers and return a list where the elements are non-negative and sum to one\n"
+      "We'll need a softmax function (equation 12.5) -- here, it will take a list of arbitrary numbers and return a list where the elements are non-negative and sum to one\n"
     ],
     "metadata": {
       "id": "Se7DK6PGPSUk"
     }
@@ -364,7 +364,7 @@
   {
     "cell_type": "markdown",
     "source": [
-      "TODO -- Investigate whether the self-attention mechanism is covariant with respect to permulation.\n",
+      "TODO -- Investigate whether the self-attention mechanism is covariant with respect to permutation.\n",
       "If it is, when we permute the columns of the input matrix $\\mathbf{X}$, the columns of the output matrix $\\mathbf{X}'$ will also be permuted.\n"
     ],
     "metadata": {
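For context on the first hunk: the softmax the notebook cell describes maps a list of arbitrary numbers to non-negative values that sum to one. A minimal sketch of such a function is below; the function name `softmax` and the max-subtraction stabilization trick are illustrative assumptions here, not taken from the patched notebook itself.

```python
import numpy as np

def softmax(items_in):
    # Subtract the maximum before exponentiating; this does not change the
    # result (it cancels in the ratio) but avoids overflow for large inputs.
    shifted = np.asarray(items_in, dtype=float) - np.max(items_in)
    exp_items = np.exp(shifted)
    # Exponentiation makes every element non-negative; dividing by the sum
    # makes the elements sum to one (equation 12.5).
    return exp_items / np.sum(exp_items)

x = np.array([-3.0, 0.0, 2.0])
print(softmax(x))           # all entries non-negative, ordering preserved
print(np.sum(softmax(x)))   # sums to 1.0
```

Because exp is monotone, the output preserves the ordering of the inputs while squashing them onto the probability simplex.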
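The TODO in the second hunk can be checked numerically: if self-attention is permutation-equivariant, permuting the columns of $\mathbf{X}$ should permute the columns of $\mathbf{X}'$ in the same way. The sketch below assumes the standard column-wise formulation $\mathbf{X}' = \mathbf{V}\,\mathrm{softmax}(\mathbf{K}^\top\mathbf{Q})$ with $\mathbf{Q} = \boldsymbol\Omega_q\mathbf{X}$, $\mathbf{K} = \boldsymbol\Omega_k\mathbf{X}$, $\mathbf{V} = \boldsymbol\Omega_v\mathbf{X}$; the weight matrices and helper names are hypothetical, not taken from the notebook (which may also include bias terms).

```python
import numpy as np

rng = np.random.default_rng(0)
D, N = 4, 5  # feature dimension, number of inputs (columns of X)

# Hypothetical query/key/value weight matrices (assumed, not from the notebook)
Omega_q, Omega_k, Omega_v = (rng.standard_normal((D, D)) for _ in range(3))

def softmax_cols(A):
    # Softmax applied independently down each column of A.
    e = np.exp(A - A.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def self_attention(X):
    Q, K, V = Omega_q @ X, Omega_k @ X, Omega_v @ X
    # Attention weight [m, n] = softmax over m of k_m . q_n;
    # output column n is the attention-weighted sum of the value columns.
    return V @ softmax_cols(K.T @ Q)

X = rng.standard_normal((D, N))
perm = rng.permutation(N)
out_then_permute = self_attention(X)[:, perm]   # permute the outputs
permute_then_out = self_attention(X[:, perm])   # permute the inputs
print(np.allclose(out_then_permute, permute_then_out))  # True: equivariant
```

The check succeeds because permuting the columns of $\mathbf{X}$ conjugates $\mathbf{K}^\top\mathbf{Q}$ by the permutation matrix, the column-wise softmax commutes with that conjugation, and multiplying by $\mathbf{V}$ then reapplies the same column permutation to the output.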