Update 12_1_Self_Attention.ipynb

This commit is contained in:
Pietro Monticone
2023-10-30 18:08:36 +01:00
parent 4f2d3f31f0
commit ecbe2f1051


@@ -153,7 +153,7 @@
 {
 "cell_type": "markdown",
 "source": [
-"We'll need a softmax function (equation 12.5) -- here, it will take a list of arbirtrary numbers and return a list where the elements are non-negative and sum to one\n"
+"We'll need a softmax function (equation 12.5) -- here, it will take a list of arbitrary numbers and return a list where the elements are non-negative and sum to one\n"
 ],
 "metadata": {
 "id": "Se7DK6PGPSUk"
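The softmax described in the changed cell can be sketched in a few lines of plain Python; this is a hedged illustration of the property the text states (non-negative outputs summing to one), not the notebook's own implementation. The max-subtraction is a standard numerical-stability trick.

```python
import math

def softmax(values):
    # Subtract the maximum before exponentiating for numerical stability;
    # this does not change the result because the shift cancels in the ratio.
    m = max(values)
    exps = [math.exp(v - m) for v in values]
    total = sum(exps)
    # Each element is non-negative and the elements sum to one.
    return [e / total for e in exps]

probs = softmax([1.0, -2.0, 3.0])
```

Any list of arbitrary real numbers maps to a valid probability distribution, with larger inputs receiving larger probabilities.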
@@ -364,7 +364,7 @@
 {
 "cell_type": "markdown",
 "source": [
-"TODO -- Investigate whether the self-attention mechanism is covariant with respect to permulation.\n",
+"TODO -- Investigate whether the self-attention mechanism is covariant with respect to permutation.\n",
 "If it is, when we permute the columns of the input matrix $\\mathbf{X}$, the columns of the output matrix $\\mathbf{X}'$ will also be permuted.\n"
 ],
 "metadata": {
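The permutation property flagged in the TODO can be checked numerically. The following is a minimal NumPy sketch, not the notebook's code: it assumes the columns-as-tokens convention from the changed cell ($\mathbf{X}$ with one token per column), random query/key/value matrices `Wq`, `Wk`, `Wv`, and a column-wise softmax over the attention scores. Permuting the input columns should permute the output columns identically.

```python
import numpy as np

rng = np.random.default_rng(0)
D, N = 4, 5                       # feature dimension, number of tokens (columns)
X = rng.standard_normal((D, N))   # inputs stored as columns
Wq, Wk, Wv = (rng.standard_normal((D, D)) for _ in range(3))

def softmax_cols(A):
    # Column-wise softmax: each column of A becomes a probability distribution.
    E = np.exp(A - A.max(axis=0, keepdims=True))
    return E / E.sum(axis=0, keepdims=True)

def self_attention(X):
    # Hypothetical minimal self-attention (no scaling, single head).
    Q, K, V = Wq @ X, Wk @ X, Wv @ X
    return V @ softmax_cols(K.T @ Q)

perm = rng.permutation(N)
out_then_permute = self_attention(X)[:, perm]   # attend, then permute columns
permute_then_out = self_attention(X[:, perm])   # permute columns, then attend
# If the two agree, self-attention is equivariant to column permutation.
```

Because the queries, keys, and values are computed column by column and the softmax acts within each column of scores, a column permutation of the input propagates unchanged to the output.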