Commit 220ae2b (parent: 4a47b83)

Update content src/site/notes/AI/transformers/transformers architecture.md
1 file changed: src/site/notes/AI/transformers/transformers architecture.md (+20, -2 lines)
@@ -13,9 +13,27 @@ https://zhuanlan.zhihu.com/p/82312421
 
 ![Pasted image 20241124140104.png](/img/user/AI/transformers/attachments/Pasted%20image%2020241124140104.png)
 
+https://www.youtube.com/watch?v=wjZofJX0v4M
 
-![Transformers(how LLMs work)](https://www.youtube.com/watch?v=wjZofJX0v4M)
 
 
+<div style="position: relative; width: 100%; max-width: 800px; height: 0; padding-bottom: 56.25%; margin: 0 auto;">
+<iframe
+src="https://www.youtube.com/embed/wjZofJX0v4M?rel=0&modestbranding=1&autoplay=0&showinfo=0&fs=1&disablekb=1"
+style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border: none;"
+allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
+allowfullscreen>
+</iframe>
+</div>
 
-![Pasted image 20250119001634.png](/img/user/AI/transformers/attachments/Pasted%20image%2020250119001634.png)
+
+
+
+![Pasted image 20250119001634.png](/img/user/AI/transformers/attachments/Pasted%20image%2020250119001634.png)
+
+
+### 1. Embedding layer
+[[AI/torch/nn/embedding\|embedding]]
+The embedding layer maps vocabulary tokens to vectors in a high-dimensional embedding space; for English this is roughly 5k+ vectors. Besides representing a token's meaning, positional encoding also folds the token's position in the context into the vector. The main reason for encoding position this way is to allow fully parallel computation: unlike an RNN, there is no step-by-step sequential dependency, so GPU parallelism can be exploited to the fullest.
+### 2. Attention layer
+### 3. Feed-forward layer
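The embedding-plus-position idea added above can be sketched in a few lines. This is a minimal illustration, not code from the note: it uses numpy instead of `torch.nn.Embedding`, a randomly initialized lookup table standing in for learned weights, and the fixed sinusoidal positional encoding; all names and sizes here are made up for the example.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of fixed positional encodings."""
    positions = np.arange(seq_len)[:, None]        # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]       # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)                   # odd dimensions: cosine
    return pe

# Toy sizes; a real model learns the embedding table during training.
vocab_size, d_model, seq_len = 1000, 16, 8
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, d_model))

token_ids = np.array([3, 14, 15, 92, 6, 5, 3, 5])  # hypothetical token ids
# Lookup meaning vectors, then add position information elementwise.
x = embedding_table[token_ids] + sinusoidal_positional_encoding(seq_len, d_model)
print(x.shape)  # (8, 16)
```

Because position is baked into each vector independently, all `seq_len` rows of `x` can be processed by the following layers at once, which is exactly the parallelism advantage over RNNs described above.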
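The attention-layer section above is still a stub; as a placeholder for what goes there, here is a hedged sketch of single-head scaled dot-product attention, the core operation of that layer. The shapes and random inputs are illustrative assumptions, not the note's own material.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return softmax(Q K^T / sqrt(d_k)) V for one attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq, seq) pairwise similarities
    # Numerically stable row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights      # weighted mix of value vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # toy queries, keys, values: 4 tokens, d_k = 8
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8); each row of w sums to 1
```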

0 commit comments