- Deep Learning for NLP with Pytorch
+ Deep Learning for NLP with PyTorch
**********************************
- **Author**: `Robert Guthrie <https://github.com/rguthrie3/DeepLearningForNLPInPytorch>`_
-
- This tutorial will walk you through the key ideas of deep learning
- programming using Pytorch. Many of the concepts (such as the computation
- graph abstraction and autograd) are not unique to Pytorch and are
- relevant to any deep learning toolkit out there.
-
- I am writing this tutorial to focus specifically on NLP for people who
- have never written code in any deep learning framework (e.g, TensorFlow,
- Theano, Keras, Dynet). It assumes working knowledge of core NLP
- problems: part-of-speech tagging, language modeling, etc. It also
- assumes familiarity with neural networks at the level of an intro AI
- class (such as one from the Russel and Norvig book). Usually, these
- courses cover the basic backpropagation algorithm on feed-forward neural
- networks, and make the point that they are chains of compositions of
- linearities and non-linearities. This tutorial aims to get you started
- writing deep learning code, given you have this prerequisite knowledge.
-
- Note this is about *models*, not data. For all of the models, I just
- create a few test examples with small dimensionality so you can see how
- the weights change as it trains. If you have some real data you want to
- try, you should be able to rip out any of the models from this notebook
- and use them on it.
+ **Author**: `Robert Guthrie <https://github.com/rguthrie3/DeepLearningForNLPInPytorch>`_
+ **Translator**: `oh5221 <github.com/oh5221>`_
+
+ This tutorial will walk you through the key ideas of deep learning
+ programming using PyTorch. Many of the concepts (such as the computation
+ graph abstraction and autograd) are not unique to PyTorch and are
+ relevant to any deep learning toolkit out there.
+
+ This tutorial was written to focus specifically on NLP for people who
+ have never written code in any deep learning framework (e.g. Tensorflow,
+ Theano, Keras, Dynet). It assumes working knowledge of core NLP
+ problems: part-of-speech tagging, language modeling, etc. It also
+ assumes familiarity with neural networks at the level of an intro AI
+ class (such as one from the Russel and Norvig book). Usually, these
+ courses cover the basic backpropagation algorithm on feed-forward neural
+ networks, and make the point that they are chains of compositions of
+ linearities and non-linearities. This tutorial aims to get you started
+ writing deep learning code, given you have this prerequisite knowledge.
+
+ Note this is about *models*, not data. For all of the models, I just
+ create a few test examples with small dimensionality so you can see how
+ the weights change as it trains. If you have some real data you want to
+ try, you should be able to rip out any of the models from this notebook
+ and use them on it.
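As a minimal sketch of the computation-graph and autograd idea mentioned above (illustrative only, not part of the patch; assumes PyTorch is installed):

```python
import torch

# Build a tiny computation graph: y = w * x + b.
# requires_grad=True tells autograd to track operations on these tensors.
x = torch.tensor(2.0)
w = torch.tensor(3.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)

y = w * x + b    # the forward pass records the graph
y.backward()     # autograd traverses the graph to compute gradients

print(w.grad.item())  # dy/dw = x = 2.0
print(b.grad.item())  # dy/db = 1.0
```

The same pattern scales to full models: build outputs from tracked tensors, call `backward()`, and read gradients from `.grad`.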
.. toctree::
@@ -36,19 +36,20 @@ and use them on it.
.. galleryitem:: /beginner/nlp/pytorch_tutorial.py
- :intro: All of deep learning is computations on tensors, which are generalizations of a matrix that can be
+ :intro: All of deep learning is computations on tensors, which are generalizations of a matrix.
.. galleryitem:: /beginner/nlp/deep_learning_tutorial.py
- :intro: Deep learning consists of composing linearities with non-linearities in clever ways. The introduction of non-linearities allows
+ :intro: Deep learning consists of composing linearities with non-linearities in clever ways. The introduction of non-linearities allows
43
43
44
44
.. galleryitem :: /beginner/nlp/word_embeddings_tutorial.py
- :intro: Word embeddings are dense vectors of real numbers, one per word in your vocabulary. In NLP, it is almost always the case that your features are
+ :intro: Word embeddings are dense vectors of real numbers, one per word in your vocabulary. In NLP, it is almost always the case for your features.
.. galleryitem:: /beginner/nlp/sequence_models_tutorial.py
- :intro: At this point, we have seen various feed-forward networks. That is, there is no state maintained by the network at all.
+ :intro: At this point, we have seen various feed-forward networks. That is, there is no state maintained by the network at all.
+
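The contrast with stateless feed-forward networks can be sketched with an LSTM, which does carry state across time steps (illustrative only, assuming PyTorch is installed):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# An LSTM, unlike a feed-forward net, carries a hidden state across time steps.
lstm = nn.LSTM(input_size=4, hidden_size=2)

sequence = torch.randn(5, 1, 4)        # 5 time steps, batch of 1, 4 features
outputs, (h_n, c_n) = lstm(sequence)

print(outputs.shape)  # torch.Size([5, 1, 2]): one output per time step
print(h_n.shape)      # torch.Size([1, 1, 2]): final hidden state
```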
.. galleryitem:: /beginner/nlp/advanced_tutorial.py
- :intro: Dynamic versus Static Deep Learning Toolkits. Pytorch is a *dynamic* neural network kit.
+ :intro: Dynamic versus static deep learning toolkits. PyTorch is a *dynamic* neural network kit.
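What "dynamic" buys you can be sketched in a few lines: the graph is rebuilt on every forward pass, so plain Python control flow may depend on the data (an illustrative example, assuming PyTorch is installed):

```python
import torch

# In a dynamic toolkit the graph is rebuilt on every forward pass, so
# ordinary Python control flow can depend on the data itself.
def forward(x, w):
    steps = int(x.sum().item()) % 3 + 1  # loop count depends on runtime data
    y = x
    for _ in range(steps):
        y = y * w
    return y.sum()

x = torch.ones(2)
w = torch.tensor(2.0, requires_grad=True)
loss = forward(x, w)   # here: 3 multiplications, loss = 2 * 2**3 = 16
loss.backward()        # gradient of 2*w**3 is 6*w**2 = 24
print(loss.item(), w.grad.item())  # 16.0 24.0
```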
.. raw:: html