Commit 9cb5af4

Translate beginner_source/deep_learning_nlp_tutorial.rst (#887)
1 parent 2f5c892 commit 9cb5af4

File tree: 1 file changed, +30 −29 lines


beginner_source/deep_learning_nlp_tutorial.rst
@@ -1,28 +1,28 @@
-Deep Learning for NLP with Pytorch
+Deep Learning for NLP with PyTorch
 **********************************
-**Author**: `Robert Guthrie <https://github.com/rguthrie3/DeepLearningForNLPInPytorch>`_
-
-This tutorial will walk you through the key ideas of deep learning
-programming using Pytorch. Many of the concepts (such as the computation
-graph abstraction and autograd) are not unique to Pytorch and are
-relevant to any deep learning toolkit out there.
-
-I am writing this tutorial to focus specifically on NLP for people who
-have never written code in any deep learning framework (e.g, TensorFlow,
-Theano, Keras, Dynet). It assumes working knowledge of core NLP
-problems: part-of-speech tagging, language modeling, etc. It also
-assumes familiarity with neural networks at the level of an intro AI
-class (such as one from the Russel and Norvig book). Usually, these
-courses cover the basic backpropagation algorithm on feed-forward neural
-networks, and make the point that they are chains of compositions of
-linearities and non-linearities. This tutorial aims to get you started
-writing deep learning code, given you have this prerequisite knowledge.
-
-Note this is about *models*, not data. For all of the models, I just
-create a few test examples with small dimensionality so you can see how
-the weights change as it trains. If you have some real data you want to
-try, you should be able to rip out any of the models from this notebook
-and use them on it.
+**Author**: `Robert Guthrie <https://github.com/rguthrie3/DeepLearningForNLPInPytorch>`_
+**Translator**: `오수연 <github.com/oh5221>`_
+
+This tutorial will walk you through the key ideas of deep learning
+programming using PyTorch. Many of the concepts (such as the computation
+graph abstraction and autograd) are not unique to PyTorch and are
+relevant to any deep learning toolkit out there.
+
+This tutorial focuses specifically on NLP for people who have never
+written code in any deep learning framework (e.g., TensorFlow, Theano,
+Keras, DyNet). It assumes working knowledge of core NLP problems such as
+part-of-speech tagging and language modeling. It also assumes
+familiarity with neural networks at the level of an intro AI class
+(such as one from the Russell and Norvig book). Usually, these courses
+cover the basic backpropagation algorithm on feed-forward neural
+networks, and make the point that they are chains of compositions of
+linearities and non-linearities. This tutorial aims to get you started
+writing deep learning code, given that you have this prerequisite
+knowledge.
+
+Note that this tutorial is about *models*, not data. For all of the
+models, I just create a few test examples with small dimensionality so
+you can see how the weights change during training. If you have some
+real data you want to try, you should be able to rip out any of the
+models from this notebook and use them on it.
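As a quick illustration of the computation-graph and autograd idea mentioned above, here is a minimal sketch. It is not part of the tutorial file itself; the variable names are ours.

```python
import torch

# PyTorch builds the computation graph on the fly as you compute,
# then autograd walks it backwards to produce gradients.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # y = x1^2 + x2^2 + x3^2
y.backward()         # backpropagate through the recorded graph
print(x.grad)        # dy/dx = 2x -> tensor([2., 4., 6.])
```

The same pattern (record forward computation, call `backward()`) applies to any model, not just this toy expression.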
 .. toctree::
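The prose above describes feed-forward networks as chains of linearities and non-linearities, trained by backpropagation on small toy examples so the weight updates are easy to inspect. A rough sketch of that idea, with made-up dimensions and data (this snippet is illustrative, not taken from the tutorial):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A chain of linearities (nn.Linear) and a non-linearity (nn.Tanh).
model = nn.Sequential(nn.Linear(3, 4), nn.Tanh(), nn.Linear(4, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(8, 3)       # 8 toy examples, 3 features each
target = torch.randn(8, 1)  # made-up regression targets

before = model[0].weight.detach().clone()
loss = loss_fn(model(x), target)
opt.zero_grad()
loss.backward()             # backprop through the whole chain
opt.step()                  # weights move against the gradient

delta = (model[0].weight.detach() - before).abs().max()
print(delta)                # small but nonzero: the weights changed
```

With such low-dimensional data, printing a weight matrix before and after a step makes the effect of one backpropagation update visible.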
@@ -36,19 +36,20 @@ and use them on it.
 .. galleryitem:: /beginner/nlp/pytorch_tutorial.py
-   :intro: All of deep learning is computations on tensors, which are generalizations of a matrix that can be
+   :intro: All of deep learning is computation on tensors, which are generalizations of matrices.

 .. galleryitem:: /beginner/nlp/deep_learning_tutorial.py
-   :intro: Deep learning consists of composing linearities with non-linearities in clever ways. The introduction of non-linearities allows
+   :intro: Deep learning consists of composing linearities with non-linearities in clever ways.

 .. galleryitem:: /beginner/nlp/word_embeddings_tutorial.py
-   :intro: Word embeddings are dense vectors of real numbers, one per word in your vocabulary. In NLP, it is almost always the case that your features are
+   :intro: Word embeddings are dense vectors of real numbers, one per word in your vocabulary.

 .. galleryitem:: /beginner/nlp/sequence_models_tutorial.py
-   :intro: At this point, we have seen various feed-forward networks. That is, there is no state maintained by the network at all.
+   :intro: At this point, we have seen various feed-forward networks. That is, there is no state maintained by the network at all.
+

 .. galleryitem:: /beginner/nlp/advanced_tutorial.py
-   :intro: Dynamic versus Static Deep Learning Toolkits. Pytorch is a *dynamic* neural network kit.
+   :intro: Dynamic versus static deep learning toolkits: PyTorch is a *dynamic* neural network kit.

 .. raw:: html
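For reference, the "dense vector per word" idea from the word-embeddings entry above can be sketched with `torch.nn.Embedding`. This is an assumed minimal example, not code from the tutorial; the toy vocabulary is ours.

```python
import torch
import torch.nn as nn

vocab = {"hello": 0, "world": 1}          # toy two-word vocabulary
emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=5)

idx = torch.tensor([vocab["hello"]])
vec = emb(idx)        # a dense, learnable real-valued vector per word
print(vec.shape)      # torch.Size([1, 5])
```

The embedding table is just a trainable weight matrix; looking up a word index returns its row.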

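The sequence-models entry above contrasts feed-forward networks with networks that maintain state. That difference can be seen with an LSTM, which carries a hidden state across time steps (an illustrative sketch with made-up sizes, not tutorial code):

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=3, hidden_size=4)
seq = torch.randn(5, 1, 3)   # 5 time steps, batch of 1, 3 features

out, (h, c) = lstm(seq)      # state (h, c) is threaded through time
print(out.shape)             # torch.Size([5, 1, 4]): one output per step
print(h.shape)               # torch.Size([1, 1, 4]): final hidden state
```

Unlike a feed-forward network, the output at each step depends on the hidden state accumulated from all earlier steps.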