remove image example and add text input example
MoustafaAMahmoud committed Jul 14, 2019
1 parent 093aaf2 commit 66276f0
Showing 6 changed files with 33 additions and 8 deletions.
Binary file added Figures/Ch_2_Background/char-lstm.jpg
2 changes: 1 addition & 1 deletion chapters/ch_literature.tex
@@ -1,5 +1,5 @@
\chapter{Literature Review}\label{Ch:Literature}
Poetry meter classification and detection have not been addressed as a learning problem, nor in a way similar to ours. In the literature, the problem is treated mainly as deterministic: existing approaches are restricted by static conditions and {\color{ref}do not apply a machine learning approach to the problem. A rule-based approach can work for problems governed by static rules; however, this problem requires a learning approach, so a rule-based model is not the best option in our case. }
Poetry meter classification and detection have not been addressed as a learning problem, nor in a way similar to ours. In the literature, the problem is treated mainly as deterministic: existing approaches are restricted by static conditions and {\color{red}do not apply a machine learning approach to the problem. A rule-based approach can work for problems governed by static rules; however, this problem requires a learning approach, so a rule-based model is not the best option in our case. }



25 changes: 18 additions & 7 deletions chapters/sec_background_deep_learning.tex
@@ -16,15 +16,26 @@ \section{Deep Learning Background}\label{Sec:Deep_Learning_Background}

However, for many tasks it is difficult to identify which features should be extracted. Suppose, for example, that we need to detect cars in photographs. We know every car has wheels, so we could use the presence of a wheel as a feature for car detection. However, describing a wheel in terms of pixel values is a difficult task: the image may be unclear or complicated by shadows, sun glare off the metal parts of the wheel, blurring, and so on~\cite{Goodfellow-et-al-2016}. One solution to this problem is to use machine learning itself to discover not only the output of the model but also the features that serve as its input. This approach is known as representation learning. Learned representations can achieve better results than hand-designed representations. This approach also allows AI systems to adapt rapidly to new tasks and to identify features automatically from new data. Representation learning can discover many features quickly, or may take more time for complex tasks, but it will at least provide a good set of features that can be adapted to any complex problem without manual feature engineering. In this research, we use representation learning to identify the features for our model, enabling it to achieve breakthrough results compared with traditional machine learning based on manual features.

%If we go back to the image example, we can show that it is not an easy task to extract features to detect the car from an image. So, Deep Learning is trying to solve this problem in feature engineering by introducing representation learning that build complex representations in terms of another simpler layer of representations. Figure~\ref{Fig:Deep_Learning_Image_Person_Example} shows how deep learning represents an image of a person by combining simpler representation, e.g. the edges and contours which led to understanding complex representations. The benefit from allowing the computer to understand the data and building the representation is the ability to build and understand very complex representation and also to utilize and combine features from simpler to deep representations with many ways such as recurrence or sequences.
%\begin{figure}[!ht] \includegraphics[width=\linewidth]{./Figures/Ch_2_Background/DeepLearningImagePersonExample.png}
% \caption{Illustrations on how Deep Learning can work based on images figure presented~\cite{Goodfellow-et-al-2016}~\cite{Zeiler2014}.}
% \label{Fig:Deep_Learning_Image_Person_Example}
%\end{figure}

{\color{red}
Deep Learning attempts to solve many problems, especially those related to text and NLP, by introducing representation learning, which builds complex representations out of simpler layers of representations.

One example of a text problem in deep learning is classifying text at the character level. Figure~\ref{Fig:Deep_Learning_Char_Level_Example} shows how deep learning can represent a sentence as a sequence by combining simpler representations, e.g. learning character-sequence patterns that lead to an understanding of complex representations.

Another example is using deep learning for text generation. Many research works use deep learning to generate text after learning patterns from a corpus; for example, a model trained on Shakespeare’s poems can then generate similar text. Deep learning can also be used to classify text, which is close to the idea of our research work. The main point is how to use these techniques to obtain the best results without hand-crafted feature engineering.
The benefit of allowing the computer to understand the data and build the representation is the ability to model very complex representations, and to utilize and combine features from simple to deep representations in many ways, such as recurrence over sequences.

\begin{figure}[!ht] \includegraphics[width=.7\linewidth]{./Figures/Ch_2_Background/char-lstm.jpg}
\caption{Illustration of how deep learning can work at the character level; figure from~\cite{stathis}.}
\label{Fig:Deep_Learning_Char_Level_Example}
\end{figure}
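As a small illustrative sketch (not taken from the thesis code; the vocabulary and sentence are assumptions for illustration), character-level input can be prepared by mapping each character to an index and then to a one-hot vector, which is the input form a char-level model such as an LSTM consumes:

```python
# Minimal sketch: encode a sentence as character-level one-hot vectors,
# the input format a char-level model (e.g. an LSTM) would consume.
# The sentence and vocabulary here are illustrative assumptions.

def build_vocab(text):
    """Map each distinct character to an integer index."""
    return {ch: i for i, ch in enumerate(sorted(set(text)))}

def one_hot_encode(text, vocab):
    """Return one one-hot vector (of vocabulary size) per character."""
    size = len(vocab)
    vectors = []
    for ch in text:
        vec = [0] * size
        vec[vocab[ch]] = 1
        vectors.append(vec)
    return vectors

sentence = "deep learning"
vocab = build_vocab(sentence)
encoded = one_hot_encode(sentence, vocab)
```

A char-level network then learns patterns over this sequence of vectors instead of over hand-crafted features.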



}

@@ -40,7 +51,7 @@ \subsection{Logistic Regression}
A simple example of logistic regression is an algorithm for fraud detection: it takes some raw input data and predicts whether it represents a fraud case, labelling fraud as one and non-fraud as zero. David Cox developed logistic regression in 1958~\cite{Cox2958}. The name “logistic” comes from its core logistic function, also called the \textit{sigmoid function}, given in Equation~\eqref{eq:logistic_function}. The logistic function has an S-shape.

One useful property of this function is that it takes any real number as input and converts it into a value between 0 and 1.%
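As a quick illustrative sketch (not part of the thesis code; the function name and sample inputs are assumptions), this squashing property can be checked directly:

```python
import math

def sigmoid(x):
    """Logistic (sigmoid) function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Outputs stay strictly between 0 and 1, with sigmoid(0) = 0.5
# at the centre of the S-shape.
values = [sigmoid(x) for x in (-10.0, -1.0, 0.0, 1.0, 10.0)]
```

Large negative inputs approach 0, large positive inputs approach 1, which is what makes the function usable as a probability for binary decisions such as fraud vs. non-fraud.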
\begin{figure}
\begin{figure}[!ht]
\centering
\input{./Figures/Ch_2_Background/fig_logistic.tex}
\caption{Logistic Regression Function (S-Shape)}\label{Fig:Logistic}
7 changes: 7 additions & 0 deletions master.bib
@@ -94,6 +94,13 @@ @misc{colah
year = 2015,
url = {http://colah.github.io/posts/2015-08-Understanding-LSTMs/}
}
@misc{stathis,
author = {Stathis},
title = {{How to read: Character level deep learning}},
year = 2017,
url = {https://offbit.github.io/how-to-read/}
}

@INPROCEEDINGS{Mikolov_et_al,
author = {Tomáš Mikolov and Martin Karafiát and Lukáš Burget and Jan Černocký and Sanjeev Khudanpur},
title = {Recurrent neural network based language model},
Binary file modified master.pdf
7 changes: 7 additions & 0 deletions references.tex
@@ -92,6 +92,13 @@
year = 2015,
url = {http://colah.github.io/posts/2015-08-Understanding-LSTMs/}
}
@misc{stathis,
author = {Stathis},
title = {{How to read: Character level deep learning}},
year = 2017,
url = {https://offbit.github.io/how-to-read/}
}

@INPROCEEDINGS{Mikolov_et_al,
author = {Tomáš Mikolov and Martin Karafiát and Lukáš Burget and Jan Černocký and Sanjeev Khudanpur},
title = {Recurrent neural network based language model},
