Applied Data Science: LSTM Networks, RNNs, and Long-Term Dependencies

Added on 2022/08/21
Running head: APPLIED DATA SCIENCE
APPLIED DATA SCIENCE
Name of the Student
Name of the University
Author Note
Discussions
The main aim of the blog post is educational: it helps the reader understand LSTM networks and, more broadly, Recurrent Neural Networks (RNNs).

This post was chosen because LSTM networks, which build on recurrent neural networks, are important (Liu et al. 2017). A recurrent neural network is a class of artificial neural networks in which the connections between nodes form a directed graph along a temporal sequence. This permits the network to exhibit dynamic temporal behaviour.
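The recurrence described above can be sketched as a single vanilla RNN step. This is a minimal NumPy sketch, not a production implementation; all sizes, weights, and inputs here are illustrative assumptions.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN step: the new hidden state mixes the current
    input with the previous hidden state through a tanh nonlinearity."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Hypothetical sizes and random weights, for illustration only.
rng = np.random.default_rng(0)
n_in, n_hid, T = 3, 5, 4
W_xh = rng.normal(scale=0.1, size=(n_in, n_hid))
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))
b_h = np.zeros(n_hid)

h = np.zeros(n_hid)
for t in range(T):  # unroll along the temporal sequence
    x_t = rng.normal(size=n_in)
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h.shape)  # (5,)
```

Because the same weights are reused at every step, the hidden state carries information forward along the sequence, which is the "dynamic temporal behaviour" referred to above.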
The data science problem addressed is that of long-term dependencies. One strength of recurrent neural networks is that they can connect previous information, such as earlier frames of a video, to the present task (Kuchaiev and Ginsburg 2017). In theory, RNNs are able to handle long-term dependencies, but in practice they are difficult to train and are sensitive to the choice of parameters, a problem from which LSTMs are comparatively free.
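Why long-term dependencies are hard for a plain RNN can be illustrated by propagating a gradient backwards through many time steps: repeated multiplication by the recurrent weights and the tanh derivative tends to shrink the signal toward zero (the vanishing-gradient effect). This NumPy sketch uses assumed, illustrative weights and stand-in hidden states.

```python
import numpy as np

rng = np.random.default_rng(1)
n_hid = 5
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # illustrative recurrent weights

# Propagate a gradient signal backwards through 50 time steps.
grad = np.ones(n_hid)
norms = []
for t in range(50):
    h = np.tanh(rng.normal(size=n_hid))      # stand-in hidden state at step t
    grad = (W_hh.T @ grad) * (1.0 - h ** 2)  # chain rule through the tanh recurrence
    norms.append(np.linalg.norm(grad))

print(norms[0], norms[-1])  # the gradient norm shrinks dramatically
```

After 50 steps the gradient norm is vanishingly small, so information from early inputs contributes almost nothing to learning; the LSTM's additive cell state is designed to avoid exactly this.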
Long Short-Term Memory networks are an important kind of RNN with the ability to learn long-term dependencies. These networks work well on a wide range of problems and are used extensively. LSTMs are explicitly designed to avoid the long-term dependency problem. All recurrent neural networks have the form of a chain of repeating modules of neural network (Vorontsov et al. 2017). LSTMs also have this chain-like structure, but their repeating module is built differently: instead of a single neural network layer, there are four layers that interact in a particular way.
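A minimal sketch of one LSTM repeating module makes the four interacting layers concrete: the forget, input, and output gates plus the candidate layer. Sizes and weights below are illustrative assumptions, not a reference implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. W stacks the four interacting layers:
    forget gate f, input gate i, output gate o, candidate layer g."""
    z = np.concatenate([x_t, h_prev]) @ W + b  # one affine map, split four ways
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    g = np.tanh(g)
    c = f * c_prev + i * g       # cell state: forget old memory, admit new
    h = o * np.tanh(c)           # hidden state: gated read-out of the cell
    return h, c

# Hypothetical sizes and random weights, for illustration only.
rng = np.random.default_rng(2)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(n_in + n_hid, 4 * n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for t in range(6):
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
print(h.shape, c.shape)  # (4,) (4,)
```

The key design choice is the additive update of the cell state `c`: because memory is carried forward by gated addition rather than repeated matrix multiplication, gradients can flow across many time steps without vanishing.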
An advantage of LSTMs over RNNs is that they are well suited to classifying processes and forecasting time series when the durations and time lags between important events are unknown. Their relative insensitivity to such gaps gives LSTMs an advantage over RNNs. LSTMs also offer more controlling knobs: gates that regulate how inputs are mixed and how information flows, according to the trained weights (Kratzert et al. 2018). LSTMs therefore provide better control and better results, but they are more complicated and more costly to operate.
In conclusion, the LSTM is the more appropriate technique because it offers more advantages than the RNN in deep learning applications that predict and classify time-series data. The structure of an LSTM is broadly similar to that of an RNN, but it provides more advantages.
References
Kratzert, F., Klotz, D., Brenner, C., Schulz, K. and Herrnegger, M., 2018. Rainfall–runoff
modelling using long short-term memory (LSTM) networks. Hydrology and Earth System Sciences, 22(11),
pp.6005-6022.
Kuchaiev, O. and Ginsburg, B., 2017. Factorization tricks for LSTM networks. arXiv
preprint arXiv:1703.10722.
Liu, J., Wang, G., Hu, P., Duan, L.Y. and Kot, A.C., 2017. Global context-aware attention
LSTM networks for 3D action recognition. In Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition (pp. 1647-1656).
Vorontsov, E., Trabelsi, C., Kadoury, S. and Pal, C., 2017, August. On orthogonality and
learning recurrent networks with long term dependencies. In Proceedings of the 34th
International Conference on Machine Learning - Volume 70 (pp. 3570-3578). JMLR.org.