Abstract:
Many learning tasks require temporal and spatial analysis of data. Neural networks
are powerful learning algorithms that produce impressive results across a wide range of
supervised and unsupervised machine learning tasks. The LSTM (Long Short-Term Memory)
architecture is a second-order recurrent neural network architecture that stores
short-term memories of sequential input and retrieves them many time steps later for
long-term analysis. LSTM was developed as a neural network architecture for processing
long temporal sequences of data and predicting their outputs. The objective of this
research is to demonstrate significant results on the Distractor Sequence Recall task
by using LSTM nodes internally together with network-level control LSTM nodes. This
novel approach is inspired by the HTM (Hierarchical Temporal Memory) structure.
By decomposing the LSTM neural node, the network is trained on the Distractor Sequence
Recall task. Each trial consists of the presentation of a temporal sequence containing
randomly chosen target symbols and randomly chosen distractor symbols, all in random
order. At the end of the sequence come two prompt symbols, which direct the network to
produce as output the first and second target symbols of the sequence, regardless of
when they occurred.
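To make the task structure concrete, the trial described above can be sketched as follows. This is a minimal illustrative generator, not the paper's implementation: the symbol names, pool sizes, and prompt encoding are assumptions chosen for clarity.

```python
import random

# Hypothetical symbol pools; the actual alphabets used in the paper are not
# specified here, so these are illustrative placeholders.
TARGET_POOL = ("X", "Y")
DISTRACTOR_POOL = ("a", "b", "c", "d")

def make_trial(n_targets=2, n_distractors=8):
    """Build one Distractor Sequence Recall trial (illustrative sketch).

    Returns the input sequence (targets and distractors in random order,
    followed by two prompt symbols) and the expected output: the first and
    second target symbols in the order they occurred.
    """
    targets = [random.choice(TARGET_POOL) for _ in range(n_targets)]
    distractors = [random.choice(DISTRACTOR_POOL) for _ in range(n_distractors)]

    # Targets and distractors appear in a single randomly ordered sequence.
    sequence = targets + distractors
    random.shuffle(sequence)

    # Two prompts at the end direct the network to recall the targets.
    inputs = sequence + ["PROMPT", "PROMPT"]

    # Expected output: the first and second targets by order of occurrence,
    # regardless of when they appeared among the distractors.
    expected = [s for s in sequence if s in TARGET_POOL][:2]
    return inputs, expected
```

A network solving this task must therefore hold the two target symbols in memory across an arbitrary number of intervening distractor steps before the prompts trigger recall.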