Recurrent Neural Network

By Amit Shekhar

In this blog, we will learn about the Recurrent Neural Network.

What is a Recurrent Neural Network (RNN)?

As per Wikipedia, a recurrent neural network (RNN) is a class of artificial neural network where connections between units form a directed graph along a sequence. This allows it to exhibit dynamic temporal behavior for a time sequence. Unlike feedforward neural networks, RNNs can use their internal state (memory) to process sequences of inputs. This makes them applicable to tasks such as unsegmented, connected handwriting recognition or speech recognition.

A Recurrent Neural Network comes into the picture when a model needs context to be able to provide the output based on the input.

Sometimes the context is the single most important thing for the model to predict the most appropriate output.

Let's understand this with an analogy. Suppose you are watching a movie: at any point in time, you have the context because you have seen the movie up to that point, and only then are you able to relate everything correctly. It means that you remember everything that you have watched.

Similarly, the RNN remembers everything. In other neural networks, all the inputs are independent of each other. But in an RNN, all the inputs are related to each other. Let's say you have to predict the next word in a given sentence; in that case, the relation among all the previous words helps in predicting a better output. The RNN learns all these relations while training itself.

To achieve this, the RNN uses a network with a loop in it, which allows it to persist information.
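To make the loop concrete, here is a minimal sketch of a single recurrent step in NumPy. The sizes and random weights are illustrative assumptions, not trained parameters; the point is that the new hidden state is computed from the current input and the previous hidden state, which is exactly the loop that lets information persist.

```python
import numpy as np

# Illustrative sizes and untrained random weights (assumptions for this sketch).
input_size, hidden_size = 4, 3
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(hidden_size, input_size))   # input-to-hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden-to-hidden weights: the loop
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    # The new state depends on the current input and on the previous state.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = np.zeros(hidden_size)                    # start with no memory
h = rnn_step(rng.normal(size=input_size), h)
print(h)                                     # the state now carries information about the input
```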

[Image: RNN (rolled)]

Image Source: colah's blog

This loop structure allows the neural network to take a sequence of inputs. If you see the unrolled version, you will understand it better.

[Image: RNN (unrolled)]

Image Source: colah's blog

As you can see in the unrolled version, the network first takes x(0) from the sequence of inputs and outputs h(0), which together with x(1) is the input for the next step. Similarly, h(1) from that step, together with x(2), is the input for the step after that, and so on. This way, it keeps remembering the context while training.
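As a rough sketch of this unrolling (toy sizes, untrained random weights, and a made-up sequence x(0), x(1), x(2), all assumed just for illustration), notice that the same weights are applied at every step while the hidden state carries the context forward:

```python
import numpy as np

input_size, hidden_size, steps = 4, 3, 3
rng = np.random.default_rng(1)
W_xh = rng.normal(size=(hidden_size, input_size))   # input-to-hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden-to-hidden weights
b_h = np.zeros(hidden_size)

xs = [rng.normal(size=input_size) for _ in range(steps)]  # x(0), x(1), x(2)
h = np.zeros(hidden_size)                                 # empty context before x(0)

for t, x_t in enumerate(xs):
    # The same weights are reused at every step; only the state h changes.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    print(f"h({t}) =", h)                                 # h(0), h(1), h(2)
```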

This is how the RNN works. The RNN helps wherever we need context from the previous inputs.

The following are a few applications of the RNN:

  • Next word prediction (a small sketch follows this list)
  • Music composition
  • Image captioning
  • Speech recognition
  • Time series anomaly detection
  • Stock market prediction
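Here is a very rough sketch of next-word prediction with an RNN. The tiny vocabulary, the one-hot encoding, and the untrained random weights are all assumptions made for illustration; a real model would be trained on a large corpus.

```python
import numpy as np

vocab = ["i", "love", "deep", "learning"]          # hypothetical toy vocabulary
vocab_size, hidden_size = len(vocab), 8
rng = np.random.default_rng(2)
W_xh = rng.normal(size=(hidden_size, vocab_size))  # input-to-hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size)) # hidden-to-hidden weights
W_hy = rng.normal(size=(vocab_size, hidden_size))  # hidden-to-output weights

def one_hot(index):
    v = np.zeros(vocab_size)
    v[index] = 1.0
    return v

h = np.zeros(hidden_size)
for word in ["i", "love", "deep"]:                 # read the context word by word
    h = np.tanh(W_xh @ one_hot(vocab.index(word)) + W_hh @ h)

logits = W_hy @ h                                  # a score for every word in the vocabulary
probs = np.exp(logits) / np.exp(logits).sum()      # softmax over possible next words
print(vocab[int(np.argmax(probs))])                # predicted next word (meaningless here, since untrained)
```

In practice, the weights would be learned with backpropagation through time so that the predicted distribution matches the actual next word in the training data.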

Nowadays, the RNN has become very popular as it helps in solving many real-life problems that industries are facing.

That's it for now.

Thanks

Amit Shekhar
