Rajdeep Borgohain

Pinned

Overview of Encoder-Decoder | From the Basics NLP Part 1/4

PART 2: What is “Attention”? The encoder-decoder is a technique for representing text sequences. It works the way humans do: as we read through a sentence, we accumulate information until, by the end, we hold its full meaning. Table of Contents 1. Use Cases of Sequence-to-Sequence Models 2. Sequence-to-Sequence Basic Intuition 1. Use Cases of Sequence-to-Sequence Models a) Machine Translation:
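The sequential accumulation the excerpt describes can be sketched as a toy recurrent encoder. This is an illustrative NumPy sketch, not the article's code; the weight shapes and the tanh update rule are assumptions chosen for simplicity:

```python
import numpy as np

def encode(embeddings, W_h, W_x, b):
    """Toy RNN encoder: fold a sequence of word embeddings
    into a single context vector, one step at a time."""
    h = np.zeros(W_h.shape[0])   # hidden state starts empty
    for x in embeddings:         # read the sentence left to right
        # mix the previous state with the current word's embedding
        h = np.tanh(W_h @ h + W_x @ x + b)
    return h                     # final state = the context vector

# Tiny example: 4 "words", embedding dim 3, hidden dim 5
rng = np.random.default_rng(0)
sentence = rng.normal(size=(4, 3))
W_h = rng.normal(size=(5, 5))
W_x = rng.normal(size=(5, 3))
b = np.zeros(5)
context = encode(sentence, W_h, W_x, b)
print(context.shape)  # (5,) — the whole sentence squeezed into one vector
```

Whatever the sentence length, everything the decoder will see is this one fixed-size vector, which is exactly the limitation the next part of the series addresses.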

Deep Learning

3 min read

Overview of Encoder-Decoder | From the Basics NLP Part 1/4

Feb 24

What is “Attention”? | From the Basics NLP Part 2/4

The problem with the encoder-decoder model arises when we work with lengthy sentences or long paragraphs: the context vector fails to capture the meaning of every input word, and the translation loses the essence of the source sentence. Table of Contents How humans translate language Basic Intuition…
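The bottleneck described here is one fixed context vector for the whole sentence; attention relaxes it by letting the decoder re-weight all encoder states at every step. A minimal dot-product-attention sketch follows (the function name, dimensions, and random inputs are illustrative assumptions, not the article's code):

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """Dot-product attention: score every encoder state against the
    current decoder state, softmax the scores into weights, and
    return a context vector that is a weighted sum of ALL states."""
    scores = encoder_states @ decoder_state   # one score per source word
    weights = np.exp(scores - scores.max())   # stable softmax
    weights /= weights.sum()                  # weights now sum to 1
    context = weights @ encoder_states        # weighted sum of states
    return context, weights

# 6 source words, hidden dim 4
rng = np.random.default_rng(1)
enc = rng.normal(size=(6, 4))
dec = rng.normal(size=4)
context, weights = attention(dec, enc)
print(weights.shape)  # (6,) — one weight per source word
```

Because the context vector is rebuilt at every decoding step, long inputs no longer have to be compressed through a single fixed vector.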

Deep Learning

4 min read

What is “Attention”? | From the Basics NLP Part 2/4
Rajdeep Borgohain

MLE @Cubyts | Ex-Research Engineer, IIT Guwahati
