PART 2: What is "Attention"?

The encoder-decoder is an architecture used to represent text sequences. It works much like a human reader: as we move through a sentence, we accumulate information, so that by the end of the sentence we hold a summary of everything read so far.

Table of Contents
1. Use Cases of Sequence to Sequence Models
2. Sequence to Sequence Basic Intuition

1. Use Cases of Sequence to Sequence Models

a) Machine Translation: