<aside> 💡 References

</aside>

<aside> 💡 Summary

</aside>

Background

seq2seq_5.gif

Source: https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
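The animation above shows the core limitation that motivates attention: a vanilla seq2seq encoder compresses the entire source sentence into a single fixed-size context vector (its last hidden state), no matter how long the input is. A minimal NumPy sketch of that bottleneck, with toy dimensions and illustrative weights (all names and sizes here are assumptions for demonstration, not the figure's actual model):

```python
import numpy as np

def encode(inputs, W_x, W_h):
    """Simple RNN encoder: only the LAST hidden state (the fixed-size
    context vector) is handed to the decoder in vanilla seq2seq."""
    h = np.zeros(W_h.shape[0])
    for x in inputs:
        h = np.tanh(W_x @ x + W_h @ h)  # one recurrent step
    return h  # single context vector, regardless of input length

# Toy dimensions: 2-dim inputs, 3-dim hidden state (illustrative values).
W_x = np.full((3, 2), 0.1)
W_h = np.full((3, 3), 0.1)
short = [np.ones(2)] * 2   # a 2-token "sentence"
long_ = [np.ones(2)] * 10  # a 10-token "sentence"

# Both sequences are squeezed into the same 3-dim vector -> the bottleneck.
print(encode(short, W_x, W_h).shape)  # (3,)
print(encode(long_, W_x, W_h).shape)  # (3,)
```

Attention removes this bottleneck by letting the decoder look back at *all* encoder hidden states instead of only the last one.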

IDEA

Source: https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/

Inference Flow

attention_process.gif
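The attention process in the animation can be sketched per decoder time step: score every encoder hidden state against the current decoder state, softmax the scores into weights, and take the weighted sum as the context vector. A minimal NumPy sketch, assuming dot-product (Luong-style) scoring; the toy values below are illustrative, not from the figure:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def attention_step(dec_h, enc_hs):
    """One decoder time step of dot-product attention:
    1) score each encoder hidden state against the decoder state,
    2) softmax the scores into attention weights,
    3) take the weighted sum of encoder states as the context vector."""
    scores = enc_hs @ dec_h    # (T,) one score per source position
    weights = softmax(scores)  # attention distribution over source positions
    context = weights @ enc_hs # (H,) weighted sum of encoder states
    return context, weights

# Toy example: 3 encoder states, hidden size 4.
enc_hs = np.array([[1., 0., 0., 0.],
                   [0., 1., 0., 0.],
                   [0., 0., 1., 0.]])
dec_h = np.array([0., 5., 0., 0.])  # decoder state "pointing at" position 1
context, weights = attention_step(dec_h, enc_hs)
print(weights.argmax())  # 1 -> the decoder attends to the second source token
```

The weights form a proper probability distribution over source positions, which is what the heatmap-style visualizations in the linked post are plotting.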

Encoder & Decoder

Encoder structure

Decoder structure (Source: https://bigdaheta.tistory.com/67)
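To connect the two diagrams: once the decoder has a context vector from attention, it combines it with its own hidden state to produce the next-token distribution. A minimal sketch of that output step, assuming a Luong-style "attentional hidden state" (concatenate, tanh, then project to vocabulary logits); all weights and sizes below are toy assumptions:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def decoder_output(dec_h, context, W_c, W_out):
    """Post-attention decoder output: concatenate the hidden state with the
    context vector, mix through a tanh layer, then project to a probability
    distribution over the target vocabulary."""
    attn_h = np.tanh(W_c @ np.concatenate([context, dec_h]))
    return softmax(W_out @ attn_h)

H, V = 4, 6  # hidden size and toy vocabulary size (assumed)
rng = np.random.default_rng(0)
W_c = rng.standard_normal((H, 2 * H)) * 0.1
W_out = rng.standard_normal((V, H)) * 0.1
probs = decoder_output(rng.standard_normal(H), rng.standard_normal(H), W_c, W_out)
print(probs.shape, round(probs.sum(), 6))  # (6,) 1.0
```

At inference the decoder repeats this step token by token, feeding each predicted token back in until an end-of-sequence symbol is produced.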