Neural machine translation (NMT) implemented with three approaches: a simple encoder-decoder architecture, an encoder-decoder with an attention mechanism, and pretrained Hugging Face models.
- Encoder_Decoder_with_Attention : https://github.com/ritikdhame/Language_Translation_Multiplemodel/tree/main/Encoder_Decoder_with_Attention
- Language_Translation_huggingface_models: https://github.com/ritikdhame/Language_Translation_Multiplemodel/tree/main/Language_Translation_huggingface_models
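For intuition on the attention mechanism used in the second approach, here is a minimal NumPy sketch of dot-product attention over encoder states. This is an illustrative toy, not code from the notebooks above; the shapes and variable names are assumptions.

```python
import numpy as np

def dot_product_attention(query, keys, values):
    """Compute attention weights and a context vector for one decoder step."""
    # Score each encoder state against the decoder query (dot-product score).
    scores = keys @ query                      # shape: (src_len,)
    # Softmax turns scores into weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of encoder states.
    context = weights @ values                 # shape: (hidden_dim,)
    return weights, context

rng = np.random.default_rng(0)
query = rng.standard_normal(8)       # decoder hidden state
keys = rng.standard_normal((5, 8))   # 5 encoder hidden states
values = keys                        # values often equal the keys in RNN attention
weights, context = dot_product_attention(query, keys, values)
```

At each decoding step the context vector is concatenated with the decoder state before predicting the next token, which is what lets the model "look back" at relevant source words.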
Please leave a star if you found this helpful.