Bidirectional LSTM Networks for Poetry Generation in Hindi

Ankit Kumar

Publication Date: 2021/09/09

Abstract: This paper proposes a self-attention-enhanced recurrent neural network for poetry generation in the Hindi language. The proposed framework combines Long Short-Term Memory (LSTM) networks with a multi-head self-attention mechanism. The multi-head self-attention component improves feature selection and thereby preserves dependencies over longer sequences in the recurrent architecture. A Hindi poetry dataset is used to train the network to generate poems from a given set of seed words. The two LSTM models proposed in the paper are able to generate meaningful poems.
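The abstract describes an LSTM-based language model augmented with multi-head self-attention. The following is a minimal sketch of that kind of architecture, not the authors' actual implementation: the vocabulary size, embedding and hidden dimensions, number of attention heads, and the class name AttentiveLSTMPoet are all illustrative assumptions.

```python
# Sketch of a bidirectional LSTM language model with multi-head self-attention
# over the recurrent states, in the spirit of the architecture described in the
# abstract. All hyperparameters are assumed placeholder values.
import torch
import torch.nn as nn


class AttentiveLSTMPoet(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=256, hidden_dim=256, num_heads=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Multi-head self-attention over the LSTM outputs helps preserve
        # dependencies across longer spans of the poem.
        self.attention = nn.MultiheadAttention(2 * hidden_dim, num_heads, batch_first=True)
        self.proj = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, token_ids):
        x = self.embedding(token_ids)               # (batch, seq, embed_dim)
        states, _ = self.lstm(x)                    # (batch, seq, 2 * hidden_dim)
        # Self-attention: queries, keys, and values are all the LSTM states.
        attended, _ = self.attention(states, states, states)
        return self.proj(attended)                  # (batch, seq, vocab_size)


if __name__ == "__main__":
    model = AttentiveLSTMPoet()
    seed_ids = torch.randint(0, 5000, (2, 16))      # two sequences of 16 token ids
    logits = model(seed_ids)
    print(logits.shape)                             # torch.Size([2, 16, 5000])
```

In a generation loop, the logits for the last position would be sampled to pick the next token, which is appended to the seed sequence and fed back into the model until the poem reaches the desired length.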

Keywords: Hindi Poetry Generation, Text Generation, Poetry Generation, Long Short-Term Memory.


PDF: https://ijirst.demo4.arinfotech.co/assets/upload/files/IJISRT21AUG763.pdf
