How should we pad text sequences in Keras using pad_sequences?
Question: I have coded a sequence-to-sequence learning LSTM in Keras myself, using knowledge gained from web tutorials and my own intuition. I converted my sample text to sequences and then padded it using the pad_sequences function in Keras.

    from keras.preprocessing.text import Tokenizer, base_filter
    from keras.preprocessing.sequence import pad_sequences

    def shift(seq, n):
        n = n % len(seq)
        return seq[n:] + seq[:n]

    txt = "abcdefghijklmn" * 100

    tk = Tokenizer(nb_words=2000, filters=base_filter(), lower=True, split=" ")
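For context, here is a minimal sketch of what pad_sequences does with its maxlen, padding, and truncating arguments; the toy integer sequences below are my own illustration, not taken from the code above, but stand in for the output of Tokenizer.texts_to_sequences:

    from keras.preprocessing.sequence import pad_sequences

    # Toy integer sequences of unequal length, as texts_to_sequences
    # would produce for variable-length texts.
    seqs = [[1, 2, 3], [4, 5], [6]]

    # Default behavior: pad on the left ('pre') with zeros
    # up to the length of the longest sequence.
    print(pad_sequences(seqs))
    # [[1 2 3]
    #  [0 4 5]
    #  [0 0 6]]

    # Pad on the right instead, and force a fixed length of 4.
    print(pad_sequences(seqs, maxlen=4, padding='post'))
    # [[1 2 3 0]
    #  [4 5 0 0]
    #  [6 0 0 0]]

    # Sequences longer than maxlen are cut; truncating='pre'
    # drops elements from the start of the sequence.
    print(pad_sequences(seqs, maxlen=2, truncating='pre'))
    # [[2 3]
    #  [4 5]
    #  [0 6]]

The zero is reserved as the padding value, which is why Tokenizer starts its word indices at 1; masking layers and embeddings with mask_zero=True rely on this convention.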