I want to read a text file of about 10.4 GB in Python and then tokenize it into sentences, like this:
import nltk

with open("all_docs.txt", "r", encoding="utf-8") as f:
    sentences = nltk.sent_tokenize(f.read())
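Reading the whole file into memory like that will likely be a problem at 10.4 GB. A streaming alternative is sketched below: it reads the file in fixed-size chunks and splits on a naive punctuation regex, holding back the trailing fragment in case a sentence spans a chunk boundary. The `iter_sentences` helper and the regex splitter are illustrative stand-ins, not a real tokenizer such as `nltk.sent_tokenize`.

```python
import re

def iter_sentences(path, chunk_size=1 << 20):
    """Yield sentences from a large file without loading it all into memory.

    Splits on ., ! or ? followed by whitespace -- a deliberately simplistic
    rule used here only to keep the sketch self-contained.
    """
    buffer = ""
    with open(path, "r", encoding="utf-8") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            buffer += chunk
            parts = re.split(r"(?<=[.!?])\s+", buffer)
            # The last fragment may be an incomplete sentence; keep it
            # in the buffer and prepend the next chunk to it.
            buffer = parts.pop()
            for sentence in parts:
                yield sentence
    if buffer.strip():
        yield buffer
```

Because it yields sentences lazily, you can process them one at a time (`for s in iter_sentences("all_docs.txt"): ...`) with memory usage bounded by the chunk size rather than the file size.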