I'm creating large files with my Python script (more than 1GB; actually there's 8 of them). Right after I create them I have to create a process that will use those files.
f.close() calls f.flush(), which sends the data to the OS. That doesn't necessarily write the data to disk, because the OS buffers it. As you rightly worked out, if you want to force the OS to write it to disk, you need to call os.fsync().
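For example, a minimal sketch of that sequence (the file name and the data written are placeholders, not taken from your script):

import os

with open("big_output.dat", "wb") as f:
    f.write(b"x" * (1024 * 1024))   # stand-in for the data your script generates
    f.flush()                       # move Python's internal buffer to the OS
    os.fsync(f.fileno())            # force the OS to write its buffers to disk
# leaving the with-block closes the file, so a process started after this
# point sees the complete contents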
Have you considered just piping the data directly into use_file?
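If use_file can read from standard input, something like this would avoid the intermediate files entirely (a sketch only; "cat" stands in for your consumer so the example runs as-is):

import subprocess

proc = subprocess.Popen(["cat"], stdin=subprocess.PIPE, stdout=subprocess.DEVNULL)
for _ in range(16):
    proc.stdin.write(b"x" * (64 * 1024))   # stand-in for the real data
proc.stdin.close()   # signal end-of-input to the consumer
proc.wait()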
EDIT: you say that os.fsync() 'doesn't work'. To clarify, if you do
import os

f = open(...)            # open the file as your script does
# write data to f
f.flush()                # flush Python's buffer to the OS
os.fsync(f.fileno())     # force the OS to write the data to disk
f.close()
import pdb; pdb.set_trace()   # pause here so you can inspect the file on disk
and then look at the file on disk: does it have data?