I am using the function os.path.getsize()
which gives the size of the file in bytes.
But one of my files is about 10 GB, and for that file it gives me a negative size (in bytes).
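For reference, this is roughly what I am doing (the path is just a placeholder for the 10 GB file):

    import os

    path = '/data/large_file.bin'   # placeholder; the real file is about 10 GB

    size = os.path.getsize(path)    # expected: the size in bytes
    print(size)                     # prints a negative number for the 10 GB file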
Your Linux kernel obviously has large file support, since ls -l
works correctly. Thus, it's your Python installation that is lacking the support. (Are you using your distribution's Python package? What distribution is it?)
The documentation on POSIX large file support in Python states that Python should typically make use of large file support if it is available on Linux. It also suggests configuring Python with the following command line:
CFLAGS='-D_LARGEFILE64_SOURCE -D_FILE_OFFSET_BITS=64' OPT="-g -O2 $CFLAGS" \
./configure
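After rebuilding, one way to double-check is to inspect the flags the interpreter was compiled with. This is only a sketch and assumes a Unix build where the Makefile variables are recorded (on Python older than 2.7, distutils.sysconfig provides the same function):

    # Sketch: look at the compiler flags recorded for this interpreter.
    import sysconfig

    print(sysconfig.get_config_var('CFLAGS'))
    # If the rebuild picked up the flags above, -D_LARGEFILE64_SOURCE and
    # -D_FILE_OFFSET_BITS=64 should appear in the output.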
And finally, quoting the man page of the stat system call:
    This can occur when an application compiled on a 32-bit platform without -D_FILE_OFFSET_BITS=64 calls stat() on a file whose size exceeds (1<<31)-1 bits.
(I believe the last word should be "bytes".)
Looks like an overflow of the 32-bit int used for the size, which is limited to 4 GB. This may be a bug (or even a missing compilation flag) in your particular build of Python. I just tried it on a 32-bit Linux box with Python 2.4 and 2.6; both give correct results on files bigger than 4 GB.
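To illustrate the wraparound: if a 10 GiB size is squeezed into a signed 32-bit field, only the low 32 bits survive, and reinterpreting them as signed gives a negative number. This is just a sketch of the arithmetic, not Python's actual code path:

    # Rough illustration of the 32-bit overflow, not Python's actual code path.
    size = 10 * 1024**3          # 10 GiB
    low32 = size & 0xFFFFFFFF    # keep only the low 32 bits
    if low32 >= 1 << 31:         # reinterpret as a signed 32-bit integer
        low32 -= 1 << 32
    print(low32)                 # negative, like the value getsize() reported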
Try upgrading your Python; the fix is probably a minor version away.