large-files

Is O_LARGEFILE needed just to write a large file?

Submitted by ♀尐吖头ヾ on 2019-12-17 16:29:03
Question: Is the O_LARGEFILE flag needed if all I want to do is write a large file (O_WRONLY) or append to a large file (O_APPEND | O_WRONLY)? From a thread titled "Cannot write >2gb index file" on the CLucene-dev mailing list, it appears that O_LARGEFILE might be needed to write large files, but the participants in that discussion were using O_RDWR, not O_WRONLY, so I am not sure.
Answer 1: O_LARGEFILE should never be used directly by applications. It's to be used internally by the 64-bit…
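The answer is cut off, but the standard alternative is to define _FILE_OFFSET_BITS=64 before any includes so that open() and friends use 64-bit offsets transparently, rather than passing O_LARGEFILE by hand. A minimal sketch, assuming glibc/Linux; the file name and offset are illustrative:

```c
/* Sketch: write past the 2 GiB mark without touching O_LARGEFILE.
 * The define must precede every include in the translation unit. */
#define _FILE_OFFSET_BITS 64
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    int fd = open("big.out", O_WRONLY | O_CREAT, 0644); /* or O_APPEND | O_WRONLY */
    if (fd < 0) { perror("open"); return 1; }

    /* off_t is 64 bits wide here, so seeking beyond 2 GiB works. */
    if (lseek(fd, 3LL * 1024 * 1024 * 1024, SEEK_SET) == (off_t)-1) {
        perror("lseek");
        return 1;
    }
    if (write(fd, "x", 1) != 1) { perror("write"); return 1; }
    close(fd);
    return 0;
}
```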

Extracting data between two tags in HTML file

Submitted by 假装没事ソ on 2019-12-17 14:21:21
Question: I've got a huge HTML file saved on my system, which contains data from a product catalogue. The data is structured so that for each product record the name sits between the tags <name> and </name>. Each product has up to 3 attributes: name, productID, and color, but not all products will have all of these attributes. How would I go about extracting this data for each product without mixing up the product attributes? The file is also 50 megabytes! Code example: <name>'hat'</name> blah…
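One streaming approach, sketched in C under the assumption, suggested by the excerpt's example, that a <name>…</name> pair never spans lines: read line by line so the 50 MB file is never held in memory at once. The file name is illustrative:

```c
/* Sketch: stream a large file line by line and print <name>…</name> text.
 * Assumes a tag pair never spans lines and handles the first pair per
 * line; a real HTML/XML parser is the safer choice. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *fp = fopen("catalogue.html", "r");
    if (!fp) { perror("fopen"); return 1; }

    char line[8192];
    while (fgets(line, sizeof line, fp)) {
        char *start = strstr(line, "<name>");
        if (!start) continue;
        start += strlen("<name>");
        char *end = strstr(start, "</name>");
        if (!end) continue;            /* pair spans a line: skipped here */
        *end = '\0';
        printf("name: %s\n", start);
    }
    fclose(fp);
    return 0;
}
```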

Very large uploads with PHP

Submitted by 丶灬走出姿态 on 2019-12-17 06:27:16
Question: I want to allow uploads of very large files into our PHP application (hundreds of megabytes, up to 8 GB). There are a couple of problems with this, however.
Browser: HTML uploads give poor feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all, and the Flash uploader puts the entire file into memory before starting the upload.
Server: PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd like to not set this setting globally. The…
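On the "not globally" point: post_max_size and upload_max_filesize are per-directory settings, so under Apache's mod_php they can be raised only for the directory that handles uploads, for example via .htaccess. A sketch with illustrative values; note that 32-bit PHP builds may not honor sizes of 2 GB or more:

```apache
# .htaccess in the upload endpoint's directory only (mod_php)
php_value upload_max_filesize 8G
php_value post_max_size       8G
php_value max_input_time      7200
```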

Java: Read last n lines of a HUGE file

Submitted by 穿精又带淫゛_ on 2019-12-17 04:28:50
Question: I want to read the last n lines of a very big file without reading the whole file into any buffer/memory area, using Java. I looked around the JDK APIs and Apache Commons I/O and was not able to locate anything suitable for this purpose. I was thinking of the way tail or less does it in UNIX: I don't think they load the entire file and then show the last few lines. There should be a similar way to do the same in Java.
Answer 1: If you use a RandomAccessFile, you can use length and…
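The answer is cut off at RandomAccessFile's length and seek, but the technique is language-neutral: take the file length, walk backwards counting newlines, then stream forward. A minimal sketch in C, byte-at-a-time for clarity (chunked reads would be faster), assuming a single-byte encoding:

```c
/* Sketch: print the last n lines of a large file without reading it all.
 * Seek backwards from the end counting newlines, then read forward. */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>

int tail_lines(const char *path, int n)
{
    FILE *fp = fopen(path, "rb");
    if (!fp) return -1;

    fseeko(fp, 0, SEEK_END);
    off_t pos = ftello(fp);
    if (pos > 0) {                     /* ignore a trailing newline, like tail(1) */
        fseeko(fp, pos - 1, SEEK_SET);
        if (fgetc(fp) == '\n') pos--;
    }

    int newlines = 0;
    while (pos > 0 && newlines < n) {  /* walk backwards one byte at a time */
        fseeko(fp, --pos, SEEK_SET);
        if (fgetc(fp) == '\n') newlines++;
    }
    if (newlines == n) pos++;          /* step past the newline we stopped on */

    fseeko(fp, pos, SEEK_SET);         /* stream the tail to stdout */
    int c;
    while ((c = fgetc(fp)) != EOF) putchar(c);
    fclose(fp);
    return 0;
}
```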

PHP x86 How to get filesize of > 2 GB file without external program?

Submitted by 人盡茶涼 on 2019-12-17 03:36:57
Question: I need to get the file size of a file over 2 GB in size (testing on a 4.6 GB file). Is there any way to do this without an external program? Current status: filesize(), stat() and fseek() fail; fread() and feof() work. It is possible to get the file size by reading the file content (extremely slow!):

$size = (float) 0;
$chunksize = 1024 * 1024;
while (!feof($fp)) {
    // count the bytes actually returned; adding $chunksize every pass
    // would overstate the size by up to one chunk on the final read
    $size += (float) strlen(fread($fp, $chunksize));
}
return $size;

I know how to get it on 64-bit platforms (using…
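For contrast, the C-level fix that 64-bit builds rely on is a seek to the end with 64-bit offsets enabled; PHP on x86 fails here because its int (and hence filesize()) is 32 bits. A sketch of that underlying mechanism, not a PHP workaround:

```c
/* Sketch: size of a >2 GB file on a 32-bit build via 64-bit offsets.
 * _FILE_OFFSET_BITS=64 makes off_t, fseeko and ftello 64 bits wide. */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>

long long file_size(const char *path)
{
    FILE *fp = fopen(path, "rb");
    if (!fp) return -1;
    if (fseeko(fp, 0, SEEK_END) != 0) { fclose(fp); return -1; }
    off_t size = ftello(fp);         /* no 2 GB overflow here */
    fclose(fp);
    return (long long)size;
}
```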

Get last 10 lines of very large text file > 10GB

Submitted by 不问归期 on 2019-12-17 02:27:18
Question: What is the most efficient way to display the last 10 lines of a very large text file (this particular file is over 10 GB)? I was thinking of just writing a simple C# app, but I'm not sure how to do this effectively.
Answer 1: Read to the end of the file, then seek backwards until you find ten newlines, and then read forward to the end, taking the various encodings into consideration. Be sure to handle the case where the file has fewer than ten lines. Below is an implementation (in C#, as you…
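The C# implementation itself is cut off in this excerpt. As a stand-in, here is the chunked variant of that backward scan, sketched in C (a C# port would use FileStream.Seek and Read the same way); the chunk size is arbitrary, and as the answer warns, this ignores multi-byte encodings and assumes the file ends with a newline:

```c
/* Sketch: chunked backward scan for the last `wanted` lines of a huge
 * file. Reads 64 KiB blocks from the end instead of seeking per byte. */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>

#define CHUNK 65536

void print_tail(FILE *fp, int wanted)
{
    char buf[CHUNK];
    fseeko(fp, 0, SEEK_END);
    off_t pos = ftello(fp);
    off_t start = 0;                 /* fall back to the whole file */
    int newlines = 0;                /* wanted+1 newlines mark the tail's
                                        start, given a trailing newline */

    while (pos > 0 && newlines <= wanted) {
        size_t len = (size_t)(pos < CHUNK ? pos : CHUNK);
        pos -= (off_t)len;
        fseeko(fp, pos, SEEK_SET);
        if (fread(buf, 1, len, fp) != len) return;
        for (size_t i = len; i-- > 0; )          /* scan the block backwards */
            if (buf[i] == '\n' && ++newlines > wanted) {
                start = pos + (off_t)i + 1;      /* first byte of the tail */
                break;
            }
    }
    fseeko(fp, start, SEEK_SET);                 /* stream the tail to stdout */
    int c;
    while ((c = fgetc(fp)) != EOF) putchar(c);
}
```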

Why can't my program save a large amount (>2GB) to a file?

Submitted by ▼魔方 西西 on 2019-12-14 03:59:59
Question: I am having trouble figuring out why my program cannot save more than 2 GB of data to a file. I cannot tell whether this is a programming or an environment (OS) problem. Here is my source code:

#define _LARGEFILE_SOURCE
#define _LARGEFILE64_SOURCE
#define _FILE_OFFSET_BITS 64
#include <math.h>
#include <time.h>
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
/*-------------------------------------*/
// for file mapping in Linux
#include <fcntl.h>
#include <unistd.h>
…
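Those defines are the right ingredients, provided they appear before every include in every translation unit that touches the file APIs. Since the rest of the program is cut off, a minimal self-contained test can isolate the 2 GB question from the program's own logic; the file name and size here are illustrative:

```c
/* Sketch: minimal test for writing past 2 GiB with 64-bit offsets. */
#define _FILE_OFFSET_BITS 64         /* must come before any #include */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *fp = fopen("grow.bin", "wb");
    if (!fp) { perror("fopen"); return 1; }

    static char block[1 << 20];      /* one zeroed MiB per write */
    memset(block, 0, sizeof block);

    /* 3 GiB = 3072 MiB blocks; a 32-bit-offset build fails near 2 GiB. */
    for (int i = 0; i < 3072; i++)
        if (fwrite(block, 1, sizeof block, fp) != sizeof block) {
            perror("fwrite");
            printf("stopped near %lld bytes\n", (long long)ftello(fp));
            fclose(fp);
            return 1;
        }
    fclose(fp);
    return 0;
}
```

If this test passes, the limit is in the program; if it fails, the build flags, libc, or filesystem are capping the file size.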

Input stream reads large files very slowly, why?

Submitted by 旧时模样 on 2019-12-13 17:33:10
Question: I am trying to submit a 500 MB file. I can load it, but I want to improve the performance. This is the slow code:

File dest = getDestinationFile(source, destination);
if (dest == null)
    return false;
in = new BufferedInputStream(new FileInputStream(source));
out = new BufferedOutputStream(new FileOutputStream(dest));
byte[] buffer = new byte[1024 * 20];
int i = 0;
// this while loop is very slow
while ((i = in.read(buffer)) != -1) {
    out.write(buffer, 0, i); // <-- SLOW HERE
    out.flush();
}

How can I…
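A plausible culprit is the out.flush() on every iteration, which defeats the BufferedOutputStream and forces an OS write per chunk; the usual fix is to drop the per-iteration flush and enlarge the buffer. The same copy pattern sketched in C for comparison, flushing only once at the end:

```c
/* Sketch: plain buffered copy loop with no per-chunk flush. */
#include <stdio.h>

int copy_file(const char *src, const char *dst)
{
    FILE *in = fopen(src, "rb");
    if (!in) return -1;
    FILE *out = fopen(dst, "wb");
    if (!out) { fclose(in); return -1; }

    char buf[1 << 16];               /* 64 KiB: fewer syscalls per MB */
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, in)) > 0)
        if (fwrite(buf, 1, n, out) != n) break;  /* no fflush() here */

    fclose(in);
    return fclose(out) == 0 ? 0 : -1;  /* fclose flushes once, at the end */
}
```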

Need to transpose a LARGE csv file in perl [closed]

Submitted by ﹥>﹥吖頭↗ on 2019-12-13 09:46:41
Question: [Moderation notice: this question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. Closed 7 years ago.]
The CSV data file is 3.2 GB in total, with an unknown (assume very large) number of rows and columns. The file holds genomics data, SNP data for a population of individuals; thus the CSV file contains IDs such…
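The question closed without an answer, so what follows is only a guess at the general shape of a solution: a 3.2 GB table cannot be transposed in memory, and one constant-memory approach is a full pass per output row, re-reading the file and emitting one input column each time. A C sketch under the assumptions that fields contain no quoted commas and that each line fits in a fixed buffer; the file name is illustrative:

```c
/* Sketch: transpose a huge CSV in constant memory. One full pass per
 * input column, emitting that column as an output row: O(columns x
 * file size) time, but never more than one line held in memory. */
#include <stdio.h>
#include <string.h>

static void emit_column(FILE *fp, long col)
{
    char line[1 << 16];              /* assumes every line fits here */
    rewind(fp);
    int first = 1;
    while (fgets(line, sizeof line, fp)) {
        line[strcspn(line, "\r\n")] = '\0';
        char *field = line;
        for (long c = 0; c < col && field; c++) {  /* skip to column col */
            field = strchr(field, ',');
            if (field) field++;
        }
        if (!first) putchar(',');
        first = 0;
        if (field) {
            char *comma = strchr(field, ',');
            if (comma) *comma = '\0';
            fputs(field, stdout);    /* a missing field prints as empty */
        }
    }
    putchar('\n');
}

int main(void)
{
    FILE *fp = fopen("snps.csv", "r");
    if (!fp) { perror("fopen"); return 1; }

    char line[1 << 16];              /* count columns from the first line */
    long cols = 1;
    if (!fgets(line, sizeof line, fp)) return 1;
    for (char *p = line; (p = strchr(p, ',')); p++) cols++;

    for (long c = 0; c < cols; c++)  /* one full pass per output row */
        emit_column(fp, c);
    fclose(fp);
    return 0;
}
```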