large-files

Efficiently adding huge amounts of data from CSV files into an SQLite DB in Java [duplicate]

Submitted by 谁都会走 on 2019-12-24 07:03:37
Question: This question already has answers here: Android SQLite database: slow insertion (5 answers). Closed 2 years ago. I'm trying to parse values from a CSV file into a SQLite DB, but the file is quite large (~2,500,000 lines). I ran my program for a few hours, printing where it was up to, but by my calculation the file would have taken about 100 hours to parse completely, so I stopped it. I'm going to have to run this program as a background process at least once a week, on a new CSV file …
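The usual fix for insert rates like this is to batch the rows and wrap them in a single transaction instead of committing one row at a time. A minimal sketch of the idea using Python's stdlib sqlite3 module (the question is in Java, where PreparedStatement.addBatch() plus setAutoCommit(false) plays the same role; the table, column names, and sample data here are made up):

```python
import csv
import io
import sqlite3

# A tiny in-memory CSV standing in for the real ~2.5M-line file.
csv_text = "a,1\nb,2\nc,3\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (name TEXT, value INTEGER)")

reader = csv.reader(io.StringIO(csv_text))
with conn:  # one transaction for the whole batch, not one per row
    conn.executemany("INSERT INTO data VALUES (?, ?)", reader)

print(conn.execute("SELECT COUNT(*) FROM data").fetchone()[0])  # → 3
```

On real data you would read and insert in chunks of, say, 10,000 rows and commit per chunk, which keeps memory bounded while still avoiding a per-row transaction.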

Better way to store large files in a MySQL database?

Submitted by 我与影子孤独终老i on 2019-12-24 04:04:24
Question: I have a PHP script that you can upload very large files with (up to 500 MB), and the file's content is stored in a MySQL database. Currently I do something like this: mysql_query("INSERT INTO table VALUES('')"); $uploadedfile = fopen($_FILES['file']['tmp_name'], 'rb'); while (!feof($uploadedfile)) { $line = mysql_escape_string(fgets($uploadedfile, 4096)); mysql_query("UPDATE table SET file = CONCAT(file, '$line') WHERE something = something"); } fclose($uploadedfile); This of course does a …
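The per-line UPDATE ... CONCAT pattern rewrites the growing blob on every iteration. If the file must live in the database at all (storing a path and keeping the file on disk is usually better), the whole content should go in as one parameterized insert. A sketch of that shape using Python's stdlib sqlite3 for illustration (in MySQL the column would be a LONGBLOB and max_allowed_packet must be large enough; table and file names here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT, content BLOB)")

data = b"x" * (1024 * 1024)  # stand-in for an uploaded file's bytes

# One parameterized INSERT sends the whole blob at once: no
# per-line CONCAT updates and no manual string escaping.
with conn:
    conn.execute("INSERT INTO files VALUES (?, ?)", ("upload.bin", data))

row = conn.execute("SELECT LENGTH(content) FROM files").fetchone()
print(row[0])  # → 1048576
```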

How to get unique lines from a very large file in Linux?

Submitted by 我的梦境 on 2019-12-24 00:36:49
Question: I have a very large data file (255 GB; 3,192,563,934 lines). Unfortunately I only have 204 GB of free space on the device (and no other devices I can use). I took a random sample and found that in a given 100K lines there are about 10K unique lines... but the file isn't sorted. Normally I would use, say: pv myfile.data | sort | uniq > myfile.data.uniq and just let it run for a day or so. That won't work in this case because I don't have enough space left on the device for the temporary …
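Since sort needs on-disk temp space roughly the size of the input, an alternative is to stream the file once and remember a small fixed-size digest of each distinct line rather than the line itself. A sketch of the idea in Python (names are mine; note the caveat that ~320M distinct lines would still mean tens of GB of set overhead, so in practice you would partition the input by hash prefix and deduplicate each partition in a separate pass, and an 8-byte digest carries a small birthday-collision risk at this scale):

```python
import hashlib
import io

def unique_lines(src, dst):
    """Write each distinct line of src to dst (first occurrence wins).
    Only an 8-byte digest per distinct line is kept in memory,
    not the line itself."""
    seen = set()
    for line in src:
        h = hashlib.blake2b(line.encode(), digest_size=8).digest()
        if h not in seen:
            seen.add(h)
            dst.write(line)

src = io.StringIO("a\nb\na\nc\nb\n")
dst = io.StringIO()
unique_lines(src, dst)
print(dst.getvalue())  # → "a\nb\nc\n"
```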

Obb (Opaque Binary Blob) getMountedObbPath() returning null

Submitted by 安稳与你 on 2019-12-23 22:28:11
Question: Why is the obb file not found when it's actually there? In the logcat window: 11-05 20:42:33.860: I/System.out(7563): mainFile = /storage/sdcard0/Android/obb/main.11.se.sourses.thai.obb 11-05 20:42:33.860: I/System.out(7563): main = null The code snippet: StorageManager storage = (StorageManager) getApplicationContext().getSystemService(STORAGE_SERVICE); File mainFile = new File(Environment.getExternalStorageDirectory() + "/Android/obb/" + "main." + 11 + "." + "se.sourses.thai" + ".obb"); String …

loop for reading different data types & sizes off very large byte array from file

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-23 18:23:02
Question: I have a raw byte stream stored in a file (rawbytes.txt) that I need to parse and output to a CSV-style text file. The input of raw bytes (when read as characters/long/int etc.) looks something like this: A2401028475764B241102847576511001200C... Parsed, it should look like: OutputA.txt (Field1,Field2,Field3) - heading A,240,1028475764 OutputB.txt (Field1,Field2,Field3,Field4,Field5) - heading B,241,1028475765,1100,1200 OutputC.txt C,... // and so on Essentially, it's a hex-dump-style input of …
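One common shape for this kind of parser is a table of fixed field widths keyed by the record's type tag: read the tag, look up the widths, slice off that many characters per field, repeat. A sketch in Python (the original is Java; the field widths below are assumptions inferred from the sample records, and the real layout would come from the file's spec):

```python
import io

# Hypothetical field widths per record type, inferred from the sample:
# tag (1 char) followed by fixed-width numeric fields.
LAYOUTS = {
    "A": (3, 10),        # A + 3-digit field + 10-digit field
    "B": (3, 10, 4, 4),  # B + four fields
}

def parse_records(stream):
    rows = []
    while True:
        tag = stream.read(1)
        if not tag:
            break
        widths = LAYOUTS[tag]
        fields = [stream.read(w) for w in widths]
        rows.append([tag] + fields)
    return rows

data = io.StringIO("A2401028475764B241102847576511001200")
for row in parse_records(data):
    print(",".join(row))
# → A,240,1028475764
# → B,241,1028475765,1100,1200
```

For true binary (non-character) fields the same table-driven loop works with struct.unpack formats in place of character widths.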

Use php to zip large files [closed]

Submitted by 眉间皱痕 on 2019-12-23 14:19:10
Question: Closed. This question needs to be more focused. It is not currently accepting answers. Want to improve this question? Update the question so it focuses on one problem only by editing this post. Closed 2 years ago. I have a PHP form with a bunch of checkboxes that each correspond to a file. Once a user selects the checkboxes (files) they want, the script zips up the files and forces a download. I got a simple PHP zip-and-force-download to work, but when one of the files is huge, or if …
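The failure mode with big files is usually loading each file's full contents into memory before adding it to the archive. Adding entries straight from disk avoids that: the zip library streams each file in chunks. A sketch of the pattern using Python's stdlib zipfile (the question is PHP, where ZipArchive::addFile reads from disk similarly; all paths and names below are invented stand-ins):

```python
import os
import tempfile
import zipfile

# Create a couple of stand-in "selected" files.
tmpdir = tempfile.mkdtemp()
paths = []
for name in ("a.txt", "b.txt"):
    p = os.path.join(tmpdir, name)
    with open(p, "w") as f:
        f.write("payload for " + name)
    paths.append(p)

archive = os.path.join(tmpdir, "bundle.zip")
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for p in paths:
        # zf.write streams the file from disk in chunks; the whole
        # file is never held in memory at once.
        zf.write(p, arcname=os.path.basename(p))

print(sorted(zipfile.ZipFile(archive).namelist()))  # → ['a.txt', 'b.txt']
```

For the download step, the same principle applies on the way out: send the finished archive in chunks rather than reading it into one string, and raise the script's execution time limit for large selections.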

Using Perl6 to process a large text file, and it's too slow (2014-09)

Submitted by 拥有回忆 on 2019-12-23 09:57:29
Question: The code is hosted at https://github.com/yeahnoob/perl6-perf , as follows: use v6; my $file=open "wordpairs.txt", :r; my %dict; my $line; repeat { $line=$file.get; my ($p1,$p2)=$line.split(' '); if ?%dict{$p1} { %dict{$p1} = "{%dict{$p1}} {$p2}".words; } else { %dict{$p1} = $p2; } } while !$file.eof; It runs well when "wordpairs.txt" is small. But when the file is about 140,000 lines (two words per line), it runs very, very slowly and cannot finish, even after 20 …
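A likely culprit is that the hot loop rebuilds the growing value on every line (interpolate the whole existing list into a string, then re-split it with .words), which is O(n) per line and quadratic overall. Accumulating into a list instead is amortized O(1) per append. A sketch of the fix in Python; in Perl 6 the analogous change would be pushing onto an array value rather than re-stringifying it:

```python
import io
from collections import defaultdict

def build_dict(lines):
    """Map each first word to the list of second words seen with it.
    list.append is O(1) amortized; rebuilding a growing string per
    line (as the original code does) makes the loop quadratic."""
    d = defaultdict(list)
    for line in lines:
        p1, p2 = line.split(" ")
        d[p1].append(p2)
    return d

src = io.StringIO("a b\na c\nb d\n")
d = build_dict(line.rstrip("\n") for line in src)
print(d["a"])  # → ['b', 'c']
```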

Viewing large XML files in eclipse?

Submitted by 匆匆过客 on 2019-12-23 07:51:12
Question: I'm working on a project involving some large XML files (from 50 MB to over 1 GB) and it would be nice if I could view them in Eclipse (a simple text view is fine) without Java running out of heap space. I've tried tweaking the amount of memory available to the JVM in eclipse.ini but haven't had much success. Any ideas? Answer 1: I am not sure you can open such large files, as already stated in 2005. You will end up with !MESSAGE Unable to create editor ID org.eclipse.ui.DefaultTextEditor: Editor …
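For reference, the heap settings the asker mentions go after the -vmargs line of eclipse.ini; the values below are only illustrative and depend on available RAM (and even a large heap may not make the default text editor handle a 1 GB file):

```ini
-vmargs
-Xms512m
-Xmx2048m
```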

Compressing a big file into a ZIP with Java

Submitted by 霸气de小男生 on 2019-12-23 07:49:55
Question: I need to compress one big file (~450 MB) with the Java class ZipOutputStream. This large size causes an "OutOfMemory" error in my JVM heap space. This happens because the "zos.write(...)" method stores ALL the file content to compress in an internal byte array before compressing it. origin = new BufferedInputStream(fi, BUFFER); ZipEntry entry = new ZipEntry(filePath); zos.putNextEntry(entry); int count; while ((count = origin.read(data, 0, BUFFER)) != -1) { zos …
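Whatever the actual cause of the OOM, the safe pattern is to stream the source into the zip entry in fixed-size chunks so memory stays bounded at one buffer. A sketch of that shape using Python's stdlib zipfile (in Java it is the same read/write loop over ZipOutputStream the question shows, so the OOM may come from somewhere else, such as buffering the whole file before the loop; names below are mine):

```python
import io
import zipfile

CHUNK = 64 * 1024

def add_streamed(zf, name, src):
    """Copy src into the archive entry chunk by chunk, so only
    CHUNK bytes are buffered at a time."""
    zi = zipfile.ZipInfo(name)
    zi.compress_type = zipfile.ZIP_DEFLATED
    with zf.open(zi, "w") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            dst.write(block)

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    add_streamed(zf, "big.bin", io.BytesIO(b"x" * 200_000))

buf.seek(0)
print(zipfile.ZipFile(buf).read("big.bin") == b"x" * 200_000)  # → True
```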

Is my .XAP file too big? and if so how do i make it smaller?

Submitted by 大城市里の小女人 on 2019-12-23 04:42:16
Question: I'm currently doing some work on a Silverlight LOB application I wrote for a customer. When I deploy, I can't help noticing how big the .xap file is (4 MB), and considering it's not a massive app, that seems a little unusual. I'm using the Telerik Silverlight toolkit (but only including the required themes, 2 I think). There's about 1 MB of images (which is maybe a bit too much for a LOB app). Is this average for a Silverlight application? What's the average size of your .xap files? How would I go …