large-files

Is my .XAP file too big? And if so, how do I make it smaller?

久未见 submitted on 2019-12-23 04:41:12
Question: I'm currently doing some work on a Silverlight LOB application I wrote for a customer. When I deploy, I can't help noticing how big the XAP file is (4 MB), and considering it's not a massive app, that seems a little unusual. I am using the Telerik Silverlight toolkit (but only including the required themes, 2 I think). There's about 1 MB of images (which is maybe a bit too much for a LOB app). Is this average for a Silverlight application? What's the average size of your XAP files? How would I go

Connection dropped by client when serving large files for download (Java, Jersey, HTTP, GET)

心不动则不痛 submitted on 2019-12-23 04:22:57
Question: I have an HTTP server which serves files for download, and some of these files are pretty large (7 GB or more). When these files are downloaded from some networks, the connection is dropped and we find the following error in the Tomcat catalina log: org.apache.catalina.connector.ClientAbortException: java.io.IOException: Connection reset by peer at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:393) at org.apache.tomcat.util.buf.ByteChunk.flushBuffer
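
The serving code isn't shown in the excerpt, but for context, a Jersey resource that streams a large file in small buffers usually looks roughly like the sketch below (the base directory, URL path, and buffer size are assumptions, and a real handler must validate the file name); the ClientAbortException itself just records that the client, or something between the client and Tomcat, closed the connection mid-transfer:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.StreamingOutput;

@Path("/files")
public class DownloadResource {

    // hypothetical location of the downloadable files
    private static final File BASE_DIR = new File("/data/downloads");

    @GET
    @Path("/{name}")
    @Produces(MediaType.APPLICATION_OCTET_STREAM)
    public Response download(@PathParam("name") String name) {
        File file = new File(BASE_DIR, name); // no path sanitisation in this sketch
        StreamingOutput body = output -> {
            try (InputStream in = new FileInputStream(file)) {
                byte[] buf = new byte[64 * 1024]; // copy in 64 KB buffers
                int n;
                while ((n = in.read(buf)) != -1) {
                    output.write(buf, 0, n);
                }
            }
        };
        return Response.ok(body)
                .header("Content-Length", file.length())
                .header("Content-Disposition", "attachment; filename=\"" + name + "\"")
                .build();
    }
}
```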

How can I write/create a file larger than 2 GB using C/C++?

耗尽温柔 submitted on 2019-12-23 03:52:05
Question: I tried to use the write() function to write a large piece of memory (more than 2 GB) into a file but never succeeded. Can somebody tell me what to do? Answer 1: Assuming Linux :) http://www.suse.de/~aj/linux_lfs.html 1/ define _FILE_OFFSET_BITS to 64 2/ define _LARGEFILE_SOURCE and _LARGEFILE_SOURCE64 4/ use the O_LARGEFILE flag with open to operate on large files. Also some information here: http://www.gnu.org/software/libc/manual/html_node/Opening-Streams.html#index-fopen64-931 These days
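
For reference, a minimal C sketch of the advice above (assuming Linux and glibc; the output path and sizes are made up):

```c
/* Large-file support sketch: on 32-bit Linux/glibc, defining
 * _FILE_OFFSET_BITS=64 before any #include makes off_t 64-bit, and glibc
 * then applies O_LARGEFILE behaviour to open() automatically. */
#define _FILE_OFFSET_BITS 64
#define _LARGEFILE_SOURCE

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    const char *path = "big.bin";                     /* hypothetical output file */
    const size_t block = 1024 * 1024;                 /* write in 1 MB chunks     */
    const long long total = 3LL * 1024 * 1024 * 1024; /* 3 GB, i.e. past 2 GB     */

    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) { perror("open"); return 1; }

    char *buf = malloc(block);
    if (!buf) { perror("malloc"); return 1; }
    memset(buf, 'x', block);

    long long written = 0;
    while (written < total) {
        /* write in moderate chunks: a single write() call cannot move 2 GB+ */
        ssize_t n = write(fd, buf, block);
        if (n < 0) { perror("write"); free(buf); return 1; }
        written += n;
    }

    free(buf);
    close(fd);
    return 0;
}
```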

Converting very large files from XML to CSV

≡放荡痞女 submitted on 2019-12-23 03:29:21
Question: Currently I'm using the following code snippet to convert a .txt file with XML data to .CSV format. My question is this: currently it works perfectly with files that are around 100-200 MB and the conversion time is very low (1-2 minutes max). However, I now need this to work for much bigger files (1-2 GB each). At that size the program freezes the computer and the conversion takes about 30-40 minutes with this function. Not sure how I would proceed changing this function. Any help will
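
The code snippet itself is cut off in the excerpt, but the usual fix for this class of problem is to stream the XML rather than load it all at once. A minimal sketch of that idea in Python (the record tag and column names are made up for illustration):

```python
import csv
import xml.etree.ElementTree as ET

def xml_to_csv(xml_path, csv_path):
    """Stream-convert a large XML file to CSV without building the whole tree in memory."""
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["id", "value"])  # hypothetical columns
        context = ET.iterparse(xml_path, events=("start", "end"))
        _, root = next(context)  # keep a handle on the root element
        for event, elem in context:
            if event == "end" and elem.tag == "record":  # hypothetical record tag
                writer.writerow([elem.get("id"), elem.findtext("value")])
                root.clear()  # drop processed elements so memory use stays flat

xml_to_csv("big_input.txt", "output.csv")
```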

Python - Opening and changing large text files

∥☆過路亽.° submitted on 2019-12-22 06:57:31
Question: I have a ~600 MB Roblox-type .mesh file, which reads like a text file in any text editor. I have the following code: mesh = open("file.mesh", "r").read() mesh = mesh.replace("[", "{").replace("]", "}").replace("}{", "},{") mesh = "{"+mesh+"}" f = open("p2t.txt", "w") f.write(mesh) It returns: Traceback (most recent call last): File "C:\TheDirectoryToMyFile\p2t2.py", line 2, in <module> mesh = mesh.replace("[", "{").replace("]", "}").replace("}{", "},{") MemoryError Here is a sample of my
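
Since the substitutions only ever look at two adjacent characters, the file can be rewritten in fixed-size chunks instead of one giant string. A rough sketch keeping the original file names (the chunk size is arbitrary; the last processed character is held back so a "}{" or "][" pair split across a chunk boundary is still caught):

```python
CHUNK = 16 * 1024 * 1024  # read 16 MB at a time (arbitrary)

def transform(text):
    # same substitutions as the original one-shot version
    return text.replace("[", "{").replace("]", "}").replace("}{", "},{")

with open("file.mesh", "r") as src, open("p2t.txt", "w") as dst:
    dst.write("{")
    carry = ""  # last processed character, deferred to the next chunk
    while True:
        chunk = src.read(CHUNK)
        if not chunk:
            break
        out = transform(carry + chunk)
        carry = out[-1]
        dst.write(out[:-1])
    dst.write(carry)
    dst.write("}")
```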

Content-Length header from PHP is overwritten!

[亡魂溺海] submitted on 2019-12-22 06:49:43
Question: I'm trying to figure out why the Content-Length header set from PHP gets overwritten. This is demo.php: <?php header("Content-Length: 21474836470");die; ?> A request to fetch the headers: curl -I http://someserver.com/demo.php HTTP/1.1 200 OK Date: Tue, 19 Jul 2011 13:44:11 GMT Server: Apache/2.2.16 (Debian) X-Powered-By: PHP/5.3.3-7+squeeze3 Content-Length: 2147483647 Cache-Control: must-revalidate Content-Type: text/html; charset=UTF-8 See the Content-Length? It maxes out at 2147483647 bytes, that is 2 GB.

Upload large file in background (service restarts when the app is closed)

拜拜、爱过 submitted on 2019-12-22 05:14:46
Question: I would like to upload large files (~10-100 MB, over Wi-Fi or a mobile network) in the background, because the user may leave the app and the system may later close it (if there is not enough memory). I created a service for this case, but my problem is that when I kill the app the service restarts and the upload starts again. I found the same problems without a solution: "Keeping background service alive after user exit app", "My service is restarted each time the application is closed". So it won't
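
One common direction (sketched below, not a drop-in fix) is to run the upload in a foreground service so the system is far less likely to kill it, and to return START_REDELIVER_INTENT so that, if it is killed anyway, the service is recreated with the original intent and can resume instead of starting over. The sketch assumes an androidx dependency for NotificationCompat plus the usual manifest entries (the service declaration and, on newer API levels, the FOREGROUND_SERVICE permission); the channel id and notification text are made up:

```java
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.Service;
import android.content.Intent;
import android.os.Build;
import android.os.IBinder;

import androidx.core.app.NotificationCompat;

public class UploadService extends Service {

    private static final String CHANNEL_ID = "uploads"; // hypothetical channel id

    @Override
    public void onCreate() {
        super.onCreate();
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            // a notification channel is required for the foreground notification on API 26+
            NotificationChannel channel = new NotificationChannel(
                    CHANNEL_ID, "Uploads", NotificationManager.IMPORTANCE_LOW);
            getSystemService(NotificationManager.class).createNotificationChannel(channel);
        }
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Notification notification = new NotificationCompat.Builder(this, CHANNEL_ID)
                .setContentTitle("Uploading…")
                .setSmallIcon(android.R.drawable.stat_sys_upload)
                .build();
        startForeground(1, notification); // stay in the foreground for the duration of the upload

        // ...start the actual upload on a worker thread here, ideally in resumable chunks...

        // If the process is still killed, recreate the service with the same intent
        // so the upload can pick up where it left off rather than restart from zero.
        return START_REDELIVER_INTENT;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started service, no binding needed
    }
}
```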

How to remove largefiles from Mercurial repo

谁说胖子不能爱 submitted on 2019-12-22 04:53:06
Question: See also this question. Without knowing what I was doing, I enabled the largefiles extension, committed a file, and pushed it to Kiln. Now I know the error of my ways, and I need to permanently revert this change. I followed the guidance from SO on the subject, and I can remove largefiles locally, but this doesn't affect the remote repos in Kiln. I have tried opening the repo in KilnRepositories on the Kiln server and nuking the largefiles folder (as well as deleting 'largefiles' from the
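
For the local half of this, the largefiles extension ships an hg lfconvert command that rewrites the repository back into a normal one. The paths below are placeholders; because the history is rewritten, the converted repo gets new changeset hashes, so the remote copy (here the Kiln repo) generally has to be replaced by pushing the converted repo into a fresh repository rather than fixed in place:

```
# convert a largefiles-enabled clone back into a plain repository (example paths)
hg lfconvert --to-normal myrepo-with-largefiles myrepo-normal

# then drop "largefiles =" from the [extensions] section of your hgrc/mercurial.ini
# and push myrepo-normal to a newly created remote repository
```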

How do I join pairs of consecutive lines in a large file (1 million lines) using vim, sed, or another similar tool?

橙三吉。 submitted on 2019-12-22 03:18:05
Question: I need to move the contents of every second line up to the line above, so that line2's data sits alongside line1's; either comma- or space-separated works. Input (one item per line): line1, line2, line3, line4. Desired output: "line1 line2" on one line and "line3 line4" on the next. I've been doing it in vim with a simple recording, but vim seems to crash when I tell it to repeat it 100,000 times... I'm thinking maybe sed would be a good alternative, but I'm not sure how to do what I want, or maybe there's a better option? Each line only contains 1 numerical value, I
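
Two standard streaming one-liners that handle this without loading the file into memory (file names are placeholders):

```
# sed: append the Next line to the pattern space, then replace the newline with a space
sed 'N;s/\n/ /' input.txt > output.txt

# paste: read two lines at a time from stdin and join them with a space
paste -d ' ' - - < input.txt > output.txt
```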