file-upload

Custom $options directory for jQuery Blueimp Upload Plugin?

Submitted by Deadly on 2021-02-07 19:42:50
Question: I have been working with the blueimp jQuery File Upload plugin and I'm trying to set up a custom field the user can change for uploads. Within the PHP class there is a constructor which sets up all the default option variables. I'm looking for the best way to store this data so that, after the user sets a value, it is saved as the new default upload folder. I'm thinking of an external XML file, which is then imported into the PHP script. Here's the upload class constructor function: function _
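
Since the blueimp plugin is driven from jQuery on the client, one alternative to an external XML file is to send the user's chosen directory along with each upload and let the PHP handler validate it. A minimal client-side sketch follows; the #fileupload form, the #upload-dir input, and the server/php/ endpoint are illustrative names, not taken from the question.

    // Send the user's chosen directory with every upload request; the PHP
    // UploadHandler can then read it from $_POST and validate it before
    // using it as the upload directory. Element ids are hypothetical.
    $('#fileupload').fileupload({
        url: 'server/php/',
        formData: function (form) {
            return [{ name: 'upload_dir', value: $('#upload-dir').val() }];
        }
    });

The server must never trust this value blindly; sanitize it and restrict it to known subdirectories to avoid path traversal.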

File Upload Large Files PHP/AJAX [closed]

Submitted by 扶醉桌前 on 2021-02-07 14:16:44
Question: (Closed 7 years ago as not a good fit for the Q&A format.) I'm trying to find some software to help me upload really large files (>200 MB). I only want to upload a single file, no multi-file options.
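
For files in the hundreds of megabytes, the common browser-side technique is to slice the File into chunks and upload them one at a time, so no single request exceeds PHP's post-size limits. A minimal sketch, assuming a hypothetical /upload-chunk endpoint that reassembles chunks by offset:

    // Slice the file into 5 MB pieces and POST them sequentially.
    // Endpoint name and chunk size are assumptions, not from the question.
    async function uploadInChunks(file, chunkSize = 5 * 1024 * 1024) {
        for (let offset = 0; offset < file.size; offset += chunkSize) {
            const chunk = file.slice(offset, offset + chunkSize);
            const end = Math.min(offset + chunkSize, file.size) - 1;
            await fetch('/upload-chunk', {
                method: 'POST',
                headers: {
                    'Content-Range': 'bytes ' + offset + '-' + end + '/' + file.size,
                    'X-File-Name': encodeURIComponent(file.name)
                },
                body: chunk
            });
        }
    }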

Sending file direct from browser to S3 but changing file name

Submitted by ∥☆過路亽.° on 2021-02-07 13:18:35
Question: I am using signed, authorized S3 uploads so that users can upload files directly from their browser to S3, bypassing my server. This presently works, but the file name is the same as on the user's machine; I'd like to save it on S3 under a different name. The FormData I post to Amazon looks like this: var formData = new FormData(); formData.append('key', targetPath); // e.g. /path/inside/bucket/myFile.mov formData.append('AWSAccessKeyId', s3Auth.AWSAccessKeyId); // aws public key formData.append(
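
The S3 object name is simply whatever goes into the key field of the POST policy; the browser's original file name is only used if the key contains the ${filename} placeholder. So renaming amounts to building targetPath with the new name before appending it. A minimal sketch; generateUniqueId and the path prefix are hypothetical, and the signed policy must allow the resulting key (for example via a starts-with condition):

    // Rename by constructing the S3 key yourself instead of reusing the
    // browser file name. 'file' is the File object being uploaded.
    var ext = file.name.split('.').pop();
    var newName = generateUniqueId() + '.' + ext;   // hypothetical helper
    var targetPath = 'path/inside/bucket/' + newName;
    var formData = new FormData();
    formData.append('key', targetPath);
    formData.append('AWSAccessKeyId', s3Auth.AWSAccessKeyId);
    // ...remaining signed-policy fields exactly as before...
    formData.append('file', file);   // S3 requires the file field last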

Retrieve the file name while using file type input

Submitted by 天涯浪子 on 2021-02-07 05:19:49
Question: I have a JSP with this code snippet in it: <form name="AudioFileConversionForm" enctype="multipart/form-data" method="post" > Choose File: <input type="file" id="audioFile" name="audioFile"><br> <input type="submit" value="upload"> </form> This is my Spring controller: public String convertFile(HttpServletRequest request, HttpSession session) { String audioFile = request.getParameter("audioFile"); System.out.println(request.getParameter("audioFile")); System.out.println("Audio File
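
request.getParameter cannot read fields of a multipart/form-data request, which is why the controller sees null; on the server, Spring's multipart support (e.g. a MultipartFile parameter and its getOriginalFilename method) is the usual route. If the goal is only to read the chosen file's name in the browser, a minimal client-side sketch, using the audioFile id from the question's JSP:

    // Log the selected file's name as soon as the user picks a file.
    var input = document.getElementById('audioFile');
    input.addEventListener('change', function () {
        if (input.files.length > 0) {
            console.log('Selected file: ' + input.files[0].name);
        }
    });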

Android pause and resume upload

Submitted by 社会主义新天地 on 2021-02-07 03:57:49
Question: I'd like to create a pause-and-resume system for my upload component. So far I'm able to directly and continuously upload a video to my remote server with the following code: private void uploadVideo(String videoPath) throws Exception { HttpClient httpclient = new DefaultHttpClient(); HttpPost httppost = new HttpPost(SEVER_ENDPOINT); FileBody filebodyVideo = new FileBody(new File(videoPath)); StringBody title = new StringBody("Filename: " + videoPath); StringBody description = new
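
The question targets Android/Java, but the pause-and-resume mechanism itself is language-agnostic: persist the byte offset of the last fully uploaded chunk, and on resume skip that many bytes of the source file. A minimal sketch of the idea in browser JavaScript (the /upload endpoint, chunk size, and storage key are assumptions):

    // Upload in chunks, recording the confirmed offset after each one so a
    // paused or interrupted upload can continue where it left off.
    async function resumableUpload(file, chunkSize = 1024 * 1024) {
        const key = 'upload-offset:' + file.name;
        let offset = Number(localStorage.getItem(key) || 0);
        while (offset < file.size) {
            const chunk = file.slice(offset, offset + chunkSize);
            await fetch('/upload', {
                method: 'POST',
                headers: {
                    'Content-Range': 'bytes ' + offset + '-' +
                        (offset + chunk.size - 1) + '/' + file.size
                },
                body: chunk
            });
            offset += chunk.size;
            localStorage.setItem(key, String(offset));   // survives a pause
        }
        localStorage.removeItem(key);                    // upload complete
    }

On Android the same bookkeeping could be done with SharedPreferences and a seek into the source file, but that is a design note rather than code from the question.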

FileReader memory leak in Chrome

Submitted by 与世无争的帅哥 on 2021-02-07 03:51:40
Question: I have a webpage with file-upload functionality. The upload is performed in 5 MB chunks, and I want to calculate a hash for each chunk before sending it to the server. The chunks are represented by Blob objects. To calculate the hash, I read each blob into an ArrayBuffer using a native FileReader. Here is the code: var reader = new FileReader(); var getHash = function (blob, callback) { reader.onloadend = function (e) { var hash = util.hash(e.target.result); callback(hash); }; reader
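
A commonly suggested workaround for this Chrome behavior is to create a fresh FileReader for each chunk instead of reusing a single instance, so each ArrayBuffer can be garbage-collected once its hash is computed. A minimal sketch reusing the question's own getHash and util.hash names:

    // One FileReader per chunk: nothing long-lived keeps a reference to
    // the ArrayBuffer after the callback has run.
    function getHash(blob, callback) {
        var reader = new FileReader();
        reader.onloadend = function (e) {
            callback(util.hash(e.target.result));
            reader.onloadend = null;   // drop the closure's reference
        };
        reader.readAsArrayBuffer(blob);
    }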
