Sigh, we're back to this. I can easily use CORS on any decent browser to directly upload files to my AWS S3 bucket. But (it was coming), with IE I have to fall back to an iframe-based upload.
Summarizing my answer in the comments: IE has CORS support with some restrictions: http://www.html5rocks.com/en/tutorials/cors/
and this implementation of direct upload to S3 looks much simpler than jQuery File Upload, and it doesn't depend on jQuery: http://codeartists.com/post/36892733572/how-to-directly-upload-files-to-amazon-s3-from-your
Hope this helps!
Almost everything you need to know about how the jQuery File Upload plugin does iframe uploads is in its Iframe Transport plugin (along with the supporting result.html page).
As an introduction, you may want to read their user instructions on their Cross domain uploads wiki page, specifically the Cross-site iframe transport uploads section. (Note that according to their Browser support page, niceties like upload progress are not supported for IE <10, so I wouldn't consider these possible using the iframe transport, at least without significant effort.)
(Also, I don't believe any S3 upload implementation using the File Upload plugin has access to the XML content of a file upload error.)
The Iframe Transport plugin adds a new Ajax "transport" method for jQuery and is not specific to the File Upload plugin. You may want to read the documentation for jQuery.ajaxTransport() to understand the API that jQuery provides for adding a new transport.
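As a rough sketch of that API (not the plugin's actual code): `jQuery.ajaxTransport()` registers a factory that returns an object with `send` and `abort` methods. Everything inside this stub is illustrative; only the send/abort contract comes from jQuery's documented API.

```javascript
// Sketch of the contract jQuery.ajaxTransport() expects: a factory returning
// an object with send/abort. The body is a stub; the real Iframe Transport
// plugin builds a hidden form + iframe inside send().
function iframeTransportFactory(options) {
  return {
    send: function (headers, completeCallback) {
      // the real plugin submits the hidden form here and calls
      // completeCallback(200, 'success', {...}) from the iframe's
      // load handler
      completeCallback(200, 'success', { iframe: null });
    },
    abort: function () {
      // the real plugin points the iframe back at "javascript:false;"
      // to cancel the in-flight request
    }
  };
}

// it would be registered along the lines of:
// $.ajaxTransport('iframe', iframeTransportFactory);
```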
I'll try to summarize what the Iframe Transport plugin is doing, and how it relates to uploading files to Amazon S3:
When a file upload is triggered, the `send()` function is called. This function:

- Creates a hidden form element
- Creates an iframe element with `src="javascript:false;"`, and binds a `load` event handler to the iframe
- Appends the iframe to the hidden form, and appends the hidden form to the document.
When the iframe is created and its "page" loaded, its `load` event handler is called. The handler:

- Clears itself from the iframe, and binds another `load` event handler
- Configures the hidden form:
  - The form's `action` will be the URL for the S3 bucket
  - The form's `target` is set to the iframe, so that the server response is loaded in the iframe
  - Other fields, e.g. `AWSAccessKeyId`, are added. Specifically, `success_action_redirect` is set to the URL of result.html on your server, e.g. `http://example.org/result.html?%s`. Normally, the `%s` token should be replaced with the upload results by server-side code, but with S3 this can be hard-coded with a success value by your code, since Amazon will redirect to this URL only if the upload succeeded.
- Moves the file input fields from the original form into the hidden form, with cloned fields left in the original fields' place
- Submits the hidden form
- Moves the file input fields back into the original form, replacing the cloned fields
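For concreteness, the fields the hidden form ends up carrying look roughly like this. The field names come from Amazon's browser-based POST API; the bucket URL, keys, policy, and signature values below are placeholders, not working credentials:

```javascript
// Hypothetical sketch of the S3 POST fields the hidden form carries.
// All values are placeholders.
function buildS3FormFields(opts) {
  return {
    key: opts.key,                    // object key, e.g. 'uploads/${filename}'
    AWSAccessKeyId: opts.accessKeyId, // your public access key id
    acl: opts.acl || 'private',
    policy: opts.policy,              // base64-encoded, signed policy document
    signature: opts.signature,        // signature of the policy
    // Amazon redirects the iframe here ONLY on success, so the %s token can
    // simply be replaced with a hard-coded success value up front:
    success_action_redirect: 'http://example.org/result.html?upload=success'
  };
}
```

The plugin then appends one hidden input per field before submitting the form to the bucket URL.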
The file(s) are uploaded to S3. If successful, Amazon redirects the iframe to the `success_action_redirect` URL. If not successful, Amazon returns an error, which is also loaded in the iframe.
The iframe's `load` event handler is called. The handler:

- Tries to save a reference to the iframe's `document` object. If the file upload failed, the handler saves an `undefined` instead.
- Calls the complete callback with a success code and a reference to the iframe's `document` object (or `undefined`)
- Removes the hidden form (and iframe)
Before control is returned to your code, the iframe's `document` object is passed to a converter (at the bottom of the Iframe Transport plugin), depending on what type of data you were expecting. The converter extracts that data from the `document` object and returns it (or `undefined` if the file upload failed) to your callback(s).
Your callback(s) (`success` and/or `complete` as passed to `jQuery.ajax()`) is called. A success code is always returned by the plugin, so any `error` callback will not be triggered.

If the data passed to your callback(s) is the value you included in the `success_action_redirect`, then the file upload succeeded. If the data is `undefined`, then the file upload failed.
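In other words, the only signal your callbacks get is "your success value" versus `undefined`. A minimal sketch of that check (the `'upload=success'` marker is a placeholder you would choose yourself):

```javascript
// Interprets what the iframe transport's converter hands back: your
// hard-coded success value if Amazon redirected the iframe to result.html,
// or undefined if the upload failed.
function interpretIframeResult(data) {
  if (data === undefined) {
    // S3 returned an error page; cross-origin rules keep its XML body
    // out of reach, so all we know is that the upload failed
    return 'error';
  }
  return 'success';
}
```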
Update: If the error XML page stays on the same origin as the S3 bucket, then another page from the S3 bucket, loaded into another iframe, can access the original iframe's content (because they are from the same origin). Your main page can communicate with this second iframe using postMessage()
(or easyXDM's FlashTransport, if you need to support IE6/7).
This problem of providing accurate feedback to users whose browsers have no FileReader or FormData support has troubled me a lot as well. I spent a whole 3 days trying to come up with a solution and finally came up with something close to nothing.
Let's get down to the facts:
Ok, then there is no other way of uploading the file than using an iframe. Right?
So, jQuery File Upload using jQuery Iframe Transport, as @jeferry_to describes so well, is the tool for the job.
Actually, the tool/plugin doesn't change a thing...
What now?
Well... we need to access the S3 response inside the transport iframe. But we can't, because it's on a different domain. So we decide to deal with it by using this trick involving a second iframe.
The setup:
The scenario:
First of all, we need to modify jQuery Iframe Transport so that it does not auto-remove the auto-generated form and transport frame. We need to do this because `#postMessage`, which we will use later, is asynchronous by nature, and we don't want the iframe gone by the time we try to access it.
A page from our bucket, loaded in a second iframe, uses `top.frames['iframe X'].document.documentElement` to access the contents of the TransportFrame, stringifies them, and sends them back to TopFrame through `#postMessage`. OK, everything should work now, because everything is done by the book.
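A sketch of what that second-iframe helper page would do. Here `framesMap` stands in for the browser's `top.frames` collection, and the frame name and `post` callback are assumptions for illustration:

```javascript
// Reads the transport frame's document (same-origin access works, since both
// pages are served from the S3 bucket) and relays a serialized copy upward.
function relayTransportContents(framesMap, frameName, post) {
  var root = framesMap[frameName].document.documentElement;
  // serialize: a Document/Element can't cross postMessage in older browsers
  post(root.outerHTML);
}

// in the helper page, something like:
// relayTransportContents(top.frames, 'transportFrame', function (html) {
//   top.postMessage(html, 'http://example.org'); // your page's origin
// });
```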
Nahh, you should not even bother.
You see... if you force a modern browser to use the iframe transport instead of XHR2, the above solution will indeed work like a charm.
However that's pointless. We want it to work in IE8 + 9.
Well... in IE8/9 it sometimes works and sometimes it doesn't. Usually it doesn't.
Why? Because of IE's friendly HTTP error messages. Oh yes, you read that right.
In case of an error, S3 responds with an HTTP error status depending on the error (400, 403, etc.). Now, depending on the status and the length of the response, as shown here, IE discards the S3 response and replaces it with a friendly error message. To overcome this, you must make sure the response is always > 512 bytes. In this case you cannot guarantee anything like that, because you don't control the response; S3 does, and the typical error responses are less than 512 bytes.
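As a rough rule of thumb from the above (the 512-byte figure comes from the answer; the exact threshold IE applies varies per status code, so treat this as a sketch):

```javascript
// Heuristic: will IE likely replace this response with a "friendly" error
// page? Error status plus a small body is the dangerous combination.
function mayTriggerFriendlyError(status, bodyLength) {
  return status >= 400 && bodyLength < 512;
}
```

Since S3's typical error bodies fall under that threshold and you can't pad them, this check mostly confirms the dead end rather than offering a way out.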
In short:
The iframe trick works on those browsers that do not need it, and doesn't on those that do.
Unfortunately, I can't think of anything else so that case is closed for me now.
As for the "postMessage" scenario, maybe the iframe should contain a simple script ([edit] to detect iframes taken over by an error message).
IFRAME script
window.onload = function (e) {
    // a Document object can't be sent through postMessage, so serialize it;
    // this also gives the parent the proper source to target me with
    window.parent.postMessage(document.documentElement.outerHTML, '*'); // replace '*' with your parent's origin if possible
};

window.addEventListener('message', function (e) { // IE8 needs attachEvent('onmessage', ...)
    if (e.origin === 'http://example.com') { // the origin of your parent frame
        if (e.data === 'Salute') {
            window.parent.postMessage("I'm here", '*'); // replace '*' with your parent's origin too
        }
    }
});
Now the parent knows the iframe perfectly well and can track its status (depending on whether it's answering a simple postMessage).
PARENT script
var iFrameTarget;
var iFrameTakenOver = false;
var timer;

window.addEventListener('message', function (e) { // IE8 needs attachEvent('onmessage', ...)
    if (e.origin === 'http://example.com') { // the origin of your iframe
        if (e.data) {
            if (typeof e.data === 'string' && e.data.charAt(0) === '<') {
                // e.data contains the iframe's serialized document
                iFrameTarget = e.source;
            } else if (e.data === "I'm here") {
                iFrameTakenOver = false;
            }
            timer = setInterval(call_iFrame, 5000); // check iframe presence in 5 seconds
        }
    }
});

function call_iFrame() {
    iFrameTarget.postMessage('Salute', '*'); // replace '*' with the iframe's origin
    iFrameTakenOver = true;
}
If the iframe does not answer with its "I'm here" message, `iFrameTakenOver` stays `true`; checking that flag will tell you whether an error has occurred.