The thing: I have a page which has to display an undetermined number of images, loaded one by one through AJAX (using base64 encoding on the server side).
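For context, the blocking pattern the answers below argue against presumably looks something like this (a hypothetical reconstruction: urlAJAX, GLOB_PROCEED_FETCH, and renderImageData are taken from the answers, nextPosition is a made-up helper):
var GLOB_PROCEED_FETCH = true;
var position = 'front';
while (GLOB_PROCEED_FETCH) {
    $.ajax({
        url: urlAJAX + 'scan=' + position,
        method: 'GET',
        async: false, // blocks the UI thread until the response arrives
        success: function(data) {
            if (data.status == "success") {
                renderImageData(data);
            } else {
                GLOB_PROCEED_FETCH = false;
            }
        }
    });
    position = nextPosition(position); // hypothetical helper
}
// the browser cannot repaint until this loop ends, so nothing shows up one by one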
Wrong and wrong. Don't use timers, don't chain them. Look at jQuery's Deferred / $.when; it has everything you need.
var imgara = [];
$.each(imglist, function(i, image) {
    // collect the jqXHR promise returned by each $.ajax call
    imgara.push($.ajax({ url: urlAJAX + 'scan=' + image, method: 'GET' }));
});
$.when.apply($, imgara).done(function() {
    // do something: every request resolved
}).fail(function() {
    // do something else: at least one request failed
});
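Note that $.when fires its callbacks only once every request has settled. If you want each image rendered as soon as its own response arrives, a small variation (my sketch, reusing the urlAJAX and renderImageData names used elsewhere on this page) attaches a done handler per request:
var requests = $.map(imglist, function(image) {
    return $.ajax({ url: urlAJAX + 'scan=' + image, method: 'GET' })
        .done(renderImageData); // render each image as soon as it arrives
});
$.when.apply($, requests).done(function() {
    // all images have been fetched and rendered
});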
Try using the setInterval() function instead of while().
var fetch = setInterval(loadImage, 2000);

function loadImage() {
    position = nextPosition(position); // change the position variable here (hypothetical helper)
    getImageRequest(position);
    if (!GLOB_PROCEED_FETCH) {
        clearInterval(fetch); // stop polling once there is nothing left to fetch
    }
}
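One caveat worth flagging (my note, not part of the answer above): setInterval keeps firing every 2000 ms whether or not the previous request has finished, so a slow response can overlap the next tick. A minimal guard, assuming getImageRequest can take a completion callback (an assumption; its real signature isn't shown):
var inFlight = false;
var fetch = setInterval(function() {
    if (inFlight) return; // previous request still running; skip this tick
    inFlight = true;
    position = nextPosition(position); // hypothetical helper, as above
    getImageRequest(position, function() { // assumes a completion callback
        inFlight = false;
        if (!GLOB_PROCEED_FETCH) {
            clearInterval(fetch);
        }
    });
}, 2000);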
Your best bet would be to restructure your code to use async ajax calls and launch the next call when the previous one completes, and so on. This will allow the page to redisplay between image fetches.
It will also give the browser a chance to breathe, take care of its other housekeeping, and not conclude that the page is locked up or hung.
Also, using async: false is a bad idea (and note that async expects a Boolean: the string 'false' is truthy, so async: 'false' doesn't even make the request synchronous). I see no reason why properly structured code couldn't use asynchronous ajax calls here without hanging the browser while you're fetching this data.
You could do it with asynchronous ajax like this:
function getAllImages(position, maxImages) {
    var imgCount = 0;
    function getNextImage() {
        $.ajax({
            url: urlAJAX + 'scan=' + position,
            method: 'GET',
            async: true,
            success: function(data) {
                // stop when the server reports failure or the cap is reached
                if (data.status == "success" && imgCount < maxImages) {
                    ++imgCount;
                    renderImageData(data);
                    getNextImage(); // chain the next request only after this one succeeds
                }
            }
        });
    }
    getNextImage();
}
// no while loop is needed
// just call getAllImages() and pass it the
// position and the maxImages you want to retrieve
getAllImages('front', 20);
Also, while this may look like recursion, it isn't really recursion because of the async nature of the ajax call: getNextImage() has actually returned before the success callback invokes the next one, so the call stack never grows and it isn't technically recursion.
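A quick way to see this (an illustrative sketch, not from the original answer; '/ping' is a placeholder URL): log around an asynchronous call and note that the function returns before its callback runs, so each "recursive" call starts from an empty stack.
function demo() {
    console.log('before request');
    $.get('/ping').done(function() {
        // runs later, after demo() has already returned,
        // so the call stack is empty when the next request would start
        console.log('inside callback');
    });
    console.log('after request'); // logs before 'inside callback'
}
demo();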