Handling File Uploads When Offline With Service Worker

遇见更好的自我 2021-02-05 11:34

We have a web app (built using AngularJS) that we're gradually adding PWA 'features' to (service worker, launchable, notifications, etc.). One of the features our web app has …

3 Answers
  • 2021-02-05 12:06

    The Cache API is designed to store a request (as the key) and a response (as the value) in order to cache content from the server for the web page. Here, we're talking about caching user input for future dispatch to the server. In other words, we're not trying to implement a cache but a message broker, and that's not currently something handled by the Service Worker spec (Source).

    You can see this for yourself by trying the following code:

    HTML:

    <button id="get">GET</button>
    <button id="post">POST</button>
    <button id="put">PUT</button>
    <button id="patch">PATCH</button>
    

    JavaScript:

    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/service-worker.js', { scope: '/' }).then(function (reg) {
        console.log('Registration succeeded. Scope is ' + reg.scope);
      }).catch(function (error) {
        console.log('Registration failed with ' + error);
      });
    }
    
    document.getElementById('get').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html'));
    });
    
    document.getElementById('post').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html', { method: 'POST' }));
    });
    
    document.getElementById('put').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html', { method: 'PUT' }));
    });
    
    document.getElementById('patch').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html', { method: 'PATCH' }));
    });
    

    Service Worker:

    self.addEventListener('fetch', function (event) {
        event.respondWith(fetch(event.request).then(function (response) {
            // Clone the response synchronously: a response body can only be consumed once
            var copy = response.clone();
            caches.open('v1').then(function (cache) {
                cache.put(event.request, copy);
            }).catch(e => console.error(e));
            return response;
        }));
    });
    

    Which throws:

    TypeError: Request method 'POST' is unsupported

    TypeError: Request method 'PUT' is unsupported

    TypeError: Request method 'PATCH' is unsupported

    Since the Cache API can't be used, and following Google's guidelines, IndexedDB is the best choice as a data store for pending requests. The implementation of a message broker is then the responsibility of the developer, and there is no single generic implementation that will cover all use cases. Many parameters will determine the solution:

    • Which criteria should trigger the use of the message broker instead of the network? window.navigator.onLine? A certain timeout? Something else?
    • Which criteria should be used to start forwarding the pending requests over the network? self.addEventListener('online', ...)? navigator.connection?
    • Should the requests be replayed in order, or can they be forwarded in parallel? In other words, should they be considered dependent on each other or not?
    • If run in parallel, should they be batched to prevent a bottleneck on the network?
    • If the network is considered available but the requests still fail for some reason, which retry logic should be implemented? Exponential backoff (see the sketch after this list)? Something else?
    • How should the user be notified that their actions are in a pending state while they are?
    • ...

    This is really too broad for a single Stack Overflow answer.
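
    As one illustration, the retry logic alone could look something like the following exponential backoff helper. This is a minimal sketch; the maxRetries and baseDelay values are arbitrary assumptions made for the example:

    // Minimal sketch of an exponential backoff retry helper (illustrative only).
    // maxRetries and baseDelay are arbitrary values chosen for the example.
    async function fetchWithBackoff(request, maxRetries = 5, baseDelay = 1000) {
      for (let attempt = 0; attempt < maxRetries; attempt++) {
        try {
          const response = await fetch(request.clone());
          if (response.ok) return response;
        } catch (e) {
          // Network error: fall through to the backoff delay below
        }
        // Wait 1s, 2s, 4s, 8s, ... before the next attempt
        await new Promise(resolve => setTimeout(resolve, baseDelay * 2 ** attempt));
      }
      throw new Error('Request failed after ' + maxRetries + ' attempts');
    }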

    That being said, here is a minimal working solution:

    HTML:

    <input id="file" type="file">
    <button id="sync">SYNC</button>
    <button id="get">GET</button>
    

    JavaScript:

    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/service-worker.js', { scope: '/' }).then(function (reg) {
        console.log('Registration succeeded. Scope is ' + reg.scope);
      }).catch(function (error) {
        console.log('Registration failed with ' + error);
      });
    }
    
    document.getElementById('get').addEventListener('click', async function () {
      fetch('api');
    });
    
    document.getElementById('file').addEventListener('change', function () {
      fetch('api', { method: 'PUT', body: document.getElementById('file').files[0] });
    });
    
    document.getElementById('sync').addEventListener('click', async function () {
      navigator.serviceWorker.controller.postMessage('sync');
    });
    

    Service Worker:

    self.importScripts('https://unpkg.com/idb@5.0.1/build/iife/index-min.js');
    
    const { openDB, deleteDB, wrap, unwrap } = idb;
    
    const dbPromise = openDB('put-store', 1, {
        upgrade(db) {
            db.createObjectStore('put');
        },
    });
    
    const idbKeyval = {
        async get(key) {
            return (await dbPromise).get('put', key);
        },
        async set(key, val) {
            return (await dbPromise).put('put', val, key);
        },
        async delete(key) {
            return (await dbPromise).delete('put', key);
        },
        async clear() {
            return (await dbPromise).clear('put');
        },
        async keys() {
            return (await dbPromise).getAllKeys('put');
        },
    };
    
    self.addEventListener('fetch', function (event) {
        if (event.request.method === 'PUT') {
            let body;
            event.respondWith(event.request.blob().then(file => {
                // Retrieve the body then clone the request, to avoid "body already used" errors
                body = file;
                return fetch(new Request(event.request.url, { method: event.request.method, body }));
            }).then(response => handleResult(response, event, body)).catch(() => handleResult(null, event, body)));
    
        } else if (event.request.method === 'GET') {
            event.respondWith(fetch(event.request).then(response => {
                return response.ok ? response : caches.match(event.request);
            }).catch(() => caches.match(event.request)));
        }
    });
    
    async function handleResult(response, event, body) {
        const getRequest = new Request(event.request.url, { method: 'GET' });
        const cache = await caches.open('v1');
        await idbKeyval.set(event.request.method + '.' + event.request.url, { url: event.request.url, method: event.request.method, body });
        const returnResponse = response && response.ok ? response : new Response(body);
        cache.put(getRequest, returnResponse.clone());
        return returnResponse;
    }
    
    // Function to call when the network is supposed to be available
    
    async function sync() {
        const keys = await idbKeyval.keys();
        for (const key of keys) {
            try {
                const { url, method, body } = await idbKeyval.get(key);
                const response = await fetch(url, { method, body });
                if (response && response.ok)
                    await idbKeyval.delete(key);
            }
            catch (e) {
                console.warn(`An error occurred while trying to sync the request: ${key}`, e);
            }
        }
    }
    
    self.addEventListener('message', sync);
    

    A few words about the solution: it caches the PUT request so it can answer future GET requests, and it also stores the PUT request in an IndexedDB database for a future sync. For the key, I was inspired by Angular's TransferHttpCacheInterceptor, which serializes backend requests on the server-side rendered page for reuse by the browser-rendered page. It uses <verb>.<url> as the key, which assumes that a request will override another request with the same verb and URL.

    This solution also assumes that the backend does not return 204 No Content in response to a PUT request, but 200 with the entity in the body.
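
    If your backend does answer PUT with 204, one possible adaptation (a sketch, not part of the original solution) is to treat the empty response as unusable for caching and synthesize a response from the stored body instead:

    // Sketch: adapted handleResult for backends that answer PUT with 204 No Content.
    // Only the returnResponse computation changes compared to the solution above.
    async function handleResult(response, event, body) {
        const getRequest = new Request(event.request.url, { method: 'GET' });
        const cache = await caches.open('v1');
        await idbKeyval.set(event.request.method + '.' + event.request.url, { url: event.request.url, method: event.request.method, body });
        // A 204 has no body, so cache a synthetic response built from the uploaded body instead
        const hasContent = response && response.ok && response.status !== 204;
        const returnResponse = hasContent ? response : new Response(body);
        cache.put(getRequest, returnResponse.clone());
        return returnResponse;
    }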

  • 2021-02-05 12:15

    One way to handle file uploads/deletes (and almost everything else) is to keep track of all the changes made while offline. We can create a sync object with two arrays inside: one for pending files that will need to be uploaded and one for deleted files that will need to be removed once we get back online.

    tl;dr

    Key phases


    1. Service Worker Installation


      • Along with the static assets, we make sure to fetch dynamic data such as the main listing of our uploaded files (in this example, a GET on /uploads returns JSON data with the files).

    2. Service Worker Fetch


      • In the service worker fetch event handler, if the fetch fails, we have to handle the requests for the file listing, the requests that upload a file to the server and the requests that delete a file from the server. If the request matches none of these, we return a match from the default cache.

        • Listing GET
          We get the cached listing object (in our case /uploads) and the sync object. We concatenate the default listing files with the pending files, remove the deleted files, and return a new Response object with a JSON result, just as the server would have returned it.
        • Uploading PUT
          We get the cached listing files and the sync pending files from the cache. If the file isn't already present, we create a new cache entry for that file, using the mime type and the blob from the request to build a new Response object that will be saved to the default cache.
        • Deleting DELETE
          We check the cached uploads, and if the file is present we delete the entry from both the listing array and the cached files. If the file is pending we just delete the entry from the pending array; otherwise, if it's not already in the deleted array, we add it. We update the listing, file and sync object caches at the end.

    3. Syncing


      • When the online event gets triggered, we try to synchronize with the server. We read the sync cache.

        • If there are pending files, then we get each file Response object from cache and we send a PUT fetch request back to the server.
        • If there are deleted files, then we send a DELETE fetch request for each file to the server.
        • Finally, we reset the sync cache object.

    Code implementation


    (Please read the inline comments)

    Service Worker Install

    const cacheName = 'pwasndbx';
    const syncCacheName = 'pwasndbx-sync';
    const pendingName = '__pending';
    const syncName = '__sync';
    
    const filesToCache = [
      '/',
      '/uploads',
      '/styles.css',
      '/main.js',
      '/utils.js',
      '/favicon.ico',
      '/manifest.json',
    ];
    
    /* Start the service worker and cache all of the app's content */
    self.addEventListener('install', function(e) {
      console.log('SW:install');
    
      e.waitUntil(Promise.all([
        caches.open(cacheName).then(async function(cache) {
          let cacheAdds = [];
    
          try {
            // Get all the files from the uploads listing
            const res = await fetch('/uploads');
            const { data = [] } = await res.json();
            const files = data.map(f => `/uploads/${f}`);
    
            // Cache all uploads files urls
            cacheAdds.push(cache.addAll(files));
          } catch(err) {
            console.warn('PWA:install:fetch(uploads):err', err);
          }
    
          // Also add our static files to the cache
          cacheAdds.push(cache.addAll(filesToCache));
          return Promise.all(cacheAdds);
        }),
        // Create the sync cache object
        caches.open(syncCacheName).then(cache => cache.put(syncName, jsonResponse({
      pending: [], // For storing the pending files that will later be synced
      deleted: []  // For storing the files that will later be deleted on sync
        }))),
      ])
      );
    });
    

    Service Worker Fetch

    self.addEventListener('fetch', function(event) {
      // Clone request so we can consume data later
      const request = event.request.clone();
      const { method, url, headers } = event.request;
    
      event.respondWith(
        fetch(event.request).catch(async function(err) {
          const { headers, method, url } = event.request;
    
          // A custom header that we set to indicate the requests come from our syncing method
          // so we won't try to fetch anything from cache, we need syncing to be done on the server
          const xSyncing = headers.get('X-Syncing');
    
          if(xSyncing && xSyncing.length) {
            return caches.match(event.request);
          }
    
          switch(method) {
            case 'GET':
              // Handle listing data for /uploads and return JSON response
              break;
            case 'PUT':
              // Handle upload to cache and return success response
              break;
            case 'DELETE':
              // Handle delete from cache and return success response
              break;
          }
    
          // If we meet no specific criteria, then lookup to the cache
          return caches.match(event.request);
        })
      );
    });
    
    function jsonResponse(data, status = 200) {
      return new Response(data && JSON.stringify(data), {
        status,
        headers: {'Content-Type': 'application/json'}
      });
    }
    

    Service Worker Fetch Listing GET

    if(url.match(/\/uploads\/?$/)) { // Failed to get the uploads listing
      // Get the uploads data from cache
      const uploadsRes = await caches.match(event.request);
      let { data: files = [] } = await uploadsRes.json();
    
      // Get the sync data from cache
      const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
      const sync = await syncRes.json();
    
      // Return the files from uploads + pending files from sync - deleted files from sync
      const data = files.concat(sync.pending).filter(f => sync.deleted.indexOf(f) < 0);
    
      // Return a JSON response with the updated data
      return jsonResponse({
        success: true,
        data
      });
    }
    

    Service Worker Fetch Uploading PUT

    // Get our custom headers
    const filename = headers.get('X-Filename');
    const mimetype = headers.get('X-Mimetype');
    
    if(filename && mimetype) {
      // Get the uploads data from cache
      const uploadsRes = await caches.match('/uploads', { cacheName });
      let { data: files = [] } = await uploadsRes.json();
    
      // Get the sync data from cache
      const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
      const sync = await syncRes.json();
    
      // If the file exists in the uploads or in the pendings, then return a 409 Conflict response
      if(files.indexOf(filename) >= 0 || sync.pending.indexOf(filename) >= 0) {
        return jsonResponse({ success: false }, 409);
      }
    
      caches.open(cacheName).then(async (cache) => {
        // Write the file to the cache using the request we cloned at the beginning
        const data = await request.blob();
        cache.put(`/uploads/${filename}`, new Response(data, {
          headers: { 'Content-Type': mimetype }
        }));
    
        // Write the updated files data to the uploads cache
        cache.put('/uploads', jsonResponse({ success: true, data: files }));
      });
    
      // Add the file to the sync pending data and update the sync cache object
      sync.pending.push(filename);
      caches.open(syncCacheName).then(cache => cache.put(new Request(syncName), jsonResponse(sync)));
    
      // Return a success response with fromSw set to true so we know this response came from the service worker
      return jsonResponse({ success: true, fromSw: true });
    }
    

    Service Worker Fetch Deleting DELETE

    // Get our custom headers
    const filename = headers.get('X-Filename');
    
    if(filename) {
      // Get the uploads data from cache
      const uploadsRes = await caches.match('/uploads', { cacheName });
      let { data: files = [] } = await uploadsRes.json();
    
      // Get the sync data from cache
      const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
      const sync = await syncRes.json();
    
      // Check if the file is already pending or deleted
      const pendingIndex = sync.pending.indexOf(filename);
      const uploadsIndex = files.indexOf(filename);
    
      if(pendingIndex >= 0) {
        // If it's pending, then remove it from pending sync data
        sync.pending.splice(pendingIndex, 1);
      } else if(sync.deleted.indexOf(filename) < 0) {
        // If it's not in pending and not already in sync for deleting,
        // then add it for delete when we'll sync with the server
        sync.deleted.push(filename);
      }
    
      // Update the sync cache
      caches.open(syncCacheName).then(cache => cache.put(new Request(syncName), jsonResponse(sync)));
    
      // If the file is in the uploads data
      if(uploadsIndex >= 0) {
        // Update the uploads data
        files.splice(uploadsIndex, 1);
        caches.open(cacheName).then(async (cache) => {
          // Remove the file from the cache
          cache.delete(`/uploads/${filename}`);
          // Update the uploads data cache
          cache.put('/uploads', jsonResponse({ success: true, data: files }));
        });
      }
    
      // Return a JSON success response
      return jsonResponse({ success: true });
    }
    

    Syncing

    // Get the sync data from cache
    const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
    const sync = await syncRes.json();
    
    // If there are pending files, send them to the server
    if(sync.pending && sync.pending.length) {
      sync.pending.forEach(async (file) => {
        const url = `/uploads/${file}`;
        const fileRes = await caches.match(url);
        const data = await fileRes.blob();
    
        fetch(url, {
          method: 'PUT',
          headers: {
            'X-Filename': file,
        'X-Syncing': 'syncing' // Tell the SW fetch handler we are syncing, so it ignores this fetch
          },
          body: data
        }).catch(err => console.log('sync:pending:PUT:err', file, err));
      });
    }
    
    // If there are deleted files, send a DELETE request for each one to the server
    if(sync.deleted && sync.deleted.length) {
      sync.deleted.forEach(async (file) => {
        const url = `/uploads/${file}`;
    
        fetch(url, {
          method: 'DELETE',
          headers: {
            'X-Filename': file,
        'X-Syncing': 'syncing' // Tell the SW fetch handler we are syncing, so it ignores this fetch
          }
        }).catch(err => console.log('sync:deleted:DELETE:err', file, err));
      });
    }
    
    // Update and reset the sync cache object
    caches.open(syncCacheName).then(cache => cache.put(syncName, jsonResponse({
      pending: [],
      deleted: []
    })));
    

    Example PWA


    I have created an example PWA that implements all of this, which you can find and test here. I have tested it with Chrome and Firefox on desktop, and with Firefox for Android on a mobile device.

    You can find the full source code of the application (including an Express server) in this GitHub repository: https://github.com/clytras/pwa-sandbox.

  • 2021-02-05 12:25

    When the user selects a file via an <input type="file"> element, we can get the selected file(s) via fileInput.files. This gives us a FileList object, each item of which is a File object representing a selected file. FileList and File are supported by HTML5's structured clone algorithm.

    When you add an item to an IndexedDB store, a structured clone of the value is created and stored. Since FileList and File objects are supported by the structured clone algorithm, we can store these objects in IndexedDB directly.
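
    As a minimal sketch (the 'offline-uploads' database name, the 'files' store name and the helper names are assumptions made for illustration), storing a selected File directly could look like this:

    // Open (or create) a hypothetical IndexedDB database for queued uploads.
    function openUploadsDb() {
      return new Promise((resolve, reject) => {
        const request = indexedDB.open('offline-uploads', 1);
        request.onupgradeneeded = () => {
          request.result.createObjectStore('files', { autoIncrement: true });
        };
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
      });
    }

    // Store the selected File object directly; the structured clone algorithm handles it.
    async function queueFile(file) {
      const db = await openUploadsDb();
      return new Promise((resolve, reject) => {
        const tx = db.transaction('files', 'readwrite');
        tx.objectStore('files').add(file);
        tx.oncomplete = () => resolve();
        tx.onerror = () => reject(tx.error);
      });
    }

    // Usage: fileInput.addEventListener('change', () => queueFile(fileInput.files[0]));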

    To perform those file uploads once the user goes online again, you can use the Background Sync feature of service workers. Here's an introductory article on how to do that. There are a lot of other resources for that as well.
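
    A sketch of how the page could register a one-off background sync after queuing the file (the 'upload-files' tag is an arbitrary assumption, and Background Sync is not available in every browser):

    // Page-side sketch: ask the service worker to sync once connectivity returns.
    async function requestUploadSync() {
      if ('serviceWorker' in navigator && 'SyncManager' in window) {
        const registration = await navigator.serviceWorker.ready;
        await registration.sync.register('upload-files');
      } else {
        // Fallback: try to upload immediately, or listen for the 'online' event
      }
    }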

    To include file attachments in your request once your background sync code runs, you can use FormData. FormData allows adding File objects to the request body that will be sent to your backend, and it is available within the service worker context.
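
    Putting the pieces together, a hedged sketch of the service worker side could look like the following. The 'upload-files' tag, the 'offline-uploads'/'files' store, the '/upload' endpoint and the 'file' field name are all assumptions made for illustration:

    // Service-worker-side sketch: replay queued files on background sync using FormData.
    self.addEventListener('sync', event => {
      if (event.tag === 'upload-files') {
        event.waitUntil(uploadQueuedFiles());
      }
    });

    async function uploadQueuedFiles() {
      const db = await openUploadsDb(); // same helper as in the page-side sketch
      const files = await new Promise((resolve, reject) => {
        const request = db.transaction('files').objectStore('files').getAll();
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
      });

      for (const file of files) {
        const formData = new FormData();
        formData.append('file', file, file.name);
        // The '/upload' endpoint is hypothetical; use your backend's upload URL
        await fetch('/upload', { method: 'POST', body: formData });
      }
      // On success you would also clear the object store (omitted for brevity)
    }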
