Handling File Uploads When Offline With Service Worker

Front-end · unresolved · 3 answers · 1769 views

遇见更好的自我 · asked 2021-02-05 11:34

We have a web app (built using AngularJS) that we're gradually adding PWA 'features' to (service worker, launchable, notifications, etc). One of the features our web app has …

3 Answers

一生所求 · answered 2021-02-05 12:06

    The Cache API is designed to store a request (as the key) and a response (as the value) in order to cache content from the server for the web page. Here, we're talking about caching user input for future dispatch to the server. In other words, we're not trying to implement a cache but a message broker, and that's not currently something handled by the Service Worker spec (Source).

    You can figure it out by trying this code:

    HTML:

    <!-- Reconstructed from the element IDs referenced in the JavaScript below -->
    <button id="get">GET</button>
    <button id="post">POST</button>
    <button id="put">PUT</button>
    <button id="patch">PATCH</button>

    JavaScript:

    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/service-worker.js', { scope: '/' }).then(function (reg) {
        console.log('Registration succeeded. Scope is ' + reg.scope);
      }).catch(function (error) {
        console.log('Registration failed with ' + error);
      });
    }
    
    document.getElementById('get').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html'));
    });
    
    document.getElementById('post').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html', { method: 'POST' }));
    });
    
    document.getElementById('put').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html', { method: 'PUT' }));
    });
    
    document.getElementById('patch').addEventListener('click', async function () {
      console.log('Response: ', await fetch('50x.html', { method: 'PATCH' }));
    });
    

    Service Worker:

    self.addEventListener('fetch', function (event) {
        var response;
        event.respondWith(fetch(event.request).then(function (r) {
            response = r;
            caches.open('v1').then(function (cache) {
                cache.put(event.request, response);
            }).catch(e => console.error(e));
            return response.clone();
        }));
    });
    

    Which throws:

    TypeError: Request method 'POST' is unsupported

    TypeError: Request method 'PUT' is unsupported

    TypeError: Request method 'PATCH' is unsupported

    Since the Cache API can't be used, and following the Google guidelines, IndexedDB is the best solution as a data store for ongoing requests. The implementation of a message broker is then the responsibility of the developer, and there is no single generic implementation that will cover all of the use cases. Many parameters will determine the solution:

    • Which criteria will trigger the use of the message broker instead of the network? window.navigator.onLine? A certain timeout? Other?
    • Which criteria should be used to start trying to forward ongoing requests on the network? self.addEventListener('online', ...)? navigator.connection?
    • Should requests respect the order or should they be forwarded in parallel? In other terms, should they be considered as dependent on each other, or not?
    • If run in parallel, should they be batched to prevent a bottleneck on the network?
    • In case the network is considered available, but the requests still fail for some reason, which retry logic to implement? Exponential backoff? Other?
    • How to notify the user that their actions are in a pending state while they are?
    • ...

    This is really far too broad for a single StackOverflow answer.
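    To illustrate just one of those parameters, the retry question alone admits several answers; exponential backoff is a common choice. A minimal sketch follows — the attempt count and delay values are arbitrary illustrative choices, not part of the answer's code:

    ```javascript
    // Retry an async task with exponential backoff: on each failure, wait
    // twice as long as before, up to a fixed number of attempts.
    async function retryWithBackoff(task, { attempts = 5, baseDelayMs = 500 } = {}) {
      for (let attempt = 0; attempt < attempts; attempt++) {
        try {
          return await task();
        } catch (e) {
          // Out of attempts: surface the last error to the caller
          if (attempt === attempts - 1) throw e;
          // Delay doubles on each failure: 500 ms, 1 s, 2 s, ...
          const delay = baseDelayMs * 2 ** attempt;
          await new Promise(resolve => setTimeout(resolve, delay));
        }
      }
    }
    ```

    Wrapping each queued `fetch` in such a helper would cover the "requests still fail for some reason" case without hammering a flaky connection.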

    That being said, here is a minimal working solution:

    HTML:

    <!-- Reconstructed from the element IDs referenced in the JavaScript below -->
    <button id="get">GET</button>
    <input id="file" type="file">
    <button id="sync">Sync</button>

    JavaScript:

    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/service-worker.js', { scope: '/' }).then(function (reg) {
        console.log('Registration succeeded. Scope is ' + reg.scope);
      }).catch(function (error) {
        console.log('Registration failed with ' + error);
      });
    }
    
    document.getElementById('get').addEventListener('click', async function () {
      fetch('api');
    });
    
    document.getElementById('file').addEventListener('change', function () {
      fetch('api', { method: 'PUT', body: document.getElementById('file').files[0] });
    });
    
    document.getElementById('sync').addEventListener('click', async function () {
      navigator.serviceWorker.controller.postMessage('sync');
    });
    

    Service Worker:

    self.importScripts('https://unpkg.com/idb@5.0.1/build/iife/index-min.js');
    
    const { openDB, deleteDB, wrap, unwrap } = idb;
    
    const dbPromise = openDB('put-store', 1, {
        upgrade(db) {
            db.createObjectStore('put');
        },
    });
    
    const idbKeyval = {
        async get(key) {
            return (await dbPromise).get('put', key);
        },
        async set(key, val) {
            return (await dbPromise).put('put', val, key);
        },
        async delete(key) {
            return (await dbPromise).delete('put', key);
        },
        async clear() {
            return (await dbPromise).clear('put');
        },
        async keys() {
            return (await dbPromise).getAllKeys('put');
        },
    };
    
    self.addEventListener('fetch', function (event) {
        if (event.request.method === 'PUT') {
            let body;
            event.respondWith(event.request.blob().then(file => {
                // Retrieve the body then clone the request, to avoid "body already used" errors
                body = file;
                return fetch(new Request(event.request.url, { method: event.request.method, body }));
            }).then(response => handleResult(response, event, body)).catch(() => handleResult(null, event, body)));
    
        } else if (event.request.method === 'GET') {
            event.respondWith(fetch(event.request).then(response => {
                return response.ok ? response : caches.match(event.request);
            }).catch(() => caches.match(event.request)));
        }
    });
    
    async function handleResult(response, event, body) {
        const getRequest = new Request(event.request.url, { method: 'GET' });
        const cache = await caches.open('v1');
        await idbKeyval.set(event.request.method + '.' + event.request.url, { url: event.request.url, method: event.request.method, body });
        const returnResponse = response && response.ok ? response : new Response(body);
        cache.put(getRequest, returnResponse.clone());
        return returnResponse;
    }
    
    // Function to call when the network is supposed to be available
    
    async function sync() {
        const keys = await idbKeyval.keys();
        for (const key of keys) {
            try {
                const { url, method, body } = await idbKeyval.get(key);
                const response = await fetch(url, { method, body });
                if (response && response.ok)
                    await idbKeyval.delete(key);
            }
            catch (e) {
                console.warn(`An error occurred while trying to sync the request: ${key}`, e);
            }
        }
    }
    
    // Any message from the page (here, the 'sync' button) triggers a sync attempt
    self.addEventListener('message', sync);
    

    Some words about the solution: it caches the PUT request for future GET requests, and it also stores the PUT request in an IndexedDB database for future sync. As for the key, I was inspired by Angular's TransferHttpCacheInterceptor, which serializes backend requests on the server-side rendered page for reuse by the browser-rendered page. It uses `METHOD.URL` (the verb and the URL joined by a dot) as the key. That means a request will override another pending request with the same verb and URL.
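    That key scheme, pulled out as a standalone helper for illustration (the original code builds it inline in handleResult):

    ```javascript
    // Illustrative helper mirroring the inline key construction in handleResult:
    // the verb and URL joined by a dot, so a later request with the same verb
    // and URL overwrites the pending entry instead of queueing a duplicate.
    function requestKey(request) {
      return request.method + '.' + request.url;
    }
    ```

    So two successive PUTs to the same URL leave only the latest body queued, which matches "last write wins" semantics for file uploads.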

    This solution also assumes that the backend does not return 204 No Content in response to a PUT request, but 200 OK with the entity in the body.
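    Finally, instead of relying on the manual 'sync' button, the page could forward the same message automatically when the browser reports connectivity again, as the window 'online' event option in the bullet list suggests. A hedged sketch — wireOnlineSync is a hypothetical helper, not part of the answer's code, and window/navigator are passed as parameters only so the wiring can be exercised outside a browser:

    ```javascript
    // Hypothetical helper (not in the original answer): when the browser fires
    // 'online', post the same 'sync' message the answer's Sync button sends,
    // so queued requests are replayed without user action.
    function wireOnlineSync(win, nav) {
      win.addEventListener('online', () => {
        const controller = nav.serviceWorker && nav.serviceWorker.controller;
        if (controller) controller.postMessage('sync');
      });
    }

    // In the page: wireOnlineSync(window, navigator);
    ```

    Note that the 'online' event (like navigator.onLine) only reports network-interface state, not actual reachability of the backend, so the retry logic in the worker still has to tolerate failures.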
