How can I efficiently download a large file using Go?

Backend · unresolved · 4 answers · 1977 views
悲哀的现实, asked 2021-01-29 18:48

Is there a way to download a large file using Go that will store the content directly into a file instead of storing it all in memory before writing it to a file? The file may be too large to hold in memory all at once.

4 answers
  • 2021-01-29 18:51

    A more descriptive version of Steve M's answer.

    import (
        "fmt"
        "io"
        "net/http"
        "os"
    )
    
    func downloadFile(filepath string, url string) (err error) {
    
      // Create the file
      out, err := os.Create(filepath)
      if err != nil {
        return err
      }
      defer out.Close()
    
      // Get the data
      resp, err := http.Get(url)
      if err != nil {
        return err
      }
      defer resp.Body.Close()
    
      // Check server response
      if resp.StatusCode != http.StatusOK {
        return fmt.Errorf("bad status: %s", resp.Status)
      }
    
      // Write the body to the file
      _, err = io.Copy(out, resp.Body)
      if err != nil {
        return err
      }
    
      return nil
    }
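    A quick way to try the function end to end, using a local `httptest` server in place of a real URL (the server and destination path here are illustrative, not part of the answer above):

    ```go
    package main

    import (
    	"fmt"
    	"io"
    	"net/http"
    	"net/http/httptest"
    	"os"
    	"path/filepath"
    )

    // downloadFile streams the response body straight to disk,
    // same logic as the answer above.
    func downloadFile(filepath string, url string) error {
    	out, err := os.Create(filepath)
    	if err != nil {
    		return err
    	}
    	defer out.Close()

    	resp, err := http.Get(url)
    	if err != nil {
    		return err
    	}
    	defer resp.Body.Close()

    	if resp.StatusCode != http.StatusOK {
    		return fmt.Errorf("bad status: %s", resp.Status)
    	}

    	_, err = io.Copy(out, resp.Body)
    	return err
    }

    func main() {
    	// Hypothetical local server standing in for a real download URL.
    	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
    		fmt.Fprint(w, "pretend this is a very large file")
    	}))
    	defer srv.Close()

    	dst := filepath.Join(os.TempDir(), "output.bin")
    	if err := downloadFile(dst, srv.URL); err != nil {
    		fmt.Println("download failed:", err)
    		return
    	}
    	info, _ := os.Stat(dst)
    	fmt.Printf("saved %d bytes to %s\n", info.Size(), dst)
    	os.Remove(dst)
    }
    ```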
    
  • 2021-01-29 19:06

    The answer selected above using io.Copy is exactly what you need, but if you are interested in additional features like resuming broken downloads, auto-naming files, checksum validation, or monitoring the progress of multiple downloads, check out the grab package.
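    If all you need is progress reporting, you can stay in the standard library: wrap the response body in an `io.TeeReader` that feeds a byte counter while `io.Copy` streams to disk. A minimal sketch (the `progressWriter` type and the local test server are illustrative, not part of the grab package):

    ```go
    package main

    import (
    	"fmt"
    	"io"
    	"net/http"
    	"net/http/httptest"
    	"os"
    	"strings"
    )

    // progressWriter is a hypothetical helper that counts bytes as
    // they stream past; io.Copy never buffers the whole file.
    type progressWriter struct {
    	total int64
    }

    func (p *progressWriter) Write(b []byte) (int, error) {
    	p.total += int64(len(b))
    	fmt.Printf("\rdownloaded %d bytes", p.total)
    	return len(b), nil
    }

    func main() {
    	// Local test server standing in for a real 1 MiB download.
    	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
    		io.Copy(w, strings.NewReader(strings.Repeat("x", 1<<20)))
    	}))
    	defer srv.Close()

    	resp, err := http.Get(srv.URL)
    	if err != nil {
    		panic(err)
    	}
    	defer resp.Body.Close()

    	out, err := os.CreateTemp("", "download-*")
    	if err != nil {
    		panic(err)
    	}
    	defer out.Close()
    	defer os.Remove(out.Name())

    	pw := &progressWriter{}
    	// TeeReader reports every chunk to pw while io.Copy writes it to out.
    	if _, err := io.Copy(out, io.TeeReader(resp.Body, pw)); err != nil {
    		panic(err)
    	}
    	fmt.Printf("\ndone: %d bytes\n", pw.total)
    }
    ```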

  • 2021-01-29 19:08
    1. Here is a sample: https://github.com/thbar/golang-playground/blob/master/download-files.go

    2. Here is some code that might help:

    import (
        "fmt"
        "io/ioutil"
        "log"
        "net/http"
    )

    func HTTPDownload(uri string) ([]byte, error) {
        fmt.Printf("HTTPDownload From: %s.\n", uri)
        res, err := http.Get(uri)
        if err != nil {
            log.Fatal(err)
        }
        defer res.Body.Close()
        d, err := ioutil.ReadAll(res.Body)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Printf("ReadFile: Size of download: %d\n", len(d))
        return d, err
    }
    
    func WriteFile(dst string, d []byte) error {
        fmt.Printf("WriteFile: Size of download: %d\n", len(d))
        err := ioutil.WriteFile(dst, d, 0444)
        if err != nil {
            log.Fatal(err)
        }
        return err
    }
    
    func DownloadToFile(uri string, dst string) {
        fmt.Printf("DownloadToFile From: %s.\n", uri)
        if d, err := HTTPDownload(uri); err == nil {
            fmt.Printf("downloaded %s.\n", uri)
            if WriteFile(dst, d) == nil {
                fmt.Printf("saved %s as %s\n", uri, dst)
            }
        }
    }
    
  • 2021-01-29 19:17

    I'll assume you mean download via http (error checks omitted for brevity):

    import ("net/http"; "io"; "os")
    ...
    out, err := os.Create("output.txt")
    defer out.Close()
    ...
    resp, err := http.Get("http://example.com/")
    defer resp.Body.Close()
    ...
    n, err := io.Copy(out, resp.Body)
    

    The http.Response's Body is an io.Reader, so you can pass it to any function that accepts a Reader, for example to read one chunk at a time rather than everything at once. In this specific case, io.Copy() does the grunt work for you.
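    A sketch of the chunk-at-a-time approach mentioned above, reading the body through a fixed 32 KiB buffer (the local test server stands in for a real URL):

    ```go
    package main

    import (
    	"fmt"
    	"io"
    	"net/http"
    	"net/http/httptest"
    	"strings"
    )

    func main() {
    	// Local server standing in for http://example.com/ (illustrative).
    	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
    		io.Copy(w, strings.NewReader(strings.Repeat("a", 100000)))
    	}))
    	defer srv.Close()

    	resp, err := http.Get(srv.URL)
    	if err != nil {
    		panic(err)
    	}
    	defer resp.Body.Close()

    	buf := make([]byte, 32*1024) // one 32 KiB chunk at a time
    	var total int
    	for {
    		n, err := resp.Body.Read(buf)
    		total += n
    		// process buf[:n] here instead of buffering the whole body
    		if err == io.EOF {
    			break
    		}
    		if err != nil {
    			panic(err)
    		}
    	}
    	fmt.Println("read", total, "bytes in chunks")
    }
    ```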
