fatal: early EOF fatal: index-pack failed

滥情空心 2020-11-22 10:51

I have googled and found many solutions but none work for me.

I am trying to clone from one machine by connecting to the remote server which is in the LAN network.

30 answers
  • 2020-11-22 11:31

    I've experienced the same problem: the repo was too big to be downloaded via SSH. As @elin3t recommended, I cloned over HTTP/HTTPS and then changed the remote URL in .git/config to use the SSH repo.
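
    A minimal sketch of that workaround, with placeholder URLs (example.com and the repository path are assumptions, not values from the answer): clone over HTTPS first, then point the existing remote back at SSH with git remote set-url instead of editing .git/config by hand.

    # Hypothetical URLs; substitute your own HTTPS and SSH remotes.
    git clone https://example.com/team/project.git                 # the large initial transfer goes over HTTPS
    cd project
    git remote set-url origin git@example.com:team/project.git    # switch the remote back to SSH for later fetches/pushes
    git remote -v                                                  # confirm origin now points at the SSH URL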

  • 2020-11-22 11:32

    A previous answer recommends setting core.packedGitLimit to 512m. I'd say there are reasons to think that's counterproductive on a 64-bit architecture. The documentation for core.packedGitLimit says:

    Default is 256 MiB on 32 bit platforms and 32 TiB (effectively unlimited) on 64 bit platforms. This should be reasonable for all users/operating systems, except on the largest projects. You probably do not need to adjust this value.

    If you want to try it out, check whether you have it set and then remove the setting:

    git config --show-origin core.packedGitLimit
    git config --unset --global core.packedGitLimit
    
  • 2020-11-22 11:34

    As @ingyhere said:

    Shallow Clone

    First, turn off compression:

    git config --global core.compression 0
    

    Next, let's do a partial clone to truncate the amount of info coming down:

    git clone --depth 1 <repo_URI>
    

    When that works, go into the new directory and retrieve the rest of the clone:

    git fetch --unshallow
    

    or, alternately,

    git fetch --depth=2147483647
    

    Now, do a pull:

    git pull --all
    

    Then, to solve the problem of your local branch only tracking master:

    Open your git config file (.git/config) in the editor of your choice.

    Where it says:

    [remote "origin"]
        url=<git repo url>
        fetch = +refs/heads/master:refs/remotes/origin/master
    

    change the line

    fetch = +refs/heads/master:refs/remotes/origin/master
    

    to

    fetch = +refs/heads/*:refs/remotes/origin/*
    

    Do a git fetch, and Git will now pull all of your remote branches.
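
    If you'd rather not edit .git/config by hand, the same refspec change can be made with git config; this is just an equivalent sketch of the edit above (quote the value so the shell doesn't expand the asterisks):

    git config remote.origin.fetch "+refs/heads/*:refs/remotes/origin/*"    # track all remote branches, not just master
    git fetch origin                                                        # fetch the newly tracked branches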

  • 2020-11-22 11:34

    In my case the problem was not any of the git configuration parameters, but the fact that my repository had one file exceeding the maximum file size allowed on my system. I was able to confirm this by trying to download a large file and getting a "File size limit exceeded" error on Debian.

    After that I edited my /etc/security/limits.conf file, adding the following lines at the end of it:

    *    hard    fsize    1000000
    *    soft    fsize    1000000

    To actually apply the new limit values, you need to log out and log back in.
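
    As a quick check, the per-process file size limit currently in effect can be inspected with ulimit; a rough sketch, assuming a bash shell (bash reports -f in 1024-byte blocks, and a non-root user can only raise the soft limit up to the existing hard limit):

    ulimit -f                 # show the current soft limit on file size ("unlimited" means no cap)
    ulimit -S -f 1000000      # raise only the soft limit, for this shell session, up to the hard limit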

  • 2020-11-22 11:34

    I had the same problem. I even tried to download the project directly from the website as a zip file, but the download got interrupted at the exact same percentage.

    This single line fixed my problem like a charm:

    git config --global core.compression 0
    

    I know other answers have mentioned this, but no one here mentioned that this line alone can fix the problem.

    Hope it helps.

  • 2020-11-22 11:35

    Setting the config below doesn't work for me:

    [core] 
    packedGitLimit = 512m 
    packedGitWindowSize = 512m 
    [pack] 
    deltaCacheSize = 2047m 
    packSizeLimit = 2047m 
    windowMemory = 2047m
    

    As in a previous comment, it might be a memory issue in git. So I tried reducing the number of working threads (from 32 to 8) so that it doesn't fetch as much data from the server at the same time. I also added "-f" to force syncing of the other projects.

    -f: Proceed with syncing other projects even if a project fails to sync.
    

    It works fine now:

    repo sync -f -j8
    