It's definitely possible, but it's a little hacky. I've been doing it myself for quite some time using wget. The trick is to make the server think the request is coming from a browser, and for that you need three things:
- The Download Link (The actual link to the file)
- Link Referrer (The webpage with the download button)
- Zippyshare Session ID (Found in Cookies)
Here's where to find each item in your browser: the download link is the URL behind the download button, the referrer is the address of the file page itself, and the session ID is the value of the JSESSIONID cookie the site sets (visible in your browser's developer tools under cookies/storage).
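If you'd rather confirm the cookie from the command line, here's a minimal sketch (the cookies.txt filename is just an example). It fetches the file page once, keeps the session cookies that wget would otherwise discard, and greps out the JSESSIONID. Keep in mind the download link is tied to the session that generated it, so if you copied the link from your browser, use the browser's cookie rather than this one:
# fetch the file page once, keeping session cookies, then print JSESSIONID
wget -q -O /dev/null \
    --keep-session-cookies --save-cookies cookies.txt \
    'http://www16.zippyshare.com/v/29887835/file.html'
grep JSESSIONID cookies.txt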
Now open up your terminal and use the following command, replacing the placeholders with your own values:
wget <download_link> \
--referer='<referrer>' \
--cookies=off --header "Cookie: JSESSIONID=<session_id>" \
--user-agent='Mozilla/5.0 (Windows NT 6.0) Gecko/20100101 Firefox/14.0.1'
Example:
wget http://www16.zippyshare.com/d/29887835/8895183/hello.txt \
--referer='http://www16.zippyshare.com/v/29887835/file.html' \
--cookies=off --header "Cookie: JSESSIONID=26458C0893BF69F88EB5743D74FE0F8C" \
--user-agent='Mozilla/5.0 (Windows NT 6.0) Gecko/20100101 Firefox/14.0.1'
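If you do this regularly, it's convenient to wrap the whole thing in a small shell function. This is just a sketch; zippy_get is a made-up name, and the three arguments are the same items listed above:
# usage: zippy_get <download_link> <referrer> <session_id>
zippy_get() {
    wget "$1" \
        --referer="$2" \
        --cookies=off --header "Cookie: JSESSIONID=$3" \
        --user-agent='Mozilla/5.0 (Windows NT 6.0) Gecko/20100101 Firefox/14.0.1'
}
# e.g.
zippy_get 'http://www16.zippyshare.com/d/29887835/8895183/hello.txt' \
    'http://www16.zippyshare.com/v/29887835/file.html' \
    26458C0893BF69F88EB5743D74FE0F8C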
Original answer: How to use wget to download from hosting sites?
Note: in the command the option really is spelled 'referer', not 'referrer'. The wget flag follows the HTTP Referer header, which has carried that misspelling since the original spec.
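If you want to double-check which spelling your wget build expects, you can grep its help output; it should print a line mentioning --referer=URL:
wget --help | grep -i referer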