- Fast (downloads in multiple parallel connections)
- Ability to interrupt/resume downloads (task management)
- Support for proxies (SOCKS5 or HTTP)
- Bandwidth limiting
- Can read a list of URLs to download from a file
$ go get github.com/abzcoding/hget
$ cd $GOPATH/src/github.com/abzcoding/hget
$ make clean install
The binary will be built at ./bin/hget; you can copy it to /usr/bin or /usr/local/bin, and even alias wget to hget
to replace wget entirely :P
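A minimal sketch of that, assuming a bash/zsh shell and that /usr/local/bin is on your PATH (the paths and alias syntax are illustrative, not part of hget itself):

$ sudo cp ./bin/hget /usr/local/bin/hget
$ alias wget=hget   # bash/zsh syntax; add it to your shell rc file to make it permanent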
hget [-n parallel] [-skip-tls false] [-rate bwRate] [-proxy proxy_server] [-file filename] [URL] # to download a URL with n connections, without skipping TLS certificate verification
hget -resume TaskName # to resume a task
hget -proxy "127.0.0.1:12345" URL # to download using socks5 proxy
hget -proxy "http://sample-proxy.com:8080" URL # to download using http proxy
hget -file sample.txt # to download a list of urls
hget -n 4 -rate 100KB URL # to download using 4 threads & limited to 100KB per second
# real world example
hget -n 16 -rate 10MiB "https://releases.ubuntu.com/24.04.1/ubuntu-24.04.1-desktop-amd64.iso"
# resuming a stopped download
hget -resume "ubuntu-24.04.1-desktop-amd64.iso"
$ hget -h
Usage of hget:
-file string
path to a file that contains one URL per line
-n int
number of connections (default 12)
-proxy string
proxy for downloading, e.g. -proxy '127.0.0.1:12345' for socks5 or -proxy 'http://proxy.com:8080' for http proxy
-rate string
bandwidth limit during download, e.g. -rate 10kB or -rate 10MiB
-resume string
resume download task with given task name (or URL)
-skip-tls
skip certificate verification for https (default true)
To interrupt a running download, just press Ctrl-C (or Ctrl-D) in the middle of the download; hget will safely save its state so you can resume the task later.
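Putting the two together, a sketch of the interrupt/resume workflow (the URL is illustrative; the task name is assumed to match the saved file name, as in the Ubuntu example above):

# start a download, then press Ctrl-C partway through
hget -n 8 "https://example.com/big-file.tar.gz"
# later, resume it by task name
hget -resume "big-file.tar.gz"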