gdrive-downloader is a collection of shell scripts runnable on all POSIX-compatible shells ( sh / ksh / dash / bash / zsh / etc ).
It can be used to download files or folders from Google Drive.
- Minimal
- No authentication required
- Download gdrive files and folders
- Download subfolders
- Resume Interrupted downloads
- Parallel downloading
- Pretty logging
- Easy to install and update
- Self update
- Auto update
- Can be installed per-user and invoked per-shell, so no root access or global install is required.
- Compatibility
- Installing and Updating
- Usage
- Uninstall
- How it works
- Reporting Issues
- Contributing
- License
As this is a collection of shell scripts, there aren't many dependencies. See the Native Dependencies section below for the list of explicitly required programs.
For Linux or macOS, you hopefully don't need to configure anything extra; it should work out of the box.
Install Termux.
Then, pkg install curl
and done.
It's fully tested for all use cases of this script.
Install iSH
It has not been officially tested, but it should work given the description of the app. Report if you get it working by creating an issue.
Again, it has not been officially tested on Windows, but there shouldn't be anything preventing it from working. Report if you get it working by creating an issue.
This repo contains two types of scripts: POSIX-compatible and Bash-compatible.
These programs are required by both the Bash and POSIX scripts.
Program | Role In Script |
---|---|
curl | All network requests |
xargs | For parallel downloading |
mkdir | To create folders |
rm | To remove files and folders |
grep | Miscellaneous |
sed | Miscellaneous |
mktemp | To generate temporary files ( optional ) |
sleep | Self-explanatory |
If Bash is not available, or Bash is available but its version is less than 4.x, then the programs below are also required:
Program | Role In Script |
---|---|
date | For installation, update and Miscellaneous |
stty or zsh or tput | To determine column size ( optional ) |
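Before running the scripts on an unusual system, the dependency tables above can be verified with a quick `command -v` loop. A minimal sketch (the `check_deps` helper is hypothetical, not part of the official scripts):

```shell
#!/bin/sh
# Hypothetical helper: checks whether every named program is in PATH,
# prints the missing ones, and returns non-zero if any are absent.
check_deps() {
    missing=""
    for cmd in "$@"; do
        command -v "${cmd}" >/dev/null 2>&1 || missing="${missing} ${cmd}"
    done
    if [ -n "${missing}" ]; then
        printf "Missing required programs:%s\n" "${missing}"
        return 1
    fi
}

# Programs from the tables above; "|| true" so the demo doesn't abort
# on systems where something really is missing.
check_deps curl xargs mkdir rm grep sed sleep || true
```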
You can install the script using the automatic installation script provided in the repository.
Default values set by the automatic installation script (all changeable):
Repo: Akianonymus/gdrive-downloader
Command name: gdl
Installation path: $HOME/.gdrive-downloader
Source value: master
Shell file: .bashrc or .zshrc or .profile
For custom command name, repo, shell file, etc, see advanced installation method.
Now, for automatic install script, there are two ways:
To install gdrive-downloader in your system, you can run the below command:
curl -Ls --compressed drivedl.cf | sh -s
alternatively, you can use the original github url instead of drivedl.cf
curl -Ls --compressed https://github.com/Akianonymus/gdrive-downloader/raw/master/install.sh | sh -s
and done.
This section provides information on how to utilise the install.sh script for custom use cases.
These are the flags that are available in the install.sh script:
Click to expand
-
-p | --path <dir_name>
Custom path where you want to install the script.
Note: For global installs, give a path outside of the home dir, like /usr/bin, and it must already be in the executable path.
-
-c | --cmd <command_name>
Custom command name; after installation, the script will be available as the given command.
-
-r | --repo <Username/reponame>
Install the script from your custom repo, e.g --repo Akianonymus/gdrive-downloader; make sure your repo's file structure is the same as the official repo.
-
-b | --branch <branch_name>
Specify the branch name for the GitHub repo; applies to both custom and default repos.
-
-s | --shell-rc <shell_file>
Specify a custom rc file where PATH is appended; by default the script detects .zshrc, .bashrc and .profile.
-
-t | --time 'no of days'
Specify a custom auto-update interval ( given input is taken as a number of days ) after which the script will try to automatically update itself.
Default: 5 ( 5 days )
-
--sh | --posix
Force install the POSIX scripts even if the system has a compatible Bash binary present.
-
-q | --quiet
Only show critical error/success logs.
-
--skip-internet-check
Do not check for internet connection, recommended to use in sync jobs.
-
-U | --uninstall
Uninstall the script and remove related files.
-U force can be used to remove any remnants left in the shell rc, even if the command is not installed.
-
-D | --debug
Display script command trace.
-
-h | --help
Display usage instructions.
Now, run the script and use flags according to your usecase.
E.g:
curl -Ls --compressed drivedl.cf | sh -s -- -r username/reponame -p somepath -s shell_file -c command_name -b branch_name
Run these commands before installing if gdl is already installed:
# this will remove "${HOME}/.gdrive-downloader", proceed with caution
rm -rf "${HOME}/.gdrive-downloader"
script="$(curl -Ls drivedl.cf --compressed)"
printf "%s\n" "${script}" | sh -s -- -U force
printf "%s\n" "${script}" | sh -s -- -U
Remove any previously set alias to the gdl command or any custom paths where gdl was installed with -p flag.
If you have followed the automatic method to install the script, then you can automatically update the script.
There are three methods:
-
Automatic updates
By default, script checks for update after 3 days. Use -t / --time flag of install.sh to modify the interval.
An update log is saved in "${HOME}/.gdrive-downloader/update.log".
-
Use the script itself to update the script.
gdl -u or gdl --update
This will update the script where it is installed.
If you use this flag without actually installing the script,
e.g just by
sh gdl.sh -u
then it will install the script, or update it if already installed. -
Run the installation script again.
Yes, just run the installation script again as we did in install section, and voila, it's done.
Note: The above methods always obey the values set by the user in the advanced installation,
e.g if you have installed the script from a different repo, say myrepo/gdrive-downloader, then updates will also be fetched from the same repo.
After installation, no more configuration is needed.
gdl gdrive_id/gdrive_url
The script accepts a gdrive_url or a gdrive_id as argument, provided the file or folder is publicly accessible.
Now that we have covered the basics, move on to the next section for extra features and usage, like skipping sub folders, parallel downloads, etc.
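For illustration, here's one way a file or folder ID could be pulled out of the common Drive URL shapes (a hypothetical helper, not the script's actual parser):

```shell
#!/bin/sh
# Sketch of ID extraction from the usual Drive URL shapes:
#   .../file/d/<id>/view, .../open?id=<id>, .../folders/<id>
# Bare IDs pass through unchanged.
extract_id() {
    case "$1" in
        *drive.google.com*)
            printf "%s\n" "$1" |
                sed -e 's|.*/d/||' -e 's|.*id=||' -e 's|.*/folders/||' -e 's|[/?].*||'
            ;;
        *) printf "%s\n" "$1" ;; # already a bare ID
    esac
}

extract_id "https://drive.google.com/file/d/1A2b3C4d/view"
extract_id "https://drive.google.com/open?id=1A2b3C4d"
extract_id "1A2b3C4d"
```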
These are the custom flags that are currently implemented:
-
-d | --directory 'foldername'
Custom workspace folder where given input will be downloaded.
-
-s | --skip-subdirs
Skip downloading of sub folders, in case the input is a folder.
-
-p | --parallel <no_of_files_to_parallely_download>
Download multiple files in parallel.
Note:
- This flag is only helpful if you are downloading many files which aren't big enough to utilise your full bandwidth; using it otherwise will not speed up your download and may even cause errors sometimes.
- A value of 5 to 10 is recommended. If it errors with a high value, use a smaller number.
- Be aware, this isn't magic; it obviously comes at the cost of increased cpu/ram utilisation, as it forks multiple shell processes to download ( google how xargs works with the -P option ).
-
--speed 'speed'
Limit the download speed, supported formats: 1K, 1M and 1G.
-
-R | --retry 'num of retries'
Retry the file download if it fails, with a positive integer as argument. Currently only for file downloads.
-
-l | --log 'log_file_name'
Save downloaded files info to the given filename.
-
-q | --quiet
Suppress the normal output; only show success/error download messages for files, plus one extra line at the beginning for folders showing the no. of files and sub folders.
-
-V | --verbose
Display detailed message (only for non-parallel downloads).
-
--skip-internet-check
Do not check for internet connection, recommended to use in sync jobs.
-
--info
Show detailed info about script ( if script is installed system wide ).
-
-u | --update
Update the installed script in your system, if not installed, then install.
-
--uninstall
Uninstall the installed script in your system.
-
-h | --help
Display usage instructions.
-
-D | --debug
Display script command trace.
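The behaviour of the -p flag is easiest to picture as `xargs -P` fanning a list of IDs out to that many worker processes, which is the mechanism the parallel-download note above hints at. A simplified sketch, not the script's actual invocation:

```shell
#!/bin/sh
# Each input line becomes one worker invocation; -P 5 caps the number
# of concurrent workers. A real worker would call the download function
# instead of printf. Output order may vary between runs.
printf "%s\n" id1 id2 id3 id4 |
    xargs -n 1 -P 5 sh -c 'printf "downloading %s\n" "$0"'
```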
You can use multiple inputs without any extra hassle.
Pass arguments normally, e.g: gdl url1 url2 id1 id2
where url1 and url2 are drive urls and the remaining two are gdrive ids.
Downloads interrupted either due to a bad internet connection or manual interruption can be resumed from the same position.
You can interrupt as many times as you want; it will resume ( hopefully ).
It will not download again if file is already present, thus avoiding bandwidth waste.
If you have followed the automatic method to install the script, then you can automatically uninstall the script.
There are two methods:
-
Use the script itself to uninstall the script.
gdl --uninstall
This will remove the script related files and remove path change from shell file.
-
Run the installation script again with -U/--uninstall flag
curl -Ls --compressed drivedl.cf | sh -s -- --uninstall
Yes, just run the installation script again with the flag and voila, it's done.
Note: Above methods always obey the values set by user in advanced installation.
In this section, the mechanism of the script is explained, for anyone curious how it downloads folders, since that is not supported officially.
The main catch here is that the script uses the Drive API to fetch details of a given file or folder id/url. But then how does it work without authentication?
Well, it does use an API key, but one is provided in the script. I grabbed the API key from the Drive file page: just open a gdrive folder in a browser, open the console and watch the network requests, open one of the POST requests and there you have it.
Also, the Google API key has a check for the referer, so we pass the referer https://drive.google.com
with curl to properly use the key.
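Put together, the metadata request looks roughly like this curl call (the endpoint shape, fields and key value here are illustrative; the real key is embedded in the script):

```shell
#!/bin/sh
# Illustrative Drive API v3 metadata request. FILE_ID and API_KEY are
# placeholders; the Referer header is what makes the embedded key valid.
FILE_ID="some_public_file_id"
API_KEY="key_embedded_in_the_script"
url="https://www.googleapis.com/drive/v3/files/${FILE_ID}?fields=name,size,mimeType&key=${API_KEY}"
# -s silences progress, --max-time avoids hanging without network,
# "|| true" keeps the demo from aborting if the request fails
curl -s --max-time 10 -H "Referer: https://drive.google.com" "${url}" || true
```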
Now, next steps are simple enough:
Main Function: _check_id
It parses the input and extracts the file_id, then makes a network request to fetch the name, size and mimeType of the id.
If the request doesn't return an HTTP 40x status, it proceeds.
Depending on whether the id is a file or a folder, one of the two functions below is called.
Main Function: _download_file
Before downloading, the script checks if the file is already present. If present, it compares the local file size to the remote file size and resumes the download if applicable.
Recent updates by Google have made the download links ip-specific and very strict about cookies, so a file can only be downloaded on the system where the cookies were fetched. Earlier, cookies were only needed for files greater than 100 MB.
In either case, the file can be moved to a different system and the script will resume it from the same position.
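The size check described above can be sketched as a small helper (hypothetical function name; the real script's logic also handles cookies and range requests):

```shell
#!/bin/sh
# Sketch of the resume decision: compare the local size against the
# known remote size and either skip, resume, or start fresh.
resume_action() {
    local_file=$1 remote_size=$2
    if [ -f "${local_file}" ]; then
        # $(( )) normalises wc's padded output to a bare integer
        local_size=$(( $(wc -c < "${local_file}") ))
        if [ "${local_size}" -ge "${remote_size}" ]; then
            printf "skip\n"     # already complete, save bandwidth
        else
            printf "resume\n"   # curl -C - would continue from local_size
        fi
    else
        printf "download\n"     # nothing on disk yet
    fi
}
```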
Main Function: _download_folder
First, all the file and sub folder details are fetched. Details include id and mimeType.
Then it downloads the files using the _download_file
function, and for sub folders, the _download_folder
function is called recursively.
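The recursion described above can be sketched like this (the `list_children` helper is a stand-in for the API call; the real functions also handle pagination and parallelism):

```shell
#!/bin/sh
# Skeleton of the folder walk: files go to _download_file, folders
# recurse into _download_folder. list_children is assumed to emit
# "id mimeType" pairs, one per line.
_download_folder() {
    folder_id=$1
    list_children "${folder_id}" | while read -r id mime; do
        if [ "${mime}" = "application/vnd.google-apps.folder" ]; then
            _download_folder "${id}"   # recurse into sub folder
        else
            _download_file "${id}"     # plain file
        fi
    done
}
```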
Use the GitHub issue tracker for any bugs or feature suggestions.
Submit patches to code or documentation as GitHub pull requests. Make sure to run merge.sh and format.sh before making a new pull request.
If using a code editor, use the shfmt plugin instead of format.sh