sync_git_repo_from_pc1_to_pc2.sh: make a continually-syncing daemon version
ElectricRCAircraftGuy opened this issue · 4 comments
Script: sync_git_repo_from_pc1_to_pc2.sh
NB: Get rid of the "SYNC" branch. You'll sync uncommitted changes with just rsync, and committed changes with their actual branch names.
Redo the script entirely (probably as a version 2?) as a continually-running daemon that does a near-real-time bidirectional sync every 2 to 5 seconds, using a combination of git and rsync.
Use C, C++, Bash, and Python together as appropriate, favoring Bash alone where possible, so the script can more easily run on small or embedded systems and doesn't need to be compiled.
Ensure rsync on both computers is the same version; if not, log a warning at daemon startup, since differing rsync versions are frequently incompatible and can fail with strange errors. Log stdout and stderr to known locations on each computer, probably in the home dir, or perhaps in /var/log if writing there doesn't require root privileges.
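The startup version check above could be sketched roughly like this (the remote host name and the way versions are fetched over ssh are assumptions, not decided yet):

```shell
#!/usr/bin/env bash
# Sketch: warn at daemon startup if local and remote rsync versions differ.

# Pull the version number (e.g. "3.2.7") out of `rsync --version` output,
# whose first line looks like: "rsync  version 3.2.7  protocol version 31".
extract_rsync_version() {
    head -n1 | awk '{print $3}'
}

# Compare two version strings; log a warning and return nonzero on mismatch.
check_rsync_versions() {
    local local_ver="$1" remote_ver="$2"
    if [ "$local_ver" != "$remote_ver" ]; then
        echo "WARNING: rsync version mismatch: local=$local_ver remote=$remote_ver" >&2
        return 1
    fi
}

# In the real daemon the two versions would come from something like this
# (hypothetical host name "user@pc2"):
#   local_ver="$(rsync --version | extract_rsync_version)"
#   remote_ver="$(ssh user@pc2 rsync --version | extract_rsync_version)"
```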
Allow multiple running modes:
- Local --> remote sync, including moving destination files to a local trash
- Remote --> local sync, incl. trash
- Local <--> remote bidirectional sync, incl. trash on each side
You will start the script on the side you want to be the master; the other side will be the slave. The master will run continuously and periodically call the slave script on the other side automatically over ssh.
The master is in charge: it sends sync cmds over ssh as necessary and requests that the slave do the appropriate syncing. The main syncing will be via git, ensuring both sides are on the same branch. After that, git status will be used to determine which files to sync over rsync, and how. This will allow rapid automatic syncing every 2 to 5 seconds or so, so you can make changes on either side and see them rapidly propagate to the other side.
Trashes on each side will probably be handled via spontaneously-created git "trash" branches created when necessary, so as to always keep a delete history rather than just losing data.
Have a trash folder in .git/gsync/trash. Place deleted files there.
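Moving a file into that trash folder could be sketched like this (the timestamp suffix, used to keep a delete history instead of overwriting earlier trashed copies, is an assumption):

```shell
#!/usr/bin/env bash
# Sketch: move a deleted/replaced file into .git/gsync/trash instead of losing
# it, preserving its repo-relative path and adding a timestamp suffix so
# repeated deletions of the same path don't clobber each other.
trash_file() {
    local repo_root="$1" rel_path="$2"
    local trash_dir="$repo_root/.git/gsync/trash"
    local dest="$trash_dir/$rel_path.$(date +%Y%m%d-%H%M%S)"
    mkdir -p "$(dirname "$dest")"   # recreate the file's directory structure
    mv "$repo_root/$rel_path" "$dest"
}
```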
Call the program gsync (Gabriel's sync, although you can also think of it as "git sync").
Have a user config file at ~/.gsync_config.
cmds:

```bash
gsync begin        # begin continually syncing at the preconfigured interval (ex: 5 sec); only spawn a new process if one is *not* already running
gsync empty_trash  # rm everything in .git/gsync/trash
gsync stop         # **gently** kill any running gsync processes via a signal they will catch within 0.1 sec
gsync once         # sync one single time right now; if a continual gsync process is already running, send it a signal to do this
gsync status       # print "running (PID=9999)" or "stopped"
```
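The `status` and `stop` subcommands above could be sketched with a PID file; the PID-file path and the use of SIGTERM (catchable, so the daemon can trap it and exit cleanly within the 0.1 s budget) are assumptions:

```shell
#!/usr/bin/env bash
# Sketch of `gsync status` and `gsync stop`, assuming the running daemon
# writes its PID to "$PID_FILE" (path is a guess, not decided yet).
PID_FILE="${GSYNC_PID_FILE:-$HOME/.gsync.pid}"

gsync_status() {
    # kill -0 checks whether the process exists without sending a signal.
    if [ -f "$PID_FILE" ] && kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
        echo "running (PID=$(cat "$PID_FILE"))"
    else
        echo "stopped"
    fi
}

gsync_stop() {
    # SIGTERM is catchable, so the daemon can trap it, clean up, and exit.
    if [ -f "$PID_FILE" ]; then
        kill -TERM "$(cat "$PID_FILE")" 2>/dev/null
        rm -f "$PID_FILE"
    fi
}
```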
Alternatives to consider or look at:
- ***** How to keep a local directory automatically synced with a remote, without latency issues?
- https://github.com/lsyncd/lsyncd - looks very promising; uses rsync under the hood, as well as inotify notifications to identify file changes on the source.
- https://github.com/bcpierce00/unison - appears to be actively developed, and is an alternative to rsync.
- ***** Possibly the most well-developed and well-supported of them all!: https://github.com/syncthing/syncthing
    - See additional setup info. for it here: https://hackaday.com/2020/07/23/linux-fu-keep-in-sync/
- https://github.com/jachin/GitSync
Other references I may need, to help me:
- Once I get my script updated to do live, continuous updates, using git and rsync together, add my script as a new answer here: How to keep a local directory automatically synced with a remote, without latency issues?
Make it an automatic 2-way sync, so that if I make changes in the local repo, the remote PC's version is changed, and vice versa. That way, if I make rapid edits on the remote PC for rapid build-and-test cycles, those changes show up automagically locally too, for rapid git difftool checks and committing.