[TRACKER] Database caching & updating
chipsenkbeil opened this issue · 0 comments
chipsenkbeil commented
Today, the database is static and is created exactly once when the file does not exist. This is obviously not what we need for a production version. Here is what needs to be done:
- When the plugin first starts, it should scan through all files in the org-roam directory to see if any have been modified since they were last parsed. If a file is new or has been modified, we need to do the following:
  a. Load the file to get the latest nodes
  b. Delete all nodes associated with the file to remove any nodes that no longer exist in this file
  c. Add the new nodes generated by the file
- While scanning, we also need to see which files no longer exist. This means maintaining a list of scanned file paths and cross-referencing it with all file paths in the database. We should be able to do this by looking up the database index of `file` for all filenames and seeing which exist in the database but not in our scan. Nodes for those files will be removed.
- Whenever a write occurs on a buffer, we need to perform the scan from step 1 again, but just for the single file. This should be configurable so it can be turned off.
- When we refile an org-roam template, we want to perform a rescan for the target file.
- We need a command that can re-trigger the scan from steps 1 & 2. Something like `:OrgRoamRefresh`.
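The modified-file refresh and stale-file pruning described above might look roughly like this. This is a minimal sketch, not the plugin's actual code: `NodeStore`, `parse_file`, and `refresh` are all hypothetical stand-ins (a toy in-memory store and a placeholder parser) used only to show the mtime comparison, the delete-then-reinsert of a file's nodes, and the cross-reference against scanned paths.

```python
import os

class NodeStore:
    """Toy in-memory stand-in for the plugin's database (hypothetical API)."""
    def __init__(self):
        self.files = {}  # path -> {"mtime": float, "nodes": list}

    def get(self, path):
        return self.files.get(path)

    def indexed_files(self):
        # Mirrors "looking up the database index of `file` for all filenames".
        return list(self.files)

    def delete_nodes_for(self, path):
        self.files.pop(path, None)

    def insert_nodes(self, path, mtime, nodes):
        self.files[path] = {"mtime": mtime, "nodes": nodes}

def parse_file(path):
    # Placeholder parser: one "node" per non-blank line.
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def refresh(db, root_dir):
    """Steps 1 & 2: re-parse new/modified files, then prune deleted ones."""
    seen = set()
    for dirpath, _dirs, names in os.walk(root_dir):
        for name in names:
            if not name.endswith(".org"):
                continue
            path = os.path.join(dirpath, name)
            seen.add(path)
            mtime = os.path.getmtime(path)
            cached = db.get(path)
            if cached is None or cached["mtime"] < mtime:
                nodes = parse_file(path)             # (a) load latest nodes
                db.delete_nodes_for(path)            # (b) drop stale nodes
                db.insert_nodes(path, mtime, nodes)  # (c) add new ones
    # Paths indexed in the db but absent from the scan were deleted on disk.
    for path in set(db.indexed_files()) - seen:
        db.delete_nodes_for(path)
```

The single-file rescan on buffer write would be the same `if cached is None or ...` branch applied to just the written path, skipping the directory walk and the pruning pass.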
All of the above should run async and notify when done.
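The "run async and notify when done" shape can be sketched generically as below. In Neovim this would more likely be done with libuv work callbacks and `vim.schedule` rather than threads; the Python version here is only meant to show the contract (do the work off the main loop, then invoke a completion callback), and `run_async` is a hypothetical name.

```python
import threading

def run_async(task, notify):
    """Run `task` (e.g. a database refresh) on a worker thread and
    invoke `notify` with its result once it finishes."""
    def worker():
        result = task()
        notify(result)
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t  # caller may join or ignore
```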