Can't get it to work with VS2017 Community 15.7.4
Opened this issue · 6 comments
Oh, thanks for reporting!
I got another similar report and managed to find an issue that might be the cause of that.
With the current implementation, it would only consider files valid if they were in a subdir relative to the sln file's dir.
I changed the implementation to use absolute paths to store the versioned files.
This makes the stored paths very long, so it might not be good for all file managers.
It should still list any old versioned files it can find using the old format, but it will always save the new ones in the new format.
I'll be pushing a new version shortly with this change. I'll mark this version as beta, in case there are still annoyances left.
Once you get it, please reply back to tell us if it worked for your case.
If it still doesn't work, it would be awesome if you could enable debug and trace logs (in the options page) and try saving a file, then paste the log in a new comment here.
The log pane can be found in the Output window (View -> Output) (see screenshot below).
A bit more elaboration on the changes
Say you have the following structure:
c:\YourProjectRoot\Solution.sln
c:\YourProjectRoot\Project1\Project1.csproj
c:\YourProjectRoot\Project1\Project1_File1.cs
c:\YourProjectRoot\Project1\Project1_File2.cs
Then, the root for the .localhistory folder would be
c:\YourProjectRoot\.localhistory
When you save Project1_File1.cs, it would create a version like this:
- in the old format
c:\YourProjectRoot\.localhistory\Project1\1530224700$Project1_File1.cs
- in the new format
c:\YourProjectRoot\.localhistory\c\YourProjectRoot\Project1\1530224700$Project1_File1.cs
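The new-format mapping might be sketched like this (a hypothetical Python sketch of the scheme described above, not the extension's actual C# code; `versioned_path` is an invented name):

```python
import ntpath  # Windows path semantics, regardless of host OS

def versioned_path(history_root, file_path, timestamp):
    # New format: embed the file's full path (drive colon dropped) under
    # the .localhistory root, and prefix the file name with "<timestamp>$".
    drive, rest = ntpath.splitdrive(file_path)   # 'c:', r'\YourProjectRoot\...'
    embedded = drive.rstrip(":") + rest          # r'c\YourProjectRoot\...'
    name = f"{timestamp}${ntpath.basename(file_path)}"
    return ntpath.join(history_root, ntpath.dirname(embedded), name)

print(versioned_path(r"c:\YourProjectRoot\.localhistory",
                     r"c:\YourProjectRoot\Project1\Project1_File1.cs",
                     1530224700))
# c:\YourProjectRoot\.localhistory\c\YourProjectRoot\Project1\1530224700$Project1_File1.cs
```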
Since projects can be anywhere, that's the safest approach I could think of (without increasing the complexity of the program).
The drawbacks are that it creates long directory structures and it's a bit more rigid: if you move your root, the history will no longer match (you can rename the folders to get it to match, though)
Path length will be root_folder_of_sln + full_path_to_file + timestamp - many Windows devs fight with just the length of the "full_path_to_file", especially if they're doing Angular or any sort of node/npm-based projects that have a node_modules folder (with all its nested dependencies).
If I had to guess, I would say this is ultimately going to cause more issues than it solves.
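To put rough numbers on that formula (an illustrative back-of-the-envelope sketch; the paths and the helper name are made up):

```python
MAX_PATH = 260  # classic Windows path limit, unless long paths are enabled

def new_format_length(sln_root, file_path, timestamp):
    # Length of <sln_root>\.localhistory\<file_path with ':' dropped>,
    # with "<timestamp>$" prefixed to the file name.
    return (len(sln_root) + len("\\.localhistory\\")
            + len(file_path.replace(":", ""))
            + len(str(timestamp)) + len("$"))

deep = r"c:\Work\App\ClientApp\node_modules\@scope\pkg\node_modules\dep\lib\util\index.js"
length = new_format_length(r"c:\Work\App", deep, 1530224700)
print(length, "of", MAX_PATH)
```

Even a modest node_modules path eats a large chunk of the budget before the history prefix is added, so a deeply nested project can tip over the limit.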
I realize this suggestion may be a pretty major overhaul, but have you considered using a sqlite database?
Some benefits and features that this might provide:
- Storing the contents of each file's history snapshots in the database itself; that way the entire local history is represented by a single portable .db file for each .sln.
- Concise/coherent table structure, e.g. a File table with all files and a FileHistory table with the individual snapshots of each, perhaps with a Create/Update/Delete operation recorded (so you can record the history of deletions, too)
- Users could still browse their history outside of VS (albeit not as simply as just opening File Explorer), using a tool like sqlite browser
- No filesystem path length limitations
- Ability to store additional data/metadata for future features (e.g. VS Settings history?? ;-)
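A possible shape for that suggestion (hypothetical table and column names, sketched with Python's built-in sqlite3; nothing here is from the extension's code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # a real store would be <Solution>.localhistory.db
conn.executescript("""
CREATE TABLE File (
    Id   INTEGER PRIMARY KEY,
    Path TEXT NOT NULL UNIQUE       -- path relative to the .sln, or absolute
);
CREATE TABLE FileHistory (
    Id        INTEGER PRIMARY KEY,
    FileId    INTEGER NOT NULL REFERENCES File(Id),
    Timestamp INTEGER NOT NULL,     -- unix time, replaces the '1530224700$' prefix
    Operation TEXT NOT NULL,        -- 'Create' / 'Update' / 'Delete'
    Label     TEXT,
    Content   BLOB                  -- snapshot contents; NULL for deletions
);
""")

file_id = conn.execute("INSERT INTO File (Path) VALUES (?)",
                       (r"Project1\Project1_File1.cs",)).lastrowid
conn.execute("INSERT INTO FileHistory (FileId, Timestamp, Operation, Content) "
             "VALUES (?, ?, ?, ?)",
             (file_id, 1530224700, "Update", b"// file contents"))
rows = conn.execute("SELECT Timestamp, Operation FROM FileHistory "
                    "WHERE FileId = ?", (file_id,)).fetchall()
print(rows)  # [(1530224700, 'Update')]
```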
I did consider it. Also considered using git.
In the end, I decided to try my best at keeping it as simple files, like the original is.
IntelliJ IDEs use a db and they work fairly well. A huge pain point is that if it somehow gets corrupted, you have to start over.
It shouldn't be too hard to change this to use a db, and would even simplify stuff (like how the timestamps and labels are currently stored).
One thing to note is that we would have to copy to a temporary file in order to display diffs.
I won't have a lot of time for a while, so this might take some time.
If anyone wants to have a go, it would help a lot!
> If I had to guess, I would say this is ultimately going to cause more issues than it solves.
Aside from using a DB or something like git, I couldn't think of anything else.
If we assume that the project's files are always in a subdir relative to the sln file's dir, then we can reduce that nicely. Without that assumption, there's not enough data without the full path.
In the end, what's on the table is:
- explicitly disable versioning of such files (like it was before, but with a proper warning)
- use full paths
- use a middle ground: force full paths only when necessary. It would continue working as it did before, unless a file is outside the dir structure, in which case it would use full paths.
None of these is as good as a full-fledged db, of course.
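The middle-ground option could look something like this (a Python sketch under the assumptions above; `history_subpath` is an invented name, not code from the extension):

```python
import ntpath  # Windows path semantics, regardless of host OS

def history_subpath(sln_dir, file_path):
    # Old (relative) format when the file lives under the sln dir;
    # full-path (new) format only when it falls outside of it.
    root = ntpath.normcase(ntpath.normpath(sln_dir)) + "\\"
    path = ntpath.normcase(ntpath.normpath(file_path))
    if path.startswith(root):
        return path[len(root):]               # relative to the sln dir
    drive, rest = ntpath.splitdrive(path)     # embed the full path
    return drive.rstrip(":") + rest

print(history_subpath(r"c:\YourProjectRoot", r"c:\YourProjectRoot\Project1\File1.cs"))
print(history_subpath(r"c:\YourProjectRoot", r"d:\Elsewhere\File2.cs"))
```

Note the case-folding via `normcase`: Windows paths compare case-insensitively, so the prefix check has to normalize both sides first.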
I'm not sure if you currently only archive ascii/text files or if this tool will archive binary files like images and dlls, but if it's text only, I think those .db files should zip up into almost nothing (assuming the format doesn't use some compression natively). I suppose it wouldn't be a horrible idea to keep some sort of archive in a .db.zip format or something to address any possible corruption (maybe hourly snapshots and keep the last 3 or 4 hours around).
Possibly could also write some sort of checksum and validate that before we finalize the commit. I've used sqlite for a few small projects but can't say I know enough about it. I wonder, is corruption a common thing?
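On the checksum idea: SQLite can already validate its own file via `PRAGMA integrity_check` (or the faster `PRAGMA quick_check`), which returns a single 'ok' row on a healthy database, e.g.:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # a real check would open the history .db file
conn.execute("CREATE TABLE FileHistory (Id INTEGER PRIMARY KEY, Content BLOB)")
result = conn.execute("PRAGMA integrity_check").fetchone()
print(result)  # ('ok',)
```

Running this on open (or after each commit) would catch most corruption before it silently spreads.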
I'm very interested in this project, and that feature in particular. If it's something you are interested in and would accept a pull request (assuming it's up to your standards, and whatnot), I may look into doing that. I'm a bit busy these days too, but I'll try to find some time to work on that.
I feel the filesystem approach is going to be very fragile. The absolute paths are extra hard to maintain, but even the old way of doing it is going to break for some people. I know there's been more than one occasion where I've gotten "path too long" errors and had to trim and remove a few characters here and there to finally get it under the limit. Yours will always be that length + ".localhistory\".Length + timestamp.Length - so this extension will always be the first point of failure if someone is getting close to hitting that length.
One more idea would be to maintain separate zip files for each folder (maybe name them like a GUID) and then have a json or xml file that acts as the manifest file and handles file locations. Problem is, it would be kinda difficult for someone to track down an individual file (outside of the extensions UI in VS). In that case corruption could still occur, but it would likely just be a folder instead of the whole DB. Personally, I still prefer the sqlite approach (assuming corruption isn't a very common issue).
I don't think there's anything that checks if a file is binary or not, before creating a revision.
The only thing it checks is (optionally) if the file is dirty, before creating a revision.
> I wonder, is corruption a common thing?
Not much, no. There are a few things that will cause corruption (How To Corrupt An SQLite Database File), but I don't think it's more common than a normal file/disk corruption. Only the stakes are higher: if the file breaks, you risk losing all of your history.
> I'm very interested in this project, and that feature in particular. If it's something you are interested in and would accept a pull request (assuming it's up to your standards, and whatnot), I may look into doing that.
That would be great. I've never used SQLite with .NET before, but, since we only need basic features, we can manage to get it done without much hassle.
By the way, there aren't really any standards at the moment.
This project was forked from an old project and the code wasn't (and isn't) the prettiest.
I mostly did a small cleanup of features I had been using for a long time, and published so people can still use it on newer Visual Studio versions.
We can test it a bunch in an alpha branch before publishing, so, feel free to break it as much as you want!
I'll open a new issue for this and cross reference this one.