Something like ack's --sort-files.
Opened this issue · 7 comments
It's nice to have output that's grouped by file and also sorted by file path. Having just searched in a big project, all the results under app/ were grouped together, as were those under db/, config/, and spec/, but the ordering of the groups was: spec/, config/, app/, and db/. Seems weird.
Yeah, it's a trade-off. If you don't print matches as soon as they're found, people think the program is slower.
It'd be nice to have this feature, but doing it right requires a rewrite of print.c. Honestly, I doubt I'll ever get around to it.
There is one workaround: You can use --workers=1 to get results in order, but it reduces performance considerably.
Oh, wow -- so internally you are sorting the contents of each directory, but it's the parallelism that then de-sorts the results? I didn't even know that there was parallelism in "ag", so I'm impressed!
I made it multithreaded a few months ago. If you're curious how I went about that, you can read this post.
I don't really sort anything, but scandir seems to go through dirents alphabetically on most systems.
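A note on that: POSIX scandir() only sorts if you pass it a comparator such as alphasort(); with a NULL comparator the entries come back in raw readdir() order, which depends on the filesystem. A quick way to see the difference from the shell (GNU and BSD ls support -U for unsorted output; the temp directory here is just for illustration):

```shell
# ls -U lists entries in raw directory (readdir) order;
# plain ls sorts them. On many filesystems the two differ.
demo=$(mktemp -d)
touch "$demo/b.txt" "$demo/a.txt" "$demo/c.txt"
ls -U "$demo"   # order unspecified: depends on the filesystem
ls "$demo"      # always sorted: a.txt  b.txt  c.txt
```

So whether results look alphabetical even with one worker depends on the filesystem, which would explain the Fedora observation below.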
FWIW, on Fedora 21, searching the gdb source, results are not ordered even within a given path. For example, I'm getting results from ChangeLog-2006, then ChangeLog-1999, then ChangeLog-2005. (I should set my preferences to ignore ChangeLogs, but that's a separate topic...)
Can you sketch, even roughly, what the necessary rewrite might entail?
It'd be great to have this as an option (I don't mind if it's slower then).
Ignoring performance, this works for me:
ag 'search terms' `ag -l 'search terms' | sort`
(-l is --files-with-matches)
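A variant of the same idea (my suggestion, not from the thread) that avoids backtick expansion, which can hit argument-length limits on large result sets:

```shell
# List matching files, sort them, then re-run the search on the
# files in that order. For filenames containing whitespace, check
# whether your ag build has a -0/--null option to pair with `xargs -0`.
ag -l 'search terms' | sort | xargs ag 'search terms'
```

Either way, ag still groups output per file; the outer sort only fixes the ordering of the groups.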