@SpectateSwamp said:
55 million lines isn't too big 3 maybe 4 Gig. At 20,000,000 cps and faster the first match will be up on the screen fast. Keep a copy of that on your local PC. If you need a source module. Book it out and change it as required.
I happened to have the source code for Linux 2.6.23 lying around, so I ran a test:
/usr/src/linux-source-2.6.23$ time rgrep something .
... 2000 boring results ...
real 2m24.243s
user 0m0.304s
sys 0m1.432s
That's how long an unindexed search takes to scan the kernel's roughly 300 MB of source, and that's if you're lucky. If I actually needed to find something, two and a half minutes is unacceptable, and I happen to have an extremely fast computer. Searching an index would be insanely faster. For example, finding a file:
Without using an index:
$ time find / -name '*something*'
... 3 boring results ...
real 10m15.254s
user 0m2.292s
sys 0m12.961s
And with one:
$ time locate something
... 3 boring results ...
real 0m0.277s
user 0m0.276s
sys 0m0.004s
In our universe, 0.3 seconds is a lot faster than 10 minutes.
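The gap is a data-structure difference, not a hardware one: `find` walks and tests every entry, while `locate` queries a list that `updatedb` built ahead of time. A minimal sketch of that idea in Python (the file names and the basename-keyed index are hypothetical illustrations, not mlocate's real database format):

```python
from collections import defaultdict

# Hypothetical file list standing in for a filesystem tree.
files = [f"/usr/src/linux/drivers/module_{i}.c" for i in range(100_000)]
files.append("/home/user/something.txt")

def scan(needle):
    """Linear scan: touch every entry, like `find` (minus the disk I/O)."""
    return [f for f in files if needle in f]

# Build the index once, conceptually what `updatedb` does.
index = defaultdict(list)
for f in files:
    basename = f.rsplit("/", 1)[-1]
    index[basename].append(f)

def lookup(basename):
    """Indexed query: one dictionary lookup, like `locate`."""
    return index.get(basename, [])

print(scan("something"))        # walks all 100,001 entries
print(lookup("something.txt"))  # one hash lookup
```

Both calls return the same result here; the difference is that `scan` does work proportional to the whole tree on every query, while `lookup` pays that cost once at index-build time. The trade-off is the one `locate` users know well: the index answers instantly but can be stale until the next `updatedb` run.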