I needed to test out some work just completed, and to do so, I needed to run some files against a process. Problem is, there are 1000 files to choose from, and I only want to test a handful.
But which handful? Ideally, it would be random.
So here's my quick shell hack to give me 15 random files from a given directory:
ls | while read x; do echo "`expr $RANDOM % 1000`:$x"; done | sort -n | sed 's/^[0-9]*://' | head -15
This works by:
- Generating a list of every file in the directory
- Reading each file name into the variable x
- Prepending a random number (thanks to the magic $RANDOM variable in bash)
- Sorting that listing numerically, which means sorting on the random number
- Stripping off the random number with sed
- Keeping only the first 15 items from the stream
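The steps above can be sketched as a small self-contained script; this version builds its own scratch directory of files to pick from, so you can run it anywhere (it assumes bash, for the $RANDOM variable):

```shell
#!/usr/bin/env bash
# Make a scratch directory with 50 sample files to choose from.
dir=$(mktemp -d)
for i in $(seq 1 50); do touch "$dir/file$i.txt"; done

# Decorate each name with a random number, sort on that number,
# strip the number back off, and keep the first 15 survivors.
picked=$(ls "$dir" | while read -r x; do
    echo "$((RANDOM % 1000)):$x"
done | sort -n | sed 's/^[0-9]*://' | head -15)

echo "$picked"
rm -rf "$dir"
```

Same decorate-sort-undecorate trick, just with `$((RANDOM % 1000))` in place of the backtick-and-expr spelling.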
Of course, this isn't efficient - but it doesn't need to be. Heck, it took longer to write up the blog post describing it than it did to actually write the code.
I so love Unix. And I'm running this under cygwin on a Windows box. See, there's no excuse for not using Unix.