author    Naohiro Aota <naohiro.aota@wdc.com>    2023-08-21 16:12:11 +0900
committer Zorro Lang <zlang@kernel.org>          2023-08-25 22:20:50 +0800
commit    778364fffb3502ac010f9ff89afc880a8a6a7239
tree      9b97ded635c9b97b9d32f94472c78ebc36a33934
parent    eccdeae8fc417ecb369b822a8182fd5eeab751e1
common/rc: introduce _random_file() helper
Currently, we use "ls ... | sort -R | head -n1" (or tail) to choose a
random file in a directory. This lists the files with "ls", sorts them
randomly, and picks the first line, which wastes the initial "ls" sort.
Also, using "sort -R | head -n1" is inefficient. For example, in a
directory with 1000000 files, it takes more than 15 seconds to pick a file.
$ time bash -c "ls -U | sort -R | head -n 1 >/dev/null"
bash -c "ls -U | sort -R | head -n 1 >/dev/null" 15.38s user 0.14s system 99% cpu 15.536 total
$ time bash -c "ls -U | shuf -n 1 >/dev/null"
bash -c "ls -U | shuf -n 1 >/dev/null" 0.30s user 0.12s system 138% cpu 0.306 total
So, we should just use "ls -U" and "shuf -n 1" to choose a random file.
Introduce a _random_file() helper to do it properly.
Signed-off-by: Naohiro Aota <naohiro.aota@wdc.com>
Reviewed-by: Anand Jain <anand.jain@oracle.com>
Signed-off-by: Zorro Lang <zlang@kernel.org>
-rw-r--r--	common/rc	7 +++++++
1 file changed, 7 insertions(+)
@@ -5229,6 +5229,13 @@ _require_unshare()
 {
 	_notrun "unshare: command not found, should be in util-linux"
 }
 
+# Return a random file in a directory. A directory is *not* followed
+# recursively.
+_random_file() {
+	local basedir=$1
+	echo "$basedir/$(ls -U $basedir | shuf -n 1)"
+}
+
 init_rc
 
 ################################################################################
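The new helper can be exercised outside the test harness like this (a usage sketch; the temporary directory and file names are illustrative, not part of the patch):

```shell
#!/bin/bash
# Copy of the helper introduced by this patch.
_random_file() {
	local basedir=$1
	echo "$basedir/$(ls -U $basedir | shuf -n 1)"
}

# Populate a scratch directory with a few files and pick one at random.
dir=$(mktemp -d)
touch "$dir"/file{1..5}

f=$(_random_file "$dir")
echo "picked: $f"

# The returned path should name an existing regular file under $dir.
[ -f "$f" ] && echo "exists"
```

Note that "ls -U" skips sorting entirely and "shuf -n 1" reservoir-samples a single line, which is why the helper stays fast even for very large directories.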