[PLUG] Trying to avoid the ls limit

website reader website.reader3 at gmail.com
Sun Aug 3 20:33:43 UTC 2014


To bash gurus:

I just ran into a wall while processing files from a long simulation run:
there are over 50,000 files in one directory, and when the bash wildcard
character * expands on the command line I get the infamous "Argument list
too long" error.  It is particularly bad with the ls command, e.g.
"ls *file*".  (As I understand it, the limit is the kernel's cap on the
total size of the arguments passed to an exec'd program, not anything in
readdir().)
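
For reference, that limit can be queried with getconf:

getconf ARG_MAX

which prints the maximum number of bytes of command line arguments (plus
environment) that exec() will accept.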

I did find a C program in the getdents(2) man page which gets around this
problem of listing lots and lots of files in one directory, but I also
found that I can use a find command:

find . -maxdepth 1 -type f {parameter_here} -print

which takes the wildcard character * okay when I enter the pattern
manually (quoted, so the shell never expands it).  But when I attempt to
use something simple like

lsb *parameter*

where lsb is a bash shell script, the infamous "Argument list too long"
error shows up again.
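
For reference, lsb is roughly just a wrapper around the find line above;
a sketch (the exact options are placeholders):

#!/bin/bash
# lsb (sketch): list regular files in the current directory whose
# names match the shell pattern given in $1
find . -maxdepth 1 -type f -name "$1" -print

Quoting the pattern at the prompt, as in lsb '*file*', keeps bash from
expanding it, but I would like the unquoted form to work too.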

I can toggle wildcard expansion off with bash's "set -f", and the sequence
I need is: turn expansion off so the command line parameter *parameter*
reaches the script unexpanded, drop it into the find line inside the
script, then turn expansion back on.
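
Putting the toggle inside the script itself doesn't help, because by the
time the script runs the calling shell has already expanded the *:

#!/bin/bash
# too late: set -f here only affects expansion inside this script;
# $1 already holds the first of 50,000+ expanded filenames
set -f
find . -maxdepth 1 -type f -name "$1" -print
set +f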

Right now bash is expanding the * in command line parameter #1 before the
bash script ever sees it.
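
You can see this from inside the script:

# inside lsb: with 50,000 matching files, $# is 50,000, not 1
echo "argument count: $#"
echo "first argument: $1"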

Any ideas on how to do this?  I would like to avoid

set -f ; lsb *filename* ; set +f

if possible.

Thanks for your help.

- Randall


