I've tried a few solutions posted here, but to no avail. I'm hoping it's something silly and obvious that I've missed, that you can spot easily, and I can facepalm myself.
I have a bash script containing multiple commands for data analysis. If I were to run this script on the command line, it would look something like this (simplified):
./script.sh /path/to/file1.gz /path/to/file2.gz output_name
My first two arguments are paths to files used by the commands inside script.sh for data analysis. The output_name is, as you'd expect, the name of the file produced at the end.
This works fine, but I want to instead have a file containing the arguments for multiple runs, one set per line, and loop over each line, running the script sequentially. So the arguments file would look like:
/path/to/file1.gz /path/to/file2.gz output_name1
with one such set of arguments on each line.
And then feed each line into the script in a while loop (I figure this is best anyway).
I tried:
while read -r line; do ./script.sh; done < arg_list.txt
where arg_list.txt is the file containing my list of input arguments, but I got "failed to open file `'."
Feel free to tell me I'm stupid and missing something!
The problem is that you read each line into line but never pass it to the script. If script.sh contains something like
somecmd "$1"
then with no arguments it'll just pass an empty string to the command, and that empty string isn't a valid filename (though for some reason it gives the error for a nonexistent file, not the one for an invalid value).
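Since each line holds three whitespace-separated fields, you can let read split them into separate variables and pass those along. A minimal sketch (assuming the field names in1, in2, and out, and that none of your paths contain spaces):

```shell
# arg_list.txt holds one "input1 input2 output_name" triple per line;
# read -r splits each line on whitespace into the three variables.
while read -r in1 in2 out; do
    ./script.sh "$in1" "$in2" "$out"
done < arg_list.txt
```

If you'd rather pass the whole line as-is, `xargs -L 1 ./script.sh < arg_list.txt` should also work, again provided the paths contain no spaces or quote characters.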