I'm trying to run a command similar to the one below in a bash script. It should search through all subfolders of $sourcedir and copy all files of a certain type to the root level of $targetdir.
#!/bin/bash
# These are set as arguments to the script, not hard-coded
sourcedir="/path/to/sourcedir"
targetdir="/path/to/targetdir"
find "$sourcedir" -type f -name "*.type" -exec sh -c 'cp "$1" "$2/`basename "$1"`"' "{}" "$targetdir" \;
This seems to be pretty close, except that {} isn't being passed in as $2 to -exec sh -c ....
I would like to do this as close to the "right way" as possible, with tolerance for special characters in filenames (specifically single quote characters).
Edit: I see people suggesting using xargs or argument chaining. I was under the impression that this is only OK for a limited number of arguments. If I have, for example, thousands of .jpg files that I'm trying to copy from a number of gallery directories into a giant slideshow directory, will the solutions that chain arguments still work?
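For what it's worth, the argument-length limit applies per command invocation, not to the total number of files: xargs watches the system limit and splits its input into as many invocations as needed. A quick way to see the batching in action (numbers standing in for file names):

```shell
# xargs packs as many arguments as fit into each invocation and
# runs the command repeatedly, so the total count is unbounded.
# 200000 numbers won't fit into one argument list, so echo runs
# several times -- one output line per batch (more than one line).
seq 200000 | xargs echo | wc -l
```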
Edit 2: My problem was that I was missing a placeholder _ before the first argument to sh in the -exec command; that placeholder fills $0 for the inline script, so {} lands in $1 as intended. For anyone who is curious about how to make the find command work, add the _ and everything will be fine:
find "$sourcedir" -type f -name "*.type" -exec sh -c 'cp "$1" "$2"' _ "{}" "$targetdir" \;
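The reason the _ matters: the word immediately after the sh -c script fills $0 (conventionally the script's name), and only the arguments after that become $1, $2, and so on. A minimal illustration:

```shell
# The first word after the inline script becomes $0, not $1;
# "_" is a throwaway value used to occupy that slot.
sh -c 'printf "%s\n" "zero=$0" "one=$1" "two=$2"' _ foo bar
# prints:
# zero=_
# one=foo
# two=bar
```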
I have accepted an answer below, though, because it accomplishes the same task more efficiently and elegantly.
Handling huge numbers of arguments is exactly why xargs was created: it automatically batches argument lists for regular commands that have limits. Also worth considering: most of the maximum-argument limits have been vastly improved in the standard GNU utilities. You will also see a performance benefit from avoiding all those process forks, which is relevant when you're dealing with thousands of files.
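Putting it together, a filename-safe version of the whole copy might look like the sketch below (copy_flat is a hypothetical helper name, and cp -t is a GNU coreutils extension). The -print0 / -0 pair keeps names with spaces, quotes, or newlines intact:

```shell
# Flatten-copy every *.type file under $1 into the root of $2.
# NUL-delimited names survive spaces, quotes, and newlines;
# xargs batches them into as few cp invocations as possible.
copy_flat() {
  find "$1" -type f -name '*.type' -print0 |
    xargs -0 cp -t "$2" --
}
```

An equivalent without the pipe is find ... -exec cp -t "$targetdir" -- {} +, where the + terminator batches arguments the same way xargs does.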