
I have a bash script that runs every 10 minutes from a crontab (a sketch of the entry is shown after the script below). It executes several Python scripts that each create a CSV file, and then moves those files. It's basically this:

python3 first_script.py
wait;
sudo mv /script_dir/first_output.csv /var/www/html
wait;
python3 second_script.py
wait;
sudo mv /script_dir/second_output.csv /var/www/html
wait;
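
For reference, the cron entry that triggers it looks something like this (the script path here is just a placeholder):

*/10 * * * * /home/user/run_scripts.sh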

There are a lot of Python scripts, and sometimes one of them fails to run, so its CSV isn't up to date.

Is it possible to write to a log file when there is an error, so I know what is happening without having to check each Python script one by one?

What I want is a txt file that I can check regularly, something like:

yyyy mm dd hh mm ss : the file first_script.py doesn't work
yyyy mm dd hh mm ss : the file second_script.py doesn't work

and if possible: "the error was XXXXX"

  • You could loop over script_dir, check which CSV files are present, and report the missing ones as errors (see the sketch after these comments). Commented Mar 15, 2022 at 12:01
  • The problem is that I regularly (every week or two) add a Python script to the bash script, or delete one. Commented Mar 15, 2022 at 13:17
  • Then I don't see what the actual question is: making the number of scripts to be monitored configurable, or finding out which script produced an error? You need to be more specific; as written, I have no idea what exactly you want to achieve. Commented Mar 15, 2022 at 13:35
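
A minimal sketch of the first comment's idea, assuming a fixed list of expected CSV names (the file names and log path here are just examples):

# report any expected CSV that is missing from /script_dir
for f in first_output.csv second_output.csv; do
    if [[ ! -e "/script_dir/$f" ]]; then
        echo "$(date '+%Y %m %d %H %M %S') : $f is missing" >> /var/log/csv_check.log
    fi
done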

1 Answer

Since you only want the output in case of an error, redirect each script's output to a temporary file first, and append that file to your log only if the script failed. I'm assuming your scripts are well behaved and return a nonzero exit code on failure.

You can write a helper function to encapsulate all this:

function log_errors() {
    # Run the given command, capturing its stdout and stderr in a temp file
    local script_log
    script_log=$(mktemp)
    "$@" > "$script_log" 2>&1
    local exit_code=$?
    if [[ $exit_code -ne 0 ]]; then
        # On failure, append a timestamped entry with the command and its output
        {
            date
            echo "\$ $(printf "%q " "$@")"
            cat "$script_log"
            echo "Failed with exit code $exit_code"
            echo
        } >> /var/log/my_scripts.log
    fi
    rm "$script_log"
}

log_errors python3 first_script.py
sudo mv /script_dir/first_output.csv /var/www/html
log_errors python3 second_script.py
sudo mv /script_dir/second_output.csv /var/www/html

Example output:

Wed Mar 16 03:48:13 PM CET 2022
$ python3 second_script.py 
python3: can't open file '/tmp/second_script.py': [Errno 2] No such file or directory
Failed with exit code 2
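
Since the helper just runs whatever command it is given, you could also wrap the mv steps with it, so a failed move (for example when a CSV was never produced) ends up in the same log:

log_errors python3 first_script.py
log_errors sudo mv /script_dir/first_output.csv /var/www/html
log_errors python3 second_script.py
log_errors sudo mv /script_dir/second_output.csv /var/www/html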