I have a bash script that executes every 10 minutes (via crontab) and runs multiple Python scripts, each of which creates a CSV; the script then moves those files. It's basically this:
python3 first_script.py
wait;
sudo mv /script_dir/first_output.csv /var/www/html
wait;
python3 second_script.py
wait;
sudo mv /script_dir/second_output.csv /var/www/html
wait;
There are a lot of Python scripts, and sometimes one of them fails to run, so its CSV isn't up to date.
Is it possible to write to a log file when there is an error, so I know what is happening (and don't have to check each Python script one by one)?
What I want is a text file that I can check regularly, with entries like:

yyyy mm dd hh mm ss : the file first_script.py doesn't work
yyyy mm dd hh mm ss : the file second_script.py doesn't work

and, if possible: "the error was XXXXX"
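In case it helps to see what I've tried: a minimal sketch of the kind of wrapper I have in mind (the log path and function name are placeholders I made up). It runs one script, checks the exit status, and on failure appends a timestamped line plus the captured stderr to the log:

```shell
#!/bin/bash
# Hypothetical log location -- adjust to taste.
LOG=/tmp/cron_errors.log

# Run one Python script; on failure, append a timestamped line
# (and the script's stderr) to $LOG, then return non-zero.
run_step() {
    local script="$1"
    local err
    # 2>&1 >/dev/null : capture stderr into $err, discard stdout
    err=$(python3 "$script" 2>&1 >/dev/null)
    if [ $? -ne 0 ]; then
        {
            echo "$(date '+%Y %m %d %H %M %S') : the file $script doesn't work"
            echo "the error was: $err"
        } >> "$LOG"
        return 1
    fi
}
```

Each step would then become something like `run_step first_script.py && sudo mv /script_dir/first_output.csv /var/www/html`, so the `mv` only runs when the script actually succeeded. Is this a reasonable approach, or is there a more standard way?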