I have a Django web application that needs to be hosted on a local network. In addition to basic CRUD features, the scope requires continuously monitoring a local storage folder (on C:, D:, or E:) for CSV files and importing them into the database (PostgreSQL). I have already written the code that reads the CSV files, imports them into the database, and moves them to another folder after importing. What I don't know is where to put this code and call the function (import_to_db) so that it runs continuously, scanning the folder for new CSV files. It cannot be a Python command-line interface. I am not fully conversant with Django REST Framework and am not sure whether it applies here, since the CSV files will be made available in a local folder. Any tips or references to examples/libraries would help.
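To illustrate the kind of loop I have in mind, here is a minimal stdlib-only polling sketch (the folder path, the handle_file callback, and the max_cycles test hook are hypothetical names for illustration, not part of my project):

```python
import os
import time

def find_csv_files(folder):
    """Return the names of the CSV files currently sitting in *folder*."""
    return [name for name in os.listdir(folder) if name.endswith('.csv')]

def poll_folder(folder, handle_file, interval=5.0, max_cycles=None):
    """Scan *folder* every *interval* seconds and pass each CSV path to
    *handle_file*. *max_cycles* bounds the loop (useful for testing);
    None means run forever."""
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        for name in find_csv_files(folder):
            handle_file(os.path.join(folder, name))
        cycles += 1
        time.sleep(interval)
```

A library such as watchdog could replace the polling with filesystem events, but the question of where to start this loop inside a Django project remains the same.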
Code to import:

import os
import shutil

import pandas as pd
from sqlalchemy import create_engine


def get_files():
    """Collect the CSV files in the current working directory."""
    csv_files = []
    for file in os.listdir(os.getcwd()):
        if file.endswith('.csv'):
            csv_files.append(file)
    return csv_files


def move_files_to_folder(csv_files, destination):
    """Move the imported CSV files into the backup folder."""
    os.makedirs(destination, exist_ok=True)  # no error if it already exists
    for file in csv_files:
        shutil.move(file, os.path.join(destination, file))
    return csv_files


def import_to_db():
    csv_files = get_files()
    # user, password, host, port and database are defined elsewhere
    engine = create_engine(
        "postgresql://{0}:{1}@{2}:{3}/{4}".format(user, password, host, port, database))
    for file in csv_files:
        df = pd.read_csv(file, parse_dates=[1, 2], encoding="utf-8")
        df.rename(columns={'ItemNumber': 'ItemNumber_id'}, inplace=True)
        df.to_sql('app_logdata', engine, if_exists='append', index=False)
    print('Imported Data Successfully')
    destination = os.path.join(os.getcwd(), "BackupFiles")
    move_files_to_folder(csv_files, destination)
    print('CSV Files Moved To BackupFiles Folder', csv_files)

# os.chdir('...')
# print('App Folder = ', os.getcwd())
import_to_db()
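For context on where such a loop might live: one option I have seen suggested is starting it as a daemon thread when the Django app boots (e.g. from AppConfig.ready() in apps.py, though I have not verified that this is the recommended pattern). A small stdlib helper to launch the loop without blocking the web server might look like this:

```python
import threading

def start_watcher(target, *args, **kwargs):
    """Run *target* in a daemon thread so the folder-scanning loop does not
    block the process that starts it (e.g. the web server)."""
    thread = threading.Thread(target=target, args=args, kwargs=kwargs, daemon=True)
    thread.start()
    return thread
```

In a real deployment this would presumably need to be guarded against being started twice (Django's development autoreloader runs app initialization in two processes); a task queue such as Celery with a beat schedule is another commonly suggested alternative.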