Contains utilities used for converting application database data to usable Data Sheets, and for merging Data Sheets to be imported into the application database.
We recommend configuring a virtual environment for this project.
- `cd` to the project root
- Create the virtual environment:
  `python3 -m venv venv`
- Activate it:
  - macOS/Linux: `source venv/bin/activate`
  - Windows: `.\venv\Scripts\activate.bat`
- In your IDE, configure the Python interpreter to use the virtual environment
- For initial setup, install the dependencies:
  `pip install -r requirements.txt`
You can configure the database connection parameters in `columns.py`.
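As a rough illustration, the connection parameters in `columns.py` might look like the following. The variable names and values here are assumptions for the sketch, not the file's actual contents; match them to whatever `columns.py` really expects.

```python
# columns.py -- illustrative database connection parameters.
# All names and values below are placeholders; adjust to your deployment.
DB_HOST = "localhost"
DB_PORT = 5432
DB_NAME = "biometrics"
DB_USER = "postgres"
DB_PASSWORD = "change-me"
```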
Initialiser script that takes a CSV file of the application database and converts it to a Data Sheet usable by doctors. It can assume the formatting and values of the CSV file are correct. Among other things, it:
- Renames columns
- Applies xlsx data validation rules
- Formats cells with dropdowns to select values from
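The column-renaming step above can be sketched with the stdlib `csv` module. The `COLUMN_RENAMES` mapping and column names below are hypothetical; the real mapping lives in the Initialiser script.

```python
import csv
import io

# Hypothetical mapping from database column names to Data Sheet headers.
COLUMN_RENAMES = {"pt_id": "Patient ID", "dob": "Date of Birth"}

def rename_header(csv_text: str) -> str:
    """Rename known columns in the CSV header row; data rows pass through unchanged."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    rows[0] = [COLUMN_RENAMES.get(col, col) for col in rows[0]]
    out = io.StringIO()
    csv.writer(out, lineterminator="\n").writerows(rows)
    return out.getvalue()
```

The xlsx-specific steps (data validation rules and dropdowns) are applied afterwards when the rows are written to the workbook, and are not shown here.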
Merger script that takes a directory of Data Sheets and merges them into a single Data Sheet.
If a Data Sheet has duplicate rows, the script will arbitrarily keep one of the rows and discard the others. It will log the Data Sheets containing duplicates for further checking later on.
If multiple Data Sheets have overlapping rows filled out for a patient, the script will arbitrarily keep one.
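The keep-one-and-log behaviour described above can be sketched as follows. This is a stdlib-only illustration, not the Merger's actual code; the `"Patient ID"` key column is an assumption.

```python
import csv
import io

def merge_sheets(sheets: dict[str, str], key: str = "Patient ID"):
    """Merge several CSV sheets, keeping the first row seen per patient.

    Returns (merged_rows, duplicates), where duplicates maps sheet name to
    the patient IDs whose rows were discarded, for later review.
    """
    seen: dict[str, dict] = {}
    duplicates: dict[str, list[str]] = {}
    for name, text in sheets.items():
        for row in csv.DictReader(io.StringIO(text)):
            pid = row[key]
            if pid in seen:
                # Overlapping or duplicate row: keep the first, log the rest.
                duplicates.setdefault(name, []).append(pid)
            else:
                seen[pid] = row
    return list(seen.values()), duplicates
```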
Takes a successfully merged Data Sheet and imports it into the application database.
Ensure that types.csv is present in the project and up to date.
Fill out the columns.py file with the correct parameters for the database.
While the system is working (Cron Job):
- While the biometrics system is up, run /cron/backup.py to grab CSV backups of the database every minute.
- Periodically check the run logs to ensure that the backups are working
- You can also make sure the webserver is running in advance.
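The cron steps above imply minute-level backup files piling up in /cron/backups. As an illustration only (backup.py's internals are not shown in this document), timestamped naming and pruning could look like this; the path and the `KEEP_LAST` retention count are assumptions.

```python
import datetime
import pathlib

BACKUP_DIR = pathlib.Path("cron/backups")  # path assumed from the steps above
KEEP_LAST = 60  # assumption: keep roughly the last hour of minute-level backups

def backup_filename(now: datetime.datetime) -> str:
    """Timestamped name so minute-level backups never collide."""
    return now.strftime("backup_%Y%m%d_%H%M") + ".csv"

def prune(names: list[str], keep: int = KEEP_LAST) -> list[str]:
    """Return the backup filenames to delete, oldest first."""
    ordered = sorted(names)  # timestamped names sort chronologically
    return ordered[:-keep] if len(ordered) > keep else []
```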
Importing CSV to Database (System is down):
- Make sure the database info in columns.py is correct, and that the backend, frontend, and database containers are up.
- Copy the latest backup csv from /cron/backups to /merger, and rename it to merged_output.csv
- Make sure to clear any existing database data on the current machine. (Use TRUNCATE in pgAdmin.)
- Navigate to /importer and run
python importer.py
Converting DB to Data Sheet: (System is down)
- Run the Initialiser.py script, which pulls the latest backup csv from /cron/backups, and converts it to DataSheet.xlsx.
- Place the DataSheet.xlsx in the /server/uploads folder for the stations to use.
- Navigate to /server/ and start up the webserver with python app.py, so that users can download the DataSheet.xlsx file to use.
- If the server is down, share the DataSheet.xlsx file using thumbdrives.
Converting Data Sheet to DB: (System is back up)
- Ensure the system is back up.
- Navigate to /server/ and start up the webserver with python app.py, so that users can upload their DataSheet.xlsx files.
- If the server is down, collect the DataSheet.xlsx files using thumbdrives, then place them in the /server/uploads folder. Naming shouldn't matter.
- Run Merger.py to merge the Data Sheets into a single merged_output.csv file.
- Go to pgAdmin, and run the following SQL command to clear the database:
  TRUNCATE TABLE admin CASCADE;
- Run Importer.py to import the merged_output.csv file into the database.
- To prevent accidentally running the script and resetting the database, the script will only work if the existing tables are cleared.
- To clear the database, enter the Query Editor in pgAdmin and run the following SQL command manually:
TRUNCATE TABLE admin CASCADE;
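The only-run-if-cleared safeguard described above could be sketched as a pre-import guard. This is an illustration, not the Importer's actual code; how the row counts are queried from PostgreSQL is not shown.

```python
class DatabaseNotEmptyError(RuntimeError):
    """Raised when an import is attempted against a non-empty database."""

def assert_tables_empty(row_counts: dict[str, int]) -> None:
    """Refuse to import unless every table is empty.

    row_counts maps table name -> SELECT COUNT(*) result, gathered by the
    importer before writing anything (the query layer is not shown here).
    """
    nonempty = {t: n for t, n in row_counts.items() if n > 0}
    if nonempty:
        raise DatabaseNotEmptyError(
            f"Refusing to import: tables not empty: {sorted(nonempty)}. "
            "Run TRUNCATE TABLE admin CASCADE; in pgAdmin first."
        )
```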