Service to validate the OSW files that are uploaded. At the moment, the service does the following:
- Listens to the topic mentioned in the `.env` file for any new message (triggered when a file is uploaded), for example `UPLOAD_TOPIC=osw-upload`
- Consumes the message and performs the following steps:
  - Downloads the file locally; the file location is taken from `data.meta.file_upload_path` in the message
  - Uses `python-osw-validation` to validate the file
  - Adds the `isValid` and `validationMessage` keys to the original message
- Publishes the result to the topic mentioned in the `.env` file, for example `VALIDATION_TOPIC=osw-validation`
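For illustration, here is a minimal sketch of that flow. The `download_file` storage helper is hypothetical, and the result field names follow the outgoing message sample further below; the actual implementation in this repository may differ.

```python
from datetime import datetime, timezone

from python_osw_validation import OSWValidation


def handle_upload_message(message: dict, download_file) -> dict:
    """Validate the OSW zip referenced in the message and build the result message.

    `download_file` is a hypothetical helper that fetches the blob at the
    given path from storage and returns a local file path.
    """
    local_path = download_file(message["data"]["file_upload_path"])

    # Assumes python-osw-validation exposes an OSWValidation class with a
    # validate() method returning is_valid / errors; check the package
    # documentation for the exact interface.
    result = OSWValidation(zipfile_path=local_path).validate()

    outgoing = dict(message)
    outgoing["data"] = {
        **message["data"],
        "success": result.is_valid,
        "message": "" if result.is_valid else "; ".join(result.errors or []),
    }
    outgoing["publishedDate"] = datetime.now(timezone.utc).isoformat()
    return outgoing
```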
The project is built with Python and the FastAPI framework. The usual conventions for a Python project apply.
| Software | Version |
|---|---|
| Python | 3.10.x |
- Connecting this service to the cloud requires the following entries in the `.env` file:

```
PROVIDER=Azure
QUEUECONNECTION=xxxx
STORAGECONNECTION=xxxx
VALIDATION_REQ_TOPIC=xxxx
VALIDATION_REQ_SUB=xxxx
VALIDATION_RES_TOPIC=xxxx
CONTAINER_NAME=xxxx
AUTH_PERMISSION_URL=xxx # URL used to get the token
MAX_CONCURRENT_MESSAGES=xxx # Optional; defaults to 2 if not provided
AUTH_SIMULATE=xxx # Optional; defaults to False if not provided
```

The application connects with the STORAGECONNECTION string provided in the `.env` file and validates the downloaded zip file using the `python-osw-validation` package.
QUEUECONNECTION is used to send out messages and listen for messages.
MAX_CONCURRENT_MESSAGES is the maximum number of messages the service handles concurrently; if not provided, it defaults to 2.
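The snippet below illustrates one way these settings could be read with `python-dotenv`, applying the documented defaults for the optional values; the actual configuration handling in the service may differ.

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads the .env file in the working directory

provider = os.environ["PROVIDER"]                      # e.g. Azure
queue_connection = os.environ["QUEUECONNECTION"]
storage_connection = os.environ["STORAGECONNECTION"]
request_topic = os.environ["VALIDATION_REQ_TOPIC"]
request_subscription = os.environ["VALIDATION_REQ_SUB"]
response_topic = os.environ["VALIDATION_RES_TOPIC"]
container_name = os.environ["CONTAINER_NAME"]
auth_permission_url = os.environ["AUTH_PERMISSION_URL"]

# Optional settings with the defaults documented above.
max_concurrent_messages = int(os.environ.get("MAX_CONCURRENT_MESSAGES", "2"))
auth_simulate = os.environ.get("AUTH_SIMULATE", "False").lower() == "true"
```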
Follow these steps to install the Python packages required for both building and running the application:

- Set up a virtual environment:

```
python3.10 -m venv .venv
source .venv/bin/activate
```

- Install the dependencies. Run the following command in a terminal in the same directory as `requirements.txt`:

```
# Installing requirements
pip install -r requirements.txt
```
- The HTTP server starts on port `8000` by default
- Run the server with `uvicorn src.main:app --reload`
- By default, a `GET` call on `localhost:8000/health` gives a sample response
- Other routes include a `ping` with GET and POST. Make a `GET` or `POST` request to `http://localhost:8000/health/ping`
- Once the server starts, it begins listening on the subscriber (`VALIDATION_REQ_SUB` should be set in the `.env` file)
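For reference, a minimal FastAPI sketch of the health routes described above could look like the following; the actual routes in `src.main` may be structured differently, and the response bodies here are placeholders.

```python
from fastapi import APIRouter, FastAPI

app = FastAPI()
health_router = APIRouter(prefix="/health")


@health_router.get("/")
async def health():
    # Placeholder for the sample response returned on GET /health
    return {"status": "ok"}


@health_router.get("/ping")
@health_router.post("/ping")
async def ping():
    # /health/ping accepts both GET and POST requests
    return {"status": "ok"}


app.include_router(health_router)

# Run with: uvicorn src.main:app --reload  (serves on port 8000 by default)
```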
Sample incoming message:

```json
{
  "messageId": "tdei_record_id",
  "messageType": "workflow_identifier",
  "data": {
    "file_upload_path": "file_upload_path",
    "user_id": "user_id",
    "tdei_project_group_id": "tdei_project_group_id"
  }
}
```

Sample outgoing message:

```json
{
"messageId": "tdei_record_id",
"messageType": "workflow_identifier",
"data": {
"file_upload_path": "file_upload_path",
"user_id": "user_id",
"tdei_project_group_id": "tdei_project_group_id",
"success": true/false,
"message": "message" // if false the error string else empty string
},
"publishedDate": "published date"
}Make sure you have set up the project properly before running the tests, see above for How to Setup and Build.
- Add the new set of tests inside the `tests/test_harness/tests.json` file, like:

```json
{
  "Name": "Test Name",
  "Input_file": "test_files/osw_test_case1.json", // Input file path you want to provide to the test
  "Result": true/false // Expected test output
}
```

- The test harness requires a valid `.env` file.
- To run the test harness:

```
python tests/test_harness/run_tests.py
```
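Conceptually, the harness iterates over the entries in `tests.json` and compares each outcome against the expected `Result`. The sketch below only illustrates that loop with a hypothetical `execute_case` helper; it is not the actual `run_tests.py`, which (given that it needs a valid `.env`) presumably exercises the service through the configured topics.

```python
import json


def execute_case(input_file: str) -> bool:
    """Hypothetical helper: run one test input through the service and
    return whether validation succeeded."""
    raise NotImplementedError


def run_harness(tests_file: str = "tests/test_harness/tests.json") -> None:
    with open(tests_file) as fh:
        cases = json.load(fh)

    for case in cases:
        # Each entry carries a Name, an Input_file path and the expected Result.
        outcome = execute_case(case["Input_file"])
        status = "PASS" if outcome == case["Result"] else "FAIL"
        print(f"{status}: {case['Name']}")
```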
- A `.env` file is not required for the unit test cases.
- To run the unit test cases:

```
python test_report.py
```

- The above command runs all test cases and generates an HTML report in the `reports` folder at the root level.
- To run coverage:

```
python -m coverage run --source=src -m unittest discover -s tests/unit_tests
```

- The above command runs all the unit test cases under coverage.
- To generate the coverage report in the console:

```
coverage report
```

- The above command prints the code coverage report in the terminal.
- To generate the coverage report in HTML:

```
coverage html
```

- The above command generates an HTML report; the generated HTML is placed in the `htmlcov` directory at the root level.
- NOTE: To generate the `report` or `html` coverage output, the `coverage run` command above must be executed first.
- A `.env` file is required for the integration test cases.
- To run the integration test cases, run the command below:

```
python test_integration.py
```

- The above command runs all integration test cases and generates an HTML report in the `reports` folder at the root level.
This microservice deals with two topics/queues:
- upload queue from `osw-upload`
- validation queue from `osw-validation`

The incoming messages come from the upload queue `osw-upload`; the format is described in `osw-upload.json`.
The outgoing messages go to the `osw-validation` topic; the format is described in `osw-validation.json`.
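Since the configuration points at Azure (`PROVIDER=Azure`, `QUEUECONNECTION`), the topic wiring can be pictured roughly as below. This uses the `azure-servicebus` SDK directly purely for illustration; the service itself may go through a messaging abstraction layer, and the topic/subscription names are the examples from this document.

```python
import json

from azure.servicebus import ServiceBusClient, ServiceBusMessage

QUEUECONNECTION = "<connection string from .env>"

with ServiceBusClient.from_connection_string(QUEUECONNECTION) as client:
    # Listen for upload notifications and publish validation results.
    with client.get_subscription_receiver(
        topic_name="osw-upload", subscription_name="<VALIDATION_REQ_SUB>"
    ) as receiver, client.get_topic_sender(topic_name="osw-validation") as sender:
        for msg in receiver:
            incoming = json.loads(str(msg))
            outgoing = incoming  # validate and augment as described above
            sender.send_messages(ServiceBusMessage(json.dumps(outgoing)))
            receiver.complete_message(msg)
```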