21 changes: 19 additions & 2 deletions README.md
@@ -65,7 +65,8 @@ Clone the repository if you haven't done so already, then follow these steps to

To run the application locally, follow these steps:

> Note: Ensure you have activated your pipenv shell by running `pipenv shell`.
> [!IMPORTANT]
> Ensure you have activated your pipenv shell by running `pipenv shell`.

1. Ensure all dependencies are installed using Pipenv.
```bash
@@ -82,7 +83,8 @@ To run the application locally, follow these steps:
flask run --host=127.0.0.1 --port=5001
```

> **NOTE:** If your frontend is also running, ensure it is configured to communicate with this backend API. You may need to set the API URL in your frontend configuration.
> [!NOTE]
> If your frontend is also running, ensure it is configured to communicate with this backend API. You may need to set the API URL in your frontend configuration.
> Also, you might need to use CORS (Cross-Origin Resource Sharing) if your frontend and backend are served from different origins. You can use the `flask-cors` package to handle this:
> ```shell
> pipenv install flask-cors
> ```
@@ -122,6 +124,16 @@ pipenv install flake8
flake8 src
```

## Black Formatting

It is highly encouraged to use [Black](https://black.readthedocs.io/en/stable/) for code formatting before every commit. You can run it as follows:

```shell
pipenv run black .
```

Run this inside a pipenv shell, from the root directory of the project. This will format all Python files in the project.

## Database Setup and Migrations

### Local Database
@@ -146,6 +158,10 @@ flask db migrate -m "Create user table"
flask db upgrade # Execute this if you pull new changes that alter the schema remotely.
```

> [!IMPORTANT]
> When you run `flask db migrate`, it generates a new migration script in the `src/migrations/versions` directory. Review this script to ensure it accurately reflects the changes you made to the models. If necessary, modify the script before running `flask db upgrade`.
> This is crucial because the generated script may contain mistakes or destructive commands, such as `drop_table`, which can lead to data loss. Always double-check the generated migration script before applying it.
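To illustrate the kind of review this note asks for, here is a hedged sketch (not a real Alembic file — the operations are modeled as plain tuples so the fragment runs standalone; real migrations call `op.create_table`, `op.drop_table`, etc. from `alembic`):

```python
# Reviewing a generated migration: additive operations are usually safe,
# while drop_* operations deserve scrutiny before `flask db upgrade`.

DESTRUCTIVE = {"drop_table", "drop_column"}

def review(operations):
    """Return the operation names that would destroy data and should be checked."""
    return [name for name, _target in operations if name in DESTRUCTIVE]

# A hypothetical autogenerated script might contain:
generated_ops = [
    ("create_table", "analytics"),   # additive: safe
    ("drop_table", "user_backup"),   # destructive: verify this is intended!
]
```

Running `review(generated_ops)` flags the `drop_table` entry, which is exactly the kind of line you would edit out of (or confirm in) the script in `src/migrations/versions` before upgrading.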

## Git Hooks

Configure your local hooks as follows:
@@ -163,6 +179,7 @@ Specifically, the application is deployed to an AWS ECS cluster using Fargate. T

It also uses Terraform to manage the infrastructure as code.

> [!TIP]
> **Visit the [augmed-infra repository](https://github.com/DHEPLab/augmed-infra) for more details on the infrastructure setup and deployment process.**

## License
8 changes: 7 additions & 1 deletion pyproject.toml
@@ -1,7 +1,13 @@
[tool.black]
line-length = 88
target-version = ['py38']
exclude = '''
/(
src/migrations/*
\.venv # skip your virtual-env
| src/migrations # skip all files under migrations
| \.git # skip git dir
| build # skip build/
| dist # skip dist/
)/
'''

27,270 changes: 25,280 additions & 1,990 deletions script/assign_cases/config.csv

Large diffs are not rendered by default.

71 changes: 46 additions & 25 deletions script/assign_cases/generate_config.sh
@@ -1,4 +1,4 @@
#!/usr/bin/env sh
#!/usr/bin/env bash
> **Copilot AI (Jun 20, 2025):** [nitpick] Consider adding `set -euo pipefail` immediately after the shebang to ensure the script exits on errors, treats unset variables as errors, and fails on pipeline errors.
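The reviewer's suggestion would look like this at the top of the script (a sketch; the trailing `echo` is just a placeholder body):

```shell
#!/usr/bin/env bash
# Exit on any error (-e), treat unset variables as errors (-u),
# and make a pipeline fail if any stage fails (pipefail).
set -euo pipefail

echo "strict mode enabled"
```

Note that `pipefail` is a bash feature, which is one more reason the shebang change from `sh` to `bash` in this hunk matters.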

# ---------------------------------------------------------------------------------------------------------------
# generate_config.sh
@@ -26,8 +26,8 @@
#
# To run:
# 1. Change directory to the script's location: cd script/assign_cases
# 1. Run: chmod +x generate_config.sh
# 2. Run the script: (Remember to remove the # symbols at the beginning of each line)
# 2. Run: chmod +x generate_config.sh
# 3. Run the script: (Remember to remove the # symbols at the beginning of each line)
# ./generate_config.sh \
# <db_host> \
# <db_port> \
@@ -63,28 +63,33 @@ CASES_PER_USER="$6"
RISK_CASES_PER_USER="$7"
OUT="$8"

# Prompt for database password once, export to PGPASSWORD for all subsequent psql invocations
printf 'Password for user %s: ' "$DB_USER"
stty -echo
read DB_PASS
stty echo
echo
export PGPASSWORD="$DB_PASS"

# 1) Print a single header line. (The CSV parser expects exactly these columns.)
printf 'User,Case No.,Path,Collapse,Highlight,Top\n' > "$OUT"

# 2) Run one big psql‐COPY that:
# • Picks N random users
# • Picks M random cases per user (only person_ids that exist in visit_occurrence)
# • For each sampled (user,case_id), fetch all leaf‐observations but filter them at random() < 0.5
# → emit only those chosen leaves, with Collapse=FALSE, Highlight=TRUE
# • Separately, pick K random “risk cases” per user → emit exactly one "CRC risk assessments" row for each
# of those, again with Collapse=FALSE, Highlight=TRUE
psql -h "$DB_HOST" \
-p "$DB_PORT" \
-U "$DB_USER" \
-d "$DB_NAME" \
-A -F',' -t <<SQL >> "$OUT"
# 2) Loop over each of N random users, generate their cases, and append to CSV with progress logging
count=0
while IFS= read -r user; do
count=$((count + 1))
echo "[${count}/${NUM_USERS}] Processing user: $user"

psql -h "$DB_HOST" \
-p "$DB_PORT" \
-U "$DB_USER" \
-d "$DB_NAME" \
-w \
-A -F',' -t <<SQL >> "$OUT"
WITH
-- STEP A: Pick N random users
-- STEP A: Single user context
users AS (
SELECT email
FROM public."user"
ORDER BY random()
LIMIT $NUM_USERS
SELECT '$user' AS email
),

-- STEP B: All person_ids that actually have at least one visit (so get_case_by_user won't blow up)
@@ -329,10 +334,26 @@ WITH
FROM risk_cases rc
)

-- UNION together leaf_rows + risk_rows
SELECT * FROM leaf_rows
UNION ALL
SELECT * FROM risk_rows;
-- UNION together leaf_rows + risk_rows and randomize order for this user
SELECT * FROM (
SELECT * FROM leaf_rows
UNION ALL
SELECT * FROM risk_rows
) AS all_rows
ORDER BY random();
SQL

done < <(psql -h "$DB_HOST" \
-p "$DB_PORT" \
-U "$DB_USER" \
-d "$DB_NAME" \
-w \
-A -t <<SQL
SELECT email
FROM public."user"
ORDER BY random()
LIMIT $NUM_USERS;
SQL
)

echo "Wrote $OUT"
echo "Done – wrote $OUT"
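The restructuring above (one `psql` call per user, with the user list fed in via process substitution) follows this generic bash pattern — sketched here with placeholder data in place of the database query:

```shell
#!/usr/bin/env bash
count=0
# Read one item per line from a process substitution and handle each in turn.
# Using `< <(...)` instead of a pipe keeps the loop in the current shell,
# so $count survives after the loop ends.
while IFS= read -r user; do
  count=$((count + 1))
  echo "[${count}] Processing user: $user"
done < <(printf '%s\n' alice@example.com bob@example.com)

echo "Processed $count users"
```

Piping into `while` (`producer | while read ...`) would run the loop in a subshell and lose `count`, which is why the script's process-substitution form is the right choice here.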
26 changes: 17 additions & 9 deletions src/analytics/controller/analytics_controller.py
@@ -5,7 +5,10 @@
from src.user.repository.display_config_repository import DisplayConfigRepository
from src.common.model.ApiResponse import ApiResponse
from src.user.utils.auth_utils import jwt_validation_required
from src.common.exception.BusinessException import BusinessException, BusinessExceptionEnum
from src.common.exception.BusinessException import (
BusinessException,
BusinessExceptionEnum,
)
from datetime import datetime, timezone

# Give the blueprint its full prefix; no strict_slashes here
@@ -15,31 +18,36 @@
url_prefix="/api/analytics",
)


@analytics_blueprint.route("/", methods=["POST"], strict_slashes=False)
@jwt_validation_required()
def record(): # pragma: no cover
payload = request.get_json() or {}
case_config_id = payload.get("caseConfigId")
case_open_str = payload.get("caseOpenTime")
answer_open_str = payload.get("answerOpenTime")
case_config_id = payload.get("caseConfigId")
case_open_str = payload.get("caseOpenTime")
answer_open_str = payload.get("answerOpenTime")
answer_submit_str = payload.get("answerSubmitTime")

if not all([case_config_id, case_open_str, answer_open_str, answer_submit_str]):
ex = BusinessException(
BusinessExceptionEnum.RenderTemplateError,
"Missing analytics metrics fields"
"Missing analytics metrics fields",
)
return jsonify(ApiResponse.error(ex)), 400

fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
try:
case_open = datetime.strptime(case_open_str, fmt).replace(tzinfo=timezone.utc)
answer_open = datetime.strptime(answer_open_str, fmt).replace(tzinfo=timezone.utc)
answer_submit = datetime.strptime(answer_submit_str, fmt).replace(tzinfo=timezone.utc)
case_open = datetime.strptime(case_open_str, fmt).replace(tzinfo=timezone.utc)
answer_open = datetime.strptime(answer_open_str, fmt).replace(
tzinfo=timezone.utc
)
answer_submit = datetime.strptime(answer_submit_str, fmt).replace(
tzinfo=timezone.utc
)
except ValueError:
ex = BusinessException(
BusinessExceptionEnum.RenderTemplateError,
"Bad timestamp format for analytics"
"Bad timestamp format for analytics",
)
return jsonify(ApiResponse.error(ex)), 400

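The controller's timestamp handling can be exercised standalone (the format string is copied from the diff; the sample payload value is made up):

```python
from datetime import datetime, timezone

# Format the frontend is expected to send: ISO-8601 UTC with
# fractional seconds and a literal trailing Z.
fmt = "%Y-%m-%dT%H:%M:%S.%fZ"

raw = "2025-06-20T12:34:56.789Z"  # hypothetical payload value
parsed = datetime.strptime(raw, fmt).replace(tzinfo=timezone.utc)
```

Two details worth knowing: `%f` parses `789` as 789000 microseconds (it right-pads to six digits), and `strptime` returns a naive datetime here — the `.replace(tzinfo=timezone.utc)` call is what makes it tz-aware, matching the `DateTime(timezone=True)` columns in the model.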
16 changes: 8 additions & 8 deletions src/analytics/model/analytics.py
@@ -2,6 +2,7 @@
from sqlalchemy import Column, Integer, String, DateTime, Float
from src import db


class Analytics(db.Model):
__tablename__ = "analytics"

@@ -11,22 +12,21 @@ class Analytics(db.Model):
case_id = Column(Integer, nullable=False)

# these three fields will also accept and store tz-aware UTC datetimes
case_open_time = Column(DateTime(timezone=True), nullable=False)
answer_open_time = Column(DateTime(timezone=True), nullable=False)
answer_submit_time= Column(DateTime(timezone=True), nullable=False)
case_open_time = Column(DateTime(timezone=True), nullable=False)
answer_open_time = Column(DateTime(timezone=True), nullable=False)
answer_submit_time = Column(DateTime(timezone=True), nullable=False)

to_answer_open_secs = Column(Float, nullable=False)
to_submit_secs = Column(Float, nullable=False)
to_submit_secs = Column(Float, nullable=False)
total_duration_secs = Column(Float, nullable=False)

created_timestamp = Column(
DateTime(timezone=True),
default=lambda: datetime.now(timezone.utc)
created_timestamp = Column(
DateTime(timezone=True), default=lambda: datetime.now(timezone.utc)
)
modified_timestamp = Column(
DateTime(timezone=True),
default=lambda: datetime.now(timezone.utc),
onupdate=lambda: datetime.now(timezone.utc)
onupdate=lambda: datetime.now(timezone.utc),
)

__table_args__ = (
1 change: 1 addition & 0 deletions src/analytics/repository/analytics_repository.py
@@ -1,5 +1,6 @@
from src.analytics.model.analytics import Analytics


class AnalyticsRepository:
def __init__(self, session): # pragma: no cover
self.session = session
15 changes: 12 additions & 3 deletions src/analytics/service/analytics_service.py
@@ -1,10 +1,14 @@
from datetime import datetime
from src.analytics.model.analytics import Analytics
from src.analytics.repository.analytics_repository import AnalyticsRepository
from src.common.exception.BusinessException import BusinessException, BusinessExceptionEnum
from src.common.exception.BusinessException import (
BusinessException,
BusinessExceptionEnum,
)
from src.user.utils.auth_utils import get_user_email_from_jwt
from src.user.repository.display_config_repository import DisplayConfigRepository


class AnalyticsService:
def __init__(
self,
@@ -14,8 +18,13 @@ def __init__(
self.analytics_repo = analytics_repository
self.config_repo = display_config_repository

def record_metrics(self, case_config_id: str, case_open: datetime,
answer_open: datetime, answer_submit: datetime) -> Analytics: # pragma: no cover
def record_metrics(
self,
case_config_id: str,
case_open: datetime,
answer_open: datetime,
answer_submit: datetime,
) -> Analytics: # pragma: no cover

# verify user owns this case_config
config = self.config_repo.get_configuration_by_id(case_config_id)
8 changes: 4 additions & 4 deletions src/answer/controller/answer_controller.py
@@ -4,10 +4,10 @@
from src.answer.repository.answer_repository import AnswerRepository
from src.answer.service.answer_service import AnswerService
from src.common.model.ApiResponse import ApiResponse
from src.configration.repository.answer_config_repository import \
AnswerConfigurationRepository
from src.user.repository.display_config_repository import \
DisplayConfigRepository
from src.configration.repository.answer_config_repository import (
AnswerConfigurationRepository,
)
from src.user.repository.display_config_repository import DisplayConfigRepository
from src.user.utils.auth_utils import jwt_validation_required

answer_blueprint = Blueprint("answer", __name__)
3 changes: 2 additions & 1 deletion src/answer/model/answer.py
@@ -1,5 +1,6 @@
from datetime import datetime

from sqlalchemy import Boolean
from sqlalchemy.dialects.postgresql import UUID

from src import db
@@ -18,7 +19,7 @@ class Answer(db.Model):
display_configuration = db.Column(db.JSON, nullable=True)
answer_config_id = db.Column(UUID(as_uuid=True), nullable=True)
answer: dict = db.Column(db.JSON, nullable=True)

ai_score_shown: bool = db.Column(Boolean, nullable=False, default=False)
created_timestamp: datetime = db.Column(db.DateTime, default=datetime.utcnow)
modified_timestamp: datetime = db.Column(
db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow
17 changes: 11 additions & 6 deletions src/answer/service/answer_service.py
@@ -1,11 +1,13 @@
from src.answer.model.answer import Answer
from src.answer.repository.answer_repository import AnswerRepository
from src.common.exception.BusinessException import (BusinessException,
BusinessExceptionEnum)
from src.configration.repository.answer_config_repository import \
AnswerConfigurationRepository
from src.user.repository.display_config_repository import \
DisplayConfigRepository
from src.common.exception.BusinessException import (
BusinessException,
BusinessExceptionEnum,
)
from src.configration.repository.answer_config_repository import (
AnswerConfigurationRepository,
)
from src.user.repository.display_config_repository import DisplayConfigRepository
from src.user.utils import auth_utils


@@ -26,6 +28,8 @@ def add_answer_response(self, task_id: int, data: dict):
answer = data["answer"]
answer_config_id = data["answerConfigId"]

ai_shown = data.get("aiScoreShown", False)

configuration = self.configuration_repository.get_configuration_by_id(task_id)

if not configuration or configuration.user_email != user_eamil:
> **Copilot AI (Jun 20, 2025):** There's a typo in the variable name `user_eamil`. It should be `user_email` to match the existing variable.
>
> Suggested change:
>
> ```diff
> - if not configuration or configuration.user_email != user_eamil:
> + if not configuration or configuration.user_email != user_email:
> ```
@@ -41,6 +45,7 @@ def add_answer_response(self, task_id: int, data: dict):
task_id=task_id,
case_id=configuration.case_id,
user_email=user_eamil,
ai_score_shown=ai_shown,
display_configuration=configuration.path_config,
answer_config_id=answer_config.id,
answer=answer,
12 changes: 4 additions & 8 deletions src/cases/controller/case_controller.py
@@ -3,19 +3,15 @@
from src import db
from src.answer.repository.answer_repository import AnswerRepository
from src.cases.repository.concept_repository import ConceptRepository
from src.cases.repository.drug_exposure_repository import \
DrugExposureRepository
from src.cases.repository.drug_exposure_repository import DrugExposureRepository
from src.cases.repository.measurement_repository import MeasurementRepository
from src.cases.repository.observation_repository import ObservationRepository
from src.cases.repository.person_repository import PersonRepository
from src.cases.repository.visit_occurrence_repository import \
VisitOccurrenceRepository
from src.cases.repository.visit_occurrence_repository import VisitOccurrenceRepository
from src.cases.service.case_service import CaseService
from src.common.model.ApiResponse import ApiResponse
from src.common.repository.system_config_repository import \
SystemConfigRepository
from src.user.repository.display_config_repository import \
DisplayConfigRepository
from src.common.repository.system_config_repository import SystemConfigRepository
from src.user.repository.display_config_repository import DisplayConfigRepository
from src.user.utils import auth_utils
from src.user.utils.auth_utils import jwt_validation_required
