
Merge develop into main #98

Merged · 50 commits · Aug 24, 2024
Changes from 48 commits
5b7cfc8
Using course unit sigarra id to fetch schedules
diogotvf7 Feb 21, 2024
e7928fd
Refactored the endpoints
diogotvf7 Mar 13, 2024
d77c437
Fixed (?) issue on schema - lack of id on statistics table cause erro…
diogotvf7 Mar 13, 2024
b12af3c
Merge pull request #74 from NIAEFEUP/refactor/states-refactor
diogotvf7 Apr 6, 2024
0f4456b
Changed the name of the professor model values
diogotvf7 Apr 10, 2024
61c99c0
Merge pull request #76 from NIAEFEUP/fix/professor-endpoint-vals-names
diogotvf7 Apr 10, 2024
1894271
feat: endpoint to return course unit based on id
tomaspalma Jun 4, 2024
354844e
Merge pull request #78 from NIAEFEUP/feature/course-unit-by-id
thePeras Jun 7, 2024
eec5d36
readme: fix wrong command and detail on django caveats
tomaspalma Jun 11, 2024
f111adc
Merge pull request #81 from NIAEFEUP/documentation/improve-readme
thePeras Jun 14, 2024
36977f4
fix: redis url is now dynamic and not bound to folder name
tomaspalma Jun 24, 2024
f23726e
Merge pull request #83 from NIAEFEUP/fix/redis-url-dynamic
tomaspalma Jul 4, 2024
d92b254
refactor: some tables now have no autoincrement ids
tomaspalma Jul 19, 2024
4502ed9
Merge pull request #86 from NIAEFEUP/refactor/changes-to-sigarra-page
tomaspalma Jul 24, 2024
219a5f2
refactor: classes and slot relation is now many to many
tomaspalma Jul 25, 2024
8565e9c
refactor: classes endpoint now uses SlotClass
tomaspalma Jul 26, 2024
f4b63b8
docs: new schema image
tomaspalma Jul 29, 2024
a8265d1
Merge pull request #88 from NIAEFEUP/refactor/classes-slot-manytomany
tomaspalma Jul 29, 2024
973bd14
refactor: use more django orm features to speed up classes retrieval
tomaspalma Aug 5, 2024
178cb4e
Merge pull request #93 from NIAEFEUP/refactor/classes-retrieval-speedup
tomaspalma Aug 7, 2024
92043c6
Remove hard coded statistics
thePeras Aug 8, 2024
9595fc9
Merge remote-tracking branch 'origin/develop' into refactor/remove-st…
thePeras Aug 8, 2024
9049053
Merge pull request #94 from NIAEFEUP/refactor/remove-statistics
thePeras Aug 8, 2024
9d5d892
feat: added hash check endpoint using query params
jose-carlos-sousa Aug 13, 2024
fae9bbf
working with POST
jose-carlos-sousa Aug 13, 2024
10953f7
feat: implemented hash check endpoint
jose-carlos-sousa Aug 13, 2024
da80d5a
fix: removed files that shouldn't be tracked
jose-carlos-sousa Aug 13, 2024
d3224ae
fix: removed tests
jose-carlos-sousa Aug 13, 2024
9ca684a
removed stats and sql data file
jose-carlos-sousa Aug 13, 2024
9294c4a
fix : update endpoint name
jose-carlos-sousa Aug 13, 2024
21d7108
refactor: use postgres
limwa Aug 13, 2024
c092e0c
chore: some touchups
limwa Aug 13, 2024
9bffd7b
chore: prepare for niployments
limwa Aug 13, 2024
7611bb6
docs: update README
limwa Aug 13, 2024
ab638c4
fix: use dump of autogenerated models
limwa Aug 13, 2024
d86c89b
refactor: change order of columns
limwa Aug 13, 2024
d9d78d1
fix: added missing attributes to some primary keys
tomaspalma Aug 13, 2024
ef8020d
Merge branch 'refactor/postgres' of github.com:NIAEFEUP/tts-be into r…
tomaspalma Aug 13, 2024
6db6d48
fix: remove volume in makefile
limwa Aug 14, 2024
1bf78b2
ci: add staging environment
limwa Aug 14, 2024
7e41610
Merge pull request #96 from NIAEFEUP/refactor/postgres
limwa Aug 14, 2024
493a746
Merge pull request #97 from NIAEFEUP/chore/prepare-for-niployments
limwa Aug 14, 2024
5e255ff
update: now use query params and leave the verification up to the fro…
jose-carlos-sousa Aug 15, 2024
8e68d6f
Merge branch 'develop' of github.com:NIAEFEUP/tts-be into feature/has…
jose-carlos-sousa Aug 15, 2024
a2d0cbb
Delete django/statistics.sql
jose-carlos-sousa Aug 15, 2024
4094561
Merge pull request #95 from NIAEFEUP/feature/hashCheckEndpoint
jose-carlos-sousa Aug 15, 2024
3323140
Fix schedule start time and duration types
Process-ing Aug 20, 2024
9f2a23b
Merge pull request #99 from NIAEFEUP/fix/schedule-time-type
Process-ing Aug 20, 2024
dd4f6ab
Fix class/ endpoint to return all professors
Process-ing Aug 23, 2024
73c906f
Merge pull request #102 from NIAEFEUP/fix/multple-professor-lessons
tomaspalma Aug 23, 2024
57 changes: 0 additions & 57 deletions .github/workflows/ci.yml

This file was deleted.

22 changes: 22 additions & 0 deletions .github/workflows/niployments.yaml
@@ -0,0 +1,22 @@
name: Deploy

on:
push:
branches:
- main
- develop

jobs:
build:
runs-on: ubuntu-latest

steps:
- name: Upload to NIployments registry
uses: NIAEFEUP/[email protected]
with:
docker_dockerfile: Dockerfile
docker_context: ./django
docker_target: prod
NIPLOYMENTS_REGISTRY_URL: ${{ vars.NIPLOYMENTS_REGISTRY_URL }}
NIPLOYMENTS_REGISTRY_USERNAME: ${{ vars.NIPLOYMENTS_REGISTRY_USERNAME }}
NIPLOYMENTS_REGISTRY_PASSWORD: ${{ secrets.NIPLOYMENTS_REGISTRY_PASSWORD }}
9 changes: 4 additions & 5 deletions .gitignore
@@ -3,20 +3,19 @@ logs
*.log

# sigarra internal data cannot be committed
mysql/sql/01_dump_mysql.sql
postgres/sql/01_dump_postgres.sql

# dotenv environment variables file
.env

# mysql data
mysql/data/*
mysql/sql/01_data.sql
# postgres data
postgres/data/*
postgres/sql/01_data.sql
**__pycache__

# django
django/**/migrations/**
django/university/models.py
django/statistics.sql

# celery
django/celerybeat-schedule
12 changes: 6 additions & 6 deletions Makefile
@@ -1,6 +1,6 @@
.PHONY: all clean

MYSQL_DATA = ./mysql/sql
POSTGRES_DATA = ./postgres/sql

all: clean_database
@echo [EXECUTING] ./scripts/$(EXEC)
@@ -11,10 +11,10 @@ download: clean_fetcher clean_database
@-mkdir ./fetcher/data
@echo [DOWNLOADING] data from the source...
@docker-compose run fetcher python ./update_data/download.py
@echo [REMOVING] data from mysql...
@-rm $(MYSQL_DATA)/01_data.sql
@echo [REMOVING] data from postgres...
@-rm $(POSTGRES_DATA)/01_data.sql
@echo [MOVING] data from fetcher to sql...
@mv ./fetcher/data/* ./mysql/sql
@mv ./fetcher/data/* ./postgres/sql

upload:
@echo [UPLOADING] data...
@@ -27,5 +27,5 @@ clean_fetcher:
@-rm -r ./fetcher/data

clean_database:
@echo [CLEANING] Removing folder mysql/data...
@-rm -r ./mysql/data/
@echo [CLEANING] Removing database data...
@-docker volume rm tts_postgres_data || true
71 changes: 54 additions & 17 deletions README.md
@@ -1,45 +1,82 @@
# TTS - backend
The backend for timetable selector.
# TTS - Backend

The backend for the timetable selector, which is a platform that aims to help students better choose their class schedules by allowing them to see and play with all possible combinations.

Made with ❤️ by NIAEFEUP.

## Installation
### Prerequisites
- `docker`
- `docker-compose`
- `docker compose`

### Installing docker
To install docker, take a look at the [official website](https://www.docker.com/) and follow the [`Get docker`](https://docs.docker.com/get-docker/) section. If you're using Windows, make sure to have [`wsl`](https://docs.microsoft.com/en-us/windows/wsl/install) installed.

In case you're using linux, after installing docker check the [`Manage Docker as a non-root user`](https://docs.docker.com/engine/install/linux-postinstall/), so you can use docker without the `sudo` command.
In case you're using Linux, after installing docker check [`Manage Docker as a non-root user`](https://docs.docker.com/engine/install/linux-postinstall/) so you can use docker without the `sudo` command; this involves creating a docker user group.

## Data

The data is available the NIAEFEUP drive (Only for NIAEFEUP members):
The data is available at the NIAEFEUP drive (Only for NIAEFEUP members):

https://drive.google.com/drive/folders/1hyiwPwwPWhbAPeJm03c0MAo1HTF6s_zK?usp=sharing

- The ```00_schema_mysql.sql``` corresponds to the schema for the most recent data.

- Copy the ```01_data.sql``` and ```00_schema_mysql.sql``` of year and semester you desire to the ```mysql/sql``` folder.

- The ```00_schema_postgres.sql``` corresponds to the schema for the most recent data.

- Copy the ```01_data.sql``` and ```00_schema_postgres.sql``` of the year and semester you desire to the ```postgres/sql``` folder.

## Usage

### Development environment
You can start developing by building the local server with docker:

#### Building the container

After you have installed docker, go to the folder where you cloned this repository and run:

```bash
docker-compose build .
docker compose build
```

This will build the docker container for the backend.

In case you have __already built the server before and want to repopulate the database__, make sure you run

```bash
sudo make clean
```

We need to clean the database to repopulate it: the postgres container only runs the `sql` files present in the `postgres/sql` folder when the database is clean. This is why we need to issue `sudo make clean` in order for the insert SQL queries to be run.

#### Running the container

Before running docker, you have to create a `.env` file with the environment variables required for the backend to work.

```bash
cp .env.dev .env
```

In case you have __already build the server before and want to build it again__, be sure to delete the folder in `mysql/data`. You can do this by running `sudo rm -r mysql/data/`. To make your life easier, you can simply run the `build_dev.sh` script: `sudo ./build_dev.sh`.
> The sudo permission is nevessary to delete the `mysql/data` folder.
Then set the desired values in the `.env` file.

*The `.env` file is not kept in the repository in order to prevent sensitive information from being leaked. The file with default, unimportant values (`.env.dev`) serves as a template, while the real file with sensitive values is listed in `.gitignore` so it is never accidentally uploaded to GitHub.*

```bash
docker-compose up
docker compose up
```
#### Some django caveats after running the container

- The first time you run this docker container, or after you clean the database, you will need to wait for some time (5-10 minutes) until the database is populated. It is normal to see django giving a `115` error, since the database is busy populating itself and not yet ready to answer connection requests.

- Sometimes, on the first execution of this command, django will start giving a `2` error. If that happens, close the container with `docker compose down` and then bring it up with `docker compose up` again.

Like the build, the run step can also be executed with the `run_dev.sh` script: `./run_dev.sh`.

#### Accessing the development database

> __WARNING__: it's likely that the first execution of `docker-compose up` after building won't work, since django doesn't wait for the database being populated to executed. Thus, if that's your ccase, execute it again.
We are currently using `pgadmin`, which you can access as follows:

1. Go to `localhost:4000`

2. On the login screen, the credentials are as follows:

- Email: [email protected]
- Password: admin

This is fine, since this is only a development environment.
15 changes: 10 additions & 5 deletions django/.env.dev
@@ -1,8 +1,13 @@
DEBUG=0
SECRET_KEY=foo

MYSQL_DATABASE=tts
MYSQL_PASSWORD=root
MYSQL_USER=root
MYSQL_HOST=db
MYSQL_PORT=3306
POSTGRES_DB=tts
POSTGRES_USER=root
POSTGRES_PASSWORD=root
POSTGRES_HOST=db
POSTGRES_PORT=5432

TTS_REDIS_HOST=tts_redis
TTS_REDIS_PORT=6379
TTS_REDIS_USERNAME=
TTS_REDIS_PASSWORD=
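
These variables are presumably consumed by the Django settings module when building its `DATABASES` configuration. A minimal sketch (hypothetical settings fragment — only the variable names and development defaults come from `.env.dev` above):

```python
import os

# Hypothetical Django settings fragment: map the POSTGRES_* variables
# from .env.dev into Django's DATABASES configuration. The fallback
# values mirror the development defaults above.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "tts"),
        "USER": os.environ.get("POSTGRES_USER", "root"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", "root"),
        "HOST": os.environ.get("POSTGRES_HOST", "db"),
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}
```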
30 changes: 21 additions & 9 deletions django/Dockerfile
@@ -1,26 +1,38 @@
FROM python:3.8-slim-buster
# deps
FROM python:3.8-slim-buster AS deps

WORKDIR /usr/src/django/

# Gets the output from django in realtime.
ENV PYTHONUNBUFFERED 1
ENV STATISTICS_NAME tts_be
ENV STATISTICS_PASS batata_frita_123
ENV PYTHONUNBUFFERED=1

# Copy requirements
COPY ./requirements.txt ./requirements.txt

# Dependencies for mysqlclient
# Dependencies for building the requirements
RUN apt-get update
RUN apt-get -y install build-essential default-libmysqlclient-dev
RUN apt-get -y install build-essential

# Install mysql command to wait for the database initialization
RUN apt -y install default-mysql-client
# Install postgres dependencies (pgsql client and development files)
COPY ./etc/pgdg.sh /tmp/pgdg.sh
RUN /tmp/pgdg.sh

RUN apt -y install libpq-dev postgresql-client-16
RUN apt -y clean && rm -rf /var/lib/apt/lists/*

# Install the requirements
RUN pip install -r requirements.txt

EXPOSE 8000

COPY ./entrypoint.sh ./entrypoint.sh
ENTRYPOINT ["sh", "/usr/src/django/entrypoint.sh"]
ENTRYPOINT ["/usr/src/django/entrypoint.sh"]

# prod
FROM deps AS prod

COPY tts_be/ ./tts_be
COPY university/ ./university
COPY manage.py tasks.py ./

CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
14 changes: 6 additions & 8 deletions django/entrypoint.sh
100644 → 100755
@@ -1,24 +1,22 @@
#!bin/sh
#!/bin/sh

# WARNING: The script will not work if formated with CRLF.

# Configure the shell behaviour.
set -e
if [[ ${DEBUG} == 1 ]]
if [[ "${DEBUG}" == 1 ]]
then set -x
fi

# Get parameters.
database_host="$1" # The database host; the container name should be provided.
shift
cmd="$@"

# Waits for mysql initialization.
until mysql -h "$database_host" -u ${MYSQL_USER} -p${MYSQL_PASSWORD} ${MYSQL_DATABASE} -e 'select 1'; do
>&2 echo "MySQL is unavailable - sleeping"
# Waits for PostgreSQL initialization.
until PGPASSWORD="${POSTGRES_PASSWORD}" psql -h "${POSTGRES_HOST}" -U "${POSTGRES_USER}" "${POSTGRES_DB}" -c 'select 1'; do
>&2 echo "PostgreSQL is unavailable - sleeping"
sleep 4
done
>&2 echo "Mysql is up - executing command"
>&2 echo "PostgreSQL is up - executing command"

# Migrate the Django.
python manage.py inspectdb > university/models.py
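
The `until psql ... sleep 4` loop in `entrypoint.sh` is a generic wait-for-service pattern. The same idea, sketched in Python with a hypothetical `try_connect` callable standing in for the `psql -c 'select 1'` probe:

```python
import time

def wait_for_db(try_connect, attempts=10, delay=4):
    """Retry a connection probe until it succeeds, mirroring the
    until-psql loop in entrypoint.sh. try_connect() should return
    True once the database answers."""
    for _ in range(attempts):
        if try_connect():
            return True
        time.sleep(delay)
    return False

# Example with a stub that succeeds on the third probe:
probes = iter([False, False, True])
assert wait_for_db(lambda: next(probes), delay=0)
```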
15 changes: 15 additions & 0 deletions django/etc/pgdg.sh
@@ -0,0 +1,15 @@
#!/bin/sh

# Source: https://www.postgresql.org/download/linux/ubuntu/

# Import the repository signing key:
apt install -y curl ca-certificates postgresql-common lsb-release

install -d /usr/share/postgresql-common/pgdg
curl -o /usr/share/postgresql-common/pgdg/apt.postgresql.org.asc --fail https://www.postgresql.org/media/keys/ACCC4CF8.asc

# Create the repository configuration file:
sh -c 'echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'

# Update the package lists:
apt update
3 changes: 2 additions & 1 deletion django/requirements.txt
@@ -5,6 +5,7 @@ django-cors-headers==3.10.1
djangorestframework==3.11.0
pytz==2021.3
sqlparse==0.4.2
mysqlclient==1.4.6
psycopg2==2.9.9
celery==5.2.7
redis==3.5.3
python-dotenv==1.0.1
32 changes: 16 additions & 16 deletions django/tasks.py
@@ -1,23 +1,23 @@
from celery import Celery
from celery.schedules import crontab
import os
from dotenv import dotenv_values

app = Celery('tasks', broker="redis://tts-be-redis_service-1:6379")
CONFIG={
**dotenv_values(".env"), # load variables
**os.environ, # override loaded values with environment variables
}

username_password_str = ''
if os.getenv('TTS_REDIS_USERNAME') != '' and os.getenv('TTS_REDIS_PASSWORD') != '':
username_password_str = f"{os.getenv('TTS_REDIS_USERNAME')}:{os.getenv('TTS_REDIS_PASSWORD')}@"

app = Celery('tasks', broker=f"redis://{username_password_str}{os.getenv('TTS_REDIS_HOST')}:{os.getenv('TTS_REDIS_PORT')}")

# Gets called after celery sets up. Creates a worker that runs the dump_statistics function at midnight and noon everyday
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
sender.add_periodic_task(
crontab(minute='0', hour='0, 12'),
dump_statistics.s(),
name='dump statistics'
)
#@app.on_after_configure.connect
#def setup_periodic_tasks(sender, **kwargs):
# sender.add_periodic_task()



@app.task
def dump_statistics():
command = "mysqldump -P {} -h db -u {} -p{} {} statistics > statistics.sql".format(
os.environ["MYSQL_PORT"],
os.environ["MYSQL_USER"],
os.environ["MYSQL_PASSWORD"],
os.environ["MYSQL_DATABASE"])
os.system(command)
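
Note that `dump_statistics` still assembles a `mysqldump` command from `MYSQL_*` variables, even though this PR moves the stack to Postgres. A hedged sketch of what a `pg_dump` equivalent might look like (hypothetical, not part of this diff; variable names taken from `.env.dev`):

```python
def build_dump_command(env):
    """Build a pg_dump command for the statistics table, mirroring the
    shape of the old mysqldump invocation (hypothetical replacement,
    not part of this PR)."""
    return (
        "PGPASSWORD={} pg_dump -p {} -h db -U {} -t statistics {} "
        "> statistics.sql"
    ).format(
        env["POSTGRES_PASSWORD"],
        env["POSTGRES_PORT"],
        env["POSTGRES_USER"],
        env["POSTGRES_DB"],
    )

cmd = build_dump_command({
    "POSTGRES_PASSWORD": "root",
    "POSTGRES_PORT": "5432",
    "POSTGRES_USER": "root",
    "POSTGRES_DB": "tts",
})
```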