diff --git a/Coordination.md b/Coordination.md new file mode 100644 index 0000000000..ffe0fa699f --- /dev/null +++ b/Coordination.md @@ -0,0 +1,29 @@ + +DUE FRI AT 5pm +[PHASE 3 REQUIREMENTS](https://docs.google.com/document/d/1oaXD2gjbQTMcSbYllbsGI17IqQbSJP5T0lSpxT6BRAs/edit?tab=t.0) + +- [others_should_add_what_they_think_is_needed] Make sure readme is complete + - [ ] Be sure to link video presentation +- [x] add in more mock data via mockaroo or similar +- [ ] add in bridge tables (which imo is great news -- we wont have to build a front end for the a few features) +- [ ] Implement REST API to python (imo im not sure we need as many rows as we have -- which is also good news) +- [ ] Jose Landing Page + - [x] Jose Feature 1 + - 3.4 Done by Ryan, though its not pretty. I mostly did it to understand streamlit syntax + - [ ] Jose Feature 2 + - [ ] Jose Feature 3 +- [ ] Jack Landing Page + - [ ] Jack Feature 1 + - [ ] Jack Feature 2 + - [ ] Jack Feature 3 +- [ ] Alan Landing Page + - [ ] Alan Feature 1 + - [ ] Alan Feature 2 + - [ ] Alan Feature 3 +- [ ] Avery Landing Page + - [x] Avery Feature 1 + - 1.1 Done by Ryan. I interpreted it as just having subgoals, though that shouldnt be an issue. + - [ ] Avery Feature 2 + - 1.5 + - [ ] Avery Feature 3 + diff --git a/README.md b/README.md index 57559df051..0e9f53d867 100644 --- a/README.md +++ b/README.md @@ -1,108 +1,143 @@ -# Summer 2 2025 CS 3200 Project Template +# Summer 2 2025 CS 3200 Project - What is Goal Planner (Global GoalFlow)? -This is a template repo CS 3200 Summer 2 2025 Course Project. +Goal Planner is a comprehensive goal and habit management platform that transforms how people approach long-term achievement by making data work for them, not against them. Unlike traditional to-do apps that leave users drowning in endless lists, Goal Planner intelligently breaks down ambitious projects into manageable phases, automatically suggests next tasks when you complete something, and seamlessly integrates daily habits with major milestones. -It includes most of the infrastructure setup (containers), sample databases, and example UI pages. Explore it fully and ask questions! +By collecting and analyzing user progress patterns, deadline adherence, and completion rates, our app provides personalized insights that help users understand their productivity patterns and optimize their approach to goal achievement. + +We're building this for four distinct user types: individual achievers like freelancers and students who juggle multiple projects, professionals and researchers who need structured approaches to complex work, business analysts who require data-driven insights into team performance and goal completion rates, and system administrators who need robust, scalable platforms for managing user communities. + +This repo includes the infrastrucure setup, a MySQL database along with mock data, and example UI pages. + +### Project Members + +- Ryan Baylon +- Hyeyeon Seo +- Jaden Hu +- Rishik Kellar +- Fabrizio Flores + +--- ## Prerequisites -- A GitHub Account -- A terminal-based git client or GUI Git client such as GitHub Desktop or the Git plugin for VSCode. -- VSCode with the Python Plugin installed -- A distribution of Python running on your laptop. The distribution supported by the course is Anaconda or Miniconda. 
- - Create a new Python 3.11 environment in conda named `db-proj` by running: - ```bash - conda create -n db-proj python=3.11 - ``` - - Install the Python dependencies listed in `api/requirements.txt` and `app/src/requirements.txt` into your local Python environment. You can do this by running `pip install -r requirements.txt` in each respective directory. +Before starting, make sure you have: + +- A GitHub account +- Git client (terminal or GUI such as GitHub Desktop or Git plugin for VSCode) +- VSCode with the Python Plugin or your preferred IDE +- Docker and Docker Compose installed on your machine + +--- + +## Repo Structure +The repo is organized into five main directories: + +- `./app` – Frontend Streamlit app for user interaction. +- `./api` – Backend REST API (Flask) to handle business logic and database communication. +- `./database-files` – SQL scripts to initialize and seed the MySQL database with mock data. +- `./datasets` – Folder for datasets (if needed). +- `docker-compose.yaml` – Configuration to start the app, API, and MySQL database containers. + +--- + +## Database Setup + +We use a MySQL database named `global-GoalFlow`. The schema includes tables to manage users, goals, tasks, posts, tags, bug reports, and more, supporting the core functionality of Goal Planner. -## Structure of the Repo +### Key Tables Overview -- The repo is organized into five main directories: - - `./app` - the Streamlit app - - `./api` - the Flask REST API - - `./database-files` - SQL scripts to initialize the MySQL database - - `./datasets` - folder for storing datasets +- **users**: Stores user profiles, roles, contact info, and management relationships. +- **tags**: Categories for goals, posts, and tasks. +- **posts** & **post_reply**: Community forum posts and replies. +- **user_data**: Tracks user activity, devices, and login info. +- **bug_reports**: For tracking issues submitted by users. +- **consistent_tasks**, **daily_tasks**: Task management for recurring and daily items. +- **goals** & **subgoals**: Hierarchical goal tracking with status, priority, and deadlines. -- The repo also contains a `docker-compose.yaml` file that is used to set up the Docker containers for the front end app, the REST API, and MySQL database. +The database schema is designed to support role-based access, data integrity, and efficient queries with proper indexes and foreign keys. -## Suggestion for Learning the Project Code Base +--- -If you are not familiar with web app development, this code base might be confusing. But don't worry, we'll get through it together. Here are some suggestions for learning the code base: +## How to Build and Run -1. Have two versions of the template repo - one for you to individually explore and learn and another for your team's project implementation. -1. Start by exploring the `./app` directory. This is where the Streamlit app is located. The Streamlit app is a Python-based web app that is used to interact with the user. It's a great way to build a simple web app without having to learn a lot of web development. -1. Next, explore the `./api` directory. This is where the Flask REST API is located. The REST API is used to interact with the database and perform other server-side tasks. You might also consider this the "application logic" or "business logic" layer of your app. -1. Finally, explore the `./database-files` directory. This is where the SQL scripts are located that will be used to initialize the MySQL database. +### 1. 
Clone the Repository -### Setting Up Your Personal Testing Repo +```bash +git clone +cd +``` -**Before you start**: You need to have a GitHub account and a terminal-based git client or GUI Git client such as GitHub Desktop or the Git plugin for VSCode. +### 2. Set up Environment Variables +Copy the **.env.template** file inside the **api** folder and rename it to **.env**. Edit the .env file to include your database credentials and secrets. Make sure passwords are secure and unique. -1. Clone this repo to your local machine. - 1. You can do this by clicking the green "Code" button on the top right of the repo page and copying the URL. Then, in your terminal, run `git clone `. - 1. Or, you can use the GitHub Desktop app to clone the repo. See [this page](https://docs.github.com/en/desktop/adding-and-cloning-repositories/cloning-a-repository-from-github-to-github-desktop) of the GitHub Desktop Docs for more info. -1. Open the repository folder in VSCode. -1. Set up the `.env` file in the `api` folder based on the `.env.template` file. - 1. Make a copy of the `.env.template` file and name it `.env`. - 1. Open the new `.env` file. - 1. On the last line, delete the `<...>` placeholder text, and put a password. Don't reuse any passwords you use for any other services (email, etc.) -1. For running the testing containers (for your personal repo), you will tell `docker compose` to use a different configuration file than the typical one. The one you will use for testing is `sandbox.yaml`. - 1. `docker compose -f sandbox.yaml up -d` to start all the containers in the background - 1. `docker compose -f sandbox.yaml down` to shutdown and delete the containers - 1. `docker compose -f sandbox.yaml up db -d` only start the database container (replace db with api or app for the other two services as needed) - 1. `docker compose -f sandbox.yaml stop` to "turn off" the containers but not delete them. +### 3. Start Docker Containers +Use Docker Compose to start the full stack: -### Setting Up Your Team's Repo +```bash +docker compose up -d +``` +This will start: + - MySQL database container + - Flask REST API backend + - Streamlit frontend app -**Before you start**: As a team, one person needs to assume the role of _Team Project Repo Owner_. +To stop and remove containers: +```bash +docker compose down +``` -1. The Team Project Repo Owner needs to **fork** this template repo into their own GitHub account **and give the repo a name consistent with your project's name**. If you're worried that the repo is public, don't. Every team is doing a different project. -1. In the newly forked team repo, the Team Project Repo Owner should go to the **Settings** tab, choose **Collaborators and Teams** on the left-side panel. Add each of your team members to the repository with Write access. +### 4. Initialize the Database +Run the SQL scripts inside ./database-files to create tables and insert initial data: -**Remaining Team Members** +```bash +mysql -u -p < ./database-files/schema.sql +``` +Or connect to the running MySQL container and execute the scripts. -1. Each of the other team members will receive an invitation to join. -1. Once you have accepted the invitation, you should clone the Team's Project Repo to your local machine. -1. Set up the `.env` file in the `api` folder based on the `.env.template` file. -1. For running the testing containers (for your team's repo): - 1. `docker compose up -d` to start all the containers in the background - 1. `docker compose down` to shutdown and delete the containers - 1. 
`docker compose up db -d` only start the database container (replace db with api or app for the other two services as needed) - 1. `docker compose stop` to "turn off" the containers but not delete them. +--- -**Note:** You can also use the Docker Desktop GUI to start and stop the containers after the first initial run. +## User Personas & Stories +Persona 1: Avery - Freelance Designer + - Juggles client and personal projects. + - Needs task automation and habit tracking to stay consistent. + - Wants a visual dashboard for progress and deadlines. + - Requires space for creative ideas and manageable workflows. -## Handling User Role Access and Control +Persona 2: Dr. Alan - Professor + - Math professor balancing research and teaching. + - Needs categorized projects, priority control, and deadline management. + - Wants completed projects archived but accessible for reference. -In most applications, when a user logs in, they assume a particular role. For instance, when one logs in to a stock price prediction app, they may be a single investor, a portfolio manager, or a corporate executive (of a publicly traded company). Each of those _roles_ will likely present some similar features as well as some different features when compared to the other roles. So, how do you accomplish this in Streamlit? This is sometimes called Role-based Access Control, or **RBAC** for short. +Persona 3: Jose – System Administrator + - Oversees app scalability, user support, and community engagement. + - Requires bug tracking dashboard, user analytics, and payment plan insights. -The code in this project demonstrates how to implement a simple RBAC system in Streamlit but without actually using user authentication (usernames and passwords). The Streamlit pages from the original template repo are split up among 3 roles - Political Strategist, USAID Worker, and a System Administrator role (this is used for any sort of system tasks such as re-training ML model, etc.). It also demonstrates how to deploy an ML model. +Persona 4: Jack – Financial Analyst + - Tracks company goals and employee task completion. + - Needs subgoal checkboxes, deadlines, and aggregated progress reports. -Wrapping your head around this will take a little time and exploration of this code base. Some highlights are below. +--- -### Getting Started with the RBAC +### Features + - Automatic project phase generation prevents overwhelming long-term goals + - Intelligent task queuing surfaces next actionable items automatically + - Comprehensive analytics dashboards provide insights into productivity patterns + - Role-based access control supports users with distinct permissions and views + - Community forum for user discussions, bug reports, and feedback + - Task and goal hierarchy with tags, priorities, and scheduling -1. We need to turn off the standard panel of links on the left side of the Streamlit app. This is done through the `app/src/.streamlit/config.toml` file. So check that out. We are turning it off so we can control directly what links are shown. -1. Then I created a new python module in `app/src/modules/nav.py`. When you look at the file, you will se that there are functions for basically each page of the application. The `st.sidebar.page_link(...)` adds a single link to the sidebar. We have a separate function for each page so that we can organize the links/pages by role. -1. Next, check out the `app/src/Home.py` file. 
Notice that there are 3 buttons added to the page and when one is clicked, it redirects via `st.switch_page(...)` to that Roles Home page in `app/src/pages`. But before the redirect, I set a few different variables in the Streamlit `session_state` object to track role, first name of the user, and that the user is now authenticated. -1. Notice near the top of `app/src/Home.py` and all other pages, there is a call to `SideBarLinks(...)` from the `app/src/nav.py` module. This is the function that will use the role set in `session_state` to determine what links to show the user in the sidebar. -1. The pages are organized by Role. Pages that start with a `0` are related to the _Political Strategist_ role. Pages that start with a `1` are related to the _USAID worker_ role. And, pages that start with a `2` are related to The _System Administrator_ role. +--- +## Notes on User Roles and Access Control +Our platform implements a simple Role-Based Access Control (RBAC) system, differentiating between: + - Individual users (freelancers, researchers) + - Business analysts and managers + - System administrators -## Incorporating ML Models into your Project (Optional for CS 3200) +Each role experiences a customized view with access to features relevant to their needs and permissions. -_Note_: This project only contains the infrastructure for a hypothetical ML model. +--- -1. Collect and preprocess necessary datasets for your ML models. -1. Build, train, and test your ML model in a Jupyter Notebook. - - You can store your datasets in the `datasets` folder. You can also store your Jupyter Notebook in the `ml-src` folder. -1. Once your team is happy with the model's performance, convert your Jupyter Notebook code for the ML model to a pure Python script. - - You can include the `training` and `testing` functionality as well as the `prediction` functionality. - - Develop and test this pure Python script first in the `ml-src` folder. - - You may or may not need to include data cleaning, though. -1. Review the `api/backend/ml_models` module. In this folder, - - We've put a sample (read _fake_) ML model in the `model01.py` file. The `predict` function will be called by the Flask REST API to perform '_real-time_' prediction based on model parameter values that are stored in the database. **Important**: you would never want to hard code the model parameter weights directly in the prediction function. -1. The prediction route for the REST API is in `api/backend/customers/customer_routes.py`. Basically, it accepts two URL parameters and passes them to the `prediction` function in the `ml_models` module. The `prediction` route/function packages up the value(s) it receives from the model's `predict` function and send its back to Streamlit as JSON. -1. Back in streamlit, check out `app/src/pages/11_Prediction.py`. Here, I create two numeric input fields. When the button is pressed, it makes a request to the REST API URL `/c/prediction/.../...` function and passes the values from the two inputs as URL parameters. It gets back the results from the route and displays them. Nothing fancy here. \ No newline at end of file +## Contact & Support +For questions or bug reports, please open an issue in the GitHub repository or contact the system administrator (Ryan). 
\ No newline at end of file diff --git a/TO_ADD/00_Dr_Alan_Home.py b/TO_ADD/00_Dr_Alan_Home.py new file mode 100644 index 0000000000..c18e8eff5b --- /dev/null +++ b/TO_ADD/00_Dr_Alan_Home.py @@ -0,0 +1,35 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks + +st.set_page_config(layout = 'wide') + +# Show appropriate sidebar links for the role of the currently logged in user +#SideBarLinks() + +#st.title(f"Welcome Professor, {st.session_state['first_name']}.") +st.write('') +st.write('') +st.write('### What would you like to do today?') + +if st.button('Add new project', + type='primary', + use_container_width=True): + st.switch_page('pages/01_Add_New_Project.py') + +if st.button('View completed projects', + type='primary', + use_container_width=True): + st.switch_page('pages/02_Completed_Projects.py') + +if st.button('View project by tags', + type='primary', + use_container_width=True): + st.switch_page('pages/03_Project_Tags.py') + +if st.button('Manage planner and tasks', + type='primary', + use_container_width=True): + st.switch_page('pages/04_Planner_And_Tasks.py') \ No newline at end of file diff --git a/TO_ADD/01_Add_New_Project.py b/TO_ADD/01_Add_New_Project.py new file mode 100644 index 0000000000..c1db6c18a8 --- /dev/null +++ b/TO_ADD/01_Add_New_Project.py @@ -0,0 +1,39 @@ +import streamlit as st +import requests + +from modules.nav import SideBarLinks +SideBarLinks(show_home=True) + +st.title("Add New Project") + +# Form inputs +userID = st.text_input("User ID") +tagID = st.text_input("Tag ID") +title = st.text_input("Title") +notes = st.text_area("Notes") +status = st.selectbox("Status", ["onIce", "inProgress", "completed"]) +priority = st.slider("Priority", 1, 4, 4) +schedule = st.text_input("Deadline") + +if st.button("Submit"): + project_data = { + "userID": userID, + "tagID": tagID, + "title": title, + "notes": notes, + "status": status, + "priority": priority, + "completedAt": completedAt or None, + "schedule": schedule + } + + try: + # Replace this URL with your actual backend API URL + response = requests.post("http://localhost:4000/projects", json=project_data) + + if response.status_code == 200: + st.success("Project added successfully!") + else: + st.error(f"Failed to add project: {response.text}") + except Exception as e: + st.error(f"Error: {e}") diff --git a/TO_ADD/02_Completed_Projects.py b/TO_ADD/02_Completed_Projects.py new file mode 100644 index 0000000000..d7fd906486 --- /dev/null +++ b/TO_ADD/02_Completed_Projects.py @@ -0,0 +1,29 @@ +import streamlit as st +import requests + +st.title("Completed Projects") + +try: + response = requests.get("http://localhost:4000/projects/completedprojects") + + if response.status_code == 200: + projects = response.json() + + if projects: + for project in projects: + st.subheader(project.get('title', 'Untitled Project')) + st.write(f"Notes: {project.get('notes', 'No notes')}") + st.write(f"Priority: {project.get('priority', 'N/A')}") + st.write(f"Completed At: {project.get('completedAt', 'N/A')}") + st.write("---") + else: + st.info("No completed projects found.") + + elif response.status_code == 404: + st.info("No completed projects found.") + + else: + st.error(f"Error fetching completed projects: {response.status_code}") + +except Exception as e: + st.error(f"Failed to fetch projects: {e}") diff --git a/TO_ADD/Add_New_Project.py b/TO_ADD/Add_New_Project.py new file mode 100644 index 0000000000..d08f4926da --- /dev/null +++ b/TO_ADD/Add_New_Project.py @@ 
-0,0 +1,39 @@ +import streamlit as st +import requests + +from modules.nav import SideBarLinks +SideBarLinks(show_home=True) + +st.title("Add New Project") + +# Form inputs +userID = st.text_input("User ID") +tagID = st.text_input("Tag ID") +title = st.text_input("Title") +notes = st.text_area("Notes") +status = st.selectbox("Status", ["onIce", "inProgress", "completed"]) +priority = st.slider("Priority", 1, 10, 4) +schedule = st.text_input("Deadline") + +if st.button("Submit"): + project_data = { + "userID": userID, + "tagID": tagID, + "title": title, + "notes": notes, + "status": status, + "priority": priority, + "completedAt": completedAt or None, + "schedule": schedule + } + + try: + # Replace this URL with your actual backend API URL + response = requests.post("http://localhost:4000/projects", json=project_data) + + if response.status_code == 200: + st.success("Project added successfully!") + else: + st.error(f"Failed to add project: {response.text}") + except Exception as e: + st.error(f"Error: {e}") diff --git a/TO_ADD/Developer_View.py b/TO_ADD/Developer_View.py new file mode 100644 index 0000000000..8b3920c149 --- /dev/null +++ b/TO_ADD/Developer_View.py @@ -0,0 +1,37 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +import mysql.connector + + +st.set_page_config(layout = 'wide') + +@st.cache_resource +def init_connection(): + return mysql.connector.connect( + host="localhost", + port=3306, + user="root", + password="1203", + database="global-GoalFlow" + ) + +# Function to run queries +def run_query(query, params=None): + conn = init_connection() + cursor = conn.cursor() + cursor.execute(query, params or ()) + result = cursor.fetchall() + cursor.close() + conn.close() + return result + + +st.title(f"Welcome Jose!") + +emails = run_query("SELECT COUNT(email) FROM users") + +st.write("Total users on the app:") +st.write(emails[0][0]) + diff --git a/TO_ADD/On_Ice.py b/TO_ADD/On_Ice.py new file mode 100644 index 0000000000..70f717c7e7 --- /dev/null +++ b/TO_ADD/On_Ice.py @@ -0,0 +1,37 @@ +import streamlit as st +import mysql.connector +import pandas as pd + +st.title("Backlog") + +@st.cache_resource +def init_connection(): + return mysql.connector.connect( + host="localhost", + port=3306, + user="root", + password="1203", + database="global-GoalFlow" + ) + +def run_query(query, params=None): + conn = init_connection() + cursor = conn.cursor() + cursor.execute(query, params or ()) + result = cursor.fetchall() + cursor.close() + return result + +on_ice = run_query("SELECT title, notes FROM goals WHERE status = 'ON ICE'") +print(on_ice) + +for goal in on_ice: + col1, col2 = st.columns([4, 1]) + + with col1: + st.write(f"**{goal[1]}**") + + with col2: + if st.button("Activate", key=f"activate_{goal[0]}"): + # activate_goal(goal[0]) + st.rerun() \ No newline at end of file diff --git a/api/.env.template b/api/.env.template deleted file mode 100644 index 3a51ab40f9..0000000000 --- a/api/.env.template +++ /dev/null @@ -1,6 +0,0 @@ -SECRET_KEY=someCrazyS3cR3T!Key.! 
-DB_USER=root -DB_HOST=db -DB_PORT=3306 -DB_NAME=ngo_db -MYSQL_ROOT_PASSWORD= diff --git a/api/backend/blueprints/__init__.py b/api/backend/blueprints/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/api/backend/blueprints/persona1_avery.py b/api/backend/blueprints/persona1_avery.py new file mode 100644 index 0000000000..878f39b53a --- /dev/null +++ b/api/backend/blueprints/persona1_avery.py @@ -0,0 +1,199 @@ +from flask import Blueprint, request, current_app, jsonify +from backend.db_connection import db +from mysql.connector import Error +from datetime import datetime +import re + +def slugify(text): + text = re.sub(r"[^\w\s-]", "", text).strip().lower() + return re.sub(r"[-\s]+", "-", text) + +# Blueprint for Persona 1 (Avery) +persona1_bp = Blueprint("persona1", __name__, url_prefix="/persona1") + +# Avery's profile +@persona1_bp.route("/me", methods=["GET"]) +def me(): + + try: + user_id = int(request.args.get("userId", 1)) + conn = db.get_db() + cur = conn.cursor(dictionary=True) + + cur.execute("SELECT * FROM users WHERE id = %s", (user_id,)) + user = cur.fetchone() + if not user: + cur.close() + return jsonify({"error": "user not found"}), 404 + + cur.execute("SELECT * FROM user_data WHERE userId = %s ORDER BY (lastLogin IS NULL) ASC, lastLogin DESC, id DESC LIMIT 1", (user_id,)) + latest_data = cur.fetchone() + + cur.close() + return jsonify({"user": user, "latest_user_data": latest_data}), 200 + + except Error as e: + current_app.logger.exception("DB error") + return jsonify({"error": str(e)}), 500 + +# Posts +@persona1_bp.route("/posts", methods=["GET"]) +def list_posts(): + try: + user_id = int(request.args.get("userId", 1)) + limit = int(request.args.get("limit", 10)) + conn = db.get_db() + cur = conn.cursor(dictionary=True) + + cur.execute("SELECT p.*, t.name AS tagName, t.color AS tagColor FROM posts p LEFT JOIN tags t ON p.tag = t.id WHERE p.authorId = %s ORDER BY p.createdAt DESC, p.id DESC LIMIT %s", (user_id, limit)) + rows = cur.fetchall() + cur.close() + return jsonify(rows), 200 + + except Error as e: + current_app.logger.exception("DB error") + return jsonify({"error": str(e)}), 500 + +# Post create +@persona1_bp.route("/posts", methods=["POST"]) +def create_post(): + try: + data = request.get_json(force=True) or {} + user_id = int(data.get("userId", 1)) + title = (data.get("title") or "").strip() + content = data.get("content") + tag_id = data.get("tagId") + meta_title = data.get("metaTitle") + + if not title or not tag_id: + return jsonify({"error": "title and tagId are required"}), 400 + + base_slug = slugify(title) + slug = base_slug + now = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S") + + conn = db.get_db() + cur = conn.cursor() + + n = 1 + while True: + cur.execute("SELECT 1 FROM posts WHERE slug = %s", (slug,)) + exists = cur.fetchone() + if not exists: + break + n += 1 + slug = f"{base_slug}-{n}" + + cur.execute(""" + INSERT INTO posts (authorId, title, metaTitle, createdAt, publishedAt, slug, content, tag) + VALUES (%s, %s, %s, %s, %s, %s, %s, %s) + """, (user_id, title, meta_title, now, now, slug, content, int(tag_id))) + conn.commit() + new_id = cur.lastrowid + cur.close() + return jsonify({"message": "created", "post_id": new_id, "slug": slug}), 201 + + except Error as e: + current_app.logger.exception("DB error") + return jsonify({"error": str(e)}), 500 + +# Daily tasks +@persona1_bp.route("/daily-tasks", methods=["GET"]) +def list_daily_tasks(): + try: + user_id = int(request.args.get("userId", 1)) + date_str = 
request.args.get("date") + status = request.args.get("status") + + conn = db.get_db() + cur = conn.cursor(dictionary=True) + + q = """SELECT dt.*, CONCAT_WS(' ', u.firstName, u.lastName) AS userName + FROM daily_tasks dt + JOIN users u ON dt.userId = u.id + WHERE dt.userId = %s + """ + params = [user_id] + + if date_str: + q += " AND dt.schedule = %s" + params.append(date_str) + + if status is not None: + q += " AND dt.status = %s" + params.append(int(status)) + + q += " ORDER BY dt.schedule DESC, dt.id DESC" + + cur.execute(q, tuple(params)) + rows = cur.fetchall() + cur.close() + return jsonify(rows), 200 + + except Error as e: + current_app.logger.exception("DB error") + return jsonify({"error": str(e)}), 500 + +# Daily task delete +@persona1_bp.route("/daily-tasks/", methods=["DELETE"]) +def delete_daily_task(task_id: int): + try: + user_id = int(request.args.get("userId", 1)) + conn = db.get_db() + cur = conn.cursor(dictionary=True) + + cur.execute("SELECT id FROM daily_tasks WHERE id = %s AND userId = %s", (task_id, user_id)) + row = cur.fetchone() + if not row: + cur.close() + return jsonify({"error": "task not found"}), 404 + + cur = conn.cursor() + cur.execute("DELETE FROM daily_tasks WHERE id = %s", (task_id,)) + conn.commit() + cur.close() + return jsonify({"message": "deleted"}), 200 + + except Error as e: + current_app.logger.exception("DB error") + return jsonify({"error": str(e)}), 500 + +# Goal update +@persona1_bp.route("/goals/", methods=["PUT"]) +def update_goal(goal_id: int): + try: + user_id = int(request.args.get("userId", 1)) + data = request.get_json(force=True) or {} + + allowed = ["title", "notes", "onIce", "status", "priority", "completed", "schedule"] + sets, params = [], [] + + for k in allowed: + if k in data and data[k] is not None: + sets.append(f"{k} = %s") + params.append(data[k]) + + if not sets: + return jsonify({"error": "no valid fields to update"}), 400 + + conn = db.get_db() + cur = conn.cursor(dictionary=True) + + cur.execute("SELECT id FROM goals WHERE id = %s AND userId = %s", (goal_id, user_id)) + row = cur.fetchone() + if not row: + cur.close() + return jsonify({"error": "goal not found"}), 404 + + q = f"UPDATE goals SET {', '.join(sets)} WHERE id = %s" + params.append(goal_id) + + cur = conn.cursor() + cur.execute(q, tuple(params)) + conn.commit() + cur.close() + return jsonify({"message": "updated"}), 200 + + except Error as e: + current_app.logger.exception("DB error") + return jsonify({"error": str(e)}), 500 diff --git a/api/backend/consistent_tasks/consistent_tasks_routes.py b/api/backend/consistent_tasks/consistent_tasks_routes.py new file mode 100644 index 0000000000..93b2478f2e --- /dev/null +++ b/api/backend/consistent_tasks/consistent_tasks_routes.py @@ -0,0 +1,169 @@ +from flask import Blueprint, jsonify, request +from backend.db_connection import db +from mysql.connector import Error +from flask import current_app + +consistent_tasks = Blueprint("consistent_tasks", __name__) + +@consistent_tasks.route("/get_consistent_tasks", methods=["GET"]) +def get_all_tasks(): + try: + current_app.logger.info('Starting get_all_tasks request') + cursor = db.get_db().cursor() + + # Note: Query parameters are added after the main part of the URL. + # Here is an example: + # http://localhost:4000/ngo/ngos?founding_year=1971 + # founding_year is the query param. 
+ + # Get query parameters for filtering + title = request.args.get("title") + category = request.args.get("category") + notes = request.args.get("notes") + + current_app.logger.debug(f'Query parameters - title: {title}, category: {category}, notes: {notes}') + + # Prepare the Base query + query = "SELECT * FROM consistent_tasks WHERE 1=1" + params = [] + + # Add filters if provided + if title: + query += " AND title = %s" + params.append(title) + if category: + query += " AND category = %s" + params.append(category) + if notes: + query += " AND notes = %s" + params.append(notes) + + current_app.logger.debug(f'Executing query: {query} with params: {params}') + cursor.execute(query, params) + results = cursor.fetchall() + + # Get column names to map to dictionaries + columns = [col[0] for col in cursor.description] + consistent_tasks = [dict(zip(columns, row)) for row in results] + cursor.close() + + current_app.logger.info(f'Successfully retrieved {len(consistent_tasks)} consistent tasks') + return jsonify(consistent_tasks), 200 + except Error as e: + current_app.logger.error(f'Database error in get_all_tasks: {str(e)}') + return jsonify({"error": str(e)}), 500 + +@consistent_tasks.route("/create_consistent_task", methods=["POST"]) +def create_task(): + try: + data = request.get_json() + + # Validate required fields + required_fields = ["userId", "title", "slug"] + for field in required_fields: + if field not in data: + return jsonify({"error": f"Missing required field: {field}"}), 400 + + cursor = db.get_db().cursor() + + query = """ + INSERT INTO consistent_tasks (userId, title, slug, category, notes) + VALUES (%s, %s, %s, %s, %s) + """ + cursor.execute( + query, + ( + data["userId"], + data["title"], + data["slug"], + data.get("category", None), + data.get("notes", None), + ), + ) + + db.get_db().commit() + new_task_id = cursor.lastrowid + cursor.close() + + return ( + jsonify({"message": "task created successfully", "task_id": new_task_id}), + 201, + ) + except Error as e: + return jsonify({"error": str(e)}), 500 + +@consistent_tasks.route("/delete_consistent_task/", methods = ["DELETE"]) +def delete_task(task_id): + try: + cursor = db.get_db().cursor() + + cursor.execute("SELECT * FROM consistent_tasks WHERE id = %s", (task_id,)) + task = cursor.fetchone() + if not task: + return jsonify({"error": "task not found"}), 404 + + cursor.execute("DELETE FROM consistent_tasks WHERE id = %s", (task_id,)) + db.get_db().commit() + cursor.close() + + return jsonify({"message": "task deleted successfully"}), 200 + except Error as e: + return jsonify({"error": str(e)}), 500 + +@consistent_tasks.route("/consistent_task/", methods=["PUT"]) +def rename_task(task_id): + try: + data = request.get_json() + + # Check if NGO exists + cursor = db.get_db().cursor() + cursor.execute("SELECT * FROM consistent_tasks WHERE id = %s", (task_id,)) + if not cursor.fetchone(): + return jsonify({"error": "task not found"}), 404 + + # Build update query dynamically based on provided fields + update_fields = [] + params = [] + allowed_fields = ["title", "category", "notes"] + + for field in allowed_fields: + if field in data: + update_fields.append(f"{field} = %s") + params.append(data[field]) + + if not update_fields: + return jsonify({"error": "No valid fields to update"}), 400 + + params.append(task_id) + query = f"UPDATE consistent_tasks SET {', '.join(update_fields)} WHERE id = %s" + + cursor.execute(query, params) + db.get_db().commit() + cursor.close() + + return jsonify({"message": "task updated 
successfully"}), 200 + except Error as e: + return jsonify({"error": str(e)}), 500 + +@consistent_tasks.route("/consistent_task/", methods=["GET"]) +def get_consistent_task(task_id): + try: + cursor = db.get_db().cursor() + + cursor.execute("SELECT * FROM consistent_tasks WHERE id = %s", (task_id,)) + task_row = cursor.fetchone() + + if not task_row: + return jsonify({"error": "user not found"}), 404 + + columns = [col[0] for col in cursor.description] + consistent_tasks = dict(zip(columns, task_row)) + + cursor.close() + return jsonify(consistent_tasks), 200 + + except Error as e: + return jsonify({"error": str(e)}), 500 + + + diff --git a/api/backend/daily_tasks/daily_tasks_routes.py b/api/backend/daily_tasks/daily_tasks_routes.py new file mode 100644 index 0000000000..f35a1d5f0c --- /dev/null +++ b/api/backend/daily_tasks/daily_tasks_routes.py @@ -0,0 +1,168 @@ +from flask import Blueprint, jsonify, request +from backend.db_connection import db +from mysql.connector import Error +from flask import current_app + +daily_tasks = Blueprint("daily_tasks", __name__) + +@daily_tasks.route("/get_daily_tasks", methods=["GET"]) +def get_all_tasks(): + try: + current_app.logger.info('Starting get_all_tasks request') + cursor = db.get_db().cursor() + + # Note: Query parameters are added after the main part of the URL. + # Here is an example: + # http://localhost:4000/ngo/ngos?founding_year=1971 + # founding_year is the query param. + + # Get query parameters for filtering + title = request.args.get("title") + notes = request.args.get("notes") + + current_app.logger.debug(f'Query parameters - title: {title}, notes: {notes}') + + # Prepare the Base query + query = "SELECT * FROM daily_tasks WHERE 1=1" + params = [] + + # Add filters if provided + if title: + query += " AND title = %s" + params.append(title) + if notes: + query += " AND notes = %s" + params.append(notes) + + current_app.logger.debug(f'Executing query: {query} with params: {params}') + cursor.execute(query, params) + results = cursor.fetchall() + + # Get column names to map to dictionaries + columns = [col[0] for col in cursor.description] + daily_tasks = [dict(zip(columns, row)) for row in results] + cursor.close() + + current_app.logger.info(f'Successfully retrieved {len(daily_tasks)} daily tasks') + return jsonify(daily_tasks), 200 + except Error as e: + current_app.logger.error(f'Database error in get_all_tasks: {str(e)}') + return jsonify({"error": str(e)}), 500 + +@daily_tasks.route("/create_daily_task", methods=["POST"]) +def create_task(): + try: + data = request.get_json() + + # Validate required fields + required_fields = ["userId", "tagId", "title", "slug", "status", "completed"] + for field in required_fields: + if field not in data: + return jsonify({"error": f"Missing required field: {field}"}), 400 + + cursor = db.get_db().cursor() + + query = """ + INSERT INTO daily_tasks (userId, tagId, title, slug, status, completed, schedule, notes) + VALUES (%s, %s, %s, %s, %s, %s, %s, %s) + """ + cursor.execute( + query, + ( + data["userId"], + data["tagId"], + data["title"], + data["slug"], + data["status"], + data["completed"], + data.get("schedule", None), + data.get("notes", None) + ), + ) + + db.get_db().commit() + new_task_id = cursor.lastrowid + cursor.close() + + return ( + jsonify({"message": "task created successfully", "task_id": new_task_id}), + 201, + ) + except Error as e: + return jsonify({"error": str(e)}), 500 + +@daily_tasks.route("/delete_daily_task/", methods = ["DELETE"]) +def delete_task(task_id): + try: + 
cursor = db.get_db().cursor() + + cursor.execute("SELECT * FROM daily_tasks WHERE id = %s", (task_id,)) + task = cursor.fetchone() + if not task: + return jsonify({"error": "task not found"}), 404 + + cursor.execute("DELETE FROM daily_tasks WHERE id = %s", (task_id,)) + db.get_db().commit() + cursor.close() + + return jsonify({"message": "task deleted successfully"}), 200 + except Error as e: + return jsonify({"error": str(e)}), 500 + +@daily_tasks.route("/daily_task/", methods=["PUT"]) +def rename_task(task_id): + try: + data = request.get_json() + + # Check if NGO exists + cursor = db.get_db().cursor() + cursor.execute("SELECT * FROM daily_tasks WHERE id = %s", (task_id,)) + if not cursor.fetchone(): + return jsonify({"error": "task not found"}), 404 + + # Build update query dynamically based on provided fields + update_fields = [] + params = [] + allowed_fields = ["title", "completed", "notes"] + + for field in allowed_fields: + if field in data: + update_fields.append(f"{field} = %s") + params.append(data[field]) + + if not update_fields: + return jsonify({"error": "No valid fields to update"}), 400 + + params.append(task_id) + query = f"UPDATE daily_tasks SET {', '.join(update_fields)} WHERE id = %s" + + cursor.execute(query, params) + db.get_db().commit() + cursor.close() + + return jsonify({"message": "task updated successfully"}), 200 + except Error as e: + return jsonify({"error": str(e)}), 500 + +@daily_tasks.route("/daily_task/", methods=["GET"]) +def get_daily_task(task_id): + try: + cursor = db.get_db().cursor() + + cursor.execute("SELECT * FROM daily_tasks WHERE id = %s", (task_id,)) + task_row = cursor.fetchone() + + if not task_row: + return jsonify({"error": "user not found"}), 404 + + columns = [col[0] for col in cursor.description] + daily_tasks = dict(zip(columns, task_row)) + + cursor.close() + return jsonify(daily_tasks), 200 + + except Error as e: + return jsonify({"error": str(e)}), 500 + + + diff --git a/api/backend/goals/goal_routes.py b/api/backend/goals/goal_routes.py new file mode 100644 index 0000000000..62d4d9d332 --- /dev/null +++ b/api/backend/goals/goal_routes.py @@ -0,0 +1,200 @@ +from flask import Blueprint, jsonify, request +from backend.db_connection import db +from mysql.connector import Error +from flask import current_app + +# Create a Blueprint for NGO routes +goals = Blueprint("goals", __name__) + + +# Get all NGOs with optional filtering by country, focus area, and founding year +# Example: /ngo/ngos?country=United%20States&focus_area=Environmental%20Conservation +@goals.route("/goals", methods=["GET"]) +def get_all_goals(): + try: + current_app.logger.info('Starting get_all_goals request') + cursor = db.get_db().cursor() + + # Note: Query parameters are added after the main part of the URL. + # Here is an example: + # http://localhost:4000/ngo/ngos?founding_year=1971 + # founding_year is the query param. 
+ + # Get query parameters for filtering + title = request.args.get("title") + schedule = request.args.get("schedule") + notes = request.args.get("notes") + + current_app.logger.debug(f'Query parameters - title: {title}, schedule: {schedule}, notes: {notes}') + query = "SELECT id, title, notes, schedule FROM goals g WHERE g.status = 'ACTIVE' LIMIT 3;" + + + current_app.logger.debug(f'Executing query: {query}') + cursor.execute(query, params) + goals_data = cursor.fetchall() + cursor.close() + + current_app.logger.info(f'Successfully retrieved {len(goals_data)} NGOs') + return jsonify(goals_data), 200 + + except Error as e: + current_app.logger.error(f'Database error in get_all_ngos: {str(e)}') + return jsonify({"error": str(e)}), 500 + + +# Get detailed information about a specific NGO including its projects and donors +# # Example: /ngo/ngos/1 +# @goals.route("/goals/", methods=["GET"]) +# def get_ngo(ngo_id): +# try: +# cursor = db.get_db().cursor() + +# # Get NGO details +# cursor.execute("SELECT * FROM WorldNGOs WHERE NGO_ID = %s", (ngo_id,)) +# ngo = cursor.fetchone() + +# if not ngo: +# return jsonify({"error": "NGO not found"}), 404 + +# # Get associated projects then donors +# cursor.execute("SELECT * FROM Projects WHERE NGO_ID = %s", (ngo_id,)) +# projects = cursor.fetchall() + +# cursor.execute("SELECT * FROM Donors WHERE NGO_ID = %s", (ngo_id,)) +# donors = cursor.fetchall() + +# # Combine data from multiple related queries into one object to return (after jsonify) +# ngo["projects"] = projects +# ngo["donors"] = donors + +# cursor.close() +# return jsonify(ngo), 200 +# except Error as e: +# return jsonify({"error": str(e)}), 500 + + +# Create a new NGO +# Required fields: Name, Country, Founding_Year, Focus_Area, Website +# Example: POST /ngo/ngos with JSON body +# @ngos.route("/ngos", methods=["POST"]) +# def create_ngo(): +# try: +# data = request.get_json() + +# # Validate required fields +# required_fields = ["Name", "Country", "Founding_Year", "Focus_Area", "Website"] +# for field in required_fields: +# if field not in data: +# return jsonify({"error": f"Missing required field: {field}"}), 400 + +# cursor = db.get_db().cursor() + +# # Insert new NGO +# query = """ +# INSERT INTO WorldNGOs (Name, Country, Founding_Year, Focus_Area, Website) +# VALUES (%s, %s, %s, %s, %s) +# """ +# cursor.execute( +# query, +# ( +# data["Name"], +# data["Country"], +# data["Founding_Year"], +# data["Focus_Area"], +# data["Website"], +# ), +# ) + +# db.get_db().commit() +# new_ngo_id = cursor.lastrowid +# cursor.close() + +# return ( +# jsonify({"message": "NGO created successfully", "ngo_id": new_ngo_id}), +# 201, +# ) +# except Error as e: +# return jsonify({"error": str(e)}), 500 + + +# # Update an existing NGO's information +# # Can update any field except NGO_ID +# # Example: PUT /ngo/ngos/1 with JSON body containing fields to update +# @ngos.route("/ngos/", methods=["PUT"]) +# def update_ngo(ngo_id): +# try: +# data = request.get_json() + +# # Check if NGO exists +# cursor = db.get_db().cursor() +# cursor.execute("SELECT * FROM WorldNGOs WHERE NGO_ID = %s", (ngo_id,)) +# if not cursor.fetchone(): +# return jsonify({"error": "NGO not found"}), 404 + +# # Build update query dynamically based on provided fields +# update_fields = [] +# params = [] +# allowed_fields = ["Name", "Country", "Founding_Year", "Focus_Area", "Website"] + +# for field in allowed_fields: +# if field in data: +# update_fields.append(f"{field} = %s") +# params.append(data[field]) + +# if not update_fields: +# return 
jsonify({"error": "No valid fields to update"}), 400 + +# params.append(ngo_id) +# query = f"UPDATE WorldNGOs SET {', '.join(update_fields)} WHERE NGO_ID = %s" + +# cursor.execute(query, params) +# db.get_db().commit() +# cursor.close() + +# return jsonify({"message": "NGO updated successfully"}), 200 +# except Error as e: +# return jsonify({"error": str(e)}), 500 + + +# # Get all projects associated with a specific NGO +# # Example: /ngo/ngos/1/projects +# @ngos.route("/ngos//projects", methods=["GET"]) +# def get_ngo_projects(ngo_id): +# try: +# cursor = db.get_db().cursor() + +# # Check if NGO exists +# cursor.execute("SELECT * FROM WorldNGOs WHERE NGO_ID = %s", (ngo_id,)) +# if not cursor.fetchone(): +# return jsonify({"error": "NGO not found"}), 404 + +# # Get all projects for the NGO +# cursor.execute("SELECT * FROM Projects WHERE NGO_ID = %s", (ngo_id,)) +# projects = cursor.fetchall() +# cursor.close() + +# return jsonify(projects), 200 +# except Error as e: +# return jsonify({"error": str(e)}), 500 + + +# # Get all donors associated with a specific NGO +# # Example: /ngo/ngos/1/donors +# @ngos.route("/ngos//donors", methods=["GET"]) +# def get_ngo_donors(ngo_id): +# try: +# cursor = db.get_db().cursor() + +# # Check if NGO exists +# cursor.execute("SELECT * FROM WorldNGOs WHERE NGO_ID = %s", (ngo_id,)) +# if not cursor.fetchone(): +# return jsonify({"error": "NGO not found"}), 404 + +# # Get all donors for the NGO +# cursor.execute("SELECT * FROM Donors WHERE NGO_ID = %s", (ngo_id,)) +# donors = cursor.fetchall() +# cursor.close() + +# return jsonify(donors), 200 +# except Error as e: +# return jsonify({"error": str(e)}), 500 diff --git a/api/backend/ngos/ngo_routes.py b/api/backend/goals/ngo_routes.py similarity index 100% rename from api/backend/ngos/ngo_routes.py rename to api/backend/goals/ngo_routes.py diff --git a/api/backend/rest_entry.py b/api/backend/rest_entry.py index 2bba27f8a1..a923b1392b 100644 --- a/api/backend/rest_entry.py +++ b/api/backend/rest_entry.py @@ -7,6 +7,8 @@ from backend.db_connection import db from backend.simple.simple_routes import simple_routes from backend.ngos.ngo_routes import ngos +from api.backend.blueprints.persona1_avery import persona1_avery_bp + def create_app(): app = Flask(__name__) @@ -47,6 +49,7 @@ def create_app(): app.logger.info("create_app(): registering blueprints with Flask app object.") app.register_blueprint(simple_routes) app.register_blueprint(ngos, url_prefix="/ngo") + app.register_blueprint(persona1_avery_bp, url_prefix='/persona1') # Don't forget to return the app object return app diff --git a/api/backend/simple/playlist.py b/api/backend/simple/playlist.py deleted file mode 100644 index a9e7a9ef03..0000000000 --- a/api/backend/simple/playlist.py +++ /dev/null @@ -1,129 +0,0 @@ -# ------------------------------------------------------------ -# Sample data for testing generated by ChatGPT -# ------------------------------------------------------------ - -sample_playlist_data = { - "playlist": { - "id": "37i9dQZF1DXcBWIGoYBM5M", - "name": "Chill Hits", - "description": "Relax and unwind with the latest chill hits.", - "owner": { - "id": "spotify_user_123", - "display_name": "Spotify User" - }, - "tracks": { - "items": [ - { - "track": { - "id": "3n3Ppam7vgaVa1iaRUc9Lp", - "name": "Lose Yourself", - "artists": [ - { - "id": "1dfeR4HaWDbWqFHLkxsg1d", - "name": "Eminem" - } - ], - "album": { - "id": "1ATL5GLyefJaxhQzSPVrLX", - "name": "8 Mile" - }, - "duration_ms": 326000, - "track_number": 1, - "disc_number": 1, - 
"preview_url": "https://p.scdn.co/mp3-preview/lose-yourself.mp3", - "uri": "spotify:track:3n3Ppam7vgaVa1iaRUc9Lp" - } - }, - { - "track": { - "id": "7ouMYWpwJ422jRcDASZB7P", - "name": "Blinding Lights", - "artists": [ - { - "id": "0fW8E0XdT6aG9aFh6jGpYo", - "name": "The Weeknd" - } - ], - "album": { - "id": "1ATL5GLyefJaxhQzSPVrLX", - "name": "After Hours" - }, - "duration_ms": 200040, - "track_number": 9, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/blinding-lights.mp3", - "uri": "spotify:track:7ouMYWpwJ422jRcDASZB7P" - } - }, - { - "track": { - "id": "4uLU6hMCjMI75M1A2tKUQC", - "name": "Shape of You", - "artists": [ - { - "id": "6eUKZXaKkcviH0Ku9w2n3V", - "name": "Ed Sheeran" - } - ], - "album": { - "id": "3fMbdgg4jU18AjLCKBhRSm", - "name": "Divide" - }, - "duration_ms": 233713, - "track_number": 4, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/shape-of-you.mp3", - "uri": "spotify:track:4uLU6hMCjMI75M1A2tKUQC" - } - }, - { - "track": { - "id": "0VjIjW4GlUZAMYd2vXMi3b", - "name": "Levitating", - "artists": [ - { - "id": "4tZwfgrHOc3mvqYlEYSvVi", - "name": "Dua Lipa" - } - ], - "album": { - "id": "7dGJo4pcD2V6oG8kP0tJRR", - "name": "Future Nostalgia" - }, - "duration_ms": 203693, - "track_number": 5, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/levitating.mp3", - "uri": "spotify:track:0VjIjW4GlUZAMYd2vXMi3b" - } - }, - { - "track": { - "id": "6habFhsOp2NvshLv26DqMb", - "name": "Sunflower", - "artists": [ - { - "id": "1dfeR4HaWDbWqFHLkxsg1d", - "name": "Post Malone" - }, - { - "id": "0C8ZW7ezQVs4URX5aX7Kqx", - "name": "Swae Lee" - } - ], - "album": { - "id": "6k3hyp4efgfHP5GMVd3Agw", - "name": "Spider-Man: Into the Spider-Verse (Soundtrack)" - }, - "duration_ms": 158000, - "track_number": 3, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/sunflower.mp3", - "uri": "spotify:track:6habFhsOp2NvshLv26DqMb" - } - } - ] - }, - "uri": "spotify:playlist:37i9dQZF1DXcBWIGoYBM5M" - } -} \ No newline at end of file diff --git a/api/backend/simple/simple_routes.py b/api/backend/simple/simple_routes.py deleted file mode 100644 index a753d14c50..0000000000 --- a/api/backend/simple/simple_routes.py +++ /dev/null @@ -1,98 +0,0 @@ -from flask import ( - Blueprint, - request, - jsonify, - make_response, - current_app, - redirect, - url_for, -) -import json -from backend.db_connection import db -from backend.simple.playlist import sample_playlist_data -from backend.ml_models import model01 - -# This blueprint handles some basic routes that you can use for testing -simple_routes = Blueprint("simple_routes", __name__) - - -# ------------------------------------------------------------ -# / is the most basic route -# Once the api container is started, in a browser, go to -# localhost:4000/playlist -@simple_routes.route("/") -def welcome(): - current_app.logger.info("GET / handler") - welcome_message = "

Welcome to the CS 3200 Project Template REST API" - response = make_response(welcome_message) - response.status_code = 200 - return response - - -# ------------------------------------------------------------ -# /playlist returns the sample playlist data contained in playlist.py -# (imported above) -@simple_routes.route("/playlist") -def get_playlist_data(): - current_app.logger.info("GET /playlist handler") - response = make_response(jsonify(sample_playlist_data)) - response.status_code = 200 - return response - - -# ------------------------------------------------------------ -@simple_routes.route("/niceMesage", methods=["GET"]) -def affirmation(): - message = """ -

Think about it...

-
- You only need to be 1% better today than you were yesterday! - """ - response = make_response(message) - response.status_code = 200 - return response - - -# ------------------------------------------------------------ -# Demonstrates how to redirect from one route to another. -@simple_routes.route("/message") -def mesage(): - return redirect(url_for(affirmation)) - - -@simple_routes.route("/data") -def getData(): - current_app.logger.info("GET /data handler") - - # Create a simple dictionary with nested data - data = {"a": {"b": "123", "c": "Help"}, "z": {"b": "456", "c": "me"}} - - response = make_response(jsonify(data)) - response.status_code = 200 - return response - - -@simple_routes.route("/prediction//", methods=["GET"]) -def get_prediction(var_01, var_02): - current_app.logger.info("GET /prediction handler") - - try: - # Call prediction function from model01 - prediction = model01.predict(var_01, var_02) - current_app.logger.info(f"prediction value returned is {prediction}") - - response_data = { - "prediction": prediction, - "input_variables": {"var01": var_01, "var02": var_02}, - } - - response = make_response(jsonify(response_data)) - response.status_code = 200 - return response - - except Exception as e: - response = make_response( - jsonify({"error": "Error processing prediction request"}) - ) - response.status_code = 500 - return response diff --git a/api/backend/support/support_routes.py b/api/backend/support/support_routes.py new file mode 100644 index 0000000000..afa5baa9bb --- /dev/null +++ b/api/backend/support/support_routes.py @@ -0,0 +1,157 @@ +from flask import Blueprint, jsonify, request +from backend.db_connection import db +from mysql.connector import Error +from flask import current_app + +support = Blueprint("support", __name__) + +support.route("/bug_reports", methods=["GET"]) +def get_bug_reports(): + try: + current_app.logger.info('Starting get_bug_reports request') + cursor = db.get_db().cursor() + + # Get query parameters for filtering + userId = request.args.get("userId") + title = request.args.get("title") + description = request.args.get("description") + status = request.args.get("status") + priority = request.args.get("priority") + + current_app.logger.debug(f'Query parameters - userId: {userId}, title: {title}, description: {description}, status: {status}, priority: {priority}') + + # Prepare the Base query + query = "SELECT * FROM bug_reports WHERE 1=1" + params = [] + + # Add filters if provided + if userId: + query += " AND userId = %s" + params.append(userId) + if title: + query += " AND title = %s" + params.append(title) + if description: + query += " AND description = %s" + params.append(description) + if status: + query += " AND status = %s" + params.append(status) + if priority: + query += " AND priority = %s" + params.append(priority) + + current_app.logger.debug(f'Executing query: {query} with params: {params}') + cursor.execute(query, params) + results = cursor.fetchall() + + # Get column names to map to dictionaries + columns = [col[0] for col in cursor.description] + bug_reports = [dict(zip(columns, row)) for row in results] + cursor.close() + + current_app.logger.info(f'Successfully retrieved {len(bug_reports)} bug reports') + return jsonify(bug_reports), 200 + except Error as e: + current_app.logger.error(f'Database error in get_bug_reports: {str(e)}') + return jsonify({"error": str(e)}), 500 + +@support.route("/bug_reports/", methods=["PUT"]) +def archive_bug_report(bug_report_id): + try: + data = request.get_json() + + cursor = 
db.get_db().cursor() + cursor.execute("SELECT * FROM bug_reports WHERE id = %s", (bug_report_id,)) + if not cursor.fetchone(): + return jsonify({"error": "tag not found"}), 404 + + query = 'UPDATE bug_reports SET status = 1 WHERE id = %s' + cursor.execute(query, (bug_report_id,)) + db.get_db().commit() + cursor.close() + + return jsonify({"message": "bug report updated successfully"}), 200 + except Error as e: + return jsonify({"error": str(e)}), 500 + +@support.route("/post_reply/", methods=["GET"]) +def get_post_replies(user_id): + try: + cursor = db.get_db().cursor() + + # Get NGO details + cursor.execute("SELECT * FROM post_reply WHERE userId = %s", (user_id,)) + post_rows = cursor.fetchall() + + if not post_rows: + return jsonify({"error": "no replies from user"}), 404 + + columns = [col[0] for col in cursor.description] + post = dict(zip(columns, post_rows)) + + cursor.close() + return jsonify(post), 200 + + except Error as e: + return jsonify({"error": str(e)}), 500 + +@support.route("/post_reply", methods=["POST"]) +def create_post_reply(): + try: + data = request.get_json() + + # Validate required fields + required_fields = ["userId", "postId", "title", "createdAt", "tag"] + for field in required_fields: + if field not in data: + return jsonify({"error": f"Missing required field: {field}"}), 400 + + cursor = db.get_db().cursor() + + query = """ + INSERT INTO post_reply (userId, postId, title, createdAt, publishedAt, content, tag) + VALUES (%s, %s, %s, %s, %s, %s, %s) + """ + cursor.execute( + query, + ( + data["userId"], + data["postId"], + data["title"], + data["createdAt"], + data.get("publishedAt"), + data.get("content"), + data["tag"] + ), + ) + + db.get_db().commit() + new_reply_id = cursor.lastrowid + cursor.close() + + return ( + jsonify({"message": "post reply created successfully", "reply_id": new_reply_id}), + 201, + ) + except Error as e: + return jsonify({"error": str(e)}), 500 + +@support.route("/post_reply/", methods = ["DELETE"]) +def delete_tags(post_reply_id): + try: + cursor = db.get_db().cursor() + + cursor.execute("SELECT * FROM post_reply WHERE id = %s", (post_reply_id,)) + support = cursor.fetchone() + if not support: + return jsonify({"error": "post reply not found"}), 404 + + cursor.execute("DELETE FROM post_reply WHERE id = %s", (post_reply_id,)) + db.get_db().commit() + cursor.close() + + return jsonify({"message": "post reply deleted successfully"}), 200 + except Error as e: + return jsonify({"error": str(e)}), 500 + diff --git a/api/backend/tags/tags_routes.py b/api/backend/tags/tags_routes.py new file mode 100644 index 0000000000..6038821270 --- /dev/null +++ b/api/backend/tags/tags_routes.py @@ -0,0 +1,154 @@ +from flask import Blueprint, jsonify, request +from backend.db_connection import db +from mysql.connector import Error +from flask import current_app + +tags = Blueprint("tags", __name__) + +@tags.route("/get_tag", methods=["GET"]) +def get_all_tags(): + try: + current_app.logger.info('Starting get_all_tags request') + cursor = db.get_db().cursor() + + # Get query parameters for filtering + name = request.args.get("name") + color = request.args.get("color") + + current_app.logger.debug(f'Query parameters - name: {name}, color: {color}') + + # Prepare the Base query + query = "SELECT * FROM tags WHERE 1=1" + params = [] + + # Add filters if provided + if name: + query += " AND name = %s" + params.append(name) + if color: + query += " AND color = %s" + params.append(color) + + current_app.logger.debug(f'Executing query: {query} with params: 
diff --git a/api/backend/tags/tags_routes.py b/api/backend/tags/tags_routes.py
new file mode 100644
index 0000000000..6038821270
--- /dev/null
+++ b/api/backend/tags/tags_routes.py
@@ -0,0 +1,154 @@
+from flask import Blueprint, jsonify, request
+from backend.db_connection import db
+from mysql.connector import Error
+from flask import current_app
+
+tags = Blueprint("tags", __name__)
+
+@tags.route("/get_tag", methods=["GET"])
+def get_all_tags():
+    try:
+        current_app.logger.info('Starting get_all_tags request')
+        cursor = db.get_db().cursor()
+
+        # Get query parameters for filtering
+        name = request.args.get("name")
+        color = request.args.get("color")
+
+        current_app.logger.debug(f'Query parameters - name: {name}, color: {color}')
+
+        # Prepare the base query
+        query = "SELECT * FROM tags WHERE 1=1"
+        params = []
+
+        # Add filters if provided
+        if name:
+            query += " AND name = %s"
+            params.append(name)
+        if color:
+            query += " AND color = %s"
+            params.append(color)
+
+        current_app.logger.debug(f'Executing query: {query} with params: {params}')
+        cursor.execute(query, params)
+        results = cursor.fetchall()
+
+        # Get column names to map to dictionaries
+        # (named tag_list so it does not shadow the `tags` blueprint)
+        columns = [col[0] for col in cursor.description]
+        tag_list = [dict(zip(columns, row)) for row in results]
+        cursor.close()
+
+        current_app.logger.info(f'Successfully retrieved {len(tag_list)} tags')
+        return jsonify(tag_list), 200
+    except Error as e:
+        current_app.logger.error(f'Database error in get_all_tags: {str(e)}')
+        return jsonify({"error": str(e)}), 500
+
+@tags.route("/create_tag", methods=["POST"])
+def create_tag():
+    try:
+        data = request.get_json()
+
+        # Validate required fields
+        required_fields = ["color"]
+        for field in required_fields:
+            if field not in data:
+                return jsonify({"error": f"Missing required field: {field}"}), 400
+
+        cursor = db.get_db().cursor()
+
+        query = """
+            INSERT INTO tags (name, color)
+            VALUES (%s, %s)
+        """
+        cursor.execute(
+            query,
+            (
+                data.get("name"),
+                data["color"],
+            ),
+        )
+
+        db.get_db().commit()
+        new_tag_id = cursor.lastrowid
+        cursor.close()
+
+        return (
+            jsonify({"message": "tag created successfully", "tag_id": new_tag_id}),
+            201,
+        )
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
+@tags.route("/delete_tag/<int:tag_id>", methods=["DELETE"])
+def delete_tag(tag_id):
+    try:
+        cursor = db.get_db().cursor()
+
+        cursor.execute("SELECT * FROM tags WHERE id = %s", (tag_id,))
+        tag = cursor.fetchone()
+        if not tag:
+            return jsonify({"error": "tag not found"}), 404
+
+        cursor.execute("DELETE FROM tags WHERE id = %s", (tag_id,))
+        db.get_db().commit()
+        cursor.close()
+
+        return jsonify({"message": "tag deleted successfully"}), 200
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
+@tags.route("/rename_tag/<int:tag_id>", methods=["PUT"])
+def rename_tag(tag_id):
+    try:
+        data = request.get_json()
+
+        # Check if the tag exists
+        cursor = db.get_db().cursor()
+        cursor.execute("SELECT * FROM tags WHERE id = %s", (tag_id,))
+        if not cursor.fetchone():
+            return jsonify({"error": "tag not found"}), 404
+
+        # Build update query dynamically based on provided fields
+        update_fields = []
+        params = []
+        allowed_fields = ["name", "color"]
+
+        for field in allowed_fields:
+            if field in data:
+                update_fields.append(f"{field} = %s")
+                params.append(data[field])
+
+        if not update_fields:
+            return jsonify({"error": "No valid fields to update"}), 400
+
+        params.append(tag_id)
+        query = f"UPDATE tags SET {', '.join(update_fields)} WHERE id = %s"
+
+        cursor.execute(query, params)
+        db.get_db().commit()
+        cursor.close()
+
+        return jsonify({"message": "tag updated successfully"}), 200
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
+@tags.route("/tags/<int:tag_id>", methods=["GET"])
+def get_tag(tag_id):
+    try:
+        cursor = db.get_db().cursor()
+        cursor.execute("SELECT * FROM tags WHERE id = %s", (tag_id,))
+        tag_row = cursor.fetchone()
+
+        if not tag_row:
+            return jsonify({"error": "tag not found"}), 404
+
+        columns = [col[0] for col in cursor.description]
+        tag = dict(zip(columns, tag_row))
+
+        cursor.close()
+        return jsonify(tag), 200
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
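+# Usage sketch for the rename_tag route above (values hypothetical; only
+# "name" and "color" are accepted, and at least one must be present):
+#
+#   import requests
+#   resp = requests.put("http://localhost:4000/rename_tag/3",
+#                       json={"name": "Bugfix", "color": "#ff8800"})
+#   assert resp.status_code == 200
+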
diff --git a/api/backend/users/users_routes.py b/api/backend/users/users_routes.py
new file mode 100644
index 0000000000..3306782ce4
--- /dev/null
+++ b/api/backend/users/users_routes.py
@@ -0,0 +1,160 @@
+from flask import Blueprint, jsonify, request
+from backend.db_connection import db
+from mysql.connector import Error
+from flask import current_app
+
+users = Blueprint("users", __name__)
+
+@users.route("/users", methods=["GET"])
+def get_all_users():
+    try:
+        current_app.logger.info('Starting get_all_users request')
+        cursor = db.get_db().cursor()
+
+        # Get query parameters for filtering
+        firstname = request.args.get("firstName")
+        lastname = request.args.get("lastName")
+        phonenumber = request.args.get("phoneNumber")
+        email = request.args.get("email")
+        role = request.args.get("role")
+
+        current_app.logger.debug(f'Query parameters - firstName: {firstname}, lastName: {lastname}, phoneNumber: {phonenumber}, email: {email}, role: {role}')
+
+        # Prepare the base query
+        query = "SELECT * FROM users WHERE 1=1"
+        params = []
+
+        # Add filters if provided
+        if firstname:
+            query += " AND firstName = %s"
+            params.append(firstname)
+        if lastname:
+            query += " AND lastName = %s"
+            params.append(lastname)
+        if phonenumber:
+            query += " AND phoneNumber = %s"
+            params.append(phonenumber)
+        if email:
+            query += " AND email = %s"
+            params.append(email)
+        if role:
+            query += " AND role = %s"
+            params.append(role)
+
+        current_app.logger.debug(f'Executing query: {query} with params: {params}')
+        cursor.execute(query, params)
+        results = cursor.fetchall()
+
+        # Get column names to map to dictionaries
+        # (named user_list so it does not shadow the `users` blueprint)
+        columns = [col[0] for col in cursor.description]
+        user_list = [dict(zip(columns, row)) for row in results]
+        cursor.close()
+
+        current_app.logger.info(f'Successfully retrieved {len(user_list)} users')
+        return jsonify(user_list), 200
+    except Error as e:
+        current_app.logger.error(f'Database error in get_all_users: {str(e)}')
+        return jsonify({"error": str(e)}), 500
+
+@users.route("/get_user/<int:user_id>", methods=["GET"])
+def get_user(user_id):
+    try:
+        cursor = db.get_db().cursor()
+
+        cursor.execute("SELECT * FROM users WHERE id = %s", (user_id,))
+        user_row = cursor.fetchone()
+
+        if not user_row:
+            return jsonify({"error": "user not found"}), 404
+
+        columns = [col[0] for col in cursor.description]
+        user = dict(zip(columns, user_row))
+
+        cursor.close()
+        return jsonify(user), 200
+
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
+@users.route("/create_users", methods=["POST"])
+def create_user():
+    try:
+        data = request.get_json()
+
+        # Validate required fields
+        required_fields = ["firstName", "lastName", "email", "role"]
+        for field in required_fields:
+            if field not in data:
+                return jsonify({"error": f"Missing required field: {field}"}), 400
+
+        cursor = db.get_db().cursor()
+
+        query = """
+            INSERT INTO users (firstName, middleName, lastName, phoneNumber, email, role, planType, manages)
+            VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
+        """
+        cursor.execute(
+            query,
+            (
+                data["firstName"],
+                data.get("middleName"),
+                data["lastName"],
+                data.get("phoneNumber"),  # optional, so .get() avoids a KeyError
+                data["email"],
+                data["role"],
+                data.get("planType", "plan_name"),
+                data.get("manages")
+            ),
+        )
+
+        db.get_db().commit()
+        new_user_id = cursor.lastrowid
+        cursor.close()
+
+        return (
+            jsonify({"message": "user created successfully", "user_id": new_user_id}),
+            201,
+        )
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
+@users.route("/delete_users/<int:user_id>", methods=["DELETE"])
+def delete_user(user_id):
+    try:
+        cursor = db.get_db().cursor()
+
+        cursor.execute("SELECT * FROM users WHERE id = %s", (user_id,))
+        user = cursor.fetchone()
+        if not user:
+            return jsonify({"error": "user not found"}), 404
+
+        cursor.execute("DELETE FROM users WHERE id = %s", (user_id,))
+        db.get_db().commit()
+        cursor.close()
+
+        return jsonify({"message": "user deleted successfully"}), 200
+    except Error as e:
+        return jsonify({"error": str(e)}), 500
+
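+# Example payload for /create_users (values hypothetical; firstName,
+# lastName, email, and role are required, everything else is optional):
+#
+#   import requests
+#   new_user = {"firstName": "Ada", "lastName": "Lovelace",
+#               "email": "ada@example.com", "role": "user"}
+#   resp = requests.post("http://localhost:4000/create_users", json=new_user)
+#   user_id = resp.json()["user_id"]
+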
methods=["GET"]) +def get_user_data(user_data_id): + try: + cursor = db.get_db().cursor() + + cursor.execute("SELECT * FROM user_data WHERE id = %s", (user_data_id,)) + user_row = cursor.fetchone() + + if not user_row: + return jsonify({"error": "user not found"}), 404 + + columns = [col[0] for col in cursor.description] + user_data = dict(zip(columns, user_row)) + + cursor.close() + return jsonify(user_data), 200 + + except Error as e: + return jsonify({"error": str(e)}), 500 + diff --git a/app/src/Home.py b/app/src/Home.py index abe97588aa..45c127c56f 100644 --- a/app/src/Home.py +++ b/app/src/Home.py @@ -75,5 +75,10 @@ st.session_state['first_name'] = 'SysAdmin' st.switch_page('pages/20_Admin_Home.py') - - +if st.button('Goal Dashboard', + type = 'primary', + use_container_width=False): + st.session_state['authenticated'] = True + st.session_state['role'] = 'administrator' + st.session_state['first_name'] = 'SysAdmin' + st.switch_page('pages/01_Dashboard.py') \ No newline at end of file diff --git a/app/src/pages/00_Dr_Alan_Home_Page.py b/app/src/pages/00_Dr_Alan_Home_Page.py new file mode 100644 index 0000000000..c18e8eff5b --- /dev/null +++ b/app/src/pages/00_Dr_Alan_Home_Page.py @@ -0,0 +1,35 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks + +st.set_page_config(layout = 'wide') + +# Show appropriate sidebar links for the role of the currently logged in user +#SideBarLinks() + +#st.title(f"Welcome Professor, {st.session_state['first_name']}.") +st.write('') +st.write('') +st.write('### What would you like to do today?') + +if st.button('Add new project', + type='primary', + use_container_width=True): + st.switch_page('pages/01_Add_New_Project.py') + +if st.button('View completed projects', + type='primary', + use_container_width=True): + st.switch_page('pages/02_Completed_Projects.py') + +if st.button('View project by tags', + type='primary', + use_container_width=True): + st.switch_page('pages/03_Project_Tags.py') + +if st.button('Manage planner and tasks', + type='primary', + use_container_width=True): + st.switch_page('pages/04_Planner_And_Tasks.py') \ No newline at end of file diff --git a/app/src/pages/01_Add_New_Project b/app/src/pages/01_Add_New_Project new file mode 100644 index 0000000000..01e01bf132 --- /dev/null +++ b/app/src/pages/01_Add_New_Project @@ -0,0 +1,38 @@ +import streamlit as st +import requests + +from modules.nav import SideBarLinks +SideBarLinks(show_home=True) + +st.title("Add New Project") + +# Form inputs +userID = st.text_input("User ID") +tagID = st.text_input("Tag ID") +title = st.text_input("Title") +notes = st.text_area("Notes") +status = st.selectbox("Status", ["onIce", "inProgress", "completed"]) +priority = st.slider("Priority", 1, 4, 4) +schedule = st.text_input("Deadline") + +if st.button("Submit"): + project_data = { + "userID": userID, + "tagID": tagID, + "title": title, + "notes": notes, + "status": status, + "priority": priority, + "schedule": schedule + } + + try: + # Replace this URL with your actual backend API URL + response = requests.post("http://localhost:4000/projects", json=project_data) + + if response.status_code == 200: + st.success("Project added successfully!") + else: + st.error(f"Failed to add project: {response.text}") + except Exception as e: + st.error(f"Error: {e}") diff --git a/app/src/pages/01_Dashboard.py b/app/src/pages/01_Dashboard.py new file mode 100644 index 0000000000..ef67317141 --- /dev/null +++ b/app/src/pages/01_Dashboard.py @@ -0,0 +1,85 
diff --git a/app/src/pages/01_Dashboard.py b/app/src/pages/01_Dashboard.py
new file mode 100644
index 0000000000..ef67317141
--- /dev/null
+++ b/app/src/pages/01_Dashboard.py
@@ -0,0 +1,85 @@
+import logging
+logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO)
+logger = logging.getLogger(__name__)
+
+from modules.nav import SideBarLinks
+import streamlit as st
+import mysql.connector
+
+# set_page_config must be the first Streamlit call on the page
+st.set_page_config(layout = 'wide')
+
+st.session_state['authenticated'] = False
+SideBarLinks(show_home=True)
+
+@st.cache_resource
+def init_connection():
+    return mysql.connector.connect(
+        host="localhost",  # inside the Docker network this would be the db service name
+        port=3306,
+        user="root",
+        password="1203",
+        database="global-GoalFlow"
+    )
+
+# Function to run queries
+def run_query(query, params=None):
+    conn = init_connection()
+    cursor = conn.cursor()
+    cursor.execute(query, params or ())
+    result = cursor.fetchall()
+    cursor.close()
+    return result
+
+st.write("# Welcome to GoalFlow!")
+st.write("What are we going to get done today?")
+
+left, right = st.columns(2)
+
+active = run_query("SELECT id, title, notes, schedule FROM goals g WHERE g.status = 'ACTIVE' LIMIT 3;")
+subgoals = run_query("SELECT g.id, sg.title FROM subgoals sg JOIN goals g ON g.id = sg.goalsId;")
+
+# completed = run_query("SELECT g.id, COUNT(sg.status) FROM subgoals sg JOIN goals g ON g.id = sg.goalsId WHERE sg.status = 'ARCHIVED' GROUP BY g.id;")
+# uncompleted = run_query("SELECT g.id, COUNT(sg.status) FROM subgoals sg JOIN goals g ON g.id = sg.goalsId WHERE sg.status IN ('ACTIVE', 'ON ICE') GROUP BY g.id;")
+
+with left:
+    # One card per active goal; zip() guards against fewer than 3 active goals
+    for i, (col, goal) in enumerate(zip(st.columns(3), active)):
+        with col:
+            st.header(f"Goal {chr(65 + i)}", divider=True)
+            st.subheader(goal[1])              # Goal title
+            st.write(f"Due :blue[{goal[3]}]")  # Due date
+            st.write(goal[2])                  # Description
+            st.write("### Subgoals")
+            for subgoal in subgoals:
+                if subgoal[0] == goal[0]:
+                    st.write(f"- {subgoal[1]}")
+
+with right:
+    st.header("Right Half")
+    st.write("Content for the right side")
+    if st.button('Add New Project',
+                 type = 'primary',
+                 use_container_width=False):
+        st.switch_page('pages/01_Add_New_Project.py')
+
diff --git a/database-files/01_gflow_db.sql b/database-files/01_gflow_db.sql
new file mode 100644
index 0000000000..3d4fbe118d
--- /dev/null
+++ b/database-files/01_gflow_db.sql
@@ -0,0 +1,297 @@
+-- Drop the old database, recreate it, then use it.
+DROP DATABASE IF EXISTS `global-GoalFlow`;
+CREATE DATABASE `global-GoalFlow`;
+USE `global-GoalFlow`;
+
+
+
+
+
+-- USERS TABLE
+DROP TABLE IF EXISTS users;
+CREATE TABLE IF NOT EXISTS users (
+firstName VARCHAR(50) NOT NULL,
+middleName VARCHAR(50),
+lastName VARCHAR(50) NOT NULL,
+phoneNumber VARCHAR(15),
+email VARCHAR(75) NOT NULL,
+role VARCHAR(50) NOT NULL,
+planType VARCHAR(50) NOT NULL DEFAULT 'plan_name',
+manages INT,
+id INT AUTO_INCREMENT NOT NULL, -- DIF.
NAME + +PRIMARY KEY (id), + +UNIQUE INDEX uq_idx_phoneNumber (phoneNumber), +UNIQUE INDEX uq_idx_email (email), +INDEX idx_manages (manages), +INDEX idx_role (role), + +FOREIGN KEY (manages) REFERENCES users(id) + ON DELETE SET NULL + ON UPDATE CASCADE +); + + + + + +-- TAGS TABLE +DROP TABLE IF EXISTS tags; +CREATE TABLE IF NOT EXISTS tags ( +name VARCHAR(50), +color VARCHAR(7) NOT NULL DEFAULT '#ffffff', +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +INDEX idx_name (name) +); + + + + + +-- POSTS TABLE +DROP TABLE IF EXISTS posts; +CREATE TABLE IF NOT EXISTS posts ( +authorId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(75) NOT NULL, +metaTitle VARCHAR(100), -- EXTRA ATTRIBUTE +createdAt DATETIME NOT NULL, +updatedAt DATETIME, -- EXTRA ATTRIBUTE +publishedAt DATETIME, -- EXTRA ATTRIBUTE +slug VARCHAR(100) NOT NULL, -- EXTRA ATTRIBUTE +content TEXT, +tag INT NOT NULL, +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +UNIQUE INDEX uq_idx_slug (slug), +INDEX idx_authorId (authorId), + +FOREIGN KEY (authorId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE, + +FOREIGN KEY (tag) REFERENCES tags(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); + + + + + +-- POST_REPLY TABLE +DROP TABLE IF EXISTS post_reply; +CREATE TABLE IF NOT EXISTS post_reply ( +userId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +postId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(100) NOT NULL, +createdAt DATETIME NOT NULL, +publishedAt DATETIME, -- EXTRA ATTRIBUTE +content TEXT, +tag INT NOT NULL, +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +INDEX index_userId (userId), +INDEX index_postId (postId), + +FOREIGN KEY (userId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE, + +FOREIGN KEY (postId) REFERENCES posts(id) +ON UPDATE RESTRICT +ON DELETE CASCADE +); + + + + + +-- USER_DATA TABLE +DROP TABLE IF EXISTS user_data; +CREATE TABLE IF NOT EXISTS user_data ( +userId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +location VARCHAR(100), -- city?, region?, general loc.? +totalTime INT UNSIGNED NOT NULL DEFAULT 0, +deviceType VARCHAR(50) NOT NULL, +age TINYINT UNSIGNED, +registeredAt DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, +lastLogin DATETIME, -- EXTRA ATTRIBUTE +isActive TINYINT(1) NOT NULL DEFAULT 1, -- EXTRA ATTRIBUTE +postCount INT UNSIGNED NOT NULL DEFAULT 0, -- EXTRA ATTRIBUTE +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +INDEX idx_userId (userId), +INDEX idx_deviceType (deviceType), +INDEX idx_lastLogin (lastLogin), + +FOREIGN KEY (userId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); + + + + + +-- BUG_REPORTS TABLE +DROP TABLE IF EXISTS bug_reports; +CREATE TABLE IF NOT EXISTS bug_reports ( +userId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(75) NOT NULL, +metaTitle VARCHAR(100), -- EXTRA ATTRIBUTE +slug VARCHAR(100) NOT NULL, -- EXTRA ATTRIBUTE +description TEXT, +dateReported DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, +status TINYINT(1) NOT NULL DEFAULT 0, -- 0 = Not Completed, 1 = Completed +priority TINYINT NOT NULL DEFAULT 4, -- where 1 = critical, 2 = high, 3 = medium, 4 = low +id INT AUTO_INCREMENT NOT NULL, -- DIF. 
NAME + +PRIMARY KEY (id), + +UNIQUE INDEX uq_slug (slug), +INDEX idx_status (status), +INDEX idx_priority (priority), +INDEX idx_dateReported (dateReported), + +FOREIGN KEY (userId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); + + + + + +-- CONSISTENT_TASKS TABLE +DROP TABLE IF EXISTS consistent_tasks; +CREATE TABLE IF NOT EXISTS consistent_tasks ( +userId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(75) NOT NULL, +metaTitle VARCHAR(100), -- EXTRA ATTRIBUTE +slug VARCHAR(100) NOT NULL, -- EXTRA ATTRIBUTE +category VARCHAR(100), +createdAt DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, +notes TEXT, +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +UNIQUE INDEX uq_slug (slug), +INDEX idx_category (category), +INDEX idx_createdAt (createdAt), + +FOREIGN KEY (userId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); + + + + + +-- DAILY_TASKS TABLE +DROP TABLE IF EXISTS daily_tasks; +CREATE TABLE IF NOT EXISTS daily_tasks ( +userId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +tagId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(75) NOT NULL, +metaTitle VARCHAR(100), -- EXTRA ATTRIBUTE +slug VARCHAR(100) NOT NULL, -- EXTRA ATTRIBUTE +status TINYINT(1) NOT NULL DEFAULT 0, +completed TINYINT(1) NOT NULL DEFAULT 0, +schedule DATE, +notes TEXT, +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +UNIQUE INDEX uq_slug (slug), +INDEX idx_userId (userId), +INDEX idx_schedule (schedule), +INDEX idx_status (status), + +FOREIGN KEY (userId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE, + +FOREIGN KEY (tagId) REFERENCES tags(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); + + + + + +-- GOALS TABLE +DROP TABLE IF EXISTS goals; +CREATE TABLE IF NOT EXISTS goals ( +userId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +tagId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(75) NOT NULL, +notes TEXT, +onIce TINYINT(1) NOT NULL DEFAULT 0, +status VARCHAR(50) NOT NULL DEFAULT 'ON ICE', -- ON ICE, ACTIVE, ARCHIVED +priority TINYINT NOT NULL DEFAULT 4, -- where 1 = critical, 2 = high, 3 = medium, 4 = low +createdAt DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, +completedAt DATETIME, +completed TINYINT(1) NOT NULL DEFAULT 0, -- EXTRA ATTRIBUTE, NOT SURE +schedule DATE, -- ADDED ? +id INT AUTO_INCREMENT NOT NULL, -- DIF. NAME + +PRIMARY KEY (id), + +INDEX idx_userId (userId), +INDEX idx_tagId (tagId), +INDEX idx_status (status), +INDEX idx_priority (priority), +INDEX idx_createdAt (createdAt), +INDEX idx_completedAt (completedAt), + +FOREIGN KEY (userId) REFERENCES users(id) + ON UPDATE RESTRICT + ON DELETE CASCADE, + +FOREIGN KEY (tagId) REFERENCES tags(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); + + + + + +-- SUBGOALS TABLE +DROP TABLE IF EXISTS subgoals; +CREATE TABLE IF NOT EXISTS subgoals ( +goalsId INT NOT NULL, -- NEEDED EXTRA ATTRIBUTE +title VARCHAR(75) NOT NULL, +notes TEXT, +status VARCHAR(50) NOT NULL DEFAULT 'ON ICE', -- ON ICE, ACTIVE, ARCHIVED +createdAt DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, -- EXTRA ATTRIBUTE +completedAt DATETIME, -- EXTRA ATTRIBUTE +completed TINYINT(1) NOT NULL DEFAULT 0, +schedule DATE, -- ADDED ? +id INT AUTO_INCREMENT NOT NULL, -- DIF. 
NAME + +PRIMARY KEY (id), + +INDEX idx_goalsId (goalsId), +INDEX idx_status (status), +INDEX idx_createdAt (createdAt), +INDEX idx_completedAt (completedAt), + +FOREIGN KEY (goalsId) REFERENCES goals(id) + ON UPDATE RESTRICT + ON DELETE CASCADE +); \ No newline at end of file diff --git a/database-files/02_gflow_mockdata.sql b/database-files/02_gflow_mockdata.sql new file mode 100644 index 0000000000..76b49b9a63 --- /dev/null +++ b/database-files/02_gflow_mockdata.sql @@ -0,0 +1,681 @@ +-- Enter the SQL database. +USE `global-GoalFlow`; +SET FOREIGN_KEY_CHECKS = 0; + + + + + +-- ========== USERS (40 rows) ========== +INSERT INTO users (id, firstName, middleName, lastName, phoneNumber, email, role, planType, manages) +VALUES +-- 1..4: Personas (must be first four) +(1, 'Avery', NULL, 'Reyes', '555-1001', 'avery.reyes@example.com', 'user', 'standard', NULL), -- Freelance Designer +(2, 'Alan', 'J.', 'Jaden', '555-1002', 'alan.jaden@university.edu', 'user', 'standard', NULL), -- Dr. Alan +(3, 'Jose', NULL, 'Rivera', '555-1003', 'jose@goalflow.co', 'admin', 'enterprise', NULL), -- Jose +(4, 'Jack', NULL, 'Morris', '555-1004', 'jack.morris@meta.example', 'manager', 'enterprise', NULL), -- Jack +-- 5..40: Additional realistic users to fill out dataset +(5, 'Taylor', NULL, 'Nguyen', '555-1005', 'taylor.nguyen@example.com', 'user', 'free', 2), +(6, 'Sam', NULL, 'Patel', '555-1006', 'sam.patel@example.com', 'user', 'free', 2), +(7, 'Maya', NULL, 'Ortiz', '555-1007', 'maya.ortiz@example.com', 'user', 'standard', 2), +(8, 'Priya', NULL, 'Singh', '555-1008', 'priya.singh@example.com', 'user', 'standard', 3), +(9, 'Omar', NULL, 'Khan', '555-1009', 'omar.khan@example.com', 'user', 'free', 3), +(10, 'Lena', NULL, 'Gonzalez', '555-1010', 'lena.gonzalez@example.com', 'user', 'standard', 3), +(11, 'Noah', NULL, 'White', '555-1011', 'noah.white@example.com', 'user', 'free', 2), +(12, 'Ivy', NULL, 'Cole', '555-1012', 'ivy.cole@example.com', 'user', 'standard', 5), +(13, 'Ben', NULL, 'Miller', '555-1013', 'ben.miller@example.com', 'user', 'free', 5), +(14, 'Rosa', NULL, 'Torres', '555-1014', 'rosa.torres@example.com', 'user', 'standard', 2), +(15, 'Ethan', NULL, 'Gomez', '555-1015', 'ethan.gomez@example.com', 'user', 'free', 3), +(16, 'Zoe', NULL, 'Hart', '555-1016', 'zoe.hart@example.com', 'user', 'standard', 3), +(17, 'Miguel', NULL, 'Reyes', '555-1017', 'miguel.reyes@example.com', 'user', 'free', 4), +(18, 'Lila', NULL, 'Bennett', '555-1018', 'lila.bennett@example.com', 'user', 'standard', 4), +(19, 'Connor', NULL, 'Walsh', '555-1019', 'connor.walsh@example.com', 'user', 'free', 3), +(20, 'Asha', NULL, 'Desai', '555-1020', 'asha.desai@example.com', 'user', 'standard', 2), +(21, 'Ravi', NULL, 'Kapoor', '555-1021', 'ravi.kapoor@example.com', 'user', 'free', 3), +(22, 'Nina', NULL, 'Park', '555-1022', 'nina.park@example.com', 'user', 'standard', 2), +(23, 'Carl', NULL, 'Bell', '555-1023', 'carl.bell@example.com', 'user', 'free', 5), +(24, 'Marta', NULL, 'Santos', '555-1024', 'marta.santos@example.com', 'user', 'standard', 5), +(25, 'Hank', NULL, 'Yates', '555-1025', 'hank.yates@example.com', 'user', 'free', 2), +(26, 'Gina', NULL, 'Ivers', '555-1026', 'gina.ivers@example.com', 'user', 'standard', 2), +(27, 'Leo', NULL, 'Park', '555-1027', 'leo.park@example.com', 'user', 'free', 4), +(28, 'Wendy', NULL, 'Lopez', '555-1028', 'wendy.lopez@example.com', 'user', 'standard', 4), +(29, 'Tomas', NULL, 'Silva', '555-1029', 'tomas.silva@example.com', 'user', 'free', 2), +(30, 'Kara', NULL, 'Evans', '555-1030', 
'kara.evans@example.com', 'user', 'standard', 2), +(31, 'Drew', NULL, 'Fleming', '555-1031', 'drew.fleming@example.com', 'user', 'free', 3), +(32, 'Hana', NULL, 'Kato', '555-1032', 'hana.kato@example.com', 'user', 'standard', 3), +(33, 'Oli', NULL, 'Simmons', '555-1033', 'oli.simmons@example.com', 'user', 'free', 2), +(34, 'Bea', NULL, 'Marshall', '555-1034', 'bea.marshall@example.com', 'user', 'standard', 3), +(35, 'Arun', NULL, 'Shah', '555-1035', 'arun.shah@example.com', 'user', 'free', 2), +(36, 'Jill', NULL, 'Parker', '555-1036', 'jill.parker@example.com', 'user', 'standard', 5), +(37, 'Kyle', NULL, 'Reid', '555-1037', 'kyle.reid@example.com', 'user', 'free', 3), +(38, 'Sofia', NULL, 'Mendez', '555-1038', 'sofia.mendez@example.com', 'user', 'standard', 4), +(39, 'Paul', NULL, 'Nash', '555-1039', 'paul.nash@example.com', 'user', 'free', 2), +(40, 'Rina', NULL, 'Okafor', '555-1040', 'rina.okafor@example.com', 'user', 'standard', 2); + + + + + +-- ========== TAGS (30 rows) ========== +INSERT INTO tags (id, name, color) +VALUES +(1, 'Urgent', '#ff0000'), +(2, 'Feature', '#00aaff'), +(3, 'Bug', '#ff8800'), +(4, 'Research', '#ffd700'), +(5, 'Personal', '#ff00ff'), +(6, 'Maintenance', '#008800'), +(7, 'LowPriority', '#cccccc'), +(8, 'HighPriority', '#cc0000'), +(9, 'UX', '#ff66cc'), +(10, 'Backend', '#3333ff'), +(11, 'Frontend', '#33ccff'), +(12, 'DevOps', '#00cc66'), +(13, 'Analytics', '#6600ff'), +(14, 'Design', '#ff9966'), +(15, 'QA', '#ff0066'), +(16, 'Security', '#990000'), +(17, 'Customer', '#0066ff'), +(18, 'Integration', '#009999'), +(19, 'Alpha', '#66ff66'), +(20, 'Experiment', '#9999ff'), +(21, 'Onboarding', '#ffcc00'), +(22, 'Docs', '#777777'), +(23, 'Mobile', '#cc99ff'), +(24, 'Web', '#339900'), +(25, 'Performance', '#ff4444'), +(26, 'Compliance', '#4444ff'), +(27, 'Refactor', '#aaaaaa'), +(28, 'Sprint', '#00aa88'), +(29, 'Holiday', '#ffb6c1'), +(30, 'Ideas', '#a0522d'); + + + + + +-- ========== POSTS (40 rows) ========== +INSERT INTO posts (id, authorId, title, metaTitle, createdAt, updatedAt, publishedAt, slug, content, tag) +VALUES +-- 1..8: Persona Posts +(1, 1, 'My Portfolio Process', NULL, '2025-05-03 09:00:00', NULL, '2025-05-03 09:00:00', 'portfolio-process', 'Phase-driven workflow: research → sketches → deliverables.', 14), +(2, 1, 'Daily Creative Habit Tips', NULL, '2025-06-01 08:30:00', NULL, '2025-06-01 08:30:00', 'creative-habit-tips', 'How to keep a daily sketch habit as a freelancer.', 5), +(3, 2, 'Balancing Research & Teaching', NULL, '2024-09-01 10:00:00', NULL, '2024-09-01 10:00:00', 'balancing-research-class', 'Strategies to keep research progress while teaching.', 4), +(4, 2, 'Robust Estimators — Paper Plan', NULL, '2025-01-15 11:00:00', NULL, '2025-01-15 11:00:00', 'robust-estimators-plan', 'Outline and milestones for the statistical paper.', 4), +(5, 3, 'Ops: How We Triage Bugs', NULL, '2024-06-10 09:00:00', NULL, '2024-06-10 09:00:00', 'ops-triage-bugs', 'Triage rules and prioritization for sysadmins.', 12), +(6, 3, 'Community AMA: Office Hours', NULL, '2025-07-01 12:00:00', NULL, '2025-07-01 12:00:00', 'community-ama', 'Monthly office hours: ask the founder anything.', 21), +(7, 4, 'Finance: Weekly P&L Snapshot', NULL, '2025-02-01 08:00:00', NULL, '2025-02-01 08:00:00', 'weekly-pnl-snapshot', 'How I review revenue and expenses each morning.', 25), +(8, 4, 'Plan to Increase Profit 5%', NULL, '2025-02-02 09:00:00', NULL, '2025-02-02 09:00:00', 'increase-profit-5pct', 'Tactics and daily adjustments to reach profit targets.', 25), +-- 8..40: Additional 
realistic users posts to fill out dataset +(9, 5, 'Onboarding Checklist Improvements', NULL, '2025-03-01 09:30:00', NULL, '2025-03-01 09:30:00', 'onboarding-checklist', 'Checklist to reduce time-to-value for new users.', 21), +(10, 6, 'Release Notes — April', NULL, '2025-04-02 10:00:00', NULL, '2025-04-02 10:00:00', 'release-notes-april', 'What shipped in April.', 2), +(11, 7, 'Design Tokens Update', NULL, '2025-05-05 11:00:00', NULL, '2025-05-05 11:00:00', 'design-tokens-update', 'Updated spacing and color tokens.', 14), +(12, 8, 'Performance Benchmark Results', NULL, '2025-05-10 15:00:00', NULL, '2025-05-10 15:00:00', 'perf-benchmarks', 'Benchmarks after DB and cache changes.', 25), +(13, 9, 'Mobile Layout Fixes', NULL, '2025-05-20 09:00:00', NULL, '2025-05-20 09:00:00', 'mobile-layout-fixes', 'Fixes for iOS and Android UI issues.', 23), +(14, 10, 'How to Write Better Docs', NULL, '2025-06-01 09:00:00', NULL, '2025-06-01 09:00:00', 'write-better-docs', 'Structure and tone guidance for docs.', 22), +(15, 11, 'QA: Running Regressions Locally', NULL, '2025-06-05 10:00:00', NULL, '2025-06-05 10:00:00', 'qa-regressions-local', 'How to run regression suite on dev machine.', 15), +(16, 12, 'User Story Examples', NULL, '2025-06-10 09:00:00', NULL, '2025-06-10 09:00:00', 'user-story-examples', 'Good templates for writing stories.', 28), +(17, 13, 'Analytics: Retention Cohort', NULL, '2025-07-01 08:00:00', NULL, '2025-07-01 08:00:00', 'retention-cohorts', 'Breaking down cohorts for retention analysis.', 13), +(18, 14, 'Security Checklist', NULL, '2025-07-05 09:00:00', NULL, '2025-07-05 09:00:00', 'security-checklist', 'Checklist for release security review.', 16), +(19, 15, 'Ideas Board: Feature Suggestions', NULL, '2025-07-08 10:00:00', NULL, '2025-07-08 10:00:00', 'ideas-board', 'Collecting feature ideas from customers.', 30), +(20, 16, 'Refactor: Notifications Service', NULL, '2025-07-10 11:00:00', NULL, '2025-07-10 11:00:00', 'refactor-notifications', 'Why the service was refactored.', 27), +(21, 17, 'Partner Integration Notes', NULL, '2025-07-14 09:30:00', NULL, '2025-07-14 09:30:00', 'partner-integration', 'Docs for partner connectors.', 18), +(22, 18, 'Holiday Release Plan', NULL, '2025-07-20 12:00:00', NULL, '2025-07-20 12:00:00', 'holiday-release-plan', 'Planned features for holiday campaign.', 29), +(23, 19, 'Community Highlights', NULL, '2025-07-25 15:00:00', NULL, '2025-07-25 15:00:00', 'community-highlights', 'Top community contributions this month.', 21), +(24, 20, 'Daily Habit: 10-minute Write', NULL, '2025-07-28 07:00:00', NULL, '2025-07-28 07:00:00', '10minute-write', 'How short daily habits build consistency.', 5), +(25, 21, 'DB Backup Best Practices', NULL, '2025-07-30 09:00:00', NULL, '2025-07-30 09:00:00', 'db-backup-best-practices', 'Backup strategies and retention', 6), +(26, 22, 'Integrations: Zapier Guide', NULL, '2025-08-01 10:00:00', NULL, '2025-08-01 10:00:00', 'zapier-guide', 'Connect GoalFlow to Zapier.', 18), +(27, 23, 'UX Microcopy Examples', NULL, '2025-08-02 09:30:00', NULL, '2025-08-02 09:30:00', 'ux-microcopy-examples', 'CTAs and label examples.', 9), +(28, 24, 'Sprint Plan Template', NULL, '2025-08-03 08:45:00', NULL, '2025-08-03 08:45:00', 'sprint-plan-template', 'Template for sprint planning.', 28), +(29, 25, 'How to Set Priorities', NULL, '2025-08-04 10:20:00', NULL, '2025-08-04 10:20:00', 'set-priorities', 'Priority framework for teams.', 8), +(30, 26, 'Refactor: Jobs Queue', NULL, '2025-08-05 11:00:00', NULL, '2025-08-05 11:00:00', 'refactor-jobs-queue', 
'Improve reliability of background jobs.', 27), +(31, 27, 'Experiment Backlog', NULL, '2025-08-06 09:00:00', NULL, '2025-08-06 09:00:00', 'experiment-backlog', 'Ideas and experiments to test.', 20), +(32, 28, 'Design Roadmap Q3', NULL, '2025-08-07 10:00:00', NULL, '2025-08-07 10:00:00', 'design-roadmap-q3', 'Planned design initiatives for Q3.', 14), +(33, 29, 'Monthly Metrics - July', NULL, '2025-08-08 08:00:00', NULL, '2025-08-08 08:00:00', 'monthly-metrics-july', 'Key metrics for July.', 13), +(34, 30, 'Projection Charts', NULL, '2025-08-09 09:00:00', NULL, '2025-08-09 09:00:00', 'projection-charts', 'Charts to help long-term planning.', 13), +(35, 31, 'API Pagination Best Practices', NULL, '2025-08-09 11:00:00', NULL, '2025-08-09 11:00:00', 'api-pagination', 'Cursor vs page-number discussion.', 11), +(36, 32, 'Community Forum: Getting Started', NULL, '2025-08-09 12:00:00', NULL, '2025-08-09 12:00:00', 'community-forum-getting-started', 'How to engage in the forum constructively.', 21), +(37, 33, 'Retention Experiment Plan', NULL, '2025-08-09 13:00:00', NULL, '2025-08-09 13:00:00', 'retention-experiment-plan', 'Design for increasing retention.', 20), +(38, 34, 'Docs: Release Notes Style', NULL, '2025-08-09 14:00:00', NULL, '2025-08-09 14:00:00', 'release-notes-style', 'Tone and structure guidance.', 22), +(39, 35, 'Bug Bash: How to Run One', NULL, '2025-08-09 15:00:00', NULL, '2025-08-09 15:00:00', 'bug-bash-guide', 'Run an effective bug-finding session.', 15), +(40, 36, 'Community Spotlight: Volunteer', NULL, '2025-08-09 16:00:00', NULL, '2025-08-09 16:00:00', 'community-spotlight', 'Highlight a community volunteer.', 21); + + + + + +-- ========== POST_REPLY (150 rows) ========== +INSERT INTO post_reply (id, userId, postId, title, createdAt, publishedAt, content, tag) +VALUES +(1, 5, 1, 'Nice structure — thanks!', '2025-05-03 10:02:00', '2025-05-03 10:02:00', 'This structure makes it easy to plan milestones. Do you use a template for case studies?', 14), +(2, 6, 1, 'Do you timebox phases?', '2025-05-03 10:18:00', NULL, 'Do you set fixed timeboxes for research vs. build, or let them flex by project?', 14), +(3, 1, 2, 'Love the habit idea', '2025-06-01 09:05:00', '2025-06-01 09:05:00', 'I’ve been doing a 20-minute sketch a day — your prompt list idea is helpful.', 5), +(4, 7, 3, 'Balancing tips are great', '2024-09-01 11:30:00', NULL, 'Scheduling fixed research blocks around class time is exactly what I needed.', 4), +(5, 8, 4, 'Suggestion for simulations', '2025-01-16 09:35:00', NULL, 'Consider adding a reproducibility subsection (random seeds, environment) to the methods.', 4), +(6, 3, 5, 'Nice triage flow', '2024-06-11 09:20:00', '2024-06-11 09:20:00', 'I like the three-bucket approach. 
Would it help to mark paid-customer bugs differently?', 12), +(7, 9, 6, 'Will the AMA be recorded?', '2025-07-01 13:05:00', NULL, 'I’m in a different timezone — will you post a recording for those who can’t attend?', 21), +(8, 10, 7, 'Recurring vs one-offs', '2025-02-01 09:20:00', NULL, 'When you snapshot P&L, do you separate recurring revenue from one-offs?', 25), +(9, 11, 8, 'Benchmark scripts share?', '2025-05-10 15:32:00', NULL, 'Can you share the scripts or command-line steps you used for the benchmarks?', 25), +(10, 12, 9, 'Thanks — mobile fix confirmed', '2025-05-20 10:05:00', NULL, 'Confirming the Android overlap was resolved — thanks for the quick turnaround.', 23), +(11, 13, 10, 'Docs snippet request', '2025-06-01 10:05:00', NULL, 'A short code example for the API export would help adoption — could you add one?', 22), +(12, 14, 11, 'QA tip on flakiness', '2025-06-05 11:05:00', NULL, 'Start by quarantining flaky tests and adding deterministic seeds where possible.', 15), +(13, 15, 12, 'Acceptance criteria suggestion', '2025-06-10 09:35:00', NULL, 'Nice examples — could you add an acceptance-criteria template?', 28), +(14, 16, 13, 'Try weekly buckets', '2025-07-01 09:35:00', NULL, 'Splitting retention into weekly cohorts often surfaces short-term changes faster.', 13), +(15, 17, 14, 'Dependency checklist', '2025-07-05 10:05:00', NULL, 'Add a dependency-review step to the security checklist (licenses + CVEs).', 16), +(16, 18, 15, 'Collect votes on features', '2025-07-08 11:10:00', NULL, 'Maybe add an upvote count so the team sees priority from users.', 30), +(17, 19, 16, 'Latency improved — nice', '2025-07-10 12:10:00', NULL, 'After the refactor we saw a 15% median latency improvement — well done!', 27), +(18, 20, 17, 'Connector SDK idea', '2025-07-14 10:20:00', NULL, 'A small SDK for partners would make integrations much easier to adopt.', 18), +(19, 21, 18, 'Coordinate marketing early', '2025-07-20 13:10:00', NULL, 'If this goes live during the holidays, loop in marketing three weeks ahead with assets.', 29), +(20, 22, 19, 'Great spotlight — thanks', '2025-07-25 16:10:00', NULL, 'Love seeing community contributions highlighted — keeps people engaged.', 21), +(21, 23, 20, 'Short habit advice', '2025-07-28 08:05:00', NULL, 'Short, daily practices scale — 10 minutes a day is a manageable start for most.', 5), +(22, 24, 21, 'Backup restore test', '2025-07-30 09:35:00', NULL, 'Don’t forget to test restores end-to-end on at least one backup monthly.', 6), +(23, 25, 22, 'Webhook example added', '2025-08-01 10:40:00', NULL, 'Thanks — the webhook transform example clarified a lot for our integration team.', 18), +(24, 26, 23, 'Shorter CTAs help', '2025-08-02 10:10:00', NULL, 'On mobile, concise CTAs have higher tap rates — consider 2–3 words max.', 9), +(25, 27, 24, 'Retro: add capacity row', '2025-08-03 09:05:00', NULL, 'Add engineer capacity to the sprint template so planning is realistic.', 28), +(26, 28, 25, 'Prioritize by impact', '2025-08-04 11:10:00', NULL, 'Triage bugs by customer impact (revenue + active users) to focus effort.', 8), +(27, 29, 26, 'Batching wins', '2025-08-05 11:45:00', NULL, 'Batching small jobs reduced overall queue pressure by 30% in our last rollout.', 27), +(28, 30, 27, 'Segment experiment cohorts', '2025-08-06 09:35:00', NULL, 'Recommend at least 3 cohorts to avoid noise.', 20), +(29, 31, 28, 'Accessibility note', '2025-08-07 10:45:00', NULL, 'When you publish the roadmap, add an accessibility row for each item to track compliance.', 14), +(30, 32, 29, 
'Weekly metrics suggestion', '2025-08-08 08:15:00', NULL, 'Consider a short executive summary of the top 3 metrics so stakeholders can scan quickly.', 13), +(31, 33, 30, 'Export CSV please', '2025-08-09 09:05:00', NULL, 'CSV export would make sharing these charts much easier.', 13), +(32, 34, 31, 'Pagination question', '2025-08-09 11:10:00', NULL, 'Do you support cursor pagination for large datasets?', 11), +(33, 35, 32, 'Welcome tip', '2025-08-09 12:10:00', NULL, 'A pinned “how to get started” thread helps new users.', 21), +(34, 36, 33, 'Experiment tweak', '2025-08-09 13:20:00', NULL, 'Use longer test windows for retention signals to reduce noise.', 20), +(35, 37, 34, 'Docs style note', '2025-08-09 14:20:00', NULL, 'Keep release notes to 3–5 bullets for quick scanning.', 22), +(36, 38, 35, 'Bug bash logistics', '2025-08-09 15:20:00', NULL, 'Set teams and target areas, and publish a short how-to.', 15), +(37, 39, 36, 'Community shoutout', '2025-08-09 16:20:00', NULL, 'Thanks to volunteers for moderation — great work!', 21), +(38, 40, 37, 'Retention baseline', '2025-08-09 17:00:00', NULL, 'Use 30-day retention as a baseline and measure lift vs. it.', 20), +(39, 1, 3, 'Prof insight', '2024-09-02 09:00:00', NULL, 'Useful tips on balancing class prep and research.', 4), +(40, 2, 4, 'Paper plan review', '2025-01-16 10:20:00', NULL, 'I propose adding a bootstrap section to simulate robustness.', 4), +(41, 3, 5, 'Ops follow-up', '2024-06-12 09:30:00', NULL, 'Would you accept contributions on triage rules?', 12), +(42, 4, 7, 'Finance follow-up', '2025-02-01 10:00:00', NULL, 'I can share a P&L template for teams.', 25), +(43, 5, 8, 'Release question', '2025-04-03 10:30:00', NULL, 'Any breaking changes to note?', 2), +(44, 6, 9, 'Mobile confirm', '2025-05-21 11:20:00', NULL, 'iOS fix confirmed in next build.', 23), +(45, 7, 10, 'Docs sample added', '2025-06-02 10:05:00', NULL, 'Added sample snippet for the export endpoint.', 22), +(46, 8, 11, 'QA automation tip', '2025-06-06 12:00:00', NULL, 'Flaky tests: add consistent seeding.', 15), +(47, 9, 12, 'Perf configure', '2025-06-11 15:20:00', NULL, 'Benchmark harness simplified and scripts posted.', 25), +(48, 10, 13, 'Design tweak', '2025-07-02 09:00:00', NULL, 'Increased spacing on cards, looks cleaner.', 14), +(49, 11, 14, 'Security addendum', '2025-07-06 09:30:00', NULL, 'Add dependency checks to CI.', 16), +(50, 12, 15, 'Idea upvote', '2025-07-09 11:00:00', NULL, 'I upvote the experimental approach.', 30), +(51, 13, 16, 'Refactor check', '2025-07-11 13:00:00', NULL, 'Unit tests passed locally.', 27), +(52, 14, 17, 'Connector note', '2025-07-15 10:00:00', NULL, 'Connector docs include auth examples now.', 18), +(53, 15, 18, 'Holiday assets', '2025-07-21 12:15:00', NULL, 'Marketing assets are ready.', 29), +(54, 16, 19, 'Community thumbs', '2025-07-26 16:00:00', NULL, 'Nice community contributions — congrats!', 21), +(55, 17, 20, 'Habit reminder', '2025-07-29 08:00:00', NULL, 'Short habits compound over time.', 5), +(56, 18, 21, 'DB tip', '2025-07-31 09:10:00', NULL, 'Rotate backups monthly and test restores.', 6), +(57, 19, 22, 'Zapier sample', '2025-08-01 11:45:00', NULL, 'Webhook transform clarified. 
Thanks!', 18),
+(58, 20, 23, 'Microcopy done', '2025-08-02 10:30:00', NULL, 'Finalized labels for the new UI.', 9),
+(59, 21, 24, 'Sprint tweak', '2025-08-03 09:25:00', NULL, 'Added acceptance matrix.', 28),
+(60, 22, 25, 'Priority dispatch', '2025-08-04 12:55:00', NULL, 'Assigned resources to top items.', 8),
+(61, 23, 26, 'Queue resolved', '2025-08-05 14:20:00', NULL, 'Backlog reduced by 30%.', 27),
+(62, 24, 27, 'Experiment notes', '2025-08-06 10:05:00', NULL, 'Updated experiment doc.', 20),
+(63, 25, 28, 'Design sync 2', '2025-08-07 10:25:00', NULL, 'Sync completed; components updated.', 14),
+(64, 26, 29, 'Metrics confirm', '2025-08-08 08:30:00', NULL, 'Checked alert thresholds.', 13),
+(65, 27, 30, 'API sample 2', '2025-08-09 09:20:00', NULL, 'Cursor snippet added.', 11),
+(66, 28, 31, 'Forum help', '2025-08-09 12:50:00', NULL, 'Guided new user to docs.', 21),
+(67, 29, 32, 'Retention follow', '2025-08-09 14:10:00', NULL, 'Maybe segment by signup source.', 20),
+(68, 30, 33, 'Metrics small', '2025-08-09 15:20:00', NULL, 'Include AOV in monthly report.', 13),
+(69, 31, 34, 'Proj request', '2025-08-09 16:30:00', NULL, 'Add CSV export for projections.', 13),
+(70, 32, 35, 'Pagination answer 2', '2025-08-09 17:40:00', NULL, 'Cursor recommended for large datasets.', 11),
+(71, 33, 36, 'Forum welcome', '2025-08-09 18:00:00', NULL, 'Welcome to the community! Start with our guide.', 21),
+(72, 34, 37, 'Experiment feedback', '2025-08-09 19:00:00', NULL, 'Consider alternative segmentations.', 20),
+(73, 35, 38, 'Docs update', '2025-08-09 20:00:00', NULL, 'Updated release notes template.', 22),
+(74, 36, 39, 'Bug bash', '2025-08-09 20:30:00', NULL, 'I will join the bug bash on Friday.', 15),
+(75, 37, 40, 'Community thanks', '2025-08-09 21:00:00', NULL, 'Thanks for spotlighting volunteers.', 21),
+(76, 1, 5, 'Ops QA note', '2024-06-12 11:00:00', NULL, 'Triage board looks good.', 12),
+(77, 2, 1, 'Prof: design idea', '2025-05-04 08:00:00', NULL, 'Consider a rubric to evaluate case studies.', 14),
+(78, 3, 6, 'AMA logistics', '2025-06-30 10:00:00', NULL, 'Will the AMA be recorded and posted?', 21),
+(79, 4, 7, 'Finance note', '2025-02-02 08:15:00', NULL, 'Add a recurring revenue row to spreadsheet.', 25),
+(80, 5, 8, 'Release praise', '2025-04-03 11:00:00', NULL, 'Nice incremental improvements.', 2),
+(81, 6, 9, 'Mobile QA', '2025-05-21 11:00:00', NULL, 'Regression tests updated.', 23),
+(82, 7, 10, 'Doc tweak', '2025-06-03 09:30:00', NULL, 'Clarified the CLI example.', 22),
+(83, 8, 11, 'Design thumbs', '2025-06-07 10:10:00', NULL, 'Agree with updated tokens.', 14),
+(84, 9, 12, 'Perf follow-up', '2025-06-11 16:00:00', NULL, 'Added profiling outputs.', 25),
+(85, 10, 13, 'Mobile improvement', '2025-07-03 09:45:00', NULL, 'Resolved with CSS fix.', 23),
+(86, 11, 14, 'Security note 2', '2025-07-07 11:00:00', NULL, 'Link added to checklist.', 16),
+(87, 12, 15, 'Idea comment', '2025-07-09 12:00:00', NULL, 'Consider A/B testing variability.', 30),
+(88, 13, 16, 'Refactor follow-up', '2025-07-11 14:00:00', NULL, 'Tests added for new queue.', 27),
+(89, 14, 17, 'Connector example', '2025-07-15 11:15:00', NULL, 'OAuth example included.', 18),
+(90, 15, 18, 'Holiday check', '2025-07-21 13:30:00', NULL, 'Assets QA passed.', 29),
+(91, 16, 19, 'Community great', '2025-07-26 16:30:00', NULL, 'Congrats to contributors!', 21),
+(92, 17, 20, 'Habit endorsement', '2025-07-29 08:30:00', NULL, 'Small daily habits add up.', 5),
+(93, 18, 21, 'Backup confirm', '2025-07-31 10:00:00', NULL, 'Verified backups in
staging.', 6), +(94, 19, 22, 'Zapier follow', '2025-08-01 11:45:00', NULL, 'Webhook transform clarified.', 18), +(95, 20, 23, 'Microcopy approve', '2025-08-02 10:15:00', NULL, 'Shortened label works well.', 9), +(96, 21, 24, 'Sprint approval', '2025-08-03 09:15:00', NULL, 'Template looks good.', 28), +(97, 22, 25, 'Priority ack', '2025-08-04 12:45:00', NULL, 'Escalated high-impact bugs.', 8), +(98, 23, 26, 'Queue note 2', '2025-08-05 14:00:00', NULL, 'Batching implemented.', 27), +(99, 24, 27, 'Experiment vote', '2025-08-06 09:50:00', NULL, 'I support the new CTA experiment.', 20), +(100, 25, 28, 'Design alignment', '2025-08-07 10:40:00', NULL, 'Tokens synchronized.', 14), +(101, 26, 29, 'Metrics ack', '2025-08-08 08:50:00', NULL, 'Added alert thresholds.', 13), +(102, 27, 30, 'API sample 3', '2025-08-09 09:30:00', NULL, 'Added more code examples.', 11), +(103, 28, 31, 'Forum help', '2025-08-09 12:50:00', NULL, 'Guided new user to docs.', 21), +(104, 29, 32, 'Retention adjust', '2025-08-09 14:20:00', NULL, 'Added secondary metric for retention.', 20), +(105, 30, 33, 'Monthly metric add', '2025-08-09 15:35:00', NULL, 'Year-over-year included.', 13), +(106, 31, 34, 'Proj export', '2025-08-09 16:45:00', NULL, 'CSV export endpoint added.', 13), +(107, 32, 35, 'Cursor example', '2025-08-09 17:50:00', NULL, 'Cursor pattern documented.', 11), +(108, 33, 36, 'Forum welcome 2', '2025-08-09 18:50:00', NULL, 'Pinned welcome message for newcomers.', 21), +(109, 34, 37, 'Experiment doc', '2025-08-09 19:50:00', NULL, 'Extended experiment plan.', 20), +(110, 35, 38, 'Docs finalize', '2025-08-09 20:50:00', NULL, 'Style guide included.', 22), +(111, 36, 39, 'Bug bash idea 2', '2025-08-09 20:40:00', NULL, 'Set a leaderboard to gamify participation.', 15), +(112, 37, 40, 'Community thanks 2', '2025-08-09 21:10:00', NULL, 'Great recognition of volunteers.', 21), +(113, 38, 1, 'Designer Q', '2025-05-04 09:30:00', NULL, 'Do you use a design brief template?', 14), +(114, 39, 2, 'Habit Q', '2025-06-02 09:15:00', NULL, 'How do you keep motivated long-term?', 5), +(115, 40, 3, 'Research Q', '2024-09-02 11:00:00', NULL, 'Do you share progress publicly or keep internal notes?', 4), +(116, 5, 4, 'Paper logistics', '2025-01-17 10:00:00', NULL, 'Add a timeline for experiments and checkpoints.', 4), +(117, 6, 5, 'Ops note 2', '2024-06-13 09:30:00', NULL, 'Consider a triage rotation to reduce bus-factor.', 12), +(118, 7, 6, 'AMA follow-up', '2025-07-02 12:00:00', NULL, 'Where will recordings be hosted?', 21), +(119, 8, 7, 'P&L question', '2025-02-03 09:00:00', NULL, 'Do you include deferred revenue in snapshots?', 25), +(120, 9, 8, 'Release follow-up', '2025-04-03 11:30:00', NULL, 'Noted, thanks!', 2), +(121, 10, 9, 'Mobile confirm', '2025-05-21 11:20:00', NULL, 'Confirmed fix on Android 13', 23), +(122, 11, 10, 'Docs approve', '2025-06-02 10:20:00', NULL, 'Docs ready for staging.', 22), +(123, 12, 11, 'QA note 2', '2025-06-06 12:10:00', NULL, 'Added regression tests.', 15), +(124, 13, 12, 'Perf follow 2', '2025-06-11 16:30:00', NULL, 'Profiling data uploaded.', 25), +(125, 14, 13, 'Design create', '2025-07-03 09:50:00', NULL, 'Created new token set.', 14), +(126, 15, 14, 'Security follow 2', '2025-07-07 11:30:00', NULL, 'CI security scan added.', 16), +(127, 16, 15, 'Idea follow', '2025-07-09 12:20:00', NULL, 'A/B test scheduled.', 30), +(128, 17, 16, 'Refactor notes 2', '2025-07-11 14:20:00', NULL, 'Queue service regression fixed.', 27), +(129, 18, 17, 'Connector patch', '2025-07-15 11:45:00', NULL, 'OAuth fix 
applied.', 18), +(130, 19, 18, 'Holiday assets 2', '2025-07-21 13:50:00', NULL, 'Final images exported.', 29), +(131, 20, 19, 'Community update', '2025-07-26 16:50:00', NULL, 'Adding more highlights next month.', 21), +(132, 21, 20, 'Habit follow', '2025-07-29 08:40:00', NULL, 'Streak maintained for 40 days.', 5), +(133, 22, 21, 'Backup note 2', '2025-07-31 10:10:00', NULL, 'Verified backups in staging.', 6), +(134, 23, 22, 'Zapier follow', '2025-08-01 12:00:00', NULL, 'Webhook transform clarified.', 18), +(135, 24, 23, 'Microcopy done', '2025-08-02 10:30:00', NULL, 'Finalized labels.', 9), +(136, 25, 24, 'Sprint tweak', '2025-08-03 09:25:00', NULL, 'Added acceptance matrix.', 28), +(137, 26, 25, 'Priority dispatch', '2025-08-04 12:55:00', NULL, 'Assigned resources to top items.', 8), +(138, 27, 26, 'Queue resolved', '2025-08-05 14:20:00', NULL, 'Backlog reduced by 30%.', 27), +(139, 28, 27, 'Experiment notes', '2025-08-06 10:05:00', NULL, 'Updated experiment doc.', 20), +(140, 29, 28, 'Design sync 2', '2025-08-07 10:25:00', NULL, 'Sync completed.', 14), +(141, 30, 29, 'Metrics confirm', '2025-08-08 08:30:00', NULL, 'Checked alert thresholds.', 13), +(142, 31, 30, 'API sample 3', '2025-08-09 09:30:00', NULL, 'Added more code examples.', 11), +(143, 32, 31, 'Forum onboarding', '2025-08-09 13:10:00', NULL, 'Welcome steps added to forum.', 21), +(144, 33, 32, 'Retention adjust', '2025-08-09 14:20:00', NULL, 'Added secondary metric for retention.', 20), +(145, 34, 33, 'Monthly metric add', '2025-08-09 15:35:00', NULL, 'Year-over-year included.', 13), +(146, 35, 34, 'Proj export', '2025-08-09 16:45:00', NULL, 'CSV export endpoint added.', 13), +(147, 36, 35, 'Cursor example', '2025-08-09 17:50:00', NULL, 'Cursor pattern documented.', 11), +(148, 37, 36, 'Forum welcome 3', '2025-08-09 18:50:00', NULL, 'Pinned welcome message.', 21), +(149, 38, 37, 'Experiment doc 2', '2025-08-09 19:50:00', NULL, 'Extended experiment plan.', 20), +(150, 39, 38, 'Docs finalize 2', '2025-08-09 20:50:00', NULL, 'Style guide included.', 22); + + + + + +-- ========== USER_DATA (75 rows) ========== +INSERT INTO user_data (id, userId, location, totalTime, deviceType, age, registeredAt, lastLogin, isActive, postCount) +VALUES +(1, 1, 'New York, NY', 220, 'tablet', 23, '2025-03-01 09:00:00', '2025-08-09 09:00:00', 1, 4), +(2, 2, 'Boston, MA', 950, 'desktop', 52, '2020-09-01 08:00:00', '2025-08-08 18:00:00', 1, 9), +(3, 3, 'Boston, MA', 1800, 'desktop', 22, '2019-06-01 10:00:00', '2025-08-09 20:30:00', 1, 28), +(4, 4, 'Los Angeles, CA', 650, 'desktop', 28, '2023-02-20 11:00:00', '2025-08-09 16:00:00', 1, 10), +(5, 5, 'Seattle, WA', 480, 'desktop', 29, '2024-04-10 09:00:00', '2025-08-07 14:00:00', 1, 6), +(6, 6, 'Austin, TX', 390, 'mobile', 31, '2024-05-05 10:00:00', '2025-08-05 09:00:00', 1, 7), +(7, 7, 'Portland, OR', 720, 'desktop', 27, '2024-06-01 11:00:00', '2025-08-06 11:00:00', 1, 9), +(8, 8, 'Denver, CO', 540, 'desktop', 34, '2024-07-10 08:30:00', '2025-08-04 10:00:00', 1, 8), +(9, 9, 'Miami, FL', 610, 'mobile', 25, '2024-08-15 09:00:00', '2025-08-03 12:00:00', 1, 5), +(10, 10, 'Raleigh, NC', 210, 'desktop', 29, '2024-09-01 09:30:00', '2025-08-02 10:00:00', 1, 2), +(11, 11, 'Minneapolis, MN', 840, 'desktop', 35, '2024-09-20 08:05:00', '2025-08-08 21:00:00', 1, 12), +(12, 12, 'San Francisco,CA', 980, 'desktop', 27, '2024-10-01 10:05:00', '2025-08-06 14:00:00', 1, 14), +(13, 13, 'Los Angeles, CA', 1200, 'desktop', 36, '2024-11-01 08:30:00', '2025-08-05 10:45:00', 1, 18), +(14, 14, 'Salt Lake City,UT', 260, 'desktop', 
30, '2025-01-05 13:00:00', '2025-07-30 10:00:00', 1, 3), +(15, 15, 'Orlando, FL', 380, 'mobile', 28, '2025-01-10 09:50:00', '2025-08-02 11:20:00', 1, 6), +(16, 16, 'Philadelphia,PA', 460, 'desktop', 34, '2025-01-15 17:15:00', '2025-08-06 14:05:00', 1, 4), +(17, 17, 'Boise, ID', 180, 'mobile', 27, '2025-01-20 12:00:00', '2025-07-25 09:00:00', 1, 1), +(18, 18, 'Palo Alto, CA', 1500, 'desktop', 41, '2020-05-10 08:00:00', '2025-08-07 19:40:00', 1, 25), +(19, 19, 'Cincinnati, OH', 220, 'mobile', 26, '2025-02-10 10:20:00', '2025-07-29 14:00:00', 1, 2), +(20, 20, 'Columbus, OH', 480, 'desktop', 32, '2025-02-20 09:00:00', '2025-08-04 16:10:00', 1, 7), +(21, 21, 'Las Vegas, NV', 310, 'mobile', 29, '2025-02-25 11:30:00', '2025-07-26 11:30:00', 1, 3), +(22, 22, 'Toronto, ON', 540, 'desktop', 33, '2025-03-01 08:45:00', '2025-08-01 10:55:00', 1, 8), +(23, 23, 'Montreal,QC', 190, 'mobile', 24, '2025-03-05 14:00:00', '2025-07-27 09:15:00', 1, 1), +(24, 24, 'Vancouver,BC', 760, 'desktop', 35, '2025-03-10 09:30:00', '2025-08-03 12:00:00', 1, 10), +(25, 25, 'Birmingham,AL', 90, 'mobile', 22, '2025-03-15 11:00:00', '2025-07-25 08:00:00', 1, 0), +(26, 26, 'Oakland, CA', 420, 'desktop', 30, '2025-03-20 13:40:00', '2025-08-05 09:15:00', 1, 6), +(27, 27, 'Syracuse, NY', 360, 'mobile', 29, '2025-03-25 10:20:00', '2025-07-31 07:40:00', 1, 4), +(28, 28, 'Rochester, NY', 280, 'desktop', 31, '2025-03-30 09:00:00', '2025-08-02 08:20:00', 1, 3), +(29, 29, 'Hartford, CT', 410, 'mobile', 36, '2025-04-04 16:00:00', '2025-08-06 09:50:00', 1, 7), +(30, 30, 'Nashville, TN', 340, 'desktop', 28, '2025-04-09 08:10:00', '2025-08-04 07:00:00', 1, 5), +(31, 31, 'Cleveland, OH', 230, 'mobile', 33, '2025-04-14 10:40:00', '2025-08-03 09:40:00', 1, 4), +(32, 32, 'Richmond, VA', 600, 'desktop', 38, '2025-04-19 09:05:00', '2025-08-02 21:00:00', 1, 12), +(33, 33, 'Tucson, AZ', 200, 'mobile', 27, '2025-04-24 11:10:00', '2025-07-30 12:30:00', 1, 2), +(34, 34, 'Santa Fe, NM', 300, 'desktop', 35, '2025-04-29 09:45:00', '2025-08-01 13:30:00', 1, 5), +(35, 35, 'Reno, NV', 150, 'mobile', 26, '2025-05-04 10:00:00', '2025-07-29 09:00:00', 1, 1), +(36, 36, 'Burlington, VT', 340, 'desktop', 31, '2025-05-09 09:10:00', '2025-08-08 16:00:00', 1, 6), +(37, 37, 'Ithaca, NY', 400, 'desktop', 52, '2020-09-01 08:00:00', '2025-08-07 19:00:00', 1, 14), +(38, 38, 'Boston, MA', 1920, 'desktop', 22, '2019-06-01 10:00:00', '2025-08-09 20:30:00', 1, 40), +(39, 39, 'Los Angeles, CA', 700, 'desktop', 28, '2023-02-20 11:00:00', '2025-08-09 16:00:00', 1, 22), +(40, 40, 'London, UK', 520, 'desktop', 33, '2024-02-01 09:00:00', '2025-08-08 12:00:00', 1, 11), +(41, 1, 'New York, NY', 240, 'desktop', 23, '2025-04-01 09:00:00', '2025-07-01 09:00:00', 1, 5), +(42, 2, 'Boston, MA', 1000, 'desktop', 52, '2019-10-01 08:00:00', '2025-06-01 18:00:00', 1, 8), +(43, 3, 'Boston, MA', 1750, 'desktop', 22, '2020-01-01 10:00:00', '2025-05-01 19:00:00', 1, 26), +(44, 4, 'Los Angeles, CA', 600, 'desktop', 28, '2023-03-01 11:00:00', '2025-04-01 16:00:00', 1, 9), +(45, 5, 'Seattle, WA', 510, 'desktop', 29, '2024-05-01 09:00:00', '2025-03-01 14:00:00', 1, 7), +(46, 6, 'Austin, TX', 410, 'mobile', 31, '2024-06-01 10:00:00', '2025-02-01 09:00:00', 1, 8), +(47, 7, 'Portland, OR', 730, 'desktop', 27, '2024-07-01 11:00:00', '2025-01-01 11:00:00', 1, 10), +(48, 8, 'Denver, CO', 560, 'desktop', 34, '2024-08-01 08:30:00', '2024-12-01 10:00:00', 1, 9), +(49, 9, 'Miami, FL', 640, 'mobile', 25, '2024-09-01 09:00:00', '2024-11-01 12:00:00', 1, 6), +(50, 10, 'Raleigh, NC', 220, 'desktop', 29, 
'2024-10-01 09:30:00', '2024-10-01 10:00:00', 1, 3), +(51, 11, 'Minneapolis, MN', 860, 'desktop', 35, '2024-10-15 08:05:00', '2024-09-01 21:00:00', 1, 11), +(52, 12, 'San Francisco,CA', 990, 'desktop', 27, '2024-11-15 10:05:00', '2024-08-01 14:00:00', 1, 16), +(53, 13, 'Los Angeles, CA', 1250, 'desktop', 36, '2024-12-01 08:30:00', '2024-07-01 10:45:00', 1, 20), +(54, 14, 'Salt Lake City,UT', 300, 'desktop', 30, '2025-01-10 13:00:00', '2024-06-01 10:00:00', 1, 5), +(55, 15, 'Orlando, FL', 400, 'mobile', 28, '2025-01-15 09:50:00', '2024-05-01 11:20:00', 1, 8), +(56, 16, 'Philadelphia,PA', 480, 'desktop', 34, '2025-01-20 17:15:00', '2024-04-01 14:05:00', 1, 6), +(57, 17, 'Boise, ID', 190, 'mobile', 27, '2025-01-25 12:00:00', '2024-03-01 09:00:00', 1, 2), +(58, 18, 'Palo Alto, CA', 1550, 'desktop', 41, '2019-05-10 08:00:00', '2024-02-01 19:40:00', 1, 28), +(59, 19, 'Cincinnati, OH', 230, 'mobile', 26, '2025-02-15 10:20:00', '2024-01-01 14:00:00', 1, 4), +(60, 20, 'Columbus, OH', 490, 'desktop', 32, '2025-02-25 09:00:00', '2023-12-01 16:10:00', 1, 9), +(61, 21, 'Las Vegas, NV', 320, 'mobile', 29, '2025-03-05 11:30:00', '2023-11-01 11:30:00', 1, 4), +(62, 22, 'Toronto, ON', 560, 'desktop', 33, '2025-03-10 08:45:00', '2023-10-01 10:55:00', 1, 10), +(63, 23, 'Montreal,QC', 210, 'mobile', 24, '2025-03-15 14:00:00', '2023-09-01 09:15:00', 1, 2), +(64, 24, 'Vancouver,BC', 780, 'desktop', 35, '2025-03-20 09:30:00', '2023-08-01 12:00:00', 1, 14), +(65, 25, 'Birmingham,AL', 100, 'mobile', 22, '2025-03-25 11:00:00', '2023-07-01 08:00:00', 1, 1), +(66, 26, 'Oakland, CA', 440, 'desktop', 30, '2025-03-30 13:40:00', '2023-06-01 09:15:00', 1, 7), +(67, 27, 'Syracuse, NY', 380, 'mobile', 29, '2025-04-04 10:20:00', '2023-05-01 07:40:00', 1, 6), +(68, 28, 'Rochester, NY', 300, 'desktop', 31, '2025-04-09 09:00:00', '2023-04-01 08:20:00', 1, 4), +(69, 29, 'Hartford, CT', 430, 'mobile', 36, '2025-04-14 16:00:00', '2023-03-01 09:50:00', 1, 9), +(70, 30, 'Nashville, TN', 350, 'desktop', 28, '2025-04-19 08:10:00', '2023-02-01 07:00:00', 1, 6), +(71, 31, 'Cleveland, OH', 240, 'mobile', 33, '2025-04-24 10:40:00', '2023-01-01 09:40:00', 1, 5), +(72, 32, 'Richmond, VA', 610, 'desktop', 38, '2025-04-29 09:05:00', '2022-12-01 21:00:00', 1, 15), +(73, 33, 'Tucson, AZ', 220, 'mobile', 27, '2025-05-04 11:10:00', '2022-11-01 12:30:00', 1, 3), +(74, 34, 'Santa Fe, NM', 320, 'desktop', 35, '2025-05-09 09:45:00', '2022-10-01 13:30:00', 1, 7), +(75, 35, 'Reno, NV', 170, 'mobile', 26, '2025-05-14 10:00:00', '2022-09-01 09:00:00', 1, 2); + + + + + +-- ========== BUG_REPORTS (35 rows) ========== +INSERT INTO bug_reports (id, userId, title, metaTitle, slug, description, dateReported, status, priority) +VALUES +(1, 1, 'Portfolio image upload fails', NULL, 'portfolio-image-upload', 'PNG >5MB fails in uploader', '2025-06-10 11:20:00', 0, 3), +(2, 2, 'Simulation reproducibility', NULL, 'sim-repro-issue', 'Random seed mismatch across runs', '2025-01-20 09:30:00', 0, 2), +(3, 3, 'Spam in community forum', NULL, 'forum-spam', 'Automated spam posts bypass moderation', '2025-07-02 09:00:00', 0, 2), +(4, 4, 'CSV export timeouts', NULL, 'csv-export-timeout', 'Large CSV exports time out at 30s', '2025-03-05 16:40:00', 0, 3), +(5, 5, 'Mobile layout regression', NULL, 'mobile-layout-regression', 'Buttons overlap on Android 13', '2025-05-20 09:00:00', 0, 2), +(6, 6, 'DB deadlock on reports', NULL, 'db-deadlock-reports', 'Deadlocks in complex report queries', '2025-05-08 13:20:00', 0, 1), +(7, 7, 'Notification duplication', NULL, 'notification-dup', 
+
+-- ========== BUG_REPORTS (35 rows) ==========
+INSERT INTO bug_reports (id, userId, title, metaTitle, slug, description, dateReported, status, priority)
+VALUES
+(1, 1, 'Portfolio image upload fails', NULL, 'portfolio-image-upload', 'PNG >5MB fails in uploader', '2025-06-10 11:20:00', 0, 3),
+(2, 2, 'Simulation reproducibility', NULL, 'sim-repro-issue', 'Random seed mismatch across runs', '2025-01-20 09:30:00', 0, 2),
+(3, 3, 'Spam in community forum', NULL, 'forum-spam', 'Automated spam posts bypass moderation', '2025-07-02 09:00:00', 0, 2),
+(4, 4, 'CSV export timeouts', NULL, 'csv-export-timeout', 'Large CSV exports time out at 30s', '2025-03-05 16:40:00', 0, 3),
+(5, 5, 'Mobile layout regression', NULL, 'mobile-layout-regression', 'Buttons overlap on Android 13', '2025-05-20 09:00:00', 0, 2),
+(6, 6, 'DB deadlock on reports', NULL, 'db-deadlock-reports', 'Deadlocks in complex report queries', '2025-05-08 13:20:00', 0, 1),
+(7, 7, 'Notification duplication', NULL, 'notification-dup', 'Users receive duplicate notifications', '2025-04-28 09:00:00', 0, 3),
+(8, 8, 'Webhook retries too aggressive', NULL, 'webhook-retry', 'Retries without exponential backoff', '2025-04-08 12:00:00', 0, 2),
+(9, 9, 'Search indexing gap', NULL, 'search-index-gap', 'New docs not appearing in search', '2025-04-20 16:00:00', 0, 3),
+(10, 10, 'iOS app cold start crash', NULL, 'ios-cold-start', 'Crash on cold start v1.2.0', '2025-05-02 07:50:00', 0, 1),
+(11, 11, 'Rate limit miscount', NULL, 'rate-limit-miscount', 'Rate limiter miscounts concurrent requests', '2025-04-11 09:30:00', 0, 2),
+(12, 12, 'Missing translations', NULL, 'missing-translations', 'Some locales missing labels', '2025-03-12 08:10:00', 0, 4),
+(13, 13, 'Memory leak in worker', NULL, 'memory-leak-worker', 'Long-running worker grows memory', '2025-03-25 14:00:00', 0, 1),
+(14, 14, 'Image resize fails', NULL, 'image-resize-fail', 'Large images fail to resize', '2025-07-18 12:30:00', 0, 3),
+(15, 15, 'Permission elevation', NULL, 'perm-elevation', 'Users see actions they should not', '2025-07-22 09:15:00', 0, 1),
+(16, 16, 'Queue backlog growth', NULL, 'queue-backlog', 'Job queue length increasing', '2025-07-30 11:00:00', 0, 2),
+(17, 17, 'Timezone display bug', NULL, 'timezone-display', 'Events showing UTC instead of local', '2025-05-26 08:05:00', 0, 4),
+(18, 18, 'Schema migration failure', NULL, 'schema-migration-fail', 'Migration fails on large tables', '2025-06-12 03:10:00', 0, 1),
+(19, 19, 'Broken download link', NULL, 'download-link-403', 'Downloads returning 403 for PDFs', '2025-08-02 09:30:00', 0, 3),
+(20, 20, 'Unexpected 500s after deploy', NULL, '500-after-deploy', 'Release causing 500s on API', '2025-06-01 02:30:00', 1, 1),
+(21, 21, 'Attachments lost on save', NULL, 'attachments-lost', 'Attachments not persistently saved', '2025-07-11 09:30:00', 0, 3),
+(22, 22, 'Webhook 500 errors', NULL, 'webhook-500-errors-2', 'Third-party webhooks intermittently 500', '2025-07-06 12:00:00', 0, 3),
+(23, 23, 'Export pagination bug', NULL, 'export-pagination', 'Pagination broken for large exports', '2025-06-28 16:40:00', 0, 3),
+(24, 24, 'UI flicker on resize', NULL, 'ui-flicker-resize', 'Flicker when resizing browser window', '2025-08-01 08:45:00', 0, 4),
+(25, 25, 'Login fails intermittently', NULL, 'login-intermittent', 'Some users get login errors', '2025-07-20 13:00:00', 0, 2),
+(26, 26, 'Performance regression', NULL, 'perf-regression', 'Search became slower after index change', '2025-07-26 10:00:00', 0, 2),
+(27, 27, 'File uploads 413', NULL, 'upload-413', 'Large uploads get 413 on certain proxies', '2025-05-18 10:30:00', 0, 3),
+(28, 28, 'Staging data mismatch', NULL, 'staging-data-mismatch', 'Staging uses old dataset', '2025-08-03 10:00:00', 0, 4),
+(29, 29, 'Email template regression', NULL, 'email-template-regression-2', 'Formatting broken in emails', '2025-08-04 13:30:00', 0, 4),
+(30, 30, 'Policy link broken', NULL, 'policy-link-broken', 'Privacy policy link returns 404', '2025-06-06 09:45:00', 0, 4),
+(31, 31, 'Duplicate uploads', NULL, 'duplicate-uploads-2', 'Users can upload same file twice', '2025-06-16 09:30:00', 0, 4),
+(32, 32, 'Search performance', NULL, 'search-performance', 'Search slower after change', '2025-07-26 10:00:00', 0, 2),
+(33, 33, 'Queue worker OOM', NULL, 'queue-worker-oom', 'Worker runs out of memory on long jobs', '2025-07-30 11:30:00', 0, 1),
+(34, 34, 'Broken reset link', NULL, 'reset-link-broken', 'Password reset link invalid for some users', '2025-07-07 16:20:00', 0, 4),
+(35, 35, 'Attachment encoding issue', NULL, 'attachment-encoding', 'Attachments corrupted on download', '2025-07-11 09:30:00', 0, 3);
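+
+-- Example (commented out): open bug reports grouped by priority. This assumes
+-- status 0 marks an open report and priority 1 is most urgent, which matches
+-- the rows above (only report 20 is closed).
+-- SELECT priority, COUNT(*) AS open_reports
+--   FROM bug_reports
+--  WHERE status = 0
+--  GROUP BY priority
+--  ORDER BY priority;
+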
+
+-- ========== CONSISTENT_TASKS (35 rows) ==========
+INSERT INTO consistent_tasks (id, userId, title, metaTitle, slug, category, notes, createdAt)
+VALUES
+(1, 1, 'Daily Sketch', '20-min sketch', 'daily-sketch', 'Personal', 'Daily creative sketch habit', '2025-05-01 08:00:00'),
+(2, 2, 'Weekly Research Sync', NULL, 'weekly-research-sync', 'Research', 'Sync with lab students', '2020-09-01 15:00:00'),
+(3, 3, 'Daily Ops Triage', NULL, 'daily-ops-triage', 'Ops', 'Triage new bug reports each morning', '2019-06-01 09:00:00'),
+(4, 4, 'Daily P&L Snapshot', NULL, 'daily-pnl-snapshot', 'Finance', 'Quick morning revenue check', '2023-02-20 08:30:00'),
+(5, 5, 'Weekly Team Retro', NULL, 'weekly-retro', 'Meetings', 'Collect action items', '2024-03-01 09:00:00'),
+(6, 6, 'DB Backups', NULL, 'db-backups', 'Maintenance', 'Nightly DB backups', '2024-04-01 02:00:00'),
+(7, 7, 'Code Review Hour', NULL, 'code-review-hour', 'Engineering', 'Daily review window', '2024-04-15 14:00:00'),
+(8, 8, 'Design Critique', NULL, 'design-critique', 'Design', 'Weekly critique for mockups', '2024-05-01 11:00:00'),
+(9, 9, 'Rotate Logs', NULL, 'rotate-logs', 'DevOps', 'Rotate logs and archive', '2024-05-10 03:00:00'),
+(10, 10, 'Accessibility Audit', NULL, 'access-audit', 'Design', 'Quarterly accessibility review', '2024-06-01 09:00:00'),
+(11, 11, 'Monthly Export', NULL, 'monthly-export', 'Analytics', 'Generate exports monthly', '2024-06-15 08:00:00'),
+(12, 12, 'Blog Post', NULL, 'monthly-blog', 'Content', 'Write a helpful post per month', '2024-07-01 09:30:00'),
+(13, 13, 'Sprint Grooming', NULL, 'sprint-groom', 'Product', 'Refine backlog for sprint', '2024-07-10 14:00:00'),
+(14, 14, 'Test Suite Maintenance', NULL, 'test-maint', 'QA', 'Keep tests green', '2024-07-20 08:30:00'),
+(15, 15, 'Perf Benchmark', NULL, 'perf-benchmark', 'Performance', 'Monthly benchmarks', '2024-08-01 09:00:00'),
+(16, 16, 'Customer Interviews', NULL, 'customer-interviews', 'Customer', 'Interview two customers per month', '2024-08-10 10:00:00'),
+(17, 17, 'Docs Review', NULL, 'docs-review', 'Docs', 'Review docs before releases', '2024-09-01 09:00:00'),
+(18, 18, 'Release Checklist', NULL, 'release-checklist', 'Ops', 'Checklist for major releases', '2024-09-10 12:00:00'),
+(19, 19, 'Mentorship Hour', NULL, 'mentorship-hour', 'People', 'Weekly pairing with juniors', '2024-09-15 15:00:00'),
+(20, 20, 'Analytics Cleanup', NULL, 'analytics-cleanup', 'Analytics', 'Archive old datasets quarterly', '2024-09-20 11:00:00'),
+(21, 21, 'Weekly Newsletter', NULL, 'weekly-news', 'Content', 'Compile highlights for team', '2024-09-25 09:00:00'),
+(22, 22, 'Run DB Migrations', NULL, 'run-db-migrations', 'Engineering', 'Schedule migration windows', '2024-10-01 02:00:00'),
+(23, 23, 'UX Study', NULL, 'ux-study', 'Research', 'Monthly usability sessions', '2024-10-10 10:00:00'),
+(24, 24, 'Refactor Backlog', NULL, 'refactor-backlog', 'Engineering', 'Plan refactors into sprints', '2024-10-20 09:00:00'),
+(25, 25, 'Localization Check', NULL, 'localization-check', 'Ops', 'Verify translations monthly', '2024-11-01 11:00:00'),
+(26, 26, 'Compliance Review', NULL, 'compliance-review', 'Legal', 'Quarterly compliance checks', '2024-11-15 13:00:00'),
+(27, 27, 'Feature Flagging', NULL, 'feature-flagging', 'Product', 'Manage feature switches', '2024-12-01 09:30:00'),
+(28, 28, 'Idea Grooming', NULL, 'idea-grooming', 'Product', 'Weekly idea triage', '2025-01-02 14:30:00'),
+(29, 29, 'Holiday Planning', NULL, 'holiday-planning', 'People', 'Plan holiday schedule', '2025-01-05 16:00:00'),
+(30, 30, 'Retention Experiments', NULL, 'retention-experiments', 'Growth', 'Run A/B retention experiments', '2025-01-10 10:00:00'),
+(31, 31, 'Ops Runbook Updates', NULL, 'ops-runbooks', 'DevOps', 'Keep runbooks current', '2025-01-15 09:00:00'),
+(32, 32, 'Community Events', NULL, 'community-events', 'Community', 'Plan meetups', '2025-01-20 18:00:00'),
+(33, 33, 'On-call Rotation', NULL, 'oncall-rotation', 'Support', 'Manage on-call schedule', '2025-01-25 08:00:00'),
+(34, 34, 'Design System Sync', NULL, 'design-sync', 'Design', 'Sync tokens across apps', '2025-01-30 11:00:00'),
+(35, 35, 'Bug Triage Session', NULL, 'bug-triage', 'QA', 'Weekly bug triage', '2025-02-05 09:30:00');
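+
+-- Example (commented out): recurring tasks per category, e.g. to back a
+-- category filter in the Streamlit UI.
+-- SELECT category, COUNT(*) AS task_count
+--   FROM consistent_tasks
+--  GROUP BY category
+--  ORDER BY task_count DESC;
+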
+
+-- ========== DAILY_TASKS (75 rows) ==========
+INSERT INTO daily_tasks (id, userId, tagId, title, metaTitle, slug, status, completed, schedule, notes)
+VALUES
+(1, 1, 14, 'Portfolio research', NULL, 'portfolio-research', 0, 1, '2025-05-05', 'Collect references and inspirations.'),
+(2, 1, 14, 'Sketch: 20-min prompt', NULL, 'sketch-20min', 0, 1, '2025-08-09', 'Daily creative habit.'),
+(3, 1, 5, 'Backlog: quick ideas', NULL, 'backlog-ideas', 0, 0, '2025-08-10', 'Small ideas to store for later.'),
+(4, 2, 4, 'Research block (deep work)', NULL, 'research-block', 0, 0, '2025-08-09', '2-hour deep work slot.'),
+(5, 2, 21, 'Course prep: lecture 3', NULL, 'course-prep-lecture-3', 0, 0, '2025-08-10', 'Slides and assignment due.'),
+(6, 2, 4, 'Simulation run', NULL, 'simulation-run', 0, 0, '2025-08-11', 'Run Monte Carlo sims.'),
+(7, 3, 12, 'Triage new bug reports', NULL, 'triage-bugs', 0, 0, '2025-08-09', 'Prioritize enterprise customers.'),
+(8, 3, 21, 'Host office hours', NULL, 'host-office-hours', 0, 0, '2025-08-12', 'Monthly community event.'),
+(9, 3, 6, 'Verify nightly backups', NULL, 'verify-backups', 0, 1, '2025-08-08', 'Confirm backup integrity.'),
+(10, 4, 25, 'Morning revenue snapshot', NULL, 'rev-snapshot', 0, 1, '2025-08-09', 'Quick P&L check.'),
+(11, 4, 25, 'Assign subgoals to teams', NULL, 'assign-subgoals', 0, 0, '2025-08-10', 'Notify project leads of deadlines.'),
+(12, 4, 8, 'Weekly cost review', NULL, 'weekly-cost-review', 0, 0, '2025-08-11', 'Check operating expenses.'),
+(13, 5, 21, 'Welcome email follow-up', NULL, 'welcome-followup', 0, 0, '2025-07-23', 'Follow up to earlier welcome email.'),
+(14, 6, 15, 'Run QA regression', NULL, 'qa-regression', 0, 1, '2025-08-06', 'Full test run.'),
+(15, 7, 14, 'Design polish', NULL, 'design-polish', 0, 0, '2025-07-25', 'Fix spacing & tokens.'),
+(16, 8, 25, 'Perf benchmark', NULL, 'perf-benchmark', 0, 0, '2025-07-27', 'Run benchmark suite.'),
+(17, 9, 23, 'Mobile smoke test', NULL, 'mobile-smoke', 0, 1, '2025-07-31', 'Quick mobile checks.'),
+(18, 10, 22, 'Docs update', NULL, 'docs-update', 0, 0, '2025-07-28', 'Add missing examples.'),
+(19, 11, 6, 'Rotate logs', NULL, 'rotate-logs', 0, 1, '2025-08-02', 'Daily log rotation.'),
+(20, 12, 28, 'Plan next sprint', NULL, 'plan-sprint', 0, 0, '2025-07-30', 'Define sprint goal.'),
+(21, 13, 29, 'Order swag', NULL, 'order-swag', 0, 0, '2025-11-01', 'Finalize designs for swag.'),
+(22, 14, 7, 'Clean up tickets', NULL, 'cleanup-tickets', 0, 1, '2025-07-24', 'Close obsolete tickets.'),
+(23, 15, 26, 'Compliance check', NULL, 'compliance-check', 0, 0, '2025-07-20', 'Gather logs for audit.'),
+(24, 16, 13, 'Analyze funnel', NULL, 'analyze-funnel', 0, 0, '2025-07-27', 'Inspect conversion drops.'),
+(25, 17, 17, 'Follow-up customer', NULL, 'follow-customer', 0, 0, '2025-07-29', 'Send next steps.'),
+(26, 18, 4, 'Read new paper', NULL, 'read-paper', 0, 0, '2025-08-07', 'Read one research paper.'),
+(27, 19, 2, 'Spec review', NULL, 'spec-review', 0, 0, '2025-07-30', 'Review spec draft.'),
+(28, 20, 21, 'Newsletter draft', NULL, 'newsletter-draft', 0, 0, '2025-07-25', 'Draft weekly newsletter.'),
+(29, 21, 10, 'Index DB for search', NULL, 'index-db', 0, 1, '2025-07-29', 'Ensure indexing complete.'),
+(30, 22, 18, 'Integration test', NULL, 'integration-test', 0, 1, '2025-07-29', 'All tests passing.'),
+(31, 23, 9, 'UX microcopy audit', NULL, 'ux-audit', 0, 0, '2025-07-28', 'Audit CTAs.'),
+(32, 24, 14, 'Design sync', NULL, 'design-sync', 0, 1, '2025-07-25', 'Sync tokens.'),
+(33, 25, 26, 'Run compliance', NULL, 'run-compliance', 0, 0, '2025-07-20', 'Run checks for compliance events.'),
+(34, 26, 16, 'Rotate API keys', NULL, 'rotate-keys', 0, 0, '2025-07-22', 'Rotate expired keys.'),
+(35, 27, 27, 'Refactor helper', NULL, 'refactor-helper', 0, 1, '2025-07-23', 'Simplify functions.'),
+(36, 28, 19, 'Prototype feature', NULL, 'prototype-feature', 0, 0, '2025-07-29', 'Prototype user flow.'),
+(37, 29, 8, 'High priority bug', NULL, 'high-priority-bug', 0, 0, '2025-08-06', 'Escalated to oncall.'),
+(38, 30, 10, 'DB index task', NULL, 'db-index-task', 0, 1, '2025-07-21', 'Add index on events.'),
+(39, 31, 5, 'Evening stretch', NULL, 'evening-stretch', 0, 1, '2025-08-05', '5 minutes.'),
+(40, 32, 3, 'Personal journal', NULL, 'personal-journal', 0, 1, '2025-08-04', 'Daily reflections.'),
+(41, 33, 4, 'Read literature', NULL, 'read-literature', 0, 0, '2025-08-07', 'Read 1 paper.'),
+(42, 34, 6, 'Monitor dashboards', NULL, 'monitor-dashboards', 0, 1, '2025-08-06', 'Check for alerts.'),
+(43, 35, 2, 'Write specs', NULL, 'write-specs', 1, 1, '2025-08-05', 'Draft v2 ready.'),
+(44, 36, 21, 'New user onboarding', NULL, 'onboard-new-user', 0, 1, '2025-07-22', 'Email sent.'),
+(45, 37, 4, 'Lab meeting prep', NULL, 'lab-meeting-prep', 0, 0, '2025-08-09', 'Prepare slides.'),
+(46, 38, 12, 'Monitor ops', NULL, 'monitor-ops', 0, 1, '2025-08-09', 'Check alerts and incidents.'),
+(47, 39, 25, 'Daily revenue check', NULL, 'daily-rev-check', 0, 1, '2025-08-09', 'P&L quick check.'),
+(48, 40, 21, 'Forum moderation', NULL, 'forum-moderation', 0, 0, '2025-08-09', 'Review flagged posts.'),
+(49, 5, 1, 'Fix critical bug', NULL, 'fix-critical-bug', 0, 0, '2025-08-06', 'Assigned by QA'),
+(50, 6, 15, 'Smoke tests', NULL, 'smoke-tests', 0, 1, '2025-08-06', 'Pre-release check.'),
+(51, 7, 2, 'Spec notes', NULL, 'spec-notes', 0, 0, '2025-07-30', 'Review draft.'),
+(52, 8, 4, 'Collect papers', NULL, 'collect-papers', 0, 0, '2025-08-07', 'Collect relevant papers.'),
+(53, 9, 12, 'Rotate logs', NULL, 'rotate-logs-2', 0, 1, '2025-08-02', 'Rotate logs daily.'),
+(54, 10, 21, 'Add welcome step', NULL, 'add-welcome-step', 1, 1, '2025-07-22', 'Onboarding email sent.'),
+(55, 11, 1, 'Hotfix deploy', NULL, 'hotfix-deploy-2', 0, 0, '2025-08-06', 'Rollback ready.'),
+(56, 12, 28, 'Sprint planning', NULL, 'sprint-planning-2', 0, 0, '2025-07-25', 'Define sprint goal.'),
+(57, 13, 29, 'Order swag (design)', NULL, 'order-swag-2', 0, 0, '2025-11-01', 'Design finalized.'),
+(58, 14, 7, 'Cleanup backlog', NULL, 'cleanup-backlog', 0, 1, '2025-07-24', 'Close old issues.'),
+(59, 15, 25, 'Perf test run', NULL, 'perf-test-run', 0, 0, '2025-08-03', 'Run benchmarks.'),
+(60, 16, 17, 'Customer follow-ups', NULL, 'customer-followup', 0, 0, '2025-07-29', 'Send next steps.'),
+(61, 17, 22, 'Update docs', NULL, 'update-docs-2', 0, 0, '2025-07-28', 'Add missing examples.'),
+(62, 18, 20, 'Run experiment', NULL, 'run-experiment-2', 0, 1, '2025-07-30', 'Collect metrics daily.'),
+(63, 19, 24, 'Check mobile build', NULL, 'check-mobile-build-2', 0, 1, '2025-07-31', 'Build passed.'),
+(64, 20, 13, 'Analyze funnel 2', NULL, 'analyze-funnel-2', 0, 0, '2025-07-27', 'Conversion drop analysis.'),
+(65, 21, 11, 'Refactor small module', NULL, 'refactor-module-2', 0, 0, '2025-07-26', 'Backend tidy-up.'),
+(66, 22, 18, 'Integration smoke', NULL, 'integration-smoke', 0, 1, '2025-07-29', 'All tests passing.'),
+(67, 23, 9, 'UX microcopy work', NULL, 'ux-microcopy-2', 0, 0, '2025-07-28', 'Minor copy updates.'),
+(68, 24, 14, 'Design sync 2', NULL, 'design-sync-2', 0, 1, '2025-07-25', 'Sync with design system.'),
+(69, 25, 26, 'Compliance prep', NULL, 'compliance-prep', 0, 0, '2025-07-20', 'Gather logs.'),
+(70, 26, 16, 'Rotate keys 2', NULL, 'rotate-keys-2', 0, 0, '2025-07-22', 'Rotate API keys.'),
+(71, 27, 27, 'Refactor tests', NULL, 'refactor-tests-2', 0, 1, '2025-07-30', 'Simplify flaky tests.'),
+(72, 28, 19, 'Prototype signup', NULL, 'prototype-signup', 0, 0, '2025-07-31', 'Signup uplift prototype.'),
+(73, 29, 8, 'Stabilize RC', NULL, 'stabilize-rc', 0, 0, '2025-07-21', 'Prepare release candidate.'),
+(74, 30, 5, 'Write first post', NULL, 'write-first-post', 0, 0, '2025-07-23', 'Draft blog post.'),
+(75, 31, 11, 'Component cleanup', NULL, 'component-cleanup', 0, 0, '2025-07-25', 'Tidy components.');
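+
+-- Example (commented out): overall completion rate of the seeded daily tasks.
+-- Because completed is stored as 0/1, AVG() gives the completed fraction.
+-- SELECT ROUND(100 * AVG(completed), 1) AS pct_completed FROM daily_tasks;
+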
+
+-- ========== GOALS (35 rows) ==========
+INSERT INTO goals (id, userId, tagId, title, notes, onIce, status, priority, createdAt, completedAt, completed, schedule)
+VALUES
+(1, 1, 14, 'Portfolio redesign', 'Break into research > mockups > build > launch', 0, 'ACTIVE', 2, '2025-05-01 09:00:00', NULL, 0, '2025-08-15'),
+(2, 2, 4, 'Statistical model paper', 'Robust estimators — simulations + paper', 0, 'ACTIVE', 1, '2024-10-01 09:00:00', NULL, 0, '2026-01-30'),
+(3, 3, 6, 'Platform uptime 99.95%', 'Improve CI/CD, monitoring, alerting', 0, 'ACTIVE', 1, '2019-06-01 08:00:00', NULL, 0, '2025-09-01'),
+(4, 4, 25, 'Increase company profit 5%', 'Coordinate experiments, retention, and cost cuts', 0, 'ACTIVE', 1, '2025-02-01 09:00:00', NULL, 0, '2025-12-31'),
+(5, 5, 21, 'Improve onboarding conversion', NULL, 0, 'ACTIVE', 2, '2025-03-01 09:00:00', NULL, 0, '2025-06-01'),
+(6, 6, 25, 'Reduce API latency', NULL, 0, 'ACTIVE', 1, '2025-03-10 09:00:00', NULL, 0, '2025-05-01'),
+(7, 7, 14, 'Design system v2', NULL, 0, 'ACTIVE', 2, '2025-03-20 11:00:00', NULL, 0, '2025-07-01'),
+(8, 8, 13, 'Analytics: funnel breakdown', NULL, 0, 'ACTIVE', 3, '2025-04-01 10:45:00', NULL, 0, '2025-06-30'),
+(9, 9, 23, 'Mobile QA ramp', NULL, 0, 'ACTIVE', 2, '2025-04-10 09:00:00', NULL, 0, '2025-09-01'),
+(10, 10, 11, 'Frontend performance', NULL, 0, 'ACTIVE', 2, '2025-04-20 09:00:00', NULL, 0, '2025-06-01'),
+(11, 11, 16, 'Security hardening', NULL, 0, 'ACTIVE', 1, '2025-04-30 09:30:00', NULL, 0, '2025-05-15'),
+(12, 12, 22, 'Docs completeness', NULL, 0, 'ACTIVE', 3, '2025-05-05 10:00:00', NULL, 0, '2025-06-20'),
+(13, 13, 25, 'Benchmark harness', NULL, 0, 'ACTIVE', 2, '2025-05-12 09:00:00', NULL, 0, '2025-07-01'),
+(14, 14, 3, 'Bug backlog cleanup', NULL, 0, 'ACTIVE', 3, '2025-05-20 09:15:00', NULL, 0, '2025-06-05'),
+(15, 15, 20, 'Experiment: dashboard CTA', NULL, 0, 'ACTIVE', 3, '2025-05-25 10:00:00', NULL, 0, '2025-07-15'),
+(16, 16, 18, 'Integrations marketplace', NULL, 1, 'ON ICE', 2, '2025-05-30 09:00:00', NULL, 0, NULL),
+(17, 17, 7, 'Refactor auth flow', NULL, 0, 'ACTIVE', 2, '2025-06-01 10:30:00', NULL, 0, '2025-07-30'),
+(18, 18, 17, 'Customer success pilot', NULL, 0, 'ACTIVE', 2, '2025-06-05 11:00:00', NULL, 0, '2025-08-01'),
+(19, 19, 29, 'Holiday release planning', NULL, 0, 'PLANNED', 4, '2025-06-10 12:00:00', NULL, 0, '2025-11-15'),
+(20, 20, 1, 'Incident response improvements', NULL, 0, 'ACTIVE', 1, '2025-06-15 10:00:00', NULL, 0, '2025-07-01'),
+(21, 21, 4, 'Research ideas database', NULL, 0, 'ACTIVE', 2, '2025-06-20 09:00:00', NULL, 0, NULL),
+(22, 22, 28, 'Automate deployments', NULL, 0, 'ACTIVE', 2, '2025-06-25 09:00:00', NULL, 0, '2025-08-01'),
+(23, 23, 8, 'QA ramp for mobile', NULL, 0, 'ACTIVE', 2, '2025-07-01 10:00:00', NULL, 0, '2025-09-01'),
+(24, 24, 24, 'Web performance focus', NULL, 0, 'ACTIVE', 1, '2025-07-05 09:30:00', NULL, 0, '2025-09-15'),
+(25, 25, 26, 'Compliance readiness', NULL, 0, 'PLANNED', 1, '2025-07-10 11:00:00', NULL, 0, '2025-12-01'),
+(26, 26, 27, 'Refactor logging', NULL, 0, 'ACTIVE', 3, '2025-07-12 09:00:00', NULL, 0, '2025-08-31'),
+(27, 27, 30, 'Idea harvest', NULL, 0, 'ACTIVE', 4, '2025-07-15 14:00:00', NULL, 0, NULL),
+(28, 28, 19, 'Prototype feature A', NULL, 0, 'ACTIVE', 4, '2025-07-16 09:00:00', NULL, 0, '2025-07-30'),
+(29, 29, 2, 'Stabilize release', NULL, 0, 'ACTIVE', 2, '2025-07-20 09:00:00', NULL, 0, '2025-08-20'),
+(30, 30, 5, 'Personal blog launch', NULL, 0, 'ACTIVE', 4, '2025-07-22 10:00:00', NULL, 0, '2025-08-10'),
+(31, 31, 11, 'Front-end mastery', NULL, 0, 'ACTIVE', 3, '2025-07-24 09:00:00', NULL, 0, '2025-10-01'),
+(32, 32, 20, 'Experiment tracking', NULL, 0, 'ACTIVE', 2, '2025-07-26 09:00:00', NULL, 0, '2025-09-01'),
+(33, 33, 16, 'Secret scanning', NULL, 0, 'ACTIVE', 1, '2025-07-28 14:00:00', NULL, 0, '2025-08-15'),
+(34, 34, 9, 'UX microcopy baseline', NULL, 0, 'ACTIVE', 4, '2025-07-30 09:00:00', NULL, 0, '2025-08-15'),
+(35, 35, 18, 'Partner onboarding docs', NULL, 0, 'ACTIVE', 2, '2025-08-01 10:00:00', NULL, 0, '2025-09-15');
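+
+-- Example (commented out): active goals per user, assuming priority 1 is the
+-- most urgent, as in the rows above.
+-- SELECT userId, COUNT(*) AS active_goals, MIN(priority) AS top_priority
+--   FROM goals
+--  WHERE status = 'ACTIVE'
+--  GROUP BY userId
+--  ORDER BY top_priority, active_goals DESC;
+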
+
+-- ========== SUBGOALS (75 rows) ==========
+INSERT INTO subgoals (id, goalsId, title, notes, status, createdAt, completedAt, completed, schedule)
+VALUES
+(1, 1, 'Collect 8 case studies', 'Select projects and write case studies', 'ACTIVE', '2025-05-03 10:00:00', NULL, 0, '2025-06-10'),
+(2, 2, 'Write literature review', 'Collect related papers', 'ACTIVE', '2024-10-10 10:00:00', NULL, 0, '2025-02-01'),
+(3, 4, 'A/B retention experiments', 'Run 3 experiments to increase retention', 'ACTIVE', '2025-02-10 09:00:00', NULL, 0, '2025-06-30'),
+(4, 1, 'Design mockups mobile+web', NULL, 'ACTIVE', '2025-06-12 09:00:00', NULL, 0, '2025-07-05'),
+(5, 1, 'Launch portfolio site', NULL, 'PLANNED', '2025-07-10 09:00:00', NULL, 0, '2025-08-15'),
+(6, 2, 'Run simulations', NULL, 'ACTIVE', '2025-01-05 09:30:00', NULL, 0, '2025-04-01'),
+(7, 2, 'Draft methods section', NULL, 'ACTIVE', '2025-03-01 09:00:00', NULL, 0, '2025-08-01'),
+(8, 3, 'Add p95 latency alerting', NULL, 'ACTIVE', '2019-07-01 09:00:00', NULL, 0, '2025-03-15'),
+(9, 3, 'Improve CI pipeline', NULL, 'ACTIVE', '2020-01-01 12:00:00', NULL, 0, '2025-06-01'),
+(10, 3, 'Create runbook for incidents', NULL, 'ACTIVE', '2024-06-01 10:00:00', NULL, 0, '2025-07-01'),
+(11, 4, 'Cut ops cost 2%', NULL, 'ACTIVE', '2025-03-01 09:00:00', NULL, 0, '2025-09-30'),
+(12, 4, 'Reduce churn 1%', NULL, 'ACTIVE', '2025-02-15 09:00:00', NULL, 0, '2025-06-30'),
+(13, 5, 'Add onboarding checklist', NULL, 'ACTIVE', '2025-03-02 09:00:00', NULL, 0, '2025-05-01'),
+(14, 5, 'In-app walkthrough', NULL, 'ACTIVE', '2025-03-05 09:30:00', NULL, 0, '2025-04-15'),
+(15, 6, 'Introduce Redis cache', NULL, 'ACTIVE', '2025-03-14 09:00:00', NULL, 0, '2025-04-01'),
+(16, 6, 'Add query plan checks', NULL, 'ON ICE', '2025-03-20 08:00:00', NULL, 0, NULL),
+(17, 7, 'Create component library', NULL, 'ACTIVE', '2025-03-22 11:00:00', NULL, 0, '2025-06-01'),
+(18, 8, 'Add funnel viz', NULL, 'ACTIVE', '2025-03-12 09:00:00', NULL, 0, '2025-05-01'),
+(19, 9, 'Increase mobile tests', NULL, 'ACTIVE', '2025-04-02 10:00:00', NULL, 0, '2025-06-01'),
+(20, 10, 'Lazy-load charts', NULL, 'ACTIVE', '2025-04-03 09:20:00', NULL, 0, '2025-04-20'),
+(21, 11, 'Secrets scan in CI', NULL, 'ACTIVE', '2025-04-11 10:00:00', NULL, 0, '2025-04-30'),
+(22, 12, 'Bulk export sample', NULL, 'ACTIVE', '2025-04-21 12:00:00', NULL, 0, '2025-05-15'),
+(23, 13, 'Benchmark harness v1', NULL, 'ACTIVE', '2025-04-30 09:00:00', NULL, 0, '2025-06-15'),
+(24, 14, 'Close stale bugs', NULL, 'ACTIVE', '2025-05-06 09:00:00', '2025-06-06 12:00:00', 1, '2025-06-06'),
+(25, 15, 'Create CTA variants', NULL, 'ACTIVE', '2025-05-13 10:00:00', NULL, 0, '2025-06-20'),
+(26, 16, 'Alpha connector: Stripe', NULL, 'ON ICE', '2025-05-22 11:30:00', NULL, 0, NULL),
+(27, 17, 'Migrate token store', NULL, 'ACTIVE', '2025-05-26 09:00:00', NULL, 0, '2025-06-10'),
+(28, 18, 'Recruit pilot customers', NULL, 'ACTIVE', '2025-06-02 10:00:00', NULL, 0, '2025-07-01'),
+(29, 19, 'Holiday campaign assets', NULL, 'PLANNED', '2025-06-11 12:00:00', NULL, 0, '2025-10-01'),
+(30, 20, 'Run incident drills', NULL, 'ACTIVE', '2025-06-16 10:00:00', NULL, 0, '2025-07-01'),
+(31, 21, 'ML literature review', NULL, 'ON ICE', '2025-06-22 09:00:00', NULL, 0, NULL),
+(32, 22, 'Blue-green deploy test', NULL, 'ACTIVE', '2025-06-27 09:30:00', NULL, 0, '2025-07-15'),
+(33, 23, 'Mobile benchmark tests', NULL, 'ACTIVE', '2025-07-03 10:00:00', NULL, 0, '2025-08-01'),
+(34, 24, 'Split JS bundles', NULL, 'ACTIVE', '2025-07-07 09:00:00', NULL, 0, '2025-08-01'),
+(35, 25, 'Map SOC2 controls', NULL, 'PLANNED', '2025-07-12 11:00:00', NULL, 0, '2025-09-01'),
+(36, 26, 'Centralize logging', NULL, 'ACTIVE', '2025-07-14 09:00:00', NULL, 0, '2025-08-15'),
+(37, 27, 'Collect refactor candidates', NULL, 'ACTIVE', '2025-07-18 14:30:00', NULL, 0, '2025-07-31'),
+(38, 28, 'Harvest ideas from support', NULL, 'ACTIVE', '2025-07-19 10:00:00', NULL, 0, NULL),
+(39, 29, 'Outline holiday offers', NULL, 'PLANNED', '2025-07-22 16:00:00', NULL, 0, '2025-10-01'),
+(40, 30, 'Write blog draft #1', NULL, 'ACTIVE', '2025-07-24 10:30:00', NULL, 0, '2025-08-01'),
+(41, 31, 'Component docs', NULL, 'ACTIVE', '2025-07-26 09:30:00', NULL, 0, '2025-08-15'),
+(42, 32, 'Metric naming', NULL, 'ACTIVE', '2025-07-28 09:30:00', NULL, 0, '2025-08-30'),
+(43, 33, 'Add pre-commit hook', NULL, 'ACTIVE', '2025-07-30 09:10:00', NULL, 0, '2025-08-05'),
+(44, 34, 'CTA audit v2', NULL, 'ACTIVE', '2025-08-01 09:00:00', NULL, 0, '2025-08-12'),
+(45, 35, 'Partner doc outline', NULL, 'ACTIVE', '2025-08-02 10:15:00', NULL, 0, '2025-08-20'),
+(46, 36, 'Finalize tags API', NULL, 'ACTIVE', '2025-01-12 09:00:00', NULL, 0, '2025-02-15'),
+(47, 37, 'Cache warming strategy', NULL, 'ACTIVE', '2025-01-18 09:30:00', NULL, 0, '2025-02-25'),
+(48, 38, 'Experiment instrumentation', NULL, 'ACTIVE', '2025-02-06 10:00:00', NULL, 0, '2025-03-20'),
+(49, 39, 'Flaky tests triage', NULL, 'ACTIVE', '2025-02-20 09:00:00', NULL, 0, '2025-03-30'),
+(50, 40, 'Walkthrough script', NULL, 'ACTIVE', '2025-03-08 09:00:00', NULL, 0, '2025-04-10'),
+(51, 1, 'Portfolio final review', NULL, 'ACTIVE', '2025-05-01 09:00:00', NULL, 0, '2025-08-10'),
+(52, 2, 'Finalize simulation results', NULL, 'ACTIVE', '2025-05-10 09:00:00', NULL, 0, '2025-08-15'),
+(53, 3, 'Run emergency drill', NULL, 'ACTIVE', '2025-05-15 09:00:00', NULL, 0, '2025-08-20'),
+(54, 4, 'Monthly revenue report', NULL, 'ACTIVE', '2025-05-20 09:00:00', NULL, 0, '2025-08-31'),
+(55, 5, 'New user checklist', NULL, 'ACTIVE', '2025-05-25 09:00:00', NULL, 0, '2025-09-01'),
+(56, 6, 'DB optimization', NULL, 'ACTIVE', '2025-05-30 09:00:00', NULL, 0, '2025-09-10'),
+(57, 7, 'Design tokens audit', NULL, 'ACTIVE', '2025-06-04 09:00:00', NULL, 0, '2025-09-15'),
+(58, 8, 'Data hygiene', NULL, 'ACTIVE', '2025-06-09 09:00:00', NULL, 0, '2025-09-20'),
+(59, 9, 'Mobile regression', NULL, 'ACTIVE', '2025-06-14 09:00:00', NULL, 0, '2025-09-25'),
+(60, 10, 'Docs search index', NULL, 'ACTIVE', '2025-06-19 09:00:00', NULL, 0, '2025-09-30'),
+(61, 11, 'QA tech debt', NULL, 'ACTIVE', '2025-06-24 09:00:00', NULL, 0, '2025-10-05'),
+(62, 12, 'Sprint retrospective', NULL, 'ACTIVE', '2025-06-29 09:00:00', NULL, 0, '2025-10-10'),
+(63, 13, 'Perf baseline run', NULL, 'ACTIVE', '2025-07-04 09:00:00', NULL, 0, '2025-10-15'),
+(64, 14, 'Bug closure campaign', NULL, 'ACTIVE', '2025-07-09 09:00:00', NULL, 0, '2025-10-20'),
+(65, 15, 'CTA experiment', NULL, 'ACTIVE', '2025-07-14 09:00:00', NULL, 0, '2025-10-25'),
+(66, 16, 'Marketplace doc draft', NULL, 'ACTIVE', '2025-07-19 09:00:00', NULL, 0, '2025-11-01'),
+(67, 17, 'Auth refactor follow-up', NULL, 'ACTIVE', '2025-07-24 09:00:00', NULL, 0, '2025-11-05'),
+(68, 18, 'Customer check-ins', NULL, 'ACTIVE', '2025-07-29 09:00:00', NULL, 0, '2025-11-10'),
+(69, 19, 'Holiday brief', NULL, 'ACTIVE', '2025-08-03 09:00:00', NULL, 0, '2025-11-15'),
+(70, 20, 'Runbook test', NULL, 'ACTIVE', '2025-08-08 09:00:00', NULL, 0, '2025-11-20'),
+(71, 21, 'Add idea to DB', NULL, 'ACTIVE', '2025-08-13 09:00:00', NULL, 0, '2025-11-25'),
+(72, 22, 'Docs examples update', NULL, 'ACTIVE', '2025-08-18 09:00:00', NULL, 0, '2025-11-30'),
+(73, 23, 'UX follow up', NULL, 'ACTIVE', '2025-08-23 09:00:00', NULL, 0, '2025-12-05'),
+(74, 24, 'Design tokens final', NULL, 'ACTIVE', '2025-08-28 09:00:00', NULL, 0, '2025-12-10'),
+(75, 25, 'Compliance doc prep', NULL, 'ACTIVE', '2025-09-02 09:00:00', NULL, 0, '2025-12-20');
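+
+-- Example (commented out): subgoal progress per goal via the goalsId foreign
+-- key. Note that subgoals 46-50 reference goal ids 36-40, which this seed
+-- does not create, so those rows will not appear in this join.
+-- SELECT g.id, g.title, COUNT(s.id) AS subgoals, SUM(s.completed) AS done
+--   FROM goals g
+--   LEFT JOIN subgoals s ON s.goalsId = g.id
+--  GROUP BY g.id, g.title;
+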
+
+SET FOREIGN_KEY_CHECKS = 1;
\ No newline at end of file