diff --git a/api/backend/ml_models/__init__.py b/Dockerfile
similarity index 100%
rename from api/backend/ml_models/__init__.py
rename to Dockerfile
diff --git a/README.md b/README.md
index 5582fd65c2..b99cdaf309 100644
--- a/README.md
+++ b/README.md
@@ -1,85 +1,426 @@
-# Spring 2025 CS 3200 Project Template Repository
+# Coopalytics 🎓
-This repo is a template for your semester project. It includes most of the infrastructure setup (containers), sample databases, and example UI pages. Explore it fully and ask questions!
+[![Docker](https://img.shields.io/badge/Docker-Containerized-blue?logo=docker)](https://www.docker.com/)
+[![Streamlit](https://img.shields.io/badge/Frontend-Streamlit-red?logo=streamlit)](https://streamlit.io/)
+[![Flask](https://img.shields.io/badge/Backend-Flask-green?logo=flask)](https://flask.palletsprojects.com/)
+[![MySQL](https://img.shields.io/badge/Database-MySQL-orange?logo=mysql)](https://www.mysql.com/)

-## Prerequisites

+A comprehensive co-op management system designed to streamline the cooperative education process for students, employers, and administrators at Northeastern University.

-- A GitHub Account
-- A terminal-based git client or GUI Git client such as GitHub Desktop or the Git plugin for VSCode.
-- VSCode with the Python Plugin
-- A distribution of Python running on your laptop. The distro supported by the course is Anaconda or Miniconda.

+## 📋 Project Overview

-## Current Project Components

+Coopalytics is a full-stack web application that facilitates the entire co-op lifecycle, from position posting to application management and administrative oversight. The platform provides tailored experiences for four distinct user types:

-Currently, there are three major components that will each run in their own Docker Containers:

+### 🎯 Key Features

-- Streamlit App in the `./app` directory
-- Flask REST api in the `./api` directory
-- MySQL Database that will be initialized with SQL script files from the `./database-files` directory

+- **Student Portal**: Browse positions, submit applications, manage profiles, and track application status
+- **Academic Advisor Portal**: Monitor your advisees' application progress, analyze placement trends, identify students needing support, and access comprehensive analytics
+- **Employer Dashboard**: Post co-op positions, review applications, manage company profiles, and make hiring decisions
+- **Admin Panel**: Oversee all positions, manage user accounts, review flagged content, and maintain system integrity
+- **Real-time Application Management**: Status updates, notifications, and comprehensive tracking
+- **Advanced Filtering**: Search positions by industry, location, skills, and other criteria
+- **Diversity & Inclusion Analytics**: DEI reporting and insights for administrators

-## Suggestion for Learning the Project Code Base

+### 👥 Target Users

-If you are not familiar with web app development, this code base might be confusing. But don't worry, it's not that bad. Here are some suggestions for learning the code base:

+- **Students**: Seeking co-op opportunities and managing their application process
+- **Advisors**: Managing students' application processes and overseeing student data
+- **Employers**: Posting positions and managing the hiring process
+- **System Administrators**: Overseeing platform operations and maintaining data integrity

-1. 
Have two versions of the template repo - one for you to individually explore and lear and another for the team's project implementation. -1. Start by exploring the `./app` directory. This is where the Streamlit app is located. The Streamlit app is a Python-based web app that is used to interact with the user. It's a great way to build a simple web app without having to learn a lot of web development. -1. Next, explore the `./api` directory. This is where the Flask REST API is located. The REST API is used to interact with the database and perform other server-side tasks. -1. Finally, explore the `./database-files` directory. This is where the SQL scripts are located that will be used to initialize the MySQL database. +### 🛠️ Technology Stack -### Setting Up Your Personal Repo +- **Frontend**: Streamlit (Python-based web framework) +- **Backend API**: Flask with RESTful endpoints +- **Database**: MySQL with comprehensive relational schema +- **Containerization**: Docker & Docker Compose for easy deployment +- **Authentication**: Session-based user management +- **Styling**: Custom CSS with responsive design -1. In GitHub, click the **fork** button in the upper right corner of the repo screen. -1. When prompted, give the new repo a unique name, perhaps including your last name and the word 'personal'. -1. Once the fork has been created, clone YOUR forked version of the repo to your computer. -1. Set up the `.env` file in the `api` folder based on the `.env.template` file. -1. For running the testing containers (for your personal repo), you will tell `docker compose` to use a different configuration file named `docker-compose-testing.yaml`. - 1. `docker compose -f docker-compose-testing.yaml up -d` to start all the containers in the background - 1. `docker compose -f docker-compose-testing.yaml down` to shutdown and delete the containers - 1. `docker compose -f docker-compose-testing.yaml up db -d` only start the database container (replace db with api or app for the other two services as needed) - 1. `docker compose -f docker-compose-testing.yaml stop` to "turn off" the containers but not delete them. +## 🚀 Getting Started -### Setting Up Your Team's Repo +### Prerequisites -**Before you start**: As a team, one person needs to assume the role of _Team Project Repo Owner_. +Before running Coopalytics, ensure you have the following installed: -1. The Team Project Repo Owner needs to fork this template repo into their own GitHub account **and give the repo a name consistent with your project's name**. If you're worried that the repo is public, don't. Every team is doing a different project. -1. In the newly forked team repo, the Team Project Repo Owner should go to the **Settings** tab, choose **Collaborators and Teams** on the left-side panel. Add each of your team members to the repository with Write access. +- [Docker](https://www.docker.com/get-started) (version 20.0 or higher) +- [Docker Compose](https://docs.docker.com/compose/install/) (version 2.0 or higher) +- Git for cloning the repository -**Remaining Team Members** +### Installation & Setup -1. Each of the other team members will receive an invitation to join. Obviously accept the invite. -1. Once that process is complete, each team member, including the repo owner, should clone the Team's Repo to their local machines (in a different location your Personal Project Repo). -1. Set up the `.env` file in the `api` folder based on the `.env.template` file. -1. For running the testing containers (for your team's repo): - 1. 
`docker compose up -d` to start all the containers in the background - 1. `docker compose down` to shutdown and delete the containers - 1. `docker compose up db -d` only start the database container (replace db with api or app for the other two services as needed) - 1. `docker compose stop` to "turn off" the containers but not delete them. +1. **Clone the Repository** + ```bash + git clone https://github.com/your-username/Coopalytics.git + cd Coopalytics + ``` -**Note:** You can also use the Docker Desktop GUI to start and stop the containers after the first initial run. +2. **Start the Application** + ```bash + # Start all containers in detached mode + docker-compose up -d + ``` -## Handling User Role Access and Control +3. **Verify Container Status** + ```bash + # Check that all containers are running + docker-compose ps + ``` -In most applications, when a user logs in, they assume a particular role. For instance, when one logs in to a stock price prediction app, they may be a single investor, a portfolio manager, or a corporate executive (of a publicly traded company). Each of those _roles_ will likely present some similar features as well as some different features when compared to the other roles. So, how do you accomplish this in Streamlit? This is sometimes called Role-based Access Control, or **RBAC** for short. +4. **Access the Application** + - **Streamlit Frontend**: [http://localhost:8501](http://localhost:8501) + - **Flask API**: [http://localhost:4000](http://localhost:4000) + - **MySQL Database**: `localhost:3306` (for direct database access) -The code in this project demonstrates how to implement a simple RBAC system in Streamlit but without actually using user authentication (usernames and passwords). The Streamlit pages from the original template repo are split up among 3 roles - Political Strategist, USAID Worker, and a System Administrator role (this is used for any sort of system tasks such as re-training ML model, etc.). It also demonstrates how to deploy an ML model. +### Initial Setup -Wrapping your head around this will take a little time and exploration of this code base. Some highlights are below. +The application includes pre-populated sample data for immediate testing: +- Sample student, employer, and admin accounts +- Co-op positions across various industries +- Application records and company profiles -### Getting Started with the RBAC +## 📁 Application Structure -1. We need to turn off the standard panel of links on the left side of the Streamlit app. This is done through the `app/src/.streamlit/config.toml` file. So check that out. We are turning it off so we can control directly what links are shown. -1. Then I created a new python module in `app/src/modules/nav.py`. When you look at the file, you will se that there are functions for basically each page of the application. The `st.sidebar.page_link(...)` adds a single link to the sidebar. We have a separate function for each page so that we can organize the links/pages by role. -1. Next, check out the `app/src/Home.py` file. Notice that there are 3 buttons added to the page and when one is clicked, it redirects via `st.switch_page(...)` to that Roles Home page in `app/src/pages`. But before the redirect, I set a few different variables in the Streamlit `session_state` object to track role, first name of the user, and that the user is now authenticated. -1. Notice near the top of `app/src/Home.py` and all other pages, there is a call to `SideBarLinks(...)` from the `app/src/nav.py` module. 
This is the function that will use the role set in `session_state` to determine what links to show the user in the sidebar. -1. The pages are organized by Role. Pages that start with a `0` are related to the _Political Strategist_ role. Pages that start with a `1` are related to the _USAID worker_ role. And, pages that start with a `2` are related to The _System Administrator_ role. +``` +Coopalytics/ +├── app/ # Streamlit Frontend Application +│ ├── src/ +│ │ ├── pages/ # Individual page components +│ │ │ ├── 1*_Student_*.py # Student-facing pages +│ │ │ ├── 2*_Employer_*.py # Employer-facing pages +│ │ │ └── 3*_Admin_*.py # Admin-facing pages +│ │ ├── modules/ # Shared components and utilities +│ │ └── Home.py # Main application entry point +│ └── Dockerfile # Frontend container configuration +├── api/ # Flask Backend API +│ ├── backend/ +│ │ ├── applications/ # Application management endpoints +│ │ ├── coopPositions/ # Position management endpoints +│ │ ├── users/ # User management endpoints +│ │ └── db_connection.py # Database connection utilities +│ └── Dockerfile # Backend container configuration +├── database-files/ # MySQL Database Setup +│ ├── 01-coopalytics-schema.sql # Database schema definition +│ └── 02-coopalytics-data.sql # Sample data insertion +├── docker-compose.yaml # Container orchestration +└── README.md # This file +``` -## (VERY Optional) Adding an ML Model to your App +### Key Components -_Note_: This project only contains the infrastructure for a hypothetical ML model. +- **Frontend (`/app`)**: Streamlit-based user interface with role-based page access +- **Backend API (`/api`)**: Flask RESTful API with modular endpoint organization +- **Database (`/database-files`)**: MySQL schema and sample data for immediate functionality -1. Build, train, and test your ML model in a Jupyter Notebook. -1. Once you're happy with the model's performance, convert your Jupyter Notebook code for the ML model to a pure python script. You can include the `training` and `testing` functionality as well as the `prediction` functionality. You may or may not need to include data cleaning, though. -1. Check out the `api/backend/ml_models` module. In this folder, I've put a sample (read _fake_) ML model in `model01.py`. The `predict` function will be called by the Flask REST API to perform '_real-time_' prediction based on model parameter values that are stored in the database. **Important**: you would never want to hard code the model parameter weights directly in the prediction function. tl;dr - take some time to look over the code in `model01.py`. -1. The prediction route for the REST API is in `api/backend/customers/customer_routes.py`. Basically, it accepts two URL parameters and passes them to the `prediction` function in the `ml_models` module. The `prediction` route/function packages up the value(s) it receives from the model's `predict` function and send its back to Streamlit as JSON. -1. Back in streamlit, check out `app/src/pages/11_Prediction.py`. Here, I create two numeric input fields. When the button is pressed, it makes a request to the REST API URL `/c/prediction/.../...` function and passes the values from the two inputs as URL parameters. It gets back the results from the route and displays them. Nothing fancy here. 
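Once the containers are up, you can also hit the Flask API directly to confirm the backend is reachable. For example, using two of the read-only routes defined under `api/backend` (this assumes the blueprints are registered without a URL prefix; adjust the paths if `rest_entry.py` adds one):

```bash
# List all co-op positions through the REST API
curl http://localhost:4000/positions

# Average hourly pay by industry
curl http://localhost:4000/coopPositions/industryAveragePay
```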
## 💻 Usage

### User Personas & Access

The application supports four distinct user personas, each with tailored functionality:

#### 🎓 Student Persona
- **Dashboard**: View application status and recommended positions
- **Position Search**: Browse and filter available co-op opportunities
- **Application Management**: Submit applications and track their progress
- **Profile Management**: Update personal information and skills

#### 👨‍🏫 Advisor Persona
- **Advisor Dashboard**: View and manage advisor profile
- **Student Management**: View and flag assigned students' profiles and application status
- **Placement Analytics**: Monitor and filter detailed statistics for student placement
- **Company Partnerships**: Access detailed company information and ratings

#### 🏢 Employer Persona
- **Company Dashboard**: Manage company profile and posted positions
- **Position Management**: Create, edit, and manage co-op postings
- **Application Review**: View and process student applications
- **Candidate Profiles**: Access detailed student information and documents

#### 🔧 Administrator Persona
- **System Overview**: Monitor platform activity and user engagement
- **Position Moderation**: Review, approve, or flag co-op postings
- **User Management**: Oversee student and employer accounts
- **DEI Analytics**: Access diversity and inclusion reporting tools

### Navigation

1. **Home Page**: Select your user persona to access role-specific features
2. **Sidebar Navigation**: Use the left sidebar to navigate between different sections
3. **Quick Actions**: Utilize dashboard widgets for common tasks
4. **Search & Filters**: Apply filters to find relevant positions or applications

## 🔧 Development

### Container Management

```bash
# Stop all containers
docker-compose down

# View container logs
docker logs web-app          # Streamlit frontend logs
docker logs web-api          # Flask backend logs
docker logs coopalytics-db   # MySQL database logs

# Rebuild containers after code changes
docker-compose up -d --build

# Restart specific container
docker-compose restart web-app
```

### Environment Configuration

The application uses Docker environment variables defined in `docker-compose.yaml`:

- **Database Configuration**: MySQL credentials and connection settings
- **API Configuration**: Flask server settings and CORS policies
- **Frontend Configuration**: Streamlit server configuration
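The API container also reads its database settings from `api/.env`, built from `api/.env.template`. A minimal example is sketched below; every value is a placeholder, so substitute your own secret key and root password:

```bash
# api/.env -- placeholder values only
SECRET_KEY=someCrazyS3cR3T!Key.!
DB_USER=root
DB_HOST=db
DB_PORT=3306
DB_NAME=coopalytics
MYSQL_ROOT_PASSWORD=change-me
```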
### Database Access

```bash
# Access MySQL database directly
docker exec -it coopalytics-db mysql -u root -p

# Export database backup
docker exec coopalytics-db mysqldump -u root -p coopalytics > backup.sql
```

### Troubleshooting

**Port Conflicts**
```bash
# If ports 8501 or 4000 are in use, modify docker-compose.yaml
# Change the port mapping: "8502:8501" for frontend, "4001:4000" for backend
```

**Container Startup Issues**
```bash
# Check container status
docker-compose ps

# View detailed logs
docker-compose logs

# Restart all containers
docker-compose down && docker-compose up -d
```

**Database Connection Problems**
```bash
# Reset database container
docker-compose down
docker volume rm coopalytics_mysql_data
docker-compose up -d
```

**Application Not Loading**
- Ensure all containers are running: `docker-compose ps`
- Check for port conflicts on 8501 and 4000
- Verify Docker daemon is running
- Clear browser cache and try again

### Performance Optimization

- **Container Resources**: Adjust memory limits in `docker-compose.yaml` if needed
- **Database Performance**: Monitor MySQL logs for slow queries
- **Frontend Caching**: Streamlit automatically caches data; restart container to clear cache

## 📝 License

This project is developed for educational purposes as part of the CS3200 Database Design course at Northeastern University.

## 🤝 Contributing

This is an academic project. For questions or issues, please contact the development team or course instructors.

---
**Built with ❤️ by Co-op Huntrix**
diff --git a/api/.env.template b/api/.env.template
index b24b99326f..3f10128f11 100644
--- a/api/.env.template
+++ b/api/.env.template
@@ -2,5 +2,5 @@ SECRET_KEY=someCrazyS3cR3T!Key.!
 DB_USER=root
 DB_HOST=db
 DB_PORT=3306
-DB_NAME=northwind
-MYSQL_ROOT_PASSWORD=
+DB_NAME=coopalytics
+MYSQL_ROOT_PASSWORD=" "
diff --git a/api/backend/advisoradvisee/advisoradvisee_routes.py b/api/backend/advisoradvisee/advisoradvisee_routes.py
new file mode 100644
index 0000000000..0abe173dc9
--- /dev/null
+++ b/api/backend/advisoradvisee/advisoradvisee_routes.py
@@ -0,0 +1,56 @@
from flask import Blueprint
from flask import request
from flask import jsonify
from flask import make_response
from flask import current_app
from backend.db_connection import db

advisoradvisee = Blueprint('advisoradvisee', __name__)

# Advisor identifies students with too few applications
@advisoradvisee.route('/advisor/<advisorID>/students/low-applications', methods=['GET'])
def get_students_with_low_applications(advisorID):
    current_app.logger.info('GET /advisor/<advisorID>/students/low-applications route')

    query = '''
        SELECT u.userId,
               u.firstName,
               u.lastName,
               COUNT(apps.applicationId) AS totalApps
        FROM advisor_advisee aa
        JOIN users u ON u.userId = aa.studentId
        LEFT JOIN appliesToApp ata ON ata.studentId = u.userId
        LEFT JOIN applications apps ON ata.applicationId = apps.applicationId
        WHERE aa.advisorId = %s
        GROUP BY u.userId, u.firstName, u.lastName
        HAVING COUNT(apps.applicationId) < 5
        ORDER BY totalApps ASC, u.lastName;
    '''

    cursor = db.get_db().cursor()
    # Use a parameterized query so the advisor id is never interpolated into the SQL string
    cursor.execute(query, (advisorID,))
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response

# Admin reassigns students to different advisors as needed
@advisoradvisee.route('/admin/<advisorId>/<studentId>', methods=['PUT'])
def reassignAdvisor(advisorId, studentId):
    current_app.logger.info('PUT /admin/<advisorId>/<studentId> route')

    query = '''
        UPDATE advisor_advisee
        SET advisorId = %s
        WHERE studentId = %s;
    '''
    data = (advisorId, studentId)
    cursor = db.get_db().cursor()
    cursor.execute(query, data)
    db.get_db().commit()
    return 'advisor reassigned successfully'
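A quick, informal way to exercise the two advisor/advisee routes above once the stack is running (assuming the `advisoradvisee` blueprint is registered without a URL prefix, and using ids that are only examples from the sample data):

```bash
# Advisees of advisor 1 with fewer than 5 applications
curl http://localhost:4000/advisor/1/students/low-applications

# Reassign student 2 to advisor 3
curl -X PUT http://localhost:4000/admin/3/2
```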
diff --git a/api/backend/applications/applications_routes.py b/api/backend/applications/applications_routes.py
new file mode 100644
index 0000000000..ce012ef7e6
--- /dev/null
+++ b/api/backend/applications/applications_routes.py
@@ -0,0 +1,256 @@
from flask import Blueprint
from flask import request
from flask import jsonify
from flask import make_response
from flask import current_app
from backend.db_connection import db
import pymysql

# New Blueprint for applications
applications = Blueprint('applications', __name__)


# Student viewing their own application statuses
@applications.route('/student/<studentID>/applications', methods=['GET'])
def get_student_applications(studentID):
    current_app.logger.info(f'GET /student/{studentID}/applications route')

    query = '''
        SELECT u.userId,
               u.firstName,
               u.lastName,
               a.applicationId,
               a.status AS applicationStatus,
               a.resume,
               a.coverLetter,
               a.gpa,
               cp.title AS positionTitle,
               cp.deadline AS applicationDeadline,
               a.dateTimeApplied,
               cp.description AS positionDescription
        FROM users u
        JOIN appliesToApp ata ON u.userId = ata.studentId
        JOIN applications a ON ata.applicationId = a.applicationId
        JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId
        WHERE u.userId = %s
        ORDER BY a.dateTimeApplied DESC, cp.deadline ASC
    '''

    connection = db.get_db()

    cursor = connection.cursor(pymysql.cursors.DictCursor)
    cursor.execute(query, (studentID,))
    theData = cursor.fetchall()

    return make_response(jsonify(theData), 200)

# Student sees how many positions they have applied to
@applications.route('/student/<studentID>/applications/summary', methods=['GET'])
def get_numb_apps(studentID):
    current_app.logger.info('GET /student/<studentID>/applications/summary route')

    query = '''
        SELECT a.status,
               COUNT(*) AS ApplicationCount
        FROM applications a
        JOIN appliesToApp ata ON a.applicationId = ata.applicationId
        WHERE ata.studentId = %s
        GROUP BY a.status
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query, (studentID,))
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response


# Advisor viewing all their advisees' application statuses
@applications.route('/advisor/<advisorID>/students/applications', methods=['GET'])
def get_advisor_student_applications(advisorID):
    current_app.logger.info('GET /advisor/<advisorID>/students/applications route')

    query = '''
        SELECT aa.advisorId,
               u.userId,
               u.firstName,
               u.lastName,
               a.applicationId,
               a.status AS applicationStatus,
               cp.title AS positionTitle,
               cp.deadline AS applicationDeadline,
               com.name AS companyName,
               a.dateTimeApplied
        FROM advisor_advisee aa
        JOIN users u ON aa.studentId = u.userId
        JOIN appliesToApp ata ON u.userId = ata.studentId
        JOIN applications a ON ata.applicationId = a.applicationId
        JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId
        JOIN companyProfiles com ON cp.companyProfileId = com.companyProfileId
        LEFT JOIN workedAtPos wp ON u.userId = wp.studentId AND wp.coopPositionId = cp.coopPositionId
        WHERE aa.advisorId = %s
        ORDER BY u.lastName, u.firstName, a.dateTimeApplied DESC
    '''

    cursor = db.get_db().cursor()
    # Use a parameterized query so the advisor id is never interpolated into the SQL string
    cursor.execute(query, (advisorID,))
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response

# Employer views all applications of a posting
@applications.route('/applications/<coopPositionId>', methods=['GET'])
def get_applications(coopPositionId):
    current_app.logger.info('GET /applications/%s', coopPositionId)

    query = '''
        SELECT a.dateTimeApplied, a.status, a.resume, a.gpa, a.coverLetter,
               a.coopPositionId, a.applicationId
        FROM applications a
        JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId
        WHERE a.coopPositionId = %s
        ORDER BY a.dateTimeApplied DESC;
    '''
    cursor = db.get_db().cursor()
    cursor.execute(query, (coopPositionId,))
    theData = cursor.fetchall()

    return make_response(jsonify(theData), 200)

# NEW ENDPOINT: Employer views all applications with student details for a specific position
@applications.route('/applications/<coopPositionId>/with-students', methods=['GET'])
def get_applications_with_students(coopPositionId):
    current_app.logger.info('GET /applications/%s/with-students', coopPositionId)

    query = '''
        SELECT a.dateTimeApplied, a.status, a.resume, a.gpa, a.coverLetter,
               a.coopPositionId, a.applicationId,
               u.userId as studentId, u.firstName, u.lastName, u.email,
               u.major, u.minor, u.college, u.gradYear, u.grade
        FROM applications a
        JOIN appliesToApp ata ON a.applicationId = ata.applicationId
        JOIN users u ON ata.studentId = u.userId
        JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId
        WHERE a.coopPositionId = %s
        ORDER BY a.dateTimeApplied DESC;
    '''
    cursor = db.get_db().cursor()
    cursor.execute(query, (coopPositionId,))
    theData = cursor.fetchall()

    return make_response(jsonify(theData), 200)

# NEW ENDPOINT: Update application status (for employers)
@applications.route('/applications/<applicationId>/status', methods=['PUT'])
def update_application_status(applicationId):
    current_app.logger.info('PUT /applications/%s/status', applicationId)

    try:
        request_data = request.json
        new_status = request_data.get('status')

        if not new_status:
            return make_response(jsonify({"error": "Status is required"}), 400)

        # Validate status values
        valid_statuses = ['Draft', 'Submitted', 'Under Review', 'Accepted', 'Rejected']
        if new_status not in valid_statuses:
            return make_response(jsonify({"error": "Invalid status"}), 400)

        query = '''
            UPDATE applications
            SET status = %s
            WHERE applicationId = %s
        '''

        cursor = db.get_db().cursor()
        cursor.execute(query, (new_status, applicationId))

        if cursor.rowcount == 0:
            return make_response(jsonify({"error": "Application not found"}), 404)

        db.get_db().commit()

        return make_response(jsonify({
            "success": True,
            "applicationId": applicationId,
            "newStatus": new_status
        }), 200)

    except Exception as e:
        current_app.logger.error(f"Error updating application status: {e}")
        return make_response(jsonify({"error": "Internal server error"}), 500)

# NEW ENDPOINT: Get single application details by application ID
@applications.route('/applications/<applicationId>/details', methods=['GET'])
def get_application_details(applicationId):
    current_app.logger.info('GET /applications/%s/details', applicationId)

    query = '''
        SELECT a.applicationId, a.dateTimeApplied, a.status, a.resume, a.gpa, a.coverLetter,
               a.coopPositionId, cp.title as positionTitle, cp.location, cp.hourlyPay,
               cp.deadline, cp.industry, cp.description as positionDescription,
               u.userId as studentId, u.firstName, u.lastName, u.email,
               u.major, u.minor, u.college, u.gradYear, u.grade
        FROM applications a
        JOIN appliesToApp ata ON a.applicationId = ata.applicationId
        JOIN users u ON ata.studentId = u.userId
        JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId
        WHERE a.applicationId = %s
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query, (applicationId,))
    theData = cursor.fetchall()

    if not theData:
        return make_response(jsonify({"error": "Application not found"}), 404)

    return make_response(jsonify(theData[0]), 200)

# Student applies to a position
@applications.route('/applications/new', methods=['POST'])
def create_application():
    current_app.logger.info('POST /applications/new')

    data = request.json
    required_fields = ['coopPositionId', 'studentId']

    if not all(field in data for field in required_fields):
        current_app.logger.warning('POST /applications/new missing required fields')
        return make_response(jsonify({"error": "coopPositionId and studentId are required"}), 400)

    try:
        cursor = db.get_db().cursor()

        # Insert application
        cursor.execute('''
            INSERT INTO applications (resume, gpa, coverLetter, coopPositionId)
            VALUES (%s, %s, %s, %s)
        ''', (
            data.get('resume', ''),
            data.get('gpa'),
            data.get('coverLetter', ''),
            data['coopPositionId']
        ))

        application_id = cursor.lastrowid

        # Link student to application
        cursor.execute('''
            INSERT INTO appliesToApp (applicationId, studentId)
            VALUES (%s, %s)
        ''', (application_id, data['studentId']))

        db.get_db().commit()
        return jsonify({"message": "Application submitted", "applicationId": application_id}), 201

    except Exception as e:
        current_app.logger.error(f"❌ Error creating application: {e}")
        return jsonify({"error": str(e)}), 500  # Temporarily return full error for debugging
\ No newline at end of file
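As a hedged usage sketch for the status-update endpoint above (again assuming no blueprint URL prefix; the application id is only an example), an employer-side client could send:

```bash
# Move application 7 to "Under Review"
curl -X PUT http://localhost:4000/applications/7/status \
     -H "Content-Type: application/json" \
     -d '{"status": "Under Review"}'
```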
diff --git a/api/backend/companyProfiles/companyProfiles_routes.py b/api/backend/companyProfiles/companyProfiles_routes.py
new file mode 100644
index 0000000000..4b2b4f7abc
--- /dev/null
+++ b/api/backend/companyProfiles/companyProfiles_routes.py
@@ -0,0 +1,42 @@
from flask import Blueprint
from flask import request
from flask import jsonify
from flask import make_response
from flask import current_app
from backend.db_connection import db

companyProfiles = Blueprint('companyProfiles', __name__)

# Student/Advisor views a company profile
@companyProfiles.route('/companyProfiles/<companyProfileId>', methods=['GET'])
def get_company_profile(companyProfileId):
    query = '''
        SELECT companyProfileId, name, bio, industry, websiteLink
        FROM companyProfiles
        WHERE companyProfileId = %s
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query, (companyProfileId,))
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response

# Advisor views all company profiles
@companyProfiles.route('/companyProfiles', methods=['GET'])
def get_all_company_profiles():
    query = '''
        SELECT companyProfileId, name, bio, industry, websiteLink
        FROM companyProfiles
        ORDER BY name
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query)
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response
\ No newline at end of file
diff --git a/api/backend/coopPositions/coopPositions_routes.py b/api/backend/coopPositions/coopPositions_routes.py
new file mode 100644
index 0000000000..cde33fec35
--- /dev/null
+++ b/api/backend/coopPositions/coopPositions_routes.py
@@ -0,0 +1,456 @@
from flask import Blueprint
from flask import request
from flask import jsonify
from flask import make_response
from flask import current_app
from backend.db_connection import db

coopPositions = Blueprint('coopPositions', __name__)

# Student views a co-op position
@coopPositions.route('/positions', methods=['GET'])
def get_position_info():
    current_app.logger.info('GET /positions route')
    query = '''
        SELECT cp.*
        FROM coopPositions cp
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query)
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response


# Student/Advisor views the average pay for each industry
@coopPositions.route('/coopPositions/industryAveragePay', methods=['GET'])
def get_industry_average_pay():
    query = '''
        SELECT cp.industry, AVG(cp.hourlyPay) AS industryAvgHourlyPay
        FROM coopPositions cp
        GROUP BY cp.industry;
    '''

    current_app.logger.info('GET /industryAveragePay route')

    cursor = db.get_db().cursor()
    cursor.execute(query)
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response


# Student views positions with desired skills that match their skills
@coopPositions.route('/<studentID>/desiredSkills', methods=['GET'])
def get_desired_skills(studentID):
    current_app.logger.info('GET /desiredSkills route')

    query = '''
        SELECT cp.coopPositionId,
               cp.title,
               cp.location,
               cp.description
        FROM coopPositions cp
        LEFT JOIN viewsPos vp ON cp.coopPositionId = vp.coopPositionId
        JOIN users u ON u.userId = %s
        WHERE (vp.preference IS NULL OR vp.preference = TRUE)
          AND cp.desiredSkillsId IN (SELECT skillId
                                     FROM skillDetails
                                     WHERE studentId = %s)
          AND (cp.desiredGPA IS NULL OR cp.desiredGPA <= u.grade)
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query, (studentID, studentID))
    theData = cursor.fetchall()
    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response


# Students view positions with required skills that match their skills
@coopPositions.route('/<studentID>/requiredSkills', methods=['GET'])
def get_required_skills(studentID):
    current_app.logger.info('GET /requiredSkills route')

    query = '''
        SELECT cp.coopPositionId,
               cp.title,
               cp.location,
               cp.description
        FROM coopPositions cp
        LEFT JOIN viewsPos vp ON cp.coopPositionId = vp.coopPositionId
        JOIN users u ON u.userId = %s
        WHERE (vp.preference IS NULL OR vp.preference = TRUE)
          AND cp.requiredSkillsId IN (SELECT skillId
                                      FROM skillDetails
                                      WHERE studentId = %s)
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query, (studentID, studentID))
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response


# Employer posts co-op position
@coopPositions.route('/createsPos/coopPosition', methods=['POST'])
def create_position():
    current_app.logger.info('POST /createsPos/coopPosition')
    pos_info = request.json
    coop_position_id = pos_info['coopPositionId']
    title = pos_info['title']
    location = pos_info['location']
    description = pos_info['description']
    hourly_pay = pos_info['hourlyPay']
    required_skills = pos_info.get('requiredSkillsId')
    desired_skills = pos_info.get('desiredSkillsId')
    desired_gpa = pos_info.get('desiredGPA')
    deadline = pos_info.get('deadline')
    start_date = pos_info['startDate']
    end_date = pos_info['endDate']
    flag = pos_info.get('flagged', False)
    industry = pos_info['industry']

    query = '''
        INSERT INTO coopPositions
            (coopPositionId, title, location, description, hourlyPay, requiredSkillsId,
             desiredSkillsId, desiredGPA, deadline, startDate, endDate, flag, industry)
        VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s);
    '''
    data = (coop_position_id, title, location, description, hourly_pay,
            required_skills, desired_skills, desired_gpa, deadline, start_date,
            end_date, flag, industry)

    cursor = db.get_db().cursor()
    cursor.execute(query, data)
    db.get_db().commit()
    return make_response(jsonify({"message": "Position created!"}), 201)


# Admin reviews all positions (both pending and approved)
@coopPositions.route('/coopPositions/pending', methods=['GET'])
def get_all_positions_for_admin():
    current_app.logger.info('GET /pending route')

    query = '''
        SELECT
            cp.coopPositionId,
            cp.title,
            cp.location,
            cp.description,
            cp.hourlyPay,
            cp.deadline,
            cp.startDate,
            cp.endDate,
            cp.industry,
            cp.flag,
            com.name AS companyName
        FROM coopPositions cp
        LEFT JOIN createsPos cr ON cr.coopPositionId = cp.coopPositionId
        LEFT JOIN users u ON u.userId = cr.employerId
        LEFT JOIN companyProfiles com ON com.companyProfileId = u.companyProfileId
        ORDER BY cp.flag DESC, cp.deadline IS NULL, cp.deadline ASC, cp.coopPositionId DESC
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query)
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response

# Admin views number of co-ops posted by each employer
@coopPositions.route('/coopPositions/employerJobCounts', methods=['GET'])
def get_employer_job_counts():
    current_app.logger.info('GET /employerJobCounts route')

    query = '''
        SELECT
            u.userId AS employerId,
            u.firstName,
            u.lastName,
            com.name AS companyName,
            COUNT(cr.coopPositionId) AS numJobs
        FROM users u
        JOIN companyProfiles com
            ON com.companyProfileId = u.companyProfileId
        LEFT JOIN createsPos cr
            ON cr.employerId = u.userId
        WHERE u.companyProfileId IS NOT NULL
        GROUP BY u.userId, u.firstName, u.lastName, com.name
        ORDER BY numJobs DESC, u.lastName ASC, u.firstName ASC;
    '''

    cursor = db.get_db().cursor()
    cursor.execute(query)
    theData = cursor.fetchall()

    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response

# Admin approves a co-op position
@coopPositions.route('/coopPositions/<pos_id>/approve', methods=['PUT'])
def approve_position(pos_id):
    current_app.logger.info('PUT /coopPositions/%s/approve route', pos_id)

    # First check if the position exists
    check_query = '''
        SELECT coopPositionId, flag
        FROM coopPositions
        WHERE coopPositionId = %s
    '''

    cursor = db.get_db().cursor()
    cursor.execute(check_query, (pos_id,))
    position = cursor.fetchone()

    if not position:
        the_response = make_response(jsonify({
            "ok": False,
            "error": f"Position {pos_id} not found"
        }))
        the_response.status_code = 404
        return the_response

    # Check if already approved
    if position['flag'] == 0:  # flag = FALSE means already approved
        the_response = make_response(jsonify({
            "ok": True,
            "positionId": pos_id,
            "status": "already approved",
            "message": f"Position {pos_id} is already approved"
        }))
        the_response.status_code = 200
        return the_response

    # Approve the position (set flag to FALSE)
    update_query = '''
        UPDATE coopPositions
        SET flag = FALSE
        WHERE coopPositionId = %s
    '''

    cursor.execute(update_query, (pos_id,))
    db.get_db().commit()

    the_response = make_response(jsonify({
        "ok": True,
        "positionId": pos_id,
        "status": "approved",
        "message": f"Position {pos_id} has been approved"
    }))
    the_response.status_code = 200
    return the_response

# Admin deletes a co-op position
@coopPositions.route('/coopPositions/<pos_id>', methods=['DELETE'])
def delete_position(pos_id):
    current_app.logger.info('DELETE /coopPositions/%s route', pos_id)

    # First check if the position exists
    check_query = '''
        SELECT coopPositionId, flag, title
        FROM coopPositions
        WHERE coopPositionId = %s
    '''

    cursor = db.get_db().cursor()
    cursor.execute(check_query, (pos_id,))
    position = cursor.fetchone()

    if not position:
        the_response = make_response(jsonify({
            "ok": False,
            "error": f"Position {pos_id} not found"
        }))
        the_response.status_code = 404
        return the_response

    # Delete the position (remove flag restriction for admin flexibility)
    delete_query = '''
        DELETE FROM coopPositions
        WHERE coopPositionId = %s
    '''

    try:
        cursor.execute(delete_query, (pos_id,))
        db.get_db().commit()

        the_response = make_response(jsonify({
            "ok": True,
            "positionId": pos_id,
            "deleted": True,
            "message": f"Position {pos_id} '{position['title']}' has been permanently deleted"
        }))
        the_response.status_code = 200
        return the_response

    except Exception as e:
        current_app.logger.error(f"Error deleting position {pos_id}: {e}")
        the_response = make_response(jsonify({
            "ok": False,
            "error": f"Cannot delete position {pos_id} due to related records (applications, etc.)"
        }))
        the_response.status_code = 409
        return the_response

# Admin flags a position
@coopPositions.route('/coopPositions/<pos_id>/flag/<int:value>', methods=['PUT'])
def set_position_flag(pos_id, value):
    current_app.logger.info('PUT /coopPositions/%s/flag/%s route', pos_id, value)

    # First check if the position exists
    check_query = '''
        SELECT coopPositionId, flag
        FROM coopPositions
        WHERE coopPositionId = %s
    '''

    cursor = db.get_db().cursor()
    cursor.execute(check_query, (pos_id,))
    position = cursor.fetchone()

    if not position:
        the_response = make_response(jsonify({
            "ok": False,
            "error": f"Position {pos_id} not found"
        }))
        the_response.status_code = 404
        return the_response

    # Validate flag value (should be 0 or 1)
    if value not in [0, 1]:
        the_response = make_response(jsonify({
            "ok": False,
            "error": "Flag value must be 0 (approved) or 1 (flagged)"
        }))
        the_response.status_code = 400
        return the_response

    # Update the flag
    update_query = '''
        UPDATE coopPositions
        SET flag = %s
        WHERE coopPositionId = %s
    '''

    cursor.execute(update_query, (value, pos_id))
    db.get_db().commit()

    flag_status = "flagged" if value == 1 else "approved"
    the_response = make_response(jsonify({
        "ok": True,
        "positionId": pos_id,
        "flag": value,
        "status": flag_status,
        "message": f"Position {pos_id} has been {flag_status}"
    }))
    the_response.status_code = 200
    return the_response

# Admin removes a flag from a position
@coopPositions.route('/coopPositions/<pos_id>/unflag', methods=['PUT'])
def unflag_position(pos_id):
    current_app.logger.info('PUT /coopPositions/%s/unflag route', pos_id)

    # First check if the position exists
    check_query = '''
        SELECT coopPositionId, flag
        FROM coopPositions
        WHERE coopPositionId = %s
    '''

    cursor = db.get_db().cursor()
    cursor.execute(check_query, (pos_id,))
    position = cursor.fetchone()

    if not position:
        the_response = make_response(jsonify({
            "ok": False,
            "error": f"Position {pos_id} not found"
        }))
        the_response.status_code = 404
        return the_response

    # Update the flag to FALSE (unflag)
    update_query = '''
        UPDATE coopPositions
        SET flag = FALSE
        WHERE coopPositionId = %s
    '''

    cursor.execute(update_query, (pos_id,))
    db.get_db().commit()

    the_response = make_response(jsonify({
        "ok": True,
        "positionId": pos_id,
        "flag": 0,
        "status": "approved",
        "message": f"Position {pos_id} has been unflagged (approved)"
    }))
    the_response.status_code = 200
    return the_response

@coopPositions.route('/allPositions', methods=['GET'])
def get_all_positions():
    current_app.logger.info('GET /allPositions route')
    query = '''
        SELECT
            coopPositionId,
            title,
            location,
            description,
            hourlyPay,
            desiredGPA,
            deadline,
            startDate,
            endDate,
            industry
        FROM coopPositions
        ORDER BY deadline ASC, coopPositionId DESC
    '''
    cursor = db.get_db().cursor()
    cursor.execute(query)
    theData = cursor.fetchall()
    the_response = make_response(jsonify(theData))
    the_response.status_code = 200
    return the_response

# NEW ENDPOINT: Get positions created by a specific employer
@coopPositions.route('/employers/<employerId>/positions', methods=['GET'])
def get_employer_positions(employerId):
    current_app.logger.info('GET /employers/%s/positions', employerId)

    query = '''
        SELECT cp.coopPositionId, cp.title, cp.description, cp.location,
               cp.hourlyPay, cp.startDate, cp.endDate, cp.deadline,
               cp.industry, comp.name as companyName
        FROM coopPositions cp
        JOIN createsPos crp ON cp.coopPositionId = crp.coopPositionId
        LEFT JOIN users emp ON crp.employerId = emp.userId
        LEFT JOIN companyProfiles comp ON emp.companyProfileId =
comp.companyProfileId + WHERE crp.employerId = %s + ORDER BY cp.deadline DESC; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (employerId,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + diff --git a/api/backend/customers/customer_routes.py b/api/backend/customers/customer_routes.py deleted file mode 100644 index 4fda460220..0000000000 --- a/api/backend/customers/customer_routes.py +++ /dev/null @@ -1,83 +0,0 @@ -######################################################## -# Sample customers blueprint of endpoints -# Remove this file if you are not using it in your project -######################################################## -from flask import Blueprint -from flask import request -from flask import jsonify -from flask import make_response -from flask import current_app -from backend.db_connection import db -from backend.ml_models.model01 import predict - -#------------------------------------------------------------ -# Create a new Blueprint object, which is a collection of -# routes. -customers = Blueprint('customers', __name__) - - -#------------------------------------------------------------ -# Get all customers from the system -@customers.route('/customers', methods=['GET']) -def get_customers(): - - cursor = db.get_db().cursor() - cursor.execute('''SELECT id, company, last_name, - first_name, job_title, business_phone FROM customers - ''') - - theData = cursor.fetchall() - - the_response = make_response(jsonify(theData)) - the_response.status_code = 200 - return the_response - -#------------------------------------------------------------ -# Update customer info for customer with particular userID -# Notice the manner of constructing the query. -@customers.route('/customers', methods=['PUT']) -def update_customer(): - current_app.logger.info('PUT /customers route') - cust_info = request.json - cust_id = cust_info['id'] - first = cust_info['first_name'] - last = cust_info['last_name'] - company = cust_info['company'] - - query = 'UPDATE customers SET first_name = %s, last_name = %s, company = %s where id = %s' - data = (first, last, company, cust_id) - cursor = db.get_db().cursor() - r = cursor.execute(query, data) - db.get_db().commit() - return 'customer updated!' - -#------------------------------------------------------------ -# Get customer detail for customer with particular userID -# Notice the manner of constructing the query. 
-@customers.route('/customers/', methods=['GET']) -def get_customer(userID): - current_app.logger.info('GET /customers/ route') - cursor = db.get_db().cursor() - cursor.execute('SELECT id, first_name, last_name FROM customers WHERE id = {0}'.format(userID)) - - theData = cursor.fetchall() - - the_response = make_response(jsonify(theData)) - the_response.status_code = 200 - return the_response - -#------------------------------------------------------------ -# Makes use of the very simple ML model in to predict a value -# and returns it to the user -@customers.route('/prediction//', methods=['GET']) -def predict_value(var01, var02): - current_app.logger.info(f'var01 = {var01}') - current_app.logger.info(f'var02 = {var02}') - - returnVal = predict(var01, var02) - return_dict = {'result': returnVal} - - the_response = make_response(jsonify(return_dict)) - the_response.status_code = 200 - the_response.mimetype = 'application/json' - return the_response \ No newline at end of file diff --git a/api/backend/demographics/demographics_routes.py b/api/backend/demographics/demographics_routes.py new file mode 100644 index 0000000000..9710b37ab8 --- /dev/null +++ b/api/backend/demographics/demographics_routes.py @@ -0,0 +1,98 @@ +from flask import Blueprint +from flask import request +from flask import jsonify +from flask import make_response +from flask import current_app +from backend.db_connection import db + +demographics = Blueprint('demographics', __name__) + +# DEI by employer +@demographics.route('/demographics/employers/gender', methods=['GET']) +def dei_employers_gender(): + current_app.logger.info('GET /demographics/employers/gender route') + query = ''' + SELECT + com.name AS companyName, + d.gender, + COUNT(*) AS applicationCount + FROM applications a + JOIN appliesToApp ata ON ata.applicationId = a.applicationId + JOIN users us ON us.userId = ata.studentId + LEFT JOIN demographics d ON d.demographicId = us.userId + JOIN coopPositions cp ON cp.coopPositionId = a.coopPositionId + JOIN createsPos cr ON cr.coopPositionId = cp.coopPositionId + JOIN users ue ON ue.userId = cr.employerId + JOIN companyProfiles com ON com.companyProfileId = ue.companyProfileId + GROUP BY com.name, d.gender + ORDER BY com.name, d.gender; + ''' + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + resp = make_response(jsonify(theData)); resp.status_code = 200 + return resp + +# DEI by posting +@demographics.route('/demographics/positions/gender', methods=['GET']) +def dei_positions_gender(): + current_app.logger.info('GET /demographics/positions/gender route') + query = ''' + SELECT + cp.coopPositionId, + cp.title, + com.name AS companyName, + d.gender, + COUNT(*) AS applicationCount + FROM applications a + JOIN appliesToApp ata ON ata.applicationId = a.applicationId + JOIN users us ON us.userId = ata.studentId + LEFT JOIN demographics d ON d.demographicId = us.userId + JOIN coopPositions cp ON cp.coopPositionId = a.coopPositionId + JOIN createsPos cr ON cr.coopPositionId = cp.coopPositionId + JOIN users ue ON ue.userId = cr.employerId + JOIN companyProfiles com ON com.companyProfileId = ue.companyProfileId + GROUP BY cp.coopPositionId, cp.title, com.name, d.gender + ORDER BY cp.title, d.gender; + ''' + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + resp = make_response(jsonify(theData)); resp.status_code = 200 + return resp + +# /api/dei/metrics +@demographics.route('/api/dei/metrics', methods=['GET']) +def dei_metrics(): + 
current_app.logger.info('GET /api/dei/metrics route') + query = ''' + SELECT 'gender' AS metric, gender AS label, COUNT(*) AS count + FROM demographics + WHERE gender IS NOT NULL + GROUP BY gender + + UNION ALL + SELECT 'race' AS metric, race AS label, COUNT(*) AS count + FROM demographics + WHERE race IS NOT NULL + GROUP BY race + + UNION ALL + SELECT 'nationality' AS metric, nationality AS label, COUNT(*) AS count + FROM demographics + WHERE nationality IS NOT NULL + GROUP BY nationality + + UNION ALL + SELECT 'disability' AS metric, disability AS label, COUNT(*) AS count + FROM demographics + WHERE disability IS NOT NULL + GROUP BY disability + + ORDER BY metric, count DESC, label; + ''' + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + resp = make_response(jsonify(theData)); resp.status_code = 200 + return resp \ No newline at end of file diff --git a/api/backend/ml_models/model01.py b/api/backend/ml_models/model01.py deleted file mode 100644 index 368152fbab..0000000000 --- a/api/backend/ml_models/model01.py +++ /dev/null @@ -1,48 +0,0 @@ -""" -model01.py is an example of how to access model parameter values that you are storing -in the database and use them to make a prediction when a route associated with prediction is -accessed. -""" -from backend.db_connection import db -import numpy as np -import logging - - -def train(): - """ - You could have a function that performs training from scratch as well as testing (see below). - It could be activated from a route for an "administrator role" or something similar. - """ - return 'Training the model' - -def test(): - return 'Testing the model' - -def predict(var01, var02): - """ - Retreives model parameters from the database and uses them for real-time prediction - """ - # get a database cursor - cursor = db.get_db().cursor() - # get the model params from the database - query = 'SELECT beta_vals FROM model1_params ORDER BY sequence_number DESC LIMIT 1' - cursor.execute(query) - return_val = cursor.fetchone() - - params = return_val['beta_vals'] - logging.info(f'params = {params}') - logging.info(f'params datatype = {type(params)}') - - # turn the values from the database into a numpy array - params_array = np.array(list(map(float, params[1:-1].split(',')))) - logging.info(f'params array = {params_array}') - logging.info(f'params_array datatype = {type(params_array)}') - - # turn the variables sent from the UI into a numpy array - input_array = np.array([1.0, float(var01), float(var02)]) - - # calculate the dot product (since this is a fake regression) - prediction = np.dot(params_array, input_array) - - return prediction - diff --git a/api/backend/products/products_routes.py b/api/backend/products/products_routes.py deleted file mode 100644 index a3e596d0d3..0000000000 --- a/api/backend/products/products_routes.py +++ /dev/null @@ -1,208 +0,0 @@ -######################################################## -# Sample customers blueprint of endpoints -# Remove this file if you are not using it in your project -######################################################## - -from flask import Blueprint -from flask import request -from flask import jsonify -from flask import make_response -from flask import current_app -from backend.db_connection import db - -#------------------------------------------------------------ -# Create a new Blueprint object, which is a collection of -# routes. 
-products = Blueprint('products', __name__) - -#------------------------------------------------------------ -# Get all the products from the database, package them up, -# and return them to the client -@products.route('/products', methods=['GET']) -def get_products(): - query = ''' - SELECT id, - product_code, - product_name, - list_price, - category - FROM products - ''' - - # get a cursor object from the database - cursor = db.get_db().cursor() - - # use cursor to query the database for a list of products - cursor.execute(query) - - # fetch all the data from the cursor - # The cursor will return the data as a - # Python Dictionary - theData = cursor.fetchall() - - # Create a HTTP Response object and add results of the query to it - # after "jasonify"-ing it. - response = make_response(jsonify(theData)) - # set the proper HTTP Status code of 200 (meaning all good) - response.status_code = 200 - # send the response back to the client - return response - -# ------------------------------------------------------------ -# get product information about a specific product -# notice that the route takes and then you see id -# as a parameter to the function. This is one way to send -# parameterized information into the route handler. -@products.route('/product/', methods=['GET']) -def get_product_detail (id): - - query = f'''SELECT id, - product_name, - description, - list_price, - category - FROM products - WHERE id = {str(id)} - ''' - - # logging the query for debugging purposes. - # The output will appear in the Docker logs output - # This line has nothing to do with actually executing the query... - # It is only for debugging purposes. - current_app.logger.info(f'GET /product/ query={query}') - - # get the database connection, execute the query, and - # fetch the results as a Python Dictionary - cursor = db.get_db().cursor() - cursor.execute(query) - theData = cursor.fetchall() - - # Another example of logging for debugging purposes. - # You can see if the data you're getting back is what you expect. - current_app.logger.info(f'GET /product/ Result of query = {theData}') - - response = make_response(jsonify(theData)) - response.status_code = 200 - return response - -# ------------------------------------------------------------ -# Get the top 5 most expensive products from the database -@products.route('/mostExpensive') -def get_most_pop_products(): - - query = ''' - SELECT product_code, - product_name, - list_price, - reorder_level - FROM products - ORDER BY list_price DESC - LIMIT 5 - ''' - - # Same process as handler above - cursor = db.get_db().cursor() - cursor.execute(query) - theData = cursor.fetchall() - - response = make_response(jsonify(theData)) - response.status_code = 200 - return response - -# ------------------------------------------------------------ -# Route to get the 10 most expensive items from the -# database. -@products.route('/tenMostExpensive', methods=['GET']) -def get_10_most_expensive_products(): - - query = ''' - SELECT product_code, - product_name, - list_price, - reorder_level - FROM products - ORDER BY list_price DESC - LIMIT 10 - ''' - - # Same process as above - cursor = db.get_db().cursor() - cursor.execute(query) - theData = cursor.fetchall() - - response = make_response(jsonify(theData)) - response.status_code = 200 - return response - - -# ------------------------------------------------------------ -# This is a POST route to add a new product. -# Remember, we are using POST routes to create new entries -# in the database. 
-@products.route('/product', methods=['POST']) -def add_new_product(): - - # In a POST request, there is a - # collecting data from the request object - the_data = request.json - current_app.logger.info(the_data) - - #extracting the variable - name = the_data['product_name'] - description = the_data['product_description'] - price = the_data['product_price'] - category = the_data['product_category'] - - query = f''' - INSERT INTO products (product_name, - description, - category, - list_price) - VALUES ('{name}', '{description}', '{category}', {str(price)}) - ''' - # TODO: Make sure the version of the query above works properly - # Constructing the query - # query = 'insert into products (product_name, description, category, list_price) values ("' - # query += name + '", "' - # query += description + '", "' - # query += category + '", ' - # query += str(price) + ')' - current_app.logger.info(query) - - # executing and committing the insert statement - cursor = db.get_db().cursor() - cursor.execute(query) - db.get_db().commit() - - response = make_response("Successfully added product") - response.status_code = 200 - return response - -# ------------------------------------------------------------ -### Get all product categories -@products.route('/categories', methods = ['GET']) -def get_all_categories(): - query = ''' - SELECT DISTINCT category AS label, category as value - FROM products - WHERE category IS NOT NULL - ORDER BY category - ''' - - cursor = db.get_db().cursor() - cursor.execute(query) - theData = cursor.fetchall() - - response = make_response(jsonify(theData)) - response.status_code = 200 - return response - -# ------------------------------------------------------------ -# This is a stubbed route to update a product in the catalog -# The SQL query would be an UPDATE. -@products.route('/product', methods = ['PUT']) -def update_product(): - product_info = request.json - current_app.logger.info(product_info) - - return "Success" \ No newline at end of file diff --git a/api/backend/rest_entry.py b/api/backend/rest_entry.py index d8d78502d9..ec24b7a445 100644 --- a/api/backend/rest_entry.py +++ b/api/backend/rest_entry.py @@ -1,9 +1,15 @@ from flask import Flask from backend.db_connection import db -from backend.customers.customer_routes import customers -from backend.products.products_routes import products -from backend.simple.simple_routes import simple_routes +from backend.users.users_routes import users +from backend.coopPositions.coopPositions_routes import coopPositions +from backend.companyProfiles.companyProfiles_routes import companyProfiles +from backend.workedatpos.workedatpos_routes import workedatpos +from backend.viewsPos.viewsPos_routes import views_position +from backend.applications.applications_routes import applications +from backend.advisoradvisee.advisoradvisee_routes import advisoradvisee +from backend.demographics.demographics_routes import demographics + import os from dotenv import load_dotenv @@ -20,28 +26,30 @@ def create_app(): # secret key that will be used for securely signing the session # cookie and can be used for any other security related needs by # extensions or your application - # app.config['SECRET_KEY'] = 'someCrazyS3cR3T!Key.!' app.config['SECRET_KEY'] = os.getenv('SECRET_KEY') - # # these are for the DB object to be able to connect to MySQL. - # app.config['MYSQL_DATABASE_USER'] = 'root' + # these are for the DB object to be able to connect to MySQL. 
app.config['MYSQL_DATABASE_USER'] = os.getenv('DB_USER').strip() - app.config['MYSQL_DATABASE_PASSWORD'] = os.getenv('MYSQL_ROOT_PASSWORD').strip() + app.config['MYSQL_DATABASE_PASSWORD'] = os.getenv('MYSQL_ROOT_PASSWORD') app.config['MYSQL_DATABASE_HOST'] = os.getenv('DB_HOST').strip() app.config['MYSQL_DATABASE_PORT'] = int(os.getenv('DB_PORT').strip()) - app.config['MYSQL_DATABASE_DB'] = os.getenv('DB_NAME').strip() # Change this to your DB name + app.config['MYSQL_DATABASE_DB'] = os.getenv('DB_NAME').strip() # Initialize the database object with the settings above. app.logger.info('current_app(): starting the database connection') db.init_app(app) - # Register the routes from each Blueprint with the app object # and give a url prefix to each app.logger.info('current_app(): registering blueprints with Flask app object.') - app.register_blueprint(simple_routes) - app.register_blueprint(customers, url_prefix='/c') - app.register_blueprint(products, url_prefix='/p') + app.register_blueprint(users) + app.register_blueprint(coopPositions) + app.register_blueprint(companyProfiles) + app.register_blueprint(workedatpos) + app.register_blueprint(views_position) + app.register_blueprint(applications) + app.register_blueprint(advisoradvisee) + app.register_blueprint(demographics) # Don't forget to return the app object return app diff --git a/api/backend/simple/playlist.py b/api/backend/simple/playlist.py deleted file mode 100644 index a9e7a9ef03..0000000000 --- a/api/backend/simple/playlist.py +++ /dev/null @@ -1,129 +0,0 @@ -# ------------------------------------------------------------ -# Sample data for testing generated by ChatGPT -# ------------------------------------------------------------ - -sample_playlist_data = { - "playlist": { - "id": "37i9dQZF1DXcBWIGoYBM5M", - "name": "Chill Hits", - "description": "Relax and unwind with the latest chill hits.", - "owner": { - "id": "spotify_user_123", - "display_name": "Spotify User" - }, - "tracks": { - "items": [ - { - "track": { - "id": "3n3Ppam7vgaVa1iaRUc9Lp", - "name": "Lose Yourself", - "artists": [ - { - "id": "1dfeR4HaWDbWqFHLkxsg1d", - "name": "Eminem" - } - ], - "album": { - "id": "1ATL5GLyefJaxhQzSPVrLX", - "name": "8 Mile" - }, - "duration_ms": 326000, - "track_number": 1, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/lose-yourself.mp3", - "uri": "spotify:track:3n3Ppam7vgaVa1iaRUc9Lp" - } - }, - { - "track": { - "id": "7ouMYWpwJ422jRcDASZB7P", - "name": "Blinding Lights", - "artists": [ - { - "id": "0fW8E0XdT6aG9aFh6jGpYo", - "name": "The Weeknd" - } - ], - "album": { - "id": "1ATL5GLyefJaxhQzSPVrLX", - "name": "After Hours" - }, - "duration_ms": 200040, - "track_number": 9, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/blinding-lights.mp3", - "uri": "spotify:track:7ouMYWpwJ422jRcDASZB7P" - } - }, - { - "track": { - "id": "4uLU6hMCjMI75M1A2tKUQC", - "name": "Shape of You", - "artists": [ - { - "id": "6eUKZXaKkcviH0Ku9w2n3V", - "name": "Ed Sheeran" - } - ], - "album": { - "id": "3fMbdgg4jU18AjLCKBhRSm", - "name": "Divide" - }, - "duration_ms": 233713, - "track_number": 4, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/shape-of-you.mp3", - "uri": "spotify:track:4uLU6hMCjMI75M1A2tKUQC" - } - }, - { - "track": { - "id": "0VjIjW4GlUZAMYd2vXMi3b", - "name": "Levitating", - "artists": [ - { - "id": "4tZwfgrHOc3mvqYlEYSvVi", - "name": "Dua Lipa" - } - ], - "album": { - "id": "7dGJo4pcD2V6oG8kP0tJRR", - "name": "Future Nostalgia" - }, - "duration_ms": 203693, - "track_number": 
5, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/levitating.mp3", - "uri": "spotify:track:0VjIjW4GlUZAMYd2vXMi3b" - } - }, - { - "track": { - "id": "6habFhsOp2NvshLv26DqMb", - "name": "Sunflower", - "artists": [ - { - "id": "1dfeR4HaWDbWqFHLkxsg1d", - "name": "Post Malone" - }, - { - "id": "0C8ZW7ezQVs4URX5aX7Kqx", - "name": "Swae Lee" - } - ], - "album": { - "id": "6k3hyp4efgfHP5GMVd3Agw", - "name": "Spider-Man: Into the Spider-Verse (Soundtrack)" - }, - "duration_ms": 158000, - "track_number": 3, - "disc_number": 1, - "preview_url": "https://p.scdn.co/mp3-preview/sunflower.mp3", - "uri": "spotify:track:6habFhsOp2NvshLv26DqMb" - } - } - ] - }, - "uri": "spotify:playlist:37i9dQZF1DXcBWIGoYBM5M" - } -} \ No newline at end of file diff --git a/api/backend/simple/simple_routes.py b/api/backend/simple/simple_routes.py deleted file mode 100644 index 8685fbac76..0000000000 --- a/api/backend/simple/simple_routes.py +++ /dev/null @@ -1,48 +0,0 @@ -from flask import Blueprint, request, jsonify, make_response, current_app, redirect, url_for -import json -from backend.db_connection import db -from backend.simple.playlist import sample_playlist_data - -# This blueprint handles some basic routes that you can use for testing -simple_routes = Blueprint('simple_routes', __name__) - - -# ------------------------------------------------------------ -# / is the most basic route -# Once the api container is started, in a browser, go to -# localhost:4000/playlist -@simple_routes.route('/') -def welcome(): - current_app.logger.info('GET / handler') - welcome_message = '

Welcome to the CS 3200 Project Template REST API' - response = make_response(welcome_message) - response.status_code = 200 - return response - -# ------------------------------------------------------------ -# /playlist returns the sample playlist data contained in playlist.py -# (imported above) -@simple_routes.route('/playlist') -def get_playlist_data(): - current_app.logger.info('GET /playlist handler') - response = make_response(jsonify(sample_playlist_data)) - response.status_code = 200 - return response - -# ------------------------------------------------------------ -@simple_routes.route('/niceMesage', methods = ['GET']) -def affirmation(): - message = ''' -

Think about it...

-
- You only need to be 1% better today than you were yesterday! - ''' - response = make_response(message) - response.status_code = 200 - return response - -# ------------------------------------------------------------ -# Demonstrates how to redirect from one route to another. -@simple_routes.route('/message') -def mesage(): - return redirect(url_for(affirmation)) \ No newline at end of file diff --git a/api/backend/users/users_routes.py b/api/backend/users/users_routes.py new file mode 100644 index 0000000000..02115e750a --- /dev/null +++ b/api/backend/users/users_routes.py @@ -0,0 +1,721 @@ +from flask import Blueprint +from flask import request +from flask import jsonify +from flask import make_response +from flask import current_app +from backend.db_connection import db +import logging + +logger = logging.getLogger(__name__) + +users = Blueprint('users', __name__) + +# Get student profiles with demographics +@users.route('/users/', methods=['GET']) +def get_user(userID): + query = ''' + SELECT u.*, d.gender, d.race, d.nationality, d.sexuality, d.disability + FROM users u + LEFT JOIN demographics d ON u.userId = d.demographicId + WHERE u.userId = %s + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (userID,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Get student skills with proficiency levels +@users.route('/users//skills', methods=['GET']) +def get_user_skills(userID): + current_app.logger.info(f'GET /users/{userID}/skills route') + + query = ''' + SELECT s.skillId, s.name, s.category, sd.proficiencyLevel + FROM skills s + JOIN skillDetails sd ON s.skillId = sd.skillId + WHERE sd.studentId = %s + ORDER BY s.category, s.name + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (userID,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Get recent applications for dashboard (limit 5) +@users.route('/users//recent-applications', methods=['GET']) +def get_user_recent_applications(userID): + current_app.logger.info(f'GET /users/{userID}/recent-applications route') + + query = ''' + SELECT a.applicationId, + a.status, + a.dateTimeApplied, + a.gpa, + cp.title AS positionTitle, + cp.location, + cp.hourlyPay, + cp.deadline, + com.name AS companyName + FROM applications a + JOIN appliesToApp ata ON a.applicationId = ata.applicationId + JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId + JOIN createsPos crp ON cp.coopPositionId = crp.coopPositionId + JOIN users emp ON crp.employerId = emp.userId + JOIN companyProfiles com ON emp.companyProfileId = com.companyProfileId + WHERE ata.studentId = %s + ORDER BY a.dateTimeApplied DESC + LIMIT 5 + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (userID,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Get all available skills +@users.route('/skills', methods=['GET']) +def get_all_skills(): + query = ''' + SELECT skillId, name, category + FROM skills + ORDER BY category, name + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Update user skills (modify proficiency levels and remove skills) +@users.route('/users//skills', methods=['PUT']) +def update_user_skills(userID): + 
try: + data = request.get_json() + updated_skills = data.get('updated_skills', []) + removed_skills = data.get('removed_skills', []) + + cursor = db.get_db().cursor() + + # Update existing skills proficiency levels + for skill in updated_skills: + update_query = ''' + UPDATE skillDetails + SET proficiencyLevel = %s + WHERE studentId = %s AND skillId = %s + ''' + cursor.execute(update_query, (skill['proficiencyLevel'], userID, skill['skillId'])) + + # Remove skills marked for deletion + if removed_skills: + placeholders = ','.join(['%s'] * len(removed_skills)) + delete_query = f''' + DELETE FROM skillDetails + WHERE studentId = %s AND skillId IN ({placeholders}) + ''' + cursor.execute(delete_query, [userID] + removed_skills) + + db.get_db().commit() + + the_response = make_response(jsonify({"message": "Skills updated successfully"})) + the_response.status_code = 200 + return the_response + + except Exception as e: + logger.error(f"Error updating user skills: {e}") + the_response = make_response(jsonify({"error": "Failed to update skills"})) + the_response.status_code = 500 + return the_response + +# Add new skills to user profile +@users.route('/users//skills', methods=['POST']) +def add_user_skills(userID): + try: + data = request.get_json() + new_skills = data.get('skills', []) + + if not new_skills: + the_response = make_response(jsonify({"error": "No skills provided"})) + the_response.status_code = 400 + return the_response + + cursor = db.get_db().cursor() + + # Add new skills to skillDetails table + for skill in new_skills: + insert_query = ''' + INSERT INTO skillDetails (skillId, studentId, proficiencyLevel) + VALUES (%s, %s, %s) + ''' + cursor.execute(insert_query, (skill['skillId'], userID, skill['proficiencyLevel'])) + + db.get_db().commit() + + the_response = make_response(jsonify({"message": f"Added {len(new_skills)} skills successfully"})) + the_response.status_code = 200 + return the_response + + except Exception as e: + logger.error(f"Error adding user skills: {e}") + the_response = make_response(jsonify({"error": "Failed to add skills"})) + the_response.status_code = 500 + return the_response + +# Get advisor's assigned students +@users.route('/advisors//students', methods=['GET']) +def get_advisor_students(advisorID): + query = ''' + SELECT u.userId, u.firstName, u.lastName, u.email, u.phone, + u.major, u.minor, u.college, u.gradYear, u.grade, + d.gender, d.race, d.nationality, d.sexuality, d.disability, + aa.flag as flagged + FROM users u + LEFT JOIN demographics d ON u.userId = d.demographicId + JOIN advisor_advisee aa ON u.userId = aa.studentId + WHERE aa.advisorId = %s + ORDER BY u.lastName, u.firstName + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (advisorID,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Update student flag status for advisor +@users.route('/advisors//students//flag', methods=['PUT']) +def update_student_flag(advisorID, studentID): + try: + data = request.get_json() + flagged = data.get('flagged', False) + + query = ''' + UPDATE advisor_advisee + SET flag = %s + WHERE advisorId = %s AND studentId = %s + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (flagged, advisorID, studentID)) + db.get_db().commit() + + the_response = make_response(jsonify({"message": "Student flag updated successfully", "flagged": flagged})) + the_response.status_code = 200 + return the_response + + except Exception as e: + logger.error(f"Error 
updating student flag: {e}") + the_response = make_response(jsonify({"error": "Failed to update student flag"})) + the_response.status_code = 500 + return the_response + +# Get placement analytics data for advisor +@users.route('/advisors//analytics/placement-data', methods=['GET']) +def get_advisor_placement_analytics(advisorID): + try: + query = ''' + SELECT + u.firstName, + u.lastName, + u.gradYear, + u.major, + u.college, + a.gpa, + a.status, + cp.title as positionTitle, + cp.hourlyPay as salary, + comp.name as companyName, + cp.industry + FROM users u + JOIN advisor_advisee aa ON u.userId = aa.studentId + JOIN appliesToApp ata ON u.userId = ata.studentId + JOIN applications a ON ata.applicationId = a.applicationId + JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId + LEFT JOIN createsPos crp ON cp.coopPositionId = crp.coopPositionId + LEFT JOIN users emp ON crp.employerId = emp.userId + LEFT JOIN companyProfiles comp ON emp.companyProfileId = comp.companyProfileId + WHERE aa.advisorId = %s + AND a.status IN ('Accepted', 'Rejected') + AND cp.hourlyPay IS NOT NULL + AND a.gpa IS NOT NULL + + UNION ALL + + SELECT + u.firstName, + u.lastName, + u.gradYear, + u.major, + u.college, + avg_gpa.gpa as gpa, + 'Completed' as status, + cp.title as positionTitle, + cp.hourlyPay as salary, + comp.name as companyName, + cp.industry + FROM users u + JOIN advisor_advisee aa ON u.userId = aa.studentId + JOIN workedAtPos wap ON u.userId = wap.studentId + JOIN coopPositions cp ON wap.coopPositionId = cp.coopPositionId + LEFT JOIN createsPos crp ON cp.coopPositionId = crp.coopPositionId + LEFT JOIN users emp ON crp.employerId = emp.userId + LEFT JOIN companyProfiles comp ON emp.companyProfileId = comp.companyProfileId + LEFT JOIN ( + SELECT ata.studentId, AVG(a.gpa) as gpa + FROM appliesToApp ata + JOIN applications a ON ata.applicationId = a.applicationId + WHERE a.gpa IS NOT NULL + GROUP BY ata.studentId + ) avg_gpa ON u.userId = avg_gpa.studentId + WHERE aa.advisorId = %s + AND cp.hourlyPay IS NOT NULL + + ORDER BY lastName, firstName + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (advisorID, advisorID)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + + except Exception as e: + logger.error(f"Error fetching placement analytics: {e}") + the_response = make_response(jsonify({"error": "Failed to fetch placement analytics"})) + the_response.status_code = 500 + return the_response + + + +# Update advisor profile (separate from student profile updates) +@users.route('/advisors//profile', methods=['PUT']) +def update_advisor_profile(advisorID): + try: + current_app.logger.info(f'PUT /advisors/{advisorID}/profile route') + advisor_info = request.json + + first_name = advisor_info.get('firstName') + last_name = advisor_info.get('lastName') + email = advisor_info.get('email') + phone = advisor_info.get('phone') + gender = advisor_info.get('gender') + race = advisor_info.get('race') + nationality = advisor_info.get('nationality') + sexuality = advisor_info.get('sexuality') + disability = advisor_info.get('disability') + + # Update users table (basic info) + user_query = ''' + UPDATE users + SET firstName = %s, + lastName = %s, + email = %s, + phone = %s + WHERE userId = %s + ''' + + # Update demographics table + demo_query = ''' + UPDATE demographics + SET gender = %s, + race = %s, + nationality = %s, + sexuality = %s, + disability = %s + WHERE demographicId = %s + ''' + + cursor = 
db.get_db().cursor() + + # Execute user update + cursor.execute(user_query, (first_name, last_name, email, phone, advisorID)) + + # Execute demographics update + cursor.execute(demo_query, (gender, race, nationality, sexuality, disability, advisorID)) + + db.get_db().commit() + + the_response = make_response(jsonify({"message": "Advisor profile updated successfully"})) + the_response.status_code = 200 + return the_response + + except Exception as e: + logger.error(f"Error updating advisor profile: {e}") + the_response = make_response(jsonify({"error": "Failed to update advisor profile"})) + the_response.status_code = 500 + return the_response + +# Update student profiles to include additional info +@users.route('/users', methods=['PUT']) +def update_users(): + current_app.logger.info('PUT /users route') + user_info = request.json + user_id = user_info['userId'] + first_name = user_info['firstName'] + last_name = user_info['lastName'] + email = user_info['email'] + phone = user_info['phone'] + major = user_info['major'] + minor = user_info['minor'] + college = user_info['college'] + grad_year = user_info['gradYear'] + grade = user_info['grade'] + gender = user_info['gender'] + race = user_info['race'] + nationality = user_info['nationality'] + sexuality = user_info['sexuality'] + disability = user_info['disability'] + + query = ''' + UPDATE users u + JOIN demographics d ON u.userId = d.demographicId + SET u.firstName = %s, + u.lastName = %s, + u.email = %s, + u.phone = %s, + u.major = %s, + u.minor = %s, + u.college = %s, + u.gradYear = %s, + u.grade = %s, + d.gender = %s, + d.race = %s, + d.nationality = %s, + d.sexuality = %s, + d.disability = %s + WHERE u.userId = %s;''' + data = (first_name, last_name, email, phone, major, minor, college, grad_year, grade, gender, race, nationality, sexuality, disability, user_id) + cursor = db.get_db().cursor() + cursor.execute(query, data) + db.get_db().commit() + return 'user updated!' 
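# ------------------------------------------------------------
# A minimal client-side sketch of how a Streamlit page might call the
# PUT /users route above. It assumes the API is reachable at
# http://web-api:4000 (the base URL the app pages use) and that the JSON
# keys mirror the columns this route updates; the sample values echo the
# Charlie Stout fallback profile used on the Student Home page.
import requests

profile_update = {
    "userId": 1,
    "firstName": "Charlie",
    "lastName": "Stout",
    "email": "c.stout@student.edu",
    "phone": "555-0101",
    "major": "Computer Science",
    "minor": "Mathematics",
    "college": "Khoury College of Computer Sciences",
    "gradYear": "2026",
    "grade": "Junior",
    "gender": "Prefer not to say",
    "race": "Prefer not to say",
    "nationality": "American",
    "sexuality": "Prefer not to say",
    "disability": None,
}

# The route runs a joined UPDATE across users and demographics, commits,
# and returns the plain string 'user updated!' on success.
response = requests.put("http://web-api:4000/users", json=profile_update)
print(response.status_code, response.text)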
+ + +# Employer views student profile +@users.route('/applications/appliesToApp//users', methods=['GET']) +def employer_view_student(): + current_app.logger.info('GET /applications/appliesToApp//users') + user_info = request.json + application_info = request.json + user_id = user_info['userId'] + first_name = user_info['firstName'] + last_name = user_info['lastName'] + email = user_info['email'] + major = user_info['major'] + minor = user_info['minor'] + college = user_info['college'] + grade = user_info['grade'] + grad_year = user_info['gradYear'] + gpa = application_info['gpa'] + resume = application_info['resume'] + cover_letter = application_info['coverLetter'] + + + query = ''' + SELECT u.userId, u.firstName, u.lastName, u.email, u.major, + u.minor, u.college, u.grade, u.gradYear, a.gpa, + a.resume, a.coverLetter + FROM users u JOIN applications a;''' + data = (first_name, last_name, email, major, minor, college, grad_year, grade, user_id, + gpa, resume, cover_letter) + cursor = db.get_db().cursor() + r = cursor.execute(query, data) + db.get_db().commit() + + + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Employer filters student profiles +@users.route('/applications/appliesToApp//users', methods=['GET']) +def employee_filter_student(): + current_app.logger.info('GET /applications/appliesToApp//users') + user_info = request.json + skill_info = request.json + user_id = user_info['userId'] + first_name = user_info['firstName'] + last_name = user_info['lastName'] + name = skill_info['name'] + grad_year = user_info['gradYear'] + major = user_info['major'] + + query = ''' + SELECT DISTINCT u.userId, u.firstName, u.lastName, u.gradYear, u.major + FROM users u + JOIN skillDetails sd ON u.userId = sd.studentId + JOIN skills s ON sd.skillId = s.skillId + WHERE (s.name = %s OR s.name = %s OR s.name = %s) + AND u.gradYear = %s + AND u.major = %s; + ''' + data = (skill1, skill2, skill3, grad_year, major) + + cursor = db.get_db().cursor(dictionary=True) + cursor.execute(query, data) + theData = cursor.fetchall() + + return make_response(jsonify(theData), 200) + +# Admin creates a user (student/employer/advisor) +@users.route('/users', methods=['POST']) +def create_user(): + current_app.logger.info('POST /users route') + + b = request.json + user_id = b['userId'] + first_name = b['firstName'] + last_name = b['lastName'] + email = b['email'] + phone = b.get('phone') + major = b.get('major') + minor = b.get('minor') + college = b.get('college') + grad_year = b.get('gradYear') + grade = b.get('grade') + company_profile_id = b.get('companyProfileId') + industry = b.get('industry') + demographic_id = b.get('demographicId') + + query = ''' + INSERT INTO users + (userId, firstName, lastName, demographicId, email, phone, + major, minor, college, gradYear, grade, companyProfileId, industry) + VALUES + (%s, %s, %s, %s, %s, %s, + %s, %s, %s, %s, %s, %s, %s); + ''' + data = (user_id, first_name, last_name, demographic_id, email, phone,major, minor, college, grad_year, grade, company_profile_id, industry) + + cur = db.get_db().cursor() + cur.execute(query, data) + db.get_db().commit() + return 'user created!', 201 + +# Admin deletes a user +@users.route('/users', methods=['DELETE']) +def delete_user(): + current_app.logger.info('DELETE /users route') + + user_id = request.args.get('userId', type=int) + + query = ''' + DELETE FROM users + WHERE userId = %s; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, 
(user_id,)) + db.get_db().commit() + + the_response = make_response(jsonify({'message': 'user deleted!'})) + + the_response.status_code = 200 + return the_response + +# Employer creates a company profile +@users.route('/users/companyProfiles/create', methods=['POST']) +def createCompanyProfile(): + + the_data = request.json + current_app.logger.info(the_data) + + name = the_data['company_name'] + bio = the_data['company_bio'] + industry = the_data['company_industry'] + websiteLink = the_data['website_link'] + + query = f''' + INSERT INTO companyProfiles (name, bio, industry, websiteLink) + VALUE( '{name}', '{bio}', '{industry}', '{websiteLink}') + ''' + + current_app.logger.info(query) + cursor = db.get_db().cursor() + cursor.execute(query) + db.get_db().commit() + + response = make_response("Created company profile") + response.status_code = 200 + return response + +# Employer updates/edits company information +@users.route('/users/companyProfiles/create/', methods=['PUT']) +def updateCompanyProfile(companyProfileId): + current_app.logger.info('PUT /users/companyProfiles/create/ route') + + company_info = request.json + companyId = company_info['id'] + companyName = company_info['name'] + companyBio = company_info['bio'] + companyIndustry = company_info['industry'] + companyWebsite = company_info['website_link'] + + query = ''' + UPDATE companyProfiles + SET name = %s, + bio = %s, + industry = %s, + websiteLink = %s + WHERE companyProfileId = %s + ''' + data = (companyName, companyBio, companyIndustry, companyWebsite, companyId) + cursor = db.get_db().cursor() + r = cursor.execute(query, data) + db.get_db().commit() + return 'Updated company profile!' + +# Get count of students +@users.route('/users/count/students', methods=['GET']) +def count_students(): + current_app.logger.info('GET /users/count/students route') + + try: + query = ''' + SELECT COUNT(*) as student_count + FROM users + WHERE major IS NOT NULL AND major != '' AND companyProfileId IS NULL; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + result = cursor.fetchone() + + the_response = make_response(jsonify({"student_count": result['student_count']})) + the_response.status_code = 200 + return the_response + + except Exception as e: + current_app.logger.error(f"Error counting students: {e}") + logger.error(f"Error counting students: {e}") + the_response = make_response(jsonify({"error": f"Failed to count students: {str(e)}"})) + the_response.status_code = 500 + return the_response + +# Get count of advisors +@users.route('/users/count/advisors', methods=['GET']) +def count_advisors(): + current_app.logger.info('GET /users/count/advisors route') + + try: + query = ''' + SELECT COUNT(DISTINCT aa.advisorId) as advisor_count + FROM advisor_advisee aa; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + result = cursor.fetchone() + + the_response = make_response(jsonify({"advisor_count": result['advisor_count']})) + the_response.status_code = 200 + return the_response + + except Exception as e: + current_app.logger.error(f"Error counting advisors: {e}") + logger.error(f"Error counting advisors: {e}") + the_response = make_response(jsonify({"error": f"Failed to count advisors: {str(e)}"})) + the_response.status_code = 500 + return the_response + +# Get count of employers +@users.route('/users/count/employers', methods=['GET']) +def count_employers(): + current_app.logger.info('GET /users/count/employers route') + + try: + query = ''' + SELECT COUNT(*) as employer_count + FROM users + WHERE 
companyProfileId IS NOT NULL; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + result = cursor.fetchone() + + the_response = make_response(jsonify({"employer_count": result['employer_count']})) + the_response.status_code = 200 + return the_response + + except Exception as e: + current_app.logger.error(f"Error counting employers: {e}") + logger.error(f"Error counting employers: {e}") + the_response = make_response(jsonify({"error": f"Failed to count employers: {str(e)}"})) + the_response.status_code = 500 + return the_response + +# Get all user counts in one request +@users.route('/users/count/all', methods=['GET']) +def count_all_users(): + current_app.logger.info('GET /users/count/all route') + + try: + # Count students + student_query = ''' + SELECT COUNT(*) as student_count + FROM users + WHERE major IS NOT NULL AND major != '' AND companyProfileId IS NULL + ''' + + # Count advisors + advisor_query = ''' + SELECT COUNT(DISTINCT aa.advisorId) as advisor_count + FROM advisor_advisee aa + ''' + + # Count employers + employer_query = ''' + SELECT COUNT(*) as employer_count + FROM users + WHERE companyProfileId IS NOT NULL + ''' + + cursor = db.get_db().cursor() + + # Execute all queries + cursor.execute(student_query) + student_count = cursor.fetchone()['student_count'] + + cursor.execute(advisor_query) + advisor_count = cursor.fetchone()['advisor_count'] + + cursor.execute(employer_query) + employer_count = cursor.fetchone()['employer_count'] + + counts = { + "student_count": student_count, + "advisor_count": advisor_count, + "employer_count": employer_count, + "total_users": student_count + advisor_count + employer_count + } + + the_response = make_response(jsonify(counts)) + the_response.status_code = 200 + return the_response + + except Exception as e: + current_app.logger.error(f"Error counting all users: {e}") + logger.error(f"Error counting all users: {e}") + the_response = make_response(jsonify({"error": f"Failed to count users: {str(e)}"})) + the_response.status_code = 500 + return the_response \ No newline at end of file diff --git a/api/backend/viewsPos/viewsPos_routes.py b/api/backend/viewsPos/viewsPos_routes.py new file mode 100644 index 0000000000..7d49882ce0 --- /dev/null +++ b/api/backend/viewsPos/viewsPos_routes.py @@ -0,0 +1,146 @@ +from flask import Blueprint +from flask import request +from flask import jsonify +from flask import make_response +from flask import current_app +from backend.db_connection import db + +# New Blueprint for applications +views_position = Blueprint('views_position', __name__) + +# student flags positions they like/do not like +@views_position.route('/position', methods=['POST']) +def set_job_preference(): + the_data = request.json + current_app.logger.info(the_data) + + student_id = the_data['studentId'] + coop_position_id = the_data['coopPositionId'] + preference = the_data['preference'] + + query = f''' + INSERT INTO viewsPos (studentId, coopPositionId, preference) + VALUES (%s, %s, %s) + ON DUPLICATE KEY UPDATE preference = VALUES(preference) + ''' + current_app.logger.info(query) + + # Execute query and commit + cursor = db.get_db().cursor() + cursor.execute(query, (student_id, coop_position_id, int(preference))) + db.get_db().commit() + + # Return success response + response = make_response("Preference saved successfully") + response.status_code = 200 + return response + +# student deletes preference for a posting +@views_position.route('/position', methods=['DELETE']) +def remove_job_preference(): + the_data = request.json 
+ current_app.logger.info(f"DELETE preference for: {the_data}") + + student_id = the_data['studentId'] + coop_position_id = the_data['coopPositionId'] + + query = ''' + DELETE FROM viewsPos + WHERE studentId = %s AND coopPositionId = %s + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (student_id, coop_position_id)) + db.get_db().commit() + + return make_response(jsonify({"message": "Preference removed"}), 200) + + +# Student views deadlines for positions +@views_position.route('//deadlines', methods=['GET']) +def get_deadlines(studentID): + current_app.logger.info(f'GET /{studentID}/deadlines route') + + query = ''' + SELECT cp.title, + cp.deadline + FROM viewsPos vp + JOIN coopPositions cp ON vp.coopPositionId = cp.coopPositionId + WHERE vp.studentId = %s AND vp.preference = TRUE; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (studentID,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response + +# Student views positions based on preference +@views_position.route('/viewpos/', methods=['GET']) +def get_positions_by_preference(studentID): + current_app.logger.info(f'GET /viewpos/{studentID} route') + + pref_param = request.args.get('preference') + + # Validate preference query param and build SQL condition + preference_clause = '' + if pref_param is not None: + if pref_param.lower() in ['true', '1']: + preference_clause = 'AND vp.preference = TRUE' + elif pref_param.lower() in ['false', '0']: + preference_clause = 'AND vp.preference = FALSE' + else: + return jsonify({"error": "Invalid preference value"}), 400 + + query = f''' + SELECT cp.* + FROM viewsPos vp + JOIN coopPositions cp ON cp.coopPositionId = vp.coopPositionId + WHERE vp.studentId = %s + {preference_clause} + ''' + + try: + cursor = db.get_db().cursor() + cursor.execute(query, (studentID,)) + data = cursor.fetchall() + return jsonify(data), 200 + except Exception as e: + current_app.logger.error(f"Error fetching positions by preference: {e}") + return jsonify({"error": "Server error"}), 500 + + +# Admin views preference metrics +@views_position.route('/viewspos/', methods=['GET']) +def get_preference_metrics(preference): + current_app.logger.info('GET /viewspos/%s route', preference) + + query = ''' + SELECT + cp.coopPositionId, + cp.title, + com.name AS companyName, + COUNT(vp.studentId) AS prefCount + FROM coopPositions cp + LEFT JOIN createsPos cr + ON cr.coopPositionId = cp.coopPositionId + LEFT JOIN users u + ON u.userId = cr.employerId + LEFT JOIN companyProfiles com + ON com.companyProfileId = u.companyProfileId + LEFT JOIN viewsPos vp + ON vp.coopPositionId = cp.coopPositionId + AND vp.preference = %s + GROUP BY cp.coopPositionId, cp.title, com.name + ORDER BY prefCount DESC, cp.title ASC; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query, (preference,)) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + return the_response \ No newline at end of file diff --git a/api/backend/workedatpos/workedatpos_routes.py b/api/backend/workedatpos/workedatpos_routes.py new file mode 100644 index 0000000000..1621809c4f --- /dev/null +++ b/api/backend/workedatpos/workedatpos_routes.py @@ -0,0 +1,113 @@ +from flask import Blueprint +from flask import request +from flask import jsonify +from flask import make_response +from flask import current_app +from backend.db_connection import db + +workedatpos = Blueprint('workedatpos', __name__) + +# 
Advisor views historical placement data for all students (filter by major/industry in frontend) +@workedatpos.route('/workedatpos/placement-data', methods=['GET']) +def get_scatter_plot_data(): + current_app.logger.info('GET /workedatpos/placement-data route') + + query = ''' + SELECT u.major, + cp.industry, + a.gpa, + cp.hourlyPay, + wp.studentId AS wasHired, + u.firstName, + u.lastName, + cp.title AS positionTitle, + comp.name AS companyName, + u.college, + u.gradYear, + cp.location + FROM users u + JOIN appliesToApp ata ON u.userId = ata.studentId + JOIN applications a ON ata.applicationId = a.applicationId + JOIN coopPositions cp ON a.coopPositionId = cp.coopPositionId + JOIN companyProfiles comp ON cp.industry = comp.industry + LEFT JOIN workedAtPos wp ON u.userId = wp.studentId AND wp.coopPositionId = cp.coopPositionId + WHERE a.gpa IS NOT NULL + AND cp.hourlyPay IS NOT NULL + ORDER BY u.major, cp.industry, a.gpa DESC + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + the_response.headers.add('Access-Control-Allow-Origin', '*') + the_response.headers.add('Access-Control-Allow-Headers', 'Content-Type') + return the_response + +# Advisor and student views company rating data rated by past co-ops +@workedatpos.route('/workedatpos/company-ratings', methods=['GET']) +def get_company_ratings(): + current_app.logger.info('GET /workedatpos/company-ratings route') + + # Query to get company ratings by individual company + query = ''' + SELECT + comp.companyProfileId, + comp.name AS companyName, + comp.industry AS companyIndustry, + AVG(wp.companyRating) AS avgRating, + COUNT(wp.companyRating) AS totalRatings, + MIN(wp.companyRating) AS minRating, + MAX(wp.companyRating) AS maxRating, + COUNT(DISTINCT wp.studentId) AS studentsWhoRated + FROM workedAtPos wp + JOIN coopPositions cp ON wp.coopPositionId = cp.coopPositionId + JOIN companyProfiles comp ON cp.industry = comp.industry + WHERE wp.companyRating IS NOT NULL + GROUP BY comp.companyProfileId, comp.name, comp.industry + ORDER BY avgRating DESC; + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + the_response.headers.add('Access-Control-Allow-Origin', '*') + the_response.headers.add('Access-Control-Allow-Headers', 'Content-Type') + return the_response + +# Student views wage data from past co-ops +@workedatpos.route('/workedatpos/wagedata', methods=['GET']) +def get_company_wage_data(): + current_app.logger.info('GET /workedatpos/wagedata route') + + query = ''' + SELECT cp.name AS companyName, + pos.title AS positionTitle, + MIN(pos.hourlyPay) AS minSalary, + MAX(pos.hourlyPay) AS maxSalary, + AVG(pos.hourlyPay) AS avgPay, + COUNT(w.studentId) AS numPreviousCoops + FROM companyProfiles cp JOIN users u ON cp.companyProfileId = u.companyProfileId + JOIN createsPos cr ON u.userId = cr.employerId + JOIN coopPositions pos ON cr.coopPositionId = pos.coopPositionId + LEFT JOIN workedAtPos w ON pos.coopPositionId = w.coopPositionId + GROUP BY cp.name, pos.title + ORDER BY avgPay DESC; + + ''' + + cursor = db.get_db().cursor() + cursor.execute(query) + theData = cursor.fetchall() + + the_response = make_response(jsonify(theData)) + the_response.status_code = 200 + the_response.headers.add('Access-Control-Allow-Origin', '*') + the_response.headers.add('Access-Control-Allow-Headers', 
'Content-Type') + return the_response + diff --git a/app/Dockerfile b/app/Dockerfile index 6eb11bff2e..c4f7981358 100644 --- a/app/Dockerfile +++ b/app/Dockerfile @@ -5,7 +5,6 @@ WORKDIR /appcode RUN apt-get update && apt-get install -y \ build-essential \ curl \ - software-properties-common \ git \ && rm -rf /var/lib/apt/lists/* diff --git a/app/src/.streamlit/config.toml b/app/src/.streamlit/config.toml index bb28be97de..1f0ce5efe6 100644 --- a/app/src/.streamlit/config.toml +++ b/app/src/.streamlit/config.toml @@ -15,6 +15,6 @@ showSidebarNavigation = false [theme] # Setting some basic config options related to the theme of the app base="light" -primaryColor="#6550e6" -font="monospace" +primaryColor="#1e3a8a" +font="sans serif" diff --git a/app/src/Home.py b/app/src/Home.py index ef0f7b19ad..7eb50f9ed2 100644 --- a/app/src/Home.py +++ b/app/src/Home.py @@ -32,46 +32,185 @@ # The major content of this page # *************************************************** -# set the title of the page and provide a simple prompt. +# Custom CSS for styling +st.markdown(""" + +""", unsafe_allow_html=True) + +# Enhanced header section +st.markdown('

<h1>CoopAlytics</h1>', unsafe_allow_html=True)
+st.markdown('<p>Your Gateway to Co-op Data Analytics & Management</p>', unsafe_allow_html=True)
+
+# Welcome section with feature highlights
+st.markdown("""
+<div>
+    <h3>Welcome to CoopAlytics! 🎯</h3>
+    <p>Select your role below to access personalized dashboards and insights</p>
+</div>
+""", unsafe_allow_html=True) + logger.info("Loading the Home page of the app") -st.title('CS 3200 Sample Semester Project App') -st.write('\n\n') -st.write('### HI! As which user would you like to log in?') - -# For each of the user personas for which we are implementing -# functionality, we put a button on the screen that the user -# can click to MIMIC logging in as that mock user. - -if st.button("Act as John, a Political Strategy Advisor", - type = 'primary', - use_container_width=True): - # when user clicks the button, they are now considered authenticated - st.session_state['authenticated'] = True - # we set the role of the current user - st.session_state['role'] = 'pol_strat_advisor' - # we add the first name of the user (so it can be displayed on - # subsequent pages). - st.session_state['first_name'] = 'John' - # finally, we ask streamlit to switch to another page, in this case, the - # landing page for this particular user type - logger.info("Logging in as Political Strategy Advisor Persona") - st.switch_page('pages/00_Pol_Strat_Home.py') - -if st.button('Act as Mohammad, an USAID worker', - type = 'primary', - use_container_width=True): - st.session_state['authenticated'] = True - st.session_state['role'] = 'usaid_worker' - st.session_state['first_name'] = 'Mohammad' - st.switch_page('pages/10_USAID_Worker_Home.py') - -if st.button('Act as System Administrator', - type = 'primary', - use_container_width=True): - st.session_state['authenticated'] = True - st.session_state['role'] = 'administrator' - st.session_state['first_name'] = 'SysAdmin' - st.switch_page('pages/20_Admin_Home.py') + +# Create three columns for better layout +col1, col2, col3 = st.columns([1, 2, 1]) + +with col2: + st.markdown("### Choose Your User Role") + + # Student Persona + st.markdown(""" +
+    <div>
+        <h4>🎓 Student Portal</h4>
+        <p>Access your application status, explore co-op opportunities, track your progress,
+        and get insights on industry trends and placement data.</p>
+    </div>
+ """, unsafe_allow_html=True) + + if st.button("Login as Student", + type='primary', + use_container_width=True, + key="student_login"): + st.session_state['authenticated'] = True + st.session_state['role'] = 'student' + st.session_state['first_name'] = 'Charlie' + st.session_state['user_id'] = 1 + logger.info("Logging in as Student Persona") + st.switch_page('pages/00_Student_Home.py') + + st.markdown("
", unsafe_allow_html=True) + + # Advisor Persona + st.markdown(""" +
+
👨‍🏫 Academic Advisor Portal
+
+ Monitor your advisees' application progress, analyze placement trends, + identify students needing support, and access comprehensive analytics. +
+
+ """, unsafe_allow_html=True) + + if st.button('Login as Academic Advisor', + type='primary', + use_container_width=True, + key="advisor_login"): + st.session_state['authenticated'] = True + st.session_state['role'] = 'advisor' + st.session_state['first_name'] = 'Dr. Sarah' + st.session_state['user_id'] = 31 # Sarah Martinez + logger.info("Logging in as Academic Advisor Persona") + st.switch_page('pages/10_Advisor_Home.py') + + st.markdown("
", unsafe_allow_html=True) + + # Employer Persona + st.markdown(""" +
+
🏢 Employer Portal
+
+ Manage co-op positions, review applications, track hiring metrics, + and access candidate analytics to make informed hiring decisions. +
+
+ """, unsafe_allow_html=True) + + if st.button('Login as Employer', + type='primary', + use_container_width=True, + key="employer_login"): + st.session_state['authenticated'] = True + st.session_state['role'] = 'employer' + st.session_state['first_name'] = 'Jennifer' + logger.info("Logging in as Employer Persona") + st.switch_page('pages/20_Employer_Home.py') + + st.markdown("
", unsafe_allow_html=True) + + # System Administrator + st.markdown(""" +
+
⚙️ System Administrator
+
+ Access system-wide analytics, manage user accounts, monitor platform performance, + and maintain database integrity across all user roles. +
+
+ """, unsafe_allow_html=True) + + if st.button('Login as System Administrator', + type='primary', + use_container_width=True, + key="admin_login"): + st.session_state['authenticated'] = True + st.session_state['role'] = 'administrator' + st.session_state['first_name'] = 'SysAdmin' + logger.info("Logging in as System Administrator Persona") + st.switch_page('pages/30_Admin_Home.py') + +# Footer section +st.markdown("

", unsafe_allow_html=True) +st.markdown(""" +
+ 🚀 Platform Features: Application Tracking • Placement Analytics • Company Ratings • + GPA vs Salary Insights • Industry Trends • Student Progress Monitoring +
+""", unsafe_allow_html=True) diff --git a/app/src/assets/coopalyticslogo.png b/app/src/assets/coopalyticslogo.png new file mode 100644 index 0000000000..68b3f50c6e Binary files /dev/null and b/app/src/assets/coopalyticslogo.png differ diff --git a/app/src/modules/nav.py b/app/src/modules/nav.py index cb31d3bf67..c029435fe7 100644 --- a/app/src/modules/nav.py +++ b/app/src/modules/nav.py @@ -11,48 +11,117 @@ def HomeNav(): def AboutPageNav(): - st.sidebar.page_link("pages/30_About.py", label="About", icon="🧠") + st.sidebar.page_link("pages/90_About.py", label="About", icon="🧠") -#### ------------------------ Examples for Role of pol_strat_advisor ------------------------ -def PolStratAdvHomeNav(): +#### ------------------------ Student (Charlie Stout) Role ------------------------ +def StudentHomeNav(): st.sidebar.page_link( - "pages/00_Pol_Strat_Home.py", label="Political Strategist Home", icon="👤" + "pages/00_Student_Home.py", label="Student Dashboard", icon="🎓" ) -def WorldBankVizNav(): +def StudentApplicationsNav(): st.sidebar.page_link( - "pages/01_World_Bank_Viz.py", label="World Bank Visualization", icon="🏦" + "pages/01_Student_Applications.py", label="My Applications", icon="📝" ) -def MapDemoNav(): - st.sidebar.page_link("pages/02_Map_Demo.py", label="Map Demonstration", icon="🗺️") +def StudentPositionsNav(): + st.sidebar.page_link( + "pages/02_Student_Browse_Positions.py", label="Browse Co-op Positions", icon="🔍" + ) -## ------------------------ Examples for Role of usaid_worker ------------------------ -def ApiTestNav(): - st.sidebar.page_link("pages/12_API_Test.py", label="Test the API", icon="🛜") +def StudentAnalyticsNav(): + st.sidebar.page_link( + "pages/03_Student_Analytics.py", label="Salary & Company Data", icon="📊" + ) + +def StudentCalendarNav(): + st.sidebar.page_link( + "pages/04_Student_Calendar.py", label="Application Calendar", icon="📅" + ) -def PredictionNav(): +#### ------------------------ Advisor (Sarah Martinez) Role ------------------------ +def AdvisorHomeNav(): st.sidebar.page_link( - "pages/11_Prediction.py", label="Regression Prediction", icon="📈" + "pages/10_Advisor_Home.py", label="Advisor Dashboard", icon="👨‍🏫" + ) + +def AdvisorStudentManagementNav(): + st.sidebar.page_link( + "pages/13_Advisor_StudentManagement.py", label="Student Management", icon="👥" ) -def ClassificationNav(): +def AdvisorAnalyticsNav(): st.sidebar.page_link( - "pages/13_Classification.py", label="Classification Demo", icon="🌺" + "pages/11_Advisor_Analytics.py", label="Placement Analytics", icon="📈" ) -#### ------------------------ System Admin Role ------------------------ -def AdminPageNav(): - st.sidebar.page_link("pages/20_Admin_Home.py", label="System Admin", icon="🖥️") +def AdvisorCompaniesNav(): st.sidebar.page_link( - "pages/21_ML_Model_Mgmt.py", label="ML Model Management", icon="🏢" + "pages/12_Advisor_Companies.py", label="Company Partnerships", icon="🏢" + ) + + +#### ------------------------ Employer (Phoebe Hwang) Role ------------------------ +def EmployerHomeNav(): + st.sidebar.page_link( + "pages/20_Employer_Home.py", label="Employer Dashboard", icon="🏢" + ) + + +def EmployerPostingsNav(): + st.sidebar.page_link( + "pages/21_Employer_Postings.py", label="Manage Co-Op Postings", icon="📄" + ) + + +def EmployerApplicationsNav(): + st.sidebar.page_link( + "pages/22_Employer_Applications.py", label="Review Applications", icon="👀" + ) + + +def EmployerCandidatesNav(): + st.sidebar.page_link( + "pages/23_Employer_Candidates.py", label="Search Candidates", icon="🔎" + ) + + + 
+#### ------------------------ System Administrator (Kaelyn Dunn) Role ------------------------ +def AdminHomeNav(): + st.sidebar.page_link( + "pages/30_Admin_Home.py", label="Admin Dashboard", icon="⚙️" + ) + + +def AdminEmployersNav(): + st.sidebar.page_link( + "pages/31_Admin_Employers.py", label="Manage Employers", icon="🏭" + ) + + +def AdminPostingsNav(): + st.sidebar.page_link( + "pages/32_Admin_Postings.py", label="Review Job Postings", icon="✅" + ) + + +def AdminDEINav(): + st.sidebar.page_link( + "pages/33_Admin_DEI.py", label="DEI Metrics", icon="🌍" + ) + + +def AdminAnalyticsNav(): + st.sidebar.page_link( + "pages/34_Admin_Analytics.py", label="Platform Analytics", icon="📈" ) @@ -63,7 +132,7 @@ def SideBarLinks(show_home=False): """ # add a logo to the sidebar always - st.sidebar.image("assets/logo.png", width=150) + st.sidebar.image("assets/coopalyticslogo.png", width=300) # If there is no logged in user, redirect to the Home (Landing) page if "authenticated" not in st.session_state: @@ -77,21 +146,36 @@ def SideBarLinks(show_home=False): # Show the other page navigators depending on the users' role. if st.session_state["authenticated"]: - # Show World Bank Link and Map Demo Link if the user is a political strategy advisor role. - if st.session_state["role"] == "pol_strat_advisor": - PolStratAdvHomeNav() - WorldBankVizNav() - MapDemoNav() - - # If the user role is usaid worker, show the Api Testing page - if st.session_state["role"] == "usaid_worker": - PredictionNav() - ApiTestNav() - ClassificationNav() - - # If the user is an administrator, give them access to the administrator pages + # Student Navigation (Charlie Stout persona) + if st.session_state["role"] == "student": + StudentHomeNav() + StudentApplicationsNav() + StudentPositionsNav() + StudentCalendarNav() + StudentAnalyticsNav() + + # Advisor Navigation (Sarah Martinez persona) + if st.session_state["role"] == "advisor": + AdvisorHomeNav() + AdvisorStudentManagementNav() + AdvisorAnalyticsNav() + AdvisorCompaniesNav() + + # Employer Navigation (Phoebe Hwang persona) + if st.session_state["role"] == "employer": + EmployerHomeNav() + EmployerPostingsNav() + EmployerApplicationsNav() + EmployerCandidatesNav() + + # System Administrator Navigation (Kaelyn Dunn persona) if st.session_state["role"] == "administrator": - AdminPageNav() + AdminHomeNav() + AdminEmployersNav() + AdminPostingsNav() + AdminDEINav() + AdminAnalyticsNav() + # Always show the About page at the bottom of the list of links AboutPageNav() @@ -101,4 +185,4 @@ def SideBarLinks(show_home=False): if st.sidebar.button("Logout"): del st.session_state["role"] del st.session_state["authenticated"] - st.switch_page("Home.py") + st.switch_page("Home.py") \ No newline at end of file diff --git a/app/src/pages/00_Pol_Strat_Home.py b/app/src/pages/00_Pol_Strat_Home.py deleted file mode 100644 index 3d02f25552..0000000000 --- a/app/src/pages/00_Pol_Strat_Home.py +++ /dev/null @@ -1,25 +0,0 @@ -import logging -logger = logging.getLogger(__name__) - -import streamlit as st -from modules.nav import SideBarLinks - -st.set_page_config(layout = 'wide') - -# Show appropriate sidebar links for the role of the currently logged in user -SideBarLinks() - -st.title(f"Welcome Political Strategist, {st.session_state['first_name']}.") -st.write('') -st.write('') -st.write('### What would you like to do today?') - -if st.button('View World Bank Data Visualization', - type='primary', - use_container_width=True): - st.switch_page('pages/01_World_Bank_Viz.py') - -if st.button('View 
World Map Demo', - type='primary', - use_container_width=True): - st.switch_page('pages/02_Map_Demo.py') \ No newline at end of file diff --git a/app/src/pages/00_Student_Home.py b/app/src/pages/00_Student_Home.py new file mode 100644 index 0000000000..a0200ab3be --- /dev/null +++ b/app/src/pages/00_Student_Home.py @@ -0,0 +1,472 @@ +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests + +# set up +st.set_page_config(layout='wide') +SideBarLinks() + +logger.info("Loading Student Home page") + +# Charlie Stout's userId from database +API_BASE_URL = "http://web-api:4000" + +# user_id from session state +charlie_user_id = st.session_state.get("user_id", None) + +if charlie_user_id is None: + st.error("User not logged in. Please return to home and log in.") + st.stop() + +# Function to get user data from API +def fetch_user_data(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}") + logger.info(f"Fetching user data from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + logger.info(f"User data received: {data}") + return data + else: + logger.error(f"Failed to fetch user data, status code: {response.status_code}, response: {response.text}") + return None + except Exception as e: + logger.error(f"Error fetching user data: {e}") + # Fallback data if API is not available + return { + 'userId': 1, + 'firstName': 'Charlie', + 'lastName': 'Stout', + 'email': 'c.stout@student.edu', + 'phone': '555-0101', + 'major': 'Computer Science', + 'minor': 'Mathematics', + 'college': 'Khoury College of Computer Sciences', + 'gradYear': '2026', + 'grade': 'Junior', + 'gender': None, + 'race': None, + 'nationality': None, + 'sexuality': None, + 'disability': None + } + +# Function to get user skills from API +def fetch_user_skills(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}/skills") + if response.status_code == 200: + return response.json() + return [] + except Exception as e: + logger.error(f"Error fetching user skills: {e}") + return [] + +# Function to get application summary from API +def fetch_application_summary(user_id): + try: + response = requests.get(f"{API_BASE_URL}/student/{user_id}/applications/summary") + logger.info(f"Fetching application summary from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + logger.info(f"Application summary data received: {data}") + return data + else: + logger.warning(f"Failed to fetch application summary, status code: {response.status_code}") + return [] + except Exception as e: + logger.error(f"Error fetching application summary: {e}") + return [] + +# Function to get recent applications from API +def fetch_recent_applications(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}/recent-applications") + if response.status_code == 200: + return response.json() + return [] + except Exception as e: + logger.error(f"Error fetching recent applications: {e}") + return [] + +# Function to update user data via API +def update_user_data(user_data): + try: + response = requests.put(f"{API_BASE_URL}/users", json=user_data) + return response.status_code == 200 + except Exception as e: + logger.error(f"Error updating user data: {e}") + return False + +# Function to get all available skills from API +def 
fetch_all_skills(): + try: + response = requests.get(f"{API_BASE_URL}/skills") + if response.status_code == 200: + return response.json() + return [] + except Exception as e: + logger.error(f"Error fetching all skills: {e}") + return [] + +# Function to update user skills +def update_user_skills(user_id, updated_skills, removed_skills): + try: + update_data = { + "updated_skills": list(updated_skills.values()), + "removed_skills": removed_skills + } + response = requests.put(f"{API_BASE_URL}/users/{user_id}/skills", json=update_data) + return response.status_code == 200 + except Exception as e: + logger.error(f"Error updating user skills: {e}") + return False + +# Function to add new skills to user profile +def add_user_skills(user_id, new_skills): + try: + response = requests.post(f"{API_BASE_URL}/users/{user_id}/skills", json={"skills": new_skills}) + return response.status_code == 200 + except Exception as e: + logger.error(f"Error adding user skills: {e}") + return False + +# Get user data and related information +user_data = fetch_user_data(charlie_user_id) +if isinstance(user_data, list) and len(user_data) > 0: + user_data = user_data[0] + +user_skills = fetch_user_skills(charlie_user_id) +app_summary = fetch_application_summary(charlie_user_id) +recent_applications = fetch_recent_applications(charlie_user_id) + +if user_data: + # Header + st.title("🎓 Student Dashboard") + st.subheader(f"Welcome back, {user_data['firstName']}!") + + # Tabs for initial student profile view + tab1, tab2, tab3 = st.tabs(["📋 Profile", "📊 Quick Stats", "🛠️ Skills Management"]) + + # Initial student profile view that can be updated + with tab1: + st.header("Your Profile") + + with st.form("profile_form"): + col1, col2 = st.columns(2) + + with col1: + st.subheader("Personal Information") + first_name = st.text_input("First Name", value=user_data.get("firstName", "")) + last_name = st.text_input("Last Name", value=user_data.get("lastName", "")) + email = st.text_input("Email", value=user_data.get("email", "")) + phone = st.text_input("Phone", value=user_data.get("phone", "")) + + with col2: + st.subheader("Academic Information") + major_options = ["Computer Science", "Data Science", "Information Systems", "Cybersecurity", + "Business", "Marketing", "Finance", "International Business", "Mechanical Engineering", + "Biomedical Engineering", "Electrical Engineering", "Environmental Engineering", + "Physics", "Biology", "Chemistry", "Psychology", "Design", "Mathematics", + "Economics", "Art", "Spanish", "Sociology", "History"] + + major_index = 0 + if user_data.get("major") in major_options: + major_index = major_options.index(user_data.get("major")) + major = st.selectbox("Major", major_options, index=major_index) + + minor_options = ["None"] + major_options + minor_index = 0 + if user_data.get("minor") in minor_options: + minor_index = minor_options.index(user_data.get("minor")) + minor = st.selectbox("Minor", minor_options, index=minor_index) + + college_options = ["College of Arts, Media and Design", "Bouvé College of Health Sciences", + "D'Amore-McKim School of Business", "Khoury College of Computer Sciences", + "College of Engineering", "College of Science", "College of Social Sciences and Humanities"] + college_index = 0 + current_college = user_data.get("college", "") + if current_college in college_options: + college_index = college_options.index(current_college) + college = st.selectbox("College", college_options, index=college_index) + + grad_year_options = ["2024", "2025", "2026", "2027"] + 
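+                # Default each selectbox to the student's saved value when it matches a known
+                # option; otherwise fall back to the first entry. A small hypothetical helper,
+                # e.g. def default_index(options, value): return options.index(value) if value in options else 0,
+                # could replace the repeated index lookups below.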
grad_year_index = 0 + if user_data.get("gradYear") in grad_year_options: + grad_year_index = grad_year_options.index(user_data.get("gradYear")) + grad_year = st.selectbox("Graduation Year", grad_year_options, index=grad_year_index) + + grade_options = ["Sophomore", "Junior", "Senior"] + grade_index = 0 + if user_data.get("grade") in grade_options: + grade_index = grade_options.index(user_data.get("grade")) + grade = st.selectbox("Current Grade", grade_options, index=grade_index) + + st.subheader("Demographics") + demo_col1, demo_col2 = st.columns(2) + + with demo_col1: + gender_options = ["Male", "Female", "Non-binary", "Prefer not to say", "Other"] + gender_index = 0 + if user_data.get("gender") in gender_options: + gender_index = gender_options.index(user_data.get("gender")) + gender = st.selectbox("Gender", gender_options, index=gender_index) + + race_options = ["White", "Asian", "Black/African American", "Hispanic/Latino", + "Native American", "Pacific Islander", "Mixed", "Prefer not to say"] + race_index = 0 + if user_data.get("race") in race_options: + race_index = race_options.index(user_data.get("race")) + race = st.selectbox("Race/Ethnicity", race_options, index=race_index) + + with demo_col2: + nationality_options = ["American", "International", "Prefer not to say"] + nationality_index = 0 + if user_data.get("nationality") in nationality_options: + nationality_index = nationality_options.index(user_data.get("nationality")) + nationality = st.selectbox("Nationality", nationality_options, index=nationality_index) + + sexuality_options = ["Heterosexual", "LGBTQ+", "Prefer not to say"] + sexuality_index = 0 + if user_data.get("sexuality") in sexuality_options: + sexuality_index = sexuality_options.index(user_data.get("sexuality")) + sexuality = st.selectbox("Sexual Orientation", sexuality_options, index=sexuality_index) + + disability_options = ["None", "ADHD", "Anxiety", "Dyslexia", "Depression", "Autism", "Prefer not to say"] + disability_index = 0 + if user_data.get("disability") in disability_options: + disability_index = disability_options.index(user_data.get("disability")) + disability = st.selectbox("Disability Status", disability_options, index=disability_index) + + submitted = st.form_submit_button("Update Profile", type="primary", use_container_width=True) + + if submitted: + update_data = { + "userId": charlie_user_id, + "firstName": first_name, + "lastName": last_name, + "email": email, + "phone": phone, + "major": major, + "minor": minor if minor != "None" else None, + "college": college, + "gradYear": grad_year, + "grade": grade, + "gender": gender, + "race": race, + "nationality": nationality, + "sexuality": sexuality, + "disability": disability if disability != "None" else None + } + + if update_user_data(update_data): + st.success("✅ Profile updated successfully!") + st.rerun() + else: + st.error("❌ Failed to update profile") + + # General Stats for the student on their application process + with tab2: + st.header("📊 Quick Stats") + + # Calculate metrics from real data + total_applications = sum(item.get('ApplicationCount', 0) for item in app_summary) if app_summary else 0 + under_review = next((item.get('ApplicationCount', 0) for item in app_summary if item.get('status') == 'Under Review'), 0) + submitted = next((item.get('ApplicationCount', 0) for item in app_summary if item.get('status') == 'Submitted'), 0) + + # Get GPA from most recent application + latest_gpa = "N/A" + if recent_applications: + latest_gpa = recent_applications[0].get('gpa', 'N/A') + + # Display 
metrics in a clean layout + metric_col1, metric_col2, metric_col3, metric_col4 = st.columns(4) + + with metric_col1: + st.metric(label="📝 Applications Submitted", value=str(total_applications), delta="Total") + + with metric_col2: + st.metric(label="👁️ Under Review", value=str(under_review), delta="Pending") + + with metric_col3: + st.metric(label="📄 Recently Submitted", value=str(submitted), delta="Awaiting Review") + + with metric_col4: + st.metric(label="⭐ GPA", value=str(latest_gpa), delta="Latest Application") + + # Skills section that can be updated by the student + st.subheader("🛠️ Your Skills Profile") + st.caption("Based on your profile and experience") + + if user_skills: + # Group skills by category + skills_by_category = {} + for skill in user_skills: + category = skill['category'] + if category not in skills_by_category: + skills_by_category[category] = [] + skills_by_category[category].append(skill) + + # Display skills in columns + categories = list(skills_by_category.keys()) + if len(categories) >= 3: + skill_col1, skill_col2, skill_col3 = st.columns(3) + cols = [skill_col1, skill_col2, skill_col3] + elif len(categories) == 2: + skill_col1, skill_col2 = st.columns(2) + cols = [skill_col1, skill_col2] + else: + cols = [st] + + for i, category in enumerate(categories): + col = cols[i % len(cols)] + with col: + st.markdown(f"**{category}**") + for skill in skills_by_category[category]: + proficiency = skill['proficiencyLevel'] + progress_value = proficiency / 5.0 # Convert 1-5 scale to 0-1 + + # Convert proficiency to text + proficiency_text = {1: "Beginner", 2: "Basic", 3: "Intermediate", 4: "Advanced", 5: "Expert"} + level_text = proficiency_text.get(proficiency, "Unknown") + + st.write(f"{skill['name']} ({level_text})") + st.progress(progress_value) + else: + st.info("No skills data available. 
Please contact your advisor to update your skills profile.") + + with tab3: + # Skills Management Section + st.header("🛠️ Skills Management") + + if user_skills: + # Group skills by category + skills_by_category = {} + for skill in user_skills: + category = skill['category'] + if category not in skills_by_category: + skills_by_category[category] = [] + skills_by_category[category].append(skill) + + # Create skills management form + with st.form("skills_form"): + st.subheader("📝 Edit Your Skills & Proficiency Levels") + + # Display skills grouped by category + updated_skills = {} + skills_to_remove = [] + + for category, skills in skills_by_category.items(): + st.markdown(f"**{category}**") + + for skill in skills: + col1, col2, col3 = st.columns([3, 2, 1]) + + with col1: + st.write(f"• {skill['name']}") + + with col2: + # Proficiency level slider (1-5) + proficiency = st.slider( + f"Level", + min_value=1, + max_value=5, + value=skill['proficiencyLevel'], + key=f"skill_{skill['skillId']}_proficiency", + help="1=Beginner, 2=Novice, 3=Intermediate, 4=Advanced, 5=Expert" + ) + updated_skills[skill['skillId']] = { + 'skillId': skill['skillId'], + 'proficiencyLevel': proficiency + } + + with col3: + # Remove skill checkbox + if st.checkbox("Remove", key=f"remove_skill_{skill['skillId']}"): + skills_to_remove.append(skill['skillId']) + + st.markdown("") # Add spacing between categories + + # Save skills changes button + skills_submitted = st.form_submit_button("💾 Save Skills Changes", type="primary", use_container_width=True) + + if skills_submitted: + # Filter out skills marked for removal + final_skills = {k: v for k, v in updated_skills.items() if k not in skills_to_remove} + + if update_user_skills(charlie_user_id, final_skills, skills_to_remove): + st.success("✅ Skills updated successfully!") + st.rerun() + else: + st.error("❌ Failed to update skills") + + # Add New Skills Section + st.markdown("---") + st.subheader("➕ Add New Skills") + + # get all available skills for adding + all_skills = fetch_all_skills() + if all_skills: + # Filter out skills user already has + current_skill_ids = [skill['skillId'] for skill in user_skills] if user_skills else [] + available_skills = [skill for skill in all_skills if skill['skillId'] not in current_skill_ids] + + if available_skills: + with st.form("add_skills_form"): + # Group available skills by category for easier selection + available_by_category = {} + for skill in available_skills: + category = skill['category'] + if category not in available_by_category: + available_by_category[category] = [] + available_by_category[category].append(skill) + + selected_skills = [] + + for category, skills in available_by_category.items(): + st.markdown(f"**{category}**") + + for skill in skills: + col1, col2 = st.columns([3, 2]) + + with col1: + if st.checkbox(skill['name'], key=f"add_skill_{skill['skillId']}"): + with col2: + proficiency = st.slider( + "Proficiency", + min_value=1, + max_value=5, + value=3, + key=f"new_skill_{skill['skillId']}_proficiency", + help="1=Beginner, 2=Novice, 3=Intermediate, 4=Advanced, 5=Expert" + ) + selected_skills.append({ + 'skillId': skill['skillId'], + 'proficiencyLevel': proficiency + }) + + st.markdown("") + + # Add selected skills button + add_skills_submitted = st.form_submit_button("➕ Add Selected Skills", type="secondary", use_container_width=True) + + if add_skills_submitted and selected_skills: + if add_user_skills(charlie_user_id, selected_skills): + st.success(f"✅ Added {len(selected_skills)} new skills!") + 
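+                    # Rerun so the newly added skills show up in the skills lists right away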
st.rerun() + else: + st.error("❌ Failed to add skills") + elif add_skills_submitted and not selected_skills: + st.warning("⚠️ Please select at least one skill to add") + else: + st.info("🎉 You have all available skills! Great job!") + else: + st.error("❌ Unable to load available skills") + + +else: + st.error("Unable to load user data. Please try again later.") \ No newline at end of file diff --git a/app/src/pages/01_Student_Applications.py b/app/src/pages/01_Student_Applications.py new file mode 100644 index 0000000000..54af735736 --- /dev/null +++ b/app/src/pages/01_Student_Applications.py @@ -0,0 +1,97 @@ +import logging +import streamlit as st +import requests +from modules.nav import SideBarLinks + +# Setup +st.set_page_config(layout='wide') +SideBarLinks() +logger = logging.getLogger(__name__) +logger.info("Loading Applications page") + +API_BASE_URL = "http://web-api:4000" +user_id = st.session_state.get("user_id", None) + +if user_id is None: + st.error("🚫 User not logged in. Please return to the home page and log in.") + st.stop() + +# Create tabs +tab1, tab2 = st.tabs(["📄 Application Status", "📝 Apply to New Position"]) + +# Existing applications tab +with tab1: + st.subheader("📄 My Applications") + + try: + res = requests.get(f"{API_BASE_URL}/student/{user_id}/applications") + if res.status_code == 200: + apps = res.json() + if not apps: + st.info("No applications submitted yet.") + else: + for app in apps: + # Assign individual fields to variables for clarity + position_title = app.get('positionTitle', 'Unknown Position') + application_status = app.get('applicationStatus', 'Unknown Status') + date_applied = app.get('dateTimeApplied', 'N/A') + gpa = app.get('gpa', 'N/A') + resume = app.get('resume', 'N/A') + cover_letter = app.get('coverLetter', 'N/A') + + with st.expander(f"{position_title} — {application_status}"): + st.markdown(f"**Applied on:** `{date_applied}`") + st.markdown(f"**GPA:** `{gpa}`") + st.markdown("**Resume:**") + st.code(resume) + st.markdown("**Cover Letter:**") + st.code(cover_letter) + else: + st.error(f"Could not fetch applications. Server returned status code {res.status_code}") + except Exception as e: + st.error(f"Error fetching applications: {e}") + +# Apply to new position tab +with tab2: + st.subheader("📝 Submit a New Application") + + try: + pos_res = requests.get(f"{API_BASE_URL}/positions") + if pos_res.status_code == 200: + positions = pos_res.json() + pos_map = {f"{pos['title']} (ID: {pos['coopPositionId']})": pos['coopPositionId'] for pos in positions} + + if not pos_map: + st.warning("No co-op positions are currently available.") + else: + pos_label = st.selectbox("Select a Position", list(pos_map.keys())) + selected_pos_id = pos_map[pos_label] + + resume = st.text_area("Paste your resume here", height=150) + cover_letter = st.text_area("Paste your cover letter here", height=150) + gpa = st.number_input("Your GPA", min_value=0.0, max_value=4.0, step=0.01, format="%.2f") + + if st.button("📤 Submit Application"): + data = { + "studentId": user_id, + "coopPositionId": selected_pos_id, + "resume": resume, + "coverLetter": cover_letter, + "gpa": gpa, + "status": "Submitted" + } + + try: + submit_res = requests.post(f"{API_BASE_URL}/applications/new", json=data) + if submit_res.status_code == 201: + st.success("Application submitted successfully!") + st.rerun() + else: + error_info = submit_res.json().get("error", "No error message returned") + st.error(f"Failed to submit application. 
Reason: {error_info}") + except Exception as e: + st.error(f"Exception occurred during submission: {e}") + else: + st.error("Failed to fetch positions.") + except Exception as e: + st.error(f"Error: {e}") diff --git a/app/src/pages/01_World_Bank_Viz.py b/app/src/pages/01_World_Bank_Viz.py deleted file mode 100644 index a34cbb1529..0000000000 --- a/app/src/pages/01_World_Bank_Viz.py +++ /dev/null @@ -1,41 +0,0 @@ -import logging -logger = logging.getLogger(__name__) -import pandas as pd -import streamlit as st -from streamlit_extras.app_logo import add_logo -import world_bank_data as wb -import matplotlib.pyplot as plt -import numpy as np -import plotly.express as px -from modules.nav import SideBarLinks - -# Call the SideBarLinks from the nav module in the modules directory -SideBarLinks() - -# set the header of the page -st.header('World Bank Data') - -# You can access the session state to make a more customized/personalized app experience -st.write(f"### Hi, {st.session_state['first_name']}.") - -# get the countries from the world bank data -with st.echo(code_location='above'): - countries:pd.DataFrame = wb.get_countries() - - st.dataframe(countries) - -# the with statment shows the code for this block above it -with st.echo(code_location='above'): - arr = np.random.normal(1, 1, size=100) - test_plot, ax = plt.subplots() - ax.hist(arr, bins=20) - - st.pyplot(test_plot) - - -with st.echo(code_location='above'): - slim_countries = countries[countries['incomeLevel'] != 'Aggregates'] - data_crosstab = pd.crosstab(slim_countries['region'], - slim_countries['incomeLevel'], - margins = False) - st.table(data_crosstab) diff --git a/app/src/pages/02_Map_Demo.py b/app/src/pages/02_Map_Demo.py deleted file mode 100644 index 5ca09a9633..0000000000 --- a/app/src/pages/02_Map_Demo.py +++ /dev/null @@ -1,104 +0,0 @@ -import logging -logger = logging.getLogger(__name__) -import streamlit as st -from streamlit_extras.app_logo import add_logo -import pandas as pd -import pydeck as pdk -from urllib.error import URLError -from modules.nav import SideBarLinks - -SideBarLinks() - -# add the logo -add_logo("assets/logo.png", height=400) - -# set up the page -st.markdown("# Mapping Demo") -st.sidebar.header("Mapping Demo") -st.write( - """This Mapping Demo is from the Streamlit Documentation. 
It shows how to use -[`st.pydeck_chart`](https://docs.streamlit.io/library/api-reference/charts/st.pydeck_chart) -to display geospatial data.""" -) - - -@st.cache_data -def from_data_file(filename): - url = ( - "http://raw.githubusercontent.com/streamlit/" - "example-data/master/hello/v1/%s" % filename - ) - return pd.read_json(url) - - -try: - ALL_LAYERS = { - "Bike Rentals": pdk.Layer( - "HexagonLayer", - data=from_data_file("bike_rental_stats.json"), - get_position=["lon", "lat"], - radius=200, - elevation_scale=4, - elevation_range=[0, 1000], - extruded=True, - ), - "Bart Stop Exits": pdk.Layer( - "ScatterplotLayer", - data=from_data_file("bart_stop_stats.json"), - get_position=["lon", "lat"], - get_color=[200, 30, 0, 160], - get_radius="[exits]", - radius_scale=0.05, - ), - "Bart Stop Names": pdk.Layer( - "TextLayer", - data=from_data_file("bart_stop_stats.json"), - get_position=["lon", "lat"], - get_text="name", - get_color=[0, 0, 0, 200], - get_size=15, - get_alignment_baseline="'bottom'", - ), - "Outbound Flow": pdk.Layer( - "ArcLayer", - data=from_data_file("bart_path_stats.json"), - get_source_position=["lon", "lat"], - get_target_position=["lon2", "lat2"], - get_source_color=[200, 30, 0, 160], - get_target_color=[200, 30, 0, 160], - auto_highlight=True, - width_scale=0.0001, - get_width="outbound", - width_min_pixels=3, - width_max_pixels=30, - ), - } - st.sidebar.markdown("### Map Layers") - selected_layers = [ - layer - for layer_name, layer in ALL_LAYERS.items() - if st.sidebar.checkbox(layer_name, True) - ] - if selected_layers: - st.pydeck_chart( - pdk.Deck( - map_style="mapbox://styles/mapbox/light-v9", - initial_view_state={ - "latitude": 37.76, - "longitude": -122.4, - "zoom": 11, - "pitch": 50, - }, - layers=selected_layers, - ) - ) - else: - st.error("Please choose at least one layer above.") -except URLError as e: - st.error( - """ - **This demo requires internet access.** - Connection error: %s - """ - % e.reason - ) diff --git a/app/src/pages/02_Student_Browse_Positions.py b/app/src/pages/02_Student_Browse_Positions.py new file mode 100644 index 0000000000..2e4719d33c --- /dev/null +++ b/app/src/pages/02_Student_Browse_Positions.py @@ -0,0 +1,155 @@ +import logging +import streamlit as st +import requests +from modules.nav import SideBarLinks + +# Logging setup +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +# Page setup +st.set_page_config(layout='wide') +SideBarLinks() + +logger.info("Loading Coop Positions page") + +# Constants +API_BASE_URL = "http://web-api:4000" + +# student id from session state +charlie_user_id = st.session_state.get("user_id", None) + +if charlie_user_id is None: + st.error("🚫 User not logged in. 
Please return to the home page and log in.") + st.stop() + +# Coop Position filter +filter_option = st.selectbox( + "View positions by:", + options=["All", "Liked", "Disliked", "Matches Desired Skills"] +) + +# Get coopPositionIds based on preference +def get_preference_ids(student_id, preference_value): + url = f"{API_BASE_URL}/viewpos/{student_id}?preference={'true' if preference_value else 'false'}" + try: + response = requests.get(url) + if response.status_code == 200: + positions = response.json() + return {pos["coopPositionId"] for pos in positions} + else: + logger.error(f"Failed to fetch preference={preference_value} positions: {response.status_code}") + return set() + except requests.exceptions.RequestException as e: + logger.error(f"Error fetching preference={preference_value} positions: {e}") + return set() + +# get positions based on selected filter +def fetch_positions(): + if filter_option == "All": + url = f"{API_BASE_URL}/positions" + elif filter_option == "Liked": + url = f"{API_BASE_URL}/viewpos/{charlie_user_id}?preference=true" + elif filter_option == "Disliked": + url = f"{API_BASE_URL}/viewpos/{charlie_user_id}?preference=false" + elif filter_option == "Matches Desired Skills": + url = f"{API_BASE_URL}/{charlie_user_id}/desiredSkills" + else: + st.warning("Unknown filter selected.") + return [] + + try: + response = requests.get(url) + if response.status_code == 200: + return response.json() + else: + st.error(f"Failed to fetch data: {response.status_code}") + return [] + except requests.exceptions.RequestException as e: + st.error(f"API request error: {e}") + return [] + +# Load position ID preferences only if viewing All +liked_position_ids = set() +disliked_position_ids = set() +if filter_option == "All": + liked_position_ids = get_preference_ids(charlie_user_id, True) + disliked_position_ids = get_preference_ids(charlie_user_id, False) + +# get and display positions +positions = fetch_positions() + +if filter_option == "Matches Desired Skills" and not positions: + st.info("🔍 No matches found. 
You may not have any desired skills set in your profile.") + +for pos in positions: + coop_id = pos["coopPositionId"] + title = pos["title"] + + # Add icons if in viewing all positions + if filter_option == "All": + liked = coop_id in liked_position_ids + disliked = coop_id in disliked_position_ids + + if liked and disliked: + title += " 👍👎" + elif liked: + title += " 👍" + elif disliked: + title += " 👎" + + with st.expander(title): + st.write(f"**Location**: {pos.get('location', 'Not Specified')}") + st.write(f"**Description**: {pos.get('description', 'N/A')}") + st.write(f"**Industry**: {pos.get('industry', 'Not Specified')}") + st.write(f"**Hourly Pay**: ${pos.get('hourlyPay', 'N/A')}/hr") + + st.write(f"**Desired GPA**: {pos.get('desiredGPA', 'N/A')}") + st.write(f"**Deadline**: {pos.get('deadline', 'N/A')}") + st.write(f"**Start Date**: {pos.get('startDate', 'N/A')}") + st.write(f"**End Date**: {pos.get('endDate', 'N/A')}") + + st.write(f"**Required Skills ID**: {pos.get('requiredSkillsId', 'None')}") + st.write(f"**Desired Skills ID**: {pos.get('desiredSkillsId', 'None')}") + st.write(f"**Flagged**: {'Yes' if pos.get('flag') else 'No'}") + + # like, dislike, and remove preference buttons + col1, col2, col3 = st.columns([1, 1, 1]) + + with col1: + if st.button("👍 Like", key=f"like_{coop_id}"): + response = requests.post(f"{API_BASE_URL}/position", json={ + "studentId": charlie_user_id, + "coopPositionId": coop_id, + "preference": True + }) + if response.status_code == 200: + st.success("Marked as liked.") + st.rerun() + else: + st.error("Failed to save preference.") + + with col2: + if st.button("👎 Dislike", key=f"dislike_{coop_id}"): + response = requests.post(f"{API_BASE_URL}/position", json={ + "studentId": charlie_user_id, + "coopPositionId": coop_id, + "preference": False + }) + if response.status_code == 200: + st.warning("Marked as disliked.") + st.rerun() + else: + st.error("Failed to save preference.") + + with col3: + if st.button("🗑️ Remove Preference", key=f"remove_{coop_id}"): + response = requests.delete(f"{API_BASE_URL}/position", json={ + "studentId": charlie_user_id, + "coopPositionId": coop_id + }) + if response.status_code == 200: + st.info("Preference removed.") + st.rerun() + else: + st.error("Failed to remove preference.") diff --git a/app/src/pages/03_Simple_Chat_Bot.py b/app/src/pages/03_Simple_Chat_Bot.py deleted file mode 100644 index fa8db58e84..0000000000 --- a/app/src/pages/03_Simple_Chat_Bot.py +++ /dev/null @@ -1,66 +0,0 @@ -import logging -logger = logging.getLogger(__name__) -import streamlit as st -from streamlit_extras.app_logo import add_logo -import numpy as np -import random -import time -from modules.nav import SideBarLinks - -SideBarLinks() - -def response_generator(): - response = random.choice ( - [ - "Hello there! How can I assist you today?", - "Hi, human! Is there anything I can help you with?", - "Do you need help?", - ] - ) - for word in response.split(): - yield word + " " - time.sleep(0.05) -#----------------------------------------------------------------------- - -st.set_page_config (page_title="Sample Chat Bot", page_icon="🤖") -add_logo("assets/logo.png", height=400) - -st.title("Echo Bot 🤖") - -st.markdown(""" - Currently, this chat bot only returns a random message from the following list: - - Hello there! How can I assist you today? - - Hi, human! Is there anything I can help you with? - - Do you need help? 
- """ - ) - - -# Initialize chat history -if "messages" not in st.session_state: - st.session_state.messages = [] - -# Display chat message from history on app rerun -for message in st.session_state.messages: - with st.chat_message(message["role"]): - st.markdown(message["content"]) - -# React to user input -if prompt := st.chat_input("What is up?"): - # Display user message in chat message container - with st.chat_message("user"): - st.markdown(prompt) - - # Add user message to chat history - st.session_state.messages.append({"role": "user", "content": prompt}) - - response = f"Echo: {prompt}" - - # Display assistant response in chat message container - with st.chat_message("assistant"): - # st.markdown(response) - response = st.write_stream(response_generator()) - - # Add assistant response to chat history - st.session_state.messages.append({"role": "assistant", "content": response}) - diff --git a/app/src/pages/03_Student_Analytics.py b/app/src/pages/03_Student_Analytics.py new file mode 100644 index 0000000000..050d3cabff --- /dev/null +++ b/app/src/pages/03_Student_Analytics.py @@ -0,0 +1,107 @@ +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests +import pandas as pd + +st.set_page_config(layout='wide') +SideBarLinks() + +st.title("Student Analytics Dashboard") +st.markdown("---") + + +API_BASE_URL = "http://web-api:4000" +WAGE_DATA_ENDPOINT = f"{API_BASE_URL}/workedatpos/wagedata" + +def fetch_wage_data(): + """Fetch wage data from the REST API""" + try: + response = requests.get(WAGE_DATA_ENDPOINT, timeout=10) + if response.status_code == 200: + return response.json() + else: + st.error(f"Failed to fetch data: {response.status_code}") + return [] + except requests.exceptions.RequestException as e: + st.error(f"Error connecting to API: {str(e)}") + return [] + +wage_data = fetch_wage_data() + +if wage_data: + df = pd.DataFrame(wage_data) + col1, col2, col3, col4 = st.columns(4) + + with col1: + st.metric("Total Positions", len(df)) + + with col2: + avg_pay = df['avgPay'].mean() if 'avgPay' in df.columns else 0 + st.metric("Average Pay", f"${avg_pay:.2f}/hr") + + with col3: + max_pay = df['maxSalary'].max() if 'maxSalary' in df.columns else 0 + st.metric("Highest Pay", f"${max_pay:.2f}/hr") + + with col4: + total_coops = df['numPreviousCoops'].sum() if 'numPreviousCoops' in df.columns else 0 + st.metric("Total Previous Co-ops", total_coops) + + st.markdown("---") + + # Display the wage data table + st.subheader("Co-op Position Wage Data") + st.markdown("Data from past co-op positions showing company names, position titles, and salary ranges.") + + # Format the data for better display + if not df.empty: + # Rename columns for better display + display_df = df.copy() + if 'companyName' in display_df.columns: + display_df = display_df.rename(columns={'companyName': 'Company Name'}) + if 'positionTitle' in display_df.columns: + display_df = display_df.rename(columns={'positionTitle': 'Position Title'}) + if 'minSalary' in display_df.columns: + display_df = display_df.rename(columns={'minSalary': 'Min Salary ($/hr)'}) + if 'maxSalary' in display_df.columns: + display_df = display_df.rename(columns={'maxSalary': 'Max Salary ($/hr)'}) + if 'avgPay' in display_df.columns: + display_df = display_df.rename(columns={'avgPay': 'Average Pay ($/hr)'}) + if 'numPreviousCoops' in display_df.columns: + display_df = 
display_df.rename(columns={'numPreviousCoops': 'Previous Co-ops'}) + + # Format salary columns to 2 decimal places + salary_columns = ['Min Salary ($/hr)', 'Max Salary ($/hr)', 'Average Pay ($/hr)'] + for col in salary_columns: + if col in display_df.columns: + display_df[col] = display_df[col].round(2) + + # Display the table + st.dataframe(display_df, use_container_width=True) + + # Add some visualizations + st.markdown("---") + st.subheader("Pay Distribution Analysis") + + col1, col2 = st.columns(2) + + with col1: + if 'Average Pay ($/hr)' in display_df.columns: + st.bar_chart(display_df.set_index('Position Title')['Average Pay ($/hr)'].head(10)) + st.caption("Top 10 Positions by Average Pay") + + with col2: + if 'Company Name' in display_df.columns and 'Average Pay ($/hr)' in display_df.columns: + company_avg = display_df.groupby('Company Name')['Average Pay ($/hr)'].mean().sort_values(ascending=False).head(10) + st.bar_chart(company_avg) + st.caption("Top 10 Companies by Average Pay") + +else: + st.warning("No wage data available. Please check if the API is running and accessible.") + st.info("Make sure the backend API is running on port 4000.") + + diff --git a/app/src/pages/04_Prediction.py b/app/src/pages/04_Prediction.py deleted file mode 100644 index a5a322a2f4..0000000000 --- a/app/src/pages/04_Prediction.py +++ /dev/null @@ -1,38 +0,0 @@ -import logging -logger = logging.getLogger(__name__) - -import streamlit as st -from modules.nav import SideBarLinks -import requests - -st.set_page_config(layout = 'wide') - -# Display the appropriate sidebar links for the role of the logged in user -SideBarLinks() - -st.title('Prediction with Regression') - -# create a 2 column layout -col1, col2 = st.columns(2) - -# add one number input for variable 1 into column 1 -with col1: - var_01 = st.number_input('Variable 01:', - step=1) - -# add another number input for variable 2 into column 2 -with col2: - var_02 = st.number_input('Variable 02:', - step=1) - -logger.info(f'var_01 = {var_01}') -logger.info(f'var_02 = {var_02}') - -# add a button to use the values entered into the number field to send to the -# prediction function via the REST API -if st.button('Calculate Prediction', - type='primary', - use_container_width=True): - results = requests.get(f'http://api:4000/c/prediction/{var_01}/{var_02}').json() - st.dataframe(results) - \ No newline at end of file diff --git a/app/src/pages/04_Student_Calendar.py b/app/src/pages/04_Student_Calendar.py new file mode 100644 index 0000000000..463b318b1c --- /dev/null +++ b/app/src/pages/04_Student_Calendar.py @@ -0,0 +1,91 @@ +import calendar +import datetime +import streamlit as st +import pandas as pd +import requests +from modules.nav import SideBarLinks + +# Setup +st.set_page_config(layout="wide") +SideBarLinks() +API_BASE_URL = "http://web-api:4000" +charlie_user_id = st.session_state.get("user_id", None) +if charlie_user_id is None: + st.error("🚫 User not logged in. 
Please return to the home page and log in.") + st.stop() + +# get deadlines +def fetch_flagged_deadlines(user_id): + try: + url = f"{API_BASE_URL}/{user_id}/deadlines" + response = requests.get(url) + if response.status_code == 200: + return response.json() + else: + st.error(f"Failed to fetch deadlines: {response.status_code}") + return [] + except Exception as e: + st.error(f"Error fetching deadlines: {e}") + return [] + +deadlines = fetch_flagged_deadlines(charlie_user_id) + +# Convert to df +if deadlines: + df = pd.DataFrame(deadlines) + df['deadline'] = pd.to_datetime(df['deadline']).dt.date +else: + df = pd.DataFrame(columns=["title", "deadline"]) + +st.title("📅 Your Position Deadline Calendar") + +# create feature for user to select the month and year they want to look at +col1, col2 = st.columns(2) +today = datetime.date.today() +with col1: + year = st.number_input("Year", min_value=2000, max_value=2100, value=today.year) +with col2: + month = st.selectbox("Month", list(calendar.month_name)[1:], index=today.month - 1) + +# Header for Calendar +st.subheader(f"{month} {year}") + +month_num = list(calendar.month_name).index(month) + +# Generate calendar matrix for the selected month and year +cal = calendar.monthcalendar(year, month_num) + +# group positions by deadline date +positions_by_date = {} +for _, row in df.iterrows(): + positions_by_date.setdefault(row['deadline'], []).append(row['title']) + +# show calendar +days_of_week = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"] +cols = st.columns(7) +for i, day in enumerate(days_of_week): + cols[i].markdown(f"**{day}**") + +for week in cal: + cols = st.columns(7) + for i, day in enumerate(week): + if day == 0: + cols[i].markdown(" ") + else: + date_obj = datetime.date(year, month_num, day) + pos_titles = positions_by_date.get(date_obj, []) + # Show date number + day_str = f"**{day}**" + # Show position titles + if pos_titles: + if len(pos_titles) <= 2: + events_str = "\n".join([f"- {title}" for title in pos_titles]) + else: + events_str = "\n".join([f"- {title}" for title in pos_titles[:2]]) + events_str += f"\n- +{len(pos_titles)-2} more" + cols[i].markdown(f"{day_str}\n{events_str}") + else: + cols[i].markdown(day_str) + +if df.empty: + st.info("📭 You haven’t flagged any positions yet. Flag some positions to see their deadlines here!") diff --git a/app/src/pages/10_Advisor_Home.py b/app/src/pages/10_Advisor_Home.py new file mode 100644 index 0000000000..f1541a9e29 --- /dev/null +++ b/app/src/pages/10_Advisor_Home.py @@ -0,0 +1,219 @@ +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests + +st.set_page_config(layout='wide') +SideBarLinks() + +logger.info("Loading Advisor Home page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Get the user_id from session state +advisor_user_id = st.session_state.get("user_id", None) + +if advisor_user_id is None: + st.error("User not logged in. 
Please return to home and log in.") + st.stop() + +# Function to fetch advisor data from API +def fetch_advisor_data(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}") + logger.info(f"Fetching advisor data from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + return data[0] if data else None + return None + except Exception as e: + logger.error(f"Error fetching advisor data: {e}") + # Fallback data if API is not available + return { + 'userId': 31, + 'firstName': 'Sarah', + 'lastName': 'Martinez', + 'email': 's.martinez@neu.edu', + 'phone': '555-0301', + 'college': 'NEU', + 'industry': 'Academic', + 'gender': 'Female', + 'race': 'Hispanic', + 'nationality': 'American', + 'sexuality': 'Heterosexual', + 'disability': None + } + +# Function to fetch advisor's assigned students from API +def fetch_advisor_students(advisor_id): + try: + response = requests.get(f"{API_BASE_URL}/advisors/{advisor_id}/students") + logger.info(f"Fetching advisor students from API: status_code={response.status_code}") + if response.status_code == 200: + return response.json() + return [] + except Exception as e: + logger.error(f"Error fetching advisor students: {e}") + # Fallback data if API is not available + return [ + { + 'userId': 1, + 'firstName': 'Charlie', + 'lastName': 'Stout', + 'email': 'c.stout@student.edu', + 'phone': '555-0101', + 'major': 'Computer Science', + 'minor': 'Mathematics', + 'college': 'Khoury College of Computer Sciences', + 'gradYear': '2026', + 'grade': 'Junior' + }, + { + 'userId': 2, + 'firstName': 'Liam', + 'lastName': 'Williams', + 'email': 'l.williams@student.edu', + 'phone': '555-0102', + 'major': 'Business', + 'minor': 'Economics', + 'college': 'D\'Amore-McKim School of Business', + 'gradYear': '2025', + 'grade': 'Senior' + }, + { + 'userId': 3, + 'firstName': 'Sophia', + 'lastName': 'Brown', + 'email': 's.brown@student.edu', + 'phone': '555-0103', + 'major': 'Mechanical Engineering', + 'minor': 'Physics', + 'college': 'College of Engineering', + 'gradYear': '2027', + 'grade': 'Sophomore' + } + ] + +# Function to fetch student application statistics +def fetch_student_application_stats(student_id): + try: + response = requests.get(f"{API_BASE_URL}/student/{student_id}/applications/summary") + logger.info(f"Fetching student application stats from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + logger.info(f"Student application stats received: {data}") + return data + else: + logger.warning(f"Failed to fetch student application stats, status code: {response.status_code}") + return [] + except Exception as e: + logger.error(f"Error fetching student application stats: {e}") + return [] + +# Function to update advisor data +def update_advisor_data(advisor_id, advisor_data): + try: + response = requests.put(f"{API_BASE_URL}/advisors/{advisor_id}/profile", json=advisor_data) + logger.info(f"Updating advisor profile: status_code={response.status_code}") + return response.status_code == 200 + except Exception as e: + logger.error(f"Error updating advisor data: {e}") + return False + +# Function to update student flag status +def update_student_flag(advisor_id, student_id, flagged): + try: + response = requests.put(f"{API_BASE_URL}/advisors/{advisor_id}/students/{student_id}/flag", + json={"flagged": flagged}) + logger.info(f"Updating student flag: status_code={response.status_code}") + return response.status_code == 200 + except Exception as e: + 
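+        # Log the failure; callers treat a False return as a failed flag update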
logger.error(f"Error updating student flag: {e}") + return False + +# Fetch data +advisor_data = fetch_advisor_data(advisor_user_id) +advisor_students = fetch_advisor_students(advisor_user_id) + +if advisor_data: + + st.header("Your Advisor Profile") + + with st.form("advisor_profile_form"): + st.subheader("Personal Information") + col1, col2 = st.columns(2) + + with col1: + first_name = st.text_input("First Name", value=advisor_data.get("firstName", "")) + last_name = st.text_input("Last Name", value=advisor_data.get("lastName", "")) + + with col2: + email = st.text_input("Email", value=advisor_data.get("email", "")) + phone = st.text_input("Phone", value=advisor_data.get("phone", "")) + + st.subheader("Demographics") + demo_col1, demo_col2 = st.columns(2) + + with demo_col1: + gender_options = ["Male", "Female", "Non-binary", "Prefer not to say", "Other"] + gender_index = 0 + if advisor_data.get("gender") in gender_options: + gender_index = gender_options.index(advisor_data.get("gender")) + gender = st.selectbox("Gender", gender_options, index=gender_index) + + race_options = ["White", "Asian", "Black/African American", "Hispanic/Latino", + "Native American", "Pacific Islander", "Mixed", "Prefer not to say"] + race_index = 0 + if advisor_data.get("race") in race_options: + race_index = race_options.index(advisor_data.get("race")) + race = st.selectbox("Race/Ethnicity", race_options, index=race_index) + + with demo_col2: + nationality_options = ["American", "International", "Prefer not to say"] + nationality_index = 0 + if advisor_data.get("nationality") in nationality_options: + nationality_index = nationality_options.index(advisor_data.get("nationality")) + nationality = st.selectbox("Nationality", nationality_options, index=nationality_index) + + sexuality_options = ["Heterosexual", "LGBTQ+", "Prefer not to say"] + sexuality_index = 0 + if advisor_data.get("sexuality") in sexuality_options: + sexuality_index = sexuality_options.index(advisor_data.get("sexuality")) + sexuality = st.selectbox("Sexual Orientation", sexuality_options, index=sexuality_index) + + disability_options = ["None", "ADHD", "Anxiety", "Dyslexia", "Depression", "Autism", "Prefer not to say"] + disability_index = 0 + if advisor_data.get("disability") in disability_options: + disability_index = disability_options.index(advisor_data.get("disability")) + disability = st.selectbox("Disability Status", disability_options, index=disability_index) + + submitted = st.form_submit_button("Update Profile", type="primary", use_container_width=True) + + if submitted: + update_data = { + "userId": advisor_user_id, + "firstName": first_name, + "lastName": last_name, + "email": email, + "phone": phone, + "gender": gender, + "race": race, + "nationality": nationality, + "sexuality": sexuality, + "disability": disability if disability != "None" else None + } + + if update_advisor_data(advisor_user_id, update_data): + st.success("✅ Profile updated successfully!") + st.rerun() + else: + st.error("❌ Failed to update profile") + + +else: + st.error("Unable to load advisor data. 
Please try again later.") + st.info("If this problem persists, please contact the system administrator.") \ No newline at end of file diff --git a/app/src/pages/10_USAID_Worker_Home.py b/app/src/pages/10_USAID_Worker_Home.py deleted file mode 100644 index d7b230384c..0000000000 --- a/app/src/pages/10_USAID_Worker_Home.py +++ /dev/null @@ -1,30 +0,0 @@ -import logging -logger = logging.getLogger(__name__) - -import streamlit as st -from modules.nav import SideBarLinks - -st.set_page_config(layout = 'wide') - -# Show appropriate sidebar links for the role of the currently logged in user -SideBarLinks() - -st.title(f"Welcome USAID Worker, {st.session_state['first_name']}.") -st.write('') -st.write('') -st.write('### What would you like to do today?') - -if st.button('Predict Value Based on Regression Model', - type='primary', - use_container_width=True): - st.switch_page('pages/11_Prediction.py') - -if st.button('View the Simple API Demo', - type='primary', - use_container_width=True): - st.switch_page('pages/12_API_Test.py') - -if st.button("View Classification Demo", - type='primary', - use_container_width=True): - st.switch_page('pages/13_Classification.py') \ No newline at end of file diff --git a/app/src/pages/11_Advisor_Analytics.py b/app/src/pages/11_Advisor_Analytics.py new file mode 100644 index 0000000000..16a64c7ef9 --- /dev/null +++ b/app/src/pages/11_Advisor_Analytics.py @@ -0,0 +1,315 @@ +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +import streamlit as st +import plotly.graph_objects as go +import pandas as pd +import requests +from modules.nav import SideBarLinks + +st.set_page_config(layout='wide') +SideBarLinks() + +logger.info("Loading Advisor Analytics page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Get the user_id from session state +advisor_user_id = st.session_state.get("user_id", None) + +if advisor_user_id is None: + st.error("User not logged in. 
Please return to home and log in.") + st.stop() + +# Function to fetch advisor data from API +def fetch_advisor_data(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}") + logger.info(f"Fetching advisor data from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + logger.info(f"Received advisor data: {data}") + return data[0] if data and len(data) > 0 else None + else: + logger.warning(f"API returned status {response.status_code}") + return None + except Exception as e: + logger.error(f"Error fetching advisor data: {e}") + # Fallback data if API is not available + return { + 'userId': 31, + 'firstName': 'Sarah', + 'lastName': 'Martinez', + 'email': 's.martinez@neu.edu' + } + +# Function to fetch placement analytics data +def fetch_placement_analytics(advisor_id): + # Define fallback sample data for demonstration + fallback_data = [ + { + 'firstName': 'Charlie', 'lastName': 'Stout', 'gradYear': '2026', 'major': 'Computer Science', + 'college': 'Khoury College of Computer Sciences', 'gpa': 3.8, 'status': 'Accepted', + 'positionTitle': 'Software Engineer Intern', 'salary': 75000, 'companyName': 'TechCorp', + 'industry': 'Technology' + }, + { + 'firstName': 'Isabella', 'lastName': 'Anderson', 'gradYear': '2025', 'major': 'Business Administration', + 'college': "D'Amore-McKim School of Business", 'gpa': 3.6, 'status': 'Rejected', + 'positionTitle': 'Marketing Analyst', 'salary': 65000, 'companyName': 'MarketPro', + 'industry': 'Marketing' + }, + { + 'firstName': 'Liam', 'lastName': 'Williams', 'gradYear': '2025', 'major': 'Mechanical Engineering', + 'college': 'College of Engineering', 'gpa': 3.9, 'status': 'Accepted', + 'positionTitle': 'Engineering Intern', 'salary': 70000, 'companyName': 'EngineerCorp', + 'industry': 'Manufacturing' + }, + { + 'firstName': 'Sophia', 'lastName': 'Brown', 'gradYear': '2027', 'major': 'Data Science', + 'college': 'Khoury College of Computer Sciences', 'gpa': 3.7, 'status': 'Accepted', + 'positionTitle': 'Data Analyst', 'salary': 68000, 'companyName': 'DataFlow Analytics', + 'industry': 'Technology' + }, + { + 'firstName': 'Emma', 'lastName': 'Davis', 'gradYear': '2026', 'major': 'Finance', + 'college': "D'Amore-McKim School of Business", 'gpa': 3.5, 'status': 'Rejected', + 'positionTitle': 'Financial Analyst', 'salary': 72000, 'companyName': 'FinanceFirst', + 'industry': 'Finance' + }, + { + 'firstName': 'Noah', 'lastName': 'Miller', 'gradYear': '2025', 'major': 'Computer Science', + 'college': 'Khoury College of Computer Sciences', 'gpa': 3.4, 'status': 'Accepted', + 'positionTitle': 'Full Stack Developer', 'salary': 80000, 'companyName': 'WebSolutions', + 'industry': 'Technology' + } + ] + + try: + response = requests.get(f"{API_BASE_URL}/advisors/{advisor_id}/analytics/placement-data") + logger.info(f"Fetching placement analytics from API: status_code={response.status_code}") + if response.status_code == 200: + api_data = response.json() + logger.info(f"Received placement data: {len(api_data) if api_data else 0} records") + if api_data: # If API returns data, use it + logger.info(f"Successfully fetched {len(api_data)} placement records from API") + return api_data + else: # If API returns empty array, use fallback data + logger.info("API returned empty data, using fallback sample data") + return fallback_data + else: + # API returned error status, use fallback data + logger.warning(f"API returned status {response.status_code}, using fallback sample data") + return fallback_data + 
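+    # Any connection or parsing error below likewise falls back to the bundled sample data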
except Exception as e: + logger.error(f"Error fetching placement analytics: {e}") + logger.info("Using fallback sample data due to API error") + return fallback_data + +# Fetch data +advisor_data = fetch_advisor_data(advisor_user_id) +placement_data = fetch_placement_analytics(advisor_user_id) + + + +if advisor_data: + # Header + st.title("📊 Student Analytics Dashboard") + st.subheader(f"Welcome back, {advisor_data['firstName']}!") + + if placement_data: + # Convert to DataFrame for easier manipulation + df = pd.DataFrame(placement_data) + + # Create two-column layout: sidebar (25%) and main content (75%) + sidebar_col, main_col = st.columns([1, 3]) + + with sidebar_col: + st.markdown("### 🔍 Filter Controls") + + # Extract unique values for filters + unique_grad_years = sorted(df['gradYear'].unique().tolist()) + unique_colleges = sorted(df['college'].unique().tolist()) + unique_majors = sorted(df['major'].unique().tolist()) + unique_industries = sorted(df['industry'].unique().tolist()) + + # Filter controls with "All" option + grad_year_options = ["All"] + unique_grad_years + selected_grad_years = st.multiselect( + "Graduation Year", + options=grad_year_options, + default=["All"], + help="Select graduation years to include in analysis", + key="grad_years_filter" + ) + + college_options = ["All"] + unique_colleges + selected_colleges = st.multiselect( + "Department/College", + options=college_options, + default=["All"], + help="Select colleges/departments to include", + key="colleges_filter" + ) + + major_options = ["All"] + unique_majors + selected_majors = st.multiselect( + "Major", + options=major_options, + default=["All"], + help="Select student majors to include", + key="majors_filter" + ) + + industry_options = ["All"] + unique_industries + selected_industries = st.multiselect( + "Industry", + options=industry_options, + default=["All"], + help="Select job industries to include", + key="industries_filter" + ) + + gpa_range = st.slider( + "GPA Range", + min_value=0.0, + max_value=4.0, + value=(0.0, 4.0), + step=0.1, + help="Select GPA range for filtering students", + key="gpa_range_filter" + ) + + with main_col: + # Apply filters to data with real-time updates + # Handle "All" option for each filter + if "All" in selected_grad_years: + grad_year_filter = unique_grad_years + else: + grad_year_filter = selected_grad_years + + if "All" in selected_colleges: + college_filter = unique_colleges + else: + college_filter = selected_colleges + + if "All" in selected_majors: + major_filter = unique_majors + else: + major_filter = selected_majors + + if "All" in selected_industries: + industry_filter = unique_industries + else: + industry_filter = selected_industries + + # Apply filters to DataFrame + filtered_df = df[ + (df['gradYear'].isin(grad_year_filter)) & + (df['college'].isin(college_filter)) & + (df['major'].isin(major_filter)) & + (df['industry'].isin(industry_filter)) & + (df['gpa'] >= gpa_range[0]) & + (df['gpa'] <= gpa_range[1]) + ] + + if not filtered_df.empty: + # Summary statistics + total_records = len(filtered_df) + accepted_applications = len(filtered_df[filtered_df['status'] == 'Accepted']) + completed_coops = len(filtered_df[filtered_df['status'] == 'Completed']) + success_rate = ((accepted_applications + completed_coops) / total_records) * 100 if total_records > 0 else 0 + avg_hourly_successful = filtered_df[filtered_df['status'].isin(['Accepted', 'Completed'])]['salary'].mean() + + # Display summary statistics + stat_col1, stat_col2, stat_col3 = st.columns(3) + with 
stat_col1: + st.metric("Total Records", total_records) + with stat_col2: + st.metric("Success Rate", f"{success_rate:.1f}%") + with stat_col3: + if not pd.isna(avg_hourly_successful): + st.metric("Avg Hourly Pay (Successful)", f"${avg_hourly_successful:.2f}") + else: + st.metric("Avg Hourly Pay (Successful)", "N/A") + + st.markdown("---") + + # Create interactive scatterplot + fig = go.Figure() + + # Add successful experiences (green dots) - both Accepted applications and Completed co-ops + successful_data = filtered_df[filtered_df['status'].isin(['Accepted', 'Completed'])] + if not successful_data.empty: + fig.add_trace(go.Scatter( + x=successful_data['gpa'], + y=successful_data['salary'], + mode='markers', + marker=dict(color='green', size=10, opacity=0.7), + name='Accepted Apps & Completed Co-ops', + hovertemplate='%{customdata[0]} %{customdata[1]}
<br>' +
+                                'GPA: %{x:.2f}<br>' +
+                                'Position: %{customdata[2]}<br>' +
+                                'Company: %{customdata[3]}<br>' +
+                                'Hourly Pay: $%{y:.2f}<br>' +
+                                'Status: %{customdata[4]}',
+                    customdata=successful_data[['firstName', 'lastName', 'positionTitle', 'companyName', 'status']].values
+                ))
+
+            # Add rejected applications (red dots)
+            rejected_data = filtered_df[filtered_df['status'] == 'Rejected']
+            if not rejected_data.empty:
+                fig.add_trace(go.Scatter(
+                    x=rejected_data['gpa'],
+                    y=rejected_data['salary'],
+                    mode='markers',
+                    marker=dict(color='red', size=10, opacity=0.7),
+                    name='Rejected Applications',
+                    hovertemplate='%{customdata[0]} %{customdata[1]}<br>' +
+                                'GPA: %{x:.2f}<br>' +
+                                'Position: %{customdata[2]}<br>' +
+                                'Company: %{customdata[3]}<br>' +
+                                'Hourly Pay: $%{y:.2f}<br>
' + + 'Status: %{customdata[4]}', + customdata=rejected_data[['firstName', 'lastName', 'positionTitle', 'companyName', 'status']].values + )) + + # Update layout + fig.update_layout( + title="Student Placement Analytics: GPA vs Hourly Pay", + xaxis_title="Student GPA", + yaxis_title="Hourly Pay (USD)", + yaxis=dict(tickformat='$,.2f'), + xaxis=dict(range=[0, 4.0]), + height=600, + showlegend=True, + legend=dict( + orientation="h", + yanchor="bottom", + y=1.02, + xanchor="right", + x=1 + ) + ) + + # Display the plot + st.plotly_chart(fig, use_container_width=True) + + # Legend explanation + st.markdown(""" + **Legend:** 🟢 Green = Accepted Applications & Completed Co-ops | 🔴 Red = Rejected Applications + + **How to use:** Hover over data points to see detailed information about each application or completed co-op. + Use the filter controls on the left to focus on specific student groups or criteria. + """) + + else: + st.warning("No data matches the selected filters. Please adjust your filter criteria.") + else: + st.info("No placement data available for analysis.") + +else: + st.error("Unable to load advisor data. Please try again later.") + st.info("If this problem persists, please contact the system administrator.") \ No newline at end of file diff --git a/app/src/pages/11_Prediction.py b/app/src/pages/11_Prediction.py deleted file mode 100644 index a5a322a2f4..0000000000 --- a/app/src/pages/11_Prediction.py +++ /dev/null @@ -1,38 +0,0 @@ -import logging -logger = logging.getLogger(__name__) - -import streamlit as st -from modules.nav import SideBarLinks -import requests - -st.set_page_config(layout = 'wide') - -# Display the appropriate sidebar links for the role of the logged in user -SideBarLinks() - -st.title('Prediction with Regression') - -# create a 2 column layout -col1, col2 = st.columns(2) - -# add one number input for variable 1 into column 1 -with col1: - var_01 = st.number_input('Variable 01:', - step=1) - -# add another number input for variable 2 into column 2 -with col2: - var_02 = st.number_input('Variable 02:', - step=1) - -logger.info(f'var_01 = {var_01}') -logger.info(f'var_02 = {var_02}') - -# add a button to use the values entered into the number field to send to the -# prediction function via the REST API -if st.button('Calculate Prediction', - type='primary', - use_container_width=True): - results = requests.get(f'http://api:4000/c/prediction/{var_01}/{var_02}').json() - st.dataframe(results) - \ No newline at end of file diff --git a/app/src/pages/12_API_Test.py b/app/src/pages/12_API_Test.py deleted file mode 100644 index 74883c5a85..0000000000 --- a/app/src/pages/12_API_Test.py +++ /dev/null @@ -1,25 +0,0 @@ -import logging -logger = logging.getLogger(__name__) -import streamlit as st -import requests -from streamlit_extras.app_logo import add_logo -from modules.nav import SideBarLinks - -SideBarLinks() - -st.write("# Accessing a REST API from Within Streamlit") - -""" -Simply retrieving data from a REST api running in a separate Docker Container. - -If the container isn't running, this will be very unhappy. But the Streamlit app -should not totally die. 
-""" -data = {} -try: - data = requests.get('http://api:4000/data').json() -except: - st.write("**Important**: Could not connect to sample api, so using dummy data.") - data = {"a":{"b": "123", "c": "hello"}, "z": {"b": "456", "c": "goodbye"}} - -st.dataframe(data) diff --git a/app/src/pages/12_Advisor_Companies.py b/app/src/pages/12_Advisor_Companies.py new file mode 100644 index 0000000000..e8720e1f97 --- /dev/null +++ b/app/src/pages/12_Advisor_Companies.py @@ -0,0 +1,181 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests + +st.set_page_config(layout='wide') + +SideBarLinks() + +# API endpoint configuration +API_BASE_URL = "http://web-api:4000" +RATING_ENDPOINT = f"{API_BASE_URL}/workedatpos/company-ratings" +ALL_COMPANIES_ENDPOINT = f"{API_BASE_URL}/companyProfiles" + +st.title('Company Partnerships') + +st.markdown(""" +This page displays company partnerships sorted by their average student ratings. +Companies with higher ratings appear first, showing detailed statistics including min/max ratings and total ratings. +""") + +def fetch_company_ratings(): + """Fetch company profiles sorted by rating from the API""" + try: + response = requests.get(RATING_ENDPOINT, timeout=10) + if response.status_code == 200: + data = response.json() + return data + else: + st.error(f"Failed to fetch rating data: {response.status_code}") + st.error(f"Response: {response.text}") + return [] + except requests.exceptions.RequestException as e: + st.error(f"Error connecting to API: {str(e)}") + return [] + except Exception as e: + st.error(f"Unexpected error: {str(e)}") + return [] + +def fetch_all_companies(): + """Fetch all company profiles from the API""" + try: + response = requests.get(ALL_COMPANIES_ENDPOINT, timeout=10) + if response.status_code == 200: + data = response.json() + return data + else: + st.error(f"Failed to fetch company data: {response.status_code}") + st.error(f"Response: {response.text}") + return [] + except requests.exceptions.RequestException as e: + st.error(f"Error connecting to API: {str(e)}") + return [] + except Exception as e: + st.error(f"Unexpected error: {str(e)}") + return [] + +def display_company_ratings(): + """Display company ratings in a table format""" + # Fetch data from API + with st.spinner("Fetching company data..."): + rated_companies = fetch_company_ratings() + all_companies = fetch_all_companies() + + # Display summary statistics + if rated_companies: + total_rated = len(rated_companies) + + # Ensure ratings are converted to float and handle any None values + ratings = [] + for comp in rated_companies: + rating = comp.get('avgRating') + if rating is not None: + try: + ratings.append(float(rating)) + except (ValueError, TypeError): + continue + + if ratings: + avg_rating = sum(ratings) / len(ratings) + top_company = max(rated_companies, key=lambda x: float(x.get('avgRating', 0)) if x.get('avgRating') is not None else 0) + else: + avg_rating = 0 + top_company = rated_companies[0] if rated_companies else None + + col1, col2, col3 = st.columns(3) + with col1: + st.metric("Companies with Ratings", total_rated) + with col2: + st.metric("Overall Average Rating", f"{avg_rating:.1f}/5.0") + with col3: + company_name = top_company.get('companyName', 'N/A') if top_company else 'N/A' + st.metric("Top Rated Company", company_name) + + st.divider() + + # Add filtering options + if rated_companies: + # Since workedatpos endpoint doesn't have industry, we'll skip industry filtering + 
filtered_companies = rated_companies + else: + filtered_companies = [] + + # Display companies with ratings + if filtered_companies: + st.subheader("🏆 Highest Performing Companies") + st.markdown("*Sorted by average rating by past coops (highest to lowest)*") + + # Create a DataFrame-like display using Streamlit + col1, col2, col3, col4, col5, col6, col7 = st.columns([1, 3, 2, 1, 1, 1, 1]) + + with col1: + st.write("**Company ID**") + with col2: + st.write("**Company Name**") + with col3: + st.write("**Industry**") + with col4: + st.write("**Avg Rating**") + with col5: + st.write("**# of Ratings**") + with col6: + st.write("**Min**") + with col7: + st.write("**Max**") + + st.divider() + + for company in filtered_companies: + col1, col2, col3, col4, col5, col6, col7 = st.columns([1, 3, 2, 1, 1, 1, 1]) + + with col1: + st.write(company.get('companyProfileId', 'N/A')) + with col2: + st.write(f"**{company.get('companyName', 'N/A')}**") + with col3: + st.write(company.get('companyIndustry', 'N/A')) + with col4: + avg_rating = company.get('avgRating', 0) + if avg_rating is not None: + try: + avg_rating = float(avg_rating) + # Display rating with color coding + if avg_rating >= 4.0: + st.success(f"{avg_rating:.1f}/5.0 ⭐") + elif avg_rating >= 3.0: + st.info(f"{avg_rating:.1f}/5.0") + else: + st.warning(f"{avg_rating:.1f}/5.0") + except (ValueError, TypeError): + st.write("Invalid rating") + else: + st.write("No ratings") + with col5: + total_ratings = company.get('totalRatings', 0) + st.write(f"{total_ratings}") + with col6: + min_rating = company.get('minRating', 'N/A') + if min_rating is not None: + st.write(f"{min_rating:.1f}") + else: + st.write("N/A") + with col7: + max_rating = company.get('maxRating', 'N/A') + if max_rating is not None: + st.write(f"{max_rating:.1f}") + else: + st.write("N/A") + + st.divider() + +# Main content +try: + display_company_ratings() +except Exception as e: + st.error(f"An error occurred: {str(e)}") + logger.error(f"Error in display_company_ratings: {str(e)}") + + diff --git a/app/src/pages/13_Advisor_StudentManagement.py b/app/src/pages/13_Advisor_StudentManagement.py new file mode 100644 index 0000000000..de7a438909 --- /dev/null +++ b/app/src/pages/13_Advisor_StudentManagement.py @@ -0,0 +1,289 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests + +st.set_page_config(layout='wide') + +SideBarLinks() + +st.header("👥 Student Management") + +logger.info("Loading Advisor Student Management page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Get the user_id from session state +advisor_user_id = st.session_state.get("user_id", None) + +if advisor_user_id is None: + st.error("User not logged in. 
Please return to home and log in.") + st.stop() + +# Function to fetch advisor data from API +def fetch_advisor_data(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}") + logger.info(f"Fetching advisor data from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + return data[0] if data else None + return None + except Exception as e: + logger.error(f"Error fetching advisor data: {e}") + # Fallback data if API is not available + return { + 'userId': 31, + 'firstName': 'Sarah', + 'lastName': 'Martinez', + 'email': 's.martinez@neu.edu', + 'phone': '555-0301', + 'college': 'NEU', + 'industry': 'Academic', + 'gender': 'Female', + 'race': 'Hispanic', + 'nationality': 'American', + 'sexuality': 'Heterosexual', + 'disability': None + } + +# Function to fetch advisor's assigned students from API +def fetch_advisor_students(advisor_id): + try: + response = requests.get(f"{API_BASE_URL}/advisors/{advisor_id}/students") + logger.info(f"Fetching advisor students from API: status_code={response.status_code}") + if response.status_code == 200: + return response.json() + return [] + except Exception as e: + logger.error(f"Error fetching advisor students: {e}") + # Fallback data if API is not available + return [ + { + 'userId': 1, + 'firstName': 'Charlie', + 'lastName': 'Stout', + 'email': 'c.stout@student.edu', + 'phone': '555-0101', + 'major': 'Computer Science', + 'minor': 'Mathematics', + 'college': 'Khoury College of Computer Sciences', + 'gradYear': '2026', + 'grade': 'Junior' + }, + { + 'userId': 2, + 'firstName': 'Liam', + 'lastName': 'Williams', + 'email': 'l.williams@student.edu', + 'phone': '555-0102', + 'major': 'Business', + 'minor': 'Economics', + 'college': 'D\'Amore-McKim School of Business', + 'gradYear': '2025', + 'grade': 'Senior' + }, + { + 'userId': 3, + 'firstName': 'Sophia', + 'lastName': 'Brown', + 'email': 's.brown@student.edu', + 'phone': '555-0103', + 'major': 'Mechanical Engineering', + 'minor': 'Physics', + 'college': 'College of Engineering', + 'gradYear': '2027', + 'grade': 'Sophomore' + } + ] + +# Function to fetch student application statistics +def fetch_student_application_stats(student_id): + try: + response = requests.get(f"{API_BASE_URL}/student/{student_id}/applications/summary") + logger.info(f"Fetching student application stats from API: status_code={response.status_code}") + if response.status_code == 200: + data = response.json() + logger.info(f"Student application stats received: {data}") + return data + else: + logger.warning(f"Failed to fetch student application stats, status code: {response.status_code}") + return [] + except Exception as e: + logger.error(f"Error fetching student application stats: {e}") + return [] + +# Function to update advisor data +def update_advisor_data(advisor_id, advisor_data): + try: + response = requests.put(f"{API_BASE_URL}/advisors/{advisor_id}/profile", json=advisor_data) + logger.info(f"Updating advisor profile: status_code={response.status_code}") + return response.status_code == 200 + except Exception as e: + logger.error(f"Error updating advisor data: {e}") + return False + +# Function to update student flag status +def update_student_flag(advisor_id, student_id, flagged): + try: + response = requests.put(f"{API_BASE_URL}/advisors/{advisor_id}/students/{student_id}/flag", + json={"flagged": flagged}) + logger.info(f"Updating student flag: status_code={response.status_code}") + return response.status_code == 200 + except Exception as e: + 
logger.error(f"Error updating student flag: {e}") + return False + +# Fetch data +advisor_data = fetch_advisor_data(advisor_user_id) +advisor_students = fetch_advisor_students(advisor_user_id) + +# Display advisor information +if advisor_data: + st.write(f"**Name:** {advisor_data.get('firstName', '')} {advisor_data.get('lastName', '')}") + st.write(f"**Email:** {advisor_data.get('email', '')}") + st.write(f"**Phone:** {advisor_data.get('phone', '')}") + + +st.markdown("---") + +# Display students +if advisor_students: + st.subheader(f"Your Advisees ({len(advisor_students)} students)") + + # Search and filter functionality + search_col1, search_col2 = st.columns([2, 1]) + + with search_col1: + search_term = st.text_input("🔍 Search students by name or major", placeholder="Enter student name or major...") + + with search_col2: + grad_years = sorted(list(set([student.get('gradYear', '') for student in advisor_students if student.get('gradYear')]))) + selected_year = st.selectbox("Filter by Graduation Year", ["All"] + grad_years) + + # Filter students based on search and year + filtered_students = advisor_students + + if search_term: + filtered_students = [ + student for student in filtered_students + if search_term.lower() in f"{student.get('firstName', '')} {student.get('lastName', '')}".lower() + or search_term.lower() in student.get('major', '').lower() + ] + + if selected_year != "All": + filtered_students = [ + student for student in filtered_students + if student.get('gradYear') == selected_year + ] + + st.markdown("---") + + # Display students in cards + for i, student in enumerate(filtered_students): + # Check if student is flagged + is_flagged = student.get('flagged', False) + + # Create container with conditional styling for flagged students + if is_flagged: + with st.container(): + st.markdown(""" +
+ """, unsafe_allow_html=True) + + col1, col2, col3, col4 = st.columns([2, 2, 1.5, 0.5]) + + with col1: + st.markdown(f"**🚩 {student.get('firstName', '')} {student.get('lastName', '')}**") + st.write(f"📧 {student.get('email', '')}") + st.write(f"📱 {student.get('phone', '')}") + + with col2: + st.write(f"🎓 **Major:** {student.get('major', '')}") + if student.get('minor'): + st.write(f"📚 **Minor:** {student.get('minor', '')}") + st.write(f"🏫 **College:** {student.get('college', '')}") + st.write(f"📅 **Graduation:** {student.get('gradYear', '')} ({student.get('grade', '')})") + + with col3: + # Fetch detailed application stats for this student + app_stats = fetch_student_application_stats(student.get('userId')) + + # Create status counts + status_counts = {item.get('status', ''): item.get('ApplicationCount', 0) for item in app_stats} if app_stats else {} + under_review = status_counts.get('Under Review', 0) + submitted = status_counts.get('Submitted', 0) + rejected = status_counts.get('Rejected', 0) + + # Display detailed metrics in a compact layout + st.markdown("**Application Status:**") + metric_col1, metric_col2, metric_col3 = st.columns(3) + with metric_col1: + st.metric("📋 Review", under_review) + with metric_col2: + st.metric("📤 Submit", submitted) + with metric_col3: + st.metric("❌ Reject", rejected) + + with col4: + # Flag toggle + if st.button("🚩 Unflag", key=f"unflag_{student.get('userId')}", use_container_width=True): + if update_student_flag(advisor_user_id, student.get('userId'), False): + st.success("Student unflagged!") + st.rerun() + else: + st.error("Failed to unflag student") + + st.markdown("
", unsafe_allow_html=True) + else: + with st.container(): + col1, col2, col3, col4 = st.columns([2, 2, 1.5, 0.5]) + + with col1: + st.markdown(f"**{student.get('firstName', '')} {student.get('lastName', '')}**") + st.write(f"📧 {student.get('email', '')}") + st.write(f"📱 {student.get('phone', '')}") + + with col2: + st.write(f"🎓 **Major:** {student.get('major', '')}") + if student.get('minor'): + st.write(f"📚 **Minor:** {student.get('minor', '')}") + st.write(f"🏫 **College:** {student.get('college', '')}") + st.write(f"📅 **Graduation:** {student.get('gradYear', '')} ({student.get('grade', '')})") + + with col3: + # Fetch detailed application stats for this student + app_stats = fetch_student_application_stats(student.get('userId')) + + # Create status counts + status_counts = {item.get('status', ''): item.get('ApplicationCount', 0) for item in app_stats} if app_stats else {} + under_review = status_counts.get('Under Review', 0) + submitted = status_counts.get('Submitted', 0) + rejected = status_counts.get('Rejected', 0) + + # Display detailed metrics in a compact layout + st.markdown("**Application Status:**") + metric_col1, metric_col2, metric_col3 = st.columns(3) + with metric_col1: + st.metric("📋 Review", under_review) + with metric_col2: + st.metric("📤 Submit", submitted) + with metric_col3: + st.metric("❌ Reject", rejected) + + with col4: + # Flag toggle + if st.button("🏳️ Flag", key=f"flag_{student.get('userId')}", use_container_width=True): + if update_student_flag(advisor_user_id, student.get('userId'), True): + st.success("Student flagged!") + st.rerun() + else: + st.error("Failed to flag student") + + st.markdown("---") + +else: + st.info("No students assigned to you at this time.") \ No newline at end of file diff --git a/app/src/pages/13_Classification.py b/app/src/pages/13_Classification.py deleted file mode 100644 index be2535c49d..0000000000 --- a/app/src/pages/13_Classification.py +++ /dev/null @@ -1,57 +0,0 @@ -import logging -logger = logging.getLogger(__name__) -import streamlit as st -import pandas as pd -from sklearn import datasets -from sklearn.ensemble import RandomForestClassifier -from streamlit_extras.app_logo import add_logo -from modules.nav import SideBarLinks - -SideBarLinks() - -st.write(""" -# Simple Iris Flower Prediction App - -This example is borrowed from [The Data Professor](https://github.com/dataprofessor/streamlit_freecodecamp/tree/main/app_7_classification_iris) - -This app predicts the **Iris flower** type! 
-""") - -st.sidebar.header('User Input Parameters') - -def user_input_features(): - sepal_length = st.sidebar.slider('Sepal length', 4.3, 7.9, 5.4) - sepal_width = st.sidebar.slider('Sepal width', 2.0, 4.4, 3.4) - petal_length = st.sidebar.slider('Petal length', 1.0, 6.9, 1.3) - petal_width = st.sidebar.slider('Petal width', 0.1, 2.5, 0.2) - data = {'sepal_length': sepal_length, - 'sepal_width': sepal_width, - 'petal_length': petal_length, - 'petal_width': petal_width} - features = pd.DataFrame(data, index=[0]) - return features - -df = user_input_features() - -st.subheader('User Input parameters') -st.write(df) - -iris = datasets.load_iris() -X = iris.data -Y = iris.target - -clf = RandomForestClassifier() -clf.fit(X, Y) - -prediction = clf.predict(df) -prediction_proba = clf.predict_proba(df) - -st.subheader('Class labels and their corresponding index number') -st.write(iris.target_names) - -st.subheader('Prediction') -st.write(iris.target_names[prediction]) -#st.write(prediction) - -st.subheader('Prediction Probability') -st.write(prediction_proba) \ No newline at end of file diff --git a/app/src/pages/20_Admin_Home.py b/app/src/pages/20_Admin_Home.py deleted file mode 100644 index 0dbd0f36b4..0000000000 --- a/app/src/pages/20_Admin_Home.py +++ /dev/null @@ -1,17 +0,0 @@ -import logging -logger = logging.getLogger(__name__) - -import streamlit as st -from modules.nav import SideBarLinks -import requests - -st.set_page_config(layout = 'wide') - -SideBarLinks() - -st.title('System Admin Home Page') - -if st.button('Update ML Models', - type='primary', - use_container_width=True): - st.switch_page('pages/21_ML_Model_Mgmt.py') \ No newline at end of file diff --git a/app/src/pages/20_Employer_Home.py b/app/src/pages/20_Employer_Home.py new file mode 100644 index 0000000000..726615a2b7 --- /dev/null +++ b/app/src/pages/20_Employer_Home.py @@ -0,0 +1,179 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests + +st.set_page_config(layout = 'wide') + +SideBarLinks() + +st.title('🏢Employer Dashboard') + +logger.info("Loading Employer Home page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Get the user_id from session state (use real session state in production) +employer_user_id = 37 #st.session_state.get("user_id", 37), Default to 37 for demo + +if employer_user_id is None: + st.error("User not logged in. 
Please return to home and log in.") + st.stop() + +# We'll get the company_profile_id from the user data +company_profile_id = None + + +# Function to fetch user data from API +def fetch_user_data(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}") + logger.info(f"Fetching user data for {user_id}: status_code={response.status_code}") + + if response.status_code == 200: + data = response.json() + logger.info(f"User data received: {data}") + return data[0] if data else None + else: + logger.warning(f"Failed to fetch user data, status code: {response.status_code}") + return None + except Exception as e: + logger.error(f"Error fetching user data: {e}") + return None + +# Function to fetch company data from API +def fetch_company_data(company_profile_id): + try: + response = requests.get(f"{API_BASE_URL}/companyProfiles/{company_profile_id}") + logger.info(f"Fetching company data for {company_profile_id}: status_code={response.status_code}") + + if response.status_code == 200: + data = response.json() + logger.info(f"Company data received: {data}") + return data[0] if data else None + else: + logger.warning(f"Failed to fetch company data, status code: {response.status_code}") + return None + except Exception as e: + logger.error(f"Error fetching company data: {e}") + return None + +# Function to update user data via API +def update_user_data(user_data): + try: + response = requests.put(f"{API_BASE_URL}/users", json=user_data) + return response.status_code == 200 + except Exception as e: + logger.error(f"Error updating user data: {e}") + return False + +# Function to update company data via API +def update_company_data(company_data, company_id): + try: + # Use the existing endpoint from users routes + update_data = { + "id": company_id, + "name": company_data["name"], + "bio": company_data["bio"], + "industry": company_data["industry"], + "website_link": company_data["websiteLink"] + } + response = requests.put(f"{API_BASE_URL}/users/companyProfiles/create/{company_id}", json=update_data) + logger.info(f"Updating company data: status_code={response.status_code}") + return response.status_code == 200 + except Exception as e: + logger.error(f"Error updating company data: {e}") + return False + +# Fetch user data first to get company profile ID +user_data = fetch_user_data(employer_user_id) + +# Get company profile ID from user data +if user_data and user_data.get('companyProfileId'): + company_profile_id = user_data.get('companyProfileId') + company_data = fetch_company_data(company_profile_id) +else: + company_data = None + +if user_data: + # Header + st.subheader(f"Welcome back, {user_data['firstName']} {user_data['lastName']}!") + + # Display user info + st.info(f"👤 **Email:** {user_data.get('email', 'Not provided')} | 📞 **Phone:** {user_data.get('phone', 'Not provided')}") + + # Company Information Form + if company_data: + with st.form("company_form"): + st.subheader("Company Information") + col1 = st.columns(1)[0] + + with col1: + company_name = st.text_input("Company Name", value=company_data.get("name", "")) + bio = st.text_area("Company Description", value=company_data.get("bio", ""), height=100) + industry = st.text_input("Industry", value=company_data.get("industry", "")) + website_link = st.text_input("Website", value=company_data.get("websiteLink", "")) + + company_submitted = st.form_submit_button("Update Company Profile", type="primary", use_container_width=True) + + if company_submitted: + company_update_data = { + "name": company_name, + "bio": bio, + 
"industry": industry, + "websiteLink": website_link + } + + if update_company_data(company_update_data, company_profile_id): + st.success("✅ Company profile updated successfully!") + st.rerun() + else: + st.error("❌ Failed to update company profile") + else: + st.warning("⚠️ No company profile associated with this employer account.") + st.info("Contact your administrator to associate a company profile with your account.") + + # Personal Information Form (separate form) + with st.form("personal_form"): + st.subheader("Personal Information") + personal_col1 = st.columns(1)[0] + + with personal_col1: + first_name = st.text_input("First Name", value=user_data.get("firstName", "")) + last_name = st.text_input("Last Name", value=user_data.get("lastName", "")) + email = st.text_input("Email", value=user_data.get("email", "")) + phone = st.text_input("Phone", value=user_data.get("phone", "")) + + personal_submitted = st.form_submit_button("Update Personal Profile", type="primary", use_container_width=True) + + if personal_submitted: + personal_update_data = { + "userId": employer_user_id, + "firstName": first_name, + "lastName": last_name, + "email": email, + "phone": phone, + # Add required fields for the PUT /users endpoint + "major": user_data.get("major", ""), + "minor": user_data.get("minor", ""), + "college": user_data.get("college", ""), + "gradYear": user_data.get("gradYear", ""), + "grade": user_data.get("grade", ""), + "gender": user_data.get("gender", ""), + "race": user_data.get("race", ""), + "nationality": user_data.get("nationality", ""), + "sexuality": user_data.get("sexuality", ""), + "disability": user_data.get("disability", "") + } + + if update_user_data(personal_update_data): + st.success("✅ Personal profile updated successfully!") + st.rerun() + else: + st.error("❌ Failed to update personal profile") + +else: + st.error("Unable to load user data. Please try again later.") \ No newline at end of file diff --git a/app/src/pages/21_Employer_Postings.py b/app/src/pages/21_Employer_Postings.py new file mode 100644 index 0000000000..d6701a2695 --- /dev/null +++ b/app/src/pages/21_Employer_Postings.py @@ -0,0 +1,269 @@ +import logging +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests +from datetime import datetime, date + +st.set_page_config(layout='wide') + +SideBarLinks() + +st.title('🆕 Create Co-op Posting') + +logger.info("Loading Create Co-op Posting page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Get the user_id from session state (use real session state in production) +employer_user_id = st.session_state.get("user_id", 37) # Default to 37 for demo + +if employer_user_id is None: + st.error("User not logged in. 
Please return to home and log in.") + st.stop() + +# Function to fetch available skills +def fetch_skills(): + try: + response = requests.get(f"{API_BASE_URL}/skills") + logger.info(f"Fetching skills: status_code={response.status_code}") + + if response.status_code == 200: + data = response.json() + logger.info(f"Skills data received: {len(data)} skills") + return data + else: + logger.warning(f"Failed to fetch skills, status code: {response.status_code}") + return [] + except Exception as e: + logger.error(f"Error fetching skills: {e}") + return [] + +# Function to get next available co-op position ID +def get_next_coop_position_id(): + try: + response = requests.get(f"{API_BASE_URL}/coopPositions") + if response.status_code == 200: + positions = response.json() + if positions: + max_id = max([pos.get('coopPositionId', 0) for pos in positions]) + return max_id + 1 + else: + return 1 + else: + return 1 + except Exception as e: + logger.error(f"Error getting next position ID: {e}") + return 1 + +# Function to create co-op position +def create_coop_position(position_data): + try: + response = requests.post(f"{API_BASE_URL}/coopPositions", json=position_data) + logger.info(f"Creating co-op position: status_code={response.status_code}") + return response.status_code in (200, 201) + except Exception as e: + logger.error(f"Error creating co-op position: {e}") + return False + +# Function to link employer to position +def link_employer_to_position(employer_id, position_id): + try: + link_data = { + "employerId": employer_id, + "coopPositionId": position_id + } + response = requests.post(f"{API_BASE_URL}/createsPos", json=link_data) + logger.info(f"Linking employer to position: status_code={response.status_code}") + return response.status_code in (200, 201) + except Exception as e: + logger.error(f"Error linking employer to position: {e}") + return False + +# Function to fetch user data to get company info +def fetch_user_data(user_id): + try: + response = requests.get(f"{API_BASE_URL}/users/{user_id}") + if response.status_code == 200: + data = response.json() + return data[0] if data else None + return None + except Exception as e: + logger.error(f"Error fetching user data: {e}") + return None + +# Fetch user data and skills +user_data = fetch_user_data(employer_user_id) +available_skills = fetch_skills() + +if not user_data: + st.error("Unable to load user data. 
Please try again later.") + st.stop() + +# Header +st.subheader(f"👋 Hello, {user_data['firstName']} {user_data['lastName']}!") +st.info("Create a new co-op position for your company.") + +# Create the form +with st.form("create_coop_form"): + st.subheader("📋 Position Details") + + # Basic position information + col1, col2 = st.columns(2) + + with col1: + title = st.text_input("Position Title*") + location = st.text_input("Location*") + hourly_pay = st.number_input("Hourly Pay ($)*", min_value=0.0, value=20.0, step=0.50, format="%.2f") + industry = st.selectbox("Industry*", [ + "Technology", "Finance", "Healthcare", "Manufacturing", + "Consulting", "Education", "Marketing", "Engineering", + "Biotechnology", "Non-profit", "Other" + ], index=0) + + with col2: + start_date = st.date_input("Start Date*") + end_date = st.date_input("End Date*") + deadline = st.date_input("Application Deadline*") + desired_gpa = st.number_input("Minimum GPA", min_value=0.0, max_value=4.0, value=3.0, step=0.1, format="%.1f") + + # Description + st.subheader("📝 Position Description") + description = st.text_area( + "Job Description*", + placeholder="Describe what students can expect in this position", + height=150 + ) + + # Skills section + st.subheader("🛠️ Skills Requirements") + + skill_options = [] + skill_ids = {} + + if available_skills: + for skill in available_skills: + skill_display = f"{skill['name']} ({skill['category']})" + skill_options.append(skill_display) + skill_ids[skill_display] = skill['skillId'] + + col1, col2 = st.columns(2) + + with col1: + st.write("**Required Skills** (Must have)") + required_skills = st.multiselect( + "Select required skills", + options=skill_options, + help="Students must have these skills to be eligible" + ) + + with col2: + st.write("**Desired Skills** (Nice to have)") + desired_skills = st.multiselect( + "Select desired skills", + options=skill_options, + help="Preferred skills that would be beneficial" + ) + + if required_skills or desired_skills: + st.info(f"📋 **Selected:** {len(required_skills)} required, {len(desired_skills)} desired skills") + + # Additional requirements + st.subheader("📋 Additional Information") + additional_requirements = st.text_area( + "Additional Requirements or Preferences", + placeholder="Any other requirements, preferred majors, or additional information...", + height=100 + ) + + # Submit button + submitted = st.form_submit_button("🚀 Create Co-op Position", type="primary", use_container_width=True) + + if submitted: + # Validation + errors = [] + + if not title.strip(): + errors.append("Position title is required") + if not location.strip(): + errors.append("Location is required") + if not description.strip(): + errors.append("Job description is required") + if hourly_pay <= 0: + errors.append("Hourly pay must be greater than 0") + if start_date >= end_date: + errors.append("End date must be after start date") + if deadline >= start_date: + errors.append("Application deadline must be before start date") + + if errors: + for error in errors: + st.error(f"❌ {error}") + else: + # Get next position ID + next_position_id = get_next_coop_position_id() + + # Prepare position data + position_data = { + "coopPositionId": next_position_id, + "title": title.strip(), + "location": location.strip(), + "description": description.strip() + (f"\n\nAdditional Requirements:\n{additional_requirements.strip()}" if additional_requirements.strip() else ""), + "hourlyPay": float(hourly_pay), + "requiredSkillsId": skill_ids.get(required_skills[0]) if 
required_skills else None, + "desiredSkillsId": skill_ids.get(desired_skills[0]) if desired_skills else None, + "desiredGPA": float(desired_gpa), + "deadline": f"{deadline} 23:59:59", + "startDate": str(start_date), + "endDate": str(end_date), + "flag": False, + "industry": industry + } + + # Create the position + if create_coop_position(position_data): + # Link employer to position + if link_employer_to_position(employer_user_id, next_position_id): + st.success("🎉 Co-op position created successfully!") + st.balloons() + + # Display summary + st.subheader("📊 Position Summary") + summary_col1, summary_col2 = st.columns(2) + + with summary_col1: + st.write(f"**Position ID:** {next_position_id}") + st.write(f"**Title:** {title}") + st.write(f"**Location:** {location}") + st.write(f"**Industry:** {industry}") + st.write(f"**Hourly Pay:** ${hourly_pay:.2f}") + + # And update the summary display section: + with summary_col2: + st.write(f"**Start Date:** {start_date}") + st.write(f"**End Date:** {end_date}") + st.write(f"**Application Deadline:** {deadline}") + st.write(f"**Minimum GPA:** {desired_gpa}") + if required_skills: + if len(required_skills) == 1: + st.write(f"**Required Skill:** {required_skills[0]}") + else: + st.write(f"**Required Skills:** {required_skills[0]} (+{len(required_skills)-1} more)") + if desired_skills: + if len(desired_skills) == 1: + st.write(f"**Desired Skill:** {desired_skills[0]}") + else: + st.write(f"**Desired Skills:** {desired_skills[0]} (+{len(desired_skills)-1} more)") + + st.info("💡 Students can now view and apply to this position!") + + # Option to create another position + if st.button("➕ Create Another Position"): + st.rerun() + + else: + st.error("❌ Position created but failed to link to employer. Please contact support.") + else: + st.error("❌ Failed to create co-op position. 
Please try again.") diff --git a/app/src/pages/21_ML_Model_Mgmt.py b/app/src/pages/21_ML_Model_Mgmt.py deleted file mode 100644 index 148978c24b..0000000000 --- a/app/src/pages/21_ML_Model_Mgmt.py +++ /dev/null @@ -1,28 +0,0 @@ -import logging -logger = logging.getLogger(__name__) -import streamlit as st -from modules.nav import SideBarLinks -import requests - -st.set_page_config(layout = 'wide') - -SideBarLinks() - -st.title('App Administration Page') - -st.write('\n\n') -st.write('## Model 1 Maintenance') - -st.button("Train Model 01", - type = 'primary', - use_container_width=True) - -st.button('Test Model 01', - type = 'primary', - use_container_width=True) - -if st.button('Model 1 - get predicted value for 10, 25', - type = 'primary', - use_container_width=True): - results = requests.get('http://api:4000/c/prediction/10/25').json() - st.dataframe(results) diff --git a/app/src/pages/22_Employer_Applications.py b/app/src/pages/22_Employer_Applications.py new file mode 100644 index 0000000000..a46b2a6aad --- /dev/null +++ b/app/src/pages/22_Employer_Applications.py @@ -0,0 +1,335 @@ +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests +from datetime import datetime, date +import pandas as pd + +st.set_page_config(layout='wide') +SideBarLinks() + +logger.info("Loading Employer Applications page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Get the user_id from session state +employer_user_id = 37#st.session_state.get("user_id", None) + +if employer_user_id is None: + st.error("User not logged in. Please return to home and log in.") + st.stop() + +# For demo purposes, use employer 37 who has positions in the database +# In production, this would use the actual logged-in employer ID +if employer_user_id == 1: # If logged in as student, use employer 37 for demo + employer_user_id = 37 + +# Function to fetch employer's co-op positions +def fetch_employer_positions(employer_id): + """Fetch all co-op positions created by the employer""" + try: + response = requests.get(f"{API_BASE_URL}/employers/{employer_id}/positions") + logger.info(f"Fetching employer positions: status_code={response.status_code}") + + if response.status_code == 200: + positions = response.json() + logger.info(f"Found {len(positions)} positions for employer {employer_id}") + return positions + + return [] + except Exception as e: + logger.error(f"Error fetching employer positions: {e}") + return [] + +# Function to fetch applications with student details for a specific position +def fetch_position_applications_with_students(position_id): + """Fetch all applications with student details for a specific co-op position""" + try: + response = requests.get(f"{API_BASE_URL}/applications/{position_id}/with-students") + logger.info(f"Fetching applications with students for position {position_id}: status_code={response.status_code}") + + if response.status_code == 200: + applications = response.json() + logger.info(f"Found {len(applications)} applications for position {position_id}") + return applications + return [] + except Exception as e: + logger.error(f"Error fetching applications for position {position_id}: {e}") + return [] + +# Function to update application status +def update_application_status(application_id, new_status): + """Update the status of an application""" + try: + response = requests.put( + 
f"{API_BASE_URL}/applications/{application_id}/status", + json={"status": new_status} + ) + logger.info(f"Updating application {application_id} status to {new_status}: status_code={response.status_code}") + + if response.status_code == 200: + result = response.json() + logger.info(f"Successfully updated application status: {result}") + return True, result + else: + error_msg = response.json().get('error', 'Unknown error') + logger.error(f"Failed to update application status: {error_msg}") + return False, error_msg + + except Exception as e: + logger.error(f"Error updating application status: {e}") + return False, str(e) + +# Function to aggregate all applications for employer +def fetch_all_employer_applications(employer_id): + """Aggregate all applications across all employer positions""" + try: + employer_positions = fetch_employer_positions(employer_id) + all_applications = [] + + for position in employer_positions: + position_id = position.get('coopPositionId') + if position_id: + applications = fetch_position_applications_with_students(position_id) + + # Filter out draft applications and enrich with position details + for app in applications: + # Skip draft applications + if app.get('status', '').lower() == 'draft': + continue + + # Add position details from the position data + app['positionTitle'] = position.get('title', 'Unknown Position') + app['companyName'] = position.get('companyName', 'Unknown Company') + app['hourlyPay'] = position.get('hourlyPay', 0) + app['deadline'] = position.get('deadline', None) + app['location'] = position.get('location', 'Unknown Location') + app['industry'] = position.get('industry', 'Unknown Industry') + + # Student details are already included from the API + app['studentName'] = f"{app.get('firstName', 'Unknown')} {app.get('lastName', 'Student')}" + app['studentEmail'] = app.get('email', 'unknown@email.com') + app['studentMajor'] = app.get('major', 'Unknown Major') + app['gradYear'] = app.get('gradYear', 'Unknown') + + all_applications.append(app) + + logger.info(f"Total applications found: {len(all_applications)}") + return all_applications + + except Exception as e: + logger.error(f"Error aggregating employer applications: {e}") + return [] + +# Initialize session state for refresh trigger +if 'refresh_applications' not in st.session_state: + st.session_state.refresh_applications = False + +# Fetch all applications for the employer +all_applications = fetch_all_employer_applications(employer_user_id) + +# Reset refresh trigger +if st.session_state.refresh_applications: + st.session_state.refresh_applications = False + st.rerun() + +# Page header +st.title("📋 Application Management") +st.subheader("Manage applications for your co-op positions") + +if not all_applications: + st.info("No applications found for your posted positions.") + st.markdown(""" + **Possible reasons:** + - No co-op positions have been posted yet + - No students have applied to your positions + - Applications are still being processed + + Please check back later or contact support if you believe this is an error. 
+ """) + st.stop() + +# Convert to DataFrame for easier manipulation +df = pd.DataFrame(all_applications) + +# Summary statistics +st.markdown("### 📊 Application Summary") +col1, col2, col3, col4 = st.columns(4) + +with col1: + st.metric("Total Applications", len(df)) + +with col2: + submitted_count = len(df[df['status'] == 'Submitted']) if 'status' in df.columns else 0 + st.metric("Submitted", submitted_count) + +with col3: + under_review_count = len(df[df['status'] == 'Under Review']) if 'status' in df.columns else 0 + st.metric("Under Review", under_review_count) + +with col4: + accepted_count = len(df[df['status'] == 'Accepted']) if 'status' in df.columns else 0 + st.metric("Accepted", accepted_count) + +st.markdown("---") + +# Filters and search +st.markdown("### 🔍 Filter Applications") +filter_col1, filter_col2, filter_col3 = st.columns(3) + +with filter_col1: + # Status filter + status_options = ["All"] + list(df['status'].unique()) if 'status' in df.columns else ["All"] + selected_status = st.selectbox("Filter by Status", status_options) + +with filter_col2: + # Position filter + position_options = ["All"] + list(df['positionTitle'].unique()) if 'positionTitle' in df.columns else ["All"] + selected_position = st.selectbox("Filter by Position", position_options) + +with filter_col3: + # Search by student name or major + search_term = st.text_input("Search by Student Name/Major", placeholder="Enter search term...") + +# Apply filters +filtered_df = df.copy() + +if selected_status != "All": + filtered_df = filtered_df[filtered_df['status'] == selected_status] + +if selected_position != "All": + filtered_df = filtered_df[filtered_df['positionTitle'] == selected_position] + +if search_term: + search_mask = ( + filtered_df['studentName'].str.contains(search_term, case=False, na=False) | + filtered_df['studentMajor'].str.contains(search_term, case=False, na=False) + ) + filtered_df = filtered_df[search_mask] + +st.markdown("---") + +# Display applications +st.markdown(f"### 📋 Applications ({len(filtered_df)} found)") + +if filtered_df.empty: + st.info("No applications match your current filters.") +else: + # Display applications in cards + for idx, application in filtered_df.iterrows(): + with st.container(): + # Create card layout + card_col1, card_col2, card_col3, card_col4 = st.columns([2, 2, 2, 1]) + + with card_col1: + st.markdown(f"**👤 {application.get('studentName', 'Unknown Student')}**") + st.write(f"📧 {application.get('studentEmail', 'No email')}") + st.write(f"🎓 {application.get('studentMajor', 'Unknown Major')}") + st.write(f"📅 Grad Year: {application.get('gradYear', 'Unknown')}") + if 'gpa' in application and application['gpa']: + st.write(f"⭐ GPA: {application['gpa']}") + + with card_col2: + st.markdown(f"**💼 {application.get('positionTitle', 'Unknown Position')}**") + st.write(f"🏢 {application.get('companyName', 'Unknown Company')}") + st.write(f"📍 {application.get('location', 'Unknown Location')}") + st.write(f"💰 ${application.get('hourlyPay', 0)}/hour") + if application.get('deadline'): + st.write(f"⏰ Deadline: {application['deadline']}") + + with card_col3: + # Application details + app_date = application.get('dateTimeApplied', 'Unknown Date') + if app_date != 'Unknown Date': + try: + # Format the date if it's a valid datetime string + if 'GMT' in str(app_date): + # Handle GMT format from API + formatted_date = datetime.strptime(app_date.split(' GMT')[0], '%a, %d %b %Y %H:%M:%S').strftime('%Y-%m-%d') + else: + formatted_date = 
datetime.fromisoformat(str(app_date).replace('Z', '+00:00')).strftime('%Y-%m-%d') + st.write(f"📅 Applied: {formatted_date}") + except: + st.write(f"📅 Applied: {app_date}") + else: + st.write(f"📅 Applied: {app_date}") + + # Status with color coding + status = application.get('status', 'Unknown') + if status == 'Accepted': + st.success(f"✅ {status}") + elif status == 'Rejected': + st.error(f"❌ {status}") + elif status == 'Under Review': + st.warning(f"👁️ {status}") + else: + st.info(f"📝 {status}") + + # Resume indicator + if application.get('resume'): + st.write("📄 Resume: Available") + else: + st.write("📄 Resume: Not provided") + + # Cover letter excerpt + cover_letter = application.get('coverLetter', '') + if cover_letter: + excerpt = cover_letter[:100] + "..." if len(cover_letter) > 100 else cover_letter + st.write(f"📝 Cover Letter: {excerpt}") + + with card_col4: + # Action buttons based on application status + status = application.get('status', 'Unknown') + application_id = application.get('applicationId', idx) + + # Accept/Reject buttons for actionable applications + if status in ['Submitted', 'Under Review']: + col_accept, col_reject = st.columns(2) + + with col_accept: + if st.button("✅", key=f"accept_{application_id}", use_container_width=True, type="primary"): + with st.spinner("Accepting application..."): + success, result = update_application_status(application_id, "Accepted") + if success: + st.success("Application accepted!") + st.session_state.refresh_applications = True + st.rerun() + else: + st.error(f"Failed to accept application: {result}") + + with col_reject: + if st.button("❌", key=f"reject_{application_id}", use_container_width=True): + with st.spinner("Rejecting application..."): + success, result = update_application_status(application_id, "Rejected") + if success: + st.success("Application rejected!") + st.session_state.refresh_applications = True + st.rerun() + else: + st.error(f"Failed to reject application: {result}") + + # View Student Profile button + st.markdown("---") + if st.button(f"👤 View Profile", key=f"profile_{application_id}", use_container_width=True): + # Set the selected student ID in session state + st.session_state["selected_student_id"] = application.get('studentId', None) + st.session_state["selected_application_id"] = application_id + + # Navigate to candidate profile page + st.switch_page("pages/23_Employer_Candidates.py") + + st.markdown("---") + +# Additional information +st.markdown("### ℹ️ Need Help?") +st.info(""" +**Tips for managing applications:** +- Use the filters above to focus on specific types of applications +- Click "View Profile" to see detailed student information +- Applications are automatically updated when students submit new ones +- Contact support if you notice any issues with application data +""") \ No newline at end of file diff --git a/app/src/pages/23_Employer_Candidates.py b/app/src/pages/23_Employer_Candidates.py new file mode 100644 index 0000000000..32fe8e3edc --- /dev/null +++ b/app/src/pages/23_Employer_Candidates.py @@ -0,0 +1,206 @@ +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +import streamlit as st +from modules.nav import SideBarLinks +import requests + +st.set_page_config(layout='wide') +SideBarLinks() + +logger.info("Loading Employer Candidates page") + +# API configuration +API_BASE_URL = "http://web-api:4000" + +# Function to fetch student details +def fetch_student_details(student_id): + 
"""Fetch student information from the API""" + try: + response = requests.get(f"{API_BASE_URL}/users/{student_id}") + logger.info(f"Fetching student details for {student_id}: status_code={response.status_code}") + + if response.status_code == 200: + student_data = response.json() + logger.info(f"Student data received: {student_data}") + return student_data[0] if student_data else None + else: + logger.warning(f"Failed to fetch student details, status code: {response.status_code}") + return None + except Exception as e: + logger.error(f"Error fetching student details: {e}") + return None + +# Function to fetch application details +def fetch_application_details(application_id): + """Fetch application information from the API""" + try: + response = requests.get(f"{API_BASE_URL}/applications/{application_id}/details") + logger.info(f"Fetching application details for {application_id}: status_code={response.status_code}") + + if response.status_code == 200: + application_data = response.json() + logger.info(f"Application data received: {application_data}") + return application_data + else: + logger.warning(f"Failed to fetch application details, status code: {response.status_code}") + return None + except Exception as e: + logger.error(f"Error fetching application details: {e}") + return None + +# Check if a student was selected from the applications page +selected_student_id = st.session_state.get("selected_student_id", None) +selected_application_id = st.session_state.get("selected_application_id", None) + +st.title('👤 Student Profile') + +if selected_student_id: + # Fetch student data and application details + with st.spinner("Loading student profile..."): + student_data = fetch_student_details(selected_student_id) + application_data = None + if selected_application_id: + application_data = fetch_application_details(selected_application_id) + + if student_data: + # Display student name in header + student_name = f"{student_data.get('firstName', 'Unknown')} {student_data.get('lastName', 'Student')}" + st.subheader(f"👤 {student_name}") + + # Display application context if available + if selected_application_id: + st.info(f"📋 Viewing profile from Application ID: {selected_application_id}") + + # Student Information Section + st.markdown("### 📊 Student Information") + + col1, col2 = st.columns(2) + + with col1: + st.markdown("**Personal Details:**") + st.write(f"• **Name:** {student_name}") + st.write(f"• **Email:** {student_data.get('email', 'Not provided')}") + st.write(f"• **Major:** {student_data.get('major', 'Not specified')}") + if student_data.get('minor'): + st.write(f"• **Minor:** {student_data.get('minor')}") + st.write(f"• **Graduation Year:** {student_data.get('gradYear', 'Not specified')}") + st.write(f"• **Grade Level:** {student_data.get('grade', 'Not specified')}") + st.write(f"• **College:** {student_data.get('college', 'Not specified')}") + if student_data.get('phone'): + st.write(f"• **Phone:** {student_data.get('phone')}") + + with col2: + st.markdown("**Demographics & Additional Info:**") + if student_data.get('gender'): + st.write(f"• **Gender:** {student_data.get('gender')}") + if student_data.get('race'): + st.write(f"• **Race:** {student_data.get('race')}") + if student_data.get('nationality'): + st.write(f"• **Nationality:** {student_data.get('nationality')}") + if student_data.get('sexuality'): + st.write(f"• **Sexuality:** {student_data.get('sexuality')}") + if student_data.get('disability'): + st.write(f"• **Disability:** {student_data.get('disability')}") + + # 
Application Context Section (if available) + if selected_application_id and application_data: + st.markdown("### 📋 Application Details") + + col3, col4 = st.columns(2) + + with col3: + st.markdown("**Application Information:**") + st.write(f"• **Application ID:** {application_data.get('applicationId', 'N/A')}") + + # Status with color coding + status = application_data.get('status', 'Unknown') + if status == 'Accepted': + st.success(f"• **Status:** ✅ {status}") + elif status == 'Rejected': + st.error(f"• **Status:** ❌ {status}") + elif status == 'Under Review': + st.warning(f"• **Status:** 👁️ {status}") + else: + st.info(f"• **Status:** 📝 {status}") + + # Format date applied + date_applied = application_data.get('dateTimeApplied', 'Unknown') + if date_applied != 'Unknown': + try: + from datetime import datetime + formatted_date = datetime.strptime(date_applied.split(' GMT')[0], '%a, %d %b %Y %H:%M:%S').strftime('%B %d, %Y at %I:%M %p') + st.write(f"• **Date Applied:** {formatted_date}") + except: + st.write(f"• **Date Applied:** {date_applied}") + else: + st.write(f"• **Date Applied:** {date_applied}") + + # Application-specific GPA + if application_data.get('gpa'): + st.write(f"• **GPA (from application):** {application_data.get('gpa')}") + + with col4: + st.markdown("**Position Applied For:**") + st.write(f"• **Position:** {application_data.get('positionTitle', 'Unknown Position')}") + st.write(f"• **Location:** {application_data.get('location', 'Unknown Location')}") + st.write(f"• **Hourly Pay:** ${application_data.get('hourlyPay', 0)}/hour") + st.write(f"• **Industry:** {application_data.get('industry', 'Unknown Industry')}") + + # Application deadline + deadline = application_data.get('deadline', 'Unknown') + if deadline != 'Unknown': + try: + formatted_deadline = datetime.strptime(deadline.split(' GMT')[0], '%a, %d %b %Y %H:%M:%S').strftime('%B %d, %Y') + st.write(f"• **Application Deadline:** {formatted_deadline}") + except: + st.write(f"• **Application Deadline:** {deadline}") + + # Documents section + st.markdown("### 📄 Application Documents") + + doc_col1, doc_col2 = st.columns(2) + + with doc_col1: + st.markdown("**Resume:**") + if application_data.get('resume'): + with st.expander("View Resume Content"): + st.text_area("Resume", application_data.get('resume'), height=200, disabled=True) + else: + st.write("No resume provided") + + with doc_col2: + st.markdown("**Cover Letter:**") + if application_data.get('coverLetter'): + with st.expander("View Cover Letter"): + st.text_area("Cover Letter", application_data.get('coverLetter'), height=200, disabled=True) + else: + st.write("No cover letter provided") + + elif selected_application_id and not application_data: + st.markdown("### 📋 Application Context") + st.warning("⚠️ Unable to load application details. The application may not exist or there was an error fetching the data.") + st.write(f"Application ID: {selected_application_id}") + + else: + st.error("❌ Unable to load student profile. 
The student may not exist or there was an error fetching the data.") + st.write("Please try again or contact support if the issue persists.") + + st.markdown("---") + + # Navigation back to applications + if st.button("← Back to Applications", use_container_width=True): + # Clear the selected student from session state + if "selected_student_id" in st.session_state: + del st.session_state["selected_student_id"] + if "selected_application_id" in st.session_state: + del st.session_state["selected_application_id"] + + st.switch_page("pages/22_Employer_Applications.py") + +else: + st.info("No student selected. Please navigate from the Application Management page to view a specific student profile.") + + if st.button("Go to Application Management", use_container_width=True): + st.switch_page("pages/22_Employer_Applications.py") \ No newline at end of file diff --git a/app/src/pages/30_Admin_Home.py b/app/src/pages/30_Admin_Home.py new file mode 100644 index 0000000000..d44e3df9fc --- /dev/null +++ b/app/src/pages/30_Admin_Home.py @@ -0,0 +1,77 @@ +import os, requests, pandas as pd, streamlit as st +from modules.nav import SideBarLinks + +# ----- Config ----- +BASE_API = os.getenv("BASE_API", "http://web-api:4000") +COOP_API = f"{BASE_API}/coopPositions" +DEI_API = f"{BASE_API}/api/dei" # ok if not registered; handled below +TIMEOUT = 10 + +st.set_page_config(page_title="Admin • Coopalytics", layout="wide", initial_sidebar_state="expanded") +SideBarLinks() +st.title("⚙️ System Admin Home Page") + +# ----- Helpers ----- +def get_json(url): + r = requests.get(url, timeout=TIMEOUT); r.raise_for_status(); return r.json() + +# ----- Load data (safe fallbacks) ----- +pending_df = pd.DataFrame() +employers_df = pd.DataFrame() +dei_gender = pd.DataFrame() + +try: + pending_df = pd.DataFrame(get_json(f"{COOP_API}/pending")) +except Exception: + pending_df = pd.DataFrame() + +try: + employers_df = pd.DataFrame(get_json(f"{COOP_API}/employerJobCounts")) +except Exception: + employers_df = pd.DataFrame() + +try: + dei_gender = pd.DataFrame(get_json(f"{DEI_API}/representation/gender")) +except Exception: + dei_gender = pd.DataFrame() + +# ----- KPIs ----- +c1, c2, c3, c4 = st.columns(4) +c1.metric("Pending Postings", 0 if pending_df.empty else len(pending_df)) +active_employers = 0 if employers_df.empty else employers_df[employers_df["numJobs"] > 0]["employerId"].nunique() +c2.metric("Active Employers", active_employers) +total_jobs = 0 if employers_df.empty else int(employers_df["numJobs"].sum()) +c3.metric("Total Jobs Posted", total_jobs) + +st.divider() + +# ----- Quick links ----- +st.subheader("🚀 Quick Links") +col1, col2, col3 = st.columns(3) +with col1: + st.page_link("pages/32_Admin_Postings.py", label="Review Job Postings", icon="✅") +with col2: + st.page_link("pages/31_Admin_Employers.py", label="Manage Employers", icon="🏢") +with col3: + st.page_link("pages/33_Admin_DEI.py", label="DEI Metrics", icon="🌍") + +st.divider() + +# ----- Tables (preview) ----- +left, right = st.columns(2, gap="large") + +with left: + st.subheader("📌 Pending (Top 10)") + if pending_df.empty: + st.info("No pending positions.") + else: + show = pending_df[["coopPositionId","title","companyName","location","deadline","hourlyPay","industry"]].copy() + st.dataframe(show.head(10), use_container_width=True) + +with right: + st.subheader("🏢 Employers by Job Count") + if employers_df.empty: + st.info("No employer data.") + else: + show_e = 
employers_df[["employerId","firstName","lastName","companyName","numJobs"]].sort_values("numJobs", ascending=False) + st.dataframe(show_e.head(10), use_container_width=True) \ No newline at end of file diff --git a/app/src/pages/31_Admin_Employers.py b/app/src/pages/31_Admin_Employers.py new file mode 100644 index 0000000000..d8a0f73b25 --- /dev/null +++ b/app/src/pages/31_Admin_Employers.py @@ -0,0 +1,79 @@ +import os, requests, pandas as pd, streamlit as st +from modules.nav import SideBarLinks + +# ---- Config ---- +BASE_API = os.getenv("BASE_API", "http://web-api:4000") +COOP_API = f"{BASE_API}/coopPositions" +TIMEOUT = 10 + +st.set_page_config(page_title="Employer Accounts • Coopalytics", layout="wide") +SideBarLinks() +st.title("🏢 Employer Accounts") + +# ---- Helpers ---- +def get_json(url): + r = requests.get(url, timeout=TIMEOUT); r.raise_for_status(); return r.json() + + +# ---- Load data ---- +try: + counts_df = pd.DataFrame(get_json(f"{COOP_API}/employerJobCounts")) +except Exception as e: + counts_df = pd.DataFrame() + st.error(f"Could not load employer job counts: {e}") + +# ---- Top stats ---- +col1, col2, col3 = st.columns(3) +total_employers = counts_df["employerId"].nunique() if not counts_df.empty else 0 +active_employers = counts_df[counts_df["numJobs"] > 0]["employerId"].nunique() if not counts_df.empty else 0 +zero_job_employers = total_employers - active_employers + +col1.metric("Total Employers", total_employers) +col2.metric("Active (≥1 job)", active_employers) +col3.metric("No Jobs Yet", zero_job_employers) + +st.divider() + +# ---- Filters / search ---- +with st.container(): + fcol1, fcol2 = st.columns([3,1]) + query = fcol1.text_input("Search employers (name/company)", value="") + sort_by = fcol2.selectbox("Sort by", ["numJobs ↓","numJobs ↑","lastName A→Z","company A→Z"], index=0) + +view = counts_df.copy() +if not view.empty: + # compose display name and filter + view["employerName"] = (view["firstName"].fillna("") + " " + view["lastName"].fillna("")).str.strip() + if query: + q = query.lower() + view = view[ + view["employerName"].str.lower().str.contains(q, na=False) | + view["companyName"].str.lower().str.contains(q, na=False) + ] + #view = view[view["numJobs"] >= min_jobs] + + # sorting + if sort_by == "numJobs ↓": + view = view.sort_values(["numJobs","lastName","firstName"], ascending=[False,True,True]) + elif sort_by == "numJobs ↑": + view = view.sort_values(["numJobs","lastName","firstName"], ascending=[True,True,True]) + elif sort_by == "lastName A→Z": + view = view.sort_values(["lastName","firstName"]) + else: # company A→Z + view = view.sort_values(["companyName","lastName","firstName"]) + +st.subheader("Accounts") +if view.empty: + st.info("No employers match your filters.") +else: + show = view[["employerId","employerName","companyName","numJobs"]].rename( + columns={ + "employerId":"Employer ID", + "employerName":"Employer", + "companyName":"Company", + "numJobs":"# Jobs" + } + ) + st.dataframe(show, use_container_width=True) + +st.caption("Data source: /api/coopPositions/employerJobCounts") \ No newline at end of file diff --git a/app/src/pages/32_Admin_Postings.py b/app/src/pages/32_Admin_Postings.py new file mode 100644 index 0000000000..8d9306483d --- /dev/null +++ b/app/src/pages/32_Admin_Postings.py @@ -0,0 +1,195 @@ +import os, requests, pandas as pd, streamlit as st +from modules.nav import SideBarLinks + +# ---- Config ---- +BASE_API = os.getenv("BASE_API", "http://web-api:4000") +COOP_API = f"{BASE_API}/coopPositions" +TIMEOUT = 10 + 
+st.set_page_config(page_title="Review Job Postings • Coopalytics", layout="wide", initial_sidebar_state="expanded") +SideBarLinks() +st.title("📝 Review Job Postings") + +# ---- Helpers ---- +def get_json(url): + r = requests.get(url, timeout=TIMEOUT); r.raise_for_status(); return r.json() + +def put_json(url, payload=None): + r = requests.put(url, json=payload or {}, timeout=TIMEOUT); r.raise_for_status(); return r.json() + +def delete_json(url): + r = requests.delete(url, timeout=TIMEOUT); r.raise_for_status(); return r.json() + +def flag_json(pos_id, value: int): + r = requests.put(f"{COOP_API}/{pos_id}/flag/{value}", timeout=TIMEOUT); r.raise_for_status(); return r.json() + +def unflag_json(pos_id): + r = requests.put(f"{COOP_API}/{pos_id}/unflag", timeout=TIMEOUT); r.raise_for_status(); return r.json() + +# ---- Load pending ---- +try: + pending = pd.DataFrame(get_json(f"{COOP_API}/pending")) +except Exception as e: + st.error(f"Could not load pending positions: {e}") + pending = pd.DataFrame() + +# ---- Top bar (metrics + filters) ---- +c1, c2, c3, c4 = st.columns([1,1,2,2]) + +# Calculate metrics from the data +if not pending.empty: + total_positions = len(pending) + flagged_count = len(pending[pending['flag'] == 1]) if 'flag' in pending.columns else 0 + approved_count = len(pending[pending['flag'] == 0]) if 'flag' in pending.columns else total_positions +else: + total_positions = flagged_count = approved_count = 0 + +c1.metric("Total Positions", total_positions) +c2.metric("🚩 Flagged", flagged_count, delta=f"-{approved_count} approved") + +try: + avg_pay = pd.DataFrame(get_json(f"{COOP_API}/industryAveragePay")) + c3.metric("Industries", 0 if avg_pay.empty else len(avg_pay)) +except Exception: + c3.metric("Industries", "—") + +# Filters +status_filter = c4.selectbox("Status Filter", ["All", "Approved", "Flagged"], index=0) +q = st.text_input("🔍 Search title/company/location", "") +industry_filter = st.selectbox( + "Industry filter", + ["All"] + (sorted(pending["industry"].dropna().unique().tolist()) if not pending.empty else ["All"]), + index=0 +) + +# Apply filters +view = pending.copy() +if not view.empty: + view["companyName"] = view.get("companyName", "") + + # Add flag status column for display + if 'flag' in view.columns: + view["Status"] = view["flag"].apply(lambda x: "🚩 Flagged" if x == 1 else "✅ Approved") + else: + view["Status"] = "✅ Approved" + + # Apply search filter + if q: + ql = q.lower() + view = view[ + view["title"].str.lower().str.contains(ql, na=False) | + view["companyName"].astype(str).str.lower().str.contains(ql, na=False) | + view["location"].str.lower().str.contains(ql, na=False) + ] + + # Apply status filter + if status_filter == "Approved": + view = view[view["flag"] == 0] if 'flag' in view.columns else view + elif status_filter == "Flagged": + view = view[view["flag"] == 1] if 'flag' in view.columns else view.iloc[0:0] # Empty dataframe + + # Apply industry filter + if industry_filter != "All": + view = view[view["industry"] == industry_filter] + +st.divider() + +# ---- Table + actions ---- +left, right = st.columns([2.2, 1]) + +with left: + st.subheader("📌 Co-op Positions Management") + if view.empty: + st.info("No positions match your filters.") + else: + # Prepare display columns with status + display_columns = ["coopPositionId", "Status", "title", "companyName", "location", "hourlyPay", "deadline", "industry"] + available_columns = [col for col in display_columns if col in view.columns] + + show = 
view[available_columns].sort_values(["deadline","coopPositionId"], ascending=[True, False]) + + # Style the dataframe with colors + def style_status(val): + if "Flagged" in str(val): + return 'background-color: #ffebee; color: #c62828' # Light red background, dark red text + elif "Approved" in str(val): + return 'background-color: #e8f5e8; color: #2e7d32' # Light green background, dark green text + return '' + + if "Status" in show.columns: + styled_df = show.style.applymap(style_status, subset=['Status']) + st.dataframe(styled_df, use_container_width=True, height=420) + else: + st.dataframe(show, use_container_width=True, height=420) + +with right: + st.subheader("⚡ Quick Actions") + pos_id = st.number_input("Position ID", min_value=0, step=1, value=0) + a1, a2 = st.columns(2) + a3, a4 = st.columns(2) + + if a1.button("✅ Approve", type="primary", use_container_width=True, disabled=pos_id<=0): + with st.spinner(f"Approving position {int(pos_id)}..."): + try: + result = put_json(f"{COOP_API}/{int(pos_id)}/approve") + if result.get("ok"): + st.success(f"✅ {result.get('message', f'Position {int(pos_id)} approved successfully')}") + st.info("Position is now visible to students and available for applications.") + else: + st.warning(f"⚠️ {result.get('message', f'Position {int(pos_id)} was already approved')}") + st.rerun() + except Exception as e: + st.error(f"❌ Approve failed: {e}") + + if a2.button("🗑️ Delete", use_container_width=True, disabled=pos_id<=0): + with st.spinner(f"Deleting position {int(pos_id)}..."): + try: + result = delete_json(f"{COOP_API}/{int(pos_id)}") + if result.get("ok"): + st.success(f"🗑️ {result.get('message', f'Position {int(pos_id)} deleted successfully')}") + st.info("Position has been permanently removed from the system.") + else: + st.error(f"❌ {result.get('error', 'Delete failed')}") + st.rerun() + except Exception as e: + st.error(f"❌ Delete failed: {e}") + + if a3.button("🚩 Flag", use_container_width=True, disabled=pos_id<=0): + with st.spinner(f"Flagging position {int(pos_id)}..."): + try: + result = flag_json(int(pos_id), 1) + if result.get("ok"): + st.success(f"🚩 {result.get('message', f'Position {int(pos_id)} flagged successfully')}") + st.info("Position is now hidden from students and marked for review.") + else: + st.error(f"❌ {result.get('error', 'Flag failed')}") + st.rerun() + except Exception as e: + st.error(f"❌ Flag failed: {e}") + + if a4.button("✅ Unflag", use_container_width=True, disabled=pos_id<=0): + with st.spinner(f"Unflagging position {int(pos_id)}..."): + try: + result = unflag_json(int(pos_id)) + if result.get("ok"): + st.success(f"✅ {result.get('message', f'Position {int(pos_id)} unflagged successfully')}") + st.info("Position is now approved and visible to students.") + else: + st.error(f"❌ {result.get('error', 'Unflag failed')}") + st.rerun() + except Exception as e: + st.error(f"❌ Unflag failed: {e}") + +st.divider() + +# ---- Industry averages (optional context) ---- +st.subheader("💸 Industry Average Hourly Pay") +try: + if 'avg_pay' not in locals(): + avg_pay = pd.DataFrame(get_json(f"{COOP_API}/industryAveragePay")) + if avg_pay.empty: + st.caption("No data.") + else: + st.dataframe(avg_pay.rename(columns={"industry":"Industry","industryAvgHourlyPay":"Avg $/hr"}), use_container_width=True) +except Exception as e: + st.caption(f"Could not load averages: {e}") diff --git a/app/src/pages/33_Admin_DEI.py b/app/src/pages/33_Admin_DEI.py new file mode 100644 index 0000000000..69960d5539 --- /dev/null +++ b/app/src/pages/33_Admin_DEI.py 
@@ -0,0 +1,91 @@ +# Admin_DEI.py +import os, requests, pandas as pd, streamlit as st +from modules.nav import SideBarLinks + +st.set_page_config(page_title="DEI Metrics • Coopalytics", layout="wide") +SideBarLinks() +st.title("DEI Metrics") + +BASE_API = os.getenv("BASE_API", "http://web-api:4000") + +def fetch_json(path): + r = requests.get(f"{BASE_API}{path}", timeout=10) + r.raise_for_status() + return r.json() + +def normalize_dei(payload): + """ + Accept either: + - dict: {"gender":[{"label","count"}, ...], "race":[...], ...} + - list: [{"metric":"gender","label":"Female","count":12}, ...] + and always return the dict shape. + """ + if isinstance(payload, dict): + return payload + + if isinstance(payload, list): + out = {"gender": [], "race": [], "nationality": [], "disability": []} + for row in payload: + metric = (row.get("metric") or "").strip().lower() + label = row.get("label") + count = row.get("count", 0) + if metric in out and label is not None: + try: + count = int(count) + except Exception: + pass + out[metric].append({"label": label, "count": count}) + return out + + # Unknown payload + return {} + +# Try a single summary endpoint first; fall back to per-dimension endpoints if needed +data = {} +try: + raw = fetch_json("/api/dei/metrics") + data = normalize_dei(raw) +except Exception: + # Fallback to separate endpoints if your API exposes them + for dim in ["gender", "race", "nationality", "disability"]: + try: + data[dim] = fetch_json(f"/api/dei/{dim}") + except Exception: + data[dim] = [] + +# Only keep non-empty lists +dims = [k for k, v in data.items() if isinstance(v, list) and len(v) > 0] +if not dims: + st.info("No DEI data available.") + st.stop() + +# Selector +colA, colB = st.columns([2, 1]) +with colA: + dim = st.selectbox("Select metric", dims, index=0) +with colB: + show_table = st.toggle("Show table", value=False) + +# Prep dataframe +df = pd.DataFrame(data[dim]) +if "label" not in df.columns or "count" not in df.columns: + st.error(f"Endpoint for '{dim}' must return items with 'label' and 'count'.") + st.stop() + +df = df.groupby("label", as_index=False)["count"].sum().sort_values("count", ascending=False) +total = int(df["count"].sum()) + +# KPIs +k1, k2, k3 = st.columns(3) +k1.metric("Total records", f"{total}") +k2.metric("Distinct categories", f"{df.shape[0]}") +coverage = 100 if total > 0 else 0 +k3.metric("Coverage (%)", f"{coverage:.0f}%") + +# Chart + (optional) table +st.subheader(dim.capitalize()) +st.bar_chart(df.set_index("label")["count"]) + +if show_table: + st.dataframe(df, use_container_width=True) + diff --git a/app/src/pages/34_Admin_Analytics.py b/app/src/pages/34_Admin_Analytics.py new file mode 100644 index 0000000000..3d17949993 --- /dev/null +++ b/app/src/pages/34_Admin_Analytics.py @@ -0,0 +1,216 @@ +import streamlit as st +import requests +import pandas as pd +import plotly.express as px +import plotly.graph_objects as go +from datetime import datetime +import time + +import logging +logging.basicConfig(format='%(filename)s:%(lineno)s:%(levelname)s -- %(message)s', level=logging.INFO) +logger = logging.getLogger(__name__) + +from modules.nav import SideBarLinks + +st.set_page_config(layout='wide', page_title="Admin Analytics Dashboard") +SideBarLinks() + +# Custom CSS for professional styling +st.markdown(""" + +""", unsafe_allow_html=True) + +# Header Section +st.markdown(""" +
+<div>
+    <h1>📊 Admin Analytics Dashboard</h1>
+    <p>Comprehensive overview of system metrics and user analytics</p>
+</div>
+""", unsafe_allow_html=True) + +# Test the API endpoints +test_url = "http://web-api:4000" + +# User Metrics Section +st.markdown('

<h2>👥 User Analytics</h2>

', unsafe_allow_html=True) + +# Create three columns for metrics +col1, col2, col3 = st.columns(3) + +with col1: + # Fetch student count first + try: + response = requests.get(f"{test_url}/users/count/students", timeout=5) + if response.status_code == 200: + data = response.json() + student_count = data.get('student_count', 0) + else: + student_count = 0 + except Exception as e: + student_count = 0 + + st.markdown(f""" +
+    <div>
+        <div>👨‍🎓 Students</div>
+        <div>{student_count}</div>
+        <div>Active in system</div>
+    </div>
+ """, unsafe_allow_html=True) + +with col2: + # Fetch advisor count first + try: + response = requests.get(f"{test_url}/users/count/advisors", timeout=5) + if response.status_code == 200: + data = response.json() + advisor_count = data.get('advisor_count', 0) + else: + advisor_count = 0 + except Exception as e: + advisor_count = 0 + + st.markdown(f""" +
+    <div>
+        <div>👨‍🏫 Advisors</div>
+        <div>{advisor_count}</div>
+        <div>Academic support</div>
+    </div>
+ """, unsafe_allow_html=True) + +with col3: + # Fetch employer count first + try: + response = requests.get(f"{test_url}/users/count/employers", timeout=5) + if response.status_code == 200: + data = response.json() + employer_count = data.get('employer_count', 0) + else: + employer_count = 0 + except Exception as e: + employer_count = 0 + + st.markdown(f""" +
+    <div>
+        <div>🏢 Employers</div>
+        <div>{employer_count}</div>
+        <div>Industry partners</div>
+    </div>
+ """, unsafe_allow_html=True) + +# Summary Section +st.markdown("---") +st.markdown('

<h2>📈 System Summary</h2>

', unsafe_allow_html=True) + +# Calculate total users +try: + student_response = requests.get(f"{test_url}/users/count/students", timeout=5) + advisor_response = requests.get(f"{test_url}/users/count/advisors", timeout=5) + employer_response = requests.get(f"{test_url}/users/count/employers", timeout=5) + + if all(r.status_code == 200 for r in [student_response, advisor_response, employer_response]): + student_data = student_response.json() + advisor_data = advisor_response.json() + employer_data = employer_response.json() + + total_users = student_data.get('student_count', 0) + advisor_data.get('advisor_count', 0) + employer_data.get('employer_count', 0) + + col1, col2, col3, col4 = st.columns(4) + + with col1: + st.metric("Total Users", total_users, delta=None) + + with col2: + st.metric("System Status", "Online", delta="✓", delta_color="normal") + + with col3: + st.metric("API Response", "Healthy", delta="< 5s", delta_color="normal") + + with col4: + st.metric("Last Updated", datetime.now().strftime("%H:%M"), delta="Live", delta_color="normal") + + else: + st.warning("⚠️ Some metrics are unavailable. Please check system connectivity.") + +except Exception as e: + st.error(f"❌ Unable to fetch system summary: {str(e)}") + +# Footer +st.markdown("---") +st.markdown(""" +
+<div>
+    Admin Analytics Dashboard • Real-time monitoring • Last updated: """ + datetime.now().strftime("%Y-%m-%d %H:%M:%S") + """
+</div>
+""", unsafe_allow_html=True) + + diff --git a/app/src/pages/30_About.py b/app/src/pages/90_About.py similarity index 100% rename from app/src/pages/30_About.py rename to app/src/pages/90_About.py diff --git a/database-files/00_northwind.sql b/database-files/00_northwind.sql deleted file mode 100644 index 57678cfc72..0000000000 --- a/database-files/00_northwind.sql +++ /dev/null @@ -1,546 +0,0 @@ -SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0; -SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0; -SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='TRADITIONAL,ALLOW_INVALID_DATES'; - -DROP SCHEMA IF EXISTS `northwind` ; -CREATE SCHEMA IF NOT EXISTS `northwind` DEFAULT CHARACTER SET latin1 ; -USE `northwind` ; - --- ----------------------------------------------------- --- Table `northwind`.`customers` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`customers` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`employees` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`employees` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`privileges` --- 
----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`privileges` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `privilege_name` VARCHAR(50) NULL DEFAULT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`employee_privileges` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`employee_privileges` ( - `employee_id` INT(11) NOT NULL, - `privilege_id` INT(11) NOT NULL, - PRIMARY KEY (`employee_id`, `privilege_id`), - INDEX `employee_id` (`employee_id` ASC), - INDEX `privilege_id` (`privilege_id` ASC), - INDEX `privilege_id_2` (`privilege_id` ASC), - CONSTRAINT `fk_employee_privileges_employees1` - FOREIGN KEY (`employee_id`) - REFERENCES `northwind`.`employees` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_employee_privileges_privileges1` - FOREIGN KEY (`privilege_id`) - REFERENCES `northwind`.`privileges` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`inventory_transaction_types` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`inventory_transaction_types` ( - `id` TINYINT(4) NOT NULL, - `type_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`shippers` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`shippers` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`orders_tax_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`orders_tax_status` ( - `id` TINYINT(4) NOT NULL, - `tax_status_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`orders_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`orders_status` ( - `id` TINYINT(4) NOT NULL, - `status_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = 
utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`orders` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`orders` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `employee_id` INT(11) NULL DEFAULT NULL, - `customer_id` INT(11) NULL DEFAULT NULL, - `order_date` DATETIME NULL DEFAULT NULL, - `shipped_date` DATETIME NULL DEFAULT NULL, - `shipper_id` INT(11) NULL DEFAULT NULL, - `ship_name` VARCHAR(50) NULL DEFAULT NULL, - `ship_address` LONGTEXT NULL DEFAULT NULL, - `ship_city` VARCHAR(50) NULL DEFAULT NULL, - `ship_state_province` VARCHAR(50) NULL DEFAULT NULL, - `ship_zip_postal_code` VARCHAR(50) NULL DEFAULT NULL, - `ship_country_region` VARCHAR(50) NULL DEFAULT NULL, - `shipping_fee` DECIMAL(19,4) NULL DEFAULT '0.0000', - `taxes` DECIMAL(19,4) NULL DEFAULT '0.0000', - `payment_type` VARCHAR(50) NULL DEFAULT NULL, - `paid_date` DATETIME NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `tax_rate` DOUBLE NULL DEFAULT '0', - `tax_status_id` TINYINT(4) NULL DEFAULT NULL, - `status_id` TINYINT(4) NULL DEFAULT '0', - PRIMARY KEY (`id`), - INDEX `customer_id` (`customer_id` ASC), - INDEX `customer_id_2` (`customer_id` ASC), - INDEX `employee_id` (`employee_id` ASC), - INDEX `employee_id_2` (`employee_id` ASC), - INDEX `id` (`id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `shipper_id` (`shipper_id` ASC), - INDEX `shipper_id_2` (`shipper_id` ASC), - INDEX `id_3` (`id` ASC), - INDEX `tax_status` (`tax_status_id` ASC), - INDEX `ship_zip_postal_code` (`ship_zip_postal_code` ASC), - CONSTRAINT `fk_orders_customers` - FOREIGN KEY (`customer_id`) - REFERENCES `northwind`.`customers` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_employees1` - FOREIGN KEY (`employee_id`) - REFERENCES `northwind`.`employees` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_shippers1` - FOREIGN KEY (`shipper_id`) - REFERENCES `northwind`.`shippers` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_orders_tax_status1` - FOREIGN KEY (`tax_status_id`) - REFERENCES `northwind`.`orders_tax_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_orders_status1` - FOREIGN KEY (`status_id`) - REFERENCES `northwind`.`orders_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`products` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`products` ( - `supplier_ids` LONGTEXT NULL DEFAULT NULL, - `id` INT(11) NOT NULL AUTO_INCREMENT, - `product_code` VARCHAR(25) NULL DEFAULT NULL, - `product_name` VARCHAR(50) NULL DEFAULT NULL, - `description` LONGTEXT NULL DEFAULT NULL, - `standard_cost` DECIMAL(19,4) NULL DEFAULT '0.0000', - `list_price` DECIMAL(19,4) NOT NULL DEFAULT '0.0000', - `reorder_level` INT(11) NULL DEFAULT NULL, - `target_level` INT(11) NULL DEFAULT NULL, - `quantity_per_unit` VARCHAR(50) NULL DEFAULT NULL, - `discontinued` TINYINT(1) NOT NULL DEFAULT '0', - `minimum_reorder_quantity` INT(11) NULL DEFAULT NULL, - `category` VARCHAR(50) NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `product_code` (`product_code` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`purchase_order_status` --- 
----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`purchase_order_status` ( - `id` INT(11) NOT NULL, - `status` VARCHAR(50) NULL DEFAULT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`suppliers` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`suppliers` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`purchase_orders` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`purchase_orders` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `supplier_id` INT(11) NULL DEFAULT NULL, - `created_by` INT(11) NULL DEFAULT NULL, - `submitted_date` DATETIME NULL DEFAULT NULL, - `creation_date` DATETIME NULL DEFAULT NULL, - `status_id` INT(11) NULL DEFAULT '0', - `expected_date` DATETIME NULL DEFAULT NULL, - `shipping_fee` DECIMAL(19,4) NOT NULL DEFAULT '0.0000', - `taxes` DECIMAL(19,4) NOT NULL DEFAULT '0.0000', - `payment_date` DATETIME NULL DEFAULT NULL, - `payment_amount` DECIMAL(19,4) NULL DEFAULT '0.0000', - `payment_method` VARCHAR(50) NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `approved_by` INT(11) NULL DEFAULT NULL, - `approved_date` DATETIME NULL DEFAULT NULL, - `submitted_by` INT(11) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - UNIQUE INDEX `id` (`id` ASC), - INDEX `created_by` (`created_by` ASC), - INDEX `status_id` (`status_id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `supplier_id` (`supplier_id` ASC), - INDEX `supplier_id_2` (`supplier_id` ASC), - CONSTRAINT `fk_purchase_orders_employees1` - FOREIGN KEY (`created_by`) - REFERENCES `northwind`.`employees` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_orders_purchase_order_status1` - FOREIGN KEY (`status_id`) - REFERENCES `northwind`.`purchase_order_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_orders_suppliers1` - FOREIGN KEY (`supplier_id`) - REFERENCES `northwind`.`suppliers` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`inventory_transactions` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS 
`northwind`.`inventory_transactions` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `transaction_type` TINYINT(4) NOT NULL, - `transaction_created_date` DATETIME NULL DEFAULT NULL, - `transaction_modified_date` DATETIME NULL DEFAULT NULL, - `product_id` INT(11) NOT NULL, - `quantity` INT(11) NOT NULL, - `purchase_order_id` INT(11) NULL DEFAULT NULL, - `customer_order_id` INT(11) NULL DEFAULT NULL, - `comments` VARCHAR(255) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `customer_order_id` (`customer_order_id` ASC), - INDEX `customer_order_id_2` (`customer_order_id` ASC), - INDEX `product_id` (`product_id` ASC), - INDEX `product_id_2` (`product_id` ASC), - INDEX `purchase_order_id` (`purchase_order_id` ASC), - INDEX `purchase_order_id_2` (`purchase_order_id` ASC), - INDEX `transaction_type` (`transaction_type` ASC), - CONSTRAINT `fk_inventory_transactions_orders1` - FOREIGN KEY (`customer_order_id`) - REFERENCES `northwind`.`orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_inventory_transactions_products1` - FOREIGN KEY (`product_id`) - REFERENCES `northwind`.`products` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_inventory_transactions_purchase_orders1` - FOREIGN KEY (`purchase_order_id`) - REFERENCES `northwind`.`purchase_orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_inventory_transactions_inventory_transaction_types1` - FOREIGN KEY (`transaction_type`) - REFERENCES `northwind`.`inventory_transaction_types` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`invoices` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`invoices` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `order_id` INT(11) NULL DEFAULT NULL, - `invoice_date` DATETIME NULL DEFAULT NULL, - `due_date` DATETIME NULL DEFAULT NULL, - `tax` DECIMAL(19,4) NULL DEFAULT '0.0000', - `shipping` DECIMAL(19,4) NULL DEFAULT '0.0000', - `amount_due` DECIMAL(19,4) NULL DEFAULT '0.0000', - PRIMARY KEY (`id`), - INDEX `id` (`id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `fk_invoices_orders1_idx` (`order_id` ASC), - CONSTRAINT `fk_invoices_orders1` - FOREIGN KEY (`order_id`) - REFERENCES `northwind`.`orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`order_details_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`order_details_status` ( - `id` INT(11) NOT NULL, - `status_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`order_details` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`order_details` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `order_id` INT(11) NOT NULL, - `product_id` INT(11) NULL DEFAULT NULL, - `quantity` DECIMAL(18,4) NOT NULL DEFAULT '0.0000', - `unit_price` DECIMAL(19,4) NULL DEFAULT '0.0000', - `discount` DOUBLE NOT NULL DEFAULT '0', - `status_id` INT(11) NULL DEFAULT NULL, - `date_allocated` DATETIME NULL DEFAULT NULL, - `purchase_order_id` INT(11) NULL DEFAULT NULL, - `inventory_id` INT(11) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `id` (`id` ASC), - INDEX `inventory_id` 
(`inventory_id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `id_3` (`id` ASC), - INDEX `id_4` (`id` ASC), - INDEX `product_id` (`product_id` ASC), - INDEX `product_id_2` (`product_id` ASC), - INDEX `purchase_order_id` (`purchase_order_id` ASC), - INDEX `id_5` (`id` ASC), - INDEX `fk_order_details_orders1_idx` (`order_id` ASC), - INDEX `fk_order_details_order_details_status1_idx` (`status_id` ASC), - CONSTRAINT `fk_order_details_orders1` - FOREIGN KEY (`order_id`) - REFERENCES `northwind`.`orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_order_details_products1` - FOREIGN KEY (`product_id`) - REFERENCES `northwind`.`products` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_order_details_order_details_status1` - FOREIGN KEY (`status_id`) - REFERENCES `northwind`.`order_details_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`purchase_order_details` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`purchase_order_details` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `purchase_order_id` INT(11) NOT NULL, - `product_id` INT(11) NULL DEFAULT NULL, - `quantity` DECIMAL(18,4) NOT NULL, - `unit_cost` DECIMAL(19,4) NOT NULL, - `date_received` DATETIME NULL DEFAULT NULL, - `posted_to_inventory` TINYINT(1) NOT NULL DEFAULT '0', - `inventory_id` INT(11) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `id` (`id` ASC), - INDEX `inventory_id` (`inventory_id` ASC), - INDEX `inventory_id_2` (`inventory_id` ASC), - INDEX `purchase_order_id` (`purchase_order_id` ASC), - INDEX `product_id` (`product_id` ASC), - INDEX `product_id_2` (`product_id` ASC), - INDEX `purchase_order_id_2` (`purchase_order_id` ASC), - CONSTRAINT `fk_purchase_order_details_inventory_transactions1` - FOREIGN KEY (`inventory_id`) - REFERENCES `northwind`.`inventory_transactions` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_order_details_products1` - FOREIGN KEY (`product_id`) - REFERENCES `northwind`.`products` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_order_details_purchase_orders1` - FOREIGN KEY (`purchase_order_id`) - REFERENCES `northwind`.`purchase_orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`sales_reports` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`sales_reports` ( - `group_by` VARCHAR(50) NOT NULL, - `display` VARCHAR(50) NULL DEFAULT NULL, - `title` VARCHAR(50) NULL DEFAULT NULL, - `filter_row_source` LONGTEXT NULL DEFAULT NULL, - `default` TINYINT(1) NOT NULL DEFAULT '0', - PRIMARY KEY (`group_by`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`strings` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`strings` ( - `string_id` INT(11) NOT NULL AUTO_INCREMENT, - `string_data` VARCHAR(255) NULL DEFAULT NULL, - PRIMARY KEY (`string_id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - -SET SQL_MODE=@OLD_SQL_MODE; -SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS; -SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS; diff --git a/database-files/01-coopalytics.sql b/database-files/01-coopalytics.sql new file 
mode 100644 index 0000000000..df74c89801 --- /dev/null +++ b/database-files/01-coopalytics.sql @@ -0,0 +1,146 @@ +DROP DATABASE IF EXISTS `coopalytics`; +CREATE DATABASE `coopalytics`; +USE `coopalytics`; + +CREATE TABLE skills ( + skillId INT PRIMARY KEY, + name VARCHAR(20) NOT NULL, + category VARCHAR(20) NOT NULL +); + +CREATE TABLE companyProfiles ( + companyProfileId INT PRIMARY KEY, + name VARCHAR(50) NOT NULL, + bio LONGTEXT, + industry VARCHAR(30) NOT NULL, + websiteLink VARCHAR(100) +); + +CREATE TABLE users ( + userId INT PRIMARY KEY, + firstName VARCHAR(30) NOT NULL, + lastName VARCHAR(30) NOT NULL, + email VARCHAR(100) NOT NULL, + phone VARCHAR(20), + major VARCHAR(50), + minor VARCHAR(50), + college VARCHAR(100), + gradYear VARCHAR(10), + grade VARCHAR(20), + companyProfileId INT, + industry VARCHAR(30), + + FOREIGN KEY (companyProfileId) REFERENCES companyProfiles (companyProfileId) ON UPDATE CASCADE ON DELETE SET NULL +); + +CREATE TABLE demographics ( + demographicId INT PRIMARY KEY, + gender VARCHAR(20), + race VARCHAR(20), + nationality VARCHAR(20), + sexuality VARCHAR(20), + disability VARCHAR(20), + + FOREIGN KEY (demographicId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE coopPositions ( + coopPositionId INT PRIMARY KEY, + title VARCHAR(30) NOT NULL, + location VARCHAR(30) NOT NULL DEFAULT 'Not Specified', + description LONGTEXT NOT NULL, + hourlyPay FLOAT NOT NULL, + requiredSkillsId INT, + desiredSkillsId INT, + desiredGPA FLOAT, + deadline DATETIME, + startDate DATE NOT NULL, + endDate DATE NOT NULL, + flag BOOLEAN NOT NULL DEFAULT FALSE, + industry VARCHAR(30) NOT NULL DEFAULT 'Not Specified', + + FOREIGN KEY (requiredSkillsId) REFERENCES skills (skillId) ON UPDATE CASCADE ON DELETE SET NULL, + FOREIGN KEY (desiredSkillsId) REFERENCES skills (skillId) ON UPDATE CASCADE ON DELETE SET NULL +); + +CREATE TABLE skillDetails ( + skillId INT, + studentId INT, + proficiencyLevel INT NOT NULL, + + PRIMARY KEY (skillId, studentId), + FOREIGN KEY (skillId) REFERENCES skills (skillId) ON UPDATE CASCADE ON DELETE CASCADE, + FOREIGN KEY (studentId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE advisor_advisee ( + studentId INT, + advisorId INT, + flag BOOLEAN NOT NULL DEFAULT FALSE, + + PRIMARY KEY (studentId, advisorId), + FOREIGN KEY (studentId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE, + FOREIGN KEY (advisorId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE workedAtPos ( + studentId INT, + coopPositionId INT, + startDate DATE NOT NULL, + endDate DATE NOT NULL, + companyRating INT, + + PRIMARY KEY (studentId, coopPositionId), + FOREIGN KEY (studentId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE, + FOREIGN KEY (coopPositionId) REFERENCES coopPositions (coopPositionId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE viewsPos ( + studentId INT, + coopPositionId INT, + preference BOOLEAN DEFAULT FALSE, + + PRIMARY KEY (studentId, coopPositionId), + FOREIGN KEY (studentId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE, + FOREIGN KEY (coopPositionId) REFERENCES coopPositions (coopPositionId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE createsPos ( + employerId INT, + coopPositionId INT, + + PRIMARY KEY (employerId, coopPositionId), + FOREIGN KEY (employerId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE RESTRICT, + FOREIGN KEY (coopPositionId) REFERENCES coopPositions (coopPositionId) ON UPDATE 
CASCADE ON DELETE CASCADE +); + +CREATE TABLE applications ( + applicationId INT NOT NULL AUTO_INCREMENT PRIMARY KEY, + dateTimeApplied DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, + status VARCHAR(15) NOT NULL DEFAULT 'Draft', + resume LONGTEXT, + gpa FLOAT, + coverLetter LONGTEXT, + coopPositionId INT NOT NULL, + + FOREIGN KEY (coopPositionId) REFERENCES coopPositions (coopPositionId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE appliesToApp ( + applicationId INT, + studentId INT, + + PRIMARY KEY (applicationId, studentId), + FOREIGN KEY (applicationId) REFERENCES applications (applicationId) ON UPDATE CASCADE ON DELETE CASCADE, + FOREIGN KEY (studentId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE +); + +CREATE TABLE reviewsApp ( + applicationId INT, + employerId INT, + flag BOOLEAN NOT NULL DEFAULT FALSE, + + PRIMARY KEY (applicationId, employerId), + FOREIGN KEY (applicationId) REFERENCES applications (applicationId) ON UPDATE CASCADE ON DELETE CASCADE, + FOREIGN KEY (employerId) REFERENCES users (userId) ON UPDATE CASCADE ON DELETE CASCADE +); diff --git a/database-files/01_northwind-default-current-timestamp.sql b/database-files/01_northwind-default-current-timestamp.sql deleted file mode 100644 index 5596e4759c..0000000000 --- a/database-files/01_northwind-default-current-timestamp.sql +++ /dev/null @@ -1,546 +0,0 @@ -SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0; -SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0; -SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='TRADITIONAL,ALLOW_INVALID_DATES'; - -DROP SCHEMA IF EXISTS `northwind` ; -CREATE SCHEMA IF NOT EXISTS `northwind` DEFAULT CHARACTER SET latin1 ; -USE `northwind` ; - --- ----------------------------------------------------- --- Table `northwind`.`customers` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`customers` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`employees` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`employees` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL 
DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`privileges` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`privileges` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `privilege_name` VARCHAR(50) NULL DEFAULT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`employee_privileges` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`employee_privileges` ( - `employee_id` INT(11) NOT NULL, - `privilege_id` INT(11) NOT NULL, - PRIMARY KEY (`employee_id`, `privilege_id`), - INDEX `employee_id` (`employee_id` ASC), - INDEX `privilege_id` (`privilege_id` ASC), - INDEX `privilege_id_2` (`privilege_id` ASC), - CONSTRAINT `fk_employee_privileges_employees1` - FOREIGN KEY (`employee_id`) - REFERENCES `northwind`.`employees` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_employee_privileges_privileges1` - FOREIGN KEY (`privilege_id`) - REFERENCES `northwind`.`privileges` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`inventory_transaction_types` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`inventory_transaction_types` ( - `id` TINYINT(4) NOT NULL, - `type_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`shippers` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`shippers` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX 
`first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`orders_tax_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`orders_tax_status` ( - `id` TINYINT(4) NOT NULL, - `tax_status_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`orders_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`orders_status` ( - `id` TINYINT(4) NOT NULL, - `status_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`orders` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`orders` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `employee_id` INT(11) NULL DEFAULT NULL, - `customer_id` INT(11) NULL DEFAULT NULL, - `order_date` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, - `shipped_date` DATETIME NULL DEFAULT NULL, - `shipper_id` INT(11) NULL DEFAULT NULL, - `ship_name` VARCHAR(50) NULL DEFAULT NULL, - `ship_address` LONGTEXT NULL DEFAULT NULL, - `ship_city` VARCHAR(50) NULL DEFAULT NULL, - `ship_state_province` VARCHAR(50) NULL DEFAULT NULL, - `ship_zip_postal_code` VARCHAR(50) NULL DEFAULT NULL, - `ship_country_region` VARCHAR(50) NULL DEFAULT NULL, - `shipping_fee` DECIMAL(19,4) NULL DEFAULT '0.0000', - `taxes` DECIMAL(19,4) NULL DEFAULT '0.0000', - `payment_type` VARCHAR(50) NULL DEFAULT NULL, - `paid_date` DATETIME NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `tax_rate` DOUBLE NULL DEFAULT '0', - `tax_status_id` TINYINT(4) NULL DEFAULT NULL, - `status_id` TINYINT(4) NULL DEFAULT '0', - PRIMARY KEY (`id`), - INDEX `customer_id` (`customer_id` ASC), - INDEX `customer_id_2` (`customer_id` ASC), - INDEX `employee_id` (`employee_id` ASC), - INDEX `employee_id_2` (`employee_id` ASC), - INDEX `id` (`id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `shipper_id` (`shipper_id` ASC), - INDEX `shipper_id_2` (`shipper_id` ASC), - INDEX `id_3` (`id` ASC), - INDEX `tax_status` (`tax_status_id` ASC), - INDEX `ship_zip_postal_code` (`ship_zip_postal_code` ASC), - CONSTRAINT `fk_orders_customers` - FOREIGN KEY (`customer_id`) - REFERENCES `northwind`.`customers` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_employees1` - FOREIGN KEY (`employee_id`) - REFERENCES `northwind`.`employees` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_shippers1` - FOREIGN KEY (`shipper_id`) - REFERENCES `northwind`.`shippers` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_orders_tax_status1` - FOREIGN KEY (`tax_status_id`) - REFERENCES `northwind`.`orders_tax_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_orders_orders_status1` - FOREIGN KEY (`status_id`) - REFERENCES `northwind`.`orders_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`products` --- ----------------------------------------------------- -CREATE TABLE IF NOT 
EXISTS `northwind`.`products` ( - `supplier_ids` LONGTEXT NULL DEFAULT NULL, - `id` INT(11) NOT NULL AUTO_INCREMENT, - `product_code` VARCHAR(25) NULL DEFAULT NULL, - `product_name` VARCHAR(50) NULL DEFAULT NULL, - `description` LONGTEXT NULL DEFAULT NULL, - `standard_cost` DECIMAL(19,4) NULL DEFAULT '0.0000', - `list_price` DECIMAL(19,4) NOT NULL DEFAULT '0.0000', - `reorder_level` INT(11) NULL DEFAULT NULL, - `target_level` INT(11) NULL DEFAULT NULL, - `quantity_per_unit` VARCHAR(50) NULL DEFAULT NULL, - `discontinued` TINYINT(1) NOT NULL DEFAULT '0', - `minimum_reorder_quantity` INT(11) NULL DEFAULT NULL, - `category` VARCHAR(50) NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `product_code` (`product_code` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`purchase_order_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`purchase_order_status` ( - `id` INT(11) NOT NULL, - `status` VARCHAR(50) NULL DEFAULT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`suppliers` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`suppliers` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `company` VARCHAR(50) NULL DEFAULT NULL, - `last_name` VARCHAR(50) NULL DEFAULT NULL, - `first_name` VARCHAR(50) NULL DEFAULT NULL, - `email_address` VARCHAR(50) NULL DEFAULT NULL, - `job_title` VARCHAR(50) NULL DEFAULT NULL, - `business_phone` VARCHAR(25) NULL DEFAULT NULL, - `home_phone` VARCHAR(25) NULL DEFAULT NULL, - `mobile_phone` VARCHAR(25) NULL DEFAULT NULL, - `fax_number` VARCHAR(25) NULL DEFAULT NULL, - `address` LONGTEXT NULL DEFAULT NULL, - `city` VARCHAR(50) NULL DEFAULT NULL, - `state_province` VARCHAR(50) NULL DEFAULT NULL, - `zip_postal_code` VARCHAR(15) NULL DEFAULT NULL, - `country_region` VARCHAR(50) NULL DEFAULT NULL, - `web_page` LONGTEXT NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `attachments` LONGBLOB NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `city` (`city` ASC), - INDEX `company` (`company` ASC), - INDEX `first_name` (`first_name` ASC), - INDEX `last_name` (`last_name` ASC), - INDEX `zip_postal_code` (`zip_postal_code` ASC), - INDEX `state_province` (`state_province` ASC)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`purchase_orders` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`purchase_orders` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `supplier_id` INT(11) NULL DEFAULT NULL, - `created_by` INT(11) NULL DEFAULT NULL, - `submitted_date` DATETIME NULL DEFAULT NULL, - `creation_date` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, - `status_id` INT(11) NULL DEFAULT '0', - `expected_date` DATETIME NULL DEFAULT NULL, - `shipping_fee` DECIMAL(19,4) NOT NULL DEFAULT '0.0000', - `taxes` DECIMAL(19,4) NOT NULL DEFAULT '0.0000', - `payment_date` DATETIME NULL DEFAULT NULL, - `payment_amount` DECIMAL(19,4) NULL DEFAULT '0.0000', - `payment_method` VARCHAR(50) NULL DEFAULT NULL, - `notes` LONGTEXT NULL DEFAULT NULL, - `approved_by` INT(11) NULL DEFAULT NULL, - `approved_date` DATETIME NULL DEFAULT NULL, - `submitted_by` INT(11) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - UNIQUE INDEX `id` (`id` ASC), - 
INDEX `created_by` (`created_by` ASC), - INDEX `status_id` (`status_id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `supplier_id` (`supplier_id` ASC), - INDEX `supplier_id_2` (`supplier_id` ASC), - CONSTRAINT `fk_purchase_orders_employees1` - FOREIGN KEY (`created_by`) - REFERENCES `northwind`.`employees` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_orders_purchase_order_status1` - FOREIGN KEY (`status_id`) - REFERENCES `northwind`.`purchase_order_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_orders_suppliers1` - FOREIGN KEY (`supplier_id`) - REFERENCES `northwind`.`suppliers` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`inventory_transactions` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`inventory_transactions` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `transaction_type` TINYINT(4) NOT NULL, - `transaction_created_date` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, - `transaction_modified_date` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, - `product_id` INT(11) NOT NULL, - `quantity` INT(11) NOT NULL, - `purchase_order_id` INT(11) NULL DEFAULT NULL, - `customer_order_id` INT(11) NULL DEFAULT NULL, - `comments` VARCHAR(255) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `customer_order_id` (`customer_order_id` ASC), - INDEX `customer_order_id_2` (`customer_order_id` ASC), - INDEX `product_id` (`product_id` ASC), - INDEX `product_id_2` (`product_id` ASC), - INDEX `purchase_order_id` (`purchase_order_id` ASC), - INDEX `purchase_order_id_2` (`purchase_order_id` ASC), - INDEX `transaction_type` (`transaction_type` ASC), - CONSTRAINT `fk_inventory_transactions_orders1` - FOREIGN KEY (`customer_order_id`) - REFERENCES `northwind`.`orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_inventory_transactions_products1` - FOREIGN KEY (`product_id`) - REFERENCES `northwind`.`products` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_inventory_transactions_purchase_orders1` - FOREIGN KEY (`purchase_order_id`) - REFERENCES `northwind`.`purchase_orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_inventory_transactions_inventory_transaction_types1` - FOREIGN KEY (`transaction_type`) - REFERENCES `northwind`.`inventory_transaction_types` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`invoices` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`invoices` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `order_id` INT(11) NULL DEFAULT NULL, - `invoice_date` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP, - `due_date` DATETIME NULL DEFAULT NULL, - `tax` DECIMAL(19,4) NULL DEFAULT '0.0000', - `shipping` DECIMAL(19,4) NULL DEFAULT '0.0000', - `amount_due` DECIMAL(19,4) NULL DEFAULT '0.0000', - PRIMARY KEY (`id`), - INDEX `id` (`id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `fk_invoices_orders1_idx` (`order_id` ASC), - CONSTRAINT `fk_invoices_orders1` - FOREIGN KEY (`order_id`) - REFERENCES `northwind`.`orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table 
`northwind`.`order_details_status` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`order_details_status` ( - `id` INT(11) NOT NULL, - `status_name` VARCHAR(50) NOT NULL, - PRIMARY KEY (`id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`order_details` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`order_details` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `order_id` INT(11) NOT NULL, - `product_id` INT(11) NULL DEFAULT NULL, - `quantity` DECIMAL(18,4) NOT NULL DEFAULT '0.0000', - `unit_price` DECIMAL(19,4) NULL DEFAULT '0.0000', - `discount` DOUBLE NOT NULL DEFAULT '0', - `status_id` INT(11) NULL DEFAULT NULL, - `date_allocated` DATETIME NULL DEFAULT NULL, - `purchase_order_id` INT(11) NULL DEFAULT NULL, - `inventory_id` INT(11) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `id` (`id` ASC), - INDEX `inventory_id` (`inventory_id` ASC), - INDEX `id_2` (`id` ASC), - INDEX `id_3` (`id` ASC), - INDEX `id_4` (`id` ASC), - INDEX `product_id` (`product_id` ASC), - INDEX `product_id_2` (`product_id` ASC), - INDEX `purchase_order_id` (`purchase_order_id` ASC), - INDEX `id_5` (`id` ASC), - INDEX `fk_order_details_orders1_idx` (`order_id` ASC), - INDEX `fk_order_details_order_details_status1_idx` (`status_id` ASC), - CONSTRAINT `fk_order_details_orders1` - FOREIGN KEY (`order_id`) - REFERENCES `northwind`.`orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_order_details_products1` - FOREIGN KEY (`product_id`) - REFERENCES `northwind`.`products` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_order_details_order_details_status1` - FOREIGN KEY (`status_id`) - REFERENCES `northwind`.`order_details_status` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`purchase_order_details` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`purchase_order_details` ( - `id` INT(11) NOT NULL AUTO_INCREMENT, - `purchase_order_id` INT(11) NOT NULL, - `product_id` INT(11) NULL DEFAULT NULL, - `quantity` DECIMAL(18,4) NOT NULL, - `unit_cost` DECIMAL(19,4) NOT NULL, - `date_received` DATETIME NULL DEFAULT NULL, - `posted_to_inventory` TINYINT(1) NOT NULL DEFAULT '0', - `inventory_id` INT(11) NULL DEFAULT NULL, - PRIMARY KEY (`id`), - INDEX `id` (`id` ASC), - INDEX `inventory_id` (`inventory_id` ASC), - INDEX `inventory_id_2` (`inventory_id` ASC), - INDEX `purchase_order_id` (`purchase_order_id` ASC), - INDEX `product_id` (`product_id` ASC), - INDEX `product_id_2` (`product_id` ASC), - INDEX `purchase_order_id_2` (`purchase_order_id` ASC), - CONSTRAINT `fk_purchase_order_details_inventory_transactions1` - FOREIGN KEY (`inventory_id`) - REFERENCES `northwind`.`inventory_transactions` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_order_details_products1` - FOREIGN KEY (`product_id`) - REFERENCES `northwind`.`products` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION, - CONSTRAINT `fk_purchase_order_details_purchase_orders1` - FOREIGN KEY (`purchase_order_id`) - REFERENCES `northwind`.`purchase_orders` (`id`) - ON DELETE NO ACTION - ON UPDATE NO ACTION) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table 
`northwind`.`sales_reports` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`sales_reports` ( - `group_by` VARCHAR(50) NOT NULL, - `display` VARCHAR(50) NULL DEFAULT NULL, - `title` VARCHAR(50) NULL DEFAULT NULL, - `filter_row_source` LONGTEXT NULL DEFAULT NULL, - `default` TINYINT(1) NOT NULL DEFAULT '0', - PRIMARY KEY (`group_by`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - --- ----------------------------------------------------- --- Table `northwind`.`strings` --- ----------------------------------------------------- -CREATE TABLE IF NOT EXISTS `northwind`.`strings` ( - `string_id` INT(11) NOT NULL AUTO_INCREMENT, - `string_data` VARCHAR(255) NULL DEFAULT NULL, - PRIMARY KEY (`string_id`)) -ENGINE = InnoDB -DEFAULT CHARACTER SET = utf8; - - -SET SQL_MODE=@OLD_SQL_MODE; -SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS; -SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS; diff --git a/database-files/02-coopalytics-data.sql b/database-files/02-coopalytics-data.sql new file mode 100644 index 0000000000..19933699b7 --- /dev/null +++ b/database-files/02-coopalytics-data.sql @@ -0,0 +1,547 @@ +USE coopalytics; + +-- 1. Skills table (40 rows - strong entity, no dependencies) +INSERT INTO skills (skillId, name, category) VALUES +(1, 'Python', 'Programming'), +(2, 'Java', 'Programming'), +(3, 'JavaScript', 'Programming'), +(4, 'React', 'Web Development'), +(5, 'Node.js', 'Web Development'), +(6, 'SQL', 'Database'), +(7, 'MongoDB', 'Database'), +(8, 'AWS', 'Cloud'), +(9, 'Docker', 'DevOps'), +(10, 'Git', 'Version Control'), +(11, 'Machine Learning', 'Data Science'), +(12, 'Data Analysis', 'Data Science'), +(13, 'Excel', 'Office'), +(14, 'PowerPoint', 'Office'), +(15, 'Project Management', 'Management'), +(16, 'Agile', 'Management'), +(17, 'Communication', 'Soft Skills'), +(18, 'Leadership', 'Soft Skills'), +(19, 'Problem Solving', 'Soft Skills'), +(20, 'Teamwork', 'Soft Skills'), +(21, 'C++', 'Programming'), +(22, 'C#', 'Programming'), +(23, 'PHP', 'Programming'), +(24, 'Ruby', 'Programming'), +(25, 'Swift', 'Programming'), +(26, 'Kotlin', 'Programming'), +(27, 'Angular', 'Web Development'), +(28, 'Vue.js', 'Web Development'), +(29, 'CSS', 'Web Development'), +(30, 'HTML', 'Web Development'), +(31, 'PostgreSQL', 'Database'), +(32, 'MySQL', 'Database'), +(33, 'Azure', 'Cloud'), +(34, 'GCP', 'Cloud'), +(35, 'Kubernetes', 'DevOps'), +(36, 'Jenkins', 'DevOps'), +(37, 'Tableau', 'Data Science'), +(38, 'R', 'Data Science'), +(39, 'Adobe Creative', 'Design'), +(40, 'UI/UX Design', 'Design'); + +-- 2. 
Company Profiles table (35 rows - strong entity, no dependencies) +INSERT INTO companyProfiles (companyProfileId, name, bio, industry, websiteLink) VALUES +(1, 'TechNova Inc', 'Leading software development company specializing in enterprise solutions and cloud infrastructure.', 'Technology', 'www.technova.com'), +(2, 'DataFlow Analytics', 'Data science consulting firm helping businesses make data-driven decisions through advanced analytics.', 'Technology', 'www.dataflow.com'), +(3, 'GreenTech Solutions', 'Environmental technology company focused on sustainable energy and green infrastructure solutions.', 'Environmental', 'www.greentech.com'), +(4, 'FinanceFirst Corp', 'Financial services company providing investment management and banking solutions to corporate clients.', 'Finance', 'www.financefirst.com'), +(5, 'HealthTech Innovations', 'Healthcare technology startup developing AI-powered diagnostic tools and patient management systems.', 'Healthcare', 'www.healthtech.com'), +(6, 'CyberShield Security', 'Cybersecurity firm specializing in threat detection, incident response, and security consulting services.', 'Technology', 'www.cybershield.com'), +(7, 'BioResearch Labs', 'Biotechnology research company focused on drug discovery and medical device development.', 'Healthcare', 'www.bioresearch.com'), +(8, 'CloudFirst Technologies', 'Cloud infrastructure provider offering scalable solutions for enterprise digital transformation.', 'Technology', 'www.cloudfirst.com'), +(9, 'MarketPulse Agency', 'Digital marketing agency specializing in social media strategy and content marketing campaigns.', 'Marketing', 'www.marketpulse.com'), +(10, 'AutoMech Industries', 'Manufacturing company specializing in automotive parts and industrial automation systems.', 'Manufacturing', 'www.automech.com'), +(11, 'EduTech Platform', 'Educational technology company developing online learning platforms and student management systems.', 'Education', 'www.edutech.com'), +(12, 'RetailMax Solutions', 'Retail technology provider offering point-of-sale systems and inventory management solutions.', 'Retail', 'www.retailmax.com'), +(13, 'EnergyFlow Corp', 'Renewable energy company focused on solar and wind power generation and distribution systems.', 'Energy', 'www.energyflow.com'), +(14, 'LogiTrans Systems', 'Logistics and transportation company providing supply chain management and delivery solutions.', 'Logistics', 'www.logitrans.com'), +(15, 'DesignStudio Pro', 'Creative design agency specializing in brand identity, web design, and user experience consulting.', 'Design', 'www.designstudio.com'), +(16, 'AgriTech Innovations', 'Agricultural technology company developing precision farming tools and crop management systems.', 'Agriculture', 'www.agritech.com'), +(17, 'SportsTech Analytics', 'Sports technology company providing performance analytics and fan engagement platforms.', 'Sports', 'www.sportstech.com'), +(18, 'MediaStream Corp', 'Media and entertainment company specializing in streaming platforms and content distribution.', 'Media', 'www.mediastream.com'), +(19, 'RealEstate Plus', 'Real estate technology company offering property management and virtual tour solutions.', 'Real Estate', 'www.realestate.com'), +(20, 'TravelTech Solutions', 'Travel technology provider developing booking platforms and travel management systems.', 'Travel', 'www.traveltech.com'), +(21, 'FoodTech Innovations', 'Food technology company focused on sustainable food production and delivery optimization.', 'Food', 'www.foodtech.com'), +(22, 
'InsureTech Corp', 'Insurance technology company providing digital insurance platforms and risk assessment tools.', 'Insurance', 'www.insuretech.com'), +(23, 'GameDev Studios', 'Video game development company creating mobile and console games with immersive experiences.', 'Gaming', 'www.gamedev.com'), +(24, 'LegalTech Solutions', 'Legal technology provider offering case management systems and document automation tools.', 'Legal', 'www.legaltech.com'), +(25, 'ConstructTech Pro', 'Construction technology company providing project management and building information modeling.', 'Construction', 'www.constructtech.com'), +(26, 'PharmaResearch Inc', 'Pharmaceutical research company focused on drug development and clinical trial management.', 'Pharmaceutical', 'www.pharmaresearch.com'), +(27, 'AeroSpace Dynamics', 'Aerospace engineering company developing aircraft systems and space exploration technologies.', 'Aerospace', 'www.aerospace.com'), +(28, 'TextileTech Corp', 'Textile manufacturing company specializing in smart fabrics and sustainable clothing production.', 'Textile', 'www.textiletech.com'), +(29, 'MiningTech Solutions', 'Mining technology provider offering equipment automation and resource extraction optimization.', 'Mining', 'www.miningtech.com'), +(30, 'WaterTech Systems', 'Water technology company developing purification systems and water resource management solutions.', 'Water', 'www.watertech.com'), +(31, 'RoboTech Industries', 'Robotics company creating industrial automation solutions and service robots for various sectors.', 'Robotics', 'www.robotech.com'), +(32, 'ChemTech Labs', 'Chemical technology company specializing in materials science and chemical process optimization.', 'Chemical', 'www.chemtech.com'), +(33, 'TransportTech Corp', 'Transportation technology provider developing autonomous vehicle systems and traffic management.', 'Transportation', 'www.transporttech.com'), +(34, 'SecurityTech Pro', 'Physical security technology company offering surveillance systems and access control solutions.', 'Security', 'www.securitytech.com'), +(35, 'CleanTech Innovations', 'Clean technology company focused on waste management and environmental remediation solutions.', 'Environmental', 'www.cleantech.com'); + +-- 3. 
Users table (58 rows - references companyProfiles) +INSERT INTO users (userId, firstName, lastName, email, phone, major, minor, college, gradYear, grade, companyProfileId, industry) VALUES +-- Students (userId 1-30) +(1, 'Charlie', 'Stout', 'c.stout@student.edu', '555-0101', 'Computer Science', 'Mathematics', 'Khoury College of Computer Sciences', '2026', 'Junior', NULL, NULL), +(2, 'Liam', 'Williams', 'l.williams@student.edu', '555-0102', 'Business', 'Economics', 'D\'Amore-McKim School of Business', '2025', 'Senior', NULL, NULL), +(3, 'Sophia', 'Brown', 's.brown@student.edu', '555-0103', 'Mechanical Engineering', 'Physics', 'College of Engineering', '2027', 'Sophomore', NULL, NULL), +(4, 'Noah', 'Davis', 'n.davis@student.edu', '555-0104', 'Data Science', NULL, 'Khoury College of Computer Sciences', '2026', 'Junior', NULL, NULL), +(5, 'Olivia', 'Miller', 'o.miller@student.edu', '555-0105', 'Marketing', 'Psychology', 'D\'Amore-McKim School of Business', '2025', 'Senior', NULL, NULL), +(6, 'Mason', 'Wilson', 'm.wilson@student.edu', '555-0106', 'Cybersecurity', NULL, 'Khoury College of Computer Sciences', '2026', 'Junior', NULL, NULL), +(7, 'Ava', 'Moore', 'a.moore@student.edu', '555-0107', 'Biomedical Engineering', 'Chemistry', 'College of Engineering', '2027', 'Sophomore', NULL, NULL), +(8, 'Ethan', 'Taylor', 'e.taylor@student.edu', '555-0108', 'Finance', NULL, 'D\'Amore-McKim School of Business', '2025', 'Senior', NULL, NULL), +(9, 'Isabella', 'Anderson', 'i.anderson@student.edu', '555-0109', 'Psychology', 'Sociology', 'College of Social Sciences and Humanities', '2026', 'Junior', NULL, NULL), +(10, 'James', 'Thomas', 'j.thomas@student.edu', '555-0110', 'Mechanical Engineering', NULL, 'College of Engineering', '2027', 'Sophomore', NULL, NULL), +(11, 'Mia', 'Jackson', 'm.jackson@student.edu', '555-0111', 'Computer Science', NULL, 'Khoury College of Computer Sciences', '2025', 'Senior', NULL, NULL), +(12, 'Lucas', 'White', 'l.white@student.edu', '555-0112', 'Business', 'Data Science', 'D\'Amore-McKim School of Business', '2026', 'Junior', NULL, NULL), +(13, 'Charlotte', 'Harris', 'c.harris@student.edu', '555-0113', 'Environmental Engineering', 'Biology', 'College of Engineering', '2027', 'Sophomore', NULL, NULL), +(14, 'Benjamin', 'Martin', 'b.martin@student.edu', '555-0114', 'Information Systems', NULL, 'Khoury College of Computer Sciences', '2025', 'Senior', NULL, NULL), +(15, 'Amelia', 'Garcia', 'a.garcia@student.edu', '555-0115', 'Physics', 'Mathematics', 'College of Science', '2026', 'Junior', NULL, NULL), +(16, 'Henry', 'Rodriguez', 'h.rodriguez@student.edu', '555-0116', 'Computer Science', 'Mathematics', 'Khoury College of Computer Sciences', '2027', 'Sophomore', NULL, NULL), +(17, 'Harper', 'Lewis', 'h.lewis@student.edu', '555-0117', 'Design', 'Art', 'College of Arts, Media and Design', '2025', 'Senior', NULL, NULL), +(18, 'Alexander', 'Lee', 'a.lee@student.edu', '555-0118', 'Electrical Engineering', NULL, 'College of Engineering', '2026', 'Junior', NULL, NULL), +(19, 'Evelyn', 'Walker', 'e.walker@student.edu', '555-0119', 'International Business', 'Spanish', 'D\'Amore-McKim School of Business', '2027', 'Sophomore', NULL, NULL), +(20, 'Sebastian', 'Hall', 's.hall@student.edu', '555-0120', 'Data Science', NULL, 'Khoury College of Computer Sciences', '2025', 'Senior', NULL, NULL), +(21, 'Aria', 'Allen', 'a.allen@student.edu', '555-0121', 'Marketing', NULL, 'D\'Amore-McKim School of Business', '2026', 'Junior', NULL, NULL), +(22, 'Owen', 'Young', 'o.young@student.edu', '555-0122', 
'Computer Science', NULL, 'Khoury College of Computer Sciences', '2027', 'Sophomore', NULL, NULL), +(23, 'Luna', 'King', 'l.king@student.edu', '555-0123', 'Business', 'Finance', 'D\'Amore-McKim School of Business', '2025', 'Senior', NULL, NULL), +(24, 'Grayson', 'Wright', 'g.wright@student.edu', '555-0124', 'Cybersecurity', NULL, 'Khoury College of Computer Sciences', '2026', 'Junior', NULL, NULL), +(25, 'Chloe', 'Lopez', 'c.lopez@student.edu', '555-0125', 'Biology', 'Chemistry', 'College of Science', '2027', 'Sophomore', NULL, NULL), +(26, 'Carter', 'Hill', 'c.hill@student.edu', '555-0126', 'Information Systems', 'Business', 'Khoury College of Computer Sciences', '2025', 'Senior', NULL, NULL), +(27, 'Zoey', 'Scott', 'z.scott@student.edu', '555-0127', 'Environmental Engineering', NULL, 'College of Engineering', '2026', 'Junior', NULL, NULL), +(28, 'Luke', 'Green', 'l.green@student.edu', '555-0128', 'Chemistry', 'Mathematics', 'College of Science', '2027', 'Sophomore', NULL, NULL), +(29, 'Lily', 'Adams', 'l.adams@student.edu', '555-0129', 'Design', NULL, 'College of Arts, Media and Design', '2025', 'Senior', NULL, NULL), +(30, 'Jack', 'Baker', 'j.baker@student.edu', '555-0130', 'Computer Science', NULL, 'Khoury College of Computer Sciences', '2026', 'Junior', NULL, NULL), +-- Advisors (userId 31-36) +(31, 'Sarah', 'Martinez', 's.martinez@neu.edu', '555-0301', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Academic'), +(32, 'Michael', 'Chen', 'm.chen@neu.edu', '555-0302', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Academic'), +(33, 'Jennifer', 'Kim', 'j.kim@neu.edu', '555-0303', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Academic'), +(34, 'David', 'Johnson', 'd.johnson@neu.edu', '555-0304', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Academic'), +(35, 'Lisa', 'Thompson', 'l.thompson@neu.edu', '555-0305', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Academic'), +(36, 'Robert', 'Wilson', 'r.wilson@neu.edu', '555-0306', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Academic'), +-- Employers (userId 37-54) +(37, 'Phoebe', 'Hwang', 'p.hwang@technova.com', '555-0401', NULL, NULL, NULL, NULL, NULL, 1, 'Technology'), +(38, 'Marcus', 'Roberts', 'm.roberts@dataflow.com', '555-0402', NULL, NULL, NULL, NULL, NULL, 2, 'Technology'), +(39, 'Elena', 'Thompson', 'e.thompson@greenenergy.com', '555-0403', NULL, NULL, NULL, NULL, NULL, 3, 'Energy'), +(40, 'James', 'Martinez', 'j.martinez@healthtech.com', '555-0404', NULL, NULL, NULL, NULL, NULL, 4, 'Healthcare'), +(41, 'Rachel', 'Anderson', 'r.anderson@financefirst.com', '555-0405', NULL, NULL, NULL, NULL, NULL, 5, 'Finance'), +(42, 'Daniel', 'Clark', 'd.clark@autoinnovate.com', '555-0406', NULL, NULL, NULL, NULL, NULL, 6, 'Automotive'), +(43, 'Amanda', 'Lewis', 'a.lewis@cloudsecure.com', '555-0407', NULL, NULL, NULL, NULL, NULL, 7, 'Technology'), +(44, 'Christopher', 'Walker', 'c.walker@bioresearch.com', '555-0408', NULL, NULL, NULL, NULL, NULL, 8, 'Healthcare'), +(45, 'Natalie', 'Wells', 'natalie.wells@cloudfirst.com', '555-0409', NULL, NULL, NULL, NULL, NULL, 8, 'Technology'), +(46, 'Derek', 'Foster', 'derek.foster@energyflow.com', '555-0410', NULL, NULL, NULL, NULL, NULL, 13, 'Energy'), +(47, 'Vanessa', 'Li', 'vanessa.li@greenloop.net', '555-0411', NULL, NULL, NULL, NULL, NULL, 35, 'Technology'), +(48, 'Carlos', 'Nguyen', 'carlos.nguyen@quantive.co', '555-0412', NULL, NULL, NULL, NULL, NULL, 12, 'Finance'), +(49, 'Priya', 'Mehta', 'priya.mehta@verdanthub.org', '555-0413', NULL, NULL, NULL, NULL, NULL, 21, 'Technology'), +(50, 'Samuel', 'Bryant', 'samuel.bryant@tridentlabs.tech', '555-0414', 
NULL, NULL, NULL, NULL, NULL, 31, 'Technology'), +(51, 'Isabelle', 'Drake', 'isabelle.drake@devvibe.ai', '555-0415', NULL, NULL, NULL, NULL, NULL, 15, 'Technology'), +(52, 'Liam', 'Patel', 'liam.patel@xentrix.io', '555-0416', NULL, NULL, NULL, NULL, NULL, 22, 'Finance'), +(53, 'Olivia', 'Kim', 'olivia.kim@aerovate.com', '555-0417', NULL, NULL, NULL, NULL, NULL, 27, 'Technology'), +(54, 'Marcus', 'Holt', 'marcus.holt@clearbyte.net', '555-0418', NULL, NULL, NULL, NULL, NULL, 34, 'Technology'), +-- Admins (userId 55-58) +(55, 'Kaelyn', 'Dunn', 'k.dunn@neu.edu', '555-0501', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Administration'), +(56, 'Tyler', 'Rodriguez', 't.rodriguez@neu.edu', '555-0502', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Administration'), +(57, 'Madison', 'Foster', 'm.foster@neu.edu', '555-0503', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Administration'), +(58, 'Jordan', 'Bell', 'j.bell@neu.edu', '555-0504', NULL, NULL, 'NEU', NULL, NULL, NULL, 'Administration'); + +-- 4. Demographics table (48 rows - references users) +INSERT INTO demographics (demographicId, gender, race, nationality, sexuality, disability) VALUES +-- Students (1-30) +(1, 'Male', 'White', 'American', 'Heterosexual', NULL), +(2, 'Male', 'Hispanic', 'American', 'Heterosexual', NULL), +(3, 'Female', 'Asian', 'American', 'Heterosexual', NULL), +(4, 'Male', 'Black', 'American', 'Heterosexual', NULL), +(5, 'Female', 'White', 'American', 'Bisexual', NULL), +(6, 'Male', 'White', 'American', 'Heterosexual', 'ADHD'), +(7, 'Female', 'Mixed Race', 'American', 'Heterosexual', NULL), +(8, 'Male', 'Asian', 'International', 'Heterosexual', NULL), +(9, 'Female', 'Hispanic', 'American', 'Heterosexual', 'Anxiety'), +(10, 'Male', 'White', 'American', 'Gay', NULL), +(11, 'Female', 'Black', 'American', 'Lesbian', NULL), +(12, 'Male', 'White', 'American', 'Heterosexual', NULL), +(13, 'Female', 'Native American', 'American', 'Heterosexual', NULL), +(14, 'Male', 'Asian', 'American', 'Heterosexual', 'Dyslexia'), +(15, 'Female', 'White', 'International', 'Heterosexual', NULL), +(16, 'Male', 'Hispanic', 'American', 'Bisexual', NULL), +(17, 'Female', 'Asian', 'American', 'Heterosexual', NULL), +(18, 'Male', 'Black', 'American', 'Heterosexual', NULL), +(19, 'Female', 'White', 'International', 'Heterosexual', NULL), +(20, 'Male', 'Mixed Race', 'American', 'Heterosexual', 'Depression'), +(21, 'Female', 'Hispanic', 'American', 'Heterosexual', NULL), +(22, 'Male', 'White', 'American', 'Heterosexual', NULL), +(23, 'Female', 'Asian', 'International', 'Heterosexual', NULL), +(24, 'Male', 'White', 'American', 'Gay', NULL), +(25, 'Female', 'Black', 'American', 'Heterosexual', NULL), +(26, 'Male', 'Hispanic', 'American', 'Heterosexual', 'Autism'), +(27, 'Female', 'White', 'American', 'Bisexual', NULL), +(28, 'Male', 'Asian', 'American', 'Heterosexual', NULL), +(29, 'Female', 'Mixed Race', 'American', 'Heterosexual', NULL), +(30, 'Male', 'White', 'American', 'Heterosexual', NULL), +-- Advisors (31-36) +(31, 'Female', 'Hispanic', 'American', 'Heterosexual', NULL), +(32, 'Male', 'Asian', 'American', 'Heterosexual', NULL), +(33, 'Female', 'Korean', 'American', 'Heterosexual', NULL), +(34, 'Male', 'White', 'American', 'Heterosexual', NULL), +(35, 'Female', 'Black', 'American', 'Heterosexual', NULL), +(36, 'Male', 'White', 'American', 'Gay', NULL), +-- Employers (37-44) +(37, 'Female', 'Asian', 'American', 'Heterosexual', NULL), +(38, 'Male', 'White', 'American', 'Heterosexual', NULL), +(39, 'Female', 'Hispanic', 'American', 'Bisexual', NULL), +(40, 'Male', 
'Hispanic', 'American', 'Heterosexual', NULL), +(41, 'Female', 'Black', 'American', 'Heterosexual', NULL), +(42, 'Male', 'White', 'American', 'Heterosexual', NULL), +(43, 'Female', 'White', 'American', 'Lesbian', NULL), +(44, 'Male', 'Mixed Race', 'American', 'Heterosexual', NULL), +-- Admins (45-48) +(45, 'Female', 'White', 'American', 'Heterosexual', NULL), +(46, 'Male', 'Hispanic', 'American', 'Heterosexual', NULL), +(47, 'Female', 'Asian', 'American', 'Bisexual', NULL), +(48, 'Non-binary', 'Black', 'American', 'Pansexual', NULL); + +-- 5. Coop Positions table (50 rows - references skills) +INSERT INTO coopPositions (coopPositionId, title, location, description, hourlyPay, requiredSkillsId, desiredSkillsId, desiredGPA, deadline, startDate, endDate, flag, industry) VALUES +(1, 'Software Developer Intern', 'Boston, MA', 'Develop web applications using modern frameworks and participate in agile development processes.', 22.50, 1, 4, 3.0, '2025-02-15 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(2, 'Data Analyst Co-op', 'Cambridge, MA', 'Analyze business data and create reports using SQL and Python for data-driven insights.', 20.00, 12, 11, 3.2, '2025-02-20 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(3, 'Marketing Assistant', 'New York, NY', 'Support digital marketing campaigns and social media strategy development.', 18.50, 17, 14, 2.8, '2025-03-01 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Marketing'), +(4, 'Cybersecurity Intern', 'Burlington, MA', 'Assist with security assessments and vulnerability testing in cloud environments.', 25.00, 6, 8, 3.3, '2025-02-10 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(5, 'Financial Analyst Co-op', 'Boston, MA', 'Support financial modeling and investment analysis for banking operations.', 21.00, 13, 12, 3.4, '2025-02-25 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Finance'), +(6, 'UX Design Intern', 'San Francisco, CA', 'Create user interface designs and conduct user research for mobile applications.', 24.00, 40, 39, 3.0, '2025-03-05 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(7, 'Biotech Research Co-op', 'Cambridge, MA', 'Conduct laboratory research and assist with clinical trial data analysis.', 19.50, 12, 11, 3.5, '2025-02-28 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Healthcare'), +(8, 'Environmental Engineer', 'Portland, OR', 'Work on renewable energy projects and sustainability assessments.', 23.00, 15, 12, 3.1, '2025-03-10 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Energy'), +(9, 'DevOps Intern', 'Seattle, WA', 'Manage CI/CD pipelines and cloud infrastructure using AWS and Docker.', 26.00, 9, 35, 3.2, '2025-02-12 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(10, 'Business Analyst Co-op', 'Chicago, IL', 'Analyze business processes and requirements for software implementation.', 20.50, 15, 13, 3.0, '2025-03-15 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Consulting'), +(11, 'Machine Learning Intern', 'Austin, TX', 'Develop ML models for predictive analytics and data processing pipelines.', 28.00, 11, 1, 3.6, '2025-02-18 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(12, 'Mobile App Developer', 'Los Angeles, CA', 'Build iOS and Android applications using native and cross-platform technologies.', 24.50, 25, 26, 3.1, '2025-03-20 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(13, 'Supply Chain Analyst', 'Atlanta, GA', 'Optimize logistics operations and analyze supply chain performance metrics.', 19.00, 12, 15, 2.9, 
'2025-02-22 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Logistics'), +(14, 'Game Developer Intern', 'San Diego, CA', 'Create game mechanics and features using Unity and C# programming.', 22.00, 22, 21, 3.0, '2025-03-25 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Gaming'), +(15, 'Healthcare Data Analyst', 'Philadelphia, PA', 'Analyze patient data and healthcare outcomes for medical research.', 21.50, 12, 37, 3.3, '2025-02-14 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Healthcare'), +(16, 'Full Stack Developer', 'Denver, CO', 'Build end-to-end web applications using React, Node.js, and databases.', 25.50, 3, 5, 3.2, '2025-03-08 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(17, 'Quality Assurance Co-op', 'Miami, FL', 'Test software applications and develop automated testing frameworks.', 18.00, 3, 10, 2.8, '2025-02-26 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(18, 'Robotics Engineer Intern', 'Detroit, MI', 'Design and program robotic systems for manufacturing automation.', 24.00, 21, 11, 3.4, '2025-03-12 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Manufacturing'), +(19, 'Digital Marketing Co-op', 'Nashville, TN', 'Manage social media campaigns and analyze digital marketing performance.', 17.50, 17, 12, 2.7, '2025-02-16 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Marketing'), +(20, 'Cloud Engineer Intern', 'Phoenix, AZ', 'Deploy and manage cloud infrastructure using AWS and Azure platforms.', 27.00, 8, 33, 3.5, '2025-03-18 23:59:59', '2025-06-01', '2025-12-01', FALSE, 'Technology'), +(21, 'Product Manager Co-op', 'San Jose, CA', 'Assist with product roadmap planning and coordinate development teams.', 23.50, 15, 17, 3.1, '2025-02-08 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Technology'), +(22, 'Backend Developer Intern', 'Portland, OR', 'Build server-side applications and APIs using Java and microservices.', 24.00, 2, 6, 3.0, '2025-02-24 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Technology'), +(23, 'Business Intelligence', 'Dallas, TX', 'Create dashboards and reports for executive decision making.', 20.50, 37, 13, 3.2, '2025-03-22 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Finance'), +(24, 'Frontend Developer Co-op', 'Tampa, FL', 'Develop user interfaces using React and modern CSS frameworks.', 22.00, 4, 29, 2.9, '2025-02-11 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Technology'), +(25, 'Research Assistant', 'Baltimore, MD', 'Support biomedical research projects and data collection efforts.', 16.50, 12, 38, 3.4, '2025-03-14 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Healthcare'), +(26, 'Systems Administrator', 'Salt Lake City, UT', 'Maintain IT infrastructure and provide technical support services.', 21.00, 8, 9, 3.0, '2025-02-19 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Technology'), +(27, 'Finance Intern', 'Minneapolis, MN', 'Assist with financial planning and budget analysis for corporate clients.', 19.00, 13, 12, 3.3, '2025-03-28 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Finance'), +(28, 'Software QA Engineer', 'Orlando, FL', 'Design test cases and automate testing procedures for software releases.', 20.00, 10, 3, 3.1, '2025-02-13 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Technology'), +(29, 'Data Engineer Co-op', 'Charlotte, NC', 'Build data pipelines and manage ETL processes for analytics platforms.', 25.00, 1, 7, 3.4, '2025-03-30 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Technology'), +(30, 'Project Coordinator', 'Kansas City, MO', 'Coordinate cross-functional teams and manage project 
timelines.', 18.00, 15, 16, 2.8, '2025-02-21 23:59:59', '2025-09-01', '2026-03-01', FALSE, 'Consulting'), +(31, 'Web Developer Intern', 'Las Vegas, NV', 'Create responsive websites and web applications for client projects.', 21.50, 30, 4, 3.0, '2025-03-06 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(32, 'Security Analyst Co-op', 'Raleigh, NC', 'Monitor security systems and investigate potential cyber threats.', 23.00, 6, 8, 3.2, '2025-02-17 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(33, 'Operations Analyst', 'Columbus, OH', 'Improve operational efficiency and analyze business performance metrics.', 19.50, 13, 15, 3.0, '2025-03-24 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Operations'), +(34, 'Android Developer', 'Indianapolis, IN', 'Develop native Android applications using Kotlin and Java.', 23.50, 26, 2, 3.1, '2025-02-09 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(35, 'Database Administrator', 'Memphis, TN', 'Manage database systems and optimize query performance.', 22.00, 6, 31, 3.3, '2025-03-16 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(36, 'Sales Analytics Intern', 'Louisville, KY', 'Analyze sales data and create performance reports for management.', 17.00, 12, 14, 2.9, '2025-02-23 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Sales'), +(37, 'IoT Developer Co-op', 'Oklahoma City, OK', 'Develop Internet of Things applications and sensor integration systems.', 24.50, 3, 1, 3.2, '2025-03-26 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(38, 'Technical Writer', 'Richmond, VA', 'Create technical documentation and user manuals for software products.', 18.50, 17, 14, 2.8, '2025-02-15 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(39, 'Blockchain Developer', 'Providence, RI', 'Build decentralized applications and smart contracts using blockchain technology.', 29.00, 3, 22, 3.5, '2025-03-11 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Finance'), +(40, 'AI Research Intern', 'Hartford, CT', 'Conduct artificial intelligence research and develop machine learning algorithms.', 26.50, 11, 1, 3.6, '2025-02-27 23:59:59', '2026-01-01', '2026-07-01', FALSE, 'Technology'), +(41, 'Network Engineer Co-op', 'Bridgeport, CT', 'Design and maintain network infrastructure and troubleshoot connectivity issues.', 21.50, 8, 32, 3.1, '2025-03-19 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Technology'), +(42, 'HR Analytics Intern', 'Newark, NJ', 'Analyze employee data and support human resources decision making.', 17.50, 12, 13, 2.9, '2025-02-12 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Human Resources'), +(43, 'Manufacturing Engineer', 'Buffalo, NY', 'Optimize production processes and implement lean manufacturing principles.', 22.50, 15, 12, 3.3, '2025-03-21 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Manufacturing'), +(44, 'Content Creator Co-op', 'Syracuse, NY', 'Develop multimedia content and manage brand social media presence.', 16.00, 39, 17, 2.7, '2025-02-20 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Marketing'), +(45, 'Automation Engineer', 'Rochester, NY', 'Design automated systems and implement robotic process automation.', 25.50, 1, 21, 3.4, '2025-03-13 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Manufacturing'), +(46, 'Customer Success Intern', 'Albany, NY', 'Support customer onboarding and analyze customer satisfaction metrics.', 17.00, 17, 12, 2.8, '2025-02-25 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Customer Service'), +(47, 'Computer Vision Co-op', 'Burlington, 
VT', 'Develop image processing algorithms and computer vision applications.', 27.50, 11, 1, 3.5, '2025-03-17 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Technology'), +(48, 'Product Design Intern', 'Manchester, NH', 'Create product prototypes and conduct user experience research.', 20.00, 40, 39, 3.0, '2025-02-14 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Design'), +(49, 'Infrastructure Engineer', 'Portland, ME', 'Manage cloud infrastructure and implement DevOps best practices.', 24.00, 8, 35, 3.2, '2025-03-23 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Technology'), +(50, 'Business Development', 'Concord, NH', 'Identify new business opportunities and support partnership development.', 18.50, 17, 15, 2.9, '2025-02-18 23:59:59', '2026-06-01', '2026-12-01', FALSE, 'Business'); + +-- 6. Applications table (60 rows - references coopPositions) +INSERT INTO applications (applicationId, dateTimeApplied, status, resume, gpa, coverLetter, coopPositionId) VALUES +-- Charlie Stout (CS major, GPA 3.7, has Python, Java, JavaScript, React, etc.) +(1, '2025-01-15 10:30:00', 'Submitted', 'Resume content for Charlie Stout...', 3.7, 'Cover letter expressing interest in software development...', 1), +(2, '2025-01-16 14:20:00', 'Under Review', 'Resume content for Charlie Stout...', 3.7, 'Cover letter for full stack developer...', 16), +-- Liam Williams (Business major, GPA 3.5, has Excel, PowerPoint, Project Management, etc.) +(3, '2025-01-18 09:45:00', 'Submitted', 'Resume content for Liam Williams...', 3.5, 'Cover letter highlighting business experience...', 5), +(4, '2025-01-20 16:15:00', 'Draft', 'Resume content for Liam Williams...', 3.5, NULL, 10), +-- Sophia Brown (Engineering major, GPA 3.8, has C++, Excel, Project Management, etc.) +(5, '2025-01-22 11:30:00', 'Submitted', 'Resume content for Sophia Brown...', 3.8, 'Cover letter for engineering role...', 18), +(6, '2025-01-25 08:45:00', 'Under Review', 'Resume content for Sophia Brown...', 3.8, 'Manufacturing engineering application...', 43), +-- Noah Davis (Data Science major, GPA 3.9, has Python, ML, Data Analysis, etc.) 
+(7, '2025-01-28 13:20:00', 'Submitted', 'Resume content for Noah Davis...', 3.9, 'Machine learning interest cover letter...', 11), +(8, '2025-01-30 10:00:00', 'Rejected', 'Resume content for Noah Davis...', 3.9, 'Data engineering application...', 29), +-- Additional placement data for analytics +(61, '2025-02-01 09:00:00', 'Accepted', 'Resume content for Charlie Stout...', 3.7, 'Software engineer acceptance...', 1), +(62, '2025-02-02 10:00:00', 'Rejected', 'Resume content for Liam Williams...', 3.5, 'Business analyst rejection...', 5), +(63, '2025-02-03 11:00:00', 'Accepted', 'Resume content for Sophia Brown...', 3.8, 'Engineering acceptance...', 18), +(64, '2025-02-04 12:00:00', 'Accepted', 'Resume content for Noah Davis...', 3.9, 'Data scientist acceptance...', 11), +(65, '2025-02-05 13:00:00', 'Rejected', 'Resume content for Olivia Miller...', 3.4, 'Marketing rejection...', 6), +(66, '2025-02-06 14:00:00', 'Accepted', 'Resume content for Emma Davis...', 3.6, 'Finance acceptance...', 10), + +-- Additional comprehensive application data for expanded analytics +-- Mason Wilson (userId=6) - Cybersecurity applications +(67, '2025-02-07 09:00:00', 'Accepted', 'Resume content for Mason Wilson...', 3.6, 'Cybersecurity acceptance...', 32), +(68, '2025-02-08 10:00:00', 'Rejected', 'Resume content for Mason Wilson...', 3.6, 'Security analyst rejection...', 41), + +-- Ava Moore (userId=7) - Biomedical Engineering applications +(69, '2025-02-09 11:00:00', 'Accepted', 'Resume content for Ava Moore...', 3.8, 'Research acceptance...', 25), +(70, '2025-02-10 12:00:00', 'Rejected', 'Resume content for Ava Moore...', 3.8, 'Healthcare rejection...', 15), + +-- Ethan Taylor (userId=8) - Finance applications +(71, '2025-02-11 13:00:00', 'Accepted', 'Resume content for Ethan Taylor...', 3.5, 'Finance acceptance...', 27), +(72, '2025-02-12 14:00:00', 'Rejected', 'Resume content for Ethan Taylor...', 3.5, 'Investment rejection...', 23), + +-- Isabella Anderson (userId=9) - Psychology/HR applications +(73, '2025-02-13 15:00:00', 'Accepted', 'Resume content for Isabella Anderson...', 3.4, 'HR analytics acceptance...', 42), +(74, '2025-02-14 16:00:00', 'Rejected', 'Resume content for Isabella Anderson...', 3.4, 'People analytics rejection...', 33), + +-- James Thomas (userId=10) - Mechanical Engineering applications +(75, '2025-02-15 17:00:00', 'Accepted', 'Resume content for James Thomas...', 3.9, 'Manufacturing acceptance...', 43), +(76, '2025-02-16 18:00:00', 'Rejected', 'Resume content for James Thomas...', 3.9, 'Robotics rejection...', 18), + +-- Mia Jackson (userId=11) - CS Senior applications +(77, '2025-02-17 09:00:00', 'Accepted', 'Resume content for Mia Jackson...', 3.8, 'Software developer acceptance...', 1), +(78, '2025-02-18 10:00:00', 'Rejected', 'Resume content for Mia Jackson...', 3.8, 'Full stack rejection...', 16), +(79, '2025-02-19 11:00:00', 'Accepted', 'Resume content for Mia Jackson...', 3.8, 'Blockchain acceptance...', 39), + +-- Lucas White (userId=12) - Business with Data Science minor +(80, '2025-02-20 12:00:00', 'Accepted', 'Resume content for Lucas White...', 3.6, 'Business analyst acceptance...', 10), +(81, '2025-02-21 13:00:00', 'Rejected', 'Resume content for Lucas White...', 3.6, 'Data analyst rejection...', 2), + +-- Charlotte Harris (userId=13) - Environmental Engineering +(82, '2025-02-22 14:00:00', 'Accepted', 'Resume content for Charlotte Harris...', 3.7, 'Environmental acceptance...', 8), +(83, '2025-02-23 15:00:00', 'Rejected', 'Resume content for Charlotte Harris...', 3.7, 
'Sustainability rejection...', 33), + +-- Benjamin Martin (userId=14) - Information Systems Senior +(84, '2025-02-24 16:00:00', 'Accepted', 'Resume content for Benjamin Martin...', 3.5, 'Systems admin acceptance...', 26), +(85, '2025-02-25 17:00:00', 'Rejected', 'Resume content for Benjamin Martin...', 3.5, 'Database admin rejection...', 35), + +-- Amelia Garcia (userId=15) - Physics major +(86, '2025-02-26 18:00:00', 'Accepted', 'Resume content for Amelia Garcia...', 3.9, 'Research assistant acceptance...', 25), +(87, '2025-02-27 09:00:00', 'Rejected', 'Resume content for Amelia Garcia...', 3.9, 'Data science rejection...', 11), + +-- Henry Rodriguez (userId=16) - CS Sophomore +(88, '2025-02-28 10:00:00', 'Accepted', 'Resume content for Henry Rodriguez...', 3.3, 'QA engineer acceptance...', 17), +(89, '2025-03-01 11:00:00', 'Rejected', 'Resume content for Henry Rodriguez...', 3.3, 'Software dev rejection...', 1), + +-- Harper Lewis (userId=17) - Design Senior +(90, '2025-03-02 12:00:00', 'Accepted', 'Resume content for Harper Lewis...', 3.8, 'UX design acceptance...', 6), +(91, '2025-03-03 13:00:00', 'Rejected', 'Resume content for Harper Lewis...', 3.8, 'Product design rejection...', 48), + +-- Alexander Lee (userId=18) - Electrical Engineering +(92, '2025-03-04 14:00:00', 'Accepted', 'Resume content for Alexander Lee...', 3.7, 'Robotics acceptance...', 18), +(93, '2025-03-05 15:00:00', 'Rejected', 'Resume content for Alexander Lee...', 3.7, 'Hardware rejection...', 45), + +-- Evelyn Walker (userId=19) - International Business +(94, '2025-03-06 16:00:00', 'Accepted', 'Resume content for Evelyn Walker...', 3.6, 'Business dev acceptance...', 50), +(95, '2025-03-07 17:00:00', 'Rejected', 'Resume content for Evelyn Walker...', 3.6, 'Marketing rejection...', 3), + +-- Sebastian Hall (userId=20) - Data Science Senior +(96, '2025-03-08 18:00:00', 'Accepted', 'Resume content for Sebastian Hall...', 3.9, 'Data analyst acceptance...', 2), +(97, '2025-03-09 09:00:00', 'Rejected', 'Resume content for Sebastian Hall...', 3.9, 'ML engineer rejection...', 11), +(98, '2025-03-10 10:00:00', 'Accepted', 'Resume content for Sebastian Hall...', 3.9, 'Computer vision acceptance...', 47), + +-- Aria Allen (userId=21) - Marketing major +(99, '2025-03-11 11:00:00', 'Accepted', 'Resume content for Aria Allen...', 3.4, 'Marketing assistant acceptance...', 3), +(100, '2025-03-12 12:00:00', 'Rejected', 'Resume content for Aria Allen...', 3.4, 'Digital marketing rejection...', 19), + +-- Owen Young (userId=22) - CS Sophomore +(101, '2025-03-13 13:00:00', 'Accepted', 'Resume content for Owen Young...', 3.5, 'QA engineer acceptance...', 28), +(102, '2025-03-14 14:00:00', 'Rejected', 'Resume content for Owen Young...', 3.5, 'Frontend dev rejection...', 24), + +-- Luna King (userId=23) - Business Senior with Finance focus +(103, '2025-03-15 15:00:00', 'Accepted', 'Resume content for Luna King...', 3.7, 'Finance intern acceptance...', 27), +(104, '2025-03-16 16:00:00', 'Rejected', 'Resume content for Luna King...', 3.7, 'Business intelligence rejection...', 23), +(105, '2025-03-17 17:00:00', 'Accepted', 'Resume content for Luna King...', 3.7, 'Blockchain dev acceptance...', 39), + +-- Grayson Wright (userId=24) - Cybersecurity major +(106, '2025-03-18 18:00:00', 'Accepted', 'Resume content for Grayson Wright...', 3.6, 'Security analyst acceptance...', 32), +(107, '2025-03-19 09:00:00', 'Rejected', 'Resume content for Grayson Wright...', 3.6, 'Penetration testing rejection...', 41), + +-- Chloe Lopez (userId=25) - Biology 
major +(108, '2025-03-20 10:00:00', 'Accepted', 'Resume content for Chloe Lopez...', 3.8, 'Healthcare data acceptance...', 15), +(109, '2025-03-21 11:00:00', 'Rejected', 'Resume content for Chloe Lopez...', 3.8, 'Biotech rejection...', 7), +-- Olivia Miller (Marketing major, GPA 3.4, has Excel, PowerPoint, Communication, etc.) +(9, '2025-02-01 15:30:00', 'Draft', 'Resume content for Olivia Miller...', 3.4, NULL, 3), +(10, '2025-02-03 12:45:00', 'Submitted', 'Resume content for Olivia Miller...', 3.4, 'Digital marketing interest...', 19); + +-- 7. Skill Details table (sample rows for testing - references skills and users) +INSERT INTO skillDetails (skillId, studentId, proficiencyLevel) VALUES +(1, 1, 4), (2, 1, 3), (3, 1, 5), (4, 1, 4), (5, 1, 3), +(6, 2, 4), (7, 2, 3), (8, 2, 5), (9, 2, 4), (10, 2, 3), +(11, 3, 4), (12, 3, 3), (13, 3, 5), (14, 3, 4), (15, 3, 3), +(16, 4, 4), (17, 4, 3), (18, 4, 5), (19, 4, 4), (20, 4, 3), +(21, 5, 4), (22, 5, 3), (23, 5, 5), (24, 5, 4), (25, 5, 3), +(26, 6, 4), (27, 6, 3), (28, 6, 5), (29, 6, 4), (30, 6, 3), +(31, 7, 4), (32, 7, 3), (33, 7, 5), (34, 7, 4), (35, 7, 3), +(36, 8, 4), (37, 8, 3), (38, 8, 5), (39, 8, 4), (40, 8, 3), +(1, 9, 5), (2, 9, 4), (3, 9, 3), (4, 9, 5), (5, 9, 4), +(6, 10, 3), (7, 10, 5), (8, 10, 4), (9, 10, 3), (10, 10, 5), +(11, 11, 4), (12, 11, 3), (13, 11, 5), (14, 11, 4), (15, 11, 3), +(16, 12, 5), (17, 12, 4), (18, 12, 3), (19, 12, 5), (20, 12, 4), +(21, 13, 3), (22, 13, 5), (23, 13, 4), (24, 13, 3), (25, 13, 5), +(26, 14, 4), (27, 14, 3), (28, 14, 5), (29, 14, 4), (30, 14, 3), +(31, 15, 5), (32, 15, 4), (33, 15, 3), (34, 15, 5), (35, 15, 4), +(36, 16, 3), (37, 16, 5), (38, 16, 4), (39, 16, 3), (40, 16, 5), +(1, 17, 4), (2, 17, 3), (3, 17, 5), (4, 17, 4), (5, 17, 3), +(6, 18, 5), (7, 18, 4), (8, 18, 3), (9, 18, 5), (10, 18, 4), +(11, 19, 3), (12, 19, 5), (13, 19, 4), (14, 19, 3), (15, 19, 5), +(16, 20, 4), (17, 20, 3), (18, 20, 5), (19, 20, 4), (20, 20, 3), +(21, 21, 5), (22, 21, 4), (23, 21, 3), (24, 21, 5), (25, 21, 4), +(26, 22, 3), (27, 22, 5), (28, 22, 4), (29, 22, 3), (30, 22, 5), +(31, 23, 4), (32, 23, 3), (33, 23, 5), (34, 23, 4), (35, 23, 3), +(36, 24, 5), (37, 24, 4), (38, 24, 3), (39, 24, 5), (40, 24, 4), +(1, 25, 3), (2, 25, 5), (3, 25, 4), (4, 25, 3), (5, 25, 5), +(6, 26, 4), (7, 26, 3), (8, 26, 5), (9, 26, 4), (10, 26, 3), +(11, 27, 5), (12, 27, 4), (13, 27, 3), (14, 27, 5), (15, 27, 4), +(16, 28, 3), (17, 28, 5), (18, 28, 4), (19, 28, 3), (20, 28, 5), +(21, 29, 4), (22, 29, 3), (23, 29, 5), (24, 29, 4), (25, 29, 3), +(26, 30, 5), (27, 30, 4), (28, 30, 3), (29, 30, 5), (30, 30, 4); + + + + + +-- 7. Creates Position relationships (bridge table - references users and coopPositions) +INSERT INTO createsPos (employerId, coopPositionId) VALUES +(37, 1), (37, 2), (37, 3), (37, 4), (37, 5), (37, 6), (37, 7), (37, 8), (37, 9), (37, 10), +(38, 11), (38, 12), (38, 13), (38, 14), (38, 15), (38, 16), (38, 17), (38, 18), (38, 19), (38, 20), +(39, 21), (39, 22), (39, 23), (39, 24), (39, 25), (39, 26), (39, 27), (39, 28), (39, 29), (39, 30), +(40, 31), (40, 32), (40, 33), (40, 34), (40, 35), (40, 36), (40, 37), (40, 38), (40, 39), (40, 40), +(41, 41), (41, 42), (41, 43), (41, 44), (41, 45), (41, 46), (41, 47), (41, 48), (41, 49), (41, 50); + +-- 7. 
Advisor-Advisee relationships (bridge table - references users) +INSERT INTO advisor_advisee (advisorId, studentId) VALUES +-- Sarah Martinez (advisor 31) advises students 1-25 (expanded for comprehensive analytics) +(31, 1), (31, 2), (31, 3), (31, 4), (31, 5), (31, 6), (31, 7), (31, 8), (31, 9), (31, 10), +(31, 11), (31, 12), (31, 13), (31, 14), (31, 15), (31, 16), (31, 17), (31, 18), (31, 19), (31, 20), +(31, 21), (31, 22), (31, 23), (31, 24), (31, 25), +-- Michael Chen (advisor 32) advises students 26-30 (reduced set) +(32, 26), (32, 27), (32, 28), (32, 29), (32, 30), +-- Jennifer Kim (advisor 33) advises some graduate students (if any) +(33, 31), (33, 32), (33, 33), (33, 34), (33, 35); + +-- 8. Applies To App relationships (bridge table - references applications and users) +INSERT INTO appliesToApp (applicationId, studentId) VALUES +(1, 1), (2, 1), (3, 2), (4, 2), (5, 3), (6, 3), (7, 4), (8, 4), (9, 5), (10, 5), +-- Additional placement data relationships +(61, 1), (62, 2), (63, 3), (64, 4), (65, 5), (66, 6), +-- Comprehensive application relationships for expanded dataset +(67, 6), (68, 6), (69, 7), (70, 7), (71, 8), (72, 8), (73, 9), (74, 9), (75, 10), (76, 10), +(77, 11), (78, 11), (79, 11), (80, 12), (81, 12), (82, 13), (83, 13), (84, 14), (85, 14), +(86, 15), (87, 15), (88, 16), (89, 16), (90, 17), (91, 17), (92, 18), (93, 18), (94, 19), (95, 19), +(96, 20), (97, 20), (98, 20), (99, 21), (100, 21), (101, 22), (102, 22), (103, 23), (104, 23), (105, 23), +(106, 24), (107, 24), (108, 25), (109, 25); + +-- 9. Past Co-op Experience (workedAtPos table - completed co-op positions) +INSERT INTO workedAtPos (studentId, coopPositionId, startDate, endDate, companyRating) VALUES +-- Original entries (keep existing) +(1, 2, '2024-01-15', '2024-06-15', 5), +(2, 7, '2024-03-01', '2024-08-31', 4), +(3, 19, '2023-09-01', '2024-02-29', 5), +(4, 12, '2024-01-01', '2024-06-30', 4), +(5, 8, '2023-06-01', '2023-12-31', 3), + +-- Additional entries for more comprehensive data + +-- Mason Wilson (userId=6) - Cybersecurity major, completed security co-op +(6, 4, '2023-09-01', '2024-02-29', 4), + +-- Ava Moore (userId=7) - Biomedical Engineering, completed research co-op +(7, 7, '2023-06-01', '2023-12-31', 5), + +-- Ethan Taylor (userId=8) - Finance major, completed financial analyst co-op +(8, 5, '2024-01-15', '2024-06-15', 4), + +-- Isabella Anderson (userId=9) - Psychology major, completed HR analytics co-op +(9, 42, '2023-06-01', '2023-12-31', 3), + +-- James Thomas (userId=10) - Mechanical Engineering, completed manufacturing co-op +(10, 43, '2024-01-01', '2024-06-30', 5), + +-- Mia Jackson (userId=11) - CS Senior, completed multiple co-ops +(11, 1, '2023-01-15', '2023-06-15', 4), -- First co-op - Software Developer +(11, 16, '2023-09-01', '2024-02-29', 5), -- Second co-op - Full Stack Developer + +-- Lucas White (userId=12) - Business major with Data Science minor +(12, 10, '2023-06-01', '2023-12-31', 4), -- Business Analyst co-op + +-- Charlotte Harris (userId=13) - Environmental Engineering +(13, 8, '2024-01-15', '2024-06-15', 5), -- Environmental Engineer co-op + +-- Benjamin Martin (userId=14) - Information Systems Senior +(14, 26, '2023-01-15', '2023-06-15', 3), -- Systems Administrator co-op +(14, 35, '2023-09-01', '2024-02-29', 4), -- Database Administrator co-op + +-- Amelia Garcia (userId=15) - Physics major +(15, 25, '2023-06-01', '2023-12-31', 4), -- Research Assistant co-op + +-- Henry Rodriguez (userId=16) - CS Sophomore (first co-op) +(16, 17, '2024-01-01', '2024-06-30', 3), -- QA co-op 
(entry level) + +-- Harper Lewis (userId=17) - Design Senior +(17, 6, '2023-01-15', '2023-06-15', 5), -- UX Design co-op +(17, 48, '2023-09-01', '2024-02-29', 4), -- Product Design co-op + +-- Alexander Lee (userId=18) - Electrical Engineering +(18, 18, '2023-06-01', '2023-12-31', 4), -- Robotics Engineer co-op + +-- Evelyn Walker (userId=19) - International Business +(19, 50, '2024-01-15', '2024-06-15', 4), -- Business Development co-op + +-- Sebastian Hall (userId=20) - Data Science Senior +(20, 2, '2023-01-15', '2023-06-15', 5), -- Data Analyst co-op +(20, 11, '2023-09-01', '2024-02-29', 5), -- Machine Learning co-op + +-- Aria Allen (userId=21) - Marketing major +(21, 3, '2023-06-01', '2023-12-31', 3), -- Marketing Assistant co-op +(21, 19, '2024-01-01', '2024-06-30', 4), -- Digital Marketing co-op + +-- Owen Young (userId=22) - CS Sophomore (first co-op) +(22, 28, '2024-01-15', '2024-06-15', 4), -- Software QA Engineer + +-- Luna King (userId=23) - Business Senior with Finance focus +(23, 27, '2023-01-15', '2023-06-15', 4), -- Finance Intern +(23, 23, '2023-09-01', '2024-02-29', 5), -- Business Intelligence + +-- Grayson Wright (userId=24) - Cybersecurity major +(24, 32, '2023-06-01', '2023-12-31', 4), -- Security Analyst co-op + +-- Chloe Lopez (userId=25) - Biology major +(25, 15, '2024-01-01', '2024-06-30', 5), -- Healthcare Data Analyst + +-- Carter Hill (userId=26) - Information Systems Senior +(26, 41, '2023-01-15', '2023-06-15', 3), -- Network Engineer co-op + +-- Zoey Scott (userId=27) - Environmental Engineering +(27, 33, '2023-06-01', '2023-12-31', 4), -- Operations Analyst co-op + +-- Luke Green (userId=28) - Chemistry major +(28, 25, '2024-01-15', '2024-06-15', 4), -- Research Assistant co-op + +-- Lily Adams (userId=29) - Design Senior +(29, 44, '2023-01-15', '2023-06-15', 3), -- Content Creator co-op + +-- Jack Baker (userId=30) - CS Junior +(30, 24, '2023-09-01', '2024-02-29', 4), -- Frontend Developer co-op + +-- Additional second co-ops for some students to show progression + +-- Charlie Stout (userId=1) - second co-op after first successful one +(1, 22, '2024-09-01', '2025-02-28', NULL), -- Backend Developer (current/recent) + +-- Sophia Brown (userId=3) - second engineering co-op +(3, 45, '2024-06-01', '2024-11-30', NULL), -- Automation Engineer (current/recent) + +-- Noah Davis (userId=4) - advanced data science co-op +(4, 40, '2024-09-01', '2025-02-28', NULL), -- AI Research Intern (current/recent) + +-- Mia Jackson (userId=11) - third co-op as senior +(11, 39, '2024-06-01', '2024-11-30', NULL), -- Blockchain Developer (current/recent) + +-- Sebastian Hall (userId=20) - third advanced co-op +(20, 47, '2024-06-01', '2024-11-30', NULL), -- Computer Vision co-op (current/recent) + +-- Luna King (userId=23) - advanced finance role +(23, 39, '2024-06-01', '2024-11-30', NULL); -- Blockchain Developer (current/recent) + + +# this is maybe fixed \ No newline at end of file diff --git a/database-files/02_northwind-data.sql b/database-files/02_northwind-data.sql deleted file mode 100644 index e4477299ac..0000000000 --- a/database-files/02_northwind-data.sql +++ /dev/null @@ -1,654 +0,0 @@ -# -# Converted from MS Access 2010 Northwind database (northwind.accdb) using -# Bullzip MS Access to MySQL Version 5.1.242. 
http://www.bullzip.com -# -# CHANGES MADE AFTER INITIAL CONVERSION -# * column and row names in CamelCase converted to lower_case_with_underscore -# * space and slash ("/") in table and column names replaced with _underscore_ -# * id column names converted to "id" -# * foreign key column names converted to xxx_id -# * variables of type TIMESTAMP converted to DATETIME to avoid TIMESTAMP -# range limitation (1997 - 2038 UTC), and other limitations. -# * unique and foreign key checks disabled while loading data -# -#------------------------------------------------------------------ -# - -SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0; -SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0; - -USE `northwind`; - -# -# Dumping data for table 'customers' -# - -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (1, 'Company A', 'Bedecs', 'Anna', NULL, 'Owner', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 1st Street', 'Seattle', 'WA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (2, 'Company B', 'Gratacos Solsona', 'Antonio', NULL, 'Owner', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 2nd Street', 'Boston', 'MA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (3, 'Company C', 'Axen', 'Thomas', NULL, 'Purchasing Representative', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 3rd Street', 'Los Angelas', 'CA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (4, 'Company D', 'Lee', 'Christina', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 4th Street', 'New York', 'NY', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (5, 'Company E', 'O’Donnell', 'Martin', NULL, 'Owner', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 5th Street', 'Minneapolis', 'MN', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (6, 'Company F', 'Pérez-Olaeta', 'Francisco', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', NULL, NULL, ''); -INSERT INTO 
`customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (7, 'Company G', 'Xie', 'Ming-Yang', NULL, 'Owner', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 7th Street', 'Boise', 'ID', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (8, 'Company H', 'Andersen', 'Elizabeth', NULL, 'Purchasing Representative', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 8th Street', 'Portland', 'OR', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (9, 'Company I', 'Mortensen', 'Sven', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 9th Street', 'Salt Lake City', 'UT', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (10, 'Company J', 'Wacker', 'Roland', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 10th Street', 'Chicago', 'IL', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (11, 'Company K', 'Krschne', 'Peter', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 11th Street', 'Miami', 'FL', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (12, 'Company L', 'Edwards', 'John', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '123 12th Street', 'Las Vegas', 'NV', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (13, 'Company M', 'Ludick', 'Andre', NULL, 'Purchasing Representative', '(123)555-0100', NULL, NULL, '(123)555-0101', '456 13th Street', 'Memphis', 'TN', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (14, 'Company N', 'Grilo', 'Carlos', NULL, 'Purchasing 
Representative', '(123)555-0100', NULL, NULL, '(123)555-0101', '456 14th Street', 'Denver', 'CO', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (15, 'Company O', 'Kupkova', 'Helena', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '456 15th Street', 'Honolulu', 'HI', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (16, 'Company P', 'Goldschmidt', 'Daniel', NULL, 'Purchasing Representative', '(123)555-0100', NULL, NULL, '(123)555-0101', '456 16th Street', 'San Francisco', 'CA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (17, 'Company Q', 'Bagel', 'Jean Philippe', NULL, 'Owner', '(123)555-0100', NULL, NULL, '(123)555-0101', '456 17th Street', 'Seattle', 'WA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (18, 'Company R', 'Autier Miconi', 'Catherine', NULL, 'Purchasing Representative', '(123)555-0100', NULL, NULL, '(123)555-0101', '456 18th Street', 'Boston', 'MA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (19, 'Company S', 'Eggerer', 'Alexander', NULL, 'Accounting Assistant', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 19th Street', 'Los Angelas', 'CA', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (20, 'Company T', 'Li', 'George', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 20th Street', 'New York', 'NY', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (21, 'Company U', 'Tham', 'Bernard', NULL, 'Accounting Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 21th Street', 'Minneapolis', 'MN', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, 
`address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (22, 'Company V', 'Ramos', 'Luciana', NULL, 'Purchasing Assistant', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 22th Street', 'Milwaukee', 'WI', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (23, 'Company W', 'Entin', 'Michael', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 23th Street', 'Portland', 'OR', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (24, 'Company X', 'Hasselberg', 'Jonas', NULL, 'Owner', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 24th Street', 'Salt Lake City', 'UT', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (25, 'Company Y', 'Rodman', 'John', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 25th Street', 'Chicago', 'IL', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (26, 'Company Z', 'Liu', 'Run', NULL, 'Accounting Assistant', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 26th Street', 'Miami', 'FL', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (27, 'Company AA', 'Toh', 'Karen', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 27th Street', 'Las Vegas', 'NV', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (28, 'Company BB', 'Raghav', 'Amritansh', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 28th Street', 'Memphis', 'TN', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `customers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (29, 'Company CC', 'Lee', 'Soo Jung', NULL, 'Purchasing Manager', '(123)555-0100', NULL, NULL, '(123)555-0101', '789 29th Street', 'Denver', 'CO', '99999', 'USA', NULL, NULL, ''); -# 29 records - -# -# Dumping data for 
table 'employee_privileges' -# - -INSERT INTO `employee_privileges` (`employee_id`, `privilege_id`) VALUES (2, 2); -# 1 records - -# -# Dumping data for table 'employees' -# - -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (1, 'Northwind Traders', 'Freehafer', 'Nancy', 'nancy@northwindtraders.com', 'Sales Representative', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 1st Avenue', 'Seattle', 'WA', '99999', 'USA', '#http://northwindtraders.com#', NULL, ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (2, 'Northwind Traders', 'Cencini', 'Andrew', 'andrew@northwindtraders.com', 'Vice President, Sales', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 2nd Avenue', 'Bellevue', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', 'Joined the company as a sales representative, was promoted to sales manager and was then named vice president of sales.', ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (3, 'Northwind Traders', 'Kotas', 'Jan', 'jan@northwindtraders.com', 'Sales Representative', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 3rd Avenue', 'Redmond', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', 'Was hired as a sales associate and was promoted to sales representative.', ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (4, 'Northwind Traders', 'Sergienko', 'Mariya', 'mariya@northwindtraders.com', 'Sales Representative', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 4th Avenue', 'Kirkland', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', NULL, ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (5, 'Northwind Traders', 'Thorpe', 'Steven', 'steven@northwindtraders.com', 'Sales Manager', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 5th Avenue', 'Seattle', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', 'Joined the company as a sales representative and was promoted to sales manager. 
Fluent in French.', ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (6, 'Northwind Traders', 'Neipper', 'Michael', 'michael@northwindtraders.com', 'Sales Representative', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 6th Avenue', 'Redmond', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', 'Fluent in Japanese and can read and write French, Portuguese, and Spanish.', ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (7, 'Northwind Traders', 'Zare', 'Robert', 'robert@northwindtraders.com', 'Sales Representative', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 7th Avenue', 'Seattle', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', NULL, ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (8, 'Northwind Traders', 'Giussani', 'Laura', 'laura@northwindtraders.com', 'Sales Coordinator', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 8th Avenue', 'Redmond', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', 'Reads and writes French.', ''); -INSERT INTO `employees` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (9, 'Northwind Traders', 'Hellung-Larsen', 'Anne', 'anne@northwindtraders.com', 'Sales Representative', '(123)555-0100', '(123)555-0102', NULL, '(123)555-0103', '123 9th Avenue', 'Seattle', 'WA', '99999', 'USA', 'http://northwindtraders.com#http://northwindtraders.com/#', 'Fluent in French and German.', ''); -# 9 records - -# -# Dumping data for table 'inventory_transaction_types' -# - -INSERT INTO `inventory_transaction_types` (`id`, `type_name`) VALUES (1, 'Purchased'); -INSERT INTO `inventory_transaction_types` (`id`, `type_name`) VALUES (2, 'Sold'); -INSERT INTO `inventory_transaction_types` (`id`, `type_name`) VALUES (3, 'On Hold'); -INSERT INTO `inventory_transaction_types` (`id`, `type_name`) VALUES (4, 'Waste'); -# 4 records - -# -# Dumping data for table 'inventory_transactions' -# - -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (35, 1, '2006-03-22 16:02:28', '2006-03-22 16:02:28', 80, 75, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (36, 1, '2006-03-22 16:02:48', '2006-03-22 16:02:48', 72, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, 
`transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (37, 1, '2006-03-22 16:03:04', '2006-03-22 16:03:04', 52, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (38, 1, '2006-03-22 16:03:09', '2006-03-22 16:03:09', 56, 120, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (39, 1, '2006-03-22 16:03:14', '2006-03-22 16:03:14', 57, 80, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (40, 1, '2006-03-22 16:03:40', '2006-03-22 16:03:40', 6, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (41, 1, '2006-03-22 16:03:47', '2006-03-22 16:03:47', 7, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (42, 1, '2006-03-22 16:03:54', '2006-03-22 16:03:54', 8, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (43, 1, '2006-03-22 16:04:02', '2006-03-22 16:04:02', 14, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (44, 1, '2006-03-22 16:04:07', '2006-03-22 16:04:07', 17, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (45, 1, '2006-03-22 16:04:12', '2006-03-22 16:04:12', 19, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (46, 1, '2006-03-22 16:04:17', '2006-03-22 16:04:17', 20, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (47, 1, '2006-03-22 16:04:20', '2006-03-22 16:04:20', 21, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (48, 1, '2006-03-22 16:04:24', '2006-03-22 16:04:24', 40, 120, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, 
`purchase_order_id`, `customer_order_id`, `comments`) VALUES (49, 1, '2006-03-22 16:04:28', '2006-03-22 16:04:28', 41, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (50, 1, '2006-03-22 16:04:31', '2006-03-22 16:04:31', 48, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (51, 1, '2006-03-22 16:04:38', '2006-03-22 16:04:38', 51, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (52, 1, '2006-03-22 16:04:41', '2006-03-22 16:04:41', 74, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (53, 1, '2006-03-22 16:04:45', '2006-03-22 16:04:45', 77, 60, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (54, 1, '2006-03-22 16:05:07', '2006-03-22 16:05:07', 3, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (55, 1, '2006-03-22 16:05:11', '2006-03-22 16:05:11', 4, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (56, 1, '2006-03-22 16:05:14', '2006-03-22 16:05:14', 5, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (57, 1, '2006-03-22 16:05:26', '2006-03-22 16:05:26', 65, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (58, 1, '2006-03-22 16:05:32', '2006-03-22 16:05:32', 66, 80, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (59, 1, '2006-03-22 16:05:47', '2006-03-22 16:05:47', 1, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (60, 1, '2006-03-22 16:05:51', '2006-03-22 16:05:51', 34, 60, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (61, 1, 
'2006-03-22 16:06:00', '2006-03-22 16:06:00', 43, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (62, 1, '2006-03-22 16:06:03', '2006-03-22 16:06:03', 81, 125, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (63, 2, '2006-03-22 16:07:56', '2006-03-24 11:03:00', 80, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (64, 2, '2006-03-22 16:08:19', '2006-03-22 16:08:59', 7, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (65, 2, '2006-03-22 16:08:29', '2006-03-22 16:08:59', 51, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (66, 2, '2006-03-22 16:08:37', '2006-03-22 16:08:59', 80, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (67, 2, '2006-03-22 16:09:46', '2006-03-22 16:10:27', 1, 15, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (68, 2, '2006-03-22 16:10:06', '2006-03-22 16:10:27', 43, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (69, 2, '2006-03-22 16:11:39', '2006-03-24 11:00:55', 19, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (70, 2, '2006-03-22 16:11:56', '2006-03-24 10:59:41', 48, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (71, 2, '2006-03-22 16:12:29', '2006-03-24 10:57:38', 8, 17, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (72, 1, '2006-03-24 10:41:30', '2006-03-24 10:41:30', 81, 200, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (73, 2, '2006-03-24 10:41:33', '2006-03-24 10:41:42', 81, 200, NULL, NULL, 'Fill 
Back Ordered product, Order #40'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (74, 1, '2006-03-24 10:53:13', '2006-03-24 10:53:13', 48, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (75, 2, '2006-03-24 10:53:16', '2006-03-24 10:55:46', 48, 100, NULL, NULL, 'Fill Back Ordered product, Order #39'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (76, 1, '2006-03-24 10:53:36', '2006-03-24 10:53:36', 43, 300, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (77, 2, '2006-03-24 10:53:39', '2006-03-24 10:56:57', 43, 300, NULL, NULL, 'Fill Back Ordered product, Order #38'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (78, 1, '2006-03-24 10:54:04', '2006-03-24 10:54:04', 41, 200, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (79, 2, '2006-03-24 10:54:07', '2006-03-24 10:58:40', 41, 200, NULL, NULL, 'Fill Back Ordered product, Order #36'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (80, 1, '2006-03-24 10:54:33', '2006-03-24 10:54:33', 19, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (81, 2, '2006-03-24 10:54:35', '2006-03-24 11:02:02', 19, 30, NULL, NULL, 'Fill Back Ordered product, Order #33'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (82, 1, '2006-03-24 10:54:58', '2006-03-24 10:54:58', 34, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (83, 2, '2006-03-24 10:55:02', '2006-03-24 11:03:00', 34, 100, NULL, NULL, 'Fill Back Ordered product, Order #30'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (84, 2, '2006-03-24 14:48:15', '2006-04-04 11:41:14', 6, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, 
`purchase_order_id`, `customer_order_id`, `comments`) VALUES (85, 2, '2006-03-24 14:48:23', '2006-04-04 11:41:14', 4, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (86, 3, '2006-03-24 14:49:16', '2006-03-24 14:49:16', 80, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (87, 3, '2006-03-24 14:49:20', '2006-03-24 14:49:20', 81, 50, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (88, 3, '2006-03-24 14:50:09', '2006-03-24 14:50:09', 1, 25, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (89, 3, '2006-03-24 14:50:14', '2006-03-24 14:50:14', 43, 25, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (90, 3, '2006-03-24 14:50:18', '2006-03-24 14:50:18', 81, 25, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (91, 2, '2006-03-24 14:51:03', '2006-04-04 11:09:24', 40, 50, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (92, 2, '2006-03-24 14:55:03', '2006-04-04 11:06:56', 21, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (93, 2, '2006-03-24 14:55:39', '2006-04-04 11:06:13', 5, 25, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (94, 2, '2006-03-24 14:55:52', '2006-04-04 11:06:13', 41, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (95, 2, '2006-03-24 14:56:09', '2006-04-04 11:06:13', 40, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (96, 3, '2006-03-30 16:46:34', '2006-03-30 16:46:34', 34, 12, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (97, 3, 
'2006-03-30 17:23:27', '2006-03-30 17:23:27', 34, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (98, 3, '2006-03-30 17:24:33', '2006-03-30 17:24:33', 34, 1, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (99, 2, '2006-04-03 13:50:08', '2006-04-03 13:50:15', 48, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (100, 1, '2006-04-04 11:00:54', '2006-04-04 11:00:54', 57, 100, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (101, 2, '2006-04-04 11:00:56', '2006-04-04 11:08:49', 57, 100, NULL, NULL, 'Fill Back Ordered product, Order #46'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (102, 1, '2006-04-04 11:01:14', '2006-04-04 11:01:14', 34, 50, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (103, 1, '2006-04-04 11:01:35', '2006-04-04 11:01:35', 43, 250, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (104, 3, '2006-04-04 11:01:37', '2006-04-04 11:01:37', 43, 300, NULL, NULL, 'Fill Back Ordered product, Order #41'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (105, 1, '2006-04-04 11:01:55', '2006-04-04 11:01:55', 8, 25, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (106, 2, '2006-04-04 11:01:58', '2006-04-04 11:07:37', 8, 25, NULL, NULL, 'Fill Back Ordered product, Order #48'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (107, 1, '2006-04-04 11:02:17', '2006-04-04 11:02:17', 34, 300, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (108, 2, '2006-04-04 11:02:19', '2006-04-04 11:08:14', 34, 300, NULL, NULL, 'Fill Back Ordered product, Order #47'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, 
`quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (109, 1, '2006-04-04 11:02:37', '2006-04-04 11:02:37', 19, 25, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (110, 2, '2006-04-04 11:02:39', '2006-04-04 11:41:14', 19, 10, NULL, NULL, 'Fill Back Ordered product, Order #42'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (111, 1, '2006-04-04 11:02:56', '2006-04-04 11:02:56', 19, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (112, 2, '2006-04-04 11:02:58', '2006-04-04 11:07:37', 19, 25, NULL, NULL, 'Fill Back Ordered product, Order #48'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (113, 1, '2006-04-04 11:03:12', '2006-04-04 11:03:12', 72, 50, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (114, 2, '2006-04-04 11:03:14', '2006-04-04 11:08:49', 72, 50, NULL, NULL, 'Fill Back Ordered product, Order #46'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (115, 1, '2006-04-04 11:03:38', '2006-04-04 11:03:38', 41, 50, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (116, 2, '2006-04-04 11:03:39', '2006-04-04 11:09:24', 41, 50, NULL, NULL, 'Fill Back Ordered product, Order #45'); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (117, 2, '2006-04-04 11:04:55', '2006-04-04 11:05:04', 34, 87, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (118, 2, '2006-04-04 11:35:50', '2006-04-04 11:35:54', 51, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (119, 2, '2006-04-04 11:35:51', '2006-04-04 11:35:54', 7, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (120, 2, '2006-04-04 11:36:15', '2006-04-04 11:36:21', 17, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, 
`transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (121, 2, '2006-04-04 11:36:39', '2006-04-04 11:36:47', 6, 90, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (122, 2, '2006-04-04 11:37:06', '2006-04-04 11:37:09', 4, 30, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (123, 2, '2006-04-04 11:37:45', '2006-04-04 11:37:49', 48, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (124, 2, '2006-04-04 11:38:07', '2006-04-04 11:38:11', 48, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (125, 2, '2006-04-04 11:38:27', '2006-04-04 11:38:32', 41, 10, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (126, 2, '2006-04-04 11:38:48', '2006-04-04 11:38:53', 43, 5, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (127, 2, '2006-04-04 11:39:12', '2006-04-04 11:39:29', 40, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (128, 2, '2006-04-04 11:39:50', '2006-04-04 11:39:53', 8, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (129, 2, '2006-04-04 11:40:13', '2006-04-04 11:40:16', 80, 15, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (130, 2, '2006-04-04 11:40:32', '2006-04-04 11:40:38', 74, 20, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (131, 2, '2006-04-04 11:41:39', '2006-04-04 11:41:45', 72, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (132, 2, '2006-04-04 11:42:17', '2006-04-04 11:42:26', 3, 50, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, 
`transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (133, 2, '2006-04-04 11:42:24', '2006-04-04 11:42:26', 8, 3, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (134, 2, '2006-04-04 11:42:48', '2006-04-04 11:43:08', 20, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (135, 2, '2006-04-04 11:43:05', '2006-04-04 11:43:08', 52, 40, NULL, NULL, NULL); -INSERT INTO `inventory_transactions` (`id`, `transaction_type`, `transaction_created_date`, `transaction_modified_date`, `product_id`, `quantity`, `purchase_order_id`, `customer_order_id`, `comments`) VALUES (136, 3, '2006-04-25 17:04:05', '2006-04-25 17:04:57', 56, 110, NULL, NULL, NULL); -# 102 records - -# -# Dumping data for table 'invoices' -# - -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (5, 31, '2006-03-22 16:08:59', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (6, 32, '2006-03-22 16:10:27', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (7, 40, '2006-03-24 10:41:41', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (8, 39, '2006-03-24 10:55:46', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (9, 38, '2006-03-24 10:56:57', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (10, 37, '2006-03-24 10:57:38', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (11, 36, '2006-03-24 10:58:40', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (12, 35, '2006-03-24 10:59:41', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (13, 34, '2006-03-24 11:00:55', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (14, 33, '2006-03-24 11:02:02', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (15, 30, '2006-03-24 11:03:00', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (16, 56, '2006-04-03 13:50:15', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (17, 55, '2006-04-04 11:05:04', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (18, 51, '2006-04-04 11:06:13', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (19, 50, '2006-04-04 11:06:56', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, 
`invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (20, 48, '2006-04-04 11:07:37', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (21, 47, '2006-04-04 11:08:14', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (22, 46, '2006-04-04 11:08:49', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (23, 45, '2006-04-04 11:09:24', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (24, 79, '2006-04-04 11:35:54', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (25, 78, '2006-04-04 11:36:21', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (26, 77, '2006-04-04 11:36:47', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (27, 76, '2006-04-04 11:37:09', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (28, 75, '2006-04-04 11:37:49', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (29, 74, '2006-04-04 11:38:11', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (30, 73, '2006-04-04 11:38:32', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (31, 72, '2006-04-04 11:38:53', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (32, 71, '2006-04-04 11:39:29', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (33, 70, '2006-04-04 11:39:53', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (34, 69, '2006-04-04 11:40:16', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (35, 67, '2006-04-04 11:40:38', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (36, 42, '2006-04-04 11:41:14', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (37, 60, '2006-04-04 11:41:45', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (38, 63, '2006-04-04 11:42:26', NULL, 0, 0, 0); -INSERT INTO `invoices` (`id`, `order_id`, `invoice_date`, `due_date`, `tax`, `shipping`, `amount_due`) VALUES (39, 58, '2006-04-04 11:43:08', NULL, 0, 0, 0); -# 35 records - -# -# Dumping data for table 'order_details' -# - -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (27, 30, 34, 100, 14, 0, 2, NULL, 96, 83); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, 
`purchase_order_id`, `inventory_id`) VALUES (28, 30, 80, 30, 3.5, 0, 2, NULL, NULL, 63); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (29, 31, 7, 10, 30, 0, 2, NULL, NULL, 64); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (30, 31, 51, 10, 53, 0, 2, NULL, NULL, 65); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (31, 31, 80, 10, 3.5, 0, 2, NULL, NULL, 66); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (32, 32, 1, 15, 18, 0, 2, NULL, NULL, 67); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (33, 32, 43, 20, 46, 0, 2, NULL, NULL, 68); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (34, 33, 19, 30, 9.2, 0, 2, NULL, 97, 81); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (35, 34, 19, 20, 9.2, 0, 2, NULL, NULL, 69); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (36, 35, 48, 10, 12.75, 0, 2, NULL, NULL, 70); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (37, 36, 41, 200, 9.65, 0, 2, NULL, 98, 79); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (38, 37, 8, 17, 40, 0, 2, NULL, NULL, 71); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (39, 38, 43, 300, 46, 0, 2, NULL, 99, 77); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (40, 39, 48, 100, 12.75, 0, 2, NULL, 100, 75); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (41, 40, 81, 200, 2.99, 0, 2, NULL, 101, 73); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (42, 41, 43, 300, 46, 0, 1, NULL, 102, 104); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (43, 42, 6, 10, 25, 0, 2, NULL, NULL, 84); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES 
(44, 42, 4, 10, 22, 0, 2, NULL, NULL, 85); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (45, 42, 19, 10, 9.2, 0, 2, NULL, 103, 110); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (46, 43, 80, 20, 3.5, 0, 1, NULL, NULL, 86); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (47, 43, 81, 50, 2.99, 0, 1, NULL, NULL, 87); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (48, 44, 1, 25, 18, 0, 1, NULL, NULL, 88); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (49, 44, 43, 25, 46, 0, 1, NULL, NULL, 89); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (50, 44, 81, 25, 2.99, 0, 1, NULL, NULL, 90); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (51, 45, 41, 50, 9.65, 0, 2, NULL, 104, 116); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (52, 45, 40, 50, 18.4, 0, 2, NULL, NULL, 91); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (53, 46, 57, 100, 19.5, 0, 2, NULL, 105, 101); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (54, 46, 72, 50, 34.8, 0, 2, NULL, 106, 114); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (55, 47, 34, 300, 14, 0, 2, NULL, 107, 108); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (56, 48, 8, 25, 40, 0, 2, NULL, 108, 106); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (57, 48, 19, 25, 9.2, 0, 2, NULL, 109, 112); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (59, 50, 21, 20, 10, 0, 2, NULL, NULL, 92); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (60, 51, 5, 25, 21.35, 0, 2, NULL, NULL, 93); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (61, 51, 41, 30, 9.65, 0, 2, NULL, 
NULL, 94); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (62, 51, 40, 30, 18.4, 0, 2, NULL, NULL, 95); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (66, 56, 48, 10, 12.75, 0, 2, NULL, 111, 99); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (67, 55, 34, 87, 14, 0, 2, NULL, NULL, 117); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (68, 79, 7, 30, 30, 0, 2, NULL, NULL, 119); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (69, 79, 51, 30, 53, 0, 2, NULL, NULL, 118); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (70, 78, 17, 40, 39, 0, 2, NULL, NULL, 120); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (71, 77, 6, 90, 25, 0, 2, NULL, NULL, 121); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (72, 76, 4, 30, 22, 0, 2, NULL, NULL, 122); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (73, 75, 48, 40, 12.75, 0, 2, NULL, NULL, 123); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (74, 74, 48, 40, 12.75, 0, 2, NULL, NULL, 124); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (75, 73, 41, 10, 9.65, 0, 2, NULL, NULL, 125); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (76, 72, 43, 5, 46, 0, 2, NULL, NULL, 126); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (77, 71, 40, 40, 18.4, 0, 2, NULL, NULL, 127); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (78, 70, 8, 20, 40, 0, 2, NULL, NULL, 128); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (79, 69, 80, 15, 3.5, 0, 2, NULL, NULL, 129); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (80, 67, 74, 20, 10, 0, 2, NULL, NULL, 130); -INSERT INTO 
`order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (81, 60, 72, 40, 34.8, 0, 2, NULL, NULL, 131); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (82, 63, 3, 50, 10, 0, 2, NULL, NULL, 132); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (83, 63, 8, 3, 40, 0, 2, NULL, NULL, 133); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (84, 58, 20, 40, 81, 0, 2, NULL, NULL, 134); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (85, 58, 52, 40, 7, 0, 2, NULL, NULL, 135); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (86, 80, 56, 10, 38, 0, 1, NULL, NULL, 136); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (90, 81, 81, 0, 2.99, 0, 5, NULL, NULL, NULL); -INSERT INTO `order_details` (`id`, `order_id`, `product_id`, `quantity`, `unit_price`, `discount`, `status_id`, `date_allocated`, `purchase_order_id`, `inventory_id`) VALUES (91, 81, 56, 0, 38, 0, 0, NULL, NULL, NULL); -# 58 records - -# -# Dumping data for table 'order_details_status' -# - -INSERT INTO `order_details_status` (`id`, `status_name`) VALUES (0, 'None'); -INSERT INTO `order_details_status` (`id`, `status_name`) VALUES (1, 'Allocated'); -INSERT INTO `order_details_status` (`id`, `status_name`) VALUES (2, 'Invoiced'); -INSERT INTO `order_details_status` (`id`, `status_name`) VALUES (3, 'Shipped'); -INSERT INTO `order_details_status` (`id`, `status_name`) VALUES (4, 'On Order'); -INSERT INTO `order_details_status` (`id`, `status_name`) VALUES (5, 'No Stock'); -# 6 records - -# -# Dumping data for table 'orders' -# - -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (30, 9, 27, '2006-01-15 00:00:00', '2006-01-22 00:00:00', 2, 'Karen Toh', '789 27th Street', 'Las Vegas', 'NV', '99999', 'USA', 200, 0, 'Check', '2006-01-15 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (31, 3, 4, '2006-01-20 00:00:00', '2006-01-22 00:00:00', 1, 'Christina Lee', '123 4th Street', 'New York', 'NY', '99999', 'USA', 5, 0, 'Credit Card', '2006-01-20 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, 
`ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (32, 4, 12, '2006-01-22 00:00:00', '2006-01-22 00:00:00', 2, 'John Edwards', '123 12th Street', 'Las Vegas', 'NV', '99999', 'USA', 5, 0, 'Credit Card', '2006-01-22 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (33, 6, 8, '2006-01-30 00:00:00', '2006-01-31 00:00:00', 3, 'Elizabeth Andersen', '123 8th Street', 'Portland', 'OR', '99999', 'USA', 50, 0, 'Credit Card', '2006-01-30 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (34, 9, 4, '2006-02-06 00:00:00', '2006-02-07 00:00:00', 3, 'Christina Lee', '123 4th Street', 'New York', 'NY', '99999', 'USA', 4, 0, 'Check', '2006-02-06 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (35, 3, 29, '2006-02-10 00:00:00', '2006-02-12 00:00:00', 2, 'Soo Jung Lee', '789 29th Street', 'Denver', 'CO', '99999', 'USA', 7, 0, 'Check', '2006-02-10 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (36, 4, 3, '2006-02-23 00:00:00', '2006-02-25 00:00:00', 2, 'Thomas Axen', '123 3rd Street', 'Los Angelas', 'CA', '99999', 'USA', 7, 0, 'Cash', '2006-02-23 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (37, 8, 6, '2006-03-06 00:00:00', '2006-03-09 00:00:00', 2, 'Francisco Pérez-Olaeta', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', 12, 0, 'Credit Card', '2006-03-06 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (38, 9, 28, '2006-03-10 00:00:00', '2006-03-11 00:00:00', 3, 'Amritansh Raghav', '789 28th Street', 'Memphis', 'TN', '99999', 'USA', 10, 0, 'Check', '2006-03-10 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, 
`order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (39, 3, 8, '2006-03-22 00:00:00', '2006-03-24 00:00:00', 3, 'Elizabeth Andersen', '123 8th Street', 'Portland', 'OR', '99999', 'USA', 5, 0, 'Check', '2006-03-22 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (40, 4, 10, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, 'Roland Wacker', '123 10th Street', 'Chicago', 'IL', '99999', 'USA', 9, 0, 'Credit Card', '2006-03-24 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (41, 1, 7, '2006-03-24 00:00:00', NULL, NULL, 'Ming-Yang Xie', '123 7th Street', 'Boise', 'ID', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (42, 1, 10, '2006-03-24 00:00:00', '2006-04-07 00:00:00', 1, 'Roland Wacker', '123 10th Street', 'Chicago', 'IL', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 2); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (43, 1, 11, '2006-03-24 00:00:00', NULL, 3, 'Peter Krschne', '123 11th Street', 'Miami', 'FL', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (44, 1, 1, '2006-03-24 00:00:00', NULL, NULL, 'Anna Bedecs', '123 1st Street', 'Seattle', 'WA', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (45, 1, 28, '2006-04-07 00:00:00', '2006-04-07 00:00:00', 3, 'Amritansh Raghav', '789 28th Street', 'Memphis', 'TN', '99999', 'USA', 40, 0, 'Credit Card', '2006-04-07 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, 
`ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (46, 7, 9, '2006-04-05 00:00:00', '2006-04-05 00:00:00', 1, 'Sven Mortensen', '123 9th Street', 'Salt Lake City', 'UT', '99999', 'USA', 100, 0, 'Check', '2006-04-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (47, 6, 6, '2006-04-08 00:00:00', '2006-04-08 00:00:00', 2, 'Francisco Pérez-Olaeta', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', 300, 0, 'Credit Card', '2006-04-08 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (48, 4, 8, '2006-04-05 00:00:00', '2006-04-05 00:00:00', 2, 'Elizabeth Andersen', '123 8th Street', 'Portland', 'OR', '99999', 'USA', 50, 0, 'Check', '2006-04-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (50, 9, 25, '2006-04-05 00:00:00', '2006-04-05 00:00:00', 1, 'John Rodman', '789 25th Street', 'Chicago', 'IL', '99999', 'USA', 5, 0, 'Cash', '2006-04-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (51, 9, 26, '2006-04-05 00:00:00', '2006-04-05 00:00:00', 3, 'Run Liu', '789 26th Street', 'Miami', 'FL', '99999', 'USA', 60, 0, 'Credit Card', '2006-04-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (55, 1, 29, '2006-04-05 00:00:00', '2006-04-05 00:00:00', 2, 'Soo Jung Lee', '789 29th Street', 'Denver', 'CO', '99999', 'USA', 200, 0, 'Check', '2006-04-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (56, 2, 6, '2006-04-03 00:00:00', '2006-04-03 00:00:00', 3, 'Francisco Pérez-Olaeta', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', 0, 0, 'Check', '2006-04-03 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, 
`employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (57, 9, 27, '2006-04-22 00:00:00', '2006-04-22 00:00:00', 2, 'Karen Toh', '789 27th Street', 'Las Vegas', 'NV', '99999', 'USA', 200, 0, 'Check', '2006-04-22 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (58, 3, 4, '2006-04-22 00:00:00', '2006-04-22 00:00:00', 1, 'Christina Lee', '123 4th Street', 'New York', 'NY', '99999', 'USA', 5, 0, 'Credit Card', '2006-04-22 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (59, 4, 12, '2006-04-22 00:00:00', '2006-04-22 00:00:00', 2, 'John Edwards', '123 12th Street', 'Las Vegas', 'NV', '99999', 'USA', 5, 0, 'Credit Card', '2006-04-22 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (60, 6, 8, '2006-04-30 00:00:00', '2006-04-30 00:00:00', 3, 'Elizabeth Andersen', '123 8th Street', 'Portland', 'OR', '99999', 'USA', 50, 0, 'Credit Card', '2006-04-30 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (61, 9, 4, '2006-04-07 00:00:00', '2006-04-07 00:00:00', 3, 'Christina Lee', '123 4th Street', 'New York', 'NY', '99999', 'USA', 4, 0, 'Check', '2006-04-07 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (62, 3, 29, '2006-04-12 00:00:00', '2006-04-12 00:00:00', 2, 'Soo Jung Lee', '789 29th Street', 'Denver', 'CO', '99999', 'USA', 7, 0, 'Check', '2006-04-12 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (63, 4, 3, '2006-04-25 00:00:00', '2006-04-25 00:00:00', 2, 'Thomas Axen', '123 3rd Street', 'Los Angelas', 'CA', '99999', 'USA', 7, 0, 'Cash', 
'2006-04-25 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (64, 8, 6, '2006-05-09 00:00:00', '2006-05-09 00:00:00', 2, 'Francisco Pérez-Olaeta', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', 12, 0, 'Credit Card', '2006-05-09 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (65, 9, 28, '2006-05-11 00:00:00', '2006-05-11 00:00:00', 3, 'Amritansh Raghav', '789 28th Street', 'Memphis', 'TN', '99999', 'USA', 10, 0, 'Check', '2006-05-11 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (66, 3, 8, '2006-05-24 00:00:00', '2006-05-24 00:00:00', 3, 'Elizabeth Andersen', '123 8th Street', 'Portland', 'OR', '99999', 'USA', 5, 0, 'Check', '2006-05-24 00:00:00', NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (67, 4, 10, '2006-05-24 00:00:00', '2006-05-24 00:00:00', 2, 'Roland Wacker', '123 10th Street', 'Chicago', 'IL', '99999', 'USA', 9, 0, 'Credit Card', '2006-05-24 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (68, 1, 7, '2006-05-24 00:00:00', NULL, NULL, 'Ming-Yang Xie', '123 7th Street', 'Boise', 'ID', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (69, 1, 10, '2006-05-24 00:00:00', NULL, 1, 'Roland Wacker', '123 10th Street', 'Chicago', 'IL', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (70, 1, 11, '2006-05-24 00:00:00', NULL, 3, 'Peter Krschne', '123 11th Street', 'Miami', 'FL', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 
0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (71, 1, 1, '2006-05-24 00:00:00', NULL, 3, 'Anna Bedecs', '123 1st Street', 'Seattle', 'WA', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (72, 1, 28, '2006-06-07 00:00:00', '2006-06-07 00:00:00', 3, 'Amritansh Raghav', '789 28th Street', 'Memphis', 'TN', '99999', 'USA', 40, 0, 'Credit Card', '2006-06-07 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (73, 7, 9, '2006-06-05 00:00:00', '2006-06-05 00:00:00', 1, 'Sven Mortensen', '123 9th Street', 'Salt Lake City', 'UT', '99999', 'USA', 100, 0, 'Check', '2006-06-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (74, 6, 6, '2006-06-08 00:00:00', '2006-06-08 00:00:00', 2, 'Francisco Pérez-Olaeta', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', 300, 0, 'Credit Card', '2006-06-08 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (75, 4, 8, '2006-06-05 00:00:00', '2006-06-05 00:00:00', 2, 'Elizabeth Andersen', '123 8th Street', 'Portland', 'OR', '99999', 'USA', 50, 0, 'Check', '2006-06-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (76, 9, 25, '2006-06-05 00:00:00', '2006-06-05 00:00:00', 1, 'John Rodman', '789 25th Street', 'Chicago', 'IL', '99999', 'USA', 5, 0, 'Cash', '2006-06-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (77, 9, 26, '2006-06-05 00:00:00', '2006-06-05 00:00:00', 3, 'Run Liu', '789 26th Street', 'Miami', 'FL', '99999', 'USA', 60, 0, 'Credit Card', 
'2006-06-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (78, 1, 29, '2006-06-05 00:00:00', '2006-06-05 00:00:00', 2, 'Soo Jung Lee', '789 29th Street', 'Denver', 'CO', '99999', 'USA', 200, 0, 'Check', '2006-06-05 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (79, 2, 6, '2006-06-23 00:00:00', '2006-06-23 00:00:00', 3, 'Francisco Pérez-Olaeta', '123 6th Street', 'Milwaukee', 'WI', '99999', 'USA', 0, 0, 'Check', '2006-06-23 00:00:00', NULL, 0, NULL, 3); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (80, 2, 4, '2006-04-25 17:03:55', NULL, NULL, 'Christina Lee', '123 4th Street', 'New York', 'NY', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -INSERT INTO `orders` (`id`, `employee_id`, `customer_id`, `order_date`, `shipped_date`, `shipper_id`, `ship_name`, `ship_address`, `ship_city`, `ship_state_province`, `ship_zip_postal_code`, `ship_country_region`, `shipping_fee`, `taxes`, `payment_type`, `paid_date`, `notes`, `tax_rate`, `tax_status_id`, `status_id`) VALUES (81, 2, 3, '2006-04-25 17:26:53', NULL, NULL, 'Thomas Axen', '123 3rd Street', 'Los Angelas', 'CA', '99999', 'USA', 0, 0, NULL, NULL, NULL, 0, NULL, 0); -# 48 records - -# -# Dumping data for table 'orders_status' -# - -INSERT INTO `orders_status` (`id`, `status_name`) VALUES (0, 'New'); -INSERT INTO `orders_status` (`id`, `status_name`) VALUES (1, 'Invoiced'); -INSERT INTO `orders_status` (`id`, `status_name`) VALUES (2, 'Shipped'); -INSERT INTO `orders_status` (`id`, `status_name`) VALUES (3, 'Closed'); -# 4 records - -# -# Dumping data for table 'orders_tax_status' -# - -INSERT INTO `orders_tax_status` (`id`, `tax_status_name`) VALUES (0, 'Tax Exempt'); -INSERT INTO `orders_tax_status` (`id`, `tax_status_name`) VALUES (1, 'Taxable'); -# 2 records - -# -# Dumping data for table 'privileges' -# - -INSERT INTO `privileges` (`id`, `privilege_name`) VALUES (2, 'Purchase Approvals'); -# 1 records - -# -# Dumping data for table 'products' -# - -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('4', 1, 'NWTB-1', 'Northwind Traders Chai', NULL, 13.5, 18, 10, 40, '10 boxes x 20 bags', 0, 10, 'Beverages', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('10', 3, 'NWTCO-3', 'Northwind Traders Syrup', NULL, 7.5, 10, 25, 100, '12 - 550 ml 
bottles', 0, 25, 'Condiments', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('10', 4, 'NWTCO-4', 'Northwind Traders Cajun Seasoning', NULL, 16.5, 22, 10, 40, '48 - 6 oz jars', 0, 10, 'Condiments', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('10', 5, 'NWTO-5', 'Northwind Traders Olive Oil', NULL, 16.0125, 21.35, 10, 40, '36 boxes', 0, 10, 'Oil', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2;6', 6, 'NWTJP-6', 'Northwind Traders Boysenberry Spread', NULL, 18.75, 25, 25, 100, '12 - 8 oz jars', 0, 25, 'Jams, Preserves', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2', 7, 'NWTDFN-7', 'Northwind Traders Dried Pears', NULL, 22.5, 30, 10, 40, '12 - 1 lb pkgs.', 0, 10, 'Dried Fruit & Nuts', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('8', 8, 'NWTS-8', 'Northwind Traders Curry Sauce', NULL, 30, 40, 10, 40, '12 - 12 oz jars', 0, 10, 'Sauces', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2;6', 14, 'NWTDFN-14', 'Northwind Traders Walnuts', NULL, 17.4375, 23.25, 10, 40, '40 - 100 g pkgs.', 0, 10, 'Dried Fruit & Nuts', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 17, 'NWTCFV-17', 'Northwind Traders Fruit Cocktail', NULL, 29.25, 39, 10, 40, '15.25 OZ', 0, 10, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 19, 'NWTBGM-19', 'Northwind Traders Chocolate Biscuits Mix', NULL, 6.9, 9.2, 5, 20, '10 boxes x 12 pieces', 0, 5, 'Baked Goods & Mixes', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2;6', 20, 'NWTJP-6', 'Northwind Traders Marmalade', NULL, 60.75, 81, 10, 40, '30 gift boxes', 0, 10, 'Jams, Preserves', ''); -INSERT INTO `products` 
(`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 21, 'NWTBGM-21', 'Northwind Traders Scones', NULL, 7.5, 10, 5, 20, '24 pkgs. x 4 pieces', 0, 5, 'Baked Goods & Mixes', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('4', 34, 'NWTB-34', 'Northwind Traders Beer', NULL, 10.5, 14, 15, 60, '24 - 12 oz bottles', 0, 15, 'Beverages', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('7', 40, 'NWTCM-40', 'Northwind Traders Crab Meat', NULL, 13.8, 18.4, 30, 120, '24 - 4 oz tins', 0, 30, 'Canned Meat', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 41, 'NWTSO-41', 'Northwind Traders Clam Chowder', NULL, 7.2375, 9.65, 10, 40, '12 - 12 oz cans', 0, 10, 'Soups', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('3;4', 43, 'NWTB-43', 'Northwind Traders Coffee', NULL, 34.5, 46, 25, 100, '16 - 500 g tins', 0, 25, 'Beverages', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('10', 48, 'NWTCA-48', 'Northwind Traders Chocolate', NULL, 9.5625, 12.75, 25, 100, '10 pkgs', 0, 25, 'Candy', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2', 51, 'NWTDFN-51', 'Northwind Traders Dried Apples', NULL, 39.75, 53, 10, 40, '50 - 300 g pkgs.', 0, 10, 'Dried Fruit & Nuts', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 52, 'NWTG-52', 'Northwind Traders Long Grain Rice', NULL, 5.25, 7, 25, 100, '16 - 2 kg boxes', 0, 25, 'Grains', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 56, 'NWTP-56', 'Northwind Traders Gnocchi', NULL, 28.5, 38, 30, 120, '24 - 250 g pkgs.', 0, 30, 'Pasta', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, 
`quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 57, 'NWTP-57', 'Northwind Traders Ravioli', NULL, 14.625, 19.5, 20, 80, '24 - 250 g pkgs.', 0, 20, 'Pasta', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('8', 65, 'NWTS-65', 'Northwind Traders Hot Pepper Sauce', NULL, 15.7875, 21.05, 10, 40, '32 - 8 oz bottles', 0, 10, 'Sauces', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('8', 66, 'NWTS-66', 'Northwind Traders Tomato Sauce', NULL, 12.75, 17, 20, 80, '24 - 8 oz jars', 0, 20, 'Sauces', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('5', 72, 'NWTD-72', 'Northwind Traders Mozzarella', NULL, 26.1, 34.8, 10, 40, '24 - 200 g pkgs.', 0, 10, 'Dairy products', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2;6', 74, 'NWTDFN-74', 'Northwind Traders Almonds', NULL, 7.5, 10, 5, 20, '5 kg pkg.', 0, 5, 'Dried Fruit & Nuts', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('10', 77, 'NWTCO-77', 'Northwind Traders Mustard', NULL, 9.75, 13, 15, 60, '12 boxes', 0, 15, 'Condiments', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('2', 80, 'NWTDFN-80', 'Northwind Traders Dried Plums', NULL, 3, 3.5, 50, 75, '1 lb bag', 0, 25, 'Dried Fruit & Nuts', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('3', 81, 'NWTB-81', 'Northwind Traders Green Tea', NULL, 2, 2.99, 100, 125, '20 bags per box', 0, 25, 'Beverages', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 82, 'NWTC-82', 'Northwind Traders Granola', NULL, 2, 4, 20, 100, NULL, 0, NULL, 'Cereal', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('9', 83, 'NWTCS-83', 'Northwind Traders Potato Chips', 
NULL, .5, 1.8, 30, 200, NULL, 0, NULL, 'Chips, Snacks', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 85, 'NWTBGM-85', 'Northwind Traders Brownie Mix', NULL, 9, 12.49, 10, 20, '3 boxes', 0, 5, 'Baked Goods & Mixes', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 86, 'NWTBGM-86', 'Northwind Traders Cake Mix', NULL, 10.5, 15.99, 10, 20, '4 boxes', 0, 5, 'Baked Goods & Mixes', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('7', 87, 'NWTB-87', 'Northwind Traders Tea', NULL, 2, 4, 20, 50, '100 count per box', 0, NULL, 'Beverages', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 88, 'NWTCFV-88', 'Northwind Traders Pears', NULL, 1, 1.3, 10, 40, '15.25 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 89, 'NWTCFV-89', 'Northwind Traders Peaches', NULL, 1, 1.5, 10, 40, '15.25 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 90, 'NWTCFV-90', 'Northwind Traders Pineapple', NULL, 1, 1.8, 10, 40, '15.25 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 91, 'NWTCFV-91', 'Northwind Traders Cherry Pie Filling', NULL, 1, 2, 10, 40, '15.25 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 92, 'NWTCFV-92', 'Northwind Traders Green Beans', NULL, 1, 1.2, 10, 40, '14.5 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 93, 'NWTCFV-93', 'Northwind Traders Corn', NULL, 1, 1.2, 10, 40, '14.5 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` 
(`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 94, 'NWTCFV-94', 'Northwind Traders Peas', NULL, 1, 1.5, 10, 40, '14.5 OZ', 0, NULL, 'Canned Fruit & Vegetables', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('7', 95, 'NWTCM-95', 'Northwind Traders Tuna Fish', NULL, .5, 2, 30, 50, '5 oz', 0, NULL, 'Canned Meat', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('7', 96, 'NWTCM-96', 'Northwind Traders Smoked Salmon', NULL, 2, 4, 30, 50, '5 oz', 0, NULL, 'Canned Meat', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('1', 97, 'NWTC-82', 'Northwind Traders Hot Cereal', NULL, 3, 5, 50, 200, NULL, 0, NULL, 'Cereal', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 98, 'NWTSO-98', 'Northwind Traders Vegetable Soup', NULL, 1, 1.89, 100, 200, NULL, 0, NULL, 'Soups', ''); -INSERT INTO `products` (`supplier_ids`, `id`, `product_code`, `product_name`, `description`, `standard_cost`, `list_price`, `reorder_level`, `target_level`, `quantity_per_unit`, `discontinued`, `minimum_reorder_quantity`, `category`, `attachments`) VALUES ('6', 99, 'NWTSO-99', 'Northwind Traders Chicken Soup', NULL, 1, 1.95, 100, 200, NULL, 0, NULL, 'Soups', ''); -# 45 records - -# -# Dumping data for table 'purchase_order_details' -# - -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (238, 90, 1, 40, 14, '2006-01-22 00:00:00', 1, 59); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (239, 91, 3, 100, 8, '2006-01-22 00:00:00', 1, 54); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (240, 91, 4, 40, 16, '2006-01-22 00:00:00', 1, 55); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (241, 91, 5, 40, 16, '2006-01-22 00:00:00', 1, 56); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (242, 92, 6, 100, 19, '2006-01-22 00:00:00', 1, 40); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (243, 92, 7, 40, 
22, '2006-01-22 00:00:00', 1, 41); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (244, 92, 8, 40, 30, '2006-01-22 00:00:00', 1, 42); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (245, 92, 14, 40, 17, '2006-01-22 00:00:00', 1, 43); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (246, 92, 17, 40, 29, '2006-01-22 00:00:00', 1, 44); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (247, 92, 19, 20, 7, '2006-01-22 00:00:00', 1, 45); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (248, 92, 20, 40, 61, '2006-01-22 00:00:00', 1, 46); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (249, 92, 21, 20, 8, '2006-01-22 00:00:00', 1, 47); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (250, 90, 34, 60, 10, '2006-01-22 00:00:00', 1, 60); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (251, 92, 40, 120, 14, '2006-01-22 00:00:00', 1, 48); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (252, 92, 41, 40, 7, '2006-01-22 00:00:00', 1, 49); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (253, 90, 43, 100, 34, '2006-01-22 00:00:00', 1, 61); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (254, 92, 48, 100, 10, '2006-01-22 00:00:00', 1, 50); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (255, 92, 51, 40, 40, '2006-01-22 00:00:00', 1, 51); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (256, 93, 52, 100, 5, '2006-01-22 00:00:00', 1, 37); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (257, 93, 56, 120, 28, '2006-01-22 00:00:00', 1, 38); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (258, 93, 57, 80, 15, '2006-01-22 00:00:00', 1, 39); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (259, 91, 65, 40, 16, '2006-01-22 
00:00:00', 1, 57); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (260, 91, 66, 80, 13, '2006-01-22 00:00:00', 1, 58); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (261, 94, 72, 40, 26, '2006-01-22 00:00:00', 1, 36); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (262, 92, 74, 20, 8, '2006-01-22 00:00:00', 1, 52); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (263, 92, 77, 60, 10, '2006-01-22 00:00:00', 1, 53); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (264, 95, 80, 75, 3, '2006-01-22 00:00:00', 1, 35); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (265, 90, 81, 125, 2, '2006-01-22 00:00:00', 1, 62); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (266, 96, 34, 100, 10, '2006-01-22 00:00:00', 1, 82); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (267, 97, 19, 30, 7, '2006-01-22 00:00:00', 1, 80); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (268, 98, 41, 200, 7, '2006-01-22 00:00:00', 1, 78); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (269, 99, 43, 300, 34, '2006-01-22 00:00:00', 1, 76); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (270, 100, 48, 100, 10, '2006-01-22 00:00:00', 1, 74); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (271, 101, 81, 200, 2, '2006-01-22 00:00:00', 1, 72); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (272, 102, 43, 300, 34, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (273, 103, 19, 10, 7, '2006-04-17 00:00:00', 1, 111); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (274, 104, 41, 50, 7, '2006-04-06 00:00:00', 1, 115); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (275, 105, 57, 100, 15, '2006-04-05 00:00:00', 1, 100); 
-INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (276, 106, 72, 50, 26, '2006-04-05 00:00:00', 1, 113); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (277, 107, 34, 300, 10, '2006-04-05 00:00:00', 1, 107); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (278, 108, 8, 25, 30, '2006-04-05 00:00:00', 1, 105); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (279, 109, 19, 25, 7, '2006-04-05 00:00:00', 1, 109); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (280, 110, 43, 250, 34, '2006-04-10 00:00:00', 1, 103); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (281, 90, 1, 40, 14, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (282, 92, 19, 20, 7, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (283, 111, 34, 50, 10, '2006-04-04 00:00:00', 1, 102); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (285, 91, 3, 50, 8, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (286, 91, 4, 40, 16, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (288, 140, 85, 10, 9, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (289, 141, 6, 10, 18.75, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (290, 142, 1, 1, 13.5, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (292, 146, 20, 40, 60, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (293, 146, 51, 40, 39, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, `inventory_id`) VALUES (294, 147, 40, 120, 13, NULL, 0, NULL); -INSERT INTO `purchase_order_details` (`id`, `purchase_order_id`, `product_id`, `quantity`, `unit_cost`, `date_received`, `posted_to_inventory`, 
`inventory_id`) VALUES (295, 148, 72, 40, 26, NULL, 0, NULL); -# 55 records - -# -# Dumping data for table 'purchase_order_status' -# - -INSERT INTO `purchase_order_status` (`id`, `status`) VALUES (0, 'New'); -INSERT INTO `purchase_order_status` (`id`, `status`) VALUES (1, 'Submitted'); -INSERT INTO `purchase_order_status` (`id`, `status`) VALUES (2, 'Approved'); -INSERT INTO `purchase_order_status` (`id`, `status`) VALUES (3, 'Closed'); -# 4 records - -# -# Dumping data for table 'purchase_orders' -# - -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (90, 1, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (91, 3, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (92, 2, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (93, 5, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (94, 6, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (95, 4, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (96, 1, 5, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #30', 2, '2006-01-22 00:00:00', 5); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, 
`approved_date`, `submitted_by`) VALUES (97, 2, 7, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #33', 2, '2006-01-22 00:00:00', 7); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (98, 2, 4, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #36', 2, '2006-01-22 00:00:00', 4); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (99, 1, 3, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #38', 2, '2006-01-22 00:00:00', 3); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (100, 2, 9, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #39', 2, '2006-01-22 00:00:00', 9); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (101, 1, 2, '2006-01-14 00:00:00', '2006-01-22 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #40', 2, '2006-01-22 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (102, 1, 1, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #41', 2, '2006-04-04 00:00:00', 1); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (103, 2, 1, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #42', 2, '2006-04-04 00:00:00', 1); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (104, 2, 1, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #45', 2, '2006-04-04 00:00:00', 1); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (105, 5, 7, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, 
NULL, 0, 0, NULL, 0, 'Check', 'Purchase generated based on Order #46', 2, '2006-04-04 00:00:00', 7); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (106, 6, 7, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #46', 2, '2006-04-04 00:00:00', 7); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (107, 1, 6, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #47', 2, '2006-04-04 00:00:00', 6); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (108, 2, 4, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #48', 2, '2006-04-04 00:00:00', 4); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (109, 2, 4, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #48', 2, '2006-04-04 00:00:00', 4); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (110, 1, 3, '2006-03-24 00:00:00', '2006-03-24 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #49', 2, '2006-04-04 00:00:00', 3); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (111, 1, 2, '2006-03-31 00:00:00', '2006-03-31 00:00:00', 2, NULL, 0, 0, NULL, 0, NULL, 'Purchase generated based on Order #56', 2, '2006-04-04 00:00:00', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (140, 6, NULL, '2006-04-25 00:00:00', '2006-04-25 16:40:51', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-04-25 16:41:33', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (141, 8, NULL, '2006-04-25 00:00:00', '2006-04-25 17:10:35', 2, NULL, 0, 0, NULL, 0, NULL, NULL, 2, '2006-04-25 17:10:55', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, 
`created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (142, 8, NULL, '2006-04-25 00:00:00', '2006-04-25 17:18:29', 2, NULL, 0, 0, NULL, 0, 'Check', NULL, 2, '2006-04-25 17:18:51', 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (146, 2, 2, '2006-04-26 18:26:37', '2006-04-26 18:26:37', 1, NULL, 0, 0, NULL, 0, NULL, NULL, NULL, NULL, 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (147, 7, 2, '2006-04-26 18:33:28', '2006-04-26 18:33:28', 1, NULL, 0, 0, NULL, 0, NULL, NULL, NULL, NULL, 2); -INSERT INTO `purchase_orders` (`id`, `supplier_id`, `created_by`, `submitted_date`, `creation_date`, `status_id`, `expected_date`, `shipping_fee`, `taxes`, `payment_date`, `payment_amount`, `payment_method`, `notes`, `approved_by`, `approved_date`, `submitted_by`) VALUES (148, 5, 2, '2006-04-26 18:33:52', '2006-04-26 18:33:52', 1, NULL, 0, 0, NULL, 0, NULL, NULL, NULL, NULL, 2); -# 28 records - -# -# Dumping data for table 'sales_reports' -# - -INSERT INTO `sales_reports` (`group_by`, `display`, `title`, `filter_row_source`, `default`) VALUES ('Category', 'Category', 'Sales By Category', 'SELECT DISTINCT [Category] FROM [products] ORDER BY [Category];', 0); -INSERT INTO `sales_reports` (`group_by`, `display`, `title`, `filter_row_source`, `default`) VALUES ('country_region', 'Country/Region', 'Sales By Country', 'SELECT DISTINCT [country_region] FROM [customers Extended] ORDER BY [country_region];', 0); -INSERT INTO `sales_reports` (`group_by`, `display`, `title`, `filter_row_source`, `default`) VALUES ('Customer ID', 'Customer', 'Sales By Customer', 'SELECT DISTINCT [Company] FROM [customers Extended] ORDER BY [Company];', 0); -INSERT INTO `sales_reports` (`group_by`, `display`, `title`, `filter_row_source`, `default`) VALUES ('employee_id', 'Employee', 'Sales By Employee', 'SELECT DISTINCT [Employee Name] FROM [employees Extended] ORDER BY [Employee Name];', 0); -INSERT INTO `sales_reports` (`group_by`, `display`, `title`, `filter_row_source`, `default`) VALUES ('Product ID', 'Product', 'Sales by Product', 'SELECT DISTINCT [Product Name] FROM [products] ORDER BY [Product Name];', 1); -# 5 records - -# -# Dumping data for table 'shippers' -# - -INSERT INTO `shippers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (1, 'Shipping Company A', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '123 Any Street', 'Memphis', 'TN', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `shippers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (2, 'Shipping Company B', NULL, NULL, NULL, NULL, NULL, NULL, 
NULL, NULL, '123 Any Street', 'Memphis', 'TN', '99999', 'USA', NULL, NULL, ''); -INSERT INTO `shippers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (3, 'Shipping Company C', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '123 Any Street', 'Memphis', 'TN', '99999', 'USA', NULL, NULL, ''); -# 3 records - -# -# Dumping data for table 'strings' -# - -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (2, 'Northwind Traders'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (3, 'Cannot remove posted inventory!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (4, 'Back ordered product filled for Order #|'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (5, 'Discounted price below cost!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (6, 'Insufficient inventory.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (7, 'Insufficient inventory. Do you want to create a purchase order?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (8, 'Purchase orders were successfully created for | products'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (9, 'There are no products below their respective reorder levels'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (10, 'Must specify customer name!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (11, 'Restocking will generate purchase orders for all products below desired inventory levels. Do you want to continue?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (12, 'Cannot create purchase order. No suppliers listed for specified product'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (13, 'Discounted price is below cost!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (14, 'Do you want to continue?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (15, 'Order is already invoiced. Do you want to print the invoice?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (16, 'Order does not contain any line items'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (17, 'Cannot create invoice! Inventory has not been allocated for each specified product.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (18, 'Sorry, there are no sales in the specified time period'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (19, 'Product successfully restocked.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (21, 'Product does not need restocking! Product is already at desired inventory level.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (22, 'Product restocking failed!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (23, 'Invalid login specified!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (24, 'Must first select reported!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (25, 'Changing supplier will remove purchase line items, continue?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (26, 'Purchase orders were successfully submitted for | products. 
Do you want to view the restocking report?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (27, 'There was an error attempting to restock inventory levels.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (28, '| product(s) were successfully restocked. Do you want to view the restocking report?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (29, 'You cannot remove purchase line items already posted to inventory!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (30, 'There was an error removing one or more purchase line items.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (31, 'You cannot modify quantity for purchased product already received or posted to inventory.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (32, 'You cannot modify price for purchased product already received or posted to inventory.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (33, 'Product has been successfully posted to inventory.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (34, 'Sorry, product cannot be successfully posted to inventory.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (35, 'There are orders with this product on back order. Would you like to fill them now?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (36, 'Cannot post product to inventory without specifying received date!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (37, 'Do you want to post received product to inventory?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (38, 'Initialize purchase, orders, and inventory data?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (39, 'Must first specify employee name!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (40, 'Specified user must be logged in to approve purchase!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (41, 'Purchase order must contain completed line items before it can be approved'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (42, 'Sorry, you do not have permission to approve purchases.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (43, 'Purchase successfully approved'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (44, 'Purchase cannot be approved'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (45, 'Purchase successfully submitted for approval'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (46, 'Purchase cannot be submitted for approval'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (47, 'Sorry, purchase order does not contain line items'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (48, 'Do you want to cancel this order?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (49, 'Canceling an order will permanently delete the order. Are you sure you want to cancel?'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (100, 'Your order was successfully canceled.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (101, 'Cannot cancel an order that has items received and posted to inventory.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (102, 'There was an error trying to cancel this order.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (103, 'The invoice for this order has not yet been created.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (104, 'Shipping information is not complete. 
Please specify all shipping information and try again.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (105, 'Cannot mark as shipped. Order must first be invoiced!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (106, 'Cannot cancel an order that has already shipped!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (107, 'Must first specify salesperson!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (108, 'Order is now marked closed.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (109, 'Order must first be marked shipped before closing.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (110, 'Must first specify payment information!'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (111, 'There was an error attempting to restock inventory levels. | product(s) were successfully restocked.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (112, 'You must supply a Unit Cost.'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (113, 'Fill back ordered product, Order #|'); -INSERT INTO `strings` (`string_id`, `string_data`) VALUES (114, 'Purchase generated based on Order #|'); -# 62 records - -# -# Dumping data for table 'suppliers' -# - -INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (1, 'Supplier A', 'Andersen', 'Elizabeth A.', NULL, 'Sales Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, ''); -INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (2, 'Supplier B', 'Weiler', 'Cornelia', NULL, 'Sales Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, ''); -INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (3, 'Supplier C', 'Kelley', 'Madeleine', NULL, 'Sales Representative', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, ''); -INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (4, 'Supplier D', 'Sato', 'Naoki', NULL, 'Marketing Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, ''); -INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (5, 'Supplier E', 'Hernandez-Echevarria', 'Amaya', NULL, 'Sales Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, ''); -INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, 
`zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (6, 'Supplier F', 'Hayakawa', 'Satomi', NULL, 'Marketing Assistant', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '');
-INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (7, 'Supplier G', 'Glasson', 'Stuart', NULL, 'Marketing Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '');
-INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (8, 'Supplier H', 'Dunton', 'Bryn Paul', NULL, 'Sales Representative', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '');
-INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (9, 'Supplier I', 'Sandberg', 'Mikael', NULL, 'Sales Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '');
-INSERT INTO `suppliers` (`id`, `company`, `last_name`, `first_name`, `email_address`, `job_title`, `business_phone`, `home_phone`, `mobile_phone`, `fax_number`, `address`, `city`, `state_province`, `zip_postal_code`, `country_region`, `web_page`, `notes`, `attachments`) VALUES (10, 'Supplier J', 'Sousa', 'Luis', NULL, 'Sales Manager', NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, '');
-# 10 records
-
-SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS;
-SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS;
\ No newline at end of file
diff --git a/database-files/03_add_to_northwind.sql b/database-files/03_add_to_northwind.sql
deleted file mode 100644
index 4587e2b616..0000000000
--- a/database-files/03_add_to_northwind.sql
+++ /dev/null
@@ -1,22 +0,0 @@
-USE northwind;
-
--- -----------------------------------------------------
--- Model Params Table and data added by Dr. Fontenot
--- -----------------------------------------------------
-CREATE TABLE IF NOT EXISTS model1_param_vals(
- sequence_number INTEGER AUTO_INCREMENT PRIMARY KEY,
- beta_0 FLOAT,
- beta_1 FLOAT,
- beta_2 FLOAT
-);
-
-INSERT INTO model1_param_vals(beta_0, beta_1, beta_2) values (0.1214, 0.2354, 0.3245);
-
-CREATE TABLE IF NOT EXISTS model1_params(
- sequence_number INTEGER AUTO_INCREMENT PRIMARY KEY,
- beta_vals varchar(100)
-);
-
-INSERT INTO model1_params (beta_vals) VALUES ("[0.124, 0.2354, 0.3245]");
-
-commit;
diff --git a/database-files/README.md b/database-files/README.md
deleted file mode 100644
index 04e8e63b89..0000000000
--- a/database-files/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# `database-files` Folder
-
-TODO: Put some notes here about how this works. include how to re-bootstrap the db.
\ No newline at end of file diff --git a/database-files/classicModels.sql b/database-files/classicModels.sql deleted file mode 100644 index 0b26e399ef..0000000000 --- a/database-files/classicModels.sql +++ /dev/null @@ -1,7933 +0,0 @@ -/* -********************************************************************* -http://www.mysqltutorial.org -********************************************************************* -Name: MySQL Sample Database classicmodels -Link: http://www.mysqltutorial.org/mysql-sample-database.aspx -Version 3.1 -+ changed data type from DOUBLE to DECIMAL for amount columns -Version 3.0 -+ changed DATETIME to DATE for some colunmns -Version 2.0 -+ changed table type from MyISAM to InnoDB -+ added foreign keys for all tables -********************************************************************* -*/ - - -/*!40101 SET NAMES utf8 */; - -/*!40101 SET SQL_MODE=''*/; - -/*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */; -/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */; -/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */; -/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES, SQL_NOTES=0 */; -CREATE DATABASE /*!32312 IF NOT EXISTS*/`classicmodels` /*!40100 DEFAULT CHARACTER SET latin1 */; - -USE `classicmodels`; - -flush privileges; - -/*Table structure for table `customers` */ - -DROP TABLE IF EXISTS `customers`; - -CREATE TABLE `customers` ( - `customerNumber` int(11) NOT NULL, - `customerName` varchar(50) NOT NULL, - `contactLastName` varchar(50) NOT NULL, - `contactFirstName` varchar(50) NOT NULL, - `phone` varchar(50) NOT NULL, - `addressLine1` varchar(50) NOT NULL, - `addressLine2` varchar(50) DEFAULT NULL, - `city` varchar(50) NOT NULL, - `state` varchar(50) DEFAULT NULL, - `postalCode` varchar(15) DEFAULT NULL, - `country` varchar(50) NOT NULL, - `salesRepEmployeeNumber` int(11) DEFAULT NULL, - `creditLimit` decimal(10,2) DEFAULT NULL, - PRIMARY KEY (`customerNumber`), - KEY `salesRepEmployeeNumber` (`salesRepEmployeeNumber`), - CONSTRAINT `customers_ibfk_1` FOREIGN KEY (`salesRepEmployeeNumber`) REFERENCES `employees` (`employeeNumber`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `customers` */ - -insert into `customers`(`customerNumber`,`customerName`,`contactLastName`,`contactFirstName`,`phone`,`addressLine1`,`addressLine2`,`city`,`state`,`postalCode`,`country`,`salesRepEmployeeNumber`,`creditLimit`) values - -(103,'Atelier graphique','Schmitt','Carine ','40.32.2555','54, rue Royale',NULL,'Nantes',NULL,'44000','France',1370,'21000.00'), - -(112,'Signal Gift Stores','King','Jean','7025551838','8489 Strong St.',NULL,'Las Vegas','NV','83030','USA',1166,'71800.00'), - -(114,'Australian Collectors, Co.','Ferguson','Peter','03 9520 4555','636 St Kilda Road','Level 3','Melbourne','Victoria','3004','Australia',1611,'117300.00'), - -(119,'La Rochelle Gifts','Labrune','Janine ','40.67.8555','67, rue des Cinquante Otages',NULL,'Nantes',NULL,'44000','France',1370,'118200.00'), - -(121,'Baane Mini Imports','Bergulfsen','Jonas ','07-98 9555','Erling Skakkes gate 78',NULL,'Stavern',NULL,'4110','Norway',1504,'81700.00'), - -(124,'Mini Gifts Distributors Ltd.','Nelson','Susan','4155551450','5677 Strong St.',NULL,'San Rafael','CA','97562','USA',1165,'210500.00'), - -(125,'Havel & Zbyszek Co','Piestrzeniewicz','Zbyszek ','(26) 642-7555','ul. Filtrowa 68',NULL,'Warszawa',NULL,'01-012','Poland',NULL,'0.00'), - -(128,'Blauer See Auto, Co.','Keitel','Roland','+49 69 66 90 2555','Lyonerstr. 
34',NULL,'Frankfurt',NULL,'60528','Germany',1504,'59700.00'), - -(129,'Mini Wheels Co.','Murphy','Julie','6505555787','5557 North Pendale Street',NULL,'San Francisco','CA','94217','USA',1165,'64600.00'), - -(131,'Land of Toys Inc.','Lee','Kwai','2125557818','897 Long Airport Avenue',NULL,'NYC','NY','10022','USA',1323,'114900.00'), - -(141,'Euro+ Shopping Channel','Freyre','Diego ','(91) 555 94 44','C/ Moralzarzal, 86',NULL,'Madrid',NULL,'28034','Spain',1370,'227600.00'), - -(144,'Volvo Model Replicas, Co','Berglund','Christina ','0921-12 3555','Berguvsvägen 8',NULL,'Luleå',NULL,'S-958 22','Sweden',1504,'53100.00'), - -(145,'Danish Wholesale Imports','Petersen','Jytte ','31 12 3555','Vinbæltet 34',NULL,'Kobenhavn',NULL,'1734','Denmark',1401,'83400.00'), - -(146,'Saveley & Henriot, Co.','Saveley','Mary ','78.32.5555','2, rue du Commerce',NULL,'Lyon',NULL,'69004','France',1337,'123900.00'), - -(148,'Dragon Souveniers, Ltd.','Natividad','Eric','+65 221 7555','Bronz Sok.','Bronz Apt. 3/6 Tesvikiye','Singapore',NULL,'079903','Singapore',1621,'103800.00'), - -(151,'Muscle Machine Inc','Young','Jeff','2125557413','4092 Furth Circle','Suite 400','NYC','NY','10022','USA',1286,'138500.00'), - -(157,'Diecast Classics Inc.','Leong','Kelvin','2155551555','7586 Pompton St.',NULL,'Allentown','PA','70267','USA',1216,'100600.00'), - -(161,'Technics Stores Inc.','Hashimoto','Juri','6505556809','9408 Furth Circle',NULL,'Burlingame','CA','94217','USA',1165,'84600.00'), - -(166,'Handji Gifts& Co','Victorino','Wendy','+65 224 1555','106 Linden Road Sandown','2nd Floor','Singapore',NULL,'069045','Singapore',1612,'97900.00'), - -(167,'Herkku Gifts','Oeztan','Veysel','+47 2267 3215','Brehmen St. 121','PR 334 Sentrum','Bergen',NULL,'N 5804','Norway ',1504,'96800.00'), - -(168,'American Souvenirs Inc','Franco','Keith','2035557845','149 Spinnaker Dr.','Suite 101','New Haven','CT','97823','USA',1286,'0.00'), - -(169,'Porto Imports Co.','de Castro','Isabel ','(1) 356-5555','Estrada da saúde n. 58',NULL,'Lisboa',NULL,'1756','Portugal',NULL,'0.00'), - -(171,'Daedalus Designs Imports','Rancé','Martine ','20.16.1555','184, chaussée de Tournai',NULL,'Lille',NULL,'59000','France',1370,'82900.00'), - -(172,'La Corne D\'abondance, Co.','Bertrand','Marie','(1) 42.34.2555','265, boulevard Charonne',NULL,'Paris',NULL,'75012','France',1337,'84300.00'), - -(173,'Cambridge Collectables Co.','Tseng','Jerry','6175555555','4658 Baden Av.',NULL,'Cambridge','MA','51247','USA',1188,'43400.00'), - -(175,'Gift Depot Inc.','King','Julie','2035552570','25593 South Bay Ln.',NULL,'Bridgewater','CT','97562','USA',1323,'84300.00'), - -(177,'Osaka Souveniers Co.','Kentary','Mory','+81 06 6342 5555','1-6-20 Dojima',NULL,'Kita-ku','Osaka',' 530-0003','Japan',1621,'81200.00'), - -(181,'Vitachrome Inc.','Frick','Michael','2125551500','2678 Kingston Rd.','Suite 101','NYC','NY','10022','USA',1286,'76400.00'), - -(186,'Toys of Finland, Co.','Karttunen','Matti','90-224 8555','Keskuskatu 45',NULL,'Helsinki',NULL,'21240','Finland',1501,'96500.00'), - -(187,'AV Stores, Co.','Ashworth','Rachel','(171) 555-1555','Fauntleroy Circus',NULL,'Manchester',NULL,'EC2 5NT','UK',1501,'136800.00'), - -(189,'Clover Collections, Co.','Cassidy','Dean','+353 1862 1555','25 Maiden Lane','Floor No. 
4','Dublin',NULL,'2','Ireland',1504,'69400.00'), - -(198,'Auto-Moto Classics Inc.','Taylor','Leslie','6175558428','16780 Pompton St.',NULL,'Brickhaven','MA','58339','USA',1216,'23000.00'), - -(201,'UK Collectables, Ltd.','Devon','Elizabeth','(171) 555-2282','12, Berkeley Gardens Blvd',NULL,'Liverpool',NULL,'WX1 6LT','UK',1501,'92700.00'), - -(202,'Canadian Gift Exchange Network','Tamuri','Yoshi ','(604) 555-3392','1900 Oak St.',NULL,'Vancouver','BC','V3F 2K1','Canada',1323,'90300.00'), - -(204,'Online Mini Collectables','Barajas','Miguel','6175557555','7635 Spinnaker Dr.',NULL,'Brickhaven','MA','58339','USA',1188,'68700.00'), - -(205,'Toys4GrownUps.com','Young','Julie','6265557265','78934 Hillside Dr.',NULL,'Pasadena','CA','90003','USA',1166,'90700.00'), - -(206,'Asian Shopping Network, Co','Walker','Brydey','+612 9411 1555','Suntec Tower Three','8 Temasek','Singapore',NULL,'038988','Singapore',NULL,'0.00'), - -(209,'Mini Caravy','Citeaux','Frédérique ','88.60.1555','24, place Kléber',NULL,'Strasbourg',NULL,'67000','France',1370,'53800.00'), - -(211,'King Kong Collectables, Co.','Gao','Mike','+852 2251 1555','Bank of China Tower','1 Garden Road','Central Hong Kong',NULL,NULL,'Hong Kong',1621,'58600.00'), - -(216,'Enaco Distributors','Saavedra','Eduardo ','(93) 203 4555','Rambla de Cataluña, 23',NULL,'Barcelona',NULL,'08022','Spain',1702,'60300.00'), - -(219,'Boards & Toys Co.','Young','Mary','3105552373','4097 Douglas Av.',NULL,'Glendale','CA','92561','USA',1166,'11000.00'), - -(223,'Natürlich Autos','Kloss','Horst ','0372-555188','Taucherstraße 10',NULL,'Cunewalde',NULL,'01307','Germany',NULL,'0.00'), - -(227,'Heintze Collectables','Ibsen','Palle','86 21 3555','Smagsloget 45',NULL,'Århus',NULL,'8200','Denmark',1401,'120800.00'), - -(233,'Québec Home Shopping Network','Fresnière','Jean ','(514) 555-8054','43 rue St. Laurent',NULL,'Montréal','Québec','H1J 1C3','Canada',1286,'48700.00'), - -(237,'ANG Resellers','Camino','Alejandra ','(91) 745 6555','Gran Vía, 1',NULL,'Madrid',NULL,'28001','Spain',NULL,'0.00'), - -(239,'Collectable Mini Designs Co.','Thompson','Valarie','7605558146','361 Furth Circle',NULL,'San Diego','CA','91217','USA',1166,'105000.00'), - -(240,'giftsbymail.co.uk','Bennett','Helen ','(198) 555-8888','Garden House','Crowther Way 23','Cowes','Isle of Wight','PO31 7PJ','UK',1501,'93900.00'), - -(242,'Alpha Cognac','Roulet','Annette ','61.77.6555','1 rue Alsace-Lorraine',NULL,'Toulouse',NULL,'31000','France',1370,'61100.00'), - -(247,'Messner Shopping Network','Messner','Renate ','069-0555984','Magazinweg 7',NULL,'Frankfurt',NULL,'60528','Germany',NULL,'0.00'), - -(249,'Amica Models & Co.','Accorti','Paolo ','011-4988555','Via Monte Bianco 34',NULL,'Torino',NULL,'10100','Italy',1401,'113000.00'), - -(250,'Lyon Souveniers','Da Silva','Daniel','+33 1 46 62 7555','27 rue du Colonel Pierre Avia',NULL,'Paris',NULL,'75508','France',1337,'68100.00'), - -(256,'Auto Associés & Cie.','Tonini','Daniel ','30.59.8555','67, avenue de l\'Europe',NULL,'Versailles',NULL,'78000','France',1370,'77900.00'), - -(259,'Toms Spezialitäten, Ltd','Pfalzheim','Henriette ','0221-5554327','Mehrheimerstr. 
369',NULL,'Köln',NULL,'50739','Germany',1504,'120400.00'), - -(260,'Royal Canadian Collectables, Ltd.','Lincoln','Elizabeth ','(604) 555-4555','23 Tsawassen Blvd.',NULL,'Tsawassen','BC','T2F 8M4','Canada',1323,'89600.00'), - -(273,'Franken Gifts, Co','Franken','Peter ','089-0877555','Berliner Platz 43',NULL,'München',NULL,'80805','Germany',NULL,'0.00'), - -(276,'Anna\'s Decorations, Ltd','O\'Hara','Anna','02 9936 8555','201 Miller Street','Level 15','North Sydney','NSW','2060','Australia',1611,'107800.00'), - -(278,'Rovelli Gifts','Rovelli','Giovanni ','035-640555','Via Ludovico il Moro 22',NULL,'Bergamo',NULL,'24100','Italy',1401,'119600.00'), - -(282,'Souveniers And Things Co.','Huxley','Adrian','+61 2 9495 8555','Monitor Money Building','815 Pacific Hwy','Chatswood','NSW','2067','Australia',1611,'93300.00'), - -(286,'Marta\'s Replicas Co.','Hernandez','Marta','6175558555','39323 Spinnaker Dr.',NULL,'Cambridge','MA','51247','USA',1216,'123700.00'), - -(293,'BG&E Collectables','Harrison','Ed','+41 26 425 50 01','Rte des Arsenaux 41 ',NULL,'Fribourg',NULL,'1700','Switzerland',NULL,'0.00'), - -(298,'Vida Sport, Ltd','Holz','Mihael','0897-034555','Grenzacherweg 237',NULL,'Genève',NULL,'1203','Switzerland',1702,'141300.00'), - -(299,'Norway Gifts By Mail, Co.','Klaeboe','Jan','+47 2212 1555','Drammensveien 126A','PB 211 Sentrum','Oslo',NULL,'N 0106','Norway ',1504,'95100.00'), - -(303,'Schuyler Imports','Schuyler','Bradley','+31 20 491 9555','Kingsfordweg 151',NULL,'Amsterdam',NULL,'1043 GR','Netherlands',NULL,'0.00'), - -(307,'Der Hund Imports','Andersen','Mel','030-0074555','Obere Str. 57',NULL,'Berlin',NULL,'12209','Germany',NULL,'0.00'), - -(311,'Oulu Toy Supplies, Inc.','Koskitalo','Pirkko','981-443655','Torikatu 38',NULL,'Oulu',NULL,'90110','Finland',1501,'90500.00'), - -(314,'Petit Auto','Dewey','Catherine ','(02) 5554 67','Rue Joseph-Bens 532',NULL,'Bruxelles',NULL,'B-1180','Belgium',1401,'79900.00'), - -(319,'Mini Classics','Frick','Steve','9145554562','3758 North Pendale Street',NULL,'White Plains','NY','24067','USA',1323,'102700.00'), - -(320,'Mini Creations Ltd.','Huang','Wing','5085559555','4575 Hillside Dr.',NULL,'New Bedford','MA','50553','USA',1188,'94500.00'), - -(321,'Corporate Gift Ideas Co.','Brown','Julie','6505551386','7734 Strong St.',NULL,'San Francisco','CA','94217','USA',1165,'105000.00'), - -(323,'Down Under Souveniers, Inc','Graham','Mike','+64 9 312 5555','162-164 Grafton Road','Level 2','Auckland ',NULL,NULL,'New Zealand',1612,'88000.00'), - -(324,'Stylish Desk Decors, Co.','Brown','Ann ','(171) 555-0297','35 King George',NULL,'London',NULL,'WX3 6FW','UK',1501,'77000.00'), - -(328,'Tekni Collectables Inc.','Brown','William','2015559350','7476 Moss Rd.',NULL,'Newark','NJ','94019','USA',1323,'43000.00'), - -(333,'Australian Gift Network, Co','Calaghan','Ben','61-7-3844-6555','31 Duncan St. West End',NULL,'South Brisbane','Queensland','4101','Australia',1611,'51600.00'), - -(334,'Suominen Souveniers','Suominen','Kalle','+358 9 8045 555','Software Engineering Center','SEC Oy','Espoo',NULL,'FIN-02271','Finland',1501,'98800.00'), - -(335,'Cramer Spezialitäten, Ltd','Cramer','Philip ','0555-09555','Maubelstr. 
90',NULL,'Brandenburg',NULL,'14776','Germany',NULL,'0.00'), - -(339,'Classic Gift Ideas, Inc','Cervantes','Francisca','2155554695','782 First Street',NULL,'Philadelphia','PA','71270','USA',1188,'81100.00'), - -(344,'CAF Imports','Fernandez','Jesus','+34 913 728 555','Merchants House','27-30 Merchant\'s Quay','Madrid',NULL,'28023','Spain',1702,'59600.00'), - -(347,'Men \'R\' US Retailers, Ltd.','Chandler','Brian','2155554369','6047 Douglas Av.',NULL,'Los Angeles','CA','91003','USA',1166,'57700.00'), - -(348,'Asian Treasures, Inc.','McKenna','Patricia ','2967 555','8 Johnstown Road',NULL,'Cork','Co. Cork',NULL,'Ireland',NULL,'0.00'), - -(350,'Marseille Mini Autos','Lebihan','Laurence ','91.24.4555','12, rue des Bouchers',NULL,'Marseille',NULL,'13008','France',1337,'65000.00'), - -(353,'Reims Collectables','Henriot','Paul ','26.47.1555','59 rue de l\'Abbaye',NULL,'Reims',NULL,'51100','France',1337,'81100.00'), - -(356,'SAR Distributors, Co','Kuger','Armand','+27 21 550 3555','1250 Pretorius Street',NULL,'Hatfield','Pretoria','0028','South Africa',NULL,'0.00'), - -(357,'GiftsForHim.com','MacKinlay','Wales','64-9-3763555','199 Great North Road',NULL,'Auckland',NULL,NULL,'New Zealand',1612,'77700.00'), - -(361,'Kommission Auto','Josephs','Karin','0251-555259','Luisenstr. 48',NULL,'Münster',NULL,'44087','Germany',NULL,'0.00'), - -(362,'Gifts4AllAges.com','Yoshido','Juri','6175559555','8616 Spinnaker Dr.',NULL,'Boston','MA','51003','USA',1216,'41900.00'), - -(363,'Online Diecast Creations Co.','Young','Dorothy','6035558647','2304 Long Airport Avenue',NULL,'Nashua','NH','62005','USA',1216,'114200.00'), - -(369,'Lisboa Souveniers, Inc','Rodriguez','Lino ','(1) 354-2555','Jardim das rosas n. 32',NULL,'Lisboa',NULL,'1675','Portugal',NULL,'0.00'), - -(376,'Precious Collectables','Urs','Braun','0452-076555','Hauptstr. 29',NULL,'Bern',NULL,'3012','Switzerland',1702,'0.00'), - -(379,'Collectables For Less Inc.','Nelson','Allen','6175558555','7825 Douglas Av.',NULL,'Brickhaven','MA','58339','USA',1188,'70700.00'), - -(381,'Royale Belge','Cartrain','Pascale ','(071) 23 67 2555','Boulevard Tirou, 255',NULL,'Charleroi',NULL,'B-6000','Belgium',1401,'23500.00'), - -(382,'Salzburg Collectables','Pipps','Georg ','6562-9555','Geislweg 14',NULL,'Salzburg',NULL,'5020','Austria',1401,'71700.00'), - -(385,'Cruz & Sons Co.','Cruz','Arnold','+63 2 555 3587','15 McCallum Street','NatWest Center #13-03','Makati City',NULL,'1227 MM','Philippines',1621,'81500.00'), - -(386,'L\'ordine Souveniers','Moroni','Maurizio ','0522-556555','Strada Provinciale 124',NULL,'Reggio Emilia',NULL,'42100','Italy',1401,'121400.00'), - -(398,'Tokyo Collectables, Ltd','Shimamura','Akiko','+81 3 3584 0555','2-2-8 Roppongi',NULL,'Minato-ku','Tokyo','106-0032','Japan',1621,'94400.00'), - -(406,'Auto Canal+ Petit','Perrier','Dominique','(1) 47.55.6555','25, rue Lauriston',NULL,'Paris',NULL,'75016','France',1337,'95000.00'), - -(409,'Stuttgart Collectable Exchange','Müller','Rita ','0711-555361','Adenauerallee 900',NULL,'Stuttgart',NULL,'70563','Germany',NULL,'0.00'), - -(412,'Extreme Desk Decorations, Ltd','McRoy','Sarah','04 499 9555','101 Lambton Quay','Level 11','Wellington',NULL,NULL,'New Zealand',1612,'86800.00'), - -(415,'Bavarian Collectables Imports, Co.','Donnermeyer','Michael',' +49 89 61 08 9555','Hansastr. 
15',NULL,'Munich',NULL,'80686','Germany',1504,'77000.00'), - -(424,'Classic Legends Inc.','Hernandez','Maria','2125558493','5905 Pompton St.','Suite 750','NYC','NY','10022','USA',1286,'67500.00'), - -(443,'Feuer Online Stores, Inc','Feuer','Alexander ','0342-555176','Heerstr. 22',NULL,'Leipzig',NULL,'04179','Germany',NULL,'0.00'), - -(447,'Gift Ideas Corp.','Lewis','Dan','2035554407','2440 Pompton St.',NULL,'Glendale','CT','97561','USA',1323,'49700.00'), - -(448,'Scandinavian Gift Ideas','Larsson','Martha','0695-34 6555','Åkergatan 24',NULL,'Bräcke',NULL,'S-844 67','Sweden',1504,'116400.00'), - -(450,'The Sharp Gifts Warehouse','Frick','Sue','4085553659','3086 Ingle Ln.',NULL,'San Jose','CA','94217','USA',1165,'77600.00'), - -(452,'Mini Auto Werke','Mendel','Roland ','7675-3555','Kirchgasse 6',NULL,'Graz',NULL,'8010','Austria',1401,'45300.00'), - -(455,'Super Scale Inc.','Murphy','Leslie','2035559545','567 North Pendale Street',NULL,'New Haven','CT','97823','USA',1286,'95400.00'), - -(456,'Microscale Inc.','Choi','Yu','2125551957','5290 North Pendale Street','Suite 200','NYC','NY','10022','USA',1286,'39800.00'), - -(458,'Corrida Auto Replicas, Ltd','Sommer','Martín ','(91) 555 22 82','C/ Araquil, 67',NULL,'Madrid',NULL,'28023','Spain',1702,'104600.00'), - -(459,'Warburg Exchange','Ottlieb','Sven ','0241-039123','Walserweg 21',NULL,'Aachen',NULL,'52066','Germany',NULL,'0.00'), - -(462,'FunGiftIdeas.com','Benitez','Violeta','5085552555','1785 First Street',NULL,'New Bedford','MA','50553','USA',1216,'85800.00'), - -(465,'Anton Designs, Ltd.','Anton','Carmen','+34 913 728555','c/ Gobelas, 19-1 Urb. La Florida',NULL,'Madrid',NULL,'28023','Spain',NULL,'0.00'), - -(471,'Australian Collectables, Ltd','Clenahan','Sean','61-9-3844-6555','7 Allen Street',NULL,'Glen Waverly','Victoria','3150','Australia',1611,'60300.00'), - -(473,'Frau da Collezione','Ricotti','Franco','+39 022515555','20093 Cologno Monzese','Alessandro Volta 16','Milan',NULL,NULL,'Italy',1401,'34800.00'), - -(475,'West Coast Collectables Co.','Thompson','Steve','3105553722','3675 Furth Circle',NULL,'Burbank','CA','94019','USA',1166,'55400.00'), - -(477,'Mit Vergnügen & Co.','Moos','Hanna ','0621-08555','Forsterstr. 
57',NULL,'Mannheim',NULL,'68306','Germany',NULL,'0.00'), - -(480,'Kremlin Collectables, Co.','Semenov','Alexander ','+7 812 293 0521','2 Pobedy Square',NULL,'Saint Petersburg',NULL,'196143','Russia',NULL,'0.00'), - -(481,'Raanan Stores, Inc','Altagar,G M','Raanan','+ 972 9 959 8555','3 Hagalim Blv.',NULL,'Herzlia',NULL,'47625','Israel',NULL,'0.00'), - -(484,'Iberia Gift Imports, Corp.','Roel','José Pedro ','(95) 555 82 82','C/ Romero, 33',NULL,'Sevilla',NULL,'41101','Spain',1702,'65700.00'), - -(486,'Motor Mint Distributors Inc.','Salazar','Rosa','2155559857','11328 Douglas Av.',NULL,'Philadelphia','PA','71270','USA',1323,'72600.00'), - -(487,'Signal Collectibles Ltd.','Taylor','Sue','4155554312','2793 Furth Circle',NULL,'Brisbane','CA','94217','USA',1165,'60300.00'), - -(489,'Double Decker Gift Stores, Ltd','Smith','Thomas ','(171) 555-7555','120 Hanover Sq.',NULL,'London',NULL,'WA1 1DP','UK',1501,'43300.00'), - -(495,'Diecast Collectables','Franco','Valarie','6175552555','6251 Ingle Ln.',NULL,'Boston','MA','51003','USA',1188,'85100.00'), - -(496,'Kelly\'s Gift Shop','Snowden','Tony','+64 9 5555500','Arenales 1938 3\'A\'',NULL,'Auckland ',NULL,NULL,'New Zealand',1612,'110000.00'); - -/*Table structure for table `employees` */ - -DROP TABLE IF EXISTS `employees`; - -CREATE TABLE `employees` ( - `employeeNumber` int(11) NOT NULL, - `lastName` varchar(50) NOT NULL, - `firstName` varchar(50) NOT NULL, - `extension` varchar(10) NOT NULL, - `email` varchar(100) NOT NULL, - `officeCode` varchar(10) NOT NULL, - `reportsTo` int(11) DEFAULT NULL, - `jobTitle` varchar(50) NOT NULL, - PRIMARY KEY (`employeeNumber`), - KEY `reportsTo` (`reportsTo`), - KEY `officeCode` (`officeCode`), - CONSTRAINT `employees_ibfk_1` FOREIGN KEY (`reportsTo`) REFERENCES `employees` (`employeeNumber`), - CONSTRAINT `employees_ibfk_2` FOREIGN KEY (`officeCode`) REFERENCES `offices` (`officeCode`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `employees` */ - -insert into `employees`(`employeeNumber`,`lastName`,`firstName`,`extension`,`email`,`officeCode`,`reportsTo`,`jobTitle`) values - -(1002,'Murphy','Diane','x5800','dmurphy@classicmodelcars.com','1',NULL,'President'), - -(1056,'Patterson','Mary','x4611','mpatterso@classicmodelcars.com','1',1002,'VP Sales'), - -(1076,'Firrelli','Jeff','x9273','jfirrelli@classicmodelcars.com','1',1002,'VP Marketing'), - -(1088,'Patterson','William','x4871','wpatterson@classicmodelcars.com','6',1056,'Sales Manager (APAC)'), - -(1102,'Bondur','Gerard','x5408','gbondur@classicmodelcars.com','4',1056,'Sale Manager (EMEA)'), - -(1143,'Bow','Anthony','x5428','abow@classicmodelcars.com','1',1056,'Sales Manager (NA)'), - -(1165,'Jennings','Leslie','x3291','ljennings@classicmodelcars.com','1',1143,'Sales Rep'), - -(1166,'Thompson','Leslie','x4065','lthompson@classicmodelcars.com','1',1143,'Sales Rep'), - -(1188,'Firrelli','Julie','x2173','jfirrelli@classicmodelcars.com','2',1143,'Sales Rep'), - -(1216,'Patterson','Steve','x4334','spatterson@classicmodelcars.com','2',1143,'Sales Rep'), - -(1286,'Tseng','Foon Yue','x2248','ftseng@classicmodelcars.com','3',1143,'Sales Rep'), - -(1323,'Vanauf','George','x4102','gvanauf@classicmodelcars.com','3',1143,'Sales Rep'), - -(1337,'Bondur','Loui','x6493','lbondur@classicmodelcars.com','4',1102,'Sales Rep'), - -(1370,'Hernandez','Gerard','x2028','ghernande@classicmodelcars.com','4',1102,'Sales Rep'), - -(1401,'Castillo','Pamela','x2759','pcastillo@classicmodelcars.com','4',1102,'Sales Rep'), - 
-(1501,'Bott','Larry','x2311','lbott@classicmodelcars.com','7',1102,'Sales Rep'), - -(1504,'Jones','Barry','x102','bjones@classicmodelcars.com','7',1102,'Sales Rep'), - -(1611,'Fixter','Andy','x101','afixter@classicmodelcars.com','6',1088,'Sales Rep'), - -(1612,'Marsh','Peter','x102','pmarsh@classicmodelcars.com','6',1088,'Sales Rep'), - -(1619,'King','Tom','x103','tking@classicmodelcars.com','6',1088,'Sales Rep'), - -(1621,'Nishi','Mami','x101','mnishi@classicmodelcars.com','5',1056,'Sales Rep'), - -(1625,'Kato','Yoshimi','x102','ykato@classicmodelcars.com','5',1621,'Sales Rep'), - -(1702,'Gerard','Martin','x2312','mgerard@classicmodelcars.com','4',1102,'Sales Rep'); - -/*Table structure for table `offices` */ - -DROP TABLE IF EXISTS `offices`; - -CREATE TABLE `offices` ( - `officeCode` varchar(10) NOT NULL, - `city` varchar(50) NOT NULL, - `phone` varchar(50) NOT NULL, - `addressLine1` varchar(50) NOT NULL, - `addressLine2` varchar(50) DEFAULT NULL, - `state` varchar(50) DEFAULT NULL, - `country` varchar(50) NOT NULL, - `postalCode` varchar(15) NOT NULL, - `territory` varchar(10) NOT NULL, - PRIMARY KEY (`officeCode`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `offices` */ - -insert into `offices`(`officeCode`,`city`,`phone`,`addressLine1`,`addressLine2`,`state`,`country`,`postalCode`,`territory`) values - -('1','San Francisco','+1 650 219 4782','100 Market Street','Suite 300','CA','USA','94080','NA'), - -('2','Boston','+1 215 837 0825','1550 Court Place','Suite 102','MA','USA','02107','NA'), - -('3','NYC','+1 212 555 3000','523 East 53rd Street','apt. 5A','NY','USA','10022','NA'), - -('4','Paris','+33 14 723 4404','43 Rue Jouffroy D\'abbans',NULL,NULL,'France','75017','EMEA'), - -('5','Tokyo','+81 33 224 5000','4-1 Kioicho',NULL,'Chiyoda-Ku','Japan','102-8578','Japan'), - -('6','Sydney','+61 2 9264 2451','5-11 Wentworth Avenue','Floor #2',NULL,'Australia','NSW 2010','APAC'), - -('7','London','+44 20 7877 2041','25 Old Broad Street','Level 7',NULL,'UK','EC2N 1HN','EMEA'); - -/*Table structure for table `orderdetails` */ - -DROP TABLE IF EXISTS `orderdetails`; - -CREATE TABLE `orderdetails` ( - `orderNumber` int(11) NOT NULL, - `productCode` varchar(15) NOT NULL, - `quantityOrdered` int(11) NOT NULL, - `priceEach` decimal(10,2) NOT NULL, - `orderLineNumber` smallint(6) NOT NULL, - PRIMARY KEY (`orderNumber`,`productCode`), - KEY `productCode` (`productCode`), - CONSTRAINT `orderdetails_ibfk_1` FOREIGN KEY (`orderNumber`) REFERENCES `orders` (`orderNumber`), - CONSTRAINT `orderdetails_ibfk_2` FOREIGN KEY (`productCode`) REFERENCES `products` (`productCode`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `orderdetails` */ - -insert into `orderdetails`(`orderNumber`,`productCode`,`quantityOrdered`,`priceEach`,`orderLineNumber`) values - -(10100,'S18_1749',30,'136.00',3), - -(10100,'S18_2248',50,'55.09',2), - -(10100,'S18_4409',22,'75.46',4), - -(10100,'S24_3969',49,'35.29',1), - -(10101,'S18_2325',25,'108.06',4), - -(10101,'S18_2795',26,'167.06',1), - -(10101,'S24_1937',45,'32.53',3), - -(10101,'S24_2022',46,'44.35',2), - -(10102,'S18_1342',39,'95.55',2), - -(10102,'S18_1367',41,'43.13',1), - -(10103,'S10_1949',26,'214.30',11), - -(10103,'S10_4962',42,'119.67',4), - -(10103,'S12_1666',27,'121.64',8), - -(10103,'S18_1097',35,'94.50',10), - -(10103,'S18_2432',22,'58.34',2), - -(10103,'S18_2949',27,'92.19',12), - -(10103,'S18_2957',35,'61.84',14), - -(10103,'S18_3136',25,'86.92',13), - -(10103,'S18_3320',46,'86.31',16), - -(10103,'S18_4600',36,'98.07',5), 
- -(10103,'S18_4668',41,'40.75',9), - -(10103,'S24_2300',36,'107.34',1), - -(10103,'S24_4258',25,'88.62',15), - -(10103,'S32_1268',31,'92.46',3), - -(10103,'S32_3522',45,'63.35',7), - -(10103,'S700_2824',42,'94.07',6), - -(10104,'S12_3148',34,'131.44',1), - -(10104,'S12_4473',41,'111.39',9), - -(10104,'S18_2238',24,'135.90',8), - -(10104,'S18_2319',29,'122.73',12), - -(10104,'S18_3232',23,'165.95',13), - -(10104,'S18_4027',38,'119.20',3), - -(10104,'S24_1444',35,'52.02',6), - -(10104,'S24_2840',44,'30.41',10), - -(10104,'S24_4048',26,'106.45',5), - -(10104,'S32_2509',35,'51.95',11), - -(10104,'S32_3207',49,'56.55',4), - -(10104,'S50_1392',33,'114.59',7), - -(10104,'S50_1514',32,'53.31',2), - -(10105,'S10_4757',50,'127.84',2), - -(10105,'S12_1108',41,'205.72',15), - -(10105,'S12_3891',29,'141.88',14), - -(10105,'S18_3140',22,'136.59',11), - -(10105,'S18_3259',38,'87.73',13), - -(10105,'S18_4522',41,'75.48',10), - -(10105,'S24_2011',43,'117.97',9), - -(10105,'S24_3151',44,'73.46',4), - -(10105,'S24_3816',50,'75.47',1), - -(10105,'S700_1138',41,'54.00',5), - -(10105,'S700_1938',29,'86.61',12), - -(10105,'S700_2610',31,'60.72',3), - -(10105,'S700_3505',39,'92.16',6), - -(10105,'S700_3962',22,'99.31',7), - -(10105,'S72_3212',25,'44.77',8), - -(10106,'S18_1662',36,'134.04',12), - -(10106,'S18_2581',34,'81.10',2), - -(10106,'S18_3029',41,'80.86',18), - -(10106,'S18_3856',41,'94.22',17), - -(10106,'S24_1785',28,'107.23',4), - -(10106,'S24_2841',49,'65.77',13), - -(10106,'S24_3420',31,'55.89',14), - -(10106,'S24_3949',50,'55.96',11), - -(10106,'S24_4278',26,'71.00',3), - -(10106,'S32_4289',33,'65.35',5), - -(10106,'S50_1341',39,'35.78',6), - -(10106,'S700_1691',31,'91.34',7), - -(10106,'S700_2047',30,'85.09',16), - -(10106,'S700_2466',34,'99.72',9), - -(10106,'S700_2834',32,'113.90',1), - -(10106,'S700_3167',44,'76.00',8), - -(10106,'S700_4002',48,'70.33',10), - -(10106,'S72_1253',48,'43.70',15), - -(10107,'S10_1678',30,'81.35',2), - -(10107,'S10_2016',39,'105.86',5), - -(10107,'S10_4698',27,'172.36',4), - -(10107,'S12_2823',21,'122.00',1), - -(10107,'S18_2625',29,'52.70',6), - -(10107,'S24_1578',25,'96.92',3), - -(10107,'S24_2000',38,'73.12',7), - -(10107,'S32_1374',20,'88.90',8), - -(10108,'S12_1099',33,'165.38',6), - -(10108,'S12_3380',45,'96.30',4), - -(10108,'S12_3990',39,'75.81',7), - -(10108,'S12_4675',36,'107.10',3), - -(10108,'S18_1889',38,'67.76',2), - -(10108,'S18_3278',26,'73.17',9), - -(10108,'S18_3482',29,'132.29',8), - -(10108,'S18_3782',43,'52.84',12), - -(10108,'S18_4721',44,'139.87',11), - -(10108,'S24_2360',35,'64.41',15), - -(10108,'S24_3371',30,'60.01',5), - -(10108,'S24_3856',40,'132.00',1), - -(10108,'S24_4620',31,'67.10',10), - -(10108,'S32_2206',27,'36.21',13), - -(10108,'S32_4485',31,'87.76',16), - -(10108,'S50_4713',34,'74.85',14), - -(10109,'S18_1129',26,'117.48',4), - -(10109,'S18_1984',38,'137.98',3), - -(10109,'S18_2870',26,'126.72',1), - -(10109,'S18_3232',46,'160.87',5), - -(10109,'S18_3685',47,'125.74',2), - -(10109,'S24_2972',29,'32.10',6), - -(10110,'S18_1589',37,'118.22',16), - -(10110,'S18_1749',42,'153.00',7), - -(10110,'S18_2248',32,'51.46',6), - -(10110,'S18_2325',33,'115.69',4), - -(10110,'S18_2795',31,'163.69',1), - -(10110,'S18_4409',28,'81.91',8), - -(10110,'S18_4933',42,'62.00',9), - -(10110,'S24_1046',36,'72.02',13), - -(10110,'S24_1628',29,'43.27',15), - -(10110,'S24_1937',20,'28.88',3), - -(10110,'S24_2022',39,'40.77',2), - -(10110,'S24_2766',43,'82.69',11), - -(10110,'S24_2887',46,'112.74',10), - -(10110,'S24_3191',27,'80.47',12), - 
[... several hundred removed lines omitted: sample order-detail INSERT row values (order numbers 10110–10327) from the template repository's example database seed data in `./database-files`, deleted as part of replacing the template's sample databases ...]
-(10327,'S700_3505',43,'85.14',2), - -(10327,'S700_3962',37,'83.42',3), - -(10327,'S72_3212',37,'48.05',4), - -(10328,'S18_3856',34,'104.81',6), - -(10328,'S24_1785',47,'87.54',14), - -(10328,'S24_2841',48,'67.82',1), - -(10328,'S24_3420',20,'56.55',2), - -(10328,'S24_3949',35,'55.96',3), - -(10328,'S24_4278',43,'69.55',4), - -(10328,'S32_4289',24,'57.10',5), - -(10328,'S50_1341',34,'42.33',7), - -(10328,'S700_1691',27,'84.03',8), - -(10328,'S700_2047',41,'75.13',9), - -(10328,'S700_2466',37,'95.73',10), - -(10328,'S700_2834',33,'117.46',11), - -(10328,'S700_3167',33,'71.20',13), - -(10328,'S700_4002',39,'69.59',12), - -(10329,'S10_1678',42,'80.39',1), - -(10329,'S10_2016',20,'109.42',2), - -(10329,'S10_4698',26,'164.61',3), - -(10329,'S12_1099',41,'182.90',5), - -(10329,'S12_2823',24,'128.03',6), - -(10329,'S12_3380',46,'117.44',13), - -(10329,'S12_3990',33,'74.21',14), - -(10329,'S12_4675',39,'102.49',15), - -(10329,'S18_1889',29,'66.22',9), - -(10329,'S18_2625',38,'55.72',12), - -(10329,'S18_3278',38,'65.13',10), - -(10329,'S24_1578',30,'104.81',7), - -(10329,'S24_2000',37,'71.60',4), - -(10329,'S32_1374',45,'80.91',11), - -(10329,'S72_1253',44,'41.22',8), - -(10330,'S18_3482',37,'136.70',3), - -(10330,'S18_3782',29,'59.06',2), - -(10330,'S18_4721',50,'133.92',4), - -(10330,'S24_2360',42,'56.10',1), - -(10331,'S18_1129',46,'120.31',6), - -(10331,'S18_1589',44,'99.55',14), - -(10331,'S18_1749',44,'154.70',7), - -(10331,'S18_1984',30,'135.14',8), - -(10331,'S18_2870',26,'130.68',10), - -(10331,'S18_3232',27,'169.34',11), - -(10331,'S18_3685',26,'132.80',12), - -(10331,'S24_2972',27,'37.00',13), - -(10331,'S24_3371',25,'55.11',9), - -(10331,'S24_3856',21,'139.03',1), - -(10331,'S24_4620',41,'70.33',2), - -(10331,'S32_2206',28,'33.39',3), - -(10331,'S32_4485',32,'100.01',4), - -(10331,'S50_4713',20,'74.04',5), - -(10332,'S18_1342',46,'89.38',15), - -(10332,'S18_1367',27,'51.21',16), - -(10332,'S18_2248',38,'53.88',9), - -(10332,'S18_2325',35,'116.96',8), - -(10332,'S18_2795',24,'138.38',1), - -(10332,'S18_2957',26,'53.09',17), - -(10332,'S18_3136',40,'100.53',18), - -(10332,'S18_4409',50,'92.03',2), - -(10332,'S18_4933',21,'70.56',3), - -(10332,'S24_1046',23,'61.73',4), - -(10332,'S24_1628',20,'47.29',5), - -(10332,'S24_1937',45,'29.87',6), - -(10332,'S24_2022',26,'43.01',10), - -(10332,'S24_2766',39,'84.51',7), - -(10332,'S24_2887',44,'108.04',11), - -(10332,'S24_3191',45,'77.91',12), - -(10332,'S24_3432',31,'94.23',13), - -(10332,'S24_3969',41,'34.47',14), - -(10333,'S10_1949',26,'188.58',3), - -(10333,'S12_1666',33,'121.64',6), - -(10333,'S18_1097',29,'110.84',7), - -(10333,'S18_2949',31,'95.23',5), - -(10333,'S18_3320',46,'95.24',2), - -(10333,'S18_4668',24,'42.26',8), - -(10333,'S24_4258',39,'95.44',1), - -(10333,'S32_3522',33,'62.05',4), - -(10334,'S10_4962',26,'130.01',2), - -(10334,'S18_2319',46,'108.00',6), - -(10334,'S18_2432',34,'52.87',1), - -(10334,'S18_3232',20,'147.33',3), - -(10334,'S18_4600',49,'101.71',4), - -(10334,'S24_2300',42,'117.57',5), - -(10335,'S24_2840',33,'32.88',2), - -(10335,'S32_1268',44,'77.05',1), - -(10335,'S32_2509',40,'49.78',3), - -(10336,'S12_1108',33,'176.63',10), - -(10336,'S12_3148',33,'126.91',11), - -(10336,'S12_3891',49,'141.88',1), - -(10336,'S12_4473',38,'95.99',3), - -(10336,'S18_2238',49,'153.91',6), - -(10336,'S18_3140',48,'135.22',12), - -(10336,'S18_3259',21,'100.84',7), - -(10336,'S24_1444',45,'49.71',4), - -(10336,'S24_4048',31,'113.55',5), - -(10336,'S32_3207',31,'59.03',9), - -(10336,'S50_1392',23,'109.96',8), - 
-(10336,'S700_2824',46,'94.07',2), - -(10337,'S10_4757',25,'131.92',8), - -(10337,'S18_4027',36,'140.75',3), - -(10337,'S18_4522',29,'76.36',2), - -(10337,'S24_2011',29,'119.20',4), - -(10337,'S50_1514',21,'54.48',6), - -(10337,'S700_1938',36,'73.62',9), - -(10337,'S700_3505',31,'84.14',1), - -(10337,'S700_3962',36,'83.42',7), - -(10337,'S72_3212',42,'49.14',5), - -(10338,'S18_1662',41,'137.19',1), - -(10338,'S18_3029',28,'80.86',3), - -(10338,'S18_3856',45,'93.17',2), - -(10339,'S10_2016',40,'117.75',4), - -(10339,'S10_4698',39,'178.17',3), - -(10339,'S18_2581',27,'79.41',2), - -(10339,'S18_2625',30,'48.46',1), - -(10339,'S24_1578',27,'96.92',10), - -(10339,'S24_1785',21,'106.14',7), - -(10339,'S24_2841',55,'67.82',12), - -(10339,'S24_3151',55,'73.46',13), - -(10339,'S24_3420',29,'57.86',14), - -(10339,'S24_3816',42,'72.96',16), - -(10339,'S24_3949',45,'57.32',11), - -(10339,'S700_1138',22,'53.34',5), - -(10339,'S700_2047',55,'86.90',15), - -(10339,'S700_2610',50,'62.16',9), - -(10339,'S700_4002',50,'66.63',8), - -(10339,'S72_1253',27,'49.66',6), - -(10340,'S24_2000',55,'62.46',8), - -(10340,'S24_4278',40,'63.76',1), - -(10340,'S32_1374',55,'95.89',2), - -(10340,'S32_4289',39,'67.41',3), - -(10340,'S50_1341',40,'37.09',4), - -(10340,'S700_1691',30,'73.99',5), - -(10340,'S700_2466',55,'81.77',7), - -(10340,'S700_2834',29,'98.48',6), - -(10341,'S10_1678',41,'84.22',9), - -(10341,'S12_1099',45,'192.62',2), - -(10341,'S12_2823',55,'120.50',8), - -(10341,'S12_3380',44,'111.57',1), - -(10341,'S12_3990',36,'77.41',10), - -(10341,'S12_4675',55,'109.40',7), - -(10341,'S24_2360',32,'63.03',6), - -(10341,'S32_4485',31,'95.93',4), - -(10341,'S50_4713',38,'78.11',3), - -(10341,'S700_3167',34,'70.40',5), - -(10342,'S18_1129',40,'118.89',2), - -(10342,'S18_1889',55,'63.14',1), - -(10342,'S18_1984',22,'115.22',3), - -(10342,'S18_3232',30,'167.65',4), - -(10342,'S18_3278',25,'76.39',5), - -(10342,'S18_3482',55,'136.70',7), - -(10342,'S18_3782',26,'57.82',8), - -(10342,'S18_4721',38,'124.99',11), - -(10342,'S24_2972',39,'30.59',9), - -(10342,'S24_3371',48,'60.01',10), - -(10342,'S24_3856',42,'112.34',6), - -(10343,'S18_1589',36,'109.51',4), - -(10343,'S18_2870',25,'118.80',3), - -(10343,'S18_3685',44,'127.15',2), - -(10343,'S24_1628',27,'44.78',6), - -(10343,'S24_4620',30,'76.80',1), - -(10343,'S32_2206',29,'37.41',5), - -(10344,'S18_1749',45,'168.30',1), - -(10344,'S18_2248',40,'49.04',2), - -(10344,'S18_2325',30,'118.23',3), - -(10344,'S18_4409',21,'80.99',4), - -(10344,'S18_4933',26,'68.42',5), - -(10344,'S24_1046',29,'61.00',7), - -(10344,'S24_1937',20,'27.88',6), - -(10345,'S24_2022',43,'38.98',1), - -(10346,'S18_1342',42,'88.36',3), - -(10346,'S24_2766',25,'87.24',1), - -(10346,'S24_2887',24,'117.44',5), - -(10346,'S24_3191',24,'80.47',2), - -(10346,'S24_3432',26,'103.87',6), - -(10346,'S24_3969',22,'38.57',4), - -(10347,'S10_1949',30,'188.58',1), - -(10347,'S10_4962',27,'132.97',2), - -(10347,'S12_1666',29,'132.57',3), - -(10347,'S18_1097',42,'113.17',5), - -(10347,'S18_1367',21,'46.36',7), - -(10347,'S18_2432',50,'51.05',8), - -(10347,'S18_2795',21,'136.69',6), - -(10347,'S18_2949',48,'84.09',9), - -(10347,'S18_2957',34,'60.59',10), - -(10347,'S18_3136',45,'95.30',11), - -(10347,'S18_3320',26,'84.33',12), - -(10347,'S18_4600',45,'115.03',4), - -(10348,'S12_1108',48,'207.80',8), - -(10348,'S12_3148',47,'122.37',4), - -(10348,'S18_4668',29,'43.77',6), - -(10348,'S24_2300',37,'107.34',1), - -(10348,'S24_4258',39,'82.78',2), - -(10348,'S32_1268',42,'90.53',3), - -(10348,'S32_3522',31,'62.70',5), - 
-(10348,'S700_2824',32,'100.14',7), - -(10349,'S12_3891',26,'166.10',10), - -(10349,'S12_4473',48,'114.95',9), - -(10349,'S18_2238',38,'142.45',8), - -(10349,'S18_2319',38,'117.82',7), - -(10349,'S18_3232',48,'164.26',6), - -(10349,'S18_4027',34,'140.75',5), - -(10349,'S24_1444',48,'50.29',4), - -(10349,'S24_2840',36,'31.47',3), - -(10349,'S24_4048',23,'111.18',2), - -(10349,'S32_2509',33,'44.37',1), - -(10350,'S10_4757',26,'110.16',5), - -(10350,'S18_3029',43,'84.30',6), - -(10350,'S18_3140',44,'135.22',1), - -(10350,'S18_3259',41,'94.79',2), - -(10350,'S18_4522',30,'70.22',3), - -(10350,'S24_2011',34,'98.31',7), - -(10350,'S24_3151',30,'86.74',9), - -(10350,'S24_3816',25,'77.15',10), - -(10350,'S32_3207',27,'61.52',14), - -(10350,'S50_1392',31,'104.18',8), - -(10350,'S50_1514',44,'56.82',17), - -(10350,'S700_1138',46,'56.00',11), - -(10350,'S700_1938',28,'76.22',4), - -(10350,'S700_2610',29,'68.67',12), - -(10350,'S700_3505',31,'87.15',13), - -(10350,'S700_3962',25,'97.32',16), - -(10350,'S72_3212',20,'48.05',15), - -(10351,'S18_1662',39,'143.50',1), - -(10351,'S18_3856',20,'104.81',2), - -(10351,'S24_2841',25,'64.40',5), - -(10351,'S24_3420',38,'53.92',4), - -(10351,'S24_3949',34,'68.24',3), - -(10352,'S700_2047',23,'75.13',3), - -(10352,'S700_2466',49,'87.75',2), - -(10352,'S700_4002',22,'62.19',1), - -(10352,'S72_1253',49,'46.18',4), - -(10353,'S18_2581',27,'71.81',1), - -(10353,'S24_1785',28,'107.23',2), - -(10353,'S24_4278',35,'69.55',3), - -(10353,'S32_1374',46,'86.90',5), - -(10353,'S32_4289',40,'68.10',7), - -(10353,'S50_1341',40,'35.78',8), - -(10353,'S700_1691',39,'73.07',9), - -(10353,'S700_2834',48,'98.48',4), - -(10353,'S700_3167',43,'74.40',6), - -(10354,'S10_1678',42,'84.22',6), - -(10354,'S10_2016',20,'95.15',2), - -(10354,'S10_4698',42,'178.17',3), - -(10354,'S12_1099',31,'157.60',9), - -(10354,'S12_2823',35,'141.58',4), - -(10354,'S12_3380',29,'98.65',11), - -(10354,'S12_3990',23,'76.61',12), - -(10354,'S12_4675',28,'100.19',13), - -(10354,'S18_1889',21,'76.23',8), - -(10354,'S18_2625',28,'49.06',10), - -(10354,'S18_3278',36,'69.15',7), - -(10354,'S24_1578',21,'96.92',5), - -(10354,'S24_2000',28,'62.46',1), - -(10355,'S18_3482',23,'117.59',7), - -(10355,'S18_3782',31,'60.30',1), - -(10355,'S18_4721',25,'124.99',2), - -(10355,'S24_2360',41,'56.10',3), - -(10355,'S24_2972',36,'37.38',4), - -(10355,'S24_3371',44,'60.62',6), - -(10355,'S24_3856',32,'137.62',8), - -(10355,'S24_4620',28,'75.18',9), - -(10355,'S32_2206',38,'32.99',10), - -(10355,'S32_4485',40,'93.89',5), - -(10356,'S18_1129',43,'120.31',8), - -(10356,'S18_1342',50,'82.19',9), - -(10356,'S18_1367',22,'44.75',6), - -(10356,'S18_1984',27,'130.87',2), - -(10356,'S18_2325',29,'106.79',3), - -(10356,'S18_2795',30,'158.63',1), - -(10356,'S24_1937',48,'31.86',5), - -(10356,'S24_2022',26,'42.11',7), - -(10356,'S50_4713',26,'78.11',4), - -(10357,'S10_1949',32,'199.30',10), - -(10357,'S10_4962',43,'135.92',9), - -(10357,'S12_1666',49,'109.34',8), - -(10357,'S18_1097',39,'112.00',1), - -(10357,'S18_2432',41,'58.95',7), - -(10357,'S18_2949',41,'91.18',6), - -(10357,'S18_2957',49,'59.34',5), - -(10357,'S18_3136',44,'104.72',4), - -(10357,'S18_3320',25,'84.33',3), - -(10357,'S18_4600',28,'105.34',2), - -(10358,'S12_3148',49,'129.93',5), - -(10358,'S12_4473',42,'98.36',9), - -(10358,'S18_2238',20,'142.45',10), - -(10358,'S18_2319',20,'99.41',11), - -(10358,'S18_3232',32,'137.17',12), - -(10358,'S18_4027',25,'117.77',13), - -(10358,'S18_4668',30,'46.29',8), - -(10358,'S24_1444',44,'56.07',14), - 
-(10358,'S24_2300',41,'127.79',7), - -(10358,'S24_2840',36,'33.59',4), - -(10358,'S24_4258',41,'88.62',6), - -(10358,'S32_1268',41,'82.83',1), - -(10358,'S32_3522',36,'51.71',2), - -(10358,'S700_2824',27,'85.98',3), - -(10359,'S10_4757',48,'122.40',6), - -(10359,'S12_1108',42,'180.79',8), - -(10359,'S12_3891',49,'162.64',5), - -(10359,'S24_4048',22,'108.82',7), - -(10359,'S32_2509',36,'45.45',3), - -(10359,'S32_3207',22,'62.14',1), - -(10359,'S50_1392',46,'99.55',2), - -(10359,'S50_1514',25,'47.45',4), - -(10360,'S18_1662',50,'126.15',12), - -(10360,'S18_2581',41,'68.43',13), - -(10360,'S18_3029',46,'71.40',14), - -(10360,'S18_3140',29,'122.93',8), - -(10360,'S18_3259',29,'94.79',18), - -(10360,'S18_3856',40,'101.64',15), - -(10360,'S18_4522',40,'76.36',1), - -(10360,'S24_1785',22,'106.14',17), - -(10360,'S24_2011',31,'100.77',2), - -(10360,'S24_2841',49,'55.49',16), - -(10360,'S24_3151',36,'70.81',3), - -(10360,'S24_3816',22,'78.83',4), - -(10360,'S700_1138',32,'64.67',5), - -(10360,'S700_1938',26,'86.61',6), - -(10360,'S700_2610',30,'70.11',7), - -(10360,'S700_3505',35,'83.14',9), - -(10360,'S700_3962',31,'92.36',10), - -(10360,'S72_3212',31,'54.05',11), - -(10361,'S10_1678',20,'92.83',13), - -(10361,'S10_2016',26,'114.18',8), - -(10361,'S24_3420',34,'62.46',6), - -(10361,'S24_3949',26,'61.42',7), - -(10361,'S24_4278',25,'68.83',1), - -(10361,'S32_4289',49,'56.41',2), - -(10361,'S50_1341',33,'35.78',3), - -(10361,'S700_1691',20,'88.60',4), - -(10361,'S700_2047',24,'85.99',14), - -(10361,'S700_2466',26,'91.74',9), - -(10361,'S700_2834',44,'107.97',5), - -(10361,'S700_3167',44,'76.80',10), - -(10361,'S700_4002',35,'62.19',11), - -(10361,'S72_1253',23,'47.67',12), - -(10362,'S10_4698',22,'182.04',4), - -(10362,'S12_2823',22,'131.04',1), - -(10362,'S18_2625',23,'53.91',3), - -(10362,'S24_1578',50,'91.29',2), - -(10363,'S12_1099',33,'180.95',3), - -(10363,'S12_3380',34,'106.87',4), - -(10363,'S12_3990',34,'68.63',5), - -(10363,'S12_4675',46,'103.64',6), - -(10363,'S18_1889',22,'61.60',7), - -(10363,'S18_3278',46,'69.15',10), - -(10363,'S18_3482',24,'124.94',11), - -(10363,'S18_3782',32,'52.22',12), - -(10363,'S18_4721',28,'123.50',13), - -(10363,'S24_2000',21,'70.08',8), - -(10363,'S24_2360',43,'56.10',14), - -(10363,'S24_3371',21,'52.05',15), - -(10363,'S24_3856',31,'113.75',1), - -(10363,'S24_4620',43,'75.99',9), - -(10363,'S32_1374',50,'92.90',2), - -(10364,'S32_2206',48,'38.22',1), - -(10365,'S18_1129',30,'116.06',1), - -(10365,'S32_4485',22,'82.66',3), - -(10365,'S50_4713',44,'68.34',2), - -(10366,'S18_1984',34,'116.65',3), - -(10366,'S18_2870',49,'105.60',2), - -(10366,'S18_3232',34,'154.10',1), - -(10367,'S18_1589',49,'105.77',1), - -(10367,'S18_1749',37,'144.50',3), - -(10367,'S18_2248',45,'50.25',4), - -(10367,'S18_2325',27,'124.59',5), - -(10367,'S18_2795',32,'140.06',7), - -(10367,'S18_3685',46,'131.39',6), - -(10367,'S18_4409',43,'77.31',8), - -(10367,'S18_4933',44,'66.99',9), - -(10367,'S24_1046',21,'72.76',10), - -(10367,'S24_1628',38,'50.31',11), - -(10367,'S24_1937',23,'29.54',13), - -(10367,'S24_2022',28,'43.01',12), - -(10367,'S24_2972',36,'36.25',2), - -(10368,'S24_2766',40,'73.60',2), - -(10368,'S24_2887',31,'115.09',5), - -(10368,'S24_3191',46,'83.04',1), - -(10368,'S24_3432',20,'93.16',4), - -(10368,'S24_3969',46,'36.52',3), - -(10369,'S10_1949',41,'195.01',2), - -(10369,'S18_1342',44,'89.38',8), - -(10369,'S18_1367',32,'46.36',7), - -(10369,'S18_2949',42,'100.30',1), - -(10369,'S18_2957',28,'51.84',6), - -(10369,'S18_3136',21,'90.06',5), - 
-(10369,'S18_3320',45,'80.36',4), - -(10369,'S24_4258',40,'93.49',3), - -(10370,'S10_4962',35,'128.53',4), - -(10370,'S12_1666',49,'128.47',8), - -(10370,'S18_1097',27,'100.34',1), - -(10370,'S18_2319',22,'101.87',5), - -(10370,'S18_2432',22,'60.16',7), - -(10370,'S18_3232',27,'167.65',9), - -(10370,'S18_4600',29,'105.34',6), - -(10370,'S18_4668',20,'41.76',2), - -(10370,'S32_3522',25,'63.99',3), - -(10371,'S12_1108',32,'178.71',6), - -(10371,'S12_4473',49,'104.28',4), - -(10371,'S18_2238',25,'160.46',7), - -(10371,'S24_1444',25,'53.75',12), - -(10371,'S24_2300',20,'126.51',5), - -(10371,'S24_2840',45,'35.01',8), - -(10371,'S24_4048',28,'95.81',9), - -(10371,'S32_1268',26,'82.83',1), - -(10371,'S32_2509',20,'44.37',2), - -(10371,'S32_3207',30,'53.44',11), - -(10371,'S50_1392',48,'97.23',10), - -(10371,'S700_2824',34,'83.95',3), - -(10372,'S12_3148',40,'146.55',4), - -(10372,'S12_3891',34,'140.15',1), - -(10372,'S18_3140',28,'131.13',3), - -(10372,'S18_3259',25,'91.76',5), - -(10372,'S18_4027',48,'119.20',6), - -(10372,'S18_4522',41,'78.99',7), - -(10372,'S24_2011',37,'102.00',8), - -(10372,'S50_1514',24,'56.82',9), - -(10372,'S700_1938',44,'74.48',2), - -(10373,'S10_4757',39,'118.32',3), - -(10373,'S18_1662',28,'143.50',4), - -(10373,'S18_3029',22,'75.70',5), - -(10373,'S18_3856',50,'99.52',6), - -(10373,'S24_2841',38,'58.92',7), - -(10373,'S24_3151',33,'82.31',12), - -(10373,'S24_3420',46,'53.92',11), - -(10373,'S24_3816',23,'83.86',10), - -(10373,'S24_3949',39,'62.10',13), - -(10373,'S700_1138',44,'58.00',14), - -(10373,'S700_2047',32,'76.94',15), - -(10373,'S700_2610',41,'69.39',16), - -(10373,'S700_3505',34,'94.16',2), - -(10373,'S700_3962',37,'83.42',8), - -(10373,'S700_4002',45,'68.11',17), - -(10373,'S72_1253',25,'44.20',9), - -(10373,'S72_3212',29,'48.05',1), - -(10374,'S10_2016',39,'115.37',5), - -(10374,'S10_4698',22,'158.80',1), - -(10374,'S18_2581',42,'75.19',2), - -(10374,'S18_2625',22,'48.46',4), - -(10374,'S24_1578',38,'112.70',6), - -(10374,'S24_1785',46,'107.23',3), - -(10375,'S10_1678',21,'76.56',12), - -(10375,'S12_1099',45,'184.84',7), - -(10375,'S12_2823',49,'150.62',13), - -(10375,'S24_2000',23,'67.03',9), - -(10375,'S24_2360',20,'60.26',14), - -(10375,'S24_4278',43,'60.13',2), - -(10375,'S32_1374',37,'87.90',3), - -(10375,'S32_4289',44,'59.85',4), - -(10375,'S32_4485',41,'96.95',15), - -(10375,'S50_1341',49,'36.22',5), - -(10375,'S50_4713',49,'69.16',8), - -(10375,'S700_1691',37,'86.77',6), - -(10375,'S700_2466',33,'94.73',1), - -(10375,'S700_2834',25,'98.48',10), - -(10375,'S700_3167',44,'69.60',11), - -(10376,'S12_3380',35,'98.65',1), - -(10377,'S12_3990',24,'65.44',5), - -(10377,'S12_4675',50,'112.86',1), - -(10377,'S18_1129',35,'124.56',2), - -(10377,'S18_1889',31,'61.60',4), - -(10377,'S18_1984',36,'125.18',6), - -(10377,'S18_3232',39,'143.94',3), - -(10378,'S18_1589',34,'121.95',5), - -(10378,'S18_3278',22,'66.74',4), - -(10378,'S18_3482',43,'146.99',10), - -(10378,'S18_3782',28,'60.30',9), - -(10378,'S18_4721',49,'122.02',8), - -(10378,'S24_2972',41,'30.59',7), - -(10378,'S24_3371',46,'52.66',6), - -(10378,'S24_3856',33,'129.20',3), - -(10378,'S24_4620',41,'80.84',2), - -(10378,'S32_2206',40,'35.80',1), - -(10379,'S18_1749',39,'156.40',2), - -(10379,'S18_2248',27,'50.85',1), - -(10379,'S18_2870',29,'113.52',5), - -(10379,'S18_3685',32,'134.22',4), - -(10379,'S24_1628',32,'48.80',3), - -(10380,'S18_1342',27,'88.36',13), - -(10380,'S18_2325',40,'119.50',10), - -(10380,'S18_2795',21,'156.94',8), - -(10380,'S18_4409',32,'78.23',1), - 
-(10380,'S18_4933',24,'66.99',2), - -(10380,'S24_1046',34,'66.88',3), - -(10380,'S24_1937',32,'29.87',4), - -(10380,'S24_2022',27,'37.63',5), - -(10380,'S24_2766',36,'77.24',6), - -(10380,'S24_2887',44,'111.57',7), - -(10380,'S24_3191',44,'77.05',9), - -(10380,'S24_3432',34,'91.02',11), - -(10380,'S24_3969',43,'32.82',12), - -(10381,'S10_1949',36,'182.16',3), - -(10381,'S10_4962',37,'138.88',6), - -(10381,'S12_1666',20,'132.57',1), - -(10381,'S18_1097',48,'114.34',2), - -(10381,'S18_1367',25,'49.60',9), - -(10381,'S18_2432',35,'60.77',7), - -(10381,'S18_2949',41,'100.30',8), - -(10381,'S18_2957',40,'51.22',4), - -(10381,'S18_3136',35,'93.20',5), - -(10382,'S12_1108',34,'166.24',10), - -(10382,'S12_3148',37,'145.04',11), - -(10382,'S12_3891',34,'143.61',12), - -(10382,'S12_4473',32,'103.10',13), - -(10382,'S18_2238',25,'160.46',5), - -(10382,'S18_3320',50,'84.33',7), - -(10382,'S18_4600',39,'115.03',1), - -(10382,'S18_4668',39,'46.29',2), - -(10382,'S24_2300',20,'120.12',3), - -(10382,'S24_4258',33,'97.39',4), - -(10382,'S32_1268',26,'85.72',6), - -(10382,'S32_3522',48,'57.53',8), - -(10382,'S700_2824',34,'101.15',9), - -(10383,'S18_2319',27,'119.05',11), - -(10383,'S18_3140',24,'125.66',9), - -(10383,'S18_3232',47,'155.79',6), - -(10383,'S18_3259',26,'83.70',12), - -(10383,'S18_4027',38,'137.88',1), - -(10383,'S18_4522',28,'77.24',7), - -(10383,'S24_1444',22,'52.60',2), - -(10383,'S24_2840',40,'33.24',3), - -(10383,'S24_4048',21,'117.10',4), - -(10383,'S32_2509',32,'53.57',5), - -(10383,'S32_3207',44,'55.93',8), - -(10383,'S50_1392',29,'94.92',13), - -(10383,'S50_1514',38,'48.62',10), - -(10384,'S10_4757',34,'129.20',4), - -(10384,'S24_2011',28,'114.29',3), - -(10384,'S24_3151',43,'71.69',2), - -(10384,'S700_1938',49,'71.02',1), - -(10385,'S24_3816',37,'78.83',2), - -(10385,'S700_1138',25,'62.00',1), - -(10386,'S18_1662',25,'130.88',7), - -(10386,'S18_2581',21,'72.65',18), - -(10386,'S18_3029',37,'73.12',5), - -(10386,'S18_3856',22,'100.58',6), - -(10386,'S24_1785',33,'101.76',11), - -(10386,'S24_2841',39,'56.86',1), - -(10386,'S24_3420',35,'54.57',9), - -(10386,'S24_3949',41,'55.96',12), - -(10386,'S24_4278',50,'71.73',8), - -(10386,'S700_2047',29,'85.09',13), - -(10386,'S700_2466',37,'90.75',14), - -(10386,'S700_2610',37,'67.22',10), - -(10386,'S700_3167',32,'68.00',17), - -(10386,'S700_3505',45,'83.14',2), - -(10386,'S700_3962',30,'80.44',3), - -(10386,'S700_4002',44,'59.22',15), - -(10386,'S72_1253',50,'47.67',16), - -(10386,'S72_3212',43,'52.42',4), - -(10387,'S32_1374',44,'79.91',1), - -(10388,'S10_1678',42,'80.39',4), - -(10388,'S10_2016',50,'118.94',5), - -(10388,'S10_4698',21,'156.86',7), - -(10388,'S12_2823',44,'125.01',6), - -(10388,'S32_4289',35,'58.47',8), - -(10388,'S50_1341',27,'41.02',1), - -(10388,'S700_1691',46,'74.90',2), - -(10388,'S700_2834',50,'111.53',3), - -(10389,'S12_1099',26,'182.90',4), - -(10389,'S12_3380',25,'95.13',6), - -(10389,'S12_3990',36,'76.61',7), - -(10389,'S12_4675',47,'102.49',8), - -(10389,'S18_1889',49,'63.91',3), - -(10389,'S18_2625',39,'52.09',5), - -(10389,'S24_1578',45,'112.70',1), - -(10389,'S24_2000',49,'61.70',2), - -(10390,'S18_1129',36,'117.48',14), - -(10390,'S18_1984',34,'132.29',15), - -(10390,'S18_2325',31,'102.98',16), - -(10390,'S18_2795',26,'162.00',7), - -(10390,'S18_3278',40,'75.59',9), - -(10390,'S18_3482',50,'135.23',1), - -(10390,'S18_3782',36,'54.09',2), - -(10390,'S18_4721',49,'122.02',3), - -(10390,'S24_2360',35,'67.87',4), - -(10390,'S24_2972',37,'35.87',5), - -(10390,'S24_3371',46,'51.43',6), - 
-(10390,'S24_3856',45,'134.81',8), - -(10390,'S24_4620',30,'66.29',10), - -(10390,'S32_2206',41,'39.02',11), - -(10390,'S32_4485',45,'101.03',12), - -(10390,'S50_4713',22,'81.36',13), - -(10391,'S10_1949',24,'195.01',4), - -(10391,'S10_4962',37,'121.15',7), - -(10391,'S12_1666',39,'110.70',9), - -(10391,'S18_1097',29,'114.34',10), - -(10391,'S18_1342',35,'102.74',2), - -(10391,'S18_1367',42,'47.44',3), - -(10391,'S18_2432',44,'57.73',5), - -(10391,'S18_2949',32,'99.28',6), - -(10391,'S24_1937',33,'26.55',8), - -(10391,'S24_2022',24,'36.29',1), - -(10392,'S18_2957',37,'61.21',3), - -(10392,'S18_3136',29,'103.67',2), - -(10392,'S18_3320',36,'98.22',1), - -(10393,'S12_3148',35,'145.04',8), - -(10393,'S12_4473',32,'99.54',10), - -(10393,'S18_2238',20,'137.53',11), - -(10393,'S18_2319',38,'104.32',7), - -(10393,'S18_4600',30,'106.55',9), - -(10393,'S18_4668',44,'41.76',1), - -(10393,'S24_2300',33,'112.46',2), - -(10393,'S24_4258',33,'88.62',3), - -(10393,'S32_1268',38,'84.75',4), - -(10393,'S32_3522',31,'63.35',5), - -(10393,'S700_2824',21,'83.95',6), - -(10394,'S18_3232',22,'135.47',5), - -(10394,'S18_4027',37,'124.95',1), - -(10394,'S24_1444',31,'53.18',2), - -(10394,'S24_2840',46,'35.36',6), - -(10394,'S24_4048',37,'104.09',7), - -(10394,'S32_2509',36,'47.08',3), - -(10394,'S32_3207',30,'55.93',4), - -(10395,'S10_4757',32,'125.12',2), - -(10395,'S12_1108',33,'205.72',1), - -(10395,'S50_1392',46,'98.39',4), - -(10395,'S50_1514',45,'57.99',3), - -(10396,'S12_3891',33,'155.72',3), - -(10396,'S18_3140',33,'129.76',2), - -(10396,'S18_3259',24,'91.76',4), - -(10396,'S18_4522',45,'83.38',5), - -(10396,'S24_2011',49,'100.77',6), - -(10396,'S24_3151',27,'77.00',7), - -(10396,'S24_3816',37,'77.99',8), - -(10396,'S700_1138',39,'62.00',1), - -(10397,'S700_1938',32,'69.29',5), - -(10397,'S700_2610',22,'62.88',4), - -(10397,'S700_3505',48,'86.15',3), - -(10397,'S700_3962',36,'80.44',2), - -(10397,'S72_3212',34,'52.96',1), - -(10398,'S18_1662',33,'130.88',11), - -(10398,'S18_2581',34,'82.79',15), - -(10398,'S18_3029',28,'70.54',18), - -(10398,'S18_3856',45,'92.11',17), - -(10398,'S24_1785',43,'100.67',16), - -(10398,'S24_2841',28,'60.29',3), - -(10398,'S24_3420',34,'61.15',13), - -(10398,'S24_3949',41,'56.64',2), - -(10398,'S24_4278',45,'65.93',14), - -(10398,'S32_4289',22,'60.54',4), - -(10398,'S50_1341',49,'38.84',5), - -(10398,'S700_1691',47,'78.55',6), - -(10398,'S700_2047',36,'75.13',7), - -(10398,'S700_2466',22,'98.72',8), - -(10398,'S700_2834',23,'102.04',9), - -(10398,'S700_3167',29,'76.80',10), - -(10398,'S700_4002',36,'62.19',12), - -(10398,'S72_1253',34,'41.22',1), - -(10399,'S10_1678',40,'77.52',8), - -(10399,'S10_2016',51,'99.91',7), - -(10399,'S10_4698',22,'156.86',6), - -(10399,'S12_2823',29,'123.51',5), - -(10399,'S18_2625',30,'51.48',4), - -(10399,'S24_1578',57,'104.81',3), - -(10399,'S24_2000',58,'75.41',2), - -(10399,'S32_1374',32,'97.89',1), - -(10400,'S10_4757',64,'134.64',9), - -(10400,'S18_1662',34,'129.31',1), - -(10400,'S18_3029',30,'74.84',7), - -(10400,'S18_3856',58,'88.93',6), - -(10400,'S24_2841',24,'55.49',2), - -(10400,'S24_3420',38,'59.18',3), - -(10400,'S24_3816',42,'74.64',8), - -(10400,'S700_2047',46,'82.37',5), - -(10400,'S72_1253',20,'41.71',4), - -(10401,'S18_2581',42,'75.19',3), - -(10401,'S24_1785',38,'87.54',5), - -(10401,'S24_3949',64,'59.37',12), - -(10401,'S24_4278',52,'65.93',4), - -(10401,'S32_1374',49,'81.91',1), - -(10401,'S32_4289',62,'62.60',6), - -(10401,'S50_1341',56,'41.46',7), - -(10401,'S700_1691',11,'77.64',8), - -(10401,'S700_2466',85,'98.72',10), - 
-(10401,'S700_2834',21,'96.11',2), - -(10401,'S700_3167',77,'73.60',9), - -(10401,'S700_4002',40,'66.63',11), - -(10402,'S10_2016',45,'118.94',1), - -(10402,'S18_2625',55,'58.15',2), - -(10402,'S24_2000',59,'61.70',3), - -(10403,'S10_1678',24,'85.17',7), - -(10403,'S10_4698',66,'174.29',9), - -(10403,'S12_2823',66,'122.00',6), - -(10403,'S18_3782',36,'55.33',1), - -(10403,'S24_1578',46,'109.32',8), - -(10403,'S24_2360',27,'57.49',4), - -(10403,'S32_2206',30,'35.80',2), - -(10403,'S32_4485',45,'88.78',5), - -(10403,'S50_4713',31,'65.09',3), - -(10404,'S12_1099',64,'163.44',3), - -(10404,'S12_3380',43,'102.17',1), - -(10404,'S12_3990',77,'67.03',4), - -(10404,'S18_3278',90,'67.54',6), - -(10404,'S18_3482',28,'127.88',5), - -(10404,'S18_4721',48,'124.99',8), - -(10404,'S24_3371',49,'53.27',2), - -(10404,'S24_4620',48,'65.48',7), - -(10405,'S12_4675',97,'115.16',5), - -(10405,'S18_1889',61,'72.38',4), - -(10405,'S18_3232',55,'147.33',1), - -(10405,'S24_2972',47,'37.38',2), - -(10405,'S24_3856',76,'127.79',3), - -(10406,'S18_1129',61,'124.56',3), - -(10406,'S18_1984',48,'133.72',2), - -(10406,'S18_3685',65,'117.26',1), - -(10407,'S18_1589',59,'114.48',11), - -(10407,'S18_1749',76,'141.10',2), - -(10407,'S18_2248',42,'58.12',1), - -(10407,'S18_2870',41,'132.00',12), - -(10407,'S18_4409',6,'91.11',3), - -(10407,'S18_4933',66,'64.14',4), - -(10407,'S24_1046',26,'68.35',8), - -(10407,'S24_1628',64,'45.78',10), - -(10407,'S24_2766',76,'81.78',6), - -(10407,'S24_2887',59,'98.65',5), - -(10407,'S24_3191',13,'77.05',7), - -(10407,'S24_3432',43,'101.73',9), - -(10408,'S24_3969',15,'41.03',1), - -(10409,'S18_2325',6,'104.25',2), - -(10409,'S24_1937',61,'27.88',1), - -(10410,'S18_1342',65,'99.66',7), - -(10410,'S18_1367',44,'51.21',6), - -(10410,'S18_2795',56,'145.13',8), - -(10410,'S18_2949',47,'93.21',1), - -(10410,'S18_2957',53,'49.97',3), - -(10410,'S18_3136',34,'84.82',2), - -(10410,'S18_3320',44,'81.35',5), - -(10410,'S24_2022',31,'42.56',9), - -(10410,'S24_4258',50,'95.44',4), - -(10411,'S10_1949',23,'205.73',9), - -(10411,'S10_4962',27,'144.79',2), - -(10411,'S12_1666',40,'110.70',6), - -(10411,'S18_1097',27,'109.67',8), - -(10411,'S18_4600',46,'106.55',3), - -(10411,'S18_4668',35,'41.25',7), - -(10411,'S32_1268',26,'78.01',1), - -(10411,'S32_3522',27,'60.76',5), - -(10411,'S700_2824',34,'89.01',4), - -(10412,'S12_4473',54,'100.73',5), - -(10412,'S18_2238',41,'150.63',4), - -(10412,'S18_2319',56,'120.28',8), - -(10412,'S18_2432',47,'49.83',11), - -(10412,'S18_3232',60,'157.49',9), - -(10412,'S24_1444',21,'47.40',2), - -(10412,'S24_2300',70,'109.90',10), - -(10412,'S24_2840',30,'32.88',6), - -(10412,'S24_4048',31,'108.82',1), - -(10412,'S32_2509',19,'50.86',7), - -(10412,'S50_1392',26,'105.33',3), - -(10413,'S12_1108',36,'201.57',2), - -(10413,'S12_3148',47,'145.04',3), - -(10413,'S12_3891',22,'173.02',1), - -(10413,'S18_4027',49,'133.57',5), - -(10413,'S32_3207',24,'56.55',6), - -(10413,'S50_1514',51,'53.31',4), - -(10414,'S10_4757',49,'114.24',3), - -(10414,'S18_3029',44,'77.42',1), - -(10414,'S18_3140',41,'128.39',12), - -(10414,'S18_3259',48,'85.71',14), - -(10414,'S18_4522',56,'83.38',11), - -(10414,'S24_2011',43,'108.14',10), - -(10414,'S24_3151',60,'72.58',5), - -(10414,'S24_3816',51,'72.96',2), - -(10414,'S700_1138',37,'62.00',6), - -(10414,'S700_1938',34,'74.48',13), - -(10414,'S700_2610',31,'61.44',4), - -(10414,'S700_3505',28,'84.14',7), - -(10414,'S700_3962',40,'84.41',8), - -(10414,'S72_3212',47,'54.60',9), - -(10415,'S18_3856',51,'86.81',5), - -(10415,'S24_2841',21,'60.97',1), - 
-(10415,'S24_3420',18,'59.83',2), - -(10415,'S700_2047',32,'73.32',4), - -(10415,'S72_1253',42,'43.20',3), - -(10416,'S18_1662',24,'129.31',14), - -(10416,'S18_2581',15,'70.96',4), - -(10416,'S24_1785',47,'90.82',6), - -(10416,'S24_2000',32,'62.46',1), - -(10416,'S24_3949',18,'64.83',13), - -(10416,'S24_4278',48,'70.28',5), - -(10416,'S32_1374',45,'86.90',2), - -(10416,'S32_4289',26,'68.10',7), - -(10416,'S50_1341',37,'39.71',8), - -(10416,'S700_1691',23,'88.60',9), - -(10416,'S700_2466',22,'84.76',11), - -(10416,'S700_2834',41,'98.48',3), - -(10416,'S700_3167',39,'65.60',10), - -(10416,'S700_4002',43,'63.67',12), - -(10417,'S10_1678',66,'79.43',2), - -(10417,'S10_2016',45,'116.56',5), - -(10417,'S10_4698',56,'162.67',4), - -(10417,'S12_2823',21,'144.60',1), - -(10417,'S18_2625',36,'58.75',6), - -(10417,'S24_1578',35,'109.32',3), - -(10418,'S18_3278',16,'70.76',2), - -(10418,'S18_3482',27,'139.64',1), - -(10418,'S18_3782',33,'56.57',5), - -(10418,'S18_4721',28,'120.53',4), - -(10418,'S24_2360',52,'64.41',8), - -(10418,'S24_4620',10,'66.29',3), - -(10418,'S32_2206',43,'36.61',6), - -(10418,'S32_4485',50,'100.01',9), - -(10418,'S50_4713',40,'72.41',7), - -(10419,'S12_1099',12,'182.90',13), - -(10419,'S12_3380',10,'111.57',11), - -(10419,'S12_3990',34,'64.64',14), - -(10419,'S12_4675',32,'99.04',10), - -(10419,'S18_1129',38,'117.48',5), - -(10419,'S18_1589',37,'100.80',1), - -(10419,'S18_1889',39,'67.76',9), - -(10419,'S18_1984',34,'133.72',4), - -(10419,'S18_2870',55,'116.16',2), - -(10419,'S18_3232',35,'165.95',6), - -(10419,'S18_3685',43,'114.44',3), - -(10419,'S24_2972',15,'32.10',7), - -(10419,'S24_3371',55,'52.66',12), - -(10419,'S24_3856',70,'112.34',8), - -(10420,'S18_1749',37,'153.00',5), - -(10420,'S18_2248',36,'52.06',4), - -(10420,'S18_2325',45,'116.96',2), - -(10420,'S18_4409',66,'73.62',6), - -(10420,'S18_4933',36,'68.42',7), - -(10420,'S24_1046',60,'60.26',11), - -(10420,'S24_1628',37,'48.80',13), - -(10420,'S24_1937',45,'32.19',1), - -(10420,'S24_2766',39,'76.33',9), - -(10420,'S24_2887',55,'115.09',8), - -(10420,'S24_3191',35,'77.05',10), - -(10420,'S24_3432',26,'104.94',12), - -(10420,'S24_3969',15,'35.29',3), - -(10421,'S18_2795',35,'167.06',1), - -(10421,'S24_2022',40,'44.80',2), - -(10422,'S18_1342',51,'91.44',2), - -(10422,'S18_1367',25,'47.44',1), - -(10423,'S18_2949',10,'89.15',1), - -(10423,'S18_2957',31,'56.21',3), - -(10423,'S18_3136',21,'98.44',2), - -(10423,'S18_3320',21,'80.36',5), - -(10423,'S24_4258',28,'78.89',4), - -(10424,'S10_1949',50,'201.44',6), - -(10424,'S12_1666',49,'121.64',3), - -(10424,'S18_1097',54,'108.50',5), - -(10424,'S18_4668',26,'40.25',4), - -(10424,'S32_3522',44,'54.94',2), - -(10424,'S700_2824',46,'85.98',1), - -(10425,'S10_4962',38,'131.49',12), - -(10425,'S12_4473',33,'95.99',4), - -(10425,'S18_2238',28,'147.36',3), - -(10425,'S18_2319',38,'117.82',7), - -(10425,'S18_2432',19,'48.62',10), - -(10425,'S18_3232',28,'140.55',8), - -(10425,'S18_4600',38,'107.76',13), - -(10425,'S24_1444',55,'53.75',1), - -(10425,'S24_2300',49,'127.79',9), - -(10425,'S24_2840',31,'31.82',5), - -(10425,'S32_1268',41,'83.79',11), - -(10425,'S32_2509',11,'50.32',6), - -(10425,'S50_1392',18,'94.92',2); - -/*Table structure for table `orders` */ - -DROP TABLE IF EXISTS `orders`; - -CREATE TABLE `orders` ( - `orderNumber` int(11) NOT NULL, - `orderDate` date NOT NULL, - `requiredDate` date NOT NULL, - `shippedDate` date DEFAULT NULL, - `status` varchar(15) NOT NULL, - `comments` text, - `customerNumber` int(11) NOT NULL, - PRIMARY KEY (`orderNumber`), - KEY 
`customerNumber` (`customerNumber`), - CONSTRAINT `orders_ibfk_1` FOREIGN KEY (`customerNumber`) REFERENCES `customers` (`customerNumber`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `orders` */ - -insert into `orders`(`orderNumber`,`orderDate`,`requiredDate`,`shippedDate`,`status`,`comments`,`customerNumber`) values - -(10100,'2003-01-06','2003-01-13','2003-01-10','Shipped',NULL,363), - -(10101,'2003-01-09','2003-01-18','2003-01-11','Shipped','Check on availability.',128), - -(10102,'2003-01-10','2003-01-18','2003-01-14','Shipped',NULL,181), - -(10103,'2003-01-29','2003-02-07','2003-02-02','Shipped',NULL,121), - -(10104,'2003-01-31','2003-02-09','2003-02-01','Shipped',NULL,141), - -(10105,'2003-02-11','2003-02-21','2003-02-12','Shipped',NULL,145), - -(10106,'2003-02-17','2003-02-24','2003-02-21','Shipped',NULL,278), - -(10107,'2003-02-24','2003-03-03','2003-02-26','Shipped','Difficult to negotiate with customer. We need more marketing materials',131), - -(10108,'2003-03-03','2003-03-12','2003-03-08','Shipped',NULL,385), - -(10109,'2003-03-10','2003-03-19','2003-03-11','Shipped','Customer requested that FedEx Ground is used for this shipping',486), - -(10110,'2003-03-18','2003-03-24','2003-03-20','Shipped',NULL,187), - -(10111,'2003-03-25','2003-03-31','2003-03-30','Shipped',NULL,129), - -(10112,'2003-03-24','2003-04-03','2003-03-29','Shipped','Customer requested that ad materials (such as posters, pamphlets) be included in the shippment',144), - -(10113,'2003-03-26','2003-04-02','2003-03-27','Shipped',NULL,124), - -(10114,'2003-04-01','2003-04-07','2003-04-02','Shipped',NULL,172), - -(10115,'2003-04-04','2003-04-12','2003-04-07','Shipped',NULL,424), - -(10116,'2003-04-11','2003-04-19','2003-04-13','Shipped',NULL,381), - -(10117,'2003-04-16','2003-04-24','2003-04-17','Shipped',NULL,148), - -(10118,'2003-04-21','2003-04-29','2003-04-26','Shipped','Customer has worked with some of our vendors in the past and is aware of their MSRP',216), - -(10119,'2003-04-28','2003-05-05','2003-05-02','Shipped',NULL,382), - -(10120,'2003-04-29','2003-05-08','2003-05-01','Shipped',NULL,114), - -(10121,'2003-05-07','2003-05-13','2003-05-13','Shipped',NULL,353), - -(10122,'2003-05-08','2003-05-16','2003-05-13','Shipped',NULL,350), - -(10123,'2003-05-20','2003-05-29','2003-05-22','Shipped',NULL,103), - -(10124,'2003-05-21','2003-05-29','2003-05-25','Shipped','Customer very concerned about the exact color of the models. There is high risk that he may dispute the order because there is a slight color mismatch',112), - -(10125,'2003-05-21','2003-05-27','2003-05-24','Shipped',NULL,114), - -(10126,'2003-05-28','2003-06-07','2003-06-02','Shipped',NULL,458), - -(10127,'2003-06-03','2003-06-09','2003-06-06','Shipped','Customer requested special shippment. 
The instructions were passed along to the warehouse',151), - -(10128,'2003-06-06','2003-06-12','2003-06-11','Shipped',NULL,141), - -(10129,'2003-06-12','2003-06-18','2003-06-14','Shipped',NULL,324), - -(10130,'2003-06-16','2003-06-24','2003-06-21','Shipped',NULL,198), - -(10131,'2003-06-16','2003-06-25','2003-06-21','Shipped',NULL,447), - -(10132,'2003-06-25','2003-07-01','2003-06-28','Shipped',NULL,323), - -(10133,'2003-06-27','2003-07-04','2003-07-03','Shipped',NULL,141), - -(10134,'2003-07-01','2003-07-10','2003-07-05','Shipped',NULL,250), - -(10135,'2003-07-02','2003-07-12','2003-07-03','Shipped',NULL,124), - -(10136,'2003-07-04','2003-07-14','2003-07-06','Shipped','Customer is interested in buying more Ferrari models',242), - -(10137,'2003-07-10','2003-07-20','2003-07-14','Shipped',NULL,353), - -(10138,'2003-07-07','2003-07-16','2003-07-13','Shipped',NULL,496), - -(10139,'2003-07-16','2003-07-23','2003-07-21','Shipped',NULL,282), - -(10140,'2003-07-24','2003-08-02','2003-07-30','Shipped',NULL,161), - -(10141,'2003-08-01','2003-08-09','2003-08-04','Shipped',NULL,334), - -(10142,'2003-08-08','2003-08-16','2003-08-13','Shipped',NULL,124), - -(10143,'2003-08-10','2003-08-18','2003-08-12','Shipped','Can we deliver the new Ford Mustang models by end-of-quarter?',320), - -(10144,'2003-08-13','2003-08-21','2003-08-14','Shipped',NULL,381), - -(10145,'2003-08-25','2003-09-02','2003-08-31','Shipped',NULL,205), - -(10146,'2003-09-03','2003-09-13','2003-09-06','Shipped',NULL,447), - -(10147,'2003-09-05','2003-09-12','2003-09-09','Shipped',NULL,379), - -(10148,'2003-09-11','2003-09-21','2003-09-15','Shipped','They want to reevaluate their terms agreement with Finance.',276), - -(10149,'2003-09-12','2003-09-18','2003-09-17','Shipped',NULL,487), - -(10150,'2003-09-19','2003-09-27','2003-09-21','Shipped','They want to reevaluate their terms agreement with Finance.',148), - -(10151,'2003-09-21','2003-09-30','2003-09-24','Shipped',NULL,311), - -(10152,'2003-09-25','2003-10-03','2003-10-01','Shipped',NULL,333), - -(10153,'2003-09-28','2003-10-05','2003-10-03','Shipped',NULL,141), - -(10154,'2003-10-02','2003-10-12','2003-10-08','Shipped',NULL,219), - -(10155,'2003-10-06','2003-10-13','2003-10-07','Shipped',NULL,186), - -(10156,'2003-10-08','2003-10-17','2003-10-11','Shipped',NULL,141), - -(10157,'2003-10-09','2003-10-15','2003-10-14','Shipped',NULL,473), - -(10158,'2003-10-10','2003-10-18','2003-10-15','Shipped',NULL,121), - -(10159,'2003-10-10','2003-10-19','2003-10-16','Shipped',NULL,321), - -(10160,'2003-10-11','2003-10-17','2003-10-17','Shipped',NULL,347), - -(10161,'2003-10-17','2003-10-25','2003-10-20','Shipped',NULL,227), - -(10162,'2003-10-18','2003-10-26','2003-10-19','Shipped',NULL,321), - -(10163,'2003-10-20','2003-10-27','2003-10-24','Shipped',NULL,424), - -(10164,'2003-10-21','2003-10-30','2003-10-23','Resolved','This order was disputed, but resolved on 11/1/2003; Customer doesn\'t like the colors and precision of the models.',452), - -(10165,'2003-10-22','2003-10-31','2003-12-26','Shipped','This order was on hold because customers\'s credit limit had been exceeded. Order will ship when payment is received',148), - -(10166,'2003-10-21','2003-10-30','2003-10-27','Shipped',NULL,462), - -(10167,'2003-10-23','2003-10-30',NULL,'Cancelled','Customer called to cancel. The warehouse was notified in time and the order didn\'t ship. They have a new VP of Sales and are shifting their sales model. 
Our VP of Sales should contact them.',448), - -(10168,'2003-10-28','2003-11-03','2003-11-01','Shipped',NULL,161), - -(10169,'2003-11-04','2003-11-14','2003-11-09','Shipped',NULL,276), - -(10170,'2003-11-04','2003-11-12','2003-11-07','Shipped',NULL,452), - -(10171,'2003-11-05','2003-11-13','2003-11-07','Shipped',NULL,233), - -(10172,'2003-11-05','2003-11-14','2003-11-11','Shipped',NULL,175), - -(10173,'2003-11-05','2003-11-15','2003-11-09','Shipped','Cautious optimism. We have happy customers here, if we can keep them well stocked. I need all the information I can get on the planned shippments of Porches',278), - -(10174,'2003-11-06','2003-11-15','2003-11-10','Shipped',NULL,333), - -(10175,'2003-11-06','2003-11-14','2003-11-09','Shipped',NULL,324), - -(10176,'2003-11-06','2003-11-15','2003-11-12','Shipped',NULL,386), - -(10177,'2003-11-07','2003-11-17','2003-11-12','Shipped',NULL,344), - -(10178,'2003-11-08','2003-11-16','2003-11-10','Shipped','Custom shipping instructions sent to warehouse',242), - -(10179,'2003-11-11','2003-11-17','2003-11-13','Cancelled','Customer cancelled due to urgent budgeting issues. Must be cautious when dealing with them in the future. Since order shipped already we must discuss who would cover the shipping charges.',496), - -(10180,'2003-11-11','2003-11-19','2003-11-14','Shipped',NULL,171), - -(10181,'2003-11-12','2003-11-19','2003-11-15','Shipped',NULL,167), - -(10182,'2003-11-12','2003-11-21','2003-11-18','Shipped',NULL,124), - -(10183,'2003-11-13','2003-11-22','2003-11-15','Shipped','We need to keep in close contact with their Marketing VP. He is the decision maker for all their purchases.',339), - -(10184,'2003-11-14','2003-11-22','2003-11-20','Shipped',NULL,484), - -(10185,'2003-11-14','2003-11-21','2003-11-20','Shipped',NULL,320), - -(10186,'2003-11-14','2003-11-20','2003-11-18','Shipped','They want to reevaluate their terms agreement with the VP of Sales',489), - -(10187,'2003-11-15','2003-11-24','2003-11-16','Shipped',NULL,211), - -(10188,'2003-11-18','2003-11-26','2003-11-24','Shipped',NULL,167), - -(10189,'2003-11-18','2003-11-25','2003-11-24','Shipped','They want to reevaluate their terms agreement with Finance.',205), - -(10190,'2003-11-19','2003-11-29','2003-11-20','Shipped',NULL,141), - -(10191,'2003-11-20','2003-11-30','2003-11-24','Shipped','We must be cautions with this customer. Their VP of Sales resigned. 
Company may be heading down.',259), - -(10192,'2003-11-20','2003-11-29','2003-11-25','Shipped',NULL,363), - -(10193,'2003-11-21','2003-11-28','2003-11-27','Shipped',NULL,471), - -(10194,'2003-11-25','2003-12-02','2003-11-26','Shipped',NULL,146), - -(10195,'2003-11-25','2003-12-01','2003-11-28','Shipped',NULL,319), - -(10196,'2003-11-26','2003-12-03','2003-12-01','Shipped',NULL,455), - -(10197,'2003-11-26','2003-12-02','2003-12-01','Shipped','Customer inquired about remote controlled models and gold models.',216), - -(10198,'2003-11-27','2003-12-06','2003-12-03','Shipped',NULL,385), - -(10199,'2003-12-01','2003-12-10','2003-12-06','Shipped',NULL,475), - -(10200,'2003-12-01','2003-12-09','2003-12-06','Shipped',NULL,211), - -(10201,'2003-12-01','2003-12-11','2003-12-02','Shipped',NULL,129), - -(10202,'2003-12-02','2003-12-09','2003-12-06','Shipped',NULL,357), - -(10203,'2003-12-02','2003-12-11','2003-12-07','Shipped',NULL,141), - -(10204,'2003-12-02','2003-12-10','2003-12-04','Shipped',NULL,151), - -(10205,'2003-12-03','2003-12-09','2003-12-07','Shipped',' I need all the information I can get on our competitors.',141), - -(10206,'2003-12-05','2003-12-13','2003-12-08','Shipped','Can we renegotiate this one?',202), - -(10207,'2003-12-09','2003-12-17','2003-12-11','Shipped','Check on availability.',495), - -(10208,'2004-01-02','2004-01-11','2004-01-04','Shipped',NULL,146), - -(10209,'2004-01-09','2004-01-15','2004-01-12','Shipped',NULL,347), - -(10210,'2004-01-12','2004-01-22','2004-01-20','Shipped',NULL,177), - -(10211,'2004-01-15','2004-01-25','2004-01-18','Shipped',NULL,406), - -(10212,'2004-01-16','2004-01-24','2004-01-18','Shipped',NULL,141), - -(10213,'2004-01-22','2004-01-28','2004-01-27','Shipped','Difficult to negotiate with customer. We need more marketing materials',489), - -(10214,'2004-01-26','2004-02-04','2004-01-29','Shipped',NULL,458), - -(10215,'2004-01-29','2004-02-08','2004-02-01','Shipped','Customer requested that FedEx Ground is used for this shipping',475), - -(10216,'2004-02-02','2004-02-10','2004-02-04','Shipped',NULL,256), - -(10217,'2004-02-04','2004-02-14','2004-02-06','Shipped',NULL,166), - -(10218,'2004-02-09','2004-02-16','2004-02-11','Shipped','Customer requested that ad materials (such as posters, pamphlets) be included in the shippment',473), - -(10219,'2004-02-10','2004-02-17','2004-02-12','Shipped',NULL,487), - -(10220,'2004-02-12','2004-02-19','2004-02-16','Shipped',NULL,189), - -(10221,'2004-02-18','2004-02-26','2004-02-19','Shipped',NULL,314), - -(10222,'2004-02-19','2004-02-27','2004-02-20','Shipped',NULL,239), - -(10223,'2004-02-20','2004-02-29','2004-02-24','Shipped',NULL,114), - -(10224,'2004-02-21','2004-03-02','2004-02-26','Shipped','Customer has worked with some of our vendors in the past and is aware of their MSRP',171), - -(10225,'2004-02-22','2004-03-01','2004-02-24','Shipped',NULL,298), - -(10226,'2004-02-26','2004-03-06','2004-03-02','Shipped',NULL,239), - -(10227,'2004-03-02','2004-03-12','2004-03-08','Shipped',NULL,146), - -(10228,'2004-03-10','2004-03-18','2004-03-13','Shipped',NULL,173), - -(10229,'2004-03-11','2004-03-20','2004-03-12','Shipped',NULL,124), - -(10230,'2004-03-15','2004-03-24','2004-03-20','Shipped','Customer very concerned about the exact color of the models. 
There is high risk that he may dispute the order because there is a slight color mismatch',128), - -(10231,'2004-03-19','2004-03-26','2004-03-25','Shipped',NULL,344), - -(10232,'2004-03-20','2004-03-30','2004-03-25','Shipped',NULL,240), - -(10233,'2004-03-29','2004-04-04','2004-04-02','Shipped','Customer requested special shippment. The instructions were passed along to the warehouse',328), - -(10234,'2004-03-30','2004-04-05','2004-04-02','Shipped',NULL,412), - -(10235,'2004-04-02','2004-04-12','2004-04-06','Shipped',NULL,260), - -(10236,'2004-04-03','2004-04-11','2004-04-08','Shipped',NULL,486), - -(10237,'2004-04-05','2004-04-12','2004-04-10','Shipped',NULL,181), - -(10238,'2004-04-09','2004-04-16','2004-04-10','Shipped',NULL,145), - -(10239,'2004-04-12','2004-04-21','2004-04-17','Shipped',NULL,311), - -(10240,'2004-04-13','2004-04-20','2004-04-20','Shipped',NULL,177), - -(10241,'2004-04-13','2004-04-20','2004-04-19','Shipped',NULL,209), - -(10242,'2004-04-20','2004-04-28','2004-04-25','Shipped','Customer is interested in buying more Ferrari models',456), - -(10243,'2004-04-26','2004-05-03','2004-04-28','Shipped',NULL,495), - -(10244,'2004-04-29','2004-05-09','2004-05-04','Shipped',NULL,141), - -(10245,'2004-05-04','2004-05-12','2004-05-09','Shipped',NULL,455), - -(10246,'2004-05-05','2004-05-13','2004-05-06','Shipped',NULL,141), - -(10247,'2004-05-05','2004-05-11','2004-05-08','Shipped',NULL,334), - -(10248,'2004-05-07','2004-05-14',NULL,'Cancelled','Order was mistakenly placed. The warehouse noticed the lack of documentation.',131), - -(10249,'2004-05-08','2004-05-17','2004-05-11','Shipped','Can we deliver the new Ford Mustang models by end-of-quarter?',173), - -(10250,'2004-05-11','2004-05-19','2004-05-15','Shipped',NULL,450), - -(10251,'2004-05-18','2004-05-24','2004-05-24','Shipped',NULL,328), - -(10252,'2004-05-26','2004-06-04','2004-05-29','Shipped',NULL,406), - -(10253,'2004-06-01','2004-06-09','2004-06-02','Cancelled','Customer disputed the order and we agreed to cancel it. We must be more cautions with this customer going forward, since they are very hard to please. We must cover the shipping fees.',201), - -(10254,'2004-06-03','2004-06-13','2004-06-04','Shipped','Customer requested that DHL is used for this shipping',323), - -(10255,'2004-06-04','2004-06-12','2004-06-09','Shipped',NULL,209), - -(10256,'2004-06-08','2004-06-16','2004-06-10','Shipped',NULL,145), - -(10257,'2004-06-14','2004-06-24','2004-06-15','Shipped',NULL,450), - -(10258,'2004-06-15','2004-06-25','2004-06-23','Shipped',NULL,398), - -(10259,'2004-06-15','2004-06-22','2004-06-17','Shipped',NULL,166), - -(10260,'2004-06-16','2004-06-22',NULL,'Cancelled','Customer heard complaints from their customers and called to cancel this order. Will notify the Sales Manager.',357), - -(10261,'2004-06-17','2004-06-25','2004-06-22','Shipped',NULL,233), - -(10262,'2004-06-24','2004-07-01',NULL,'Cancelled','This customer found a better offer from one of our competitors. 
Will call back to renegotiate.',141), - -(10263,'2004-06-28','2004-07-04','2004-07-02','Shipped',NULL,175), - -(10264,'2004-06-30','2004-07-06','2004-07-01','Shipped','Customer will send a truck to our local warehouse on 7/1/2004',362), - -(10265,'2004-07-02','2004-07-09','2004-07-07','Shipped',NULL,471), - -(10266,'2004-07-06','2004-07-14','2004-07-10','Shipped',NULL,386), - -(10267,'2004-07-07','2004-07-17','2004-07-09','Shipped',NULL,151), - -(10268,'2004-07-12','2004-07-18','2004-07-14','Shipped',NULL,412), - -(10269,'2004-07-16','2004-07-22','2004-07-18','Shipped',NULL,382), - -(10270,'2004-07-19','2004-07-27','2004-07-24','Shipped','Can we renegotiate this one?',282), - -(10271,'2004-07-20','2004-07-29','2004-07-23','Shipped',NULL,124), - -(10272,'2004-07-20','2004-07-26','2004-07-22','Shipped',NULL,157), - -(10273,'2004-07-21','2004-07-28','2004-07-22','Shipped',NULL,314), - -(10274,'2004-07-21','2004-07-29','2004-07-22','Shipped',NULL,379), - -(10275,'2004-07-23','2004-08-02','2004-07-29','Shipped',NULL,119), - -(10276,'2004-08-02','2004-08-11','2004-08-08','Shipped',NULL,204), - -(10277,'2004-08-04','2004-08-12','2004-08-05','Shipped',NULL,148), - -(10278,'2004-08-06','2004-08-16','2004-08-09','Shipped',NULL,112), - -(10279,'2004-08-09','2004-08-19','2004-08-15','Shipped','Cautious optimism. We have happy customers here, if we can keep them well stocked. I need all the information I can get on the planned shippments of Porches',141), - -(10280,'2004-08-17','2004-08-27','2004-08-19','Shipped',NULL,249), - -(10281,'2004-08-19','2004-08-28','2004-08-23','Shipped',NULL,157), - -(10282,'2004-08-20','2004-08-26','2004-08-22','Shipped',NULL,124), - -(10283,'2004-08-20','2004-08-30','2004-08-23','Shipped',NULL,260), - -(10284,'2004-08-21','2004-08-29','2004-08-26','Shipped','Custom shipping instructions sent to warehouse',299), - -(10285,'2004-08-27','2004-09-04','2004-08-31','Shipped',NULL,286), - -(10286,'2004-08-28','2004-09-06','2004-09-01','Shipped',NULL,172), - -(10287,'2004-08-30','2004-09-06','2004-09-01','Shipped',NULL,298), - -(10288,'2004-09-01','2004-09-11','2004-09-05','Shipped',NULL,166), - -(10289,'2004-09-03','2004-09-13','2004-09-04','Shipped','We need to keep in close contact with their Marketing VP. He is the decision maker for all their purchases.',167), - -(10290,'2004-09-07','2004-09-15','2004-09-13','Shipped',NULL,198), - -(10291,'2004-09-08','2004-09-17','2004-09-14','Shipped',NULL,448), - -(10292,'2004-09-08','2004-09-18','2004-09-11','Shipped','They want to reevaluate their terms agreement with Finance.',131), - -(10293,'2004-09-09','2004-09-18','2004-09-14','Shipped',NULL,249), - -(10294,'2004-09-10','2004-09-17','2004-09-14','Shipped',NULL,204), - -(10295,'2004-09-10','2004-09-17','2004-09-14','Shipped','They want to reevaluate their terms agreement with Finance.',362), - -(10296,'2004-09-15','2004-09-22','2004-09-16','Shipped',NULL,415), - -(10297,'2004-09-16','2004-09-22','2004-09-21','Shipped','We must be cautions with this customer. Their VP of Sales resigned. 
Company may be heading down.',189), - -(10298,'2004-09-27','2004-10-05','2004-10-01','Shipped',NULL,103), - -(10299,'2004-09-30','2004-10-10','2004-10-01','Shipped',NULL,186), - -(10300,'2003-10-04','2003-10-13','2003-10-09','Shipped',NULL,128), - -(10301,'2003-10-05','2003-10-15','2003-10-08','Shipped',NULL,299), - -(10302,'2003-10-06','2003-10-16','2003-10-07','Shipped',NULL,201), - -(10303,'2004-10-06','2004-10-14','2004-10-09','Shipped','Customer inquired about remote controlled models and gold models.',484), - -(10304,'2004-10-11','2004-10-20','2004-10-17','Shipped',NULL,256), - -(10305,'2004-10-13','2004-10-22','2004-10-15','Shipped','Check on availability.',286), - -(10306,'2004-10-14','2004-10-21','2004-10-17','Shipped',NULL,187), - -(10307,'2004-10-14','2004-10-23','2004-10-20','Shipped',NULL,339), - -(10308,'2004-10-15','2004-10-24','2004-10-20','Shipped','Customer requested that FedEx Ground is used for this shipping',319), - -(10309,'2004-10-15','2004-10-24','2004-10-18','Shipped',NULL,121), - -(10310,'2004-10-16','2004-10-24','2004-10-18','Shipped',NULL,259), - -(10311,'2004-10-16','2004-10-23','2004-10-20','Shipped','Difficult to negotiate with customer. We need more marketing materials',141), - -(10312,'2004-10-21','2004-10-27','2004-10-23','Shipped',NULL,124), - -(10313,'2004-10-22','2004-10-28','2004-10-25','Shipped','Customer requested that FedEx Ground is used for this shipping',202), - -(10314,'2004-10-22','2004-11-01','2004-10-23','Shipped',NULL,227), - -(10315,'2004-10-29','2004-11-08','2004-10-30','Shipped',NULL,119), - -(10316,'2004-11-01','2004-11-09','2004-11-07','Shipped','Customer requested that ad materials (such as posters, pamphlets) be included in the shippment',240), - -(10317,'2004-11-02','2004-11-12','2004-11-08','Shipped',NULL,161), - -(10318,'2004-11-02','2004-11-09','2004-11-07','Shipped',NULL,157), - -(10319,'2004-11-03','2004-11-11','2004-11-06','Shipped','Customer requested that DHL is used for this shipping',456), - -(10320,'2004-11-03','2004-11-13','2004-11-07','Shipped',NULL,144), - -(10321,'2004-11-04','2004-11-12','2004-11-07','Shipped',NULL,462), - -(10322,'2004-11-04','2004-11-12','2004-11-10','Shipped','Customer has worked with some of our vendors in the past and is aware of their MSRP',363), - -(10323,'2004-11-05','2004-11-12','2004-11-09','Shipped',NULL,128), - -(10324,'2004-11-05','2004-11-11','2004-11-08','Shipped',NULL,181), - -(10325,'2004-11-05','2004-11-13','2004-11-08','Shipped',NULL,121), - -(10326,'2004-11-09','2004-11-16','2004-11-10','Shipped',NULL,144), - -(10327,'2004-11-10','2004-11-19','2004-11-13','Resolved','Order was disputed and resolved on 12/1/04. The Sales Manager was involved. Customer claims the scales of the models don\'t match what was discussed.',145), - -(10328,'2004-11-12','2004-11-21','2004-11-18','Shipped','Customer very concerned about the exact color of the models. There is high risk that he may dispute the order because there is a slight color mismatch',278), - -(10329,'2004-11-15','2004-11-24','2004-11-16','Shipped',NULL,131), - -(10330,'2004-11-16','2004-11-25','2004-11-21','Shipped',NULL,385), - -(10331,'2004-11-17','2004-11-23','2004-11-23','Shipped','Customer requested special shippment. 
The instructions were passed along to the warehouse',486), - -(10332,'2004-11-17','2004-11-25','2004-11-18','Shipped',NULL,187), - -(10333,'2004-11-18','2004-11-27','2004-11-20','Shipped',NULL,129), - -(10334,'2004-11-19','2004-11-28',NULL,'On Hold','The outstaniding balance for this customer exceeds their credit limit. Order will be shipped when a payment is received.',144), - -(10335,'2004-11-19','2004-11-29','2004-11-23','Shipped',NULL,124), - -(10336,'2004-11-20','2004-11-26','2004-11-24','Shipped','Customer requested that DHL is used for this shipping',172), - -(10337,'2004-11-21','2004-11-30','2004-11-26','Shipped',NULL,424), - -(10338,'2004-11-22','2004-12-02','2004-11-27','Shipped',NULL,381), - -(10339,'2004-11-23','2004-11-30','2004-11-30','Shipped',NULL,398), - -(10340,'2004-11-24','2004-12-01','2004-11-25','Shipped','Customer is interested in buying more Ferrari models',216), - -(10341,'2004-11-24','2004-12-01','2004-11-29','Shipped',NULL,382), - -(10342,'2004-11-24','2004-12-01','2004-11-29','Shipped',NULL,114), - -(10343,'2004-11-24','2004-12-01','2004-11-26','Shipped',NULL,353), - -(10344,'2004-11-25','2004-12-02','2004-11-29','Shipped',NULL,350), - -(10345,'2004-11-25','2004-12-01','2004-11-26','Shipped',NULL,103), - -(10346,'2004-11-29','2004-12-05','2004-11-30','Shipped',NULL,112), - -(10347,'2004-11-29','2004-12-07','2004-11-30','Shipped','Can we deliver the new Ford Mustang models by end-of-quarter?',114), - -(10348,'2004-11-01','2004-11-08','2004-11-05','Shipped',NULL,458), - -(10349,'2004-12-01','2004-12-07','2004-12-03','Shipped',NULL,151), - -(10350,'2004-12-02','2004-12-08','2004-12-05','Shipped',NULL,141), - -(10351,'2004-12-03','2004-12-11','2004-12-07','Shipped',NULL,324), - -(10352,'2004-12-03','2004-12-12','2004-12-09','Shipped',NULL,198), - -(10353,'2004-12-04','2004-12-11','2004-12-05','Shipped',NULL,447), - -(10354,'2004-12-04','2004-12-10','2004-12-05','Shipped',NULL,323), - -(10355,'2004-12-07','2004-12-14','2004-12-13','Shipped',NULL,141), - -(10356,'2004-12-09','2004-12-15','2004-12-12','Shipped',NULL,250), - -(10357,'2004-12-10','2004-12-16','2004-12-14','Shipped',NULL,124), - -(10358,'2004-12-10','2004-12-16','2004-12-16','Shipped','Customer requested that DHL is used for this shipping',141), - -(10359,'2004-12-15','2004-12-23','2004-12-18','Shipped',NULL,353), - -(10360,'2004-12-16','2004-12-22','2004-12-18','Shipped',NULL,496), - -(10361,'2004-12-17','2004-12-24','2004-12-20','Shipped',NULL,282), - -(10362,'2005-01-05','2005-01-16','2005-01-10','Shipped',NULL,161), - -(10363,'2005-01-06','2005-01-12','2005-01-10','Shipped',NULL,334), - -(10364,'2005-01-06','2005-01-17','2005-01-09','Shipped',NULL,350), - -(10365,'2005-01-07','2005-01-18','2005-01-11','Shipped',NULL,320), - -(10366,'2005-01-10','2005-01-19','2005-01-12','Shipped',NULL,381), - -(10367,'2005-01-12','2005-01-21','2005-01-16','Resolved','This order was disputed and resolved on 2/1/2005. Customer claimed that container with shipment was damaged. 
FedEx\'s investigation proved this wrong.',205), - -(10368,'2005-01-19','2005-01-27','2005-01-24','Shipped','Can we renegotiate this one?',124), - -(10369,'2005-01-20','2005-01-28','2005-01-24','Shipped',NULL,379), - -(10370,'2005-01-20','2005-02-01','2005-01-25','Shipped',NULL,276), - -(10371,'2005-01-23','2005-02-03','2005-01-25','Shipped',NULL,124), - -(10372,'2005-01-26','2005-02-05','2005-01-28','Shipped',NULL,398), - -(10373,'2005-01-31','2005-02-08','2005-02-06','Shipped',NULL,311), - -(10374,'2005-02-02','2005-02-09','2005-02-03','Shipped',NULL,333), - -(10375,'2005-02-03','2005-02-10','2005-02-06','Shipped',NULL,119), - -(10376,'2005-02-08','2005-02-18','2005-02-13','Shipped',NULL,219), - -(10377,'2005-02-09','2005-02-21','2005-02-12','Shipped','Cautious optimism. We have happy customers here, if we can keep them well stocked. I need all the information I can get on the planned shippments of Porches',186), - -(10378,'2005-02-10','2005-02-18','2005-02-11','Shipped',NULL,141), - -(10379,'2005-02-10','2005-02-18','2005-02-11','Shipped',NULL,141), - -(10380,'2005-02-16','2005-02-24','2005-02-18','Shipped',NULL,141), - -(10381,'2005-02-17','2005-02-25','2005-02-18','Shipped',NULL,321), - -(10382,'2005-02-17','2005-02-23','2005-02-18','Shipped','Custom shipping instructions sent to warehouse',124), - -(10383,'2005-02-22','2005-03-02','2005-02-25','Shipped',NULL,141), - -(10384,'2005-02-23','2005-03-06','2005-02-27','Shipped',NULL,321), - -(10385,'2005-02-28','2005-03-09','2005-03-01','Shipped',NULL,124), - -(10386,'2005-03-01','2005-03-09','2005-03-06','Resolved','Disputed then Resolved on 3/15/2005. Customer doesn\'t like the craftsmaship of the models.',141), - -(10387,'2005-03-02','2005-03-09','2005-03-06','Shipped','We need to keep in close contact with their Marketing VP. He is the decision maker for all their purchases.',148), - -(10388,'2005-03-03','2005-03-11','2005-03-09','Shipped',NULL,462), - -(10389,'2005-03-03','2005-03-09','2005-03-08','Shipped',NULL,448), - -(10390,'2005-03-04','2005-03-11','2005-03-07','Shipped','They want to reevaluate their terms agreement with Finance.',124), - -(10391,'2005-03-09','2005-03-20','2005-03-15','Shipped',NULL,276), - -(10392,'2005-03-10','2005-03-18','2005-03-12','Shipped',NULL,452), - -(10393,'2005-03-11','2005-03-22','2005-03-14','Shipped','They want to reevaluate their terms agreement with Finance.',323), - -(10394,'2005-03-15','2005-03-25','2005-03-19','Shipped',NULL,141), - -(10395,'2005-03-17','2005-03-24','2005-03-23','Shipped','We must be cautions with this customer. Their VP of Sales resigned. Company may be heading down.',250), - -(10396,'2005-03-23','2005-04-02','2005-03-28','Shipped',NULL,124), - -(10397,'2005-03-28','2005-04-09','2005-04-01','Shipped',NULL,242), - -(10398,'2005-03-30','2005-04-09','2005-03-31','Shipped',NULL,353), - -(10399,'2005-04-01','2005-04-12','2005-04-03','Shipped',NULL,496), - -(10400,'2005-04-01','2005-04-11','2005-04-04','Shipped','Customer requested that DHL is used for this shipping',450), - -(10401,'2005-04-03','2005-04-14',NULL,'On Hold','Customer credit limit exceeded. 
Will ship when a payment is received.',328), - -(10402,'2005-04-07','2005-04-14','2005-04-12','Shipped',NULL,406), - -(10403,'2005-04-08','2005-04-18','2005-04-11','Shipped',NULL,201), - -(10404,'2005-04-08','2005-04-14','2005-04-11','Shipped',NULL,323), - -(10405,'2005-04-14','2005-04-24','2005-04-20','Shipped',NULL,209), - -(10406,'2005-04-15','2005-04-25','2005-04-21','Disputed','Customer claims container with shipment was damaged during shipping and some items were missing. I am talking to FedEx about this.',145), - -(10407,'2005-04-22','2005-05-04',NULL,'On Hold','Customer credit limit exceeded. Will ship when a payment is received.',450), - -(10408,'2005-04-22','2005-04-29','2005-04-27','Shipped',NULL,398), - -(10409,'2005-04-23','2005-05-05','2005-04-24','Shipped',NULL,166), - -(10410,'2005-04-29','2005-05-10','2005-04-30','Shipped',NULL,357), - -(10411,'2005-05-01','2005-05-08','2005-05-06','Shipped',NULL,233), - -(10412,'2005-05-03','2005-05-13','2005-05-05','Shipped',NULL,141), - -(10413,'2005-05-05','2005-05-14','2005-05-09','Shipped','Customer requested that DHL is used for this shipping',175), - -(10414,'2005-05-06','2005-05-13',NULL,'On Hold','Customer credit limit exceeded. Will ship when a payment is received.',362), - -(10415,'2005-05-09','2005-05-20','2005-05-12','Disputed','Customer claims the scales of the models don\'t match what was discussed. I keep all the paperwork though to prove otherwise',471), - -(10416,'2005-05-10','2005-05-16','2005-05-14','Shipped',NULL,386), - -(10417,'2005-05-13','2005-05-19','2005-05-19','Disputed','Customer doesn\'t like the colors and precision of the models.',141), - -(10418,'2005-05-16','2005-05-24','2005-05-20','Shipped',NULL,412), - -(10419,'2005-05-17','2005-05-28','2005-05-19','Shipped',NULL,382), - -(10420,'2005-05-29','2005-06-07',NULL,'In Process',NULL,282), - -(10421,'2005-05-29','2005-06-06',NULL,'In Process','Custom shipping instructions were sent to warehouse',124), - -(10422,'2005-05-30','2005-06-11',NULL,'In Process',NULL,157), - -(10423,'2005-05-30','2005-06-05',NULL,'In Process',NULL,314), - -(10424,'2005-05-31','2005-06-08',NULL,'In Process',NULL,141), - -(10425,'2005-05-31','2005-06-07',NULL,'In Process',NULL,119); - -/*Table structure for table `payments` */ - -DROP TABLE IF EXISTS `payments`; - -CREATE TABLE `payments` ( - `customerNumber` int(11) NOT NULL, - `checkNumber` varchar(50) NOT NULL, - `paymentDate` date NOT NULL, - `amount` decimal(10,2) NOT NULL, - PRIMARY KEY (`customerNumber`,`checkNumber`), - CONSTRAINT `payments_ibfk_1` FOREIGN KEY (`customerNumber`) REFERENCES `customers` (`customerNumber`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `payments` */ - -insert into `payments`(`customerNumber`,`checkNumber`,`paymentDate`,`amount`) values - -(103,'HQ336336','2004-10-19','6066.78'), - -(103,'JM555205','2003-06-05','14571.44'), - -(103,'OM314933','2004-12-18','1676.14'), - -(112,'BO864823','2004-12-17','14191.12'), - -(112,'HQ55022','2003-06-06','32641.98'), - -(112,'ND748579','2004-08-20','33347.88'), - -(114,'GG31455','2003-05-20','45864.03'), - -(114,'MA765515','2004-12-15','82261.22'), - -(114,'NP603840','2003-05-31','7565.08'), - -(114,'NR27552','2004-03-10','44894.74'), - -(119,'DB933704','2004-11-14','19501.82'), - -(119,'LN373447','2004-08-08','47924.19'), - -(119,'NG94694','2005-02-22','49523.67'), - -(121,'DB889831','2003-02-16','50218.95'), - -(121,'FD317790','2003-10-28','1491.38'), - -(121,'KI831359','2004-11-04','17876.32'), - -(121,'MA302151','2004-11-28','34638.14'), - 
-(124,'AE215433','2005-03-05','101244.59'), - -(124,'BG255406','2004-08-28','85410.87'), - -(124,'CQ287967','2003-04-11','11044.30'), - -(124,'ET64396','2005-04-16','83598.04'), - -(124,'HI366474','2004-12-27','47142.70'), - -(124,'HR86578','2004-11-02','55639.66'), - -(124,'KI131716','2003-08-15','111654.40'), - -(124,'LF217299','2004-03-26','43369.30'), - -(124,'NT141748','2003-11-25','45084.38'), - -(128,'DI925118','2003-01-28','10549.01'), - -(128,'FA465482','2003-10-18','24101.81'), - -(128,'FH668230','2004-03-24','33820.62'), - -(128,'IP383901','2004-11-18','7466.32'), - -(129,'DM826140','2004-12-08','26248.78'), - -(129,'ID449593','2003-12-11','23923.93'), - -(129,'PI42991','2003-04-09','16537.85'), - -(131,'CL442705','2003-03-12','22292.62'), - -(131,'MA724562','2004-12-02','50025.35'), - -(131,'NB445135','2004-09-11','35321.97'), - -(141,'AU364101','2003-07-19','36251.03'), - -(141,'DB583216','2004-11-01','36140.38'), - -(141,'DL460618','2005-05-19','46895.48'), - -(141,'HJ32686','2004-01-30','59830.55'), - -(141,'ID10962','2004-12-31','116208.40'), - -(141,'IN446258','2005-03-25','65071.26'), - -(141,'JE105477','2005-03-18','120166.58'), - -(141,'JN355280','2003-10-26','49539.37'), - -(141,'JN722010','2003-02-25','40206.20'), - -(141,'KT52578','2003-12-09','63843.55'), - -(141,'MC46946','2004-07-09','35420.74'), - -(141,'MF629602','2004-08-16','20009.53'), - -(141,'NU627706','2004-05-17','26155.91'), - -(144,'IR846303','2004-12-12','36005.71'), - -(144,'LA685678','2003-04-09','7674.94'), - -(145,'CN328545','2004-07-03','4710.73'), - -(145,'ED39322','2004-04-26','28211.70'), - -(145,'HR182688','2004-12-01','20564.86'), - -(145,'JJ246391','2003-02-20','53959.21'), - -(146,'FP549817','2004-03-18','40978.53'), - -(146,'FU793410','2004-01-16','49614.72'), - -(146,'LJ160635','2003-12-10','39712.10'), - -(148,'BI507030','2003-04-22','44380.15'), - -(148,'DD635282','2004-08-11','2611.84'), - -(148,'KM172879','2003-12-26','105743.00'), - -(148,'ME497970','2005-03-27','3516.04'), - -(151,'BF686658','2003-12-22','58793.53'), - -(151,'GB852215','2004-07-26','20314.44'), - -(151,'IP568906','2003-06-18','58841.35'), - -(151,'KI884577','2004-12-14','39964.63'), - -(157,'HI618861','2004-11-19','35152.12'), - -(157,'NN711988','2004-09-07','63357.13'), - -(161,'BR352384','2004-11-14','2434.25'), - -(161,'BR478494','2003-11-18','50743.65'), - -(161,'KG644125','2005-02-02','12692.19'), - -(161,'NI908214','2003-08-05','38675.13'), - -(166,'BQ327613','2004-09-16','38785.48'), - -(166,'DC979307','2004-07-07','44160.92'), - -(166,'LA318629','2004-02-28','22474.17'), - -(167,'ED743615','2004-09-19','12538.01'), - -(167,'GN228846','2003-12-03','85024.46'), - -(171,'GB878038','2004-03-15','18997.89'), - -(171,'IL104425','2003-11-22','42783.81'), - -(172,'AD832091','2004-09-09','1960.80'), - -(172,'CE51751','2004-12-04','51209.58'), - -(172,'EH208589','2003-04-20','33383.14'), - -(173,'GP545698','2004-05-13','11843.45'), - -(173,'IG462397','2004-03-29','20355.24'), - -(175,'CITI3434344','2005-05-19','28500.78'), - -(175,'IO448913','2003-11-19','24879.08'), - -(175,'PI15215','2004-07-10','42044.77'), - -(177,'AU750837','2004-04-17','15183.63'), - -(177,'CI381435','2004-01-19','47177.59'), - -(181,'CM564612','2004-04-25','22602.36'), - -(181,'GQ132144','2003-01-30','5494.78'), - -(181,'OH367219','2004-11-16','44400.50'), - -(186,'AE192287','2005-03-10','23602.90'), - -(186,'AK412714','2003-10-27','37602.48'), - -(186,'KA602407','2004-10-21','34341.08'), - -(187,'AM968797','2004-11-03','52825.29'), - 
-(187,'BQ39062','2004-12-08','47159.11'), - -(187,'KL124726','2003-03-27','48425.69'), - -(189,'BO711618','2004-10-03','17359.53'), - -(189,'NM916675','2004-03-01','32538.74'), - -(198,'FI192930','2004-12-06','9658.74'), - -(198,'HQ920205','2003-07-06','6036.96'), - -(198,'IS946883','2004-09-21','5858.56'), - -(201,'DP677013','2003-10-20','23908.24'), - -(201,'OO846801','2004-06-15','37258.94'), - -(202,'HI358554','2003-12-18','36527.61'), - -(202,'IQ627690','2004-11-08','33594.58'), - -(204,'GC697638','2004-08-13','51152.86'), - -(204,'IS150005','2004-09-24','4424.40'), - -(205,'GL756480','2003-12-04','3879.96'), - -(205,'LL562733','2003-09-05','50342.74'), - -(205,'NM739638','2005-02-06','39580.60'), - -(209,'BOAF82044','2005-05-03','35157.75'), - -(209,'ED520529','2004-06-21','4632.31'), - -(209,'PH785937','2004-05-04','36069.26'), - -(211,'BJ535230','2003-12-09','45480.79'), - -(216,'BG407567','2003-05-09','3101.40'), - -(216,'ML780814','2004-12-06','24945.21'), - -(216,'MM342086','2003-12-14','40473.86'), - -(219,'BN17870','2005-03-02','3452.75'), - -(219,'BR941480','2003-10-18','4465.85'), - -(227,'MQ413968','2003-10-31','36164.46'), - -(227,'NU21326','2004-11-02','53745.34'), - -(233,'BOFA23232','2005-05-20','29070.38'), - -(233,'II180006','2004-07-01','22997.45'), - -(233,'JG981190','2003-11-18','16909.84'), - -(239,'NQ865547','2004-03-15','80375.24'), - -(240,'IF245157','2004-11-16','46788.14'), - -(240,'JO719695','2004-03-28','24995.61'), - -(242,'AF40894','2003-11-22','33818.34'), - -(242,'HR224331','2005-06-03','12432.32'), - -(242,'KI744716','2003-07-21','14232.70'), - -(249,'IJ399820','2004-09-19','33924.24'), - -(249,'NE404084','2004-09-04','48298.99'), - -(250,'EQ12267','2005-05-17','17928.09'), - -(250,'HD284647','2004-12-30','26311.63'), - -(250,'HN114306','2003-07-18','23419.47'), - -(256,'EP227123','2004-02-10','5759.42'), - -(256,'HE84936','2004-10-22','53116.99'), - -(259,'EU280955','2004-11-06','61234.67'), - -(259,'GB361972','2003-12-07','27988.47'), - -(260,'IO164641','2004-08-30','37527.58'), - -(260,'NH776924','2004-04-24','29284.42'), - -(276,'EM979878','2005-02-09','27083.78'), - -(276,'KM841847','2003-11-13','38547.19'), - -(276,'LE432182','2003-09-28','41554.73'), - -(276,'OJ819725','2005-04-30','29848.52'), - -(278,'BJ483870','2004-12-05','37654.09'), - -(278,'GP636783','2003-03-02','52151.81'), - -(278,'NI983021','2003-11-24','37723.79'), - -(282,'IA793562','2003-08-03','24013.52'), - -(282,'JT819493','2004-08-02','35806.73'), - -(282,'OD327378','2005-01-03','31835.36'), - -(286,'DR578578','2004-10-28','47411.33'), - -(286,'KH910279','2004-09-05','43134.04'), - -(298,'AJ574927','2004-03-13','47375.92'), - -(298,'LF501133','2004-09-18','61402.00'), - -(299,'AD304085','2003-10-24','36798.88'), - -(299,'NR157385','2004-09-05','32260.16'), - -(311,'DG336041','2005-02-15','46770.52'), - -(311,'FA728475','2003-10-06','32723.04'), - -(311,'NQ966143','2004-04-25','16212.59'), - -(314,'LQ244073','2004-08-09','45352.47'), - -(314,'MD809704','2004-03-03','16901.38'), - -(319,'HL685576','2004-11-06','42339.76'), - -(319,'OM548174','2003-12-07','36092.40'), - -(320,'GJ597719','2005-01-18','8307.28'), - -(320,'HO576374','2003-08-20','41016.75'), - -(320,'MU817160','2003-11-24','52548.49'), - -(321,'DJ15149','2003-11-03','85559.12'), - -(321,'LA556321','2005-03-15','46781.66'), - -(323,'AL493079','2005-05-23','75020.13'), - -(323,'ES347491','2004-06-24','37281.36'), - -(323,'HG738664','2003-07-05','2880.00'), - -(323,'PQ803830','2004-12-24','39440.59'), - 
-(324,'DQ409197','2004-12-13','13671.82'), - -(324,'FP443161','2003-07-07','29429.14'), - -(324,'HB150714','2003-11-23','37455.77'), - -(328,'EN930356','2004-04-16','7178.66'), - -(328,'NR631421','2004-05-30','31102.85'), - -(333,'HL209210','2003-11-15','23936.53'), - -(333,'JK479662','2003-10-17','9821.32'), - -(333,'NF959653','2005-03-01','21432.31'), - -(334,'CS435306','2005-01-27','45785.34'), - -(334,'HH517378','2003-08-16','29716.86'), - -(334,'LF737277','2004-05-22','28394.54'), - -(339,'AP286625','2004-10-24','23333.06'), - -(339,'DA98827','2003-11-28','34606.28'), - -(344,'AF246722','2003-11-24','31428.21'), - -(344,'NJ906924','2004-04-02','15322.93'), - -(347,'DG700707','2004-01-18','21053.69'), - -(347,'LG808674','2003-10-24','20452.50'), - -(350,'BQ602907','2004-12-11','18888.31'), - -(350,'CI471510','2003-05-25','50824.66'), - -(350,'OB648482','2005-01-29','1834.56'), - -(353,'CO351193','2005-01-10','49705.52'), - -(353,'ED878227','2003-07-21','13920.26'), - -(353,'GT878649','2003-05-21','16700.47'), - -(353,'HJ618252','2005-06-09','46656.94'), - -(357,'AG240323','2003-12-16','20220.04'), - -(357,'NB291497','2004-05-15','36442.34'), - -(362,'FP170292','2004-07-11','18473.71'), - -(362,'OG208861','2004-09-21','15059.76'), - -(363,'HL575273','2004-11-17','50799.69'), - -(363,'IS232033','2003-01-16','10223.83'), - -(363,'PN238558','2003-12-05','55425.77'), - -(379,'CA762595','2005-02-12','28322.83'), - -(379,'FR499138','2003-09-16','32680.31'), - -(379,'GB890854','2004-08-02','12530.51'), - -(381,'BC726082','2004-12-03','12081.52'), - -(381,'CC475233','2003-04-19','1627.56'), - -(381,'GB117430','2005-02-03','14379.90'), - -(381,'MS154481','2003-08-22','1128.20'), - -(382,'CC871084','2003-05-12','35826.33'), - -(382,'CT821147','2004-08-01','6419.84'), - -(382,'PH29054','2004-11-27','42813.83'), - -(385,'BN347084','2003-12-02','20644.24'), - -(385,'CP804873','2004-11-19','15822.84'), - -(385,'EK785462','2003-03-09','51001.22'), - -(386,'DO106109','2003-11-18','38524.29'), - -(386,'HG438769','2004-07-18','51619.02'), - -(398,'AJ478695','2005-02-14','33967.73'), - -(398,'DO787644','2004-06-21','22037.91'), - -(398,'JPMR4544','2005-05-18','615.45'), - -(398,'KB54275','2004-11-29','48927.64'), - -(406,'BJMPR4545','2005-04-23','12190.85'), - -(406,'HJ217687','2004-01-28','49165.16'), - -(406,'NA197101','2004-06-17','25080.96'), - -(412,'GH197075','2004-07-25','35034.57'), - -(412,'PJ434867','2004-04-14','31670.37'), - -(415,'ER54537','2004-09-28','31310.09'), - -(424,'KF480160','2004-12-07','25505.98'), - -(424,'LM271923','2003-04-16','21665.98'), - -(424,'OA595449','2003-10-31','22042.37'), - -(447,'AO757239','2003-09-15','6631.36'), - -(447,'ER615123','2003-06-25','17032.29'), - -(447,'OU516561','2004-12-17','26304.13'), - -(448,'FS299615','2005-04-18','27966.54'), - -(448,'KR822727','2004-09-30','48809.90'), - -(450,'EF485824','2004-06-21','59551.38'), - -(452,'ED473873','2003-11-15','27121.90'), - -(452,'FN640986','2003-11-20','15130.97'), - -(452,'HG635467','2005-05-03','8807.12'), - -(455,'HA777606','2003-12-05','38139.18'), - -(455,'IR662429','2004-05-12','32239.47'), - -(456,'GJ715659','2004-11-13','27550.51'), - -(456,'MO743231','2004-04-30','1679.92'), - -(458,'DD995006','2004-11-15','33145.56'), - -(458,'NA377824','2004-02-06','22162.61'), - -(458,'OO606861','2003-06-13','57131.92'), - -(462,'ED203908','2005-04-15','30293.77'), - -(462,'GC60330','2003-11-08','9977.85'), - -(462,'PE176846','2004-11-27','48355.87'), - -(471,'AB661578','2004-07-28','9415.13'), - 
-(471,'CO645196','2003-12-10','35505.63'), - -(473,'LL427009','2004-02-17','7612.06'), - -(473,'PC688499','2003-10-27','17746.26'), - -(475,'JP113227','2003-12-09','7678.25'), - -(475,'PB951268','2004-02-13','36070.47'), - -(484,'GK294076','2004-10-26','3474.66'), - -(484,'JH546765','2003-11-29','47513.19'), - -(486,'BL66528','2004-04-14','5899.38'), - -(486,'HS86661','2004-11-23','45994.07'), - -(486,'JB117768','2003-03-20','25833.14'), - -(487,'AH612904','2003-09-28','29997.09'), - -(487,'PT550181','2004-02-29','12573.28'), - -(489,'OC773849','2003-12-04','22275.73'), - -(489,'PO860906','2004-01-31','7310.42'), - -(495,'BH167026','2003-12-26','59265.14'), - -(495,'FN155234','2004-05-14','6276.60'), - -(496,'EU531600','2005-05-25','30253.75'), - -(496,'MB342426','2003-07-16','32077.44'), - -(496,'MN89921','2004-12-31','52166.00'); - -/*Table structure for table `productlines` */ - -DROP TABLE IF EXISTS `productlines`; - -CREATE TABLE `productlines` ( - `productLine` varchar(50) NOT NULL, - `textDescription` varchar(4000) DEFAULT NULL, - `htmlDescription` mediumtext, - `image` mediumblob, - PRIMARY KEY (`productLine`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `productlines` */ - -insert into `productlines`(`productLine`,`textDescription`,`htmlDescription`,`image`) values - -('Classic Cars','Attention car enthusiasts: Make your wildest car ownership dreams come true. Whether you are looking for classic muscle cars, dream sports cars or movie-inspired miniatures, you will find great choices in this category. These replicas feature superb attention to detail and craftsmanship and offer features such as working steering system, opening forward compartment, opening rear trunk with removable spare wheel, 4-wheel independent spring suspension, and so on. The models range in size from 1:10 to 1:24 scale and include numerous limited edition and several out-of-production vehicles. All models include a certificate of authenticity from their manufacturers and come fully assembled and ready for display in the home or office.',NULL,NULL), - -('Motorcycles','Our motorcycles are state of the art replicas of classic as well as contemporary motorcycle legends such as Harley Davidson, Ducati and Vespa. Models contain stunning details such as official logos, rotating wheels, working kickstand, front suspension, gear-shift lever, footbrake lever, and drive chain. Materials used include diecast and plastic. The models range in size from 1:10 to 1:50 scale and include numerous limited edition and several out-of-production vehicles. All models come fully assembled and ready for display in the home or office. Most include a certificate of authenticity.',NULL,NULL), - -('Planes','Unique, diecast airplane and helicopter replicas suitable for collections, as well as home, office or classroom decorations. Models contain stunning details such as official logos and insignias, rotating jet engines and propellers, retractable wheels, and so on. Most come fully assembled and with a certificate of authenticity from their manufacturers.',NULL,NULL), - -('Ships','The perfect holiday or anniversary gift for executives, clients, friends, and family. These handcrafted model ships are unique, stunning works of art that will be treasured for generations! They come fully assembled and ready for display in the home or office. We guarantee the highest quality, and best value.',NULL,NULL), - -('Trains','Model trains are a rewarding hobby for enthusiasts of all ages. 
Whether you\'re looking for collectible wooden trains, electric streetcars or locomotives, you\'ll find a number of great choices for any budget within this category. The interactive aspect of trains makes toy trains perfect for young children. The wooden train sets are ideal for children under the age of 5.',NULL,NULL), - -('Trucks and Buses','The Truck and Bus models are realistic replicas of buses and specialized trucks produced from the early 1920s to present. The models range in size from 1:12 to 1:50 scale and include numerous limited edition and several out-of-production vehicles. Materials used include tin, diecast and plastic. All models include a certificate of authenticity from their manufacturers and are a perfect ornament for the home and office.',NULL,NULL), - -('Vintage Cars','Our Vintage Car models realistically portray automobiles produced from the early 1900s through the 1940s. Materials used include Bakelite, diecast, plastic and wood. Most of the replicas are in the 1:18 and 1:24 scale sizes, which provide the optimum in detail and accuracy. Prices range from $30.00 up to $180.00 for some special limited edition replicas. All models include a certificate of authenticity from their manufacturers and come fully assembled and ready for display in the home or office.',NULL,NULL); - -/*Table structure for table `products` */ - -DROP TABLE IF EXISTS `products`; - -CREATE TABLE `products` ( - `productCode` varchar(15) NOT NULL, - `productName` varchar(70) NOT NULL, - `productLine` varchar(50) NOT NULL, - `productScale` varchar(10) NOT NULL, - `productVendor` varchar(50) NOT NULL, - `productDescription` text NOT NULL, - `quantityInStock` smallint(6) NOT NULL, - `buyPrice` decimal(10,2) NOT NULL, - `MSRP` decimal(10,2) NOT NULL, - PRIMARY KEY (`productCode`), - KEY `productLine` (`productLine`), - CONSTRAINT `products_ibfk_1` FOREIGN KEY (`productLine`) REFERENCES `productlines` (`productLine`) -) ENGINE=InnoDB DEFAULT CHARSET=latin1; - -/*Data for the table `products` */ - -insert into `products`(`productCode`,`productName`,`productLine`,`productScale`,`productVendor`,`productDescription`,`quantityInStock`,`buyPrice`,`MSRP`) values - -('S10_1678','1969 Harley Davidson Ultimate Chopper','Motorcycles','1:10','Min Lin Diecast','This replica features working kickstand, front suspension, gear-shift lever, footbrake lever, drive chain, wheels and steering. 
All parts are particularly delicate due to their precise scale and require special care and attention.',7933,'48.81','95.70'), - -('S10_1949','1952 Alpine Renault 1300','Classic Cars','1:10','Classic Metal Creations','Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',7305,'98.58','214.30'), - -('S10_2016','1996 Moto Guzzi 1100i','Motorcycles','1:10','Highway 66 Mini Classics','Official Moto Guzzi logos and insignias, saddle bags located on side of motorcycle, detailed engine, working steering, working suspension, two leather seats, luggage rack, dual exhaust pipes, small saddle bag located on handle bars, two-tone paint with chrome accents, superior die-cast detail , rotating wheels , working kick stand, diecast metal with plastic parts and baked enamel finish.',6625,'68.99','118.94'), - -('S10_4698','2003 Harley-Davidson Eagle Drag Bike','Motorcycles','1:10','Red Start Diecast','Model features, official Harley Davidson logos and insignias, detachable rear wheelie bar, heavy diecast metal with resin parts, authentic multi-color tampo-printed graphics, separate engine drive belts, free-turning front fork, rotating tires and rear racing slick, certificate of authenticity, detailed engine, display stand\r\n, precision diecast replica, baked enamel finish, 1:10 scale model, removable fender, seat and tank cover piece for displaying the superior detail of the v-twin engine',5582,'91.02','193.66'), - -('S10_4757','1972 Alfa Romeo GTA','Classic Cars','1:10','Motor City Art Classics','Features include: Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',3252,'85.68','136.00'), - -('S10_4962','1962 LanciaA Delta 16V','Classic Cars','1:10','Second Gear Diecast','Features include: Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',6791,'103.42','147.74'), - -('S12_1099','1968 Ford Mustang','Classic Cars','1:12','Autoart Studio Design','Hood, doors and trunk all open to reveal highly detailed interior features. Steering wheel actually turns the front wheels. 
Color dark green.',68,'95.34','194.57'), - -('S12_1108','2001 Ferrari Enzo','Classic Cars','1:12','Second Gear Diecast','Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',3619,'95.59','207.80'), - -('S12_1666','1958 Setra Bus','Trucks and Buses','1:12','Welly Diecast Productions','Model features 30 windows, skylights & glare resistant glass, working steering system, original logos',1579,'77.90','136.67'), - -('S12_2823','2002 Suzuki XREO','Motorcycles','1:12','Unimax Art Galleries','Official logos and insignias, saddle bags located on side of motorcycle, detailed engine, working steering, working suspension, two leather seats, luggage rack, dual exhaust pipes, small saddle bag located on handle bars, two-tone paint with chrome accents, superior die-cast detail , rotating wheels , working kick stand, diecast metal with plastic parts and baked enamel finish.',9997,'66.27','150.62'), - -('S12_3148','1969 Corvair Monza','Classic Cars','1:18','Welly Diecast Productions','1:18 scale die-cast about 10\" long doors open, hood opens, trunk opens and wheels roll',6906,'89.14','151.08'), - -('S12_3380','1968 Dodge Charger','Classic Cars','1:12','Welly Diecast Productions','1:12 scale model of a 1968 Dodge Charger. Hood, doors and trunk all open to reveal highly detailed interior features. Steering wheel actually turns the front wheels. Color black',9123,'75.16','117.44'), - -('S12_3891','1969 Ford Falcon','Classic Cars','1:12','Second Gear Diecast','Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',1049,'83.05','173.02'), - -('S12_3990','1970 Plymouth Hemi Cuda','Classic Cars','1:12','Studio M Art Models','Very detailed 1970 Plymouth Cuda model in 1:12 scale. The Cuda is generally accepted as one of the fastest original muscle cars from the 1970s. This model is a reproduction of one of the orginal 652 cars built in 1970. Red color.',5663,'31.92','79.80'), - -('S12_4473','1957 Chevy Pickup','Trucks and Buses','1:12','Exoto Designs','1:12 scale die-cast about 20\" long Hood opens, Rubber wheels',6125,'55.70','118.50'), - -('S12_4675','1969 Dodge Charger','Classic Cars','1:12','Welly Diecast Productions','Detailed model of the 1969 Dodge Charger. This model includes finely detailed interior and exterior features. Painted in red and white.',7323,'58.73','115.16'), - -('S18_1097','1940 Ford Pickup Truck','Trucks and Buses','1:18','Studio M Art Models','This model features soft rubber tires, working steering, rubber mud guards, authentic Ford logos, detailed undercarriage, opening doors and hood, removable split rear gate, full size spare mounted in bed, detailed interior with opening glove box',2613,'58.33','116.67'), - -('S18_1129','1993 Mazda RX-7','Classic Cars','1:18','Highway 66 Mini Classics','This model features, opening hood, opening doors, detailed engine, rear spoiler, opening trunk, working steering, tinted windows, baked enamel finish. Color red.',3975,'83.51','141.54'), - -('S18_1342','1937 Lincoln Berline','Vintage Cars','1:18','Motor City Art Classics','Features opening engine cover, doors, trunk, and fuel filler cap. 
Color black',8693,'60.62','102.74'), - -('S18_1367','1936 Mercedes-Benz 500K Special Roadster','Vintage Cars','1:18','Studio M Art Models','This 1:18 scale replica is constructed of heavy die-cast metal and has all the features of the original: working doors and rumble seat, independent spring suspension, detailed interior, working steering system, and a bifold hood that reveals an engine so accurate that it even includes the wiring. All this is topped off with a baked enamel finish. Color white.',8635,'24.26','53.91'), - -('S18_1589','1965 Aston Martin DB5','Classic Cars','1:18','Classic Metal Creations','Die-cast model of the silver 1965 Aston Martin DB5 in silver. This model includes full wire wheels and doors that open with fully detailed passenger compartment. In 1:18 scale, this model measures approximately 10 inches/20 cm long.',9042,'65.96','124.44'), - -('S18_1662','1980s Black Hawk Helicopter','Planes','1:18','Red Start Diecast','1:18 scale replica of actual Army\'s UH-60L BLACK HAWK Helicopter. 100% hand-assembled. Features rotating rotor blades, propeller blades and rubber wheels.',5330,'77.27','157.69'), - -('S18_1749','1917 Grand Touring Sedan','Vintage Cars','1:18','Welly Diecast Productions','This 1:18 scale replica of the 1917 Grand Touring car has all the features you would expect from museum quality reproductions: all four doors and bi-fold hood opening, detailed engine and instrument panel, chrome-look trim, and tufted upholstery, all topped off with a factory baked-enamel finish.',2724,'86.70','170.00'), - -('S18_1889','1948 Porsche 356-A Roadster','Classic Cars','1:18','Gearbox Collectibles','This precision die-cast replica features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',8826,'53.90','77.00'), - -('S18_1984','1995 Honda Civic','Classic Cars','1:18','Min Lin Diecast','This model features, opening hood, opening doors, detailed engine, rear spoiler, opening trunk, working steering, tinted windows, baked enamel finish. Color yellow.',9772,'93.89','142.25'), - -('S18_2238','1998 Chrysler Plymouth Prowler','Classic Cars','1:18','Gearbox Collectibles','Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',4724,'101.51','163.73'), - -('S18_2248','1911 Ford Town Car','Vintage Cars','1:18','Motor City Art Classics','Features opening hood, opening doors, opening trunk, wide white wall tires, front door arm rests, working steering system.',540,'33.30','60.54'), - -('S18_2319','1964 Mercedes Tour Bus','Trucks and Buses','1:18','Unimax Art Galleries','Exact replica. 100+ parts. working steering system, original logos',8258,'74.86','122.73'), - -('S18_2325','1932 Model A Ford J-Coupe','Vintage Cars','1:18','Autoart Studio Design','This model features grille-mounted chrome horn, lift-up louvered hood, fold-down rumble seat, working steering system, chrome-covered spare, opening doors, detailed and wired engine',9354,'58.48','127.13'), - -('S18_2432','1926 Ford Fire Engine','Trucks and Buses','1:18','Carousel DieCast Legends','Gleaming red handsome appearance. 
Everything is here the fire hoses, ladder, axes, bells, lanterns, ready to fight any inferno.',2018,'24.92','60.77'), - -('S18_2581','P-51-D Mustang','Planes','1:72','Gearbox Collectibles','Has retractable wheels and comes with a stand',992,'49.00','84.48'), - -('S18_2625','1936 Harley Davidson El Knucklehead','Motorcycles','1:18','Welly Diecast Productions','Intricately detailed with chrome accents and trim, official die-struck logos and baked enamel finish.',4357,'24.23','60.57'), - -('S18_2795','1928 Mercedes-Benz SSK','Vintage Cars','1:18','Gearbox Collectibles','This 1:18 replica features grille-mounted chrome horn, lift-up louvered hood, fold-down rumble seat, working steering system, chrome-covered spare, opening doors, detailed and wired engine. Color black.',548,'72.56','168.75'), - -('S18_2870','1999 Indy 500 Monte Carlo SS','Classic Cars','1:18','Red Start Diecast','Features include opening and closing doors. Color: Red',8164,'56.76','132.00'), - -('S18_2949','1913 Ford Model T Speedster','Vintage Cars','1:18','Carousel DieCast Legends','This 250 part reproduction includes moving handbrakes, clutch, throttle and foot pedals, squeezable horn, detailed wired engine, removable water, gas, and oil cans, pivoting monocle windshield, all topped with a baked enamel red finish. Each replica comes with an Owners Title and Certificate of Authenticity. Color red.',4189,'60.78','101.31'), - -('S18_2957','1934 Ford V8 Coupe','Vintage Cars','1:18','Min Lin Diecast','Chrome Trim, Chrome Grille, Opening Hood, Opening Doors, Opening Trunk, Detailed Engine, Working Steering System',5649,'34.35','62.46'), - -('S18_3029','1999 Yamaha Speed Boat','Ships','1:18','Min Lin Diecast','Exact replica. Wood and Metal. Many extras including rigging, long boats, pilot house, anchors, etc. Comes with three masts, all square-rigged.',4259,'51.61','86.02'), - -('S18_3136','18th Century Vintage Horse Carriage','Vintage Cars','1:18','Red Start Diecast','Hand crafted diecast-like metal horse carriage is re-created in about 1:18 scale of antique horse carriage. This antique style metal Stagecoach is all hand-assembled with many different parts.\r\n\r\nThis collectible metal horse carriage is painted in classic Red, and features turning steering wheel and is entirely hand-finished.',5992,'60.74','104.72'), - -('S18_3140','1903 Ford Model A','Vintage Cars','1:18','Unimax Art Galleries','Features opening trunk, working steering system',3913,'68.30','136.59'), - -('S18_3232','1992 Ferrari 360 Spider red','Classic Cars','1:18','Unimax Art Galleries','his replica features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',8347,'77.90','169.34'), - -('S18_3233','1985 Toyota Supra','Classic Cars','1:18','Highway 66 Mini Classics','This model features soft rubber tires, working steering, rubber mud guards, authentic Ford logos, detailed undercarriage, opening doors and hood, removable split rear gate, full size spare mounted in bed, detailed interior with opening glove box',7733,'57.01','107.57'), - -('S18_3259','Collectable Wooden Train','Trains','1:18','Carousel DieCast Legends','Hand crafted wooden toy train set is in about 1:18 scale, 25 inches in total length including 2 additional carts, of actual vintage train. 
This antique style wooden toy train model set is all hand-assembled with 100% wood.',6450,'67.56','100.84'), - -('S18_3278','1969 Dodge Super Bee','Classic Cars','1:18','Min Lin Diecast','This replica features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',1917,'49.05','80.41'), - -('S18_3320','1917 Maxwell Touring Car','Vintage Cars','1:18','Exoto Designs','Features Gold Trim, Full Size Spare Tire, Chrome Trim, Chrome Grille, Opening Hood, Opening Doors, Opening Trunk, Detailed Engine, Working Steering System',7913,'57.54','99.21'), - -('S18_3482','1976 Ford Gran Torino','Classic Cars','1:18','Gearbox Collectibles','Highly detailed 1976 Ford Gran Torino \"Starsky and Hutch\" diecast model. Very well constructed and painted in red and white patterns.',9127,'73.49','146.99'), - -('S18_3685','1948 Porsche Type 356 Roadster','Classic Cars','1:18','Gearbox Collectibles','This model features working front and rear suspension on accurately replicated and actuating shock absorbers as well as opening engine cover, rear stabilizer flap, and 4 opening doors.',8990,'62.16','141.28'), - -('S18_3782','1957 Vespa GS150','Motorcycles','1:18','Studio M Art Models','Features rotating wheels , working kick stand. Comes with stand.',7689,'32.95','62.17'), - -('S18_3856','1941 Chevrolet Special Deluxe Cabriolet','Vintage Cars','1:18','Exoto Designs','Features opening hood, opening doors, opening trunk, wide white wall tires, front door arm rests, working steering system, leather upholstery. Color black.',2378,'64.58','105.87'), - -('S18_4027','1970 Triumph Spitfire','Classic Cars','1:18','Min Lin Diecast','Features include opening and closing doors. Color: White.',5545,'91.92','143.62'), - -('S18_4409','1932 Alfa Romeo 8C2300 Spider Sport','Vintage Cars','1:18','Exoto Designs','This 1:18 scale precision die cast replica features the 6 front headlights of the original, plus a detailed version of the 142 horsepower straight 8 engine, dual spares and their famous comprehensive dashboard. Color black.',6553,'43.26','92.03'), - -('S18_4522','1904 Buick Runabout','Vintage Cars','1:18','Exoto Designs','Features opening trunk, working steering system',8290,'52.66','87.77'), - -('S18_4600','1940s Ford truck','Trucks and Buses','1:18','Motor City Art Classics','This 1940s Ford Pick-Up truck is re-created in 1:18 scale of original 1940s Ford truck. This antique style metal 1940s Ford Flatbed truck is all hand-assembled. This collectible 1940\'s Pick-Up truck is painted in classic dark green color, and features rotating wheels.',3128,'84.76','121.08'), - -('S18_4668','1939 Cadillac Limousine','Vintage Cars','1:18','Studio M Art Models','Features completely detailed interior including Velvet flocked drapes,deluxe wood grain floor, and a wood grain casket with seperate chrome handles',6645,'23.14','50.31'), - -('S18_4721','1957 Corvette Convertible','Classic Cars','1:18','Classic Metal Creations','1957 die cast Corvette Convertible in Roman Red with white sides and whitewall tires. 1:18 scale quality die-cast with detailed engine and underbvody. 
Now you can own The Classic Corvette.',1249,'69.93','148.80'), - -('S18_4933','1957 Ford Thunderbird','Classic Cars','1:18','Studio M Art Models','This 1:18 scale precision die-cast replica, with its optional porthole hardtop and factory baked-enamel Thunderbird Bronze finish, is a 100% accurate rendition of this American classic.',3209,'34.21','71.27'), - -('S24_1046','1970 Chevy Chevelle SS 454','Classic Cars','1:24','Unimax Art Galleries','This model features rotating wheels, working streering system and opening doors. All parts are particularly delicate due to their precise scale and require special care and attention. It should not be picked up by the doors, roof, hood or trunk.',1005,'49.24','73.49'), - -('S24_1444','1970 Dodge Coronet','Classic Cars','1:24','Highway 66 Mini Classics','1:24 scale die-cast about 18\" long doors open, hood opens and rubber wheels',4074,'32.37','57.80'), - -('S24_1578','1997 BMW R 1100 S','Motorcycles','1:24','Autoart Studio Design','Detailed scale replica with working suspension and constructed from over 70 parts',7003,'60.86','112.70'), - -('S24_1628','1966 Shelby Cobra 427 S/C','Classic Cars','1:24','Carousel DieCast Legends','This diecast model of the 1966 Shelby Cobra 427 S/C includes many authentic details and operating parts. The 1:24 scale model of this iconic lighweight sports car from the 1960s comes in silver and it\'s own display case.',8197,'29.18','50.31'), - -('S24_1785','1928 British Royal Navy Airplane','Planes','1:24','Classic Metal Creations','Official logos and insignias',3627,'66.74','109.42'), - -('S24_1937','1939 Chevrolet Deluxe Coupe','Vintage Cars','1:24','Motor City Art Classics','This 1:24 scale die-cast replica of the 1939 Chevrolet Deluxe Coupe has the same classy look as the original. Features opening trunk, hood and doors and a showroom quality baked enamel finish.',7332,'22.57','33.19'), - -('S24_2000','1960 BSA Gold Star DBD34','Motorcycles','1:24','Highway 66 Mini Classics','Detailed scale replica with working suspension and constructed from over 70 parts',15,'37.32','76.17'), - -('S24_2011','18th century schooner','Ships','1:24','Carousel DieCast Legends','All wood with canvas sails. Many extras including rigging, long boats, pilot house, anchors, etc. Comes with 4 masts, all square-rigged.',1898,'82.34','122.89'), - -('S24_2022','1938 Cadillac V-16 Presidential Limousine','Vintage Cars','1:24','Classic Metal Creations','This 1:24 scale precision die cast replica of the 1938 Cadillac V-16 Presidential Limousine has all the details of the original, from the flags on the front to an opening back seat compartment complete with telephone and rifle. Features factory baked-enamel black finish, hood goddess ornament, working jump seats.',2847,'20.61','44.80'), - -('S24_2300','1962 Volkswagen Microbus','Trucks and Buses','1:24','Autoart Studio Design','This 1:18 scale die cast replica of the 1962 Microbus is loaded with features: A working steering system, opening front doors and tailgate, and famous two-tone factory baked enamel finish, are all topped of by the sliding, real fabric, sunroof.',2327,'61.34','127.79'), - -('S24_2360','1982 Ducati 900 Monster','Motorcycles','1:24','Highway 66 Mini Classics','Features two-tone paint with chrome accents, superior die-cast detail , rotating wheels , working kick stand',6840,'47.10','69.26'), - -('S24_2766','1949 Jaguar XK 120','Classic Cars','1:24','Classic Metal Creations','Precision-engineered from original Jaguar specification in perfect scale ratio. 
Features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',2350,'47.25','90.87'), - -('S24_2840','1958 Chevy Corvette Limited Edition','Classic Cars','1:24','Carousel DieCast Legends','The operating parts of this 1958 Chevy Corvette Limited Edition are particularly delicate due to their precise scale and require special care and attention. Features rotating wheels, working streering, opening doors and trunk. Color dark green.',2542,'15.91','35.36'), - -('S24_2841','1900s Vintage Bi-Plane','Planes','1:24','Autoart Studio Design','Hand crafted diecast-like metal bi-plane is re-created in about 1:24 scale of antique pioneer airplane. All hand-assembled with many different parts. Hand-painted in classic yellow and features correct markings of original airplane.',5942,'34.25','68.51'), - -('S24_2887','1952 Citroen-15CV','Classic Cars','1:24','Exoto Designs','Precision crafted hand-assembled 1:18 scale reproduction of the 1952 15CV, with its independent spring suspension, working steering system, opening doors and hood, detailed engine and instrument panel, all topped of with a factory fresh baked enamel finish.',1452,'72.82','117.44'), - -('S24_2972','1982 Lamborghini Diablo','Classic Cars','1:24','Second Gear Diecast','This replica features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',7723,'16.24','37.76'), - -('S24_3151','1912 Ford Model T Delivery Wagon','Vintage Cars','1:24','Min Lin Diecast','This model features chrome trim and grille, opening hood, opening doors, opening trunk, detailed engine, working steering system. Color white.',9173,'46.91','88.51'), - -('S24_3191','1969 Chevrolet Camaro Z28','Classic Cars','1:24','Exoto Designs','1969 Z/28 Chevy Camaro 1:24 scale replica. The operating parts of this limited edition 1:24 scale diecast model car 1969 Chevy Camaro Z28- hood, trunk, wheels, streering, suspension and doors- are particularly delicate due to their precise scale and require special care and attention.',4695,'50.51','85.61'), - -('S24_3371','1971 Alpine Renault 1600s','Classic Cars','1:24','Welly Diecast Productions','This 1971 Alpine Renault 1600s replica Features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',7995,'38.58','61.23'), - -('S24_3420','1937 Horch 930V Limousine','Vintage Cars','1:24','Autoart Studio Design','Features opening hood, opening doors, opening trunk, wide white wall tires, front door arm rests, working steering system',2902,'26.30','65.75'), - -('S24_3432','2002 Chevy Corvette','Classic Cars','1:24','Gearbox Collectibles','The operating parts of this limited edition Diecast 2002 Chevy Corvette 50th Anniversary Pace car Limited Edition are particularly delicate due to their precise scale and require special care and attention. Features rotating wheels, poseable streering, opening doors and trunk.',9446,'62.11','107.08'), - -('S24_3816','1940 Ford Delivery Sedan','Vintage Cars','1:24','Carousel DieCast Legends','Chrome Trim, Chrome Grille, Opening Hood, Opening Doors, Opening Trunk, Detailed Engine, Working Steering System. 
Color black.',6621,'48.64','83.86'), - -('S24_3856','1956 Porsche 356A Coupe','Classic Cars','1:18','Classic Metal Creations','Features include: Turnable front wheels; steering function; detailed interior; detailed engine; opening hood; opening trunk; opening doors; and detailed chassis.',6600,'98.30','140.43'), - -('S24_3949','Corsair F4U ( Bird Cage)','Planes','1:24','Second Gear Diecast','Has retractable wheels and comes with a stand. Official logos and insignias.',6812,'29.34','68.24'), - -('S24_3969','1936 Mercedes Benz 500k Roadster','Vintage Cars','1:24','Red Start Diecast','This model features grille-mounted chrome horn, lift-up louvered hood, fold-down rumble seat, working steering system and rubber wheels. Color black.',2081,'21.75','41.03'), - -('S24_4048','1992 Porsche Cayenne Turbo Silver','Classic Cars','1:24','Exoto Designs','This replica features opening doors, superb detail and craftsmanship, working steering system, opening forward compartment, opening rear trunk with removable spare, 4 wheel independent spring suspension as well as factory baked enamel finish.',6582,'69.78','118.28'), - -('S24_4258','1936 Chrysler Airflow','Vintage Cars','1:24','Second Gear Diecast','Features opening trunk, working steering system. Color dark green.',4710,'57.46','97.39'), - -('S24_4278','1900s Vintage Tri-Plane','Planes','1:24','Unimax Art Galleries','Hand crafted diecast-like metal Triplane is Re-created in about 1:24 scale of antique pioneer airplane. This antique style metal triplane is all hand-assembled with many different parts.',2756,'36.23','72.45'), - -('S24_4620','1961 Chevrolet Impala','Classic Cars','1:18','Classic Metal Creations','This 1:18 scale precision die-cast reproduction of the 1961 Chevrolet Impala has all the features-doors, hood and trunk that open; detailed 409 cubic-inch engine; chrome dashboard and stick shift, two-tone interior; working steering system; all topped of with a factory baked-enamel finish.',7869,'32.33','80.84'), - -('S32_1268','1980’s GM Manhattan Express','Trucks and Buses','1:32','Motor City Art Classics','This 1980’s era new look Manhattan express is still active, running from the Bronx to mid-town Manhattan. Has 35 opeining windows and working lights. Needs a battery.',5099,'53.93','96.31'), - -('S32_1374','1997 BMW F650 ST','Motorcycles','1:32','Exoto Designs','Features official die-struck logos and baked enamel finish. Comes with stand.',178,'66.92','99.89'), - -('S32_2206','1982 Ducati 996 R','Motorcycles','1:32','Gearbox Collectibles','Features rotating wheels , working kick stand. Comes with stand.',9241,'24.14','40.23'), - -('S32_2509','1954 Greyhound Scenicruiser','Trucks and Buses','1:32','Classic Metal Creations','Model features bi-level seating, 50 windows, skylights & glare resistant glass, working steering system, original logos',2874,'25.98','54.11'), - -('S32_3207','1950\'s Chicago Surface Lines Streetcar','Trains','1:32','Gearbox Collectibles','This streetcar is a joy to see. 
It has 80 separate windows, electric wire guides, detailed interiors with seats, poles and drivers controls, rolling and turning wheel assemblies, plus authentic factory baked-enamel finishes (Green Hornet for Chicago and Cream and Crimson for Boston).',8601,'26.72','62.14'),
-
-('S32_3522','1996 Peterbilt 379 Stake Bed with Outrigger','Trucks and Buses','1:32','Red Start Diecast','This model features, opening doors, detailed engine, working steering, tinted windows, detailed interior, die-struck logos, removable stakes operating outriggers, detachable second trailer, functioning 360-degree self loader, precision molded resin trailer and trim, baked enamel finish on cab',814,'33.61','64.64'),
-
-('S32_4289','1928 Ford Phaeton Deluxe','Vintage Cars','1:32','Highway 66 Mini Classics','This model features grille-mounted chrome horn, lift-up louvered hood, fold-down rumble seat, working steering system',136,'33.02','68.79'),
-
-('S32_4485','1974 Ducati 350 Mk3 Desmo','Motorcycles','1:32','Second Gear Diecast','This model features two-tone paint with chrome accents, superior die-cast detail , rotating wheels , working kick stand',3341,'56.13','102.05'),
-
-('S50_1341','1930 Buick Marquette Phaeton','Vintage Cars','1:50','Studio M Art Models','Features opening trunk, working steering system',7062,'27.06','43.64'),
-
-('S50_1392','Diamond T620 Semi-Skirted Tanker','Trucks and Buses','1:50','Highway 66 Mini Classics','This limited edition model is licensed and perfectly scaled for Lionel Trains. The Diamond T620 has been produced in solid precision diecast and painted with a fire baked enamel finish. It comes with a removable tanker and is a perfect model to add authenticity to your static train or car layout or to just have on display.',1016,'68.29','115.75'),
-
-('S50_1514','1962 City of Detroit Streetcar','Trains','1:50','Classic Metal Creations','This streetcar is a joy to see. It has 99 separate windows, electric wire guides, detailed interiors with seats, poles and drivers controls, rolling and turning wheel assemblies, plus authentic factory baked-enamel finishes (Green Hornet for Chicago and Cream and Crimson for Boston).',1645,'37.49','58.58'),
-
-('S50_4713','2002 Yamaha YZR M1','Motorcycles','1:50','Autoart Studio Design','Features rotating wheels , working kick stand. Comes with stand.',600,'34.17','81.36'),
-
-('S700_1138','The Schooner Bluenose','Ships','1:700','Autoart Studio Design','All wood with canvas sails. Measures 31 1/2 inches in Length, 22 inches High and 4 3/4 inches Wide. Many extras.\r\nThe schooner Bluenose was built in Nova Scotia in 1921 to fish the rough waters off the coast of Newfoundland. Because of the Bluenose racing prowess she became the pride of all Canadians. Still featured on stamps and the Canadian dime, the Bluenose was lost off Haiti in 1946.',1897,'34.00','66.67'),
-
-('S700_1691','American Airlines: B767-300','Planes','1:700','Min Lin Diecast','Exact replia with official logos and insignias and retractable wheels',5841,'51.15','91.34'),
-
-('S700_1938','The Mayflower','Ships','1:700','Studio M Art Models','Measures 31 1/2 inches Long x 25 1/2 inches High x 10 5/8 inches Wide\r\nAll wood with canvas sail. Extras include long boats, rigging, ladders, railing, anchors, side cannons, hand painted, etc.',737,'43.30','86.61'),
-
-('S700_2047','HMS Bounty','Ships','1:700','Unimax Art Galleries','Measures 30 inches Long x 27 1/2 inches High x 4 3/4 inches Wide. \r\nMany extras including rigging, long boats, pilot house, anchors, etc. Comes with three masts, all square-rigged.',3501,'39.83','90.52'),
-
-('S700_2466','America West Airlines B757-200','Planes','1:700','Motor City Art Classics','Official logos and insignias. Working steering system. Rotating jet engines',9653,'68.80','99.72'),
-
-('S700_2610','The USS Constitution Ship','Ships','1:700','Red Start Diecast','All wood with canvas sails. Measures 31 1/2\" Length x 22 3/8\" High x 8 1/4\" Width. Extras include 4 boats on deck, sea sprite on bow, anchors, copper railing, pilot houses, etc.',7083,'33.97','72.28'),
-
-('S700_2824','1982 Camaro Z28','Classic Cars','1:18','Carousel DieCast Legends','Features include opening and closing doors. Color: White. \r\nMeasures approximately 9 1/2\" Long.',6934,'46.53','101.15'),
-
-('S700_2834','ATA: B757-300','Planes','1:700','Highway 66 Mini Classics','Exact replia with official logos and insignias and retractable wheels',7106,'59.33','118.65'),
-
-('S700_3167','F/A 18 Hornet 1/72','Planes','1:72','Motor City Art Classics','10\" Wingspan with retractable landing gears.Comes with pilot',551,'54.40','80.00'),
-
-('S700_3505','The Titanic','Ships','1:700','Carousel DieCast Legends','Completed model measures 19 1/2 inches long, 9 inches high, 3inches wide and is in barn red/black. All wood and metal.',1956,'51.09','100.17'),
-
-('S700_3962','The Queen Mary','Ships','1:700','Welly Diecast Productions','Exact replica. Wood and Metal. Many extras including rigging, long boats, pilot house, anchors, etc. Comes with three masts, all square-rigged.',5088,'53.63','99.31'),
-
-('S700_4002','American Airlines: MD-11S','Planes','1:700','Second Gear Diecast','Polished finish. Exact replia with official logos and insignias and retractable wheels',8820,'36.27','74.03'),
-
-('S72_1253','Boeing X-32A JSF','Planes','1:72','Motor City Art Classics','10\" Wingspan with retractable landing gears.Comes with pilot',4857,'32.77','49.66'),
-
-('S72_3212','Pont Yacht','Ships','1:72','Unimax Art Galleries','Measures 38 inches Long x 33 3/4 inches High. Includes a stand.\r\nMany extras including rigging, long boats, pilot house, anchors, etc. Comes with 2 masts, all square-rigged',414,'33.30','54.60');
-
-/*!40101 SET SQL_MODE=@OLD_SQL_MODE */;
-/*!40014 SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS */;
-/*!40014 SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS */;
-/*!40111 SET SQL_NOTES=@OLD_SQL_NOTES */;
-
diff --git a/database-files/ngo_db.sql b/database-files/ngo_db.sql
deleted file mode 100644
index 526ba0070c..0000000000
--- a/database-files/ngo_db.sql
+++ /dev/null
@@ -1,63 +0,0 @@
-DROP DATABASE IF EXISTS ngo_database;
-CREATE DATABASE IF NOT EXISTS ngo_database;
-
-USE ngo_database;
-
-
-CREATE TABLE IF NOT EXISTS WorldNGOs (
-    NGO_ID INT AUTO_INCREMENT PRIMARY KEY,
-    Name VARCHAR(255) NOT NULL,
-    Country VARCHAR(100) NOT NULL,
-    Founding_Year INTEGER,
-    Focus_Area VARCHAR(100),
-    Website VARCHAR(255)
-);
-
-CREATE TABLE IF NOT EXISTS Projects (
-    Project_ID INT AUTO_INCREMENT PRIMARY KEY,
-    Project_Name VARCHAR(255) NOT NULL,
-    Focus_Area VARCHAR(100),
-    Budget DECIMAL(15, 2),
-    NGO_ID INT,
-    Start_Date DATE,
-    End_Date DATE,
-    FOREIGN KEY (NGO_ID) REFERENCES WorldNGOs(NGO_ID)
-);
-
-CREATE TABLE IF NOT EXISTS Donors (
-    Donor_ID INT AUTO_INCREMENT PRIMARY KEY,
-    Donor_Name VARCHAR(255) NOT NULL,
-    Donor_Type ENUM('Individual', 'Organization') NOT NULL,
-    Donation_Amount DECIMAL(15, 2),
-    NGO_ID INT,
-    FOREIGN KEY (NGO_ID) REFERENCES WorldNGOs(NGO_ID)
-);
-
-INSERT INTO WorldNGOs (Name, Country, Founding_Year, Focus_Area, Website)
-VALUES
-('World Wildlife Fund', 'United States', 1961, 'Environmental Conservation', 'https://www.worldwildlife.org'),
-('Doctors Without Borders', 'France', 1971, 'Medical Relief', 'https://www.msf.org'),
-('Oxfam International', 'United Kingdom', 1995, 'Poverty and Inequality', 'https://www.oxfam.org'),
-('Amnesty International', 'United Kingdom', 1961, 'Human Rights', 'https://www.amnesty.org'),
-('Save the Children', 'United States', 1919, 'Child Welfare', 'https://www.savethechildren.org'),
-('Greenpeace', 'Netherlands', 1971, 'Environmental Protection', 'https://www.greenpeace.org'),
-('International Red Cross', 'Switzerland', 1863, 'Humanitarian Aid', 'https://www.icrc.org'),
-('CARE International', 'Switzerland', 1945, 'Global Poverty', 'https://www.care-international.org'),
-('Habitat for Humanity', 'United States', 1976, 'Affordable Housing', 'https://www.habitat.org'),
-('Plan International', 'United Kingdom', 1937, 'Child Rights', 'https://plan-international.org');
-
-INSERT INTO Projects (Project_Name, Focus_Area, Budget, NGO_ID, Start_Date, End_Date)
-VALUES
-('Save the Amazon', 'Environmental Conservation', 5000000.00, 1, '2022-01-01', '2024-12-31'),
-('Emergency Medical Aid in Syria', 'Medical Relief', 3000000.00, 2, '2023-03-01', '2023-12-31'),
-('Education for All', 'Poverty and Inequality', 2000000.00, 3, '2021-06-01', '2025-05-31'),
-('Human Rights Advocacy in Asia', 'Human Rights', 1500000.00, 4, '2022-09-01', '2023-08-31'),
-('Child Nutrition Program', 'Child Welfare', 2500000.00, 5, '2022-01-01', '2024-01-01');
-
-INSERT INTO Donors (Donor_Name, Donor_Type, Donation_Amount, NGO_ID)
-VALUES
-('Bill & Melinda Gates Foundation', 'Organization', 10000000.00, 1),
-('Elon Musk', 'Individual', 5000000.00, 2),
-('Google.org', 'Organization', 2000000.00, 3),
-('Open Society Foundations', 'Organization', 3000000.00, 4),
-('Anonymous Philanthropist', 'Individual', 1000000.00, 5);
\ No newline at end of file
diff --git a/docker-compose-testing.yaml b/docker-compose-testing.yaml
deleted file mode 100644
index 5b7ce4d694..0000000000
--- a/docker-compose-testing.yaml
+++ /dev/null
@@ -1,28 +0,0 @@
-name: project-app-testing
-services:
-  app-test:
-    build: ./app
-    container_name: web-app-test
-    hostname: web-app
-    volumes: ["./app/src:/appcode"]
-    ports:
-      - 8502:8501
-
-  api-test:
-    build: ./api
-    container_name: web-api-test
-    hostname: web-api
-    volumes: ["./api:/apicode"]
-    ports:
-      - 4001:4000
-
-  db-test:
-    env_file:
-      - ./api/.env
-    image: mysql:9
-    container_name: mysql-db-test
-    hostname: db
-    volumes:
-      - ./database-files:/docker-entrypoint-initdb.d/:ro
-    ports:
-      - 3201:3306
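For context on the deletions above: the `db-test` service mounted `./database-files` read-only into `/docker-entrypoint-initdb.d`, so the MySQL container executed every `.sql` file there (including the sample `ngo_db.sql`) on first start. The sketch below is illustrative only and not part of the diff; it assumes the now-removed `ngo_database` sample schema and simply shows the foreign-key relationship its `Projects` and `WorldNGOs` tables encoded via `NGO_ID`.

```sql
-- Illustrative only: a query against the removed ngo_database sample schema,
-- assuming it was loaded from /docker-entrypoint-initdb.d on container startup.
-- Totals project budgets per NGO by joining Projects to WorldNGOs on NGO_ID.
USE ngo_database;

SELECT w.Name        AS ngo_name,
       w.Focus_Area  AS focus_area,
       SUM(p.Budget) AS total_project_budget
FROM WorldNGOs w
JOIN Projects p ON p.NGO_ID = w.NGO_ID
GROUP BY w.NGO_ID, w.Name, w.Focus_Area
ORDER BY total_project_budget DESC;
```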