Overview of this project: this is a mock module in which I researched technologies used in practical projects and implemented them.
- Use Java 17 and Spring Boot 3
- Use AWS services:
  - RDS
  - CloudWatch
  - SSM
  - Elastic Beanstalk
  - IAM
  - EC2
- Set up CI/CD to deploy the application:
  - GitHub Actions
  - Terraform
- Redis for caching data.
  - Install and start Redis, then install and start Redis Commander (a web UI for Redis):

    ```shell
    npm install -g redis-commander
    redis-commander
    ```

  - From the command line, access it at localhost (Redis Commander listens on port 8081 by default); you can see this in the picture below:
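  As a rough, dependency-free sketch of the cache-aside pattern that caching with Redis implies (a `HashMap` stands in for the Redis client so the sketch runs without a Redis server; in the real project that role would presumably be played by something like Spring's `RedisTemplate` — an assumption, not taken from the source):

  ```java
  import java.util.HashMap;
  import java.util.Map;
  import java.util.function.Function;

  public class CacheAside {
      // Stand-in for Redis: key -> cached value.
      private final Map<String, String> cache = new HashMap<>();

      // Returns the cached value; on a miss, loads it from the source and caches it.
      public String get(String key, Function<String, String> loader) {
          return cache.computeIfAbsent(key, loader);
      }
  }
  ```

  Repeated lookups for the same key hit the cached copy; only the first call invokes the loader.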
- Separate environments for testing:
  - qa
  - silo-5
  - prod
  - uat
- Use Splunk to monitor logs
- Use Thymeleaf as the template engine for sending mail
- Use Logback and Log4j2.
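A Thymeleaf mail template is a plain HTML file with `th:*` attributes that the service layer renders before sending. A minimal hypothetical example (the file name and variable names are illustrative, not taken from the project):

```html
<!-- src/main/resources/templates/welcome-mail.html (hypothetical path) -->
<html xmlns:th="http://www.thymeleaf.org">
<body>
    <p>Hello, <span th:text="${userName}">user</span>!</p>
    <p th:text="${message}">Message body goes here.</p>
</body>
</html>
```

The mail service would render this with Thymeleaf's `TemplateEngine` and send the resulting HTML through `JavaMailSender`.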
* Import dependencies
Pay close attention to the Java version and the Spring Boot version: importing dependencies may fail with errors if the versions are not compatible.
```xml
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-bootstrap</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-aws-parameter-store-config</artifactId>
    <version>2.2.6.RELEASE</version>
</dependency>
...
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-dependencies</artifactId>
            <version>${spring-cloud.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```
- Set up environment variables in IntelliJ to connect to AWS: Edit Configuration -> Modify options -> Environment Variables.
- Create a parameter store in SSM.
- Customize the name of the parameter store by creating `bootstrap.properties`:

  ```properties
  aws.paramstore.prefix=/ppn
  aws.paramstore.default-context=dev
  aws.paramstore.profile-separator=
  aws.paramstore.enabled=true
  ```
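  With these settings, spring-cloud-aws looks up properties under SSM parameter names of the form `<prefix>/<context><separator><profile>/<key>`. A small illustrative helper (not part of the library) showing that naming convention:

  ```java
  // Illustrative only: mirrors how aws.paramstore.prefix and
  // aws.paramstore.default-context map a property key onto an SSM
  // parameter name such as /ppn/dev/spring.datasource.username.
  public class ParamStorePath {
      static String path(String prefix, String context, String separator, String profile, String key) {
          // Without an active profile, parameters live directly under the context.
          String ctx = (profile == null || profile.isEmpty()) ? context : context + separator + profile;
          return prefix + "/" + ctx + "/" + key;
      }

      public static void main(String[] args) {
          System.out.println(path("/ppn", "dev", "_", null, "spring.datasource.username"));
          // /ppn/dev/spring.datasource.username
      }
  }
  ```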
- Create an RDS instance and configure its username, password, and access rules.
  Note: you have to set up inbound rules for your RDS instance and make it publicly accessible.
  After it has been created successfully, as shown below, we can monitor the status and workload of the database.
- Connect MySQL to RDS: go to AWS RDS, copy the endpoint, and paste it into the hostname field in MySQL Workbench.
- Create `application-prod.properties` and `bootstrap-prod.properties`, then use `spring.profiles.active=name_environment`, or set the active profile in the run configuration in IntelliJ IDEA.

  `application-prod.properties`:

  ```properties
  spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
  spring.datasource.url=${url-db}
  spring.datasource.username=${username}
  spring.datasource.password=${password-db}
  spring.jpa.hibernate.ddl-auto=update
  spring.jpa.show-sql=true
  spring.jpa.properties.hibernate.format_sql=false
  spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL55Dialect
  spring.jpa.properties.hibernate.globally_quoted_identifiers=true
  server.port=5000
  ```

  `bootstrap-prod.properties`:

  ```properties
  aws.paramstore.prefix=/ppn
  aws.paramstore.default-context=prod
  aws.paramstore.profile-separator=
  aws.paramstore.enabled=true
  ```
- Create `maven.yml`; in this file you'll configure the build steps and the connection to AWS:
  ```yaml
  name: ppn
  on:
    push:
      branches: [ "master" ]
    pull_request:
      branches: [ "master" ]
  jobs:
    build:
      runs-on: ubuntu-latest
      steps:
        - uses: actions/checkout@v3
        - name: Set up JDK 17
          uses: actions/setup-java@v3
          with:
            java-version: '17'
            distribution: 'temurin'
            cache: maven
        - name: Build with Maven
          run: mvn clean install
        - name: Prepare S3 upload target
          run: mkdir artifacts && cp target/*.jar artifacts/
        - name: Install AWS CLI
          run: sudo apt-get update && sudo apt-get install -y awscli
        - name: Set up AWS credentials
          run: |
            mkdir -p ~/.aws
            touch ~/.aws/credentials
            echo "[default]
            aws_access_key_id = ${AWS_ACCESS_KEY_ID}
            aws_secret_access_key = ${AWS_SECRET_ACCESS_KEY}" > ~/.aws/credentials
          env:
            AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
            AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
            AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
        - name: Copy files to S3
          run: aws s3 cp artifacts s3://${{ secrets.AWS_S3_BUCKET }}/${GITHUB_SHA::7}/ --recursive --region us-east-1
        # Optional: Uploads the full dependency graph to GitHub to improve the quality of Dependabot alerts this repository can receive
        - name: Update dependency graph
          uses: advanced-security/maven-dependency-submission-action@571e99aab1055c2e71a1e2309b9691de18d6b7d6
  ```
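  The "Copy files to S3" step relies on bash substring expansion to derive a short folder name from the commit SHA:

  ```shell
  # ${GITHUB_SHA::7} expands to the first 7 characters of the 40-character
  # commit SHA, so each build uploads into its own short-SHA folder.
  GITHUB_SHA=0123456789abcdef0123456789abcdef01234567
  echo "${GITHUB_SHA::7}"   # prints 0123456
  ```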
- Create `terraform.yml` to manage AWS; the workflow runs `terraform init`, `terraform validate`, and `terraform apply`:
  ```yaml
  name: 'Deploy PPN with Terraform'
  on:
    push:
      branches: [ "master" ]
    pull_request:
  permissions:
    contents: read
  jobs:
    deploy_terraform:
      name: Deploy with terraform
      runs-on: ubuntu-latest
      steps:
        - uses: actions/checkout@v3
        - uses: hashicorp/setup-terraform@v1
        - name: Set up AWS credentials
          run: |
            mkdir -p ~/.aws
            touch ~/.aws/credentials
            echo "[default]
            aws_access_key_id = ${AWS_ACCESS_KEY_ID}
            aws_secret_access_key = ${AWS_SECRET_ACCESS_KEY}" > ~/.aws/credentials
          env:
            AWS_ACCESS_KEY_ID: ${{ secrets.DEPLOY_AWS_TERRAFORM_ACCESS_KEY_ID }}
            AWS_SECRET_ACCESS_KEY: ${{ secrets.DEPLOY_AWS_TERRAFORM_SECRET_ACCESS_KEY }}
        - name: Initialize terraform
          run: terraform init
        - name: Validate terraform
          run: terraform validate -no-color
        - name: Run terraform apply
          run: terraform apply -auto-approve -no-color
  ```
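  The workflow expects Terraform configuration files at the repository root. A minimal hypothetical `main.tf` it could apply (the resource and region are placeholders, not taken from the project):

  ```hcl
  provider "aws" {
    region = "us-east-1"
  }

  # Placeholder Elastic Beanstalk application; the project's real .tf files
  # would define the actual infrastructure.
  resource "aws_elastic_beanstalk_application" "ppn" {
    name = "ppn"
  }
  ```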
- If the setup succeeds, create a PR to test it:
- Step 1: Go to CloudWatch in AWS and create a log group; each log group will contain many log streams, corresponding to the environments.
- Step 2: Create a `logback.xml` file in the resources folder, as simple as this:
  ```xml
  <configuration>
      <appender name="cloudwatch" class="com.ppn.ppn.utils.CloudWatchAppender">
          <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
              <level>INFO</level>
          </filter>
      </appender>
      <root level="info">
          <appender-ref ref="cloudwatch"/>
      </root>
  </configuration>
  ```
- Step 3: Create a CloudWatchAppender class to handle the logs; you can see this class in my source code.
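  As a rough, dependency-free sketch of what such an appender does (buffer formatted events and ship them in batches; the `LogSender` interface here is a stand-in for the AWS SDK's `PutLogEvents` call, and in real Logback the entry point would be `append(ILoggingEvent)` on an `AppenderBase` subclass):

  ```java
  import java.util.ArrayList;
  import java.util.List;

  public class CloudWatchAppenderSketch {
      /** Stand-in for the AWS SDK call that pushes a batch of log events. */
      public interface LogSender { void send(List<String> batch); }

      private final List<String> buffer = new ArrayList<>();
      private final int batchSize;
      private final LogSender sender;

      public CloudWatchAppenderSketch(int batchSize, LogSender sender) {
          this.batchSize = batchSize;
          this.sender = sender;
      }

      // Called once per log event; flushes when the buffer is full.
      public synchronized void append(String formattedEvent) {
          buffer.add(formattedEvent);
          if (buffer.size() >= batchSize) flush();
      }

      // Sends whatever is buffered, e.g. on shutdown.
      public synchronized void flush() {
          if (!buffer.isEmpty()) {
              sender.send(new ArrayList<>(buffer));
              buffer.clear();
          }
      }
  }
  ```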
- Step 4: Once everything is set up successfully, you can monitor the application's logs in CloudWatch. If you want to look for a specific log, click the Logs Insights tab and query with a filter on the correlationId attached to each request.
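  A hypothetical Logs Insights query for that; it assumes the correlationId is embedded in the message text, so adjust the pattern to your log format:

  ```
  fields @timestamp, @message
  | filter @message like /correlationId=YOUR_ID/
  | sort @timestamp desc
  | limit 20
  ```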