# Blog Crossposting Automation

Are you a blog writer? Hate cross-posting your content across the web? You're in luck!

This solution hooks into your blog creation process and automatically cross-posts your content to Medium, Dev.to, and Hashnode!

Deploy it into your AWS account and type away!

For a full summary of this solution, [please refer to this blog post](https://www.readysetcloud.io/blog/allen.helton/how-i-built-a-serverless-automation-to-cross-post-my-blogs/) by [Allen Helton](https://twitter.com/allenheltondev).

## Prerequisites

For cross-posts to work successfully, a few prerequisites must be met in your setup:

* Your blog post is written in [markdown](https://en.wikipedia.org/wiki/Markdown).
* Content is checked into a GitHub repository.
* You have an application in [AWS Amplify](https://aws.amazon.com/amplify/) with a runnable CI pipeline.
* Blog posts have front matter in the format outlined in the [Blog Metadata](#blog-metadata) section.
## How It Works

The cross-posting process is outlined below.

1. A completed blog post written in markdown is committed to the main branch.
2. The AWS Amplify CI pipeline picks up the changes and runs a build.
3. On success, Amplify publishes an `Amplify Deployment Status Change` event to EventBridge, triggering a Lambda function deployed in this stack.
4. The function uses your GitHub PAT to identify and load the blog post content and pass it into a Step Functions workflow.
5. The workflow performs an idempotency check and, if it is ok to continue, transforms and publishes the content to Medium, Hashnode, and Dev.to in parallel.
6. After publishing is complete, the workflow checks for failures.
   * If there was a failure, it sends an email with a link to the execution for debugging.
   * On success, it sends an email with links to the published content and updates the idempotency record and article catalog.

*Note - If you do not provide a SendGrid API key, you will not receive email status updates.*
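The trigger in step 3 corresponds to an EventBridge rule on the Amplify build event. As a rough sketch of what the SAM event definition might look like (the resource name, `detail` property names, and `SUCCEED` status value here are assumptions, not copied from this project's template):

```yaml
# Hypothetical SAM sketch: invoke the trigger function when the
# configured Amplify app finishes a successful deployment
TriggerCrosspostFunction:
  Type: AWS::Serverless::Function
  Properties:
    Events:
      AmplifyBuildSucceeded:
        Type: EventBridgeRule
        Properties:
          Pattern:
            source:
              - aws.amplify
            detail-type:
              - Amplify Deployment Status Change
            detail:
              appId:
                - !Ref AmplifyProjectId
              jobStatus:
                - SUCCEED
```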
## Platforms

This solution will take content you create and automatically cross-post it on three platforms:

* [Medium](https://medium.com)
* [Dev.to](https://dev.to)
* [Hashnode](https://hashnode.com)
## Deployment

The solution is built using AWS SAM. To deploy the resources into the cloud, you must install the [SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/install-sam-cli.html).

Once installed, run the following commands in the root folder of the solution:

```bash
sam build --parallel
sam deploy --guided
```

This will walk you through deployment, prompting you for all the parameters necessary for proper use. Below are the parameters you must fill out on deploy.
|Parameter|Description|Required|
|---------|-----------|--------|
|TableName|Name of the DynamoDB table to create|No|
|GSI1|Name of the GSI on the DDB table|No|
|GitHubPAT|Personal Access Token used to load blog content from your repository|Yes|
|GitHubOwner|The GitHub user name that owns the repository for your content|Yes|
|GitHubRepo|The repository name that contains your content|Yes|
|AmplifyProjectId|Identifier of the Amplify project that builds your content|Yes|
|MediumApiKey|API key used to manipulate data in your Medium account|Yes|
|MediumPublicationId|Identifier of the publication you wish to submit to on Medium|No|
|MediumAuthorId|Identifier of your user on Medium|Yes if `MediumPublicationId` is not provided|
|DevApiKey|API key used to manipulate data in your Dev.to account|Yes|
|DevOrganizationId|Identifier of the organization you wish to submit to on Dev.to|No|
|HashnodeApiKey|API key used to manipulate data in your Hashnode account|Yes|
|HashnodePublicationId|Identifier for your blog publication on Hashnode|Yes|
|HashnodeBlogUrl|Base URL of your blog hosted on Hashnode|Yes|
|BlogBaseUrl|Base URL of your blog on your personal site|Yes|
|BlogContentPath|Relative path from the root directory to the blog content folder in your GitHub repo|Yes|
|SendgridApiKey|API key of the SendGrid account that will send the status report when cross-posting is complete|No|
|NotificationEmail|Email address to notify when cross-posting is complete|No|
|SendgridFromEmail|Email address for SendGrid that sends you the status email|No|
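After the first guided deploy, SAM records your answers in `samconfig.toml` so later deploys can skip the prompts. A sketch of what that file might look like (the stack name and parameter values below are placeholders; since the real file can contain API keys, keep it out of version control):

```toml
version = 0.1

[default.deploy.parameters]
stack_name = "blog-crossposting-automation"
resolve_s3 = true
capabilities = "CAPABILITY_IAM"
parameter_overrides = "GitHubOwner=\"your-user\" GitHubRepo=\"your-blog-repo\" BlogContentPath=\"content/blog\" BlogBaseUrl=\"https://yourblog.example.com\""
```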
## Notification Emails

If you wish to get notification emails on the status of the cross-posting, you must use [SendGrid](https://sendgrid.com). SendGrid offers a generous free tier for email messages and is quick to get started with. To configure SendGrid to send you emails, you must:

* [Create an API key](https://docs.sendgrid.com/ui/account-and-settings/api-keys)
* [Create a sender](https://docs.sendgrid.com/ui/sending-email/senders)

Once you perform the above actions, you may use the values in the respective deployment parameters listed above.
## Replay

In the event cross-posting does not work, it can be safely retried without worrying about pushing your content multiple times. Each run updates the idempotency DynamoDB record for the cross-posting state machine. This record holds the status (*success/failure*) for each platform. If the article was successfully posted on a platform, that platform is skipped on subsequent executions.
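As an illustration only (the field names below are hypothetical; check the state machine definition for the actual schema), the idempotency record conceptually tracks something like:

```json
{
  "pk": "<post identifier>",
  "sk": "idempotency",
  "status": {
    "medium": "success",
    "devTo": "failed",
    "hashnode": "success"
  }
}
```

On a replay of the record above, only the Dev.to publish would be attempted again.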
## Blog Metadata

Your blog post must be written in markdown for this solution to work. To save metadata about your post, you can add [front matter](https://gohugo.io/content-management/front-matter/) at the beginning of the file. This solution requires a specific set of metadata in order to function appropriately.

**Example**
```yaml
---
title: My first blog!
description: This is the subtitle that is used for SEO and visible in Medium and Hashnode posts.
image: https://link-to-hero-image.png
image_attribution: Any attribution required for hero image
categories:
  - categoryOne
tags:
  - serverless
  - other tag
slug: /my-first-blog
---
```
|Field|Description|Required?|
|-----|-----------|---------|
|title|Title of the blog post|Yes|
|description|Brief summary of the article. This shows up on Hashnode and Medium and is used in SEO previews|Yes|
|image|Link to the hero image for your article|Yes|
|image_attribution|Any attribution text needed for your hero image|No|
|categories|Array of categories. Used as tags for Dev.to and Medium|No|
|tags|Array of tags. Also used as tags for Dev.to and Medium|No|
|slug|Relative URL of your post. Used in the article catalog|Yes|
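To illustrate how front matter like the example above can be read, here is a minimal parser sketch (not the project's actual implementation, which may differ):

```javascript
// Minimal front matter parser sketch: pulls the block between the leading
// '---' markers and reads simple "key: value" pairs plus "- item" lists.
const parseFrontMatter = (markdown) => {
  const match = markdown.match(/^---\n([\s\S]*?)\n---/);
  if (!match) return {};

  const metadata = {};
  let currentKey;
  for (const line of match[1].split('\n')) {
    const listItem = line.match(/^\s*-\s+(.*)$/);
    if (listItem && currentKey) {
      // Indented "- item" line belongs to the previous key's array
      metadata[currentKey].push(listItem[1]);
    } else {
      const pair = line.match(/^(\w+):\s*(.*)$/);
      if (pair) {
        currentKey = pair[1];
        // A key with no inline value starts a list (e.g. "tags:")
        metadata[currentKey] = pair[2] === '' ? [] : pair[2];
      }
    }
  }
  return metadata;
};

const post = `---
title: My first blog!
slug: /my-first-blog
tags:
  - serverless
---

Post body here.`;

console.log(parseFrontMatter(post).title); // My first blog!
```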
## Article Catalog

One of the neat features of this solution is substituting relative URLs with the appropriate URLs for a given platform. For example, if you use a relative URL to link to another blog post on your site, the solution will replace it with the cross-posted version: Medium articles will always point to Medium articles, Hashnode articles to Hashnode, etc.

This is managed for you by the solution. It creates entries for your content in DynamoDB with the following format:

```json
{
  "pk": "<article slug>",
  "sk": "article",
  "GSI1PK": "article",
  "GSI1SK": "<title of the post>",
  "links": {
    "url": "<article slug>",
    "devUrl": "<full path to article on dev.to>",
    "mediumUrl": "<full path to article on Medium>",
    "hashnodeUrl": "<full path to article on Hashnode>"
  }
}
```
When transforming your markdown content, the solution loads all articles from DynamoDB, uses a regex to match article slugs in your content, and replaces each with the URL for the appropriate site.
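A simplified sketch of that substitution, using the catalog format above (illustrative only; the solution's actual implementation may differ):

```javascript
// Sketch of the catalog-based link rewrite. 'catalog' mirrors the DynamoDB
// items: the relative slug plus a per-platform URL map.
const replaceLinks = (markdown, catalog, platformUrlKey) => {
  let output = markdown;
  for (const article of catalog) {
    const target = article.links[platformUrlKey];
    if (!target) continue;
    // Rewrite markdown links that point at the relative slug, e.g. (/my-first-blog)
    output = output.split(`(${article.links.url})`).join(`(${target})`);
  }
  return output;
};

const catalog = [
  {
    pk: '/my-first-blog',
    sk: 'article',
    links: {
      url: '/my-first-blog',
      devUrl: 'https://dev.to/me/my-first-blog',
      mediumUrl: 'https://medium.com/@me/my-first-blog',
      hashnodeUrl: 'https://me.hashnode.dev/my-first-blog'
    }
  }
];

const post = 'As I wrote in [my first blog](/my-first-blog), serverless is fun.';
console.log(replaceLinks(post, catalog, 'mediumUrl'));
// As I wrote in [my first blog](https://medium.com/@me/my-first-blog), serverless is fun.
```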
If you already have a number of articles and wish to seed the database with the cross-references, you will have to compile the data manually and put it in the following format:
```json
[
  {
    "title": "<title of article>",
    "devUrl": "<url of article on dev.to>",
    "url": "<relative url of article on your blog>",
    "mediumUrl": "<url of article on medium>",
    "hashnodeUrl": "<url of article on hashnode>"
  }
]
```

Take this data and update the [load-cross-posts](/functions/load-cross-posts/index.js) function to load and handle it. Run the function manually to seed the data in your database table.
## Embeds

If you embed content in your posts, the embeds might not work out of the box. *Only Hugo Twitter embeds are supported.* The format of a Hugo Twitter embed is:

```
{{<tweet user="" id="">}}
```

If you include this in your content, it will be automatically transformed to the appropriate embed style on each platform.
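As an illustration of that transformation (a sketch, not the project's code; the Dev.to `{% embed %}` liquid-tag output shown here is one plausible target format, and each platform would need its own):

```javascript
// Sketch: rewrite a Hugo tweet shortcode into a Dev.to-style embed tag
const tweetShortcode = /{{<\s*tweet\s+user="([^"]*)"\s+id="([^"]*)"\s*>}}/g;

const toDevEmbed = (markdown) =>
  markdown.replace(tweetShortcode, (_, user, id) =>
    `{% embed https://twitter.com/${user}/status/${id} %}`);

const post = 'Check this out:\n{{<tweet user="allenheltondev" id="123456789">}}';
console.log(toDevEmbed(post));
// Check this out:
// {% embed https://twitter.com/allenheltondev/status/123456789 %}
```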
## Limitations

Below is a list of known limitations:

* Your content must be written in markdown with front matter describing the blog post.
* Content must be hosted in GitHub.
* You are required to post to Dev.to, Medium, and Hashnode. You cannot pick and choose which platforms you want to use.
* Only Hugo-style Twitter embeds are supported. Embeds for other content will not work.
* The process is triggered by a successful build of an AWS Amplify project. Other triggers are not supported out of the box (but can easily be added).
* Notifications are limited to emails sent via SendGrid.
* The only supported way to deploy the solution is with AWS SAM.
## Contributions

Please feel free to contribute to this project! Bonus points if you can meaningfully address any of the limitations listed above :)

This is an AWS Community Builders project and is meant to help the community. If you see fit, please donate some time to making it better!
const { Octokit } = require('octokit');
const { SFNClient, StartExecutionCommand } = require('@aws-sdk/client-sfn');
const shared = require('/opt/nodejs/index');

const sfn = new SFNClient();
let octokit;

exports.handler = async (event) => {
  try {
    await initializeOctokit();

    // Find commits flagged as new content, gather the added markdown files,
    // and kick off the cross-posting state machine for each one
    const recentCommits = await getRecentCommits();
    if (recentCommits.length) {
      const newContent = await getNewContent(recentCommits);
      if (newContent.length) {
        const data = await getContentData(newContent);
        await processNewContent(data);
      }
    }
  } catch (err) {
    console.error(err);
  }
};

// Lazily create the Octokit client using the GitHub PAT stored as a secret
const initializeOctokit = async () => {
  if (!octokit) {
    const gitHubSecret = await shared.getSecret('github');
    octokit = new Octokit({ auth: gitHubSecret });
  }
};

// List commits to the content path within the configured time window and keep
// only the ones whose message marks them as new content
const getRecentCommits = async () => {
  const timeTolerance = Number(process.env.COMMIT_TIME_TOLERANCE_MINUTES);
  const date = new Date();
  date.setMinutes(date.getMinutes() - timeTolerance);

  const result = await octokit.rest.repos.listCommits({
    owner: process.env.OWNER,
    repo: process.env.REPO,
    path: process.env.PATH,
    since: date.toISOString()
  });

  const newPostCommits = result.data.filter(c => c.commit.message.toLowerCase().startsWith(process.env.NEW_CONTENT_INDICATOR));
  return newPostCommits.map(d => d.sha);
};

// For each commit, collect the files that were added under the content path
const getNewContent = async (commits) => {
  const newContent = await Promise.allSettled(commits.map(async (commit) => {
    const commitDetail = await octokit.rest.repos.getCommit({
      owner: process.env.OWNER,
      repo: process.env.REPO,
      ref: commit
    });

    const newFiles = commitDetail.data.files.filter(f => f.status === 'added' && f.filename.startsWith(`${process.env.PATH}/`));
    return newFiles.map(p => {
      return {
        fileName: p.filename,
        commit: commit
      };
    });
  }));

  let content = [];
  for (const result of newContent) {
    if (result.status === 'rejected') {
      console.error(result.reason);
    } else {
      content = [...content, ...result.value];
    }
  }

  return content;
};

// Download each new file and decode its base64 payload into a utf8 string
const getContentData = async (newContent) => {
  const contentData = await Promise.allSettled(newContent.map(async (content) => {
    const postContent = await octokit.request('GET /repos/{owner}/{repo}/contents/{path}', {
      owner: process.env.OWNER,
      repo: process.env.REPO,
      path: content.fileName
    });

    const buffer = Buffer.from(postContent.data.content, 'base64');
    const data = buffer.toString('utf8');

    return {
      fileName: content.fileName,
      commit: content.commit,
      content: data,
      sendStatusEmail: process.env.SEND_STATUS_EMAIL === 'true'
    };
  }));

  let allContent = [];
  for (const result of contentData) {
    if (result.status === 'rejected') {
      console.error(result.reason);
    } else {
      allContent.push(result.value);
    }
  }

  return allContent;
};

// Start a cross-posting state machine execution for each piece of new content
const processNewContent = async (newContent) => {
  const executions = await Promise.allSettled(newContent.map(async (content) => {
    const command = new StartExecutionCommand({
      stateMachineArn: process.env.STATE_MACHINE_ARN,
      input: JSON.stringify(content)
    });
    await sfn.send(command);
  }));

  for (const execution of executions) {
    if (execution.status === 'rejected') {
      console.error(execution.reason);
    }
  }
};