Description
Summary
Called out in our Slack channel, but Greenwood should definitely have some support for sitemaps. A sitemap is an XML file used to tell search engines about the content and pages contained within a site, particularly useful for larger sites and / or sites where the links between pages are not as consistent.
https://developers.google.com/search/docs/crawling-indexing/sitemaps/overview
A sitemap tells search engines which pages and files you think are important in your site, and also provides valuable information about these files. For example, when the page was last updated and any alternate language versions of the page.
Here is a basic example:
https://developers.google.com/search/docs/crawling-indexing/sitemaps/build-sitemap
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/foo.html</loc>
    <lastmod>2022-06-04</lastmod>
  </url>
</urlset>
```
Details
I think the approach used in Next.js is probably good enough for Greenwood, supporting either of these options:
- ✅ Static File, e.g. sitemap.xml - will be copied automatically to the output
- Dynamic File, e.g. sitemap.xml.js - will be provided a copy of the Greenwood graph and be expected to return valid XML
```js
export async function sitemap(compilation) {
  // build a <url> entry for each page in the graph;
  // join to avoid the commas Array#toString would insert
  const urls = compilation.graph.map((page) => {
    return `
      <url>
        <loc>http://www.example.com${page.route}</loc>
      </url>
    `;
  }).join('');

  return `<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      ${urls}
    </urlset>
  `;
}
```
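As a rough illustration of how the dynamic option could work end to end, here is a self-contained sketch that exercises such a module against a mock graph. The `compilation` shape (`{ graph: [{ route }] }`) is an assumption inferred from the snippet above, not Greenwood's actual internal API:

```javascript
// Hypothetical sketch: invoking a user-provided sitemap.xml.js module at
// build time. The compilation shape is an assumption, not Greenwood's API.
async function sitemap(compilation) {
  const urls = compilation.graph
    .map((page) => `  <url>\n    <loc>https://www.example.com${page.route}</loc>\n  </url>`)
    .join('\n');

  return `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`;
}

// mock compilation with a couple of pages from the graph
const compilation = { graph: [{ route: '/' }, { route: '/about/' }] };

const xml = await sitemap(compilation);
console.log(xml);
```

The resulting string would then be written out as `sitemap.xml` in the build output, alongside the rest of the generated site.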
Might want to wait until after #955 is merged, since we might want to piggyback on any solutions there re: extending the ability for pages to be more than just markdown (.md) or JavaScript (.js).