
Commit d8db5d6 (1 parent: 040d5db)

chore: configure minimal sitemap.xml
3 files changed (+33 -5 lines)

config/sitemap.rb

Lines changed: 32 additions & 0 deletions

@@ -0,0 +1,32 @@
+# This file is generated with `rake sitemap:install`
+# To update `public/sitemap.xml.gz` run `rake sitemap:refresh`.
+#
+# See https://github.com/kjvarga/sitemap_generator?tab=readme-ov-file#rake-tasks
+
+# Set the host name for URL creation
+SitemapGenerator::Sitemap.default_host = 'https://codebar.io'
+
+SitemapGenerator::Sitemap.create do
+  # Put links creation logic here.
+  #
+  # The root path '/' and sitemap index file are added automatically for you.
+  # Links are added to the Sitemap in the order they are specified.
+  #
+  # Usage: add(path, options={})
+  #        (default options are used if you don't specify)
+  #
+  # Defaults: :priority => 0.5, :changefreq => 'weekly',
+  #           :lastmod => Time.now, :host => default_host
+  #
+  # Examples:
+  #
+  # Add '/articles'
+  #
+  #   add articles_path, :priority => 0.7, :changefreq => 'daily'
+  #
+  # Add all articles:
+  #
+  #   Article.find_each do |article|
+  #     add article_path(article), :lastmod => article.updated_at
+  #   end
+end
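As committed, the create block is the generated template with no links of its own, so the sitemap contains only the root path '/' plus the index file that the gem adds automatically. If per-page entries are wanted later, the block could be filled in along the lines of the template's own examples. The sketch below is illustrative only: the route helper (event_path), the static path ('/events'), and the Event model are hypothetical and may not match codebar's actual routes.

SitemapGenerator::Sitemap.create do
  # Hypothetical static page; :priority and :changefreq override
  # the defaults of 0.5 and 'weekly'.
  add '/events', :priority => 0.7, :changefreq => 'daily'

  # Hypothetical per-record entries; :lastmod lets crawlers skip
  # pages that have not changed since the last crawl.
  Event.find_each do |event|
    add event_path(event), :lastmod => event.updated_at
  end
end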

public/robots.txt

Lines changed: 1 addition & 5 deletions

@@ -1,5 +1 @@
-# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
-#
-# To ban all spiders from the entire site uncomment the next two lines:
-# User-agent: *
-# Disallow: /
+Sitemap: https://codebar.io/sitemap.xml.gz
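With the commented-out ban directives removed, robots.txt now does one job: the standard Sitemap directive points crawlers at the compressed sitemap by absolute URL, and it applies to all user agents. A quick local sanity check after running `rake sitemap:refresh` might look like the following sketch, which uses only the Ruby standard library and the public/sitemap.xml.gz path from this commit:

require 'zlib'

# Decompress the generated archive and confirm it references the expected host.
xml = Zlib::GzipReader.open('public/sitemap.xml.gz', &:read)
puts xml.include?('https://codebar.io') ? 'sitemap references codebar.io' : 'host missing'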

public/sitemap.xml.gz

328 Bytes
Binary file not shown.
