Commit

Merge pull request #20 from gwu-libraries/t17-seo-optimization
T17 SEO optimization: ran the rake tasks (sitemap:generate and sitemap:ping), generated sitemap.xml, and updated robots.txt. Fixes #17
StudioZut authored Oct 31, 2016
2 parents f80b191 + 77886f0 commit a5cea57
Showing 6 changed files with 43 additions and 2 deletions.
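
Per the commit message, the sitemap tasks can also be run by hand. A minimal invocation sketch (these are the sitemap gem's tasks, the same ones the new job below invokes):

bundle exec rake sitemap:generate   # writes public/sitemap.xml from config/sitemap.rb
bundle exec rake sitemap:ping       # notifies search engines about the regenerated sitemap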
3 changes: 3 additions & 0 deletions .gitignore
@@ -15,3 +15,6 @@
 /log/*
 !/log/.keep
 /tmp
+
+# Ignore sitemap
+public/sitemap.xml
3 changes: 3 additions & 0 deletions Gemfile
@@ -23,6 +23,9 @@ gem 'jbuilder', '~> 2.0'
 # bundle exec rake doc:rails generates the API under doc/api.
 gem 'sdoc', '~> 0.4.0', group: :doc
 
+# Use Sitemap to generate sitemap
+gem 'sitemap'
+
 # Use ActiveModel has_secure_password
 # gem 'bcrypt', '~> 3.1.7'
 
6 changes: 6 additions & 0 deletions app/jobs/sitemap_regenerate_job.rb
@@ -0,0 +1,6 @@
class SitemapRegenerateJob < ActiveJob::Base
  def perform
    Rake::Task['sitemap:generate'].invoke
    Rake::Task['sitemap:ping'].invoke
  end
end
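
A caveat that is not part of the commit: if this job runs on an out-of-process worker (e.g. Sidekiq or Resque behind ActiveJob), Rake task definitions are not loaded automatically, and Rake::Task['sitemap:generate'] would raise. A defensive sketch of the same job under that assumption:

require 'rake'

class SitemapRegenerateJob < ActiveJob::Base
  def perform
    # Task definitions may be absent outside a rake process; load them once if so.
    Rails.application.load_tasks if Rake::Task.tasks.empty?
    Rake::Task['sitemap:generate'].invoke
    Rake::Task['sitemap:ping'].invoke
  end
end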
13 changes: 13 additions & 0 deletions config/sitemap.rb
@@ -0,0 +1,13 @@
Sitemap::Generator.instance.load(host: 'example.com') do
  path :root, priority: 1, change_frequency: 'weekly'
  path :search_catalog, priority: 1, change_frequency: 'weekly'
  read_group = Solrizer.solr_name('read_access_group', :symbol)
  Work.where(read_group => 'public').each do |f|
    literal Rails.application.routes.url_helpers.curation_concerns_work_path(f),
            priority: 1, change_frequency: 'weekly'
  end
  Collection.where(read_group => 'public').each do |c|
    literal Rails.application.routes.url_helpers.collection_path(c),
            priority: 1, change_frequency: 'weekly'
  end
end
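
For context: Solrizer.solr_name('read_access_group', :symbol) resolves the term to its Solr field name, so both queries return only publicly readable works and collections, and the gem joins each generated path to the hardcoded host ('example.com'). An illustrative console check; the exact field name depends on the app's Solrizer field mapper and is an assumption, not something this commit shows:

Solrizer.solr_name('read_access_group', :symbol)
# => "read_access_group_ssim"  (conventional default for :symbol indexing)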
11 changes: 11 additions & 0 deletions lib/tasks/gwss.rake
@@ -0,0 +1,11 @@
namespace :gwss do
  # adding a logger since it got removed from our gemset
  def logger
    Rails.logger
  end

  desc "Queues a job to (re)generate the sitemap.xml"
  task "sitemap_queue_generate" => :environment do
    SitemapRegenerateJob.perform_later
  end
end
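
This task enqueues the job rather than regenerating inline, so it returns quickly and can be scheduled. A usage sketch (the weekly cron entry and app path are illustrative, not part of the commit):

# enqueue a sitemap rebuild from the app root
bundle exec rake gwss:sitemap_queue_generate

# e.g. run weekly via cron (illustrative)
0 3 * * 0 cd /path/to/app && bundle exec rake gwss:sitemap_queue_generate RAILS_ENV=production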
9 changes: 7 additions & 2 deletions public/robots.txt
@@ -1,5 +1,10 @@
 # See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
 #
 # To ban all spiders from the entire site uncomment the next two lines:
-# User-agent: *
-# Disallow: /
+User-agent: *
+Disallow: /*?file=thumbnail$
+Sitemap: https://example.com/sitemap.xml
+User-agent: AhrefsBot
+Disallow: /
+User-agent: Pcore-HTTP/v0.23.20
+Disallow: /
