I recently had the opportunity to present to the brilliant minds at the 2018 WordCamp conference in Kansas City. It was an awesome, record crowd, and it was a blast to be around so many WordPress developers, consultants, and marketers.

My presentation was How to Complete an SEO-Friendly Redesign. It addressed the steps you need to take when launching (or relaunching) a website. You can find my full article on the topic at Search Engine Journal or view my WordCamp slides below.

I promised in my presentation to provide this follow-up post linking to some great learning resources for those in the audience who aren't SEOs or who don't go through this process as often as my team at Voltage does.

Here's the rundown, in an FAQ of sorts, on topics and concepts I mentioned in the presentation but didn't have time to cover in detail:

  • Canonicals: the “canonical” is simply the intended final URL for a given page or piece of content. If you have multiple pages with the same or very similar content, you'll want to identify a canonical version and then use the rel=canonical tag in the head section of the canonical page and all other similar pages, pointing to the canonical page as the one you want search engines to index and care about. More info here: https://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
  • On-page elements: this is the literal set of items on the webpage that you can edit. On-page optimization is best done consistently throughout the page. No one element is important by itself; you can't just optimize the title tag, meta description, or any other part in isolation and expect rankings. On-page is best done by thinking about all the elements in concert with each other and ensuring that important terms or phrases are woven throughout.
  • Redirects: properly redirecting old URLs to new site URLs is the most important step; if you can do only one thing when you relaunch, this is it. That means using 301 (permanent) redirects to send every old site URL to the most relevant page on the new website. Here's an article with more information: https://moz.com/learn/seo/redirection
  • Spider, robot, crawler: these are all names for the mechanisms Google, Bing, other search engines, and other entities use to evaluate content on your website. The search engine “spiders” visit your site to “crawl” it at intervals ranging from daily to roughly once every 90 days. When these spiders/robots/crawlers visit, they evaluate your content and use the information gathered to determine how to rank it.
  • Dynamic features: my use of this term refers to any SEO elements of a page that are generated from a database rather than written by hand for each individual instance. Programming that auto-populates URLs, title tags, meta descriptions, headings, and other aspects of page content falls into this category. On a large site, you would likely want to generate these elements dynamically.
  • Page speed: page speed has been important for several years as a “tie-breaker” SEO factor, and it continues to grow in importance for SEO just as it has for user experience. You can measure page speed with tools including Google Search Console and Google Analytics, or with external tools like: https://developers.google.com/speed/.
  • Mobile-friendly: a mobile-friendly site is pretty much a given expectation for visitors in 2018. Don't assume you're covered just because you have a responsive, adaptive, or other mobile experience for your users. Test your site in the Google mobile-friendly testing tool and make sure it passes.
  • XML sitemap: the XML sitemap replaces the old-school method of “submitting” pages to the search engines. If your website or platform can build the XML sitemap dynamically or automatically, you're in good shape. Regardless of whether it is generated automatically or manually (via tools like https://www.xml-sitemaps.com/), you need to have one, as search engines check it each time they visit your site to index pages. They use this file (or set of files) as a new-school table of contents to make sure they don't miss any of your website content.
  • Robots.txt: this is a file in the root of your website that search engines look for when visiting your site, containing instructions on what pages/sections to look at and what to ignore. At a high level, this is where you can specify which pages or directories should be crawled and which should be excluded. You can also test and validate your robots.txt file in Google Search Console for issues or errors.
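To make the canonical tag from the list above concrete, here's a sketch of what the head section of a duplicate page might contain (the page and URL are hypothetical examples):

```html
<head>
  <title>Blue Widgets | Example Store</title>
  <!-- Tells search engines to treat /blue-widgets/ as the one version
       of this content to index. Example URL only. -->
  <link rel="canonical" href="https://www.example.com/blue-widgets/" />
</head>
```

The same tag, with the same href, would go on every similar or duplicate page so they all point at one canonical URL.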
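And here's a rough sketch of on-page elements working in concert, with a target phrase woven through the title tag, meta description, and heading (the business and phrase are made up for illustration):

```html
<head>
  <title>Kansas City Web Design | Example Agency</title>
  <meta name="description"
        content="Example Agency offers web design in Kansas City,
                 from SEO-friendly redesigns to full site builds.">
</head>
<body>
  <h1>Web Design in Kansas City</h1>
  <p>Our Kansas City web design team builds sites that...</p>
</body>
```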
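For the redirects mentioned above, here's a minimal sketch of 301 rules in an Apache .htaccess file, which is common on WordPress hosting (the old and new paths are hypothetical; your server setup may differ):

```apache
# Permanently (301) redirect a single old URL to its best match on the new site.
Redirect 301 /old-services-page/ https://www.example.com/services/

# Or, with mod_rewrite, map an entire renamed section in one rule.
RewriteEngine On
RewriteRule ^blog/(.*)$ https://www.example.com/news/$1 [R=301,L]
```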
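On the mobile-friendly point: one baseline thing Google's test checks for is a viewport meta tag in the head of each page. Any responsive WordPress theme should already include something like:

```html
<!-- Tells mobile browsers to size the page to the device width
     instead of rendering a zoomed-out desktop layout. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```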
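For reference, an XML sitemap is just a plain file listing your URLs in a standard format. A minimal hand-built example (hypothetical URLs and dates) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2018-05-15</lastmod>
  </url>
</urlset>
```

A dynamically generated sitemap from your platform or an SEO plugin produces the same structure automatically as you publish content.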
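Finally, here's what a simple robots.txt might look like for a WordPress site (the sitemap URL is a hypothetical example; adjust the rules to your own site):

```text
# Lives at the site root, e.g. https://www.example.com/robots.txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your XML sitemap.
Sitemap: https://www.example.com/sitemap.xml
```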

If you have any questions pertaining to your project or scenario, please reach out! I’m happy to talk (for free) and give advice.