About this website

Search Engine Strategies

Site maps

An XML site map was created using the free online tool at sitemaps.org. It was submitted to Google via Google Search Console and to Bing via Bing Webmaster Tools. A robots.txt file was created which directs web crawlers to the XML site map. An HTML site map was also created and placed in the footer to enhance internal findability.
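A minimal robots.txt of the kind described above might look like the following; the domain and file name are assumed placeholders, not the site's actual values.

```txt
# robots.txt - allow all crawlers and point them to the XML site map
# (domain and file name below are assumed placeholders)
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap directive lets crawlers discover the site map even if it was never submitted manually.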


An OpenSearch XML file was added and made discoverable via a link element in the head section. This makes it easier for site visitors to search the site's content, and the search can also be used from outside the website, for example from the browser's address bar.
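An OpenSearch description document of this kind might look like the sketch below; the file name, site name and search URL template are assumed placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Made discoverable by adding this to each page's <head>:
     <link rel="search" type="application/opensearchdescription+xml"
           title="Site Search" href="/opensearch.xml" /> -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>Site Search</ShortName>
  <Description>Search this site's content</Description>
  <!-- {searchTerms} is replaced by the user's query -->
  <Url type="text/html" template="https://www.example.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```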

Markup and Content Strategies

Structured data

Schema markup was added in JSON-LD format to make pages eligible for richer search results.
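A JSON-LD block of this kind sits in the page head; the site name and URL below are assumed placeholders rather than the site's actual values.

```html
<!-- Schema.org structured data in JSON-LD format (values are assumed placeholders) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Example Site",
  "url": "https://www.example.com/"
}
</script>
```

Search engines read this block to understand what the page represents, which can enable rich result features.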

Semantic Markup and Keywords

Keywords were researched using the Google Ads Keyword Planner and used in semantic markup elements such as headings, links and strong tags. Semantic HTML was used to structure all content correctly and make it easier for search engines to interpret.
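The approach above can be sketched as follows; the article topic and link target are assumed examples, but they show keywords placed in headings, links and strong tags.

```html
<!-- Keywords placed in semantic elements (content is an assumed example) -->
<article>
  <h1>What is SEO?</h1>
  <p>Search engine optimisation (<strong>SEO</strong>) is the practice of
     improving a site's visibility in search results.</p>
  <a href="/articles/structured-data">Structured Data explained</a>
</article>
```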

404 Page

A custom 404 page was built to guide lost visitors back to the site's main content. The .htaccess file was used to configure the server to serve this page for missing URLs.
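On an Apache server this is a one-line directive; the path below is an assumed placeholder for wherever the custom page actually lives.

```apache
# .htaccess - serve the custom 404 page for missing URLs
# (the path is an assumed placeholder)
ErrorDocument 404 /404.html
```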

Backlink Strategies

Link Trading

I 'traded links' with other website owners by offering to link to their websites from mine in exchange for backlinks from theirs.

Fresh and shareable content

Websites which release new content frequently get crawled more often by Google and other search engines. To encourage this I have added a blog to which I can regularly publish new articles. Blog articles are also good for earning backlinks, as other authors may want to cite them as a resource. A resource list was also added, as this is another type of content likely to be cited by other bloggers.

Social media meta tags were added to each article so that articles are presented richly when shared on social media. Making the content more shareable may help passively earn backlinks: articles that are shared widely are more likely to be noticed and linked to by other website and blog owners.
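The social media meta tags described above are typically Open Graph and Twitter Card tags in each article's head; the titles, descriptions and image URL below are assumed examples.

```html
<!-- Open Graph and Twitter Card tags for an article page (values are assumed examples) -->
<meta property="og:type" content="article" />
<meta property="og:title" content="What is SEO?" />
<meta property="og:description" content="A beginner's guide to search engine optimisation." />
<meta property="og:image" content="https://www.example.com/images/seo-article.png" />
<meta name="twitter:card" content="summary_large_image" />
```

With these tags, platforms render a shared link as a card with a title, description and image instead of a bare URL.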

SEO Evaluation Methods


My SEO goals are:

  • To rank on the first page of Google
  • To gain more back links
  • To have a Click Through Rate higher than 3.09%

Google Search Console

Google Search Console (formerly Google Webmaster Tools) was set up, with site ownership verified via the DNS settings on the web host. This is the main console for evaluating the relative success of my SEO efforts. The main metrics I use to track success are clicks, impressions, click through rate and average position. Definitions for these terms can be found in the metrics table in the Google Search Console presentation. How well the website performs on these metrics can be evaluated using the Performance Report.
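DNS verification of this kind works by adding a TXT record containing a token issued by Search Console; the domain and token below are assumed placeholders, shown in zone-file notation.

```txt
; DNS TXT record for Search Console ownership verification
; (domain and token are assumed placeholders)
example.com.  3600  IN  TXT  "google-site-verification=abc123exampletoken"
```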

Click Through Rate

Pages which rank well in Search Engine Results Pages tend to have high Click Through Rates (clicks divided by impressions, expressed as a percentage). I am aiming for a click through rate higher than 3.09%, as this is the minimum average click through rate for top-ten-ranking pages.

Average Position

Average position is useful for gauging how well this website ranks for particular search queries. I want to rank well for the following key words and phrases:

  • Search engine
  • Google Search Console
  • What is SEO
  • Keywords
  • Error 404
  • Structured Data

These key phrases were chosen using the Google Keyword Planner. I picked them because they receive over 1,000 searches a month (some more than 10,000) but are classed as 'low competition' key phrases, meaning it is comparatively easier to rank highly for them. As the website has only recently been published, it has received little traffic and doesn't rank well for these search terms. If I can rank on the first page of Google for these terms I will consider my SEO efforts successful.


Inbound Links

I use the External Links report to see how many inbound links the site has gained, where they come from and which pages they point to. I aim to see a steady increase in the number of inbound links, or backlinks, to both the homepage and the 'article' type pages.