In this post we’ll take a brief look at tools for improving the index-ability of your website, how to make the best use of the data Google Webmaster Tools (GWT) collects about your site, and how that data can inform your online business strategy.
We’ll also uncover ways to understand how people search for and interact with your website, helping you gain more insight into, and control over, your website’s performance in Google’s organic search results and better connect with your customers.
Don’t Ignore Search
A study conducted by Enquisite.com concluded:
88% of online search dollars are spent on paid results, even though 85% of searchers click the organic results.
As the landscape continues to change and we become more of a searching culture, it is increasingly important for businesses not only to have a presence on the web, but also to have a presence in search engine results.
Moving into the future, it will be important to be aware of search: to understand the value of organic search, to discover the online opportunities available to you and your business, and to identify the barriers holding back your website’s potential.
Companies that ignore the online opportunity (or react too slowly) do so at their own peril, as today’s news regarding the Whitcoulls group clearly suggests. In a First Rate research report dated November 25, 2005 we highlighted the threat posed to Whitcoulls by online bookstores and by changing consumer search and purchasing behaviours. Six years later, reality has started to hit.
5 Ways to Improve Organic Search Metrics
XML Sitemaps:

Google Webmaster Tools allows you to submit XML Sitemaps telling Google’s crawlers which content can be indexed and how it should be prioritised. Google Webmaster Tools now also accepts specialised Sitemaps such as video, mobile, News, Geo, Code Search, and image Sitemaps. Specialised Sitemaps let you include specific information about the rich media content embedded in your site.
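For illustration, a minimal XML Sitemap might look like this (the URLs, dates, and priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

The `priority` value (0.0 to 1.0) is relative to other pages on your own site and is a hint, not a guarantee of ranking.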
For example, submitting a News Sitemap helps Google News discover articles on your site by pointing Google’s crawlers directly to your website’s news article URLs. With News Sitemaps you can also specify information about each article, including its title and date of publication, tag articles by genre, and use metadata to annotate them. Note: while Sitemaps help Google’s crawlers navigate your site, there’s no guarantee that all allowed URLs will be crawled and indexed.
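As a sketch, a News Sitemap entry adds a news namespace and per-article details (the publication name, URL, and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/news/example-article.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2011-05-01</news:publication_date>
      <news:title>Example Article Title</news:title>
      <news:genres>PressRelease</news:genres>
    </news:news>
  </url>
</urlset>
```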
Geographic Targeting:

Stuck with a generic top-level domain such as a .com or a .org, but your target market is within a specific country? If one of the objectives of your website is to target a specific geographic area (for example, a New Zealand website primarily targeting New Zealanders, or an Australian website targeting people living in Australia), then it’s important that your website ranks well for its target geographic location. If this is the case and you have a .com or a .org, Google will assign your website’s geographic location based on its IP address.
“But my domain is registered overseas and my target market is here in New Zealand.” By using GWT you can specify a geographic target, i.e. New Zealand, to let Google know the primary audience your website aims to reach is in that area. You can find more info on this here and here.
Search Queries:

The search queries feature in GWT has only recently been enhanced. With the search queries data, you can discover which Google searches are returning pages from your website in the results, and investigate the information available for each query: impressions, clicks, CTR, and average position, along with the change in those metrics over a specified time period.
Search query data can be filtered to show queries containing or excluding particular terms, starred queries, type of web content (i.e. images, mobile, video etc.) and geographic location. With search queries data, you can identify keywords to target with content and see which URLs rank for which keywords. The search queries “top pages” report also shows which pages of your site are performing well. Once you’ve identified your best-performing pages, you can look into how to improve and optimise them to perform even better.
Duplicate Content:

The problem with duplicate content (multiple pages with largely identical content) is that when search engines like Google find pages that appear to be duplicates, only one will be chosen for indexing, and it may not be the one you intended, so the ‘weight’ of internal links gets assigned to the wrong page. There’s also the possibility of diluting link juice across the duplicates.
To find out whether your website has instances of duplicate content, run a diagnostics report in GWT and review the HTML suggestions. If Google reports duplicate content on your website, start with the target pages and work through the suggestions.
Reduce duplicate content check list:
- Select canonical URL from duplicates.
- Be consistent throughout the site with canonicals.
- Use 301 permanent redirects where possible.
- Implement rel="canonical".
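To sketch the last two items on the checklist (example.com and the paths are placeholders), the rel="canonical" link element goes in the `<head>` of each duplicate page, pointing at the URL you’ve chosen as the canonical one:

```html
<!-- On a duplicate view such as http://www.example.com/product?sort=price -->
<head>
  <link rel="canonical" href="http://www.example.com/product" />
</head>
```

And where a duplicate URL can be retired entirely, a 301 permanent redirect is the stronger option; on Apache with mod_rewrite enabled, this might look like:

```apache
# .htaccess sketch: 301-redirect the non-www host to the canonical www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```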
Page Load Speed:
In its endeavour to improve search, Google believes good-quality websites should also be improving their user experience. As part of this, in April 2010 Google officially announced ‘Using site speed in web search ranking’, adding site speed as a new signal in its search ranking algorithm. First Rate covered this in an earlier post, here.
Google Webmaster Tools site performance provides a site performance overview chart showing how long, on average, it takes for pages on your website to load.
To evaluate your site’s page speed from both a web server and front-end code perspective, a recommended tool is the Page Speed extension for Firefox/Firebug. The tool tests your website against web performance best practices and assigns each page a score, along with suggestions for improvement.
If you find pages on your website take longer than 2 seconds to load, enable gzip compression if you haven’t already, and review the Page Speed suggestions.
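On Apache, for example, gzip compression can be enabled via mod_deflate; a minimal sketch (assuming the module is installed and enabled):

```apache
# Compress text-based responses before they leave the server
AddOutputFilterByType DEFLATE text/html text/css text/plain
AddOutputFilterByType DEFLATE application/javascript application/x-javascript
```

Compression helps most on text-heavy responses (HTML, CSS, JavaScript); images and other already-compressed media gain little from it.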
Looking to take search seriously? Contact Us.