18 factors for evaluating the quality of a website
A guide on how to evaluate the quality of an existing website, including 18 important factors both off-site and on-site.
Table of Contents
- Alexa Ranking
- Majestic SEO
- Moz Open Site Explorer
- Google PageSpeed Insights
- WebPagetest.org
- Website Grader
- Security Headers
- Markup Validation Service
- SSL Certificate
- Navigation
- Accessibility
- Design / Responsive
- Calls to Action
- Privacy Policy and Terms and Conditions
- XML Sitemap and robots.txt
- OpenGraph data
- Twitter Cards
- Structured Data / Rich Snippets
Perhaps you are considering redesigning an existing website, or maybe you are thinking about acquiring one. Whatever the reason, this post provides a series of both quantitative and qualitative metrics to build up a picture of the quality of an existing website.
This approach evaluates the website based only upon its publicly accessible surface and information that can be gathered from other publicly available tools and services. Following these steps does not have any impact, negative or otherwise, on the website you are evaluating.
Alexa Ranking
Alexa has been around for years, but a website’s Alexa rank is still quite a good way to see how much traffic a website receives, and you can compare the website against its competitor sites to see how it fares among its peers. There’s also useful information here about the website’s bounce rate, keywords and similar websites. A strong Alexa rank (the lower the number, the better) is a good sign that the website is performing well in the search engines and attracting regular traffic.
Majestic SEO
Majestic SEO will provide a rough indication of the number of backlinks the website has, that is, the number of links elsewhere on the internet that point back to the website. This is a tough metric to manufacture, so it’s where much of the value of a potential acquisition can be seen. Other interesting factors include the anchor text distribution/analysis.
Moz Open Site Explorer
The Open Site Explorer from Moz is similar to the Majestic SEO tools, providing a range of metrics covering the backlinks to the website and anchor text, but it also offers the ability to compare those metrics with competitor sites.
Google PageSpeed Insights
For our website projects we generally achieve a 90+ score (on desktop).
Google PageSpeed Insights is Google’s own website performance benchmarking service: enter a URL and Google will return both a mobile and a desktop score for the performance of the website. The use of third-party services (YouTube videos, live chat, external sign-up forms, etc.) often decreases a website’s PageSpeed score.
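If you want to record scores for several pages, PageSpeed Insights also has a web API (version 5 at the time of writing). The sketch below assumes that endpoint and uses a placeholder page URL; Google may require an API key for anything beyond occasional use.

```python
# A rough sketch, assuming the v5 PageSpeed Insights web API and a
# hypothetical page URL; heavy or repeated use may need an API key.
import json
import urllib.parse
import urllib.request

PAGE = "https://www.example.com/"  # placeholder: the page you are evaluating
API = (
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
    + urllib.parse.urlencode({"url": PAGE, "strategy": "desktop"})
)

with urllib.request.urlopen(API) as response:
    result = json.loads(response.read().decode("utf-8"))

# Lighthouse reports the performance score as a value between 0 and 1.
score = result["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Desktop performance score: {score * 100:.0f}/100")
```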
WebPagetest.org
For our website projects we generally achieve an "A" grade.
WebPagetest.org is another well-regarded performance benchmarking service. As with Google’s PageSpeed, you enter a URL and then wait in a queue for your results to appear; a higher grade is also more difficult to achieve if the website uses multiple third-party services.
Website Grader
For our website projects we generally achieve a score of 90+.
Website Grader is a more all-encompassing benchmarking service, which scores a website based on performance, security and SEO metrics.
Security Headers
For our website projects we generally achieve a B, C or D grade.
SecurityHeaders.io is a security benchmarking service; its main aim is to raise awareness of the security headers that make websites more secure. It’s actually a quick job to configure a web server to return some of the security headers this service checks for. A “B” grade is really quite good, and whilst an “A” grade is achievable, some of the security headers, such as Public Key Pinning and Strict Transport Security, should be added with care as they can cause more problems than they are worth.
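If you’re comfortable running a few lines of Python, a minimal sketch like the one below (standard library only, with a placeholder URL) will print which of the commonly recommended security headers a site returns:

```python
# A minimal sketch, standard library only; the URL is a placeholder for
# the site you are evaluating.
import urllib.request

URL = "https://www.example.com/"  # placeholder

HEADERS_TO_CHECK = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Frame-Options",
    "X-Content-Type-Options",
    "Referrer-Policy",
]

with urllib.request.urlopen(URL) as response:
    for name in HEADERS_TO_CHECK:
        # Header lookups are case-insensitive; a missing header returns None.
        value = response.headers.get(name)
        print(f"{name}: {value if value else 'MISSING'}")
```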
Markup Validation Service
We validate the HTML of all our projects so that our websites are free from syntax errors, ensuring the best longevity and browser compatibility.
The W3C’s Markup Validation Service has been around for ages and remains a well-used tool in every website designer’s toolbox. Take a selection of different page URLs from the website and run them through this tool. Note that there’s a difference between a warning and an error: a warning should not always be considered an issue, as there may be a good reason for it, whereas a pattern of several errors on each page indicates that the website is poorly coded and suffers from HTML syntax errors.
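Checking several pages can also be scripted. The sketch below assumes the Nu HTML Checker’s “doc” and “out=json” parameters, which may change, so treat it as a starting point and check the service’s documentation; the page URL is a placeholder:

```python
# A sketch only: it assumes the Nu HTML Checker's "doc" and "out=json"
# parameters, which may change, and uses a placeholder page URL.
import json
import urllib.parse
import urllib.request

PAGE = "https://www.example.com/"  # placeholder
API = "https://validator.w3.org/nu/?out=json&doc=" + urllib.parse.quote(PAGE, safe="")

request = urllib.request.Request(API, headers={"User-Agent": "site-evaluation-sketch"})
with urllib.request.urlopen(request) as response:
    result = json.loads(response.read().decode("utf-8"))

messages = result.get("messages", [])
errors = [m for m in messages if m.get("type") == "error"]
warnings = [m for m in messages if m.get("subType") == "warning"]
print(f"Errors: {len(errors)}, warnings: {len(warnings)}")
```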
SSL Certificate
All our websites are issued with auto-renewing SSL certificates at no additional cost.
This is an easy one: is the website served securely over HTTPS rather than the insecure HTTP protocol? If that means nothing to you, it’s easy to identify by looking at the website’s URL in your browser’s address bar: does it start with “https://” (secure) or “http://” (insecure)? The former is best. An SSL certificate ensures that any data your staff and visitors exchange with the website travels over an encrypted connection.
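A quick supporting check is whether the plain “http://” address redirects to “https://”. The sketch below (standard library Python, placeholder domain) simply follows the redirect and reports where it ends up:

```python
# A minimal sketch, standard library only; the domain is a placeholder.
import urllib.request

# urllib follows redirects automatically, so the final URL shows whether
# the site pushes visitors from plain HTTP onto HTTPS.
with urllib.request.urlopen("http://www.example.com/") as response:
    final_url = response.geturl()

if final_url.startswith("https://"):
    print("Redirects to HTTPS:", final_url)
else:
    print("Still served over plain HTTP:", final_url)
```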
Navigation
This is somewhat down to opinion, but think of the common tasks visitors might want to complete on the website. For each task, start from the homepage with that task in mind and try to follow it through to the end. Was it easy to find your way and complete the task? Did you get lost, or were there moments when you had to use the “back” button (not a good sign)?
Accessibility
Our website projects are built in accordance with WCAG 2.0 accessibility guidelines.
Accessibility in relation to a website refers to the inclusive practice of removing barriers that prevent interaction with, or access to, content and functionality by people with disabilities. When sites are correctly designed, developed and edited, all users have equal access. This is probably the hardest of all the factors in this blog post to evaluate, as there aren’t really any tools that can give you a score without some knowledge of web accessibility to interpret the results correctly. The best of those tools is the SortSite - Accessibility Checker and Validator.
Design / Responsive
Again somewhat down to opinion, but is the website visually appealing? Does it use an attractive colour scheme, and is there a clear visual hierarchy so your eye naturally flows through the page rather than feeling like every element on the page is competing for your attention all at once?
Is the website responsive? If you visit it on a mobile or tablet device, are you left pinching and zooming around the page awkwardly, or does the design adapt to the device's screen, giving you an equal experience across mobile, tablet and desktop devices?
Calls to Action
Does the website include clear calls to action? Is it obvious which actions the website wants visitors to complete? Does this clarity continue throughout the user’s journey to complete those actions?
Privacy Policy and Terms and Conditions
Our website projects include initial privacy policy and terms and conditions pages ready for customising.
Does the website provide a Privacy Policy and Terms and Conditions?
This is a good sign that the website (and the organisation behind it) is being open about how any information that is exchanged with it is processed, stored and in some cases shared.
XML Sitemap and robots.txt
Our website projects include both an XML Sitemap and robots.txt as standard.
An XML sitemap is definitely recommended, as it is a blueprint of all the pages in the website for search engines to use. Whilst search engines can still manage without one, you’re giving them a good helping hand, so there’s no good reason not to have one.
The robots.txt file is not strictly essential, but it can be used to tell search engines where to locate the website’s sitemap. Its presence also suggests good attention to detail: it should direct search engines to the website’s XML sitemap and flag to search engines (and other spiders/crawlers that obey it) the URLs in the website that should not be indexed, such as CMS log-in pages and the like.
You can often take a guess at the URLs for these resources, which are usually the website’s domain name plus:
- XML Sitemap: sitemap.xml (e.g. www.enovate.co.uk/sitemap.xml)
- Robots.txt: robots.txt (e.g. www.enovate.co.uk/robots.txt)
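If you’d rather not guess, a small sketch along these lines (standard library Python, placeholder domain) will tell you whether both files exist at their conventional locations:

```python
# A minimal sketch, standard library only; the domain is a placeholder.
import urllib.error
import urllib.request

DOMAIN = "https://www.example.com"  # placeholder

for path in ("/robots.txt", "/sitemap.xml"):
    try:
        with urllib.request.urlopen(DOMAIN + path) as response:
            print(f"{path}: found (HTTP {response.status})")
    except urllib.error.HTTPError as err:
        print(f"{path}: not found (HTTP {err.code})")
```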
OpenGraph data
Our website projects include OpenGraph data as standard ensuring your content looks its best when shared.
OpenGraph is a protocol created by Facebook, which enables a web page to become a “rich object” in a social graph. So in essence it allows any web page to have the same functionality as any other object on Facebook.
It’s highly recommended to include OpenGraph meta data in a web page so as to maximise traffic from, and sharing on, social media networks. It’s not just about making sure the necessary meta data is present, but about making sure it’s carefully composed to do the best job of encouraging people to click through to the web page.
You can view the OpenGraph data that is configured for a webpage by using Facebook’s OpenGraph debugging tool.
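As a quick complement to Facebook’s tool, a short sketch along these lines (standard library Python, placeholder URL) will list whatever OpenGraph tags a page declares:

```python
# A minimal sketch, standard library only; the URL is a placeholder.
from html.parser import HTMLParser
import urllib.request

URL = "https://www.example.com/"  # placeholder

class OpenGraphParser(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        # OpenGraph tags use property="og:..." attributes.
        if prop.startswith("og:"):
            print(f"{prop}: {attrs.get('content', '')}")

with urllib.request.urlopen(URL) as response:
    OpenGraphParser().feed(response.read().decode("utf-8", errors="replace"))
```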
Twitter Cards
Our website projects include additional Twitter Card meta data so that your content looks its best when shared on Twitter.
Similar to Facebook, Twitter also use the meta data on a web page when sharing content on the site. This meta data is displayed on Twitter in what Twitter call a “Twitter Card”. Fortunately, Twitter largely use the same OpenGraph protocol that Facebook do but there are a few additional elements of Twitter-only meta data that can be added to improve things further. Twitter also provide a tool like Facebook’s debugger for validating Twitter Cards.
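Twitter Card tags are added as meta elements whose name attribute starts with “twitter:”, so a near-identical sketch to the OpenGraph one above will surface them (again standard library Python with a placeholder URL):

```python
# A minimal sketch, standard library only; the URL is a placeholder.
from html.parser import HTMLParser
import urllib.request

URL = "https://www.example.com/"  # placeholder

class TwitterCardParser(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = attrs.get("name", "")
        # Twitter Card tags use name="twitter:..." attributes.
        if name.startswith("twitter:"):
            print(f"{name}: {attrs.get('content', '')}")

with urllib.request.urlopen(URL) as response:
    TwitterCardParser().feed(response.read().decode("utf-8", errors="replace"))
```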
Structured Data / Rich Snippets
Our website projects include an initial implementation of Structured Data, with a more thorough implementation available as an option.
Google’s Webmaster guidelines encourage web designers to use Structured Data wherever possible. Structured Data is a method of presenting a common object on a web page in a standardised way. For example that common object might be a place of business, breadcrumb links, a person, a blog post/news item, a product, a service, an event or even a recipe.
Google is growing a catalogue of objects (from the enormous schema.org library), which it can draw meaning from and surface in search results. So a website that is using Structured Data is likely to receive more traffic from search engines due to greater visibility.
To see if any of your website’s pages include Structured Data, take a selection of URLs and paste them into Google’s Structured Data Testing Tool.
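Structured Data is often embedded as JSON-LD inside script elements of type “application/ld+json” (it can also appear as microdata or RDFa), so a rough sketch like this (standard library Python, placeholder URL) will show which JSON-LD object types a page declares before you reach for Google’s tool:

```python
# A minimal sketch, standard library only; the URL is a placeholder.
import json
import re
import urllib.request

URL = "https://www.example.com/"  # placeholder

with urllib.request.urlopen(URL) as response:
    html = response.read().decode("utf-8", errors="replace")

# JSON-LD structured data lives in <script type="application/ld+json"> blocks.
pattern = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)
for block in pattern.findall(html):
    try:
        data = json.loads(block)
    except json.JSONDecodeError:
        print("Found a JSON-LD block that is not valid JSON")
        continue
    for item in (data if isinstance(data, list) else [data]):
        if isinstance(item, dict):
            print("Declared object type:", item.get("@type", "unknown"))
```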