Ben Jones is a freelance SEO consultant working for SEO Doctor.
As an SEO consultant I have encountered, if not thousands, then many hundreds of websites that come to me for SEO audits, and most of them share very similar problems, problems Paddy Moogan has also covered in one of his posts on SEOmoz.
It is painful to spend hours auditing each and every website only to find the same problems and errors at step one of the SEO process. Since this happens to me so often, I decided to automate the process, or at least find a way to reduce the manual work and make the audit as quick as possible.
This post will discuss the tools I use to audit a website at an initial level, to find out what problems it has and how much SEO work will be needed to achieve its targets.
SpyOnWeb:
Not many people know about it, but it is a phenomenal tool that I recently started using to speed up my initial SEO audits. It is important to check the Google Analytics code your website is using: it should not appear on any other website, and if it does, you should fix your tracking accordingly.
If two or more websites use the same Google Analytics code, your website's profile may end up mixing traffic from both sites, which makes it hard to track leads, sales, and how the website is improving over time.
It is ideal to use a unique Google Analytics code for each domain.
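If you want to spot-check this yourself, the core of the idea can be sketched in a few lines: pull the classic Google Analytics (UA-) tracking IDs out of each site's HTML and compare them across domains. The function name, regex, and sample snippet below are my own illustrative assumptions, not part of any particular tool.

```python
import re

# Hypothetical helper: extract classic Google Analytics tracking IDs
# (format UA-<account>-<property>, e.g. UA-1234567-1) from raw HTML,
# so IDs fetched from two domains can be compared for overlap.
GA_ID_PATTERN = re.compile(r"UA-\d{4,10}-\d{1,4}")

def extract_ga_ids(html: str) -> set:
    """Return the set of UA- tracking IDs found in the given HTML."""
    return set(GA_ID_PATTERN.findall(html))

# Sample snippet standing in for a fetched page (no live request is made here).
sample_html = """
<script>
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-1234567-1']);
</script>
"""

print(extract_ga_ids(sample_html))  # {'UA-1234567-1'}
```

In a real check you would fetch each domain's homepage and flag any tracking ID that appears on more than one domain.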
This is another quick tool that provides a lot of different information about a website, but I mainly use it to check the technology a website is built on. It helps most when people use different content management systems, e.g. WordPress, ExpressionEngine, or another CMS.
Many CMSs are not SEO friendly, and this is where you can estimate how much effort a particular CMS will require. I am never happy when it reports an ASP.NET platform, as those sites need a lot of effort to make them SEO friendly.
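One of the simpler signals such tools look at is the `<meta name="generator">` tag many CMSs emit. A minimal sketch of that single check, with my own assumed function name and sample markup (real detectors also inspect HTTP headers, cookies, and URL patterns):

```python
import re

# Rough sketch: guess the platform from the <meta name="generator"> tag,
# one of several fingerprints a technology-detection tool might check.
GENERATOR_RE = re.compile(
    r'<meta\s+name=["\']generator["\']\s+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def detect_generator(html: str):
    """Return the generator meta content if present, else None."""
    match = GENERATOR_RE.search(html)
    return match.group(1) if match else None

sample = '<head><meta name="generator" content="WordPress 3.5"></head>'
print(detect_generator(sample))  # WordPress 3.5
```

Note that many sites strip this tag, so a missing generator meta proves nothing on its own.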
XML Sitemap Validator:
Now the next step is to move on to robots.txt and the XML sitemap. Robots.txt is easy and can be checked manually, but the XML sitemap is trickier. Having an XML sitemap is not enough: it should not contain any URLs that return 4xx or 5xx response codes.
It is important that every URL in the sitemap returns a 200 header response, or a 301/302 in the case of redirects. This tool automates the process so you do not have to check every single URL: just copy and paste the location of your sitemap and it will return accurate results.
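The check itself is straightforward to sketch: parse the sitemap XML, collect every `<loc>` URL, then request each one and flag anything outside 200/301/302. The snippet below only shows the parsing step on an inline sample sitemap; the function name and sample URLs are my own assumptions.

```python
import xml.etree.ElementTree as ET

# The sitemaps.org schema namespace used by standard XML sitemaps.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list:
    """Parse sitemap XML and return every <loc> URL it lists."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

# Illustrative sitemap; in a real audit you would download the live file.
sample_sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

for url in sitemap_urls(sample_sitemap):
    # In a live run, request each URL here and record any status
    # code that is not in {200, 301, 302}.
    print(url)
```

For large sitemaps you would also want to respect crawl politeness (delays, a proper User-Agent) when requesting each URL.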
In the initial audit it is important to know whether your content is original or duplicated. If content is duplicated within the website itself, SEOmoz Pro can help you identify it, but what if the content is copied from some other website?
This tool gives a quick analysis of content copied from third-party websites. This information helps me understand how much work will be required to make the project successful.
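To give a feel for what "duplicate" means mechanically, here is a minimal sketch of one classic approach: break each page's text into word shingles and score the overlap with Jaccard similarity. This is my own illustration of the general idea, not how any of the tools above actually work; production systems are far more sophisticated.

```python
# Toy duplicate-content scoring: word shingles + Jaccard similarity.
def shingles(text: str, size: int = 3) -> set:
    """Return the set of consecutive word tuples of the given size."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a: str, b: str) -> float:
    """Overlap score between 0.0 (no shared shingles) and 1.0 (identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two near-identical sample pages differing by one word.
page_a = "our widgets are the best widgets money can buy today"
page_b = "our widgets are the best widgets money can buy online"
print(round(jaccard(page_a, page_b), 2))  # 0.78
```

A high score between two of your own pages suggests internal duplication worth consolidating or canonicalizing.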
Screaming Frog is the most famous of all the tools mentioned here. It is an amazing tool for an initial audit, offering tons of information that helps you determine how much work is coming on the technical side of SEO.
If your website is below 500 pages, Screaming Frog is free; beyond that it has an affordable cost, and in return it offers a quick technical overview of your website.
All you have to do is enter your URL and start the crawl; Screaming Frog will scan your website and give you all the key technical information, including titles, meta data, heading tags, header statuses, and links.
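Under the hood, the per-page side of such a crawl boils down to fetching each URL and pulling out elements like the title, meta description, and headings. A toy sketch of that extraction step, using only the standard library on an inline sample page (a real crawler would also follow links and record each URL's HTTP status):

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collect the <title>, meta description, and first-level <h1> text."""

    def __init__(self):
        super().__init__()
        self._current = None      # tag whose text we are currently capturing
        self.title = ""
        self.h1 = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

# Illustrative page standing in for one crawled URL.
sample_page = """<html><head><title>Acme Widgets</title>
<meta name="description" content="Widgets for every budget.">
</head><body><h1>Welcome to Acme</h1></body></html>"""

audit = PageAudit()
audit.feed(sample_page)
print(audit.title, "|", audit.meta_description, "|", audit.h1)
```

Running this kind of check across every crawled URL is essentially what produces the title/meta/heading report.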
Obviously, other tools come into play at later stages of the audit, but the tools mentioned above help in the infancy stage of your project and provide all the basic information about a website. If you have any other tool that can improve the basic stage of an SEO audit, please share it in the comment section.