Sitebulb Review – Hands-On with SEO’s Newest Auditing Tool
When I heard the news that a new auditing tool, Sitebulb, was coming onto the SEO market, I’ll be honest enough to say that I didn’t think I really needed one. I’ve been using Screaming Frog for what feels like forever, and I’m in love with the way DeepCrawl makes my life easy for regular weekly checks on my clients’ health.
So when Sitebulb was announced, I signed up apprehensively to the newsletter as I wanted to check out what it is that made Patrick and the guys feel there was a need for a new tool on the market.
After using the tool for around a month on various small projects, I’ve put together a little review covering some of its key features, and how you could incorporate this tool into your toolkit.
Oddly, when I came to write this review I did so on a Windows machine, when I’d actually been using the tool on a Mac during the beta period, so I was keen to see if there were any kinks I’d missed! On a plus note, the forgotten-password feature worked a charm and the reset email arrived in seconds.
As it happens, the day of publishing this is the official Sitebulb release day (28th September 2017) – meaning the software is now publicly available for all users to access.
The software is a desktop-based application much akin to Screaming Frog, which allows users to crawl up to 500 URLs for free, with limited features should you wish not to pay for a licence. For those that do pay for a licence, you’ll get some nifty features like crawl maps, mobile-friendly checks and unlimited URL crawls, amongst others that I’ll touch on where possible throughout this post.
The installation process for the software is fairly straightforward. Download the software from the site, run through the installation, and create yourself an account. Once you’ve got that all up and running, you just need to sign into the software with your details and you’re free as a bird.
All previous audits are saved locally, so if you do want to take your crawls with you, you’re probably best backing them up in the cloud. Handily, you can import previous crawls into the tool for review on different devices.
Setting up a Crawl
When you first come to set up a crawl on the platform, it’s a simple process. Just click on Start New Project and follow the on-screen instructions about your domain’s details. Once submitted, Sitebulb will test the configuration of the site and pre-populate some settings based on its checks.
One nice little feature is the ability to be able to crawl just specific directories. Great if you just want to do a quick audit of a resource section such as your blog, or if you wanted a snapshot of a certain area of your site.
Once that has finalised, you get to the jiggery-pokery part that will help you set up the results that you want from the crawl. I’ve outlined some of the features below:
The AMP analysis report does exactly what you’d expect it to. It will perform an analysis of your AMP URLs and highlight any issues with them. Items it checks for include status codes, index status, correlation with the sitemap and canonical status.
Potential bug – if you’ve correctly set up your AMP pages to have a canonical tag pointing to the main version of the page, Sitebulb reports it as an “Issue”, as the AMP page is not “indexable” due to the canonical tag. Either The Guardian and I have both messed up the implementation of AMP URLs, or this is just a little bug that needs ironing out.
Update: After speaking to the guys at Sitebulb, this is indeed a bug which they will look to get resolved in a future update.
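If you want to sanity-check your own AMP setup while you wait for that fix, the pairing the AMP spec expects – a canonical tag on the AMP page pointing back to the regular page – is easy to verify yourself. A rough Python sketch (the parser class and function names are my own, not anything from Sitebulb):

```python
from html.parser import HTMLParser

class LinkRelParser(HTMLParser):
    """Collects rel=canonical and rel=amphtml link targets from a page."""
    def __init__(self):
        super().__init__()
        self.rels = {}

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            rel, href = attrs.get("rel"), attrs.get("href")
            if rel in ("canonical", "amphtml") and href:
                self.rels[rel] = href

def check_amp_pairing(amp_html, canonical_url):
    """An AMP page should canonicalise to the regular page -- on its own
    that is correct implementation, not an indexability issue."""
    parser = LinkRelParser()
    parser.feed(amp_html)
    return parser.rels.get("canonical") == canonical_url

amp_page = ('<html><head>'
            '<link rel="canonical" href="https://example.com/story">'
            '</head></html>')
print(check_amp_pairing(amp_page, "https://example.com/story"))  # True
```

You would fetch the AMP URL’s HTML yourself and feed it in; the point is simply that the canonical on the AMP page should match the desktop URL exactly.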
If you have a site that serves content for multiple locations, this nifty little report will allow you to test your hreflang tags to ensure they all reference each other appropriately.
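The two checks that matter most with hreflang are self-reference and reciprocity: every page should include itself in its own annotations, and every alternate it points at should point back. I don’t know exactly how Sitebulb implements its check, but the logic amounts to something like this sketch (data shape and function name are my own):

```python
def hreflang_errors(pages):
    """pages maps each crawled URL to its hreflang annotations, e.g.
    {"https://example.com/en/": {"en": ..., "de": ...}, ...}.
    Flags missing self-references and missing return tags."""
    errors = []
    for url, annotations in pages.items():
        # A page should list itself among its own alternates.
        if url not in annotations.values():
            errors.append(f"{url}: missing self-referencing hreflang")
        # Every alternate this page names should name this page back.
        for lang, target in annotations.items():
            back = pages.get(target, {})
            if url not in back.values():
                errors.append(f"{url} -> {target} ({lang}): no return tag")
    return errors

good = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/",
                                "en": "https://example.com/en/"},
}
print(hreflang_errors(good))  # []
```

A one-sided annotation (the `de` page not linking back to the `en` page, say) would show up as a “no return tag” error.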
Enabling this feature will add another dimension to your reporting, allowing you to test whether discovered URLs appear in your sitemap. When enabling this option, Sitebulb will pre-populate the sitemap URLs it has discovered (presumably via the robots.txt) and add them to the list of those to be checked; manual entries and multiple sitemaps are allowed.
Another handy report you get from this is an overview of your sitemaps, highlighting issues with URLs that were discovered during the crawl. This includes whether any URLs have problematic status codes, whether there are issues with the sizes of the sitemaps, or even whether the sitemaps have not been referenced in the robots.txt files.
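The core of that crawl-versus-sitemap comparison is just set arithmetic over two URL lists, and it’s handy to be able to reproduce it outside the tool. A minimal sketch using only the standard library (the helper names are my own):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Pull the <loc> entries out of a urlset sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def sitemap_gaps(sitemap_entries, crawled_urls):
    """The two gaps the report surfaces: sitemap URLs the crawler never
    reached, and crawled URLs the sitemap is missing."""
    in_sitemap = set(sitemap_entries)
    not_crawled = sorted(in_sitemap - crawled_urls)
    not_in_sitemap = sorted(crawled_urls - in_sitemap)
    return not_crawled, not_in_sitemap

xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""
entries = sitemap_urls(xml)
print(sitemap_gaps(entries, {"https://example.com/"}))
```

Here `/blog/` would be flagged as in the sitemap but not crawled – the same kind of gap the report calls out.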
Once you have performed your crawl you get to really understand where the tool is going to help you out. Whether you’re a one man band (or woman) or you work in-house or for an agency, the visual reports that this software offers you are bound to blow your socks off.
Now, don’t get me completely wrong, I like and know my way round Screaming Frog pretty well, but when it comes to putting that information in front of clients, it doesn’t always cut the mustard. With Sitebulb the visualisations make things a little bit easier to comprehend, and they look pretty damn good too. It’s easy to just take a screenshot from the tool and dump into an email or presentation with a note about what it all means.
As there are a large number of features, and a pretty cool in-software tutorial within Sitebulb itself, I’ve just briefly summarised some of the reporting tabs that Sitebulb has to offer.
Throughout each of these reports, Sitebulb offers you hints based on the information it has discovered in its crawls. Each tab will show its own hints; however, for a quick snapshot of everything there is a handy little tab that you can click to show all the hints that the crawl has discovered.
The overview tab does exactly what it says on the tin. Going into this section of the audit you will get a brief overview of when the audit was compiled, the naming conventions you used and, most importantly, a top-line overview of its findings.
This tab is the first port of call when it comes to looking at what may be the highest of priorities when it comes to fixing issues on your site. Quite handily, hovering over each section allows you to click through to the corresponding tab that contains more information about that data.
There are also some pretty nifty charts available on this tab, including:
- Crawled URLs by folder depth
- Summary of status codes by number of pages
- Type of URLs by their depth – e.g. External Links, Media Resources and orphaned pages
- Source of URLs – e.g. picked up by crawler, from sitemap or from one of the plugged in resources (Google Analytics or Google Search Console)
- Summary of the types of content (CSS, JS, HTML etc.)
Next up we have the internal tab. This looks in more detail at the types of pages that you have on your site, including the file types and the protocols that are in use. You will also see more information here regarding the source of the URLs discovered and how pagination is used across the domain you are crawling.
The links tab looks at the state of your internal linking structure, highlighting where you may have some broken internal links, or where internal links are going through a redirect. The report also shows the number of unique internal links across the site, as well as the average number of unique internal links by the crawl depth of the site – this is especially handy if you think you may have an issue with internal links to important pages.
You will also see all the information you need regarding the use of follow and nofollow internal links. Again, this reports on how they are structured across the site.
The next two sections of this report look at how the anchor text is used across the site, and what the most linked pages are. A useful snapshot when looking to see where little value is being added to some of your more important content.
Redirects can be an issue for sites for many reasons: crawl waste, speed, and general user experience. The redirects report in Sitebulb shows you how redirects were discovered during the crawl, and at which depth of the site they commonly exist.
The indexability report is pretty damn hot. This report takes several things into the equation including robots.txt, canonical tags and the meta robots tags. Using this report you are able to identify where there may be issues with content not having the correct canonical or meta robots configuration, and also a snapshot of where your most noindexed pages appear.
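To make the equation concrete: a page ends up classed as non-indexable if any one of those signals rules it out. A simplified decision function along those lines (my own approximation – the real report will weigh more signals than these three):

```python
def is_indexable(url, meta_robots="", canonical=None, blocked_by_robots=False):
    """Rough approximation of the factors the indexability report combines.
    Returns (indexable, reason)."""
    if blocked_by_robots:
        return False, "blocked by robots.txt"
    if "noindex" in meta_robots.lower():
        return False, "meta robots noindex"
    if canonical and canonical != url:
        return False, "canonicalised to " + canonical
    return True, "indexable"

# A canonicalised duplicate counts as non-indexable even when nothing
# else blocks it -- exactly the situation the AMP bug above trips over.
print(is_indexable("https://example.com/page?sort=asc",
                   canonical="https://example.com/page"))
```

Running each crawled URL through a function like this is essentially how you get the “where your most noindexed pages appear” snapshot: group the non-indexable URLs by reason and by folder.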
The on page report is pretty nifty for a quick content analysis. The report takes a look at word counts on the page, but also showcases readability and sentiment scores to inform you where improvements should be considered. Using this report you are able to highlight any issues with the way that content is structured, as well as review any potential issues with meta descriptions and page titles, including those which are duplicated or missing.
The resources tab takes a look at the status of the different filetypes that are used across the site. Offering you a summary of all the different types of resources discovered in the crawl, you’ll see a report on any potential issues with status codes, but also the total number of the different types of resources discovered.
Site speed is important to many people, and so it should be. The report that Sitebulb offers here is very, very good. When it comes to measuring site speed, I am an advocate of using as many sources as possible to highlight what appear to be the common issues, and this tool offers some pretty good information to help support that process.
The site speed report shows you a number of different metrics, including an overall score based on its discoveries. The metrics it includes can be seen below:
- Site Speed Score
- Slowest Time to First Byte
- Slowest Download Time
- Time to First Byte
- Download Time
- Time to First Byte by Depth
- Download Time by Depth
The mobile rendering tab also uses a score-based system and looks to measure against a few different metrics. This report can be used to qualify any issues that you may have in the use of frames, plugins, viewport settings and font sizes, amongst others.
The search traffic report is again something very useful. If you utilise Screaming Frog or DeepCrawl and have connected your Google accounts for crawls, you’ll understand how valuable this data can be. The reports in Sitebulb allow you to see where there may be traffic gaps for different devices, where there may be problems with your conversions, and also the distribution of your traffic across the site by the number of search visits to your HTML URLs.
The external URLs report is the last of the main features of Sitebulb’s reporting tool. This allows you to take a look at the types of URLs that are being linked to externally within your crawl. This is useful as you can use it to check for any broken resources, but also to see where your most linked-to content sits.
The crawl maps feature of Sitebulb is pretty special. If you’ve ever wanted to see a data visualisation of the way that your site is structured, then this is hands down one of the best ways to go about it. The interactive report allows you to get a top-down overview of your site, originating from the homepage, and see how content is siloed across your domain.
Hovering over each node of the crawl map, you will be able to see more information about what that page is and whether it correlates to what you would expect it to be. This report can be especially useful where nodes for what could be considered important content sit way off the centre.
In summary, I’ve been super impressed by the Sitebulb beta platform – and even more so their release notes – and I’ll likely continue to use the tool moving forward for projects. I’ve been really pleased with the way that the visualisations have enabled me to pull together information for clients, and the setup and configuration is fairly simple to use.
If I had to mention one bugbear, it would be that I couldn’t see a way to re-crawl a URL that had already been crawled. Many domains now use some sort of DDoS protection, so if you have a site that is sensitive to this, make sure you use the slowest crawler speed, and in extreme circumstances consider whitelisting the user agent.
Update: It’s planned that in a future release the tool will be able to re-crawl any failed URLs, but as the reports will all need to be regenerated too, it’s not a simple process.
Anyway, great job Sitebulb – I look forward to seeing how the tool progresses.