
SEO Site Audit – Your Online Brand

May 3, 2018 | Technology, Your Online Brand


Hello, Max Duchaine here for the Society of American Florists. Back with another tech video, and today we’re talking about SEO, specifically how to do an SEO site audit. Why do you need to do a site audit? Well, in the world of search engine optimization, you can’t optimize what you don’t recognize.

Bottom line, a site audit is the first thing you want to do when optimizing your website for search. It’s a great way to take an objective look at your website from a technical standpoint, helping you identify problem areas and patterns that will guide your optimization down the road.

We aren’t going to be performing any large-scale fixes today; in fact, the bulk of what we’re going to be doing is for informational purposes only. Think of it as a site diagnostic: an examination of the technical details of your website that search engines take into account when determining how to treat your site and your content.

So let’s get started.

The first thing we want to do is perform a site crawl. For all intents and purposes, a site crawl for an SEO audit is a replication of the process that search engines use to index a website, and it will help you get a full view of your site from an SEO perspective. A site crawl doesn’t fix anything on its own, but it helps you identify glaring issues and gives you the information necessary to make repairs. To do this, we need dedicated software that can mimic the behavior of a search engine. I like Screaming Frog SEO Spider; in fact, I highly recommend downloading the free version, which lets you crawl up to 500 URLs per domain and should be more than enough for a small to medium-sized business. Screaming Frog is easy to use and does just about everything you need for a site audit: it crawls a provided domain, analyzes your on-site SEO, and identifies many issues that may negatively affect your SEO, including broken links, missing meta tags, duplicate content, and other errors.
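If you’re curious what a crawler like this is doing under the hood, here’s a minimal sketch in Python. It’s only an illustration of the idea, not a replacement for Screaming Frog, and the starting URL is a placeholder you’d swap for your own domain.

```python
# Minimal illustrative site crawler: fetch same-domain pages, note the HTTP
# status, the <title>, and whether a meta description is present.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

START = "https://www.example.com/"   # placeholder: use your own domain
MAX_PAGES = 50                       # keep the sample crawl small

class PageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.title, self.description = [], "", ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

seen, queue = set(), deque([START])
while queue and len(seen) < MAX_PAGES:
    url = queue.popleft()
    if url in seen:
        continue
    seen.add(url)
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status, html = resp.status, resp.read().decode("utf-8", "ignore")
    except Exception as exc:
        print(f"ERROR  {url}  ({exc})")
        continue
    page = PageParser()
    page.feed(html)
    print(f"{status}  {url}  title={page.title.strip()!r}  "
          f"meta description: {'yes' if page.description else 'MISSING'}")
    for link in page.links:
        absolute = urljoin(url, link)
        if urlparse(absolute).netloc == urlparse(START).netloc:
            queue.append(absolute)
```

A real crawler layers much more on top of this (duplicate-content checks, redirect chains, image alt text, and so on), which is exactly why a dedicated tool is worth downloading.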

I’m going to run a quick site crawl of the SAF website, just to show you how it’s done and so you can get a sense of the kind of data that a site crawl gives us. I’m using Screaming Frog SEO Spider. All you have to do is type the URL of your website into the box, hit start and let the crawl run.

The next thing we want to do is make sure you have access to your analytics. It’s so important to see how your website is attracting visitors and how those visitors are behaving once they reach your site. You can get this access easily by registering your website with Google’s and Bing’s webmaster tools, which give you data about visitors who arrive via each search engine. Why Google and Bing? Well, over 90% of all searches run through Google, with Bing/Yahoo making up a large portion of the rest, and you don’t want to ignore that remaining 10% of your search traffic.

You can register your pages with Google and Bing by going to their respective webmaster tools pages and following the directions. Google Webmaster Tools can be found at google.com/webmasters, and Bing’s are at bing.com/toolbox/webmaster. They make it very simple, and the information they give you about site traffic and behavior is incredible. Make sure you do this for each search engine.

Now we need to check for any accessibility bottlenecks that might prevent a search engine from accessing and indexing your website. There are a number of technical factors that determine how discoverable your website is to search engines, so we want to use the insight that our site crawl provided to take a deeper look. Your robots.txt file is a great example, and a logical place to start, because it essentially defines access permissions for your website: search engines are only allowed to crawl and index the pages that your robots.txt file permits. You can check your robots.txt file manually, or use your site crawl data or Google Webmaster Tools to see which pages are being restricted.
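If you’d rather spot-check this yourself, Python’s standard library can read a live robots.txt and tell you whether a given page is blocked. This is just a sketch; the domain and the list of pages are placeholders.

```python
# Spot-check whether robots.txt blocks pages you care about.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")  # placeholder domain
rp.read()  # download and parse the live robots.txt

# Pages you definitely want search engines to reach (placeholders)
important_pages = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

for page in important_pages:
    # "*" means "any crawler"; you can also test "Googlebot" or "Bingbot"
    allowed = rp.can_fetch("*", page)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {page}")
```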

While we have our site crawl data still available, let’s see if we have any broken links or HTTP errors. What you want to be on the lookout for are 4xx codes, which indicate a client error (like the familiar 404 “page not found”), and 5xx codes, which indicate a server error. If you see any of these codes, you’ll know which pages crawlers are having a hard time reaching. Go ahead and fix these errors on your website by either removing the link or creating a 301 redirect to the correct page or location.
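As a quick illustration, here’s a small Python sketch that checks a handful of URLs (say, ones your crawl flagged) and labels any 4xx or 5xx responses. The URLs are placeholders.

```python
# Flag client (4xx) and server (5xx) errors for a short list of URLs.
import urllib.error
import urllib.request

urls_to_check = [                      # placeholders: paste in your own URLs
    "https://www.example.com/old-page/",
    "https://www.example.com/valentines-specials/",
]

for url in urls_to_check:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{resp.status}  OK            {url}")
    except urllib.error.HTTPError as err:
        kind = "CLIENT ERROR" if 400 <= err.code < 500 else "SERVER ERROR"
        print(f"{err.code}  {kind}  {url}  -> remove the link or add a 301 redirect")
    except urllib.error.URLError as err:
        print(f"----  UNREACHABLE   {url}  ({err.reason})")
```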

Another key technical item that a search engine uses to navigate your website is the XML sitemap, a document that lives on your website and essentially serves as a roadmap for crawlers. Unlike the robots.txt file, which tells the search engine where it can and can’t go, the sitemap tells search engines how to get there. There are plenty of tools out there that help you diagnose problems with your sitemap, but as a general rule of thumb, it’s a good idea to make sure that Google has a copy of your sitemap by submitting the file directly via its webmaster tools.
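Most website platforms generate the sitemap for you, but it helps to know what the file actually contains. Here’s a sketch that builds a bare-bones sitemap.xml for a few placeholder pages, just to show the structure search engines expect.

```python
# Build a minimal sitemap.xml (most CMSes do this for you automatically).
import xml.etree.ElementTree as ET

pages = [                               # placeholders: list your real pages
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(open("sitemap.xml").read())
```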

Now that the search engines have a roadmap of your website, we need to check to make sure that the road is as smooth as possible by examining your site architecture. Plain and simple, your site architecture is the way your website is structured. For this, you don’t necessarily need your site crawl data or any third party tool. You just have to visit your website and start clicking around. You should already have a pretty good idea of how your website is structured, but it’s still a good idea to take a focused look around your site. Take note of the important pages, like your homepage, pages for different services you provide and product categories. It even helps to draw up a diagram that shows your site architecture. As a general rule, search engines like to see a “flatter” website structure, as opposed to deeply-nested pages that require a million clicks to get to. Think logically here. We want to make it as easy as possible for visitors to find what they’re looking for on your website. Remember, if it’s easy for a human to find, it’s easy for a site crawler to find.
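If you want to put a number on “flatness,” you can compute click depth from your homepage. Here’s a tiny sketch using made-up pages; in practice you’d build the link map from your site crawl export.

```python
# Click depth: how many clicks from the homepage does it take to reach each page?
from collections import deque

links = {                                # made-up example link map
    "/": ["/services/", "/about/"],
    "/services/": ["/services/weddings/", "/services/sympathy/"],
    "/about/": [],
    "/services/weddings/": ["/services/weddings/gallery/"],
    "/services/sympathy/": [],
    "/services/weddings/gallery/": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for child in links.get(page, []):
        if child not in depth:
            depth[child] = depth[page] + 1
            queue.append(child)

for page, d in sorted(depth.items(), key=lambda item: item[1]):
    note = "  <- consider making this easier to reach" if d >= 3 else ""
    print(f"{d} clicks  {page}{note}")
```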

Another very important and sometimes overlooked detail that can have drastic implications for your SEO is website performance. How quickly does your site load? Search engines want to make sure their users have the best possible experience when using their products, so you can bet that they take page load times into account when ranking pages. The easiest way to test your site’s performance is to use an online speed test or performance tool to see how quickly your website loads. I use Google’s PageSpeed Insights because it not only grades your site’s performance but also shows you areas for improvement and tells you how to fix them.
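PageSpeed Insights is the right tool for the full picture, but if you just want a rough number, you can time how long a page’s HTML takes to download. This sketch uses a placeholder URL and only measures the raw download, not images, scripts, or rendering.

```python
# Rough load-time check: time how long it takes to download a page's HTML.
import time
import urllib.request

URL = "https://www.example.com/"   # placeholder: use your own page

start = time.perf_counter()
with urllib.request.urlopen(URL, timeout=30) as resp:
    body = resp.read()
elapsed = time.perf_counter() - start

print(f"Fetched {len(body) / 1024:.0f} KB of HTML in {elapsed:.2f} seconds")
if elapsed > 2:
    print("On the slow side -- run the page through PageSpeed Insights for specifics.")
```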

The next step is to investigate your website’s indexability. Now that we know how accessible your website is to search engines, we want to see how much of it crawlers are actually indexing after they find it. Lucky for us, Google gives us the ability to see how many pages on a domain it has been able to crawl and index using a very simple search query called the “site:” command. Just type “site:yourdomain.com” (no space after the colon) into a Google search and look at the number of results returned. This is only a rough estimate, but if you compare this number to the number of pages on your website, you can determine whether search engine crawlers are not only accessing but also indexing your entire site. If you still aren’t sure whether a search engine has indexed a specific page on your website, just copy and paste the page’s URL into a search query and see if it’s returned by the search engine.
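One way to get the “number of pages on your website” side of that comparison is to count the URLs in your sitemap. Here’s a sketch that does just that; the sitemap URL is a placeholder, and the site: count still comes from reading the Google results page.

```python
# Count the URLs in your sitemap to compare against the site: result count.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"   # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    tree = ET.parse(resp)

urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
print("Compare that to the result count Google shows for:  site:example.com")
```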

As a good next step, you should do a brand search. Ever Google yourself? Of course you have, but now we’re doing it with a real purpose. Perform a simple brand search using the name and location of your business and check to make sure that your website is showing up towards the top of the page. The reason we use the location is to filter out any other businesses that might be using the same business name as yours, assuming that they aren’t located in the same city or town as you. If your website is absent after a brand search, it means you have some work to do.

One thing that you can do today to help improve your brand’s presence on search engine results pages, or SERPs, is to engage in what search engine optimizers call “SERPs domination”. This is the practice of determining which pages or websites you want to show up during a brand search and then aligning the content of these pages to encourage search engines to find and return them instead of anything else. The ultimate goal here is to have brand-controlled, or at least positive, pages at the top of the organic search results. Your homepage, your Facebook profile, your Twitter profile, Instagram, business listing on local review websites like Yelp; these are the pages that you want your customers to find when they go looking for you. It takes a lot more than just making sure these pages are created. You need to make sure they’re relevant as well. Remember, search engines look for content that is relevant and popular when determining which pages to rank higher, so here is how to do that.

Make sure that all of the information on these pages that you have control over (the business name, page bios, descriptions, business locations, phone numbers, etc.) is accurate and up-to-date across all pages. Build out your social profiles so that they’re robust and informative and describe what you do and who you do it for. When Google or any other search engine is looking for relevant content, we want these pages to check that box first.
Link to these pages whenever and wherever you see an opportunity. The goal here is to drive as much traffic as possible to your homepage, social media profiles, review sites, etc. Then, participate in these pages. If it’s a social network, make sure you’re on top of your posting. Encourage your customers to participate by following your profiles and leaving positive reviews on Yelp and Google. Traffic equals popularity, and remember, that’s the second part of what Google is looking for when filling up the SERPs.

Be sure to check back in the near future for a complete, step-by-step guide to local SEO, including claiming your business on Google, setting up your social media profiles and how to get started with Yelp and FourSquare.

Before we start claiming our business listings and setting up profiles, it’s a good idea to get organized by writing out the criteria that should be consistent across all of your profiles. That way, when you start setting them up, you have a master list that can help ensure consistency.
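One simple way to keep that master list honest is to store it as a single structured record you copy from every time you set up or update a profile. The details below are made up, just to show the idea.

```python
# A single source of truth for the business details every profile should match.
business = {
    "name": "Main Street Flowers",                    # made-up example values
    "address": "123 Main Street, Springfield, VA 22150",
    "phone": "(555) 555-0123",
    "website": "https://www.example.com",
    "hours": "Mon-Sat 9am-6pm",
    "description": "Family-owned florist serving Springfield since 1985.",
}

for field, value in business.items():
    print(f"{field:12} {value}")
```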
If you haven’t done so already, start by claiming your business on Google My Business. The process is fairly straightforward and can help the search engine connect your online profiles to your business and return them when someone is looking for you online. You can find the full instructions at google.com/business.

So there you have it. You just did an SEO site audit, and you helped improve your brand’s presence on the search engine results pages! Remember, there is still a lot that we can do with this data, so sit tight, because in our next video, we’re going to show you how to start optimizing the on-page elements to improve your website’s SEO.

As always, if you have any questions, comments or have an idea for a tech topic that you want to learn more about, drop us a line by sending an email to techtopics@safnow.org. Once again, this is Max Duchaine for the Society of American Florists wishing you a great week and all the best. See you soon.
