How to Hack SEO to the Next Level Using a Log File

As we approach the end of 2017, I would like to reflect on where search marketing is and where I think it is going. You will find plenty of articles covering the same future trends in organic search: how artificial intelligence will become more relevant, how voice search will play an essential role in the way we optimize our sites, and how user experience will be critical to success. While I agree with all of these assessments, I don’t see many articles that talk about log file analysis, so I wanted to add my suggestion to the mix.

This is how to hack SEO to the next level: use a log file to make Google and Bing focus on your most important pages. Combined with any previous SEO work, I believe this strategy is a ticket for success. Let’s begin.

Why Should You Look at Your Log File for SEO?

Think of log file analysis for search engine optimization like this.

Imagine spending months researching a topic that you hope will drive more traffic to your site. Countless hours go into revising the content and performing keyword research for the assignment. When the material finally launches onto the world wide web, you wait eagerly for all of that hard work to pay off. After a few weeks and months of tracking keyword rankings, traffic, and conversions, you notice almost no uplift. You quickly head over to Google Search Console to see if the URL is gaining any impressions, and nothing seems to have changed. Back to the drawing board for the next piece of “SEO killer content,” right?

A lot of SEOs at this moment will be stuck and unsure what went wrong. You go back to the keywords you optimized for to make sure there were no drops in search interest. You might also check Google Trends to see if a seasonality factor is at play. Another favorite strategy is to immediately review the competition in the SERP and start building a new plan to reverse engineer their approach to success. The one step most people skip is checking whether Google and Bing ever saw this content and, more importantly, continued to crawl it. Using only Google Search Console and Bing Webmaster Tools is not good enough: those free tools tell you how many pages got crawled, but they don’t show which pages received the crawl activity. Questions like these are where log file analysis comes into play for search marketing.

What Does a Log File Look Like?

Before we get into how to use one of these files for SEO work, we should first look at what a log file is. I use GoDaddy to host my website, and they have a section in their cPanel suite where I can review my site’s logs. I combine the raw files there with a tool called the Screaming Frog Log File Analyzer, which can take hundreds of gigs of data and turn it into a simple dashboard that I will show later in this article. Here is what a raw log file looks like.

What Does a Raw Log File Look Like

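In case the screenshot is hard to read, here is an illustrative example of a single entry. This one follows the Apache combined log format that many shared Linux hosts use (the IP, URL, and byte count are made up):

```
66.249.66.1 - - [10/Dec/2017:06:25:24 -0500] "GET /services/ HTTP/1.1" 200 5124 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```

Each line records one request: the visitor’s IP address, the timestamp, the request method and URL, the response status code, the bytes sent, the referrer, and the user agent, which is what lets you spot Googlebot and Bingbot.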
When you take a sample of logs like this, you won’t learn that much, to be honest. You have to collect a few weeks’ or months’ worth, depending on how many hits your site gets, to get an accurate reading. I analyze over 130 gigs of log files each month for a large e-commerce site that earns over 200 thousand organic visits a month, and the tool works great. If you have more data than that, make sure your computer has the memory to handle it, since the tool runs on Java. Here is what the report from Screaming Frog looks like when I import a log file into it. Trends and data instantly appear because the tool reads each line as a separate event.

Log File Analysis using Screaming Frog

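If you are curious what a tool like this is doing under the hood, here is a minimal Python sketch of the same idea: treat each line as an event and tally Googlebot and Bingbot hits per URL. It assumes the Apache combined format shown earlier, so adjust the regex to whatever your host produces:

```python
import re
from collections import Counter

# Matches the Apache combined log format; adjust to your host's format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def crawl_counts(path):
    """Tally hits per URL for Googlebot and Bingbot in a raw log file."""
    counts = {"Googlebot": Counter(), "Bingbot": Counter()}
    with open(path, errors="replace") as f:
        for line in f:                    # each line is one request event
            m = LOG_PATTERN.match(line)
            if not m:
                continue                  # skip lines in another format
            agent = m["agent"].lower()
            if "googlebot" in agent:
                counts["Googlebot"][m["url"]] += 1
            elif "bingbot" in agent:
                counts["Bingbot"][m["url"]] += 1
    return counts

if __name__ == "__main__":
    for bot, urls in crawl_counts("access.log").items():
        print(bot, urls.most_common(10))  # ten most-crawled URLs per bot
```

Keep in mind that user agents can be faked, which is why the spam section at the end of this post matters.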
How to Extract a Raw Log File

Depending on your host provider and the way the files come out, you will need to extract the data before you can read it. I use a free program called 7-Zip: right-click on the file and extract the log to a readable format. In the picture below, you will see a before-and-after shot of the extraction. Notice how you can read the data on the right after using this tool.

Before and After Extracting a Log File using 7 Zip

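If you end up scripting your analysis, you can often skip the manual extraction step entirely. Here is a short Python sketch, assuming your host delivers the logs gzip-compressed as .gz files:

```python
import gzip

# Read a gzipped access log line by line without extracting it first.
with gzip.open("access.log.gz", mode="rt", errors="replace") as log:
    for line in log:
        print(line, end="")  # or feed each line straight into your parser
```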
How to Leverage a Log File for SEO?

Screaming Frog has a lot of sections built into it, but I am going to go over the main ones I look at for this blog post. The first thing I do is head over to the response codes section and switch the view to the tree view. The tree view is important because I need to make sure that Google and Bing see my site from an optimized structural standpoint. I am more concerned that both bots spend their time on my top pages than on a random blog post that gets zero traffic.

Most Important Sections to Look at for Log File Analysis

One tip for anyone new to SEO: Google and Bing will look at the robots.txt file and sitemap as much as they can. You can see that is the case in the log file for my website. Google and Bing both go to these sections before heading over to any core pages. If you do not have either of these files on your website, you should make fixing that a priority. You can also use the log file tool to see if both search engines crawl dynamic URLs on your site and, if so, block them in the robots.txt file. An e-commerce site is destined to have faceted versions of pages, so a log file can show all the variations being crawled, which wastes crawl budget (see the sketch below).

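To make that concrete, here is a hedged robots.txt sketch. The Disallow patterns are hypothetical examples of faceted and dynamic URLs; derive the real ones from whatever your log file shows being crawled. The Sitemap line ties into the next tip:

```
User-agent: *
# Hypothetical faceted/dynamic URL patterns that waste crawl budget
Disallow: /*?sort=
Disallow: /*?color=
Disallow: /*&sessionid=

Sitemap: https://www.example.com/sitemap.xml
```

Both Google and Bing honor the * wildcard shown here, but test any pattern in Google Search Console’s robots.txt tester before deploying it, since one bad rule can block real pages.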
A second tip I would like to offer is to reference your sitemap in the robots.txt file, as shown in the sketch above. Some experts say it is not needed, but I disagree. If Google and Bing visit the robots.txt file first, I want to give both bots the most explicit path to my essential pages. An optimized sitemap is something you need to focus on because the priority setting plays a role in how often a set of pages gets crawled by Google and Bing. I will get into that more later on, but remember that the robots.txt file and sitemap need to be on your site to have success in SEO.

My third tip in this view is to review the most important sections of your site. After Google and Bing have looked at the robots.txt file and sitemap, I want them to focus their attention on my service pages. You can see below that I expanded the services folder to open up all of the pages that fall underneath it. Immediately I can see that a page exists that holds zero value and should be 301 redirected to the main service page. This page comes back with a 200 status code, which means it is accessible but serves zero purpose on my website. I would not have spotted this URL otherwise because it does not appear in the posts in my WordPress dashboard, so I already have something to work on from this log file.

Deeper Log File Analysis

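On an Apache host like mine, the fix for a stray page like that can be a one-line 301 in the .htaccess file. The paths here are hypothetical; substitute the URL your own log file surfaces:

```
# Permanently redirect the zero-value page to the main service page
Redirect 301 /services/old-duplicate-page/ https://www.example.com/services/
```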
How to Focus Google and Bing?

With the picture above in mind, I want to focus on my service pages. I mentioned the importance of the sitemap earlier, so I will keep leaning on it to have those pages crawled more often. I will also test a few other options, which I have listed below.

Change the frequency of the page or pages in the XML sitemap (see the sketch after this list)

Manually fetch the page in both Google Search Console and Bing Webmaster Tools

Add more content to the page and republish it

Build backlinks to the page, or create internal links that point to it

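Here is what the first option looks like in practice: a minimal sitemap entry with the changefreq and priority fields discussed above. The URL and values are illustrative, and both search engines treat these fields as hints rather than commands:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2017-12-01</lastmod>
    <changefreq>weekly</changefreq>  <!-- nudge bots to revisit more often -->
    <priority>0.9</priority>         <!-- flag this page as important -->
  </url>
</urlset>
```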
Calculate Changes to Measure Your KPIs

At the start of this blog post, I gave the scenario of content that we spent a lot of time and money creating but that produced no improvement in organic traffic. With a log file, you can see how often that particular page or set of pages got crawled by Google and Bing. If you pair the log file with a tool like Google Search Console or Bing Webmaster Tools, you can measure SEO improvement over a timeframe. Here is an example of the growth of my service pages in terms of impression data. With SEO, we know it is going to take a long time to drive actual traffic, so I like to look at impression data in Google Search Console to see if I am gaining traction. With the growth of this trend line, I feel confident that my efforts are working.

How to Use a Log File to Improve Your SEO Performance

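If you prefer to pull that impression trend programmatically instead of from the dashboard, the Search Console (Webmasters v3) API can return it. This is a minimal sketch, assuming you have a Google service account JSON file with read access to the property; the file name, site URL, and dates are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder: your own credentials file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("webmasters", "v3", credentials=creds)

# Daily impressions for URLs containing /services/ over three months.
response = gsc.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2017-09-01",
        "endDate": "2017-11-30",
        "dimensions": ["date"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "contains",
                "expression": "/services/",
            }]
        }],
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], int(row["impressions"]))
```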
Here is another tool I use to measure my visibility in Google Search and Google Maps. I have it set to local search, which makes it incredibly important for my business. The tool is called GeoRanker.

Better Ranks in Google and Google Maps using a Log File

The way to measure this growth in both rank and impressions is to tie it back to the log file. Below is an example of me clicking directly into one of my service pages. At the bottom of the log file tool, I can review a timeline of how often Google and Bing looked at this webpage. A tip I would give is to record these data points monthly to show growth in how often Google and Bing crawl these pages. You can then show how an increase in crawl activity improved and accelerated your growth in rank and impression data.

How to Drilldown into a URL in a Log File

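You can record the same timeline yourself straight from the raw logs. This sketch repeats the combined-format regex from the earlier Python example and buckets one URL’s bot hits by day; the file name and URL are placeholders:

```python
import re
from collections import Counter
from datetime import datetime

LOG_PATTERN = re.compile(  # same combined-format regex as the earlier sketch
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def crawl_timeline(path, url):
    """Daily Googlebot/Bingbot hit counts for a single URL."""
    timeline = Counter()
    with open(path, errors="replace") as f:
        for line in f:
            m = LOG_PATTERN.match(line)
            if not m or m["url"] != url:
                continue
            agent = m["agent"].lower()
            if "googlebot" in agent or "bingbot" in agent:
                stamp = m["time"].split()[0]             # 10/Dec/2017:06:25:24
                day = datetime.strptime(stamp, "%d/%b/%Y:%H:%M:%S").date()
                timeline[day] += 1
    return timeline

# Record these numbers monthly to chart crawl growth for a key page.
for day, hits in sorted(crawl_timeline("access.log", "/services/").items()):
    print(day, hits)
```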
Look for Spam and Block It

In this blog post, I have shown how to use a log file to better understand what Google and Bing look at when they come to my website. Security also needs to be part of your analysis when using this tool. Monthly checks of the IPs visiting your site are essential to making sure spam bots do not hit your servers. One of my complaints with GoDaddy hosting is that they group a lot of IPs together, so it becomes tricky to analyze them fully. On other accounts I work on that do show all of the IP addresses, I can narrow down an issue by bulk uploading the addresses into a third-party tool for review. Here is an example of something I saved in my OneNote a few months ago, where I spotted a suspicious IP address in my logs.

Block Suspicious IP Addresses

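One check worth automating while you review those IPs: a visitor claiming to be Googlebot or Bingbot should pass a reverse-and-forward DNS test, which is the verification method both Google and Bing document for their crawlers. A minimal Python sketch, with an illustrative IP:

```python
import socket

# Domains that genuine crawler IPs should reverse-resolve to.
CRAWLER_DOMAINS = (".googlebot.com", ".google.com", ".search.msn.com")

def is_real_crawler(ip):
    """Reverse DNS the IP, then forward-confirm the hostname matches it."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # e.g. crawl-66-249-66-1.googlebot.com
        if not host.endswith(CRAWLER_DOMAINS):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward lookup must agree
    except OSError:                         # lookup failed: treat as unverified
        return False

print(is_real_crawler("66.249.66.1"))       # a faked user agent fails this test
```

Once an IP fails verification, you can deny it in your .htaccess file or ask your host to block it at the firewall.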
This type of prevention is not spoken about enough in the SEO community. When Google introduced the disavow tool, SEO marketers spent time reviewing their backlinks and disavowing them in Google and Bing so as not to be penalized. Another thing Google and Bing factor into their algorithms is whether a website appears to contain hacked content. It is not always obvious that your content has been hacked, but comment spam and injected material on a site can still raise an algorithmic concern that hurts your rankings. Take the time to review the IP addresses on a monthly basis to make sure your website is as clean as it can be. This practice will also spare your server unnecessary bandwidth stress, which in turn helps your load time for your audience.

Conclusion

Log file analysis is something I believe will separate SEO strategies moving forward in 2018 and beyond. Reviewing log files is a technique that does not get a lot of coverage from places like Search Engine Land and Search Engine Journal, but I hope it will gain more traction from blog posts like this one. This strategy works as a great way to hack your efforts for better SEO performance. You can also import a Screaming Frog crawl into the log file tool to see if some pages never get looked at by Google and Bing.

If you are looking for log file analysis of your website for SEO success, I offer freelance SEO services that cover this.
