As we approach the end of 2017, I would like to reflect on where search marketing is and where I think it is going. You will find plenty of articles covering the same future trends in organic search: artificial intelligence becoming more relevant, voice search playing an essential role in how we optimize our sites, and user experience remaining critical to success. While I agree with all of these assessments, I don’t see articles that talk about log file analysis, so I want to add my suggestion to the discussion.
Log files can boost your SEO to the next level in ways you can’t imagine.
Why Should You Review Your Log File for SEO?
Think of log file analysis for search engine optimization like this.
Imagine spending months researching a topic to drive more traffic to your site, with countless hours poured into keyword research and content revisions for the project. When the material finally launches onto the web, you wait eagerly for all of that hard work to pay off. After weeks of tracking keyword rankings, traffic, and conversions, you notice no uplift. You head over to Google Search Console to check whether the URL is generating any impressions, but nothing is there. Back to the drawing board for the next piece of “SEO killer content,” right?
The one step that gets skipped is confirming whether Google and Bing ever saw this content and whether they can crawl it. Relying on Google Search Console and Bing Webmaster Tools alone is not good enough: these tools don’t show you, request by request, which pages Google and Bing actually crawl. A log file does.
What Does a Log File Look Like?
Before we get into how to use one of these files for SEO work, we need to know what a log file is. I use GoDaddy to host my website, and they have a section in their cPanel suite where I can review my site’s logs. I combine the raw files there with the Screaming Frog Log File Analyser. This tool from Screaming Frog can take hundreds of gigabytes worth of data and turn it into a dashboard where you can review the results from a log. Here is an example of what a raw file looks like.
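A GoDaddy cPanel server, like most Apache-based hosts, writes one line per request in the combined log format: the visitor’s IP, the timestamp, the request itself, the response code, the bytes sent, the referrer, and the user agent. The line below is an illustrative sample rather than an entry from my own logs (the IP and URL are made up), but the structure is exactly what you will see:

```
66.249.66.1 - - [12/Nov/2017:06:25:24 -0500] "GET /services/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```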
I analyze over 130 gigs of log files each month for a large e-commerce site, so the tool handles scale well. Screaming Frog’s software runs on Java, so make sure your computer has enough memory before loading log files of that size.
Here is what the report from Screaming Frog looks like when I import a log file into it. Trends and data instantly appear because the tool is reading each line as a separate event.
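That one-event-per-line structure is also why the data is easy to work with outside the tool. As a minimal sketch of the idea (assuming the combined log format shown above, and in no way a replica of how Screaming Frog works internally), here is a Python parser that turns each line into a structured event:

```python
import re
from collections import Counter

# Regex for the Apache combined log format shown above: IP, identity,
# user, timestamp, request line, status, bytes, referrer, user agent.
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def events(path):
    """Yield one structured event per log line, skipping malformed lines."""
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            match = LINE.match(line)
            if match:
                yield match.groupdict()

# Example: count response codes for Googlebot requests only.
codes = Counter(e["status"] for e in events("access.log")
                if "Googlebot" in e["agent"])
print(codes.most_common())
```

The later snippets in this post reuse this events() helper.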
How to Extract a Raw Log File
Depending on how your host exports the files, you may need to extract the data before you can read it. I use 7-Zip, where I can right-click a file to extract the logs. In the picture below, you will see a before-and-after shot of the extraction. Notice how you can read the data on the right after you use this tool.
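If you would rather script this step than right-click each archive, compressed logs can also be read directly. This is a small Python alternative to the 7-Zip step, assuming your host delivers the logs gzip-compressed (a common default); the filename is a placeholder:

```python
import gzip

# Read a gzip-compressed access log without extracting it to disk first.
# "access.log.gz" is a placeholder; use the filename your host exports.
with gzip.open("access.log.gz", "rt", encoding="utf-8", errors="replace") as f:
    for line in f:
        print(line.rstrip())
```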
How to Leverage a Log File for SEO?
Here are the main things I look at when I review log files in the tool. The first is the response code section, switched over to the tree view. The tree view is important because it shows me how Google and Bing see my site from a structural standpoint. This view can also surface orphan pages, so it becomes compelling in that regard as well.
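You can approximate that tree view yourself by rolling each logged URL up to its top-level folder and tallying response codes per folder. This sketch reuses the events() helper from earlier and is an illustration of the idea, not a recreation of Screaming Frog’s report:

```python
from collections import Counter, defaultdict

# Roll each logged URL up to its top-level folder and tally the
# response codes per folder; uses the events() helper defined earlier.
tree = defaultdict(Counter)
for e in events("access.log"):
    folder = "/" + e["url"].lstrip("/").split("/")[0]
    tree[folder][e["status"]] += 1

for folder in sorted(tree):
    print(folder, dict(tree[folder]))
```

To hunt for orphan pages, compare the URLs that show up here against the URLs in your sitemap and your site crawl; a page that the bots request but that nothing links to is a candidate.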
One tip: Google and Bing will request the robots.txt file and the sitemap as often as they can. That is certainly the case in my log file; both bots hit these files before crawling my core pages. If you do not have either of these pieces on your website, make fixing that a priority.
A second tip is to reference your sitemap in the robots.txt file. Some experts say that you don’t need to, but I disagree. If Google and Bing visit the robots.txt file first, I want to give both bots the most explicit path to my essential pages. Having a clean sitemap plays a significant role in this strategy.
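Adding that reference takes one line. The Sitemap directive is supported by both Google and Bing; the domain below is a placeholder for your own:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```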
My third tip is to use this view to review the most important sections of your site. After Google and Bing look at the robots.txt file and sitemap, I want them to focus their attention on my service pages. You can see that I expand the services folder to open up all of the pages underneath it. From this view, I found a page that holds zero value and should 301 redirect to the main service page. The page returns a 200 status code, which means it is accessible, but it serves zero purpose on my website. I would never have noticed this URL otherwise because it does not appear in the posts list of my WordPress dashboard, so I already have something to work on next from this log file.
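On Apache hosting (which GoDaddy’s cPanel plans typically run), that 301 is a single line in the site’s .htaccess file. The paths below are placeholders standing in for the stray page and my main service page:

```apache
# Permanently redirect the zero-value page to the main service page.
Redirect 301 /services/stray-page/ https://www.example.com/services/
```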
How to Focus Google and Bing?
With the picture above in mind, I want to focus Google and Bing on my service pages. I mentioned the importance of the sitemap earlier in this post, so that stays the primary lever, but I will test these options too:
Change the crawl frequency hint for a page or pages in the XML sitemap (see the snippet after this list)
Manually fetch these pages in Google Search Console and Bing Webmaster Tools
Add more content to this page and republish it
Create backlinks for this page
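For the first option, the sitemap protocol exposes this hint through the changefreq element; search engines treat it as a hint rather than a command. The URL and values here are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2017-11-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```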
Calculate Changes to Measure Your KPIs
With a log file, you can see how often Google and Bing put a particular page, or set of pages, on their crawl path. Pair that with a tool like Google Search Console or Bing Webmaster Tools and you can measure SEO improvement over time. Here is an example of the impression growth my service pages have seen. With SEO, we know it takes a long time to drive traffic, so I look at impression data to see if I am moving in the right direction. With the growth of this trend line, I feel confident that my efforts are working.
Here is another tool that I use to measure my visibility in Google Search and Google Maps. I have this set to local search, so it is incredibly important to my local business. The tool is GeoRanker if you want to check it out.
The way to measure this growth in both rank and impressions is to tie it back to the log file. Here is an example of clicking on one of my service pages. At the bottom of the log file tool, I can review a timeline of how often Google and Bing crawl this webpage. A tip is to record these data points monthly to show growth in how often Google and Bing crawl these pages. In my data, the more often a page is crawled, the better it ranks on my website.
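Recording those monthly data points is easy to script. The sketch below reuses the events() helper again and counts Googlebot and Bingbot requests per month for a single page; the TARGET URL is a placeholder for whichever page you track:

```python
from collections import Counter
from datetime import datetime

# Count Googlebot/Bingbot requests per month for one page, using the
# events() helper defined earlier. TARGET is a placeholder URL.
TARGET = "/services/"
monthly = Counter()

for e in events("access.log"):
    agent = e["agent"].lower()
    if e["url"] == TARGET and ("googlebot" in agent or "bingbot" in agent):
        # Apache timestamps look like 12/Nov/2017:06:25:24 -0500
        stamp = datetime.strptime(e["time"].split()[0], "%d/%b/%Y:%H:%M:%S")
        monthly[stamp.strftime("%Y-%m")] += 1

for month in sorted(monthly):
    print(month, monthly[month])
```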
Look for Spam and Block It
Security needs to be part of your analysis when using this tool too. Monthly checks of the IPs visiting you are essential to making sure spam bots do not hit your servers. One of my complaints with GoDaddy hosting is that they group IPs, so analyzing individual addresses is tricky. I do have accounts I work on that show all IP addresses, so I can narrow down an issue by bulk uploading the addresses into a third-party tool for review.
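When you do have per-request IPs, two checks cover most of this work: rank the IPs by request volume, and reverse-DNS any visitor claiming to be Googlebot or Bingbot, since both Google and Bing document reverse DNS as the way to verify their crawlers. A sketch, again built on the events() helper:

```python
import socket
from collections import Counter

# Rank visiting IPs by request volume to spot aggressive visitors,
# using the events() helper defined earlier.
hits = Counter(e["ip"] for e in events("access.log"))
for ip, count in hits.most_common(10):
    print(ip, count)

def is_real_crawler(ip):
    """Reverse-DNS check for a claimed Googlebot/Bingbot. A complete
    verification also resolves the hostname back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False
    return host.endswith((".googlebot.com", ".google.com", ".search.msn.com"))
```

Once you confirm an address is a fake bot, you can block it in .htaccess or through your host’s firewall.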
Google and Bing take the security of a website into consideration in their algorithms. If they suspect a website is serving content injected from a third-party source, they can deem it unsafe for their audience. Take the time to review the IP addresses hitting your site on a monthly basis to make sure it stays clean. This practice also spares your server unnecessary bandwidth, which in turn helps your load time for your audience.
Conclusion
Log file analysis is something I believe will separate SEO strategies moving forward. Reviewing log files is a technique that does not get a lot of coverage from places like Search Engine Land and Search Engine Journal, but I hope it will get more traction from blog posts like this one. It is a direct way to boost the SEO efforts for your website.
If you are looking for log file analysis of your website for SEO success, I offer New York SEO Expert services that will cover this.