My Google ranking has dropped, and I don’t know what to do next. Why did my website drop in Google? My website traffic is down, and I have no idea why. These are common scenarios and questions I receive from people looking for NY SEO Expert services. In this post, I’ll share how I diagnosed a traffic drop, made changes, and rebounded my organic traffic.
If it is easier to follow, I recorded a video below that walks through these steps visually.
How to Diagnose a Traffic Drop
Google Analytics is a primary tool I use for SEO. If I notice my traffic is flattening or trending down, I’ll investigate further. In Google Analytics, I came across a drop in organic traffic to my Microsoft Rewards Guide blog post. I found this by going to the landing page report in the Behavior section of Google Analytics. By switching to a weekly view, I was able to spot the trend, which was the drop. In the picture below, I’m showing both the decline and the recovery in organic traffic to set up the story I tell later in this post.
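If you would rather sanity-check that trend outside of the Google Analytics interface, a minimal sketch like the one below works on a weekly export of the landing page report. The file name, column names, and the 20% threshold are placeholders I made up for illustration.

```python
import pandas as pd

# Hypothetical CSV exported from the Google Analytics landing page report,
# filtered to organic traffic and grouped by week.
# Assumed columns: "week" (e.g. 2018-22) and "sessions".
df = pd.read_csv("organic_sessions_by_week.csv")

# Week-over-week percentage change in organic sessions.
df["wow_change"] = df["sessions"].pct_change() * 100

# Flag any week that dropped more than 20% versus the prior week
# (the threshold is arbitrary; tune it to your traffic volume).
drops = df[df["wow_change"] < -20]

print(drops[["week", "sessions", "wow_change"]])
```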
Once I confirmed that this specific blog post was driving less traffic than before, I went over to Google Search Console to see which keywords were dropping. Google Search Console displays the organic keywords that a web page ranks for. A drop in impressions is a clear signal that a keyword is losing rank position. For example, if a keyword drops from page one of Google to page two, its impressions might fall drastically, since most people don’t go beyond page one. You can play around with the compare feature to see which keywords ranked better or worse, but I’m just going to show the before and after in Search Console to illustrate the improvement I made.
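The same before-and-after comparison can also be pulled programmatically from the Search Console (Webmasters) API. Here is a minimal sketch, assuming you have a service account with read access to the property; the key file, property URL, page URL, and date ranges are all placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("sc-key.json", scopes=SCOPES)
service = build("webmasters", "v3", credentials=creds)

SITE = "https://www.example.com/"                              # placeholder property
PAGE = "https://www.example.com/microsoft-rewards-guide/"      # placeholder page URL

def impressions_by_query(start, end):
    """Return {query: impressions} for the page over a date range."""
    body = {
        "startDate": start,
        "endDate": end,
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{"dimension": "page", "operator": "equals", "expression": PAGE}]
        }],
        "rowLimit": 250,
    }
    rows = service.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])
    return {r["keys"][0]: r["impressions"] for r in rows}

before = impressions_by_query("2018-03-01", "2018-03-31")   # placeholder "before" window
after = impressions_by_query("2018-05-01", "2018-05-31")    # placeholder "after" window

# Queries that lost the most impressions between the two windows.
losses = sorted(before, key=lambda q: after.get(q, 0) - before[q])
for q in losses[:20]:
    print(f"{q}: {before[q]} -> {after.get(q, 0)} impressions")
```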
How to Rebound Your Traffic
Having seen the issue in Google Analytics and Google Search Console, I used SEMrush to rebuild my lost traffic. After creating a project dashboard, I used the On Page SEO Checker to see if there were ways to beef up my content. Below is a list of keywords I identified that I would like the blog post to rank for in Google. As you can see, I’m now including all of my keywords on the page, which is good. Initially, I was missing a few keyword phrases, which was hurting the organic performance of the page.
Another thing I look at is the TF-IDF (term frequency-inverse document frequency) report, which surfaces additional terms to optimize the page for. Including these terms essentially gives Google and Bing more context around your primary keywords. By adding more TF-IDF terms to a page, you give search engines more clues that you are thoroughly covering the topic.
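SEMrush doesn’t document exactly how its TF-IDF report is built, but the underlying idea is straightforward to reproduce. Below is a minimal sketch, assuming you have saved the text of a few top-ranking competitor pages plus your own page as local files, that highlights terms the competitors weight heavily but your page barely uses.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Placeholder inputs: plain-text copies of a few top-ranking pages for the
# target keyword, with your own page last.
docs = [
    open("competitor_1.txt").read(),
    open("competitor_2.txt").read(),
    open("competitor_3.txt").read(),
    open("my_page.txt").read(),
]

vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
tfidf = vectorizer.fit_transform(docs)
terms = vectorizer.get_feature_names_out()

# Average TF-IDF weight of each term across the competitor pages, minus its
# weight on your page: high values are terms the competition emphasizes
# that your page barely uses.
competitor_avg = tfidf[:3].toarray().mean(axis=0)
my_weights = tfidf[3].toarray()[0]
gaps = sorted(zip(terms, competitor_avg - my_weights), key=lambda t: -t[1])

for term, gap in gaps[:25]:
    print(f"{term}: {gap:.3f}")
```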
The last thing I look at in this report is the semantic keyword section. Semantic keywords are what Google Hummingbird deals with: using related words on a page gives Google more context, which can help the page rank higher in the results. When you use semantic keywords on your page, you should understand that you won’t rank for those terms themselves. For example, my rewards post is about ways to earn more Microsoft Rewards points. By including a semantic term like “500 points,” I’m giving Google more context around earning points. I won’t rank for the keyword “500 points,” but that total is what you can earn from Microsoft on select days of the calendar.
Additional Things to Consider
Merely adding content to a page may not be enough, depending on the competition. Sometimes you have to look for other ways to optimize the material even further. SEMrush gives additional suggestions, like adding a video to the page, backlinks to build, improving the readability of the page, and much more. It’s best to look at each section to figure out whether there are other ways you can beef up the content and make the page more enjoyable for users when they visit.
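For the readability suggestion in particular, you can get a rough score without any paid tool. Here is a minimal sketch using the textstat package (my own choice for illustration, not something SEMrush exposes); the file name is a placeholder.

```python
import textstat

# Placeholder: load the body copy of the page you are optimizing.
page_text = open("my_page.txt").read()

# Flesch Reading Ease: higher is easier to read; roughly 60-70 reads as
# plain English for a general audience.
print("Flesch Reading Ease:", textstat.flesch_reading_ease(page_text))
print("Flesch-Kincaid Grade:", textstat.flesch_kincaid_grade(page_text))
```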
Create (More) Internal Links
Internal linking is an excellent SEO strategy. If you have a blog post that relates to another page, you should set up an internal link to it. For example, I frequently mention that rewards post when discussing SEO strategies I perform. The problem was that I was not always linking that anchor text back to my target page, which I would call a missed opportunity. By creating an internal link, I can drive more people to that page on my site while giving Google and Bing more chances to crawl it. To find opportunities to create internal links, I use a search command in Google. Below is an example of a site command that I performed. This site command tells Google to show only pages from my site that use my target keyword. The idea is that I can go into those blog posts and create an internal link back to my target page. You can imagine how powerful this strategy could be if your search brought back many results!
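The command itself looks something like site:yourdomain.com "microsoft rewards" (the domain here is a placeholder). If you also keep a local copy of your site’s pages, the same idea can be scripted: find pages that mention the target keyword but don’t yet link to the target URL. The folder, keyword, and URL below are assumptions for illustration.

```python
import os
from bs4 import BeautifulSoup

SITE_DIR = "site_export"                     # hypothetical local export of the site's HTML
KEYWORD = "microsoft rewards"                # keyword the target page ranks for
TARGET_URL = "/microsoft-rewards-guide/"     # page you want internal links to point at

for root, _, files in os.walk(SITE_DIR):
    for name in files:
        if not name.endswith(".html"):
            continue
        path = os.path.join(root, name)
        soup = BeautifulSoup(open(path, encoding="utf-8").read(), "html.parser")
        text = soup.get_text(" ").lower()
        links = [a.get("href", "") for a in soup.find_all("a")]
        # Mentions the keyword but never links to the target page:
        # an internal link opportunity.
        if KEYWORD in text and not any(TARGET_URL in href for href in links):
            print("Internal link opportunity:", path)
```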
Fetch the Page with Google and Bing
After adding content to the blog post, I need the changes to be seen by search engines. A fast way to have content updates seen by Google and Bing is to use Google Search Console and Bing Webmaster Tools. Both tools allow you to fetch any page on your site as their bot. The fetch request immediately sends the search bot over to your page to crawl and understand it. Just as a heads up, both tools have a daily limit on how many pages you can fetch.
Any pages that received a new internal link should be fetched in Google and Bing as well. The strategy here is to get Google and Bing to crawl the content and see the new internal link. Once they see it, you can expect both search engines to factor it into their rankings, because they are given more clues that your target page is about this precise topic.
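The fetch buttons are manual. As a complement (a different mechanism, not the fetch tool itself), both Google and Bing have also accepted a simple sitemap ping that asks them to re-read your XML sitemap after you publish changes. A minimal sketch, with a placeholder sitemap URL:

```python
import requests
from urllib.parse import quote

# Placeholder sitemap URL; swap in your own.
SITEMAP = "https://www.example.com/sitemap.xml"

# These pings only ask each engine to re-read the sitemap; they do not
# replace the page-level fetch tools described above.
for endpoint in (
    "https://www.google.com/ping?sitemap=",
    "https://www.bing.com/ping?sitemap=",
):
    resp = requests.get(endpoint + quote(SITEMAP, safe=""))
    print(endpoint, resp.status_code)
```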
Reduce Page Depth
John Mueller spoke a bit about click depth recently in a Google Hangout. Click depth is the number of clicks required for a user or a bot to reach a target page from the homepage. If you would like to hear what he said, I have a link here. You want your most critical content as close to the homepage as possible. Having essential content many clicks down from the homepage means search spiders crawl and index it less often. Search spiders mostly crawl a website from top to bottom, so it’s key to get top content as close to the homepage as possible. Some ways to get key content crawled more often are to include it in the XML sitemap and to adjust its change frequency there. You can also add internal links from top sections of your site back to your target page. The idea is that search spiders crawl pages close to the homepage more often, so they will see those links and travel to that specific page more frequently.
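Click depth is also easy to measure yourself. Below is a minimal sketch that crawls breadth-first from the homepage and reports how many clicks it takes to reach a target page; the start URL, target URL, and page limit are placeholders, and a real crawl should respect robots.txt and rate limits.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"                           # placeholder homepage
TARGET = "https://www.example.com/microsoft-rewards-guide/"  # page whose click depth we want
MAX_PAGES = 200                                              # safety limit for the sketch

seen = {START: 0}      # URL -> number of clicks from the homepage
queue = deque([START])
found = None

while queue and len(seen) < MAX_PAGES and found is None:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        # Stay on the same site; breadth-first order means the first time we
        # see a URL is its shallowest depth.
        if urlparse(link).netloc == urlparse(START).netloc and link not in seen:
            seen[link] = seen[url] + 1
            if link == TARGET:
                found = seen[link]
                break
            queue.append(link)

if found is not None:
    print(f"Click depth of target: {found}")
else:
    print("Target not reached within the crawl limit")
```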
Conclusion
There are many factors to consider when diagnosing a traffic drop. One outside element to be aware of is Google algorithm changes. Mozcast is a page I frequently check to see whether Google SERP changes are happening more often than usual. Another element to think about is Google moving towards a mobile-first index. TM Blast moved to the mobile-first index this year and saw a brief decline in organic traffic. I was not alarmed by this drop because I felt Google was reorganizing their SERP content. Mozcast also showed a high temperature, so I knew Google was making a lot of changes. As the weeks went by, I saw a bump in organic traffic that brought my levels back.
If something (like the rewards post above) is in your control, you should take the steps I’ve outlined above. Discover the issue directly in Google Analytics and Google Search Console. Use a tool like SEMrush to beef up the content on the page. Build backlinks to your page if you find articles where it would be a good complement for their audience. Fetch the pages you changed in Google and Bing so you speed up re-crawling by both engines. And finally, check the results regularly so you can spot whether your strategy is working or not.