Want better SEO visibility on an engine that covers more than 20% of the U.S. desktop search share? If so, you should check your traffic on Bing. Too often we focus only on traffic from Google, since that is where the lion's share of traffic comes from, but making sure you are technically sound in Bing can make that extra difference in traffic. Let's go over some troubleshooting steps if you notice you are losing traffic within Bing.
What is Bing Webmaster Tools?
If you have not done this step yet, you should get this tool immediately and verify your website. Like Google Search Console, this tool shows you how Bingbot sees your site, so this is where you will see what's going on.
The way I like to verify my Bing Webmaster Tools listing is to use the meta tag option. Assuming you just clicked the sign-up button, you want to add your URL at the top.
From there, you want to pick the verification method you will use to show that you own the site. I like the meta tag because it's just one line of code that must go before the closing </head> tag. Once that code is added, you click the verify button and you are in. One thing to know is that you will see past data, assuming your site already gets Bing traffic, so it's not like you are starting at zero when you verify this tool.
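For reference, the verification tag Bing gives you looks something like the snippet below. The content value is a placeholder here; use the unique code from your own account.

```html
<head>
  <!-- Bing Webmaster Tools verification tag; the content value is unique to your account -->
  <meta name="msvalidate.01" content="YOUR_UNIQUE_VERIFICATION_CODE" />
</head>
```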
Submit the XML Sitemap
Submitting your sitemap to Bing puts your pages on its radar. Technically you don't have to do this, but I would say it's better to give the search engines as many clues as possible about what is going on with your site.
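If you're not sure what a sitemap should contain, a bare-bones one is just a list of your live URLs in the standard format below (example.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page; only include URLs that return a 200 -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```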
Fetch Your Sitemap as Bingbot
With the sitemap submitted, we want to then fetch it as Bingbot. Bing has a lower tolerance for a dirty sitemap, so check out how to clean up your sitemap if you run into any issues. If you have a lot of dirt in your sitemap (404s, 301s, 302s, 500 errors), you will find that Bing does not like that. Google does not like a dirty sitemap either, but Bing has gone on record saying they allow 1% of dirt in the sitemap.
You can download your sitemap as an XML file and run it through a tool like Screaming Frog in list mode. The beauty of this is that you don't need the paid version to use the tool if you upload your URLs as a list.
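If you would rather script this check, here is a rough Python sketch that requests every URL listed in your sitemap and flags anything that does not return a clean 200. The sitemap address is a placeholder, and it assumes the requests library is installed.

```python
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: your real sitemap
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Download the sitemap and return every <loc> URL in it."""
    response = requests.get(sitemap_url, timeout=30)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NAMESPACE)]

def audit(urls):
    """Print any URL that redirects or errors instead of returning a 200."""
    for url in urls:
        status = requests.get(url, allow_redirects=False, timeout=30).status_code
        if status != 200:
            print(f"{status}  {url}")

if __name__ == "__main__":
    audit(sitemap_urls(SITEMAP_URL))
```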
Index Explorer
This is a pretty insane view that shows the hierarchy of your site according to Bing. Technical SEO and site structure are so important for SEO success. What I look for is whether the pages that matter most are being crawled enough. You can actually request a recrawl of a particular page if you updated it recently but it has not been crawled by Bing yet. By having their bot recrawl the page, you can see a ranking boost if you added a good amount of new content or earned some valuable links that point back to that page.
SEO Reports
This might take a week or two to generate if you just created your account, which is why I put it a bit further down in this post. I believe this report holds more insight than the HTML improvements section within Google Search Console. You will see the usual things like title tags that are too long, but Bing will also list some other, more significant areas. Like Google, it shows the severity or importance of each fix, so you know which items might yield a better result if you work on them.
Here is a picture of an account I work on that has a bunch of different areas flagged. If I am looking to increase my visibility in Bing and Yahoo, I'm going to carve out time and resources to address and fix them.
Ignore URL Parameters
Bing respects the rules in your robots.txt file and your meta robots tags, but there are always certain URLs that waste crawl budget and indexation. You might see this on large sites that use a lot of tracking parameters, e-commerce sites with faceted URLs that can't be noindexed, and so on. Bing specifically gives you the message below, which literally tells you that properly blocking certain URLs can help their bot and avoid splitting authority.
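To make that concrete, here is a hypothetical example of the kind of parameter-driven duplicates you might tell Bing to ignore. Every variation below resolves to the same page, so sessionid, sort, color, and utm_source would be candidates for the Ignore URL Parameters list:

```
https://www.example.com/shoes/
https://www.example.com/shoes/?sessionid=8f3a2c
https://www.example.com/shoes/?sort=price&color=blue
https://www.example.com/shoes/?utm_source=newsletter
```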
Security Check
Bing gives you the chance to see if there is anything concerning going on with your site. I would say to check this out monthly just to make sure there are no errors. The two things this tool will check are whether the site has any malware and whether any phishing schemes are associated with the domain. It's always important to confirm that there are no errors with your hosting platform, but you would have to act immediately if you did see an error message like this within Bing.
Search Keyword Pulling
This is a bit manual, but it lets you see the keywords that drive clicks, along with impressions, CTR, and the average position for those terms. If you pull this monthly, you can then review the data week by week to look for trends. Since this is a manual process, you might want to look into the API section to see how to expedite it.
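If you save each monthly export as a CSV, a small pandas sketch like the one below can stitch the pulls together so trends stand out. The file names and column headers here are assumptions; match them to whatever your export actually contains.

```python
import glob
import pandas as pd

# Assumes exports saved as keywords_2020-01.csv, keywords_2020-02.csv, etc.,
# each with Keyword, Clicks, Impressions, and CTR columns.
frames = []
for path in sorted(glob.glob("keywords_*.csv")):
    month = path.replace("keywords_", "").replace(".csv", "")
    frame = pd.read_csv(path)
    frame["Month"] = month
    frames.append(frame)

history = pd.concat(frames, ignore_index=True)

# Pivot so each keyword is a row and each month a column of clicks,
# which makes month-over-month drops easy to spot.
trend = history.pivot_table(index="Keyword", columns="Month", values="Clicks", aggfunc="sum")
print(trend.sort_values(trend.columns[-1], ascending=False).head(20))
```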
API Pulling
This is a pretty advanced ask, so I wanted to highlight that this is something an engineer might have to set up for your account. Depending on the time you can devote to Bing, you should explore the API section within the tool to help expedite reporting. There are many things you can pull, but I would say the page traffic report is the most important.
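As a rough illustration, a pull of the page traffic report could look like the Python sketch below. The endpoint pattern and the GetPageStats method reflect Bing's JSON API documentation as I understand it, so confirm both (and the shape of the response) against the API section in your own account before relying on this.

```python
import requests

API_KEY = "YOUR_API_KEY"              # generated in Bing Webmaster Tools
SITE_URL = "https://www.example.com"  # placeholder: your verified site

# Endpoint and method name taken from Bing's JSON API docs; double-check them.
ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/GetPageStats"

def page_traffic(api_key, site_url):
    """Pull the page traffic report (clicks and impressions per URL)."""
    response = requests.get(
        ENDPOINT, params={"apikey": api_key, "siteUrl": site_url}, timeout=30
    )
    response.raise_for_status()
    # The JSON API typically wraps results in a "d" key; adjust if yours differs.
    return response.json().get("d", [])

if __name__ == "__main__":
    for row in page_traffic(API_KEY, SITE_URL)[:10]:
        print(row)
```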
Outside of the Tool
If everything looks normal within the tool, you should then search some of your high-value keywords in Bing (in incognito mode) to see what the SERP looks like. Ahrefs did a study showing that about 30% of clicks go to position 1 and that click-through drops off severely after that, so you can imagine what would happen if the SERP changed.
Changes in the SERP could include more ads at the top, an answer box within Bing, more local map listings that push a high organic position down, knowledge panel results, and much more.
Robots.txt Check
This is the first place Bing looks when it comes to a website. You should do a manual check of the file to make sure you did not accidentally block your entire site. Look for something that says Disallow: /
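In a robots.txt file, a site-wide block is just these two lines:

```
User-agent: *
Disallow: /
```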
If you see that line of code above, you are effectively telling Google and Bing not to look at your website. The second thing you want to check is whether your XML sitemap is listed in the file. Since robots.txt is the first thing the bot looks at, you want it to see the path to your XML sitemap so it can go crawl it.
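The sitemap reference is a single line you can add anywhere in the robots.txt file (swap in the real location of your sitemap):

```
Sitemap: https://www.example.com/sitemap.xml
```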
Conclusion
Many times, a loss of traffic in Bing comes down to more than just one reason. A mix of keyword research, technical fixes, and more makes the difference when it comes to improving traffic. Another way to increase your reach is to submit your site to Bing News PubHub. From there, Bing can distribute your content across many Microsoft properties and reach millions of people who use Cortana, Outlook, and the Bing app.
If you are looking for a NY SEO Expert to help grow your Bing traffic, you can visit that link.