October 28, 2014
Blocking Googlebot from your page's resources may have negative effects on your SEO. Emerging from the old text-browser world, Google has announced an update to its webmaster guidelines: blocking CSS and JavaScript files from Googlebot can harm how your pages are rendered and indexed. This announcement is related to one in May, which explained that Google now renders pages more like a modern browser and less like the old browsers that could only view text. You can see how Google fetches and renders your page with the Fetch and Render tool that was released in late May.
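As a minimal sketch of what this means in practice, here is a hypothetical robots.txt (the /css/ and /js/ paths are illustrative, not from the announcement). A directive like "Disallow: /css/" would hide your stylesheets from Googlebot and prevent it from rendering the page as users see it; explicitly allowing those resources avoids that risk:

```
# Hypothetical robots.txt sketch -- paths are examples only.
# Blocking /css/ or /js/ here would stop Googlebot from rendering
# the page properly; allowing them keeps rendering intact.
User-agent: Googlebot
Allow: /css/
Allow: /js/
Disallow: /private/
```

You can verify the effect of your own rules with the Fetch and Render tool mentioned above, which shows the page as Googlebot renders it.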
- Keep in mind that not all functionality used on websites today is supported when Google crawls a site, so it is a good idea to adhere to the web design principle of "progressive enhancement" to ensure that Googlebot can see all of your content.
- Once again, the importance of page speed is emphasized: the quicker your page renders for users, the more efficiently it can be indexed.
If you would like to confirm that your page adheres to Google's guidelines and is optimized for your keywords, take a look at the updated On-Page Keyword Optimization report in the Rank Ranger platform. Use this report to analyze landing pages: it provides a SERP preview and editor, suggestions for improvement, and a score based on how well your website currently meets best practice for ranking for a particular keyword.
Is your site currently allowing Googlebot to crawl all of your data?
If you're a web hosting provider, how might this change affect server load, and is it something for your customers to be concerned about?