With the introduction of Panda 4.0 in May 2014, Google also slid a new feature into Webmaster Tools called “Fetch & Render.” This addition allows you to fetch your website as Googlebot, and it also shows an in-page display of how Googlebot renders the page.
So what’s the big deal? Well, the results can sometimes be shocking. Matt Cutts confirmed at SMX Advanced 2014 that Googlebot has evolved to the point where it will attempt to use any and all files and scripts needed to display your web page, including JavaScript and CSS.
Cutts emphasized that Google is getting better at “page understanding” and that not granting full access to the files that make up your site’s display may result in your site “not getting the full credit that it deserves.” That can affect your rankings, and in some instances there is evidence that blocked resources can completely crush your position in search engine land.
Joost de Valk, author of the well-known WordPress SEO plugin, claims that tweaking your robots.txt file to allow Googlebot to crawl CSS and JavaScript can result in a dramatic change in how Google renders your web page. It makes sense: the case study he presents shows how different a site can look when the styling and scripts needed to render it as intended are blocked.
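As a rough sketch (not de Valk's exact configuration; the directives and paths here are purely illustrative), a robots.txt that explicitly opens stylesheets and scripts to Googlebot might look like this:

User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$

Googlebot understands the * and $ pattern-matching extensions used above. In practice, though, the fix is often simpler than adding rules: it is removing overly broad Disallow lines (for example, a blanket Disallow: /wp-includes/ on a WordPress install) that were blocking theme CSS and JavaScript in the first place.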
Hop into Webmaster Tools, select a URL to fetch and render, and review the results. Once the process is complete, you can see all of the permission blocks that Googlebot encountered while trying to review your site. Also take note of the rendered thumbnail – does it look drastically different from the way the site is intended to look? If so, you may need to check whether certain files or scripts are being denied access.
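For a quick, scriptable sanity check outside of Webmaster Tools, Python's standard-library robots.txt parser can report whether a given asset URL is fetchable for a given user agent. The URLs below are placeholders, and note one limitation: urllib.robotparser follows the original robots.txt convention and does not implement Google's * and $ extensions, so treat this as a rough first pass and use the robots.txt Tester in Webmaster Tools as the authoritative check.

import urllib.robotparser

# Placeholder URLs - point these at your own site and assets.
ROBOTS_URL = "https://www.example.com/robots.txt"
ASSETS = [
    "https://www.example.com/wp-content/themes/mytheme/style.css",
    "https://www.example.com/wp-includes/js/jquery/jquery.js",
]

parser = urllib.robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # download and parse the live robots.txt

for url in ASSETS:
    # can_fetch() applies the parsed rules for the named user agent
    status = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)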
At the present time, this might not be an enormous issue for many sites, but given Google's insistence on pleasant mobile and overall user experience, it's certainly worth keeping an eye on. You wouldn't want to be penalized just because you aren't letting Googlebot see your CSS, now would you?