Googlebot Cannot Access CSS and JS Files

30 Jul 2015
Published in Blog

Beginning around 28 July 2015, webmasters across the internet began receiving warning notices from Google Search Console informing them that Googlebot was unable to access certain CSS and JavaScript files because of robots.txt rules.

The important question then became: how do you quickly fix the "Googlebot Cannot Access CSS and JS Files" warning in Search Console?

The answer came from Google's own Gary Illyes, who posted a quick and simple fix on Stack Overflow. According to Gary, the "simplest form of allow rule to allow crawling javascript and css resources" is to add the following lines to your robots.txt file:

User-Agent: Googlebot
Allow: .js
Allow: .css
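As a quick local sanity check, rules like the ones above can be run through Python's standard-library robots.txt parser. Keep in mind this is only a rough sketch: `urllib.robotparser` follows the original first-match robots.txt semantics, not Googlebot's own rule-resolution logic, so overlapping rules may resolve differently than they would for the real crawler. The URLs and the `/private/` rule below are invented for the example.

```python
# Rough local check of robots.txt rules using Python's stdlib parser.
# Note: urllib.robotparser does not implement Googlebot's wildcard or
# precedence semantics, so treat this as an approximation only.
from urllib import robotparser

rules = """\
User-Agent: Googlebot
Allow: .js
Allow: .css
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# No rule blocks this stylesheet, so the parser reports it as fetchable.
print(rp.can_fetch("Googlebot", "https://example.com/assets/site.css"))
# A page under the disallowed directory is reported as blocked.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))
```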

The allow rules above give Google's crawlers explicit access to your CSS and JavaScript files, so those site assets can be rendered and indexed. NOTE: these allow rules can still be overridden by conflicting disallow rules, meaning they will not by themselves unblock every resource; any directory or structural disallow rules in your robots.txt still apply.
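To illustrate the point about disallow rules, here is a hypothetical robots.txt that combines the new allow rules with a pre-existing directory-level block (the `/private/` path is invented for this example):

```
User-Agent: Googlebot
Allow: .js
Allow: .css
Disallow: /private/
```

Depending on how the crawler resolves the conflicting rules, a script or stylesheet served from under /private/ may still end up blocked; the allow rules do not cancel the existing Disallow. To unblock such a resource you would need to remove or narrow the Disallow rule itself.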

John Mueller of Google stated on Google+ that "…when Google checks for blocked CSS and JavaScript assets, they don't go too deep into your site. They primarily look at your home page and then the mobile/smartphone view of your web site."

Last modified on Monday, 07 September 2015 05:52