How to Fix Googlebot cannot access CSS and JS files

Today, I received a number of messages from Google Webmaster Tools, just like the one below, because I have blocked quite a few resources from search engine bots on many of my sites.

I didn’t realize I was blocking Google’s ability to render the page, because previously, when you fetched a page via Google’s “Fetch as Googlebot” tool, the page would render properly. So Google has changed something, and is now notifying webmasters who have verified their properties in Google Webmaster Tools about this error.

If you just want to get to what I did to fix it, scroll down.

Here’s an Example Email I got from Google:



The Simple Fix:

There are a number of ways to fix this problem, and the right one will depend on what you want to do and the complexity of your robots.txt file.

If you’re not very technically savvy, this should fix it for you very easily – just add the following lines to the very end of your robots.txt file. They work because a specific Allow rule takes precedence over a broader Disallow, and a dedicated Googlebot section means Googlebot follows it instead of the generic User-agent: * restrictions:

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*

The reason I added the asterisk after the .js and .css extensions is that some of my files are called with version numbers, such as file.js?v=123, and I want to make sure they are allowed too.

If you already have a section of your robots.txt that specifies Googlebot, you can just add the two “Allow” lines from above to the bottom of that section.
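To see why these wildcard rules also cover versioned assets like file.js?v=123, here is a minimal sketch of how Google-style robots.txt pattern matching works: “*” matches any run of characters, “$” anchors the end of the URL, and a rule otherwise matches any path that starts with it. The helper name `robots_match` is hypothetical, not part of any library:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Match a URL path against a robots.txt rule pattern using
    Google's wildcard syntax: '*' matches any run of characters,
    '$' anchors the end, and rules otherwise match by prefix."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        # A trailing '$' in the rule means "end of URL here".
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The Allow rules from above match plain and versioned assets alike:
print(robots_match("/*.js*", "/assets/app.js?v=123"))   # versioned file
print(robots_match("/*.css*", "/wp-content/style.css"))  # plain file
```

When rules conflict, Google applies the most specific (longest) matching rule, with Allow winning ties – which is why `Allow: /*.css*` can open up a stylesheet that a shorter `Disallow: /wp-content/` would otherwise block.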

How to Test That it Works

To make sure it works, use the Fetch as Googlebot tool and click the “Fetch and Render” button. You will then see the result – in mine, it says “Partial”.



Click on the result line where it says “/” and it will display what Googlebot sees, what visitors see, and a list of any errors. (Google also fetches without respecting the robots.txt rules, to weed out spam sites that use trick tactics – so I don’t see why this is suddenly an issue, since they do it anyway, but that’s another subject.)

Here’s what my render and errors result looks like:

As you can see below, there are a few lines of blocked CSS. If you click the “robots.txt Tester” link it will let you test each one individually.


Here’s what that looks like, when I click the first one that was blocked:

Google will tell you the line that is blocking the file it is trying to access, and let you update the robots.txt file right there and test it against the URL.


Below, after making changes to the robots.txt file in the tester, you can see that when I add the three lines of code mentioned earlier, Googlebot can now access that file.


Yay, the Test Worked But There is More to Do!

To actually update the file, you will need to change it on your server. As you can see, the testing tool also has a “Submit” button and a “last version seen” date and time.

Follow these steps to let Googlebot know that you’ve fixed this access error.

  1. Upload your fixed robots.txt file to your server.
  2. If you use Cloudflare, or any other caching system or caching plugins, make sure you clear the cache and verify that you can see the updated robots.txt
  3. Click the “submit” button and follow the prompts Google walks you through.
  4. Submitting tells Googlebot to fetch the newest version of the file instead of its cached version.
  5. It may take a few minutes, but you can reload the tester to see your new version of the file and confirm that Googlebot can now access your .js and .css files.
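Once the new file is uploaded and your caches are cleared, a quick sketch like this can confirm that the live robots.txt actually contains the new Allow lines. The helper names are hypothetical, and example.com stands in for your own domain:

```python
import urllib.request

def has_allow_rules(robots_txt: str) -> bool:
    """Return True if both Allow lines appear in the robots.txt text."""
    lines = {line.strip() for line in robots_txt.splitlines()}
    return {"Allow: /*.js*", "Allow: /*.css*"} <= lines

def fetch_robots(url: str) -> str:
    """Fetch the live robots.txt, asking intermediate caches (e.g.
    Cloudflare) for a fresh copy rather than a cached one."""
    req = urllib.request.Request(url, headers={"Cache-Control": "no-cache"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Usage (replace with your own domain):
# print(has_allow_rules(fetch_robots("https://example.com/robots.txt")))
```

If this prints False after you’ve uploaded the file, a cache layer is most likely still serving the old version.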

How’d you do?

Do you have any other tips on how to fix this?  Please share below in the comment area if you do.

11 thoughts on “How to Fix Googlebot cannot access CSS and JS files”

  1. Hi,
    a Joomla user here.
    Thanks a lot.
    It looks like your solution works for Joomla too.

  2. Works great! A couple of updates related to this problem have already been released for the Wordfence plugin, and I believe other plugins will follow, but with this workaround I don’t have to wait. Thanks! 🙂

  3. Thanks so much for the article!! Gotta love the internet 🙂

    Do you know how I can edit the robots.txt for my WordPress site? It seems to be a virtual file that isn’t in my directory when I access it with FTP.


    1. Hi Kelly,

      The “file” is automatically generated by newer versions of WordPress, like you said – just a virtual file.

      But if you create an actual file, WordPress checks whether one exists, and if it does, it is served instead of the virtual one WordPress would otherwise generate.
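As a sketch of that override, you could create the physical file yourself. The rules below are only an example – the wp-admin Disallow mirrors WordPress’s default virtual file, plus the two Allow lines from the article:

```python
# Sketch: write a physical robots.txt so it overrides WordPress's
# virtual one. Run this from your WordPress root directory (the same
# folder that contains wp-config.php). The rules are example content.
rules = """\
User-agent: *
Disallow: /wp-admin/

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
"""

with open("robots.txt", "w") as fh:
    fh.write(rules)
```

You could just as easily create the file in a text editor and upload it via FTP; the point is only that a real file at the site root wins over the virtual one.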

  4. On July 26 my adsense revenue dropped hugely — hundreds of dollars a day. My CPC’s and CTR’s were considerably lower than usual. (My traffic was normal.) After a few days of this, I figured I was getting a run of low-quality, irrelevant ads and hoped it would return to normal. This morning, 5 days later, the low revenue persisted. I figured I better look into it further.

    I thought back to the emails I received from Google on July 28 (two days after seeing the AdSense revenue drop) about Googlebot not being able to access my js and css. After getting those, I did go into the sites and remove the offending wp-admin Disallow from the robots.txt files. But today, when I checked the robots.txt files in Webmaster Tools, they were still showing the older version.

    So, I added the additional “allow” lines you suggested above (although in my case I’m not sure I really need them, but can’t hurt!). In fact, those lines are all I have in the robots.txt files, except for a pointer to my site map:

    User-Agent: Googlebot
    Allow: /*.js*
    Allow: /*.css*


    Then I deleted the Supercache cache on all my sites, and then purged the cache in Cloudflare. Then in Webmaster tools I submitted the new robots.txt for each site, and it shows the current file.

    At this point my fingers are crossed, waiting to see if there is any immediate change in Adsense revenue.

    I have no idea if this does affect Adsense, but I have read that if Googlebot has crawl issues it will show lower quality ads. The timing is so coincidental, I’m hoping that the robots.txt was the problem.

      1. Thank you Mika. Here’s an update, that I also posted on the thread at WordPress that brought me to your site:

        Upon further inspection, even after the robots.txt change, and even after turn off Cloudflare and Supercache, when I “Fetch and Render” I get a “Partial” result (just “Fetch” says complete), and it lists the following:

        Googlebot couldn’t get all resources for this page

        When I view the rendered pages, it shows Adsense ads on “How the user sees the page” and no Adsense ads on “How Googlebot sees the page”

        Even after completely emptying out my robots.txt file, I still get the same thing.

        What could possibly be blocking this js file?

        1. That is fine. Googlebot does not index the content of AdSense ads. That domain is owned by Google, and you have no control over how robots crawl it. It is the same on my sites and, in my opinion, completely normal – although yes, the “partial” result and this error do sound misleading.

          1. Thanks so much. I will let you know if my AdSense picks up again after making the robots.txt changes. I will be able to tell within a day of any change on Google’s part. Hopefully it’s something they see immediately (if it’s an issue in the first place).

          2. Have you confirmed that you are not blocking the AdSense bot from indexing your content pages in robots.txt? If you are mistakenly blocking it there, I could definitely see it affecting your revenue.

Comments are closed.