Errors with .htaccess: Site Down with 403 Forbidden Errors

It is well known that the .htaccess file largely controls how your server responds to requests. While traveling between cities, I checked my site and found 403 Forbidden errors on every page. Checking my email, I was surprised to find a message from Dreamhost support notifying me that Googlebot wasn't behaving properly with my sites, that they had blocked it via .htaccess for me, and that the load on the server was dropping back to normal. Thanks to all the readers who sent concerned emails; here is how I fixed the issue.

The hourly statistics (which I use to track my site's downtime) revealed that the site had been down for the past 15 hours. I sent several support messages to Dreamhost notifying them that their change had misconfigured the .htaccess file.

Learning from the experience when my site was down for 36 hours last time and I had to fix the Internal Server Errors myself, I decided that waiting for a response and a possible fix would take several more hours. Since the entire site, as well as the WordPress admin, was down, the only way I could access the .htaccess file was to download it. Maybe I could fix it and get my site up myself…

How I fixed the site's .htaccess error

Since I was away from my home computer, I needed an FTP client fast. I quickly downloaded my favorite FTP software, Filezilla. I downloaded the latest Filezilla 3 Beta, installed it on the local computer, and logged in to my web hosting, but the .htaccess file was nowhere to be found. I realized that since .htaccess is a hidden file, I needed to enable Filezilla to show hidden files. But the option was nowhere to be found.

Then I realized I should have downloaded the regular stable version of Filezilla (after all, betas are still in testing). So I uninstalled Filezilla 3 Beta (I still do not know why they removed the option to show hidden files in the beta), installed the current version of Filezilla, logged in via FTP, found the option to "show hidden files", located the .htaccess in the root folder, and downloaded it.

I opened the file in WordPad and found this:

order allow,deny
deny from 66.249

Obviously, that was what was blocking the site. Since I had never edited my .htaccess before, I deleted all that text, saved the blank file, and replaced it on the server. The site went live instantly.
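In hindsight, those two lines explain why the whole site was down and not just Googlebot: under Apache's classic 2.2-style access control, `order allow,deny` denies any request that matches no Allow directive, and the file contained no Allow line at all. A sketch of what the file would have needed in order to block only Googlebot's 66.249.* range, assuming that same classic syntax:

```apache
# With "Order allow,deny", requests matching no Allow directive
# are denied by default -- the original file had no Allow line,
# so every visitor received a 403, not just Googlebot.
order allow,deny
allow from all
deny from 66.249
```

Deny is evaluated after Allow here, so the 66.249.* range is still refused while everyone else gets through.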

(Screenshot: htaccess errors)

Dreamhost has promised better web hosting this year, and to err is human, so I'll treat this as an isolated case. At least they let me know they were tweaking the .htaccess file rather than disabling my site (which I suppose most other web hosts would do), and they *do* respond to feedback.

NOTE: This is my personal experience of how I fixed my .htaccess error. I am not a professional expert on this issue. If you have no idea how this works, seek professional support. I am not responsible for any errors you may commit following my experience.

Update: Dreamhost support has apologized, saying it was an error on their part. As I said, they are genuine people, and incidents can happen to anyone. However, they have configured my .htaccess again to block Google (I hope this is temporary!) in order to keep the server running, rather than disabling the site; thank you. Here is what the .htaccess looks like now.

order deny,allow
deny from 66.249
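This version behaves very differently from the one that took the site down: with `order deny,allow`, Deny directives are evaluated first and the default for requests matching neither list is to allow, so only the 66.249.* range is turned away. Roughly:

```apache
# "Order deny,allow": Deny is evaluated first, Allow second,
# and anything matching neither list is ALLOWED by default.
order deny,allow
deny from 66.249   # Googlebot's address range gets a 403
# every other visitor is allowed through
```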

It was another learning experience for me, and I still recommend Dreamhost. So in case you missed the blog during those 15 hours, do check out the site again.


  1. Ed says:

    Another alternative would have been to SSH into your Dreamhost server and edit the .htaccess directly on the server. I always keep a copy of PuTTY on my thumb drive for such occasions.

  2. Ed says:

    but I can’t remember to close my html tags apparently… hehe

  3. Daniel says:

    You have this site on a shared hosting plan over DreamHost?

    I am a client of them as well, but I only host smaller sites there. Not reliable at all from my past experiences.

  4. Jack says:

    I use Dreamhost as well. They disabled two of my sites at different times because they said I configured my .htaccess file incorrectly, causing it to hit the server over and over again. In actuality, it was googlebot spidering my site of 20,000 pages. Seriously, they can’t handle googlebot? To solve the problem I went into Google Webmaster Tools and slowed down the crawl rate for spidering my site.

  5. Josh says:

    Blocking Googlebot goes too far. For people who make a living from search engines it is unacceptable.

  6. Mr. Nosuch says:

    Thanks for posting this. It helped me figure out why our forum's PageRank went to 0. It killed our ad revenue and search results.

    I’m really xx that Dreamhost would do this without contacting me. They have a fine contact/support system, yet when they decided to make huge changes like this, they don’t let me know?

    Really, really annoying.

  7. Matt Van Dusen says:

    In my experience with Dreamhost, they do mean well and usually fix things right the first time – but this shows that, yes, they are human and can make mistakes. However, they will tell you when they do something; they don't beat around the bush, and if something requires action from you (beyond what they can control themselves from the support center), they will give you step-by-step directions on the matter, whether it is upgrading your web apps (a clear-cut case of "staying on top of things") or telling Google that it doesn't need to make 50,000 hits every hour (a clear-cut case of "our servers can handle it…. can yours?").

    Granted, Google pretty much IS the internets these days, but even they can be blocked with very little effort. :P

  8. MPLS says:

    I had a similar problem. I was doing some testing and had installed PHP 5 on a development server. I had a site on that server that was working correctly, complete with quite an elaborate .htaccess file handling many redirects and rules. However, after installing PHP 5, every PHP file I tried to view returned a 403 Forbidden error. I began to scratch my head, and searching on Google didn't immediately bring up a solution either. Eventually I narrowed it down to the fact that I needed to add Options +FollowSymLinks to my .htaccess file. That solved it all.
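    For anyone hitting the same 403s, the fix described above is a one-line addition to .htaccess; a minimal sketch (the rewrite rule is only an illustrative placeholder):

    ```apache
    # mod_rewrite needs FollowSymLinks (or SymLinksIfOwnerMatch)
    # enabled for the directory; without it, requests handled by
    # RewriteRule can fail with 403 Forbidden.
    Options +FollowSymLinks
    RewriteEngine On
    RewriteRule ^old-page$ /new-page [R=301,L]   # hypothetical rule
    ```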

  9. Michael Brode says:

    Hello friend,

    You can make hidden files (such as .htaccess) visible in FileZilla 3 as follows:
    — Go to the "Server" menu and activate "Force showing hidden files".
    — Close the dialog that follows by clicking OK. Done.


  10. online says:

    You will probably want to create an error document for codes 404 and 500, at the least 404 since this would give you a chance to handle requests for pages not found. 500 would help you out with internal server errors in any scripts you have running. You may also want to consider ErrorDocuments for 401 – Authorization Required (as in when somebody tries to enter a protected area of your site without the proper credentials), 403 – Forbidden (as in when a file with permissions not allowing it to be accessed by the user is requested) and 400 – Bad Request, which is one of those generic kind of errors that people get to by doing some weird stuff with your URL or scripts.
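    The directives described above are one line per status code; a sketch, with hypothetical /errors/ paths that you would replace with real files on your server:

    ```apache
    # Map each HTTP error code to a custom page.
    # The /errors/ paths are placeholders -- create these files yourself.
    ErrorDocument 400 /errors/bad-request.html
    ErrorDocument 401 /errors/auth-required.html
    ErrorDocument 403 /errors/forbidden.html
    ErrorDocument 404 /errors/not-found.html
    ErrorDocument 500 /errors/server-error.html
    ```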
