How to secure your website from attacks using the .htaccess file

Every website owner needs to draw up a strategy to deal with attacks such as Cross-Site Scripting, Local File Inclusion and SQL Injection, which all start with the transmission of hostile commands via the HTTP request. One challenge in dealing with this kind of issue is separating legitimate requests from those that are hostile or malicious.

If the website is hosted on an Apache server, there is a powerful mechanism available to help mitigate attacks: the Apache .htaccess file. When properly configured, the .htaccess file on a server with mod_rewrite enabled can provide a robust defense against many HTTP request attacks.

The .htaccess file is very powerful; it is therefore recommended that you make a full backup of your website before you start editing your .htaccess file. The slightest syntax error can result in an internal server malfunction, usually reported as a 500 (Internal Server Error) message. Once you are done, it is very important to verify your entire website’s functionality.

Note: if you're using Notepad for editing purposes, which I DO NOT recommend, turn off word-wrap, and when saving the .htaccess file make sure to save it as:

File Type: All Files (not as a text document or it won't work).

Name: .htaccess

What is an .htaccess file?

The .htaccess file, or distributed configuration file, is a plain text file containing a list of configuration directives (rules); it provides a way to make configuration changes on a per-directory basis.

A website can have more than one .htaccess file, i.e. specific instructions may be applied at individual directory level. The .htaccess file is read on every HTTP request submitted to the web server, so changes made to this file take immediate effect.
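For illustration only, a small .htaccess file might contain nothing more than a custom error page and a directive to switch off directory listings (both directives are covered in detail later in this article):

# hypothetical example of a small .htaccess file
ErrorDocument 404 /errors/404.html
Options -Indexes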

If the website is hosted on an Apache server, your hosting company should allow the use of .htaccess files. If your host does not allow the use of .htaccess files, it is recommended that you change to one that does. On Todhost, you can use the .htaccess file.

Is mod_rewrite enabled?

If you don't have shell (or other) access to the Apache configuration file (httpd.conf) and the host supports PHP, just paste the text below into a text editor and save the file as info.php in your public HTML path.

<?php
phpinfo();
?>

Next, run info.php from a browser. Scroll down to the Apache section and check whether mod_rewrite appears under Loaded Modules. Once you’ve obtained the necessary info, delete info.php from your server.

Another simple way to verify if mod_rewrite is enabled consists of creating a single redirect to test it. Add the following lines to the top of your .htaccess file and re-upload it to your server.

RewriteEngine on
RewriteRule testpage\.html http://www.google.com [R]

Once the .htaccess file has been updated, enter the following into your browser:

http://www.yoursite.com/testpage.html

Replace www.yoursite.com with your actual website name. This should automatically redirect you to Google. If this redirect works, then mod_rewrite is working on your server.

Is .htaccess enabled?

It is rare for .htaccess not to be enabled on your site. If you have access to the server configuration files, it's quite easy to fix: open httpd.conf in a text editor (with word-wrap disabled) and locate the <Directory> section for your document root.

Locate the line that reads:

AllowOverride None

and change it to:

AllowOverride All

Restart Apache. Now .htaccess should work.
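For reference, the AllowOverride directive normally lives inside a <Directory> section of httpd.conf; a typical excerpt looks something like the sketch below (the path is only an example, use your own document root):

# example httpd.conf excerpt - adjust the path to your document root
<Directory "/var/www/html">
    Options FollowSymLinks
    AllowOverride All
</Directory>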

If you cannot do any of this, simply contact your web host and ask them to do it for you. If you don’t have access to the server configuration files, you’re probably on a shared hosting platform, and if your .htaccess file doesn’t work you’ll need to contact your web hosting provider to request their support.

If you don’t have FTP or cPanel access to your website, DO NOT upload or create .htaccess files. Seek assistance from your hosting company instead, as you might create a 500 (Internal Server Error) and once this occurs you won’t be able to access the site through a browser anymore.

What types of rules can the .htaccess file contain?

The most common security related directives (rules) .htaccess may contain are:

  • Authorization or authentication (username/password) control at directory level
  • Use of allow/deny rules to block users by IP address or domain
  • Customized error responses - e.g. page not found (404)
  • Directory listing – what to do if a directory rather than a page is requested
  • URL Rewriting – can be used for a number of purposes including search engine optimization (SEO), redirection of moved content, hotlinking prevention and security processing of the HTTP request.

Commenting the .htaccess code

Comments in the .htaccess file are allowed on a per-line basis; each line of comments is preceded by the # (pound) sign. Comments running over several lines thus require multiple # signs. It is also advised to use only letters, numbers, dashes, and underscores in comments. This safe practice helps avoid potential server parsing errors.

# this is a comment
# each line must have its own pound sign
# only use letters and numbers along with dashes - and underscores _

Enable Basic Rewriting

To ensure mod_rewrite (basic rewriting) is enabled on your site, add the following line to your site’s root htaccess file:

# enable basic rewriting
RewriteEngine on

Enable Symbolic Links

Enable symbolic links (symlinks) by adding the following directive to the .htaccess file.

# enable symbolic links
Options +FollowSymLinks

Some web hosting companies may have swapped FollowSymLinks for SymLinksIfOwnerMatch for security reasons. Check with your provider which directive should be used.
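If that is the case on your host, the equivalent directive would look like this (a sketch; confirm the exact option with your provider):

# enable symlinks only when the link and its target have the same owner
Options +SymLinksIfOwnerMatch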

What is order allow,deny?

Order allow,deny is a setting in your Apache web server configuration that is used to allow or restrict access to certain parts or the whole of a website.

The allow directive affects who can access an area of the server or website. Access is usually controlled by hostname, IP address, or IP address range.

The deny directive restricts access to the server. Restrictions can be based again on hostname, IP address, or environment variables.

Order allow,deny tells your server that the allow rules are processed before the deny rules. If the client does not match the allow rule or it does match the deny rule, then the client will be denied access.

Order deny,allow means that the deny rules are processed before the allow rules. If the client does not match the deny rule, or it does match the allow rule, it will be granted access.
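As a short illustration of the difference (the addresses below are placeholders taken from the documentation range, not real visitors), the two orderings might be used like this:

# order allow,deny - everyone gets in except the single blocked address
order allow,deny
allow from all
deny from 192.0.2.10

# order deny,allow - nobody gets in except the listed range
order deny,allow
deny from all
allow from 192.0.2.0/24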

Deny Access to .htaccess

The directive below locks down the .htaccess file itself; any attempt to access it from a browser will result in a 403 error message.

# locked htaccess file
<Files .htaccess>
order deny,allow
deny from all
# xxx.xxx.xxx.xxx = your own IP address
allow from xxx.xxx.xxx.xxx
</Files>

Deny Access to a Specific File

Similar to the above example you can restrict access to a specific file by simply adding the following code. Edit the file name accordingly.

# deny viewing of a specific file
<files myfile.png>
order allow,deny
deny from all
</files>

Deny Access to Multiple File Types

To restrict access to certain file types, add the following code and edit the file types you wish to protect. As you might notice, the dot (.) needs to be escaped; this is done by typing a backslash (\) in front of the character that needs escaping.

<FilesMatch "\.(htaccess|ini|log)$">
Order allow,deny
deny from all
</FilesMatch>

Disable Directory Browsing

If there is no default index page on your site root folder, everything within the root of your site will be accessible to all visitors. Disabling directory browsing will instruct the server to display a "403 Forbidden - Authorization Required" message for any request to view a directory.

# deny directory browsing
Options All -Indexes

To enable directory browsing, use the following directive:

# allow directory browsing
Options All +Indexes

The following rule will prevent the server from listing directory contents:

# deny folder listing
IndexIgnore *

The IndexIgnore directive can also be used to prevent the display of select file types:

# no display of select file types
IndexIgnore *.wmv *.mp4 *.avi *.etc

Deny Access to a Specific Directory / Folder

Restricting directory / folder access is probably one of the most frequently used .htaccess techniques. To deny all requests for the restricted directory or folder, prepare a .htaccess file in that directory and put the following directive in it:

# deny directory access
deny from all

Allow access from a certain IP (where xxx.xxx.xxx.xxx is your IP):

 

# deny directory access and allow your IP
order deny,allow
deny from all
allow from xxx.xxx.xxx.xxx

There is a different way to achieve the same objective, using the code below (where xxx.xxx.xxx.xxx is your IP):

 

# deny directory access and allow your IP
IndexIgnore .htaccess * */.??* *~ *#
DirectoryIndex index.php index.html
RewriteCond %{REMOTE_ADDR} !^xxx\.xxx\.xxx\.xxx
RewriteRule .* - [F,L]
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^yourdomain\.com
RewriteRule (.*) http://yourdomain.com/directory_to_deny_access_to/$1 [R=301,L]

Set A Default Index Page

The rule below tells the server to display "content.html" as the default directory index.

# display other default index page
DirectoryIndex content.html

This rule is similar, but the server will search the root folder for the listed files and load the first match it encounters.

# show first available default index page from list
DirectoryIndex index.html index.php default.html

Redirecting www Requests

Redirect www.yoursite.com HTTP requests to yoursite.com using a 301 redirect.

RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^yoursite\.com
RewriteRule (.*) http://yoursite.com/$1 [R=301,L]

If your site uses HTTPS, this code needs to be modified to preserve the http / https scheme of the incoming requests:

RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^yoursite\.com
RewriteCond %{SERVER_PORT}>s ^(443>(s)|[0-9]+>s)$
RewriteRule (.*) http%2://yoursite.com/$1 [R=301,L]

Custom Error Documents

The .htaccess file is where you can configure error documents:

# serve custom error pages
ErrorDocument 400 /errors/400.html
ErrorDocument 401 /errors/401.html
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html

Password-protect Files / Directories

Secure the site contents by requiring user authentication for specified files and/or directories. Adjust the AuthName field to suit your needs: members only, admin section, and so on.

# password-protect single file
<Files admin.php>
AuthType basic
AuthName "Admin Panel"
AuthUserFile /home/path/.htpasswd
Require valid-user
</Files>

# password-protect multiple files
<FilesMatch "^(index|secure|backup)*$">
AuthType basic
AuthName "Dev"
AuthUserFile /home/path/.htpasswd
Require valid-user
</FilesMatch>

# password-protect the directory in which this htaccess rule resides
AuthType basic
AuthName "This directory is protected"
AuthUserFile /home/path/.htpasswd
AuthGroupFile /dev/null
Require valid-user
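If only a handful of accounts from the .htpasswd file should get in, valid-user can be swapped for a list of specific usernames; a sketch (alice and bob are placeholder usernames):

# password-protect a directory for specific users only
AuthType basic
AuthName "Admin Panel"
AuthUserFile /home/path/.htpasswd
Require user alice bob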

Allow or Deny by IP Address and/or Domain

Deny access to everyone except your LAN and domain.com:

# no one gets in
order deny,allow
deny from all
allow from 192.168.0.0/24
allow from domain.com

Sometimes, you will only want to ban a single IP, a persistent bot or a forum spammer ...

order allow,deny
# Botnet controller
deny from 78.109.29.112
allow from all

There are different ways to block a range of IPs via the .htaccess file:

   By their CIDR number - 76.73.0.0/17

   By truncating digits until the desired IP range is obtained - 76.73.0.* blocks all IPs between 76.73.0.0 and 76.73.0.255

order allow,deny
# FakeGoogleBots-ProxyHijacks
deny from 76.73.0.0/17
deny from 76.73.0.*
deny from 76.73.*.*
deny from 76.*.*.*
allow from all

It is possible to block an entire range of IP addresses. Below are a few examples illustrating proper .htaccess syntax:

order allow,deny
# block a partial domain via network/netmask values
deny from 76.73.0.0/255.255.0.0
# block domain.com but allow sub.domain.com
deny from domain.com
allow from sub.domain.com
# RFI attacks
deny from 70.86.235.162
deny from 72.249.108.140
deny from 81.176.226.194
allow from all

Stop Hotlinking

Bandwidth theft or hotlinking is direct linking to a web site's files, usually images, video, etc...

Example: You’ve got some pictures illustrating an article, and someone else uses an <img> tag to show that image on their web page instead of hosting the image themselves.

Bandwidth refers to the amount of data transferred from a web site to a user's computer. When you visit a web page, you are using that site's bandwidth to display the files. Depending on your web hosting provider, if a site is over its monthly bandwidth, the owner gets billed for the extra data or the site might be taken offline.

To stop hotlinking of your images from other sites and display a replacement image called hotlink.gif, place this code in your .htaccess file:

RewriteEngine On
# stop hotlinking
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mysite\.com/ [NC]
RewriteCond %{REQUEST_URI} !^/hotlinking/hotlink\.gif$
RewriteRule \.(gif|jpe?g|bmp|swf)$ /hotlinking/hotlink.gif [NC,L]

Allowing hotlinking from a couple of websites can be done in two ways: either put each domain on its own line, or put several keywords separated by a pipe (|) character.

RewriteEngine On
# stop hotlinking
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mysite\.com/ [NC]
# allow myspace and facebook
RewriteCond %{HTTP_REFERER} !^http://(www\.)?myspace\.com [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?facebook\.com [NC]
# allow different subdomains from mail.live.com
RewriteCond %{HTTP_REFERER} !^http://([^.]+\.)+mail\.live\.com [NC]
# allow several search engines
RewriteCond %{HTTP_REFERER} !(google|msn|bing|bingj|yahoo|ask)\. [NC]
RewriteCond %{REQUEST_URI} !^/hotlinking/hotlink\.gif$
RewriteRule \.(gif|jpe?g|bmp|swf)$ /hotlinking/hotlink.gif [NC,L]

To deliver a standard or custom error page instead of a replacement image, replace the line containing the RewriteRule in the above htaccess directive with the following line:

# serve a standard 403 forbidden error page
RewriteRule \.(gif|jpe?g|bmp|swf)$ - [F,L]

You can also get an excellent PHP script called Auto-HotLink. What I like most about it: It logs the HTTP requests so you can actually see who’s hotlinking your files.

Block Evil Robots, Site Rippers ...

The block of code below will block some unwanted junk from accessing your website. Any listed user agents will be denied access and receive an error message instead. Below are a couple of examples of user agents you really don't want to access your site.

Note: DO NOT include the "[OR]" on the very last RewriteCond or your server will crash, delivering 500 Errors to all page requests.

RewriteBase /
# useragents starting with
RewriteCond %{HTTP_USER_AGENT} ^atraxbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Azureus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^geohasher [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^PycURL [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Python-urllib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^research-scan-bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Sosospider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^xenu [NC,OR]
# user agent contains string
RewriteCond %{HTTP_USER_AGENT} ^.*casper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*goblox [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*sun4u [NC]
RewriteRule ^(.*)$ - [F]

You can also include specific referrers by using HTTP_REFERER or block requests from a particular remote host using REMOTE_HOST:

RewriteBase /
# block amazon cloud
RewriteCond %{HTTP_REFERER} ^http://(([^.]+\.)+)?amazonaws\.com [NC,OR]
RewriteCond %{REMOTE_HOST} ^.*\.amazonaws\.com$ [NC,OR]
# block remote hosts starting with mail or smtp
RewriteCond %{REMOTE_HOST} ^mail(.*)\. [NC,OR]
RewriteCond %{REMOTE_HOST} ^smtp(.*)\. [NC]
RewriteRule ^(.*)$ - [F]

Block Semalt (semalt.com) referrer spam. The directive below will block all traffic coming from semalt.com and any subdomain (e.g. semalt.semalt.com).

RewriteBase /
# block semalt referrer spam
RewriteCond %{HTTP_REFERER} ^http://(([^.]+\.)+)?semalt\.com [NC]
RewriteRule ^(.*)$ - [F]

Instead of delivering a friendly error message, you could also send these bad boys to a (bad) website of your choice by replacing the RewriteRule in the last line with a redirect to the domain where you want to send them, as shown below.
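For example, replacing the final RewriteRule of the block above with something along these lines (example.com is only a placeholder destination) redirects the matching requests instead of returning a 403:

# redirect blocked referrers instead of blocking them
RewriteRule ^(.*)$ http://example.com/ [R=301,L]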

Block empty user agents and empty HTTP referrers

There are times you want to block empty user agents and empty HTTP referrers. There is no reason for a browser to request only images from your website, and if those requests show up with an empty user agent and HTTP referrer, it is almost certainly a bot scraping those images. The directive below will stop such requests.

# Blank UA and Referer
RewriteCond %{HTTP_REFERER} ^$ [NC]
RewriteCond %{HTTP_USER_AGENT} ^$ [NC]
RewriteRule ^(.*)\.(ico|gif|jpe?g|png|bmp|swf)$ - [F,L]

If you wish to apply this rule to all your pages, no matter which content is requested, replace the above RewriteRule with the code below:

RewriteRule .* - [F,L]

Disable the Server Signature

To remove the server signature which is displayed as part of the default Apache error documents, use the following code:

ServerSignature Off

Permanently Redirect a Specific IP

Sometimes you want to create a permanent redirect (301) when the request comes from a particular IP. In other words, if someone from 123.45.67.8 visits the website, that visitor will be redirected to Google, for example. All other visitors will be able to access the site.

RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$
RewriteRule ^(.*)$ http://google.com [R=301,L]

Permanently Redirect Multiple IPs

Sometimes you might want to redirect multiple IP addresses to another URL, preventing access to your website from those IPs. One way of doing this is to add [OR] to the end of the first RewriteCond and add another IP on the next line. Do NOT place an [OR] on the final RewriteCond.

RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$ [OR]
RewriteCond %{REMOTE_ADDR} ^123\.110\.253\.195$ [OR]
RewriteCond %{REMOTE_ADDR} ^123\.8\.67\.54$
RewriteRule ^(.*)$ http://google.com [R=301,L]

Allow Multiple IPs and Redirect Everyone Else

During site development, for example, you might want to allow the administrators / developers to access the frontend of the CMS and redirect other visitors to a maintenance page.

The rules are pretty similar to the ones above, except that when using a negation sign (!) you DON'T add [OR] to the end of the RewriteCond.

RewriteEngine On
# Allow the IPs below to access the CMS frontend and redirect other visitors to offline.html
RewriteCond %{REMOTE_ADDR} !^123\.45\.67\.89$
RewriteCond %{REMOTE_ADDR} !^98\.76\.54\.32$
RewriteCond %{REQUEST_URI} !/offline\.html$ [NC]
RewriteRule ^(.*)$ /offline.html [R=302,L]

More Redirect Fun ...

Using the above rules you can also prevent a visitor from a particular IP from accessing a specific page on your site and redirect the visitor to Google.

RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$
RewriteCond %{REQUEST_URI} /a-particular-requested-page\.html$
RewriteRule ^(.*)$ http://google.com [L]

Or to another page ...

RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$
RewriteCond %{REQUEST_URI} /a-particular-requested-page\.html$
RewriteRule .* /a-page-made-for-you.html [R=301,L]

You could also redirect multiple IPs to a PHP script which will send them to different destinations. When someone is abusing your website, you could, for example, redirect them to the acceptable use policy (AUP) of their web host.

RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$ [OR]
RewriteCond %{REMOTE_ADDR} ^123\.110\.253\.195$ [OR]
RewriteCond %{REMOTE_ADDR} ^123\.8\.67\.54$
RewriteCond %{REQUEST_URI} !/myscript\.php$ [NC]
RewriteRule ^(.*)$ myscript.php [R=301,L]

Enhanced .htaccess Directives

Below are a couple of rules originally posted on 0x000000.com, a white-hat security website. They will help in blocking attacks against your server.

I personally took out the single quote / apostrophe (') in the rule on line 05, which "blocks out the use of illegal or unsafe characters in the Referer Variable of the HTTP Request". The quote may appear in a legitimate search query, as seen below, and would thus innocently block the visitor.

google.com/search?....&q=pdf+parser+'FlateDecode+decompress+failed'

00. # Prevent use of specified methods in HTTP Request

01. RewriteCond %{REQUEST_METHOD} ^(HEAD|TRACE|DELETE|TRACK) [NC,OR]

02. # Block out use of illegal or unsafe characters in the HTTP Request

03. RewriteCond %{THE_REQUEST} ^.*(\\r|\\n|%0A|%0D).* [NC,OR]

04. # Block out use of illegal or unsafe characters in the Referer Variable of the HTTP Request

05. RewriteCond %{HTTP_REFERER} ^(.*)(<|>|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]

06. # Block out use of illegal or unsafe characters in any cookie associated with the HTTP Request

07. RewriteCond %{HTTP_COOKIE} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]

08. # Block out use of illegal characters in URI or use of malformed URI

09. RewriteCond %{REQUEST_URI} ^/(,|;|:|<|>|">|"<|/|\\\.\.\\).{0,9999}.* [NC,OR]

10. # Block out use of empty User Agent Strings

11. # NOTE - disable this rule if your site is integrated with Payment Gateways such as PayPal

12. RewriteCond %{HTTP_USER_AGENT} ^$ [OR]

13. # Block out use of illegal or unsafe characters in the User Agent variable

14. RewriteCond %{HTTP_USER_AGENT} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]

15. # Measures to block out SQL injection attacks

16. RewriteCond %{QUERY_STRING} ^.*(;|<|>|'|"|\)|%0A|%0D|%22|%27|%3C|%3E|%00).*(/\*|union|select|insert|cast|set|declare|drop|update|md5|benchmark).* [NC,OR]

17. # Block out reference to localhost/loopback/127.0.0.1 in the Query String

18. RewriteCond %{QUERY_STRING} ^.*(localhost|loopback|127\.0\.0\.1).* [NC,OR]

19. # Block out use of illegal or unsafe characters in the Query String variable

20. RewriteCond %{QUERY_STRING} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC]

21. # Return a 403 Forbidden when any of the above conditions match

22. RewriteRule ^(.*)$ - [F]

Many websites are vulnerable to LFI attacks, and hackers will attempt to exploit them to get a shell on the server. Below is an example of PHP code that is vulnerable to LFI:

<?php
// LFI Vulnerable Code
$redirect = $_GET['redirect'];
include($redirect);
?>

The code below, graciously provided by SigSiu.net, should help against Local and Remote File Inclusion attacks.

########## Begin - File injection protection, by SigSiu.net
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=http:// [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=(\.\.//?)+ [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=/([a-z0-9_.]//?)+ [NC]
RewriteRule .* - [F]
########## End - File injection protection

In addition to the above rules, you might want to add the condition below inside the block above (before the final RewriteCond, hence the trailing [OR]) in case attackers use URL encoding in the http:// part.

RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=http%3A%2F%2F [OR]

Adding the following directive to the .htaccess file should give you an additional layer of security against attacks using the proc/self/environ method.

# proc/self/environ? no way!
RewriteCond %{QUERY_STRING} proc\/self\/environ [NC]
RewriteRule .* - [F]

Apache HTTP Server CVE-2011-3192 Denial Of Service Vulnerability

The directive below will help against the Apache HTTP Server CVE-2011-3192 Denial Of Service Vulnerability discovered in August 2011 if you can't update Apache to the latest version.

# drop Range header when more than 5 ranges.
# CVE-2011-3192
SetEnvIf Range (,.*?){5,} bad-range=1
RequestHeader unset Range env=bad-range
# optional logging.
# CustomLog insert-path-and-name-of-log common env=bad-range

PHP-CGI Remote Code Execution Bug (CVE-2012-1823)

The directive below will help against the PHP-CGI Remote Code Execution Bug (CVE-2012-1823) discovered on 3 May 2012 by security researchers from Eindbazen.

# PHP-CGI Vulnerability
RewriteCond %{QUERY_STRING} ^(%2d|\-)[^=]+$ [NC]
RewriteRule (.*) - [F,L]

HTTP Headers to Help Secure Your Website

Preventing cross-site request forgery (CSRF) attacks is hard, and web applications must be built with these vulnerabilities in mind. Another widespread vulnerability is cross-site scripting (XSS).

Around 40,000 websites have been catalogued by XSSed as being vulnerable to cross-site scripting (XSS). These attacks leave your users open to cookie theft, information theft, account hijacking, clickjacking and more.

Modern web browsers have some powerful protection built in nowadays, but you need to tell the browser that you want those protection mechanisms used for your website. This can be achieved by setting specific HTTP headers.

X-Frame-Options

The X-Frame-Options HTTP response header can be used to indicate whether or not a browser should be allowed to render a page in a <frame> or <iframe>. This can be used to avoid clickjacking attacks, by ensuring that your content is not embedded into other sites.

This directive is pretty similar to the frame buster code explained in "Double Trouble on Google Images" except that it is only supported in the following browsers:

   Internet Explorer 8+

   Opera 10.50+

   Safari 4+

   Chrome 4.1.249.1042+

   Firefox 3.6.9+ (or earlier with NoScript)

There are three possible values for this header:

   DENY - This setting prevents any pages served from being placed in a frame, even if it is on the same website it originates from. It should be used if you never intend for your pages to be used inside of a frame.

   SAMEORIGIN - This setting allows pages to be served in a frame of a page on the same website. If an external site attempts to load the page in a frame the request will be denied.

   ALLOW-FROM origin - If the value contains the token ALLOW-FROM origin, the browser will block rendering only if the origin of the top-level browsing context is different than the origin value supplied with the Allow-From directive.

The code below sets the directive to DENY, preventing our pages from being served in any frames, even from our own website.

# Don't allow any pages to be framed - defends against clickjacking
Header set X-Frame-Options DENY
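If your own pages do need to frame each other, SAMEORIGIN is usually the better choice; a sketch:

# allow framing only by pages from this same site
Header set X-Frame-Options SAMEORIGIN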

X-XSS-Protection

This header is exclusive to Internet Explorer 8 and 9. It turns on the cross-site scripting protection in IE 8 and IE 9, which is turned off by default as it could potentially break some websites. To turn on the XSS filter, use the header X-XSS-Protection "1; mode=block". If you wish to prevent this filter from being turned on for your website, set the header's value to "0", as shown in the second example below.

# Turn on IE8-IE9 XSS prevention tools
Header set X-XSS-Protection "1; mode=block"
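And, should you ever need to switch the filter off for your site as mentioned above, the value simply becomes "0":

# turn the IE8-IE9 XSS filter off for this site
Header set X-XSS-Protection "0"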

X-Content-Security-Policy

Content Security Policy (CSP) is an added layer of security that helps to detect and mitigate certain types of attacks, including Cross Site Scripting (XSS) and data injection attacks. These attacks are used for everything from data theft to site defacement or distribution of malware. This header is designed to specify how content interacts with your website.

Setting this directive to "allow 'self';" will prevent our pages from loading any external JavaScript or running any inline JavaScript. Our website will be safer from XSS attacks as it becomes more difficult for an attacker to run JavaScript in the visitor’s browser through a framed page on our website or through a SQL injection.

There are a lot of settings available for this header. You need to read the Mozilla Wiki page before using CSP as you will have to allow certain external content if you run Google Adsense for example.

# Only allow JavaScript from the same domain to be run.
# Don't allow inline JavaScript to run.
Header set X-Content-Security-Policy "allow 'self';"
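Note that X-Content-Security-Policy was an early, prefixed form of this header; current browsers expect the standardized Content-Security-Policy header, whose syntax differs slightly. A roughly equivalent policy, offered here as an illustrative sketch, would be:

# standardized CSP header - only allow resources from our own origin
Header set Content-Security-Policy "default-src 'self'"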

X-Content-Type-Options: nosniff

This header prevents MIME-based attacks. It stops Internet Explorer from MIME-sniffing a response away from the declared content type by instructing the browser not to override the response content type. With the nosniff option, if the server says the content is text/html, the browser will render it as text/html.

# prevent mime based attacks
Header set X-Content-Type-Options "nosniff"

Regex Character Definitions For htaccess

# the # instructs the server to ignore the line. used for including comments. each line of comments requires its own #. when including comments, it is good practice to use only letters, numbers, dashes, and underscores. this practice will help eliminate/avoid potential server parsing errors.

[F]

Forbidden: instructs the server to return a 403 Forbidden to the client.

[L]

Last rule: instructs the server to stop rewriting after the preceding directive is processed.

[N]

Next: instructs Apache to rerun the rewrite rule until all rewriting directives have been achieved.

[G]

Gone: instructs the server to deliver Gone (no longer exists) status message.

[P]

Proxy: instructs server to handle requests by mod_proxy

[C]

Chain: instructs server to chain the current rule with the previous rule.

[R]

Redirect: instructs Apache to issue a redirect, causing the browser to request the rewritten/modified URL.

[NC]

No Case: defines any associated argument as case-insensitive. i.e., "NC" = "No Case".

[PT]

Pass Through: instructs mod_rewrite to pass the rewritten URL back to Apache for further processing.

[OR]

Or: specifies a logical "or" that ties two expressions together such that either one proving true will cause the associated rule to be applied.

[NE]

No Escape: instructs the server to parse output without escaping characters.

[NS]

No Subrequest: instructs the server to skip the directive if internal sub-request.

[QSA]

Append Query String: directs server to add the query string to the end of the expression (URL).

[S=x]

Skip: instructs the server to skip the next "x" number of rules if a match is detected.

[E=variable:value]

Environmental Variable: instructs the server to set the environmental variable "variable" to "value".

[T=MIME-type]

Mime Type: declares the mime type of the target resource.

[]

specifies a character class, in which any character within the brackets will be a match. e.g., [xyz] will match either an x, y, or z.

[]+

character class in which any combination of items within the brackets will be a match. e.g., [xyz]+ will match any number of x’s, y’s, z’s, or any combination of these characters.

[^]

specifies not within a character class. e.g., [^xyz] will match any character that is neither x, y, nor z.

[a-z]

a dash (-) between two characters within a character class ([]) denotes the range of characters between them. e.g., [a-zA-Z] matches all lowercase and uppercase letters from a to z.

a{n}

specifies an exact number, n, of the preceding character. e.g., x{3} matches exactly three x’s.

a{n,}

specifies n or more of the preceding character. e.g., x{3,} matches three or more x’s.

a{n,m}

specifies a range of numbers, between n and m, of the preceding character. e.g., x{3,7} matches three, four, five, six, or seven x’s.

()

used to group characters together, thereby considering them as a single unit. e.g., (perishable)?press will match press, with or without the perishable prefix.

^

denotes the beginning of a regex (regex = regular expression) test string. i.e., begin argument with the proceeding character.

$

denotes the end of a regex (regex = regular expression) test string. i.e., end argument with the previous character.

?

declares as optional the preceding character. e.g., monzas? will match monza or monzas, while mon(za)? will match either mon or monza. i.e., x? matches zero or one of x.

!

declares negation. e.g., “!string” matches everything except “string”.

.

a dot (or period) indicates any single arbitrary character.

-

instructs “not to” rewrite the URL, as in “...domain.com.* - [F]”.

+

matches one or more of the preceding character. e.g., G+ matches one or more G’s, while ".+" will match one or more characters of any kind.

*

matches zero or more of the preceding character. e.g., use “.*” as a wildcard.

|

declares a logical “or” operator. for example, (x|y) matches x or y.

\

escapes special characters ( ^ $ ! . * | ). e.g., use “\.” to indicate/escape a literal dot.

\.

indicates a literal dot (escaped).

/*

zero or more slashes.

.*

zero or more arbitrary characters.

^$

defines an empty string.

^.*$

the standard pattern for matching everything.

[^/.]

defines one character that is neither a slash nor a dot.

[^/.]+

defines any number of characters which contains neither slash nor dot.

http://

this is a literal statement — in this case, the literal character string, “http://”.

^domain.*

defines a string that begins with the term “domain”, which then may be followed by any number of any characters.

^domain\.com$

defines the exact string “domain.com”.

-d

tests if string is an existing directory

-f

tests if string is an existing file

-s

tests if the file in the test string exists and has non-zero size
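As a brief illustration of the file tests above in a RewriteCond (a common front-controller sketch; index.php is only a placeholder), the -f and -d tests let you rewrite only when the requested file or directory does not actually exist on disk:

# send requests for non-existent files/directories to index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php [L]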
