SEO website optimization tools: the .htaccess file
Source: Shangpin China
Type: website encyclopedia
Date: March 28, 2012
As a technician, real code is indispensable. Since most current websites run on PHP + MySQL, we must pay special attention to the website's configuration files. This article collects and organizes the configuration, usage and function of nearly all common .htaccess directives; the material is gathered from the Internet.

[301 Permanent Redirect]
Using a 301 permanent redirect to avoid splitting weight between the www subdomain and the bare root domain is a good habit and method. Setting the redirect from application code via the HTTP header, however, is not ideal: it adds an extra HTTP response and lengthens page response time, and it is easy to emit a 302 temporary redirect by mistake. The author therefore suggests doing the 301 permanent redirect in the .htaccess file. Why is this search-engine friendly? Because most modern search engines update their existing records when they detect a 301 permanent redirect. The code is as follows (taking the author's blog www.AAA.com as an example):

RewriteEngine on
RewriteCond %{HTTP_HOST} ^AAA\.com [NC]
RewriteRule ^(.*)$ http://www.AAA.com/$1 [L,R=301]

This means that a visit to AAA.com is automatically redirected to www.AAA.com. Similarly, we can reverse the setting, so that a visit to www.AAA.com is redirected to AAA.com:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.AAA\.com [NC]
RewriteRule ^(.*)$ http://AAA.com/$1 [L,R=301]

[User-Defined Error Pages]
Some hosts only let you set a 404 error page in the control panel. With the .htaccess file you can customize your own error page for each error code. The codes are as follows:

ErrorDocument 401 /error/401.php
ErrorDocument 403 /error/403.php
ErrorDocument 404 /error/404.php
ErrorDocument 500 /error/500.php

[Compressed Files]
Optimize your website's access speed by compressing the static resources and other files it serves.
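One caveat worth adding before the compression directives that follow: AddOutputFilterByType relies on Apache's mod_deflate module, which may be absent on shared hosts. A hedged sketch that guards the directives with an IfModule block (the MIME-type list is the same one used in this article):

```apache
# Apply DEFLATE compression only when mod_deflate is actually loaded;
# the guard keeps this fragment harmless on servers without the module.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/plain text/html text/xml text/css
    AddOutputFilterByType DEFLATE application/xml application/xhtml+xml
    AddOutputFilterByType DEFLATE application/rss+xml application/javascript
</IfModule>
```

Wrapping optional-module directives in an IfModule block is a common defensive pattern in .htaccess files, since directives for a missing module can otherwise break the whole site's configuration.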
It can compress text, HTML, JavaScript, CSS, XML and other files. The code is as follows:

AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

[Static Resource Browser Cache Settings]
For files that do not change or update frequently, setting a static-file cache is very important: it can greatly improve page access speed, and it is one of the important items in Yahoo's YSlow evaluation standard. The codes are as follows (note that Apache comments use #, and mod_expires expects the "access plus ..." form):

# enable caching
ExpiresActive on
# CSS files: cache for 1 month
ExpiresByType text/css "access plus 1 month"
# plain-text content: cache for 2 days
ExpiresByType text/plain "access plus 2 days"
# HTML files: cache for 2 days
ExpiresByType text/html "access plus 2 days"
# JS files: cache for 1 month
ExpiresByType application/javascript "access plus 1 month"
# JPEG images: cache for 1 month
ExpiresByType image/jpeg "access plus 1 month"
# icons: cache for 1 month
ExpiresByType image/x-icon "access plus 1 month"
# GIF images: cache for 1 month
ExpiresByType image/gif "access plus 1 month"
# PNG images: cache for 1 month
ExpiresByType image/png "access plus 1 month"
# .ico files: cache for 1 month
ExpiresByType image/ico "access plus 1 month"
# PDF files: cache for 1 month
ExpiresByType application/pdf "access plus 1 month"
# Flash: cache for 1 month
ExpiresByType application/x-shockwave-flash "access plus 1 month"
# default (cacheable files not listed above): cache for 1 month
ExpiresDefault "access plus 1 month"

[Disable Caching for Some File Types]
Many dynamic files on a website must not be cached by the browser, so we need to mark those files as non-cacheable.
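The bare Header unset Cache-Control directive that follows is usually scoped so that it only hits dynamic files rather than the whole site. A minimal sketch, assuming the dynamic content is PHP/CGI scripts (the extension list is an assumption; adjust it to your site) and that mod_headers is available:

```apache
# Strip caching headers only for the matched dynamic-script extensions,
# leaving the static-resource cache settings above untouched.
<IfModule mod_headers.c>
    <FilesMatch "\.(php|cgi|pl|fcgi)$">
        Header unset Cache-Control
        Header unset Expires
    </FilesMatch>
</IfModule>
```

Note that Header unset only removes headers the server would have sent; to actively forbid caching you would instead set one, e.g. Header set Cache-Control "no-store".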
The codes are as follows:

Header unset Cache-Control

[Allow and Block IP Access]
You can use the following command to block an IP address. In the medical industry, where paid search promotion suffers malicious clicks from business rivals, this can be used to block competitors. The codes are as follows:

deny from 000.000.000.000

Here 000.000.000.000 is the blocked IP address. If you specify only the leading octets, you block the entire network segment: entering 210.10.56. blocks every IP address from 210.10.56.0 to 210.10.56.255. You can also use the following command to allow an IP address to access the website:

allow from 000.000.000.000

Here 000.000.000.000 is the allowed IP address; whole network segments can be allowed the same way they are blocked. If you want to block everyone from accessing a directory:

deny from all

Note, however, that this does not stop server-side scripts from using the files in this directory; it only prevents users from accessing them directly (someone might think of using it to allow only spiders in).

[Image Hotlink Protection]
The following .htaccess code can raise the security level of your web server. Image hotlink protection is very useful: it prevents others from stealing and using the image resources on your server. The codes are as follows:

RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?aqee\.net/.*$ [NC]
RewriteRule \.(gif|jpg|swf|flv|png)$ /feed/ [R=302,L]

[Anti-Hacker Protection]
If you want to improve the security level of the website, you can add the following lines of code, which block some common malicious-URL-pattern attacks. For medical websites especially, the bursts of attacks that competitors launch against your web space are very difficult to prevent and clean up after.
Therefore, we can use .htaccess to protect our web space. The codes are as follows:

RewriteEngine On
# proc/self/environ? No way!
RewriteCond %{QUERY_STRING} proc/self/environ [OR]
# block scripts attempting to modify mosConfig values through the URL
RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|%3D) [OR]
# block base64_encode garbage passed by scripts through the URL
RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [OR]
# block URLs containing <script> tags
RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC,OR]
# block scripts attempting to set PHP GLOBALS variables through the URL
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
# block scripts attempting to set PHP's _REQUEST variable through the URL
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
# send all blocked requests a 403 Forbidden response
RewriteRule ^(.*)$ index.php [F,L]

[Block Access to Your Website Files]
The following code prevents others from accessing your .htaccess file; in the same way, you can block multiple file types.

[Protect Your .htaccess File]
The codes are as follows:

<Files .htaccess>
order allow,deny
deny from all
</Files>

[Block Viewing of a Specified File]

<Files secretfile.jpg>
order allow,deny
deny from all
</Files>

[Block Viewing of Multiple Specified File Types]

<FilesMatch "\.(htaccess|htpasswd|ini|phps|fla|psd|log|sh)$">
Order Allow,Deny
Deny from all
</FilesMatch>

[Rename the .htaccess File]
Since .htaccess is so important, we need to protect it; changing its file name is one way to do that.
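One caveat about the renaming approach shown next: AccessFileName is only valid in the server or virtual-host configuration, so it generally cannot be set from inside a .htaccess file itself; it belongs in httpd.conf. And since Apache's default config only protects files whose names start with .ht, the renamed file should be protected explicitly, the same way the .htaccess file was protected above. A hedged sketch:

```apache
# In httpd.conf (AccessFileName is not usable inside .htaccess):
AccessFileName htacc.ess

# ...and deny direct access to the renamed file, mirroring the
# <Files .htaccess> protection block shown earlier:
<Files htacc.ess>
order allow,deny
deny from all
</Files>
```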
The codes are as follows:

AccessFileName htacc.ess

[Block Unwelcome Visitors by Referer]
The codes are as follows:

<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_REFERER} AAA.com [NC,OR]
RewriteCond %{HTTP_REFERER} seowto.com [NC]
RewriteRule .* - [F]
</IfModule>

[Block Requests by Browser Header Information]
This method can save bandwidth by preventing certain robots or spiders from crawling your website. It is particularly effective against content scrapers. The codes are as follows:

<IfModule mod_setenvif.c>
SetEnvIfNoCase ^User-Agent$ .*(craftbot|download|extract|stripper|sucker|ninja|clshttp|webspider|leacher|collector|grabber|webpictures) HTTP_SAFE_BADBOT
SetEnvIfNoCase ^User-Agent$ .*(libwww-perl|aesop_com_spiderman) HTTP_SAFE_BADBOT
Deny from env=HTTP_SAFE_BADBOT
</IfModule>

[Disable Script Execution to Enhance Directory Security]
The codes are as follows:

AddHandler cgi-script .php .pl .py .jsp .asp .htm .shtml .sh .cgi
Options -ExecCGI

[Prohibit Directory Browsing]
This prevents the server from exposing its directory structure to the outside. The codes are as follows:

Options All -Indexes

To enable directory browsing instead, the codes are as follows:

Options All +Indexes

[Change the Default Index Page]
You can change the default index.html, index.php or index.htm to another page. The codes are as follows:

DirectoryIndex business.html

This article was published by the Beijing website construction company Shangpin China: //ihucc.com/
Source statement: This article was written or edited by Shangpin China's editors; if reproduced, please indicate that it comes from Shangpin China. The above content (including images and text) comes from the Internet; in case of any infringement, please contact us promptly (010-60259772).