How to choose a good space (web hosting) for SEO website optimization
1. Why is space important?
A search engine crawler (Baidu's is called Baiduspider, Google's is called Googlebot) is a program that fetches website pages by following URLs, and it runs automatically. It collects URLs, downloads the pages, and records every link on each page, both internal links and external links. After recording them, it crawls those links in turn and saves the results on its own servers as plain text.
Collection has two steps: first, the crawler gathers links by crawling a page (for example, the URL you submitted to the search engine); second, it crawls those pages and downloads them. On the search engine's side there are three kinds of servers: a cache server (snapshots), a SITE server (inclusion), and an index list server (ranking). They are not the same machine, which is why snapshot dates differ. For example, a site: query on our domain may show no home page while a direct search for the domain does return it, which simply means the data between those servers has not been synchronized yet.
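To make the collection step concrete, here is a minimal sketch of what such a crawler does: fetch one page, record its internal and external links, and save them as plain text. It is only an illustration under assumptions, not how Baiduspider or Googlebot actually works; the start URL is a placeholder and the requests and beautifulsoup4 packages are assumed to be installed.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START_URL = "https://www.example.com/"  # placeholder: e.g. the URL you submitted

def collect_links(url):
    """Download one page and return its internal and external links."""
    resp = requests.get(url, timeout=10, headers={"User-Agent": "toy-crawler"})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    site = urlparse(url).netloc
    internal, external = [], []
    for tag in soup.find_all("a", href=True):
        link = urljoin(url, tag["href"])  # resolve relative links
        (internal if urlparse(link).netloc == site else external).append(link)
    return internal, external

if __name__ == "__main__":
    inner, outer = collect_links(START_URL)
    # Save the collected links as plain text, as the article describes.
    with open("links.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(inner + outer))
    print(f"{len(inner)} internal links, {len(outer)} external links")
```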
Why is the stability of the space so important? Search engine crawlers simulate users' browsing habits when they crawl a site's content. If the server is unstable or slow to open, the crawler loses data or cannot reach the content when it visits, and it gradually loses interest in the site. So Wudang reminds SEOers: an unstable server has a direct negative impact on SEO optimization.
2. So how do we guard against these problems?
1. Back up the website data (web page files and database data) regularly. Package the database backup together with the website files and download them locally as a whole, so that if the site is attacked the data can be restored directly. Also change the FTP password, the server password, and the hosting control panel password, and temporarily remove write permission on the website folder. The more complex the FTP password, the better!
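As a minimal sketch of this backup routine, the script below packages the web directory and dumps the database so both can be downloaded locally. The paths, database name, and user are placeholders, and it assumes mysqldump is available on the server; it is an illustration, not a complete backup system.

```python
import subprocess
import tarfile
from datetime import datetime

# Placeholder paths and credentials -- replace with your own.
WEB_ROOT = "/var/www/html"
DB_NAME = "my_site_db"
DB_USER = "backup_user"

def backup_site():
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")

    # 1. Package all web page files into one archive for download.
    with tarfile.open(f"site_files_{stamp}.tar.gz", "w:gz") as tar:
        tar.add(WEB_ROOT, arcname="site_files")

    # 2. Dump the database (mysqldump will prompt for the password here).
    with open(f"db_{stamp}.sql", "w") as dump:
        subprocess.run(["mysqldump", "-u", DB_USER, "-p", DB_NAME],
                       stdout=dump, check=True)

if __name__ == "__main__":
    backup_site()
```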
2. A space that takes more than 6 seconds to open is very unfavorable for SEO. If the website has too many pictures or too much Flash, compress each picture to no more than 50KB and drop Flash wherever it is not strictly necessary; it is also recommended to enable the server's compressed transfer (gzip) function. Another cause is external calls: if a service the page calls, such as a weather forecast widget, is slow to open, your own website will be slow too. One online messaging tool and one site statistics script are enough; more will also slow down the opening of the website. One thing to remember: the more external code you call, the slower the page opens! If none of the above applies, the space or server itself is most likely slow; communicate with the hosting provider or the data center, and if the problem cannot be solved, replace it decisively. If you do replace the space or server, remember these points (a speed and image-size check sketch follows this checklist):
First, migrate the data (web files and databases);
Second, test the speed of the new space or server before migrating;
Third, debug on a second-level domain name first, or use the third-level domain name provided by the hosting provider for debugging;
Fourth, change the domain name resolution; the best time to do this is when user traffic is lowest;
Fifth, after the DNS change, keep the original space running and stable for 24 hours; do not shut it down and do not clear its data. DNS changes take anywhere from 5 minutes to 24 hours to take effect globally, many returning users still have the old IP cached, propagation time differs from region to region, and spiders keep their own cache as well (a DNS propagation check sketch also follows below).
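As referenced in point 2 above, here is a small sketch that times how long a page takes to open and flags images larger than the 50KB guideline. The URL is a placeholder, and the timing only approximates real load time because nothing is rendered in a browser.

```python
import time
import requests
from bs4 import BeautifulSoup
from urljoin import urljoin if False else None  # (see correct import below)
from urllib.parse import urljoin

PAGE = "https://www.example.com/"   # placeholder URL
LIMIT = 50 * 1024                   # 50KB image-size guideline from the article

start = time.time()
resp = requests.get(PAGE, timeout=30)
print(f"Page opened in {time.time() - start:.2f} s (over 6 s is bad for SEO)")

soup = BeautifulSoup(resp.text, "html.parser")
for img in soup.find_all("img", src=True):
    img_url = urljoin(PAGE, img["src"])
    size = len(requests.get(img_url, timeout=30).content)
    if size > LIMIT:
        print(f"Compress this image ({size // 1024} KB): {img_url}")
```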
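Points four and five hinge on DNS propagation. The sketch below asks a few public resolvers which IP they currently return for the domain, so you can see when the new address has propagated before shutting down the old space. It assumes the dnspython package is installed; the domain and the new IP are placeholders.

```python
import dns.resolver  # pip install dnspython

DOMAIN = "www.example.com"   # placeholder domain
NEW_IP = "203.0.113.10"      # placeholder: IP of the new space

# A few well-known public resolvers to sample propagation from.
RESOLVERS = {"Google": "8.8.8.8", "Cloudflare": "1.1.1.1", "114DNS": "114.114.114.114"}

for name, server in RESOLVERS.items():
    resolver = dns.resolver.Resolver()
    resolver.nameservers = [server]
    try:
        answers = resolver.resolve(DOMAIN, "A")
        ips = [a.to_text() for a in answers]
        status = "updated" if NEW_IP in ips else "still old"
        print(f"{name}: {ips} ({status})")
    except Exception as exc:
        print(f"{name}: lookup failed ({exc})")
```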
3. How to choose a reasonable space?
First, it must support pseudo-static URLs. Most website source code today is dynamic and uses pseudo-static (URL rewriting), so pseudo-static support is essential.
Second, it is better if the provider offers IIS log access. If you want to know what the crawlers are doing on the website, you must check the IIS logs, and it is best if a log file is generated every hour.
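To show why IIS log access matters, here is a rough sketch that counts spider visits in a W3C-format IIS log. The log file name is a placeholder, and the script reads the column names from the log's own #Fields: header rather than assuming fixed positions.

```python
from collections import Counter

LOG_FILE = "u_ex230101.log"   # placeholder IIS log file name
SPIDERS = ["Baiduspider", "Googlebot", "bingbot", "Sogou", "360Spider"]

counts = Counter()
fields = []
with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]   # column names for the lines that follow
            continue
        if line.startswith("#") or not fields:
            continue
        cols = line.split()
        try:
            agent = cols[fields.index("cs(User-Agent)")]
        except (ValueError, IndexError):
            continue
        for spider in SPIDERS:
            if spider.lower() in agent.lower():
                counts[spider] += 1

for spider, hits in counts.most_common():
    print(f"{spider}: {hits} requests")
```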
Third, it is best to support PHP + MySQL, since most webmasters use PHP + MySQL website source code.
Fourth, the space should support online decompression in its control panel. Without online decompression and compression, uploading files or making backups wastes a great deal of time.
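If the control panel lacks this feature, a tiny script uploaded next to the archive can do the same job, which is what makes built-in online decompression such a time-saver. A minimal sketch, with the archive name and target directory as placeholders:

```python
import zipfile

ARCHIVE = "site_upload.zip"   # placeholder: archive uploaded via FTP
TARGET = "."                  # placeholder: directory to extract into

# Extract the uploaded archive on the server instead of uploading
# thousands of small files one by one over FTP.
with zipfile.ZipFile(ARCHIVE) as zf:
    zf.extractall(TARGET)
    print(f"Extracted {len(zf.namelist())} files from {ARCHIVE}")
```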
Fifth, it should support 301 redirects and binding a custom 404 error page. A 301 redirect concentrates or transfers the website's weight, and a 404 error page is friendly to both users and crawlers.
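This point is easy to verify from outside. The sketch below checks that the bare domain answers with a 301 redirect and that a made-up URL returns a real 404 status rather than a soft 200. The domain is a placeholder and requests is assumed to be installed.

```python
import requests

DOMAIN = "example.com"   # placeholder domain

# 1. The non-www address should answer 301 and point at the preferred address.
resp = requests.get(f"http://{DOMAIN}/", allow_redirects=False, timeout=15)
print("Redirect status:", resp.status_code, "->", resp.headers.get("Location"))

# 2. A page that does not exist should return a real 404, not a 200.
missing = requests.get(f"http://www.{DOMAIN}/no-such-page-12345", timeout=15)
print("Missing page status:", missing.status_code)
```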
Sixth, it is better that the number of concurrent IIS connections is not limited. A space that limits concurrent IIS connections will be paralyzed as soon as it suffers a connection-flood (thread) attack.
Seventh, the provider's technical support should be able to solve problems within about 12 hours.