Beijing website construction company Shangpin China: Although the number of pages a search engine indexes is not directly tied to ranking weight, the indexed-page count is an important signal of whether a site's content is recognized by search engines. Many webmasters ask in forums how to improve their site's indexing. In my experience, results come quickly if a few aspects of the site are handled well. Three of them matter most: handling invalid links, page dwell time, and whether the page code is easy for spiders to crawl. Get these three right and it is easy to increase the number of indexed pages quickly. Today I would like to share my practical experience.
A small site I recently operated neglected these details, which led to a continuous decline in its indexed-page count, as shown in the figure:
1. Streamline the code to make spider crawling smoother
In many cases a site fails to get indexed not because its content is worthless or unreadable, but because the page code gets in the spider's way. If a site's directory structure runs three to five levels deep, how is a spider supposed to reach URLs buried that far down? Forums are a good analogy: the first three pages of a thread are usually indexed, while anything beyond page three rarely is. So if we want the indexed-page count to soar, we must first make sure the page code is crawl-friendly and content is quick for users to find; do not bury content pages too deep. When it comes to spider crawling, even good wine fears a deep alley.

One more point: simplify the page code as much as possible. Remove useless whitespace, carriage returns, and line breaks, strip duplicate DIV and STRONG tags, and give every image an ALT attribute. This goes a long way toward making pages friendlier to spiders. Simplified code also reduces page size at the root, which speeds up page loading and in turn helps indexing.
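The cleanup steps above can be sketched with Python's standard library. This is a minimal illustration, not a production minifier: it collapses whitespace between tags and flags images missing an ALT attribute. The sample `page` markup and class names are my own assumptions for the example.

```python
from html.parser import HTMLParser
import re

class ImgAltChecker(HTMLParser):
    """Collect the src of every <img> tag that is missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing_alt.append(attr_map.get("src", "(no src)"))

def minify_html(html: str) -> str:
    """Shrink page size: drop whitespace between tags and collapse space runs."""
    html = re.sub(r">\s+<", "><", html)      # whitespace between adjacent tags
    return re.sub(r"[ \t]{2,}", " ", html).strip()

page = """
<div>
    <img src="logo.png">
    <img src="banner.png" alt="Site banner">
</div>
"""

checker = ImgAltChecker()
checker.feed(page)
print("Images missing ALT:", checker.missing_alt)  # ['logo.png']
print("Minified:", minify_html(page))
```

A real site would run this over every template before deployment; the point is simply that smaller, well-attributed markup is cheaper for spiders to fetch and parse.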
2. Handle invalid links
In website optimization, one element has a great impact on indexing: invalid (dead) links. In everyday terms, an invalid link is like a dead-end street in real life; without outside help you simply cannot get through. Dead links block spiders the same way. Think of it like this: a spider enters the site's home page and tries to crawl into every page. If the site contains too many invalid links, the spider keeps running into dead ends. This not only hurts the site's image in the spider's eyes; worse, when a spider finds too many dead ends on your site it tends to judge it a spam site, and over time the indexed-page count drops lower and lower. We can therefore set up 301 redirects to point common invalid page URLs to their new counterparts. If you are not familiar with 301 redirects, you can fall back on robots.txt rules or a guiding 404 page. Personally I recommend the 404 page, because it eliminates invalid links without losing indexed pages, and its navigation links let spiders keep crawling deeper into the site.
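The decision logic described above can be sketched as a small Python routine: serve the page if it exists, 301-redirect known moved URLs, and fall back to a guiding 404 page otherwise. The redirect map and paths here are hypothetical examples, not URLs from the article.

```python
# Hypothetical map of old (dead) URLs to their new locations.
REDIRECTS = {
    "/old-news.html": "/news/",
    "/products-2012.html": "/products/",
}

def resolve(path: str, existing_pages: set) -> tuple:
    """Return the (status, target) a server should answer for a requested path."""
    if path in existing_pages:
        return (200, path)
    if path in REDIRECTS:
        return (301, REDIRECTS[path])  # permanent redirect preserves the link
    return (404, "/404.html")          # guiding 404 page keeps spiders crawling

pages = {"/", "/news/", "/products/"}
print(resolve("/old-news.html", pages))  # (301, '/news/')
print(resolve("/missing", pages))        # (404, '/404.html')
```

In practice the same mapping is usually expressed in the web server's configuration (e.g. rewrite rules plus a custom error page) rather than in application code, but the three-way outcome is the same.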
3. How long users stay on the page
One factor affects not only ranking weight but is also critical to how the search engine judges a site's content: how long users stay on a page. If you carefully analyze competitor sites that outrank yours despite having few external links, you will see a pattern: they earn stable rankings without external links because their users are more loyal than yours, and if a competitor showed you their analytics, the difference would be easy to see. The longer users stay on a page, the more valuable the search engine judges that page to be, and the more naturally it gets indexed. After all, who spends much time reading pages that are neither interesting nor useful to them? When page content creates value for users, users naturally spend more time reading and digesting it, and the search engine is far more likely to index the page for that reason. That is a real help in growing the site's indexed pages, isn't it? Faced with content that users clearly value, will a search engine really refuse to index it? Keep this up long enough to earn the search engine's full trust, and later your site's content will even be indexed first. This is one of the most effective ways to increase the indexed-page count.
When their indexed-page count drops, many webmasters instinctively try to recover it by piling on external links. While external links do have some bearing on indexing, the cause of a drop usually lies inside the site itself. The simplest explanation is that spiders cannot crawl to the content pages at all; if they never reach the destination, how can the indexed count rise? So when a site's indexed pages stall or keep declining, instead of blindly adding external links, first inspect the site internally and fix the details. That is the effective way to treat both the symptoms and the root cause.