There are several common reasons why websites show different content to different visitors, including search engines. One is serving users with different attributes: different geographical locations (for example, promoting products that are more popular in a given region), different screen resolutions (to better match the content to the screen size), or different navigation points from which they enter the website. In these cases you may display unique content to each group of users, but it is best to have a default version of the content that is shown to users and search engines that do not exhibit these attributes. The following are the most common scenarios.
Multivariate and A/B testing
To test the conversion rate of a landing page, you need to display different content to different visitors. In this case, it is best to deliver the content using JavaScript/cookies/sessions and to give the search engine a standardized version of the page that does not change on every crawl (although content that changes on every visit may not be harmful to you). Google also provides software called Google Website Optimizer for this purpose.
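As a rough sketch of the cookie/session approach (this is not Google Website Optimizer itself), the handler below assigns human visitors to a variant once via a cookie and serves the default variant to known crawlers so the page stays stable across crawls. The Express setup, cookie name, variant templates, and the blunt user-agent check are all assumptions made for illustration.

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical variant templates; "A" is the default/canonical version.
const variants: Record<string, string> = {
  A: "<h1>Sign up today</h1>",
  B: "<h1>Start your free trial</h1>",
};

app.get("/landing", (req, res) => {
  // Blunt illustrative check: serve the stable default to known crawlers
  // so the page does not change on every crawl.
  if (/bot|crawl|spider/i.test(req.get("user-agent") ?? "")) {
    res.send(variants.A);
    return;
  }

  // Human visitors are assigned a variant once and keep it via a cookie.
  let variant: string = req.cookies["ab_variant"];
  if (!variant || !(variant in variants)) {
    variant = Math.random() < 0.5 ? "A" : "B";
    res.cookie("ab_variant", variant, { maxAge: 30 * 24 * 3600 * 1000 });
  }
  res.send(variants[variant]);
});

app.listen(3000);
```

Because the assignment is sticky, each human sees a consistent experience for the duration of the test, while the engine indexes only the standardized version.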
Content requiring registration, and First Click Free
If you want to force users to register (whether paid or free) before accessing your content, it is best to keep the same URL for logged-in and non-logged-in users and to show a summary (one or two paragraphs is usually enough) to non-logged-in users and search engines. If you want search engines to see the full content, you can apply content delivery rules: for example, new visitors can access the first one or two pages of content before they register, and after this grace period they are required to register. This keeps your intentions honest, since new human visitors initially see the same thing the spider sees. You can use cookies or sessions to restrict users' access while displaying the complete content to search engines.
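A minimal sketch of the grace-period idea, assuming a hypothetical Express app: a cookie counts how many articles a visitor has read, and once the quota is used up the same URL returns only the summary. The cookie names, the quota of two free articles, and the loadArticle helper are all invented for the example.

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

const FREE_ARTICLES = 2; // illustrative grace period before registration

// Hypothetical lookup; a real site would read from its CMS or database.
function loadArticle(id: string) {
  const snippet = `<p>Summary of article ${id}...</p>`;
  return { snippet, full: snippet + "<p>Full body of the article...</p>" };
}

app.get("/articles/:id", (req, res) => {
  const article = loadArticle(req.params.id);
  const viewed = parseInt(req.cookies["articles_viewed"] ?? "0", 10);
  const registered = req.cookies["registered"] === "1"; // set at login (assumed)

  if (registered || viewed < FREE_ARTICLES) {
    // Within the grace period, or logged in: full content at the same URL.
    res.cookie("articles_viewed", String(viewed + 1));
    res.send(article.full);
  } else {
    // Quota exhausted: same URL, but only the summary plus a register prompt.
    res.send(
      article.snippet +
        '<p><a href="/register">Register to continue reading</a></p>'
    );
  }
});

app.listen(3000);
```

Note that a crawler which does not retain cookies always counts as a new visitor inside the grace period, so it naturally sees the full content at the same URL.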
You can also choose to participate in Google's First Click Free program, under which a website may show paid or login-required content to the Google spider as long as users who click through from the search results can access the first article for free. Many well-known publishers use this technique, including the popular website Experts-Exchange.com.
More specifically, to implement First Click Free, the publisher must allow the Google spider (and possibly other search engine spiders) to access all the content it wants indexed, even though users normally need to log in to see that content. Users who visit the website still need to log in, but search engine spiders do not. This allows the website's content to appear in the search results. However, when a user clicks a search result to reach the website, he must be allowed to access the whole article (and if the article spans multiple pages, all pages of the article). Once the user clicks through to another article on the site, he can be asked to log in.
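A simplified sketch of that gating logic, again as a hypothetical Express handler: the spider sees everything, a visitor arriving from a search results page gets the full article (the free first click), and any other unauthenticated request is asked to log in. The user-agent and referrer regexes and the placeholder session check are assumptions; production systems verify Googlebot by reverse DNS rather than trusting the user-agent string.

```typescript
import express from "express";

const app = express();

// Illustrative checks only; real deployments verify crawlers via reverse DNS.
const isSearchSpider = (ua: string) => /Googlebot/i.test(ua);
const cameFromSearch = (ref: string) => /google\.[a-z.]+\/|bing\.com\//i.test(ref);

app.get("/articles/:id", (req, res) => {
  const ua = req.get("user-agent") ?? "";
  const ref = req.get("referer") ?? "";
  const loggedIn = req.headers["x-demo-session"] === "valid"; // placeholder auth check

  if (isSearchSpider(ua) || cameFromSearch(ref) || loggedIn) {
    // Spiders can index everything; searchers get their first click free;
    // logged-in users always have full access.
    res.send(`<article>Full text of article ${req.params.id}</article>`);
  } else {
    // A second article reached by internal navigation without logging in.
    res.status(401).send('<a href="/login">Log in to read this article</a>');
  }
});

app.listen(3000);
```

The internal referrer of a second click does not match the search-engine pattern, which is what lets the first article be free while later articles require login.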
For more details, see Google's announcement of the First Click Free program: http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html
Navigation that search engines cannot crawl
If your navigation is in Flash, JavaScript, Java, or another format that cannot be crawled, you should consider displaying a crawlable HTML version of the content to search engines. Many websites do this with CSS layers, showing one layer to people and another to search engines. You can also use the noscript tag, although this is usually riskier, because many spammers have used noscript to hide content. Adobe has also released an SEO and Flash portal that provides search-engine-confirmed best practices to assist in the discovery of Flash files. Take care to ensure that the content displayed in the search-engine-visible layer is substantially the same as that in the human-visible layer.
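As a concrete illustration of the noscript fallback, the sketch below serves a page whose navigation is built by script while a noscript block mirrors the same links in plain, crawlable HTML. The route, script URL, and menu items are invented for the example, and the caveat above applies: the fallback must match what humans see.

```typescript
import express from "express";

const app = express();

app.get("/", (_req, res) => {
  // The human-facing navigation is rendered by script; the <noscript>
  // block repeats the same links as plain HTML for non-script clients.
  res.send(`
    <div id="nav"></div>
    <script src="/js/flash-nav.js"></script>
    <noscript>
      <ul>
        <li><a href="/products">Products</a></li>
        <li><a href="/support">Support</a></li>
        <li><a href="/contact">Contact</a></li>
      </ul>
    </noscript>
  `);
});

app.listen(3000);
```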
Duplicate content
If a large part of a page's content is duplicated elsewhere, you can consider placing it in an iframe whose source URL is blocked from crawling via the robots.txt file. Limiting spider access this way ensures that the page you display to the search engine is unique, and at the same time prevents duplicate content problems.
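A sketch of the iframe-plus-robots.txt pattern, once more as a hypothetical Express app; the /boilerplate/ path and the disallow rule are assumptions chosen for illustration.

```typescript
import express from "express";

const app = express();

// Block crawlers from the iframe's source URL so the duplicated
// block is never fetched or indexed.
app.get("/robots.txt", (_req, res) => {
  res.type("text/plain").send("User-agent: *\nDisallow: /boilerplate/\n");
});

// The duplicated block (e.g., shared legal text) lives at a blocked URL.
app.get("/boilerplate/legal", (_req, res) => {
  res.send("<p>Standard terms repeated across many pages...</p>");
});

// The page itself stays unique; the duplicated part appears only in the iframe.
app.get("/products/:id", (req, res) => {
  res.send(`
    <h1>Product ${req.params.id}</h1>
    <p>Unique description of product ${req.params.id}...</p>
    <iframe src="/boilerplate/legal"></iframe>
  `);
});

app.listen(3000);
```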