After the June 5th update, Baidu had indexed more than 200 pages of the site and Google had indexed more than 100. But over the past few days I noticed a problem: the site's Google indexing was abnormal, with the number of indexed pages shrinking every day, even though there was no mass scraping, no keyword cheating, the server was stable, and Google was still crawling several hundred pages a day without any trouble.
Later, after much investigation, I found that Baidu's and Google's algorithm systems are different, so this phenomenon is quite normal.
As long as your site's data in the search engines does not change dramatically, there is no need to worry too much. What you do need to do is watch how the number of indexed pages changes: if, over an observation period (say, a month), the number of indexed pages keeps declining, then you should pay attention.
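The monitoring idea above can be sketched in a few lines of Python. This is a minimal illustration, not a tool the article describes: the daily counts are made up, and `sustained_decline` is a hypothetical helper name; how you actually obtain each day's indexed-page count is up to you.

```python
# Record the indexed-page count once a day (however you obtain it;
# the numbers below are hypothetical) and flag a sustained decline
# across the observation window.

def sustained_decline(counts, window=30):
    """Return True if the last `window` samples never rise and end lower."""
    recent = counts[-window:]
    if len(recent) < 2:
        return False
    never_rises = all(b <= a for a, b in zip(recent, recent[1:]))
    return never_rises and recent[-1] < recent[0]

# Hypothetical counts for one week of observation:
daily_counts = [220, 218, 215, 215, 210, 204, 198]
print(sustained_decline(daily_counts, window=7))  # steady drop -> True
```

A one-month window smooths out the temporary dips that a data update (cause three below) can produce, which is why the article suggests judging the trend over a month rather than day to day.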
There are generally five causes for a decline in the number of indexed pages:
First, your site's content is copied (collected, reprinted, etc.) from other people's sites.
Second, your site's content is copied by other sites, and those sites carry more weight than yours.
Third, the search engine is in the middle of a data update, in which case the drop is only temporary.
Fourth, the robots.txt file in the site's root directory has been maliciously tampered with, forbidding search engine robots from indexing the site's pages.
Fifth, internal links or external links have been removed, or the pages linking to you have themselves been dropped from the index, weakening the influence of your external links.
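Cause four is easy to check for programmatically. The sketch below uses Python's standard-library `urllib.robotparser` to test whether a robots.txt forbids a crawler; the robots.txt content here is a made-up example of the kind of malicious tampering the article warns about.

```python
# Detect a robots.txt that has been changed to block all robots.
from urllib.robotparser import RobotFileParser

# Hypothetical tampered robots.txt: blocks every user agent from every page.
tampered = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(tampered.splitlines())

# Googlebot is now forbidden from fetching any page on the site:
print(rp.can_fetch("Googlebot", "/index.html"))  # False
```

Running a check like this against your live `/robots.txt` from time to time catches the tampering before the search engines quietly drop your pages.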
Tips for increasing the number of indexed pages:
Spend time and effort writing original copy; search engines love unique content.
Publish and update content frequently, so that search engine spiders visit your site often.
Plan the site structure sensibly and add auxiliary navigation (related articles, popular articles, previous articles, etc.) so that no page becomes an "information island".
The editor of http://s.feizhuliushipin.com.