Searching for search engine spider activity from the perspective of the IIS log

one: why IIS log analysis can reveal a site's hidden information

During the optimization of a site, not every problem can be diagnosed directly from webmaster tools; the information webmaster tools surface is mostly for detecting problems that have already shown up. As SEOers, we need to learn to read more of a site's hidden information. For example: how well has our recent link building worked? Which of our content do search engine spiders favor? How actively are spiders crawling our site? Key information like this is hidden in the site's activity records and is very difficult to dig out of webmaster tools, but we can find the answers in the site's IIS log.

two: how to get the log file

1: To get the log file, our hosting space must have the IIS logging function enabled. If it does, the logs are generally recorded in a weblog folder on the space, and we can download the site's log files directly from that folder.

2: When using this function, we need to pay attention to the log generation interval setting. I suggest that a small site have a log generated once a day, while a large site can have one generated every hour, so that no single generated file grows too large.

matters we should pay attention to when opening the site log file

We can open the log file with Notepad and use Notepad's search function to find the Baidu and Google spiders, whose names in the log are BaiduSpider and Googlebot respectively.

three: how to interpret spider behavior

1: From the log we can get a clearer picture of how search engine spiders crawl the site, including their crawl routes and crawl depth. With this information we can judge how well our recent link building has worked, because external links act like spider silk that guides the spiders' crawling: if link construction is good, spiders will naturally crawl the site frequently, and the log records which "entrances" the spiders enter through most often.

2: There is a definite relationship between content updates and spider crawling: as long as we update stably and frequently, spiders will also crawl more frequently. We can therefore use the spider visit frequency recorded in the log to fine-tune the site's content update frequency.

3: Through the log we can also discover hosting faults that webmaster tools may not be aware of. For example, in the recent incident where the popular Orange hosting space's technicians accidentally blocked the Baidu spider through a misoperation, webmasters who had analyzed their logs in advance might have discovered the mistake early.
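The Notepad search for spider entries described above can also be scripted. Below is a minimal Python sketch, assuming a W3C-format IIS log; the sample lines, field layout, and file contents are made up for illustration:

```python
SPIDERS = ("baiduspider", "googlebot")  # user-agent tokens of the Baidu and Google spiders

def spider_lines(lines):
    """Yield log lines whose user-agent text mentions a known spider."""
    for line in lines:
        if line.startswith("#"):  # W3C directive lines, not requests
            continue
        lowered = line.lower()
        if any(token in lowered for token in SPIDERS):
            yield line.rstrip("\n")

# Usage on made-up sample lines; in practice pass an open log file instead.
sample = [
    "#Fields: date time cs-uri-stem cs(User-Agent) sc-status",
    "2023-01-01 00:00:01 /index.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
    "2023-01-01 00:00:02 /about.html Mozilla/5.0+(Windows+NT+10.0) 200",
]
hits = list(spider_lines(sample))  # only the Baiduspider line survives
```

Matching case-insensitively avoids missing entries when the user-agent string is capitalized differently from what we expect.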
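The "entrance" analysis in point 1 of section three can be sketched as a simple tally of which URLs the spider requests most often. The field positions below are an assumption; a real log declares its own layout in its "#Fields:" directive line:

```python
from collections import Counter

def entry_counts(lines, token="baiduspider"):
    """Tally cs-uri-stem (3rd field in this assumed layout) for spider hits."""
    counts = Counter()
    for line in lines:
        if line.startswith("#") or token not in line.lower():
            continue
        fields = line.split()
        if len(fields) >= 3:
            counts[fields[2]] += 1  # requested URL path
    return counts

# Made-up sample lines in the assumed layout.
sample = [
    "2023-01-01 00:00:01 /index.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
    "2023-01-01 00:05:09 /news/1.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
    "2023-01-01 00:07:44 /index.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
]
top = entry_counts(sample).most_common(1)  # most-used "entrance" first
```

Pages that attract disproportionately many spider hits are the "entrances" that the external links are feeding.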
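The crawl-frequency check in point 2 of section three amounts to counting spider requests per day; a sketch under the same assumed field layout (date first):

```python
from collections import Counter

def visits_per_day(lines, token="baiduspider"):
    """Count spider requests per date (assumed to be the first field)."""
    per_day = Counter()
    for line in lines:
        if line.startswith("#") or token not in line.lower():
            continue
        per_day[line.split()[0]] += 1
    return per_day

# Made-up sample lines spanning two days.
sample = [
    "2023-01-01 08:00:01 /index.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
    "2023-01-01 21:10:40 /news/1.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
    "2023-01-02 09:30:12 /index.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
]
freq = visits_per_day(sample)
```

Comparing these daily counts against the days on which content was published shows whether a stable update schedule is actually drawing the spider in.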
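The fault detection in point 3 of section three can be approximated by tallying the HTTP status codes of spider requests: a sudden run of 403 or 5xx responses to the spider is the kind of hosting misoperation described above. Again a sketch, assuming sc-status is the last field of each line:

```python
from collections import Counter

def spider_status_counts(lines, token="baiduspider"):
    """Tally HTTP status codes (assumed to be the last field) for spider hits."""
    statuses = Counter()
    for line in lines:
        if line.startswith("#") or token not in line.lower():
            continue
        statuses[line.split()[-1]] += 1
    return statuses

# Made-up sample lines: two 403s would be an early warning sign.
sample = [
    "2023-01-01 08:00:01 /index.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 200",
    "2023-01-01 08:00:05 /news/1.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 403",
    "2023-01-01 08:00:09 /about.html Mozilla/5.0+(compatible;+Baiduspider/2.0) 403",
]
blocked = spider_status_counts(sample)["403"]  # many 403s suggest the spider is being blocked
```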
