Important issues that affect Baidu snapshot indexing

This morning, a programmer at our company suddenly received a message from a customer: his website had been indexed by Baidu, but the cached snapshots of his list pages were blank, and he asked me for help. I asked what code was used in those places, and the reply was JS. As soon as I heard that, the cause was clear!

On the Internet you can find plenty of discussion about Baidu snapshots, but most of it is about how to get the snapshot to update quickly; very few people talk about the problems that affect the snapshot itself. Today I'll go over some of the issues that affect Baidu snapshots.

1. The site's code itself has major problems

It is normal for website programmers not to understand search engines. So when we build a website, we should explain to the programmer, as clearly as possible, which code will affect the search engine.

Flash and JavaScript can make a site more attractive, but both are very unfriendly to search engines. If you really need to use JavaScript, it is best to call it from an external file rather than writing it inline, for example: <script type="text/javascript" src="the path where your JS file is located"></script>
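As a minimal sketch (the file name /js/main.js is only a placeholder for illustration), the page keeps a single reference to the external file, and the actual script code lives in that file instead of in the page HTML:

<script type="text/javascript" src="/js/main.js"></script>

The navigation and list content themselves should still be written as plain HTML links, so the spider can read them even without running the script.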

No matter how beautiful the pictures on your website are, the search engine cannot see them; it can only understand a picture through its alt attribute. So every image used on the site should be given an alt attribute, and the alt text is best written so that it relates to your site's keywords, for example: <img src="picture path" alt="short text description of the picture">
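As a concrete sketch (the file name and keyword here are hypothetical), if a page targets the keyword "Baidu SEO", an image tag might look like:

<img src="/images/baidu-seo-tips.jpg" alt="Baidu SEO optimization tips chart">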

As for DIV+CSS layout, it is best to separate the CSS into its own file. Not only can that file be reused by other pages, it also reduces the size of each page, so the site loads faster. Why not kill two birds with one stone?
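A minimal sketch of that separation (the file name /css/style.css is just a placeholder): instead of repeating style rules inline on every page, link one shared stylesheet in the head of each page:

<link rel="stylesheet" type="text/css" href="/css/style.css">

Every page that includes this line reuses the same cached file, which is what keeps the individual pages small.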

2. Rushing the site online before the preliminary work is done

This is a mistake that inexperienced people often make: as soon as the site is roughly ready, they upload it to the server and then do the debugging and modification online. This is very bad, because pages that are not yet finished or polished are likely to be crawled by the spider, and the later changes will leave the spider with a bad impression. So we should debug the site locally first, and only then put it online.

Another good approach is to make clever use of the robots.txt file. robots.txt does not only guide spiders on what to crawl; while the site is still being adjusted, it can also tell them not to crawl anything at all, as in the sketch below.
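A minimal sketch of such a robots.txt, assuming you want to block all spiders until the site is ready (remove or relax it once the site goes live, or the pages will never be indexed):

User-agent: *
Disallow: /

The first line applies the rule to every spider, and Disallow: / tells them not to crawl any page on the site.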
