Black Hat SEO Anatomy Tool

To be invisible is to walk the same road as Xiyi, the imperceptible.
This series is divided into four parts: tactics, tools, hiding techniques, and a summary. This is the tools installment, which introduces some of the tools commonly used in black hat SEO and what they are used for.
Black hat SEO usually operates at scale, so automated tools are indispensable; they are also an important link in the whole underground economy. This article introduces several tools commonly used in black hat SEO. Since the article was written a year ago, some of these tools may since have been retired or upgraded.

Parasite (jsc)

Implanting parasites is a common black hat SEO technique: the attacker compromises someone else's website and implants a parasite program that automatically generates illegal pages in bulk. It is called a parasite because it can trigger page generation on its own rather than generating pages only once; for example, each time one of its pages is visited it can create new pages, which then link to each other to form a link wheel. Put simply, a parasite is a program whose job is to create web pages by itself, under customizable conditions: it may be triggered to generate a new batch of files whenever someone visits a page, or it may create them on a schedule, such as once a day.
I ran into exactly this situation while handling an emergency response for a customer. Every time I cleaned up all of the malicious web files, a large number of new ones would reappear on the server, and the headache was that I could not fully work out the rules by which they were generated. Later, while going through the files on the web server one by one, we found a malicious dynamic-language file (the sample was not retained, for various reasons). This malicious file behaved like a parasite program: visiting certain pages of the website triggered it to generate a new batch of malicious pages.
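In an incident like the one above, one practical way to catch a parasite regenerating pages is to sweep the webroot for web files created or modified after the last cleanup. The following is a minimal defensive sketch; the suffix list and function name are my own illustration, not from the original case.

```python
import os

# Sweep a webroot for recently modified web files - a simple way to spot
# pages a parasite program has regenerated after a cleanup.
SUSPECT_SUFFIXES = (".html", ".htm", ".php", ".asp", ".aspx")

def recently_changed(webroot, since_epoch):
    """Return paths under webroot with a suspect suffix modified after since_epoch."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(webroot):
        for name in filenames:
            if not name.lower().endswith(SUSPECT_SUFFIXES):
                continue
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > since_epoch:
                hits.append(path)
    return sorted(hits)
```

Running this periodically (with `since_epoch` set to the time of the last cleanup) narrows the search to the files the parasite has produced since, which also helps reveal the generation schedule.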

Parasite classification

Parasites come in dynamic and static variants. A dynamic parasite program automatically generates new pages (as in the case above) or changes its content every time the page is refreshed; the malicious files it produces usually have asp/php suffixes. The pages generated by a static parasite program usually have fixed content, mostly files with an html suffix.

Parasite template

The pages generated by a parasite program usually follow a fixed template, and the quality of the template sometimes determines whether the pages get indexed quickly by search engines. Below are template pages generated by two parasite programs I collected.
Parasite template case one:

Parasite template case two:

Static parasite hanging secondary directory case

This case comes from an intrusion incident handled last year. We found that the target website had been planted with an illegal promotion page, as shown in the following figure:

By logging in to the web server, we found a second-level directory named ds under the website root, filled with html files generated by a parasite program. (The html sample files have since been lost.)

Log analysis on the server showed that the hacker had gained access through a web application vulnerability, used a static parasite program to create a large number of malicious html files on the server, and stored them in the ds directory, exploiting the high-weight-site secondary directory technique.
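A planted secondary directory like ds often stands out in the access logs as a path segment that suddenly dominates traffic. As a rough sketch of the log analysis step, one can count requests per first-level path segment; the log lines in the test are synthetic examples, not the original evidence.

```python
import re
from collections import Counter

# Match the request path in a common/combined-format access log line.
REQ_RE = re.compile(r'"(?:GET|POST) (/[^ ]*) HTTP')

def count_top_dirs(log_lines):
    """Count requests per first-level path segment to surface suspicious directories."""
    counts = Counter()
    for line in log_lines:
        m = REQ_RE.search(line)
        if not m:
            continue
        path = m.group(1)
        top = path.split("/")[1].split("?")[0] or "(root)"
        counts[top] += 1
    return counts
```

A directory you never created that is drawing heavy crawler traffic is a strong signal of exactly this kind of parasite-generated content.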

The preceding sections spent a lot of space on black hat SEO techniques and introduced the parasite program, a tool for automatically generating web files. But how does a black hat SEO get these illegal pages indexed quickly by search engines? If these malicious promotion pages are never indexed, the black hat SEO achieves nothing. This was the first question I had when I started studying black hat SEO: by common sense, search engines should not index pages with malicious content. Yet if you search Baidu today for gambling or porn keywords, you will find a large number of government and educational institution websites stuffed with porn links. Clearly these pages do get indexed, sometimes very quickly; I have seen malicious pages indexed within a few minutes. So is the search engine doing this deliberately, or is someone exploiting certain features or vulnerabilities of the search engine? To answer that, we need to introduce another major black hat SEO tool: the spider pool.

Spider Pool

A spider pool is a program that manipulates search engine indexing and ranking by leveraging the weight of large platforms. The principle: build a group of sites in advance to attract (and "feed") a large number of search engine spiders; when you want to promote a new site, you simply add links to it across the site group, drawing the spiders in to crawl it. In practice, this means buying a large number of domain names, renting a large number of servers, and building websites in batches to form a site group. These sites link to each other to form a link wheel, and their content consists mostly of hyperlinks plus some dynamically generated news content. After running for a while, the site group attracts a certain number of search engine spiders every day; how many depends on the quality of the sites and the number of domains. Once the spider traffic is large and stable, the pages to be promoted are added, such as the illegal pages created by black hat SEO. The effect is like getting a friendly link from a high-weight website: the promoted pages get indexed quickly.
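The "number of spiders" a site group attracts is, in practice, just the volume of crawler requests visible in its access logs. From the defender's side, the same measurement helps quantify which engines are crawling a suspicious site. A minimal sketch, matching a few well-known crawler user-agent substrings (the substrings are real crawler names; everything else is illustrative):

```python
from collections import Counter

# Map well-known crawler user-agent substrings to the engine they belong to.
CRAWLER_MARKS = {
    "Baiduspider": "baidu",
    "Googlebot": "google",
    "bingbot": "bing",
    "Sogou web spider": "sogou",
}

def crawler_counts(user_agents):
    """Count requests per search engine crawler, given an iterable of UA strings."""
    counts = Counter()
    for ua in user_agents:
        for mark, engine in CRAWLER_MARKS.items():
            if mark in ua:
                counts[engine] += 1
                break
    return counts
```

Note that user agents are trivially spoofed; engines such as Baidu and Google recommend confirming a crawler's identity via reverse DNS lookup rather than trusting the UA string alone.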

Spider Pool Trading Platform

A casual Baidu search shows that there are many spider pool trading platforms on the Internet, through which malicious web pages can be promoted using someone else's spider pool. This spares the buyer the trouble of building a spider pool, but it also lowers the bar for black hat SEO operators. While collecting material I picked one of these trading platforms; the screenshot is as follows:

Spider Pool Site Case

While collecting black hat SEO material for this article, I came across a classic spider pool site, which I share here.

Its distinguishing feature is dynamically generated content: refresh the page and you will see the content change randomly.

Clearly the content of this website is generated by a dynamic parasite program, and the constantly changing content is meant to boost indexing by Baidu. (Baidu currently indexes original content at a high rate.)
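The "content changes on every refresh" behaviour is easy to confirm programmatically: fetch the same URL twice in quick succession and compare content hashes. A minimal sketch; the fetching step is left out so the check stays self-contained, and the function name is my own.

```python
import hashlib

def looks_dynamic(snapshot_a: bytes, snapshot_b: bytes) -> bool:
    """True if two back-to-back snapshots of the same URL differ,
    suggesting dynamically generated (parasite/spider-pool style) content."""
    return hashlib.sha256(snapshot_a).digest() != hashlib.sha256(snapshot_b).digest()
```

In practice one would also strip timestamps, counters, and other legitimately dynamic fragments before hashing, to avoid flagging ordinary dynamic sites.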

Indexing by several major search engines

Baidu indexing:

Google indexing:

Bing indexing:

Sogou indexing:

Comparing how these popular search engines index this spider pool site, it is not hard to see that the spider pool program is currently effective only against Baidu's crawler. That said, 78 indexed entries is not a large number for a spider pool site, which suggests that Baidu has already taken precautions against this technique.


- [Summary of Black Hat SEO Anatomy]
- [Black Hat SEO Anatomy: Hiding]
- [Black Hat SEO Anatomy: Tools]
- [Black Hat SEO Anatomy]

Post title: Black Hat SEO Anatomy Tool


Published: September 28, 2017, 15:09

Last updated: August 16, 2019, 15:08


License: Attribution-NonCommercial-NoDerivatives 4.0 International. Please retain the original link and author when reposting.
