The Threat of APT to Traditional Anti-Virus Technology, and Our Attempts to Deal with It

Posted by santillano at 2020-04-08

Note: This is based on the author's shorthand draft from the Information Security Forum of CNCC 2012 on October 20, with subsequent edits, additions, and deletions.

Antiy Labs, Jiang Haike (Xiao Xinguang)

1. Introduction

APT (Advanced Persistent Threat) is an area we are very interested in at present, and we have spoken about it publicly many times. Compared with the last occasion, when we summarized some shortcomings of our APT sample analysis at the security threat forum of NINIS (National Institute of Network and Information Security Technology), we have made more practical progress. Today I want to offer another round of self-analysis and reflection from the perspective of a traditional anti-virus practitioner (not merely from the perspective of an analyst).

Whether we are talking about APT, any other security threat, or indeed any security issue or incident, there are several basic elements. Like all other events, security events involve a time, a place, and "people", which leads to today's topic: how the views of time, place, and people in the APT era differ from those of the traditional virus era.

Comparison of "time"

Let's first review more traditional malicious code from the perspective of time. In the era of DOS viruses and infectious viruses spread via disks, the earliest malicious code propagated through simple media exchange between people or between devices, an extremely slow form of diffusion. The concept of "time" was only truly pushed onto the agenda of anti-virus workers with the advent of the worm era. Dr. Du Yuejin once divided the history of malicious code into the worm era, the Trojan era, and the data-theft era; I understand this as an evolution from "dominated by the means of propagation" to "dominated by the target". Although worms like Melissa and Happy99 had already appeared in the late 1990s, most people consider the worm era proper to have begun in 2000.

| Virus name | Release time | Discovery time |
|---|---|---|
| CodeRed II | August 3, 2001 | August 3, 2001 |
| Blaster | August 11, 2003 | August 12, 2003 |
| Sasser | April 30, 2004 | May 1, 2004 |
| — | August 13, 2005 | August 16, 2005 |
| — | January 20, 2006 | February 3, 2006 |

Table 1: The worm era

Here we list five typical malicious codes from 2001 to 2006, including fast-spreading worms such as CodeRed II, Blaster, and Sasser. We can draw a conclusion: however unprepared anti-virus vendors and security response organizations were for these malicious codes, it is undeniable that none of these security teams took more than 24 hours to perceive them. In another sense, the emergency capture systems of contemporary anti-virus vendors matured precisely through the experience of rapid network-borne infection in the worm era. Anti-virus confrontation then ran from capture, to identification, to disinfection; within that process, the adversary's moves were mostly attempts to penetrate our capture chain and to strike back at it, and this formed the time chain shown above.

| Virus name | Release time | Discovery time |
|---|---|---|
| Stuxnet | June 2009 | July 2010 |
| Duqu | 2007 or 2008? | August 2011 |
| Flame | Before December 2007? | May 2012 |

Table 2: The APT era

In contrast, let's look at the time chain of the APT era, which has three very typical malicious codes: Stuxnet, Duqu, and Flame. Comparing the extrapolated release time with the discovery time:

- Stuxnet was discovered in July 2010; based on its timestamp information, we believe it was released in June 2009.

- Duqu was discovered in August 2011 and is currently believed to have been released at the end of 2007 or the beginning of 2008.

- Flame was discovered latest, in May 2012; judging from some early drivers discovered and domain names registered in succession, its system began activity as early as the end of 2007.

The comparison above shows something very embarrassing for the whole security industry: these APT malicious codes existed for at least a year before the industry widely perceived and handled them. Even more embarrassing, on the current time chain the order of release is exactly the reverse of the order of discovery. Broadly, we believe the whole APT campaign culminated in the Stuxnet attack, which ultimately sabotaged Iran's relevant industrial systems, with Flame and Duqu acting as the advance intelligence collectors. In other words, during Flame's nearly five years of activity, the entire security industry had no perception of it at all. Only after Stuxnet finally launched its fatal strike did the industry begin paying attention to industrial-control-system security and APT attacks, and only through gradual, directed discovery and mining, with users submitting samples, was the earliest component, Flame, finally found. The relationship between "release time" and "discovery time" reflects the change in the view of time in the APT era: anti-virus vendors have degraded from a keen "24-hour" perception capability to a sluggish perception measured in months or even years.

Comparison of "place"

Let's look back at how the concept of location has changed since the traditional virus era. Figure 1 is a diagram, published by a foreign organization, of the Slammer worm's infection range 30 minutes after its outbreak. (Note: Slammer first appeared on January 25, 2003. It is a SQL worm, 384 bytes long, that spread rapidly over UDP and infected a large number of SQL Server machines worldwide in a very short time.) From the diagram we can see that the areas of high Slammer density were concentrated in the continental United States, Western Europe, and China's southeast coast. This shows that, absent targeting, the distribution density of malicious code is directly proportional to the degree of informatization: the more developed the information infrastructure, the larger the infection range. Conversely, in countries and regions with weak or sparse informatization, such as parts of Africa, eastern Russia, and South America in the figure, the infection range is small or zero. This is how worms spread in the untargeted era.

Figure 1: 30 minute infection range of Slammer attack

Figure 2 is a diagram of Stuxnet's early infection range released by another vendor. In the figure, the whole world is basically green, while Stuxnet's infections cluster in the Middle East and, perhaps owing to some geographic effect, along a line from the Indian Ocean toward the Pacific, near the equator, and across the Southeast Asian archipelago. This shows that in the APT era, the distribution of malicious code is no longer correlated with the degree of informatization but is closely tied to its initial target.

Figure 2: Schematic diagram of Stuxnet's early infection range

Of course, there were targeted cases in the worm era too; the CodeRed mentioned earlier is one. Why did China pay little attention to CodeRed I when it appeared, yet start paying attention when CodeRed II arrived? It has been speculated that CodeRed was written by Chinese attackers and adopted the strategy of halting propagation on Chinese-language systems while continuing on English-language systems; as a result, CodeRed I was basically active abroad and attracted little domestic attention. It is said the code was later modified by a Dutch hacker group so that it spawned 300 propagation threads on English systems and double that number on Chinese systems, producing a larger spread range and density in China. Although this realizes a kind of targeting, it is very coarse-grained: a simple split on language set as a proxy for region. It is nothing like the current APT era, where a very specific target state is identified through combinations of multiple indicators and remote verification services that jointly determine whether the attack proceeds. So from the viewpoint of location, the worm era and the APT era differ sharply: untargeted versus targeted.

Comparison of "people"

Now the corresponding element of people. In the era of viruses and traditional attacks, virus authors and attack initiators such as the authors of CIH (Chen Yinghao, Taiwan, China), Melissa (Smith, United States), and Sasser (Sven Jaschan, Germany) shared, without exception, a common fate: they were pursued, located, and prosecuted by the judiciary. Despite the large number of classic and damaging malicious codes, generally speaking, as long as society's emergency response capability and its judicial institutions were willing to bear the cost, and as long as international emergency coordination and judicial cooperation mechanisms operated normally, a virus's author and releaser could basically be located and brought to justice.

In the APT era, which of the known APT attacks, bearing the distinct colors of states and political groups, has been punished? Who is the culprit? Can we locate specific individuals? Can we even attribute to specific organizations? Take the Stuxnet incident: judging from media reports and public speculation, at first the American media said it may have been done by Israel, and the Israeli media said it may have been done by the United States; later a so-called "deep throat" came out to say it was a joint effort (some media speculated that the "leak" stemmed from Obama's need to project a strong image before the general election). But can the victims of such an affair possibly seek redress against two peer or hostile states? Compared with the traditional virus era, then, the views of time, place, and people have all changed dramatically in the APT era. The fundamental background of this enormous change is the entry of the "big players": states and political-economic interest groups.

2. Looking back at traditional anti-virus

In the previous section, we introduced how the elements of the APT era have changed relative to the traditional virus era. Now let's return to the traditional anti-virus system.

Traditional anti-virus infrastructure

Figure 3 shows Antiy's arrectnet monitoring, capture, and processing system, a relatively traditional architecture. In the whole malicious-code analysis pipeline, the two most important elements are capture at the front end and analysis at the back end, and the biggest impact of APT falls on capture. In the past we had many complementary capture means: traffic capture, bait mailboxes, on-site collection, sample exchange with international and domestic peer vendors, and active reporting by users. Each method has its advantages and disadvantages; even the most active of them, traffic collection and honeypots, which obtain samples immediately and thus solve the real-time problem, yield small volumes at large cost. After the cloud era arrived, almost no one disputed that automatic reporting from the endpoint is the best channel. At the time, we also deconstructed the various sample-reporting channels by their nature:

- By source type, we divide all sample acquisition channels into controllable and uncontrollable ones. A controllable channel is one over which the AV vendor has complete policy management and complete data-recovery capability, i.e. the vendor can access and retrieve from it at any time; everything else is an uncontrollable channel.

- By sample quality, we grade acquisition channels along several dimensions, including real-time performance (obtaining samples at first appearance), comprehensiveness (obtaining all modules of a sample), integrity (whether the sample files arrive intact and undamaged), and so on.

Based on this deconstruction we defined our first capture-system map. But in the APT era, where we fail to perceive malicious code that persists across years, real-time acquisition is the first point where the traditional anti-virus system is challenged.

Figure 3: Traditional anti-virus infrastructure
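The two classification axes above (controllable versus uncontrollable source, plus the quality dimensions of real-time performance, comprehensiveness, and integrity) can be sketched as a small data model. The channel names and scores below are illustrative, not an actual vendor taxonomy:

```python
from dataclasses import dataclass

@dataclass
class Channel:
    name: str
    controllable: bool     # vendor has full policy control and data recovery
    realtime: bool         # samples arrive at first appearance
    comprehensive: bool    # all modules of a sample are obtained
    intact: bool           # files arrive undamaged

    def quality(self) -> int:
        # Count how many quality dimensions the channel satisfies.
        return sum((self.realtime, self.comprehensive, self.intact))

channels = [
    Channel("honeypot",         True,  True,  False, True),
    Channel("user_submission",  False, False, False, True),
    Channel("vendor_exchange",  False, False, True,  True),
    Channel("client_telemetry", True,  True,  True,  True),
]

# Rank by controllability first, then by quality score.
best = max(channels, key=lambda c: (c.controllable, c.quality()))
print(best.name)
```

Ranked this way, endpoint telemetry comes out on top, matching the observation above that automatic reporting from the endpoint is the best channel.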

Working mechanism of traditional anti-virus

Looking at the core working mechanism, we can summarize the anti-virus system as a workflow, abstracting the whole process into a combination of branches composed of matchers, preprocessors, authenticators, and disposers, guided by format recognition.

Figure 4: Model abstraction of the traditional anti-virus mechanism
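The abstraction in Figure 4, where format recognition routes an object into a branch of preprocessor, matcher, and verdict, can be sketched as a dispatch table. Everything below (format signatures, rules, verdict names) is hypothetical, chosen only to show the shape of the model:

```python
def recognize_format(data: bytes) -> str:
    # Format recognition by magic bytes (toy rules).
    if data.startswith(b"MZ"):
        return "pe"
    if data.startswith(b"%PDF"):
        return "pdf"
    return "unknown"

# Each normalization branch pairs a preprocessor, a matcher, and a verdict.
BRANCHES = {
    "pe": {
        "preprocess": lambda d: d,                 # stand-in for unpacking
        "match": lambda d: b"EVIL" in d,           # stand-in signature rule
        "verdict": lambda hit: "malicious" if hit else "clean",
    },
    "pdf": {
        "preprocess": lambda d: d.lower(),         # stand-in stream decoding
        "match": lambda d: b"/javascript" in d,    # stand-in heuristic rule
        "verdict": lambda hit: "suspicious" if hit else "clean",
    },
}

def scan(data: bytes) -> str:
    fmt = recognize_format(data)
    branch = BRANCHES.get(fmt)
    if branch is None:
        # No normalization branch exists: this is exactly where
        # "normalization penetration" happens.
        return "unsupported"
    d = branch["preprocess"](data)
    return branch["verdict"](branch["match"](d))

print(scan(b"MZ....EVIL...."))
print(scan(b"%PDF-1.4 /JavaScript payload"))
print(scan(b"\x00\x01 no branch for this format"))
```

The "unsupported" path illustrates the normalization-penetration problem discussed below: a new file format falls through the dispatch until a new branch is built for it.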

The maintenance of the traditional anti-virus model divides into three parts: maintenance of the normalization system, maintenance of precise detection, and maintenance of unknown detection.

Figure 5: Model maintenance of traditional anti-virus

Maintenance of the normalization system reflects the fact that the current anti-virus engine is essentially organized around file formats, as a set of normalization branches. We saw two ways of penetrating it. The first is rule penetration, where malicious code is simply not hit by existing rules. The second is normalization penetration, where malicious code cannot be routed into the current detection branches at all, or renders them ineffective. For example, before PDF format-overflow exploits appeared, anti-virus software might not recognize the PDF format, or might skip it as a harmless format. The appearance of PDF overflows was therefore not simple rule penetration but normalization penetration, because new format parsers and normalization branches had to be added. Hence the maintenance of the traditional anti-virus model must include a normalization part: the process of establishing new normalization branches for new format features.

The second part is precise rule detection: one-to-one rule extraction from a sample, or rule extraction with some degree of coverage, i.e. rule-based detection of individual malicious codes or small clusters of them.

Above precise rule detection sits what we originally called unknown detection. Here I want to correct two public misunderstandings about anti-virus software. The first is the belief that anti-virus is purely one-to-one detection with no unknown-detection methods, that unknown viruses are undetectable by existing systems, and that AV vendors do no unknown-detection work; many careless academic papers attack the AVER industry with this claim to inflate the value of their own results. The second misunderstanding is the opposite: that anti-virus should be able to detect everything, that it is incompetent if it cannot, or that some ultimate detection method exists. That too is a misunderstanding.

In fact, AVERs' unknown-detection work has three parts. The first is association and aggregation of similar gene segments to form family signatures; the second is aggregation of common behaviors to form behavioral signatures. These yield generic detection of gene-related and behavior-related families respectively. The third is improving heuristic detection by adding or removing heuristic decision points and adjusting their configuration and models. AV vendors have long been doing all of this; the forerunners of the AVER community laid these foundations back in the DOS-virus era. This work should be neither deified nor ignored. We only say that work at this granularity is bound to be inadequate against APT.
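The third part, heuristic detection driven by adjustable decision points, can be illustrated with a toy weighted scorer. The decision points, weights, and threshold below are invented for illustration; a real engine tunes hundreds of such points against false-positive budgets:

```python
# Hypothetical heuristic decision points with illustrative weights.
DECISION_POINTS = {
    "writes_to_system_dir":       3,
    "sets_autorun_key":           3,
    "no_digital_signature":       1,
    "packed_with_unknown_packer": 2,
    "creates_remote_thread":      4,
}

def heuristic_score(observed: set) -> int:
    # Sum the weights of every decision point that fired.
    return sum(w for name, w in DECISION_POINTS.items() if name in observed)

def heuristic_verdict(observed: set, threshold: int = 6) -> str:
    # Adjusting the threshold (or the weights) is exactly the
    # "increase/decrease decision points, tune the model" work above.
    return "suspicious" if heuristic_score(observed) >= threshold else "clean"

sample = {"writes_to_system_dir", "sets_autorun_key", "no_digital_signature"}
print(heuristic_score(sample), heuristic_verdict(sample))
```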

The real weakness of traditional anti-virus

Although AVERs have done a great deal, current unknown-detection work remains insufficient against APT attacks. The fundamental weakness of the traditional anti-virus engine is that it is an easily obtained security resource. For example, visiting a well-known site and searching for the keyword "antivirus", we retrieved 562 results: all kinds of anti-virus software in their various downloadable versions. Herein lies the weakness: anti-virus is a product serving the public, so it is necessarily an easily obtained resource, and it cannot withstand continuous test-and-modify cycles. Meanwhile, multi-engine scanning has become a mature engineering practice: besides online services like VirusTotal and VirSCAN, several foreign vendors have launched large black-box multi-engine scanning products, a new form of service and product from the defender's perspective. But from the attacker's perspective, the maturity of comparative multi-engine scanning also lets attackers build similar environments to raise the efficiency and capability of their evasion testing. Note too that APT, with its long process of external reconnaissance and footprinting, does not need to defeat every anti-virus product; it only needs to defeat the anti-virus software installed in the designated scenario.

To sum up: because anti-virus software is easy to obtain, it is easy to test against in advance. This is the fundamental weakness of anti-virus, and the fundamental reason why anti-virus alone cannot fight APT.

Progress and confusion

In recent years the anti-virus industry has nevertheless kept advancing, which can be summarized as follows:

1. Cloud scanning. Looking back at its development, cloud scanning did not arise because AVERs proactively applied cloud methods to anti-virus; it arose because they first hit the problem of rule-set expansion (blacklists and whitelists alike) that client-side resources could not carry. At the same time, malicious code had acquired a very strong capacity for rapid mutation and transformation, so the old mode of distributing virus databases at high frequency and then reinitializing locally was no longer timely enough; there was demand for improving the timeliness of the response process and the customer experience. Judged by the end result, the main effect of cloud scanning is to put massive data in the cloud, saving the endpoint's precious memory and computing resources as long as network bandwidth is assured. And because the cloud-side rules live in a database, it solves both real-time response and the fast repair of false positives. On this practical basis, cloud scanning then gradually developed into the reputation cloud, or identification cloud.

2. Program reputation. This concept connects the traditional vendor digital-signature mechanism with an anti-virus trust mechanism: trust the program that carries a digital signature. It is the result of the division of responsibilities in the security industry, and security vendors also run systems of their own. For example, some "parasitic" AV vendors rely essentially on "background comparative scanning plus foreground full-hash lookup": a file that triggers no engine alarm for a period (say, three months) is added to the whitelist; the whitelist is rescanned over a longer period (say, a year), and anything found to be malicious is moved back to the blacklist; and if a user reports a false positive or a miss in the forum, it enters manual analysis for a quick response. This is an interactive, Internet-based response. Relying on user feedback to fix errors may look a little crude, but it is a fast and effective way to establish program reputation.

3. URL reputation. In the past we took the single file as the working object; after adding the URL, we introduced two concepts. The first is provenance: where the program comes from, and whether its source is reliable. The second is compound verification: the security verification of the source and the security verification of the file are combined into a closed loop, where files determine the security of a URL, and that URL's security rating is extended upward to the directory, then to the sub-site, and then to the primary domain.

4. Active defense. This adopts means beyond rules, such as detecting and blocking memory injection, or strongly prompting on or intercepting the execution of any PE file arriving from the Internet.
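The cloud-scanning flow described in item 1 reduces, at its core, to a hash lookup against a server-side rule database. A toy sketch (with a dict standing in for the cloud service; hashes and verdicts are illustrative) shows why a false-alarm repair becomes a single server-side update with no client push:

```python
import hashlib

CLOUD_DB: dict = {}   # sha256 hex digest -> verdict (stand-in for the cloud)

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def cloud_query(data: bytes) -> str:
    # The client ships only a hash; the heavy rule set stays server-side.
    return CLOUD_DB.get(sha256(data), "unknown")

malware = b"malicious payload"
CLOUD_DB[sha256(malware)] = "malicious"
print(cloud_query(malware))            # known-bad file
print(cloud_query(b"benign file"))     # not in the database

# False-alarm repair: flip one cloud record, effective on the next query.
CLOUD_DB[sha256(malware)] = "clean"
print(cloud_query(malware))
```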

These measures show that anti-virus technology has not stagnated over the past five to seven years. From the traditional anti-virus perspective they are all improvements; but measured against APT, they look decidedly inadequate.
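The compound verification of item 3 (extending a file verdict to the URL, the directory, the host, and the primary domain) can be sketched as reputation propagation up the URL hierarchy. The threshold and the naive primary-domain cut below are illustrative choices, not a production policy:

```python
from urllib.parse import urlparse
from collections import defaultdict

hits = defaultdict(int)   # hierarchy node -> count of malicious files seen

def report_malicious(url: str) -> None:
    # Taint every level of the hierarchy the malicious file was seen at.
    p = urlparse(url)
    host = p.hostname or ""
    directory = p.path.rsplit("/", 1)[0] or "/"
    primary = ".".join(host.split(".")[-2:])  # naive primary-domain cut
    for node in (url, f"{host}{directory}", host, primary):
        hits[node] += 1

def risk(node: str, threshold: int = 3) -> str:
    # A node is rated bad once enough malicious files trace back to it.
    return "bad" if hits[node] >= threshold else "unrated"

for name in ("a.exe", "b.exe", "c.exe"):
    report_malicious(f"http://dl.example.com/files/{name}")

print(risk("dl.example.com"))
print(risk("example.com"))
print(risk("other.com"))
```

Real deployments need a public-suffix list rather than the two-label cut, and different thresholds per hierarchy level.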

3. The troubles APT brings to AVERs

Based on the above, let's look at some of the problems APT has brought to anti-virus workers.

APT's subversion of the security model

APT is a subversion of the pattern, and it subverts not only anti-virus but the whole information-security pattern. For example, Stuxnet's program signature used Realtek's certificate and Duqu used C-Media's, both mainstream vendors' signatures, so they could slip past certain anti-virus protection mechanisms and obtain execution. Why did anti-virus software trust digitally signed programs in the first place? Because the information-security system embodies a division of labor: the anti-virus vendor is chiefly responsible for blacklist detection, the CA and certificate authorities are chiefly responsible for whitelist trust, and the developers of applications and drivers are responsible for the security of their own signing environments.

In the recent series of APT and related attacks, this entire program-reputation chain began to collapse. RSA was breached and the random seeds used to generate its electronic tokens were stolen, forcing the recall of 40 million tokens, an event that indirectly led to intrusions at several related US military and major industrial contractors; a CA in the Netherlands was breached and ultimately shut down. If this is the state of the certification authorities themselves, what of the security of certificate users' signing environments? This is an upstream collapse. Our trouble is that it subverts not only our value rules and chains, but the whole original division of labor in information security.

APT's subversion of capture

Earlier we described the many traditional anti-virus capture methods; in the APT era, these methods nearly all fail. Look back at the Stuxnet incident: it spread first in Iran, where the anti-virus vendors concerned had no users, and the malicious code was highly targeted, so when the delivery location lies outside the area a vendor can observe, that vendor cannot capture it. Flame, likewise, was first flagged and reported by Kaspersky, but Kaspersky did not discover it proactively; a Saudi user contacted them. This throws us back to the era of raw data reporting. In the past there was a kind of symmetry among vendors: first, traditional malicious code did not propagate with absolute directionality but depended on coverage, so whether a large vendor with hundreds of millions of installs or a small one with hundreds of thousands, someone would capture it; second, the sample-exchange relationships among vendors compensated for any one vendor's insufficient coverage. In the APT era, the former is defeated by high directionality, and whether the latter will be constrained by national-background factors remains an open question.

APT's subversion of privacy

But even if a vendor's products do cover users under APT attack, can the vendor necessarily perceive the attack? This is not a simple question: first, does a sensor exist there; second, does the vendor have the right to obtain the data? Many APT attacks are based on compound-document format overflows; whether in the theft of RSA SecurID data or the intrusion into Google, a 0-day vulnerability in compound-document parsing served as the first attack wave. Most format-overflow exploits cannot be detected within anti-virus's known detection environment. And even where an anti-virus product or sensor is positioned to obtain such file samples, document files are inherently sensitive: if the vendor were authorized to collect them, the collection itself could become a leakage channel. So user authorization is hard to obtain.

APT's subversion of cost

Since the APT era began, the anti-virus vendor's role has shifted from single-point defense and rapid disinfection to in-depth analysis. Generally, by the time an APT is analyzed, the attack has already been declared successful; what must then be assessed are the actual losses and the full process chain and mechanism. Kaspersky once remarked: "The Stuxnet sample is 500 KB in total and took us several months to analyze; how long will the 20 MB of Flame files take?"

We have our own example. Stuxnet was said to propagate via USB disk, and we did read the file-traversal and copying logic in the code, but it would not propagate when executed in our environment. We finally proved through analysis that this is governed by a configuration involving seven flag bits; the flag at offset 0xC8 is off by default, i.e. USB propagation is disabled by default and occurs only when the flag bits are set appropriately. That single-point analysis took us about a week. How long, then, would the entire large module system take, and can we afford the cost? That is its time cost.
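The behavior just described, a propagation routine gated by a flag byte at offset 0xC8 in a configuration block, can be illustrated with a sketch of such a check. This is a reconstruction for illustration only, not Stuxnet's actual decompiled logic; the config size and the enable value are assumptions:

```python
USB_FLAG_OFFSET = 0xC8   # per the analysis above; surrounding logic is a sketch

def usb_spread_enabled(config: bytes) -> bool:
    # The spreading routine runs only if the flag byte is non-zero.
    if len(config) <= USB_FLAG_OFFSET:
        return False
    return config[USB_FLAG_OFFSET] != 0

default_config = bytes(0x100)             # all zero: spreading disabled
armed_config = bytearray(default_config)
armed_config[USB_FLAG_OFFSET] = 1         # operator-enabled build

print(usb_spread_enabled(default_config))
print(usb_spread_enabled(bytes(armed_config)))
```

This is also why sandbox runs of such samples show no propagation: the observable behavior depends on configuration data the analyst may not have.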

Reviewing our work on these three "worms": we produced 16 documents of comprehensive analysis, 10 of which were published on the network or in the media; translated 7 documents; manually analyzed nearly 100 sample files; extracted multiple network detection rules; wrote one removal tool; and built a real-scene simulation system for industrial-control-system security. From our earlier 14-page Stuxnet analysis report to the 100-page Flame report, our analysis investment kept growing; yet compared with foreign vendors we still found a huge gap. Although we were nearly level at the sample-acquisition point, at each subsequent analysis stage of the core process we ended up revealing the core principles one to two months behind others. That is a mismatch in hard capability, a mismatch in resource-organization capability, and an asymmetry of information. We must admit that such a real gap exists between domestic and foreign vendors.

4. Responses and attempts

Based on the above background, we have made some corresponding responses and attempts.

Prepositioning capture/analysis capability

Regarding APT, I think some points bear re-emphasizing. First, no vendor, whoever it is, can fully cover the target systems to be protected when they sit on isolated intranets; second, even where we do deploy, sending large volumes of files back for analysis is impossible. So in the APT era our conceptual breakthrough is to preposition capture and analysis capability: to push the vendor's capability forward to the user side, building a private cloud and supporting analysis equipment at the front, converting vendor capability into user capability, and turning that user capability into a product sold to users.

Decision expansion

What changes in the engine design details here? First, consider the verdict. The traditional anti-virus engine's alarm principle is: for each detection object, take the highest-risk result among those returned by the various judgment modules, compare it against the alarm threshold, and alarm if it is higher. But this mechanism is easily defeated by an attacker's advance testing. We therefore believe the fundamental change in the anti-virus engine's product form for APT is this: whoever can disclose more information detail wins. Our new static authenticator is accordingly built on a trust-evaluation process running from whitelist methods to blacklist methods, with the hits on dozens of decision points fully surfaced and disclosed to users. In the APT era, anti-virus can hardly rely on automated mechanisms alone to guarantee the safety of key users; human participation must increase, while traditional anti-virus technology serves more to raise the attacker's cost.
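The contrast between the two verdict styles (report only the maximum risk against a threshold, versus disclose every decision point that fired) can be sketched as follows. The decision points and scores are hypothetical:

```python
def traditional_verdict(hits: dict, threshold: int = 5) -> str:
    # Classic style: only the top score matters; below threshold -> silence.
    top = max(hits.values(), default=0)
    return "alarm" if top >= threshold else "silent"

def disclosing_verdict(hits: dict) -> dict:
    # Disclosure style: surface every decision point that fired,
    # sorted by risk, so a human analyst sees the full picture.
    return {
        "max_risk": max(hits.values(), default=0),
        "evidence": sorted(hits.items(), key=lambda kv: -kv[1]),
    }

hits = {"suspicious_section_name": 2,
        "imports_crypto_api": 3,
        "self_modifying_code": 4}

print(traditional_verdict(hits))   # stays silent: attacker pre-tested below 5
print(disclosing_verdict(hits))    # analyst still sees all three signals
```

An attacker who pre-tests a sample can tune it just below the threshold and silence the first engine; the second style leaves the evidence visible for human judgment.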

Combining dynamic and static analysis

Before the APT era, dynamic detection was being continuously strengthened; with APT, the role of static analysis comes up again, because only static analysis is unconditional, and the challenge of APT makes key users willing to accept the time cost of static analysis that they could not bear in the past. We have strengthened the response process in the static environment, and in the APT context static means need further improvement. From the dynamic-analysis perspective, locating format overflows precisely matters greatly, and domestic users have their own characteristics: uniquely domestic software such as WPS and Foxit Reader can also be overflow targets and must be considered in the design. The whole dynamic analysis environment should be built around such scenarios and then used to find the corresponding format overflows. In the past this kind of in-depth dynamic and static analysis was not within the scope of what AV vendors delivered to users; it was a vendor back-office function. We need to encapsulate these two links into a box, turn our engine into a system with analysis capability, and sell it to users.

The establishment of the virus archive, and its lessons

We hope to build a malicious code archive to provide more support for users of the Antiy engine; at present we have entered about 1.1 million items. I want to share a lesson here: much of the information generated in dynamic and static analysis is actually noise, produced by the analysis environment, the loading mechanism, and so on. Unless this information is effectively filtered, it is hard to focus on the genuinely distinguishing information. We are completing related research to improve the information quality.

5. We still have a long way to go in APT detection

· Combining detection with the scenario

From our data analysis, we think anti-virus detection divides into several levels. For example, when a PE file travels over the Internet, its risk level differs between an email attachment and an HTTP or FTP download. Generally, the risk is highest when the file is an email attachment, or when the attachment arrives inside an archive. Traditional file risk, or unknown-file risk, is computed from the file content alone; some approaches consider host-related scenarios, but how to associate risk effectively and deeply with the network scenario is something we must address going forward.
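The levels described above can be illustrated as a context multiplier applied to a content-only score. The weights here are invented for the sketch; only their ordering (archive-in-email highest, bare download lower) follows the text.

```python
CONTEXT_WEIGHT = {
    "email_attachment_in_archive": 2.0,   # highest risk, per the text
    "email_attachment":            1.8,
    "http_download":               1.2,
    "ftp_download":                1.1,
    "local_file":                  1.0,   # content-only baseline
}

def contextual_risk(content_score, context):
    """content_score: 0-100 from analysis of the file itself; the network
    scenario then scales it, capped at 100."""
    raw = content_score * CONTEXT_WEIGHT.get(context, 1.0)
    return min(100.0, round(raw, 2))

print(contextual_risk(40, "local_file"))         # 40.0
print(contextual_risk(40, "email_attachment"))   # 72.0
```

The same file content thus yields different alarms depending on where it was observed, which is exactly the scenario association the text calls for.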

· Automatic classification based on massive-sample analysis

At present we still mainly use basic statistical and aggregation methods, relying on semi-manual work to refine the resulting classifications.
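The kind of basic aggregation meant here can be as simple as grouping samples whose static feature sets coincide and leaving an analyst to name each cluster. A toy sketch, with invented feature labels:

```python
from collections import defaultdict

def cluster_by_features(samples):
    """samples: {sha256: frozenset(features)} -> clusters keyed by feature set."""
    clusters = defaultdict(list)
    for sha, feats in samples.items():
        clusters[feats].append(sha)
    return dict(clusters)

samples = {
    "a" * 64: frozenset({"upx", "imports:winsock"}),
    "b" * 64: frozenset({"upx", "imports:winsock"}),
    "c" * 64: frozenset({"autoit"}),
}
groups = cluster_by_features(samples)
print(len(groups))   # 2
```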

In the twenty or so years of the Internet era, countries initially raced along the information highway independently; through large-scale integration punctuated by small collisions and frictions, the world gradually formed a basic emergency-response chain. Generally speaking, the basic international strategic logic of many great powers rests on the construction of an imagined enemy: "a great country must have a great imagined enemy", an important support point and power source for the development of some countries and of national psychology. At present, trust among the world's major powers rests on a shared counter-terrorism background, but the death of Bin Laden as a political symbol will bring new changes to the pattern. Will it shift East and West from broad cooperation with small and medium confrontations to broad confrontation with small and medium cooperation? When the city gate catches fire, the fish in the moat suffer. Countries formed their emergency systems initially around fighting crime and terrorism, but that still fragile system may well be gradually overturned.

Twenty percent of the work can solve eighty percent of the problems, and anti-virus is no exception: with a modest amount of work you can detect most viruses. While all vendors were developing rapidly, an automated extraction point, a sample exchange cycle, comparative scanning plus hash extraction, and then a cloud-lookup mode constituted that 20% of the work, and it solved a wide range of things. But once a hard benchmark appears, vendors whose "hard capability" is insufficient are eliminated: when a huge and unavoidable price must be paid, those able to pay it survive, and those who cannot are weeded out. There have been such 80/20 watersheds before. Macro viruses, for instance, were the watershed for China's non-government and commercial anti-virus teams: easy to write, easy to vary, hard to detect. China had many small anti-virus teams at the time, but the macro-virus era eliminated them all, because Microsoft was unwilling to disclose the OLE structure to Chinese vendors and it had to be reverse-engineered; only those who could withstand months of reverse analysis succeeded. This watershed also exposed differences in non-technical capability: McAfee and other vendors could often obtain the OLE documentation through their contacts with Microsoft, while domestic vendors were refused. Likewise, in the APT era, whether each vendor's technical capability can keep up, stand up, and be afforded becomes the new watershed.
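The "20% of the work" pipeline described above can be sketched in a few lines: extract a captured sample, hash it, and answer client lookups from a cloud-style hash store. The class and names are illustrative, not any vendor's real system; the last line also shows why this easy 20% fails against a patient attacker.

```python
import hashlib

class CloudHashStore:
    def __init__(self):
        self.known_bad = set()

    def ingest(self, sample: bytes):
        """Automated extraction side: hash each captured sample and publish it."""
        self.known_bad.add(hashlib.sha256(sample).hexdigest())

    def lookup(self, sample: bytes) -> bool:
        """Client side: a cheap hash query replaces local deep analysis."""
        return hashlib.sha256(sample).hexdigest() in self.known_bad

store = CloudHashStore()
store.ingest(b"malicious payload v1")

print(store.lookup(b"malicious payload v1"))   # True
print(store.lookup(b"malicious payload v2"))   # False: any byte change evades it
```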

The original AV chain was oriented toward capturing, analyzing, and countering massive sample volumes: whoever had more endpoints and stronger, larger-scale back-end automated analysis held the initiative. Now the contest also involves deep analysis capability and patience. For example, Kaspersky and Symantec each once went through "forty-five days of silence": in those forty-five days Symantec thoroughly analyzed the whole process by which Stuxnet acted on WinCC and PLCs, while Kaspersky worked out the genetic association model between Stuxnet and Duqu. International vendors have invested enormously in APT sample analysis; their teams include not only analysts but also large numbers of system architects and core developers. To crack a key process they may go quiet for a stretch, releasing no news at all. Can we endure that in the short-term, utilitarian atmosphere of the domestic security industry? So I say this is a competition not only of technical capability but of patience.

APT defense is by no means a mass-market product, still less a cash cow. Ordinary netizens do not feel that APT affects their lives, so existing anti-virus products will remain much as they are, while anti-APT is a niche scenario. What an anti-APT product for a niche scenario should look like will inevitably force a change in the product model.

Why have Chinese vendors, and the industry, lost at the starting line? First, a congenital deficiency: information asymmetry. The operating system is dominated by others, large-scale search and data aggregation are dominated by others, the key hardware security links are dominated by others, and the whole basic industrial system and information technology capability are still dominated by others. Second, an acquired deficiency: China is actually a country whose independent security vendors are very weak. Independent vendors are the driving force of basic security research, and that force is quite insufficient in China, especially on the main battlefield, where Internet companies are currently the main driving force. Under the Internet model, vendors' thinking often does not value the systematization, rigor, and patience of traditional security teams, valuing instead acuteness, speed, and front-end experience. This is not a comparison of better or worse; it is determined by each team's value orientation. Human resources, material resources, financial resources, analysis capability, and large collections of samples and genes are the cost of comprehensive anti-APT confrontation, and here we have lost at the starting line.

The first time was on March 10, 2003. Until March 8 we had all been responding to the password worm. Exhausted and haggard, we suddenly found another worm on a server, which we named Rongrong. It spread through the same infection mechanism as the password worm and had in fact existed longer, yet we had not found it. When we first worked out how the password worm spread we were excited, and all the team's best hands were thrown at it; when this second worm suddenly appeared we fell into great fear, because the team was very small then, had exhausted its strength and energy, and felt powerless and incompetent. But what the event triggered was that we completed, relatively early in the history of Chinese anti-virus, a full analysis report on a behaviorally related family: it analyzed all the worms retrieved from our sample library that used the psexec remote-delivery mechanism. It was the first report of more than ten pages in Antan's history, and it was also the first in China to propose a method of grouping samples into families based on analytical association.

April 2005: a failed manual removal of a Trojan

The second time was in April 2005, also a fairly important node. During an on-site investigation at a user's site, we found a rootkit. We spent about an hour on manual removal with a large number of our own and third-party tools, and in the end we failed. It suddenly occurred to me that in the DOS era every piece of malicious code was analyzed and extracted by hand; I felt as if we were back in the age of manual confrontation. Driven by this rootkit, and by the Trojan data we were collecting at the time, we produced an internal technical report, both predictive and data-supported, whose basic conclusion was that China's information security might collapse under Trojans.

Every time I feel the spiral of history seemingly return to its starting point, I feel a sense of fear. We have done a great deal of work on the automated analysis pipeline over the years, yet I still feel inadequate to cope with new threats. But we are not pessimistic. We once thought worms were hard to fight and could paralyze the global Internet in a short time, but they proved not to be irresistible. The rapid expansion of Trojans also worried us, but their geometric growth has begun to decline. In fact we never stopped dealing with them; we were perhaps just not proactive enough, and did not translate the work into an effective product form. I firmly believe that our fear of, and powerlessness before, our opponents is not weakness but the source of our awe of technology and our perseverance.

I suddenly remember Engels' famous saying that "every great disaster in human history is compensated for by great progress." I do not believe APT is the ultimate threat. APT may never be eliminated, but I believe it will be contained. New and higher-level threats will appear, and we will find ways to deal with them too. This history of confrontation will run through the whole history of human information technology, and we are merely fragments of that history.