
Dos and Don’ts of US Spy Implant Tradecraft

In 2017, the Central Intelligence Agency lost control of a large cache of documents and cyber-weapons from an air-gapped system. From the information leaked through the WikiLeaks Vault 7 release, we now know that the CIA maintained an extensive cyber arsenal, hoarded 0-day exploits, and developed a wide range of attacks capable of compromising automobiles, televisions, phones, and operating systems.

Among the many important documents released, an essential read for malware researchers is one titled “Development Tradecraft Dos and Don’ts” from the CIA’s Information Operations Center. This document explains the techniques the CIA uses when developing “implants,” or malware. From this publication, we can peer into the minds of the government’s top spy outfit and see how it creates undetectable malware.

Malware comes and goes, and the cyber arsenal that was published to the world will eventually be mitigated by security researchers, developers, and systems administrators. But that does not mean the CIA or any other actor will stop creating high-threat malware to compromise systems. Unfortunately, because of the CIA’s negligent safekeeping of this information, attackers around the world now have tools and direction for creating nation-state-level, detection-resistant malicious code.

The following was taken from the document for easier reading. It includes all the dos and don’ts of creating malware, along with the outfit’s rationale for each position. This publication is a must-read for aspiring and seasoned security researchers and analysts alike who want to learn how the best hackers in the world hide their malware.

If malware writers were not following these guidelines before the release, you can be sure they are using them now, so it’s beneficial to know what to look for, how an attacker hides their attacks, and the reasoning behind it. Enjoy!

Dos and Don’ts Of Implant Development Tradecraft

General

  • DO obfuscate or encrypt all strings and configuration data that directly relate to tool functionality. Consider de-obfuscating strings in memory only at the moment the data is needed. When a previously de-obfuscated value is no longer needed, it should be wiped from memory.

String data and/or configuration data is very useful to analysts and reverse engineers.

  • DO NOT decrypt or de-obfuscate all data or configuration data immediately upon execution.

Raises the difficulty for automated dynamic analysis of the binary to find sensitive data.

  • DO explicitly remove sensitive data (encryption keys, raw collection data, shellcode, uploaded modules, etc.) from memory as soon as the data is no longer needed in plain-text form.

    DO NOT RELY ON THE OPERATING SYSTEM TO DO THIS UPON TERMINATION OF EXECUTION.

Raises the difficulty for incident response and forensics review.

  • DO utilize a deployment-time unique key for obfuscation/de-obfuscation of sensitive strings and configuration data.

Raises the difficulty of analysis of multiple deployments of the same tool.
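The document does not specify any particular scheme, but the combination of the guidelines above can be sketched with a toy repeating-XOR mask keyed per deployment, where the plaintext exists only in a mutable buffer that is zeroed after use. All names, and the XOR cipher itself, are illustrative assumptions, not the actual implementation described in the leak:

```python
import secrets

def obfuscate(data: bytes, key: bytes) -> bytes:
    """Repeating-XOR mask; the same call both masks and unmasks."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def use_secret(masked: bytes, key: bytes) -> int:
    """De-obfuscate at the moment of use, then explicitly wipe the buffer."""
    plain = bytearray(obfuscate(masked, key))  # plaintext lives here briefly
    result = len(plain)                        # stand-in for real work
    for i in range(len(plain)):                # explicit wipe: do not rely on
        plain[i] = 0                           # the OS or GC to clear memory
    return result

# A key generated at deployment time makes every build's masked strings differ.
deploy_key = secrets.token_bytes(16)
masked = obfuscate(b"sensitive-config-value", deploy_key)
```

Because the key differs per deployment, two deployments of the same tool carry different byte patterns, which is what frustrates signature reuse across samples.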

  • DO strip all debug symbol information, manifests (MSVC artifact), build paths, and developer usernames from the final build of a binary.

Raises the difficulty for analysis and reverse-engineering, and removes artifacts used for attribution/origination.

  • DO strip all debugging outputs (e.g. calls to printf(), OutputDebugString(), etc) from the final build of a tool.

Raises the difficulty for analysis and reverse-engineering.

  • DO NOT explicitly import/call functions that is [sic] not consistent with a tool’s overt functionality (i.e. WriteProcessMemory, VirtualAlloc, CreateRemoteThread, etc. in a binary that is supposed to be a notepad replacement).

Lowers potential scrutiny of binary and slightly raises the difficulty for static analysis and reverse-engineering.

  • DO NOT export sensitive function names; if exports are required for a binary, utilize an ordinal or a benign function name.

Raises the difficulty for analysis and reverse-engineering.

  • DO NOT generate crashdump files, coredump files, “Blue” screens, Dr Watson or other dialog pop-ups and/or other artifacts in the event of a program crash.

    DO attempt to force a program crash during unit testing in order to properly verify this.

Avoids suspicion by the end user and system admins, and raises the difficulty for incident response and reverse-engineering.

  • DO NOT perform operations that will cause the target computer to be unresponsive to the user (e.g. CPU spikes, screen flashes, screen “freezing”, etc.)

Avoids unwanted attention from the user or system administrator to the tool’s existence and behavior.

  • DO make all reasonable efforts to minimize binary file size for all binaries that will be uploaded to a remote target (without the use of packers or compression). Ideal binary sizes should be under 150KB for a fully featured tool.

Shortens overall “time on air”: not only the time to get the tool on target, but also the time to execute functionality and clean up.

  • DO provide a means to completely “uninstall”/“remove” implants, function hooks, injected threads, dropped files, registry keys, services, forked processes, etc. whenever possible. Explicitly document (even if the documentation is “There is no uninstall for this <feature>”) the procedures, permissions required, and side effects of removal.

Avoids unwanted data left on target. Also, proper documentation allows operators to make better operational risk assessments and fully understand the implications of using a tool or a specific feature of a tool.

  • DO NOT leave dates/times such as compile timestamps, linker timestamps, build times, access times, etc. that correlate to general US core working hours (i.e. 8am-6pm Eastern time).

Avoids direct correlation to origination in the United States.

  • DO NOT leave data in a binary file that demonstrates CIA, USG, or its witting partner companies involvement in creation or use of the binary/tool.

Attribution of binary/tool/etc by an adversary can cause irreversible impacts to past, present, and future USG operations and equities.

  • DO NOT have data that contains CIA or USG cover terms, compartments, operation code names or other CIA and USG specific terminology in the binary.

Attribution of binary/tool/etc by an adversary can cause irreversible impacts to past, present, and future USG operations and equities.

  • DO NOT have “dirty words” (see dirty words list – TBD) in the binary.

Dirty words, such as hacker terms, may cause unwarranted scrutiny of the binary file in question.

Networking

  • DO use end-to-end encryption for all network communications.

    Never use networking protocols which break the end-to-end principle with respect to encryption of payloads.

Stifles network traffic analysis and avoids exposing operational/collection data.

  • DO NOT solely rely on SSL/TLS to secure data in transit.

Numerous man-in-the-middle attack vectors and publicly disclosed flaws exist in the protocol.

  • DO NOT allow network traffic, such as C2 packets, to be re-playable.

Protects the integrity of operational equities.
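One standard way to meet this requirement (a sketch of a common anti-replay design, not the document’s prescribed one) is to bind every message to a fresh nonce under an HMAC and have the receiver reject any nonce it has already accepted:

```python
import hmac, hashlib, os

SECRET = os.urandom(32)   # shared key, illustration only
seen_nonces = set()       # receiver-side replay cache

def make_message(payload: bytes) -> bytes:
    """Prefix the payload with a fresh nonce and an HMAC over both."""
    nonce = os.urandom(16)
    tag = hmac.new(SECRET, nonce + payload, hashlib.sha256).digest()
    return nonce + tag + payload

def accept_message(msg: bytes):
    """Return the payload, or None if the message is forged or replayed."""
    nonce, tag, payload = msg[:16], msg[16:48], msg[48:]
    expected = hmac.new(SECRET, nonce + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None            # forged or corrupted
    if nonce in seen_nonces:
        return None            # replayed: reject
    seen_nonces.add(nonce)
    return payload
```

A captured packet is then worthless to a defender or rival who re-sends it, since the nonce has already been consumed.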

  • DO use IETF RFC-compliant network protocols as a blending layer. The actual data, which must be encrypted in transit across the network, should be tunneled through a well-known and standardized protocol (e.g. HTTPS).

Custom protocols can stand out to network analysts and IDS filters.

  • DO NOT break compliance of an RFC protocol that is being used as a blending layer. (i.e. Wireshark should not flag the traffic as being broken or mangled)

Broken network protocols can easily stand out in IDS filters and network analysis.

  • DO use variable size or timing (aka jitter) of beacons/network communications.

    DO NOT predictably send packets with a fixed size and timing.

Raises the difficulty of network analysis and correlation of network activity.
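A minimal sketch of the jitter rule above, with the base period, jitter fraction, and padding bounds all assumed values rather than anything taken from the leak:

```python
import random

def next_interval(base: float = 60.0, jitter: float = 0.3) -> float:
    """Return a sleep interval drawn uniformly within +/- jitter of base,
    so beacons never fire on a fixed, correlatable schedule."""
    return random.uniform(base * (1 - jitter), base * (1 + jitter))

def pad_packet(payload: bytes, min_size: int = 64, max_size: int = 512) -> bytes:
    """Pad a packet to a random length so packet sizes do not repeat."""
    target = max(len(payload), random.randint(min_size, max_size))
    return payload + bytes(target - len(payload))  # zero-byte padding
```

With a 60-second base and 30% jitter, consecutive beacons land anywhere between 42 and 78 seconds apart, which breaks the simple periodicity checks that flag fixed-interval traffic.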

  • DO proper cleanup of network connections.

    DO NOT leave around stale network connections.

Raises the difficulty of network analysis and incident response.

Disk I/O

  • DO explicitly document the “disk forensic footprint” that could be potentially created by various features of a binary/tool on a remote target.

Enables better operational risk assessments with knowledge of potential file system forensics artifacts.

  • DO NOT read, write and/or cache data to disk unnecessarily. Be cognizant of 3rd party code that may implicitly write/cache data to disk.

Lowers the potential for forensic artifacts and potential signatures.

  • DO NOT write plain-text collection data to disk.

Raises difficulty of incident response and forensic analysis.

  • DO encrypt all data written to disk.

Disguises intent of file (collection, sensitive code, etc.) and raises difficulty of forensic analysis and incident response.

  • DO utilize a secure erase when removing a file from disk that wipes at a minimum the file’s filename, datetime stamps (create, modify, and access) and its content.

    (Note: The definition of “secure erase” varies from filesystem to filesystem, but at least a single pass of zeros over the data should be performed. The emphasis here is on removing all filesystem artifacts that could be useful during forensic analysis.)

Raises difficulty of incident response and forensics analysis.
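The minimum described in the note above (one pass of zeros, plus destroying the original filename) can be sketched as follows. This is a simplified assumption of the approach, not the leaked tooling: journaling filesystems and SSD wear-leveling can still retain copies, which is exactly why the document says the definition varies:

```python
import os, secrets

def secure_erase(path: str) -> None:
    """Overwrite a file with one pass of zeros, rename it, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(b"\x00" * size)   # single pass of zeros over the content
        f.flush()
        os.fsync(f.fileno())      # force the overwrite down to disk
    # Rename to a random name so the original filename is not left in a
    # simple directory scan, then remove the directory entry itself.
    new_path = os.path.join(os.path.dirname(path) or ".", secrets.token_hex(8))
    os.rename(path, new_path)
    os.remove(new_path)
```

Overwriting before unlinking matters because a plain delete only drops the directory entry; the data blocks, name, and timestamps otherwise remain recoverable by forensic carving tools.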

  • DO NOT perform Disk I/O operations that will cause the system to become unresponsive to the user or alert a system administrator.

Avoids unwanted attention from the user or system administrator to tool’s existence and behavior.

  • DO NOT use a “magic header/footer” for encrypted files written to disk. All encrypted files should be completely opaque data files.

Avoids signature of custom file format’s magic values.
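From the defender’s side, the absence of a magic header cuts both ways: there is nothing to signature on, but a file of near-uniform random bytes is itself unusual. A common analyst heuristic is Shannon entropy in bits per byte, which approaches 8.0 for encrypted or compressed data (the 7.5 threshold below is an assumption, not a standard):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_opaque(data: bytes, threshold: float = 7.5) -> bool:
    """Flag fully opaque (likely encrypted/compressed) data with no header."""
    return len(data) > 0 and shannon_entropy(data) >= threshold
```

This is why opaque output files lower one detection surface (magic-value signatures) while slightly raising another (entropy scanning of unexplained files).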

  • DO NOT use hard-coded filenames or filepaths when writing files to disk. This must be configurable at deployment time by the operator.

Allows operator to choose the proper filename that fits with the operational target.

  • DO have a configurable maximum size limit and/or output file count for writing encrypted output files.

Avoids situations where a collection task can get out of control and fill the target’s disk, which will draw unwanted attention to the tools and/or the operation.

Dates/Time

  • DO use GMT/UTC/Zulu as the time zone when comparing date/time.

Provides consistent behavior and helps ensure “triggers/beacons/etc” fire when expected.

  • DO NOT use US-centric timestamp formats such as MM-DD-YYYY. YYYYMMDD is generally preferred.

Maintains consistency across tools, avoids associations with the United States.
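Both timestamp rules above fit in a couple of lines (the function name is illustrative):

```python
from datetime import datetime, timezone

def utc_stamp(dt: datetime) -> str:
    """Normalize any aware datetime to UTC before comparing or rendering,
    and format the date as YYYYMMDD rather than US-centric MM-DD-YYYY."""
    return dt.astimezone(timezone.utc).strftime("%Y%m%d")
```

Normalizing to UTC first matters: a trigger computed in local time can fire a day early or late near midnight, and a US-format date string inside a binary is a small attribution clue.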

Personal Security Products /Antivirus

  • DO NOT assume a “free” PSP product is the same as a “retail” copy. Test on all SKUs where possible.

While the PSP/AV product may come from the same vendor and appear to have the same features despite having different SKUs, they are not the same. Test on all SKUs where possible.

  • DO test PSPs with live (or recently live) internet connections where possible.

    NOTE: This can be a risk-vs-gain balance that requires careful consideration and should not be done haphazardly with in-development software. It is well known that PSP/AV products with a live internet connection can and do upload software samples based on varying criteria.

PSP/AV products exhibit significant differences in behavior and detection when connected to the internet versus when not.

Michael has been a professional in the information technology field for over 10 years, specializing in software engineering and systems administration. He studied network security and holds a software engineering degree from Milwaukee Area Technical College, along with thousands of hours of self-taught learning. He mainly writes about technology, current events, and coding. Michael is also the founder of Sof Digital, a U.S.-based software development firm. His hobbies are archery, turntablism, disc golf, and rally racing.
