Data loss prevention software
Data loss prevention software detects potential data breaches and data exfiltration transmissions and prevents them by monitoring, detecting and blocking sensitive data while in use (endpoint actions), in motion (network traffic), and at rest (data storage).
The terms "data loss" and "data leak" are related and are often used interchangeably. Data loss incidents turn into data leak incidents in cases where media containing sensitive information is lost and subsequently acquired by an unauthorized party. However, a data leak is possible without losing the data on the originating side. Other terms associated with data leakage prevention are information leak detection and prevention (ILDP), information leak prevention (ILP), content monitoring and filtering (CMF), information protection and control (IPC) and extrusion prevention system (EPS), as opposed to intrusion prevention system.
The technological means employed for dealing with data leakage incidents can be divided into four categories: standard security measures; advanced/intelligent security measures; access control and encryption; and designated DLP systems.
Standard security measures, such as firewalls, intrusion detection systems (IDSs) and antivirus software, are commonly available products that guard computers against outsider and insider attacks. A firewall, for example, prevents outsiders from accessing the internal network, and an intrusion detection system detects intrusion attempts by outsiders. Inside attacks can be averted through antivirus scans that detect Trojan horses that send confidential information, and by the use of thin clients that operate in a client-server architecture with no personal or sensitive data stored on a client device.
Advanced security measures employ machine learning and temporal reasoning algorithms to detect abnormal access to data (e.g., databases or information retrieval systems) or abnormal email exchanges, honeypots to detect authorized personnel with malicious intentions, and activity-based verification (e.g., recognition of keystroke dynamics) and user activity monitoring to detect abnormal data access.
Designated DLP systems detect and prevent unauthorized attempts to copy or send sensitive data, intentionally or unintentionally, mainly by personnel who are authorized to access the sensitive information. To classify certain information as sensitive, these systems use mechanisms such as exact data matching, structured data fingerprinting, statistical methods, rule and regular expression matching, published lexicons, conceptual definitions and keywords.
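Rule and regular expression matching, one of the mechanisms listed above, can be sketched in a few lines. The patterns and category names below are illustrative assumptions, not an actual product's rule set; real DLP engines ship far larger, tuned pattern libraries. The Luhn checksum is included to show how a checksum can suppress false positives on random digit runs that merely look like card numbers:

```python
import re

# Hypothetical patterns for two common sensitive-data types.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum: rejects digit runs that only resemble card numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", number)]
    digits.reverse()
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def classify(text: str) -> list:
    """Return the sensitive-data categories detected in `text`."""
    hits = []
    if SSN_RE.search(text):
        hits.append("ssn")
    for m in CARD_RE.finditer(text):
        if luhn_valid(m.group()):
            hits.append("credit_card")
            break
    return hits
```

For example, `classify("My SSN is 123-45-6789")` returns `["ssn"]`, while a sixteen-digit string that fails the Luhn check is ignored.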
Network (data in motion) technology is typically installed at network egress points near the perimeter. It analyzes network traffic to detect sensitive data that is being sent in violation of information security policies. Multiple security control points may report activity to be analyzed by a central management server.
Endpoint (data in use) systems run on internal end-user workstations or servers. Like network-based systems, endpoint-based technology can address internal as well as external communications. It can therefore be used to control information flow between groups or types of users (e.g., "Chinese walls"). It can also control email and instant messaging communications before they reach the corporate archive, such that a blocked communication (i.e., one that was never sent, and therefore not subject to retention rules) will not be identified in a subsequent legal discovery situation. Endpoint systems have the advantage that they can monitor and control access to physical devices (such as mobile devices with data storage capabilities) and in some cases can access information before it is encrypted. Some endpoint-based systems provide application controls to block attempted transmissions of confidential information and provide immediate user feedback. A disadvantage is that they must be installed on every workstation in the network and cannot be used on mobile devices (e.g., cell phones and PDAs) or where they cannot practically be installed (for example, on a workstation in an Internet café).
DLP includes techniques for identifying confidential or sensitive information. Sometimes confused with discovery, data identification is a process by which organizations use a DLP technology to determine what to look for.
Data is classified as either structured or unstructured. Structured data resides in fixed fields within a file such as a spreadsheet, while unstructured data refers to free-form text or media in text documents, PDF files and video. An estimated 80% of all data is unstructured and 20% structured. Data classification is divided into content analysis, focused on structured data, and contextual analysis, which looks at the place of origin or the application or system that generated the data.
Methods for describing sensitive content are abundant. They can be divided into precise and imprecise methods. Precise methods involve content registration and trigger almost zero false positive incidents. All other methods are imprecise and can include: keywords, lexicons, regular expressions, extended regular expressions, metadata tags, Bayesian analysis and statistical analysis techniques such as machine learning, behavior analytics, hierarchical threat modeling and predefined DLP policy templates.
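Content registration, the basis of the precise methods above, can be sketched as storing digests of known sensitive values and checking scanned text for exact occurrences. The `ExactMatcher` class and the whitespace tokenizer below are simplifying assumptions for illustration; production engines use sliding windows and type-aware tokenizers over registered records:

```python
import hashlib

def fingerprint(value: str) -> str:
    # Normalize then hash, so only the digest of a registered value
    # needs to be stored, not the sensitive value itself.
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

class ExactMatcher:
    """Registers known sensitive values and detects exact occurrences."""

    def __init__(self, sensitive_values):
        self.registered = {fingerprint(v) for v in sensitive_values}

    def scan(self, document: str):
        # Naive whitespace tokenization; a real engine would slide a
        # window over the text instead.
        return [tok for tok in document.split()
                if fingerprint(tok) in self.registered]
```

Because only registered values can match, this approach yields essentially no false positives, which is exactly the property the text attributes to precise methods.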
The strength of the analysis engine directly relates to its accuracy. The accuracy of DLP identification is important to lowering or avoiding false positives and negatives. Accuracy can depend on many variables, some of which may be situational or technological. Testing for accuracy is recommended to ensure virtually zero false positives and negatives. High false positive rates cause the system to function as mere data leak detection (DLD) rather than prevention.
Data leak detection
Sometimes a data distributor gives sensitive data to one or more third parties. Sometime later, some of the data is found in an unauthorized place (e.g., on the web or on a user's laptop). The distributor must then investigate the source of the leak.
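A simple way to investigate the source of such a leak is to compare the leaked records against what each third party received and rank recipients by overlap. The sketch below assumes each recipient's distributed set is known; the function name and scoring rule are illustrative, and real leak-attribution models account for records shared by several recipients and for data obtainable elsewhere:

```python
def likely_source(leaked, distributions):
    """Rank recipients by the fraction of leaked records each received.

    `distributions` maps a recipient name to the records it was given.
    Returns the top-scoring recipient plus all scores.
    """
    leaked = set(leaked)
    scores = {
        recipient: len(leaked & set(received)) / len(leaked)
        for recipient, received in distributions.items()
    }
    return max(scores, key=scores.get), scores
```

For example, if `vendor_a` received records {1, 2, 3} and `vendor_b` received {3, 4, 5}, a leak of {4, 5} implicates `vendor_b` with a score of 1.0.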
"Data at rest" specifically refers to old archived information. This information is of great concern to businesses and government institutions simply because the longer data is left unused in storage, the more likely it might be retrieved by unauthorized individuals. Protecting such data involves methods such as access control, data encryption and data retention policies.
"Data in use" refers to data that the user is currently interacting with. DLP systems that protect data in-use may monitor and flag unauthorized activities. These activities include screen-capture, copy/paste, print and fax operations involving sensitive data. It can be intentional or unintentional attempts to transmit sensitive data over communication channels.
"Data in motion" is data that is traversing through a network to an endpoint destination. Networks can be internal or external. DLP systems that protect data in-motion monitor sensitive data traveling across a network through various communication channels.