Learn about Managed DLP
Managed DLP leverages the SOC's expert security analysis to provide configuration support: classifying data, defining egress points, developing policies, and tuning monitoring to prevent the misplacement of sensitive data. DLP is only one part of a comprehensive data security strategy.
- What Exactly Is DLP?
- How is DLP Implemented?
- What is Managed DLP?
- What is the Evolution of a DLP Program?
- What is Data Liability?
Managed DLP: FAQ
1. What Exactly Is DLP?
Data Loss Prevention (DLP) is a discipline that encompasses monitoring and controlling what data can be sent outbound from inside an organization's network in order to prevent breach or loss of sensitive data. The term "loss" is used in the sense of losing control of the data: its privacy or proprietary nature has been compromised by being stored outside the network perimeter of the organization.
A modern DLP strategy is critical for any organization attempting to avoid deliberate or inadvertent data leakage from internal sources. Several technology solutions exist on the market, but adoption of a DLP product is only one piece of the puzzle. The grander scope of a data security strategy utilizes multiple tactics by combining DLP with EDR and other tools, so the organization is not necessarily relying completely on DLP. It's best used as a mitigating tool to verify that internal users are following policy.
2. How is DLP Implemented?
DLP implementation begins with defining the goals and strategy for the program. Are you looking to satisfy regulatory compliance? Or is your DLP program part of a larger campaign of internal governance?
Next, the data is identified and classified using a labeling convention. Is it structured or unstructured data? By monitoring the files and contents of unstructured data like emails, both in storage and in motion, the data can be subjected to analysis by a rules engine to determine if it violates defined policies for its usage. If the data violates a policy, automated action can be configured to warn the user or trigger other controls, like account suspension.
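To make the rules-engine step concrete, here is a minimal sketch in Python. The rule names, regex patterns, and function shape are illustrative assumptions, not the API of any particular DLP product:

```python
import re

# Hypothetical policy rules: each label maps to a pattern that flags
# sensitive content. Real DLP engines ship far richer rule sets.
POLICY_RULES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

def scan_message(text):
    """Return the list of policy labels the given text violates."""
    return [label for label, pattern in POLICY_RULES.items() if pattern.search(text)]

print(scan_message("Please charge card 4111 1111 1111 1111 for the invoice."))
# ['credit_card']
```

In a full implementation, a hit from `scan_message` would feed the automated actions described above: warning the user, blocking the egress, or suspending the account.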
DLP products, similar in fashion to EDR, use two primary methods to monitor:
* A local agent running on the endpoint (preferred for a remote workforce)
* A network monitor that analyzes traffic to identify data egress
DLP implementation for any organization is a campaign of sorts. It begins with a philosophy of internal governance, which requires classification and labeling of data as well as development of policies for what can and cannot leave the network.
As data stores grow, liability grows. A policy of retention and deletion becomes necessary for risk management, especially with regard to email and other stored communications.
3. What is Managed DLP?
Managed DLP is a service provided by an experienced Managed Services Provider to maximize the effectiveness and efficiency of a DLP implementation. In the context of security, a provider can utilize specialized knowledge of data egress and exfiltration techniques to refine and tune DLP and other tools.
4. What is the Evolution of a DLP Program?
Most DLP programs begin with a passive approach — monitoring, data classification, and analysis to further determine what data should be identified. After this step has been taken, policies can be developed and new controls can be implemented to enforce the policies.
Theoretically, the ultimate mature DLP system would be able to recognize any sensitive data, alert a user that it violates policy, and carry out action to block the egress or suspend user rights. Many organizations will adopt a model that places the entirety of data security on DLP: watch the egress points and let the SOC catch the violations.
However, it soon becomes apparent that the effort to define all the regex filters to match data, the tuning to limit false positives, and so on can prove overwhelming.
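As a small illustration of why tuning matters: a naive digit regex for payment cards flags many false positives (invoice numbers, phone numbers), so a common refinement is to validate candidates with a Luhn checksum before alerting. A sketch, with illustrative names:

```python
def luhn_valid(number: str) -> bool:
    """Luhn checksum test, used to filter candidate card numbers a regex found."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    # Double every second digit from the right; subtract 9 if the result exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

print(luhn_valid("4111111111111111"))  # True  (well-known test number)
print(luhn_valid("4111111111111112"))  # False (fails the checksum)
```

Multiply this kind of refinement across every data type an organization handles, and the scale of the tuning effort becomes clear.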
An experienced Managed Security Services group will use the tool within a defined scope as part of a broader solution, rather than relying on it as the only data security measure.
As an organization grows, so does its data footprint. That data may include emails, databases, and general file storage. Some of that data may become less relevant for daily operations to the point that it's obsolete. This data still carries a liability because it may contain sensitive information. If a breach occurs, it can expose the organization to liability well after its usefulness has expired.
A data breach is a gift that keeps on giving — often the ramifications aren't realized until well after the event. Regulatory compliance requirements could mean fines, audits, and more dollars spent.
For these reasons, a DLP strategy may include deletion of data that's classified as sensitive and no longer useful: for example, emails older than three years or stored payment cards that have expired.
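A retention policy like this can be expressed as a simple lookup from label to maximum age. The sketch below is illustrative; the labels, durations, and record shape are assumptions mirroring the examples above, not a real product's schema:

```python
from datetime import datetime, timedelta

# Hypothetical retention table: classification label -> maximum age.
RETENTION = {
    "email": timedelta(days=3 * 365),  # roughly three years
}

def eligible_for_deletion(records, now):
    """Return IDs of records whose age exceeds their label's retention limit."""
    expired = []
    for record in records:
        max_age = RETENTION.get(record["label"])
        if max_age is not None and now - record["created"] > max_age:
            expired.append(record["id"])
    return expired

now = datetime(2024, 6, 1)
records = [
    {"id": "msg-1", "label": "email", "created": datetime(2019, 1, 15)},  # past retention
    {"id": "msg-2", "label": "email", "created": datetime(2023, 9, 1)},   # recent, kept
]
print(eligible_for_deletion(records, now))  # ['msg-1']
```

This only works, of course, if the classification and labeling groundwork described earlier has been done: without a label, a record never matches a retention rule.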
Glossary of Terms
Data Classification is the process of categorizing the different types, purposes, and contexts of data to determine its handling. Through tagging or labeling, a DLP engine can effectively apply the appropriate policies and controls to data. In general, classification is a manual process, but it can be automated to some extent.
Data labels are simple flags applied to files or data so that they can be effectively classified. This process is similar to a tagging system. For example, a Retention Level label set to "1" may mean that the data is retained indefinitely.
Governance holds organizations to defined rules. Regulatory compliance is law: rules defined by public governing bodies. Internal governance, by contrast, is a voluntary, self-defined operating standard implemented to ensure performance and reduce risk.
The term "audit" is used in a few different contexts in data security. There's an audit similar to a tax audit, where a regulatory body seeks to find rule violations. Also, there's the concept of an internal audit to ensure that a DLP program is performing as designed.
Egress means to exit, leave, or escape. An egress point, in terms of data security, is any conduit to outside the defined boundaries of the organization. Common defined egress points are outgoing email, cloud-backed applications that upload data, mapped network drives, websites with upload functionality, and removable flash drives.
Structured data is normally what's seen in a database or some pre-defined data model that allows for discrimination in the data stored and retrieved. It's made up of a schema, clearly defined patterns, or formats that are easy to consume programmatically.
- Relational databases
- Formatted data exports (CSV, XML, etc.)
Unstructured data is everything that isn't structured: it's undefined, inconsistent, and non-standard in size or nature.
- Text files
- Social media
- Media files (images and videos)
- Untracked metadata
To manage risk from data liability, certain data should not be retained past a defined age. Using a system of labeling and classification, data can be destroyed based on its retention level to alleviate liability.