
The new normal in DLP

By Dipin Behl, Principal Engineer, Stryker

All businesses today rely heavily on data, not just for critical decision making but for normal day-to-day functioning. They are not only consuming data but also generating it at an unprecedented pace, so a compromise in these data streams can hurt a business badly and, in the worst scenarios, even bring operations to a halt. The pandemic forced many organizations to rethink their workplace policies and adopt remote working as the new norm. Digitally driven organizations were at the forefront, designing solutions to enable remote work or to support it in every possible way. The major residual effect of this change was the movement of digital assets to public cloud providers, with Microsoft, Google, and Amazon taking the lion's share.

The under-discussed big shift

Most organizations were already leveraging public cloud services such as Microsoft 365, Google Workspace, or Salesforce to varying degrees. With the new normal, more and more services were introduced, inducted, and incorporated, either into an existing mesh or as standalone sandboxes. Yet this shift was rarely given special treatment: corporations revisited their existing Data Loss Prevention (DLP) strategies, but almost never gave them the due diligence they deserved.

Traditionally, digital threats such as cyber attacks, system failures, and data corruption are considered the major threats against data, with human error and natural disasters as important contributing factors. The move to public cloud providers makes sense because all the major players in the space have battle-hardened services and technologies in place against data breaches and data loss, with quick turnaround times for newly discovered vulnerabilities. They offer pre-configured platforms built on industry best practices that can be fine-tuned, and features like the ability to use a tenant's own encryption keys add a further layer of security. They already have fault-tolerant strategies for data retention and replication across multiple regions, which are essential for preventing data loss and also improve its accessibility. The concepts of edge computing and hybrid clouds (public, private, and self-hosted datacenters) add immense value to securing an organization's essential data. I believe the biggest value a move to cloud providers adds to an organization's data security and loss-prevention strategy is not having to hire a team of technology specialists to achieve the goal. Offloading data handling to these configurable providers lets organizations focus more on their core competencies.
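To make the encryption-key point concrete, here is a minimal sketch of bring-your-own-key encryption using AWS S3 with a customer-managed KMS key. The bucket name, object key, and key alias are illustrative placeholders, not real resources; other providers offer equivalent mechanisms.

```python
# Illustrative sketch: store an object under a customer-managed KMS key (BYOK)
# so the tenant, not the provider's default key, controls encryption at rest.
import boto3

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-corp-records",         # placeholder bucket name
    Key="finance/q3-forecast.xlsx",        # placeholder object key
    Body=b"...spreadsheet bytes...",
    ServerSideEncryption="aws:kms",        # encrypt with KMS, not S3-managed keys
    SSEKMSKeyId="alias/example-corp-dlp",  # tenant-controlled key alias (assumed)
)
```

Because the tenant owns the key, revoking or rotating it cuts off access to the data independently of the provider's own controls.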

With great power comes great responsibility

Cloud providers employ teams of technology specialists and industry veterans who design strategies and implement best practices to make sure your data stays secure and available. Yet it was estimated that businesses of all sizes fell victim to a cyber attack every 11 seconds in 2021, and Verizon's 2021 Data Breach Investigations Report revealed that more than 20% of security incidents involved insiders. Thus, as data moves to new locations, we must thoroughly review our DLP strategies anew.

Since no two data sets are the same, this review must begin with classifying data into varying levels of criticality. Identifying the data allows guards to be placed according to its type, and access to it to be defined. Every organization must define access levels based not on an individual's position in the hierarchy but on their role in the organization. Paired with appropriate authorization mechanisms, this should serve as the frontline guard against accidental deletions, malicious insiders, and human error, as the sketch below illustrates.
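As a minimal illustration, the Python sketch below ties access to a data sensitivity label and a role ceiling rather than to seniority. The labels, roles, and mappings are assumptions made for the example, not a standard scheme.

```python
# Minimal sketch of classification-driven, role-based access checks.
from enum import IntEnum

class Sensitivity(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

# Each role gets a ceiling tied to its function, not to rank in the hierarchy.
ROLE_CEILING = {
    "contractor": Sensitivity.PUBLIC,
    "engineer": Sensitivity.INTERNAL,
    "hr_partner": Sensitivity.CONFIDENTIAL,
    "security_analyst": Sensitivity.RESTRICTED,
}

def can_access(role: str, label: Sensitivity) -> bool:
    """Allow access only if the role's ceiling covers the data's label."""
    return ROLE_CEILING.get(role, Sensitivity.PUBLIC) >= label

assert can_access("hr_partner", Sensitivity.CONFIDENTIAL)
assert not can_access("engineer", Sensitivity.RESTRICTED)
```

Unknown roles default to the lowest ceiling, so a misconfigured account fails closed rather than open.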

An important and easily overlooked part of any DLP strategy is educating and training employees against threats to the organization's data. Employees should not just be in sync with the organization's DLP strategy but be a contributing part of it. They are usually the weakest link in the cybersecurity chain, and their education should be a continuous process.

The rise of AI and ML as new guardians

An important aspect of any DLP strategy is logging and monitoring access to both critical and non-critical data. This is where Artificial Intelligence and Machine Learning can add immense value. There are numerous Security Information and Event Management (SIEM) and Security Orchestration, Automation and Response (SOAR) tools on the market, such as Microsoft Sentinel for the Microsoft public cloud. They accelerate threat detection and help trigger timely actions to mitigate adverse incidents, and they are quickly becoming essential tools in the arsenal against data threats. DLP strategy is not a one-size-fits-all notion, and the ML models must be fine-tuned to an organization's requirements: trained to learn what is normal for an industry, a business, and the way the organization functions. The sketch below illustrates the idea.
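As a hedged sketch of what such tuning might look like, the example below trains an unsupervised outlier detector on synthetic access-log features and flags an off-hours bulk export. The features, values, and contamination rate are illustrative assumptions; production SIEM/SOAR pipelines are far richer.

```python
# Sketch: flag anomalous data-access events with an unsupervised model,
# in the spirit of the ML-assisted monitoring described above.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Toy feature vector per access event: [hour_of_day, megabytes_read, files_touched]
normal = np.column_stack([
    rng.normal(11, 2, 500),   # mostly business hours
    rng.normal(5, 2, 500),    # modest read volumes
    rng.normal(3, 1, 500),    # a handful of files
])
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# A 3 a.m. bulk export should score as an outlier worth investigating.
suspect = np.array([[3, 900, 250]])
print(model.predict(suspect))  # -1 => anomaly, 1 => normal
```

In practice such a detector would feed a SIEM rule that opens an incident or triggers a SOAR playbook, rather than printing a label.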

With the changing technological landscape of how data is stored, processed, and accessed, new threats are emerging, and shorter detection and response times are becoming critical. Tools must be upgraded and made intelligent to keep pace with this shift. DLP strategies are moving targets and must be reviewed regularly to keep them relevant to the shifting paradigms of data threats.
