According to the Ponemon Institute's "Cost of a Data Breach" report, the average total cost of a data breach was $3.86 million globally in 2020. However, a breach's implications go far beyond direct financial losses: it can severely hinder an organization's operational capacity and compliance posture. According to the report, lost business tops the list at an average of US $1.52 million, driven by higher customer turnover and increased customer acquisition costs, all stemming from reputational damage.
The report also reflects the global shift to remote work in early 2020: breaches at organizations with remote workforces cost an average of $137,000 more than the worldwide average of $3.86 million. It is now imperative that organizations streamline and automate data security controls and protocols, and regularly stress test the perimeters and broad attack surfaces inherent in modern IT systems.
The following article explains the core reasons why you should implement data security controls, outlines several ways to begin that implementation, and describes the benefits you will gain from fortifying your data security protocols.
Table of Contents
- Support Data Governance
- Safeguard Against Non-Compliance
- Types of Data Security Control
- Cloud Security Controls
- Data Security Strategies
- Xplenty: Elite Security and Compliance for Data Integration
Support Data Governance
Data is now the lifeblood of many organizations, but working with and holding this information does not come without immense responsibility. The term data governance peppers all conversations relating to anything data-driven; it refers to the overall management of data availability, relevancy, usability, integrity, and security in an enterprise.
The term rose to prominence after several high-profile breaches in recent years, coupled with growing public awareness of data privacy and the introduction of laws such as GDPR and CCPA.
Robust data security controls go hand in hand with a clear data governance framework as follows:
- Transparency: how will the organization control its data?
- Roles and expectations: enforcement and rigor for those enacting and maintaining the data governance framework
- Granularity: data management processes must be examined and documented in detail
Safeguard Against Non-Compliance
The GDPR (General Data Protection Regulation) specifies two tiers of administrative fines that are imposable as penalties for breaching compliance:
- Up to €10 million, or 2% annual global turnover – whichever is higher.
- Up to €20 million, or 4% annual global turnover – whichever is higher.
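The "whichever is higher" rule in both tiers reduces to a simple maximum; a minimal sketch of that calculation (function name and figures here are illustrative, based only on the tiers above):

```python
def gdpr_fine_cap(annual_global_turnover_eur: float, tier: int) -> float:
    """Upper bound of a GDPR administrative fine for a given tier.

    Tier 1: up to EUR 10M or 2% of annual global turnover, whichever is higher.
    Tier 2: up to EUR 20M or 4% of annual global turnover, whichever is higher.
    """
    if tier == 1:
        return max(10_000_000, 0.02 * annual_global_turnover_eur)
    if tier == 2:
        return max(20_000_000, 0.04 * annual_global_turnover_eur)
    raise ValueError("tier must be 1 or 2")

# A company with EUR 1 billion turnover: the tier-2 cap is 4%, i.e. EUR 40M,
# because that exceeds the EUR 20M floor.
print(gdpr_fine_cap(1_000_000_000, tier=2))  # 40000000.0
```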
Not all GDPR infringements result in data protection fines. Supervisory authorities such as the UK's ICO (Information Commissioner's Office) and Ireland's Data Protection Commission (DPC) have a range of corrective powers and sanctions to enforce GDPR, including:
- Issuing warnings and reprimands;
- Imposing a temporary or permanent ban on data processing;
- Ordering the rectification, restriction, or erasure of data; and
- Suspending data transfers to third countries.
In addition, data subjects have the right to bring legal proceedings against a controller or a processor if they believe their rights under GDPR have been infringed. Compliance breaches have consequences. Just take a look at these GDPR rulings.
The core legal framework of the CCPA differs significantly from GDPR. For example, a fundamental principle of the GDPR is the requirement to have a “legal basis” for personal data processing; this does not hold for CCPA.
The California State Attorney General enforces the CCPA. Organizations found to violate CCPA compliance are subject to a civil penalty of up to $2,500 per violation and up to $7,500 per willful violation. Settlement money goes into a new "Consumer Privacy Fund," which offsets future enforcement costs incurred by the courts and the state attorney general.
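The per-violation structure means exposure scales with the number of affected records; a minimal sketch of the penalty ceiling (the function is illustrative, using only the figures above):

```python
def ccpa_max_penalty(unintentional: int, willful: int) -> int:
    """CCPA civil penalty ceiling: $2,500 per violation, $7,500 per willful violation."""
    return 2_500 * unintentional + 7_500 * willful

# 100 unintentional violations plus 10 willful ones
print(ccpa_max_penalty(unintentional=100, willful=10))  # 325000
```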
Types of Data Security Control
Data security controls encompass protecting data from unauthorized access, use, change, disclosure, and destruction. They are multifaceted, ranging from the physical security of hardware and storage devices to administrative and access controls, including organizational policies and procedures.
The multidimensional data security model includes:
- Physical: Devices must be physically inaccessible to unauthorized users, including such things as data center perimeter fencing, locks, guards, access control cards, biometric access control systems (fingerprint, voice, face, iris, handwriting, and other automated methods), surveillance cameras, and intrusion detection sensors.
- Personnel: System admins, DBAs, and security team members must be reliable and should undergo background checks before hiring.
- Procedures: Protocols used in the system's operation must be robust. Those responsible for each task must be accountable and competent. Mitigations against human factors include personnel recruitment and separation strategies, training and awareness, disaster preparedness, and recovery plans.
- Technical: Access, storage, manipulation, and transmission of data must be protected by technology that enforces the chosen control policies, e.g., encryption at rest and in flight, access control lists (ACLs), smart cards, network authentication, and file integrity auditing software.
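The file integrity auditing mentioned in the Technical bullet boils down to comparing current file hashes against a trusted baseline. A minimal sketch (file names and contents here are illustrative):

```python
import hashlib

def file_digest(path: str) -> str:
    """SHA-256 digest of a file, read in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_baseline(baseline: dict) -> list:
    """Return the files whose current digest no longer matches the baseline."""
    return [p for p, d in baseline.items() if file_digest(p) != d]

# Build a baseline, then re-check it later to detect tampering
with open("config.txt", "w") as f:
    f.write("max_connections = 100\n")
baseline = {"config.txt": file_digest("config.txt")}
print(verify_baseline(baseline))  # [] -- nothing has changed yet
```

Production tools (e.g., AIDE or Tripwire) add scheduled scans, alerting, and tamper-resistant baseline storage on top of this same idea.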
Cloud Security Controls
Data protection in the cloud usually encompasses authentication and identity, access control, encryption, and secure deletion, to name a few.
Authentication and Identity: entities such as a user, administrator, or guest require an identity; the process of verifying that identity is called authentication. Authentication may take several forms, such as a password, a security token, or a physical characteristic like a biometric fingerprint.
Access control (such as IAM) ensures an authenticated entity (one that has signed in) is authorized and has permission to use resources. Authentication (authn) deals with user identity: who is this person? Authorization (authz) determines what this user or system is allowed to access. There are different types of access control, depending on the sensitivity of the information involved. Discretionary access control is the least restrictive: resource owners grant access at their own discretion, based on user identities or group membership. Role-based access control assigns access based on organizational role, so users can reach only the parts of the system their role requires. Mandatory access control is the most restrictive: a central authority assigns security labels to resources and clearance levels to users, and individual users cannot change those permissions.
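Role-based access control can be sketched as two lookups: permissions attach to roles, and users hold roles. The role and user names below are hypothetical:

```python
# Permissions attach to roles, never directly to users.
ROLE_PERMISSIONS = {
    "viewer": {"read"},
    "editor": {"read", "write"},
    "admin":  {"read", "write", "delete", "manage_users"},
}

# Users are granted one or more roles.
USER_ROLES = {
    "alice": {"admin"},
    "bob":   {"viewer", "editor"},
}

def is_authorized(user: str, permission: str) -> bool:
    """Authorization check: does any of the user's roles grant the permission?"""
    roles = USER_ROLES.get(user, set())
    return any(permission in ROLE_PERMISSIONS.get(r, set()) for r in roles)

print(is_authorized("bob", "write"))         # True
print(is_authorized("bob", "manage_users"))  # False
```

Changing what an "editor" may do then requires editing one role definition rather than touching every user.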
Secure Deletion and data sanitization within the cloud is a grey area, and responsibility remains with the customer. Amazon gives customers choices such as DoD 5220.22-M ("National Industrial Security Program Operating Manual") and NIST 800-88 ("Guidelines for Media Sanitization") but does not contractually agree to fulfill this. Microsoft has a similar stance and states that only Azure physical platform disks are disposed of according to the NIST 800-88 Guidelines for Media Sanitization. One could use data masking as a partial mitigation, but the best option is to use robust encryption techniques.
Encryption in transit protects data against compromised communications or interception as it moves between your site and the cloud provider, or between two services, using encrypted connections (HTTPS, SSL/TLS, FTPS, etc.). Encryption at rest within the cloud environment also enables effective data sanitization once information has left the service: destroy the key, and the data is unreadable to any other party.
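For encryption in transit, Python's standard `ssl` module illustrates the secure defaults a client should insist on: certificate verification, hostname checking, and a modern minimum protocol version. A minimal sketch:

```python
import ssl

# A default client context enforces certificate verification and hostname
# checking out of the box; we additionally reject legacy TLS 1.0/1.1.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True
```

Wrapping a socket with this context (via `context.wrap_socket(sock, server_hostname=...)`) then gives an encrypted, authenticated channel to the remote service.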
Data Security Strategies
The following is a list of strategies you can implement immediately to mitigate attacks.
- Inventory: Actively manage all hardware devices that are live on the network; only authorized devices should have access. Unauthorized and unmanaged devices should be immediately removed from the network and blocklisted. A concrete first step is deploying an automated asset inventory discovery tool that builds an inventory of the devices connected to your organization's public and private networks.
- Exfiltration: Rogue actors who already have access to a corporate network are extremely dangerous; boundary defenses are rendered useless in these cases. To mitigate, deploy an automated tool on network perimeters that monitors for unauthorized transfers of sensitive data, freezing such transfers and alerting the security team.
- Logs: Collect, manage, aggregate, and analyze audit logs of events that could help detect, understand, or recover from an attack. Google's postmortem philosophy is hugely relevant in this case: "The primary goals of writing a postmortem are to ensure that the incident is documented, that all contributing root cause(s) are well understood, and, especially, that effective preventive actions are put in place to reduce the likelihood and impact of recurrence."
- Reduce Attack Surface: Associate active ports, services, and protocols with the hardware assets in the inventory, ensuring that all network ports, protocols, and services listening on a system are cross-referenced and validated with the business; if a port is open, it should be for a good reason. Firewalls should sit in front of any critical service to verify and validate the server's traffic while blocking and logging unauthorized traffic.
- Incident Response: There should be a playbook defining procedures and individuals' roles in the mitigation effort. External contact information for Law Enforcement, relevant government departments, vendors, and Information Sharing and Analysis Center partners should be at hand. According to IBM Security's 2020 Cyber Resilient Organization Report - companies with incident response teams that extensively test and drill their incident response plans spend an average of $1.2 million less on data breaches than those without clear and transparent objectives.
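Several of the strategies above reduce to comparing what is observed against what is authorized, plus scanning outbound data for sensitive-looking records. A minimal sketch, with hypothetical device addresses, port allowlist, and detection pattern:

```python
import re

# Inventory: the managed, authorized device list (hypothetical MAC addresses).
AUTHORIZED_DEVICES = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"}

# Attack surface: ports with a documented business justification on this host.
DOCUMENTED_PORTS = {22: "ssh", 443: "https"}

# Exfiltration: an illustrative pattern for sensitive-looking records (US SSN).
SSN_LIKE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def rogue_devices(observed: set) -> set:
    """Devices seen on the network that are absent from the managed inventory."""
    return observed - AUTHORIZED_DEVICES

def unexpected_ports(listening: set) -> set:
    """Listening ports with no documented business reason."""
    return listening - set(DOCUMENTED_PORTS)

def flags_exfiltration(payload: str) -> bool:
    """Does an outbound payload contain sensitive-looking data?"""
    return SSN_LIKE.search(payload) is not None

print(rogue_devices({"aa:bb:cc:00:00:01", "de:ad:be:ef:00:99"}))  # {'de:ad:be:ef:00:99'}
print(unexpected_ports({22, 443, 8080}))                          # {8080}
print(flags_exfiltration("name=alice ssn=123-45-6789"))           # True
```

Real deployments replace these toy sets with live discovery scans and data-loss-prevention rule sets, but the underlying checks are the same.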
Xplenty: Elite Security and Compliance for Data Integration
At Xplenty, data security and compliance are two of the most essential elements of our automated ETL service. We have incorporated advanced data security and encryption technology into our platform, including:
- Physical infrastructure hosted by accredited Amazon Web Service (AWS) technology
- Advanced preparations to meet the European General Data Protection Regulation (GDPR) standards
- SSL/TLS encryption on all our websites and microservices (encryption in transit)
- Field Level Encryption
- Encryption of sensitive data anytime it's at rest in the Xplenty platform using industry-standard encryption.
- Constant verification of our security certificates and encryption algorithms
- Firewalls that restrict access to systems from external networks and between systems internally
If you'd like to know more about our data security standards, schedule a demo with the Xplenty team now.