April 2, 2021
The cloud offers enormous flexibility and scale, but it’s easy to worry about what’s not in your control. New data-centric security technologies have the potential to ‘tilt the scales’ back in your favor.
We often hear that security in the cloud follows a shared responsibility model. Simply stated, the cloud provider accepts some of the responsibility to keep your environment safe, but not all. The lines can be hazy, but usually the boundaries are identifiable.
Public cloud providers like Amazon Web Services, Microsoft Azure, and Google Cloud Platform offer enormous upside, which is why so many companies have embraced cloud or hybrid-cloud environments. Not every security professional is quite so eager to make the jump, however, especially when protecting highly sensitive data is at stake. Some believe that migrating away from on-premises means giving up control over the security of your systems, and that it can complicate compliance with regulations and rulings such as GDPR, Schrems II, HIPAA, and PCI DSS. The knowledge that cloud security is a shared responsibility only adds to the uncertainty.
However, if you know where the lines are drawn and implement the right technologies and practices, this shared model can actually be a positive attribute, allowing you greater ownership and control in keeping sensitive data safe.
The policies can vary based on the cloud provider and type of service, such as PaaS vs. IaaS, but there are some common boundaries. The cloud provider is responsible for physical security, meaning the physical hosts, networks, and datacenters; the customer is responsible for securing application elements under their own control such as data, endpoints, accounts, and access management. AWS summarizes this by saying that they take “responsibility for security of the cloud, while the customer is responsible for security in the cloud.” If you want to research each policy, links to their official pages are provided below.
The bottom line is that protecting data is the responsibility of the owner–the entity that procured the data–and if a cloud data breach occurs, it's up to the owner to manage the fallout. And unfortunately, cloud breaches occur often–in a study last year from Ermetic, nearly 80% of companies had experienced a cloud data breach in the previous 18 months, and nearly half of them (43%) reported more than 10 cloud breaches. The responsibility to protect your data in the cloud is a weighty one.
Regulators have also made it clear that securing personal data is the owner's responsibility. Regulations such as GDPR and CPRA, and rulings such as Schrems II, have added duties for businesses that store data in the cloud. As an example, the EU recommends supplemental security measures for European personal data stored in clouds hosted by US companies, and violations can come with steep fines. To learn more about the supplemental measures and how to comply with them, here's a helpful whitepaper. The net takeaway is that experiencing a cloud data breach from inadequate protection may add insult to injury–regulatory fines, on top of the many other consequences such as lost data, reputational damage, and downtime.
A safe data harbor is an environment created using new, data-centric security technologies that offer cloud scalability while enabling precision control. In essence, a data harbor is a virtual environment built using cloud or hybrid infrastructure, in which data is automatically fragmented and scattered across multiple storage locations. The original data is removed from the source location, where it may otherwise be vulnerable to attack. As a result, the data ceases to exist in a usable state in any jurisdiction (something called data jurisdiction-independence), both protecting it from unauthorized access and simplifying global compliance. With a data harbor, only the rightful owner of the data can retrieve fragments and reassemble their data, not a third party. That includes the cloud provider, even if they’re under subpoena.
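The fragment-and-scatter idea described above can be illustrated with a minimal sketch. This is not any vendor's actual implementation–real data harbors add encryption, redundancy, and key management–and the function names, file paths, and byte-striping scheme here are illustrative assumptions:

```python
# Illustrative sketch only: stripe a file's bytes across several storage
# "locations" (here, local directories standing in for cloud stores),
# so no single location holds usable data, then reassemble on demand.
import os

def scatter(data: bytes, locations: list[str], name: str) -> None:
    """Write byte-stripe i of `data` to locations[i]."""
    n = len(locations)
    stripes = [data[i::n] for i in range(n)]  # interleaved byte stripes
    for loc, stripe in zip(locations, stripes):
        os.makedirs(loc, exist_ok=True)
        with open(os.path.join(loc, name), "wb") as f:
            f.write(stripe)

def reassemble(locations: list[str], name: str) -> bytes:
    """Interleave the stripes back into the original byte sequence."""
    n = len(locations)
    stripes = []
    for loc in locations:
        with open(os.path.join(loc, name), "rb") as f:
            stripes.append(f.read())
    out = bytearray(sum(len(s) for s in stripes))
    for i, stripe in enumerate(stripes):
        out[i::n] = stripe  # slot stripe i back into every n-th position
    return bytes(out)
```

Only a party that knows all the locations and the striping scheme can reconstruct the data; any single storage provider sees only an unusable fragment.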
Something amazing happens when you use a data harbor–the cloud itself becomes a basic utility, a vast pool of infrastructure you can use for your own purposes. You don't have to over-rely on native security offerings, or worry about cloud data breaches that happen outside of your control. Instead, you simply use the provider's infrastructure as a powerful medium to enable a secure environment.
Using a data harbor on top of public cloud infrastructure offers numerous advantages, such as the ability to protect against data breaches and ransomware, regardless of whether the attack was perpetrated on a local machine or on one of the public cloud storage locations. A data harbor can automatically self-heal from a cyber attack, offering near-instantaneous recovery (learn how resilience works in a data harbor here). A data harbor can ensure your data is always available, even if a cloud provider experiences downtime. It also ensures that the cloud provider itself can't run analytics on your data, were it so inclined. To learn more about how a data harbor works and how to configure one, click here.
With data breaches, ransomware, and exfiltration attacks dominating headlines, cybersecurity has reached a pivotal point. The cloud offers astonishing scale and productivity for businesses, but control has become a much more powerful currency. Productivity can be undone in a flash after a cyber attack, but greater control may have prevented the attack in the first place. A safe data harbor allows you to create an environment bigger than any single cloud, with no single point of failure. More importantly, there’s no better way to take back control of your data–deciding who can see what, and when.
Vice President of Marketing