Friday, April 26, 2024

Interview with Edward M.L. Peters, Ph.D. Panzura’s Chief Innovation Officer – Managing and Securing Cloud Storage

Can you summarize some of the trends you see defining the cloud storage space in 2021?

EMLP: The cost of storage will continue to drop. In the year ahead, we will see more and more that the amount of data (how many terabytes you have) will no longer dictate cost. Pricing will instead revolve around how data is actually used. In this regard, storage costs are beginning to follow the consumption-based pattern of other cloud services, which also raises familiar issues around vendor lock-in and preferential service tiers.

As we make this transition, enterprise companies in particular will need to rethink how they consume the data itself. In fact, this thinking is already underway: data is increasingly processed at the edge, near the physical location of people and devices, as both become ever more geographically dispersed.

The pandemic has taught us that workforce models are changing. For instance, caching in on-prem environments, where the server closest to the user returns cached data rather than fetching it from a distant store, is becoming important once again. However, as cache becomes less of a constraint and more of a consumable resource, IT leaders will need to consider what more they can do in cache, and how their provisioning environments will need to change.

How has the pandemic highlighted the vulnerability of cloud file systems?

EMLP: Collaboration is driving the enterprise to adopt new cloud services, and bad actors naturally are attracted to the opportunities this provides. Data protection, retention, and security are more important than ever, and IT departments are now expected to securely support remote work across any cloud configuration.

Global file systems are a good example. Multi-cloud services platforms need to allow people to collaborate in real time, unencumbered by notions of place or time. The network is now anywhere an employee decides it is, and matters are complicated by the exponential volume of data created by applications. Costly legacy storage models are being transitioned to the cloud to support this new normal.

The weakest security link is the proverbial fabric that transforms cloud storage into a global file system. The fact is, most of these systems were not created for the cloud, and IT departments have to treat them as zero-trust environments. We need look no further than ransomware and other malicious code attacks. Cloud-native file sharing and other hybrid-cloud services have to be engineered to avoid failure in the first place.

Can you describe some of the key differences between cloud-native and legacy file storage solutions in terms of security?

EMLP: Cloud-native file solutions can help prevent external attacks and data loss across the cloud and unmanaged devices. These systems were designed from the outset to provide data protection and high availability with no single point of failure. In terms of encryption, for example, FIPS 140-2 certification and at-rest data protection with AES-256 encryption are often considered table stakes, while TLS 1.2 encryption secures data as it moves within and between clouds.

When data is methodically compressed and then encrypted before it is sent over the wire, it is unintelligible even if intercepted. The most advanced cloud-based file systems use AES-256-CBC to encrypt data stored both at the edge and in the cloud, along with TLS/SSL for data in transit. Cloud-native file systems that encrypt everything stored in the cloud, including directory names, file names, and file data, provide a further layer of data protection.
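
The compress-then-encrypt ordering described above matters: once data is encrypted it looks random and no longer compresses. The sketch below illustrates the pipeline using only the Python standard library; since the stdlib has no AES, a toy SHA-256-based XOR keystream stands in for AES-256-CBC and is not secure — it exists only to show why the ordering affects storage cost.

```python
# Minimal sketch of compress-then-encrypt vs. encrypt-then-compress.
# The keystream cipher is a toy stand-in for AES-256-CBC; NOT secure.
import hashlib
import os
import zlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudorandom bytes by iterating SHA-256 over the key.
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

def protect(plaintext: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(plaintext)              # compress first...
    return xor(compressed, keystream(key, len(compressed)))  # ...then encrypt

def recover(blob: bytes, key: bytes) -> bytes:
    return zlib.decompress(xor(blob, keystream(key, len(blob))))

key = os.urandom(32)                 # 256-bit key, illustrative only
data = b"file data " * 1000
blob = protect(data, key)
assert recover(blob, key) == data    # round-trips intact
assert len(blob) < len(data)         # compression before encryption pays off

# Encrypting first destroys redundancy, so compressing afterward gains nothing:
enc_first = zlib.compress(xor(data, keystream(key, len(data))))
assert len(enc_first) > len(blob)
```

A production system would use a vetted AES implementation and a key management service; the ordering lesson, however, carries over unchanged.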

However, consider that a poorly sequenced pipeline, one that encrypts before compressing and deduplicating, results in more stored data and higher cloud storage costs. Redundant backup and DR processes add further overhead, which can be eliminated using approaches such as dedicated HA options, immediate data consistency, snapshot-based restoration, and a near-zero RPO.

How has the recent global regulation around data governance changed things?

EMLP: The security features of cloud file systems need to help companies meet their compliance obligations. For instance, the GDPR provides data subjects with the right to be forgotten but deleting data in the cloud can be challenging without features for wiping it from the edge all the way to cloud stores.

Encryption using customer-supplied crypto keys, which are never stored in the cloud object store, is also a crucial layer of governance and protection. Full encryption of communications over TLS 1.2 sessions, both for managing a filer and for any transport to and from the cloud, can ensure that the contracted cloud storage provider has no access to any data.

Integration with directory servers such as Microsoft Active Directory, to authenticate and authorize data access for connected SMB users, along with NFS exports restricted by host name, IP address, network, or netgroup, and Kerberos authentication, also supports governance requirements. When regulators come knocking, audit logging that tracks who created, accessed, modified, or deleted what data, and when, is a key component of compliance as well.
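
The audit trail described above can be pictured as an append-only log of who did what to which file, and when. The following is a minimal sketch; the class and field names are illustrative, not any particular product's schema.

```python
# Minimal append-only audit log: entries are only ever appended, never
# updated or removed, so the history an auditor sees is tamper-evident.
import time

class AuditLog:
    ACTIONS = {"create", "access", "modify", "delete"}

    def __init__(self):
        self._entries = []  # append-only

    def record(self, user: str, action: str, path: str) -> None:
        assert action in self.ACTIONS, f"unknown action: {action}"
        self._entries.append({
            "ts": time.time(),   # when
            "user": user,        # who
            "action": action,    # did what
            "path": path,        # to which data
        })

    def history(self, path: str) -> list:
        """Everything that ever happened to one file, for an auditor."""
        return [e for e in self._entries if e["path"] == path]

log = AuditLog()
log.record("alice", "create", "/projects/plan.docx")
log.record("bob", "modify", "/projects/plan.docx")
log.record("bob", "delete", "/projects/old.docx")
assert [e["user"] for e in log.history("/projects/plan.docx")] == ["alice", "bob"]
```

In practice such records would be shipped to durable, write-once storage so the log itself cannot be altered by an attacker.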

What are the key cyber threats you see dominating the headlines this year?

EMLP: Ransomware and similar attacks on data and infrastructure are far from the only concern now that entire workforces sit outside corporate network boundaries, but they are even more menacing when you consider that each time attackers are paid off, the victims are forcibly funding the development of newer and more sophisticated exploits.

This is even worse because, whatever the outcome, there is never a guarantee that the data will be decrypted and unlocked. In reality, the code that liberates data from ransom could actually be a sleeper cell infecting the network and facilitating future assaults on computing resources. All too often, it just doesn’t work.

Ransomware is a particular concern when it comes to cloud-based file systems. These platforms, which are often designed to combine the benefits of cloud and on-premises services at scale, represent a lucrative attack surface. Businesses are faced with the impossible choice of relenting to cybercriminals or dealing with the costly disruption of business and data access.

How can immutable architectures address that threat?

EMLP: Immutable data architecture means that data, once written, cannot be changed. It naturally follows that data that cannot be changed also cannot be encrypted by ransomware. Global file systems, however, are based on the premise that files change frequently. While the most common approach is to ensure that the master copy cannot be altered once written, this still assumes that unauthorized changes will never succeed.

Traditional NAS-based file systems allow changes to the file itself. A more effective way is to write new or changed blocks as additional objects in the cloud. That means frequent data synchronization events can occur both to the cloud and to every local location in a network, creating a persistent, object-based data recovery point.

If ransomware encrypts data, local filers can write the resulting encrypted files to a cloud object store as new data. Pre-existing data is unaffected and preserved as original objects in the object store. When coupled with read-only snapshots capturing a point-in-time state of the file system, this means that all files encrypted by the ransomware code can be restored to their previous state. Ideally, this can be done for a single file, individual directories, or an entire global file system. Essentially, this removes the economic incentive of ransom-based schemes.
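
The recovery mechanics described above can be sketched in a few lines: if objects are write-once and a snapshot is just a frozen copy of the file-to-object mapping, then reverting after an attack is a pointer swap, not a data copy. This toy model is an illustration of the general idea, not any vendor's implementation.

```python
# Toy write-once object store with snapshot-based restoration.
class FileSystem:
    def __init__(self):
        self.objects = {}   # object_id -> bytes; write-once, never mutated
        self.files = {}     # path -> object_id; the only mutable state
        self._next = 0

    def write(self, path: str, data: bytes) -> None:
        oid = self._next
        self._next += 1
        self.objects[oid] = data   # every write lands as a NEW object
        self.files[path] = oid     # only the pointer changes

    def snapshot(self) -> dict:
        return dict(self.files)    # read-only, point-in-time state

    def restore(self, snap: dict) -> None:
        self.files = dict(snap)    # old objects are still in the store

fs = FileSystem()
fs.write("report.txt", b"quarterly numbers")
snap = fs.snapshot()
fs.write("report.txt", b"ENCRYPTED-BY-RANSOMWARE")  # arrives as new data
fs.restore(snap)                                     # revert to the clean state
assert fs.objects[fs.files["report.txt"]] == b"quarterly numbers"
```

Because the ransomware's output never overwrote anything, restoration is instantaneous regardless of how much data was "encrypted", which is what undercuts the attacker's leverage.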

What are some of the emerging technologies and best practices for addressing these problems?

EMLP: Immutable data architectures continue to hold sway. From a technical standpoint, the client side of a filer can be served by either the SMB or NFS protocols. With this model, files can be created, modified, or deleted as needed, provided users have the appropriate file permissions to do so.

The data architecture is immutable in that a filer will not alter data in an object store, but it still allows changes to be made to the resulting files. A good example is saving a file: a filer can split that data into multiple objects and cache those blocks for fast retrieval. As changes are made, the filer writes new blocks without any modification to existing objects, and updates file pointers to reflect which objects comprise the file in its new state. When a file is reopened, those pointers are simply dereferenced to assemble the associated data blocks.
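
The block-pointer mechanics described above can be sketched as follows: a file is a list of pointers into a write-once store, and saving a change means appending new blocks and repointing, never touching existing objects. Block size and function names are illustrative.

```python
# Copy-on-write block pointers over an append-only object store.
BLOCK = 4        # tiny block size, purely for illustration

store = []       # write-once object store; objects are only ever appended

def put(block: bytes) -> int:
    store.append(block)          # existing objects are never modified
    return len(store) - 1        # pointer to the new object

def write_file(data: bytes) -> list:
    # Split the file into blocks and return its list of pointers.
    return [put(data[i:i + BLOCK]) for i in range(0, len(data), BLOCK)]

def read_file(pointers: list) -> bytes:
    # Dereference the pointers to assemble the file's current state.
    return b"".join(store[p] for p in pointers)

def update_block(pointers: list, index: int, block: bytes) -> list:
    # Saving a change: write a NEW block and update only the pointer.
    new = list(pointers)
    new[index] = put(block)
    return new

v1 = write_file(b"AAAABBBBCCCC")
v2 = update_block(v1, 1, b"XXXX")        # change the middle block
assert read_file(v2) == b"AAAAXXXXCCCC"  # new state via updated pointers
assert read_file(v1) == b"AAAABBBBCCCC"  # prior version still fully intact
```

Keeping every historical pointer list around is exactly what makes the snapshot-and-revert behavior from the previous answer cheap.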

In the case of a ransomware attack, the resulting encrypted files are written as new data. Since existing data is preserved as original objects in the object store, any file altered or encrypted by ransomware can simply be reverted to its last 'clean' state. This can be applied to a single file or directory, or to the entire global file system.

Edward M.L. Peters, Ph.D
Edward M.L. Peters, Ph.D. is Panzura’s Chief Innovation Officer. Panzura is the fabric that transforms cloud storage into a global file system, allowing enterprises to use the cloud as a high performance, globally available data center. Companies all around the world in the sports, healthcare, financial services, media and entertainment, gaming, and architectural, engineering and construction industries, as well as government agencies use Panzura’s fabric to manage hundreds of petabytes of data in the cloud.
