Whether you’re looking for cost savings or higher drive capacity, cloud storage has solutions for any enterprise. Even with its advantages, cloud storage requires a different approach than configuring internal network storage. Misconfigurations could leave your organization’s data open to attackers, and disorganized management could lead to unnecessary costs. Following these best practices helps avoid most of the pitfalls an organization can encounter when integrating cloud storage into applications, infrastructure, and failover strategies.
Before you create strategies around storage, the first step is to determine the use case specific to your organization. Every organization has its own goals for storage, but common reasons for implementing cloud storage include storing application data, maintaining off-site backups and failover resources, and archiving infrequently accessed files.
Once your use case is determined, you can evaluate strategies for your cloud storage configuration. Many of these strategies and best practices revolve around cybersecurity and configuration, but others cover the way you should manage your cloud storage and organize files. Not every strategy is necessary, but the following best practices will help administrators get started provisioning, configuring, and managing cloud storage.
If your goal is to store application data, it might be worth investing in at least two providers. Most cloud providers offer extensive uptime guarantees, but relying on a single cloud provider leaves the organization open to a single point of failure. In 2017, human error caused an outage in Amazon Web Services (AWS) storage in the US-EAST-1 region. It’s rare for AWS to fail, but it is a possibility. If your application relies on only one provider, a failure could mean downtime for the application until the cloud provider recovers.
A second provider can also be configured as a failover resource: should the primary cloud provider fail, the secondary takes over. For instance, Microsoft Azure or Google Cloud Platform (GCP) could serve as failover for AWS. This adds considerable cost to the enterprise, but it could also save thousands of dollars in losses from cloud provider downtime.
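As a rough sketch of what that could look like, the Python snippet below writes each object to a primary AWS S3 bucket and mirrors it to a secondary GCP bucket, then falls back to the secondary on reads if the primary is unavailable. The bucket names are hypothetical, and the snippet assumes the boto3 and google-cloud-storage libraries are installed with credentials configured for both providers.

import boto3
from google.cloud import storage

# Hypothetical bucket names; replace with your own.
PRIMARY_BUCKET = "yourdomain-app-data"        # AWS S3 (primary)
SECONDARY_BUCKET = "yourdomain-app-failover"  # GCP Cloud Storage (secondary)

def replicate_object(key: str, data: bytes) -> None:
    """Write application data to the primary provider, then mirror it
    to the secondary provider so a failover copy always exists."""
    # Primary write: AWS S3
    s3 = boto3.client("s3")
    s3.put_object(Bucket=PRIMARY_BUCKET, Key=key, Body=data)

    # Mirror write: GCP Cloud Storage
    gcs = storage.Client()
    gcs.bucket(SECONDARY_BUCKET).blob(key).upload_from_string(data)

def read_with_failover(key: str) -> bytes:
    """Read from the primary provider; fall back to the secondary
    if the primary is unavailable."""
    try:
        s3 = boto3.client("s3")
        return s3.get_object(Bucket=PRIMARY_BUCKET, Key=key)["Body"].read()
    except Exception:
        gcs = storage.Client()
        return gcs.bucket(SECONDARY_BUCKET).blob(key).download_as_bytes()

In production, a managed replication service or a storage abstraction layer would likely handle this, but the pattern is the same: every write lands in two providers.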
Several compliance regulations impose cybersecurity standards on the way organizations manage customer data. Any personally identifiable information (PII) must be stored in encrypted form and monitored and audited for unauthorized access. The European Union’s General Data Protection Regulation (GDPR) requires that businesses allow customers to request deletion of their data, and the Payment Card Industry Data Security Standard (PCI-DSS) governs merchant accounts and financial transactions. Review any regulatory standards that could result in fines for poor cloud storage management.
When choosing a cloud provider, check that it is Service Organization Controls (SOC) 3 compliant. SOC 3 cloud providers must publish public reports on the way their security and infrastructure are managed. The provider’s data centers should also be Tier 3, which guarantees 99.982% uptime, or no more than about 1.6 hours of downtime per year.
Even large, well-known organizations have made the mistake of leaving cloud storage publicly accessible, leading to major data breaches. You don’t need to be a hacker to find open AWS buckets; online scanning tools let anyone locate publicly accessible buckets and the data inside them. Ensure that your cloud storage isn’t open to the public, but keep in mind that blocking public access isn’t the only access control policy you need.
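As a starting point, AWS exposes Block Public Access settings that shut off public ACLs and policies at the bucket level. Here’s a minimal boto3 sketch with a hypothetical bucket name; GCP and Azure offer equivalent controls.

import boto3

s3 = boto3.client("s3")

# Turn on all four Block Public Access settings for the bucket
# ("yourdomain-app-data" is a hypothetical name).
s3.put_public_access_block(
    Bucket="yourdomain-app-data",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject new public ACLs
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject public bucket policies
        "RestrictPublicBuckets": True,  # restrict access to account principals
    },
)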
Folders and files stored in the cloud should have the same strict access controls as your internal data. Cloud providers offer account access and management tools, many of which integrate with internal services such as Active Directory. Assign permissions based on the principle of least privilege, which says users should have access only to the files necessary to perform their job functions. This standard reduces the chance of privilege escalation and stops attackers from freely traversing the network on a high-privilege account.
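To illustrate least privilege in practice, the sketch below attaches a bucket policy that grants a single IAM role read-only access to one prefix and nothing else. All names (account ID, role, bucket, prefix) are hypothetical placeholders.

import json
import boto3

# Hypothetical: an IAM role for the reporting team that only needs
# read access to the reports/ prefix of the data bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "ReportsReadOnly",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::123456789012:role/reporting-team"},
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::yourdomain-app-data/reports/*",
    }],
}

boto3.client("s3").put_bucket_policy(
    Bucket="yourdomain-app-data",
    Policy=json.dumps(policy),
)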
Whether it’s to meet your own standards or compliance requirements, always encrypt stored sensitive information and PII. Weak cryptographic algorithms and libraries leave data open to dictionary and brute-force attacks, so choosing the right algorithm is just as important as encrypting in the first place.
Encryption adds some performance overhead, so take performance into consideration. The Advanced Encryption Standard (AES) with 128-bit keys is a cryptographically secure symmetric algorithm often used for data storage. AES with 256-bit keys offers a higher level of protection at the cost of some performance. For password storage and other one-way hashing, the Secure Hash Algorithm 3 (SHA-3) standard is available.
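As a brief illustration, the following Python sketch uses the cryptography library’s AES-GCM implementation for symmetric encryption and hashlib’s SHA-3 support for salted one-way hashing. It’s a minimal example of the algorithms discussed, not a complete key-management scheme.

import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# --- AES-128 in GCM mode for encrypting data before it is stored ---
key = AESGCM.generate_key(bit_length=128)  # use bit_length=256 for AES-256
aesgcm = AESGCM(key)

nonce = os.urandom(12)                     # standard GCM nonce size
plaintext = b"customer PII goes here"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Store nonce + ciphertext together; both are needed to decrypt.
recovered = aesgcm.decrypt(nonce, ciphertext, None)
assert recovered == plaintext

# --- SHA-3 for one-way hashing, e.g. stored password digests ---
# A random per-user salt blunts precomputed dictionary attacks; a
# dedicated slow password-hashing KDF adds further protection.
salt = os.urandom(16)
password = b"correct horse battery staple"
digest = hashlib.sha3_256(salt + password).hexdigest()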
Organizing folders and files helps administrators determine whether they should be backed up, whether they contain sensitive information, and whether they can be archived. Archived data is moved out of its original storage location and retained separately so that it can still be reviewed should the organization need it for an audit in the future. Archives can be compressed when stored, so archiving unused data is a useful source of cost savings.
A clear folder structure is also helpful when assigning access controls across large folder trees. Organized folders make every aspect of storage management easier for administrators, so a policy on the way folders should be set up will improve cost savings, backup strategies, and archive management.
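A consistent folder structure also lets archiving be automated with prefix-based lifecycle rules. The boto3 sketch below, with hypothetical bucket and prefix names, transitions everything under archive/ to a cheaper archival storage class and expires it years later. Note that this call replaces the bucket’s entire lifecycle configuration.

import boto3

# Hypothetical: anything under archive/ moves to the cheaper Glacier
# storage class after 90 days and is deleted after roughly 7 years.
boto3.client("s3").put_bucket_lifecycle_configuration(
    Bucket="yourdomain-app-data",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-old-data",
            "Status": "Enabled",
            "Filter": {"Prefix": "archive/"},
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 2555},  # ~7 years
        }],
    },
)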
Retention policies are common for backups, but cloud providers also offer them as a safeguard against accidental deletion. Instead of permanently deleting data immediately, a retention policy holds it for retrieval and recovery for a set amount of time before it is permanently removed. This saves administrators the time of recovering data from backup files.
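One way to approximate this on AWS, sketched below with hypothetical names, is to enable versioning so deletions and overwrites keep noncurrent versions, then expire those versions after a recovery window. As above, the lifecycle call replaces any existing lifecycle configuration on the bucket.

import boto3

s3 = boto3.client("s3")

# Keep old versions when objects are overwritten or deleted...
s3.put_bucket_versioning(
    Bucket="yourdomain-app-data",
    VersioningConfiguration={"Status": "Enabled"},
)

# ...and retain those noncurrent versions for 30 days before they
# are permanently removed, giving users a recovery window.
s3.put_bucket_lifecycle_configuration(
    Bucket="yourdomain-app-data",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "soft-delete-retention",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to all objects
            "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
        }],
    },
)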
Cross-Origin Resource Sharing (CORS) is a browser security standard that controls which origins may read a resource. If your application reads data from cloud storage, you must allow access to it using the Access-Control-Allow-Origin header. Some developers set the Access-Control-Allow-Origin header value to an asterisk (‘*’), which tells the cloud storage bucket to allow any origin to read from it. This permissive misconfiguration leaves bucket data open to any attacker-controlled site.
For example, it’s not uncommon for developers to use the XMLHttpRequest object to retrieve external data in JavaScript. For certain requests, the browser first sends a preflight request to determine whether the application has permission. If the requesting origin is included in the Access-Control-Allow-Origin header, the request continues; otherwise, the browser’s CORS restrictions reject it.
For example, suppose an application on yourdomain.com makes a request to an AWS bucket. The bucket should be configured so that only yourdomain.com applications can retrieve data; AWS, GCP, and Azure all make these controls available to developers. The following Access-Control-Allow-Origin header allows your application and disallows any others:
Access-Control-Allow-Origin: https://yourdomain.com
Should an attacker send users a phishing message and attempt a Cross-Site Request Forgery (CSRF) attack, the attacker’s site would be unable to read data from the bucket thanks to the above header configuration.
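On S3, the equivalent bucket-level configuration can be applied with boto3, as in the sketch below (bucket name hypothetical):

import boto3

# Allow only https://yourdomain.com to read from the bucket; any
# other origin fails the browser's CORS check.
boto3.client("s3").put_bucket_cors(
    Bucket="yourdomain-app-data",
    CORSConfiguration={
        "CORSRules": [{
            "AllowedOrigins": ["https://yourdomain.com"],
            "AllowedMethods": ["GET", "HEAD"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3600,  # cache preflight results for an hour
        }],
    },
)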
Monitoring is not only part of many compliance requirements; it also keeps administrators informed of file access activity. Every major cloud provider offers monitoring controls, and they are invaluable when attackers compromise infrastructure. Monitoring can reduce the damage from an ongoing attack or reveal an attacker’s vulnerability scans probing for exploit opportunities.
Organizations can use monitoring tools for more than just cybersecurity. Monitoring can tell administrators whether data was accidentally deleted, help identify a failure, audit file access, and track current storage capacity and whether it needs to be increased.
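As one small example, S3 server access logging records every request against a data bucket in a separate log bucket. The boto3 sketch below uses hypothetical bucket names; CloudTrail, GCP’s Cloud Audit Logs, and Azure Monitor fill similar roles.

import boto3

# Write an access log entry for every request against the data bucket
# to a separate, locked-down log bucket. The target bucket must grant
# the S3 logging service permission to write to it.
boto3.client("s3").put_bucket_logging(
    Bucket="yourdomain-app-data",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "yourdomain-access-logs",
            "TargetPrefix": "app-data/",
        },
    },
)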
Cloud storage has several benefits for organizations, but the way it’s managed and configured plays a large role in its successful implementation. It saves on IT costs, but it can also cost organizations millions of dollars should the infrastructure be misconfigured. Before implementing cloud storage in your software deployment or backup strategy, take the time to prepare access policies, organization standards, and a monitoring setup.