In this article, we will discuss the best tips to secure AWS S3 storage. Before we get to the tips, we should understand why S3 security is crucial. In 2017, misconfigured S3 buckets exposed critical data such as private social media accounts and classified documents from the Pentagon. Since then, organizations have paid close attention to securing the data they store in AWS S3.

AWS Shared Responsibility Model

Most public cloud providers, including AWS, operate under a Shared Responsibility Model. AWS is responsible for the security "of" the cloud platform, while customers are responsible for security "in" the cloud. This shared model helps mitigate data breaches. The diagram below shows the general split between AWS's responsibilities and the customer's responsibilities for securing data.

Study the above diagram to familiarize yourself with the responsibilities you have to take on. Preventative measures to secure S3 storage are essential, but not every threat can be prevented. AWS provides several ways to proactively monitor for and reduce the risk of data breaches. Let's look at the following best practices to secure AWS S3 storage.

Create Separate Private and Public Buckets

When you create a new bucket, the default bucket policy is private, and the same applies to newly uploaded objects. You have to explicitly grant access to any entity that should read the data. Using a combination of bucket policies, ACLs, and IAM policies, you can give the right access to the right entities. However, this becomes complex and error-prone if you keep both private and public objects in the same bucket: mixing the two forces a careful audit of ACLs on every object, wasting productive time. A simpler approach is to separate the objects into a public bucket and a private bucket. Create a single public bucket with a bucket policy that grants read access to all the objects stored in it. Then create another bucket for private objects; by default, all public access to it is blocked, and you can use IAM policies to grant access to specific users or applications.
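As a minimal sketch of the public-bucket half of this setup, the following Python builds a bucket policy that grants anonymous read access to every object in a hypothetical bucket (the bucket name here is an assumption, not from the article):

```python
import json

def public_read_policy(bucket_name):
    """Build a bucket policy granting anonymous read access
    to every object in the given (public) bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }

# Attaching it requires AWS credentials, e.g. with boto3:
# boto3.client("s3").put_bucket_policy(
#     Bucket="my-public-assets",
#     Policy=json.dumps(public_read_policy("my-public-assets")))
print(json.dumps(public_read_policy("my-public-assets"), indent=2))
```

The private bucket needs no such policy; its objects stay inaccessible until an IAM policy grants a specific principal access.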

Encrypting Data at Rest and in Transit

To protect data at rest and in transit, enable encryption. You can configure S3 to encrypt objects server-side before storing them, using either the default AWS-managed S3 keys or your own keys created in the Key Management Service (KMS). To enforce encryption in transit, require the HTTPS protocol for all bucket operations through a bucket policy that denies requests made over plain HTTP.
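One way to express such an HTTPS-only policy, sketched in Python with an assumed bucket name: the statement denies every S3 action when the request does not use secure transport.

```python
import json

def https_only_policy(bucket_name):
    """Deny any S3 request to the bucket that does not use HTTPS
    (aws:SecureTransport is false for plain-HTTP requests)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }

print(json.dumps(https_only_policy("my-private-data"), indent=2))
```

Because it is a Deny statement, it overrides any Allow that would otherwise permit an HTTP request.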

Utilize CloudTrail

CloudTrail is an AWS service that logs and maintains a trail of events taking place across AWS services. There are two types of CloudTrail events: management events and data events. Data events are disabled by default and are much more granular.

Management events cover operations such as creating, deleting, or updating S3 buckets. Data events cover the API calls made on objects, such as PutObject, GetObject, or DeleteObject. Unlike management events, data events cost $0.10 per 100,000 events. You can create a trail to log and monitor your S3 buckets in a given region or globally; the trail stores its logs in an S3 bucket.
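To turn on object-level (data event) logging for a bucket, a trail needs an event selector. Below is a sketch of the payload shape for CloudTrail's PutEventSelectors API; the bucket and trail names are assumptions for illustration.

```python
# Event selector payload enabling object-level (data event) logging
# for all objects in one bucket.
selectors = [
    {
        "ReadWriteType": "All",            # log both read (Get*) and write (Put*/Delete*) calls
        "IncludeManagementEvents": True,   # keep bucket-level events too
        "DataResources": [
            {
                "Type": "AWS::S3::Object",
                # a bucket ARN with a trailing slash means "all objects in this bucket"
                "Values": ["arn:aws:s3:::my-private-data/"],
            }
        ],
    }
]

# Applying it requires AWS credentials, e.g. with boto3:
# boto3.client("cloudtrail").put_event_selectors(
#     TrailName="s3-audit-trail", EventSelectors=selectors)
print(selectors[0]["DataResources"])
```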

CloudWatch and alerting

Having CloudTrail set up is great for monitoring, but if you need more control over alerting and self-healing, use CloudWatch. AWS CloudWatch offers near-real-time logging of events.

You can also deliver CloudTrail events to a CloudWatch log group as log streams. Having CloudTrail events in CloudWatch unlocks some powerful features: you can set up metric filters that trigger CloudWatch alarms on suspicious activity.
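As one hedged example of such a metric filter, the parameters below count S3 API calls that were denied, so an alarm can fire when denied-access attempts spike. The log group, filter, and namespace names are assumptions, not from the article.

```python
# Parameters for CloudWatch Logs put_metric_filter: count S3 calls
# in the CloudTrail log stream that ended in AccessDenied.
metric_filter = {
    "logGroupName": "CloudTrail/logs",   # assumed log group receiving CloudTrail events
    "filterName": "S3AccessDenied",
    "filterPattern": '{ ($.eventSource = "s3.amazonaws.com") && ($.errorCode = "AccessDenied") }',
    "metricTransformations": [
        {
            "metricName": "S3AccessDeniedCount",
            "metricNamespace": "S3Security",
            "metricValue": "1",          # each matching event counts as 1
        }
    ],
}

# Applying it requires AWS credentials, e.g. with boto3:
# boto3.client("logs").put_metric_filter(**metric_filter)
print(metric_filter["filterPattern"])
```

A CloudWatch alarm on the S3AccessDeniedCount metric then turns these log events into notifications.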

Set Up a Lifecycle Policy

Setting up a lifecycle policy secures your data and saves you money. With a lifecycle policy, you can transition data that is no longer needed out of active storage and later delete it. This ensures that stale data can no longer be reached by attackers, and it saves money by freeing up space. Enable a lifecycle rule to move data from standard storage to AWS Glacier to reduce cost; later, the data stored in Glacier can be deleted once it no longer adds value to you or your organization.
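The archive-then-delete rule described above can be sketched as a lifecycle configuration like the following; the 90- and 365-day retention periods and the bucket name are illustrative assumptions.

```python
# Lifecycle configuration: move objects to Glacier after 90 days,
# then delete them after 365 days (example retention periods).
lifecycle = {
    "Rules": [
        {
            "ID": "archive-then-expire",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix applies the rule to the whole bucket
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }
    ]
}

# Applying it requires AWS credentials, e.g. with boto3:
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-private-data", LifecycleConfiguration=lifecycle)
print(lifecycle["Rules"][0]["ID"])
```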

S3 Block Public Access

AWS has automated the functionality to block public access to a bucket; previously, a combination of CloudWatch, CloudTrail, and Lambda was needed to achieve this. There are instances where developers accidentally make objects or buckets public, and this feature guards against exactly that kind of accidental exposure.

The new Block Public Access setting prevents anyone from making the bucket public. You can enable this setting in the AWS console, as shown in the above video. You can also apply it at the account level, as explained in the below video.
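Besides the console, the same setting can be applied programmatically. Here is a sketch of the configuration with all four Block Public Access switches turned on; the bucket name is an assumption.

```python
# All four Block Public Access switches enabled. Applied per bucket here;
# the same configuration shape works account-wide via the S3 Control API.
public_access_block = {
    "BlockPublicAcls": True,        # reject new public ACLs on bucket or objects
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject bucket policies that allow public access
    "RestrictPublicBuckets": True,  # restrict access to buckets with public policies
}

# Applying it requires AWS credentials, e.g. with boto3:
# boto3.client("s3").put_public_access_block(
#     Bucket="my-private-data",
#     PublicAccessBlockConfiguration=public_access_block)
print(public_access_block)
```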

Listen to AWS Trusted Advisor

AWS Trusted Advisor is a built-in service that analyzes the AWS resources in your account and recommends best practices.

It offers recommendations in five categories, one of the most crucial being security. Since February 2018, AWS alerts you when your S3 buckets become publicly accessible.

Third-party AWS security tools

Security Monkey

Security Monkey is a tool developed by Netflix that monitors AWS policy changes and alerts you if it finds insecure configurations. It performs several audits on S3 to ensure best practices are in place, and it also supports Google Cloud Platform.

Cloud Custodian

Cloud Custodian helps you manage cloud resources in line with best practices. In simple terms, once you have identified a best practice, you can use this tool to scan your cloud resources and verify that it is being met. If it isn't, you have many options for sending alerts or enforcing the missing policies.

Cloud Mapper

CloudMapper, created by Duo Security, is a great cloud visualization and audit tool. Similar to Security Monkey, it scans S3 buckets for misconfigurations. It offers an excellent visual representation of your AWS infrastructure, which makes further issues easier to spot, along with excellent reporting.

Conclusion

Since most work today revolves around data, securing it should be one of your core responsibilities. You can never know when or how a data breach will happen, so preventive action is always recommended: better safe than sorry. Securing your data can save you thousands of dollars. If you are new to the cloud and interested in learning AWS, then check out this Udemy course.
