All about Cloud, mostly about Amazon Web Services (AWS)

New S3 Lifecycle Management Features Announced

 2016-03-17 /  378 words /  2 minutes

On 16 Mar 2016, the Official AWS Blog announced new Amazon Simple Storage Service (S3) Lifecycle Management features. This post reviews the existing lifecycle features, looks at the newly announced ones, and answers the question: why are they needed?

Cloud storage offers amazing features like global accessibility, almost infinite storage capacity, high durability and integrated content distribution.

Cloud storage also presents challenges like managing the costs and tracking the state of the data.

Existing S3 Lifecycle Features

Amazon offers several options to help manage the cost of object storage, including Reduced Redundancy Storage, Infrequent Access storage and Amazon Glacier.

Reduced Redundancy Storage keeps fewer copies of each object. It costs less than regular S3 storage but offers lower durability, which increases the risk of data loss. The trade-off is cost vs durability.

Infrequent Access storage also costs less per GB stored than regular S3 storage, but costs more to retrieve data. The trade-off is storage cost vs access cost.

Glacier is not part of S3 but is used for long-term archival. Glacier costs less per GB stored than regular S3 storage but takes much longer to retrieve data. Much, much longer! The trade-off is storage cost vs speed of access.

The user must determine the level of durability required, how frequently the data will be accessed, and how quickly it must be made available, then select the right storage option for those requirements.
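As a sketch of that decision process, here is a hypothetical helper function. The storage class names (STANDARD, STANDARD_IA, REDUCED_REDUNDANCY, GLACIER) are real S3 identifiers, but the selection rules are simplified assumptions, not an official AWS recommendation:

```python
# Illustrative sketch: map storage requirements to an S3 storage class.
# The class names are real; the decision rules are simplified assumptions.

def choose_storage_class(needs_high_durability, accessed_often, needs_fast_retrieval):
    """Pick a storage class from durability, access-frequency, and speed needs."""
    if not needs_high_durability:
        return "REDUCED_REDUNDANCY"  # fewer copies, lower cost, lower durability
    if accessed_often:
        return "STANDARD"            # regular S3 storage
    if needs_fast_retrieval:
        return "STANDARD_IA"         # cheaper per GB, higher retrieval cost
    # Glacier is reached via a lifecycle transition rather than direct upload.
    return "GLACIER"                 # cheapest per GB, retrieval takes hours

# Frequently read data that must be durable -> regular S3 storage
print(choose_storage_class(True, True, True))    # STANDARD
# Rarely read archives that can wait for retrieval -> Glacier
print(choose_storage_class(True, False, False))  # GLACIER
```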

Fortunately, S3 Lifecycle Management features help automate both data management and tracking:

  • Move data to Infrequent Access Storage
  • Move data to Glacier
  • Expire versioned objects
  • Permanently delete objects
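The four actions above can all be expressed as lifecycle rules. Below is a minimal sketch of a lifecycle configuration in the shape that boto3's put_bucket_lifecycle_configuration accepts; the rule ID, prefix, and day counts are made-up example values, not recommendations:

```python
# Sketch of an S3 lifecycle configuration covering the four actions above.
# The structure follows the S3 lifecycle API; names and day counts are examples.

lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-logs",        # example rule name
            "Prefix": "logs/",           # apply to objects under logs/
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # to Infrequent Access
                {"Days": 90, "StorageClass": "GLACIER"},      # then to Glacier
            ],
            "Expiration": {"Days": 365},  # permanently delete after a year
            "NoncurrentVersionExpiration": {"NoncurrentDays": 30},  # expire old versions
        }
    ]
}

# With real credentials this would be applied with boto3, e.g.:
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="example-bucket", LifecycleConfiguration=lifecycle_config)
print(lifecycle_config["Rules"][0]["Transitions"][1]["StorageClass"])  # GLACIER
```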

New S3 Lifecycle Features

The blog post covers two new additions to Lifecycle Management: expiring Incomplete Multipart Uploads and removing Expired Object Delete Markers.

Deleting Incomplete Multipart Uploads reduces the cost of storing data that failed to upload completely. Incomplete uploads are effectively unusable.

Deleting Expired Object Delete Markers reduces management overhead by automating cleanup processes.
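Both cleanup actions map to lifecycle rule fields: AbortIncompleteMultipartUpload and the ExpiredObjectDeleteMarker flag inside Expiration. Here is a minimal sketch in the same boto3-style shape; the rule IDs and the seven-day threshold are example values:

```python
# Sketch of lifecycle rules for the two new cleanup features.
# Field names follow the S3 lifecycle API; IDs and day counts are examples.

cleanup_rules = {
    "Rules": [
        {
            "ID": "abort-stale-multipart-uploads",
            "Prefix": "",       # whole bucket
            "Status": "Enabled",
            # Delete multipart uploads still incomplete after 7 days (example value)
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
        },
        {
            "ID": "remove-expired-delete-markers",
            "Prefix": "",
            "Status": "Enabled",
            # Remove delete markers whose older object versions are all gone
            "Expiration": {"ExpiredObjectDeleteMarker": True},
        },
    ]
}

print(cleanup_rules["Rules"][0]["AbortIncompleteMultipartUpload"]["DaysAfterInitiation"])  # 7
```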

The blog entry also discusses some S3 Best Practices which are worth reading. Here’s the summary:

  1. Use Versioning to prevent accidental data loss
  2. Use Cross-Region Replication to prevent disaster related data loss
  3. For high volume usage, optimize for performance
  4. Manage costs using lifecycle rules to migrate data to lower storage tiers
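The first best practice, versioning, is a single bucket-level setting. A sketch of the payload that boto3's put_bucket_versioning expects (the bucket name in the comment is hypothetical):

```python
# Sketch: the versioning payload for an S3 bucket.
versioning_config = {"Status": "Enabled"}  # or "Suspended" to pause versioning

# With credentials, applied as:
#   import boto3
#   boto3.client("s3").put_bucket_versioning(
#       Bucket="example-bucket", VersioningConfiguration=versioning_config)
print(versioning_config["Status"])  # Enabled
```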

Tags:  AWS  Amazon DynamoDB  Go  AWS Console
Categories:  AWS  Amazon DynamoDB


Disclaimer

All data and information provided on this site is for informational purposes only. cloudninja.cloud makes no representations as to accuracy, completeness, currentness, suitability, or validity of any information on this site and will not be liable for any errors, omissions, or delays in this information or any losses, injuries, or damages arising from its display or use. All information is provided on an as-is basis.

This is a personal weblog. The opinions expressed here represent my own and not those of my employer. My opinions may change over time.