AWS S3


In the previous posts, we created users, roles, and policies. All this data must be stored safely for later retrieval. Moreover, the need for storage grows day by day, so building and maintaining your own repositories has become a difficult job: it is hard to predict the amount of capacity needed in advance. To overcome this kind of problem, Amazon offers storage over the internet through S3. In this post, we will discuss AWS S3 in detail.

What is Amazon S3?

Amazon S3 is an abbreviation for Simple Storage Service (S3). Amazon S3 is storage for the internet: it can store and retrieve any amount of data at any time on the web. You can use this storage platform through the AWS Management Console. The platform is designed for large-capacity, low-cost storage across multiple geographic regions. S3 provides developers and IT teams with secure, durable, and highly scalable object storage, and it allows users to upload, store, and download files of any type up to 5 TB in size. The service gives subscribers access to the same systems that Amazon uses to run its own websites.

Storage Classes:

S3 storage classes are designed to sustain the concurrent loss of data in one or two facilities, and they maintain data integrity using checksums. S3 also provides lifecycle management for the automatic migration of objects between classes. Storage classes distinguish the use case of a particular object in a bucket. Amazon S3 supports several storage classes; let us discuss them in detail.


S3 Standard:

The Standard storage class stores data redundantly across multiple devices and is designed to sustain the loss of two facilities concurrently. If no storage class is specified when uploading a file, Amazon uses Standard as the default. It provides low latency and high throughput, and it is designed for 99.99% availability and 99.999999999% (11 nines) durability.

S3 Standard – IA:

IA stands for Infrequent Access. This class is used for data that is accessed less frequently but requires rapid access when needed. It has a lower storage fee than S3 Standard and can likewise sustain the loss of two facilities concurrently. It provides the same low latency and high throughput as Standard, with 99.9% availability and 99.999999999% durability.

S3 One Zone – Infrequent Access:

This storage class also targets data that is accessed less frequently but requires rapid access when needed. It stores data in a single Availability Zone, whereas the other storage classes store data in a minimum of three Availability Zones, so it costs about 20% less than the Standard-IA class. It is a good choice for storing backup data, or for data that is already replicated from another AWS Region using S3 Cross-Region Replication. Within its single Availability Zone, it offers 99.5% availability and 99.999999999% object durability, and it supports lifecycle management for the automatic migration of objects to other storage classes.

S3 Glacier:

It is the cheapest storage class, but it is intended for archival only. Compared with the other storage classes, you can store any amount of data here at a much lower cost. This storage class allows you to upload objects directly, and it is designed for 99.999999999% durability across multiple Availability Zones. It provides three retrieval options:

  • Expedited:

Data is typically retrieved within a few minutes, and it is the most expensive retrieval option.

  • Standard:

This option has a retrieval time of 3 to 5 hours.

  • Bulk:

The retrieval time of this option is 5 to 12 hours.

S3 Intelligent-Tiering:

It is the first object storage class that delivers automatic cost savings by moving data between two access tiers when access patterns change: one tier optimized for frequent access and a second tier optimized for infrequent access. It is designed for 99.9% availability and 99.999999999% durability.

One Zone-IA:

This storage class is recommended for customers who want a lower-cost option for infrequently accessed data. It is intended for use cases such as storing backup copies of on-premises data. With this class, customers can store infrequently accessed data within a single Availability Zone at about 20% lower cost than Standard-IA.
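The storage class is chosen per object at upload time. Here is a minimal sketch using the boto3 SDK; the bucket and file names are placeholders, and the call is wrapped in a function because it needs valid AWS credentials to run.

```python
# Sketch: choosing a storage class per object at upload time.
# Assumes the boto3 SDK; "my-example-bucket" is a placeholder name.

# Common StorageClass values accepted by put_object / upload_file:
STORAGE_CLASSES = [
    "STANDARD",             # the default when nothing is specified
    "STANDARD_IA",          # infrequent access
    "ONEZONE_IA",           # infrequent access, single Availability Zone
    "INTELLIGENT_TIERING",  # automatic tiering between access tiers
    "GLACIER",              # archival
]

def upload_with_class(filename, bucket, key, storage_class="STANDARD"):
    """Upload a local file into the given storage class."""
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.upload_file(filename, bucket, key,
                   ExtraArgs={"StorageClass": storage_class})

# e.g. upload_with_class("backup.tar", "my-example-bucket",
#                        "backups/backup.tar", "STANDARD_IA")
```

Lifecycle rules (covered later under Management) can then migrate objects between these classes automatically.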

What kind of data, and how much, can be stored in S3?

In S3, you can store virtually any kind of data. Amazon S3 can store a large volume of objects of any type, and it is commonly used for internet applications, backup, and recovery.

How is data organized in S3?

Data in S3 is organized in the form of buckets:

  • A bucket is a logical unit of storage in S3.

  • A bucket contains objects, which hold the data as well as its metadata.
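The bucket/object layout is easy to see from code. Below is a sketch assuming the boto3 SDK and valid credentials; all bucket and key names are made-up placeholders.

```python
# Sketch: how S3's bucket/object layout looks from code.
# Assumes the boto3 SDK and valid AWS credentials.

def list_storage():
    """Return {bucket_name: [object keys]} for every bucket you own."""
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    layout = {}
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        resp = s3.list_objects_v2(Bucket=name)
        # Each object ("key") holds the data plus metadata such as
        # size and last-modified time.
        layout[name] = [obj["Key"] for obj in resp.get("Contents", [])]
    return layout

# An illustrative (made-up) result:
example_layout = {"my-example-bucket": ["photos/cat.jpg", "docs/report.pdf"]}
```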

What is an Amazon Bucket?

Amazon S3 has two primary entities: buckets and objects. Buckets store objects, and they have a flat hierarchy. Every organization needs its data kept in an ordered fashion, and buckets provide that organization: they can store different kinds of objects in the cloud, and the platform allows users to upload folders as well as individual objects/files. By default, an account can create up to 100 buckets. Since bucket names are shared in a global namespace, each bucket name must be unique. I also suggest following DNS naming conventions, i.e., keeping all letters lowercase.


So let us see how to create a bucket in practice.

Bucket Creation:

step - 1:

Log in to the AWS Console, search for S3, and click on it.

step - 2:

Click on Create bucket.

step - 3:

Fill in the details and click on Create.

step - 4:

Click on Create bucket.

step - 5:

Once created, click on the bucket you just created to open it.

step - 6:

Click on Add files.

step - 7:

Add any file and click on Upload.

Likewise, you can also add folders (drag and drop).
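The same console steps can be done from code. A minimal sketch with the boto3 SDK follows; the bucket name and region are placeholders, and remember that bucket names must be globally unique and lowercase.

```python
# Sketch: the console steps above, done with the boto3 SDK instead.
# Bucket name and region are placeholders; bucket names must be
# globally unique and DNS-compliant (all lowercase).

BUCKET = "my-example-bucket"
REGION = "ap-south-1"

def create_bucket_and_upload(filename, key):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3", region_name=REGION)
    # Outside us-east-1 the region must be given explicitly:
    s3.create_bucket(
        Bucket=BUCKET,
        CreateBucketConfiguration={"LocationConstraint": REGION},
    )
    s3.upload_file(filename, BUCKET, key)  # the "Add files" step
```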

step - 8:

Click on the object that you uploaded; you will see the object's detail screen.

step - 9:

Click on the object URL. You may get an "Access Denied" error, because objects are private by default.

step - 10:

Make the object public to resolve this error.

When you select an object or folder, you can see various options (Download, Make public, Copy, and so on).

Note:

Not every file can be made public; public access is restricted for some kinds of files.

Properties:
What is Object Versioning?

Object versioning lets you store objects with the same name but different version IDs. This prevents unintended overwrites and object deletions.
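Versioning can also be enabled from code. A sketch assuming the boto3 SDK (bucket and key names are placeholders):

```python
# Sketch: enabling versioning and listing versions with boto3.
# Assumes the boto3 SDK and a bucket you own.

VERSIONING_CONFIG = {"Status": "Enabled"}  # the other valid value is "Suspended"

def enable_versioning(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_versioning(
        Bucket=bucket, VersioningConfiguration=VERSIONING_CONFIG
    )

def list_versions(bucket, key):
    """Every overwrite of `key` leaves the older version behind."""
    import boto3
    s3 = boto3.client("s3")
    resp = s3.list_object_versions(Bucket=bucket, Prefix=key)
    return [(v["VersionId"], v["LastModified"])
            for v in resp.get("Versions", [])]
```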

Let us see how versioning works in detail.

step - 1:

Create a bucket and upload any file (for instance, an .html file).

step - 2:

Select the object and make it public. Then open the object using its URL and check the output.

step - 3:

Now navigate to the Properties tab, select Versioning, click Enable, and save it.

step - 4:

Add some data to your local copy of the file and save it.

step - 5:

Upload the file once again.

step - 6:

Open your bucket and activate the Show versions toggle. You will then see the version list.

In the version list, you can see the two files with different timestamps. That is, for every change you make to the source file, a new version is created.

Note:

After every file update, you must make the new version public again; otherwise you cannot access the file.

step - 7:

Once you have done all this, try to access the file again; you will see the updated content.

If you navigate back to the object and click on Properties, you can see its properties screen.

Server access logging records detailed access logs for the requests made to the bucket by various users in the Amazon account.

To enable it, click on Server access logging, select Enable logging, select the target bucket, and click Save.
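Server access logging can be enabled from code as well. A sketch with boto3; the target bucket and prefix are placeholders, and the target bucket must grant write access to the S3 log delivery group.

```python
# Sketch: enabling server access logging with boto3. The target
# bucket and prefix are placeholders; the target bucket must allow
# the S3 log delivery group to write to it.

LOGGING_CONFIG = {
    "LoggingEnabled": {
        "TargetBucket": "my-log-bucket",  # where log files are delivered
        "TargetPrefix": "access-logs/",   # key prefix for the log objects
    }
}

def enable_access_logging(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_logging(Bucket=bucket, BucketLoggingStatus=LOGGING_CONFIG)
```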

Static Website:

Next in the Properties tab is the Static website hosting option. Let us see how to launch a static website.

step - 1:

Navigate to S3, select any bucket, go to Properties, and then click on Static website hosting.

step - 2:

Select "Use this bucket to host a website", provide the index and error document names, and save.

Note: Before saving, copy the endpoint URL into a text editor.

step - 3:

Open any text editor, such as Notepad, and create two HTML files, index.html and error.html, as shown below.

index.html

<html>
<head>
<title>
Index page
</title>
</head>
<body>
<h1> Welcome to ONLINEITGURU AWS Tutorial </h1>
<p> This page works fine </p>
</body>
</html>

error.html

<html>
<head>
<title>
Error page
</title>
</head>
<body>
<h1> Welcome to ONLINEITGURU AWS Tutorial </h1>
<p> This page is an Error page </p>
</body>
</html>


step - 4:

Open your desired bucket and upload the two HTML files.

step - 5:

You can then see both files listed in your bucket.

step - 6:

Make the files public.

step - 7:

In step 2, we saved the endpoint URL in a text editor. Open that link in the browser; the index page is displayed.

step - 8:

Now append /abc (or anything else) after the endpoint URL. Since no such object exists, the error page is displayed.

Likewise, we can host the static website.
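The website configuration can also be applied from code. A boto3 sketch follows; the document names match the two files created above, and the bucket name is a placeholder.

```python
# Sketch: the same static-website setup done with boto3. The two
# documents match the files created above; the bucket is a placeholder.

WEBSITE_CONFIG = {
    "IndexDocument": {"Suffix": "index.html"},  # served for "/" requests
    "ErrorDocument": {"Key": "error.html"},     # served for missing keys
}

def enable_website(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_website(Bucket=bucket, WebsiteConfiguration=WEBSITE_CONFIG)
    # The site is then served at an endpoint of the form
    # http://<bucket>.s3-website.<region>.amazonaws.com
```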

Object-level logging:

Object-level logging records API activity to AWS CloudTrail. CloudTrail is the AWS API auditing service, so object-level logging lets you feed S3 object access into CloudTrail's central auditing and logging. We can enable it with the following steps.

step - 1:

Click on Object-level logging; you will see a prompt pointing to CloudTrail.

step - 2:

Click on CloudTrail Console. You will then enter the CloudTrail screen.

step - 3:

Click on Create trail.

step - 4:

Fill in the details.

Note:

You can change the management events as per your requirement.

step - 5:

Click on Create.

step - 6:

Go back to step 1, select the trail, provide the permissions, and then click on Create.

Default encryption:

Amazon can also encrypt your data at rest. Click on Default encryption and you will see the available options; select the encryption type as per your requirement and click Save.
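Default encryption can likewise be set from code. A boto3 sketch follows, showing SSE-S3 (AES-256); for SSE-KMS you would swap in "aws:kms" plus a KMSMasterKeyID (the bucket name is a placeholder).

```python
# Sketch: setting default bucket encryption with boto3. SSE-S3
# (AES-256) is shown; for SSE-KMS, use "aws:kms" and add a
# "KMSMasterKeyID" entry to the inner dict.

ENCRYPTION_CONFIG = {
    "Rules": [
        {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
    ]
}

def enable_default_encryption(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration=ENCRYPTION_CONFIG,
    )
```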

Permissions:

This tab contains the various bucket-level access policies.

Block Public Access:

This section lists the settings that block access granted through access control lists and policies. You can grant or block access on the basis of your requirement; click on Edit to change these settings.

Access Control List:

If you navigate to the Permissions tab, you will find the Access Control List section. The Add account option means this platform allows you to share access with other people: click on Add account, provide the canonical IDs, and then grant the permissions. Likewise, if you need public access, select Everyone.

You can find the canonical ID by clicking on the name of your account.

Besides granting access to other AWS accounts, this platform also allows you to grant public access as well as access to the log delivery group.

Bucket Policy:

Bucket policies and user policies are the two kinds of access policies available in Amazon S3. These policies allow you to grant permissions to Amazon S3 resources, and both use the JSON-based access policy language. Let us see how to create a bucket policy.

step - 1:

In the Bucket Policy tab, click on Policy generator.

step - 2:

Fill in the details and click on Add Statement.

step - 3:

Click on Generate Policy.

step - 4:

Copy the generated policy into the bucket policy editor, save it, and click on Close.
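For reference, here is a minimal read-only policy of the kind the Policy Generator produces, applied with boto3. The bucket name is a placeholder; this particular statement makes every object publicly readable, so use it only when that is what you want.

```python
import json

# Sketch: a minimal public-read bucket policy, like the one the
# Policy Generator produces. The bucket name is a placeholder.

BUCKET = "my-example-bucket"

POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",                          # anyone
            "Action": "s3:GetObject",                  # read-only
            "Resource": f"arn:aws:s3:::{BUCKET}/*",    # every object
        }
    ],
}

def apply_policy(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(POLICY))
```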

Management:
Life Cycle Rule:

We can create lifecycle rules manually as follows:

step - 1:

Navigate to your S3 bucket, move to the Management tab, and then click on Add lifecycle rule.

step - 2:

Provide the rule name and click Next.

step - 3:

Select the transition (for current and/or previous versions) and click Next.

step - 4:

Select the object expiration days and click Next.

step - 5:

Click Save.
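The same kind of rule can be expressed as a boto3 configuration. The day counts, rule ID, and bucket name below are placeholders for illustration.

```python
# Sketch: a lifecycle rule like the one created above, applied with
# boto3. Day counts and names are placeholders.

LIFECYCLE_CONFIG = {
    "Rules": [
        {
            "ID": "archive-then-expire",
            "Filter": {"Prefix": ""},  # empty prefix = the whole bucket
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 365},  # delete after one year
        }
    ]
}

def apply_lifecycle(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket, LifecycleConfiguration=LIFECYCLE_CONFIG
    )
```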

Replication:

Replication creates a duplicate copy of the original object. Amazon allows users to replicate objects between different AWS Regions or within the same AWS Region. Same-Region Replication copies objects across Amazon S3 buckets in the same Region.

step - 1:

Click on Add rule.

step - 2:

Click Next.

step - 3:

For the destination, choose a bucket; to use a bucket in another account, click on "Buckets in another account", provide the account details, and click Save. Then click Next.

step - 4:

Select the IAM role, provide a rule name, and click Next.

step - 5:

Finally, click Save.

Likewise, you can configure replication; if one copy is lost, the other copies still work.
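A replication rule can also be applied with boto3. This is only a sketch: versioning must already be enabled on both buckets, and the role and bucket ARNs below are placeholders.

```python
# Sketch: a replication rule applied with boto3. Versioning must
# already be enabled on BOTH buckets; the role and bucket ARNs
# are placeholders.

REPLICATION_CONFIG = {
    "Role": "arn:aws:iam::123456789012:role/my-replication-role",
    "Rules": [
        {
            "ID": "replicate-everything",
            "Priority": 1,
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix = all objects
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": "arn:aws:s3:::my-destination-bucket"},
        }
    ],
}

def apply_replication(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_replication(
        Bucket=bucket, ReplicationConfiguration=REPLICATION_CONFIG
    )
```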

Analytics:

AWS S3 storage class analysis observes storage access patterns and helps you decide when to transition the right data to the right storage class, for example when to move less frequently accessed Standard storage to STANDARD_IA. Storage class analysis watches for infrequent access patterns on a filtered data set over a period of time, and the results help you improve your lifecycle policies. You can configure storage class analysis to analyze all the objects in a bucket, or configure multiple filters (up to 1,000 per bucket) and receive a separate analysis for each filter.

step - 1:

Navigate to Analytics in the Management tab and click on Add.

step - 2:

Provide any name and click Save.

Metrics:

Metrics help you understand and improve the performance of applications that use S3. Amazon S3 works with CloudWatch in two ways:

  • Daily storage metrics for buckets:

You can monitor bucket storage using CloudWatch, which processes the storage data from S3 into readable daily metrics. These storage metrics are reported once per day and are provided to customers for free.

  • Request metrics:

You can opt in to Amazon S3 request metrics to quickly identify and act on operational issues. These metrics are available at 1-minute intervals after some processing latency, and they are billed at the same rate as Amazon CloudWatch custom metrics. The 1-minute metrics are available at the bucket level.

We can add request metrics to a bucket as follows:

step - 1:

Click on Add in the Metrics tab.

step - 2:

Provide any name and click on Save.

step - 3:

Then you can check the metrics in the dashboard.
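A request-metrics configuration can also be added with boto3. The Id below is a placeholder; a configuration without a filter covers the whole bucket.

```python
# Sketch: adding a request-metrics configuration with boto3.
# The Id is a placeholder; omitting a Filter makes the
# configuration cover the whole bucket.

METRICS_ID = "EntireBucket"

def add_request_metrics(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_metrics_configuration(
        Bucket=bucket,
        Id=METRICS_ID,
        MetricsConfiguration={"Id": METRICS_ID},  # no filter = all requests
    )
```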

Inventory:

Amazon S3 Inventory is one of the tools that helps you manage your storage. You can use it to audit and report on the replication and encryption status of your objects for business, compliance, and regulatory needs. S3 Inventory can also simplify and speed up business workflows and big data jobs. It produces output files in Comma-Separated Values (CSV), Apache Optimized Row Columnar (ORC), or Apache Parquet format; these files list your objects and their corresponding metadata, on a daily or weekly basis, to a destination S3 bucket. You can configure multiple inventory lists for a bucket, and you can query S3 Inventory using Amazon Athena, Amazon Redshift Spectrum, and some other tools.

And we can add the inventory to the bucket as follows.

step - 1:

In the Inventory section, click on Add new.

step - 2:

Fill in the details and click on Save.

You will then be redirected to the inventory list screen.
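An inventory configuration can be created with boto3 as well. Everything below (Id, bucket ARN, prefix) is a placeholder sketch.

```python
# Sketch: a daily CSV inventory configuration applied with boto3.
# All names and ARNs are placeholders.

INVENTORY_CONFIG = {
    "Id": "daily-inventory",
    "IsEnabled": True,
    "IncludedObjectVersions": "All",     # or "Current"
    "Schedule": {"Frequency": "Daily"},  # or "Weekly"
    "Destination": {
        "S3BucketDestination": {
            "Bucket": "arn:aws:s3:::my-inventory-bucket",
            "Format": "CSV",             # ORC and Parquet are also supported
            "Prefix": "inventory/",
        }
    },
}

def add_inventory(bucket):
    import boto3  # requires: pip install boto3
    s3 = boto3.client("s3")
    s3.put_bucket_inventory_configuration(
        Bucket=bucket,
        Id=INVENTORY_CONFIG["Id"],
        InventoryConfiguration=INVENTORY_CONFIG,
    )
```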

Amazon S3 Features:
Low cost and easy to use:

With Amazon S3 we can store a large amount of data at a minimal cost.

Secure:

AWS S3 is secure: AWS provides encryption for stored data in two ways, client-side and server-side encryption. The platform also maintains multiple copies of your data to enable regeneration in case of corruption, and it provides strong authentication to ensure the security of regionally stored data.

Durable:

S3 is highly durable. It regularly verifies data integrity using checksums; for instance, if S3 detects corruption in the data, it immediately repairs it from the redundant copies.

Highly Scalable:

S3 is highly scalable: it automatically scales your storage according to your requirements, and you pay only for the storage you use.

High Performance:

Amazon S3 is integrated with Amazon CloudFront, which distributes content with low latency and high data transfer speeds, with no minimum usage commitment.

Storage:

This platform allows unlimited storage of objects of most data types in various formats; a single object can range from 0 bytes up to 5 TB. It also provides Reduced Redundancy Storage (RRS), and regionally segregated buckets reduce latency, saving resources and improving application efficiency for users in geographically dispersed locations.

Web Protocols:

This platform provides Representational State Transfer (REST) and Simple Object Access Protocol (SOAP) web service interfaces, built to work with any web development toolkit.

Advantages of Amazon S3 Service:
  • Scalability on demand:

AWS S3 is a very good option if you want to scale your application according to changes in traffic. You can scale up or down with just a few clicks.

  • Content Storage and Distribution:

Amazon S3 can be used as the foundation for a content delivery framework, handling both content storage and distribution.

  • Big data and analytics on Amazon S3:

Amazon QuickSight can be connected to Amazon S3, and large amounts of data can be analyzed with it.

  • Backup and Archive:

The Amazon platform can back up large amounts of data. This includes both static files and the dynamic files you are working on.

  • Disaster Recovery:

This platform stores data in multiple Availability Zones and gives the user the flexibility to recover files. Moreover, cross-region replication can store copies in any number of Amazon data centers worldwide.
