AWS ElastiCache is a fully managed, in-memory cache and data store in the cloud

Are you preparing to take the AWS Certified Solutions Architect Professional certification test? This space will contain articles that cover topics on the Solutions Architect Professional exam. This article will explain AWS ElastiCache, an important service that Amazon Web Services offers. This topic is likely to come up in several questions on the Solutions Architect certification exam. To receive the latest updates on this topic, you can subscribe to our mailing list.
AWS Certified Solutions Architect Professional – Free Test
How to prepare for the AWS Certified Solutions Architect Professional Exam?

This topic addresses the Scalability & Elasticity domain, as highlighted in the AWS exam blueprint guide.

What is ElastiCache?
ElastiCache is a web service that makes it possible to retrieve data quickly from managed in-memory caches instead of relying on slower disk-based databases. A caching mechanism can greatly improve the performance of your application, because reading from an in-memory cache is much faster than a disk read/write operation.
Amazon takes care of the operational side of running a cache: AWS handles all the infrastructure and upgrades to the cache software, so you can focus on your application.
Real-time applications like gaming, social networking, and media sharing can use caches.
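Applications typically use such a cache with a cache-aside (lazy loading) pattern: check the cache first, and only fall back to the database on a miss. Below is a minimal sketch in Python, using a plain dictionary to stand in for the cache node and a hypothetical `load_from_database` function in place of a real database query.

```python
import time

cache = {}           # stands in for an in-memory cache node
TTL_SECONDS = 300    # how long a cached entry stays fresh

def load_from_database(key):
    # Hypothetical slow, disk-based lookup.
    return f"value-for-{key}"

def get(key):
    """Cache-aside read: try the cache first, fall back to the database."""
    entry = cache.get(key)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < TTL_SECONDS:
            return value                   # cache hit
    value = load_from_database(key)        # cache miss: query the database
    cache[key] = (value, time.time())      # populate the cache for next time
    return value
```

The time-to-live (TTL) keeps stale data from being served forever; the first read of a key is slow, and subsequent reads within the TTL are served from memory.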
AWS allows you to choose between two cache engines, Redis and Memcached. Below are some key differences.
Redis is single-threaded, while Memcached is multi-threaded. Redis processes one command at a time, whereas Memcached can serve requests on multiple CPU cores in parallel.
Memcached does not support persistence. All data is lost if a node goes down.
Flat string cache – Memcached is the best choice if you only need to store plain strings, such as HTML fragments or serialized JSON.
Memcached can be scaled horizontally, so you can add nodes to the cluster to handle increased load.
Memcached is simpler to maintain.
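Horizontal scaling with Memcached works because the client, not the server, decides which node holds a given key. Here is a minimal sketch of that idea, assuming simple modulo hashing over a list of hypothetical node endpoints (production clients typically use consistent hashing instead, so that adding a node remaps fewer keys):

```python
import hashlib

# Hypothetical node endpoints; real values come from the ElastiCache console.
nodes = [
    "node-1.example.cache.amazonaws.com:11211",
    "node-2.example.cache.amazonaws.com:11211",
    "node-3.example.cache.amazonaws.com:11211",
]

def node_for(key):
    """Map a cache key to one of the Memcached nodes by hashing it."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]
```

Because every client hashes keys the same way, each key consistently lands on the same node, and adding nodes spreads the load across more machines.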

Below are some of the benefits of ElastiCache.
Easy to deploy – The AWS console makes it easy to deploy the cache service. In just a few simple steps, your cache service can be up and running.
Easy scaling of nodes – It's easy to add new nodes to an existing cluster through the AWS console.
Fault tolerant – AWS CloudWatch monitors the entire ElastiCache infrastructure. AWS can detect node failures automatically and replace failed nodes on the fly.
ElastiCache clusters can be distributed across multiple Availability Zones. This ensures that the cache is always accessible to the application and end users.
Redis replication groups – Redis offers the possibility of creating replication groups, which improve the fault tolerance of your application. You can add a read replica to an existing cluster and place it in another Availability Zone. This ensures availability in the event that your primary node is unavailable.
Backups – A backup is a point-in-time copy of a Redis cluster. Backups can be used to seed a new cluster or to restore an existing one. Backups include all data within a cluster as well as its metadata. Backups are not supported for Memcached.
Endpoints – Any type of application can connect to the cache and use it. An endpoint is the combination of a DNS name and a port number used to make the connection. Applications written in languages such as .NET or PHP use these endpoints to communicate with the cache.
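An endpoint string is just the DNS name and port joined by a colon; a client library splits it apart before opening the connection. A minimal sketch, using a hypothetical endpoint value (no connection is attempted here — a real client such as a Memcached or Redis library would connect with the resulting host and port):

```python
def parse_endpoint(endpoint):
    """Split a 'host:port' ElastiCache endpoint into its two parts."""
    host, _, port = endpoint.rpartition(":")
    return host, int(port)

# Hypothetical endpoint for illustration only.
host, port = parse_endpoint("mycache.abc123.cfg.use1.cache.amazonaws.com:11211")
```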
Creating an ElastiCache Cluster
Let's now examine the steps involved in creating an ElastiCache cluster.
Step 1: Log in to the AWS Console, navigate to the Database section, and select ElastiCache.

Step 2: Selecting the cache engine – On the next screen, you will be asked to select the cache engine. AWS offers the two engines described in the features section above. Let's use Memcached for this task.

Step 3: Next, choose the node type you want and the number of nodes for your cluster. This is crucial, as it determines how many requests your cache can serve. You will also need to give your cluster a name.
You can leave the default values for Engine version compatibility and Parameter group.

It is important to note the port number, as it will be required to connect to the cache from an application.