Let's have a quick peek into this week's AWS updates:
- AWS Direct Connect enables failover testing
- CodePipeline now supports invoking Step Functions
- AWS introduces the CloudEndure Migration Factory Solution
- Redshift enables a useful storage feature
- AWS Snowcone
- Lambda now supports EFS
- AWS launches AWS CodeArtifact
AWS Direct Connect enables failover testing
AWS Direct Connect now supports failover testing. Direct Connect is a service that lets you connect your data center, office, or co-location environment directly and privately to your AWS account without traversing the internet. Until now, AWS didn't provide any tools to simulate failures and test the resiliency of your connection. With the new Resiliency Toolkit failover testing feature, you can shut down your BGP (Border Gateway Protocol) sessions for a configurable period. You can also cancel the failover test at any time and return to your previous working configuration. This is a very helpful feature for ensuring that your connections are highly available and resilient.
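As a sketch, the test can be started and stopped from the AWS CLI with the Direct Connect `start-bgp-failover-test` and `stop-bgp-failover-test` commands; the virtual interface ID and duration below are placeholders:

```shell
# Bring down the BGP session on a virtual interface for 10 minutes
# to verify that traffic fails over to your redundant connection.
aws directconnect start-bgp-failover-test \
    --virtual-interface-id dxvif-0123abcd \
    --test-duration-in-minutes 10

# Cancel the test early and restore the previous BGP state.
aws directconnect stop-bgp-failover-test \
    --virtual-interface-id dxvif-0123abcd
```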
CodePipeline supports invoking Step Functions with a new action type
CodePipeline has a new action type to trigger AWS Step Functions, which makes it easier to invoke complex workflows as part of your release process. With this new action type, CodePipeline stages can trigger Step Functions state machines, which support error handling and asynchronous tasks, and can easily invoke other AWS services through service integrations. This will help everyone using CodePipeline to keep their release pipelines simple and delegate complex behavior to Step Functions.
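A minimal sketch of what the new action looks like in a pipeline definition, assuming a hypothetical state machine named `ReleaseTasks` (the account ID and input are placeholders):

```json
{
  "name": "InvokeStateMachine",
  "actionTypeId": {
    "category": "Invoke",
    "owner": "AWS",
    "provider": "StepFunctions",
    "version": "1"
  },
  "configuration": {
    "StateMachineArn": "arn:aws:states:us-east-1:111111111111:stateMachine:ReleaseTasks",
    "InputType": "Literal",
    "Input": "{\"stage\": \"prod\"}"
  }
}
```

The pipeline stage waits for the state machine execution to finish, so a failed execution fails the stage.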
Introducing the AWS CloudEndure Migration Factory Solution
AWS has released the AWS CloudEndure Migration Factory Solution, which shows how to migrate a large number of servers to AWS using CloudEndure in a simplified and fast way, but still at scale. It automates many of the time-consuming tasks that enterprises face when migrating to the cloud, like checking source machine prerequisites and installing and uninstalling software on the source and target machines. Thousands of servers have already been migrated to AWS using this solution.
Storage Controls for Schemas in Amazon Redshift
Now you can restrict the amount of disk space used by a schema in Amazon Redshift. This lets you set quotas on the maximum amount of storage consumed by your schemas, allowing you to control and monitor the storage used by different applications and users across your organization.
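Quotas are set with the `QUOTA` clause on `CREATE SCHEMA` and `ALTER SCHEMA`; a short sketch with a placeholder schema name:

```sql
-- Create a schema capped at 2 GB of disk usage.
CREATE SCHEMA sales_ingest QUOTA 2 GB;

-- Raise the quota later as the application grows.
ALTER SCHEMA sales_ingest QUOTA 5 GB;

-- Remove the cap entirely.
ALTER SCHEMA sales_ingest QUOTA UNLIMITED;
```

Writes to a schema that has exceeded its quota are aborted, so quotas act as a hard cap rather than just a monitoring alarm.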
AWS Snowcone now available
AWS has announced AWS Snowcone, the newest and smallest member of the AWS Snow Family of edge computing and data transfer devices. It is a portable, rugged, and secure device that can be used to deploy applications at the edge. It can collect and process data locally, and then move it to AWS either by shipping the device back or over the internet using AWS DataSync. It is small and light enough to fit in a backpack or even be carried by a drone, and it is built to withstand harsh environments. It has two CPUs, 4 GB of RAM, 8 TB of usable storage, and both Wi-Fi and wired networking. It can run edge computing workloads with select EC2 instances or AWS IoT Greengrass, and you can even plug in a battery for mobility.
Lambda support for Amazon EFS
Lambda now supports Amazon Elastic File System (EFS), giving you the ability to easily share data across function invocations, read large files, and write function output to a persistent, shared data store. Lambda functions only have 512 MB of temporary storage, so working with files larger than that was previously not possible. This opens up a whole new world of Lambda use cases, and it will also simplify many functions that previously had to pull temporary data from S3, process it, and push it all back; that's no longer necessary. It even enables tools and libraries that don't know how to talk to S3 natively to work, as they now have a file system to run on.
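A minimal sketch of what sharing state across invocations looks like, assuming the function's EFS access point is mounted at a hypothetical path like `/mnt/shared`. The mount path is a parameter here so the logic can be exercised outside Lambda:

```python
import os


def handler(event, context=None, mount_path="/mnt/shared"):
    """Append a record to a file on the EFS mount and return the total
    line count. Because the mount is shared, every invocation of the
    function (and any other function using the same access point) sees
    the same file."""
    path = os.path.join(mount_path, "records.log")
    with open(path, "a") as f:
        f.write(event["record"] + "\n")
    with open(path) as f:
        return {"lines": sum(1 for _ in f)}
```

Unlike `/tmp`, the data under the mount path survives across invocations and is not capped at 512 MB.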
Introducing AWS CodeArtifact
AWS has introduced CodeArtifact, a fully managed software artifact repository service. It makes it easy for organizations of any size to securely store, publish, and share the packages used in their software development process. CodeArtifact eliminates the need to set up, operate, and scale the infrastructure required to manage your artifacts, letting you get on with the job of software development. It works with common package managers like Maven, npm, Yarn, pip, and twine, making it easy to integrate into your existing development workflows.
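As a sketch of the workflow, a domain and repository are created once, and then each developer points their package manager at the repository with `aws codeartifact login` (the domain and repository names below are placeholders):

```shell
# One-time setup: create a domain and a repository inside it.
aws codeartifact create-domain --domain my-domain
aws codeartifact create-repository --domain my-domain --repository my-repo

# Point npm at the repository; the login command fetches a temporary
# auth token and rewrites the local npm registry configuration.
aws codeartifact login --tool npm --domain my-domain --repository my-repo
```

After the login step, ordinary `npm install` and `npm publish` commands go through the CodeArtifact repository.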