Keeping a close eye on how customers are using its cloud offerings, Amazon Web Services has debuted two notable new services aimed at automating the deployment of cloud native applications onto its commercial cloud service.

EventBridge is a new serverless event bus for ingesting and processing data across different SaaS and AWS services. Also new, the AWS Cloud Development Kit (CDK) offers the ability to programmatically define AWS infrastructure using Java, TypeScript, or other programming languages, directly from within an IDE.

AWS Chief Technology Officer Werner Vogels introduced both technologies at the AWS Summit in New York on Thursday, where he emphasized that these tools were designed to help developers focus on cloud-first coding.

“Container environments feel like they are pre-cloud because suddenly you have to manage infrastructure again,” he said, referring to how many AWS customers will use Kubernetes to aid in development before moving the code to AWS for execution. “The whole premise of running on the cloud is that you can focus on developing your applications.”

“I strongly believe that in the future, development will be that you only write business logic, and you will not be managing any of the computer environments that you do now,” he added.

A Serverless Event Bus

EventBridge extends the idea of serverless, which AWS pioneered with its AWS Lambda, to application-to-application messaging as well, using the event bus model. The idea is to provide “a much simpler programming model, where you can stitch [together] your code, AWS services, and services from third parties.” The technology builds on AWS CloudWatch Events but allows you to add in third-party resources or your own.


“Basically, you create events, events go into the events bus. You subscribe to the events and process them,” Vogels said. According to the blog post explaining the technology:

The asynchronous, event-based model is fast, clean, and easy to use. The publisher (SaaS application) and the consumer (code running on AWS) are completely decoupled, and are not dependent on a shared communication protocol, runtime environment, or programming language. You can use simple Lambda functions to handle events that come from a SaaS application, and you can also route events to a wide variety of other AWS targets. You can store incident or ticket data in Amazon Redshift, train a machine learning model on customer support queries, and much more.
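The routing model the post describes — publishers emit events, rules match them, subscribers process them — can be sketched in a few lines. The following is a simplified illustration (not the EventBridge implementation) of how a rule's event pattern selects events, assuming the basic semantics that a leaf pattern field matches when the event's value appears in the pattern's list of candidates:

```python
def matches(pattern: dict, event: dict) -> bool:
    """Return True if the event satisfies the pattern.

    Simplified EventBridge-style semantics: every key in the pattern
    must exist in the event; leaf pattern values are lists of accepted
    values, and nested dicts are matched recursively.
    """
    for key, expected in pattern.items():
        if key not in event:
            return False
        if isinstance(expected, dict):
            if not isinstance(event[key], dict) or not matches(expected, event[key]):
                return False
        elif event[key] not in expected:  # list of acceptable values
            return False
    return True

# A hypothetical rule that routes ticket events from a SaaS source to a target.
rule_pattern = {"source": ["zendesk"], "detail-type": ["ticket-created"]}

event = {
    "source": "zendesk",
    "detail-type": "ticket-created",
    "detail": {"ticket_id": 1234, "priority": "high"},
}

print(matches(rule_pattern, event))  # True
```

The point of the decoupling is visible here: the publisher only needs to emit a well-formed event, and any number of consumers can attach their own patterns without the publisher knowing about them.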

EventBridge can be accessed from the AWS Management Console, the AWS Command Line Interface, or via the AWS SDKs.
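Publishing from code follows a fixed entry shape. As a rough sketch, an SDK call such as boto3's `put_events` takes entries of the form below; the source, bus name, and detail payload here are made up for illustration, and no AWS call is actually made:

```python
import json

# Hypothetical custom event, shaped like a PutEvents entry.
entry = {
    "Source": "com.example.orders",        # made-up source name
    "DetailType": "order-placed",
    "Detail": json.dumps({"order_id": 42, "total": 19.99}),  # Detail is a JSON string
    "EventBusName": "my-application-bus",  # made-up bus name
}

# With boto3 installed and credentials configured, this would be sent with:
#   boto3.client("events").put_events(Entries=[entry])

print(json.loads(entry["Detail"])["order_id"])  # 42
```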

Creating an event bus with AWS EventBridge.

AWS has also set up a partner program to make it easier for third-party SaaS providers to offer hooks into EventBridge. Initially, Datadog, Zendesk, PagerDuty, SignalFx, SugarCRM, and Symantec have all signed on.

The Programmable Cloud

Now generally available, the AWS CDK allows you to describe what your execution environment should look like and how it should be executed, Vogels said. AWS CloudFormation already provides this ability, but in a declarative format. That approach has some drawbacks, as another AWS blog post points out:

Configuration files used to manage your infrastructure are traditionally implemented as YAML or JSON text files, but in this way, you’re missing most of the advantages of modern programming languages. Specifically, with YAML, it can be very difficult to detect a file truncated while transferring to another system, or a missing line when copying and pasting from one template to another.
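The difference the post is pointing at can be seen with a tiny example. A template assembled by code (plain Python here, not the CDK itself) can validate itself before it ever leaves the machine, whereas a truncated or mis-pasted YAML file may only fail at deploy time:

```python
import json

def bucket_resource(logical_id: str, bucket_name: str) -> dict:
    """Build one CloudFormation-style S3 bucket resource."""
    if not logical_id.isalnum():
        raise ValueError("logical IDs must be alphanumeric")
    return {logical_id: {"Type": "AWS::S3::Bucket",
                         "Properties": {"BucketName": bucket_name}}}

def template(*resources: dict) -> str:
    """Assemble resources into a complete template. A missing or
    malformed resource fails loudly here, in code, not at deploy time."""
    merged: dict = {}
    for resource in resources:
        merged.update(resource)
    return json.dumps({"Resources": merged}, indent=2)

doc = template(bucket_resource("Logs", "my-log-bucket"),
               bucket_resource("Assets", "my-asset-bucket"))
print(sorted(json.loads(doc)["Resources"]))
```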

The CDK is aimed at helping programmers spin up infrastructure, allowing them to describe the resources they need using languages and IDEs they are already familiar with, such as Java, Python, JavaScript, and TypeScript. The language used to write the application can be the same as the language used to define the infrastructure the application will run on. “This is the ideal destination for infrastructure as code, because it is really code that you are writing,” Vogels said.


In a demonstration, AWS evangelist Martin Beeby showed how to use the CDK with TypeScript to spin up a virtual private cloud (VPC), and then an Elastic Container Service (ECS) cluster. The CDK provides a set of commands that can be run from the local machine, along with modules for specific AWS services. Using their preferred language, the programmer creates “constructs” that define the VPC or other setup. An IDE with code-completion capabilities can show the developer all the options that come with a particular function or call, and can easily pull up documentation.

Once the programmer is finished, the CDK creates the transformation script to deploy the requested resources. Anything not specified is set to “sensible defaults.” In the VPC example, it sets up the subnets, routing, and NAT gateways without specific instruction. Likewise, a similar “construct” is created to set up a cluster, which can then be deployed with a single command. Beeby also imported an ECS Patterns Library to deploy an AWS Fargate service on the cluster.
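That “sensible defaults” behavior can be sketched generically. The class below is not the CDK's API — the names and default values are invented for illustration — but it shows the pattern of a construct that fills in anything the caller leaves unspecified:

```python
from dataclasses import dataclass, field

@dataclass
class VpcConstruct:
    """Toy stand-in for a CDK-style VPC construct: every option the
    caller omits falls back to a sensible default (values are made up)."""
    name: str
    max_azs: int = 2                  # default: span two availability zones
    cidr: str = "10.0.0.0/16"         # default address range
    nat_gateways: int = 1             # default: one NAT gateway
    subnets: list = field(default_factory=lambda: ["public", "private"])

# The caller only names the VPC; everything else is defaulted.
vpc = VpcConstruct(name="demo-vpc")
print(vpc.cidr, vpc.nat_gateways)  # 10.0.0.0/16 1
```

Only the options the developer cares about need to appear in the source; the rest are filled in at synthesis time, which is what lets one short construct expand into subnets, routing, and gateways.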

A CDK application.

Cheaper in the Cloud?

Vogels shared a number of other new offerings during his presentation. In the realm of artificial intelligence, he touted that over 85% of users of Google’s TensorFlow machine learning library now deploy it on Amazon Web Services. He attributed this success to AWS’ ability to offer higher utilization of compute cores: a stock TensorFlow distribution could achieve only 65% efficiency across 256 CPUs, while the AWS-optimized version reached 90%, according to AWS tests. Promising to cut machine learning costs even further, AWS revealed spot instances for its SageMaker ML deployment service, which could save up to 80% in deployment costs, according to Vogels.
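Those figures translate directly into effective capacity and cost. A back-of-the-envelope check, using only the numbers quoted above (the baseline price is an arbitrary stand-in):

```python
CORES = 256
stock_eff, aws_eff = 0.65, 0.90

effective_stock = CORES * stock_eff   # ~166 cores' worth of useful work
effective_aws = CORES * aws_eff       # ~230 cores' worth
speedup = aws_eff / stock_eff         # ~1.38x from the same hardware

# Spot-instance savings claim: up to 80% off a baseline deployment cost.
baseline_cost = 100.0                 # arbitrary stand-in price
spot_cost = baseline_cost * (1 - 0.80)

print(round(effective_aws - effective_stock), round(speedup, 2), round(spot_cost, 2))
# -> 64 1.38 20.0
```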


In a case study during the keynote, Steve Randich, CIO of the Financial Industry Regulatory Authority (FINRA), described how FINRA uses AWS services to capture all daily stock market data, then recreate it to look for fraud and abuse on the part of brokerages and trading firms. The job requires about 50,000 compute nodes at any given time, which ingest 7TB of new data each day (adding to a store of 37PB). The system conducts up to half a trillion “validation checks” per day.
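Those figures imply a striking sustained rate. A quick check of the arithmetic, using only the numbers cited above:

```python
checks_per_day = 0.5e12     # half a trillion validation checks
seconds_per_day = 86_400
new_data_tb = 7             # TB ingested per day
nodes = 50_000

checks_per_second = checks_per_day / seconds_per_day  # ~5.8 million checks/s
checks_per_node = checks_per_day / nodes              # 10 million checks per node per day
gb_per_node = new_data_tb * 1000 / nodes              # ~0.14 GB of new data per node per day

print(f"{checks_per_second:,.0f} checks/s")
```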

Surprisingly, running these jobs in the cloud costs less than it would have cost the organization to build out the system in-house, Randich said.