Big Data Workload Simulations and Cost Estimates Now Available for EC2 Instances



Figuring out the right kinds of cloud machines (commonly known as instances) on which to run your Big Data workloads at the best price-to-performance ratio is often a lengthy trial-and-error exercise.

To that end, MityLytics MiCPM© can now simulate your Spark and Hadoop application workloads (with more frameworks to come) on various Amazon Web Services (AWS) EC2 instance types, providing you with cost and performance estimates.
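To give a feel for what such a simulation estimates, here is a deliberately naive sketch of how a measured workload might scale to a different cluster size. This toy linear-scaling model is purely illustrative and is not MiCPM's actual simulation method; all throughput and overhead figures are made-up assumptions.

```python
# Toy illustration only: a naive linear-scaling estimate of runtime on a
# candidate cluster. MiCPM's actual simulation is more detailed; the
# throughput and overhead numbers below are invented for the example.

def estimate_runtime_seconds(input_bytes, per_node_throughput_bps, num_nodes,
                             fixed_overhead_s=60.0):
    """Assume work divides evenly across nodes, plus a fixed startup cost."""
    return input_bytes / (per_node_throughput_bps * num_nodes) + fixed_overhead_s

# e.g. 1 TB of input, 100 MB/s per node, 10 worker nodes
runtime = estimate_runtime_seconds(1e12, 100e6, 10)  # ~1060 seconds
```

A real simulator must also account for skew, shuffle costs, and instance-specific I/O limits, which is why simple models like this one tend to be optimistic.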

Example of MiCPM in Deployment Planning

To begin, simply attach any of your existing Spark or Hadoop clusters (on-premises or in the cloud) to MityLytics MiCPM©, run your applications (the more the better), and then start the deployment planning process as shown below:

Once you select your desired region and instance type, we show detailed information about the instance that we’ve retrieved from AWS.

You can then select the kind of workload (tasks/input bytes) and the number of instances (worker nodes for Spark and data nodes for Hadoop). We will then simulate the behavior of the application on the target AWS instance type and present the analysis of how we expect your application to perform, along with the cost to run such a workload as shown below:
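The cost side of such an estimate can be sanity-checked with back-of-the-envelope arithmetic: node count times hourly price times expected runtime. The sketch below assumes simple per-hour on-demand billing, and the $0.192/hr figure is an illustrative placeholder, not a quoted AWS price.

```python
# Back-of-the-envelope cluster cost check, assuming flat per-hour
# on-demand billing. The hourly price used here is illustrative only.

def estimate_cluster_cost(num_nodes, hourly_price_usd, runtime_hours):
    """Total on-demand cost for running num_nodes for runtime_hours."""
    return num_nodes * hourly_price_usd * runtime_hours

# 10 worker nodes at a hypothetical $0.192/hr, for a 2-hour job
cost = estimate_cluster_cost(10, 0.192, 2.0)  # $3.84
```

Real bills also include storage, data transfer, and (since late 2017) per-second billing on Linux instances, so treat this as a lower-bound sketch.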

We are confident that this will significantly accelerate your DevOps workflows and help you manage AWS costs better. To get started, please contact us!  Also, watch for MityLytics MiCPM coming soon on the AWS Marketplace!

December 18th, 2017 | Categories: AWS, Big Data Performance
