amazondumps.us

Monday, 21 January 2019

Most popular AWS certifications


As mentioned above, there are several AWS certifications available. If you're unsure which one is right for you, we've broken down the basics of the most popular AWS certifications to help you make a more informed decision.

Simplilearn explains how to choose the AWS course that best suits your skills and requirements.


AWS Technical Essentials Certification
If you're looking to get certified in just the fundamentals, the AWS Technical Essentials certification may be the perfect fit. This certification is ideal for IT developers and SysOps administrators, or anyone who is brand new to AWS and looking to get fully trained in the basics.

This AWS Technical Essentials course from Simplilearn is designed to train students on various AWS products, services, and solutions. Students will learn an array of new skills, including how to navigate the AWS management console and all about AWS security measures. The course features seven hours of instruction, two live projects, hands-on project execution using the AWS console, and much more.

Friday, 18 January 2019

2019 Latest AWS SysOps Exam Dumps Questions

QUESTION 1

Due to compliance regulations, management has asked you to provide a system that allows for cost-effective long-term storage of your application logs and provides a way for support staff to view the logs more quickly. Currently your log system archives logs automatically to Amazon S3 every hour, and support staff must wait for these logs to appear in Amazon S3, because they do not currently have access to the systems to view live logs. What method should you use to become compliant while also providing a faster way for support staff to have access to logs?

A. Update Amazon S3 lifecycle policies to archive old logs to Amazon Glacier, and add a new policy to push all log entries to Amazon SQS for ingestion by the support team.
B. Update Amazon S3 lifecycle policies to archive old logs to Amazon Glacier, and use or write a service to also stream your application logs to CloudWatch Logs.
C. Update Amazon Glacier lifecycle policies to pull new logs from Amazon S3, and in the Amazon EC2 console, enable the CloudWatch Logs Agent on all of your application servers.
D. Update Amazon S3 lifecycle policies to archive old logs to Amazon Glacier. Enable Amazon S3 partial uploads on your Amazon S3 bucket, and trigger an Amazon SNS notification when a partial upload occurs.
E. Use or write a service to stream your application logs to CloudWatch Logs. Use an Amazon Elastic MapReduce cluster to live stream your logs from CloudWatch Logs for ingestion by the support team, and create a Hadoop job to push the logs to S3 in five-minute chunks.


Answer: B
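For reference, the correct pattern combines two pieces: an S3 lifecycle transition to Glacier for cheap long-term retention, and streaming live application logs to CloudWatch Logs so support staff stop waiting on the hourly archive. Below is a minimal boto3 sketch of both halves; the bucket name, prefix, log group name, and 30-day transition window are illustrative assumptions, not values from the question.

import time
import boto3

s3 = boto3.client("s3")
logs = boto3.client("logs")

# Compliance half: transition archived logs to Glacier after 30 days
# (bucket name and prefix are hypothetical).
s3.put_bucket_lifecycle_configuration(
    Bucket="example-app-logs",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            }
        ]
    },
)

# Support half: stream live application log lines to CloudWatch Logs.
# A real service would batch lines continuously; this shows one put.
group, stream = "example-app", "web-01"
logs.create_log_group(logGroupName=group)
logs.create_log_stream(logGroupName=group, logStreamName=stream)
logs.put_log_events(
    logGroupName=group,
    logStreamName=stream,
    logEvents=[{"timestamp": int(time.time() * 1000), "message": "app started"}],
)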

QUESTION 2

You want to securely distribute credentials for your Amazon RDS instance to your fleet of web server instances. The credentials are stored in a file that is controlled by a configuration management system. How do you securely deploy the credentials in an automated manner across the fleet of web server instances, which can number in the hundreds, while retaining the ability to roll back if needed?

A. Store your credential files in an Amazon S3 bucket. Use Amazon S3 server-side encryption on the credential files. Have a scheduled job that pulls down the credential files into the instances every 10 minutes.
B. Store the credential files in your version-controlled repository with the rest of your code. Have a post-commit action in version control that kicks off a job in your continuous integration system which securely copies the new credential files to all web server instances.
C. Insert credential files into user data and use an instance lifecycle policy to periodically refresh the file from the user data.
D. Keep credential files as a binary blob in an Amazon RDS MySQL DB instance, and have a script on each Amazon EC2 instance that pulls the files down from the RDS instance.
E. Store the credential files in your version-controlled repository with the rest of your code. Use a parallel file copy program to send the credential files from your local machine to the Amazon EC2 instances.

Answer: B
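A post-commit CI job along the lines of option B might look like the sketch below: it discovers the web server fleet by tag and pushes the committed credentials file over SSH. Rolling back is just re-running the job from an earlier commit. The tag key, file paths, and SSH user are hypothetical; a real pipeline would also pin host keys and read the file from the CI workspace.

import subprocess
import boto3

ec2 = boto3.client("ec2")

# Find running web servers by tag (tag key/value are hypothetical).
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:role", "Values": ["web"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

hosts = [
    inst["PrivateIpAddress"]
    for r in reservations
    for inst in r["Instances"]
]

# Copy the version-controlled credentials file to every instance over SSH.
for host in hosts:
    subprocess.run(
        ["scp", "credentials.conf", f"deploy@{host}:/etc/app/credentials.conf"],
        check=True,
    )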

QUESTION 3

You are using a configuration management system to manage your Amazon EC2 instances. On your Amazon EC2 instances, you want to store credentials for connecting to an Amazon RDS DB instance. How should you securely store these credentials?

A. Give the Amazon EC2 instances an IAM role that allows read access to a private Amazon S3 bucket. Store a file with database credentials in the Amazon S3 bucket. Have your configuration management system pull the file from the bucket when it is needed.
B. Launch an Amazon EC2 instance and use the configuration management system to bootstrap the instance with the Amazon RDS DB credentials. Create an AMI from this instance.
C. Store the Amazon RDS DB credentials in Amazon EC2 user data. Import the credentials into the instance on boot.
D. Assign an IAM role to your Amazon RDS instance, and use this IAM role to access the Amazon RDS DB from your Amazon EC2 instances.
E. Store your credentials in your version control system, in plaintext. Check out a copy of your credentials from the version control system on boot. Use Amazon EBS encryption on the volume storing the Amazon RDS DB credentials.

Answer: A
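With option A, the instance's IAM role supplies temporary credentials automatically, so the fetch script itself carries no secrets. A minimal sketch, assuming a hypothetical private bucket and object key:

import boto3

# No access keys in code: boto3 picks up temporary credentials from the
# EC2 instance's IAM role via the instance metadata service.
s3 = boto3.client("s3")

# Bucket and key names are hypothetical.
obj = s3.get_object(Bucket="example-private-config", Key="rds/credentials.json")
db_credentials = obj["Body"].read().decode("utf-8")

# Hand the file off to the configuration management system as needed.
with open("/etc/app/rds-credentials.json", "w") as f:
    f.write(db_credentials)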

QUESTION 4

Your company has developed a web application and is hosting it in an Amazon S3 bucket configured for static website hosting. The application is using the AWS SDK for JavaScript in the browser to access data stored in an Amazon DynamoDB table. How can you ensure that API keys for access to your data in DynamoDB are kept secure?

A. Create an Amazon S3 role in IAM with access to the specific DynamoDB tables, and assign it to the bucket hosting your website.
B. Configure S3 bucket tags with your AWS access keys for the bucket hosting your website so that the application can query them for access.
C. Configure a web identity federation role within IAM to enable access to the correct DynamoDB resources and retrieve temporary credentials.
D. Store AWS keys in global variables within your application and configure the application to use these credentials when making requests.

Answer: C
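In the browser, the AWS SDK for JavaScript would typically obtain temporary credentials through Amazon Cognito, but the underlying STS exchange is the same call shown in this Python sketch. The role ARN and identity-provider token are placeholders.

import boto3

sts = boto3.client("sts")

# Exchange a token from an external identity provider (e.g. Login with
# Amazon, Facebook, or Google) for temporary, scoped AWS credentials.
# RoleArn and WebIdentityToken values are placeholders.
resp = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::123456789012:role/WebAppDynamoDBAccess",
    RoleSessionName="web-user-session",
    WebIdentityToken="<token from the identity provider>",
    DurationSeconds=900,
)

creds = resp["Credentials"]

# Use the temporary credentials to talk to DynamoDB; no long-lived API
# keys ever ship with the static site.
dynamodb = boto3.client(
    "dynamodb",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)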

QUESTION 5

You need to implement A/B deployments for several multi-tier web applications. Each of them has its own infrastructure: Amazon Elastic Compute Cloud (EC2) front-end servers, Amazon ElastiCache clusters, Amazon Simple Queue Service (SQS) queues, and Amazon Relational Database Service (RDS) instances. Which combination of services would give you the ability to control traffic between different deployed versions of your application? (Choose one.)

A. Create one AWS Elastic Beanstalk application and all AWS resources (using configuration files inside the application source bundle) for each web application. New versions would be deployed by creating new Elastic Beanstalk environments and using the Swap URLs feature.
B. Using AWS CloudFormation templates, create one Elastic Beanstalk application and all AWS resources (in the same template) for each web application. New versions would be deployed using AWS CloudFormation templates to create new Elastic Beanstalk environments, and traffic would be balanced between them using Weighted Round Robin (WRR) records in Amazon Route 53.
C. Using AWS CloudFormation templates, create one Elastic Beanstalk application and all AWS resources (in the same template) for each web application. New versions would be deployed by updating a parameter on the CloudFormation template and passing it to the cfn-hup helper daemon, and traffic would be balanced between them using Weighted Round Robin (WRR) records in Amazon Route 53.
D. Create one Elastic Beanstalk application and all AWS resources (using configuration files inside the application source bundle) for each web application. New versions would be deployed by updating the Elastic Beanstalk application version for the current Elastic Beanstalk environment.

Answer: B
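The traffic-shifting half of option B comes down to weighted record sets in Route 53. A minimal boto3 sketch, with the hosted zone ID, record name, and environment CNAMEs as placeholders, sending roughly 90% of traffic to the blue environment and 10% to green:

import boto3

route53 = boto3.client("route53")

def weighted_record(identifier: str, target: str, weight: int) -> dict:
    """Build one weighted CNAME record set (values are placeholders)."""
    return {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "app.example.com",
            "Type": "CNAME",
            "SetIdentifier": identifier,
            "Weight": weight,
            "TTL": 60,
            "ResourceRecords": [{"Value": target}],
        },
    }

# Route 53 splits traffic in proportion to the weights, so shifting more
# users to the new version is just another UPSERT with new weights.
route53.change_resource_record_sets(
    HostedZoneId="Z0000000EXAMPLE",
    ChangeBatch={
        "Changes": [
            weighted_record("blue", "blue-env.elasticbeanstalk.com", 90),
            weighted_record("green", "green-env.elasticbeanstalk.com", 10),
        ]
    },
)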

Wednesday, 26 July 2017

Valid Amazon AWS SysOps Braindumps Question NO: 25

You are currently hosting multiple applications in a VPC and have logged numerous port scans coming in from a specific IP address block. Your security team has requested that all access from the offending IP address block be denied for the next 24 hours.
Which of the following is the best method to quickly and temporarily deny access from the specified IP address block?


A. Create an AD policy to modify Windows Firewall settings on all hosts in the VPC to deny access from the IP address block
B. Modify the Network ACLs associated with all public subnets in the VPC to deny access from the IP address block
C. Add a rule to all of the VPC's Security Groups to deny access from the IP address block
D. Modify the Windows Firewall settings on all Amazon Machine Images (AMIs) that your organization uses in that VPC to deny access from the IP address block

Correct Answer: B
Explanation/Reference:
Security groups support allow rules only, so they cannot be used to explicitly deny traffic; network ACLs support both allow and deny rules and are the right tool for temporarily blocking a CIDR block across a subnet.
Reference: http://docs.aws.amazon.com/AmazonVPC/latest/UserGuide/VPC_SecurityGroups.html
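For illustration, here is what the temporary deny could look like with boto3; the network ACL ID, rule number, and CIDR block are placeholders. The low rule number matters because NACL rules are evaluated in ascending order, so the deny must sit ahead of the usual 100-series allow rules.

import boto3

ec2 = boto3.client("ec2")

# Deny all inbound traffic from the offending CIDR block (placeholder
# values). Rule 50 is evaluated before the 100-series allow rules.
ec2.create_network_acl_entry(
    NetworkAclId="acl-0123456789abcdef0",
    RuleNumber=50,
    Protocol="-1",           # all protocols
    RuleAction="deny",
    Egress=False,            # inbound rule
    CidrBlock="198.51.100.0/24",
)

# 24 hours later, remove the entry to restore normal access:
# ec2.delete_network_acl_entry(
#     NetworkAclId="acl-0123456789abcdef0", RuleNumber=50, Egress=False
# )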

Wednesday, 7 June 2017

Kickdrum Joins Amazon Web Services Partner Network | Latest News

Kickdrum, a custom software development and technology strategy firm, today announced that it has joined the Amazon Web Services (AWS) Partner Network (APN) as a Standard Technology Partner. The program helps AWS customers identify partners like Kickdrum with proven experience, capabilities, and expertise architecting enterprise-grade solutions using the AWS cloud.

The partnership gives Kickdrum access to resources and training to help customers deploy and run applications on AWS. The Kickdrum team includes 17 accredited business professionals and 12 accredited technical professionals, along with two on-staff AWS Certified Solutions Architects.

“Kickdrum has a proven track record of success in building game-changing technology solutions for our customers whether they are Fortune 500 or middle-market companies,” said Tom Carter, principal architect at Kickdrum. “The AWS cloud is a key part of our toolbox and joining the APN will give us additional resources and channels to share with our customers.”

Kickdrum has deployed thousands of cloud servers, services, and serverless solutions for consulting clients, covering everything from PoC and Dev/QA workloads to mission-critical, high-availability production environments. Kickdrum cloud architects help customers of all sizes design, architect, build, migrate, and manage their workloads and applications on AWS.