{"id":66246,"date":"2023-10-31T02:30:45","date_gmt":"2023-10-31T02:30:45","guid":{"rendered":"https:\/\/www.dumpsbase.com\/freedumps\/?p=66246"},"modified":"2023-10-31T02:30:50","modified_gmt":"2023-10-31T02:30:50","slug":"updated-dbs-c01-exam-dumps-v11-02-good-learning-materials-for-aws-certified-database-specialty-exam-preparation","status":"publish","type":"post","link":"https:\/\/www.dumpsbase.com\/freedumps\/updated-dbs-c01-exam-dumps-v11-02-good-learning-materials-for-aws-certified-database-specialty-exam-preparation.html","title":{"rendered":"Updated DBS-C01 Exam Dumps V11.02 &#8211; Good Learning Materials for AWS Certified Database &#8211; Specialty Exam Preparation"},"content":{"rendered":"\n<p>Everyone should know that the AWS Certified Database &#8211; Specialty certification is a valuable credential that can help you achieve your goals. You can demonstrate your expertise in recommending, designing, and maintaining optimal AWS database solutions by passing the DBS-C01 exam successfully. To ensure your success in the DBS-C01 exam, it is essential to have the right learning resources. DumpsBase offers the latest and easiest AWS Certified Database &#8211; Specialty Exam dumps that can help you excel in your certification journey. Whether you are a busy professional or a student with a flexible study schedule, DumpsBase provides the comprehensive materials you need. 
With the support of DumpsBase and their updated DBS-C01 exam dumps V11.02, you can confidently prepare for the AWS Certified Database &#8211; Specialty exam and stand out in the competitive field of cloud technology.<\/p>\n<h2>AWS Certified Database &#8211; Specialty <em><span style=\"background-color: #ffff00;\">DBS-C01 Free Exam Dumps Below<\/span><\/em><\/h2>\n<div  id=\"watupro_quiz\" class=\"quiz-area single-page-quiz\">\n<p id=\"submittingExam7843\" style=\"display:none;text-align:center;\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.dumpsbase.com\/freedumps\/wp-content\/plugins\/watupro\/img\/loading.gif\" width=\"16\" height=\"16\"><\/p>\n\n<div class=\"watupro-exam-description\" id=\"description-quiz-7843\"><\/div>\n\n<form action=\"\" method=\"post\" class=\"quiz-form\" id=\"quiz-7843\"  enctype=\"multipart\/form-data\" >\n<div class='watu-question ' id='question-1' style=';'><div id='questionWrap-1'  class='   watupro-question-id-291571'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>1. <\/span>A company has deployed an e-commerce web application in a new AWS account. An Amazon RDS for MySQL Multi-AZ DB instance is part of this deployment with a database-1.xxxxxxxxxxxx.us-east-1.rds.amazonaws.com endpoint listening on port 3306. The company\u2019s Database Specialist is able to log in to MySQL and run queries from the bastion host using these details. <br \/>\r<br>When users try to utilize the application hosted in the AWS account, they are presented with a generic error message. The application servers are logging a \u201ccould not connect to server: Connection times out\u201d error message to Amazon CloudWatch Logs. 
<br \/>\r<br>What is the cause of this error?<\/div><input type='hidden' name='question_id[]' id='qID_1' value='291571' \/><input type='hidden' id='answerType291571' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291571[]' id='answer-id-1145796' class='answer   answerof-291571 ' value='1145796'   \/><label for='answer-id-1145796' id='answer-label-1145796' class=' answer'><span>The user name and password the application is using are incorrect.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291571[]' id='answer-id-1145797' class='answer   answerof-291571 ' value='1145797'   \/><label for='answer-id-1145797' id='answer-label-1145797' class=' answer'><span>The security group assigned to the application servers does not have the necessary rules to allow inbound connections from the DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291571[]' id='answer-id-1145798' class='answer   answerof-291571 ' value='1145798'   \/><label for='answer-id-1145798' id='answer-label-1145798' class=' answer'><span>The security group assigned to the DB instance does not have the necessary rules to allow inbound connections from the application servers.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291571[]' id='answer-id-1145799' class='answer   answerof-291571 ' value='1145799'   \/><label for='answer-id-1145799' id='answer-label-1145799' class=' answer'><span>The user name and password are correct, but the user is not authorized to use the DB instance.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-2' style=';'><div id='questionWrap-2'  class='   
watupro-question-id-291572'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>2. <\/span>An AWS CloudFormation stack that included an Amazon RDS DB instance was accidentally deleted and recent data was lost. A Database Specialist needs to add RDS settings to the CloudFormation template to reduce the chance of accidental instance data loss in the future. <br \/>\r<br>Which settings will meet this requirement? (Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_2' value='291572' \/><input type='hidden' id='answerType291572' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291572[]' id='answer-id-1145800' class='answer   answerof-291572 ' value='1145800'   \/><label for='answer-id-1145800' id='answer-label-1145800' class=' answer'><span>Set DeletionProtection to True<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291572[]' id='answer-id-1145801' class='answer   answerof-291572 ' value='1145801'   \/><label for='answer-id-1145801' id='answer-label-1145801' class=' answer'><span>Set MultiAZ to True<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291572[]' id='answer-id-1145802' class='answer   answerof-291572 ' value='1145802'   \/><label for='answer-id-1145802' id='answer-label-1145802' class=' answer'><span>Set TerminationProtection to True<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291572[]' id='answer-id-1145803' class='answer   answerof-291572 ' value='1145803'   \/><label for='answer-id-1145803' id='answer-label-1145803' class=' answer'><span>Set DeleteAutomatedBackups to False<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291572[]' 
id='answer-id-1145804' class='answer   answerof-291572 ' value='1145804'   \/><label for='answer-id-1145804' id='answer-label-1145804' class=' answer'><span>Set DeletionPolicy to Delete<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291572[]' id='answer-id-1145805' class='answer   answerof-291572 ' value='1145805'   \/><label for='answer-id-1145805' id='answer-label-1145805' class=' answer'><span>Set DeletionPolicy to Retain<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-3' style=';'><div id='questionWrap-3'  class='   watupro-question-id-291573'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>3. <\/span>A Database Specialist is troubleshooting an application connection failure on an Amazon Aurora DB cluster with multiple Aurora Replicas that had been running with no issues for the past 2 months. The connection failure lasted for 5 minutes and corrected itself after that. The Database Specialist reviewed the Amazon RDS events and determined a failover event occurred at that time. The failover process took around 15 seconds to complete. 
<br \/>\r<br>What is the MOST likely cause of the 5-minute connection outage?<\/div><input type='hidden' name='question_id[]' id='qID_3' value='291573' \/><input type='hidden' id='answerType291573' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291573[]' id='answer-id-1145806' class='answer   answerof-291573 ' value='1145806'   \/><label for='answer-id-1145806' id='answer-label-1145806' class=' answer'><span>After a database crash, Aurora needed to replay the redo log from the last database checkpoint<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291573[]' id='answer-id-1145807' class='answer   answerof-291573 ' value='1145807'   \/><label for='answer-id-1145807' id='answer-label-1145807' class=' answer'><span>The client-side application is caching the DNS data and its TTL is set too high<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291573[]' id='answer-id-1145808' class='answer   answerof-291573 ' value='1145808'   \/><label for='answer-id-1145808' id='answer-label-1145808' class=' answer'><span>After failover, the Aurora DB cluster needs time to warm up before accepting client connections<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291573[]' id='answer-id-1145809' class='answer   answerof-291573 ' value='1145809'   \/><label for='answer-id-1145809' id='answer-label-1145809' class=' answer'><span>There were no active Aurora Replicas in the Aurora DB cluster<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-4' style=';'><div id='questionWrap-4'  class='   watupro-question-id-291574'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>4. 
<\/span>A company is deploying a solution in Amazon Aurora by migrating from an on-premises system. The IT department has established an AWS Direct Connect link from the company\u2019s data center. The company\u2019s Database Specialist has selected the option to require SSL\/TLS for connectivity to prevent plaintext data from being sent over the network. The migration appears to be working successfully, and the data can be queried from a desktop machine. <br \/>\r<br>Two Data Analysts have been asked to query and validate the data in the new Aurora DB cluster. Both Analysts are unable to connect to Aurora. Their user names and passwords have been verified as valid and the Database Specialist can connect to the DB cluster using their accounts. The Database Specialist also verified that the security group configuration allows network traffic from all corporate IP addresses. <br \/>\r<br>What should the Database Specialist do to correct the Data Analysts\u2019 inability to connect?<\/div><input type='hidden' name='question_id[]' id='qID_4' value='291574' \/><input type='hidden' id='answerType291574' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291574[]' id='answer-id-1145810' class='answer   answerof-291574 ' value='1145810'   \/><label for='answer-id-1145810' id='answer-label-1145810' class=' answer'><span>Restart the DB cluster to apply the SSL change.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291574[]' id='answer-id-1145811' class='answer   answerof-291574 ' value='1145811'   \/><label for='answer-id-1145811' id='answer-label-1145811' class=' answer'><span>Instruct the Data Analysts to download the root certificate and use the SSL certificate on the connection string to connect.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input 
type='radio' name='answer-291574[]' id='answer-id-1145812' class='answer   answerof-291574 ' value='1145812'   \/><label for='answer-id-1145812' id='answer-label-1145812' class=' answer'><span>Add explicit mappings between the Data Analysts\u2019 IP addresses and the instance in the security group assigned to the DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291574[]' id='answer-id-1145813' class='answer   answerof-291574 ' value='1145813'   \/><label for='answer-id-1145813' id='answer-label-1145813' class=' answer'><span>Modify the Data Analysts\u2019 local client firewall to allow network traffic to AWS.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-5' style=';'><div id='questionWrap-5'  class='   watupro-question-id-291575'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>5. <\/span>A company is concerned about the cost of a large-scale, transactional application using Amazon DynamoDB that only needs to store data for 2 days before it is deleted. In looking at the tables, a Database Specialist notices that much of the data is months old, and goes back to when the application was first deployed. 
<br \/>\r<br>What can the Database Specialist do to reduce the overall cost?<\/div><input type='hidden' name='question_id[]' id='qID_5' value='291575' \/><input type='hidden' id='answerType291575' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291575[]' id='answer-id-1145814' class='answer   answerof-291575 ' value='1145814'   \/><label for='answer-id-1145814' id='answer-label-1145814' class=' answer'><span>Create a new attribute in each table to track the expiration time and create an AWS Glue transformation to delete entries more than 2 days old.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291575[]' id='answer-id-1145815' class='answer   answerof-291575 ' value='1145815'   \/><label for='answer-id-1145815' id='answer-label-1145815' class=' answer'><span>Create a new attribute in each table to track the expiration time and enable DynamoDB Streams on each table.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291575[]' id='answer-id-1145816' class='answer   answerof-291575 ' value='1145816'   \/><label for='answer-id-1145816' id='answer-label-1145816' class=' answer'><span>Create a new attribute in each table to track the expiration time and enable time to live (TTL) on each table.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291575[]' id='answer-id-1145817' class='answer   answerof-291575 ' value='1145817'   \/><label for='answer-id-1145817' id='answer-label-1145817' class=' answer'><span>Create an Amazon CloudWatch Events event to export the data to Amazon S3 daily using AWS Data Pipeline and then truncate the Amazon DynamoDB table.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div 
class='watu-question ' id='question-6' style=';'><div id='questionWrap-6'  class='   watupro-question-id-291576'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>6. <\/span>A company has an on-premises system that tracks various database operations that occur over the lifetime of a database, including database shutdown, deletion, creation, and backup. <br \/>\r<br>The company recently moved two databases to Amazon RDS and is looking at a solution that would satisfy these requirements. The data could be used by other systems within the company. <br \/>\r<br>Which solution will meet these requirements with minimal effort?<\/div><input type='hidden' name='question_id[]' id='qID_6' value='291576' \/><input type='hidden' id='answerType291576' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291576[]' id='answer-id-1145818' class='answer   answerof-291576 ' value='1145818'   \/><label for='answer-id-1145818' id='answer-label-1145818' class=' answer'><span>Create an Amazon CloudWatch Events rule with the operations that need to be tracked on Amazon RDS.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291576[]' id='answer-id-1145819' class='answer   answerof-291576 ' value='1145819'   \/><label for='answer-id-1145819' id='answer-label-1145819' class=' answer'><span>Create an AWS Lambda function to act on these rules and write the output to the tracking systems.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291576[]' id='answer-id-1145820' class='answer   answerof-291576 ' value='1145820'   \/><label for='answer-id-1145820' id='answer-label-1145820' class=' answer'><span>Create an AWS Lambda function to trigger on AWS CloudTrail API calls. 
Filter on specific RDS API calls and write the output to the tracking systems.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291576[]' id='answer-id-1145821' class='answer   answerof-291576 ' value='1145821'   \/><label for='answer-id-1145821' id='answer-label-1145821' class=' answer'><span>Create RDS event subscriptions. Have the tracking systems subscribe to specific RDS event system notifications.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291576[]' id='answer-id-1145822' class='answer   answerof-291576 ' value='1145822'   \/><label for='answer-id-1145822' id='answer-label-1145822' class=' answer'><span>Write RDS logs to Amazon Kinesis Data Firehose. Create an AWS Lambda function to act on these rules and write the output to the tracking systems.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-7' style=';'><div id='questionWrap-7'  class='   watupro-question-id-291577'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>7. <\/span>A clothing company uses a custom ecommerce application and a PostgreSQL database to sell clothes to thousands of users from multiple countries. The company is migrating its application and database from its on- premises data center to the AWS Cloud. The company has selected Amazon EC2 for the application and Amazon RDS for PostgreSQL for the database. The company requires database passwords to be changed every 60 days. A Database Specialist needs to ensure that the credentials used by the web application to connect to the database are managed securely. 
<br \/>\r<br>Which approach should the Database Specialist take to securely manage the database credentials?<\/div><input type='hidden' name='question_id[]' id='qID_7' value='291577' \/><input type='hidden' id='answerType291577' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291577[]' id='answer-id-1145823' class='answer   answerof-291577 ' value='1145823'   \/><label for='answer-id-1145823' id='answer-label-1145823' class=' answer'><span>Store the credentials in a text file in an Amazon S3 bucket. Restrict permissions on the bucket to the IAM role associated with the instance profile only. Modify the application to download the text file and retrieve the credentials on start up. Update the text file every 60 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291577[]' id='answer-id-1145824' class='answer   answerof-291577 ' value='1145824'   \/><label for='answer-id-1145824' id='answer-label-1145824' class=' answer'><span>Configure IAM database authentication for the application to connect to the database. Create an IAM user and map it to a separate database user for each ecommerce user. Require users to update their passwords every 60 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291577[]' id='answer-id-1145825' class='answer   answerof-291577 ' value='1145825'   \/><label for='answer-id-1145825' id='answer-label-1145825' class=' answer'><span>Store the credentials in AWS Secrets Manager. Restrict permissions on the secret to only the IAM role associated with the instance profile. Modify the application to retrieve the credentials from Secrets Manager on start up. 
Configure the rotation interval to 60 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291577[]' id='answer-id-1145826' class='answer   answerof-291577 ' value='1145826'   \/><label for='answer-id-1145826' id='answer-label-1145826' class=' answer'><span>Store the credentials in an encrypted text file in the application AMI.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291577[]' id='answer-id-1145827' class='answer   answerof-291577 ' value='1145827'   \/><label for='answer-id-1145827' id='answer-label-1145827' class=' answer'><span>Use AWS KMS to store the key for decrypting the text file. Modify the application to decrypt the text file and retrieve the credentials on start up. Update the text file and publish a new AMI every 60 days.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-8' style=';'><div id='questionWrap-8'  class='   watupro-question-id-291578'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>8. <\/span>A financial services company is developing a shared data service that supports different applications from throughout the company. A Database Specialist designed a solution to leverage Amazon ElastiCache for Redis with cluster mode enabled to enhance performance and scalability. The cluster is configured to listen on port 6379. <br \/>\r<br>Which combination of steps should the Database Specialist take to secure the cache data and protect it from unauthorized access? 
(Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_8' value='291578' \/><input type='hidden' id='answerType291578' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291578[]' id='answer-id-1145828' class='answer   answerof-291578 ' value='1145828'   \/><label for='answer-id-1145828' id='answer-label-1145828' class=' answer'><span>Enable in-transit and at-rest encryption on the ElastiCache cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291578[]' id='answer-id-1145829' class='answer   answerof-291578 ' value='1145829'   \/><label for='answer-id-1145829' id='answer-label-1145829' class=' answer'><span>Ensure that Amazon CloudWatch metrics are configured in the ElastiCache cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291578[]' id='answer-id-1145830' class='answer   answerof-291578 ' value='1145830'   \/><label for='answer-id-1145830' id='answer-label-1145830' class=' answer'><span>Ensure the security group for the ElastiCache cluster allows all inbound traffic from itself and inbound traffic on TCP port 6379 from trusted clients only.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291578[]' id='answer-id-1145831' class='answer   answerof-291578 ' value='1145831'   \/><label for='answer-id-1145831' id='answer-label-1145831' class=' answer'><span>Create an IAM policy to allow the application service roles to access all ElastiCache API actions.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291578[]' id='answer-id-1145832' class='answer   answerof-291578 ' value='1145832'   \/><label for='answer-id-1145832' id='answer-label-1145832' class=' 
answer'><span>Ensure the security group for the ElastiCache clients authorizes inbound TCP port 6379 and port 22 traffic from the trusted ElastiCache cluster\u2019s security group.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291578[]' id='answer-id-1145833' class='answer   answerof-291578 ' value='1145833'   \/><label for='answer-id-1145833' id='answer-label-1145833' class=' answer'><span>Ensure the cluster is created with the auth-token parameter and that the parameter is used in all subsequent commands.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-9' style=';'><div id='questionWrap-9'  class='   watupro-question-id-291579'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>9. <\/span>A company is running an Amazon RDS for PostgreSQL DB instance and wants to migrate it to an Amazon Aurora PostgreSQL DB cluster. The current database is 1 TB in size. The migration needs to have minimal downtime. <br \/>\r<br>What is the FASTEST way to accomplish this?<\/div><input type='hidden' name='question_id[]' id='qID_9' value='291579' \/><input type='hidden' id='answerType291579' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291579[]' id='answer-id-1145834' class='answer   answerof-291579 ' value='1145834'   \/><label for='answer-id-1145834' id='answer-label-1145834' class=' answer'><span>Create an Aurora PostgreSQL DB cluster. 
Set up replication from the source RDS for PostgreSQL DB instance using AWS DMS to the target DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291579[]' id='answer-id-1145835' class='answer   answerof-291579 ' value='1145835'   \/><label for='answer-id-1145835' id='answer-label-1145835' class=' answer'><span>Use the pg_dump and pg_restore utilities to extract and restore the RDS for PostgreSQL DB instance to the Aurora PostgreSQL DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291579[]' id='answer-id-1145836' class='answer   answerof-291579 ' value='1145836'   \/><label for='answer-id-1145836' id='answer-label-1145836' class=' answer'><span>Create a database snapshot of the RDS for PostgreSQL DB instance and use this snapshot to create the Aurora PostgreSQL DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291579[]' id='answer-id-1145837' class='answer   answerof-291579 ' value='1145837'   \/><label for='answer-id-1145837' id='answer-label-1145837' class=' answer'><span>Migrate data from the RDS for PostgreSQL DB instance to an Aurora PostgreSQL DB cluster using an Aurora Replica. Promote the replica during the cutover.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-10' style=';'><div id='questionWrap-10'  class='   watupro-question-id-291580'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>10. <\/span>A Database Specialist is migrating a 2 TB Amazon RDS for Oracle DB instance to an RDS for PostgreSQL DB instance using AWS DMS. The source RDS Oracle DB instance is in a VPC in the us-east-1 Region. The target RDS for PostgreSQL DB instance is in a VPC in the us-west-2 Region. 
<br \/>\r<br>Where should the AWS DMS replication instance be placed for the MOST optimal performance?<\/div><input type='hidden' name='question_id[]' id='qID_10' value='291580' \/><input type='hidden' id='answerType291580' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291580[]' id='answer-id-1145838' class='answer   answerof-291580 ' value='1145838'   \/><label for='answer-id-1145838' id='answer-label-1145838' class=' answer'><span>In the same Region and VPC of the source DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291580[]' id='answer-id-1145839' class='answer   answerof-291580 ' value='1145839'   \/><label for='answer-id-1145839' id='answer-label-1145839' class=' answer'><span>In the same Region and VPC as the target DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291580[]' id='answer-id-1145840' class='answer   answerof-291580 ' value='1145840'   \/><label for='answer-id-1145840' id='answer-label-1145840' class=' answer'><span>In the same VPC and Availability Zone as the target DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291580[]' id='answer-id-1145841' class='answer   answerof-291580 ' value='1145841'   \/><label for='answer-id-1145841' id='answer-label-1145841' class=' answer'><span>In the same VPC and Availability Zone as the source DB instance<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-11' style=';'><div id='questionWrap-11'  class='   watupro-question-id-291581'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>11. 
<\/span>The Development team recently executed a database script containing several data definition language (DDL) and data manipulation language (DML) statements on an Amazon Aurora MySQL DB cluster. The release accidentally deleted thousands of rows from an important table and broke some application functionality. This was discovered 4 hours after the release. Upon investigation, a Database Specialist tracked the issue to a DELETE command in the script with an incorrect WHERE clause filtering the wrong set of rows. <br \/>\r<br>The Aurora DB cluster has Backtrack enabled with an 8-hour backtrack window. The Database Administrator also took a manual snapshot of the DB cluster before the release started. The database needs to be returned to the correct state as quickly as possible to resume full application functionality. Data loss must be minimal. <br \/>\r<br>How can the Database Specialist accomplish this?<\/div><input type='hidden' name='question_id[]' id='qID_11' value='291581' \/><input type='hidden' id='answerType291581' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291581[]' id='answer-id-1145842' class='answer   answerof-291581 ' value='1145842'   \/><label for='answer-id-1145842' id='answer-label-1145842' class=' answer'><span>Quickly rewind the DB cluster to a point in time before the release using Backtrack.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291581[]' id='answer-id-1145843' class='answer   answerof-291581 ' value='1145843'   \/><label for='answer-id-1145843' id='answer-label-1145843' class=' answer'><span>Perform a point-in-time recovery (PITR) of the DB cluster to a time before the release and copy the deleted rows from the restored database to the original database.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' 
><input type='radio' name='answer-291581[]' id='answer-id-1145844' class='answer   answerof-291581 ' value='1145844'   \/><label for='answer-id-1145844' id='answer-label-1145844' class=' answer'><span>Restore the DB cluster using the manual backup snapshot created before the release and change the application configuration settings to point to the new DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291581[]' id='answer-id-1145845' class='answer   answerof-291581 ' value='1145845'   \/><label for='answer-id-1145845' id='answer-label-1145845' class=' answer'><span>Create a clone of the DB cluster with Backtrack enabled. Rewind the cloned cluster to a point in time before the release. Copy deleted rows from the clone to the original database.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-12' style=';'><div id='questionWrap-12'  class='   watupro-question-id-291582'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>12. <\/span>A company is load testing its three-tier production web application deployed with an AWS CloudFormation template on AWS. The Application team is making changes to deploy additional Amazon EC2 and AWS Lambda resources to expand the load testing capacity. A Database Specialist wants to ensure that the changes made by the Application team will not change the Amazon RDS database resources already deployed. <br \/>\r<br>Which combination of steps would allow the Database Specialist to accomplish this? 
(Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_12' value='291582' \/><input type='hidden' id='answerType291582' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291582[]' id='answer-id-1145846' class='answer   answerof-291582 ' value='1145846'   \/><label for='answer-id-1145846' id='answer-label-1145846' class=' answer'><span>Review the stack drift before modifying the template<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291582[]' id='answer-id-1145847' class='answer   answerof-291582 ' value='1145847'   \/><label for='answer-id-1145847' id='answer-label-1145847' class=' answer'><span>Create and review a change set before applying it<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291582[]' id='answer-id-1145848' class='answer   answerof-291582 ' value='1145848'   \/><label for='answer-id-1145848' id='answer-label-1145848' class=' answer'><span>Export the database resources as stack outputs<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291582[]' id='answer-id-1145849' class='answer   answerof-291582 ' value='1145849'   \/><label for='answer-id-1145849' id='answer-label-1145849' class=' answer'><span>Define the database resources in a nested stack<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291582[]' id='answer-id-1145850' class='answer   answerof-291582 ' value='1145850'   \/><label for='answer-id-1145850' id='answer-label-1145850' class=' answer'><span>Set a stack policy for the database resources<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-13' 
style=';'><div id='questionWrap-13'  class='   watupro-question-id-291583'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>13. <\/span>A manufacturing company\u2019s website uses an Amazon Aurora PostgreSQL DB cluster. <br \/>\r<br>Which configurations will result in the LEAST application downtime during a failover? (Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_13' value='291583' \/><input type='hidden' id='answerType291583' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291583[]' id='answer-id-1145851' class='answer   answerof-291583 ' value='1145851'   \/><label for='answer-id-1145851' id='answer-label-1145851' class=' answer'><span>Use the provided read and write Aurora endpoints to establish a connection to the Aurora DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291583[]' id='answer-id-1145852' class='answer   answerof-291583 ' value='1145852'   \/><label for='answer-id-1145852' id='answer-label-1145852' class=' answer'><span>Create an Amazon CloudWatch alert triggering a restore in another Availability Zone when the primary Aurora DB cluster is unreachable.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291583[]' id='answer-id-1145853' class='answer   answerof-291583 ' value='1145853'   \/><label for='answer-id-1145853' id='answer-label-1145853' class=' answer'><span>Edit and enable Aurora DB cluster cache management in parameter groups.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291583[]' id='answer-id-1145854' class='answer   answerof-291583 ' value='1145854'   \/><label for='answer-id-1145854' id='answer-label-1145854' class=' answer'><span>Set TCP keepalive 
parameters to a high value.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291583[]' id='answer-id-1145855' class='answer   answerof-291583 ' value='1145855'   \/><label for='answer-id-1145855' id='answer-label-1145855' class=' answer'><span>Set JDBC connection string timeout variables to a low value.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291583[]' id='answer-id-1145856' class='answer   answerof-291583 ' value='1145856'   \/><label for='answer-id-1145856' id='answer-label-1145856' class=' answer'><span>Set Java DNS caching timeouts to a high value.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-14' style=';'><div id='questionWrap-14'  class='   watupro-question-id-291584'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>14. <\/span>A company is hosting critical business data in an Amazon Redshift cluster. Due to the sensitive nature of the data, the cluster is encrypted at rest using AWS KMS. As a part of disaster recovery requirements, the company needs to copy the Amazon Redshift snapshots to another Region. <br \/>\r<br>Which steps should be taken in the AWS Management Console to meet the disaster recovery requirements?<\/div><input type='hidden' name='question_id[]' id='qID_14' value='291584' \/><input type='hidden' id='answerType291584' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291584[]' id='answer-id-1145857' class='answer   answerof-291584 ' value='1145857'   \/><label for='answer-id-1145857' id='answer-label-1145857' class=' answer'><span>Create a new KMS customer master key in the source Region. 
Switch to the destination Region, enable Amazon Redshift cross-Region snapshots, and use the KMS key of the source Region.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291584[]' id='answer-id-1145858' class='answer   answerof-291584 ' value='1145858'   \/><label for='answer-id-1145858' id='answer-label-1145858' class=' answer'><span>Create a new IAM role with access to the KMS key. Enable Amazon Redshift cross-Region replication using the new IAM role, and use the KMS key of the source Region.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291584[]' id='answer-id-1145859' class='answer   answerof-291584 ' value='1145859'   \/><label for='answer-id-1145859' id='answer-label-1145859' class=' answer'><span>Enable Amazon Redshift cross-Region snapshots in the source Region, and create a snapshot copy grant and use a KMS key in the destination Region.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291584[]' id='answer-id-1145860' class='answer   answerof-291584 ' value='1145860'   \/><label for='answer-id-1145860' id='answer-label-1145860' class=' answer'><span>Create a new KMS customer master key in the destination Region and create a new IAM role with access to the new KMS key. Enable Amazon Redshift cross-Region replication in the source Region and use the KMS key of the destination Region.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-15' style=';'><div id='questionWrap-15'  class='   watupro-question-id-291585'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>15. <\/span>A company has a production Amazon Aurora DB cluster that serves both online transaction processing (OLTP) transactions and compute-intensive reports. 
The reports run for 10% of the total cluster uptime while the OLTP transactions run all the time. The company has benchmarked its workload and determined that a six-node Aurora DB cluster is appropriate for the peak workload. The company is now looking at cutting costs for this DB cluster, but needs to have a sufficient number of nodes in the cluster to support the workload at different times. The workload has not changed since the previous benchmarking exercise. <br \/>\r<br>How can a Database Specialist address these requirements with minimal user involvement?<\/div><input type='hidden' name='question_id[]' id='qID_15' value='291585' \/><input type='hidden' id='answerType291585' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291585[]' id='answer-id-1145861' class='answer   answerof-291585 ' value='1145861'   \/><label for='answer-id-1145861' id='answer-label-1145861' class=' answer'><span>Split up the DB cluster into two different clusters: one for OLTP and the other for reporting. \r\nMonitor and set up replication between the two clusters to keep data consistent.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291585[]' id='answer-id-1145862' class='answer   answerof-291585 ' value='1145862'   \/><label for='answer-id-1145862' id='answer-label-1145862' class=' answer'><span>Review and evaluate the peak combined workload. Ensure that utilization of the DB cluster node is at an acceptable level. 
Adjust the number of instances, if necessary.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291585[]' id='answer-id-1145863' class='answer   answerof-291585 ' value='1145863'   \/><label for='answer-id-1145863' id='answer-label-1145863' class=' answer'><span>Use the stop cluster functionality to stop all the nodes of the DB cluster during times of minimal workload. The cluster can be restarted again depending on the workload at the time.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291585[]' id='answer-id-1145864' class='answer   answerof-291585 ' value='1145864'   \/><label for='answer-id-1145864' id='answer-label-1145864' class=' answer'><span>Set up automatic scaling on the DB cluster. This will allow the number of reader nodes to adjust automatically to the reporting workload, when needed.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-16' style=';'><div id='questionWrap-16'  class='   watupro-question-id-291586'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>16. <\/span>A company is running a finance application on an Amazon RDS for MySQL DB instance. The application is governed by multiple financial regulatory agencies. The RDS DB instance is set up with security groups to allow access to certain Amazon EC2 servers only. AWS KMS is used for encryption at rest. 
<br \/>\r<br>Which step will provide additional security?<\/div><input type='hidden' name='question_id[]' id='qID_16' value='291586' \/><input type='hidden' id='answerType291586' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291586[]' id='answer-id-1145865' class='answer   answerof-291586 ' value='1145865'   \/><label for='answer-id-1145865' id='answer-label-1145865' class=' answer'><span>Set up NACLs that allow the entire EC2 subnet to access the DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291586[]' id='answer-id-1145866' class='answer   answerof-291586 ' value='1145866'   \/><label for='answer-id-1145866' id='answer-label-1145866' class=' answer'><span>Disable the master user account<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291586[]' id='answer-id-1145867' class='answer   answerof-291586 ' value='1145867'   \/><label for='answer-id-1145867' id='answer-label-1145867' class=' answer'><span>Set up a security group that blocks SSH to the DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291586[]' id='answer-id-1145868' class='answer   answerof-291586 ' value='1145868'   \/><label for='answer-id-1145868' id='answer-label-1145868' class=' answer'><span>Set up RDS to use SSL for data in transit<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-17' style=';'><div id='questionWrap-17'  class='   watupro-question-id-291587'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>17. <\/span>A company needs a data warehouse solution that keeps data in a consistent, highly structured format. 
The company requires fast responses for end-user queries when looking at data from the current year, and users must have access to the full 15-year dataset, when needed. This solution also needs to handle a fluctuating number of incoming queries. Storage costs for the 100 TB of data must be kept low. <br \/>\r<br>Which solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_17' value='291587' \/><input type='hidden' id='answerType291587' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291587[]' id='answer-id-1145869' class='answer   answerof-291587 ' value='1145869'   \/><label for='answer-id-1145869' id='answer-label-1145869' class=' answer'><span>Leverage an Amazon Redshift data warehouse solution using a dense storage instance type while keeping all the data on local Amazon Redshift storage. Provision enough instances to support high demand.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291587[]' id='answer-id-1145870' class='answer   answerof-291587 ' value='1145870'   \/><label for='answer-id-1145870' id='answer-label-1145870' class=' answer'><span>Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Provision enough instances to support high demand.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291587[]' id='answer-id-1145871' class='answer   answerof-291587 ' value='1145871'   \/><label for='answer-id-1145871' id='answer-label-1145871' class=' answer'><span>Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. 
Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Enable Amazon Redshift Concurrency Scaling.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291587[]' id='answer-id-1145872' class='answer   answerof-291587 ' value='1145872'   \/><label for='answer-id-1145872' id='answer-label-1145872' class=' answer'><span>Leverage an Amazon Redshift data warehouse solution using a dense storage instance to store the most recent data. Keep historical data on Amazon S3 and access it using the Amazon Redshift Spectrum layer. Leverage Amazon Redshift elastic resize.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-18' style=';'><div id='questionWrap-18'  class='   watupro-question-id-291588'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>18. <\/span>A gaming company wants to deploy a game in multiple Regions. The company plans to save local high scores in Amazon DynamoDB tables in each Region. A Database Specialist needs to design a solution to automate the deployment of the database with identical configurations in additional Regions, as needed. The solution should also automate configuration changes across all Regions. 
<br \/>\r<br>Which solution would meet these requirements and deploy the DynamoDB tables?<\/div><input type='hidden' name='question_id[]' id='qID_18' value='291588' \/><input type='hidden' id='answerType291588' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291588[]' id='answer-id-1145873' class='answer   answerof-291588 ' value='1145873'   \/><label for='answer-id-1145873' id='answer-label-1145873' class=' answer'><span>Create an AWS CLI command to deploy the DynamoDB table to all the Regions and save it for future deployments.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291588[]' id='answer-id-1145874' class='answer   answerof-291588 ' value='1145874'   \/><label for='answer-id-1145874' id='answer-label-1145874' class=' answer'><span>Create an AWS CloudFormation template and deploy the template to all the Regions.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291588[]' id='answer-id-1145875' class='answer   answerof-291588 ' value='1145875'   \/><label for='answer-id-1145875' id='answer-label-1145875' class=' answer'><span>Create an AWS CloudFormation template and use a stack set to deploy the template to all the Regions.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291588[]' id='answer-id-1145876' class='answer   answerof-291588 ' value='1145876'   \/><label for='answer-id-1145876' id='answer-label-1145876' class=' answer'><span>Create DynamoDB tables using the AWS Management Console in all the Regions and create a step-by-step guide for future deployments.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-19' style=';'><div id='questionWrap-19'  class='  
 watupro-question-id-291589'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>19. <\/span>A team of Database Specialists is currently investigating performance issues on an Amazon RDS for MySQL DB instance and is reviewing related metrics. The team wants to narrow the possibilities down to specific database wait events to better understand the situation. <br \/>\r<br>How can the Database Specialists accomplish this?<\/div><input type='hidden' name='question_id[]' id='qID_19' value='291589' \/><input type='hidden' id='answerType291589' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291589[]' id='answer-id-1145877' class='answer   answerof-291589 ' value='1145877'   \/><label for='answer-id-1145877' id='answer-label-1145877' class=' answer'><span>Enable the option to push all database logs to Amazon CloudWatch for advanced analysis<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291589[]' id='answer-id-1145878' class='answer   answerof-291589 ' value='1145878'   \/><label for='answer-id-1145878' id='answer-label-1145878' class=' answer'><span>Create appropriate Amazon CloudWatch dashboards to contain specific periods of time<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291589[]' id='answer-id-1145879' class='answer   answerof-291589 ' value='1145879'   \/><label for='answer-id-1145879' id='answer-label-1145879' class=' answer'><span>Enable Amazon RDS Performance Insights and review the appropriate dashboard<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291589[]' id='answer-id-1145880' class='answer   answerof-291589 ' value='1145880'   \/><label for='answer-id-1145880' id='answer-label-1145880' class=' answer'><span>Enable 
Enhanced Monitoring with the appropriate settings<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-20' style=';'><div id='questionWrap-20'  class='   watupro-question-id-291590'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>20. <\/span>A large company is using an Amazon RDS for Oracle Multi-AZ DB instance with a Java application. As a part of its disaster recovery annual testing, the company would like to simulate an Availability Zone failure and record how the application reacts during the DB instance failover activity. The company does not want to make any code changes for this activity. <br \/>\r<br>What should the company do to achieve this in the shortest amount of time?<\/div><input type='hidden' name='question_id[]' id='qID_20' value='291590' \/><input type='hidden' id='answerType291590' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291590[]' id='answer-id-1145881' class='answer   answerof-291590 ' value='1145881'   \/><label for='answer-id-1145881' id='answer-label-1145881' class=' answer'><span>Use a blue-green deployment with a complete application-level failover test<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291590[]' id='answer-id-1145882' class='answer   answerof-291590 ' value='1145882'   \/><label for='answer-id-1145882' id='answer-label-1145882' class=' answer'><span>Use the RDS console to reboot the DB instance by choosing the option to reboot with failover<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291590[]' id='answer-id-1145883' class='answer   answerof-291590 ' value='1145883'   \/><label for='answer-id-1145883' id='answer-label-1145883' class=' answer'><span>Use RDS 
fault injection queries to simulate the primary node failure<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291590[]' id='answer-id-1145884' class='answer   answerof-291590 ' value='1145884'   \/><label for='answer-id-1145884' id='answer-label-1145884' class=' answer'><span>Add a rule to the NACL to deny all traffic on the subnets associated with a single Availability Zone<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-21' style=';'><div id='questionWrap-21'  class='   watupro-question-id-291591'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>21. <\/span>A company maintains several databases using Amazon RDS for MySQL and PostgreSQL. Each RDS database generates log files with retention periods set to their default values. The company has now mandated that database logs be maintained for up to 90 days in a centralized repository to facilitate real-time and after-the-fact analyses. <br \/>\r<br>What should a Database Specialist do to meet these requirements with minimal effort?<\/div><input type='hidden' name='question_id[]' id='qID_21' value='291591' \/><input type='hidden' id='answerType291591' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291591[]' id='answer-id-1145885' class='answer   answerof-291591 ' value='1145885'   \/><label for='answer-id-1145885' id='answer-label-1145885' class=' answer'><span>Create an AWS Lambda function to pull logs from the RDS databases and consolidate the log files in an Amazon S3 bucket. 
Set a lifecycle policy to expire the objects after 90 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291591[]' id='answer-id-1145886' class='answer   answerof-291591 ' value='1145886'   \/><label for='answer-id-1145886' id='answer-label-1145886' class=' answer'><span>Modify the RDS databases to publish logs to Amazon CloudWatch Logs. Change the log retention policy for each log group to expire the events after 90 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291591[]' id='answer-id-1145887' class='answer   answerof-291591 ' value='1145887'   \/><label for='answer-id-1145887' id='answer-label-1145887' class=' answer'><span>Write a stored procedure in each RDS database to download the logs and consolidate the log files in an Amazon S3 bucket. Set a lifecycle policy to expire the objects after 90 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291591[]' id='answer-id-1145888' class='answer   answerof-291591 ' value='1145888'   \/><label for='answer-id-1145888' id='answer-label-1145888' class=' answer'><span>Create an AWS Lambda function to download the logs from the RDS databases and publish the logs to Amazon CloudWatch Logs. Change the log retention policy for the log group to expire the events after 90 days.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-22' style=';'><div id='questionWrap-22'  class='   watupro-question-id-291592'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>22. <\/span>A Database Specialist is setting up a new Amazon Aurora DB cluster with one primary instance and three Aurora Replicas for a highly intensive, business-critical application. 
The Aurora DB cluster has one medium-sized primary instance, one large-sized replica, and two medium-sized replicas. The Database Specialist did not assign a promotion tier to the replicas. <br \/>\r<br>In the event of a primary failure, what will occur?<\/div><input type='hidden' name='question_id[]' id='qID_22' value='291592' \/><input type='hidden' id='answerType291592' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291592[]' id='answer-id-1145889' class='answer   answerof-291592 ' value='1145889'   \/><label for='answer-id-1145889' id='answer-label-1145889' class=' answer'><span>Aurora will promote an Aurora Replica that is of the same size as the primary instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291592[]' id='answer-id-1145890' class='answer   answerof-291592 ' value='1145890'   \/><label for='answer-id-1145890' id='answer-label-1145890' class=' answer'><span>Aurora will promote an arbitrary Aurora Replica<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291592[]' id='answer-id-1145891' class='answer   answerof-291592 ' value='1145891'   \/><label for='answer-id-1145891' id='answer-label-1145891' class=' answer'><span>Aurora will promote the largest-sized Aurora Replica<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291592[]' id='answer-id-1145892' class='answer   answerof-291592 ' value='1145892'   \/><label for='answer-id-1145892' id='answer-label-1145892' class=' answer'><span>Aurora will not promote an Aurora Replica<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-23' style=';'><div id='questionWrap-23'  class='   
watupro-question-id-291593'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>23. <\/span>A company is running its line of business application on AWS, which uses Amazon RDS for MySQL as the persistent data store. The company wants to minimize downtime when it migrates the database to Amazon Aurora. <br \/>\r<br>Which migration method should a Database Specialist use?<\/div><input type='hidden' name='question_id[]' id='qID_23' value='291593' \/><input type='hidden' id='answerType291593' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291593[]' id='answer-id-1145893' class='answer   answerof-291593 ' value='1145893'   \/><label for='answer-id-1145893' id='answer-label-1145893' class=' answer'><span>Take a snapshot of the RDS for MySQL DB instance and create a new Aurora DB cluster with the option to migrate snapshots.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291593[]' id='answer-id-1145894' class='answer   answerof-291593 ' value='1145894'   \/><label for='answer-id-1145894' id='answer-label-1145894' class=' answer'><span>Make a backup of the RDS for MySQL DB instance using the mysqldump utility, create a new Aurora DB cluster, and restore the backup.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291593[]' id='answer-id-1145895' class='answer   answerof-291593 ' value='1145895'   \/><label for='answer-id-1145895' id='answer-label-1145895' class=' answer'><span>Create an Aurora Replica from the RDS for MySQL DB instance and promote the Aurora DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291593[]' id='answer-id-1145896' class='answer   answerof-291593 ' value='1145896'   \/><label 
for='answer-id-1145896' id='answer-label-1145896' class=' answer'><span>Create a clone of the RDS for MySQL DB instance and promote the Aurora DB cluster.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-24' style=';'><div id='questionWrap-24'  class='   watupro-question-id-291594'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>24. <\/span>The Security team for a finance company was notified of an internal security breach that happened 3 weeks ago. A Database Specialist must start producing audit logs out of the production Amazon Aurora PostgreSQL cluster for the Security team to use for monitoring and alerting. The Security team is required to perform real-time alerting and monitoring outside the Aurora DB cluster and wants to have the cluster push encrypted files to the chosen solution. <br \/>\r<br>Which approach will meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_24' value='291594' \/><input type='hidden' id='answerType291594' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291594[]' id='answer-id-1145897' class='answer   answerof-291594 ' value='1145897'   \/><label for='answer-id-1145897' id='answer-label-1145897' class=' answer'><span>Use pg_audit to generate audit logs and send the logs to the Security team.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291594[]' id='answer-id-1145898' class='answer   answerof-291594 ' value='1145898'   \/><label for='answer-id-1145898' id='answer-label-1145898' class=' answer'><span>Use AWS CloudTrail to audit the DB cluster and the Security team will get data from Amazon S3.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' 
name='answer-291594[]' id='answer-id-1145899' class='answer   answerof-291594 ' value='1145899'   \/><label for='answer-id-1145899' id='answer-label-1145899' class=' answer'><span>Set up database activity streams and connect the data stream from Amazon Kinesis to consumer applications.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291594[]' id='answer-id-1145900' class='answer   answerof-291594 ' value='1145900'   \/><label for='answer-id-1145900' id='answer-label-1145900' class=' answer'><span>Turn on verbose logging and set up a schedule for the logs to be dumped out for the Security team.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-25' style=';'><div id='questionWrap-25'  class='   watupro-question-id-291595'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>25. <\/span>A company is using Amazon RDS for MySQL to redesign its business application. A Database Specialist has noticed that the Development team is restoring their MySQL database multiple times a day when Developers make mistakes in their schema updates. The Developers sometimes need to wait hours for the restores to complete. <br \/>\r<br>Multiple team members are working on the project, making it difficult to find the correct restore point for each mistake. 
<br \/>\r<br>Which approach should the Database Specialist take to reduce downtime?<\/div><input type='hidden' name='question_id[]' id='qID_25' value='291595' \/><input type='hidden' id='answerType291595' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291595[]' id='answer-id-1145901' class='answer   answerof-291595 ' value='1145901'   \/><label for='answer-id-1145901' id='answer-label-1145901' class=' answer'><span>Deploy multiple read replicas and have the team members make changes to separate replica instances<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291595[]' id='answer-id-1145902' class='answer   answerof-291595 ' value='1145902'   \/><label for='answer-id-1145902' id='answer-label-1145902' class=' answer'><span>Migrate to Amazon RDS for SQL Server, take a snapshot, and restore from the snapshot<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291595[]' id='answer-id-1145903' class='answer   answerof-291595 ' value='1145903'   \/><label for='answer-id-1145903' id='answer-label-1145903' class=' answer'><span>Migrate to Amazon Aurora MySQL and enable the Aurora Backtrack feature<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291595[]' id='answer-id-1145904' class='answer   answerof-291595 ' value='1145904'   \/><label for='answer-id-1145904' id='answer-label-1145904' class=' answer'><span>Enable the Amazon RDS for MySQL Backtrack feature<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-26' style=';'><div id='questionWrap-26'  class='   watupro-question-id-291596'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>26. 
<\/span>A media company is using Amazon RDS for PostgreSQL to store user data. The RDS DB instance currently has a publicly accessible setting enabled and is hosted in a public subnet. Following a recent AWS Well-Architected Framework review, a Database Specialist was given new security requirements. <br \/>\r<br>Only certain on-premises corporate network IPs should connect to the DB instance. Connectivity is allowed from the corporate network only. <br \/>\r<br>Which combination of steps does the Database Specialist need to take to meet these new requirements? (Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_26' value='291596' \/><input type='hidden' id='answerType291596' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291596[]' id='answer-id-1145905' class='answer   answerof-291596 ' value='1145905'   \/><label for='answer-id-1145905' id='answer-label-1145905' class=' answer'><span>Modify the pg_hba.conf file. Add the required corporate network IPs and remove the unwanted IPs.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291596[]' id='answer-id-1145906' class='answer   answerof-291596 ' value='1145906'   \/><label for='answer-id-1145906' id='answer-label-1145906' class=' answer'><span>Modify the associated security group. 
Add the required corporate network IPs and remove the unwanted IPs.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291596[]' id='answer-id-1145907' class='answer   answerof-291596 ' value='1145907'   \/><label for='answer-id-1145907' id='answer-label-1145907' class=' answer'><span>Move the DB instance to a private subnet using AWS DM<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291596[]' id='answer-id-1145908' class='answer   answerof-291596 ' value='1145908'   \/><label for='answer-id-1145908' id='answer-label-1145908' class=' answer'><span>Enable VPC peering between the application host running on the corporate network and the VPC associated with the DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291596[]' id='answer-id-1145909' class='answer   answerof-291596 ' value='1145909'   \/><label for='answer-id-1145909' id='answer-label-1145909' class=' answer'><span>Disable the publicly accessible setting.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291596[]' id='answer-id-1145910' class='answer   answerof-291596 ' value='1145910'   \/><label for='answer-id-1145910' id='answer-label-1145910' class=' answer'><span>Connect to the DB instance using private IPs and a VP<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-27' style=';'><div id='questionWrap-27'  class='   watupro-question-id-291597'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>27. <\/span>A company is about to launch a new product, and test databases must be re-created from production data. The company runs its production databases on an Amazon Aurora MySQL DB cluster. 
A Database Specialist needs to deploy a solution to create these test databases as quickly as possible with the least amount of administrative effort. <br \/>\r<br>What should the Database Specialist do to meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_27' value='291597' \/><input type='hidden' id='answerType291597' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291597[]' id='answer-id-1145911' class='answer   answerof-291597 ' value='1145911'   \/><label for='answer-id-1145911' id='answer-label-1145911' class=' answer'><span>Restore a snapshot from the production cluster into test clusters<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291597[]' id='answer-id-1145912' class='answer   answerof-291597 ' value='1145912'   \/><label for='answer-id-1145912' id='answer-label-1145912' class=' answer'><span>Create logical dumps of the production cluster and restore them into new test clusters<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291597[]' id='answer-id-1145913' class='answer   answerof-291597 ' value='1145913'   \/><label for='answer-id-1145913' id='answer-label-1145913' class=' answer'><span>Use database cloning to create clones of the production cluster<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291597[]' id='answer-id-1145914' class='answer   answerof-291597 ' value='1145914'   \/><label for='answer-id-1145914' id='answer-label-1145914' class=' answer'><span>Add an additional read replica to the production cluster and use that node for testing<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-28' style=';'><div 
id='questionWrap-28'  class='   watupro-question-id-291598'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>28. <\/span>A company with branch offices in Portland, New York, and Singapore has a three-tier web application that leverages a shared database. The database runs on Amazon RDS for MySQL and is hosted in the us-west-2 Region. The application has a distributed front end deployed in the us-west-2, ap-southeast-1, and us-east-2 Regions. <br \/>\r<br>This front end is used as a dashboard for Sales Managers in each branch office to see current sales statistics. There are complaints that the dashboard performs more slowly in the Singapore location than it does in Portland or New York. A solution is needed to provide consistent performance for all users in each location. <br \/>\r<br>Which set of actions will meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_28' value='291598' \/><input type='hidden' id='answerType291598' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291598[]' id='answer-id-1145915' class='answer   answerof-291598 ' value='1145915'   \/><label for='answer-id-1145915' id='answer-label-1145915' class=' answer'><span>Take a snapshot of the instance in the us-west-2 Region. Create a new instance from the snapshot in the ap-southeast-1 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291598[]' id='answer-id-1145916' class='answer   answerof-291598 ' value='1145916'   \/><label for='answer-id-1145916' id='answer-label-1145916' class=' answer'><span>Create an RDS read replica in the ap-southeast-1 Region from the primary RDS DB instance in the us-west-2 Region. 
Reconfigure the ap-southeast-1 front-end dashboard to access this instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291598[]' id='answer-id-1145917' class='answer   answerof-291598 ' value='1145917'   \/><label for='answer-id-1145917' id='answer-label-1145917' class=' answer'><span>Create a new RDS instance in the ap-southeast-1 Region. Use AWS DMS and change data capture (CDC) to update the new instance in the ap-southeast-1 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291598[]' id='answer-id-1145918' class='answer   answerof-291598 ' value='1145918'   \/><label for='answer-id-1145918' id='answer-label-1145918' class=' answer'><span>Create an RDS read replica in the us-west-2 Region where the primary instance resides. Create a read replica in the ap-southeast-1 Region from the read replica located on the us-west-2 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-29' style=';'><div id='questionWrap-29'  class='   watupro-question-id-291599'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>29. <\/span>A company wants to migrate its existing on-premises Oracle database to Amazon Aurora PostgreSQL. The migration must be completed with minimal downtime using AWS DMS. A Database Specialist must validate that the data was migrated accurately from the source to the target before the cutover. The migration must have minimal impact on the performance of the source database. 
<br \/>\r<br>Which approach will MOST effectively meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_29' value='291599' \/><input type='hidden' id='answerType291599' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291599[]' id='answer-id-1145919' class='answer   answerof-291599 ' value='1145919'   \/><label for='answer-id-1145919' id='answer-label-1145919' class=' answer'><span>Use the AWS Schema Conversion Tool (AWS SCT) to convert source Oracle database schemas to the target Aurora DB cluster. Verify the datatype of the columns.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291599[]' id='answer-id-1145920' class='answer   answerof-291599 ' value='1145920'   \/><label for='answer-id-1145920' id='answer-label-1145920' class=' answer'><span>Use the table metrics of the AWS DMS task created for migrating the data to verify the statistics for the tables being migrated and to verify that the data definition language (DDL) statements are completed.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291599[]' id='answer-id-1145921' class='answer   answerof-291599 ' value='1145921'   \/><label for='answer-id-1145921' id='answer-label-1145921' class=' answer'><span>Enable the AWS Schema Conversion Tool (AWS SCT) premigration validation and review the premigration checklist to make sure there are no issues with the conversion.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291599[]' id='answer-id-1145922' class='answer   answerof-291599 ' value='1145922'   \/><label for='answer-id-1145922' id='answer-label-1145922' class=' answer'><span>Enable AWS DMS data validation on the task so the AWS DMS task compares the source and target 
records, and reports any mismatches.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-30' style=';'><div id='questionWrap-30'  class='   watupro-question-id-291600'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>30. <\/span>A company is planning to close for several days. A Database Specialist needs to stop all applications along with the DB instances to ensure employees do not have access to the systems during this time. All databases are running on Amazon RDS for MySQL. <br \/>\r<br>The Database Specialist wrote and executed a script to stop all the DB instances. When reviewing the logs, the Database Specialist found that Amazon RDS DB instances with read replicas did not stop. <br \/>\r<br>How should the Database Specialist edit the script to fix this issue?<\/div><input type='hidden' name='question_id[]' id='qID_30' value='291600' \/><input type='hidden' id='answerType291600' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291600[]' id='answer-id-1145923' class='answer   answerof-291600 ' value='1145923'   \/><label for='answer-id-1145923' id='answer-label-1145923' class=' answer'><span>Stop the source instances before stopping their read replicas<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291600[]' id='answer-id-1145924' class='answer   answerof-291600 ' value='1145924'   \/><label for='answer-id-1145924' id='answer-label-1145924' class=' answer'><span>Delete each read replica before stopping its corresponding source instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291600[]' id='answer-id-1145925' class='answer   answerof-291600 ' value='1145925'   \/><label for='answer-id-1145925' 
id='answer-label-1145925' class=' answer'><span>Stop the read replicas before stopping their source instances<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291600[]' id='answer-id-1145926' class='answer   answerof-291600 ' value='1145926'   \/><label for='answer-id-1145926' id='answer-label-1145926' class=' answer'><span>Use the AWS CLI to stop each read replica and source instance at the same time<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-31' style=';'><div id='questionWrap-31'  class='   watupro-question-id-291601'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>31. <\/span>A global digital advertising company captures browsing metadata to contextually display relevant images, pages, and links to targeted users. A single page load can generate multiple events that need to be stored individually. The maximum size of an event is 200 KB and the average size is 10 KB. Each page load must query the user\u2019s browsing history to provide targeting recommendations. The advertising company expects over 1 billion page visits per day from users in the United States, Europe, Hong Kong, and India. The structure of the metadata varies depending on the event. Additionally, the browsing metadata must be written and read with very low latency to ensure a good viewing experience for the users. 
<br \/>\r<br>Which database solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_31' value='291601' \/><input type='hidden' id='answerType291601' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291601[]' id='answer-id-1145927' class='answer   answerof-291601 ' value='1145927'   \/><label for='answer-id-1145927' id='answer-label-1145927' class=' answer'><span>Amazon DocumentDB<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291601[]' id='answer-id-1145928' class='answer   answerof-291601 ' value='1145928'   \/><label for='answer-id-1145928' id='answer-label-1145928' class=' answer'><span>Amazon RDS Multi-AZ deployment<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291601[]' id='answer-id-1145929' class='answer   answerof-291601 ' value='1145929'   \/><label for='answer-id-1145929' id='answer-label-1145929' class=' answer'><span>Amazon DynamoDB global table<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291601[]' id='answer-id-1145930' class='answer   answerof-291601 ' value='1145930'   \/><label for='answer-id-1145930' id='answer-label-1145930' class=' answer'><span>Amazon Aurora Global Database<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-32' style=';'><div id='questionWrap-32'  class='   watupro-question-id-291602'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>32. <\/span>A Database Specialist modified an existing parameter group currently associated with a production Amazon RDS for SQL Server Multi-AZ DB instance. 
The change is associated with a static parameter type, which controls the number of user connections allowed on the most critical RDS SQL Server DB instance for the company. This change has been approved for a specific maintenance window to help minimize the impact on users. <br \/>\r<br>How should the Database Specialist apply the parameter group change for the DB instance?<\/div><input type='hidden' name='question_id[]' id='qID_32' value='291602' \/><input type='hidden' id='answerType291602' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291602[]' id='answer-id-1145931' class='answer   answerof-291602 ' value='1145931'   \/><label for='answer-id-1145931' id='answer-label-1145931' class=' answer'><span>Select the option to apply the change immediately<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291602[]' id='answer-id-1145932' class='answer   answerof-291602 ' value='1145932'   \/><label for='answer-id-1145932' id='answer-label-1145932' class=' answer'><span>Allow the preconfigured RDS maintenance window for the given DB instance to control when the change is applied<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291602[]' id='answer-id-1145933' class='answer   answerof-291602 ' value='1145933'   \/><label for='answer-id-1145933' id='answer-label-1145933' class=' answer'><span>Apply the change manually by rebooting the DB instance during the approved maintenance window<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291602[]' id='answer-id-1145934' class='answer   answerof-291602 ' value='1145934'   \/><label for='answer-id-1145934' id='answer-label-1145934' class=' answer'><span>Reboot the secondary Multi-AZ DB 
instance<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-33' style=';'><div id='questionWrap-33'  class='   watupro-question-id-291603'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>33. <\/span>A Database Specialist is designing a new database infrastructure for a ride hailing application. The application data includes a ride tracking system that stores GPS coordinates for all rides. Real-time statistics and metadata lookups must be performed with high throughput and microsecond latency. The database should be fault tolerant with minimal operational overhead and development effort. <br \/>\r<br>Which solution meets these requirements in the MOST efficient way?<\/div><input type='hidden' name='question_id[]' id='qID_33' value='291603' \/><input type='hidden' id='answerType291603' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291603[]' id='answer-id-1145935' class='answer   answerof-291603 ' value='1145935'   \/><label for='answer-id-1145935' id='answer-label-1145935' class=' answer'><span>Use Amazon RDS for MySQL as the database and use Amazon ElastiCache<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291603[]' id='answer-id-1145936' class='answer   answerof-291603 ' value='1145936'   \/><label for='answer-id-1145936' id='answer-label-1145936' class=' answer'><span>Use Amazon DynamoDB as the database and use DynamoDB Accelerator<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291603[]' id='answer-id-1145937' class='answer   answerof-291603 ' value='1145937'   \/><label for='answer-id-1145937' id='answer-label-1145937' class=' answer'><span>Use Amazon Aurora MySQL as the database and use Aurora\u2019s 
buffer cache<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291603[]' id='answer-id-1145938' class='answer   answerof-291603 ' value='1145938'   \/><label for='answer-id-1145938' id='answer-label-1145938' class=' answer'><span>Use Amazon DynamoDB as the database and use Amazon API Gateway<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-34' style=';'><div id='questionWrap-34'  class='   watupro-question-id-291604'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>34. <\/span>A company is using an Amazon Aurora PostgreSQL DB cluster with an xlarge primary instance master and two large Aurora Replicas for high availability and read-only workload scaling. A failover event occurs and application performance is poor for several minutes. During this time, application servers in all Availability Zones are healthy and responding normally. <br \/>\r<br>What should the company do to eliminate this application performance issue?<\/div><input type='hidden' name='question_id[]' id='qID_34' value='291604' \/><input type='hidden' id='answerType291604' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291604[]' id='answer-id-1145939' class='answer   answerof-291604 ' value='1145939'   \/><label for='answer-id-1145939' id='answer-label-1145939' class=' answer'><span>Configure both of the Aurora Replicas to the same instance class as the primary DB instance. 
Enable cache coherence on the DB cluster, set the primary DB instance failover priority to tier-0, and assign a failover priority of tier-1 to the replicas.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291604[]' id='answer-id-1145940' class='answer   answerof-291604 ' value='1145940'   \/><label for='answer-id-1145940' id='answer-label-1145940' class=' answer'><span>Deploy an AWS Lambda function that calls the DescribeDBInstances action to establish which instance has failed, and then use the PromoteReadReplica operation to promote one Aurora Replica to be the primary DB instance. Configure an Amazon RDS event subscription to send a notification to an Amazon SNS topic to which the Lambda function is subscribed.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291604[]' id='answer-id-1145941' class='answer   answerof-291604 ' value='1145941'   \/><label for='answer-id-1145941' id='answer-label-1145941' class=' answer'><span>Configure one Aurora Replica to have the same instance class as the primary DB instance. Implement Aurora PostgreSQL DB cluster cache management. Set the failover priority to tier-0 for the primary DB instance and one replica with the same instance class. Set the failover priority to tier-1 for the other replicas.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291604[]' id='answer-id-1145942' class='answer   answerof-291604 ' value='1145942'   \/><label for='answer-id-1145942' id='answer-label-1145942' class=' answer'><span>Configure both Aurora Replicas to have the same instance class as the primary DB instance. Implement Aurora PostgreSQL DB cluster cache management. 
Set the failover priority to tier-0 for the primary DB instance and to tier-1 for the replicas.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-35' style=';'><div id='questionWrap-35'  class='   watupro-question-id-291605'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>35. <\/span>A company has a database monitoring solution that uses Amazon CloudWatch for its Amazon RDS for SQL Server environment. The cause of a recent spike in CPU utilization was not determined using the standard metrics that were collected. The CPU spike caused the application to perform poorly, impacting users. A Database Specialist needs to determine what caused the CPU spike. <br \/>\r<br>Which combination of steps should be taken to provide more visibility into the processes and queries running during an increase in CPU load? (Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_35' value='291605' \/><input type='hidden' id='answerType291605' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291605[]' id='answer-id-1145943' class='answer   answerof-291605 ' value='1145943'   \/><label for='answer-id-1145943' id='answer-label-1145943' class=' answer'><span>Enable Amazon CloudWatch Events and view the incoming T-SQL statements causing the CPU to spike.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291605[]' id='answer-id-1145944' class='answer   answerof-291605 ' value='1145944'   \/><label for='answer-id-1145944' id='answer-label-1145944' class=' answer'><span>Enable Enhanced Monitoring metrics to view CPU utilization at the RDS SQL Server DB instance level.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' 
name='answer-291605[]' id='answer-id-1145945' class='answer   answerof-291605 ' value='1145945'   \/><label for='answer-id-1145945' id='answer-label-1145945' class=' answer'><span>Implement a caching layer to help with repeated queries on the RDS SQL Server DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291605[]' id='answer-id-1145946' class='answer   answerof-291605 ' value='1145946'   \/><label for='answer-id-1145946' id='answer-label-1145946' class=' answer'><span>Use Amazon QuickSight to view the SQL statement being run.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291605[]' id='answer-id-1145947' class='answer   answerof-291605 ' value='1145947'   \/><label for='answer-id-1145947' id='answer-label-1145947' class=' answer'><span>Enable Amazon RDS Performance Insights to view the database load and filter the load by waits, SQL statements, hosts, or users.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-36' style=';'><div id='questionWrap-36'  class='   watupro-question-id-291606'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>36. <\/span>A company is using Amazon Aurora with Aurora Replicas for read-only workload scaling. A Database Specialist needs to split up two read-only applications so each application always connects to a dedicated replica. The Database Specialist wants to implement load balancing and high availability for the read-only applications. 
<br \/>\r<br>Which solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_36' value='291606' \/><input type='hidden' id='answerType291606' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291606[]' id='answer-id-1145948' class='answer   answerof-291606 ' value='1145948'   \/><label for='answer-id-1145948' id='answer-label-1145948' class=' answer'><span>Use a specific instance endpoint for each replica and add the instance endpoint to each read-only application connection string.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291606[]' id='answer-id-1145949' class='answer   answerof-291606 ' value='1145949'   \/><label for='answer-id-1145949' id='answer-label-1145949' class=' answer'><span>Use reader endpoints for both the read-only workload applications.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291606[]' id='answer-id-1145950' class='answer   answerof-291606 ' value='1145950'   \/><label for='answer-id-1145950' id='answer-label-1145950' class=' answer'><span>Use a reader endpoint for one read-only application and use an instance endpoint for the other read-only application.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291606[]' id='answer-id-1145951' class='answer   answerof-291606 ' value='1145951'   \/><label for='answer-id-1145951' id='answer-label-1145951' class=' answer'><span>Use custom endpoints for the two read-only applications.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-37' style=';'><div id='questionWrap-37'  class='   watupro-question-id-291607'>\n\t\t\t<div class='question-content'><div><span 
class='watupro_num'>37. <\/span>An online gaming company is planning to launch a new game with Amazon DynamoDB as its data store. <br \/>\r<br>The database should be designed to support the following use cases: <br \/>\r<br>Update scores in real time whenever a player is playing the game. Retrieve a player\u2019s score details for a specific game session. <br \/>\r<br>A Database Specialist decides to implement a DynamoDB table. Each player has a unique user_id and each game has a unique game_id. <br \/>\r<br>Which choice of keys is recommended for the DynamoDB table?<\/div><input type='hidden' name='question_id[]' id='qID_37' value='291607' \/><input type='hidden' id='answerType291607' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291607[]' id='answer-id-1145952' class='answer   answerof-291607 ' value='1145952'   \/><label for='answer-id-1145952' id='answer-label-1145952' class=' answer'><span>Create a global secondary index with game_id as the partition key<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291607[]' id='answer-id-1145953' class='answer   answerof-291607 ' value='1145953'   \/><label for='answer-id-1145953' id='answer-label-1145953' class=' answer'><span>Create a global secondary index with user_id as the partition key<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291607[]' id='answer-id-1145954' class='answer   answerof-291607 ' value='1145954'   \/><label for='answer-id-1145954' id='answer-label-1145954' class=' answer'><span>Create a composite primary key with game_id as the partition key and user_id as the sort key<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291607[]' id='answer-id-1145955' class='answer   answerof-291607 ' 
value='1145955'   \/><label for='answer-id-1145955' id='answer-label-1145955' class=' answer'><span>Create a composite primary key with user_id as the partition key and game_id as the sort key<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-38' style=';'><div id='questionWrap-38'  class='   watupro-question-id-291608'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>38. <\/span>A Database Specialist migrated an existing production MySQL database from on-premises to an Amazon RDS for MySQL DB instance. However, after the migration, the database needed to be encrypted at rest using AWS KMS. Due to the size of the database, reloading the data into an encrypted database would be too time-consuming, so it is not an option. <br \/>\r<br>How should the Database Specialist satisfy this new requirement?<\/div><input type='hidden' name='question_id[]' id='qID_38' value='291608' \/><input type='hidden' id='answerType291608' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291608[]' id='answer-id-1145956' class='answer   answerof-291608 ' value='1145956'   \/><label for='answer-id-1145956' id='answer-label-1145956' class=' answer'><span>Create a snapshot of the unencrypted RDS DB instance. Create an encrypted copy of the unencrypted snapshot. Restore the encrypted snapshot copy.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291608[]' id='answer-id-1145957' class='answer   answerof-291608 ' value='1145957'   \/><label for='answer-id-1145957' id='answer-label-1145957' class=' answer'><span>Modify the RDS DB instance. 
Enable the AWS KMS encryption option that leverages the AWS CLI.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291608[]' id='answer-id-1145958' class='answer   answerof-291608 ' value='1145958'   \/><label for='answer-id-1145958' id='answer-label-1145958' class=' answer'><span>Restore an unencrypted snapshot into a MySQL RDS DB instance that is encrypted.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291608[]' id='answer-id-1145959' class='answer   answerof-291608 ' value='1145959'   \/><label for='answer-id-1145959' id='answer-label-1145959' class=' answer'><span>Create an encrypted read replica of the RDS DB instance. Promote it to be the master.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-39' style=';'><div id='questionWrap-39'  class='   watupro-question-id-291609'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>39. <\/span>A Database Specialist is planning to create a read replica of an existing Amazon RDS for MySQL Multi-AZ DB instance. When using the AWS Management Console to conduct this task, the Database Specialist discovers that the source RDS DB instance does not appear in the read replica source selection box, so the read replica cannot be created. 
<br \/>\r<br>What is the most likely reason for this?<\/div><input type='hidden' name='question_id[]' id='qID_39' value='291609' \/><input type='hidden' id='answerType291609' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291609[]' id='answer-id-1145960' class='answer   answerof-291609 ' value='1145960'   \/><label for='answer-id-1145960' id='answer-label-1145960' class=' answer'><span>The source DB instance has to be converted to Single-AZ first to create a read replica from it.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291609[]' id='answer-id-1145961' class='answer   answerof-291609 ' value='1145961'   \/><label for='answer-id-1145961' id='answer-label-1145961' class=' answer'><span>Enhanced Monitoring is not enabled on the source DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291609[]' id='answer-id-1145962' class='answer   answerof-291609 ' value='1145962'   \/><label for='answer-id-1145962' id='answer-label-1145962' class=' answer'><span>The minor MySQL version in the source DB instance does not support read replicas.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291609[]' id='answer-id-1145963' class='answer   answerof-291609 ' value='1145963'   \/><label for='answer-id-1145963' id='answer-label-1145963' class=' answer'><span>Automated backups are not enabled on the source DB instance.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-40' style=';'><div id='questionWrap-40'  class='   watupro-question-id-291610'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>40. 
<\/span>A Database Specialist has migrated an on-premises Oracle database to Amazon Aurora PostgreSQL. The schema and the data have been migrated successfully. The on-premises database server was also being used to run database maintenance cron jobs written in Python to perform tasks including data purging and generating data exports. The logs for these jobs show that, most of the time, the jobs completed within 5 minutes, but a few jobs took up to 10 minutes to complete. These maintenance jobs need to be set up for Aurora PostgreSQL. <br \/>\r<br>How can the Database Specialist schedule these jobs so the setup requires minimal maintenance and provides high availability?<\/div><input type='hidden' name='question_id[]' id='qID_40' value='291610' \/><input type='hidden' id='answerType291610' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291610[]' id='answer-id-1145964' class='answer   answerof-291610 ' value='1145964'   \/><label for='answer-id-1145964' id='answer-label-1145964' class=' answer'><span>Create cron jobs on an Amazon EC2 instance to run the maintenance jobs following the required schedule.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291610[]' id='answer-id-1145965' class='answer   answerof-291610 ' value='1145965'   \/><label for='answer-id-1145965' id='answer-label-1145965' class=' answer'><span>Connect to the Aurora host and create cron jobs to run the maintenance jobs following the required schedule.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291610[]' id='answer-id-1145966' class='answer   answerof-291610 ' value='1145966'   \/><label for='answer-id-1145966' id='answer-label-1145966' class=' answer'><span>Create AWS Lambda functions to run the maintenance jobs and schedule them with 
Amazon CloudWatch Events.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291610[]' id='answer-id-1145967' class='answer   answerof-291610 ' value='1145967'   \/><label for='answer-id-1145967' id='answer-label-1145967' class=' answer'><span>Create the maintenance job using the Amazon CloudWatch job scheduling plugin.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-41' style=';'><div id='questionWrap-41'  class='   watupro-question-id-291611'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>41. <\/span>A company has an Amazon RDS Multi-AZ DB instance that is 200 GB in size with an RPO of 6 hours. To meet the company\u2019s disaster recovery policies, the database backup needs to be copied into another Region. The company requires the solution to be cost-effective and operationally efficient. <br \/>\r<br>What should a Database Specialist do to copy the database backup into a different Region?<\/div><input type='hidden' name='question_id[]' id='qID_41' value='291611' \/><input type='hidden' id='answerType291611' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291611[]' id='answer-id-1145968' class='answer   answerof-291611 ' value='1145968'   \/><label for='answer-id-1145968' id='answer-label-1145968' class=' answer'><span>Use Amazon RDS automated snapshots and use AWS Lambda to copy the snapshot into another Region<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291611[]' id='answer-id-1145969' class='answer   answerof-291611 ' value='1145969'   \/><label for='answer-id-1145969' id='answer-label-1145969' class=' answer'><span>Use Amazon RDS automated snapshots every 6 hours and use Amazon S3 cross-Region 
replication to copy the snapshot into another Region<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291611[]' id='answer-id-1145970' class='answer   answerof-291611 ' value='1145970'   \/><label for='answer-id-1145970' id='answer-label-1145970' class=' answer'><span>Create an AWS Lambda function to take an Amazon RDS snapshot every 6 hours and use a second Lambda function to copy the snapshot into another Region<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291611[]' id='answer-id-1145971' class='answer   answerof-291611 ' value='1145971'   \/><label for='answer-id-1145971' id='answer-label-1145971' class=' answer'><span>Create a cross-Region read replica for Amazon RDS in another Region and take an automated snapshot of the read replica<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-42' style=';'><div id='questionWrap-42'  class='   watupro-question-id-291612'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>42. <\/span>An Amazon RDS EBS-optimized instance with Provisioned IOPS (PIOPS) storage is using less than half of its allocated IOPS over the course of several hours under constant load. The RDS instance exhibits multi-second read and write latency, and uses all of its maximum bandwidth for read throughput, yet the instance uses less than half of its CPU and RAM resources. 
<br \/>\r<br>What should a Database Specialist do in this situation to increase performance and return latency to sub-second levels?<\/div><input type='hidden' name='question_id[]' id='qID_42' value='291612' \/><input type='hidden' id='answerType291612' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291612[]' id='answer-id-1145972' class='answer   answerof-291612 ' value='1145972'   \/><label for='answer-id-1145972' id='answer-label-1145972' class=' answer'><span>Increase the size of the DB instance storage<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291612[]' id='answer-id-1145973' class='answer   answerof-291612 ' value='1145973'   \/><label for='answer-id-1145973' id='answer-label-1145973' class=' answer'><span>Change the underlying EBS storage type to General Purpose SSD (gp2)<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291612[]' id='answer-id-1145974' class='answer   answerof-291612 ' value='1145974'   \/><label for='answer-id-1145974' id='answer-label-1145974' class=' answer'><span>Disable EBS optimization on the DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291612[]' id='answer-id-1145975' class='answer   answerof-291612 ' value='1145975'   \/><label for='answer-id-1145975' id='answer-label-1145975' class=' answer'><span>Change the DB instance to an instance class with a higher maximum bandwidth<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-43' style=';'><div id='questionWrap-43'  class='   watupro-question-id-291613'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>43. 
<\/span>After restoring an Amazon RDS snapshot from 3 days ago, a company\u2019s Development team cannot connect to the restored RDS DB instance. <br \/>\r<br>What is the likely cause of this problem?<\/div><input type='hidden' name='question_id[]' id='qID_43' value='291613' \/><input type='hidden' id='answerType291613' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291613[]' id='answer-id-1145976' class='answer   answerof-291613 ' value='1145976'   \/><label for='answer-id-1145976' id='answer-label-1145976' class=' answer'><span>The restored DB instance does not have Enhanced Monitoring enabled<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291613[]' id='answer-id-1145977' class='answer   answerof-291613 ' value='1145977'   \/><label for='answer-id-1145977' id='answer-label-1145977' class=' answer'><span>The production DB instance is using a custom parameter group<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291613[]' id='answer-id-1145978' class='answer   answerof-291613 ' value='1145978'   \/><label for='answer-id-1145978' id='answer-label-1145978' class=' answer'><span>The restored DB instance is using the default security group<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291613[]' id='answer-id-1145979' class='answer   answerof-291613 ' value='1145979'   \/><label for='answer-id-1145979' id='answer-label-1145979' class=' answer'><span>The production DB instance is using a custom option group<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-44' style=';'><div id='questionWrap-44'  class='   watupro-question-id-291614'>\n\t\t\t<div 
class='question-content'><div><span class='watupro_num'>44. <\/span>A gaming company has implemented a leaderboard in AWS using a Sorted Set data structure within Amazon ElastiCache for Redis. The ElastiCache cluster has been deployed with cluster mode disabled and has a replication group deployed with two additional replicas. The company is planning for a worldwide gaming event and is anticipating a higher write load than what the current cluster can handle. <br \/>\r<br>Which method should a Database Specialist use to scale the ElastiCache cluster ahead of the upcoming event?<\/div><input type='hidden' name='question_id[]' id='qID_44' value='291614' \/><input type='hidden' id='answerType291614' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291614[]' id='answer-id-1145980' class='answer   answerof-291614 ' value='1145980'   \/><label for='answer-id-1145980' id='answer-label-1145980' class=' answer'><span>Enable cluster mode on the existing ElastiCache cluster and configure separate shards for the Sorted Set across all nodes in the cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291614[]' id='answer-id-1145981' class='answer   answerof-291614 ' value='1145981'   \/><label for='answer-id-1145981' id='answer-label-1145981' class=' answer'><span>Increase the size of the ElastiCache cluster nodes to a larger instance size.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291614[]' id='answer-id-1145982' class='answer   answerof-291614 ' value='1145982'   \/><label for='answer-id-1145982' id='answer-label-1145982' class=' answer'><span>Create an additional ElastiCache cluster and load-balance traffic between the two clusters.<\/span><\/label><\/div><div class='watupro-question-choice  ' 
dir='auto' ><input type='radio' name='answer-291614[]' id='answer-id-1145983' class='answer   answerof-291614 ' value='1145983'   \/><label for='answer-id-1145983' id='answer-label-1145983' class=' answer'><span>Use the EXPIRE command and set a higher time to live (TTL) after each call to increment a given key.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-45' style=';'><div id='questionWrap-45'  class='   watupro-question-id-291615'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>45. <\/span>An ecommerce company has tasked a Database Specialist with creating a reporting dashboard that visualizes critical business metrics that will be pulled from the core production database running on Amazon Aurora. Data that is read by the dashboard should be available within 100 milliseconds of an update. The Database Specialist needs to review the current configuration of the Aurora DB cluster and develop a cost-effective solution. The solution needs to accommodate the unpredictable read workload from the reporting dashboard without any impact on the write availability and performance of the DB cluster. 
<br \/>\r<br>Which solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_45' value='291615' \/><input type='hidden' id='answerType291615' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291615[]' id='answer-id-1145984' class='answer   answerof-291615 ' value='1145984'   \/><label for='answer-id-1145984' id='answer-label-1145984' class=' answer'><span>Turn on the serverless option in the DB cluster so it can automatically scale based on demand.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291615[]' id='answer-id-1145985' class='answer   answerof-291615 ' value='1145985'   \/><label for='answer-id-1145985' id='answer-label-1145985' class=' answer'><span>Provision a clone of the existing DB cluster for the new Application team.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291615[]' id='answer-id-1145986' class='answer   answerof-291615 ' value='1145986'   \/><label for='answer-id-1145986' id='answer-label-1145986' class=' answer'><span>Create a separate DB cluster for the new workload, refresh from the source DB cluster, and set up ongoing replication using AWS DMS change data capture (CDC).<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291615[]' id='answer-id-1145987' class='answer   answerof-291615 ' value='1145987'   \/><label for='answer-id-1145987' id='answer-label-1145987' class=' answer'><span>Add an automatic scaling policy to the DB cluster to add Aurora Replicas to the cluster based on CPU consumption.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-46' style=';'><div id='questionWrap-46'  class='   
watupro-question-id-291616'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>46. <\/span>A retail company is about to migrate its online and mobile store to AWS. The company\u2019s CEO has strategic plans to grow the brand globally. A Database Specialist has been challenged to provide predictable read and write database performance with minimal operational overhead. <br \/>\r<br>What should the Database Specialist do to meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_46' value='291616' \/><input type='hidden' id='answerType291616' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291616[]' id='answer-id-1145988' class='answer   answerof-291616 ' value='1145988'   \/><label for='answer-id-1145988' id='answer-label-1145988' class=' answer'><span>Use Amazon DynamoDB global tables to synchronize transactions<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291616[]' id='answer-id-1145989' class='answer   answerof-291616 ' value='1145989'   \/><label for='answer-id-1145989' id='answer-label-1145989' class=' answer'><span>Use Amazon EMR to copy the orders table data across Regions<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291616[]' id='answer-id-1145990' class='answer   answerof-291616 ' value='1145990'   \/><label for='answer-id-1145990' id='answer-label-1145990' class=' answer'><span>Use Amazon Aurora Global Database to synchronize all transactions<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291616[]' id='answer-id-1145991' class='answer   answerof-291616 ' value='1145991'   \/><label for='answer-id-1145991' id='answer-label-1145991' class=' answer'><span>Use Amazon DynamoDB Streams to 
replicate all DynamoDB transactions and sync them<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-47' style=';'><div id='questionWrap-47'  class='   watupro-question-id-291617'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>47. <\/span>A company is closing one of its remote data centers. This site runs a 100 TB on-premises data warehouse solution. The company plans to use the AWS Schema Conversion Tool (AWS SCT) and AWS DMS for the migration to AWS. The site network bandwidth is 500 Mbps. A Database Specialist wants to migrate the on-premises data using Amazon S3 as the data lake and Amazon Redshift as the data warehouse. This move must take place during a 2-week period when source systems are shut down for maintenance. The data should stay encrypted at rest and in transit. <br \/>\r<br>Which approach has the least risk and the highest likelihood of a successful data transfer?<\/div><input type='hidden' name='question_id[]' id='qID_47' value='291617' \/><input type='hidden' id='answerType291617' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291617[]' id='answer-id-1145992' class='answer   answerof-291617 ' value='1145992'   \/><label for='answer-id-1145992' id='answer-label-1145992' class=' answer'><span>Set up a VPN tunnel for encrypting data over the network from the data center to AWS.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291617[]' id='answer-id-1145993' class='answer   answerof-291617 ' value='1145993'   \/><label for='answer-id-1145993' id='answer-label-1145993' class=' answer'><span>Leverage AWS SCT and apply the converted schema to Amazon Redshift. Once complete, start an AWS DMS task to move the data from the source to Amazon S3. 
Use AWS Glue to load the data from Amazon S3 to Amazon Redshift.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291617[]' id='answer-id-1145994' class='answer   answerof-291617 ' value='1145994'   \/><label for='answer-id-1145994' id='answer-label-1145994' class=' answer'><span>Leverage AWS SCT and apply the converted schema to Amazon Redshift. Start an AWS DMS task with two AWS Snowball Edge devices to copy data from on-premises to Amazon S3 with AWS KMS encryption. Use AWS DMS to finish copying data to Amazon Redshift.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291617[]' id='answer-id-1145995' class='answer   answerof-291617 ' value='1145995'   \/><label for='answer-id-1145995' id='answer-label-1145995' class=' answer'><span>Leverage AWS SCT and apply the converted schema to Amazon Redshift. Once complete, use a fleet of 10 TB dedicated encrypted drives using the AWS Import\/Export feature to copy data from on-premises to Amazon S3 with AWS KMS encryption. Use AWS Glue to load the data to Amazon Redshift.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291617[]' id='answer-id-1145996' class='answer   answerof-291617 ' value='1145996'   \/><label for='answer-id-1145996' id='answer-label-1145996' class=' answer'><span>Set up a VPN tunnel for encrypting data over the network from the data center to AWS.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291617[]' id='answer-id-1145997' class='answer   answerof-291617 ' value='1145997'   \/><label for='answer-id-1145997' id='answer-label-1145997' class=' answer'><span>Leverage a native database export feature to export the data and compress the files. Use the aws s3 cp multipart upload command to upload these files to Amazon S3 with AWS KMS encryption. 
Once complete, load the data to Amazon Redshift using AWS Glue.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-48' style=';'><div id='questionWrap-48'  class='   watupro-question-id-291618'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>48. <\/span>A company is looking to migrate a 1 TB Oracle database from on-premises to an Amazon Aurora PostgreSQL DB cluster. The company\u2019s Database Specialist discovered that the Oracle database is storing 100 GB of large binary objects (LOBs) across multiple tables. The Oracle database has a maximum LOB size of 500 MB with an average LOB size of 350 MB. The Database Specialist has chosen AWS DMS to migrate the data with the largest replication instances. <br \/>\r<br>How should the Database Specialist optimize the database migration using AWS DMS?<\/div><input type='hidden' name='question_id[]' id='qID_48' value='291618' \/><input type='hidden' id='answerType291618' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291618[]' id='answer-id-1145998' class='answer   answerof-291618 ' value='1145998'   \/><label for='answer-id-1145998' id='answer-label-1145998' class=' answer'><span>Create a single task using full LOB mode with a LOB chunk size of 500 MB to migrate the data and LOBs together<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291618[]' id='answer-id-1145999' class='answer   answerof-291618 ' value='1145999'   \/><label for='answer-id-1145999' id='answer-label-1145999' class=' answer'><span>Create two tasks: task1 with LOB tables using full LOB mode with a LOB chunk size of 500 MB and task2 without LOBs<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' 
name='answer-291618[]' id='answer-id-1146000' class='answer   answerof-291618 ' value='1146000'   \/><label for='answer-id-1146000' id='answer-label-1146000' class=' answer'><span>Create two tasks: task1 with LOB tables using limited LOB mode with a maximum LOB size of 500 MB and task 2 without LOBs<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291618[]' id='answer-id-1146001' class='answer   answerof-291618 ' value='1146001'   \/><label for='answer-id-1146001' id='answer-label-1146001' class=' answer'><span>Create a single task using limited LOB mode with a maximum LOB size of 500 MB to migrate data and LOBs together<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-49' style=';'><div id='questionWrap-49'  class='   watupro-question-id-291619'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>49. <\/span>A Database Specialist is designing a disaster recovery strategy for a production Amazon DynamoDB table. The table uses provisioned read\/write capacity mode, global secondary indexes, and time to live (TTL). The Database Specialist has restored the latest backup to a new table. <br \/>\r<br>To prepare the new table with identical settings, which steps should be performed? 
(Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_49' value='291619' \/><input type='hidden' id='answerType291619' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291619[]' id='answer-id-1146002' class='answer   answerof-291619 ' value='1146002'   \/><label for='answer-id-1146002' id='answer-label-1146002' class=' answer'><span>Re-create global secondary indexes in the new table<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291619[]' id='answer-id-1146003' class='answer   answerof-291619 ' value='1146003'   \/><label for='answer-id-1146003' id='answer-label-1146003' class=' answer'><span>Define IAM policies for access to the new table<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291619[]' id='answer-id-1146004' class='answer   answerof-291619 ' value='1146004'   \/><label for='answer-id-1146004' id='answer-label-1146004' class=' answer'><span>Define the TTL settings<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291619[]' id='answer-id-1146005' class='answer   answerof-291619 ' value='1146005'   \/><label for='answer-id-1146005' id='answer-label-1146005' class=' answer'><span>Encrypt the table from the AWS Management Console or use the update-table command<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291619[]' id='answer-id-1146006' class='answer   answerof-291619 ' value='1146006'   \/><label for='answer-id-1146006' id='answer-label-1146006' class=' answer'><span>Set the provisioned read and write capacity<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-50' 
style=';'><div id='questionWrap-50'  class='   watupro-question-id-291620'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>50. <\/span>A Database Specialist is creating Amazon DynamoDB tables, Amazon CloudWatch alarms, and associated infrastructure for an Application team using a development AWS account. The team wants a deployment method that will standardize the core solution components while managing environment-specific settings separately, and wants to minimize rework due to configuration errors. <br \/>\r<br>Which process should the Database Specialist recommend to meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_50' value='291620' \/><input type='hidden' id='answerType291620' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291620[]' id='answer-id-1146007' class='answer   answerof-291620 ' value='1146007'   \/><label for='answer-id-1146007' id='answer-label-1146007' class=' answer'><span>Organize common and environmental-specific parameters hierarchically in the AWS Systems Manager Parameter Store, then reference the parameters dynamically from an AWS CloudFormation template. Deploy the CloudFormation stack using the environment name as a parameter.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291620[]' id='answer-id-1146008' class='answer   answerof-291620 ' value='1146008'   \/><label for='answer-id-1146008' id='answer-label-1146008' class=' answer'><span>Create a parameterized AWS CloudFormation template that builds the required objects. Keep separate environment parameter files in separate Amazon S3 buckets. 
Provide an AWS CLI command that deploys the CloudFormation stack directly referencing the appropriate parameter bucket.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291620[]' id='answer-id-1146009' class='answer   answerof-291620 ' value='1146009'   \/><label for='answer-id-1146009' id='answer-label-1146009' class=' answer'><span>Create a parameterized AWS CloudFormation template that builds the required objects. Import the template into the CloudFormation interface in the AWS Management Console. Make the required changes to the parameters and deploy the CloudFormation stack.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291620[]' id='answer-id-1146010' class='answer   answerof-291620 ' value='1146010'   \/><label for='answer-id-1146010' id='answer-label-1146010' class=' answer'><span>Create an AWS Lambda function that builds the required objects using an AWS SDK<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291620[]' id='answer-id-1146011' class='answer   answerof-291620 ' value='1146011'   \/><label for='answer-id-1146011' id='answer-label-1146011' class=' answer'><span>Set the required parameter values in a test event in the Lambda console for each environment that the Application team can modify, as needed. Deploy the infrastructure by triggering the test event in the console.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-51' style=';'><div id='questionWrap-51'  class='   watupro-question-id-291621'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>51. <\/span>A company runs online transaction processing (OLTP) workloads on an Amazon RDS for PostgreSQL Multi-AZ DB instance. Tests were run on the database after work hours, which generated additional database logs.
The free storage of the RDS DB instance is low due to these additional logs. <br \/>\r<br>What should the company do to address this space constraint issue?<\/div><input type='hidden' name='question_id[]' id='qID_51' value='291621' \/><input type='hidden' id='answerType291621' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291621[]' id='answer-id-1146012' class='answer   answerof-291621 ' value='1146012'   \/><label for='answer-id-1146012' id='answer-label-1146012' class=' answer'><span>Log in to the host and run the rm $PGDATA\/pg_logs\/* command<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291621[]' id='answer-id-1146013' class='answer   answerof-291621 ' value='1146013'   \/><label for='answer-id-1146013' id='answer-label-1146013' class=' answer'><span>Modify the rds.log_retention_period parameter to 1440 and wait up to 24 hours for database logs to be deleted<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291621[]' id='answer-id-1146014' class='answer   answerof-291621 ' value='1146014'   \/><label for='answer-id-1146014' id='answer-label-1146014' class=' answer'><span>Create a ticket with AWS Support to have the logs deleted<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291621[]' id='answer-id-1146015' class='answer   answerof-291621 ' value='1146015'   \/><label for='answer-id-1146015' id='answer-label-1146015' class=' answer'><span>Run the SELECT rds_rotate_error_log() stored procedure to rotate the logs<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-52' style=';'><div id='questionWrap-52'  class='   watupro-question-id-291622'>\n\t\t\t<div 
class='question-content'><div><span class='watupro_num'>52. <\/span>A user has a non-relational key-value database. The user is looking for a fully managed AWS service that will offload the administrative burdens of operating and scaling distributed databases. The solution must be cost-effective and able to handle unpredictable application traffic. <br \/>\r<br>What should a Database Specialist recommend for this user?<\/div><input type='hidden' name='question_id[]' id='qID_52' value='291622' \/><input type='hidden' id='answerType291622' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291622[]' id='answer-id-1146016' class='answer   answerof-291622 ' value='1146016'   \/><label for='answer-id-1146016' id='answer-label-1146016' class=' answer'><span>Create an Amazon DynamoDB table with provisioned capacity mode<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291622[]' id='answer-id-1146017' class='answer   answerof-291622 ' value='1146017'   \/><label for='answer-id-1146017' id='answer-label-1146017' class=' answer'><span>Create an Amazon DocumentDB cluster<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291622[]' id='answer-id-1146018' class='answer   answerof-291622 ' value='1146018'   \/><label for='answer-id-1146018' id='answer-label-1146018' class=' answer'><span>Create an Amazon DynamoDB table with on-demand capacity mode<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291622[]' id='answer-id-1146019' class='answer   answerof-291622 ' value='1146019'   \/><label for='answer-id-1146019' id='answer-label-1146019' class=' answer'><span>Create an Amazon Aurora Serverless DB cluster<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end
questionWrap--><\/div><\/div><div class='watu-question ' id='question-53' style=';'><div id='questionWrap-53'  class='   watupro-question-id-291623'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>53. <\/span>A gaming company is designing a mobile gaming app that will be accessed by many users across the globe. The company wants to have replication and full support for multi-master writes. The company also wants to ensure low latency and consistent performance for app users. <br \/>\r<br>Which solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_53' value='291623' \/><input type='hidden' id='answerType291623' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291623[]' id='answer-id-1146020' class='answer   answerof-291623 ' value='1146020'   \/><label for='answer-id-1146020' id='answer-label-1146020' class=' answer'><span>Use Amazon DynamoDB global tables for storage and enable DynamoDB automatic scaling<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291623[]' id='answer-id-1146021' class='answer   answerof-291623 ' value='1146021'   \/><label for='answer-id-1146021' id='answer-label-1146021' class=' answer'><span>Use Amazon Aurora for storage and enable cross-Region Aurora Replicas<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291623[]' id='answer-id-1146022' class='answer   answerof-291623 ' value='1146022'   \/><label for='answer-id-1146022' id='answer-label-1146022' class=' answer'><span>Use Amazon Aurora for storage and cache the user content with Amazon ElastiCache<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291623[]' id='answer-id-1146023' class='answer   answerof-291623 ' 
value='1146023'   \/><label for='answer-id-1146023' id='answer-label-1146023' class=' answer'><span>Use Amazon Neptune for storage<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-54' style=';'><div id='questionWrap-54'  class='   watupro-question-id-291624'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>54. <\/span>A Database Specialist needs to speed up any failover that might occur on an Amazon Aurora PostgreSQL DB cluster. The Aurora DB cluster currently includes the primary instance and three Aurora Replicas. <br \/>\r<br>How can the Database Specialist ensure that failovers occur with the least amount of downtime for the application?<\/div><input type='hidden' name='question_id[]' id='qID_54' value='291624' \/><input type='hidden' id='answerType291624' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291624[]' id='answer-id-1146024' class='answer   answerof-291624 ' value='1146024'   \/><label for='answer-id-1146024' id='answer-label-1146024' class=' answer'><span>Set the TCP keepalive parameters low<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291624[]' id='answer-id-1146025' class='answer   answerof-291624 ' value='1146025'   \/><label for='answer-id-1146025' id='answer-label-1146025' class=' answer'><span>Call the AWS CLI failover-db-cluster command<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291624[]' id='answer-id-1146026' class='answer   answerof-291624 ' value='1146026'   \/><label for='answer-id-1146026' id='answer-label-1146026' class=' answer'><span>Enable Enhanced Monitoring on the DB cluster<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input 
type='radio' name='answer-291624[]' id='answer-id-1146027' class='answer   answerof-291624 ' value='1146027'   \/><label for='answer-id-1146027' id='answer-label-1146027' class=' answer'><span>Start a database activity stream on the DB cluster<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-55' style=';'><div id='questionWrap-55'  class='   watupro-question-id-291625'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>55. <\/span>A Database Specialist needs to define a database migration strategy to migrate an on-premises Oracle database to an Amazon Aurora MySQL DB cluster. The company requires near-zero downtime for the data migration. The solution must also be cost-effective. <br \/>\r<br>Which approach should the Database Specialist take?<\/div><input type='hidden' name='question_id[]' id='qID_55' value='291625' \/><input type='hidden' id='answerType291625' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291625[]' id='answer-id-1146028' class='answer   answerof-291625 ' value='1146028'   \/><label for='answer-id-1146028' id='answer-label-1146028' class=' answer'><span>Dump all the tables from the Oracle database into an Amazon S3 bucket using datapump (expdp). Run data transformations in AWS Glue. Load the data from the S3 bucket to the Aurora DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291625[]' id='answer-id-1146029' class='answer   answerof-291625 ' value='1146029'   \/><label for='answer-id-1146029' id='answer-label-1146029' class=' answer'><span>Order an AWS Snowball appliance and copy the Oracle backup to the Snowball appliance. Once the Snowball data is delivered to Amazon S3, create a new Aurora DB cluster. 
Enable the S3 integration to migrate the data directly from Amazon S3 to Amazon RDS<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291625[]' id='answer-id-1146030' class='answer   answerof-291625 ' value='1146030'   \/><label for='answer-id-1146030' id='answer-label-1146030' class=' answer'><span>Use the AWS Schema Conversion Tool (AWS SCT) to help rewrite database objects to MySQL during the schema migration. Use AWS DMS to perform the full load and change data capture (CDC) tasks.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291625[]' id='answer-id-1146031' class='answer   answerof-291625 ' value='1146031'   \/><label for='answer-id-1146031' id='answer-label-1146031' class=' answer'><span>Use AWS Server Migration Service (AWS SMS) to import the Oracle virtual machine image as an Amazon EC2 instance. Use the Oracle Logical Dump utility to migrate the Oracle data from Amazon EC2 to an Aurora DB cluster.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-56' style=';'><div id='questionWrap-56'  class='   watupro-question-id-291626'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>56. <\/span>A marketing company is using Amazon DocumentDB and requires that database audit logs be enabled. A Database Specialist needs to configure monitoring so that all data definition language (DDL) statements performed are visible to the Administrator. The Database Specialist has set the audit_logs parameter to enabled in the cluster parameter group.
<br \/>\r<br>What should the Database Specialist do to automatically collect the database logs for the Administrator?<\/div><input type='hidden' name='question_id[]' id='qID_56' value='291626' \/><input type='hidden' id='answerType291626' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291626[]' id='answer-id-1146032' class='answer   answerof-291626 ' value='1146032'   \/><label for='answer-id-1146032' id='answer-label-1146032' class=' answer'><span>Enable DocumentDB to export the logs to Amazon CloudWatch Logs<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291626[]' id='answer-id-1146033' class='answer   answerof-291626 ' value='1146033'   \/><label for='answer-id-1146033' id='answer-label-1146033' class=' answer'><span>Enable DocumentDB to export the logs to AWS CloudTrail<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291626[]' id='answer-id-1146034' class='answer   answerof-291626 ' value='1146034'   \/><label for='answer-id-1146034' id='answer-label-1146034' class=' answer'><span>Enable DocumentDB Events to export the logs to Amazon CloudWatch Logs<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291626[]' id='answer-id-1146035' class='answer   answerof-291626 ' value='1146035'   \/><label for='answer-id-1146035' id='answer-label-1146035' class=' answer'><span>Configure an AWS Lambda function to download the logs using the download-db-log-file-portion operation and store the logs in Amazon S3<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-57' style=';'><div id='questionWrap-57'  class='   watupro-question-id-291627'>\n\t\t\t<div 
class='question-content'><div><span class='watupro_num'>57. <\/span>A company is looking to move an on-premises IBM Db2 database running AIX on an IBM POWER7 server. Due to escalating support and maintenance costs, the company is exploring the option of moving the workload to an Amazon Aurora PostgreSQL DB cluster. <br \/>\r<br>What is the quickest way for the company to gather data on the migration compatibility?<\/div><input type='hidden' name='question_id[]' id='qID_57' value='291627' \/><input type='hidden' id='answerType291627' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291627[]' id='answer-id-1146036' class='answer   answerof-291627 ' value='1146036'   \/><label for='answer-id-1146036' id='answer-label-1146036' class=' answer'><span>Perform a logical dump from the Db2 database and restore it to an Aurora DB cluster. Identify the gaps and compatibility of the objects migrated by comparing row counts from source and target tables.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291627[]' id='answer-id-1146037' class='answer   answerof-291627 ' value='1146037'   \/><label for='answer-id-1146037' id='answer-label-1146037' class=' answer'><span>Run AWS DMS from the Db2 database to an Aurora DB cluster.
Identify the gaps and compatibility of the objects migrated by comparing the row counts from source and target tables.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291627[]' id='answer-id-1146038' class='answer   answerof-291627 ' value='1146038'   \/><label for='answer-id-1146038' id='answer-label-1146038' class=' answer'><span>Run native PostgreSQL logical replication from the Db2 database to an Aurora DB cluster to evaluate the migration compatibility.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291627[]' id='answer-id-1146039' class='answer   answerof-291627 ' value='1146039'   \/><label for='answer-id-1146039' id='answer-label-1146039' class=' answer'><span>Run the AWS Schema Conversion Tool (AWS SCT) from the Db2 database to an Aurora DB cluster. Create a migration assessment report to evaluate the migration compatibility.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-58' style=';'><div id='questionWrap-58'  class='   watupro-question-id-291628'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>58. <\/span>An ecommerce company is using Amazon DynamoDB as the backend for its order-processing application. The steady increase in the number of orders is resulting in increased DynamoDB costs. Order verification and reporting perform many repeated GetItem functions that pull similar datasets, and this read activity is contributing to the increased costs. The company wants to control these costs without significant development efforts.
<br \/>\r<br>How should a Database Specialist address these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_58' value='291628' \/><input type='hidden' id='answerType291628' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291628[]' id='answer-id-1146040' class='answer   answerof-291628 ' value='1146040'   \/><label for='answer-id-1146040' id='answer-label-1146040' class=' answer'><span>Use AWS DMS to migrate data from DynamoDB to Amazon DocumentDB<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291628[]' id='answer-id-1146041' class='answer   answerof-291628 ' value='1146041'   \/><label for='answer-id-1146041' id='answer-label-1146041' class=' answer'><span>Use Amazon DynamoDB Streams and Amazon Kinesis Data Firehose to push the data into Amazon Redshift<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291628[]' id='answer-id-1146042' class='answer   answerof-291628 ' value='1146042'   \/><label for='answer-id-1146042' id='answer-label-1146042' class=' answer'><span>Use an Amazon ElastiCache for Redis in front of DynamoDB to boost read performance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291628[]' id='answer-id-1146043' class='answer   answerof-291628 ' value='1146043'   \/><label for='answer-id-1146043' id='answer-label-1146043' class=' answer'><span>Use DynamoDB Accelerator to offload the reads<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-59' style=';'><div id='questionWrap-59'  class='   watupro-question-id-291629'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>59. 
<\/span>An IT consulting company wants to reduce costs when operating its development environment databases. The company\u2019s workflow creates multiple Amazon Aurora MySQL DB clusters for each development group. The Aurora DB clusters are only used for 8 hours a day. The DB clusters can then be deleted at the end of the development cycle, which lasts 2 weeks. <br \/>\r<br>Which of the following provides the MOST cost-effective solution?<\/div><input type='hidden' name='question_id[]' id='qID_59' value='291629' \/><input type='hidden' id='answerType291629' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291629[]' id='answer-id-1146044' class='answer   answerof-291629 ' value='1146044'   \/><label for='answer-id-1146044' id='answer-label-1146044' class=' answer'><span>Use AWS CloudFormation templates. Deploy a stack with the DB cluster for each development group. Delete the stack at the end of the development cycle.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291629[]' id='answer-id-1146045' class='answer   answerof-291629 ' value='1146045'   \/><label for='answer-id-1146045' id='answer-label-1146045' class=' answer'><span>Use the Aurora DB cloning feature. Deploy a single development and test Aurora DB instance, and create clone instances for the development groups. Delete the clones at the end of the development cycle.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291629[]' id='answer-id-1146046' class='answer   answerof-291629 ' value='1146046'   \/><label for='answer-id-1146046' id='answer-label-1146046' class=' answer'><span>Use Aurora Replicas. From the master automatic pause compute capacity option, create replicas for each development group, and promote each replica to master. 
Delete the replicas at the end of the development cycle.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291629[]' id='answer-id-1146047' class='answer   answerof-291629 ' value='1146047'   \/><label for='answer-id-1146047' id='answer-label-1146047' class=' answer'><span>Use Aurora Serverless. Restore the current Aurora snapshot and deploy to a serverless cluster for each development group. Enable the option to pause the compute capacity on the cluster and set an appropriate timeout.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-60' style=';'><div id='questionWrap-60'  class='   watupro-question-id-291630'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>60. <\/span>A company has multiple applications serving data from a secure on-premises database. The company is migrating all applications and databases to the AWS Cloud. The IT Risk and Compliance department requires that auditing be enabled on all secure databases to capture all logins, logouts, failed logins, permission changes, and database schema changes. A Database Specialist has recommended Amazon Aurora MySQL as the migration target, and leveraging the Advanced Auditing feature in Aurora. <br \/>\r<br>Which events need to be specified in the Advanced Auditing configuration to satisfy the minimum auditing requirements?
(Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_60' value='291630' \/><input type='hidden' id='answerType291630' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291630[]' id='answer-id-1146048' class='answer   answerof-291630 ' value='1146048'   \/><label for='answer-id-1146048' id='answer-label-1146048' class=' answer'><span>CONNECT<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291630[]' id='answer-id-1146049' class='answer   answerof-291630 ' value='1146049'   \/><label for='answer-id-1146049' id='answer-label-1146049' class=' answer'><span>QUERY_DCL<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291630[]' id='answer-id-1146050' class='answer   answerof-291630 ' value='1146050'   \/><label for='answer-id-1146050' id='answer-label-1146050' class=' answer'><span>QUERY_DDL<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291630[]' id='answer-id-1146051' class='answer   answerof-291630 ' value='1146051'   \/><label for='answer-id-1146051' id='answer-label-1146051' class=' answer'><span>QUERY_DML<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291630[]' id='answer-id-1146052' class='answer   answerof-291630 ' value='1146052'   \/><label for='answer-id-1146052' id='answer-label-1146052' class=' answer'><span>TABLE<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291630[]' id='answer-id-1146053' class='answer   answerof-291630 ' value='1146053'   \/><label for='answer-id-1146053' id='answer-label-1146053' class=' answer'><span>QUERY<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end 
questionWrap--><\/div><\/div><div class='watu-question ' id='question-61' style=';'><div id='questionWrap-61'  class='   watupro-question-id-291631'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>61. <\/span>A gaming company has recently acquired a successful iOS game, which is particularly popular during the holiday season. The company has decided to add a leaderboard to the game that uses Amazon DynamoDB. The application load is expected to ramp up over the holiday season. <br \/>\r<br>Which solution will meet these requirements at the lowest cost?<\/div><input type='hidden' name='question_id[]' id='qID_61' value='291631' \/><input type='hidden' id='answerType291631' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291631[]' id='answer-id-1146054' class='answer   answerof-291631 ' value='1146054'   \/><label for='answer-id-1146054' id='answer-label-1146054' class=' answer'><span>DynamoDB Streams<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291631[]' id='answer-id-1146055' class='answer   answerof-291631 ' value='1146055'   \/><label for='answer-id-1146055' id='answer-label-1146055' class=' answer'><span>DynamoDB with DynamoDB Accelerator<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291631[]' id='answer-id-1146056' class='answer   answerof-291631 ' value='1146056'   \/><label for='answer-id-1146056' id='answer-label-1146056' class=' answer'><span>DynamoDB with on-demand capacity mode<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291631[]' id='answer-id-1146057' class='answer   answerof-291631 ' value='1146057'   \/><label for='answer-id-1146057' id='answer-label-1146057' class=' answer'><span>DynamoDB with provisioned 
capacity mode with Auto Scaling<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-62' style=';'><div id='questionWrap-62'  class='   watupro-question-id-291632'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>62. <\/span>A company\u2019s Security department established new requirements that state internal users must connect to an existing Amazon RDS for SQL Server DB instance using their corporate Active Directory (AD) credentials. A Database Specialist must make the modifications needed to fulfill this requirement. <br \/>\r<br>Which combination of actions should the Database Specialist take? (Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_62' value='291632' \/><input type='hidden' id='answerType291632' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146058' class='answer   answerof-291632 ' value='1146058'   \/><label for='answer-id-1146058' id='answer-label-1146058' class=' answer'><span>Disable Transparent Data Encryption (TDE) on the RDS SQL Server DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146059' class='answer   answerof-291632 ' value='1146059'   \/><label for='answer-id-1146059' id='answer-label-1146059' class=' answer'><span>Modify the RDS SQL Server DB instance to use the directory for Windows authentication. 
Create appropriate new logins.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146060' class='answer   answerof-291632 ' value='1146060'   \/><label for='answer-id-1146060' id='answer-label-1146060' class=' answer'><span>Use the AWS Management Console to create an AWS Managed Microsoft AD<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146061' class='answer   answerof-291632 ' value='1146061'   \/><label for='answer-id-1146061' id='answer-label-1146061' class=' answer'><span>Create a trust relationship with the corporate AD<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146062' class='answer   answerof-291632 ' value='1146062'   \/><label for='answer-id-1146062' id='answer-label-1146062' class=' answer'><span>Stop the RDS SQL Server DB instance, modify it to use the directory for Windows authentication, and start it again. Create appropriate new logins.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146063' class='answer   answerof-291632 ' value='1146063'   \/><label for='answer-id-1146063' id='answer-label-1146063' class=' answer'><span>Use the AWS Management Console to create an AD Connector.
Create a trust relationship with the corporate AD.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291632[]' id='answer-id-1146064' class='answer   answerof-291632 ' value='1146064'   \/><label for='answer-id-1146064' id='answer-label-1146064' class=' answer'><span>Configure the AWS Managed Microsoft AD domain controller Security Group.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-63' style=';'><div id='questionWrap-63'  class='   watupro-question-id-291633'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>63. <\/span>A Database Specialist is performing a proof of concept with Amazon Aurora using a small instance to confirm a simple database behavior. When loading a large dataset and creating the index, the Database Specialist encounters the following error message from Aurora: <br \/>\r<br>ERROR: could not write block 7507718 of temporary file: No space left on device <br \/>\r<br>What is the cause of this error and what should the Database Specialist do to resolve this issue?<\/div><input type='hidden' name='question_id[]' id='qID_63' value='291633' \/><input type='hidden' id='answerType291633' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291633[]' id='answer-id-1146065' class='answer   answerof-291633 ' value='1146065'   \/><label for='answer-id-1146065' id='answer-label-1146065' class=' answer'><span>The scaling of Aurora storage cannot catch up with the data loading. 
The Database Specialist needs to modify the workload to load the data slowly.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291633[]' id='answer-id-1146066' class='answer   answerof-291633 ' value='1146066'   \/><label for='answer-id-1146066' id='answer-label-1146066' class=' answer'><span>The scaling of Aurora storage cannot catch up with the data loading. The Database Specialist needs to enable Aurora storage scaling.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291633[]' id='answer-id-1146067' class='answer   answerof-291633 ' value='1146067'   \/><label for='answer-id-1146067' id='answer-label-1146067' class=' answer'><span>The local storage used to store temporary tables is full. The Database Specialist needs to scale up the instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291633[]' id='answer-id-1146068' class='answer   answerof-291633 ' value='1146068'   \/><label for='answer-id-1146068' id='answer-label-1146068' class=' answer'><span>The local storage used to store temporary tables is full. The Database Specialist needs to enable local storage scaling.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-64' style=';'><div id='questionWrap-64'  class='   watupro-question-id-291634'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>64. <\/span>A financial company wants to store sensitive user data in an Amazon Aurora PostgreSQL DB cluster. The database will be accessed by multiple applications across the company. The company has mandated that all communications to the database be encrypted and the server identity must be validated. Any non-SSL-based connections should be disallowed access to the database. 
<br \/>\r<br>Which solution addresses these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_64' value='291634' \/><input type='hidden' id='answerType291634' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291634[]' id='answer-id-1146069' class='answer   answerof-291634 ' value='1146069'   \/><label for='answer-id-1146069' id='answer-label-1146069' class=' answer'><span>Set the rds.force_ssl=0 parameter in DB parameter groups. Download and use the Amazon RDS certificate bundle and configure the PostgreSQL connection string with sslmode=allow.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291634[]' id='answer-id-1146070' class='answer   answerof-291634 ' value='1146070'   \/><label for='answer-id-1146070' id='answer-label-1146070' class=' answer'><span>Set the rds.force_ssl=1 parameter in DB parameter groups. Download and use the Amazon RDS certificate bundle and configure the PostgreSQL connection string with sslmode=disable.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291634[]' id='answer-id-1146071' class='answer   answerof-291634 ' value='1146071'   \/><label for='answer-id-1146071' id='answer-label-1146071' class=' answer'><span>Set the rds.force_ssl=0 parameter in DB parameter groups. Download and use the Amazon RDS certificate bundle and configure the PostgreSQL connection string with sslmode=verify-ca.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291634[]' id='answer-id-1146072' class='answer   answerof-291634 ' value='1146072'   \/><label for='answer-id-1146072' id='answer-label-1146072' class=' answer'><span>Set the rds.force_ssl=1 parameter in DB parameter groups. 
Download and use the Amazon RDS certificate bundle and configure the PostgreSQL connection string with sslmode=verify-full.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-66' style=';'><div id='questionWrap-66'  class='   watupro-question-id-291636'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>66. <\/span>A company is using 5 TB Amazon RDS DB instances and needs to maintain 5 years of monthly database backups for compliance purposes. A Database Administrator must provide Auditors with data within 24 hours. 
<br \/>\r<br>Which solution will meet these requirements and is the MOST operationally efficient?<\/div><input type='hidden' name='question_id[]' id='qID_66' value='291636' \/><input type='hidden' id='answerType291636' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291636[]' id='answer-id-1146077' class='answer   answerof-291636 ' value='1146077'   \/><label for='answer-id-1146077' id='answer-label-1146077' class=' answer'><span>Create an AWS Lambda function to run on the first day of every month to take a manual RDS snapshot. Move the snapshot to the company\u2019s Amazon S3 bucket.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291636[]' id='answer-id-1146078' class='answer   answerof-291636 ' value='1146078'   \/><label for='answer-id-1146078' id='answer-label-1146078' class=' answer'><span>Create an AWS Lambda function to run on the first day of every month to take a manual RDS snapshot.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291636[]' id='answer-id-1146079' class='answer   answerof-291636 ' value='1146079'   \/><label for='answer-id-1146079' id='answer-label-1146079' class=' answer'><span>Create an RDS snapshot schedule from the AWS Management Console to take a snapshot every 30 days.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291636[]' id='answer-id-1146080' class='answer   answerof-291636 ' value='1146080'   \/><label for='answer-id-1146080' id='answer-label-1146080' class=' answer'><span>Create an AWS Lambda function to run on the first day of every month to create an automated RDS snapshot.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-67' 
style=';'><div id='questionWrap-67'  class='   watupro-question-id-291637'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>67. <\/span>A company wants to automate the creation of secure test databases with random credentials to be stored safely for later use. The credentials should have sufficient information about each test database to initiate a connection and perform automated credential rotations. The credentials should not be logged or stored anywhere in an unencrypted form. <br \/>\r<br>Which steps should a Database Specialist take to meet these requirements using an AWS CloudFormation template?<\/div><input type='hidden' name='question_id[]' id='qID_67' value='291637' \/><input type='hidden' id='answerType291637' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291637[]' id='answer-id-1146081' class='answer   answerof-291637 ' value='1146081'   \/><label for='answer-id-1146081' id='answer-label-1146081' class=' answer'><span>Create the database with the MasterUserName and MasterUserPassword properties set to the default values. Then, create the secret with the user name and password set to the same default values. Add a Secret Target Attachment resource with the SecretId and TargetId properties set to the Amazon Resource Names (ARNs) of the secret and the database. 
Finally, update the secret\u2019s password value with a randomly generated string set by the GenerateSecretString property.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291637[]' id='answer-id-1146082' class='answer   answerof-291637 ' value='1146082'   \/><label for='answer-id-1146082' id='answer-label-1146082' class=' answer'><span>Add a Mapping property from the database Amazon Resource Name (ARN) to the secret ARN.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291637[]' id='answer-id-1146083' class='answer   answerof-291637 ' value='1146083'   \/><label for='answer-id-1146083' id='answer-label-1146083' class=' answer'><span>Then, create the secret with a chosen user name and a randomly generated password set by the GenerateSecretString property. Add the database with the MasterUserName and MasterUserPassword properties set to the user name of the secret.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291637[]' id='answer-id-1146084' class='answer   answerof-291637 ' value='1146084'   \/><label for='answer-id-1146084' id='answer-label-1146084' class=' answer'><span>Add a resource of type AWS::SecretsManager::Secret and specify the GenerateSecretString property. Then, define the database user name in the SecureStringTemplate template. Create a resource for the database and reference the secret string for the MasterUserName and MasterUserPassword properties. 
Then, add a resource of type AWS::SecretsManager::SecretTargetAttachment with the SecretId and TargetId properties set to the Amazon Resource Names (ARNs) of the secret and the database.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291637[]' id='answer-id-1146085' class='answer   answerof-291637 ' value='1146085'   \/><label for='answer-id-1146085' id='answer-label-1146085' class=' answer'><span>Create the secret with a chosen user name and a randomly generated password set by the GenerateSecretString property. Add a SecretTargetAttachment resource with the SecretId property set to the Amazon Resource Name (ARN) of the secret and the TargetId property set to a parameter value matching the desired database ARN.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291637[]' id='answer-id-1146086' class='answer   answerof-291637 ' value='1146086'   \/><label for='answer-id-1146086' id='answer-label-1146086' class=' answer'><span>Then, create a database with the MasterUserName and MasterUserPassword properties set to the previously created values in the secret.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-68' style=';'><div id='questionWrap-68'  class='   watupro-question-id-291638'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>68. <\/span>A company is going to use an Amazon Aurora PostgreSQL DB cluster for an application backend. The DB cluster contains some tables with sensitive data. A Database Specialist needs to control the access privileges at the table level. 
<br \/>\r<br>How can the Database Specialist meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_68' value='291638' \/><input type='hidden' id='answerType291638' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291638[]' id='answer-id-1146087' class='answer   answerof-291638 ' value='1146087'   \/><label for='answer-id-1146087' id='answer-label-1146087' class=' answer'><span>Use AWS IAM database authentication and restrict access to the tables using an IAM policy.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291638[]' id='answer-id-1146088' class='answer   answerof-291638 ' value='1146088'   \/><label for='answer-id-1146088' id='answer-label-1146088' class=' answer'><span>Configure the rules in a NACL to restrict outbound traffic from the Aurora DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291638[]' id='answer-id-1146089' class='answer   answerof-291638 ' value='1146089'   \/><label for='answer-id-1146089' id='answer-label-1146089' class=' answer'><span>Execute GRANT and REVOKE commands that restrict access to the tables containing sensitive data.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291638[]' id='answer-id-1146090' class='answer   answerof-291638 ' value='1146090'   \/><label for='answer-id-1146090' id='answer-label-1146090' class=' answer'><span>Define access privileges to the tables containing sensitive data in the pg_hba.conf file.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-69' style=';'><div id='questionWrap-69'  class='   watupro-question-id-291639'>\n\t\t\t<div class='question-content'><div><span 
class='watupro_num'>69. <\/span>A Database Specialist is working with a company to launch a new website built on Amazon Aurora with several Aurora Replicas. This new website will replace an on-premises website connected to a legacy relational database. Due to stability issues in the legacy database, the company would like to test the resiliency of Aurora. <br \/>\r<br>Which action can the Database Specialist take to test the resiliency of the Aurora DB cluster?<\/div><input type='hidden' name='question_id[]' id='qID_69' value='291639' \/><input type='hidden' id='answerType291639' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291639[]' id='answer-id-1146091' class='answer   answerof-291639 ' value='1146091'   \/><label for='answer-id-1146091' id='answer-label-1146091' class=' answer'><span>Stop the DB cluster and analyze how the website responds<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291639[]' id='answer-id-1146092' class='answer   answerof-291639 ' value='1146092'   \/><label for='answer-id-1146092' id='answer-label-1146092' class=' answer'><span>Use Aurora fault injection to crash the master DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291639[]' id='answer-id-1146093' class='answer   answerof-291639 ' value='1146093'   \/><label for='answer-id-1146093' id='answer-label-1146093' class=' answer'><span>Remove the DB cluster endpoint to simulate a master DB instance failure<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291639[]' id='answer-id-1146094' class='answer   answerof-291639 ' value='1146094'   \/><label for='answer-id-1146094' id='answer-label-1146094' class=' answer'><span>Use Aurora Backtrack to crash the DB 
cluster<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-70' style=';'><div id='questionWrap-70'  class='   watupro-question-id-291640'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>70. <\/span>A company just migrated to Amazon Aurora PostgreSQL from an on-premises Oracle database. After the migration, the company discovered there is a period of time every day around 3:00 PM when the response time of the application is noticeably slower. The company has narrowed down the cause of this issue to the database and not the application. <br \/>\r<br>Which set of steps should the Database Specialist take to most efficiently find the problematic PostgreSQL query?<\/div><input type='hidden' name='question_id[]' id='qID_70' value='291640' \/><input type='hidden' id='answerType291640' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291640[]' id='answer-id-1146095' class='answer   answerof-291640 ' value='1146095'   \/><label for='answer-id-1146095' id='answer-label-1146095' class=' answer'><span>Create an Amazon CloudWatch dashboard to show the number of connections, CPU usage, and disk space consumption. 
Watch these dashboards during the next slow period.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291640[]' id='answer-id-1146096' class='answer   answerof-291640 ' value='1146096'   \/><label for='answer-id-1146096' id='answer-label-1146096' class=' answer'><span>Launch an Amazon EC2 instance, and install and configure an open-source PostgreSQL monitoring tool that will run reports based on the output error logs.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291640[]' id='answer-id-1146097' class='answer   answerof-291640 ' value='1146097'   \/><label for='answer-id-1146097' id='answer-label-1146097' class=' answer'><span>Modify the logging database parameter to log all the queries related to locking in the database and then check the logs after the next slow period for this information.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291640[]' id='answer-id-1146098' class='answer   answerof-291640 ' value='1146098'   \/><label for='answer-id-1146098' id='answer-label-1146098' class=' answer'><span>Enable Amazon RDS Performance Insights on the PostgreSQL database. Use the metrics to identify any queries that are related to spikes in the graph during the next slow period.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-71' style=';'><div id='questionWrap-71'  class='   watupro-question-id-291641'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>71. <\/span>A company has a web-based survey application that uses Amazon DynamoDB. During peak usage, when survey responses are being collected, a Database Specialist sees the ProvisionedThroughputExceededException error. <br \/>\r<br>What can the Database Specialist do to resolve this error? 
(Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_71' value='291641' \/><input type='hidden' id='answerType291641' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291641[]' id='answer-id-1146099' class='answer   answerof-291641 ' value='1146099'   \/><label for='answer-id-1146099' id='answer-label-1146099' class=' answer'><span>Change the table to use Amazon DynamoDB Streams<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291641[]' id='answer-id-1146100' class='answer   answerof-291641 ' value='1146100'   \/><label for='answer-id-1146100' id='answer-label-1146100' class=' answer'><span>Purchase DynamoDB reserved capacity in the affected Region<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291641[]' id='answer-id-1146101' class='answer   answerof-291641 ' value='1146101'   \/><label for='answer-id-1146101' id='answer-label-1146101' class=' answer'><span>Increase the write capacity units for the specific table<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291641[]' id='answer-id-1146102' class='answer   answerof-291641 ' value='1146102'   \/><label for='answer-id-1146102' id='answer-label-1146102' class=' answer'><span>Change the table capacity mode to on-demand<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291641[]' id='answer-id-1146103' class='answer   answerof-291641 ' value='1146103'   \/><label for='answer-id-1146103' id='answer-label-1146103' class=' answer'><span>Change the table type to throughput optimized<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-72' 
style=';'><div id='questionWrap-72'  class='   watupro-question-id-291642'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>72. <\/span>A company is running a two-tier ecommerce application in one AWS account. The database tier is deployed using an Amazon RDS for MySQL Multi-AZ DB instance. A Developer mistakenly deleted the database in the production environment. The database has been restored, but this resulted in hours of downtime and lost revenue. <br \/>\r<br>Which combination of changes in existing IAM policies should a Database Specialist make to prevent an error like this from happening in the future? (Choose three.)<\/div><input type='hidden' name='question_id[]' id='qID_72' value='291642' \/><input type='hidden' id='answerType291642' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291642[]' id='answer-id-1146104' class='answer   answerof-291642 ' value='1146104'   \/><label for='answer-id-1146104' id='answer-label-1146104' class=' answer'><span>Grant least privilege to groups, users, and roles<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291642[]' id='answer-id-1146105' class='answer   answerof-291642 ' value='1146105'   \/><label for='answer-id-1146105' id='answer-label-1146105' class=' answer'><span>Allow all users to restore a database from a backup that will reduce the overall downtime to restore the database<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291642[]' id='answer-id-1146106' class='answer   answerof-291642 ' value='1146106'   \/><label for='answer-id-1146106' id='answer-label-1146106' class=' answer'><span>Enable multi-factor authentication for sensitive operations to access sensitive resources and API operations<\/span><\/label><\/div><div 
class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291642[]' id='answer-id-1146107' class='answer   answerof-291642 ' value='1146107'   \/><label for='answer-id-1146107' id='answer-label-1146107' class=' answer'><span>Use policy conditions to restrict access to selective IP addresses<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291642[]' id='answer-id-1146108' class='answer   answerof-291642 ' value='1146108'   \/><label for='answer-id-1146108' id='answer-label-1146108' class=' answer'><span>Use AccessList Controls policy type to restrict users for database instance deletion<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291642[]' id='answer-id-1146109' class='answer   answerof-291642 ' value='1146109'   \/><label for='answer-id-1146109' id='answer-label-1146109' class=' answer'><span>Enable AWS CloudTrail logging and Enhanced Monitoring<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-73' style=';'><div id='questionWrap-73'  class='   watupro-question-id-291643'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>73. <\/span>A company is building a new web platform where user requests trigger an AWS Lambda function that performs an insert into an Amazon Aurora MySQL DB cluster. Initial tests with less than 10 users on the new platform yielded successful execution and fast response times. However, upon more extensive tests with the actual target of 3,000 concurrent users, Lambda functions are unable to connect to the DB cluster and receive too many connections errors. 
<br \/>\r<br>Which of the following will resolve this issue?<\/div><input type='hidden' name='question_id[]' id='qID_73' value='291643' \/><input type='hidden' id='answerType291643' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291643[]' id='answer-id-1146110' class='answer   answerof-291643 ' value='1146110'   \/><label for='answer-id-1146110' id='answer-label-1146110' class=' answer'><span>Edit the my.cnf file for the DB cluster to increase max_connections<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291643[]' id='answer-id-1146111' class='answer   answerof-291643 ' value='1146111'   \/><label for='answer-id-1146111' id='answer-label-1146111' class=' answer'><span>Increase the instance size of the DB cluster<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291643[]' id='answer-id-1146112' class='answer   answerof-291643 ' value='1146112'   \/><label for='answer-id-1146112' id='answer-label-1146112' class=' answer'><span>Change the DB cluster to Multi-AZ<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291643[]' id='answer-id-1146113' class='answer   answerof-291643 ' value='1146113'   \/><label for='answer-id-1146113' id='answer-label-1146113' class=' answer'><span>Increase the number of Aurora Replicas<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-74' style=';'><div id='questionWrap-74'  class='   watupro-question-id-291644'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>74. <\/span>A company is developing a multi-tier web application hosted on AWS using Amazon Aurora as the database. 
The application needs to be deployed to production and other non-production environments. A Database Specialist needs to specify different MasterUsername and MasterUserPassword properties in the AWS CloudFormation templates used for automated deployment. The CloudFormation templates are version controlled in the company\u2019s code repository. The company also needs to meet compliance requirements by routinely rotating its database master password for production. <br \/>\r<br>What is the most secure solution to store the master password?<\/div><input type='hidden' name='question_id[]' id='qID_74' value='291644' \/><input type='hidden' id='answerType291644' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291644[]' id='answer-id-1146114' class='answer   answerof-291644 ' value='1146114'   \/><label for='answer-id-1146114' id='answer-label-1146114' class=' answer'><span>Store the master password in a parameter file in each environment. Reference the environment-specific parameter file in the CloudFormation template.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291644[]' id='answer-id-1146115' class='answer   answerof-291644 ' value='1146115'   \/><label for='answer-id-1146115' id='answer-label-1146115' class=' answer'><span>Encrypt the master password using an AWS KMS key. 
Store the encrypted master password in the CloudFormation template.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291644[]' id='answer-id-1146116' class='answer   answerof-291644 ' value='1146116'   \/><label for='answer-id-1146116' id='answer-label-1146116' class=' answer'><span>Use the secretsmanager dynamic reference to retrieve the master password stored in AWS Secrets Manager and enable automatic rotation.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291644[]' id='answer-id-1146117' class='answer   answerof-291644 ' value='1146117'   \/><label for='answer-id-1146117' id='answer-label-1146117' class=' answer'><span>Use the ssm dynamic reference to retrieve the master password stored in the AWS Systems Manager Parameter Store and enable automatic rotation.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-75' style=';'><div id='questionWrap-75'  class='   watupro-question-id-291645'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>75. <\/span>A company is writing a new survey application to be used with a weekly televised game show. The application will be available for 2 hours each week. The company expects to receive over 500,000 entries every week, with each survey asking 2-3 multiple choice questions of each user. A Database Specialist needs to select a platform that is highly scalable for a large number of concurrent writes to handle the anticipated volume. <br \/>\r<br>Which AWS services should the Database Specialist consider? 
(Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_75' value='291645' \/><input type='hidden' id='answerType291645' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291645[]' id='answer-id-1146118' class='answer   answerof-291645 ' value='1146118'   \/><label for='answer-id-1146118' id='answer-label-1146118' class=' answer'><span>Amazon DynamoDB<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291645[]' id='answer-id-1146119' class='answer   answerof-291645 ' value='1146119'   \/><label for='answer-id-1146119' id='answer-label-1146119' class=' answer'><span>Amazon Redshift<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291645[]' id='answer-id-1146120' class='answer   answerof-291645 ' value='1146120'   \/><label for='answer-id-1146120' id='answer-label-1146120' class=' answer'><span>Amazon Neptune<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291645[]' id='answer-id-1146121' class='answer   answerof-291645 ' value='1146121'   \/><label for='answer-id-1146121' id='answer-label-1146121' class=' answer'><span>Amazon Elasticsearch Service<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291645[]' id='answer-id-1146122' class='answer   answerof-291645 ' value='1146122'   \/><label for='answer-id-1146122' id='answer-label-1146122' class=' answer'><span>Amazon ElastiCache<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-76' style=';'><div id='questionWrap-76'  class='   watupro-question-id-291646'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>76. 
<\/span>A company has migrated a single MySQL database to Amazon Aurora. The production data is hosted in a DB cluster in VPC_PROD, and 12 testing environments are hosted in VPC_TEST using the same AWS account. Testing results in minimal changes to the test data. The Development team wants each environment refreshed nightly so each test database contains fresh production data every day. <br \/>\r<br>Which migration approach will be the fastest and most cost-effective to implement?<\/div><input type='hidden' name='question_id[]' id='qID_76' value='291646' \/><input type='hidden' id='answerType291646' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146123' class='answer   answerof-291646 ' value='1146123'   \/><label for='answer-id-1146123' id='answer-label-1146123' class=' answer'><span>Run the master in Amazon Aurora MySQL<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146124' class='answer   answerof-291646 ' value='1146124'   \/><label for='answer-id-1146124' id='answer-label-1146124' class=' answer'><span>Create 12 clones in VPC_TEST, and script the clones to be deleted and re-created nightly.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146125' class='answer   answerof-291646 ' value='1146125'   \/><label for='answer-id-1146125' id='answer-label-1146125' class=' answer'><span>Run the master in Amazon Aurora MySQL<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146126' class='answer   answerof-291646 ' value='1146126'   \/><label for='answer-id-1146126' id='answer-label-1146126' class=' answer'><span>Take a nightly snapshot, and restore it into 12 
databases in VPC_TEST using Aurora Serverless.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146127' class='answer   answerof-291646 ' value='1146127'   \/><label for='answer-id-1146127' id='answer-label-1146127' class=' answer'><span>Run the master in Amazon Aurora MySQL<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146128' class='answer   answerof-291646 ' value='1146128'   \/><label for='answer-id-1146128' id='answer-label-1146128' class=' answer'><span>Create 12 Aurora Replicas in VPC_TEST, and script the replicas to be deleted and re-created nightly.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291646[]' id='answer-id-1146129' class='answer   answerof-291646 ' value='1146129'   \/><label for='answer-id-1146129' id='answer-label-1146129' class=' answer'><span>Run the master in Amazon Aurora MySQL using Aurora Serverless. Create 12 clones in VPC_TEST, and script the clones to be deleted and re-created nightly.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-77' style=';'><div id='questionWrap-77'  class='   watupro-question-id-291647'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>77. <\/span>A large ecommerce company uses Amazon DynamoDB to handle the transactions on its web portal. Traffic patterns throughout the year are usually stable; however, a large event is planned. The company knows that traffic will increase by up to 10 times the normal load over the 3-day event. When sale prices are published during the event, traffic will spike rapidly. 
<br \/>\r<br>How should a Database Specialist ensure DynamoDB can handle the increased traffic?<\/div><input type='hidden' name='question_id[]' id='qID_77' value='291647' \/><input type='hidden' id='answerType291647' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291647[]' id='answer-id-1146130' class='answer   answerof-291647 ' value='1146130'   \/><label for='answer-id-1146130' id='answer-label-1146130' class=' answer'><span>Ensure the table is always provisioned to meet peak needs<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291647[]' id='answer-id-1146131' class='answer   answerof-291647 ' value='1146131'   \/><label for='answer-id-1146131' id='answer-label-1146131' class=' answer'><span>Allow burst capacity to handle the additional load<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291647[]' id='answer-id-1146132' class='answer   answerof-291647 ' value='1146132'   \/><label for='answer-id-1146132' id='answer-label-1146132' class=' answer'><span>Set an AWS Application Auto Scaling policy for the table to handle the increase in traffic<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291647[]' id='answer-id-1146133' class='answer   answerof-291647 ' value='1146133'   \/><label for='answer-id-1146133' id='answer-label-1146133' class=' answer'><span>Preprovision additional capacity for the known peaks and then reduce the capacity after the event<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-78' style=';'><div id='questionWrap-78'  class='   watupro-question-id-291648'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>78. 
<\/span>A Database Specialist is migrating an on-premises Microsoft SQL Server application database to Amazon RDS for PostgreSQL using AWS DMS. The application requires minimal downtime when the RDS DB instance goes live. <br \/>\r<br>What change should the Database Specialist make to enable the migration?<\/div><input type='hidden' name='question_id[]' id='qID_78' value='291648' \/><input type='hidden' id='answerType291648' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291648[]' id='answer-id-1146134' class='answer   answerof-291648 ' value='1146134'   \/><label for='answer-id-1146134' id='answer-label-1146134' class=' answer'><span>Configure the on-premises application database to act as a source for an AWS DMS full load with ongoing change data capture (CDC)<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291648[]' id='answer-id-1146135' class='answer   answerof-291648 ' value='1146135'   \/><label for='answer-id-1146135' id='answer-label-1146135' class=' answer'><span>Configure the AWS DMS replication instance to allow both full load and ongoing change data capture (CDC)<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291648[]' id='answer-id-1146136' class='answer   answerof-291648 ' value='1146136'   \/><label for='answer-id-1146136' id='answer-label-1146136' class=' answer'><span>Configure the AWS DMS task to generate full logs to allow for ongoing change data capture (CDC)<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291648[]' id='answer-id-1146137' class='answer   answerof-291648 ' value='1146137'   \/><label for='answer-id-1146137' id='answer-label-1146137' class=' answer'><span>Configure the AWS DMS connections to allow two-way 
communication to allow for ongoing change data capture (CDC)<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-79' style=';'><div id='questionWrap-79'  class='   watupro-question-id-291649'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>79. <\/span>A financial company has allocated an Amazon RDS MariaDB DB instance with large storage capacity to accommodate migration efforts. Post-migration, the company purged unwanted data from the instance. The company now wants to downsize storage to save money. The solution must have the least impact on production and near-zero downtime. <br \/>\r<br>Which solution would meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_79' value='291649' \/><input type='hidden' id='answerType291649' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291649[]' id='answer-id-1146138' class='answer   answerof-291649 ' value='1146138'   \/><label for='answer-id-1146138' id='answer-label-1146138' class=' answer'><span>Create a snapshot of the old databases and restore the snapshot with the required storage<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291649[]' id='answer-id-1146139' class='answer   answerof-291649 ' value='1146139'   \/><label for='answer-id-1146139' id='answer-label-1146139' class=' answer'><span>Create a new RDS DB instance with the required storage and move the databases from the old instances to the new instance using AWS DMS<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291649[]' id='answer-id-1146140' class='answer   answerof-291649 ' value='1146140'   \/><label for='answer-id-1146140' id='answer-label-1146140' class=' 
answer'><span>Create a new database using native backup and restore<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291649[]' id='answer-id-1146141' class='answer   answerof-291649 ' value='1146141'   \/><label for='answer-id-1146141' id='answer-label-1146141' class=' answer'><span>Create a new read replica and make it the primary by terminating the existing primary<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-80' style=';'><div id='questionWrap-80'  class='   watupro-question-id-291650'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>80. <\/span>A large financial services company requires that all data be encrypted in transit. A Developer is attempting to connect to an Amazon RDS DB instance using the company VPC for the first time with credentials provided by a Database Specialist. Other members of the Development team can connect, but this user is consistently receiving an error indicating a communications link failure. The Developer asked the Database Specialist to reset the password a number of times, but the error persists. 
<br \/>\r<br>Which step should be taken to troubleshoot this issue?<\/div><input type='hidden' name='question_id[]' id='qID_80' value='291650' \/><input type='hidden' id='answerType291650' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291650[]' id='answer-id-1146142' class='answer   answerof-291650 ' value='1146142'   \/><label for='answer-id-1146142' id='answer-label-1146142' class=' answer'><span>Ensure that the database option group for the RDS DB instance allows ingress from the Developer machine\u2019s IP address<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291650[]' id='answer-id-1146143' class='answer   answerof-291650 ' value='1146143'   \/><label for='answer-id-1146143' id='answer-label-1146143' class=' answer'><span>Ensure that the RDS DB instance\u2019s subnet group includes a public subnet to allow the Developer to connect<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291650[]' id='answer-id-1146144' class='answer   answerof-291650 ' value='1146144'   \/><label for='answer-id-1146144' id='answer-label-1146144' class=' answer'><span>Ensure that the RDS DB instance has not reached its maximum connections limit<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291650[]' id='answer-id-1146145' class='answer   answerof-291650 ' value='1146145'   \/><label for='answer-id-1146145' id='answer-label-1146145' class=' answer'><span>Ensure that the connection is using SSL and is addressing the port where the RDS DB instance is listening for encrypted connections<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-81' style=';'><div id='questionWrap-81'  class='   
watupro-question-id-291651'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>81. <\/span>A company is running Amazon RDS for MySQL for its workloads. There is downtime when AWS operating system patches are applied during the Amazon RDS-specified maintenance window. <br \/>\r<br>What is the MOST cost-effective action that should be taken to avoid downtime?<\/div><input type='hidden' name='question_id[]' id='qID_81' value='291651' \/><input type='hidden' id='answerType291651' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291651[]' id='answer-id-1146146' class='answer   answerof-291651 ' value='1146146'   \/><label for='answer-id-1146146' id='answer-label-1146146' class=' answer'><span>Migrate the workloads from Amazon RDS for MySQL to Amazon DynamoDB<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291651[]' id='answer-id-1146147' class='answer   answerof-291651 ' value='1146147'   \/><label for='answer-id-1146147' id='answer-label-1146147' class=' answer'><span>Enable cross-Region read replicas and direct read traffic to them when Amazon RDS is down<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291651[]' id='answer-id-1146148' class='answer   answerof-291651 ' value='1146148'   \/><label for='answer-id-1146148' id='answer-label-1146148' class=' answer'><span>Enable a read replica and direct read traffic to it when Amazon RDS is down<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291651[]' id='answer-id-1146149' class='answer   answerof-291651 ' value='1146149'   \/><label for='answer-id-1146149' id='answer-label-1146149' class=' answer'><span>Enable an Amazon RDS for MySQL Multi-AZ configuration<\/span><\/label><\/div><!-- end 
question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-82' style=';'><div id='questionWrap-82'  class='   watupro-question-id-291652'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>82. <\/span>A Database Specialist must create a read replica to isolate read-only queries for an Amazon RDS for MySQL DB instance. Immediately after creating the read replica, users that query it report slow response times. <br \/>\r<br>What could be causing these slow response times?<\/div><input type='hidden' name='question_id[]' id='qID_82' value='291652' \/><input type='hidden' id='answerType291652' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291652[]' id='answer-id-1146150' class='answer   answerof-291652 ' value='1146150'   \/><label for='answer-id-1146150' id='answer-label-1146150' class=' answer'><span>New volumes created from snapshots load lazily in the background<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291652[]' id='answer-id-1146151' class='answer   answerof-291652 ' value='1146151'   \/><label for='answer-id-1146151' id='answer-label-1146151' class=' answer'><span>Long-running statements on the master<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291652[]' id='answer-id-1146152' class='answer   answerof-291652 ' value='1146152'   \/><label for='answer-id-1146152' id='answer-label-1146152' class=' answer'><span>Insufficient resources on the master<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291652[]' id='answer-id-1146153' class='answer   answerof-291652 ' value='1146153'   \/><label for='answer-id-1146153' id='answer-label-1146153' class=' answer'><span>Overload of a single 
replication thread by excessive writes on the master<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-83' style=';'><div id='questionWrap-83'  class='   watupro-question-id-291653'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>83. <\/span>A company developed an AWS CloudFormation template used to create all new Amazon DynamoDB tables in its AWS account. The template configures provisioned throughput capacity using hard-coded values. The company wants to change the template so that the tables it creates in the future have independently configurable read and write capacity units assigned. <br \/>\r<br>Which solution will enable this change?<\/div><input type='hidden' name='question_id[]' id='qID_83' value='291653' \/><input type='hidden' id='answerType291653' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291653[]' id='answer-id-1146154' class='answer   answerof-291653 ' value='1146154'   \/><label for='answer-id-1146154' id='answer-label-1146154' class=' answer'><span>Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. \r\nConfigure DynamoDB to provision throughput capacity using the stack\u2019s mappings.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291653[]' id='answer-id-1146155' class='answer   answerof-291653 ' value='1146155'   \/><label for='answer-id-1146155' id='answer-label-1146155' class=' answer'><span>Add values for two Number parameters, rcuCount and wcuCount, to the template. 
Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291653[]' id='answer-id-1146156' class='answer   answerof-291653 ' value='1146156'   \/><label for='answer-id-1146156' id='answer-label-1146156' class=' answer'><span>Add values for the rcuCount and wcuCount parameters as outputs of the template. Configure DynamoDB to provision throughput capacity using the stack outputs.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291653[]' id='answer-id-1146157' class='answer   answerof-291653 ' value='1146157'   \/><label for='answer-id-1146157' id='answer-label-1146157' class=' answer'><span>Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. \r\nReplace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-84' style=';'><div id='questionWrap-84'  class='   watupro-question-id-291654'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>84. <\/span>A retail company with its main office in New York and another office in Tokyo plans to build a database solution on AWS. The company\u2019s main workload consists of a mission-critical application that updates its application data in a data store. The team at the Tokyo office is building dashboards with complex analytical queries using the application data. The dashboards will be used to make buying decisions, so they need to have access to the application data in less than 1 second. 
<br \/>\r<br>Which solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_84' value='291654' \/><input type='hidden' id='answerType291654' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291654[]' id='answer-id-1146158' class='answer   answerof-291654 ' value='1146158'   \/><label for='answer-id-1146158' id='answer-label-1146158' class=' answer'><span>Use an Amazon RDS DB instance deployed in the us-east-1 Region with a read replica instance in the ap- northeast-1 Region. Create an Amazon ElastiCache cluster in the ap-northeast-1 Region to cache application data from the replica to generate the dashboards.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291654[]' id='answer-id-1146159' class='answer   answerof-291654 ' value='1146159'   \/><label for='answer-id-1146159' id='answer-label-1146159' class=' answer'><span>Use an Amazon DynamoDB global table in the us-east-1 Region with replication into the ap-northeast-1 Region. Use Amazon QuickSight for displaying dashboard results.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291654[]' id='answer-id-1146160' class='answer   answerof-291654 ' value='1146160'   \/><label for='answer-id-1146160' id='answer-label-1146160' class=' answer'><span>Use an Amazon RDS for MySQL DB instance deployed in the us-east-1 Region with a read replica instance in the ap-northeast-1 Region. 
Have the dashboard application read from the read replica.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291654[]' id='answer-id-1146161' class='answer   answerof-291654 ' value='1146161'   \/><label for='answer-id-1146161' id='answer-label-1146161' class=' answer'><span>Use an Amazon Aurora global database. Deploy the writer instance in the us-east-1 Region and the replica in the ap-northeast-1 Region. Have the dashboard application read from the replica in the ap-northeast-1 Region.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-85' style=';'><div id='questionWrap-85'  class='   watupro-question-id-291655'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>85. <\/span>A company is using Amazon RDS for PostgreSQL. The Security team wants all database connection requests to be logged and retained for 180 days. The RDS for PostgreSQL DB instance is currently using the default parameter group. A Database Specialist has identified that setting the log_connections parameter to 1 will enable connections logging. <br \/>\r<br>Which combination of steps should the Database Specialist take to meet the logging and retention requirements? 
(Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_85' value='291655' \/><input type='hidden' id='answerType291655' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291655[]' id='answer-id-1146162' class='answer   answerof-291655 ' value='1146162'   \/><label for='answer-id-1146162' id='answer-label-1146162' class=' answer'><span>Update the log_connections parameter in the default parameter group<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291655[]' id='answer-id-1146163' class='answer   answerof-291655 ' value='1146163'   \/><label for='answer-id-1146163' id='answer-label-1146163' class=' answer'><span>Create a custom parameter group, update the log_connections parameter, and associate the parameter with the DB instance<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291655[]' id='answer-id-1146164' class='answer   answerof-291655 ' value='1146164'   \/><label for='answer-id-1146164' id='answer-label-1146164' class=' answer'><span>Enable publishing of database engine logs to Amazon CloudWatch Logs and set the event expiration to 180 days<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291655[]' id='answer-id-1146165' class='answer   answerof-291655 ' value='1146165'   \/><label for='answer-id-1146165' id='answer-label-1146165' class=' answer'><span>Enable publishing of database engine logs to an Amazon S3 bucket and set the lifecycle policy to 180 days<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291655[]' id='answer-id-1146166' class='answer   answerof-291655 ' value='1146166'   \/><label for='answer-id-1146166' id='answer-label-1146166' class=' 
answer'><span>Connect to the RDS PostgreSQL host and update the log_connections parameter in the postgresql.conf file<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-86' style=';'><div id='questionWrap-86'  class='   watupro-question-id-291656'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>86. <\/span>A Database Specialist is creating a new Amazon Neptune DB cluster, and is attempting to load data from Amazon S3 into the Neptune DB cluster using the Neptune bulk loader API. The Database Specialist receives the following error: <br \/>\r<br>\u201cUnable to connect to s3 endpoint. Provided source = s3:\/\/mybucket\/graphdata\/ and region = us-east-1. Please verify your S3 configuration.\u201d <br \/>\r<br>Which combination of actions should the Database Specialist take to troubleshoot the problem? (Choose two.)<\/div><input type='hidden' name='question_id[]' id='qID_86' value='291656' \/><input type='hidden' id='answerType291656' value='checkbox'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291656[]' id='answer-id-1146167' class='answer   answerof-291656 ' value='1146167'   \/><label for='answer-id-1146167' id='answer-label-1146167' class=' answer'><span>Check that Amazon S3 has an IAM role granting read access to Neptune<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291656[]' id='answer-id-1146168' class='answer   answerof-291656 ' value='1146168'   \/><label for='answer-id-1146168' id='answer-label-1146168' class=' answer'><span>Check that an Amazon S3 VPC endpoint exists<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291656[]' id='answer-id-1146169' class='answer   answerof-291656 ' value='1146169'   
\/><label for='answer-id-1146169' id='answer-label-1146169' class=' answer'><span>Check that a Neptune VPC endpoint exists<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291656[]' id='answer-id-1146170' class='answer   answerof-291656 ' value='1146170'   \/><label for='answer-id-1146170' id='answer-label-1146170' class=' answer'><span>Check that Amazon EC2 has an IAM role granting read access to Amazon S3<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='checkbox' name='answer-291656[]' id='answer-id-1146171' class='answer   answerof-291656 ' value='1146171'   \/><label for='answer-id-1146171' id='answer-label-1146171' class=' answer'><span>Check that Neptune has an IAM role granting read access to Amazon S3<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-87' style=';'><div id='questionWrap-87'  class='   watupro-question-id-291657'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>87. <\/span>A database specialist manages a critical Amazon RDS for MySQL DB instance for a company. The data stored daily could vary from .01% to 10% of the current database size. The database specialist needs to ensure that the DB instance storage grows as needed. 
<br \/>\r<br>What is the MOST operationally efficient and cost-effective solution?<\/div><input type='hidden' name='question_id[]' id='qID_87' value='291657' \/><input type='hidden' id='answerType291657' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291657[]' id='answer-id-1146172' class='answer   answerof-291657 ' value='1146172'   \/><label for='answer-id-1146172' id='answer-label-1146172' class=' answer'><span>Configure RDS Storage Auto Scaling.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291657[]' id='answer-id-1146173' class='answer   answerof-291657 ' value='1146173'   \/><label for='answer-id-1146173' id='answer-label-1146173' class=' answer'><span>Configure RDS instance Auto Scaling.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291657[]' id='answer-id-1146174' class='answer   answerof-291657 ' value='1146174'   \/><label for='answer-id-1146174' id='answer-label-1146174' class=' answer'><span>Modify the DB instance allocated storage to meet the forecasted requirements.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291657[]' id='answer-id-1146175' class='answer   answerof-291657 ' value='1146175'   \/><label for='answer-id-1146175' id='answer-label-1146175' class=' answer'><span>Monitor the Amazon CloudWatch FreeStorageSpace metric daily and add storage as required.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-88' style=';'><div id='questionWrap-88'  class='   watupro-question-id-291658'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>88. <\/span>A company is due to renew its database license. 
The company wants to migrate its 80 TB transactional database system from on-premises to the AWS Cloud. The migration should incur the least possible downtime on the downstream database applications. The company\u2019s network infrastructure has limited network bandwidth that is shared with other applications. <br \/>\r<br>Which solution should a database specialist use for a timely migration?<\/div><input type='hidden' name='question_id[]' id='qID_88' value='291658' \/><input type='hidden' id='answerType291658' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291658[]' id='answer-id-1146176' class='answer   answerof-291658 ' value='1146176'   \/><label for='answer-id-1146176' id='answer-label-1146176' class=' answer'><span>Perform a full backup of the source database to AWS Snowball Edge appliances and ship them to be loaded to Amazon S3. Use AWS DMS to migrate change data capture (CDC) data from the source database to Amazon S3. Use a second AWS DMS task to migrate all the S3 data to the target database.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291658[]' id='answer-id-1146177' class='answer   answerof-291658 ' value='1146177'   \/><label for='answer-id-1146177' id='answer-label-1146177' class=' answer'><span>Perform a full backup of the source database to AWS Snowball Edge appliances and ship them to be loaded to Amazon S3. 
Periodically perform incremental backups of the source database to be shipped in another Snowball Edge appliance to handle syncing change data capture (CDC) data from the source to the target database.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291658[]' id='answer-id-1146178' class='answer   answerof-291658 ' value='1146178'   \/><label for='answer-id-1146178' id='answer-label-1146178' class=' answer'><span>Use AWS DMS to migrate the full load of the source database over a VPN tunnel using the internet for its primary connection. Allow AWS DMS to handle syncing change data capture (CDC) data from the source to the target database.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291658[]' id='answer-id-1146179' class='answer   answerof-291658 ' value='1146179'   \/><label for='answer-id-1146179' id='answer-label-1146179' class=' answer'><span>Use the AWS Schema Conversion Tool (AWS SCT) to migrate the full load of the source database over a VPN tunnel using the internet for its primary connection. Allow AWS SCT to handle syncing change data capture (CDC) data from the source to the target database.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-89' style=';'><div id='questionWrap-89'  class='   watupro-question-id-291659'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>89. <\/span>A database specialist is responsible for an Amazon RDS for MySQL DB instance with one read replica. The DB instance and the read replica are assigned to the default parameter group. The database team currently runs test queries against a read replica. The database team wants to create additional tables in the read replica that will only be accessible from the read replica to benefit the tests. 
<br \/>\r<br>What should the database specialist do to allow the database team to create the test tables?<\/div><input type='hidden' name='question_id[]' id='qID_89' value='291659' \/><input type='hidden' id='answerType291659' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291659[]' id='answer-id-1146180' class='answer   answerof-291659 ' value='1146180'   \/><label for='answer-id-1146180' id='answer-label-1146180' class=' answer'><span>Contact AWS Support to disable read-only mode on the read replica. Reboot the read replica. \r\nConnect to the read replica and create the tables.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291659[]' id='answer-id-1146181' class='answer   answerof-291659 ' value='1146181'   \/><label for='answer-id-1146181' id='answer-label-1146181' class=' answer'><span>Change the read_only parameter to false (read_only=0) in the default parameter group of the read replica. Perform a reboot without failover. Connect to the read replica and create the tables using the local_only MySQL option.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291659[]' id='answer-id-1146182' class='answer   answerof-291659 ' value='1146182'   \/><label for='answer-id-1146182' id='answer-label-1146182' class=' answer'><span>Change the read_only parameter to false (read_only=0) in the default parameter group. Reboot the read replica. Connect to the read replica and create the tables.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291659[]' id='answer-id-1146183' class='answer   answerof-291659 ' value='1146183'   \/><label for='answer-id-1146183' id='answer-label-1146183' class=' answer'><span>Create a new DB parameter group. 
Change the read_only parameter to false (read_only=0). Associate the read replica with the new group. Reboot the read replica. Connect to the read replica and create the tables.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-90' style=';'><div id='questionWrap-90'  class='   watupro-question-id-291660'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>90. <\/span>A company has a heterogeneous six-node production Amazon Aurora DB cluster that handles online transaction processing (OLTP) for the core business and OLAP reports for the human resources department. To match compute resources to the use case, the company has decided to have the reporting workload for the human resources department be directed to two small nodes in the Aurora DB cluster, while every other workload goes to four large nodes in the same DB cluster. <br \/>\r<br>Which option would ensure that the correct nodes are always available for the appropriate workload while meeting these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_90' value='291660' \/><input type='hidden' id='answerType291660' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291660[]' id='answer-id-1146184' class='answer   answerof-291660 ' value='1146184'   \/><label for='answer-id-1146184' id='answer-label-1146184' class=' answer'><span>Use the writer endpoint for OLTP and the reader endpoint for the OLAP reporting workload.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291660[]' id='answer-id-1146185' class='answer   answerof-291660 ' value='1146185'   \/><label for='answer-id-1146185' id='answer-label-1146185' class=' answer'><span>Use automatic scaling for the Aurora Replica to have the appropriate 
number of replicas for the desired workload.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291660[]' id='answer-id-1146186' class='answer   answerof-291660 ' value='1146186'   \/><label for='answer-id-1146186' id='answer-label-1146186' class=' answer'><span>Create additional readers to cater to the different scenarios.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291660[]' id='answer-id-1146187' class='answer   answerof-291660 ' value='1146187'   \/><label for='answer-id-1146187' id='answer-label-1146187' class=' answer'><span>Use custom endpoints to satisfy the different workloads.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-91' style=';'><div id='questionWrap-91'  class='   watupro-question-id-291661'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>91. <\/span>Developers have requested a new Amazon Redshift cluster so they can load new third-party marketing data. The new cluster is ready and the user credentials are given to the developers. <br \/>\r<br>The developers indicate that their copy jobs fail with the following error message: <br \/>\r<br>\u201cAmazon Invalid operation: S3ServiceException:Access Denied,Status 403,Error AccessDenied.\u201d <br \/>\r<br>The developers need to load this data soon, so a database specialist must act quickly to solve this issue. 
<br \/>\r<br>What is the MOST secure solution?<\/div><input type='hidden' name='question_id[]' id='qID_91' value='291661' \/><input type='hidden' id='answerType291661' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291661[]' id='answer-id-1146188' class='answer   answerof-291661 ' value='1146188'   \/><label for='answer-id-1146188' id='answer-label-1146188' class=' answer'><span>Create a new IAM role with the same user name as the Amazon Redshift developer user ID.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291661[]' id='answer-id-1146189' class='answer   answerof-291661 ' value='1146189'   \/><label for='answer-id-1146189' id='answer-label-1146189' class=' answer'><span>Provide the IAM role with read-only access to Amazon S3 with the assume role action.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291661[]' id='answer-id-1146190' class='answer   answerof-291661 ' value='1146190'   \/><label for='answer-id-1146190' id='answer-label-1146190' class=' answer'><span>Create a new IAM role with read-only access to the Amazon S3 bucket and include the assume role action. Modify the Amazon Redshift cluster to add the IAM role.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291661[]' id='answer-id-1146191' class='answer   answerof-291661 ' value='1146191'   \/><label for='answer-id-1146191' id='answer-label-1146191' class=' answer'><span>Create a new IAM role with read-only access to the Amazon S3 bucket with the assume role action. 
Add this role to the developer IAM user ID used for the copy job that ended with an error message.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291661[]' id='answer-id-1146192' class='answer   answerof-291661 ' value='1146192'   \/><label for='answer-id-1146192' id='answer-label-1146192' class=' answer'><span>Create a new IAM user with access keys and a new role with read-only access to the Amazon S3 bucket. Add this role to the Amazon Redshift cluster. Change the copy job to use the access keys created.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-92' style=';'><div id='questionWrap-92'  class='   watupro-question-id-291662'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>92. <\/span>A database specialist at a large multi-national financial company is in charge of designing the disaster recovery strategy for a highly available application that is in development. The application uses an Amazon DynamoDB table as its data store. The application requires a recovery time objective (RTO) of 1 minute and a recovery point objective (RPO) of 2 minutes. 
<br \/>\r<br>Which operationally efficient disaster recovery strategy should the database specialist recommend for the DynamoDB table?<\/div><input type='hidden' name='question_id[]' id='qID_92' value='291662' \/><input type='hidden' id='answerType291662' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291662[]' id='answer-id-1146193' class='answer   answerof-291662 ' value='1146193'   \/><label for='answer-id-1146193' id='answer-label-1146193' class=' answer'><span>Create a DynamoDB stream that is processed by an AWS Lambda function that copies the data to a DynamoDB table in another Region.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291662[]' id='answer-id-1146194' class='answer   answerof-291662 ' value='1146194'   \/><label for='answer-id-1146194' id='answer-label-1146194' class=' answer'><span>Use a DynamoDB global table replica in another Region. Enable point-in-time recovery for both tables.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291662[]' id='answer-id-1146195' class='answer   answerof-291662 ' value='1146195'   \/><label for='answer-id-1146195' id='answer-label-1146195' class=' answer'><span>Use a DynamoDB Accelerator table in another Region. 
Enable point-in-time recovery for the table.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291662[]' id='answer-id-1146196' class='answer   answerof-291662 ' value='1146196'   \/><label for='answer-id-1146196' id='answer-label-1146196' class=' answer'><span>Create an AWS Backup plan and assign the DynamoDB table as a resource.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-93' style=';'><div id='questionWrap-93'  class='   watupro-question-id-291663'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>93. <\/span>A small startup company is looking to migrate a 4 TB on-premises MySQL database to AWS using an Amazon RDS for MySQL DB instance. <br \/>\r<br>Which strategy would allow for a successful migration with the LEAST amount of downtime?<\/div><input type='hidden' name='question_id[]' id='qID_93' value='291663' \/><input type='hidden' id='answerType291663' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291663[]' id='answer-id-1146197' class='answer   answerof-291663 ' value='1146197'   \/><label for='answer-id-1146197' id='answer-label-1146197' class=' answer'><span>Deploy a new RDS for MySQL DB instance and configure it for access from the on-premises data center. Use the mysqldump utility to create an initial snapshot from the on-premises MySQL server, and copy it to an Amazon S3 bucket. Import the snapshot into the DB instance utilizing the MySQL utilities running on an Amazon EC2 instance. 
Immediately point the application to the DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291663[]' id='answer-id-1146198' class='answer   answerof-291663 ' value='1146198'   \/><label for='answer-id-1146198' id='answer-label-1146198' class=' answer'><span>Deploy a new Amazon EC2 instance, install the MySQL software on the EC2 instance, and configure networking for access from the on-premises data center. Use the mysqldump utility to create a snapshot of the on-premises MySQL server. Copy the snapshot into the EC2 instance and restore it into the EC2 MySQL instance. Use AWS DMS to migrate data into a new RDS for MySQL DB instance. Point the application to the DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291663[]' id='answer-id-1146199' class='answer   answerof-291663 ' value='1146199'   \/><label for='answer-id-1146199' id='answer-label-1146199' class=' answer'><span>Deploy a new Amazon EC2 instance, install the MySQL software on the EC2 instance, and configure networking for access from the on-premises data center. Use the mysqldump utility to create a snapshot of the on-premises MySQL server. Copy the snapshot into an Amazon S3 bucket and import the snapshot into a new RDS for MySQL DB instance using the MySQL utilities running on an EC2 instance. Point the application to the DB instance.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291663[]' id='answer-id-1146200' class='answer   answerof-291663 ' value='1146200'   \/><label for='answer-id-1146200' id='answer-label-1146200' class=' answer'><span>Deploy a new RDS for MySQL DB instance and configure it for access from the on-premises data center. Use the mysqldump utility to create an initial snapshot from the on-premises MySQL server, and copy it to an Amazon S3 bucket. 
Import the snapshot into the DB instance using the MySQL utilities running on an Amazon EC2 instance. Establish replication into the new DB instance using MySQL replication. Stop application access to the on-premises MySQL server and let the remaining transactions replicate over. Point the application to the DB instance.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-94' style=';'><div id='questionWrap-94'  class='   watupro-question-id-291664'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>94. <\/span>A software development company is using Amazon Aurora MySQL DB clusters for several use cases, including development and reporting. These use cases place unpredictable and varying demands on the Aurora DB clusters, and can cause momentary spikes in latency. System users run ad-hoc queries sporadically throughout the week. Cost is a primary concern for the company, and a solution that does not require significant rework is needed. 
<br \/>\r<br>Which solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_94' value='291664' \/><input type='hidden' id='answerType291664' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291664[]' id='answer-id-1146201' class='answer   answerof-291664 ' value='1146201'   \/><label for='answer-id-1146201' id='answer-label-1146201' class=' answer'><span>Create new Aurora Serverless DB clusters for development and reporting, then migrate to these new DB clusters.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291664[]' id='answer-id-1146202' class='answer   answerof-291664 ' value='1146202'   \/><label for='answer-id-1146202' id='answer-label-1146202' class=' answer'><span>Upgrade one of the DB clusters to a larger size, and consolidate development and reporting activities on this larger DB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291664[]' id='answer-id-1146203' class='answer   answerof-291664 ' value='1146203'   \/><label for='answer-id-1146203' id='answer-label-1146203' class=' answer'><span>Use existing DB clusters and stop\/start the databases on a routine basis using scheduling tools.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291664[]' id='answer-id-1146204' class='answer   answerof-291664 ' value='1146204'   \/><label for='answer-id-1146204' id='answer-label-1146204' class=' answer'><span>Change the DB clusters to the burstable instance family.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-95' style=';'><div id='questionWrap-95'  class='   watupro-question-id-291665'>\n\t\t\t<div 
class='question-content'><div><span class='watupro_num'>95. <\/span>A database specialist is building a system that uses a static vendor dataset of postal codes and related territory information that is less than 1 GB in size. The dataset is loaded into the application\u2019s cache at startup. The company needs to store this data in a way that provides the lowest cost with a low application startup time. <br \/>\r<br>Which approach will meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_95' value='291665' \/><input type='hidden' id='answerType291665' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291665[]' id='answer-id-1146205' class='answer   answerof-291665 ' value='1146205'   \/><label for='answer-id-1146205' id='answer-label-1146205' class=' answer'><span>Use an Amazon RDS DB instance. Shut down the instance once the data has been read.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291665[]' id='answer-id-1146206' class='answer   answerof-291665 ' value='1146206'   \/><label for='answer-id-1146206' id='answer-label-1146206' class=' answer'><span>Use Amazon Aurora Serverless. 
Allow the service to spin resources up and down, as needed.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291665[]' id='answer-id-1146207' class='answer   answerof-291665 ' value='1146207'   \/><label for='answer-id-1146207' id='answer-label-1146207' class=' answer'><span>Use Amazon DynamoDB in on-demand capacity mode.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291665[]' id='answer-id-1146208' class='answer   answerof-291665 ' value='1146208'   \/><label for='answer-id-1146208' id='answer-label-1146208' class=' answer'><span>Use Amazon S3 and load the data from flat files.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-96' style=';'><div id='questionWrap-96'  class='   watupro-question-id-291666'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>96. <\/span>A database specialist needs to review and optimize an Amazon DynamoDB table that is experiencing performance issues. A thorough investigation by the database specialist reveals that the partition key is causing hot partitions, so a new partition key is created. The database specialist must effectively apply this new partition key to all existing and new data. <br \/>\r<br>How can this solution be implemented?<\/div><input type='hidden' name='question_id[]' id='qID_96' value='291666' \/><input type='hidden' id='answerType291666' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291666[]' id='answer-id-1146209' class='answer   answerof-291666 ' value='1146209'   \/><label for='answer-id-1146209' id='answer-label-1146209' class=' answer'><span>Use Amazon EMR to export the data from the current DynamoDB table to Amazon S3. 
Then use Amazon EMR again to import the data from Amazon S3 into a new DynamoDB table with the new partition key.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291666[]' id='answer-id-1146210' class='answer   answerof-291666 ' value='1146210'   \/><label for='answer-id-1146210' id='answer-label-1146210' class=' answer'><span>Use AWS DMS to copy the data from the current DynamoDB table to Amazon S3. Then import the DynamoDB table to create a new DynamoDB table with the new partition key.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291666[]' id='answer-id-1146211' class='answer   answerof-291666 ' value='1146211'   \/><label for='answer-id-1146211' id='answer-label-1146211' class=' answer'><span>Use the AWS CLI to update the DynamoDB table and modify the partition key.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291666[]' id='answer-id-1146212' class='answer   answerof-291666 ' value='1146212'   \/><label for='answer-id-1146212' id='answer-label-1146212' class=' answer'><span>Use the AWS CLI to back up the DynamoDB table. Then use the restore-table-from-backup command and modify the partition key.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-97' style=';'><div id='questionWrap-97'  class='   watupro-question-id-291667'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>97. <\/span>A company is going through a security audit. The audit team has identified cleartext master user password in the AWS CloudFormation templates for Amazon RDS for MySQL DB instances. The audit team has flagged this as a security risk to the database team. 
<br \/>\r<br>What should a database specialist do to mitigate this risk?<\/div><input type='hidden' name='question_id[]' id='qID_97' value='291667' \/><input type='hidden' id='answerType291667' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291667[]' id='answer-id-1146213' class='answer   answerof-291667 ' value='1146213'   \/><label for='answer-id-1146213' id='answer-label-1146213' class=' answer'><span>Change all the databases to use AWS IAM for authentication and remove all the cleartext passwords in CloudFormation templates.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291667[]' id='answer-id-1146214' class='answer   answerof-291667 ' value='1146214'   \/><label for='answer-id-1146214' id='answer-label-1146214' class=' answer'><span>Use an AWS Secrets Manager resource to generate a random password and reference the secret in the CloudFormation template.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291667[]' id='answer-id-1146215' class='answer   answerof-291667 ' value='1146215'   \/><label for='answer-id-1146215' id='answer-label-1146215' class=' answer'><span>Remove the passwords from the CloudFormation templates so Amazon RDS prompts for the password when the database is being created.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291667[]' id='answer-id-1146216' class='answer   answerof-291667 ' value='1146216'   \/><label for='answer-id-1146216' id='answer-label-1146216' class=' answer'><span>Remove the passwords from the CloudFormation template and store them in a separate file. 
\r\nReplace the passwords by running CloudFormation using a sed command.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-98' style=';'><div id='questionWrap-98'  class='   watupro-question-id-291668'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>98. <\/span>A company\u2019s database specialist disabled TLS on an Amazon DocumentDB cluster to perform benchmarking tests. A few days after this change was implemented, a database specialist trainee accidentally deleted multiple tables. The database specialist restored the database from available snapshots. An hour after restoring the cluster, the database specialist is still unable to connect to the new cluster endpoint. <br \/>\r<br>What should the database specialist do to connect to the new, restored Amazon DocumentDB cluster?<\/div><input type='hidden' name='question_id[]' id='qID_98' value='291668' \/><input type='hidden' id='answerType291668' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291668[]' id='answer-id-1146217' class='answer   answerof-291668 ' value='1146217'   \/><label for='answer-id-1146217' id='answer-label-1146217' class=' answer'><span>Change the restored cluster\u2019s parameter group to the original cluster\u2019s custom parameter group.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291668[]' id='answer-id-1146218' class='answer   answerof-291668 ' value='1146218'   \/><label for='answer-id-1146218' id='answer-label-1146218' class=' answer'><span>Change the restored cluster\u2019s parameter group to the Amazon DocumentDB default parameter group.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291668[]' id='answer-id-1146219' 
class='answer   answerof-291668 ' value='1146219'   \/><label for='answer-id-1146219' id='answer-label-1146219' class=' answer'><span>Configure the interface VPC endpoint and associate the new Amazon DocumentDB cluster.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291668[]' id='answer-id-1146220' class='answer   answerof-291668 ' value='1146220'   \/><label for='answer-id-1146220' id='answer-label-1146220' class=' answer'><span>Run the syncInstances command in AWS DataSync.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-99' style=';'><div id='questionWrap-99'  class='   watupro-question-id-291669'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>99. <\/span>A company runs a customer relationship management (CRM) system that is hosted on-premises with a MySQL database as the backend. A custom stored procedure is used to send email notifications to another system when data is inserted into a table. The company has noticed that the performance of the CRM system has decreased due to database reporting applications used by various teams. The company requires an AWS solution that would reduce maintenance, improve performance, and accommodate the email notification feature. <br \/>\r<br>Which AWS solution meets these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_99' value='291669' \/><input type='hidden' id='answerType291669' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291669[]' id='answer-id-1146221' class='answer   answerof-291669 ' value='1146221'   \/><label for='answer-id-1146221' id='answer-label-1146221' class=' answer'><span>Use MySQL running on an Amazon EC2 instance with Auto Scaling to accommodate the reporting applications. 
Configure a stored procedure and an AWS Lambda function that uses Amazon SES to send email notifications to the other system.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291669[]' id='answer-id-1146222' class='answer   answerof-291669 ' value='1146222'   \/><label for='answer-id-1146222' id='answer-label-1146222' class=' answer'><span>Use Amazon Aurora MySQL in a multi-master cluster to accommodate the reporting applications. Configure Amazon RDS event subscriptions to publish a message to an Amazon SNS topic and subscribe the other system's email address to the topic.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291669[]' id='answer-id-1146223' class='answer   answerof-291669 ' value='1146223'   \/><label for='answer-id-1146223' id='answer-label-1146223' class=' answer'><span>Use MySQL running on an Amazon EC2 instance with a read replica to accommodate the reporting applications. Configure Amazon SES integration to send email notifications to the other system.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291669[]' id='answer-id-1146224' class='answer   answerof-291669 ' value='1146224'   \/><label for='answer-id-1146224' id='answer-label-1146224' class=' answer'><span>Use Amazon Aurora MySQL with a read replica for the reporting applications. Configure a stored procedure and an AWS Lambda function to publish a message to an Amazon SNS topic. Subscribe the other system's email address to the topic.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div class='watu-question ' id='question-100' style=';'><div id='questionWrap-100'  class='   watupro-question-id-291670'>\n\t\t\t<div class='question-content'><div><span class='watupro_num'>100. 
<\/span>A company needs to migrate Oracle Database Standard Edition running on an Amazon EC2 instance to an Amazon RDS for Oracle DB instance with Multi-AZ. The database supports an ecommerce website that runs continuously. The company can only provide a maintenance window of up to 5 minutes. <br \/>\r<br>Which solution will meet these requirements?<\/div><input type='hidden' name='question_id[]' id='qID_100' value='291670' \/><input type='hidden' id='answerType291670' value='radio'><!-- end question-content--><\/div><div class='question-choices watupro-choices-columns '><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291670[]' id='answer-id-1146225' class='answer   answerof-291670 ' value='1146225'   \/><label for='answer-id-1146225' id='answer-label-1146225' class=' answer'><span>Configure Oracle Real Application Clusters (RAC) on the EC2 instance and the RDS DB instance. Update the connection string to point to the RAC cluster. Once the EC2 instance and RDS DB instance are in sync, fail over from Amazon EC2 to Amazon RDS.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291670[]' id='answer-id-1146226' class='answer   answerof-291670 ' value='1146226'   \/><label for='answer-id-1146226' id='answer-label-1146226' class=' answer'><span>Export the Oracle database from the EC2 instance using Oracle Data Pump and perform an import into Amazon RDS.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291670[]' id='answer-id-1146227' class='answer   answerof-291670 ' value='1146227'   \/><label for='answer-id-1146227' id='answer-label-1146227' class=' answer'><span>Stop the application for the entire process. 
When the import is complete, change the database connection string and then restart the application.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291670[]' id='answer-id-1146228' class='answer   answerof-291670 ' value='1146228'   \/><label for='answer-id-1146228' id='answer-label-1146228' class=' answer'><span>Configure AWS DMS with the EC2 instance as the source and the RDS DB instance as the destination. Stop the application when the replication is in sync, change the database connection string, and then restart the application.<\/span><\/label><\/div><div class='watupro-question-choice  ' dir='auto' ><input type='radio' name='answer-291670[]' id='answer-id-1146229' class='answer   answerof-291670 ' value='1146229'   \/><label for='answer-id-1146229' id='answer-label-1146229' class=' answer'><span>Configure AWS DataSync with the EC2 instance as the source and the RDS DB instance as the destination. Stop the application when the replication is in sync, change the database connection string, and then restart the application.<\/span><\/label><\/div><!-- end question-choices--><\/div><!-- end questionWrap--><\/div><\/div><div style='display:none' id='question-101'>\n\t<div class='question-content'>\n\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.dumpsbase.com\/freedumps\/wp-content\/plugins\/watupro\/img\/loading.gif\" width=\"16\" height=\"16\" alt=\"Loading...\" title=\"Loading...\" \/>&nbsp;Loading...\t<\/div>\n<\/div>\n\n<br \/>\n\t\n\t\t\t<div class=\"watupro_buttons flex \" id=\"watuPROButtons7843\" >\n\t\t  <div id=\"prev-question\" style=\"display:none;\"><input type=\"button\" value=\"&lt; Previous\" onclick=\"WatuPRO.nextQuestion(event, 'previous');\"\/><\/div>\t\t  \t\t  \t\t   \n\t\t   \t  \t\t<div><input type=\"button\" name=\"action\" class=\"watupro-submit-button\" onclick=\"WatuPRO.submitResult(event)\" id=\"action-button\" value=\"View Results\"  
\/>\n\t\t<\/div>\n\t\t<\/div>\n\t\t\n\t<input type=\"hidden\" name=\"quiz_id\" value=\"7843\" id=\"watuPROExamID\"\/>\n\t<input type=\"hidden\" name=\"start_time\" id=\"startTime\" value=\"2026-05-11 13:17:48\" \/>\n\t<input type=\"hidden\" name=\"start_timestamp\" id=\"startTimeStamp\" value=\"1778505468\" \/>\n\t<input type=\"hidden\" name=\"question_ids\" value=\"\" \/>\n\t<input type=\"hidden\" name=\"watupro_questions\" value=\"291571:1145796,1145797,1145798,1145799 | 291572:1145800,1145801,1145802,1145803,1145804,1145805 | 291573:1145806,1145807,1145808,1145809 | 291574:1145810,1145811,1145812,1145813 | 291575:1145814,1145815,1145816,1145817 | 291576:1145818,1145819,1145820,1145821,1145822 | 291577:1145823,1145824,1145825,1145826,1145827 | 291578:1145828,1145829,1145830,1145831,1145832,1145833 | 291579:1145834,1145835,1145836,1145837 | 291580:1145838,1145839,1145840,1145841 | 291581:1145842,1145843,1145844,1145845 | 291582:1145846,1145847,1145848,1145849,1145850 | 291583:1145851,1145852,1145853,1145854,1145855,1145856 | 291584:1145857,1145858,1145859,1145860 | 291585:1145861,1145862,1145863,1145864 | 291586:1145865,1145866,1145867,1145868 | 291587:1145869,1145870,1145871,1145872 | 291588:1145873,1145874,1145875,1145876 | 291589:1145877,1145878,1145879,1145880 | 291590:1145881,1145882,1145883,1145884 | 291591:1145885,1145886,1145887,1145888 | 291592:1145889,1145890,1145891,1145892 | 291593:1145893,1145894,1145895,1145896 | 291594:1145897,1145898,1145899,1145900 | 291595:1145901,1145902,1145903,1145904 | 291596:1145905,1145906,1145907,1145908,1145909,1145910 | 291597:1145911,1145912,1145913,1145914 | 291598:1145915,1145916,1145917,1145918 | 291599:1145919,1145920,1145921,1145922 | 291600:1145923,1145924,1145925,1145926 | 291601:1145927,1145928,1145929,1145930 | 291602:1145931,1145932,1145933,1145934 | 291603:1145935,1145936,1145937,1145938 | 291604:1145939,1145940,1145941,1145942 | 291605:1145943,1145944,1145945,1145946,1145947 | 
291606:1145948,1145949,1145950,1145951 | 291607:1145952,1145953,1145954,1145955 | 291608:1145956,1145957,1145958,1145959 | 291609:1145960,1145961,1145962,1145963 | 291610:1145964,1145965,1145966,1145967 | 291611:1145968,1145969,1145970,1145971 | 291612:1145972,1145973,1145974,1145975 | 291613:1145976,1145977,1145978,1145979 | 291614:1145980,1145981,1145982,1145983 | 291615:1145984,1145985,1145986,1145987 | 291616:1145988,1145989,1145990,1145991 | 291617:1145992,1145993,1145994,1145995,1145996,1145997 | 291618:1145998,1145999,1146000,1146001 | 291619:1146002,1146003,1146004,1146005,1146006 | 291620:1146007,1146008,1146009,1146010,1146011 | 291621:1146012,1146013,1146014,1146015 | 291622:1146016,1146017,1146018,1146019 | 291623:1146020,1146021,1146022,1146023 | 291624:1146024,1146025,1146026,1146027 | 291625:1146028,1146029,1146030,1146031 | 291626:1146032,1146033,1146034,1146035 | 291627:1146036,1146037,1146038,1146039 | 291628:1146040,1146041,1146042,1146043 | 291629:1146044,1146045,1146046,1146047 | 291630:1146048,1146049,1146050,1146051,1146052,1146053 | 291631:1146054,1146055,1146056,1146057 | 291632:1146058,1146059,1146060,1146061,1146062,1146063,1146064 | 291633:1146065,1146066,1146067,1146068 | 291634:1146069,1146070,1146071,1146072 | 291635:1146073,1146074,1146075,1146076 | 291636:1146077,1146078,1146079,1146080 | 291637:1146081,1146082,1146083,1146084,1146085,1146086 | 291638:1146087,1146088,1146089,1146090 | 291639:1146091,1146092,1146093,1146094 | 291640:1146095,1146096,1146097,1146098 | 291641:1146099,1146100,1146101,1146102,1146103 | 291642:1146104,1146105,1146106,1146107,1146108,1146109 | 291643:1146110,1146111,1146112,1146113 | 291644:1146114,1146115,1146116,1146117 | 291645:1146118,1146119,1146120,1146121,1146122 | 291646:1146123,1146124,1146125,1146126,1146127,1146128,1146129 | 291647:1146130,1146131,1146132,1146133 | 291648:1146134,1146135,1146136,1146137 | 291649:1146138,1146139,1146140,1146141 | 291650:1146142,1146143,1146144,1146145 | 
291651:1146146,1146147,1146148,1146149 | 291652:1146150,1146151,1146152,1146153 | 291653:1146154,1146155,1146156,1146157 | 291654:1146158,1146159,1146160,1146161 | 291655:1146162,1146163,1146164,1146165,1146166 | 291656:1146167,1146168,1146169,1146170,1146171 | 291657:1146172,1146173,1146174,1146175 | 291658:1146176,1146177,1146178,1146179 | 291659:1146180,1146181,1146182,1146183 | 291660:1146184,1146185,1146186,1146187 | 291661:1146188,1146189,1146190,1146191,1146192 | 291662:1146193,1146194,1146195,1146196 | 291663:1146197,1146198,1146199,1146200 | 291664:1146201,1146202,1146203,1146204 | 291665:1146205,1146206,1146207,1146208 | 291666:1146209,1146210,1146211,1146212 | 291667:1146213,1146214,1146215,1146216 | 291668:1146217,1146218,1146219,1146220 | 291669:1146221,1146222,1146223,1146224 | 291670:1146225,1146226,1146227,1146228,1146229\" \/>\n\t<input type=\"hidden\" name=\"no_ajax\" value=\"0\">\t\t\t<\/form>\n\t<p>&nbsp;<\/p>\n<\/div>\n\n<script type=\"text\/javascript\">\n\/\/jQuery(document).ready(function(){\ndocument.addEventListener(\"DOMContentLoaded\", function(event) { \t\nvar question_ids = \"291571,291572,291573,291574,291575,291576,291577,291578,291579,291580,291581,291582,291583,291584,291585,291586,291587,291588,291589,291590,291591,291592,291593,291594,291595,291596,291597,291598,291599,291600,291601,291602,291603,291604,291605,291606,291607,291608,291609,291610,291611,291612,291613,291614,291615,291616,291617,291618,291619,291620,291621,291622,291623,291624,291625,291626,291627,291628,291629,291630,291631,291632,291633,291634,291635,291636,291637,291638,291639,291640,291641,291642,291643,291644,291645,291646,291647,291648,291649,291650,291651,291652,291653,291654,291655,291656,291657,291658,291659,291660,291661,291662,291663,291664,291665,291666,291667,291668,291669,291670\";\nWatuPROSettings[7843] = {};\nWatuPRO.qArr = question_ids.split(',');\nWatuPRO.exam_id = 7843;\t    \nWatuPRO.post_id = 66246;\nWatuPRO.store_progress = 
0;\nWatuPRO.curCatPage = 1;\nWatuPRO.requiredIDs=\"0\".split(\",\");\nWatuPRO.hAppID = \"0.63875700 1778505468\";\nvar url = \"https:\/\/www.dumpsbase.com\/freedumps\/wp-content\/plugins\/watupro\/show_exam.php\";\nWatuPRO.examMode = 1;\nWatuPRO.siteURL=\"https:\/\/www.dumpsbase.com\/freedumps\/wp-admin\/admin-ajax.php\";\nWatuPRO.emailIsNotRequired = 0;\nWatuPROIntel.init(7843);\nWatuPRO.inCategoryPages=1;});    \t \n<\/script>\n\n\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[175,16400],"tags":[15596],"class_list":["post-66246","post","type-post","status-publish","format-standard","hentry","category-amazon","category-aws-certified-database-specialty","tag-dbs-c01-exam-dumps"],"_links":{"self":[{"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/posts\/66246","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/comments?post=66246"}],"version-history":[{"count":1,"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/posts\/66246\/revisions"}],"predecessor-version":[{"id":66251,"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/posts\/66246\/revisions\/66251"}],"wp:attachment":[{"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/media?parent=66246"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-json\/wp\/v2\/categories?post=66246"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.dumpsbase.com\/freedumps\/wp-js
on\/wp\/v2\/tags?post=66246"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}