2019 New Azure Exam DP-201 Dumps Questions

Continuing our discussion of the Microsoft Certified: Azure Data Engineer Associate certification: DP-200 and DP-201 are the two exams required for the Azure Data Engineer certification. In our last post, we shared new DP-200 dumps questions. Here we continue with new Azure exam DP-201 dumps questions. The main task areas of the DP-201 exam are:

  • Design Azure data storage solutions
  • Design data processing solutions
  • Design for data security and compliance
  • Design for high availability and disaster recovery

Like the DP-200 dumps questions, the DP-201 questions are based on the skills and tasks measured by the DP-201 exam. With our new Azure exam DP-201 dumps questions, you can pass the Designing an Azure Data Solution (DP-201) exam successfully.

Free DP-201 Dumps Questions Online, Read Immediately

1. Topic 1, Trey Research Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case.

However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next sections of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question on this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Background

Trey Research is a technology innovator. The company partners with regional transportation department offices to build solutions that improve traffic flow and safety.

The company is developing the following solutions:

Regional transportation departments installed traffic sensor systems on major highways across North America. Sensors record the following information each time a vehicle passes in front of a sensor:

– Time

– Location in latitude and longitude

– Speed in kilometers per second (kmps)

– License plate number

– Length of vehicle in meters

Sensors provide data by using the following structure:

Traffic sensors will occasionally capture an image of a vehicle for debugging purposes. You must optimize performance of saving/storing vehicle images.

Traffic sensor data

– Sensors must have permission only to add items to the SensorData collection.

– Traffic data insertion rate must be maximized.

– Once every three months all traffic sensor data must be analyzed to look for data patterns that indicate sensor malfunctions.

– Sensor data must be stored in a Cosmos DB named treydata in a collection named SensorData.

– The impact of vehicle images on sensor data throughput must be minimized.
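The insert-rate and throughput requirements above hinge on how the SensorData collection is partitioned. As a rough, hypothetical illustration (this is a simplified stand-in, not Cosmos DB's actual hash-partitioning algorithm, and `partition_for` is an invented helper), the sketch below shows how hashing a high-cardinality partition key such as a sensor ID spreads writes across physical partitions:

```python
import hashlib
from collections import Counter

def partition_for(key: str, partitions: int = 10) -> int:
    """Map a partition-key value to a physical partition by hashing it.
    Hypothetical helper: a simplified stand-in for Cosmos DB's own
    hash-partitioning, used here only to show the distribution effect."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % partitions

# A high-cardinality key (e.g. a sensor ID) spreads inserts evenly,
# so no single partition becomes a write hot spot.
readings = [f"sensor-{i % 500}" for i in range(10_000)]
counts = Counter(partition_for(key) for key in readings)
print(sorted(counts.values()))  # roughly equal counts per partition
```

A low-cardinality key (for example, a single region value) would send every insert to one partition and cap the insertion rate, which is why the partition-key choice matters in question 3 below.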

Backtrack

This solution reports on all data related to a specific vehicle license plate. The report must use data from the SensorData collection. Users must be able to filter vehicle data in the following ways:

– vehicles on a specific road

– vehicles driving above the speed limit

Planning Assistance

Data used for Planning Assistance must be stored in a sharded Azure SQL Database.

Data from the Sensor Data collection will automatically be loaded into the Planning Assistance database once a week by using Azure Data Factory. You must be able to manually trigger the data load process.
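A sharded Azure SQL Database is typically managed through a shard map that routes each row to a shard database by its shard key. The sketch below is a simplified, hypothetical stand-in for a range shard map (the shard names and quarterly date boundaries are invented for illustration, not taken from the case study):

```python
import bisect

# Hypothetical range shard map: shard names and quarterly date
# boundaries are invented for illustration.
BOUNDARIES = ["2019-04-01", "2019-07-01", "2019-10-01"]
SHARDS = ["shard-q1", "shard-q2", "shard-q3", "shard-q4"]

def shard_for(date_key: str) -> str:
    """Route a row to the shard whose date range contains its key,
    a simplified stand-in for an elastic database shard map."""
    return SHARDS[bisect.bisect_right(BOUNDARIES, date_key)]

# Rows with keys in the same range land on the same shard, so
# range-filtered queries touch a single database.
print(shard_for("2019-05-14"))  # shard-q2
```

The design question is which column to use as the shard key; the routing mechanism itself stays the same regardless of that choice.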

Privacy and security policy

– Azure Active Directory must be used for all services where it is available.

– For privacy reasons, license plate number information must not be accessible in Planning Assistance.

– Unauthorized usage of the Planning Assistance data must be detected as quickly as possible. Unauthorized usage is determined by looking for an unusual pattern of usage.

– Data must only be stored for seven years.

Performance and availability

– The report for Backtrack must execute as quickly as possible.

– The SLA for Planning Assistance is 70 percent, and multiday outages are permitted.

– All data must be replicated to multiple geographic regions to prevent data loss.

– You must maximize the performance of the Real Time Response system.

Financial requirements

Azure resource costs must be minimized where possible.

You need to design the vehicle images storage solution.

What should you recommend?

 
 
 
 

2. You need to design a sharding strategy for the Planning Assistance database.

What should you recommend?

 
 
 
 

3. HOTSPOT

You need to design the SensorData collection.

What should you recommend? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

4. You need to recommend an Azure SQL Database pricing tier for Planning Assistance.

Which pricing tier should you recommend?

 
 
 
 

5. HOTSPOT

You need to design the Planning Assistance database. For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.

6. HOTSPOT

You need to design the data loading pipeline for Planning Assistance.

What should you recommend? To answer, drag the appropriate technologies to the correct locations. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

7. You need to design the runtime environment for the Real Time Response system.

What should you recommend?

 
 
 
 

8. HOTSPOT

You need to ensure that emergency road response vehicles are dispatched automatically.

How should you design the processing system? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

9. DRAG DROP

You need to ensure that performance requirements for Backtrack reports are met.

What should you recommend? To answer, drag the appropriate technologies to the correct locations. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

10. HOTSPOT

You need to design the authentication and authorization methods for sensors.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

11. HOTSPOT

You need to ensure that security policies for the unauthorized detection system are met.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

12. Topic 2, Case study 1

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case.

However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next sections of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question on this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Requirements

Business

The company identifies the following business requirements:

– You must transfer all images and customer data to cloud storage and remove on-premises servers.

– You must develop an analytical processing solution for transforming customer data.

– You must develop an image object and color tagging solution.

– Capital expenditures must be minimized.

– Cloud resource costs must be minimized.

Technical

The solution has the following technical requirements:

– Tagging data must be uploaded to the cloud from the New York office location.

– Tagging data must be replicated to regions that are geographically close to company office locations.

– Image data must be stored in a single data store at minimum cost.

– Customer data must be analyzed using managed Spark clusters.

– Power BI must be used to visualize transformed customer data.

– All data must be backed up in case disaster recovery is required.

Security and optimization

All cloud data must be encrypted at rest and in transit.

The solution must support:

– parallel processing of customer data

– hyper-scale storage of images

– global region data replication of processed image data

You need to recommend a solution for storing the image tagging data.

What should you recommend?

 
 
 
 
 

13. You need to design the solution for analyzing customer data.

What should you recommend?

 
 
 
 
 

14. DRAG DROP

You need to design the image processing solution to meet the optimization requirements for image tag data.

What should you configure? To answer, drag the appropriate setting to the correct drop targets.

Each source may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

15. DRAG DROP

You need to design the encryption strategy for the tagging data and customer data.

What should you recommend? To answer, drag the appropriate setting to the correct drop targets. Each source may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

16. You need to design a backup solution for the processed customer data.

What should you include in the design?

 
 
 
 

17. Topic 3, Case study 2

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case.

However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next sections of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question on this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Background

Current environment

The company has the following virtual machines (VMs):

Requirements

Storage and processing

You must be able to use a file system view of data stored in a blob.

You must build an architecture that will allow Contoso to use the DBFS filesystem layer over a blob store. The architecture will need to support data files, libraries, and images. Additionally, it must provide a web-based interface to documents that contain runnable commands, visualizations, and narrative text, such as a notebook.

CONT_SQL3 requires an initial scale of 35000 IOPS.

CONT_SQL1 and CONT_SQL2 must use the vCore model and should include replicas. The solution must support 8000 IOPS.

The storage should be optimized for database OLTP workloads.

Migration

– You must be able to independently scale compute and storage resources.

– You must migrate all SQL Server workloads to Azure. You must identify related machines in the on-premises environment and gather disk size and data usage information.

– Data from SQL Server must include zone redundant storage.

– You need to ensure that app components can reside on-premises while interacting with components that run in the Azure public cloud.

– SAP data must remain on-premises.

– The Azure Site Recovery (ASR) results should contain per-machine data.

Business requirements

– You must design a regional disaster recovery topology.

– The database backups have regulatory purposes and must be retained for seven years.

– CONT_SQL1 stores customer sales data that requires ETL operations for data analysis. A solution is required that reads data from SQL, performs ETL, and outputs to Power BI. The solution should use managed clusters to minimize costs. To optimize logistics, Contoso needs to analyze customer sales data to see if certain products are tied to specific times in the year.

– The analytics solution for customer sales data must be available during a regional outage.

Security and auditing

– Contoso requires all corporate computers to enable Windows Firewall.

– Azure servers should be able to ping other Contoso Azure servers.

– Employee PII must be encrypted in memory, in motion, and at rest. Any data encrypted by SQL Server must support equality searches, grouping, indexing, and joining on the encrypted data.

– Keys must be secured by using hardware security modules (HSMs).

– CONT_SQL3 must not communicate over the default ports.
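The equality-search, grouping, indexing, and join requirements above point at deterministic encryption, as used by SQL Server Always Encrypted: identical plaintexts always produce identical ciphertexts, so the engine can compare encrypted values without decrypting them. The sketch below illustrates only that determinism property with a keyed hash; it is not Always Encrypted, the key constant is invented, and unlike real encryption the output is not reversible:

```python
import hashlib
import hmac

# In Always Encrypted the column encryption key would be protected by
# an HSM-backed column master key; this constant is illustrative only.
KEY = b"column-encryption-key"

def deterministic_token(value: str) -> str:
    """Keyed-hash illustration of the determinism property: identical
    plaintexts always yield identical outputs, which is what lets a
    database evaluate equality, grouping, indexing, and joins on
    encrypted columns. Unlike real encryption, this is not reversible."""
    return hmac.new(KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Equality comparison works without access to the plaintext.
assert deterministic_token("ABC-123") == deterministic_token("ABC-123")
assert deterministic_token("ABC-123") != deterministic_token("XYZ-999")
```

Randomized encryption, by contrast, produces different ciphertexts for the same plaintext, which is more secure but rules out equality operations on the encrypted column.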

Cost

– All solutions must minimize cost and resources.

– The organization does not want any unexpected charges.

– The data engineers must set the SQL Data Warehouse compute resources to consume 300 DWUs.

– CONT_SQL2 is not fully utilized during non-peak hours. You must minimize resource costs during non-peak hours.

You need to design a solution to meet the SQL Server storage requirements for CONT_SQL3.

Which type of disk should you recommend?

 
 
 

18. You need to recommend an Azure SQL Database service tier.

What should you recommend?

 
 
 
 
 

19. You need to recommend the appropriate storage and processing solution.

What should you recommend?

 
 
 
 
 

20. You need to optimize storage for CONT_SQL3.

What should you recommend?

 
 
 
 

21. HOTSPOT

You need to design network access to the SQL Server data.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

22. A company stores sensitive information about customers and employees in Azure SQL Database.

You need to ensure that the sensitive data remains encrypted in transit and at rest.

What should you recommend?

 
 
 
 

23. DRAG DROP

You are designing an Azure SQL Data Warehouse for a financial services company. Azure Active Directory will be used to authenticate the users.

You need to ensure that the following security requirements are met:

– Department managers must be able to create new databases.

– The IT department must assign users to databases.

– Permissions granted must be minimized.

Which role memberships should you recommend? To answer, drag the appropriate roles to the correct groups. Each role may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

24. You plan to use Azure SQL Database to support a line of business app.

You need to identify sensitive data that is stored in the database and monitor access to the data.

Which three actions should you recommend? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

 
 
 
 
 

25. You need to recommend a backup strategy for CONT_SQL1 and CONT_SQL2.

What should you recommend?

 
 
 
 

26. You need to design the disaster recovery solution for customer sales data analytics.

Which three actions should you recommend? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

 
 
 
 
 
 

27. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are designing an HDInsight/Hadoop cluster solution that uses Azure Data Lake Storage Gen1.

The solution requires POSIX permissions and enables diagnostics logging for auditing.

You need to recommend solutions that optimize storage.

Proposed Solution: Ensure that files stored are larger than 250MB.

Does the solution meet the goal?

 
 

28. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are designing an HDInsight/Hadoop cluster solution that uses Azure Data Lake Storage Gen1.

The solution requires POSIX permissions and enables diagnostics logging for auditing.

You need to recommend solutions that optimize storage.

Proposed Solution: Implement compaction jobs to combine small files into larger files.

Does the solution meet the goal?
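A compaction job of the kind this proposed solution describes can be sketched in plain Python: read many small files and concatenate them into output files of roughly a target size, since fewer, larger files reduce per-file metadata overhead for Hadoop workloads on Data Lake Storage. The function name, file naming scheme, and threshold below are illustrative, not part of any Azure SDK:

```python
import os
import tempfile

def compact(paths, out_dir, target_bytes=256 * 1024 * 1024):
    """Combine many small files into fewer files of roughly target_bytes.

    Fewer, larger files reduce per-file metadata overhead, which is why
    compaction helps Hadoop workloads on Data Lake Storage."""
    out_paths, buf, size, n = [], [], 0, 0

    def flush():
        nonlocal buf, size, n
        if not buf:
            return
        out = os.path.join(out_dir, f"compacted-{n:05d}.dat")
        with open(out, "wb") as f:
            f.writelines(buf)
        out_paths.append(out)
        buf, size, n = [], 0, n + 1

    for path in sorted(paths):
        with open(path, "rb") as f:
            buf.append(f.read())
        size += len(buf[-1])
        if size >= target_bytes:
            flush()
    flush()
    return out_paths

# Demo with ten 10-byte files and a tiny threshold for illustration.
src, dst = tempfile.mkdtemp(), tempfile.mkdtemp()
paths = []
for i in range(10):
    p = os.path.join(src, f"part-{i}.dat")
    with open(p, "wb") as f:
        f.write(b"x" * 10)
    paths.append(p)

print(len(compact(paths, dst, target_bytes=30)))  # 4 output files
```

In production such a job would typically run as a scheduled Spark or Data Factory activity rather than a local script; the sketch only shows the grouping logic.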

 
 
