2019 New Azure Exam DP-200 Dumps Questions

DP-200 Implementing an Azure Data Solution is one of the two exams for Microsoft Certified: Azure Data Engineer Associate certification.

What are two exams for Azure Data Engineer?

DP-200 Implementing an Azure Data Solution

DP-201 Designing an Azure Data Solution

Today we share new Azure exam DP-200 dumps questions to help you prepare for the DP-200 exam. The DP-200 exam measures your ability to accomplish the following technical tasks:

  • Implement data storage solutions
  • Manage and develop data processing
  • Manage data security
  • Monitor data solutions
  • Manage and troubleshoot Azure data solutions

The new DP-200 dumps questions are real and accurate, and have been checked against the exam objectives and tasks to help you pass successfully.

You can read DP-200 free questions online before buying.

1. Topic 1, Proseware Inc

Background

Proseware, Inc., develops and manages a product named Poll Taker. The product is used for delivering public opinion polling and analysis.

Polling data comes from a variety of sources, including online surveys, house-to-house interviews, and booths at public events.

Polling data

Polling data is stored in one of two locations:

– An on-premises Microsoft SQL Server 2019 database named PollingData

– Azure Data Lake Gen 2. Data in the Data Lake is queried by using PolyBase

Poll metadata

Each poll has associated metadata with information about the poll including the date and number of respondents. The data is stored as JSON.

Phone-based polling Security

– Phone-based poll data must only be uploaded by authorized users from authorized devices

– Contractors must not have access to any polling data other than their own

– Access to polling data must be set on a per-Active-Directory-user basis

Data migration and loading

– All data migration processes must use Azure Data Factory

– All data migrations must run automatically during non-business hours

– Data migrations must be reliable and retry when needed
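As an illustrative sketch (the activity name and values here are hypothetical, not from the case study), reliability and retry for a Data Factory copy activity are expressed in the activity's policy block of the pipeline JSON:

```json
{
  "name": "CopyPollingData",
  "type": "Copy",
  "policy": {
    "timeout": "02:00:00",
    "retry": 3,
    "retryIntervalInSeconds": 60
  }
}
```

A schedule trigger on the pipeline can then restrict runs to non-business hours.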

Performance

After six months, raw polling data should be moved to a lower-cost storage solution.
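The six-month requirement maps naturally to Azure Blob storage lifecycle management. A hedged sketch of such a policy (the rule name and container prefix are assumptions) moves blobs to the cool tier 180 days after they were last modified:

```json
{
  "rules": [
    {
      "name": "tierRawPollingData",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "polling-raw/" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 180 }
          }
        }
      }
    }
  ]
}
```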

Deployments

– All deployments must be performed by using Azure DevOps. Deployments must use templates that can be used in multiple environments

– No credentials or secrets should be used during deployments

Reliability

All services and processes must be resilient to a regional Azure outage.

Monitoring

All Azure services must be monitored by using Azure Monitor. On-premises SQL Server performance must be monitored.

HOTSPOT

You need to ensure that Azure Data Factory pipelines can be deployed.

How should you configure authentication and authorization for deployments? To answer, select the appropriate options in the answer choices.

NOTE: Each correct selection is worth one point.

2. DRAG DROP

You need to provision the polling data storage account.

How should you configure the storage account? To answer, drag the appropriate Configuration Value to the correct Setting. Each Configuration Value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

3. HOTSPOT

You need to ensure polling data security requirements are met.

Which security technologies should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

4. DRAG DROP

You need to ensure that phone-based polling data can be analyzed in the PollingData database.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

5. You need to ensure that phone-based polling data can be analyzed in the PollingData database.

How should you configure Azure Data Factory?

 
 
 
 

6. HOTSPOT

You need to ensure phone-based polling data upload reliability requirements are met.

How should you configure monitoring? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

7. You need to process and query ingested Tier 9 data.

Which two options should you use? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

 
 
 
 
 
 

8. HOTSPOT

You need to set up the Azure Data Factory JSON definition for Tier 10 data.

What should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

9. You need to set up Azure Data Factory pipelines to meet data movement requirements.

Which integration runtime should you use?

 
 
 
 

10. Validate configuration results and deploy the solution

Does the solution meet the goal?

 
 

11. Validate configuration results and deploy the solution

Does the solution meet the goal?

 
 

12. Validate configuration results and deploy the solution

Does the solution meet the goal?

 
 

13. HOTSPOT

You need to mask tier 1 data.

Which functions should you use? To answer, select the appropriate option in the answer area.

NOTE: Each correct selection is worth one point.
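For reference, dynamic data masking in Azure SQL Database is applied per column with built-in functions such as default(), email(), random(), and partial(). A minimal T-SQL sketch, assuming a hypothetical dbo.Respondents table:

```sql
-- Mask the e-mail column with the built-in email() masking function
ALTER TABLE dbo.Respondents
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Expose only the last four digits of the phone number
ALTER TABLE dbo.Respondents
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');
```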

14. DRAG DROP

You need to set up access to Azure SQL Database for Tier 7 and Tier 8 partners.

Which three actions should you perform in sequence? To answer, move the appropriate three actions from the list of actions to the answer area and arrange them in the correct order.

15. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these

questions will not appear in the review screen.

You need to implement diagnostic logging for Data Warehouse monitoring.

Which log should you use?

 
 
 
 

16. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You need to set up monitoring for tiers 6 through 8.

What should you configure?

 
 
 
 
 

17. Topic 3, Misc Questions

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.

You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse.

The data to be ingested resides in Parquet files stored in an Azure Data Lake Gen 2 storage account.

You need to load the data from the Azure Data Lake Gen 2 storage account into the Azure SQL Data Warehouse.

Solution:

Create an external data source pointing to the Azure Data Lake Gen 2 storage account.

Create an external file format and external table using the external data source.

Load the data using the CREATE TABLE AS SELECT statement.

Does the solution meet the goal?
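The three steps in this solution can be sketched in T-SQL as follows (the data source name, credential, file locations, and table definitions are all hypothetical):

```sql
-- 1. External data source pointing to the Azure Data Lake Gen 2 account
CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://data@contosodatalake.dfs.core.windows.net',
    CREDENTIAL = ADLSCredential
);

-- 2. External file format for the Parquet source files
CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

-- 3. External table that uses the data source and file format
CREATE EXTERNAL TABLE ext.PollResults (
    PollId    INT,
    Responses INT
)
WITH (
    LOCATION = '/polls/',
    DATA_SOURCE = AzureDataLakeStore,
    FILE_FORMAT = ParquetFormat
);

-- 4. Load into the warehouse with CREATE TABLE AS SELECT
CREATE TABLE dbo.PollResults
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM ext.PollResults;
```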

 
 

18. Load the data using the CREATE TABLE AS SELECT statement.

Does the solution meet the goal?

 
 

19. Load the data using the CREATE TABLE AS SELECT statement.

Does the solution meet the goal?

 
 

20. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. A company uses Azure Data Lake Gen 1 Storage to store big data related to consumer behavior. You need to implement logging.

Solution: Use information stored in Azure Active Directory reports.

Does the solution meet the goal?

 
 

21. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. A company uses Azure Data Lake Gen 1 Storage to store big data related to consumer behavior.

You need to implement logging.

Solution: Configure Azure Data Lake Storage diagnostics to store logs and metrics in a storage account.

Does the solution meet the goal?

 
 

22. Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen. A company uses Azure Data Lake Gen 1 Storage to store big data related to consumer behavior. You need to implement logging.

Solution: Create an Azure Automation runbook to copy events.

Does the solution meet the goal?

 
 

23. A company has a Microsoft Azure HDInsight solution that uses different cluster types to process and analyze data. Operations are continuous.

Reports indicate slowdowns during a specific time window.

You need to determine a monitoring solution to track down the issue in the least amount of time.

What should you use?

 
 
 
 
 

24. Your company uses several Azure HDInsight clusters.

The data engineering team reports several errors with some applications using these clusters.

You need to recommend a solution to review the health of the clusters.

What should you include in your recommendation?

 
 
 

25. HOTSPOT

A company is planning to use Microsoft Azure Cosmos DB as the data store for an application. You have the following Azure CLI command:

az cosmosdb create --name "cosmosdbdev1" --resource-group "rgdev"

You need to minimize latency and expose the SQL API.

How should you complete the command? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
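One plausible completion of the command (the added parameters are illustrative, not the graded answer): --kind GlobalDocumentDB exposes the SQL API, and a relaxed default consistency level such as Eventual favors low latency:

```shell
az cosmosdb create --name "cosmosdbdev1" --resource-group "rgdev" \
    --kind GlobalDocumentDB \
    --default-consistency-level Eventual
```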

26. A company manages several on-premises Microsoft SQL Server databases.

You need to migrate the databases to Microsoft Azure by using the backup process of Microsoft SQL Server.

Which data technology should you use?

 
 
 
 

27. You implement an Azure SQL Data Warehouse instance.

You plan to migrate the largest fact table to Azure SQL Data Warehouse. The table resides on an on-premises Microsoft SQL Server instance and is 10 terabytes (TB) in size.

Incoming queries use the primary key Sale Key column to retrieve data, as displayed in the following table:

You need to distribute the fact table across multiple nodes to optimize performance of the table.

Which technology should you use?

 
 
 
 
 

28. An application will use Microsoft Azure Cosmos DB as its data solution. The application will use the Cassandra API to support a column-based database type that uses containers to store items.

You need to provision Azure Cosmos DB.

Which container name and item name should you use? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

 
 
 
 
 

29. A company is designing a hybrid solution to synchronize data from an on-premises Microsoft SQL Server database to Azure SQL Database.

You must perform an assessment of databases to determine whether data will move without compatibility issues.

You need to perform the assessment.

Which tool should you use?

 
 
 
 
 

30. DRAG DROP

You are developing the data platform for a global retail company. The company operates during normal working hours in each region. The analytical database is used once a week for building sales projections.

Each region maintains its own private virtual network.

Building the sales projections is very resource intensive and generates upwards of 20 terabytes (TB) of data.

Microsoft Azure SQL Databases must be provisioned.

– Database provisioning must maximize performance and minimize cost

– The daily sales for each region must be stored in an Azure SQL Database instance

– Once a day, the data for all regions must be loaded in an analytical Azure SQL Database instance

You need to provision Azure SQL database instances.

How should you provision the database instances? To answer, drag the appropriate Azure SQL products to the correct databases. Each Azure SQL product may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

31. You are the data engineer for your company. An application uses a NoSQL database to store data. The database uses the key-value and wide-column NoSQL database types.

Developers need to access data in the database using an API.

You need to determine which API to use for the database model and type.

Which two APIs should you use? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

 
 
 
 
 

32. DRAG DROP

Your company uses Microsoft Azure SQL Database configured with an elastic pool. You use Elastic Database jobs to run queries across all databases in the pool.

You need to analyze, troubleshoot, and report on components responsible for running Elastic Database jobs.

You need to determine the component responsible for running job service tasks.

Which components should you use for each Elastic pool job services task? To answer, drag the appropriate component to the correct task. Each component may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

33. DRAG DROP

You are developing a solution to visualize multiple terabytes of geospatial data.

The solution has the following requirements:

– Data must be encrypted.

– Data must be accessible by multiple resources on Microsoft Azure.

You need to provision storage for the solution.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

34. You are developing a data engineering solution for a company. The solution will store a large set of key-value pair data by using Microsoft Azure Cosmos DB.

The solution has the following requirements:

• Data must be partitioned into multiple containers.

• Data containers must be configured separately.

• Data must be accessible from applications hosted around the world.

• The solution must minimize latency.

You need to provision Azure Cosmos DB.

 
 
 
 
 
 

