Designing an Azure Data Solution (DP-201) Exam Will Be Retired

When IT candidates pursue the Microsoft Certified: Azure Data Engineer Associate certification, they usually choose to pass the DP-200 and DP-201 exams. However, both the DP-200 and DP-201 exams will be retired on June 30, 2021. If you are planning to take them, please make sure you can complete both exams before that day. What's more, we have updated the DP-201 dumps with 208 practice exam questions and answers, so you can choose the latest Microsoft DP-201 dumps questions as your preparation materials. By the way, if you are planning to take the DP-203 exam to earn the Microsoft Certified: Azure Data Engineer Associate certification, you can also get valid DP-203 dumps from DumpsBase.

In this blog, you can check the DP-201 free dumps online.

1. Topic 1, Trey Research

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Background

Trey Research is a technology innovator. The company partners with regional transportation department offices to build solutions that improve traffic flow and safety.

The company is developing the following solutions:

Regional transportation departments installed traffic sensor systems on major highways across North America.

Sensors record the following information each time a vehicle passes in front of a sensor:

- Time

- Location in latitude and longitude

- Speed in kilometers per second (kmps)

- License plate number

- Length of vehicle in meters

Sensors provide data by using the following structure:

Traffic sensors will occasionally capture an image of a vehicle for debugging purposes.

You must optimize performance of saving/storing vehicle images.

Traffic sensor data

- Sensors must have permission only to add items to the SensorData collection.

- Traffic data insertion rate must be maximized.

- Once every three months all traffic sensor data must be analyzed to look for data patterns that indicate sensor malfunctions.

- Sensor data must be stored in a Cosmos DB database named treydata, in a collection named SensorData (see the write sketch after this list).

- The impact of vehicle images on sensor data throughput must be minimized.
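
As a quick study reference for the Cosmos DB requirements above, here is a minimal Python sketch (azure-cosmos v4 SDK) of a sensor appending one reading to the SensorData collection. The endpoint, credential, and JSON field names are our own assumptions, since the exam exhibit that defines the sensor data structure is not reproduced here.

```python
# Minimal sketch, for study reference only: a sensor appending one reading to
# the SensorData collection (azure-cosmos v4 SDK). The endpoint, credential,
# and field names below are assumptions, not taken from the exam exhibit.
import uuid
from datetime import datetime, timezone

from azure.cosmos import CosmosClient

client = CosmosClient(
    "https://treydata.documents.azure.com:443/",  # assumed account endpoint
    credential="<key-or-resource-token>",
)
container = client.get_database_client("treydata").get_container_client("SensorData")

reading = {
    "id": str(uuid.uuid4()),  # Cosmos DB requires an id on every item
    "time": datetime.now(timezone.utc).isoformat(),
    "location": {"lat": 47.6062, "lon": -122.3321},
    "speedKmps": 0.03,  # the case study measures speed in kilometers per second
    "licensePlate": "ABC-1234",
    "vehicleLengthMeters": 4.5,
}
container.create_item(reading)  # add-only access: sensors never read or update
```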

Backtrack

This solution reports on all data related to a specific vehicle license plate. The report must use data from the SensorData collection.

Users must be able to filter vehicle data in the following ways (see the query sketch after this list):

- vehicles on a specific road

- vehicles driving above the speed limit
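
For reference, a parameterized Cosmos DB SQL query that applies the license plate and speed filters might look like the following minimal sketch; the document schema and the speed limit value are assumptions.

```python
# Minimal sketch: filtering SensorData for one license plate above a speed
# limit. The document schema and the limit value are assumptions.
from azure.cosmos import CosmosClient

container = (
    CosmosClient("https://treydata.documents.azure.com:443/", credential="<key>")
    .get_database_client("treydata")
    .get_container_client("SensorData")
)

results = container.query_items(
    query=(
        "SELECT * FROM SensorData s "
        "WHERE s.licensePlate = @plate AND s.speedKmps > @limit"
    ),
    parameters=[
        {"name": "@plate", "value": "ABC-1234"},
        {"name": "@limit", "value": 0.033},  # roughly 120 km/h expressed in km/s
    ],
    enable_cross_partition_query=True,
)
for doc in results:
    print(doc["time"], doc["speedKmps"])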

Planning Assistance

Data used for Planning Assistance must be stored in a sharded Azure SQL Database.

Data from the Sensor Data collection will automatically be loaded into the Planning Assistance database once a week by using Azure Data Factory. You must be able to manually trigger the data load process.
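
The manual trigger requirement maps to an on-demand pipeline run in Azure Data Factory. A minimal sketch with the azure-mgmt-datafactory SDK follows; the subscription, resource group, factory, and pipeline names are all assumptions.

```python
# Minimal sketch: manually triggering the weekly Planning Assistance load.
# The resource names below are assumptions; only the SDK calls are real.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
run = adf.pipelines.create_run(
    resource_group_name="trey-research-rg",   # assumed
    factory_name="trey-adf",                  # assumed
    pipeline_name="LoadPlanningAssistance",   # assumed
)
print(f"Started pipeline run {run.run_id}")
```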

Privacy and security policy

- Azure Active Directory must be used for all services where it is available.

- For privacy reasons, license plate number information must not be accessible in Planning Assistance.

- Unauthorized usage of the Planning Assistance data must be detected as quickly as possible. Unauthorized usage is determined by looking for an unusual pattern of usage.

- Data must only be stored for seven years (see the TTL sketch after this list).
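
One built-in way to expire data automatically in Cosmos DB is a default time-to-live (TTL) on the container. A minimal sketch follows; TTL is one possible reading of the seven-year requirement, and the partition key is an assumption.

```python
# Minimal sketch: creating the collection with a default TTL so items are
# removed automatically after seven years. The partition key is an assumption.
from azure.cosmos import CosmosClient, PartitionKey

SEVEN_YEARS = 7 * 365 * 24 * 60 * 60  # TTL is expressed in seconds

client = CosmosClient("https://treydata.documents.azure.com:443/", credential="<key>")
client.get_database_client("treydata").create_container(
    id="SensorData",
    partition_key=PartitionKey(path="/licensePlate"),  # assumed
    default_ttl=SEVEN_YEARS,
)
```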

Performance and availability

- The report for Backtrack must execute as quickly as possible.

- The SLA for Planning Assistance is 70 percent, and multiday outages are permitted.

- All data must be replicated to multiple geographic regions to prevent data loss.

- You must maximize the performance of the Real Time Response system.

Financial requirements

Azure resource costs must be minimized where possible.

You need to design a sharding strategy for the Planning Assistance database.

What should you recommend?

2. You need to design the vehicle images storage solution.

What should you recommend?

3. HOTSPOT

You need to ensure that security policies for the unauthorized detection system are met.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

4. HOTSPOT

You need to design the authentication and authorization methods for sensors.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

5. HOTSPOT

You need to ensure that emergency road response vehicles are dispatched automatically.

How should you design the processing system? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

6. You need to recommend an Azure SQL Database pricing tier for Planning Assistance.

Which pricing tier should you recommend?

7. HOTSPOT

You need to design the SensorData collection.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

8. DRAG DROP

You need to design the data loading pipeline for Planning Assistance.

What should you recommend? To answer, drag the appropriate technologies to the correct locations. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

9. DRAG DROP

You need to ensure that performance requirements for Backtrack reports are met.

What should you recommend? To answer, drag the appropriate technologies to the correct locations. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

10. You need to design the runtime environment for the Real Time Response system.

What should you recommend?

11. HOTSPOT

You need to design the Planning Assistance database.

For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.

12. Topic 2, Case study 1

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview

You develop data engineering solutions for Graphics Design Institute, a global media company with offices in New York City, Manchester, Singapore, and Melbourne.

The New York office hosts SQL Server databases that store massive amounts of customer data. The company also stores millions of images on a physical server located in the New York office. More than 2 TB of image data is added each day. The images are transferred from customer devices to the server in New York.

Many images have been placed on this server in an unorganized manner, making it difficult for editors to search images. Images should automatically have object and color tags generated. The tags must be stored in a document database and be queried by using SQL.
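
A document database that can be queried by SQL maps naturally to something like the Azure Cosmos DB SQL API. As a quick illustration only (the document shape and all names are our assumptions), a tag document and query could look like this:

```python
# Minimal sketch: storing an image tag document and querying it with SQL-like
# syntax (Cosmos DB SQL API). The document shape and names are assumptions.
from azure.cosmos import CosmosClient

container = (
    CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
    .get_database_client("tagging")
    .get_container_client("ImageTags")
)

container.create_item({
    "id": "img-000123",
    "imageUri": "https://<account>.blob.core.windows.net/images/img-000123.jpg",
    "objects": ["car", "truck"],   # generated object tags
    "colors": ["red", "black"],    # generated color tags
})

red_vehicles = container.query_items(
    query=(
        "SELECT c.imageUri FROM c "
        "WHERE ARRAY_CONTAINS(c.objects, 'car') AND ARRAY_CONTAINS(c.colors, 'red')"
    ),
    enable_cross_partition_query=True,
)
```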

You are hired to design a solution that can store, transform, and visualize customer data.

Requirements

Business

The company identifies the following business requirements:

- You must transfer all images and customer data to cloud storage and remove on-premises servers.

- You must develop an analytical processing solution for transforming customer data.

- You must develop an image object and color tagging solution.

- Capital expenditures must be minimized.

- Cloud resource costs must be minimized.

Technical

The solution has the following technical requirements:

- Tagging data must be uploaded to the cloud from the New York office location.

- Tagging data must be replicated to regions that are geographically close to company office locations.

- Image data must be stored in a single data store at minimum cost.

- Customer data must be analyzed using managed Spark clusters.

- Power BI must be used to visualize transformed customer data.

- All data must be backed up in case disaster recovery is required.

Security and optimization

All cloud data must be encrypted at rest and in transit.

The solution must support:

- parallel processing of customer data

- hyper-scale storage of images

- global region data replication

You need to recommend a solution for storing customer data.

What should you recommend?

13. HOTSPOT

You need to design storage for the solution.

Which storage services should you recommend? To answer, select the appropriate configuration in the answer area. NOTE: Each correct selection is worth one point.

14. HOTSPOT

You need to design the image processing and storage solutions.

What should you recommend? To answer, select the appropriate configuration in the answer area. NOTE: Each correct selection is worth one point.

15. What should you recommend to prevent users outside the Litware on-premises network from accessing the analytical data store?

16. You need to design the solution for analyzing customer data.

What should you recommend?

17. You need to design a backup solution for the processed customer data.

What should you include in the design?

18. DRAG DROP

You discover that the highest chance of corruption or bad data occurs during nightly inventory loads.

You need to ensure that you can quickly restore the data to its state before the nightly load and avoid missing any streaming data.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

19. DRAG DROP

You need to design the image processing solution to meet the optimization requirements for image tag data.

What should you configure? To answer, drag the appropriate setting to the correct drop targets.

Each source may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

20. What should you recommend using to secure sensitive customer contact information?

21. You plan to use an Azure SQL Data Warehouse to store the customer data. You need to recommend a disaster recovery solution for the data warehouse.

What should you include in the recommendation?

22. DRAG DROP

You need to design the encryption strategy for the tagging data and customer data.

What should you recommend? To answer, drag the appropriate setting to the correct drop targets. Each source may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

23. You need to recommend a solution for storing the image tagging data.

What should you recommend?

24. Topic 3, Case study 2

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Background

Current environment

The company has the following virtual machines (VMs):

Requirements

Storage and processing

You must be able to use a file system view of data stored in a blob.

You must build an architecture that will allow Contoso to use the Databricks File System (DBFS) layer over a blob store. The architecture will need to support data files, libraries, and images. Additionally, it must provide a web-based interface to documents that contain runnable commands, visualizations, and narrative text, such as a notebook.
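
For study reference, the DBFS-over-blob requirement is what Databricks mounts provide. A minimal notebook-cell sketch follows; dbutils and display are only available inside a Databricks notebook, and the account, container, and secret scope names are assumptions.

```python
# Minimal sketch (Databricks notebook cell): exposing a blob container through
# the DBFS layer. The account, container, and secret scope are assumptions.
dbutils.fs.mount(
    source="wasbs://data@contosostore.blob.core.windows.net",
    mount_point="/mnt/contoso-data",
    extra_configs={
        "fs.azure.account.key.contosostore.blob.core.windows.net":
            dbutils.secrets.get(scope="contoso", key="storage-key")
    },
)
display(dbutils.fs.ls("/mnt/contoso-data"))  # file system view over the blob store
```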

CONT_SQL3 requires an initial scale of 35000 IOPS.

CONT_SQL1 and CONT_SQL2 must use the vCore model and should include replicas. The solution must support 8000 IOPS.

The storage should be optimized for database OLTP workloads.

Migration

- You must be able to independently scale compute and storage resources.

- You must migrate all SQL Server workloads to Azure. You must identify related machines in the on-premises environment and gather disk size and data usage information.

- Data from SQL Server must include zone redundant storage.

- You need to ensure that app components can reside on-premises while interacting with components that run in the Azure public cloud.

- SAP data must remain on-premises.

- The Azure Site Recovery (ASR) results should contain per-machine data.

Business requirements

- You must design a regional disaster recovery topology.

- The database backups have regulatory purposes and must be retained for seven years.

- CONT_SQL1 stores customer sales data that requires ETL operations for data analysis. A solution is required that reads data from SQL, performs ETL, and outputs to Power BI. The solution should use managed clusters to minimize costs. To optimize logistics, Contoso needs to analyze customer sales data to see if certain products are tied to specific times in the year.

- The analytics solution for customer sales data must be available during a regional outage.

Security and auditing

- Contoso requires all corporate computers to enable Windows Firewall.

- Azure servers should be able to ping other Contoso Azure servers.

- Employee PII must be encrypted in memory, in motion, and at rest. Any data encrypted by SQL Server must support equality searches, grouping, indexing, and joining on the encrypted data.

- Keys must be secured by using hardware security modules (HSMs).

- CONT_SQL3 must not communicate over the default ports.

Cost

- All solutions must minimize cost and resources.

- The organization does not want any unexpected charges.

- The data engineers must set the SQL Data Warehouse compute resources to consume 300 DWUs.

- CONT_SQL2 is not fully utilized during non-peak hours. You must minimize resource costs during non-peak hours.

You plan to use Azure SQL Database to support a line of business app.

You need to identify sensitive data that is stored in the database and monitor access to the data.

Which three actions should you recommend? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

25. You need to design a solution to meet the SQL Server storage requirements for CONT_SQL3.

Which type of disk should you recommend?

26. DRAG DROP

You are designing an Azure SQL Data Warehouse for a financial services company. Azure Active Directory will be used to authenticate the users.

You need to ensure that the following security requirements are met:

✑ Department managers must be able to create new databases.

✑ The IT department must assign users to databases.

✑ Permissions granted must be minimized.

Which role memberships should you recommend? To answer, drag the appropriate roles to the correct groups. Each role may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

27. You need to recommend the appropriate storage and processing solution.

What should you recommend?

28. You need to optimize storage for CONT_SQL3.

What should you recommend?

29. A company stores sensitive information about customers and employees in Azure SQL Database.

You need to ensure that the sensitive data remains encrypted in transit and at rest.

What should you recommend?

30. You need to recommend a backup strategy for CONT_SQL1 and CONT_SQL2.

What should you recommend?

31. HOTSPOT

You need to design network access to the SQL Server data.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

32. You need to recommend an Azure SQL Database service tier.

What should you recommend?

33. You need to design the disaster recovery solution for customer sales data analytics.

Which three actions should you recommend? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

34. Topic 4, ADatum Corporation

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview

General Overview

ADatum Corporation is a medical company that has 5,000 physicians located in more than 300 hospitals across the US. The company has a medical department, a sales department, a marketing department, a medical research department, and a human resources department.

You are redesigning the application environment of ADatum.

Physical Locations

ADatum has three main offices in New York, Dallas, and Los Angeles. The offices connect to each other by using a WAN link. Each office connects directly to the Internet. The Los Angeles office also has a datacenter that hosts all the company's applications.

Existing Environment

Health Review

ADatum has a critical OLTP web application named Health Review that physicians use to track billing, patient care, and overall physician best practices.

Health Interface

ADatum has a critical application named Health Interface that receives hospital messages related to patient care and status updates. The messages are sent in batches by each hospital's enterprise relationship management (ERM) system by using a VPN. The data sent from each hospital can have varying columns and formats.

Currently, a custom C# application is used to send the data to Health Interface. The application uses deprecated libraries and a new solution must be designed for this functionality.

Health Insights

ADatum has a web-based reporting system named Health Insights that shows hospital and patient insights to physicians and business users. The data is created from the data in Health Review and Health Interface, as well as manual entries.

Database Platform

Currently, the databases for all three applications are hosted on an out-of-date VMware cluster that has a single instance of Microsoft SQL Server 2012.

Problem Statements

ADatum identifies the following issues in its current environment:

- Over time, the data received by Health Interface from the hospitals has slowed, and the number of messages has increased.

- When a new hospital joins ADatum, Health Interface requires a schema modification due to the lack of data standardization.

- The speed of batch data processing is inconsistent.

Business Requirements

Business Goals

ADatum identifies the following business goals:

- Migrate the applications to Azure whenever possible.

- Minimize the development effort required to perform data movement.

- Provide continuous integration and deployment for development, test, and production environments.

- Provide faster access to the applications and the data and provide more consistent application performance.

- Minimize the number of services required to perform data processing, development, scheduling, monitoring, and the operationalizing of pipelines.

Health Review Requirements

ADatum identifies the following requirements for the Health Review application:

- Ensure that sensitive health data is encrypted at rest and in transit.

- Tag all the sensitive health data in Health Review. The data will be used for auditing.

Health Interface Requirements

ADatum identifies the following requirements for the Health Interface application:

- Upgrade to a data storage solution that will provide flexible schemas and increased throughput for writing data. Data must be regionally located close to each hospital, and reads must display the most recent committed version of an item.

- Reduce the amount of time it takes to add data from new hospitals to Health Interface.

- Support a more scalable batch processing solution in Azure.

- Reduce the amount of development effort to rewrite existing SQL queries.

Health Insights Requirements

ADatum identifies the following requirements for the Health Insights application:

- The analysis of events must be performed over time by using an organizational date dimension table.

- The data from Health Interface and Health Review must be available in Health Insights within 15 minutes of being committed.

- The new Health Insights application must be built on a massively parallel processing (MPP) architecture that will support the high performance of joins on large fact tables.

HOTSPOT

Which Azure data storage solution should you recommend for each application? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

35. You need to recommend a security solution that meets the requirements of Health Review.

What should you include in the recommendation?

36. You need to recommend a solution that meets the data platform requirements of Health Interface. The solution must minimize redevelopment efforts for the application.

What should you include in the recommendation?

37. Which consistency level should you use for Health Interface?

38. You need to design a solution that meets the business requirements of Health Insights.

What should you include in the recommendation?

39. You need to recommend a solution to quickly identify all the columns in Health Review that contain sensitive health data.

What should you include in the recommendation?

40. What should you recommend as a batch processing solution for Health Interface?

41. HOTSPOT

You need to design the storage for the Health Insights data platform.

Which types of tables should you include in the design? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

42. Topic 5, Data Engineer for Trey Research

Overview

You are a data engineer for Trey Research. The company is close to completing a joint project with the government to build smart highways infrastructure across North America. This involves the placement of sensors and cameras to measure traffic flow, car speed, and vehicle details.

You have been asked to design a cloud solution that will meet the business and technical requirements of the smart highway.

Solution components

Telemetry Capture

The telemetry capture system records each time a vehicle passes in front of a sensor.

The sensors run on a custom embedded operating system and record the following telemetry data:

- Time

- Location in latitude and longitude

- Speed in kilometers per hour (kmph)

- Length of vehicle in meters

Visual Monitoring

The visual monitoring system is a network of approximately 1,000 cameras placed near highways that capture images of vehicle traffic every 2 seconds. The cameras record high-resolution images. Each image is approximately 3 MB in size.
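
As a rough sizing check (assuming every camera uploads continuously): 1,000 cameras × one 3 MB image every 2 seconds ≈ 1.5 GB per second, or roughly 130 TB of new image data per day, which explains this topic's emphasis on low-cost, hyper-scale storage and automatic deletion.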

Requirements: Business

The company identifies the following business requirements:

- External vendors must be able to perform custom analysis of data using machine learning technologies.

- You must display a dashboard on the operations status page that displays the following metrics: telemetry, volume, and processing latency.

- Traffic data must be made available to the Government Planning Department for the purpose of modeling changes to the highway system. The traffic data will be used in conjunction with other data such as information about events such as sporting events, weather conditions, and population statistics. External data used during the modeling is stored in on-premises SQL Server 2016 databases and CSV files stored in an Azure Data Lake Storage Gen2 storage account.

- Information about vehicles that have been detected as going over the speed limit during the last 30 minutes must be available to law enforcement officers. Several law enforcement organizations may respond to speeding vehicles.

- The solution must allow for searches of vehicle images by license plate to support law enforcement investigations. Searches must be able to be performed using a query language and must support fuzzy searches to compensate for license plate detection errors.

Requirements: Security

The solution must meet the following security requirements:

- External vendors must not have direct access to sensor data or images.

- Images produced by the vehicle monitoring solution must be deleted after one month. You must minimize costs associated with deleting images from the data store (see the lifecycle policy sketch after this list).

- Unauthorized usage of data must be detected in real time. Unauthorized usage is determined by looking for unusual usage patterns.

- All changes to Azure resources used by the solution must be recorded and stored. Data must be provided to the security team for incident response purposes.
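
The delete-after-one-month requirement, with minimal deletion cost, is the classic use case for a blob storage lifecycle management policy. A minimal sketch of such a rule follows; the "images/" container prefix is an assumption, and the rule shape follows the Azure Storage lifecycle policy schema.

```python
# Minimal sketch: a lifecycle management rule that deletes vehicle images 30
# days after last modification, so no custom (and costly) deletion jobs run.
# The "images/" prefix is an assumption.
import json

policy = {
    "rules": [{
        "name": "delete-vehicle-images-after-30-days",
        "enabled": True,
        "type": "Lifecycle",
        "definition": {
            "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["images/"]},
            "actions": {
                "baseBlob": {"delete": {"daysAfterModificationGreaterThan": 30}}
            },
        },
    }]
}
print(json.dumps(policy, indent=2))  # apply via the portal, CLI, or an ARM template
```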

Requirements: Sensor data

You must write all telemetry data to the closest Azure region. The sensors used for the telemetry capture system have a small amount of memory available and so must write data as quickly as possible to avoid losing telemetry data.

You need to design the storage for the telemetry capture system.

What storage solution should you use in the design?

43. You need to design the storage for the visual monitoring system.

Which storage solution should you recommend?

44. You need to design the unauthorized data usage detection system.

What Azure service should you include in the design?

45. DRAG DROP

You need to design the system for notifying law enforcement officers about speeding vehicles.

How should you design the pipeline? To answer, drag the appropriate services to the correct locations. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.

46. You need to design the solution for the government planning department.

Which services should you include in the design?

47. Topic 6, Litware Case

Case study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview

Litware, Inc. owns and operates 300 convenience stores across the US. The company sells a variety of packaged foods and drinks, as well as a variety of prepared foods, such as sandwiches and pizzas.

Litware has a loyalty club whereby members can get daily discounts on specific items by providing their membership number at checkout.

Litware employs business analysts who prefer to analyze data by using Microsoft Power BI, and data scientists who prefer analyzing data in Azure Databricks notebooks.

Requirements. Business Goals

Litware wants to create a new analytics environment in Azure to meet the following requirements:

- See inventory levels across the stores. Data must be updated as close to real time as possible.

- Execute ad hoc analytical queries on historical data to identify whether the loyalty club discounts increase sales of the discounted products.

- Every four hours, notify store employees about how many prepared food items to produce based on historical demand from the sales data.

Requirements. Technical Requirements

Litware identifies the following technical requirements:

- Minimize the number of different Azure services needed to achieve the business goals

- Use platform as a service (PaaS) offerings whenever possible and avoid having to provision virtual machines that must be managed by Litware.

- Ensure that the analytical data store is accessible only to the company’s on-premises network and Azure services.

- Use Azure Active Directory (Azure AD) authentication whenever possible.

- Use the principle of least privilege when designing security.

- Stage inventory data in Azure Data Lake Storage Gen2 before loading the data into the analytical data store. Litware wants to remove transient data from Data Lake Storage once the data is no longer in use. Files that have a modified date that is older than 14 days must be removed.

- Limit the business analysts’ access to customer contact information, such as phone numbers, because this type of data is not analytically relevant.

- Ensure that you can quickly restore a copy of the analytical data store within one hour in the event of corruption or accidental deletion.

Requirements. Planned Environment

Litware plans to implement the following environment:

- The application development team will create an Azure event hub to receive real-time sales data, including store number, date, time, product ID, customer loyalty number, price, and discount amount, from the point of sale (POS) system and output the data to data storage in Azure (see the producer sketch after this list).

- Customer data, including name, contact information, and loyalty number, comes from Salesforce and can be imported into Azure once every eight hours. Row modified dates are not trusted in the source table.

- Product data, including product ID, name, and category, comes from Salesforce and can be imported into Azure once every eight hours. Row modified dates are not trusted in the source table.

- Daily inventory data comes from a Microsoft SQL server located on a private network.

- Litware currently has 5 TB of historical sales data and 100 GB of customer data. The company expects approximately 100 GB of new data per month for the next year.

- Litware will build a custom application named FoodPrep to provide store employees with the calculation results of how many prepared food items to produce every four hours.

- Litware does not plan to implement Azure ExpressRoute or a VPN between the on-premises network and Azure.
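
For reference, publishing one POS sale to the planned event hub could look like this minimal sketch (azure-eventhub v5 SDK); the connection string, hub name, and event fields are assumptions.

```python
# Minimal sketch: the POS system publishing one sale event to the event hub.
# The connection string, hub name, and event fields are assumptions.
import json

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-connection-string>", eventhub_name="pos-sales"
)
sale = {
    "storeNumber": 42, "date": "2021-05-01", "time": "13:05:22",
    "productId": "P-1001", "loyaltyNumber": "L-9000",
    "price": 3.99, "discount": 0.50,
}
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(sale)))
    producer.send_batch(batch)
```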

What should you do to improve high availability of the real-time data processing solution?

48. Inventory levels must be calculated by subtracting the current day's sales from the previous day's final inventory.

Which two options provide Litware with the ability to quickly calculate the current inventory levels by store and product? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

49. HOTSPOT

Which Azure service and feature should you recommend using to manage the transient data for Data Lake Storage? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

50. Which Azure service should you recommend for the analytical data store so that the business analysts and data scientists can execute ad hoc queries as quickly as possible?

51. HOTSPOT

Which Azure Data Factory components should you recommend using together to import the customer data from Salesforce to Data Lake Storage? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

52. HOTSPOT

Which Azure Data Factory components should you recommend using together to import the daily inventory data from SQL to Data Lake Storage? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

53. Topic 7, Misc. Questions

HOTSPOT

You manage an on-premises server named Server1 that has a database named Database1. The company purchases a new application that can access data from Azure SQL Database.

You recommend a solution to migrate Database1 to an Azure SQL Database instance.

What should you recommend? To answer, select the appropriate configuration in the answer area. NOTE: Each correct selection is worth one point.

54. You are designing a statistical analysis solution that will use custom proprietary Python functions on near real-time data from Azure Event Hubs.

You need to recommend which Azure service to use to perform the statistical analysis. The solution must minimize latency.

What should you recommend?

55. You plan to create an Azure Synapse Analytics dedicated SQL pool.

You need to minimize the time it takes to identify queries that return confidential information as defined by the company’s data privacy regulations and the users who executed the queries.

Which two components should you include in the solution? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

56. HOTSPOT

You have an Azure Data Lake Storage Gen2 account named account1 that stores logs as shown in the following table.

You do not expect that the logs will be accessed during the retention periods.

You need to recommend a solution for account1 that meets the following requirements:

✑ Automatically deletes the logs at the end of each retention period

✑ Minimizes storage costs

What should you include in the recommendation? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

57. HOTSPOT

You are designing a solution for a company. You plan to use Azure Databricks.

You need to recommend workloads and tiers to meet the following requirements:

✑ Provide managed clusters for running production jobs.

✑ Provide persistent clusters that support auto-scaling for analytics processes.

✑ Provide role-based access control (RBAC) support for Notebooks.

What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

58. HOTSPOT

You have an Azure subscription that contains a logical Microsoft SQL server named Server1. Server1 hosts an Azure Synapse Analytics dedicated SQL pool named Pool1. You need to recommend a Transparent Data Encryption (TDE) solution for Server1.

The solution must meet the following requirements:

• Track the usage of encryption keys.

• Maintain the access of client apps to Pool1 in the event of an Azure datacenter outage that affects the availability of the encryption keys.

What should you include in the recommendation? To answer, select the appropriate options in the answer area.

59. HOTSPOT

A company stores large datasets in Azure, including sales transactions and customer account information.

You must design a solution to analyze the data.

You plan to create the following HDInsight clusters:

You need to ensure that the clusters support the query requirements.

Which cluster types should you recommend? To answer, select the appropriate configuration in the answer area. NOTE: Each correct selection is worth one point.

60. A company is developing a mission-critical line of business app that uses Azure SQL Database Managed Instance. You must design a disaster recovery strategy for the solution.

You need to ensure that the database automatically recovers when full or partial loss of the Azure SQL Database service occurs in the primary region.

What should you recommend?


 
