70-776 Perform Big Data Engineering on Microsoft Cloud Services

Prepare to clear the 70-776 Perform Big Data Engineering on Microsoft Cloud Services exam. The Microsoft 70-776 exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Practice the following 70-776 exam questions online for free:

1. You plan to add a file from Microsoft Azure Data Lake Store to Azure Data Catalog.
You run the Data Catalog tool and select Data Lake Store as the data source.
Which information should you enter in the Store Account field to connect to the Data Lake Store?

2. You have a Microsoft Azure Data Lake Analytics service.
You have a CSV file that contains employee salaries.
You need to write a U-SQL query to load the file and to extract all the employees who earn salaries that are greater than $100,000. You must encapsulate the data for reuse.
What should you use?
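For background, encapsulating an EXTRACT plus a filter for reuse in U-SQL is typically done with a table-valued function. A minimal sketch follows; the file path, schema, and function name are assumptions for illustration, not part of the question:

```sql
// Hypothetical U-SQL table-valued function that encapsulates the
// salary filter for reuse. Path and schema are illustrative only.
CREATE FUNCTION IF NOT EXISTS dbo.GetHighEarners()
RETURNS @result TABLE(EmployeeId int, Name string, Salary decimal)
AS
BEGIN
    @employees =
        EXTRACT EmployeeId int,
                Name string,
                Salary decimal
        FROM "/data/salaries.csv"
        USING Extractors.Csv(skipFirstNRows : 1);

    @result =
        SELECT EmployeeId, Name, Salary
        FROM @employees
        WHERE Salary > 100000;
END;
```

Once created, the function can be reused across scripts with `SELECT * FROM dbo.GetHighEarners();`.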

3. You have a Microsoft Azure Data Lake Analytics service.
You plan to configure diagnostic logging.
You need to use Microsoft Operations Management Suite (OMS) to monitor the IP addresses that are used to access the Data Lake.
What should you do?

4. You plan to use Microsoft Azure Data Factory to copy data daily from an Azure SQL data warehouse to an Azure Data Lake Store.
You need to define a linked service for the Data Lake Store. The solution must prevent the access token from expiring.
Which type of authentication should you use?

5. You ingest data into a Microsoft Azure event hub.
You need to export the data from the event hub to Azure Storage and to prepare the data for batch processing tasks in Azure Data Lake Analytics.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

6. You have a Microsoft Azure Data Lake Analytics service.
You need to write a U-SQL query to extract from a CSV file all the users who live in Boston, and then to save the results in a new CSV file.
Which U-SQL script should you use?
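As a reference point, a U-SQL script of this shape follows the EXTRACT–SELECT–OUTPUT pattern. This sketch assumes hypothetical paths and a hypothetical schema; only the general structure reflects the question:

```sql
// Hypothetical U-SQL script: extract users, filter by city, write results.
// Input/output paths and the schema are assumptions for illustration.
@users =
    EXTRACT UserId int,
            Name string,
            City string
    FROM "/input/users.csv"
    USING Extractors.Csv();

@boston =
    SELECT UserId, Name, City
    FROM @users
    WHERE City == "Boston";   // U-SQL expressions use C# semantics, hence ==

OUTPUT @boston
TO "/output/boston-users.csv"
USING Outputters.Csv();
```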

7. You are developing an application by using the Microsoft .NET SDK. The application will access data from a Microsoft Azure Data Lake folder.
You plan to authenticate the application by using service-to-service authentication.
You need to ensure that the application can access the Data Lake folder.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

8. You are developing an application by using the Microsoft .NET SDK. The application will access data from a Microsoft Azure Data Lake folder.
You plan to authenticate the application by using service-to-service authentication.
You need to ensure that the application can access the Data Lake folder.
Which three actions should you perform? Each correct answer presents part of the solution.

9. You need to use the Cognition.Vision.FaceDetector() function in U-SQL to analyze images. Which attribute can you detect by using the function?

10. You are using cognitive capabilities in U-SQL to analyze images that contain different types of objects.
You need to identify which objects might be people.
Which two reference assemblies should you use? Each correct answer presents part of the solution.

11. You have a Microsoft Azure Data Lake Analytics service and an Azure Data Lake Store. You need to use Python to submit a U-SQL job. Which Python module should you install?

12. You have a Microsoft Azure Data Lake Analytics service.
You need to provide a user with the ability to monitor Data Lake Analytics jobs. The solution must minimize the number of permissions assigned to the user.
Which role should you assign to the user?

13. You have a Microsoft Azure SQL data warehouse that has a fact table named FactOrder. FactOrder contains three columns named CustomerId, OrderId, and OrderDateKey. FactOrder is hash distributed on CustomerId. OrderId is the unique identifier for FactOrder. FactOrder contains 3 million rows.
Orders are distributed evenly among different customers from a table named dimCustomers that contains 2 million rows.
You often run queries that join FactOrder and dimCustomers by selecting and grouping by the OrderDateKey column.
You add 7 million rows to FactOrder. Most of the new records have a more recent OrderDateKey value than the previous records.
You need to reduce the execution time of queries that group on OrderDateKey and that join dimCustomers and FactOrder.
What should you do?
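For context, the distribution and partitioning of a table in Azure SQL Data Warehouse are fixed at creation time, so changing how a fact table is distributed means rebuilding it, typically with CREATE TABLE AS SELECT (CTAS). The sketch below shows the general pattern only; the column types are assumptions, and this is one possible approach rather than the confirmed answer:

```sql
-- Sketch only: rebuilding a fact table with a different hash-distribution
-- column via CTAS. Column types are assumptions for illustration.
CREATE TABLE dbo.FactOrder_New
WITH
(
    DISTRIBUTION = HASH(OrderDateKey),
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT CustomerId, OrderId, OrderDateKey
FROM dbo.FactOrder;
```

After the copy completes, the tables can be swapped with `RENAME OBJECT dbo.FactOrder TO FactOrder_Old;` followed by `RENAME OBJECT dbo.FactOrder_New TO FactOrder;`.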

14. You have a Microsoft Azure SQL data warehouse. The following statements are used to define file formats in the data warehouse.


You have an external PolyBase table named file_factPowerMeasurement that uses the FileFormat_ORC file format. You need to change file_factPowerMeasurement to use the FileFormat_PARQUET file format. Which two statements should you execute? Each correct answer presents part of the solution.
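As background, an external table's file format cannot be altered in place, so changing formats involves dropping and re-creating the table. The sketch below illustrates the pattern; the column list, location, and data source name are assumptions for illustration:

```sql
-- Sketch only: an external PolyBase table is dropped and re-created to
-- point at a different file format. Columns, location, and data source
-- names are assumptions for illustration.
DROP EXTERNAL TABLE file_factPowerMeasurement;

CREATE EXTERNAL TABLE file_factPowerMeasurement
(
    MeterId int,
    MeasurementValue float,
    MeasurementDate date
)
WITH
(
    LOCATION = '/power/',
    DATA_SOURCE = PowerDataSource,
    FILE_FORMAT = FileFormat_PARQUET
);
```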

15. You have a Microsoft Azure SQL data warehouse named DW1 that is used only from Monday to Friday.
You need to minimize Data Warehouse Unit (DWU) usage during the weekend.
What should you do?

16. You have a Microsoft Azure Data Lake Store and an Azure Active Directory tenant.
You are developing an application that will access the Data Lake Store by using end-user credentials.
You need to ensure that the application uses end-user authentication to access the Data Lake Store.
What should you create?

17. You have an on-premises deployment of Active Directory named contoso.com.
You plan to deploy a Microsoft Azure SQL data warehouse.
You need to ensure that the data warehouse can be accessed by contoso.com users.
Which two components should you deploy? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

18. You plan to deploy a Microsoft Azure virtual machine that will host a data warehouse. The data warehouse will contain a 10-TB database.
You need to provide the fastest read and write times for the database.
Which disk configuration should you use?

19. You have an on-premises data warehouse that uses Microsoft SQL Server 2016. All the data in the data warehouse comes from text files stored in Azure Blob storage. The text files are imported into the data warehouse by using SQL Server Integration Services (SSIS). The text files are not transformed.
You need to migrate the data to an Azure SQL data warehouse in the least amount of time possible.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
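As context for this scenario, the high-throughput bulk path from Blob storage text files into Azure SQL Data Warehouse is generally PolyBase: expose the files as an external table, then load with CTAS. A minimal sketch, in which every object name and the schema are illustrative assumptions:

```sql
-- Sketch only: loading Blob-storage text files via PolyBase and CTAS.
-- Object names, data source, file format, and schema are assumptions.
CREATE EXTERNAL TABLE ext.StagingSales
(
    SaleId int,
    Amount decimal(18, 2)
)
WITH
(
    LOCATION = '/sales/',
    DATA_SOURCE = BlobStorageSource,   -- pre-created external data source
    FILE_FORMAT = TextFileFormat       -- pre-created DELIMITEDTEXT format
);

CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = ROUND_ROBIN, CLUSTERED COLUMNSTORE INDEX)
AS
SELECT SaleId, Amount
FROM ext.StagingSales;
```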

20. You manage an on-premises data warehouse that uses Microsoft SQL Server. The data warehouse contains 100 TB of data. The data is partitioned by month. One TB of data is added to the data warehouse each month.
You create a Microsoft Azure SQL data warehouse and copy the on-premises data to the data warehouse.
You need to implement a process to replicate the on-premises data warehouse to the Azure SQL data warehouse. The solution must support daily incremental updates and must provide error handling.
What should you use?

Test yourself with 70-533 exam questions online for free.
Check and practice Microsoft MB6-895 exam questions as well.
