DP-500 Microsoft Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI Exam

Posted by: admin | Posted on: Aug 11, 2022

Candidates for this exam should have advanced Power BI skills, including managing data repositories and data processing in the cloud and on-premises, along with using Power Query and Data Analysis Expressions (DAX). They should also be proficient in consuming data from Azure Synapse Analytics and should have experience querying relational databases, analyzing data by using Transact-SQL (T-SQL), and visualizing data.

Part of the requirements for: Microsoft Certified: Azure Enterprise Data Analyst Associate

Related exams: none

Languages: English
Retirement date: none

This exam measures your ability to accomplish the following technical tasks: implement and manage a data analytics environment; query and transform data; implement and manage data models; and explore and visualize data.

This study guide should help you understand what to expect on Exam DP-500: Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI, and includes a summary of the topics the exam might cover and links to additional resources. The information and materials in this document should help you focus your studies as you prepare for the exam.

Certification renewal
Once you earn your certification, don’t let it expire. When you have an active certification that’s expiring within six months, you can renew it at no cost by passing a renewal assessment on Microsoft Learn. Remember to renew your certification annually if you want to retain it.

To identify which certifications are available for you to renew, visit the Certifications section of your Microsoft Learn profile:
• Ensure your certification profile is connected to your Microsoft Learn profile.
• Expect an email that directs you to the applicable assessment that you must pass on Microsoft Learn. You’ll receive this email as soon as you have a certification that you’re eligible to renew.
• When you pass an online assessment, your certification will extend by one year from the current expiration date.
• To help prepare for the assessment, explore the collection of free modules on the certification renewal page.

About the exam
Exam DP-500: Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI is required to earn the Microsoft Azure Enterprise Data Analyst Associate certification.

Passing score
A passing score is 700. Learn more about exam scoring and score reports.

What to expect on the exam

Are you new to Microsoft certification exams? You can explore the exam environment by visiting our exam sandbox. We created the sandbox so you have an opportunity to experience an exam before you take it. In the sandbox, you can interact with different question types, such as build list, case studies, and others that you might encounter in the user interface when you take an exam. Additionally, it includes the introductory screens, instructions, and help topics related to the different types of questions that your exam might include. It also includes the non-disclosure agreement that you must accept before you can launch the exam.

Prepare to take the exam
There are several points to consider as you prepare for an exam. The following sections cover them.

Request accommodations
We’re committed to ensuring all learners are set up for success. If you use assistive devices, require extra time, or need modifications to any part of the exam experience, you can request an accommodation. To learn more about available accommodations and how to obtain them, visit this page.

Objective domain: skills the exam measures
The English language version of this exam was released on June 29, 2022.
Some exams are localized into other languages, and those are updated approximately eight weeks after the English version is updated. Other available languages are listed in the Schedule Exam section of the Exam Details webpage. If the exam isn’t available in your preferred language, you can request an additional 30 minutes to complete the exam.

Note: The bullets that appear below each of the skills measured are intended to illustrate how we assess that skill. Related topics may be covered in the exam.

Note: Most questions cover features that are in general availability (GA). The exam may contain questions on preview features if those features are commonly used.

Skills measured
• Implement and manage a data analytics environment (25–30%)
• Query and transform data (20–25%)
• Implement and manage data models (25–30%)
• Explore and visualize data (20–25%)

Functional groups
Implement and manage a data analytics environment (25–30%)


Govern and administer a data analytics environment
• Manage Power BI assets by using Azure Purview
• Identify data sources in Azure by using Azure Purview
• Recommend settings in the Power BI admin portal
• Recommend a monitoring and auditing solution for a data analytics environment, including the Power BI REST API and PowerShell cmdlets (a REST API auditing sketch follows this list)
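To make the auditing bullet above more concrete, here is a minimal Python sketch that pulls one day of tenant activity from the Power BI admin Activity Events REST API. Token acquisition is omitted, the placeholder values are hypothetical, and you should verify endpoint behavior and required permissions against the current Power BI REST API documentation.

# Minimal sketch: collect one UTC day of Power BI audit activity through the
# admin Activity Events REST API. Assumes you already hold an Azure AD access
# token with Power BI admin permissions; acquiring the token is out of scope.
import requests

ACCESS_TOKEN = "<your-azure-ad-token>"  # hypothetical placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2024-01-01T00:00:00Z'"
    "&endDateTime='2024-01-01T23:59:59Z'"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

events = []
while url:
    response = requests.get(url, headers=headers, timeout=30)
    response.raise_for_status()
    payload = response.json()
    events.extend(payload.get("activityEventEntities", []))
    # The API pages its results; follow continuationUri until it is absent.
    url = payload.get("continuationUri")

print(f"Collected {len(events)} activity events")

The same data is also exposed through the Get-PowerBIActivityEvent cmdlet in the Power BI Management PowerShell module, so either approach can back a monitoring solution.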

Integrate an analytics platform into an existing IT infrastructure
• Identify requirements for a solution, including features, performance, and licensing strategy
• Configure and manage Power BI capacity
• Recommend and configure an on-premises gateway in Power BI
• Recommend and configure a Power BI tenant or workspace to integrate with Azure Data Lake Storage Gen2
• Integrate an existing Power BI workspace into Azure Synapse Analytics

Manage the analytics development lifecycle
• Commit code and artifacts to a source control repository in Azure Synapse Analytics
• Recommend a deployment strategy for Power BI assets
• Recommend a source control strategy for Power BI assets
• Implement and manage deployment pipelines in Power BI
• Perform impact analysis of downstream dependencies from dataflows and datasets
• Recommend automation solutions for the analytics development lifecycle, including Power BI REST API and PowerShell cmdlets (a refresh-automation sketch follows this list)
• Deploy and manage datasets by using the XMLA endpoint
• Create reusable assets, including Power BI templates, Power BI data source (.pbids) files, and shared datasets
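As a simple illustration of lifecycle automation with the REST API, the sketch below triggers an on-demand dataset refresh. The workspace and dataset IDs are hypothetical placeholders, the token would typically come from a service principal, and a similar pattern applies to other lifecycle operations such as deployment pipelines.

# Minimal automation sketch: start a dataset refresh through the Power BI
# REST API. IDs below are hypothetical; token acquisition is omitted.
import requests

ACCESS_TOKEN = "<your-azure-ad-token>"
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# POST queues the refresh; a 202 response means it was accepted.
response = requests.post(
    url, headers=headers, json={"notifyOption": "NoNotification"}, timeout=30
)
response.raise_for_status()
print("Refresh accepted:", response.status_code)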

Query and transform data (20–25%)

Query data by using Azure Synapse Analytics
• Identify an appropriate Azure Synapse pool when analyzing data
• Recommend appropriate file types for querying serverless SQL pools
• Query relational data sources in dedicated or serverless SQL pools, including querying partitioned data sources (a serverless query sketch follows this list)
• Use a machine learning PREDICT function in a query
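The following Python sketch shows one way to query partitioned Parquet files from a Synapse serverless SQL pool by sending an OPENROWSET query through pyodbc. The workspace name, storage account, path, and sign-in method are hypothetical assumptions; adjust the connection details for your environment.

# Minimal sketch: query Parquet files with a Synapse serverless SQL pool.
# Placeholders (<workspace>, <storageaccount>, <container>) are hypothetical.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<workspace>-ondemand.sql.azuresynapse.net;"
    "Database=master;"
    "Authentication=ActiveDirectoryInteractive;"  # one of several auth options
    "UID=<user@contoso.com>;"
)

sql = """
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<storageaccount>.dfs.core.windows.net/<container>/sales/year=*/month=*/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;
"""

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    for row in cursor.execute(sql):
        print(row)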

Ingest and transform data by using Power BI

• Identify data loading performance bottlenecks in Power Query or data sources
• Implement performance improvements in Power Query and data sources
• Create and manage scalable Power BI dataflows
• Identify and manage privacy settings on data sources
• Create queries, functions, and parameters by using the Power Query Advanced Editor
• Query advanced data sources, including JSON, Parquet, APIs, and Azure Machine Learning models

Implement and manage data models (25–30%)

Design and build tabular models
• Choose when to use DirectQuery for Power BI datasets
• Choose when to use external tools, including DAX Studio and Tabular Editor 2
• Create calculation groups
• Write calculations that use DAX variables and functions, for example handling blanks or errors, creating virtual relationships, and working with iterators (a DAX sketch follows this list)
• Design and build a large format dataset
• Design and build composite models, including aggregations
• Design and implement enterprise-scale row-level security and object-level security
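To illustrate the DAX bullet above, here is a minimal sketch that defines a measure with variables and DIVIDE (which returns BLANK instead of an error on division by zero) and runs it against a published dataset through the Power BI Execute Queries REST API. The dataset ID, table, and column names are hypothetical, and a valid Azure AD token is assumed.

# Minimal sketch: run a DAX query that uses variables and DIVIDE against a
# published dataset via the "Execute Queries" REST API. Names are hypothetical.
import requests

ACCESS_TOKEN = "<your-azure-ad-token>"
DATASET_ID = "<dataset-guid>"

dax = """
DEFINE
    MEASURE Sales[Margin %] =
        VAR _sales = SUM ( Sales[SalesAmount] )
        VAR _cost  = SUM ( Sales[TotalCost] )
        RETURN DIVIDE ( _sales - _cost, _sales )  -- BLANK on divide by zero
EVALUATE
SUMMARIZECOLUMNS ( 'Date'[Year], "Margin %", [Margin %] )
"""

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
body = {"queries": [{"query": dax}], "serializerSettings": {"includeNulls": True}}
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

response = requests.post(url, headers=headers, json=body, timeout=60)
response.raise_for_status()
print(response.json()["results"][0]["tables"][0]["rows"][:5])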

Optimize enterprise-scale data models

• Identify and implement performance improvements in queries and report visuals
• Troubleshoot DAX performance by using DAX Studio
• Optimize a data model by using Tabular Editor 2
• Analyze data model efficiency by using VertiPaq Analyzer
• Implement incremental refresh
• Optimize a data model by using denormalization

Explore and visualize data (20–25%)
Explore data by using Azure Synapse Analytics
• Explore data by using native visuals in Spark notebooks (a notebook sketch follows this list)
• Explore and visualize data by using the Azure Synapse SQL results pane
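As a small example of notebook exploration, the cell below loads Parquet data and renders it with the notebook's built-in display() output, which offers a chart view in addition to the grid. The storage path and column names are hypothetical; the spark session and display helper are assumed to be provided by the Synapse Spark notebook environment.

# Minimal sketch for a Synapse Spark notebook cell: load data and explore it
# with the native display() visual. Path and column names are placeholders.
df = (
    spark.read
    .parquet("abfss://<container>@<storageaccount>.dfs.core.windows.net/sales/")
)

summary = (
    df.groupBy("Year")
      .sum("SalesAmount")
      .withColumnRenamed("sum(SalesAmount)", "TotalSales")
)

# display() renders an interactive table with a switchable chart view.
display(summary)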

Visualize data by using Power BI
• Create and import a custom report theme
• Create R or Python visuals in Power BI (a Python visual sketch follows this list)
• Connect to and query datasets by using the XMLA endpoint
• Design and configure Power BI reports for accessibility
• Enable personalized visuals in a report
• Configure automatic page refresh
• Create and distribute paginated reports in Power BI Report Builder
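For the Python visual bullet above, here is a minimal sketch of the script you might paste into the Python script editor of a Power BI Python visual. Power BI supplies the fields added to the visual as a pandas DataFrame named dataset; the column names used below are hypothetical and depend on your model.

# Minimal sketch for a Power BI Python visual. Power BI injects the selected
# fields as a pandas DataFrame called `dataset`; column names are assumed.
import matplotlib.pyplot as plt

totals = dataset.groupby("Category")["SalesAmount"].sum().sort_values()

fig, ax = plt.subplots(figsize=(8, 4))
totals.plot.barh(ax=ax)
ax.set_xlabel("Sales amount")
ax.set_title("Sales by category")

plt.show()  # Power BI renders the current matplotlib figure as the visual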

Corresponding learning paths and modules

Learning paths and modules are designed to teach you how to perform a role, and they will help you study for the applicable exam. However, learning paths aren’t always organized in the same order as an exam’s “skills measured” list, so we’ve created a convenient table that links the skills measured to specific paths and modules.

Implement and manage a data analytics environment (25–30%)
Introduction to data analytics on Azure
• Explore Azure data services for modern analytics
• Understand concepts of data analytics
• Explore data analytics at scale

Implement and manage an analytics environment
Manage the analytics development lifecycle
• Design a Power BI application lifecycle management strategy
• Create and manage Power BI assets

Query and transform data (20–25%)
Model, query, and explore data in Azure Synapse
Optimize enterprise-scale tabular models
• Optimize refresh performance using Synapse and Power BI
• Optimize query performance using Synapse and Power BI
• Improve performance with hybrid tables
• Improve query performance with dual storage mode
• Improve query performance using aggregations
• Use tools to optimize Power BI performance

Implement and manage data models (25–30%)
Prepare data for tabular models in Power BI

• Choose a Power BI model framework
• Understand scalability in Power BI
• Create and manage scalable Power BI dataflows

Explore and visualize data (20–25%)
Implement advanced data visualization techniques by using Power BI
• Understand advanced data visualization concepts
• Monitor data in real-time with Power BI


QUESTION 1
What should you configure in the deployment pipeline?

A. a backward deployment
B. a selective deployment
C. auto-binding
D. a data source rule

Answer: C


QUESTION 2
You need to recommend a solution to add new fields to the financial data Power BI dataset with data from the Microsoft SQL Server data warehouse.
What should you include in the recommendation?

A. Azure Purview
B. Site-to-Site VPN
C. an XMLA endpoint
D. the on-premises data gateway

Answer: D


QUESTION 3
You need to recommend a solution for the customer workspaces to support the planned changes.
Which two configurations should you include in the recommendation? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Set Use datasets across workspaces to Enabled
B. Publish the financial data to the web.
C. Grant the Build permission for the financial data to each customer.
D. Configure the FinData workspace to use a Power BI Premium capacity.

Answer: AC


QUESTION 4
You develop a solution that uses a Power BI Premium capacity. The capacity contains a dataset that is expected to consume 50 GB of memory.
Which two actions should you perform to ensure that you can publish the model successfully to the Power BI service? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Increase the Max Offline Dataset Size setting.
B. Invoke a refresh to load historical data based on the incremental refresh policy.
C. Restart the capacity.
D. Publish an initial dataset that is less than 10 GB.
E. Publish the complete dataset.

Answer: BD


QUESTION 5
You have a Power BI workspace named Workspace1 in a Premium capacity. Workspace1 contains a dataset.
During a scheduled refresh, you receive the following error message: “Unable to save the changes since the new dataset size of 11,354 MB exceeds the limit of 10,240 MB.”
You need to ensure that you can refresh the dataset.
What should you do?

A. Turn on Large dataset storage format.
B. Connect Workspace1 to an Azure Data Lake Storage Gen2 account.
C. Change License mode to Premium per user.
D. Change the location of the Premium capacity.

Answer: A

