What you'll learn:
- Plan, implement, and manage a solution for data analytics
- Prepare and serve data
- Implement and manage semantic models
- Explore and analyze data
This course covers the additional content required for the DP-600 "Fabric Analytics Engineer Associate" certification exam, building on the knowledge you gained for the PL-300 exam. It reflects the DP-600 requirements as of 22 July 2024, with some updates as of 15 November 2024.
It comes in three parts - additional Power BI knowledge, Fabric lakehouses and data warehouses using SQL, and eventhouses using KQL.
In Part 1 of this course, we'll start with Power BI. We'll look at:
- Developing our design of semantic models, including calculation groups/items and field parameters.
- Expanding our DAX knowledge with DAX variables and windowing functions.
- Using external tools, such as Tabular Editor 2 and DAX Studio.
- Implementing many-to-many relationships, implementing dynamic format strings, and using the Optimize menu.
- The analytics development lifecycle, focusing on version control and deployment solutions.
- Other analytics topics, such as creating aggregation tables and using the XMLA endpoint.
In Part 2 of this course, we'll query and manipulate data in Fabric lakehouses and data warehouses using SQL.
After a brief look around Fabric, we'll start by ingesting data using data pipelines and dataflows. We'll then use our SQL knowledge in a lakehouse and a data warehouse.
We'll also create a lakehouse and use notebooks to load data, manipulate it in both SQL and PySpark, and save the resulting dataframes in PySpark.
(Note: PySpark is no longer required for the DP-600 exam - the relevant videos will shortly be removed.)
We'll create additional objects such as shortcuts and partitions, optimize performance, implement Slowly Changing Dimensions, and manage our Fabric capacity.
In Part 3 of this course, we'll look at eventhouses and KQL:
- We'll create an eventhouse, look at sample KQL queries, and see how you can convert SQL queries to KQL.
- We'll select, filter and aggregate data using KQL.
- We'll expand our KQL queries using string, number, datetime and timespan functions.
- Finally, we'll transform data using KQL, merge and join data, and identify and resolve duplicate and missing data.
Prior knowledge of all of the topics in the PL-300 exam is assumed, especially with regard to DAX functions. This content is available in "PL-300 certification: Microsoft Power BI Data Analyst", which is available on Udemy.
Prior knowledge of SQL Server would be helpful, but is not essential.
Once you have completed the course, you will have a good knowledge of maintaining a data analytics solution, preparing data, and implementing and managing semantic models. And with some practice, you could even go for the official Microsoft certification DP-600 - wouldn't the "Microsoft Certified: Fabric Analytics Engineer Associate" certification look good on your CV or resume?
I hope to see you in the course - why not have a look at what you could learn?