Data Transformation with Dataflow Gen2 (English)
Description
With Data Factory Dataflow Gen2 (part of Microsoft Fabric), you can easily retrieve data from a wide range of sources and formats, explore it, assess its quality, transform it, and load it into a SQL database, Azure Data Explorer (Kusto), a Fabric Lakehouse, or a Fabric Warehouse (SQL endpoint).
Dataflow Gen2 supports more than 350 different types of data transformations. Most of these can be configured through the graphical interface, but dataflows also support functions, *custom columns*, and direct entry or editing of the M code in the Advanced Editor.
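To give a feel for what that M code looks like, here is a minimal sketch of a query that adds a custom column; the inline table and the column names are purely illustrative and not part of the training material.

```
let
    // Illustrative inline table; in practice the source would be a connector
    // such as a Lakehouse, Warehouse, SQL database, or file
    Source = Table.FromRecords({
        [Product = "Keyboard", Quantity = 2, UnitPrice = 49.95],
        [Product = "Monitor",  Quantity = 1, UnitPrice = 179.00]
    }),
    // Custom column: line total per row, typed as a number
    AddedLineTotal = Table.AddColumn(Source, "LineTotal", each [Quantity] * [UnitPrice], type number)
in
    AddedLineTotal
```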
In this two-day training, we look at the best way to import, transform, and load your data using Dataflow Gen2 and M (the language of Dataflow Gen2). We will work hands-on with data to create queries that add value to your organization.
This training is aimed at data analysts and data engineers who want a powerful, user-friendly home for their data transformations in Fabric. Perhaps you recognize yourself in one of the following descriptions:
- You have been looking for a good ETL solution in the Microsoft suite for some time. Mapping Data Flows within Synapse Pipelines and ADF did not quite deliver what was needed; now you are looking for a tool that can be operated graphically and has a solid code backend, so you can share functions, reuse code, and keep maintenance simple.
- You have been developing reports in Power BI for some time and can load your own data into them. However, you regularly encounter sources that do not quite "fit" in Power BI, and you are looking for a way to load them correctly.
After this two-day training:
- You understand the structure of queries in "M" (the language of Dataflow Gen2)
- You can build advanced queries and know how to use the Advanced Editor to write M yourself
- You know how to make your queries configurable with parameters, enable incremental loading, and reduce development time (see the sketch after this list)
- You know how to document your queries properly so that your colleagues and/or customers can still understand them later
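As a small taste of the parameter topic, the sketch below shows how a query might use a datetime parameter to load only recently changed rows. The parameter name RangeStart, the server, the database, and the column names are assumptions for illustration only.

```
let
    // RangeStart is assumed to be a datetime parameter defined in the dataflow
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Keep only rows modified on or after the parameter value,
    // so each refresh loads just an incremental slice
    IncrementalOrders = Table.SelectRows(Orders, each [ModifiedDate] >= RangeStart)
in
    IncrementalOrders
```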
We do this the way you are used to at Wortell Smart Learning: under the guidance of an inspiring trainer with practical experience, with high-quality material, and with room to bring your own practice cases. So expect plenty of hands-on work. Want to know more? Please contact us to discuss the possibilities!
Pre-requisites to attend the Data Transformation with Dataflow Gen2 training
To attend the Data Transformation with Dataflow Gen2 training, it is important to be well-versed in Power BI. This prior knowledge can be gained, for example, in the Power BI: reporting and data analysis training.
As a participant, it is useful if you have access to a Fabric environment in which you can develop. If you do not, please let us know in advance and we will provide an environment in which you can get to work.
Topics
We delve deep into both theory and practice, focusing on the following components:
- Place of data transformation with Dataflow Gen2 in Fabric
- Data sources
- Data transformations
  - Merge
  - Append
  - AI insights
  - Custom functions
  - Columns from examples
- Advanced Editor
  - Use of Lists, Records, and other data types
- Performance tips
- Diagnostic tools
- Data preview capabilities
- Query dependencies
- Staging queries
- Query groups
- Parameters
- Duplicates and references
- M, Power Query, and Dataflow Gen2
Study Material
In the Data Transformation with Dataflow Gen2 training, we use material that we have developed ourselves at Wortell Smart Learning, including many exercises and real-world cases. We make sure you receive all the necessary materials on time, and the completed end result of every exercise is made available to you.
Available dates
There are currently no scheduled dates available. Please contact us for options.