DP-600 Microsoft Fabric Analytics Engineer - Practice Test 3
Question 1 (EASY)
What is Microsoft Fabric?
Microsoft Fabric is a unified analytics platform launched in 2023 that brings together capabilities from Azure Synapse Analytics, Azure Data Factory, Power BI, and more into a single SaaS experience. It includes Lakehouse, Data Warehouse, Data Factory (pipelines/dataflows), Real-Time Intelligence, Data Science (notebooks), and Power BI - all sharing OneLake storage.
See more: Microsoft Fabric Overview
Question 2 (MEDIUM)
What is the difference between a Fabric Capacity (F-SKU) and Power BI Premium capacity (P-SKU)?
Fabric capacities (F-SKUs: F2, F4 ... F2048) are purchased via Azure, billed per second, and can be paused and resumed. They support all Fabric workloads including Lakehouse, Spark notebooks, KQL databases, and Warehouse. P-SKUs (P1-P5) are older monthly commitments primarily for Power BI. Microsoft is transitioning customers to F-SKUs. Both can host Power BI content, but only F-SKUs unlock the full Fabric experience.
See more: Microsoft Fabric Overview
Question 3 (EASY)
What is a Fabric Lakehouse?
A Lakehouse combines data lake flexibility with data warehouse query capability. The Files section stores raw data (CSV, JSON, Parquet) and the Tables section stores managed Delta Lake tables. An auto-generated SQL Analytics Endpoint allows querying the tables with T-SQL. A Lakehouse also auto-generates a semantic model for Power BI consumption via Direct Lake mode.
See more: Lakehouse & Data Warehouse
Question 4 (MEDIUM)
In Fabric's Dataflow Gen2, what is the "Output Destination"?
Dataflow Gen2 introduces Output Destinations - you can route the output of any query to a Fabric Lakehouse table, Fabric Data Warehouse table, Azure SQL Database, or Azure Data Explorer. This is a major improvement over Gen1, which only loaded data into a shared semantic model. Each query in a Dataflow Gen2 can have its own destination.
See more: Dataflows & Pipelines
Question 5 (EASY)
In a Fabric data pipeline, what does the "Copy Data" activity do?
The Copy Data activity in a Fabric pipeline ingests data from a source connector (SQL Server, REST API, ADLS, Blob, etc.) and writes it to a sink (Lakehouse Files/Tables, Data Warehouse, Azure SQL, etc.). It supports schema mapping, format conversion, and partitioned copying. It is the primary activity for bulk data ingestion.
See more: Dataflows & Pipelines
Question 6 (MEDIUM)
What is Fast Copy in Dataflow Gen2?
Fast Copy is a Dataflow Gen2 feature that detects when a query is simple enough (minimal transformations) to use the high-throughput Copy Data activity engine (based on Azure Data Factory) instead of the standard Mashup engine. This enables significantly faster data movement for ingestion scenarios. Fast Copy is triggered automatically or can be enabled manually via settings.
See more: Dataflows & Pipelines
Question 7 (EASY)
What language is used in Fabric notebooks for data engineering?
Fabric notebooks run on Apache Spark and support PySpark (the Python API for Spark), Scala, Spark SQL, and R. PySpark is the most common choice. Notebooks can read and write Lakehouse Delta tables, process large datasets in parallel, and be embedded in pipelines. They are used for advanced data transformation, ML feature engineering, and exploration.
See more: Dataflows & Pipelines
Question 8 (MEDIUM)
What is the purpose of a Fabric OneLake shortcut?
OneLake shortcuts create virtual paths that reference data stored elsewhere - in Azure Data Lake Storage Gen2, Amazon S3, Google Cloud Storage, or another Fabric Lakehouse - without physically copying data. The shortcut appears as a folder in the Lakehouse, allowing Spark, SQL, and Power BI to query the external data as if it were local. This enables the "data mesh" pattern in Fabric.
See more: Lakehouse & Data Warehouse
Question 9 (MEDIUM)
What is the difference between a Fabric Lakehouse and a Fabric Data Warehouse?
Both store data in OneLake as Delta/Parquet. The Lakehouse's SQL analytics endpoint is read-only (no DML); writes go through Spark notebooks, pipelines, or dataflows. The Fabric Data Warehouse supports full T-SQL DML (INSERT, UPDATE, DELETE, MERGE), stored procedures, views, and cross-warehouse queries, making it the natural choice for BI teams with a SQL development background.
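As a sketch of the DML that the Warehouse supports (and the Lakehouse SQL endpoint does not), here is an illustrative T-SQL MERGE upsert, shown as a Python string since that is the document's code language. The table and column names (dbo.DimCustomer, stg.Customer) are invented for this example.

```python
# Illustrative T-SQL you could run against a Fabric Warehouse, but NOT against
# a Lakehouse SQL analytics endpoint (which is read-only). All object names
# below are placeholders.
merge_sql = """
MERGE dbo.DimCustomer AS tgt
USING stg.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name, tgt.City = src.City
WHEN NOT MATCHED THEN
    INSERT (CustomerID, Name, City)
    VALUES (src.CustomerID, src.Name, src.City);
"""
print(merge_sql.strip())
```

Running the same statement against a Lakehouse SQL analytics endpoint would fail, since that endpoint only allows SELECT-style queries.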
See more: Lakehouse & Data Warehouse
Question 10 (EASY)
How do you ingest data into a Fabric Lakehouse using a local file upload?
In the Fabric Lakehouse UI, you can click "Upload" in the Files section to upload individual files or folders directly from your local machine via the browser. Files land in the Files section (raw zone). To create queryable Delta tables, you then either load them with a notebook (spark.read.csv(...).write.format("delta")...) or use the "Load to Tables" option.
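The notebook route mentioned above could look like the following PySpark sketch. It is meant to run inside a Fabric notebook (where a `spark` session is predefined and Delta support is built in); the file path and table name are placeholders.

```python
# Sketch for a Fabric notebook. The CSV path and table name are placeholders;
# in Fabric a `spark` session already exists, so getOrCreate() just reuses it.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the raw file previously uploaded to the Files section
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("Files/raw/sales.csv"))

# Write it as a managed Delta table so it appears under Tables
# and becomes queryable via the SQL analytics endpoint
df.write.format("delta").mode("overwrite").saveAsTable("sales")
```

For simple files, the "Load to Tables" UI option does the equivalent without code.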
See more: Lakehouse & Data Warehouse
Question 11 (MEDIUM)
In Dataflow Gen2, what does "saving as a template" allow you to do?
Dataflow Gen2 supports exporting its definition as a Power Query template file. The template can be imported into another workspace or tenant to recreate the same dataflow structure and transformations, enabling reuse of common ingestion patterns across projects without rebuilding them manually. Templates include the query definitions but not credentials, which must be re-entered.
See more: Dataflows & Pipelines
Question 12 (EASY)
What is a Fabric capacity admin responsible for?
A Fabric capacity admin can assign workspaces to the capacity, monitor the Capacity Metrics app (to track CU utilization), configure capacity settings (e.g., enable/disable workloads), pause and resume the capacity, and manage capacity contributors. They access this via the Microsoft Fabric Admin Portal.
See more: Microsoft Fabric Overview
Question 13 (MEDIUM)
In a Fabric pipeline, what is the purpose of the ForEach activity?
ForEach iterates over an array (passed by a Lookup activity, variable, or parameter) and executes a set of inner activities for each element. For example: look up a list of table names, then ForEach table name, run a Copy Data activity. Setting "isSequential" to false runs iterations in parallel (up to a concurrency limit), speeding up bulk ingestion patterns.
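The parallel ForEach pattern can be mimicked in plain Python; this is an analogy only, not pipeline code. `copy_table` stands in for the inner Copy Data activity, the worker count stands in for the batch/concurrency limit, and the table names are invented.

```python
# Python analogy of ForEach with isSequential = false:
# iterate a list of table names and "copy" each, in parallel up to a limit.
from concurrent.futures import ThreadPoolExecutor

def copy_table(name: str) -> str:
    # Placeholder for one Copy Data activity run
    return f"copied {name}"

tables = ["DimCustomer", "DimProduct", "FactSales"]

with ThreadPoolExecutor(max_workers=2) as pool:  # concurrency limit
    results = list(pool.map(copy_table, tables))  # map preserves input order

print(results)  # → ['copied DimCustomer', 'copied DimProduct', 'copied FactSales']
```

With isSequential set to true, the same loop would instead run one iteration at a time, like an ordinary `for` loop.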
See more: Dataflows & Pipelines
Question 14 (MEDIUM)
What is a Fabric workspace role "Contributor" allowed to do?
Fabric workspace roles (Admin > Member > Contributor > Viewer): Contributor can create, edit, and delete items they own, publish reports, and run pipelines. They cannot add/remove people from the workspace or change workspace settings. Member can do everything a Contributor can plus share content and manage subscriptions. Admin has full control.
See more: Security, Governance & Deployment
Question 15 (EASY)
In Power BI, what is a paginated report?
Paginated reports (.rdl) are designed for precise, pixel-perfect layouts - ideal for invoices, statements, and any scenario requiring exact page formatting. They are built in Power BI Report Builder. Unlike interactive Power BI reports, they are optimized for export (PDF, Excel, Word) and handle thousands of rows across many pages gracefully. They require Premium or Fabric capacity.
See more: Power BI Desktop & Service
Question 16 (MEDIUM)
What is OneLake File Explorer?
OneLake File Explorer is a Windows desktop application that synchronizes OneLake files to your local machine - similar to OneDrive. It mounts OneLake as a drive (e.g., OneLake - Contoso) in Windows Explorer, allowing you to drag-and-drop files, open them locally, and have changes sync back to OneLake. It uses the ADLS Gen2 REST API under the hood.
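Because OneLake exposes the same API surface as ADLS Gen2, its paths follow the ADLS Gen2 URL shape. A small sketch of the two common path forms; the workspace, lakehouse, and file names are invented placeholders.

```python
# OneLake paths follow the ADLS Gen2 conventions. Names below are placeholders.
workspace = "Contoso"
item = "Sales.Lakehouse"
file_path = "Files/raw/orders.csv"

# HTTPS (DFS) form, used by ADLS Gen2-compatible tools
dfs_url = f"https://onelake.dfs.fabric.microsoft.com/{workspace}/{item}/{file_path}"

# abfss form, used from Spark
abfss_url = f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/{item}/{file_path}"

print(dfs_url)
print(abfss_url)
```

Any tool that already speaks the ADLS Gen2 API (including OneLake File Explorer) can address OneLake through paths of this shape.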
See more: Microsoft Fabric Overview
Question 17 (MEDIUM)
What does "implementing workspace-level access controls" in Fabric primarily involve?
Workspace access control is managed through workspace roles. Users or Azure AD security groups are assigned roles: Admin (full control), Member (share content, manage subscriptions), Contributor (create/edit content), or Viewer (read-only). Best practice is to assign security groups rather than individual users to reduce maintenance overhead.
See more: Security, Governance & Deployment
Question 18 (EASY)
What does the M language (Power Query Formula Language) do?
M (Power Query Formula Language) is a functional, case-sensitive language used behind the scenes in Power Query. Every action you take in the Power Query Editor UI generates M code. You can view and edit it in the Advanced Editor. M is used in Power BI Desktop, Dataflow Gen2, Excel, and anywhere Power Query is available.
See more: Dataflows & Pipelines
Question 19 (MEDIUM)
In a Fabric data pipeline, what does the Lookup activity return?
The Lookup activity queries a data source and returns rows as an output object. The output can be referenced by downstream activities. A common pattern: Lookup reads a control table (list of filenames or table names), then ForEach iterates over the results, running a Copy Data activity for each entry. The "First row only" option returns just the first row.
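The Lookup output and the Lookup-then-ForEach hand-off can be illustrated with a plain Python dict; the activity name and rows here are invented, and the pipeline expression in the comment is shown only for orientation.

```python
# Illustrative shape of a Lookup activity's output (rows are invented).
# In the pipeline, ForEach's Items would reference the rows with an
# expression like: @activity('LookupTables').output.value
lookup_output = {
    "count": 2,
    "value": [
        {"TableName": "DimCustomer"},
        {"TableName": "FactSales"},
    ],
}

# What the ForEach loop effectively iterates over:
items = lookup_output["value"]
table_names = [row["TableName"] for row in items]
print(table_names)  # → ['DimCustomer', 'FactSales']
```

With "First row only" enabled, the output carries a single row object instead of the rows array.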
See more: Dataflows & Pipelines
Question 20 (MEDIUM)
What is Azure DevOps integration used for in Fabric workspaces?
Fabric supports Git integration with Azure DevOps (Azure Repos) and GitHub. You can connect a workspace to a branch, commit changes, pull updates, and manage conflicts. Item definitions are serialized as folders/files (e.g., .pbip for reports, .tmdl for semantic models). This enables branch-based development, code review, and CI/CD pipelines for Fabric content.
See more: Security, Governance & Deployment
Question 21 (MEDIUM)
What is a Fabric Power BI "semantic model" (formerly called "dataset")?
In November 2023, Microsoft renamed "datasets" to "semantic models" to better reflect their purpose. A semantic model is hosted on the Analysis Services (Tabular) engine and contains the data model (tables, columns, relationships, measures, hierarchies) that Power BI reports query. Multiple reports can connect to one semantic model, enabling a single source of truth.
See more: Semantic Models
Question 22 (EASY)
What is the purpose of the "Capacity Metrics" app in Microsoft Fabric?
The Microsoft Fabric Capacity Metrics app is an installable Power BI report that shows CU (Capacity Unit) consumption over time by workspace, item, and operation. Admins use it to identify throttling events (when demand exceeds capacity CUs), detect "runaway" operations, and plan capacity scaling. It is the primary tool for capacity governance.
See more: Microsoft Fabric Overview
Question 23 (MEDIUM)
What is the Fabric "Data activator" (Reflex) feature?
Data Activator (formerly Reflex) is a Fabric no-code alerting tool. You connect it to a Power BI visual or an Eventhouse real-time stream and define conditions (e.g., "alert when Sales drops below 1000"). When the condition is met, it triggers an action: Teams notification, email, or a Power Automate workflow. It enables business users to react to data changes automatically without writing code.
See more: Microsoft Fabric Overview
Question 24 (EASY)
What is a Fabric shared semantic model?
A shared semantic model (Live Connection) allows multiple Power BI reports to connect to a single published semantic model in the service. Report authors use "Power BI semantic models" as a data source in Desktop, creating a live connection. This ensures one source of truth - all reports use the same measures and relationships. The semantic model must be refreshed independently.
See more: Semantic Models
Question 25 (MEDIUM)
What is a composite semantic model in Power BI?
A composite model mixes storage modes: some tables use Import (cached in memory) while others use DirectQuery (queried live from source). Alternatively, you can connect to an existing published semantic model (via Live Connection) and add local Import tables, local measures, or relationships on top. This is useful for adding new dimensions to a certified organizational model without modifying the source.
See more: Semantic Models