Test Drive Scenario
Here is a sample scenario for testing out the AI Data Engineer in your environment.
There are a couple of steps to complete in Fabric before kicking off the scenario.
Step 1: Add a Lakehouse
i. Log in to Microsoft Fabric: go to app.fabric.microsoft.com.
ii. Create a Lakehouse folder and name it SalesOps

iii. Create a Destination Table
a. Below is a sample Lakehouse table schema for this scenario. Run the SQL script below in a notebook to create the table. Note that there are various methods for creating a schema; for help with creating a table, refer to the Microsoft Fabric documentation.
%%sql
CREATE TABLE Sales_Orders_Daily (
    Company STRING,
    Order_Date DATE,
    Order_No STRING,
    Warehouse_ID STRING,
    Part_Classification STRING,
    Part_Name STRING,
    Qty INT,
    Price STRING,
    Address STRING,
    City STRING,
    State STRING,
    ZipCode STRING,
    Phone STRING,
    Notes STRING
) USING DELTA;
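Before moving on, you can confirm the Delta table exists from the same notebook. This is a minimal sketch, assuming the default spark session that Fabric notebooks provide:

# Check that the destination table was created and inspect its schema.
print(spark.catalog.tableExists("Sales_Orders_Daily"))   # True once the table exists
spark.sql("DESCRIBE TABLE Sales_Orders_Daily").show(truncate=False)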
Step 2: Add Data
i. Upload your source data files to the SalesOps folder in your Lakehouse.
a. Two files are provided below: one sales order source file and one warehouse table used for lookups and joins.
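Once the files are uploaded, you can list and preview them from a notebook attached to the Lakehouse. This is a sketch for sanity-checking the upload; the file name used in the preview is a placeholder and should be changed to match the files you actually uploaded:

# mssparkutils is available by default in Fabric notebooks.
# List everything uploaded to the SalesOps folder.
for f in mssparkutils.fs.ls("Files/SalesOps"):
    print(f.name, f.size)

# Preview one file (placeholder name) to check headers and delimiters.
orders_preview = spark.read.option("header", "true").csv("Files/SalesOps/sample_orders.csv")
orders_preview.show(5, truncate=False)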
Step 3: Configure the AI Data Engineer
1. Copy the sample instructions below into the Manually Provide Instructions box.
2. Select Save Configuration.
# Destination Table or Tables:
Sales_Orders_Daily
# Source files:
All files in the "SalesOps" folder. You may need to join data from the sales order file and the warehouse file.
# Ingestion instructions:
If the Order Date field is blank, set the date to 01/01/1900.
Remove $ from the Price.
Extract or infer City, State, and Zip Code from the Address.
Phone numbers should be in the US (XXX) XXX-XXXX format. You can skip the country code.
Assign the Warehouse ID by joining on Part Description from Sample Orders and the Warehouse Part List.
Part Classification can only take one of these two values: Product or Service. Figure out how to map any values in the source to one of these two values.
3. Select Generate Notebook.
4. Select Go to Notebook to review the ready-to-run Python notebook based on your configuration instructions, source files, and destination schema.
5. Run the notebook to write the data to the destination table. A sketch of the kinds of transformations such a notebook might apply is shown below.
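For reference only, here is a minimal PySpark sketch of the transformations the ingestion instructions describe. The file names, the warehouse lookup column, and the classification mapping rule are assumptions for illustration; the notebook the AI Data Engineer generates may structure this very differently.

from pyspark.sql import functions as F

# Placeholder file names: adjust to the files uploaded to Files/SalesOps.
orders = spark.read.option("header", "true").csv("Files/SalesOps/sample_orders.csv")
warehouse = spark.read.option("header", "true").csv("Files/SalesOps/warehouse_part_list.csv")

transformed = (
    orders
    # Blank Order Date -> 01/01/1900.
    .withColumn(
        "Order_Date",
        F.coalesce(F.to_date("Order_Date", "MM/dd/yyyy"),
                   F.to_date(F.lit("01/01/1900"), "MM/dd/yyyy")),
    )
    # Remove $ from Price.
    .withColumn("Price", F.regexp_replace("Price", r"\$", ""))
    # Normalize Phone to (XXX) XXX-XXXX: keep digits, drop a leading US country code, reformat.
    .withColumn("Phone", F.regexp_replace("Phone", r"\D", ""))
    .withColumn("Phone", F.when(F.length("Phone") == 11, F.expr("substring(Phone, 2)"))
                          .otherwise(F.col("Phone")))
    .withColumn("Phone", F.concat(F.lit("("), F.substring("Phone", 1, 3), F.lit(") "),
                                  F.substring("Phone", 4, 3), F.lit("-"),
                                  F.substring("Phone", 7, 4)))
    # Map Part Classification to Product or Service (this mapping rule is an assumption).
    .withColumn("Part_Classification",
                F.when(F.lower("Part_Classification").contains("serv"), F.lit("Service"))
                 .otherwise(F.lit("Product")))
    # Assign Warehouse_ID by joining to the warehouse part list on the part description
    # (the join key column names are assumptions).
    .join(warehouse.select(F.col("Part_Description").alias("Part_Name"), "Warehouse_ID"),
          on="Part_Name", how="left")
)

# After aligning column order and casting types to match Sales_Orders_Daily, the generated
# notebook would append the result, for example:
# transformed.write.mode("append").saveAsTable("Sales_Orders_Daily")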