At FabCon and SQLCon Atlanta in March, Microsoft announced a new pipeline activity in Microsoft Fabric Data Factory: Invoke SSIS Package. It is now in public preview. In short, it runs an existing .dtsx file inside a Fabric pipeline, with the package stored in OneLake and Fabric handling the compute.
That's interesting by itself, but it also has fences around it that matter — particularly for on-prem shops — and a cost model worth understanding before anyone starts dropping packages into OneLake. For now, this post is just an overview, but I'll revisit after I've tested things.
A quick translation for SSIS readers
Fabric uses its own vocabulary, and some of it overlaps with terms we already use in SSIS. This is a short glossary I find useful:
| Fabric term | Closest SSIS equivalent |
|---|---|
| Pipeline | A Control Flow. A container holding steps with precedence between them. |
| Activity | A task. A single step inside the pipeline (e.g., Execute SQL Task, File System Task). |
| Invoke SSIS Package activity | An Execute Package Task. One step in the pipeline that runs a .dtsx file. |
| Pipeline canvas | The Control Flow design surface in SSDT — the visual area where you drop steps and draw arrows between them. |
| OneLake | Fabric's package store. Replaces the SSIS catalog or msdb as the place packages live. |
| Lakehouse | A storage object that lives inside OneLake. Roughly, a Lakehouse is to OneLake what a database is to a SQL Server instance — packages and files live in a Lakehouse, which lives in OneLake. |
| Workspace | A logical container used to create, organize, and manage SQL databases, warehouses, reports, and data pipelines together. |
So 'add the Invoke SSIS Package activity to a Fabric pipeline' is, in SSIS terms, 'add an Execute Package Task to a Control Flow.' What's different is that the package lives in OneLake now, instead of the SSISDB catalog or msdb, and that it is run by Fabric compute rather than the SSIS runtime on a SQL Server host.
What the Invoke SSIS Package activity actually is
Per Microsoft Learn, the Invoke SSIS Package activity is a pipeline activity (a step in a Fabric pipeline) inside Data Factory for Microsoft Fabric. You upload your .dtsx file (and optional .dtsConfig) to OneLake, add the activity to a pipeline, point it at the package and run it. Fabric provides the compute. This means that Microsoft Fabric acts as the infrastructure, engine, and host for running your ETL tasks.
The package store is OneLake, not the SSIS catalog and not msdb. Runtime overrides for connection managers and package properties are supplied through dedicated tabs on the activity. When logging is enabled, logs are written back to OneLake. There is no Integration Runtime to provision — Fabric handles the runtime on its side.
When logging is enabled in the activity settings, detailed execution logs are automatically written back to OneLake. In my book, you always want the logging on, but I don't know yet if you can customize the logged events in Fabric as you can in classic SSIS.
How Fabric SSIS logging works
- Package Setup: You upload your .dtsx package and optional .dtsConfig files to a Lakehouse within OneLake.
- Activity Configuration: You add the Invoke SSIS Package activity to a pipeline and, in the Settings tab, enable the Enable logging option.
- Execution & Logging: When the pipeline runs, Fabric executes the package and captures log data.
- Log Destination: The logs are written back to a specific folder in your OneLake Lakehouse.
- Monitoring: The output of the pipeline activity provides the exact path (logLocation) to these log files in OneLake.
The setup workflow, in six steps
Microsoft's docs lay it out as six steps. Lightly paraphrased:
| Step | What you do |
|---|---|
| 1 | Move .dtsx (and optional .dtsConfig) files into OneLake — drag and drop via OneLake file explorer, or upload through the Fabric portal. |
| 2 | Open or create a pipeline (the Control Flow equivalent). Add the Invoke SSIS Package activity from the Activities pane. |
| 3 | On the Settings tab, point Package path at the .dtsx, optionally point Configuration path at a .dtsConfig, and tick Enable logging if you want logs in OneLake. |
| 4 | Set runtime values on the Connection Managers and Property Overrides tabs. Connection Managers takes a Scope, Name, Property, and Value per override. Property Overrides takes a property path and a value — for example, \Package.Variables[User::<variable name>].Value. |
| 5 | Save, and run the pipeline immediately or schedule it. |
| 6 | Monitor in the pipeline Output tab or the workspace Monitor hub (Fabric's equivalent of looking at job history in SSMS). If logging was enabled, the activity output points you to a logging path on OneLake. |
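The property path in step 4 is easy to mistype by hand. A small helper (my own illustration, not anything Fabric ships) can build the documented \Package.Variables[User::<variable name>].Value form from a variable name:

```python
# Illustrative helper, not part of any Fabric API: builds the Property
# Overrides path for a package-scoped user variable in the documented form.
def variable_override_path(variable_name: str) -> str:
    return f"\\Package.Variables[User::{variable_name}].Value"

# One row on the Property Overrides tab: property path -> runtime value.
overrides = {variable_override_path("SourceFolder"): "Files/landing"}
print(overrides)
```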
What stands out about the workflow is what's missing from it. No Visual Studio deployment. No SSIS catalog setup. No SQL Agent job. No proxy account. Six clicks in a browser, start to finish. One gotcha worth flagging from step 4: if your package uses the DontSaveSensitive protection level, the Connection Managers and Property Overrides tabs are not optional — they're where credentials have to live, since the package itself cannot carry them anymore.
The four preview limitations
Straight from the Limitations section of Microsoft Learn:
| Limitation | What it means in practice |
|---|---|
| OneLake only | Only packages stored in OneLake are supported. Not the SSIS catalog. Not msdb. Not a local or network file system path. |
| No on-premises sources or destinations | The activity can't connect to on-premises systems. If your package reads from on-prem SQL Server or writes to a UNC share, this preview is not for you yet. |
| No private-network endpoints | VNet-injected resources, private endpoints — not supported. |
| No custom or third-party components | Packages that depend on custom or third-party components aren't supported. |
That fourth one deserves its own attention, because it is the easiest to overlook. A lot of real-world SSIS packages depend on third-party connectors — Salesforce, ServiceNow, SAP, the assortment of CData and KingswaySoft adapters that show up in many shops that I've supported. None of those will run here right now.
The first three together describe a Fabric-to-Fabric workload: packages that already live in the cloud, talking to purely cloud-resident systems and nothing on-prem. That is what the preview is built for today.
For a sizable share of today's SSIS shops — anyone whose packages still touch a file share, a domain account, or an on-prem SQL Server — that is a fence with their packages on the wrong side.
Worth noting that the inverse path exists today: traditional SSIS packages running on-prem or in Azure-SSIS IR can target Fabric services as destinations. Microsoft has published tutorials for connecting SSIS packages to Fabric SQL Database and to Fabric Data Warehouse, both of which require Microsoft Entra ID-based authentication. So while you can't yet run SSIS inside Fabric for on-prem workloads, you can write into Fabric from SSIS where you already run it.
What it costs
The pricing model for the activity is in Microsoft Learn. Quick note for SSIS readers: Fabric bills in Capacity Units (CU), which are the abstract compute currency you reserve when you buy a Fabric capacity SKU. CU hours roll up against that reservation. With that in mind, the headline meter looks like this:
| Operation | Consumption Meter | CU consumption rate |
|---|---|---|
| SSIS uptime | SSIS in Fabric | 1.5 CU hours per vCore |
The mechanics, again per the docs:
Each workspace is allocated 4 vCores for SSIS runtime execution. During preview that allocation is fixed and cannot be modified.
Uptime starts when the first Invoke SSIS Package activity in the workspace begins running, and continues as long as at least one activity is in progress. After the last one completes, the SSIS runtime stays warm for a fixed Time-To-Live (TTL) of 30 minutes. If a new activity arrives in that window, it benefits from no cold start. If not, the runtime shuts down and billing stops.
The TTL is fixed at 30 minutes during preview and cannot be configured.
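The uptime mechanics above reduce to interval arithmetic: runs whose gaps fall inside the 30-minute TTL share one warm window, and the meter charges 1.5 CU hours per vCore for each hour of uptime across the fixed 4 vCores. Here is a back-of-the-envelope estimator, my own sketch, assuming the warm TTL window itself is billed (which is how I read the "billing stops" wording):

```python
VCORES = 4      # fixed workspace allocation during preview
RATE = 1.5      # CU hours per vCore, per hour of SSIS uptime (assumed reading)
TTL_MIN = 30.0  # runtime stays warm this long after the last activity

def billed_cu_hours(runs: list[tuple[float, float]]) -> float:
    """Estimate CU hours for SSIS uptime.

    runs: (start, end) pairs in minutes, sorted by start. Uptime extends
    TTL_MIN past each run; overlapping warm windows merge so the runtime
    (and the meter) never double-counts.
    """
    if not runs:
        return 0.0
    uptime = 0.0
    cur_start, cur_end = runs[0][0], runs[0][1] + TTL_MIN
    for start, end in runs[1:]:
        if start <= cur_end:                  # lands inside the warm window
            cur_end = max(cur_end, end + TTL_MIN)
        else:                                 # runtime shut down, cold start
            uptime += cur_end - cur_start
            cur_start, cur_end = start, end + TTL_MIN
    uptime += cur_end - cur_start
    return (uptime / 60.0) * VCORES * RATE

# Two packages 25 min apart share one warm window; a third an hour later doesn't.
print(billed_cu_hours([(0, 20), (30, 45), (120, 130)]))  # -> 11.5 CU hours
```

The punchline of the TTL for sizing purposes: clustering package runs together is cheaper than spreading them out, because every cold start buys you another 30 minutes of billed warm time at the tail.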
Underneath the headline meter, Microsoft's docs add a note worth quoting plainly:
In addition to the SSIS uptime meter, pipeline orchestration runs and OneLake storage/transactions are charged under their respective meters. For details, see Data Factory pricing for Microsoft Fabric and OneLake consumption.
Translation, for the invoice: the SSIS uptime meter is the visible one, but it is not the only one. Pipeline orchestration runs are billed under Data Factory pricing for Microsoft Fabric, and the storage and transactions on the OneLake side roll into OneLake consumption. If you are sizing for this, you are sizing three things, not one.
What this is, and what it isn't, in one paragraph each
What it is. A preview-stage Fabric Data Factory pipeline activity that executes existing .dtsx packages stored in OneLake, with Fabric providing the compute and OneLake holding the logs.
What it isn't. A solution for on-premises SSIS workloads. The combination of OneLake-only storage, no on-premises connectivity, no private endpoints, and no custom or third-party components draws a clear fence around what the preview supports today. Everything inside the fence is cloud-resident and stock-component. Everything outside it is unsupported in this preview.
What I'm watching for
The preview is gated, and I'm not standing up a Fabric tenant right now to lab it. So, this is an honest research-based overview rather than a hands-on walkthrough. The primary things that I'll be watching (and waiting) for:
On-premises connectivity. The single biggest gap for the on-prem shops I work with. Until that fence comes down, this is a Fabric-to-Fabric story.
Custom and third-party component support. The fourth limitation is the easiest to overlook and the most likely to silently break a real package. Salesforce, ServiceNow, SAP adapters, CData, KingswaySoft — none of those run today.
Pricing clarity. Three meters in play during preview, with a fixed 4-vCore allocation and a fixed 30-minute TTL. When either of those becomes configurable, the cost conversation gets a lot more interesting.
GA timing. Preview features change before general availability. The limitations list and pricing model are both flagged as preview-specific. GA is when the real evaluation starts.
Where this lands depends on where your packages live. If everything is already cloud-resident and built on stock components, the Invoke SSIS Package activity removes real friction. If your packages still touch on-prem in any way, this preview does nothing for you. For the SSIS shops I support, that is not a footnote — it is the whole story. Read the limitations list carefully before assuming your packages will run. I am watching this one closely and will post again soon.
More to Read
Microsoft Learn: Use the Invoke SSIS Package activity in a pipeline (Preview)
Microsoft Learn: Data Factory pricing for Microsoft Fabric
Microsoft Learn: OneLake consumption
Andy Leonard: Microsoft Just Made SSIS-to-Fabric Easier
sqlfingers inc: SSIS Is Not Dead. Yet.