
DCS for Azure Dataflows

Dataflows are visual data transformation and processing capabilities in Azure Data Factory. They enable data to be transformed and processed at scale without writing code.

The table below lists each Dataflow and the purpose it serves in supporting masking and data discovery. Click on a name to navigate to its documentation.

| Dataflow name | Purpose |
| --- | --- |
| Copy Dataflow | Copies unmasked data from a source location to a sink location. Usage is configured via pipeline parameters. |
| Profiling Dataflow | Profiles data in a dataset to discover sensitive data. Leverages ADF's sampling on the source, shuffles the data, and consumes a subset of the sample. Results are stored back in the metadata store. |
| Filtered Masking Dataflow | Consumes unmasked data from a source, applies a data filter, masks the data, and prepares it to be written to the sink. |
| Filtered Masking Parameters Dataflow | Reads the Azure SQL metadata store to compute the parameters needed to execute the masking dataflow. Used when conditional algorithms are to be applied to a table. |
| Unfiltered Masking Dataflow | Consumes unmasked data from a source, masks the data, and prepares it to be written to the sink. |
| Unfiltered Masking Parameters Dataflow | Reads the Azure SQL metadata store to compute the parameters needed to execute the masking dataflow. |
| Filter Test Utility Dataflow | Helps write ADF-approved filter conditions and validates that the returned data is as expected. |
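The filtered and unfiltered masking dataflows differ only in whether a filter condition limits which rows get masked. A minimal Python sketch of that pattern, for illustration only: the `mask_value` function, the predicate, and the column names are assumptions, not part of DCS or ADF.

```python
def mask_value(value: str) -> str:
    """Toy masking function: replaces every character with '*'.
    A real deployment would apply a DCS masking algorithm instead."""
    return "*" * len(value)

def filtered_mask(rows, predicate, column):
    """Mask `column` only in rows that satisfy `predicate`.
    With `predicate=lambda r: True`, this degenerates to the
    unfiltered case: every row is masked."""
    out = []
    for row in rows:
        row = dict(row)  # copy so the source rows stay unmasked
        if predicate(row):
            row[column] = mask_value(row[column])
        out.append(row)
    return out

rows = [
    {"region": "EU", "email": "a@example.com"},
    {"region": "US", "email": "b@example.com"},
]
# Only EU rows match the filter, so only their emails are masked.
masked = filtered_mask(rows, lambda r: r["region"] == "EU", "email")
```

After this runs, the EU row's email is replaced with asterisks while the US row passes through unchanged, which mirrors how a filter condition scopes masking to a subset of the data.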
