DCS for Azure Dataflows
Dataflows are visual data transformation and processing capabilities in Azure Data Factory. They enable us to transform and process data at scale without writing code.
The table below lists the dataflows and the purpose each serves in supporting masking and data discovery. Click a dataflow name to navigate to its documentation.
| Dataflow name | Purpose |
| --- | --- |
| | Copies unmasked data from a source location to a sink location. Usage is configured via pipeline parameters. |
| | Profiles data in a dataset to discover sensitive data. Leverages ADF's sampling on the source, shuffles the data, and consumes a subset of the sample. The results are stored back in the metadata store. |
| | Consumes unmasked data from a source, applies a data filter, masks the data, and prepares it to be written to the sink. |
| | Computes parameters by reading the Azure SQL metadata store to execute the masking dataflow. Used when conditional algorithms are to be applied to a table. |
| | Consumes unmasked data from a source, masks the data, and prepares it to be written to the sink. |
| | Computes parameters by reading the Azure SQL metadata store to execute the masking dataflow. |
| | A utility for writing ADF-approved filter conditions and validating that the returned data is as expected. |
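The profiling dataflow's sample-shuffle-subset approach can be sketched in plain Python. This is an illustrative sketch only, not DCS or ADF code: the function name, its parameters, and the row list are all assumed for the example, and the slicing stands in for ADF's source-side sampling.

```python
import random

def profile_sample(rows, sample_size, subset_size, seed=0):
    """Illustrative sketch of the sample-shuffle-subset profiling step.

    All names here are assumptions for the example, not part of DCS.
    """
    sample = list(rows[:sample_size])  # stand-in for ADF's sampling on the source
    rng = random.Random(seed)
    rng.shuffle(sample)                # shuffle to avoid positional bias in the sample
    return sample[:subset_size]        # profile only a subset of the shuffled sample
```

In the actual dataflow, the profiling results computed over this subset would then be written back to the metadata store.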