Data Integrity Suite

General

General steps perform broad, essential transformations that apply to a wide range of use cases. These include copying or renaming fields, filtering records, applying business rules or formulas, generating unique keys, or executing custom code for advanced scenarios.

Copy Field: Copies a column to one or more new columns.
Filter Field: Removes selected columns from a table.
Filter Row: Includes or excludes each row based on a logical expression that evaluates to true or false.
Evaluate Rule: Evaluates each record against an existing Data Catalog rule in Data Quality.
Execute Formula: Performs operations on column values and writes the result to a new column.
Generate Key: Generates a unique key for each record, ensuring uniqueness across all datasets.
Custom Coding: Runs custom code for advanced use cases or requirements that the existing Data Quality pipeline steps cannot fulfill.
LLM Transform: Applies AI-driven modifications to pipeline data, such as categorization, translation, and structured data extraction.
Make API Call: Connects external APIs to your data pipelines by configuring sources, endpoints, and parameter mappings for data exchange.
Rename Field: Renames a column in a pipeline.
Split Field: Splits a column into two new columns.
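To make the behavior of these steps concrete, the sketch below models a few of them (Split Field, Filter Row, Generate Key) as row-wise operations over simple dict-based records. This is an illustration only, assuming hypothetical function and field names; it does not reflect the Data Integrity Suite's actual interface.

```python
# Conceptual sketch of several General transform steps over dict records.
# All names here are hypothetical, not the suite's real API.
import uuid

def copy_field(record, source, *targets):
    """Copy Field: copy one column's value into one or more new columns."""
    for target in targets:
        record[target] = record[source]
    return record

def filter_rows(records, predicate):
    """Filter Row: keep only rows whose logical expression is true."""
    return [r for r in records if predicate(r)]

def split_field(record, source, sep, left, right):
    """Split Field: split a column into two new columns at a separator."""
    first, _, second = record[source].partition(sep)
    record[left], record[right] = first, second
    return record

def generate_key(record, key_field="record_key"):
    """Generate Key: attach a unique key to each record."""
    record[key_field] = uuid.uuid4().hex
    return record

# A tiny two-row "pipeline": split, filter, then key the surviving rows.
rows = [
    {"name": "Ada Lovelace", "score": 91},
    {"name": "Alan Turing", "score": 58},
]
rows = [split_field(r, "name", " ", "first", "last") for r in rows]
rows = filter_rows(rows, lambda r: r["score"] >= 60)
rows = [generate_key(r) for r in rows]
```

In a real pipeline these steps are configured through the product UI rather than written as code; the sketch only shows the record-level effect each step has.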