Data Flow Automation Examples - AI Assist and SCI
A combination of data services and pipelines is used to satisfy individual data flow requirements. The worked examples below show how the data service and pipeline are customized to satisfy different needs.
Scope 3 spend-based with AI Assist data load
This functionality requires a Data Service of type NLP Service (Data ETL) and a Data Pipeline to identify and route incoming data accordingly.
Create Data Service
Navigate to Admin → Configuration → Data Flow Automation
On the Data Services page, click Add New Service
Select Category = Transformation
Select Type = NLP service (Data ETL)
Name = “<organization name> NLP Data Service”
Click Save
Create Pipeline
Navigate to Data Pipelines, click Add New Pipeline
Name = “<organization name> NLP Pipeline”
Target System = Account
Filename Pattern = (?i)^Account Setup and Data Load - AI Assist.*\.xls(x|)(?-i)
Data Source = select only the NLP Service created in the previous step; deselect all other data sources
Data Transformer = None (pass-through)
Click Save
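The Filename Pattern above uses inline regex flags ((?i) … (?-i)) to make the match case-insensitive. As a minimal sketch of what that pattern accepts, the check below reproduces it in Python; Python's re module does not accept a bare (?-i) toggle, so the sketch applies re.IGNORECASE instead and keeps the core pattern unchanged. The sample filenames are hypothetical.

```python
import re

# Equivalent of (?i)^Account Setup and Data Load - AI Assist.*\.xls(x|)(?-i),
# with case-insensitivity supplied via a compile flag instead of inline toggles.
AI_ASSIST_PATTERN = re.compile(
    r"^Account Setup and Data Load - AI Assist.*\.xls(x|)", re.IGNORECASE
)

def matches_ai_assist(filename: str) -> bool:
    """Return True if the filename would match the AI Assist pipeline pattern."""
    return AI_ASSIST_PATTERN.search(filename) is not None

# Hypothetical example filenames:
print(matches_ai_assist("Account Setup and Data Load - AI Assist v2.xlsx"))  # True
print(matches_ai_assist("account setup and data load - ai assist.xls"))      # True
print(matches_ai_assist("Monthly Invoices.xlsx"))                            # False
```

Note the leading ^ anchor: the filename must begin with the "Account Setup and Data Load - AI Assist" prefix, and the \.xls(x|) tail accepts either .xls or .xlsx.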
Supply Chain Intelligence (SCI)
This functionality requires a Data Service of type Amazon S3 (Cloud Storage) and a Data Pipeline to identify and route incoming data accordingly.
Create Data Service
Navigate to Admin → Configuration → Data Flow Automation
On the Data Services page, click Add New Service
Select Category = File Loading
Select Type = Amazon S3 (Cloud Storage)
Name = “<organization name> SCI S3”
Click Save
Create Pipeline
Navigate to Data Pipelines, click Add New Pipeline
Name = “<organization name> SCI Pipeline”
Target System = Account
Filename Pattern = (?i)Account_Setup_and_Data_Load_SCIS.*(?-i)
Data Source = select only the Amazon S3 Service created in the previous step; deselect all other data sources
Data Transformer = None (pass-through)
Click Save
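Unlike the AI Assist pattern, the SCI Filename Pattern has no ^ anchor, so the Account_Setup_and_Data_Load_SCIS token can appear anywhere in the filename. As a minimal sketch under the same assumption as before (case-insensitivity via re.IGNORECASE rather than the inline (?i) … (?-i) toggles, which Python's re module does not accept in bare form), with hypothetical sample filenames:

```python
import re

# Equivalent of (?i)Account_Setup_and_Data_Load_SCIS.*(?-i); unanchored, so the
# token may occur anywhere in the filename.
SCI_PATTERN = re.compile(r"Account_Setup_and_Data_Load_SCIS.*", re.IGNORECASE)

def matches_sci(filename: str) -> bool:
    """Return True if the filename would match the SCI pipeline pattern."""
    return SCI_PATTERN.search(filename) is not None

# Hypothetical example filenames:
print(matches_sci("Account_Setup_and_Data_Load_SCIS_2024.xlsx"))       # True
print(matches_sci("2024_account_setup_and_data_load_scis.csv"))        # True
print(matches_sci("Supplier_Data.xlsx"))                               # False
```

Because the pattern is unanchored and ends in .*, any file extension is accepted as long as the token is present.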