Data Processing Pipeline
ETL-style data transformation workflow
Content
```yaml
name: Data Processing Pipeline
version: "1.0"
description: Transform and analyze data with AI

pipeline:
  - stage: extract
    name: Data Extraction
    prompt: |
      Extract structured data from this input:
      {{raw_data}}

      Output as JSON with fields: {{fields}}
    output: extracted_data

  - stage: transform
    name: Data Transformation
    prompt: |
      Transform this data according to rules:
      Data: {{extracted_data}}
      Rules: {{transformation_rules}}
    output: transformed_data
    depends_on: [extract]

  - stage: validate
    name: Data Validation
    prompt: |
      Validate this data against schema:
      Data: {{transformed_data}}
      Schema: {{validation_schema}}
      Report any errors or inconsistencies.
    output: validation_report
    depends_on: [transform]

  - stage: analyze
    name: Data Analysis
    prompt: |
      Analyze this validated data:
      {{transformed_data}}

      Provide: summary statistics, patterns, anomalies
    output: analysis_report
    depends_on: [validate]

variables:
  raw_data:
    type: string
    required: true
  fields:
    type: array
    default: [id, name, value, timestamp]
  transformation_rules:
    type: string
    default: "normalize dates, clean text, convert currencies"
  validation_schema:
    type: string
    default: "standard"
```
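The YAML above only declares stages and their data flow; it does not say how a runner should execute them. The sketch below is one minimal interpretation in Python, assuming the schema shown here: it resolves variable defaults, renders the `{{placeholder}}` templates, and runs each stage once all of its `depends_on` entries have completed. The `call_model` function is a hypothetical stand-in for whatever LLM client you actually use; it is not part of the template.

```python
import re


def call_model(prompt: str) -> str:
    """Hypothetical LLM call; swap in your provider's client here."""
    return f"<model output for: {prompt[:40]}...>"


def render(template: str, context: dict) -> str:
    """Fill {{name}} placeholders from context; unknown names are left as-is."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(context.get(m.group(1), m.group(0))),
        template,
    )


def run_pipeline(spec: dict, inputs: dict) -> dict:
    """Run every stage in dependency order; returns the full variable context."""
    # Seed the context with declared defaults, then caller-supplied inputs.
    context = {}
    for name, var in spec.get("variables", {}).items():
        if "default" in var:
            context[name] = var["default"]
        elif var.get("required") and name not in inputs:
            raise ValueError(f"missing required variable: {name}")
    context.update(inputs)

    # Repeated-pass topological walk over the stage graph.
    stages = {s["stage"]: s for s in spec["pipeline"]}
    done = set()
    while len(done) < len(stages):
        progressed = False
        for sid, stage in stages.items():
            if sid in done or not all(d in done for d in stage.get("depends_on", [])):
                continue
            # Each stage's model output is stored under its `output` name,
            # which makes it available to downstream prompts.
            context[stage["output"]] = call_model(render(stage["prompt"], context))
            done.add(sid)
            progressed = True
        if not progressed:
            raise ValueError("dependency cycle in pipeline")
    return context
```

Loading and running the template (the filename `pipeline.yaml` is an assumption for the example):

```python
import yaml  # PyYAML

with open("pipeline.yaml") as f:
    spec = yaml.safe_load(f)

results = run_pipeline(spec, {"raw_data": "id,name,value,timestamp\n1,foo,3.5,2024-01-01"})
print(results["analysis_report"])
```

The repeated-pass walk is quadratic in the worst case but keeps the sketch short; a real runner would build a proper topological order up front and could execute independent stages concurrently.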