Data Processing Pipeline

An ETL-style workflow that extracts, transforms, validates, and analyzes data.

Content

name: Data Processing Pipeline
version: "1.0"
description: Transform and analyze data with AI

# Stages run in the order of their depends_on chain; each stage's output
# becomes a {{variable}} available to later prompts.
pipeline:
  - stage: extract
    name: Data Extraction
    prompt: |
      Extract structured data from this input:
      {{raw_data}}

      Output as JSON with fields: {{fields}}
    output: extracted_data

  - stage: transform
    name: Data Transformation
    prompt: |
      Transform this data according to rules:
      Data: {{extracted_data}}
      Rules: {{transformation_rules}}
    output: transformed_data
    depends_on: [extract]

  - stage: validate
    name: Data Validation
    prompt: |
      Validate this data against schema:
      Data: {{transformed_data}}
      Schema: {{validation_schema}}
      Report any errors or inconsistencies.
    output: validation_report
    depends_on: [transform]

  - stage: analyze
    name: Data Analysis
    prompt: |
      Analyze this validated data:
      {{transformed_data}}

      Provide: summary statistics, patterns, anomalies
    output: analysis_report
    depends_on: [validate]

# Inputs: raw_data must be supplied; the rest fall back to their defaults.
variables:
  raw_data:
    type: string
    required: true
  fields:
    type: array
    default: [id, name, value, timestamp]
  transformation_rules:
    type: string
    default: "normalize dates, clean text, convert currencies"
  validation_schema:
    type: string
    default: "standard"
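
To make the stage chaining concrete, here is a minimal runner sketch in Python. It is an illustration only, not the hosting platform's actual engine: call_model() is a hypothetical stand-in for whatever AI backend executes the prompts, and {{name}} placeholders are resolved by plain string substitution. The stage list mirrors the YAML above.

import re

# Stage definitions mirroring the YAML template above.
STAGES = [
    {"stage": "extract",
     "prompt": "Extract structured data from this input:\n{{raw_data}}\n\n"
               "Output as JSON with fields: {{fields}}",
     "output": "extracted_data", "depends_on": []},
    {"stage": "transform",
     "prompt": "Transform this data according to rules:\nData: {{extracted_data}}\n"
               "Rules: {{transformation_rules}}",
     "output": "transformed_data", "depends_on": ["extract"]},
    {"stage": "validate",
     "prompt": "Validate this data against schema:\nData: {{transformed_data}}\n"
               "Schema: {{validation_schema}}\nReport any errors or inconsistencies.",
     "output": "validation_report", "depends_on": ["transform"]},
    {"stage": "analyze",
     "prompt": "Analyze this validated data:\n{{transformed_data}}\n\n"
               "Provide: summary statistics, patterns, anomalies",
     "output": "analysis_report", "depends_on": ["validate"]},
]

def call_model(prompt: str) -> str:
    """Hypothetical stand-in for the platform's model call."""
    raise NotImplementedError("wire this to a real AI backend")

def render(template: str, context: dict) -> str:
    """Replace each {{name}} placeholder with its value from context."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(context[m.group(1)]), template)

def run_pipeline(variables: dict) -> dict:
    """Run stages in declaration order, checking that every depends_on
    entry has already produced its output."""
    context = dict(variables)
    done = set()
    for stage in STAGES:
        missing = [d for d in stage["depends_on"] if d not in done]
        if missing:
            raise RuntimeError(f"{stage['stage']} ran before {missing}")
        context[stage["output"]] = call_model(render(stage["prompt"], context))
        done.add(stage["stage"])
    return context

Because each stage only references outputs of earlier stages, running them in declaration order satisfies the dependency chain; a general engine would topologically sort instead.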
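
As a usage sketch, assuming the runner above with call_model wired to a real backend (the raw_data string here is invented for illustration): raw_data is the only required input, and the other variables take the defaults declared in the variables section.

# Defaults copied from the variables section of the template.
defaults = {
    "fields": ["id", "name", "value", "timestamp"],
    "transformation_rules": "normalize dates, clean text, convert currencies",
    "validation_schema": "standard",
}
# raw_data is required; everything else falls back to a default.
results = run_pipeline({**defaults, "raw_data": "order 123, $19.99, 2024-01-05"})
print(results["validation_report"])
print(results["analysis_report"])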