# Data Processing Example
This example shows how Epochly optimizes batch data processing.
## Scenario
Process a large CSV file, applying a CPU-intensive transformation to each record.
## Code
```python
import epochly

@epochly.optimize(level=3)
def process_records(records):
    results = []
    for record in records:
        # CPU-intensive transformation
        transformed = {
            'id': record['id'],
            'value': complex_calculation(record['data']),
            'score': compute_score(record)
        }
        results.append(transformed)
    return results

# Load data
with open('large_file.csv') as f:
    records = load_csv(f)

# Process with automatic parallelization
results = process_records(records)
```
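For readers without Epochly installed, the same pipeline can be sketched in plain Python. Here `complex_calculation`, `compute_score`, and `load_csv` are hypothetical stub implementations standing in for the real CPU-bound helpers, which the original does not define:

```python
import csv
import io

def complex_calculation(data):
    # Stub for a CPU-intensive transformation (hypothetical).
    return sum(ord(c) for c in data)

def compute_score(record):
    # Stub scoring function (hypothetical).
    return len(record['data']) * 0.5

def load_csv(f):
    # Read CSV rows into a list of dicts keyed by header.
    return list(csv.DictReader(f))

def process_records(records):
    # Same loop as above, minus the @epochly.optimize decorator.
    results = []
    for record in records:
        results.append({
            'id': record['id'],
            'value': complex_calculation(record['data']),
            'score': compute_score(record),
        })
    return results

# Small in-memory sample instead of large_file.csv
sample = io.StringIO("id,data\n1,abc\n2,defg\n")
records = load_csv(sample)
results = process_records(records)
print(results[0])  # → {'id': '1', 'value': 294, 'score': 1.5}
```

This serial version is what the "Without Epochly" column below measures; the decorated version parallelizes the same loop.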
## Performance
| Records | Without Epochly | With Epochly | Speedup |
|---|---|---|---|
| 10,000 | 12 s | 1.5 s | 8x |
| 100,000 | 120 s | 8 s | 15x |
| 1,000,000 | 1,200 s | 45 s | 26x |
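The speedup column is simply the ratio of the two wall-clock times. A minimal timing harness for reproducing such measurements, using only the standard library (the `benchmark` helper is an illustrative addition, not part of Epochly):

```python
import time

def benchmark(fn, *args):
    # Wall-clock seconds for a single call to fn.
    start = time.perf_counter()
    fn(*args)
    return time.perf_counter() - start

def speedup(baseline_s, optimized_s):
    # Ratio reported in the table above.
    return baseline_s / optimized_s

# First table row: 12 s baseline vs 1.5 s optimized.
print(speedup(12.0, 1.5))  # → 8.0
```

For stable numbers, run each variant several times and compare the best (minimum) timings, since the first run may include warm-up costs.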