Data Processing Example

This example shows how to optimize batch data processing with epochly.

Scenario

Process a large CSV file, applying a CPU-intensive transformation to each record.

Code

import epochly

@epochly.optimize(level=3)
def process_records(records):
    results = []
    for record in records:
        # CPU-intensive transformation
        transformed = {
            'id': record['id'],
            'value': complex_calculation(record['data']),
            'score': compute_score(record)
        }
        results.append(transformed)
    return results

# Load data
with open('large_file.csv') as f:
    records = load_csv(f)

# Process with automatic parallelization
results = process_records(records)

Performance

Records     Without Epochly   With Epochly   Speedup
10,000      12s               1.5s           8x
100,000     120s              8s             15x
1,000,000   20min             45s            26x
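Timings like these depend heavily on hardware and data shape, so they are best verified locally. A small timing harness suffices; `benchmark` here is a hypothetical helper for illustration, not part of epochly:

```python
import time

def benchmark(func, data, repeats=3):
    # Time several runs and keep the fastest; the minimum filters out
    # transient noise from other processes on the machine.
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        func(data)
        timings.append(time.perf_counter() - start)
    return min(timings)
```

Dividing `benchmark(undecorated_process_records, records)` by `benchmark(process_records, records)` gives the speedup ratio reported in the table.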