Iteration

Iteration components enable looping and repetitive processing within BroxiAI workflows, such as running the same operation over every item in a collection.

Loop

The Loop component enables iterative processing over collections of data, performing operations on each item in sequence.

This flow creates a summarizing "for each" loop with the Loop component.

The component iterates over a list of Data objects until the list is exhausted, and then the Done branch aggregates the results.

The File component loads text files from your local machine, and then the Parse Data component parses them into a list of structured Data objects. The Loop component passes each Data object to a Prompt to be summarized.

When the Loop component runs out of Data, the Done branch activates; it counts the number of pages and summarizes their tone with another Prompt. In BroxiAI, this wiring is represented by connecting the Parse Data component's Data List output to the Loop component's Data input.
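The loop-then-aggregate flow above can be sketched in plain Python. The function names (`summarize`, `aggregate`) are hypothetical stand-ins for the Prompt components that BroxiAI wires visually:

```python
def summarize(page: str) -> str:
    # Stand-in for the per-item Prompt: summarize one Data object.
    return f"summary of: {page}"

def aggregate(summaries: list) -> dict:
    # Stand-in for the Done branch: count pages, then assess overall tone.
    return {"total_pages": len(summaries), "summaries": summaries}

pages = ["page one text", "page two text"]       # what Parse Data would emit
summaries = [summarize(page) for page in pages]  # Loop: each item in turn
report = aggregate(summaries)                    # Done: fires once items run out
```

The key point is the ordering: the aggregation step runs exactly once, only after the loop has consumed every item.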

Sample flow: looping summarizer

The output will look similar to this:

Document Summary
Total Pages Processed
Total Pages: 2
Overall Tone of Document
Tone: Informative and Instructional
The documentation outlines microservices architecture patterns and best practices.
It emphasizes service isolation and inter-service communication protocols.
The use of asynchronous messaging patterns is recommended for system scalability.
It includes code examples of REST and gRPC implementations to demonstrate integration approaches.

Usage

Loop component features:

  • Collection iteration

  • Sequential processing

  • Progress tracking

  • Result aggregation

  • Error handling

Inputs

  • data_list (Data List): Collection of data items to iterate over

  • max_iterations (Max Iterations): Maximum number of iterations allowed

  • break_condition (Break Condition): Condition to stop iteration early

Outputs

  • current_item (Current Item): Current data item being processed

  • iteration_count (Iteration Count): Current iteration number

  • loop_results (Loop Results): Aggregated results from all iterations

  • done (Done): Signal emitted when the loop is complete
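Conceptually, the outputs behave like a stream of per-iteration events followed by a final completion event. A minimal Python sketch (the dictionary keys mirror the output names above; the generator itself is illustrative, not BroxiAI's API):

```python
def loop(data_list, max_iterations=None):
    """Emit one event per iteration, then a final event with loop_results and done=True."""
    loop_results = []
    for iteration_count, current_item in enumerate(data_list, start=1):
        if max_iterations is not None and iteration_count > max_iterations:
            break  # respect the max_iterations input
        loop_results.append(current_item)
        yield {"iteration_count": iteration_count,
               "current_item": current_item,
               "done": False}
    # Fires once, after the last item: carries the aggregated results.
    yield {"loop_results": loop_results, "done": True}

events = list(loop(["a", "b"]))
```

Downstream components that listen to the Done signal correspond to consuming only the final event.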

Loop Patterns

Sequential Processing

  • One-by-One: Process each item individually

  • Ordered Processing: Maintain order of operations

  • State Preservation: Keep state between iterations

  • Progress Tracking: Monitor completion status
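These four properties can be illustrated with a small Python sketch (hypothetical helper, not a BroxiAI API): items are handled one at a time and in order, while a state dictionary persists across iterations and records progress.

```python
def process_sequentially(items):
    """Process items one by one, preserving state and tracking progress."""
    state = {"processed": 0, "results": []}   # state preserved across iterations
    for index, item in enumerate(items, start=1):
        state["results"].append(item.upper()) # one-by-one, in input order
        state["processed"] = index            # progress tracking
    return state

state = process_sequentially(["a", "b", "c"])
```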

Collection Operations

  • Mapping: Transform each item in a collection

  • Filtering: Process items that meet criteria

  • Reduction: Aggregate items into a single result

  • Grouping: Organize items by categories
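The four collection operations map directly onto familiar Python idioms. A short sketch, using an arbitrary sample list:

```python
from collections import defaultdict
from functools import reduce

items = [3, 1, 4, 1, 5, 9, 2, 6]

mapped = [x * 2 for x in items]                    # Mapping: transform each item
filtered = [x for x in items if x > 2]             # Filtering: keep items meeting criteria
total = reduce(lambda acc, x: acc + x, items, 0)   # Reduction: fold into one result

grouped = defaultdict(list)                        # Grouping: bucket items by a key
for x in items:
    grouped["even" if x % 2 == 0 else "odd"].append(x)
```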

Control Flow

  • Early Exit: Break loop on specific conditions

  • Skip Items: Continue to next item on errors

  • Retry Logic: Retry failed operations

  • Timeout Handling: Handle long-running operations
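Early exit and skip-on-error can be sketched together in Python. The `max_iterations` and `break_condition` parameters echo the component inputs of the same names; the per-item operation here is an arbitrary placeholder:

```python
def run_loop(items, max_iterations=100, break_condition=lambda item: False):
    """Iterate with an early-exit check and per-item error skipping."""
    results, errors = [], []
    for count, item in enumerate(items, start=1):
        if count > max_iterations or break_condition(item):
            break                       # Early exit: stop the whole loop
        try:
            results.append(10 / item)   # placeholder per-item operation
        except ZeroDivisionError as exc:
            errors.append((item, exc))  # Skip item: record error, continue
    return results, errors

results, errors = run_loop([1, 2, 0, 5], break_condition=lambda x: x < 0)
```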

Advanced Features

Performance Optimization

  • Batch Processing: Group items for efficient processing

  • Parallel Execution: Process multiple items simultaneously

  • Caching: Cache results to avoid reprocessing

  • Memory Management: Efficient memory usage for large collections
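Batching is the simplest of these to show concretely: grouping items into fixed-size chunks lets a downstream call handle several at once, and a generator keeps memory use bounded for large collections. A minimal sketch:

```python
def batched(items, batch_size):
    """Yield successive fixed-size batches for bulk downstream processing."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]  # lazily, one batch at a time

batches = list(batched(list(range(7)), batch_size=3))
# → [[0, 1, 2], [3, 4, 5], [6]]
```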

Error Handling

  • Graceful Degradation: Continue processing on partial failures

  • Error Collection: Collect and report all errors

  • Retry Mechanisms: Automatic retry with backoff

  • Rollback Support: Undo operations on failure
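Retry with exponential backoff is a common enough pattern to sketch directly (hypothetical helper, not a BroxiAI API): each failure doubles the wait before the next attempt, and the last failure is re-raised so the caller still sees it.

```python
import time

def retry_with_backoff(operation, attempts=3, base_delay=0.1):
    """Retry a failing operation, doubling the wait after each failure."""
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise                              # retries exhausted: surface the error
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff: 0.1s, 0.2s, ...
```

A transient failure that clears up within the retry budget is absorbed; a persistent one still propagates.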

Monitoring and Debugging

  • Progress Indicators: Real-time progress updates

  • Performance Metrics: Timing and throughput metrics

  • Debug Logging: Detailed execution logs

  • Breakpoints: Pause execution for debugging
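Progress indicators and timing metrics amount to instrumenting the loop body. A minimal sketch of the idea (illustrative helper, not a BroxiAI API):

```python
import time

def timed_loop(items, process):
    """Run a loop while collecting simple progress and timing metrics."""
    start = time.perf_counter()
    results, metrics = [], {"processed": 0, "total": len(items)}
    for item in items:
        results.append(process(item))
        metrics["processed"] += 1               # progress indicator: n of total
    metrics["elapsed_seconds"] = time.perf_counter() - start
    return results, metrics

results, metrics = timed_loop([1, 2, 3], lambda x: x * x)
```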

Use Cases

Data Processing

  • Document Processing: Process multiple documents

  • Data Transformation: Transform datasets

  • Batch Operations: Bulk data operations

  • ETL Workflows: Extract, transform, load operations

Content Generation

  • Bulk Content Creation: Generate multiple pieces of content

  • Template Processing: Apply templates to data sets

  • Report Generation: Create multiple reports

  • Personalization: Customize content for users

Validation and Quality Assurance

  • Data Validation: Validate multiple data sources

  • Quality Checks: Run quality checks on datasets

  • Compliance Verification: Check regulatory compliance

  • Testing: Automated testing workflows

Usage Notes

  • Performance: Optimized for large-scale iterative processing

  • Flexibility: Support for various loop patterns and conditions

  • Reliability: Robust error handling and recovery mechanisms

  • Monitoring: Comprehensive progress tracking and reporting

  • Scalability: Efficient processing of large collections

  • Integration: Seamless integration with other workflow components
