Import Batch

An ETL Import Batch represents a single data-extraction operation. It tracks the overall statistics and status of extracting data from an ETL Data Source into manageable chunks.

Note: This DocType is system-generated and cannot be created manually. Batches are created when an ETL Job runs or when the "Extract to Staging" action is triggered on an ETL Data Source.

Field Reference

Batch Information

| Field | Type | Description |
| --- | --- | --- |
| Name | Auto | System-generated unique identifier (IMP-YYYY-MM-DD-#####) |
| Data Source | Link | ETL Data Source that generated this batch |
| Status | Select | Current batch status (Pending, Running, Complete, Failed) |
| Notes | Small Text | Error messages or additional information |

Statistics

| Field | Type | Description |
| --- | --- | --- |
| Chunk Size | Int | Number of records per chunk (copied from source) |
| Rows | Int | Total number of records extracted |
| Chunks | Int | Number of chunks created |
| Bytes | Int | Total size of extracted data in bytes |
| Errors | Int | Number of errors encountered during extraction |
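The statistics fields are related: since each chunk holds at most Chunk Size records, the number of chunks should equal the row count divided by the chunk size, rounded up. A minimal sketch of that sanity check (the helper name is illustrative, not part of the DocType):

```python
import math

def expected_chunks(rows: int, chunk_size: int) -> int:
    """Number of chunks needed to hold `rows` records at up to
    `chunk_size` records each; the last chunk may be partially filled."""
    if chunk_size <= 0:
        raise ValueError("chunk_size must be positive")
    return math.ceil(rows / chunk_size)

# e.g. 10,500 rows at 1,000 records per chunk
print(expected_chunks(10_500, 1_000))  # 11
```

Comparing this value against the Chunks field is a quick way to verify that an extraction wrote all of its data.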

Timestamps

| Field | Type | Description |
| --- | --- | --- |
| Started At | Datetime | When extraction began |
| Finished At | Datetime | When extraction completed |

Status Values

| Status | Description |
| --- | --- |
| Pending | Batch created but extraction not started |
| Running | Extraction currently in progress |
| Complete | Extraction finished successfully |
| Failed | Extraction encountered errors and stopped |
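The table implies a simple lifecycle: a batch starts Pending, moves to Running, and ends in one of two terminal states. A hypothetical sketch of those transitions (the actual DocType may enforce them differently):

```python
# Lifecycle implied by the status table; assumed, not taken from the DocType code.
ALLOWED_TRANSITIONS = {
    "Pending": {"Running"},
    "Running": {"Complete", "Failed"},
    "Complete": set(),  # terminal
    "Failed": set(),    # terminal; re-running the extraction starts fresh
}

def can_transition(current: str, target: str) -> bool:
    """Return True if a batch may move from `current` to `target` status."""
    return target in ALLOWED_TRANSITIONS.get(current, set())
```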

Actions

View Chunks

Navigate to the list of ETL Import Chunks belonging to this batch to examine the raw extracted data.

Run Transform

Select an ETL Transform Map to process this batch's data into target DocTypes.

Usage Notes

  • Batches cannot be manually created or edited
  • Failed batches can be re-processed by running the extraction again
  • Large batches are automatically split into chunks for memory efficiency
  • Raw data in chunks is stored in JSONL format for efficient processing
  • Completed batches can be transformed multiple times with different Transform Maps
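Since chunk data is stored as JSONL (one JSON object per line), it can be consumed record by record without loading an entire chunk into memory. A minimal sketch of reading such data, assuming each line is a flat JSON record:

```python
import json
from typing import Iterator

def iter_jsonl(raw: str) -> Iterator[dict]:
    """Yield one parsed record per non-empty line of JSONL text."""
    for line in raw.splitlines():
        line = line.strip()
        if line:
            yield json.loads(line)

# Illustrative sample data, not taken from a real chunk.
sample = '{"id": 1, "name": "alpha"}\n{"id": 2, "name": "beta"}\n'
records = list(iter_jsonl(sample))
```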

Monitoring Extractions

Use the ETL Import Batch list view to monitor extraction operations:

  1. Filter by Status to see running or failed extractions
  2. Check the Rows count to verify expected data volumes
  3. Review the Notes field for error details on failed batches
  4. Monitor the duration between the Started At and Finished At timestamps
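Extraction duration is simply the difference between the two timestamp fields. A small sketch, assuming the common `YYYY-MM-DD HH:MM:SS` datetime string format (stored values may also carry fractional seconds):

```python
from datetime import datetime

def extraction_duration_seconds(started_at: str, finished_at: str) -> float:
    """Seconds elapsed between the Started At and Finished At timestamps."""
    fmt = "%Y-%m-%d %H:%M:%S"  # assumed format; adjust if values differ
    start = datetime.strptime(started_at, fmt)
    end = datetime.strptime(finished_at, fmt)
    return (end - start).total_seconds()

print(extraction_duration_seconds("2024-05-01 10:00:00", "2024-05-01 10:03:30"))  # 210.0
```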
Related DocTypes

  • ETL Data Source: Parent configuration that generates batches
  • ETL Import Chunk: Child records containing the actual extracted data
  • ETL Transform Run: Processing records that consume batch data