Batch processing refers to the scheduling and processing of large volumes of data together, typically during periods when demand on computing resources is low. Batch jobs are usually repetitive in nature and are often scheduled (automated) to run at set intervals, such as at the end of the day or the end of the week. This contrasts with stream processing, in which data is processed continuously as soon as it becomes available. Credit card transactions are a common example of batch processing: they are typically posted to account statements together overnight rather than appearing on individual accounts instantly. Batching large database updates together allows for efficient use of processing resources without interrupting day-to-day business operations.
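To make the distinction concrete, here is a minimal Python sketch of a hypothetical nightly batch job that posts a day's accumulated credit card transactions to statements in a single pass. The sample data, account IDs, and the `run_nightly_batch` function are illustrative assumptions, not part of any particular product; a stream processor would instead handle each transaction the moment it arrived.

```python
from collections import defaultdict
from datetime import date

# Hypothetical transactions accumulated during the day; in a real system
# these would be read from a database or message queue.
transactions = [
    {"account": "A-1001", "amount": 42.50},
    {"account": "A-1002", "amount": 15.00},
    {"account": "A-1001", "amount": 9.99},
]

def run_nightly_batch(transactions, statement_date):
    """Post all of the day's transactions to statements in one pass."""
    statements = defaultdict(list)
    for txn in transactions:          # group the day's activity by account
        statements[txn["account"]].append(txn["amount"])
    for account, amounts in statements.items():
        print(f"{statement_date} {account}: "
              f"{len(amounts)} transactions, total {sum(amounts):.2f}")

# Typically triggered by a scheduler (e.g. cron) during off-peak hours.
run_nightly_batch(transactions, date.today())
```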

Use Alteryx Designer Cloud to Automate Batch Processing

Designer Cloud makes it easy to automate the data transformation steps in batch processing pipelines. After preparing your data in Designer Cloud’s intuitive interface, you can save your transformation steps as recipes and include them in your automated data pipelines.
