
Promise-parallel-throttle


Run an array of Promises in parallel. Kinda like Promise.all(), but throttled!

Install

NPM

npm i promise-parallel-throttle -S

Yarn

yarn add promise-parallel-throttle

Usage

import * as Throttle from 'promise-parallel-throttle';

//Function which should return a Promise
const doReq = async (firstName, lastName) => {
    //Do something async.
    return firstName + ' ' + lastName;
};

const users = [
    {firstName: 'Irene', lastName: 'Pullman'},
    {firstName: 'Sean', lastName: 'Parr'},
];

//Queue with functions to be run
const queue = users.map((user) => () => doReq(user.firstName, user.lastName));

//By default Throttle runs 5 promises in parallel.
const formattedNames = await Throttle.all(queue);

console.log(formattedNames); //['Irene Pullman', 'Sean Parr']


API

Throttle.all

Throttle.all(tasks, options)

Throttle.all is made to behave exactly like Promise.all, but instead of running all the tasks in parallel it runs at most a maximum amount of tasks in parallel. Only the tasks parameter is required; the options parameter is optional.
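
For example, a minimal sketch reusing the queue from the Usage section, limiting execution to two tasks at a time via the maxInProgress option:

//Run at most 2 tasks in parallel instead of the default 5.
const twoAtATime = await Throttle.all(queue, {maxInProgress: 2});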

Throttle.sync

Throttle.sync(tasks, options)

Throttle.sync runs all the tasks sequentially, one at a time. Once again the tasks array is required and the options are optional. Be aware that this method is simply a wrapper that sets maxInProgress to 1, so overriding this option in the options object would run the tasks in parallel again.
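
A minimal sketch, again reusing the queue from the Usage section:

//Runs the tasks one by one, in order.
const namesInOrder = await Throttle.sync(queue);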

Throttle.raw

Throttle.raw(tasks, options)

The raw method, instead of returning the tasks' results, returns a result object. This is useful if you want more statistics about the execution of your tasks. Once again the tasks are required while the options are optional.
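
A minimal sketch, reusing the queue from the Usage section; the properties read here are described in the Result object table below:

const result = await Throttle.raw(queue);

console.log(result.amountResolved); //2
console.log(result.taskResults); //['Irene Pullman', 'Sean Parr']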

Options object

| Parameter | Type | Default | Definition |
| --- | --- | --- | --- |
| maxInProgress | Integer | 5 | Maximum amount of tasks running in parallel |
| failFast | Boolean | true (false for the raw method) | Reject after a single error, or keep running |
| progressCallback | Function | Optional | Callback with progress reports |
| nextCheck | Function | Optional | Function which should return a Promise; if the Promise resolves to true the next task is started. Errors will propagate and should be handled in the calling code |
| ignoreIsFunctionCheck | Boolean | false | If one of the tasks is not a function an error is thrown; if this boolean is set to true the task itself is simply returned instead |
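
As an illustration, here is a sketch of an options object combining a few of these parameters; the progressCallback receives the Result object described in the next section:

const onProgress = (result) => {
    console.log(`Started ${result.amountStarted}, finished ${result.amountDone} of ${queue.length} tasks`);
};

const throttledNames = await Throttle.all(queue, {
    maxInProgress: 2,
    failFast: true,
    progressCallback: onProgress,
});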

Result object / Progress callback

Both the progressCallback and Throttle.raw return a Result object with the following properties:

| Property | Type | Start value | Definition |
| --- | --- | --- | --- |
| lastCompletedIndex | Integer | -1 | Last index of a task that completed (either fulfilled or rejected) |
| amountDone | Integer | 0 | Amount of tasks which are finished |
| amountStarted | Integer | 0 | Amount of tasks which have started |
| amountResolved | Integer | 0 | Amount of tasks which successfully resolved |
| amountRejected | Integer | 0 | Amount of tasks which resulted in an error and were aborted |
| amountNextCheckFalsey | Integer | 0 | Amount of tasks for which nextCheck returned a falsey value |
| rejectedIndexes | Array | [] | All the indexes in the tasks array where the promise rejected |
| resolvedIndexes | Array | [] | All the indexes in the tasks array where the promise resolved |
| nextCheckFalseyIndexes | Array | [] | All the indexes in the tasks array where nextCheck returned a falsey value |
| taskResults | Array | [] | Array containing the result of every task |
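
For example, after a Throttle.raw run (where failFast defaults to false) you could use these properties to report which tasks failed; a sketch, assuming the queue from the Usage section:

const result = await Throttle.raw(queue);

if (result.amountRejected > 0) {
    console.log('Tasks at these indexes rejected:', result.rejectedIndexes);
}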

nextCheck

All the Throttle methods use a nextCheck function to verify whether the next task is allowed to start.

The default nextCheck is defined like this:

const defaultNextTaskCheck = (status, tasks) => {
    return new Promise((resolve, reject) => {
        resolve(status.amountStarted < tasks.length);
    });
};

This function receives a status object which adheres to the Result object, as well as the list of tasks. The default nextCheck simply checks whether the amount of started tasks is still below the total amount of tasks; if so, the next task is free to start.

This function can be useful as the basis for your own scheduler, based on, for example, RAM/CPU usage. Let's say your tasks use a lot of RAM and you don't want to exceed a certain amount. You could then write logic inside a nextCheck function which only resolves once there is enough RAM available to start the next task.

If a custom implementation decides to reject, the error is propagated and should be handled in the calling code. If a custom implementation resolves with a falsey value, the task will simply not execute and the next task will be scheduled.
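
A minimal sketch of such a memory-aware nextCheck, assuming the tasks run in Node; the 500 MB heap threshold and the 100 ms polling interval are arbitrary choices for illustration:

const memoryAwareNextCheck = (status, tasks) => {
    return new Promise((resolve) => {
        //Mirror the default check: never start more tasks than there are.
        if (status.amountStarted >= tasks.length) {
            return resolve(false);
        }
        //Poll until heap usage drops below the (arbitrary) 500 MB threshold.
        const waitForMemory = () => {
            if (process.memoryUsage().heapUsed < 500 * 1024 * 1024) {
                resolve(true);
            } else {
                setTimeout(waitForMemory, 100);
            }
        };
        waitForMemory();
    });
};

const throttledResults = await Throttle.all(queue, {nextCheck: memoryAwareNextCheck});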

Example

Check out the example directory; it's heavily documented, so it should be easy to follow.

To run the example, Node 8.x or higher is required, since the example uses native async/await.

Simply run the example with npm:

npm run-script names

Or with Yarn:

yarn names