Async Function Serializer

Why

Initially, I needed to queue some fetch calls in another project, and since I was unaware of the Async package, I wrote something quick & dirty myself.

I then needed more functionality, testing, and so on, so I split it out into a little package to see if I could make something that scratched my own itch.

Any time I think of something else I might need, or am just curious whether I'm capable of building it, I add it to this repo.

Features

  • Basic synchronous and asynchronous queue
  • Optional execution concurrency
  • Input transformer, allowing the most recent result to influence the execution of the next step
  • Returns the execution result or error
  • Batching on addition to the queue
  • Sorting immediately before drawing from the queue
  • Initial execution delay

Background

https://www.chriskerr.dev/writing/serialising-async-functions

Usage

npm i async-function-serializer
yarn add async-function-serializer
import serializer from "async-function-serializer";

const exampleFunction = async ( wait: number ): Promise<number> => {
 return await new Promise( resolve => setTimeout(() => resolve( wait ), wait ));
};

const serialExample = serializer( exampleFunction, options? );

const { data, error } = await serialExample( 1000, inputOptions? );
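
Calls made while an earlier one is still running are queued and, with the default concurrency of 1, resolved in order. A minimal sketch of that behaviour, reusing exampleFunction from above (the timings are illustrative only):

const first = serialExample( 500 );
const second = serialExample( 100 );

// The second call only begins once the first has resolved, so the total
// elapsed time is roughly 600ms rather than the 500ms a parallel run would take
const results = await Promise.all([ first, second ]);
// results[0].data === 500, results[1].data === 100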

Serializer options

type SerializeOptions<Input, Return> = {
 /**
  * Maximum number of simultaneous executions
  * @defaultValue 1
  */
 concurrency?: number;

 /**
  * How long to delay before starting initial execution
  * @defaultValue 0
  */
 delay?: number;

 /**
  * Used to sort the queue at the beginning of each execution cycle
  * @defaultValue undefined
  */
 sortBy?: {
  /**
   * Key of the input to sort by. Only takes effect when the input is an object, and only top-level keys are supported.
   */
  key: keyof Input;
  /**
   * Sort direction.
   * @defaultValue 'asc'
   */
  direction?: 'asc' | 'desc';
 };

 /**
  * Batch input when adding them to the queue.
  * @defaultValue undefined
  */
 batch?: {
  /**
   * How long to wait before adding the batch to the queue
   */
  debounceInterval: number;

  /**
   * Maximum duration before a batch will close
   */
  maxDebounceInterval?: number;

  /**
   * Function to combine the new queue item into the current batch
   * @param existingBatch - the current batch
   * @param newInput - the new item being added into the batch
   */
  batchTransformer: (
   existingBatch: Input | undefined,
   newInput: Input,
  ) => Input;
 };

 /**
  * Function to transform the input at the beginning of each execution cycle
  * @param input - the current value being transformed
  * @param previousResult - the results from the previous execution, if any
  */
 inputTransformer?: (
  input: Input,
  previousResult: Awaited<Return> | undefined,
 ) => Input | Promise<Input>;
};
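
As a rough sketch of how these options fit together (SaveInput, save, and serialSave below are hypothetical names, and the option values are purely illustrative):

type SaveInput = { id: number; text: string };

const save = async ( input: SaveInput ): Promise<SaveInput> => {
 await new Promise( resolve => setTimeout( resolve, 50 ));
 return input;
};

const serialSave = serializer( save, {
 delay: 10,
 sortBy: { key: 'id', direction: 'asc' },
 batch: {
  debounceInterval: 100,
  maxDebounceInterval: 1000,
  // Merge each new item into the currently open batch; later fields win
  batchTransformer: ( existingBatch, newInput ) => ({ ...existingBatch, ...newInput }),
 },
 // Let the previous result influence the next input
 inputTransformer: ( input, previousResult ) =>
  previousResult ? { ...input, text: `${ previousResult.text } ${ input.text }` } : input,
});

With batching enabled, calls that arrive within the debounce interval are merged into a single queue item before execution, and the sort runs over whatever is queued at the start of each execution cycle.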

Input options

type InputOptions = {
 /**
  * This will force-start a new batch. If another batch is in progress, it will immediately be added to the queue.
  * */
 startNewBatch?: boolean;
};
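
Continuing the hypothetical serialSave sketch above, a caller could force a fresh batch for a particular input:

// Closes whatever batch is currently open (it gets added to the queue
// immediately) and starts a new batch containing this input
const urgent = await serialSave( { id: 2, text: 'urgent' }, { startNewBatch: true } );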

Bugs, Feedback & Contributions

I'd be glad to hear from you! Please raise any of these through issues, discussions, or a pull request above 😃