SwiftDataLoader
SwiftDataLoader is a generic utility to be used as part of your application's data fetching layer to provide a simplified and consistent API over various remote data sources such as databases or web services via batching and caching.
This is a Swift version of the Facebook DataLoader.
Installation 💻
Update your Package.swift file:
.Package(url: "https://github.com/kimdv/SwiftDataLoader.git", majorVersion: 1)
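That entry targets the older manifest format. With newer swift-tools-versions the declaration would typically look like the sketch below; the version "1.0.0" and the target name "App" are placeholders, so adjust them to your project and to the tags published by the repository.
// swift-tools-version:5.1
// Minimal Package.swift sketch; "1.0.0" and "App" are illustrative placeholders.
import PackageDescription

let package = Package(
    name: "App",
    dependencies: [
        .package(url: "https://github.com/kimdv/SwiftDataLoader.git", from: "1.0.0"),
    ],
    targets: [
        .target(name: "App", dependencies: ["SwiftDataLoader"]),
    ]
)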
Getting started 🚀
Batching
Batching is not an advanced feature, it's DataLoader's primary feature.
Create a DataLoader by providing a batch loading function:
let userLoader = DataLoader<Int, User>(batchLoadFunction: { keys in
    try User.query(on: req).filter(\User.id ~~ keys).all().map { users in
        users.map { DataLoaderFutureValue.success($0) }
    }
})
Load a single key
let future1 = try userLoader.load(key: 1, on: req)
let future2 = try userLoader.load(key: 2, on: req)
let future3 = try userLoader.load(key: 1, on: req)
Now there is only one thing left to do: dispatch it.
try userLoader.dispatchQueue(on: req.eventLoop)
The example above results in a single batched fetch of just two users, because future1 and future3 share the cached future for key 1.
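For completeness, here is a minimal sketch of the full flow, assuming the load(key:on:) and dispatchQueue(on:) calls shown above return SwiftNIO futures:
// Queue up loads; nothing is fetched yet.
let futureUser1 = try userLoader.load(key: 1, on: req)
let futureUser2 = try userLoader.load(key: 2, on: req)

// Dispatching resolves all queued loads with a single call to the batch load function.
try userLoader.dispatchQueue(on: req.eventLoop)

// Both futures resolve from the same batched fetch.
futureUser1.and(futureUser2).whenSuccess { users in
    print(users.0, users.1)
}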
Load multiple keys
There is also an API to load multiple keys at once:
try userLoader.loadMany(keys: [1, 2, 3], on: req.eventLoop)
Disable batching
It is also possible to disable batching by passing DataLoaderOptions(batchingEnabled: false). The loader will then invoke batchLoadFunction with a single key.
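As a sketch, reusing the Vapor-style batch function from above and assuming the options are passed to the initializer the same way as in the caching examples below:
let unbatchedUserLoader = DataLoader<Int, User>(
    options: DataLoaderOptions(batchingEnabled: false),
    batchLoadFunction: { keys in
        // With batching disabled, `keys` holds a single key per invocation.
        try User.query(on: req).filter(\User.id ~~ keys).all().map { users in
            users.map { DataLoaderFutureValue.success($0) }
        }
    }
)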
Caching
DataLoader provides a memoization cache for all loads which occur in a single request to your application. After .load() is called once with a given key, the resulting value is cached to eliminate redundant loads.
In addition to relieving pressure on your data storage, caching results per-request also creates fewer objects, which may relieve memory pressure on your application:
let userLoader = DataLoader<Int, Int>(...)
let future1 = userLoader.load(1)
let future2 = userLoader.load(1)
assert(future1 === future2)
Caching per-Request
DataLoader caching does not replace Redis, Memcache, or any other shared application-level cache. DataLoader is first and foremost a data loading mechanism, and its cache only serves the purpose of not repeatedly loading the same data in the context of a single request to your application. To do this, it maintains a simple in-memory memoization cache (more accurately: .load() is a memoized function).
Avoid sharing a DataLoader instance across requests from different users, which could result in cached data incorrectly appearing in each request. Typically, DataLoader instances are created when a request begins and are not used once the request ends.
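A minimal sketch of that pattern, using a hypothetical makeUserLoader(on:) factory that builds a fresh loader for every incoming request:
// Hypothetical factory: call this when a request begins so cached users
// never leak between requests from different users.
func makeUserLoader(on req: Request) -> DataLoader<Int, User> {
    return DataLoader<Int, User>(batchLoadFunction: { keys in
        try User.query(on: req).filter(\User.id ~~ keys).all().map { users in
            users.map { DataLoaderFutureValue.success($0) }
        }
    })
}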
Clearing Cache
In certain uncommon cases, clearing the request cache may be necessary. The most common case is after a mutation or update within the same request, when a cached value could be out of date and future loads should not use the possibly stale value. Here's a simple example using SQL UPDATE to illustrate:
// Request begins...
let userLoader = DataLoader<Int, Int>(...)

// And a value happens to be loaded (and cached).
userLoader.load(4)

// A mutation occurs, invalidating what might be in cache.
sqlRun("UPDATE users SET username = 'zuck' WHERE id = 4").then { userLoader.clear(4) }

// Later the value is loaded again, so the mutated data appears.
userLoader.load(4)

// Request completes.
Caching Errors
If a batch load fails (that is, a batch function throws or returns a DataLoaderFutureValue.failure(Error)), then the requested values will not be cached. However, if a batch function returns an Error instance for an individual value, that Error will be cached to avoid frequently loading the same Error.
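For illustration, here is a sketch of a batch function that fails individual keys rather than the whole batch; UserNotFoundError is a hypothetical error type, and matching users to keys by id is an assumption about your model:
struct UserNotFoundError: Error {
    let id: Int
}

let userLoader = DataLoader<Int, User>(batchLoadFunction: { keys in
    try User.query(on: req).filter(\User.id ~~ keys).all().map { users in
        keys.map { key -> DataLoaderFutureValue<User> in
            if let user = users.first(where: { $0.id == key }) {
                // Found: cached as a success for this key.
                return DataLoaderFutureValue.success(user)
            } else {
                // Missing: this individual failure is cached for this key only.
                return DataLoaderFutureValue.failure(UserNotFoundError(id: key))
            }
        }
    }
})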
In some circumstances you may wish to clear the cache for these individual Errors:
userLoader.load(1).catch { error in
    if /* determine if the error should be cleared */ {
        userLoader.clear(1)
    }
    // Rethrow or otherwise propagate the error as appropriate for your futures API.
}
Disabling Cache
In certain uncommon cases, a DataLoader which does not cache may be desirable.
Calling DataLoader(options: DataLoaderOptions(cachingEnabled: false), batchLoadFunction: batchLoadFunction) will ensure that every call to .load() produces a new Future, and requested keys will not be saved in memory.
However, when the memoization cache is disabled, your batch function will receive an array of keys which may contain duplicates! Each key will be associated with each call to .load(). Your batch loader should provide a value for each instance of the requested key.
For example:
let myLoader = DataLoader<String, String>(options: DataLoaderOptions(cachingEnabled: false), batchLoadFunction: { keys in
    self.someBatchLoader(keys: keys).map { DataLoaderFutureValue.success($0) }
})
myLoader.load("A")
myLoader.load("B")
myLoader.load("A")
// > [ "A", "B", "A" ]
More complex cache behavior can be achieved by calling .clear() or .clearAll() rather than disabling the cache completely. For example, this DataLoader will provide unique keys to a batch function due to the memoization cache being enabled, but will immediately clear its cache when the batch function is called so later requests will load new values:
var myLoader: DataLoader<String, String>!
myLoader = DataLoader<String, String>(batchLoadFunction: { keys in
    // Clear the cache every time a batch is dispatched so later loads fetch fresh values.
    myLoader.clearAll()
    return someBatchLoad(keys: keys)
})
Contributing 🤘
All your feedback and help to improve this project is very welcome. Please create issues for your bugs, ideas and enhancement requests, or better yet, contribute directly by creating a PR. 😎
When reporting an issue, please add detailed instructions, and if possible a code snippet or test that can be used to reproduce your problem. 💥
When creating a pull request, please adhere to the current coding style where possible, and add tests for your code so the project keeps its awesome level of test coverage 💪
Acknowledgements 👏
This library is entirely a Swift version of Facebook's DataLoader, which was developed by Lee Byron and Nicholas Schrock at Facebook.