Store is a Kotlin library for loading data from remote and local sources.
- Modern software needs data representations to be fluid and always available.
- Users expect their UI experience to never be compromised (blocked) by new data loads. Whether an application is social, news or business-to-business, users expect a seamless experience both online and offline.
- International users expect minimal data downloads as many megabytes of downloaded data can quickly result in astronomical phone bills.
A Store is a class that simplifies fetching, sharing, storage, and retrieval of data in your application. A Store is similar to the Repository pattern while exposing an API built with Coroutines that adheres to a unidirectional data flow.
Store provides a level of abstraction between UI elements and data operations.
A Store is responsible for managing a particular data request. When you create an implementation of a Store, you provide it with a Fetcher, a function that defines how data will be fetched over the network. You can also define how your Store will cache data in-memory and on-disk. Since Store returns your data as a Flow, threading is a breeze! Once a Store is built, it handles the logic around data flow, allowing your views to use the best data source and ensuring that the newest data is always available for later offline use.
Store leverages multiple request throttling to prevent excessive calls to the network and disk cache. By utilizing Store, you eliminate the possibility of flooding your network with the same request while adding two layers of caching (memory and disk), as well as the ability to use disk as a source of truth, where you can modify the disk directly without going through Store (this works best with databases that provide observable queries, such as Jetpack Room, SQLDelight or Realm).
Artifacts are hosted on Maven Central.
def store_version = "4.0.0-alpha04"
implementation "com.dropbox.mobile.store:store4:${store_version}"
Also set your module's Java compatibility to 1.8:
android {
    compileOptions {
        sourceCompatibility 1.8
        targetCompatibility 1.8
    }
    ...
}
Let's start by looking at what a fully configured Store looks like. We will then walk through simpler examples showing each piece:
StoreBuilder
    .fromNonFlow {
        api.fetchSubreddit(it, "10").data.children.map(::toPosts)
    }
    .persister(
        reader = db.postDao()::loadPosts,
        writer = db.postDao()::insertPosts,
        delete = db.postDao()::clearFeed,
        deleteAll = db.postDao()::clearAllFeeds
    )
    .build()
With the above setup you have:
- In-memory caching for rotation
- Disk caching for when users are offline
- Throttling of API calls when parallel requests are made for the same resource
- Rich API to ask for data whether you want cached, new or a stream of future data updates.
And now for the details:
You create a Store using a builder. The only requirement is to include a function that returns a Flow<ReturnType> or a suspend function that returns a ReturnType.
val store = StoreBuilder
    .from { articleId -> api.getArticle(articleId) } // api returns Flow<Article>
    .build()
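If your data source exposes a one-shot suspending call instead of a Flow, you can use the non-Flow variant of the builder instead. A minimal sketch, assuming a hypothetical api.fetchArticle suspend function:
val store = StoreBuilder
    .fromNonFlow { articleId -> api.fetchArticle(articleId) } // api.fetchArticle is a suspend function returning Article
    .build()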
Store uses generic keys as identifiers for data. A key can be any value object that properly implements toString(), equals() and hashCode(). When your Fetcher function is called, it will be passed a particular Key value. Similarly, the key will be used as a primary identifier within caches (make sure to have a proper hashCode()!!).
Note: We highly recommend using built-in types that implement equals and hashCode, or Kotlin data classes, for complex keys.
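For example, a data class gives you correct equals and hashCode implementations for free. A minimal sketch, assuming a key made of a subreddit name and a post limit and reusing the fetchSubreddit API from above:
data class FeedKey(val subreddit: String, val limit: Int)

val feedStore = StoreBuilder
    .fromNonFlow { key: FeedKey ->
        api.fetchSubreddit(key.subreddit, key.limit.toString()).data.children.map(::toPosts)
    }
    .build()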
The primary function provided by a Store instance is the stream function, which has the following signature:
fun stream(request: StoreRequest<Key>): Flow<StoreResponse<Output>>
Each stream call receives a StoreRequest object, which defines which key to fetch and which data sources to utilize. The response is a Flow of StoreResponse. StoreResponse is a Kotlin sealed class that can be either a Loading, Data or Error instance.
Each StoreResponse includes an origin field which specifies where the event is coming from.
- The Loading class only has an origin field. This can provide you information like "network is fetching data", which can be a good signal to activate the loading spinner in your UI.
- The Data class has a value field which includes an instance of the type returned by Store.
- The Error class includes an error field that contains the exception thrown by the given origin.
When an error happens, Store does not throw an exception; instead, it wraps it in a StoreResponse.Error type, which allows the Flow to continue so that it can still receive updates that might be triggered by either changes in your data source or subsequent fetch operations.
lifecycleScope.launchWhenStarted {
    store.stream(StoreRequest.cached(key = key, refresh = true)).collect { response ->
        when (response) {
            is StoreResponse.Loading -> showLoadingSpinner()
            is StoreResponse.Data -> {
                if (response.origin == ResponseOrigin.Fetcher) hideLoadingSpinner()
                updateUI(response.value)
            }
            is StoreResponse.Error -> {
                if (response.origin == ResponseOrigin.Fetcher) hideLoadingSpinner()
                showError(response.error)
            }
        }
    }
}
For convenience, there are Store.get(key) and Store.fresh(key) extension functions.
- suspend fun Store.get(key: Key): Value: This method returns a single value for the given key. If available, it will be returned from the in-memory cache or the persister. An error will be thrown if no value is available in either the cache or the persister and the fetcher fails to load the data from the network.
- suspend fun Store.fresh(key: Key): Value: This method returns a single value for the given key that is obtained by querying the fetcher. An error will be thrown if the fetcher fails to load the data from the network, regardless of whether any value is available in the cache or the persister.
lifecycleScope.launchWhenStarted {
    val article = store.get(key)
    updateUI(article)
}
The first time you call store.get(key), the response will be stored in an in-memory cache and in the persister, if provided.
All subsequent calls to store.get(key) with the same Key will retrieve the cached version of the data, minimizing unnecessary data calls. This prevents your app from fetching fresh data over the network (or from another external data source) in situations when doing so would unnecessarily waste bandwidth and battery. A great use case is rotation: when your views are recreated, they can request the cached data from your Store, which can help you avoid the need to retain it in the view layer.
By default, 100 items will be cached in memory for 24 hours. You may pass in your own memory policy to override the default policy.
Alternatively, you can call store.fresh(key) to get fresh data, skipping the memory cache (and the disk cache, if provided).
A good use case is overnight background updates: use fresh() to make sure that calls to store.get() will not have to hit the network during normal usage. Another good use case for fresh() is when a user wants to pull to refresh.
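A sketch of the pull-to-refresh case (swipeRefreshLayout is a hypothetical SwipeRefreshLayout, and store/key are the article store and key from above):
swipeRefreshLayout.setOnRefreshListener {
    lifecycleScope.launch {
        try {
            updateUI(store.fresh(key)) // bypass caches and query the fetcher
        } finally {
            swipeRefreshLayout.isRefreshing = false
        }
    }
}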
Calls to both fresh() and get() emit one value or throw an error.
For real-time updates, you may also call store.stream(), which returns a Flow<T> that emits each time a new item is returned from your store. You can think of stream as a way to create reactive streams that update whenever your disk or memory cache updates.
Example calls:
lifecycleScope.launchWhenStarted {
    // Emits the cached value (if any) followed by any fresh values.
    // Setting refresh = true also triggers a network call even if the data is available in the cache or on disk.
    store.stream(StoreRequest.cached(3, refresh = false))
        .collect { }

    // Skip the cache and go directly to the fetcher.
    store.stream(StoreRequest.fresh(3))
        .collect { }
}
To prevent duplicate requests for the same data, Store offers an inflight debouncer. If a request is made while an identical previous request is still in flight, it will share that request's response. This is useful for situations when your app needs to make many async calls for the same data at startup or when users are obsessively pulling to refresh. As an example, The New York Times news app asynchronously calls ConfigStore.get() from 12 different places on startup. The first call blocks while all others wait for the data to arrive. We have seen a dramatic decrease in the app's data usage after implementing this inflight logic.
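A minimal sketch of the effect, assuming a hypothetical configStore keyed by Unit: concurrent get() calls for the same key share a single fetch.
lifecycleScope.launch {
    // All twelve calls are debounced into one network request; each coroutine receives the same value.
    val configs = (1..12).map { async { configStore.get(Unit) } }.awaitAll()
}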
Stores can enable disk caching by passing a Persister into the builder. Whenever a new network request is made, the Store will first write to the disk cache and then read from the disk cache.
Providing a persister whose read function returns a Flow<Value> allows you to make Store treat your disk as a source of truth. Any changes made on disk, even if they are not made through Store, will update the active Store streams.
This feature, combined with persistence libraries that provide observable queries (Jetpack Room, SQLDelight or Realm), allows you to create offline-first applications that can be used without an active network connection while still providing a great user experience.
StoreBuilder
    .fromNonFlow {
        api.fetchSubreddit(it, "10").data.children.map(::toPosts)
    }
    .persister(
        reader = db.postDao()::loadPosts,
        writer = db.postDao()::insertPosts,
        delete = db.postDao()::clearFeed
    )
    .build()
Stores don’t care how you’re storing or retrieving your data from disk. As a result, you can use Stores with object storage or any database (Realm, SQLite, CouchDB, Firebase etc). Technically, there is nothing stopping you from implementing an in-memory cache for the “persister” implementation and instead have two levels of in-memory caching--one with inflated and one with deflated models, allowing for sharing of the “persister” cache data between stores.
If using SQLite, we recommend working with Room, which can return a Flow from a query.
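A rough sketch of such a DAO (the Post entity, table name, and exact signatures are assumptions; you may need small lambda adapters to match the persister parameters above):
@Dao
interface PostDao {
    // Returning Flow makes Room re-emit whenever the underlying table changes,
    // which lets Store treat the database as a source of truth.
    @Query("SELECT * FROM posts WHERE subredditName = :subredditName")
    fun loadPosts(subredditName: String): Flow<List<Post>>

    @Insert(onConflict = OnConflictStrategy.REPLACE)
    suspend fun insertPosts(posts: List<Post>)

    @Query("DELETE FROM posts WHERE subredditName = :subredditName")
    suspend fun clearFeed(subredditName: String)

    @Query("DELETE FROM posts")
    suspend fun clearAllFeeds()
}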
The above builder is how we recommend working with data on Android. With the above setup you have:
- Memory caching with TTL & Size policies
- Disk caching with simple integration with Room
- In-flight request management
- Ability to get cached data or bust through your caches (get() vs. fresh())
- Ability to listen for any new emissions from network (stream)
- Structured Concurrency through APIs built on Coroutines and Kotlin Flow
You can configure the in-memory cache with a MemoryPolicy:
StoreBuilder
    .fromNonFlow {
        api.fetchSubreddit(it, "10").data.children.map(::toPosts)
    }
    .cachePolicy(
        MemoryPolicy.builder()
            .setMemorySize(10)
            .setExpireAfterAccess(10.minutes) // or setExpireAfterWrite(10.minutes)
            .build()
    )
    .persister(
        reader = db.postDao()::loadPosts,
        writer = db.postDao()::insertPosts,
        delete = db.postDao()::clearFeed,
        deleteAll = db.postDao()::clearAllFeeds
    )
    .build()
- setMemorySize(maxSize: Long) sets the maximum number of entries to be kept in the cache before starting to evict the least recently used items.
- setExpireAfterAccess(expireAfterAccess: Duration) sets the maximum time an entry can live in the cache since the last access, where "access" means reading the cache, adding a new cache entry, and replacing an existing entry with a new one. This duration is also known as time-to-idle (TTI).
- setExpireAfterWrite(expireAfterWrite: Duration) sets the maximum time an entry can live in the cache since the last write, where "write" means adding a new cache entry and replacing an existing entry with a new one. This duration is also known as time-to-live (TTL).
Note that setExpireAfterAccess and setExpireAfterWrite cannot both be set at the same time.
You can delete a specific entry by key from a store, or clear all entries in a store.
val store = StoreBuilder
    .fromNonFlow<String, Int> { key: String ->
        api.fetchData(key)
    }
    .build()
The following will clear the entry associated with the key from the in-memory cache:
store.clear("10")
The following will clear all entries from the in-memory cache:
store.clearAll()
When a store has a persister (source of truth), you'll need to provide the delete and deleteAll functions for clear(key) and clearAll() to work:
StoreBuilder
    .fromNonFlow<String, Int> { key: String ->
        api.fetchData(key)
    }
    .persister(
        reader = dao::loadData,
        writer = dao::writeData,
        delete = dao::clearDataByKey,
        deleteAll = dao::clearAllData
    )
    .build()
The following will clear the entry associated with the key from both the in-memory cache and the persister (source of truth):
store.clear("10")
The following will clear all entries from both the in-memory cache and the persister (source of truth):
store.clearAll()