# @daiso-tech/core

> Mastering @daiso-tech/core: Comprehensive Guides for the Backend Server Component Library

## search

- [Search the documentation](/search.md)

## docs

- [Tags](/docs/tags.md)
- [One doc tagged with "Aws s3"](/docs/tags/aws-s-3.md)
- [4 docs tagged with "Cache"](/docs/tags/cache.md)
- [4 docs tagged with "Circuit-breaker"](/docs/tags/circuit-breaker.md)
- [One doc tagged with "Cloudflare r2"](/docs/tags/cloudflare-r-2.md)
- [8 docs tagged with "Configuring adapters"](/docs/tags/configuring-adapters.md)
- [8 docs tagged with "Creating adapters"](/docs/tags/creating-adapters.md)
- [7 docs tagged with "Creating database adapters"](/docs/tags/creating-database-adapters.md)
- [One doc tagged with "Digital ocean spaces"](/docs/tags/digital-ocean-spaces.md)
- [4 docs tagged with "Event-bus"](/docs/tags/event-bus.md)
- [One doc tagged with "Execution Context"](/docs/tags/execution-context.md)
- [4 docs tagged with "FileStorage"](/docs/tags/file-storage.md)
- [One doc tagged with "File system"](/docs/tags/file-system.md)
- [8 docs tagged with "In-memory"](/docs/tags/in-memory.md)
- [6 docs tagged with "Kysely"](/docs/tags/kysely.md)
- [6 docs tagged with "Libsql"](/docs/tags/libsql.md)
- [4 docs tagged with "Lock"](/docs/tags/lock.md)
- [One doc tagged with "Middleware"](/docs/tags/middleware.md)
- [One doc tagged with "Minio"](/docs/tags/minio.md)
- [6 docs tagged with "Mongodb"](/docs/tags/mongodb.md)
- [6 docs tagged with "Mysql"](/docs/tags/mysql.md)
- [8 docs tagged with "Namespace"](/docs/tags/namespace.md)
- [8 docs tagged with "NoOp"](/docs/tags/no-op.md)
- [6 docs tagged with "Postgres"](/docs/tags/postgres.md)
- [4 docs tagged with "Rate-limiter"](/docs/tags/rate-limiter.md)
- [7 docs tagged with "Redis"](/docs/tags/redis.md)
- [One doc tagged with "Resolver"](/docs/tags/resolver.md)
- [7 docs tagged with "Resolvers"](/docs/tags/resolvers.md)
- [4 docs tagged with "Semaphore"](/docs/tags/semaphore.md)
- [4 docs tagged with "Shared-lock"](/docs/tags/shared-lock.md)
- [6 docs tagged with "Sqlite"](/docs/tags/sqlite.md)
- [One doc tagged with "Supabase Storage"](/docs/tags/supabase-storage.md)
- [One doc tagged with "Tigris"](/docs/tags/tigris.md)
- [9 docs tagged with "Usage"](/docs/tags/usage.md)
- [9 docs tagged with "Utilities"](/docs/tags/utilities.md)
- [Backoff policies](/docs/components/backoff_policies.md): The @daiso-tech/core/backoff-policies component
- [CacheResolver](/docs/components/cache/cache_resolver.md): The CacheResolver class provides a flexible way to configure and switch between different cache adapters at runtime.
- [Cache usage](/docs/components/cache/cache_usage.md): The @daiso-tech/core/cache component provides a way to store key-value pairs with expiration, independent of the underlying data storage.
- [Configuring Cache adapters](/docs/components/cache/configuring_cache_adapters.md): MemoryCacheAdapter
- [Creating cache adapters](/docs/components/cache/creating_cache_adapters.md): Implementing your custom ICacheAdapter
- [Circuit-breaker provider resolver classes](/docs/components/circuit_breaker/circuit_breaker_factory_resolver.md): pro
- [Circuit-breaker usage](/docs/components/circuit_breaker/circuit_breaker_usage.md): The @daiso-tech/core/circuit-breaker component provides a way to manage circuit-breakers independent of the underlying platform or storage.
- [Configuring circuit-breaker adapters](/docs/components/circuit_breaker/configuring_circuit_breaker_adapters.md): RedisCircuitBreakerAdapter
- [Configuring circuit-breaker policies](/docs/components/circuit_breaker/configuring_circuit_breaker_policies.md): ConsecutiveBreaker
- [Creating circuit-breaker adapters](/docs/components/circuit_breaker/creating_circuit_breaker_adapters.md): Implementing your custom ICircuitBreakerAdapter
- [Creating policies](/docs/components/circuit_breaker/creating_circuit_breaker_policies.md): Implementing your custom ICircuitBreakerPolicy
- [Codec](/docs/components/codec.md): The @daiso-tech/core/codec component provides a seamless way to encode/decode data.
- [Collection](/docs/components/collection.md): The @daiso-tech/core/collection component provides a fluent, convenient wrapper for working with Array, Iterable and AsyncIterable.
- [configuring_event_bus_adapters](/docs/components/event_bus/configuring_event_bus_adapters.md): Configuring EventBus adapters
- [Creating EventBus adapters](/docs/components/event_bus/creating_event_bus_adapters.md): Implementing your custom IEventBusAdapter
- [EventBusResolver](/docs/components/event_bus/event_bus_resolver.md): The EventBusResolver class provides a flexible way to configure and switch between different event bus adapters at runtime.
- [EventBus usage](/docs/components/event_bus/event_bus_usage.md): The @daiso-tech/core/event-bus component provides a way to dispatch and listen to events independent of the underlying technology.
- [Execution Context](/docs/components/execution-context.md): The @daiso-tech/core/execution-context module provides a type-safe, composable, and environment-agnostic way to store and propagate contextual data (such as request IDs, user info, or tracing metadata) across async boundaries and function calls. It is inspired by thread-local storage and context propagation in distributed systems, but is designed for modern TypeScript/JavaScript applications.
- [FileSize](/docs/components/file_size.md): The @daiso-tech/core/file-size component provides an easy way to define, manipulate, and compare file sizes. Furthermore, it is designed for easy integration with external file size libraries.
- [Configuring file-storage adapters](/docs/components/file_storage/configuring_file_storage_adapters.md): MemoryFileStorageAdapter
- [Creating file-storage adapters](/docs/components/file_storage/creating_file_storage_adapters.md): Implementing your custom IFileStorageAdapter
- [FileStorageResolver](/docs/components/file_storage/file_storage_resolver.md): The FileStorageResolver class provides a flexible way to configure and switch between different file-storage adapters at runtime.
- [FileStorage usage](/docs/components/file_storage/file_storage_usage.md): The @daiso-tech/core/file-storage component provides a way to manage files independent of the underlying platform or storage.
- [Configuring lock adapters](/docs/components/lock/configuring_lock_adapters.md): MemoryLockAdapter
- [Creating lock adapters](/docs/components/lock/creating_lock_adapters.md): Implementing your custom ILockAdapter
- [LockFactoryResolver](/docs/components/lock/lock_factory_resolver.md): The LockFactoryResolver class provides a flexible way to configure and switch between different lock adapters at runtime.
- [Lock usage](/docs/components/lock/lock_usage.md): The @daiso-tech/core/lock component provides a way to manage locks independent of the underlying platform or storage.
- [Middleware](/docs/components/middleware.md): The @daiso-tech/core/middleware module provides a flexible middleware system for intercepting and composing function calls. It enables you to wrap functions with pre-processing and post-processing logic, similar to middleware patterns found in web frameworks like Express.js.
- [Namespace](/docs/components/namespace.md): The @daiso-tech/core/namespace component provides a seamless way to group data by prefixing and suffixing keys.
- [Configuring rate-limiter adapters](/docs/components/rate-limiter/configuring_rate_limiter_adapters.md): RedisRateLimiterAdapter
- [Configuring rate-limiter policies](/docs/components/rate-limiter/configuring_rate_limiter_policies.md): SlidingWindowLimiter
- [Creating rate-limiter adapters](/docs/components/rate-limiter/creating_rate_limiter_adapters.md): Implementing your custom IRateLimiterAdapter
- [Creating policies](/docs/components/rate-limiter/creating_rate_limiter_policies.md): Implementing your custom IRateLimiterPolicy
- [Rate-limiter resolver factory classes](/docs/components/rate-limiter/rate_limiter_factory_resolver.md): RateLimiterFactoryResolver
- [Rate-limiter usage](/docs/components/rate-limiter/rate_limiter_usage.md): The @daiso-tech/core/rate-limiter component provides a way to manage rate-limiters independent of the underlying platform or storage.
- [Resilience](/docs/components/resilience.md): The @daiso-tech/core/resilience component provides predefined fault tolerant middlewares.
- [Configuring semaphore adapters](/docs/components/semaphore/configuring_semaphore_adapters.md): MemorySemaphoreAdapter
- [Creating semaphore adapters](/docs/components/semaphore/creating_semaphore_adapters.md): Implementing your custom ISemaphoreAdapter
- [SemaphoreFactoryResolver](/docs/components/semaphore/semaphore_factory_resolver.md): The SemaphoreFactoryResolver class provides a flexible way to configure and switch between different semaphore adapters at runtime.
- [Semaphore usage](/docs/components/semaphore/semaphore_usage.md): The @daiso-tech/core/semaphore component provides a way to manage semaphores independent of the underlying platform or storage.
- [Serde](/docs/components/serde.md): The @daiso-tech/core/serde component provides a seamless way to serialize/deserialize data and to add custom serialization/deserialization logic for custom data types.
- [Configuring shared-lock adapters](/docs/components/shared_lock/configuring_shared_lock_adapters.md): MemorySharedLockAdapter
- [Creating shared-lock adapters](/docs/components/shared_lock/creating_shared_lock_adapters.md): Implementing your custom ISharedLockAdapter
- [SharedLockFactoryResolver](/docs/components/shared_lock/shared_lock_factory_resolver.md): The SharedLockFactoryResolver class provides a flexible way to configure and switch between different shared-lock adapters at runtime.
- [Shared-lock usage](/docs/components/shared_lock/shared_lock_usage.md): The @daiso-tech/core/shared-lock component provides a way to manage shared-locks (a.k.a. reader-writer locks) independent of the underlying platform or storage.
- [TimeSpan](/docs/components/time_span.md): The @daiso-tech/core/time-span component provides an easy way to define, manipulate, and compare durations. Furthermore, it is designed for easy integration with external time libraries like Luxon and Dayjs.
- [Installation](/docs/installation.md): Prerequisites
- [ErrorPolicy type](/docs/utilities/error_policy_type.md): The ErrorPolicy type determines which errors should be handled, for example in resilience middlewares like retry or fallback.
- [Invokable](/docs/utilities/invokable.md): An Invokable represents a callable entity, which can be either:

---

# Full Documentation Content
---

# Backoff policies

The `@daiso-tech/core/backoff-policies` component provides predefined backoff policies.

## Predefined backoff policies

The library includes predefined backoff policies:

* `constantBackoff` - Constant backoff policy with jitter

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { constantBackoff } from "@daiso-tech/core/backoff-policies";

// The settings argument is optional and all its fields are optional
const backoff = constantBackoff({
    delay: TimeSpan.fromSeconds(1),
    jitter: 0.5, // You can pass null to disable jitter
});
```

* `exponentialBackoff` - Exponential backoff policy with jitter

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { exponentialBackoff } from "@daiso-tech/core/backoff-policies";

// The settings argument is optional and all its fields are optional
const backoff = exponentialBackoff({
    maxDelay: TimeSpan.fromSeconds(60),
    minDelay: TimeSpan.fromMilliseconds(500),
    multiplier: 2,
    jitter: 0.5, // You can pass null to disable jitter
});
```

* `linearBackoff` - Linear backoff policy with jitter

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { linearBackoff } from "@daiso-tech/core/backoff-policies";

// The settings argument is optional and all its fields are optional
const backoff = linearBackoff({
    maxDelay: TimeSpan.fromSeconds(60),
    minDelay: TimeSpan.fromMilliseconds(500),
    jitter: 0.5, // You can pass null to disable jitter
});
```

* `polynomialBackoff` - Polynomial backoff policy with jitter

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { polynomialBackoff } from "@daiso-tech/core/backoff-policies";

// The settings argument is optional and all its fields are optional
const backoff = polynomialBackoff({
    maxDelay: TimeSpan.fromSeconds(60),
    minDelay: TimeSpan.fromMilliseconds(500),
    degree: 2,
    jitter: 0.5, // You can pass null to disable jitter
});
```

## Further information

For further information refer to the [`@daiso-tech/core/backoff-policies`](https://daiso-tech.github.io/daiso-core/modules/BackoffPolicy.html) API docs.

---

# CacheResolver

The `CacheResolver` class provides a flexible way to configure and switch between different cache adapters at runtime.

## Initial configuration

To begin using the `CacheResolver`, you will need to register all required adapters during initialization.
```
import { CacheResolver } from "@daiso-tech/core/cache";
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { RedisCacheAdapter } from "@daiso-tech/core/cache/redis-cache-adapter";
import { Serde } from "@daiso-tech/core/serde";
import type { ISerde } from "@daiso-tech/core/serde/contracts";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const cacheResolver = new CacheResolver({
    adapters: {
        memory: new MemoryCacheAdapter(),
        redis: new RedisCacheAdapter({
            database: new Redis("YOUR_REDIS_CONNECTION"),
            serde,
        }),
    },
    // You can set an optional default adapter
    defaultAdapter: "memory",
});
```

## Usage

### 1. Using the default adapter

```
await cacheResolver.use().add("user/jose@gmail.com", {
    name: "Jose",
    age: 20,
});
```

danger

Note that if you don't set a default adapter, an error will be thrown.

### 2. Specifying an adapter explicitly

```
await cacheResolver.use("redis").add("user/jose@gmail.com", {
    name: "Jose",
    age: 20,
});
```

danger

Note that if you specify a non-existent adapter, an error will be thrown.

### 3. Overriding default settings

```
import { Namespace } from "@daiso-tech/core/namespace";
import { z } from "zod";

await cacheResolver
    .setNamespace(new Namespace("@my-namespace"))
    // You can override the cache value type by calling the setType or setSchema method again
    .setType()
    .setSchema(
        z.object({
            name: z.string(),
            age: z.number(),
        }),
    )
    .use("redis")
    .add("user/jose@gmail.com", {
        name: "Jose",
        age: 20,
    });
```

info

Note that the `CacheResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one.
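The immutability noted above is the classic immutable-builder pattern: every override method returns a fresh instance and leaves the original untouched. A minimal self-contained sketch of that pattern (a hypothetical `ResolverConfig`, not the library's actual implementation):

```
// Hypothetical immutable builder illustrating the pattern used by CacheResolver.
class ResolverConfig {
    constructor(
        private readonly namespace: string = "@default",
        private readonly adapterName: string = "memory",
    ) {}

    // Each override returns a new instance; `this` is never mutated.
    setNamespace(namespace: string): ResolverConfig {
        return new ResolverConfig(namespace, this.adapterName);
    }

    use(adapterName: string): ResolverConfig {
        return new ResolverConfig(this.namespace, adapterName);
    }

    describe(): string {
        return `${this.namespace}:${this.adapterName}`;
    }
}

const base = new ResolverConfig();
const overridden = base.setNamespace("@my-namespace").use("redis");

console.log(base.describe());       // "@default:memory" — unchanged
console.log(overridden.describe()); // "@my-namespace:redis"
```

Because the base instance never changes, many differently-configured resolvers can be derived safely from one shared base.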
## Further information

For further information refer to the [`@daiso-tech/core/cache`](https://daiso-tech.github.io/daiso-core/modules/Cache.html) API docs.

---

# Cache usage

The `@daiso-tech/core/cache` component provides a way to store key-value pairs with expiration, independent of the underlying data storage.

## Initial configuration

To begin using the `Cache` class, you'll need to create and configure an instance:

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";

const cache = new Cache({
    // You can provide a default TTL value.
    // If you set it to null, keys will be stored forever.
    defaultTtl: TimeSpan.fromSeconds(2),
    // You can choose the adapter to use
    adapter: new MemoryCacheAdapter(),
});
```

info

Here is a complete list of settings for the [`Cache`](https://daiso-tech.github.io/daiso-core/types/Cache.CacheSettingsBase.html) class.

## Cache basics

### Adding keys

You can add a key with an optional TTL to override the default:

```
await cache.add("a", "value", { ttl: TimeSpan.fromSeconds(1) });
```

The method returns true if the key did not already exist and was added.
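Conceptually, `add` is an insert-if-absent operation. The sketch below illustrates those semantics with a plain `Map` (an illustration only, not the adapter's real implementation):

```
// Conceptual add-if-absent with TTL, mirroring the semantics described above.
type Entry = { value: unknown; expiresAt: number | null };

const store = new Map<string, Entry>();

function add(key: string, value: unknown, ttlMs: number | null = null): boolean {
    const entry = store.get(key);
    const now = Date.now();
    // A live (non-expired) entry blocks the insert.
    if (entry && (entry.expiresAt === null || entry.expiresAt > now)) {
        return false; // key already exists → not added
    }
    // Missing or expired entries are (re)written.
    store.set(key, { value, expiresAt: ttlMs === null ? null : now + ttlMs });
    return true; // key was absent → added
}

console.log(add("a", "value", 1000)); // true — inserted
console.log(add("a", "other"));       // false — already present
```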
### Retrieving keys

You can retrieve a key:

```
await cache.get("a");
```

### Checking key existence

You can check if a key exists:

```
await cache.exists("a");
```

You can check if a key is missing:

```
await cache.missing("a");
```

### Updating keys

You can update a key; true will be returned if the key exists and was updated:

```
await cache.update("a", 2);
```

You can increment a key; true will be returned if the key exists and was updated. If the key is not a number, an error will be thrown:

```
await cache.increment("a", 2);
```

You can decrement a key; true will be returned if the key exists and was updated. If the key is not a number, an error will be thrown:

```
await cache.decrement("a", 1);
```

You can perform an upsert with `put`, which replaces the TTL when an existing key is updated. True will be returned if the key was updated, otherwise false is returned:

```
await cache.put("a", 2);
await cache.put("a", 4, { ttl: TimeSpan.fromSeconds(3) });
```

### Removing keys

You can remove a key; true will be returned if the key was found and removed:

```
await cache.remove("a");
```

You can remove multiple keys; true will be returned if at least one of the keys existed and was removed:

```
await cache.removeMany(["a", "b"]);
```

You can clear all the keys of the given namespace:

```
await cache.clear();
```

## Patterns

### Compile time type safety

You can enforce compile time type safety by setting the cache value type:

```
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";

type IUser = {
    name: string;
    email: string;
    age: number;
};

const cache = new Cache<IUser>({
    adapter: new MemoryCacheAdapter(),
});

// A typescript error will occur because the type is not matching.
await cache.add("a", "asd");
```

If you have multiple types you can use a discriminated union:

```
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";

type IUser = {
    type: "USER";
    name: string;
    email: string;
    age: number;
};

type IProduct = {
    type: "PRODUCT";
    name: string;
    price: number;
};

type CacheValue = IUser | IProduct;

const cache = new Cache<CacheValue>({
    adapter: new MemoryCacheAdapter(),
});

const cacheValue = await cache.get("user1");

// You need to check that the type is "USER" in order to access IUser fields.
if (cacheValue?.type === "USER") {
    console.log(cacheValue.name, cacheValue.age);
}

// You need to check that the type is "PRODUCT" in order to access IProduct fields.
if (cacheValue?.type === "PRODUCT") {
    console.log(cacheValue.name, cacheValue.price);
}
```

Alternatively you can use different `Cache` classes with different namespaces:

```
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";

const cacheAdapter = new MemoryCacheAdapter();

type IUser = {
    name: string;
    email: string;
    age: number;
};

const userCache = new Cache<IUser>({
    adapter: cacheAdapter,
});

type IProduct = {
    name: string;
    price: number;
};

const productCache = new Cache<IProduct>({
    adapter: cacheAdapter,
});
```

### Runtime type safety

You can enforce runtime and compile-time type safety by passing a [standard schema](https://standardschema.dev/) to the cache:

```
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";
import { z } from "zod";

const userSchema = z.object({
    name: z.string(),
    email: z.string(),
    age: z.number(),
});

// The type will be inferred
const cache = new Cache({
    adapter: new MemoryCacheAdapter(),
    schema: userSchema,
});

// A typescript and runtime error will occur because the type is not matching.
await cache.add("a", "asd");
```

### Additional methods

You can retrieve a key; if it does not exist, an error will be thrown:

```
await cache.getOrFail("ab");
```

You can retrieve a key; if it does not exist, a default value is returned:

```
await cache.getOr("ab", 1);
```

You can retrieve a key; if it does not exist, a default value is inserted and also returned:

```
await cache.getOrAdd("ab", 1);
```

info

You can provide synchronous or asynchronous [`Invokable<[], TValue | Promise<TValue>>`](/docs/utilities/invokable.md) as default values for both the `getOr` and `getOrAdd` methods.

You can retrieve a key and afterwards remove it; true will be returned if the value was found:

```
await cache.getAndRemove("ab");
```

You can add a key; if it already exists, an error will be thrown:

```
await cache.addOrFail("ab", 1);
```

You can update a key; if it does not exist, an error will be thrown:

```
await cache.updateOrFail("ab", 1);
```

You can increment a key; if it does not exist, an error will be thrown:

```
await cache.incrementOrFail("ab", 1);
```

You can decrement a key; if it does not exist, an error will be thrown:

```
await cache.decrementOrFail("ab", 1);
```

You can remove a key; if it does not exist, an error will be thrown:

```
await cache.removeOrFail("ab");
```

### Adding jitter to ttl

TTL jitter adds a small random offset to expiration times, which mitigates the [cache stampede](https://en.wikipedia.org/wiki/Cache_stampede) problem. When many cache keys expire at the same time, every client simultaneously misses the cache and floods the data source with requests. By spreading out expiration times, jitter ensures cache misses are staggered, distributing the load on your data source evenly over time.
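As a rough sketch, the effective TTL can be computed by applying a bounded random offset. The formula below is an assumption for illustration; the library's exact jitter math may differ:

```
// Illustrative jittered-TTL computation (assumed formula, not the library's code).
function jitteredTtl(ttlMs: number, jitter: number): number {
    // Shift the TTL by up to ±(jitter * ttlMs) so keys expire at staggered times.
    const offset = (Math.random() * 2 - 1) * jitter * ttlMs;
    return Math.round(ttlMs + offset);
}

// With ttl = 60_000 ms and jitter = 0.2, results fall in [48_000, 72_000].
console.log(jitteredTtl(60_000, 0.2));
```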
```
await cache.add("a", 1, {
    ttl: TimeSpan.fromMinutes(1),
    jitter: 0.2,
});
```

info

You can enable jitter in the following methods: `addOrFail`, `put` and `getOrAdd`.

### Cache locking

The `getOrAdd` method supports distributed locking via the `enableLocking` setting. When multiple clients simultaneously request a key that is missing, without locking they all compute the value and write it to the cache; this is known as a [cache stampede](https://en.wikipedia.org/wiki/Cache_stampede). Enabling locking ensures that only one client computes and stores the value while the others wait and then read the cached result.

To use locking, pass a `lockFactory` to the `Cache` constructor:

```
import { Cache } from "@daiso-tech/core/cache";
import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter";
import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter";
import Redis from "ioredis";

const cache = new Cache({
    adapter: new MemoryCacheAdapter(),
    lockFactory: new RedisLockAdapter(
        new Redis("YOUR_REDIS_CONNECTION_STRING"),
    ),
});
```

Then pass `enableLocking: true` to `getOrAdd`:

```
const value = await cache.getOrAdd(
    "user:1",
    async () => {
        // This expensive computation runs only once even under concurrent requests
        return await fetchUserFromDatabase(1);
    },
    { enableLocking: true },
);
```

info

You can pass `ILockFactoryBase`, `ILockAdapter`, and `IDatabaseLockAdapter` to the `lockFactory` setting. For further information about `LockFactory` refer to the [`@daiso-tech/core/lock`](/docs/components/lock/lock_usage.md) documentation.

warning

The `lockFactory` defaults to a `NoOpLockAdapter` implementation, so `enableLocking: true` has no effect unless you provide a real lock adapter.

### Namespacing

You can use the `Namespace` class to group related data without conflicts.
Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract.

info

For further information about namespacing refer to the [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation.

```
import { Namespace } from "@daiso-tech/core/namespace";
import { RedisCacheAdapter } from "@daiso-tech/core/cache/redis-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const database = new Redis("YOUR_REDIS_CONNECTION_STRING");
const serde = new Serde(new SuperJsonSerdeAdapter());

const cacheA = new Cache({
    namespace: new Namespace("@cache-a"),
    adapter: new RedisCacheAdapter({
        database,
        serde,
    }),
});

const cacheB = new Cache({
    namespace: new Namespace("@cache-b"),
    adapter: new RedisCacheAdapter({
        database,
        serde,
    }),
});

await cacheA.add("key", 1);

// cacheA logs 1
console.log(await cacheA.get("key"));

// cacheB logs null
console.log(await cacheB.get("key"));

await cacheB.add("key", "test");

// cacheB logs "test"
console.log(await cacheB.get("key"));

// cacheA still logs 1
console.log(await cacheA.get("key"));
```

### Cache events

You can listen to the different [cache events](https://daiso-tech.github.io/daiso-core/modules/Cache.html) that are triggered by the `Cache` instance. Refer to the [`@daiso-tech/core/event-bus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events.

Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract.
``` import { CACHE_EVENTS } from "@daiso-tech/core/cache/contracts"; // Will log whenever an item is added, updated and removed await cache.events.subscribe(CACHE_EVENTS.ADDED, (event) => { console.log(event); }); await cache.add("a", "b"); ``` warning If multiple cache adapters (e.g., `RedisCacheAdapter` and `MemoryCacheAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters. ``` import { RedisCacheAdapter } from "@daiso-tech/core/cache/redis-cache-adapter"; import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter"; import { Cache } from "@daiso-tech/core/cache"; import { Namespace } from "@daiso-tech/core/namespace"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({ client: new Redis("YOUR_REDIS_CONNECTION_STRING"), serde, }); const memoryCacheAdapter = new MemoryCacheAdapter(); const memoryCache = new Cache({ adapter: memoryCacheAdapter, // We assign distinct namespaces to MemoryCacheAdapter and RedisCacheAdapter to isolate their events. namespace: new Namespace(["memory", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); const redisCacheAdapter = new RedisCacheAdapter({ serde, database: new Redis("YOUR_REDIS_CONNECTION_STRING"), }); const redisCache = new Cache({ adapter: redisCacheAdapter, // We assign distinct namespaces to MemoryCacheAdapter and RedisCacheAdapter to isolate their events. 
namespace: new Namespace(["redis", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); ``` ### Separating reading cache, manipulating cache and listening[​](#separating-reading-cache-manipulating-cache-and-listening "Direct link to Separating reading cache, manipulating cache and listening") The library includes three additional contracts: * [`IReadableCache`](https://daiso-tech.github.io/daiso-core/types/Cache.IReadableCache.html) - Allows only reading the cache. * [`ICacheBase`](https://daiso-tech.github.io/daiso-core/types/Cache.ICacheBase.html) - Allows only reading and manipulating the cache. * [`ICacheListenable`](https://daiso-tech.github.io/daiso-core/types/Cache.ICacheListenable.html) - Allows only listening to cache events. This separation makes it easy to visually distinguish the contracts, making it immediately obvious that they serve different purposes. ``` import type { ICache, ICacheBase, IReadableCache, ICacheListenable, } from "@daiso-tech/core/cache/contracts"; import { CACHE_EVENTS } from "@daiso-tech/core/cache/contracts"; import { Cache } from "@daiso-tech/core/cache"; import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter"; import { MemoryEventBus } from "@daiso-tech/core/event-bus/memory-event-bus"; async function readingFunc(cache: IReadableCache): Promise<void> { // You cannot access the listener methods // You cannot access write methods like put, add and update // You will get a typescript error if you try console.log("reading only:", await cache.get("a")); } async function manipulatingFunc(cache: ICacheBase): Promise<void> { // You cannot access the listener methods // You will get a typescript error if you try await cache.add("a", 1); console.log("writing and reading:", await cache.get("a")); } async function listenerFunc(cacheListenable: ICacheListenable): Promise<void> { // You cannot access the cache methods // You will get a typescript error if you try await cacheListenable.addListener(CACHE_EVENTS.ADDED, (event) => { console.log("EVENT:", event); }); } 
const cache = new Cache({ adapter: new MemoryCacheAdapter(), eventBus: new MemoryEventBus(), }); await listenerFunc(cache.events); await manipulatingFunc(cache); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/cache`](https://daiso-tech.github.io/daiso-core/modules/Cache.html) API docs. --- # Configuring Cache adapters ## MemoryCacheAdapter[​](#memorycacheadapter "Direct link to MemoryCacheAdapter") To use the `MemoryCacheAdapter` you only need to create an instance of it: ``` import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter"; const memoryCacheAdapter = new MemoryCacheAdapter(); ``` You can also provide a `Map` that will be used for storing the data in memory: ``` import { MemoryCacheAdapter } from "@daiso-tech/core/cache/memory-cache-adapter"; const map = new Map(); const memoryCacheAdapter = new MemoryCacheAdapter(map); ``` info `MemoryCacheAdapter` lets you test your app without external dependencies like `Redis`, ideal for local development, unit tests, integration tests and fast E2E tests for the backend application. ## MongodbCacheAdapter[​](#mongodbcacheadapter "Direct link to MongodbCacheAdapter") To use the `MongodbCacheAdapter`, you'll need to: 1. Install the required dependency: [`mongodb`](https://www.npmjs.com/package/mongodb) package: 2. 
Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { MongodbCacheAdapter } from "@daiso-tech/core/cache/mongodb-cache-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { MongoClient } from "mongodb"; const client = await MongoClient.connect("YOUR_MONGODB_CONNECTION_STRING"); const database = client.db("database"); const serde = new Serde(new SuperJsonSerdeAdapter()); const mongodbCacheAdapter = new MongodbCacheAdapter({ database, serde, }); // You need to initialize the adapter once before using it. // During the initialization the indexes will be created await mongodbCacheAdapter.init(); ``` You can change the collection name: ``` const mongodbCacheAdapter = new MongodbCacheAdapter({ database, serde, // By default "cache" is used as collection name collectionName: "my-cache", }); await mongodbCacheAdapter.init(); ``` You can change the collection settings: ``` const mongodbCacheAdapter = new MongodbCacheAdapter({ database, serde, // You can configure additional collection settings collectionSettings: {}, }); await mongodbCacheAdapter.init(); ``` info To remove the cache collection and all stored cache data, use the `deInit` method: ``` await mongodbCacheAdapter.deInit(); ``` ## RedisCacheAdapter[​](#rediscacheadapter "Direct link to RedisCacheAdapter") To use the `RedisCacheAdapter`, you'll need to: 1. Install the required dependency: [`ioredis`](https://www.npmjs.com/package/ioredis) package: 2. 
Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { RedisCacheAdapter } from "@daiso-tech/core/cache/redis-cache-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const serde = new Serde(new SuperJsonSerdeAdapter()); const redisCacheAdapter = new RedisCacheAdapter({ database, serde, }); ``` ## KyselyCacheAdapter[​](#kyselycacheadapter "Direct link to KyselyCacheAdapter") To use the `KyselyCacheAdapter`, you'll need to: 1. Install the required dependency: [`kysely`](https://www.npmjs.com/package/kysely) package: 2. Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ### Usage with Sqlite[​](#usage-with-sqlite "Direct link to Usage with Sqlite") You will need to install [`better-sqlite3`](https://www.npmjs.com/package/better-sqlite3) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCacheAdapter } from "@daiso-tech/core/cache/kysely-cache-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Sqlite from "better-sqlite3"; import { Kysely, SqliteDialect } from "kysely"; const database = new Sqlite("DATABASE_NAME.db"); const kysely = new Kysely({ dialect: new SqliteDialect({ database, }), }); const serde = new Serde(new SuperJsonSerdeAdapter()); const kyselyCacheAdapter = new KyselyCacheAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. 
// During the initialization the schema will be created await kyselyCacheAdapter.init(); ``` ### Usage with Postgres[​](#usage-with-postgres "Direct link to Usage with Postgres") You will need to install [`pg`](https://www.npmjs.com/package/pg) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCacheAdapter } from "@daiso-tech/core/cache/kysely-cache-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { Pool } from "pg"; import { Kysely, PostgresDialect } from "kysely"; const database = new Pool({ database: "DATABASE_NAME", host: "DATABASE_HOST", user: "DATABASE_USER", // Database port port: 5432, password: "DATABASE_PASSWORD", max: 10, }); const kysely = new Kysely({ dialect: new PostgresDialect({ pool: database, }), }); const serde = new Serde(new SuperJsonSerdeAdapter()); const kyselyCacheAdapter = new KyselyCacheAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. 
// During the initialization the schema will be created await kyselyCacheAdapter.init(); ``` ### Usage with Mysql[​](#usage-with-mysql "Direct link to Usage with Mysql") You will need to install [`mysql2`](https://www.npmjs.com/package/mysql2) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCacheAdapter } from "@daiso-tech/core/cache/kysely-cache-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { createPool } from "mysql2"; import { Kysely, MysqlDialect } from "kysely"; const database = createPool({ host: "DATABASE_HOST", // Database port port: 3306, database: "DATABASE_NAME", user: "DATABASE_USER", password: "DATABASE_PASSWORD", connectionLimit: 10, }); const kysely = new Kysely({ dialect: new MysqlDialect({ pool: database, }), }); const serde = new Serde(new SuperJsonSerdeAdapter()); const kyselyCacheAdapter = new KyselyCacheAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyCacheAdapter.init(); ``` ### Usage with Libsql[​](#usage-with-libsql "Direct link to Usage with Libsql") You will need to install [`@libsql/kysely-libsql`](https://www.npmjs.com/package/@libsql/kysely-libsql) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCacheAdapter } from "@daiso-tech/core/cache/kysely-cache-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { LibsqlDialect } from "@libsql/kysely-libsql"; import { Kysely } from "kysely"; const kysely = new Kysely({ dialect: new LibsqlDialect({ url: "DATABASE_URL", }), }); const serde = new Serde(new SuperJsonSerdeAdapter()); const kyselyCacheAdapter = new KyselyCacheAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. 
// During the initialization the schema will be created await kyselyCacheAdapter.init(); ``` ### Usage with other databases[​](#usage-with-other-databases "Direct link to Usage with other databases") Note [`kysely`](https://www.npmjs.com/package/kysely) has support for multiple [databases](https://github.com/kysely-org/awesome-kysely?tab=readme-ov-file#dialects). danger Before choosing a database, ensure it supports transactions. Without transaction support, you won't be able to use the `put` and `increment` methods, as they require transactional functionality. ### Settings[​](#settings "Direct link to Settings") Expired keys are cleared at regular intervals, and you can change the interval time: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const kyselyCacheAdapter = new KyselyCacheAdapter({ kysely, serde, // By default, the interval is 1 minute expiredKeysRemovalInterval: TimeSpan.fromSeconds(10), }); await kyselyCacheAdapter.init(); ``` Disabling scheduled interval cleanup of expired keys: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const kyselyCacheAdapter = new KyselyCacheAdapter({ kysely, serde, shouldRemoveExpiredKeys: false, }); await kyselyCacheAdapter.init(); // You can remove all expired keys manually. await kyselyCacheAdapter.removeAllExpired(); ``` info To remove the cache table and all stored cache data, use the `deInit` method: ``` await kyselyCacheAdapter.deInit(); ``` ## NoOpCacheAdapter[​](#noopcacheadapter "Direct link to NoOpCacheAdapter") The `NoOpCacheAdapter` is a no-operation implementation; it performs no actions when called: ``` import { NoOpCacheAdapter } from "@daiso-tech/core/cache/no-op-cache-adapter"; const noOpCacheAdapter = new NoOpCacheAdapter(); ``` info The `NoOpCacheAdapter` is useful when you want to mock out or disable your [`Cache`](https://daiso-tech.github.io/daiso-core/classes/Cache.Cache.html) class instance. 
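This is the classic null-object pattern. A minimal, self-contained sketch of the behavior (illustrative only; the real `NoOpCacheAdapter` implements the library's `ICacheAdapter` contract, and its method names may differ):

```ts
// Null-object pattern: every operation "succeeds" without doing anything.
// Sketch only; method names here are hypothetical, not the library's API.
class NoOpSketch {
    // Retrieval always misses.
    get(_key: string): null {
        return null;
    }

    // Mutations report success but store nothing.
    add(_key: string, _value: unknown): boolean {
        return true;
    }

    remove(_key: string): boolean {
        return true;
    }
}

const noOp = new NoOpSketch();
console.log(noOp.get("a")); // null
console.log(noOp.add("a", 1)); // true
```

Because every call is a harmless no-op, the same application code runs unchanged whether caching is enabled or not.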
info Note `NoOpCacheAdapter` always returns null when retrieving items and returns true when adding, updating, and removing items. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/cache`](https://daiso-tech.github.io/daiso-core/modules/Cache.html) API docs. --- # Creating cache adapters ## Implementing your custom ICacheAdapter[​](#implementing-your-custom-icacheadapter "Direct link to Implementing your custom ICacheAdapter") In order to create an adapter you need to implement the [`ICacheAdapter`](https://daiso-tech.github.io/daiso-core/types/Cache.ICacheAdapter.html) contract. ## Testing your custom ICacheAdapter[​](#testing-your-custom-icacheadapter "Direct link to Testing your custom ICacheAdapter") We provide a complete test suite to test your cache adapter implementation. Simply use the [`cacheAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Cache.cacheAdapterTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` // filename: MyCacheAdapter.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { cacheAdapterTestSuite } from "@daiso-tech/core/cache/test-utilities"; import { MyCacheAdapter } from "./MyCacheAdapter.js"; describe("class: MyCacheAdapter", () => { cacheAdapterTestSuite({ createAdapter: () => new MyCacheAdapter(), test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom IDatabaseCacheAdapter[​](#implementing-your-custom-idatabasecacheadapter "Direct link to Implementing your custom IDatabaseCacheAdapter") We provide an additional contract [`IDatabaseCacheAdapter`](https://daiso-tech.github.io/daiso-core/types/Cache.IDatabaseCacheAdapter.html) for building custom cache adapters tailored to databases. 
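To make the idea concrete, here is a minimal, self-contained sketch of the kind of storage logic a cache adapter wraps: a key-value store with per-entry expiration. All names below are hypothetical; the actual `ICacheAdapter` and `IDatabaseCacheAdapter` method signatures are defined in the linked API docs.

```ts
// Illustrative only: a map-backed adapter sketch with TTL semantics.
// The real contracts define their own method names and signatures.
type Entry = { value: unknown; expiresAt: number | null };

class SketchMemoryAdapter {
    private readonly map = new Map<string, Entry>();

    // Returns the value, or null when the key is missing or expired.
    get(key: string): unknown {
        const entry = this.map.get(key);
        if (entry === undefined) return null;
        if (entry.expiresAt !== null && entry.expiresAt <= Date.now()) {
            // Lazily evict the expired entry.
            this.map.delete(key);
            return null;
        }
        return entry.value;
    }

    // Stores a value with an optional time-to-live in milliseconds.
    set(key: string, value: unknown, ttlMs: number | null): void {
        this.map.set(key, {
            value,
            expiresAt: ttlMs === null ? null : Date.now() + ttlMs,
        });
    }

    remove(key: string): boolean {
        return this.map.delete(key);
    }
}
```

Whatever backing store you target (a database table, Redis, a plain `Map`), the core of the adapter is this get/set-with-TTL/remove loop; the test suites below then verify the edge cases for you.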
## Testing your custom IDatabaseCacheAdapter[​](#testing-your-custom-idatabasecacheadapter "Direct link to Testing your custom IDatabaseCacheAdapter") We provide a complete test suite to test your database cache adapter implementation. Simply use the [`databaseCacheAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Cache.databaseCacheAdapterTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` import { beforeEach, describe, expect, test } from "vitest"; import { databaseCacheAdapterTestSuite } from "@daiso-tech/core/cache/test-utilities"; import { MyDatabaseCacheAdapter } from "./MyDatabaseCacheAdapter.js"; describe("class: MyDatabaseCacheAdapter", () => { databaseCacheAdapterTestSuite({ createAdapter: async () => { return new MyDatabaseCacheAdapter(); }, test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom ICache class[​](#implementing-your-custom-icache-class "Direct link to Implementing your custom ICache class") In some cases, you may need to implement a custom [`Cache`](https://daiso-tech.github.io/daiso-core/classes/Cache.Cache.html) class to optimize performance for your specific technology stack. You can then directly implement the [`ICache`](https://daiso-tech.github.io/daiso-core/types/Cache.ICache.html) contract. ## Testing your custom ICache class[​](#testing-your-custom-icache-class "Direct link to Testing your custom ICache class") We provide a complete test suite to verify your custom cache class implementation. 
Simply use the [`cacheTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Cache.cacheTestSuite.html) function: * Preconfigured Vitest test cases * Standardized cache behavior validation * Common edge case coverage Usage example: ``` // filename: MyCache.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { cacheTestSuite } from "@daiso-tech/core/cache/test-utilities"; import { MyCache } from "./MyCache.js"; describe("class: MyCache", () => { cacheTestSuite({ createCache: () => new MyCache(), test, beforeEach, expect, describe, }); }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/cache`](https://daiso-tech.github.io/daiso-core/modules/Cache.html) API docs. --- # Circuit-breaker provider resolver classes ## CircuitBreakerFactoryResolver[​](#pro "Direct link to CircuitBreakerFactoryResolver") The `CircuitBreakerFactoryResolver` class provides a flexible way to configure and switch between different circuit-breaker adapters at runtime. ### Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `CircuitBreakerFactoryResolver`, you will need to register all required adapters during initialization. 
``` import { CircuitBreakerFactoryResolver } from "@daiso-tech/core/circuit-breaker"; import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { RedisCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/redis-circuit-breaker-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const circuitBreakerFactoryResolver = new CircuitBreakerFactoryResolver({ serde, adapters: { memory: new DatabaseCircuitBreakerAdapter({ adapter: new MemoryCircuitBreakerStorageAdapter(), }), redis: new RedisCircuitBreakerAdapter({ database: new Redis("YOUR_REDIS_CONNECTION"), }), }, defaultAdapter: "memory", }); ``` ### Usage[​](#usage "Direct link to Usage") #### 1. Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1. Using the default adapter") ``` // Will apply circuit-breaker logic using the default adapter which is MemoryCircuitBreakerStorageAdapter await circuitBreakerFactoryResolver .use() .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` danger Note that if you don't set a default adapter, an error will be thrown. #### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2. Specifying an adapter explicitly") ``` // Will apply circuit-breaker logic using the redis adapter await circuitBreakerFactoryResolver .use("redis") .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` danger Note that if you specify a non-existent adapter, an error will be thrown. #### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3. 
Overriding default settings") ``` import { Namespace } from "@daiso-tech/core/namespace"; await circuitBreakerFactoryResolver .use("redis") .setNamespace(new Namespace(["@", "test"])) .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` info Note that the `CircuitBreakerFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one. ## DatabaseCircuitBreakerFactoryResolver[​](#databasecircuitbreakerfactoryresolver "Direct link to DatabaseCircuitBreakerFactoryResolver") The `DatabaseCircuitBreakerFactoryResolver` class provides a flexible way to configure and switch between different circuit-breaker-storage adapters at runtime. ### Initial configuration[​](#initial-configuration-1 "Direct link to Initial configuration") To begin using the `DatabaseCircuitBreakerFactoryResolver`, you will need to register all required adapters during initialization. ``` import { DatabaseCircuitBreakerFactoryResolver } from "@daiso-tech/core/circuit-breaker"; import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; import { KyselyCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/kysely-circuit-breaker-storage-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Sqlite from "better-sqlite3"; import { Kysely, SqliteDialect } from "kysely"; const serde = new Serde(new SuperJsonSerdeAdapter()); const circuitBreakerFactoryResolver = new DatabaseCircuitBreakerFactoryResolver( { serde, adapters: { memory: new MemoryCircuitBreakerStorageAdapter(), sqlite: new KyselyCircuitBreakerStorageAdapter({ kysely: new Kysely({ dialect: new SqliteDialect({ database: new Sqlite("local.db"), }), }), serde, }), }, defaultAdapter: "memory", }, ); // Will apply 
circuit-breaker logic using the default adapter which is MemoryCircuitBreakerStorageAdapter await circuitBreakerFactoryResolver .use() .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); // Will apply circuit-breaker logic using the KyselyCircuitBreakerStorageAdapter await circuitBreakerFactoryResolver .use("sqlite") .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` ### Usage[​](#usage-1 "Direct link to Usage") #### 1. Using the default adapter[​](#1-using-the-default-adapter-1 "Direct link to 1. Using the default adapter") ``` // Will apply circuit-breaker logic using the default adapter which is MemoryCircuitBreakerStorageAdapter await circuitBreakerFactoryResolver .use() .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` danger Note that if you don't set a default adapter, an error will be thrown. #### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly-1 "Direct link to 2. Specifying an adapter explicitly") ``` // Will apply circuit-breaker logic using the sqlite adapter await circuitBreakerFactoryResolver .use("sqlite") .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` danger Note that if you specify a non-existent adapter, an error will be thrown. #### 3. Overriding default settings[​](#3-overriding-default-settings-1 "Direct link to 3. Overriding default settings") ``` import { CountBreaker } from "@daiso-tech/core/circuit-breaker/policies"; import { constantBackoff } from "@daiso-tech/core/backoff-policies"; await circuitBreakerFactoryResolver .setBackoffPolicy(constantBackoff()) .setDefaultCircuitBreakerPolicy(new CountBreaker()) .use("sqlite") .create("a") .runOrFail(async () => { // ... code to apply circuit-breaker logic }); ``` info Note that the `DatabaseCircuitBreakerFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one. 
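The immutability described in the note above is the standard copy-on-write pattern: each setter clones the configuration and returns a fresh instance instead of mutating `this`. A generic, self-contained sketch of the pattern (not the library's actual class):

```ts
// Copy-on-write configuration: every override returns a new instance,
// so a shared resolver can never be mutated by one call site.
class ImmutableConfig {
    constructor(
        private readonly settings: Readonly<Record<string, unknown>>,
    ) {}

    // Returns a NEW instance with the overridden setting applied.
    with(key: string, value: unknown): ImmutableConfig {
        return new ImmutableConfig({ ...this.settings, [key]: value });
    }

    get(key: string): unknown {
        return this.settings[key];
    }
}

const base = new ImmutableConfig({ adapter: "memory" });
const overridden = base.with("adapter", "sqlite");

// The original instance is untouched by the override.
console.log(base.get("adapter")); // "memory"
console.log(overridden.get("adapter")); // "sqlite"
```

This is why you can safely share one resolver across the application and apply per-call-site overrides without side effects.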
## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/circuit-breaker`](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) API docs. --- # Circuit-breaker usage The `@daiso-tech/core/circuit-breaker` component provides a way for managing circuit-breakers independent of the underlying platform or storage. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `CircuitBreakerFactory` class, you'll need to create and configure an instance: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { CircuitBreakerFactory } from "@daiso-tech/core/circuit-breaker"; const circuitBreakerFactory = new CircuitBreakerFactory({ // You can provide default settings // You can choose the adapter to use adapter: new DatabaseCircuitBreakerAdapter({ adapter: new MemoryCircuitBreakerStorageAdapter(), }), }); ``` info Here is a complete list of settings for the [`CircuitBreakerFactory`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.CircuitBreakerFactorySettingsBase.html) class. ## Circuit-breaker basics[​](#circuit-breaker-basics "Direct link to Circuit-breaker basics") ### Creating a circuit-breaker[​](#creating-a-circuit-breaker "Direct link to Creating a circuit-breaker") ``` const circuitBreaker = circuitBreakerFactory.create("resource"); ``` ### Using the circuit-breaker[​](#using-the-circuit-breaker "Direct link to Using the circuit-breaker") ``` // The function will only be called when the circuit-breaker is in the closed or half-open state. 
await circuitBreaker.runOrFail(async () => { // Call the external service }); ``` info Note the method throws an error when the circuit-breaker is in the open or isolated state. info You can provide synchronous or asynchronous [`Invokable<[], TValue | Promise<TValue>>`](/docs/utilities/invokable.md) as values for the `runOrFail` method. ### Applying circuit-breaker on certain errors[​](#applying-circuit-breaker-on-certiain-errors "Direct link to Applying circuit-breaker on certain errors") ``` class ErrorA extends Error {} const circuitBreaker = circuitBreakerFactory.create("resource", { errorPolicy: ErrorA, }); await circuitBreaker.runOrFail(async () => { // Call the external service }); ``` ### Setting circuit-breaker triggers[​](#setting-circuit-breaker-triggers "Direct link to Setting circuit-breaker triggers") By default, the circuit-breaker will treat both errors and slow calls as failures. You can explicitly set this option. The `CIRCUIT_BREAKER_TRIGGER.BOTH` will treat errors and slow calls as failures. ``` import { CIRCUIT_BREAKER_TRIGGER } from "@daiso-tech/core/circuit-breaker/contracts"; const circuitBreaker = circuitBreakerFactory.create("resource", { trigger: CIRCUIT_BREAKER_TRIGGER.BOTH, }); await circuitBreaker.runOrFail(async () => { // Call the external service }); ``` The `CIRCUIT_BREAKER_TRIGGER.ONLY_ERROR` will treat only errors as failures. ``` import { CIRCUIT_BREAKER_TRIGGER } from "@daiso-tech/core/circuit-breaker/contracts"; const circuitBreaker = circuitBreakerFactory.create("resource", { trigger: CIRCUIT_BREAKER_TRIGGER.ONLY_ERROR, }); await circuitBreaker.runOrFail(async () => { // Call the external service }); ``` The `CIRCUIT_BREAKER_TRIGGER.ONLY_SLOW_CALL` will treat only slow calls as failures. 
``` import { CIRCUIT_BREAKER_TRIGGER } from "@daiso-tech/core/circuit-breaker/contracts"; const circuitBreaker = circuitBreakerFactory.create("resource", { trigger: CIRCUIT_BREAKER_TRIGGER.ONLY_SLOW_CALL, }); await circuitBreaker.runOrFail(async () => { // Call the external service }); ``` ### Setting the slow call threshold[​](#setting-the-slow-call-threshold "Direct link to Setting the slow call threshold") You can set a custom slow call threshold that will be used when treating slow calls as failures. ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const circuitBreaker = circuitBreakerFactory.create("resource", { trigger: TimeSpan.fromSeconds(1), }); await circuitBreaker.runOrFail(async () => { // Call the external service }); ``` ### Resetting the circuit-breaker[​](#reseting-the-circuit-breaker "Direct link to Resetting the circuit-breaker") You can manually reset the circuit-breaker to the closed state. ``` await circuitBreaker.reset(); ``` ### Isolating the circuit-breaker[​](#isolating-the-circuit-breaker "Direct link to Isolating the circuit-breaker") You can manually hold the circuit-breaker in the open state until it is reset. ``` await circuitBreaker.isolate(); ``` ### Checking circuit-breaker state[​](#checking-circuit-breaker-state "Direct link to Checking circuit-breaker state") You can get the circuit-breaker state by using the `getState` method; it returns [`CircuitBreakerState`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.CircuitBreakerState.html). 
``` import { CIRCUIT_BREAKER_STATE } from "@daiso-tech/core/circuit-breaker/contracts"; const state = await circuitBreaker.getState(); if (state === CIRCUIT_BREAKER_STATE.CLOSED) { console.log("The service is up and running without problems"); } if (state === CIRCUIT_BREAKER_STATE.OPEN) { console.log("The service is down or degraded and you need to wait"); } if (state === CIRCUIT_BREAKER_STATE.HALF_OPEN) { console.log( "Probing to check if the service is up and running or down / degraded", ); } if (state === CIRCUIT_BREAKER_STATE.ISOLATED) { console.log("The service is held in the open state manually until reset"); } ``` ### Circuit-breaker instance variables[​](#circuit-breaker-instance-variables "Direct link to Circuit-breaker instance variables") The `CircuitBreaker` class exposes instance variables such as: ``` const circuitBreaker = circuitBreakerFactory.create("resource"); // Will return the key of the circuit-breaker which is "resource" console.log(circuitBreaker.key.toString()); ``` info The `key` field is an object that implements [`IKey`](/docs/components/namespace.md) contract. ## Patterns[​](#patterns "Direct link to Patterns") ### Namespacing[​](#namespacing "Direct link to Namespacing") You can use the `Namespace` class to group related circuit-breakers without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract. info For further information about namespacing refer to [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation. 
``` import { Namespace } from "@daiso-tech/core/namespace"; import { RedisCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/redis-circuit-breaker-adapter"; import { CircuitBreakerFactory } from "@daiso-tech/core/circuit-breaker"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const circuitBreakerFactoryA = new CircuitBreakerFactory({ namespace: new Namespace("@circuit-breaker-a"), adapter: new RedisCircuitBreakerAdapter({ database }), }); const circuitBreakerFactoryB = new CircuitBreakerFactory({ namespace: new Namespace("@circuit-breaker-b"), adapter: new RedisCircuitBreakerAdapter({ database }), }); const circuitBreakerA = circuitBreakerFactoryA.create("key", { ttl: null, }); const circuitBreakerB = circuitBreakerFactoryB.create("key", { ttl: null, }); await circuitBreakerA.isolate(); // Will log ISOLATED console.log(await circuitBreakerA.getState()); // Will log CLOSED console.log(await circuitBreakerB.getState()); ``` ### Serialization and deserialization of circuit-breakers[​](#serialization-and-deserialization-of-circuit-breakers "Direct link to Serialization and deserialization of circuit-breakers") Circuit-breakers can be serialized, allowing them to be transmitted over the network to another server and later deserialized for reuse. This means you can, for example, create the circuit-breaker on the main server, transfer it to a queue worker server, and use it there. In order to serialize or deserialize a circuit-breaker you need to pass an object that implements the [`ISerderRegister`](/docs/components/serde.md) contract like the [`Serde`](/docs/components/serde.md) class to `CircuitBreakerFactory`. 
Manually serializing and deserializing the circuit-breaker: ``` import { RedisCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/redis-circuit-breaker-adapter"; import { CircuitBreakerFactory } from "@daiso-tech/core/circuit-breaker"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisClient = new Redis("YOUR_REDIS_CONNECTION"); const circuitBreakerFactory = new CircuitBreakerFactory({ // You can also pass in an array of Serde class instances serde, adapter: new RedisCircuitBreakerAdapter({ database: redisClient }), }); const circuitBreaker = circuitBreakerFactory.create("resource"); const serializedCircuitBreaker = serde.serialize(circuitBreaker); const deserializedCircuitBreaker = serde.deserialize(serializedCircuitBreaker); ``` danger When serializing or deserializing a circuit-breaker, you must use the same `Serde` instances that were provided to the `CircuitBreakerFactory`. This is required because the `CircuitBreakerFactory` injects custom serialization logic for `ICircuitBreaker` instances into the `Serde` instances. info Note you only need manual serialization and deserialization when integrating with external libraries. As long as you pass the same `Serde` instances to all other components, you don't need to serialize and deserialize the circuit-breaker manually. 
``` import Redis from "ioredis"; import { RedisCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/redis-circuit-breaker-adapter"; import type { ICircuitBreaker } from "@daiso-tech/core/circuit-breaker/contracts"; import { CircuitBreakerFactory } from "@daiso-tech/core/circuit-breaker"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redis = new Redis("YOUR_REDIS_CONNECTION"); type EventMap = { "sending-circuit-breaker-over-network": { circuitBreaker: ICircuitBreaker; }; }; const eventBus = new EventBus<EventMap>({ adapter: new RedisPubSubEventBusAdapter({ client: redis, serde, }), }); const circuitBreakerFactory = new CircuitBreakerFactory({ serde, adapter: new RedisCircuitBreakerAdapter({ database: redis }), eventBus, }); const circuitBreaker = circuitBreakerFactory.create("resource"); // We are sending the circuitBreaker over the network to other servers. await eventBus.dispatch("sending-circuit-breaker-over-network", { circuitBreaker, }); // The other servers will receive the serialized circuitBreaker and automatically deserialize it. await eventBus.addListener( "sending-circuit-breaker-over-network", ({ circuitBreaker }) => { // The circuitBreaker is deserialized and can be used console.log("CIRCUIT_BREAKER:", circuitBreaker); }, ); ``` ### Circuit-breaker events[​](#circuit-breaker-events "Direct link to Circuit-breaker events") You can listen to different [circuit-breaker events](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) that are triggered by the `CircuitBreaker` instance. Refer to the [`EventBus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events.
Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract. ``` import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { CircuitBreakerFactory, CIRCUIT_BREAKER_EVENTS, } from "@daiso-tech/core/circuit-breaker"; import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; const circuitBreakerFactory = new CircuitBreakerFactory({ adapter: new DatabaseCircuitBreakerAdapter({ adapter: new MemoryCircuitBreakerStorageAdapter(), }), eventBus: new MemoryEventBusAdapter(), }); await circuitBreakerFactory.events.addListener( CIRCUIT_BREAKER_EVENTS.STATE_TRANSITIONED, (event) => { console.log( `State transition occurred: from ${event.from} to ${event.to}`, ); }, ); await circuitBreakerFactory.create("a").isolate(); ``` warning If multiple circuit-breaker adapters (e.g., `RedisCircuitBreakerAdapter` and `DatabaseCircuitBreakerAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters.
``` import { RedisCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/redis-circuit-breaker-adapter"; import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { CircuitBreakerFactory } from "@daiso-tech/core/circuit-breaker"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; import { Namespace } from "@daiso-tech/core/namespace"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({ client: new Redis("YOUR_REDIS_CONNECTION_STRING"), serde, }); const memoryCircuitBreakerFactory = new CircuitBreakerFactory({ adapter: new DatabaseCircuitBreakerAdapter({ adapter: new MemoryCircuitBreakerStorageAdapter(), }), // We assign distinct namespaces to DatabaseCircuitBreakerAdapter and RedisCircuitBreakerAdapter to isolate their events. namespace: new Namespace(["memory", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); const redisCircuitBreakerAdapter = new RedisCircuitBreakerAdapter({ serde, database: new Redis("YOUR_REDIS_CONNECTION_STRING"), }); const redisCircuitBreakerFactory = new CircuitBreakerFactory({ adapter: redisCircuitBreakerAdapter, // We assign distinct namespaces to DatabaseCircuitBreakerAdapter and RedisCircuitBreakerAdapter to isolate their events.
namespace: new Namespace(["redis", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); ``` ### Separating creating, listening to and using circuit-breakers[​](#separating-creating-listening-to-and-using-circuit-breakers "Direct link to Separating creating, listening to and using circuit-breakers") The library includes 3 additional contracts: * [`ICircuitBreaker`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreaker.html) - Allows only for manipulating the circuit-breaker. * [`ICircuitBreakerFactoryBase`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerFactoryBase.html) - Allows only for creation of circuit-breakers. * [`ICircuitBreakerListenable`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerListenable.html) - Allows only for listening to circuit-breaker events. This separation makes it easy to visually distinguish the 3 contracts, making it immediately obvious that they serve different purposes. ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { CircuitBreakerFactory } from "@daiso-tech/core/circuit-breaker"; import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { type ICircuitBreaker, type ICircuitBreakerFactoryBase, type ICircuitBreakerListenable, CIRCUIT_BREAKER_EVENTS, } from "@daiso-tech/core/circuit-breaker/contracts"; async function circuitBreakerFunc( circuitBreaker: ICircuitBreaker, ): Promise<void> { await circuitBreaker.runOrFail(async () => { // ...
// see the cascading failures section }); } async function circuitBreakerFactoryFunc( circuitBreakerFactory: ICircuitBreakerFactoryBase, ): Promise<void> { // You cannot access the listener methods // You will get a TypeScript error if you try const circuitBreaker = circuitBreakerFactory.create("resource"); await circuitBreakerFunc(circuitBreaker); } async function circuitBreakerListenableFunc( circuitBreakerListenable: ICircuitBreakerListenable, ): Promise<void> { // You cannot access the circuitBreakerFactory methods // You will get a TypeScript error if you try await circuitBreakerListenable.addListener( CIRCUIT_BREAKER_EVENTS.STATE_TRANSITIONED, (event) => { console.log( `State transition occurred: from ${event.from} to ${event.to}`, ); }, ); } const circuitBreakerFactory = new CircuitBreakerFactory({ adapter: new DatabaseCircuitBreakerAdapter({ adapter: new MemoryCircuitBreakerStorageAdapter(), }), eventBus: new MemoryEventBusAdapter(), }); await circuitBreakerListenableFunc(circuitBreakerFactory.events); await circuitBreakerFactoryFunc(circuitBreakerFactory); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/circuit-breaker`](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) API docs. --- # Configuring circuit-breaker adapters ## RedisCircuitBreakerAdapter[​](#rediscircuitbreakeradapter "Direct link to RedisCircuitBreakerAdapter") To use the `RedisCircuitBreakerAdapter`, you'll need to: 1.
Install the required dependency: [`ioredis`](https://www.npmjs.com/package/ioredis) package: ``` import { RedisCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/redis-circuit-breaker-adapter"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisCircuitBreakerAdapter = new RedisCircuitBreakerAdapter({ database, }); ``` ### Configuring backoff policy[​](#configuring-backoff-policy "Direct link to Configuring backoff policy") The `type` field is the only required field. All other fields are optional. ``` import { BACKOFFS } from "@daiso-tech/core/backoff-policies"; import { TimeSpan } from "@daiso-tech/core/time-span"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisCircuitBreakerAdapter = new RedisCircuitBreakerAdapter({ database, backoffPolicy: { type: BACKOFFS.CONSTANT, delay: TimeSpan.fromSeconds(1), jitter: 0.5, }, }); ``` The settings are the same as [backoff policies](/docs/components/backoff_policies.md) settings. ### Configuring circuit-breaker policy[​](#configuring-circuit-breaker-policy "Direct link to Configuring circuit-breaker policy") The `type` field is the only required field. All other fields are optional. ``` import { POLICIES } from "@daiso-tech/core/circuit-breaker/policies"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisCircuitBreakerAdapter = new RedisCircuitBreakerAdapter({ database, circuitBreakerPolicy: { type: POLICIES.CONSECUTIVE, failureThreshold: 5, successThreshold: 5, }, }); ``` The settings are the same as [circuit-breaker policies](/docs/components/circuit_breaker/configuring_circuit_breaker_policies.md) settings. ## DatabaseCircuitBreakerAdapter[​](#databasecircuitbreakeradapter "Direct link to DatabaseCircuitBreakerAdapter") To use the `DatabaseCircuitBreakerAdapter`, you'll need to use an `ICircuitBreakerStorageAdapter`: 1.
Creating `ICircuitBreakerStorageAdapter`: ``` import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; const circuitBreakerStorageAdapter = new MemoryCircuitBreakerStorageAdapter(); ``` 2. Creating `DatabaseCircuitBreakerAdapter`: ``` import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; const circuitBreakerAdapter = new DatabaseCircuitBreakerAdapter({ adapter: circuitBreakerStorageAdapter, }); ``` ### Configuring backoff policy[​](#configuring-backoff-policy-1 "Direct link to Configuring backoff policy") You can use any of the defined [backoff policies](/docs/components/backoff_policies.md). ``` import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { constantBackoff } from "@daiso-tech/core/backoff-policies"; const circuitBreakerAdapter = new DatabaseCircuitBreakerAdapter({ adapter: circuitBreakerStorageAdapter, backoffPolicy: constantBackoff(), }); ``` ### Configuring circuit-breaker policy[​](#configuring-circuit-breaker-policy-1 "Direct link to Configuring circuit-breaker policy") You can use any of the defined [circuit-breaker policies](/docs/components/circuit_breaker/configuring_circuit_breaker_policies.md) or [create your own](/docs/components/circuit_breaker/creating_circuit_breaker_policies.md).
``` import { DatabaseCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/database-circuit-breaker-adapter"; import { SamplingBreaker } from "@daiso-tech/core/circuit-breaker/policies"; const circuitBreakerAdapter = new DatabaseCircuitBreakerAdapter({ adapter: circuitBreakerStorageAdapter, circuitBreakerPolicy: new SamplingBreaker(), }); ``` ## NoOpCircuitBreakerAdapter[​](#noopcircuitbreakeradapter "Direct link to NoOpCircuitBreakerAdapter") The `NoOpCircuitBreakerAdapter` is a no-operation implementation, it performs no actions when called: ``` import { NoOpCircuitBreakerAdapter } from "@daiso-tech/core/circuit-breaker/no-op-circuit-breaker-adapter"; const noOpCircuitBreakerAdapter = new NoOpCircuitBreakerAdapter(); ``` info The `NoOpCircuitBreakerAdapter` is useful when you want to mock out or disable your [`CircuitBreakerProvider`](https://daiso-tech.github.io/daiso-core/classes/CircuitBreaker.CircuitBreakerProvider.html) instance. ## KyselyCircuitBreakerStorageAdapter[​](#kyselycircuitbreakerstorageadapter "Direct link to KyselyCircuitBreakerStorageAdapter") To use the `KyselyCircuitBreakerStorageAdapter`, you'll need to: 1. Use a database provider that supports transactions. 2. Install the required dependency: [`kysely`](https://www.npmjs.com/package/kysely) package: 3.
Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; const serde = new Serde(new SuperJsonSerdeAdapter()); ``` ### With Sqlite[​](#with-sqlite "Direct link to With Sqlite") You will need to install the [`better-sqlite3`](https://www.npmjs.com/package/better-sqlite3) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/kysely-circuit-breaker-storage-adapter"; import Sqlite from "better-sqlite3"; import { Kysely, SqliteDialect } from "kysely"; const database = new Sqlite("DATABASE_NAME.db"); const kysely = new Kysely({ dialect: new SqliteDialect({ database, }), }); const kyselyCircuitBreakerStorageAdapter = new KyselyCircuitBreakerStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyCircuitBreakerStorageAdapter.init(); ``` ### With Postgres[​](#with-postgres "Direct link to With Postgres") You will need to install the [`pg`](https://www.npmjs.com/package/pg) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/kysely-circuit-breaker-storage-adapter"; import { Pool } from "pg"; import { Kysely, PostgresDialect } from "kysely"; const database = new Pool({ database: "DATABASE_NAME", host: "DATABASE_HOST", user: "DATABASE_USER", // Database port port: 5432, password: "DATABASE_PASSWORD", max: 10, }); const kysely = new Kysely({ dialect: new PostgresDialect({ pool: database, }), }); const kyselyCircuitBreakerStorageAdapter = new KyselyCircuitBreakerStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselyCircuitBreakerStorageAdapter.init(); ``` ### With Mysql[​](#with-mysql "Direct link to With Mysql") You will need to install the [`mysql2`](https://www.npmjs.com/package/mysql2) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/kysely-circuit-breaker-storage-adapter"; import { createPool } from "mysql2"; import { Kysely, MysqlDialect } from "kysely"; const database = createPool({ host: "DATABASE_HOST", // Database port port: 3306, database: "DATABASE_NAME", user: "DATABASE_USER", password: "DATABASE_PASSWORD", connectionLimit: 10, }); const kysely = new Kysely({ dialect: new MysqlDialect({ pool: database, }), }); const kyselyCircuitBreakerStorageAdapter = new KyselyCircuitBreakerStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyCircuitBreakerStorageAdapter.init(); ``` ### With Libsql[​](#with-libsql "Direct link to With Libsql") You will need to install the `@libsql/kysely-libsql` package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/kysely-circuit-breaker-storage-adapter"; import { LibsqlDialect } from "@libsql/kysely-libsql"; import { Kysely } from "kysely"; const kysely = new Kysely({ dialect: new LibsqlDialect({ url: "DATABASE_URL", }), }); const kyselyCircuitBreakerStorageAdapter = new KyselyCircuitBreakerStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselyCircuitBreakerStorageAdapter.init(); ``` ## MemoryCircuitBreakerStorageAdapter[​](#memorycircuitbreakerstorageadapter "Direct link to MemoryCircuitBreakerStorageAdapter") To use the `MemoryCircuitBreakerStorageAdapter` you only need to create an instance of it: ``` import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; const memoryCircuitBreakerStorageAdapter = new MemoryCircuitBreakerStorageAdapter(); ``` You can also provide a `Map` that will be used for storing the data in memory: ``` import { MemoryCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/memory-circuit-breaker-storage-adapter"; const map = new Map(); const memoryCircuitBreakerStorageAdapter = new MemoryCircuitBreakerStorageAdapter(map); ``` info `MemoryCircuitBreakerStorageAdapter` lets you test your app without external dependencies like `Redis`, making it ideal for local development, unit tests, integration tests and fast E2E tests for the backend application. ## MongodbCircuitBreakerStorageAdapter[​](#mongodbcircuitbreakerstorageadapter "Direct link to MongodbCircuitBreakerStorageAdapter") To use the `MongodbCircuitBreakerStorageAdapter`, you'll need to: 1. Use a database provider that supports transactions. 2. Install the required dependency: [`mongodb`](https://www.npmjs.com/package/mongodb) package: 3.
Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { MongodbCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/mongodb-circuit-breaker-storage-adapter"; import { MongoClient } from "mongodb"; const client = await MongoClient.connect("YOUR_MONGODB_CONNECTION_STRING"); const database = client.db("database"); const mongodbCircuitBreakerStorageAdapter = new MongodbCircuitBreakerStorageAdapter({ client, database, serde, }); // You need to initialize the adapter once before using it. // During the initialization the indexes will be created await mongodbCircuitBreakerStorageAdapter.init(); ``` ## NoOpCircuitBreakerStorageAdapter[​](#noopcircuitbreakerstorageadapter "Direct link to NoOpCircuitBreakerStorageAdapter") The `NoOpCircuitBreakerStorageAdapter` is a no-operation implementation, it performs no actions when called: ``` import { NoOpCircuitBreakerStorageAdapter } from "@daiso-tech/core/circuit-breaker/no-op-circuit-breaker-storage-adapter"; const noOpCircuitBreakerStorageAdapter = new NoOpCircuitBreakerStorageAdapter(); ``` info The `NoOpCircuitBreakerStorageAdapter` is useful when you want to mock out or disable your [`DatabaseCircuitBreakerAdapter`](https://daiso-tech.github.io/daiso-core/classes/CircuitBreaker.DatabaseCircuitBreakerAdapter.html) instance. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/circuit-breaker`](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) API docs. --- # Configuring circuit-breaker policies ## ConsecutiveBreaker[​](#consecutivebreaker "Direct link to ConsecutiveBreaker") The `ConsecutiveBreaker` breaks after n requests in a row fail. ``` import { ConsecutiveBreaker } from "@daiso-tech/core/circuit-breaker/policies"; new ConsecutiveBreaker({ /** * Amount of consecutive failures before going from closed -> open.
* The field is optional. */ failureThreshold: 5, /** * Amount of consecutive successes before going from half-open -> closed. * The field is optional. */ successThreshold: 5, }); ``` ## CountBreaker[​](#countbreaker "Direct link to CountBreaker") The `CountBreaker` breaks after a proportion of requests in a count based sliding window fail. ``` import { CountBreaker } from "@daiso-tech/core/circuit-breaker/policies"; new CountBreaker({ /** * Percentage (from 0 to 1) of failures before going from closed -> open. * The field is optional. */ failureThreshold: 0.2, /** * Percentage (from 0 to 1) of successes before going from half-open -> closed. * The field is optional. */ successThreshold: 0.8, /** * Size of the count based sliding window. * The field is optional. */ size: 20, /** * The minimum number of calls to go from closed -> open, half-opened -> closed or half-opened -> open. * The field is optional. */ minimumNumberOfCalls: 20, }); ``` ## SamplingBreaker[​](#samplingbreaker "Direct link to SamplingBreaker") The `SamplingBreaker` breaks after a proportion of requests over a time period fail. ``` import { SamplingBreaker } from "@daiso-tech/core/circuit-breaker/policies"; import { TimeSpan } from "@daiso-tech/core/time-span"; new SamplingBreaker({ /** * Percentage (from 0 to 1) of failures before going from closed -> open. * The field is optional. */ failureThreshold: 0.2, /** * Percentage (from 0 to 1) of successes before going from half-open -> closed. * The field is optional. */ successThreshold: 0.8, /** * Length of time over which to sample. * The field is optional. */ timeSpan: TimeSpan.fromMinutes(1), /** * The length of each sample. * The field is optional. */ sampleTimeSpan: TimeSpan.fromMinutes(1).divide(6), /** * The minimum number of calls per second to go from closed -> open, half-opened -> closed or half-opened -> open. * The field is optional.
*/ minimumRps: 5, }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/circuit-breaker`](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) API docs. --- # Creating circuit-breaker adapters ## Implementing your custom ICircuitBreakerAdapter[​](#implementing-your-custom-icircuitbreakeradapter "Direct link to Implementing your custom ICircuitBreakerAdapter") In order to create an adapter you need to implement the [`ICircuitBreakerAdapter`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerAdapter.html) contract. ## Implementing your custom ICircuitBreakerStorageAdapter[​](#implementing-your-custom-icircuitbreakerstorageadapter "Direct link to Implementing your custom ICircuitBreakerStorageAdapter") We provide an additional contract [`ICircuitBreakerStorageAdapter`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerStorageAdapter.html) for building custom circuit-breaker storage adapters tailored to [`DatabaseCircuitBreakerAdapter`](/docs/components/circuit_breaker/configuring_circuit_breaker_adapters.md#databasecircuitbreakeradapter) and [`DatabaseCircuitBreakerProviderFactory`](/docs/components/circuit_breaker/circuit_breaker_factory_resolver.md#databasecircuitbreakerfactoryresolver). ## Testing your custom ICircuitBreakerStorageAdapter[​](#testing-your-custom-icircuitbreakerstorageadapter "Direct link to Testing your custom ICircuitBreakerStorageAdapter") We provide a complete test suite to test your circuit-breaker storage adapter implementation. 
Simply use the [`circuitBreakerStorageTestSuite`](https://daiso-tech.github.io/daiso-core/functions/CircuitBreaker.circuitBreakerStorageTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` // filename: MyCircuitBreakerStorageAdapter.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { circuitBreakerStorageTestSuite } from "@daiso-tech/core/circuit-breaker/test-utilities"; import { MyCircuitBreakerStorageAdapter } from "./MyCircuitBreakerStorageAdapter.js"; describe("class: MyCircuitBreakerStorageAdapter", () => { circuitBreakerStorageTestSuite({ createAdapter: () => new MyCircuitBreakerStorageAdapter(), test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom ICircuitBreakerProvider class[​](#implementing-your-custom-icircuitbreakerprovider-class "Direct link to Implementing your custom ICircuitBreakerProvider class") In some cases, you may need to implement a custom [`CircuitBreakerProvider`](https://daiso-tech.github.io/daiso-core/classes/CircuitBreaker.CircuitBreakerProvider.html) class to optimize performance for your specific technology stack. You can then directly implement the [`ICircuitBreakerProvider`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerProvider.html) contract. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/circuit-breaker`](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) API docs. --- # Creating policies ## Implementing your custom ICircuitBreakerPolicy[​](#implementing-your-custom-icircuitbreakerpolicy "Direct link to Implementing your custom ICircuitBreakerPolicy") In order to create a custom circuit-breaker policy you need to implement the [`ICircuitBreakerPolicy`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerPolicy.html) contract.
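To make this concrete, here is a minimal, self-contained sketch of the kind of state a consecutive-failure policy tracks. The `IFailurePolicy` interface below is a hypothetical stand-in for illustration only, not the library's actual `ICircuitBreakerPolicy` contract:

```typescript
// NOTE: IFailurePolicy is a hypothetical interface for illustration;
// the real ICircuitBreakerPolicy contract may look different.
interface IFailurePolicy {
    // Records the outcome of a protected call.
    onResult(succeeded: boolean): void;
    // Reports whether the breaker should transition to open.
    shouldOpen(): boolean;
}

// Opens after `failureThreshold` failures in a row, mirroring the
// behavior described for ConsecutiveBreaker.
class ConsecutiveFailurePolicy implements IFailurePolicy {
    private consecutiveFailures = 0;

    constructor(private readonly failureThreshold: number) {}

    onResult(succeeded: boolean): void {
        if (succeeded) {
            // Any success resets the failure streak.
            this.consecutiveFailures = 0;
        } else {
            this.consecutiveFailures++;
        }
    }

    shouldOpen(): boolean {
        return this.consecutiveFailures >= this.failureThreshold;
    }
}

const policy = new ConsecutiveFailurePolicy(3);
policy.onResult(false);
policy.onResult(false);
// Two failures in a row: still below the threshold of 3.
console.log(policy.shouldOpen()); // false
policy.onResult(false);
// Third consecutive failure: the breaker should now open.
console.log(policy.shouldOpen()); // true
```

A real policy would additionally track the half-open success streak; the `ConsecutiveBreaker` source linked in this section shows the full state machine.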
Custom circuit-breaker policies can be used with [`DatabaseCircuitBreakerAdapter`](/docs/components/circuit_breaker/configuring_circuit_breaker_adapters.md#databasecircuitbreakeradapter) and [`DatabaseCircuitBreakerProviderFactory`](/docs/components/circuit_breaker/circuit_breaker_factory_resolver.md#databasecircuitbreakerfactoryresolver). To understand how to implement a custom [`ICircuitBreakerPolicy`](https://daiso-tech.github.io/daiso-core/types/CircuitBreaker.ICircuitBreakerPolicy.html), refer to the [`ConsecutiveBreaker`](https://github.com/yousif-khalil-abdulkarim/daiso-core/blob/main/src/circuit-breaker/implementations/policies/consecutive-breaker/consecutive-breaker.ts) implementation. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/circuit-breaker`](https://daiso-tech.github.io/daiso-core/modules/CircuitBreaker.html) API docs. --- # Codec The `@daiso-tech/core/codec` component provides a seamless way to encode/decode data. ## Usage[​](#usage "Direct link to Usage") ``` import { Base64Codec } from "@daiso-tech/core/codec/base-64-codec"; const codec = new Base64Codec(); const encodedStr = codec.encode("This is base-64 encoded"); const decodedStr = codec.decode(encodedStr); ``` ## Separating encoding and decoding[​](#separating-encoding-and-decoding "Direct link to Separating encoding and decoding") The library includes 3 additional contracts: * `IEncoder` - Allows only for encoding. * `IDecoder` - Allows only for decoding. * `ICodec` - Allows for both encoding and decoding. ## Existing Codecs[​](#existing-codec "Direct link to existing-codec") Currently the library only includes the `Base64Codec` class. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/codec`](https://daiso-tech.github.io/daiso-core/modules/Codec.html) API docs.
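To make the `IEncoder`/`IDecoder`/`ICodec` separation described in the Codec section concrete, here is a self-contained sketch. The interface shapes and the `SimpleBase64Codec` class are illustrative assumptions, not the library's actual definitions:

```typescript
// Illustrative contract shapes -- the library's actual IEncoder,
// IDecoder and ICodec definitions may differ.
interface IEncoder {
    encode(value: string): string;
}
interface IDecoder {
    decode(value: string): string;
}
type ICodec = IEncoder & IDecoder;

// A minimal base-64 codec built on Node's Buffer.
class SimpleBase64Codec implements ICodec {
    encode(value: string): string {
        return Buffer.from(value, "utf8").toString("base64");
    }
    decode(value: string): string {
        return Buffer.from(value, "base64").toString("utf8");
    }
}

// A function that only encodes can depend on the narrow IEncoder
// contract, so it can never decode by accident.
function encodeOnly(encoder: IEncoder, value: string): string {
    return encoder.encode(value);
}

const codec = new SimpleBase64Codec();
const encoded = encodeOnly(codec, "This is base-64 encoded");
console.log(codec.decode(encoded)); // "This is base-64 encoded"
```

Accepting the narrow `IEncoder` in `encodeOnly` means the compiler rejects any decoding inside that function, which is the point of splitting the contracts.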
--- # Collection The `@daiso-tech/core/collection` component provides a fluent, convenient wrapper for working with `Array`, `Iterable` and `AsyncIterable`. ## Creating a collection[​](#creating-a-collection "Direct link to Creating a collection") You can create a collection from any object that implements the `Iterable` or `ArrayLike` contract. Create a collection from an `Array`: ``` import { ICollection, ListCollection } from "@daiso-tech/core/collection"; const fromArray = new ListCollection([1, 2, 3, 4]); // Logs [1, 2, 3, 4] console.log(fromArray.toArray()); ``` Create a collection from a `string`: ``` const fromString = new ListCollection("abc"); // Logs ["a", "b", "c"] console.log(fromString.toArray()); ``` Create a collection from a `Set`: ``` const fromSet = new ListCollection(new Set([1, 2, 2, 4])); // Logs [1, 2, 4] console.log(fromSet.toArray()); ``` Create a collection from a `Map`: ``` const fromMap = new ListCollection( new Map([ ["a", 1], ["b", 2], ]), ); // Logs [["a", 1], ["b", 2]] console.log(fromMap.toArray()); ``` Create collection from your own object that implements the `Iterable` contract: ``` class MyIterable implements Iterable<number> { *[Symbol.iterator](): Iterator<number> { yield 1; yield 2; yield 3; } } const fromIterable: ListCollection<number> = new ListCollection( new MyIterable(), ); // Logs [1, 2, 3] console.log(fromIterable.toArray()); ``` Create collection from your own object that implements the `ArrayLike` contract: ``` const myArrayLike: ArrayLike<number> = { 0: 1, 1: 2, 2: -3, 3: -1, length: 4, }; const fromArrayLike: ListCollection<number> = new ListCollection(myArrayLike); // Logs [1, 2, -3, -1] console.log(fromArrayLike.toArray()); ``` ## Immutability[​](#immutability "Direct link to Immutability") The collection is immutable, meaning that all methods will return a new collection and not modify the original collection.
``` import { ListCollection } from "@daiso-tech/core/collection"; const original = new ListCollection([1, 2, 3]); const modified = original.map((item) => item * 2); // Logs [1, 2, 3] console.log(original.toArray()); // Logs [2, 4, 6] console.log(modified.toArray()); ``` ## Accessing elements from a collection[​](#acessing-elements-from-a-collection "Direct link to Accessing elements from a collection") To access elements from a collection you can use the [`get`](#get) method. The [`get`](#get) method will return the item at the specified index. If the index is out of bounds, it will return null. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 2, 3]); const value = collection.get(1); // Logs 2 console.log(value); ``` If you want to get the item at the specified index but return a default value if the index is out of bounds, you can use the [`getOr`](#getor) method. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 2, 3]); const value = collection.getOr(1, -1); // Logs 2 console.log(value); const value2 = collection.getOr(5, -1); // Logs -1 console.log(value2); ``` If you want to get the item at the specified index and throw an error if the index is out of bounds, you can use the [`getOrFail`](#getorfail) method. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 2, 3]); const value = collection.getOrFail(1); // Logs 2 console.log(value); // throws error const value2 = collection.getOrFail(5); ``` info All methods that end with `Or` will return the item if it exists or a default value. All methods that end with `OrFail` will return the item if it exists or throw an error. ## Iterating over a collection[​](#iterating-over-a-collection "Direct link to Iterating over a collection") Since the collection is `Iterable`, you can also use the `for of` loop to iterate over it.
``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 2, 3]); // Logs 1, 2, 3 for (const item of collection) { console.log(item); } ``` ## Modifying and filtering a collection[​](#modifying-and-filtering-a-collection "Direct link to Modifying and filtering a collection") You can modify the collection by using for example the [`map`](#map) method: ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 2, 3]).map((value) => value * value); // Logs [1, 4, 9] console.log(collection.toArray()); ``` You can filter the collection by using for example the [`filter`](#filter) method: ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 2, 3, 4, 5, 6]).filter( (value) => value % 2 === 0, ); // Logs [2, 4, 6] console.log(collection.toArray()); ``` info All methods that iterate, modify, or filter the collection expect a callback function with three arguments in the following order: 1. `item`: The current element being processed. 2. `index`: The index of the current element. 3. `collection`: The original collection being traversed. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([2, 3, 2, 3, 4, 3]).filter( (item, index, collection) => { // Logs each item console.log("item:", item); // Logs each index of the item console.log("index:", index); // Logs the original collection console.log("collection:", collection.toArray()); return item === 2; }, ); collection.toArray(); ``` ## Types of collections[​](#types-of-collections "Direct link to Types of collections") The library includes 3 types of collections: * `ListCollection` implements the `ICollection` contract and uses `Array` internally. * `IterableCollection` implements the `ICollection` contract and uses `Iterable` internally. It only filters and transforms items when you loop through or access its items.
* `AsyncIterableCollection` implements the `IAsyncCollection` contract and uses `AsyncIterable` internally. It only filters and transforms items when you loop through or access its items. ## Serialization and deserialization[​](#serialization-and-deserialization "Direct link to Serialization and deserialization") The `ListCollection` and `IterableCollection` classes support serialization and deserialization, allowing you to easily convert instances to and from serialized formats. However, registration is required first: Serializing and deserializing `ListCollection`: ``` import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { ListCollection, IterableCollection, } from "@daiso-tech/core/collection"; const serde = new Serde(new SuperJsonSerdeAdapter()); serde.registerClass(ListCollection); serde.registerClass(IterableCollection); const listCollection = new ListCollection([1, 2, 3, 4, 5]); const serializedListCollection = serde.serialize(listCollection); const deserializedListCollection = serde.deserialize(serializedListCollection); // Logs false console.log(serializedListCollection === deserializedListCollection); // Logs [1, 2, 3, 4, 5] [1, 2, 3, 4, 5] console.log(listCollection.toArray(), deserializedListCollection.toArray()); ``` Serializing and deserializing `IterableCollection`: ``` import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { ListCollection, IterableCollection, } from "@daiso-tech/core/collection"; const serde = new Serde(new SuperJsonSerdeAdapter()); serde.registerClass(ListCollection); serde.registerClass(IterableCollection); const listCollection = new IterableCollection([1, 2, 3, 4, 5]); const serializedListCollection = serde.serialize(listCollection); const deserializedListCollection = serde.deserialize(serializedListCollection); // Logs false
console.log(serializedListCollection === deserializedListCollection); // Logs [1, 2, 3, 4, 5] [1, 2, 3, 4, 5] console.log(listCollection.toArray(), deserializedListCollection.toArray()); ``` ## Available Methods[​](#available-methods "Direct link to Available Methods") For the remainder of the documentation, we'll discuss each method available on the `ICollection` contract. ### Instance methods[​](#instance-methods "Direct link to Instance methods") [after](#after), [afterOr](#afteror), [afterOrFail](#afterorfail), [append](#append), [average](#average), [before](#before), [beforeOr](#beforeor), [beforeOrFail](#beforeorfail), [change](#change), [chunk](#chunk), [chunkWhile](#chunkwhile), [collapse](#collapse), [copy](#copy), [count](#count), [countBy](#countby), [crossJoin](#crossjoin), [difference](#difference), [entries](#entries), [every](#every), [filter](#filter), [validate](#validate), [first](#first), [firstOr](#firstor), [firstOrFail](#firstorfail), [flatMap](#flatmap), [forEach](#foreach), [get](#get), [getOrFail](#getorfail), [groupBy](#groupby), [insertAfter](#insertafter), [insertBefore](#insertbefore), [isEmpty](#isempty), [isNotEmpty](#isnotempty), [join](#join), [keys](#keys), [last](#last), [lastOr](#lastor), [lastOrFail](#lastorfail), [map](#map), [max](#max), [median](#median), [min](#min), [nth](#nth), [padEnd](#padend), [padStart](#padstart), [page](#page), [partition](#partition), [percentage](#percentage), [pipe](#pipe), [prepend](#prepend), [reduce](#reduce), [reject](#reject), [repeat](#repeat), [reverse](#reverse), [searchFirst](#searchfirst), [searchLast](#searchlast), [serialize](#serialize), [set](#set), [shuffle](#shuffle), [size](#size), [skip](#skip), [skipUntil](#skipuntil), [skipWhile](#skipwhile), [slice](#slice), [sliding](#sliding), [sole](#sole), [some](#some), [sort](#sort), [split](#split),
[sum](#sum), [take](#take), [takeUntil](#takeuntil), [takeWhile](#takewhile), [tap](#tap), [toArray](#toarray), [toIterator](#toiterator), [toMap](#tomap), [toRecord](#torecord), [unique](#unique), [values](#values), [when](#when), [whenEmpty](#whenempty), [whenNot](#whennot), [whenNotEmpty](#whennotempty), [zip](#zip) ### Static methods[​](#static-methods "Direct link to Static methods") [concat](#concat), [difference](#difference), [zip](#class_zip) ## Instance methods[​](#instance-methods-1 "Direct link to Instance methods") ### after[​](#after "Direct link to after") The `after` method returns the item that comes after the first item that matches `predicateFn`. If the collection is empty, or `predicateFn` does not match, or it matches the last item, then `null` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).after((item) => item === 2); // 3 ``` ### afterOr[​](#afteror "Direct link to afterOr") The `afterOr` method returns the item that comes after the first item that matches `predicateFn`. If the collection is empty, or `predicateFn` does not match, or it matches the last item, then `defaultValue` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).afterOr(-1, (item) => item === 4); // -1 ``` ### afterOrFail[​](#afterorfail "Direct link to afterOrFail") The `afterOrFail` method returns the item that comes after the first item that matches `predicateFn`. If the collection is empty, or `predicateFn` does not match, or it matches the last item, then an error is thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).afterOrFail((item) => item === 4); // throws error ``` ### append[​](#append "Direct link to append") The `append` method adds an `Iterable` to the end of the collection.
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]).append([-1, -2]).toArray(); // [1, 2, 3, 4, 5, -1, -2] ``` ### average[​](#average "Direct link to average") The `average` method returns the average of all items in the collection. If the collection contains non-number items an error will be thrown. An error will also be thrown if the collection is empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).average(); // 2 ``` ### before[​](#before "Direct link to before") The `before` method returns the item that comes before the first item that matches `predicateFn`. If `predicateFn` does not match, or it matches the first item, then `null` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).before((item) => item === 2); // 1 ``` ### beforeOr[​](#beforeor "Direct link to beforeOr") The `beforeOr` method returns the item that comes before the first item that matches `predicateFn`. If the collection is empty, or `predicateFn` does not match, or it matches the first item, then `defaultValue` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).beforeOr(-1, (item) => item === 2); // 1 ``` ### beforeOrFail[​](#beforeorfail "Direct link to beforeOrFail") The `beforeOrFail` method returns the item that comes before the first item that matches `predicateFn`. If the collection is empty, or `predicateFn` does not match, or it matches the first item, then an error is thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).beforeOrFail((item) => item === 1); // throws error ``` ### change[​](#change "Direct link to change") The `change` method changes only the items that pass `predicateFn`, using `mapFn`.
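In plain-TypeScript terms, `change` behaves like a conditional `map`: items that pass the predicate are transformed, all others pass through untouched. A sketch of the semantics (not the library's implementation):

```typescript
// Transform only the items matching predicateFn; leave the rest as-is.
function change<T>(
    items: readonly T[],
    predicateFn: (item: T) => boolean,
    mapFn: (item: T) => T,
): T[] {
    return items.map((item) => (predicateFn(item) ? mapFn(item) : item));
}

change([1, 2, 3, 4, 5], (item) => item % 2 === 0, (item) => item * 2);
// [1, 4, 3, 8, 5]
```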
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]) .change((item) => item % 2 === 0, (item) => item * 2) .toArray(); // [1, 4, 3, 8, 5] ``` ### chunk[​](#chunk "Direct link to chunk") The `chunk` method breaks the collection into multiple, smaller collections of size `chunkSize`. If the total number of items is not evenly divisible by `chunkSize`, the last chunk will contain the remaining items. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6, 7]) .chunk(4) .map((chunk) => chunk.toArray()) .toArray(); // [[1, 2, 3, 4], [5, 6, 7]] ``` ### chunkWhile[​](#chunkwhile "Direct link to chunkWhile") The `chunkWhile` method breaks the collection into multiple, smaller collections based on the evaluation of `predicateFn`. The chunk variable passed to the `predicateFn` may be used to inspect the previous item. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection("AABBCCCD") .chunkWhile((item, _index, chunk) => item === chunk.last()) .map((chunk) => chunk.toArray()) .toArray(); // [["A", "A"], ["B", "B"], ["C", "C", "C"], ["D"]] ``` ### collapse[​](#collapse "Direct link to collapse") The `collapse` method collapses a collection of iterables into a single, flat collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([ [1, 2], [3, 4], ]) .collapse() .toArray(); // [1, 2, 3, 4] ``` ### count[​](#count "Direct link to count") The `count` method returns the total number of items in the collection that pass `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6]).count((value) => value % 2 === 0); // 3 ``` ### countBy[​](#countby "Direct link to countBy") The `countBy` method counts the occurrences of values in the collection by `selectFn`. By default the equality check occurs on the item.
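A hedged sketch of the `countBy` semantics in plain TypeScript: occurrences are tallied in a `Map` keyed by `selectFn(item)`, or by the item itself when no selector is given (illustrative only):

```typescript
// Count occurrences grouped by a derived key; insertion order is preserved.
function countBy<T, K>(
    items: readonly T[],
    selectFn: (item: T) => K = (item) => item as unknown as K,
): Array<[K, number]> {
    const counts = new Map<K, number>();
    for (const item of items) {
        const key = selectFn(item);
        counts.set(key, (counts.get(key) ?? 0) + 1);
    }
    return [...counts.entries()];
}

countBy(["a", "a", "a", "b", "b", "c"]);
// [["a", 3], ["b", 2], ["c", 1]]
```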
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection(["a", "a", "a", "b", "b", "c"]).countBy().toArray(); // [["a", 3], ["b", 2], ["c", 1]] ``` ### crossJoin[​](#crossjoin "Direct link to crossJoin") The `crossJoin` method cross joins the collection's values among `iterables`, returning a Cartesian product with all possible permutations. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2]).crossJoin(["a", "b"]).toArray(); // [[1, "a"], [1, "b"], [2, "a"], [2, "b"]] ``` ### entries[​](#entries "Direct link to entries") The `entries` method returns a `ListCollection` of `[key, value]` pairs for every entry in the collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection(["a", "b", "c", "d"]).entries().toArray(); // [[0, "a"], [1, "b"], [2, "c"], [3, "d"]] ``` ### every[​](#every "Direct link to every") The `every` method determines whether all items in the collection match `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; const isAllNumberLessThan6 = new ListCollection([0, 1, 2, 3, 4, 5]).every( (item) => item < 6, ); // true ``` ### filter[​](#filter "Direct link to filter") The `filter` method filters the collection using `predicateFn`, keeping only those items that pass `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([0, 1, 2, 3, 4, 5, 6]) .filter((item) => 2 < item && item < 5) .toArray(); // [3, 4] ``` ### validate[​](#validate "Direct link to validate") The `validate` method keeps only the items that match the `schema` and transforms them afterwards. The `schema` can be any [standard schema](https://standardschema.dev/).
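A hedged sketch of the `validate` semantics, using a simplified stand-in for the schema interface (a hypothetical `MiniSchema` with a `parse` function, not the real Standard Schema shape): items whose parse fails are dropped, the rest are replaced with the parsed value.

```typescript
// A hypothetical, simplified schema shape: parse either succeeds with a
// (possibly transformed) value, or fails.
type MiniSchema<In, Out> = {
    parse: (input: In) => { success: true; value: Out } | { success: false };
};

// Keep only items that parse successfully, in their transformed form.
function validate<In, Out>(
    items: readonly In[],
    schema: MiniSchema<In, Out>,
): Out[] {
    const result: Out[] = [];
    for (const item of items) {
        const parsed = schema.parse(item);
        if (parsed.success) {
            result.push(parsed.value);
        }
    }
    return result;
}

// A toy "numeric string" schema: keeps strings parseable as numbers.
const numericString: MiniSchema<string, number> = {
    parse: (input) => {
        const nbr = Number(input);
        return Number.isNaN(nbr)
            ? { success: false }
            : { success: true, value: nbr };
    },
};

validate(["a", "1.2", "3", "null"], numericString);
// [1.2, 3]
```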
``` import { ListCollection } from "@daiso-tech/core/collection"; import { z } from "zod"; new ListCollection(["a", "1.2", "3", "null"]) .validate(z.coerce.number()) .toArray(); // [1.2, 3] ``` ### first[​](#first "Direct link to first") The `first` method returns the first item in the collection that passes `predicateFn`. By default it will get the first item. If the collection is empty or no item passes `predicateFn` then `null` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).first(); // 1 ``` ### firstOr[​](#firstor "Direct link to firstOr") The `firstOr` method returns the first item in the collection that passes `predicateFn`. By default it will get the first item. If the collection is empty or no item passes `predicateFn` then `defaultValue` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).firstOr(-1, (item) => item > 10); // -1 ``` ### firstOrFail[​](#firstorfail "Direct link to firstOrFail") The `firstOrFail` method returns the first item in the collection that passes `predicateFn`. By default it will get the first item. If the collection is empty or no item passes `predicateFn` then an error is thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).firstOrFail((item) => item === 5); // throws error ``` ### flatMap[​](#flatmap "Direct link to flatMap") The `flatMap` method returns a new collection formed by applying `mapFn` to each item, and then collapsing the result by one level. It is identical to a `map` method followed by a `collapse` method. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([ ["a", "b"], ["c", "d"], ]) .flatMap((item) => [item.length, ...item]) .toArray(); // [2, "a", "b", 2, "c", "d"] ``` ### forEach[​](#foreach "Direct link to forEach") The `forEach` method iterates through all items in the collection.
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).forEach((item) => console.log(item)); // Logs: 1, 2, 3 ``` ### get[​](#get "Direct link to get") The `get` method returns the item by index. If the item is not found `null` will be returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 4, 2, 8, -2]); collection.get(2); // 2 collection.get(5); // null ``` ### getOrFail[​](#getorfail "Direct link to getOrFail") The `getOrFail` method returns the item by index. If the item is not found an error will be thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; const collection = new ListCollection([1, 4, 2, 8, -2]); collection.getOrFail(2); // 2 collection.getOrFail(5); // throws error ``` ### groupBy[​](#groupby "Direct link to groupBy") The `groupBy` method groups the collection's items by `selectFn`. By default the equality check occurs on the item. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection(["a", "a", "a", "b", "b", "c"]) .groupBy() .map(([k, v]) => [k, v.toArray()]) .toArray(); // [["a", ["a", "a", "a"]], ["b", ["b", "b"]], ["c", ["c"]]] ``` ### insertAfter[​](#insertafter "Direct link to insertAfter") The `insertAfter` method adds an `Iterable` after the first item that matches `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 2, 3, 4, 5]) .insertAfter((item) => item === 2, [-1, 20]) .toArray(); // [1, 2, -1, 20, 2, 3, 4, 5] ``` ### insertBefore[​](#insertbefore "Direct link to insertBefore") The `insertBefore` method adds an `Iterable` before the first item that matches `predicateFn`.
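A plain-TypeScript sketch of the `insertBefore` semantics (illustrative, not the library code): the new items are spliced in just before the first element matching the predicate, and the input is left unchanged when nothing matches.

```typescript
// Insert newItems immediately before the first match of predicateFn.
function insertBefore<T>(
    items: readonly T[],
    predicateFn: (item: T) => boolean,
    newItems: Iterable<T>,
): T[] {
    const index = items.findIndex(predicateFn);
    if (index === -1) {
        return [...items];
    }
    return [...items.slice(0, index), ...newItems, ...items.slice(index)];
}

insertBefore([1, 2, 2, 3, 4, 5], (item) => item === 2, [-1, 20]);
// [1, -1, 20, 2, 2, 3, 4, 5]
```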
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 2, 3, 4, 5]) .insertBefore((item) => item === 2, [-1, 20]) .toArray(); // [1, -1, 20, 2, 2, 3, 4, 5] ``` ### isEmpty[​](#isempty "Direct link to isEmpty") The `isEmpty` method returns true if the collection is empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([]).isEmpty(); // true ``` ### isNotEmpty[​](#isnotempty "Direct link to isNotEmpty") The `isNotEmpty` method returns true if the collection is not empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1]).isNotEmpty(); // true ``` ### join[​](#join "Direct link to join") The `join` method joins the collection's items with `separator`. An error will be thrown if a non-string item is encountered. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).map((item) => item.toString()).join(); // "1,2,3,4" ``` ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).map((item) => item.toString()).join("_"); // "1_2_3_4" ``` ### keys[​](#keys "Direct link to keys") The `keys` method returns a `ListCollection` of the keys in the collection. ### last[​](#last "Direct link to last") The `last` method returns the last item in the collection that passes `predicateFn`. By default it will get the last item. If the collection is empty or no item passes `predicateFn` then `null` is returned. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).last(); // 4 ``` ### lastOr[​](#lastor "Direct link to lastOr") The `lastOr` method returns the last item in the collection that passes `predicateFn`. By default it will get the last item. If the collection is empty or no item passes `predicateFn` then `defaultValue` is returned.
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).lastOr(-1, (item) => item > 10); // -1 ``` ### lastOrFail[​](#lastorfail "Direct link to lastOrFail") The `lastOrFail` method returns the last item in the collection that passes `predicateFn`. By default it will get the last item. If the collection is empty or no item passes `predicateFn` then an error is thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).lastOrFail((item) => item === 5); // throws error ``` ### map[​](#map "Direct link to map") The `map` method iterates through the collection and passes each item to `mapFn`. The `mapFn` is free to modify the item and return it, thus forming a new collection of modified items. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]).map((item) => item * 2).toArray(); // [2, 4, 6, 8, 10] ``` ### max[​](#max "Direct link to max") The `max` method returns the max of all items in the collection. If the collection contains non-number items an error will be thrown. An error will also be thrown if the collection is empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).max(); // 3 ``` ### median[​](#median "Direct link to median") The `median` method returns the median of all items in the collection. If the collection contains non-number items an error will be thrown. An error will also be thrown if the collection is empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).median(); // 2 ``` ### min[​](#min "Direct link to min") The `min` method returns the min of all items in the collection. If the collection contains non-number items an error will be thrown. An error will also be thrown if the collection is empty.
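The numeric aggregates (`min`, `max`, `median`, `average`, `sum`) reduce to small plain-TypeScript computations; here is a hedged sketch of `min` and `median`, mirroring the empty-collection error described above (illustrative, not the library's implementation):

```typescript
// Smallest value; throws on an empty input, as the docs describe.
function min(items: readonly number[]): number {
    if (items.length === 0) throw new Error("Collection is empty");
    return items.reduce((a, b) => Math.min(a, b));
}

// Middle value of the sorted items; the mean of the two middle values
// when the length is even.
function median(items: readonly number[]): number {
    if (items.length === 0) throw new Error("Collection is empty");
    const sorted = [...items].sort((a, b) => a - b);
    const mid = Math.floor(sorted.length / 2);
    return sorted.length % 2 === 1
        ? sorted[mid]
        : (sorted[mid - 1] + sorted[mid]) / 2;
}

min([1, 2, 3]); // 1
median([1, 2, 3]); // 2
```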
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).min(); // 1 ``` ### nth[​](#nth "Direct link to nth") The `nth` method creates a new collection consisting of every n-th item. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection(["a", "b", "c", "d", "e", "f"]).nth(4).toArray(); // ["a", "e"] ``` ### padEnd[​](#padend "Direct link to padEnd") The `padEnd` method pads this collection with `fillItems` until the resulting collection size reaches `maxLength`. The padding is applied from the end of this collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection("abc").padEnd(10, "foo").join(""); // "abcfoofoof" ``` ### padStart[​](#padstart "Direct link to padStart") The `padStart` method pads this collection with `fillItems` until the resulting collection size reaches `maxLength`. The padding is applied from the start of this collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection("abc").padStart(10, "foo").join(""); // "foofoofabc" ``` ### page[​](#page "Direct link to page") The `page` method returns a new collection containing the items that would be present on `page` with custom `pageSize`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6, 7, 8, 9]) .page( 2, // Page number 3, // Page size ) .toArray(); // [4, 5, 6] ``` ### partition[​](#partition "Direct link to partition") The `partition` method is used to separate items that pass `predicateFn` from those that do not. 
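The `partition` semantics can be sketched in plain TypeScript as a single pass that routes each item into one of two buckets (illustrative, not the library code):

```typescript
// Split items into [passed, failed] according to predicateFn.
function partition<T>(
    items: readonly T[],
    predicateFn: (item: T) => boolean,
): [T[], T[]] {
    const passed: T[] = [];
    const failed: T[] = [];
    for (const item of items) {
        (predicateFn(item) ? passed : failed).push(item);
    }
    return [passed, failed];
}

partition([1, 2, 3, 4, 5, 6], (nbr) => nbr % 2 === 0);
// [[2, 4, 6], [1, 3, 5]]
```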
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6]) .partition((nbr) => nbr % 2 === 0) .map((chunk) => chunk.toArray()) .toArray(); // [[2, 4, 6], [1, 3, 5]] ``` ### percentage[​](#percentage "Direct link to percentage") The `percentage` method may be used to quickly determine the percentage of items in the collection that pass `predicateFn`. If the collection is empty an error will be thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 1, 2, 2, 2, 3]).percentage((value) => value === 1); // 33.333 ``` ### pipe[​](#pipe "Direct link to pipe") The `pipe` method passes the original collection to `callback` and returns the result from `callback`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, "2", "a", 1, 3, {}]) .pipe((c) => c.map((item) => Number(item)).reject(isNaN)) .pipe((c) => c.repeat(2).toArray()); // [1, 2, 1, 3, 1, 2, 1, 3] ``` ### prepend[​](#prepend "Direct link to prepend") The `prepend` method adds an `Iterable` to the beginning of the collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]).prepend([-1, 20]).toArray(); // [-1, 20, 1, 2, 3, 4, 5] ``` ### reduce[​](#reduce "Direct link to reduce") The `reduce` method executes `reduceFn` function on each item of the array, passing in the return value from the calculation on the preceding item. The final result of running the reducer across all items of the array is a single value. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).reduce((sum, item) => sum + item); // 6 ``` ### reject[​](#reject "Direct link to reject") The `reject` method filters the collection using `predicateFn`, keeping only those items that do not pass `predicateFn`.
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6]) .reject((item) => 2 < item && item < 5) .toArray(); // [1, 2, 5, 6] ``` ### repeat[​](#repeat "Direct link to repeat") The `repeat` method will repeat the original collection `amount` times. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).repeat(3).toArray(); // [1, 2, 3, 1, 2, 3, 1, 2, 3] ``` ### reverse[​](#reverse "Direct link to reverse") The `reverse` method will reverse the order of the collection. The reversal is applied in chunks of size `chunkSize`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([-1, 2, 4, 3]).reverse().toArray(); // [3, 4, 2, -1] ``` ### searchFirst[​](#searchfirst "Direct link to searchFirst") The `searchFirst` method returns the index of the first item that matches `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection(["a", "b", "b", "c"]).searchFirst((item) => item === "b"); // 1 ``` ### searchLast[​](#searchlast "Direct link to searchLast") The `searchLast` method returns the index of the last item that matches `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection(["a", "b", "b", "c"]).searchLast((item) => item === "b"); // 2 ``` ### set[​](#set "Direct link to set") The `set` method updates the item at the specified index with a new value. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]).set(1, -1).toArray(); // [1, -1, 3, 4, 5] ``` ### shuffle[​](#shuffle "Direct link to shuffle") The `shuffle` method randomly shuffles the items in the collection. You can provide a custom Math.random function by passing in `mathRandom`.
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).shuffle().toArray(); // Random order, e.g., [3, 1, 4, 2] ``` ### size[​](#size "Direct link to size") The `size` method returns the size of the collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).size(); // 3 ``` ### skip[​](#skip "Direct link to skip") The `skip` method skips the first `offset` items. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]).skip(4).toArray(); // [5, 6, 7, 8, 9, 10] ``` ### skipUntil[​](#skipuntil "Direct link to skipUntil") The `skipUntil` method skips items until `predicateFn` returns true. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).skipUntil((item) => item >= 3).toArray(); // [3, 4] ``` ### skipWhile[​](#skipwhile "Direct link to skipWhile") The `skipWhile` method skips items until `predicateFn` returns false. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).skipWhile((item) => item <= 3).toArray(); // [4] ``` ### slice[​](#slice "Direct link to slice") The `slice` method returns a portion of the original collection selected from `start` to `end` (`end` not included), where `start` and `end` represent indexes of items in the collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection(["a", "b", "c", "d", "e", "f"]).slice(3).toArray(); // ["d", "e", "f"] ``` ### sliding[​](#sliding "Direct link to sliding") The `sliding` method returns a new collection of chunks representing a "sliding window" view of the items in the collection.
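A plain-TypeScript sketch of the sliding-window semantics, assuming a step of 1 so consecutive windows overlap (illustrative, not the library code):

```typescript
// Produce overlapping windows of the given size, advancing one item at a time.
function sliding<T>(items: readonly T[], size: number): T[][] {
    const windows: T[][] = [];
    for (let i = 0; i + size <= items.length; i++) {
        windows.push(items.slice(i, i + size));
    }
    return windows;
}

sliding([1, 2, 3, 4, 5], 2);
// [[1, 2], [2, 3], [3, 4], [4, 5]]
```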
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]) .sliding(2) .map((chunk) => chunk.toArray()) .toArray(); // [[1, 2], [2, 3], [3, 4], [4, 5]] ``` ### sole[​](#sole "Direct link to sole") The `sole` method returns the first item in the collection that passes `predicateFn`, but only if `predicateFn` matches exactly one item. If no item matches or multiple items match, an error will be thrown. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]).sole((item) => item === 4); // 4 ``` ### some[​](#some "Direct link to some") The `some` method determines whether at least one item in the collection matches `predicateFn`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([0, 1, 2, 3, 4, 5]).some((item) => item === 1); // true ``` ### sort[​](#sort "Direct link to sort") The `sort` method sorts the collection. You can provide a `comparator` function. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([-1, 2, 4, 3]).sort().toArray(); // [-1, 2, 3, 4] ``` ### split[​](#split "Direct link to split") The `split` method breaks a collection evenly into `chunkAmount` chunks. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5]) .split(3) .map((chunk) => chunk.toArray()) .toArray(); // [[1, 2], [3, 4], [5]] ``` ### sum[​](#sum "Direct link to sum") The `sum` method returns the sum of all items in the collection. If the collection contains non-number items an error will be thrown. An error will also be thrown if the collection is empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).sum(); // 6 ``` ### take[​](#take "Direct link to take") The `take` method takes the first `limit` items.
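Consistent with the examples shown for this method, a negative `limit` keeps everything except that many items from the end. A hedged plain-TypeScript sketch of that behavior:

```typescript
// Positive limit: keep the first `limit` items.
// Negative limit: drop `|limit|` items from the end.
function take<T>(items: readonly T[], limit: number): T[] {
    return limit >= 0
        ? items.slice(0, limit)
        : items.slice(0, items.length + limit);
}

take([0, 1, 2, 3, 4, 5], 3); // [0, 1, 2]
take([0, 1, 2, 3, 4, 5], -2); // [0, 1, 2, 3]
```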
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([0, 1, 2, 3, 4, 5]).take(3).toArray(); // [0, 1, 2] ``` ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([0, 1, 2, 3, 4, 5]).take(-2).toArray(); // [0, 1, 2, 3] ``` ### takeUntil[​](#takeuntil "Direct link to takeUntil") The `takeUntil` method takes items until `predicateFn` returns true. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).takeUntil((item) => item >= 3).toArray(); // [1, 2] ``` ### takeWhile[​](#takewhile "Direct link to takeWhile") The `takeWhile` method takes items until `predicateFn` returns false. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]).takeWhile((item) => item < 4).toArray(); // [1, 2, 3] ``` ### tap[​](#tap "Direct link to tap") The `tap` method passes a copy of the original collection to `callback`, allowing you to do something with the items while not affecting the original collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4, 5, 6]) .tap((c) => c.filter((v) => v % 2 === 0).forEach(console.log)) .toArray(); // [1, 2, 3, 4, 5, 6] (logs 2, 4, 6) ``` ### toArray[​](#toarray "Direct link to toArray") The `toArray` method converts the collection to a new `Array`. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3]).toArray(); // [1, 2, 3] ``` ### toIterator[​](#toiterator "Direct link to toIterator") The `toIterator` method converts the collection to an `Iterator`. ``` import { ListCollection } from "@daiso-tech/core/collection"; const iterator = new ListCollection([1, 2, 3, 4, 5]).toIterator(); console.log("item 1:", iterator.next()); console.log("item 2:", iterator.next()); console.log("item 3:", iterator.next()); console.log("done:", iterator.next()); ``` ### toMap[​](#tomap "Direct link to toMap") The `toMap` method converts the collection to a new `Map`.
An error will be thrown if an item is not a tuple of size 2. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([ [0, "a"], [1, "b"], ]).toMap(); // Map { 0 => "a", 1 => "b" } ``` ### toRecord[​](#torecord "Direct link to toRecord") The `toRecord` method converts the collection to a new `Record`. An error will be thrown if an item is not a tuple of size 2 where the first element is a string or a number. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([ [0, "a"], [1, "b"], ]).toRecord(); // { 0: "a", 1: "b" } ``` ### unique[​](#unique "Direct link to unique") The `unique` method removes all duplicate values from the collection by `selectFn`. By default the equality check occurs on the item. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 1, 2, 2, 3, 4, 2]).unique().toArray(); // [1, 2, 3, 4] ``` Equality check on a field in the item. ``` import { ListCollection } from "@daiso-tech/core/collection"; type Phone = { name: string; brand: string; type: string }; new ListCollection([ { name: "iPhone 6", brand: "Apple", type: "phone" }, { name: "iPhone 5", brand: "Apple", type: "phone" }, { name: "Apple Watch", brand: "Apple", type: "watch" }, { name: "Galaxy S6", brand: "Samsung", type: "phone" }, { name: "Galaxy Gear", brand: "Samsung", type: "watch" }, ]) .unique((item) => item.brand) .toArray(); // [ // { name: "iPhone 6", brand: "Apple", type: "phone" }, // { name: "Galaxy S6", brand: "Samsung", type: "phone" }, // ] ``` ### copy[​](#copy "Direct link to copy") The `copy` method returns a copy of the collection.
``` import { ListCollection } from "@daiso-tech/core/collection"; const collectionA = new ListCollection([1, 2, 3, 4]); const collectionB = collectionA.copy(); // Logs false console.log(collectionA === collectionB); // Logs false console.log(collectionA.toArray() === collectionB.toArray()); ``` ### when[​](#when "Direct link to when") The `when` method will execute `callback` when `condition` evaluates to true. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]) .when(true, (c) => c.append([-3])) .when(false, (c) => c.append([20])) .toArray(); // [1, 2, 3, 4, -3] ``` ### whenEmpty[​](#whenempty "Direct link to whenEmpty") The `whenEmpty` method will execute `callback` when the collection is empty. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([]).whenEmpty((c) => c.append([-3])).toArray(); // [-3] ``` ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1]).whenEmpty((c) => c.append([-3])).toArray(); // [1] ``` ### whenNot[​](#whennot "Direct link to whenNot") The `whenNot` method will execute `callback` when `condition` evaluates to false. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 3, 4]) .whenNot(true, (c) => c.append([-3])) .whenNot(false, (c) => c.append([20])) .toArray(); // [1, 2, 3, 4, 20] ``` ### whenNotEmpty[​](#whennotempty "Direct link to whenNotEmpty") The `whenNotEmpty` method will execute `callback` when the collection is not empty. 
``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([]).whenNotEmpty((c) => c.append([-3])).toArray(); // [] ``` ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1]).whenNotEmpty((c) => c.append([-3])).toArray(); // [1, -3] ``` ## Static methods[​](#static-methods-1 "Direct link to Static methods") ### concat[​](#concat "Direct link to concat") The `concat` method concatenates multiple `Iterable`s and returns a new collection. ``` import { ListCollection } from "@daiso-tech/core/collection"; ListCollection.concat([1, 2, 3], [4, 5, 6]).toArray(); // [1, 2, 3, 4, 5, 6] ``` ### difference[​](#difference "Direct link to difference") The `difference` method will return the values in the original collection that are not present in the given `Iterable`. By default the equality check occurs on the item itself. ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([1, 2, 2, 3, 4, 5]).difference([2, 4, 6, 8]).toArray(); // [1, 3, 5] ``` Equality check on a field in the item: ``` import { ListCollection } from "@daiso-tech/core/collection"; new ListCollection([ { name: "iPhone 6", brand: "Apple", type: "phone" }, { name: "iPhone 5", brand: "Apple", type: "phone" }, { name: "Apple Watch", brand: "Apple", type: "watch" }, { name: "Galaxy S6", brand: "Samsung", type: "phone" }, { name: "Galaxy Gear", brand: "Samsung", type: "watch" }, ]) .difference( [{ name: "Apple Watch", brand: "Apple", type: "watch" }], // equality check occurs on product.type (product) => product.type, ) .toArray(); // [ // { name: "iPhone 6", brand: "Apple", type: "phone" }, // { name: "iPhone 5", brand: "Apple", type: "phone" }, // { name: "Galaxy S6", brand: "Samsung", type: "phone" }, // ] ``` The `difference` method is also available as a static method.
``` ListCollection.difference([1, 2, 2, 3, 4, 5], [2, 4, 6, 8]).toArray(); // [1, 3, 5] ``` ### zip[​](#zip "Direct link to zip") The `zip` method merges the values of the given `Iterable` with the values of the collection at their corresponding index. The returned collection has the size of the shortest collection. The `zip` method is also available as a static method. ``` import { ListCollection } from "@daiso-tech/core/collection"; ListCollection.zip(["Chair", "Desk"], [100, 200]).toArray(); // [["Chair", 100], ["Desk", 200]] ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/collection`](https://daiso-tech.github.io/daiso-core/modules/Collection.html) API docs. --- # Configuring EventBus adapters ## MemoryEventBusAdapter[​](#memoryeventbusadapter "Direct link to MemoryEventBusAdapter") To use the `MemoryEventBusAdapter` you only need to create an instance of it: ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; const eventBusAdapter = new MemoryEventBusAdapter(); ``` You can also provide an `EventEmitter` that will be used for dispatching the events in memory: ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { EventEmitter } from "node:events"; const eventEmitter = new EventEmitter(); const eventBusAdapter = new MemoryEventBusAdapter(eventEmitter); ``` info `MemoryEventBusAdapter` lets you test your app without external dependencies like `Redis`, making it ideal for local development, unit tests, integration tests, and fast E2E tests for the backend application. ## RedisPubSubEventBusAdapter[​](#redispubsubeventbusadapter "Direct link to RedisPubSubEventBusAdapter") To use the `RedisPubSubEventBusAdapter`, you'll need to: 1.
Install the required dependency: the [`ioredis`](https://www.npmjs.com/package/ioredis) package. 2. Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const client = new Redis("YOUR_REDIS_CONNECTION_STRING"); const serde = new Serde(new SuperJsonSerdeAdapter()); const eventBusAdapter = new RedisPubSubEventBusAdapter({ client, serde, }); ``` ## NoOpEventBusAdapter[​](#noopeventbusadapter "Direct link to NoOpEventBusAdapter") The `NoOpEventBusAdapter` is a no-operation implementation: it performs no actions when called. ``` import { NoOpEventBusAdapter } from "@daiso-tech/core/event-bus/no-op-event-bus-adapter"; const noOpEventBusAdapter = new NoOpEventBusAdapter(); ``` info The `NoOpEventBusAdapter` is useful when you want to mock out or disable your [`EventBus`](https://daiso-tech.github.io/daiso-core/classes/EventBus.EventBus.html) class. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/event-bus`](https://daiso-tech.github.io/daiso-core/modules/EventBus.html) API docs. --- # Creating EventBus adapters ## Implementing your custom IEventBusAdapter[​](#implementing-your-custom-ieventbusadapter "Direct link to Implementing your custom IEventBusAdapter") In order to create an adapter you need to implement the [`IEventBusAdapter`](https://daiso-tech.github.io/daiso-core/types/EventBus.IEventBusAdapter.html) contract. ## Testing your custom IEventBusAdapter[​](#testing-your-custom-ieventbusadapter "Direct link to Testing your custom IEventBusAdapter") We provide a complete test suite to verify your event bus adapter implementation.
Simply use the [`eventBusAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/EventBus.eventBusAdapterTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` // filename: MyEventBusAdapter.test.ts import { describe, test, beforeEach, expect } from "vitest"; import { eventBusAdapterTestSuite } from "@daiso-tech/core/event-bus/test-utilities"; import { MyEventBusAdapter } from "./MyEventBusAdapter.js"; describe("class: MyEventBusAdapter", () => { eventBusAdapterTestSuite({ createAdapter: () => new MyEventBusAdapter(), test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom IEventBus class[​](#implementing-your-custom-ieventbus-class "Direct link to Implementing your custom IEventBus class") In some cases, you may need to implement a custom [`EventBus`](https://daiso-tech.github.io/daiso-core/modules/EventBus.html) class to optimize performance for your specific technology stack. You can then directly implement the [`IEventBus`](https://daiso-tech.github.io/daiso-core/types/EventBus.IEventBus.html) contract. ## Testing your custom IEventBus class[​](#testing-your-custom-ieventbus-class "Direct link to Testing your custom IEventBus class") We provide a complete test suite to verify your custom event bus class implementation. 
Simply use the [`eventBusTestSuite`](https://daiso-tech.github.io/daiso-core/functions/EventBus.eventBusTestSuite.html) function: * Preconfigured Vitest test cases * Standardized event bus behavior validation * Common edge case coverage Usage example: ``` // filename: MyEventBus.test.ts import { describe, test, beforeEach, expect } from "vitest"; import { eventBusTestSuite } from "@daiso-tech/core/event-bus/test-utilities"; import { MyEventBus } from "./MyEventBus.js"; describe("class: MyEventBus", () => { eventBusTestSuite({ test, expect, describe, beforeEach, createEventBus: () => new MyEventBus(), }); }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/event-bus`](https://daiso-tech.github.io/daiso-core/modules/EventBus.html) API docs. --- # EventBusResolver The `EventBusResolver` class provides a flexible way to configure and switch between different event bus adapters at runtime. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `EventBusResolver` class, you will need to register all required adapters during initialization.
``` import { EventBusResolver } from "@daiso-tech/core/event-bus"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const eventBusResolver = new EventBusResolver({ adapters: { memory: new MemoryEventBusAdapter(), redis: new RedisPubSubEventBusAdapter({ client: new Redis("YOUR_REDIS_CONNECTION_STRING"), serde, }), }, // You can set an optional default adapter defaultAdapter: "memory", }); ``` ## Usage[​](#usage "Direct link to Usage") ### 1. Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1. Using the default adapter") ``` await eventBusResolver.use().dispatch("add", { a: 1, b: 2 }); ``` danger Note that if you don't set a default adapter, an error will be thrown. ### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2. Specifying an adapter explicitly") ``` await eventBusResolver.use("redis").dispatch("add", { a: 1, b: 2 }); ``` danger Note that if you specify a non-existent adapter, an error will be thrown. ### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3.
Overriding default settings") ``` import { z } from "zod"; await eventBusResolver .setNamespace(new Namespace("@my-namespace")) // You can overide the event type by calling setEventMapType or setEventMapSchema method again .setEventMapType<{ add: { a: 1; b: 2; }; }>() .setEventMapSchema({ sub: z.object({ c: z.number(), d: z.number(), }), }) .use("redis") .dispatch("sub", { c: 1, d: 2, }); ``` info Note that the `EventBusResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/event-bus`](https://daiso-tech.github.io/daiso-core/modules/EventBus.html) API docs. --- # EventBus usage The `@daiso-tech/core/event-bus` component provides a way for dispatching and listening to events independent of underlying technology. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `EventBus` class, you'll need to create and configure an instance: ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import type { IEventBus } from "@daiso-tech/core/event-bus/contracts"; import { EventBus } from "@daiso-tech/core/event-bus"; const eventBus: IEventBus = new EventBus({ // You can choose the adapter to use adapter: new MemoryEventBusAdapter(), }); ``` info Here is a complete list of settings for the [`EventBus`](https://daiso-tech.github.io/daiso-core/types/EventBus.EventBusSettingsBase.html) class. 
## Event handling basics[​](#event-handling-basics "Direct link to Event handling basics") ### Registering Listeners and Dispatching Events[​](#registering-listeners-and-dispatching-events "Direct link to Registering Listeners and Dispatching Events") Event listeners can be added to respond to specific events: ``` await eventBus.addListener("add", (event) => { console.log(event); }); await eventBus.dispatch("add", { a: 5, b: 5, }); ``` ### Listener management[​](#listener-management "Direct link to Listener management") To properly remove a listener, you must pass the same function reference that was registered: ``` import type { BaseEvent } from "@daiso-tech/core/event-bus/contracts"; const listener = (event: BaseEvent) => { console.log(event); }; await eventBus.addListener("add", listener); await eventBus.removeListener("add", listener); // The listener is removed before dispatch and won't be triggered. await eventBus.dispatch("add", { a: 5, b: 5, }); ``` ## Patterns[​](#patterns "Direct link to Patterns") ### Compile time type safety[​](#compile-time-type-safety "Direct link to Compile time type safety") An event map can be used to strictly type the events: ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; type AddEvent = { a: number; b: number; }; type EventMap = { add: AddEvent; }; const eventBus = new EventBus<EventMap>({ adapter: new MemoryEventBusAdapter(), }); // A TypeScript error will show up because the event name doesn't exist. await eventBus.dispatch("addd", { a: 2, b: 2, }); // A TypeScript error will show up because the event fields don't match await eventBus.dispatch("add", { nbr1: 1, nbr2: 2, }); // A TypeScript error will show up because the event name doesn't exist.
await eventBus.addListener("addd", (event) => { console.log(event); }); ``` ### Runtime type safety[​](#runtime-type-safety "Direct link to Runtime type safety") You can enforce runtime and compile-time type safety by passing a [standard schema](https://standardschema.dev/) to the event bus: ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; import { z } from "zod"; const eventMapSchema = { add: z.object({ a: z.number(), b: z.number(), }), }; // The event type will be inferred const eventBus = new EventBus({ adapter: new MemoryEventBusAdapter(), eventMapSchema, }); // A TypeScript and runtime error will show up because the event fields don't match await eventBus.dispatch("add", { nbr1: 1, nbr2: 2, }); ``` ### Subscribe method[​](#subscribe-method "Direct link to Subscribe method") The subscription pattern provides automatic cleanup through an unsubscribe function: ``` const unsubscribe = await eventBus.subscribe("add", (event) => { console.log(event); }); await eventBus.dispatch("add", { a: 20, b: 5, }); await unsubscribe(); ``` ### One-Time event handling[​](#one-time-event-handling "Direct link to One-Time event handling") For listeners that should only trigger once: ``` await eventBus.listenOnce("add", (event) => { console.log(event); }); // The listener will only be triggered here await eventBus.dispatch("add", { a: 5, b: 5, }); // The listener will not be triggered because it is removed after the first dispatch. await eventBus.dispatch("add", { a: 3, b: 3, }); ``` You can also cancel one-time listeners before they trigger: ``` import type { BaseEvent } from "@daiso-tech/core/event-bus/contracts"; const listener = (event: BaseEvent) => { console.log(event); }; await eventBus.listenOnce("add", listener); await eventBus.removeListener("add", listener); // The listener is removed before dispatch and won't be triggered.
await eventBus.dispatch("add", { a: 5, b: 5, }); ``` The `subscribeOnce` method creates a one-time listener and returns an unsubscribe function: ``` const unsubscribe = await eventBus.subscribeOnce("add", (event) => { console.log(event); }); await unsubscribe(); // The listener was unsubscribed and won't be triggered. await eventBus.dispatch("add", { a: 5, b: 5, }); ``` ### Promise-based event handling[​](#promise-based-event-handling "Direct link to Promise-based event handling") Wait for events using promises: ``` import { delay } from "@daiso-tech/core/utilities"; import { TimeSpan } from "@daiso-tech/core/time-span"; // Register the promise before dispatching the event. const eventPromise = eventBus.asPromise("add"); await delay(TimeSpan.fromSeconds(1)); await eventBus.dispatch("add", { a: 30, b: 20, }); const event = await eventPromise; ``` ### Listening to multiple events[​](#listening-to-multiple-events "Direct link to Listening to multiple events") The `addListener`, `removeListener`, and `subscribe` methods all accept either a single event name or an array of event names, allowing you to register one listener for multiple events at once: ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; type AddEvent = { a: number; b: number; }; type RemoveEvent = { id: number; }; type EventMap = { add: AddEvent; remove: RemoveEvent; }; const eventBus = new EventBus<EventMap>({ adapter: new MemoryEventBusAdapter(), }); // The same listener handles both "add" and "remove" events await eventBus.addListener(["add", "remove"], (event) => { console.log("EVENT:", event); // event.type will be "add" or "remove" depending on which was dispatched }); await eventBus.dispatch("add", { a: 1, b: 2 }); await eventBus.dispatch("remove", { id: 42 }); ``` You can also use `subscribe` to get a single cleanup function that unsubscribes from all listed events at once: ``` const unsubscribe = await eventBus.subscribe(["add", "remove"], (event) => { console.log("EVENT:", event); }); await eventBus.dispatch("add", { a: 1, b: 2 }); await eventBus.dispatch("remove", { id: 42 }); // Unsubscribes from both
"add" and "remove" in one call await unsubscribe(); ``` ### Separating dispatching and listening[​](#separating-dispatching-and-listening "Direct link to Separating dispatching and listening") The library includes two additional contracts: * [`IEventDispatcher`](https://daiso-tech.github.io/daiso-core/types/EventBus.IEventDispatcher.html) - Allows only event dispatching. * [`IEventListenable`](https://daiso-tech.github.io/daiso-core/types/EventBus.IEventListenable.html) - Allows only event listening. This separation makes it easy to visually distinguish the two contracts, making it immediately obvious that they serve different purposes. ``` import type { IEventBus, IEventListenable, IEventDispatcher, } from "@daiso-tech/core/event-bus/contracts"; import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; type AddEvent = { a: number; b: number; }; type EventMap = { add: AddEvent; }; async function listenerFunc( eventListenable: IEventListenable<EventMap>, ): Promise<void> { // You cannot access the dispatch method // You will get a TypeScript error if you try await eventListenable.addListener("add", (event) => { console.log("EVENT:", event); }); } async function dispatchingFunc( eventDispatcher: IEventDispatcher<EventMap>, ): Promise<void> { // You cannot access the listener methods // You will get a TypeScript error if you try await eventDispatcher.dispatch("add", { a: 20, b: 5, }); } const eventBus: IEventBus<EventMap> = new EventBus<EventMap>({ // You can choose the adapter to use adapter: new MemoryEventBusAdapter(), }); await listenerFunc(eventBus); await dispatchingFunc(eventBus); ``` ### Invokable listeners[​](#invokable-listeners "Direct link to Invokable listeners") An event listener is `Invokable`, meaning you can also pass in an object (class instance or object literal) as a listener: info For further information refer to the [`Invokable`](/docs/utilities/invokable.md) docs.
``` import type { IEventListenerObject } from "@daiso-tech/core/event-bus/contracts"; type AddEvent = { a: number; b: number; }; class Listener implements IEventListenerObject<AddEvent> { private count = 0; invoke(event: AddEvent): void { console.log("EVENT:", event); console.log("COUNT:", this.count); this.count++; } } await eventBus.addListener("add", new Listener()); await eventBus.dispatch("add", { a: 1, b: 2, }); await eventBus.dispatch("add", { a: 3, b: -1, }); ``` ### Namespacing[​](#namespacing "Direct link to Namespacing") You can use the `Namespace` class to group related events without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract. info For further information about namespacing refer to [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation. ``` import { Namespace } from "@daiso-tech/core/namespace"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const client = new Redis("YOUR_REDIS_CONNECTION_STRING"); const serde = new Serde(new SuperJsonSerdeAdapter()); const eventBusA = new EventBus({ namespace: new Namespace("@eventBus-a"), adapter: new RedisPubSubEventBusAdapter({ client, serde, }), }); const eventBusB = new EventBus({ namespace: new Namespace("@eventBus-b"), adapter: new RedisPubSubEventBusAdapter({ client, serde, }), }); await eventBusA.addListener("test", (event) => { console.log("TEST_A:", event); }); await eventBusB.addListener("test", (event) => { console.log("TEST_B:", event); }); // Will only log "TEST_A" { testA: true } await eventBusA.dispatch("test", { testA: true, }); // Will only log "TEST_B" { testB: true } await eventBusB.dispatch("test", { testB: true, }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to
[`@daiso-tech/core/event-bus`](https://daiso-tech.github.io/daiso-core/modules/EventBus.html) API docs. --- # Execution Context The `@daiso-tech/core/execution-context` module provides a type-safe, composable, and environment-agnostic way to store and propagate contextual data (such as request IDs, user info, or tracing metadata) across async boundaries and function calls. It is inspired by thread-local storage and context propagation in distributed systems, but is designed for modern TypeScript/JavaScript applications. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the execution context, you'll need to create and configure an instance: ``` import { ExecutionContext, contextToken, } from "@daiso-tech/core/execution-context"; import { AlsExecutionContextAdapter } from "@daiso-tech/core/execution-context/als-execution-context-adapter"; // Create an execution context instance with an adapter const executionContext = new ExecutionContext(new AlsExecutionContextAdapter()); ``` ## Execution context basics[​](#execution-context-basics "Direct link to Execution context basics") ### Running code with context[​](#running-code-with-context "Direct link to Running code with context") You can run code within a context boundary, and all context values will be accessible throughout the call chain: ``` import { Namespace } from "@daiso-tech/core/namespace"; // Define context tokens using namespaced IDs to avoid collisions const namespace = new Namespace("myapp"); const userToken = contextToken<{ id: string; name: string }>( namespace.id("user"), ); const requestIdToken = contextToken(namespace.id("requestId")); function logData(): void { // Access context values later in the call chain // { id: "123", name: "Alice" } const user = executionContext.get(userToken); // "req-456" const reqId = executionContext.get(requestIdToken); console.log("user:", user); console.log("reqId:", reqId); } executionContext.run(() => { 
executionContext.put(userToken, { id: "123", name: "Alice" }); executionContext.put(requestIdToken, "req-456"); logData(); }); ``` ### Binding functions to context[​](#binding-functions-to-context "Direct link to Binding functions to context") You can bind a function to the current context, so it always executes with the captured context values: ``` executionContext.run(() => { executionContext.put(userToken, { id: "123", name: "Alice" }); executionContext.put(requestIdToken, "req-456"); const logData = executionContext.bind((msg: string): void => { // Access context values later in the call chain const user = executionContext.get(userToken); // { id: "123", name: "Alice" } const reqId = executionContext.get(requestIdToken); // "req-456" console.log("message:", msg); console.log("user:", user); console.log("reqId:", reqId); }); logData("hello"); }); ``` ## Patterns[​](#patterns "Direct link to Patterns") ### Type safety with context tokens[​](#type-safety-with-context-tokens "Direct link to Type safety with context tokens") You can enforce compile-time type safety by defining context tokens with specific types: ``` const userToken = contextToken<{ id: string; name: string }>("user"); executionContext.put(userToken, { id: "123", name: "Alice" }); // TypeScript will error if you try to put a value of the wrong type. 
``` ### Immutable and chainable context operations[​](#immutable-and-chainable-context-operations "Direct link to Immutable and chainable context operations") All context mutation methods return the context instance, allowing for method chaining: ``` executionContext .put(userToken, { id: "123", name: "Alice" }) .put(requestIdToken, "req-456"); ``` ### Conditional context updates[​](#conditional-context-updates "Direct link to Conditional context updates") You can conditionally update the context: ``` executionContext.when(true, (ctx) => ctx.put(userToken, { id: "conditional", name: "Bob" }), ); ``` ### Adapters[​](#adapters "Direct link to Adapters") * **`AlsExecutionContextAdapter`**: Uses Node.js AsyncLocalStorage for async context propagation. * **`NoOpExecutionContextAdapter`**: No-op adapter for testing or environments without async context support. ### Separating reading, updating, and execution concerns[​](#separating-reading-updating-and-execution-concerns "Direct link to Separating reading, updating, and execution concerns") The library includes several contracts that separate concerns for different use cases: * `IReadableContext`: Read-only access to context values (safe for consumers that should not mutate context). * `IContext`: Adds all mutation methods (put, update, remove, etc.) for full context management. * `IExecutionContextBase`: Adds execution boundary methods (run, bind) for context propagation and isolation. * `IExecutionContext`: Combines all of the above for complete context management and execution. #### `IExecutionContextBase`[​](#iexecutioncontextbase "Direct link to iexecutioncontextbase") * `run(invokable)` — Runs a function within the current execution context. All context values are accessible during execution. * `bind(fn)` — Returns a new function that, when called, executes the original function within the captured context. 
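To make the `run` and `bind` semantics concrete, here is a rough, self-contained sketch of the propagation mechanism that the `AlsExecutionContextAdapter` is presumably built on: Node's `AsyncLocalStorage`. This is an illustration of the concept only, not the library's implementation; the `run` and `bind` helpers below are hypothetical stand-ins.

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// A store per context boundary; the library's adapter presumably wraps
// something similar behind the IExecutionContext contract.
const als = new AsyncLocalStorage<Map<string, unknown>>();

// `run` establishes a context boundary: values stored inside it are
// visible anywhere in the call chain below.
function run<T>(fn: () => T): T {
    return als.run(new Map(), fn);
}

// `bind` captures the currently active store so the returned function
// always executes against it, even when invoked outside the boundary.
function bind<TArgs extends unknown[], TReturn>(
    fn: (...args: TArgs) => TReturn,
): (...args: TArgs) => TReturn {
    const captured = als.getStore();
    return (...args) => als.run(captured ?? new Map(), () => fn(...args));
}

let observed: unknown;
run(() => {
    als.getStore()?.set("requestId", "req-456");
    const log = bind(() => {
        observed = als.getStore()?.get("requestId");
    });
    // Even if `log` were called later (e.g. from a callback queue),
    // it would still see the captured context.
    log();
});
// observed === "req-456"
```

The takeaway is that `bind` decouples *where* a function runs from *which* context it observes, which is exactly what the `bind(fn)` contract method provides.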
#### `IContext`[​](#icontext "Direct link to icontext") * `add(token, value)` — Adds a value only if it doesn't already exist. No-op if the key exists. * `put(token, value)` — Sets or overwrites a value for the token. * `putIncrement(token, settings?)` — Initializes (if missing) and increments a numeric value. Optional max cap. * `putDecrement(token, settings?)` — Initializes (if missing) and decrements a numeric value. Optional min floor. * `putPush(token, ...values)` — Initializes (if missing) and pushes values to an array. * `update(token, value)` — Updates a value only if it exists. No-op if missing. * `updateIncrement(token, settings?)` — Increments a numeric value only if it exists. Optional max cap. * `updateDecrement(token, settings?)` — Decrements a numeric value only if it exists. Optional min floor. * `updatePush(token, ...values)` — Pushes values to an array only if it exists. No-op if missing. * `remove(token)` — Removes a value from the context. * `when(condition, ...invokables)` — Conditionally applies operations if the condition is true. #### `IReadableContext`[​](#ireadablecontext "Direct link to ireadablecontext") * `contains(token, matchValue)` — Checks if an array context value contains a specific item or matches a predicate. * `exists(token)` — Checks if a value exists for the token. * `missing(token)` — Checks if a value is missing for the token. * `get(token)` — Retrieves a value or null if not found. * `getOr(token, defaultValue)` — Retrieves a value or returns the provided default if not found. * `getOrFail(token)` — Retrieves a value or throws if not found. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/execution-context`](https://daiso-tech.github.io/daiso-core/modules/ExecutionContext.html) API docs. --- # FileSize The `@daiso-tech/core/file-size` component provides an easy way for defining, manipulating, and comparing file size. 
Furthermore, it is designed for easy integration with external file size libraries. ### Creating a FileSize[​](#creating-a-filesize "Direct link to Creating a FileSize") Creating `FileSize` from bytes: ``` import { FileSize } from "@daiso-tech/core/file-size"; const fileSize = FileSize.fromBytes(100); ``` Creating `FileSize` from kilobytes: ``` import { FileSize } from "@daiso-tech/core/file-size"; const fileSize = FileSize.fromKiloBytes(100); ``` Creating `FileSize` from megabytes: ``` import { FileSize } from "@daiso-tech/core/file-size"; const fileSize = FileSize.fromMegaBytes(100); ``` Creating `FileSize` from gigabytes: ``` import { FileSize } from "@daiso-tech/core/file-size"; const fileSize = FileSize.fromGigaBytes(100); ``` Creating `FileSize` from terabytes: ``` import { FileSize } from "@daiso-tech/core/file-size"; const fileSize = FileSize.fromTeraBytes(1); ``` Creating `FileSize` from petabytes: ``` import { FileSize } from "@daiso-tech/core/file-size"; const fileSize = FileSize.fromPetaBytes(1); ``` ### Comparing FileSizes[​](#comparing-filesize "Direct link to comparing-filesize") Equals: ``` // Returns false FileSize.fromBytes(20_000).equal(FileSize.fromBytes(40_000)); ``` Greater than: ``` // Returns false FileSize.fromBytes(20_000).gt(FileSize.fromBytes(40_000)); ``` Greater than or equals: ``` // Returns false FileSize.fromBytes(20_000).gte(FileSize.fromBytes(40_000)); ``` Less than: ``` // Returns true FileSize.fromBytes(20_000).lt(FileSize.fromBytes(40_000)); ``` Less than or equals: ``` // Returns true FileSize.fromBytes(20_000).lte(FileSize.fromBytes(40_000)); ``` ### Converting a FileSize[​](#converting-a-filesize "Direct link to Converting a FileSize") You can get the number of bytes contained in the `FileSize`: ``` FileSize.fromKiloBytes(1).toBytes(); ``` You can get the number of kilobytes contained in the `FileSize`: ``` FileSize.fromMegaBytes(1).toKiloBytes(); ``` You can get the number of megabytes contained in the `FileSize`: ``` FileSize.fromGigaBytes(1).toMegaBytes(); ``` You can get the number of gigabytes contained in the `FileSize`: ```
FileSize.fromTeraBytes(1).toGigaBytes(); ``` You can get the number of terabytes contained in the `FileSize`: ``` FileSize.fromPetaBytes(1).toTeraBytes(); ``` You can get the number of petabytes contained in the `FileSize`: ``` FileSize.fromPetaBytes(1000).toPetaBytes(); ``` ### Serialization and deserialization of FileSize[​](#serialization-and-deserialization-of-filesize "Direct link to Serialization and deserialization of FileSize") The `FileSize` class supports serialization and deserialization, allowing you to easily convert instances to and from serialized formats. However, registration is required first: ``` import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import { FileSize } from "@daiso-tech/core/file-size"; const serde = new Serde(new SuperJsonSerdeAdapter()); serde.registerClass(FileSize); const fileSize = FileSize.fromBytes(12); const serializedFileSize = serde.serialize(fileSize); const deserializedFileSize = serde.deserialize(serializedFileSize); // Logs false, since deserialization returns a new instance console.log(fileSize === deserializedFileSize); ``` ## FileSize contract[​](#filesize-contract "Direct link to FileSize contract") The `IFileSize` contract provides a standardized way to express a file size as bytes. Key components like `FileStorage` rely on this contract, ensuring they are not tightly coupled to a specific file size implementation. This decoupling is crucial for interoperability, allowing seamless integration with external file size libraries. To integrate a new library, its file size objects must simply implement the `IFileSize` contract. info Note that the `FileSize` class implements the `IFileSize` contract. The `IFileSize` contract requires you to implement the `TO_BYTES` method on the file size object, which must return the file size in bytes.
```
import { IFileSize, TO_BYTES } from "@daiso-tech/core/file-size/contracts";

export class MyFileSize implements IFileSize {
    constructor(private readonly fileSizeInBytes: number) {}

    [TO_BYTES](): number {
        return this.fileSizeInBytes;
    }
}
```

## Further information[​](#further-information "Direct link to Further information")

For further information refer to [`@daiso-tech/core/file-size`](https://daiso-tech.github.io/daiso-core/modules/FileSize.html) API docs.

---

# Configuring file-storage adapters

## MemoryFileStorageAdapter[​](#memoryfilestorageadapter "Direct link to MemoryFileStorageAdapter")

To use the `MemoryFileStorageAdapter` you only need to create an instance of it:

```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";

const memoryFileStorageAdapter = new MemoryFileStorageAdapter();
```

You can also provide a `Map` that will be used for storing the files in memory:

```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";

const map = new Map();
const memoryFileStorageAdapter = new MemoryFileStorageAdapter(map);
```

info `MemoryFileStorageAdapter` lets you test your app without external dependencies like `@aws-sdk/client-s3`, making it ideal for local development, unit tests, integration tests and fast E2E tests for the backend application.

warning Note this adapter doesn't have support for creating signed upload URLs, signed download URLs or public URLs.
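The `Map`-backed storage described above can be pictured as a plain dictionary from file names to byte arrays. The sketch below only illustrates that idea and mirrors the return-value semantics documented later in this guide (`add` returns true only when the file did not exist); the class name and methods are illustrative, not the library's `MemoryFileStorageAdapter` implementation:

```typescript
// Conceptual sketch of a Map-backed in-memory file store.
class InMemoryFileStore {
    constructor(private readonly files = new Map<string, Uint8Array>()) {}

    // Returns true only when the file did not exist yet.
    add(key: string, data: Uint8Array): boolean {
        if (this.files.has(key)) return false;
        this.files.set(key, data);
        return true;
    }

    // Returns the stored bytes, or null when the file is missing.
    get(key: string): Uint8Array | null {
        return this.files.get(key) ?? null;
    }

    // Returns true only when an existing file was removed.
    remove(key: string): boolean {
        return this.files.delete(key);
    }
}

const store = new InMemoryFileStore();
console.log(store.add("file.txt", new TextEncoder().encode("CONTENT"))); // true
console.log(store.add("file.txt", new TextEncoder().encode("OTHER"))); // false
```

Passing your own `Map` to the real adapter works the same way: the adapter reads and writes entries in the map you supplied, which is handy for inspecting state in tests.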
## FsFileStorageAdapter[​](#fsfilestorageadapter "Direct link to FsFileStorageAdapter")

To use the `FsFileStorageAdapter` you only need to create an instance of it:

```
import { FsFileStorageAdapter } from "@daiso-tech/core/file-storage/fs-file-storage-adapter";

const fsFileStorageAdapter = new FsFileStorageAdapter();
```

You can configure the root folder:

```
import { FsFileStorageAdapter } from "@daiso-tech/core/file-storage/fs-file-storage-adapter";

const fsFileStorageAdapter = new FsFileStorageAdapter({
    location: "/my-custom-location",
});
```

You can configure the codec used for file names:

```
import { Base64Codec } from "@daiso-tech/core/codec/base-64-codec";
import { FsFileStorageAdapter } from "@daiso-tech/core/file-storage/fs-file-storage-adapter";

const fsFileStorageAdapter = new FsFileStorageAdapter({
    codec: new Base64Codec(),
});
```

warning Note encoding and decoding is required for `FsFileStorageAdapter` to maintain a flat hierarchy within the root folder and to ensure compatibility with OS-restricted characters.

warning Note this adapter does not support signed upload, signed download or public URLs. It also doesn't support explicitly setting the content-type; it will instead infer the content-type from the file name.

## S3FileStorageAdapter[​](#s3filestorageadapter "Direct link to S3FileStorageAdapter")

To use the `S3FileStorageAdapter`, you'll need to:

1.
Install the required dependency: [`@aws-sdk/client-s3`](https://www.npmjs.com/package/@aws-sdk/client-s3) package:

```
import { S3Client } from "@aws-sdk/client-s3";
import { S3FileStorageAdapter } from "@daiso-tech/core/file-storage/s3-file-storage-adapter";

const s3Client = new S3Client({
    credentials: {
        accessKeyId: "AWS_ACCESS_KEY_ID",
        secretAccessKey: "AWS_SECRET_ACCESS_KEY",
    },
    region: "AWS_REGION",
});

const s3FileStorageAdapter = new S3FileStorageAdapter({
    client: s3Client,
});
```

Other settings:

```
import { S3Client } from "@aws-sdk/client-s3";
import {
    S3FileStorageAdapter,
    defaultPublicUrlGenerator,
} from "@daiso-tech/core/file-storage/s3-file-storage-adapter";

const s3Client = new S3Client({
    credentials: {
        accessKeyId: "AWS_ACCESS_KEY_ID",
        secretAccessKey: "AWS_SECRET_ACCESS_KEY",
    },
    region: "AWS_REGION",
});

const s3FileStorageAdapter = new S3FileStorageAdapter({
    client: s3Client,
    /**
     * The bucket option defines the S3 bucket to use for managing files.
     */
    bucket: "bucket",
    /**
     * The cdnUrl field can be used to define the base URL for generating a public URL for a file. For example, if you use CloudFront alongside S3 to serve public files, the cdnUrl property should be the CloudFront URL.
     */
    cdnUrl: null,
    /**
     * Defines the ServerSideEncryption option for all objects uploaded to S3.
     */
    serverSideEncryption: "AES256",
    /**
     * If false the put method of ISignedFileStorageAdapter will perform only one database call and thereby always return true even when the file doesn't exist.
     * Note the fewer database calls, the cheaper it is when using AWS S3.
     */
    enableAccuratePut: true,
    /**
     * If false the getSignedDownloadUrl method of ISignedFileStorageAdapter will perform only one database call and thereby always return a string even when the file doesn't exist.
     * Note the fewer database calls, the cheaper it is when using AWS S3.
     */
    enableAccurateDownload: true,
    /**
     * Define a custom public url generator for creating public and signed URLs.
     */
    publicUrlGenerator: defaultPublicUrlGenerator,
});
```

info Note this adapter works with object storage services that are compatible with AWS S3, such as:

* Cloudflare R2
* DigitalOcean Spaces
* Tigris
* Supabase Storage
* MinIO

## NoOpFileStorageAdapter[​](#noopfilestorageadapter "Direct link to NoOpFileStorageAdapter")

The `NoOpFileStorageAdapter` is a no-operation implementation; it performs no actions when called:

```
import { NoOpFileStorageAdapter } from "@daiso-tech/core/file-storage/no-op-file-storage-adapter";

const noOpFileStorageAdapter = new NoOpFileStorageAdapter();
```

info The `NoOpFileStorageAdapter` is useful when you want to mock out or disable your `FileStorage` instance.

## Further information[​](#further-information "Direct link to Further information")

For further information refer to [`@daiso-tech/core/file-storage`](https://daiso-tech.github.io/daiso-core/modules/FileStorage.html) API docs.

---

# Creating file-storage adapters

## Implementing your custom IFileStorageAdapter[​](#implementing-your-custom-ifilestorageadapter "Direct link to Implementing your custom IFileStorageAdapter")

In order to create an adapter you need to implement the [`IFileStorageAdapter`](https://daiso-tech.github.io/daiso-core/types/FileStorage.IFileStorageAdapter.html) contract.

## Implementing your custom ISignedFileStorageAdapter[​](#implementing-your-custom-isignedfilestorageadapter "Direct link to Implementing your custom ISignedFileStorageAdapter")

We provide an additional contract, [`ISignedFileStorageAdapter`](https://daiso-tech.github.io/daiso-core/types/FileStorage.ISignedFileStorageAdapter.html), for building custom file-storage adapters with support for creating signed download and upload URLs.
## Implementing your custom IFileStorage class[​](#implementing-your-custom-ifilestorage-class "Direct link to Implementing your custom IFileStorage class")

In some cases, you may need to implement a custom [`FileStorage`](https://daiso-tech.github.io/daiso-core/classes/FileStorage.FileStorage.html) class to optimize performance for your specific technology stack. You can then directly implement the [`IFileStorage`](https://daiso-tech.github.io/daiso-core/types/FileStorage.IFileStorage.html) contract.

## Further information[​](#further-information "Direct link to Further information")

For further information refer to [`@daiso-tech/core/file-storage`](https://daiso-tech.github.io/daiso-core/modules/FileStorage.html) API docs.

---

# FileStorageResolver

The `FileStorageResolver` class provides a flexible way to configure and switch between different file-storage adapters at runtime.

## Initial configuration[​](#initial-configuration "Direct link to Initial configuration")

To begin using the `FileStorageResolver`, you will need to register all required adapters during initialization.

```
import { FileStorageResolver } from "@daiso-tech/core/file-storage";
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FsFileStorageAdapter } from "@daiso-tech/core/file-storage/fs-file-storage-adapter";

const fileStorageResolver = new FileStorageResolver({
    adapters: {
        memory: new MemoryFileStorageAdapter(),
        fs: new FsFileStorageAdapter(),
    },
    // You can set an optional default adapter
    defaultAdapter: "memory",
});
```

## Usage[​](#usage "Direct link to Usage")

### 1. Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1. Using the default adapter")

```
await fileStorageResolver
    .use()
    .create("file.txt")
    .add({ data: "Text file content" });
```

danger Note that if you don't set a default adapter, an error will be thrown.

### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2.
Specifying an adapter explicitly")

```
await fileStorageResolver
    .use("fs")
    .create("file.txt")
    .add({ data: "Text file content" });
```

danger Note that if you specify a non-existent adapter, an error will be thrown.

### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3. Overriding default settings")

```
import { Namespace } from "@daiso-tech/core/namespace";

await fileStorageResolver
    .setNamespace(new Namespace("@my-namespace"))
    .use("fs")
    .create("file.txt")
    .add({ data: "Text file content" });
```

info Note that the `FileStorageResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one.

## Further information[​](#further-information "Direct link to Further information")

For further information refer to [`@daiso-tech/core/file-storage`](https://daiso-tech.github.io/daiso-core/modules/FileStorage.html) API docs.

---

# FileStorage usage

The `@daiso-tech/core/file-storage` component provides a way for managing files independent of the underlying platform or storage.

## Initial configuration[​](#initial-configuration "Direct link to Initial configuration")

To begin using the `FileStorage` class, you'll need to create and configure an instance:

```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FileStorage } from "@daiso-tech/core/file-storage";

const fileStorage = new FileStorage({
    // You can provide a defaultContentType value; the default is application/octet-stream
    defaultContentType: "text/plain",
    // You can choose the adapter to use
    adapter: new MemoryFileStorageAdapter(),
});
```

info Here is a complete list of settings for the [`FileStorage`](https://daiso-tech.github.io/daiso-core/types/FileStorage.FileStorage.html) class.
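The `defaultContentType` fallback described above (look the type up from the file extension, fall back to a configurable default) can be pictured with a small standalone helper. The `KNOWN_TYPES` table and the `inferContentType` name are illustrative and not part of the library:

```typescript
// A sketch of extension-based content-type inference with a configurable
// fallback, mirroring the documented defaultContentType behavior.
const KNOWN_TYPES: Record<string, string> = {
    ".txt": "text/plain",
    ".json": "application/json",
    ".png": "image/png",
};

function inferContentType(
    key: string,
    defaultContentType = "application/octet-stream",
): string {
    const dotIndex = key.lastIndexOf(".");
    // No extension at all: fall back to the default.
    if (dotIndex === -1) return defaultContentType;
    // Unknown extension: also fall back to the default.
    return KNOWN_TYPES[key.slice(dotIndex)] ?? defaultContentType;
}

console.log(inferContentType("key-a.txt")); // "text/plain"
console.log(inferContentType("file.bin")); // "application/octet-stream"
```

Setting `defaultContentType: "text/plain"` as in the configuration above simply changes what that fallback resolves to.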
## FileStorage basics[​](#filestorage-basics "Direct link to FileStorage basics")

### Creating a file object[​](#creating-a-file-object "Direct link to Creating a file object")

```
const file = fileStorage.create("file.txt");
```

info Note the file object represents a reference to a file and doesn't create the real underlying file.

### Writing buffered files[​](#writing-buffered-files "Direct link to Writing buffered files")

You can add a file; true is returned if the file does not exist:

```
const hasAdded = await fileStorage.create("file.txt").add({ data: "CONTENT" });
```

You can update a file; true will be returned if the file exists and was updated:

```
const hasUpdated = await fileStorage
    .create("file.txt")
    .update({ data: "TEXT 1" });
```

You can upsert a file; true will be returned if the file was updated, otherwise false is returned:

```
const hasUpdated1 = await fileStorage.create("file.txt").put({ data: "TEXT 1" });
const hasUpdated2 = await fileStorage.create("file.txt").put({ data: "TEXT 2" });
```

info Note you can pass the following types to the `add`, `update` and `put` methods:

* `Buffer`
* `ArrayBuffer`
* `SharedArrayBuffer`
* `string`
* `Uint8Array`
* `Int8Array`
* `Uint16Array`
* `Int16Array`
* `Uint32Array`
* `Int32Array`
* `BigUint64Array`
* `BigInt64Array`
* `Float32Array`
* `Float64Array`
* `DataView`

But usually you would use `Uint8Array` because it represents data as bytes.

You can pass additional optional metadata information to `add`, `update` and `put`:

```
const hasAdded = await fileStorage.create("file.txt").add({
    data: "CONTENT",
    /**
     * You can explicitly set a custom Content-Type. If one is not provided, it will be inferred from the key. For example, a key ending in .txt (such as key-a.txt) will be assigned text/plain.
     * If the key contains a non-standard extension it will default to application/octet-stream.
     */
    contentType: "text/plain",
    /**
     * Note a default value is always provided.
To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    contentLanguage: "en-US",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    contentEncoding: "gzip",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    contentDisposition: "inline",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    cacheControl: "no-cache",
});
```

### Writing streamed files[​](#writing-streamed-files "Direct link to Writing streamed files")

You can add a file stream; true is returned if the file does not exist:

```
import { createReadStream } from "node:fs";

const fileStream = createReadStream("./file.txt");
const hasAdded = await fileStorage
    .create("file.txt")
    .addStream({ data: fileStream });
```

You can update a file stream; true will be returned if the file exists and was updated:

```
import { createReadStream } from "node:fs";

const fileStream = createReadStream("./file.txt");
const hasUpdated = await fileStorage
    .create("file.txt")
    .updateStream({ data: fileStream });
```

You can upsert a file stream; true will be returned if the file was updated, otherwise false is returned:

```
import { createReadStream } from "node:fs";

const fileStream = createReadStream("./file.txt");
const hasUpdated1 = await fileStorage
    .create("file.txt")
    .putStream({ data: fileStream });
const hasUpdated2 = await fileStorage
    .create("file.txt")
    .putStream({ data: fileStream });
```

info Note you can pass the following types to the `addStream`, `updateStream` and `putStream` methods:

* `AsyncIterable<Buffer>`
* `AsyncIterable<ArrayBuffer>`
* `AsyncIterable<SharedArrayBuffer>`
* `AsyncIterable<string>`
* `AsyncIterable<Uint8Array>`
* `AsyncIterable<Int8Array>`
* `AsyncIterable<Uint16Array>`
* `AsyncIterable<Int16Array>`
* `AsyncIterable<Uint32Array>`
* `AsyncIterable<Int32Array>`
* `AsyncIterable<BigUint64Array>`
* `AsyncIterable<BigInt64Array>`
* `AsyncIterable<Float32Array>`
* `AsyncIterable<Float64Array>`
* `AsyncIterable<DataView>`

But usually you would use `AsyncIterable<Uint8Array>` because it represents the stream as bytes.

You can pass additional optional metadata information to `addStream`, `updateStream` and `putStream`:

```
import { createReadStream } from "node:fs";

const fileStream = createReadStream("./file.txt");
const hasAdded = await fileStorage.create("file.txt").addStream({
    data: fileStream,
    /**
     * You can explicitly set a custom Content-Type. If one is not provided, it will be inferred from the key. For example, a key ending in .txt (such as key-a.txt) will be assigned text/plain.
     * If the key contains a non-standard extension it will default to application/octet-stream.
     */
    contentType: "text/plain",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    contentLanguage: "en-US",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    contentEncoding: "gzip",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    contentDisposition: "inline",
    /**
     * Note a default value is always provided. To explicitly unset a field and prevent it from being passed to the underlying adapter, pass in `null`.
     */
    cacheControl: "no-cache",
});
```

You can also pass the file size of the stream, which is used for optimizations by some adapters:

```
import { createReadStream } from "node:fs";
import { stat } from "node:fs/promises";
import { FileSize } from "@daiso-tech/core/file-size";

const fileStream = createReadStream("./file.txt");
const { size } = await stat("./file.txt");
const hasAdded = await fileStorage.create("file.txt").addStream({
    data: fileStream,
    fileSize: FileSize.fromBytes(size),
});
```

info It is best practice to pass the file size whenever possible because of these optimizations.

### Retrieving files[​](#retrieving-files "Direct link to Retrieving files")

The file can be read as utf8 text:

```
const content = await fileStorage.create("file.txt").getText();
console.log(content);
```

The file can be read as a `Uint8Array`:

```
const content = await fileStorage.create("file.txt").getBytes();
console.log(content);
```

The file can be read as a Node.js `Buffer`:

```
const content = await fileStorage.create("file.txt").getBuffer();
console.log(content);
```

The file can be read as a web `ArrayBuffer`:

```
const content = await fileStorage.create("file.txt").getArrayBuffer();
console.log(content);
```

The file can be read as a Node.js stream:

```
const content = await fileStorage.create("file.txt").getReadable();
console.log(content);
```

The file can be read as a web stream:

```
const content = await fileStorage.create("file.txt").getReadableStream();
console.log(content);
```

info Note all these methods return null if the file doesn't exist.
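Because `stat` from `node:fs/promises` resolves a promise, forgetting to `await` it is an easy mistake when computing the `fileSize` shown above. A self-contained sketch of measuring a file's byte size; the temp-file path and helper name are illustrative:

```typescript
import { writeFile, stat, rm } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";

async function fileSizeInBytes(path: string): Promise<number> {
    // stat() resolves asynchronously; the await is required.
    const { size } = await stat(path);
    return size;
}

async function main(): Promise<void> {
    const path = join(tmpdir(), "daiso-size-example.txt");
    await writeFile(path, "0123456789"); // 10 ASCII bytes
    console.log(await fileSizeInBytes(path)); // 10
    await rm(path);
}

main();
```

The resulting number is exactly what `FileSize.fromBytes` expects.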
### Checking file existence[​](#checking-file-existence "Direct link to Checking file existence")

You can check if the file exists:

```
const exists = await fileStorage.create("file.txt").exists();
```

You can check if the file doesn't exist:

```
const missing = await fileStorage.create("file.txt").missing();
```

### Removing files[​](#removing-files "Direct link to Removing files")

You can remove a file; true will be returned if the file exists and was removed:

```
const hasRemoved = await fileStorage.create("file.txt").remove();
console.log(hasRemoved);
```

You can remove multiple files; true will be returned when at least one file exists and was removed:

```
const hasRemovedAtLeastOne = await fileStorage.removeMany([
    fileStorage.create("file-1.txt"),
    fileStorage.create("file-2.txt"),
    fileStorage.create("file-3.txt"),
]);
console.log(hasRemovedAtLeastOne);
```

### Retrieving file metadata[​](#retrieving-file-metadata "Direct link to Retrieving file metadata")

You can retrieve the file metadata; null is returned if the file doesn't exist:

```
const metadata = await fileStorage.create("file.txt").getMetadata();
console.log(metadata);
```

The `getMetadata` method returns the [FileMetadata](https://daiso-tech.github.io/daiso-core/types/FileStorage.FileMetadata.html) type.

## Patterns[​](#patterns "Direct link to Patterns")

### Additional methods[​](#additional-methods "Direct link to Additional methods")

These variants are equivalent to the standard methods but throw an error if the file does not exist; in the case of `addOrFail` it throws an error if the file already exists.

* `getTextOrFail`
* `getBytesOrFail`
* `getBufferOrFail`
* `getArrayBufferOrFail`
* `getReadableOrFail`
* `getReadableStreamOrFail`
* `addOrFail`
* `addStreamOrFail`
* `updateOrFail`
* `updateStreamOrFail`
* `removeOrFail`
* `getMetadataOrFail`

### Copying files[​](#copying-files "Direct link to Copying files")

You can copy a file.
True is returned if the source exists and the destination doesn't exist:

```
await fileStorage.create("source.txt").copy("destination.txt");
```

Use the `copyOrFail` method to perform the same operation as the `copy` method, but it throws an error if the source file is missing or the destination exists.

You can copy a file and replace the destination. True is returned if the source exists:

```
await fileStorage.create("source.txt").copyAndReplace("destination.txt");
```

Use the `copyAndReplaceOrFail` method to perform the same operation as the `copyAndReplace` method, but it throws an error if the source file is missing.

### Moving files[​](#moving-files "Direct link to Moving files")

You can move a file. True is returned if the source exists and the destination doesn't exist:

```
await fileStorage.create("source.txt").move("destination.txt");
```

Use the `moveOrFail` method to perform the same operation as the `move` method, but it throws an error if the source file is missing or the destination exists.

You can move a file and replace the destination. True is returned if the source exists:

```
await fileStorage.create("source.txt").moveAndReplace("destination.txt");
```

Use the `moveAndReplaceOrFail` method to perform the same operation as the `moveAndReplace` method, but it throws an error if the source file is missing.

### Signed urls and public urls.[​](#signed-urls-and-public-urls "Direct link to Signed urls and public urls.")

Create signed URLs to allow clients to upload files directly to file-storage.

Upload url methods:

* `getSignedUploadUrl`: Returns the signed upload url string.

```
import { TimeSpan } from "@daiso-tech/core/time-span";

const uploadUrl = await fileStorage.create("source.txt").getSignedUploadUrl({
    // All settings are optional
    ttl: TimeSpan.fromMinutes(10),
    // The content type will be inferred from the filename by default
    contentType: "text/plain",
});
console.log(uploadUrl);
```

Create signed URLs to allow clients to download files directly from file-storage.
Download url methods:

* `getSignedDownloadUrl`: Returns the signed download url string, or null if the file does not exist.
* `getSignedDownloadUrlOrFail`: Returns the signed download url string, but throws an error if the file is missing.

```
import { TimeSpan } from "@daiso-tech/core/time-span";

const file = fileStorage.create("source.txt");
await file.add({ data: "CONTENT" });

const downloadUrl = await file.getSignedDownloadUrl({
    // All settings are optional
    ttl: TimeSpan.fromMinutes(10),
    // The content type will be inferred from the filename by default
    contentType: "text/plain",
    contentDisposition: "inline",
});
console.log(downloadUrl);
```

Use these methods to retrieve a permanent link to a file that is publicly accessible within your storage provider.

* `getPublicUrl`: Returns the public url as a string, or null if the file does not exist.
* `getPublicUrlOrFail`: Returns the public url, but throws an error if the file is missing.

```
const file = fileStorage.create("source.txt");
await file.add({ data: "CONTENT" });
const publicUrl = await file.getPublicUrl();
console.log(publicUrl);
```

info Note since not all file-storage adapters support signed or public URLs, you can manually override these behaviors using the `urlAdapter` setting:

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import {
    FildeAdapterDownloadUrlSettings,
    FildeAdapterUploadUrlSettings,
} from "@daiso-tech/core/file-storage/contracts";
import { FileStorage } from "@daiso-tech/core/file-storage";

const fileStorage = new FileStorage({
    // You can provide a defaultContentType value; the default is application/octet-stream
    defaultContentType: "text/plain",
    // You can choose the adapter to use
    adapter: new MemoryFileStorageAdapter(),
    urlAdapter: {
        getPublicUrl(key: string): Promise<string | null> {
            return Promise.resolve(null);
        },
        getSignedDownloadUrl(
            key: string,
            settings: FildeAdapterDownloadUrlSettings,
        ): Promise<string | null> {
            return Promise.resolve(null);
        },
        getSignedUploadUrl(
            key: string,
            settings:
FildeAdapterUploadUrlSettings,
        ): Promise<string> {
            return Promise.resolve("");
        },
    },
});
```

### File instance variables[​](#file-instance-variables "Direct link to File instance variables")

The `File` class exposes the key instance variable, which is the filename:

```
const file = fileStorage.create("file.txt");

// Will return the file name
console.log(file.key.toString());
```

### Namespacing[​](#namespacing "Direct link to Namespacing")

You can use the `Namespace` class to group related files without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract.

info For further information about namespacing refer to the [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation.

```
import { Namespace } from "@daiso-tech/core/namespace";
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FileStorage } from "@daiso-tech/core/file-storage";

const fileStorageA = new FileStorage({
    namespace: new Namespace("@file-storage-a"),
    adapter: new MemoryFileStorageAdapter(),
});
const fileStorageB = new FileStorage({
    namespace: new Namespace("@file-storage-b"),
    adapter: new MemoryFileStorageAdapter(),
});

const fileA = fileStorageA.create("file.txt");
const fileB = fileStorageB.create("file.txt");

await fileA.add({ data: "CONTENT_A" });
await fileB.add({ data: "CONTENT_B" });

// Will log "CONTENT_A"
console.log(await fileA.getText());

// Will log "CONTENT_B"
console.log(await fileB.getText());
```

### Serialization and deserialization of file[​](#serialization-and-deserialization-of-file "Direct link to Serialization and deserialization of file")

File objects can be serialized, allowing them to be transmitted over the network to another server and later deserialized for reuse.

info Note only the file name is saved when serialized, not its content, which makes it efficient to send a file over the network.
In order to serialize or deserialize a file object you need to pass an object that implements the [`ISerdeRegister`](/docs/components/serde.md) contract, like the [`Serde`](/docs/components/serde.md) class, to `FileStorage`.

Manually serializing and deserializing the file object:

```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FileStorage } from "@daiso-tech/core/file-storage";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";

const serde = new Serde(new SuperJsonSerdeAdapter());
const fileStorage = new FileStorage({
    // You can also pass in an array of Serde class instances
    serde,
    adapter: new MemoryFileStorageAdapter(),
});

const file = fileStorage.create("file.txt");
const serializedFile = serde.serialize(file);
const deserializedFile = serde.deserialize(serializedFile);
```

danger When serializing or deserializing a file, you must use the same `Serde` instances that were provided to the `FileStorage`. This is required because the `FileStorage` injects custom serialization logic for `IFile` instances into the `Serde` instances.

info Note you only need manual serialization and deserialization when integrating with external libraries. As long as you pass the same `Serde` instances to all other components, you don't need to serialize and deserialize the file object manually.
```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import type { IFile } from "@daiso-tech/core/file-storage/contracts";
import { FileStorage } from "@daiso-tech/core/file-storage";
import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter";
import { EventBus } from "@daiso-tech/core/event-bus";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redis = new Redis("YOUR_REDIS_CONNECTION");

type EventMap = {
    "sending-file-over-network": {
        file: IFile;
    };
};

const eventBus = new EventBus<EventMap>({
    adapter: new RedisPubSubEventBusAdapter({
        client: redis,
        serde,
    }),
});

const fileStorage = new FileStorage({
    serde,
    adapter: new MemoryFileStorageAdapter(),
    eventBus,
});

const file = fileStorage.create("file.txt");

// We are sending the file over the network to other servers.
await eventBus.dispatch("sending-file-over-network", {
    file,
});

// The other servers will receive the serialized file and automatically deserialize it.
await eventBus.addListener("sending-file-over-network", ({ file }) => {
    // The file is deserialized and can be used
    console.log("file:", file);
});
```

### File events[​](#file-events "Direct link to File events")

You can listen to different [file events](https://daiso-tech.github.io/daiso-core/modules/File.html) that are triggered by the `File` instance. Refer to the [`EventBus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events. Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract.
```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FileStorage, FILE_EVENTS } from "@daiso-tech/core/file-storage";
import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter";

const fileStorage = new FileStorage({
    adapter: new MemoryFileStorageAdapter(),
    eventBus: new MemoryEventBusAdapter(),
});

await fileStorage.events.addListener(FILE_EVENTS.ADDED, () => {
    console.log("File added");
});
await fileStorage.create("file.txt").add({ data: "CONTENT" });
```

warning If multiple file-storage adapters (e.g., `FsFileStorageAdapter` and `MemoryFileStorageAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters.

```
import { FsFileStorageAdapter } from "@daiso-tech/core/file-storage/fs-file-storage-adapter";
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FileStorage } from "@daiso-tech/core/file-storage";
import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import { Namespace } from "@daiso-tech/core/namespace";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({
    client: new Redis("YOUR_REDIS_CONNECTION_STRING"),
    serde,
});

const memoryFileStorageAdapter = new MemoryFileStorageAdapter();
const memoryFileStorage = new FileStorage({
    adapter: memoryFileStorageAdapter,
    // We assign distinct namespaces to MemoryFileStorageAdapter and FsFileStorageAdapter to isolate their events.
    namespace: new Namespace(["memory", "event-bus"]),
    eventBus: redisPubSubEventBusAdapter,
});

const fsFileStorageAdapter = new FsFileStorageAdapter();
const fsFileStorage = new FileStorage({
    adapter: fsFileStorageAdapter,
    // We assign distinct namespaces to MemoryFileStorageAdapter and FsFileStorageAdapter to isolate their events.
    namespace: new Namespace(["fs", "event-bus"]),
    eventBus: redisPubSubEventBusAdapter,
});
```

### File locking on write operations[​](#file-locking-on-write-operations "Direct link to File locking on write operations")

The `FileStorage` instance supports distributed locking via the `lockFactory` setting passed into the `FileStorage` constructor. Without locking, data races can occur when multiple clients simultaneously perform write operations on the same file, corrupting it.

```
import { MemoryFileStorageAdapter } from "@daiso-tech/core/file-storage/memory-file-storage-adapter";
import { FileStorage } from "@daiso-tech/core/file-storage";
import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter";

const fileStorage = new FileStorage({
    adapter: new MemoryFileStorageAdapter(),
    lockFactory: new MemoryLockAdapter(),
});

// Write operations on the same file key will now be protected by a lock
await fileStorage.create("file.txt").add({ data: "CONTENT" });
```

info You can pass `ILockFactoryBase`, `ILockAdapter`, and `IDatabaseLockAdapter` to the `lockFactory` setting. For further information about `LockFactory` refer to the [`@daiso-tech/core/lock`](/docs/components/lock/lock_usage.md) documentation.

### Separating creating, listening to and manipulating files[​](#separating-creating-listening-to-and-manipulating-files "Direct link to Separating creating, listening to and manipulating files")

The library includes 4 additional contracts:

* [`IFile`](https://daiso-tech.github.io/daiso-core/types/FileStorage.IFile.html) - Allows only manipulation of the file.
* [`IFileFactory`](https://daiso-tech.github.io/daiso-core/types/FileStorage.IFileFactory.html) - Allows only creation of files.
* [`IFileStorageBase`](https://daiso-tech.github.io/daiso-core/types/FileStorage.IFileStorageBase.html) - Allows for creation and removal of files.
* [`IFileListenable`](https://daiso-tech.github.io/daiso-core/types/FileStorage.IFileListenable.html) - Allows only listening to file events.

## Further information[​](#further-information "Direct link to Further information")

For further information refer to [`@daiso-tech/core/file-storage`](https://daiso-tech.github.io/daiso-core/modules/FileStorage.html) API docs.

---

# Configuring lock adapters

## MemoryLockAdapter[​](#memorylockadapter "Direct link to MemoryLockAdapter")

To use the `MemoryLockAdapter` you only need to create an instance of it:

```
import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter";

const memoryLockAdapter = new MemoryLockAdapter();
```

You can also provide a `Map` that will be used for storing the data in memory:

```
import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter";

const map = new Map();
const memoryLockAdapter = new MemoryLockAdapter(map);
```

info `MemoryLockAdapter` lets you test your app without external dependencies like `Redis`, making it ideal for local development, unit tests, integration tests and fast E2E tests for the backend application.

danger Note the `MemoryLockAdapter` is limited to single process usage and cannot be shared across multiple servers or processes.

## MongodbLockAdapter[​](#mongodblockadapter "Direct link to MongodbLockAdapter")

To use the `MongodbLockAdapter`, you'll need to:

1.
Install the required dependency: the [`mongodb`](https://www.npmjs.com/package/mongodb) package: ``` import { MongodbLockAdapter } from "@daiso-tech/core/lock/mongodb-lock-adapter"; import { MongoClient } from "mongodb"; const client = await MongoClient.connect("YOUR_MONGODB_CONNECTION_STRING"); const database = client.db("database"); const mongodbLockAdapter = new MongodbLockAdapter({ database, }); // You need to initialize the adapter once before using it. // During the initialization the indexes will be created await mongodbLockAdapter.init(); ``` You can change the collection name: ``` const mongodbLockAdapter = new MongodbLockAdapter({ database, // By default "lock" is used as the collection name collectionName: "my-lock", }); await mongodbLockAdapter.init(); ``` You can change the collection settings: ``` const mongodbLockAdapter = new MongodbLockAdapter({ database, // You can configure additional collection settings collectionSettings: {}, }); await mongodbLockAdapter.init(); ``` info To remove the lock collection and all stored lock data, use the `deInit` method: ``` await mongodbLockAdapter.deInit(); ``` danger Note that in order to use `MongodbLockAdapter` correctly, you must use a single, consistent database across all server instances or processes. ## RedisLockAdapter[​](#redislockadapter "Direct link to RedisLockAdapter") To use the `RedisLockAdapter`, you'll need to: 1. Install the required dependency: the [`ioredis`](https://www.npmjs.com/package/ioredis) package: ``` import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisLockAdapter = new RedisLockAdapter(database); ``` danger Note that in order to use `RedisLockAdapter` correctly, you must use a single, consistent database across all server instances or processes. ## KyselyLockAdapter[​](#kyselylockadapter "Direct link to KyselyLockAdapter") To use the `KyselyLockAdapter`, you'll need to: 1.
Use a database provider that supports transactions. 2. Install the required dependency: the [`kysely`](https://www.npmjs.com/package/kysely) package: ### With Sqlite[​](#with-sqlite "Direct link to With Sqlite") You will need to install the [`better-sqlite3`](https://www.npmjs.com/package/better-sqlite3) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyLockAdapter } from "@daiso-tech/core/lock/kysely-lock-adapter"; import Sqlite from "better-sqlite3"; import { Kysely, SqliteDialect } from "kysely"; const database = new Sqlite("DATABASE_NAME.db"); const kysely = new Kysely({ dialect: new SqliteDialect({ database, }), }); const kyselyLockAdapter = new KyselyLockAdapter({ kysely, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyLockAdapter.init(); ``` danger Note that using `KyselyLockAdapter` with `sqlite` is limited to single-server usage and cannot be shared across multiple servers, but it can be shared between different processes. To use it correctly, ensure all process instances access the same persisted database. ### With Postgres[​](#with-postgres "Direct link to With Postgres") You will need to install the [`pg`](https://www.npmjs.com/package/pg) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyLockAdapter } from "@daiso-tech/core/lock/kysely-lock-adapter"; import { Pool } from "pg"; import { Kysely, PostgresDialect } from "kysely"; const database = new Pool({ database: "DATABASE_NAME", host: "DATABASE_HOST", user: "DATABASE_USER", // Database port port: 5432, password: "DATABASE_PASSWORD", max: 10, }); const kysely = new Kysely({ dialect: new PostgresDialect({ pool: database, }), }); const kyselyLockAdapter = new KyselyLockAdapter({ kysely, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselyLockAdapter.init(); ``` danger Note that in order to use `KyselyLockAdapter` with `postgres` correctly, you must use a single, consistent database across all server instances. This means you can't use replication. ### With Mysql[​](#with-mysql "Direct link to With Mysql") You will need to install the [`mysql2`](https://www.npmjs.com/package/mysql2) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyLockAdapter } from "@daiso-tech/core/lock/kysely-lock-adapter"; import { createPool } from "mysql2"; import { Kysely, MysqlDialect } from "kysely"; const database = createPool({ host: "DATABASE_HOST", // Database port port: 3306, database: "DATABASE_NAME", user: "DATABASE_USER", password: "DATABASE_PASSWORD", connectionLimit: 10, }); const kysely = new Kysely({ dialect: new MysqlDialect({ pool: database, }), }); const kyselyLockAdapter = new KyselyLockAdapter({ kysely, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyLockAdapter.init(); ``` danger Note that in order to use `KyselyLockAdapter` with `mysql` correctly, you must use a single, consistent database across all server instances. This means you can't use replication. ### With Libsql[​](#with-libsql "Direct link to With Libsql") You will need to install the `@libsql/kysely-libsql` package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyLockAdapter } from "@daiso-tech/core/lock/kysely-lock-adapter"; import { LibsqlDialect } from "@libsql/kysely-libsql"; import { Kysely } from "kysely"; const kysely = new Kysely({ dialect: new LibsqlDialect({ url: "DATABASE_URL", }), }); const kyselyLockAdapter = new KyselyLockAdapter({ kysely, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselyLockAdapter.init(); ``` danger Note that in order to use `KyselyLockAdapter` with `libsql` correctly, you must use a single, consistent database across all server instances. This means you can't use libsql embedded replicas. ### Settings[​](#settings "Direct link to Settings") Expired keys are cleared at regular intervals, and you can change the interval time: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const kyselyLockAdapter = new KyselyLockAdapter({ kysely, // By default, the interval is 1 minute expiredKeysRemovalInterval: TimeSpan.fromSeconds(10), }); await kyselyLockAdapter.init(); ``` Disabling scheduled interval cleanup of expired keys: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const kyselyLockAdapter = new KyselyLockAdapter({ kysely, shouldRemoveExpiredKeys: false, }); await kyselyLockAdapter.init(); // You can remove all expired keys manually. await kyselyLockAdapter.removeAllExpired(); ``` info To remove the lock table and all stored lock data, use the `deInit` method: ``` await kyselyLockAdapter.deInit(); ``` ## NoOpLockAdapter[​](#nooplockadapter "Direct link to NoOpLockAdapter") The `NoOpLockAdapter` is a no-operation implementation; it performs no actions when called: ``` import { NoOpLockAdapter } from "@daiso-tech/core/lock/no-op-lock-adapter"; const noOpLockAdapter = new NoOpLockAdapter(); ``` info The `NoOpLockAdapter` is useful when you want to mock out or disable your [`LockFactory`](https://daiso-tech.github.io/daiso-core/classes/Lock.LockFactory.html) instance. ## Further information[​](#further-information "Direct link to Further information") For further information refer to the [`@daiso-tech/core/lock`](https://daiso-tech.github.io/daiso-core/modules/Lock.html) API docs.
--- # Creating lock adapters ## Implementing your custom ILockAdapter[​](#implementing-your-custom-ilockadapter "Direct link to Implementing your custom ILockAdapter") In order to create an adapter you need to implement the [`ILockAdapter`](https://daiso-tech.github.io/daiso-core/types/Lock.ILockAdapter.html) contract. ## Testing your custom ILockAdapter[​](#testing-your-custom-ilockadapter "Direct link to Testing your custom ILockAdapter") We provide a complete test suite to test your lock adapter implementation. Simply use the [`lockAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Lock.lockAdapterTestSuite.html) function, which gives you: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` // filename: MyLockAdapter.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { lockAdapterTestSuite } from "@daiso-tech/core/lock/test-utilities"; import { MyLockAdapter } from "./MyLockAdapter.js"; describe("class: MyLockAdapter", () => { lockAdapterTestSuite({ createAdapter: () => new MyLockAdapter(), test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom IDatabaseLockAdapter[​](#implementing-your-custom-idatabaselockadapter "Direct link to Implementing your custom IDatabaseLockAdapter") We provide an additional contract, [`IDatabaseLockAdapter`](https://daiso-tech.github.io/daiso-core/types/Lock.IDatabaseLockAdapter.html), for building custom lock adapters tailored to databases. ## Testing your custom IDatabaseLockAdapter[​](#testing-your-custom-idatabaselockadapter "Direct link to Testing your custom IDatabaseLockAdapter") We provide a complete test suite to test your database lock adapter implementation.
Simply use the [`databaseLockAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Lock.databaseLockAdapterTestSuite.html) function, which gives you: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` import { beforeEach, describe, expect, test } from "vitest"; import { databaseLockAdapterTestSuite } from "@daiso-tech/core/lock/test-utilities"; import { MyDatabaseLockAdapter } from "./MyDatabaseLockAdapter.js"; describe("class: MyDatabaseLockAdapter", () => { databaseLockAdapterTestSuite({ createAdapter: async () => { return new MyDatabaseLockAdapter(); }, test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom ILockFactory class[​](#implementing-your-custom-ilockfactory-class "Direct link to Implementing your custom ILockFactory class") In some cases, you may need to implement a custom [`LockFactory`](https://daiso-tech.github.io/daiso-core/classes/Lock.LockFactory.html) class to optimize performance for your specific technology stack. You can then directly implement the [`ILockFactory`](https://daiso-tech.github.io/daiso-core/types/Lock.ILockFactory.html) contract. ## Testing your custom ILockFactory class[​](#testing-your-custom-ilockfactory-class "Direct link to Testing your custom ILockFactory class") We provide a complete test suite to verify your custom lock factory class implementation.
Simply use the [`lockFactoryTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Lock.lockFactoryTestSuite.html) function, which gives you: * Preconfigured Vitest test cases * Standardized lock factory behavior validation * Common edge case coverage Usage example: ``` // filename: MyLockFactory.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { lockFactoryTestSuite } from "@daiso-tech/core/lock/test-utilities"; import { MyLockFactory } from "./MyLockFactory.js"; describe("class: MyLockFactory", () => { lockFactoryTestSuite({ createLockFactory: () => new MyLockFactory(), test, beforeEach, expect, describe, }); }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to the [`@daiso-tech/core/lock`](https://daiso-tech.github.io/daiso-core/modules/Lock.html) API docs. --- # LockFactoryResolver The `LockFactoryResolver` class provides a flexible way to configure and switch between different lock adapters at runtime. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `LockFactoryResolver`, you will need to register all required adapters during initialization. ``` import { LockFactoryResolver } from "@daiso-tech/core/lock"; import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter"; import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter"; import Redis from "ioredis"; const lockFactoryResolver = new LockFactoryResolver({ adapters: { memory: new MemoryLockAdapter(), redis: new RedisLockAdapter(new Redis("YOUR_REDIS_CONNECTION")), }, // You can set an optional default adapter defaultAdapter: "memory", }); ``` ## Usage[​](#usage "Direct link to Usage") ### 1. Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1.
Using the default adapter") ``` await lockFactoryResolver .use() .create("shared-resource") .runOrFail(async () => { // code to run }); ``` danger Note that if you don't set a default adapter, an error will be thrown. ### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2. Specifying an adapter explicitly") ``` await lockFactoryResolver .use("redis") .create("shared-resource") .runOrFail(async () => { // code to run }); ``` danger Note that if you specify a non-existent adapter, an error will be thrown. ### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3. Overriding default settings") ``` import { Namespace } from "@daiso-tech/core/namespace"; await lockFactoryResolver .setNamespace(new Namespace("@my-namespace")) .use("redis") .create("shared-resource") .runOrFail(async () => { // code to run }); ``` info Note that the `LockFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one. ## Further information[​](#further-information "Direct link to Further information") For further information refer to the [`@daiso-tech/core/lock`](https://daiso-tech.github.io/daiso-core/modules/Lock.html) API docs. --- # Lock usage The `@daiso-tech/core/lock` component provides a way for managing locks independent of the underlying platform or storage. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `LockFactory` class, you'll need to create and configure an instance: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter"; import { LockFactory } from "@daiso-tech/core/lock"; const lockFactory = new LockFactory({ // You can provide a default TTL value // If you set it to null, locks will not expire and must be released manually by default.
defaultTtl: TimeSpan.fromSeconds(2), // You can choose the adapter to use adapter: new MemoryLockAdapter(), }); ``` info Here is a complete list of settings for the [`LockFactory`](https://daiso-tech.github.io/daiso-core/types/Lock.LockFactorySettingsBase.html) class. ## Lock basics[​](#lock-basics "Direct link to Lock basics") ### Creating a lock[​](#creating-a-lock "Direct link to Creating a lock") ``` const lock = lockFactory.create("shared-resource"); ``` ### Acquiring and releasing the lock[​](#acquiring-and-releasing-the-lock "Direct link to Acquiring and releasing the lock") ``` const hasAcquired = await lock.acquire(); if (hasAcquired) { try { // The critical section } finally { await lock.release(); } } ``` Alternatively you could write it as follows: ``` try { // This method will throw if the lock is not acquired await lock.acquireOrFail(); // The critical section } finally { await lock.release(); } ``` danger Always wrap the critical section in `try-finally` so the lock gets released when an error occurs. ### Locks with custom TTL[​](#locks-with-custom-ttl "Direct link to Locks with custom TTL") You can provide a custom TTL for the lock. ``` const lock = lockFactory.create("shared-resource", { // The default TTL is 5min if not overridden // If you set it to null, the lock will not expire and must be released manually. ttl: TimeSpan.fromSeconds(30), }); ``` ### Checking lock state[​](#checking-lock-state "Direct link to Checking lock state") You can get the lock state by using the `getState` method, which returns [`ILockState`](https://daiso-tech.github.io/daiso-core/types/Lock.ILockState.html).
``` import { LOCK_STATE } from "@daiso-tech/core/lock/contracts"; const lock = lockFactory.create("shared-resource"); const state = await lock.getState(); if (state.type === LOCK_STATE.EXPIRED) { console.log("The lock doesn't exist"); } if (state.type === LOCK_STATE.UNAVAILABLE) { console.log("Lock is acquired by a different owner"); } if (state.type === LOCK_STATE.ACQUIRED) { console.log("The lock is acquired"); } ``` ## Patterns[​](#patterns "Direct link to Patterns") ### Refreshing locks[​](#refreshing-locks "Direct link to Refreshing locks") The lock can be refreshed by the current owner before it expires. This is particularly useful for long-running tasks: instead of setting an excessively long TTL initially, you can start with a shorter one and use the `refresh` method to extend the TTL of the lock: ``` import { delay } from "@daiso-tech/core/utilities/functions"; const lock = lockFactory.create("resource", { ttl: TimeSpan.fromMinutes(1), }); async function doWork(): Promise<boolean> { // ... critical section } const hasAcquired = await lock.acquire(); if (hasAcquired) { try { while (true) { await lock.refresh(TimeSpan.fromMinutes(1)); const hasFinished = await doWork(); if (hasFinished) { break; } await delay(TimeSpan.fromSeconds(1)); } } finally { await lock.release(); } } ``` warning Note: A lock must have an expiration (a `ttl` value) to be refreshed.
You cannot refresh a lock that was created without an expiration (with `ttl: null`): ``` // Create a lock with no expiration (non-refreshable) const lock = lockFactory.create("resource", { ttl: null, }); // A refresh attempt on this lock will fail const hasRefreshed = await lock.refresh(); // This will log 'false' because the lock cannot be refreshed console.log(hasRefreshed); ``` ### Additional methods[​](#additional-methods "Direct link to Additional methods") The `releaseOrFail` method is the same as the `release` method but throws an error when it is unable to release the lock: ``` const lock = lockFactory.create("resource"); await lock.releaseOrFail(); ``` The `forceRelease` method releases the lock regardless of the owner: ``` const lock = lockFactory.create("resource"); await lock.forceRelease(); ``` The `refreshOrFail` method is the same as the `refresh` method but throws an error when it is unable to refresh the lock: ``` const lock = lockFactory.create("resource"); await lock.refreshOrFail(); ``` The `runOrFail` method automatically manages lock acquisition and release around function execution. It calls `acquireOrFail` before invoking the function and calls `release` in a finally block, ensuring the lock is always freed, even if an error occurs during execution. ``` const lock = lockFactory.create("resource"); await lock.runOrFail(async () => { // ... critical section }); ``` info Note the method throws an error when the lock cannot be acquired. info You can provide a synchronous or asynchronous [`Invokable<[], TValue | Promise<TValue>>`](/docs/utilities/invokable.md) as the value for the `runOrFail` method.
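The acquire/release pattern that `runOrFail` encapsulates can be sketched in plain TypeScript. The `SimpleLock` type and `runOrFailSketch` function below are illustrative only and are not part of the library's API; the sketch also assumes, as the `Invokable` signature above suggests, that the wrapped function's return value is forwarded to the caller.

```typescript
// Hypothetical minimal lock shape, for illustration only;
// the real contract is ILock from "@daiso-tech/core/lock/contracts".
type SimpleLock = {
    acquireOrFail(): Promise<void>;
    release(): Promise<boolean>;
};

// What runOrFail does conceptually: acquire the lock, invoke the
// function, and always release the lock in a finally block.
async function runOrFailSketch<TValue>(
    lock: SimpleLock,
    fn: () => TValue | Promise<TValue>,
): Promise<TValue> {
    await lock.acquireOrFail(); // throws if the lock cannot be acquired
    try {
        return await fn();
    } finally {
        await lock.release(); // runs even if fn throws
    }
}
```

The finally block is what guarantees the lock is freed on both the success and error paths, which is why the manual `try-finally` pattern shown earlier is mandatory when you manage the lock yourself.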
### Lock instance variables[​](#lock-instance-variables "Direct link to Lock instance variables") The `Lock` class exposes instance variables such as: ``` const lock = lockFactory.create("resource"); // Will return the key of the lock which is "resource" console.log(lock.key.toString()); // Will return the id of the lock console.log(lock.id); // Will return the ttl of the lock console.log(lock.ttl); ``` info The `key` field is an object that implements the [`IKey`](/docs/components/namespace.md) contract. ### Lock id[​](#lock-id "Direct link to Lock id") By default the lock id is autogenerated, but it can also be defined manually. ``` const lock = lockFactory.create("lock", { lockId: "my-lock-id", }); const hasAcquired = await lock.acquire(); if (hasAcquired) { console.log("Shared resource"); await lock.release(); } ``` info Manually defining a lock id is primarily useful for debugging or for implementing manual resource locking by the end user. An example of manual resource locking by the end user can be found in a multi-user CMS: the end user manually locks a document during editing, and this resource lock prevents simultaneous edits and data corruption. warning In most cases, setting a custom lock id is unnecessary. Misusing this feature could result in different locks sharing the same lock id while modifying the same resource simultaneously, which may lead to race conditions. ### Namespacing[​](#namespacing "Direct link to Namespacing") You can use the `Namespace` class to group related locks without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract. info For further information about namespacing refer to the [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation.
``` import { Namespace } from "@daiso-tech/core/namespace"; import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter"; import { LockFactory } from "@daiso-tech/core/lock"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const lockFactoryA = new LockFactory({ namespace: new Namespace("@lock-a"), adapter: new RedisLockAdapter(database), }); const lockFactoryB = new LockFactory({ namespace: new Namespace("@lock-b"), adapter: new RedisLockAdapter(database), }); const lockA = lockFactoryA.create("key", { ttl: null }); const lockB = lockFactoryB.create("key", { ttl: null }); const hasAcquiredA = await lockA.acquire(); // Will log true console.log(hasAcquiredA); const hasAcquiredB = await lockB.acquire(); // Will log true console.log(hasAcquiredB); const hasReleasedB = await lockB.release(); // Will log true console.log(hasReleasedB); // Will log { type: "ACQUIRED", remainingTime: null } console.log(await lockA.getState()); // Will log { type: "EXPIRED" } console.log(await lockB.getState()); ``` ### Retrying acquiring lock by attempts[​](#retrying-acquiring-lock-by-attempts "Direct link to Retrying acquiring lock by attempts") To retry acquiring a lock you can use the [`retry`](/docs/components/resilience.md) middleware.
Retrying lock acquisition with the `acquireOrFail` method: ``` import { retry } from "@daiso-tech/core/resilience"; import { FailedAcquireLockError } from "@daiso-tech/core/lock/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; const lock = lockFactory.create("lock"); const use = useFactory(); try { await use(async () => { await lock.acquireOrFail(); }, [ retry({ maxAttempts: 4, errorPolicy: FailedAcquireLockError, }), ])(); // The critical section } finally { await lock.release(); } ``` Retrying lock acquisition with the `acquire` method: ``` import { retry } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; const lock = lockFactory.create("lock"); const use = useFactory(); const hasAcquired = await use(async () => { return await lock.acquire(); }, [ retry({ maxAttempts: 4, errorPolicy: { treatFalseAsError: true, }, }), ])(); if (hasAcquired) { try { // The critical section } finally { await lock.release(); } } ``` Retrying lock acquisition with the `runOrFail` method: ``` import { retry } from "@daiso-tech/core/resilience"; import { FailedAcquireLockError } from "@daiso-tech/core/lock/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; const lock = lockFactory.create("lock"); const use = useFactory(); await use(async () => { await lock.runOrFail(async () => { // The critical section }); }, [ retry({ maxAttempts: 4, errorPolicy: FailedAcquireLockError, }), ])(); ``` ### Retrying acquiring lock by interval[​](#retrying-acquiring-lock-by-interval "Direct link to Retrying acquiring lock by interval") To retry acquiring a lock at regular intervals you can use the [`retryInterval`](/docs/components/resilience.md) middleware: Retrying lock acquisition with the `acquireOrFail` method: ``` import { retryInterval } from "@daiso-tech/core/resilience"; import { FailedAcquireLockError } from "@daiso-tech/core/lock/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; import { TimeSpan } from
"@daiso-tech/core/time-span"; const lock = lockFactory.create("resource"); const use = useFactory(); try { await use(async () => { await lock.acquireOrFail(); }, [ retryInterval({ // Total time to wait: 1 minute time: TimeSpan.fromMinutes(1), // Interval at which to try acquiring the lock interval: TimeSpan.fromSeconds(1), errorPolicy: FailedAcquireLockError, }), ])(); // ... critical section } finally { await lock.release(); } ``` Retrying lock acquisition with the `acquire` method: ``` import { retryInterval } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; import { TimeSpan } from "@daiso-tech/core/time-span"; const lock = lockFactory.create("resource"); const use = useFactory(); const hasAcquired = await use(async () => { return await lock.acquire(); }, [ retryInterval({ time: TimeSpan.fromMinutes(1), interval: TimeSpan.fromSeconds(1), errorPolicy: { treatFalseAsError: true, }, }), ])(); if (hasAcquired) { try { // ... critical section } finally { await lock.release(); } } ``` Retrying lock acquisition with the `runOrFail` method: ``` import { retryInterval } from "@daiso-tech/core/resilience"; import { FailedAcquireLockError } from "@daiso-tech/core/lock/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; import { TimeSpan } from "@daiso-tech/core/time-span"; const lock = lockFactory.create("resource"); const use = useFactory(); await use(async () => { await lock.runOrFail(async () => { // ... critical section }); }, [ retryInterval({ time: TimeSpan.fromMinutes(1), interval: TimeSpan.fromSeconds(1), errorPolicy: FailedAcquireLockError, }), ])(); ``` warning Note that using the `retryInterval` middleware to acquire locks in an HTTP request handler is discouraged: it blocks the handler until the lock becomes available or the timeout is reached, delaying the response and making the frontend app feel slow.
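Conceptually, interval-based retrying is just polling until a deadline. The sketch below is not the library's implementation (the real middleware also applies `errorPolicy` and composes with other middlewares); it only illustrates the `time`/`interval` semantics, using hypothetical names:

```typescript
// Illustrative sketch of interval-based retrying: poll tryAcquire
// every intervalMs until it succeeds or totalTimeMs has elapsed.
async function acquireWithinInterval(
    tryAcquire: () => Promise<boolean>,
    totalTimeMs: number,
    intervalMs: number,
): Promise<boolean> {
    const deadline = Date.now() + totalTimeMs;
    while (true) {
        if (await tryAcquire()) {
            return true; // lock acquired
        }
        if (Date.now() + intervalMs > deadline) {
            return false; // next attempt would exceed the total wait time
        }
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
}
```

This also makes the warning above concrete: inside an HTTP request handler, such a loop keeps the request pending for up to the total wait time before a response can be produced.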
### Serialization and deserialization of lock[​](#serialization-and-deserialization-of-lock "Direct link to Serialization and deserialization of lock") Locks can be serialized, allowing them to be transmitted over the network to another server and later deserialized for reuse. This means you can, for example, acquire the lock on the main server, transfer it to a queue worker server, and release it there. In order to serialize or deserialize a lock you need to pass an object that implements the [`ISerderRegister`](/docs/components/serde.md) contract, like the [`Serde`](/docs/components/serde.md) class, to the `LockFactory`. Manually serializing and deserializing the lock: ``` import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter"; import { LockFactory } from "@daiso-tech/core/lock"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisClient = new Redis("YOUR_REDIS_CONNECTION"); const lockFactory = new LockFactory({ // You can also pass in an array of Serde class instances serde, adapter: new RedisLockAdapter(redisClient), }); const lock = lockFactory.create("resource"); const serializedLock = serde.serialize(lock); const deserializedLock = serde.deserialize(serializedLock); ``` danger When serializing or deserializing a lock, you must use the same `Serde` instances that were provided to the `LockFactory`. This is required because the `LockFactory` injects custom serialization logic for `ILock` instances into the `Serde` instances. info Note that you only need manual serialization and deserialization when integrating with external libraries. As long as you pass the same `Serde` instances to all other components, you don't need to serialize and deserialize the lock manually.
``` import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter"; import type { ILock } from "@daiso-tech/core/lock/contracts"; import { LockFactory } from "@daiso-tech/core/lock"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redis = new Redis("YOUR_REDIS_CONNECTION"); type EventMap = { "sending-lock-over-network": { lock: ILock; }; }; const eventBus = new EventBus<EventMap>({ adapter: new RedisPubSubEventBusAdapter({ client: redis, serde, }), }); const lockFactory = new LockFactory({ serde, adapter: new RedisLockAdapter(redis), eventBus, }); const lock = lockFactory.create("resource"); // We are sending the lock over the network to other servers. await eventBus.dispatch("sending-lock-over-network", { lock, }); // The other servers will receive the serialized lock and automatically deserialize it. await eventBus.addListener("sending-lock-over-network", ({ lock }) => { // The lock is deserialized and can be used console.log("LOCK:", lock); }); ``` ### Lock events[​](#lock-events "Direct link to Lock events") You can listen to the different [lock events](https://daiso-tech.github.io/daiso-core/modules/Lock.html) that are triggered by the `Lock` instance. Refer to the [`EventBus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events. Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract.
``` import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter"; import { LockFactory, LOCK_EVENTS } from "@daiso-tech/core/lock"; import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; const lockFactory = new LockFactory({ adapter: new MemoryLockAdapter(), eventBus: new MemoryEventBusAdapter(), }); await lockFactory.events.addListener(LOCK_EVENTS.ACQUIRED, () => { console.log("Lock acquired"); }); await lockFactory.create("a").acquire(); ``` warning If multiple lock adapters (e.g., `RedisLockAdapter` and `MemoryLockAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters. ``` import { LockFactory } from "@daiso-tech/core/lock"; import { RedisLockAdapter } from "@daiso-tech/core/lock/redis-lock-adapter"; import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; import { Namespace } from "@daiso-tech/core/namespace"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({ client: new Redis("YOUR_REDIS_CONNECTION_STRING"), serde, }); const memoryLockAdapter = new MemoryLockAdapter(); const memoryLockFactory = new LockFactory({ adapter: memoryLockAdapter, // We assign distinct namespaces to MemoryLockAdapter and RedisLockAdapter to isolate their events.
namespace: new Namespace(["memory", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); const redisLockAdapter = new RedisLockAdapter({ serde, database: new Redis("YOUR_REDIS_CONNECTION_STRING"), }); const redisLockFactory = new LockFactory({ adapter: redisLockAdapter, // We assign distinct namespaces to MemoryLockAdapter and RedisLockAdapter to isolate their events. namespace: new Namespace(["redis", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); ``` ### Separating creating, listening to and manipulating locks[​](#separating-creating-listening-to-and-manipulating-locks "Direct link to Separating creating, listening to and manipulating locks") The library includes 3 additional contracts: * [`ILock`](https://daiso-tech.github.io/daiso-core/types/Lock.ILock.html) - Allows only manipulation of the lock. * [`ILockFactoryBase`](https://daiso-tech.github.io/daiso-core/types/Lock.ILockFactoryBase.html) - Allows only creation of locks. * [`ILockListenable`](https://daiso-tech.github.io/daiso-core/types/Lock.ILockListenable.html) - Allows only listening to lock events. This separation makes it easy to visually distinguish the 3 contracts, making it immediately obvious that they serve different purposes. ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { LockFactory } from "@daiso-tech/core/lock"; import { MemoryLockAdapter } from "@daiso-tech/core/lock/memory-lock-adapter"; import { type ILock, type ILockFactoryBase, type ILockListenable, LOCK_EVENTS, } from "@daiso-tech/core/lock/contracts"; async function lockFunc(lock: ILock): Promise<void> { await lock.runOrFail(async () => { // ...
critical section }); } async function lockFactoryFunc(lockFactory: ILockFactoryBase): Promise<void> { // You cannot access the listener methods // You will get a TypeScript error if you try const lock = lockFactory.create("resource"); await lockFunc(lock); } async function lockListenableFunc( lockListenable: ILockListenable, ): Promise<void> { // You cannot access the lockFactory methods // You will get a TypeScript error if you try await lockListenable.addListener(LOCK_EVENTS.ACQUIRED, (event) => { console.log("ACQUIRED:", event); }); await lockListenable.addListener(LOCK_EVENTS.RELEASED, (event) => { console.log("RELEASED:", event); }); } const lockFactory = new LockFactory({ adapter: new MemoryLockAdapter(), eventBus: new MemoryEventBusAdapter(), }); await lockListenableFunc(lockFactory.events); await lockFactoryFunc(lockFactory); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/lock`](https://daiso-tech.github.io/daiso-core/modules/Lock.html) API docs. --- # Middleware The `@daiso-tech/core/middleware` module provides a flexible middleware system for intercepting and composing function calls. It enables you to wrap functions with pre-processing and post-processing logic, similar to middleware patterns found in web frameworks like Express.js.
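Before diving into the library's API, the underlying pattern is worth seeing in isolation. The sketch below is a minimal, self-contained illustration of function wrapping in plain TypeScript; the names `Middleware` and `wrap` are illustrative and are not part of `@daiso-tech/core`:

```typescript
// Minimal sketch of the middleware pattern (illustrative names, not the library's API).
type Middleware<A extends unknown[], R> = (args: A, next: (args: A) => R) => R;

function wrap<A extends unknown[], R>(
    fn: (...args: A) => R,
    middlewares: Middleware<A, R>[],
): (...args: A) => R {
    // Build the pipeline from the inside out: the original function is the core,
    // and each middleware wraps everything that comes after it.
    return middlewares.reduceRight<(...args: A) => R>(
        (next, middleware) =>
            (...args: A) =>
                middleware(args, (a) => next(...a)),
        fn,
    );
}

// Usage: log around a simple function.
const greet = (name: string): string => `Hello, ${name}!`;
const logged = wrap(greet, [
    (args, next) => {
        console.log("before", args);
        const result = next(args);
        console.log("after", result);
        return result;
    },
]);
console.log(logged("Alice")); // → "Hello, Alice!"
```

Each middleware decides whether and how to call `next`, which is what enables short-circuiting, argument rewriting, and result post-processing.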
## UseFactory configuration[​](#usefactory-configuration "Direct link to UseFactory configuration") ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using middleware, create a middleware application function using the factory: ``` import { useFactory } from "@daiso-tech/core/middleware"; import { ExecutionContext } from "@daiso-tech/core/execution-context"; import { AlsExecutionContextAdapter } from "@daiso-tech/core/execution-context/als-execution-context-adapter"; // Create a middleware function with a specific execution context const use = useFactory({ executionContext: new ExecutionContext(new AlsExecutionContextAdapter()), defaultPriority: 0, }); ``` Or use the default configuration: ``` import { useFactory } from "@daiso-tech/core/middleware"; const use = useFactory(); ``` Configure the middleware factory with custom settings: ``` type UseFactorySettings = { /** * The execution context to use for all middleware invocations. * Defaults to a new ExecutionContext with NoOpExecutionContextAdapter */ executionContext?: IExecutionContext; /** * Default priority for middleware without an explicit priority. 
* Defaults to 0 */ defaultPriority?: number; }; const use = useFactory({ executionContext: customContext, defaultPriority: 50, }); ``` ## Middleware basics[​](#middleware-basics "Direct link to Middleware basics") ### Creating a simple middleware[​](#creating-a-simple-middleware "Direct link to Creating a simple middleware") A middleware is a function that receives middleware arguments (containing the original arguments, a next function, and the execution context) and returns the result: ``` import { type MiddlewareArgs } from "@daiso-tech/core/middleware"; import { type MiddlewareFn } from "@daiso-tech/core/middleware/contracts"; const createLoggingMiddleware = <TParameters extends Array<unknown>, TReturn>( prefix: string = "LOG", ): MiddlewareFn<TParameters, TReturn> => { return ({ args, next, context }: MiddlewareArgs<TParameters, TReturn>) => { console.log(`${prefix} - Before invocation with args:`, args); const result = next(args); console.log(`${prefix} - After invocation, result:`, result); return result; }; }; const loggingMiddleware = createLoggingMiddleware<[string, number], string>(); ``` ### Applying middleware to a function[​](#applying-middleware-to-a-function "Direct link to Applying middleware to a function") Use the `use` function to apply one or more middlewares to a function: ``` const originalFn = (name: string, age: number): string => { return `${name} is ${age} years old`; }; const wrappedFn = use(originalFn, loggingMiddleware); // Call the wrapped function const result = wrappedFn("Alice", 30); // Logs: "LOG - Before invocation with args: ["Alice", 30]" // Logs: "LOG - After invocation, result: Alice is 30 years old" ``` ### Applying multiple middlewares[​](#applying-multiple-middlewares "Direct link to Applying multiple middlewares") You can apply multiple middlewares, which are executed in order of their priority: ``` const createValidationMiddleware = (): MiddlewareFn< [string, number], string > => { return ({ args, next, context, }: MiddlewareArgs<[string, number], string>) => { const [name, age] = args; if (age < 0) throw new Error("Age cannot be
negative"); return next(args); }; }; const createAuthMiddleware = (): MiddlewareFn<[string, number], string> => { return ({ args, next, context, }: MiddlewareArgs<[string, number], string>) => { console.log("Checking authorization..."); return next(args); }; }; const validationMiddleware = createValidationMiddleware(); const authMiddleware = createAuthMiddleware(); const wrappedFn = use(originalFn, [ loggingMiddleware, validationMiddleware, authMiddleware, ]); ``` ## Middleware types[​](#middleware-types "Direct link to Middleware types") ### MiddlewareFn[​](#middlewarefn "Direct link to MiddlewareFn") A function that receives middleware arguments and returns a result: ``` type MiddlewareFn<TParameters extends Array<unknown>, TReturn> = ( args: MiddlewareArgs<TParameters, TReturn>, ) => TReturn; ``` ### IMiddlewareObject[​](#imiddlewareobject "Direct link to IMiddlewareObject") A middleware object with an optional priority property: ``` class AuthMiddleware implements IMiddlewareObject<[string, number], string> { constructor(public readonly priority: number = 100) {} invoke({ args, next, context, }: MiddlewareArgs<[string, number], string>): string { // Authentication logic return next(args); } } const authMiddleware = new AuthMiddleware(100); const wrappedFn = use(originalFn, authMiddleware); ``` ### MiddlewareArgs[​](#middlewareargs "Direct link to MiddlewareArgs") The argument passed to each middleware: ``` type MiddlewareArgs<TParameters extends Array<unknown>, TReturn> = { // Original function arguments args: TParameters; // Function to invoke next middleware or original function next: NextFn<TParameters, TReturn>; // Execution context for storing request-scoped data context: IContext; }; ``` ## Patterns[​](#patterns "Direct link to Patterns") ### Priority-based ordering[​](#priority-based-ordering "Direct link to Priority-based ordering") Set priority on middleware objects to control execution order (lower numbers execute first): ``` const createPriorityMiddleware = ( name: string, priority: number, ): IMiddlewareObject<[string], string> => ({ priority, invoke: ({ args, next }:
MiddlewareArgs<[string], string>): string => { console.log(`${priority}. ${name}`); return next(args); }, }); const authMiddleware = createPriorityMiddleware("Auth", 10); const validationMiddleware = createPriorityMiddleware("Validation", 20); const loggingMiddleware = createPriorityMiddleware("Logging", 30); const wrappedFn = use( (value: string): string => value.toUpperCase(), [loggingMiddleware, validationMiddleware, authMiddleware], ); // Executes in order: Auth -> Validation -> Logging -> Original function ``` ### Using execution context[​](#using-execution-context "Direct link to Using execution context") Access and modify the execution context within middleware. For more details about the execution context module, see [Execution Context](/docs/components/execution-context.md). ``` import { contextToken } from "@daiso-tech/core/execution-context"; import { Namespace } from "@daiso-tech/core/namespace"; import { type MiddlewareFn } from "@daiso-tech/core/middleware/contracts"; const namespace = new Namespace("myapp"); type UserData = { id: string; name: string }; const userToken = contextToken(namespace.create("user").toString()); const createContextAwareMiddleware = ( defaultUser: UserData, ): MiddlewareFn<[string, number], string> => { return ({ args, next, context, }: MiddlewareArgs<[string, number], string>) => { const user = context.getOr(userToken, defaultUser); console.log("Executing as:", user.name); return next(args); }; }; const contextAwareMiddleware = createContextAwareMiddleware({ id: "anonymous", name: "Guest", }); const wrappedFn = use(originalFn, contextAwareMiddleware); ``` ### Async middleware[​](#async-middleware "Direct link to Async middleware") Middleware can be asynchronous: ``` const createAsyncValidationMiddleware = ( validator: (args: [string, number]) => Promise, ): MiddlewareFn<[string, number], Promise> => { return async ({ args, next, context, }: MiddlewareArgs<[string, number], Promise>): Promise => { // Perform async validation 
const isValid = await validator(args); if (!isValid) throw new Error("Validation failed"); return await next(args); }; }; const asyncValidationMiddleware = createAsyncValidationMiddleware(validateAsync); const wrappedFn = use(originalFn, asyncValidationMiddleware); ``` ### Short-circuiting middleware[​](#short-circuiting-middleware "Direct link to Short-circuiting middleware") Skip calling `next()` to bypass subsequent middleware and the original function: ``` const createCachingMiddleware = <TParameters extends Array<unknown>, TReturn>( cacheStore: Map<string, TReturn> = new Map(), ): MiddlewareFn<TParameters, TReturn> => { return ({ args, next, context }: MiddlewareArgs<TParameters, TReturn>) => { const cacheKey: string = JSON.stringify(args); if (cacheStore.has(cacheKey)) { console.log("Cache hit!"); return cacheStore.get(cacheKey); // Skip next() } const result = next(args); cacheStore.set(cacheKey, result); return result; }; }; const cache = new Map(); const cachingMiddleware = createCachingMiddleware(cache); ``` ### Error handling middleware[​](#error-handling-middleware "Direct link to Error handling middleware") Catch and handle errors in middleware: ``` const createErrorHandlingMiddleware = ( errorHandler?: (error: unknown) => void, ): MiddlewareFn<[string, number], Promise<string>> => { return async ({ args, next, context, }: MiddlewareArgs<[string, number], Promise<string>>): Promise<string> => { try { return await next(args); } catch (error) { const message = error instanceof Error ? error.message : String(error); console.error("Error occurred:", message); if (errorHandler) errorHandler(error); throw error; } }; }; const errorHandlingMiddleware = createErrorHandlingMiddleware((error) => console.log("Error handled gracefully"), ); ``` ### Enhancing Methods with `enhanceFactory`[​](#enhancing-methods-with-enhancefactory "Direct link to enhancing-methods-with-enhancefactory") The `enhanceFactory` function provides a convenient way to apply middleware to methods of class instances, enabling interception and augmentation of method calls without manually wrapping each function.
#### Purpose[​](#purpose "Direct link to Purpose") `enhanceFactory` is a higher-order factory that, given a `use` function (created by `useFactory`), returns an `enhance` function. This `enhance` function can be used to dynamically enhance (wrap) a method of an object with one or more middlewares. #### Signature[​](#signature "Direct link to Signature") ``` function enhanceFactory(use: Use): Enhance; ``` #### Usage Example[​](#usage-example "Direct link to Usage Example") ``` import { useFactory, enhanceFactory } from "@daiso-tech/core/middleware"; import { type MiddlewareFn } from "@daiso-tech/core/middleware/contracts"; // Create a middleware application function const use = useFactory(); const enhance = enhanceFactory(use); class Greeter { greet(name: string): string { return `Hello, ${name}!`; } } const greeter = new Greeter(); // Example middleware that logs calls function loggingMiddleware< TParameters extends Array<unknown>, TReturn, >(): MiddlewareFn<TParameters, TReturn> { return ({ args, next }) => { console.log("Calling greet with:", args); const result = next(args); console.log("Result:", result); return result; }; } // Enhance the 'greet' method with middleware enhance(greeter, "greet", loggingMiddleware()); greeter.greet("Alice"); // Logs: // Calling greet with: ["Alice"] // Result: Hello, Alice!
``` #### Enhancing Object Literal Methods[​](#enhancing-object-literal-methods "Direct link to Enhancing Object Literal Methods") You can enhance methods on plain object literals as well: ``` const obj = { add(a: number, b: number) { return a + b; }, }; enhance(obj, "add", loggingMiddleware()); obj.add(2, 3); // Logs: // Calling greet with: [2, 3] // Result: 5 ``` #### Enhancing Static Methods[​](#enhancing-static-methods "Direct link to Enhancing Static Methods") Static methods on classes can also be enhanced: ``` class MathUtils { static multiply(a: number, b: number) { return a * b; } } enhance(MathUtils, "multiply", loggingMiddleware()); MathUtils.multiply(4, 5); // Logs: // Calling greet with: [4, 5] // Result: 20 ``` #### Enhancing Class Prototype Methods[​](#enhancing-class-prototype-methods "Direct link to Enhancing Class Prototype Methods") You can enhance all instances of a class by enhancing its prototype: ``` class Person { say(message: string) { return `Person says: ${message}`; } } enhance(Person.prototype, "say", loggingMiddleware()); const alice = new Person(); alice.say("Hello!"); // Logs: // Calling greet with: ["Hello!"] // Result: Person says: Hello! ``` #### How it Works[​](#how-it-works "Direct link to How it Works") * The `enhance` function replaces the specified method on the object with a wrapped version that runs the provided middleware pipeline. * If the target property is not a function, a `TypeError` is thrown. * Multiple middlewares can be provided (as an array or single value). This pattern is useful for adding cross-cutting concerns (logging, validation, authorization, etc.) to class methods in a reusable and declarative way. 
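The mechanics in the bullet points above can be condensed into a small, self-contained sketch. This is an illustration of the pattern only; `enhanceSketch` and `Mw` are made-up names, not the library's implementation:

```typescript
// Illustrative sketch of method enhancement (not the library's actual code).
type Mw = (args: unknown[], next: (args: unknown[]) => unknown) => unknown;

function enhanceSketch<T extends object, K extends keyof T>(
    target: T,
    key: K,
    middlewares: Mw[],
): void {
    const original = target[key];
    // Enhancing a non-function property is a programming error.
    if (typeof original !== "function") {
        throw new TypeError(`Property "${String(key)}" is not a function`);
    }
    // Compose the pipeline so the original method sits at the core,
    // preserving `this` by applying it on the target object.
    const pipeline = middlewares.reduceRight<(args: unknown[]) => unknown>(
        (next, mw) => (args) => mw(args, next),
        (args) => (original as Function).apply(target, args),
    );
    // Replace the method in place with the wrapped version.
    (target as Record<K, unknown>)[key] = (...args: unknown[]) => pipeline(args);
}

// Usage: enhance a method on a plain object.
const counter = {
    value: 0,
    add(n: number) {
        this.value += n;
        return this.value;
    },
};
enhanceSketch(counter, "add", [
    (args, next) => {
        console.log("adding", args);
        return next(args);
    },
]);
console.log(counter.add(5)); // → 5
```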
*** ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/middleware`](https://daiso-tech.github.io/daiso-core/modules/Middleware.html) API docs. --- # Namespace The `@daiso-tech/core/namespace` component provides a seamless way to group data by prefixing and suffixing keys. ## Namespace class[​](#namespace-class "Direct link to Namespace class") The `Namespace` class provides a foundational way to prefix or suffix keys. This mechanism is vital for avoiding key conflicts and logically grouping related items, making it a useful primitive block for features like multi-tenancy. Components such as `Cache` and `Lock` utilize it to ensure data isolation. ``` import { Namespace } from "@daiso-tech/core/namespace"; const namespace = new Namespace("@my-namespace"); // Logs "@my-namespace:_rt" console.log(namespace.toString()); const key = namespace.create("my-key"); // Logs "my-key" console.log(key.get()); // Logs "@my-namespace:_rt:my-key" console.log(key.toString()); // You can extend the root const newNamespace = namespace.appendRoot("sub"); // Logs "@my-namespace:sub:_rt" console.log(newNamespace.toString()); // Logs "@my-namespace:sub:_rt:my-key" console.log(newNamespace.create("my-key").toString()); // Logs false because they use different namespaces console.log(newNamespace.create("my-key").equals(namespace.create("my-key"))); ``` info Note that the `Namespace` class is serializable. See the [`Serde`](/docs/components/serde.md#custom-serialization-and-deserialization-logic-of-classes) component for serialization instructions. ## NoOpNamespace class[​](#noopnamespace-class "Direct link to NoOpNamespace class") The `NoOpNamespace` class is used for disabling namespacing.
``` import { NoOpNamespace } from "@daiso-tech/core/namespace"; const namespace = new NoOpNamespace(); // Logs "" console.log(namespace.toString()); const key = namespace.create("my-key"); // Logs "my-key" console.log(key.get()); // Logs "my-key" console.log(key.toString()); ``` ## INamespace contract[​](#inamespace-contract "Direct link to INamespace contract") Both `Namespace` and `NoOpNamespace` implement the `INamespace` contract. The `INamespace` contract makes it easy to swap between the `Namespace` and `NoOpNamespace` classes without changing your code. ``` export interface IKey extends IEquals { get(): string; toString(): string; } export type INamespace = { toString(): string; create(key: string): IKey; }; ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/namespace`](https://daiso-tech.github.io/daiso-core/modules/Namespace.html) API docs. --- # Configuring rate-limiter adapters ## RedisRateLimiterAdapter[​](#redisratelimiteradapter "Direct link to RedisRateLimiterAdapter") To use the `RedisRateLimiterAdapter`, you'll need to: 1. Install the required dependency: [`ioredis`](https://www.npmjs.com/package/ioredis) package: ``` import { RedisRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/redis-rate-limiter-adapter"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisRateLimiterAdapter = new RedisRateLimiterAdapter({ database, }); ``` ### Configuring backoff policy[​](#configuring-backoff-policy "Direct link to Configuring backoff policy") The `type` field is the only required field. All other fields are optional.
``` import { BACKOFFS } from "@daiso-tech/core/backoff-policies"; import { TimeSpan } from "@daiso-tech/core/time-span"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisRateLimiterAdapter = new RedisRateLimiterAdapter({ database, backoffPolicy: { type: BACKOFFS.CONSTANT, delay: TimeSpan.fromSeconds(1), jitter: 0.5, }, }); ``` The settings are the same as [backoff policies](/docs/components/backoff_policies.md) settings. ### Configuring rate-limiter policy[​](#configuring-rate-limiter-policy "Direct link to Configuring rate-limiter policy") The `type` field is the only required field. All other fields are optional. ``` import { POLICIES } from "@daiso-tech/core/rate-limiter/policies"; import { TimeSpan } from "@daiso-tech/core/time-span"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const redisRateLimiterAdapter = new RedisRateLimiterAdapter({ database, rateLimiterPolicy: { type: POLICIES.SLIDING_WINDOW, window: TimeSpan.fromSeconds(1), }, }); ``` The settings are the same as [rate-limiter policies](/docs/components/rate-limiter/configuring_rate_limiter_policies.md) settings. ## DatabaseRateLimiterAdapter[​](#databaseratelimiteradapter "Direct link to DatabaseRateLimiterAdapter") To use the `DatabaseRateLimiterAdapter`, you'll need an `IRateLimiterStorageAdapter`: 1. Create an `IRateLimiterStorageAdapter`: ``` import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter"; const rateLimiterStorageAdapter = new MemoryRateLimiterStorageAdapter(); ``` 2. Create the `DatabaseRateLimiterAdapter`: ``` import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter"; const rateLimiterAdapter = new DatabaseRateLimiterAdapter({ adapter: rateLimiterStorageAdapter, }); ``` ### Configuring backoff policy[​](#configuring-backoff-policy-1 "Direct link to Configuring backoff policy") You can use any of the defined [backoff policies](/docs/components/backoff_policies.md).
``` import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter"; import { constantBackoff } from "@daiso-tech/core/backoff-policies"; const rateLimiterAdapter = new DatabaseRateLimiterAdapter({ adapter: rateLimiterStorageAdapter, backoffPolicy: constantBackoff(), }); ``` ### Configuring rate-limiter policy[​](#configuring-rate-limiter-policy-1 "Direct link to Configuring rate-limiter policy") You can use any of the defined [rate-limiter policies](/docs/components/rate-limiter/configuring_rate_limiter_policies.md) or [create your own](/docs/components/rate-limiter/creating_rate_limiter_policies.md). ``` import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter"; import { SlidingWindowLimiter } from "@daiso-tech/core/rate-limiter/policies"; const rateLimiterAdapter = new DatabaseRateLimiterAdapter({ adapter: rateLimiterStorageAdapter, rateLimiterPolicy: new SlidingWindowLimiter(), }); ``` ## NoOpRateLimiterAdapter[​](#noopratelimiteradapter "Direct link to NoOpRateLimiterAdapter") The `NoOpRateLimiterAdapter` is a no-operation implementation; it performs no actions when called: ``` import { NoOpRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/no-op-rate-limiter-adapter"; const noOpRateLimiterAdapter = new NoOpRateLimiterAdapter(); ``` info The `NoOpRateLimiterAdapter` is useful when you want to mock out or disable your [`RateLimiterProvider`](https://daiso-tech.github.io/daiso-core/classes/RateLimiter.RateLimiterProvider.html) instance. ## KyselyRateLimiterStorageAdapter[​](#kyselyratelimiterstorageadapter "Direct link to KyselyRateLimiterStorageAdapter") To use the `KyselyRateLimiterStorageAdapter`, you'll need to: 1. Use a database provider that supports transactions. 2. Install the required dependency: [`kysely`](https://www.npmjs.com/package/kysely) package: 3.
Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; const serde = new Serde(new SuperJsonSerdeAdapter()); ``` ### With Sqlite[​](#with-sqlite "Direct link to With Sqlite") You will need to install [`better-sqlite3`](https://www.npmjs.com/package/better-sqlite3) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/kysely-rate-limiter-storage-adapter"; import Sqlite from "better-sqlite3"; import { Kysely, SqliteDialect } from "kysely"; const database = new Sqlite("DATABASE_NAME.db"); const kysely = new Kysely({ dialect: new SqliteDialect({ database, }), }); const kyselyRateLimiterStorageAdapter = new KyselyRateLimiterStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyRateLimiterStorageAdapter.init(); ``` ### With Postgres[​](#with-postgres "Direct link to With Postgres") You will need to install [`pg`](https://www.npmjs.com/package/pg) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/kysely-rate-limiter-storage-adapter"; import { Pool } from "pg"; import { Kysely, PostgresDialect } from "kysely"; const database = new Pool({ database: "DATABASE_NAME", host: "DATABASE_HOST", user: "DATABASE_USER", // Database port port: 5432, password: "DATABASE_PASSWORD", max: 10, }); const kysely = new Kysely({ dialect: new PostgresDialect({ pool: database, }), }); const kyselyRateLimiterStorageAdapter = new KyselyRateLimiterStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselyRateLimiterStorageAdapter.init(); ``` ### With Mysql[​](#with-mysql "Direct link to With Mysql") You will need to install [`mysql2`](https://www.npmjs.com/package/mysql2) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/kysely-rate-limiter-storage-adapter"; import { createPool } from "mysql2"; import { Kysely, MysqlDialect } from "kysely"; const database = createPool({ host: "DATABASE_HOST", // Database port port: 3306, database: "DATABASE_NAME", user: "DATABASE_USER", password: "DATABASE_PASSWORD", connectionLimit: 10, }); const kysely = new Kysely({ dialect: new MysqlDialect({ pool: database, }), }); const kyselyRateLimiterStorageAdapter = new KyselyRateLimiterStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselyRateLimiterStorageAdapter.init(); ``` ### With Libsql[​](#with-libsql "Direct link to With Libsql") You will need to install [`@libsql/kysely-libsql`](https://www.npmjs.com/package/@libsql/kysely-libsql) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselyRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/kysely-rate-limiter-storage-adapter"; import { LibsqlDialect } from "@libsql/kysely-libsql"; import { Kysely } from "kysely"; const kysely = new Kysely({ dialect: new LibsqlDialect({ url: "DATABASE_URL", }), }); const kyselyRateLimiterStorageAdapter = new KyselyRateLimiterStorageAdapter({ kysely, serde, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselyRateLimiterStorageAdapter.init(); ``` ## MemoryRateLimiterStorageAdapter[​](#memoryratelimiterstorageadapter "Direct link to MemoryRateLimiterStorageAdapter") To use the `MemoryRateLimiterStorageAdapter` you only need to create an instance of it: ``` import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter"; const memoryRateLimiterStorageAdapter = new MemoryRateLimiterStorageAdapter(); ``` You can also provide a `Map` that will be used for storing the data in memory: ``` import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter"; const map = new Map(); const memoryRateLimiterStorageAdapter = new MemoryRateLimiterStorageAdapter( map, ); ``` info `MemoryRateLimiterStorageAdapter` lets you test your app without external dependencies like `Redis`, ideal for local development, unit tests, integration tests and fast E2E tests for the backend application. ## MongodbRateLimiterStorageAdapter[​](#mongodbratelimiterstorageadapter "Direct link to MongodbRateLimiterStorageAdapter") To use the `MongodbRateLimiterStorageAdapter`, you'll need to: 1. Use a database provider that supports transactions. 2. Install the required dependency: [`mongodb`](https://www.npmjs.com/package/mongodb) package: 3. Provide a string serializer ([`ISerde`](/docs/components/serde.md)): * We recommend using `SuperJsonSerdeAdapter` for this purpose ``` import { MongodbRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/mongodb-rate-limiter-storage-adapter"; import { MongoClient } from "mongodb"; const client = await MongoClient.connect("YOUR_MONGODB_CONNECTION_STRING"); const database = client.db("database"); const mongodbRateLimiterStorageAdapter = new MongodbRateLimiterStorageAdapter({ client, database, serde, }); // You need to initialize the adapter once before using it.
// During the initialization the indexes will be created await mongodbRateLimiterStorageAdapter.init(); ``` ## NoOpRateLimiterStorageAdapter[​](#noopratelimiterstorageadapter "Direct link to NoOpRateLimiterStorageAdapter") The `NoOpRateLimiterStorageAdapter` is a no-operation implementation; it performs no actions when called: ``` import { NoOpRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/no-op-rate-limiter-storage-adapter"; const noOpRateLimiterStorageAdapter = new NoOpRateLimiterStorageAdapter(); ``` info The `NoOpRateLimiterStorageAdapter` is useful when you want to mock out or disable your [`DatabaseRateLimiterAdapter`](https://daiso-tech.github.io/daiso-core/classes/RateLimiter.DatabaseRateLimiterAdapter.html) instance. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/rate-limiter`](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) API docs. --- # Configuring rate-limiter policies ## SlidingWindowLimiter[​](#slidingwindowlimiter "Direct link to SlidingWindowLimiter") ``` import { SlidingWindowLimiter } from "@daiso-tech/core/rate-limiter/policies"; import { TimeSpan } from "@daiso-tech/core/time-span"; new SlidingWindowLimiter({ /** * The time span in which attempts are active before resetting. * The field is optional. */ window: TimeSpan.fromSeconds(1), /** * The field is optional. */ margin: TimeSpan.fromSeconds(4).divide(4), }); ``` ## FixedWindowLimiter[​](#fixedwindowlimiter "Direct link to FixedWindowLimiter") ``` import { FixedWindowLimiter } from "@daiso-tech/core/rate-limiter/policies"; import { TimeSpan } from "@daiso-tech/core/time-span"; new FixedWindowLimiter({ /** * The time span in which attempts are active before resetting. * The field is optional.
*/ window: TimeSpan.fromSeconds(1), }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/rate-limiter`](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) API docs. --- # Creating rate-limiter adapters ## Implementing your custom IRateLimiterAdapter[​](#implementing-your-custom-iratelimiteradapter "Direct link to Implementing your custom IRateLimiterAdapter") To create an adapter, you need to implement the [`IRateLimiterAdapter`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterAdapter.html) contract. ## Implementing your custom IRateLimiterStorageAdapter[​](#implementing-your-custom-iratelimiterstorageadapter "Direct link to Implementing your custom IRateLimiterStorageAdapter") We provide an additional contract [`IRateLimiterStorageAdapter`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterStorageAdapter.html) for building custom rate-limiter storage adapters tailored to [`DatabaseRateLimiterAdapter`](/docs/components/rate-limiter/configuring_rate_limiter_adapters.md#databaseratelimiteradapter) and [`DatabaseRateLimiterProviderFactory`](/docs/components/rate-limiter/rate_limiter_factory_resolver.md#databaseratelimiterfactoryresolver). ## Testing your custom IRateLimiterStorageAdapter[​](#testing-your-custom-iratelimiterstorageadapter "Direct link to Testing your custom IRateLimiterStorageAdapter") We provide a complete test suite to test your rate-limiter storage adapter implementation.
Simply use the [`rateLimiterBreakerStorageTestSuite`](https://daiso-tech.github.io/daiso-core/functions/RateLimiter.rateLimiterBreakerStorageTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` // filename: MyRateLimiterStorageAdapter.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { rateLimiterBreakerStorageTestSuite } from "@daiso-tech/core/rate-limiter/test-utilities"; import { MyRateLimiterStorageAdapter } from "./MyRateLimiterStorageAdapter.js"; describe("class: MyRateLimiterStorageAdapter", () => { rateLimiterBreakerStorageTestSuite({ createAdapter: () => new MyRateLimiterStorageAdapter(), test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom IRateLimiterProvider class[​](#implementing-your-custom-iratelimiterprovider-class "Direct link to Implementing your custom IRateLimiterProvider class") In some cases, you may need to implement a custom [`RateLimiterProvider`](https://daiso-tech.github.io/daiso-core/classes/RateLimiter.RateLimiterProvider.html) class to optimize performance for your specific technology stack. You can then directly implement the [`IRateLimiterProvider`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterProvider.html) contract. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/rate-limiter`](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) API docs. --- # Creating policies ## Implementing your custom IRateLimiterPolicy[​](#implementing-your-custom-iratelimiterpolicy "Direct link to Implementing your custom IRateLimiterPolicy") To create a custom rate-limiter policy, you need to implement the [`IRateLimiterPolicy`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterPolicy.html) contract.
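To build intuition for the logic such a policy encapsulates, here is a self-contained fixed-window counter in plain TypeScript. It only illustrates the algorithm; it does not follow the actual `IRateLimiterPolicy` contract, so refer to the library's `FixedWindowLimiter` implementation for the real interface:

```typescript
// Conceptual fixed-window rate limiting; not the library's IRateLimiterPolicy shape.
class FixedWindowSketch {
    private windowStart = 0;
    private count = 0;

    constructor(
        private readonly limit: number,
        private readonly windowMs: number,
        private readonly now: () => number = Date.now,
    ) {}

    // Returns true if the attempt is allowed within the current window.
    tryConsume(): boolean {
        const t = this.now();
        // A new window begins once windowMs has elapsed; the counter resets.
        if (t - this.windowStart >= this.windowMs) {
            this.windowStart = t;
            this.count = 0;
        }
        if (this.count >= this.limit) return false;
        this.count += 1;
        return true;
    }
}

// Usage with a fake clock: 2 attempts allowed per 1000 ms window.
let fakeTime = 0;
const limiter = new FixedWindowSketch(2, 1000, () => fakeTime);
console.log(limiter.tryConsume()); // → true
console.log(limiter.tryConsume()); // → true
console.log(limiter.tryConsume()); // → false (limit reached)
fakeTime = 1000; // window elapses
console.log(limiter.tryConsume()); // → true
```

Injecting the clock (`now`) keeps the sketch deterministic and testable, which is the same reason real policies delegate time and counters to a storage adapter.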
Custom rate-limiter policies can be used with [`DatabaseRateLimiterAdapter`](/docs/components/rate-limiter/configuring_rate_limiter_adapters.md#databaseratelimiteradapter) and [`DatabaseRateLimiterProviderFactory`](/docs/components/rate-limiter/rate_limiter_factory_resolver.md#databaseratelimiterfactoryresolver). To understand how to implement a custom [`IRateLimiterPolicy`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterPolicy.html), refer to the [`FixedWindowLimiter`](https://github.com/yousif-khalil-abdulkarim/daiso-core/blob/main/src/rate-limiter/implementations/policies/fixed-window-limiter/fixed-window-limiter.ts) implementation. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/rate-limiter`](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) API docs. --- # Rate-limiter resolver factory classes ## RateLimiterFactoryResolver[​](#ratelimiterfactoryresolver "Direct link to RateLimiterFactoryResolver") The `RateLimiterFactoryResolver` class provides a flexible way to configure and switch between different rate-limiter adapters at runtime. ### Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `RateLimiterFactoryResolver`, you will need to register all required adapters during initialization.
```
import { RateLimiterFactoryResolver } from "@daiso-tech/core/rate-limiter";
import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter";
import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter";
import { RedisRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/redis-rate-limiter-adapter";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const rateLimiterFactoryResolver = new RateLimiterFactoryResolver({
    serde,
    adapters: {
        memory: new DatabaseRateLimiterAdapter({
            adapter: new MemoryRateLimiterStorageAdapter(),
        }),
        redis: new RedisRateLimiterAdapter({
            database: new Redis("YOUR_REDIS_CONNECTION"),
        }),
    },
    defaultAdapter: "memory",
});
```

### Usage[​](#usage "Direct link to Usage")

#### 1. Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1. Using the default adapter")

```
// Will apply rate-limiter logic using the default adapter, which is the memory adapter
await rateLimiterFactoryResolver
    .use()
    .create("a")
    .runOrFail(async () => {
        // ... code to apply rate-limiter logic
    });
```

danger Note that if you don't set a default adapter, an error will be thrown.

#### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2. Specifying an adapter explicitly")

```
// Will apply rate-limiter logic using the redis adapter
await rateLimiterFactoryResolver
    .use("redis")
    .create("a")
    .runOrFail(async () => {
        // ... code to apply rate-limiter logic
    });
```

danger Note that if you specify a non-existent adapter, an error will be thrown.

#### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3. Overriding default settings")

```
await rateLimiterFactoryResolver
    .setNamespace(new Namespace(["@", "test"]))
    .use("redis")
    .create("a")
    .runOrFail(async () => {
        // ... code to apply rate-limiter logic
    });
```

info Note that the `RateLimiterFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one.

## DatabaseRateLimiterFactoryResolver[​](#databaseratelimiterfactoryresolver "Direct link to DatabaseRateLimiterFactoryResolver")

The `DatabaseRateLimiterFactoryResolver` class provides a flexible way to configure and switch between different rate-limiter-storage adapters at runtime.

### Initial configuration[​](#initial-configuration-1 "Direct link to Initial configuration")

To begin using the `DatabaseRateLimiterFactoryResolver`, you will need to register all required adapters during initialization.

```
import { DatabaseRateLimiterFactoryResolver } from "@daiso-tech/core/rate-limiter";
import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter";
import { KyselyRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/kysely-rate-limiter-storage-adapter";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Sqlite from "better-sqlite3";
import { Kysely, SqliteDialect } from "kysely";

const serde = new Serde(new SuperJsonSerdeAdapter());
const rateLimiterFactoryResolver = new DatabaseRateLimiterFactoryResolver({
    serde,
    adapters: {
        memory: new MemoryRateLimiterStorageAdapter(),
        sqlite: new KyselyRateLimiterStorageAdapter({
            kysely: new Kysely({
                dialect: new SqliteDialect({
                    database: new Sqlite("local.db"),
                }),
            }),
            serde,
        }),
    },
    defaultAdapter: "memory",
});
```

### Usage[​](#usage-1 "Direct link to Usage")

#### 1. Using the default adapter[​](#1-using-the-default-adapter-1 "Direct link to 1. Using the default adapter")

```
// Will apply rate-limiter logic using the default adapter, which is MemoryRateLimiterStorageAdapter
await rateLimiterFactoryResolver
    .use()
    .create("a")
    .runOrFail(async () => {
        // ... code to apply rate-limiter logic
    });
```

danger Note that if you don't set a default adapter, an error will be thrown.

#### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly-1 "Direct link to 2. Specifying an adapter explicitly")

```
// Will apply rate-limiter logic using the sqlite adapter (KyselyRateLimiterStorageAdapter)
await rateLimiterFactoryResolver
    .use("sqlite")
    .create("a")
    .runOrFail(async () => {
        // ... code to apply rate-limiter logic
    });
```

danger Note that if you specify a non-existent adapter, an error will be thrown.

#### 3. Overriding default settings[​](#3-overriding-default-settings-1 "Direct link to 3. Overriding default settings")

```
import { SlidingWindowLimiter } from "@daiso-tech/core/rate-limiter/policies";
import { constantBackoff } from "@daiso-tech/core/backoff-policies";

await rateLimiterFactoryResolver
    .setBackoffPolicy(constantBackoff())
    .setRateLimiterPolicy(new SlidingWindowLimiter())
    .use("sqlite")
    .create("a")
    .runOrFail(async () => {
        // ... code to apply rate-limiter logic
    });
```

info Note that the `DatabaseRateLimiterFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one.
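The immutability noted above can be illustrated with a tiny standalone sketch (the `ImmutableResolverConfig` class below is hypothetical, not the library's implementation): each override builds a new instance instead of mutating the receiver, so a shared base configuration can never be changed behind your back.

```typescript
// Hypothetical sketch of the immutable-override pattern: setNamespace
// returns a new instance instead of mutating the receiver.
class ImmutableResolverConfig {
    constructor(
        readonly namespace: string,
        readonly defaultAdapter: string,
    ) {}

    setNamespace(namespace: string): ImmutableResolverConfig {
        // No mutation: build a fresh instance carrying the override.
        return new ImmutableResolverConfig(namespace, this.defaultAdapter);
    }
}

const base = new ImmutableResolverConfig("@", "memory");
const overridden = base.setNamespace("@test");

console.log(base.namespace); // "@" (unchanged)
console.log(overridden.namespace); // "@test"
console.log(base === overridden); // false
```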
## Further information[​](#further-information "Direct link to Further information")

For further information, refer to the [`@daiso-tech/core/rate-limiter`](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) API docs.

---

# Rate-limiter usage

The `@daiso-tech/core/rate-limiter` component provides a way to apply rate limiting independently of the underlying platform or storage.

## Initial configuration[​](#initial-configuration "Direct link to Initial configuration")

To begin using the `RateLimiterFactory` class, you'll need to create and configure an instance:

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter";
import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter";
import { RateLimiterFactory } from "@daiso-tech/core/rate-limiter";

const rateLimiterFactory = new RateLimiterFactory({
    // You can provide default settings
    // You can choose the adapter to use
    adapter: new DatabaseRateLimiterAdapter({
        adapter: new MemoryRateLimiterStorageAdapter(),
    }),
});
```

info Here is a complete list of settings for the [`RateLimiterFactory`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.RateLimiterFactorySettingsBase.html) class.

## Rate-limiter basics[​](#rate-limiter-basics "Direct link to Rate-limiter basics")

### Creating a rate-limiter[​](#creating-a-rate-limiter "Direct link to Creating a rate-limiter")

```
const rateLimiter = rateLimiterFactory.create("resource");
```

### Using the rate-limiter[​](#using-the-rate-limiter "Direct link to Using the rate-limiter")

```
// The function will only be called when the rate-limiter allows the attempt.
await rateLimiter.runOrFail(async () => {
    // The code / function to rate limit, call it here
});
```

info Note that the method throws an error when the rate-limiter is blocked.
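The throw-on-blocked behavior can be illustrated with a small standalone sketch. The `runOrFail` helper and `BlockedError` class below are hypothetical stand-ins, not the library's implementations:

```typescript
// Hypothetical sketch of runOrFail semantics: invoke the function only
// when the limiter allows the attempt, otherwise throw.
class BlockedError extends Error {}

async function runOrFail<T>(
    isAllowed: () => boolean,
    fn: () => Promise<T> | T,
): Promise<T> {
    if (!isAllowed()) {
        throw new BlockedError("Rate limit exceeded");
    }
    return await fn();
}

// Allowed: the function runs and its value is returned.
const value = await runOrFail(() => true, async () => 42);
console.log(value); // 42

// Blocked: the function never runs and an error is thrown instead.
let blockedCaught = false;
try {
    await runOrFail(() => false, async () => 42);
} catch (error) {
    blockedCaught = error instanceof BlockedError;
}
console.log(blockedCaught); // true
```

In application code, you would typically catch the blocked error at a boundary (e.g., an HTTP handler) and translate it into a response such as `429 Too Many Requests`.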
info You can provide a synchronous or asynchronous [`Invokable<[], TValue | Promise<TValue>>`](/docs/utilities/invokable.md) as the value for the `runOrFail` method.

### Applying the rate-limiter to errors only[​](#applying-rate-limiter-on-only-erros "Direct link to Applying the rate-limiter to errors only")

The rate-limiter counts all attempts by default. You can optionally configure it to track only failed requests.

```
const rateLimiter = rateLimiterFactory.create("resource", {
    onlyError: true,
});

await rateLimiter.runOrFail(async () => {
    // The code / function to rate limit, call it here
});
```

### Applying the rate-limiter to certain errors[​](#applying-rate-limiter-on-certiain-errors "Direct link to Applying the rate-limiter to certain errors")

```
class ErrorA extends Error {}

const rateLimiter = rateLimiterFactory.create("resource", {
    onlyError: true,
    // The error policy only takes effect when "onlyError" is set to true
    errorPolicy: ErrorA,
});

await rateLimiter.runOrFail(async () => {
    // The code / function to rate limit, call it here
});
```

### Resetting the rate-limiter[​](#reseting-the-rate-limiter "Direct link to Resetting the rate-limiter")

You can manually reset the rate-limiter state back to the allowed state.

```
await rateLimiter.reset();
```

### Checking rate-limiter state[​](#checking-rate-limiter-state "Direct link to Checking rate-limiter state")

You can get the rate-limiter state by using the `getState` method, which returns a [`RateLimiterState`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.RateLimiterState.html).
```
import { RATE_LIMITER_STATE } from "@daiso-tech/core/rate-limiter/contracts";

const state = await rateLimiter.getState();
if (state === RATE_LIMITER_STATE.EXPIRED) {
    console.log("The rate limiter key doesn't exist");
}
if (state === RATE_LIMITER_STATE.ALLOWED) {
    console.log("The rate limiter is allowing calls");
}
if (state === RATE_LIMITER_STATE.BLOCKED) {
    console.log("The rate limiter is blocking calls");
}
```

### Rate-limiter instance variables[​](#rate-limiter-instance-variables "Direct link to Rate-limiter instance variables")

The `RateLimiter` class exposes instance variables such as:

```
const rateLimiter = rateLimiterFactory.create("resource");

// Will return the key of the rate-limiter, which is "resource"
console.log(rateLimiter.key.toString());
```

info The `key` field is an object that implements the [`IKey`](/docs/components/namespace.md) contract.

## Patterns[​](#patterns "Direct link to Patterns")

### Namespacing[​](#namespacing "Direct link to Namespacing")

You can use the `Namespace` class to group related rate-limiters without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract.

info For further information about namespacing, refer to the [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation.
```
import { Namespace } from "@daiso-tech/core/namespace";
import { RedisRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/redis-rate-limiter-adapter";
import { RateLimiterFactory } from "@daiso-tech/core/rate-limiter";
import Redis from "ioredis";

const database = new Redis("YOUR_REDIS_CONNECTION_STRING");
const rateLimiterFactoryA = new RateLimiterFactory({
    namespace: new Namespace("@rate-limiter-a"),
    adapter: new RedisRateLimiterAdapter({ database }),
});
const rateLimiterFactoryB = new RateLimiterFactory({
    namespace: new Namespace("@rate-limiter-b"),
    adapter: new RedisRateLimiterAdapter({ database }),
});

const rateLimiterA = rateLimiterFactoryA.create("key");
const rateLimiterB = rateLimiterFactoryB.create("key");

await rateLimiterA.runOrFail(async () => {
    // some operation
});

// Will log "ALLOWED"
console.log((await rateLimiterA.getState()).type);

// Will log "EXPIRED" because rateLimiterB is in a different namespace
console.log((await rateLimiterB.getState()).type);
```

### Serialization and deserialization of rate-limiters[​](#serialization-and-deserialization-of-rate-limiters "Direct link to Serialization and deserialization of rate-limiters")

Rate-limiters can be serialized, allowing them to be transmitted over the network to another server and later deserialized for reuse. This means you can, for example, create the rate-limiter on the main server, transfer it to a queue worker server, and continue using it there. In order to serialize or deserialize a rate-limiter, you need to pass an object that implements the [`ISerderRegister`](/docs/components/serde.md) contract, such as the [`Serde`](/docs/components/serde.md) class, to the `RateLimiterFactory`.
Manually serializing and deserializing the rate-limiter:

```
import { RedisRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/redis-rate-limiter-adapter";
import { RateLimiterFactory } from "@daiso-tech/core/rate-limiter";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redisClient = new Redis("YOUR_REDIS_CONNECTION");
const rateLimiterFactory = new RateLimiterFactory({
    // You can also pass in an array of Serde class instances
    serde,
    adapter: new RedisRateLimiterAdapter({ database: redisClient }),
});

const rateLimiter = rateLimiterFactory.create("resource");
const serializedRateLimiter = serde.serialize(rateLimiter);
const deserializedRateLimiter = serde.deserialize(serializedRateLimiter);
```

danger When serializing or deserializing a rate-limiter, you must use the same `Serde` instances that were provided to the `RateLimiterFactory`. This is required because the `RateLimiterFactory` injects custom serialization logic for `IRateLimiter` instances into the `Serde` instances.

info Note that you only need manual serialization and deserialization when integrating with external libraries. As long as you pass the same `Serde` instances to all other components, you don't need to serialize and deserialize the rate-limiter manually.
```
import { RedisRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/redis-rate-limiter-adapter";
import type { IRateLimiter } from "@daiso-tech/core/rate-limiter/contracts";
import { RateLimiterFactory } from "@daiso-tech/core/rate-limiter";
import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter";
import { EventBus } from "@daiso-tech/core/event-bus";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redis = new Redis("YOUR_REDIS_CONNECTION");

type EventMap = {
    "sending-rate-limiter-over-network": {
        rateLimiter: IRateLimiter;
    };
};
const eventBus = new EventBus<EventMap>({
    adapter: new RedisPubSubEventBusAdapter({
        client: redis,
        serde,
    }),
});

const rateLimiterFactory = new RateLimiterFactory({
    serde,
    adapter: new RedisRateLimiterAdapter({ database: redis }),
    eventBus,
});

const rateLimiter = rateLimiterFactory.create("resource");

// We are sending the rateLimiter over the network to other servers.
await eventBus.dispatch("sending-rate-limiter-over-network", {
    rateLimiter,
});

// The other servers will receive the serialized rateLimiter and automatically deserialize it.
await eventBus.addListener(
    "sending-rate-limiter-over-network",
    ({ rateLimiter }) => {
        // The rateLimiter is deserialized and can be used
        console.log("RATE_LIMITER:", rateLimiter);
    },
);
```

### Rate-limiter events[​](#rate-limiter-events "Direct link to Rate-limiter events")

You can listen to different [rate-limiter events](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) that are triggered by the `RateLimiter` instance. Refer to the [`EventBus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events. Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract.
```
import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter";
import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter";
import {
    RateLimiterFactory,
    RATE_LIMITER_EVENTS,
} from "@daiso-tech/core/rate-limiter";
import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter";

const rateLimiterFactory = new RateLimiterFactory({
    adapter: new DatabaseRateLimiterAdapter({
        adapter: new MemoryRateLimiterStorageAdapter(),
    }),
    eventBus: new MemoryEventBusAdapter(),
});

await rateLimiterFactory.events.addListener(
    RATE_LIMITER_EVENTS.BLOCKED,
    (event) => {
        console.log("Got blocked:", event);
    },
);

await rateLimiterFactory.create("a").isolate();
```

warning If multiple rate-limiter adapters (e.g., `RedisRateLimiterAdapter` and `DatabaseRateLimiterAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters.
```
import { RedisRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/redis-rate-limiter-adapter";
import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter";
import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter";
import { RateLimiterFactory } from "@daiso-tech/core/rate-limiter";
import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";
import { Namespace } from "@daiso-tech/core/namespace";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({
    client: new Redis("YOUR_REDIS_CONNECTION_STRING"),
    serde,
});

const memoryRateLimiterFactory = new RateLimiterFactory({
    adapter: new DatabaseRateLimiterAdapter({
        adapter: new MemoryRateLimiterStorageAdapter(),
    }),
    // We assign distinct namespaces to DatabaseRateLimiterAdapter and RedisRateLimiterAdapter to isolate their events.
    namespace: new Namespace(["memory", "event-bus"]),
    eventBus: redisPubSubEventBusAdapter,
});

const redisRateLimiterAdapter = new RedisRateLimiterAdapter({
    serde,
    database: new Redis("YOUR_REDIS_CONNECTION_STRING"),
});
const redisRateLimiterFactory = new RateLimiterFactory({
    adapter: redisRateLimiterAdapter,
    // We assign distinct namespaces to DatabaseRateLimiterAdapter and RedisRateLimiterAdapter to isolate their events.
    namespace: new Namespace(["redis", "event-bus"]),
    eventBus: redisPubSubEventBusAdapter,
});
```

### Separating creating, listening to and using rate-limiters[​](#separating-creating-listening-to-and-using-rate-limiters "Direct link to Separating creating, listening to and using rate-limiters")

The library includes 3 additional contracts:

* [`IRateLimiter`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiter.html) - Allows only manipulating the rate-limiter.
* [`IRateLimiterFactoryBase`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterFactoryBase.html) - Allows only creating rate-limiters.
* [`IRateLimiterListenable`](https://daiso-tech.github.io/daiso-core/types/RateLimiter.IRateLimiterListenable.html) - Allows only listening to rate-limiter events.

This separation makes it easy to visually distinguish the 3 contracts, making it immediately obvious that they serve different purposes.

```
import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter";
import { RateLimiterFactory } from "@daiso-tech/core/rate-limiter";
import { MemoryRateLimiterStorageAdapter } from "@daiso-tech/core/rate-limiter/memory-rate-limiter-storage-adapter";
import { DatabaseRateLimiterAdapter } from "@daiso-tech/core/rate-limiter/database-rate-limiter-adapter";
import {
    type IRateLimiter,
    type IRateLimiterFactoryBase,
    type IRateLimiterListenable,
    RATE_LIMITER_EVENTS,
} from "@daiso-tech/core/rate-limiter/contracts";

async function rateLimiterFunc(rateLimiter: IRateLimiter): Promise<void> {
    await rateLimiter.runOrFail(async () => {
        // ... rate limited section
    });
}

async function rateLimiterFactoryFunc(
    rateLimiterFactory: IRateLimiterFactoryBase,
): Promise<void> {
    // You cannot access the listener methods
    // You will get a typescript error if you try
    const rateLimiter = rateLimiterFactory.create("resource");
    await rateLimiterFunc(rateLimiter);
}

async function rateLimiterListenableFunc(
    rateLimiterListenable: IRateLimiterListenable,
): Promise<void> {
    // You cannot access the rateLimiterFactory methods
    // You will get a typescript error if you try
    await rateLimiterListenable.addListener(
        RATE_LIMITER_EVENTS.BLOCKED,
        (event) => {
            console.log("Blocked:", event);
        },
    );
}

const rateLimiterFactory = new RateLimiterFactory({
    adapter: new DatabaseRateLimiterAdapter({
        adapter: new MemoryRateLimiterStorageAdapter(),
    }),
    eventBus: new MemoryEventBusAdapter(),
});

await rateLimiterListenableFunc(rateLimiterFactory.events);
await rateLimiterFactoryFunc(rateLimiterFactory);
```

## Further information[​](#further-information "Direct link to Further information")

For further information, refer to the [`@daiso-tech/core/rate-limiter`](https://daiso-tech.github.io/daiso-core/modules/RateLimiter.html) API docs.

---

# Resilience

The `@daiso-tech/core/resilience` component provides predefined fault-tolerant `middlewares`.

info For further information about `middlewares`, refer to the [`@daiso-tech/core/middleware`](/docs/components/middleware.md) documentation.
## Fallback[​](#fallback "Direct link to Fallback")

The `fallback` middleware returns a fallback value when an error occurs:

### Usage[​](#usage "Direct link to Usage")

```
import { fallback } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";

const use = useFactory();

function unstableFn(): number {
    // We simulate a function that can throw unexpected errors
    if (Math.round(Math.random() * 1.5) === 0) {
        throw new Error("Unexpected error occurred");
    }
    return Math.round((Math.random() + 1) * 99);
}

const fn = use(unstableFn, [
    fallback({
        fallbackValue: 1,
    }),
]);

// Will never throw; when an error occurs, the fallback value is returned.
console.log(await fn());
```

info You can provide a synchronous or asynchronous [`Invokable<[], TValue | Promise<TValue>>`](/docs/utilities/invokable.md) as the fallback value.

### Custom ErrorPolicy[​](#custom-errorpolicy "Direct link to Custom ErrorPolicy")

You can define an [`ErrorPolicy`](/docs/utilities/error_policy_type.md) to specify fallback values for specific error cases:

```
const fn = use(unstableFn, [
    fallback({
        fallbackValue: 1,
        // Will only fall back for errors that are not a TypeError
        errorPolicy: (error) => !(error instanceof TypeError),
    }),
]);

await fn();
```

### Callbacks[​](#callbacks "Direct link to Callbacks")

You can add a callback [`Invokable`](/docs/utilities/invokable.md) that will be called before the fallback value is returned.

```
const fn = use(unstableFn, [
    fallback({
        fallbackValue: 1,
        onFallback: (fallbackData) => console.log(fallbackData),
    }),
]);

await fn();
```

info For more details about the `onFallback` callback data, see the `OnFallbackData` type.

## Retry[​](#retry "Direct link to Retry")

The `retry` middleware enables automatic retries for all errors or specific errors, with configurable backoff policies. An error will be thrown when all retry attempts fail.
### Usage[​](#usage-1 "Direct link to Usage")

```
import { retry } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";

const use = useFactory();

function unstableFn(): number {
    // We simulate a function that can throw unexpected errors
    if (Math.round(Math.random() * 1.5) === 0) {
        throw new Error("Unexpected error occurred");
    }
    return Math.round((Math.random() + 1) * 99);
}

const fn = use(unstableFn, [
    retry({
        // Will retry 4 times
        maxAttempts: 4,
    }),
]);

await fn();
```

### Custom ErrorPolicy[​](#custom-errorpolicy-1 "Direct link to Custom ErrorPolicy")

You can define an [`ErrorPolicy`](/docs/utilities/error_policy_type.md) to retry specific error cases:

```
const fn = use(unstableFn, [
    retry({
        maxAttempts: 4,
        // Will only retry errors that are not a TypeError
        errorPolicy: (error) => !(error instanceof TypeError),
    }),
]);

await fn();
```

### Throw last error[​](#throw-last-error "Direct link to Throw last error")

By default, a `RetryResilienceError` is thrown when all retry attempts fail. This error aggregates all errors encountered during the retry process.
You can instead rethrow the last encountered error:

```
const fn = use(unstableFn, [
    retry({
        maxAttempts: 4,
        throwLastError: true,
    }),
]);

await fn();
```

### Custom BackoffPolicy[​](#custom-backoffpolicy "Direct link to Custom BackoffPolicy")

You can use a custom [`BackoffPolicy`](/docs/components/backoff_policies.md):

```
import { TimeSpan } from "@daiso-tech/core/time-span";

const fn = use(unstableFn, [
    retry({
        maxAttempts: 4,
        // By default an exponential policy is used
        backoffPolicy: (attempt: number, _error: unknown) =>
            TimeSpan.fromMilliseconds(attempt * 100),
    }),
]);

await fn();
```

### Callbacks[​](#callbacks-1 "Direct link to Callbacks")

You can add a callback [`Invokable`](/docs/utilities/invokable.md) that will be called before each execution attempt:

```
const fn = use(unstableFn, [
    retry({
        maxAttempts: 4,
        onExecutionAttempt: (data) => console.log(data),
    }),
]);

await fn();
```

info For more details about the `onExecutionAttempt` callback data, see the `OnRetryAttemptData` type.

You can add a callback [`Invokable`](/docs/utilities/invokable.md) that will be called before the retry delay starts:

```
const fn = use(unstableFn, [
    retry({
        maxAttempts: 4,
        onRetryDelay: (data) => console.log(data),
    }),
]);

await fn();
```

info For more details about the `onRetryDelay` callback data, see the `OnRetryDelayData` type.

## Retry by interval[​](#retry-by-interval "Direct link to Retry by interval")

The `retryInterval` middleware retries a function repeatedly within a given time window, waiting a fixed interval between each attempt. A `RetryIntervalResilienceError` is thrown when the time window expires and all attempts have failed.
### Usage[​](#usage-2 "Direct link to Usage")

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const use = useFactory();

function unstableFn(): number {
    // We simulate a function that can throw unexpected errors
    if (Math.round(Math.random() * 1.5) === 0) {
        throw new Error("Unexpected error occurred");
    }
    return Math.round((Math.random() + 1) * 99);
}

const fn = use(unstableFn, [
    retryInterval({
        // Retry for up to 10 seconds
        time: TimeSpan.fromSeconds(10),
        // Wait 500ms between each attempt
        interval: TimeSpan.fromMilliseconds(500),
    }),
]);

await fn();
```

### Custom ErrorPolicy[​](#custom-errorpolicy-2 "Direct link to Custom ErrorPolicy")

You can define an [`ErrorPolicy`](/docs/utilities/error_policy_type.md) to retry only specific error cases:

```
const fn = use(unstableFn, [
    retryInterval({
        time: TimeSpan.fromSeconds(10),
        interval: TimeSpan.fromMilliseconds(500),
        // Will only retry errors that are not a TypeError
        errorPolicy: (error) => !(error instanceof TypeError),
    }),
]);

await fn();
```

### Throw last error[​](#throw-last-error-1 "Direct link to Throw last error")

By default, a `RetryIntervalResilienceError` is thrown when the time window expires. This error aggregates all errors encountered during the retry process.
You can instead rethrow the last encountered error:

```
const fn = use(unstableFn, [
    retryInterval({
        time: TimeSpan.fromSeconds(10),
        interval: TimeSpan.fromMilliseconds(500),
        throwLastError: true,
    }),
]);

await fn();
```

### Callbacks[​](#callbacks-2 "Direct link to Callbacks")

You can add a callback [`Invokable`](/docs/utilities/invokable.md) that will be called before each execution attempt:

```
const fn = use(unstableFn, [
    retryInterval({
        time: TimeSpan.fromSeconds(10),
        interval: TimeSpan.fromMilliseconds(500),
        onExecutionAttempt: (data) => console.log(data),
    }),
]);

await fn();
```

info For more details about the `onExecutionAttempt` callback data, see the `OnRetryAttemptData` type.

You can add a callback [`Invokable`](/docs/utilities/invokable.md) that will be called before the retry delay starts:

```
const fn = use(unstableFn, [
    retryInterval({
        time: TimeSpan.fromSeconds(10),
        interval: TimeSpan.fromMilliseconds(500),
        onRetryDelay: (data) => console.log(data),
    }),
]);

await fn();
```

info For more details about the `onRetryDelay` callback data, see the `OnRetryDelayData` type.

## Timeout[​](#timeout "Direct link to Timeout")

The `timeout` middleware automatically aborts functions after a specified time period, throwing an error when aborted.

### Usage[​](#usage-3 "Direct link to Usage")

```
import { timeout } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const use = useFactory();

async function fetchData(): Promise<Response> {
    const response = await fetch("ENDPOINT");
    console.log("DONE");
    return response;
}

const fn = use(fetchData, [
    timeout({
        waitTime: TimeSpan.fromSeconds(2),
    }),
]);

await fn();
```

### Callbacks[​](#callbacks-3 "Direct link to Callbacks")

You can add a callback [`Invokable`](/docs/utilities/invokable.md) that will be called before the timeout occurs.
```
const fn = use(fetchData, [
    timeout({
        waitTime: TimeSpan.fromSeconds(2),
        onTimeout: (data) => console.log(data),
    }),
]);

await fn();
```

info For more details about the `onTimeout` callback data, see the `OnTimeoutData` type.

## Further information[​](#further-information "Direct link to Further information")

For further information, refer to the [`@daiso-tech/core/resilience`](https://daiso-tech.github.io/daiso-core/modules/Resilience.html) API docs.

---

# Configuring semaphore adapters

## MemorySemaphoreAdapter[​](#memorysemaphoreadapter "Direct link to MemorySemaphoreAdapter")

To use the `MemorySemaphoreAdapter`, you only need to create an instance of it:

```
import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter";

const memorySemaphoreAdapter = new MemorySemaphoreAdapter();
```

You can also provide a `Map` that will be used for storing the data in memory:

```
import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter";

const map = new Map();
const memorySemaphoreAdapter = new MemorySemaphoreAdapter(map);
```

info `MemorySemaphoreAdapter` lets you test your app without external dependencies like `Redis`, making it ideal for local development, unit tests, integration tests, and fast E2E tests for the backend application.

danger Note the `MemorySemaphoreAdapter` is limited to single-process usage and cannot be shared across multiple servers or processes.

## MongodbSemaphoreAdapter[​](#mongodbsemaphoreadapter "Direct link to MongodbSemaphoreAdapter")

To use the `MongodbSemaphoreAdapter`, you'll need to:

1. Install the required dependency: the [`mongodb`](https://www.npmjs.com/package/mongodb) package:

```
import { MongodbSemaphoreAdapter } from "@daiso-tech/core/semaphore/mongodb-semaphore-adapter";
import { MongoClient } from "mongodb";

const client = await MongoClient.connect("YOUR_MONGODB_CONNECTION_STRING");
const database = client.db("database");
const mongodbSemaphoreAdapter = new MongodbSemaphoreAdapter({
    database,
});

// You need to initialize the adapter once before using it.
// During the initialization the indexes will be created
await mongodbSemaphoreAdapter.init();
```

You can change the collection name:

```
const mongodbSemaphoreAdapter = new MongodbSemaphoreAdapter({
    database,
    // By default "semaphore" is used as collection name
    collectionName: "my-semaphore",
});

await mongodbSemaphoreAdapter.init();
```

You can change the collection settings:

```
const mongodbSemaphoreAdapter = new MongodbSemaphoreAdapter({
    database,
    // You can configure additional collection settings
    collectionSettings: {},
});

await mongodbSemaphoreAdapter.init();
```

info To remove the semaphore collection and all stored semaphore data, use the `deInit` method:

```
await mongodbSemaphoreAdapter.deInit();
```

danger Note that in order to use `MongodbSemaphoreAdapter` correctly, ensure you use a single, consistent database across all server instances or processes.

## RedisSemaphoreAdapter[​](#redissemaphoreadapter "Direct link to RedisSemaphoreAdapter")

To use the `RedisSemaphoreAdapter`, you'll need to:

1. Install the required dependency: the [`ioredis`](https://www.npmjs.com/package/ioredis) package:

```
import { RedisSemaphoreAdapter } from "@daiso-tech/core/semaphore/redis-semaphore-adapter";
import Redis from "ioredis";

const database = new Redis("YOUR_REDIS_CONNECTION_STRING");
const redisSemaphoreAdapter = new RedisSemaphoreAdapter(database);
```

danger Note that in order to use `RedisSemaphoreAdapter` correctly, ensure you use a single, consistent database across all server instances or processes.
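All of these adapters back the same counting-semaphore behavior: at most N holders at a time, with further acquirers waiting for a slot. For intuition, here is a minimal in-memory sketch (hypothetical code, not `MemorySemaphoreAdapter` itself, whose API differs):

```typescript
// Minimal counting-semaphore sketch (hypothetical, for intuition only).
class SimpleSemaphore {
    private available: number;
    private readonly waiters: Array<() => void> = [];

    constructor(limit: number) {
        this.available = limit;
    }

    async acquire(): Promise<void> {
        if (this.available > 0) {
            this.available--;
            return;
        }
        // Queue up until a slot is released.
        await new Promise<void>((resolve) => this.waiters.push(resolve));
    }

    release(): void {
        const next = this.waiters.shift();
        if (next !== undefined) {
            next(); // Hand the slot directly to the next waiter.
        } else {
            this.available++;
        }
    }
}

// At most 2 tasks hold the semaphore at once:
const semaphore = new SimpleSemaphore(2);
let running = 0;
let maxRunning = 0;

await Promise.all(
    [1, 2, 3, 4].map(async () => {
        await semaphore.acquire();
        running++;
        maxRunning = Math.max(maxRunning, running);
        await new Promise((r) => setTimeout(r, 10));
        running--;
        semaphore.release();
    }),
);
console.log(maxRunning); // 2
```

The adapters in this chapter implement the same idea on durable, shared storage so the slot count holds across servers and processes, with expiration to reclaim slots from crashed holders.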
## KyselySemaphoreAdapter[​](#kyselysemaphoreadapter "Direct link to KyselySemaphoreAdapter") To use the `KyselySemaphoreAdapter`, you'll need to: 1. Use a database provider that supports transactions. 2. Install the required dependency: [`kysely`](https://www.npmjs.com/package/kysely) package: ### With Sqlite[​](#with-sqlite "Direct link to With Sqlite") You will need to install [`better-sqlite3`](https://www.npmjs.com/package/better-sqlite3) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselySemaphoreAdapter } from "@daiso-tech/core/semaphore/kysely-semaphore-adapter"; import Sqlite from "better-sqlite3"; import { Kysely, SqliteDialect } from "kysely"; const database = new Sqlite("DATABASE_NAME.db"); const kysely = new Kysely({ dialect: new SqliteDialect({ database, }), }); const kyselySemaphoreAdapter = new KyselySemaphoreAdapter({ kysely, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselySemaphoreAdapter.init(); ``` danger Note using `KyselySemaphoreAdapter` with `sqlite` is limited to single-server usage and cannot be shared across multiple servers, but it can be shared between different processes. To use it correctly, ensure all process instances access the same persisted database.
### With Postgres[​](#with-postgres "Direct link to With Postgres") You will need to install [`pg`](https://www.npmjs.com/package/pg) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselySemaphoreAdapter } from "@daiso-tech/core/semaphore/kysely-semaphore-adapter"; import { Pool } from "pg"; import { Kysely, PostgresDialect } from "kysely"; const database = new Pool({ database: "DATABASE_NAME", host: "DATABASE_HOST", user: "DATABASE_USER", // Database port port: 5432, password: "DATABASE_PASSWORD", max: 10, }); const kysely = new Kysely({ dialect: new PostgresDialect({ pool: database, }), }); const kyselySemaphoreAdapter = new KyselySemaphoreAdapter({ kysely, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselySemaphoreAdapter.init(); ``` danger Note in order to use `KyselySemaphoreAdapter` with `postgres` correctly, ensure you use a single, consistent database across all server instances. This means you can't use replication. ### With Mysql[​](#with-mysql "Direct link to With Mysql") You will need to install [`mysql2`](https://www.npmjs.com/package/mysql2) package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselySemaphoreAdapter } from "@daiso-tech/core/semaphore/kysely-semaphore-adapter"; import { createPool } from "mysql2"; import { Kysely, MysqlDialect } from "kysely"; const database = createPool({ host: "DATABASE_HOST", // Database port port: 3306, database: "DATABASE_NAME", user: "DATABASE_USER", password: "DATABASE_PASSWORD", connectionLimit: 10, }); const kysely = new Kysely({ dialect: new MysqlDialect({ pool: database, }), }); const kyselySemaphoreAdapter = new KyselySemaphoreAdapter({ kysely, }); // You need to initialize the adapter once before using it.
// During the initialization the schema will be created await kyselySemaphoreAdapter.init(); ``` danger Note in order to use `KyselySemaphoreAdapter` with `mysql` correctly, ensure you use a single, consistent database across all server instances. This means you can't use replication. ### With Libsql[​](#with-libsql "Direct link to With Libsql") You will need to install the `@libsql/kysely-libsql` package: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { KyselySemaphoreAdapter } from "@daiso-tech/core/semaphore/kysely-semaphore-adapter"; import { LibsqlDialect } from "@libsql/kysely-libsql"; import { Kysely } from "kysely"; const kysely = new Kysely({ dialect: new LibsqlDialect({ url: "DATABASE_URL", }), }); const kyselySemaphoreAdapter = new KyselySemaphoreAdapter({ kysely, }); // You need to initialize the adapter once before using it. // During the initialization the schema will be created await kyselySemaphoreAdapter.init(); ``` danger Note in order to use `KyselySemaphoreAdapter` with `libsql` correctly, ensure you use a single, consistent database across all server instances. This means you can't use libsql embedded replicas. ### Settings[​](#settings "Direct link to Settings") Expired keys are cleared at regular intervals, and you can change the interval time: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const kyselySemaphoreAdapter = new KyselySemaphoreAdapter({ kysely, // By default, the interval is 1 minute expiredKeysRemovalInterval: TimeSpan.fromSeconds(10), }); await kyselySemaphoreAdapter.init(); ``` Disabling scheduled interval cleanup of expired keys: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const kyselySemaphoreAdapter = new KyselySemaphoreAdapter({ kysely, shouldRemoveExpiredKeys: false, }); await kyselySemaphoreAdapter.init(); // You can remove all expired keys manually.
await kyselySemaphoreAdapter.removeAllExpired(); ``` info To remove the semaphore table and all stored semaphore data, use the `deInit` method: ``` await kyselySemaphoreAdapter.deInit(); ``` ## NoOpSemaphoreAdapter[​](#noopsemaphoreadapter "Direct link to NoOpSemaphoreAdapter") The `NoOpSemaphoreAdapter` is a no-operation implementation; it performs no actions when called: ``` import { NoOpSemaphoreAdapter } from "@daiso-tech/core/semaphore/no-op-semaphore-adapter"; const noOpSemaphoreAdapter = new NoOpSemaphoreAdapter(); ``` info The `NoOpSemaphoreAdapter` is useful when you want to mock out or disable your [`SemaphoreFactory`](https://daiso-tech.github.io/daiso-core/classes/Semaphore.SemaphoreFactory.html) instance. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/semaphore`](https://daiso-tech.github.io/daiso-core/modules/Semaphore.html) API docs. --- # Creating semaphore adapters ## Implementing your custom ISemaphoreAdapter[​](#implementing-your-custom-isemaphoreadapter "Direct link to Implementing your custom ISemaphoreAdapter") In order to create an adapter you need to implement the [`ISemaphoreAdapter`](https://daiso-tech.github.io/daiso-core/types/Semaphore.ISemaphoreAdapter.html) contract. ## Testing your custom ISemaphoreAdapter[​](#testing-your-custom-isemaphoreadapter "Direct link to Testing your custom ISemaphoreAdapter") We provide a complete test suite to test your semaphore adapter implementation.
Simply use the [`semaphoreAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Semaphore.semaphoreAdapterTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` // filename: MySemaphoreAdapter.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { semaphoreAdapterTestSuite } from "@daiso-tech/core/semaphore/test-utilities"; import { MySemaphoreAdapter } from "./MySemaphoreAdapter.js"; describe("class: MySemaphoreAdapter", () => { semaphoreAdapterTestSuite({ createAdapter: () => new MySemaphoreAdapter(), test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom IDatabaseSemaphoreAdapter[​](#implementing-your-custom-idatabasesemaphoreadapter "Direct link to Implementing your custom IDatabaseSemaphoreAdapter") We provide an additional contract [`IDatabaseSemaphoreAdapter`](https://daiso-tech.github.io/daiso-core/types/Semaphore.IDatabaseSemaphoreAdapter.html) for building custom semaphore adapters tailored to databases. ## Testing your custom IDatabaseSemaphoreAdapter[​](#testing-your-custom-idatabasesemaphoreadapter "Direct link to Testing your custom IDatabaseSemaphoreAdapter") We provide a complete test suite to test your database semaphore adapter implementation.
Simply use the [`databaseSemaphoreAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Semaphore.databaseSemaphoreAdapterTestSuite.html) function: * Preconfigured Vitest test cases * Common edge case coverage Usage example: ``` import { beforeEach, describe, expect, test } from "vitest"; import { databaseSemaphoreAdapterTestSuite } from "@daiso-tech/core/semaphore/test-utilities"; import { MyDatabaseSemaphoreAdapter } from "./MyDatabaseSemaphoreAdapter.js"; describe("class: MyDatabaseSemaphoreAdapter", () => { databaseSemaphoreAdapterTestSuite({ createAdapter: async () => { return new MyDatabaseSemaphoreAdapter(); }, test, beforeEach, expect, describe, }); }); ``` ## Implementing your custom ISemaphoreFactory class[​](#implementing-your-custom-isemaphorefactory-class "Direct link to Implementing your custom ISemaphoreFactory class") In some cases, you may need to implement a custom [`SemaphoreFactory`](https://daiso-tech.github.io/daiso-core/classes/Semaphore.SemaphoreFactory.html) class to optimize performance for your specific technology stack. You can then directly implement the [`ISemaphoreFactory`](https://daiso-tech.github.io/daiso-core/types/Semaphore.ISemaphoreFactory.html) contract. ## Testing your custom ISemaphoreFactory class[​](#testing-your-custom-isemaphorefactory-class "Direct link to Testing your custom ISemaphoreFactory class") We provide a complete test suite to verify your custom semaphore factory class implementation.
Simply use the [`semaphoreFactoryTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Semaphore.semaphoreFactoryTestSuite.html) function: * Preconfigured Vitest test cases * Standardized semaphore factory behavior validation * Common edge case coverage Usage example: ``` // filename: MySemaphoreFactory.test.ts import { beforeEach, describe, expect, test } from "vitest"; import { semaphoreFactoryTestSuite } from "@daiso-tech/core/semaphore/test-utilities"; import { MySemaphoreFactory } from "./MySemaphoreFactory.js"; describe("class: MySemaphoreFactory", () => { semaphoreFactoryTestSuite({ createSemaphoreFactory: () => new MySemaphoreFactory(), test, beforeEach, expect, describe, }); }); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/semaphore`](https://daiso-tech.github.io/daiso-core/modules/Semaphore.html) API docs. --- # SemaphoreFactoryResolver The `SemaphoreFactoryResolver` class provides a flexible way to configure and switch between different semaphore adapters at runtime. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `SemaphoreFactoryResolver`, you will need to register all required adapters during initialization. ``` import { SemaphoreFactoryResolver } from "@daiso-tech/core/semaphore"; import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter"; import { RedisSemaphoreAdapter } from "@daiso-tech/core/semaphore/redis-semaphore-adapter"; import Redis from "ioredis"; const semaphoreFactoryResolver = new SemaphoreFactoryResolver({ adapters: { memory: new MemorySemaphoreAdapter(), redis: new RedisSemaphoreAdapter(new Redis("YOUR_REDIS_CONNECTION")), }, // You can set an optional default adapter defaultAdapter: "memory", }); ``` ## Usage[​](#usage "Direct link to Usage") ### 1.
Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1. Using the default adapter") ``` await semaphoreFactoryResolver .use() .create("shared-resource") .runOrFail(async () => { // code to run }); ``` danger Note that if you don't set a default adapter, an error will be thrown. ### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2. Specifying an adapter explicitly") ``` await semaphoreFactoryResolver .use("redis") .create("shared-resource") .runOrFail(async () => { // code to run }); ``` danger Note that if you specify a non-existent adapter, an error will be thrown. ### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3. Overriding default settings") ``` import { Namespace } from "@daiso-tech/core/namespace"; await semaphoreFactoryResolver .setNamespace(new Namespace("@my-namespace")) .use("redis") .create("shared-resource") .runOrFail(async () => { // code to run }); ``` info Note that the `SemaphoreFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one. ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/semaphore`](https://daiso-tech.github.io/daiso-core/modules/Semaphore.html) API docs. --- # Semaphore usage The `@daiso-tech/core/semaphore` component provides a way for managing semaphores independent of the underlying platform or storage.
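Conceptually, a semaphore caps how many callers may run a section of code at the same time. Before diving into the library's API, here is a rough plain-TypeScript illustration of the idea (a generic sketch, independent of `@daiso-tech/core`):

```typescript
// Generic async counting-semaphore sketch, NOT part of @daiso-tech/core.
// It caps how many async tasks run concurrently.
class AsyncSemaphore {
    private running = 0;
    private readonly waiters: Array<() => void> = [];

    constructor(private readonly limit: number) {}

    private async acquire(): Promise<void> {
        if (this.running < this.limit) {
            this.running += 1;
            return;
        }
        // All slots taken: wait until a release wakes us up.
        await new Promise<void>((resolve) => this.waiters.push(resolve));
        this.running += 1;
    }

    private release(): void {
        this.running -= 1;
        // Wake the oldest waiter, if any.
        this.waiters.shift()?.();
    }

    // Acquire, run the task, and always release in a finally block.
    async run<T>(task: () => Promise<T>): Promise<T> {
        await this.acquire();
        try {
            return await task();
        } finally {
            this.release();
        }
    }
}

// At most 2 of the 5 tasks run at the same time.
const semaphore = new AsyncSemaphore(2);
let concurrent = 0;
let maxConcurrent = 0;
const tasks = Array.from({ length: 5 }, () =>
    semaphore.run(async () => {
        concurrent += 1;
        maxConcurrent = Math.max(maxConcurrent, concurrent);
        await new Promise((resolve) => setTimeout(resolve, 10));
        concurrent -= 1;
    }),
);
await Promise.all(tasks);
console.log(maxConcurrent); // 2
```

The `SemaphoreFactory` below provides the same capping behavior, but backed by shared storage so the limit holds across multiple servers and processes.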
## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") To begin using the `SemaphoreFactory` class, you'll need to create and configure an instance: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter"; import { SemaphoreFactory } from "@daiso-tech/core/semaphore"; const semaphoreFactory = new SemaphoreFactory({ // You can provide a default TTL value // If you set it to null it means semaphores will not expire and must be released manually by default. defaultTtl: TimeSpan.fromSeconds(2), // You can choose the adapter to use adapter: new MemorySemaphoreAdapter(), }); ``` info Here is a complete list of settings for the [`SemaphoreFactory`](https://daiso-tech.github.io/daiso-core/types/Semaphore.SemaphoreFactorySettingsBase.html) class. ## Semaphore basics[​](#semaphore-basics "Direct link to Semaphore basics") ### Creating a semaphore[​](#creating-a-semaphore "Direct link to Creating a semaphore") ``` const semaphore = semaphoreFactory.create("shared-resource", { // You need to define a limit limit: 2, }); ``` ### Acquiring and releasing the semaphore[​](#acquiring-and-releasing-the-semaphore "Direct link to Acquiring and releasing the semaphore") ``` // 1 slot will be acquired if (await semaphore.acquire()) { console.log("Acquired"); try { // The concurrent section } finally { await semaphore.release(); } } else { console.log("Unable to acquire"); } // A 2nd slot will be acquired if (await semaphore.acquire()) { console.log("Acquired"); try { // The concurrent section } finally { await semaphore.release(); } } else { console.log("Unable to acquire"); } // Will log false because the limit is reached console.log(await semaphore.acquire()); ``` Alternatively you could write it as follows: ``` // 1 slot will be acquired try { // This method will throw if the semaphore limit is reached.
await semaphore.acquireOrFail(); console.log("Acquired"); // The critical section } catch { console.log("Unable to acquire"); } finally { await semaphore.release(); } // A 2nd slot will be acquired try { // This method will throw if the semaphore limit is reached. await semaphore.acquireOrFail(); console.log("Acquired"); // The critical section } catch { console.log("Unable to acquire"); } finally { await semaphore.release(); } // Will throw because the limit is reached await semaphore.acquireOrFail(); ``` danger You always need to wrap the concurrent section in a `try-finally` block so the semaphore gets released when an error occurs. ### Semaphore with custom TTL[​](#semaphore-with-custom-ttl "Direct link to Semaphore with custom TTL") You can provide a custom TTL for the semaphore. ``` const semaphore = semaphoreFactory.create("shared-resource", { // Default TTL is 5min if not overridden // If you set it to null it means the semaphore will not expire and must be released manually. ttl: TimeSpan.fromSeconds(30), limit: 2, }); ``` ### Checking semaphore state[​](#checking-semaphore-state "Direct link to Checking semaphore state") You can get the semaphore state by using the `getState` method; it returns [`ISemaphoreState`](https://daiso-tech.github.io/daiso-core/types/Semaphore.ISemaphoreState.html).
``` import { SEMAPHORE_STATE } from "@daiso-tech/core/semaphore/contracts"; const semaphore = semaphoreFactory.create("shared-resource", { limit: 2, }); const state = await semaphore.getState(); if (state.type === SEMAPHORE_STATE.EXPIRED) { console.log("The semaphore does not exist"); } if (state.type === SEMAPHORE_STATE.LIMIT_REACHED) { console.log("The limit has been reached and all slots are unavailable"); } if (state.type === SEMAPHORE_STATE.ACQUIRED) { console.log("The semaphore is acquired"); } if (state.type === SEMAPHORE_STATE.UNACQUIRED) { console.log("There are available slots but the semaphore is not acquired"); } ``` ## Patterns[​](#patterns "Direct link to Patterns") ### Refreshing semaphores[​](#refreshing-semaphores "Direct link to Refreshing semaphores") The semaphore can be refreshed by the current owner before it expires. This is particularly useful for long-running tasks: instead of setting an excessively long TTL initially, you can start with a shorter one and use the `refresh` method to extend the TTL of the semaphore: ``` import { delay } from "@daiso-tech/core/utilities"; const semaphore = semaphoreFactory.create("resource", { limit: 2, ttl: TimeSpan.fromMinutes(1), }); async function doWork(): Promise<boolean> { // ... critical section } const hasAcquired = await semaphore.acquire(); if (hasAcquired) { try { while (true) { await semaphore.refresh(TimeSpan.fromMinutes(1)); const hasFinished = await doWork(); if (hasFinished) { break; } await delay(TimeSpan.fromSeconds(1)); } } finally { await semaphore.release(); } } ``` warning Note: A semaphore must have an expiration (a `ttl` value) to be refreshed.
You cannot refresh a semaphore that was created without an expiration (with `ttl: null`). ``` // Create a semaphore with no expiration (non-refreshable) const semaphore = semaphoreFactory.create("resource", { limit: 2, ttl: null, }); // A refresh attempt on this semaphore will fail const hasRefreshed = await semaphore.refresh(); // This will log 'false' because the semaphore cannot be refreshed console.log(hasRefreshed); ``` ### Additional methods[​](#additional-methods "Direct link to Additional methods") The `releaseOrFail` method is the same as the `release` method but it throws an error when unable to release the semaphore: ``` const semaphore = semaphoreFactory.create("resource", { limit: 2, }); await semaphore.releaseOrFail(); ``` You can force release all the semaphore slots: ``` const semaphore = semaphoreFactory.create("resource", { limit: 2, }); await semaphore.forceReleaseAll(); ``` The `refreshOrFail` method is the same as the `refresh` method but it throws an error when unable to refresh the semaphore: ``` const semaphore = semaphoreFactory.create("resource"); await semaphore.refreshOrFail(); ``` The `runOrFail` method automatically manages semaphore acquisition and release around function execution. It calls `acquireOrFail` before invoking the function and calls `release` in a finally block, ensuring the semaphore is always freed, even if an error occurs during execution. ``` const semaphore = semaphoreFactory.create("resource", { limit: 2, }); await semaphore.runOrFail(async () => { // ... critical section }); ``` info Note the method throws an error when the semaphore cannot be acquired. info You can provide a synchronous or asynchronous [`Invokable<[], TValue | Promise<TValue>>`](/docs/utilities/invokable.md) as the value for the `runOrFail` method.
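The acquire/release bracketing that `runOrFail` performs can be sketched in plain TypeScript. The `SemaphoreLike` interface and `runOrFail` helper below are hypothetical, minimal stand-ins for illustration, not the library's actual source:

```typescript
// Hypothetical minimal interface, for illustration only.
interface SemaphoreLike {
    acquireOrFail(): Promise<void>;
    release(): Promise<void>;
}

// Sketch of the acquire/release bracketing that a runOrFail-style
// method performs around the given function.
async function runOrFail<T>(
    semaphore: SemaphoreLike,
    fn: () => Promise<T>,
): Promise<T> {
    // Throws when the semaphore limit is reached.
    await semaphore.acquireOrFail();
    try {
        return await fn();
    } finally {
        // Always release, even when fn throws.
        await semaphore.release();
    }
}
```

Because the release happens in a `finally` block, a slot can never leak: the function's result (or error) propagates to the caller only after the slot has been freed.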
### Semaphore instance variables[​](#semaphore-instance-variables "Direct link to Semaphore instance variables") The `Semaphore` class exposes instance variables such as: ``` const semaphore = semaphoreFactory.create("resource", { limit: 2, }); // Will return the key of the semaphore which is "resource" console.log(semaphore.key.toString()); // Will return the id of the semaphore console.log(semaphore.id); // Will return the ttl of the semaphore console.log(semaphore.ttl); ``` info The `key` field is an object that implements the [`IKey`](/docs/components/namespace.md) contract. ### Semaphore slot id[​](#semaphore-slot-id "Direct link to Semaphore slot id") By default the slot id is autogenerated, but it can also be defined manually. ``` const semaphore = semaphoreFactory.create("semaphore", { slotId: "my-slot-id", }); const hasAcquired = await semaphore.acquire(); if (hasAcquired) { console.log("Shared resource"); await semaphore.release(); } ``` info Manually defining a slot id is primarily useful for debugging or implementing manual resource control by the end user. warning In most cases, setting a slot id is unnecessary. ### Namespacing[​](#namespacing "Direct link to Namespacing") You can use the `Namespace` class to group related semaphores without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract. info For further information about namespacing refer to [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation.
``` import { Namespace } from "@daiso-tech/core/namespace"; import { RedisSemaphoreAdapter } from "@daiso-tech/core/semaphore/redis-semaphore-adapter"; import { SemaphoreFactory } from "@daiso-tech/core/semaphore"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const database = new Redis("YOUR_REDIS_CONNECTION_STRING"); const serde = new Serde(new SuperJsonSerdeAdapter()); const semaphoreFactoryA = new SemaphoreFactory({ namespace: new Namespace("@semaphore-a"), adapter: new RedisSemaphoreAdapter(database), serde, }); const semaphoreFactoryB = new SemaphoreFactory({ namespace: new Namespace("@semaphore-b"), adapter: new RedisSemaphoreAdapter(database), serde, }); const semaphoreA = semaphoreFactoryA.create("key", { ttl: null, limit: 1, }); const semaphoreB = semaphoreFactoryB.create("key", { ttl: null, limit: 1, }); const hasAcquiredA = await semaphoreA.acquire(); // Will log true console.log(hasAcquiredA); const hasAcquiredB = await semaphoreB.acquire(); // Will log true console.log(hasAcquiredB); const hasReleasedB = await semaphoreB.release(); // Will log true console.log(hasReleasedB); // Will log { type: "ACQUIRED", remainingTime: null } console.log(await semaphoreA.getState()); // Will log { type: "EXPIRED" } console.log(await semaphoreB.getState()); ``` ### Retrying acquiring semaphore by attempts[​](#retrying-acquiring-semaphore-by-attempts "Direct link to Retrying acquiring semaphore by attempts") To retry acquiring a semaphore you can use the [`retry`](/docs/components/resilience.md) middleware.
Retrying acquiring semaphore with `acquireOrFail` method: ``` import { retry } from "@daiso-tech/core/resilience"; import { FailedAcquireSemaphoreError } from "@daiso-tech/core/semaphore/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; const semaphore = semaphoreFactory.create("semaphore", { limit: 2, }); const use = useFactory(); try { await use(async () => { await semaphore.acquireOrFail(); }, [ retry({ maxAttempts: 4, errorPolicy: FailedAcquireSemaphoreError, }), ])(); // The critical section } finally { await semaphore.release(); } ``` Retrying acquiring semaphore with `acquire` method: ``` import { retry } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; const semaphore = semaphoreFactory.create("semaphore", { limit: 2, }); const use = useFactory(); const hasAcquired = await use(async () => { return await semaphore.acquire(); }, [ retry({ maxAttempts: 4, errorPolicy: { treatFalseAsError: true, }, }), ])(); if (hasAcquired) { try { // The critical section } finally { await semaphore.release(); } } ``` Retrying acquiring semaphore with `runOrFail` method: ``` import { retry } from "@daiso-tech/core/resilience"; import { FailedAcquireSemaphoreError } from "@daiso-tech/core/semaphore/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; const semaphore = semaphoreFactory.create("semaphore", { limit: 2, }); const use = useFactory(); await use(async () => { await semaphore.runOrFail(async () => { // The critical section }); }, [ retry({ maxAttempts: 4, errorPolicy: FailedAcquireSemaphoreError, }), ])(); ``` ### Retrying acquiring semaphore by interval[​](#retrying-acquiring-semaphore-by-interval "Direct link to Retrying acquiring semaphore by interval") To retry acquiring a semaphore at regular intervals you can use the [`retryInterval`](/docs/components/resilience.md) middleware: Retrying acquiring semaphore with `acquireOrFail` method: ``` import { retryInterval } from
"@daiso-tech/core/resilience"; import { FailedAcquireSemaphoreError } from "@daiso-tech/core/semaphore/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; import { TimeSpan } from "@daiso-tech/core/time-span"; const semaphore = semaphoreFactory.create("resource", { limit: 2, }); const use = useFactory(); try { await use(async () => { await semaphore.acquireOrFail(); }, [ retryInterval({ // Total time to wait: 1 minute time: TimeSpan.fromMinutes(1), // Interval at which to try acquiring the semaphore interval: TimeSpan.fromSeconds(1), errorPolicy: FailedAcquireSemaphoreError, }), ])(); // ... critical section } finally { await semaphore.release(); } ``` Retrying acquiring semaphore with `acquire` method: ``` import { retryInterval } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; import { TimeSpan } from "@daiso-tech/core/time-span"; const semaphore = semaphoreFactory.create("resource", { limit: 2, }); const use = useFactory(); const hasAcquired = await use(async () => { return await semaphore.acquire(); }, [ retryInterval({ time: TimeSpan.fromMinutes(1), interval: TimeSpan.fromSeconds(1), errorPolicy: { treatFalseAsError: true, }, }), ])(); if (hasAcquired) { try { // ... critical section } finally { await semaphore.release(); } } ``` Retrying acquiring semaphore with `runOrFail` method: ``` import { retryInterval } from "@daiso-tech/core/resilience"; import { FailedAcquireSemaphoreError } from "@daiso-tech/core/semaphore/contracts"; import { useFactory } from "@daiso-tech/core/middleware"; import { TimeSpan } from "@daiso-tech/core/time-span"; const semaphore = semaphoreFactory.create("resource", { limit: 2, }); const use = useFactory(); await use(async () => { await semaphore.runOrFail(async () => { // ...
critical section }); }, [ retryInterval({ time: TimeSpan.fromMinutes(1), interval: TimeSpan.fromSeconds(1), errorPolicy: FailedAcquireSemaphoreError, }), ])(); ``` warning Using the `retryInterval` middleware to acquire a semaphore inside an HTTP request handler is discouraged because it blocks the handler until the semaphore becomes available or the timeout is reached. This delays the response and makes the frontend app feel slow. ### Serialization and deserialization of semaphore[​](#serialization-and-deserialization-of-semaphore "Direct link to Serialization and deserialization of semaphore") Semaphores can be serialized, allowing them to be transmitted over the network to another server and later deserialized for reuse. This means you can, for example, acquire the semaphore on the main server, transfer it to a queue worker server, and release it there. In order to serialize or deserialize a semaphore you need to pass an object that implements the [`ISerderRegister`](/docs/components/serde.md) contract, like the [`Serde`](/docs/components/serde.md) class, to `SemaphoreFactory`.
Manually serializing and deserializing the semaphore: ``` import { RedisSemaphoreAdapter } from "@daiso-tech/core/semaphore/redis-semaphore-adapter"; import { SemaphoreFactory } from "@daiso-tech/core/semaphore"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisClient = new Redis("YOUR_REDIS_CONNECTION"); const semaphoreFactory = new SemaphoreFactory({ // You can also pass in an array of Serde class instances serde, adapter: new RedisSemaphoreAdapter(redisClient), }); const semaphore = semaphoreFactory.create("resource", { limit: 2, }); const serializedSemaphore = serde.serialize(semaphore); const deserializedSemaphore = serde.deserialize(serializedSemaphore); ``` danger When serializing or deserializing a semaphore, you must use the same `Serde` instances that were provided to the `SemaphoreFactory`. This is required because the `SemaphoreFactory` injects custom serialization logic for `ISemaphore` instances into the `Serde` instances. info Note you only need manual serialization and deserialization when integrating with external libraries. As long as you pass the same `Serde` instances to all other components, you don't need to serialize and deserialize the semaphore manually.
``` import { RedisSemaphoreAdapter } from "@daiso-tech/core/semaphore/redis-semaphore-adapter"; import type { ISemaphore } from "@daiso-tech/core/semaphore/contracts"; import { SemaphoreFactory } from "@daiso-tech/core/semaphore"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { EventBus } from "@daiso-tech/core/event-bus"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redis = new Redis("YOUR_REDIS_CONNECTION"); type EventMap = { "sending-semaphore-over-network": { semaphore: ISemaphore; }; }; const eventBus = new EventBus<EventMap>({ adapter: new RedisPubSubEventBusAdapter({ client: redis, serde, }), }); const semaphoreFactory = new SemaphoreFactory({ serde, adapter: new RedisSemaphoreAdapter(redis), eventBus, }); const semaphore = semaphoreFactory.create("resource", { limit: 2, }); // We are sending the semaphore over the network to other servers. await eventBus.dispatch("sending-semaphore-over-network", { semaphore, }); // The other servers will receive the serialized semaphore and automatically deserialize it. await eventBus.addListener( "sending-semaphore-over-network", ({ semaphore }) => { // The semaphore is deserialized and can be used console.log("SEMAPHORE:", semaphore); }, ); ``` ### Semaphore events[​](#semaphore-events "Direct link to Semaphore events") You can listen to different [semaphore events](https://daiso-tech.github.io/daiso-core/modules/Semaphore.html) that are triggered by the `Semaphore` instance. Refer to the [`EventBus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events. Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract.
``` import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter"; import { SemaphoreFactory, SEMAPHORE_EVENTS } from "@daiso-tech/core/semaphore"; import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; const semaphoreFactory = new SemaphoreFactory({ adapter: new MemorySemaphoreAdapter(), eventBus: new MemoryEventBusAdapter(), }); await semaphoreFactory.events.addListener(SEMAPHORE_EVENTS.ACQUIRED, () => { console.log("Semaphore acquired"); }); await semaphoreFactory.create("a").acquire(); ``` warning If multiple semaphore adapters (e.g., `RedisSemaphoreAdapter` and `MemorySemaphoreAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters. ``` import { RedisSemaphoreAdapter } from "@daiso-tech/core/semaphore/redis-semaphore-adapter"; import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter"; import { SemaphoreFactory } from "@daiso-tech/core/semaphore"; import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter"; import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; import Redis from "ioredis"; import { Namespace } from "@daiso-tech/core/namespace"; const serde = new Serde(new SuperJsonSerdeAdapter()); const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({ client: new Redis("YOUR_REDIS_CONNECTION_STRING"), serde, }); const memorySemaphoreAdapter = new MemorySemaphoreAdapter(); const memorySemaphoreFactory = new SemaphoreFactory({ adapter: memorySemaphoreAdapter, // We assign distinct namespaces to MemorySemaphoreAdapter and RedisSemaphoreAdapter to isolate their events.
namespace: new Namespace(["memory", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); const redisSemaphoreAdapter = new RedisSemaphoreAdapter({ serde, database: new Redis("YOUR_REDIS_CONNECTION_STRING"), }); const redisSemaphoreFactory = new SemaphoreFactory({ adapter: redisSemaphoreAdapter, // We assign distinct namespaces to MemorySemaphoreAdapter and RedisSemaphoreAdapter to isolate their events. namespace: new Namespace(["redis", "event-bus"]), eventBus: redisPubSubEventBusAdapter, }); ``` ### Separating creating, listening to and manipulating semaphore[​](#separating-creating-listening-to-and-manipulating-semaphore "Direct link to Separating creating, listening to and manipulating semaphore") The library includes 3 additional contracts: * [`ISemaphore`](https://daiso-tech.github.io/daiso-core/types/Semaphore.ISemaphore.html) - Allows only for manipulating of the semaphore. * [`ISemaphoreFactoryBase`](https://daiso-tech.github.io/daiso-core/types/Semaphore.ISemaphoreFactoryBase.html) - Allows only for creation of semaphores. * [`ISemaphoreListenable`](https://daiso-tech.github.io/daiso-core/types/Semaphore.ISemaphoreListenable.html) - Allows only to listening to semaphore events. This seperation makes it easy to visually distinguish the 3 contracts, making it immediately obvious that they serve different purposes. ``` import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter"; import { SemaphoreFactory } from "@daiso-tech/core/semaphore"; import { MemorySemaphoreAdapter } from "@daiso-tech/core/semaphore/memory-semaphore-adapter"; import { type ISemaphore, type ISemaphoreFactoryBase, type ISemaphoreListenable, SEMAPHORE_EVENTS, } from "@daiso-tech/core/semaphore/contracts"; async function semaphoreFunc(semaphore: ISemaphore): Promise { await semaphore.runOrFail(async () => { // ... 
critical section }); } async function semaphoreFactoryFunc( semaphoreFactory: ISemaphoreFactoryBase, ): Promise { // You cannot access the listener methods // You will get typescript error if you try const semaphore = semaphoreFactory.create("resource", { limit: 2, }); await semaphoreFunc(semaphore); } async function semaphoreListenableFunc( semaphoreListenable: ISemaphoreListenable, ): Promise { // You cannot access the semaphoreFactory methods // You will get typescript error if you try await semaphoreListenable.addListener( SEMAPHORE_EVENTS.ACQUIRED, (event) => { console.log("ACQUIRED:", event); }, ); await semaphoreListenable.addListener( SEMAPHORE_EVENTS.RELEASED, (event) => { console.log("RELEASED:", event); }, ); } const semaphoreFactory = new SemaphoreFactory({ adapter: new MemorySemaphoreAdapter(), eventBus: new MemoryEventBusAdapter(), }); await semaphoreListenableFunc(semaphoreFactory.events); await semaphoreFactoryFunc(semaphoreFactory); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to [`@daiso-tech/core/semaphore`](https://daiso-tech.github.io/daiso-core/modules/Semaphore.html) API docs. --- # Serde The `@daiso-tech/core/serde` component provides seamless way to serialize/deserialize data and adding custom serialization/deserialization logic for custom data types. ## Initial configuration[​](#initial-configuration "Direct link to Initial configuration") ``` import { Serde } from "@daiso-tech/core/serde"; import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter"; const serde = new Serde(new SuperJsonSerdeAdapter()); ``` ## Serde basics[​](#serde-basics "Direct link to Serde basics") ### Serializing and deserializing values[​](#serializing-and-deserializing-values "Direct link to Serializing and deserializing values") Here is an example of serializing and deserializing a value. 
```
const serializedValue = serde.serialize({
    name: "abra",
    age: 20,
});
const deserializedValue = serde.deserialize(serializedValue);
```

### Custom serialization and deserialization logic[​](#custom-serialization-and-deserialization-logic "Direct link to Custom serialization and deserialization logic")

The `registerCustom` method offers control over serialization and deserialization behavior.

```
import type { ISerdeTransformer } from "@daiso-tech/core/serde/contracts";

type ISerializedUser = {
    version: "1";
    name: string;
    age: number;
};

const userSerdeTransformer: ISerdeTransformer = {
    name: User.name,
    isApplicable(value: unknown): value is User {
        return value instanceof User;
    },
    deserialize(serializedValue: ISerializedUser): User {
        return new User(serializedValue.name, serializedValue.age);
    },
    serialize(deserializedValue: User): ISerializedUser {
        return {
            version: "1",
            name: deserializedValue.name,
            age: deserializedValue.age,
        };
    },
};

serde.registerCustom(userSerdeTransformer);
```

info Note the `ISerdeTransformer` object can be dynamically created.

### Custom serialization and deserialization logic of classes[​](#custom-serialization-and-deserialization-logic-of-classes "Direct link to Custom serialization and deserialization logic of classes")

The `registerClass` method provides a simplified abstraction over the `registerCustom` method for serializing and deserializing classes.
```
import type { ISerializable } from "@daiso-tech/core/serde/contracts";

type ISerializedUser = {
    version: "1";
    name: string;
    age: number;
};

class User implements ISerializable {
    static deserialize(serializedUser: ISerializedUser): User {
        return new User(serializedUser.name, serializedUser.age);
    }

    constructor(
        public readonly name: string,
        public readonly age: number,
    ) {}

    serialize(): ISerializedUser {
        return {
            version: "1",
            name: this.name,
            age: this.age,
        };
    }

    logInfo(): void {
        console.log("Name:", this.name, "Age:", this.age);
    }
}

serde.registerClass(User);

const user = new User("Carl", 50);
const serializedUser = serde.serialize(user);
const deserializedUser = serde.deserialize(serializedUser);

// The instances will not be the same because deserializedUser is recreated.
console.log(user === deserializedUser);

// But the content will be the same
deserializedUser.logInfo();
user.logInfo();
```

danger Note you need to register the class before serializing or deserializing any class instances.

warning To ensure correct serialization and deserialization, class names must be unique. If multiple classes share the same name, conflicts may occur when serializing and deserializing the objects. To resolve this, you can assign a unique prefix to differentiate between them during the process.

```
serde.registerClass(User, "my-library");
```

## Patterns[​](#patterns "Direct link to Patterns")

### Usage with other components[​](#usage-with-other-components "Direct link to Usage with other components")

When using a `Serde` class instance there is no need to call `serialize` and `deserialize` manually, because components like `Cache` handle it automatically through their adapters.
```
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import { RedisCacheAdapter } from "@daiso-tech/core/cache/redis-cache-adapter";
import { Cache } from "@daiso-tech/core/cache";
import { ListCollection } from "@daiso-tech/core/collection";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
serde.registerClass(ListCollection);

const cache = new Cache({
    adapter: new RedisCacheAdapter({
        database: new Redis("YOUR_REDIS_CONNECTION_STRING"),
        serde,
    }),
});

const listCollection = new ListCollection(["a", "b", "c", "d", "e"]);
await cache.add("list", listCollection);

const deserializedListCollection = await cache.get("list");
if (deserializedListCollection) {
    // Logs "c"
    console.log(deserializedListCollection.getOrFail(2));
}
```

info Note you should use one `Serde` class instance across all components and register all serializable objects before component usage.

## Separating serialization, deserialization and registering custom serialization/deserialization logic[​](#separating-serialization-deserialization-and-registering-custom-serializationdeserialization-logic "Direct link to Separating serialization, deserialization and registering custom serialization/deserialization logic")

The library includes 5 additional contracts:

* `ISerializer` - Allows only serialization.
* `IDeserializer` - Allows only deserialization.
* `ISerde` - Allows for both serialization and deserialization.
* `ISerdeRegister` - Allows only registering custom serialization/deserialization logic.
* `IFlexibleSerde` - Allows for serialization, deserialization, and registering custom serialization/deserialization logic.

This separation makes it easy to visually distinguish the 5 contracts, making it immediately obvious that they serve different purposes.
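To see why the narrowed contracts are useful, here is a minimal sketch (not the library's actual declarations) of how accepting only `ISerializer` keeps a call site honest. The interface names match the docs, but the method shapes and the `toWire`/`jsonSerde` names are simplified assumptions for illustration.

```typescript
// Simplified stand-ins for the library's contracts (shapes assumed).
interface ISerializer {
    serialize(value: unknown): string;
}
interface IDeserializer {
    deserialize(serializedValue: string): unknown;
}

// This function advertises that it only serializes. A full Serde instance
// can still be passed in, but the function cannot call deserialize or
// registerCustom without a type error.
function toWire(serializer: ISerializer, payload: unknown): string {
    return serializer.serialize(payload);
}

// A plain-JSON stand-in implementation; the real one would be a Serde
// instance backed by an adapter.
const jsonSerde: ISerializer & IDeserializer = {
    serialize: (value) => JSON.stringify(value),
    deserialize: (serializedValue) => JSON.parse(serializedValue),
};

// Logs '{"name":"abra","age":20}'
console.log(toWire(jsonSerde, { name: "abra", age: 20 }));
```

The same pattern applies to the other contracts: a function that only needs to deserialize should accept `IDeserializer`, and only bootstrap code should see `IFlexibleSerde`.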
## Further information[​](#further-information "Direct link to Further information")

For further information refer to the [`@daiso-tech/core/serde`](https://daiso-tech.github.io/daiso-core/modules/Serde.html) API docs.

---

# Configuring shared-lock adapters

## MemorySharedLockAdapter[​](#memorysharedlockadapter "Direct link to MemorySharedLockAdapter")

To use the `MemorySharedLockAdapter` you only need to create an instance of it:

```
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";

const memorySharedLockAdapter = new MemorySharedLockAdapter();
```

You can also provide a `Map` that will be used for storing the data in memory:

```
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";

const map = new Map();
const memorySharedLockAdapter = new MemorySharedLockAdapter(map);
```

info `MemorySharedLockAdapter` lets you test your app without external dependencies like `Redis`, making it ideal for local development, unit tests, integration tests and fast E2E tests for the backend application.

danger Note the `MemorySharedLockAdapter` is limited to single-process usage and cannot be shared across multiple servers or processes.

## MongodbSharedLockAdapter[​](#mongodbsharedlockadapter "Direct link to MongodbSharedLockAdapter")

To use the `MongodbSharedLockAdapter`, you'll need to install the required dependency: the [`mongodb`](https://www.npmjs.com/package/mongodb) package:

```
import { MongodbSharedLockAdapter } from "@daiso-tech/core/shared-lock/mongodb-shared-lock-adapter";
import { MongoClient } from "mongodb";

const client = await MongoClient.connect("YOUR_MONGODB_CONNECTION_STRING");
const database = client.db("database");
const mongodbSharedLockAdapter = new MongodbSharedLockAdapter({
    database,
});

// You need to initialize the adapter once before using it.
// During the initialization the indexes will be created
await mongodbSharedLockAdapter.init();
```

You can change the collection name:

```
const mongodbSharedLockAdapter = new MongodbSharedLockAdapter({
    database,
    // By default "shared-lock" is used as the collection name
    collectionName: "my-shared-lock",
});
await mongodbSharedLockAdapter.init();
```

You can change the collection settings:

```
const mongodbSharedLockAdapter = new MongodbSharedLockAdapter({
    database,
    // You can configure additional collection settings
    collectionSettings: {},
});
await mongodbSharedLockAdapter.init();
```

info To remove the shared-lock collection and all stored shared-lock data, use the `deInit` method:

```
await mongodbSharedLockAdapter.deInit();
```

danger Note in order to use the `MongodbSharedLockAdapter` correctly, ensure you use a single, consistent database across all server instances or processes.

## RedisSharedLockAdapter[​](#redissharedlockadapter "Direct link to RedisSharedLockAdapter")

To use the `RedisSharedLockAdapter`, you'll need to install the required dependency: the [`ioredis`](https://www.npmjs.com/package/ioredis) package:

```
import { RedisSharedLockAdapter } from "@daiso-tech/core/shared-lock/redis-shared-lock-adapter";
import Redis from "ioredis";

const database = new Redis("YOUR_REDIS_CONNECTION_STRING");
const redisSharedLockAdapter = new RedisSharedLockAdapter(database);
```

danger Note in order to use the `RedisSharedLockAdapter` correctly, ensure you use a single, consistent database across all server instances or processes.

## KyselySharedLockAdapter[​](#kyselysharedlockadapter "Direct link to KyselySharedLockAdapter")

To use the `KyselySharedLockAdapter`, you'll need to: 1. Use a database provider that has support for transactions. 2.
Install the required dependency: the [`kysely`](https://www.npmjs.com/package/kysely) package:

### With Sqlite[​](#with-sqlite "Direct link to With Sqlite")

You will need to install the [`better-sqlite3`](https://www.npmjs.com/package/better-sqlite3) package:

```
import { KyselySharedLockAdapter } from "@daiso-tech/core/shared-lock/kysely-shared-lock-adapter";
import Sqlite from "better-sqlite3";
import { Kysely, SqliteDialect } from "kysely";

const database = new Sqlite("DATABASE_NAME.db");
const kysely = new Kysely({
    dialect: new SqliteDialect({
        database,
    }),
});
const kyselySharedLockAdapter = new KyselySharedLockAdapter({
    kysely,
});

// You need to initialize the adapter once before using it.
// During the initialization the schema will be created
await kyselySharedLockAdapter.init();
```

danger Note using the `KyselySharedLockAdapter` with `sqlite` is limited to single-server usage and cannot be shared across multiple servers, but it can be shared between different processes. To use it correctly, ensure all process instances access the same persisted database.

### With Postgres[​](#with-postgres "Direct link to With Postgres")

You will need to install the [`pg`](https://www.npmjs.com/package/pg) package:

```
import { KyselySharedLockAdapter } from "@daiso-tech/core/shared-lock/kysely-shared-lock-adapter";
import { Pool } from "pg";
import { Kysely, PostgresDialect } from "kysely";

const database = new Pool({
    database: "DATABASE_NAME",
    host: "DATABASE_HOST",
    user: "DATABASE_USER",
    // Database port
    port: 5432,
    password: "DATABASE_PASSWORD",
    max: 10,
});
const kysely = new Kysely({
    dialect: new PostgresDialect({
        pool: database,
    }),
});
const kyselySharedLockAdapter = new KyselySharedLockAdapter({
    kysely,
});

// You need to initialize the adapter once before using it.
// During the initialization the schema will be created
await kyselySharedLockAdapter.init();
```

danger Note in order to use the `KyselySharedLockAdapter` with `postgres` correctly, ensure you use a single, consistent database across all server instances. This means you can't use replication.

### With Mysql[​](#with-mysql "Direct link to With Mysql")

You will need to install the [`mysql2`](https://www.npmjs.com/package/mysql2) package:

```
import { KyselySharedLockAdapter } from "@daiso-tech/core/shared-lock/kysely-shared-lock-adapter";
import { createPool } from "mysql2";
import { Kysely, MysqlDialect } from "kysely";

const database = createPool({
    host: "DATABASE_HOST",
    // Database port
    port: 3306,
    database: "DATABASE_NAME",
    user: "DATABASE_USER",
    password: "DATABASE_PASSWORD",
    connectionLimit: 10,
});
const kysely = new Kysely({
    dialect: new MysqlDialect({
        pool: database,
    }),
});
const kyselySharedLockAdapter = new KyselySharedLockAdapter({
    kysely,
});

// You need to initialize the adapter once before using it.
// During the initialization the schema will be created
await kyselySharedLockAdapter.init();
```

danger Note in order to use the `KyselySharedLockAdapter` with `mysql` correctly, ensure you use a single, consistent database across all server instances. This means you can't use replication.

### With Libsql[​](#with-libsql "Direct link to With Libsql")

You will need to install the `@libsql/kysely-libsql` package:

```
import { KyselySharedLockAdapter } from "@daiso-tech/core/shared-lock/kysely-shared-lock-adapter";
import { LibsqlDialect } from "@libsql/kysely-libsql";
import { Kysely } from "kysely";

const kysely = new Kysely({
    dialect: new LibsqlDialect({
        url: "DATABASE_URL",
    }),
});
const kyselySharedLockAdapter = new KyselySharedLockAdapter({
    kysely,
});

// You need to initialize the adapter once before using it.
// During the initialization the schema will be created
await kyselySharedLockAdapter.init();
```

danger Note in order to use the `KyselySharedLockAdapter` with `libsql` correctly, ensure you use a single, consistent database across all server instances. This means you can't use libsql embedded replicas.

### Settings[​](#settings "Direct link to Settings")

Expired keys are cleared at regular intervals, and you can change the interval time:

```
import { TimeSpan } from "@daiso-tech/core/time-span";

const kyselySharedLockAdapter = new KyselySharedLockAdapter({
    kysely,
    // By default, the interval is 1 minute
    expiredKeysRemovalInterval: TimeSpan.fromSeconds(10),
});
await kyselySharedLockAdapter.init();
```

Disabling the scheduled interval cleanup of expired keys:

```
const kyselySharedLockAdapter = new KyselySharedLockAdapter({
    kysely,
    shouldRemoveExpiredKeys: false,
});
await kyselySharedLockAdapter.init();

// You can remove all expired keys manually.
await kyselySharedLockAdapter.removeAllExpired();
```

info To remove the shared-lock table and all stored shared-lock data, use the `deInit` method:

```
await kyselySharedLockAdapter.deInit();
```

## NoOpSharedLockAdapter[​](#noopsharedlockadapter "Direct link to NoOpSharedLockAdapter")

The `NoOpSharedLockAdapter` is a no-operation implementation; it performs no actions when called:

```
import { NoOpSharedLockAdapter } from "@daiso-tech/core/shared-lock/no-op-shared-lock-adapter";

const noOpSharedLockAdapter = new NoOpSharedLockAdapter();
```

info The `NoOpSharedLockAdapter` is useful when you want to mock out or disable your [`SharedLockFactory`](https://daiso-tech.github.io/daiso-core/classes/SharedLock.SharedLockFactory.html) instance.

## Further information[​](#further-information "Direct link to Further information")

For further information refer to the [`@daiso-tech/core/shared-lock`](https://daiso-tech.github.io/daiso-core/modules/SharedLock.html) API docs.
---

# Creating shared-lock adapters

## Implementing your custom ISharedLockAdapter[​](#implementing-your-custom-isharedlockadapter "Direct link to Implementing your custom ISharedLockAdapter")

In order to create an adapter you need to implement the [`ISharedLockAdapter`](https://daiso-tech.github.io/daiso-core/types/SharedLock.ISharedLockAdapter.html) contract.

## Testing your custom ISharedLockAdapter[​](#testing-your-custom-isharedlockadapter "Direct link to Testing your custom ISharedLockAdapter")

We provide a complete test suite to test your shared-lock adapter implementation. Simply use the [`sharedLockAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/Lock.lockAdapterTestSuite.html) function:

* Preconfigured Vitest test cases
* Common edge case coverage

Usage example:

```
// filename: MySharedLockAdapter.test.ts
import { beforeEach, describe, expect, test } from "vitest";
import { sharedLockAdapterTestSuite } from "@daiso-tech/core/shared-lock/test-utilities";
import { MemorySharedLockAdapter } from "./MemorySharedLockAdapter.js";

describe("class: MySharedLockAdapter", () => {
    sharedLockAdapterTestSuite({
        createAdapter: () => new MemorySharedLockAdapter(),
        test,
        beforeEach,
        expect,
        describe,
    });
});
```

## Implementing your custom IDatabaseSharedLockAdapter[​](#implementing-your-custom-idatabasesharedlockadapter "Direct link to Implementing your custom IDatabaseSharedLockAdapter")

We provide an additional contract [`IDatabaseSharedLockAdapter`](https://daiso-tech.github.io/daiso-core/types/SharedLock.IDatabaseSharedLockAdapter.html) for building custom shared-lock adapters tailored to databases.

## Testing your custom IDatabaseSharedLockAdapter[​](#testing-your-custom-idatabasesharedlockadapter "Direct link to Testing your custom IDatabaseSharedLockAdapter")

We provide a complete test suite to test your database shared-lock adapter implementation. Simply use the [`databaseSharedLockAdapterTestSuite`](https://daiso-tech.github.io/daiso-core/functions/SharedLock.databaseSharedLockAdapterTestSuite.html) function:

* Preconfigured Vitest test cases
* Common edge case coverage

Usage example:

```
import { beforeEach, describe, expect, test } from "vitest";
import { databaseSharedLockAdapterTestSuite } from "@daiso-tech/core/shared-lock/test-utilities";
import { MyDatabaseSharedLockAdapter } from "./MyDatabaseSharedLockAdapter.js";

describe("class: MyDatabaseSharedLockAdapter", () => {
    databaseSharedLockAdapterTestSuite({
        createAdapter: async () => {
            return new MyDatabaseSharedLockAdapter();
        },
        test,
        beforeEach,
        expect,
        describe,
    });
});
```

## Implementing your custom ISharedLockFactory class[​](#implementing-your-custom-isharedlockfactory-class "Direct link to Implementing your custom ISharedLockFactory class")

In some cases, you may need to implement a custom [`SharedLockFactory`](https://daiso-tech.github.io/daiso-core/classes/SharedLock.SharedLockFactory.html) class to optimize performance for your specific technology stack. You can then directly implement the [`ISharedLockFactory`](https://daiso-tech.github.io/daiso-core/types/SharedLock.ISharedLockFactory.html) contract.

## Testing your custom ISharedLockFactory class[​](#testing-your-custom-isharedlockfactory-class "Direct link to Testing your custom ISharedLockFactory class")

We provide a complete test suite to verify your custom shared-lock factory class implementation.
Simply use the [`sharedLockProviderTestSuite`](https://daiso-tech.github.io/daiso-core/functions/SharedLock.sharedLockProviderTestSuite.html) function:

* Preconfigured Vitest test cases
* Standardized shared-lock factory behavior validation
* Common edge case coverage

Usage example:

```
// filename: MySharedLockFactory.test.ts
import { beforeEach, describe, expect, test } from "vitest";
import { sharedLockProviderTestSuite } from "@daiso-tech/core/shared-lock/test-utilities";
import { MySharedLockFactory } from "./MySharedLockFactory.js";

describe("class: MySharedLockFactory", () => {
    sharedLockProviderTestSuite({
        createSharedLockFactory: () => new MySharedLockFactory(),
        test,
        beforeEach,
        expect,
        describe,
    });
});
```

## Further information[​](#further-information "Direct link to Further information")

For further information refer to the [`@daiso-tech/core/shared-lock`](https://daiso-tech.github.io/daiso-core/modules/SharedLock.html) API docs.

---

# SharedLockFactoryResolver

The `SharedLockFactoryResolver` class provides a flexible way to configure and switch between different shared-lock adapters at runtime.

## Initial configuration[​](#initial-configuration "Direct link to Initial configuration")

To begin using the `SharedLockFactoryResolver`, you will need to register all required adapters during initialization.

```
import { SharedLockFactoryResolver } from "@daiso-tech/core/shared-lock";
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";
import { RedisSharedLockAdapter } from "@daiso-tech/core/shared-lock/redis-shared-lock-adapter";
import Redis from "ioredis";

const sharedLockFactoryResolver = new SharedLockFactoryResolver({
    adapters: {
        memory: new MemorySharedLockAdapter(),
        redis: new RedisSharedLockAdapter(new Redis("YOUR_REDIS_CONNECTION")),
    },
    // You can set an optional default adapter
    defaultAdapter: "memory",
});
```

## Usage[​](#usage "Direct link to Usage")

### 1. Using the default adapter[​](#1-using-the-default-adapter "Direct link to 1. Using the default adapter")

```
await sharedLockFactoryResolver
    .use()
    .create("shared-resource")
    .runWriterOrFail(async () => {
        // code to run
    });
```

danger Note that if you don't set a default adapter, an error will be thrown.

### 2. Specifying an adapter explicitly[​](#2-specifying-an-adapter-explicitly "Direct link to 2. Specifying an adapter explicitly")

```
await sharedLockFactoryResolver
    .use("redis")
    .create("shared-resource")
    .runWriterOrFail(async () => {
        // code to run
    });
```

danger Note that if you specify a non-existent adapter, an error will be thrown.

### 3. Overriding default settings[​](#3-overriding-default-settings "Direct link to 3. Overriding default settings")

```
import { Namespace } from "@daiso-tech/core/namespace";

await sharedLockFactoryResolver
    .setNamespace(new Namespace("@my-namespace"))
    .use("redis")
    .create("shared-resource")
    .runWriterOrFail(async () => {
        // code to run
    });
```

info Note that the `SharedLockFactoryResolver` is immutable, meaning any configuration override returns a new instance rather than modifying the existing one.

## Further information[​](#further-information "Direct link to Further information")

For further information refer to the [`@daiso-tech/core/shared-lock`](https://daiso-tech.github.io/daiso-core/modules/SharedLock.html) API docs.

---

# Shared-lock usage

The `@daiso-tech/core/shared-lock` component provides a way for managing shared-locks (a.k.a. reader-writer locks) independent of the underlying platform or storage.
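Before configuring the factory, it may help to see the reader-writer rule in miniature. The sketch below is not the library's implementation (`MiniSharedLock` is a hypothetical name, and it is synchronous and single-process only); it only illustrates the semantics the component enforces: up to `limit` concurrent readers, or a single exclusive writer.

```typescript
// Conceptual sketch of reader-writer semantics, for illustration only.
class MiniSharedLock {
    private readers = 0;
    private writer = false;

    constructor(private readonly limit: number) {}

    // A reader slot is granted only when no writer holds the lock
    // and the reader limit has not been reached.
    acquireReader(): boolean {
        if (this.writer || this.readers >= this.limit) return false;
        this.readers++;
        return true;
    }

    releaseReader(): void {
        if (this.readers > 0) this.readers--;
    }

    // The writer is exclusive: it requires no readers and no other writer.
    acquireWriter(): boolean {
        if (this.writer || this.readers > 0) return false;
        this.writer = true;
        return true;
    }

    releaseWriter(): void {
        this.writer = false;
    }
}

const lock = new MiniSharedLock(2);
console.log(lock.acquireReader()); // true
console.log(lock.acquireReader()); // true
console.log(lock.acquireReader()); // false -- reader limit reached
console.log(lock.acquireWriter()); // false -- readers hold the lock
```

The real component implements the same rule asynchronously, with ownership tracking and TTL-based expiration on top, as shown in the sections below.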
## Initial configuration[​](#initial-configuration "Direct link to Initial configuration")

To begin using the `SharedLockFactory` class, you'll need to create and configure an instance:

```
import { TimeSpan } from "@daiso-tech/core/time-span";
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";
import { SharedLockFactory } from "@daiso-tech/core/shared-lock";

const sharedLockFactory = new SharedLockFactory({
    // You can provide a default TTL value.
    // If you set it to null, shared-locks will not expire by default and must be released manually.
    defaultTtl: TimeSpan.fromSeconds(2),
    // You can choose the adapter to use
    adapter: new MemorySharedLockAdapter(),
});
```

info Here is a complete list of settings for the [`SharedLockFactory`](https://daiso-tech.github.io/daiso-core/types/SharedLock.SharedLockFactorySettingsBase.html) class.

## Shared lock basics[​](#shared-lock-basics "Direct link to Shared lock basics")

### Creating a shared-lock[​](#creating-a-shared-lock "Direct link to Creating a shared-lock")

```
const sharedLock = sharedLockFactory.create("shared-resource", {
    // You need to define a limit
    limit: 2,
});
```

### Acquiring and releasing the shared-lock as reader[​](#acquiring-and-releasing-the-shared-lock-as-reader "Direct link to Acquiring and releasing the shared-lock as reader")

```
// 1 slot will be acquired
if (await sharedLock.acquireReader()) {
    console.log("Acquired");
    try {
        // The concurrent section
    } finally {
        await sharedLock.releaseReader();
    }
} else {
    console.log("Unable to acquire");
}

// 2 slots will be acquired
if (await sharedLock.acquireReader()) {
    console.log("Acquired");
    try {
        // The concurrent section
    } finally {
        await sharedLock.releaseReader();
    }
} else {
    console.log("Unable to acquire");
}

// Will log false because the limit is reached
console.log(await sharedLock.acquireReader());
```

Alternatively you could write it as follows:

```
// 1 slot will be acquired
try {
    // This method will throw if the shared-lock limit is reached.
    await sharedLock.acquireReaderOrFail();
    console.log("Acquired");
    // The critical section
} catch {
    console.log("Unable to acquire");
} finally {
    await sharedLock.releaseReader();
}

// 2 slots will be acquired
try {
    // This method will throw if the shared-lock limit is reached.
    await sharedLock.acquireReaderOrFail();
    console.log("Acquired");
    // The critical section
} catch {
    console.log("Unable to acquire");
} finally {
    await sharedLock.releaseReader();
}

// Will throw because the limit is reached
await sharedLock.acquireReaderOrFail();
```

danger You must always wrap the concurrent section with `try-finally` so the shared-lock gets released if an error occurs.

### Acquiring and releasing the shared-lock as writer[​](#acquiring-and-releasing-the-shared-lock-as-writer "Direct link to Acquiring and releasing the shared-lock as writer")

```
const hasAcquired = await sharedLock.acquireWriter();
if (hasAcquired) {
    try {
        // The critical section
    } finally {
        await sharedLock.releaseWriter();
    }
}
```

Alternatively you could write it as follows:

```
try {
    // This method will throw if the shared-lock is not acquired
    await sharedLock.acquireWriterOrFail();
    // The critical section
} finally {
    await sharedLock.releaseWriter();
}
```

danger You must always wrap the critical section with `try-finally` so the shared-lock gets released if an error occurs.

### Shared lock with custom TTL[​](#shared-lock-with-custom-ttl "Direct link to Shared lock with custom TTL")

You can provide a custom TTL for the shared-lock.

```
const sharedLock = sharedLockFactory.create("shared-resource", {
    // The default TTL is 5min if not overridden.
    // If you set it to null, the shared-lock will not expire and must be released manually.
    ttl: TimeSpan.fromSeconds(30),
    limit: 2,
});
```

### Checking shared-lock state[​](#checking-shared-lock-state "Direct link to Checking shared-lock state")

You can get the shared-lock state by using the `getState` method; it returns an [`ISharedLockState`](https://daiso-tech.github.io/daiso-core/types/SharedLock.ISharedLockState.html).

```
import { SHARED_LOCK_STATE } from "@daiso-tech/core/shared-lock/contracts";

const sharedLock = sharedLockFactory.create("shared-resource", {
    limit: 2,
});

const state = await sharedLock.getState();
if (state.type === SHARED_LOCK_STATE.EXPIRED) {
    console.log("The shared-lock doesn't exist");
}
if (state.type === SHARED_LOCK_STATE.READER_LIMIT_REACHED) {
    console.log(
        "The shared-lock is in reader mode, the limit has been reached and all slots are unavailable",
    );
}
if (state.type === SHARED_LOCK_STATE.READER_ACQUIRED) {
    console.log("The shared-lock is in reader mode and is acquired");
}
if (state.type === SHARED_LOCK_STATE.READER_UNACQUIRED) {
    console.log(
        "The shared-lock is in reader mode and there are available slots but the shared-lock is not acquired",
    );
}
if (state.type === SHARED_LOCK_STATE.WRITER_UNAVAILABLE) {
    console.log(
        "The shared-lock is in writer mode and is acquired by a different owner",
    );
}
if (state.type === SHARED_LOCK_STATE.WRITER_ACQUIRED) {
    console.log("The shared-lock is in writer mode and is acquired");
}
```

## Patterns[​](#patterns "Direct link to Patterns")

### Refreshing shared-locks[​](#refreshing-shared-locks "Direct link to Refreshing shared-locks")

The shared-lock can be refreshed by the current owner before it expires. This is particularly useful for long-running tasks: instead of setting an excessively long TTL initially, you can start with a shorter one and use the `refreshReader` or `refreshWriter` method to extend the TTL of the shared-lock:

#### As reader[​](#as-reader "Direct link to As reader")

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
    ttl: TimeSpan.fromMinutes(1),
});

async function doWork(): Promise<boolean> {
    // ... critical section
}

const hasAcquired = await sharedLock.acquireReader();
if (hasAcquired) {
    try {
        while (true) {
            await sharedLock.refreshReader(TimeSpan.fromMinutes(1));
            const hasFinished = await doWork();
            if (hasFinished) {
                break;
            }
            await delay(TimeSpan.fromSeconds(1));
        }
    } finally {
        await sharedLock.releaseReader();
    }
}
```

#### As writer[​](#as-writer "Direct link to As writer")

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
    ttl: TimeSpan.fromMinutes(1),
});

async function doWork(): Promise<boolean> {
    // ... critical section
}

const hasAcquired = await sharedLock.acquireWriter();
if (hasAcquired) {
    try {
        while (true) {
            await sharedLock.refreshWriter(TimeSpan.fromMinutes(1));
            const hasFinished = await doWork();
            if (hasFinished) {
                break;
            }
            await delay(TimeSpan.fromSeconds(1));
        }
    } finally {
        await sharedLock.releaseWriter();
    }
}
```

warning Note: A shared-lock must have an expiration (a `ttl` value) to be refreshed.
You cannot refresh a shared-lock that was created without an expiration (with `ttl: null`) ``` // Create a shared-lock with no expiration (non-refreshable) const sharedLock = sharedLockFactory.create("resource", { limit: 2, ttl: null, }); // A writer refresh attempt on this shared-ock will fail const hasRefreshedWriter = await sharedLock.refreshWriter(); // This will log 'false' because the sharedLock cannot be refreshed console.log(hasRefreshedWriter); // A reader refresh attempt on this shared-ock will fail const hasRefreshedReader = await sharedLock.refreshReader(); // This will log 'false' because the sharedLock cannot be refreshed console.log(hasRefreshedReader); ``` ### Additional writer methods[​](#additional-writer-methods "Direct link to Additional writer methods") The `releaseWriterOrFail` method is the same `releaseWriter` method but it throws an error when not enable to release the shared-lock as writer: ``` const sharedLock = sharedLockFactory.create("resource", { limit: 2, }); await sharedLock.releaseWriterOrFail(); ``` The `forceReleaseWriter` method releases the shared-lock regardless of the owner if in writer mode: ``` const sharedLock = sharedLockFactory.create("resource", { limit: 2, }); await sharedLock.forceReleaseWriter(); ``` The `refreshWriterOrFail` method is the same `refreshWriter` method but it throws an error when not enable to refresh the shared-lock as writer: ``` const sharedLock = sharedLockFactory.create("resource", { limit: 2, }); await sharedLock.refreshWriterOrFail(); ``` The `runWriterOrFail` method automatically manages shared-lock acquisition and release as writer around function execution. It calls `acquireWriterOrFail` before invoking the function and calls `releaseWriter` in a finally block, ensuring the shared-lock is always freed, even if an error occurs during execution. ``` const sharedLock = sharedLockFactory.create("resource", { limit: 2, }); await sharedLock.runWriterOrFail(async () => { // ... 
    critical section
});
```

info Note the method throws an error when the shared-lock cannot be acquired as writer.

info You can provide a synchronous `Invokable` or an async/promisable invokable as the value for the `runWriterOrFail` method.

### Additional reader methods[​](#additional-reader-methods "Direct link to Additional reader methods")

The `releaseReaderOrFail` method is the same as the `releaseReader` method, except it throws an error when it is unable to release the shared-lock as reader:

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

await sharedLock.releaseReaderOrFail();
```

The `forceReleaseAllReaders` method releases all the slots of the shared-lock if it is in reader mode:

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

await sharedLock.forceReleaseAllReaders();
```

The `refreshReaderOrFail` method is the same as the `refreshReader` method, except it throws an error when it is unable to refresh the shared-lock as reader:

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

await sharedLock.refreshReaderOrFail();
```

The `runReaderOrFail` method automatically manages shared-lock acquisition and release as reader around function execution. It calls `acquireReaderOrFail` before invoking the function and calls `releaseReader` in a finally block, ensuring the shared-lock is always freed, even if an error occurs during execution.

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

await sharedLock.runReaderOrFail(async () => {
    // ... critical section
});
```

info Note the method throws an error when the shared-lock cannot be acquired as reader.

info You can provide a synchronous `Invokable` or an async/promisable invokable as the value for the `runReaderOrFail` method.
### Additional methods[​](#additional-methods "Direct link to Additional methods")

The `forceRelease` method releases the shared-lock regardless of whether it is in reader or writer mode:

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

await sharedLock.forceRelease();
```

### Shared-lock instance variables[​](#shared-lock-instance-variables "Direct link to Shared-lock instance variables")

The `SharedLock` class exposes instance variables such as:

```
const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

// Will return the key of the shared-lock, which is "resource"
console.log(sharedLock.key.toString());

// Will return the id of the shared-lock
console.log(sharedLock.id);

// Will return the ttl of the shared-lock
console.log(sharedLock.ttl);
```

info The `key` field is an object that implements the [`IKey`](/docs/components/namespace.md) contract.

### Shared-lock id[​](#shared-lock-id "Direct link to Shared-lock id")

By default the shared-lock id is autogenerated, but it can also be defined manually:

```
const sharedLock = sharedLockFactory.create("shared-lock", {
    lockId: "my-shared-lock-id",
});

const hasAcquired = await sharedLock.acquireWriter();
if (hasAcquired) {
    console.log("Shared resource");
    await sharedLock.releaseWriter();
}
```

info Manually defining the shared-lock id is primarily useful for debugging or for implementing manual resource control by the end user.

warning In most cases, setting a shared-lock id is unnecessary.

### Namespacing[​](#namespacing "Direct link to Namespacing")

You can use the `Namespace` class to group related shared-locks without conflicts. Since namespacing is not used by default, you need to pass an object that implements the `INamespace` contract.

info For further information about namespacing refer to the [`@daiso-tech/core/namespace`](/docs/components/namespace.md) documentation.
```
import { Namespace } from "@daiso-tech/core/namespace";
import { RedisSharedLockAdapter } from "@daiso-tech/core/shared-lock/redis-shared-lock-adapter";
import { SharedLockFactory } from "@daiso-tech/core/shared-lock";
import Redis from "ioredis";

const database = new Redis("YOUR_REDIS_CONNECTION_STRING");

const sharedLockFactoryA = new SharedLockFactory({
    namespace: new Namespace("@sharedLock-a"),
    adapter: new RedisSharedLockAdapter(database),
});
const sharedLockFactoryB = new SharedLockFactory({
    namespace: new Namespace("@sharedLock-b"),
    adapter: new RedisSharedLockAdapter(database),
});

const sharedLockA = sharedLockFactoryA.create("key", {
    ttl: null,
    limit: 1,
});
const sharedLockB = sharedLockFactoryB.create("key", {
    ttl: null,
    limit: 1,
});

const hasAcquiredA = await sharedLockA.acquireWriter();
// Will log true
console.log(hasAcquiredA);

const hasAcquiredB = await sharedLockB.acquireWriter();
// Will log true
console.log(hasAcquiredB);

const hasReleasedB = await sharedLockB.releaseWriter();
// Will log true
console.log(hasReleasedB);

// Will log { type: "WRITER_ACQUIRED", remainingTime: null }
console.log(await sharedLockA.getState());

// Will log { type: "EXPIRED" }
console.log(await sharedLockB.getState());
```

### Retrying acquiring shared-lock as writer by attempts[​](#retrying-acquiring-shared-lock-as-writer-by-attempts "Direct link to Retrying acquiring shared-lock as writer by attempts")

To retry acquiring the shared-lock as writer you can use the [`retry`](/docs/components/resilience.md) middleware.
Retrying acquiring the shared-lock as writer with the `acquireWriterOrFail` method:

```
import { retry } from "@daiso-tech/core/resilience";
import { FailedAcquireWriterLockError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";

const sharedLock = sharedLockFactory.create("shared-lock", {
    limit: 2,
});
const use = useFactory();

try {
    await use(async () => {
        await sharedLock.acquireWriterOrFail();
    }, [
        retry({
            maxAttempts: 4,
            errorPolicy: FailedAcquireWriterLockError,
        }),
    ])();
    // The critical section
} finally {
    await sharedLock.releaseWriter();
}
```

Retrying acquiring the shared-lock as writer with the `acquireWriter` method:

```
import { retry } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";

const sharedLock = sharedLockFactory.create("shared-lock", {
    limit: 2,
});
const use = useFactory();

const hasAcquired = await use(async () => {
    return await sharedLock.acquireWriter();
}, [
    retry({
        maxAttempts: 4,
        errorPolicy: {
            treatFalseAsError: true,
        },
    }),
])();
if (hasAcquired) {
    try {
        // The critical section
    } finally {
        await sharedLock.releaseWriter();
    }
}
```

Retrying acquiring the shared-lock as writer with the `runWriterOrFail` method:

```
import { retry } from "@daiso-tech/core/resilience";
import { FailedAcquireWriterLockError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";

const sharedLock = sharedLockFactory.create("shared-lock", {
    limit: 2,
});
const use = useFactory();

await use(async () => {
    await sharedLock.runWriterOrFail(async () => {
        // The critical section
    });
}, [
    retry({
        maxAttempts: 4,
        errorPolicy: FailedAcquireWriterLockError,
    }),
])();
```

### Retrying acquiring shared-lock as reader by attempts[​](#retrying-acquiring-shared-lock-as-reader-by-attempts "Direct link to Retrying acquiring shared-lock as reader by attempts")

To retry acquiring the shared-lock as reader you can use the [`retry`](/docs/components/resilience.md)
middleware.

Retrying acquiring the shared-lock as reader with the `acquireReaderOrFail` method:

```
import { retry } from "@daiso-tech/core/resilience";
import { LimitReachedReaderSemaphoreError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";

const sharedLock = sharedLockFactory.create("shared-lock", {
    limit: 2,
});
const use = useFactory();

try {
    await use(async () => {
        await sharedLock.acquireReaderOrFail();
    }, [
        retry({
            maxAttempts: 4,
            errorPolicy: LimitReachedReaderSemaphoreError,
        }),
    ])();
    // The critical section
} finally {
    await sharedLock.releaseReader();
}
```

Retrying acquiring the shared-lock as reader with the `acquireReader` method:

```
import { retry } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";

const sharedLock = sharedLockFactory.create("shared-lock", {
    limit: 2,
});
const use = useFactory();

const hasAcquired = await use(async () => {
    return await sharedLock.acquireReader();
}, [
    retry({
        maxAttempts: 4,
        errorPolicy: {
            treatFalseAsError: true,
        },
    }),
])();
if (hasAcquired) {
    try {
        // The critical section
    } finally {
        await sharedLock.releaseReader();
    }
}
```

Retrying acquiring the shared-lock as reader with the `runReaderOrFail` method:

```
import { retry } from "@daiso-tech/core/resilience";
import { LimitReachedReaderSemaphoreError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";

const sharedLock = sharedLockFactory.create("shared-lock", {
    limit: 2,
});
const use = useFactory();

await use(async () => {
    await sharedLock.runReaderOrFail(async () => {
        // The critical section
    });
}, [
    retry({
        maxAttempts: 4,
        errorPolicy: LimitReachedReaderSemaphoreError,
    }),
])();
```

### Retrying acquiring shared-lock as writer by interval[​](#retrying-acquiring-shared-lock-as-writer-by-interval "Direct link to Retrying acquiring shared-lock as writer by interval")

To retry acquiring the shared-lock as writer at regular intervals you can use
the [`retryInterval`](/docs/components/resilience.md) middleware.

Retrying acquiring the shared-lock with the `acquireWriterOrFail` method:

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { FailedAcquireWriterLockError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});
const use = useFactory();

try {
    await use(async () => {
        await sharedLock.acquireWriterOrFail();
    }, [
        retryInterval({
            // Total time to wait: 1 minute
            time: TimeSpan.fromMinutes(1),
            // Interval between acquisition attempts
            interval: TimeSpan.fromSeconds(1),
            errorPolicy: FailedAcquireWriterLockError,
        }),
    ])();
    // ... critical section
} finally {
    await sharedLock.releaseWriter();
}
```

Retrying acquiring the shared-lock with the `acquireWriter` method:

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});
const use = useFactory();

const hasAcquired = await use(async () => {
    return await sharedLock.acquireWriter();
}, [
    retryInterval({
        time: TimeSpan.fromMinutes(1),
        interval: TimeSpan.fromSeconds(1),
        errorPolicy: {
            treatFalseAsError: true,
        },
    }),
])();
if (hasAcquired) {
    try {
        // ...
        critical section
    } finally {
        await sharedLock.releaseWriter();
    }
}
```

Retrying acquiring the shared-lock with the `runWriterOrFail` method:

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { FailedAcquireWriterLockError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});
const use = useFactory();

await use(async () => {
    await sharedLock.runWriterOrFail(async () => {
        // ... critical section
    });
}, [
    retryInterval({
        time: TimeSpan.fromMinutes(1),
        interval: TimeSpan.fromSeconds(1),
        errorPolicy: FailedAcquireWriterLockError,
    }),
])();
```

warning Using the `retryInterval` middleware to acquire a shared-lock inside an HTTP request handler is discouraged: it blocks the handler until the shared-lock becomes available or the timeout is reached, which delays the response and makes the frontend app feel slow.

### Retrying acquiring shared-lock as reader by interval[​](#retrying-acquiring-shared-lock-as-reader-by-interval "Direct link to Retrying acquiring shared-lock as reader by interval")

To retry acquiring the shared-lock as reader at regular intervals you can use the [`retryInterval`](/docs/components/resilience.md) middleware.
Retrying acquiring the shared-lock with the `acquireReaderOrFail` method:

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { LimitReachedReaderSemaphoreError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});
const use = useFactory();

try {
    await use(async () => {
        await sharedLock.acquireReaderOrFail();
    }, [
        retryInterval({
            // Total time to wait: 1 minute
            time: TimeSpan.fromMinutes(1),
            // Interval between acquisition attempts
            interval: TimeSpan.fromSeconds(1),
            errorPolicy: LimitReachedReaderSemaphoreError,
        }),
    ])();
    // ... critical section
} finally {
    await sharedLock.releaseReader();
}
```

Retrying acquiring the shared-lock with the `acquireReader` method:

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});
const use = useFactory();

const hasAcquired = await use(async () => {
    return await sharedLock.acquireReader();
}, [
    retryInterval({
        time: TimeSpan.fromMinutes(1),
        interval: TimeSpan.fromSeconds(1),
        errorPolicy: {
            treatFalseAsError: true,
        },
    }),
])();
if (hasAcquired) {
    try {
        // ... critical section
    } finally {
        await sharedLock.releaseReader();
    }
}
```

Retrying acquiring the shared-lock with the `runReaderOrFail` method:

```
import { retryInterval } from "@daiso-tech/core/resilience";
import { LimitReachedReaderSemaphoreError } from "@daiso-tech/core/shared-lock/contracts";
import { useFactory } from "@daiso-tech/core/middleware";
import { TimeSpan } from "@daiso-tech/core/time-span";

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});
const use = useFactory();

await use(async () => {
    await sharedLock.runReaderOrFail(async () => {
        // ...
        critical section
    });
}, [
    retryInterval({
        time: TimeSpan.fromMinutes(1),
        interval: TimeSpan.fromSeconds(1),
        errorPolicy: LimitReachedReaderSemaphoreError,
    }),
])();
```

warning Using the `retryInterval` middleware to acquire a shared-lock inside an HTTP request handler is discouraged: it blocks the handler until the shared-lock becomes available or the timeout is reached, which delays the response and makes the frontend app feel slow.

### Serialization and deserialization of shared-lock[​](#serialization-and-deserialization-of-shared-lock "Direct link to Serialization and deserialization of shared-lock")

Shared-locks can be serialized, allowing them to be transmitted over the network to another server and later deserialized for reuse. This means you can, for example, acquire the shared-lock on the main server, transfer it to a queue worker server, and release it there.

In order to serialize or deserialize a shared-lock you need to pass an object that implements the [`ISerderRegister`](/docs/components/serde.md) contract, like the [`Serde`](/docs/components/serde.md) class, to the `SharedLockFactory`.
Manually serializing and deserializing the shared-lock:

```
import { RedisSharedLockAdapter } from "@daiso-tech/core/shared-lock/redis-shared-lock-adapter";
import { SharedLockFactory } from "@daiso-tech/core/shared-lock";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redisClient = new Redis("YOUR_REDIS_CONNECTION");
const sharedLockFactory = new SharedLockFactory({
    // You can also pass in an array of Serde class instances
    serde,
    adapter: new RedisSharedLockAdapter(redisClient),
});

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

const serializedSharedLock = serde.serialize(sharedLock);
const deserializedSharedLock = serde.deserialize(serializedSharedLock);
```

danger When serializing or deserializing a shared-lock, you must use the same `Serde` instances that were provided to the `SharedLockFactory`. This is required because the `SharedLockFactory` injects custom serialization logic for `ISharedLock` instances into the `Serde` instances.

info Note you only need manual serialization and deserialization when integrating with external libraries. As long as you pass the same `Serde` instances to all other components, you don't need to serialize and deserialize the shared-lock manually.
```
import { RedisSharedLockAdapter } from "@daiso-tech/core/shared-lock/redis-shared-lock-adapter";
import type { ISharedLock } from "@daiso-tech/core/shared-lock/contracts";
import { SharedLockFactory } from "@daiso-tech/core/shared-lock";
import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter";
import { EventBus } from "@daiso-tech/core/event-bus";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redis = new Redis("YOUR_REDIS_CONNECTION");

type EventMap = {
    "sending-shared-lock-over-network": {
        sharedLock: ISharedLock;
    };
};
const eventBus = new EventBus({
    adapter: new RedisPubSubEventBusAdapter({
        client: redis,
        serde,
    }),
});

const sharedLockFactory = new SharedLockFactory({
    serde,
    adapter: new RedisSharedLockAdapter(redis),
    eventBus,
});

const sharedLock = sharedLockFactory.create("resource", {
    limit: 2,
});

// We are sending the shared-lock over the network to other servers.
await eventBus.dispatch("sending-shared-lock-over-network", {
    sharedLock,
});

// The other servers will receive the serialized shared-lock and automatically deserialize it.
await eventBus.addListener(
    "sending-shared-lock-over-network",
    ({ sharedLock }) => {
        // The shared-lock is deserialized and can be used
        console.log("SHARED_LOCK:", sharedLock);
    },
);
```

### Shared-lock events[​](#shared-lock-events "Direct link to Shared-lock events")

You can listen to the different [shared-lock events](https://daiso-tech.github.io/daiso-core/modules/SharedLock.html) that are triggered by the `SharedLock` instance. Refer to the [`EventBus`](/docs/components/event_bus/event_bus_usage.md) documentation to learn how to use events. Since no events are dispatched by default, you need to pass an object that implements the `IEventBus` or `IEventBusAdapter` contract.
```
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";
import {
    SharedLockFactory,
    SHARED_LOCK_EVENTS,
} from "@daiso-tech/core/shared-lock";
import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter";

const memoryEventBusAdapter = new MemoryEventBusAdapter();
const sharedLockFactory = new SharedLockFactory({
    adapter: new MemorySharedLockAdapter(),
    eventBus: memoryEventBusAdapter,
});

await sharedLockFactory.events.addListener(
    SHARED_LOCK_EVENTS.WRITER_ACQUIRED,
    () => {
        console.log("Lock acquired");
    },
);
await sharedLockFactory.create("a", { limit: 2 }).acquireWriter();
```

warning If multiple shared-lock adapters (e.g., `RedisSharedLockAdapter` and `MemorySharedLockAdapter`) are used at the same time, you need to isolate their events by assigning separate namespaces. This prevents listeners from unintentionally capturing events across adapters.

```
import { RedisSharedLockAdapter } from "@daiso-tech/core/shared-lock/redis-shared-lock-adapter";
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";
import { SharedLockFactory } from "@daiso-tech/core/shared-lock";
import { RedisPubSubEventBusAdapter } from "@daiso-tech/core/event-bus/redis-pub-sub-event-bus-adapter";
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import Redis from "ioredis";
import { Namespace } from "@daiso-tech/core/namespace";

const serde = new Serde(new SuperJsonSerdeAdapter());
const redisPubSubEventBusAdapter = new RedisPubSubEventBusAdapter({
    client: new Redis("YOUR_REDIS_CONNECTION_STRING"),
    serde,
});

const memorySharedLockAdapter = new MemorySharedLockAdapter();
const memorySharedLockFactory = new SharedLockFactory({
    adapter: memorySharedLockAdapter,
    // We assign distinct namespaces to MemorySharedLockAdapter and RedisSharedLockAdapter to isolate their events.
    namespace: new Namespace(["memory", "event-bus"]),
    eventBus: redisPubSubEventBusAdapter,
});

const redisSharedLockAdapter = new RedisSharedLockAdapter({
    serde,
    database: new Redis("YOUR_REDIS_CONNECTION_STRING"),
});
const redisSharedLockFactory = new SharedLockFactory({
    adapter: redisSharedLockAdapter,
    // We assign distinct namespaces to MemorySharedLockAdapter and RedisSharedLockAdapter to isolate their events.
    namespace: new Namespace(["redis", "event-bus"]),
    eventBus: redisPubSubEventBusAdapter,
});
```

### Separating creating, listening to and manipulating shared-lock[​](#separating-creating-listening-to-and-manipulating-shared-lock "Direct link to Separating creating, listening to and manipulating shared-lock")

The library includes 5 additional contracts:

* [`ISharedLock`](https://daiso-tech.github.io/daiso-core/types/SharedLock.ISharedLock.html) - Allows only manipulation of the shared-lock.
* [`IWriterLock`](https://daiso-tech.github.io/daiso-core/types/SharedLock.IWriterLock.html) - Allows only manipulation of the shared-lock as writer.
* [`IReaderSemaphore`](https://daiso-tech.github.io/daiso-core/types/SharedLock.IReaderSemaphore.html) - Allows only manipulation of the shared-lock as reader.
* [`ISharedLockFactoryBase`](https://daiso-tech.github.io/daiso-core/types/SharedLock.ISharedLockFactoryBase.html) - Allows only creation of shared-locks.
* [`ISharedLockListenable`](https://daiso-tech.github.io/daiso-core/types/SharedLock.ISharedLockListenable.html) - Allows only listening to shared-lock events.

This separation makes it easy to visually distinguish the contracts, making it immediately obvious that they serve different purposes.
```
import { MemoryEventBusAdapter } from "@daiso-tech/core/event-bus/memory-event-bus-adapter";
import { SharedLockFactory } from "@daiso-tech/core/shared-lock";
import { MemorySharedLockAdapter } from "@daiso-tech/core/shared-lock/memory-shared-lock-adapter";
import {
    type ISharedLock,
    type ISharedLockFactoryBase,
    type ISharedLockListenable,
    type IWriterLock,
    type IReaderSemaphore,
    SHARED_LOCK_EVENTS,
} from "@daiso-tech/core/shared-lock/contracts";

async function writerLockFunc(writerLock: IWriterLock): Promise<void> {
    // You can only access the writer methods
    // You will get a typescript error if you try to access the reader methods
    await writerLock.runWriterOrFail(async () => {
        // ... critical section
    });
}

async function readerSemaphoreFunc(
    readerSemaphore: IReaderSemaphore,
): Promise<void> {
    // You can only access the reader methods
    // You will get a typescript error if you try to access the writer methods
    await readerSemaphore.runReaderOrFail(async () => {
        // ... critical section
    });
}

async function sharedLockFunc(sharedLock: ISharedLock): Promise<void> {
    await writerLockFunc(sharedLock);
    await readerSemaphoreFunc(sharedLock);

    // You can access this method because it is not part of the writer or reader methods.
    await sharedLock.forceRelease();
}

async function sharedLockFactoryFunc(
    sharedLockFactory: ISharedLockFactoryBase,
): Promise<void> {
    // You cannot access the listener methods
    // You will get a typescript error if you try
    const sharedLock = sharedLockFactory.create("resource", {
        limit: 2,
    });
    await sharedLockFunc(sharedLock);
}

async function sharedLockListenableFunc(
    sharedLockListenable: ISharedLockListenable,
): Promise<void> {
    // You cannot access the sharedLockFactory methods
    // You will get a typescript error if you try
    await sharedLockListenable.addListener(
        SHARED_LOCK_EVENTS.WRITER_ACQUIRED,
        (event) => {
            console.log("WRITER ACQUIRED:", event);
        },
    );
    await sharedLockListenable.addListener(
        SHARED_LOCK_EVENTS.WRITER_RELEASED,
        (event) => {
            console.log("WRITER RELEASED:", event);
        },
    );
    await sharedLockListenable.addListener(
        SHARED_LOCK_EVENTS.READER_ACQUIRED,
        (event) => {
            console.log("READER ACQUIRED:", event);
        },
    );
    await sharedLockListenable.addListener(
        SHARED_LOCK_EVENTS.READER_RELEASED,
        (event) => {
            console.log("READER RELEASED:", event);
        },
    );
}

const sharedLockFactory = new SharedLockFactory({
    adapter: new MemorySharedLockAdapter(),
    eventBus: new MemoryEventBusAdapter(),
});
await sharedLockListenableFunc(sharedLockFactory.events);
await sharedLockFactoryFunc(sharedLockFactory);
```

## Further information[​](#further-information "Direct link to Further information")

For further information refer to the [`@daiso-tech/core/shared-lock`](https://daiso-tech.github.io/daiso-core/modules/SharedLock.html) API docs.

---

# TimeSpan

The `@daiso-tech/core/time-span` component provides an easy way to define, manipulate, and compare durations. Furthermore, it is designed for easy integration with external time libraries like Luxon and Dayjs.

## TimeSpan class[​](#timespan-class "Direct link to TimeSpan class")

The `TimeSpan` class is used for representing a time interval.

info Note `TimeSpan` cannot be negative.
### Creating a TimeSpan[​](#creating-a-timespan "Direct link to Creating a TimeSpan") Creating `TimeSpan` from milliseconds: ``` import { TimeSpan } from "@daiso-tech/core/time-span"; const timeSpan = TimeSpan.fromMilliseconds(100); ``` Creating `TimeSpan` from seconds: ``` const timeSpan = TimeSpan.fromSeconds(30); ``` Creating `TimeSpan` from minutes: ``` const timeSpan = TimeSpan.fromMinutes(15); ``` Creating `TimeSpan` from hours: ``` const timeSpan = TimeSpan.fromHours(1); ``` Creating `TimeSpan` from days: ``` const timeSpan = TimeSpan.fromDays(1); ``` Creating `TimeSpan` from date range: ``` const timeSpan = TimeSpan.fromDateRange({ start: new Date("2000-01-01"), end: new Date("2010-01-01"), }); ``` Creating `TimeSpan` from `string`: ``` const timeSpan = TimeSpan.fromStr("5s"); ``` info Under the hood, this method leverages [@lukeed/ms](https://www.npmjs.com/package/@lukeed/ms) package to convert various time formats into milliseconds. Refer to its documentation for a complete list of supported time formats and units. 
### Adding time to TimeSpan[​](#adding-time-to-timespan "Direct link to Adding time to TimeSpan")

You can add milliseconds to a `TimeSpan`:

```
timeSpan.addMilliseconds(200);
```

You can add seconds to a `TimeSpan`:

```
timeSpan.addSeconds(30);
```

You can add minutes to a `TimeSpan`:

```
timeSpan.addMinutes(20);
```

You can add hours to a `TimeSpan`:

```
timeSpan.addHours(2);
```

You can add days to a `TimeSpan`:

```
timeSpan.addDays(14);
```

You can add two `TimeSpan`s together:

```
timeSpan.addTimeSpan(TimeSpan.fromDays(14).addHours(20));
```

### Subtracting time from TimeSpan[​](#subtracting-time-from-timespan "Direct link to Subtracting time from TimeSpan")

You can subtract milliseconds from a `TimeSpan`:

```
timeSpan.subtractMilliseconds(200);
```

You can subtract seconds from a `TimeSpan`:

```
timeSpan.subtractSeconds(30);
```

You can subtract minutes from a `TimeSpan`:

```
timeSpan.subtractMinutes(20);
```

You can subtract hours from a `TimeSpan`:

```
timeSpan.subtractHours(2);
```

You can subtract days from a `TimeSpan`:

```
timeSpan.subtractDays(14);
```

You can subtract one `TimeSpan` from another:

```
timeSpan.subtractTimeSpan(TimeSpan.fromDays(14).addHours(20));
```

### Multiplying and dividing a TimeSpan[​](#multiplying-and-dividing-a-timespan "Direct link to Multiplying and dividing a TimeSpan")

Dividing a timespan:

```
// Will now be 100 milliseconds
TimeSpan.fromMilliseconds(200).divide(2);
```

Multiplying a timespan:

```
// Will now be 400 milliseconds
TimeSpan.fromMilliseconds(200).multiply(2);
```

### Comparing TimeSpans[​](#comparing-timespan "Direct link to comparing-timespan")

Equals:

```
// Returns false
TimeSpan.fromSeconds(1).equal(TimeSpan.fromSeconds(2));
```

Greater than:

```
// Returns false
TimeSpan.fromSeconds(1).gt(TimeSpan.fromSeconds(2));
```

Greater than or equals:

```
// Returns false
TimeSpan.fromSeconds(1).gte(TimeSpan.fromSeconds(2));
```

Less than:

```
// Returns true
TimeSpan.fromSeconds(1).lt(TimeSpan.fromSeconds(2));
```

Less
than or equals:

```
// Returns true
TimeSpan.fromSeconds(1).lte(TimeSpan.fromSeconds(2));
```

### Converting a TimeSpan[​](#converting-a-timespan "Direct link to Converting a TimeSpan")

You can get the amount of milliseconds contained in the `TimeSpan`:

```
TimeSpan.fromSeconds(1).toMilliseconds();
```

You can get the amount of seconds contained in the `TimeSpan`:

```
TimeSpan.fromMinutes(1).toSeconds();
```

You can get the amount of minutes contained in the `TimeSpan`:

```
TimeSpan.fromHours(1).toMinutes();
```

You can get the amount of hours contained in the `TimeSpan`:

```
TimeSpan.fromDays(1).toHours();
```

You can get the amount of days contained in the `TimeSpan`:

```
TimeSpan.fromHours(48).toDays();
```

You can get the end date relative to a start date:

```
// Will return the date "2002-01-01"
TimeSpan.fromDays(365).toEndDate(new Date("2001-01-01"));
```

You can get the start date relative to an end date:

```
// Will return the date "2000-01-01"
TimeSpan.fromDays(365).toStartDate(new Date("2001-01-01"));
```

### Serialization and deserialization of TimeSpan[​](#serialization-and-deserialization-of-timespan "Direct link to Serialization and deserialization of TimeSpan")

The `TimeSpan` class supports serialization and deserialization, allowing you to easily convert instances to and from serialized formats.
However, registration is required first:

```
import { Serde } from "@daiso-tech/core/serde";
import { SuperJsonSerdeAdapter } from "@daiso-tech/core/serde/super-json-serde-adapter";
import { TimeSpan } from "@daiso-tech/core/time-span";

const serde = new Serde(new SuperJsonSerdeAdapter());
serde.registerClass(TimeSpan);

const timeSpan = TimeSpan.fromSeconds(12);
const serializedTimeSpan = serde.serialize(timeSpan);
const deserializedTimeSpan = serde.deserialize(serializedTimeSpan);

// Logs false because deserialization creates a new instance
console.log(timeSpan === deserializedTimeSpan);
```

## ITimeSpan contract[​](#itimespan-contract "Direct link to ITimeSpan contract")

The `ITimeSpan` contract provides a standardized way to express a duration as milliseconds. Key components, including `Cache` and `Lock`, rely on this contract, ensuring they are not tightly coupled to a specific duration implementation. This decoupling is crucial for interoperability, allowing seamless integration with external time libraries like `Luxon` or `Dayjs`. To integrate a new library, its duration objects must simply implement the `ITimeSpan` contract.

info Note the `TimeSpan` class implements the `ITimeSpan` contract.

The `ITimeSpan` contract requires you to implement the `TO_MILLISECONDS` method on the duration object, which must return the duration in milliseconds.

```
import {
    ITimeSpan,
    TO_MILLISECONDS,
} from "@daiso-tech/core/time-span/contracts";

export class Duration implements ITimeSpan {
    constructor(private readonly timeInMs: number) {}

    [TO_MILLISECONDS](): number {
        return this.timeInMs;
    }
}
```

## Further information[​](#further-information "Direct link to Further information")

For further information refer to the [`@daiso-tech/core/time-span`](https://daiso-tech.github.io/daiso-core/modules/TimeSpan.html) API docs.
---

# Installation

## Prerequisites[​](#prerequisites "Direct link to Prerequisites")

* Node.js installed (version 20.0.0 or later)
* npm/yarn/pnpm package manager

## Install the Package[​](#install-the-package "Direct link to Install the Package")

Run the following command to install the library:

```
npm install @daiso-tech/core
```

## Configuration[​](#configuration "Direct link to Configuration")

#### Set Module Type:[​](#set-module-type "Direct link to Set Module Type:")

`@daiso-tech/core` exclusively uses ESM (ECMAScript Modules). To configure your project:

1. Open your `package.json`
2. Add or update the `type` field:

```
{
    "type": "module"
    // ... your existing configurations
}
```

info This is only required when running in Node.js. Frameworks like `Next.js`, `SvelteKit` and `Nuxt.js` use bundlers that support ESM modules automatically.

---

# ErrorPolicy type

The `ErrorPolicy` type determines which errors should be handled, for example in resilience middlewares like [`retry`](/docs/components/resilience.md) or [`fallback`](/docs/components/resilience.md).
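Conceptually, an error policy is just a rule that decides whether a thrown error should be handled by the middleware or rethrown as-is. The following self-contained sketch illustrates the idea with a predicate-style policy; `retryWithPolicy` is a hypothetical, simplified helper written for this example, not the library's actual `retry` implementation:

```typescript
// A simplified model of an error policy: a predicate over errors.
type ErrorPolicy = (error: unknown) => boolean;

// A minimal retry helper that only retries errors matched by the policy.
async function retryWithPolicy<T>(
    fn: () => Promise<T>,
    maxAttempts: number,
    errorPolicy: ErrorPolicy,
): Promise<T> {
    let lastError: unknown;
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await fn();
        } catch (error) {
            // Errors rejected by the policy are rethrown immediately.
            if (!errorPolicy(error)) {
                throw error;
            }
            lastError = error;
        }
    }
    throw lastError;
}

class TransientError extends Error {}

let calls = 0;
const result = await retryWithPolicy(
    async () => {
        calls++;
        if (calls < 3) {
            throw new TransientError("try again");
        }
        return "ok";
    },
    5,
    (error) => error instanceof TransientError,
);
console.log(result, calls); // logs: ok 3
```

The sections below show the policy forms the library actually accepts: predicates, error classes, standard schemas, and the `treatFalseAsError` flag.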
## Predicate as ErrorPolicy[​](#predicate-as-errorpolicy "Direct link to Predicate as ErrorPolicy") A predicate function can be used to dynamically determine whether an error should be handled: ``` import { fallback } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; class CustomError extends Error { constructor( readonly errorCode: string, message: string, cause?: unknown, ) { super(message, { cause }); this.name = CustomError.name; } } const use = useFactory(); const func = use((): string => { return "asd"; }, [ fallback({ fallbackValue: "DEFAULT_VALUE", errorPolicy: (error) => error instanceof CustomError, }), ]); await func(); ``` ## Classes as ErrorPolicy[​](#classes-as-errorpolicy "Direct link to Classes as ErrorPolicy") You can pass a class directly to match errors that are instances of that class: ``` import { fallback } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; // CustomError as defined in the previous example const use = useFactory(); const func = use((): string => { return "asd"; }, [ fallback({ fallbackValue: "DEFAULT_VALUE", errorPolicy: CustomError, }), ]); await func(); ``` You can also pass multiple error classes: ``` import { fallback } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; // Assuming CustomErrorA and CustomErrorB are custom error classes const use = useFactory(); const func = use((): string => { return "asd"; }, [ fallback({ fallbackValue: "DEFAULT_VALUE", errorPolicy: [CustomErrorA, CustomErrorB], }), ]); await func(); ``` ## Standard Schema as ErrorPolicy[​](#standard-schema-as-errorpolicy "Direct link to Standard Schema as ErrorPolicy") You can use any [standard schema](https://standardschema.dev/) as an error policy: ``` import { z } from "zod"; import { fallback } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; const use = useFactory(); const func = use((): string => { return "asd"; }, [ fallback({ fallbackValue: "DEFAULT_VALUE", errorPolicy: z.object({ code:
z.literal("e20"), message: z.string(), }), }), ]); await func(); ``` ## False return values as error[​](#false-return-values-as-error "Direct link to False return values as error") You can treat `false` return values as errors. This is useful when you want to retry functions that return a boolean: ``` import { retry } from "@daiso-tech/core/resilience"; import { useFactory } from "@daiso-tech/core/middleware"; const use = useFactory(); const func = use(async (): Promise<boolean> => { // Logs once per attempt console.log("EXECUTING"); return false; }, [ retry({ maxAttempts: 4, errorPolicy: { treatFalseAsError: true, }, }), ]); await func(); ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to the [`@daiso-tech/core/utilities`](https://daiso-tech.github.io/daiso-core/types/Utilities.ErrorPolicy.html) API docs. --- # Invokable An [`Invokable`](https://daiso-tech.github.io/daiso-core/types/Utilities.Invokable.html) represents a callable entity, which can be either: 1. A function (`InvokableFn`) 2. An object with a specific invocation signature (`IInvokableObject`) ## Types[​](#types "Direct link to Types") * `Invokable` - Union type of `InvokableFn` and `IInvokableObject` * `InvokableFn` - Function signature * `IInvokableObject` - Object with an `invoke` method ## Function Invokable (`InvokableFn`)[​](#function-invokable-invokablefn "Direct link to function-invokable-invokablefn") Represents a standard function with typed parameters and return value. ``` import type { InvokableFn } from "@daiso-tech/core/utilities"; // Using InvokableFn type AddFunction = InvokableFn<[arg1: number, arg2: number], number>; // Equivalent to: type TraditionalFunction = (arg1: number, arg2: number) => number; ``` ## Object Invokable (`IInvokableObject`)[​](#object-invokable-iinvokableobject "Direct link to object-invokable-iinvokableobject") An object that implements a callable contract through an `invoke` method.
This pattern is especially useful for dependency injection (DI) integration, as most DI frameworks are designed around class-based resolution. ``` import type { IInvokableObject } from "@daiso-tech/core/utilities"; class InvokableObject implements IInvokableObject<[arg1: number, arg2: number], number> { invoke(arg1: number, arg2: number): number { throw new Error("Method not implemented."); } } const invokableObject: IInvokableObject<[arg1: number, arg2: number], number> = { invoke(arg1: number, arg2: number): number { throw new Error("Method not implemented."); }, }; ``` ## Further information[​](#further-information "Direct link to Further information") For further information refer to the [`@daiso-tech/core/utilities`](https://daiso-tech.github.io/daiso-core/types/Utilities.Invokable.html) API docs. ---
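To show why the union is convenient for consumers, here is a standalone sketch of how code accepting an `Invokable` might normalize either form to a plain function. The type definitions are local stand-ins matching the shapes shown above, and `resolveInvokable` is a hypothetical helper, not part of the library's API:

```
// Local stand-ins for the types shown above; in real code, import them
// from "@daiso-tech/core/utilities".
type InvokableFn<TArgs extends unknown[], TReturn> = (...args: TArgs) => TReturn;

interface IInvokableObject<TArgs extends unknown[], TReturn> {
    invoke(...args: TArgs): TReturn;
}

type Invokable<TArgs extends unknown[], TReturn> =
    | InvokableFn<TArgs, TReturn>
    | IInvokableObject<TArgs, TReturn>;

// Hypothetical helper: accept either form and normalize it to a function.
function resolveInvokable<TArgs extends unknown[], TReturn>(
    invokable: Invokable<TArgs, TReturn>,
): InvokableFn<TArgs, TReturn> {
    if (typeof invokable === "function") {
        return invokable;
    }
    return (...args) => invokable.invoke(...args);
}

// Both forms behave identically once resolved.
const addFn: Invokable<[number, number], number> = (a, b) => a + b;
const addObj: Invokable<[number, number], number> = {
    invoke: (a, b) => a + b,
};

console.log(resolveInvokable(addFn)(2, 3)); // 5
console.log(resolveInvokable(addObj)(2, 3)); // 5
```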