Component Design
Dendrite is built entirely using a micro-service architecture. That is, all of Dendrite’s functions are implemented across a set of components which can either run in the same process (referred to as a “monolith” deployment) or across separate processes or even separate machines (referred to as a “polylith” deployment).
A component may do any of the following:
- Implement specific logic useful to the homeserver
- Store and retrieve data from a SQL database, a filesystem, etc
- Produce Kafka events for eventual, but guaranteed, consumption by other Dendrite components
- Consume Kafka events generated by other Dendrite components or sources
- Provide an internal API, exposing Remote Procedure Call (RPC)-like functionality to other Dendrite components
- Consume the internal APIs of other components, to call upon RPC-like functionality of another component in realtime
- Provide an external API, allowing clients, federated homeservers etc to communicate inbound with the Dendrite deployment
- Consume an external API of another homeserver, service etc over the network or the Internet
Dendrite components are all fully capable of running either as their own process or as part of a shared monolith process.
A component which implements an internal API will have three packages:
- `api` - defines the API shape, exposed functions, and request and response structs
- `internal` - implements the API shape defined in the `api` package with concrete logic
- `inthttp` - implements the API shape defined in the `api` package as a set of both HTTP client and server endpoints for wrapping API function calls across the network
When running a monolith deployment, the `internal` packages are used exclusively and the function implementations are called directly by other components. The request and response structs are passed between components as pointer values. This is referred to as "short-circuiting", as the API surface is not exposed outside of the process.
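The short-circuit path can be sketched as below. This is a minimal, self-contained illustration - the struct and interface names are invented for this example and are not taken from the real Dendrite source tree:

```go
package main

import "fmt"

// The "api" package defines the shape: request/response structs and the
// interface that both implementations must satisfy.
type QueryValueRequest struct{ Key string }
type QueryValueResponse struct{ Value string }

type ExampleInternalAPI interface {
	QueryValue(req *QueryValueRequest, res *QueryValueResponse) error
}

// The "internal" package implements the interface with concrete logic.
type exampleInternalAPI struct{ store map[string]string }

func (a *exampleInternalAPI) QueryValue(req *QueryValueRequest, res *QueryValueResponse) error {
	res.Value = a.store[req.Key]
	return nil
}

func main() {
	// In a monolith, the caller holds the implementation directly, so the
	// request and response structs are passed as in-process pointers -
	// nothing is serialised and no network is involved.
	var api ExampleInternalAPI = &exampleInternalAPI{store: map[string]string{"hello": "world"}}
	req := &QueryValueRequest{Key: "hello"}
	res := &QueryValueResponse{}
	if err := api.QueryValue(req, res); err != nil {
		panic(err)
	}
	fmt.Println(res.Value)
}
```

Because callers only depend on the interface from the `api` package, the same call site works unchanged whether the concrete value behind it is the local implementation or a network client.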
When running a polylith deployment, the `inthttp` package wraps the API calls by taking the request struct, serialising it to JSON and then sending it over the network to the remote component. That component then deserialises the request, calls the matching `internal` API function locally and serialises the response before sending it back to the calling component.
All internal HTTP APIs must be registered onto the internal API mux so that they are exposed on the correct listeners in polylith and/or full HTTP monolith mode.
It is important that both the `internal` and the `inthttp` packages implement the interface defined in the `api` package exactly, so that they can be used interchangeably.
Today there are three main classes of API functions:
- `Query` calls, which typically take a set of inputs and return something in response, without causing side-effects or changing state
- `Perform` calls, which typically take a set of inputs and synchronously perform a task that will change state or have side-effects, optionally returning some return values
- `Input` calls, which typically take a set of inputs and either synchronously or asynchronously perform a task that will change state, typically without returning any return values
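The naming convention above might look like the following interface. Every name here is invented for illustration and does not correspond to a real Dendrite API:

```go
package main

import "fmt"

// Hypothetical request/response structs.
type QueryThingRequest struct{ ID string }
type QueryThingResponse struct{ Thing string }
type PerformStoreRequest struct{ ID, Thing string }
type PerformStoreResponse struct{ Stored bool }
type InputEventsRequest struct{ Events []string }
type InputEventsResponse struct{}

// ExampleAPI shows the three call classes by naming convention.
type ExampleAPI interface {
	// Query: returns data, no side-effects.
	QueryThing(req *QueryThingRequest, res *QueryThingResponse) error
	// Perform: synchronously changes state, optionally returns values.
	PerformStore(req *PerformStoreRequest, res *PerformStoreResponse) error
	// Input: feeds data in, typically without meaningful return values.
	InputEvents(req *InputEventsRequest, res *InputEventsResponse) error
}

// A toy in-memory implementation to make the distinction concrete.
type memoryAPI struct {
	things map[string]string
	events []string
}

func (m *memoryAPI) QueryThing(req *QueryThingRequest, res *QueryThingResponse) error {
	res.Thing = m.things[req.ID] // read-only
	return nil
}

func (m *memoryAPI) PerformStore(req *PerformStoreRequest, res *PerformStoreResponse) error {
	m.things[req.ID] = req.Thing // synchronous state change
	res.Stored = true
	return nil
}

func (m *memoryAPI) InputEvents(req *InputEventsRequest, res *InputEventsResponse) error {
	m.events = append(m.events, req.Events...) // state change, nothing returned
	return nil
}

func main() {
	var api ExampleAPI = &memoryAPI{things: map[string]string{}}
	api.PerformStore(&PerformStoreRequest{ID: "a", Thing: "b"}, &PerformStoreResponse{})
	res := &QueryThingResponse{}
	api.QueryThing(&QueryThingRequest{ID: "a"}, res)
	fmt.Println(res.Thing)
}
```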
A component that wishes to export HTTP endpoints to clients, other federated homeservers etc must do so by registering HTTP endpoints onto one of the public API muxes:
- The public client API mux is for registering endpoints to be accessed by clients (under `/_matrix/client`)
- The public federation API mux is for registering endpoints to be accessed by other federated homeservers (under `/_matrix/federation`)
For this, the component must implement a `routing` package, which will contain all of the code for setting up and handling these requests. A number of helper functions are available in `httputil` which assist in creating authenticated or federation endpoints with the correct CORS and federation header/access token verification - these are always used.