Details
- Type: Epic
- Status: To Do
- Resolution: Unresolved

Perf - API Requests
Description
As a user I want to preserve bandwidth and get results as fast as possible. Right now we are running far too many requests to render something like a dashboard. Ideally, the backend understands batched requests (either via HTTP/2 multiplexing or a protocol on top of HTTP/REST such as GraphQL). If not, we may want to introduce dedicated "frontend-driven" API endpoints that are closer to what the UI actually needs from the API.
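As a rough sketch of the frontend-driven option: several dashboard reads could be collapsed into one round trip against a single batch endpoint. The endpoint name (`POST /api/batch`), the payload shape, and the resource paths below are assumptions for illustration, not an existing API.

```typescript
// Hypothetical descriptor for one sub-request inside a batch call.
interface BatchRequest {
  id: string;   // correlation id so each response can be matched to its request
  path: string; // REST path of the underlying resource
}

// Combine several reads into a single POST body,
// e.g. for a backend endpoint such as POST /api/batch.
function buildBatchPayload(requests: BatchRequest[]): string {
  return JSON.stringify({ requests });
}

// Example: two dashboard calls become one request.
const payload = buildBatchPayload([
  { id: "widgets", path: "/api/widgets" },
  { id: "alerts", path: "/api/alerts?severity=high" },
]);
```

Whether this is worth building over adopting GraphQL depends mostly on how many distinct views need such tailored endpoints.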
Most notably, all APIs that return arrays of data (directly or indirectly) should support pagination (i.e., accepting a page length and offset, or a page length and cursor). The micro frontends should prepare and run efficient request patterns in combination with a library such as `swr` to avoid fetching the same data twice.
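The two pagination styles could look like the following sketch; the parameter names (`limit`, `offset`, `cursor`) are assumptions the backend would have to standardize.

```typescript
// limit/offset style: the client asks for a slice by absolute position.
type OffsetPage = { limit: number; offset: number };

// limit/cursor style: the server hands back an opaque cursor for the next page.
type CursorPage = { limit: number; cursor?: string };

function offsetParams({ limit, offset }: OffsetPage): string {
  return `limit=${limit}&offset=${offset}`;
}

function cursorParams({ limit, cursor }: CursorPage): string {
  return cursor
    ? `limit=${limit}&cursor=${encodeURIComponent(cursor)}`
    : `limit=${limit}`;
}
```

Cursor-based pagination is usually the safer default for lists that change while the user scrolls, since offsets shift when rows are inserted or deleted.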
Updates to existing data should not require polling or careless re-fetching; instead, changes have to be propagated using either WebSockets (WS) or server-sent events (SSE). The WS/SSE mechanism would notify clients of any data change, regardless of whether that change was triggered on the current client, on another client, or by somebody else.
In summary:
1. See which requests are unnecessary and which can already be combined into a single one.
2. Provide the ability to efficiently batch requests in the frontend to the backend.
3. Provide the ability to paginate (and/or select the desired data) in the backend.
4. Provide an interface (using either WS or SSE) to find out if data is outdated and should be re-fetched.
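On the "not fetching the same data twice" side of point 1, the `swr`-style guarantee boils down to sharing one in-flight promise per cache key. A minimal sketch (key and fetcher are placeholders):

```typescript
// In-flight request de-duplication: concurrent calls for the same key
// share a single promise instead of firing a second request.
const inflight = new Map<string, Promise<unknown>>();

function dedupedFetch(key: string, fetcher: () => Promise<unknown>): Promise<unknown> {
  let pending = inflight.get(key);
  if (!pending) {
    // Start the request once and forget it when it settles.
    pending = fetcher().finally(() => inflight.delete(key));
    inflight.set(key, pending);
  }
  return pending;
}

// Two concurrent calls for the same key reuse the same request.
const first = dedupedFetch("widgets", () => Promise.resolve("data"));
const second = dedupedFetch("widgets", () => Promise.resolve("other"));
```

`swr` adds caching, revalidation, and focus tracking on top, but this de-duplication window is the core of why dashboards with many widgets stop hammering the backend.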