Enhancing the Mock Server: A User Interface Approach

Agile approach including the Mock Server

As a team, we follow certain agile practices throughout a feature's life cycle. We first discover and surface potential features or enhancements through data-driven approaches, which culminate in a proposal in the form of an intake document. Following its sign-off, we narrow the scope and define deliverables, taking an iterative approach that breaks the feature into more manageable milestones. Lastly, once we have fleshed out the technical documentation, initial design mockups, API schemas, and tickets, we begin the actual implementation.

At this point, however, a common scenario takes place: the API endpoints have not yet been developed, forcing frontend developers to postpone fetching from live endpoints and to continue developing the UI against statically mocked API responses. Popular tools such as mirage.js and MSW have arisen to tackle this issue; they facilitate mocking a server, typically by intercepting requests to the desired endpoints and returning predefined responses. This enables frontend developers to work independently of the backend while reducing the time needed to finish the milestone.
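To illustrate the core idea behind such libraries, here is a minimal mirage.js sketch; the endpoint and payload are made up purely for illustration and are not part of our actual setup.

import { createServer } from "miragejs";

// Intercept GET /api/campaigns in the browser and return a predefined response,
// so the UI can be developed before the real endpoint exists.
createServer({
  routes() {
    this.namespace = "api";
    this.get("/campaigns", () => ({
      campaigns: [{ id: 1, name: "Summer Sale", status: "active" }],
    }));
  },
});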

Fig 1. Agile approach including the Mock Server

While this solved the issue of frontend independence, another issue arose during the review phases with our product manager. A typical review cycle starts with developers publishing the current state of the feature to the staging environment, where it is easily accessible to authorized users while remaining publicly hidden. Those internal users can then inspect the feature, though only in the state the mocked values allow it to display. Naturally, requests came along to see how the feature would react if the API returned certain edge-case responses. This required a change in the code base, another pull request, and finally a new deployment to the staging environment. We saw room to reduce these steps further and to make our colleagues less dependent on developers when reviewing such feature behaviors.

Solution Summary

While the foundation of our solution is mirage.js, similar libraries that allow server mocking should also work. In our case, there was little reason to try a different library after having used it and done initial research on its applicability. The bottleneck, however, was that these libraries can only mock each endpoint with a single predefined response, requiring a code change whenever a different mocked response is desired.

To overcome this, a UI was built on top of mirage.js so that users themselves can choose what specific endpoints should return as a response in order to make the application behave in a certain way. An example of this was our Data Freshness feature, which rendered differently depending on how recently KPIs or similar data had been updated. If a product manager would like to check how the feature changes in appearance when the responsible endpoint returns freshly added data, late data, or no data at all, they only need to select the corresponding options in the mock server UI for the changes to take effect.

Fig 2. Mock Server UI in action: mocking the /branding-campaigns-summary endpoint

With this in place, neither a developer nor a new staging deployment is needed for users to inspect specific UI edge cases and scenarios, and the mock server can be shut down on the fly once our backend has finished implementing the live endpoints. The only additional step required is setting up the edge cases a feature could potentially exhibit, in the form of multiple mocked data sets for the mock server to consume.
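For the Data Freshness example above, such data sets could look roughly like this; the names and shapes are illustrative assumptions rather than the actual production schema.

// Hypothetical mocked data sets for one endpoint: the mock server UI lists the
// keys ("fresh", "late", "empty") as selectable options, and the intercepted
// endpoint returns the chosen set after a page reload.
export const DATA_FRESHNESS_MOCKS = {
  fresh: { kpis: [{ name: "impressions", updatedAt: "2024-05-02T08:00:00Z" }] },
  late: { kpis: [{ name: "impressions", updatedAt: "2024-01-01T08:00:00Z" }] },
  empty: { kpis: [] },
};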

Deep Dive

The actual implementation of the mock server follows the suggestions in the official mirage.js docs in that we have to define three parts (a minimal sketch follows the list):

  • the mocked data responses in JSON format
  • a controller to define the endpoints we wish the mock server to intercept
  • the instantiation of the mock server itself
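A possible split into modules could look like the following sketch; the file names, data shape, and makeServer options are assumptions for illustration rather than the exact internals of our package.

// mocks/branding-campaigns-summary.js: the mocked data responses
// (kept inline here; in practice these can live in JSON files)
export const summaryMocks = {
  default: { campaigns: [{ id: 1, name: "Summer Sale", impressions: 1200 }] },
  empty: { campaigns: [] },
};

// mocks/routes.js: a "controller" registering the endpoints to intercept
import { summaryMocks } from "./branding-campaigns-summary";

export function registerRoutes(server, selection = {}) {
  server.get("/branding-campaigns-summary", () => {
    // Return the data set chosen in the mock server UI, falling back to "default".
    const variant = selection["/branding-campaigns-summary"] || "default";
    return summaryMocks[variant];
  });
}

// mocks/server.js: the instantiation of the mock server itself
import { createServer } from "miragejs";
import { registerRoutes } from "./routes";

export function makeServer({ namespace = "api", selection = {} } = {}) {
  return createServer({
    routes() {
      this.namespace = namespace;
      registerRoutes(this, selection);
      this.passthrough(); // let any unmocked request reach the real network
    },
  });
}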

Provider Component: To ensure the mock server intercepts all relevant endpoints, it should be instantiated before key parts of the application are mounted. As noted earlier, the mock server can only return a single response per endpoint. To overcome this limitation, the UI lets users control when the mock server is instantiated so that different mocked responses can be loaded based on their preferences. This is achieved with a wrapper component built on React's Context API, which not only houses the re-instantiation logic but also simplifies setting up the mock server. By wrapping the main component with this Provider, developers can configure the mock server simply by passing the necessary props. This approach also streamlines the UI component (<MockServer />), which can gather all required information from the context without additional props.

const isMockServerEnabled = config.env !== "production";

const App = isMockServerEnabled ? (
  <MockServerProvider
    apiNamespace={config.namespace}
    makeServer={makeServer}
    mockServerOptions={OPTIONS}
  >
    {children}
  </MockServerProvider>
) : (
  children
);

...

// In any nested component
import { MockServer } from "@dna-zdirect-ui/mock-server";

...

<MockServer />
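Under the hood, the Provider could look roughly like the following sketch; the storage key, the makeServer options, and the context shape are assumptions and not the exact internals of @dna-zdirect-ui/mock-server.

import React, { createContext, useContext, useEffect, useState } from "react";

const MockServerContext = createContext(null);

export function MockServerProvider({ apiNamespace, makeServer, mockServerOptions, children }) {
  // Hypothetical key: the user's last selection, persisted in session storage
  // (see the next section) so it survives the page reload.
  const selection = JSON.parse(window.sessionStorage.getItem("mock-server/selection") || "{}");

  // Instantiate the mock server synchronously, before any child renders and fetches.
  // "selection.disabled" is a hypothetical flag set by the UI's enable/disable toggle.
  const [server] = useState(() =>
    selection.disabled ? null : makeServer({ namespace: apiNamespace, selection })
  );

  // Shut the Mirage instance down when the Provider unmounts.
  useEffect(() => () => server && server.shutdown(), [server]);

  return (
    <MockServerContext.Provider value={{ apiNamespace, mockServerOptions, server }}>
      {children}
    </MockServerContext.Provider>
  );
}

// <MockServer /> reads everything it needs from this context instead of props.
export const useMockServerContext = () => useContext(MockServerContext);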

Session Storage: The other issue to overcome is passing different mocked responses to the endpoints. Since we allow the user to change an endpoint's returned response at any point of the app's lifecycle via the UI options, a page refresh is necessary for the mock server to load a different set of mock data. Carrying the chosen option over through application state management was not possible, however, because the entire app re-mounts after a page reload. The browser's session storage is used instead to persist state outside of the app's lifecycle, with the added benefit that its entries are cleaned up once the session has ended. A unique key is also used in case multiple apps use this mock server implementation in the same session.
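A helper along these lines could handle the persistence; the key naming and function names are illustrative assumptions.

// Prefix the key with an app identifier so several apps sharing a browser
// session do not overwrite each other's selections.
const storageKey = (appId) => `mock-server/${appId}`;

export function loadMockSelection(appId) {
  return JSON.parse(window.sessionStorage.getItem(storageKey(appId)) || "{}");
}

export function saveMockSelection(appId, endpoint, responseName) {
  const selection = { ...loadMockSelection(appId), [endpoint]: responseName };
  window.sessionStorage.setItem(storageKey(appId), JSON.stringify(selection));
}

// Applying a new selection persists it and reloads the page so the mock server
// re-instantiates with the chosen responses; session storage is cleared by the
// browser automatically once the session ends.
export function applyMockSelection(appId, endpoint, responseName) {
  saveMockSelection(appId, endpoint, responseName);
  window.location.reload();
}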

Fig 3 + Fig 4. Screenshots of the inspection window: Console and Application tabs

The UI itself is a constellation of components provided by a UI-Kit library, chosen for the simple reasons of quick development and consistent design. Its main requirements are to let the user easily select the desired mocked responses, trigger a page reload, and disable or re-enable the mock server.
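Stripped of the UI-Kit styling, the core of such a control could be as simple as the following sketch, wired to the hypothetical helpers from the previous section.

import React from "react";
import { applyMockSelection, loadMockSelection } from "./mock-selection";

// Illustrative only: one dropdown per mocked endpoint; changing it persists the
// choice and reloads the page so the mock server picks up the new response.
export function EndpointMockSelector({ appId, endpoint, options }) {
  const current = loadMockSelection(appId)[endpoint] || "default";
  return (
    <select
      value={current}
      onChange={(event) => applyMockSelection(appId, endpoint, event.target.value)}
    >
      {options.map((name) => (
        <option key={name} value={name}>
          {name}
        </option>
      ))}
    </select>
  );
}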

Limitations and Alternatives

By building on top of the mock server library mirage.js, we implemented a solution that retains the inherent advantage of enabling parallel development of an app's API and UI while making it more flexible and accessible. The solution:

  • allows visual documentation and a showcase of edge-case scenarios
  • enables the mocking of endpoints on the fly
  • provides ease of use by means of a customized and non-intrusive UI

This solution is by no means an alternative to writing proper unit tests for edge-case scenarios. Unit tests take precedence; the mock server rather acts as an enhancement during an app's development by providing an easier way to showcase such scenarios, e.g. during demos. Similarly, contract testing, in which services such as an API provider and a client are tested to verify that requests are correctly understood and responses correctly generated, also takes precedence. Where mocks shine is in the development phases in which the API services are still being built; there they can act as an interim solution until those services are available.

While this specific implementation targets REST APIs, the approach should also be compatible with a GraphQL architecture such as the one provided by the Apollo framework, which already ships with its own mocking solution. Whichever technology is used, however, the mock definitions live entirely on the frontend side, meaning conventional API validations and error handling remain separate from any backend service. Special attention therefore has to be paid to continuously keeping the mocks in sync with the schema of the backend service they are meant to imitate.

Conclusion

All in all, judging by the positive feedback, especially from our product managers and designers, including this mock server in our apps not only improves the collaboration between them and engineers by making it easier to present features in various development phases, but also eases the setup of a mock server solution for engineers by encapsulating non-business logic and providing intuitive components. After a couple of implementations, a more generalized version of this mock server has been developed and is internally available as a separate NPM module.

Lastly, while this is a niche solution that might not fit many setups, we'd like to stress the importance of giving developers the space, resources, and support within their team to explore and experiment in a variety of ways, so that ideas have enough time to bear fruit.
