Pickup in 3 minutes: Uber’s implementation of Live Activity on iOS

The 2022 WWDC keynote brought a surprise when Apple™ unveiled the new Live Activities feature, using Uber’s Rider app as a prominent example. The announcement generated excitement for the feature to come and set the stage for an exhilarating journey for our team.

What follows is the story of how we started designing for surfaces outside the app, the engineering problems we had to solve along the way, and ultimately how we measurably improved the experience of riders and drivers.

Given the secretive nature of the announcement, only a few directors on our side were in the loop. For the rest of the engineering and design team, it was a thrilling surprise, immediately followed by the realization that we needed to move quickly.

At a company the size of Uber, work is usually planned on a half-year schedule, and resources and timelines are always tight. So, while considering how to prioritize this feature, we recognized the potential upside of being in the App Store® on day one as a champion of the platform. We also believed the feature would be of great benefit to our users, who would now be able to follow granular trip updates from outside of the app.

We got into a room and began redrawing our plans and reallocating resources, starting with a minimal team that could work on this as their main project. We then recruited additional engineers on a side-project basis, planning their contributions carefully to avoid too much context switching.

We began by de-risking the concept technically as much as possible up front, working closely with the UX and design team. We set specific guardrails to keep the scope manageable, knowing that we needed to create space for flexible thinking. Everyone on the team had enough know-how and authority to change plans based on product iterations, newly discovered engineering limitations, or staffing conflicts. This autonomy was crucial: it minimized the turnaround time for changes in scope and requirements while we navigated the uncharted waters of developing for a new app surface.

From a UX perspective, we quickly recognized the unique challenge posed by Live Activities. Information on this surface is glanced at for only a few seconds at a time, and it extends beyond the boundaries of the main app. Attempting to cram the same information from our app into the Live Activity view would have been a mistake. Since users would only look at it briefly, we categorized the information based on what we believed users’ priorities would be during a trip:

  • P0: Time to pickup/destination. The most important information during the waiting-for-pickup phase: the user wants to know how long before being picked up, so they can get ready and still use their phone without constantly hopping into the app to check.
  • P1: License plate number and picture of the vehicle. At pickup time, this helps the user identify the location of the vehicle in the real world.
  • P1: Name and picture of the driver. This adds safety context and helps the user ensure they are getting into the right vehicle.
  • P2: Overall progress of the trip. This ties the experience together. It was designed with a dynamic display curve, where the last 20% of the progress bar represents the last 2 minutes of the wait, no matter how long the whole waiting time is (see the sketch after this list).
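
As an illustration, the curve in the last bullet can be computed like this (a simplified sketch with assumed names, not our production code):

```swift
import Foundation

// Maps the wait into a 0...1 progress value where the final 2 minutes
// always occupy the last 20% of the bar, regardless of total duration.
func pickupProgress(totalWait: TimeInterval, remaining: TimeInterval) -> Double {
    let tail: TimeInterval = 120 // the last 2 minutes own the final 20%
    guard totalWait > 0 else { return 1 }
    guard totalWait > tail else {
        // Short waits: map the whole wait linearly onto the full bar.
        return max(0, min(1, 1 - remaining / totalWait))
    }
    if remaining > tail {
        // Early phase: elapsed time fills the first 80% of the bar.
        let elapsed = totalWait - remaining
        return 0.8 * elapsed / (totalWait - tail)
    }
    // Final phase: the last 120 seconds fill the remaining 20%.
    return 0.8 + 0.2 * (tail - max(0, remaining)) / tail
}
```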

This prioritization not only informed the feature design, but also guided us whenever we encountered engineering issues. The information priority was always the main anchor point when evaluating and justifying the scope of a change. 

Figure 1: Visual representation of the hierarchy of information on the Live Activity.

Working with the Dynamic Island, the pill-shaped cutout that shows app information at the top of the screen, took this concept to an extreme due to its minuscule amount of space. Unlike a standalone Live Activity, the Dynamic Island shares its limited real estate with other apps and system information. This meant that our design had to be even more concise and efficient. We had to ensure that our information was clear and useful within the tiny space, without overwhelming the user or clashing with other data. For example, during the on-trip phase, it took several iterations to make sure the drop-off time wouldn’t be mistaken for the system clock. Every pixel counted, and we had to be exceptionally creative to maximize the utility of this feature.

Figure 2: All the phases of a trip expressed in the Dynamic Island.

Live Activities are built as a separate target from the main application, with no access to networking capabilities and no state of their own. They are updated through the main app or push notifications. Once invoked, the Live Activity constructs its own SwiftUI® view and exists without the ability to communicate directly back to the main target. This setup presented a major challenge: how to handle images dynamically needed by the Live Activity, such as the driver’s and vehicle’s pictures.

To solve this, we leveraged two technical characteristics of the app: the main target running in the background during the trip request, and App Groups, which allow targets to read and write to a shared directory. Upon a payload update, we process any image URLs found, download the assets, and serialize them to disk. The Live Activity then reads the cached images for those same URLs from disk.
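
A minimal sketch of this handoff, assuming a hypothetical App Group identifier and helper names of our own (the real implementation differs in detail):

```swift
import Foundation

enum SharedImageCache {
    // Hypothetical App Group; both targets must declare this entitlement.
    static let groupID = "group.com.example.rider"

    static var directory: URL? {
        FileManager.default
            .containerURL(forSecurityApplicationGroupIdentifier: groupID)
    }

    // Derive a deterministic filename from the remote URL so that the app
    // (writer) and the Live Activity (reader) agree without coordination.
    static func fileURL(for remoteURL: URL) -> URL? {
        let name = remoteURL.absoluteString
            .addingPercentEncoding(withAllowedCharacters: .alphanumerics) ?? "asset"
        return directory?.appendingPathComponent(name)
    }

    // Main app: called on each payload update after downloading the asset.
    static func store(_ data: Data, for remoteURL: URL) {
        guard let url = fileURL(for: remoteURL) else { return }
        try? data.write(to: url, options: .atomic)
    }

    // Live Activity target: read-only, no networking required.
    static func imageData(for remoteURL: URL) -> Data? {
        fileURL(for: remoteURL).flatMap { try? Data(contentsOf: $0) }
    }
}
```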

Figure 3: Timeline of the Live Activity update and asset-caching strategy.

What is usually the simple task of downloading assets and presenting them on screen becomes, with Live Activities, a more complicated process that adds several possible failure points. The only silver lining, in our situation, is that since assets rarely change during a trip, we have the opportunity to retry at every payload update if necessary.

Later in this article we will discuss possible alternative solutions, but there is one main aspect worth noting up front: the app is not guaranteed to be running in the background during a trip. The Rider app uses an entitlement to run in the background, so we can update the user’s location and guarantee precise pickup information, but the user or the OS can still kill the app, stopping the stream of updates.

In our first design, we relied exclusively on the app running in the background to update the Live Activity, but during beta testing we received reports from users that the Live Activity stopped updating or was not dismissed after trip completion. We consequently opted for a different design that leverages a backend service and push notifications (more in the OOA Service section below).
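
For reference, this is roughly what wiring a Live Activity up for push updates looks like with ActivityKit; the attribute type, trip identifier, and upload helper below are hypothetical stand-ins:

```swift
import ActivityKit

struct RideAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var etaMinutes: Int
        var progress: Double
    }
    var tripID: String
}

// Hypothetical uploader; in practice this would post to the trip backend.
func uploadLiveActivityToken(_ token: String, tripID: String) async {}

func startRideActivity(tripID: String) throws {
    let activity = try Activity.request(
        attributes: RideAttributes(tripID: tripID),
        contentState: .init(etaMinutes: 3, progress: 0.1),
        pushType: .token // ask APNs for a per-activity push token
    )
    Task {
        // The token stream yields whenever APNs issues or rotates the token;
        // the backend needs the latest one to target this specific activity.
        for await tokenData in activity.pushTokenUpdates {
            let token = tokenData.map { String(format: "%02x", $0) }.joined()
            await uploadLiveActivityToken(token, tripID: tripID)
        }
    }
}
```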

A major part of the first solution was estimating the impact of this feature on the backend API. While the feature would add traffic to our API services, we believed users would spend less time inside the application, reducing traffic at the same time (the app needs to poll data frequently while in the foreground). We could not guess exactly how our users’ behavior would change, but we estimated the load transfer from foreground to background use would not be significant. Additionally, as an initial safeguard, a feature flag was added to control the polling rate while in the background. After moving to the aforementioned OOA Service, however, the rate of updates is controlled by the backend, which has more sophisticated load-balancing and throttling logic.

Measuring the impact of new features is a standard practice at Uber. We anticipated that Live Activities would change how users interacted with our app, leading to significant improvements in metrics such as reduced cancellations and waiting times at pickup. The figures below may seem small, but they are in fact huge wins at the scale at which we operate. Based on the results, we believe riders are more aware of when they are getting picked up, resulting in more completed trips (and earnings for drivers).

  • 2.26% reduction in driver cancellations at pickup
  • 2.13% reduction in rider cancellations at pickup
  • 1.06% reduction in pickup defects per request

The metrics above measure user behavior and are tracked independently of the individual features added to the app. When it came to collecting engineering metrics for Live Activities, their unique behavior made the task more challenging. Unlike a normal app, Live Activities don’t offer easy ways to gather metrics without significant gaps. We had to make assumptions based on incomplete data, such as monitoring the sequence of activityStateUpdates to verify that Live Activities were correctly cleared at the end of a trip. However, this sequence only emits if the app is running in the background, which is not guaranteed, meaning that a significant number of events would be missing.
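
A sketch of that monitoring, reusing the hypothetical RideAttributes type from earlier (logEvent stands in for our analytics pipeline):

```swift
import ActivityKit

func logEvent(_ name: String, tripID: String) { /* analytics stand-in */ }

// Watches state transitions for one activity. This only emits while our
// process is alive, which is exactly the gap described above.
func monitorActivityState(_ activity: Activity<RideAttributes>) {
    Task {
        for await state in activity.activityStateUpdates {
            switch state {
            case .ended, .dismissed:
                logEvent("live_activity_cleared", tripID: activity.attributes.tripID)
            case .active:
                logEvent("live_activity_active", tripID: activity.attributes.tripID)
            default:
                break
            }
        }
    }
}
```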

To mitigate the issue, we joined in data about foreground events where a Live Activity is present without a trip in progress. This method does not guarantee 100% coverage of the cases, but it allows us to set an acceptable baseline and create regression metrics and dashboards that, while not a complete picture, are still useful for catching regressions in production or during development.
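
One way to detect those cases on the client, reusing the names from the previous sketches: when the app comes to the foreground, any surviving Live Activity without a matching trip is logged as a missed cleanup (ending it is an extra step shown purely for illustration):

```swift
import ActivityKit

func reconcileOnForeground(hasTripInProgress: Bool) {
    guard !hasTripInProgress else { return }
    for activity in Activity<RideAttributes>.activities {
        // A Live Activity with no trip behind it means an earlier
        // cleanup was missed while we were not running.
        logEvent("live_activity_stale_on_foreground",
                 tripID: activity.attributes.tripID)
        Task { await activity.end(dismissalPolicy: .immediate) }
    }
}
```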

Another standard Uber practice is to gate new features or sizable iterations behind feature flags. With Live Activities, this was tricky due to the lack of network access and of bidirectional communication with the main target. We use the same trick as with image loading: at startup of the main target, we read the feature flags from our standard pipeline and write their values to disk. The Live Activity then reads these values from disk when invoked.
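
A sketch of that passthrough, shown here with a shared UserDefaults suite, one common way to share small values across App Group targets (the suite name and keys are assumptions):

```swift
import Foundation

// Both targets read the same App Group-backed store.
let sharedDefaults = UserDefaults(suiteName: "group.com.example.rider")

// Main app, at startup: snapshot the flags the Live Activity cares about.
func exportFlags(_ flags: [String: Bool]) {
    for (name, value) in flags {
        sharedDefaults?.set(value, forKey: "flag.\(name)")
    }
}

// Live Activity target, when invoked: read-only lookup, no network needed.
func isFlagEnabled(_ name: String) -> Bool {
    sharedDefaults?.bool(forKey: "flag.\(name)") ?? false
}
```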

Figure 4: Diagram describing the passthrough of feature flags to the Live Activity.

While Android doesn’t have a direct counterpart to Live Activities, we could still enhance the push notification strategy by updating existing push notifications, as opposed to sending new ones, and adding the same visual elements used in the Live Activity to achieve feature parity.

To streamline this process and ensure consistency across iOS™ and Android, we developed a simple server-driven language. This language allows us to easily modify the content presented in both the Live Activity and Android push notifications. By centralizing the content logic on the server, we can dynamically update and tailor the user experience without requiring app updates. This system provides flexibility and ensures that any changes or new features can quickly be implemented across both platforms, maintaining a unified experience for all our users.

We needed to design a domain-specific language (DSL) with a very small implementation footprint and few external dependencies. Uber already has a server-driven UI system, but it is quite extensive and, as a result, cannot be imported into an external target without impacting the binary size. Furthermore, keeping the capabilities to a minimum also reduces the maintenance cost in the future.

Our solution was to build a very opinionated, semi-descriptive DSL. Only UI elements that can be present in Live Activities are included (no text fields or segmented controls, for example), and no extensive styling is provided. For views that need styling, we provide an array of tags that is then processed at the client level. For example, the title label could be represented by a component of type Label, along with the text and a tag of type title. The client then applies the font, color, number of lines, alignment, and so on for the specific title style.
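
Decoding and styling on the client could then look roughly like this (the component shape and tag names are illustrative, not our actual schema):

```swift
import SwiftUI

// Illustrative shape of a DSL component; the real schema differs.
struct DSLComponent: Decodable {
    let type: String      // e.g. "Label"
    let text: String?
    let tags: [String]    // e.g. ["title"]
}

// The payload carries no fonts or colors; the client maps known tags
// onto a fixed set of local styles.
@ViewBuilder
func render(_ component: DSLComponent) -> some View {
    if component.tags.contains("title") {
        Text(component.text ?? "")
            .font(.headline)
            .lineLimit(1)
    } else {
        Text(component.text ?? "")
            .font(.body)
    }
}
```

Adding a new style then only requires teaching the client a new tag, while the shape of the server payload stays unchanged.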

Figure 5: Example of a DSL payload that hydrates the Live Activity.

Lastly, the DSL does not represent any framing or anchoring information. A different view can be provided for the header or the progress bar, but the location of each view is set by the client and cannot change (except through instructions applied with the aforementioned tags).

Figure 6: Demonstration of how the templating works on the Live Activity and Android push notification.

Working in tandem with this new DSL system, we developed a backend service specifically designed for surfaces that live outside of the app; we call it the OOA (Out Of App) Service. The OOA Service is responsible for the logic that balances the number of updates delivered to the Apple Push Notification service (APNs). It evaluates whether changes to the application state are important enough to be delivered, and debounces state changes that happen in rapid succession. Because of the need to evaluate and debounce, this service has to cache previous states for all concurrent trips on the platform, which is a substantial scaling effort.
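
The OOA Service itself is backend code; purely to illustrate the evaluate-and-debounce gate (state shape, thresholds, and names are our assumptions), here is a sketch in Swift:

```swift
import Foundation

// Minimal trip state; the real payload is far richer.
struct TripState: Equatable {
    var etaMinutes: Int
    var phase: String
}

// Decides whether a state change is worth an APNs push.
final class UpdateGate {
    private var lastSent: [String: (state: TripState, at: Date)] = [:]
    private let minInterval: TimeInterval = 10 // assumed debounce window

    func shouldSend(tripID: String, newState: TripState, now: Date = Date()) -> Bool {
        guard let previous = lastSent[tripID] else {
            lastSent[tripID] = (newState, now)
            return true // first update for a trip always goes out
        }
        // Skip pushes that carry no visible change or arrive too quickly
        // (the real service coalesces rather than simply drops rapid updates).
        let significant = previous.state != newState
        let outsideDebounce = now.timeIntervalSince(previous.at) >= minInterval
        guard significant && outsideDebounce else { return false }
        lastSent[tripID] = (newState, now)
        return true
    }
}
```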

Building the DSL and the OOA Service was a significant step forward. It not only simplified the development process, but also opened the door to integrating changes without having to duplicate decision-tree logic and deploy code on multiple platforms. By building this generic solution, we avoid solving complex problems multiple times on different teams. This, in particular, is a decision that we believe will pay dividends in the future as more and more vertical teams at Uber adopt the Live Activity flow.
The Eats platform is already using a good part of the mobile framework that deals with the Live Activity life cycle, image caching, and feature flag injection. That team is also evaluating onboarding the DSL and the OOA Service after seeing the positive results of the Rider integration.

Building the Rider iOS Live Activity was an intense but rewarding journey. From the surprise of the WWDC announcement to the challenges of developing and implementing a new technology on a tight timeline, the experience showcased the resilience, adaptability, and creativity of our team. We navigated technical hurdles, redefined our UX approach, and ultimately delivered a feature that we believe improved the rider and driver experience. I hope the pieces of ingenuity we have shared in this article inspire other developers working on Live Activities, help them overcome similar scenarios, and encourage a pragmatic approach to experiences that live outside of the main app.
On a personal level, it was a great experience and opportunity: developing an early-adopter product at a company of our scale is challenging, but insanely fulfilling.

A special thanks to the whole team. This project not only demonstrated everyone’s technical prowess, but also highlighted the collaborative spirit that drives innovation at Uber: Kyle Gabriel, Ken Nguyen, Tiffany Chang, Hyewon Son, Radhika Gemawat, Maxim Bogatov, Yifan Ding, Evan Cooper.

Apple, App Store, Swift, and SwiftUI are trademarks of Apple Inc., registered in the U.S. and other countries and regions. iOS is a trademark or registered trademark of Cisco in the U.S. and other countries and is used by Apple under license.

Android is a trademark of Google LLC, registered in the U.S. and other countries and regions.
