Building Myntra’s Video Platform: Part 4

There were other elements, such as comments, that we are not covering in this article in order to keep it simple.

Host App

We created an app for hosts / influencers to run their livestreams. Hosts can run and manage livestreams, request slots to host, practise in mock livestreams, and more.

One challenge with livestreaming is the host dropping off because of a bad internet connection or device issues. We handle this gracefully: we show a message to the live audience and keep the livestream running. The stream is decoupled from the host's internet connection, so issues on that side don't break our livestreams.

We also monitor the host's device and proactively alert them when, for example, the battery is low or the device is overheating, so that they are aware of the issue and can respond accordingly.

Ingestion

Imagine ingestion as the entry point for your livestream. It’s the software or hardware that captures your raw video and audio feed, often from a camera, microphone, or gameplay capture software. This feed is then sent to the next stage in the pipeline.

The primary requirement of this component is to let a host join a video call and start the live stream.

  • Exposes hooks to start or stop recording of a video call (a rough sketch of these hooks follows this list).
  • Breaks the continuous video stream into small segments.
  • Pushes these segments to permanent storage.
  • Informs the Transcoder, as and when segments are generated, so that it can transcode them and generate the live stream.
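As a rough illustration, the hooks and the per-segment notification could look something like the sketch below. The interface name, method signatures, and the `SegmentReady` fields are assumptions made for this article, not our actual internal API.

```kotlin
import java.time.Instant

// Hypothetical payload the recorder emits whenever it finalises a segment.
data class SegmentReady(
    val streamId: String,      // which livestream the segment belongs to
    val sequenceNumber: Long,  // position of the segment within the stream
    val storagePath: String,   // where the raw segment was persisted
    val durationMs: Long,
    val createdAt: Instant = Instant.now()
)

// Hypothetical hooks the ingestion component exposes to start/stop recording
// and to notify downstream components (e.g. via Kafka) as segments are produced.
interface RecorderHooks {
    fun startRecording(streamId: String)
    fun stopRecording(streamId: String)
    fun onSegmentReady(handler: (SegmentReady) -> Unit)
}
```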

We use WebRTC for ingestion because it offers low latency and lets multiple hosts join the same video call.

Since Myntra sells lipsticks and other colour-sensitive products, displaying the right colour is extremely important. We optimised our colour filters to make sure that the right (or near-perfect) colour is captured and is what customers see at the tail end of the pipeline.

We have built in resilience for issues such as the SFU or a recorder going down. In such scenarios we fail over to another free recorder instance, and this is completely invisible to users.

Kafka

Kafka connects the recorder pool and the stream-processing-based Transcoder. Whenever a segment is ready, a message describing it is published to Kafka. Multiple livestreams and their corresponding recorders may publish such messages concurrently.
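To make the flow concrete, here is a minimal, hedged sketch of how a recorder might publish such a message with the standard Kafka producer client. The topic name `livestream.segments`, the broker address, and the message fields are illustrative assumptions.

```kotlin
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord
import java.util.Properties

fun buildProducer(): KafkaProducer<String, String> {
    val props = Properties().apply {
        put("bootstrap.servers", "kafka:9092")  // placeholder broker address
        put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    }
    return KafkaProducer(props)
}

fun publishSegment(producer: KafkaProducer<String, String>, streamId: String, seq: Long, path: String) {
    // Keying by streamId keeps all segments of one livestream on the same partition,
    // so the Transcoder consumes them in order.
    val payload = """{"streamId":"$streamId","sequenceNumber":$seq,"storagePath":"$path"}"""
    producer.send(ProducerRecord("livestream.segments", streamId, payload))
}
```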

Transcoder

The raw video feed from ingestion might not be compatible with every viewer’s device or internet connection. This is where the transcoder comes in. It acts like a translator, converting the raw feed into multiple formats and bitrates. Imagine it creating different sized versions of the same video, ensuring a smooth playback experience for viewers on various devices and internet speeds.

We modeled the Transcoder on a Stream Processing Engine.

There are a few steps in the Transcoder:

Rendition

  • Looks up the livestream and its bitrate ladder, and splits the work into encoding jobs, one per rendition (sketched below).
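A minimal sketch of this fan-out, assuming a hypothetical bitrate ladder and job structure (the ladder values and job fields are illustrative, not from the source):

```kotlin
// Hypothetical rung of the bitrate ladder.
data class Rendition(val name: String, val width: Int, val height: Int, val videoBitrateKbps: Int)

// Hypothetical per-rendition job handed to the Encoding step.
data class EncodingJob(val streamId: String, val sequenceNumber: Long, val sourcePath: String, val rendition: Rendition)

// Illustrative ladder; the real one would come from the livestream's configuration.
val ladder = listOf(
    Rendition("240p", 426, 240, 400),
    Rendition("480p", 854, 480, 1200),
    Rendition("720p", 1280, 720, 2800)
)

// One incoming segment becomes one encoding job per rendition.
fun splitIntoJobs(streamId: String, seq: Long, sourcePath: String): List<EncodingJob> =
    ladder.map { EncodingJob(streamId, seq, sourcePath, it) }
```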

Encoding

  • Downloads and encodes the video segment based on the job details in the message (an illustrative invocation follows this list).
  • Uploads the transcoded segment to the storage container.
  • Our VOD segments are ready right after the livestream ends.
  • Sends a message to trigger playlist generation.
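For illustration only, a single encoding job could shell out to ffmpeg roughly as below; the encoder settings and the helper's signature are assumptions, not our production configuration.

```kotlin
import java.io.File

// Transcode one raw segment into one rendition. Assumes ffmpeg is available on the PATH.
fun encodeSegment(input: File, output: File, width: Int, height: Int, videoBitrateKbps: Int) {
    val cmd = listOf(
        "ffmpeg", "-y",
        "-i", input.absolutePath,
        "-vf", "scale=$width:$height",
        "-c:v", "libx264", "-b:v", "${videoBitrateKbps}k",
        "-c:a", "aac", "-b:a", "128k",
        output.absolutePath
    )
    val exitCode = ProcessBuilder(cmd).inheritIO().start().waitFor()
    check(exitCode == 0) { "ffmpeg failed for segment ${input.name} at ${height}p" }
    // The transcoded segment would then be uploaded to the storage container,
    // and a message sent on to the playlist generation step.
}
```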

Playlist Generation

  • Receives the message from the Encoding step.
  • Makes sure that segments are added to the media playlists in the proper order only (see the ordering sketch after this list).
  • Out-of-order arrival is handled.
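A minimal sketch of that ordering logic, assuming segments may arrive out of order and are only appended once every earlier segment has been seen (the class and field names are illustrative):

```kotlin
// Buffers out-of-order segments and releases them strictly in sequence order,
// so the HLS media playlist only ever grows with contiguous segments.
class PlaylistAssembler(private var nextExpected: Long = 0) {
    private val pending = sortedMapOf<Long, String>()  // sequenceNumber -> segment URI

    fun onSegmentEncoded(sequenceNumber: Long, uri: String): List<String> {
        pending[sequenceNumber] = uri
        val ready = mutableListOf<String>()
        // Drain every contiguous segment starting from the next expected number.
        while (pending.containsKey(nextExpected)) {
            ready += pending.remove(nextExpected)!!
            nextExpected++
        }
        return ready  // append these to the media playlist, in this order
    }
}
```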

Distribution

  • This step ensures that all content is pushed to the CDN ingestion servers (a rough sketch of the push follows this list).
  • All the processed video segments and playlists are sent to those servers.
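As a rough sketch of this push model, each artefact (segment or playlist) could be uploaded to the CDN ingest endpoint over HTTP; the URL and the absence of authentication here are placeholders.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.nio.file.Path

val http: HttpClient = HttpClient.newHttpClient()

// Push one file (segment or playlist) to the CDN ingest server. The base URL is a placeholder.
fun pushToCdn(file: Path, remotePath: String) {
    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://cdn-ingest.example.com/$remotePath"))
        .PUT(HttpRequest.BodyPublishers.ofFile(file))
        .build()
    val response = http.send(request, HttpResponse.BodyHandlers.discarding())
    check(response.statusCode() in 200..299) { "CDN push failed for $remotePath" }
}
```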

The Transcoder is truly horizontally scalable. We have run 15 concurrent livestreams in the past and can scale further as needed. Resilience and failover have been handled at each step.

CDN

Once the transcoder has prepared multiple versions of your stream, it’s time for delivery. This is where the CDN takes center stage. A CDN is a network of geographically distributed servers that store and deliver your content. When a viewer requests your stream, the CDN routes them to the nearest server, minimizing latency (delay) and ensuring a smooth viewing experience.

The HLS protocol was consciously chosen because most CDN providers support it well. This helps us scale to thousands, and potentially millions, of viewers if needed in the future.
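For context, an HLS master playlist simply lists the available renditions so that players and CDNs can serve the appropriate one; an illustrative example (with made-up values) looks like this:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=426x240
240p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=854x480
480p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/playlist.m3u8
```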

Propagation time from the CDN's ingest servers to its edge servers was an important metric in choosing our CDN partner.

For livestreams, a push-based strategy is followed: we push content to the CDN as and when it is generated.

Blob Storage

While the stream is live, it’s also being recorded for later viewing or archiving. Blob storage acts as a massive digital warehouse that efficiently stores large amounts of unstructured data, like your livestream recordings. It’s a scalable and cost-effective solution for storing the livestream archives.

Myntra App

This is our main eCommerce app, which our users use to watch the livestreams. Regular users open the livestream screens, where:

  • A product rack is shown to users
  • Users can comment on the livestream
  • Users can send hearts

We use ExoPlayer on Android and AVPlayer on iOS.

We have built our own layers on top of these players and optimize for glass-to-glass latency. We balance buffer size against lag so that users have some resilience to network disruptions, while keeping lag low enough not to break the interactivity of the livestream.
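On Android, for instance, this buffer-versus-lag trade-off can be tuned through ExoPlayer's load control. The sketch below uses the current media3 packaging of ExoPlayer, and the buffer durations are illustrative, not our production values.

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.DefaultLoadControl
import androidx.media3.exoplayer.ExoPlayer

// Illustrative buffer tuning: smaller buffers reduce glass-to-glass latency,
// larger buffers ride out network disruptions better.
fun buildLivePlayer(context: Context, hlsUrl: String): ExoPlayer {
    val loadControl = DefaultLoadControl.Builder()
        .setBufferDurationsMs(
            /* minBufferMs = */ 2_000,
            /* maxBufferMs = */ 10_000,
            /* bufferForPlaybackMs = */ 1_000,
            /* bufferForPlaybackAfterRebufferMs = */ 2_000
        )
        .build()
    return ExoPlayer.Builder(context)
        .setLoadControl(loadControl)
        .build()
        .apply {
            setMediaItem(MediaItem.fromUri(hlsUrl))
            prepare()
            playWhenReady = true
        }
}
```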

Admin Consoles

Apart from the Host app and the Myntra customer app, we have internal consoles to create, manage, and approve livestreams.

Some memorable moments

We have successfully run the livestream platform for more than 3 years, with zero to minimal issues.

This platform seeded Flipkart's livestream tech too.

We have hosted celebrities like:

  • Hrithik Roshan
  • Manish Malhotra
  • Ayushmann Khurrana
  • Dulquer Salmaan
  • Bhuvan Bam

Conclusion

Building the livestream platform three years ago was an absolute joy. It posed multiple unique problems, each meriting an article of its own. We hope you enjoyed reading these articles as much as we enjoyed building the platform!
