How We Built the BFCM 2023 Globe

Every year for Black Friday Cyber Monday (BFCM), we put together a real-time visualization of purchases made through Shopify-powered merchants worldwide. This year we're cooking up something big, and in anticipation we wanted to show you how we built the globe last year.

Video of BFCM 2023 Globe

Besides going for better visuals, a big focus was performance. We've put together an interactive deep dive into how we built and optimized the 3D visuals using technologies such as Three.js and React-three-fiber.

Before we jump in, here's the globe with simulated data that you can play with.

Arcs

The arcs flying around are the most important component. Each one represents an order being placed in real time, traveling from the merchant's location to the buyer's.

Each arc can be defined as a cubic Bézier curve with 4 control points. Each point is a 3-dimensional vector.

  • P0: located at start position
  • P1: located 25% of the way between start and end
  • P2: located 75% of the way between start and end
  • P3: located at end position

Arc height can be adjusted by moving control points P1 and P2 away from the surface.
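
Here's a minimal sketch of how those control points might be computed in Three.js (the helper and variable names are ours, not the original implementation):

```js
import * as THREE from 'three';

// A sketch: build the 4 control points for an arc between two points on the
// globe's surface. `height` pushes P1 and P2 away from the surface.
function arcControlPoints(start, end, height) {
  const p1 = start.clone().lerp(end, 0.25);
  const p2 = start.clone().lerp(end, 0.75);
  p1.setLength(p1.length() + height);
  p2.setLength(p2.length() + height);
  return [start.clone(), p1, p2, end.clone()];
}
```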

To render the arc, we create a mesh that is a strip of triangles moving along the curve. This gives us the ability to control thickness and have better flexibility for stylizing it with shaders.

You'll notice, however, that as the view moves around, the flat strip disappears at certain angles. We needed a way to make it always face the viewer. The solution was surprisingly simple.

The red lines along the curve represent the tangent at those points. Taking the cross product of the tangent with the direction of the camera gives a new vector. This is used in the vertex shader to offset the position so the mesh always faces the viewer.
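
In GLSL that might look like the following sketch, where `worldPos` and `side` (-1.0 or +1.0 for the two edges of the strip) are assumed to be set up elsewhere in the vertex shader:

```glsl
// Derivative of the cubic Bézier gives the tangent at t (= uv.y).
vec3 bezierTangent(float t, vec3 p0, vec3 p1, vec3 p2, vec3 p3) {
  float u = 1.0 - t;
  return 3.0 * u * u * (p1 - p0)
       + 6.0 * u * t * (p2 - p1)
       + 3.0 * t * t * (p3 - p2);
}

// Inside main(): offset the vertex sideways so the strip faces the camera.
// cameraPosition is a built-in Three.js uniform.
vec3 tangent = normalize(bezierTangent(uv.y, uP0, uP1, uP2, uP3));
vec3 toCamera = normalize(cameraPosition - worldPos);
vec3 offsetDir = normalize(cross(tangent, toCamera));
worldPos += offsetDir * side * uThickness * 0.5;
```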

To texture the arcs, the UVs of the mesh are defined with v going from 0 at the start of the curve to 1 at the end, so a vertex in the middle of the arc has v = 0.5.

Here it is represented with color = uv.y

To animate an arc, its current age has to be calculated from the global clock and the arc's start time. As a sketch (uDuration is an assumed uniform for the arc's travel time):
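
```glsl
// uTime is a global clock uniform; startTime is a per-arc value.
// Dividing by the travel time gives age = 1 exactly at arrival.
float age = (uTime - startTime) / uDuration;
```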

The age is normalized to the range 0 to 2.

  • age = 0: the arc is starting
  • age = 1: the arc has reached its destination
  • age = 2: the entire arc trail animation has fully played out

To achieve the dissolve-type effect we wanted, we pass the arc's UV and age into a noise function.

As new orders come in, these arc meshes are created and added to the globe. They each have their own startTime attribute. As the uTime value increases each frame, all arcs animate independently.

This was our initial setup, based on code we had been using for many years. There are many implementations of arc rendering floating around the web, and it's very common for each arc to have its own mesh.

But it's not very performant.

In previous years we were limited by how many arcs we could draw per frame. Rendering over 1000 arcs at once would bring some mobile devices to a crawl. We finally found our solution: instancing.

Optimizing the arcs

Instancing is a technique where multiple copies (instances) of the same base mesh can be drawn at the same time on the GPU. Each instance can have its own unique properties such as position or rotation, but they all share the same geometry and material data. This results in a single draw call. The shared geometry also means that only the vertices of the base mesh need to be uploaded to the GPU.

Without instancing, each mesh requires a separate draw call, as the CPU must send individual data sets to the GPU for each mesh. This increases memory usage and impacts performance.

React-three-fiber and Three.js support instancing out of the box.
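
For example, Three.js's InstancedMesh draws many copies of one geometry with a single call (a minimal sketch):

```js
import * as THREE from 'three';

// A minimal sketch: 1,000 boxes sharing one geometry and one material,
// rendered in a single draw call.
const boxes = new THREE.InstancedMesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshBasicMaterial(),
  1000
);
const matrix = new THREE.Matrix4();
for (let i = 0; i < 1000; i++) {
  matrix.setPosition(Math.random() * 50, 0, Math.random() * 50);
  boxes.setMatrixAt(i, matrix);
}
boxes.instanceMatrix.needsUpdate = true;
scene.add(boxes); // assumes an existing THREE.Scene
```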

But how can a common piece of geometry represent every shape of arc? We landed on deforming the geometry in the vertex shader.

Instead of computing the Bézier curve for each mesh on the CPU, it's all done on the GPU. Even though instances share the same geometry data, they can each have their own unique attributes. These are known as instanced buffer attributes.

For each arc we need the P0, P1, P2, and P3 control points, at 3 floats each for the x, y, and z coordinates, plus a single float for the start time: 13 floats per arc.

The data for each arc is then interleaved into a single large buffer.

Data layout of an interleaved buffer
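
Wired up in Three.js, the layout might look like this (a sketch; `geometry` is an InstancedBufferGeometry and the attribute names are ours):

```js
// 13 floats per arc: P0..P3 at 3 floats each, plus 1 float for startTime.
const FLOATS_PER_ARC = 13;
const data = new Float32Array(MAX_ARCS * FLOATS_PER_ARC);
const buffer = new THREE.InstancedInterleavedBuffer(data, FLOATS_PER_ARC);

geometry.setAttribute('aP0', new THREE.InterleavedBufferAttribute(buffer, 3, 0));
geometry.setAttribute('aP1', new THREE.InterleavedBufferAttribute(buffer, 3, 3));
geometry.setAttribute('aP2', new THREE.InterleavedBufferAttribute(buffer, 3, 6));
geometry.setAttribute('aP3', new THREE.InterleavedBufferAttribute(buffer, 3, 9));
geometry.setAttribute('aStartTime', new THREE.InterleavedBufferAttribute(buffer, 1, 12));
```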

Inside the vertex shader we can then use these attributes to recreate the curve. Since each vertex has a UV attribute between 0 and 1 depending on where it is along the curve, this can be used to calculate its position.
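
A sketch of the vertex shader, using the instanced attributes from above:

```glsl
attribute vec3 aP0;
attribute vec3 aP1;
attribute vec3 aP2;
attribute vec3 aP3;

// Cubic Bézier evaluated per vertex; uv.y says where we are along the curve.
vec3 bezier(float t) {
  float u = 1.0 - t;
  return u * u * u * aP0
       + 3.0 * u * u * t * aP1
       + 3.0 * u * t * t * aP2
       + t * t * t * aP3;
}

void main() {
  vec3 pos = bezier(uv.y);
  // ...billboarding offset and animation applied here...
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
```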

As new orders come in, we don't have to upload new geometry anymore. This is our new flow:

  1. Loop through the data until an arc is found with age > 2, meaning it has finished animating.
  2. Calculate new control points & start time based on the order's origin and destination
  3. Upload the data to the GPU only for that specific arc.
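
Updating a recycled arc then means touching only its 13-float slice (a sketch; `updateRange` is the long-standing Three.js API, replaced by `addUpdateRange` in newer releases):

```js
// Overwrite the finished arc's 13 floats in place...
data.set(newArcData, arcIndex * FLOATS_PER_ARC);
// ...and ask Three.js to upload only that slice to the GPU.
buffer.updateRange = { offset: arcIndex * FLOATS_PER_ARC, count: FLOATS_PER_ARC };
buffer.needsUpdate = true;
```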

With this new method we are able to render tens of thousands of arcs in a single draw call. Our initial benchmarking found we can render up to a million arcs at once on an M1 laptop. This should give us a good deal of elbow room for years to come!

City Dots

The glowing dots on the globe represent cities where orders were placed in the past 24 hours. We debated whether to use meshes aligned to the globe's surface or particles that always face the camera.

The difference is subtle, but we chose instanced particles with gl.POINTS since they lead to a nicer glow on the horizon.

See for yourself with the slider below. The left side shows surface-aligned quads, and the right side shows the billboarded points.
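
As a sketch, the billboarded version boils down to a Points object whose point size attenuates with distance (names and numbers here are illustrative):

```js
// Each city is a single vertex rendered as a camera-facing point (gl.POINTS).
const material = new THREE.ShaderMaterial({
  uniforms: { uSize: { value: 40.0 } },
  vertexShader: /* glsl */ `
    uniform float uSize;
    void main() {
      vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
      gl_PointSize = uSize / -mvPosition.z; // shrink with distance
      gl_Position = projectionMatrix * mvPosition;
    }
  `,
  fragmentShader: /* glsl */ `
    void main() {
      // Fade towards the edge of the point for a soft glow.
      float d = distance(gl_PointCoord, vec2(0.5));
      gl_FragColor = vec4(1.0, 0.9, 0.6, smoothstep(0.5, 0.0, d));
    }
  `,
  transparent: true,
});
const dots = new THREE.Points(cityGeometry, material); // cityGeometry: one vertex per city
```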

Fireworks

We really wanted to find a way to celebrate merchants having their first sale. What better way than to have fireworks go off above their city!

Similar to the arcs, we wanted to see if all fireworks could be done in a single draw call. We generate the base mesh using Three.js's IcosahedronGeometry, connecting triangle strips from the center to each vertex.

To give the mesh the look of a firework burst, we gradually lower vertices along each trail to simulate gravity.
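
In the vertex shader, that can be as simple as a quadratic droop (a sketch, assuming `pos` and `age` are computed earlier in the shader):

```glsl
// uv.y runs from 0 at the burst center to 1 at the trail tip, so the tips
// droop the most, and the droop grows quadratically as the burst ages.
pos.y -= uGravity * uv.y * age * age;
```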

The UVs on each trail are treated the same way as the arcs with v starting at 0 at the start and going to 1 towards the end. This allows us to easily animate the trails bursting outwards.

We also use the noise effect from the arcs to simulate the burst dissipating over time. This is combined with a delayFactor so that the burst trails have a bit of time before starting to disappear.

The whole effect really comes together once bloom is added. This was the first globe where we added postprocessing effects, and it delivered the extra visual oomph we were looking for.

While the bursts themselves were really satisfying to look at, they were missing a launch trail coming up from the ground. We didn't want to introduce a second mesh for this, so instead we have an additional long triangle strip that starts at the origin and goes to the center of the burst.

We added a geometry attribute called isBurstTrail, which is false for any vertex that is part of the launch trail and true for the burst trails. This lets us drive the animation for each part separately.

Linearly animating t from 0 to 1 felt a bit too slow, so we experimented with different easing values. Cubic easing gives it the speed we were looking for.
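
For example, a cubic ease-out starts fast and decelerates:

```glsl
// One common cubic ease-out formulation.
float easeOutCubic(float t) {
  return 1.0 - pow(1.0 - t, 3.0);
}
```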

We also added some shader uniforms for launch trail height, burst size, and rotation offsets. This let us group together fireworks in fun ways.

The best part is this is all still one single draw call. Each separate firework has its own start time attribute similar to the arcs so they can animate in and out independently.

The end result is endlessly fascinating to watch.

Camera Animations

One of the most delightful features of this globe was the way the camera moved. You could search for any city, and no matter where the camera was at that moment, it would move along a beautiful path until it framed the city against the horizon.

The simplest way to go about this is with spherical interpolation:

  1. Convert the 3D start position of the camera into spherical coordinates.
  2. Convert the 3D end position of the camera into spherical coordinates.
  3. Interpolate between the start and end points over time.

The spherical coordinates have radius, phi, and theta attributes, which correspond to zoom, latitude, and longitude respectively.

In order to make sure the camera looks at the desired city as it moves, the lookAt function on the camera can be used.

Ideally we want the camera tilted down so that it frames the city with the horizon. Offsetting the phi angle slightly downwards achieves the desired effect.
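
Putting the steps together might look like this (a sketch: `latToPhi`, `lonToTheta`, and `cityPosition` are hypothetical helpers, the offset sign depends on your convention, and a robust version would wrap theta so it travels the short way around):

```js
const PHI_OFFSET = 0.3; // tilt down so the city sits against the horizon

const start = new THREE.Spherical().setFromVector3(camera.position);
const end = new THREE.Spherical(
  targetZoom,
  latToPhi(city.lat) + PHI_OFFSET,
  lonToTheta(city.lon)
);

// Each frame, with t eased from 0 to 1:
function updateCamera(t) {
  const s = new THREE.Spherical(
    THREE.MathUtils.lerp(start.radius, end.radius, t),
    THREE.MathUtils.lerp(start.phi, end.phi, t),
    THREE.MathUtils.lerp(start.theta, end.theta, t)
  );
  camera.position.setFromSpherical(s);
  camera.lookAt(cityPosition); // keep the destination framed while moving
}
```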

There is a hidden problem here though. If you travel to New York or London everything works great, but watch what happens when you travel to Sydney.

Why does the world flip upside down when we visit Sydney?

  • The lookAt function uses the up property of the camera to establish which way is up.
  • The up property is a reference direction, not the actual upward direction of the camera. Even when the camera rotates, the up property stays as [0, 1, 0] unless we manually change it.
  • When the camera moves to Sydney, the angle between the up property and the actual upward direction of the camera becomes greater than 90 degrees. This causes the camera to flip.

Luckily, there's a simple fix. We set the up property each frame, using the phi and theta values to calculate it.
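
One way to compute it (a sketch): the surface tangent pointing "north" at (phi, theta) is just the spherical position with phi rotated back a quarter turn.

```js
// Recompute the camera's up reference each frame from its spherical coords.
function updateCameraUp(camera, s) {
  const up = new THREE.Spherical(1, s.phi - Math.PI / 2, s.theta);
  camera.up.setFromSpherical(up);
}
```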

With that fix in place, we now have a system that works perfectly for any city on earth. Try traveling to Sydney to see for yourself:

Animated Pins

This globe's main addition was showing insights about individual cities. To promote discovery, small pins pop out of the globe that you can tap to navigate to that city.

To give the pins a satisfying entrance animation, we turned to react-spring.

Here's a comparison of an early pin animation (left) with the final one (right) that shows how a bit of easing can go a long way to giving a polished feel.

Airplanes

Did you notice the tiny airplanes flying around the globe? They are one of many little surprises to discover. They each follow orbits that pass through two random cities.

Here's a quick way to achieve the circular motion:

  1. Parent the airplane to an empty object (this will be the pivot point)
  2. Offset the airplane by a given amount
  3. Rotate the pivot object each frame
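
As a sketch:

```js
// The pivot sits at the globe's center; the airplane is offset from it.
const pivot = new THREE.Object3D();
pivot.add(airplane);
airplane.position.x = ORBIT_RADIUS;
scene.add(pivot);

// Each frame, spin the pivot; the airplane orbits with it.
function tick(dt) {
  pivot.rotateY(ORBIT_SPEED * dt);
}
```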

The airplane is now orbiting around the equator. Aligning its orbit to pass through two cities turned out to be simpler than we thought:
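
The trick is that the orbit's plane is fully determined by the two cities and the globe's center, so we only need to point the pivot's spin axis along that plane's normal (a sketch, with `cityA` and `cityB` as positions on the globe):

```js
// The normal of the plane through the globe's center and both cities...
const axis = new THREE.Vector3().crossVectors(cityA, cityB).normalize();
// ...becomes the pivot's local Y axis, which rotateY() spins around.
pivot.quaternion.setFromUnitVectors(new THREE.Vector3(0, 1, 0), axis);
```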

This method works perfectly fine and is suitable for the small number of airplanes we have. However, we really wanted to see if we could animate all the airplanes on the GPU in a single draw call. For this to work, we had to replicate the above in a shader.

We start by creating the instanced mesh.

In this case we can't use separate pivot objects, but we can achieve the same effect with trigonometry in the shader. The airplane material is a MeshBasicMaterial, and we can use the onBeforeCompile method to modify its vertex shader.
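
A sketch of that injection (uniform and attribute names are ours):

```js
// Per-instance data (added to the geometry elsewhere), e.g.:
// geometry.setAttribute('aSpeed', new THREE.InstancedBufferAttribute(speeds, 1));
const uTime = { value: 0 };

material.onBeforeCompile = (shader) => {
  shader.uniforms.uTime = uTime;
  shader.uniforms.uOrbitRadius = { value: 1.2 };
  shader.vertexShader = shader.vertexShader
    .replace(
      '#include <common>',
      `#include <common>
       uniform float uTime;
       uniform float uOrbitRadius;
       attribute float aSpeed;
       attribute float aPhase;`
    )
    .replace(
      '#include <begin_vertex>',
      `#include <begin_vertex>
       // Offset the airplane from the center, then spin it around Y:
       // the same effect as the pivot object, but computed on the GPU.
       transformed.x += uOrbitRadius;
       float angle = uTime * aSpeed + aPhase;
       float c = cos(angle), s = sin(angle);
       transformed.xz = mat2(c, -s, s, c) * transformed.xz;`
    );
};
```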

This gives the same effect as the airplanes pivoting around the origin. Since each instance has its own transformation matrix, we can set it to a matrix representing the lookAt rotation from before.

The airplanes are now rendered in a single draw call on the GPU. Animating the airplanes is just a matter of updating the currentTime uniform each frame.

While we could have rendered tens of thousands of airplanes without breaking a sweat, we settled on a much more reasonable dozen.

The only airplane on the globe that isn't instanced is a special "Shopify Airplane" that flies across different cities. For this we use 6 orbits and interpolate between them. To make it look natural, we make the airplane bank when it transitions from one orbit to the next. It also sways horizontally and vertically, following sinusoidal waves.

At the height of BFCM, with thousands of orders flying around the globe every second, hopping on the Shopify Airplane was truly epic.

Loopy Arcs

Loopy arcs were the source of a lot of jokes such as "no wonder my order took 6 weeks to ship, apparently it was doing backflips in the stratosphere."

Having an order do a loop in outer space is certainly not the most efficient way of getting it from point A to point B, but they sure add a nice whimsical feel to our globe.

In order to have as much control over the animations as possible, we opted not to use Bézier splines. Instead we wanted to try defining the animations using keyframes and cubic Hermite splines. These are common for 3D animations, and give an artist more fine grained controls over the look and feel.

Let's take a look.

Above is a box that slides back and forth along the X axis. On the right is a Hermite curve made up of two separate splines (the left half and the right half).

Each spline is defined by a start and end point, as well as a slope (or tangent) for each point. You can think of each point/slope pair as a keyframe. The X axis represents time, and the Y axis represents the value.

Here's roughly what the three keyframes above look like in code (a sketch; the field names are illustrative):
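
```js
// Three keyframes: the box starts at x = -1, slides to x = 1, and back.
// Flat slopes (0) make it ease in and out of each end.
const keyframes = [
  { time: 0, value: -1, slope: 0 },
  { time: 2, value:  1, slope: 0 },
  { time: 4, value: -1, slope: 0 },
];
```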

So how do we get the value of the curve for an arbitrary time? For example, 3 seconds.

  1. Find the keyframe that comes before 3 seconds, and the keyframe that comes after.
    For a Hermite curve with only a few keyframes we can do a linear search. For one with many more keyframes we could do a binary search. There are also more sophisticated techniques that make finding the right keyframes a constant lookup operation, but for our case we can just do a simple linear search. For time = 3 seconds, we find keyframe[1] and keyframe[2].

  2. Interpolate between both keyframes
    Now that we have both keyframes, we can interpolate between them. A sketch of the standard cubic Hermite formulation:
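
```js
// Cubic Hermite interpolation between keyframes k0 and k1 at `time`.
function interpolate(k0, k1, time) {
  const dt = k1.time - k0.time;
  const t = (time - k0.time) / dt; // normalize time to 0..1
  const t2 = t * t;
  const t3 = t2 * t;
  // The four Hermite basis functions.
  const h00 =  2 * t3 - 3 * t2 + 1;
  const h10 =      t3 - 2 * t2 + t;
  const h01 = -2 * t3 + 3 * t2;
  const h11 =      t3 -     t2;
  return h00 * k0.value + h10 * dt * k0.slope
       + h01 * k1.value + h11 * dt * k1.slope;
}
```

For time = 3 seconds, this would be called as interpolate(keyframes[1], keyframes[2], 3).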

In order to animate more than one property, we just need to create a separate curve for each property. In the example below we move the box in an arc-like motion. The red curve represents the X position of the box, while the green one represents its Y position.

Here are the curves for moving the box in a circle.

In order to get our loopy arcs, we had to combine the arc-like motion with the circular motion above. The final curves end up looking like this:

Now that we have a beautiful loopy arc, the question becomes how do we orient our coordinate system above so that the arc flies between two cities?

Our friend lookAt comes in handy once again.

This time we use the Matrix4 version of lookAt because it lets us specify the up axis. This allows us to create a rotation matrix that rotates our coordinate system so that the X axis lines up between both cities, and the Y axis comes out of the globe at the desired spot.

If we want to find the position on our animation curve relative to this new coordinate system, we can apply our lookAtMatrix.
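
As a sketch (Three.js's Matrix4.lookAt builds only the rotation, so the arc's anchor position is added separately, and the exact axes depend on how the local curve is authored):

```js
// Rotation that orients our local animation frame between the two cities.
const up = cityA.clone().add(cityB).normalize(); // roughly "out of the globe"
const lookAtMatrix = new THREE.Matrix4().lookAt(cityA, cityB, up);

// Bring a point on the animation curve into world space.
const worldPos = localCurvePoint.clone()
  .applyMatrix4(lookAtMatrix)
  .add(cityA); // anchor the curve at the departure city
```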

Now comes the real challenging part. How do we instance this?

The first thing to note is that the geometry of the loopy arcs is identical to the geometry of regular arcs. The way the triangles are arranged and billboarded, and the way the UVs are used to fade out the arcs with noise are all the same.

What's different about loopy arcs are the attributes we upload to the GPU.

To describe a loopy arc we need to upload all the keyframes of our animation curves and some additional data to construct a transformation equivalent to the lookAtMatrix.

Each keyframe is 4 floats of data. Once we added up all the floats we needed to pass to the GPU, we were at 94. That's 376 bytes of data per arc, which, it turns out, isn't possible to upload as vertex attributes. WebGL displayed this error when we tried:

Stride is over the maximum stride allowed by WebGL

The maximum stride allowed by WebGL is 255 bytes.

We started to reconsider our options. We thought about storing the data in a texture, a technique known as Vertex Animation Textures (VAT). This would get around the limit but would add some complexity. Luckily, we noticed that many of the floats we were trying to upload were equal to zero. You can see this yourself in the graphs above: all the flat slopes are zero, and there are a lot of them. Additionally, many of the nonzero floats are equal to each other, or equal but negated.

After identifying which of the attributes were unique, we ended up only needing to upload 17 floats per arc. WebGL could handle that without any problems as it was way below the 255 byte limit. The rest of the values were hardcoded or computed directly in the shader.

With that we had beautiful instanced loopy arcs!

In retrospect, there are definitely simpler ways we could have implemented these. A combination of Bézier curves and circle equations would have been simpler and more performant. However, we were curious about experimenting with Hermite curves to allow for more bespoke animations. We'll explore this more for a future globe!

Globe Material

In the early stages of the project, we experimented with a wide range of styles for the globe. One idea kept coming back: a hand-painted aesthetic.

We toyed with effects like toon-shading and ink-shading, but eventually we found ourselves drawn to a style that mirrored the look of a risograph print.

Depiction of risograph-style artwork used as inspiration for the globe.

Implementing noise

A key element of risograph prints is the noise that you see all over them. They are covered with little dots of varying sizes. We initially tried to implement that as a screen space effect using the react-postprocessing library, but it never felt quite right.

We needed to add noise to the water, land and atmosphere materials of our globe. The challenge was how to apply noise to the surface of a sphere with no visible seams and without it stretching or getting distorted at the poles.

We tried a few different 2D and 3D noise functions like Perlin noise, Simplex noise and Worley noise (also called Voronoi noise or cellular noise). It wasn't until we discovered psrdnoise, and in particular the flow noise one can create with psrdnoise, that we felt we had found what we were looking for.

Psrdnoise is a variant of Simplex noise that tiles in 2D and 3D, and that supports an animation technique called flow noise in 3D. It's fairly new and was published in 2022.

3D psrdnoise can be applied to a sphere without any modifications and it gives noise with no seams, stretching or distortion. The following example walks through how we apply 3D psrdnoise to a sphere. The number in the bottom right corresponds to the explanation below.

  1. Start with a standard THREE.SphereGeometry. The color gradient represents the vertex positions visualized as colors.
  2. Feed the vertex positions into the noise function. It returns a value of 0 to 1 for every point on the sphere. Use a time value to animate it.
  3. Change the scale by multiplying vertex positions by a scaling factor. As the scaling factor gets bigger, the noise gets smaller. With 3D noise there are no seams, stretching or distortion.
  4. Add another layer of the psrdnoise function using a "fractal sum" technique. This creates flow noise (see the sketch after this list).
  5. Add two more layers of the psrdnoise function to make the flow noise more detailed.
  6. Scale the noise.
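
As a sketch, using the published psrdnoise signature (the scales and weights here are illustrative):

```glsl
// float psrdnoise(vec3 x, vec3 period, float alpha, out vec3 gradient)
// period = vec3(0.0) disables tiling; alpha rotates the gradients over time.
vec3 g1, g2, g3;
vec3 p = vPosition * uScale;

float n = psrdnoise(p, vec3(0.0), uTime, g1);
// Each extra layer is advected along the previous layer's gradient: flow noise.
n += 0.5  * psrdnoise(p * 2.0 + g1 * 0.1, vec3(0.0), uTime * 2.0, g2);
n += 0.25 * psrdnoise(p * 4.0 + g2 * 0.1, vec3(0.0), uTime * 4.0, g3);
```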

3D psrdnoise is wonderful, but it's not without its downsides. It is computationally expensive, especially when calling the psrdnoise function more than once per fragment to create flow noise. Because of that we switched to using the cheaper 2D psrdnoise function. However, now we had to fix issues with seams, stretching and distortion:

  1. Start with an icosphere. Use the UVs of the icosphere as inputs to the 2D psrdnoise function. Notice the seam.
  2. Scale the noise. The seam is a little less noticeable, but it's still there. There is a lot of stretching at the equator and distortion at the poles.
  3. The stretching and distortion is caused by the default UV mapping of the icosphere.
  4. Switch to a custom UV layout consisting of six round planes. There is no more stretching or distortion.
  5. Seams are still present, but they will go away after the next steps.
  6. Create flow noise by calling the 2D psrdnoise function one additional time.
  7. Add two additional psrdnoise calls.
  8. Scale the noise. The seams have now disappeared.

Below is a comparison of 2D psrdnoise (left) and 3D psrdnoise (right). While the 3D noise has some subtle visual improvements, we opted for the more performant 2D version.

Next time you need to use a noise function, give psrdnoise a shot!

Material optimizations

The standard shader in Three.js is a PBR (Physically Based Rendering) shader. It's great for realism, but it's also more computationally expensive, especially on mobile devices.

To simplify the shader, we use the onBeforeCompile method from before to strip it down to only the most necessary parts. This was mostly done through trial and error, removing bits that had no effect on the appearance.

The final shader was around a hundred lines versus the thousands in the standard shader, and we moved most of the calculations from the fragment shader to the vertex shader.

Can you tell the difference? The left globe is the PBR one.

Stars

The stars in the distance ended up using psrdnoise as well. A really nice starry-sky shader takes only a few lines of code.

  1. Start with an icosphere with the same UVs as our globe.
  2. Use the UVs of the sphere as inputs for the 2D psrdnoise function. The noise is not animated because a time value is not given as an input.
  3. Scale and stretch the UVs.
  4. Feed the noise to the pow function. Increasing the exponent filters the noise so that only the brightest parts remain.
  5. Multiply the color by an intensity factor to add more bloom.

Here is a sketch of the corresponding GLSL (the uniform names are illustrative):
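
```glsl
vec2 g; // psrdnoise's gradient output (unused here)

// Scale and stretch the UVs, then sample unanimated noise (alpha = 0).
vec2 uv = vUv * uStarScale * vec2(1.0, uStretch);
float n = 0.5 + 0.5 * psrdnoise(uv, vec2(0.0), 0.0, g);

// Raising to a high power keeps only the brightest specks...
float stars = pow(n, uExponent);
// ...and an intensity above 1.0 pushes them into bloom territory.
gl_FragColor = vec4(vec3(stars) * uIntensity, 1.0);
```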

Wrapping up

Building the 2023 BFCM Globe was a journey in pushing the boundaries of what is possible with instancing and WebGL. We look forward to sharing what we've been cooking up for this year's Globe!
