fast giant lists
A while back I was asked to fix a page speed issue. This was an online application with several pages containing lists. There wasn't just one implementation either: one was an actual unordered list, another was more table-like but implemented with divs.
Mostly it would run fine with about two hundred to eight hundred lines. But there were edge cases with up to three thousand lines, which would make the browser completely unresponsive.
The funny thing was that the current implementation was already a 'better' version, so prior to that the performance was even worse.
I am no fan of infinite scroll. In most cases I would simply recommend pagination. Not only because of speed, but also because pagination gives you a clear sense of how large the actual set is.
But infinite scroll was already in place. Only if I couldn't fix the problem would I send it back to the UX drawing board.
Frameworks
It must be pointed out that these speed issues are hardly related to the framework, or the absence of one. If you have three thousand rows, you'll run into the same issues regardless.
Under the hood
When you are improving performance it is good to have some idea of what happens inside the browser the moment you load a URL, so you know where to look for problems. I won't repeat the details; there are enough articles on that.
When a browser requests a website it contacts a server, which responds by sending an HTML file, which is just text. The browser parses that text into the DOM and then loads the CSS (mostly). It parses that into the CSSOM, and together with the DOM it cooks up the first paint.
In our case we first expected it to be a server-side issue; it loaded the three thousand lines all at once. This took a bit longer than normal but it was still within reason. After some devtools inspection it became clear that it was the paint that was the problem.
Don't reinvent the wheel
Like any self-respecting developer I first searched the web, because others are sure to have solved this in a much cleverer way than I ever could. And then coded it myself, because other people cannot code. Ha, no, I joke.
I did look online, but not for code: I looked for popular sites with infinite scroll and checked how they did it technically. Pick a random social network and it has infinite scroll: Facebook, YouTube, Twitter, Pinterest, 9GAG, everything.
The trend in most of these is toggling the visibility of an element once it is outside the viewport. They all differ in complexity and size though, which is why a solution that works for one would not work for another.
Facebook hides items depending on viewport position. But it starts slowing down at about three hundred items, even though it still churns out requests every second. I never use Facebook anymore, so I might be wrong on this; it could be a fluke.
YouTube stops loading at six hundred and thirty-four items and doesn't toggle any viewport-related visibility. Yet it still performs OK when scrolling.
I tried reaching the end at Reddit, but I do not think there is one. At eight hundred items it slowed down to a crawl, even though it did toggle item visibility.
Twitter and Pinterest have a list of n visible items (where n is a number depending on your screen size). When you scroll, they change the DOM by simply adding and removing elements; the invisible elements are kept in memory. That makes it difficult to count the maximum number of items without checking the code, but it is roughly seven hundred for Twitter and one thousand for Pinterest (document.body.scrollHeight / window.innerHeight * averageItemsInView).
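As a rough sketch of that estimate (the function and parameter names are mine, not theirs), the formula can be expressed as a small helper:

```javascript
// Rough estimate of how many items a feed holds: the total scroll
// height divided by the viewport height gives the number of
// 'screens', multiplied by the average number of items per screen.
function estimateItemCount(scrollHeight, viewportHeight, averageItemsInView) {
  return Math.round(scrollHeight / viewportHeight * averageItemsInView);
}

// In the browser you would feed it live values, e.g.:
// estimateItemCount(document.body.scrollHeight, window.innerHeight, 7)
```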
Changing the DOM this way works, but it is a bit heavy on memory and calculation. Which is probably why both capped the maximum number.
The big surprise was 9GAG. Like with the other sites I just added this lousy line to the console: setInterval(() => window.scrollTo(0, document.body.scrollHeight), 2000), whose returned ID can be passed to clearInterval to stop it. When I did this it was still quite responsive at about three thousand items.
They also just toggle item visibility, but in chunks, a chunk being the number of items that fit into a page, wrapped by an element. At any given moment only two of these wrappers have to be visible.
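A minimal sketch of that chunk logic (my own simplification, assuming fixed-height chunks, not 9GAG's actual code): given the scroll offset, work out which chunk indices overlap the viewport, and hide the content of every other chunk.

```javascript
// Rows are grouped into fixed-height chunks; for a given scroll
// offset only the chunk in view and its neighbour overlap the
// viewport, so only those need their content visible.
function visibleChunks(scrollTop, viewportHeight, chunkHeight, chunkCount) {
  const first = Math.floor(scrollTop / chunkHeight);
  const last = Math.floor((scrollTop + viewportHeight - 1) / chunkHeight);
  const visible = [];
  for (let i = Math.max(0, first); i <= Math.min(chunkCount - 1, last); i++) {
    visible.push(i);
  }
  return visible;
}
```

On every (debounced) scroll you would toggle the content of the chunks outside this array, while keeping the wrapper elements themselves in place so the page height does not change.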
Test all the things!
With all these different implementations there is not really one best solution; some work well, some not so much. It all depends on a lot of factors: how complex the HTML of an item is, the width and height of the item, how far people scroll on average, how the items are stacked, and whether they are all the same size.
What does seem to work a bit better, however, is toggling visibility as opposed to toggling DOM presence. It is also less complex programmatically: the visibility toggle is on the content, while the element wrapper stays visible, so the wrapper can be used to measure intersection with the viewport. That is harder for items that are not in the DOM at all, especially when they differ in height, because then you cannot simply multiply the number of preceding items by a fixed row height.
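That intersection measurement boils down to a one-dimensional overlap check. A sketch with plain numbers (in the browser the element's position would come from getBoundingClientRect(); modern browsers can also hand this to you via IntersectionObserver):

```javascript
// True when an element's box overlaps the viewport vertically.
// elementTop is the element's offset from the top of the document;
// scrollTop is how far the page has scrolled.
function intersectsViewport(elementTop, elementHeight, scrollTop, viewportHeight) {
  return elementTop < scrollTop + viewportHeight &&
         elementTop + elementHeight > scrollTop;
}
```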
But this is theory, before implementing a final solution to our problem we want to test it with a stripped down example.
Testing DOM removal
This first test is one where DOM elements are removed and added. This works fairly well, but it does require some extra work to calculate the top and bottom padding, because right now the scrollbar does not behave as it should.
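The padding calculation itself is straightforward when all rows share a known height; a sketch (names are mine) that replaces the removed rows above and below with padding on the list container:

```javascript
// Keep the scrollbar honest when rows are removed from the DOM:
// rows removed above the rendered window become top padding, rows
// removed below become bottom padding. Assumes equal-height rows.
function listPadding(firstRendered, lastRendered, totalRows, rowHeight) {
  return {
    top: firstRendered * rowHeight,
    bottom: (totalRows - 1 - lastRendered) * rowHeight,
  };
}

// e.g. rendering rows 10..29 of 100 rows of 40px:
// apply the result as padding-top / padding-bottom on the list.
```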
Testing chunk visibility
This second test turns chunk visibility on and off, so the rows are grouped into chunks of x rows.
This is faster than the previous method for two reasons.
We're not calculating the visibility of individual rows but two chunks of rows. This is where the pinch method comes in which you can read about here.
The other reason is that the scroll handling is debounced. The downside of debouncing is that the handling takes place only after the events stop firing (throttling is rather pointless in this case). This shows as empty space before a chunk turns visible again. We can make it easier on the eyes by setting a repeating background image.
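For completeness, a minimal debounce sketch (a generic implementation, not the one from the tests): the wrapped function only runs once the events have stopped firing for a given number of milliseconds.

```javascript
// Minimal debounce: every call resets the timer, so fn only fires
// after `wait` ms of silence.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}

// Hypothetical usage, where updateChunks recalculates chunk visibility:
// window.addEventListener('scroll', debounce(updateChunks, 100));
```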
Testing complex content
The above examples are fine technically, but in real life we have more complex content. Maybe a heading with an image, some body text, an anchor or even a button.
Speaking of buttons, you know what is a drain on giant lists? Form elements: for some reason form elements (with a parent HTMLFormElement) are so expensive to render that it pays to swap them for a fake element until they receive focus.
Anyway, complex content might also differ in height. This makes it harder to create a good preview, but not impossible with SVG backgrounds.
So...
If you ever need to render a lot of rows: use pagination. If you cannot use pagination: fire the UX team. If all else fails: toggle visibility in chunks.
And to finish with a Dutch saying: better well stolen than badly thought up.