Should I unload old elements on infinite scroll?
I analyzed popular social networks and blogs with infinite scrolling. None of them remove already-loaded elements from the DOM. One and the same user wall on Facebook takes up less than 100 MB of memory. And these are optimized sites, where the links and likes under each post are most likely bound to a common handler via event bubbling. With Angular and similar frameworks, a handler is attached to every element of the list. I had the idea of unloading previously loaded elements from the DOM (the data stays in a cache). For some reason I found only a couple of similar solutions and not a single study.
Tell me something about the topic. The questions of main interest are: 1) how much more memory does the off-screen DOM take than the model data it is built from (for example, the comment text), and how does the browser optimize off-screen elements; 2) when unloading already-loaded elements, is it better to simply delete them or to insert an empty block of the same height in their place to keep the scroll thumb position, and how will this affect performance?
LinkedIn developers once published a description of the infinite scroll optimization approaches that they used:
http://engineering.linkedin.com/linkedin-ipad-5-techniques-smooth-infinite-scrolling-html5
And Facebook is not the best example of frontend optimization.
P.S. Regarding Angular and delegated events: in parts of my project I use event bubbling instead of binding directly to each element. I do this through two kinds of directives: fsDelegate, plus the various event-handler directives like fsTap. fsDelegate attaches the handlers to the list element; fsTap registers itself with fsDelegate (if it is present, of course; they are connected through the fsDelegate controller and the require: '^?fsDelegate' parameter). When an event is caught, the element that triggered it is determined, its scope is taken, and the expression bound to fsTap is evaluated.
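The fsDelegate/fsTap directives are the author's own and their code is not shown; as a rough plain-JavaScript sketch of the same delegation idea (all names here are illustrative, not the actual implementation), the core is one registry on the list plus a walk up the parent chain to find which item fired:

```javascript
// Hypothetical sketch of event delegation: one listener on the list
// container, per-item handlers looked up at dispatch time (this mirrors
// fsTap registering with fsDelegate, but is not the author's code).
function createDelegate() {
  const handlers = new Map(); // element -> callback

  // An item directive would call this to register itself.
  function register(el, callback) {
    handlers.set(el, callback);
  }

  // Called from the single listener on the list with event.target:
  // walk up from the target to the nearest registered element,
  // which is how the "element that triggered it" is determined.
  function dispatch(target) {
    let node = target;
    while (node) {
      const cb = handlers.get(node);
      if (cb) return cb(node);
      node = node.parentNode || null;
    }
    return undefined;
  }

  return { register, dispatch };
}
```

In a real page, `dispatch` would be called from `list.addEventListener('click', e => dispatch(e.target))`, so only one listener exists no matter how many items are in the list.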
The browser definitely optimizes something when elements go beyond the viewport, but you should not count on it much.
a) I implemented horizontal scrolling of the viewport. The animation runs much more smoothly when there are far fewer elements off to the sides and the ones outside the visible area are hidden via display:none. The difference is visible to the eye; you can check it yourself.
b) I implemented infinite vertical scrolling, like a wall in a social network: when you reach the bottom, a new batch of media content arrives. In JS, all handlers were unbound in a timely manner; I profiled everything long and hard trying to speed this up. But after 1000-2000 content elements on the page, the browser on a PC started to lag and ate more and more memory, even though the browser itself definitely does some optimization. Removing elements from the DOM turned out to be much more effective.
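The "remove elements from the DOM" approach from point b) is essentially list windowing (virtualization). A minimal sketch of the calculation, assuming a fixed row height for simplicity (a real feed with media would need measured heights; the function name and parameters are illustrative):

```javascript
// Given the scroll position, decide which items should stay in the DOM.
// Everything outside [first, last] can be removed; "overscan" keeps a few
// extra rows above and below to avoid flicker during fast scrolling.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalItems, overscan = 3) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalItems - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last };
}
```

On each scroll event (usually throttled via requestAnimationFrame), the renderer mounts items in the new range and removes the rest, so the DOM holds a few dozen nodes instead of thousands.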
2) The issue here is not scrolling itself, but the fact that when you remove the elements above, the view will collapse and the remaining elements will shift. Inserting empty elements of the same height in place of the removed ones is a crutch.
You can see examples here: http://masonry.desandro.com/
I'm sure that someone unloads the elements too, so that the GUI is more responsive.
I encourage you to address issues as they arise.
As long as the browser on modern mobile phones does not crash, nothing needs to be done.
1) how much more memory the DOM (outside the screen) takes than the data in the model
It is highly case-specific. This is not so much a memory issue as a load on the layout engine (unless absolute positioning is used).
2) how the browser optimizes off-screen elements;
Depends on the browser. They are not always rendered, but the layout is recalculated in most cases. As for WebKit (Chrome, Safari), the optimization there is excellent: the layout is recalculated incrementally, and hundreds of elements barely strain it.
2) when unloading already loaded elements, it is worth simply deleting them or inserting an empty block with the height of these elements instead of them in order to maintain the position of the scroll slider, how will this affect performance?
It will be more complicated. Removing the children of a block while its height stays fixed will help the browser. And with absolute positioning it works even better. If the heights of the blocks can be calculated, this is the best approach.
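A sketch of that absolute-positioning variant, assuming the item heights are known (the function name is illustrative): precompute each item's top offset as a running sum, give the container the total height so the scrollbar stays correct, and position only the items you actually render.

```javascript
// From known item heights, compute the style.top for each item
// (position: absolute inside the container) and the container's total
// height (its style.height). Unrendered items cost nothing: the fixed
// container height preserves the scroll thumb without spacer blocks.
function computeOffsets(heights) {
  const tops = new Array(heights.length);
  let total = 0;
  for (let i = 0; i < heights.length; i++) {
    tops[i] = total;    // top offset of item i
    total += heights[i];
  }
  return { tops, totalHeight: total };
}
```

With this layout, removing or re-adding an item's node never reflows its neighbors, which is why the answer calls it the best approach when heights can be calculated.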
In general, why guess: there are the Chrome DevTools Timeline and Profiles panels, there is Firebug in Firefox, and so on. They will give an objective answer.
I have been thinking about this for a long time too. At one point I even hacked together a userscript for VK that hides elements scrolled past and shows them again when you scroll back up. But I did not optimize it much, so things were constantly jumping and twitching. It is also worth remembering that the feed updates automatically, and new posts must also be hidden and then shown.
In fact, my goal was to make advertising posts disappear; hiding posts while scrolling was a by-product.