What are the disadvantages of storing images in base64 in html?
I've added articles and comments to my site, but the images are not stored as separate files; they live right in the HTML as "data:image/..." URIs.
On the plus side, you don't need to protect the images with separate access rights: the rights on the article/comments cover them, since you can't open an image by following a direct link.
What are the disadvantages of storing images this way? My hands itch to do it the way it's done everywhere else, but I can't find a reason to.
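For reference, an embedded image of the kind described above can be built with Python's standard library; a minimal sketch (the bytes here are a well-known 1x1 transparent PNG, used purely for illustration):

```python
import base64

# A tiny 1x1 transparent PNG, as raw bytes (illustrative sample image).
png_bytes = base64.b64decode(
    "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJ"
    "AAAADUlEQVR42mNkYPhfDwAChwGA60e6kgAAAABJRU5ErkJggg=="
)

# This string is what ends up inside the HTML instead of a file URL:
data_uri = "data:image/png;base64," + base64.b64encode(png_bytes).decode()
print(data_uri[:40] + "...")
```

The browser decodes the base64 payload inline instead of making a separate HTTP request for the image.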
I'll copy my comment here from an answer to another question:
Suppose you have a collective blog like Habr, where users hold lively discussions under publications and an avatar is displayed next to each comment. Say there is a post under which 10 users left 10 comments each, and let me tell you right away, that's not much: 100 comments in total. The average avatar is 10 kilobytes. With regular images, the user's browser downloads 50 kilobytes of text plus 100 kilobytes of images (10 unique avatars), caches the images, and does not re-download them on refresh or on other pages.
With the scheme you have in mind, it has to download more, and more often. First, base64 encoding inflates the data by about a third, so each avatar now weighs 15 kilobytes. Since the avatars are embedded in the page, all 100 instances are transferred, and none of them are cached. The page weight grows to 1550 kilobytes (50 KB of text + 100 × 15 KB of images). So it takes longer to download, takes longer for the browser to parse, and eats more of the browser's memory. The user experience suffers.
But that's not all the trouble. If your blog takes off and this post averages a thousand views per day (again, not much), the web server has to serve about 1500 megabytes of traffic for this one publication alone. And if you catch a habraeffect and the traffic reaches 100 requests per second, the server pushes a data stream of about a gigabit. And don't forget that before a 1550-kilobyte page travels over the network to devour the memory of the user's machine, it devours the server's memory.
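The "about a third" overhead mentioned above is easy to verify; a minimal sketch, using the 10 KB avatar size from the example:

```python
import base64
import os

# Simulate a 10-kilobyte avatar with random bytes.
avatar = os.urandom(10 * 1024)

# Encode it the way a data: URI would embed it in the HTML.
encoded = base64.b64encode(avatar)

overhead = len(encoded) / len(avatar) - 1
print(f"raw: {len(avatar)} bytes, base64: {len(encoded)} bytes, "
      f"overhead: {overhead:.1%}")
```

Base64 maps every 3 bytes to 4 ASCII characters, which is where the fixed ~33% growth comes from, regardless of the image format.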
It's not quite clear what "you don't need to separately protect the image with access rights" means: anyone who can open the page containing the image can pull it out of the markup anyway.
To the point: such images significantly slow down the initial rendering of the site, because they are parsed together with the HTML, while ordinary images are loaded asynchronously.
Another point: what does your article-list query look like?
select * from articles where ...
or
select name, date_publish, ... -- without the full text
If it's the first, you may not have enough memory to render a list of even 30 articles ;)
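The difference between the two queries can be sketched with sqlite3; the table and column names (articles, name, date_publish, body) follow the query fragments above and are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE articles (id INTEGER PRIMARY KEY, name TEXT, "
    "date_publish TEXT, body TEXT)"
)

# A body with embedded base64 images easily reaches a megabyte per row.
big_body = "x" * 1_000_000
conn.executemany(
    "INSERT INTO articles (name, date_publish, body) VALUES (?, ?, ?)",
    [(f"Article {i}", "2023-01-01", big_body) for i in range(30)],
)

# select * drags every megabyte-sized body into memory...
full = conn.execute("SELECT * FROM articles").fetchall()

# ...while the listing only needs the lightweight columns.
listing = conn.execute("SELECT name, date_publish FROM articles").fetchall()

print(len(listing), "rows, without the heavy bodies")
```

With 30 rows the first query materializes roughly 30 MB of article bodies just to render titles and dates; the second stays in the kilobyte range.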
You can copy such an image just as easily.
It won't hit the browser cache, so the site will feel slower to the user.