Virtualization
Vlad Zaitsev, 2020-05-18 16:25:28

How do you make the most autonomous server?

I need to host a certain amount of information so that it is not lost and remains available from the Internet for an unlimited time. Preferably with no maintenance, or minimal maintenance: it may happen that in ten years there will be no one left to support it, but the information should survive.

You could take an "eternal server"; for example, I found these:
vdsina(dot)ru/eternal-server
eternalhost(dot)net/vps
seopulses(dot)ru/kak-poluchit-besplatniy-vps-vds-server-navsegda/
cloudatcost(dot)com/dedicated-servers

You could even take several of them and configure mirrors via DNS. With the domain it is clear: all you can do is pay for the maximum term, ten years. But what can be done on the hosting side itself?
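For reference, "mirrors via DNS" can be as simple as round-robin A records; a minimal zone fragment as a sketch (the domain and addresses are documentation placeholders):

    ; round-robin: resolvers hand out both addresses in turn
    example.com.   3600  IN  A  203.0.113.10   ; mirror 1
    example.com.   3600  IN  A  203.0.113.20   ; mirror 2

Note that plain round-robin does not notice a dead mirror: both addresses keep being served regardless.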
Clearly the ideal solution is static hosting, but it seems to me that alone will not work here. How can you ensure maximum fault tolerance?
All modern technologies such as Docker are ruled out immediately: in ten years they will have changed so much that they will have nothing in common with the version running on the server.


10 answers
ValdikSS, 2020-05-18
@ValdikSS

1. There are no "eternal servers". It's almost embarrassing to have to explain this. An "eternal server" is a marketing ploy, effectively a scam, which ends as soon as the company changes its terms, reorganizes, or closes. Read it as "well, it will work for about three years, and after that nobody knows."
2. It is not clear what kind of information you have, what exactly you mean by the word "host", or what your criterion of autonomy is. Publish the information to the public? Should it be indexed? Does it need access control? Over what protocol should it be available? Will it be needed only by you in 10 years, or by someone else? Is it licensed content that can be taken down under the DMCA (movies, TV series, music)? Is it personal data (database leaks)? Is the information popular and/or in demand at the moment? Is it likely to be relatively in demand in 10 years? Is the information cataloged? Is it thematic (for example, an archive dedicated to a specific topic, a field of science, etc.)? Do convenience and speed of access matter?
There are many technologies, but they are all different, with different purposes. Answers to the listed questions are necessary in order to discard the unsuitable ones and consider in detail the suitable ones.
3. If the information is public and in demand, and will still be in demand in 10 years, then use DC++ and BitTorrent plus web storage with direct links to the files, adding those links to the .torrent file as webseeds.
BitTorrent has existed since 2001; it is popular, there are clients for all operating systems, and compatibility and reliability are excellent.
DC++ is also still popular. Its main advantages over BitTorrent: you can search for a file by file or directory name, and you can easily update and add information (there is no binding to a fixed "directory" in the form of a .torrent file).
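As a sketch of the webseed idea (BEP 19): with the third-party Python library torf (an assumption; any torrent tool that supports webseeds will do), the HTTP mirror link is embedded straight into the .torrent, so the content stays downloadable even with zero peers. The paths, tracker, and URL below are placeholders:

    # pip install torf   (third-party library, assumed here)
    import torf

    torrent = torf.Torrent(
        path="archive/",                                  # data to publish
        trackers=["udp://tracker.opentrackr.org:1337"],   # placeholder tracker
        webseeds=["https://example.com/files/archive/"],  # BEP 19 HTTP seed
    )
    torrent.generate()                # hash all pieces
    torrent.write("archive.torrent")  # ready to distribute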
If the information is sensitive or requires access control, and neither you nor anyone else will be able to maintain it in any way for 10 years (I don't know your situation, so let's say you are facing a 10-year prison sentence), then it perhaps makes sense to pay for cloud storage from the large companies (Google, Yandex, Microsoft, Apple) 10 years in advance. This gives no guarantees, but I think this option is more reliable than shared hosting (and especially a VPS).
If there is little information and it is not copyrighted, but is cataloged and useful, you can simply place it on long-established free hosting like uCoz, Google Sites, or Neocities, or upload it to Bitbucket, GitHub, or SourceForge (the latter supports storing large files downloadable via a direct link, which, by the way, makes it quite suitable as a webseed for a torrent).
If you are not afraid to try emerging but not yet established technologies, take a closer look at IPFS. It works on a BitTorrent-like principle but lets you access information over HTTP, and it is supported by large players such as Cloudflare, which runs a gateway from the web to IPFS: https://cloudflare-ipfs.com/
I run several static sites with their own domains on IPFS, on my home computer, behind the Cloudflare gateway. Advantages: everything BitTorrent offers, plus access as a website (including on your own domain), indexing by search engines, paid services for long-term file pinning (eternum.io, pinata.cloud), and easy updating of information. Disadvantages: still quite slow and unstable, and static sites only.
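For reference, publishing a directory to IPFS with the command-line client looks roughly like this (the CID placeholder stands for the hash the first command prints):

    ipfs add -r ./site      # adds the directory, prints its content ID (CID)
    ipfs pin add <CID>      # keep the content pinned on this node
    # reachable through a public gateway:
    #   https://cloudflare-ipfs.com/ipfs/<CID>/
    # a custom domain works via a DNSLink TXT record:
    #   _dnslink.example.com.  TXT  "dnslink=/ipfs/<CID>"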
4. Judging by your comment above, you only have 100 GB of media files. That is next to nothing. If the files are public and of value to at least a narrow circle of people, you can host them with me through the Schare project: https://valdikss.org.ru/schare/
My criterion for autonomy is maximum independence from third-party infrastructure, so files are hosted on a home server and distributed in decentralized file sharing networks.

xmoonlight, 2020-05-18
@xmoonlight

I'm not an expert, but if this is meant to stay hosted for a long time, the material itself should take care of that: give readers an understanding of its value and ask them to rehost/"mirror" it so that the content survives.
If the information is valuable to people, it will be able to survive for a long time in this way.
The main thing is to write, inside the material itself, how best to rehost it so that others can find and read it, and why this is important.
In fact, this is a "viral" repost.
What matters right now: the Web Archive, GitHub-like services, and thematic wiki sites. Upload it there first (a Web Archive sketch follows below).
You can also build sites on popular free CMS hostings, so that everything can be downloaded in one file and the page cache remains in the search engines.
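For the Web Archive suggestion, pages can also be pushed in programmatically; a minimal sketch in Python using the requests library and the public Save Page Now endpoint (the page URL is a placeholder):

    # pip install requests   (third-party HTTP library)
    import requests

    page = "https://example.com/page-to-preserve"   # placeholder
    # ask the Wayback Machine to take a snapshot of the page
    resp = requests.get("https://web.archive.org/save/" + page, timeout=60)
    print(resp.status_code, resp.url)   # on success, resp.url is the snapshot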

Sanes, 2020-05-18
@Sanes

Nothing will work on its own. Especially forever.
It is not known what will happen tomorrow, and you are making plans for 10 years.

Dmitry, 2020-05-18
@Tabletko

No way.

"available from the Internet for an unlimited amount of time"
That requires unlimited money.

paran0id, 2020-05-18
@paran0id

Spread the service across several reliable hostings with a good reputation (there are none among those you listed), pay 10 years in advance (or link an account with automatic payment), and make a balancer between them. But even so, something is bound to happen within 10 years. If you have anything beyond static content, add the probability of 0-day holes being found, through which your service will end up in a botnet and be rightly shut down. It seems to me that it is easier to conclude a support agreement for this service. More reliable, and possibly cheaper.
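A minimal sketch of such a balancer in nginx (the hostnames are placeholders; the second host only takes over when the first fails):

    upstream mirrors {
        server host-a.example.com;          # primary hosting
        server host-b.example.com backup;   # fallback mirror
    }
    server {
        listen 80;
        location / {
            proxy_pass http://mirrors;
        }
    }

Of course, the machine running the balancer then becomes a single point of failure of its own.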

Alexey Dmitriev, 2020-05-18
@SignFinder

There are no perfect, secure operating systems, and keeping an OS up to date fully automatically is not feasible at this stage.
After 10 years without support, your server will have turned into one big security hole, and that is no place for your information.

Armenian Radio, 2020-05-18
@gbg

Host a virtual machine image. Inside the virtual machine you can shove anything you like, even Docker.
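A sketch of what that could look like with QEMU (the image name and ports are placeholders; -nic user,hostfwd=... requires a reasonably recent QEMU):

    qemu-system-x86_64 \
        -m 512 \
        -drive file=server.qcow2,format=qcow2 \
        -nic user,hostfwd=tcp::8080-:80    # host port 8080 -> guest port 80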

Ivan Shumov, 2020-05-18
@inoise

There is simply no way to ensure that. No virtual machine runs with 100% availability, which means it will restart, and something in it may get updated along the way. Or, in rarer cases, the data center will run out of resources to run the machine.
Static hosting is much the same story: anything can change, for example the terms of service, or the service may simply shut down.
In general, no one will give you a 100% guarantee.
And remember that availability is a number measured in nines, for example 99.99%. Offhand, each additional nine adds a couple of zeros to the cost of the project. Draw your own conclusions.
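To make the nines concrete, a quick Python calculation of the downtime each level still permits per year:

    # allowed downtime per year at each availability level
    for availability in (0.99, 0.999, 0.9999, 0.99999):
        minutes = (1 - availability) * 365 * 24 * 60
        print(f"{availability:.3%}: {minutes:8.1f} min/year of downtime")

99.99% works out to roughly 53 minutes of downtime per year.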

Andrey Gavrilov, 2020-05-18
@thexaver

Amazon AWS

CityCat4, 2020-05-18
@CityCat4

How can you ensure maximum fault tolerance?

No way.
Even a virtual machine serving static content requires maintenance: disk space will run out, for example, because the logs clutter everything up. I have a couple of examples of systems that have run for more than a decade without updates, yet even those were periodically serviced and moved from server to server.
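The log problem, at least, can be headed off in advance; a minimal logrotate sketch (the path is a placeholder):

    /var/log/myapp/*.log {
        weekly
        rotate 4        # keep four rotated logs, drop older ones
        compress
        missingok       # no error if a log file is absent
        notifempty      # skip rotation when the log is empty
    }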
