Synchronization of linux machines
Sorry if this is a noob question.
So, there's a work machine and there's a laptop. What's the best way to set things up so that I can finish work on the work machine and continue on the laptop, then come back home and pick up again on the main machine?
The programs are development-related: Eclipse, tools, etc. I'm a Linux noob, but as I understand it, something like Eclipse can simply be copied over. The same goes for the ~ folder.
But there are also all kinds of packages, users, groups, etc. Those need to be kept in sync somehow too.
What do you think? Is the problem solvable at all? After all, if the hardware differs, then presumably some set of packages will have to differ as well.
I thought about a virtual machine, but the performance hit is too big.
P.S.: Yes, I know about Dropbox and Ubuntu One (but I went with Lubuntu), and I know rsync exists (though I've never used it). The answer can be just advice on how to write a script, or general thoughts on the matter.
P.P.S.: Say I installed some tool on one machine via apt-get, then created a user and a group. How do I automatically install the tool and create the user on the other machine?
P.P.P.S.: What needs to be synchronized at all? So far I have: the user folder and /opt (via rsync, for example), and users/groups (no idea how yet).
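For the apt-get part, I imagine something like this (not sure it's the proper way; names like devs/alice below are just examples):

```shell
# On machine A: dump the list of installed packages.
dpkg --get-selections > /tmp/packages.txt
head -3 /tmp/packages.txt

# On machine B: replay it (needs root, so shown commented out):
# sudo dpkg --set-selections < /tmp/packages.txt
# sudo apt-get -y dselect-upgrade
#
# And recreate the user and group by hand:
# sudo groupadd devs
# sudo useradd -m -g devs -s /bin/bash alice
```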
I don't think it's possible (or necessary) to synchronize absolutely everything. I keep projects in virtual machines (VirtualBox; I don't notice any performance degradation). I set up the environment and tools once on both machines. I keep SSH keys and project docs in Dropbox, and the code lives in Git anyway, so all that's left is the platform for the projects: databases, the system, etc., and that sits in VirtualBox. If the virtual machine falls too far behind, I simply copy the disk image (dump it onto a portable HDD), but usually a git pull is enough. If you really want to, you can keep the virtual machines directly on a portable HDD; I think you can find a pretty fast one.
Version control systems, for example git. You can set up a remote repository on your own server or get one on GitHub.
I do this with my own projects, and work projects run on the same principle. We work from the office, but sometimes you need to do something from home. It all boils down to a simple git pull... git commit, git push.
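The round trip looks roughly like this (a sketch: a local bare repo stands in for the server, and all paths and names are made up):

```shell
#!/bin/sh
set -e
rm -rf /tmp/sync-demo && mkdir -p /tmp/sync-demo && cd /tmp/sync-demo

git init --bare server.git            # the "remote" (your server or GitHub)
git clone server.git office
git clone server.git home
for repo in office home; do           # git needs an identity to commit
    git -C "$repo" config user.email "you@example.com"
    git -C "$repo" config user.name  "you"
done

cd office                             # finish the day at work...
git checkout -b main
echo 'print("hello")' > app.py
git add app.py
git commit -m "end of day"
git push -u origin main

cd ../home                            # ...continue at home
git pull origin main                  # the day's work arrives
cat app.py
```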
The first thing that came to mind was remote access. I.e., whether from home or from work, you actually use the same computer, just locally in one case and remotely in the other. But it all comes down to internet speed.
The second option is actually the simplest: physically carry a computer with you (a laptop, of course) :)
You come to work, plug in a monitor/mouse/keyboard, and there's your workplace; you continue exactly from the moment you left off at home. I honestly don't see any other way to synchronize everything. But, to be honest, I don't see the point of such synchronization either, because I doubt you install/remove packages or change environments that often. And for everything else, there's a VCS, Dropbox, and so on.
Of course, you can also use rsync. It's quite simple to use: rsync [what] [where].
Set up passwordless SSH, define the host in ~/.ssh/config, and then just pull it down:
rsync --progress -avz -e ssh yourserver:/home/user/www/project/ /home/user/www/project/
and upload it the same way:
rsync --progress -avz -e ssh /home/user/www/project/ yourserver:/home/user/www/project/
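The ~/.ssh/config entry mentioned above might look something like this (the hostname, address, and key path are placeholders):

```
Host yourserver
    HostName 203.0.113.10
    User user
    IdentityFile ~/.ssh/id_ed25519
```

After that, plain `ssh yourserver` (and therefore the rsync commands above) works without typing an address or password.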
But git is better :)
If the task is to synchronize things in the general case (i.e., ideally everything, because you don't know what will need syncing tomorrow), then look toward virtual machines. Almost all of them (yes, I'd say absolutely all modern ones) let you save the current state, including RAM (which may turn out to be relevant).
Wrap everything inside a virtual machine and carry the image and current snapshots with you on a flash drive or some Dropbox analog.
If the data is much larger than the flash drive, just do a bzdiff, keeping a copy of the previous/carried image both at home and at work. I.e., each node stores copies of the current state of all nodes; obviously at any given moment they won't all be identical, which is exactly why they need to be kept. Before leaving, run bzdiff against the saved copy, take the resulting patch with you, and apply it to the image at home (before leaving for work, do the same with the work image).
With only 2 nodes, the list of commands for this is very short: 2-3 lines on each side.
And what about the option of keeping the virtual machine in the cloud (purely a conceptual question)?