Should I compile my own kernel for my home OS?
Let's take a typical home PC: an Intel i5, 8 GB of RAM, a mid-range graphics card.
Does it make sense for the owner of such a home PC to build the kernel himself? I have built the FreeBSD kernel before: of course, the image gets smaller, the system boots faster, and you can tailor it to your needs, but... it seems to me that compiling the kernel is more of a hobby (fun, in a sense) than anything else, and the practical benefit is small.
Recompiling the kernel to "reduce its size so the system boots faster" was relevant back when 64 megabytes of RAM seemed like incredible luxury, every byte counted, and shrinking the kernel from, say, 6 MB to 3 MB felt like a miracle. Today it makes no sense.
Recompilation may be needed, as already mentioned, to support rare hardware, and even then you can (and should) usually get by with loadable modules instead.
If you have to ask such questions, it is definitely not worth it.
The exception is if you need to patch the kernel to support some "exotic" hardware.
Recompiling the kernel is sometimes worth it! This is especially true for laptops and servers.
The fact is that the default kernel has to run on everything, so it forgoes many optimizations. Compiling for your specific processor, for example, can noticeably reduce power consumption, improve performance, and more. For most people there is no point in compiling the kernel, but I do it anyway, because an extra 10% of battery life is significant to me.
Also think about kernel updates: if you are not too lazy to recompile regularly, then go with a custom kernel. Otherwise, don't bother.
If you want to figure out what's what, what depends on what, and generally get into technologies that rely on kernel support, then it's worth it. If you just want a Linux desktop, then no.
In general, the choice should be driven not by raw computer performance but by the tasks a custom kernel build actually solves.
By the way, if you are told that the performance improvement is a myth, don't believe it. Right now I am looking at two identical computers (same memory, motherboards, processors) running the same minerd binary. On one of them bitcoins are mined 10% faster: that is Gentoo, which I configured and built by hand. The other runs Debian.
I think that in the third millennium nobody needs it. At one time I too had fun building the FreeBSD kernel and building software from ports, and I came to the conclusion that human time is more expensive than processor time.
New kernels reach distributions with a significant delay, while new versions bring real progress in open video drivers and sound. If that matters to you, I see no problem with downloading a new kernel and building it without changing anything; an ordinary user can perform this operation and get a real benefit. If you do not need this, or are not sure, then, as others correctly write above, you should not bother.
There is the Liquorix project, which releases rebuilt kernels for Debian with the code supporting ancient devices cut out. But I had problems with it under Ubuntu 13.04: Unity would not start.
Yes. There is a benefit, though not always a noticeable one. For example, on my weak laptop Flash did not work properly until I rebuilt everything.