Best practices for setting up a sensitive data lab?
Hello!
I have been tasked with designing a sensitive data laboratory, but I have never dealt with anything like this before and do not know where to start on the design document. I would be very grateful if someone could share best practices for building such a laboratory on Linux systems.
The laboratory requirements are as follows:
1. LDAP (users, logins/passwords, access rights, etc.) + Kerberos (ticket-based authentication) + NFS (user home directories). User workstations are interchangeable.
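A minimal sketch of how that trio can be wired together on each workstation with sssd; the hostnames, realm, and base DN below are placeholders, not from the original setup:

```ini
# /etc/sssd/sssd.conf -- hypothetical example; ldap.lab.example,
# kdc.lab.example, the LAB.EXAMPLE realm, and the base DN are placeholders.
[sssd]
domains = lab
services = nss, pam

[domain/lab]
id_provider = ldap                  ; users and groups come from LDAP
auth_provider = krb5                ; passwords are verified against the KDC
ldap_uri = ldap://ldap.lab.example
ldap_search_base = dc=lab,dc=example
krb5_server = kdc.lab.example
krb5_realm = LAB.EXAMPLE
cache_credentials = true            ; lets users log in during short server outages
```

With home directories mounted from NFS on top of this, any workstation looks the same to any user, which is what makes them interchangeable.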
Security policies were not applied through explicit Linux policy mechanisms; they were implemented by configuring the workstation and by arranging user groups (for example, a given user is added to the disk group, and the disk group is allowed to mount flash drives).
CentOS Directory Server was used for LDAP; workstations ran Debian (the image was prepared in advance and simply written to disk by user tech support when needed). NFS: a single server for about 200 people; performance problems appeared only when many machines were powered on at once (the bottleneck was disk IOPS, not the network).
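The NFS side of such a setup can be as small as one export on the server and one mount entry on each workstation; the subnet, hostname, and path below are invented for illustration:

```conf
# /etc/exports on the NFS server (hypothetical subnet and path):
/srv/home  10.0.0.0/24(rw,sync,root_squash)

# /etc/fstab entry on each workstation (hypothetical server name);
# _netdev delays the mount until the network is up:
# nfs-server.lab.example:/srv/home  /home  nfs  defaults,_netdev  0  0
```

Mounting homes on demand via autofs instead of a static fstab entry is one way to soften the IOPS spike when many machines boot simultaneously.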
2. Internet access went through a proxy with authorization (tied to LDAP, see above). Users who needed full access were allowed everywhere, those who needed limited access were restricted to a whitelist, and everyone else was denied. Plus log analysis, traffic accounting, and so on.
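One common way to implement such a proxy is Squid with its LDAP basic-auth helper. The fragment below is a sketch under that assumption; the helper path, base DN, and ACL file names are placeholders:

```conf
# /etc/squid/squid.conf -- sketch; hostnames, DNs, and file paths are hypothetical
auth_param basic program /usr/lib/squid/basic_ldap_auth -b "dc=lab,dc=example" -f "uid=%s" -h ldap.lab.example
acl authenticated proxy_auth REQUIRED
acl full_access proxy_auth "/etc/squid/full-access-users.txt"
acl whitelist dstdomain "/etc/squid/whitelist.txt"
http_access allow full_access                 # trusted users: anywhere
http_access allow authenticated whitelist     # everyone else: whitelisted sites only
http_access deny all
```

Squid's access.log then provides the raw data for the log analysis and traffic accounting mentioned above.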
3. Logging: whatever you configure (logging every mouse click is hardly feasible; auditing of file opens was used). Centralized storage is achieved by forwarding logs to a server; almost any syslog daemon can do this, just read the documentation.
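With rsyslog, forwarding is a single line, and file-open auditing can be done with an auditd watch rule; the log server hostname and watched path below are assumptions:

```conf
# /etc/rsyslog.conf on each workstation: forward everything to the central
# log server over TCP (@@ = TCP, @ = UDP); hostname is a placeholder.
*.* @@loghost.lab.example:514

# /etc/audit/rules.d/sensitive.rules: record read access to a data
# directory (hypothetical path); query later with `ausearch -k sensitive-read`.
-w /srv/sensitive -p r -k sensitive-read
```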
4. I have not seen it installed on every workstation. On the network side, Snort was used with a large set of rules.
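For reference, a Snort rule looks like the example below; the message, SID, and the choice of outbound SMTP as the suspicious traffic are invented for illustration:

```conf
# Hypothetical Snort rule: alert on outbound SMTP from lab workstations,
# which would normally be expected to go only through a mail relay.
alert tcp $HOME_NET any -> $EXTERNAL_NET 25 (msg:"LAB outbound SMTP from workstation"; sid:1000001; rev:1;)
```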