zyrik, 2017-09-18 10:26:17
linux

Best practices for setting up a sensitive data lab?

Hello!
I have been tasked with designing a sensitive data lab, but I have never dealt with anything like this before and don't know where to start on the document. I would be very grateful if someone could share best practices for building such a laboratory on Linux systems.
The laboratory requirements are as follows:

  1. Centralized management of resources (adding a workstation, automatic application of security policies such as a ban on removable storage media) and of users (it must be possible to centrally create/delete a user and grant/revoke access rights)
  2. Complete prohibition of Internet access from workstations and data servers. All interaction with the outside world goes through a dedicated transfer server ("shipper") and an administrator (installing/removing software, receiving data from and delivering data to the external/internal network).
  3. Logging and centralized storage of logs of all user and administrator actions (commands, program launches, access to data repositories, etc.)
  4. Early notification of violations and automatic blocking
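For requirement 1, a common Linux approach to banning removable storage is to prevent the usb-storage driver from ever loading. A minimal sketch (the filename is an assumption; any name under /etc/modprobe.d/ works):

```conf
# /etc/modprobe.d/no-usb-storage.conf
# Blacklisting alone only blocks automatic loading, so also map the
# module's install command to /bin/false to block explicit loading too.
blacklist usb-storage
install usb-storage /bin/false
```

A configuration-management tool can then push this file to every workstation, which also covers the "automatic application of security policies" part of the requirement.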

In this regard, I have the following questions:
  1. What software can be used for each of these tasks?
  2. Where can I find best practice documents on this issue?

I would be grateful for any tips!


2 answer(s)
Stanislav Vlasov, 2017-09-19
@zyrik

1. LDAP (users, logins/passwords, access rights, etc.) + Kerberos (authentication via tokens) + NFS (user home directories). This makes user workstations interchangeable.
Security policies were not applied explicitly in Linux; instead, each workstation was configured appropriately and users were assigned to groups (e.g., user so-and-so belongs to the "disk" group, and the "disk" group is allowed to mount flash drives).
CentOS DS was used as the LDAP server; workstations ran Debian (an image was prepared in advance and simply written to disk by user support when needed). NFS: one server for about 200 people; performance problems only appeared when machines were powered on en masse (the bottleneck was disk IOPS, not the network).
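On a modern Debian workstation, the LDAP + Kerberos combination described above would typically be wired up through sssd. A minimal sketch, assuming hypothetical hostnames and a realm name that you would replace with your own:

```conf
# /etc/sssd/sssd.conf -- sketch only; ds.lab.example, kdc.lab.example
# and LAB.EXAMPLE are placeholders, not values from the original post
[sssd]
services = nss, pam
domains = LAB

[domain/LAB]
id_provider = ldap            # accounts and groups come from LDAP
auth_provider = krb5          # passwords are checked against Kerberos
ldap_uri = ldap://ds.lab.example
ldap_search_base = dc=lab,dc=example
krb5_server = kdc.lab.example
krb5_realm = LAB.EXAMPLE
```

Because all account data lives on the directory server and home directories on NFS, any user can sit down at any workstation, which is what makes the machines interchangeable.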
2. Internet access went through a proxy with authorization (see LDAP). Those who needed unrestricted access got it; others were limited to a whitelist; everyone else got nothing. Plus log analysis, traffic accounting, etc.
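The proxy-with-LDAP-authorization setup can be sketched with Squid, which ships an LDAP basic-auth helper. Hostnames, the search base, and the whitelist path below are assumptions for illustration:

```conf
# squid.conf sketch -- LDAP-authenticated proxy with a domain whitelist
auth_param basic program /usr/lib/squid/basic_ldap_auth -b "dc=lab,dc=example" -f "uid=%s" -h ds.lab.example
auth_param basic realm Lab proxy

acl authed proxy_auth REQUIRED
acl whitelist dstdomain "/etc/squid/whitelist.txt"

# Authenticated users may reach whitelisted domains; everything else is denied.
http_access allow authed whitelist
http_access deny all
```

Squid's access log then gives you the per-user traffic accounting the answer mentions.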
3. Logging works however you configure it (logging mouse clicks is hardly feasible; we logged file opens instead). Centralized storage is done by forwarding logs to a server; almost every syslog daemon supports this, just read the documentation.
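The forwarding the answer describes is a one-line change in most syslog daemons; auditd can supply the command and file-access events. A sketch for rsyslog plus one audit rule (the log host name and rule key are placeholders):

```conf
# /etc/rsyslog.conf (on each workstation):
# forward everything to the central log host; @@ means TCP, @ means UDP
*.* @@loghost.lab.example:514

# /etc/audit/rules.d/lab.rules:
# record every program launch on 64-bit syscalls, tagged for searching
-a always,exit -F arch=b64 -S execve -k exec_log
```

With audispd configured to emit to syslog, the execve records ride the same rsyslog pipeline to the central server.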
4. I haven't seen this installed on every workstation. At the network level we used Snort with a large set of rules.
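As an illustration of the network-level approach, a local Snort rule could flag any lab host talking to an address outside the internal range directly, bypassing the shipper. The subnets and sid are hypothetical:

```conf
# local.rules sketch -- 10.0.0.0/24 (lab) and 10.0.0.0/8 (internal)
# are placeholder ranges; local rules use sid >= 1000000
alert ip 10.0.0.0/24 any -> !10.0.0.0/8 any (msg:"Lab host contacting external network"; sid:1000001; rev:1;)
```

Pairing alerts like this with a script that shuts the offending switch port would cover the "automatic blocking" requirement.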

C
CityCat4, 2017-09-18
@CityCat4

Try Astra Linux. Obviously you won't get the certified distribution for free, but there are also publicly available editions.
