What is the best way to serve files to two sites from one remote storage?
The task is to serve files to two sites from a single remote storage.
Both sites run on the same server (Nginx + Apache).
I mount a directory from the remote server over FTP with curlftpfs, then place symlinks to it in the sites' web directories and serve the files themselves via Nginx's X-Accel-Redirect header.
Everything works and the files are served, but there is one problem: the script cannot check whether a file exists, because PHP functions such as is_dir, is_file, and fopen do not work on the directory (files) reached through the symlink, although according to the documentation they should.
If I mount the remote directory directly into a site's web directory, everything works fine: files are checked for existence and served through the server header correctly. But there are two sites, and later there may be several more, and mounting the directory into every site is not the best option, which is why I am trying to use symlinks.
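For reference, the setup described above looks roughly like this (host, credentials, and paths are hypothetical). One detail worth checking: by default a FUSE filesystem such as curlftpfs is visible only to the user who mounted it, so a PHP process running as the web server's user cannot stat files through it unless the mount is made with allow_other:

```
# Hypothetical host and paths -- adjust to your setup.
# Without -o allow_other, only the mounting user can see the FUSE mount,
# so is_file()/fopen() in PHP (running as www-data) will fail on it.
curlftpfs -o allow_other ftp://user:password@storage.example.com/files /mnt/storage

# Symlinks from each site's web directory into the shared mount:
ln -s /mnt/storage /var/www/site1/files
ln -s /mnt/storage /var/www/site2/files
```

This is a sketch of the setup in question, not a confirmed fix; enabling allow_other also requires user_allow_other in /etc/fuse.conf when mounting as a non-root user.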
What could be wrong? Permissions? curlftpfs?
How should I handle this situation? What is the best way to organize the remote storage and the interaction between these two sites?
Do I understand correctly that it is enough to grant the web server's user permissions on the folder (files) to have access to it from code, regardless of where the file or directory is located?
What prevents you from moving the shared storage directory above the web root and allowing PHP scripts from all the domains to access that directory?
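A minimal sketch of that layout, assuming the storage is mounted at /srv/storage (a hypothetical path outside any web root) and both sites share one internal Nginx location. The PHP script checks existence and then hands delivery off to Nginx with header('X-Accel-Redirect: /protected/' . $name):

```
# Repeat this location in each site's server block (hypothetical paths).
location /protected/ {
    internal;             # not reachable directly, only via X-Accel-Redirect
    alias /srv/storage/;  # single shared storage above the web roots
}
```

With the storage above the web roots there are no symlinks to break, and adding a third site only means adding the same internal location to its server block.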