I have a client whose site has an absolutely massive image directory: a few dozen gigabytes. (No, it's not that kind of site.)
I'd like to find a good way for the staging copy of the site to access the files in this directory safely. That is, any uploads or deletions triggered while using the staging site should not affect the image directory the live site serves from. The easiest approach would be to simply make a copy of the directory for the staging site to play with, but the sheer size of the directory makes that impractical. (My current workaround is to copy small subdirectories of the main directory over from time to time, which works as long as my client and his employees only test things against small parts of the site.)
I imagine I could get this partially working by creating an empty directory to act as the upload directory for new files, and configuring Nginx to check that directory first and fall back to the "live" image directory when a requested file isn't found there. However, the files in the live directory still wouldn't be visible to the site's PHP code itself, and there would be no way to "delete" existing files.
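The Nginx fallback I have in mind would look roughly like this (the paths are made-up examples, not my actual layout):

```nginx
# Serve staging uploads first; fall back to the live image directory.
# /var/www/staging-images and /var/www/live-images are hypothetical paths.
location /images/ {
    root /var/www/staging-images;
    try_files $uri @live;
}

location @live {
    root /var/www/live-images;
}
```

This covers HTTP requests, but it does nothing for PHP code that reads the filesystem directly, which is the limitation I mentioned.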
So I suppose what I'm hoping for is some filesystem-level trick that lets me mirror the contents of a directory while still allowing new files to be created and existing files to be "removed", all without actually touching the original directory the way a symbolic link would. But any other ideas on how to handle this kind of situation would also be greatly appreciated.
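For what it's worth, what I'm describing sounds like a union/overlay mount. A minimal sketch with Linux overlayfs follows; all paths are made-up examples, and the mount itself requires root:

```shell
# Hypothetical paths for illustration only.
LIVE=/srv/site/images            # read-only lower layer: the live image directory
UPPER=/srv/staging/images-upper  # staging's new/changed files land here
WORK=/srv/staging/images-work    # scratch dir overlayfs needs (same filesystem as UPPER)
MERGED=/srv/staging/images       # merged view the staging site actually uses

mkdir -p "$UPPER" "$WORK" "$MERGED"

# The merged view shows the live files plus staging's additions; deleting a
# file in MERGED records a "whiteout" entry in UPPER and never touches LIVE.
sudo mount -t overlay overlay \
    -o lowerdir="$LIVE",upperdir="$UPPER",workdir="$WORK" \
    "$MERGED"
```

If that's the right tool here, both Nginx and the staging site's PHP code could simply be pointed at the merged directory, which would also cover the "delete" case.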