Measurement of CPU, RAM, disk and network performance (Ubuntu 18.04)

Hi.

I have built a web application (a CMS, http://seductiveapps.com) that works very well and that is based on a growing number of open-source libraries and software packages that are free for commercial use.

Now I would like to know when one of them is having performance problems.

what I have is:
3 working Ubuntu computers behind 1 ADSL line: 1 dev machine, 1 server with a lot of RAM, and 1 spare server running Lubuntu with only 4 GB of RAM. All have generously sized SSDs.
a lot of knowledge of PHP and JavaScript (and of related things)
Database design experience in CouchDB (JSON, scalable, free) and PostgreSQL (SQL, scalable, free)
Outgoing mail routines for PHP that work.
some experience with graphics libraries like Raphael (http://dmitrybaranovskiy.github.io/raphael/)

What I need is:
– through PHP calls to the operating system (Ubuntu, via PHP's exec() function), to monitor and store performance data (in SQL or CouchDB; if you have suggestions on how to store and retrieve it, I am more than willing to listen) *and* to record what generates that server activity: which Ubuntu process, preferably also what initiated that process, including, if possible, which HTTP request triggered the server activity,
– the percentage of CPU and RAM used by each process during each sampling interval, and the same for each server request handled by any HTTP daemon (CouchDB works over HTTP), and the same for PostgreSQL, which I do not yet know whether it can do its business over TCP/IP (I'm guessing and hoping it can),
– an efficient way to extract all this data using PHP's exec() command; I do not care how many commands I have to call and how much output I have to parse, it's just that I do not know any of these commands besides top, and that does not give me enough information (I've put a rough sketch of what I'm imagining right after this list),
– and a clever suggestion about how often I can sample this data without affecting server performance (writes to CouchDB and PostgreSQL are cheap, and they are my preferred storage platforms because of the scalability they already provide to other features on the same server).
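To make this concrete, here is a rough, untested sketch of the kind of thing I'm imagining: sample per-process CPU and RAM via ps through exec(), and store each sample as a JSON document in CouchDB over its HTTP API. The CouchDB URL and the perf_samples database name are just placeholders, not my real setup.

<?php
// Rough sketch: sample per-process CPU/RAM via `ps` and store one JSON
// document per sample in CouchDB. The CouchDB URL and database name
// ("perf_samples") are placeholders.
$couchUrl = 'http://localhost:5984/perf_samples';

// `ps` gives a point-in-time snapshot; tools like pidstat/sar would give
// interval averages, but this shows the exec() + parse approach I mean.
exec('ps -eo pid,ppid,pcpu,pmem,comm --sort=-pcpu --no-headers', $lines);

$processes = [];
foreach (array_slice($lines, 0, 20) as $line) {      // top 20 by CPU
    $cols = preg_split('/\s+/', trim($line), 5);
    if (count($cols) === 5) {
        $processes[] = [
            'pid'  => (int) $cols[0],
            'ppid' => (int) $cols[1],
            'cpu'  => (float) $cols[2],  // %CPU
            'mem'  => (float) $cols[3],  // %RAM
            'cmd'  => $cols[4],
        ];
    }
}

$doc = [
    'type'      => 'perf_sample',
    'host'      => gethostname(),
    'timestamp' => date('c'),
    'processes' => $processes,
];

// POST the sample to CouchDB over HTTP.
$ch = curl_init($couchUrl);
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode($doc),
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
]);
curl_exec($ch);
curl_close($ch);

I would probably run something like that from cron every minute or so; whether that sampling interval is sane, and whether ps is even the right command to parse, is exactly part of what I'm asking.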

Of course, if there is some commercial or free-to-use software that already does this, I'd love to know.

If there is not, I will have to build it myself, and consider licensing it under the MIT license instead of my own license, which is like the MIT license but requires 1% of a person's/organization's income for commercial use.
I think I owe the Ubuntu and Apache communities, along with the TinyMCE builders, the jQuery guys, etc., something back that is MIT-licensed and available as a PHP application whose output can be embedded as a (semi-transparent) iframe. This could be that something.

I need an email notification, with full identification of the server bottleneck, as it happens.
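For the alert itself, something along these lines is what I'm picturing, reusing the mail routines I already have; the threshold (load above the core count) and the recipient address are made up for the example.

<?php
// Rough sketch of the alert: if the 1-minute load average exceeds the
// number of CPU cores, mail me the current top CPU consumers.
// The threshold and the recipient address are placeholders.
$load  = sys_getloadavg()[0];                 // 1-minute load average
$cores = (int) trim(shell_exec('nproc'));

if ($cores > 0 && $load > $cores) {
    exec('ps -eo pid,pcpu,pmem,comm --sort=-pcpu --no-headers | head -n 10', $top);
    $body = "Load {$load} on " . gethostname() . " at " . date('c') . "\n\n"
          . implode("\n", $top);
    mail('admin@example.com', 'Server bottleneck alert', $body);
}

What I don't know is how to turn "the load is high" into "this specific process, started by this specific HTTP request, is the bottleneck", which is the identification part I'm really after.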

And I need the same for my network. I am running a public-access server from behind an ADSL line that currently has an outgoing bandwidth of 2 MByte/sec, which should be enough to serve around 100 KB per client (the rest is conveniently stored in my ISP's hosting space and referenced from the HTML code that the server behind my ADSL line spits out), and is projected to increase to 4 MByte/sec in the next year or so. After that, fiber will eventually be laid to my home in an old neighborhood, and that growth path makes dedicated hosting look like a waste of money (money which I do not even have, by the way).

Now, when I run into performance problems with a specific software stack (for example, the IMAP server or the LDAP server responding very slowly, because initially everything will run on 1 machine), I can move the ailing software stack to another machine (initially behind the same ADSL line and therefore on the same internal network), so I need to measure the performance of the internal network as well as the external one.
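For the network side, I'm imagining something as simple as reading /proc/net/dev twice and diffing the byte counters, roughly like the sketch below; the interface name eth0 is a placeholder for whatever the machine actually uses. Whether there are better commands for this (internal vs. external) is part of the question.

<?php
// Rough sketch: measure throughput on one interface by diffing the byte
// counters in /proc/net/dev over a short interval. The interface name
// ("eth0") is a placeholder.
function interfaceBytes(string $iface): ?array
{
    foreach (file('/proc/net/dev') as $line) {
        if (strpos($line, $iface . ':') !== false) {
            $cols = preg_split('/\s+/', trim(explode(':', $line, 2)[1]));
            // Column 0 = received bytes, column 8 = transmitted bytes.
            return ['rx' => (int) $cols[0], 'tx' => (int) $cols[8]];
        }
    }
    return null;
}

$interval = 5;                       // seconds between the two readings
$before   = interfaceBytes('eth0');
sleep($interval);
$after    = interfaceBytes('eth0');

if ($before !== null && $after !== null) {
    printf(
        "rx: %.1f KB/s, tx: %.1f KB/s\n",
        ($after['rx'] - $before['rx']) / $interval / 1024,
        ($after['tx'] - $before['tx']) / $interval / 1024
    );
}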
Any comment you have on how to get all this information with the least amount of developer time invested, I would love to hear as well.