I was very happy to find this tutorial on how to run Nginx and Apache on the same machine (an RPi 4 with up-to-date Raspbian OS in my case) with the help of HAProxy.
As in the tutorial’s use case, I have a webapp running on Nginx which cannot be migrated to Apache, and I would like to install Nextcloudpi (which requires Apache).
At the end of step 4, Nginx (127.0.0.1 for HAProxy), Apache (127.0.0.2 for HAProxy) and HAProxy are all shown as "active" when checking "systemctl status". However, I cannot access any webapp with my browser at this point. I tend to believe this is due to Nextcloudpi's install.sh script (run prior to following the tutorial), which installed some certificates and somehow forced HTTPS authentication on the machine: when I try to access a URL that worked when I had only my Nginx webapp (let's call it "nginxws.local"), Firefox displays PR_END_OF_FILE_ERROR, which seems to be SSL-related.
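For reference, here is a minimal sketch of the kind of SNI-based frontend I understand the tutorial sets up. This is hypothetical and simplified (the hostname "nginxws.local", the backend names and the certificate path are placeholders from my setup, not the tutorial's exact config):

```
# /etc/haproxy/haproxy.cfg (fragment) -- hypothetical sketch, not the tutorial verbatim.
# Route by the TLS SNI name: nginxws.local goes to Nginx on 127.0.0.1,
# everything else falls through to Apache on 127.0.0.2.
frontend https_in
    bind *:443 ssl crt /etc/haproxy/certs/
    use_backend nginx_backend if { ssl_fc_sni -i nginxws.local }
    default_backend apache_backend

backend nginx_backend
    server nginx 127.0.0.1:80

backend apache_backend
    server apache 127.0.0.2:80
```

If HAProxy terminates TLS like this but no valid certificate is present in the crt directory, the TLS handshake fails before any HTTP response is sent, which I understand can produce exactly the PR_END_OF_FILE_ERROR Firefox shows me.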
I guess I should generate new certificates after the changes made to the webserver and HAProxy configuration. However, despite the dedicated chapter in the tutorial, I am somewhat lost.
Indeed, so far I have no domain name and I am trying to set everything up on my local network for the moment.
I feel the section of the tutorial that creates a DNS .ini file for Certbot is not necessary in my case (in the end, my local network's DNS is managed by my router, set to default ISP values, I believe?), although I am not sure about it.
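Since I have no public domain name, I wonder whether a self-signed certificate would be enough for local testing instead of the Certbot DNS challenge. Here is a sketch of what I would try (the filename and "nginxws.local" hostname are just my local placeholders; HAProxy wants the certificate and key concatenated into a single .pem):

```shell
# Generate a self-signed cert for a local-only hostname (OpenSSL 1.1.1+ for -addext).
openssl req -x509 -nodes -newkey rsa:2048 \
  -keyout nginxws.local.key -out nginxws.local.crt \
  -days 365 -subj "/CN=nginxws.local" \
  -addext "subjectAltName=DNS:nginxws.local"

# HAProxy's "crt" option expects cert + key in one PEM file.
cat nginxws.local.crt nginxws.local.key > nginxws.local.pem
```

The browser would of course warn about the self-signed certificate, but at least it would let me confirm whether the HAProxy/SNI routing itself works before worrying about Let's Encrypt.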
What is more, I don't know whether I should generate a certificate as indicated in the tutorial, or as per Nextcloudpi's instructions.
On a local network, should SNI requests point to domain names as defined in the machine's /etc/hosts file?
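In other words, is something like the following enough? My understanding is that the browser sends the hostname typed in the URL bar as the SNI value, so as long as /etc/hosts (on the Pi and on each client machine) resolves both names to the Pi's address, HAProxy can tell them apart. The IP and the second hostname below are made up for illustration:

```
# /etc/hosts on the Pi and on client machines (example values)
192.168.1.50   nginxws.local
192.168.1.50   nextcloudpi.local
```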
I would also be curious to read, if you have an opinion on the topic, why it does not seem to be an issue here to install HAProxy on the same machine as the webservers it points to, while some people seem to discourage it.
PS: FYI, I cross-posted my issue, in slightly different terms, on Nextcloud's forum.