Enhancement - The control panel must always be reachable using a specific HTTPD instance

  • I think a redirect from port 80/443 to the control panel port would be possible for customers, as long as Apache is working.
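
    A minimal sketch of what such a redirect could look like, assuming the panel is reachable on a dedicated port (8443 is only an example here, not the actual panel port) and that mod_alias is available:

        <VirtualHost *:80>
            ServerName panel.example.com
            # Send plain HTTP visitors on to the panel port (8443 is an assumed value)
            Redirect permanent / https://panel.example.com:8443/
        </VirtualHost>

        <VirtualHost *:443>
            ServerName panel.example.com
            SSLEngine on
            # Certificate paths are placeholders
            SSLCertificateFile    /etc/ssl/certs/panel.pem
            SSLCertificateKeyFile /etc/ssl/private/panel.key
            Redirect permanent / https://panel.example.com:8443/
        </VirtualHost>

    Of course this only helps while Apache itself is running, which is exactly the limitation raised in the reply below.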

  • @Ninos


    Sure, but the goal is to make the panel accessible even if Apache is down. We are not going to start telling people: you can access the panel at X different URLs...


  • Apache shouldn't be/go down. If it does, you have a problem anyhow. Introducing nginx to keep the panel up when Apache fails is not showing much trust in Apache. Is there a reason to doubt Apache?

  • @Smallserver


    Please read the thread carefully before answering. The problem is not really about Apache... Anyway, when you say "Apache shouldn't be/go down", that is true, but only in a Care Bears world ;)


  • Since the control panel doesn't control services or anything related to the server itself, I don't really see the point of such a functionality (if Apache and/or FTP is down, what is the point for the user/admin of connecting to the panel? They won't be able to perform any actions in the end; the admin will need to fix it through SSH as usual, unless I'm missing something).

  • @Athar


    Hello ;

    • This will make it possible to implement commands for managing some services through the control panel
    • This will keep the tools online (pma, webmail, pydio)
    • This will improve security
    • This will allow the admin to access some information without having to dig into the database when the engine is broken (the engine here being the backend side of i-MSCP)
    • This will make it possible to force synchronization of customer accounts even if Apache is down, and to restart Apache through the control panel
    • This will make it easier to integrate interesting things such as AJAX actions with a progress bar
    • This will make it possible to provide a REST API that is always available, even if Apache is down
    • ...


    More info


    Currently, the backend side (which is written in Perl) is not able to communicate with a client (client-server model) such as the frontEnd, a CLI client... This is far too restrictive.


    This also means that each time you trigger a backend request from the frontEnd, you have to reload the page to find out about the task's progression. Even worse, on the backend side, this involves the creation of a new process for each backend request, which also means that all Perl files have to be reloaded and reinterpreted every time...
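
    To put it another way, the current model looks roughly like the sketch below; the script name and the polling loop are purely illustrative, not actual i-MSCP code. A fresh perl interpreter is started for every queued request, so all backend modules are re-read and re-compiled every single time:

        #!/usr/bin/perl
        # Illustrative only: the per-request model described above.
        use strict;
        use warnings;

        while (1) {
            # Pretend the frontEnd has queued a request (e.g. a row in a status table)
            if ( defined( my $requestId = next_pending_request() ) ) {
                # A brand new perl process is spawned for every request, so every
                # backend module has to be loaded and compiled again before any work starts
                system( 'perl', '/usr/local/lib/backend-request-handler.pl', $requestId );
            }
            sleep 1;
        }

        # Placeholder for whatever mechanism actually detects pending requests
        sub next_pending_request { return undef; }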


    My vision behind all this

    • Having a backend daemon that is loaded once and can communicate with clients over an established protocol
    • Having a backend that is multi-threaded and operates on a job queue (thread-safe)
    • Providing a CLI client that can talk to the backend without going through the frontEnd (a minimal sketch of this follows the list)
    • ...
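
    A minimal sketch of that vision, assuming a plain line-based protocol on a local TCP port (9876 is picked arbitrarily here) and Perl ithreads with Thread::Queue for the job queue; none of this is existing i-MSCP code:

        #!/usr/bin/perl
        # Sketch: a backend daemon loaded once, with a thread-safe job queue.
        use strict;
        use warnings;
        use threads;
        use Thread::Queue;
        use IO::Socket::INET;

        my $jobs = Thread::Queue->new();

        # Worker threads pull jobs from the queue (Thread::Queue is thread-safe)
        my @workers = map {
            threads->create( sub {
                while ( defined( my $job = $jobs->dequeue() ) ) {
                    # The real daemon would run the backend task here
                    # (sync an account, restart a service, ...)
                    printf "worker %d: processing '%s'\n", threads->tid(), $job;
                }
            } );
        } 1 .. 4;

        # Very simple line-based protocol: one command per line, e.g. "RESTART apache2"
        my $server = IO::Socket::INET->new(
            LocalAddr => '127.0.0.1',
            LocalPort => 9876,      # arbitrary port for this sketch
            Listen    => 5,
            ReuseAddr => 1,
        ) or die "Cannot listen: $!";

        while ( my $client = $server->accept() ) {
            while ( defined( my $line = <$client> ) ) {
                chomp $line;
                last if $line eq 'QUIT';
                $jobs->enqueue($line);    # queued, no new perl process is spawned
                print {$client} "QUEUED $line\n";
            }
            close $client;
        }

    A CLI client then only needs to open a socket to 127.0.0.1:9876 and send its command, and the frontEnd could poll the daemon over the same protocol to drive an AJAX progress bar instead of forcing page reloads.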


    Have you ever played with Proxmox? ;)


  • @TheCry


    I'll have a look ;)
