A common gripe we hear about Magento is the deployment strategy. The need for a significant site outage during deployment of even a small front-end change is a real weakness. A modern web application should really be architected to allow for zero (or at least minimal) downtime deployments.
This downtime is due to the steps required to prepare changes in the Magento file system for release on a production environment. Each new deployment requires that the newly updated dependencies are installed, the database is updated, the DI is compiled, the autoloader is dumped and all static assets are re-published. Each of these steps takes precious time, and the whole process is book-ended by enabling and disabling the maintenance page, taking the site offline.
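Put together, a traditional single-directory deploy looks something like the sequence below. The commands are the standard Magento 2 CLI; the exact flags and ordering are typical rather than mandatory:

```shell
bin/magento maintenance:enable              # site goes offline here
composer install --no-dev --no-interaction  # install updated dependencies
bin/magento setup:upgrade                   # apply database changes
bin/magento setup:di:compile                # compile the DI configuration
composer dump-autoload --optimize           # regenerate the autoloader
bin/magento setup:static-content:deploy     # re-publish all static assets
bin/magento maintenance:disable             # site comes back online
```

Everything between the first and last command happens with the maintenance page up, which is exactly the window the Pipeline strategy sets out to shrink.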
Thankfully this is something Magento are aware of and have been working to improve. Since the 2.2.x release it has been possible to use Magento's "Pipeline" deployment strategy to reduce the time the site needs to be offline during deployments. Unfortunately the documentation for this process is not great (link below).
The main problem here is that there is a lot of information but not a lot of logical structure. Pretty much everything you need to know is in there, but the information doesn't flow well and it is not clear how to actually implement a best-practice Pipeline deployment.
At Absolute Commerce we have years of experience with Magento and deploying sites into production. We strive to make the deployment process as simple as possible whilst keeping the experience as smooth as possible for users of the site.
What follows is a common setup we use to deploy Magento to multiple servers using the Pipeline deployment strategy, reducing the amount of downtime required during each deploy to almost zero.
One of the biggest points of confusion in the Pipeline documentation is around dumping the Magento configuration to make it available for the deployment steps. We have found that many people fall at the first hurdle due to the vague reasoning and instruction on how to achieve this.
The reason for dumping the config is to copy information that is usually stored only in the Magento database into a file. This removes Magento's dependency on its database for that information, allowing the deployment to be built separately, without needing the currently live files or database. With the information dumped out to the app/etc/config.php file there's no reliance on having an app/etc/env.php file to be able to access the database at all.
What is not clear from the documentation is which information needs to be dumped and what can be left to remain in the database alone. In short the configuration dump step is necessary for Pipeline deployments but not to the extent that the documentation makes out. If you try to run the Magento deploy commands with no dumped config and without an app/etc/env.php file present you'll hit errors.
The only information that needs to be dumped to file is that which the deployment steps depend on. This is in fact only a very small amount of information (currently only the configuration of the store and website scopes). To dump this information into the site's app/etc/config.php file use the following command:
bin/magento app:config:dump scopes
We always recommend storing the app/etc/config.php file in your Magento project's version control, so the dumped information can be committed for use during subsequent deployments.
You may find that you do need additional configuration to be stored in the dump, if it is required by the deployment process. In this case you can use the command below. Note this will dump the entire core config data of the site into the app/etc/config.php file, so you'll want to trim the array down to just the elements you need before committing it.
bin/magento app:config:dump system
The site's JS/CSS minification configuration is a good example of something you might want to include, so that your live deploys minify the files as expected.
If you prefer to keep this information out of your version control and make it environment-dependent, you might consider using environment variables to override configuration settings, as per this link:
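As an illustration, Magento (since 2.2) reads configuration overrides from environment variables named `CONFIG__<SCOPE>__<PATH>`, with double underscores standing in for the slashes in the config path. A hypothetical override for the JS minification setting (`dev/js/minify_files`) would look like this; the value is illustrative:

```shell
# Override dev/js/minify_files for the default scope via the environment.
# The CONFIG__ naming convention is Magento's; set this in the shell profile
# or process environment of the user running the deploy commands.
export CONFIG__DEFAULT__DEV__JS__MINIFY_FILES=1
echo "$CONFIG__DEFAULT__DEV__JS__MINIFY_FILES"
```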
DeployHQ is one of many great CI tools out there that allows you to program in custom deployment steps to run on your servers, pulling code in to place and running relevant install commands. We have used DeployHQ in this example as it is one of the easiest to use and offers a free tier so you can have a dabble with no commitment.
Within DeployHQ you can specify the servers you want to deploy to and grant access to them for DeployHQ to make changes. In this example we are using two separate servers to demonstrate the multi server element of the Pipeline deploy.
DeployHQ allows you to use 'Server Groups' to group your servers together and specify the 'Server Transfer Order'. For Magento Pipeline deployment to multiple servers, create a new server group and select the Parallel Transfer Order option.
This means that on each deployment the steps of the deploy will be carried out on each server in the group one by one before moving on to the next step of the deploy. You can also specify certain steps to only run on certain servers, more on this later.
Each server is configured to allow SSH access to DeployHQ using a keypair (generated by DeployHQ). You'll need to provision access to the server using the public key provided, adding it to the ~/.ssh/authorized_keys file of the relevant user (the owner of the files to be deployed) on the server. Don't forget you'll also need to whitelist the DeployHQ IPs to access the server on your SSH port.
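Provisioning the key amounts to something like the following, run as the deploy user on each target server. The key material shown is a placeholder; substitute the public key DeployHQ generates for your project:

```shell
# Placeholder key material — use the public key from your DeployHQ project.
PUBKEY='ssh-ed25519 AAAAexampleexampleexample deployhq-deploy-key'
mkdir -p ~/.ssh && chmod 700 ~/.ssh
echo "$PUBKEY" >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```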
Make sure to tick the "Perform zero-downtime deployments" checkbox when setting up each server. This is the key to performing the Magento deployment steps in the required order to reduce downtime.
The zero downtime element of the deployment is implemented on your servers using a symlink to point the webroot of your site to a particular release directory. Each new deployment creates a new release directory and during a deployment the symlink is updated to point to the newly deployed code rather than the old code.
Once the servers are all configured the next step is to configure the SSH commands that will be run on them each time you kick off a deploy. This is set up in the "SSH Commands" section of the DeployHQ account. Each step of the Magento deploy can be broken down into its individual commands here. On each deployment the files from the specified branch or commit are copied to the server, and then each of these steps is run.
Within each SSH command DeployHQ provides variables which are substituted in for the relevant value during a deployment. The key variable used here is %release_path%. This allows the deploy step to reference the new release directory created for the current run of the deployment. The example below shows how this would be used to enable maintenance mode in the current release directory.
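A minimal sketch of such a command, assuming bin/magento sits at the root of the release directory:

```shell
cd %release_path% && bin/magento maintenance:enable
```

DeployHQ substitutes %release_path% with the absolute path of the new release directory before running the command.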
With DeployHQ zero-downtime the SSH commands you define can be run either before the current release symlink is switched, or after. With this in mind you'll need the following configuration.
Before the release symlink is switched over, you'll want your Composer dependencies installed, DI compiled, autoloader dumped and static content deployed. All of these steps can happen 'offline', contained within the release directory before the symlink switch, meaning the site is still up and live whilst this is happening. In a true CI setup these steps should really happen elsewhere, with the result retrieved here as an artifact to use for the deployment. For the sake of clarity we're using DeployHQ for everything in this example:
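A sketch of these pre-switch SSH commands, one per DeployHQ step and each configured to run on all servers. Flags such as --no-dev, and any themes or locales you pass to static content deployment, are assumptions to adjust for your own project:

```shell
# 1. Install dependencies into the new release
cd %release_path% && composer install --no-dev --no-interaction
# 2. Compile the dependency injection configuration
cd %release_path% && bin/magento setup:di:compile
# 3. Dump the optimised autoloader
cd %release_path% && composer dump-autoload --optimize
# 4. Publish static assets (add your themes/locales as needed)
cd %release_path% && bin/magento setup:static-content:deploy
```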
After the release symlink is switched over the live site will be operating on the newly deployed files. However, the database has not yet been updated with any changes introduced, so now the maintenance page needs to go up whilst the setup:upgrade command runs.
Remember, as it is configured in "Parallel" mode, DeployHQ will complete each step of the deploy on all of the specified servers before moving on, so the maintenance page will go up on every server before the setup:upgrade command is run. Similarly the maintenance page will not come down until the setup:upgrade command has run. In the configuration of the setup:upgrade SSH command it is important to set it to run on only one server rather than "All Servers", as the database is shared by all of the servers and only needs to be updated once.
Note, the --keep-generated parameter must be used here when running setup:upgrade to prevent the previously generated Magento files from being deleted. How long this process takes depends entirely on the changes introduced, but where only regular schema changes are being made it will usually complete in under a minute, a significant reduction in the time the maintenance page needs to be up.
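The post-switch steps then look something like the sketch below. The middle command is configured to run on a single server only; the other two run on all servers:

```shell
# All servers: put the maintenance page up
cd %release_path% && bin/magento maintenance:enable
# One server only: update the shared database, keeping generated files
cd %release_path% && bin/magento setup:upgrade --keep-generated
# All servers: bring the site back online
cd %release_path% && bin/magento maintenance:disable
```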
The final element of the DeployHQ setup is to leverage their "Shared Directory" functionality to enable each new release directory to link to assets that you want to be available across deploys. For instance, you don't ideally want to duplicate all of your catalog images for every single deployment.
Shared directory works by creating symlinks in the release to any files or directories added to a special shared directory, added alongside the releases and current symlink on the server. You can read more about how this works here:
For Magento Pipeline deployment we suggest the following structure for the shared directory, keeping the site's env.php, media and logs/reports available across deploys with no need to copy or link files yourself:
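A layout along these lines works well; paths are relative to the shared directory, and each entry is symlinked into the matching path of every release. Which var/ subdirectories you share is a judgement call for your own setup:

```
shared/
  app/etc/env.php   # environment-specific settings and DB credentials
  pub/media/        # catalog, WYSIWYG and other uploaded media
  var/log/          # logs persist across releases
  var/report/       # error reports persist across releases
```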
So that's how you can robustly deploy Magento to multiple servers with minimal downtime. Obviously there are many additional steps that can be added to this brief example to improve the process, but we hope it gives enough insight into the process to get over the initial hurdles and get under way.
Should you require assistance with your own Magento deploys or any other aspect of ecommerce, please don't hesitate to get in touch for a chat with us.