Hosting WordPress Yourself Part 7 – Nginx Security Tweaks, WooCommerce Caching, and Auto Server Updates

In the last post of this series, I demonstrated how to configure HTTPS and HTTP/2. In this post I’m going to tie up a few loose ends and cover a number of topics that didn’t quite make the cut in previous posts.

I’ll start with security and how you can further protect your sites against Cross-site Scripting (XSS) and Clickjacking threats. While on the subject of security I’ll also take the opportunity to discuss automatic updates and how to enable them. Next, I’ll show you how to correctly configure FastCGI caching for use with plugins such as WooCommerce, to ensure that you do not cache your checkout or account pages. Finally, I’ll demonstrate how to easily set up automated tasks across multiple sites with a few examples of tasks I like to perform.

More Security

In the previous post you learned how to configure HTTPS to encrypt connections between the browser and server, but this still leaves sites open to other areas of attack, such as XSS, Clickjacking and MIME sniffing. Let’s look at each of those now.

XSS

The most effective way to deal with XSS is to ensure that you correctly validate and sanitize all user input, including that within the WordPress admin areas. That said, input validation and sanitization are sometimes out of your control, especially when you need to rely on third party themes or plugins. You can however reduce the risk of XSS attacks by configuring Nginx to provide a few additional response headers.

Let’s assume an attacker has managed to embed a malicious JS file into the source code of your site, maybe through a comment form or something similar. By default, the browser will unknowingly load this external file and allow its contents to execute. Enter ‘Content Security Policy’, which allows you to define a whitelist of sources that are approved to load assets (JS, CSS, etc). If the script isn’t on the approved list, it doesn’t get loaded.

Creating a ‘Content Security Policy’ can require some trial and error, as you need to be careful not to block non-harmful assets such as those provided by Google or other third party vendors. This sample policy will allow the current domain and a few sources from Google and WordPress:

default-src 'self' https://*.google-analytics.com https://*.googleapis.com https://*.gstatic.com https://*.gravatar.com https://*.w.org data: 'unsafe-inline' 'unsafe-eval';

Alternatively, some people opt to only block non-HTTPS assets, which although less secure is a lot easier to manage:

default-src 'self' https: data: 'unsafe-inline' 'unsafe-eval';

You can add the header directive to nginx.conf or each site’s individual configuration file, depending on whether you want to share the policy across all sites. Personally, I specify a generic policy in the global config file and override it on a per-site basis as needed.
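One gotcha when overriding per site: Nginx only inherits add_header directives from the http block if the server block declares none of its own, so a per-site policy must redeclare any global headers you still want. A rough sketch (the server_name and policy are placeholders):

```nginx
server {
    server_name example.com;  # placeholder

    # Site-specific policy. Declaring any add_header here stops the
    # add_header directives in nginx.conf from being inherited, so
    # repeat any other global headers you still want alongside it.
    add_header Content-Security-Policy "default-src 'self' https: data:;" always;

    # ...rest of the site configuration...
}
```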

sudo nano /etc/nginx/nginx.conf

Add the following code within the http block:

##
# Security
##

add_header Content-Security-Policy "default-src 'self' https: data: 'unsafe-inline' 'unsafe-eval';" always;

Some of you may have picked up on the fact that this only deals with external assets, but what about inline scripts? There are two ways you can handle this:

  1. Completely disable inline scripts by removing 'unsafe-inline' and 'unsafe-eval' from the ‘Content-Security-Policy’. However, this approach can break some third party plugins or themes, so be wary before going down this route.
  2. Enable ‘X-Xss-Protection’, which instructs the browser to switch on its built-in XSS filter and block suspicious scripts that appear in both the request and the response. Although not foolproof, it’s a relatively simple countermeasure to implement.

To enable the ‘X-Xss-Protection’ filter add the following directive below the ‘Content-Security-Policy’ entry:

add_header X-Xss-Protection "1; mode=block" always;

Remember, these headers are no replacement for correct validation or sanitization. However, if you are able to explicitly define the ‘Content-Security-Policy’ sources and disable inline scripts you will have a very strong line of defence against various XSS based attacks.

Clickjacking

Clickjacking is an attack that tricks users into performing an action they did not intend, and is commonly achieved through the use of iframes. For more information on the Clickjacking threat, check out this article by Troy Hunt.

The most effective way to combat this attack vector is to completely disable frame embedding from third party domains. To do this, add the following directive below the ‘X-Xss-Protection’ header:

add_header X-Frame-Options "SAMEORIGIN" always;

This will prevent all external domains from embedding your site directly into their own through the use of the iframe tag:

<iframe src="http://mydomain.com"></iframe>
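Browsers that support CSP Level 2 also honour the frame-ancestors directive, which supersedes ‘X-Frame-Options’. If you prefer to manage the rule in your policy, you can append it to your existing ‘Content-Security-Policy’ value, keeping ‘X-Frame-Options’ in place for older browsers. A sketch based on the sample policy from earlier:

```nginx
# frame-ancestors 'self' mirrors SAMEORIGIN. Append it to the existing
# policy rather than adding a second Content-Security-Policy header,
# as browsers enforce the intersection of multiple CSP headers.
add_header Content-Security-Policy "default-src 'self' https: data: 'unsafe-inline' 'unsafe-eval'; frame-ancestors 'self';" always;
```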

MIME Sniffing

The final security concern to tackle in this article is MIME sniffing, which can expose your site to attacks such as drive-by downloads. The ‘X-Content-Type-Options’ header counters this threat by ensuring only the MIME type provided by the server is honored. This post by Microsoft contains further information.

To disable MIME sniffing add the following directive:

add_header X-Content-Type-Options "nosniff" always;

That’s all of the suggested security headers implemented. Save and close the file by hitting CTRL X followed by Y. Before reloading the Nginx configuration, ensure there are no syntax errors.

sudo nginx -t

If no errors are shown, reload the configuration.

sudo service nginx reload

After reloading your site you may see a few console errors related to external assets. If so, adjust your ‘Content-Security-Policy’ as required.

You can confirm the status of your site’s security headers using SecurityHeaders.io, which is an excellent free resource created by Scott Helme. This, in conjunction with the SSL Server Test by Qualys SSL Labs, should give you a good insight into your site’s security. Another useful tool is WP Scanner, which will detect other WordPress specific areas of vulnerability.

Automatic Security Updates

It’s vitally important that you keep your server software updated as all the precautions in the world won’t protect you if you’re using software with known vulnerabilities. Thankfully, Ubuntu can automatically perform software updates. However, it’s important to remember that this convenience can be quite dangerous and it’s recommended that you only enable security updates. This will automatically patch new vulnerabilities as they are discovered, like the Heartbleed bug in 2014.

Non-essential software updates should be tested on a staging server before installing them so as not to introduce breaking changes, which could inadvertently take your sites offline.

On some systems this feature may already be enabled. If not, or you’re unsure, follow the steps below:

Install the unattended-upgrades package:

sudo apt-get install unattended-upgrades

Create the required configuration files:

sudo dpkg-reconfigure unattended-upgrades

Edit the configuration file:

sudo nano /etc/apt/apt.conf.d/50unattended-upgrades

Ensure that the security origin is allowed, all others should be removed or commented out:

// Automatically upgrade packages from these (origin:archive) pairs
Unattended-Upgrade::Allowed-Origins {
    "${distro_id}:${distro_codename}";
    "${distro_id}:${distro_codename}-security";
//  "${distro_id}:${distro_codename}-updates";
//  "${distro_id}:${distro_codename}-proposed";
//  "${distro_id}:${distro_codename}-backports";
};

You may also wish to configure whether the system should automatically restart when an update requires it. Automatic reboots are disabled unless ‘Unattended-Upgrade::Automatic-Reboot’ is set to "true". Once enabled, the default behaviour is to restart immediately after the update is installed, but you can specify a time instead:

Unattended-Upgrade::Automatic-Reboot "true";
Unattended-Upgrade::Automatic-Reboot-Time "04:00";

If your server does restart, you must ensure all critical services start again. By default Nginx, PHP and MariaDB will start automatically on boot, but check out this Stack Overflow thread on how to add additional services if needed.

Finally, set how often the automatic updates should run:

sudo nano /etc/apt/apt.conf.d/10periodic

Ensure that Unattended-Upgrade is in the list.

APT::Periodic::Unattended-Upgrade "1";

The number indicates how often the upgrades will be performed in days. A value of 1 will run upgrades every day.
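For reference, a typical 10periodic file might look something like the following (the exact lines on your system may differ; all values are in days, apart from AutocleanInterval’s seven-day cleanup cycle which is also expressed in days):

```
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Download-Upgradeable-Packages "1";
APT::Periodic::AutocleanInterval "7";
APT::Periodic::Unattended-Upgrade "1";
```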

Automated Tasks

Automated tasks can go a long way to streamline the process of hosting your own sites and can dramatically reduce your time spent manually performing repetitive tasks. In part 5 of this series I showed you how to set up WordPress cron and automatic backups using the crontab. While this approach is fine when hosting a small number of sites, it quickly becomes hard to manage as your crontab grows in size. Let’s improve upon this approach so that you only require a single cron entry for each individual task.

I’ve created a GitHub repository to house my current setup, which makes installing the automated tasks relatively simple. The following tasks are included, but feel free to add your own:

  • WordPress cron (every 5 minutes)
  • Database and uploads directory backups to S3 (daily at 5AM)
  • File permission updates – loops through each site and automatically sets the correct file permissions as recommended by this article (daily at 6AM)
  • Verify WordPress checksums – checks all core files against the WordPress repo to monitor for code changes, which is often the first sign that your site has been compromised (kudos to Danny van Kooten). If a change is detected a push notification is sent to any device using the PushBullet app (daily at 7AM)

Log in to your server and copy the files to your home directory. Note the -L flag, which tells curl to follow GitHub’s redirect; without it you’ll end up with an unusable ZIP:

curl -LO https://github.com/A5hleyRich/simple-automated-tasks/archive/master.zip

Unzip the compressed files:

unzip master.zip

Move the ‘.tasks’ directory to your home directory (note the leading dot: the directory is hidden, so it won’t show up in a plain ls listing):

cd simple-automated-tasks-master
mv .tasks ~/

Clean up the leftover files:

rm -fr master.zip simple-automated-tasks-master

Open the ‘sites.sh’ config file and add any sites for which you wish to enable the automated tasks:

nano ~/.tasks/sites.sh

Once happy, save the configuration file by hitting CTRL X followed by Y.

Finally, open your crontab:

crontab -e

Add the following entries, replacing those created in part 5 (remember to update the file paths to point to your home directory):

*/5 * * * * cd /home/a5hley/.tasks; bash cron.sh >/dev/null 2>&1
0 5 * * * cd /home/a5hley/.tasks; bash backups.sh >/dev/null 2>&1
0 6 * * * cd /home/a5hley/.tasks; bash permissions.sh >/dev/null 2>&1
0 7 * * * cd /home/a5hley/.tasks; bash checksums.sh >/dev/null 2>&1

Save the entries by hitting CTRL X followed by Y.

If you wish to send your backups to S3 you will also need to install and configure the AWS CLI tools, as detailed by Brad here. If you do install them, the backup task will automatically send them to S3 and store them like so: ashleyrich.com/backups/*.sql.gz. If you need to change the upload location, modify lines 23 and 24.

In order for push notifications to be sent to your devices when a code change is detected, you must add your PushBullet access token. This can be found from your account page and should be added to line 7.

That’s all there is to enabling the automated tasks. Now, in the future, when you wish to add a new site to your server, just update the ‘sites.sh’ config file and everything else is taken care of.

eCommerce FastCGI Cache Rules

In part 4 of this series I showed you how to implement FastCGI caching to supercharge your sites without the need for complicated caching plugins. Although page caching is desired for the majority of frontend pages there are times when it can cause issues, particularly on eCommerce sites. For example, in most cases you shouldn’t cache the shopping cart, checkout or account pages as they are generally unique for each visitor.

New cache exclusions can be added using simple conditionals and regular expressions. The following example will work for the default pages (Cart, Checkout and My Account) created by WooCommerce:

if ($request_uri ~* "/(cart|checkout|my-account)/*$") {
    set $skip_cache 1;
}
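Before reloading Nginx it can be worth sanity checking which URIs a new pattern will actually match. A rough, hypothetical way to do this locally is with grep; grep -E uses POSIX extended regexes while Nginx uses PCRE, but for a simple pattern like this the two behave identically (the -i flag mirrors Nginx’s case-insensitive ~* operator, and the URIs below are just examples):

```shell
# Test the skip-cache pattern against some sample request URIs.
pattern='/(cart|checkout|my-account)/*$'

for uri in /cart/ /checkout /my-account/ /shop/ /blog/hello-world/; do
    if echo "$uri" | grep -Eqi "$pattern"; then
        echo "$uri -> BYPASS"
    else
        echo "$uri -> CACHE"
    fi
done
# /cart/ -> BYPASS
# /checkout -> BYPASS
# /my-account/ -> BYPASS
# /shop/ -> CACHE
# /blog/hello-world/ -> CACHE
```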

Open the configuration file for your chosen site, in my case:

sudo nano /etc/nginx/sites-available/ashleyrich.com

Add the new exclusion to the server directive, directly below the existing conditionals. Once you’re happy, save, test and reload the configuration for the changes to take effect. You should now see that the ‘fastcgi-cache’ response header is set to ‘BYPASS’ when visiting any of the WooCommerce pages.

Once in place, your site configuration will contain both the original skip cache conditionals from part 4 and the new WooCommerce exclusion, minus the SSL specific directives if you’re running on regular HTTP.
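As a rough sketch only (the conditionals reflect the FastCGI cache setup from part 4 of this series; treat this as illustrative rather than a drop-in file), the relevant parts of the server block now look like:

```nginx
server {
    # ...listen, server_name, root, SSL directives, etc...

    set $skip_cache 0;

    # POST requests and URLs with a query string should always skip the cache
    if ($request_method = POST) {
        set $skip_cache 1;
    }
    if ($query_string != "") {
        set $skip_cache 1;
    }

    # Don't cache URIs containing the following segments
    if ($request_uri ~* "/wp-admin/|/xmlrpc.php|wp-.*.php|/feed/|index.php|sitemap(_index)?.xml") {
        set $skip_cache 1;
    }

    # Don't use the cache for logged-in users or recent commenters
    if ($http_cookie ~* "comment_author|wordpress_[a-f0-9]+|wp-postpass|wordpress_no_cache|wordpress_logged_in") {
        set $skip_cache 1;
    }

    # Don't cache WooCommerce's cart, checkout or account pages
    if ($request_uri ~* "/(cart|checkout|my-account)/*$") {
        set $skip_cache 1;
    }

    # ...location blocks passing $skip_cache to fastcgi_cache_bypass
    # and fastcgi_no_cache...
}
```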

WooCommerce isn’t the only plugin to create pages that you should exclude from the FastCGI cache. Plugins such as Easy Digital Downloads, WP eCommerce, BuddyPress and bbPress all create pages that you will need to exclude. Simply add any desired rules, as I have demonstrated above.
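For example, if you’re running Easy Digital Downloads with its default page slugs, a similar conditional might look like this (the slugs below are assumptions based on EDD’s defaults; adjust them to match your install):

```nginx
# Hypothetical exclusion for Easy Digital Downloads' default pages;
# slugs will differ if you've renamed the pages.
if ($request_uri ~* "/(checkout|purchase-confirmation|purchase-history|transaction-failed)/*$") {
    set $skip_cache 1;
}
```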

Job done! I realise it’s been quite a mixed bag of information, but hopefully you’ve found parts useful. Let me know if you have any questions in the comments below.

About the Author

Ashley Rich

Ashley is a PHP and JavaScript developer with a fondness for solving complex problems with simple, elegant solutions. He also has a love affair with WordPress and learning new technologies.

  • Jeff van Loben Sels

    I’d love to see some great tutorials/examples of Continuous Integration based on multiple servers tied with different branches in Git.

    • Thanks for the idea Jeff, it’s duly noted.

  • Popsantiago

    @A5hleyRich:disqus, i put your code in my conf and i have alerts as you told (with Zopim)… But how can i solve it ? >> Refused to connect to ‘wss://ie08.zopim.com/s/W/ws/IlZWZfWm06K/c/44996166140’ because it violates the following Content Security Policy directive: “default-src ‘self’ https: data: ‘unsafe-inline’ ‘unsafe-eval'”. Note that ‘connect-src’ was not explicitly set, so ‘default-src’ is used as a fallback.
    Any idea ? Thx

    • You need to add the wss protocol, something like:

      add_header Content-Security-Policy "default-src 'self' https: data: wss: 'unsafe-inline' 'unsafe-eval';" always;

  • Hi @A5hleyRich:disqus, after installing the unattended-updates, I have to do sudo dpkg-reconfigure unattended-upgrades to make it work, is that right? (You didn’t mention the extra step in this post)

    • I think it depends on whether `unattended-upgrades` was installed by default or not. I’ll update the article to add the extra step. Thanks for the heads up.

  • As for security and WordPress, I would love to hear your thoughts on permissions needed for WordPress folder to be able to have auto-updates running as well as still being reasonably secure install.

  • Carlos Augusto dos Santos

    Very good setup! I set up a server in just this way and it was good. Now I configured with ubuntu 16.04 with nginx 1.10.0 and parameter:

    location ~ /purge(/.*) {
        fastcgi_cache_purge ashleyrich.com "$scheme$request_method$host$1";
    }

    Does not work, I have to do to enable this parameter?

  • bubienok

    Hi, I have problem with
    https://github.com/A5hleyRich/simple-automated-tasks/archive/master.zip
    I cannt unzip it
    pls could you check it ?
    thank you

    • The ZIP works fine. Remember, the files are hidden.

      • John

        I had problem with master.zip too when downloaded it via curl -O.
        The Error:

        $ unzip master.zip
        Archive: master.zip
        End-of-central-directory signature not found. Either this file is not
        a zipfile, or it constitutes one disk of a multi-part archive. In the
        latter case the central directory and zipfile comment will be found on
        the last disk(s) of this archive.
        unzip: cannot find zipfile directory in one of master.zip or
        master.zip.zip, and cannot find master.zip.ZIP, period.

        Then I got that with wget and it was fine.