When the time comes to release a new update to one of our plugins, the Delicious Brains team gets excited about delivering more features and improvements to our customers, but also a little sad that it signals the start of another round of rigorous manual release testing! Recently we’ve been discussing how best to improve this testing process, and one option is automation. Let me talk you through what we do today and what we are slowly trying to implement.
Release Testing Relay
Generally we approach release testing (also known as acceptance testing) as a 3-person testing relay. Our current setup involves a Google spreadsheet for each product, with worksheets for each of the plugins and addons. Each sheet contains numerous test scripts to perform, covering the functionality and edge cases of the plugin.
The first person runs through the scripts, testing and recording any issues. If an issue is found, the test is marked as a fail and an issue is raised on GitHub to be fixed. The next person in the relay can only start their round of testing once the first person has finished and the issues have been fixed, tested, and merged. The process is repeated until the third person has fully tested and no bugs are found.
This testing approach is very thorough but extremely time-consuming. We don’t have a dedicated testing resource, so the testing is performed by our team members (all developers), which in itself is quite costly. Of course, the quality of the testing is high, and this benefit has always outweighed the time and cost. However, as a small team we are always looking at ways to improve, automate, and refine our processes to free up resources for improving our products and occasionally building new ones.
Our testing process, which Ian has written about previously, also includes unit tests, which are written using PHPUnit and run by Travis CI every time code is pushed to our GitHub repositories. This is great, as it is highly automated and catches issues with code before the products reach the release testing stage. However, unit testing is just one piece of the testing puzzle; we still need real people to test how the plugin works and what it does. You can’t automate that, can you?
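For context, the Travis CI side of this is driven by a small config file in the repo. A minimal .travis.yml for a PHPUnit suite looks roughly like the following — the PHP versions and paths here are illustrative, not our exact setup:

```yaml
# Minimal .travis.yml sketch; a real config would likely test more PHP versions.
language: php
php:
  - 5.6
  - 7.0
install:
  - composer install
script:
  - vendor/bin/phpunit
```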
Codeception
Codeception is a behaviour-driven testing framework that automates acceptance, functional, and unit testing. Tests are written in PHP (the framework is actually built on PHPUnit), in a clear and descriptive way, similar to Behat.
It is the acceptance testing part of Codeception we are interested in. It allows us to automate the tests usually carried out by a person actually using our plugins. By default Codeception uses what they call ‘PhpBrowser’:
a PHP web scraper, which acts like a browser: it sends a request, then receives and parses the response. Codeception uses Guzzle and Symfony BrowserKit to interact with HTML web pages.
PhpBrowser has some limitations around which UI elements can be clicked, and it doesn’t handle JavaScript interactions. However, your acceptance tests can be run using the WebDriver module instead, which leverages Selenium WebDriver, a browser automation tool, so that your tests can interact with the browser (typically Firefox) in the same way as if you were testing it yourself. Sounds good, right? Let’s take a look at getting something set up.
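Switching a suite from PhpBrowser to WebDriver is a matter of suite configuration. Something like the following in tests/acceptance.suite.yml does the job — the URL and browser values here are placeholders, not our real setup:

```yaml
# tests/acceptance.suite.yml (values are illustrative)
class_name: AcceptanceTester
modules:
    enabled:
        - WebDriver:
            url: 'http://localhost:8080'
            browser: firefox
            window_size: 1024x768
```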
Installation
We have only just started using Codeception for testing our plugins and have a handful of test scenarios working on a branch of the repo for our AWS plugins. What we found when setting this up is that installing and configuring the environment took the most time, while writing the tests themselves was much faster.
Like PHPUnit, we have Codeception installed in our repository using Composer:
"require": {
"codeception/codeception": "2.1.*",
}
I won’t go into any more detail about the installation and configuration as the Codeception site covers getting started better than I could.
We also have WPBrowser installed, a set of WordPress extensions for Codeception. The WPBrowser module adds some very helpful methods allowing the test ‘actor’ to easily interact with a WordPress site.
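WPBrowser is also a Composer package, so — assuming the lucatume/wp-browser package name and leaving the version constraint open for illustration — the require block grows to something like:

```json
"require": {
    "codeception/codeception": "2.1.*",
    "lucatume/wp-browser": "*"
}
```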
In the following example we will be testing a scenario of our WP Offload S3 Lite plugin, which interacts with Amazon S3. Unlike unit tests, where code is tested in isolation, these acceptance tests need to assert that files have been uploaded to S3. We could use the WebDriver module to check whether files exist on S3 via Amazon’s S3 console page, but the console isn’t exactly friendly even for real users, and I wouldn’t want to rely on Selenium driving it on every test run. Instead I have built an S3 file system module for Codeception that uses the AWS API to make assertions about S3 data.
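A sketch of what such a module can look like is below. The class and method names are illustrative rather than the actual module we built; a custom Codeception module extends \Codeception\Module, and the AWS SDK’s S3Client exposes doesObjectExist() for exactly this kind of check:

```php
<?php
// Illustrative sketch of a custom Codeception module asserting against S3.
// Class name, config keys, and method names are hypothetical.
use Aws\S3\S3Client;

class S3Filesystem extends \Codeception\Module
{
	/** @var S3Client */
	protected $s3;

	/** @var string */
	protected $bucket;

	public function _initialize()
	{
		// Credentials are picked up from the environment or suite config.
		$this->s3 = new S3Client( array(
			'region'  => $this->config['region'],
			'version' => 'latest',
		) );
	}

	public function setBucket( $bucket )
	{
		$this->bucket = $bucket;

		return $this; // enables $I->setBucket( ... )->seeFile( ... )
	}

	public function seeFile( $key )
	{
		$this->assertTrue(
			$this->s3->doesObjectExist( $this->bucket, $key ),
			"File {$key} not found in bucket {$this->bucket}"
		);
	}
}
```

Enabling the module in the acceptance suite config then makes setBucket() and seeFile() available on the $I actor.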
Environment
Instead of using the WPBrowser’s WPLoader module to set up the test WordPress site we have decided to roll our own bash script to set up everything needed to run our acceptance tests. The script relies on WP-CLI and includes:
- Installing WordPress in a tmp directory inside our repo (.gitignored, of course)
- Building our plugins from source and installing them on the site
- Downloading either PhantomJS or Selenium and firing it up for the WebDriver module
- Running the Codeception tests
- Shutting down PhantomJS or Selenium
- Killing any stray Firefox instances
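The steps above can be sketched as a small shell script. Everything below is illustrative rather than our actual script — the paths, URLs, and credentials are placeholders, and it defaults to a dry-run mode that only prints the plan:

```shell
#!/bin/sh
# Sketch of a run-tests.sh bootstrap script, assuming WP-CLI, Composer, and
# Java are installed. All paths, URLs, and credentials are placeholders.
set -e

WP_DIR="tmp/wordpress"                # throwaway install, .gitignored
SELENIUM_JAR="tmp/selenium-server.jar"

# In dry-run mode (the default here) we only print the plan; the real
# script executes each command.
run() {
	if [ "${DRY_RUN:-1}" = "1" ]; then
		echo "would run: $*"
	else
		"$@"
	fi
}

# 1. Install WordPress into a temporary directory inside the repo
run wp core download --path="$WP_DIR"
run wp core install --path="$WP_DIR" --url="http://localhost:8080" \
	--title="Test Site" --admin_user="admin" \
	--admin_password="password" --admin_email="admin@example.com"

# 2. Build the plugin from source and activate it on the site
run wp plugin install ./build/plugin.zip --path="$WP_DIR" --activate

# 3. Fire up Selenium for the WebDriver module (backgrounded in a real script)
run java -jar "$SELENIUM_JAR"

# 4. Run the Codeception acceptance suite
run vendor/bin/codecept run acceptance

# 5. Tear down Selenium and any stray Firefox instances
run pkill -f selenium-server
run pkill firefox
```

Running it with DRY_RUN=0 would execute the commands for real, assuming WP-CLI and Selenium are actually available on the machine.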
An Example
I’ve put together an example repository that demonstrates everything we do when testing WP Offload S3, but in a slimmed down form.
Installation
There are a few requirements for the project with detailed installation steps in the README.md.
Once all installed, you can fire up the tests by running sh run-tests.sh.
The Scenario
At the heart of this example is the main test file, known as a ‘cept’, which is a file containing all the steps and assertions for a given scenario. Our scenario is to test the core functionality of WP Offload S3 Lite – upload a file to the WordPress Media Library and ensure it gets offloaded to S3:
<?php
$I = new AcceptanceTester( $scenario );
// Login to wp-admin
$I->loginAsAdmin();
// Navigate to the Media Library
$I->amOnPage( '/wp-admin/media-new.php' );
$I->waitForText( 'Upload New Media' );
// Add new file
$I->attachFile( 'input[type="file"]', 'team.jpg' );
// Wait for upload
$I->waitForElement( '.edit-attachment', 20 );
$I->seeElement( '.edit-attachment' );
$I->click( '.edit-attachment' );
// Navigate to the Edit Media window
$I->executeInSelenium( function ( \Facebook\WebDriver\Remote\RemoteWebDriver $webdriver ) {
	$handles     = $webdriver->getWindowHandles();
	$last_window = end( $handles );
	$webdriver->switchTo()->window( $last_window );
} );
$I->waitForText( 'Edit Media' );
// Check URL is an S3 one
$url = $I->grabValueFrom( 'attachment_url' );
$I->assertContains( 'amazonaws.com', $url ); // via the Asserts module
// Parse the URL
$url = explode( '.com/', $url );
$url_parts = explode( '/', $url[1] );
$bucket = array_shift( $url_parts );
$key = implode( '/', $url_parts );
// Check attachment has been offloaded to Amazon S3
$I->setBucket( $bucket )->seeFile( $key );
The steps are very human readable and show clearly the process that replicates what an actual tester would do: login, add a file to the media library, check that the URL is rewritten to an S3 URL, and then finally check with AWS to see if the file has been uploaded.
Watching Codeception and Selenium automate the testing of this scenario is pretty cool.
Weighing It Up
There is definitely a significant amount of work involved in getting this all set up. I even had to write a custom S3 module to fully test our plugin’s functionality, and others may find they need similar custom modules, depending on what their plugins do. There is also room for improvement. We are only testing the scenario on a single site install of WordPress, but our existing release testing process covers both the subdomain and subdirectory flavours of WordPress multisite. I expect a custom database module for Codeception will be needed to make switching between databases and installs during testing as painless as possible.
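Codeception’s environments feature looks like a natural fit for the multisite problem: the suite can define one env per WordPress flavour and the runner picks the matching overrides. The env names and URLs below are made up for illustration:

```yaml
# tests/acceptance.suite.yml (sketch) — per-environment overrides
env:
    singlesite:
        modules:
            config:
                WebDriver:
                    url: 'http://localhost:8080'
    multisite_subdomain:
        modules:
            config:
                WebDriver:
                    url: 'http://site1.localhost:8080'
```

A particular flavour would then be run with something like vendor/bin/codecept run acceptance --env multisite_subdomain.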
There is also a need for script maintenance as code and product functionality change over time. Of course, this is the same for unit tests, but it could be more involved for acceptance testing scenarios. With all this in mind, I firmly believe that having the ability to potentially replace two of our three human testers with automated tests using Codeception would ultimately benefit our release testing process. The time to test would drop significantly, as would the drain on developer time (and mood!).
Hopefully we can start building up our test scenarios and start fully integrating Codeception into our testing process in the near future. Do you perform acceptance testing on your products? Have you started to automate them with Codeception or another tool?