WP Offload S3 1.2 Released


After nearly 6 months of development, I’m pleased to announce the release of WP Offload S3 1.2. For now, you’ll need to download it from My Account. Updates through the WordPress dashboard will be enabled soon, hopefully next week.

In this iteration of the plugin we have made some significant under-the-hood improvements that fundamentally change how S3 URLs are handled. These changes should make WP Offload S3 more reliable and less of a burden on server resources. Before I go on to discuss the changes, let me explain how WP Offload S3 used to handle S3 URLs and the problems with that implementation.

The Old Implementation

It’s relatively simple. When you inserted an attachment into post content, the S3 URL was added to the editor and saved to the database. However, storing S3 URLs in the database presented a few problems:

  1. What happens if you want to change the URL format? For example, if you decided to serve your Media Library from CloudFront as opposed to S3.
  2. If you uninstalled WP Offload S3 and deleted your S3 bucket, what happens to existing content?
  3. What about private attachments? Signed S3 URLs expire after 15 minutes, so if they were embedded directly into post content they would cease to work once the expiry was reached.

Find and Replace

We could solve problems 1 and 2 easily enough by performing a find and replace on your content. You may have seen the find and replace prompt when changing your URL settings or deactivating the plugin. On small to medium-sized sites this process is relatively quick and light on server resources. But things start to get hairy when sites have thousands of posts and/or attachments. That’s before you take into consideration intermediate image sizes and post revisions. Consider the following example:

A site has 100,000 posts, each with 20 revisions, and 50,000 attachments. Each attachment has 6 intermediate image sizes. If you do the math, that’s 350,000 REPLACE queries that need to be executed over a data set of more than 2,000,000 posts and revisions. That many queries will bring most servers to a halt or, at best, cause a significant CPU spike.
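Each of those find and replace operations boils down to an UPDATE using MySQL’s REPLACE() string function, one query per URL variant. The table name and URLs below are illustrative, not taken from the plugin:

```sql
-- One of the ~350,000 queries: rewrite a single image size's URL across
-- every row of post content (table name and URLs are made up for illustration).
UPDATE wp_posts
SET post_content = REPLACE(
    post_content,
    'https://example.com/wp-content/uploads/2016/10/photo-300x200.jpg',
    'https://my-bucket.s3.amazonaws.com/wp-content/uploads/2016/10/photo-300x200.jpg'
);
```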

We could alleviate some of the performance issues by using background processing, but batch processing that many queries takes time. Having to wait 8 hours for changes to fully take effect after you update your URL settings isn’t exactly a great experience.

Private URLs

Handling private attachments is a bit more complicated. Since they expire, you can no longer just store the signed S3 URLs in the database. Our initial idea was to introduce a shortcode, something similar to:

[amazonS3 id="xxx"]

That would allow a new signed URL to be generated on each page request, thus fixing the issue. But having to use a mix of URLs and shortcodes to output attachments seemed dirty. It would also be a nightmare if you decided to remove WP Offload S3, because you couldn’t do a simple find and replace on those shortcodes.

The New Implementation

In version 1.2 of WP Offload S3 we no longer save the S3 URL to the database. When you insert an attachment into your post content, the S3 URL is added to the editor as before, but when the post is saved we filter the content and ensure only the server URL is saved to the database. We do the opposite when the post is requested by swapping out the server URL for the S3 URL.
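The round trip can be sketched in a few lines. These are simplified stand-ins with hard-coded example URLs, not the plugin’s actual code, which resolves URLs per attachment:

```php
<?php
// Simplified stand-ins for the two rewrites described above. The base URLs
// are made up for illustration; the real plugin resolves them per attachment.
const LOCAL_BASE = 'https://example.com/wp-content/uploads/';
const S3_BASE    = 'https://my-bucket.s3.amazonaws.com/wp-content/uploads/';

// On save: ensure only the local (server) URL reaches the database.
function filter_content_on_save( $content ) {
    return str_replace( S3_BASE, LOCAL_BASE, $content );
}

// On display: swap the server URL back out for the S3 one.
function filter_content_on_display( $content ) {
    return str_replace( LOCAL_BASE, S3_BASE, $content );
}
```

In WordPress, the display-side rewrite hangs off the `the_content` filter; the save-side hook point isn’t named in this post, so treat that function as a placeholder for wherever the plugin intercepts the save.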

This completely removes the need for performing a find and replace on the database. If you change your URL settings, the changes will take full effect instantly across all posts. The same goes for when WP Offload S3 is removed. Because we only store the server URL in the database, deactivating or removing WP Offload S3 will be fine so long as your files haven’t been removed from your server. Private URLs are no longer an issue either, because a new signed URL is generated on every page request.

You may be wondering what impact this has on your site load times; after all, the processing is no longer done in the background but on each individual page request. Let’s take a look at a before and after using Query Monitor.

WP Offload S3 deactivated, URLs pointing to the server:


WP Offload S3 activated, URLs rewritten to S3:


Just 1 extra query and a negligible increase in load time, which we’re pretty pleased with! We managed to keep the content filtering process relatively light by piggybacking on the responsive images feature introduced in WordPress 4.4.

These tests were performed using a fresh install of WordPress 4.6.1 on PHP 7. The results were taken from the blog index, which consisted of 10 posts, each with 2 image attachments.

Upgrade Routine

When you upgrade to WP Offload S3 1.2 a new upgrade procedure will kick off, which will trawl through your post content and replace S3 URLs with their server counterpart. Wait, but what about…

That many queries will bring most servers to a halt or, at best, cause a significant CPU spike.

Don’t worry. We have greatly improved how the find and replace works and it now consumes very little CPU. The upgrade routine will run silently in the background via WordPress cron and you can check on its progress from the WP Offload S3 settings screen.



This release also adds support for custom logos and contains a good amount of bug fixes. Check out the changelog for more details.

We have a lot lined up for WP Offload S3, so stay tuned!

About the Author

Ashley Rich

Ashley is a PHP and JavaScript developer with a fondness for solving complex problems with simple, elegant solutions. He also has a love affair with WordPress and learning new technologies.

  • Great improvement guys, thank you for making this happen. It will allow us to finally put to rest a few funky regex filters we had to use for edge cases.

  • mrjarbenne

    Will there be any impact for sites that host the content solely on S3, with the “remove local copy after upload” setting enabled?

    • No impact whatsoever. Just remember to download the images locally if you disable WP Offload S3.

      • mrjarbenne

        Thanks. That’s great.

      • This doesn’t work for me. The way it is now I don’t have a problem if somehow WP Offload S3 gets disabled. I do not ever want to have to rely on a 3rd party to be working in order for my content to be visible. I am using S3 to lessen the burden on my server, now I have to be sure all those images are double hosted? Not cool. This needs to be an option and not a requirement.

  • Sounds great – but I *only* have many of my images on S3, not my local server (using S3 for storage, after all).

    Does this mean if there’s a hiccup with Offload, the URLs aren’t transformed with each page request and are left pointing at dead content?

  • Mike B

    The update notification is not showing up in WordPress Updates, which says, “Your plugins are all up to date.”

    Do I need to manually replace version 1.1.7?

    • As per the article intro, we’re not turning on automatic updates to v1.2 just yet. If you don’t want to wait until we do, then you’ll need to download v1.2 from https://deliciousbrains.com/my-account/ and manually update your server to replace v1.1.7.

      • Mike B

        Thanks! I think I’ll wait until auto updates are turned on.

  • Tim Scully

    Not updating urls in post content actually causes issues in my setup. Is there a way to disable this feature, and *do* update the image urls in post content? Tl;dr, locally stored images aren’t accessible by clients and *only* accessible on s3. I need rendered post content to have the s3 urls.

    • The posts will render with the proper S3 URLs, it’s just that the post content in the database will hold URLs that point to the server. We filter the server URLs and turn them into S3 ones.

      • Tim Scully

        Ah that’s right. Ok, so a little more detail, I’m actually using WordPress solely as a backend for portions of my site, and writing my own queries to pull post content. Apparently bypassing the filter you have in place to replace image urls. If providing an option to convert those urls isn’t possible, could you point me in the right direction on how to create my own filter for post content image urls?

        • Tim, when querying for WordPress posts/pages, are you using WP_Query or functions such as get_posts()?

          • Tim Scully

            More like something similar to:

            “SELECT `post_content` FROM `posts` WHERE `ID` = 1”

            I’m not actually using anything from the WP core to grab post content from the database. Just straight up queries to grab the post content, then present it the way I want on my page. As I said, we’re only using WordPress as a backend to create/store/manage content.

          • Tim, I’m afraid we can’t directly support that kind of usage, as directly accessing the database content means you miss out on the WordPress API’s filters, including resolving shortcodes.

            Assuming you’re still using PHP and able to access the WP libraries but can’t use the query API, then you’ll need to recreate WP’s use of some funky regex to detect the class of the img tags in the content, detect the embedded Attachment ID and then use a function such as wp_get_attachment_image_src to get the filtered URL for the Attachment ID.

            Seems like a lot of work. I’d personally just switch to grabbing WP_Posts via the query API, but I guess you might have other factors in play that prevent that.
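            A minimal sketch of that detection step, based on the `wp-image-{ID}` class WordPress adds to images inserted via the editor (an assumption modeled on core’s markup, not the plugin’s actual code):

```php
<?php
// Pull attachment IDs out of the wp-image-{ID} class that WordPress adds
// to images inserted through the editor. Modeled on core's markup
// convention; this is not the plugin's actual implementation.
function extract_attachment_ids( $content ) {
    preg_match_all( '/class="[^"]*wp-image-(\d+)/', $content, $matches );
    return array_map( 'intval', $matches[1] );
}
```

            Each ID could then be passed to a function such as wp_get_attachment_image_src() to fetch the filtered URL.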

          • Tim Scully

            I figured as much, but I also figured it wouldn’t hurt to ask. No worries though! You’ve pointed me in the right direction toward a solution. I appreciate the responses! 🙂

  • Henrique Mattos

    This sounds great. For me this is especially awesome since I’m changing bucket name, applying CloudFront and switching to HTTPS a lot over the last few months, searching for the best approach to serve my content. Great to hear about these updates. Thanks a lot!

  • “In version 1.2 of WP Offload S3 we no longer save the S3 URL to the database. When you insert an attachment into your post content, the S3 URL is added to the editor as before, but when the post is saved we filter the content and ensure only the server URL is saved to the database. We do the opposite when the post is requested by swapping out the server URL for the S3 URL.”

    Does this mean there is a call to your web service for each call to an image on my site?

    What happens if you change to a totally paid plugin that is no longer supported for those of us that cannot possibly pay for the service the way you have it priced?

    Plugins get disabled for a variety of reasons and the fact that if your plugin gets disabled my entire network of sites have no images is a huge problem for me. I use S3 to move images off my server, not keep them both on S3 and on my server.

    • > Does this mean there is a call to your web service for each call to an image on my site?

      No, everything is done on your server, there is no communication with ours when filtering content.

      > What happens if you change to a totally paid plugin that is no longer supported for those of us that cannot possibly pay for the service the way you have it priced?

      It’s a GPLv2 plugin, and the Lite version is available from both GitHub and the wp.org plugins directory/svn. We wouldn’t close source it, we believe in open source, and we couldn’t even if we did want to.

      > Plugins get disabled for a variety of reasons and the fact that if your plugin gets disabled my entire network of sites have no images is a huge problem for me.

      The old way of altering your saved content in the database was a very brittle and problematic way of doing things that hindered you from easily switching between URL formats and CDNs, and prevented us from implementing some much-requested features. We are now “doing the right thing” and ensuring that your content is left untouched and more flexible in its use by other plugins and WordPress itself.

      It does mean that if you use the “Remove Files From Server” feature then you should be careful to not deactivate WP Offload S3 before downloading your images from S3. Much in the same way you would need to remove your shop pages and related content if you wanted to stop using WooCommerce. Although preparing for removing WP Offload S3 is a one button click, not so much for WooCommerce!

      • I think this needs to be an option and not an automatic for existing content. I have tens of thousands of images and your change could potentially break my site.

        This is not a good thing.

        I do not want to have to rely on a plugin being live on my site for my site to continue to work. There were issues with photon by Jetpack that already has broken big parts of my site.

        WooCommerce is not part of this discussion or the issues I will have.

        • Thanks for sharing your concerns Deborah.

          I’ve raised a feature request for the team to consider adding a switch to enable find and replace of URLs in content and save back to the database.

          • Thank you. I will not update until or unless this is in place. I will look for a replacement instead.

          • Hey Deborah, are you a customer of ours? I can’t seem to find you in our systems.

            We are planning to add support for installing WP Offload S3 as a must-use plugin which would alleviate your primary concern of our plugin becoming deactivated unintentionally.

            Also, I would recommend setting up staging sites and applying plugin updates to those and testing before rolling out updates to production sites. Updating any plugin presents a risk of breaking a site and using staging sites mitigates this risk.

          • I have been using your plugin on a number of sites since its original release under the old name. I am not a paid customer because your pricing model is way out of whack and would cost me more than my sites make in revenue. I have tens of thousands of images that get uploaded through my sites through user-submitted content (authors uploading book covers). So yes I am a “customer” and no I’m not a paid customer. I would be willing to pay if you had a model that wouldn’t break the bank.

          • Any update to this? If it isn’t going to be an option I need to start looking for a new plugin.

          • Hi Deborah,

            You can continue to use the pre v1.1 version of WP Offload S3 Lite that modifies your content indefinitely, there is nothing compelling you to upgrade.

            However, we have yet to determine whether adding an option to modify content is feasible, let alone when that might be implemented. If that concerns you, then you should probably seek out other solutions.

          • It is really discouraging when developers do things like this that impact websites without considering the consequences. This negates the value of your plugin. I use it based on initial core features that you don’t expect to change after years of implementation. Several SaaS companies have done similar things this year, losing a boatload of customers in the process.

            Change your core, and change your customer base. Or maybe this is your plan. Get rid of the loyal long term customers and cater to a new group of customers in the future.

            I’ve recommended to all of my team and current clients not to update this plugin. Which means changing to something else in the future since not upgrading plugins leads to security issues. So you have your customers held hostage.

          • Hi Deborah,

            I’m sorry you feel this way about the improvements we’ve made to WP Offload S3, especially as they address a huge portion of the support requests we see for the plugin. The majority of both our paid customers and users of the free Lite plugin have literally been crying out for a better mechanism than Find & Replace of database content.

            We’ve worked incredibly hard on this release, and as always, have done an immense amount of testing. We also test AWS, WP Offload S3 Lite, WP Offload S3, and all the addons with every new version of WordPress, from early on in their beta release cycle, up to and beyond release. We regression test every new plugin version against older versions of WP too, along with a matrix of PHP versions on different web servers and host configurations. We do an incredible amount of work to ensure we have a stable, quality product.

            I firmly believe that the new version would be a great upgrade for yourself, and wish you would give it a try.

  • Nir

    Hi guys, I would like to check with you something regarding the search and replace procedure that you are doing as part of the upgrade (actually it also relates to the new filtering on the post content): are you doing it only on the post content field? The reason I’m asking is that images (image URLs) exist not only in the post content field. For example, in WooCommerce, when you’re editing the product page, you have the product description and the product short description fields. You can store images in both fields, so if you are only looking at the post content field, we will have a problem. Also, there are plugins that use their own tables, with fields holding content and images, for example the Event Manager plugin. I had all these issues initially when I installed the plugin and moved all the images to S3, and I had to manually change URLs in the DB. Please advise…

    • We filter `the_content` so the product descriptions are taken care of in WooCommerce. We don’t currently support postmeta or custom tables (unless the plugins are applying `the_content` filter).

      • Nir

        Thanks Ashley,

        WooCommerce uses the excerpt field for the product description (short description), and the post content field for the long description, as far as I know.

        As far as I understand, we will have a mixture of image sources in the DB – the post content will have the new style of image sources, and all other places will have the old style…

        I wonder what happens when a new image is inserted to the excerpt field…
        Will it have the local image path while the image itself is on S3 ?
        Sounds like an issue…

        Also, I upgraded the plugin on 4 websites a day ago. Today I looked at the DB and the old image URLs are still there – it looks like the search and replace of the upgrade never ran…

        Please let me know what your thoughts on these issues..

        Thanks much,

        • I’ve added an issue for the team to investigate supporting excerpts. The find and replace will only run on `post_content`, which is the only column we have ever supported. We are planning on adding a public function which developers can run their content through to replace the URLs.

          In regards to the find and replace not working, please raise a support ticket and we will look into the issue for you.

  • Jeff Dembinski

    The database was scoured for the URLs and finished, and then the meta was updated. However, I have the EDD Addon as well, and none of my digital downloads will download any longer. All of my users are seeing 0-byte files.

  • John Armstrong

    Ashley, 1.2 is really really fantastic, great work. I can release without going to maintenance mode during syncs!


  • This is awesome news! Do you think you might ever spin out a Google Cloud Engine version of WP Offload? The prices are better and getting set up is much more intuitive.

    P.S. Where. Is. Mergebot! Killin me over here lol

    • We don’t have plans at the moment, but we will certainly consider it if we have enough requests.

  • Did this already occur????

    WP Offload S3 Lite 1.1 – 2016-10-29

    New: Filter post content. S3 URLs will no longer be saved to the database
    New: Upgrade routine to replace all S3 URLs in content with local URLs

    • Yup, WP Offload S3 Lite 1.1 has been out over 2 weeks now.

      • I meant the url replacement that wasn’t supposed to happen until 1.2 – it looks to have been included in the 1.1 release???

        • You’re using WP Offload S3 Lite, this is the release post for the paid version. As they’re different plugins they have different version numbers.

          • So you are telling me this already happened in the Lite version without warning????

            Wow. Way to screw your users.

      • Plus how is it dated for a date in the future when it already occurred?

        • Thanks for the heads up. Dates have been fixed.

  • Chris

    I am using this on a site that hosts JSON of my posts to be used in populating an iOS app. The problem is that the function that creates my JSON on post update no longer puts the S3 URL in my JSON because it’s not being saved in the post content. Is there a filter that I can run the content through on my JSON creation page?

    • You should just be able to use `apply_filters( 'the_content', $content )`

  • Liam O’Boyle

    The reasons given do make sense but this has been a painful shift for us; we had several problems with this.

    W3 Total Cache was also trying to rewrite URLs (as we had legacy content that had non-S3 URLs), so it was necessary to disable this (as with this update WP Offload can handle those legacy URLs).

    WP Offload’s local-to-s3 rewriting had issues with the domain mapping that we use for WP’s multisite; the URLs in the content are like “site.network.com” (where site varies), whereas the info returned from wp_upload_dir has a baseurl with the domain mapped site in it, e.g. “site.com”, so the rewriting decided the domain was already not local and that it didn’t need to be rewritten. This was fixed by filtering wp_upload_dir and rewriting the baseurl back to “site.network.com”.

    There was a lot of invalid metadata in _wp_attached_file and _wp_attachment_metadata; the files had a complete path including “/var/www/” etc in there which again convinced WP Offload to not bother rewriting them. This was fixed by scanning all of the metadata for the invalid URLs, extracting and patching it and rewriting.

    So… good move, but a pain. Like others have mentioned, the old behaviour worked well for us. The servers are ephemeral and any changes (like uploaded files) get thrown away anyway, so the S3 version is the only one we need and we will always be rewriting them *somewhere*. An option to disable this behaviour would be appreciated.
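    The wp_upload_dir workaround Liam describes might look something like this. The domain names are his examples and the callback is a hypothetical reconstruction, not his actual code:

```php
<?php
// Map the domain-mapped upload baseurl back to the network domain so that
// local URLs in content are recognised as rewritable. Hypothetical
// reconstruction of the workaround described in the comment above; the
// naive str_replace is for illustration only.
function map_upload_baseurl( $dirs ) {
    $dirs['baseurl'] = str_replace( 'site.com', 'site.network.com', $dirs['baseurl'] );
    return $dirs;
}

// In WordPress this would be attached with:
// add_filter( 'upload_dir', 'map_upload_baseurl' );
```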

  • evsnm

    As far as I can tell, the upgrade to WP Offload S3 1.2 is still not scheduled to download through the WordPress dashboard. Is there a timeline for this? Thanks.

    • Auto updates have been enabled today. Please let us know if you have any issues seeing the new version.

      • evsnm

        Thank you for the prompt response. I’m seeing version 1.2.2 in the dashboard now.

  • Aboufirass Hamza

    Great improvement! A question before I go ahead and upgrade. The majority of my files (videos) are stored on S3 copied over using WP Offload S3 once I installed it and I turned on Remove files from local server.
    1) Would this upgrade require me to download these files back to Local Server?
    2) Would this plugin still work if transferred over to the MU Plugins folder?
    3) If I were to disable the plugin without downloading files to local server, and then re-enable it after some time, would I still be able to link back to the s3 files?
    Thank you for answering these few questions before I go ahead with the upgrade

    • 1. No download is required
      2. There’s no reason it shouldn’t, but we don’t officially support running as an MU plugin
      3. We don’t remove your data on plugin deactivation, so you should be able to re-activate at any time and begin serving from S3 once again

      • Aboufirass Hamza

        Thank you.

  • Will Woodward

    This update seems to have broken my setup. Even in the backend edit post page, the images in the content are referencing the (non-existent) local URLs instead of my CDN’s URLs.

    Is there something I’m doing wrong, perhaps? How would I go about debugging this one?

  • Manish Yadav


    How do we trigger the WP Offload S3 plugin when we create an attachment with the wp_insert_attachment function in the backend? Are there any actions we can call?
    It is not uploading files to S3 when files are uploaded through custom file inputs rather than the media uploader. Is that correct? Please help.

    • WP Offload S3 hooks into `wp_update_attachment_metadata` to perform the upload. The sample shown in the codex should work, notice how `wp_update_attachment_metadata` is called after `wp_insert_attachment`:


      • Manish Yadav

        Thanks Ashley. We are using similar code from the Codex, but we are still not able to get it uploaded to S3. Please have a look at our code. Is it necessary to give a parent post ID? Thanks in advance.

        // $filename should be the path to a file in the upload directory.
        $wp_upload_dir = wp_upload_dir();
        $filename = $wp_upload_dir['path'] . '/' . $file['name'];

        // Check the type of file. We'll use this as the 'post_mime_type'.
        $filetype = wp_check_filetype( basename( $filename ), null );

        // Prepare an array of post data for the attachment.
        $attachment = array(
            'ID'             => $attachment_id,
            'guid'           => $wp_upload_dir['url'] . '/' . basename( $filename ),
            'post_mime_type' => $filetype['type'],
            'post_title'     => preg_replace( '/\.[^.]+$/', '', basename( $filename ) ),
            'post_content'   => '',
            'post_status'    => 'inherit'
        );

        // Insert the attachment.
        $attach_id = wp_insert_attachment( $attachment, $filename, 0 );

        // Generate the metadata for the attachment, and update the database record.
        $attach_data = wp_generate_attachment_metadata( $attachment_id, $filename );

        wp_update_attachment_metadata( $attachment_id, $attach_data );

        set_post_thumbnail( 0, $attachment_id );

    • Pierre

      Hi Manish,
      Did you manage to make it work?
      I have the same issue…