Using the JavaScript FileReader API to Avoid File Upload Limits

If you’ve ever spent any amount of time messing with PHP configuration files to get a file to upload, you know that uploading large files can be a real pain. You have to find the loaded php.ini file, edit the upload_max_filesize and post_max_size settings, and hope that you never have to change servers and do all of this over again.
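
For reference, those two settings live in php.ini and look something like this (the values below are just examples; whatever your host ships with will vary):

upload_max_filesize = 64M
post_max_size = 64M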

I ran into this problem myself while working on WP Migrate DB Pro – one of the features going into the next release is the ability to upload and import an SQL file. Since WP Migrate DB Pro is used on a wide variety of servers, I needed a reliable way to upload large files without hitting upload limits.

Meet the JavaScript FileReader API. It’s an easy way to read and process any sort of file without the need to upload the file to the server first. FileReader is now supported in all major browsers including Internet Explorer 10, making it a viable solution for just about any project.
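
Before we get into the plugin itself, here's a minimal sketch of the API in action: it reads a user-selected file into memory as a data URL. The markup assumed here is nothing more than a bare file input on the page.

var input = document.querySelector( 'input[type="file"]' );

input.addEventListener( 'change', function() {
    var reader = new FileReader();

    reader.onload = function( event ) {
        // The file contents are now in memory as a Base64-encoded data URL
        console.log( event.target.result );
    };

    reader.readAsDataURL( input.files[0] );
} );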

With that in mind, let’s create a sample WordPress file upload plugin to take a look at the FileReader API and learn how to use it to process large file uploads.

Getting Started

Since the FileReader API is baked right into the browser, the HTML side of things is very simple and relies on a basic HTML file input field:

<form>
    <input type="file" name="dbi_import_file" /><br><br>
    <input type="submit" value="Upload" />
</form>

To make things easier we’re going to create a small class to contain most of our code and place the above form inside a WordPress dashboard widget:

<?php
/**
 * Plugin Name: DBI File Uploader
 * Description: Upload large files using the JavaScript FileReader API
 * Author: Delicious Brains Inc
 * Version: 1.0
 * Author URI: https://deliciousbrains.com
 * Plugin URI: https://deliciousbrains.com/using-javascript-file-api-to-avoid-file-upload-limits/
 */

class DBI_File_Uploader {

    public function __construct() {
        add_action( 'admin_enqueue_scripts', array( $this, 'enqueue_scripts' ) );
        add_action( 'wp_dashboard_setup', array( $this, 'add_dashboard_widget' ) );
        add_action( 'wp_ajax_dbi_upload_file', array( $this, 'ajax_upload_file' ) );
    }

    public function enqueue_scripts() {
        $src = plugins_url( 'dbi-file-uploader.js', __FILE__ );
        wp_enqueue_script( 'dbi-file-uploader', $src, array( 'jquery' ), false, true );
        wp_localize_script( 'dbi-file-uploader', 'dbi_vars', array(
            'upload_file_nonce' => wp_create_nonce( 'dbi-file-upload' ),
        ) );
    }

    public function add_dashboard_widget() {
        wp_add_dashboard_widget( 'dbi_file_upload', 'DBI File Upload', array( $this, 'render_dashboard_widget' ) );
    }

    public function render_dashboard_widget() {
        ?>
        <form>
            <p id="dbi-upload-progress">Please select a file and click "Upload" to continue.</p>

            <input id="dbi-file-upload" type="file" name="dbi_import_file" /><br><br>

            <input id="dbi-file-upload-submit" class="button button-primary" type="submit" value="Upload" />
        </form>
        <?php
    }
}

new DBI_File_Uploader();

With that code in place, a very basic file upload form should appear when we visit the WordPress dashboard:

Screenshot of upload form

Uploading the File

The form above doesn’t do anything yet, so let’s create the dbi-file-uploader.js file that was enqueued above and add a simple click handler for the upload button. The handler initializes the FileReader object, grabs the selected file from the input, and then calls the upload_file() function to start the upload:

(function( $ ) {

    var reader = {};
    var file = {};
    var slice_size = 1000 * 1024; // upload the file in chunks of roughly 1 MB (1,024,000 bytes)

    function start_upload( event ) {
        event.preventDefault();

        reader = new FileReader();
        file = document.querySelector( '#dbi-file-upload' ).files[0];

        upload_file( 0 );
    }
    $( '#dbi-file-upload-submit' ).on( 'click', start_upload );

    function upload_file( start ) {

    }

})( jQuery );

Now we can start working on the upload_file() function that will handle most of the heavy lifting. First we create a blob containing a small chunk of the file using the slice() method that File objects inherit from Blob:

function upload_file( start ) {
    var next_slice = start + slice_size + 1;
    var blob = file.slice( start, next_slice );
}
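
To make the chunk boundaries concrete, here's how the first two calls line up using the slice_size of 1000 * 1024 (1,024,000 bytes) defined earlier. Since slice()'s end index is exclusive, consecutive chunks neither overlap nor skip bytes:

// upload_file( 0 ):       file.slice( 0, 1024001 )       -> bytes 0 through 1024000
// upload_file( 1024001 ): file.slice( 1024001, 2048002 ) -> bytes 1024001 through 2048001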

We’ll also need to define an onloadend handler inside the upload_file() function that will run once the FileReader has finished reading the chunk:

reader.onloadend = function( event ) {
    if ( event.target.readyState !== FileReader.DONE ) {
        return;
    }

    // At this point the chunk's contents are available in event.target.result
};

Now we need to tell the FileReader API to read a portion of the file. We can do that by passing the blob of file data that we created to the FileReader object:

reader.readAsDataURL( blob );

It’s worth noting that we’re using the FileReader.readAsDataURL() method here, instead of the FileReader.readAsText() or FileReader.readAsBinaryString() methods that are also mentioned in the FileReader documentation.

In this case the FileReader.readAsDataURL() method is much more reliable than the other methods because the contents of the file are read out as a Base64-encoded string rather than as plain text or raw binary. This matters because a string of plain text or binary data will likely run into encoding or sanitization issues when sent to the server via AJAX. A Base64-encoded string, on the other hand, contains only the A-Z, a-z, 0-9, +, /, and = characters, and is easy to decode with PHP or any other server-side language.
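
For example, a small text file containing "Hello, World!" comes out of readAsDataURL() looking like this (the MIME type will vary with the file being read):

data:text/plain;base64,SGVsbG8sIFdvcmxkIQ==

That ';base64,' marker is also what the PHP code later in this article splits on before decoding the payload.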

Let’s fill out the rest of the function by adding the AJAX call that is responsible for POSTing the data to the server, and by recursively calling the upload_file() function again when the request has completed. Here’s what the upload_file() function looks like in its entirety:

function upload_file( start ) {
    var next_slice = start + slice_size + 1;
    var blob = file.slice( start, next_slice );

    reader.onloadend = function( event ) {
        if ( event.target.readyState !== FileReader.DONE ) {
            return;
        }

        $.ajax( {
            url: ajaxurl,
            type: 'POST',
            dataType: 'json',
            cache: false,
            data: {
                action: 'dbi_upload_file',
                file_data: event.target.result,
                file: file.name,
                file_type: file.type,
                nonce: dbi_vars.upload_file_nonce
            },
            error: function( jqXHR, textStatus, errorThrown ) {
                console.log( jqXHR, textStatus, errorThrown );
            },
            success: function( data ) {
                var size_done = start + slice_size;
                var percent_done = Math.floor( ( size_done / file.size ) * 100 );

                if ( next_slice < file.size ) {
                    // Update upload progress
                    $( '#dbi-upload-progress' ).html( 'Uploading File - ' + percent_done + '%' );

                    // More to upload, call function recursively
                    upload_file( next_slice );
                } else {
                    // Update upload progress
                    $( '#dbi-upload-progress' ).html( 'Upload Complete!' );
                }
            }
        } );
    };

    reader.readAsDataURL( blob );
}

It’s still relatively simple in terms of functionality, but that should be enough to get the file upload going on the client side.

Saving Chunks Server-Side

Now that JavaScript has handled splitting the file up and POSTing the chunks to the server, we need to re-assemble and save those chunks with PHP. To do that, we’re going to add the ajax_upload_file() method to our main plugin class:

public function ajax_upload_file() {
    check_ajax_referer( 'dbi-file-upload', 'nonce' );

    $wp_upload_dir = wp_upload_dir();
    // sanitize_file_name() strips directory separators so the request can't escape the uploads folder
    $file_path     = trailingslashit( $wp_upload_dir['path'] ) . sanitize_file_name( $_POST['file'] );
    $file_data     = $this->decode_chunk( $_POST['file_data'] );

    if ( false === $file_data ) {
        wp_send_json_error();
    }

    file_put_contents( $file_path, $file_data, FILE_APPEND );

    wp_send_json_success();
}

public function decode_chunk( $data ) {
    $data = explode( ';base64,', $data );

    if ( ! is_array( $data ) || ! isset( $data[1] ) ) {
        return false;
    }

    $data = base64_decode( $data[1] );
    if ( ! $data ) {
        return false;
    }

    return $data;
}

The example above is about as simple as it gets – the ajax_upload_file() method does a quick nonce check and then decodes the data via decode_chunk(). If the data can be successfully decoded from the Base64-encoded data URL, it is appended to the file and the upload continues.
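
To see what decode_chunk() is doing with each chunk, here's the "Hello, World!" example from earlier run through the same logic (the string is illustrative only):

$data  = 'data:text/plain;base64,SGVsbG8sIFdvcmxkIQ==';
$parts = explode( ';base64,', $data );

// $parts[0] is 'data:text/plain'; $parts[1] is the Base64 payload
echo base64_decode( $parts[1] ); // Hello, World!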

With that in place we should be able to run our uploader and the file should get saved to the path we designated:

The completed file uploader

And that’s it: we have a working (although relatively basic) large file uploader! If you’ve been following along and want to see the full code, I’ve uploaded it to GitHub so you can take a look.

Conclusion

I really like how relatively simple it is to create a file uploader that can handle huge files without needing to adjust any settings server-side. It’s always interesting when technologies that were just pipe dreams a few years ago become commonplace and greatly improve today’s workflows.

It’s worth noting that there are several existing JavaScript libraries like FineUploader and jQuery File Upload that can also upload large files, and in many cases, it makes more sense to use an existing library instead of reinventing the wheel. At the same time, it never hurts to have at least a basic understanding of what is going on behind the scenes in case you ever need to fix a bug or implement a new feature.

If you’re going to implement something similar to this in a production application, you would definitely want to research any potential security implications. This could include additional client-side and server-side file type validation, disallowing uploads for files with executable file extensions, and making sure that uploaded files have a random string in the filename to mitigate some potential vulnerabilities. You may also want to implement a way for users to pause or cancel the upload and come back to it later, and log any errors that come up during the upload.
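
As a starting point, here's a minimal sketch of what a more defensive ajax_upload_file() might look like. The allowed extension list and the 100 MB cap are hypothetical example values, and this is nowhere near a complete security review:

public function ajax_upload_file() {
    check_ajax_referer( 'dbi-file-upload', 'nonce' );

    // Hypothetical example values; tune these for your application
    $allowed_extensions = array( 'sql' );
    $max_file_size      = 100 * 1024 * 1024; // 100 MB

    $file_name = sanitize_file_name( $_POST['file'] );
    $extension = strtolower( pathinfo( $file_name, PATHINFO_EXTENSION ) );

    if ( ! in_array( $extension, $allowed_extensions, true ) ) {
        wp_send_json_error( 'This file type is not allowed.' );
    }

    $wp_upload_dir = wp_upload_dir();
    $file_path     = trailingslashit( $wp_upload_dir['path'] ) . $file_name;

    // Stop the upload if the file has already grown past the cap
    if ( file_exists( $file_path ) && filesize( $file_path ) > $max_file_size ) {
        unlink( $file_path );
        wp_send_json_error( 'The file is too large.' );
    }

    $file_data = $this->decode_chunk( $_POST['file_data'] );

    if ( false === $file_data ) {
        wp_send_json_error();
    }

    file_put_contents( $file_path, $file_data, FILE_APPEND );

    wp_send_json_success();
}

Keep in mind that wp_send_json_error() responds with HTTP 200 by default, so jQuery's success callback still fires; the client-side code would also need to check the success flag on the response and stop the recursive upload_file() calls when it's false.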

Have you ever written a script to handle large file uploads? If so, did you take a similar approach using the FileReader API or did you use something completely different? Let me know in the comments below.

About the Author

Matt Shaw

Matt is a WordPress plugin developer located near Philadelphia, PA. He loves to create awesome new tools with PHP, JavaScript, and whatever else he happens to get his hands on.

  • Jeremy Benson

    This is really cool. Previously, to handle large uploads, I’ve used integrations with Dropbox and AWS S3, but they never felt like really seamless solutions. Definitely would want to research the security before implementing a custom upload function though.

  • Clifford P

    So it never runs into https://codex.wordpress.org/Function_Reference/wp_max_upload_size either, right? Can you navigate away from the Dashboard widget and it continues the upload?

    • Matt Shaw

      Yep, the WordPress max upload size shouldn’t be a problem. You do have to stay on the page while it uploads, but I’m sure it could be tweaked so that it resumes from where it was left off.

  • jvanpelt

    “…one of the features that will be going into the next release is the ability to upload and import an SQL file.” I can stop reading the article there and come away happy! I’ve been hoping this would be added to the Pro version!

  • Jason

    Excellent tutorial. So where would you recommend imposing a secure size limit so users can’t simply upload any size file? Perhaps even on a per-uploader basis, but where a hacker couldn’t bypass it.

    • I’d do that sort of check in your ajax handler function. Either before or after you add the latest chunk of data to your file with file_put_contents(), just use filesize() to see the current size of the file, and if it’s larger than your set limit (this can be static, based on the current user’s role, etc.), return an error message that will halt your client-side uploader.

      • Matt Shaw

        Beat me to it! 🙂

      • Jason

        Right on. So I’d probably do a JS check as well, but not assume that it’s secure, and do a server-side check too. Thanks!

    • Matt Shaw

      Thanks! You could check the filesize during the upload in the ajax_upload_file() method and delete it or stop the upload if it reaches a certain size. Limiting the upload functionality to certain users or roles can go a long way here as well.

      • Jason

        Yes! Roles, users, and context — albeit the context will be a bit hard to securely determine from an AJAX request.

  • Bob Chip

    Seems to me like this bit of code:

    if ( next_slice < file.size ) {
        // Update upload progress
        $( '#dbi-upload-progress' ).html( 'Uploading File - ' + percent_done + '%' );

        // More to upload, call function recursively
        upload_file( next_slice );
    } else {
        // Update upload progress
        $( '#dbi-upload-progress' ).html( 'Upload Complete!' );
    }

    Will miss the last slice of the file. If slice_size is 10, total file size is 15, then the first slice goes through correctly, but the 2nd slice will evaluate 20 < 15 as false, and so those last 5 bytes will be lost as it branches to "upload complete". Am I misinterpreting that?

    • No, it will work fine. That’s because next_slice is not bumped up until the next call to upload_file(), where its previous value is used as the starting value.

      So the correct run-through of the code is this:

      1) upload_file( start = 0 )
      2) next_slice = start + 10 + 1 = 11
      3) success: 11 < 15 is true
      4) upload_file( start = 11 )
      5) next_slice = start + 10 + 1 = 22
      6) success: 22 < 15 is false
      7) Upload Complete!

  • Also, another cool thought would be to pass the max upload size to your JS, so that you can reduce the number of AJAX requests. Of course it also might not be a bad idea to introduce a max cap, since some servers might have a pretty high limit (I locally have a 100 MB upload limit and a 200 MB post max size, but I also have a very high memory limit). Not sure how Base64 encoding adjusts file size, and whether it’s safe to read 7 MB of data if the upload limit is 8 MB.

  • Aloade

    Hi,

    OK, a little comment, because I wasted a lot of time because of the information on this page :/

    The FileReader API is not meant for feeding an XMLHttpRequest; it’s for rendering a file in the browser. With FileReader, data is cached and manipulated asynchronously and can’t be cleared by a script, so if you need to send 2-3 files of 1 GB each, the web page will crash or the FileReader API will report a “file not found” error because it runs out of memory. What I’d suggest instead:

    1 – Data is stored in a FormData object or an input (during the reading phase the files are handled locally on disk, so nothing sits in memory)

    2 – Data is cut up and sent directly with a POST XMLHttpRequest (avoid the PUT method for security reasons), and only the current chunk is parsed/cached/sent. This way no additional overhead is sent (unlike the readAsDataURL() method, which sends a huge number of extra bytes to the server), and there is no impact on memory.

    3 – On the server side, the data lands in the $_FILES variable (so there’s no need to parse/decode chunks). Chunks are appended to a temporary file (the file’s ID is sent during step 2), and with the last chunk the file is moved to its final location.