How to fix PHPUnit “Too many open files” errors using PhpStorm with Laravel Homestead

  1. Log in to your Homestead environment, and update /etc/security/limits.conf to include the following line: “vagrant soft nofile 4096”.
  2. Restart the Homestead box using “vagrant reload”.
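The edit in step 1 can be scripted so it’s idempotent. This is just a sketch: on the Homestead VM you’d point LIMITS_FILE at /etc/security/limits.conf and run it with sudo, but it defaults to a temp file here so it’s safe to try anywhere.

```shell
#!/bin/sh
# Append the open-files limit for the vagrant user, idempotently.
# On the Homestead VM, set LIMITS_FILE=/etc/security/limits.conf
# (and run with sudo); the temp-file default is just for a dry run.
LIMITS_FILE="${LIMITS_FILE:-$(mktemp)}"

grep -q '^vagrant soft nofile' "$LIMITS_FILE" \
    || echo 'vagrant soft nofile 4096' >> "$LIMITS_FILE"

cat "$LIMITS_FILE"
```

After step 2’s vagrant reload, running ulimit -Sn from a fresh SSH session inside the VM should report the new limit.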

How to enable Wi-Fi Calling with an iPhone, Verizon Wireless, and ASUS RT-AC68P router

With the release of iOS 9.3, Verizon Wireless customers with an iPhone 5c or later now have the option to use Wi-Fi Calling, which allows you to make and receive calls on Verizon’s network using Wi-Fi in places where Verizon’s wireless network is weak or non-existent.

First, enable Wi-Fi Calling on your iPhone (see Apple’s instructions at https://support.apple.com/en-us/HT203032).

You also may need to make some configuration changes on your router, depending on the model. On an ASUS RT-AC68P, I had to enable IPSec Passthrough (WAN → NAT Passthrough → IPSec Passthrough).

When the phone detects a weak Verizon Wireless signal, it will automatically switch to Wi-Fi Calling. You’ll know it’s switched because the carrier banner will change from “Verizon” to “VZW Wi-Fi”. To force Wi-Fi Calling, you can enable Airplane Mode, and then enable Wi-Fi.

Note that (unlike AT&T, Sprint, and T-Mobile) Verizon Wireless doesn’t allow Wi-Fi Calling from iCloud-connected devices, so you’ll still need your iPhone nearby in order to make or receive calls on Verizon’s network from your iPad or Mac.

How to copy a Heroku Postgres database into Laravel Homestead

These instructions assume you are using PostgreSQL on Laravel Homestead with the default configuration (a database named “homestead”, with user “homestead” and password “secret”), and that you’ve already installed the Heroku Toolbelt inside the Homestead virtual machine (see https://toolbelt.heroku.com/debian).

  1. Drop the existing “homestead” database from the Homestead PostgreSQL server by typing PGUSER=homestead PGPASSWORD=secret PGHOST=localhost dropdb homestead

    vagrant@homestead:~/Code/Laravel$ PGUSER=homestead PGPASSWORD=secret PGHOST=localhost dropdb homestead

  2. Pull a copy of the Heroku Postgres database to the Homestead PostgreSQL server, replacing the “homestead” database you just deleted, by typing PGUSER=homestead PGPASSWORD=secret PGHOST=localhost heroku pg:pull DATABASE_URL homestead (enter the local database password again when prompted)

    vagrant@homestead:~/Code/Laravel$ PGUSER=homestead PGPASSWORD=secret PGHOST=localhost heroku pg:pull DATABASE_URL homestead
    pg_dump: reading extensions
    pg_restore: connecting to database for restore
    Password:
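Taken together, the two steps can be wrapped in a small shell script. This is only a sketch, assuming the default Homestead credentials and an installed Heroku CLI; the run helper (a name I’ve made up here) prints each command instead of executing it unless you set DRY_RUN=0, so you can preview the sequence first.

```shell
#!/bin/sh
# Sketch: refresh the local "homestead" database from Heroku.
# Assumes the default Homestead credentials and an installed Heroku CLI.
# DRY_RUN defaults to 1 (print commands only); set DRY_RUN=0 to execute.
export PGUSER=homestead PGPASSWORD=secret PGHOST=localhost

run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then echo "$*"; else "$@"; fi
}

run dropdb homestead                       # step 1: drop the local copy
run heroku pg:pull DATABASE_URL homestead  # step 2: pull the Heroku database
```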

How to Upgrade the Laravel Homestead Vagrant box

It might seem that upgrading your Laravel Homestead environment should be as easy as running vagrant box update. You probably realized something was wrong when you tried to remove the previous version of the box, and got an error like this:

Box ‘laravel/homestead’ (v0.4.0) with provider ‘virtualbox’ appears to still be in use by at least one Vagrant environment. Removing the box could corrupt the environment. We recommend destroying these environments first:

The problem is that Vagrant doesn’t know how to migrate the changes from one virtual hard drive image into another virtual hard drive image. Updating requires you to start from a new hard drive image, then reapply changes through file synchronization and database migrations/seeds. While it’s not quite as simple as running vagrant box update, it’s still pretty easy if you’ve prepared and know the commands to run.

To upgrade your Laravel Homestead environment, follow these steps:

  1. You shouldn’t have any data inside the Homestead virtual machine that can’t be recreated from other sources. You should already be using Migrations and Seeders to create your development database, and your Laravel application files should be synchronized from the host’s filesystem. If not, and there’s any data you need to preserve from the virtual machine, make a backup now and keep it somewhere outside your Homestead virtual machine.
  2. Run the vagrant box update command to download the latest version of the Laravel Homestead Vagrant box.
  3. Next run the vagrant destroy command to destroy the existing Homestead environment.
  4. Now you’re ready to run the vagrant up command, which will bring up and provision the new Homestead Vagrant environment using the latest version of the laravel/homestead box. This will also run any provisioning commands in ~/.homestead/after.sh (for instance, to re-install packages), and also synchronize files from the host file system.
  5. Log in to the Homestead virtual machine, and from your Laravel application directory, run the php artisan migrate --seed command to migrate and re-seed the database.
  6. (Optional) Find any old versions of the Homestead box by running vagrant box list, and delete them with vagrant box remove laravel/homestead --box-version version:

    C:\Users\David\Homestead>vagrant box list
    laravel/homestead (virtualbox, 0.4.0)
    laravel/homestead (virtualbox, 0.4.2)

    C:\Users\David\Homestead>vagrant box remove laravel/homestead --box-version 0.4.0
    Removing box ‘laravel/homestead’ (v0.4.0) with provider ‘virtualbox’…

How to Add xdebug to Laravel Homestead

In version 0.4.0 (and earlier versions) of the Laravel Homestead Vagrant box (laravel/homestead), xdebug was not included as part of the default installation. This presents a problem when trying to use the remote debugging features in PhpStorm, for example.

If you’re running version 0.4.0 or earlier of the Homestead Vagrant box, follow these steps to add the php-xdebug package to the current environment, and to any future Homestead environments you create:

  1. Update ~/.homestead/after.sh to run “apt-get install -y php-xdebug” (the -y flag keeps apt-get from prompting for confirmation while the non-interactive provisioning script runs):
    #!/bin/sh

    # If you would like to do some extra provisioning you may
    # add any commands you wish to this file and they will
    # be run after the Homestead machine is provisioned
    apt-get install -y php-xdebug

  2. From the Homestead directory, run the vagrant provision command. This will re-run any provisioning scripts for the current Homestead environment.
    C:\Users\David\Homestead>vagrant provision

Once you upgrade your laravel/homestead box to version 0.4.1 or above, you can remove the apt-get command line from ~/.homestead/after.sh, since php-xdebug is included by default as of laravel/homestead v0.4.1.

How to Test Laravel 5 Routes with CSRF Protection Using PHPUnit

Update: As of Laravel 5.2, CSRF verification no longer happens automatically in unit tests, so you don’t need anything special to make your tests pass. See the Laravel 5.2 Upgrade Guide (under “CSRF Verification”) for details.

The release of Laravel 5 introduces some changes that can make testing a bit trickier. In particular, testing routes that are protected with cross-site request forgery (CSRF) protection requires some additional setup in your test methods.

First, some background.

Instead of the old “csrf” filter, Laravel 5 moves the CSRF protection logic to the App\Http\Middleware\VerifyCsrfToken class. This class is referenced in App\Http\Kernel.php, so CSRF protection is applied to every POST, PUT, and DELETE route, even in testing. Whereas Laravel 4.x disabled filters by default in the testing environment, there’s nothing in Laravel 5 that disables middleware in the Http kernel during testing. At the same time, tests use the array session driver by default, so there’s also nothing to persist session state. As a result, if you test a POST route without making any other changes to your tests, the default classes, or the framework, it will throw a TokenMismatchException.

How to resolve this dilemma?

Some options suggested by the community (which I don’t recommend) are to remove the VerifyCsrfToken middleware from the Http kernel (though you’d better be sure to reference it from the relevant routes/controllers), or to change the VerifyCsrfToken so it only applies outside the “testing” environment.

The easier (and more secure) solution is to simply initiate a session in your test, retrieve the correct csrf token, and include the token with your request input.

For example, suppose you have this simple route:

Route::post('hello', function () {
    return 'Hello ' . Request::input('name');
});

and this associated test:

class HelloTest extends TestCase
{
    public function testHello()
    {
        $params = [
            'name' => 'Bob',
        ];

        $response = $this->call('POST', 'hello', $params);

        $this->assertResponseOk();
        $this->assertEquals('Hello ' . $params['name'], $response->getContent());
    }

}

As it reads above, the test will throw a TokenMismatchException (which can be difficult to see unless you follow the test execution with a debugger).

The fix? Adding just two lines will allow the test to complete successfully. Add a call to Session::start() in the test method (or in the test class setUp() method), then retrieve the csrf token using csrf_token() and include it in your form input as _token.

class HelloTest extends TestCase
{
    public function testHello()
    {
        Session::start(); // Start a session for the current test
        $params = [
            '_token' => csrf_token(), // Retrieve current csrf token
            'name'   => 'Bob',
        ];

        $response = $this->call('POST', 'hello', $params);

        $this->assertResponseOk();
        $this->assertEquals('Hello ' . $params['name'], $response->getContent());
    }

}

Storage Predictions 2013

In the next year, the data storage industry will continue to evolve due to greater adoption of cloud services and changes in enterprise storage infrastructures. Below, I’ve identified what I believe will be the four main themes:

1. Shift to cloud services

Organizations will continue to evaluate and adopt cloud-based software (Software as a Service, or SaaS), particularly for ancillary functions like e-mail. Cloud-based infrastructure services (Infrastructure as a Service, or IaaS) will also see greater adoption, though primarily among the most heavily Internet-focused businesses. Accordingly, the data that had been stored in corporate data centers will continue moving to cloud services, albeit more slowly than originally predicted; while consumers have fully adopted cloud services, most organizations are still being extremely cautious.

2. Encapsulation of data and applications

Whether hosted on premises or in cloud services, data will move away from unstructured file storage toward models that more tightly couple data with applications. This will be driven by several factors:

  • Increased adoption of cloud-based applications (such as Google Docs and Microsoft Office Web Apps)
  • Increased use of mobile devices and apps, which encapsulate local data with apps and rely on cloud services for off-device storage
  • Increased encapsulation of data in desktop applications, and tighter cloud service integration in desktop operating systems (as evidenced by Mac OS X Mountain Lion and Windows 8)
  • Movement of data from unstructured files to structured databases and locally-hosted applications

One result of greater encapsulation will be a shift in data classification methodology. While some organizations will continue to pursue classification at a file level, many will find that classifying applications will be sufficient to meet their needs.

As information becomes more tightly coupled with applications, the responsibility for disaster recovery will move more solidly into the application realm. Traditional mechanisms like replication, snapshots, and continuous data protection will continue to be important, but they will be performed or at least coordinated by the application, rather than leaving those functions solely to the operating system or storage system.

3. Improvements to storage features in operating systems

Storage features are rapidly improving in mainstream operating systems (as evidenced by Red Hat’s Gluster purchase and Microsoft’s Windows Scale-Out File Server). Combined with more intelligent disaster recovery features in applications, the result will be a shift away from expensive, monolithic storage systems toward less expensive scale-out approaches using commodity server hardware.

4. Changing skill sets

As storage moves to cloud services and commodity hardware, the skills required for storage professionals will change. The new storage challenges of the future will include:

  • Service provider relationship management
  • Business analysis
  • Integration
  • Scale-out server technologies

What are your predictions? (Would you like me to expand on any of these in a future post?)