In How to create a Symfony bundle part 1 we developed a minimal bundle that logs every HTTP request. In this iteration we will add some more functionality, then release v0.0.1.
We will add the feature to enable/disable the bundle through configuration.

PerformanceMeterBundle V0.0.1

When we implemented the request logging, we required the bundle in a test application to try it out. The more functionality we add, the more impractical this manual testing method becomes. If you work within a team, automated testing becomes vital to the success of any non-trivial project.

First, we need to prepare the required pieces for a testing environment. We will store the tests in a tests directory, so create it. Then add an autoload-dev block to the root of composer.json

"autoload-dev": {
    "psr-4": {
      "Skafandri\\PerformanceMeterBundle\\Tests\\": "tests/"
    }
  },

Require PHPUnit as a development dependency

$ composer require --dev phpunit/phpunit

Configure PHPUnit

phpunit.xml.dist

<?xml version="1.0" encoding="UTF-8"?>
<phpunit bootstrap="vendor/autoload.php">
    <testsuites>
        <testsuite name="PerformanceMeterBundle Test Suite">
            <directory>tests</directory>
        </testsuite>
    </testsuites>

    <filter>
        <whitelist>
            <directory>.</directory>
            <exclude>
                <directory>Resources</directory>
                <directory>tests</directory>
                <directory>vendor</directory>
            </exclude>
        </whitelist>
    </filter>
</phpunit>

The whitelist filter is required to generate a code coverage report; we will use it later.
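When we get to it, generating the report will just be a matter of passing one of PHPUnit's coverage options, for example:

```shell
# Requires a coverage driver such as Xdebug to be installed
$ ./vendor/bin/phpunit --coverage-text
```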

Check if everything is OK

$ ./vendor/bin/phpunit
PHPUnit 5.7.6 by Sebastian Bergmann and contributors.

Time: 24 ms, Memory: 2.00MB

No tests executed!

No tests executed! means all is good; we can start writing the first test case.

We will simply write the test cases in the same order we produced the bundle files. The first test for RequestLogger should be trivial.

tests/RequestLoggerTest.php

<?php
namespace Skafandri\PerformanceMeterBundle\Tests;
use PHPUnit\Framework\TestCase;
use Psr\Log\LoggerInterface;
use Skafandri\PerformanceMeterBundle\RequestLogger;
use Symfony\Component\HttpFoundation\Request;

class RequestLoggerTest extends TestCase
{
    public function test_logs_request()
    {
        $mockLogger = $this->getMockBuilder(LoggerInterface::class)->getMock();
        $mockLogger->expects($this->once())
            ->method('info')
            ->with(
                'performance_meter.request',
                array('uri' => 'http://:/', 'duration' => 10)
            );

        $requestLogger = new RequestLogger($mockLogger);
        $requestLogger->logRequest(new Request(), 10);
    }

    public function test_doesnt_break_without_logger()
    {
        $requestLogger = new RequestLogger();
        $requestLogger->logRequest(new Request(), 10);
    }
}

The second test has no assertions. Isn't that a bad practice? This test is simply a safeguard against a possible future "optimization" that would remove the block

if (!$this->logger) {
    return;
}

in RequestLogger. PHPUnit doesn't have an assertion similar to assertEverythingIsFine(), and I wouldn't use it anyway. The function name should be self-explanatory.
Needless to say, every time you change the code, run ./vendor/bin/phpunit to execute the test suite.

The second test case will cover KernelEventsSubscriber; the scenario is:

-1- create an event subscriber with a mocked request logger
-2- call eventSubscriber->onKernelRequest with a mocked event
-3- call eventSubscriber->onKernelResponse with a mocked event
-*- make sure request logger was called properly

tests/KernelEventsSubscriberTest.php

<?php
namespace Skafandri\PerformanceMeterBundle\Tests;
use PHPUnit\Framework\TestCase;
use Skafandri\PerformanceMeterBundle\KernelEventsSubscriber;
use Skafandri\PerformanceMeterBundle\RequestLogger;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
use Symfony\Component\HttpKernel\HttpKernelInterface;

class KernelEventsSubscriberTest extends TestCase
{
    public function test_logs_request_on_kernel_response()
    {
        $request = new Request();

        $mockGetResponseEvent = $this->getGetResponseEventMock();
        $mockGetResponseEvent->expects($this->any())
            ->method('getRequestType')
            ->willReturn(HttpKernelInterface::MASTER_REQUEST);

        $mockFilterResponseEvent = $this->getFilterResponseEventMock();
        $mockFilterResponseEvent->expects($this->any())
            ->method('getRequest')
            ->willReturn($request);

        $mockRequestLogger = $this->getMockBuilder(RequestLogger::class)->getMock();
        $mockRequestLogger->expects($this->once())
            ->method('logRequest')
            ->with($this->callback(function ($requestArgument) use ($request) {
                return $requestArgument === $request;
            }));

        $eventSubscriber = new KernelEventsSubscriber($mockRequestLogger);
        $eventSubscriber->onKernelRequest($mockGetResponseEvent);
        $eventSubscriber->onKernelResponse($mockFilterResponseEvent);
    }

    private function getGetResponseEventMock()
    {
        return $this
            ->getMockBuilder(GetResponseEvent::class)
            ->disableOriginalConstructor()
            ->getMock();
    }

    private function getFilterResponseEventMock()
    {
        return $this
            ->getMockBuilder(FilterResponseEvent::class)
            ->disableOriginalConstructor()
            ->getMock();
    }
}

The test code seems bigger than the tested code, doesn't it? Yes, and it may get even bigger. A ratio of 2:1 or more is not uncommon. That's the price you pay for this safety net.
In fact, you can estimate whether automated testing is worthwhile for a particular project:

if (time_to_write_automated_tests + time_to_run_automated_tests*N < time_to_run_manual_tests*N)
    you should write automated tests
else    
    you should do manual testing
endif

Where N is the number of times tests will run, automatically or manually.

Notice that the larger N gets, the less time_to_write_automated_tests affects the balance. When N is very big, which is the case in most projects, you can simplify by just comparing time_to_run_automated_tests to time_to_run_manual_tests.
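As a toy illustration with made-up numbers, the break-even works out like this:

```php
<?php
// All numbers below are hypothetical, in minutes.
function shouldAutomate($timeToWrite, $timeToRunAuto, $timeToRunManual, $n)
{
    return $timeToWrite + $timeToRunAuto * $n < $timeToRunManual * $n;
}

// Writing the suite costs 120, each automated run 1, each manual run 15.
var_dump(shouldAutomate(120, 1, 15, 5));  // bool(false): 125 vs 75
var_dump(shouldAutomate(120, 1, 15, 50)); // bool(true): 170 vs 750
```

At 5 runs the one-off writing cost still dominates; at 50 runs automation has long paid for itself.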

After the event subscriber, we created services.xml and the extension class. The extension just loads the configuration file into the container. The test scenario is:

-1- Create a container
-2- Register the extension
-*- Make sure request logger is registered with the correct class and arguments
-*- Make sure events subscriber is registered with the correct class, arguments and tags

tests/DependencyInjection/Fixtures/config.yml

performance_meter:

tests/DependencyInjection/PerformanceMeterExtensionTest.php

<?php

namespace Skafandri\PerformanceMeterBundle\Tests\DependencyInjection;

use PHPUnit\Framework\TestCase;
use Skafandri\PerformanceMeterBundle\DependencyInjection\PerformanceMeterExtension;
use Skafandri\PerformanceMeterBundle\KernelEventsSubscriber;
use Skafandri\PerformanceMeterBundle\RequestLogger;
use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\ContainerInterface;
use Symfony\Component\DependencyInjection\Loader\YamlFileLoader;
use Symfony\Component\DependencyInjection\Reference;

class PerformanceMeterExtensionTest extends TestCase
{
    public function test_registers_request_logger()
    {
        $container = $this->createContainer();
        $container->compile();

        $requestLoggerDefinition = $container->getDefinition('performance_meter.request_logger');

        $this->assertEquals(RequestLogger::class, $requestLoggerDefinition->getClass());
        $this->assertEquals(
            array(
                new Reference('logger', ContainerInterface::NULL_ON_INVALID_REFERENCE)
            ),
            $requestLoggerDefinition->getArguments()
        );
    }

    public function test_registers_kernel_event_subscriber()
    {
        $container = $this->createContainer();
        $container->compile();

        $eventSubscriberDefinition = $container->getDefinition('performance_meter.kernel_events_subscriber');

        $this->assertEquals(KernelEventsSubscriber::class, $eventSubscriberDefinition->getClass());
        $this->assertEquals(
            array(
                new Reference('performance_meter.request_logger')
            ),
            $eventSubscriberDefinition->getArguments()
        );
        $this->assertEquals(
            array('kernel.event_subscriber' => array(array())),
            $eventSubscriberDefinition->getTags()
        );
    }

    private function createContainer()
    {
        $container = new ContainerBuilder();

        // The extension must be registered before loading a fixture that
        // configures it, otherwise the loader rejects the unknown key.
        $container->registerExtension(new PerformanceMeterExtension());

        $locator = new FileLocator(__DIR__.'/Fixtures');
        $loader = new YamlFileLoader($container, $locator);
        $loader->load('config.yml');

        $container->getCompilerPassConfig()->setOptimizationPasses(array());

        return $container;
    }
}

I think the previous example beats all the quests for the golden number that defines good test coverage, some percentage between 0 and 100.
PHPUnit will report the previous test as covering PerformanceMeterExtension, which has 3 lines of code. But in reality, we are testing services.xml.

We are done testing our previous code, we can proceed to implement the next feature. The user should be able to activate/deactivate the bundle using a configuration toggle.

performance_meter:
    enabled: true # or false; defaults to true
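In a host application this toggle would be set in the app configuration, for example (assuming the classic app/config/config.yml layout):

```yaml
# app/config/config.yml
performance_meter:
    enabled: false
```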

Before thinking about how to implement this feature, let’s think how to test it first. We can create a dummy configuration file with enabled: false and make sure the bundle is disabled. The scenario looks like:

-1- Create a container
-2- load disabled.yml
-*- check that no services are loaded

tests/DependencyInjection/Fixtures/disabled.yml

performance_meter:
    enabled: false

In tests/DependencyInjection/PerformanceMeterExtensionTest.php we add another test case:

public function test_registers_nothing_when_disabled()
{
    $container = $this->createContainer();

    $locator = new FileLocator(__DIR__ . '/Fixtures');
    $loader = new YamlFileLoader($container, $locator);
    $loader->load('disabled.yml');

    $container->compile();

    $this->assertFalse($container->has('performance_meter.request_logger'));
    $this->assertFalse($container->has('performance_meter.kernel_events_subscriber'));
}

Running the test suite again, it obviously fails

$ ./vendor/bin/phpunit
PHPUnit 5.7.6 by Sebastian Bergmann and contributors.

..F...                                                              6 / 6 (100%)

Time: 207 ms, Memory: 6.00MB

There was 1 failure:

1) Skafandri\PerformanceMeterBundle\Tests\DependencyInjection\PerformanceMeterExtensionTest::test_registers_nothing_when_disabled
Failed asserting that true is false.

/home/ilyes/projects/PerformanceMeterBundle/tests/DependencyInjection/PerformanceMeterExtensionTest.php:63

FAILURES!
Tests: 6, Assertions: 8, Failures: 1.

Once we make the tests pass, we will be done with this feature.

PerformanceMeterExtension seems like a good candidate: all we need is to check the enabled configuration and call $loader->load('services.xml') only when it has the value true.
The first argument of the load method, $configs, is a raw array of configs defined by the user and appended to by other bundles. This array needs to be processed using a Configuration class that defines the expected configuration structure.
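To make this concrete, $configs is a list of arrays, one per configuration source, and processConfiguration() merges them against the Configuration tree, with later entries overriding earlier ones. An illustrative example with made-up values:

```php
<?php
// Illustrative example of what the extension receives in $configs:
// one array per configuration source, in load order.
$configs = array(
    array('enabled' => true),   // e.g. from config.yml
    array('enabled' => false),  // e.g. from config_test.yml, loaded later
);
// processConfiguration() merges these top to bottom; later values win,
// so the processed result here would be enabled=false.
```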

So let’s create the configuration class.
src/DependencyInjection/Configuration.php

<?php
namespace Skafandri\PerformanceMeterBundle\DependencyInjection;
use Symfony\Component\Config\Definition\Builder\TreeBuilder;
use Symfony\Component\Config\Definition\ConfigurationInterface;

class Configuration implements ConfigurationInterface
{

    /**
     * Generates the configuration tree builder.
     *
     * @return \Symfony\Component\Config\Definition\Builder\TreeBuilder The tree builder
     */
    public function getConfigTreeBuilder()
    {
        $tree = new TreeBuilder();
        $tree->root('performance_meter');

        return $tree;
    }
}

We will use it from PerformanceMeterExtension to process the $configs array. Before $loader->load('services.xml'), insert $config = $this->processConfiguration(new Configuration(), $configs);

Run the tests again; now we get an exception: Symfony\Component\Config\Definition\Exception\InvalidConfigurationException: Unrecognized option "enabled" under "performance_meter". We need to make the tree built by the Configuration class expect an enabled key of type scalar with a default value of true. Edit the configuration class: assign the root node creation to a variable, $root = $tree->root('performance_meter');, and add the enabled node: $root->children()->scalarNode('enabled')->defaultTrue();
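Putting those two edits together, getConfigTreeBuilder ends up looking like this:

```php
public function getConfigTreeBuilder()
{
    $tree = new TreeBuilder();
    $root = $tree->root('performance_meter');

    // "enabled" is a scalar that defaults to true when omitted
    $root->children()->scalarNode('enabled')->defaultTrue();

    return $tree;
}
```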

Run the tests again, and we are back to Failed asserting that true is false. Now that the new config is recognized and parsed, we update PerformanceMeterExtension to load services.xml only when enabled is true. We just wrap the load call in an if clause

if ($this->isConfigEnabled($container, $config)) {
    $loader->load('services.xml');
}
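For context, the extension's load() method now reads roughly like this; the loader setup and the Resources/config path come from part 1 and may differ slightly in your code:

```php
public function load(array $configs, ContainerBuilder $container)
{
    // Validate and merge the raw configs against our Configuration tree
    $config = $this->processConfiguration(new Configuration(), $configs);

    // Only register our services when the bundle is enabled
    if ($this->isConfigEnabled($container, $config)) {
        $loader = new XmlFileLoader(
            $container,
            new FileLocator(__DIR__.'/../Resources/config')
        );
        $loader->load('services.xml');
    }
}
```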

Run the tests again... all good.

If you haven’t done it before, congratulations, this was your first TDD session.

In the previous tutorial we had to add "minimum-stability": "dev" to the test application in order to be able to require performance-meter-bundle. Minimum stability defaults to stable, which from Composer's perspective just means that the project has some specifically named git tags. The tags should follow Semantic Versioning.
To release the first stable version of this bundle, we need to add a version tag, v0.0.1

$ git add .; git commit -m "V0.0.1"; git tag v0.0.1

Before pushing the changes to GitHub, you can go to the project settings on GitHub, Integrations & services, and add Packagist from the list. Changes pushed to GitHub will then automatically update the project on Packagist through a webhook.

$ git push origin HEAD --tags

Next: How to create a Symfony bundle part 3
Previous: How to create a Symfony bundle part 1