Adding a gem to your project with bundler

Now that we know the basics of how to use bundler, let's put our new knowledge to good use by adding a gem to a project. We'll look at how to add generate-puppetfile to a Puppet Controlrepo, but you can add any gem to any project you'd like. It's very simple.

Start by cloning your project and checking out a new branch:

[rnelson0@build02 controlrepo:production]$ git checkout -b generate-puppetfile
Switched to a new branch 'generate-puppetfile'

Open up the existing Gemfile. If you don’t have one, you just need the source statement followed by one or more gems.
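If you're starting from scratch, a minimal Gemfile really is just those two pieces. Here's a sketch (the gem choices are just examples):

```ruby
# A minimal Gemfile: one source statement, then one or more gems.
source 'https://rubygems.org'

gem 'rake', :require => false
gem 'puppet-lint', :require => false
```

In this case, the Controlrepo already has a Gemfile: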

[rnelson0@build02 controlrepo:generate-puppetfile]$ cat Gemfile
source 'https://rubygems.org'

group :development, :test do
  gem 'json', :require => false
  gem 'metadata-json-lint', :require => false
  gem 'puppetlabs_spec_helper', :require => false
  gem 'puppet-lint', :require => false
  gem 'rake', :require => false
  gem 'rspec-puppet', :require => false
end

if puppetversion = ENV['PUPPET_GEM_VERSION']
  gem 'puppet', puppetversion, :require => false
else
  gem 'puppet', :require => false
end

# vim:ft=ruby
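The conditional at the bottom is worth a note: if the PUPPET_GEM_VERSION environment variable is set (CI systems often set it to test against a specific Puppet release), the Gemfile pins puppet to that version; otherwise it floats to the newest available release. Here's that same logic extracted into a plain Ruby method, purely for illustration (the method name is mine, not part of bundler):

```ruby
# The same pin-or-float logic as the Gemfile, expressed as a plain method.
# (Illustrative only; bundler evaluates the Gemfile itself.)
def puppet_gem_args(env)
  if (puppetversion = env['PUPPET_GEM_VERSION'])
    # Pinned: gem 'puppet', puppetversion, :require => false
    ['puppet', puppetversion, { :require => false }]
  else
    # Floating: gem 'puppet', :require => false
    ['puppet', { :require => false }]
  end
end

puts puppet_gem_args({}).inspect
puts puppet_gem_args('PUPPET_GEM_VERSION' => '3.7.3').inspect
```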

Add the gem (generate-puppetfile) to the file:

[rnelson0@build02 controlrepo:generate-puppetfile]$ cat Gemfile
source 'https://rubygems.org'

group :development, :test do
  gem 'json', :require => false
  gem 'metadata-json-lint', :require => false
  gem 'puppetlabs_spec_helper', :require => false
  gem 'puppet-lint', :require => false
  gem 'rake', :require => false
  gem 'rspec-puppet', :require => false
  gem 'generate-puppetfile'
end

if puppetversion = ENV['PUPPET_GEM_VERSION']
  gem 'puppet', puppetversion, :require => false
else
  gem 'puppet', :require => false
end

# vim:ft=ruby

Now run bundle install:

[rnelson0@build02 controlrepo:generate-puppetfile]$ bundle install --path vendor --without system_tests
Fetching gem metadata from https://rubygems.org/.........
Fetching version metadata from https://rubygems.org/..
Installing rake 10.4.2
Installing diff-lcs 1.2.5
Installing facter 2.4.4
Installing generate-puppetfile 0.9.6
Installing json_pure 1.8.2
Installing hiera 1.3.4
Installing json 1.8.3 with native extensions
Installing metaclass 0.0.4
Installing spdx-licenses 1.0.0
Installing metadata-json-lint 0.0.11
Installing mocha 1.1.0
Installing puppet 3.7.3
Installing puppet-lint 1.1.0
Installing puppet-syntax 2.0.0
Installing rspec-core 2.99.2
Installing rspec-expectations 2.99.2
Installing rspec-mocks 2.99.4
Installing rspec 2.99.0
Installing rspec-puppet 2.2.0
Installing puppetlabs_spec_helper 0.10.3
Using bundler 1.10.6
Bundle complete! 8 Gemfile dependencies, 21 gems now installed.
Gems in the group system_tests were not installed.
Bundled gems are installed into ./vendor.

And now you can use the gem in your project with bundle exec, without installing it globally:

[rnelson0@build02 controlrepo:generate-puppetfile]$ bundle exec generate-puppetfile -v
generate-puppetfile v0.9.6

At this point, it may be worth adding an alias to our shell:

[rnelson0@build02 controlrepo:generate-puppetfile]$ alias be
alias be='bundle exec'

Enjoy!

Getting started with Bundler

Not too long ago, I learned about bundler. It's a tool that lets you install multiple versions of Ruby gems, specific to the project you're working on, without affecting the globally installed gems. I'm far from an expert, but I hope I can help explain it a bit. To get started, install the bundler gem – it's the one and only gem we'll need to install globally. If you don't already have its dependencies, you'll see them installed as well:

[root@build02 ~]# gem install bundler
Fetching: bundler-1.10.6.gem (100%)
Successfully installed bundler-1.10.6
Installing ri documentation for bundler-1.10.6
1 gem installed

Now, clone a ruby project with a Gemfile. I’ve chosen puppet-retrospec:

[rnelson0@build02 git]$ git clone git@github.com:nwops/puppet-retrospec.git
Initialized empty Git repository in /home/rnelson0/git/puppet-retrospec/.git/
remote: Counting objects: 3246, done.
remote: Compressing objects: 100% (100/100), done.
remote: Total 3246 (delta 45), reused 0 (delta 0), pack-reused 3127
Receiving objects: 100% (3246/3246), 2.32 MiB | 2.67 MiB/s, done.
Resolving deltas: 100% (829/829), done.
[rnelson0@build02 git]$ cd puppet-retrospec/
[rnelson0@build02 puppet-retrospec:master]$

Try to run rake and you'll notice it's missing a dependency:

[rnelson0@build02 puppet-retrospec:master]$ rake -T
Could not find addressable-2.3.8 in any of the sources
Run `bundle install` to install missing gems
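The fix follows the same pattern we used above – a sketch, assuming you want the gems vendored into the project rather than installed globally:

```shell
# Install the project's gems locally (into ./vendor) rather than globally,
# then run rake through bundler so it picks up those gems.
bundle install --path vendor
bundle exec rake -T
```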

Continue reading

Configuring Travis CI on a Puppet Module Repo

Recently we looked at enabling Travis CI on the Controlrepo. Today, we’re going to do the same for a module repo. We’re going to use much of the same logic and files, just tweaking things a bit to fit the slightly different file layout and perhaps changing the test matrix a bit. If you have not registered for Travis CI yet, go ahead and take care of that (public or private) before continuing.

The first challenge is to decide if you’re going to enable Travis CI with an existing module, or a new module. Since a new module is probably easier, let’s get the hard stuff out of the way.

Set up an existing module

I have an existing module rnelson0/certs which has no CI but does have working rspec tests, a great candidate for today's efforts. Let's make sure the tests actually work; it's easy to make incorrect assumptions:

[figure: modules travis ci, fig 1]

Continue reading

Configuring Travis CI on your Puppet Controlrepo

Continuous Integration is an important technique used in modern software development. For every change, a CI system runs a suite of tests to ensure the whole system – not just the changed portion – still “works”, or more specifically, still passes the defined tests. We are going to look at Travis CI, a cloud-based Continuous Integration service that you can connect to your GitHub repositories. This is valuable because it's free for “best effort” access (there are paid plans as well) and helps you guarantee that code you check in will work with Puppet. This isn't a substitute or replacement for rspec-puppet; it's another layer of testing that improves the quality of our work.

There are plenty of other CI systems out there – Jenkins and Bamboo are popular – but that would involve setting up the CI system as well as configuring our repo to use CI. Please feel free to investigate these CI systems, but they’ll remain beyond the scope of this blog for the time being. Please share any guides you may have in the comments, though!

Travis CI works by spinning up a VM or docker instance, cloning our git repo (using tokenized authentication), and running the command(s) we provide. Each entry in our test matrix will run on a separate node, so we can test different OSes or Ruby or Puppet versions to our heart’s content. The results of the matrix are visible through GitHub and show us red if any test failed and green if all tests passed. We’ll look at some details of how this works as we set up Travis CI.

From a workflow perspective, you'll continue to create branches on your controlrepo and submit PRs. The only additional step is that when a PR is ready for review, you'll want to wait for Travis CI to complete first. If it's red, investigate the failure and remediate it. Don't review code until everything is green, because it won't work anyway. This will mostly be a time saver, unless you're watching your CI run, which of course makes it slower!
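To make the test matrix concrete, here's a sketch of what a controlrepo's .travis.yml might look like. The Puppet version pins and the rake task name are illustrative assumptions, not prescriptive – we'll build up the real file as we go:

```yaml
# Each entry under env runs on its own node, so two Puppet
# versions here means two parallel builds per push or PR.
language: ruby
rvm:
  - 1.9.3
bundler_args: --without system_tests
script: bundle exec rake test
env:
  - PUPPET_GEM_VERSION="~> 3.7.0"
  - PUPPET_GEM_VERSION="~> 4.0"
```

The PUPPET_GEM_VERSION variable is the same one our Gemfile's conditional consumes, which is how each matrix entry ends up testing against a different Puppet release.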

Continue reading

Minimum Viable Configuration (MVC)

In my PuppetConf talk, I discussed a concept I call “Minimum Viable Configuration”, or MVC. This concept is similar to that of the Minimum Viable Product (MVP), in which you develop and deploy just the core features required to determine if there's a market fit for your anticipated customer base. The MVC, however, is targeted at your developers, and is the minimum amount of customization required for developers to be productive with the languages and tools your organization uses. This can include everything from preferred IDEs to language plugins to build tools.

A Minimum Viable Configuration may not appear necessary to many, especially those who have been customizing their own environments for years or decades. The MVC is really targeted at your team, or even the organization as a whole. You may have a great customized IDE setup for writing Puppet or PowerShell code, but others on your team may just be starting. The MVC allows the organization to share that accumulated wealth, making full use of the tens or hundreds of years of experience on the team. A novice developer can sit down and be productive with any language or tool covered by the MVC by standing on the shoulders of their teammates.

The MVC truly is the minimum customization required to get started – for instance, a .vimrc file that sets the tabstop to 2 characters and provides enhanced color coding and syntax checking for various languages – but that still allows users to add their own customizations. If you enforce the minimum, but don’t limit further customization, new hires can not only check their email on day one, but can actually delve through the codebase and start making changes on day one. You can also tie it into any vagrant images you might maintain.
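As an example of how small that minimum can be, a starter .vimrc along the lines described above might look like this sketch (the exact settings are a matter of team preference):

```vim
" Minimal team .vimrc: 2-space indentation plus syntax support.
set tabstop=2
set shiftwidth=2
set expandtab
syntax on
filetype plugin indent on
```

Users can still layer their own customizations on top; the point is that the baseline alone is enough to start working productively.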

Your MVC will change over time, of course. Use your configuration management tool, like Puppet, to manage the MVC. When the baseline is updated, all the laptops and shared nodes can be updated quickly to the new standard. You can see an example of a Minimum Viable Configuration for Linux in PuppetInABox’s role::build and the related profiles (build, rcfiles::vim, rcfiles::bash). You can easily develop similar roles and profiles for other languages or operating systems.

I feel the MVC can be a very powerful tool for teams who work with an evolving variety of tools and languages, who hire novices and grow expertise internally, and especially organizations that are exposing Operations teams to development strategies (i.e. DevOps). What do you think about the MVC? Are you using something similar now, or is there another way to address the issue?

PHP Unit Testing

I recently needed to investigate unit testing in PHP. I'm familiar with PHP but not very well versed in it, and I'm certainly no PHP aficionado, but a quick Google search turned me on to PHPUnit by Sebastian Bergmann. The docs appear very complete and there's a nice Getting Started guide to keep it simple. Using this tutorial and the accompanying GitHub repo, you can be up and running in a few minutes. Unfortunately, I ran into some problems because I am using PHP 5.3.3 (CentOS EL 6) and I tried a literal copy and paste instead of using the provided repo. Don't copy and paste; just use the repo. However, I managed to learn something by doing this.

PHP Versions

The simpler issue is PHP 5.3.3. I installed phpunit per the directions in the Getting Started guide. Here’s what happens when I clone the Money repo and run phpunit:

[rnelson0@build01 money:master]$ git remote -v
origin  git@github.com:sebastianbergmann/money.git (fetch)
origin  git@github.com:sebastianbergmann/money.git (push)
[rnelson0@build01 money:master]$ phpunit --bootstrap src/autoload.php tests/MoneyTest.php
PHP Parse error:  syntax error, unexpected T_CLASS, expecting T_STRING or T_VARIABLE or '$' in /home/rnelson0/php/money/tests/MoneyTest.php on line 55

The current version requires PHP 5.5, but it's okay – there's an older version we can use in the 1.5 branch. Check it out, run phpunit again, and everything works:

[rnelson0@build01 money:master]$ git branch -a
  1.5
  1.6
* master
  remotes/origin/1.5
  remotes/origin/1.6
  remotes/origin/HEAD -> origin/master
  remotes/origin/master
  remotes/origin/php-7
[rnelson0@build01 money:master]$ git checkout 1.5
Switched to branch '1.5'
[rnelson0@build01 money:1.5]$ phpunit --bootstrap src/autoload.php tests/MoneyTest.php
PHPUnit 4.8.2 by Sebastian Bergmann and contributors.

..............................S

Time: 665 ms, Memory: 18.75Mb

OK, but incomplete, skipped, or risky tests!
Tests: 31, Assertions: 50, Skipped: 1.

To Autoload, or not to Autoload

The second issue, where I copied the test code directly from the tutorial, was a little trickier. You are supposed to use the file src/autoload.php, but the tutorial does not provide it. You can see the full file in the repo, here’s an important snippet:

spl_autoload_register(
    function($class) {
        static $classes = null;
        if ($classes === null) {
            $classes = array(
                //...
                'sebastianbergmann\\money\\currency' => '/Currency.php',
                'sebastianbergmann\\money\\currencymismatchexception' => '/exceptions/CurrencyMismatchException.php',
                //...
                'sebastianbergmann\\money\\money' => '/Money.php',
                //...

This function maps the namespaced classes to the files they are located in. I have not gone through the PHPUnit docs in great detail yet, but I haven't seen instructions on generating this dynamically or crafting it manually. It's certainly not part of the tutorial, so I decided to see if I could get around this with brute force. First, I generated a simple namespace and class, NewProject\Base.

<?php

namespace NewProject;

class Base {
  /**
   * @var integer
   */
  private $counter;

  /**
   * @param integer $counter
   */
  public function __construct($counter) {
    if (!is_int($counter)) {
      throw new \InvalidArgumentException('$counter must be an Integer');
    }
    $this->counter = $counter;
  }

  /**
   * Return the current counter value
   *
   * @return integer
   */
  public function getCount() {
    return $this->counter;
  }

  /**
   * Increase the counter and return its current value
   *
   * @return integer
   */
  public function increaseCount() {
    $this->counter++;

    return $this->counter;
  }
}

?>

The comments are there for PHPUnit. I think I'm doing it right, but I'm still new to this, so it may not be accurate. This is also a very contrived class that exists just to do some testing, but for that purpose it's great! Next, we need a class to do the testing. The class is named <Class>Test and extends PHPUnit_Framework_TestCase (there are other base classes, but we're starting small). Here's the first draft:

<?php
namespace NewProject;

class BaseTest extends \PHPUnit_Framework_TestCase {
  /**
   * @covers NewProject\Base::__construct
   */
  public function testConstructor() {
    new Base(0);
  }

  public function testShouldFail() {
    new Base('string');
  }
}
?>

With unit tests, you want everything to pass, but I put the last one in because I wanted to make sure that an actual failure would be detected as a failure, not as a syntax error or something else that would bomb out the entire test suite. Here’s what happens when you run phpunit against that without an autoload file:

[rnelson0@build01 NewProject]$ phpunit tests
PHPUnit 4.8.2 by Sebastian Bergmann and contributors.

PHP Fatal error:  Class 'NewProject\Base' not found in /home/rnelson0/php/NewProject/tests/BaseTest.php on line 9

Well, shoot. It’s not loading the underlying class that it needs to test, and I don’t know how to generate an autoload file yet. Since it can’t find the class, I tried to see if I could force it to load that by adding a require() statement (emphasis on the additional line):

[rnelson0@build01 NewProject]$ cat tests/BaseTest.php
<?php
namespace NewProject;

require ('src/Base.php');

class BaseTest extends \PHPUnit_Framework_TestCase {
  /**
   * @covers NewProject\Base::__construct
   */
  public function testConstructor() {
    new Base(0);
  }

  public function testShouldFail() {
    new Base('string');
  }
}
?>
[rnelson0@build01 NewProject]$ phpunit tests
PHPUnit 4.8.2 by Sebastian Bergmann and contributors.

.E

Time: 221 ms, Memory: 18.25Mb

There was 1 error:

1) NewProject\BaseTest::testShouldFail
InvalidArgumentException: $counter must be an Integer

/home/rnelson0/php/NewProject/src/Base.php:16
/home/rnelson0/php/NewProject/tests/BaseTest.php:15

FAILURES!
Tests: 2, Assertions: 0, Errors: 1.

Lo and behold, that works! I’m sure at some point I’ll figure out how to generate the autoload, but this is good enough for now.
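In lieu of a generated autoload file, a hand-rolled autoloader that maps a namespace prefix to a directory dynamically might look something like this sketch – this assumes the src/ layout mirrors the namespace, and it's my own workaround rather than anything from the PHPUnit tutorial:

```php
<?php
// Sketch: resolve NewProject\Base to src/Base.php at load time,
// rather than maintaining a static class-to-file array.
spl_autoload_register(
    function ($class) {
        $prefix = 'NewProject\\';
        if (strpos($class, $prefix) === 0) {
            $relative = substr($class, strlen($prefix));
            $path = __DIR__ . '/src/' . str_replace('\\', '/', $relative) . '.php';
            if (is_file($path)) {
                require $path;
            }
        }
    }
);
```

With a file like that passed to phpunit via --bootstrap, the require line in the test file would no longer be necessary.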

Summary

I’m well on my way to unit testing with PHP, thanks to Sebastian’s awesome framework. Thank you, Sebastian, you have taken much of the suck out of PHP!

You can find my test repo on github.

Customizing bash and vim for better git and puppet use

Welcome back to our Puppet series. I apologize for the extended hiatus and thank you for sticking around! As an added bonus, in addition to inlining files, I’m including links to the corresponding files and commits in my PuppetInABox project so you can easily review the files and browse around as needed. I hope this is helpful!

Today, we will look at improving our build server. The build role is a centralized server where we can do our software development, including work on our Puppet code and creating packages with FPM. When we work with git, we have to run git branch to see what branch we're in. If you're like me, this has led to a few uses of git stash and, in some cases, having to redo the work entirely after committing on the wrong branch. To help, we're going to add the currently-active branch name of any git directory we are in to the PS1 prompt. We're also doing a lot of editing of *.pp files without any syntax highlighting or auto-indenting. We can fix that with a few modifications, and we'll discuss where additional customizations can be made.

Continue reading

Visible Ops Phase Four: Enable Continual Improvement

The final phase of Visible Ops is Enable Continual Improvement. To really succeed with our efforts, we need to make sure that the resources we have are allocated optimally toward our business goals. With most of our fires put out and significant efforts into avoiding future fires, we have the time available to do this right. To determine where our resources should be allocated, we need to look at metrics.

Continue reading

Visible Ops Phase Three: Create A Repeatable Build Library

Phase three of Visible Ops is Create a Repeatable Build Library. This phase's focus is to define build mechanisms, create system images, and establish documentation that together describe how to build our desired infrastructure from “bare metal” (I'll continue to use “bare metal” throughout for consistency, but “bare VM” may be more appropriate in today's virtualized IT). This allows us to treat our infrastructure like fuses. When a fuse pops, it is discarded instead of repaired and a new fuse is inserted in its place; likewise, when a system fails, it is removed from service and a new system is provisioned in its place. All high-performing IT organizations, not just the unicorns of IT, use this technique. This chapter focuses on how to achieve that goal.

Continue reading

Visible Ops Phase Two: Catch And Release and Find Fragile Artifacts

In the second phase of Visible Ops implementation, our goal is to Catch & Release and Find Fragile Artifacts. This phase focuses on creating and maintaining an accurate inventory of assets and highlighting those that generate the most unplanned work. The various configurations in use are also collected in order to start reducing the unique configuration counts, the focus of Phase Three. This is the shortest chapter in the book at 6 pages, though it may take significant time to complete the work efforts.

The Issues and Indicators section lays out the issues being tackled, including moving from “individual knowledge” to “tribal knowledge” (a shared knowledgebase the entire organization can access, rather than one living in people's heads) and preventing the “special snowflake” syndrome, where every node in the network, even those in clusters or farms, is similar but still unique.

Continue reading