

PHP web development blog by Raphael Stolt

Updated: 2018-03-07T02:38:05.137+01:00


Documenting Composer scripts


For the open source projects I'm involved with, I've developed the habit of defining and documenting the steadily growing number of repository and build utilities via Composer scripts. Having Composer scripts available makes it trivial to define aliases or shortcuts for complex and hard to remember CLI calls. It also lowers the barrier for contributors to start using these tools while helping out with fixing bugs or providing new features. Finally, they also simplify build scripts by stashing away complexity.

Defining Composer scripts

If you've already defined or worked with Composer scripts, or even their npm equivalents, you can skip this section; otherwise the next code snippet shows how to define them. The Composer scripts defined here range from simple CLI commands with set options (e.g. the test-with-coverage script) to more complex build utility tools (i.e. the application-version-guard script) which are extracted into specific CLI commands to avoid cluttering up the composer.json or even the .travis.yml.

composer.json

```json
{
    "scripts": {
        "test": "phpunit",
        "test-with-coverage": "phpunit --coverage-html coverage-reports",
        "cs-fix": "php-cs-fixer fix . -vv || true",
        "cs-lint": "php-cs-fixer fix --diff --stop-on-violation --verbose --dry-run",
        "configure-commit-template": "git config --add commit.template .gitmessage",
        "application-version-guard": "php bin/application-version --verify-tag-match"
    }
}
```

Describing Composer scripts

Since Composer 1.6.0 it's possible to set custom script descriptions via the scripts-descriptions element, as shown next. Note that the name of a description has to match the name of a defined custom Composer script to be recognised at runtime. Also note that the description should be worded in simple present to align with the other Composer command descriptions.

composer.json

```json
{
    "scripts-descriptions": {
        "test": "Runs all tests.",
        "test-with-coverage": "Runs all tests and measures code coverage.",
        "cs-fix": "Fixes coding standard violations.",
        "cs-lint": "Checks for coding standard violations.",
        "configure-commit-template": "Configures a local commit message template.",
        "application-version-guard": "Checks that the application version matches the given Git tag."
    },
    "scripts": {
        "test": "phpunit",
        "test-with-coverage": "phpunit --coverage-html coverage-reports",
        "cs-fix": "php-cs-fixer fix . -vv || true",
        "cs-lint": "php-cs-fixer fix --diff --stop-on-violation --verbose --dry-run",
        "configure-commit-template": "git config --add commit.template .gitmessage",
        "application-version-guard": "php bin/application-version --verify-tag-match"
    }
}
```

Now when running $ composer via the terminal, the descriptions of the defined custom scripts will show up sorted into the list of available commands, which makes it very hard to spot the Composer scripts of the package at hand. Luckily Composer scripts can also be namespaced.

Namespacing Composer scripts

To namespace (i.e. some-namespace) the custom Composer scripts of any given package, define the script names with a namespace prefix as shown next. As the chances are very high that you will be using one or another Composer script several times while working on the package, it's recommended to use a short namespace, say in the range of two to four characters.

composer.json

```json
{
    "scripts": {
        "some-namespace:test": "phpunit",
        "some-namespace:test-with-coverage": "phpunit --coverage-html coverage-reports",
        "some-namespace:cs-fix": "php-cs-fixer fix . -vv || true",
        "some-namespace:cs-lint": "php-cs-fixer fix --diff --stop-on-violation --verbose --dry-run",
        "some-namespace:configure-commit-template": "git config --add commit.template .gitmessage",
        "some-namespace:application-version-guard": "php bin/application-version --verify-tag-match"
    }
}
```

This time when running $ composer via the terminal, the defined custom scripts will show up in the list of available commands in a namespaced manner, giving an immediate overview of the available Composer scripts of the pac[...]
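One related capability worth knowing: Composer scripts can reference each other with an @ prefix, which keeps composite shortcuts free of duplicated CLI calls. A small sketch, assuming the scripts defined above are present; the ci script name itself is an invention for illustration:

```json
{
    "scripts": {
        "ci": [
            "@cs-lint",
            "@test",
            "@application-version-guard"
        ]
    }
}
```

Running $ composer ci would then execute the three referenced scripts in order, failing fast on the first non-zero exit code.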

Keeping your CLI integration tests green on Windows


Lately, on a Windows system, some failing integration tests for CLI commands utilising the Symfony Console component caused me some headaches: PHPUnit insisted that two strings were not identical due to different line endings. The following post documents the small steps I took to overcome these headaches.

First the assertion message produced by the failing test, see the console output below, got me thinking it might be caused by different encodings and line endings; though the project was utilising an .editorconfig from the early start, the related files were all encoded correctly and had the configured line endings. The Git configuration, e.g. core.autocrlf=input, also was as it should be.

```
1) Stolt\LeanPackage\Tests\Commands\InitCommandTest::createsExpectedDefaultLpvFile
Failed asserting that two strings are identical.
--- Expected
+++ Actual
@@ @@
#Warning: Strings contain different line endings!
-Created default 'C:\Users\stolt\AppData\Local\Temp\lpv\.lpv' file.
+Created default 'C:\Users\stolt\AppData\Local\Temp\lpv\.lpv' file.
```

Another deeper look at the CommandTester class yielded that it's possible to disable the command output decoration and also to normalise the command output. So a change of the SUT preparation and a normalisation of the console output, visualised via a git diff -U10, brought the solution for this particular test.

```diff
diff --git a/tests/Commands/InitCommandTest.php b/tests/Commands/InitCommandTest.php
index 58e7114..fb406f3 100644
--- a/tests/Commands/InitCommandTest.php
+++ b/tests/Commands/InitCommandTest.php
@@ -48,21 +48,21 @@ class InitCommandTest extends TestCase
     /**
      * @test
      */
     public function createsExpectedDefaultLpvFile()
     {
         $command = $this->application->find('init');
         $commandTester = new CommandTester($command);
         $commandTester->execute([
             'command' => $command->getName(),
             'directory' => WORKING_DIRECTORY,
-        ]);
+        ], ['decorated' => false]);

         // omitted code

-        $this->assertSame($expectedDisplay, $commandTester->getDisplay());
+        $this->assertSame($expectedDisplay, $commandTester->getDisplay(true));
         $this->assertTrue($commandTester->getStatusCode() == 0);
         $this->assertFileExists($expectedDefaultLpvFile);
```

Since the SUT had a lot of integration tests for its CLI commands, the lazy me took the shortcut to extend the CommandTester, with the desired defaults set, and use it instead of changing all of the related command instantiations.

```php
public function execute(array $input, array $options = ['decorated' => false])
{
    return parent::execute($input, $options);
}
```

So it's a yay for green CLI command integration tests on Windows from here on. Another measure for the SUT would be to enable Continuous Integration on a Windows system via AppVeyor, but that's a task for another commit (85bdf22).[...]
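The underlying mismatch is just CRLF versus LF. A minimal shell sketch, not from the original post, of the normalisation idea that the getDisplay(true) call applies:

```shell
# The Windows console emits CRLF ("\r\n") line endings while the expected
# string uses LF ("\n"); stripping the carriage returns before comparing
# makes the comparison platform-independent.
expected='Created default file.'
actual=$(printf 'Created default file.\r\n')   # simulated Windows output
normalized=$(printf '%s' "$actual" | tr -d '\r')
[ "$normalized" = "$expected" ] && echo 'outputs match'
```

With the carriage returns removed, both sides compare byte-for-byte equal and the script prints "outputs match".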

Eight knobs to adjust and improve your Travis CI builds


After having refactored several Travis CI configuration files over the last weeks, this post provides eight adjustments or patterns immediately applicable for faster, changeable, and economic builds.

1. Reduce git clone depth

The first one is a simple configuration addition with a positive impact on time and disk space consumption, which should be quite noticeable on larger code bases. Having this configured will enable shallow clones of the Git repository and reduce the clone depth from 50 to 2.

.travis.yml

```yaml
git:
  depth: 2
```

2. Enable caching

The second one is also a simple configuration addition, caching the Composer dependencies of the system under build (SUB) or the results of its static code analysis. Generally, have a look if the tools you use allow caching, and if so, cache away. This one deserves a shout out to @localheinz for teaching me about it. The next shown configuration excerpt assumes that you lint coding standard compliance with the PHP Coding Standards Fixer in version 2.0.0-alpha and have enabled caching in its .php_cs configuration.

.travis.yml

```yaml
cache:
  directories:
    - $HOME/.composer/cache
    - $HOME/.php-cs-fixer
```

3. Enforce contribution standards

This one might be a tad controversial, but after having had the joys of merging GitHub pull requests opened from a master branch, I started to fail builds not coming from a feature or topic branch with the next shown bash script. It resides in an external bash script to avoid the risk of terminating the build process.

./bin/travis/fail-non-feature-topic-branch-pull-request

```bash
#!/bin/bash
set -e
if [[ $TRAVIS_PULL_REQUEST_BRANCH = master ]]; then
    echo "Please open pull request from a feature / topic branch.";
    exit 1;
fi
```

.travis.yml

```yaml
script:
  - ./bin/travis/fail-non-feature-topic-branch-pull-request
```

The bash script could be extended to fail pull requests not following a branch naming scheme, e.g. feature- for feature additions or fix- for bug fixes, by evaluating the branch name. If this is a requirement for your builds you should also look into the branch blocklisting feature of Travis CI.

4. Configure PHP versions in an include

With configuring the PHP versions to build against in a matrix include, it's much easier to inject environment variables and therewith configure version specific build steps. You can even set multiple environment variables, like done for the 7.0 version.

.travis.yml

```yaml
env:
  global:
    - OPCODE_CACHE=apc
matrix:
  include:
    - php: hhvm
    - php: nightly
    - php: 7.1
    - php: 7.0
      env: DISABLE_XDEBUG=true LINT=true
    - php: 5.6
      env:
        - DISABLE_XDEBUG=true
before_script:
  - if [[ $DISABLE_XDEBUG = true ]]; then phpenv config-rm xdebug.ini; fi
```

I don't know if environment variable injection is also possible with the minimalistic way to define the PHP versions list, so you should take this adjustment with a grain of salt. It also seems I stumbled upon a Travis CI bug where the global environment variable OPCODE_CACHE is lost, so add another grain of salt. To work around that possible bug the relevant configuration has to look like this, which sadly adds some duplication and might be unsuitable when dealing with a large amount of environment variables.

.travis.yml

```yaml
matrix:
  include:
    - php: hhvm
      env:
        - OPCODE_CACHE=apc
    - php: nightly
      env:
        - OPCODE_CACHE=apc
    - php: 7.1
      env:
        - OPCODE_CACHE=apc
    - php: 7.0
      env: OPCODE_CACHE=apc DISABLE_XDEBUG=true LINT=true
    - php: 5.6
      env: OPCODE_CACHE=apc DISABLE_XDEBUG=true
before_script:
  - if [[ $DISABLE_XDEBUG = true ]]; then phpenv config-rm xdebug.ini; fi
```

5. Only do static code analysis or code coverage measurement once

This one is for reducing the build duration and load by avoiding unnecessary build step repetition. It's achieved by linting against coding standard violations or generating the code coverage for just a single PHP version per build; in most cases it will be the same for 5.6 or 7.0.

.travis.yml

```yaml
matrix:
  include:
    - php: hhvm
    - php: nightly
    - php: 7.1
    - php: 7.0 [...]
```
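The branch naming extension mentioned in adjustment 3 could look like the following sketch; the simulated branch name and the helper variable are assumptions, not part of the original script. Travis CI exposes the pull request branch via $TRAVIS_PULL_REQUEST_BRANCH:

```shell
# Reject pull request branches following neither the feature- nor the
# fix- naming scheme.
TRAVIS_PULL_REQUEST_BRANCH='fix-broken-anchor'   # simulated for the sketch
case "$TRAVIS_PULL_REQUEST_BRANCH" in
    feature-*|fix-*) branch_check='ok' ;;
    *)               branch_check='rejected' ;;
esac
echo "branch name check: $branch_check"
```

In the real hook the rejected branch would exit with a non-zero status so the build fails.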

Anatomy of a dope PHP package repository


While contributing to Construct, maintained by Jonathan Torres, I gathered some insights and learnings on the characteristics of a dope PHP package repository. This post summarises and illustrates these, so that PHP package developers have a complementary guideline to improve existing or imminent package repositories. Jonathan Reinink did a good job in putting the PHP package checklist out there, which provides an incomplete but solid quality checklist for open-source PHP packages. I'll distill the characteristics of a dope PHP package repository by looking at the repository artifacts Construct can generate for you when starting the development of a new PHP project or micro-package. The following tree command output shows most of the elements this post will touch upon. The artifacts in parentheses are optional and configurable from Construct, but can nonetheless have an important impact on the overall package quality.

```
├── composer.json
├── composer.lock
├── (.editorconfig)
├── (.env)
├── (.env.example)
├── (.git)
│   └── ...
├── .gitattributes
├── (.github)
├── .gitmessage
├── .gitignore
├── (.lgtm)
├── (MAINTAINERS)
├── (.php_cs)
├── (phpunit.xml.dist)
├── (docs)
├── src
│   └── Logger.php
├── tests
│   └── LoggerTest.php
├── .travis.yml
├── (Vagrantfile)
└── vendor
    └── ...
```

Definition of a dope PHP package repository

Before jumping into the details, let's define what could be considered a dope package repository. Therefore, being lazy, I'm going to simply reword this classic quote from Michael Feathers

> Clean code is code that is written by someone who cares.

to

> A dope PHP package repository is one that is created and maintained by someone who cares.

Artifact categories

The next shown pyramid illustrates the three main categories the artifacts of a package repository will fall into. First, and most important, there's the main sourcecode, its tests or specs, and the documentation, which, depending on its size, could reside in a section or inside a dedicated docs directory. Using a docs directory also allows publishing the documentation via GitHub Pages. Other aspects of a package which should be documented are the chosen license, how to contribute to the package, possibly a code of conduct to comply with, and the changes made over the lifespan of the package. Second, there's the configuration for a myriad of tools like Git, GitHub, EditorConfig, Composer, the preferred testing framework, the preferred continuous inspection / integration platform such as Scrutinizer or Travis CI, and so forth. The final category includes tools which ease the life of maintainers and potential contributors equally. These tools can be helpful for releasing new versions, enforcing coding standard compliance, or commit message quality and consistency.

Consistency

Sourcecode

All sourcecode and accompanying tests or specs should follow a coding standard (PSR-2) and have a consistent formatting style; there's nothing new here. The perfect place to communicate such requirements is the contribution guide. Tools like PHP Coding Standards Fixer or PHP_CodeSniffer, in combination with a present configuration (.php_cs or ruleset.xml.dist) and a command wrapping Composer script, are an ideal match to ease compliance. The Composer script cs-fix shown next will be available for maintainers and contributors alike.

composer.json

```json
{
    "__comment": "omitted other configuration",
[...]
```
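One of the artifacts listed in the tree above, .gitattributes, is a cheap consistency win: it can keep development-only files out of Composer dist downloads. A minimal sketch; the selection of paths is an assumption for illustration, not Construct's actual output:

```
# Exclude development artifacts from Composer dist archives
/tests            export-ignore
/.editorconfig    export-ignore
/.gitattributes   export-ignore
/.gitignore       export-ignore
/.travis.yml      export-ignore
```

With these entries in place, a composer install of a tagged release ships only the files consumers actually need.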

Enforcing target descriptions within build files with a Git hook


When automating mundane tasks of a project or development environment with a build tool like Phing or Ant, the driving build file will naturally accumulate several targets and tasks over time. To ease the build file acceptance within a team, and at a later stage also the contribution rate by team members, it's crucial that all build targets have a description attribute to provide at least a rough outline of the build features at hand. When these attributes are in place, the (potential) build file user will get such an outline by executing the build tool's list command (phing -l or ant -p). To get a better picture of the problem at hand, imagine a project poorly covered with tests and your personal attitude towards extending it, or just take a peek at the screenshot below showing a very poorly documented build file.

To overcome this accumulation of some sort of technical debt (i.e. poorly documented targets) there are various options at hand. The first one, not covered in this blog post, would be to add a pursuant test which verifies the existence of a description for every target/task of the build file under test. As it's very uncommon, at least from what I've heard, to have your build files covered by tests, the next thinkable approach would be to use a Git pre-commit hook to guard your repository/ies against the creeping in of such poorly documented build files.

The next listing shows such a Git hook (also available via GitHub) scribbled away in PHP, which detects any build file(s) following a common build file naming schema (i.e. build.xml|build.xml.dist|personal-build.xml|…) prior to the actual commit. For every target element in the detected build file(s) it's then verified that it has a description attribute and that its actual content is long enough to carry some meaning. If one of those two requirements isn't met, the commit is rejected while revealing the build file smells to the committer, so she can fix it, as shown in the outro screenshot. Happy build file sniffing.

```php
#!/usr/bin/php
<?php
// …
            if (count($violations) > 0) {
                $allViolations[$file] = $violations;
            }
        }
    }

    return $allViolations;
}

/**
 * @param array $allViolations
 * @return void
 */
function fireBackPossibleViolationsAndExitAccordingly(array $allViolations)
{
    if (count($allViolations) > 0) {
        foreach ($allViolations as $buildFile => $violations) {
            $buildFileConsoleMessageHeader = sprintf("Build file '%s':", $buildFile);
            echo $buildFileConsoleMessageHeader . PHP_EOL;
            foreach ($violations as $violationMessage) {
                $buildFileConsoleMessageLine = sprintf(" + %s", $violationMessage);
                echo $buildFileConsoleMessageLine . PHP_EOL;
            }
        }
        if (count($allViolations) > 1) {
            $rejectCommitConsoleMessag[...]
```
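For completeness, this is roughly how such a hook gets activated; the throwaway repository structure and the trivial hook body here are stand-ins, not the PHP hook from the listing. Git runs .git/hooks/pre-commit before each commit, and a non-zero exit status rejects the commit:

```shell
# Create a throwaway repository layout and install an executable
# pre-commit hook into it.
repo=$(mktemp -d)
mkdir -p "$repo/.git/hooks"
cat > "$repo/.git/hooks/pre-commit" <<'HOOK'
#!/bin/sh
echo "pre-commit hook ran"
exit 0
HOOK
chmod +x "$repo/.git/hooks/pre-commit"
hook_output=$("$repo/.git/hooks/pre-commit")   # invoke it as Git would
echo "$hook_output"
```

Dropping the PHP script from the listing into that hooks directory (and making it executable) is all the wiring the sniffing needs.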

Measuring & displaying Phing build times with buildhawk


Recently I installed a Ruby gem called buildhawk which allows measuring and displaying the build times of Rake driven builds. As I like the idea behind this tool a lot, but mostly use Phing for build orchestration, it was time to explore the possibility of interconnecting them both. In this blog post I'll show an implementation of an apposite Phing logger gathering the buildhawk compatible build times via git note(s), and how to put the interplay between those two tools to work.

Logging on

As mentioned above, the build time of each build is stored as a git note associated to the repository's HEAD, reflecting the current state of the system under build (SUB), which assumes that the SUB is versioned via Git. The next shown Phing logger (i.e. BuildhawkLogger) grabs the overall build time by hooking into the buildFinished method of the extended DefaultLogger class, transforms it into a buildhawk specific format, and finally adds it as a git note.

```php
/**
 * @see BuildEvent
 * @link Buildhawk on GitHub
 * @package phing.listener
 */
class BuildhawkLogger extends DefaultLogger
{
    /**
     * @var string
     */
    private $_gitNotesCommandResponse = null;

    /**
     * Behaves like the original DefaultLogger, plus adds the total build time
     * as a git note to the current repository HEAD.
     *
     * @param BuildEvent $event
     * @see BuildEvent::getException()
     * @see DefaultLogger::buildFinished
     */
    public function buildFinished(BuildEvent $event)
    {
        parent::buildFinished($event);

        if ($this->_isProjectGitDriven($event)) {
            $error = $event->getException();
            if ($error === null) {
                $buildtimeForBuildhawk = $this->_formatBuildhawkTime(
                    Phing::currentTimeMillis() - $this->startTime
                );
                if (!$this->_addBuildTimeAsGitNote($buildtimeForBuildhawk)) {
                    $message = sprintf(
                        "Failed to add git note due to '%s'",
                        $this->_gitNotesCommandResponse
                    );
                    $this->printMessage($message, $this->err, Project::MSG_ERR);
                }
            }
        }
    }

    /**
     * Checks (rudimentarily) if the project is Git driven.
     *
     * @param BuildEvent $event
     * @return boolean
     */
    private function _isProjectGitDriven(BuildEvent $event)
    {
        $project = $event->getProject();
        $projectRelativeGitDir = sprintf(
            '%s/.git',
            $project->getBasedir()->getPath()
        );

        return file_exists($projectRelativeGitDir)
            && is_dir($projectRelativeGitDir);
    }

    /**
     * Formats a time micro integer to a buildhawk readable format.
     *
     * @param integer $micros The time stamp
     */
    private function _formatBuildhawkTime($micros)
    {
        return sprintf("%0.3f", $micros);
    }

    /**
     * Adds the build time as a git note to the current repository HEAD.
     *
     * @param string $buildTime The build time of the build
     * @return mixed True on success otherwise the command failure response
     */
    private function _addBuildTimeAsGitNote($buildTime)
    {
        $gitNotesCommand = sprintf(
            "git notes --ref=buildtime add -f -m '%s' HEAD 2>&1",
            $buildTime
        );
        $gitNotesCommandResponse = exec($gitNotesCommand, $output, $return);

        if ($return !== 0) {
            $this->_gitNotesCommandResponse = $gitNotesCommandResponse;
            return false;
        }

        return true;
    }
}
```

Putting the logger to work

As the buildhawk l[...]
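The git note round trip the logger depends on can be tried in isolation; a small sketch using a throwaway repository, where the 12.345 build time is made up for the demonstration:

```shell
# Store and read back a build time under the buildtime notes ref,
# mirroring the command the logger shells out to.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'demo commit'
git -c user.name=demo -c user.email=demo@example.com \
    notes --ref=buildtime add -f -m '12.345' HEAD
git notes --ref=buildtime show HEAD
```

The final command prints the stored build time, which is exactly what buildhawk later reads per commit.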

Growling PHPUnit's test status


Two years ago I blogged about a Xinc (R.I.P.?) plugin that growls each build status for any via Xinc continuously integrated project. Since I'm using PHPUnit more and more lately, especially in continuous testing sessions (sprints without hitting the continuous integration server), my dependence on a fast and more visual feedback loop rose. In this post I'll provide an easy solution that meets these requirements by utilizing PHPUnit's test listener feature.

What's the motivation, yo?

While doing story or feature sprints embedded in a continuous testing approach, I first used a combination of stakeout.rb and PHPUnit's --colors option to radiate the test status, but soon wasn't that satisfied with the chosen route, as it happened that the console window got superimposed with other opened windows (e.g. API browser, TextMate etc.), especially on my 13.3" MacBook. To overcome this misery I decided to utilize PHPUnit's ability to write custom test listeners and to implement one that radiates the test status in a more prominent and sticky spot via Growl.

Implementing the Growl test listener

Similar to the ticket listener plugin mechanism I blogged about earlier, PHPUnit also provides one for test listeners. This extension mechanism allows bending the test result formatting and output to the given needs and scenarios a developer might face, and therefore is a perfect match. To customize the test feedback and visualization, the test listener has to implement the provided PHPUnit_Framework_TestListener interface. A few keystrokes later I ended up with the next shown implementation, which is also available via a GitHub gist, supporting the previously stated requirements.

```php
// …
        $this->_successPicturePath = $successPicturePath;
        $this->_incompletePicturePath = $incompletePicturePath;
        $this->_failurePicturePath = $failurePicturePath;
    }

    public function addError(PHPUnit_Framework_Test $test, Exception $e, $time)
    {
        $this->_errors[] = $test->getName();
    }

    public function addFailure(PHPUnit_Framework_Test $test,
        PHPUnit_Framework_AssertionFailedError $e, $time)
    {
        $this->_failures[] = $test->getName();
    }

    public function addIncompleteTest(PHPUnit_Framework_Test $test, Exception $e, $time)
    {
        $this->_incompletes[] = $test->getName();
    }

    public function addSkippedTest(PHPUnit_Framework_Test $test, Exception $e, $time)
    {
        $this->_skips[] = $test->getName();
    }

    public function startTest(PHPUnit_Framework_Test $test)
    {
    }

    public function endTest(PHPUnit_Framework_Test $test, $time)
    {
        $this->_tests[] = array(
            'name' => $test->getName(),
            'assertions' => $test->getNumAssertions()
        );
        $this->_assertionCount += $test->getNumAssertions();
    }

    public function startTestSuite(PHPUnit_Framework_TestSuite $suite)
    {
        if (count($this->_suites) === 0) {
            PHP_Timer::start();
[...]
```
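Registering such a listener typically happens in the phpunit.xml(.dist). A hedged sketch follows, in which the class name, file path, and picture paths are assumptions derived from the constructor fragment above, not the gist's actual values:

```xml
<phpunit>
    <listeners>
        <listener class="GrowlTestListener" file="GrowlTestListener.php">
            <arguments>
                <string>success.png</string>
                <string>incomplete.png</string>
                <string>failure.png</string>
            </arguments>
        </listener>
    </listeners>
</phpunit>
```

The three string arguments are handed to the listener's constructor, which stores them as the success, incomplete, and failure notification images.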

Installing the PHP redis extension on Mac OS X


Recently I took a look at Redis, a popular and advanced key-value store. Peeking at the supported languages section of the project's website, you'll notice a lot of client libraries available for PHP. Two of them caught my particular attention: Rediska, due to its impressive Zend Framework integration, and phpredis, as it's a native PHP extension written in C and therefore supposed to be blazingly faster than vanilla PHP client libraries. The following blog post will show how to install and configure the aforementioned native PHP extension on a Mac OS X system.

The next steps assume that you've installed redis on your machine. In case you are using MacPorts and haven't installed the key-value store yet, all it takes are the following two commands and you're good to go. In case you prefer Homebrew for managing your package/software installations, there's also a Formula for redis available that allows you to install it via brew install redis.

```shell
sudo port install redis
sudo launchctl load -w /Library/LaunchDaemons/org.macports.redis.plist
```

The very first step for building the native PHP redis extension is to get the source code by cloning the GitHub repository of the extension without its revision history.

```shell
mkdir phpredis-build
cd phpredis-build
git clone --depth 1 git:// phpredis
```

The next task is to compile the extension with the following batch of commands.

```shell
phpize
./configure
make
sudo make install
```

The next to last step is to alter your php.ini, use php --ini | grep 'Loaded' to get its location on your system, so that the redis module/extension is available to your PHP ecosystem. Therefore simply add extension=redis.so in the Dynamic Extensions section of your php.ini. Afterwards you can verify that the redis module is loaded and available via one of the following commands.

```shell
php -m | grep redis
php -i | grep 'Redis Support'
```

To make the extension also available to the running Apache PHP module you'll need to restart the Apache server. Looking at phpinfo()'s output in a browser, you should see the entry shown in the next image.

For testing the communication between the just installed redis extension and the running Redis server, I further created a simple test script called redis-glue-test.php which you can fetch from GitHub and run via the next commands.

```shell
curl -s -o redis-glue-test.php
php redis-glue-test.php
```

When you see the following shown console output you're good to go. Happy Redising![...]
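What such a glue test boils down to can be sketched as follows; this is an illustrative stand-in, not the original redis-glue-test.php, and it assumes a Redis server listening on the default port 6379:

```php
<?php
// Connect via the phpredis extension, write a key, and read it back.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->set('glue-test', 'phpredis speaks to redis');
echo $redis->get('glue-test') . PHP_EOL;
```

If the echoed value comes back, the extension and the server are wired up correctly.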

Using MongoHq in Zend Framework based applications


As the name slightly foreshadows, MongoHq is a currently bit pricey cloud-based hosting solution for MongoDb databases provided by CommonThread. Since they went live a few weeks ago, I signed up for the small plan and started to successfully tinker with it in an exploratory Zend Framework based application. Therefore the following post will show how to bootstrap such an instance into a Zend Framework based application and how to use it from there in some simple scenarios, like storing data coming from a Zend_Form into a designated collection and, vice versa, fetching it from there.

Bootstrapping a MongoHq enabled connection

To establish the MongoDb connection and make it application-wide available, the almighty Zend_Application component came to the rescue again. After reading Matthew Weier O'Phinney's enlightening blog post about creating re-usable Zend_Application resource plugins, and deciding to use MongoDb in some more exploratory projects, I figured it would be best to create such a plugin and ditch the also possible resource method approach. The next code listing shows a possible implementation of the MongoDb resource plugin, initializing a Mongo instance for the given APPLICATION_ENV (i.e. production) mode. For the other application environment modes (development | testing | staging) it's currently assumed that no database authentication is enabled, which is also the default when using MongoDb, so you might need to adapt the plugin to your differing needs; and since I'm currently only rolling on the small plan, the support for multiple databases is also not accounted for.

library/Recordshelf/Resource/MongoDb.php

```php
// …
    protected $_options = array(
        'hostname'     => '',
        'port'         => '27017',
        'username'     => null,
        'password'     => null,
        'databasename' => null,
        'connect'      => true
    );

    /**
     * Initializes a Mongo instance.
     *
     * @return Mongo
     * @throws Zend_Exception
     */
    public function init()
    {
        $options = $this->getOptions();

        if (null !== $options['username']
            && null !== $options['password']
            && null !== $options['databasename']
            && 'production' === APPLICATION_ENV
        ) {
            // Database Dsn with MongoHq credentials
            $mongoDns = sprintf('mongodb://%s:%s@%s:%s/%s',
                $options['username'],
                $options['password'],
                $options['hostname'],
                $options['port'],
                $options['databasename']
            );
        } elseif ('production' !== APPLICATION_ENV) {
            $mongoDns = sprintf('mongodb://%s:%s/%s',
                $options['hostname'],
                $options['port'],
                $options['databasename']
            );
        } else {
            $exceptionMessage = sprintf(
                'Resource %s is not configured correctly',
                __CLASS__
            );
            throw new Zend_Exception($exceptionMessage);
        }

        try {
            return new Mongo($mongoDns, array('connect' => $options['connect']));
        } catch (MongoConnectionException $e) {
            throw new Zend_Exception($e->getMessage());
        }
    }
}
```

With the MongoDb resource plugin in the place to be, it's time to make it known to the bootstrapping mechanism, which is done by registering the resource plugin in the application.ini. Further, the MongoHq credentials, which are available in the MongoHq > My Database section, and the main database name are added to the configuration file, which will be used to set the definable resource plugin ($_)options and to connect to the hosted database.

application/configs/application.ini

[production[...]
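Once bootstrapped, the connection can be pulled from the bootstrap inside a controller action. A hedged sketch in which the resource key mongoDb and the database/collection names are assumptions, not taken from the original application:

```php
// Fetch the bootstrapped Mongo instance and select a collection.
$bootstrap = $this->getInvokeArg('bootstrap');
$mongo = $bootstrap->getResource('mongoDb');
$records = $mongo->selectDB('recordshelf')->selectCollection('records');
$records->insert(array('artist' => 'Example Artist'));
```

From here, validated Zend_Form values can be inserted into, or fetched from, the designated collection as described above.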

Utilizing Twitter lists with Zend_Service_Twitter


Several months ago Twitter added the list feature to its public API. While debating some use cases for an event registration application, I stumbled upon an interesting feature, which adds participants automatically to a Twitter list upon registration. This way registered and interested users can discover like-minded individuals and get in touch prior to any pre-social event activities. This post will show how this feature can be implemented by utilizing the Zend_Service_Twitter component, and how it can then be used in a Zend Framework based application.

Implementing the common list features

Looking at the three relevant parts of the Twitter list API, some common features emerged and had to be supported to get the feature out of the door. These are namely the creation and deletion of lists and the addition and removal of list members (i.e. event participants). Since the current Twitter component doesn't support these list operations out of the box, it was time to put that developer hat on and get loose; which was actually a joy due to the elegance of the extended Zend_Service_Twitter component laying all the groundwork. A non-feature-complete implementation is shown in the next code listing and can alternatively be pulled from GitHub. Currently it only supports the above stated common operations, plus the ability to get the lists of a Twitter account and its associated members; but feel free to fork it or even turn it into an official proposal.

```php
// …
        $this->_methodTypes[] = 'list';
    }

    /**
     * Creates a list associated to the current user.
     *
     * @param string $listname The listname to create.
     * @param array $options The options to set whilst creating the list.
     *  Allows to set the list creation mode (public|private)
     *  and the list description.
     * @return Zend_Rest_Client_Result
     * @throws Zend_Service_Twitter_Exception
     */
    public function create($listname, array $options = array())
    {
        $this->_init();

        if ($this->_existsListAlready($listname)) {
            $exceptionMessage = 'List with name %s exists already';
            $exceptionMessage = sprintf($exceptionMessage, $listname);
            throw new Zend_Service_Twitter_Exception($exceptionMessage);
        }

        $_options = array('name' => $this->_validListname($listname));

        foreach ($options as $key => $value) {
            switch (strtolower($key)) {
                case 'mode':
                    $_options['mode'] = $this->_validMode($value);
                    break;
                case 'description':
                    $_options['description'] = $this->_validDescription($value);
                    break;
                default:
                    break;
            }
        }

        $path = '/1/%s/lists.xml';
        $path = sprintf($path, $this->getUsername());
        $response = $this->_post($path, $_options);

        return new Zend_Rest_Client_Result($response->getBody());
    }

    /**
     * Deletes an owned list of the current user.
     *
     * @param string $listname The listname to delete.
     * @return Zend_Rest_Client_Result
     *
[...]
```
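Using the create operation from the listing above could look like the following sketch; the class name ListAwareTwitterService and the credentials are pure assumptions for illustration, not the component's actual name:

```php
// Create a private participants list for an event, matching the
// create($listname, $options) signature shown in the listing.
$twitter = new ListAwareTwitterService('username', 'password');
$result = $twitter->create('event-participants', array(
    'mode'        => 'private',
    'description' => 'Registered participants of the upcoming event',
));
```

Upon successful registration, each participant's screen name could then be handed to the corresponding member-addition operation.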

Closing and reopening GitHub issues via PHPUnit tests


Since PHPUnit 3.4.0 a new extension point for interacting with issue tracking systems based on the test results has been added to PHP's first choice xUnit framework. The extension point has been introduced by an abstract PHPUnit_Extensions_TicketListener class, which allows developers to add tailor-made ticket listeners supporting their favoured issue tracker. Currently PHPUnit ships with a single ticket listener for Trac, as it's still the issue tracker used for the framework itself. As I start to become more and more accustomed to using GitHub for some of my exploratory projects and hacks, the following blog post will contain a GitHub_TicketListener implementation and a showcase of its usage.

Annotating tests with ticket meta data

As you might know, it's considered a best practice to write a test for each new ticket representing a bug and drive the system under test (SUT) till the issue is resolved. This extension of test-driven development is also known as test-driven bug fixing. To create a relation between these tests and their associated tickets, PHPUnit provides a new @ticket annotation which will be analyzed before each test is run. The following code listing shows such an annotated test.

```php
// …
        if ($this->_isCurlAvailable() === false) {
            throw new RuntimeException('The dependent curl extension is not available');
        }
        if ($this->_isJsonAvailable() === false) {
            throw[...]
```
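An annotated test in the spirit described above could look like the following sketch; the test class and the ticket number are illustrative stand-ins, not the original listing:

```php
class DefectTest extends PHPUnit_Framework_TestCase
{
    /**
     * @ticket 12
     */
    public function testReportedDefectIsFixed()
    {
        // Once this test passes, the ticket listener can close GitHub
        // issue 12; if it starts failing again, the listener can reopen it.
        $this->assertTrue(true);
    }
}
```

The @ticket value is what ties the test result back to the issue the listener manipulates.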

Zend Framework 1.8 Web Application Development book review


As the days are rapidly getting shorter, my reading appetite grows exponentially, and this evening I finished the 'Zend Framework 1.8 Web Application Development' book written by Keith Pope. While Keith worked on the book, I peeked several times at its tutorial application, dubbed the Storefront, to get me going with the new Zend_Application component. Looking at its code made me feel certain I'd get another great digest of the new features and components of version 1.8, and also a different practical perspective on web application development with the Zend Framework, once the book was published. Therefore I got in touch with the publisher Packt and fortunately got a copy, of which I'd like to share a personal review in this blog post.

What's in it?

The book opens with a quick run-through of the Model-View-Controller (MVC) architecture by creating a project structure via Zend_Tool and building a first very basic web application. While this introduction intentionally skips over a lot of details, the following chapter provides very detailed insights into the Zend Framework's MVC components by explaining the surrounding objects, the Design Patterns they are based upon and their interactions. After laying out that hefty block of theory the aforementioned tutorial application is introduced and built incrementally over several chapters, each one going into more detail for the specific application aspect. The highlights of these chapters range from introducing the Fat Model Skinny Controller concept and thoughts on Model design strategies, which are reflected in a custom Storefront Model design, to developing application-specific Front Controller Plugins, Action Helpers, and View Helpers.
The application walk-through is completed by looking at general techniques to optimize the Storefront application and by building an automated PHPUnit Test Suite of functional tests utilizing Zend_Test to keep the Zend Framework based application self-reliant and refactorable.

Conclusion

The book by Keith Pope provides any interested PHP developer who's not already sold on a specific framework a thorough introduction to the vivid Zend Framework and its use in an MVC-based web application development context. The content of the book is delivered in a fluent, very enthusiastic and 'knowledge-pillowed' writing tone. By implementing or working through the Storefront application, seasoned web developers using older versions of the Framework will get a good blueprint of new components like Zend_Application and its implications for the bootstrapping process; while new developers tending towards picking up the Zend Framework will get a current and well-compiled guide, which might start off with a steep learning curve but will turn into profound knowledge once you hang in there. The only thing that seemed a bit odd to me was the use of Ant instead of Phing as the build tool for the Storefront application (to set the application environment, to remove all require_once statements from the framework library and to run the PHPUnit Test Suite); but this might also be down to my Phing nuttiness.[...]

Logging to MongoDb and accessing log collections with Zend_Tool


Influenced by a recent blog post of a colleague of mine, and by being kind of broke on a Saturday night, I tinkered with the just recently discovered MongoDb and hooked it into the Zend_Log environment by creating a dedicated Zend_Log_Writer. The following post will therefore present a peek at a prototypesque implementation of this writer and show how the afterwards accumulated log entries can be accessed and filtered with a custom Zend_Tool project provider.

Logging to a MongoDb database

The following steps assume that an instance of a MongoDb server is running and that the required PHP MongoDb module is also installed and loaded. To pass log entries on to a MongoDb database there is a need to craft a proper Zend_Log_Writer. This can be achieved by extending the Zend_Log_Writer_Abstract class, injecting a Mongo connection instance and implementing the actual write functionality as shown in the next listing.

_connection = $connection;
    $this->_db = $this->_connection->selectDB($db)->createCollection(
        $collection
    );
}

public function setFormatter($formatter)
{
    require_once 'Zend/Log/Exception.php';
    throw new Zend_Log_Exception(get_class() . ' does not support formatting');
}

public function shutdown()
{
    $this->_db = null;
    $this->_connection->close();
}

protected function _write($event)
{
    $this->_db->insert($event);
}

/**
 * Create a new instance of Recordshelf_Log_Writer_MongoDb
 *
 * @param array|Zend_Config $config
 * @return Recordshelf_Log_Writer_MongoDb
 * @throws Zend_Log_Exception
 * @since Factory Interface available since release 1.10.0
 */
static public function factory($config)
{
    $exceptionMessage = 'Recordshelf_Log_Writer_MongoDb does not currently '
                      . 'implement a factory';
    throw new Zend_Exception($exceptionMessage);
}
}

With the MongoDb writer available and added to the library directory of the application, it's now possible to utilize this new storage backend as usual with the known Zend_Log component.
The Mongo connection injected into the writer is configured via Zend_Config and initialized via the Zend_Application bootstrapping facility as shown in the listings below.

application/configs/application.ini

[production]
app.name = recordshelf
...
log.mongodb.db = zf_mongo
log.mongodb.collection = recordshelf_log
log.mongodb.server = localhost
log.priority = Zend_Log::CRIT
...

application/Bootstrap.php

getOptions())); }

protected function _initLogger()
{
    $this->bootstrap(array('frontController', 'config'));
    $config = Zend_Registry::get('config');
    $applicationName = $config->app->get('name', 'recordshelf');
    $mongoDbServer = $config->log->mongodb->get('server', '');
    $mongoDbName = $config->log->mongodb->get('db', "{$applicationName}_logs");
    $mongoDbCollection = $config->log->mongodb->get('collection', 'entries');
    $logger = new Zend_Log();
    $writer = new Recordshelf_Log_Writer_MongoDb(new Mongo($mongoDbServer),
        $mongoDbName, $mongoDbCollection);
[...]

Kicking off custom Phing task development with TextMate


As a reader of this blog you might have noticed that from time to time I like to utilize Phing's ability to write custom tasks. Though that's not an everyday routine for me, and therefore I might, depending on my form of the day, end up with some real smelly code where, for example, the task's property validation is handled in the task's main worker method. This is a bad habit/practice I'm aware of, and to improve my future endeavours in custom Phing task development, I bent TextMate's snippet feature to my needs.

Snippets in TextMate are a very powerful feature that can be used to insert code that you do not want to type again and again, or like in my case might have forgotten over a certain time.

The next code listing shows the snippet providing a basic custom Phing task class skeleton which can be utilized over and over at the beginning of the implementation activities.
require_once 'phing/Task.php';

class ${1:CustomName}Task extends Task
{
    private \$_${2:property} = null;

    /**
     * @param string \$${2:property} ${3:description}
     */
    public function set${2/./\u$0/}(\$${2:property})
    {
        \$this->_${2:property} = trim(\$${2:property});
    }

    /**
     * Initializes the task environment if necessary
     */
    public function init()
    {
    }

    /**
     * Does the task main work or delegates it
     *
     * @throws BuildException
     */
    public function main()
    {
        \$this->_validateProperties();
    }

    /**
     * Validates the task properties
     *
     * @throws BuildException
     */
    private function _validateProperties()
    {
        if (is_null(\$this->_${2:property})) {
            throw new BuildException('${4:message}.');
        }
    }
}
To apply the snippet on a PHP source file, after installing it, it can either be selected from the Bundles menu or, more comfortably, via the assigned tab trigger, i.e. ctask. After triggering the snippet it's possible to properly name the task under development and dynamically set its first property, which is also treated as a mandatory property in the extracted _validateProperties method.

The outro image shows the above stated snippet in the TextMate Bundle Editor and its configuration.



Scaffolding, implementing and using project specific Zend_Tool_Project_Providers


Working on a project involving several legacy data migration tasks, I got curious what the Zend_Tool_Project component of the Zend Framework offers to create project-specific providers for the above-mentioned tasks or ones of a similar nature. Therefore the following post will try to show how these providers can be developed in an iterative manner by scaffolding them via the capabilities of the Zend_Tool_Project ProjectProvider provider, enlivened with action/task logic, and used in the project scope.

Scaffolding project specific providers

All following steps assume there is a project available, i.e. recordshelf, initially created with the Zend_Tool_Project Project provider, and that the forthcoming commands are issued from the project root directory against the zf command line client. The scaffolding of a project-specific provider can be triggered via the create action of the ProjectProvider provider by passing in the name of the provider, i.e. csv, and its intended actions. As the next console snippet shows, it's possible to specify several actions as a comma-separated list.

sudo zf create project-provider csv importSpecials,importSummersale

After running the command the project's profile .zfproject.xml has been modified and a new providers directory exists in the project root directory containing the scaffolded Csv provider. The next code snippet shows the initial Csv provider class skeleton and its two empty action methods named importSpecials and importSummersale.
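A hedged sketch of what such a scaffolded provider skeleton looks like (the base class is stubbed here purely so the snippet is self-contained; in the project it extends the real Zend Framework class):

```php
<?php
// Illustrative sketch of the scaffolded Csv provider skeleton.
// The stub below stands in for the real framework base class.
class Zend_Tool_Project_Provider_Abstract
{
}

class Csv extends Zend_Tool_Project_Provider_Abstract
{
    public function importSpecials()
    {
        // action logic to come
    }

    public function importSummersale()
    {
        // action logic to come
    }
}

$provider = new Csv();
var_dump(method_exists($provider, 'importSpecials')); // bool(true)
```

Each public method becomes a callable provider action, e.g. `zf import-specials csv`.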
At the time of this writing, using the Zend Framework 1.8.4 and PHP 5.2.10 on a Mac OS X system, the generated Csv provider code or the mapping in the .zfproject.xml is incorrect, but can be fixed by renaming the class from CsvProvider to Csv.

_getProjectProfileResource($profile, $projectProviderName);
    return $projectProviderResource instanceof Zend_Tool_Project_Profile_Resource;
}

private function _isActionSupportedByProjectProvider(Zend_Tool_Project_Profile $profile,
    $projectProviderName, $actionName)
{
    $projectProviderResource = $this->_getProjectProfileResource($profile,
        $projectProviderName);
    $projectProviderAttributes = $projectProviderResource->getContext()
        ->getPersistentAttributes();

    return in_array($actionName,
        explode(',', $projectProviderAttributes['actionNames']));
}

priva[...]

Testing Phing buildfiles with PHPUnit


While transforming some of the Ant buildfile refactorings described in Julian Simpson's seminal essay into a Phing context, it felt plainly wrong that I didn't have any tests for the buildfile to back me up on retaining the pristine behaviour throughout the process. While Ant users can rely on an Apache project called AntUnit, there are currently no tailor-made tools available for testing or verifying Phing buildfiles. Therefore I took a weekend off, locked myself in the stuffy lab, and explored the possibilities to test Phing buildfiles, respectively their included properties, targets and tasks, with the PHPUnit testing framework. In case you'd like to take a peek at the emerged lab jottings, keep on scanning.

Introducing the buildfile under test

The buildfile that will be used as an example is kept simple and contains several targets, ranging from common ones, like initializing the build environment by creating the necessary directories, to more specific ones, like pulling an external artifact from GitHub. To get an overview of the buildfile under test have a look at the following listing.

repository = $repository;
}

function setDest($destDirectory)
{
    $this->destDirectory = $destDirectory;
}

function main()
{
    // Get project name from repos Uri
    $projectName = str_replace('.git', '',
        substr(strrchr($this->repository, '/'), 1));
    $gitCommand = 'git clone ' . $this->repository . ' '
        . $this->destDirectory . '/' . $projectName;
    exec(escapeshellcmd($gitCommand), $output, $return);
    if ($return !== 0) {
        throw new BuildException('Git clone failed');
    }
    $logMessage = 'Cloned Git repository ' . $this->repository . ' into '
        . $this->destDirectory . '/' . $projectName;
    $this->log($logMessage);
}
}]]><[...]
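One lightweight way to assert on a buildfile's structure from a test is to load the XML with SimpleXML and inspect the defined targets and their dependencies. This is a sketch of the general idea under stated assumptions (an inlined example buildfile, plain asserts standing in for PHPUnit's assertion methods), not the post's exact test suite:

```php
<?php
// Sketch: verifying a Phing buildfile's targets and dependencies by
// parsing it with SimpleXML. The buildfile content here is made up.
$buildXml = <<<'XML'
<?xml version="1.0"?>
<project name="example" default="build">
    <target name="init"/>
    <target name="build" depends="init"/>
</project>
XML;

$project = simplexml_load_string($buildXml);

// Map each target name to its declared dependencies
$targets = array();
foreach ($project->target as $target) {
    $targets[(string) $target['name']] = (string) $target['depends'];
}

// In a real PHPUnit test these would be $this->assertSame(...) calls.
assert(array_key_exists('build', $targets));
assert($targets['build'] === 'init');
assert((string) $project['default'] === 'build');
```

Behavioural tests (did a target actually create its directories?) would additionally shell out to the phing binary and assert on the filesystem afterwards.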

Creating and using Phing ad hoc tasks


Sometimes there are build scenarios where you'll badly need a functionality, like adding an MD5 checksum file to a given project, that is provided neither by the available Phing core tasks nor the optional ones. Phing supports developers with two ways of extending the usable task pool: by writing 'outline' tasks that will end up in a directory of the Phing installation, or by utilizing the AdhocTaskdefTask, which allows defining custom tasks in the buildfile itself. The following post will try to outline how to define and use these inline tasks, by sketching an ad hoc task that enables the build orchestra to clone Git repositories from GitHub during a hypothetical workbench setup.

Creating the inline/ad hoc task

The AdhocTaskdefTask expects a name attribute, i.e. github-clone, for the XML element which will later refer to the ad hoc task, and a CDATA section hosting the task implementation. Similar to 'outline' tasks, the ad hoc task extends Phing's Task class, configures the task via attributes and holds the logic to perform. Unfortunately inline task implementations don't allow requiring or including external classes available in the include_path, like Zend_Http_Client, which I initially tried to use for an example task fetching short Urls. This limits the available functions and classes to craft the task from to the ones built into PHP. The following buildfile snippet shows the implementation of the github-clone ad hoc task, which is wrapped by a private target to encourage reusability and limit its callability.

repository = $repository;
}

function setDest($destDirectory)
{
    $this->destDirectory = $destDirectory;
}

function main()
{
    // Get project name from repos Uri
    $projectName = str_replace('.git', '',
        substr(strrchr($this->repository, '/'), 1));
    $gitCommand = 'git clone ' . $this->repository . ' '
        . $this->destDirectory . '/' . $projectName;
    exec(escapeshellcmd($gitCommand), $output, $return);
    if ($return !== 0) {
        throw new BuildException('Git clone failed');
    }
    $logMessage = 'Cloned Git repository ' . $this->repository . ' into '
        . $this->destDirectory . '/' . $projectName;
    $this->log($logMessage);
}
}
]]>

Using the ad hoc task

With the ad hoc task in place, its functionality can now be used from any target via the task's XML element, named according to the name attribute given in the AdhocTaskdefTask element earlier, i.e. github-clone, and by feeding it the required attributes, i.e. repos and dest. The next snippet allows you to take a peek at the complete buildfile with the ad hoc task in action. [...]
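Calling such an ad hoc task from a target can be sketched like this. The target name, repository URL and property are illustrative placeholders; the attribute names are assumptions derived from the setters shown above (setRepository/setDest), since the complete buildfile was not reproduced here:

```xml
<target name="setup-workbench">
    <mkdir dir="${workbench.dir}"/>
    <github-clone repository="git://github.com/user/project.git"
                  dest="${workbench.dir}"/>
</target>
```

Phing maps each attribute to the corresponding setter on the task class, so the attribute names must match the setters minus the set prefix.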

Using Haml & Sass from a Rake task


Some time ago I had the 'lightning' idea to implement another Rake automation to support my current blogging workflow, which at the moment consists of finding a sparkling idea to blog about, writing it out in WriteRoom and refining the post in TextMate before publishing. As this process was a recurring and copy & paste driven event, I strove for an automation supporting this workflow. So unsurprisingly this post will show my current solution to achieve this goal by utilizing Rake, Haml and Sass.

So what's that Haml and Sass thingy?

Haml (HTML Abstraction Markup Language) is a templating language/engine with the primary goal of making Markup DRY, beautiful and readable again. It has a very shallow learning curve and is therefore perfectly suited for programmers and designers alike. Haml is primarily targeted at making the views of Ruby on Rails, Merb or Sinatra web applications leaner, but as you will see later the Ruby implementation can also be used independently of a framework. Sass (Syntactically Awesome StyleSheets) is a module which comes bundled with Haml, providing a meta-language/abstraction on top of CSS and sharing the same goals and advantages as Haml.

Gluing Haml and Sass into a Rake task

To get going you first have to install Haml and Sass by running the gem command shown next.

sudo gem install haml

With Haml and Sass available it's about time to identify and outline the parts you want to automate; in my case it's the creation of a WriteRoom and/or an XHTML draft document for initial edits. So the parameters to pass into the task to come are the targeted editor(s), the title of the blog post to draft and a list of associated, whitespace-separated category tags. The XHTML document skeleton content and its inline CSS are each defined in a separate Haml and Sass template file and will be rendered into the resulting document along with the content passed into the Rake task.
The document skeleton for the WriteRoom draft document, due to its brevity, is defined inside the task itself. The following snippets show the mentioned Haml and Sass templates for the XHTML draft output file, which are located in the same directory as the Rake file.

Haml

!!! 1.1
%html
  %head
    %title= "#{title} - Draft"
    %style{ :type => 'text/css' }= inline_css
  %body
    %h3= title
    %h4.custom sub headline
    %pre.consoleOutput console command
    %pre.codeSnippet code snippet
    %br/
    = "Tags: #{tags.join ', '}"

Sass

body
  :margin 5
  :line-height 1.5em
  :font small Trebuchet MS, Verdana, Arial, Sans-serif
  :color #000000

h4
  :margin-bottom 0.3em

.consoleOutput
  :padding 6px
  :background-color #000
  :color rgb(20, 218, 62)
  :font-size 12px
  :font-weight bolder

.codeSnippet
  :padding 3px
  :background-color rgb(243, 243, 243)
  :color rgb(93, 91, 91)
  :font-size small
  :border 1px solid #6A6565

To inject the dynamic content into the Haml template and have it rendered into the resulting document, the values, i.e. draft_title, draft_tags and draft_inline_css, have to be made available to the template engine by passing them in a bundling Hash into the to_html alias method of the Haml Engine object, like shown in the next Rake task.

task :default do
  Rake::Task['blog_utils:create_draft_doc'].invoke
end

namespace :blog_utils do
  desc 'Create a new draft document for a given title, category tags and editor'
  task :create_draft_doc, [:title, :tags, :editor] do |t, args|
    draft_title = args.title
    draft_tags = args.tags.split(' ')
    draft_target_editor = args.editor
    raise_message = 'No title for draft provided'
    raise raise_message if draft_title.nil?
    raise_message = [...]
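The argument handling the task performs before rendering can be sketched as a small standalone helper. The method name is hypothetical, not the Rakefile's exact code:

```ruby
# Hypothetical helper mirroring the Rake task's argument handling:
# validate the title and split the whitespace-separated category tags.
def parse_draft_args(title, tags)
  raise 'No title for draft provided' if title.nil? || title.strip.empty?
  { :title => title, :tags => tags.to_s.split(' ') }
end

draft = parse_draft_args('Using Haml and Sass from a Rake task', 'ruby haml sass')
puts draft[:tags].inspect # ["ruby", "haml", "sass"]
```

Keeping the validation in a plain method also makes it trivially testable outside of Rake.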

Phplocing your projects with Phing


When I started to play around with Ruby on Rails, my attention soon got drawn to its Rake stats task, which provides developers, or more likely project managers, with an overview of the actual project size. Exactly one month ago Sebastian Bergmann, of PHPUnit fame, started to implement a similar tool dubbed phploc, which can give you an overview of the size of any given PHP project. As I wanted to automate the invocation of this handy tool and collect its report output out of a Phing buildfile, I invested some time to develop a custom Phing task doing so. The following post will show you a possible implementation of this task and its use in a buildfile.

Installing phploc

To setup phploc on your system simply install the phploc PEAR package available from the pear.phpunit.de channel as shown in the next commands. In case you have already installed PHPUnit via PEAR you can omit the channel-discover command.

sudo pear channel-discover pear.phpunit.de
sudo pear install phpunit/phploc

Implementing the phploc task

As I already blogged about developing custom Phing tasks, I'm only going to show the actual implementation and not dive into any details; alternatively you can also grab it from this public GitHub repository.

suffixesToCheck = array('php');
    $this->acceptedReportTypes = array('cli', 'txt', 'xml');
    $this->reportType = 'cli';
    $this->reportFileName = 'phploc-report';
    $this->fileSets = array();
    $this->filesToCheck = array();
}

public function setSuffixes($suffixListOrSingleSuffix)
{
    if (stripos($suffixListOrSingleSuffix, ',')) {
        $suffixes = explode(',', $suffixListOrSingleSuffix);
        $this->suffixesToCheck = array_map('trim', $suffixes);
    } else {
        array_push($this->suffixesToCheck, trim($suffixListOrSingleSuffix));
    }
}

public function setFile(PhingFile $file)
{
    $this->fileToCheck = trim($file);
}

public function createFileSet()
{
    $num = array_push($this->fileSets, new FileSet());
    return $this->fileSets[$num - 1];
}

public function setReportType($type)
{
    $this->reportType = trim($type);
}

public function setReportName($name)
{
    $this->reportFileName = trim($name);
}

public function setReportDirectory($directory)
{
    $this->reportDirectory = trim($directory);
}

public function main()
{
    if (!isset($this->fileToCheck) && count($this->fileSets) === 0) {
        $exceptionMessage = "Missing either a nested fileset or the "
            . "attribute 'file' set.";
        throw new BuildException($exceptionMessage);
    }
    if (count($this->suffixesToCheck) === 0) {
        throw new BuildException("No file suffix defined.");
    }
    if (is_null($this->reportType)) {
        throw new BuildException("No report type defined.");
    }
    if (!is_null($this->reportType) && !in_array($this->reportTyp[...]
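Registering and invoking the task in a buildfile might look like the following sketch. The attribute names are derived from the setters in the listing above (setReportType, setReportName, setReportDirectory, setSuffixes, createFileSet); the taskdef classpath and directory layout are assumptions:

```xml
<taskdef name="phploc" classname="phing.tasks.my.PhplocTask"/>

<target name="measure-project-size">
    <phploc reportType="xml" reportName="phploc-report"
            reportDirectory="${project.basedir}/reports" suffixes="php">
        <fileset dir="${project.basedir}/src">
            <include name="**/*.php"/>
        </fileset>
    </phploc>
</target>
```

The nested fileset is consumed via createFileSet(), while a single file could alternatively be fed through the file attribute, matching the either/or validation in main().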

Broadcasting blog post notifications to Twitter with Ruby and Rake


During my latest blogging absence I had some time to tinker around with Ruby. For an introductory challenge I chose to implement a real-life feature which currently isn't natively supported and screams siren-like for a one-button automation: broadcasting the latest blog entry to my Twitter account. As I didn't want to sign up for a Twitterfeed account and couldn't resort to the Twitter Tools plugin like WordPress users, I had to perform these broadcasting steps manually, until now. To see how this repetitive and time-stealing process was transformed into a semi-automated one by utilizing Ruby, a splash of Hpricot, Ruby's excellent Twitter API wrapper and Rake, read on my dear.

Installing the required RubyGems

Prior to diving into the implementation details of the given scenario I had to install the required RubyGems like shown in the next console snippet. The installation of the twitter gem might take a while due to its dependency on several other gems.

sudo gem install hpricot rake twitter

Scraping the latest blog post details with Hpricot

The initial implementation step was to gather the relevant metadata (Url, title and used tags) of the latest blog post. I first took the route of grabbing the blog's RSS feed and extracting the metadata from there, but soon stumbled into problems getting an outdated feed from Feedburner. The next alternative was to scrape the needed metadata directly from the blog landing page. As I went this route before with the Zend_Dom_Query component of the Zend Framework, I decided to use something similar from the Ruby toolbox.
Some Google hops later I was sold on Hpricot, an HTML parser for Ruby, and as you can see in the first code snippet, showing an extract of the Rake file to come, this is done in just 13 lines of code.

doc = Hpricot(open(blog_landing_page, scrape_options))
latest_post_url =' > a')['href']
latest_post_title =' > a').inner_html
label_doc = Hpricot('').first.to_s)
label_links =' > a').each do |label_link|
  label = label_link.inner_html.gsub(' ', '').downcase
  if label.include?('/')
    labels = label.split('/')
    labels.each { |label| last_post_labels.push(label) }
  else
    last_post_labels.push(label)
  end
end

Outstanding tasks

With the metadata available, the outstanding tasks to implement were:

- to get a short Url for the actual blog post by utilizing a public API of an Url shortening service, i.e. is.gd
- to build the tweet to broadcast by injecting the available metadata into a tweet template
- to broadcast the notification tweet to the given Twitter account
- to log the broadcasted blog title to prevent spamming or duplication scenarios

As a guy sold on build tools and eager to learn something new, I subverted Rake, Ruby's number one build language, to glue the above mentioned tasks and their implementation together, to manage their sequential dependencies and to have a comfortable invocation interface. The nice thing about Rake is that it allows you to implement each task's unit of work using the Ruby language; and there is no need to follow a given structure to implement custom tasks like it's the case for custom Phing tasks. As you will see in the forthcoming complete Rakefile some of the tasks are getting quite long and complex; therefore some of them are pending candidates for refactoring activities, like extracting task units of work into helper/worker classes.

require 'rubygems'
require 'hpricot'
require 'open-uri'
require 'twitter'

task :default do
  Rake[...]
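The tweet-building step can be sketched in isolation. This is a hypothetical helper, not the Rakefile's exact code: it injects the scraped metadata into a template and truncates an overlong title so the tweet stays within Twitter's 140 character limit.

```ruby
# Hypothetical sketch: compose the notification tweet from the scraped
# metadata, truncating the title to respect the 140 character limit.
def build_tweet(title, short_url, tags)
  prefix = 'New blog post: '
  suffix = " #{short_url}"
  suffix += ' #' + tags.join(' #') unless tags.empty?
  room = 140 - prefix.length - suffix.length
  title = title[0, room - 3] + '...' if title.length > room
  prefix + title + suffix
end

puts build_tweet('Broadcasting blog post notifications',
                 'http://is.gd/xyz', ['ruby', 'rake'])
```

The short Url itself would come from the shortening-service call of the first outstanding task.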

Installing Zend_Tool on Mac OS X


Yesterday I decided to tiptoe into the development of custom Zend_Tool Providers, as the introductory article series by Ralph Schindler motivated me to learn more about it and I already have some useful use cases in mind. Therefore I first had to install the Zend_Tool component and its driving CLI scripts on my MacBook. The following brief instruction describes a possible approach that got me running in no time on a Mac OS X system. Once the Zend Framework has an official PEAR channel, most of the forthcoming steps should be obsolete and entirely performed by the PEAR package installer command.

Fetching and installing the Zend_Tool component

First I tried to install the 1.8.0 (devel) version of the Zend Framework via the PEAR channel, but it currently only delivers the 1.7.3PL1 (stable) package, even after switching the stability state of the PEAR config. To dodge the include_path setting hassle, and for further use when customizing other tools like Phing tasks, I decided to keep the installed package.
sudo pear channel-discover
sudo pear install zfcampus/zf-devel
The next commands are showing the footwork I had to do to get the Zend_Tool component into the PEAR Zend Framework package installed in /opt/local/lib/php/Zend.
sudo svn co $HOME/Cos/Zend/Tool
sudo rsync -r --exclude=.svn $HOME/Cos/Zend/Tool /opt/local/lib/php/Zend

Putting the Zend_Tool CLI scripts to work

The next steps were to fetch the CLI scripts from the public Subversion repository and to link them into the system path /opt/local/bin as shown in the next commands.
sudo svn co $HOME/Cos/Zend/bin
sudo ln $HOME/Cos/Zend/bin/ /opt/local/bin/zf
sudo ln $HOME/Cos/Zend/bin/zf.php /opt/local/bin/zf.php

Checking the installation

With everything hopefully in place it was time to verify the success of the installation via the below stated provider action call; and as I got the version of the installed Zend Framework as a response to the executed action/command, I'm good to go.
zf show version


Rails for PHP Developers book review


The e-book version of the Pragmatic Programmers release 'Rails for PHP Developers', written by Derek DeVries and Mike Naberezny, has occupied some of my scarce hard drive space for several months now, and today I managed to hit the last page of it. In case you're interested in knowing if it's worth sacrificing some rare hard drive or bookshelf space for this book, read on.

What's in it?

The book consists of three main parts which address open-minded developers with a PHP background tempted to add the Ruby language and the thereupon-built Rails 2.0 framework to their toolset. The first part introduces the classic and nowadays omnipresent MVC pattern and the concepts and conventions of Rails by converting a simple PHP newsletter application into a Rails based one. The follow-up chapters of the first part cover the basics of the Ruby language by looking at known PHP language features and constructs, and how they translate to their Ruby counterparts. Reading these chapters you will get a thorough understanding of the Ruby language and be able to apply unique features like blocks or the reopening of existing classes. The communicated knowledge builds the foundation to accelerate the use and understanding of the Rails framework, which is covered in-depth throughout the book's second part. While teaming up with their imaginary buddy Joe, the authors walk you through building a Rails user group application. The chapters of the second part cover a lot of ground, reaching from domain modeling, putting the particular MVC parts to work, and ensuring quality by utilizing the Test::Unit library, to finally deploying the application into a production environment. The first two chapters of the final and reference part cover the differences and similarities between PHP and Ruby data structures, operations and language constructs.
The final chapter of the book closes with a web development specific comparison of PHP constructs and approaches to the ones used by the Rails framework. The book is accompanied by a dedicated blog and a PHP to Rails online reference to satisfy severe thirst for more knowledge.

Conclusion

The book provides interested PHP developers a thorough introduction to the Ruby language and the Rails framework in a fluent and enjoyable writing tone. By implementing the example application of the second book part, any decent PHP developer will derive a solid understanding of the Rails framework to build upon, which puts him in the position to make reasonable judgments for using/flaming it or not. IMHO this book is so far one of the best PHP related book releases of the fading year 2008, and can be a real motivator to extend the just gained knowledge by diving deeper into the Ruby/Rails ocean. So be prepared to see one or another Ruby related post popping up in the future timeline of this blog; I just added another costly addiction to the medicine cupboard. Word![...]

Tinyizing URLs with Zend_Http_Client


While doing some initial research for a blog related automation task to implement, I learned some more about services which transform long URLs into short ones. The best known of these services, due to the Twitter hype, is probably TinyURL, which can be accessed via a classic web interface or by calling a public API. In a recent blog post Dave Marshall outlined a quick workaround for tweeting via the Zend_Http_Client component, which is a reasonable approach for calling services that aren't in the Zend Framework core yet, like Zend_Service_Twitter, or are not supported out of the box. Therefore this post will try to describe a Zend Framework way of creating tinyized URLs.

Getting tiny tiny y'all

According to Wikipedia there are numerous services available, e.g. RubyUrl, providing the same feature as TinyURL, so to be prepared for the future (and thereby maybe violating the YAGNI principle) I decided to declare a very basic interface first, in case of switching the service provider someday.

_serviceEndpoint = $serviceEndpoint;
}

/**
 * Shortenizes a given Url
 *
 * @param string $url
 * @return string
 * @throws Exception
 */
public function shortenize($url)
{
    if (is_null($this->_serviceEndpoint)) {
        throw new Exception('No service endpoint set');
    }
    $client = new Zend_Http_Client($this->_serviceEndpoint);
    $client->setParameterGet('url', $url)
           ->setMethod(Zend_Http_Client::GET);
    try {
        $response = $client->request();
    } catch (Exception $e) {
        throw $e;
    }
    if (200 === $response->getStatus()) {
        return $response->getBody();
    } else {
        throw new Exception($response->getStatus() . ": "
            . $response->getMessage());
    }
}

/**
 * Alias method for the shortenize method
 *
 * @param string $url
 * @throws Exception
 * @see shortenize
 */
public function tinyize($url)
{
    return $this->shortenize($url);
}
}

Now, with everything hopefully operating smoothly, it's time for a test-drive (yeah, I'm lazy and skipped the development approach called TDD) by creating a service instance and requesting a TinyURL for the Zend Framework website as shown in the outro listing.
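As an aside, the GET request that shortenize() ends up issuing is just the service endpoint plus an url parameter. A simplified sketch without Zend_Http_Client, using TinyURL's public create API as the example endpoint:

```php
<?php
// Simplified sketch of the request shortenize() builds: the configured
// service endpoint plus the url to shorten as a GET parameter.
$serviceEndpoint = 'http://tinyurl.com/api-create.php';
$url = 'http://framework.zend.com';

$request = $serviceEndpoint . '?' . http_build_query(array('url' => $url));
echo $request;
// http://tinyurl.com/api-create.php?url=http%3A%2F%2Fframework.zend.com
```

Swapping the service provider someday, as the interface anticipates, would then only mean exchanging the endpoint.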

Getting a visualization of a Phing buildfile


Today I spent some time getting a tool running to visualize Phing buildfiles, as this can come in handy for maintaining, refactoring or extending large buildfiles. Out of the box, Phing's -l option can be used to get a first overview of all available targets in a given buildfile, but it doesn't untangle the target dependencies, and sometimes a picture is still worth a thousand words. Luckily the Ant community already provides several tools to accomplish the visualization of Ant buildfiles, reaching from solutions that apply an Xslt stylesheet to a given buildfile, e.g. ant2dot, to those that take a programmatic approach, e.g. Grand. All these solutions utilize Graphviz to generate a graphic from a DOT file representing the buildfile structure, its targets and their dependencies. As Phing is a very close descendant of Ant, the Xslt approach was best suited and the one with the least effort, because their buildfile markup is very similar. The following post will walk you through getting a simple Phing buildfile visualization tool running in just a few minutes.

Grabbing the Xslt file

The first step is to get the ant2dot Xslt stylesheet and put it into the same directory as the visualization buildfile and target to come. Due to the aforementioned Phing and Ant buildfile markup similarities it can be used without any modifications.

Setting up the buildfile visualization target

The next step is to create a Phing target that utilizes the Xslt task to transform the fed buildfile into a DOT file, which gets passed further to a platform dependent Exec task handling the final transformation into a PNG image. To make the visualization target independent of the buildfile to visualize, it's hosted in its own buildfile, and the target accepts the buildfile to be transformed as a property passed to the Phing CLI, or if none is given uses the default build.xml.
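Such a visualize target might look like the following sketch. The property and file names are assumptions, and the attribute names follow Phing's Xslt, Property and Exec tasks; treat it as an illustration rather than the post's exact buildfile:

```xml
<target name="visualize">
    <!-- can be overridden from the CLI via -Dbuildfile=other.xml -->
    <property name="buildfile" value="build.xml" override="false"/>
    <xslt file="${buildfile}" tofile="${buildfile}.dot" style="ant2dot.xsl"/>
    <exec command="dot -Tpng ${buildfile}.dot -o ${buildfile}.png"
          checkreturn="true"/>
</target>
```

The dot binary is part of the Graphviz distribution and must be on the PATH for the Exec task to succeed.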
Further, the XSLT stylesheet accepts several parameters to add extended data to the resulting DOT file/PNG image, which can be set in the tags of the Xslt task. For a list of possible parameters have a look at the options section of ant2dot. The following code snippet shows the visualization buildfile and the visualize target doing the Houdini-like magic.

Scraping websites with Zend_Dom_Query


Today I stumbled upon an interesting and reportable scenario where I had to extract information from the weekly published Drum and Bass charts provided by BBC 1Xtra. As this information currently isn't available in any consumer-friendly format, like for example an RSS feed, I had to go the scraping route, but didn't want to hustle with a regex approach. Since version 1.6.0 the Zend_Dom_Query component has been part of the framework, mainly to support functional testing of MVC applications, but it can also be used for rolling custom website scrapers in a snap. Woot, perfect match!

The following code snippets show the Bbc_DnbCharts_Scraper class I came up with and an example of its usage. The class utilizes cURL to read the website holding the desired data, which is then passed to Zend_Dom_Query to execute queries upon it. For querying the loaded XHTML Document Object Model it's possible to utilize either XPath or CSS selectors. So I had to pick my poison, and decided to go with the CSS selectors, as they were best suited for the document to query and will be more familiar to most jQuery or Prototype users. The query returns a result set of all matching DOMElements, which are further unpuzzled via a private helper method returning just the desired charts data, as shown in the closing listing. As you can see, the scraping can be implemented with a minimum of effort, and these are exactly the moments I love the Zend Framework for.

class Bbc_DnbCharts_Scraper
{
    private $_url;

    public function __construct($url)
    {
        $this->_url = $url;
    }

    /**
     * Scrapes off the drum and bass charts content from the BBC 1Xtra website.
     *
     * @return array
     * @throws Exception
     */
    public function scrape()
    {
        try {
            $dom = new Zend_Dom_Query($this->_getXhtml());
        } catch (Exception $e) {
            throw $e;
        }
        $results = $dom->query('div.chart div');
        $chartDetails = array();
        foreach ($results as $index => $result) {
            /* @var $result DOMElement */
            if ($result->nodeValue !== '') { // filter out empty elements
                $chartDetails[] = $result->nodeValue;
            }
        }
        return $this->_unpuzzleChartDetails($chartDetails, true);
    }

    /**
     * Unpuzzles the chart details and groups them by their chart position,
     * if desired with associative keys.
     *
     * @param  array   $details
     * @param  boolean $associative
     * @return array
     */
    private function _unpuzzleChartDetails(array $details, $associative = false)
    {
        if (0 === count($details)) {
            return array();
        } else {
            $nextChartRank = 2;
            $charts = array();
            $groupedChartDetails = array();
            foreach ($details as $index => $chartDetail) {
                if ($index [...]
            }
            [...]
            foreach ($charts as $chartsIndex => $chart) {
                unset($charts[$chartsIndex]);
                foreach ($chart as $chartIndex => $chartDetails) {
                    $charts[$chartsIndex][$associatives[$chartIndex]] = $chartDetails;
                }
            }
            return $charts;
        }
    }

    /**
     * Gets the XHTML document via c[...]
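The closing usage listing was likewise lost in this feed, so here is a sketch of how the scraper could be driven; the chart page URL below is a placeholder, not the actual BBC 1Xtra address:

```php
<?php
// Sketch only: the real chart page URL from the post is unknown here.
$chartsUrl = 'http://www.bbc.co.uk/1xtra/placeholder-chart-page';

$scraper = new Bbc_DnbCharts_Scraper($chartsUrl);

try {
    // Returns the charts grouped by position, with associative keys
    print_r($scraper->scrape());
} catch (Exception $e) {
    echo 'Scraping failed: ' . $e->getMessage();
}
```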