Zengenuity: Decoupling Your Backend Code from Drupal (and Improving Your Life) with Wrappers Delight

Planet Drupal - Wed, 03/12/2014 - 15:48

If you've ever written a lot of custom code for a Drupal site, then you know it can be a tedious and error-prone experience. Your IDE doesn't know how Drupal's data structures are organized, and it doesn't have a way to extract information about configured fields to do any autocomplete or check data types. This leads to some frustrations:

  • You spend a lot of time typing out by hand all the keys in every array of doom you come across. It's tedious, verbose, and tiring.
  • Your code can contain errors your IDE won't alert you to. Simple typos can go unnoticed because the IDE has no idea how the objects and arrays are structured.
  • Your code is tightly coupled to specific field names, configured in the database. You must remember these, because your IDE can't autocomplete them.
  • Your code is tightly coupled to specific field types. (If you start off with a text field and then decide to switch to an email field, for example, you will find the value is now stored in a different key of the data array. You need to update all your custom code related to that field.)
  • It can be easy to create cross-site-scripting vulnerabilities in your code. You need to keep in mind all the field data that needs to be sanitized for output. It only takes one forgotten spot to open your site to attacks.
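To make the pain concrete, here is what a typical raw field read looks like in Drupal 7 (the field name field_subtitle is just an example): every array key is typed by hand, the layout silently changes if the field type changes, and sanitization is entirely on you.

```php
// Raw Drupal 7 field access -- nothing here can be autocompleted or
// type-checked by the IDE.
$node = node_load($nid);
$items = field_get_items('node', $node, 'field_subtitle');
// If field_subtitle is later switched from text to email, this key
// layout changes and every call site like this one breaks.
$raw = isset($items[0]['value']) ? $items[0]['value'] : '';
// Forgetting check_plain() here is exactly the XSS trap described above.
print check_plain($raw);
```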

Wrappers Delight (https://www.drupal.org/project/wrappers_delight) is a development tool I've created to help address these issues, and make my life easier. Here's what it does:

  • Provides wrapper classes for common entity types, with getters and setters for the entities' base properties. (These classes are wrappers/decorators around EntityMetadataWrapper.)
  • Adds a Drush command that generates wrapper classes for the specific entity bundles on your site, taking care of the boilerplate getter and setter code for all the fields you have configured on the bundles.
  • Returns sanitized values by default for the generated getters for text fields. (Raw values can be returned with an optional parameter.)
  • Allows the wrapper classes to be customized, so that you can decouple your custom code from specific Drupal field implementations.

With Wrappers Delight, your custom code can be written to interface with wrapper classes you control instead of with Drupal objects directly. So, in the example of changing a text field to an email field, only the corresponding wrapper class needs to be updated. All your other code keeps working as written.

But wait, there's more!

Wrappers Delight also provides bundle-specific wrapper classes for EntityFieldQuery, which allow you to build queries (with field-level autocomplete) in your IDE, again decoupled from specific internal Drupal field names and formats. Whatever your decoupled CRUD needs may be, Wrappers Delight has you covered!

Getting Started with Wrappers Delight

To generate wrapper classes for all the content types on your site:

  1. Install and enable the Wrappers Delight module.
  2. Install Drush, if you don't already have it.
  3. At the command line, in your Drupal directory, run drush wrap node.
  4. This will generate a new module called "wrappers_custom" that contains wrapper classes for all your content types.
  5. Enable the wrappers_custom module, and you can start writing code with these wrapper classes.
  6. This process works for other entity types, as well: Users, Commerce Products, OG Memberships, Messages, etc. Just follow the Drush command pattern: drush wrap ENTITY_TYPE. For contributed entity types, you may need to enable a submodule like Wrappers Delight: Commerce to get all the base entity properties.

Using the Wrapper Classes

The wrapper classes generated by Wrappers Delight have getters and setters for the fields you define on each bundle, and they inherit getters and setters for the entity's base properties. The class names follow the pattern BundlenameEntitytypeWrapper. So, to use the wrapper class for the standard article node type, you would do something like this:

$article = new ArticleNodeWrapper($node);
$body_value = $article->getBody();
$image = $article->getImage();

Wrapper classes also support passing an ID to the constructor instead of an entity object:

$article = new ArticleNodeWrapper($nid);

In addition to getters that return standard data arrays, Wrappers Delight creates custom utility getters for certain field types. For example, for image fields, these will all work out of the box:

$article = new ArticleNodeWrapper($node);
$image_array = $article->getImage();
$image_url = $article->getImageUrl();
$image_style_url = $article->getImageUrl('medium');
$absolute_url = $article->getImageUrl('medium', TRUE);

// Get a full <img> tag (it calls theme_image_style()
// under the hood)
$image_html = $article->getImageHtml('medium');

Creating New Entities and Using the Setter Methods

If you want to create a new entity, wrapper classes include a static create() method, which can be used like this:

$values = array(
  'title' => 'My Article',
  'status' => 1,
  'promote' => 1,
);
$article = ArticleNodeWrapper::create($values);

You can also chain the setters together like this:

$article = ArticleNodeWrapper::create();
$article->setTitle('My Article')
  ->save();

Customizing Wrapper Classes

Once you generate a wrapper class for an entity bundle, you are encouraged to customize it to your specific needs. Add your own methods, edit the getters and setters to have more parameters or different return types. The Drush command can be run multiple times as new fields are added to your bundles, and your customizations to the existing methods will not be overwritten. Take note that Wrappers Delight never deletes any methods, so if you delete a field, you should clean up the corresponding methods (or rewrite them to get the data from other fields) manually.
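As a sketch of such a customization (the field name and the internal $this->wrapper property are assumptions for illustration, not the exact generated code): if field_subtitle is switched from a text field to an email field, only this one method body changes, while every caller of getSubtitle() stays untouched.

```php
class ArticleNodeWrapper extends WdNodeWrapper {

  // Originally generated for a text field; hand-edited after the field
  // was converted to an email field, so callers need no changes.
  public function getSubtitle($sanitize = TRUE) {
    $value = $this->wrapper->field_subtitle->value();
    return $sanitize ? check_plain($value) : $value;
  }

}
```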

Drush Command Options

The Drush command supports the following options:

  • --bundles: specify the bundles to export (defaults to all bundles for a given entity type)
  • --module: specify the module name to create (defaults to wrappers_custom)
  • --destination: specify the destination directory of the module (defaults to sites/all/modules/contrib or sites/all/modules)

Packaging Wrapper Classes with Feature Modules or Other Bundle-Supplying Modules

With the options listed above, you can export individual wrapper classes to existing modules by running a command like the following:

drush wrap node --bundles=blog --module=blog_feature

That will put the single wrapper class for the blog bundle in the blog_feature module. Wrappers Delight is smart enough to find this class automatically on subsequent runs if you have enabled the blog_feature module. This means that once you do some individual exports, you could later run something like this:

drush wrap node

and existing classes will be updated in place, while any new classes will end up in the wrappers_custom module.

Did You Say Something About Queries?

Yes! Wrappers Delight includes a submodule called Wrappers Delight Query that provides bundle-specific wrapper classes around EntityFieldQuery. Once you generate the query wrapper classes (by running drush wrap ENTITY_TYPE), you can use the find() method of the new classes to build and execute queries:

$results = ArticleNodeWrapperQuery::find()
  ->range(0, 10)
  ->execute();

The results array will contain objects of the corresponding wrapper type, which in this example is ArticleNodeWrapper. That means you can immediately access all the field methods, with autocomplete, in your IDE:

foreach ($results as $article) {
  $output .= $article->getTitle();
  $output .= $article->getImageHtml('medium');
}

You can also run queries across all bundles of a given entity type by using the base wrapper query class:

$results = WdNodeWrapperQuery::find()
  ->byTitle('%Awesome%', 'LIKE')
  ->execute();

Note that results from a query like this will be of type WdNodeWrapper, so you'll need to check the actual bundle type and re-wrap the object with the corresponding bundle wrapper in order to use the bundle-level field getters and setters.
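A sketch of that re-wrapping step (the value() accessor mirrors EntityMetadataWrapper; treat the exact method names as assumptions rather than confirmed API):

```php
$results = WdNodeWrapperQuery::find()
  ->byTitle('%Awesome%', 'LIKE')
  ->execute();
foreach ($results as $wrapped) {
  // $wrapped is a generic WdNodeWrapper; check the bundle before
  // re-wrapping to get access to the bundle-level getters.
  $node = $wrapped->value();
  if ($node->type == 'article') {
    $article = new ArticleNodeWrapper($node);
    print $article->getImageHtml('medium');
  }
}
```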

Wrapping Up

So, that's Wrappers Delight. I hope you'll give it a try and see if it makes your Drupal coding experience more pleasant. Personally, I've used it on four new projects since creating it this summer, and it's been amazing. I'm kicking myself for not doing this earlier. My code is easier to read, WAY easier to type, and more adaptable to changes in the underlying architecture of the project.

If you want to help me expand the project, here are some things I could use help with:

  • Additional base entity classes for common core and contrib entities like comments, taxonomy terms, and Workflow states.
  • Additional custom getter/setter templates for certain field types where utility functions would be useful, such as Date fields.
  • Feedback from different use cases. Try it out and let me know what could make it work better for your projects.

Post in the issue queue (https://www.drupal.org/project/issues/wrappers_delight) if you have questions or want to lend a hand.

Categories: Elsewhere

ERPAL: How we automate Drupal security updates

Planet Drupal - Wed, 03/12/2014 - 14:05

During the past few weeks, automated security updates have been one of the hotly debated topics in the Drupal community. Ever since Drupalgeddon, security automation has been one of the issues we should really try to solve in order to ensure Drupal's continued growth, especially in the enterprise world. Whereas content management systems such as WordPress already run automated updates in a background process, Drupal does not yet have such a feature. There are ongoing discussions at Drupal.org that point out potential pros and cons of such a feature. Personally, from the perspective of a Drupal professional, I think running Drupal module updates in the background could lead to several problems. There are a few reasons for this:

  • We somehow need to handle patched modules and cannot just override the complete module with an update
  • Letting Drupal rewrite its own codebase will open other security issues
  • Professionally developed Drupal projects use Git (or another version control system) to maintain their codebase and handle the deployment process. Every update needs to be committed to the repository so that it’s not removed in the next deployment cycle
  • After updating a module, we should run our automated test scripts (for example, behat or selenium) to ensure the site didn't break with the update
  • To ensure quality we shouldn’t just run a complete update containing bug fixes and new features but only apply the patch relevant to security

The issue of applying security updates has become more and more time-sensitive because hackers start to attack vulnerable sites within hours of a security update release. Especially with enterprise web applications and large content sites with lots of users and traffic, this update process is really business critical. Even before the pressure of something like Drupalgeddon, these last two years we had already been thinking about update automation. In this blog post I want to describe the technology and workflows we use to automate security updates in our Drupal projects while ensuring quality with automated and manual tests and the correct handling of patches.

Every site that we support for our clients sends hourly update reports to our internal ERPAL (you can replace “ERPAL” here with any other ticketing system) over an HTTPS connection and with additional encryption. For every available security update, we create a new branch in the project's Git repository and an automatically linked task.

Once the task has been created, we get all the security-relevant patches and code changes from the active modules' repositories and merge them into the modules of the project. These code changes are committed to the new so-called "feature branch". Using Jenkins and a system to build feature branch instances with a live database, the changes are now ready to test. The status of the ERPAL Task is automatically set to "ready to test". Now all automated tests will run, if any are available for the project. The result is documented with a comment on the initially created task.

Depending on the test mode of the project and the priority of the security update (e.g. “critical” or “highly critical”), the security patches are either deployed directly to live once all tests pass, or the task is assigned to the project manager with the status "ready to test". The project manager can then test the complete patched Drupal installation on a separate feature-branch test instance under live conditions. If all tests pass, the task is set to "test passed" and the customer receives a notification that the security of their site is up to date. The update branch is merged as a hotfix into the master branch and the site is deployed to the live server. After this process, the update branch is deleted and the test instance destroyed to clean up the system. The following graphic describes the behavior.
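Stripped of Jenkins and ERPAL, the underlying git choreography can be sketched with a self-contained toy repository (the branch name, file, and commit messages are illustrative, not the actual Drop Guard internals):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email ops@example.com
git config user.name "update bot"
echo "version = 7.33" > module.info
git add module.info && git commit -qm "baseline"
base=$(git rev-parse --abbrev-ref HEAD)

# 1. One feature branch per security advisory.
git checkout -qb security/SA-CORE-2014-005
# 2. Apply only the security-relevant change (stand-in for the real patch).
sed -i 's/7.33/7.34/' module.info
git commit -qam "security: update module to 7.34"

# 3. After automated/manual tests pass, merge back as a hotfix and clean up.
git checkout -q "$base"
git merge -q --no-ff -m "hotfix: SA-CORE-2014-005" security/SA-CORE-2014-005
git branch -qd security/SA-CORE-2014-005
cat module.info   # prints "version = 7.34"
```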

This system has several benefits, both for us and for our clients:

  • Security-relevant updates are applied within one hour
  • Quality is ensured by automated tests and, if needed, by a notification system indicating manual test steps
  • No need to involve developers to patch and deploy code
  • No website downtime
  • All steps are documented in ERPAL to make the process transparent: customers see it in their ERPAL account
  • No panic on critical updates; all workflows run as in a normal project and are delivered with compliance to our task workflow
  • Instant customer notification once updates are applied gives customers a good feeling ;-)

This system has been working well for 2.5 years. Working in cooperation with other Drupal shops to test the system, we want to make this security update automation system available for others to use as well. Therefore we will soon publish the whole solution as a service. If you want to become one of the few beta testers, or if you want to become a reseller to deliver the same security automation to your clients, you can sign up at our Drop Guard - The Drupal security update automation service page.

Categories: Elsewhere

Diego Escalante Urrelo: Link pack #01

Planet Debian - Wed, 03/12/2014 - 09:37

Following the lead of my dear friend Daniel and his fantastic and addictive “Summing up” series, here’s a link pack of recent stuff I read around the web.

Link pack is definitely a terrible name, but I’m working on it.

How to Silence Negative Thinking
On how to avoid the pitfall of being a Negatron and not an Optimist Prime. You might be your own worst enemy and you might not even know it:

Psychologists use the term “automatic negative thoughts” to describe the ideas that pop into our heads uninvited, like burglars, and leave behind a mess of uncomfortable emotions. In the 1960s, one of the founders of cognitive therapy, Aaron Beck, concluded that ANTs sabotage our best self, and lead to a vicious circle of misery: creating a general mindset that is variously unhappy or anxious or angry (take your pick) and which is (therefore) all the more likely to generate new ANTs. We get stuck in the same old neural pathways, having the same negative thoughts again and again.

Meet Harlem’s ‘Official’ Street Photographer
A man goes around Harlem with his camera, looking to give instead of taking. Makes you think about your approach to people and photography; things can be simpler. Kinda like Humans of New York, but in Harlem. And grittier, and on film —but as touching, or more:

“I tell people that my camera is a healing mechanism,” Allah says. “Let me photograph it and take it away from you.”

What Happens When We Let Industry and Government Collect All the Data They Want
Why “having nothing to hide” is not about the now, but about the later. It’s not that someone is going to judge you for pushing every detail of your life to Twitter and Instagram, it’s just that something you do might be illegal a few years later:

There was a time when it was essentially illegal to be gay. There was a time when it was legal to own people—and illegal for them to run away. Sometimes, society gets it wrong. And it’s not just nameless bureaucrats; it’s men like Thomas Jefferson. When that happens, strong privacy protections—including collection controls that let people pick who gets their data, and when—allow the persecuted and unpopular to survive.

The Sex-Abuse Scandal Plaguing USA Swimming
Abusive coaches and a bullying culture in sports training are the perfect storm for damaging children. And it’s amazing the extent to which a corporation or institution is willing to look the other way, as long as they save face. Very long piece, but intriguing to read.

What Cities Would Look Like if Lit Only by the Stars
Thierry Cohen goes around the world and builds beautiful and realistic composite images of how big cities would look if lit only by stars. The original page has some more cities: Villes éteintes (Darkened Cities).

On Muppets & Merchandise: How Jim Henson Turned His Art into a Business
Lessons from how Jim Henson managed to juggle both art and business without selling out for the wrong reasons. Really interesting, and reminds you to put Henson in perspective as a very smart man who managed to convince everyone to give him money for playing with muppets. The linked video on How the Muppet Show is Made is also cool. Made me curious enough to get the book.

Barbie, Remixed: I (really!) can be a computer engineer
Mattel launched a thoroughly misguided book that manages to empower Barbie to be anything but a computer engineer, in a book that is supposedly about being a computer engineer. The internet did not disappoint and fixed the problem within hours. There’s now even an app for that (includes user submitted pages).

Categories: Elsewhere

Symphony Blog: Continue shopping button on Drupal Commerce cart

Planet Drupal - Wed, 03/12/2014 - 09:34

We had a Drupal project implementing a commerce site for a local store. As always, we used Drupal Commerce for this type of website. You may see that we have a lot of Drupal Commerce themes in our portfolio.

During the project, there was a minor request from our customer: add a Continue Shopping button to the cart. This feature is available in Ubercart, especially for Drupal 6 Ubercart users. Most ecommerce sites have this feature as well. But it is not built in to Drupal Commerce.

Searching the Drupal.org issue queues, I found a very helpful thread: Continue shopping in cart. Zorroposada presented custom code to achieve it:

read more

Categories: Elsewhere

Pronovix: Hosting and playing videos in Drupal: Part 1

Planet Drupal - Wed, 03/12/2014 - 09:13

When you are first faced with the task of hosting and playing videos in Drupal, the number of different approaches and solutions might seem overwhelming. Where to store the videos? What is the difference between CDNs, cloud storage services and hosted video solutions? Which Drupal modules to use with which service? This blog post walks you through the basics of hosting and playing videos in Drupal:

Categories: Elsewhere

Craig Small: WordPress 4.0.1 fixes for Debian stable

Planet Debian - Wed, 03/12/2014 - 08:46

Previously I posted a short article about the WordPress package for Debian and how sid was getting the updated WordPress 4.0.1, which had some security fixes.

The question a lot of people were asking was: what about stable (i.e. wheezy)? After way too much time due to other pressing issues, I have just uploaded the patched WordPress Debian package for stable. The fixed version has the catchy number 3.6.1~deb7u5. This package has all of the relevant patches that went in from WordPress 3.7.4 to 3.7.5, and there are even CVE IDs for this package (and 4.0.1, which all this stems from).

Stolen from the 3.6.1 changelog, these are the fixes:

  • CVE-2014-9031 XSS in wptexturize() via comments or posts
  • CVE-2014-9033 CSRF in the password reset process
  • CVE-2014-9034 Denial of service for giant passwords
  • CVE-2014-9035 XSS in Press This
  • CVE-2014-9036 XSS in HTML filtering of CSS in posts
  • CVE-2014-9037 Hash comparison vulnerability in old passwords
  • CVE-2014-9038 SSRF: Safe HTTP requests did not sufficiently block the loopback IP address space
  • CVE-2014-9039 Email address change didn’t invalidate previously sent password reset

I’d like to thank the Debian security team, especially Salvatore, for their assistance and for checking that the package looked ok.


Backporting and Git

Part of the delay in getting the wordpress stable package out is that backporting is fiddly. I’m currently using pdebuild with a custom pbuilderrc file that points to wheezy. Getting things to that point took a lot of trial and error, with one of the errors being that pbuilder puts the files in a result directory, not the parent.
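For reference, a wheezy-targeting pbuilderrc might look roughly like this (paths and mirror are illustrative guesses, not Craig's actual configuration):

```shell
# ~/.pbuilderrc-wheezy (sketch)
DISTRIBUTION=wheezy
BASETGZ=/var/cache/pbuilder/wheezy-base.tgz
MIRRORSITE=http://http.debian.net/debian
COMPONENTS="main"
# pbuilder drops the built packages here, not in the parent directory
BUILDRESULT=/var/cache/pbuilder/result/
```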

This also means that the wheezy backports are out of the git repository. I see that there is a git-pbuild, but to me it looks like yet another workflow which will slow me right down. Has anyone got good and simple suggestions for having a wheezy track (branch?) that requires backporting, without it getting complicated or breaking quickly? sbuild died in a wave of permission denieds within the chroot.




Categories: Elsewhere

Open Source Training: How to Rewrite the Output of Views with PHP

Planet Drupal - Wed, 03/12/2014 - 03:30

Views is a very powerful tool that allows you to pull information from your database in many flexible ways.

However, there will be situations where the default options in Views aren't enough. The Views PHP module allows you even more flexibility.

In these two videos, Robert Ring, one of our Drupal teachers, shows you how to re-write a View using PHP.

Categories: Elsewhere

David Norman: Node access rebuilds have a hard time limit

Planet Drupal - Wed, 03/12/2014 - 01:04

I thought I'd call attention to a bit of little-known Drupal trivia. The node_access_rebuild() function has a hard-coded 240-second time limit. That means that attempting to rebuild the node_access table with a tool like drush probably won't work on a large site. Having a time limit helps keep the site from hitting memory limits and crashing.

In case you think you can out-smart the problem with time limit removal trickery, I think you'll find that the limit is deeply embedded in Drupal's core code. These drush attempts will fail:

drush php-eval 'set_time_limit(0); node_access_rebuild();'
drush php-eval 'ini_set("max_execution_time", 0); node_access_rebuild();'

The node_access_rebuild() function also allows an argument to toggle batch mode, the intended way to bypass the 240-second limit. Trying that with drush probably won't get the results you're hoping for, either.

drush php-eval 'node_access_rebuild(TRUE);'

Don't expect htop to show a bunch of busy threads after running that. A web browser trumps drush when it comes to performing successive HTTP requests for batch mode here.

Instead try telling Drupal that node access controls need a rebuild.

drush php-eval 'node_access_needs_rebuild(TRUE);'


drush vset node_access_needs_rebuild 1

After the rebuild status is set, any of the administration pages will show a link with an error message to prompt administrators to rebuild node permissions.

The link in the prompt goes to admin/reports/status/rebuild, which will prompt with a confirmation form before actually rebuilding permissions.

Your next option is a bit fancier, using a PHP script by shrop.

Run it like this:

time drush php-script rebuild-perms.php

If all that doesn't satisfy your needs (and it should), the last resort is to hack core.

Hacking core is usually the wrong answer to any question. Nonetheless, here's a patch against Drupal 7.34.

Don't let the irony escape you - the comment before the drupal_set_time_limit() actually says the limit attempts to "allocate enough time."

Post categories Drupal
Categories: Elsewhere

Daniel Leidert: WISO Steuer-Sparbuch under Debian Linux

Planet Debian - Wed, 03/12/2014 - 00:37

I'm thrilled: my USB-stick installation of WISO Steuer-Sparbuch can be used quite easily under Debian Linux with Wine:

wine32 /media/pfad/mein/stick/WISO/Steuersoftware\ 2014/usbstart.exe

There were no problems at all during several hours of work! I will test the update function when I get the chance, as well as installing onto the USB stick under Wine; for the latter I had still fallen back on Windows. Brilliant :)

Categories: Elsewhere

Richard Hartmann: Tabletop games

Planet Debian - Tue, 02/12/2014 - 23:36

Wood. I really like wood. Even more, I like working with wood.

Touching it, following its grain, and contemplating that it was made mostly from thin air and water.

Normally, I just turn trees into handy pieces of firewood. While that's already deeply satisfying in the sense that you actually get to see what you worked for, it's a mostly destructive task. You kill a tree, you chop it up, only to turn it back into (mostly) thin air.

As chance would have it, I needed a new table. After dragging myself through way too many furniture stores, I realized that I wouldn't be happy with what's on offer. So I struck a deal with a local carpenter: I would buy from them, but only if I could help build my own table; I would finally create something larger than a carving from wood.

After some scouting, the carpenter found five planks of oak which were 4+ meter long, about 40-60 cm wide, and 8+ cm thick:

If you think this wood looks old, worn, and broken: You should have seen it up close; it was worse than on the potato-cam picture ;) But that's another great thing about working with wood: by taking off a laughably thin layer of surface material, you can renew the whole thing.

We cut off 10 cm from each side as that's what tends to split and tossed that away. Afterwards, we cut off 80 cm from the wider side for the legs. Again, the base tends to have more imperfections and as you don't need to cut long pieces, you have more freedom in deciding how to cut.

Cutting is an art in itself; tiny imperfections in the wood's surface can hint at large fissures underneath. The fact that the wood looks worn and spotty does not help in figuring this out. After cutting to minimize waste, you end up with a pile like this:

And a surprising amount of waste, i.e. firewood:

After a lot of planing, the wood becomes cleaner and smoother:

Then, everything's fitted so that neighbouring planks have their heartwood running into different directions, and so that the upper surface gets (most of) the interesting features. This is another surprisingly involved process and took about two hours.

And no, the potato-cam does not manage to capture the wood's beauty.

After sanding the sides down to perfection to ensure the glue can bond really tightly, the table feet are glued and put into a hydraulic press:

while the tabletop itself shines in all its 287 cm x 107 cm x 6.3 cm glory:

Tomorrow, we will sand down the top and bottom of the tabletop and prepare the feet. Grooves will be milled into the wood to glue steel bars into it, as well as another plank that will be glued to the bottom of the tabletop, running along the middle. Along with the alternating heartwood, this helps ensure that this beast of a table will not fold in on itself or otherwise succumb to internal torsion or gravity.

The final steps will be to fit the feet, sand down the surface again and then apply two layers of oil.

And while most people may not fancy taking a week off just to rise way too early and then do unpaid work, I love it.

As I said: I like wood.

Categories: Elsewhere

Gizra.com: Behat vs. Casper (In Drupal Context)

Planet Drupal - Tue, 02/12/2014 - 23:00

In my previous blog post Behat - The Right Way I made a statement that I think Behat was a better choice for writing tests even for the frontend. Some good arguments were raised in favor of CasperJS.

@amitaibu @juampy72 it boils down to this: I'm a frontend dev. Writing PHP is something I avoid whenever possible.

— Chris Ruppel (@rupl) November 19, 2014

I believe my comparison was wrong in the sense that it was lacking the key point of Behat's strength for us. It's not really about "Behat vs. Casper". The proper comparison should have been "Behat vs. Casper - With a Drupal backend".

And here's the key difference: With Behat you can interact with Drupal's API even when testing using PhantomJS. That is a lot of testing power!

Continue reading…

Categories: Elsewhere

Mediacurrent: Mediacurrent and Drupal Featured on NBC’s Atlanta Tech Edge

Planet Drupal - Tue, 02/12/2014 - 21:33

As a leading Atlanta-based Drupal and Digital web agency, Mediacurrent was recently profiled on 11Alive’s (an NBC affiliate) Atlanta Tech Edge. Atlanta Tech Edge was created to highlight selected companies that are leading the charge in Atlanta’s booming tech sector.

Categories: Elsewhere

Tag1 Consulting: BDD: It's about value

Planet Drupal - Tue, 02/12/2014 - 21:11

I was drawn to Behavior Driven Development the moment I was pointed toward Behat not just for the automation but because it systematized and gave me a vocabulary for some things I already did pretty well. It let me teach some of those skills instead of just using them. At DrupalCon Amsterdam, Behat and Mink architect Konstantin Kudryashov gave a whole new dimension to that.

read more

Categories: Elsewhere

Open Source Training: How to Sort Drupal Views Alphabetically

Planet Drupal - Tue, 02/12/2014 - 19:43

Alphabetical sorting is one of the most common ways people want to sort content in Views.

You may want to all sorts of things from A to Z, from staff members to business listings.

Here's how to add alphabetical sorting to your Drupal views.

Categories: Elsewhere

Sven Hoexter: Out of the comfort zone: Backporting net-snmp from CentOS 7 to CentOS 5 and 6

Planet Debian - Tue, 02/12/2014 - 19:23

Lately I faced the requirement to create a backport of net-snmp 5.7.2, which is part of CentOS 7, to a largely CentOS 5 and 6 based infrastructure. Due to some prior adventures in the rpm world I already knew about Mock (the Fedora equivalent of pbuilder).

The easy part was the backporting to CentOS 6. Here the major objective is ripping out the newly added systemd support. That boils down to just ripping out a patch, removing a configure flag, some %post, %preun, %postun script snippets and the corresponding %install magic from the spec file. It builds fine, but leaves us with the issue that the init script ships in a package net-snmp-sysvinit while the whole chkconfig registration magic is gone.
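Instead of deleting the systemd bits outright, they could also have been fenced off with spec conditionals, something along these lines (a sketch, not the actual net-snmp spec):

```spec
%if 0%{?rhel} >= 7
BuildRequires: systemd
%endif

%post
%if 0%{?rhel} >= 7
%systemd_post snmpd.service
%else
/sbin/chkconfig --add snmpd
%endif
```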

I put that aside for a moment and moved on to CentOS 5. It did not even start to build.

  1. There is no tcp_wrappers-devel in CentOS 5.
  2. The perl libraries are in the main perl package and not in perl-devel.
  3. lm_sensors-devel is not at version 3.

I lowered the versioned build-dep on lm_sensors, disabled the tcp-wrappers support via a spec file toggle, and changed the perl build-dep. This time it failed in the configure script due to DTLS support issues with the ancient openssl in CentOS 5. I decided to drop that flag as well from the list of configure flags, since we do not use it. Tried to build again; now it failed somewhere down the road, post compilation, when assembling the package. Not cool.

At that time we pushed a few other ideas back and forth. We found an isolated patch for net-snmp 5.6 that could fix our most pressing issue and someone had a look if that could be backported to CentOS 5 easily. Nope, internal structure of net-snmp changed a lot in between. So back to backporting.

After staring long enough at the build log I could understand that the variable %{RPM_BUILD_ROOT} is empty in the CentOS 5 world. And indeed the BuildRoot stanza is also gone from the spec file. So I reintroduced it, and later even found documentation in the Fedora Wiki about this issue. If you know it, it's easy to work around; I just defined it by hand:

BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-root
%global RPM_BUILD_ROOT %{buildroot}

About ten minutes later I had a set of rpm files for CentOS 5.

Tried to install it; that failed due to a dependency on libmysqlclient15, which is part of the main mysql package in CentOS 5. Yes, eight years back RedHat did not split off the lib packages, and for us that would mean installing a mysql database on every system. So no, we won't do that, and we just removed the mysql support.

Left with only the init script setup missing, I tried to be clever and failed miserably. Be warned, this would have been a bad idea in the Debian world as well.

Since we have the init script in a separate binary package, I thought I could just use something like

Requires: net-snmp-sysvinit

in the main net-snmp package and re-add the old %post, %preun and %postun magic. Of course that failed at install time for a good reason: a plain Requires does not enforce any ordering between the sysvinit subpackage's installation and the main package's scriptlets, so chkconfig could not find the init script yet.

In a second (desperate) attempt to work around it I tried something similar to Debian's "Pre-Depends":

Requires(post): net-snmp-sysvinit

That works at install time but still breaks on a removal/update.

So I think I did the right thing in the end (after reading some more documentation) and just added the scripts to the net-snmp-sysvinit package itself. It's just a matter of adding the subpackage name to the %post macro invocation.
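A sketch of what that ends up looking like in the spec; the scriptlet bodies follow the classic chkconfig pattern, and the exact init script name (snmpd) is an assumption on my side:

```
%post sysvinit
/sbin/chkconfig --add snmpd

%preun sysvinit
if [ $1 -eq 0 ]; then
    /sbin/service snmpd stop >/dev/null 2>&1 || :
    /sbin/chkconfig --del snmpd
fi
```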

My initial plan included the wish to have one spec file usable to build on both CentOS 5 and 6, but now I was left with a number of special cases for CentOS 5.

I already knew about the %if/%endif conditional macros you can use in rpm spec files, so what was missing was a proper variable to base my decisions on. Every CentOS release ships with an "/etc/rpm/" directory that contains several macro files. Within those one can find the %{rhel} variable, which defines an integer we can use.

%if %{rhel} <= 5
BuildRequires: perl
%else
BuildRequires: perl-devel
%endif

And finally I could build from one spec file a backport for CentOS 5 and 6. My diff against the CentOS 7 spec is at https://sven.stormbind.net/centos/netsnmp-bp-el56/net-snmp-backport.from_el7.diff

Drawbacks of the resulting packages:

  1. No tcp-wrapper support. We don't care, but that should be just another package name issue for a conditional BuildRequires, similar to the perl issue.
  2. No mysql support. We do not care; one could change that and/or make it conditional, or build against newer mysql releases where the -libs package is split off.
  3. No DTLS support on CentOS 5. We do not care, and it is not easy to fix.
  4. You have to install the net-snmp-sysvinit package to get back the usual setup you'd expect.
  5. We ripped out the systemd support; I have not yet investigated whether it's possible to keep it included and just not enable it.
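Regarding drawback 2, a conditional along the lines of the perl one should do; this is a sketch with a toggle of my own invention, not something from the RedHat spec:

```
%if 0%{?rhel} >= 6
BuildRequires: mysql-devel
%endif
```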

By now CentOS 5 is old. It's painful to backport to, but it's still possible. Another strategy someone tried was packaging the current net-snmp upstream release based on the CentOS 5 spec file. That would have meant throwing away all 106 patches RedHat currently includes in that package and depending on new upstream releases for our own security support. In case we start using CentOS 7 we would also diverge from the version of snmpd used there. So we decided to stick with the RedHat sources and keep the version consistent across our infrastructure, even if we cannot keep feature parity.

After all, the issues one encounters while backporting packages in the rpm world are similar to those in the dpkg world. Still, it feels different if you're not familiar with the toolchain. For example:

  1. If you build a src.rpm with mock, you have to copy it out of the result directory before you rebuild it with mock to create the binary packages. Mock deletes everything from the result directory when starting a new build, so I guess parallel builds in the same chroot will not work.
  2. Dependencies on files instead of packages are a strange thing.
  3. How on earth do you maintain a patch set of over a hundred patches without something like quilt? I guess you always have to apply them in a for loop on your source directory, then create a copy of that to implement your modifications and generate another patch. I'm pretty sure that eight years ago Fedora was far away from using git, so I would rule out patch branches in a VCS for the moment. Not sure what they invented inside RedHat to keep track of it.
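The two-step mock dance from point 1 looks roughly like this (a sketch; the chroot config name and paths are examples, adjust for your target release):

```
# Build the src.rpm in the target chroot.
mock -r epel-5-x86_64 --buildsrpm --spec net-snmp.spec --sources ./SOURCES

# Copy it out before the next mock invocation wipes the result directory.
cp /var/lib/mock/epel-5-x86_64/result/net-snmp-*.src.rpm .

# Rebuild it into binary rpms.
mock -r epel-5-x86_64 --rebuild net-snmp-*.src.rpm
```

And the quilt-less workflow from point 3 would be something along the lines of a loop over the Patch entries, e.g. `for p in $(awk '/^Patch[0-9]*:/ {print $2}' net-snmp.spec); do patch -p1 < SOURCES/$p; done` (patch levels may vary per patch, so treat this as a rough guess).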
Categories: Elsewhere

Drupal Watchdog: Baby Steps

Planet Drupal - Tue, 02/12/2014 - 18:59



RONNIE RAY (post-modern hippie, disheveled) is slouched at his desk, finishing the NY Times crossword puzzle and knocking back his second powerful coffee of the day (“I buy the gourmet, expensive stuff, ‘cause when I drink it, I wanna taste it.”)

He suddenly straightens up and tosses the Times aside.

RONNIE: (muttering morosely) I need to get serious.


Outside a Flatiron District jazz-club, Ronnie chats with JEAN-CLAUDE (French, cheerful), the venue’s sound engineer. Ronnie smokes a cigarette.

RONNIE: I need a website.

JEAN-CLAUDE: But of course, my friend. I can build you a website – not a problem.

RONNIE: How much will it cost?

JEAN-CLAUDE: The website is for free – I only ask that you commit to my server for a year or two. One hundred dollars a year.

RONNIE: Perfect! Do you use Drupal?

JEAN-CLAUDE: Wordpress.

RONNIE: (sagging in disappointment) No! It has to be Drupal. I’m the copy-editor for Drupal Watchdog. It would be weird – disloyal – to use some other software.

JEAN-CLAUDE: (with a shrug) Sorry.

Ronnie lurches off into the night, engulfed in disappointment.

Suddenly he stops and stands stock still – Eureka!


Ronnie paces excitedly while talking on the phone to his PUBLISHER.

RONNIE: Okay, so I’ve been copy-editing Drupal Watchdog for, what, two years?

PUBLISHER: Five issues, yes.

RONNIE: And I still don’t know a module from a cache, MySQL from Behat from –

PUBLISHER: – And that’s fine. All I ever expected was for you to put it into good English, which is what you do.


Chocolate Lily: Open Outreach welcomes new partner Praxis Labs

Planet Drupal - Tue, 02/12/2014 - 18:24

A managed hosting service will be the first fruit of a new partnership between Chocolate Lily Web Projects and the Montreal-based cooperative Praxis Labs aimed at strengthening and expanding the nonprofit-focused Open Outreach Drupal distribution.

Collaboration between Chocolate Lily and Praxis comes out of a community engagement process that began last fall.


Gregor Herrmann: GDAC 2014/2

Planet Debian - Tue, 02/12/2014 - 17:51

after the GR is before the GR. the next one is about limiting the term of TC members. & it's a pleasure for me to watch how the proponents of different variants & other interested people work together most constructively in order to prepare the best possible ballot for the voters. – hat tip to everyone involved!

this posting is part of GDAC (gregoa's debian advent calendar), a project to show the bright side of debian & why it's fun for me to contribute.


Linnovate: Drupal And the Disappearing Images Mystery

Planet Drupal - Tue, 02/12/2014 - 17:03

After working with a specific framework for many years, you sometimes face difficulties that would not even challenge you in other situations, say while learning a new language or framework.
One example is a case I encountered this past week; to tackle it, all I needed to do was actually read the Drupal docs instead of just flipping through them.
One of my clients came to me and told me that all the images he uploads to his site are being deleted from the files directory of his Drupal project after several hours.
After checking that the images were created successfully in Drupal's temp directory and then moved to the files directory as they should be, I began checking for any file/image related modules and any Drupal configuration that could hint at a relation to the problem.
Having checked those off, I started to look at custom code developed by our programmers. As this is a more time-consuming task I had not started with it, but I knew from the beginning that this was probably where the culprit would be found.
While carefully combing through the code I landed on a Form API snippet related to an image field, similar to this:

<?php
// Use the #managed_file FAPI element to upload an image file.
$form['image_example_image_fid'] = array(
  '#title' => t('Image'),
  '#type' => 'managed_file',
  '#description' => t('The uploaded image will be displayed on this page using the image style chosen below.'),
  '#default_value' => variable_get('image_example_image_fid', ''),
  '#upload_location' => 'public://image_example_images/',
);
?>

This piece of code will add a nice file/image field to the page and will allow you to attach an image to the current entity.
After finding the "managed_file" type documentation, the problem and the solution were clear.

Note: New files are uploaded with a status of 0 and are treated as temporary files which are removed after 6 hours via cron. Your module is responsible for changing the $file object's status to FILE_STATUS_PERMANENT and saving the new status to the database. Something like the following within your submit handler should do the trick.

<?php
// Load the file via file.fid.
$file = file_load($form_state['values']['my_file_field']);
// Change status to permanent.
$file->status = FILE_STATUS_PERMANENT;
// Save.
file_save($file);
// Record that the module (in this example, user module) is using the file.
file_usage_add($file, 'user', 'user', $account->uid);
?>

So, in order to prevent the (weird, in my opinion) automatic cron deletion of uploaded images after 6 hours, you have to add a submit handler and put that piece of code inside it.
To clarify and help those in need, here is an expanded example of the form and submit functions.

$form = drupal_get_form('my_module_example_form');
...
function my_module_example_form($form, &$form_state) {
  $form['image_example_image_fid'] = array(
    '#title' => t('Image'),
    '#type' => 'managed_file',
    '#description' => t('The uploaded image will be displayed on this page using the image style chosen below.'),
    '#default_value' => variable_get('image_example_image_fid', ''),
    '#upload_location' => 'public://image_example_images/',
  );
  $form['submit'] = array(
    '#type' => 'submit',
    '#value' => t('Submit'),
  );
  return $form;
}

function my_module_example_form_validate($form, &$form_state) {
  // Validation logic.
}

function my_module_example_form_submit($form, &$form_state) {
  // Submission logic.
  // Load the file via its fid. Note: use the same key as the form element above.
  $file = file_load($form_state['values']['image_example_image_fid']);
  // Change status to permanent.
  $file->status = FILE_STATUS_PERMANENT;
  // Save.
  file_save($file);
  // Record that the module (in this example, user module) is using the file.
  file_usage_add($file, 'user', 'user', $account->uid);
  // A more generic example of file_usage_add:
  // file_usage_add($file, 'my_module_name', 'user or node or any entity', 'that entity id');
  // You don't need file_usage_add if you're not attaching the image to an entity.
}

Originally posted on my personal blog.


Amazee Labs: DrupalCamp Moscow 2014

Planet Drupal - Tue, 02/12/2014 - 17:00
DrupalCamp Moscow 2014

It may seem weird, but living in Russia I had never attended a Russian Drupal event. I was at DrupalCon Prague 2013 and have attended Ukrainian DrupalCamps several times before. Ukraine is located much closer to the town where I live than the Russian capital is, so for me it's faster to get to the neighbouring country than to visit Moscow.

Amazee Labs travels back to the USSR

This time I decided to visit a Russian Drupal event, DrupalCamp Moscow 2014. I didn't know what to expect. Another country means different people. But with the Drupal community, this rule never applies. Drupal folks are pretty much the same all around the world: sociable, nice, friendly, and always ready to help! Whether you are a professional drupalist, a newbie, or someone who knows nothing about Drupal at all, you are always welcome!

And this is exactly what Boris and Corina were talking about in their blog posts “Be a part of the community” and “Being part of the community - a non-techie perspective”.

@duozersk talks about #angularJS and #drupal http://t.co/KbMheeFSY6 #dcmsk pic.twitter.com/Cqrl6yLifb

— Nikolay Shapovalov (@RuZniki) 29. November 2014


The sessions I attended were good. I learned how Russian drupalists work with Solr, which we use a lot at Amazee Labs, picked up some new techniques for high-performance sites, went to a Drupal 8 theming session, and even saw some cases of using AngularJS with Drupal.

Speaking at Moscow State University, feeling like a professor

For us at Amazee Labs it is an essential part of our company culture to contribute back to the community. We were a proud sponsor of the event, and I shared our knowledge and know-how by giving a presentation about Drupal 8 configuration management. My next post will be about it. Subscribe to our RSS feed, Twitter, or Facebook page so you don't miss it ;)


