Feed aggregator

Chromatic: TheaterMania: Lessons Learned on Localization

Planet Drupal - lun, 16/11/2015 - 21:28

We recently launched a new site for an existing client, TheaterMania. We helped launch and currently maintain and develop The Gold Club, which is a subscription-based discount theater club in New York City. The new site is the same thing, but in London – same language, same codebase, new database, different servers. We only had to migrate users, which were already exported for us, so nothing exceptional there. Shouldn’t be a big deal, right? We learned that’s not always the case.

Architectural Decisions

One of our first problems, besides the obvious localization issues (currency, date formats, language), was to decide what we were shipping. Were we just building another site? Were we packaging software? There will most likely be more sites in other cities in the future – how far did we want to go in terms of making this a product that we could ship? In the end, we wound up going somewhere in the middle. We had to decide initially if we would use Organic Groups to have one site with multiple “clubs,” one Drupal multisite installation, or multiple Drupal installations. The final decision was to combine the latter two choices – we created multisite-style directories so that if we need to take the site in a multi-site direction, we can easily do that. The sites each have a site-specific settings file, full of various configuration variables.

Now that the site has been launched, we're not sure if this list of variables will be developer-friendly moving forward, and we've been keeping in mind that we may want a more elegant solution for it. The best part about this setup is that we have one codebase, one master branch, and each site is configured to use the appropriate settings. The most important thing is that this is all very thoroughly documented: in the code, in README files, and in the repo wiki.
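To make this concrete, here is a minimal sketch of the kind of site-specific settings file we mean; the file path, variable names, and values below are hypothetical illustrations, not the actual TheaterMania configuration:

<?php
// sites/goldclub-london/settings.php (hypothetical path)
// Site-specific configuration variables, loaded only for the London site.
$conf['gc_currency_symbol'] = '£';
$conf['gc_country_default'] = 'GB';
$conf['date_default_timezone'] = 'Europe/London';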

Currency & Recurly: Easier than Expected

One of the issues I thought would be very problematic was currency, but that wasn't actually an issue. All of the existing transactions are set up in cents – i.e., 100 instead of 1.00 for a dollar – and that translates perfectly from dollars to pounds. We use Recurly, an external payment and subscription processor, so we didn't have to worry about any localization issues on that front. Most of the currency abstractions I did were to remove any hard-coded references to the dollar sign, and to create functions and variables to get the appropriate currency symbol.
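A minimal sketch of that kind of abstraction (the function and variable names here are hypothetical, not the actual project code):

/**
 * Returns the currency symbol configured for the current site.
 */
function gc_currency_symbol() {
  // Fall back to the dollar sign if the site does not override the variable.
  return variable_get('gc_currency_symbol', '$');
}

// A subscription price stored in cents/pence, as described above.
$price = 1500;
print gc_currency_symbol() . number_format($price / 100, 2);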

Dealing with Dates; Ugh.

Date formats were something I expected to be easy, but that wound up being more complex. I discovered hook_date_combo_process_alter() to change the display of the date in calendar popup fields. This made what I’d thought was going to be a difficult series of view handlers really simple. We have several fields using the date combo box on both content types and entities, and this function took care of them.

/**
 * Implements hook_date_combo_process_alter().
 *
 * Changes the date format.
 */
function gc_display_date_combo_process_alter(&$element, &$form_state, $context) {
  if (isset($element['#entity']->type)) {
    switch ($element['#entity']->type) {
      case 'event':
        $element['value']['#date_format'] = variable_get('date_format_short');
        break;

      case 'partner':
        $element['value']['#date_format'] = variable_get('date_format_short');
        $element['value2']['#date_format'] = variable_get('date_format_short');
        break;

      case 'promo_offer':
        $element['value']['#date_format'] = variable_get('date_format_short');
        $element['value2']['#date_format'] = variable_get('date_format_short');
        break;
    }
  }
  elseif (isset($element['#entity']->field_name)) {
    if ($element['value']['#instance']['widget']['type'] == 'date_popup' && $element['#entity']->field_name == 'field_user_csr_notes') {
      $element['value']['#date_format'] = variable_get('date_format_short');
    }
  }
}

I took the dozen or so existing date formats from Drupal, altered some of them to meet our needs, and added a few more. My head also started spinning during testing: I'm so used to M/D/Y formats that D/M/Y formats look really strange after a while. Because code changes needed to be tested on both the US and UK sites, I had to be very careful when visually checking a page to make sure that the US page showed 9/1/15 and the UK page showed 1/9/15. In the future, I'd definitely advocate for a testing suite on a project like this. Overall, making sure all of the dates were changed was somewhat tedious, but not difficult. It required a lot of attention to detail, familiarity with PHP date formats, and vigorous testing by the whole team to make sure nothing had been missed.
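As a quick illustration of how the two sites diverge (a hedged sketch, not code from the project, assuming each site overrides the date_format_short variable in its own settings), the same timestamp comes out differently when rendered through the short date type:

// The 'short' date type reads the date_format_short variable, which each
// site can set in its site-specific settings file.
$timestamp = strtotime('1 September 2015');
print format_date($timestamp, 'short');
// US site (m/d/Y-style short format): 09/01/2015
// UK site (d/m/Y-style short format): 01/09/2015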

Proper Use of t() Early == Wins Later

This project made me extremely grateful for the t() function. Since both sites were in English, we didn’t have a need for site-wide translation, but we did need to localize a handful of strings, both for language issues (words like ‘personalize’ vs ‘personalise’), and the general language preference of the stakeholders. It was easy enough to find the strings and list them in locale_custom_strings_en to switch them out. One gotcha we came across that I wasn’t familiar with – you cannot use t() in your settings files. The function isn’t available at that point in the bootstrapping. You can use get_t(), but we opted to remove the translation strings from any variables and make sure that t() was used when the variable was called. This wasn’t something I had run into before, and it caused some problems before we figured it out.
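For reference, here is a minimal sketch of how such overrides can look in a site-specific settings file; the strings below are hypothetical examples rather than the actual overrides we shipped:

<?php
// Swap US spellings for UK spellings on the London site without touching
// the shared codebase. The empty-string key is the default context.
$conf['locale_custom_strings_en'][''] = array(
  'Personalize your membership' => 'Personalise your membership',
  'Favorites' => 'Favourites',
);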


A few tricky miscellaneous problems cropped up, too. There was a geolocation feature enabled in Recurly that defaulted to the US, and we were unable to change the setting. We didn't notice this when testing in the US, and we scratched our heads when the London team told us the field was defaulting to the US – until we came across the culprit. We were able to fix it, and we put in a patch for the library causing the issue.

Working on this project, I also realized how many settings default to the US – a lot of the location-related work was just abstracting out country defaults. It's something to keep in mind if you're working on a project with locations. Don't make more work for developers who live or work on projects outside of the US. Plan for the future! Assume nothing!

Looking Back

I’m really glad that I worked on this project, because it’s made me develop with a better eye for abstraction of all kinds, and making sure that it’s easy for developers or users to work with my code anywhere. In the future, I’d put more thought into managing our configurations from the start, as well as automating the testing process, both for time-saving and better QA.

If you’ve ever worked on a site with challenges like these, I’d love to hear how you handled them! What are your best practices for managing custom locale strings and other site-specific variables? To what extent do you abstract things like dates and currency when developing a site, even when you don’t know if those will ever change?

Catégories: Elsewhere

Steve Kemp: lumail2 nears another release

Planet Debian - lun, 16/11/2015 - 21:15

I'm pleased with the way that Lumail2 development is proceeding, and it is reaching a point where there will be a second source-release.

I've made a lot of changes to the repository recently, and most of them boil down to moving code from the C++ side of the application, over to the Lua side.

This morning, for example, I updated the handling of index.limit to be entirely Lua-based.

When you open a Maildir folder you see the list of messages it contains, as you would expect.

The notion of the index.limit is that you can limit the messages displayed, for example:

  • See all messages: Config:set( "index.limit", "all")
  • See only new/unread messages: Config:set( "index.limit", "new")
  • See only messages which arrived today: Config:set( "index.limit", "today")
  • See only messages which contain "Steve" in their formatted version: Config:set( "index.limit", "steve")

These are just examples that are present as defaults, but they give an idea of how things can work. I guess it isn't so different to Mutt's "limit" facilities - but thanks to the dynamic Lua nature of the application you can add your own with relative ease.

One of the biggest changes, recently, was the ability to display coloured text! That was always possible before, but a single line could only be one colour. Now colours can be mixed within a line, so this works as you might imagine:

Panel:append( "$[RED]This is red, $[GREEN]green, $[WHITE]white, and $[CYAN]cyan!" )

Other changes include a persistent, Lua-based cache of "stuff", the inclusion of at least one luarocks library to parse Date: headers, and a simple API for all our objects.

All good stuff. Perhaps time for a break in the next few weeks, but right now I think I'm making useful updates every other evening or so.

Catégories: Elsewhere

Acquia Developer Center Blog: Open Sourcing Statsgod, a StatsD Implementation In Go

Planet Drupal - lun, 16/11/2015 - 21:10
Kevin Hankens

Acquia Engineering is excited to be open-sourcing Statsgod, a reimplementation of StatsD we created internally to help scale our metrics collection effort.

Acquia developers often create tooling to build, deploy, and monitor applications we run on Amazon Web Services, and Statsgod is one such tool that we want to make publicly available. Statsgod was designed to be highly scalable and easily deployed.

Tags: acquia drupal planet
Catégories: Elsewhere

DrupalOnWindows: Exposing reverse entity reference fields in Drupal

Planet Drupal - lun, 16/11/2015 - 20:55
Language English

Entity references are the mechanism Drupal uses to do some "proper" (sorry for the quotes, but what you can achieve with Drupal is years behind a real ORM such as the Entity Framework in terms of usability, reliability, flexibility and overall quality) data modeling without having to write everything from scratch, including queries, widgets and storage.

More articles...
Catégories: Elsewhere

Daniel Pocock: Quick start using Blender for video editing

Planet Debian - lun, 16/11/2015 - 19:53

Updated 2015-11-16 for WebM

Although it is mostly known for animation, Blender includes a non-linear video editing system that is available in all the current stable versions of Debian, Ubuntu and Fedora.

Here are some screenshots showing how to start editing a video of a talk from a conference.

In this case, there are two input files:

  • A video file from a DSLR camera, including an audio stream from a microphone on the camera
  • A separate audio file with sound captured by a lapel microphone attached to the speaker's smartphone. This is a much better quality sound and we would like this to replace the sound included in the video file.
Open Blender and choose the video editing mode

Launch Blender and choose the video sequence editor from the pull down menu at the top of the window:

Now you should see all the video sequence editor controls:

Setup the properties for your project

Click the context menu under the strip editor panel and change the panel to a Properties panel:

The video file we are playing with is 720p, so it seems reasonable to use 720p for the output too. Change that here:

The input file is 25fps so we need to use exactly the same frame rate for the output, otherwise you will either observe the video going at the wrong speed or there will be a conversion that is CPU intensive and degrades the quality. Also check that the resolution_percentage setting under the picture dimensions is 100%:

Now specify output to PNG files. Later we will combine them into a WebM file with a script. Specify the directory where the files will be placed and use the # placeholder to specify the number of digits to use to embed the frame number in the filename:

Now your basic rendering properties are set. When you want to generate the output file, come back to this panel and use the Animation button at the top.

Editing the video

Use the context menu to change the properties panel back to the strip view panel:

Add the video file:

and then right click the video strip (the lower strip) to highlight it and then add a transform strip:

Audio waveform

Right click the audio strip to highlight it and then go to the properties on the right hand side and click to show the waveform:

Rendering length

By default, Blender assumes you want to render 250 frames of output. Looking in the properties to the right of the audio or video strip you can see the actual number of frames. Put that value in the box at the bottom of the window where it says 250:

Enable AV-sync

Also at the bottom of the window is a control to enable AV-sync. If your audio and video are not in sync when you preview, you need to set this AV-sync option and also make sure you set the frame rate correctly in the properties:

Add the other sound strip

Now add the other sound file that was recorded using the lapel microphone:

Enable the waveform display for that sound strip too; this will allow you to align the sound strips precisely:

You will need to listen to the strips to make an estimate of the time difference. Use this estimate to set the "start frame" in the properties for your audio strip; it will be a negative value if the audio strip starts before the video. You can then zoom the strip panel to show about 3 to 5 seconds of sound and try to align the peaks. An easy way to do this is to look for applause at the end of the audio strips: the applause generates a large peak that is easily visible.

Once you have synced the audio, you can play the track and you should not be able to hear any echo. You can then silence the audio track from the camera by right clicking it, look in the properties to the right and change volume to 0.

Make any transforms you require

For example, to zoom in on the speaker, right click the transform strip (3rd from the bottom) and then in the panel on the right, click to enable "Uniform Scale" and then set the scale factor as required:

Render the video output to PNG

Click the context menu under the Curves panel and choose Properties again.

Click the Animation button to generate a sequence of PNG files for each frame.

Render the audio output

On the Properties panel, click the Audio button near the top. Choose a filename for the generated audio file.

Look on the bottom left-hand side of the window for the audio file settings, change it to the ogg container and Vorbis codec:

Ensure the filename has a .ogg extension

Now look at the top right-hand corner of the window for the Mixdown button. Click it and wait for Blender to generate the audio file.

Combine the PNG files and audio file into a WebM video file

You will need to have a few command line tools installed for manipulating the files from scripts. Install them using the package manager, for example, on a Debian or Ubuntu system:

# apt-get install mjpegtools vpx-tools mkvtoolnix

Now create a script like the following:

#!/bin/bash -e

# Set this to match the project properties
FRAME_RATE=25

# Set this to the rate you desire:
TARGET_BITRATE=1000

WORK_DIR=${HOME}/video1
PNG_DIR=${WORK_DIR}/frames
YUV_FILE=${WORK_DIR}/video.yuv
WEBM_FILE=${WORK_DIR}/video.webm
AUDIO_FILE=${WORK_DIR}/audio-mixed.ogg

NUM_FRAMES=`find ${PNG_DIR} -type f | wc -l`

png2yuv -I p -f $FRAME_RATE -b 1 -n $NUM_FRAMES \
  -j ${PNG_DIR}/%08d.png > ${YUV_FILE}

vpxenc --good --cpu-used=0 --auto-alt-ref=1 \
  --lag-in-frames=16 --end-usage=vbr --passes=2 \
  --threads=2 --target-bitrate=${TARGET_BITRATE} \
  -o ${WEBM_FILE}-noaudio ${YUV_FILE}

rm ${YUV_FILE}

mkvmerge -o ${WEBM_FILE} -w ${WEBM_FILE}-noaudio ${AUDIO_FILE}

rm ${WEBM_FILE}-noaudio

Next steps

There are plenty of more comprehensive tutorials, including some videos on YouTube, explaining how to do more advanced things like fading in and out, or zooming and panning dynamically at different points in the video.

If the lighting is not good (faces too dark, for example), you can right click the video strip, go to the properties panel on the right hand side and click Modifiers, Add Strip Modifier and then select "Color Balance". Use the Lift, Gamma and Gain sliders to adjust the shadows, midtones and highlights respectively.

Catégories: Elsewhere

Pantheon Blog: Better Behavior-Driven Development on Remote Servers

Planet Drupal - lun, 16/11/2015 - 18:48
Behavior-Driven Development is a widely used testing methodology for describing functional tests—that is, tests that operate on the whole of a system—in natural, readable language called Gherkin syntax. The goal of this methodology is to make the contents of the tests approachable to non-technical stakeholders. This makes it possible for a project's functional tests to be meaningfully used as the acceptance criteria for the product.
Catégories: Elsewhere

Red Route: How to add classes to links in Drupal 8

Planet Drupal - lun, 16/11/2015 - 17:48

As I start porting the modules I maintain to Drupal 8, I'm hitting a few places where things haven't been intuitive to me. I'll try to work on the documentation when I get a chance, but in the meantime I figured it would be worth writing up a few notes.

A common task is creating a link and adding classes and other attributes to it. The Responsive Share Buttons module is basically just a block of links to social networks, so this was a key building block.

In Drupal 7 this was pretty simple - the link building function took three arguments - a title, a path, and an array of options:

$link = l(t('Link Title'), '', array(
  'attributes' => array(
    // 'my-class' is a placeholder class name.
    'class' => array('my-class'),
  ),
));

In Drupal 8, the l function now takes a Url object with attributes, rather than a string, so it's a little different. Here's how to build a link to an external URL and add a class to it: First the Url class needs to be brought into scope:

use Drupal\Core\Url;

And then you can build the Url object and call setOptions on it:

$url = Url::fromUri('');
$link_options = array(
  'attributes' => array(
    // 'my-class' is a placeholder class name.
    'class' => array('my-class'),
  ),
);
$url->setOptions($link_options);
$link = \Drupal::l(t('Link title'), $url);

Incidentally, the other gotcha here that had me scratching my head for a while was how to get the current page title and the current URL. Drupal 7 had easily accessible functions for these tasks, but the object-oriented approach in Drupal 8 takes a little more code:

Drupal 7:

$title = drupal_get_title();
$current_url = url(current_path(), array('absolute' => TRUE));

Drupal 8:

$request = \Drupal::request();
$route_match = \Drupal::routeMatch();
$title = \Drupal::service('title_resolver')->getTitle($request, $route_match->getRouteObject());
$current_url = $request->getUri();

My own learning journey with Drupal 8 is very much in its early days, and a lot of the old Drupalisms are pretty familiar to me, but it does seem a little long-winded. Give it a while, and I'm sure I'll get up to speed, and start seeing the benefits of the object-oriented approach in Drupal 8.

Tags: Drupal, Drupal 8
Catégories: Elsewhere

Zivtech: oEmbed in Drupal: Embed all the things!

Planet Drupal - lun, 16/11/2015 - 17:44

WordPress has great support for oEmbed, allowing content creators to paste in URLs that are automatically displayed as rich embedded content. You may also be familiar with similar behavior on Facebook and in chat services like Slack. Meanwhile, in Drupal 7, most sites are using the Media module with their WYSIWYG and are able to (with some effort) embed content from certain providers. In Drupal 8, we finally have WYSIWYG in core, but no solution for adding videos and other embedded content. How can we have the ease of use of WordPress for embedding 3rd party content?

oEmbed Module

The Drupal oEmbed module, despite its humble project description, works nicely, and can easily give you an experience similar to what you get in WordPress. I recommend signing up for an account with the service, and setting the cache lifetime high in the oEmbed module settings. This gives you access to a large number of oEmbed providers, and for many sites, if you use a high cache lifetime, you can stay within the free usage tier. The oEmbed module gives you the option of using an input filter to turn URLs in your textareas into embeds, or you can use the oEmbed Field submodule, which allows you to add link fields and use an oEmbed display formatter.

Asset Module: How are you so awesome and so overlooked?

The ability to use oEmbed in a field got me thinking about one of my favorite (and highly underrated) modules: Asset module. Asset module is essentially an alternative to the widely-used Media module (Scald is a third option in this space). While Media currently boasts 262,680 site installs, Asset is used by a humble 1,167. Drupalers are often advised that a good way to tell which module is the best when choosing between similar modules is to pick based on usage statistics and how much active development is occurring. Unfortunately, this is not foolproof advice: beware echo chambers.

If you've worked with Media module much (disclaimer: I was involved in early stages of Media module architecture and development), you're probably familiar with some of its flaws: an ever-changing variety of complex bugs on its 2.x branch, complicated relationship between Media and File Entity configurations, no straightforward method to add captions to images, multiple dialogs to click through just to add an image, bugs when you disable and re-enable rich text, and difficulty editing items after you add them to the WYSIWYG, to name a few.

Asset module in contrast has a lovely UI, provides common features out of the box (add an image to the WYSIWYG with working captions and right/left alignment), is simple to configure and use, relatively bug-free, and stable. It provides many of the same features as Media, like a library of reusable media assets you can add to a WYSIWYG or display in Views and the ability to add your own fielded bundles for various types of assets. In addition Asset module lets you pick your own WYSIWYG button icons and have a separate button in your WYSIWYG for each type of asset (image, video, document) and unlike Media module it is not directly tied to files. This means you can create Asset types for reusable, centrally-managed structured content that are not file-based at all. I like to make Asset types for things like Addresses and Calls to Action which authors can use within their WYSIWYG. You can quickly explore the wonders of Asset module on its demo site - make sure in addition to the WYSIWYG buttons you try out the 'Asset Widget' on the right side of the content creation page and see how you can drag existing assets into not only textareas but also entityreference fields.

oEmbed with Asset Module

What does this have to do with oEmbed? Well, guess what happens if you add a new Asset type with a link field you set to display as oEmbed? Yup, now you have an Embed button on your WYSIWYG that lets your authors paste in a URL from any of those services, or reuse embeds they've already added to their Asset library. No more adding separate modules to be able to integrate with YouTube, Vimeo, and more. In fact, now we have a better user experience than WordPress! The embeds even show up already rendered right in your WYSIWYG.

Here are some examples of embeds I can put into this WYSIWYG (content from myself and my old band from around the internet):

A song on Rdio

A video on YouTube

A tweet

.@tizzo at work

— Jody Hamilton (@JodyHamilton) June 28, 2015

A photo on Flickr

A LinkedIn user

A Github gist

A JibJab

Editor Experience

Want to see how it looks in my WYSIWYG? Let me embed a screenshot with my Asset image button!

My WYSIWYG right now... Like my caption?

The buttons on the right in my WYSIWYG are for Assets. I have a Document, Image, 'Call to Action', and then Embed Asset Types, followed by the Search button that lets me use my Asset library. By the way, the 'Call to Action' is just a link field that outputs like:

Get Asset Module!

When I press the Embed button, I embed an asset like this:

To add an embed, just paste in a URL. Note you pick your Asset button - here I'm using a heart because I heart this setup. You can also add your own icons (patch in the queue).

Please tune in for Part 2 of this series on how to set this up on your Drupal 7 site and Part 3: Embedding in Drupal 8.

Terms: Publishing Workflow Ready for Publishing
Catégories: Elsewhere

Drupal Camp NJ 2015: Announcing Mike Anello as the Keynote for DrupalCamp NJ 2016!

Planet Drupal - lun, 16/11/2015 - 17:35

Mike Anello (@ultimike) is co-founder and vice president of DrupalEasy, a

Catégories: Elsewhere

Pronovix: Retooling on Drupal 8: free training materials

Planet Drupal - lun, 16/11/2015 - 16:55

We are working on a set of free training materials for Drupal 8. To make sure we build something that others will be able to reuse, we would like to get your input on the kind of training you would like to use to retrain your team.

Catégories: Elsewhere

Drupalpress, Drupal in the Health Sciences Library at UVA: Setting up Shibboleth + Ubuntu 14 + Drupal 7 on AWS with integration

Planet Drupal - lun, 16/11/2015 - 16:48

We've recently begun moving to Amazon Web Services for hosting; however, we still need to authenticate through ITS, who handle the central SSO authentication services for the university. In previous posts we looked at Pubcookie, aka NetBadge – however, Pubcookie is getting pretty long in the tooth (its last release was back in 2010) and we are running Ubuntu 14 with Apache 2, so integrating Pubcookie was going to be a PITA. It was time to look at Shibboleth – an Internet2 SSO standard that works with SAML and is markedly more modern than Pubcookie, allowing federated logins between institutions, etc.

A special thanks to Steve Losen who put up with way more banal questions than anyone should have to deal with… that said, he’s the man

Anyhow – ITS does a fine job of documenting the basics. Since we're using Ubuntu, the only real difference is that we used apt-get.

Here’s the entire install from base Ubuntu 14

apt-get install apache2 mysql-server php5 php-pear php5-mysql php5-ldap libapache2-mod-shib2 shibboleth-sp2-schemas drush sendmail ntp


Apache Set up

On the Apache2 side we enabled some modules and the default SSL site:

a2enmod ldap rewrite  shib2 ssl
a2ensite default-ssl.conf

Back on the Apache2 side, here's our default SSL config:

<IfModule mod_ssl.c>
<VirtualHost _default_:443>
ServerAdmin webmaster@localhost
DocumentRoot /some_web_directory/
<Directory /some_web_directory/>
AllowOverride All
</Directory>

SSLEngine on

SSLCertificateFile /somewheresafe/biocon_hsl.crt
SSLCertificateKeyFile /somewheresafe/biocon_hsl.key

<Location />
AuthType shibboleth
# requireSession 0 means that creating a session is possible, not required.
ShibRequestSetting requireSession 0
require shibboleth
</Location>
</VirtualHost>
</IfModule>

The Location block is important – if you don't have it in the Apache conf, you'll need it in an .htaccess file in the Drupal directory space.

Shibboleth Config

The Shibboleth side confused me for a hot minute.

We used shib-keygen, as noted in the documentation, to create keys for Shibboleth, and ultimately the relevant part of our /etc/shibboleth/shibboleth2.xml looked like this:

<ApplicationDefaults entityID=""
    REMOTE_USER="eppn uid persistent-id targeted-id">

<Sessions lifetime="28800" timeout="3600" relayState="ss:mem"
    checkAddress="false" handlerSSL="true" cookieProps="https">
<!-- We went with SSL required, so change handlerSSL to true and cookieProps to https -->

<SSO entityID="">
<!-- This is the production value; we started out with the testing config - ITS provides this in their documentation -->

<MetadataProvider type="XML" file="UVAmetadata.xml" />
<!-- Once things are working you should be able to find this at - it's a file you download from ITS = RTFM -->
<AttributeExtractor type="XML" validate="true" reloadChanges="false" path="attribute-map.xml"/>
<!-- attribute-map.xml is the only other file you're going to need to touch -->

<CredentialResolver type="File" key="sp-key.pem" certificate="sp-cert.pem"/>
<!-- These are the keys generated with shib-keygen -->
<Handler type="Session" Location="/Session" showAttributeValues="true"/>
<!-- During debug we used the showAttributeValues="true" setting to see what was coming across from the UVa Shibboleth IdP -->

/etc/shibboleth/attribute-map.xml looked like this

<Attribute name="urn:mace:dir:attribute-def:eduPersonPrincipalName" id="eppn">
  <AttributeDecoder xsi:type="ScopedAttributeDecoder"/>
</Attribute>

<Attribute name="urn:mace:dir:attribute-def:eduPersonScopedAffiliation" id="affiliation">
  <AttributeDecoder xsi:type="ScopedAttributeDecoder" caseSensitive="false"/>
</Attribute>
<Attribute name="urn:oid:" id="affiliation">
  <AttributeDecoder xsi:type="ScopedAttributeDecoder" caseSensitive="false"/>
</Attribute>

<Attribute name="urn:mace:dir:attribute-def:eduPersonAffiliation" id="unscoped-affiliation">
  <AttributeDecoder xsi:type="StringAttributeDecoder" caseSensitive="false"/>
</Attribute>
<Attribute name="urn:oid:" id="unscoped-affiliation">
  <AttributeDecoder xsi:type="StringAttributeDecoder" caseSensitive="false"/>
</Attribute>

<Attribute name="urn:mace:dir:attribute-def:eduPersonEntitlement" id="entitlement"/>
<Attribute name="urn:oid:" id="entitlement"/>

<Attribute name="urn:mace:dir:attribute-def:eduPersonTargetedID" id="targeted-id">
  <AttributeDecoder xsi:type="ScopedAttributeDecoder"/>
</Attribute>

<Attribute name="urn:oid:" id="persistent-id">
  <AttributeDecoder xsi:type="NameIDAttributeDecoder" formatter="$NameQualifier!$SPNameQualifier!$Name" defaultQualifiers="true"/>
</Attribute>

<!-- Fourth, the SAML 2.0 NameID Format: -->
<Attribute name="urn:oasis:names:tc:SAML:2.0:nameid-format:persistent" id="persistent-id">
  <AttributeDecoder xsi:type="NameIDAttributeDecoder" formatter="$NameQualifier!$SPNameQualifier!$Name" defaultQualifiers="true"/>
</Attribute>

<Attribute name="urn:oid:" id="eduPersonPrincipalName"/>
<Attribute name="urn:oid:0.9.2342.19200300.100.1.1" id="uid"/>

Those two pieces – the eduPersonPrincipalName and uid attributes – are important: they're going to be the bits that we pipe into Drupal.

For debugging we used the Session handler (configured above) to see what was coming across – once it was all good, we got a response that looks like this:

Session Expiration (barring inactivity): 479 minute(s)
Client Address:
SSO Protocol: urn:oasis:names:tc:SAML:2.0:protocol
Identity Provider:
Authentication Time: 2015-11-16T15:35:39.118Z
Authentication Context Class: urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport
Authentication Context Decl: (none)

uid: adp6j
unscoped-affiliation: member;staff;employee

The uid and eduPersonPrincipalName variables are the pieces we needed to get Drupal to set up a session for us.

Lastly the Drupal bit

The Drupal side of this is pretty straightforward.

We installed Drupal as usual and grabbed the shib_auth module.

We then configured the module's settings, including those on the Advanced tab.

Catégories: Elsewhere

Julien Danjou: Profiling Python using cProfile: a concrete case

Planet Debian - lun, 16/11/2015 - 16:00

Writing programs is fun, but making them fast can be a pain. Python programs are no exception to that, but the basic profiling toolchain is actually not that complicated to use. Here, I would like to show you how you can quickly profile and analyze your Python code to find what part of the code you should optimize.

What's profiling?

Profiling a Python program is doing a dynamic analysis that measures the execution time of the program and everything that compose it. That means measuring the time spent in each of its functions. This will give you data about where your program is spending time, and what area might be worth optimizing.

It's a very interesting exercise. Many people focus on local optimizations, such as determining e.g. which of the Python functions range or xrange is going to be faster. It turns out that knowing which one is faster may never be an issue in your program, and that the time gained by one of the functions above might not be worth the time you spend researching that, or arguing about it with your colleague.

Trying to blindly optimize a program without measuring where it is actually spending its time is a useless exercise. Following your guts alone is not always sufficient.

There are many types of profiling, as there are many things you can measure. In this exercise, we'll focus on CPU utilization profiling, meaning the time spent by each function executing instructions. Obviously, we could do many more kind of profiling and optimizations, such as memory profiling which would measure the memory used by each piece of code – something I talk about in The Hacker's Guide to Python.


Since Python 2.5, Python provides a C module called cProfile which has a reasonable overhead and offers a good enough feature set. The basic usage goes down to:

>>> import cProfile
>>> cProfile.run('2 + 2')
2 function calls in 0.000 seconds
Ordered by: standard name
ncalls tottime percall cumtime percall filename:lineno(function)
1 0.000 0.000 0.000 0.000 <string>:1(<module>)
1 0.000 0.000 0.000 0.000 {method 'disable' of '_lsprof.Profiler' objects}

Though you can also run a script with it, which turns out to be handy:

$ python -m cProfile -s cumtime
72270 function calls (70640 primitive calls) in 4.481 seconds
Ordered by: cumulative time
ncalls tottime percall cumtime percall filename:lineno(function)
1 0.004 0.004 4.481 4.481<module>)
1 0.001 0.001 4.296 4.296
3 0.000 0.000 4.286 1.429
3 0.000 0.000 4.268 1.423
4/3 0.000 0.000 3.816 1.272
4 0.000 0.000 2.965 0.741
4 0.000 0.000 2.962 0.740
4 0.000 0.000 2.961 0.740
2 0.000 0.000 2.675 1.338
30 0.000 0.000 1.621 0.054
30 0.000 0.000 1.621 0.054
30 1.621 0.054 1.621 0.054 {method 'read' of '_ssl._SSLSocket' objects}
1 0.000 0.000 1.611 1.611
4 0.000 0.000 1.572 0.393
4 0.000 0.000 1.572 0.393
60 0.000 0.000 1.571 0.026
4 0.000 0.000 1.571 0.393
1 0.000 0.000 1.462 1.462
1 0.000 0.000 1.462 1.462
1 0.000 0.000 1.462 1.462
1 0.000 0.000 1.459 1.459

This prints out all the functions called, with the time spent in each and the number of times they have been called.

Advanced visualization with KCacheGrind

While useful, the output format is very basic and does not make it easy to get an overview of a complete program. For more advanced visualization, I leverage KCacheGrind. If you have done any C programming and profiling in the last few years, you may have used it, as it is primarily designed as a front-end for Valgrind-generated call-graphs.

In order to use it, you need to generate a cProfile result file, then convert it to the KCacheGrind format. To do that, I use pyprof2calltree.

$ python -m cProfile -o myscript.cprof
$ pyprof2calltree -k -i myscript.cprof

And the KCacheGrind window magically appears!

Concrete case: Carbonara optimization

I was curious about the performances of Carbonara, the small timeserie library I wrote for Gnocchi. I decided to do some basic profiling to see if there was any obvious optimization to do.

In order to profile a program, you need to run it. But running the whole program in profiling mode can generate a lot of data that you don't care about, and adds noise to what you're trying to understand. Since Gnocchi has thousands of unit tests and a few for Carbonara itself, I decided to profile the code used by these unit tests, as it's a good reflection of basic features of the library.

Note that this is a good strategy for a curious and naive first-pass profiling. There's no way that you can make sure that the hotspots you will see in the unit tests are the actual hotspots you will encounter in production. Therefore, a profiling in conditions and with a scenario that mimics what's seen in production is often a necessity if you need to push your program optimization further and want to achieve perceivable and valuable gain.

I activated cProfile using the method described above, creating a cProfile.Profile object around my tests (I actually started to implement that in testtools). I then run KCacheGrind as described above. Using KCacheGrind, I generated the following figures.

The test I profiled here is called test_fetch and is pretty easy to understand: it puts data in a timeserie object, and then fetches the aggregated result. The above list shows that 88% of the ticks are spent in set_values (44 ticks out of 50). This function is used to insert values into the timeserie, not to fetch the values. That means that it's really slow to insert data, and pretty fast to actually retrieve it.

Reading the rest of the list indicates that several functions share the rest of the ticks, update, _first_block_timestamp, _truncate, _resample, etc. Some of the functions in the list are not part of Carbonara, so there's no point in looking to optimize them. The only thing that can be optimized is, sometimes, the number of times they're called.

The call graph gives me a bit more insight about what's going on here. Using my knowledge about how Carbonara works, I don't think that the whole stack on the left for _first_block_timestamp makes much sense. This function is supposed to find the first timestamp for an aggregate, e.g. with a timestamp of 13:34:45 and a period of 5 minutes, the function should return 13:30:00. The way it works currently is by calling the resample function from Pandas on a timeserie with only one element, but that seems to be very slow. Indeed, currently this function represents 25 % of the time spent by set_values (11 ticks on 44).

Fortunately, I recently added a small function called _round_timestamp that does exactly what _first_block_timestamp needs, without calling any Pandas function, so no resample. So I ended up rewriting that function this way:

def _first_block_timestamp(self):
- ts = self.ts[-1:].resample(self.block_size)
- return (ts.index[-1] - (self.block_size * self.back_window))
+ rounded = self._round_timestamp(self.ts.index[-1], self.block_size)
+ return rounded - (self.block_size * self.back_window)

And then I re-run the exact same test to compare the output of cProfile.

The list of functions looks quite different this time. The share of time spent in set_values dropped from 88% to 71%.

The call stack for set_values shows that pretty well: we can't even see the _first_block_timestamp function as it is so fast that it totally disappeared from the display. It's now being considered insignificant by the profiler.

So we just sped up the whole insertion process of values into Carbonara by a nice 25% in a few minutes. Not bad for a first naive pass, right?

Catégories: Elsewhere

Drupal Commerce: Contributor Spotlight: Joël Pittet

Planet Drupal - lun, 16/11/2015 - 15:53
Say hi. (who are you and what do you do in the Commerce ecosystem)

Hi :) My name is Joël Pittet and I'm from Vancouver, BC, Canada. I offered to help co-maintain commerce_discount and a few other Commerce modules, and I'm likely to be involved in messing about with patches all over the Commerce ecosystem.

How did you get involved with contributing to Drupal Commerce?

Started working on a Drupal Commerce project, noticed things could use some fixing up and jumped in the deep end. I was recognized for helping triage the commerce queue in a fervor to fix all the things.

Catégories: Elsewhere

Drupal Easy: DrupalEasy Podcast 164 - Dentistry (Paul Johnson - Drupal Social Media)

Planet Drupal - lun, 16/11/2015 - 15:18
Download Podcast 164

Paul Johnson (pdjohnson) joins Mike Anello and Ted Bowman to talk about Drupal's social media presence, how community members can get involved, and the forthcoming release of Drupal 8!

read more

Catégories: Elsewhere

Jim Birch: No more View pages

Planet Drupal - lun, 16/11/2015 - 11:00

Views has long been one of the magic pieces that makes Drupal my CMS of choice.  Views allows us to easily create queries of content in the UI, giving great power to the site builder. 

When you first create a view, the default, obvious choice is to create a "Page" display for the view. A Page has a URL that people can visit to see the information, and it gets us as site builders closer to the job being done. However, I don't want you to do it!

When you first create a view, the options are to make a Page and/or a Block. Selecting neither will still give you a "Master" display, and additional modules can hook into it and add additional displays for your view. In the screenshot below, you see we have additional displays of Attachment, Content pane, Context, and Feed in addition to the Block and Page displays.

All of our sites already have some sort of "Page" content type for the basic content of the site. On this page content type, we add fields, set meta descriptions, get added to the XML sitemap, and include the pages in Drupal's core search. When you create a view page, we only get the output at a URL; we miss the benefit of having a "Page" node at that URL.
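One lightweight way to follow this advice (a hedged Drupal 7-style sketch with a hypothetical view name, not code from this post) is to keep the view as a block or Master display and print it from the page node's template, so the listing lives on a real node:

<?php
// In a node template or preprocess function: render the 'block_1' display of
// a hypothetical 'news_listing' view inside an ordinary Page node.
print views_embed_view('news_listing', 'block_1');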

Read more

Catégories: Elsewhere

Chris Hall on Drupal 8: Drupal Site Builder role

Planet Drupal - lun, 16/11/2015 - 09:59
Drupal Site Builder role chrishu Mon, 11/16/2015 - 08:59
Catégories: Elsewhere

Wouter Verhelst: terrorism

Planet Debian - lun, 16/11/2015 - 09:07

noun | ter·ror·ism | \ˈter-ər-ˌi-zəm\ | no plural

The mistaken belief that it is possible to change the world through acts of cowardice.

They killed a lot of people, but their terrorism only intensified the people's resolve.

Catégories: Elsewhere

DrupalCon News: 7 Things You Must Experience in India

Planet Drupal - lun, 16/11/2015 - 01:47

India is shaped by countless influences, from centuries old civilizations to modern day technology. In its long journey, the country has absorbed many different cultures, which have given it different dimensions. You must experience some of these when you come for DrupalCon Asia 2016. Here is a list of seven unique experiences that will make your trip worth remembering.

Catégories: Elsewhere

Norbert Preining: Movies: Monuments Men and Interstellar

Planet Debian - lun, 16/11/2015 - 01:01

Over the rainy weekend we watched two movies: Monuments Men (in Japanese it is called Michelangelo Project!) and Interstellar. Both are blockbuster movies from the usual American companies, but they are light-years apart when it comes to quality. Monuments Men is boring, without a story, without depth, historically inaccurate – a complete failure. Interstellar, although a long movie, keeps you frozen in your seat while being as scientific as possible, and it starts your brain working heavily.

My personal verdict: 3 rotten eggs (because Rotten Tomatoes are not stinky enough) for the Monuments Men, and 4 stars for Interstellar.


First, the plots of the two movies: The Monuments Men is loosely based on a true story about rescuing pieces of art at the end of the Second World War, before the Nazis destroy them or the Russians take them away. A group of art experts is sent into Europe and manages to find several hiding places of art taken by the Nazis.

Interstellar is set in a near future where conditions on Earth are deteriorating to such a degree that human life seems likely to soon become impossible. Some years before the movie takes place, a group of astronauts was sent through a wormhole into a different galaxy to search for new inhabitable planets. Now it is time to check out these planets and try to establish colonies there. Cooper, a retired NASA officer and pilot, now working as a farmer, and his daughter are guided in some mysterious way to a secret NASA facility. Cooper is drafted to be a pilot on the reconnaissance mission, and leaves Earth and our galaxy through the same wormhole. (Not telling more!)

Monuments Men

Looking at the cast of Monuments Men (George Clooney, Matt Damon, Bill Murray, John Goodman, Jean Dujardin, Bob Balaban, Hugh Bonneville, and Cate Blanchett), one would expect a great movie – but from the very first to the very last scene, it is a slowly meandering, shallow flow of stuck-together scenes without much coherence. Tension is generated only through unrelated events (stepping onto a landmine, patting a horse) and never developed properly. The dialogue is shallow and boring – with one exception: when Frank Stokes (George Clooney) meets the one German and inquires about the art, predicting his future of being hanged.

Historically, the movie is as inaccurate as it can be – despite Clooney stating that “80 percent of the story is still completely true and accurate, and almost all of the scenes happened”. That contrasts starkly with the verdict of Nigel Pollard (Swansea University): “There’s a kernel of history there, but The Monuments Men plays fast and loose with it in ways that are probably necessary to make the story work as a film, but the viewer ends up with a fairly confused notion of what the organisation was, and what it achieved.”

The movie leaves a bitter aftertaste, hailing American heroism paired with the usual stereotypes (French amour, Germans retarded, Russian ignorance, etc.). Together with the half-baked dialogue, it feels like a permanent coitus interruptus.


Interstellar cannot boast a similar cast, but it still has a few well-known names (Matthew McConaughey, Anne Hathaway, and Michael Caine!). But I believe this is actually a sign of quality. Carefully balancing scientific accuracy with the requirements of a blockbuster, the movie successfully bridges the gap between complicated science, in particular general relativity, and entertainment. While I would not go so far as to call the movie edutainment (like both the old and new Cosmos), it is surprising how much hard science is packed into it. This is mostly thanks to the theoretical physicist Kip Thorne acting as scientific consultant for the movie, but also due to the director Christopher Nolan being serious about it and studying relativity at Caltech.

Of course, scientific accuracy has limits – nobody knows what happens if one crosses the event horizon of a black hole, and even the existence of wormholes is, for now, purely theoretical. Still, throughout, the movie follows the two requirements laid out by Kip Thorne: "First, that nothing would violate established physical laws. Second, that all the wild speculations… would spring from science and not from the fertile mind of a screenwriter."

I think the biggest compliment was that, despite the length, despite a long day out (see the next blog post), and despite the rather unfamiliar topic, my wife, who is normally not interested in space movies and the like, didn't fall asleep during the movie, and I had to stop several times to explain details of the theory of gravity and astronomy. So in some sense it was perfect edutainment!

Catégories: Elsewhere

Darren Mothersele: How to Survive Gentrification of the Drupal Community

Planet Drupal - lun, 16/11/2015 - 01:00

We're finally approaching the release of Drupal 8.0.0 on 19th Nov. The biggest achievement of the Drupal community to date. A complete rewrite of the core system to use modern object-oriented PHP. An effort that is often referred to as "getting off the island".

While the switch from Drupal 7 to Drupal 8 is a big change for developers, it is the result of a slow process of maturation of the Drupal community. Drupal 8 brings changes that will be welcomed by many, will bring in many new users, and, of course, will push a few people out. How can we survive this "gentrification" of the Drupal community and prosper without losing touch with why we loved Drupal in the first place?


Cities all over the world are becoming more exclusive, more expensive, and a natural result of this is gentrification. It's contentious. Some see this as urban improvement, some as social cleansing.

I moved to London nearly 12 years ago. Dalston, to be precise. I was back in Dalston this weekend for a party, and it's very different to how I remember it from 2004. I compared the nice clean Overground train to the unreliable and dirty Silverlink trains that used to run to Dalston. Then there was walking down Kingsland Road without being on guard – when I lived there in 2004 it was often cordoned off by police. The hipsters, the trendy coffee shops, and other obvious signs of gentrification proliferate.

Brixton was my home for many years, and I witnessed first hand the results of gentrification. I had an office space in Brixton, and decided to leave it when the landlord announced he was increasing the rent by 25%. I lived in several flats around Brixton over the years, and eventually moved (a bit) further south as rental prices in Brixton soared. I say this with tongue in cheek, well aware that to many I'd be seen as one of the gentrifiers! It's the communities that settled here during the 1940s and 1950s that gave the area its eclectic multi-cultural feel. They're the ones who have been displaced, losing their homes and community as developers and "yuppies" take over.

Gentrification of the Drupal Community

I first used Drupal back in 2003, version 4 point something. It was fun. Hacky, but fun. I had to quickly get a site up for an event we were organising, and Drupal offered a collaborative content model that set it apart from the other products we evaluated.

I came back to Drupal in 2007 for another community site build, and Drupal 5 had been released. It was really fun. Yes, still very hacky, but it came with the power to build a CMS exactly the way I wanted it to work, and it came with an awesome community of other hackers. A community of dedicated open-source types, who valued openness, and working on projects for good. I was hooked and made the leap to full time Drupal development. Through Drupal I got involved in the first social innovation camp, and other tech-for-good type things.

Szeged 2008 was my first DrupalCon: 500 Drupal contributors and users in a small university town in Hungary. Everyone I met truly cared about making Drupal an awesome project and was contributing time and effort in any way they could. Several years later, DrupalCon has grown: 2000+ attendees in Barcelona this year, 2300+ in Amsterdam last year. But as the community has grown, so has the commercial influence, with sales pitches as prevalent as learning sessions on the schedule.

One thing I noticed this year was that several sessions concluded with, or included, a call for donations or funding to accelerate a particular module or project's development. The precedent was set in the opening session of the conference, when the Drupal Association made an announcement about the Drupal 8 Accelerate funding programme. I'm not saying this is a bad thing. If this is what it takes to get Drupal finished in today's conditions, then that's great. But look at it as an indicator of how the community has changed when compared to the sessions at Szeged seven years earlier. You would not have seen a call for a quarter of a million dollars in funding back then. Everyone was there because they loved it, not because they were being paid.

Hacking the hackers

While doing research for this post, I came across this brilliant essay, The hacker hacked, by Brett Scott about the gentrification of hacker culture. I quote his summary of the gentrification process:

Key to any gentrification process are successive waves of pioneers who gradually reduce the perceived risk of the form in question. In property gentrification, this starts with the artists and disenchanted dropouts from mainstream society who are drawn to marginalised areas. This, in turn, creates the seeds for certain markets to take root. A WiFi coffeeshop appears next to the Somalian community centre. And that, in turn, sends signals back into the mainstream that the area is slightly less alien than it used to be.

If you repeat this cycle enough times, the perceived dangers that keep the property developers and yuppies away gradually erode. Suddenly, the tipping point arrives. Through a myriad of individual actions under no one person’s control, the exotic other suddenly appears within a safe frame: interesting, exciting and cool, but not threatening. It becomes open to a carefree voyeurism, like a tiger being transformed into a zoo animal, and then a picture, and then a tiger-print dress to wear at cocktail parties. Something feels ‘gentrified’ when this shallow aesthetic of tiger takes over from the authentic lived experience of tiger.
-- Brett Scott

How does this relate to the Drupal community? Perhaps it starts with the NGOs and charities, our original flagship Drupal sites, that became our "artists and disenchanted dropouts from mainstream society". Then the big media companies move in as the "perceived dangers gradually erode". Eventually, The White House started using Drupal, and we're at home with the large enterprise clients and big corporate contracts.

As the Drupal project developed, the requirements changed. Drupal's capabilities improved, and the Drupal user base and community advanced too.

This is evident in the development and standardisation of things like configuration management. It was never an issue in the early days; as the community became more professional, solutions for configuration management were hacked together and then became standardised.

Configuration management is just one example of the many benefits the Drupal community has experienced through the process of gentrification. There's also great test coverage, performance improvements, greater tooling, and many other advancements that came to Drupal as the community matured. Drupal became less about hacking and more about software engineering.

Drupal 8

Development on Drupal 8 started in March 2011 and, four years later, it is set to be released on November 19, 2015. Over these years, Drupal has been rewritten, removing most of the pre-OO era PHP legacy.

Drupal's legacy was the "not invented here" mindset that became entrenched in the community through hacking solutions to extensibility into a language that was not designed to support it. And, a culture of not depending on third-party code due to early well publicised security issues with PHP extensions.

The move away from this legacy, the move to "get off the island", is a move towards more standardised, modern development practices, and a move to embrace the wider PHP community.

Social cleansing

I mentioned before that gentrification is contentious: some see it as urban improvement, some as social cleansing. Drupal and the Drupal community have clearly benefitted already, and it looks like prosperous times are ahead for those who come along for the ride, and for the newcomers who join and adopt Drupal.

But what about the social cleansing? Will parts of the community be pushed out? Who gets left behind?

Drupal has suffered from an identity crisis. Because of its flexibility, it's been used for many things. Drupal's openness to hacking and extending, and its ability to do just about anything, meant it was more than just a CMS. Over the years many talked about "small core", and many used Drupal's core tools as a framework, building apps and tools well beyond what a typical CMS would be used for.

Drupal 8 is a content management system.

Drupal 8 focuses on content management, on providing tools for non-technical users to build and manage sites. That's what it always wanted to be anyway.

Drupal 8 leverages the wider PHP community, in particular the Symfony components, at its core. It no longer makes sense to see Drupal as a framework.

One part of the community being displaced is those using Drupal as a framework. If this is you, then you may already be looking at a fork, like Backdrop, or playing with other frameworks, like the beautiful Laravel.

Another section of the community that may be displaced are those running Drupal on low-end and shared hosting. Through the gentrification process, Drupal's requirements have increased. The increased hosting requirements have meant that dedicated Drupal platform hosting providers have emerged. More options for scalability and custom software stacks have taken precedence over solutions for smaller websites.

Drupal also potentially loses the innovators. Drupal has always had a reputation for being cutting edge and innovative. As it moves to become the enterprise choice of open-source CMS, innovation becomes less important, while stability, security, and backwards compatibility become more important. The biggest innovations in Drupal (flexible content types and Views) date back to the 4.7 era. Views is now in core in Drupal 8. As Drupal matures further from this point, we'll probably see Drupal adopting innovations from other systems and ecosystems, rather than innovating on its own. It's well placed to do this now: built on Symfony components, innovations from the wider community will be easier to integrate.

Surviving Gentrification Do you abandon the form, leave it to the yuppies and head to the next wild frontier? Or do you attempt to break the cycle, deface the estate-agent signs, and picket outside the wine bar with placards reading "Yuppies Go Home"?
-- Brett Scott

Or do you come along for the ride, and enjoy the benefits of gentrification without losing the reason why you got involved in the first place?

If you're going to stick around, then you're going to need to change a few things. Here are 5 steps that will get you started:

1. Learn the foundations that Drupal is now built on.

If (like me) you've got a background in OO then this shouldn't be too hard. I did several years of post-graduate research into semantics and verification of object-oriented software. You definitely don't need to go that deep, but I would highly recommend getting to grips with classic works on design patterns such as Gang of Four and Martin Fowler.

With a basic understanding of the core "patterns" of object-oriented software, you start to appreciate how Symfony works.

Drupal, Silex, Laravel, Symfony Full Stack, Symfony CMF, phpBB, Joomla, Magento, Piwik, PHPUnit, Sonata, and many more projects are built on this same foundation. So, it's definitely worth learning, and Drupal can be a good way to learn it, while still working with a system you know well.

Try building a simple app with Silex.
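For instance, here is a minimal Silex sketch (assuming silex/silex has been installed via Composer); it is only an illustration, not code from this post:

<?php
// index.php
require_once __DIR__ . '/vendor/autoload.php';

$app = new Silex\Application();

// A single route that greets the visitor.
$app->get('/hello/{name}', function ($name) use ($app) {
  return 'Hello ' . $app->escape($name);
});

$app->run();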

Check out Drupalcon (and Laracon) on YouTube. There's some great stuff. Like this talk from Ryan Weaver about Symfony and this talk by Ross Tuck about Models and Service Layers.

2. Do PHP the right way.

PHP has changed. There's a lot of outdated information and a lot of legacy code. Drupal 8 has been rewritten to remove this legacy code, but there's still a lot of bad advice on how to write PHP out there. Read PHP The Right Way for a full guide on how modern PHP should be crafted.
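As a small, hedged illustration of the style that guide encourages (namespaces, injected dependencies, no global state), rather than anything from the original post:

<?php

namespace Example;

// Dependencies are passed in, not pulled from globals.
class Greeter
{
    private $greeting;

    public function __construct($greeting)
    {
        $this->greeting = $greeting;
    }

    public function greet($name)
    {
        return sprintf('%s, %s!', $this->greeting, $name);
    }
}

$greeter = new Greeter('Hello');
echo $greeter->greet('Drupal 8');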

3. Use Composer, use and create PHP packages.

Getting off the island and embracing the wider PHP ecosystem means using Composer and its ecosystem of PHP packages. There are many more packages that are potentially compatible with Drupal, and by architecting your Drupal extensions as more general PHP packages you have access to a much wider pool of potential collaborators.

Creating PHP packages also forces you to write clean code, think like a software engineer, and write more maintainable, extensible, and reusable code. Check out The PHP League as examples of solid PHP packages. They have a good Skeleton starting package.

You may have made custom Drupal modules before. Try thinking about how you can refactor them into separate packages, using the Drupal "module" as a small layer that integrates your logic with Drupal, as sketched below.
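A hedged sketch of that split, using entirely hypothetical names (the package class, module name, and block delta are all invented for illustration):

<?php
// mymodule.module - the Drupal layer stays thin and delegates to the package.

use MyVendor\Reports\MonthlySummary; // Hypothetical class from a Composer package.

/**
 * Implements hook_block_view().
 */
function mymodule_block_view($delta = '') {
  $block = array();
  if ($delta === 'monthly_summary') {
    // All of the real logic lives in the reusable, framework-agnostic package.
    $summary = new MonthlySummary();
    $block['subject'] = t('Monthly summary');
    $block['content'] = array('#markup' => check_plain($summary->render()));
  }
  return $block;
}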

The SOLID principles will guide you towards creating good packages.

4. Use an IDE

This was a big one for me. I was always against using an IDE, burnt by early experiences with open-source IDEs. I settled on a customised Sublime Text setup and various other apps. I didn't see much benefit in using one app for everything when I could combine a selection of my favorite apps to do the same thing.

I'm not sure why I stuck to this. I also do a lot of C++ programming: I have my own programming language (Cyril) for creating audio-reactive visuals, and I use Xcode for C++ as the debugging tools are essential when you're dealing with object graphs, memory management, and pointer issues. So why not use an IDE for my web development?

I tried PHPStorm and it's great. Far from the cumbersome experience I had in the early days with open-source IDEs, it offers a smooth, fast, integrated experience.

I think you can get away without an IDE when you're hacking on Drupal 7, but on an OO system like Drupal 8 you will need an IDE. You will need the integrated tooling, testing, and you'll be much more efficient with intelligent autocompletion, hinting, quick access to docs, and fast navigation of the huge codebase.

5. Identify your values and serve your purpose.

As the corporates, enterprises, and big businesses take over, it's important to remain true to yourself. By identifying your values you will be well placed to notice when they are being compromised.

You probably got into open-source because you believe in the power of collaboration. But, this value of collaboration can often be at odds with the cut-throat corporate culture of competition.

To be aware of this is to be aware of the opportunity to spread openness and collaboration with our work.

As the proceeds of Drupal's success flow into the community, it's important to use this to do good. To continue to serve our communities and society as a whole. To enable collaboration, share our work, and use openness to build the world we want.

Final thoughts

The real opportunity is to spread Drupal's values of cooperation to the wider population.

This is part of a bigger shift in society to adopt open-source values, principles, and methodologies. Chris Anderson says it best:

If the past ten years have been about discovering new social and innovation models on the Web, then the next ten years will be about applying them to the real world.
-- Chris Anderson

The Work Open Manifesto offers a useful formulation of what it means to be open that can apply beyond open source software: "Think Big, Start Small, Work Open".

Drupal is a great case study in starting small, thinking big, and working openly.

The Drupal community has always been transforming: improving ourselves, improving the product, improving our practices, and improving our tools.

Now it's time to think beyond Drupal, beyond the Drupal community, and to see Drupal's values of collaboration, teamwork, and openness spread through the wider community, society, and the world.

Catégories: Elsewhere


Subscribe to the jfhovinne aggregator