Planet Drupal


Frederick Giasson: Specifying Field Widgets for OSF Entities Fields (Screencast)

Tue, 29/04/2014 - 14:31

In this screencast, I will show you how to use ontologies to specify the field types used for the classes and properties we map into Drupal with the OSF Entities mapping process. Once the field types are configured for each datatype property, I will run the mapping process to generate new fields that use the configured field types. Finally, I will show the impact of this configuration on the fields and field instances that are created in Drupal.

The second part of this screencast focuses on configuring the field widgets used by each field. I will then update a few entities using the new forms, and show how you can modify the forms by re-ordering the fields, changing their titles, or adjusting other configuration options such as cardinality.

OSF Entities supports 18 field types and 34 field widgets.


Categories: Elsewhere

Acquia: It Works in the Wild: Open Source + Business - Meet me at re:publica 2014!

Tue, 29/04/2014 - 14:06
Meet me in Berlin this May at re:publica!

Categories: Elsewhere

InternetDevels: Node.js installation and setup

Tue, 29/04/2014 - 13:46

Node.js is an open-source platform for writing server-side web applications. It is based on an event-driven, asynchronous programming model with non-blocking I/O. The platform is designed to run standalone web applications written in JavaScript, and internally uses the Google V8 JavaScript engine to execute code.

Categories: Elsewhere

BryceAdamFisher: DrupalCon Austin, Texas - Video Module Sprint

Tue, 29/04/2014 - 02:00

I’m super excited for DrupalCon Austin! Since becoming a new maintainer on the illustrious Video Module, I’ve been gathering feedback from the issue queue and wanting to really dive deep on the outstanding bugs and low-hanging feature requests. If you’re there, let’s meet up for a Birds of a Feather.

The Week’s Hitlist

My goal for DrupalCon Austin is to roll out a new release (video-7.x-2.11). Ideally, I’d like to:

  • Streamline the process from drush dl video to seeing a player on the page
  • Make sure that Firefox can play transcoded videos
  • Make responsive players easier to setup
  • Roll out some REAL documentation on

I’m definitely open to other features and bug fixes – just show up, and we’ll see if we can roll it into the new release.

Newbies Welcome

Never rolled a patch before? Struggling with arcane git commands? Just message / tweet me and we’ll work on it together while updating the module.

Experts Welcome Too!

Personally, I’m still getting up to speed on FFmpeg and some of the subtleties of video. I’d love to get some more pointers during this sprint. If all goes according to plan, we’ll be creating a new release. I haven’t done this before – if you’re an experienced module maintainer, I’d love to have you on my team!

Categories: Elsewhere

AGLOBALWAY: Google Chrome and testing for mobile

Mon, 28/04/2014 - 23:01
Google Chrome has many versions to download, and it's now even easier to debug a mobile experience on your laptop or desktop computer. Chrome is currently on version 34 (Stable), but give the rest a try: 35 (Beta), 36 (Dev) and 37 (Canary), for both Windows and Mac. Version 34+ supports mobile emulation, which can be found in the DevTools (F12 on Windows, Cmd+Option+I on Mac).
  1. Open the Settings panel within the DevTools.
  2. Enable "Show 'Emulation' view in console drawer."
* ESC is a shortcut to open the drawer once DevTools is open.

Emulation is meant to make the development process faster and to make it easier to develop mobile-first. Tackle those pesky little bugs that appear only on mobile by using the inspector to see which DOM elements and CSS rules are being processed. Built into the emulator are a number of devices to test, from an iPhone 3GS to a BlackBerry PlayBook. This allows you to emulate and debug mobile viewport issues like CSS media query breakpoints. Selecting a device preset automatically enables a number of settings for that specific device, such as: user agent, screen resolution, device pixel ratio, text autosizing and, of course, touch events.
  1. Open the Emulation panel within the DevTools
  2. With the Device pane selected, select "iPhone 4".
  3. Click Emulate.
Now you can swap dimensions, shrink to fit, and see all your media query rules and how they break.

To enable support for touch event emulation:
  1. Open the Emulation panel in the DevTools.
  2. Enable "Emulate touch screen" in the Sensors pane.
Now your mouse will emulate your finger on the screen. Shift + drag to emulate a "pinch" to zoom. Give this canvas finger paint demo a try: enable a device and refresh the page.

User Agent spoofing
  1. In the DevTools, open up the User Agent pane within the Emulation panel.
  2. Check "Spoof user agent" and select "iphone 3gs".
  3. Refresh the page. And enjoy.
You can even enable Geolocation Overrides, Device Orientation Overrides and CSS Media Type Emulation. If you have an Android device, you can remote debug Chrome on Android.

Here are some talks about Chrome and the DevTools. Here is where to download the different versions of Chrome.

Using this Chrome developer tool will speed up your workflow as you develop mobile-first, and will help you see what is happening to your CSS/JS/HTML when it is displayed on a smaller screen. There are many more features in Chrome; you just have to dig around and play a little.

Tags: Mobile, drupal planet
Categories: Elsewhere

Drupal @ Penn State: HTML5 Audio recording with Drupal

Mon, 28/04/2014 - 21:14

This is still highly experimental, but everyone I've shown it to has been really excited, so I figured I'd do a quick video. Using a few modules and HTML5 in-browser audio recording, you can click a button and save your voice to your Drupal site as a file field. This can bring new life to commenting on a private forum where that might be valuable or, in our context, allow instructors to give audio feedback to students about their work in ways text alone can't describe.

Categories: Elsewhere

Drupal Easy: DrupalEasy Podcast 128: Better Than a Double Root Canal (Drupal 8 Theming: MortenDK)

Mon, 28/04/2014 - 20:17
Download Podcast 128

Morten Birch Heide-Jørgensen (mortendk), owner of geek Royale, a one-man design and theming shop in Copenhagen, Denmark, the King of DrupalCon, and Drupal Association At-Large Board of Directors member, joins Ryan, Ted, and Mike to discuss Drupal 8 theming, NYC Camp (again), Drupal 6 support, and possible growing pains for Acquia. On top of all that, we have picks of the week and Morten answers our “5 questions”!


Categories: Elsewhere

Drupal core announcements: Let's fix our file extensions in D8?

Mon, 28/04/2014 - 19:47

We have an opportunity to finally fix Drupal file extensions once and for all in

The planned move from PSR0 to PSR4 in will break many/most patches in the queue.
Syncing the rename of our procedural code files with the change above minimizes disruption and brings similar DX gains for D8.

Drop by and provide your feedback on this issue.

Categories: Elsewhere

Ryan Oles: Using Drupal Content Channel Tokens with Nodejs

Mon, 28/04/2014 - 19:29

You've decided to add a bit of Node.js functionality to your Drupal site using the Drupal Nodejs Module. After some reading you realize you need to be setting up user message channels using hook_nodejs_user_channels or possibly nodejs_add_user_to_channel. This allows your application to send socket messages to groups of users. Great! But you're writing no simple application. You want to go one step further and manage these channels based not only on some property of the user object, but all users who may be currently viewing a particular Drupal served page. Suddenly the typical user channels no longer cut it. Try as you might, you'll find yourself quickly descending into channel management hell. There has got to be a better way. Thankfully content token channels are here to help.

Send a Message Based on the Content Being Viewed

You can think of content token channels as a per page (or piece of content) channel subscription. This differs from the user channels in that it targets users who are currently viewing a specific page rather than users targeted by some property of the user object (regardless of where they are on the site).

There are two main PHP functions for implementing a token channel: nodejs_send_content_channel_token, to generate a token channel for a given piece of content, and nodejs_send_content_channel_message, to send a message to users subscribed to a given content channel.

To illustrate this, let's assume we want to create a channel to send messages to users when they are viewing a Drupal node of type page. To begin, we will need to set up our content channel for that node type. A good place to do this is hook_node_view, which fires when a user views a node.

function example_node_view($node, $view_mode, $langcode) {
  if ($node->type == "page") {
    // Tell the Node.js server about the channel and subscribe the
    // current user to it.
    nodejs_send_content_channel_token('page_node_channel');
  }
}

Now every time a node of type page is viewed the Node.js server is notified of the page_node_channel content channel. Node.js then adds the channel to the user for whom the page was rendered.

From here if we wish to send a message to all of the users viewing a node of type page we make a simple call to nodejs_send_content_channel_message. When and where you choose to fire this message is up to you. Possibly you want to notify users if the page they are viewing has been updated (hook_node_update), a user logs in (hook_user_login), or some new content was added (hook_node_insert).

// Build a message object
$message = new stdClass();
$message->channel = 'page_node_channel';
$message->data['body'] = 'Hello World!';

// Send the message to the channel we created
nodejs_send_content_channel_message($message);

And that's about it! When nodejs_send_content_channel_message fires, our message will be broadcast to all users subscribed to the page_node_channel: in this case, all users who are currently viewing a node of type page. This message behaves like any other on the Node.js side, so you could send it to a custom callback handler, as mentioned in an earlier post.
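On the client side, such a custom callback handler might look like the following sketch. (A hedged sketch: the Drupal.Nodejs.callbacks registry and the callback property on messages reflect my reading of the module's client JavaScript, and the pageNodeUpdated name is made up, so verify against your module version.)

```javascript
// Hypothetical client-side callback handler for the Node.js module.
// The module's client script looks up message.callback in
// Drupal.Nodejs.callbacks and invokes that entry's callback function.
var Drupal = Drupal || { Nodejs: { callbacks: {} } };

Drupal.Nodejs.callbacks.pageNodeUpdated = {
  callback: function (message) {
    // message.data.body carries the body we set on the Drupal side.
    return 'Notice: ' + message.data.body;
  }
};

// Simulate delivery of a message tagged with our callback name.
var incoming = {
  channel: 'page_node_channel',
  callback: 'pageNodeUpdated',
  data: { body: 'Hello World!' }
};
var notice = Drupal.Nodejs.callbacks[incoming.callback].callback(incoming);
```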

Token Messages From a Node.js Extension

If you are writing a Node.js server extension and wish to send a message to a content channel, it is worth noting that there is no immediately exposed functionality to do this. However, Node.js server extensions are aware of what the content channels are via config.tokenChannels. This allows us to mimic the functionality of the Drupal Node.js server with a helper method in our server extension.

function sendMessageToTokenChannel(message, config) {
  if (!message.hasOwnProperty('channel')) {
    console.log('publishMessageToContentChannel: An invalid message object was provided.');
    return;
  }
  if (!config.tokenChannels.hasOwnProperty(message.channel)) {
    console.log('publishMessageToContentChannel: The channel "' + message.channel + '" doesn\'t exist.');
    return;
  }

  for (var socketId in config.tokenChannels[message.channel].sockets) {
    config.publishMessageToClient(socketId, message);
  }
}

Adding this method to your server extension will allow you to send messages to a content channel by passing the message and config objects from within your exports.setup function.
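To make the wiring concrete, here is a self-contained, hypothetical sketch of that setup entry point. In a real extension the object would be 'exports', and the config object (tokenChannels, publishMessageToClient) is only stubbed here to imitate what the Drupal Node.js server passes in; treat every name as illustrative.

```javascript
// Hypothetical wiring of a token-channel broadcast into a server
// extension's setup entry point. The stubbed config below imitates
// the shape the server provides (tokenChannels keyed by channel name,
// each with a sockets map, plus publishMessageToClient).
var delivered = [];

var config = {
  tokenChannels: {
    page_node_channel: { sockets: { sock1: true, sock2: true } }
  },
  publishMessageToClient: function (socketId, message) {
    delivered.push(socketId + ':' + message.data.body);
  }
};

var extension = {};
extension.setup = function (config) {
  // Broadcast one message to every socket subscribed to the channel,
  // the same loop the helper above performs.
  var message = { channel: 'page_node_channel', data: { body: 'Hello World!' } };
  for (var socketId in config.tokenChannels[message.channel].sockets) {
    config.publishMessageToClient(socketId, message);
  }
};

extension.setup(config);
```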

Using content token channels has greatly simplified several of the Drupal/Node.js projects I have done. However, it wasn't immediately apparent that this functionality existed. So I hope this helps to further clarify the capabilities of this module.

Categories: Elsewhere

Design Guru: Hosting options for Drupal: Acquia vs Pantheon vs Rackspace Cloud

Mon, 28/04/2014 - 19:14

Considering the best hosting environment for Drupal projects has always involved the following questions:

  • What is your budget for hosting?
  • Who will be responsible for maintaining the hosting environment and what is their technical familiarity with LAMP (Linux Apache MySQL and PHP)?
  • How much physical space does the site actually need?
  • What is your expected site traffic and how can you expect that to grow over time?

Of course, there are many more things to weigh in when a project entails higher-than-average server loads due to multimedia encoding and streaming, complex database calls for large dynamic lists, custom data visualizations etc... But these typically frame the task of choosing the best hosting vendor and then package for the project, based ultimately on cost, server space and bandwidth.

In the past few years, cloud hosting infrastructure has emerged to make answering these a little easier: storage space and traffic have become scalable costs that grow with the needs of the project, and there is a great range of hosting packages which cater to almost any type of customer. Cloud hosting can be used by people who want to hand off the role of server manager to the vendor, or by hard-core tweakers who want to manage their own custom environment.

Currently, it seems to us that there are three main vendors for scalable cloud based hosting solutions which best suit Drupal.

Acquia Cloud makes site versioning quite easy: development, staging and production environments can be managed individually, and files or databases can be moved between them by dragging and dropping in their web-based admin panel.  Their site backup tool is very useful for rolling back a Drupal site to an earlier captured version, something that otherwise needs to be configured per project on other hosting environments using tools like the Backup & Migrate module.  The site versioning functionality is available through a web-based GUI tool, the drush command line and an API; the last two make it an excellent choice for Drupal-based app projects which require frequent versioning to facilitate large functional development and testing.

Pantheon is a cloud hosting infrastructure which has been developed specifically for both Drupal and WordPress sites.  It's very interesting for its focus on scalability and automation: their platform allows automated updates for Drupal modules and more, in an attempt to let its customers focus less on managing their hosting environment and more on content.  Their versioning system seems to be comparable in flexibility to Acquia's, but loads content a little faster thanks to the load balancing and caching systems specific to their platform.  This all seems to sit on Amazon's S3 cloud, so reliability is generally pretty high as well.  On the whole, Pantheon looks like a great solution for teams who want to spin up versions of their pre-existing Drupal projects as independent sites which all share the same hosting setup.  It is particularly useful for agencies who bear the responsibility of hosting client projects, or even single projects entailing site versioning for sign-off from multiple stakeholders. Pantheon seems to fit into Acquia Cloud's price range but offer a more feature-rich user experience for non-techies and great service (especially if you want them to help make your sites faster and scale better).

Rackspace Cloud is the only solution in this comparison which takes a cloud approach to conventional hosting, whilst being somewhat platform and CMS agnostic.  Their offering takes conventional hosting setups into the cloud with scalable resources and pricing, and they offer managed hosting, which gives you amazing support resources and ultimately saves developer time by doing server optimization and software installations for you.  Rackspace's servers will cost about 1/4 of Pantheon and Acquia Cloud if you don't want managed hosting, and about 1/2 if you do, which makes for a great option if your developer is able to set up your hosting scenario for you and is then on retainer to manage software upgrades to Drupal post-launch.  Using Rackspace means that you can customize your server and allocate larger storage and RAM allotments per dollar spent on hosting, though it also means that more technical know-how is required to set up dev/staging/production environments and a backup scheduling system.  Once all of that is set up (assuming you need it), we've found uptime amazing and their service available by phone or email 24 hours a day.

In summary, Rackspace is the best solution to save money on hosting whilst making sure you have the best support in the business, which is important when figuring out how to scale with increased traffic over time. However, if your budget is a little larger, both Acquia and Pantheon offer flexible infrastructures that remove the need to anticipate scaling, and can give you the peace of mind of not needing to worry about hosting infrastructure - that is, of course, until you run out of the allotted system resources, at which point for any of these solutions it seems you'll need to purchase an upgraded hosting package.

Categories: Elsewhere

Stanford Web Services Blog: Entity Construction Kit: What It Is and When to Use It

Mon, 28/04/2014 - 18:00

You may have read or heard of entities in Drupal but are not entirely sure what they are or how to use them. I hope to clear up some of the terminology and provide a use case for creating custom entities over nodes and content types.

Project Page:

Current Version: 2.x

Categories: Elsewhere

Evolving Web: Join us for Hands-on Training at DrupalCon Austin

Mon, 28/04/2014 - 14:43

DrupalCon Austin promises to be a great event! The North American DrupalCon is always the biggest Drupal event of the year, which means a lot of content to absorb, people to meet, and ideas to share.

Categories: Elsewhere

Drupalfund helped get Tess to Drupalcon 2014 in Austin!

Mon, 28/04/2014 - 09:33
Tess Flynn is a valuable member of the community that supports Drupal 8. She “is pouring countless hours in porting flag.module to Drupal 8 and in doing so test driving the new D8 APIs,” wrote S. Corlosquet on Tess's feedback wall. Flag module 8.x will have the benefit of uncovering bugs in Drupal 8 early so they can be fixed before D8 is released.

Why the effort to get there?

Tess wants to provide a working Drupal 8 version of Flag module as close as possible to the Drupal 8 release date. The radical changes in Drupal 8 have required an extensive rewrite of Flag module code. “While I've made significant progress on IRC, I need the high-bandwidth communication with core developers that only in-person conversations provide,” says Tess Flynn. Therefore she needed help to attend DrupalCon, held in Austin on June 2nd, 2014, so that she could get Flag to the finish line. Besides, her goal is to use this as an opportunity to educate others on Drupal 8, to raise awareness and excitement, and to catch any issues in core that only contributing developers will experience. She has already caught several core bugs.

Tess has really good reasons to get there. Also, “the community would definitely benefit from being able to help her get to Austin,” mentioned Damien McKenna on Tess's feedback wall.

We have perfect news to tell you! Tess Flynn will provide a D8 version of Flag module, and she will educate others on Drupal 8 while catching any issues in core, because she GOES to DrupalCon Austin 2014!

The project “Send Flag for Drupal 8 Developer to Drupalcon!” succeeded. Collectively, 21 funders donated 104% of the funding goal. That equated to 1,040 dollars, enough to send her to DrupalCon Austin 2014 to finish her mission. Thanks everybody!
Categories: Elsewhere

ThinkShout: "Big Data" challenges with Salesforce for Facing History and Ourselves

Mon, 28/04/2014 - 09:00
The Introduction

Facing History and Ourselves is an international educational and professional development organization whose mission is to engage students of diverse backgrounds in an examination of racism, prejudice, and antisemitism in order to promote the development of a more humane and informed citizenry. As part of their recent site launch, ThinkShout was tasked with syncing their new Salesforce instance, built by our partner on the project, Kell Partners, with Drupal. This is a use case tailor-made for the Salesforce Suite, and one that we have lots of experience with, including recent projects for the Los Angeles Conservancy and the Forum of Regional Association of Grant Makers. However, there was one small difference. Actually, a big one: for Facing History we had to sync 300,000+ records in near real time, as opposed to tens of thousands. How this was accomplished was an exercise in troubleshooting, scripting and patience.

The Drupal Salesforce Suite allows any Drupal entity to be synchronized with any Salesforce object, with field level granularity and directionality. Data can be pushed from Drupal in real time or can be batched. Data from Salesforce is pulled into a queue at regular intervals and then a configurable amount of queued records are processed during those intervals. During processing, contacts and orgs in RedHen CRM are created or updated, keeping the user experience of managing contact data within the Drupal site. In future phases, we will add engagement scoring to the mix by scoring user engagements on the website and pushing that data back to Salesforce.

The Challenge

Getting 300,000+ records into the queue was a relatively quick operation taking less than 4 hours. Processing those records was much more time consuming as only a few hundred records were processed during a single cron run. Since the site is hosted on Pantheon, the standard cron run is hourly, which would mean the processing would take weeks. Even manually triggering the process would take days. We needed a better solution.

The Solution

One way to improve this process was to allow more records to be processed during each cron run. The default worker timeout was set to 180 seconds (3 minutes), meaning that every hour, records from the queue were processed for 3 minutes and then nothing happened until the next cron run. So that timeout was raised to 3600 seconds (1 hour) using hook_cron_queue_info_alter(). We also wanted to limit other processes from running during this time: simply firing off cron processes all cron tasks from all modules, whereas with drush queue-run we could process just the queue worker we identified. But that would still require someone manually running the command every hour. That command also allows queue processing in parallel, which theoretically would process the records even faster.

We created a bash script which would process the queue every hour running multiple parallel threads:

#!/bin/sh
NUM_RUNS=$1
NUM_PROCESSES=$2
r=0
START=`date +%s`
while [ $r -lt $NUM_RUNS ]; do
  p=0
  run=$(($r+1))
  echo "Run: $run";
  while [ $p -lt $NUM_PROCESSES ]; do
    proc=$(($p+1))
    echo "Process: $proc";
    # Create file with header and time stamp.
    printf "Run: $run Process: $proc Log\n" > sync.R$run.P$proc.log
    drush queue-run salesforce_pull --strict=0 >> sync.R$run.P$proc.log &
    p=$proc
  done
  r=$run
  # Should match worker timeout.
  sleep 3600
done

During our testing, however, we quickly realized that running parallel Drupal processes caused MySQL deadlocks. It appeared that this was caused by a lack of database transactions being created when doing field-level operations. We spent some time researching ways to prevent this, but in the end decided that it would be better to improve the way records were imported by the Salesforce module in the first place.

While troubleshooting an unrelated issue, we found that when pulling mapped Relations from Salesforce the entity ID was needed, but since the entity was not saved at the time those mappings were processed, the ID was not available yet. This was temporarily resolved, to prevent errors, by saving the entity before the mapping took place; the mappings were then completed and the entity was saved again. This meant that whether a Relation was used or not, the entity was saved twice. To prevent this double save from decreasing performance, a check was added to see if the pulled entity was mapped with a Relation. If so, the entity was saved first to provide the entity ID. If not, the entity was only saved after the field mappings were completed.

<?php
function salesforce_mapping_related_entity_fieldmap_pull_value($entity_wrapper, ...
  // Handle relations.
  elseif (module_exists('relation') && isset($info['relation_type'])) {
    // We cannot create relationships between new items. We are saving them here
    // to avoid performing a duplicate save for all entities in
    // salesforce_pull_process_records().
    if (!$info['parent']->getIdentifier()) {
      $info['parent']->save();
    }

Another performance improvement came from changing the way field mappings were handled if an error was thrown. Previously, if an error was thrown while updating a mapping, the mapping object (the entity that links Drupal entities to Salesforce objects) was not created or, if it existed, was removed. Now, if a valid entity ID is present, the mapping is still saved. This causes fewer errors and allows for better data syncing.

The function salesforce_pull_process_records() in salesforce_pull.module was updated from:

<?php if ($mapping_object && ($sf_mapping->sync_triggers & SALESFORCE_MAPPING_SYNC_SF_UPDATE))

to:

<?php
$mapping_object = salesforce_mapping_object_load_by_sfid($sf_object['Id']);
$exists = $mapping_object ? TRUE : FALSE;
if ($exists && ($sf_mapping->sync_triggers & SALESFORCE_MAPPING_SYNC_SF_UPDATE)) {

The code checks for existence of an entity referenced by a mapping to ensure it exists, and behaves intelligently if it doesn't. Previously this would cause an unrecoverable sync state for objects.

After we had completed a test run of the import in Pantheon's test environment, we were ready to import data into the production instance of the new site. We decided to set cron to "never run" to again limit the number of processes running at the time of the import. We also did not want to recreate the parallelism issues we had discovered during our tests with the scripted solution. After our first production test run of a few thousand records over 3 hours, we noticed that we were still getting deadlocks. Upon investigation it was found that Pantheon runs cron against their production instances using drush, which does not respect the "never run" configuration. Pantheon had documentation about this, which led us to Elysia Cron. This module does prevent cron from running when its "Globally Disable" flag is set: the module gives itself the highest system weight so that its hook_cron is the first to run, and if that flag is set, Elysia Cron stops the process.


At the end of the day 300,000+ records were successfully imported into Drupal from Salesforce. Many lessons were learned. And significant improvements were made to the Salesforce Suite. Facing History and Ourselves provided us with an opportunity to go further than we ever had before in understanding and improving upon this process.

Categories: Elsewhere

Midwestern Mac, LLC: Drupal 8 - A Brief Introduction (DrupalCamp STL.14 Presentation)

Sun, 27/04/2014 - 19:21

I presented Drupal 8 - A Brief Introduction at DrupalCamp STL.14 on April 26, 2014.

Drupal 8 brings a lot of changes. Many standby contributed modules are now included with Drupal Core, and many small changes add up to the most exciting Drupal release yet! This presentation guides you through many of the biggest changes, highlighting how Drupal 8 will accelerate your web development and provide tools to make Drupal the best content management platform on any device.

View the slideshow below, or follow the links at the bottom of the post to view the full presentation and video.

Links for full slideshow/video:

Categories: Elsewhere

Damien McKenna: Contrib plans for end of April & May 2014

Sun, 27/04/2014 - 18:56

I thought it'd be worthwhile to keep people abreast of my current contrib plans, given I've got the keys to several important modules.

Categories: Elsewhere

Bert Boerland: Zen and the Art of Drupal, The DrupalJam 2014.

Sun, 27/04/2014 - 17:55

As a member of the Stichting Drupal Nederland, I have been (co-)organising our DrupalCamps, called "DrupalJam", for some time now. Last year we hired a soccer stadium for the DrupalJam, and this year we are at a very relaxing location on the water: InnStyle

This year we are working hard to make sure we will have an even better conference than last year. So far, we are on track on making sure we will. :-)

The first keynote speaker is Ancilla Tilia, a former fetish model now known in the Netherlands as an advocate for digital rights. She has been active in Bits of Freedom, the Dutch equivalent of the EFF, and the "pirate party" in Amsterdam. The second keynote is from Jan Willem Tulp, who creates astonishing data visualisations. Take for example the work described on this page, and burn some CPU cycles while flying in WebGL over Amsterdam.

The featured speakers include Jeroen Tjepkema on web performance, Vincent van Scherpenseel on UX, Iacobien Riezenbosh (State of the web) and Sander Spierenbug (Ethical hacker at KPN, the largest telco in the Netherlands). Apart from these there is a full program on the site and it includes a Question and Answer session via the internet with Dries Buytaert.

Always wanted to ask a question but were afraid to? Now is your chance, even if you are not coming to the DrupalJam or are not from the Netherlands. Do send in your hair-raising questions for Dries! If you are stuck in Drupal, we also have a guru bar where the best minds of the Dutch Drupal community will help you out on the spot.

We would like to thank the sponsors, and if you are around in the Netherlands, be sure to buy your ticket for 29 euros (30 if you become a member of the Stichting Drupal Nederland). It includes coffee, tea, water, a quality lunch, 20-plus sessions, a free e-book from O'Reilly, the chance to win 1 of 5 free PhpStorm licenses, a first free drink in the bar and inner peace. If you do come, be sure to pick up a badge.

If you can't make it, you can follow the event at eventifier or @drupaljam

One last thing: our friend Metin Seven made an artwork with Druplicon that we printed on a 1-meter-high canvas. Attendees can bid on this artwork (new office? must have! :-) ) during the conference.

Organising these events is always a lot of work but with a great team it is neither "a lot" nor "work". Thanks team! Peace.

Categories: Elsewhere

Forum One: Conditional Fields and Display Suite

Sun, 27/04/2014 - 16:19

As part of our normal Drupal process we use both Display Suite (DS) and Conditional Fields frequently. While building out a site I ran into a very strange issue, when I was trying to output a field through DS it just wasn't showing up. I tried playing around with the formatters, label, etc. ... and nothing changed. So I fired up xdebug and took a look at the render array coming through and, yep, the field was in there. But '#access' was set to false. I went through all the usual suspects, user permissions, field permissions ... nothing. When the entity was loaded it was fine, but just prior to rendering it was set to false. So when stepping through all the code I ran into the culprit ... conditional_fields_entity_view_alter

When Display Suite builds the entity for rendering it only has the fields that are part of the display mode. So if you exclude the Conditional Field it won't show up in the entity. And in this little snippet of code:

// Manage orphan fields (dependents with no dependees).
$evaluate = in_array(CONDITIONAL_FIELDS_FIELD_VIEW_EVALUATE, $behaviors);
$hide_orphan = in_array(CONDITIONAL_FIELDS_FIELD_VIEW_HIDE_ORPHAN, $behaviors);
$hide_untriggered_orphan = in_array(CONDITIONAL_FIELDS_FIELD_VIEW_HIDE_UNTRIGGERED_ORPHAN, $behaviors);
$is_orphan = empty($build[$dependee]['#access']);
if ($is_orphan) {
  // Hide the dependent. No need to evaluate the dependency.
  if ($hide_orphan) {
    $build[$dependent]['#access'] = FALSE;
    continue;
  }

it directly checks whether the conditional field (the dependee) is present in the entity's render array, and if it is not, it sets the dependent's '#access' attribute to FALSE. So we just made a change to the information architecture to have that field display on the page, and everything worked. Once you know about it, it's a fairly simple thing to understand, but if you haven't run into it before it can take some significant investigation to figure out.
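To see the mechanism in isolation, here is a minimal plain-PHP sketch (no Drupal required) that mimics the orphan check from the snippet above. The field names are hypothetical; the point is that a dependee field stripped from the view mode by Display Suite is indistinguishable from one whose access was denied:

```php
<?php
// A render array as Display Suite would hand it to
// conditional_fields_entity_view_alter(): the dependee field was
// excluded from the view mode, so it is simply absent.
$build = array(
  'field_dependent' => array('#access' => TRUE),
  // 'field_dependee' is intentionally missing.
);
$dependee = 'field_dependee';
$dependent = 'field_dependent';

// The orphan check: an absent dependee looks exactly like an
// access-denied one, so the dependent gets hidden too.
$is_orphan = empty($build[$dependee]['#access']);
if ($is_orphan) {
  $build[$dependent]['#access'] = FALSE;
}

echo $build[$dependent]['#access'] ? 'visible' : 'hidden'; // prints "hidden"
```

Adding the dependee field back to the display mode (the information architecture change mentioned above) makes `$is_orphan` evaluate to FALSE and the dependent renders again.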

Display Suite and Conditional Fields are hugely useful modules, but there are times when they don't play nicely together. This is one example of where they don't, and how to fix it.

Categories: Elsewhere

Mike Stiv - Drupal developer and consultant: Using the Feeds api

Sun, 27/04/2014 - 14:04

Feeds is a very popular module. From the project page, we get a nice description of it:

Import or aggregate data as nodes, users, taxonomy terms or simple database records.

The basic idea is that you throw a CSV file at it and it creates Drupal content. As simple as that. The input format can be more than just CSV; check the project page for more details.
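For illustration, the examples in this post assume a source CSV along these lines (the column names and values are hypothetical):

```
name,surname,region
John,Smith,n. america
Maria,Jones,e. europe
```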

We can use the Feeds API if we want more functionality than the standard behaviour provides.
I am going to describe three different uses of the Feeds API:

  1. Perform an operation after a feed source has been parsed, before it will be processed
  2. Perform pre-save operations
  3. Add additional target options to the mapping form
1. Perform an operation after a feed source has been parsed, before it will be processed

For this we use hook_feeds_after_parse().
A common use case is to alter the data from the CSV. For example, let's say that the terms we want to import are described differently in the CSV than in Drupal:
in the CSV we may have values like "n. america", "s. america", "w. europe" and "e. europe", while the terms in Drupal are "north america", "south america", "western europe" and "eastern europe". We need to map the CSV values to their Drupal equivalents before the import:
/**
 * Implements hook_feeds_after_parse().
 */
function mymodule_feeds_after_parse(FeedsSource $source, FeedsParserResult $result) {
  $map = array(
    'n. america' => 'north america',
    's. america' => 'south america',
    'e. europe' => 'eastern europe',
    'w. europe' => 'western europe',
  );
  foreach ($result->items as $key => $item) {
    $result->items[$key]['region'] = $map[$result->items[$key]['region']];
  }
}

2. Perform pre-save operations

This hook lets us act on the entity that is about to be created, similar to hook_entity_presave().
Here we import only users whose surname is 'Smith':
/**
 * Implements hook_feeds_presave().
 */
function mymodule_feeds_presave(FeedsSource $source, $entity, $item) {
  // Check that this is fired only for the intended importer
  // ('my_user_importer' stands in for your importer's machine name).
  if ($source->id == 'my_user_importer') {
    // Check that we like this name.
    if ($item['surname'] != 'Smith') {
      $entity->feeds_item->skip = TRUE;
      drupal_set_message(t('Only Smiths allowed. Skipping...'), 'warning');
    }
  }
}

3. Add additional target options to the mapping form

This is the most advanced case described here. The hook hook_feeds_processor_targets_alter() allows us to do more complex stuff. For example, let's assume that our site uses several newsletter lists, and we want to subscribe each imported user to the proper list. The newsletter list is not a field on the user form, so we don't get an option for it in the Feeds UI. This hook allows us to add a target to the Feeds UI mapping form and define a callback function for it.

/**
 * Implements hook_feeds_processor_targets_alter().
 *
 * @param $targets
 * @param $entity_type
 * @param $bundle_name
 */
function mymodule_feeds_processor_targets_alter(&$targets, $entity_type, $bundle_name) {
  $targets['newsletter_list'] = array(
    'name' => t('Newsletter list'),
    'description' => t('This field sets the user to the proper newsletter list.'),
    'callback' => 'mymodule_newsletter_list',
  );
}

The code above will show a new option, "Newsletter list", on the field mapping page of the Feeds UI module for our importer. Now we need to define the callback for this option.

/**
 * Callback for mymodule_feeds_processor_targets_alter().
 *
 * Subscribes the user to the proper newsletter list.
 *
 * @param $source
 * @param $entity
 * @param $target
 * @param $value
 * @param $mapping
 */
function mymodule_newsletter_list($source, $entity, $target, $value, $mapping) {
  // $value contains the subscription list from the csv.
  subscribe_user($entity->uid, $value);
}

Did you like this post? Drop me a line in the comments below.

Tags: feeds, Drupal Planet
Categories: Elsewhere

Klaus Harris: Block Google from Drupal 7 node types on the cheap

Sun, 27/04/2014 - 08:55

In these post-Google-Panda/Penguin days it is important to get your website's crawl profile right and make the best use of your crawl budget. It probably doesn't matter on small sites, certainly not this one, but on large ones with millions of pages it does. If Google is crawling useless pages, it could be missing important ones, and at the same time this will weaken your site's overall ranking and visibility in Google.

On this site, I have a 'link' node type like this one. How do I stop Google crawling those types of pages without installing a module? It's easy.

1. I use Pathauto already and just changed the path alias pattern for my link node type to add a directory name, like this:
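The original pattern isn't shown here; a Pathauto pattern of the following shape (the exact token choice is an assumption) would put every link node under a /link/ directory:

```
link/[node:title]
```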

2. Through the interface, I deleted my aliases and then regenerated them. Use caution here: I don't know how well Drupal handles very large numbers of aliases, so doing it directly in the DB might be safer.

3. Add an entry in your robots.txt to block that directory:

Disallow: /link/

This will now remove those nodes from Google and stop them being crawled.

My words of warning, then: update URL aliases with great caution, especially on commercial or heavily indexed sites, unless you know exactly what you are doing. If you're setting up a new site, this is a harmless strategy. If you are changing URLs and care about search engines, have a redirection strategy in place.
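As one possible shape for such a strategy, a 301 redirect from each old alias to its new /link/-prefixed home can be added at the web server level; this is a hypothetical Apache example (mod_alias, usable in Drupal's .htaccess), with made-up alias names:

```
# Hypothetical: the old alias /my-bookmark now lives at /link/my-bookmark.
Redirect 301 /my-bookmark /link/my-bookmark
```

For many aliases, the Redirect module (which records old aliases automatically when they change) is the more maintainable option.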

Categories: Elsewhere