Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

InternetDevels: 6 Reasons Why You Should Use Drupal For Website Development

Tue, 07/07/2015 - 14:50

Our guest blogger Jack Dawson, founder of Big Drop Inc, explains why he thinks Drupal is the best choice for website development services.

Categories: Elsewhere

ERPAL: Drop Guard vs. Drupalgeddon - The 1:0 knockout in just a few minutes

Tue, 07/07/2015 - 11:00

Get ready to rumble! Watch how Drop Guard won against Drupalgeddon on 15.10.2014 at 6:00 PM in this live webinar! We're going to run a live demo re-enacting the whole epic match, and you'll learn about the techniques and strategies that Drop Guard uses to sucker-punch any future Drupal security threats. Don't expect a second round: it'll be a technical KO within minutes!

This free, 45-minute webinar takes place via Google Hangout on 27.07.2015 at 4 PM GMT+2. You'll learn the following:

  1. How to set up an automated workflow to keep your Drupal site updated and secure
  2. Three simple prerequisites for starting with Drop Guard
  3. How Drop Guard integrates with your individual development and deployment workflows

We'll use a Drupalgeddon-infected Drupal installation that includes several other modules with security issues. This exclusive live demo shows the current status of Drop Guard, and all attendees get free Drop Guard access until the end of September 2015.

Sign up for free!

Categories: Elsewhere

Mpumelelo Msimanga: Drupal: Using Views Database Connector (VDC) To Display Data in External Database

Mon, 06/07/2015 - 22:17

Apart from Drupal, most organisations have many other systems, such as CRM, HR and e-commerce systems. The field of Business Intelligence (BI) deals with bringing such disparate data sources into a single data warehouse (DW). If you want to display data from your DW using Drupal, the Views module is a good place to start. Before you can use Views, though, you need to connect to the database and describe its tables to Views. I am a SQL developer, so I am reluctant to write my own module, and that is where the Views Database Connector (VDC) module comes in.
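In Drupal 7, that connection is typically declared in settings.php. A minimal sketch, assuming a MySQL data warehouse (the 'warehouse' key and all credentials here are placeholders); VDC can then list the tables it finds on that extra connection:

// settings.php: declare a second database connection for the warehouse.
$databases['warehouse']['default'] = array(
  'driver'   => 'mysql',
  'database' => 'dw',
  'username' => 'dw_user',
  'password' => 'dw_pass',
  'host'     => 'dw.example.com',
  'port'     => 3306,
  'prefix'   => '',
);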

Categories: Elsewhere

Klaus Harris: Some ways you could start using Drupal in your organisation

Mon, 06/07/2015 - 21:56

Looking at my blog posts, you would be forgiven for thinking that I work with Drupal all day but actually I don't. I've used it for odd private and business projects over the years, built a startup (1) with it and run my sailing club's website on it.

Actually, my working day is spent on a mix of legacy and new Symfony 2 applications. In years gone by, that included applications written in frameworks such as Zend Framework, SugarCRM and plenty of home-grown ones from the early PHP days.

Sure, there are things I don’t like about Drupal 7, such as configuration management, deployment, the lack of modern OOP and practices, and the fairly blunt caching mechanism (2), but I think it is precisely because I have seen so much legacy code and so much internally written software that I still like Drupal.

Starting from the assumption that you aren’t going to touch your main application which could be running on another framework or technology, I’m going to write about using Drupal in your organization in ways that you might not have considered.

We’re just not going to switch to Drupal!

I think the reality regarding PHP applications is that most companies use their own legacy frameworks and/or something like Zend Framework or Symfony, or they are on a totally different stack, and they certainly aren’t about to switch to something else any time soon. Which is fair enough.

However there are ways to get value from Drupal with low effort and risk alongside - and in some cases replacing - existing applications and other technologies.

1. Drupal for building intranet tools

In my current company we have a myriad of intranet tools built on different technologies, mostly home-grown legacy PHP apps, but also ones built on other frameworks and some in .NET. Nothing is standard, every one is special and each demands developer ramp-up time. One is even stuck on a dead framework on an early PHP version.

Each generation of developers seems to have wanted to try out a new framework, a different stack, or just to write something new. I think that is fairly common in old internet companies.

Drupal can be pretty good for building intranet tools. It has some nice APIs, you certainly aren’t limited to the Drupal data model, you get a lot for free (no custom coding needed), plus there are many good modules out there that expose APIs as well. It is a bit of a revelation to define your own DB table structures and then manipulate or expose them using standard Drupal modules like Views.
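As a sketch of what exposing a custom table to Views looks like in Drupal 7 (the module and table names here are made up for illustration):

/**
 * Implements hook_views_data().
 */
function mymodule_views_data() {
  // Declare a hypothetical custom table as a Views base table.
  $data['mymodule_requests']['table']['group'] = t('Intranet requests');
  $data['mymodule_requests']['table']['base'] = array(
    'field' => 'id',
    'title' => t('Intranet requests'),
  );
  // Expose one column with standard field, sort and filter handlers.
  $data['mymodule_requests']['title'] = array(
    'title' => t('Title'),
    'help' => t('The request title.'),
    'field' => array('handler' => 'views_handler_field'),
    'sort' => array('handler' => 'views_handler_sort'),
    'filter' => array('handler' => 'views_handler_filter_string'),
  );
  return $data;
}

With that in place, non-developers can point, click and build listings of the custom data like any other View.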

Also, if you have good developers to set things up correctly initially, management of the application can then be moved to non-technical staff. For example, it won’t then take a developer to alter a permission, set up a new data view, trigger an alert, turn off a piece of functionality, set up a new workflow, add a new language and so on.

This is one of the things I really like about Drupal: done properly, it pushes more control to non-technical staff. Developers are expensive. And using Drupal only on an intranet simplifies deployment, maintenance and any custom coding or theming considerably.

I think this “myriad of unique applications” scenario should just end. It’s too expensive, complicated and risky; companies should choose standard solutions and stick with them. A number of open source PHP projects are now very established, not going away and 100% enterprise-ready. In the PHP world, Symfony 2 with its connected ecosystem is one; Drupal is another.

Take a look at intranet distributions like Open Atrium, and also at other distributions on the Acquia downloads page.

2. Drupal as a data source for other applications

In 2010, at one company, we had a large Zend Framework application with a Java-powered auction system in the back end, and we wanted to add multilingual news and help sections. I chose Drupal as the CMS because it had great multilingual functionality, rock-solid content management and the possibility to build nice workflows for editors and managers.

There was, however, just no way that we were going to build a new Drupal application for news and FAQs, theme it, deploy it and manage it in production alongside the existing front end.

The solution was simple: use Drupal as an intranet CMS only and pull the data where needed. I set up a multilingual content management application in Drupal to enable staff to add news and help pages, and we pulled that data into the front-facing Zend Framework application through the caching layer via some simple database views.
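As a rough illustration of the pull itself (this is a sketch, not the original code; the table and field names assume Drupal 7's schema, whereas the 2010 site would have used the Drupal 6 equivalents):

// Inside the front-facing, non-Drupal application.
$pdo = new PDO('mysql:host=localhost;dbname=drupal_cms', 'reader', 'secret');
$stmt = $pdo->prepare(
  'SELECT n.title, b.body_value
   FROM node n
   JOIN field_data_body b ON b.entity_id = n.nid
   WHERE n.type = :type AND n.status = 1
   ORDER BY n.created DESC
   LIMIT 10');
$stmt->execute(array(':type' => 'news'));
$news = $stmt->fetchAll(PDO::FETCH_ASSOC);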

Note: this was how it looked in 2010, nowadays you would most likely use the services module to surface the data.

The Drupal application used only out-of-the-box modules, needed no theming and no special deployment; it was simple, with a short development time. Everyone was happy.

I’ve seen plenty of home-built CRUD applications for all sorts of things: FAQs, news and help pages, community announcements, landing pages, email templates, translation strings, configuration management. Why write and maintain that stuff yourself? Just use a CMS like Drupal.

On translation strings specifically, it would be very easy to build a Drupal application to manage translation strings consumed by other applications. Just build the translations app in Drupal and then write a script to export them to whatever format the external application needs; see here for Symfony translations, for example.
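A sketch of such an export on Drupal 7, reading the locale tables and writing a Symfony-style YAML file (the output path is an assumption, and the quoting is deliberately crude; a real exporter should use a YAML library):

// Export interface translations for one language.
$result = db_query(
  'SELECT s.source, t.translation
   FROM {locales_source} s
   JOIN {locales_target} t ON t.lid = s.lid
   WHERE t.language = :lang',
  array(':lang' => 'de'));
$lines = array();
foreach ($result as $row) {
  $lines[] = "'" . addslashes($row->source) . "': '" . addslashes($row->translation) . "'";
}
file_put_contents('translations/messages.de.yml', implode("\n", $lines) . "\n");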

And as regards exposing Drupal data, Drupal does have an XML-RPC API and RSS, but check out the Services module. See the links below.

3. Drupal as a frontend for manipulating data on other applications

Let’s say you have a public site where users can add content. Rather than add more code to your public-facing application, you could build a management front end for that data in Drupal. Drupal has nice APIs for forms and data access, and you could even hook that data into Drupal’s APIs more fully, allowing you to use native Drupal modules to manipulate it.
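A minimal Drupal 7 sketch of the idea, assuming a second connection named 'external' in settings.php and a hypothetical user_comments table (all names are illustrative):

// Form to edit one record that lives in the external application's database.
function mymodule_comment_edit_form($form, &$form_state, $comment_id) {
  db_set_active('external');
  $record = db_query('SELECT body FROM {user_comments} WHERE id = :id',
    array(':id' => $comment_id))->fetchObject();
  db_set_active();
  $form['id'] = array('#type' => 'value', '#value' => $comment_id);
  $form['body'] = array(
    '#type' => 'textarea',
    '#title' => t('Comment body'),
    '#default_value' => $record ? $record->body : '',
  );
  $form['submit'] = array('#type' => 'submit', '#value' => t('Save'));
  return $form;
}

function mymodule_comment_edit_form_submit($form, &$form_state) {
  db_set_active('external');
  db_update('user_comments')
    ->fields(array('body' => $form_state['values']['body']))
    ->condition('id', $form_state['values']['id'])
    ->execute();
  db_set_active();
  drupal_set_message(t('Comment updated.'));
}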

4. Drupal for microsites

Drupal can be perfect for microsites such as News and FAQ sections or forums that work alongside your main application(s). Theming isn’t that difficult and if you don’t need to authenticate users, this is a very easy option.

5. Drupal for your corporate website

The corporate website is usually separate from the main application(s) and a good use case for Drupal. A corp site is usually content heavy, needs frequent updates and might need publishing workflow.

6. Drupal for rapid prototyping

Rocket Internet, the giant European incubator, gets new minimum marketable websites out in under 100 days. I think this is a very good model, and Drupal is good for rapid prototyping of ideas even if the data comes from other sources such as search services.

Caveat

If you have to make long-term decisions now, then with Drupal 8 just around the corner, whether you choose Drupal 7 or 8 needs careful research.

Further reading, listening, watching

(1) I ceased work on it in 2012.
(2) Drupal 8, which is built using Symfony 2 components, fixes these issues.
Categories: Elsewhere

Chapter Three: Goals First, Then Tactics

Mon, 06/07/2015 - 20:24

When I kick off a client project, two of my first questions are:



  • What are your project goals?

  • What are your project tactics?

Most of the time, my clients define their goals as tactics. For example, “I want a beautiful site with a great user experience.” While this is a useful thing to identify, it’s not a goal. It’s a tactic.



What purpose does a beautiful site serve? Does beauty drive revenue? Why improve the user experience? Does UX cultivate positive emotions towards your business?



Asking the “why” behind the tactics can help reveal the true goals of the redesign. It’s easy to define the purpose of a project as “I need my site to look better” because it’s the most obvious thing to improve. However, there is a missed opportunity in looking only skin deep.

Categories: Elsewhere

Dcycle: Catching watchdog errors in your Simpletests

Mon, 06/07/2015 - 20:04

If you are using a site deployment module, and running simpletests against it in your continuous integration server using drush test-run, you might come across Simpletest output like this in your Jenkins console output:

Starting test MyModuleTestCase.  [ok]
...
WD rules: Unable to get variable some_variable, it is not defined.  [error]
...
MyModuleTestCase 9 passes, 0 fails, 0 exceptions, and 7 debug messages  [ok]
No leftover tables to remove.  [status]
No temporary directories to remove.  [status]
Removed 1 test result.  [status]
Group Class Name

In the above example, the Rules module is complaining that it is misconfigured. You will probably be able to confirm this by installing a local version of your site along with rules_ui and visiting the rules admin page.

Here, it is Rules that is logging a watchdog error, but it could be any module.

However, this will not necessarily cause your test to fail (see 0 fails), and more importantly, your continuous integration script will not fail either.

At first you might find it strange that your console output shows [error] but that your script still passes. Your script probably looks something like this:

set -e
drush test-run MyModuleTestCase

So: drush test-run outputs an [error] message, but is still exiting with the normal exit code of 0. How can that be?

Well, your test is doing exactly what you are asking of it: it is asserting that certain conditions are met, but you have never explicitly asked it to fail when a watchdog error is logged within the temporary testing environment. This is normal: consider a case where you want to assert that a given piece of code logs an error. In your test, you will create the necessary conditions for the error to be logged, and then you will assert that the error has in fact been logged. In this case your test will fail if the error has not been logged, but will succeed if the error has been logged. This is why the test script should not fail every time there is an error.

But in our above example, we have no way of knowing when such an error is introduced; to ensure more robust testing, let's add a teardown function to our test which asserts that no errors were logged during any of our tests. To make sure that the tests don't fail when errors are expected, we will allow for that as well.

Add the following code to your Simpletest (if you have several tests, consider creating a base test for all of them to avoid reusing code):

/**
 * {@inheritdoc}
 */
function tearDown() {
  // See http://dcycleproject.org/blog/96/catching-watchdog-errors-your-simpletests
  $num_errors = $this->getNumWatchdogEntries(WATCHDOG_ERROR);
  $expected_errors = isset($this->expected_errors) ? $this->expected_errors : 0;
  $this->assertTrue($num_errors == $expected_errors, 'Expected ' . $expected_errors . ' watchdog errors and got ' . $num_errors . '.');
  parent::tearDown();
}

/**
 * Get the number of watchdog entries for a given severity or worse.
 *
 * See http://dcycleproject.org/blog/96/catching-watchdog-errors-your-simpletests
 *
 * @param $severity
 *   Severity threshold; defaults to WATCHDOG_ERROR. Severity codes are listed
 *   at https://api.drupal.org/api/drupal/includes%21bootstrap.inc/group/logging_severity_levels/7
 *   Lower numbers are worse severity: an emergency is 0 and an error is 3.
 *   For the default WATCHDOG_ERROR, this function returns the number of
 *   watchdog entries which are 0, 1, 2, or 3.
 *
 * @return
 *   The number of watchdog errors logged during this test.
 */
function getNumWatchdogEntries($severity = WATCHDOG_ERROR) {
  $results = db_select('watchdog')
    ->fields(NULL, array('wid'))
    ->condition('severity', $severity, '<=')
    ->execute()
    ->fetchAll();
  return count($results);
}

Now, all your tests which include this code will fail if any watchdog errors are logged during them. If you are actually expecting errors, then at some point in your test you could use this code:

$this->expected_errors = 1; // for example
Categories: Elsewhere

A Small Web Firm: Connecting Tableau to Drupal on Pantheon

Mon, 06/07/2015 - 19:29
Data-driven content management

Although creating, searching for, updating, and publishing content in Drupal is a snap, understanding and making decisions based on that content can be challenging. Questions like, "What are the most viewed, untranslated case studies?" or "Does an accelerated blogging cadence increase page views?" are difficult or impossible to answer within Drupal alone. Though the data that could help answer those questions may live in Drupal, content editors or administrators are unlikely to find answers on their own because the information is made available, if at all, through complex UIs only understood by Drupal site builders, developers, or power users. This problem--where those most knowledgeable about some dataset are only able to ask questions of that data by proxy through a specialist--extends far beyond just content management in Drupal. Tableau (my employer) takes pride in helping solve this type of problem for organizations the world over, and because Drupal runs on pervasive database technologies like MySQL and PostgreSQL, we also happen to work well with Drupal: just add database credentials, connect, and go.

Container cloud complications

If you run Drupal on Pantheon, you may be familiar with (and likely benefit from) their container-based architecture. The efficiency and agility that containers provide are what allow Pantheon to offer development and test environments at scale. Containers also enable Pantheon to offer you highly available (distributed) and horizontally elastic applications by default. Although these features are a Drupal developer's dream, the underlying technology complicates things for data-driven content managers. When Pantheon updates servers, migrates endpoints, or does other maintenance work transparent to end-users, database connection details change, breaking Tableau's connection to the Drupal database. There are a few options for working around this problem, each with its drawbacks:
  • Send out updated credentials whenever they break: just instruct every Tableau user to e-mail you when the dashboard they built stops updating; you can find the new credentials and send them back. Rinse and repeat for every user with access and every site: welcome to your new full-time job.
  • Give content editors access to the Pantheon dashboard: train them to navigate to the specific environment you want them to connect to, suss out the MySQL details, and be sure not to hit that "delete live site" button you just gave them access to...
  • Use Pantheon's CLI to replicate the DB locally on a schedule: on the whole, not a bad option, but what happens when the DB server goes offline or your replication script starts failing? Didn't you go with Pantheon to get out of the infrastructure management and monitoring game in the first place?
Introducing the Pantheon Switchboard

We, being a data-driven marketing organization that coincidentally has a large Tableau installation base, and one that happens to host many Drupal sites on Pantheon, know the struggle well. To that end, we've developed and open sourced the Pantheon Switchboard, a Docker image that mashes up the Pantheon command line interface and MySQL proxy, allowing Tableau users (both those connecting ad-hoc using the desktop client as well as those scheduling extracts with our cloud or on-premise servers) to reliably and seamlessly connect to MySQL databases hosted on Pantheon, despite those periodic database connection detail changes.

The Switchboard's container approach attempts to strike the right balance between infrastructure requirements and the type of self-service simplicity that Tableau users expect. Complete details on installation and usage are available on the project's README.

Deploying to Google Compute Engine

For production use, we're enamored with the simplicity of deploying containers on GCE using their Container-Optimized VM images; feel free to use this as a recipe to get started:

# switchboard-manifest.yml
version: v1
kind: Pod
spec:
  containers:
    - name: my-drupal-site-proxy
      image: tableaumkt/pantheon-mysql-proxy
      imagePullPolicy: Always
      ports:
        - name: mysql
          containerPort: 3306
          hostPort: 11337
          protocol: TCP
      env:
        - name: PROXY_DB_UN
          value: un_to_connect_to_drupal_proxy
        - name: PROXY_DB_PW
          value: pw_to_connect_to_proxy_here
        - name: PANTHEON_SITE
          value: my-drupal-site
        - name: PANTHEON_ENV
          value: test
        - name: PANTHEON_EMAIL
          value: email-w-access-to-dashboard@example.com
        - name: PANTHEON_PASS
          value: password_for_email_here
    - name: my-wp-site-proxy
      image: tableaumkt/pantheon-mysql-proxy
      imagePullPolicy: Always
      ports:
        - name: mysql
          containerPort: 3306
          hostPort: 11338
          protocol: TCP
      env:
        - name: PROXY_DB_UN
          value: un_to_connect_to_wp_proxy_here
        - name: PROXY_DB_PW
          value: pw_to_connect_to_proxy_here
        - name: PANTHEON_SITE
          value: my-wp-site
        - name: PANTHEON_ENV
          value: dev
        - name: PANTHEON_EMAIL
          value: email-w-access-to-dashboard@example.com
        - name: PANTHEON_PASS
          value: password_for_email_here
    # - Additional containers/proxies here.
  restartPolicy: Always
  dnsPolicy: Default

Then spin up a VM in Google's Cloud with their CLI, using the above manifest as a template:

gcloud compute instances create pantheon-switchboard-test \
  --image container-vm \
  --metadata-from-file google-container-manifest=switchboard-manifest.yml \
  --zone us-central1-a \
  --machine-type f1-micro

After which you should:
  1. Provision a permanent IP and assign it to the VM (or use the VM's ephemeral IP for testing)
  2. Set up network rules to only allow connections from a specific range of IPs (such as your corporate network or, if you use Tableau Online, its IPs), and to the ports you specified in your manifest, and optionally,
  3. Route a domain to the VM's IP.
Once wired up, you should be able to connect to your Pantheon databases using the PROXY_DB_UN, PROXY_DB_PW, and host port specified in your manifest, along with the IP (or domain) you configured in your Google Cloud console.

Get Started
Categories: Elsewhere

Microserve: DrupalCamp Bristol 2015 Wrap-up

Mon, 06/07/2015 - 18:40

After nearly a year in the planning, Bristol's inaugural DrupalCamp has finally come and gone!

There have been murmurs about a Bristol camp or event for a number of years, and it's so rewarding to see the whole South West Drupal community coming together to make it a reality.

A few of my personal highlights were: Paul Johnson's Business day GOSH showcase; Matt Jukes recounting his numerous and often hilarious escapades trying to bring modern digital tools and techniques to ONS; and a very thought-provoking accessibility talk and demonstration by Léonie Watson.

There were also two very polished case studies to round off the event, given by DrupalCamp Bristol committee chair Rick Donohoe and Ringo Moss.

We at Microserve have really relished the opportunity to help make DC Bristol a reality, and are very much looking forward to starting work on DrupalCamp Bristol 2016!

We will be following up with a detailed perspective on DrupalCamp Bristol from an organiser's point of view, as well as a camp wash-up from the whole committee.

Images courtesy of DrupalCamp Bristol and Oliver Davies. DrupalCamp Bristol branding by Positive.

Mark Pavlitski
Categories: Elsewhere

SitePoint PHP Drupal: How to Build Multi-step Forms in Drupal 8

Mon, 06/07/2015 - 18:00

In this article, we are going to look at building a multistep form in Drupal 8. For brevity, the form will have only two steps in the shape of two completely separate forms. To persist values across these steps, we will use functionality provided by Drupal’s core for storing temporary and private data across multiple […]
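The core facility in question is the tempstore. A minimal sketch of a first step, assuming Drupal 8's user.private_tempstore service (the form ID, collection name and route below are illustrative):

use Drupal\Core\Form\FormBase;
use Drupal\Core\Form\FormStateInterface;

class StepOneForm extends FormBase {

  public function getFormId() {
    return 'mymodule_multistep_one';
  }

  public function buildForm(array $form, FormStateInterface $form_state) {
    $form['name'] = array('#type' => 'textfield', '#title' => $this->t('Name'));
    $form['next'] = array('#type' => 'submit', '#value' => $this->t('Next'));
    return $form;
  }

  public function submitForm(array &$form, FormStateInterface $form_state) {
    // Persist this step's value privately for the current user, then move on.
    $store = \Drupal::service('user.private_tempstore')->get('mymodule');
    $store->set('name', $form_state->getValue('name'));
    $form_state->setRedirect('mymodule.step_two');
  }

}

The second form would read the value back with $store->get('name') and delete it once the wizard completes.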

Continue reading: How to Build Multi-step Forms in Drupal 8

Categories: Elsewhere

Open Source Training: You Can Now Get Live Chat Support at OSTraining

Mon, 06/07/2015 - 14:34

Starting today, Pro members can live chat one-on-one with the OSTraining team.

What's not to love about live chat? If you have a WordPress, Drupal or Joomla question, you can get instant answers from an expert.

Every time we sent a survey to our members, live chat was the most requested feature.

When you take a look back at how training has changed over the last few years, moving to instant support is a logical step ...

Categories: Elsewhere

Appnovation Technologies: Drupal North Regional Summit

Mon, 06/07/2015 - 14:32

Last weekend saw the first annual Drupal North Summit, held in Toronto, Ontario.

Categories: Elsewhere

Annertech: Your Connected Website

Mon, 06/07/2015 - 14:07
Your Connected Website

Modern websites talk. They talk through great content to the visitors who come to read them, but they also talk through APIs (Application Programming Interfaces) to other systems. Does yours? Should it? Could integrating your site with other systems bring you greater return on investment?

Categories: Elsewhere

Open Source Training: New Video Class: Build a Drupal Magazine Site

Mon, 06/07/2015 - 10:04

Managing content is one of Drupal's greatest strengths, so in this week's new class we decided to build a site that focuses very heavily on content. Robert, our Drupal teacher, shows you how to build a magazine website, complete with issues and scheduled articles.

During the 16 videos in this class, Robert introduces a number of popular Drupal modules, including Views, Pathauto, Devel, Scheduler and Entity Views Attachments.

Take this class and Robert will show you how to build a robust, content-focused Drupal site.

Categories: Elsewhere

LevelTen Interactive: How to display an RSS feed in a Drupal block

Mon, 06/07/2015 - 07:00

If you have a standard RSS feed that you'd like to display in a block on your Drupal website, you've come to the right place. For this example, we will be using a sample feed at http://www.feedforall.com/sample.xml.
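If you'd rather skip contributed modules, a bare-bones Drupal 7 custom block can do it; here is a sketch (the module name, delta and markup are illustrative, and a production version should cache the fetched feed):

/**
 * Implements hook_block_info().
 */
function myfeed_block_info() {
  return array('sample_feed' => array('info' => t('Sample RSS feed')));
}

/**
 * Implements hook_block_view().
 */
function myfeed_block_view($delta = '') {
  $block = array();
  if ($delta == 'sample_feed') {
    $response = drupal_http_request('http://www.feedforall.com/sample.xml');
    $items = array();
    if (empty($response->error)) {
      $xml = simplexml_load_string($response->data);
      foreach ($xml->channel->item as $item) {
        $items[] = l((string) $item->title, (string) $item->link);
      }
    }
    $block['subject'] = t('Feed items');
    $block['content'] = theme('item_list', array('items' => $items));
  }
  return $block;
}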

Categories: Elsewhere

KatteKrab: CCR at OSCON

Sun, 05/07/2015 - 03:11

I've given a "Constructive Conflict Resolution" talk twice now. First at DrupalCon Amsterdam, and again at DrupalCon Los Angeles. It's something I've been thinking about since joining the Drupal community working group a couple of years ago. I'm giving the talk again at OSCON in a couple of weeks. But this time, it will be different. Very different. Here's why.

After seeing tweets about Gina Likins' keynote at ApacheCon earlier this year, I reached out to her to ask if she'd be willing to collaborate with me on conflict resolution in open source, and ended up inviting her to co-present with me at OSCON. We've been working together over the past couple of weeks. It's been a joy, and a learning experience! I'm really excited about where the talk is heading now. If you're going to be at OSCON, please come along. If you're interested, please follow our tweets tagged #osconCCR.

Jen Krieger from Opensource.com interviewed Gina and me about our talk - here's the article: Teaching open source communities about conflict resolution

In the meantime, do you have stories of conflict in Open Source Communities to share?

  • How were they resolved?
  • Were they intractable?
  • Do the wounds still fester?
  • Was positive change an end result?
  • Do you have resources for dealing with conflict?

Tweet your thoughts to me @kattekrab

Categories: Elsewhere

Wuinfo: How We Came Up With the CYouTube Module

Sat, 04/07/2015 - 14:07

Recently, I was working with a team on the project for Country Music Television (CMT) at Corus Entertainment. We needed to synchronize videos from two different sources, one of them YouTube. There are over 200 YouTube channels on CMT, and we needed to pull videos from all of those channels regularly. That way, videos published on YouTube become available on CMT automatically, and in-house editors do not need to spend time uploading them.

We store videos in file entities. The videos come from different sources, but all imported videos behave the same across the site; only the MIME type differs between videos from different sources. Each imported video is a file entity on CMT. For the front end, we built Views pages and blocks based on those video file entities.

We also have videos imported from another source, MPX thePlatform. We used "Media: thePlatform mpx" as the main module to import and update those videos, and to deal with customized video fields we contributed a module, "Media: thePlatform MPX entity fields sync", that works with the main module.

After we were done with the MPX thePlatform videos, we had experience handling video imports. So how would we import the YouTube videos, with over 200 channels to cover?

At first, we planned to use the Feeds module, but we eventually gave up on making it work for this. Google had just deprecated YouTube API v2.0, so the old RSS channel feeds no longer worked. Thanks to the community, we found a module called feeds_youtube, whose 7.x-3.x branch works with the latest YouTube API v3.0, giving us a Feeds parser. We still needed a Feeds processor; thanks to the community again, we found the feeds_files module. We installed those modules and their dependencies and spent a couple of days configuring Feeds, and it did work. In the end, though, we decided to build a lightweight custom module that does everything from downloading the YouTube JSON data to creating the local file entities. Each video imported from a channel links up to an artist or a hosting show.

What did we want from the module? We wanted it to watch multiple YouTube channels and, whenever a new video is uploaded to a channel, create an associated file entity on CMT. In the created entity, we save some of the YouTube metadata to entity fields, along with local metadata such as the show or artist information. We wanted the module to handle over 200 channels, and maybe thousands in the future, and to handle the work gracefully without burning out the system when importing thousands of videos. It sounds quite intimidating, but it is not really.

We ended up building a module and contributing it to Drupal.org: Youtube Channel Videos Sync V3. Here is how we came up with it.

First of all, we gathered a list of channel names, along with relevant local metadata like the artist ID or show ID. Then we send a request to YouTube to get a list of videos for each channel. For each video, we create a system queue task containing the video's data and the local metadata. Finally, a queue processor creates the video file entities one by one (a sketch of this queue pattern appears at the end of this post). See the diagram below for the complete process.

A little more technical detail below:

How do we get the list of YouTube channel names and the local metadata for those channels? On the module configuration page, we set which fields store the channel name; on CMT, that field exists on both the show and artist content types. The module goes through those nodes and builds an array of channel names mapped to node IDs, like this:

  array(
    'channel name' => array(
      'field_artist_referenced' => array('nid', ...),
    ),
  );

How do we map YouTube metadata fields onto entity fields? On the module configuration page, we can set up a mapping between each YouTube metadata field and a local entity field.

How do the imported videos get their local artist and show information? The module has a hook that a custom module can implement to provide it. In the array above, "field_artist_referenced" is the machine name of a field on the video entity, and array('nid') is the value for that field. The imported video thus gets an entity reference pointing back to the artist or show node, which is one of the best ways to set up the relationship.
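For illustration only, since the hook name below is hypothetical rather than the module's documented API, such an implementation might look like:

/**
 * Supplies channel names mapped to local metadata (hypothetical hook name).
 */
function mymodule_youtube_channels_info() {
  return array(
    'SomeArtistChannel' => array(
      // Entity reference field on the video entity => artist node ID.
      'field_artist_referenced' => array('123'),
    ),
  );
}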

That is the overall process the module follows to import thousands of videos from YouTube.
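As a rough sketch of the queue pattern described above (the function and queue names are illustrative, not the module's actual code):

/**
 * Implements hook_cron(): enqueue one task per fetched video.
 */
function mymodule_cron() {
  $queue = DrupalQueue::get('mymodule_video_import');
  // mymodule_fetch_channel_videos() is a hypothetical helper that returns
  // YouTube JSON data merged with the local show/artist metadata.
  foreach (mymodule_fetch_channel_videos() as $video) {
    $queue->createItem($video);
  }
}

/**
 * Implements hook_cron_queue_info().
 */
function mymodule_cron_queue_info() {
  return array(
    'mymodule_video_import' => array(
      'worker callback' => 'mymodule_import_video',
      'time' => 60, // Spend at most 60 seconds per cron run.
    ),
  );
}

/**
 * Queue worker: create one file entity per queued video.
 */
function mymodule_import_video($video) {
  // Map the YouTube metadata and local metadata onto a new file entity here.
}

Because each video becomes its own queue item, cron can chip away at thousands of imports without timing out.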

Categories: Elsewhere

Ryan Szrama: Publishing Content on Relative Dates Using Rules in Drupal 7

Sat, 04/07/2015 - 05:50

A friend of mine is a DUI lawyer who uses a Drupal site for content marketing and lead generation, managed by my friends at EverConvert. They quickly identified the weekend as the best time for the website to directly appeal to visitors who found the site after a Friday or Saturday night arrest. In addition to live chat, they decided to alter the site's appearance during those times by publishing content that contains large, direct calls to action.

Fortunately, Drupal provides the tools to build this with nary a line of code.

Introducing the Rules Scheduler

The primary module you'll use to create something similar is Rules with its Rules Scheduler sub-module. Most purpose-built content scheduling modules that I've seen allow you to set absolute dates for pieces of content to be published or unpublished. However, you wouldn't want to have to enter every single weekend date to get that content published at the right times. Fortunately, Rules Scheduler allows you to schedule arbitrary actions using relative date strings (in addition to any other format supported by strtotime()).
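For example, PHP resolves these relative strings against the current time, which is exactly what a recurring weekend schedule needs:

// Both values are computed fresh each time the action is scheduled.
$publish_on = strtotime('next friday');
$unpublish_on = strtotime('next monday');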

Before we dive into the configuration I proposed to them, you should understand that the Rules module basically adds a GUI-based scripting language to the back end of Drupal. In addition to configuring actions to be performed after certain events when a set of conditions are met, you can create Rules components that are essentially subroutines that can be directly invoked by Rules (or code) without being triggered by events.

To set up a Rules Scheduler-based content publishing schedule, you have to create two Rules components: one will publish the piece of content and schedule it to be unpublished on a relative date (i.e. "next Monday"), while the other will unpublish the piece of content and schedule it to be published on a relative date (i.e. "next Friday"). Another rule will need to react to content being created or updated to initiate the publishing schedule.

Using Fields to Avoid "Magic" Behavior

One of the keys to building a maintainable Drupal site (or module) is to ensure that every "automated" action is explicitly enabled. In the early days of Drupal Commerce development, I adopted an approach to some module behaviors where a feature just automatically worked if certain conditions were met (e.g. the representation of attribute fields on Add to Cart forms). "Neat!" I thought, until I realized that it was difficult to document and even more difficult to ensure users knew to look for said documentation. Much better to include explicit user interface components to enable functionality.

In the case of a scheduling system, you wouldn't want to build the site to just automatically enter every piece of content, or even every piece of content of a certain type, into the publishing pattern. Your client, a.k.a. the end user, really expects (and needs) to see an indicator on a form that submitting it will lead to a desired outcome.

For my friend's site, my recommendation was simply to add a Boolean field using a checkbox widget to the relevant content type reading, "Schedule this content to be published on Fridays and unpublished on Mondays." If the site required more publishing patterns than just the weekend pattern, I would've used a List (text) field with radio buttons or a select list identifying the different available publishing patterns.

Building the Scheduling System in Rules

Working with Rules is fairly simple if you can write out long hand what it is you're trying to accomplish. Considering all of our constraints here, we need a set of rules that accomplishes the following:

  1. When a call to action is created or saved with the scheduling checkbox checked, delete any scheduled tasks related to the content. (This is possible because Rules Scheduler lets you assign an identifier to scheduled rules, so identifying our scheduled tasks with the node ID will allow us to delete the tasks when needed in the future.)
  2. If the call to action that was just saved isn't published, schedule it to be published next Friday.
  3. If the call to action that was just saved is published, schedule it to be unpublished next Monday.
  4. When a call to action is automatically published on Friday, schedule it to be unpublished next Monday.
  5. When a call to action is automatically unpublished on Monday, schedule it to be published next Friday.

The first three items will be accomplished through a single rule reacting to two events, "After saving new content of type Call to action" and "After updating existing content of type Call to action." The rule will first delete any scheduled task for the piece of content and then it will invoke two rules components that will schedule the appropriate task based on whether or not the call to action was published.

The final two items will be accomplished through rules components that perform the necessary action and then schedule the appropriate task. As mentioned above, we'll use relative time strings ("next monday" and "next friday") and choose task identifiers that include the call to action node IDs ("unpublish-call-to-action-[node:nid]" and "publish-call-to-action-[node:nid]" respectively).

Give it a whirl!

It only took me about 10 minutes to create and test the rules based on the specification above, but if you aren't familiar with the Rules UI, it could take much longer. I believe Rules is worth learning (we built Drupal Commerce around it, after all), but there's something to be said for ready-made examples.

I've attached to this post a Features export of the content type and related rules configurations for you to try on your own site. Give Scheduled Calls to Action a whirl and let me know how it works for you in the comments!

(Note: to see the rules configurations and scheduled tasks, enable the Rules UI, Views, and Views UI modules if they aren't already enabled on your site.)

Categories: Elsewhere

2bits: Presentation: Backdrop, an alternative Drupal fork

Fri, 03/07/2015 - 17:25
Last week, at the amazing Drupal North regional conference, I gave a talk on Backdrop: an alternative fork of Drupal. The slides from the talk are attached below, in PDF format.
Categories: Elsewhere

Chapter Three: Principles of Configuration Management - Part Three

Fri, 03/07/2015 - 16:09

This is the third in a series of posts about Drupal 8's configuration management system. The Configuration Management Initiative (CMI) was the first Drupal 8 initiative to be announced in 2011, and we've learned a lot during thousands of hours of work since then. These posts share what we've learned and provide background on the why and how. In case you missed them: here is part one and part two.

Categories: Elsewhere

Midwestern Mac, LLC: Major improvements to Drupal VM - PHP 7, MariaDB, Multi-OS

Fri, 03/07/2015 - 15:17

For the past couple of years, I've been building Drupal VM to be an extremely tunable, highly performant, super-simple development environment. Since MidCamp earlier this year, the project has really taken off, with almost 200 stars on GitHub and a ton of great contributions and ideas for improvement (some implemented, others rejected).

Categories: Elsewhere
