Feed aggregator

Mogdesign: 2/3 Using custom fields in the feed/csv

Planet Drupal - Wed, 28/05/2014 - 11:00

This is part 2 of the Aegir mini-series.

Categories: Elsewhere

Drupal core announcements: Rerolling patches for PSR-4

Planet Drupal - Wed, 28/05/2014 - 08:01

All Drupal 8 core module code was recently converted to the PSR-4 standard. This issue moved hundreds of files, so many patches will need rerolls. You can update patches safely with the following workflow (provided by @donquixote):

  1. Ensure you have the latest 8.x HEAD in your Drupal 8 repository: git pull origin 8.x
  2. git checkout -b tmp 00339b3d to create a new (temporary) branch from the Drupal 8 commit hash just before the patch for Issue #2247991 was committed.
  3. git apply --index the patch and commit to the new branch.
  4. git rebase 8.x -- This should actually preserve all the local changes to existing files, because git understands if a file has been moved.
  5. Run php core/scripts/switch-psr4.sh and commit the changes. This is for files that were added in the patch or local branch. Remember, this script will add and remove files without updating the git index, so you will need to stage the added and removed files with git add.
  6. git diff 8.x to produce a new version of the patch.
  7. Upload the new patch to the issue and set the issue "Needs review".
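The key claim in step 4 - that a rebase carries local changes across file moves because git detects the rename - can be sanity-checked in a scratch repository. The sketch below is purely illustrative: the file, branch, and commit names are made up and have nothing to do with Drupal's actual repository.

```shell
#!/bin/sh
# Demonstrate that git rebase carries a change across a file move,
# the mechanism step 4 of the workflow above relies on.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git checkout -q -b trunk
git config user.email demo@example.com
git config user.name demo

echo 'class Foo {}' > Foo.php
git add Foo.php
git commit -qm 'base'

git branch tmp                    # branch from before the move

mkdir -p src
git mv Foo.php src/Foo.php        # simulate the PSR-4 style file move
git commit -qm 'move file to new layout'

git checkout -q tmp
echo '// local patch' >> Foo.php  # change against the old path
git commit -qam 'apply patch against old layout'

git rebase -q trunk               # replay the patch onto the moved tree
grep 'local patch' src/Foo.php && echo 'rebase carried the change across the move'
```

Running this prints the patched line from src/Foo.php: git recognized that Foo.php became src/Foo.php and applied the change there.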

As always, be sure to review your updated patch carefully for any errors.

Categories: Elsewhere

AGLOBALWAY: Drupal Social Media Login with OneAll Social Login in 3 easy steps

Planet Drupal - Tue, 27/05/2014 - 21:49
As social media grows more and more popular, logging into your preferred CMS with your social media account is clearly becoming one of the most frequent requirements in projects. There are a number of modules contributed for this feature. Here we'll go over what it takes to implement "OneAll Social Login", which is available for Drupal 7 only. For more details, please visit the module page: https://drupal.org/project/social-login. With this module, it is super convenient to connect your Drupal site with your users' social media accounts.

First, you need a OneAll account and API credentials. You can register here: https://app.oneall.com/signup/

For the Drupal site, download the module and enable all the components: Oneall Social Login, Oneall Social Login Core, and Oneall Social Login Widget.

Then go to the Configuration -> Social Login plugin settings page to enter your API keys, choose your settings, and enable the social networks of your choice.

What's next? All set! The API supports more than 20 social networks, including almost all the big sites like Facebook, Twitter, Google, LinkedIn, Yahoo, OpenID... It is really an all-in-one module that makes everything work on the fly.

Tags: drupal, drupal planet
Categories: Elsewhere

Robert Douglass: Attention Drupal Super Heroes! Your powers are needed!

Planet Drupal - Tue, 27/05/2014 - 20:19

Calling all DRUPAL SUPER HEROES: it is time to assemble! The CMS is threatened by Lord Over-Engineering, and only you can help save humanity! 

We're looking for volunteers to participate in Rob and Jam's pre-keynote opening session at DrupalCon, Austin, in 1 week. If you don't know what I'm talking about, just imagine yourself having this much fun:

Not fun enough? How about THIS MUCH FUN:

You get the picture?

Just fill out the following form to sign up for the most important Drupal adventure of your lifetime.

I hereby attest to possession of DRUPAL SUPER POWERS which I will use to serve the good of mankind and defeat LORD OVER-ENGINEERING at DrupalCon, Austin. I am fully aware that this obligates me not only to attend the showdown pre-keynote session, which takes place at the inhumanly early hour of 8am on Tuesday, June 3, but also to disguise my true identity and dress as is appropriate for a DRUPAL SUPERHERO.

Name
E-mail
Description of Super Powers
I can fly: Yes / No
Roles that I would like to play: Evil minion / Vigilante posse / Offstage coordinator / Assistant to Lord Over-Engineering / Innocent bystander / Damsel in distress / Victim of senseless annihilation

By submitting this form, you accept the Mollom privacy policy.
Categories: Elsewhere

Phase2: Combining Tasks with Grunt

Planet Drupal - Tue, 27/05/2014 - 19:37

I was recently asked to help out with a few build steps for a Drupal project using Grunt as its build system. The project’s Gruntfile.js has a drush:make task that utilizes the grunt-drush package to run Drush make. This task is included in a file under the tasks directory in the main repository.


module.exports = function(grunt) {
  /**
   * Define "drush" tasks.
   *
   * grunt drush:make
   *   Builds the Drush make file to the build/html directory.
   */
  grunt.loadNpmTasks('grunt-drush');
  grunt.config('drush', {
    make: {
      args: ['make', '<%= config.srcPaths.make %>'],
      dest: '<%= config.buildPaths.html %>'
    }
  });
};

You can see that the task contains a few instances of variable interpolation, such as <%= config.srcPaths.make %>. By convention, the values of these variables go in a file called Gruntconfig.json and are set using the grunt.initConfig method. In addition, the configuration for the default task lives in a file called Gruntfile.js. I have put trimmed examples of each below.


module.exports = function(grunt) {
  // Initialize global configuration variables.
  var config = grunt.file.readJSON('Gruntconfig.json');
  grunt.initConfig({
    config: config
  });

  // Load all included tasks.
  grunt.loadTasks(__dirname + '/tasks');

  // Define the default task to fully build and configure the project.
  var tasksDefault = [
    'clean:default',
    'mkdir:init',
    'drush:make'
  ];
  grunt.registerTask('default', tasksDefault);
};


{
  "srcPaths": {
    "make": "src/project.make"
  },
  "buildPaths": {
    "build": "build",
    "html": "build/html"
  }
}

As you can see, the project’s Gruntfile.js also has a clean:default task to remove the built site and a mkdir:init task to make the build/html directory. The three tasks are combined with grunt.registerTask into the default task, which is run when you invoke grunt with no arguments.

A small change

In Phase2's build setup using Phing, we have a task that will run drush make when the Makefile's modified time is newer than the built site. This allows a user to invoke the build tool and only spend the time doing a drush make if the Makefile has indeed changed.

The setup needed to do this in Phing is configured in XML: if an index.php file exists and it is newer than the Makefile, don’t run drush make. Otherwise, delete the built site and run drush make. The necessary configuration to do this in a Phing build.xml is below.


<target name="-drush-make-uptodate" depends="init" hidden="true">
  <if>
    <available file="${html}/index.php" />
    <then>
      <uptodate property="drush.makefile.uptodate"
                targetfile="${html}/index.php"
                srcfile="${drush.makefile}" />
    </then>
  </if>
</target>

<!-- Use drush make to build (or rebuild) the docroot -->
<target name="drush-make" depends="-drush-make-uptodate, init" hidden="true"
        unless="drush.makefile.uptodate">
  <if>
    <available file="${html}"/>
    <then>
      <echo level="info" message="Rebuilding ${html}."/>
      <delete dir="${html}" failonerror="true"/>
    </then>
  </if>
  <exec executable="drush" checkreturn="true" passthru="true" level="info">
    <arg value="make"/>
    <arg value="${drush.makefile}"/>
    <arg value="${html}"/>
  </exec>
</target>

You’ll note that Phing also uses variable interpolation. The syntax, ${html}, is similar to regular PHP string interpolation. By convention, parameters for a Phing build live in a build.properties file.

A newer grunt

The grunt-newer plugin appears to be the proper way to handle this. It adds a newer:-prefixed variant of any other defined task. If your task has src and dest parameters, it will check that src is newer than dest before running the task.

In my first quick testing, I added a spurious src parameter to the drush:make task and then invoked the newer:drush:make task.

grunt.config('drush', {
  make: {
    args: ['make', '<%= config.srcPaths.make %>'],
    src: '<%= config.srcPaths.make %>',
    dest: '<%= config.buildPaths.html %>'
  }
});

That modification worked properly in concert with grunt-newer (and the drush task from grunt-drush didn't complain about the extra src parameter), but I still needed to run clean:default and mkdir:init conditionally as well, only when the Makefile was newer than the built site.

Synchronized grunting

The answer turned out to be to create a composite task using grunt.registerTask and grunt.task.run that combined the three existing tasks, and then use the grunt-newer version of that task. The solution looked much like the following.


module.exports = function(grunt) {
  /**
   * Define "drushmake" tasks.
   *
   * grunt drushmake
   *   Remove the existing site directory, make it again, and run Drush make.
   */
  grunt.registerTask('drushmake', 'Erase the site and run Drush make.', function() {
    grunt.task.run('clean:default', 'mkdir:init', 'drush:make');
  });
  grunt.config('drushmake', {
    default: {
      // Add src and dest attributes for grunt-newer.
      src: '<%= config.srcPaths.make %>',
      dest: '<%= config.buildPaths.html %>'
    }
  });
};

I could then invoke newer:drushmake:default in my Gruntfile.js and only delete and rebuild the site when there were changes to the Makefile.

Learn more about build systems in Adam Ross’s blog post “Creating Options in Automated Software Deployment.”

Categories: Elsewhere

Jon Dowland: Nine Inch Nails, Manchester, 2014

Planet Debian - Tue, 27/05/2014 - 18:37

I spent the bank holiday weekend mostly in Manchester with my brother and a couple of friends, mostly to see Nine Inch Nails perform, but also to enjoy the pleasures of the city.

the Marble Arch's ceiling

We arrived on the Sunday and after checking in headed out to get some food. I'd booked us a table at the historic Marble Arch - the pub which gives its name to the parent company Marble Beers, which produces my most favourite ales. Whenever a Marble beer ends up as a guest at one of the Newcastle real ale pubs, a friend of mine (who is much more on top of this stuff than I am) lets me know and we stop off for a few. Invariably there's a crowd of a dozen or so Manchester expats in the bar when we do. I reserved our table over Twitter, which is a bit of a novelty for me. My two friends are ale skeptics but - due to lack of choice - we all ended up sampling the Chocolate Marble and the Ginger 5.1, which went down a storm. They are possibly converts now. There was an Earl Grey IPA too, which was nice but a bit on the strong side for sessioning.

click for the animated version

The gig was great - I prefer club gigs to arena gigs, but the sound techs for NIN know what they are doing and the mix was great. The visuals were stunning too. Highlights for me were "The Great Destroyer" - in particular the extended, improvised 5 minute glitch-meltdown coda; the slow build of "Eraser"; and "The Day The World Went Away" - played faithfully to the CD mix rather than the traditional extended live arrangement. I miss the former live arrangement, which included a drawn-out, drum-backed finish, but this arrangement had a lot of force, with all four band members bashing guitars into a pummelling wall of sound. I've heard that at least one person taped the show, and it turns out recordings have just recently surfaced for the two club gigs we attended back in 2005 and 2007 - meaning there are now widely available, high-quality ROIO copies of every NIN gig I've ever been to.

Afterwards we tried to find a decent club. Manchester is a lot larger than my native city and there's plenty of places to go, if you know where they are. We had originally planned to visit The Factory on Charles Street, but we didn't believe it was open on the Sunday. The Factory is Peter Hook's (formerly of Joy Division/New Order) club - occupying the former office spaces of Factory records. I've chanced in there once before as it's right across the road from the Lass O' Gowrie pub, which is the haunt of choice for HE/tech people whenever we're in Manchester for conferences.

cheesy cocktail names

Instead we gravitated towards Dry Bar on Oldham Street. By coincidence this place also has historic ties to Factory Records. However the doormen wouldn't let us in! Finally we landed at a place which had one name on the door and a different name inside (Jack's). Like the aforementioned Lass O' Gowrie and perhaps half of all pubs and clubs in Manchester, the place is decked out as a shrine to the former musical giants of the city, with framed pictures of Mark E Smith, the Haçienda again, the Stone Roses in their boy-band-looks heyday, John Peel (champion of many of them) and of course Tony Wilson. I can't help but wonder whether people who live here get royally sick of that.

This place served some delicious, albeit cliché-titled, cocktails and played a pretty good set - the obligatory Madchester throwback interlude was followed by a chunk of Northern Soul and a couple of early rap classics. Setlist-wise it was pretty much identical to that of Foundation in Newcastle, 12 years ago, no doubt because Foundation was cribbing heavily from the Haçienda in the first place.

Record haul

Monday was dedicated to exploring and shopping. Top of my list of places to go was my pilgrimage venue, Vinyl Exchange. Whilst waiting for it to open we rolled around in Affleck's Palace, which turned out to have a small record stall within. After Vinyl Exchange we chanced across another record store right across the road - Piccadilly Records - which happens to be larger and focuses more on new releases. It didn't take long before we found another small, DIY record shop, then another. We tried out the Fopp branch here - more vinyl; then Urban Outfitters - more; then an Oxfam branch - even more. I actually got vinyl snow-blind at this point. I've been controlling myself admirably and only picked up a couple of bits and pieces. My best find was Fad Gadget's sophomore album "Incontinent".

Categories: Elsewhere

Blink Reaction: Why is Symfony in Drupal 8, and how Does that Change Things?

Planet Drupal - Tue, 27/05/2014 - 18:30

This post is the first of a series exploring the integration of Symfony2 into Drupal's next major release, Drupal 8.

Is Drupal 8 harder to learn than previous versions?

If you have been working with and developing Drupal modules for a few years, and now you are trying to learn about all of the Drupal 8 changes, your answer may be a resounding, "YES."

Categories: Elsewhere

Gunnar Wolf: On how tech enthusiasts become tech detractors

Planet Debian - Tue, 27/05/2014 - 17:04

As is often the case, the Saturday Morning Breakfast Cereal webcomic (http://smbc-comics.com/) gets it right. And I cannot help but share today's comic.

The picture explains it much better than what I ever could.

Categories: Elsewhere

2bits: Using Drush for a Seven Day Daily Backup scheme for Drupal sites

Planet Drupal - Tue, 27/05/2014 - 16:50
Everyone needs to have a backup plan for their live site. Not only can your server's disk get corrupted, but you can also erroneously overwrite your site with bad code or bad data, or your site can get hacked. Detecting the latter situations takes some time: hours or days. For this reason, you should have multiple backup copies at multiple time points. The most convenient scheme is a 7-day sliding backup: you keep one backup snapshot for each day of the week, with today's backup overwriting the one from a week earlier.
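The sliding scheme can be sketched in a few lines of shell suitable for a daily cron job. This is a minimal illustration, not the article's actual script: the @live site alias and backup directory are assumptions, and the drush command is echoed as a dry run (drop the echo to run it for real).

```shell
#!/bin/sh
# Seven-day sliding backup sketch: the %a day name (Mon..Sun) in the
# file name means each day's dump overwrites the one from a week earlier.
# BACKUP_DIR and the @live alias are hypothetical; adjust for your site.
BACKUP_DIR=/var/backups/drupal
DOW=$(LC_ALL=C date +%a)
echo "drush @live sql-dump --gzip --result-file=$BACKUP_DIR/site-$DOW.sql"
```

Run from cron once a day, this leaves exactly seven dump files in the backup directory, one per weekday name.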

read more

Categories: Elsewhere

Jon Dowland: 2012 In Review

Planet Debian - Tue, 27/05/2014 - 16:31

2013 is nearly all finished up, so I thought I'd spend a little time writing up what was notable in the last twelve months. When I did so, I found an unfinished draft from the year before. It would be a shame for it to go to waste, so here it is.

2012 was an interesting year in many respects, with personal highs and lows. Every year I see a lot of "round-up"-style blog posts on the web, titled things like "2012 in music", which attempt to summarize the highlights of the year in that particular context. Here's JWZ's effort, for example. Often they are prefixed with statements like "2012 was a strong year for music" or whatever. For me, 2012 was not a particularly great year. I discovered quite a lot of stuff that I love that was new to me, but not new in any other sense.

In music, there were a bunch of come-back albums that made the headlines. I picked up both Orbital's Wonky and Brian Eno's Lux (debatably a comeback: his first ambient record since 1983, his first solo effort since 2005, but his fourth collaborative effort on Warp in the noughties). I've enjoyed them both, but I've already forgotten Wonky and I still haven't fully embraced Lux (and On Land has not been knocked from the top spot when I want to listen to ambience). There was also Throbbing Gristle's (or X-TG's) final effort, a semi/post-TG, partly posthumous double-album swan song which, even more than Lux, I still haven't fully digested. In all honesty I think it was eclipsed by the surprise one-off release of a live recording of a TG side project featuring Nik Void of Factory Floor: Carter Tutti Void's Transverse, which is excellent. Ostensibly a four-track release, there's a studio excerpt, V4 studio (Slap 1), which is available from (at least) Amazon. There's also a much more obscure fifth "unreleased" track, cruX, which I managed to "buy" from one of the web shops for zero cost.

The other big musical surprise for me last year was Beth Jeans Houghton and the Hooves of Destiny: Yours Truly, Cellophane Nose. I knew nothing of BJH, although it turns out I've heard some of her singles repeatedly on Radio 6, but her band's guitarist Ed Blazey and his partner lived in the flat below me briefly. In that time I managed to get to the pub with him just once, but he kindly gave me a copy of their album on 12" afterwards. It reminds me a bit of Goldfrapp circa "Seventh Tree": I really like it and I'm looking forward to whatever they do next.

Reznor's How To Destroy Angels squeezed out An Omen EP which failed to set my world on fire as a coherent collection, despite a few strong songs individually.

In movies, sadly once again I'd say most of the things I recall seeing would be "also rans". Prometheus was a disappointment, although I will probably rewatch it in 2D at least once. The final Batman was fun although not groundbreaking to me and it didn't surpass Ledger's efforts in The Dark Knight. Inception remains my favourite Nolan by a long shot. Looper is perhaps the stand-out, not least because it came from nowhere and I managed to avoid any hype.

In games, I moaned about having too many games on the go, most of which are much older than 2012. I started Borderlands 2 after enjoying Borderlands (disqualified on age grounds) but to this day haven't pursued it much further. I mostly played the two similar meta-games: the PlayStation Plus free game downloads within a fixed time period, and the more sporadic but bountiful Humble Bundle whack-a-mole. More on these another time.

In reading, as is typical, I mostly read stuff that was not written in 2012. Of that which was, Charles Stross's The Apocalypse Codex was an improvement over The Fuller Memorandum, which I did not enjoy much, but in general I'm finding I much prefer Stross's older work to his newer; David Byrne's How Music Works was my first (and currently last) Google Books ebook purchase, and I read it entirely on a Nexus 7. I thoroughly enjoyed the book, but the experience has not made a convert of me away from paper. He leans heavily on his own experiences, which is inevitable, but fortunately they are wide and numerous. Iain Banks' Stonemouth was an enjoyable romp around a fictional Scottish town (one which, I am reliably informed, is incredibly realistically rendered). One of his "mainstream" novels, it avoided a particular plot pattern that I've grown to dread with Banks, much to my surprise (and pleasure). Finally, the stand-out pleasant surprise novel of the year was Pratchett and Baxter's The Long Earth. With a plot device not unlike Banks' Transition or Stross's Family Trade series, the pair managed to write a journey-book capturing the sense-of-wonder that these multiverse plots are good for. (Or perhaps I have a weakness for them.) It's hard to find the lines between Baxter and Pratchett's writing, but the debatably-reincarnated Tibetan Monk-cum-Artificial Intelligence 'Lobsang' must surely be Pratchett's. Pratchett managed to squeeze out another non-Discworld novel (Dodger) as well as a long-overdue short story collection, although I haven't read either of them yet.

On to 2013's write-up...

Categories: Elsewhere

Dries Buytaert: Acquia raises $50 million series F

Planet Drupal - Tue, 27/05/2014 - 15:48
Topic: DrupalAcquiaBusiness

We've got great news to share today; we are announcing that Acquia raised $50 million, the largest round of financing we’ve ever completed.

The round is led by New Enterprise Associates (NEA), one of the world's top investors in our space. They have made many great investments in Open Source (MongoDB, Mulesoft, etc.) as well as in SaaS companies (Salesforce, Workday, Box, etc.).

With the new funding, we can continue to go after our vision to help many more organizations with their digital platform and digital business transformation. In addition, Acquia is charting new territory in the world of software with a unique business model, one that is rooted in Open Source and that helps us build a web that supports openness, innovation and freedom.

We have such a big and exciting opportunity ahead of us. This vision will not come to life on its own and the proprietary competitors are not resting on their laurels. We'll use the funding to double down on all aspects of our company; from increasing our investment in products to deeper investments in sales and marketing.

In addition to lead investor NEA, other investors include Split Rock Partners, and existing investors North Bridge Venture Partners, Sigma Partners, Investor Growth Capital, Tenaya Capital, and Accolade Partners. The new funding will bring Acquia’s total fund-raising to $118.6 million.

Of course, none of this success would be possible without the support of our customers, the Acquia team, our partners, the Drupal community and our many friends. Thanks so much for supporting Acquia!

Categories: Elsewhere

High Rock Media: Drupal 8 DevOps: Leveraging OpenShift Online PaaS

Planet Drupal - Tue, 27/05/2014 - 15:21

In my free time recently, I've been designing and developing a little Drupal 8 site that I'd like to bring online. From the start, I knew I'd need a decent web host with all the fixins': Drush, Git and, most of all, PHP 5.4, as that's the minimum required for Drupal 8.

The first place I thought of was Get Pantheon but as it turns out, as of this writing, they only have PHP 5.3. I also took a look at Linode but admittedly I'm not a huge fan as it requires a fair amount of server setup and management.

Enter OpenShift Online, Red Hat's scalable PaaS, or Platform as a Service, in the cloud. OpenShift is an all-in-one development, hosting and deployment platform which supports PHP, Java, Node.js, Python and more. The idea behind this is to let the developer / designer focus on what matters, code and apps, rather than tedious server administration and management. For me, this is a huge win and, best of all, Red Hat has a basic plan that gets you everything you need for free.

Apps, Gears, and Cartridges

OpenShift's terminology takes a little getting used to, but I'll break it down and try to define what's what. Think of an Application as just an instance of a website. Gears are an allocated amount of server resources, ranging in size from small to large for scalability. Think of a Cartridge as an a la carte-style add-on for your app. Need Jenkins, phpMyAdmin or cron? No problem: just browse the list of available cartridges or add one from a URL, and OpenShift does all the installation work for you.

Build Your App

My workflow involves developing a website locally and then bringing it into dev / production via Git. With OpenShift, this is no different, with a few caveats, however. Normally you'd use .gitignore for persistent non-tracked data files, i.e. Drupal's files directory, and these untracked items can just sit on your server, happily going about their business. However, with OpenShift, anything not in Git gets deleted on the next push, regardless of .gitignore. OpenShift handles the untracked files via its data directory - ${OPENSHIFT_DATA_DIR} - and symbolic links. So while .gitignore is fine for your local, you'll need to do a little wrangling on the OpenShift end, and I'll show you how.

We can automate file handling by writing a custom deploy script. This can contain any command you'd normally execute directly on the server, but with added logic to test for certain conditions, e.g. whether a file or folder already exists. I wrote a full-scale deploy script (Gist) from scratch, which turned out to be extremely challenging and a lot of fun. The script executes on each git push.
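A minimal sketch of what such a deploy hook can look like is below. It is illustrative, not my actual script: on OpenShift the two environment variables are provided by the platform, and the mktemp fallbacks exist only so you can exercise the sketch outside a gear.

```shell
#!/bin/bash
# Sketch of a deploy hook that runs on every git push.
# OPENSHIFT_DATA_DIR / OPENSHIFT_REPO_DIR are set by the platform;
# the fallbacks below are only for trying this locally.
set -e
: "${OPENSHIFT_DATA_DIR:=$(mktemp -d)/}"
: "${OPENSHIFT_REPO_DIR:=$(mktemp -d)/}"
mkdir -p "${OPENSHIFT_REPO_DIR}php/sites/default"

# Keep the persistent files directory in the data dir, which survives pushes...
if [ ! -d "${OPENSHIFT_DATA_DIR}sites/default/files" ]; then
  mkdir -p "${OPENSHIFT_DATA_DIR}sites/default/files"
fi
# ...and symlink it into the repo, which is rebuilt on each push.
if [ ! -e "${OPENSHIFT_REPO_DIR}php/sites/default/files" ]; then
  ln -s "${OPENSHIFT_DATA_DIR}sites/default/files" \
        "${OPENSHIFT_REPO_DIR}php/sites/default/files"
fi

# Copy default.settings.php into the data dir once, then symlink it back.
if [ -f "${OPENSHIFT_REPO_DIR}php/sites/default/default.settings.php" ] \
   && [ ! -f "${OPENSHIFT_DATA_DIR}sites/default/settings.php" ]; then
  cp "${OPENSHIFT_REPO_DIR}php/sites/default/default.settings.php" \
     "${OPENSHIFT_DATA_DIR}sites/default/settings.php"
fi
if [ -f "${OPENSHIFT_DATA_DIR}sites/default/settings.php" ] \
   && [ ! -e "${OPENSHIFT_REPO_DIR}php/sites/default/settings.php" ]; then
  ln -s "${OPENSHIFT_DATA_DIR}sites/default/settings.php" \
        "${OPENSHIFT_REPO_DIR}php/sites/default/settings.php"
fi
```

The existence checks are the important part: they ensure a second push never clobbers data that is already in the data directory.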

Getting Started

First, create an account at OpenShift Online, log in and create your first app. The best way is to simply click the "Add Application" button in your app console and then search for PHP. I added PHP 5.4 since I was going to use Drupal 8. You'll be asked to name your app and can follow the simple steps there. This creates your base app, which is now up and running with the flavor of PHP you chose.

Here are the other OpenShift cartridges I use in my Drupal 8 app.

There's also an option to install phpMyAdmin as well. Note that for Drush, you enter the URL on the cartridge UI page, i.e. under "Install your own cartridge". You can also use Red Hat's RHC client tool to set up cartridges from your local machine; it does require Ruby. I'm sure I'll get into that next, I just have not had time to work on it yet, but it looks pretty cool.

Local Setup

Once you've got all your cartridges installed, you can pull from Git and get your local dev environment set up, ready to push your code up. Here's a general outline of what I do in Terminal that works well.

 # Within your project directory:
$ git clone [git_url] [project_site]
$ cd [project_site]
 # Remove OpenShift's default index file:
$ rm index.php
 # Now download drupal (latest 8.x Alpha)
$ drush dl drupal-8.0-alpha11
 # Rename the drupal folder to "php"
$ mv drupal-8.0-alpha11 php

With the above setup, you'll note that we rename the Drupal directory to php. That's because OpenShift best serves web files if you have a sub directory named php. They also automatically alias this for you so you don't see php in the URL. Here's how my local directory structure looks once I'm done.

| |____project_site
| | |____.git
| | |____.openshift
| | | |____action_hooks
| | | | |____build
| | | | |____deploy
| | | | |____README.md
| | |____php
| | | |____[drupal-files-here]

Push and Deploy

To get around the persistent non-tracked files / folders issue, we can implement the deploy script I mentioned above in our local OpenShift repo, which runs on each git push. Essentially this creates a sites/default/files directory in your OpenShift data directory, or ${OPENSHIFT_DATA_DIR}, and then symlinks it into your repo, or ${OPENSHIFT_REPO_DIR}. In addition, it copies default.settings.php as settings.php to the data directory and proceeds to symlink that as well. The deploy script should be located at:


My deploy script has conditional statements to ensure that already existing files and folders in the ${OPENSHIFT_DATA_DIR} don't get overwritten. For your deploy script to be recognized, be sure to give it appropriate permissions. Once you're set with all this, it's time to commit and push for the first time.

 # Commit your changes and push to OpenShift
$ git commit -a -m 'Initial commit'
$ git push

On first push, I noticed that my deploy script did not run, and that was a real mystery. In the end, it turned out to be a permissions issue on the file. It was set to 644, so a chmod to 777 took care of it. Once git has been pushed and the deploy script runs, here is what my OpenShift /repo/php/sites/all/files directory looks like:

files -> /var/lib/openshift/[app_id]/app-root/data/sites/default/files
settings.php -> /var/lib/openshift/[app_id]/app-root/data/sites/default/settings.php

Once you either import your database or install Drupal, you'll want to reset the permissions on settings.php in the data directory so that it is not writable.

A note about settings.php

I could not find much documentation with regard to Drupal's settings.php and OpenShift. Suffice it to say, I found this snippet. (See further below with regard to QuickStarts.)

if (array_key_exists('OPENSHIFT_APP_NAME', $_SERVER)) {
  $src = $_SERVER;
}
else {
  $src = $_ENV;
}
$databases = array(
  'default' =>
  array(
    'default' =>
    array(
      'database' => $src['OPENSHIFT_APP_NAME'],
      'username' => $src['OPENSHIFT_MYSQL_DB_USERNAME'],
      'password' => $src['OPENSHIFT_MYSQL_DB_PASSWORD'],
      'host' => $src['OPENSHIFT_MYSQL_DB_HOST'],
      'port' => $src['OPENSHIFT_MYSQL_DB_PORT'],
      'driver' => 'mysql',
      'prefix' => '',
    ),
  ),
);
The code above uses OpenShift environment variables to connect to your database; this is the preferred method, especially if you build a scalable app. If you want to print out all the variables, you can SSH in and simply run env in the terminal, or you can add a build file in your action_hooks directory containing the line export. The next time you git push, you'll see all these vars printed out. Pretty cool!
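For example, from an SSH session on the gear, one line filters the listing down to just the platform-provided variables. This is only a sketch: the seeded OPENSHIFT_APP_NAME value below is a made-up placeholder so the snippet prints something when run outside OpenShift, where the variables are already set.

```shell
#!/bin/sh
# List the platform-provided environment variables.
# On a real gear these are already set; "myapp" is a demo placeholder.
: "${OPENSHIFT_APP_NAME:=myapp}"
export OPENSHIFT_APP_NAME
env | grep '^OPENSHIFT_' | sort
```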

QuickStarts Alternative

As an alternative to the process I've outlined above, and to get up and running super fast on OpenShift, you can choose from a variety of QuickStarts; these are one-click app installs pre-configured with the cartridges you need. An eclectic list is offered, appealing to any designer or developer, including Ghost, WordPress, Ruby, JBoss and, of course, Drupal. In the Drupal category sits a Drupal 8 app ready to deploy.

With this method, you'll want to create symlinks from your repo to the themes and modules directories of Drupal. From what I can tell, this method is not meant for a ton of customization and is probably more akin to something like simplytest.me. @Srdjan_Popovic blogged about this method for a Drupal 7 QuickStart, and you can check out his post. What he outlines could be adapted for the Drupal 8 QuickStart as well. I tested Popovic's method out and it worked well.

Is PaaS the Future?

In the end, OpenShift seems to have a lot of promise, especially with a Drupal 8 beta, RC and final release not too far off. Compared to a cPanel hosting paradigm, OpenShift feels modern, fresh and like the way forward. It seems they've adapted to the needs of developers and the way in which developers now work.

  • Drupal
  • PaaS
  • OpenShift
  • Drupal Planet
  • DevOps
Categories: Elsewhere

Drupalize.Me: A Guide to Drupal 8 at DrupalCon Austin

Planet Drupal - Tue, 27/05/2014 - 15:05

DrupalCon Austin is rapidly approaching and the big question on my mind is: What Drupal 8 sessions do I need to put on my radar? To figure that out, I've mined the schedule for Drupal 8 related talks and events and organized them a bit to help me – and hopefully you – find the Drupal 8 sessions not to be missed.

Information Overload

The bottom line? There is a LOT of Drupal 8 content at DrupalCon Austin, as one would expect. So, what is my strategy for gleaning as much Drupal 8 information as possible for use in my work as a trainer? There are two facts that generally guide my strategy at DrupalCon:

  1. Sessions are recorded and archived on the Drupal Association's YouTube channel.
  2. BoFs are usually not recorded.

So how does this help me decide what to do with my time? I check the BoF whiteboard early and often, and if there's a BoF and a session at the same time that are equally interesting to me, I choose the BoF and make a note to watch the session recording later.

Categories: Elsewhere

ThinkShout: Chimpfestation, A Closer Look at the New MailChimp Module

Planet Drupal - Tue, 27/05/2014 - 15:00
Your Basic Monkey

A few weeks ago, we released the initial beta of the 3.x version of the MailChimp Module on Drupal.org. The third major revision of the MailChimp Module for Drupal 7 is actually the fifth major revision of the module, including two versions for Drupal 6. ThinkShout Partner Lev Tsypin rolled the first release in January of 2008, and the first version of the project page included a little information about his goals for the module:

Right now, I am focusing on 3 types of integration:

  1. Using hook_user to maintain a members list in MailChimp.
  2. Having an opt in field in the user profile which uses one of the MC merge fields to allow for segmenting the members into those who want to receive communications.
  3. Having an anonymous sign up form to enroll users in a general newsletter.

The module (and the project page!) have both come a long way since then, but the functionality described in that initial post has remained the core of the module through each version: anonymous signup forms and authenticated subscription control describe the core use cases that have resulted in over 15,000 installs. Sure, there's campaign integration, activity reporting, and all sorts of bells and whistles around list and subscription management, but anonymous signup forms and user-based subscription control have always been the bread and butter.

Identity Crisis

Building on the success of the MailChimp module, ThinkShout has made the contribution of robust, useful Drupal modules a core part of our business. In building Entity Registration, RedHen, Salesforce v3, Leaflet, and a bunch of other great modules, we've often leveraged Drupal 7's Entity and Field systems to make our tools as versatile and abstract as possible, allowing for any imaginable use-case.

We had a bit of a wake-up call when one of our favorite clients, The Salmon Project, asked us to integrate their fancy new RedHen CRM directly with MailChimp. Integrating RedHen Contact entities doesn't actually match up with either of those core use cases: anonymous signup forms or authenticated subscription control.

It was time to bring ThinkShout's signature versatility and abstraction to ThinkShout's signature module.

Monkeys Everywhere!

The first thing we did was de-couple the configuration of anonymous signup forms and authenticated subscription control. The MailChimp Lists configuration UI had grown into a bit of a monster: it included 16 separate options, not counting merge field sync settings, ranging from the submit button label (on the signup form) to the roles allowed (to access the list on user configuration pages). For version 3, rather than framing everything around each list, we broke things out by their Drupal-side functionality:

  1. The Signup Module was created for generating anonymous list signup forms.
  2. The List Module now provides a field type: "MailChimp Subscription", which, modeled on Entity Registration's successful architecture, leverages Drupal's Field API to allow any entity to become an independently-controlled MailChimp list subscriber.

What does this mean? If all you need to do is generate some anonymous subscription blocks or pages, the MailChimp Signup module has you covered. Just enable it, go to the "Signup Forms" tab in the MailChimp Admin UI, and create a signup! The UI lets you generate blocks or pages easily, include one or more lists on each form, pick which merge fields to include, and voila!

If, however, you want to subscribe some type of entity to a MailChimp list (like a user, say, or a RedHen contact), you can now do that lickety-split using Field UI:

This handy MailChimp Subscription field will insist on being tied to one of your MailChimp lists. Once that's done, you can configure instances of this field like you would any other Drupal field. It will automatically pull in the available merge fields and let you select which properties or fields from the entity you want to push into them. Want to default your entity to be subscribed to the list? Use Field UI's built-in configuration options. Use field display options to hide the field if you want to, or display it as a form right on the entity.
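If you'd rather attach the field in code than through Field UI, Drupal 7's Field API can do the same thing. This is an illustrative sketch only: the field type machine name and settings keys below are assumptions, so verify them against the module's own `hook_field_info()` implementation before relying on them.

```php
<?php
// Hedged sketch: attach a MailChimp Subscription field to the user entity.
// 'mailchimp_lists_subscription', 'mc_list_id' and the default_value shape
// are assumed names -- check the mailchimp_lists submodule for the real ones.
$field = array(
  'field_name' => 'field_newsletter',
  'type'       => 'mailchimp_lists_subscription',
  'settings'   => array(
    'mc_list_id' => 'YOUR_MAILCHIMP_LIST_ID', // ties the field to one list
  ),
);
field_create_field($field);

$instance = array(
  'field_name'  => 'field_newsletter',
  'entity_type' => 'user',
  'bundle'      => 'user',
  'label'       => 'Newsletter subscription',
  // Default new accounts to subscribed (assumed value structure).
  'default_value' => array(array('subscribe' => 1)),
);
field_create_instance($instance);
```

`field_create_field()` and `field_create_instance()` are the standard Drupal 7 Field API calls; only the MailChimp-specific keys are guesses.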

Do you want to get the old role-based subscription behavior? Easily done with a field on your user bundle and a simple rule or two! We've included the custom rules actions you need, and there's even an example rule in the README file in the MailChimp Lists submodule.

What this all boils down to is do what you want! You can MailChimp-ify any entity on your site with an email address in under 5 minutes. So go ape!

Peeling Away Campaign Complexity

ThinkShouter Dan Ruscoe brought huge improvements to the Campaign module, including the ability to send to list segments directly from within Drupal, along with some awesome UI improvements. We have long offered the ability to pull site content into campaigns, but you had to come up with the exact token for the content on your own: not the simplest task, especially if a non-developer is creating your campaigns.

Now? A simple drop-down interface generates the token for you. Create a view mode for your entity types specifically for use in campaigns, or re-use an existing view mode. Just select your content type, the view mode, and search by title, and the module generates the token. Pop it into your campaign anywhere you want.

We also added a handy merge field key selector patterned after the Token UI.

Other Evolutions

We didn't stop with fancy configuration options. Heck, we didn't start with fancy configuration options. The goofs at MailChimp HQ released the 2.0 version of their API, and we wouldn't want you using that Late Pleistocene 1.x nonsense, so we re-wrote the entire core of the MailChimp Module to leverage the new API. While we were at it, we re-wrote the asynchronous functionality to make it much simpler and less error-prone. It may not be easy enough for a chimp to understand quite yet, but it's certainly more tolerant of a little monkeying.

Climb Aboard!

You can download the MailChimp Module 7.x-3.0 beta now. We're already using it on a few sites and it's working great. So give it a try and let us know what you think!

Categories: Elsewhere

