Feed aggregator

Web Wash: Build a Blog in Drupal 8: Content types and Fields

Planet Drupal - Tue, 29/09/2015 - 23:11

With Drupal 8 on the horizon, now is a good time to start using it. The best way to learn the new version is to build something with it.

Over the next few weeks, I'll be publishing a series of tutorials teaching you how to create a blog in Drupal 8. The aim of the series is to teach newcomers, as well as experienced site builders, how to create a blog website using Drupal 8.

Throughout each tutorial, major changes between Drupal 7 and 8 will be highlighted. Even if you know Drupal 7, follow along and you'll learn what's new in Drupal 8.

In this first tutorial, you'll learn how to create a Blog content type and how to add custom fields. You'll also learn about the Taxonomy system by creating your own vocabulary to categorize blog posts.

Categories: Elsewhere

OSTraining: Use Adminimal for a Responsive Admin in Drupal 7

Planet Drupal - Tue, 29/09/2015 - 23:01

Do you want many of the benefits of the new Drupal 8 admin area, without actually needing to upgrade? Check out the Adminimal theme.

Adminimal provides a responsive admin area for Drupal 7 sites.

Adminimal also has several usability improvements. For example, Adminimal has colored buttons for common tasks like Submit, Preview and Delete to make them easier to see. Adminimal also provides improved layouts for the Configuration and Modules screens.

In this video, taken from our Better Drupal Administration class, Robert Ring introduces you to Adminimal:

Categories: Elsewhere

Pantheon Blog: Drush and the Drupal Console with Drupal 8

Planet Drupal - Tue, 29/09/2015 - 22:05
With interest in Drupal 8 heating up, a lot of people are wondering whether they should be using Drush or the Drupal Console as their command line tool of choice. Drush is the longstanding heavyweight champion, widely used and supported in hundreds of contrib modules. Drupal Console sports the modern Symfony Console component, which provides a new object-oriented interface for command line tools, exposing the power of the Symfony framework to script developers. Which is better for Drupal 8? (Disclaimer: I am a maintainer of Drush, but the answer may surprise you!)
Categories: Elsewhere

ThinkShout: Free Salesforce and RedHen Trainings at BADCamp

Planet Drupal - Tue, 29/09/2015 - 18:00

This year marks a particularly exciting BADCamp for us. On Friday, October 23, we’ll be leading two official half-day trainings on Salesforce and RedHen CRM!

We talk a lot about Salesforce and RedHen on our blog, which is inevitable given how much time we spend working with both. Given that the Drupal Salesforce Suite we wrote and maintain powers over 1,200 sites, we decided we should talk about it more! Lev Tsypin, Tauno Hogue, Greg Boggs and I will be your trainers for the day.

RedHen CRM: Exploring CRM Solutions That Extend Drupal

This half-day workshop will begin with a detailed review of the modules that make up the “RedHen CRM ecosystem.” We’ll then show you how to leverage RedHen for building advanced CRM-driven solutions. In this hands-on training, we will help you configure and customize your own RedHen instance, and we will teach you best practices for building solutions on top of this module suite. We’ll cover tools in the RedHen ecosystem, including memberships, donations, campaigns, engagements, deduping contacts, and integrating RedHen with other Drupal standard-bearers such as Views, Rules, Context, etc.

Integrating Drupal and Salesforce: The World’s Most Flexible CMS Meets the Most Powerful CRM

After lunch, the latter half of our workshop block will start with the basics of configuring mappings between Drupal entities and Salesforce objects and leveraging typical Drupal development workflows. Then we'll dive deeper into complex scenarios including managing large datasets, customizing behaviors as objects are synced, and managing errors.

If you’ve ever wanted to talk to us in person about our work, or about how to better leverage these CRMs to meet your or your clients’ needs, now’s your chance. These trainings are FREE (it’s $20 to reserve a seat, but that $20 will be refunded) and space is limited, so register early!

Categories: Elsewhere

Dariusz Dwornikowski: Delete until signature in vim

Planet Debian - Tue, 29/09/2015 - 16:13

It has been bugging me for a while. When responding to an email, you often want to delete all the content (or part of the previous content) until the end of the email's body. However, it would be nice to leave your signature in place. For that I came up with this nifty little vim trick:

nnoremap <silent> <leader>gr <Esc>d/--\_.*Dariusz<CR>:nohl<CR>O

Assuming that your signature starts with -- and the following line starts with your name (in my case, Dariusz), this will delete all the content from the current line down to the signature. It then removes the search highlighting and finally opens a new line above the signature, ready for typing.
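A slightly more generic variant (my own sketch, not from the original post) matches only the standard `-- ` signature delimiter, so it works regardless of what name follows it:

```vim
" Delete from the current line down to the signature delimiter line ('--',
" optionally with trailing whitespace), clear search highlighting, then
" open a new line above the signature in insert mode.
nnoremap <silent> <leader>gr <Esc>d/^--\s*$<CR>:nohl<CR>O
```

This trades precision for portability: if your email quotes happen to contain a bare `--` line, the original name-anchored pattern is safer.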

Categories: Elsewhere

OpenLucius: Headless Drupal & Node.js | Part 2/3: Node.js

Planet Drupal - Tue, 29/09/2015 - 14:48

This is part two of the series “Headless Drupal & Node.js”, for part one, click here. In this blog I will give you a 'Hello Node' introduction. 

About Node.js

Node.js uses complex techniques and can be confusing to work with; it is therefore not suitable for the novice web developer.

Categories: Elsewhere

Paul Johnson: Drupal 8 hall of fame

Planet Drupal - Tue, 29/09/2015 - 14:29

At DrupalCon I arranged for a nine-square-metre floor decal celebrating the top 1000 contributors to Drupal 8. Many of you asked for access to the original artwork, including many who could not be at the conference, so I have prepared this page for you!

It was such a delight to witness so many Drupalists photographing their own and their friends' names with a great sense of pride, and sharing them on social media. The larger a name is, the more that person has contributed. But honestly, if your name is there, bravo to you!

The tag cloud below is clickable, so you can reach each person's drupal.org account. There is also a PDF version attached so you can print it if you like.

I used DrupalCores.com for source data and Tagul.com to produce the artwork. I hope you enjoyed seeing it and look forward to seeing you at future Drupal events.

File:  Drupal8.pdf
Categories: Elsewhere

Amazee Labs: DrupalCon Barcelona - Recap

Planet Drupal - Tue, 29/09/2015 - 12:44
By Josef Dabernig

DrupalCon is not only about sessions, though they are a big part of the conference. Up to 10 presentations at the same time ensure that there is quality content for any audience. At the same time, many things happen alongside the sessions.

Some of the side activities might be really familiar to anyone who has attended a Drupal event; some might be hidden gems that I would definitely recommend checking out.

Drupal is all about the people behind the great software we are using. Let's find out together what happens during DrupalCon.

Before the actual conference started, we did the second Tour de Drupal. This time, we were a much smaller team. Christian and I started in Andorra on Friday, cycling over the Pas de la Casa, across the French border, and back to Puigcerda, Spain. The next day we took a train to Vic and biked over some nice hills down to Blanes at sea level.

Finally, on Sunday, the Tour de Drupal crew was complete when Gaele, along with Martin, joined us from his two-week cycling trip for the final lap from Blanes to Barcelona along the beautiful beaches.

The local Spanish community and other conference attendees welcomed the Tour de Drupal team. At a beach bar next to the conference centre, we got to see some nice fireworks from the city centre. Pictures from Tour de Drupal Barcelona 2015 are available here.

We got to the conference on Monday, where community members were already working on fixing the last critical bugs for the upcoming Drupal 8 release during the extended sprints. There are always a few people taking pictures, including, for example, Paul Johnson. We were also glad to see Boris Baldinger, a former Amazee Labs colleague, join us for DrupalCon as part of his new business as a full-time photographer.

Mondays at DrupalCon are often underestimated as just a day of arrival, attending trainings, or participating in the business summit. But besides that, there is a room full of sprinters, and there was also a community kick-off event. People interested in the inner workings of the Drupal community came together to discuss internal topics such as event organization and best practices.

On the ground floor, companies were busy preparing their sponsor booths in the exhibit hall. We at Amazee Labs traditionally see DrupalCons as a big investment; we are sponsors and put a lot of work into our booth. This way we can support the Drupal community by facilitating such an important event, represent our brand across the community, and provide visitors and employees a comfortable area to discuss business and hang out.

The evenings and nights during DrupalCon are packed with social events where community members gather to chat, drink, or eat together in a more relaxed atmosphere.

Tuesday morning, just before the keynote by Drupal founder Dries Buytaert (aka the "Driesnote"), Robert Douglass, Jeffrey "Jam" McGuire, and a team of creative community members presented the "Prenote". Each DrupalCon, they come up with a great show explaining Drupal to newcomers and sharing regional fun facts about the host city or country.

After each keynote there is a moderated Q&A in which Mike Anello asks questions collected via Twitter, which provides a great way to get instant feedback from the audience on the presented topics. Check out the hashtags #driesnote, #DCNahai, #DCRozas, #DCBell for more info.

Just after the first keynote, all conference attendees gather outside for a big group picture. This time, more than 2000 folks interested in Drupal joined in. That's Diana up there at the top, the DrupalCon production lead, strapped in a safety harness to get the shot!

Drupal relies heavily on contributions by individuals who invest a lot of time into making the system better. Drupal 8 has an incredible number of active contributors: more than 3,000. In the sponsor hall, the Drupal 8 Contributors Hall of Fame, contributed by CTI Digital, visualized all their names as a floor graphic.

Tuesday evening, the local Drupal association threw a great party at the beach with live music allowing the diverse crowd to connect with each other in an open, outdoor environment. 

Alongside sessions and workshops, many interviews were held, capturing voices from influential community members about the current state of Drupal, and their experiences doing business and working with the community.

To compensate for heavy coding sessions and deep technical discussions, Drupalists also hang out with each other and just have a good time at the beach, swimming or enjoying the ocean breeze.

While most of the sessions are related to technical topics, there is also another track that I find really interesting. In Core Conversations, we discuss how to improve our processes, what works well, and what needs to be fixed in order to work well together. In the picture above, you can see YesCT, kgoel, bfr, and alimac in their session Paid contribution: past, present, and future.

Drupal core development is a constantly evolving process. In the Drupal 8 release cycle, initiatives were introduced to break down the complexity of tasks into different areas. Now, with the Drupal 8 release coming up soon, Dries and the team of core committers took the chance to do a retrospective on what went well and what needs improvement: Drupal 8 retrospective with Dries.

In the closing session, the next big DrupalCon events are announced. Besides Frontend United Ghent (May 27-28, 2016) and Drupal Dev Days Milano (June 2016), DrupalCon Asia (Feb 18-21, 2016), DrupalCon New Orleans (May 09-13, 2016) and DrupalCon Dublin (September 26-30, 2016) were announced. In the picture above you can see the enthusiastic Indian community promoting their local event.

On Thursday evening, "Trivia Night" was on! A fun Irish-style pub quiz with questions on Drupal and picture puzzles. 

Conference attendees from various countries and continents celebrated the game together.

Friday is the official sprint day of DrupalCon. The entire day is dedicated to sprints and workshops that allow contributors to improve Drupal core and contributed modules. Our successful mentoring system ensures that new contributors are properly onboarded to Drupal's contribution systems and processes.

A great collaborative effort is being made to facilitate moving Drupal forward, while at the same time providing free training for anyone interested in learning new systems first-hand from the experts in Drupal.

There were three rooms full of contributors: one with a First-Time Sprinter Workshop, a second one hosting a Mentored Core Sprint, and a third one where contributors worked in a self-organized way on different initiatives per table. As part of the #d8rules initiative, I led a sprint table for porting the Rules module to Drupal 8.

There is a lot going on during DrupalCon. Thanks to everyone for organizing and making DrupalCon such a multifaceted event!

More photos can be found on the Amazee Labs flickr account:

Categories: Elsewhere

Darryl Norris's Blog: Light Skeleton - A Simple Theme (MVP)

Planet Drupal - Tue, 29/09/2015 - 12:41

Light Skeleton is a theme based on Skeleton v2. The goal of this project is to stay as close to Skeleton v2 as possible within a Drupal theme. Light Skeleton is a very lightweight theme that does not require any compiling and provides out-of-the-box styling, without the need for a large UI framework.
  • A new truly responsive grid based on percentages
  • Mobile first media queries
  • New typeface Raleway as default
  • Generally simpler style
  • More robust forms (especially in relation to grid)
  • Inclusion of basic table styling
  • Inclusion of super basic code styling
Similar projects: there are a couple of contrib themes based on Skeleton; however, none of them are based on the latest version of...Read more
Categories: Elsewhere

InternetDevels: How Responsive Web Design Works — Presentation

Planet Drupal - Tue, 29/09/2015 - 12:09

If you already know the name Ethan Marcotte and the term “responsive design”, you have chosen the right direction. If not, this direction is all the more necessary for you. Let’s take a look at the brief history of responsive design.

Read more
Categories: Elsewhere

Norbert Preining: Multi-boot stick update: TAILS 1.6, SysresCD 4.6.0, GParted 0.23, Debian 8.2

Planet Debian - Tue, 29/09/2015 - 10:57

Updates for my multi-boot/multi-purpose USB stick: All components have been updated to the latest versions and I have confirmed that all of them still boot properly – although changes in the grub.cfg file are necessary. So going through these explanations one will end up with a usable USB stick that can boot you into TAILS, System Rescue CD, GNU Parted Live CD, GRML, and can also boot into an installation of Debian 8.2 Jessie. All this while still being able to use the USB stick as normal media.

Since there have been a lot of updates, and also changes in the setup and grub config file, I include the full procedure here, that is, merging and updating these previous posts: USB stick with Tails and SystemRescueCD, Tails 1.2.1, Debian jessie installer, System Rescue CD on USB, USB stick update: TAILS 1.4, GParted 0.22, SysResCD 4.5.2, Debian Jessie, and USB stick update: Debian is back, plus GRML.

Let us repeat some things from the original post concerning the wishlist and the main players:

I have a long wishlist of items a boot stick should fulfill:

  • boots into Tails, SystemRescueCD, GParted, and GRML
  • boots on both EFI and legacy systems
  • uses the full size of the USB stick (user data!)
  • allows installation of Debian
  • if possible, preserve already present user data on the stick

You will need: a USB stick, the iso images of TAILS 1.6, SystemRescueCD 4.6.0, GParted Live CD 0.23.0, and GRML 2014.11, and some tool to access iso images, for example ISOmaster (often available from your friendly Linux distribution).

I assume that you already have a USB stick prepared as described previously. If this is not the case, please go there and follow the section on preparing your USB stick.

Three types of boot options

We will employ three different approaches to boot these systems: directly from an iso image (easiest, simple to update), via extraction of the necessary kernels and images (a bit painful, needs some handwork), and a mixture of the two, necessary to get Debian booting (most painful, needs additional downloads and handwork).

At the moment we have the following status with respect to boot methods:

  • Booting directly from ISO image: System Rescue CD, GNOME Parted Live CD, GRML
  • Extraction of kernels/images: TAILS
  • Mixture: Debian Jessie install
Booting from ISO image

Grub gained, quite some time ago, the ability to boot directly from an ISO image. In this case the iso image is mounted via loopback, and the kernel and initrd are specified relative to the iso image root. This makes it extremely easy to update the respective boot option: just drop the new iso image onto the USB stick and update the isofile setting. One could even use some -latest method, but I prefer to keep the exact name.

For SystemRescueCD, the GNOME Partition Live CD, and GRML, just drop the iso files into /boot/iso/, in my case /boot/iso/systemrescuecd-x86-4.6.0.iso and /boot/iso/gparted-live-0.23.0-1-i586.iso.

After that, entries like the following have to be added to grub.cfg. For the full list see grub.cfg:

submenu "System Rescue CD 4.6.0 (via ISO) ---> " {
  set isofile="/boot/iso/systemrescuecd-x86-4.6.0.iso"
  menuentry "SystemRescueCd (64bit, default boot options)" {
    set gfxpayload=keep
    loopback loop (hd0,1)$isofile
    linux (loop)/isolinux/rescue64 isoloop=$isofile
    initrd (loop)/isolinux/initram.igz
  }
  ...
}

submenu "GNU/Gnome Parted Live CD 0.23.0 (via ISO) ---> " {
  set isofile="/boot/iso/gparted-live-0.23.0-1-i586.iso"
  menuentry "GParted Live (Default settings)" {
    loopback loop (hd0,1)$isofile
    linux (loop)/live/vmlinuz boot=live union=overlay username=user config components quiet noswap noeject ip= net.ifnames=0 nosplash findiso=$isofile
    initrd (loop)/live/initrd.img
  }
  ...
}

submenu "GRML 2014.11 ---> " {
  menuentry "Grml Rescue System 64bit" {
    iso_path="/boot/iso/grml64-full_2014.11.iso"
    export iso_path
    loopback loop (hd0,1)$iso_path
    set root=(loop)
    kernelopts=" ssh=foobarbaz toram "
    export kernelopts
    configfile /boot/grub/loopback.cfg
  }
}

Note the added isoloop=$isofile and findiso=$isofile that helps the installer find the iso images.

Booting via extraction of kernels and images

This is a bit more tedious, but still not too bad.

Installation of TAILS files

Assuming you have access to the files on the TAILS CD via the directory ~/tails, execute the following commands:

mkdir -p /usbstick/boot/tails
cp -a ~/tails/live/* /usbstick/boot/tails/

The grub.cfg entries look now similar to the following:

submenu "TAILS Environment 1.6 ---> " {
  menuentry "Tails64 Live System" {
    linux /boot/tails/vmlinuz2 boot=live live-media-path=/boot/tails config live-media=removable nopersistent noprompt timezone=Etc/UTC block.events_dfl_poll_msecs=1000 splash noautologin module=Tails
    initrd /boot/tails/initrd2.img
  }
  ...
}

The important part here is the live-media-path=/boot/tails, otherwise TAILS will not find the correct files for booting. The rest of the information was extracted from the boot setup of TAILS itself.

Mixture of iso image and extraction – Debian jessie

As mentioned in the previous post, booting Debian/Jessie installation images via any method laid out above didn’t work, since the iso image is never found. It turned out that the current installer iso images do not contain the iso-scan package, which is responsible for searching for and loading iso images.

But with a small trick one can overcome this: One needs to replace the initrd that is on the ISO image with one that contains the iso-scan package. And we do not need to create these initrd by ourselves, but simply use the ones from hd-media type installer. I downloaded the following four gzipped initrds from one of the Debian mirrors: i386/initrd text mode, i386/initrd gui mode, amd64/initrd text mode, amd64/initrd gui mode, and put them into the USB stick’s boot/debian/install.386, boot/debian/install.386/gtk, boot/debian/install.amd, boot/debian/install.amd/gtk, respectively. Finally, I added entries similar to this one (rest see the grub.cfg file):
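The download-and-place step above can be sketched as shell commands. This is my own dry-run sketch, not from the original post: the mirror paths follow the standard Debian archive layout for hd-media installer images, but verify them against your mirror before relying on them. The script only echoes the wget commands; drop the echo to download for real, and point STICK at the mounted USB stick.

```shell
# Compute target directories and the four hd-media initrd URLs, then echo
# the wget commands (remove 'echo' to actually download).
MIRROR=http://ftp.debian.org/debian/dists/jessie/main
STICK=usbstick/boot/debian   # i.e. boot/debian on the mounted USB stick

mkdir -p "$STICK/install.386/gtk" "$STICK/install.amd/gtk"

for pair in 386:i386 amd:amd64; do
  dir=${pair%%:*} deb=${pair##*:}
  # "" = text-mode initrd, "gtk/" = graphical-installer initrd.
  for sub in "" "gtk/"; do
    echo wget -O "$STICK/install.$dir/${sub}initrd.gz" \
      "$MIRROR/installer-$deb/current/images/hd-media/${sub}initrd.gz"
  done
done
```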

submenu "Debian 8.2 Jessie NetInstall ---> " {
  set isofile="/boot/iso/firmware-8.2.0-amd64-i386-netinst.iso"
  menuentry '64 bit Install' {
    set background_color=black
    loopback loop (hd0,1)$isofile
    linux (loop)/install.amd/vmlinuz iso-scan/ask_second_pass=true iso-scan/filename=$isofile vga=788 -- quiet
    initrd /boot/debian/install.amd/initrd.gz
  }
  ...
}

Again an important point: don’t forget the two kernel command line options iso-scan/ask_second_pass=true iso-scan/filename=$isofile, otherwise the installer will probably have to scan all disks and drives completely, which might take ages.

Current status of USB stick

Just to make sure, at the current stage the USB stick should contain the following files:

/boot/
  iso/
    firmware-8.2.0-amd64-i386-netinst.iso
    gparted-live-0.23.0-1-i586.iso
    grml64-full_2014.11.iso
    systemrescuecd-x86-4.6.0.iso
  tails/
    vmlinuz
    Tails.module
    initrd.img
    ....
  grub/
    fonts/        (lots of files)
    locale/       (lots of files)
    x86_64-efi/   (lots of files)
    font.pf2
    grubenv
    grub.cfg      *this file we create in the next step!!*
/EFI/
  BOOT/
    BOOTX64.EFI

The Grub config file grub.cfg

The final step is to provide a grub config file in /usbstick/boot/grub/grub.cfg. I created one by looking at the isoboot.cfg files in the SystemRescueCD, TAILS, GParted, and Debian/Jessie iso images, and converting them to grub syntax. Excerpts have been shown above in the various sections.

I spare you all the details, grab a copy here: grub.cfg


That’s it. Now you can anonymously provide data about your evil government, rescue your friend’s computer, fix a forgotten Windows password, and above all, install a proper free operating system.

If you have any comments, improvements or suggestions, please drop me a comment. I hope this helps a few people getting a decent USB boot stick running.


Categories: Elsewhere

Erich Schubert: Ubuntu broke Java because of Unity

Planet Debian - Tue, 29/09/2015 - 10:45

Unity, that is, the Ubuntu user interface that nobody else uses.

Since it is a Ubuntu-only thing, few applications have native support for its OSX-style hipster "global" menus.

For Java, someone once wrote a hack called java-swing-ayatana, or "jayatana", that is preloaded into the JVM via the environment variable JAVA_TOOL_OPTIONS. The hack seems to be unmaintained now.

Unfortunately, this hack seems to be broken now (Google has thousands of problem reports), and causes a NullPointerException or similar crashes in many applications; likely due to a change in OpenJDK 8.

Now all Java Swing applications appear to be broken for Ubuntu users, if they have the jayatana package installed. Congratulations!

And of course, you see bug reports everywhere. Matlab seems to no longer work for some, NetBeans appears to have issues, and I got a number of bug reports on ELKI because of Ubuntu. Thank you, not.

Categories: Elsewhere

InternetDevels: How to start with Symfony2 framework: tutorial for beginners

Planet Drupal - Tue, 29/09/2015 - 10:27

If you want to begin studying Symfony web development, this Symfony2 framework tutorial by our developer could be very useful for you. However, if you want to hire a Symfony developer for building an amazing website, you’re always welcome!

Read more
Categories: Elsewhere

Chromatic: Programmatically Creating and Storing WordPress Migrate Migrations in Drupal

Planet Drupal - Tue, 29/09/2015 - 05:17

Migrations are never glamorous, but doing them right and verifying their integrity is essential to their success. The WordPress Migrate module gives you an easy turnkey solution to migrating content into Drupal from WordPress. It allows you to create each migration through an interactive admin form, allowing you to configure your migration entirely through the UI. This is great, but it does not make creating or storing the resulting migrations easy to manage across multiple environments, since the migrations are not defined in code like a typical Migrate class. Short of copying database tables or re-entering the configuration through the admin forms, developers are stuck with the migrations stored in a single database and thus it is not easy to move to other environments for testing or further development.

Copying data tables is almost always the wrong solution and manually re-entering all of the migrations would be way too time consuming, so our solution was to create the migrations programmatically. To do this, we hooked into the existing WordPress Migrate codebase and used its logic to build programmatically what it normally builds from data input to its admin forms. Then we are able to define all of our migration sources in code and instantly create all of our migrations in a new environment, or recreate them after something fails during development.

As mentioned, this solution relies upon programmatically submitting admin forms, which is often not an ideal scenario. Additionally, there is the almost inevitable request to add customizations beyond what WordPress Migrate supports out of the box. Sometimes this makes WordPress Migrate more of a hindrance than a help. So why not just create a custom Migrate class from the outset and avoid all of these issues? Here are some factors to consider:

  • Writing a custom Migrate class for your WordPress content always sounds more appealing until you run into problems and realize WordPress Migrate already solved those issues.
  • The WordPress Migrate module offers a lot of functionality, including file transfer, author migration, embedded video processing, internal link rewriting, comment migration, etc.
  • You might not need much custom code and just tweaking the WordPress Migrate functionality by extending one of its classes will easily do the trick.
  • You might not have the resources (time, knowledge, etc.) to write a custom Migrate class.
  • Running and testing the migrations on multiple environments might not be in your workflow, although I would argue it should be.
  • You might only have one or two WordPress sites to migrate content from, so manually re-creating them is not an issue.

If after weighing all of the factors, you decide using the WordPress Migrate module is in your best interest and manually recreating the migrations is not an option, then follow along as we walk you through our approach to creating and storing WordPress Migrate migrations programmatically.

Our Solution

First we need to define the list of source blogs. The keys of each item in this array can be added to as needed to override the default values we assign later.

/**
 * Define the WordPress blogs to be imported.
 */
function example_wordpress_migrate_wordpress_blogs() {
  // Any key not set here will default to the values set in the
  // $blog_default_settings variable in the drush command.
  $blogs = array(
    array(
      'domain' => 'www.example.com/site-one/',
    ),
    array(
      'domain' => 'www.example.com/site-two/',
    ),
    array(
      'domain' => 'www.test.com/',
    ),
  );
  return $blogs;
}

Next we’ll create a custom drush command so that we can easily trigger the creation of our migrations from the command line.

/**
 * Implements hook_drush_command().
 */
function example_wordpress_migrate_drush_command() {
  $items = array();
  // Creates WordPress migrations.
  $items['example-migrate-create-wordpress-migrations'] = array(
    'description' => 'Creates the WordPress migrations.',
    'aliases' => array('mcwm'),
  );
  return $items;
}

Be sure to note the example_migrate_wordpress_password variable below, as you will need to ensure you set that in settings.php before creating the migrations. The WordPress Migrate code needs to be able to login to your site to download the source XML file, and a password is paramount to the success of that operation!
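For reference, setting that variable in settings.php is a one-liner (a sketch; substitute your real WordPress admin password):

```php
// settings.php: the WordPress admin password wordpress_migrate uses to log
// in and download the WXR export. Keep this out of version control.
$conf['example_migrate_wordpress_password'] = 'wordpress-admin-password';
```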

/**
 * Callback for WordPress migration creation drush command.
 */
function drush_example_wordpress_migrate_create_wordpress_migrations() {
  // Reset the file_get_stream_wrappers static cache so the 'wordpress' stream
  // wrapper created by the wordpress_migrate module is available.
  $wrappers_storage = &drupal_static('file_get_stream_wrappers', NULL, TRUE);

  // The wordpress_migrate module's UI is a multi-step form that collects all
  // configuration needed to migrate a given blog. As this form's steps are
  // submitted and validated, an export file is downloaded for each blog and its
  // contents are migrated. There is no easy way to export these settings or use
  // code to provide that configuration and then trigger a migration, so the best
  // bet is to simulate the submission of those form steps with the needed data.
  module_load_include('inc', 'migrate_ui', 'migrate_ui.wizard');

  // Get a list of blogs to migrate.
  $blogs = example_migrate_wordpress_blogs();

  $blog_default_settings = array(
    'source_select' => '1',
    'domain' => '',
    'username' => 'admin',
    'password' => variable_get('example_migrate_wordpress_password', ''),
    'wxr_file' => NULL,
    'do_migration' => 0,
    'default_author' => 'admin',
    'page_type' => '',
    'blog_post_type' => 'story',
    'path_action' => 1,
    'tag_field' => '',
    'category_field' => '',
    'attachment_field' => '',
    'text_format' => 'filtered_html',
    'text_format_comment' => 'filtered_html',
  );

  // Import each of the blogs.
  foreach ($blogs as $blog_settings) {
    // Combine the default settings and the custom per blog settings.
    $blog_settings = array_merge($blog_default_settings, $blog_settings);

    // Skip the import if no username or password was found.
    if (empty($blog_settings['username']) || empty($blog_settings['password'])) {
      $message = 'The :site-name migration was not created since no username and/or password could be found. Verify that the example_migrate_wordpress_password variable has been set.';
      $replacements = array(
        ":site-name" => $blog_settings['domain'],
      );
      drupal_set_message(t($message, $replacements), 'warning');
      continue;
    }

    // Set the form state values.
    $form_state['values'] = $blog_settings;

    // Store the values so we can use them again since $form_state is
    // a reference variable.
    $form_state_values = $form_state['values'];

    // Build the import form.
    $form = drupal_get_form('migrate_ui_wizard', 'WordPressMigrateWizard');
    $form = migrate_ui_wizard($form, $form_state, 'WordPressMigrateWizard');

    // Create a Migrate Wizard object.
    $form_state['wizard'] = new WordPressMigrateWizard();

    // Set the number of steps in the form.
    $form_steps = 6;

    // Go through all of the steps, validating and submitting each form page.
    foreach (range(1, $form_steps) as $step) {
      migrate_ui_wizard_next_submit($form, $form_state);
      // Put any values removed from the array back in for the next step.
      $form_state['values'] = array_merge($form_state_values, $form_state['values']);
    }

    // Submit the form.
    drupal_form_submit('migrate_ui_wizard', $form_state);
    // Save the settings into the wizard object.

    // Notify the user that the migration was created successfully.
    $replacements = array(
      '@site-name' => $blog_settings['domain'],
    );
    $message = t('The @site-name migration was successfully created.', $replacements);
    drupal_set_message($message, 'success');
  }
}

With all of this in place, the source WordPress sites and the configuration needed to import them are now fully defined in code, along with a custom Drush command to create the required migrations. No longer does each individual site need to be re-entered through the UI, a process that invited mistakes and wasted time.

Now when you are in a new environment or after you reset your migrations, you can simply run drush mcwm.

When it completes successfully, the following is done for you:

  • A new Migrate group is created for each individual blog.
  • The actual Migrate classes within each group that migrate authors, content, terms, and attachments are created and configured as defined in the code.
  • The source WordPress XML file is downloaded for each site and stored in wordpress://.

Then simply run drush ms to verify everything was created successfully, and you are ready to migrate some content!

Now that you have the tools and knowledge to evaluate your unique migration needs, you can make a well-informed decision about whether this approach is right for you. More often than not, though, we think the incredible functionality you get pre-built with the WordPress Migrate module will outweigh the issues that arise from not being able to fully build and store your migrations in code, especially when you add the functionality outlined above, which gets you the best of both worlds. So have your cake and eat it too: define your migrations in code and use the WordPress Migrate module while you are at it!

If you decide to go this route, all of the code referenced here is available in this gist. Please note that this was all built for WordPress Migrate 7.x-2.3, so future updates to the module could break this functionality.

Categories: Elsewhere

OSTraining: New Video Class: Speeding up Joomla

Planet Drupal - Tue, 29/09/2015 - 03:22

If you run a Joomla site, then you really need this week's new video class from Rod Martin called "Speeding up Joomla".

Rod starts by showing that a normal Joomla site is not highly-optimized and then he takes you through 10 steps to improve your site speed.

First, you'll learn to use Google PageSpeed and YSlow to test your site. Then Rod shows you how to use caching, compression, .htaccess, CDNs and more. By the time you've finished this class, you'll have a blazing-fast Joomla site!

Categories: Elsewhere

Gbyte blog: How to use the Drupal 8 honeypot module efficiently

Planet Drupal - Tue, 29/09/2015 - 00:33
The Honeypot module is a great captcha alternative, as it keeps spam bots from submitting content while also saving your site visitors from having to type in mundane character combinations. Configured properly, it will prevent the majority of bots from submitting forms on your site, including registration forms, contact forms, comment forms, content forms... any Drupal forms.
Categories: Elsewhere

Gizra.com: Sticky Floors and Happy Kids

Planet Drupal - Mon, 28/09/2015 - 23:00

There's a saying "Good moms have sticky floors, messy kitchens, laundry piles, dirty ovens, and happy kids”. God knows I score an A+ and by that phrase alone I am THE perfect mother. But how do you leave those happy kids and get back to work?!

I have 3 kids (the most recent one is brand new - in fact, I'm writing this post while on maternity leave), whom I absolutely adore and admire with every breath they take, a husband that I'm (still, after so many years together) crazy about and, lucky me, I love my job and co-workers.

I must be the fortunate one to hit the jackpot, as most people I know are not even satisfied with one thing, let alone three!

I should know. I was "most people". A nine-to-five office monkey, working tons of overtime and weekends, trying to get to the top. I was carefree at that time (for the singles in the crowd: this means no children) and my main concerns were questions such as: what's the best party to attend on Thursday night? Which movie to go to on Friday night? What to eat, where? You get my point.

So I met my husband; then came the kids; and then there was Gizra.

Gizra is known for our up-front, straightforward attitude, and as such I want to be very honest - on the verge of blunt - and share with you some thoughts. I am a working mother. A wife. A person. Not in that specific order, but that sounds just about right to me, as I feel like the "person" part kicks in only after the kids are in bed.

I've been asking myself some questions lately:

  • I want to be at the office while I am on my maternity leave. Does that make me a bad mother?
  • While I am at work, I miss my kids and my husband and want to be with them instead of being at the office. Does that make me a bad employee?
  • Sometimes I cannot stand the sound of my own name when one of my employees calls me, wants something from me. Does that make me a bad manager?
  • I find myself often screening my friends' calls and avoiding meeting them (sometimes it involves telling a little white lie) because I just wanna stay home. Does that make me a bad friend?
  • Sometimes all I want is just to be left alone for a while. No kids shouting M-O-M-M-Y, no employees telling me they need something from me. Does that make me a bad person?

Is there a way to keep everyone happy? Your kids, your bosses, your employees, your friends, your husband, and last but not least - even yourself?
I just know I love my job; love my husband; love my kids (definitely not in that specific order).

Truth be told, I don't have all the answers, far from it, but I'd like to share some tips on how you can make family and work fit together (just a little bit) better:

  • Working nine to five? - Not necessarily! Try to be flexible in terms of working hours. You are killing two birds with one stone: arriving early means you get to leave early - just in time to get your kids out of daycare.
  • Safety net - There's nothing like family support, if you're fortunate enough to have family who are able to help you. Play nice with your mother-in-law, your brother, your distant cousin, etc.
  • Don't be a stranger - There's a saying: "Better a nearby neighbor than a faraway brother". Get to know your neighbors just a little bit better; you might find it useful one day.
  • Don't be sociopaths - Everybody knows that the worst thing in the world is to be in a kindergarten's WhatsApp group! I'd be the first to admit it - once I've been added to one, I curse like hell and immediately silence it for a year! Who wants to get messages announcing the appearance of lice, or which kid lost his coat or a toy? However, it may save your ass if you're stuck in traffic or in an important meeting and need another mom or dad to take your kids out of kindergarten/school.
  • Playdates - Try to convince your kids to go to a playdate, preferably at another mom's house! That way you have some peace and quiet to finish tasks you were supposed to finish a long time ago, or to polish an important presentation.

Continue reading…

Categories: Elsewhere

Antoine Beaupré: Fun experiments with laptop battery

Planet Debian - Mon, 28/09/2015 - 19:45

After reading an eye-opening blog post from Petter Reinholdtsen about laptop batteries, or more specifically lithium-ion laptop batteries, I figured I needed to try out "TLP Linux Advanced Power Management", which I had been keeping an eye on for a while. tlp is yet another tool to control power usage on laptops (mostly Thinkpads; mine is an X120e). The novelty of tlp is the "hands off" approach: everything should be automatically configured for you...

Obviously, that means I then spent a few hours breaking and fixing my laptop through random operations. I opened a bunch of pull requests on the interesting battery status package that Petter produced, to make it work with my setup and make it display graphs directly (instead of into a file). I also rewrote the graphing tool in Python with SciPy in order to have cleaner labels and to be able to deduce the date at which a battery would become completely unusable because it can't recharge high enough. (At the time of writing, the battery's estimated death date is December 7th, but that data is skewed by a quick change in the battery charge after the BIOS upgrade, below.)
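That extrapolation is simple enough to sketch without the full script. Below is a minimal, hypothetical version using NumPy's polyfit rather than SciPy; the readings, the 50% "unusable" floor, and the function itself are my own assumptions, not Petter's code:

```python
from datetime import date, timedelta

import numpy as np

def estimated_death_date(days, capacity_pct, start, floor=50.0):
    """Fit a line to capacity (%) readings taken `days` after `start` and
    return the date the trend crosses `floor`, or None if not declining."""
    slope, intercept = np.polyfit(days, capacity_pct, 1)
    if slope >= 0:
        return None  # capacity flat or rising: no death date to predict
    crossing = (floor - intercept) / slope  # days after `start`
    return start + timedelta(days=round(float(crossing)))

# Hypothetical readings: full capacity at day 0, losing ~0.5% per day.
days = [0, 10, 20, 30]
caps = [100.0, 95.0, 90.0, 85.0]
print(estimated_death_date(days, caps, date(2015, 9, 1)))  # → 2015-12-10
```

A linear fit is obviously a crude model of battery wear, but it is enough to produce the kind of rough "death date" estimate mentioned above.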

I then went on to try to limit my laptop charging to 80%, since this seems to make the battery last longer (sources from Petter: 1, 2, 3). Unfortunately, even after building a local (and trivial) backport of the tlp package to Debian stable (8.2/Jessie), I still couldn't access those controls, as TLP is really just a set of shell scripts that glue a bunch of stuff together.

The backport was simply:

apt-get source tlp
cd tlp*/
debuild
sudo dpkg -i ../tlp*.deb

I read here and there (and in Petter's post) that I needed the tp-smapi-dkms package, so I went ahead an installed it:

sudo apt install tp-smapi-dkms

(Yes, Jessie has a neat apt command now, it's great, upgrade now.)

Unfortunately, this still didn't work. I think the error back then was something like:

thinkpad_ec: thinkpad_ec_request_row: arg0 rejected: (0x01:0x00)->0x00
thinkpad_ec: thinkpad_ec_read_row: failed requesting row: (0x01:0x00)->0xfffffffb
thinkpad_ec: initial ec test failed

I have seen suggestions here and there to try the acpi-call-dkms package, but that was useless as it doesn't support my model (but may work with others!). The error there was:

acpi_call: Cannot get handle: Error: AE_NOT_FOUND

Note: I still have it installed - it's unclear what impact it has, and I do not want to break my current setup.

So I then started to look at upgrading my BIOS, for some reason. I was running version 1.13 (8FET29WW) from 05/06/2011. I was able to update to 1.17 (8FET33WW) from 11/07/2012, using the memdisk binary from the syslinux-common package, with some help from the quite useful grub-imageboot package:

sudo apt install syslinux-common grub-imageboot
wget https://download.lenovo.com/ibmdl/pub/pc/pccbbs/mobiles/8fuj10uc.iso
sudo mkdir /boot/images
sudo mv 8fuj10uc.iso /boot/images
sudo reboot

I found the image on the Thinkpad X120e support page from Lenovo (which happily bounces around, so don't rely on the above URLs too much). When I rebooted, grub offered to boot from the image, which went fine, considering it was running some version of DOS - which is always a bit scary, given that it is software almost as old as me.

I wish I could have installed some free software in the BIOS instead of the outdated crap that Lenovo provides, but unfortunately, it seems this will never be possible with Libreboot or Coreboot, mostly because Intel is evil and installs backdoors in all their computers. Fun times.

Fortunately and surprisingly, the update worked and went on pretty smoothly. After that, I was able to set the charge limit with:

echo 80 | sudo tee /sys/devices/platform/smapi/BAT0/stop_charge_thresh
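Since it is easy to fat-finger a value into that sysfs node, a small wrapper that sanity-checks the percentage before writing is handy. This is purely my own sketch (not part of tlp or tp-smapi), and writing the real node still requires root:

```python
from pathlib import Path

# tp-smapi node used above; other drivers/kernels expose different paths.
SMAPI_NODE = "/sys/devices/platform/smapi/BAT0/stop_charge_thresh"

def set_stop_threshold(percent, node=SMAPI_NODE):
    """Write a stop-charge threshold after checking it is a sane percentage."""
    if not isinstance(percent, int) or not 1 <= percent <= 100:
        raise ValueError("threshold must be an integer in 1..100, got %r" % (percent,))
    Path(node).write_text("%d\n" % percent)
```

Pointing `node` at a temporary file lets you exercise the function without root.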

Amazing! I had almost forgotten why I had nearly bricked my system on a Thursday; good thing that worked! Now the fun part was that, after some reboots or something I did - I am not sure what - the above stopped working: I couldn't load the drivers at all anymore, dmesg treating me to a nasty:

thinkpad_ec: thinkpad_ec_read_row: failed requesting row: (0x01:0x00)->0xfffffff0
thinkpad_ec: initial ec test failed

Now that is some obscure error message material! Fun stuff. I tried uninstalling tlp, the smapi modules, the acpi-call modules, rebooting, turning the machine off, removing the battery - nothing worked. Even more hilarious, the charge controller was now stuck at 80%: I had artificially destroyed 20% of the battery capacity in software. Ouch.

I think this may have been related to uninstalling the tp-smapi-dkms package without unloading it at first. I found some weird entries in my kern.log like this:

tp_smapi unloaded.
thinkpad_ec: thinkpad_ec_read_row: failed requesting row: (0x14:0x00)->0xfffffff0
hdaps: cannot power off
hdaps: driver unloaded.
thinkpad_ec: unloaded.

I think that after that point, I couldn't load either module, not even thinkpad_ec...

After tearing out a few more hairs and hammering my head on the keyboard randomly, I thought I could just try another BIOS upgrade, for the fun of it. It turns out you can't actually rerun the upgrade, but you can change the model number through the same software, and this seems to reset some stuff. So I went back into the ISO image I had loaded earlier and changed the model number (actually setting it to the same value, but whatever - it still ran the update). This seems to have reset a bunch of stuff, and now everything works: I can use tlp setcharge and all the neat tools work well.

The two key commands are:

# limit charging to 80% of the battery, but not lower than 40%
sudo tlp setcharge 40 80
# clear the above setting and just charge the battery to 100%
sudo tlp fullcharge

It seems that the 40% bit isn't supported by my laptop, but whatever: the battery stays charged when on AC power anyways, so I don't really understand what the setting is for in the first place. The error there is:

smapi smapi: smapi_request: SMAPI error: Function is not supported by SMAPI BIOS (func=2116)
smapi smapi: __get_real_thresh: cannot get start_thresh of bat=0: Function is not supported by SMAPI BIOS

The second command is what I will need to remember to run before I unplug the laptop for a trip. I suspect this will be really annoying, and I may end up disabling all this stuff and just yanking the power cable out by hand when the battery reaches 80%, when I need to.

But it was a fun bit of geeking out, and hopefully this will be useful for others. And of course, the graphs from Petter will be interesting in a few months... Before the BIOS upgrade, the battery capacity was reported as 100% (actually 100.03%, which was strange). Now the capacity is at 98.09%, which is probably just a more accurate reading thanks to the BIOS upgrade.
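Those capacity percentages are just the battery's last full charge divided by its design capacity, both of which the kernel exposes under /sys/class/power_supply. A small sketch of my own (assuming the standard energy_* or charge_* attributes, whichever the battery reports):

```python
from pathlib import Path

def battery_health(base="/sys/class/power_supply/BAT0"):
    """Return last full charge as a percentage of design capacity.

    Batteries report either energy_* or charge_* attributes, so try both.
    """
    p = Path(base)
    for prefix in ("energy", "charge"):
        full = p / ("%s_full" % prefix)
        design = p / ("%s_full_design" % prefix)
        if full.exists() and design.exists():
            return 100.0 * int(full.read_text()) / int(design.read_text())
    raise FileNotFoundError("no capacity attributes under %s" % base)
```

Run against a real BAT0 directory, this is the same percentage that the battery status graphs plot over time.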

Finally, also see the useful ThinkWiki troubleshooting page, and especially their interesting BIOS upgrade documentation, which inspired me to write my own version. I would have gladly contributed to theirs, but I seem to have lost my password on that site, with no recovery possible... The Arch Linux wiki obviously has excellent documentation as well, as usual.

Categories: Elsewhere


Subscribe to jfhovinne aggregator