Elsewhere

Wuinfo: How We Came Up With Our YouTube Channel Videos Sync Module

Planet Drupal - Sat, 04/07/2015 - 14:07

Recently, I was working with a team on a project for Country Music Television (CMT) at Corus Entertainment. We needed to synchronize videos from two different sources, one of them being YouTube. CMT pulls from over 200 YouTube channels, and we needed to import videos from all of those channels regularly so that videos published on YouTube become available on CMT automatically, without in-house editors spending time uploading them.

We store videos in file entities. The videos come from different sources, but all imported videos behave the same across the site; only the MIME type differs between videos from different sources. Each imported video is a file entity on CMT, and for the front end we built views pages and blocks based on those video file entities.

We also import videos from another source, MPX thePlatform, using the "Media: thePlatform mpx" module as the main importer and updater. To handle our customized video fields, we contributed the "Media: thePlatform MPX entity fields sync" module to work alongside it.

After finishing the MPX thePlatform work, we had some experience handling video imports. So how would we import YouTube videos from more than 200 channels?

At first, we planned to use the Feeds module, but we eventually gave up on making it work that way. Google had just deprecated YouTube API v2.0, so the old RSS channel feeds no longer worked. Thanks to the community, we found the feeds_youtube module, whose 7.x-3.x branch works with the latest YouTube API v3.0; that gave us a Feeds parser. We still needed a Feeds processor, and thanks to the community again we found the feeds_files module. We installed those modules and their dependencies and spent a couple of days configuring Feeds. It did work. In the end, though, we decided to build a lightweight custom module that does everything itself, from downloading the YouTube JSON data to creating local file entities. Each video imported from a channel links back to an artist or a hosting show.

What did we want from the module? We wanted it to check multiple YouTube channels and, whenever a new video is uploaded to a channel, create an associated file entity on CMT. In the created entity we save some of the YouTube metadata to entity fields, along with local metadata such as show and artist information. We wanted the module to handle over 200 channels, and possibly thousands in the future, and to handle the work gracefully without burning out the system when importing thousands of videos. It sounds intimidating, but it really isn't.

We ended up building a module and contributing it to Drupal.org: YouTube Channel Videos Sync V3. Here is how we came up with it.

First of all, we gathered a list of channel names, along with the relevant local metadata such as artist or show node IDs. Then we send a request to YouTube to get the list of videos for each channel. For each video, we create a system queue task containing the video's data and the local metadata. Finally, a queue processor creates the video file entities one by one. See the diagram below for the complete process.
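In Drupal 7 terms, that flow maps naturally onto the queue API. The sketch below only illustrates the pattern; it is not the module's actual code, and the function and queue names are hypothetical.

<?php
/**
 * Implements hook_cron().
 *
 * Fetches the video list for each configured channel and defers the heavy
 * work by creating one queue item per video.
 */
function mymodule_cron() {
  $queue = DrupalQueue::get('mymodule_youtube_videos');
  // mymodule_get_channels(): hypothetical helper returning
  // channel name => local metadata (artist/show node IDs).
  foreach (mymodule_get_channels() as $channel_name => $local_metadata) {
    // mymodule_fetch_channel_videos(): hypothetical helper calling the
    // YouTube Data API v3 and decoding the JSON response.
    foreach (mymodule_fetch_channel_videos($channel_name) as $video) {
      $queue->createItem(array(
        'video' => $video,
        'local' => $local_metadata,
      ));
    }
  }
}

/**
 * Implements hook_cron_queue_info().
 */
function mymodule_cron_queue_info() {
  return array(
    'mymodule_youtube_videos' => array(
      // The worker creates one file entity per queued video.
      'worker callback' => 'mymodule_create_video_file_entity',
      // Cap the time spent per cron run so large imports don't exhaust PHP.
      'time' => 60,
    ),
  );
}
?>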

A little more technical detail below:

How do we get the list of YouTube channel names and the local metadata for each channel? On the module configuration page, we set the fields used to store the channel name. On CMT, that field exists on both show and artist nodes. The module walks through those nodes and builds an array of channel names with the node IDs and their content type, like this:

  array(
    'channel name' => array(
      'field_artist_referenced' => array('nid', ...),
    ),
  );

How do we map YouTube metadata into entity fields? On the module configuration page, we can set up a mapping between each YouTube metadata field and a local entity field.

How do we set the local artist and show information on the imported videos? The module exposes a hook that a custom module can implement to provide that information. In the array above, "field_artist_referenced" is the machine name of a field on the video entity, and array('nid') holds the value for that field. This gives the imported video an entity reference pointing back to the artist or show node, which is one of the cleanest ways to set up the relationship.
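As a rough illustration of such a hook implementation (the hook name, signature, and helper below are hypothetical; check the module's documentation for the real API):

<?php
/**
 * Example implementation of the (hypothetical) metadata hook.
 *
 * Returns local field values for videos imported from a given channel.
 */
function mymodule_youtube_channel_video_fields($channel_name) {
  // mymodule_lookup_artist_nid(): hypothetical helper mapping a channel
  // name back to the artist node that owns it.
  $nid = mymodule_lookup_artist_nid($channel_name);
  // Field machine name on the video file entity => value(s) to set.
  return array(
    'field_artist_referenced' => array($nid),
  );
}
?>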

That is the overall process the module follows to import thousands of videos from YouTube.

Categories: Elsewhere

Guido Günther: Debian work in June 2015

Planet Debian - Sat, 04/07/2015 - 12:46

June was the second month I contributed to Debian LTS under the Freexian umbrella. In total I spent ten hours working on:

Besides that, I did CVE triaging of 17 CVEs to check if and how they affect oldoldstable security. The information the security team provides on these issues in data/CVE/list is an awesome help here, so I tried to be just as verbose when triaging CVEs that hadn't yet been looked at for Wheezy or Jessie.

On non-LTS time, I patched our lts-cve-triage tool to allow skipping packages that are already in dla-needed.txt. This avoids wasting time on CVEs that were already triaged.

Categories: Elsewhere

Ryan Szrama: Publishing Content on Relative Dates Using Rules in Drupal 7

Planet Drupal - Sat, 04/07/2015 - 05:50

A friend of mine is a DUI lawyer who uses a Drupal site for content marketing and lead generation, managed by my friends at EverConvert. They quickly identified the weekend as the best time for the website to directly appeal to visitors who found the site after a Friday or Saturday night arrest. In addition to live chat, they decided to alter the site's appearance by publishing content only during these times that contains large, direct calls to action.

Fortunately, Drupal provides the tools to build this with nary a line of code.

Introducing the Rules Scheduler

The primary module you'll use to create something similar is Rules with its Rules Scheduler sub-module. Most purpose-built content scheduling modules I've seen let you set absolute dates for pieces of content to be published or unpublished. However, you wouldn't want to have to enter every single weekend date to get that content published at the right times. Fortunately, Rules Scheduler allows you to schedule arbitrary actions using relative date strings (in addition to any other format supported by strtotime()).
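As a quick aside, those relative strings are the ones PHP's strtotime() understands, so you can sanity-check a schedule string directly in plain PHP before wiring it into a rule:

<?php
// Plain PHP, just to show how the relative strings resolve.
print date('Y-m-d H:i', strtotime('next friday')) . "\n";  // Upcoming Friday, 00:00.
print date('Y-m-d H:i', strtotime('next monday')) . "\n";  // Upcoming Monday, 00:00.
?>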

Before we dive into the configuration I proposed to them, you should understand that the Rules module basically adds a GUI-based scripting language to the back end of Drupal. In addition to configuring actions to be performed after certain events when a set of conditions are met, you can create Rules components that are essentially subroutines that can be directly invoked by Rules (or code) without being triggered by events.

To set up a Rules Scheduler-based content publishing schedule, you have to create two Rules components: one will publish the piece of content and schedule it to be unpublished on a relative date (i.e. "next Monday"), while the other will unpublish the piece of content and schedule it to be published on a relative date (i.e. "next Friday"). Another rule will need to react to content being created or updated to initiate the publishing schedule.

Using Fields to Avoid "Magic" Behavior

One of the keys to building a maintainable Drupal site (or module) is to ensure that every "automated" action is explicitly enabled. In the early days of Drupal Commerce development, I adopted an approach to some module behaviors where a feature just automatically worked if certain conditions were met (e.g. the representation of attribute fields on Add to Cart forms). "Neat!" I thought, until I realized that it was difficult to document and even more difficult to ensure users knew to look for said documentation. Much better to include explicit user interface components to enable functionality.

In the case of a scheduling system, you wouldn't want to build the site to just automatically enter every piece of content, or even every piece of content of a certain type, into the publishing pattern. Your client, a.k.a. the end user, really expects (and needs) to see an indicator on a form that submitting it will lead to a desired outcome.

For my friend's site, my recommendation was simply to add a Boolean field using a checkbox widget to the relevant content type reading, "Schedule this content to be published on Fridays and unpublished on Mondays." If the site required more publishing patterns than just the weekend pattern, I would've used a List (text) field with radio buttons or a select list identifying the different available publishing patterns.

Building the Scheduling System in Rules

Working with Rules is fairly simple if you can write out long hand what it is you're trying to accomplish. Considering all of our constraints here, we need a set of rules that accomplishes the following:

  1. When a call to action is created or saved with the scheduling checkbox checked, delete any scheduled tasks related to the content. (This is possible because Rules Scheduler lets you assign an identifier to scheduled rules, so identifying our scheduled tasks with the node ID will allow us to delete the tasks when needed in the future.)
  2. If the call to action that was just saved isn't published, schedule it to be published next Friday.
  3. If the call to action that was just saved is published, schedule it to be unpublished next Monday.
  4. When a call to action is automatically published on Friday, schedule it to be unpublished next Monday.
  5. When a call to action is automatically unpublished on Monday, schedule it to be published next Friday.

The first three items will be accomplished through a single rule reacting to two events, "After saving new content of type Call to action" and "After updating existing content of type Call to action." The rule will first delete any scheduled task for the piece of content and then it will invoke two rules components that will schedule the appropriate task based on whether or not the call to action was published.

The final two items will be accomplished through rules components that perform the necessary action and then schedule the appropriate task. As mentioned above, we'll use relative time strings ("next monday" and "next friday") and choose task identifiers that include the call to action node IDs ("unpublish-call-to-action-[node:nid]" and "publish-call-to-action-[node:nid]" respectively).

Give it a whirl!

It only took me about 10 minutes to create and test the rules based on the specification above, but if you aren't familiar with the Rules UI, it could take much longer. I believe Rules is worth learning (we built Drupal Commerce around it, after all), but there's something to be said for ready-made examples.

I've attached to this post a Features export of the content type and related rules configurations for you to try on your own site. Give Scheduled Calls to Action a whirl and let me know how it works for you in the comments!

(Note: to see the rules configurations and scheduled tasks, enable the Rules UI, Views, and Views UI modules if they aren't already enabled on your site.)

Categories: Elsewhere

2bits: Presentation: Backdrop, an alternative Drupal fork

Planet Drupal - Fri, 03/07/2015 - 17:25
Last week, at the amazing Drupal North regional conference, I gave a talk on Backdrop: an alternative fork of Drupal. The slides from the talk are attached below, in PDF format.
Categories: Elsewhere

Norbert Preining: Debian/TeX Live 2015.20150703-1

Planet Debian - Fri, 03/07/2015 - 17:08

The first upload of new packages after TeX Live 2015 hit unstable. Against my expectations, the bugs didn't come in by the thousands; more or less only some fixes were necessary in the binary package, which led to a few updates over the last week. This upload fixes an RC bug (missing Replaces) and also takes a further step in the Debianization of the packages: I finally removed the texconfig and texlinks programs, as they are not useful on Debian and should not actually be used.

Besides a few other fixes, of course there was the usual chore of package updates.

Updated packages

babel-french, biblatex-fiwi, biblatex-opcit-booktitle, c90, chemformula, chemgreek, cjkutils, ctex, curve2e, dozenal, eledmac, elements, enotez, garuda-c90, koma-script, l3build, latex, leadsheets, norasi-c90, pkuthss, poemscol, pstricks, pst-solides3d, siunitx, termmenu, texlive-scripts, tudscr, upmethodology, xindy.

New packages

arabi-add, br-lex.

Enjoy.

Categories: Elsewhere

Midwestern Mac, LLC: Major improvements to Drupal VM - PHP 7, MariaDB, Multi-OS

Planet Drupal - Fri, 03/07/2015 - 15:17

For the past couple years, I've been building Drupal VM to be an extremely-tunable, highly-performant, super-simple development environment. Since MidCamp earlier this year, the project has really taken off, with almost 200 stars on GitHub and a ton of great contributions and ideas for improvement (some implemented, others rejected).

Categories: Elsewhere

Drupal core announcements: Recording from July 3rd 2015 Drupal 8 critical issues discussion

Planet Drupal - Fri, 03/07/2015 - 12:19

This was our sixth critical issues discussion meeting in a row to be publicly recorded. (See all prior recordings.) Here is the recording of the meeting video and chat from today, in the hope that it helps more than just those who were at the meeting:

If you also have significant time to work on critical issues in Drupal 8 and we did not include you, let me know as soon as possible.

The meeting log is as follows (all times are CEST real time at the meeting):


[11:06am] alexpott: https://www.drupal.org/node/2280965
[11:06am] Druplicon: https://www.drupal.org/node/2280965 => [meta] Remove or document every SafeMarkup::set() call [#2280965] => 90 comments, 13 IRC mentions
[11:06am] alexpott: https://www.drupal.org/node/2506581
[11:06am] Druplicon: https://www.drupal.org/node/2506581 => Remove SafeMarkup::set() from Renderer::doRender [#2506581] => 49 comments, 9 IRC mentions
[11:07am] alexpott: https://www.drupal.org/node/2506195
[11:07am] Druplicon: https://www.drupal.org/node/2506195 => Remove SafeMarkup::set() from XSS::filter() [#2506195] => 40 comments, 3 IRC mentions
[11:09am] dawehner: https://www.drupal.org/node/2502785
[11:09am] Druplicon: https://www.drupal.org/node/2502785 => Remove support for #ajax['url'] and $form_state->setCached() for GET requests [#2502785] => 51 comments, 18 IRC mentions
[11:09am] Druplicon: dawehner: 5 hours 36 sec ago tell dawehner you might already be following https://www.drupal.org/node/1412090 but I figure you would be in favor.
[11:10am] alexpott: GaborHojtsy: do you what the you link is for the live hangout?
[11:11am] GaborHojtsy: live hangout page: http://youtu.be/rz_EissgU7Q
[11:11am] GaborHojtsy: alexpott: ^^^
[11:11am] GaborHojtsy: to watch that is
[11:11am] alexpott: GaborHojtsy: thanks!
[11:12am] plach: https://www.drupal.org/node/2453153
[11:12am] Druplicon: https://www.drupal.org/node/2453153 => Node revisions cannot be reverted per translation [#2453153] => 159 comments, 58 IRC mentions
[11:12am] plach: https://www.drupal.org/node/2453175
[11:12am] Druplicon: https://www.drupal.org/node/2453175 => Remove EntityFormInterface::validate() and stop using button-level validation by default in entity forms [#2453175] => 79 comments, 9 IRC mentions
[11:14am] plach: https://www.drupal.org/node/2478459
[11:14am] Druplicon: https://www.drupal.org/node/2478459 => FieldItemInterface methods are only invoked for SQL storage and are inconsistent with hooks [#2478459] => 112 comments, 29 IRC mentions
[11:14am] larowlan: https://www.drupal.org/node/2354889
[11:14am] Druplicon: https://www.drupal.org/node/2354889 => Make block context faster by removing onBlock event and replace it with loading from a BlockContextManager [#2354889] => 100 comments, 19 IRC mentions
[11:15am] larowlan: https://www.drupal.org/node/2421503
[11:16am] Druplicon: https://www.drupal.org/node/2421503 => SA-CORE-2014-002 forward port only checks internal cache [#2421503] => 46 comments, 11 IRC mentions
[11:16am] larowlan: https://www.drupal.org/node/2512460
[11:16am] Druplicon: https://www.drupal.org/node/2512460 => "Translate user edited configuration" permission needs to be marked as restricted [#2512460] => 20 comments, 8 IRC mentions
[11:17am] WimLeers: catch: pfrenssen is likely gonna be working on the "numerous paramconverters" issue
[11:17am] WimLeers: the issue catch talked about: https://www.drupal.org/node/2512718
[11:17am] Druplicon: https://www.drupal.org/node/2512718 => Numerous ParamConverters in core break the route / url cache context [#2512718] => 22 comments, 12 IRC mentions
[11:17am] catch: WimLeers: did you discuss last night's discussion with him already?
[11:18am] WimLeers: https://www.drupal.org/project/issues/search/drupal?project_issue_follow...
[11:19am] WimLeers: https://www.drupal.org/node/2450993
[11:19am] Druplicon: https://www.drupal.org/node/2450993 => Rendered Cache Metadata created during the main controller request gets lost [#2450993] => 132 comments, 27 IRC mentions
[11:20am] pfrenssen: catch: WimLeers: yes I'm working on that, for the moment just studying how it works
[11:20am] berdir: WimLeers: I think we can also demote it if all the other must issues are critical on their own
[11:20am] GaborHojtsy: https://www.drupal.org/node/2512460
[11:20am] Druplicon: https://www.drupal.org/node/2512460 => "Translate user edited configuration" permission needs to be marked as restricted [#2512460] => 20 comments, 9 IRC mentions
[11:20am] catch: pfrenssen: cool. I'll be around most of today so just ping if you want to discuss.
[11:21am] GaborHojtsy: https://www.drupal.org/node/2512466
[11:21am] Druplicon: https://www.drupal.org/node/2512466 => Config translation needs to be validated on input for XSS (like other t string input) [#2512466] => 34 comments, 3 IRC mentions
[11:21am] WimLeers: pfrenssen++
[11:21am] WimLeers: berdir: assuming you're talking about https://www.drupal.org/node/2429287 — then yes, that's exactly the plan :)
[11:21am] Druplicon: https://www.drupal.org/node/2429287 => [meta] Finalize the cache contexts API & DX/usage, enable a leap forward in performance [#2429287] => 112 comments, 10 IRC mentions
[11:21am] GaborHojtsy: https://www.drupal.org/node/2489024
[11:21am] Druplicon: https://www.drupal.org/node/2489024 => Arbitrary code execution via 'trans' extension for dynamic twig templates (when debug output is on) [#2489024] => 43 comments, 12 IRC mentions
[11:22am] GaborHojtsy: https://www.drupal.org/node/2512718
[11:22am] Druplicon: https://www.drupal.org/node/2512718 => Numerous ParamConverters in core break the route / url cache context [#2512718] => 22 comments, 13 IRC mentions
[11:22am] xjm: GaborHojtsy: I took care of updating the security polict with mlhess
[11:22am] xjm: GaborHojtsy: that's done
[11:23am] • xjm listening to the meeting but cannot join because of 10 person limit
[11:23am] GaborHojtsy: xjm: yay
[11:23am] GaborHojtsy: xjm++
[11:24am] WimLeers: Yes
[11:24am] WimLeers: I DO HAVE A MICROPHONE!!!
[11:24am] WimLeers: :P
[11:24am] xjm: GaborHojtsy: https://www.drupal.org/node/475848/revisions/view/7267195/8630716 now restrict access is mentioned explicitly, so any perm that has it is covered
[11:24am] Druplicon: https://www.drupal.org/node/475848 => Security advisories process and permissions policy => 0 comments, 1 IRC mention
[11:25am] larowlan: https://www.drupal.org/node/2509898
[11:25am] Druplicon: https://www.drupal.org/node/2509898 => Additional uncaught exception thrown while handling exception after service changes [#2509898] => 22 comments, 5 IRC mentions
[11:25am] alexpott: WimLeers: a working microphone :)
[11:28am] WimLeers: alexpott: all of you guys are breaking up for me from time to time. It looks like it's something with Chrome/Hangouts :(
[11:28am] xjm: WimLeers: yeah I had to use FF
[11:28am] WimLeers: My mic *works* if I test it locally.
[11:28am] WimLeers: xjm: lol, the beautiful irony
[11:30am] WimLeers: berdir: alexpott: I checked and I agree with the change you guys just asked me feedback on: https://www.drupal.org/node/2375695#comment-10082188
[11:30am] Druplicon: https://www.drupal.org/node/2375695 => Condition plugins should provide cache contexts AND cacheability metadata needs to be exposed [#2375695] => 117 comments, 40 IRC mentions
[11:32am] alexpott: https://www.drupal.org/node/2497243
[11:32am] Druplicon: https://www.drupal.org/node/2497243 => Rebuilding service container results in endless stampede [#2497243] => 97 comments, 22 IRC mentions
[11:33am] • larowlan using FF too, doesn't trust Google
[11:33am] alexpott: dawehner, catch: anyone got the nid of the chx container issue?
[11:33am] dawehner: https://www.drupal.org/node/2513326
[11:33am] catch: alexpott: https://www.drupal.org/node/2513326
[11:33am] Druplicon: https://www.drupal.org/node/2513326 => Performance: create a PHP storage backend directly backed by cache [#2513326] => 43 comments, 13 IRC mentions
[11:33am] Druplicon: https://www.drupal.org/node/2513326 => Performance: create a PHP storage backend directly backed by cache [#2513326] => 43 comments, 14 IRC mentions
[11:34am] catch: beat me.
[11:36am] larowlan: https://www.drupal.org/node/2511568 for the ctools issue I mentioned
[11:36am] Druplicon: https://www.drupal.org/node/2511568 => Create "context stack" service where available contexts can be registered [#2511568] => 0 comments, 3 IRC mentions
[11:41am] xjm: dropping off now to walk to the venue
[11:41am] alexpott: pfrenssen++
[11:50am] GaborHojtsy: for browser we are working on https://www.drupal.org/node/2430335
[11:50am] Druplicon: https://www.drupal.org/node/2430335 => Browser language detection is not cache aware [#2430335] => 47 comments, 21 IRC mentions
[11:50am] GaborHojtsy: Fabianx-screen: see above
[11:51am] berdir: and it's not a blocker
[11:51am] berdir: because right now it just kills page/smart cache
[11:51am] berdir: so if you use that, you are not seeing a cached page anyway
[11:51am] berdir: Fabianx-screen: we have a cache context for the content language
[11:52am] xjm: the youtube video has like a 20 second delay
[11:53am] GaborHojtsy: xjm: we get what we pay for :D
[11:54am] xjm: now I am talking
[11:55am] plach: https://www.drupal.org/node/2453175#comment-10080242
[11:55am] Druplicon: https://www.drupal.org/node/2453175 => Remove EntityFormInterface::validate() and stop using button-level validation by default in entity forms [#2453175] => 79 comments, 10 IRC mentions
[11:55am] xjm: more like 60s
[11:55am] GaborHojtsy: xjm: well, you’ll be in the recording forver now :D
[11:55am] xjm: :P
[11:56am] xjm: GaborHojtsy: is there any way to increase the number of participants or to include the chat sidebar in the video?
[11:56am] WimLeers1: catch: Can you leave a comment on https://www.drupal.org/node/2512718 with your POV/conclusion — sounds like you have the clearest grasp on this/completest view on the likely solution.
[11:56am] Druplicon: https://www.drupal.org/node/2512718 => Numerous ParamConverters in core break the route / url cache context [#2512718] => 23 comments, 14 IRC mentions
[11:56am] WimLeers1: xjm: chat is here, we don't use Google Hangout's chat sidebar
[11:56am] GaborHojtsy: xjm: the chat window is THIS ONE
[11:57am] larowlan: xjm: the chat one is lost soon as the call ends
[11:57am] xjm: WimLeers: GaborHojtsy: so the thing is that listening to the video it's very hard to figure out which issues people are discussing, even with Gábor's notes
[11:57am] larowlan: xjm: you can get more than 10 in the call if you have a corporate account I think
[11:57am] WimLeers: larowlan: Gábor copy/pastes this chat to the g.d.o post with the link of the recording
[11:57am] • larowlan nods
[11:57am] WimLeers: that was meant for xjm, oops
[11:58am] GaborHojtsy: xjm: the trick is if we would start on time, then my chat log shows timestamps which are sync with the video
[11:58am] GaborHojtsy: xjm: we have a few minute delay between the video start and the top of the hour
[11:58am] catch: WimLeers: yep will do.
[11:59am] GaborHojtsy: xjm: because people arrive later basically :)
[11:59am] WimLeers: catch++
[12:01pm] GaborHojtsy: xjm: its also hard to get Druplicon in google hangouts — as in impossible ÉD
[12:01pm] GaborHojtsy: xjm: and there was some interest for people watching to be able to follow links WHILE the meeting was on
[12:02pm] xjm: GaborHojtsy: but it doesn't work -- I'm trying that right now -- the delay is too big
[12:02pm] WimLeers: alexpott: regarding in title: https://api.drupal.org/comment/26#comment-26
[12:03pm] GaborHojtsy: xjm: well, then at least the Druplicon advantage is there :)
[12:03pm] GaborHojtsy: xjm: I am happy to use some way to get the comments in if there is one
[12:04pm] GaborHojtsy: xjm: I am not sure it helps if eg. someone shares their screen with an IRC client throughout the video
[12:04pm] GaborHojtsy: xjm: the links are not clickable either
[12:04pm] xjm: GaborHojtsy: shannon did that for the meeting we had 2y ago -- though the same problem with links not being clickable
[12:04pm] GaborHojtsy: xjm: but that may be a workaround

Categories: Elsewhere

DrupalCon News: We're making it easy to give back. Add a membership to your cart.

Planet Drupal - Fri, 03/07/2015 - 11:00

If you have been to a DrupalCon before, you will notice something new in the registration process. We've made it really easy for you to sign up for Drupal Association membership and bundle it into your overall DrupalCon purchase.

Categories: Elsewhere

InternetDevels: Drupal 7 application with PhoneGap: full tutorial

Planet Drupal - Fri, 03/07/2015 - 09:06

Mobile application development is a relatively new area of Drupal development services, but it is advancing rapidly as mobile device usage keeps growing.

In terms of adapting websites for mobile platforms, PhoneGap and technologies built on it remain very popular. So let’s see what it is and how it works.

Read more
Categories: Elsewhere

Petter Reinholdtsen: Time to find a new laptop, as the old one is broken after only two years

Planet Debian - Fri, 03/07/2015 - 07:10

My primary workhorse laptop is failing and will need a replacement soon. The left 5 cm of the screen on my Thinkpad X230 started flickering yesterday, and I suspect the cause is a broken cable, as changing the angle of the screen sometimes gets rid of the flickering.

My requirements have not really changed since I bought it, and are still as I described them in 2013. The last time I bought a laptop, I had good help from prisjakt.no, where I could select at least a few of the requirements (mouse pin, wifi, weight) and go through the rest manually. A three-button mouse and a good keyboard are not available as options, and all three laptop models proposed today (Thinkpad X240, HP EliteBook 820 G1 and G2) lack three mouse buttons. It is also unclear to me how good the keyboard on the HP EliteBooks is. I hope Lenovo have not messed up the keyboard, even if the quality and robustness of the X series have deteriorated since the X41.

I wonder how I can find a sensible laptop when none of the options seem sensible to me? Are there better services around to search the set of available laptops for features? Please send me an email if you have suggestions.

Categories: Elsewhere

2bits: Backdrop: an alternative Drupal fork

Planet Drupal - Fri, 03/07/2015 - 05:16
Last week, at the amazing Drupal North regional conference, I gave a talk on Backdrop: an alternative fork of Drupal. The slides from the talk are attached below, in PDF format.
Categories: Elsewhere

Drupal core announcements: Portsmouth NH theme system critical sprint recap

Planet Drupal - Fri, 03/07/2015 - 05:16

In early June a Drupal 8 theme system critical issues sprint was held in Portsmouth, New Hampshire as part of the D8 Accelerate program.

The sprint started the afternoon of June 5 and continued until midday June 7.

Sprint goals

We set out to move forward the two (at the time) theme system criticals, #2273925: Ensure #markup is XSS escaped in Renderer::doRender (created May 24, 2014) and #2280965: [meta] Remove or document every SafeMarkup::set() call (created June 6, 2014).

Sponsors

The Drupal Association provided the D8 Accelerate grant which covered travel costs for joelpittet and Cottser.

Bowst provided the sprint space.

As part of its NHDevDays series of contribution sprints, the New Hampshire Drupal Group provided snacks and refreshments during the sprint, lunch and even dinner on Saturday.

Digital Echidna provided time off for Cottser.

Summary

xjm committed #2273925: Ensure #markup is XSS escaped in Renderer::doRender Sunday afternoon! xjm’s tweet sums things up nicely.

As for the meta (which comprises about 50 sub-issues), by the end of the sprint we had patches on over 30 of them, 3 had been committed, and 7 were in the RTBC queue.

Thanks to the continued momentum provided by the New Jersey sprint, as of this writing approximately 20 issues from the meta issue have been resolved.

Friday afternoon

peezy kicked things off with a brief welcome and acknowledgements. joelpittet and Cottser gave an informal introduction to the concepts and tasks at hand for the sprinters attending.

After that, leslieg on-boarded our Friday sprinters (mostly new contributors), getting them set up with Drupal 8, IRC, Dreditor, and so on. leslieg and a few others then went to work reviewing documentation around #2494297: [no patch] Consolidate change records relating to safe markup and filtering/escaping to ensure cross references exist.

Meanwhile in "critical central" (what we called the meeting room where the work on the critical issues was happening)…

lokapujya and joelpittet got to work on the remaining tasks of #2273925: Ensure #markup is XSS escaped in Renderer::doRender.

cwells and Cottser started the work on removing calls to SafeMarkup::set() by working on #2501319: Remove SafeMarkup::set in _drupal_log_error, DefaultExceptionSubscriber::onHtml, Error::renderExceptionSafe.

Thai food was ordered in, and many of us continued working on issues late into the evening.

Saturday

joelpittet and Cottser gave another brief introduction to keep new arrivals on the same page and reassert concepts from the day before.

leslieg did some more great on-boarding Saturday and worked with a handful of new contributors on implementing #2494297: [no patch] Consolidate change records relating to safe markup and filtering/escaping to ensure cross references exist. The idea was that by reviewing and working on this documentation the contributors would be better equipped to work directly on the issues in the SafeMarkup::set() meta.

Mid-morning Cottser led a participatory demo with the whole group of a dozen or so sprinters, going through one of the child issues of the meta and ending up with a patch. This allowed us to walk through the whole process and think out loud the whole time.


The Benjamin Melançon XSS attack in action. Having some fun while working on our demo issue.

By this time we had identified some common patterns after working on enough of these issues.

By the end of Saturday all of the sprinters including brand new contributors were collaborating on issues from the critical meta and the issue stickies were flying around the room with fervor (a photo of said issue stickies is below).


Then we had dinner :)

Sunday morning

drupal.org was down for a while.

We largely picked up where we left off Saturday, cranked out more patches, and joelpittet and Cottser started to review the work that had been done the day before that was in the “Needs Human” column.


Our sprint board looked something like this on the last day of the sprint.

Thank you

Thanks to the organizing committee (peezy, leslieg, cwells, and kbaringer), xjm, effulgentsia, New Hampshire DUG, Seacoast DUG, Bowst, Drupal Association, Digital Echidna, and all of our sprinters: cdulude, Cottser, cwells, Daniel_Rose, dtraft, jbradley428, joelpittet, kay_v, kbaringer, kfriend, leslieg, lokapujya, mlncn, peezy, sclapp, tetranz.

Attachments: 20150606_114556.jpg (453.91 KB), 20150606_184029.jpg (549.46 KB), 20150607_130317.jpg (353.87 KB), IMG_8589.JPG (781.47 KB)
Categories: Elsewhere

Enrico Zini: italian-fattura-elettronica

Planet Debian - Thu, 02/07/2015 - 23:48
Billing an Italian public administration

Here's a simple guide for how I managed to bill one of my customers as is now mandated by law in Italy.

Create a new virtualbox machine

I would never do any of this to any system I would ever want to use for anything else, so it's virtual machine time.

  • I started virtualbox, created a new machine for Ubuntu 32bit, 8Gb disk, 4Gb RAM, and placed the .vdi image in an encrypted partition. The web services of Infocert's fattura-pa require "Java (JRE) a 32bit di versione 1.6 o superiore" (a 32-bit Java JRE, version 1.6 or higher).
  • I installed Ubuntu 12.04 on it: that is what dike declares to support.
  • I booted the VM, installed virtualbox-guest-utils, and made sure I also had virtualbox-guest-x11.
  • I restarted the VM so that I could resize the virtualbox window and have Ubuntu resize itself as well. Now I could actually read popup error messages in full.
  • I changed the desktop background to something that gave me the idea that this is an untrusted machine where I need to be very careful of what I type. I went for bright red.
Install smart card software into it
  • apt-get install pcscd pcsc-tools opensc
  • In virtualbox, I went to Devices/USB devices and enabled the smart card reader in the virtual machine.
  • I ran pcsc_scan to see if it could see my smart card.
  • I ran Firefox, went to preferences, advanced, security devices, load. Module name is "CRS PKCS#11", module path is /usr/lib/opensc-pkcs11.so
  • I went to https://fattura-pa.infocamere.it/fpmi/service and I was able to log in. To log in, I had to type the PIN 4 times into popups that offered little explanations about what was going on, enjoying cold shivers because the smart card would lock itself at the 3rd failed attempt.
  • Congratulations to myself! I thought that all was set, but unfortunately, at this stage, I was not able to do anything else except log into the website.
Descent into darkness

Set up things for fattura-pa
  • I got the PDF with the setup instructions from here. Get it too, for a reference, a laugh, and in case you do not believe the instructions below.
  • I went to https://www.firma.infocert.it/installazione/certificato.php, and saved the two certificates.
  • Firefox, preferences, advanced, show certificates, I imported both CA certificates, trusted for everything, all my base are belong to them.
  • apt-get install icedtea-plugin
  • I went to https://fattura-pa.infocamere.it/fpmi/service and tried to sign. I could not: I got an error about invalid UTF-8 for something or other on Firefox's standard error. Firefox froze and had to be killed.
Set up things for signing locally with dike
  • I removed icedtea so that I could use the site without firefox crashing.
  • I installed DiKe For Ubuntu 12.04 32bit
  • I ran dikeutil to see if it could talk to my smart card
  • When signing with the website, I chose the manual signing options and downloaded the zip file with the xml to be signed.
  • I got a zip file, unzipped it.
  • I loaded the xml into dike.
  • I signed it with dike.
  • I got this error message: "nessun certificato di firma presente sul dispositivo di firma" (no signing certificate present on the signing device), and then this one: "Impossibile recuperare il certificato dal dispositivo di firma" (unable to retrieve the certificate from the signing device). No luck.
Set up things for signing locally with ArubaSign
  • I went to https://www.pec.it/Download.aspx
  • I downloaded ArubaSign for Linux 32 bit.
  • Oh! People say that it only works with Oracle's version of Java.
  • sudo add-apt-repository ppa:webupd8team/java
  • apt-get update
  • apt-get install oracle-java7-installer
  • During the installation process I had to agree to also sell my soul to Oracle.
  • tar axf ArubaSign*.tar*
  • cd ArubaSign-*/apps/dist
  • java -jar ArubaSign.jar
  • I let it download its own updates. Another time I did not. It does not seem to matter: I get asked that question every time I start it anyway.
  • I enjoyed the fancy brushed metal theme, and had an interesting time navigating an interface where every label on every icon or input field was truncated.
  • I downloaded https://www.pec.it/documenti/Manuale_ArubaSign2_firma%20Remota_V03_02_07_2012.pdf to get screenshots of that interface with all the labels intact
  • I signed the xml that I got from the website. I got told that I needed to really view carefully what I was signing, because the signature would be legally binding
  • I enjoyed carefully reading a legally binding, raw XML file.
  • I told it to go ahead, and there was now a .p7m file ready for me. I rejoiced, as now I might, just might actually get paid for my work.
Try fattura-pa again

Maybe fattura-pa would work with Oracle's Java plugin?

  • I went to https://fattura-pa.infocamere.it/fpmi/service
  • I got asked to verify java at www.java.com. I did it.
  • I told FireFox to enable java.
  • Suddenly, and while I was still in java.com's tab, I got prompted about allowing Infocert's applet to run: I allowed it to run.
  • I also got prompted several times, still while the current tab was not even Infocert's tab, about running components that could compromise the security of my system. I allowed and unblocked all of them.
  • I entered my PIN.
  • Congratulations! Now I have two ways of generating legally binding signatures with government issued smart cards!
Aftermath

I shut down that virtual machine and I'm making sure I never run anything important on it. Except, of course, generating legally binding signatures as required by the Italian government.

What could possibly go wrong?

Categories: Elsewhere

Antonio Terceiro: Upgrades to Jessie, Ruby 2.2 transition, and chef update

Planet Debian - Thu, 02/07/2015 - 22:26

Last month I started to track all the small Debian-related things that I do. My initial motivation was to be conscious of how often I spend short periods of time working on Debian. Sometimes it's during lunch breaks, on weekends, first thing in the morning before regular work, after I am done for the day with regular work, or even during regular work, since I occasionally have the chance of doing Debian work as part of my regular work.

Now that I have this information, I need to do something with it. So this is probably the first of monthly updates I will post about my Debian work. Hopefully it won’t be the last.

Upgrades to Jessie

I (finally) upgraded my two servers to Jessie. The first one, my home server, is a Utilite which is a quite nice ARM box. It is silent and consumes very little power. The only problem I had with it is that the vendor-provided kernel is too old, so I couldn’t upgrade udev, and therefore couldn’t switch to systemd. I had to force systemv for now, until I can manage to upgrade the kernel and configure uboot to properly boot the official Debian kernel.

On my VPS things are way better. I was able to upgrade nicely, and it is now running a stock Jessie system.

Fixed HTTPS on ci.debian.net

pabs had let me know on IRC of an issue with the TLS certificate for ci.debian.net, which took me a few iterations to get right. It was missing the intermediate certificates, and is now fixed. You can now enjoy Debian CI under https.

Ruby 2.2 transition

I was able to start the Ruby 2.2 transition, which has the goal of switching to Ruby 2.2 as the default on unstable. The first step was updating ruby-defaults to add support for building Ruby packages for both Ruby 2.1 and Ruby 2.2. This was followed by updates to gem2deb (0.18, 0.18.1, 0.18.2, and 0.18.3) and rubygems-integration. At this point, after a few rebuild requests, only 50 out of 137 packages still need to be looked at; some of them just use the default Ruby, so a rebuild once we switch the default will be enough to make them use Ruby 2.2, while others, especially Ruby libraries, will still need porting work or other fixes.

Updated the Chef stack

Bringing chef to the very latest upstream release into unstable was quite some work.

I had to update:

  • ruby-columnize (0.9.0-1)
  • ruby-mime-types (2.6.1-1)
  • ruby-mixlib-log (1.6.0-1)
  • ruby-mixlib-shellout (2.1.0-1)
  • ruby-mixlib-cli (1.5.0-1)
  • ruby-mixlib-config (2.2.1-1)
  • ruby-mixlib-authentication (1.3.0-2)
  • ohai (8.4.0-1)
  • chef-zero (4.2.2-1)
  • ruby-specinfra (2.35.1-1)
  • ruby-serverspec (2.18.0-1)
  • chef (12.3.0-1)
  • ruby-highline (1.7.2-1)
  • ruby-safe-yaml (1.0.4-1)

In the middle I also had to package a new dependency, ruby-ffi-yajl, which was very quickly ACCEPTED thanks to the awesome work of the ftp-master team.

Random bits

  • Sponsored an upload of redir by Lucas Kanashiro
  • chake, a tool that I wrote for managing servers with chef but without a central chef server, got ACCEPTED into the official Debian archive.
  • vagrant-lxc, a vagrant plugin for using lxc as a backend and lxc containers as development environments, was also ACCEPTED into unstable.
  • I got the deprecated ruby-rack1.4 package removed from Debian
Categories: Elsewhere

Acquia: Front End Performance Strategy: Styles

Planet Drupal - Thu, 02/07/2015 - 20:38

The quest for improved page-load speed and website performance is constant. And it should be. The speed and responsiveness of a website have a significant impact on conversion, search engine optimization, and the digital experience in general.

In part one of this series, we established the importance of front-end optimization on performance, and discussed how properly-handled images can provide a significant boost toward that goal. In this second installment, we’ll continue our enhancements, this time by tackling CSS optimization.

We’ll consider general best practices from both a front-end developer’s and a themer’s point of view. Remember, as architects and developers, it’s up to us to inform stakeholders of the impacts of their choices, offer compromises where we can, and implement in smart and responsible ways.

Styles

Before we dive into optimizing our CSS, we need to understand how Drupal’s performance settings for aggregation work. We see developers treating this feature like a black box, turning it on without fully grokking its voodoo. Doing so misses two important strategic opportunities: 1. Controlling where styles are added in the head of our document, and 2. Regulating how many different aggregates are created.

Styles can belong to one of three groups:

  • System - Drupal core
  • Default - Styles added by modules
  • Theme - Styles added in your theme

Drupal aggregates styles from each group into a single sheet for that group, meaning you’ll see at minimum three CSS files being used for your page. Style sheets added by inclusion in a theme’s ‘.info’ file or a module’s ‘.info’ file automatically receive a ‘true’ value for the every_page flag in the options array, which wraps them into our big three aggregates.

Styles added using drupal_add_css automatically have the every_page flag set to ‘false.’ These style sheets are then combined separately, by group, forming special one-off aggregate style sheets for each page.

When using drupal_add_css, you can use the optional ‘options’ array to explicitly set the every_page flag to ‘true.’ You can also set the group it belongs to and give it a weight to move it up or down within a group.

<?php
drupal_add_css(drupal_get_path('module', 'custom_module') . '/css/custom-module.css', array('group' => CSS_DEFAULT, 'every_page' => TRUE));
?>

Style sheets added using Drupal’s attached property aren’t aggregated unless they have the every_page flag set to ‘true.’
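For comparison, here is a minimal sketch of attaching a style sheet via #attached with the flag set (the render array variable, module name, and file path are placeholders):

<?php
$build['#attached']['css'][] = array(
  'data' => drupal_get_path('module', 'custom_module') . '/css/custom-module.css',
  'group' => CSS_DEFAULT,
  // Opt in to the big per-group aggregate instead of a one-off aggregate.
  'every_page' => TRUE,
);
?>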

Favoring every_page: true

Styles added with the every_page flag set to 'false' (CSS added via drupal_add_css without the flag set to 'true', or without the option set at all, or added using the attached property) only load on the pages that call the function which adds or attaches that style sheet, so the payload to build those pages is smaller. However, those pages then require an additional HTTP request: the one-off aggregate per group, per page.

In more cases than not, I prefer loading an additional 3kb of styling in my main aggregates that will be downloaded once and then cached locally, rather than create a new aggregate that triggers a separate HTTP request to make up for what’s not in the main aggregates.

Additionally, turning on aggregation causes Drupal to compress our style sheets, serving them Gzipped to the browser. Gzipped assets are 70%–90% smaller than their uncompressed counterparts. That’s a 500kb CSS file being transferred in a 100kb package to the browser.

Preprocessing isn’t a license for inefficiency

I love preprocessing my CSS. I use SASS, written in SCSS syntax, and often utilize Compass for its set of mixins and for compiling. But widespread adoption of preprocessing has led to compiled CSS files that tip the 1Mb limit (which is way over the average), or break the IE selector limit of 4095. Some of this can be attributed to Drupal’s notorious nesting of divs (ugly mark-up often leads to ugly CSS), but a lot of it is just really poor coding habits.

The number one culprit I've come across is over-nesting of selectors in SASS files. People traverse the over-nested DOM created by Drupal with SASS and spit out compiled CSS rules using descendant selectors that go five (and sometimes more) levels deep.

I took the over-nested example from the SASS inception page linked above and cleaned it up to what it should probably be.

The result? It went from 755 bytes to 297 bytes, a 60% reduction in size, just from getting rid of the extra characters added by the excess selectors. Multiply that savings by 30 or so partials, and it's a pretty substantial reduction in compiled CSS size.

Besides the size savings, the number and type of selectors directly impact the amount of time a browser takes to process your style rules. Browsers read styles from right to left, matching the rightmost "key" selector first, then working left, disqualifying items as they go. Mozilla wrote a great efficient-CSS reference years ago that is still relevant.

Conclusion

Once again we’ve demonstrated how sloppy front-end implementations can seriously hamper Drupal’s back-end magic. By favoring style sheet aggregation and reining in exuberant preprocessing, we can save the browser a lot of work.

In our next, and final, installment in this series, we’ll expand our front-end optimization strategies even further, to include scripts.

Tags:  acquia drupal planet
Categories: Elsewhere

Christoph Berg: PostgreSQL 9.5 in Debian

Planet Debian - Thu, 02/07/2015 - 20:03

Today saw the release of PostgreSQL 9.5 Alpha 1. Packages for all supported Debian and Ubuntu releases are available on apt.postgresql.org:

deb http://apt.postgresql.org/pub/repos/apt/ YOUR_RELEASE_HERE-pgdg main 9.5

The package is also waiting in NEW to be accepted for Debian experimental.

Being curious which PostgreSQL releases have been in use over time, I pulled some graphics from Debian's popularity contest data:

Before we included the PostgreSQL major version in the package name, "postgresql" contained the server, so that line represents the installation count of the pre-7.4 releases at the left end of the graph.

Interestingly, 7.4 reached its installation peak well past 8.1's. Does anyone have an idea why that happened?

Categories: Elsewhere

Drupal Watchdog: Build it with Backdrop

Planet Drupal - Thu, 02/07/2015 - 17:29
Feature

Backdrop CMS is a fork of the Drupal project. Although Backdrop has a different name, the 1.0 version is very similar to Drupal 7. Backdrop 1.0 provides an upgrade path from Drupal 7, and most modules or themes can be quickly ported to work on Backdrop.

Backdrop is focused on the needs of small- to medium-sized businesses, non-profits, and those who may not be able to afford the jump from Drupal 7 to Drupal 8. Backdrop values backwards compatibility and recognizes that the easier it is for developers to get things working, the less it costs to build, maintain, and update comprehensive websites and web applications. By iterating code in incremental steps, innovation can happen more quickly in the areas that matter most.

The initial version of Backdrop provides major improvements over Drupal 7, including configuration management, a powerful new Layout system, and Views built-in.

What's Different From Drupal 7?

Backdrop CMS is, simply put: Drupal 7 plus built-in Configuration Management, Panels, and Views.

Solving the Deployment Dilemma

Database-driven systems have long suffered from the deployment dilemma: Once a site is live, how do you develop and deploy a new feature?

Sometimes there is only one copy of a site – the live site – and all development must be done on this live environment. But any mistakes made during the development of the new feature will be immediately experienced by visitors to the site.

It’s wiser to also have a separate development environment, which allows for creation of the new feature without putting the live site at risk. But what happens when that new feature has been completed, and needs to be deployed?

We already have a few great tools for solving this problem. When it comes to the code needed for the new feature we have great version control systems like Git. Commit all your work, merge if necessary, push when you’re happy, and then just pull from the live environment (and maybe clear some caches).

Categories: Elsewhere

Acquia: Sustainable contribution, part 1 of 2: How Drupal has solved and evolved

Planet Drupal - Thu, 02/07/2015 - 17:05

Part 1 of 2 - Drupal user number 5622, John Faber, has been involved with Drupal since late 2003. He is a Managing Partner with Chapter Three, a San Francisco-based digital agency. Their slogan sums up well what a lot of us think about Drupal: "We build a better internet with Drupal." John and I got on a Google Hangout to talk about the business advantages of contribution and sustainability when basing your business on open source software. We also touch on Drupal 8's potential power as a toolset and for attracting new developers, doing business in an open source context, and more!

Categories: Elsewhere

Simon Kainz: DUCK challenge at DebConf15

Planet Debian - Thu, 02/07/2015 - 16:45
New features in DUCK

Carnivore-* data

DUCK now uses carnivore-{names,email} tables from UDD, giving a nice list of packages grouped by Maintainer/Uploader names.

Domain grouping

A per-domain-listing is now also available here.

DUCK challenge at DebConf15

After announcing DUCK in mid-June 2012, the number of source packages with issues is still somewhat stable at around 1700. After a recent update of the curl libs, I also managed to get rid of 200 false positives caused by SSL-verification issues, as can be seen here.

To speed things up a bit and lower the number of broken links, I hereby propose the following challenge:

The first 99 persons who fix at least 1 broken URL and upload the fixed package before end of DebConf15 will get an awesome "200 OK" DUCK-branded lighter at DebConf15!

The challenge starts right now!

I will try hard to not forget anyone who fixes packages (note the s ;-), but if you feel missed out, please contact me at DC15.

Also, please remember that this is not a valid excuse to NMU packages ;-).

Categories: Elsewhere

Cheeky Monkey Media: An introduction to AngularJS and Drupal

Planet Drupal - Thu, 02/07/2015 - 16:00

If you are looking to create a rich and dynamic web app with little to no latency, Drupal and AngularJS are the answer, as they go hand in hand. In this article I will give you a brief intro to what AngularJS is. There is a bit of a learning curve with this awesome architecture, but if you know some basic JavaScript you will be OK.

AngularJS is a front-end JavaScript framework for building web apps, created by our good pal Google.

A couple of cool facts about AngularJS:

  • Open Source
  • MVC pattern (ASP.NET developer soft spot)
  • Handles tasks like
  • ...
Categories: Elsewhere

Pages

Subscribe to jfhovinne aggregator - Elsewhere