Planet Drupal


Deeson: Extending EntityDrupalWrapper

Fri, 28/11/2014 - 10:33

We'd all like to be writing better OO code with Drupal 7, wouldn't we?

It's always been a best practice of ours to use the entity_metadata_wrapper() function when programming the additional logic that inevitably comes when constructing complex content types. But a pain point of this is that the code usually ends up in one hook_ implementation or another and any shared code tends to get arranged into functions in the .module file.

In an ideal world, Drupal would have provided us with entity classes that we could extend at the bundle level to allow us to add feature-specific functionality. I had a bit of a eureka moment the other day, when I realised that we’re already quite close to this, and that it's actually quite trivial to extend the EntityDrupalWrapper class provided by entity_metadata_wrapper() with functionality specific to each entity type.

Indeed, as I think about it, I'm wondering if this should be a new best practice?

An overly simple example

Let’s imagine that we have a content type that is tied up with Organic Groups and that we use it as a way of grouping members into different organisations on our site. We’ll call it “company”.

Let’s also imagine that we would like to send an email to each of the members of a company whenever one of them posts a comment.

For the convenience of this example, let's forget that Rules exist for a moment!

Firstly, a quick note about autoloading

Drupal 8 takes advantage of the wonderful PSR-4 autoloading standard so that we no longer have to define includes in our module's .info file. The brilliant X Autoload module provides just the same functionality for Drupal 7. Since we’re going to be writing a lot more classes from now on, I’d recommend using it, especially as what we’re doing supports PHP 5.3 namespaces. My example below assumes that you're using it.

The old way

So you’d probably start with a hook_comment_insert, and in there you’d probably grab the user, load the company that they belong to and start looping.

The problem with this approach is that everything is inside the hook implementation, and if you take it out of the hook implementation, you'll just be littering the code base with API-like functions. Your modules will soon get messy. Additionally, if you need to do something else, unrelated, in hook_comment_insert(), things get even messier.
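As a rough sketch, that hook-based approach might look like the following (field_company, field_members, and the mail key are hypothetical names, and this assumes the Entity API module's entity_metadata_wrapper()):

```php
<?php

/**
 * Implements hook_comment_insert().
 *
 * Illustrative sketch only: the field and mail key names are
 * hypothetical, not from the original post.
 */
function my_module_comment_insert($comment) {
  $author = entity_metadata_wrapper('user', $comment->uid);
  // Load the company the author belongs to.
  $company = entity_metadata_wrapper('node', $author->field_company->raw());
  // All of the looping and mailing logic lives inside the hook, or
  // leaks out into loose API-like functions in the .module file.
  foreach ($company->field_members as $member) {
    drupal_mail('my_module', 'colleague_comment', $member->mail->value(), language_default());
  }
}
```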

The new way(?)

What if, in your hook_comment_insert(), you instead just called a method on an entity wrapper?

$author_wrapper = new UserWrapper($comment->uid);
$author_wrapper->emailColleagues([...]);

Then, in my_module/src/EntityWrapper/User/UserWrapper.php (Drupal\my_module\EntityWrapper\User\UserWrapper.php) you’d make your class:

<?php

/**
 * @file
 * Firm Profile wrapper class.
 */

namespace Drupal\my_module\EntityWrapper\User;

use \EntityDrupalWrapper;
use Drupal\my_module\EntityWrapper\Node\CompanyWrapper;

/**
 * Wraps nodes of type firm_profile with additional functionality.
 */
class UserWrapper extends EntityDrupalWrapper {

  /**
   * Wrap a user object.
   *
   * @param int|object $data
   *   A uid or user object.
   */
  public function __construct($data) {
    parent::__construct('user', $data);
  }

  /**
   * Send an email to all colleagues of this user.
   *
   * @param mixed $some_args
   *   Whatever is needed here.
   */
  public function emailColleagues($some_args) {
    foreach ($this->getColleagues() as $colleague) {
      $colleague->email($some_args);
    }
  }

  /**
   * Send an email to this user.
   *
   * @param mixed $some_args
   *   Whatever is needed here.
   */
  public function email($some_args) {
    // Call something like drupal_mail() here.
  }

  /**
   * Get a list of colleagues in the same company as this user.
   *
   * @return UserWrapper[]
   *   An array of UserWrappers.
   */
  public function getColleagues() {
    $colleagues = array();
    foreach ($this->getCompany()->getEmployees() as $employee) {
      if ($employee->getIdentifier() !== $this->getIdentifier()) {
        $colleagues[] = $employee;
      }
    }
    return $colleagues;
  }

  /**
   * Get the company for this user.
   *
   * Note that we can cleanly wrap relationships between entities.
   *
   * @return CompanyWrapper
   *   A company wrapper object. Check it with ->value() if you
   *   need to make sure it has data.
   */
  public function getCompany() {
    return new CompanyWrapper($this->field_company->raw());
  }

}

How simple is that?! Look at all those reusable functions that we've placed directly against the relevant class. You’ll call getCompany() all the time and if you add functionality to the CompanyWrapper class then that will be ready to use as well.
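The CompanyWrapper class used by getCompany() isn't shown in the post; a minimal sketch under the same conventions (assuming a hypothetical field_members user reference field) could be:

```php
<?php

/**
 * @file
 * Company wrapper class (illustrative sketch, not from the original post).
 */

namespace Drupal\my_module\EntityWrapper\Node;

use \EntityDrupalWrapper;
use Drupal\my_module\EntityWrapper\User\UserWrapper;

/**
 * Wraps nodes of type company with additional functionality.
 */
class CompanyWrapper extends EntityDrupalWrapper {

  /**
   * Wrap a company node.
   *
   * @param int|object $data
   *   A nid or node object.
   */
  public function __construct($data) {
    parent::__construct('node', $data);
  }

  /**
   * Get the employees of this company.
   *
   * Assumes a hypothetical field_members user reference field.
   *
   * @return UserWrapper[]
   *   An array of UserWrapper objects.
   */
  public function getEmployees() {
    $employees = array();
    foreach ($this->field_members->raw() as $uid) {
      $employees[] = new UserWrapper($uid);
    }
    return $employees;
  }

}
```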

I’m quite convinced that my code is going to start looking very different from now on!

Categories: Elsewhere

Flocon de toile | Freelance Drupal: Drupal SA-CORE-2014-005, perspective and lessons

Fri, 28/11/2014 - 10:28

On Wednesday, October 15, 2014, the Drupal Security Team published a highly critical security advisory (reference SA-CORE-2014-005, CVE-2014-3704) regarding a vulnerability discovered in the database abstraction API, the layer that automatically sanitizes the parameters of an SQL query and therefore prevents SQL injection attacks. In short, this vulnerability allows an anonymous attacker to compromise any Drupal 7 site with a SQL injection attack. Let's review the chronology of the event and try to learn some lessons from the discovery and management of this security breach.

Theme: Drupal planet

Flocon de toile | Freelance Drupal: Create a Drupal 8 block in seconds

Fri, 28/11/2014 - 08:59

We continue to explore the possibilities offered by the Console module and will discover how to build a Drupal 8 block in very little time, and then customize it to our needs.


Code Drop: Getting Started Testing Drush Commands

Fri, 28/11/2014 - 07:53

I've recently been rewriting the Drush Migrate Manifest command for Drupal 8 and it was my first introduction to writing tests for Drush commands. The process was more painless than I had initially imagined it would be, but not without a few caveats along the way.


Drush tests use PHPUnit as the test runner and testing framework, which offers a kind of familiarity to many developers, especially anyone involved in Drupal 8. Drush uses the name "Unish" for its testing namespace and base test classes. I've no idea where Unish comes from.

Test Skeleton

Create your test within the tests folder, using lower camel case for the naming of both the file and the class.


KnackForge: Android app integration with Drupal for OAuth

Fri, 28/11/2014 - 07:40

When you build an Android app for an existing Drupal website that allows its users to log in using their social media accounts, it is important to include the same feature in your app as well. Instead of using app-specific APIs, we can use a WebView to achieve the login mechanism. Let me explain the method that I followed:

1) Setup Drupal menu paths for social login(s).

2) From android app, use webview that can call Drupal path for social login.

3) Drupal side completes the authentication and sends back an identification token to Android.

4) Android app uses the token to make further calls to Drupal site. Drupal side authenticates user by this token.

Drupal side

First, we need to define an entry point for the social login:
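The entry point could be a hook_menu() item along these lines (the path and callback names are hypothetical, not from the original post):

```php
<?php

/**
 * Implements hook_menu().
 *
 * Sketch only: path and callback names are illustrative.
 */
function my_module_menu() {
  $items['app/login/%'] = array(
    'title' => 'App social login',
    'page callback' => 'my_module_social_login',
    'page arguments' => array(2),
    // The WebView hits this path anonymously, so allow access.
    'access callback' => TRUE,
    'type' => MENU_CALLBACK,
  );
  return $items;
}
```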


.VDMi/Blog: I went to Drupal 7.33 and all I got was a slow site

Thu, 27/11/2014 - 22:29
So, you just upgraded your site(s) to Drupal >= 7.33 and everything seemed fine in your tests. You deployed the new release, and after a while you noticed that your site isn't as fast as before. It's actually slowing down everything on your server. You started Googling and ended up on this blog post. Sounds like your story? Read on!

I spent the last two hours figuring this out, so I decided it would be best to write it up right away while it's fresh in my memory. TL;DR at the bottom of this post.

We upgraded to Drupal 7.34 recently and thought everything was fine. The release went through three different environments before deploying to live, and no actual issues were found.

After deployment to live, we got some reports that the site was responding slowly. We didn't directly link it to the upgrade to 7.34; I actually did a diff between 7.32 and 7.34 after these reports to see what changed, and did not see anything suspicious that could cause this.

We had to investigate after a while, as there was no sign of improvement: the CPU of the server was hitting 100% 24/7. New Relic monitoring told us about many calls to Drupal's file_scan_directory() function, so I logged the calls with the following snippet:

static $count;
if (!isset($count)) {
  $count = 0;
}
// Increment and log on every call.
$count++;
drupal_debug($count . PHP_EOL);

The count actually went up to 700 for every request (it's quite a large project, plus file_scan_directory() is recursive).
When I printed the debug_backtrace(), I saw that this call was coming from drupal_get_filename().
Looking at the function arguments, Drupal was looking for an imagecache_actions file. Why?! And why on every request? Doesn't it cache these records in the registry?!

Yes it does! It appeared the imagecache_actions module had a typo in it (patch here):

module_load_include('inc', 'imagcache_actions', '');

This should be:

module_load_include('inc', 'imagecache_actions', '');

This would not have been an issue previously, but 7.33 introduced a change.
Pre 7.33:

$file = db_query("SELECT filename FROM {system} WHERE name = :name AND type = :type", array(':name' => $name, ':type' => $type))->fetchField();
if (file_exists(DRUPAL_ROOT . '/' . $file)) {
  $files[$type][$name] = $file;
}

7.33 or higher:

$file = db_query("SELECT filename FROM {system} WHERE name = :name AND type = :type", array(':name' => $name, ':type' => $type))->fetchField();
if ($file !== FALSE && file_exists(DRUPAL_ROOT . '/' . $file)) {
  $files[$type][$name] = $file;
}

Before 7.33, Drupal would try to find the record in the system table; it wouldn't find it, and $file would be FALSE (which, like NULL, casts to an empty string when concatenated). The resulting string DRUPAL_ROOT . '/' . $file is therefore just DRUPAL_ROOT . '/'. Obviously DRUPAL_ROOT exists, so file_exists() returns TRUE. Drupal would put the record in the $files array and continue with what it was doing.
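To see why the old check always passed, here is a tiny standalone PHP illustration (fetchField() returns FALSE when no row matches; both FALSE and NULL cast to the empty string when concatenated):

```php
<?php
// What fetchField() returns when the {system} row is missing.
$file = FALSE;
// The path collapses to DRUPAL_ROOT . '/', which always exists.
var_dump('/var/www' . '/' . $file);  // string(9) "/var/www/"
var_dump(file_exists('/' . $file));  // bool(true)
```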

Because 7.33 and higher does a NULL-check on $file, it will not add any record to the $files array. This causes it to go into the file discovery routine:

if (!isset($files[$type][$name])) {
  // ... Some code.
  $matches = drupal_system_listing("/^" . DRUPAL_PHP_FUNCTION_PATTERN . "\.$extension$/", $dir, 'name', 0);
  foreach ($matches as $matched_name => $file) {
    $files[$type][$matched_name] = $file->uri;
  }
}

This code will try to scan your Drupal installation for the given file. It will not find the file and will eventually continue, but it will execute the file search on EVERY request that executes the module_load_include().

While our issue was in the imagecache_actions module, your issue might be in any module (even custom) which does a wrong module_load_include.
It's very hard to find this out yourself. You can edit includes/ around line 866 to write some info away to /tmp/drupal_debug.txt.
Add the following code after line 866:

else {
  drupal_debug('Missing file ' . $type . ' ' . $name . ' ' . DRUPAL_ROOT . '/' . $file . PHP_EOL);
}
TL;DR: an issue in imagecache_actions combined with an upgrade to Drupal >= 7.33 killed the performance of our site. Patch for imagecache_actions here. Causing issue/patch here.


Blair Wadman: Eleven tips to keep Drupal up to date with security releases

Thu, 27/11/2014 - 21:36

Keeping your Drupal site up to date has always been of critical importance to ensure it remains secure. Last month's announcement of a SQL injection vulnerability, and the subsequent announcement of automated attacks within 7 hours, caused widespread panic across much of the Drupal community.

Tags: Drupal, Planet Drupal

Appnovation Technologies: Searching and attaching images to content

Thu, 27/11/2014 - 18:00

Because of its ability to extend the core platform, Drupal has become a popular CMS/Framework for many large media and publishing companies.


Oliver Davies: Include environment-specific settings files on Pantheon

Thu, 27/11/2014 - 17:24

I was recently doing some work on a site hosted on Pantheon and came across an issue for which part of the suggested fix was to ensure that the $base_url variable was explicitly defined within settings.php (this is also best practice on all Drupal sites).

The recommended way was to use a switch() statement based on Pantheon's environment variable. For example:
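A sketch of that approach (Pantheon exposes the environment name in $_ENV['PANTHEON_ENVIRONMENT']; the URLs here are placeholders):

```php
<?php
// In settings.php: set $base_url per Pantheon environment.
if (isset($_ENV['PANTHEON_ENVIRONMENT'])) {
  switch ($_ENV['PANTHEON_ENVIRONMENT']) {
    case 'dev':
      $base_url = 'https://dev.example.com';
      break;

    case 'test':
      $base_url = 'https://test.example.com';
      break;

    case 'live':
      $base_url = 'https://www.example.com';
      break;
  }
}
```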


Cheppers blog: Acquia Certified Developer exam - passed!

Thu, 27/11/2014 - 13:21

We are proud to announce that Cheppers now has three Acquia Certified Developers!
This Monday, Mau, Attila and Andor all passed the exam held by Acquia, and we are very proud of them.


Kristian Polso: Crawling the top 15,000 Drupal websites

Thu, 27/11/2014 - 11:01
So I crawled the top 1,000,000 websites from Alexa, looking for all of the Drupal websites (and those of other popular CMSs). Here are the results.

Joachim's blog: A git-based patch workflow for (with interdiffs for free!)

Thu, 27/11/2014 - 09:39

There's been a lot of discussion about how we need github-like features on Will we get them? There are definitely many improvements to the way our issue queues work in the pipeline. Whether we actually need to replicate github is another debate (and my take on it is that I don't think we do).

In the meantime, I think that it's possible to have a good collaborative workflow with what we have right now on, with just the issue queue and patches, and git local branches. Here's what I've gradually refined over the years. It's fast, it helps you keep track of things, and it makes the most of git's strengths.

A word on local branches

Git's killer feature, in my opinion, is local branches. Local branches allow you to keep work on different issues separate, and they allow you to experiment and backtrack. To get the most out of git, you should be making small, frequent commits.

Whenever I do a presentation on git, I ask for a show of hands of who's ever had to bounce on CMD-Z in their text editor because they broke something that was working five minutes ago. Commit often, and never have that problem again: my rule of thumb is to commit any time that your work has reached a state where if subsequent changes broke it, you'd be dismayed to lose it.

Starting work on an issue

My first step when I'm working on an issue is obviously:

  git pull

This gets the current branch (e.g. 7.x, 7.x-2.x) up to date. Then it's a good idea to reload your site and check it's all okay. If you've not worked on core or the contrib project in question in a while, then you might need to run update.php, in case new commits have added updates.

Now start a new local branch for the issue:

  git checkout -b 123456-foobar-is-broken

I like to prefix my branch name with the issue number, so I can always find the issue for a branch, and find my work in progress for an issue. A description after that is nice, and as git has bash autocompletion for branch names, it doesn't get in the way. Using the issue number also means that it's easy to see later on which branches I can delete to unclutter my local git checkout: if the issue has been fixed, the branch can be deleted!

So now I can go ahead and start making commits. Because a local branch is private to me, I can feel free to commit code that's a total mess. So something like:

  // Commented-out earlier approach that didn't quite work right.
  $foo += $bar;
  // Badly-formatted code that will need to be cleaned up.
  if($badly-formatted_code) { $arg++; }

That last bit illustrates an important point: commit code before cleaning up. I've lost count of the number of times that I've got it working, cleaned up, and then broken it because I accidentally removed an important line that was lost among the cruft. So as soon as code is working, I make a commit, usually with a message like 'TOUCH NOTHING IT WORKS!'. Then, start cleaning up: remove the commented-out bits, the false starts, the stray code that doesn't do anything. (This is where you find out that it actually does do something, and everything breaks: but that doesn't matter, because you can just revert to a previous commit, or even use git bisect.)

Keeping up to date

Core (or the module you're working on) doesn't stay still. By the time you're ready to make a patch, it's likely that there'll be new commits on the main development branch (with core it's almost certain). And prior to that, there may be commits that affect your work in some way: API changes, bug fixes that you no longer need to work around, and so on.

Once you've made sure there's no work currently uncommitted (either use git stash, or just commit it!), do:

git fetch
git rebase BRANCH

where BRANCH is the main development branch that is being committed to on, such as 8.0.x, 7.x-2.x, and so on.

(This is arguably one case where a local branch is easier to work with than a github-style forked repository.)

There's lots to read about rebasing elsewhere on the web, and some will say that rebasing is a terrible thing. It's not, when used correctly. It can cause merge conflicts, it's true. But here's another place where small, regular commits help you: small commits mean small conflicts, that shouldn't be too hard to resolve.

Making a patch

At some point, I'll have code I'm happy with (and I'll have made a bunch of commits whose log messages are 'clean-up' and 'formatting'), and I want to make a patch to post to the issue:

  git diff 7.x-1.x > 123456.PROJECT.foobar-is-broken.patch

Again, I use the issue number in the name of the patch. Tastes differ on this. I like the issue number to come first. This means it's easy to use autocomplete, and all patches are grouped together in my file manager and the sidebar of my text editor.

Reviewing and improving on a patch

Now suppose Alice comes along, reviews my patch, and wants to improve it. She should make her own local branch:

  git checkout -b 123456-foobar-is-broken

and download and apply my patch:

  patch -p1 < 123456.PROJECT.foobar-is-broken.patch

(Though I would hope she has a bash alias for 'patch -p1' like I do. The other thing to say about the above is that while wget is working at downloading the patch, there's usually enough time to double-click the name of the patch in its progress output and copy it to the clipboard so you don't have to type it at all.)

And finally commit it to her branch. I would suggest she uses a commit message that describes it thus:

  git commit -m "joachim's patch at comment #1"

(Though again, I would hope she uses a GUI for git, as it makes this sort of thing much easier.)

Alice can now make further commits in her local branch, and when she's happy with her work, make a patch the same way I did. She can also make an interdiff very easily, by doing a git diff against the commit that represents my patch.
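Concretely, Alice's two diffs might look like this (7.x-1.x is the main development branch from the example; abc1234 stands in for the sha of the commit that captured the comment #1 patch):

```shell
# New patch for the issue: everything relative to the main branch.
git diff 7.x-1.x > 123456.2.PROJECT.foobar-is-broken.patch

# Interdiff: only Alice's changes on top of the earlier patch.
git diff abc1234 > interdiff.1-2.txt
```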

Incorporating other people's changes to ongoing work

All simple so far. But now suppose I want to fix something else (patches can often bounce around like this, as it's great to have someone else to spot your mistakes and to take turns with). My branch looks like it did at my patch. Alice's patch is against the main branch (for the purposes of this example, 7.x-1.x).

What I want is a new commit on the tip of my local branch that says 'Alice's changes from comment #2'. What I need is for git to believe it's on my local branch, but for the project files to look like the 7.x-1.x branch. With git, there's nearly always a way:

  git checkout 7.x-1.x .

Note the dot at the end. This is the filename parameter to the checkout command, which tells git that rather than switch branches, you want to checkout just the given file(s) while staying on your current branch. And that the filename is a dot means we're doing that for the entire project. The branch remains unchanged, but all the files from 7.x-1.x are checked out.

I can now apply Alice's patch:

  patch -p1 < 123456.2.PROJECT.foobar-is-broken.patch

(Alice has put the comment ID after the issue ID in the patch filename.)

When I make a commit, the new commit goes on the tip of my local branch. The commit diff won't look like Alice's patch: it'll look like the difference between my patch and Alice's patch: effectively, an interdiff.

  git commit -m "Alice's patch at comment #2"

I can now do a diff as before, post a patch, and work on the issue advances to another iteration.

Here's an example of my local branch for an issue on Migrate I've been working on recently. You can see where I made a bunch of commits to clean up the documentation to get ready to make a patch. Following that is a commit for the patch the module maintainer posted in response to mine. And following that are a few further tweaks that I made on top of the maintainer's patch, which I then of course posted as another patch.

Improving on our tools

Where next? I'm pretty happy with this workflow as it stands, though I think there's plenty of scope for making it easier with some git or bash aliases. In particular, applying Alice's patch is a little tricky. (Though the stumbling block there is that you need to know the name of the main development branch. Maybe pass the script the comment URL, and let it ask what the branch of that issue is?)

Beyond that, I wonder if any changes can be made to the way git works on A sandbox per issue would replace the passing around of patch files: you'd still have your local branch, and merge in and push instead of posting a patch. But would we have one single branch for the issue's development, which then runs the risk of commit clashes, or start a new branch each time someone wants to share something, which adds complexity to merging? And finally, sandboxes with public branches mean that rebasing against the main project's development can't be done (or at least, not without everyone knowing how to handle the consequences). The alternative would be merging in, which isn't perfect either.

The key thing, for me, is to preserve (and improve) the way that so often on, issues are not worked on by just one person. They're a ball that we take turns pushing forward (snowball, Sisyphean rock, take your pick depending on the issue!). That's our real strength as a community, and whatever changes we make to our toolset have to be made with the goal of supporting that.


PreviousNext: Lightning talk - Drupal 8's Third Party Settings Interface

Thu, 27/11/2014 - 03:22

During this week's developers' meeting, our lightning talk was all about Drupal 8's ThirdPartySettingsInterface.

Here's the video introduction to this powerful new feature in Drupal.


Acquia: Part 2 – Cal Evans and Jeffrey A. "jam" McGuire talk open source

Thu, 27/11/2014 - 00:53

Voices of the ElePHPant / Acquia Podcast Ultimate Cage Match Part 2 - I had the chance to try to pull Cal Evans out of his shell at DrupalCon Amsterdam. After a few hours, he managed to open up and we talked about a range of topics we have in common. In this part of our conversation we talk about 'Getting off the Island', inter-project cooperation in PHP and Drupal's role in that; the reinvention and professionalization of PHP core development; decoupled, headless Drupal 8; PHP and the LAMP stack as tools of empowerment and the technologists' responsibility to make devices and applications that are safe, secure, and private by default.


David Norman: Drupal Workstation Configuration

Wed, 26/11/2014 - 22:24

I use a MacBook Pro with Retina for work - they make the best hardware and operating system combination. I also use a Nexus 5, iPad 3, Kindle Paperwhite 2, and Chromebook. I had an iPhone for about 9 months - I didn't like it. If I shed one of my devices, it'd be the iPad.

I force my blog to use HTTPS. I assume government agencies broke SSL and have the root signing keys for certificate providers. I'm just hoping HTTPS makes sniffing a blog annoying enough to make a point to someone that sniffs traffic for a living.

Physical security

Immediately after turning on a fresh MacBook, I activate FileVault. At first, activating FileVault was a work policy I had to follow - now I want it. I went through the mental exercise of having all my personal items lost or stolen. I couldn't imagine an end to what lawyers would do to me if I lost control of their company's confidential information. The rule applies even when living a minimalist lifestyle - at least one minimalist tells how he got robbed while sleeping in his own home. FileVault 2 on OS X 10.8 makes the encryption completely transparent: it's hard to tell whether a drive is encrypted with FileVault without looking at System Preferences.

I use screensaver hot corners to activate the screensaver and password protection when I walk away from my MacBook.

When someone walks up to my workstation to talk to me or I need a restroom break, all I do is swipe the mouse cursor down to the bottom left corner and my MacBook screen is locked.

For additional security, I also use a screen protector. Though I telecommute, I frequently work at places other than my home, like the library, Regus business lounges, and county parks.

The 3M privacy screen protector is the only one I take seriously. It doesn't keep my screen from getting dirty, though it will block unexpected sneeze spittle - rather, it keeps wandering eyes next to me from reading what is on my screen. I prefer the black filter over the gold one, though I agree with reviewers who say the gold filter is more effective.

Password management

Hackers compromised my MtGox account in 2011. Unfortunately, my password for MtGox was the same one I used on every other website. After the release of the password file, anyone could have assumed my digital identity online. The silly part was that I already used a wonderful password manager - 1Password.

Today, I keep passwords in both 1Password and Lastpass - I don't re-use passwords anymore. Generating a random password for every site makes me so dependent on having a password manager that having redundancy doesn't bother me. Adding Lastpass to my toolbox gave me options for new toys, like using a Chromebook, since Lastpass is platform-agnostic.

Version control

When I started doing public development of Thatware, someone suggested using CVS to manage contributions from other people and I thought they were nuts. All CVS was going to do was add a bunch of aggravating complexity. I still don't think Thatware ever reaped anything worthwhile by using CVS before I abandoned it. By the time I introduced CVS to the project, my interest in Thatware was already waning.

My progress on Thatware stalled after an email I got from Dries Buytaert asking why I didn't use table joins anywhere in Thatware's database queries. I didn't know what a table join was at the time - the little bit of SQL I had implemented was all copied from a post I found on Dev Shed. As it turns out, doing a couple table joins instead of queries, nested in loops, is about 10 times faster. Dries proved it when he had me benchmark alternative scripts. Those scripts turned into what people know today as Drupal. Needless to say, I yielded to his superiority.

In time, I shifted my CVS knowledge from Thatware to Drupal. When I discovered I could commit to any contributed module's CVS repository, I spread my generosity where it wasn't wanted, which ended with two commits to the Xstatistics module. After that time, CVS commits got access hooks to verify maintainers were listed in the project node. I still intend to see the Windows-style newlines removed someday, even if the project stays in an officially abandoned state.

I used Subversion for professional work alongside CVS for community code. I thought Subversion and CVS were equal platforms with different methods for versioning files.


Today, I use Git. When I was using Subversion, I thought it was fine. I only switched to Git because the Drupal project did. Now I've experienced the ease of branching in Git versus Subversion. Doing local merges makes resolving conflicts easier. I'll never go back to Subversion by choice.

The normal git log output is useless to me, so I never run it. I do get some value by running an alternate version of the log command that outputs the merge tree.

git config --global alias.lg "log --color --graph \
  --pretty=format:'%Cred%h%Creset \
  -%C(yellow)%d%Creset %s \
  %Cgreen(%cr)%C(bold blue)<%an>%Creset' --abbrev-commit"

Look at the difference between the output of git log and git lg.

Git versions before 2.0 would push all of the local branches that matched a remote branch name, using the "matching" method. As of Git 2.0, the default switches to the more conservative "simple" behavior, pushing only the current branch to the corresponding remote branch it is configured to follow. The new 2.0 "simple" behavior is set through

git config --global push.default simple

To retain the older method of pushing all matching branches, even after upgrading to Git 2.0:

git config --global push.default matching

Terminal

My colleagues recommended iTerm as a replacement for the built-in, Apple-provided terminal. It offers options for a split pane, autocomplete, and paste history which aren't available with the Apple terminal.

iTerm also protects me against myself by blocking a system shutdown when I have a terminal open. Before iTerm blocked shutdowns on my Mac, I would shut down my Mac with an open SSH or mosh connection. Since I never use screen, shutting down sometimes broke something in progress, like an apt-get upgrade.

The default theme of either terminal is not suitable for me - or at least I think it could be improved. Instead, I use the Tomorrow Night theme. I combine Adobe's Source Code Pro font with a Druplicon to make my default terminal.

The Druplicon ASCII art shows at the start of each new terminal window I open, whether I use Apple's Terminal or iTerm, by using the system MOTD. The Internet is full of web-based tools that can convert an image file to ASCII art, but I created the Druplicon by hand so I could customize the output to fit in 80 columns and 25 lines for the default terminal window size.

Mac OS X doesn't come with a /etc/motd file by default, but if you create one, it will get used. To get access to write to /etc, you'll need to sudo as a user with administrative rights.

sudo su -
cat ~/Downloads/ascii_druplicon.txt > /etc/motd
exit

OH MY ZSHELL!

Bash was all I needed as a command-line interpreter until I started using git. Once I was informed about oh-my-zsh and the various plugins it supports, it became a huge annoyance to run any commands inside a git repository without the assistance of zsh.

Robby Russell hosts the official installer. It only needs one command in a terminal.

curl -L | sh

After installation, examine the .zshrc file in your home directory. Two options to pay particular attention to are ZSH_THEME for theme and plugins for plugins.
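As a sketch, the relevant lines of ~/.zshrc look something like this (gallifrey and the git plugin ship with oh-my-zsh; the exact choices are per taste):

```shell
# ~/.zshrc excerpt: pick a theme and enable plugins.
ZSH_THEME="gallifrey"
plugins=(git)
```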

Throughout this book, screenshots I take of a terminal window use the gallifrey zsh theme.

I could live with just the git plugin - I have a tendency to enable zsh plugins that I have no idea how to use. When working in cloned git repositories, the git plugin for zsh shows the checked-out branch of your current directory as well as any subdirectories.

In the case of setting variables for a plugin like jira, I could either create a ~/.jira-url file or add the JIRA_URL setting to ~/.zshrc. There's a trade-off between being able to

cat "" > ~/.jira-url

or centralizing the configuration. I opted for putting it in the ~/.zshrc file directly because I think I'm more likely to remember that the jira command is a zsh plugin and look for the configuration in zsh's file.
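For reference, the file-based option is a single command - the server address below is a placeholder, not a real JIRA instance:

```shell
# Write the server address into the file the jira plugin reads.
# jira.example.com is a placeholder - substitute your own server.
echo 'https://jira.example.com' > "$HOME/.jira-url"
cat "$HOME/.jira-url"
```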

Show hidden files

The operating system doesn't matter - I like to see all the hidden files on my filesystems. In Mac OS, I show them all by referencing an article on Lifehacker to get the correct terminal commands.

defaults write \
  AppleShowAllFiles TRUE
killall Finder

Commands like this can also be aliased in your ~/.zshrc file.

# Show/hide hidden files in Finder
alias show="defaults write \
  AppleShowAllFiles -bool true && killall Finder"
alias hide="defaults write \
  AppleShowAllFiles -bool false && killall Finder"

Since I use BlueHarvest to minimize the number of .DS_Store files on my Mac and network drives, I periodically see files show in Finder, then disappear. On my desktop, the Mac system files sometimes stack on top of files I want, making it annoying to work with files from my desktop.

The default settings for BlueHarvest 5 only remove Apple files on remote drives, so I add my home directory as an additional location.
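For comparison, a manual sweep of the kind BlueHarvest automates can be sketched with find; the demo directory below is illustrative - in real use you'd point it at your home directory or a network mount:

```shell
# Create a demo tree containing stray .DS_Store files.
mkdir -p /tmp/blueharvest-demo/sub
touch /tmp/blueharvest-demo/.DS_Store /tmp/blueharvest-demo/sub/.DS_Store
# Delete every .DS_Store in the tree.
find /tmp/blueharvest-demo -name '.DS_Store' -type f -delete
# Count what's left - prints 0.
find /tmp/blueharvest-demo -name '.DS_Store' -type f | wc -l
```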




I suspect antivirus is an under-utilized tool on Mac OS X installations, but I don't let that stop me from running it. ClamXav is a free antivirus utility which can monitor directories in addition to a standard, manual scan.

ClamXav can be set up as passive or active: scan only the files you tell it to, or your entire computer. I activated Sentry to monitor my home folder and scan new files as they arrive. I usually download files from the Internet to either ~/Downloads or ~/tmp. Covering my entire home directory also catches a virus unknowingly downloaded through Apple Mail or saved as browser cache by Google Chrome.


Crashplan is the only backup solution I've found where I can back up my files encrypted both to an offsite datacenter and to another computer in my house. The other computer in my house is key - when I used other services like Jungle Disk, the restore process was painful. Downloading hundreds of gigabytes of archived files over the Internet is painful. At least when I need to do a restore from another computer in my house, I can control the network conditions. Restoring files over the Internet is only my plan for catastrophic events - burglary, fire, tornado.


I store my Calibre library; 1Password keychain; Alfred, Codekit and TextExpander configurations; Leanpub documents; and a few small file backups encrypted by TrueCrypt.

Browsers and plugins

I used Netscape Navigator back in the 1990s. Since then, I've felt some membership on team Firefox. Truthfully, since I switched to using Android for my phone, I've stuck with Google Chrome. I feel like Chrome simplifies my life somehow. The market dominance Chrome gained in 2013 doesn't give me much reason to move back to Firefox, either.

Chrome extensions
  • 1Password for password management
  • AdBlock to block advertisements
  • Buffer to queue up messages on social networking to target readers at times they're most likely to be reading social networks
  • Dreditor adds functionality for patch reviews and Git commit messages to the issue queue
  • Eye Dropper to show me hex colors of pages on the Internet
  • Google Analytics Opt-out Add-on in hopes of reducing how much I get tracked on the Internet
  • Google Docs was added by my Chromebook
  • HTTPS Everywhere because the Internet should be encrypted
  • Incognito Close closes incognito tabs if they're inactive for 30 minutes
  • Lastpass integrates with my Lastpass vault
  • MeasureIt! measures pixels
  • Mosh is another way I can use a terminal to a remote server while I'm using my Chromebook
  • Readability is a shortcut for sending articles for me to read later on Readability
  • Rescuetime for Chrome tells RescueTime what pages I'm visiting instead of just recording that I'm using Chrome; otherwise, it can't help gauge my web-based productivity
  • Secure Shell so I can SSH into remote servers from my Chromebook
  • Send to Kindle for Google Chrome can send websites as e-books to my Kindle Paperwhite
  • Subhub opens GitHub files in Sublime Text when paired with the Subhub package in Sublime Text
  • Tailor is an offline code editor with Dropbox and Google Drive integration
  • Text is a text editor
  • Type Fu implemented my Norman keyboard layout experiment for learning how to type
  • Writebox is an offline text editor with Dropbox and Google Drive integration
  • Wunderlist for Chrome makes a shortcut to my Wunderlist task archive

It occurred to me one day that if Drupal had hundreds of thousands of downloads, and those Drupal installations each had millions of page views, then compressing images in the Drupal core distribution could have a ripple effect, saving thousands of gigabytes of unnecessary network traffic [1-9].

The image output from programs like GIMP, Photoshop, Skitch, and even the built-in screenshot hotkeys for Mac OS X carries a bunch of unnecessary comments and color profiles that have no relevance on the Internet. Even after editing a JPG in GIMP and exporting it with 80% compression, there is still an opportunity to remove extra bytes without any quality loss.

ImageOptim wraps PNGOUT, Zopfli, Pngcrush, AdvPNG, OptiPNG, JpegOptim, jpegrescan, jpegtran, and Gifsicle. I keep ImageOptim in my Mac Dock, then drag image files to the ImageOptim icon to optimize them. I find the output saves 5-25% on file size.

Rather than taking the default output from CMD+SHIFT+4, I prefer to specify the type of interaction, the output location, and the filename. This screencapture command outputs a screenshot of whatever window I click on, in JPG format, on my desktop. After I process the output with ImageOptim, the screenshot complies with all the size and format rules I need for adding it to this book.

screencapture -tjpg -oW ~/Desktop/imageoptim.jpg

ImageAlpha uses the same principles as ImageOptim, except that it works specifically on PNG files, applies lossy compression, and converts the PNG to PNG8+alpha format - that's a fancy way of saying it proposes to convert the image from full color to 256 colors. The 256-color palette change by itself can reduce image sizes by around 70%, and you can probably only tell the difference on full-color photos. pngquant is the backend utility for ImageAlpha.

Kindle Previewer





The evening is a tricky time to balance activities. While I know I shouldn't spend much of any evening on the computer, sometimes I do, even if I'm just working on our family budget. When it's dark in the house, it's easy for a phone or computer screen to blind me, even after the computer does a brightness compensation for the dark.

f.lux makes my computer screen tint match the lighting in the room at night. It goes beyond just a brightness change, because the blue tint of the computer screen without a red filter at night doesn't match what the average light bulb emits. If I worked in an office cave, I could set the daylight hours to match the fluorescent lighting in the room.

The preferences automatically identify my location. With the transition time set to 1 hour, I don't even notice the screen tint change. I think f.lux is an application that every computer user should install, regardless of your technical skills and abilities. I'd have my father-in-law install it.

E-book management

I'm not a fan of how Amazon set the Kindle book royalty structure for authors, but I am fond of their Paperwhite for the non-glare screen and their centralized document management. Their implementation for sending an email with an e-book attachment to my Kindle is a great system.

After I buy an e-book, I think I should own it. Kindle DRM doesn't follow that spirit, since Amazon makes it hard to move a book from my Kindle to iBooks. Apprentice Alf has DRM removal tools for e-books. My DRM-free e-books are managed by Calibre.

Calibre is free. The DRM removal plugin for Calibre makes it possible to import books from my Kindle to my Dropbox. I convert my books to other formats and store whitepapers, documentation, and presentations in Calibre so I can find them easily when I decide to email them to friends.

Meeting online

Shush, Audio Hijack Pro, MicTracker, Skype, IRC, Squiggle, Dropcam, GoToMeeting, FreeConferenceCall, Zoom Cloud Meetings


Divvy sorts out all my windows when I'm doing research between iTerm, Google Chrome, and Sublime Text. By clicking the Divvy icon in the menu bar, I can drag across a grid to organize my open applications to not clobber each other on the screen. It divides my screen into exact portions so I don't have to mess with dragging windows to specific sizes.

Divvy also has a method for organizing applications by shortcut. One global keyboard shortcut setting opens the Divvy grid. Then, a second combination, specific to your Divvy preferences, sets the window sizes according to your own preset grid.


Hazel advertises itself as a personal housekeeper. It's a rule engine that keeps files organized by whatever means you tell it. The default rules use color labels for downloads so I can quickly find new downloads and see reminders when it's time to archive installers to my Drobo.

iStat Menus requires an admin account to install a resource monitor. I configure it to show my CPU load, memory use, disk space consumption, and temperatures for my CPU and GPU.

Internet tethering

HoRNDIS is a Mac OS X driver for using my Android phone's native USB tethering mode. The driver is free and the source is available on GitHub.

After installing the driver, I didn't have to configure anything. I just plugged my Nexus 5 into my Mac, then activated USB tethering on my phone in the Settings app.

Since I tether frequently, I added a widget on my phone that opens the Tethering & portable hotspot settings. When you add a Settings shortcut widget and place its icon, the widget asks which settings screen you'd like it to load.

The Portable Wi-Fi hotspot option on my Android also works, but I've found that using it in a building with a bunch of other wireless networks seems congested. There seems to be a limit to how much wireless traffic can broadcast through the air before it all jumbles together. Since I keep my phone plugged in anyway while tethering to keep the battery from draining too fast, USB tethering becomes the overall more attractive option.

Android File Transfer is not available while using USB tethering.


My Drobo is one of two things I've ever locked to my desk with a Kensington lock. Thanks to having SSH and rsync on my Drobo, I can rsync backup snapshots over my home network. For that, I created a bash script that I keep in ~/bin, which is in my ~/.zshrc PATH.

#!/bin/sh
rsync -a --delete --progress --ignore-errors \
  --rsh=ssh /Users/davidnorman \
  root@
rsync -a --delete --progress --ignore-errors \
  --rsh=ssh /Applications \
  root@
rsync -a --delete --progress --ignore-errors \
  --rsh=ssh /private \
  root@

Other apps to consider

bee, brackets, byword, caffeine, pckeyboardhack, quiet, readkit, rescuetime, capsee, carboncopycloner, requiem, resolutiontab, codekit, codebox, coffitivity, sequel pro, shortcat, smartsynchronize, sourcetree, spideroak, expandrive, tower, gpgtools, gabble, github, google drive, unplugged, kaleidoscope, ia writer, icon slate, keyremap4macbook, libreoffice, owncloud, ripit vs handbrake, brew install youtube-dl

  1. Smush core images 

  2. Optimize module and theme images 

  3. Use transparent PNG instead of opacity for overlay background to gain rendering performance 

  4. Open/Close Drawer link doesn't show focus 

  5. Create a new Druplicon logo for Bartik 

  6. System messages need identifying icons (WCAG 2.0) 

  7. Optimize Bartik images 

  8. Optimize seven ui icons 

  9. Compress Images using Lossless Techniques 

Post categories Drupal
Catégories: Elsewhere

David Norman: Sublime Text for Drupal

mer, 26/11/2014 - 22:02

The book The Pragmatic Programmer: From Journeyman to Master advocates using a single editor for as many tasks as you can.

We think it is better to know one editor very well, and use it for all editing tasks: code, documentation, memos, system administration, and so on. Without a single editor, you face a potential modern day Babel of confusion. You may have to use the built-in editor in each language's IDE for coding, and an all-in-one office product for documentation, and maybe a different built-in editor for sending e-mail. Even the keystrokes you use to edit command lines in the shell may be different. It is difficult to be proficient in any of these environments if you have a different set of editing...

At this point in writing this book, I've already used Penflip, Leanpub, Sublime Text, StackEdit, Tailor, Writebox, a pad of paper, and two dry erase boards.

My trial of other editors in this project was part of routine experimentation. Staying informed about the latest tools is necessary to avoid becoming a technological dinosaur. In 2013, I settled deep into using Sublime Text for as many things as I could. Viewing and editing files in Sublime wasn't part of a conscious effort; rather, I think it was just a natural result of what Andrew Hunt and David Thomas identified in their Pragmatic Programmer book.


As a developer, themer, designer, or manager of any of those, I've found my editor to be a critical tool and one in which I spend a lot of time. In 2013, Sublime Text was my third-most used application.

What the RescueTime chart doesn't show clearly is that I actually spent most of my time processing email, because I used Gmail as my email client at work for an extended trial period. I tell people I work with that email should be considered a low-priority medium of communication - one in which I might respond in a week, when I attempt to regain my inbox zero status. This chart has a long tail of other websites and applications spreading out to 3965 items, including me browsing and watching YouTube.

Sublime Text is the second place where zsh plugins shine. GitHub is littered with various instructions and alias files to create a terminal alias to run Sublime Text from a terminal. When you add the sublime plugin to your ~/.zshrc file, it handles aliasing the subl binary packaged in the regular Sublime bundle.

Whether on Linux or Mac, the zsh plugin searches the /usr/local/bin and /Applications paths to locate the subl binary and make it available as both subl and st. Having some basic vi or nano knowledge for server management and devops is nice, but when you have a perfectly good GUI available, I say use it. zsh also makes an stt command available to the terminal for opening the current directory in Sublime Text.

Basic syntax

Sublime Text uses the Preferences / Settings - User menu to create a file named Preferences.sublime-settings for storing various formatting options. To match most of the Drupal coding standards, start with the following JSON settings:

{
  "default_line_ending": "unix",
  "ensure_newline_at_eof_on_save": true,
  "fallback_encoding": "UTF-8",
  "rulers": [80],
  "shift_tab_unindent": true,
  "spell_check": true,
  "tab_size": 2,
  "translate_tabs_to_spaces": true,
  "trim_automatic_white_space": true,
  "trim_trailing_white_space_on_save": true,
  "use_tab_stops": true,
  "word_separators": "./\\()\"'-:,.;<>~!@#%^&*|+=[]{}`~?"
}

Sublime Text can have multiple rulers. Set up one for drafting email, one for code, and another for markdown documents.

"rulers": [72, 80, 120],

Pasting code from another file or from a sample I find on the Internet is annoying, because I have to spend the next 30 seconds fixing the indentation. And even if the indentation levels happen to match, chances are I didn't select the leading whitespace, so I still have a formatting hassle to sort out. The Paste and Indent command automatically adjusts the indentation of the code from the clipboard to match the surrounding code. Add the following to the Default (OSX).sublime-keymap file through the Sublime Text - Preferences - Key bindings - User menu:

{ "keys": ["super+v"], "command": "paste_and_indent" },
{ "keys": ["super+shift+v"], "command": "paste" }

Sublime Text package manager

The Sublime Text package manager is not part of the default installation. There are only two steps in the instructions, which create a new Sublime Text Preferences menu named Package Control.

Package control options are nested into the Sublime Text Command Palette. The shortcut to open the command palette on Mac is cmd+shift+p, or ctrl+shift+p for Windows and Linux, from within a Sublime Text window. All the package management options begin with Package Control: - the list is narrowed by typing package, or even just install, in the command palette prompt.


Once you've navigated to the Package Control: Install Package menu, typing Drupal in the next Command Palette prompt should show the Drupal package by robballou.

The Drupal package has various completions for Drupal 7, which it can apply within the context of filename extensions. For example, in projects with a .info file, the Drupal package adds a syntax file type named Drupal Info File and autocomplete triggers based on name, description, dependencies, package, core, php, version, files, screenshot, engine, regions, features, scripts, settings, and stylesheets.

The Drupal package goes on to have autocomplete abilities for a comprehensive list of all Drupal's core functions, hooks, form API, theming, and several contributed modules - devel, migrate, and views.

The autocomplete functionality is complete enough to interpret your current $query variable in PDO syntax and expand upon condition statements to provide the syntax for the field, value, and operator in the condition statement. Each of the Drupal completions is also context-sensitive, meaning that if you want to access the fapi autocomplete shortcut for a markup item, the autocomplete menu won't appear unless you are in the context of a PHP function, within a valid <?php block of a PHP file.

Sublime Text autocomplete snippets for any language - Drupal, PHP, or Python - work on a principle of tabs. As long as you have an idea of what snippet trigger you need, the tab key triggers the autocomplete event. The Drupal package can also serve as an alternative to copying and pasting code from the Examples for Developers project, since shortcuts like block fill out an entire list of stub functions; you can then name all the functions with the same prefix in one effort by tabbing between elements in the stub.


The Behat and Behat Features packages are close in functionality. The maintainer of Behat Features originally forked his code from the Behat project in apparent frustration at getting changes added back to the Behat project on GitHub.

Both Behat packages have equivalent syntax highlighting for .feature files, except that the Behat Features package adds support for French, German, Italian, Portuguese, Spanish, and Russian. Though I only read English, I installed Behat Features.

Tomorrow theme

Continuity between my terminal and Sublime Text is nice, so I theme my editor with the Tomorrow color scheme. The Tomorrow theme can be activated through the Sublime Text / Preferences / Color Scheme / Tomorrow Color Schemes menu on Mac OS X.

Sass and SCSS

As I browse through the Sublime Text packages, it's easy to see two packages like Sass and SCSS and assume I need them both - the Sass syntax was superseded by SCSS. Upon closer inspection, both packages for Sublime Text highlight Sass and SCSS.

Though the SCSS package pitches itself as the "official bundle", I only installed the Sass project. I was troubled by the management of the SCSS project because it kept the Sublime Text package within a branch of the larger SCSS TextMate repository - the same repository which also has a Chocolat truffle in a different branch. Using git branches in that manner is wrong - each of the supported bundles should be its own repository for each editor.

The Sass package for Sublime Text is properly managed all as a single bundle in a fork from the SCSS project. The Sass fork took time to pull contributions from other forks. I consider the result to be more curated. Since Sass is a fork of SCSS, it contains all of what was "official" up until the fork anyway. Installing both packages would be redundant.


The DocBlockr package completes docblock comments in code. In PHP, just typing /** then return, will examine the context in which you're typing, then fill in the rest.

DocBlockr's formatting is not Drupal comment standards compliant, so some adjustments are necessary. Navigate the Sublime Text menus on Mac OS X to Sublime Text / Preferences / Package Settings / DocBlockr / Settings - User.

The following modifications to the default rules bring the DocBlockr output into a reasonable proximity to compliance with Drupal's standards.

{
  // Align words following the @tags.
  // Values: no, shallow, deep.
  "jsdocs_align_tags": "no",
  // Add blank lines between the description
  // line and tags of different types.
  "jsdocs_spacer_between_sections": true,
  // Shorten primitives such as "boolean"
  // and "integer" to "bool" and "int".
  "jsdocs_short_primitives": true,
  // Add a '[description]' placeholder
  // for the @return tag.
  "jsdocs_return_description": false,
  // Add a '[description]' placeholder
  // for the @param tag.
  "jsdocs_param_description": false
}

Though Drupal's comment standards prefer descriptions added to @param and @return, DocBlockr doesn't have a setting to shift the description to the line below each block tag. Instead, the above configuration omits the description placeholder - you'll have to add it on your own. Also note, the configuration file for DocBlockr user preferences is Base File.sublime-settings - it doesn't have DocBlockr in the name as I expected.

DocBlockr does not conflict with the Drupal package. When typing hook then pressing tab in a PHP file, you'll still get the same magic function stub, complete with docblock.

/**
 * Implements hook_function().
 */
function example_function($args) {
}

SublimeGit

The Git and SublimeGit packages offer similar functionality - another scenario where installing both is redundant.

I use SublimeGit for git diff when I want to make a quick patch for a Drupal module. Though I check out modules in iTerm, the process to update the contributed module versions in the Guardr project make file could look something like this.

  1. Load the project page. It's usually in my browser history.
  2. Click the "Version control" tab
  3. Copy the git clone line to the clipboard
  4. cmd+space, iT (to autocomplete iTerm)
  5. cd to ~/Sites
  6. paste the clone line for Guardr, then cd into the clone
  7. subl d<tab>.<tab>
  8. Make required version updates in the Sublime Text window
  9. cmd+s to save
  10. cmd+shift+p, git dif, and save the result as a .patch file for uploading to an issue on
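Step 10's git diff can also be done from a plain terminal; here's a sketch in a throwaway repository (the paths and file contents are illustrative, not the real Guardr make file):

```shell
# Build a throwaway repo with one committed file.
rm -rf /tmp/patch-demo && mkdir -p /tmp/patch-demo && cd /tmp/patch-demo
git init -q
echo 'version = "1.0"' > demo.make
git add demo.make
git -c user.email=demo@example.com -c user.name=demo commit -qm 'initial'
# Edit the file, then capture the change as a patch.
echo 'version = "1.1"' > demo.make
git diff > fix-module-versions.patch
grep '^+version' fix-module-versions.patch   # the updated line is in the patch
```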

SublimeLinter with SublimeLinter-php can do syntax checking, a basic function found in most any code-centric IDE. SublimeLinter has PHP-related syntax check plugins, including ones for PHP_CodeSniffer and PHP Mess Detector. I haven't seen a SublimeLinter plugin for PHP PSR compliance, but the detailed notes on writing a SublimeLinter plugin explain how someone might go about creating such support.

Since SublimeLinter isn't specific to just PHP, it can also handle lint checks for Puppet manifest files, CSS, Perl, Ruby, Javascript, and a bunch of others. I picked SublimeLinter over the Phpcs package. If I later decide to use the SensioLabs standards fixer, Phpcs can run just that portion of the package and leave the rest to SublimeLinter.

Other notable Sublime Text packages

Alphpetize sorts the methods alphabetically within a PHP class. The sorting preserves Docblocks and any sloppiness that floats between functions within the class gets collected at the top of the class. I've never actually used this plugin for what it's supposed to do because I sort my PHP functions in order of importance. Drupal hook implementations go at the top of the files, followed by functions I'm most likely to rank as important or might need frequent editing, then utility functions at the end. Merely having this plugin ready appeals to my inner-OCD.

BlameHighlighter adds two options to the Sublime Text command palette - one highlights lines you have edited, according to Git, the other clears the highlights.

Find++ can expand the file search to look at all the open files, the files in the project, or a selected directory. Normally, the find function works only within the current open file. The results show in a new file, along with the filename and line number.

GitGutter uses the Sublime Text gutter to add indicators to the left of line numbers that show whether a line has been inserted, modified, or deleted.

BracketHighlighter also uses the gutter space to show where brackets of various types start and end: (), {}, [], <>, "", '', ``. A caveat I've noticed is that a conflict between two gutter plugins usually results in one winner - this one loses to GitGutter, so it fails to show the start or end of a bracket on a line that has changed since the latest git commit.

The spell checker in Sublime Text does a decent job, but it lacks context, industry jargon, and colloquial terms. Adding the Google Spell Check package brings some sanity back to my spelling. The author even made a screencast to show its super powers.

GPG performs the basic functions of encrypt, sign, decrypt, and verify for ASCII armored text. Since GPGMail and Enigmail can handle GPG in the email stream, I use GPG for verifying GPG signatures on files or for otherwise handling the miscellaneous exceptions to normal GPG use.

SublimeJira adds command palette options to get, update, and create issues on a JIRA server. Getting issues prompts for an issue key, then displays the issue in a new tab of Sublime Text.

Phpcs has crossover functionality with SublimeLinter in terms of running php -l on the current file. It can also run PHP_CodeSniffer, PHP Mess Detector, Scheck, and the PHP Coding Standards Fixer by SensioLabs for PSR compliance. The formatting notifications use a combination of the quick panel, gutter, and status bar to notify about non-compliance.

Drupal 8 uses PHPUnit for unit tests instead of Drupal 7's Simpletest. This package adds support for right-clicking test files from within Sublime Text to run them.

SideBarGit adds a git menu when right-clicking on files in the sidebar.

SSH Config highlights syntax in ~/.ssh/config files. It handles essential autocompletion for adding new hosts to the config file, as well.
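A minimal entry of the kind the package autocompletes might look like this (the host name, user, and port are illustrative):

```
Host example-server
    HostName example.com
    User deploy
    Port 22
```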

Subhub uses a combination of a Chrome extension and a Sublime Text package to insert an "Open in Sublime" button on pages in Chrome when you browse projects on GitHub.

Instead of loading a separate diff program, Sublimerge shows diffs for git in side-by-side tabs. It's nice to see differences, but I haven't been able to actually change the edited file while Sublimerge shows the diff. Instead, Sublimerge forces me back to edit mode to make changes. I'm a fan of SmartSynchronize and its ability to merge and edit in realtime, but it's also annoying to switch to a different application.

SublimeWritingStyle helps you use active, instead of passive voice, and highlights judgmental words like "very", "clearly", and "relatively". Passive voice instances highlight in yellow and judgmental words in grey.

WordCount shows the words and/or characters in a document or a selection within a document, as well as an estimated read time. Digits are excluded. I use the following custom WordCount user settings. A few casual minutes of research seem to show that a reasonably educated adult reads at 265 words per minute, depending on whether you're reading for comprehension, pleasure, or scanning on the Internet.

{
  "enable_readtime": true,
  "readtime_wpm": 265,
  "whitelist_syntaxes": [
    "Markdown",
    "MultiMarkdown",
    "Plain Text"
  ],
  "blacklist_syntaxes": [
    "CSS",
    "SQL",
    "JavaScript",
    "JSON",
    "PHP"
  ]
}
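The read-time arithmetic itself is simple - word count divided by 265, rounded up. A shell sketch against a generated 530-word demo file:

```shell
# Generate a demo file containing exactly 530 words.
seq 530 | sed 's/.*/word/' > /tmp/readtime-demo.txt
words=$(wc -w < /tmp/readtime-demo.txt)
# Ceiling division: 530 words at 265 wpm rounds up to 2 minutes.
echo "$(( (words + 264) / 265 )) minute(s)"
```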

WordHighlight highlights all copies of a word that is currently selected or, optionally, all copies of the word under the insertion cursor. The README file on the GitHub project page also has a number of configuration options, including treatment of the gutter bar, the status bar, and how to configure the highlight colors.

Xdebug Client can do breakpoints through clicking lines in the Sublime Text screen and set watch expressions. More on this package later.

Post categories Drupal
Catégories: Elsewhere

Four Kitchens: Extracting data from Drupal entities the right way

mer, 26/11/2014 - 19:54

If you’ve ever had to extract data from Drupal entities you know it can be a painful process. This post presents a solution for distilling Drupal entities into human-readable documents that can be easily consumed by other modules and services.

Projects Drupal
Catégories: Elsewhere

Drupal 8 Rules: #d8rules update October-November 2014

mer, 26/11/2014 - 18:28

Curious why the Rules module still isn't finished, even though our initial milestone planning aimed to ship Rules 8.x by 2014? The bad news is we are a bit delayed, but that pretty much aligns with the state of Drupal 8 core. And we are working hard on solving problems in a generic way for all of Drupal core instead of having to fork APIs in Rules.

DrupalCon Amsterdam Recap

We were quite active at DrupalCon Amsterdam: I gave a 12-minute update on the #d8rules initiative (video & slides) and we had some productive discussions on crowdfunding in general at the community summit (notes & related core conversation).

On Thursday, we had a BoF to get contributors up to speed with development for Rules in Drupal 8 and on Friday we sprinted the whole day to port actions and get Milestone 1 finished.

Development status 

As [META] Rules 8.x Roadmap states, most of the funded Milestone 1 tasks have been finished, but a number of core integration tasks are still in progress. The unified context system is going to be used across the Blocks, Conditions, and Actions APIs, as well as related contrib modules like the Drupal 8 version of Page Manager. In Drupal 7, CTools and Rules together with the Entity module basically invented two separate "context systems"; compared to that, this is a big step forward and will bring site builders and developers much better plugin interoperability & re-usability in Drupal 8.

Some core tasks we are currently working on are:

In Rules 8.x-3.x, we recently finished the conversion of all condition and action test cases to PHPUnit-based integration tests, which helps test the plugins including their annotations and runs very fast (without a full bootstrap). During the sprints in Amsterdam we worked with contributors on porting more of Rules' actions to the new Action API, including test coverage. We'll continue to work on porting actions with contributors via the issue queue and run sprints at the coming events we attend.

Next steps & events

We anticipate using up the budget raised by the end of 2014. Next year, we will have to look for further funding sources or limit work on Rules 8.x to our free time. Thanks again to everyone supporting us so far!

There are already some great Drupal events lined up for next year where the #d8rules initiative will be present:

DrupalCon Bogota, 10-12 of February 2015, will feature a full session on upgrading modules to integrate with Rules in Drupal 8 by dasjo: #d8rules - Web-automation with Rules in Drupal

European Drupal Days Milano, 19-21 of March 2015  will include a session, training workshop & sprints provided by fago & dasjo.

Let us know if you want to get involved at any of these events, and see you in the core & Rules issue queues!

dasjo on behalf of the #d8rules team 

Catégories: Elsewhere

Code Karate: Drupal 7 File Resumable Upload Module

mer, 26/11/2014 - 16:04
Episode Number: 181

The Drupal 7 File Resumable Upload Module is a great way to allow your Drupal site to upload large files. This is especially helpful if your server limits the size of files you can upload to your Drupal site. The module simply replaces the standard Drupal file upload field with a better alternative that allows:

Tags: DrupalDrupal 7File ManagementMediaDrupal Planet
Catégories: Elsewhere