Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

Cheppers blog: Acquia Certified Developer exam - passed!

Thu, 27/11/2014 - 13:21

We are proud to announce that Cheppers now has three Acquia Certified Developers!
This Monday, Mau, Attila, and Andor all passed the exam held by Acquia, and we are very proud of them.

Categories: Elsewhere

Kristian Polso: Crawling the top 15,000 Drupal websites

Thu, 27/11/2014 - 11:01
So I crawled the top 1,000,000 websites from Alexa, looking for all of the Drupal websites (and other popular CMSs). Here are the results.
Categories: Elsewhere

Joachim's blog: A git-based patch workflow for drupal.org (with interdiffs for free!)

Thu, 27/11/2014 - 09:39

There's been a lot of discussion about how we need github-like features on d.org. Will we get them? There are definitely many improvements in the pipeline for the way our issue queues work. Whether we actually need to replicate github is another debate (and my take on it is that I don't think we do).

In the meantime, I think that it's possible to have a good collaborative workflow with what we have right now on drupal.org, with just the issue queue and patches, and git local branches. Here's what I've gradually refined over the years. It's fast, it helps you keep track of things, and it makes the most of git's strengths.

A word on local branches

Git's killer feature, in my opinion, is local branches. Local branches allow you to keep work on different issues separate, and they allow you to experiment and backtrack. To get the most out of git, you should be making small, frequent commits.

Whenever I do a presentation on git, I ask for a show of hands of who's ever had to bounce on CMD-Z in their text editor because they broke something that was working five minutes ago. Commit often, and never have that problem again: my rule of thumb is to commit any time that your work has reached a state where if subsequent changes broke it, you'd be dismayed to lose it.

Starting work on an issue

My first step when I'm working on an issue is obviously:

  git pull

This gets the current branch (e.g. 7.x, 7.x-2.x) up to date. Then it's a good idea to reload your site and check it's all okay. If you've not worked on core or the contrib project in question in a while, then you might need to run update.php, in case new commits have added updates.

Now start a new local branch for the issue:

  git checkout -b 123456-foobar-is-broken

I like to prefix my branch name with the issue number, so I can always find the issue for a branch, and find my work in progress for an issue. A description after that is nice, and as git has bash autocompletion for branch names, it doesn't get in the way. Using the issue number also means that it's easy to see later on which branches I can delete to unclutter my local git checkout: if the issue has been fixed, the branch can be deleted!
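
When an issue branch has outlived its usefulness, cleanup is quick (a minimal sketch, reusing the made-up issue number from above):

  # Switch back to the main development branch first.
  git checkout 7.x-1.x
  # Delete the issue branch. Since the fix landed upstream as a patch rather
  # than a merge, -d will refuse; -D force-deletes it.
  git branch -D 123456-foobar-is-broken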

So now I can go ahead and start making commits. Because a local branch is private to me, I can feel free to commit code that's a total mess. So something like:

  dpm($some_variable_I_needed_to_examine);
  /*
  // Commented-out earlier approach that didn't quite work right.
  $foo += $bar;
  */
  // Badly-formatted code that will need to be cleaned up.
  if($badly-formatted_code) { $arg++; }

That last bit illustrates an important point: commit code before cleaning up. I've lost count of the number of times that I've got it working, cleaned up, and then broken it because I accidentally removed an important line that was lost among the cruft. So as soon as code is working, I make a commit, usually with a message like 'TOUCH NOTHING IT WORKS!'. Then I start cleaning up: remove the commented-out bits, the false starts, the stray code that doesn't do anything. (This is where you find out it actually does, and breaks everything: but that doesn't matter, because you can just revert to a previous commit, or even use git bisect.)
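
Here's what that recovery looks like in practice (a sketch; the commit hash is hypothetical):

  # Find the 'TOUCH NOTHING IT WORKS!' commit.
  git log --oneline
  # Throw away the botched clean-up and return to the working state.
  # Warning: discards later commits and uncommitted changes on this branch.
  git reset --hard abc1234
  # Or, to hunt down which clean-up commit broke things:
  git bisect start
  git bisect bad            # the current state is broken
  git bisect good abc1234   # this older commit worked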

Keeping up to date

Core (or the module you're working on) doesn't stay still. By the time you're ready to make a patch, it's likely that there'll be new commits on the main development branch (with core it's almost certain). And prior to that, there may be commits that affect your work in some way: API changes, bug fixes that you no longer need to work around, and so on.

Once you've made sure there's no work currently uncommitted (either use git stash, or just commit it!), do:

  git fetch
  git rebase BRANCH

where BRANCH is the main development branch that is being committed to on drupal.org, such as 8.0.x, 7.x-2.x, and so on.

(This is arguably one case where a local branch is easier to work with than a github-style forked repository.)

There's lots to read about rebasing elsewhere on the web, and some will say that rebasing is a terrible thing. It's not, when used correctly. It can cause merge conflicts, it's true. But here's another place where small, regular commits help you: small commits mean small conflicts that shouldn't be too hard to resolve.
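
If a rebase does stop on a conflict, the dance is short (a sketch, using 7.x-1.x as the example branch):

  git rebase 7.x-1.x
  # git stops on the conflicting commit and marks the files.
  # Edit them to resolve the conflict, then:
  git add path/to/conflicted/file
  git rebase --continue
  # Or bail out and restore the branch to its pre-rebase state:
  git rebase --abort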

Making a patch

At some point, I'll have code I'm happy with (and I'll have made a bunch of commits whose log messages are 'clean-up' and 'formatting'), and I want to make a patch to post to the issue:

  git diff 7.x-1.x > 123456.PROJECT.foobar-is-broken.patch

Again, I use the issue number in the name of the patch. Tastes differ on this. I like the issue number to come first. This means it's easy to use autocomplete, and all patches are grouped together in my file manager and the sidebar of my text editor.

Reviewing and improving on a patch

Now suppose Alice comes along, reviews my patch, and wants to improve it. She should make her own local branch:

  git checkout -b 123456-foobar-is-broken

and download and apply my patch:

  wget PATCHURL
  patch -p1 < 123456.PROJECT.foobar-is-broken.patch

(Though I would hope she has a bash alias for 'patch -p1' like I do. The other thing to say about the above is that while wget is busy downloading the patch, there's usually enough time to double-click the name of the patch in its progress output and copy it to the clipboard, so you don't have to type it at all.)
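
For the record, such an alias is a one-liner in ~/.bashrc or ~/.zshrc (a sketch; the name 'ap' is made up):

  # 'ap foo.patch' expands to 'patch -p1 < foo.patch'.
  alias ap='patch -p1 <'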

And finally commit it to her branch. I would suggest she uses a commit message that describes it thus:

  git commit -m "joachim's patch at comment #1"

(Though again, I would hope she uses a GUI for git, as it makes this sort of thing much easier.)

Alice can now make further commits in her local branch, and when she's happy with her work, make a patch the same way I did. She can also make an interdiff very easily, by doing a git diff against the commit that represents my patch.
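
That interdiff is just a diff against the commit that recorded my patch (a sketch; the hash and filename are hypothetical):

  # abc1234 is the commit "joachim's patch at comment #1".
  git diff abc1234 > interdiff.123456.1-2.txt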

Incorporating other people's changes to ongoing work

All simple so far. But now suppose I want to fix something else (patches can often bounce around like this, as it's great to have someone else spot your mistakes and to take turns). My branch still looks the way it did when I made my patch. Alice's patch is against the main branch (for the purposes of this example, 7.x-1.x).

What I want is a new commit on the tip of my local branch that says 'Alice's changes from comment #2'. What I need is for git to believe it's on my local branch, but for the project files to look like the 7.x-1.x branch. With git, there's nearly always a way:

  git checkout 7.x-1.x .

Note the dot at the end. This is the filename parameter to the checkout command, which tells git that rather than switching branches, you want to check out just the given file(s) while staying on your current branch. Because the filename is a dot, we're doing that for the entire project. The branch remains unchanged, but all the files from 7.x-1.x are checked out.

I can now apply Alice's patch:

  wget PATCHURL
  patch -p1 < 123456.2.PROJECT.foobar-is-broken.patch

(Alice has put the comment ID after the issue ID in the patch filename.)

When I make a commit, the new commit goes on the tip of my local branch. The commit diff won't look like Alice's patch: it'll look like the difference between my patch and Alice's patch: effectively, an interdiff.

  git commit -m "Alice's patch at comment #2"

I can now do a diff as before, post a patch, and work on the issue advances to another iteration.

Here's an example of my local branch for an issue on Migrate I've been working on recently. You can see where I made a bunch of commits to clean up the documentation to get ready to make a patch. Following that is a commit for the patch the module maintainer posted in response to mine. And following that are a few further tweaks that I made on top of the maintainer's patch, which I then of course posted as another patch.

Improving on our tools

Where next? I'm pretty happy with this workflow as it stands, though I think there's plenty of scope for making it easier with some git or bash aliases. In particular, applying Alice's patch is a little tricky. (Though the stumbling block there is that you need to know the name of the main development branch. Maybe pass the script the comment URL, and let it ask d.org what the branch of that issue is?)
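
Such a helper might start out as something like this (a rough sketch; it takes the patch URL and the development branch as arguments, since asking d.org for the branch automatically is exactly the unsolved part):

  #!/bin/bash
  # Usage: applypatch.sh PATCHURL BRANCH
  # Hypothetical helper: run from the repository root, it checks out the
  # files of the main development branch, then applies a d.org patch on top.
  set -e
  PATCHURL=$1
  BRANCH=$2
  git checkout "$BRANCH" .
  wget "$PATCHURL"
  patch -p1 < "$(basename "$PATCHURL")"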

Beyond that, I wonder if any changes can be made to the way git works on d.org. A sandbox per issue would replace the passing around of patch files: you'd still have your local branch, and merge in and push instead of posting a patch. But would we have one single branch for the issue's development, which then runs the risk of commit clashes, or start a new branch each time someone wants to share something, which adds complexity to merging? And finally, sandboxes with public branches mean that rebasing against the main project's development can't be done (or at least, not without everyone knowing how to handle the consequences). The alternative would be merging in, which isn't perfect either.

The key thing, for me, is to preserve (and improve) the way that so often on d.org, issues are not worked on by just one person. They're a ball that we take turns pushing forward (snowball, Sisyphean rock, take your pick depending on the issue!). That's our real strength as a community, and whatever changes we make to our toolset have to be made with the goal of supporting that.

Categories: Elsewhere

PreviousNext: Lightning talk - Drupal 8's Third Party Settings Interface

Thu, 27/11/2014 - 03:22

During this week's developers' meeting, our lightning talk was all about Drupal 8's ThirdPartySettingsInterface.

Here's the video introduction to this powerful new feature in Drupal.

Categories: Elsewhere

Acquia: Part 2 – Cal Evans and Jeffrey A. "jam" McGuire talk open source

Thu, 27/11/2014 - 00:53

Voices of the ElePHPant / Acquia Podcast Ultimate Cage Match Part 2 - I had the chance to try to pull Cal Evans out of his shell at DrupalCon Amsterdam. After a few hours, he managed to open up and we talked about a range of topics we have in common. In this part of our conversation we talk about 'Getting off the Island', inter-project cooperation in PHP and Drupal's role in that; the reinvention and professionalization of PHP core development; decoupled, headless Drupal 8; PHP and the LAMP stack as tools of empowerment and the technologists' responsibility to make devices and applications that are safe, secure, and private by default.

Categories: Elsewhere

David Norman: Drupal Workstation Configuration

Wed, 26/11/2014 - 22:24

I use a MacBook Pro with Retina for work - they make the best hardware and operating system combination. I also use a Nexus 5, iPad 3, Kindle Paperwhite 2, and Chromebook. I had an iPhone for about 9 months - I didn't like it. If I shed one of my devices, it'd be the iPad.

I force my blog to use HTTPS. I assume government agencies broke SSL and have the root signing keys for certificate providers. I'm just hoping HTTPS makes sniffing a blog annoying enough to make a point to someone that sniffs traffic for a living.

Physical security

Immediately after turning on a fresh MacBook, I activate Filevault. At first, activating Filevault for my work was a policy to follow - now I want it. I went through the mental exercise of having all my personal items lost or stolen. I couldn't imagine an end to what lawyers would do to me if I lost control of their company's confidential information. The rule even applies when living a minimalist lifestyle. At least one minimalist tells how he got robbed while sleeping in his own home. Filevault 2 on Mac OS 10.8 creates complete encryption transparency. It's hard to tell if a drive is encrypted with Filevault or not without looking at the System Preferences.

I use screensaver hot corners to activate the screensaver and password protection when I walk away from my MacBook.

When someone walks up to my workstation to talk to me or I need a restroom break, all I do is swipe the mouse cursor down to the bottom left corner and my MacBook screen is locked.

For additional security, I also use a screen protector. Though I telecommute, I frequently work at places other than my home, like the library, Regus business lounges, and county parks.

The 3M privacy screen protector is the only one I take seriously. It doesn't keep my screen from getting dirty, though it will block unexpected sneeze spittle - rather, it keeps wandering eyes next to me from reading what is on my screen. I prefer the black filter over the gold one, though I agree with reviewers who say the gold filter is more effective.

Password management

Hackers compromised my MtGox account in 2011. Unfortunately, my password for MtGox was the same as what I used on every other website I use. After the release of the password file, anyone could have assumed my digital identity online. The silly part was, I already used a wonderful password manager - 1Password.

Today, I keep passwords in both 1Password and Lastpass - I don't re-use passwords anymore. Generating a random password for every site makes me so dependent on having a password manager that having redundancy doesn't bother me. Adding Lastpass to my toolbox gave me options for new toys, like using a Chromebook, since Lastpass is platform-agnostic.

Version control

When I started doing public development of Thatware, someone suggested using CVS to manage contributions from other people and I thought they were nuts. All CVS was going to do was add a bunch of aggravating complexity. I still don't think Thatware ever reaped anything worthwhile by using CVS before I abandoned it. By the time I introduced CVS to the project, my interest in Thatware was already waning.

My progress on Thatware stalled after an email I got from Dries Buytaert asking why I didn't use table joins anywhere in Thatware's database queries. I didn't know what a table join was at the time - the little bit of SQL I had implemented was all copied from a post I found on Dev Shed. As it turns out, doing a couple of table joins instead of queries nested in loops is about 10 times faster. Dries proved it when he had me benchmark alternative scripts. Those scripts turned into what people know today as Drupal. Needless to say, I yielded to his superiority.

In time, I shifted my CVS knowledge from Thatware to Drupal. When I discovered I could commit to any contributed module's CVS repository, I spread my generosity where it wasn't wanted, which ended with two commits to the Xstatistics module. After that time, CVS commits got access hooks to verify maintainers were listed in the project node. I still intend to see the Windows-style newlines removed someday, even if the project stays in an officially abandoned state.

I used Subversion for professional work alongside CVS for community code. I thought Subversion and CVS were equal platforms with different methods for versioning files.

Git

Today, I use Git. When I was using Subversion, I thought it was fine. I only switched to Git because the Drupal project did. Now I've experienced the ease of branching in Git versus Subversion. Doing local merges makes resolving conflicts easier. I'll never go back to Subversion by choice.

http://www.syntevo.com/smartgithg/
http://www.git-legit.org/

The normal git log output is useless to me, so I never run it. I do get some value by running an alternate version of the log command that outputs the merge tree.

git config --global alias.lg "log --color --graph \
--pretty=format:'%Cred%h%Creset \
-%C(yellow)%d%Creset %s \
%Cgreen(%cr)%C(bold blue)<%an>%Creset' --abbrev-commit"

Look at the difference between the output of git log and git lg.

Git versions before 2.0 would push all of the local branches that matched a remote branch name, using the matching method. As of Git 2.0, the default switches to a more conservative simple behavior, pushing only the current branch to the corresponding remote branch it is configured to follow. The new 2.0 simple behavior is set through

git config --global push.default simple

To retain the older method of pushing all matching branches, even after upgrading to Git 2.0:

git config --global push.default matching

Terminal

My colleagues recommended iTerm as a replacement for the built-in, Apple-provided terminal. It offers options for a split pane, autocomplete, and paste history which aren't available with the Apple terminal.

iTerm also protects me against myself by blocking a system shutdown when I have a terminal open. Before iTerm blocked shutdown on my Mac, I would shut down with an open SSH or mosh connection. Since I never use screen, shutting down sometimes broke something in progress, like an apt-get upgrade.

The default theme of either terminal is not suitable for me - or at least I think it could be improved. Instead, I use the Tomorrow Night theme. I combine Adobe's Source Code Pro font with a Druplicon to make my default terminal.

The Druplicon ASCII art shows at the start of each new terminal window I open, whether I use Apple's Terminal or iTerm, by using the system MOTD. The Internet is full of web-based tools that can convert an image file to ASCII art, but I created the Druplicon by hand so I could customize the output to fit in 80 columns and 25 lines for the default terminal window size.

Mac OS X doesn't come with a /etc/motd file by default, but if you create one, it will get used. To get access to write to /etc, you'll need to sudo as a user with administrative rights.

sudo su -
cat ~/Downloads/ascii_druplicon.txt > /etc/motd
exit

OH MY ZSHELL!

Bash was all I needed as a command-line interpreter until I started using git. Once I learned about oh-my-zsh and its support for various plugins, running commands inside a git repository without the assistance of zsh became a huge annoyance.

Robby Russell hosts the official installer. It only needs one command in a terminal.

curl -L https://raw.github.com/robbyrussell/oh-my-zsh/master/tools/install.sh | sh

After installation, examine the .zshrc file in your home directory. Two options to pay particular attention to are ZSH_THEME for theme and plugins for plugins.

Throughout this book, screenshots I take of a terminal window use the gallifrey zsh theme.

I could live with just the git plugin - I have a tendency to enable zsh plugins that I have no idea how to use. Now when I clone a git repository, the git plugin for zsh shows the checked-out branch in my prompt, whether I'm in the repository root or any of its subdirectories.

In the case of setting variables for a plugin like jira, I could either create a ~/.jira-url file or add the JIRA_URL setting to ~/.zshrc. There's a trade-off between being able to

cat "https://jira.example.com" > ~/.jira-url

or centralizing the configuration. I opted for putting it in the ~/.zshrc file directly because I think I'm more likely to remember that the jira command is a zsh plugin and look for the configuration in zsh's file.

Show hidden files

The operating system doesn't matter - I like to see all the hidden files on my filesystems. In Mac OS, I show them all by referencing an article on Lifehacker to get the correct terminal commands.

defaults write com.apple.finder AppleShowAllFiles TRUE
killall Finder

Commands like this can also be aliased in your ~/.zshrc file.

# Show/hide hidden files in Finder
alias show="defaults write com.apple.Finder AppleShowAllFiles -bool true && killall Finder"
alias hide="defaults write com.apple.Finder AppleShowAllFiles -bool false && killall Finder"

Since I use BlueHarvest to minimize the number of .DS_Store files on my Mac and network drives, I periodically see files show up in Finder, then disappear. On my desktop, the Mac system files sometimes stack on top of files I want, making it annoying to work with files from my desktop.

The default settings for BlueHarvest 5 only remove Apple files on remote drives, so I add my home directory as an additional location.

Alfred

http://www.alfredapp.com

Bartender

http://www.macbartender.com

ClamXav

I suspect antivirus is an under-utilized tool on Mac OS X installations, but I don't let that stop me from running it. ClamXav is a free antivirus utility which can monitor directories in addition to running a standard, manual scan.

ClamXav can be set up as passive or active: scan only the files you tell it to, or your entire computer. I activated Sentry to monitor my home folder and scan new files as they arrive. I usually download files from the Internet to either ~/Downloads or ~/tmp. Covering my entire home directory also catches unknowingly downloading a virus through Apple Mail or as browser cache from Google Chrome.

Crashplan

Crashplan is the only backup solution I've found where I can back up my files encrypted, both to an offsite datacenter and to another computer in my house. The other computer in my house is key - when I used other services like Jungle Disk, the restore process was painful. Downloading hundreds of gigabytes of archived files from the Internet is painful. At least when I need to do a restore from another computer in my house, I can control the network conditions. Getting files restored from the Internet is only my plan for catastrophic events - burglary, fire, tornado.

Dropbox

I store my Calibre library; 1Password keychain; Alfred, Codekit and TextExpander configurations; Leanpub documents; and a few small file backups encrypted by TrueCrypt. https://www.dropbox.com

Browsers and plugins

I used Netscape Navigator back in the 1990s. Since then, I've felt some membership on team Firefox. Truthfully, since I switched to using Android for my phone, I've stuck with Google Chrome. I feel like Chrome simplifies my life somehow. The market dominance Chrome gained in 2013 doesn't give me much reason to move back to Firefox, either.

Chrome extensions
  • 1Password for password management
  • AdBlock to block advertisements
  • Buffer to queue up messages on social networking to target readers at times they're most likely to be reading social networks
  • Dreditor adds functionality for patch reviews and Git commit messages to the drupal.org issue queue
  • Eye Dropper to show me hex colors of pages on the Internet
  • Google Analytics Opt-out Add-on in hopes of reducing how much I get tracked on the Internet
  • Google Docs was added by my Chromebook
  • HTTPS Everywhere because the Internet should be encrypted
  • Incognito Close closes incognito tabs if they're inactive for 30 minutes
  • Lastpass integrates with my Lastpass vault
  • MeasureIt! measures pixels
  • Mosh is another way I can use a terminal to a remote server while I'm using my Chromebook
  • Readability is a shortcut for sending articles for me to read later on Readability
  • Rescuetime for Chrome tells RescueTime what pages I'm visiting instead of just recording that I'm using Chrome, otherwise, it can't help gauge my web-based productivity
  • Secure Shell so I can SSH into remote servers from my Chromebook
  • Send to Kindle for Google Chrome can send websites as e-books to my Kindle Paperwhite
  • Subhub opens GitHub files in Sublime Text when paired with the Subhub package in Sublime Text
  • Tailor is an offline code editor with Dropbox and Google Drive integration
  • Text is a text editor
  • Type Fu implemented my Norman keyboard layout experiment for learning how to type
  • Writebox is an offline text editor with Dropbox and Google Drive integration
  • Wunderlist for Chrome makes a shortcut to my Wunderlist task archive
ImageOptim

It occurred to me one day that if Drupal had hundreds of thousands of downloads, and those Drupal installations each had millions of page views, then compressing images in the Drupal core distribution could have a ripple effect, saving thousands of gigabytes of unnecessary network traffic [1-9].

The image output from programs like GIMP, Photoshop, Skitch, and even the built-in screenshot hotkeys for Mac OS X has a bunch of unnecessary comments and color profiles that have no relevance on the Internet. Editing a JPG in GIMP, then exporting it with 80% compression, still leaves an opportunity to remove extra bytes without any quality loss.

ImageOptim wraps PNGOUT, Zopfli, Pngcrush, AdvPNG, OptiPNG, JpegOptim, jpegrescan, jpegtran, and Gifsicle. I keep ImageOptim in my Mac Dock, then drag image files to the ImageOptim icon to optimize them. I find the output saves 5-25% on file size.

Rather than taking the default output from CMD+SHIFT+4, I prefer to specify the type of interaction, output location, and filename. This screencapture command outputs a screenshot of whatever window I click on, in JPG format, on my desktop. After I process the output with ImageOptim, the screenshot complies with all the size and format rules I need for adding it to this book.

screencapture -tjpg -oW ~/Desktop/imageoptim.jpg

ImageAlpha uses the same principles as ImageOptim, except that it works specifically on PNG files, applies lossy compression, and converts the PNG to PNG8+alpha format - that's a fancy way of saying it proposes to convert the image from full color to 256 colors. The 256-color palette change by itself can reduce image sizes by around 70%, and you can probably only tell the difference on full-color photos. pngquant is the backend utility for ImageAlpha.
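
If you'd rather script it than drag files onto a Dock icon, pngquant can be run straight from a terminal (a minimal sketch; the filename is made up):

pngquant 256 screenshot.png
# The quantized 256-color copy is written alongside the original
# as screenshot-fs8.png.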

Kindle Previewer

http://www.amazon.com/gp/feature.html?docId=1000765261

Skitch

https://evernote.com/skitch/

SmartSynchronize

http://www.syntevo.com/smartsynchronize

Viscosity

http://www.sparklabs.com/viscosity/

f.lux

The evening is a tricky time to balance activities. While I know I shouldn't spend much of any evening on the computer, sometimes I do, even if I'm just working on our family budget. When it's dark in the house, it's easy for a phone or computer screen to blind me, even after the computer does a brightness compensation for the dark.

f.lux makes my computer screen tint match the lighting in the room at night. It goes beyond just a brightness change, because the blue tint of a computer screen without a red filter at night doesn't match what the average light bulb emits. If I worked in an office cave, I could set the daylight hours to match the fluorescent lighting in the room.

The preferences automatically identify my location. With the transition time set to 1 hour, I don't even notice the screen tint change. I think f.lux is an application that every computer user should install, regardless of technical skills and abilities. I'd have my father-in-law install it.

E-book management

I'm not a fan of how Amazon.com set the Kindle book royalty structure for authors, but I am fond of their Paperwhite for the non-glare screen and their centralized document management. Their implementation for sending an email with an e-book attachment to my Kindle is a great system.

After I buy an e-book, I think I should own it. Kindle DRM doesn't follow that spirit since Amazon makes it hard to move a book from my Kindle to iBooks. Apprentice Alf (https://apprenticealf.wordpress.com/) has DRM removal tools for e-books. My DRM-free e-books are managed by Calibre.

Calibre is free. The DRM removal plugin for Calibre makes it possible to import books from my Kindle to my Dropbox. I convert my books to other formats, store whitepapers, documentation, and presentations in Calibre so I can find them easily when I decide to email them to friends.

Meeting online

Shush, Audio Hijack Pro, mictracker, Skype, IRC, Squiggle, Dropcam, GoToMeeting, FreeConferenceCall, Zoom Cloud Meetings

More

Divvy sorts out all my windows when I'm doing research between iTerm, Google Chrome, and Sublime Text. By clicking the Divvy icon in the menu bar, I can drag across a grid to organize my open applications to not clobber each other on the screen. It divides my screen into exact portions so I don't have to mess with dragging windows to specific sizes.

Divvy also has a method for organizing applications by shortcut. One global keyboard shortcut setting opens the Divvy grid. Then, a second combination, specific to your Divvy preferences, sets the window sizes according to your own preset grid.

Hazel

Hazel advertises itself as a personal housekeeper. It's a rule engine that keeps files organized by whatever means you tell it. The default rules use color labels for downloads so I can quickly find new downloads and see reminders when it's time to archive installers to my Drobo.

iStat Menus requires an admin account to install a resource monitor. I configure it to show my CPU load, memory use, disk space consumption, and temperatures for my CPU and GPU.

Internet tethering

HoRNDIS is a Mac OS X driver for using my Android phone's native USB tethering mode. The driver is free and the source is available on GitHub.

After installing the driver, I didn't have to configure anything. I just plugged my Nexus 5 into my Mac, then activated USB tethering on my phone in the Settings app.

Since I tether frequently, I added a widget to my phone that opens the Tethering & portable hotspot settings directly. When you place a Settings shortcut widget, it asks which settings screen you'd like the widget to load.

The Portable Wi-Fi hotspot option on my Android also works, but I've found that using it in a building with a bunch of other wireless networks seems congested. There seems to be a limit to how much wireless traffic can broadcast through the air before it all jumbles together. Since I keep my phone plugged in anyway while tethering, to keep the battery from draining too fast, USB tethering becomes the overall more attractive option.

Android File Transfer is not available while using the USB tethering.

Drobo

My Drobo is one of two things I've ever locked to my desk with a Kensington lock. Thanks to having SSH and rsync on my Drobo, I can rsync backup snapshots over my home network. For that, I created a bash script that I keep in ~/bin, which is in my ~/.zshrc PATH.

#!/bin/sh
rsync -a --delete --progress --ignore-errors \
  --rsh=ssh /Users/davidnorman \
  root@192.168.0.99:/mnt/DroboFS/Shares/deekayen-macbook/
rsync -a --delete --progress --ignore-errors \
  --rsh=ssh /Applications \
  root@192.168.0.99:/mnt/DroboFS/Shares/deekayen-macbook/
rsync -a --delete --progress --ignore-errors \
  --rsh=ssh /private \
  root@192.168.0.99:/mnt/DroboFS/Shares/deekayen-macbook/

Other apps to consider

bee, brackets, byword, caffeine, pckeyboardhack, quiet, readkit, rescuetime, capsee, carboncopycloner, requiem, resolutiontab, codekit, codebox, coffitivity, sequel pro, shortcat, smartsynchronize, sourcetree, spideroak, expandrive, tower, gpgtools, gabble, github, google drive, unplugged, kaleidoscope, ia writer, icon slate, keyremap4macbook, libreoffice, owncloud, ripit vs handbrake, brew install youtube-dl

  1. Smush core images 

  2. Optimize module and theme images 

  3. Use transparent PNG instead of opacity for overlay background to gain rendering performance 

  4. Open/Close Drawer link doesn't show focus 

  5. Create a new Druplicon logo for Bartik 

  6. System messages need identifying icons (WCAG 2.0) 

  7. Optimize Bartik images 

  8. Optimize seven ui icons 

  9. Compress Images using Lossless Techniques 

Post categories Drupal
Categories: Elsewhere

David Norman: Sublime Text for Drupal

Wed, 26/11/2014 - 22:02

The book The Pragmatic Programmer: From Journeyman to Master advocates using a single file editor for as many tasks as you can:

We think it is better to know one editor very well, and use it for all editing tasks: code, documentation, memos, system administration, and so on. Without a single editor, you face a potential modern day Babel of confusion. You may have to use the built-in editor in each language's IDE for coding, and an all-in-one office product for documentation, and maybe a different built-in editor for sending e-mail. Even the keystrokes you use to edit command lines in the shell may be different. It is difficult to be proficient in any of these environments if you have a different set of editing...

At this point in writing this book, I've already used Penflip, Leanpub, Sublime Text, StackEdit, Tailor, Writebox, a pad of paper, and two dry erase boards.

My trial of other editors in this project was part of routine experimentation. Staying informed about the latest tools is necessary to avoid becoming a technological dinosaur. In 2013, I settled deep into using Sublime Text for as many things as I could. Viewing and editing files in Sublime wasn't part of a conscious effort; rather, I think it was just a natural result of what Andrew Hunt and David Thomas identified in their Pragmatic Programmer book.


As a developer, themer, designer, or manager of any of those, I've found my editor to be a critical tool and one in which I spend a lot of time. In 2013, Sublime Text was my third-most used application.

What the RescueTime chart doesn't show clearly is that I actually spent most of my time processing email because I used Gmail as my email client at work for an extended trial period. I tell people I work with that email should be considered a low-priority medium of communication - one in which I might respond in a week when I attempt to re-gain my inbox zero status. This chart has a long tail of other websites and applications spreading out to 3965 items, including me browsing amazon.com and watching YouTube.

Sublime Text is the second place where zsh plugins shine. GitHub is littered with various instructions and alias files to create a terminal alias to run Sublime Text from a terminal. When you add the sublime plugin to your ~/.zshrc file, it handles aliasing the subl binary packaged in the regular Sublime Text.app bundle.

Whether on Linux or Mac, the zsh plugin searches the /usr/local/bin and /Applications paths to locate and make the subl binary file available as both subl and st. Having some basic vi or nano knowledge for server management and devops is nice, but when you have a perfectly good GUI available, I say use it. Additionally, zsh also makes a stt command available to the terminal for opening the current directory in Sublime Text.
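
With the plugin loaded, opening things from a terminal looks like this (a sketch; the paths are hypothetical):

subl sites/all/modules/custom/example.module
st ~/Sites/guardr
# stt opens the current directory in Sublime Text:
stt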

Basic syntax

Sublime Text uses the Preferences / Settings - User menu to create a file named Preferences.sublime-settings for storing various formatting options. To match most of the Drupal coding standards, start with the following JSON settings:

{ "default_line_ending": "unix", "ensure_newline_at_eof_on_save": true, "fallback_encoding": "UTF-8", "rulers": [ 80 ], "shift_tab_unindent": true, "spell_check": true, "tab_size": 2, "translate_tabs_to_spaces": true, "trim_automatic_white_space": true, "trim_trailing_white_space_on_save": true, "use_tab_stops": true, "word_separators": "./\\()\"'-:,.;<>~!@#%^&*|+=[]{}`~?" }

Sublime Text can have multiple rulers. Set one up for drafting email, one for code, and another for Markdown documents.

"rulers": [72, 80, 120],

Pasting code from another file or from a sample I find on the Internet is annoying, because I have to spend the next 30 seconds fixing the indentation. And even if the indentation levels happen to match, chances are I didn't select the leading whitespace, so I still have a formatting hassle to sort out. The Paste and Indent command automatically adjusts the indentation of the code on the clipboard to match the surrounding code. Add the following to the Default (OSX).sublime-keymap file through the Sublime Text - Preferences - Key bindings - User menu:

{ "keys": ["super+v"], "command": "paste_and_indent" }, { "keys": ["super+shift+v"], "command": "paste" } Sublime Text package manager

The Sublime Text package manager is not part of the default installation. There are only two steps in the instructions, which create a new Sublime Text Preferences menu named Package Control.

Package control options are nested into the Sublime Text Command Palette. The shortcut to open the command palette menu on Mac is cmd+shift+p or ctrl+shift+p for Windows and Linux from within a Sublime Text window. All the package management options begin with Package Control: - the list is narrowed by typing package, or even just install in the command palette prompt.

Drupal

Once you've navigated to the Package Control: Install Package menu, typing Drupal in the next Command Palette prompt should show the Drupal package by robballou.

The Drupal package has various completions for Drupal 7, keyed to filename extensions. For example, in projects with a .info file, the Drupal package adds a syntax file type named Drupal Info File and autocomplete triggers based on name, description, dependencies, package, core, php, version, files, screenshot, engine, regions, features, scripts, settings, and stylesheets.

The Drupal package goes on to have autocomplete abilities for a comprehensive list of all Drupal's core functions, hooks, form API, theming, and several contributed modules - devel, migrate, and views.

The autocomplete functionality is complete enough to interpret your current $query variable in PDO syntax and expand upon condition statements to provide the syntax for the field, value, and operator in the condition statement. Each of the Drupal completions is also context-sensitive, meaning that if you wanted to access the fapi autocomplete shortcut for a markup item, the autocomplete menu won't appear unless you are in the context of a PHP function, within a valid <?php block of a PHP file.

Sublime Text autocomplete snippets for any language, Drupal, PHP, or Python, work on a principle of tabs. As long as you have an idea of what snippet trigger you need, the tab key will trigger the autocomplete event. The Drupal package could also serve as an alternative to copying and pasting code from the Examples for Developers project, since shortcuts like block will fill out an entire list of stub functions, letting you name all the functions with the same prefix in one effort by tabbing between elements in the stub.

Behat

The Behat and Behat Features packages are close in functionality. The maintainer of Behat Features originally forked his code from the Behat project in apparent frustration over getting changes added back to the Behat project on Github.

Both Behat packages have equivalent syntax highlighting for .feature files, except that the Behat Features package adds support for French, German, Italian, Portuguese, Spanish, and Russian. Though I only read English, I installed Behat Features.

Tomorrow theme

Continuity between my terminal and Sublime Text is nice, so I theme my editor with the Tomorrow color scheme. The Tomorrow theme can be activated through the Sublime Text / Preferences / Color Scheme / Tomorrow Color Schemes menu on Mac OS X.

Sass and SCSS

As I browse through the Sublime Text packages, it's easy to see two packages like Sass and SCSS and assume I need them both - the Sass syntax was superseded by SCSS. Upon closer inspection, both packages for Sublime Text highlight Sass and SCSS.

Though the SCSS package pitches itself as the "official bundle", I only installed the Sass project. I was troubled by the management of the SCSS project because it managed the Sublime Text package within a branch of the larger SCSS TextMate repository - the same repository which also has a Chocolat truffle in a different branch. Using git branches in that manner is wrong - each of the support bundles should be their own repository for each editor.

The Sass package for Sublime Text is properly managed as a single bundle, forked from the SCSS project. The Sass fork took time to pull contributions from other forks, and I consider the result more curated. Since Sass is a fork of SCSS, it contains all of what was "official" up until the fork anyway. Installing both packages would be redundant.

DocBlockr

The DocBlockr package completes docblock comments in code. In PHP, just typing /** then return, will examine the context in which you're typing, then fill in the rest.

DocBlockr's formatting is not Drupal comment standards compliant, so some adjustments are necessary. Navigate the Sublime Text menus on Mac OS X to Sublime Text / Preferences / Package Settings / DocBlockr / Settings - User.

The following modifications to the default rules bring the DocBlockr output into a reasonable proximity to compliance with Drupal's standards.

{
  // Align words following the @tags.
  // Values: no, shallow, deep.
  "jsdocs_align_tags": "no",
  // Add blank lines between the description
  // line and tags of different types.
  "jsdocs_spacer_between_sections": true,
  // Shorten primitives such as "boolean"
  // and "integer" to "bool" and "int".
  "jsdocs_short_primitives": true,
  // Add a '[description]' placeholder
  // for the @return tag.
  "jsdocs_return_description": false,
  // Add a '[description]' placeholder
  // for the @param tag.
  "jsdocs_param_description": false,
}

Though Drupal's comment standards prefer descriptions added to @param and @return, DocBlockr doesn't have a setting to shift the description to the line below each block tag. Instead, the above configuration omits the description placeholder - you'll have to add it on your own. Also note, the configuration file for DocBlockr user preferences is Base File.sublime-settings - it doesn't have DocBlockr in the name as I expected.

DocBlockr does not conflict with the Drupal package. When typing hook then pressing tab in a PHP file, you'll still get the same magic function stub, complete with docblock.

/**
 * Implements hook_function().
 */
function example_function(args) {
}

SublimeGit

The Git and SublimeGit packages offer similar functionality - another scenario where installing both is redundant.

I use SublimeGit for git diff when I want to make a quick patch for a Drupal module. Though I check out modules from drupal.org in iTerm, the process to update the contributed module versions in the Guardr project's make file could look something like this:

  1. Load https://drupal.org/project/guardr. It's usually in my browser history.
  2. Click the "Version control" tab
  3. Copy the git clone line to the clipboard
  4. cmd+space, iT (to autocomplete iTerm)
  5. cd to ~/Sites
  6. paste the clone line for Guardr, then cd into the clone
  7. subl d<tab>.<tab>
  8. Make required version updates in the Sublime Text window
  9. cmd+s to save
  10. cmd+shift+p, git dif, and save the result as a .patch file for uploading to an issue on drupal.org.
SublimeLinter

SublimeLinter with SublimeLinter-php can do syntax checking, a basic function found in most any code-centric IDE. SublimeLinter has PHP-related syntax check plugins, including ones for PHP_CodeSniffer and PHP Mess Detector. I haven't seen a SublimeLinter plugin for PHP PSR compliance, but the detailed notes on writing a SublimeLinter plugin explain how someone might go about creating such support.

Since SublimeLinter isn't specific to just PHP, it can also handle lint checks for Puppet manifest files, CSS, Perl, Ruby, Javascript, and a bunch of others. I picked SublimeLinter over the Phpcs package. If I later decide to use the SensioLabs standards fixer, Phpcs can run just that portion of the package and leave the rest to SublimeLinter.

Other notable Sublime Text packages

Alphpetize sorts the methods alphabetically within a PHP class. The sorting preserves Docblocks and any sloppiness that floats between functions within the class gets collected at the top of the class. I've never actually used this plugin for what it's supposed to do because I sort my PHP functions in order of importance. Drupal hook implementations go at the top of the files, followed by functions I'm most likely to rank as important or might need frequent editing, then utility functions at the end. Merely having this plugin ready appeals to my inner-OCD.

BlameHighlighter adds two options to the Sublime Text command palette - one highlights lines you have edited, according to Git, the other clears the highlights.

Find++ can expand the file search to look at all the open files, the files in the projects, or of a selected directory. Normally, the find function works only within the current open file. The results show in a new file, along with the filename and line number.

GitGutter uses the Sublime Text gutter to add indicators to the left of line numbers that show whether a line has been inserted, modified, or deleted.

BracketHighlighter also uses the gutter space to show where brackets of various types start and end: (), {}, [], <>, "", '', ``. A caveat I've noticed is that a conflict between two gutter plugins usually results in one winner - this one loses to GitGutter, so it fails to show the start or end of a bracket on a line that has changed since the latest git commit.

The spell checker in Sublime Text does a decent job, but it lacks context, industry jargon, and colloquial terms. Adding the Google Spell Check package brings some sanity back to my spelling. The author even made a screencast to show its super powers.

GPG performs the basic functions of encrypt, sign, decrypt, and verify for ASCII armored text. Since GPGMail and Enigmail can handle GPG in the email stream, I use GPG for verifying GPG signatures on files or for otherwise handling the miscellaneous exceptions to normal GPG use.

SublimeJira adds command palette options to get, update, and create issues on a JIRA server. Getting issues prompts for an issue key, then displays the issue in a new tab of Sublime Text.

Phpcs has crossover functionality with SublimeLinter in terms of running php -l on the current file. It can also run PHP_CodeSniffer, PHP Mess Detector, Scheck, and the PHP Coding Standards Fixer by SensioLabs for PSR compliance. The formatting notifications use a combination of the quick panel, gutter, and status bar to notify about non-compliance.

Drupal 8 uses PHPUnit for unit tests instead of Drupal 7's Simpletest. The PHPUnit package adds support for right-clicking test files from within Sublime Text to run them.

SideBarGit adds a git menu when right-clicking on files in the sidebar.

SSH Config highlights syntax in ~/.ssh/config files. It handles essential autocompletion for adding new hosts to the config file, as well.

Subhub uses a combination of a Chrome extension and a Sublime Text package to insert a "Open in Sublime" button to pages in Chrome when you browse projects on GitHub.

Instead of loading a separate diff program, Sublimerge shows diffs for git, in tabs side-by-side. It's nice to see differences, but I haven't been able to actually change the edited file while Sublimerge shows the diff. Instead, Sublimerge forces me back to edit mode to make changes. I'm a fan of SmartSynchronize and its ability to merge and edit in realtime, but it's also annoying to switch to a different application.

SublimeWritingStyle helps you use active, instead of passive voice, and highlights judgmental words like "very", "clearly", and "relatively". Passive voice instances highlight in yellow and judgmental words in grey.

WordCount shows the words and/or characters in a document or a selection within a document, as well as an estimated read time. Digits are excluded. I use the custom WordCount user settings below. A few casual minutes of research suggests that a reasonably educated adult reads at 265 words per minute, depending on whether you're reading for comprehension, for pleasure, or scanning on the Internet.

{ "enable_readtime": true, "readtime_wpm": 265, "whitelist_syntaxes": [ "Markdown", "MultiMarkdown", "Plain Text" ], "blacklist_syntaxes": [ "CSS", "SQL", "JavaScript", "JSON", "PHP" ] }

WordHighlight highlights all copies of a word which is currently highlighted, or optionally, highlights copies of a word which has the insertion cursor upon it. The README file on the Github project page also has a number of configuration options, including treatment of the gutter bar, the status bar, and how to configure the highlight colors.

Xdebug Client can do breakpoints through clicking lines in the Sublime Text screen and set watch expressions. More on this package later.

Post categories Drupal
Categories: Elsewhere

Four Kitchens: Extracting data from Drupal entities the right way

Wed, 26/11/2014 - 19:54

If you’ve ever had to extract data from Drupal entities you know it can be a painful process. This post presents a solution for distilling Drupal entities into human-readable documents that can be easily consumed by other modules and services.

Projects Drupal
Categories: Elsewhere

Drupal 8 Rules: #d8rules update October-November 2014

Wed, 26/11/2014 - 18:28

Curious why the Rules module still isn't finished, even though our initial milestone planning aimed to ship Rules 8.x by 2014? The bad news is we are a bit delayed, but that pretty much aligns with the state of Drupal 8 core. And we are working hard on solving problems in a generic way for Drupal core as a whole, instead of having to fork APIs in Rules.

DrupalCon Amsterdam Recap

We were quite active at DrupalCon Amsterdam: I gave a 12-minute update on the #d8rules initiative (video & slides) and we had some productive discussions on crowdfunding in general at the community summit (notes & related core conversation).

On Thursday, we had a BoF to get contributors up to speed with development for Rules in Drupal 8 and on Friday we sprinted the whole day to port actions and get Milestone 1 finished.

Development status 

As [META] Rules 8.x Roadmap states, most of the funded Milestone 1 tasks have been finished, but a number of core integration tasks are still in progress. The unified context system is going to be used across the Blocks, Conditions, and Actions APIs, as well as related contrib modules like the Drupal 8 version of Page Manager. In Drupal 7, CTools and Rules together with the Entity module basically invented two separate "context systems"; compared to that, this is a big step forward and will bring site builders and developers much better plugin interoperability & re-usability in Drupal 8.

Some core tasks we are currently working on are:

In Rules 8.x-3.x, we recently finished the conversion of all condition and action test cases to PHPUnit-based integration tests, which helps test the plugins, including their annotations, and runs very fast (without a full bootstrap). During the sprints in Amsterdam we worked with contributors on porting more of Rules' actions to the new Action API, including test coverage. We'll continue to work on porting actions with contributors via the issue queue and run sprints at the coming events we attend.

Next steps & events

We anticipate using up the budget raised by the end of 2014. Next year, we will have to look for further funding sources or limit work on Rules 8.x to our free time. Thanks again to everyone supporting us so far!

There are already some great Drupal events lined up for next year where the #d8rules initiative will be present:

DrupalCon Bogota, 10-12 February 2015, will feature a full session on upgrading modules to integrate with Rules in Drupal 8 by dasjo: #d8rules - Web-automation with Rules in Drupal

European Drupal Days Milano, 19-21 March 2015, will include a session, training workshop & sprints provided by fago & dasjo.

Let us know if you want to get involved at any of these events, and see you in the core & Rules issue queues!

dasjo on behalf of the #d8rules team 

Categories: Elsewhere

Code Karate: Drupal 7 File Resumable Upload Module

Wed, 26/11/2014 - 16:04
Episode Number: 181

The Drupal 7 File Resumable Upload Module is a great way to allow your Drupal site to upload large files. This is especially helpful if your server limits the size of files you can upload to your Drupal site. The module simply replaces the standard Drupal file upload field with a better alternative that allows:

Tags: Drupal, Drupal 7, File Management, Media, Drupal Planet
Categories: Elsewhere

David Norman: Acquia Certified Developer - Back end Specialist 2014 study guide

Wed, 26/11/2014 - 14:53

I believe employers should be able to reliably expect that a person who has been certified by Acquia as a back end specialist has some actual experience with using, modifying, and extending Drupal in real-world projects. I don't think it would be appropriate to attempt this exam unless you've got at least 6 months of intense, full-time Drupal experience. It's one thing to have read about hook_menu, but this exam expects that examinees know what sorts of options are in a particular hook, why you would use it, and when and how to override it. Some of the questions are presented in a manner in which a theoretical client is proposing a change to existing functionality or reporting a bug in custom code. Having experience to reference in responding to those situations would be helpful.


The exam was Drupal-centric. You shouldn't need to concern yourself with reviewing peripheral tools like drush, dreditor, Vagrant, Docker, GitHub, Jenkins, Apache, Nginx, Redis, memcache, MySQL, PostgreSQL, IDEs, Xdebug, Varnish, or any particular operating system.

The Back end Specialist exam doesn't ask questions on any particular contributed module like Views, Panels, Features, Configuration Management, ctools, Token, Date, Webform, Rules, Boost, etc. Distributions and testing are absent, too. As I understand, contributed module questions may appear in the cross-discipline, general Acquia Certified Developer exam as part of their Professional Track.

Domain 1.0: Fundamental Web Development Concepts

To my dismay, the test authors really did ask front-end-only questions specific to CSS and Javascript; I think I got 3 of those. I think it detracts from the value of the exam - they could have asked more back end questions instead.

1.1. Demonstrate knowledge of HTML and CSS

https://developer.mozilla.org/en-US/docs/Web/CSS/Pseudo-elements

1.2. Identify PHP programming concepts

1.3. Identify JavaScript and jQuery programming concepts

https://stackoverflow.com/questions/5150363/onchange-open-url-via-select...

1.4. Demonstrate the use of Git for version control

Brush up on git functions that aren't necessarily common for day-to-day tasks.

Debugging

Administration

Domain 2.0: Drupal core API

2.1 Demonstrate an ability to register paths to define how URL requests are handled in Drupal using hook_menu and hook_menu_alter

2.2 Demonstrate ability to build, alter, validate and submit forms using Form API

http://befused.com/drupal/form-validation

2.3 Demonstrate ability to interact with the Node system using hook_node_*

https://api.drupal.org/api/drupal/modules!node!node.api.php/group/node_a...

2.4 Demonstrate ability to interact with the Block system using hook_block_*

2.5 Demonstrate ability to use Core System hooks like hook_boot, hook_init, hook_cron, hook_mail, hook_file*

https://www.drupal.org/node/555118

Documentation

Example projects

2.6 Determine order of hooks to enhance performance

Domain 3.0: Database Abstraction Layer

3.1 Demonstrate ability to work with Drupal's Database Abstraction Layer for managing tables

3.2 Demonstrate ability to work with Drupal's Database Abstraction Layer CRUD operations on data

Domain 4.0: Debug code and troubleshooting

4.1 Demonstrate ability to debug code

4.2 Demonstrate ability to troubleshoot site problems

https://www.drupal.org/node/482956

Domain 5.0: Theme Integration

5.1 Demonstrate ability to work with Drupal's theme CSS and JavaScript APIs

Domain 6.0: Performance

6.1 Demonstrate ability to analyze and resolve site performance issues arising from site configuration

https://api.drupal.org/api/drupal/includes!common.inc/group/block_caching/7

6.2 Demonstrate ability to analyze and resolve site performance issues arising from custom code

6.3 Implement Drupal caching strategies

https://www.drupal.org/node/145279

Domain 7.0: Security

7.1 Demonstrate ability to analyze and resolve security issues arising from site configuration

You should review what access each core module permission grants, particularly where multiple roles could be involved in publishing.

7.2 Demonstrate ability to analyze and resolve security issues arising from site custom code

7.3 Demonstrate the ability to implement Drupal core security mechanisms

https://www.drupal.org/writing-secure-code
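
Two habits that page covers and that the exam leans on: sanitize user-supplied text on output, and use query placeholders instead of string concatenation. A minimal illustration, wrapped in a hypothetical function:

/**
 * Hypothetical example of output sanitization and query placeholders.
 */
function mymodule_secure_example($node) {
  // The @ placeholder runs the value through check_plain() on output.
  $message = t('Viewing @title.', array('@title' => $node->title));

  // Let the database layer escape values via placeholders instead of
  // concatenating user input into the query string.
  $count = db_query('SELECT COUNT(*) FROM {node} WHERE type = :type',
    array(':type' => $node->type))->fetchField();

  return $message . ' ' . format_plural($count,
    '1 node of this type.', '@count nodes of this type.');
}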

Domain 8.0: Community

8.1 Demonstrate the ability to contribute to the community

https://www.drupal.org/contribute/development

8.2 Demonstrate ability to write code using Drupal coding standards

https://www.drupal.org/coding-standards

Post categories: Drupal
Categories: Elsewhere

Paul Booker: Programmatically adding terms into a vocabulary from a structured text file

Wed, 26/11/2014 - 11:06
/**
 * Implements hook_install().
 */
function artist_install() {
  artist_install_vocabularies();
  artist_install_terms();
}

/**
 * Installs artist module's default terms that are read from
 * text files in the module's includes folder.
 */
function artist_install_terms() {
  foreach (array_keys(artist_vocabularies()) as $machine_name) {
    $v = taxonomy_vocabulary_machine_name_load($machine_name);
    $wrapper = entity_metadata_wrapper('taxonomy_vocabulary', $v);
    if ($wrapper->term_count->value() == 0) {
      $path = drupal_get_path('module', 'artist') . '/includes/terms_' . $v->machine_name . '.txt';
      $lines = file($path, FILE_SKIP_EMPTY_LINES);
      artist_install_term_tree($wrapper, $lines);
    }
  }
}

/**
 * Installs a term tree.
 *
 * @param $vwrapper
 *   EntityMetadataWrapper of a taxonomy_vocabulary entity.
 * @param $lines
 *   Array of lines from the term text file. The iterator must be set
 *   to the line to parse.
 * @param $last
 *   Either NULL or the parent term ID.
 * @param $depth
 *   Current depth of the tree.
 */
function artist_install_term_tree($vwrapper, &$lines, $last = NULL, $depth = 0) {
  $wrapper = NULL;
  while ($line = current($lines)) {
    $name = trim($line);
    $line_depth = max(strlen($line) - strlen($name) - 1, 0);
    if ($line_depth < $depth) {
      return;
    }
    elseif ($line_depth > $depth) {
      $tid = $wrapper ? $wrapper->tid->value() : NULL;
      artist_install_term_tree($vwrapper, $lines, $tid, $depth + 1);
    }
    else {
      $data = array(
        'name' => $name,
        'vid' => $vwrapper->vid->value(),
        'parent' => array($last ? $last : 0),
      );
      $term = entity_create('taxonomy_term', $data);
      $wrapper = entity_metadata_wrapper('taxonomy_term', $term);
      $wrapper->save();
      next($lines);
    }
  }
}

/**
 * Installs terms into default vocabularies.
 */
function artist_update_7201(&$sandbox) {
  artist_install_terms();
}

In the preceding code, term names are read from text files that use tab indentation to denote the term hierarchy.
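
For instance, a hypothetical includes/terms_genre.txt could look like this, where each level of indentation is a single tab character and nests a term one level deeper under the line above:

Rock
	Progressive Rock
	Punk
Jazz
	Bebop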

Categories: Elsewhere

Drupal Association News: Drupal Association Board Meeting: 21 November 2014

Wed, 26/11/2014 - 05:27

It is hard to believe, but we just finished our second-to-last board meeting of the year. The Association has grown and changed so much in 2014 and the November meeting was a great chance to talk about some of those changes and what we are planning for 2015. It was a somewhat short public meeting as we spent the bulk of our time in Executive Session to review the financial statements from the last quarter and the staff's proposed 2015 Leadership Plan and Budget. As always, you can review the minutes, the materials, or the meeting recording to catch up on all the details, and here's a summary for you as well.

Staff Update

DrupalCons: DrupalCon Amsterdam is over, and we are now focused on evaluating the event - reviewing all the session and conference evaluations as well as closing up the financials. We will have an in-depth review of the event at the December board meeting. Next up is DrupalCon Latin America, which is progressing nicely with sessions accepted and registration open. One final DrupalCon note is that DrupalCon Los Angeles session submissions should open in January, so mark your calendars for that.

Drupal.org: Drupal.org has been our primary imperative at the Association this year. We've spent 2014 building a team and really working to pay off a mountain of accumulated technical debt while also balancing the need to release new features for the community. We are very pleased that, with the working groups and community feedback, we've been able to release a Drupal.org roadmap for 2015. We also released a new Terms of Service and Privacy Policy after extensive edits based on community feedback. We'll continue to respond to questions and ideas about these documents and make notes for future releases. We have also finally been able to deploy and use a suite of over 400 tests on Drupal.org. This is work that was initiated by Melissa Anderson (eliza411) and she was extremely helpful in getting those tests up and running again. We're thrilled to be using this contribution after all this time and are extremely grateful to Melissa.

Community Programs: We just held our final Global Training Days for 2014, with over 80 companies participating on every continent but Antarctica (c'mon, penguins!). This program has continued to evolve, with new partnerships and curricula used this time around, as well as a plethora of great photos and tweets sent our way.

Marketing and Communications: Joe Saylor and our team have been working with the Security team to develop a follow up to the recent security announcement focused on the lessons learned and changes our community has made in response to the situation. It's another great example of an Association and community volunteer partnership.

Licensing Working Group

As we discussed in the August board meeting, some community members have expressed concern over the slow and sometimes inconsistent response to licensing issues on Drupal.org. In response, a volunteer group came together to draft a charter which was published for comment just after DrupalCon Amsterdam. Some of the main points from the charter include:

  • All members (chair + 4-5) appointed by the Board

  • Scope is limited to licensing of code and other assets on D.O only, not other sites, and not determining WHICH license is used

  • Group responds to issues, does not police for issues

  • Will maintain the whitelist of allowable assets

The Association board voted to adopt the charter, so our next steps are to recruit members, create a queue for licensing issues, and then provide some legal training for our new Working Group members. If you are interested in participating, you can nominate yourself for the Working Group. We are planning to present a slate of candidates to the board for approval in the January 2015 meeting.

3rd Quarter Financials

Once per quarter, the board discusses the previous quarter's financial statements and then votes to approve and publish them. In the November meeting the board approved the Q3 financials:

I recently wrote a post highlighting how to read our financial statements, but will summarize here for you as well. Generally speaking, we are performing well ahead of our budgeted deficit spend. Though we had planned for a -$750,000 budget for 2014, a combination of slow tech team hiring, savings on Drupal.org contractor expenses, and some better-than-budgeted revenue means that we will not operate at nearly that level of loss for 2014. Instead, the burden of the staffing investment we've made will really be felt in 2015. We'll see more of this and have a larger discussion when we release our budget and leadership plan next month.

As always, please let me know if you have any questions or share your thoughts in the comments.

Flickr photo: steffen.r

Categories: Elsewhere

Drupal core announcements: Drupal 8 beta 4 on Wednesday, December 17, 2014

Wed, 26/11/2014 - 02:54

The next beta for Drupal 8 will be beta 4! Here is the schedule for the beta release.

Tuesday, December 16, 2014: Only critical and major patches committed.
Wednesday, December 17, 2014: Drupal 8.0.0-beta4 released. Emergency commits only.
Categories: Elsewhere

Forum One: Using Panels without Panelizer

Tue, 25/11/2014 - 23:23

The Panels and Panelizer modules have opened up a whole world of options for layouts in Drupal, but too often the usage and features of these powerful tools get confused. My goal with this post is to explore some of the capabilities the Panels module has to offer before ever getting into the realm of using Panelizer.

At a high level, the goals of each of these modules break down into the following:

  • Panels: Create custom pages with configurable layouts and components.
  • Panelizer: Configure Panels layouts and content on a per instance basis for various entities.

Often, I see both modules added and used when Panelizer isn’t needed at all. What’s the problem with that? Introducing Panelizer when it isn’t needed complicates the final solution and can lead to unnecessary headaches later in configuration management, content maintenance, and system complexity.

Panels and Panel Variants

Before the introduction of Panelizer, site-builders got by just fine with Panels alone. This original solution is still valid and just as flexible as it ever was. The secret in doing this lies in understanding how variants work and knowing how to configure them.

Default Layout for All Nodes

Once Drupal finds a Panels page in charge of rendering the page request, Panels proceeds through the page’s variants checking the defined selection rules on each. Starting from the first variant, Panels evaluates each set of selection rules until one passes. As soon as a variant’s selection rules pass successfully, that variant is used and the rest below it are ignored. This is why it’s important to pay attention to the order in which you define your variants to ensure you place less strict selection rules later in the sequence.

Using this to your advantage, a good common practice is to define a default variant for your Panels page to ensure there is a baseline that all requests can use. To do this, you’ll need to define a new variant with no selection rules, so the tests always pass, and place it last in the series of variants. Since the selection rules on this variant will always pass, be aware that any variants placed below it will never be evaluated or used.

Custom Layouts per Content Type

Once you have a generic default in place to handle the majority of content items for your Panels page, you can start to tackle the pages that might have more specific or unique requirements. You can do this by creating a new variant above your default and define selection rules to limit its use to only the scenarios you’re targeting.

A common use case for this is the changing layout for content based on content types. To build this out, you need to edit the default node_view Panels page and add a new variant. If this page variant is intended to handle all nodes of this type, I’ll typically name it with the name of the content type so it’s clear. The next step is to configure the selection rules by adding the “Node: Bundle” rule and select the content type we’re building for. Once you save the new variant, any detail pages for that content type should render using the new configuration.

Building on this, a single page can be expanded to handle any number of variants using any combination of selection rules necessary. It’s common to see a variant added for each content type in this way. Examples of further customizations that are possible include:

  • A specific layout for a section of the site matching a specific URL pattern
  • Separate layouts based on the value of a field on the entity
  • Alternate views of a page if the user doesn't have access to it
  • Separate views of a page based on the user's role

Thanks to CTools, these selection rules are also pluggable. This means if you can’t find the right combination of selection rules to enforce the limitation you need, it’s easy to write a new plug-in to add your specific rule.
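
A minimal sketch of such a selection-rule (access) plugin, assuming a hypothetical module mymodule and a hypothetical field_featured checkbox on the node:

// In mymodule.module: tell CTools where this module keeps its plugins.
function mymodule_ctools_plugin_directory($owner, $plugin_type) {
  if ($owner == 'ctools' && $plugin_type == 'access') {
    return 'plugins/access';
  }
}

// In mymodule/plugins/access/featured_node.inc:
$plugin = array(
  'title' => t('Node: featured flag'),
  'description' => t('Passes when the node has the featured checkbox ticked.'),
  'callback' => 'mymodule_featured_node_check',
  'required context' => new ctools_context_required(t('Node'), 'node'),
);

/**
 * Access callback: checks the hypothetical field_featured value.
 */
function mymodule_featured_node_check($conf, $context) {
  if (empty($context->data)) {
    return FALSE;
  }
  $node = $context->data;
  return !empty($node->field_featured[LANGUAGE_NONE][0]['value']);
}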

Avoiding Too Many Variants

Using the existing selection rules allows for a great deal of flexibility. Adding in custom plugins further improves your options to define any number of increasingly specific variants.

It is possible to take this too far, however. The cost of creating a new variant is that your layouts and content configurations are now forked further. Any common changes across them now have to be maintained in independent variants to maintain continuity.

Visibility Rules

It’s important to also remember the other features Panels offers, including visibility rules. Visibility rules are configured on a per-pane basis inside a specific variant’s layouts. These rules offer the same conditions available in variant-level selection rules. Since they’re configured on individual panes, however, you can focus on the component-level differences between pages. A common use case for this is to use the same node variant for multiple content types with similar layouts and configure the unique panes with visibility rules to limit which pages they show on.

To elaborate on this, here's an example. Assuming a default node variant with a two-column layout, we can define the common elements that all nodes will have, such as the page title, rendered content, and maybe a sidebar menu. If we then add the requirement that all article nodes include a list of similar articles in the sidebar, we can accommodate this by placing the correct pane in the sidebar and adding the visibility rule "Node: Bundle". We'll then configure it to use the current node being viewed and limit it to show only when that node is in the "Article" bundle. Now, whenever a node is displayed, it will show the common panes, but the block for similar articles will only show in the sidebar if we're viewing an article node.

Choosing the Right Approach

Once you get to the level of creating panel variants or visibility rules just for single pages, it's usually time to ask if you're using the right tool. When individual page instances need to be different, you face a decision about the best approach.

If it’s only a handful of pages that are each unique, then the most straightforward solution may be to create independent Panels pages for each of these.

If instead, individual instances of the same type need different layouts or configurations, then it may be time to install Panelizer to allow instance-specific overrides.


Categories: Elsewhere

groups.drupal.org frontpage posts: Google Code-In 2014 starts Dec 1st and Drupal is back!

Tue, 25/11/2014 - 23:12

It's that wonderful time of the year when Google Code-In students (ages 13-17) work on tasks for Drupal. Starting on Monday, December 1st, hundreds of students from around the world will be completing tasks for open source organizations, competing to win an all-expenses-paid trip to visit Google in California. Luckily, Drupal was invited back by Google to participate in 2014, and the timing could not be better with the beta releases of Drupal 8. Thanks Google!

The Code-In contest will end January 19th, 2015, and Drupal has an awesome chance to connect with the youth of computer programming. Did you know that many of Drupal's top contributors started in Google Summer of Code and/or Code-In? Even if you don't have ideas for tasks or don't want the responsibility of being a mentor, please hang out on IRC in #drupal-google to help students, and prepare for a flood of questions in #drupal-contribute.

Drupal gained priceless experience and contributions in GCI 2013. We're proud to note that several students even became Drupal 8 core contributors, and a few students have continued contributing to our community after the contest. A final congratulations to our 2013 winners, areke and royal121 - we thank you for all your hard work. Moving forward, we're excited to continue our momentum of contributing to Drupal while having fun educating the next generation of developers.

How can the Drupal community participate? Drupal needs help until the end of the contest in January - it is not too late. We need task ideas, and of course we're always looking for additional mentors. Maybe you can help us by promoting GCI on social media or mentioning it to colleagues? Did we mention you don't have to be a mentor to create tasks? Learn how to add tasks and become a mentor in the details below.

How to become a Drupal GCI Mentor

How to Add Drupal GCI Tasks

Important Links

More information will be available as the contest gets started. Subscribe to gdo/code-in for the latest updates.

Categories: Elsewhere

Colan Schwartz: How to review Drupal code

Tue, 25/11/2014 - 22:26
Topics: 

If you're interested in code quality and providing a means by which to bring Drupal beginners up-to-speed on the coding standards, I recommend reviewing code from all developers. I say "all" developers because everyone needs an editor.

The best way to enforce code reviews is to bake them into your development process. Use a tool like GitLab (free hosted version) to prevent developers from committing code to authoritative branches. Instead, have them fork the project repository and submit merge requests. Someone else can then review them. The reviewer can add in-line comments, wait for the developer to make changes, and then accept the request.

Here are some things to look for when reviewing Drupal code submissions. For some of these, we're assuming Git is being used for version control.

  1. Read, understand and follow the Coding standards.
  2. Install, enable, and use the Coder module on your development sites.
  3. For the purists out there, use the Coder Tough Love module as well.
  4. If you're running a continuous integration (CI) system like Jenkins, check the logs for new errors or warnings on new commits. Either way, make sure your development sandboxes have errors being reported to the screen so that developers can see any new errors that they generate. You'll find a lot of errors in your Drupal log if you're not doing this. (Make them refresh their DBs from your Dev site which already has this enabled.)
  5. Speaking of CI, add one or more code quality inspection tools to the mix such as SonarQube. There's actually a pre-configured Vagrant profile to build a VM with everything already set up. See CI: Deployments and Static Code Analysis with Drupal/PHP for details.
  6. Look for unrelated code reversions in merge requests. That is, if you see code changes that aren't related to what the developer is trying to do, there's something wrong. In most cases, this means the developer's branch is out-of-date with the main development branch. He or she should fetch and merge that branch from the origin repository, fix any conflicts, and then add it to the merge request.
  7. Look for debugging code that wasn't removed, such as dd(), drupal_debug(), and other output functions.
  8. Look for Git conflict markers such as "<<<<<<<", "=======" and ">>>>>>>". These usually indicate a botched conflict resolution.
  9. Notice any lack of comments. Stanzas (small blocks of code that do little things) should be separated by blank lines, each with a comment explaining what it does; see the short sketch after this list. It may be clear to the original developer, but that doesn't help anybody else.
  10. Make sure that modules are installed in the right place. This is usually sites/all/modules/contrib (for upstream modules coming from drupal.org) or sites/all/modules/custom (for modules written specifically for the project).
  11. In theme files, usually somewhere under sites/all/themes, look for any functionality that is not theme-specific. Functionality should always be in modules, not themes, so that if the theme is changed, the site still works as expected. This is an extremely common error for beginners. For example, JavaScript files related to modules shouldn't be in the theme directory, but in the module's own directory.
  12. Ensure consistency in module package names. For custom modules, it's advisable to give the package name the name of the project so that it's clear that these are site-specific. For contributed modules, use what others are using; don't arbitrarily make one up. This helps keep your list of modules organized.
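
A tiny illustration of the stanza style from point 9; the blocked-author rule here is hypothetical:

// Load the author so we can check their account status.
$account = user_load($node->uid);

// Unpublish content whose author has been blocked.
if ($account->status == 0) {
  $node->status = NODE_NOT_PUBLISHED;
}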

These are the most common issues I've discovered while reviewing code. If you have any others, feel free to add them as comments. I can add them to the list here.

Happy reviewing!

Categories: Elsewhere

Bert Boerland: Drupal SplashAwards 2014

Tue, 25/11/2014 - 22:13


On December 12 the Dutch Drupal Foundation will organise the first edition of the "SplashAwards". The award puts outstanding work in the spotlight: the best Drupal projects and community contributions from Belgium and the Netherlands.

Both Drupal agencies and individuals who have achieved extraordinary results get special recognition from inside and outside the Drupal community. The international jury selects winners out of hundreds of contestants in several categories including best government project and best Drupal theme.

The jury includes well-known people from the broader PHP and Drupal community around the world: Joost de Valk (SEO WP fame), Moshe Weitzman (contributor since 2001), Jeffrey "jam" McGuire (evangelist with a mo), Holly Ross (Executive Director DA), Morten Birch Heide-Jorgensen (enfant terrible and good friend :-)), Stefan Koopmanschap (PHP/Symfony guru from the Netherlands), Guido Jansen (Magento fame) and Robert Douglass (Solr fame and, most of all, an all-round friendly chap). They will select the ten winners who will walk home along the canals with a great award and a smiling face.

There are 10 awards to be given, from architecture and commerce to best governmental site and theme. The award show itself will be held for some 100 people in an old cinema in the centre of Amsterdam. We are really looking forward to this event. In fact, it will be the last Dutch Drupal event of the year, and a great year it has been.

From a record-breaking DrupalJam, via the social events around DrupalCon, to hundreds of students getting free training on the Drupal Training Day, and now the bow-tie SplashAwards: the Dutch Drupal community has never been in better shape.

Categories: Elsewhere

3C Web Services: Creating dynamic output on your entity in Drupal 7 using hook_entity_view()

Tue, 25/11/2014 - 22:00

If you need to create dynamic output on an entity when it is displayed on your Drupal site, you have multiple options. One method that is easy to implement is using hook_entity_view().

You can insert this hook function into your custom module (see creating a custom module for more info).

Example:
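
A minimal sketch of such a hook, assuming a hypothetical module name mymodule and hypothetical output:

/**
 * Implements hook_entity_view().
 */
function mymodule_entity_view($entity, $type, $view_mode, $langcode) {
  // Only act on nodes rendered in the full view mode.
  if ($type == 'node' && $view_mode == 'full') {
    // Append a dynamically generated line to the entity's render array.
    $entity->content['mymodule_dynamic'] = array(
      '#markup' => t('Rendered at @time.', array('@time' => format_date(REQUEST_TIME, 'short'))),
      '#weight' => 10,
    );
  }
}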

Categories: Elsewhere

Mediacurrent: Best Practices for Custom Modules

Tue, 25/11/2014 - 20:19

Recently, I had the need to refer back to a custom module I wrote at a previous job years ago and, as old code tends to do, it scared me. As far as I know, this module is still chugging along doing its job to this day and hasn't had any issues. But for it to work for us at Mediacurrent, it needed some serious refactoring.

Categories: Elsewhere
