Elsewhere

Jo Shields: mono-project.com Linux packages, January 2015 edition

Planet Debian - Wed, 21/01/2015 - 02:26

The latest version of Mono has been released (actually, it happened a week ago, but it took me a while to get all sorts of exciting new features bug-checked and shipshape).

Stable packages

This release covers Mono 3.12, and MonoDevelop 5.7. These are built for all the same targets as last time, with a few caveats (MonoDevelop does not include F# or ASP.NET MVC 4 support). ARM packages will be added in a few weeks’ time, when I get the new ARM build farm working at Xamarin’s Boston office.

Ahead-of-time support

This probably seems silly since upstream Mono has included it for years, but Mono on Debian has never shipped with AOT’d mscorlib.dll or mcs.exe, for awkward package-management reasons. Mono 3.12 fixes this, and will AOT these assemblies – optimized for your computer – on installation. If you can suggest any other assemblies to add to the list, we now support a simple manifest structure so any assembly can be arbitrarily AOT’d on installation.
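Under the hood, AOT-compiling a single assembly is one mono invocation; a generic sketch (not necessarily the exact command the package scripts run at install time) is:

mono --aot -O=all /usr/lib/mono/4.5/mscorlib.dll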

Goodbye Mozroots!

I am very pleased to announce that as of this release, Mono users on Linux no longer need to run “mozroots” to get SSL working. A new command, “cert-sync”, has been added to this release, which synchronizes the Mono SSL certificate store against your OS certificate store – and this tool has been integrated into the packaging system for all mono-project.com packages, so it is used automatically. Just make sure the ca-certificates-mono package is installed on Debian/Ubuntu (it’s always bundled on RPM-based distributions) to take advantage! It should be installed by default on fresh installs. If you want to invoke the tool manually (e.g. you installed via make install, not packages), use

cert-sync /path/to/ca-bundle.crt

On Debian systems, that’s

cert-sync /etc/ssl/certs/ca-certificates.crt

and on Red Hat derivatives it’s

cert-sync /etc/pki/tls/certs/ca-bundle.crt

Your distribution might use a different path, if it’s not derived from one of those.

Windows installer back from the dead

Thanks to help from Alex Koeplinger, I’ve brought the Windows installer back from the dead. The last release on the website was for 3.2.3 (it’s actually not this version at all – it’s complicated…), so now the Windows installer has parity with the Linux and OSX versions. The Windows installer (should!) bundles everything the Mac version does – F#, PCL facades, IronWhatever, etc, along with Boehm and SGen builds of the Mono runtime done with Visual Studio 2013.

An EXPERIMENTAL OH MY GOD DON’T USE THIS IN PRODUCTION 64-bit installer is in the works, for when I have the time to try and make a 64-bit build of Gtk#.

Categories: Elsewhere

Dimitri John Ledkov: Python 3 ports of launchpadlib & ubuntu-dev-tools (library) are available

Planet Debian - Wed, 21/01/2015 - 01:06
I'm happy to announce that Python 3 ports of launchpadlib & ubuntu-dev-tools (library) are available for consumption.

These are 1.10.3 & 0.155 respectively.
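A quick way to smoke-test the launchpadlib port under python3 is an anonymous API session (a minimal sketch; the client name is arbitrary):

from launchpadlib.launchpad import Launchpad

# Anonymous login needs no credentials; the first argument is just a client name.
lp = Launchpad.login_anonymously('py3-smoke-test', 'production')
print(lp.distributions['ubuntu'].display_name)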

This means that everyone should start porting their reports, tools, and scriptage to python3.

ubuntu-dev-tools has the library portion ported to python3, as I did not dare to switch individual scripts to python3 without thorough interactive testing. Please help out porting those and/or file bug reports against the python3 port. Feel free to subscribe me to the bug reports on launchpad.

For the time being, I believe some things will not be easy to port to python3 because of the elephant in the room - bzrlib. For some things like lp-shell, it should be easy to move away from bzrlib, as non-vcs things are used there. For other things, the current suggestion is to fork out to the bzr binary or a python2 process. I wonder whether a minimal usable python3-bzrlib wrapper around python2 bzrlib is possible to satisfy the needs of basic and common scripts.

On a side note, launchpadlib & lazr.restfulclient have out of the box proxy support enabled. This makes things like add-apt-repository work behind networks with such setup. I think a few people will be happy about that.

All of these goodies are available in Ubuntu 15.04 (Vivid Vervet) or Debian Experimental (and/or NEW queue).
Categories: Elsewhere

ThinkShout: The Clutter and the Deceptively Simple

Planet Drupal - Wed, 21/01/2015 - 01:00

2015 is poised to be a great year for nonprofit technology and the adoption of digital tools to advance the causes we love. While I can’t say that I see too many groundbreaking innovations on the immediate horizon, I do believe that this will be a year of implementation and refinement. Building upon trends that we saw arise last year in the consumer industry and private sector, 2015 will be the year that many nonprofits leap into digital engagement strategies and begin to leverage new tools that will create fundamental change in the way that they interact with their constituencies.

Of course, as always happens when a growing sector first embraces new tools, the nonprofit technology world will see more than its fair share of awkward clunkiness this year, mainly as "software as a service" product companies rebrand their offerings for nonprofits and flash shiny objects at the earnest and hungry organizations we all support.

But as a more general and appealing trend, I believe that we’ll see a slimming down and a focus on polish this coming year. Visual storytelling and "long form" journalism are hopefully on the rise in the nonprofit digital world. We should see more, and better, integrations between web applications, data management systems, and social networks. These integrations will power more seamless and personalized user experiences. Rather than tossing up an incongruent collection of web interfaces and forms delivered by different paid service platforms, nonprofits will be able to present calls-to-action through more beautiful and less cumbersome digital experiences.

Below are some additional thoughts regarding the good stuff (and some of the bad) that we’re likely to see this year. If you have any additional predictions, please share your thoughts!

Visual Storytelling and the Resurgence of Long-Form Journalism

I don’t know about you, but I’m tired of my eyeballs popping out of my head every time I visit a nonprofit’s homepage that attempts to cram 1,000 headlines above the fold. I’m tired of the concept of "a fold" altogether. And don’t get me started about slideshow carousels as navigation: It’s 2015, folks!

Fortunately, we are seeing an elegant slowdown in the pace of writing for the web. Audiences are getting a little more patient, particularly when presented with clean design, pleasing typography, and bold imagery. We’re also seeing nonprofits embrace visual storytelling, investing in imagery and content over whistles and bells and widgets.

Medium and Exposure are my two favorite examples of impactful long-form journalism and visual storytelling on the web. These deceptively simple sites leverage cutting-edge javascript and other complex technologies to get out of the way and let content and visuals speak for themselves.

As an added benefit, adopting this more long-form storytelling approach may help your SEO. Google took bold steps in late 2014 to reward websites that focus on good content. With the release of Panda 4.1, an update to its search algorithm, nonprofits that prioritize long-form writing and quality narrative will start to see significant benefits.

We’re already seeing nonprofits adopt this approach, including one of my new favorites, The Marshall Project. This site cuts away the usual frills and assumes an intelligent audience that will do the work to engage with the content. Don’t get me wrong: The Marshall Project website is slick and surprisingly complex from an engineering and user experience perspective – but its designers have worked hard to bring the content itself to the surface as the most compelling call-to-action.

Interconnectivity

2015 will be a big year for APIs in the CMS space. Teasing out those acronyms, we will see content management systems, like Drupal and WordPress, release powerful tools allowing them to talk with other web applications and tools. Indeed, a new web services layer is a central and much-anticipated feature of the upcoming Drupal 8 release. WordPress made similar strides late last year with the early release of its own REST API.

Leveraging these APIs, 2015 will bring the nonprofit sector more mobile applications that share data and content with these organizations’ websites. The costs for developing these integrations should decrease relative to the usefulness of such solutions, which will hopefully lead to more experimentation and mobile investment among nonprofits. And as mentioned previously, because these new applications will have access to more constituent data across platforms, they will lend themselves to more robust and personalized digital experiences.

On the less technical and more DIY front, 2015 will be marked by the maturation of 3rd-party services that allow non-developers to integrate their online tools. In its awesome post about technology trends in 2015, the firm Frog Design refers to this development as the "emergence of the casual programmer." Services like Zapier, and my new favorite IFTTT, will allow nonprofits to make more out of social networks and services like Google Apps, turn disparate data into actionable analytics, see the bigger picture across networks, and make more data-driven decisions.

More Big (And Perhaps Clunky) Web Apps

If you’ve been following ThinkShout for a while now, you probably know that we are big fans of Salesforce because of its great API and commitment to open data. We maintain the Salesforce Integration Suite for Drupal. At this point, the majority of our client work involves some sort of integration between the Drupal CMS and the Salesforce CRM.

As proponents of data-driven constituent engagement, we couldn’t be more excited to see the nonprofit sector embrace Salesforce and recognize the importance of constituent relationship management (CRM) and CRM-CMS integration. Because of the power of the Salesforce Suite, we can build powerful, gorgeous tools in Drupal that sync data bidirectionally and in real time with Salesforce.

That said, part of the rise of Salesforce in the nonprofit sector over the last two years has been driven by the vacuum created by Blackbaud’s purchase of Convio. And now, with the recent releases of Salesforce’s NGO Connect and Blackbaud’s Raiser’s Edge NXT, both "all-in-one" fundraising solutions with limited website integration potential (in my opinion…), we’re going to see more and more of an arms race between these two companies as they try to “out featurize” each other in marketing to nonprofits. In other words, in spite of the benefits from integrating Drupal and Salesforce, we’re going to see big nonprofit CRM offerings like Salesforce and Blackbaud push competing solutions that try to do everything in their own proprietary and sometimes clunky ecosystems.

The Internet of Things

The Internet of Things (IoT), or the interconnectivity of embedded Internet devices, is not a new concept for 2015. We’ve seen the rise of random smart things, from TVs to refrigerators, over the last few years. While the world’s population is estimated to reach 7.7 billion in 2020, the number of Internet-connected devices is predicted to hit 26 billion that same year. Apple’s announcement of its forthcoming Watch last year heralded the first meaningful generation of wearable technology. Of course, that doesn’t necessarily mean that you’ll want to wear this stuff just yet, depending upon your fashion sense...

(Image from VentureBeat’s coverage of the 2015 Consumer Electronics Show last week. Would you wear these?)

However, the advent of the wearable Internet presents many opportunities to the nonprofit sector, both as a delivery device for micro-campaigns and targeted appeals, and as a tool for collecting information about an organization’s constituency. Our colleagues at BlueSpark Labs recently wrote about how these technologies will allow organizations to build websites that are really "context-rich systems." For example, with an Internet-connected watch synced up to a nonprofit’s website, that organization could potentially monitor a volunteer athlete’s speed and heart rate during a workout. These contextualized web experiences could drive deeper feelings of commitment among donors and other nonprofit supporters.

(Fast Company envisions how the NY Times might cover election results on the Apple Watch.)

Privacy and Security

While not exactly a trend in nonprofit technology, I will be interested to see how the growing focus on Internet privacy and security will affect online fundraising and digital engagement strategies this year.

(A poster for the film The Interview, which, as most of you probably know, incited a major hack of Sony Pictures and spurred international dialogue about cyber security.)

We are seeing more and more startups providing direct-to-consumer privacy and security offerings. This last year, Apple released Apple Pay, which adds security, as well as convenience, to both online and in-person credit card purchases. And Silent Circle just released Blackphone - an encrypted cell phone with a sophisticated and secure operating system built on top of the Android platform.

How might this focus on privacy and security affect the nonprofit sector? It’s hard to say for sure, but nonprofits should anticipate the need to pay for more routine security audits and best practices regarding maintenance of their web properties, especially as these tools begin to collect and leverage more constituent data. They should also consider how their online fundraising tools will begin to support new online payment formats, such as Apple Pay, as well as virtual currencies like Bitcoin.

And Away We Go…

At ThinkShout, we’ve already rolled up our sleeves and are excitedly working away to implement many of these new strategies and approaches for our clients in 2015. What are you looking forward to seeing in the world of nonprofit tech this year? What trends do you see on the horizon? Let us know. And consider swinging by the "Drupal Day for Nonprofits" event that we’re organizing on March 3rd in Austin, TX, as part of this year’s Nonprofit Technology Conference. We hope to dream with you there!

Categories: Elsewhere

Jonathan Wiltshire: Never too late for bug-squashing

Planet Debian - Tue, 20/01/2015 - 23:20

With over a hundred RC bugs still outstanding for Jessie, there’s never been a better time to host a bug-squashing party in your local area. Here’s how I do it.

  1. At home is fine, if you don’t mind guests. You don’t need to seek out a sponsor and borrow or hire office space. If there isn’t room for couch-surfers, the project can help towards travel and accommodation expenses. My address isn’t secret, but I still don’t announce it – it’s fine to share it only with the attendees once you know who they are.
  2. You need a good work area. There should be room for people to sit and work comfortably – a dining room table and chairs is ideal. It should be quiet and free from distractions. A local mirror is handy, but a good internet connection is essential.
  3. Hungry hackers eat lots of snacks. This past weekend saw five of us get through 15 litres of soft drinks, two loaves of bread, half a kilo of cheese, two litres of soup, 22 bags of crisps, 12 jam tarts, two pints of milk, two packs of chocolate cake bars, and a large bag of biscuits (and we went out for breakfast and supper). Make sure there is plenty available before your attendees arrive, along with a good supply of tea and coffee.
  4. Have a work plan. Pick a shortlist of RC bugs to suit attendees’ strengths, or work on a particular packaging group’s bugs, or have a theme, or something. Make sure there’s a common purpose and you don’t just end up being a bunch of people round a table.
  5. Be an exemplary host. As the host you’re allowed to squash fewer bugs and instead make sure your guests are comfortable, know where the bathroom is, aren’t going hungry, etc. It’s an acceptable trade-off. (The reverse is true: if you’re attending, be an exemplary guest – and don’t spend the party reading news sites.)

Now, go host a BSP of your own, and let’s release!


Categories: Elsewhere

Sven Hoexter: Heads up: possible changes in fonts-lyx

Planet Debian - Tue, 20/01/2015 - 21:02

Today the super nice upstream developers of LyX reached out to me (and pelle@), as the former and still part-time lyx package maintainers, to inform us of an ongoing discussion in http://www.lyx.org/trac/ticket/9229. The current approach to fixing this bug might result in a name change of all fonts shipped in fonts-lyx with the next LyX release.

Why is it relevant for people not using LyX?

For some historic reasons beyond my knowledge, the LyX project ships a bunch of math symbol fonts converted to ttf files. From a separate source package they moved to be part of the lyx source package and are currently delivered via the fonts-lyx package.

Over time a bunch of other packages picked this font package up as a dependency. Among them are rather popular packages like icedove, which results in a rather fancy popcon graph. The drawback, as usual, is that changes might have a visible impact in places where you do not expect them.

So if you've some clue about fonts, or depend on fonts-lyx in some way, you might want to follow that issue cited above and/or get in contact with the LyX developers.

If you've some spare time, feel invited to contribute to the lyx packaging in Debian as well. It really deserves a lot more love than the little it gets today from the brave Nick Andrik, Per and myself.

Categories: Elsewhere

Daniel Pocock: Quantifying the performance of the Microserver

Planet Debian - Tue, 20/01/2015 - 20:53

In my earlier blog about choosing a storage controller, I mentioned that the Microserver's on-board AMD SB820M SATA controller doesn't quite let the SSDs perform at their best.

Just how bad is it?

I did run some tests with the fio benchmarking utility.
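My exact fio job file isn't reproduced here, but a representative random-write job looks like this (an assumption: libaio engine, 4k blocks, a 1GB file, queue depth 32):

[rand-write]
rw=randwrite
bs=4k
size=1024m
ioengine=libaio
iodepth=32
direct=1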

Let's have a look at those random writes; they simulate the workload of synchronous NFS write operations:

rand-write: (groupid=3, jobs=1): err= 0: pid=1979
  write: io=1024.0MB, bw=22621KB/s, iops=5655, runt= 46355msec

Now compare it to the HP Z800 on my desk, which has the Crucial CT512MX100SSD1 on a built-in LSI SAS 1068E controller:

rand-write: (groupid=3, jobs=1): err= 0: pid=21103
  write: io=1024.0MB, bw=81002KB/s, iops=20250, runt= 12945msec

and then there is the Thinkpad with OCZ-NOCTI mSATA SSD:

rand-write: (groupid=3, jobs=1): err= 0: pid=30185
  write: io=1024.0MB, bw=106088KB/s, iops=26522, runt= 9884msec

That’s right: the HP workstation is nearly four times faster than the Microserver, but the Thinkpad whips both of them.

I don't know how much I can expect of the PCI bus in the Microserver but I suspect that any storage controller will help me get some gain here.

Categories: Elsewhere

Drupal core announcements: MidWest Developers Summit

Planet Drupal - Tue, 20/01/2015 - 20:33
Start: 2015-08-12 (All day) - 2015-08-15 (All day) America/Chicago
Sprint Organizers: gdemet, webchick, xjm

This is a placeholder to get MWDS on the calendar.
Wednesday August 12 - Saturday August 15

All sprint days. No sessions.
Focus on getting Drupal 8 released (and some key contrib ports to Drupal 8).

More details soon.

Will be hosted in Chicago at http://palantir.net/
2211 N Elston Avenue
Suite 400
Chicago, Illinois 60614

Categories: Elsewhere

Sven Hoexter: python-ipcalc bumped from 0.3 to 1.1.3

Planet Debian - Tue, 20/01/2015 - 20:16

I've helped a friend to get started with Debian packaging and he has now adopted python-ipcalc. Since I've no prior experience with packaging of Python modules and there were five years of upstream development in between, I've uploaded to experimental to give it some exposure.

So if you still use the python-ipcalc package, which is part of all current Debian releases and the upcoming jessie release, please check out the package from experimental. I think the only reverse dependency within Debian is sshfp, which of course also requires some testing.

Categories: Elsewhere

Shomeya: Model Your Data with Drupal and Domain-driven Design

Planet Drupal - Tue, 20/01/2015 - 18:00

One of the things I've blogged about recently when talking about my upcoming book Model Your Data with Drupal is domain-driven design. While domain-driven design is important and something that I hope to touch on in the future, I've decided it's too much to cover in one book, and I'm refocusing Model Your Data with Drupal on basic object-oriented principles.

If you want to know more about this decision and what will be covered in Model Your Data with Drupal, read on.

Categories: Elsewhere

Drupal Watchdog: Something Borrowed, Something Drupal

Planet Drupal - Tue, 20/01/2015 - 17:59
Feature


Drupal 8 represents a radical shift, both technically and culturally, from previous versions. Perusing the Drupal 8 code base, you may find many parts unfamiliar. One bit in particular, though, is especially unusual: a new directory named /core/vendor. What is this mysterious place, and who is vending?

The "vendor" directory represents Drupal's largest cultural shift. It is where Drupal's 3rd party dependencies are stored. The structure of that directory is a product of Composer, the PHP-standard mechanism for declaring dependencies on other packages and downloading them as needed. We won't go into detail about how Composer works; for that, see my article in the September 2013 issue of Drupal Watchdog, Composer: Sharing Wider.

But what 3rd party code are we actually using, and why?

Crack open your IDE if you want, or just follow along at home, as we embark on a tour of Drupal 8's 3rd party dependencies. (We won't be going in alphabetical order.)

Guzzle

Perhaps the easiest to discuss is Guzzle. Guzzle is an HTTP client for PHP; that is, it allows you to make outbound HTTP requests with far more flexibility (and a far, far nicer API) than using curl or some other very low-level library.

Drupal had its own HTTP client for a long time... sort of. The drupal_http_request() function has been around longer than I have, and served as Drupal's sole outbound HTTP utility. Unfortunately, it was never very good. In fact, it sucked. HTTP is not a simple spec, especially HTTP 1.1, and supporting it properly is difficult. drupal_http_request() was always an afterthought, and lacked many features that some users needed.
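For comparison, here is a minimal sketch of an outbound request through the Guzzle client that Drupal 8 exposes as a service (the URL is a placeholder):

<?php
// Obtain the Guzzle HTTP client from Drupal 8's service container.
$client = \Drupal::httpClient();
// Perform a GET request; Guzzle handles headers, redirects and HTTP 1.1 details.
$response = $client->get('https://example.com/api/items');
// The response object exposes the status code and body.
$status = $response->getStatusCode();
$body = (string) $response->getBody();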

Categories: Elsewhere

Blink Reaction: Give me a Swiss knife, Pleeease!

Planet Drupal - Tue, 20/01/2015 - 17:25
All the annoying CSS stuff we don't want to do, in one tool. One tool for stylesheets.
Categories: Elsewhere

EchoDitto Tech Blog: Code Management in Drupal 7 using Features, Ctools, and Panels

Planet Drupal - Tue, 20/01/2015 - 17:22

Code structure is something most Drupal developers wrestle with. There are tons of modules out there that make our lives easier (Views, Display Suite, etc.) but managing database configuration while maintaining a good workflow is no easy challenge. Today I'm going to talk about a few approaches I use in my work here at Echo. We will be using a simple use case of creating a paginated list of blog posts. To start, we're going to talk about the workflow from a high level, then we'll get into the modules that leverage Drupal in a way that makes sense. Finally, we'll have some code samples to help guide things along.

Workflow

This will vary a bit based on what you need, but the idea behind this is we never want to redo our work. Ideally we'd like to design a View or a piece of functionality once locally, then package it and push it up. Features is a big driving force behind this. Beyond that, we want things like page structures and custom code to have a place to live that makes sense. So, for this example we will be considering the idea of a paginated list of Blog Posts. This is a heavy hammer to be swinging at such a solved task, but we will get into why this is good later on.

  • Create a new Feature that requires ctools and panels (and not views!)
  • Open up the generated .module file and declare the ctool plugin directory
  • Create the plugins/content_types/blog_posts.inc file
  • Define the needed functions within blog_posts.inc to make it work
  • Add the newly created content type to a page in Page Manager
  • Add everything we care about to the Feature and export it for deployment
Installation

This only assumes that you have a working Drupal installation and some knowledge of how to install modules. In this case, we will be using drush to accomplish this, but feel free to pick your poison here. Simply run the following commands and answer yes when prompted.

drush dl ctools ds features panels strongarm
drush en ctools ds features panels strongarm page_manager

What we have done here is install and enable a strong foundation on which we can start to scaffold our site. Note that I won't be getting into folder structure too much, but there are some more steps before this you would have to take to ensure contrib, custom, and features all make it to their own place. We wave our hands at this for now.

Features

The first thing we're going to do is generate ourselves a Feature. Simply navigate to Structure -> Features -> Create Feature and you will see a screen that looks very similar to this. Fill out a name, and have it require ctools and panels for now.

This will generate a mostly empty feature for us. The important part we want here is the ability to turn it on and off in the Features UI, and the structure (that we didn't have to create manually!) which includes a .module and .info file is ready to go for us. That being said, we're going to open it up and tell it where to find the plugins. The code to do that is below, and here is a screenshot of the directory structure and code to make sure you're on the right track. Go ahead and create the plugins directory and associated file as well.

function blog_posts_ctools_plugin_directory($owner, $plugin_type) {
  return 'plugins/' . $plugin_type;
}

Chaos Tools

Known more commonly as ctools, this is the module that provides this plugin structure. For our purposes, we've already made the directory and file structure needed. Now all we have to do is create ourselves a plugin. There are three key parts to this: plugin definition, render function, and form function. These are all defined in the .inc file mentioned above. There are plenty of resources online that get into the details, but basically we're going to define everything that gets rendered in code and leverage things like Display Suite and the theme function for pagination. This is what we wind up with:

<?php

/**
 * Plugin definition
 */
$plugin = array(
  'single' => TRUE,
  'title' => t('Blog Post Listing'),
  'description' => t('Custom blog listing.'),
  'category' => t('Custom Views'),
  'edit form' => 'blog_post_listing_edit_form',
  'render callback' => 'blog_post_listing_render',
  'all contexts' => TRUE,
);

/**
 * Render function for blog listing
 * @author Austin DeVinney
 */
function blog_post_listing_render($subtype, $conf, $args, &$context) {
  // Define the content, which is built throughout the function.
  $content = '';

  // Query for blog posts.
  $query = new EntityFieldQuery();
  $query->entityCondition('entity_type', 'node', '=')
    ->entityCondition('bundle', 'blog_post', '=')
    ->propertyCondition('status', NODE_PUBLISHED, '=')
    ->pager(5);

  // Fetch results.
  $result = $query->execute();

  // If we have results, build the view.
  if (!empty($result)) {
    // Load and render the list of nodes.
    $nodes = node_load_multiple(array_keys($result['node']));
    foreach ($nodes as $node) {
      $view = node_view($node, 'teaser');
      $content .= drupal_render($view);
    }

    // Add the pager.
    $content .= theme('pager');
  }
  // Otherwise, show no results.
  else {
    $content = "No blog posts found.";
  }

  // Finally, we declare a block and assign it the content.
  $block = new stdClass();
  $block->title = 'Blog Posts';
  $block->content = $content;
  return $block;
}

/**
 * Function used for editing options on page. None needed.
 * @author Austin DeVinney
 */
function blog_post_listing_edit_form($form, &$form_state) {
  return $form;
}

Some things to note here. We're basically making a view by hand using EntityFieldQuery. It's a nifty way to write entity queries more easily, and it comes with some useful how-tos on Drupal.org. We also offload all rendering to Display Suite and use the built-in pagination that Drupal provides. All things considered, I'm really happy with how this comes together.

Panels

Finally, we need to add this to the page manager with panels. Browse to Structure -> Pages -> Add custom page and it will provide you with a step-by-step process to make a new page. All we're going to do here is add our newly created content type to the panel, as shown here.

And now, we're all ready to export to the Feature we created. Go back and recreate the feature, and you're ready to push your code live. After everything is said and done, you should have a working blog with pagination.


Motivation

Obviously, this example is extremely basic. We could have done this in a View in far less time. Why would we ever want to use this? That's a great question and I'd like to elaborate on why this is important. Views are great and solve this problem just as well. They export nicely with Features and can even play with Panels (if you want to use Views as blocks or content panes). That being said, this is more for the layout of how we would have custom code that works with a lot of Drupal's best practices. Imagine instead if we have a complicated third party API we're trying to query and have our "view" react to that. What if we want a small, code-driven block that we can place discretely with panels? The use cases go on, of course.

There are many ways to solve problems in Drupal. This is just my take on a very clean and minimal code structure that allows developers to be developers and drive things with their code, rather than being stuck clicking around in menus.

Tags: drupal, drupal 7, ctools, panels, features, technology, maintainability
Categories: Elsewhere

Dcycle: Multiple git remotes, the --depth parameter and repo size

Planet Drupal - Tue, 20/01/2015 - 16:31

When building a Drupal 7 site, one oft-used technique is to keep the entire Drupal root under git (for Drupal 8 sites, I favor having the Drupal root one level up).

Starting a new project can be done by downloading an unversioned copy of D7, and initializing a git repo, like this:

Approach #1

drush dl
cd drupal*
git init
git add .
git commit -am 'initial project commit'
git remote add origin ssh://me@mygit.example.com/myproject

Another trick I learned from my colleagues at the Linux Foundation is to get Drupal via git and have two origins, like this:

Approach #2

git clone --branch 7.x http://git.drupal.org/project/drupal.git drupal
cd drupal
git remote rename origin drupal
git remote add origin ssh://me@mygit.example.com/myproject

This second approach lets you push changes to your own repo, and pull changes from the Drupal git repo. This has the advantage of keeping track of Drupal project commits, and your own project commits, in a unified git history.

git push origin 7.x
git pull drupal 7.x

If you are tight for space though, there might be one inconvenience: Approach #2 keeps track of the entire Drupal 7.x commit history. For example, we are now tracking in our own repo commit e829881 by natrak, from June 2, 2000:

git log | grep e829881 --after-context=4
commit e8298816587f79e090cb6e78ea17b00fae705deb
Author: natrak <>
Date: Fri Jun 2 18:43:11 2000 +0000

    CVS drives me nuts *G*

All of this information takes disk space: Approach #2 takes 156Mb, vs. 23Mb for approach #1. This may add up if you are working on several projects, and especially if for each project you have several environments for feature branches. If you have a continuous integration server tracking multiple projects and spawning new environments for each feature branch, several gigs of disk space can be used.

If you want to streamline the size of your git repos, you might want to try the --depth option of git clone, like this:

Approach #3

git clone --branch 7.x --depth 1 http://git.drupal.org/project/drupal.git drupal
cd drupal
git remote rename origin drupal
git remote add origin ssh://me@mygit.example.com/myproject

Adding the --depth parameter here reduces the initial size of your repo to 18Mb in my test, which interestingly is even less than approach #1. Even though your repo is now linked to the Drupal git repo, by running git log you will see that the entire history is not being stored.
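You can verify this yourself with standard git commands (the exact numbers will vary by machine and date):

git log --oneline | wc -l   # a shallow clone lists only the commits since the cutoff
du -sh .git                 # on-disk size of the repository history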

Tags: blog, planet
Categories: Elsewhere

Code Karate: An introduction to Git (part 1)

Planet Drupal - Tue, 20/01/2015 - 16:18

If you are not already using Git on your Drupal websites or projects, now is the time to learn.

Categories: Elsewhere

Sooper Drupal Themes: Drupal CMS Powerstart Tutorial 2: Responsive websites with Bootstrap 3 and Drupal

Planet Drupal - Tue, 20/01/2015 - 15:36

This tutorial will showcase how we have made Bootstrap 3, and especially its responsive grid system, an integral part of the platform, and will show you how to use some easy tools to make any website component or content mobile friendly!

About Bootstrap 3 in CMS Powerstart

The Drupal CMS Powerstart distribution has made Bootstrap 3 an integral part of the platform. The main reason we did this is to leverage the Bootstrap 3 responsive grid system. This grid system is not just functional, practical and effective; it's also widely used, widely understood and very well documented. On top of that, Bootstrap 3 is an active open source project, like Drupal, and is also supported very well in Drupal through a basetheme and various modules. This tutorial will teach you about these integrations and how to use them to create awesome responsive websites with ease. It will focus more on the Drupal integration than on the grid system itself. For a quick introduction to the grid system check out this tutorial. For real life examples check out our Drupal themes.

2.1 Bootstrap on blocks

Forget about themes with 16 regions, or 25 regions. If you're using Bootstrap you really only need full-width regions that stack on top of one another. The horizontal division will be provisioned by block classes, with responsive layout switching that is customized for your content, not for your theme (-designer) or for an outdated wireframe.

In Drupal CMS Powerstart I added the block_class module and added a patch that assists in our responsive designing labours by auto-completing the Bootstrap 3 grid system classes. 
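For example (a hypothetical configuration, not one the distribution imposes), giving a block the following classes makes it span the full width on phones, half the row on tablets, and a third of the row on desktops:

col-xs-12 col-sm-6 col-md-4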

2.2 Bootstrap in Views

To use Bootstrap 3 in views we will use the views_bootstrap Drupal module. Let's take a look at how this module is used to create a Portfolio grid page for the Drupal CMS Powerstart Portfolio component.

powerstart-portoflio-grid-mock2.png

Live demo of portfolio grid.

The views_bootstrap module provides an array of new Views display plugins:

  • Bootstrap Accordion
  • Bootstrap Carousel
  • Bootstrap Grid
  • Bootstrap List Group
  • Bootstrap Media Object
  • Bootstrap Tab
  • Bootstrap Table
  • Bootstrap Thumbnails 


This grid of portfolio thumbnails uses the Bootstrap Grid views display plugin. The Bootstrap Grid plugin allows you to output any content in a grid using Bootstrap's grid html markup. A current shortcoming in the module is that it only allows you to select the number of columns for the 'large' media query. Fortunately, there is a patch for that:

https://www.drupal.org/node/2203111

The Drupal CMS Powerstart distribution has this patch included and uses it in views to create truly responsive grids, where you can set the number of columns per media query. It works quite well out of the box. Here is the views format configuration used for the portfolio:

bootstrap-grid-views-plugin.jpg

As you can see it's really easy to create responsive views with this Views Bootstrap 3 integration! Without writing any code you can leverage the tried and tested responsive systems that are provided by Bootstrap. The views_bootstrap module gives you a whole set of tools that help you build responsive layouts and widgets using your trusted Views backend interface. This means site builders can rely less on themers/programmers and get work done quicker.

Using custom markup in views

The Views Bootstrap module is great at organizing rows of data into responsive layouts, but it doesn't have the same level of support for fields inside a row of data. This is what we did to create a responsive events listing for the Drupal CMS Powerstart events component:

powerstart-events.jpg

Live demo of events view.

The events view uses the 'Unformatted list' plugin that is provided by the views module itself. This prints each row of data in a div container. There are two ways to make the contents of these rows responsive. One would be to generate node teasers inside the rows, and configure the content type's teaser display mode to use grid classes on the fields; this method will be covered in the next part of this tutorial. For the events view we don't use teasers: we are building a fields view because it gives us more flexibility in the fields we show in our view. Luckily the views interface makes it easy for us to add grid classes right where we need them. First, we will add a row class to each views row by clicking Settings under Format and entering row in the Row class field:

bootstap-rowclasses-views.jpg

Now we can add responsive column classes to our fields and they will be organized within each row. We simply add classes by clicking each field and editing the Style Settings CSS class field:

bootstap-classes-views.jpg

The only thing we need to do here is check the Create a CSS class checkbox, and a textbox will appear that allows us to add grid classes to the field. This field uses the class col-sm-6, which makes our event title use 50% of its parent container's width (because Bootstrap uses a 12-column grid) on small devices and up. On an extra-small device no grid class is active and the title uses 100% of its parent container's width, as you can see in the mock-up above. This method isn't as easy as the point-and-click method discussed earlier, but if you are already familiar with the views interface it will become intuitive with a little practice and will give you very fine-grained control over responsive behavior in your views.

2.3 Bootstrap in Fields

Often you want to organise content fields in a layout. A module that can be of help here is Display Suite, but even with the ds_bootstrap_layouts extension this will give you a limited set of layouts. We can easily build any layout by simply adding bootstrap grid classes to fields. This is not to say I don't like Display Suite, but since CMS Powerstart focuses on simplicity I will choose the simplest solution. 'Big' tools like Panels and Display Suite are definitely more appropriate for larger Drupal projects.

As an example I will start building a new Drupal CMS Powerstart component. There was a feature request for a 'shop' component, so we will be building a content type as part of a simple component that will help brick and mortar shops display their inventory. First we will create a new content type called Object. Since Bootstrap columns need to be wrapped in row classes, we are adding the field_group module. Once you have downloaded and enabled the field_group module, you will have a new option 'Add new group' under the manage fields tab of your Object content type. We are adding a group called Bootstrap row using the default widget fieldset. Now drag the image and body fields to the indented position under the Bootstrap row field group. This will create a visual indication in the node/add and node/edit interface that fields belong to the same group. Your Manage Fields interface should now look like this:

bootstap-field-group.jpg

Next we will go to the Manage Display tab of the Object content type. This is where the Bootstrap magic happens. Our goal is to display the body text and image field beside each other on big devices and above one another on small devices. First, we have to create our Bootstrap row group again; this time we add a group named Bootstrap row and give it the 'Div' format. Give our field group the following configuration settings:

  • Fieldgroup settings: open
  • Show label: no
  • Speed: none
  • Effect: none
  • Extra CSS classes: row (you can remove the default classes)

Next we will drag the Body and Image fields to the indented position under the field group. Now we simply configure the field formatters to use the Bootstrap grid classes of our choice. To add these classes in the Manage Display interface we are going to install another module: field_formatter_class. Once you have downloaded and enabled this module you can go back to the Manage Display interface and you will see an option to add a class on each field. You will now set both the Body and Image fields to have the Field Formatter Class col-sm-6. This will create a 2-column layout on devices wider than 768px and a stacked layout on smaller devices. If you are using Drupal CMS Powerstart, you can set the Image style of your image field to Bootstrap 3 col6. This will resize the image to exactly fit the 6-column grid container.

Your Manage Display tab should now look like this: 

bootstap-field-group-display.jpg

Now if you create a node using your new content type it should look similar to this:

bootstap-field-group-node.jpg

Using our new fieldgroup tool we can easily add bootstrap rows and columns to any content type, and since classes are listed and edited in the Manage Fields interface, it's relatively quick and easy to manage per-node layouts. At least it's a step up from managing a ton of node templates.

2.4 Bootstrap in Content: Shortcodes

Sometimes you (or a client) just want to create a special page that needs more attention than other pages of the same type. Unfortunately there aren't any free tools that give our clients a true WYSIWYG experience for creating responsive Bootstrap grids. If you know one, please let me know! Our fallback option is the bs_shortcodes module that I ported from a Wordpress plugin. This module lets you add nearly all Bootstrap components, including grid elements, using a WYSIWYG-integrated form.

To see the power and flexibility of what you can do with these shortcodes, check out this demo page:

http://glazed-demo.sooperthemes.com/content/responsive-columns

This system leverages the Drupal Shortcode API, which is a port of the Wordpress shortcode API. The Drupal CMS Powerstart distribution ships with a WYSIWYG component that includes CKEditor 4 with the necessary Shortcode API and shortcode-provisioning submodules. Since the configuration of this setup is complex and beyond the scope of this article, I'm just going to assume you are using Drupal CMS Powerstart and are ready to use the WYSIWYG with Shortcodes integration.

To create a simple 2 column layout like in the previous examples we first add a row shortcode:

bootstrap-shortcode-row.png

Then we select the column shortcode and find the code that corresponds to 6 columns on small devices:

bootstrap-shortcode-column.png

Now if we use two 6-column shortcodes and put in the same content used in the Field and Field Group tutorial, it will look like this in the editor:

bootstrap-shortcode-full.png
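For reference, the markup behind such a layout is roughly the following; treat the exact shortcode names and attributes as an assumption, since they depend on the bs_shortcodes version:

[row]
[column lg="6" sm="6"]Body text for the left column...[/column]
[column lg="6" sm="6"]Image or other content for the right column...[/column]
[/row]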

After saving the page it will look exactly like the Test Object page we created in the previous tutorial. I admit that shortcodes are a rather crude tool for a complex problem, but anyone who is willing to learn the basic principles of a 12 column grid system will have a huge amount of flexibility and capability in creating responsive content. When you combine the Bootstrap 3 grid documentation, the WYSIWYG integration, and for emergencies the documentation of the Wordpress plugin, you already have a fully documented tool for savvy clients who don't want to deal with raw HTML code. Shortcodes don't seem like the most user-friendly tool, but I've seen clients pick it up quickly and appreciate the flexibility it gives them in organising their most important pages. In the future we might see improvement in this area from tools like Visual Composer and the Drupal-compatible alternative Azexo Composer.

In Part 3 of this tutorial series I will write about using shortcodes as a site building tool and demonstrate what you can do with shortcodes in a real life Drupal CMS project. To get a sneak preview of the shortcode elements I will be using, check out our Drupal themes.


Tags: planet drupal, Bootstrap 3, Views, Fields API, Shortcodes, cms powerstart, Drupal 7.x
Categories: Elsewhere

Drupal core announcements: Drupal 8 beta 5 on Wednesday, January 28, 2015

Planet Drupal - Tue, 20/01/2015 - 15:36

The next beta for Drupal 8 will be beta 5! Here is the schedule for the beta release.

Tuesday, January 27, 2015: Only critical and major patches committed.
Wednesday, January 28, 2015: Drupal 8.0.0-beta5 released. Emergency commits only.
Categories: Elsewhere

Raphael Geissert: Edit Debian, with iceweasel

Planet Debian - Tue, 20/01/2015 - 08:00
Soon after publishing the chromium/chrome extension that allows you to edit Debian online, Moez Bouhlel sent a pull request to the extension's git repository: all the changes needed to make a firefox extension!

After another session of browser extensions discovery, I merged the commits and generated the xpi. So now you can go download the Debian online editing firefox extension and hack the world, the Debian world.

Install it and start contributing to Debian from your browser. There's no excuse now.

Categories: Elsewhere

VM(doh): Be Careful with Large Select Lists on Drupal Commerce Line Item Type Configuration

Planet Drupal - Mon, 19/01/2015 - 23:26

Recently, we were debugging some performance issues with a client's Drupal Commerce website. After doing the standard optimizations, we hooked up New Relic so we could see exactly what else could be trimmed.

The site is using different line item types to differentiate between products that should be taxed in different ways. Each line item type has a field where administrators can select the tax code to use for that line item type. The options for the select list are populated via an API call to another service provider. The call for the list was using the static cache because it was thought that the list would only be populated when needed on the line item type configuration page. In reality, that's not the case.

When an Add to Cart form is displayed in Drupal Commerce, it also loads the line item type and the line item type's fields. When loading the fields, it loads all of the options even if the "Include this field on Add to Cart forms for line items of this type" option is not enabled for that field. In this case, it resulted in 90 HTTP calls to populate the list of tax codes every time someone viewed a page with an Add to Cart form.

The solution was to actually cache those results using Drupal's Cache API. You can see the improvement:
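A minimal sketch of that fix (module and function names are hypothetical) combines the static cache with Drupal 7's persistent Cache API:

<?php
function mymodule_get_tax_codes() {
  // Static cache: avoid repeated lookups within a single request.
  $codes = &drupal_static(__FUNCTION__);
  if (!isset($codes)) {
    // Persistent cache: avoid the remote API calls across requests.
    if ($cache = cache_get('mymodule_tax_codes')) {
      $codes = $cache->data;
    }
    else {
      // The expensive part: HTTP calls to the external tax code service.
      $codes = mymodule_fetch_tax_codes_from_api();
      // Keep the result for an hour.
      cache_set('mymodule_tax_codes', $codes, 'cache', REQUEST_TIME + 3600);
    }
  }
  return $codes;
}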

Categories: Elsewhere

Daniel Pocock: jSMPP project update, 2.1.1 and 2.2.1 releases

Planet Debian - Mon, 19/01/2015 - 22:29

The jSMPP project on Github stopped processing pull requests over a year ago and appeared to need some help.

I've recently started hosting it under https://github.com/opentelecoms-org/jsmpp and tried to merge some of the backlog of pull requests myself.

There have been new releases:

  • 2.1.1 works in any project already using 2.1.0. It introduces bug fixes only.
  • 2.2.1 introduces some new features, API changes, and bigger bug fixes

The new versions are easily accessible for Maven users through the central repository service.
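For Maven users, pulling in the new release looks like this (assuming the usual org.jsmpp coordinates; verify them on Central before relying on this):

<dependency>
  <groupId>org.jsmpp</groupId>
  <artifactId>jsmpp</artifactId>
  <version>2.2.1</version>
</dependency>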

Apache Camel has already updated to use 2.1.1.

Thanks to all those people who have contributed to this project throughout its history.

Categories: Elsewhere

Daniel Pocock: Storage controllers for small Linux NFS networks

Planet Debian - Mon, 19/01/2015 - 14:59

While contemplating the disk capacity upgrade for my Microserver at home, I've also been thinking about adding a proper storage controller.

Currently I just use the built-in controller in the Microserver. It is an AMD SB820M SATA controller. It is a bottleneck for the SSD IOPS.

On the disks, I prefer to use software RAID (such as md or BtrFs) and not become dependent on the metadata format of any specific RAID controller. Hardware RAID controllers also don't offer the checksumming feature that is available in BtrFs and ZFS.

The use case is NFS for a small number of workstations. NFS synchronous writes block the client while the server ensures data really goes onto the disk. This creates a performance bottleneck. It is actually slower than if clients are writing directly to their local disks through the local OS caches.
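A rough way to feel that penalty (a generic sketch; the NFS mount point is hypothetical) is to compare synchronous small writes over NFS against a local disk:

# 4k synchronous writes over NFS (assumes /mnt/nfs is an NFS mount)
dd if=/dev/zero of=/mnt/nfs/testfile bs=4k count=1000 oflag=dsync
# the same workload on a local filesystem, for comparison
dd if=/dev/zero of=/tmp/testfile bs=4k count=1000 oflag=dsync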

SSDs on an NFS server offer some benefit because they can complete write operations more quickly and the NFS server can then tell the client the operation is complete. The more performant solution (albeit with a slight risk of data corruption) is to use a storage controller with a non-volatile (battery-backed or flash-protected) write cache.

Many RAID controllers have non-volatile write caches. Some online discussions of BtrFs and ZFS have suggested staying away from full RAID controllers though, amongst other things, to avoid the complexities of RAID controllers adding their metadata to the drives.

This brings me to the first challenge though: are there suitable storage controllers that have a non-volatile write cache but without having other RAID features?

Or a second possibility: out of the various RAID controllers that are available, do any provide first-class JBOD support?

Observations

I looked at specs and documentation for various RAID controllers and identified some of the following challenges:

Next steps

Are there other options to look at, for example, alternatives to NFS?

If I just add in a non-RAID HBA to enable faster IO to the SSDs will this be enough to make a noticeable difference on the small number of NFS clients I'm using?

Or is it inevitable that I will have to go with one of the solutions that involves putting a vendor's volume metadata onto JBOD volumes? If I do go that way, which of the vendors' metadata formats are most likely to be recognized by free software utilities in the future if I ever connect the disk to a generic non-RAID HBA?

Thanks to all those people who provided comments about choosing drives for this type of NAS usage.

Related reading
Categories: Elsewhere
