Planet Drupal

Subscribe to the Planet Drupal feed
Drupal.org - aggregated feeds in category Planet Drupal

Drupal core announcements: Drupal 8 Jersey shore sprint, Asbury Park, June 13-14

Wed, 10/06/2015 - 19:07

June 13-14, 2015, the Central NJ Drupal Group is hosting a core sprint
focusing on the upcoming release of Drupal 8.

The current plan is to focus on "[meta] Remove or document every SafeMarkup::set() call", continuing the work done at the recent Drupal 8 theme system critical issues sprint (June 5-7 in Portsmouth, NH).

See the event post at https://groups.drupal.org/node/468408 for more details, and
registration information.

Categories: Elsewhere

SitePoint PHP Drupal: Multiple Editors per Node in Drupal 7

Wed, 10/06/2015 - 18:00

One of the things that makes Drupal great is its flexible user permission system. The out of the box permissions grid we are all familiar with covers most use cases of controlling what users can and cannot do. It is also very easy for module developers to create new permissions and roles that restrict the logic they implement.

Nevertheless, I have encountered a practical use case where the default configuration options are not enough: you need multiple users to be able to edit a particular node of a given type, without them necessarily having access to edit other nodes of the same type. In other words, the next great article should be editable by Laura and Glenn but not by their colleagues. Out of the box, however, users of a particular role can be masters either of their own content or of all content of a certain type, so this is not immediately possible.

In this article I am going to show you my solution to this problem in the form of a simple custom module called editor_list. Article nodes will have a field where you can select users and only these users (or those who have full access) will be able to edit that particular node. You can find the module already in this git repository and you can install it on your site for a quick start. Do keep in mind that it has a dependency on the Entity Reference module as we will see in a minute.

I will keep the code comments to a minimum to save space but you can find them in the repository if you want. Basic knowledge of Drupal 7 is assumed in the remainder of this tutorial.
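Before we dive in, here is a hypothetical sketch of the core idea (this is not the actual editor_list code, and the field_editors field name is an assumption): a hook_node_access() implementation that grants update access when the current account is referenced in a user reference field on the node.

/**
 * Implements hook_node_access().
 *
 * Hypothetical sketch: allow a user to update an article node when their
 * uid appears in an (assumed) "field_editors" entity reference field.
 */
function editor_list_node_access($node, $op, $account) {
  if ($op == 'update' && is_object($node) && $node->type == 'article') {
    $editors = field_get_items('node', $node, 'field_editors');
    if ($editors) {
      foreach ($editors as $editor) {
        if ($editor['target_id'] == $account->uid) {
          return NODE_ACCESS_ALLOW;
        }
      }
    }
  }
  // Fall back to Drupal's normal access checks for everything else.
  return NODE_ACCESS_IGNORE;
}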

Continue reading Multiple Editors per Node in Drupal 7.

Categories: Elsewhere

Lullabot: Holly Ross on the Drupal Association

Wed, 10/06/2015 - 17:30

In this episode of Hacking Culture, Matthew Tift talks with Holly Ross, the Executive Director of the Drupal Association, about the Drupal community, the Drupal Association, non-profits, business, tax codes, and more. They get into some controversial issues, and some of Holly's answers may surprise you!

Categories: Elsewhere

Dcycle: Add unit testing to legacy code

Wed, 10/06/2015 - 16:40

To me, modern code must be tracked by a continuous integration server, and must have automated tests. Anything else is legacy code, even if it was rolled out this morning.

In the last year, I have adopted a policy of never modifying any legacy code, because even a one-line change can have unanticipated effects on functionality, plus there is no guarantee that you won't be re-fixing the same problem in 6 months.

This article will focus on a simple technique I use to bring legacy Drupal code under a test harness (hence transforming it into modern code), which is my first step before working on it.

Unit vs. functional testing

If you have already written automated tests for Drupal, you know about Simpletest and the concept of functional web-request tests with a temporary database: the vast majority of tests written for Drupal 7 code are based on DrupalWebTestCase, which builds a Drupal site from scratch, often installing something like a site deployment module, using a temporary database, and then allows your test to make web requests to that interface. It's all automatic, and temporary environments are destroyed when tests are done.
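For reference, a functional test class typically looks something like this minimal sketch (module and test names are assumed):

class MyModuleWebTestCase extends DrupalWebTestCase {
  public static function getInfo() {
    return array(
      'name' => 'My module functionality',
      'description' => 'Functional tests for mymodule.',
      'group' => 'My module',
    );
  }

  public function setUp() {
    // Builds a brand new Drupal site and enables mymodule on it.
    parent::setUp('mymodule');
  }

  public function testFrontPage() {
    // Performs a real web request against the temporary site.
    $this->drupalGet('node');
    $this->assertResponse(200);
  }
}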

It's great, it really simulates how your site is used, but it has some drawbacks: first, it's a bit of a pain to set up: your continuous integration server needs to have a LAMP stack or spin up Vagrant boxes or Docker containers, you need to set up virtual hosts for your code, and most importantly, it's very time-consuming, because each test case in each test class creates a brand new Drupal site, installs your modules, and destroys the environment.

(I even had to write a module, Simpletest Turbo, to perform some caching, or else my tests were taking hours to run (at which point everyone starts ignoring them) -- but that is just a stopgap measure.)

Unit tests, on the other hand, don't require a database, don't do web requests, and are lightning fast, often running in less than a second.

This article will detail how I use unit testing on legacy code.

Typical legacy code

Typically, you will be asked to make a "small change" to a function which is often 200+ lines long, uses global variables, performs database requests, and makes REST calls to external services. But I'm not judging the authors of such code -- more often than not, git blame tells me that I wrote it myself.

For the purposes of our example, let's imagine that you are asked to make a change to a function which returns a "score" for the current user.

function mymodule_user_score() {
  global $user;
  $user = user_load($user->uid);
  $node = node_load($user->field_score_nid['und'][0]['value']);
  return $node->field_score['und'][0]['value'];
}

This example is not too menacing, but it's still not unit testable: the function calls the database, and uses global variables.

Now, the above function is not very elegant; our first task is to ignore our impulse to improve it. Remember: we're not going to even touch any code that's not under a test harness.

As mentioned above, we could write a subclass of DrupalWebTestCase which provisions a database, creates a node and a user, populates them, and then runs the function.

But we would rather write a unit test, which does not need externalities like the database or global variables.

But our function depends on externalities! How can we ignore them? We'll use a technique called dependency injection. There are several approaches to dependency injection, and Drupal 8 supports it very well with PHPUnit, but we'll use a simple implementation which requires the following steps:

  • Move the code to a class method
  • Move dependencies into their own methods
  • Write a subclass that replaces dependencies (not logic) with mock implementations
  • Write a test
  • Then, and only then, make the "small change" requested by the client

Let's get started!

Move the code to a class method

For dependency injection to work, we need to put the above code in a class, so our code will now look like this:

class MyModuleUserScore {
  function mymodule_user_score() {
    global $user;
    $user = user_load($user->uid);
    $node = node_load($user->field_score_nid['und'][0]['value']);
    return $node->field_score['und'][0]['value'];
  }
}

function mymodule_user_score() {
  $score = new MyModuleUserScore();
  return $score->mymodule_user_score();
}

That wasn't that hard, right? I like to keep each of my classes in its own file, but for simplicity's sake let's assume everything is in the same file.

Move dependencies into their own methods

There are a few dependencies in this function: the global $user, user_load(), and node_load(). None of these are available to unit tests, so we need to move them out of the function, like this:

class MyModuleUserScore {
  function mymodule_user_score() {
    $user = $this->globalUser();
    $user = $this->user_load($user->uid);
    $node = $this->node_load($user->field_score_nid['und'][0]['value']);
    return $node->field_score['und'][0]['value'];
  }

  function globalUser() {
    global $user;
    return $user;
  }

  function user_load($uid) {
    return user_load($uid);
  }

  function node_load($nid) {
    return node_load($nid);
  }
}

Your dependency methods should generally only contain one line. The above code should behave in exactly the same way as the original.

Override dependencies in a subclass

Our next step will be to provide mock versions of our dependencies. The trick here is to make our mock versions return values which are expected by the main function. For example, we can surmise that our user is expected to have a field_score_nid, which is expected to contain a valid node id. We can also make similar assumptions about how our node is structured. Let's make mock responses with these assumptions:

class MyModuleUserScoreMock extends MyModuleUserScore {
  function globalUser() {
    return (object) array(
      'uid' => 123,
    );
  }

  function user_load($uid) {
    if ($uid == 123) {
      return (object) array(
        'field_score_nid' => array(
          LANGUAGE_NONE => array(
            array(
              'value' => 234,
            ),
          ),
        ),
      );
    }
  }

  function node_load($nid) {
    if ($nid == 234) {
      return (object) array(
        'field_score' => array(
          LANGUAGE_NONE => array(
            array(
              'value' => 3000,
            ),
          ),
        ),
      );
    }
  }
}

Notice that our return values are not meant to be complete: they only contain the minimal data expected by our function: our mock user object does not even contain a uid property! But that does not matter, because our function is not expecting it.

Write a test

It is now possible to write a unit test for our logic without requiring the database. You can copy the contents of this sample unit test to your module folder as mymodule.test, add files[] = mymodule.test to your mymodule.info, enable the Simpletest module and clear your cache.
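For reference, the resulting mymodule.info might look like this minimal sketch (name and description assumed):

name = My module
description = Example module with a unit test.
core = 7.x
files[] = mymodule.test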

There remains the task of actually writing the test: in your testModule() function, the following lines will do:

public function testModule() {
  // Load the file or files where your classes are located. This can
  // also be done in the setUp() function.
  module_load_include('module', 'mymodule');
  $score = new MyModuleUserScoreMock();
  $this->assertTrue($score->mymodule_user_score() == 3000, 'User score function returns the expected score');
}

Run your test

All that's left now is to run your test:

php ./scripts/run-tests.sh --class mymoduleTestCase

Then add the above line to your continuous integration server to make sure you're notified when someone breaks it.

Your code is now ready to be fixed

Now, when your client asks for a small or big change, you can use test-driven development to implement it. For example, let's say your client wants all scores to be multiplied by 10 (30000 should be the score when 3000 is the value in the node):

  • First, modify your unit test to make sure it fails: make the test expect 30000 instead of 3000
  • Next, change your code iteratively until your test passes, as sketched below.
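As a hypothetical sketch of those two steps:

// Step 1: update the expected value so the test fails first.
public function testModule() {
  module_load_include('module', 'mymodule');
  $score = new MyModuleUserScoreMock();
  $this->assertTrue($score->mymodule_user_score() == 30000, 'User score function returns the score multiplied by 10');
}

// Step 2: change the last line of MyModuleUserScore::mymodule_user_score()
// until the test passes again.
return $node->field_score['und'][0]['value'] * 10;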
What's next

This has been a very simple introduction to dependency injection and unit testing for legacy code: if you want to do even more, you can make your Mock subclass as complex as you wish, simulating corrupt data, nodes which don't load, and so on.

I highly recommend getting familiar with PHPUnit, which is part of Drupal 8, and which takes dependency injection to a whole new level: Juan Treminio's "Unit Testing Tutorial Part I: Introduction to PHPUnit", March 1, 2013 is the best introduction I've found.

I do not recommend doing away entirely with functional, database, and web tests, but a layered approach where most of your tests are unit tests, and you limit the use of functional tests, will allow you to keep your test runs below an acceptable duration, making them all the more useful, and increasing the overall quality of new and even legacy code.

Tags: blogplanet
Categories: Elsewhere

Acquia: Real world change with PHP and community: "The sky's the limit."

Wed, 10/06/2015 - 15:01

Michelle Sanver, developer at Liip, and I sat down and talked at SymfonyCon 2014 in Madrid. Michelle and I have a number of interests in common (community, FTW!) and I really enjoyed getting to know her better in a conversation in front of my microphone and camera. We covered her long history in PHP, her SymfonyCon presentation (Life After Assetic: State of Art Symfony2 Frontend Dev), the PHP Renaissance bringing communities together, Michelle's "open source addiction", building PHP applications that touch the lives of almost everyone in Switzerland, and more.

Categories: Elsewhere

CiviCRM Blog: Load test Drupal and CiviCRM with LoadImpact

Wed, 10/06/2015 - 12:28

This has been my approach (together with CiviCoop) to load testing a big site with CiviCRM where most visitors were expected to log in.
Let me know if you would agree with this approach or if you have a better alternative.

Every big Drupal site needs load testing before going live.

These are the key questions you should have answered in the final stages before deployment:

  • How does your infrastructure handle the expected number of visitors?
  • How does it perform with the maximum number of visitors?
  • At what number of visitors does it start to crumble?

For anonymous load testing there are a number of tools available.
For logged in users there are not so many available.

But what about sites where logging in is secured using unique tokens per user visit, like Drupal?

How do you load test those?

Normally you can record or script what you need to post during login.

But sites like Drupal secure their login pages with a unique token, so you do not know beforehand what you will need to post.

With LoadImpact.com that problem is solvable.

LoadImpact automates load testing and gives graphs like:

Example of LoadImpact graphs

To set up a load test on Load Impact for logged-in users, do the following:

Step 1: Record one or more user scenarios with the Load Impact Chrome plugin: https://chrome.google.com/webstore/detail/load-impact-user-scenario/comn...

Step 2: Export user scenario to Load Impact.

Step 3: Look into the generated Lua code and find the GET request to the login page.

Step 4: Change it so the form token is gathered and placed in a variable:

For example:

http.page_start("Page 1")
local pages = http.request_batch({
  {"GET", "https://www.domain.com/user", response_body_bytes=10240}
})
local body = pages[1]['body']
local token = string.match(body, 'input type="hidden" name="form_build_id" value="(.-)"')

Step 5: Find the POST request to the Drupal login page and replace the "form_build_id" value with the token.

if token ~= nil then
  http.request_batch({
    {"POST", "https://www.domain.com/user",
     headers={["Content-Type"]="application/x-www-form-urlencoded"},
     data="form_build_id=" .. token .. "&form_id=user_login&name=<username>&op=Log%20in&pass=<password>",
     auto_decompress=true}
  })
else
  log.error("failed to find token" .. body .. "")
end

And you're done. Now load tests can be performed with thousands of concurrent logged-in users on your Drupal site.

If your user scenario contains other form submissions you can repeat this for the other forms as well.

Using CiviCRM as an example: something similar is needed if CiviCRM searches are performed.

CiviCRM adds a session dependent qfKey to every search. Without the right qfKey a search will not be executed properly, harming the load test.

To solve this you have to execute the following steps in the Load Impact user scenario.

Step 1: Find the GET page for the search and place the qfKey in a variable

local pages = http.request_batch({
  {"GET", "https://www.domain.com/civicrm/contact/search?reset=1", response_body_bytes=102400}
})
local body = pages[1]['body']
local token = string.match(body, 'input type="hidden" name="qfKey" value="(.-)"')

Step 2: Find the POST request to the search page and replace the qfKey with the token:

if token ~= nil then
  http.page_start("Page 5")
  http.request_batch({
    {"POST", "https://www.domain.com/civicrm/contact/search",
     headers={["Content-Type"]="application/x-www-form-urlencoded"},
     data="_qf_Basic_refresh=Search&_qf_default=Basic%3Arefresh&contact_type=&entryURL=https%3A%2F%2Fwww.domain.com%2Fcivicrm%2Fcontact%2Fsearch%3Freset%3D1&group=&qfKey=" .. token .. "&sort_name=&tag=",
     auto_decompress=true}
  })
  http.page_end("Page 5")
else
  log.error("failed to find token" .. body .. "")
end

And you can also do proper CiviCRM searches in your Load Impact user scenario and load test your Drupal+CiviCRM site before deployment.

Originally posted on http://orgis.com/en/blog/web/professional-load-testing-drupal-and-civicr...

 

Categories: Elsewhere

InternetDevels: Your Drupal website security: how you can ensure it

Wed, 10/06/2015 - 08:14

Drupal web development is on the rise today: there are hundreds of thousands of sites built with Drupal, and the community of Drupal developers keeps growing. As a result, Drupal is regularly checked, scanned, and analyzed for security vulnerabilities.

Read more
Categories: Elsewhere

KnackForge: How to install Gitlab 7.8 on Centos 5.5 with Apache and MySQL

Wed, 10/06/2015 - 07:16

GitLab is a web-based Git repository manager with wiki and issue tracking features. GitLab is written in Ruby on Rails.

Installing the GitLab omnibus package isn't difficult if you follow the guide given here. But when it comes to installing GitLab on CentOS 5.5, it isn't that easy, as there are no omnibus packages available for CentOS versions below 6.5. So let's look at the steps that need to be followed to make the GitLab installation successful on CentOS 5.5:

1. Install the development tools necessary to compile applications from source
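On CentOS this typically boils down to something like the following sketch (the exact -devel packages needed vary per setup):

# Install compilers and common build tooling.
yum groupinstall -y "Development Tools"
# Libraries commonly needed when compiling Ruby (and its gems) from source.
yum install -y zlib-devel openssl-devel readline-devel libyaml-devel libxml2-devel libxslt-devel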

Categories: Elsewhere

LevelTen Interactive: DrupalCon LA 2015 Video: Drupal Association Interview

Wed, 10/06/2015 - 07:00

Last week’s DrupalCon interview featured one of Lullabot's Technical Project Managers and Front-End Developers.... Read more

Categories: Elsewhere

Modules Unraveled: 138 Organize and Manage Your Drupal Projects Using Dropfort with Mathew Winstone - Modules Unraveled Podcast

Wed, 10/06/2015 - 07:00
Published: Wed, 06/10/15
Download this episode

Dropfort
  • What is Dropfort?
    • Dropfort is a suite of tools to develop and manage Drupal applications.

On the development side, it integrates with GitHub and GitLab to track commits, issues and tags. It then packages releases based on those tags and lets you share those releases with your Drupal sites. Same way you tag and create releases on Drupal.org. The only difference is the released modules are private. Meaning a site that wants to use those modules needs to authenticate to download them.

For example, if you want to download a custom module to your site from dropfort, you can just do a “drush dl mymodule --source=https://app.dropfort.com/fserver/release-history”. Dropfort generates the same XML data for its modules as does Drupal.org for contrib modules. Meaning the Update module works with Dropfort, all your Drush commands work and Drush make works too. It’s all pretty seamless. The only difference with our XML is that it's not publicly available. Your site has to be allowed to see the update feed which is what you configure in the Dropfort web app itself.

The other half of Dropfort is the operations or “ops” tools. You connect your sites to Dropfort using the Dropfort Update module (which is available on Drupal.org) and it will start doing a few things. The most obvious is tracking the update status of your site. Being a Drupal shop, we monitor a few dozen Drupal sites at once, and logging into each site to see what modules need updating and the status of those sites is time-consuming. What Dropfort lets us do is see all those sites in one dashboard. I can log in and see the update status and status report data from all the sites in one place. Dropfort then uses this data to generate some graphs about your sites. For example it can tell you how many dev modules you’re using across all your sites, it shows you a list of what sites have security updates and it does some fancy calculations to grade your site’s health as well. Lots of metrics to know what’s going on and how things are changing over time.

The last part, and this is what I’ve been working on mostly these last few months, is an Environment manager. It’s still pretty fresh and there are some rough edges but it does work. You can create a set of environments (dev, testing or production) to store machine configurations to both do your development and run your Drupal applications. You can say “I want a server running apache, with MySQL and the Commerce Kickstart distro” on it. Then you can either download a Vagrant file which will provision a vm, or you can download a docker container which will do the same. Or you can run a bash script on an existing machine to link the server to Dropfort and have that configuration deployed. It’s pretty neat stuff. Basically any server anywhere can be turned into a managed Drupal cloud.

Like I said, there’s a whole suite of stuff in here.

  • Why did you create Dropfort? Where did it start?

It all really came from using Feature Server for Drupal 6. When we incorporated Coldfront in 2011 and really turned it into a full time job, we wanted a way to distribute code to all our clients (even though at the time I think we only had one). We set up FServer to deploy code to client sites. But manually creating the releases and the pages and stuff was kind of a pain. So we came up with a special commit syntax we could add to Subversion (this was before Git was a big thing). So we could make our svn commit, and in the message we’d write [release:full], and some post-commit scripts would run on the svn server. They’d look at the commit message and then take the code, create a tgz file, create a tag, commit the tag and then upload the tgz file to FServer using some REST web service endpoints we created on a Drupal 6 site with Services. That would create the release page, the release notes, add the files and generate the update XML. It was pretty well a mini Drupal.org but with Subversion (instead of CVS, which D.O was still using at the time). It actually worked really well. So well in fact that the University of Ottawa asked if we could install a version for them to manage their Drupal stuff (which they’re actually still using today until their git migration is done). That’s when the lightbulb went off, I guess. We had built this stuff for us but as it turns out, other people want to deploy custom modules too! Who’da thunk it?

And that’s when the idea for a more “web app” version of our SVN workflow came to be. At the time I thought “Yeah, we can totally rewrite this in no time, I give it 6 months and we’ll have a web app ready to go”. That was 3 years ago, I think? It took a bit longer than expected. Mostly I kept adding features… Yay, scope creep. But now we’ve got a pretty awesome suite of tools and we’re focusing on the polish now. I’ve been told I’m not allowed to make feature requests for at least a month. We’ll see how that goes.

  • Can this be used to compete with Drupal.org? Meaning can people share public releases here instead of on d.o?

Nope. Public modules should be on Drupal.org. No module can be downloaded from Dropfort without authenticating first. We don’t want to supersede d.o in any way. We actually looked into writing a feature to automatically move a project from being private to public on d.o, but since Drupal.org doesn’t have an API we couldn’t do that. But yeah, this is for privately distributed modules only.

  • Does the monitoring dashboard check both the private projects as well as public ones on d.o?

    • Yes
  • What did you build it with? Is Dropfort open-source?

The original SVN workflow using Subversion, CLI PHP, Drupal 6, 2 custom modules and some REST / Services stuff.

Dropfort uses Git, Drupal 7, Services, about 20 or so custom modules, Puppet and our Drupal 7 port of FServer. So Dropfort is a Drupal application itself. We actually use Dropfort to manage Dropfort. Meaning we track its own updates and status using itself, and it packages releases for itself. A little inceptiony, but it works.

Most of the parts which make up Dropfort are open. Some of the custom modules aren’t openly available. But that’s mostly because we don’t have the bandwidth to help and support the distro on D.o. Especially the stuff involving setting up a Puppet master. We’d spend more of our time debugging that than actually making things work. Doesn’t mean we won’t share everything eventually, just not right now.

  • What’s the plan for Dropfort? Is it a paid service or is it free?

Right now it’s free to use. The main reason for that is we haven’t written the commerce component yet so we can’t actually charge for it so… yeah. But we’re looking at different ways of monetizing. It’s tricky cause we want people to use it but at the same time we don’t really know what people will use. There’s such a variety of things in there it’s tough to decide what should be charged for. For example Github is pretty straightforward in their pay structure. If the code is open, your repo is free. If your code is closed, you pay for the repo. For us, I think it’s a question of usage. We’re leaning towards all tools are free to use with any account, it’s just a question of how much storage or how many sites you’re using. But regardless, anyone who uses it now is free to use it as much as they want. And we’ll have some special plans for early adopters as a thanks for their feedback. More than likely a bunch of free stuff.

  • How does this compare to other tools like Platform.sh, Pantheon, Acquia Dev Cloud?

The big difference is that they’re primarily a hosting platform. Dropfort is a management platform. You can connect a Pantheon site or an Acquia Dev cloud site to Dropfort and use most of the features no problem. You’d probably skip the release packaging stuff and environment management (for now) but the stats tracking and collaboration tools would work just fine. Dropfort doesn’t care where or how you run your Drupal site. As long as it can reach the internet, you can use Dropfort.

But you can use Dropfort with GitHub or GitLab or neither. You can use Vagrant or Docker or both. We do our best to integrate with anything which might make building Drupal applications easier. It’s all about choice.

As for the hosting side of things, we give you tools to deploy your own server or cloud of servers. Meaning you can run an optimized network of Drupal web servers on whatever provider you want. It’s a philosophical difference. We let you host your code and sites wherever you want whereas with others you live on their machines. Which can have a lot of advantages and for the majority of folks out there, that’s fine with them.

But for us we’ve found it difficult for some enterprises here in Canada to get hosting on services in the US which are bound by the Patriot Act. We have FIPA, the Freedom of Information and Privacy Act which states that we can’t share information about users unless the user has explicitly allowed that agency access. The Patriot Act is pretty much the exact opposite of that. So we figured we’d bring most if not all of the advantages of a cloud solution (the optimized configuration, automated deployments / scaling, generated environments) to anyone’s infrastructure. You just supply the hardware, Dropfort does the rest.

I see it as just another option in how you can host your Drupal site. You can choose how much or how little you want to be involved in managing the hosting environment. Whichever way works best for you is the one you should go with.

  • How does this handle dev/staging/live scenarios?
  • How about local?
Use Cases
  • Let’s talk about the current use cases for Dropfort.
    • Managing several sites in one place
    • Create custom, shareable development environments
    • Create releases of projects destined for more than one application
  • Why would you use Dropfort instead of just Git to manage deployments?
    • We use Drush Make for just about everything. We control our releases using Drush make. We apply patches with Drush make. We really like Drush make. And we really don’t like merge conflicts. The number of times I’ve come into a project where the entirety of Drupal core and all the contrib modules are in a single repo, with a team of people all trying to work on it at once, has taught me that’s not the way to work. Treat your projects like d.o does, as self-contained sets of functionality. Use make files to build your application and drush to manage updates (see the sketch after this list). This is how Drupal is designed to work. Drupal is a collection of modules. When you all of a sudden lump it all together into a single repo, you’re breaking how Drupal was meant to be managed.
  • Can you explain a bit more about how Drush Make works?
  • You just mentioned automated updates.
  • What’s in the near, and far future for Dropfort?
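As an illustration of the workflow Mathew describes, a bare-bones Drush make file might look like this sketch (the versions and the private module URL are made-up placeholders):

core = 7.x
api = 2

; Drupal core plus public contrib straight from Drupal.org.
projects[drupal][version] = 7.38
projects[views][version] = 3.11

; A hypothetical private module fetched from an authenticated release server.
projects[mymodule][type] = module
projects[mymodule][download][type] = get
projects[mymodule][download][url] = https://app.dropfort.com/example/mymodule-7.x-1.0.tar.gz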
Episode Links: Mathew on drupal.org, Mathew on Twitter, Dropfort on Twitter, Dropfort Website, Coldfrontlabs.ca, Coldfrontlabs GitHub
Tags: Monitoring, Updates, planet-drupal
Categories: Elsewhere

Drupal CMS Guides at Daymuse Studios: Drupal and Facebook: Login and Connect Integration Tutorial

Wed, 10/06/2015 - 00:12

Integrate Facebook Login with your Drupal website with this tutorial: learn how to use appropriate modules to allow your users to connect with Facebook.

Categories: Elsewhere

Drupal Association News: Bart's Bash: Breaking World Records With Drupal

Tue, 09/06/2015 - 19:44

Breaking a Guinness world record is no easy feat, but in 2014, the folks behind Bart’s Bash did just that. With help from Drupal, they coordinated the world’s largest-ever sailing race — a fundraising event in memory of Andrew “Bart” Simpson.

Bart Simpson was a British sailor who won a gold medal at the 2008 Summer Olympics in Beijing, a silver medal at the 2012 Summer Olympics in London, and medaled in numerous World and European Championships. After Simpson was killed in a sailing accident in May of 2013 while training for the 2013 America’s Cup, his friends and family went on to found the Andrew Simpson Sailing Foundation in his memory.

“Andrew had passed away six months before [we began organizing Bart’s Bash],” said David Bishop, who built the website for Bart’s Bash. David is a sailor and runs NinetyOne Consulting out of Shropshire, England with his wife, who did much of the design work for the Bart’s Bash site. “When we set out initially, our goal was [to reach] only fifty sailing clubs, to raise £10,000, and see 2,000 people on the water."

The Andrew Simpson Sailing Foundation exists to inspire personal growth in young people through sailing. According to the Bart’s Bash 2014 website, "Many of our Olympic sailors have described the first time they were given charge of a boat as their moment of clarity – the first instance they felt true responsibility and in command of their destiny.  Whether or not children will take up sailing as a pastime, many studies have shown that children who are confident, have self worth and personal resilience do better in every way.  They are happier in their personal and family life, they are better able to learn, do better at school and in employment and they are more open to new experiences in life.  We aim to provide an avenue to that fulfilment and have global ambitions to spread the attitude, inspiration and personality of Andrew Simpson around the world."

“Initially, there was a Facebook page that had been set up in memory of Bart, and it only had about five thousand followers,” said David. “So we built a one page website for the event, and I put social sharing buttons on it. We were very quickly up to several thousand shares on Facebook, and hundreds on Twitter.

"Within three or four weeks, over 300 sailing clubs had come to us and said, ‘we want to be involved,’” David continued. “So we had to change what the event was going to be. Initially, we were just going to be a dinghy event in the UK, but because of the international interest from yacht clubs, kitesurfing clubs, model yacht clubs... all these people wanted to be a part of it, and we wanted to accommodate them as much as possible.”

The perfect platform for breaking records

As it turned out, Drupal was the perfect platform for this rapidly-growing event. “The whole concept of Bart’s Bash was that there’s no overriding governance. It's about engaging sailing clubs and getting someone at each venue to say, 'I’ll hold an event here, I’ll manage it,’” said David. “From that point of view it was a massively volunteer, community driven event. We’ve been as open as possible about making sure clubs can make their own pages and manage their own content, to make the event as successful as possible."

For David, that meant building a platform that sailing clubs around the world could use and make their own.

“I’ve built the system so that each club can create their own page,” David said. "They log in to a control panel, upload their own content, and manage it themselves. With the flexibility of the Simple CCK module, and blocks and views, it was possible for me to do rapid development. I built the whole thing myself. I had a little help from a local web development company — a day’s support, maybe — but other than that, one person built this whole system, and the scale it gives you is phenomenal.

“It’s interesting, because one of the areas that this has shown that the foundation can go into is providing services around the world just as a club web page. A lot of sailing clubs might not have a page that looks as nice as this, or that isn’t mobile responsive. But all of this is. So that’s actually one of the services that the foundation is looking at: we’re thinking of turning this into a ‘Learn to Sail' directory where you can find information about sailing at clubs near you.

“It’s amazing how good Drupal is as a platform — it definitely works for something like this,” David continued. “It’s just so flexible and so scalable. We put up the site for the 2014 event, and translated one of the key pages into eight or nine different languages. As you know, you turn on the international module and add the different variations, and you’re done. Drupal is the only platform out there that does this."

“A lovely festival of sailing"

Building a scalable, global website was only the beginning of holding a worldwide race, however.

“One of the biggest challenges was that it was going to be a global race — so how do you rank people racing in different time zones, in different classes?” David said. “We worked with a formula so we could calculate speed — a handicapped speed, if you will — so people in fast boats were adjusted for slow boats. Ultimately it came down to where the wind was in the world on that day. We were fortunate to receive a lot of help from the UK’s Royal Yachting Association with this challenge."

“After the race, we split the results up by age, experience, wind conditions, country, and boat class, which was key,” David continued. “We were able to produce a very nice set of statistics, and that’s something that hasn’t really been done in sailing before. In most sailing races, you just get a very straightforward set of results to see the winners. But it turns out our way was really popular: we saw a lot more traffic to the website after the events and continually for the next few weeks. As more results came in over those few weeks, seeing how the top 10 moved up and down was great."

But for the sailors, it turns out it wasn’t all about winning. “We thought people were going to be obsessed about the results, and we weren’t sure how we’d validate it,” David said. “But in reality, we had massive boats in the same start lines as a 7-year-old kid in a tiny boat. It turned out people didn’t care about the race so much. Instead, it became this lovely festival of sailing."

Breaking world records

With the size of the event, the Bart’s Bash organizers were certain they’d be able to break a world record.

"For Guinness we had to get video of every start and every finish, plus steward and witness statements, and then we had to send each club bundle in. With more than 500 venues around the world participating, we wound up having nearly 10,000 boats qualified as being part of the world record,” said David. In total, the group collected and calculated results for 30,754 sailors across 52 countries around the world.

“It was another great way to get people involved in the event,” he continued. “Telling them that they're going to be a Guinness world record holder."

When it comes to the next year of races, David has high hopes. “For 2015, the Guinness restrictions have been lifted as we want to encourage small clubs who were not large enough to qualify under the rules required by Guinness last year. Also in 2015, we want more non-sailors on the water at more clubs around the world. To help make this happen we have come up with an idea called “Bart’s Buddies” aimed at taking your mates sailing. There will also be a special “Bouy Race” which will make it easier to get all of the wonderful volunteers sailing. To help showcase that, this year’s website is much more geared around showing the photos and the videos taken by each club around the world."

“Ultimately, three things brought the whole event together last year, and are pushing it forward this year, too,” David said. “First, it's a worthy fundraising reason. People want to do something in Andy’s memory. Second, it's a challenge — and sailors love challenges. Lastly, though, it brings a global community together, and Drupal as a platform enabled that to happen. We could create maps showing where people were using Open Layers modules. We could personalize the website for different people, and could drill down data and results.”

“Really, this is the first census for sailing activity done around the world in one day. It hadn’t been done before, which makes this website and event historic from that point of view,” said David. “We’ve been approached by other sailing associations and foundations, saying 'we want to do this, can we use the data you’ve collected.’"

As for what comes next, David is excited for the race coming up in September.

“A big sailing club signed up to participate in Barcelona last year,” David said. “And this year, the race is on 20 September — the day before DrupalCon Barcelona happens. Perhaps we’ll be able to get some Drupalers out there?

“The fact that Drupal exists means that Bart’s Bash happened. It has a lot of thanks to give to Drupal,” David concluded.

If you're interested in participating in Bart's Bash at DrupalCon Barcelona, let us know.

Sailing image credit to Gorazd Božič on Flickr.

Categories: Elsewhere

Cheeky Monkey Media: Responsive images with Foundation Interchange

Tue, 09/06/2015 - 18:58

Having a mobile-friendly, responsive website is always a good idea. Having a responsive website that loads really fast is even better. Large images are often a bottleneck and the cause of slower page loads. A great way to solve this is to serve up different images based on the screen size instead of scaling a large image to fit.

To solve this dilemma, I recently discovered the Zurb Interchange module. Since I already use Foundation as a base theme/framework, I thought I would...
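For context, Foundation's Interchange works from markup roughly like this sketch (the image paths are assumptions, and the named breakpoints follow Foundation's small/medium/large convention):

<img data-interchange="[/images/photo-small.jpg, (small)],
                       [/images/photo-medium.jpg, (medium)],
                       [/images/photo-large.jpg, (large)]">
<script>
  // Initialize Foundation so Interchange watches the named media queries
  // and swaps in the image matching the current screen size.
  $(document).foundation();
</script>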

Categories: Elsewhere

Chromatic: Understanding and Using HSL in Your CSS

Tue, 09/06/2015 - 18:04

Color! Without it, life can be pretty monotone, so I’m going to introduce to you the most awesome of ways to represent it in your CSS: hue, saturation and lightness.

"I use HEX and RGB all the time, what’s so great about HSL?"

HSL is easier to read, modify, and improvise with, and it’s supported back to IE9. To see why it’s awesome and how to become an HSL master, let’s take a look at how HSL works.

Here’s an example:

hsl(30, 75%, 50%);

  • Hue: The color is determined by the hue value, represented as a position in the 360 degrees of the HSL color wheel.

  • Saturation: This ranges from 0 to 100, with 0 being completely desaturated and 100 representing the full saturation of your hue.

  • Lightness: This also ranges from 0 to 100, with 0 denoting black and 100 returning white.

Here’s an HSL color wheel to get an understanding of how this behaves.

Want to change your orange to yellow? Just add another 30° to the hue.

Sure, you could do this with HEX and RGB, but if a request came down the line to add a little green to your color and make it 20% darker, which of the formats below would be easier to interpret and change?

HSL: hsl(60, 75%, 50%);
RGB: rgb(223, 223, 32);
HEX: #dfdf20;

With its simple manipulation, HSL also lets you create common color harmonies fast.

Want your color’s complementary color? No sweat - add 180° to the hue value. Is your hue greater than 180° already? HSL is smart enough to loop around the wheel once more.

$primary-color: hsl(30, 75%, 50%);
$complementary-color: hsl(210, 75%, 50%); // 30 + 180 = 210

Here are some additional color schemes that are common in color theory:

Analogous:

$red: hsl(0, 75%, 50%);
$orange: hsl(30, 75%, 50%);
$yellow: hsl(60, 75%, 50%);

Triadic:

$orange: hsl(30, 75%, 50%);
$blue-green: hsl(150, 75%, 50%);
$purple: hsl(270, 75%, 50%);

Split-complementary:

$orange: hsl(30, 75%, 50%);
$cyan: hsl(180, 75%, 50%);
$blue: hsl(240, 75%, 50%);

If you use Sass, you may know that there are built-in functions that utilize HSL. If you’ve used adjust-hue(), saturate() or darken() for example, you’ve already employed HSL as these derive their values from HSL.
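For example, these pairs arrive at the same colors, once via the Sass helpers and once written directly in HSL (the values are assumed for illustration):

// Base color used by both approaches.
$base: hsl(30, 75%, 50%);

// Complementary hue: the Sass helper and the hand-written HSL agree.
$complement-fn: adjust-hue($base, 180deg); // same as hsl(210, 75%, 50%)
$complement-hsl: hsl(210, 75%, 50%);

// Darkening: darken() simply lowers the lightness channel.
$darker-fn: darken($base, 10%); // same as hsl(30, 75%, 40%)
$darker-hsl: hsl(30, 75%, 40%);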

"Why should I use HSL if Sass can make these adjustments for me?"

Besides making values easier to read when you or someone else alters the colors of a project, it also allows you to write cleaner code as you get more ambitious with your own color schemes.

As an example, let’s create our own pattern based on the analogous principle of color theory with HSL.

Protip: Color schemes tend to work best when the hue difference is wide, but saturation remains similar.

$hue: 40;
$saturation: 100;
$lightness: 70;

$second-color: hsl($hue - 25, $saturation - 20, $lightness - 10);
$third-color: hsl($hue - 15, $saturation - 10, $lightness - 10);
$primary-color: hsl($hue, $saturation, $lightness);
$fourth-color: hsl($hue + 15, $saturation - 15, $lightness);
$fifth-color: hsl($hue + 25, $saturation - 35, $lightness - 15);

You can swap the hue to see how it looks with other colors:

$hue: 210;

Change one value, and you’ve created a different color system. Nice!

I hope you’re down with HSL and see the light of how awesome it is. Now go out, you MacGyver of color, and enjoy your new abilities with HSL!

Categories: Elsewhere

Drupal Watchdog: JSON or XML

Tue, 09/06/2015 - 18:00

Now that Drupal 8 has built-in support for Web Services, you’re likely thinking about exposing the content in your site with an API. But should you make the data available in JSON, XML, or both?

A Short History of XML and JSON

XML and JSON are the primary formats used for data exchange on the web. XML was born when some individuals involved in the Standard Generalized Markup Language (SGML) effort became early adopters of the Web. SGML is a way of defining languages for marking up documents, like HTML; XML borrowed many of the core principles and simplified the rest. The initial draft of XML was completed by a subcommittee of the W3C’s SGML Activity in 1996. Even in the early drafting process, it had support from many large technology companies.

In contrast, JSON (JavaScript Object Notation) is known for having been more discovered than invented: Douglas Crockford saw that language constructs already existing in JavaScript could be used to represent objects as strings. He coined the term JSON for this usage in 2001. It didn’t go through the standardization process, in part because it is a proper subset of the JavaScript standard. When Crockford was told by clients that they couldn’t use JSON because it wasn’t a standard, he bought json.org and put up a Web page declaring it a standard. JSON slowly gained popularity as people discovered the page. Since then, it has become an official standard, and support for encoding to and decoding from JSON has been added to many languages.

The choice between these two has been a topic of debate for nearly a decade.

Why is JSON Better?

JSON is lightweight. It often takes fewer characters to transmit the same information. For example, compare the following data in XML with the same data in JSON.

XML:
<root>
  <foo>text goes here</foo>
  <bar>and here</bar>
</root>

JSON:
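
{
  "foo": "text goes here",
  "bar": "and here"
}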

Categories: Elsewhere

Mediacurrent: Mediacurrent Dropcast: Episode 6

Tue, 09/06/2015 - 17:07

In this episode we have a special guest, Mickey Williamson, who talks about the importance of web accessibility. We also talk about developing a RESTful todo application with Backbone.js and, as always, cover Drupal 8 updates and other Drupal news. If you would like to be a guest or have any questions, email us at dropcast@mediacurrent.com.

Categories: Elsewhere

Red Crackle: How to Install Red Test

Tue, 09/06/2015 - 17:00
Categories: Elsewhere

Drupalize.Me: Create Offsite Backups with NodeSquirrel

Tue, 09/06/2015 - 15:02

In our free Module Monday: Backup and Migrate tutorial we discussed all the benefits and features the module has to offer. In this tutorial I am going to extend the functionality of the module, because something great has happened in the Drupal world: Pantheon, a Drupal hosting provider, has purchased NodeSquirrel, an offsite backup solution created by the makers of the Backup and Migrate module. What is so great about this is that Pantheon is allowing free backups of up to 5 GB. This means there are no more excuses not to have an offsite backup of your Drupal database.

Categories: Elsewhere

Dries Buytaert: The post-browser era of the web is coming

Tue, 09/06/2015 - 10:38

At yesterday's Worldwide Developer Conference keynote, Apple announced its annual updates to iOS, OS X, and the new watchOS. As usual, the Apple rumor blogs correctly predicted most of the important announcements weeks ago, but one important piece of news only leaked a few hours before the keynote: the launch of a new application called "News". Apple's News app press release noted: "News provides beautiful content from the world's greatest sources, personalized for you".

Apple basically cloned Flipboard to create News. Flipboard was once Apple's "App of the Year" in 2010, and it remains one of the most popular reading applications on iOS. This isn't the first time Apple has chosen to compete with its ecosystem of app developers. There is even a term for it, called "Sherlocking".

But forget about Apple's impact on Flipboard for a minute. The release of the News app signifies a more important shift in the evolution of the web, the web content management industry, and the publishing industry.

Impact on content management platforms

Why is Apple's News app a big deal for content management platforms? When you can read all the news you are interested in within News, you no longer have to visit websites for it. It's a big deal because there are half a billion active iOS devices, and Apple will ship its News app to every single one of them. It will accelerate the trend of websites becoming less relevant as an end-point destination.

Some of the other new iOS 9 features will add fuel to the fire. For example, Apple's search service Spotlight will also get an upgrade, allowing third-party services to work directly with Apple's search feature. Spotlight can now "deep link" to content inside of a website or application, further eliminating website or applications as end-points. You could search for a restaurant in Yelp directly from your home screen, and go straight to Yelp's result page without having to open the Yelp website or application. Add to that the Apple Watch which doesn't even ship with a web browser, and it's clear that Apple is about to accelerate the post-browser era of the web.

The secret to the News app is the new Apple News Format, rumored to be an RSS-like data feed with support for additional design elements like images, videos, custom fonts, and more. Apple uses these feeds to aggregate content from different news sources, uses machine learning to match the best content to a given user, and provides a clean, consistent look and feel for articles coming from the various news sources. That is the long way of saying that Apple decides what the best content is for you, and what the best format is to deliver it in. It is a profound change, but for most people this will actually be a superior user experience.

The release of Apple News is further proof that data-driven experiences will be the norm, and of what I have been calling The Big Reverse of the Web: the idea that, for the web to reach its full potential, it will go through a massive re-architecture from a pull-based architecture to a push-based architecture. After the Big Reverse of the Web is complete, content will find you, rather than you having to find content. Apple's News and Flipboard are examples of what such push-based experiences look like; they "push" relevant and interesting content to you rather than you having to "pull" the news from multiple sources yourself.

When content is "pushed" to you by smart aggregators, using a regular web browser doesn't make much sense. You benefit from a different kind of browser for the web. For content management platforms, it redefines the browser and websites as end-points; de-emphasizing the role of presentation while increasing the importance of structured content and metadata. Given Apple's massive install base, the launch of its News app will further accelerate the post-browser era of the web.

I don't know about your content management platform, but Drupal is ready for it. It was designed for a content-first mentality while many competitive content management systems continue to rely on a dated page-centric content model. It was also designed to be a content repository capable of outputting content in multiple formats to multiple end-points.

Impact on publishing industry

Forget the impact on Flipboard or on content management platforms; the impact on the publishing world will be even more significant. The risk for publishers is that they are disintermediated as the distribution channel and that their brands become less useful. It marks a powerful transformation that could de-materialize and de-monetize much of the current web and publishing industry.

Because of Apple's massive installed base, Apple will now own a large part of the distribution channel and it will have an outsized influence on what hundreds of millions of users will read. If we've learned one thing in the short history of the Internet, it is that jumping over middlemen is a well-known recipe for success.

This doesn't mean that online news media have lost. Maybe it can actually save them? Apple could provide publishers large and small with an immense distribution channel by giving them the ability to reach every iOS user. Apple isn't alone with this vision, as Facebook recently rolled out an experiment with select publishers like Buzzfeed and the New York Times called Instant Articles.

In a "push economy" where a publisher's brand is devalued and news is selected by smart aggregators, the best content could win; not just the content that is associated with the most well-known publishing brands with the biggest marketing budgets. Publishers will be incentivized to create more high-quality content -- content that is highly customized to different target audiences, rather than generic content that appeals to large groups of people. Success will likely rely on Apple's ability to use data to match the right content to each user.

Conclusion

This isn't necessarily bad. In my opinion, the web isn't dead, it's just getting started. We're well into the post-PC era, and now Apple is helping to move consumers beyond the browser. It's hard to not be cautiously optimistic about the long-term implications of these developments.

Categories: Elsewhere
