I recently had time to install and take a look at Drupal 8. I am going to share my first take on Drupal 8 and some of the hang-ups that I came across. A few other blog posts I read mentioned not to rely too heavily on one source for D8 documentation; with the rapid pace of change in D8, information becomes outdated rather quickly.
If the stress of pressing the wrong button on a live website is familiar to you, you need the Environment Indicator module! On large Drupal projects you will ...
The monthly security release window for Drupal 6 and Drupal 7 core will take place on Wednesday, August 19.
This does not mean that a Drupal core security release will necessarily take place on that date for either the Drupal 6 or Drupal 7 branches, only that you should prepare to look out for one (and be ready to update your Drupal sites in the event that the Drupal security team decides to make a release).
There will be no bug fix/feature release on this date; the next window for a Drupal core bug fix/feature release is Wednesday, September 2.
We met again today to discuss critical issues blocking Drupal 8's release (candidate). (See all prior recordings). Here is the recording of the meeting video and chat from today in the hope that it helps more than just those who were on the meeting:
If you also have significant time to work on critical issues in Drupal 8 and we did not include you, let me know as soon as possible.
The meeting log is as follows (all times are CEST real time at the meeting):
[11:07am] alexpott: https://www.drupal.org/node/2501931
[11:07am] Druplicon: https://www.drupal.org/node/2501931 => Remove SafeMarkup::set in twig_render_template() and ThemeManager and FieldPluginBase:advancedRender [#2501931] => 117 comments, 34 IRC mentions
[11:08am] alexpott: https://www.drupal.org/node/2549943
[11:08am] Druplicon: https://www.drupal.org/node/2549943 => [plan] Remove as much of the SafeMarkup class's methods as possible [#2549943] => 16 comments, 6 IRC mentions
[11:14am] plach: https://www.drupal.org/node/2542748
[11:14am] Druplicon: https://www.drupal.org/node/2542748 => EntityDefinitionUpdateManager::applyUpdates() can fail when there's existing content, leaving the site's schema in an unpredictable state, so should not be called during update.php [#2542748] => 100 comments, 24 IRC mentions
[11:56am] catch: https://www.drupal.org/node/2551341
[11:56am] Druplicon: https://www.drupal.org/node/2551341 => Update test database dump should be based on beta 12 and contain content [#2551341] => 0 comments, 1 IRC mention
[11:58am] jibran: https://www.drupal.org/node/2464427
[11:58am] Druplicon: https://www.drupal.org/node/2464427 => Replace CacheablePluginInterface with CacheableDependencyInterface [#2464427] => 157 comments, 22 IRC mentions
[12:01pm] jibran: plach: is it bells time on the call? :P
[12:02pm] plach: jibran: yeah :)
[12:08pm] jibran: https://www.drupal.org/node/2349819
[12:08pm] Druplicon: https://www.drupal.org/node/2349819 => String field type doesn't consider empty string as empty value [#2349819] => 89 comments, 8 IRC mentions
One of the core tools used in many software development circles is Git - a 'version control system' which enables individuals and groups to have a complete record of all the code changes throughout the life-cycle of a project. As you can imagine, starting to use version control to save progress, rather than just the usual Ctrl + S, takes a bit of getting used to!
As a new developer, learning all of this stuff is pretty intimidating along with all of the other new knowledge. So I wrote the following guide, which is the document I would've liked to have been given when I started.
Introduction
Have no fear, Git is here to help. Nothing is lost once committed to a repository. And there are many ways to organise your work, although this flexibility does mean it can be more complicated to store/locate your work. Commits are the core concept though, so get used to them.
It is useful to append the idea of 'always commit your work' to the paradigm 'always save your work'.
Git stores different "versions of reality" and you can switch between them like a multiverse! If you want to branch off and do something weird and crazy this is perfectly fine - everything in the other universes (branches) is unharmed.
Here are a couple of good tutorial links to get you started:
- Practice and visualise manipulating an example Git repository
- Hands on, step-by-step, bite-sized explanation of the core git concepts
A commit is a change to a set of files. It can consist of new files, altered files, deleted files and directories, etc. In the real world a commit represents a logical/intuitive point to package some work you've done: after a certain piece of functionality has been completed, or everything required for a ticket has been done, or just at the end of the day (although this isn't considered good practice by all, as it results in incomplete work in the repository). Every commit is given its own message, so you have the opportunity to show why this is a logical or intuitive place to be committing.
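As a throwaway sketch (the directory, file, and message names are all invented), here is a tiny repository with two commits, each packaged with its own message:

```shell
set -e
# Create a disposable repository to play in
rm -rf commit-demo && mkdir commit-demo && cd commit-demo
git init -q
git config user.email "demo@example.com" && git config user.name "Demo"

echo "first draft" > notes.txt
git add notes.txt
git commit -qm "Add first draft of notes"    # each commit gets its own message

echo "second draft" > notes.txt
git commit -qam "Revise notes after review"  # -a stages the tracked-file change

git log --oneline    # newest first: two commits, each with its message
cd ..
```

Running `git log --oneline` like this is a quick way to see the sequence of commits you have built up.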
There can be a very large number of commits in a repository, which are connected in a variety of ways. One thing that is important to get your head around is that you can shift the whole filesystem to be the way it was at any commit. This is referred to as a "checkout" of a certain place in the structure (although the command is used slightly differently elsewhere). The process of checking out moves the "HEAD" to a different place in the repository and therefore looks at a version of the files from a different point in history.
Staging Areas
The mechanism for creating commits is to place the required changes, piece by piece if required, into the staging area. Git is aware of things in the 'work tree' (the filesystem you are looking at) that are different to the repository. It allows you to choose from these changes by executing commands like these:
Put one file into the staging area:
~$ git add path/to/altered.file
Stage any changes within this directory:
~$ git add path/to/directory
This stages AND performs the next step - committing - all in one:
~$ git commit -am "useful message"
This is quicker but less flexible. Be sure to know which files are going into the commit. Also, this only adds files that are already in the repo. If you need to add a new file for the first time, you have to use 'git add' first.
After adding some files, you can run 'git status' to display everything staged for a commit in green. If you are happy with what is prepared (you can use 'git diff --staged' to see the details of what has been staged), go ahead and run:
~$ git commit -m "useful message"
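Putting the staging steps together in a disposable repository (names invented) looks like this:

```shell
set -e
rm -rf staging-demo && mkdir staging-demo && cd staging-demo
git init -q
git config user.email "demo@example.com" && git config user.name "Demo"

echo "hello" > greeting.txt
git add greeting.txt       # stage the new file
git status --short         # prints "A  greeting.txt": a staged addition
git commit -qm "add greeting"

echo "hello world" > greeting.txt
git diff                   # shows the not-yet-staged change in detail
git commit -qam "expand greeting"   # -a stages tracked-file changes, -m sets the message
git status --short         # prints nothing: the work tree is clean again
cd ..
```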
That commit is now locked down in the repository. You will always be able to go back to it, even after future commits.
Branches
Branching in Git is the action of deviating a sequence of commits from any pre-existing sequence. This is like taking your code to another dimension - where it can't mess up anything in another one. The ability to branch gives lots of flexibility for different workflows and gives Git its characteristic look in diagrams.
A typical workflow may consist of using a branch for working on a particular ticket or developing a piece of functionality. This "feature branch" will most likely deviate from a main development branch. Once created, this branch will contain any number of commits working towards a finalised unit of work. After the work is ready, this branch can be merged back into the dev branch, and this becomes the new start point for future feature branches.
Branchy Commands
Create a branch (relative to the current branch):
~$ git branch name-of-branch
This branch will deviate from wherever HEAD is in the repo, though the new branch will not be checked out at this stage.
List all available branches:
~$ git branch
Create a new branch and check it out in one step:
~$ git checkout -b name-of-branch
Repositories
Most workflows will consist of interactions with different repositories. Typically you will use a hosted central repository (BitBucket, GitHub etc.) which multiple collaborators can send their work to. It is in the central repository that the main workflow (feature branching and managing merges) occurs.
It is good practice to keep your local repo similar in structure to the central remote one, although this does not necessarily need to happen. In principle you can perform the commands below between any arbitrary repos, and Git will do its best to handle any differences it encounters. For example, two repos may not have the same branch structure, i.e. branches with the same content may have different parent commits. This is not necessarily a problem, depending on the nature of the code changes, but it may explain issues with merges later on down the line.
Commands and concepts for moving changes between repositories.
fetch - Gets commits from a remote repository so they exist on a branch parallel to the local counterparts. So for example 'git fetch origin' gets a local version of the branch that is called "dev" in the remote repo (which is usually referred to as "origin"), but locally is called "origin/dev". This sounds confusing, but the idea of the "parallel" branches on fetching is illustrated in this interactive tool - just type "git fetch" in the command box!
pull - Does the above except it would also attempt to bring the local version of the "dev" branch up to date with any new stuff by performing a merge. The merge happens locally. The 'git pull' command acts on the current branch and is quite automatic. Here's a good summary of the differences between fetch and pull.
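The fetch/pull distinction can be seen end to end in a toy setup where a local bare repository stands in for the hosted remote (all names here are invented for the sketch):

```shell
set -e
rm -rf remote-demo && mkdir remote-demo && cd remote-demo
git init -q --bare central.git   # stands in for the hosted remote ("origin")
git clone -q central.git alice && git clone -q central.git bob
for r in alice bob; do
  git -C "$r" config user.email "demo@example.com"
  git -C "$r" config user.name "Demo"
done

# Alice commits some work and sends it to the central repo
( cd alice
  echo "shared work" > work.txt
  git add work.txt && git commit -qm "first shared commit"
  git push -q -u origin HEAD )

# Bob fetches: his origin/* tracking branches update, but his own branch does not move
git -C bob fetch -q origin
# Bob pulls: fetch plus a merge, bringing his current branch up to date
git -C bob pull -q origin "$(git -C alice symbolic-ref --short HEAD)"
cat bob/work.txt   # prints "shared work"
cd ..
```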
push - Attempts to send changes from local to remote. You may need to pull first to have the necessary information about the structure of the remote repo. Push requires Git to know which branch it's pushing to. You can specify that each time by executing 'git push origin branch-name', or you can set an "upstream branch" using 'git push -u origin branch-name', which lets Git know that it should always push between this pair of branches. Thereafter you can just do 'git push' and it knows where to go.
Stashing
Sometimes you are asked to "stash" when switching branches. Stashing takes uncommitted changes and puts them somewhere safe for you to re-apply later.
If you are asked to do this, simply do a 'git stash' to put the changes in the stash. Then go to the new branch with 'git checkout new-branch-name'. Finally, 'git stash apply' puts those uncommitted changes onto the new branch.
The stash never goes away. You can always access older stashed items even if new ones have been created.
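A minimal stash round-trip, in a disposable repository (names invented), looks like this:

```shell
set -e
rm -rf stash-demo && mkdir stash-demo && cd stash-demo
git init -q
git config user.email "demo@example.com" && git config user.name "Demo"
echo "committed" > file.txt
git add file.txt && git commit -qm "initial commit"
git branch other-branch

echo "uncommitted work" > file.txt   # an uncommitted change in the work tree
git stash push -q                    # tuck the change away safely
git checkout -q other-branch         # switch branches with a clean work tree
git stash apply -q                   # re-apply the stashed change here
cat file.txt                         # prints "uncommitted work"
cd ..
```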
Why does this happen? For unstaged changes to files, Git thinks about the last commit that involved a change to that file - a "parent commit" (this may be the initial commit). If you do a switch that makes it unclear which parent commit to use for a file, a stash becomes necessary.
Merging (and conflicts)
Merging happens in various guises at various points in the workflow. Generally speaking, it involves making two branches into one so that the information in both is maximally represented. Most of the time the information can be matched up well, but sometimes it will be ambiguous as to which branch's change should be used, e.g. some files may contain a different change to the same line(s) of code on each branch. When this happens Git will present the two options to you by writing them both into the file, using a certain convention to show which branch each is from. This is called a merge conflict. Fixing it consists of simply editing the files and deleting the bits you don't need.
The basic command for merging is:
~$ git merge target-branch-name
This will bring any differences from the target branch into HEAD (current branch). If this is successful (usually) a new commit will be created which has two parent commits, although you can also create a fast-forward merge in some cases.
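To see a conflict from start to finish, here is a throwaway example (branch, file, and message names invented) where two branches edit the same line:

```shell
set -e
rm -rf merge-demo && mkdir merge-demo && cd merge-demo
git init -q
git config user.email "demo@example.com" && git config user.name "Demo"
echo "shared line" > file.txt
git add file.txt && git commit -qm "initial commit"

git checkout -q -b feature
echo "feature version" > file.txt
git commit -qam "change on feature"

git checkout -q -               # back to the starting branch
echo "dev version" > file.txt
git commit -qam "change on dev side"

git merge feature || true       # both sides changed the same line: conflict
cat file.txt                    # shows the <<<<<<< / ======= / >>>>>>> markers

# Resolve by editing the file, then stage and commit the resolution:
echo "merged version" > file.txt
git add file.txt
git commit -qm "merge feature, resolving conflict"
cd ..
```

The final commit has two parent commits - the "merge commit" described above.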
Hopefully these notes and basic principles will help you get to grips with Git!
Ashley George
I am running my machine with nginx and PHP 5.6. First, make sure that you have already installed Xdebug. We can check this with the PHP version command:
$ php -v
If you see ‘Xdebug v’ in the output, that means Xdebug is installed. If you do not see it, install Xdebug using PECL.
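A minimal sketch of the PECL route follows. The ini path and service name below are assumptions for an nginx + PHP-FPM 5.6 setup; find your actual ini file with `php --ini` and adapt accordingly:

```shell
# Install Xdebug via PECL (may require sudo)
pecl install xdebug

# Tell PHP to load the extension - the ini path is an assumption,
# check yours with `php --ini`
echo "zend_extension=xdebug.so" >> /etc/php5/fpm/php.ini

# Restart PHP-FPM so nginx picks up the change (service name varies)
sudo service php5-fpm restart

# Verify: the version banner should now mention Xdebug
php -v | grep -i xdebug
```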
Last summer, the Capital Area Food Bank of Washington, DC came to us with a great idea: The Food Bank wanted to increase its investment in team fundraising initiatives, but they didn’t necessarily want to continue to invest in a "pay as you go" service like StayClassy, TeamRaiser, or Razoo. Rather, they wanted to explore whether or not they could invest in the development of an open source alternative that would eliminate licensing costs, that they could customize more easily, and that they could then give back to the broader nonprofit community.
We’ve spoken a lot in the community regarding various strategies for how nonprofits can contribute to the development of open source tools. Most of the time, we recommend that nonprofits start off by solving their own organizational problems first, and then abstracting components of the resulting solutions at the end of the project.
However, in the case of the Capital Area Food Bank’s team fundraising needs, a competitive analysis of existing solutions provided us with a well-defined roadmap for how to meet the Food Bank’s goals, while simultaneously working on a suite of contributed tools that could be given back to the broader community. So, in coming up with their solution, we started out by developing RedHen Raiser as a stand-alone Drupal Distribution, and then implemented a customized instance of the Distribution to meet the Food Bank’s more specific design needs.
Architectural Planning for the RedHen Raiser Drupal Distribution
As mentioned above, we began our feature design process for this peer-to-peer fundraising platform by doing a competitive analysis of existing tools. The table below shows our recommended feature set ("Our MVP") compared to the functionality available as of June 2014 on eight of the leading platforms:
As a quick aside, it's interesting in comparing these products that there is a lack of consensus in the industry regarding how to refer to these sorts of fundraising tools. We often hear the terms "peer-to-peer fundraising," “team fundraising,” “viral fundraising,” and even “crowd fundraising” used interchangeably. With crowd fundraising being all the rage in the for-profit world of Kickstarter and the like, we feel that that descriptor could be a little misleading, but the other three terms all speak to different highlights and features of these tools.
With a target feature set identified, we began wireframing our proposed solution. Again, peer-to-peer fundraising has become somewhat of a well-trodden space. While we came up with a few small enhancements to the user experiences we’ve seen in the marketplace, we recognized that donors are beginning to anticipate certain design patterns when visiting these fundraising sites:
With the wireframes in place, we turned to technical architecture. Given the need to collect a broad range of constituent data, not just user account information, it was clear to us that this Drupal distribution should be built on top of a CRM platform native to Drupal. For obvious reasons, we chose to go with RedHen CRM.
We also saw many advantages to leveraging RedHen Donation, a flexible tool for building single-page donation forms. With RedHen Donation, a "Donation field" can be attached to an entity, and in so doing, it attaches a form to that entity for processing donations. The RedHen-specific piece to this module is that it is configurable, so that incoming donation data is used to “upsert” (create or update) RedHen contacts. RedHen Donation also integrates with Drupal Commerce to handle order processing and payment handling. Integrating with the Commerce Card on File and Commerce Recurring modules, RedHen Donation can be configured to support recurring gifts as well.
Missing from the underlying set of building blocks needed for building a peer-to-peer fundraising platform with Drupal was a tool for managing and tracking fundraising against campaign targets and goals. In the case of peer-to-peer fundraising, each team fundraising page needs to be connected to a larger campaign. To address this requirement, we released a module called RedHen Campaign. This module, coupled with RedHen Donation, is at the core of RedHen Raiser, allowing us to create a hierarchy between fundraising campaigns, teams, and individual donors.
If you are interested in giving RedHen Raiser a spin, you can install the distribution using your own Drupal Make workflow, you can checkout our own RedHen Raiser development workflow, or you can just click a button to spin up a free instance on Pantheon’s hosting platform. (If you’re not a Drupal developer, this last approach is incredibly quick and easy. The distribution even ships with an example fundraising campaign to help you get started.)
Customizing RedHen Raiser to Meet the Food Bank’s Needs
With a strong base in place, customizing RedHen Raiser to meet the Capital Area Food Bank’s requirements was straightforward and comparably inexpensive. It was largely a matter of adding Drupal Commerce configuration to work with their payment gateway, developing a custom theme that matched the Food Bank’s overall brand, and then training their team to start building out their first peer-to-peer fundraising campaign:
The Success of the Food Bank’s First Peer-to-Peer Fundraising Campaign
The Capital Area Food Bank launched this tool around its May 2015 "Food From The Bar" campaign, targeting law firms in the D.C. Metro area. In just 30 days, the Food Bank raised close to $150,000 on RedHen Raiser.
The Food Bank’s Chief Digital Officer, Chris von Spiegelfield, had this to say about the project:
"The Capital Area Food Bank has been no stranger to peer-to-peer fundraising tools. For years, it has relied on third-party sites such as Causes.com, Razoo, Crowdrise, among others. However, these tools often came with considerable branding and messaging limitations as well as pretty stiff transactional fees. Users got confused about where their money was going and complained after learning a considerable portion of their donation didn’t make its way to the Food Bank. We wanted to provide greater unity of purpose beyond a donation form without all the hassle, which is how we decided to invest in our own crowdfunding platform.
After kicking the tires at a few SaaS options, we decided the best way forward was to build a customized website. Out of all the different frameworks proposed, the open-source Drupal and RedHen Raiser combo impressed us the most. We wouldn’t just be buying a website. We would be leveraging a vast network of programmers and community-minded architects who could start us off in a very good place, not to mention help our platform be secure and grow for the foreseeable future.
We launched the website this year and couldn’t be happier with what ThinkShout built for us. We’re already seeing it pay dividends across several campaigns. We continue to add new features and hope our site might be a benchmark that other nonprofits could benefit from and contribute to as well."
What’s Next for RedHen Raiser?
The obvious answer to this question is a port to Drupal 8. But that will take a little time, as many complex pieces, such as Drupal Commerce and RedHen CRM, will need to be ported before we can migrate over these higher-level fundraising features. But a RedHen CRM port is on our short(er) term horizon. And frankly, the idea of being able to use RedHen as a "headless CRM" is incredibly exciting.
In the meantime, we are looking forward to collaborating with the community to make RedHen Raiser an even stronger open source competitor to the pay-as-you-go alternatives that are currently out there. So, please, give RedHen Raiser a test drive and send us your feedback!
Brush up on your DevOps knowledge with Zivtech’s Drupal DevOps training session. Whether you’re new to DevOps, or need a refresher course, trainers Jody Hamilton and Howard Tyson will show you valuable techniques for improving your workflow, specifically tailored to Drupal development.
Wouldn’t it be cool if you could use Drupal Rules to set, change or modify certain site configurations and variables? Yes, it would be awesome and now it is possible with "Rules Set Site Variables” module.
"Rules Set Site Variables” allows you to use the power of Drupal Rules to change site configuration and site variables on your Drupal site. Create a rule or rules action and then add the action "Set Drupal Site variable". You can configure which site variable you want to modify and the text you want to set it to.
Not the sexiest module name, but it gets the job done. Here’s how to use it, plus some possible example use cases.
How to Use?
- Download and Enable "Rules Set Site Variables” (as well as Rules and any other modules you need)
- Create a Rule using some kind of Reaction Event (and add any additional conditions)
- Alternatively create an Action Component
- Then create an Action called "Set Drupal Site variable” under “System”
- Select the site variable you want to set.
- Add the text (or use a Replacement Pattern) to set that variable.
This module is for lazy site builders who want to use the power of Rules to modify site configuration in various ways. In my particular use case, I’m using Rules Scheduler to set rules to change certain site properties at different intervals. You can use this module to change site configuration according to certain events, conditions or whatever you can imagine.
Obviously changing your site configuration in this way can do some bad things to your site, so be smart and use with caution. This module is not responsible for any mistakes you make in using it.
How might you use it?
The possibilities of this module are not quite endless. (It literally sets site variables). But some cool examples would be:
- Enable certain site settings when a certain user logs in.
- Periodically change a site configuration according to a schedule.
- According to the date, set something funky on the site like a configuration color or whatever.
- Set the site title according to certain reaction events.
- Change out configuration settings from DEV to PROD according to certain conditions.
Learn about this module on its Drupal.org project page at https://www.drupal.org/project/rules_set_site_variables. I’d appreciate any comments, feedback and bug reports.
This module ain’t going to cure cancer or pay your bills, but it definitely solved an itch I had, and I hope it solves one of yours one day.
Tags: drupal, Rules, Drupal Planet, Planet Drupal, drupal module
Mark Koester @markwkoester
Mark has worked on Drupal since the early days of D6. He is passionate about open source as well as entrepreneurship. When he isn't building, he enjoys traveling and speaking one of his many foreign languages. Chengdu, China
Part 2 of 2 - In a recent conversation with Tom Erickson, Acquia's CEO, we got to talking about Acquia's "Drupal 8 All-In", what it is and what it means for the Drupal world and those we have yet to convince ... Acquia has drawn a line in the sand, saying, this thing is so great, we are confident we can deliver the kind of help and experience that we've been guaranteeing with Drupal over the last 7 or 8 years. This thing is ready enough. And all of the rough edges that we find along the way; this is the perfect opportunity to sand them off. This thing, Drupal 8, is ready for real business. In short: Acquia is running customer sites on Drupal 8 already, supporting all of Drupal 8, helping get the last rough edges off it for general release, and would love to talk with you about your next project and whether Drupal 8 is a good fit!
Panels comes with a great feature where you can control the visibility of individual panel panes. Visibility rules are useful when you need to show or hide a pane based off some criteria. You can add a rule by clicking on the cogwheel on the pane and then click on "Add new rule" within the Visibility rules section.
The default options are fine for simple configuration. But sometimes you’ll need to write a bit of code to implement complex requirements. To handle this functionality Panels utilises the Ctools access plugin. So if you need to build custom visibility rules then just write your own access plugin.
Today I’ll show you how to create a basic access plugin for those times when the default options won’t cut it.
Last night I attended the first meetup of the Brighton outpost of the Homebrew Website Club. I had planned on staying in and blogging about my recent Hungarian adventures at Drupalaton, but I'm very glad I popped my trusty Respect Your Freedoms-certified Libreboot X200 laptop into my lovely new Drupalaton tote bag, made the short but always eventful trip across the Laine/Lanes border from purkisshq to 68 Middle Street, and made progress on upgrading my website to Drupal 8. Here's why:
- focused time meant I actually did what always seems to get put back in the queue, and owning my own data and code is important to me in the long term
- time-boxing an hour meant I had no choice other than to prioritise, for me that was fixing an issue I had with getting DrupalVM running.
- collaboration - apart from attending the business day at our DrupalCamp Brighton event, I haven't been to 68 Middle Street much since I wrote a blog post about Atomised Web Design where I kinda rip into one of the space's founders & organiser of the Brighton Homebrew Website Club, Jeremy Keith (@adactio). Four-or-so years on, and thanks in big part to the stern work of MortenDK, Drupal 8 is a whole load better, so I don't feel so bad for my not-so-humbleness of old; plus I couldn't miss an event on a topic I'm so passionate about - Freedom!
Not realising the openness of the space I may have been my usual loud self when chatting to a fellow fashionably late attendee as we walked around the corner of a bar to the main venue space where there were lots of people sitting round tables with one up showing what they'd been working on. There was a wide range of people and interests, with most discussions being around what software and services people were using to control their internet communications.
It came to my turn - I didn't want to even attempt to connect my laptop up to the projector as I didn't really have that much to show so just waved my latest blog post about What is Drupal? around on the laptop and explained that ever since I attended Léonie Watson's DrupalCamp Bristol Keynote: The metamorphosis of accessibility I have been deeply affected by how far behind we still are in terms of providing any sort of acceptable web experience for everyone, and as I started blogging again I wanted to make sure my creative efforts would be accessible and enjoyable. I wrote a description of the photo in my blog post but noticed the correct tags weren't there for accessibility - WAI-ARIA, ARIA Live Announcements API and TabManager are built-in to Drupal 8.
Other people were using a wide range of tools and languages to build their sites, it will be interesting to see how they achieve some of the things I'm going to be doing with developing purkiss.com over the next few months to explain more about what I do, in terms of delivering Drupal projects by working with the community as well as other parts of how I approach my life experience.
Hack
Once updates were all done it was time to get down to business. Fellow Drupaler & DrupalCamp Bristol organiser Oliver Davies had mentioned the other day that I just needed to change nfs to rsync in the settings.yml file, and that worked. I should've seen that but wasn't thinking - NFS is Windows, I'm using Linux ;) So by changing just one setting I had a whole virtual machine with a fresh, working install of Drupal 8 up and running on my 7-year-old but freedom-respecting laptop - so much for "Drupal 8 is complicated, enterprise-only"!
Next was reading up on a production-ready version of the virtual machine which I could host on DigitalOcean. Once I learn more about the sysadmin side of things and feel more confident, I'd prefer to host it from home, as there don't seem to be any suitable distributed options I know about / could use which fully respect my (or your) freedoms - surely there's business to be made there?! I'm also looking at a more home-grown solution from the UK - VLAD, but for the moment I'm going with what I know is working for me, and once I'm a little more confident and can inspect a working system I'll see where I can go from there.
I'd just started to get into the production-ready issue when the proverbial bell rang and we were out of time! I made my way up to the Brighton Farm weekly new media freelancers meetup where, amongst many other interesting conversations, I discovered the Cubieboard, which according to the FSF is all good for Freedom apart from the WiFi controller, but as I don't plan on hosting wirelessly this sounds like a good option to investigate for self-hosting.
I'm looking forward to the next Homebrew Website Club meeting in a couple of weeks - I certainly managed to achieve movement, and am looking forward to getting off the Drupal 7 island into the wide open waters of Drupal 8!
tags: Homebrew Website Club, Drupal 8, Drupal Planet, Planet Drupal, brighton
Bootstrap is winning the web.
Nearly 10% of all websites now use the Bootstrap framework.
That's reflected on Drupal.org, where Bootstrap is the third most popular theme. Bootstrap is a base theme that integrates Bootstrap 3 with Drupal.
Here's a guide to getting started with the Bootstrap theme.
Back in August 2013, we wrote a post called, "The Bootstrap Boom is Just Getting Started".
At that time, we estimated that Bootstrap powered between 1.5% and 3% of the web.
Two years later, I decided to check in on Bootstrap. How popular is the framework now? Did the Bootstrap boom continue, or has it bust?
Look for links to our Strategic Roadmap highlighting how this work falls into our priorities set by the Drupal Association Board and Drupal.org Working Groups.
July was an action packed month at the Drupal Association - we had our quarterly prioritization with the Working Groups, our annual all-hands summer staff meeting, and mentored two tremendously dedicated interns throughout.
Our primary engineering focus was on DrupalCI and Localize.drupal.org - though we also found time to make some iterative changes to Drupal.org in a few areas, namely: issue credit refinements, performance, and groundwork for the new content model.
Strategic Planning Prioritization for Q3 2015
The Drupal.org Working Groups help to provide governance for Drupal.org and to set priorities for the work of Association staff. Each quarter we evaluate our priorities with the Working Groups and update our Roadmap.
On July 15th we updated our roadmap based on the Working Groups input. Our main priorities for Q3 are Drupal.org services that are required to support the Drupal 8 release, and functional improvements to Drupal.org:
- The port of localize.drupal.org to Drupal 7, as well as a few issues that support Drupal 8 localization.
- Making sure DrupalCI meets the MVP spec set out by the core developers for providing test coverage for Drupal 8, and that it meets the functionality required to replace the old testbot system.
- Improving Drupal.org search.
- A new documentation section based on our content strategy work that will provide better organization and governance of documentation.
Additional priorities were identified for Association Staff to tackle as time permits.
Drupal Association Summer Staff Week
July was also the time for the annual all-hands staff meeting for Drupal.org. For one week, we gathered all our local and remote staff in our Portland office to discuss:
- The mission, vision, and values of the Association.
- Our ever-evolving relationship with the Drupal project itself.
- Setting engineering and design principles for the team.
- Finding sustainable revenue that will fund our work.
For 8 weeks beginning in mid-June the Association staff hosted and mentored two interns who had just completed Epicodus’ inaugural Drupal curriculum.
Bojana Skarich (BabaYaga64) and Daniel Toader (CountPacMan) worked with us on bug fixes, features, and theme work for the Conference Organizing Distribution and several related modules that allow the Association to run our DrupalCon websites.
We’d also like to thank our Supporting Partner ThinkShout for funding Daniel and Bojana’s work with us. This is just one small example of how the Supporting Partner program fosters our mission: promoting Drupal as part of a software development training curriculum and giving these new members of our community a great head start.
Drupal.org Issue Credit Updates
We deployed two small updates to continue to refine and iterate on the Issue credit system that we implemented in the beginning of this year.
Firstly, to support an earlier change allowing explicit attribution as a volunteer, we’ve updated the attribution :hover state display. Previously, unattributed comments and volunteer-attributed comments would both simply display the username in the attribution, even though the distinction was being made in the comments themselves. Now that distinction exists not just in the data but also in the display on comments.
Since releasing the issue credit system in March, there have been over 9,500 issue credits awarded on over 5,200 issues. Over 2,400 unique users and 250 unique organizations have been awarded issue credits. Over 1,000 projects (modules, themes, distributions) have credits that have been awarded. The last 90 days of issue credits can be viewed on each user and organization profile.
Secondly, we deployed a small change that will automatically generate the first comment on a newly created issue.
This automatically generated initial comment serves two purposes: It allows the original author of an issue to be credited when the issue is resolved, even if they did not leave any subsequent comments on an issue. It provides a link to the original issue summary providing a better at-a-glance view of what the original reporter wrote, even if the summary has since been edited a large number of times by other participants in the issue.
There are still additional refinements to be made as we find time - in particular, providing a UI to edit the attribution for the automatically generated first comment.
Entityreference_prepopulate Module
The new content model for Drupal.org requires a number of new modules on Drupal.org. To ensure that the site remains performant we are serializing these changes as much as we can. The first new module to be deployed on Drupal.org was entityreference_prepopulate.
As we work to build out the Documentation section we’ll be installing additional modules, creating some new content types, and providing a number of new resources for maintaining documentation on the site.
Advanced Aggregation
Improving the performance of Drupal.org is an ongoing concern, particularly as we look to add new modules that, while powerful, may also be somewhat heavy on a site of our scale. We began gradually rolling out advanced CSS/JS aggregation towards the end of June, and in July we completed the majority of the changes laid out in this issue.
With these changes we’ve largely completed the work planned here for the foreseeable future, though there may be a few more performance gains to be found here and there. Thanks again to mikeytown2 for his assistance.
Drupal 8 Blockers
DrupalCI
July was a huge month for DrupalCI. There are two major milestones for the Association’s work on DrupalCI:
- DrupalCI must meet the testing requirements for Drupal 8 Core and Contrib specified by core developers.
- DrupalCI must also meet or exceed the existing functionality of the PIFT/PIFR testbots for testing Drupal 7 and Drupal 6 so that the old testbot system can be retired.
The first milestone was our primary goal in July - while the second will be our hard focus in August.
We made tremendous strides towards the first goal, starting with a reformat of the test result output to display better in Jenkins. This new format organizes the test output more logically by: Test Group -> Test Class -> Test Method -> Output/Result.
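The grouping described above can be sketched as a simple nesting step. Here is a minimal illustration in Python; the flat record layout and the sample test names are invented for the example, not DrupalCI's actual data format:

```python
from collections import defaultdict

# Hypothetical flat test records: (group, class, method, result).
# Real DrupalCI output differs; this only illustrates the nesting.
records = [
    ("Database", "SelectTest", "testSimpleSelect", "pass"),
    ("Database", "SelectTest", "testSelectWithJoin", "fail"),
    ("Entity", "EntityQueryTest", "testBasicQuery", "pass"),
]

def group_results(records):
    """Nest flat records as Test Group -> Test Class -> Test Method -> Result."""
    tree = defaultdict(lambda: defaultdict(dict))
    for group, cls, method, result in records:
        tree[group][cls][method] = result
    return tree

tree = group_results(records)
```

Grouping this way means a display layer can render collapsible sections per group and class instead of one long flat log.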
This should make understanding the results of testing easier in the long run, and is also a precursor to displaying test result information directly on Drupal.org - which we hope to complete in August.
We also made improvements to the test history pages - so that project maintainers can make better comparisons of any given test result to the status of a branch when an issue was created, for example, or against the most recent branch test. These test history pages also allow maintainers to see which user triggered the test, and are the portal to the test results.
July also saw the deployment of patch level testing with DrupalCI - which can be enabled on a per environment basis for projects on Drupal.org.
Towards the end of July we also enabled testing for Contrib projects - allowing any project maintainer on Drupal.org to begin using DrupalCI. We are asking project maintainers to enable DrupalCI for their projects and provide us with their feedback in this issue. This will be critical for us to retire the old testing infrastructure.
We also focused on improving the performance and efficiency of the tests. Minimizing the time it takes to initiate a test and complete a full test run both improves efficiency for developers and maximizes the reach of the Association budget for automated testing.
The new DrupalCI architecture automatically scales up and down the number of bots dependent on need, which will hopefully present a cost savings once we disable the redundant old testing infrastructure.
In addition to the architectural work above, we also upgraded our base environments to PHP 5.5 to support the change in Drupal 8's minimum requirements.
Finally, we improved the documentation for project maintainers for enabling DrupalCI testing on their projects.
Endless thanks to jthorson for his help.
Localize.drupal.org
After our community testing of the localize.drupal.org Drupal 7 port in June, we identified a critical path of remaining issues that needed to be resolved before we could complete the upgrade. Many of the issues were related to user roles and permissions, due to the differences between Drupal 6 Organic Groups and the Drupal 7 version.
We put a hard focus on resolving as many of these issues in July as we could, so that we would be ready to perform the final upgrade in August (which was completed successfully on August 12th).
We also added the ability for event attendees to purchase or renew their Drupal Association memberships while purchasing their tickets for DrupalCon.
Finally, we are in the planning phase for some additional work to support our payment processing needs in India, and to support having the registration process live for multiple events simultaneously.
Sustaining Support and Maintenance
New Git Infrastructure Deployment
As mentioned in our June update, in July we put the bow on a long-standing project to migrate our Git infrastructure to new servers. Much of the work to provision the new servers was completed in June, but the cutover itself was scheduled for the early weeks of July.
The new Git infrastructure is now both redundant and highly available, greatly increasing the stability of a critical part of our infrastructure.
Serving Files from a Separate Domain
In July we also acquired a new domain and a wildcard certificate for *.drupalcontent.org. This new domain will be used to serve static files across the Drupal ecosystem of websites, providing security benefits and reducing the size of HTTP requests by serving these resources from a cookieless domain.
Work to serve files from the new domain is ongoing, and many thanks to mlhess for helping us implement this change.
Load balancer stability
After continuing to debug the decreasing stability of our load balancers, we decided to swap hardware and rebuild the load balancers on different hardware. The new hardware also uses an updated configuration and operating system, which has proven more stable. The second load balancer is in the process of being built out using the new configuration and hardware. The project should be completed by the end of August, bringing stability back to one of our key infrastructure components.
Many thanks to nnewton for helping us diagnose and make this change.
Updates to Updates
One of the core services that Drupal.org provides is updates.drupal.org. In essence this is a feature of Drupal itself. Because Drupal.org is the home of updates information for the entire project, we analyze our updates traffic as part of our project usage statistics.
Unfortunately the project usage stats have been somewhat unreliable - so in July (and continuing into August) we’ve given this system some attention.
Changes we’ve made or are in the process of making include:
- We moved the updates system to a CDN (and then migrated to a different CDN provider).
- We updated our processing to work with centralized logs on our loghost.
- We are improving the performance of the processing, from a 3-4 hour run for a single day's worth of data to a roughly 1 hour run for a full month of data.
- We are simplifying the process by removing an intermediate MongoDB deduplication/key-value store.
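Removing the intermediate MongoDB store amounts to deduplicating each reporting site's update pings directly while streaming the centralized logs. A rough sketch of that idea in Python; the log layout and the `site_key` field are assumptions for illustration, not the actual Drupal.org log format:

```python
# Minimal sketch: count each (site, project) pair once per processing
# window using an in-memory set, with no external key-value store.
# The CSV layout below is invented for the example.
import csv
import io

LOG = """site_key,project,version
abc123,drupal,7.38
abc123,drupal,7.38
def456,views,7.x-3.11
"""

def unique_pings(log_text):
    """Return one record per (site, project) pair, dropping repeat pings."""
    seen = set()
    unique = []
    for row in csv.DictReader(io.StringIO(log_text)):
        key = (row["site_key"], row["project"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = unique_pings(LOG)
```

For a single processing window this keeps the whole pipeline in one pass over the logs, which is one way the per-month run time described above could come down.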
Work to improve the performance and stability of the updates stats system will be ongoing.
As always, we’d like to say thanks to all the volunteers who are working with us and to the Drupal Association Supporters, who made it possible for us to work on these projects.
A few months ago I wrote about why good markup matters. As a front-end developer, I interact with Drupal’s markup on a daily basis and I experience first-hand the benefits of good markup and the frustration of poor markup. A lot of the content on a Drupal website is produced by views. Whether they are out of the box or custom views, they are powerful and present us with great opportunities to manipulate the content and markup.
The sluggishness of Ruby Sass really started showing on very large projects, and became a nuisance when coupled with LiveReload. Let's face it: when you make a single CSS change, waiting even a couple of seconds can feel like an eternity.
The first step to speeding things up was to get rid of some mixins that we no longer needed. The low-hanging fruit was Compass. Compass is an amazing framework that we had come to rely on, but now that browsers have largely caught up with each other, there hasn't been much need for Compass's vendor-prefixing mixins. Instead, when we do need a prefix we can use Autoprefixer or write our own mixin. Easy enough!
After getting Compass and some other Ruby gems out of the way we saw some improvement, but not enough to make us happy! So the search continued...
We decided as a group to move over to Gulp and try to get rid of Ruby altogether.
Why Gulp? Gulp is all Node-based, so we were able to use gulp-sass for compiling our CSS. That Gulp plugin is just a thin wrapper around libsass, a C/C++ implementation that is wicked fast, even for large projects.
Some quick Google searches led me to benchmarks performed by The Opinionated Programmer comparing several CSS preprocessors. Long story short, libsass is about 25 times faster than Ruby Sass on a first run, and even after Ruby has a sass-cache available, libsass is still about 12 times faster. That's a massive improvement!
TCa.aS, short for Technical Cofounder as a Service, is a new type of software service relationship for product-focused startups and existing businesses looking to start a new venture or product line.