Elsewhere

Vardot: Project Manager’s Guide to Breaking Down a Drupal Site for Incremental Delivery

Planet Drupal - jeu, 21/04/2016 - 09:39
How to, Resources Read time: 8 minutes

TL;DR. Jump to the free template: Standard Drupal Work Breakdown Structure Template.

Building a new site on a content management system has always been a tricky project for a project manager to manage, compared to building a site on a framework or from scratch. That is because you are dealing with building blocks provided as standard by the CMS. A project manager needs the necessary knowledge of the CMS’s building blocks to be able to manage a successful project.

Put this in the context of today’s Scrum management approach (an agile way to manage a project, usually software development), and you’ll end up with a puzzled project manager asking questions such as:

  1. What can my team deliver in the first sprint?

  2. How can I break down the project’s deliveries into sprints?

  3. What expectations of deliverables should I set with my project’s stakeholders (product owner, business owner, client)?

  4. When do I deliver the homepage for my stakeholder to look at?

  5. Are we supposed to deliver on a page-by-page basis?

 

Drupal disrupts the “page” methodology that we are used to thinking in. One naturally tends to think of a website as a folder, with sub-folders, and pages (.html) inside those folders. That’s the 90s. We’re in 2016. Drupal is a database-driven CMS that takes a content-first (or content-out) approach to building rich web experiences, instead of a page-first approach. See “Drupal is content first, and that's good” and “A Richer Canvas”.

   

Because of how Drupal works, we at Vardot have come up with a framework for planning the phases of building a Drupal site, leading to incremental development that can be broken down and fit into Scrum sprints. This will apply to almost all Drupal projects.

We’ll call this standard approach: The Standard Drupal Work Breakdown Structure

 

Why Do I Need a Breakdown of Work for Planning My Drupal Site?

Because this is what project managers do. In the many Drupal projects I have been part of, I have seen that project managers (and/or coordinators) must understand how Drupal works, how the development process goes, and how we get 80% of the site done in 20% of the time.

A work breakdown structure will help you (as a project manager) understand how a Drupal site is built. It will also ease the process of getting high-quality incremental deliveries to fit into your sprints. In this post, I will walk you through the high-level breakdown for any Drupal site.

 

Most importantly, the goals and outcomes of a breakdown are for you to understand and communicate to your project’s stakeholders your timeline of deliveries, and to be able to fit these deliveries into sprints.

To summarize, these goals are:

  1. Break down deliverables and define the needed outcomes of the initial sprints

  2. Provide a holistic view and analysis of the site’s functionality and its building blocks

  3. Remember, we are building a CMS, not a website. Therefore you need to architect your “CMS solution”, and not your “website solution”

 

Let’s Start With The How

Now we realize these goals by implementing The Standard Drupal Work Breakdown Structure, which will fit almost all of the Drupal projects you will work on.

The work is divided into three phases:

  1. Initialization Work Breakdown Structure: This phase is the cornerstone for starting right; it is a standard set of steps that you should follow in every project.

  2. Project’s Epics Work Breakdown Structure: Careful analysis of the site’s components and how they will be developed in the CMS happens here.

  3. Finalization Work Breakdown Structure: This is the ending phase, where you make sure your site is ready for launch. Final preparations, tuning, and tweaks are carried out in this phase to prepare for your big day.

Note that you will be able to deliver something for your stakeholders to look at, in the “Initialization” phase.

This breakdown must happen after high-fidelity wireframes are done, or after you have full visual mockups of your key pages.

It’s important to note that the visual mockups should use and adhere to Drupal’s design language and patterns. But what is Drupal’s design language and patterns? That’s for another article to discuss.

Now that designs have been handed to us with a clear picture of how the new website will look, we are ready to break down our Drupal site for a successful delivery.

 

The Work Breakdown

Disclaimer: the terminology that I’m using below to name some components that make up your site is not “official Drupal language”. No worries if you stick with the same terminology or use your own names; what really matters is the breakdown structure.

So I’m categorizing what makes a (Drupal) site into six components:

  1. Wrapping components: Header and Footer.
    These are the components that provide your site with a wrapper for all your next components. Start with these as soon as you install Drupal; it will help you get through the easy stuff that makes up your site.
     
  2. Global components: Page title, Breadcrumbs, Tabs (a.k.a. menu local tasks), System messages, etc.
    These are the components that make up the uniformity of a CMS. These are your next target.
     
  3. Site-unified components: Ad blocks, Newsletter subscribe block, Social media feeds or “follow us” blocks, Static “about us” block, etc.
    These are the components that most likely appear in the same style across multiple pages in the site.
     
  4. Full nodes and entities: Your “Full content” node/user/entity pages.
    Going back to the “content-out” approach, always start by completing the full node or entity displays.
     
  5. Views, view modes, and other content: Views of recent content, Featured content, Node pages, Feeds integration, CRM integration, Single Sign On integration, etc.
    This is the major work; components that define your site.
     
  6. The annoying 20% of the site: This is where the 80% of your site that is already built gets the final hidden work: iterative tweaking and enhancements, whether requested by your QA team, the client, or the product owner.

In light of this breakdown of CMS categories, here’s an animated illustration of how a site comes together when following the development flow based on the components above:

In this order, you can now think of a Drupal site as being developed according to the following steps:

Initialization Work Breakdown Structure

  1. Delivering “1. Wrapping components”

    1. Install Drupal (or the distribution you want to use), set up the development environment, etc.

    2. Populate the things that make up the “Wrapping components”: menus, logo, search, etc.

    3. Create your theme, and theme the “Wrapping components”

  2. Delivering “2. Global components”

    1. Just populate them and theme them.

  3. Delivering “3. Site-unified components”

    1. Create and populate the things that make up your “site-unified components”

    2. Theme them

Project’s Epics Work Breakdown Structure

  1. Outline your content types, starting from the “Full node” view mode. Identify other view modes for your content types. Start turning those into “Tasks”.

  2. Do the same for other Drupal entities: files, comments, etc.

  3. Deliver “4. Full nodes and entities”

  4. Deliver “5. Views, view modes, and other content”

  5. Deliver “6. The annoying 20% of the site”

Finalization Work Breakdown Structure

  1. Final overall testing

  2. SEO, Printability, Performance, Security, and Accessibility tuning and configuration

  3. Your pre-launch checklists

  4. Go live!

 

FREEBIE: The Standard Drupal Work Breakdown Structure Template

Our Standard Drupal Work Breakdown Structure Template provides an outline of these phases and detailed tasks to be done that we use for every Drupal project. This template is made to be easily imported to JIRA. It contains:

  • a master sheet that aggregates the standard epics, tasks and stories to be easily imported to JIRA.

  • a sheet for defining the project’s own epics and stories

  • the standard Initialization and Finalization work breakdown structure that must not be missed for any project

All of this helps reduce discrepancies between projects, ensures that important tasks are not missed, and allows our team to deliver a project fast and incrementally (delivering in the first week of development).

Using The Template

The template is a Google Spreadsheet that you can easily clone and customize. To do so:

  1. Open the sheet and copy it to make it yours.

  2. Feel free to edit the sheet to make it your own. It includes instructions on how to use it.

  3. Follow the instructions on what to edit. We recommend that the “Initialization WBS” and the “Finalization WBS” stay intact (you can edit them once to your standard flow, then replicate for all projects).

  4. For each project, you will want to copy your template to customize the “Project’s Epics WBS” as per the project. The template has some samples for you to consider.

  5. Once done, export the “Master WBS” sheet to CSV so you can import it into your JIRA project.

  6. Map fields to your JIRA. See sample [image to illustrate mapping]

  7. That’s it!

 

Conclusion

Two things have helped us standardize our work process when developing a Drupal site, and ensure consistency and quality:

  1. Starting a project with a components-first approach, not a page-first approach.

  2. Documenting our recurring tasks and processes in a Template that uses this approach. This template makes applying this process easier for you.

Next time you start a Drupal project, consider this approach and let us know how this would help you in the comments section below.

Note: This does not depend on a specific Drupal version; the methodology works with Drupal 6, 7, or 8. It depends on Drupal’s conceptual building approach.

Tags:  Drupal Planet drupal 8 Project Management Drupal Templates
Catégories: Elsewhere

Drupal Console: Drupal Console alpha1 is now available

Planet Drupal - jeu, 21/04/2016 - 09:24
We are so excited to announce the first alpha release of Drupal Console. After almost three years of working on this project, 84 releases, almost 86,000 downloads, and the awesome help of 169 contributors, we have released the 1.0.0-alpha1 version.

What is so great about this version: This release provides support for the latest Drupal 8.1.x version, released on April 20th. For more information and details about this Drupal release you can visit "Drupal 8.1.0 is now available". The release also includes minor fixes and improvements and only one new feature: support for placeholders in chain files. I will elaborate on this in another blog post, but if you are interested in knowing more, please visit issue 2055.

What is not so great about it: Drupal 8.0.x is no longer supported. We are still trying to confirm whether the Embedded Composer project can help us with this issue. If this is not doable, we can open a discussion to find a better way to approach it.
Catégories: Elsewhere

qed42.com: Pune Drupal Meetup - March 2016

Planet Drupal - jeu, 21/04/2016 - 08:44
Pune Drupal Meetup - March 2016

The monthly meet-up for March was moved from the last Friday of the month, which was Good Friday, to the 1st of April, and we hoped really hard that people didn't think it was an April Fools' prank. This PDG meetup was hosted by Rotary International, thanks to the diligence of Dipak Yadav, who works there. It is always fun when the meetup is hosted in different locations because we get to explore different parts of Pune and see new faces.

With 25 members in attendance, the meetup was kicked off by Dipak giving us an informative talk about Rotary International and the work they do.

 

The speaker for the evening was Sushil Hanwate of Axelerant, and he spoke on "Services and dependency injection in Drupal 8."

 

After the session ended with a short Q&A, we broke off into smaller groups for BoF sessions. Saket headed the BoF on service workers, and the second group discussed Drupal 8 module development.

Once we were done with technical talks, we were served one of the best Kachoris we have tasted :). While we happily munched on the snacks, we decided on the preliminary team members for the upcoming Pune Drupal Camp.

Though the meetups are being held regularly, we still need to figure out a way of involving newer members in the community, and one way to do that is to get more people volunteering to host the meetups. Kudos to Rotary for hosting us; if you are a Pune-based company or group that would like to host the next meetup, please get in touch via the comments.

Our next PDG meetup is scheduled for the 29th of April. Along with a session on "Experience with Drupal" by Rahul Savaria and Prashant Kumar from QED42, we shall also be planning and discussing the upcoming Pune Drupal Camp further.
Don't forget to RSVP. See you soon!

aurelia.bhoy Thu, 04/21/2016 - 12:14
Catégories: Elsewhere

KnackForge: Programmatically create and trigger feeds importer

Planet Drupal - jeu, 21/04/2016 - 07:21
Programmatically create and trigger feeds importer

We met a challenging requirement where we needed to create a Feeds importer on node creation for a particular content type. In other words, for 'n' nodes there must be 'n' Feeds importers.

For that, I created a Feeds importer that is treated as the template. Whenever a node of the specific content type is created, the template is cloned and a new importer is created.

The following code needs to reside in hook_node_insert() and will be used to clone the feeds importer:

Thu, 04/21/2016 - 10:51
Catégories: Elsewhere

KnackForge: Apache Solr View Count module

Planet Drupal - jeu, 21/04/2016 - 06:46
Apache Solr View Count module

While working with Apache Solr in Drupal, I had to sort the results based on relevancy, date, and popularity. The Apache Solr Sort module allowed me to sort based on relevancy and date, but sorting based on popularity wasn't available. By sorting based on popularity, I mean sorting based on the view count of each result.

suresh Thu, 04/21/2016 - 10:16
Catégories: Elsewhere

FFW Agency: Drupal Console: An Overview of the New Drupal CLI

Planet Drupal - jeu, 21/04/2016 - 02:01
Drupal Console: An Overview of the New Drupal CLI jmolivas Thu, 04/21/2016 - 00:01

Drupal Console is the new CLI (Command Line Interface) for Drupal. This tool can help you to generate boilerplate code, as well as interact with, and debug Drupal 8. From the ground up, it is built to utilize the same modern PHP practices that have been adopted in Drupal 8.

Drupal Console takes advantage of the Symfony Console and other well-known third-party components like Twig, Guzzle, and Dependency Injection, among others. By embracing those standard components, we’re fully participating in the PHP community, building bridges, encouraging the PHP community to join the Drupal project, and reducing the isolation of Drupal.

Why is Drupal Console important?

Drupal is infamous for having a steep learning curve, complete with its own language of “Drupalisms”. While Drupal 8 simplifies and standardizes the development process, it is more technically advanced and complex than its predecessor. 

Managing the increasing complexity of Drupal 8 could be a daunting task for anyone. Drupal Console has been designed to help you manage that complexity, facilitating Drupal 8 adoption while making development and interaction more efficient and enjoyable. Drupal Console was created with one goal in mind: to allow individuals and teams to develop smarter and faster on Drupal 8. 

Drupal Console features

In this blog post, I will mention some of the most common features and commands of Drupal Console, to serve as a good introduction.

Install Drupal Console

Copy configuration files

The init command copies application configuration files to the user’s home directory. Modifying these configuration files is how the behavior of the application can be changed. 

  Validate system requirements

The check command will verify the system requirements and throw error messages if any required extension is missing.
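
For illustration, a minimal sketch of running both commands from a terminal (no arguments are required in the simplest case):

drupal init
drupal check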

Install Drupal 8

The easiest way to try Drupal 8 on your local machine is by executing the chain command and passing the option --file=~/.console/chain/quick-start.yml.

The chain command helps you to automate command execution, allowing you to define an external YAML file containing the names, options, and arguments of several commands, and to execute that list in the sequence defined in the file.

In this example, the chain command will download and install Drupal using SQLite, and finally start PHP's built-in server. Then you only need to open your browser and point it to 127.0.0.1:8088.
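
A minimal sketch of that invocation, using the option exactly as described above:

drupal chain --file=~/.console/chain/quick-start.yml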

  Generate a module

The generate:module command helps you to:

  • Generate a new module, including a new directory named hello_world in the modules/custom directory.
  • Create a hello_world.info.yml file in the modules/custom/hello_world directory (a sample invocation is shown below).
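
A sample invocation — run without options, the command prompts interactively for the module name, machine name, and the other values it needs:

drupal generate:module
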
  Generate a service

The generate:service command helps you to: 

  • Generate a new service class and register it in the hello_world.services.yml file (see the sample invocation below).
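
As with the module generator, a bare invocation runs interactively and prompts for the module, service name, and class:

drupal generate:service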

 

Generate a Controller

The generate:controller command helps you to:

  • Generate a new HelloController class with a hello method in the src/Controller directory.
  • Generate a route with the path /hello/{name} in the hello_world.routing.yml file (a sample invocation follows).
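
A sample invocation; run interactively, it asks for the module, the controller class name, and the route details listed above:

drupal generate:controller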

 

Generate a Configuration Form

The generate:form:config command helps you to:

  • Generate a SettingsForm.php class in the src/Form directory.
  • Generate a route with the path /admin/config/hello_world/settings in hello_world.routing.yml.
  • Register the hello_world.settings_form route in the hello_world.links.menu.yml file, using system.admin_config_system as the parent.

This command allows you to add a form structure that includes form fields based on the field API. It also generates buildForm and submitForm methods with the required code to store and retrieve form values from the configuration system.

NOTE: The parent route system.admin_config_system for the menu_link can be selected from the command interaction.
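
A sample invocation; the module, form class name, and route path described above are what the command prompts for when run interactively:

drupal generate:form:config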

  Debug Services

The container:debug command displays the currently registered services for an application. Drupal contains several services registered out of the box, plus the services added by custom and contributed modules; for that reason I will use peco, a simple interactive filtering tool, to make this debugging easier.
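
A sketch of the combination described here — piping the output through peco is an assumption about how the filtering is wired up, not something Drupal Console does by itself:

drupal container:debug | peco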

 

Debug Routes

The router:debug command displays the currently registered routes for an application, similar to debugging services. In this example, I will again use peco to make the task easier.
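
The equivalent sketch for routes, again assuming peco is installed and used as the filter:

drupal router:debug | peco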

 

Create Data

The create:nodes command creates dummy nodes for your application.

 

Drupal Console also provides a YAML file to execute with the chain command. This file contains instructions to run the create:users, create:vocabularies, create:terms, and create:nodes commands in one go.
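
A sketch of both approaches; the chain file name below is hypothetical, since the post does not name the YAML file it refers to:

drupal create:nodes
drupal chain --file=~/.console/chain/create-data.yml   # hypothetical file name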

Change the application language

The settings:set command helps to change the application configuration. In this example, using the arguments language es, we can set Spanish as the application language. After switching the default language, the interface is translated.
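
A minimal sketch using the arguments quoted above:

drupal settings:set language es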

  All of the available commands

The list command can be used to show all of the available commands. A screenshot was not included because the more than 130 commands would make the image too large for this blog post. 

For the full list of commands, you can also visit the documentation page at http://docs.drupalconsole.com/en/commands/available-commands.html 
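
To list every command locally:

drupal list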

What makes Drupal Console unique
  • Has been built to utilize the same modern PHP practices adopted by Drupal 8.
  • Generates the code and files required by Drupal 8 modules and components.
  • Facilitates Drupal 8 adoption while making development and interaction more efficient and enjoyable. 
  • Allows individuals and teams to develop smarter and faster on Drupal 8.
  • Helps developers to understand Drupal 8 with the "--learning" flag.
  • Fully multilingual and translatable, just like Drupal 8 itself.
Catégories: Elsewhere

FFW Agency: Great Examples Of Distributed Content Management In The Pharmaceutical Industry

Planet Drupal - jeu, 21/04/2016 - 02:00
Great Examples Of Distributed Content Management In The Pharmaceutical Industry hank.vanzile Thu, 04/21/2016 - 00:00

This is the third post in my series on Distributed Content Management.  In my first post I defined the term and used a few examples while doing so.  My second post, Great Examples of Distributed Content Management In Higher Education, expanded on the first example of a large university.  In today’s post we’ll explore the second example - a global pharmaceutical company - and once again discuss some great use cases for Distributed Content Management.

 

Setting The Scene

Pharmaceutical companies, more than companies in many other industries, must carefully consider all elements of their content lifecycle. Providing correct, approved content to both healthcare professionals and consumers is of utmost importance and, as such, web content in the pharmaceutical industry must undergo stringent regulatory review and control.  This requires consistent management across all digital properties and, for larger companies, that can be hundreds, or potentially even thousands, of websites and channels globally.

 

Use Case 1: Efficient Regulatory Review With Content Publishing Workflows

At first, the idea of Distributed Content Management may seem somewhat counterintuitive to how pharmaceutical companies work.  (In previous posts we’ve used it to explore empowering content creators and overcoming bottlenecks to content publishing - challenging concepts to tout for such a regulated industry.)  However, I’ve also opined that content approval and publishing workflows must be tailored to the specific use case.  

Consider a web publishing workflow that allows medical-legal reviews to take place within a Content Management System.  In some web systems this requires a multi-tiered platform wherein a “staging” version of the website - an exact copy of the real (“production”) website on which content changes have been staged - is made available for regulatory approval before the content is made available to the public.  While this is certainly more efficient than sharing offline documents, a deeper consideration of the technologies used can increase the efficiency and further control its risks.  

Some Content Management Systems, such as Drupal, allow content approval to take place on the production website, controlling the visibility and publishing of content through user authentication and roles instead of requiring separate “staging” websites.  By mapping the appropriate roles to regulatory affairs, pharmaceutical companies using this approach can avoid costly and time-consuming deployments of new content to the production site and free up the resources required to manage multiple copies of each website.

 

Use Case 2: Controlled, Single-Source Content Deployment

For some pharmaceutical content, decentralized content publishing may not be an appropriately-sized solution.  Some content is not only highly-regulated but also highly reused wherever products are marketed and is therefore best suited to be updated, approved, and disseminated from a central source.  Important Safety Information and Indications, for example, are types of content that a pharmaceutical company may choose to publish only through a centralized content repository.  

By establishing policies that all content editing must occur in the content repository, with individual websites disallowed from making changes locally, companies may avoid the need to have regulatory approval workflows on each of those sites and ensure that important information is updated in a timely and error-free way across numerous sites.  Content syndication is a fascinating opportunity for organizations considering Distributed Content Management and I’ll explore some of the available technologies, such as Acquia Content Hub, in later posts.

 

Use Case 3: Multichannel Brand Content

Single-source content syndication also provides an opportunity for pharmaceutical companies looking to promote their consumer products across multiple channels.  Let’s use e-commerce as an example.  Many companies choose to employ standalone, all-in-one e-commerce systems such as BigCommerce, Demandware and Magento rather than integrate e-commerce stores into each of their individual brand websites.  This makes a tremendous amount of sense: these systems can provide a number of compelling features such as gift cards, coupons, centralized inventory management, and opportunities for cross-selling other products among the company’s brands.  However, because these stores are independent of the main brand website, they too need to display content such as product descriptions, use and dosing information, ingredients, etc.  

By programmatically providing that content from a content repository to the e-commerce system, pharmaceutical companies can eliminate the risk of entering information directly into the store and potentially make use of the streamlined regulatory control processes they’ve already set up for the brand sites.

 

Use Case 4: Content Delivery To Validated Audiences

In addition to marketing content, pharmaceutical companies maintain large amounts of HCP content - information intended for healthcare professionals.  What content is available to these professionals, how they’ll access it, and how to validate the identity of a user seeking that information is another key consideration for a pharma company’s Distributed Content Management strategy.  A common approach is to segregate HCP content into regional “portals” - websites that require medical professionals to create accounts and login to see the information for their country or part of the world.  To overcome the challenge of validating these accounts, companies often integrate with an Identity Provider (IdP) such as DocCheck or Cegedim that specializes in maintaining national registries of healthcare professionals.  

However, having a number of disparate system integrations dependent on which country a website is intended to serve introduces both the overhead of managing multiple bundles of code - sometimes written in entirely different programming languages - and the opportunity for error in integrating the wrong code for the intended region.  Because of this, some global pharmaceutical companies may choose to build a more centralized approach to validation and registration, using an integration platform such as Mulesoft Anypoint Platform to amalgamate the different Identity Provider code bundles and provide simultaneous access to them all through a dedicated Identity Management system such as Janrain.

 

What’s Next?

We will continue exploring use cases for distributed content management for the next few posts before moving on to discussing some prerequisites for companies looking to implement Distributed Content Management.  Thoughts or questions?  Reach out in the comments below or tweet them to me at @HankVanZile.

 

 

Catégories: Elsewhere

Aten Design Group: Adding CSS Classes to Blocks in Drupal 8

Planet Drupal - jeu, 21/04/2016 - 00:25

This is an update to a previous post I wrote on adding classes to blocks in Drupal 7.

As I've stated before, I'm a big fan of Modular CSS, which requires the ability to easily manage classes on your markup. This was often a struggle in previous versions of Drupal. However, Drupal 8 makes this significantly easier to manage thanks to a number of improvements to the front-end developer experience (DX). In this post we'll look at two of these DX improvements, the Twig template language and hook_theme_suggestions_HOOK_alter, and how they make adding classes to blocks much easier to manage.

Twig allows us to easily open up a template file and add our classes where we need them. There are two main approaches to adding classes to a template file. The first is simple: open the file, add the class directly to the tag, save the file and move on with your life.

block.html.twig

<div class="block block--fancy">
  {{ title_prefix }}
  {% if label %}
    <h2 class="block__title block__title--fancy">{{ label }}</h2>
  {% endif %}
  {{ title_suffix }}
  {% block content %}
    {{ content }}
  {% endblock %}
</div>

This works in a lot of cases, but may not be flexible enough. The second approach utilizes the new attributes object – the successor to Drupal 7's attributes array. The attribute object encapsulates all the attributes for a given tag. It also includes a number of methods which enable you to add, remove and alter those attributes before printing. For now we'll just focus on the attributes.addClass() method. You can learn more about available methods in the official Drupal 8 documentation.

block.html.twig

{% set classes = [
  'block',
  'block--fancy'
] %}

{% set title_classes = [
  'block__title',
  'block__title--fancy'
] %}

<div{{ attributes.addClass(classes) }}>
  {{ title_prefix }}
  {% if label %}
    <h2{{ title_attributes.addClass(title_classes) }}>{{ label }}</h2>
  {% endif %}
  {{ title_suffix }}
  {% block content %}
    {{ content }}
  {% endblock %}
</div>

Alternatively, we can add our class directly to the class attribute alongside the existing classes from attributes.class, then print the remaining attributes. To prevent the class attribute from printing twice, we exclude it using the without Twig filter. Either way works.

block.html.twig

<div class="block--fancy {{ attributes.class }}"{{ attributes|without('class') }}>
  {{ title_prefix }}
  {% if label %}
    <h2 class="block--fancy {{ title_attributes.class }}" {{ title_attributes|without('class') }}>{{ label }}</h2>
  {% endif %}
  {{ title_suffix }}
  {% block content %}
    {{ content }}
  {% endblock %}
</div>

In any case, all our blocks on the site now look fancy as hell (assuming we've styled .block--fancy as such).

Template Suggestions

The above examples work, but in reality, if all our blocks look fancy, no blocks will look fancy. We need to apply this class only to the special blocks that truly deserve to be fancy. This introduces my second favorite DX improvement in Drupal 8 – hook_theme_suggestions_HOOK_alter.

In Drupal 7, if you wanted to make a custom template available for a certain block, you had to do so in a preprocess function. In Drupal 8, altering theme hook suggestions (the list of possible templates) is delegated to its very own hook. The concept is pretty straightforward. Before Drupal renders an element, it looks at an array of possible template file names (a.k.a. suggestions) one by one. For each template file, it looks in the file system to see if that file exists in our theme, its base theme, or core themes. Once it finds a match, it stops looking and renders the element using the matching template.

We'll use this hook to add our new template file to the list of suggestions. In the case of blocks, the function we'll define is hook_theme_suggestions_block_alter. It takes two arguments: the first is the array of suggestions, which is passed by reference (by prefixing the parameter with an &) so we can alter it directly. The second is the array of variables from our element, which we can use to determine which templates we want to include.

Let's assume we renamed one of our templates above to block--fancy.html.twig and saved it to our theme. We then add the following function to my_theme.theme, where "my_theme" is the name of our theme.

my_theme.theme

<?php

/**
 * Implements hook_theme_suggestions_HOOK_alter() for block templates.
 */
function my_theme_theme_suggestions_block_alter(array &$suggestions, array $variables) {
  $block_id = $variables['elements']['#id'];

  /* Uncomment the line below to see variables you can use to target a block. */
  // print $block_id . '<br/>';

  /* Add a template suggestion based on the block id. */
  switch ($block_id) {
    /* Account Menu block. */
    case 'account_menu':
      $suggestions[] = 'block__fancy';
      break;
  }
}

Now the account menu block on our site will use block--fancy.html.twig, as we can see from the output of Twig debug.

This is just one example of the improvements in D8 theming. I'm really excited for the clarity that the new Twig templates bring to Drupal 8 and the simplicity of managing template suggestions through hook_theme_suggestions_HOOK_alter.

Catégories: Elsewhere

Jonathan Dowland: mount-on-demand backups

Planet Debian - mer, 20/04/2016 - 22:49

Last week, someone posted a request for help on the popular Server Fault Q&A site: they had apparently accidentally deleted their entire web hosting business, and all their backups. The post (now itself deleted) was a reasonably obvious fake, but mainstream media reported on it anyway, and then life imitated art and 123-reg went and did actually delete all their hosted VMs, and their backups.

I was chatting to some friends from $job-2 and we had a brief smug moment that we had never done anything this bad, before moving on to incredulity that we had never done anything this bad in the 5 years or so we were running the University web servers. Some time later I realised that my personal backups were at risk from something like this because I have a permanently mounted /backup partition on my home NAS. I decided to fix it.

I already use Systemd to manage mounting the /backup partition (via a backup.mount file) and its dependencies. I'll skip the finer details of that for now.

I planned to define new Systemd units for each backup job that was previously scheduled via Cron, so that I could mark them as depending on the /backup mount. I needed to adjust that mount definition by adding StopWhenUnneeded=true. This ensures that /backup will be unmounted when it is not in use by another job, and not at risk of a stray rm -rf.

The backup jobs are all simple shell scripts that convert quite easily into services. An example:

backup-home.service:

[Unit]
Requires=backup.mount
After=backup.mount

[Service]
User=backupuser
Group=backupuser
ExecStart=/home/backupuser/bin/phobos-backup-home

To schedule this, I also need to create a timer:

backup-home.timer:

[Timer]
OnCalendar=*-*-* 04:01:00

[Install]
WantedBy=timers.target

To enable the timer, you have to both enable and start it:

systemctl enable backup-home.timer
systemctl start backup-home.timer

I created service and timer units for each of my cron jobs.

The other big difference to driving these from Cron is that by default I won't get any emails if the jobs generate output - in particular, if they fail. I definitely do want mail if things fail. The Arch Wiki has an interesting proposed solution to this which I took a look at. It's a bit clunky, and my initial experiments with a derivation from this (using mail(1) not sendmail(1)) have not yet generated any mail.

Pros and Cons

The Systemd timespec is more intuitive than Cron's. It's a shame you need a minimum of three more lines of boilerplate for the simplest of timers. I think WantedBy=timers.target should probably be an implicit default for all .timer type units. Here I think clarity suffers in the name of consistency.

With timers, start doesn't kick off the job; it really means "enable" in the context of timers, which is clumsy considering the existing enable verb, which seems almost superfluous but is necessary for consistency, since Systemd units need to be enabled before they can be started. As Simon points out in the comments, this is not true. Rather, "enable" is needed for the timer to be active upon subsequent boots, but won't enable it in the current boot. "Start" will enable it for the current boot, but not for subsequent ones.

Since I need a .service and a .timer file for each active line in my crontab, that's a lot of small files (twice as many as the number of jobs being defined), and they're all stored in a system-wide folder because of the dependency on the necessarily system-level units defining the mount.

It's easy to forget the After= line for the backup services. On the one hand, it's a shame that After= doesn't imply Requires=, so that you wouldn't need both, or that there isn't a convenience option that does both. On the other hand, there are already too many Systemd options, and adding more conjoined ones would just make it even more complicated.

It's a shame I couldn't use user-level units to achieve this, but they could not depend on the system-level ones, nor activate /backup. This is a sensible default, since you don't want any user to be able to start any service on-demand, but some way of enabling it for these situations would be good. I ruled out systemd.automount because a stray rm -rf would trigger the mount which defeats the whole exercise. Apparently this might be something you solve with Polkit, as the Arch Wiki explains, which looks like it has XML disease.

I need to get mail-on-error working reliably.

Catégories: Elsewhere

Ben Hutchings: Experiments with signed kernels and modules in Debian

Planet Debian - mer, 20/04/2016 - 20:53

I've lately been working on support for Secure Boot in Debian, mostly in the packages maintained by the kernel team.

My instructions for setting up UEFI Secure Boot are based on OVMF running on KVM/QEMU. All 'Designed for Windows' PCs should allow reconfiguration of SB, but it may not be easy to do so. They also assume that the firmware includes an EFI shell.

Updated: Robert Edmonds pointed out that the 'Designed for Windows' requirements changed with Windows 10:

@benhutchingsuk "Hardware can be Designed for Windows 10 and can offer no way to opt out of the Secure Boot" https://t.co/lQVdPYtMwx

— Robert Edmonds (@rsedmonds) April 20, 2016

The ability to reconfigure SB is indeed now optional for devices which are designed to always boot with a specific Secure Boot configuration. I also noticed that the requirements say that OEMs should not sign an EFI shell binary. Therefore I've revised the instructions to use efibootmgr instead.

Background

UEFI Secure Boot, when configured and enabled (which it is on most new PCs) requires that whatever it loads is signed with a trusted key. The one common trusted key for PCs is held by Microsoft, and while they will sign other people's code for a nominal fee, they require that it also validates the code it loads, i.e. the kernel or next stage boot loader. The kernel in turn is responsible for validating any code that could compromise its integrity (kernel modules, kexec images).

Currently there are no such signed boot loaders in Debian, though the shim and grub-signed packages included in many other distributions should be usable. However it's possible to load an appropriately configured Linux kernel directly from the UEFI firmware (typically through the shell) which is what I'm doing at the moment.

Packaging signed kernels

Signing keys obviously need to be protected against disclosure; the private keys can't be included in a source package. We also won't install them on buildds separately, and generating signatures at build time would of course be unreproducible. So I've created a new source package, linux-signed, which contains detached signatures prepared offline.

Currently the binary packages built from linux-signed also contain only detached signatures, which are applied as necessary at installation time. The signed kernel image (only on x86 for now) is named /boot/vmlinuz-kversion.efi.signed. However, since packages must not modify files owned by another package and I didn't want to dpkg-divert thousands of modules, the module signatures remain detached. Detached module signatures are a new invention of mine, and require changes in kmod and various other packages to support them. (An alternate might be to put signed modules under a different directory and drop a configuration file in /lib/depmod.d to make them higher priority. But then we end up with two copies of every module installed, which can be a substantial waste of space.)

Preparation

The packages you need to repeat the experiment:

  • linux-image-4.5.0-1-flavour version 4.5.1-1 from unstable (only 686, 686-pae or amd64 flavours have signed kernels; most flavours have signed modules)
  • linux-image-4.5.0-1-flavour-signed version 1~exp3 from experimental
  • initramfs-tools version 0.125 from unstable
  • kmod and libkmod2 unofficial version 22-1.2 from people.debian.org

For Secure Boot, you'll then need to copy the signed kernel and the initrd onto the EFI system partition, normally mounted at /boot/efi.

SB requires a Platform Key (PK) which will already be installed on a real PC. You can replace it but you don't need to. If you're using OVMF, there are no persistent keys so you do need to generate your own:

openssl req -new -x509 -newkey rsa:2048 -keyout pk.key -out pk.crt \
    -outform der -nodes

You'll also need to install the certificate for my kernel image signing key, which is under debian/certs in the linux-signed package. OVMF requires this in DER format:

openssl x509 -in linux-signed-1~exp3/debian/certs/linux-image-benh@debian.org.cert.pem \
    -out linux.crt -outform der

You'll need to copy the certificate(s) to a FAT-formatted partition such as the EFI system partition, so that the firmware can read it.

Use efibootmgr to add a boot entry for the kernel, for example:

efibootmgr -c -d /dev/sda -L linux-signed -l '\vmlinuz.efi' -u 'initrd=initrd.img root=/dev/sda2 ro quiet'

You should use the same kernel parameters as usual, except that you also need to specify the initrd filename using the initrd= parameter. The EFI stub code at the beginning of the kernel will load the initrd using EFI boot services.

Enabling Secure Boot
  1. Reboot the system and enter UEFI setup
  2. Find the menu entry for Secure Boot customisation (in OVMF, it's under 'Device Manager' for some reason)
  3. In OVMF, enrol the PK from pk.crt
  4. Add linux.crt to the DB (whitelist database)
  5. Ensure that Secure Boot is enabled and in 'User Mode'
Booting the kernel in Secure Boot

If all went well, Linux will boot as normal. You can confirm that Secure Boot was enabled by reading /sys/kernel/security/securelevel, which will contain 1 if it was.

Module signature validation

Module signatures are now always checked and unsigned modules will be given the 'E' taint flag. If Secure Boot is used or you add the kernel parameter module.sig_enforce=1, unsigned modules will be rejected. You can also turn on signature enforcement and turn off various other methods of modifying kernel code (such as kexec) by writing 1 to /sys/kernel/security/securelevel.

Catégories: Elsewhere

Reproducible builds folks: Reproducible builds: week 51 in Stretch cycle

Planet Debian - mer, 20/04/2016 - 20:47

What happened in the reproducible builds effort between April 10th and April 16th 2016:

Toolchain fixes
  • Roland Rosenfeld uploaded transfig/1:3.2.5.e-6 which honors SOURCE_DATE_EPOCH. Original patch by Alexis Bienvenüe.
  • Bill Allombert uploaded gap/4r8p3-2 which makes convert.pl honor SOURCE_DATE_EPOCH. Original patch by Jerome Benoit, duplicate patch by Dhole.
  • Emmanuel Bourg uploaded ant/1.9.7-1 which makes the Javadoc task use UTF-8 as the default encoding if none was specified and SOURCE_DATE_EPOCH is set.

Antoine Beaupré suggested that gitpkg stops recording timestamps when creating upstream archives. Antoine Beaupré also pointed out that git-buildpackage diverges from the default gzip settings which is a problem for reproducibly recreating released tarballs which were made using the defaults.

Alexis Bienvenüe submitted a patch extending sphinx SOURCE_DATE_EPOCH support to copyright year.

Packages fixed

The following packages have become reproducible due to changes in their build dependencies: atinject-jsr330, avis, brailleutils, charactermanaj, classycle, commons-io, commons-javaflow, commons-jci, gap-radiroot, jebl2, jetty, libcommons-el-java, libcommons-jxpath-java, libjackson-json-java, libjogl2-java, libmicroba-java, libproxool-java, libregexp-java, mobile-atlas-creator, octave-econometrics, octave-linear-algebra, octave-odepkg, octave-optiminterp, rapidsvn, remotetea, ruby-rinku, tachyon, xhtmlrenderer.

The following packages became reproducible after getting fixed:

Some uploads fixed some reproducibility issues, but not all of them:

Patches submitted which have not made their way to the archive yet:

  • #820603 on viking by Alexis Bienvenüe: fix icon headers inclusion order.
  • #820661 on nullmailer by Alexis Bienvenüe: fix the order in which files are included in the static archive.
  • #820668 on sawfish by Alexis Bienvenüe: fix file ordering in theme archives, strip hostname and username from the config.h file, and honour SOURCE_DATE_EPOCH when creating the config.h file.
  • #820740 on bless by Alexis Bienvenüe: always use /bin/sh as shell.
  • #820742 on gmic by Alexis Bienvenüe: strip the build date from help messages.
  • #820809 on wsdl4j by Alexis Bienvenüe: use a plain text representation of the copyright character.
  • #820815 on freefem++ by Alexis Bienvenüe: fix the order in which files are included in the .edp files, and honour SOURCE_DATE_EPOCH when using the build date.
  • #820869 on pyexiv2 by Alexis Bienvenüe: honour the SOURCE_DATE_EPOCH environment variable through the ustrftime function, to get a reproducible copyright year.
  • #820932 on fim by Alexis Bienvenüe: fix the order in which files are joined in header files, strip the build date from the fim binary, make the embedded vim2html script honour the SOURCE_DATE_EPOCH variable when building the documentation, and force language to be English when using bison to make a grammar that is going to be parsed using English keywords.
  • #820990 on grib-api by Santiago Vila: always call dh-buildinfo.
diffoscope development

Zbigniew Jędrzejewski-Szmek noted in #820631 that diffoscope doesn't work properly when a file contains several cpio archives.

Package reviews

21 reviews have been added, 14 updated and 22 removed in this week.

New issue found: timestamps_in_htm_by_gap.

Chris Lamb reported 10 new FTBFS issues.

Misc.

The video and the slides from the talk "Reproducible builds ecosystem" at LibrePlanet 2016 have been published now.

This week's edition was written by Lunar and Holger Levsen. h01ger automated the maintenance and publishing of this weekly newsletter via git.

Catégories: Elsewhere

Mediacurrent: New eBook: Intranets the Drupal Way

Planet Drupal - mer, 20/04/2016 - 20:14

The Intranet has entered a new era where 78% of companies are running on open source software. Now, options for corporate Intranets are no longer confined to proprietary platforms.

Catégories: Elsewhere

myDropWizard.com: Drupal 6 security update for Views!

Planet Drupal - mer, 20/04/2016 - 19:40

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Moderately Critical security release for Views to fix an Access Bypass vulnerability.

The Views module provides a flexible method for Drupal site designers to control how lists and tables of content, users, taxonomy terms and other data are presented.

The module doesn't sufficiently check handler access when returning the list of handlers from views_plugin_display::get_handlers(). The most critical code (access plugins and field output) is unaffected - only area handlers, the get_field_labels() method, token replacement, and some relationship handling are susceptible.

Download the patch for Views 6.x-2.x or Views 6.x-3.x!

If you have a Drupal 6 site using the Views module (probably most sites), we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Catégories: Elsewhere

OSTraining: Drupal 8.1 and What It Means for Drupal's Future

Planet Drupal - mer, 20/04/2016 - 17:32

Today, Drupal 8.1 was officially released.

All the way back in 2014, we talked about the changes coming to Drupal and how the release cycle would allow for changes to be progressively added to Drupal.

At that time, it was estimated that a new version with new features could be released every 6 months. Keeping to that schedule for Drupal 8 has been problematic due to the size and scope of what they wanted to achieve, but they made it! 

Catégories: Elsewhere

Wim Leers: Drupal 8.1: BigPipe as an experimental module

Planet Drupal - mer, 20/04/2016 - 13:09

Today, Drupal 8.1 has been released and it includes BigPipe as an experimental module.

Six months ago, on the day of the release of Drupal 8, the BigPipe contrib module was released.

So BigPipe was first prototyped in contrib, then moved into core as an experimental module.

Experimental module?

Quoting d.o/core/experimental:

Experimental modules allow core contributors to iterate quickly on functionality that may be supported in an upcoming minor release and receive feedback, without needing to conform to the rigorous requirements for production versions of Drupal core.

Experimental modules allow site builders and contributed project authors to test out functionality that might eventually be included as a stable part of Drupal core.

With your help (in other words: by testing), we can help BigPipe “graduate” as a stable module in Drupal 8.2. This is the sort of module that needs wider testing because it changes how pages are delivered, so before it can be considered stable, it must be tested in as many circumstances as possible, including the most exotic ones.

(If your site offers personalization to end users, you are encouraged to enable BigPipe and report issues. There is zero risk of data loss. And when the environment — i.e. web server or (reverse) proxy — doesn’t support streaming, then BigPipe-delivered responses behave as if BigPipe was not installed. Nothing breaks, you just go back to the same perceived performance as before.)

About 500 sites are currently using the contrib module. With the release of Drupal 8.1, hopefully thousands of sites will test it. [1] [2]

Please report any issues you encounter! Hopefully there won’t be many. I’d be very grateful to hear about success stories too — feel free to share those as issues too!

Documentation

Of course, documentation is ready too:

What about the contrib module?

The BigPipe contrib module is still available for Drupal 8.0, and will remain available.

  • 1.0-beta1 was released on the same day as Drupal 8.0.0
  • 1.0-beta2 was released on the same day as Drupal 8.0.1, and made it feature-complete
  • 1.0-beta3 contained only improved documentation
  • 1.0-rc1 brought comprehensive test coverage, which was the last thing necessary for BigPipe to become a core-worthy module — the same day as the work continued on the core issue: https://www.drupal.org/node/2469431#comment-10899308
  • 1.0 was tagged today, on the same day as Drupal 8.1.0

Going forward, I’ll make sure to tag releases of the BigPipe contrib module matching Drupal 8.1 patch releases, if they contain BigPipe fixes/improvements. So, when Drupal 8.1.3 is released, BigPipe 1.3 for Drupal 8.0 will be released also. That makes it easy to keep things in sync.

Upgrading?

When you upgrade from Drupal 8.0 to Drupal 8.1, and you were using the BigPipe module on your 8.0 site, then follow the instructions in the 8.1.0 release notes:

If you previously installed the BigPipe contributed module, you must uninstall and remove it before upgrading from Drupal 8.0.x to 8.1.x.

  1. Note there is also the BigPipe demo module (d.o/project/big_pipe_demo), which makes it easy to simulate the impact of BigPipe on your particular site. 

  2. There’s also a live demo: http://bigpipe.demo.wimleers.com/ 

  • Acquia
  • Drupal
  • WPO
  • performance
Catégories: Elsewhere

Michal Čihař: Testing Sphinx documentation with Jenkins

Planet Debian - mer, 20/04/2016 - 12:00

While reviewing comments on the phpMyAdmin wiki (which we're shrinking down to developer documentation and moving end user documentation to proper documentation), I noticed that people complained there about broken links in our documentation. Indeed there were quite a few of them, as this is something nobody really checks. It seems like an obvious task to automate.

It seemed so obvious that somebody must have done it already. Unfortunately I have not found much, but at least there was "Using Jenkins to parse sphinx warnings". This helps with the build warnings, but unfortunately I found no integration for the linkcheck builder. Fortunately it's quite easy with the Jenkins Warnings plugin to write custom parsers and to parse the linkcheck output as well.

The Sphinx output parser based on above link can be configured like:

Regular Expression:

^(.*):(\d+): \((.*)\) (.*)

Mapping Script:

import hudson.plugins.warnings.parser.Warning

String fileName = matcher.group(1)
String lineNumber = matcher.group(2)
String category = matcher.group(3)
String message = matcher.group(4)

return new Warning(fileName, Integer.parseInt(lineNumber), "sphinx", category, message);

Example Log Message:

Percona-Server-1.0.2-3.rst:67: (WARNING/2) Inline literal start-string without end-string.

The Sphinx linkcheck output is quite similar:

Regular Expression:

^(.*):(\d+): \[([^\]]*)\] (.*)

Mapping Script:

import hudson.plugins.warnings.parser.Warning

String fileName = matcher.group(1)
String lineNumber = matcher.group(2)
String category = matcher.group(3)
String message = matcher.group(4)

return new Warning(fileName, Integer.parseInt(lineNumber), "sphinx-linkcheck", category, message);

Example Log Message:

faq.rst:793: [broken] http://www.hardened-php.net/: <urlopen error [Errno -3] Temporary failure in name resolution>

All you need to do now is enable these in your Jenkins project, let the Sphinx parser handle the build output, and point the Sphinx linkcheck parser at the file generated by linkcheck (usually _build/linkcheck/output.txt). The result can be found on the phpMyAdmin CI server.

Filed under: English phpMyAdmin

Catégories: Elsewhere

Dries Buytaert: Applaud the Drupal maintainers

Planet Drupal - mer, 20/04/2016 - 11:38

Today is another big day for Drupal as we just released Drupal 8.1.0. Drupal 8.1.0 is an important milestone as it is a departure from the Drupal 7 release schedule where we couldn't add significant new features until Drupal 8. Drupal 8.1.0 balances maintenance with innovation.

On my blog and in presentations, I often talk about the future of Drupal and where we need to innovate. I highlight important developments in the Drupal community, and push my own ideas to disrupt the status quo. People, myself included, like to talk about the shiny innovations, but it is crucial to understand that innovation is only a piece of how we grow Drupal's success. What can't be forgotten is the maintenance, the bug fixing, the work on Drupal.org and our test infrastructure, the documentation writing, the ongoing coordination and the processes that allow us to crank out stable releases.

We often recognize those who help Drupal innovate or introduce novel things, but today, I'd like us to praise those who maintain and improve what already exists and that was innovated years ago. So much of what makes Drupal successful is the "daily upkeep". The seemingly mundane and unglamorous effort that goes into maintaining Drupal has a tremendous impact on the daily life of hundreds of thousands of Drupal developers, millions of Drupal content managers, and billions of people that visit Drupal sites. Without that maintenance, there would be no stability, and without stability, no room for innovation.

Catégories: Elsewhere

Jim Birch: Midcamp 2016 Recap - Where the Drupal community comes together!

Planet Drupal - mer, 20/04/2016 - 11:20

MidCamp 2016, the Midwest Drupal Camp, was a roaring success.  We had 36 sessions and 1 keynote spread across the University of Chicago Student Center West.  All of the sessions were successfully recorded by our amazing AV team and shared within hours on the MidCamp YouTube channel.  Our sponsor tables were busy; our Birds of a Feather discussions were many; and our socials were social!

This was my second time attending, and my first time being a volunteer organizer.  If you attended, I hope that I got to greet you on the way in.  Attending my first year, I was so awestruck by the amount of knowledge and talent at MidCamp, I couldn't help but get involved.  After volunteering to help, I am still in awe of the dedication of the volunteers, and the effort it takes to put on a camp like this.  Thanks to all of the volunteers for the countless hours put in throughout the year to make this event happen.

Please indulge me a moment while I call out a few individuals specifically for the incredible effort and dedication they put into MidCamp 2016.



Drupal Console: Drupal Console and Beer - Enzo joins us from Chongqing

Planet Drupal - mer, 20/04/2016 - 10:33
This time, enzo joins us from Chongqing to talk about the upcoming presentations on his enzotour 2016. We also talk about the latest features added in the 0.11.3 release, our very last one before 1.0.0-alpha1. The next release will be tagged once Drupal 8.1.0 is released.

Drupal Blog: Drupal 8.1.0 is now available

Planet Drupal - mer, 20/04/2016 - 09:48

Drupal 8.1.0, the first minor release of Drupal 8, is now available. With Drupal 8, we made significant changes in our release process, adopting semantic versioning and scheduled feature releases. This allows us to make extensive improvements to Drupal 8 in a timely fashion while still providing backwards compatibility. Drupal 8.1.0 is the first such update.

What's new in Drupal 8.1.x?

Drupal 8.1.0 comes with numerous improvements, including CKEditor WYSIWYG enhancements, added APIs, an improved help page, and two new experimental modules. (Experimental modules are provided with Drupal core for testing purposes, but are not yet fully supported.)

Download Drupal 8.1.0

Experimental UI for migrations from Drupal 6 and 7

Drupal 8.1.0 now includes the Migrate Drupal UI module, which provides a user interface for Drupal core migrations. Use it to migrate Drupal 6 or 7 sites to Drupal 8. The user guide on migrating from Drupal 6 or 7 to Drupal 8 has full documentation. Note that the Drupal 8 Migrate module suite is still experimental and has known issues. Read below for specific information on migrating Drupal 6 and Drupal 7 sites with 8.1.0. (Always back up your data before performing a migration and review the results carefully.)

BigPipe for perceived performance

The Drupal 8 BigPipe module provides an advanced implementation of Facebook's BigPipe page rendering strategy, leading to greatly improved perceived performance for pages with dynamic, personalized, or uncacheable content. See the BigPipe documentation.
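BigPipe can only stream the parts of a page that are rendered via placeholders, which in Drupal 8 usually means render array elements built by a lazy builder. As a rough sketch (the mymodule module, the dashboard controller, and the mymodule.greeting_builder service are hypothetical, not part of core), a controller might mark its personalized portion like this:

<?php

namespace Drupal\mymodule\Controller;

use Drupal\Core\Controller\ControllerBase;

class DashboardController extends ControllerBase {

  public function page() {
    return [
      // Static, cacheable part of the page: sent to the browser right away.
      'intro' => [
        '#markup' => $this->t('Welcome to the dashboard.'),
      ],
      // Personalized part: built by a lazy builder, so BigPipe can stream it
      // in after the rest of the page has already been sent.
      'greeting' => [
        '#lazy_builder' => ['mymodule.greeting_builder:build', []],
        '#create_placeholder' => TRUE,
      ],
    ];
  }

}

With the BigPipe module enabled, the cacheable parts of the response are flushed first and the placeholder is filled in afterwards, which is where the improvement in perceived performance comes from.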

CKEditor WYSIWYG spellchecking and language button

Drupal 8.0.0 included the CKEditor module (a WYSIWYG editor), but it was not previously possible to use your browser's built-in spell checker with it to check the text. With Drupal 8.1.0, spellchecking is now enabled within CKEditor as well.

Another great improvement is the addition of the optional language markup button in CKEditor. When configured to appear in your editing toolbar, it allows you to assign language information to parts of the text, which is useful for accessibility and machine processing.

Improved help page with tours

Drupal 8.0.0 included a new system for help tutorials called tours with the core Tour module. In Drupal 8.1.0, we made these tours easier to discover by listing them in the administrative help overview at /admin/help.

The help overview page is also more flexible now, so contributed modules can add sections to it and themes can override its appearance more easily. You can read more about the new system in the change record for the updated help page, or refer to the Tour API documentation for how to add tours for your modules.

Rendered entities in Views fields

Drupal 8.1.0 now includes a rendered entity field handler for Views, which allows placing a fully rendered entity within a view field. For example, this feature could be used to display a rendered user profile for each node author in a table listing node content. (This feature was provided by the Entity contributed module in Drupal 7, but had not yet been available in Drupal 8.)

Support for JavaScript automated testing

Drupal 8.1.0 adds support for automated testing of JavaScript, which will mean fewer bugs with Drupal's JavaScript functionality in the future as we write new tests for it. (Read more about how to run the JavaScript tests.) There are also other improvements to the testing system, including improved reporting of PHPUnit and other test results.
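As a rough sketch of what such a test looks like (the mymodule module, its route, link text, and CSS class are hypothetical; JavascriptTestBase is the base class added in 8.1), a functional JavaScript test is a PHPUnit test class that drives a real browser session:

<?php

namespace Drupal\Tests\mymodule\FunctionalJavascript;

use Drupal\FunctionalJavascriptTests\JavascriptTestBase;

class ToggleWidgetTest extends JavascriptTestBase {

  /**
   * Modules to enable for this test.
   */
  public static $modules = ['mymodule'];

  /**
   * Clicking the toggle link should reveal the details pane via JavaScript.
   */
  public function testToggleRevealsDetails() {
    $this->drupalGet('mymodule/widget');
    $page = $this->getSession()->getPage();
    $page->clickLink('Show details');
    // A real test would normally wait for the JavaScript to finish before
    // asserting; this simply checks that the pane is present afterwards.
    $this->assertNotEmpty($page->find('css', '.widget-details'));
  }

}

These tests run through PHPUnit like the rest of the suite, so the improved PHPUnit result reporting mentioned above applies to JavaScript coverage as well.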

Improved Composer support

Starting with Drupal 8.1.x, Drupal core and its dependencies are packaged by Composer on Drupal.org. This means that sites and modules can now also use Composer to manage all of their third-party dependencies (rather than having to work around the vendor directory that previously shipped with core).

Developer API improvements

Minor releases like Drupal 8.1.0 include backwards-compatible API additions for developers as well as new features. Read the 8.1.0 release notes for more details on the many improvements for developers in this release.

What does this mean to me?

Drupal 8 site owners

Update to 8.1.0 to continue receiving bug and security fixes. The next bugfix release, 8.1.1, is scheduled for May 4, 2016.

Updating your site from 8.0.6 to 8.1.0 with update.php is exactly the same as updating from 8.0.5 to 8.0.6. Modules, themes, and translations may need small changes for this minor release, so test the update carefully before updating your production site.

Drupal 6 site owners

Drupal 6 is not supported anymore. Create a Drupal 8 site and try migrating your data into it as soon as possible. Your Drupal 6 site can still remain up and running while you test migrating your Drupal 6 data into your new Drupal 8 site. Note that there are known issues with the experimental Migrate module suite. If you find a new bug not covered by one of these issues, your detailed bug report with steps to reproduce is a big help!

Drupal 7 site owners

Drupal 7 is still fully supported and will continue to receive bug and security fixes throughout all minor releases of Drupal 8.

The new Migrate Drupal UI for Migrate also allows migrating a Drupal 7 site into a Drupal 8 site, but the migration path from Drupal 7 to 8 is not complete, so you may encounter errors or missing migrations when you try to migrate. That said, since your Drupal 7 site can remain up and running while you test migrating into a new Drupal 8 site, you can help us stabilize the Drupal 7 to Drupal 8 migration path! Testing and bug reports from your real-world Drupal 7 sites will help us stabilize this functionality sooner for everyone. (Search the known issues.)

Translation, module, and theme contributors

Minor releases like Drupal 8.1.0 are backwards-compatible, so modules, themes, and translations that support Drupal 8.0.x will be compatible with 8.1.x as well. However, the new version does include some string changes, minor UI changes, and internal API changes (as well as more significant changes to experimental modules like the Migrate suite). This means that some small updates may be required for your translations, modules, and themes. See the announcement of the 8.1.0 release candidate for more background information.

