Feed aggregator

Microserve: Adding complex conditions to views with a query alter hook

Planet Drupal - Tue, 26/05/2015 - 13:05
What we were doing

Recently one of our clients requested an update to the visibility settings in their content and for these access settings to filter through to some views they had set up. Using Organic Groups, we could easily create public and private content, but the client wanted more fine-grained permissions that allowed for content to be restricted by user status as well.

With the help of the OG Extras module, we created the following options:

  • Publicly accessible to all users
  • Restricted to only logged-in users
  • Restricted to only users in one of the content's groups
  • Private to only the user creating the content

That was easy enough: we updated the field base to include the new options, keyed using the constants defined in the OG Access submodule (and adding one of our own in a custom module where all of this work took place). The tricky part was adding filtering to the views, as there was now no easy way of telling views to filter correctly. So, hook_views_query_alter() to the rescue!

Why we were doing it

It's important to point out that this isn't always the solution you should seek. Adding a query alter hook makes it more difficult for a site to be maintained - site builders can't see it in the UI, and if you don't clearly document your work, then you run the risk of having it overridden by an over-zealous administrator who renames the view, or something similar.

In our case, we were unable to use the filtering that Views offers out-of-the-box to get the results we needed. Not only did we need to know what permissions were set on the content, but we also needed to know whether the user was logged in, whether the user looking at it was the user who created it, and whether the user looking at it was in one of the groups that the content appeared in.

Simply put, Views couldn't handle all of those filters at once through the UI, so our only option was to modify the query directly.

Planning

Before diving into the code, it became apparent that we would need to figure out exactly what data needed to be displayed in each use case, and how to set up the code. We began with a list of criteria that would lead to the view being displayed:

  • User is anonymous AND the group content access is set to 'public - all including anon'.
  • User is logged in AND the group content access is set to 'restricted - only auth users' OR content is in one of the user's groups.
  • User is logged in AND the group content access is set to 'private - only users in these groups' AND content is in one of the user's groups.
  • User is logged in AND they created the content.

From there, it was easy to break the checks down and order them in such a way that each new check overrides the previous ones.

  • Is the user logged in? If no, then only show public content. If yes, show content for members.
  • Is the content in one of the user's groups? If yes, then show it. If not, don't.
  • Did the user create the content? If yes, show it.

This could all be achieved by adding a db_or() condition to the query in hook_views_query_alter().

Setting up the code - step by step

Bear in mind that hook_views_query_alter() operates on all Views queries, so it's a good idea to make sure you're working with the correct view/query before going ahead. For us, a simple check against the $view->name was sufficient.
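As a minimal sketch of that guard (the module name mymodule and the view machine name group_content_listing are placeholders, not the names from the client project), the hook might start like this:

/**
 * Implements hook_views_query_alter().
 */
function mymodule_views_query_alter(&$view, &$query) {
  // Bail out early for every view except the one we want to alter.
  if ($view->name != 'group_content_listing') {
    return;
  }
  // The table additions and conditions described below go here.
}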

In order to retrieve some of this data, we need to add some extra tables to the query and set up some other variables.

$anon = user_is_anonymous();
$query->add_table('field_data_group_content_access');
$query->add_table('og_membership_node');
$query->add_table('users_node');

// Set up our db_or. Either...
$conditions = db_or();

The first check we need to make is whether or not the user viewing the view is anonymous. If they are, we don't need to do too much else - just add the condition and continue.

if ($anon) {
  // ... user is anonymous and content is public.
  $conditions->condition('field_data_group_content_access.group_content_access_value', array(OG_CONTENT_ACCESS_PUBLIC));
}

Then comes the trickier bit. The user is obviously not anonymous, so we need to check the status of the content, which groups it belongs to, and whether the author of the content is the one viewing it.

First things first, using the global $user variable, we can check the user's roles. For this client, we were also checking for the 'member' role, but you can just as easily remove this check (assuming all authenticated users can view this content).

else {
  global $user;

  // ... user is logged in and content is restricted to member users.
  $access_values = array(OG_CONTENT_ACCESS_PUBLIC);
  if (in_array('member', $user->roles)) {
    $access_values[] = OG_CONTENT_ACCESS_RESTRICTED;
  }

  // Add the condition.
  $conditions->condition('field_data_group_content_access.group_content_access_value', $access_values);

The next step is to load the logged in user's groups and filter the view by the group IDs. For this, we'll need to create a db_and() - we want to make sure the content has one of the $access_values we've set up, and also that it is in one of their groups. If the user doesn't have any groups, there's no point in adding this condition.

  // ... user is logged in and content is in their groups.
  $gids = og_get_groups_by_user(NULL, 'node');
  if (!empty($gids)) {
    $access_values[] = OG_CONTENT_ACCESS_PRIVATE;
    $group_and = db_and();
    $group_and->condition('og_membership_node.gid', $gids, 'IN');
    $group_and->condition('field_data_group_content_access.group_content_access_value', $access_values);

    // Add the condition.
    $conditions->condition($group_and);
  }

Finally, add a check to see if the content being viewed is content that the user has created themselves. If they've created it, they should be able to view it, regardless of any other permissions set on it.

  // ... user is logged in and they created the content.
  $conditions->condition('users_node.uid', $user->uid);
}

And to wrap it all up, we need to add the condition to the query.

$query->add_where(0, $conditions);

Final structure

Fragmenting the code like this makes it difficult to follow what's going on, and I whizzed through it quite quickly. In the end, though, you should end up with a db_or() statement that looks something like one of these. Remember, the $conditions value differs depending on whether the user is logged in or not.

For anonymous users:

$conditions = db_or()
  ->condition('field_data_group_content_access.group_content_access_value', array(OG_CONTENT_ACCESS_PUBLIC));

For logged-in users:

$conditions = db_or()
  ->condition('field_data_group_content_access.group_content_access_value', $access_values)
  ->condition(db_and()
    ->condition('og_membership_node.gid', $gids, 'IN')
    ->condition('field_data_group_content_access.group_content_access_value', $access_values)
  )
  ->condition('users_node.uid', $user->uid);

Good luck!

Sophie Shanahan-Kluth

Zivtech: 11 Lorem Ipsum Generators That Make Drupal Site Building Fun

Planet Drupal - Tue, 26/05/2015 - 13:00

I have been building and maintaining Drupal sites for over 8 years now. As a Drupal site builder it is my job to ensure that all content types, fields, views, panels, and other components of a Drupal site build are working properly before launch. This requires actually entering real content on the site, which can be time consuming. Some of you may be thinking, “What about Devel Generate?” Devel Generate works great for initial testing of content types, as it allows for automatic creation of nodes for testing, but we still need to make sure creation of content goes smoothly for content editors. This means at some point we actually have to test creating content manually. This is where a Lorem Ipsum generator comes in handy.

I have mainly used generic lorem ipsum generators like Lipsum.com over the years, or more recently the Lorem Ipsum Generator (Default Text) Chrome extension. These work great, but after 8 years have become a bit of a bore when creating test content. One of our team members, Nick Lewis, showed me the Corporate BS Generator one day and just last week I thought to myself, “That Corporate BS Generator would be a pretty funny lorem ipsum generator.” That’s when I decided to Google “bs lipsum generator” and came across Corporate Lipsum. What I discovered next was something that I never imagined. There are tons of themed lorem ipsum generators out there. There are many that are comical and some are very innovative in their approach. Let’s take a look at 11 Lorem Ipsum Generators that will be sure to make testing content creation on your next Drupal project fun.

  1. Meet the Ipsums - This lorem ipsum generator is an aggregator of ipsums from many of the other lorem ipsum generators out there. It has one of the nicer looking designs of the lipsum generators I have seen.

  2. Pirate Ipsum - This lorem ipsum generator allows you to create lipsum text in pirate talk. This one is my favorite design so far for all of the lipsum generators I have seen. You can even buy posters of the artwork used on the site at the bottom of the page.

  3. Startupsum - This lorem ipsum generator allows you to create lipsum text with startup related terms and phrases. It has a clean design and has automatic text copy highlighting, which is a nice feature to have for these types of sites.

  4. Sagan Ipsum - This lorem ipsum generator spans the cosmos and generates lipsum text of one of the great literary geniuses of our time, Carl Sagan. Take your next project to a new level of understanding with this lipsum generator. It includes a text selection button to make copying the generated text much easier.

  5. Pick Sum Ipsum - This lorem ipsum generator has several options for generating lipsum text which are fairly unique. It has the most innovative feature out of the sites I have seen that allows you to have 2 actors “brawl” to combine their generated lipsum text together for even more fun. This generator includes a button to select the text to copy, however it takes more time to actually generate the text due to its multiple options.

  6. Savage Ipsum - This lorem ipsum generator is one of my favorites for content, as it generates words and phrases that “Macho Man” Randy Savage would say during his wrestling career. Ooooh yeeeah!

  7. DeLorean Ipsum - This lorem ipsum generator is a throwback to one of my favorite movie franchises, Back to the Future. Generate some “heavy” lipsum text for your next site build with this one.

  8. Hodor Ipsum - This lorem ipsum generator is one of the funnier lipsum generators I have found. It generates lipsum text in the language of the Hodor characters from Game of Thrones. Keeping in character, the site is even created by Hodor.

  9. Cheese Ipsum - This lorem ipsum generator has a great design and the delicious subject matter of cheese related words and phrases. You may find yourself looking for lunch earlier using this lipsum generator.

  10. Bacon Ipsum - This lorem ipsum generator falls in the food category again, but this one generates text with a bit more meat to it. There are no fancy features on this lipsum generator, but it had me at bacon. It even has Turducken!

  11. Pig Latin Ipsum - I figured with all these types of lorem ipsum generators, there had to be one for Pig Latin, and there is! This lipsum generator will generate lipsum text that is translated to Pig Latin. The best part is you can translate any text you want, so technically you could turn any of the lipsum text from the previously mentioned lipsum generators into this one and get an even funnier result.

That rounds out my list of favorites that I have found so far, but there are tons of other fun and innovative lorem ipsum text generators out there to check out. Some even fill other needs for Drupal site building, such as allowing you to generate placeholder HTML code and generate placeholder images. Not Lorem Ipsum takes a completely different approach on things by giving you ideas of what to write for your actual content based on your industry. As you may have guessed, there is also a module for that called Drupal Ipsum. Drupal Ipsum allows you to generate Drupal related ipsum text and even integrates with Devel. Let us know what your favorite lipsum generator is in the comments below.

Terms: Drupal, site building, Drupal Planet

orkjerns blogg: Drupal and IoT. Code examples, part 2: User temperatures!

Planet Drupal - Mon, 25/05/2015 - 20:29
This will be the second part of the code examples, and the third in the series about Drupal and the Internet of Things.

If you haven't read the other parts, you will be fine without them. But if you want to, the front page has a complete list.

In this part we will look at simplifying the request flow for the client, while still keeping a certain level of security for our endpoint. There are several ways of doing this, but today we will look at a suggestion on how to implement API keys per user.

Let me just start first by saying there are several additional steps you could (and ideally should) implement to get a more secure platform, but I will touch on these towards the end of the article.

First, let's look at a video I posted in the first blog post. It shows an example of posting the temperature to a Drupal site.

Architecture

Let's look at the scenario we will be implementing

  • A user registers an account on our Drupal 8 site
  • The user is presented with a path where they can POST their temperatures (for example example.com/user/5/user_temp_post)
  • The temperature is saved for that user when that request is made

See any potential problems? I sure hope so. Let's move on.

Registering the temperature in a room and sending to Drupal

As last time, I will not go into microcontroller or hardware-specific implementation details in this blog post. But the code is available on github. I will quickly go through the technical steps and technologies used here:

  • I use a Raspberry Pi 2, but the code should work on any Raspberry Pi model
  • I use a waterproof DS18B20 sensor, but any DS18B20 should work. I have a waterproof one because I use it to monitor my beer brewing :)
  • The sensor checks the temperature at a certain interval (per default, 1 minute)
  • The temperature data is sent to the Drupal site and a node is created for each new registration
  • To authenticate the requests, they are sent with an x-user-temp header including the API key

This scenario is a bit different from the very real time example in the video above, but it is both more flexible (in terms of having a history of temperatures) and real-life (since temperatures seldom have such real-time changes as the one above).

Receiving temperatures in Drupal

The obvious problem with the situation described above is the authentication and security of the transferred data. Not only do we not want people to be able to just POST data to our site with no authentication, we are also dealing with temperatures per user. So what is to stop a person from just POSTing a temperature on behalf of another user? The last post dealt with using the same user session as your web browser, but today we are going to look at using API keys.

If you have ever integrated a third party service to Drupal (or used a module that integrates a third party service) you are probably familiar with the concept of API keys. API keys are used to specify that even though a "regular" request is made, a secret token is used to prove that the request originates from a certain user. This makes it easy to use together with internet connected devices, as you would not need to obtain (and possibly maintain) a session cookie to authenticate as your user.

Implementation details

So for this example, I went ahead and implemented a "lo-fi" version of this as a module for Drupal 8. You can check out the code at github if you are eager to get all the details. Also, I deployed the code on Pantheon so you can actually go there and register an account and POST temperatures if you want!

The first step is to actually generate API keys for users that want one. My implementation just generates one for users when they visit their "user temperatures" tab for the first time.

Side note: The API key in the picture is not the real one for my user.
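For illustration, one way to generate such a key in Drupal 8 is with the Crypt utility from core. This is only a sketch of the idea; how the real module stores and exposes the key is a separate concern:

use Drupal\Component\Utility\Crypt;

// Generate a URL-safe random token to use as this user's API key.
$api_key = Crypt::randomBytesBase64(32);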

Next step is to make sure that we use a custom access callback for the path we have defined as the endpoint for temperatures. In my case, I went with making the endpoint per user, so the path is /user/{uid}/user_temp_post. In Drupal 7 you would accomplish this custom access check by simply specifying something like this in your hook_menu:

'access callback' => 'my_module_access_callback',
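For comparison, a bare-bones Drupal 7 route using such a callback could be declared like this (the my_module_* names mirror the snippet above and are hypothetical, not taken from the actual module):

/**
 * Implements hook_menu().
 */
function my_module_menu() {
  $items['user/%user/user_temp_post'] = array(
    'title' => 'Post user temperatures',
    'page callback' => 'my_module_user_temp_post_page',
    'page arguments' => array(1),
    'access callback' => 'my_module_access_callback',
    'access arguments' => array(1),
    'type' => MENU_CALLBACK,
  );
  return $items;
}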

In Drupal 8, however, we are using a my_module.routing.yml file for routes we are defining. So we also need to specify in this file what the criteria for allowing access should be. For a very good example of this, I found the user.module to be very helpful. My route for the temperature POST ended up like this:

user_temp.post_temperatures:
  path: '/user/{user}/user_temp_post'
  defaults:
    _controller: '\Drupal\user_temp\Controller\UserTempController::post'
    _title: 'Post user temperatures'
  requirements:
    _access_user_temp_post: 'TRUE'

In this case, '_access_user_temp_post' is the criterion for allowing access. You can see this in the user_temp.services.yml file of the module. From there you can also see that Drupal\user_temp\Access\PostTempAccessCheck is the class responsible for checking access to the route. In this class we must make sure to return a Drupal\Core\Access\AccessResult to indicate whether the user is allowed access or not.
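To give an idea of the shape of such an access checker, here is a stripped-down sketch. The condition below is only a placeholder for readability; the actual class in the module validates the API key, and the class is wired up to the '_access_user_temp_post' requirement via the access_check service tag in user_temp.services.yml:

namespace Drupal\user_temp\Access;

use Drupal\Core\Access\AccessResult;
use Drupal\Core\Routing\Access\AccessInterface;
use Drupal\Core\Session\AccountInterface;

class PostTempAccessCheck implements AccessInterface {

  /**
   * Checks access to the temperature POST route.
   */
  public function access(AccountInterface $account) {
    // Placeholder condition: only allow authenticated users. The real
    // implementation checks the x-user-temp API key instead.
    return AccessResult::allowedIf($account->isAuthenticated());
  }

}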

Some potential questions about the approach

From there on in, the code for the POST controller should provide you with the answers you need. And if the code is not enough, you can try reading the tests of the client part or the Drupal part. I will proceed with making assumptions about theoretical questions regarding the implementation:

How is this different from using the session cookie?

It is different in two aspects. The API key will not expire for reasons beyond your control. Or more precisely, the device's control. You can also reset the API key manually if you want it to expire. The other big difference is that if your API key should be compromised, your account is not compromised in any way (as would be the case if a valid session cookie were to be compromised). Beyond that, please observe that in one area this is no different from using a session cookie: the requests should be made over https, especially if you are using a wifi connection.

How can I further strengthen the security of this model?

One "easy" way to do this is to not expose the API key as part of the request. I was originally planning to implement this, but realised this might make my original point a little less clear. What I would do as a another "lo-fi" hardening would be to make the x-user-temp header just include a hash of the temperature sent and the user API key. This way, if someone were sniffing the requests, they would just see that the x-user-temp header would change all the time, and so it would take a considerable effort to actually forge the requests (compared to just observing the key in the header).

Why are you using nodes? Isn't that very much overhead for this?

This is a fair point. It might be a bit overkill for something so simple. But there are two bonus parts about using nodes:

  • We can use views to display our data.
  • We can ship the views, content types and fields as configuration with our module.
This last part is especially powerful in Drupal 8, and incredibly easy to accomplish. For the files required for this particular implementation, you can reference the config/install directory of the module.

But since you are posting nodes, why aren't you using the REST module?

I admit it, I have no good reason for this beyond that I wanted to make this post be about implementing API keys for authentication. Also, here is a spoiler alert: Code examples part 3 will actually be using the REST module for creating nodes.

What if I want to monitor both my living room and my wine cellar? This is only one endpoint per user!

I am sorry for not implementing that in my proof of concept code, but I am sure you can think of a creative solution to the problem. Also, luckily for you, the code is open source so you are free to make any changes required to monitor your wine cellar. "Pull requests welcome" as they say.

As always, if you have any question or criticism (preferably beyond the points made above) I would love to hear thoughts on this subject in the comments. To finish it all off, I made an effort to find a temperature related gif. Not sure the effort shows in the end result.


Netstudio.gr Blog: Drupal 8 is around the corner

Planet Drupal - Mon, 25/05/2015 - 09:27

Four years ago we made an important decision at Netstudio: We would migrate to Drupal. Drupal 7 had just come out, and despite our having mastered Joomla at online-store level, we simply could not ignore Drupal’s embedded Content Construction Kit, its multi-lingual support, safety, stability, scalability, search engine friendliness, but, most of all, the professionalism of the Drupal community.

Drupal 7 helped us undertake even bigger and more complex projects, and build all of our solutions for ecommerce and presentation websites, including even our ERP, project management tool, and our quote generation and hiring systems. It was on Drupal we based our company’s projects, like visibility.gr and servalarm.com, as well as more than 200 websites and online stores for our customers.

Not for a single moment have we regretted our migration to Drupal, despite our having to learn a new ecosystem from scratch.

Just six months after Drupal 7 was launched, the Drupal community set off for Drupal 8. Dries Buytaert set very high standards for the new version of Drupal defining the main pillars to set it on the crest of the wave.

Four and a half years of development later, beta 10 release is here, signalling the approach of the final release in less than 6 months from now!

More than 2800 programmers − twice as many as for its predecessor − have contributed to Drupal 8, developing hundreds of new features.

The most significant structural change is that Drupal 8 is based on the Symfony2 framework, thus turning it into an object-oriented platform. From now on, large chunks of code, such as Guzzle, Twig (the new theming system), Composer, etc., come from the Symfony2 community.

Moreover, Drupal 8 brings along enormous changes in the way we develop websites.

Internationalization

Multilingual support is very important for us, since we are based in Greece and our many international clients require their website to operate in English and the language spoken in their country. Some of our clients run multilingual websites (e.g. ivf-embryo.gr is available in 14 languages). Drupal 7 supports multilingual websites much better than Joomla or WordPress, but requires installing and configuring over 15 additional modules. In some extreme cases full support requires a programmer’s intervention, so it is not fully manageable by a non-technical administrator.

Multilingual support is embedded in the core of Drupal 8 requiring the activation of only 4 modules. Drupal can now be installed and operated in the administrator’s native language from start to finish. Gábor Hojtsy, heading a large team of contributors for this pillar, did an excellent job solving dozens of issues on Drupal 7, and transforming Drupal 8 into the most comprehensive platform available for developing multilingual websites and open-source-code applications!

Views in Core

For many, the Views module is the basic reason for choosing Drupal. It allows displaying data to the user without them having to write SQL queries. In Drupal 8 the Views module is conveniently embedded in core, so you can start building your website without having to install an additional module, and you get stability, full integration and interoperability. Furthermore, all administration screens are now Views, allowing the administrator to change fields and their appearance with just a few clicks.

WYSIWYG and Inline Edit

Content editing in Drupal 7 hasn’t been the best experience for the administrator. WordPress is still the leader in convenience and usability in this area. At Netstudio we have been installing dozens of modules in order to provide our customers with an environment almost as friendly as that of WordPress. Drupal 8 is a huge step in that direction. Its basic installation offers the content manager enhanced usability in a much friendlier environment (in fact, Netstudio has contributed some usability tests via UserFeel.com). The new release features a configured built-in WYSIWYG editor, as well as the impressive cutting-edge Inline Edit, making quick content interventions a piece of cake for the front-end administrator.

Anything left for the next Drupal 8 (maybe 8.1) release? Well, yes. Media administration (photos, videos etc.). Although there have been improvements, additional modules need to be installed for a comprehensive solution, as is the case of the recently released Entity Embed and Entity Browser.

Mobile friendly

Now that the mobile web is a reality, it all seems so matter-of-fact. In 2011 however, at the onset of the Drupal 7 era, all we knew about the mobile web was forecasts. Fortunately, Dries Buytaert took these forecasts seriously, so he set the requirements for Drupal 8 aiming at its full compatibility with mobile devices. And his efforts were not in vain. The environment all around Drupal 8 is not only responsive, but also very easy for mobile and tablet users, even regarding contextual links, the admin menu and inline editing. The bottom line is that any administrator can manage their website comfortably commuting on a bus seat, on their bed or enjoying the sunshine on the beach!

Performance & Scalability

A platform’s performance and its ability to juggle sudden spikes of traffic have been major concerns of ours, partly explaining why we migrated to Drupal 7. Initially, Drupal 8 was much slower than Drupal 7, as shown in a comparative analysis we published two years ago. The obvious reason of course was that it was still under development and lacking the necessary speed optimization. This has now changed. About a month ago, internal cache was set to on by default. We’re talking about a much smarter cache than that in Drupal 7. Speed racing is in progress, with developer Wim Leers in the lead. Here we expect dramatic improvements and capabilities once "SmartCache" and "BigPipe" (Facebook-style page loading) have been embedded in the core.

Configuration Management

I have a crush on this one, can’t wait for it, as it solves the issue of maintaining the site configuration under version control. In Drupal 7 website configuration and content were stored in the database, sometimes in the same tables, making their management almost impossible. We have repeatedly tried to resolve this issue using features, strongarm etc, but have given up as these solutions were too time-consuming and costly. We’ve been going out of our way creating scripts to control functionality, and checking over 300 points regarding security, performance, SEO etc. before delivering a website to our client. Still, this may not be considered a comprehensive solution. In Drupal 8 configuration is stored in YML files. That makes version control management, and data transfer from website to website or from environment to environment (e.g. development > staging > testing > live) a breeze.

Next Releases

Another important change in Drupal 8 is the rate of publication of new releases. In Drupal 7, subsequent releases (7.01, 7.02, and the most recent one, 7.37) focused exclusively on fixing bugs and security issues. During these last four and a half years only a few features have been added, in accordance with the backward compatibility policy. This changes in Drupal 8. Versions 8.1, 8.2, etc. will not consist of "minor updates" only, but will add new functionality. What it boils down to is that from now on, Drupal will be integrating technology innovation much faster.

More Novelties

Over and above the most important new features mentioned above, Drupal 8 comes with many more innovations like fieldable blocks, a greater range of integrated field types (date, entity reference, phone, email, link), the tour module, embedded schema.org output, enhanced accessibility, content publishing preview, friendlier front-end development, and more. What’s your main reason you can’t wait for Drupal 8?
Leave your comment below.

When will it be ready?

DrupalCon Los Angeles, with dozens of developers racing in Coding Sprints for the completion of Drupal 8, is now over. Only 20 critical issues (tasks and bugs) were left unsolved!

The RC (Release Candidate) version will be released once the last of these issues has been resolved. The final version of Drupal 8 will be released 15 days after critical tasks and critical bugs have been brought down to zero. This is likely to take less than six months, but we’ll need to wait for another few months before we see the top contributed modules upgraded to Drupal 8. Knowing us, however, I’m sure we’ll start developing some simple presentation websites on the release of the almost final version. However, we’ll have to wait for 3-6 extra months before we can use Drupal 8 in more demanding installations. Meanwhile, we can all keep an eye on the top 100 contributed modules and their Drupal 8 updating status at: http://www.bluespark.com/status-top-100-contributed-modules-drupal-8

We can’t wait for Drupal 8. Can you?

Now you know our reasons of impatience to start working on Drupal 8. What about your reasons? Do you plan on using Drupal 8? Have you used it already? What do you like most about it? Write your comment here.


Web Omelette: Debugging Drupal redirects

Planet Drupal - Mon, 25/05/2015 - 09:00

I was working on a big website with many contrib and custom modules. And I had to debug a very annoying redirect that started happening sometime in the recent past, not sure when. Some pages simply just redirected to other URLs.

I figured out that the problem was one of a 301 redirect. My browser told me that at least. But good luck figuring out where in the code I can find the culprit. Xdebug breakpoints everywhere but to no avail. A search for drupal_goto in the custom modules directory didn't help either, and God be with anyone trying to search through a contrib folder of that size.

Then it hit me. Isn't there a hook invoked inside drupal_goto? At this point I was assuming (and hoping really) that the redirect was happening somehow with a drupal_goto. And it turns out there is one: hook_drupal_goto_alter.

Armed with a new dose of hope, I implemented the hook and cleared the cache. Inside, I added the obligatory $test = ''; statement and put a breakpoint on it. Let's see what happens. After loading one of the offending pages, the breakpoint halted the execution and the Xdebug call stack in my PHPStorm immediately pointed out the problem: Global Redirect. There was some URL rewriting happening on the site so GR got a bit confused and was redirecting back to the original path. The details of the issue are however not important.
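For reference, a bare-bones implementation of that hook looks something like this (mymodule is a placeholder; I used an Xdebug breakpoint rather than logging, but dumping a backtrace with watchdog() answers the same "who is calling this?" question):

/**
 * Implements hook_drupal_goto_alter().
 */
function mymodule_drupal_goto_alter(&$path, &$options, &$http_response_code) {
  // Log the destination and a backtrace to see who is calling drupal_goto().
  watchdog('mymodule', 'Redirect to @path triggered by: <pre>@trace</pre>', array(
    '@path' => $path,
    '@trace' => print_r(debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS), TRUE),
  ));
}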

My point is that using this hook, I could see exactly who was calling drupal_goto() and why. I didn't use it for anything else, apart from learning why the redirect was happening, which in turn allowed me to write some code that prevented it.

Awesome. I learned about a new hook. And maybe now you as well.


Russ Allbery: Catch-up haul

Planet Debian - Mon, 25/05/2015 - 01:44

As always, even though I've not been posting much, I'm still buying books. This is a catch-up post listing a variety of random purchases.

Katherine Addison — The Goblin Emperor (sff)
Milton Davis — From Here to Timbuktu (sff)
Mark Forster — How to Make Your Dreams Come True (non-fiction)
Angela Highland — Valor of the Healer (sff)
Marko Kloos — Terms of Enlistment (sff)
Angela Korra'ti — Faerie Blood (sff)
Cixin Liu — The Three-Body Problem (sff)
Emily St. John Mandel — Station Eleven (sff)
Sydney Padua — The Thrilling Adventures of Lovelace and Babbage (graphic novel)
Melissa Scott & Jo Graham — The Order of the Air Omnibus (sff)
Andy Weir — The Martian (sff)

Huh, for some reason I thought I'd bought more than that.

I picked up the rest of the Hugo nominees that aren't part of a slate, and as it happens have already read all the non-slate nominees at the time of this writing (although I'm horribly behind on reviews). I also picked up the first book of Marko Kloos's series, since he did the right thing and withdrew from the Hugos once it became clear what nonsense was going on this year.

The rest is a pretty random variety of on-line recommendations, books by people who made sense on the Internet, and books by authors I like.


Wouter Verhelst: Fixing CVE-2015-0847 in Debian

Planet Debian - Sun, 24/05/2015 - 21:18

Because of CVE-2015-0847 and CVE-2013-7441, two security issues in nbd-server, I've had to prepare updates for nbd, for which there are various supported versions: upstream, unstable, stable, oldstable, oldoldstable, and oldoldstable-backports. I've just finished uploading security fixes for the various supported versions of nbd-server in Debian. There are various relevant archives, and unfortunately it looks like they all have their own way of doing things regarding security:

  • For squeeze-lts (oldoldstable), you check out the secure-testing repository, run a script from that repository that generates a DLA number and email template, commit the result, and send a signed mail (whatever format) to the relevant mailinglist. Uploads go to ftp-master with squeeze-lts as target distribution.
  • For backports, you send a mail to the team alias requesting a BSA number, do the upload, and write the mail (based on a template that you need to modify yourself), which you then send (inline signed) to the relevant mailinglist. Uploads go to ftp-master with $dist-backports as target distribution, but you need to be in a particular ACL to be allowed to do so. However, due to backports policy, packages should never be in backports before they are in the distribution from which they are derived -- so I refrained from uploading to backports until the regular security update had been done. Not sure whether that's strictly required, but I didn't think it would do harm; even so, that did mean the procedure for backports was even more involved.
  • For the distributions supported by the security team (stable and oldstable, currently), you prepare the upload yourself, ask permission from the security team (by sending a debdiff), do the upload, and then ask the security team to send out the email. Uploads go to security-master, which implies that you may have to use dpkg-buildpackage's -sa parameter in order to make sure that the orig.tar.gz is actually in the security archive.
  • For unstable and upstream, you Just Upload(TM), because it's no different from a regular release.

While I understand how the differences between the various approaches have come to exist, I'm not sure I understand why they are necessary. Clearly, there's some room for improvement here.

As anyone who reads the above may see, doing an upload for squeeze-lts is in fact the easiest of the three "stable" approaches, since no intermediate steps are required. While I'm not about to advocate dropping all procedures everywhere, a streamlining of them might be appropriate.


CiviCRM Blog: Easier creation of email newsletters - New version

Planet Drupal - Sun, 24/05/2015 - 16:24

When preparing an email newsletter, one time-consuming part is gathering together all the content that is needed. In my experience, virtually all the content already exists elsewhere, such as in the local CMS, in CiviCRM, on a blog, or in some other online source. So I was thinking about how I could make this process easier. What I did: I created mail merge tokens for CiviCRM that autofill a list of recent blog posts, stories, or any other type of CMS content. So the end user sees a list of tokens, one for each content type, each term/category, each aggregator feed, and each date range, such as "Content of type 'blog' created in the last 7 days". What is particularly powerful about this approach is that if you are also using a CMS aggregator (such as the aggregator module in Drupal core), then virtually any external RSS feed is turned into CMS content, which is now available as a CiviCRM token. (The original blog post about this extension is at: https://civicrm.org/blogs/pogstonesarahgladstone/easier-creation-email-newsletters-content-tokens )

Thanks to community involvement (specifically thanks to https://github.com/jorich-2000), there is a new version of the Content Token extension.  This version now supports Joomla, in addition to Drupal7, Drupal6, and WordPress.

The latest version is 2.9 and can be downloaded from: https://civicrm.org/extensions/content-tokens

I am looking forward to getting feedback on this.

Russ Allbery: git-pbuilder 1.34

Planet Debian - Sun, 24/05/2015 - 02:59

Long time without a blog post. My time got eaten by work and travel and work-related travel. Hopefully more content soon.

This is just a quick note about the release of version 1.34 of the git-pbuilder script (which at some point really should just be rewritten in Python and incorporated entirely into the git-buildpackage package). Guido Günther added support for creating chroots for LTS distributions.

You can get the latest version from my scripts page.


Jim Birch: Drupal 7 Panels: Page Manager Existing Pages

Planet Drupal - Sat, 23/05/2015 - 17:47

I plan on doing a more in depth article on how I've been using Panels instead of templates or contexts for laying out this Drupal 7 site, but I feel like I still have more to learn.  Until then, I wanted to share what I found to be a missing piece of the puzzle, Page Manager Existing Pages.

PMEP allows you to override any page that is in the admin menu for use in Page Manager. That way, you can create variants and add whatever layout, content, and selection rules you want. Without this plugin, you get an error message in Page Manager when trying to override an existing URL.

So, where would I use this?  Page Manager comes with defaults for Node, Taxonomy, and some User pages, most of what you need to present your site to the world.  But there are certain administration pages that slipped through the cracks when viewed in a front-end theme.  For example, node/add, which lists all the content types you can add, or the /admin/appearance/styleguide page generated by the Style Guide module.

Install and configure Page Manager Existing Pages



Eddy Petrișor: HOWTO: No SSH logins SFTP only chrooted server configuration with OpenSSH

Planet Debian - Sat, 23/05/2015 - 16:44
If you are in a situation where you want to set up a SFTP server in a more secure way, don't want to expose anything from the server via SFTP and do not want to enable SSH login on the account allowed to sftp, you might find the information below useful.

What do we want to achieve:
  • SFTP server
  • only a specified account is allowed to connect to SFTP
  • nothing outside the SFTP directory is exposed
  • no SSH login is allowed
  • any extra security measures are welcome
To obtain all of the above, we will create a dedicated account which will be chroot-ed, and whose home will be stored on a removable/not always mounted drive (accessing SFTP will not work when the drive is not mounted).

Mount the removable drive which will hold the SFTP area (you might need to add some entry in fstab). 

Create the account to be used for SFTP access (on a Debian system this will do the trick):
# adduser --system --home /media/Store/sftp --shell /usr/sbin/nologin sftp
This will create the account sftp which has login disabled, shell is /usr/sbin/nologin and create the home directory for this user.

Unfortunately, the default ownership of the home directory of this user is incompatible with chroot-ing in SFTP (which prevents access to other files on the server). A message like the one below will be generated in this kind of case:
$ sftp -v sftp@localhost
[..]
sftp@localhost's password:
debug1: Authentication succeeded (password).
Authenticated to localhost ([::1]:22).
debug1: channel 0: new [client-session]
debug1: Requesting no-more-sessions@openssh.com
debug1: Entering interactive session.
Write failed: Broken pipe
Couldn't read packet: Connection reset by peer

Also /var/log/auth.log will contain something like this:
fatal: bad ownership or modes for chroot directory "/media/Store/sftp"
The default permissions are visible using the 'namei -l' command on the sftp home directory:
# namei -l /media/Store/sftp
f: /media/Store/sftp
drwxr-xr-x root root    /
drwxr-xr-x root root    media
drwxr-xr-x root root    Store
drwxr-xr-x sftp nogroup sftp

We change the ownership of the sftp directory and make sure there is a place for files to be uploaded in the SFTP area:
# chown root:root /media/Store/sftp
# mkdir /media/Store/sftp/upload
# chown sftp /media/Store/sftp/upload
We isolate the sftp users from other users on the system and configure a chroot-ed environment for all users accessing the SFTP server:
# addgroup sftpusers
# adduser sftp sftpusers

Set a password for the sftp user so password authentication works:
# passwd sftp

Putting all pieces together, we restrict access only to the sftp user, allow it access via password authentication only to SFTP, but not SSH (and disallow tunneling and forwarding or empty passwords).

Here are the changes done in /etc/ssh/sshd_config:
PermitEmptyPasswords no
PasswordAuthentication yes
AllowUsers sftp
Subsystem sftp internal-sftp
Match Group sftpusers
        ChrootDirectory %h
        ForceCommand internal-sftp
        X11Forwarding no
        AllowTcpForwarding no
        PermitTunnel no

Reload the sshd configuration (I'm using systemd):
# systemctl reload ssh.service

Check the sftp user can't login via SSH:
$ ssh sftp@localhost
sftp@localhost's password:
This service allows sftp connections only.
Connection to localhost closed.

But SFTP is working and is restricted to the SFTP area:
$ sftp sftp@localhost
sftp@localhost's password:
Connected to localhost.
sftp> ls
upload 
sftp> pwd
Remote working directory: /
sftp> put netbsd-nfs.bin
Uploading netbsd-nfs.bin to /netbsd-nfs.bin
remote open("/netbsd-nfs.bin"): Permission denied
sftp> cd upload
sftp> put netbsd-nfs.bin
Uploading netbsd-nfs.bin to /upload/netbsd-nfs.bin
netbsd-nfs.bin                                                              100% 3111KB   3.0MB/s   00:00

Now your system is ready to accept sftp connections, things can be uploaded in the upload directory and whenever the external drive is unmounted, SFTP will NOT work.

Note: Since we added 'AllowUsers sftp', you can verify that no other local user can login via SSH. If you don't want to restrict access only to the sftp user, you can whitelist other users by adding them to the AllowUsers directive, or drop it entirely so all local users can SSH into the system.

DebConf team: Second Call for Proposals and Approved Talks for DebConf15 (Posted by DebConf Content Team)

Planet Debian - Sat, 23/05/2015 - 16:36

DebConf15 will be held in Heidelberg, Germany from the 15th to the 22nd of August, 2015. The clock is ticking and our annual conference is approaching. There are less than three months to go, and the Call for Proposals period closes in only a few weeks.

This year, we are encouraging people to submit “half-length” 20-minute events, to allow attendees to have a broader view of the many things that go on in the project in the limited amount of time that we have.

To make sure that your proposal is part of the official DebConf schedule you should submit it before June 15th.

If you have already sent your proposal, please log in to summit and make sure to improve your description and title. This will help us fit the talks into tracks, and devise a cohesive schedule.

For more details on how to submit a proposal see: http://debconf15.debconf.org/proposals.xhtml.

Approved Talks

We have processed the proposals submitted up to now, and we are proud to announce the first batch of approved talks. Some of them:

  • This APT has Super Cow Powers (David Kalnischkies)
  • AppStream, Limba, XdgApp: Past, present and future (Matthias Klumpp)
  • Onwards to Stretch (and other items from the Release Team) (Niels Thykier for the Release Team)
  • GnuPG in Debian report (Daniel Kahn Gillmor)
  • Stretching out for trustworthy reproducible builds - creating bit by bit identical binaries (Holger Levsen & Lunar)
  • Debian sysadmin (and infrastructure) from an outsider/newcomer perspective (Donald Norwood)
  • The Debian Long Term Support Team: Past, Present and Future (Raphaël Hertzog & Holger Levsen)

If you have already submitted your event and haven’t heard from us yet, don’t panic! We will contact you shortly.

We would really like to hear about new ideas, teams and projects related to Debian, so do not hesitate to submit yours.

See you in Heidelberg,
DebConf Team


Francois Marier: Usual Debian Server Setup

Planet Debian - Sat, 23/05/2015 - 11:00

I manage a few servers for myself, friends and family as well as for the Libravatar project. Here is how I customize recent releases of Debian on those servers.

Hardware tests

apt-get install memtest86+ smartmontools e2fsprogs

Prior to spending any time configuring a new physical server, I like to ensure that the hardware is fine.

To check memory, I boot into memtest86+ from the grub menu and let it run overnight.

Then I check the hard drives using:

smartctl -t long /dev/sdX
badblocks -swo badblocks.out /dev/sdX

Configuration

apt-get install etckeeper git sudo vim

To keep track of the configuration changes I make in /etc/, I use etckeeper to keep that directory in a git repository and make the following changes to the default /etc/etckeeper/etckeeper.conf:

  • turn off daily auto-commits
  • turn off auto-commits before package installs

To get more control over the various packages I install, I change the default debconf level to medium:

dpkg-reconfigure debconf

Since I use vim for all of my configuration file editing, I make it the default editor:

update-alternatives --config editor

ssh

apt-get install openssh-server mosh fail2ban

Since most of my servers are set to UTC time, I like to use my local timezone when sshing into them. Looking at file timestamps is much less confusing that way.

I also ensure that the locale I use is available on the server by adding it to the list of generated locales:

dpkg-reconfigure locales

Other than that, I harden the ssh configuration and end up with the following settings in /etc/ssh/sshd_config (jessie):

HostKey /etc/ssh/ssh_host_ed25519_key
HostKey /etc/ssh/ssh_host_rsa_key
HostKey /etc/ssh/ssh_host_ecdsa_key
KexAlgorithms curve25519-sha256@libssh.org,ecdh-sha2-nistp521,ecdh-sha2-nistp384,ecdh-sha2-nistp256,diffie-hellman-group-exchange-sha256
Ciphers chacha20-poly1305@openssh.com,aes256-ctr,aes192-ctr,aes128-ctr
MACs hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-512,hmac-sha2-256,umac-128@openssh.com
UsePrivilegeSeparation sandbox
AuthenticationMethods publickey
PasswordAuthentication no
PermitRootLogin no
AcceptEnv LANG LC_* TZ
LogLevel VERBOSE
AllowGroups sshuser

or the following for wheezy servers:

HostKey /etc/ssh/ssh_host_rsa_key
HostKey /etc/ssh/ssh_host_ecdsa_key
KexAlgorithms ecdh-sha2-nistp521,ecdh-sha2-nistp384,ecdh-sha2-nistp256,diffie-hellman-group-exchange-sha256
Ciphers aes256-ctr,aes192-ctr,aes128-ctr
MACs hmac-sha2-512,hmac-sha2-256

On those servers where I need duplicity/paramiko to work, I also add the following:

KexAlgorithms ...,diffie-hellman-group-exchange-sha1
MACs ...,hmac-sha1

Then I remove the "Accepted" filter in /etc/logcheck/ignore.d.server/ssh (first line) to get a notification whenever anybody successfully logs into my server.

I also create a new group and add the users that need ssh access to it:

addgroup sshuser
adduser francois sshuser

and add a timeout for root sessions by putting this in /root/.bash_profile:

TMOUT=600

Security checks

apt-get install logcheck logcheck-database fcheck tiger debsums
apt-get remove john john-data rpcbind tripwire

Logcheck is the main tool I use to keep an eye on log files, which is why I add a few additional log files to the default list in /etc/logcheck/logcheck.logfiles:

/var/log/apache2/error.log
/var/log/mail.err
/var/log/mail.warn
/var/log/mail.info
/var/log/fail2ban.log

while ensuring that the apache logfiles are readable by logcheck:

chmod a+rx /var/log/apache2
chmod a+r /var/log/apache2/*

and fixing the log rotation configuration by adding the following to /etc/logrotate.d/apache2:

create 644 root adm

I also modify the main logcheck configuration file (/etc/logcheck/logcheck.conf):

INTRO=0
FQDN=0

Other than that, I enable daily checks in /etc/default/debsums and customize a few tiger settings in /etc/tiger/tigerrc:

Tiger_Check_RUNPROC=Y
Tiger_Check_DELETED=Y
Tiger_Check_APACHE=Y
Tiger_FSScan_WDIR=Y
Tiger_SSH_Protocol='2'
Tiger_Passwd_Hashes='sha512'
Tiger_Running_Procs='rsyslogd cron atd /usr/sbin/apache2 postgres'
Tiger_Listening_ValidProcs='sshd|mosh-server|ntpd'

General hardening

apt-get install harden-clients harden-environment harden-servers apparmor apparmor-profiles apparmor-profiles-extra

While the harden packages are configuration-free, AppArmor must be manually enabled:

perl -pi -e 's,GRUB_CMDLINE_LINUX="(.*)"$,GRUB_CMDLINE_LINUX="$1 apparmor=1 security=apparmor",' /etc/default/grub
update-grub

Entropy and timekeeping

apt-get install haveged rng-tools ntp

To keep the system clock accurate and increase the amount of entropy available to the server, I install the above packages and add the tpm_rng module to /etc/modules.

Preventing mistakes

apt-get install molly-guard safe-rm sl

The above packages are all about catching mistakes (such as accidental deletions). However, in order to extend the molly-guard protection to mosh sessions, one needs to manually apply a patch.

Package updates

apt-get install apticron unattended-upgrades deborphan debfoster apt-listchanges update-notifier-common aptitude popularity-contest

These tools help me keep packages up to date and remove unnecessary or obsolete packages from servers. On Rackspace servers, a small configuration change is needed to automatically update the monitoring tools.

In addition to this, I use the update-notifier-common package along with the following cronjob in /etc/cron.daily/reboot-required:

#!/bin/sh
cat /var/run/reboot-required 2> /dev/null || true

to send me a notification whenever a kernel update requires a reboot to take effect.

Handy utilities

apt-get install renameutils atool iotop sysstat lsof mtr-tiny

Most of these tools are configuration-free, except for sysstat, which requires enabling data collection in /etc/default/sysstat to be useful.

Apache configuration

apt-get install apache2-mpm-event

While configuring apache is often specific to each server and the services that will be running on it, there are a few common changes I make.

I enable these in /etc/apache2/conf.d/security:

<Directory />
    AllowOverride None
    Order Deny,Allow
    Deny from all
</Directory>

ServerTokens Prod
ServerSignature Off

and remove cgi-bin directives from /etc/apache2/sites-enabled/000-default.

I also create a new /etc/apache2/conf.d/servername which contains:

ServerName machine_hostname

Mail

apt-get install postfix

Configuring mail properly is tricky but the following has worked for me.

In /etc/hostname, put the bare hostname (no domain), but in /etc/mailname put the fully qualified hostname.

Change the following in /etc/postfix/main.cf:

inet_interfaces = loopback-only
myhostname = (fully qualified hostname)
smtp_tls_security_level = may
smtp_tls_protocols = !SSLv2, !SSLv3

Set the following aliases in /etc/aliases:

  • set francois as the destination of root emails
  • set an external email address for francois
  • set root as the destination for www-data emails

before running newaliases to update the aliases database.

Create a new cronjob (/etc/cron.hourly/checkmail):

#!/bin/sh
ls /var/mail

to ensure that email doesn't accumulate unmonitored on this box.

Finally, set reverse DNS for the server's IPv4 and IPv6 addresses and then test the whole setup using mail root.

Network tuning

To reduce the server's contribution to bufferbloat I change the default kernel queueing discipline by putting the following in /etc/sysctl.conf:

net.core.default_qdisc=fq_codel

Gizra.com: Forking Todo Restful with Backbone.Marionette

Planet Drupal - Fri, 22/05/2015 - 23:00

In this guest post, Luke Herrington shares his experience with integrating an existing Drupal backend with a Backbone.Marionette Todo app.

If you're reading this, you probably already know about all of the great work that Gizra has done in the Drupal/REST space. If you haven't, I highly recommend you check out their github repo. Also see the RESTful module.

One of the projects that Amitai has contributed is Todo Restful. It shows an Angular implementation of the canonical TodoMVC Javascript app connecting to a headless Drupal backend. It's a great demonstration of how easy exposing Drupal content with the RESTful module is. It also shows that when a RESTful API adheres to best practices, connecting it with clients that follow the same best practices is like a nice handshake.

I saw the Todo Restful project and it got me thinking, "If Amitai did this right (hint: he did), then I should be able to get this working with Backbone pretty easily". I was pleasantly surprised!

View demo | Get the source code

Todo app with a Drupal backend

Here's a simplified list of everything I had to do to get it working:



LevelTen Interactive: How to use Drupal Bootstrap with Webforms

Planet Drupal - Fri, 22/05/2015 - 22:16

If you are sub-theming Drupal Bootstrap, you are probably spoiled by all of the awesome functionality that comes with the Bootstrap framework and the Drupal Bootstrap theme. One place where you can’t easily throw a row and col class around your divs through the admin UI is if you are creating a Webform.

I came up with a quick solution to this that, with a little setup, allows the user to leverage Bootstrap through the Webform UI...


Mediacurrent: Highlights From DrupalCon Los Angeles

Planet Drupal - Fri, 22/05/2015 - 22:16

Last week, I was in sunny Los Angeles for DrupalCon 2015. Though many were seasoned veterans, it was my first time at a Con. It was a whirlwind of team building, a magical Prenote, great one-on-one conversations and plenty of Drupal talk. Needless to say, I'm still recovering! But if one thing is certain, our team had a wonderful time. Here are some of their takeaways:


Commercial Progression: DrupalCon LA 2015 Highlights with Steve Burge from OSTraining (E9)

Planet Drupal - Fri, 22/05/2015 - 20:22
Download

Commercial Progression presents Hooked on Drupal, “Episode 9: DrupalCon LA 2015 Highlights with Steve Burge from OSTraining".  In this special DrupalCon edition of Hooked on Drupal we conferenced in Steve Burge of OSTraining for an on-the-ground report from Los Angeles.  Held on May 11-15, 2015, DrupalCon LA was the premier event for the Drupal community.  Steve brings us the inside scoop on highlights and takeaways as the conference wraps up.  Additionally, Alex Fisher (also a DrupalCon veteran) shares his memories and insights from past DrupalCons.  Commercial Progression has recently sponsored OSTraining with a $5000 Kickstarter backing to bring Drupal 8 upgrade training to the masses.  This new collection of video resources will be released in September 2015.  With Dries' call at DrupalCon to support Drupal as a public utility, this announcement seems especially timely.

Hooked on Drupal is available for RSS syndication here at the Commercial Progression site. Additionally, each episode is available to watch online via our YouTube channel, within the iTunes store, on SoundCloud, and now via Stitcher.

If you would like to participate as a guest or contributor, please email us at social@commercialprogression.com.

Content Links and Related Information


 Hooked on Drupal Content Team

ALEX FISHER - Founder of Commercial Progression

STEVE BURGE - Founder of OSTraining


Left, Alex Fisher, founder and owner of Commercial Progression in Northville, Mich.
Right, Steve Burge of Sarasota, Fla., founder and CEO of OSTraining


Tags:  Hooked on Drupal, podcast, Drupal 8, DrupalCon, Planet Drupal, training, sponsorship
Categories: Elsewhere

Paul Booker: Creating an action to update the prices of your commerce products

Planet Drupal - Fri, 22/05/2015 - 15:07
/**
 * Implements hook_action_info().
 */
function mymodule_action_info() {
  return array(
    'mymodule_update_products' => array(
      'type' => 'entity',
      'label' => t('Update products by 2%'),
      'configurable' => FALSE,
      'triggers' => array('any'),
      'pass rows' => TRUE,
    ),
  );
}

/**
 * Action callback: increases a product's stored price amount by 2%.
 *
 * Note that this writes to the field data table directly, bypassing the
 * Field API, so field and entity caches will not reflect the change until
 * they are cleared.
 */
function mymodule_update_products(&$entity, $context) {
  $product_id = $entity->product_id;
  //dsm($product_id);
  $price = $entity->commerce_price[LANGUAGE_NONE][0]['amount'];
  //dsm($price);
  $updated_price = 1.02 * $price;
  $affected_rows = db_update('field_data_commerce_price')
    ->fields(array('commerce_price_amount' => $updated_price))
    ->condition('entity_id', $product_id)
    ->execute();
  //dsm($affected_rows);
}

/**
 * Helper: rounds a line item's unit price amount to the nearest 100
 * (i.e. to whole currency units when amounts are stored in cents).
 */
function mymodule_round_up_line_item_price($line_item_id) {
  $line_item = commerce_line_item_load($line_item_id);
  return round($line_item->commerce_unit_price[LANGUAGE_NONE][0]['amount'], -2);
}
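
Because the action is registered with 'triggers' => array('any'), it can be assigned through the Trigger module or run over many products with Views Bulk Operations. As a hedged usage sketch that is not part of the original snippet, the callback can also be applied to a single loaded product; the product ID here is made up:

// Hypothetical one-off usage: apply the 2% update to product 123.
$product = commerce_product_load(123);
if ($product) {
  mymodule_update_products($product, array());
}

Note that because the action writes to field_data_commerce_price directly, the field and entity caches may need to be cleared before the new amount shows up elsewhere on the site.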
Categories: Elsewhere

Gábor Hojtsy: New easy ways to explore Drupal 8's multilingual capabilities

Planet Drupal - Fri, 22/05/2015 - 14:57

The Drupal 8 multilingual team is really great at spreading know-how on the new things in the upcoming version, so we had our session (1h) and workshop (2h) recordings published and widely available. While we of course love our baby and can talk all day about it, who has hours to spare when they just want to explore what is coming up? We addressed just that this week with the following.

1. New 2m22s introduction video with the key benefits

2. A quick summary of key benefits and an easy-to-skim features list

http://www.drupal8multilingual.org/#topbenefits lists the top 12 benefits and http://www.drupal8multilingual.org/features provides the more detailed information in an easy to skim text form. And yeah, that 1h session video if you have the time.

3. Easy to launch demo to try features out

Thanks to our work on the multilingual workshops for DrupalCons, BADCamp and DrupalCamps, we have a demo with sample content in 4 languages that you can try out in your browser for 30 minutes on simplytest.me, with no registration or local software install required.

4. Check out who voted with their feet already

Drupal 8 is not yet released, yet there are numerous live multilingual Drupal 8 sites helping with nature preservation, finding health professionals or concert tickets among other good uses. Now there is a handy list to review at http://www.drupal8multilingual.org/showcase.

If you like what you see, we still have guided workshops (those that last 2h). The next one is coming up right this Sunday at DrupalCamp Spain. We also believe that the multilingual team is one of the best to get involved with if you want to know Drupal 8 better and give back some to improve the new version as well. We have weekly meetings and a huge sprint coming up at DrupalCon Barcelona. Maybe we'll have some opportunity to celebrate as well. See you there!

Categories: Elsewhere

Julian Granger-Bevan: A Git Workflow for Drupal Modules

Planet Drupal - Fri, 22/05/2015 - 14:54

Years ago now, the Drupal community adopted Git as a version control system to replace CVS. That move has helped development since the distributed nature of Git allows better tracking of work privately before uploading a patch to drupal.org.

Sandbox repositories allow contributors to clone an existing project and work on it independently (therefore not needing permissions for the canonical repository), but there is currently no way that I know of to request that those changes be pulled back, facilitate a review of the changes and then merge them in (a pull request).

Hopefully that functionality is on the way!

But as a community the challenge is not just the development on drupal.org, collaboration with GitHub, or whatever form the technical change takes. Alongside those changes, we need the workflows that will help us better manage multiple versions, allow fast bug fixes whilst features are being tested, and provide for reviews without alienating developers. And the technical element goes hand in hand with the workflow.

As an example, for the Drupal PM module, we recently debated how to set up Git branches to allow more flexibility than the traditional "single line of code" inherited from CVS.

There were a few criteria that the new solution had to meet:

  • Flexibility that allows bug fixes to be applied to a release more quickly: Under the "single line of code" approach, releasing only bug fixes would require ad hoc branches and tags.
  • Fit with drupal.org infrastructure: In particular, we'd like users to be able to test a development version without cloning from Git, so the development release on drupal.org needed to correspond to an appropriate codebase for people to test.
  • Alignment with industry-standard approaches where possible: Looking at what is used elsewhere in the software world, the Gitflow model has been well received.

Putting all of this together and discussing on Skype and a drupal.org issue, we came up with a branching model that seems to fit these criteria.

For each major version of the module (e.g., 7.x-1.x, 7.x-2.x, 8.x-1.x), we will have the following branches (a command sketch follows the list):

  • Release branches: There will be one release branch for each major version, named after the version (for example: "7.x-1.x"). The codebase in here will always be the release candidate for the next point release, and those point releases will always be tagged from this release branch.
  • Development branches: There will be one development branch for each major version, named "develop-[version]" (for example: "develop-7.x-1.x"). This will effectively be a staging branch for the next release but one. Features will be merged in here, and the development branch will then be merged into the release branch when the next release candidate is required.
  • Feature branches: There will be one feature branch for each feature (drupal.org issue), named "feature-[issue]-[title]" (for example, "feature-12345-add-feature"). These will be worked on until the given feature is finished. Once completed, the feature branch is merged into the development branch.
  • Hotfix branches: There will be one hotfix branch for each bug fix (drupal.org issue), named "hotfix-[issue]-[title]" (for example, "hotfix-12345-fix-bug"). These will be worked on until the bug is confirmed fixed. Once completed, the hotfix branch is merged into both the development and release branches.
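
To make the flow concrete, here is a hedged command-line sketch of the hotfix path on the 7.x-1.x version (my illustration, not the project's documented process); the issue number and tag are invented:

# Branch the fix off the release branch.
git checkout -b hotfix-1234567-fix-notice 7.x-1.x

# ...commit the fix, then merge it into both long-lived branches.
git checkout 7.x-1.x
git merge --no-ff hotfix-1234567-fix-notice
git checkout develop-7.x-1.x
git merge --no-ff hotfix-1234567-fix-notice

# Tag the next point release from the release branch and push everything.
git checkout 7.x-1.x
git tag 7.x-1.4
git push origin 7.x-1.x develop-7.x-1.x 7.x-1.4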

We're just beginning to use this system in its entirety, and I hope that it works out.

One caveat is that the system only works for developers with permissions on the project repository. I would love for any contributor to be able to fit into this model and to have the pull request system available for the final merge... perhaps soon...

Category: Websites. Tags: Git, Drupal, Drupal Planet, workflows, branching
Categories: Elsewhere
