Feed aggregator

Ian Campbell: A vhosting git setup with gitolite and gitweb

Planet Debian - Sat, 16/05/2015 - 19:52

Since gitorious' shutdown, I decided it was time to start hosting my own git repositories for my own little projects. (Although the company which took over gitorious has a Free Software offering, it seems that their hosted offering is based on the proprietary version; in any case, once bitten, twice shy and all that.)

After a bit of investigation I settled on using gitolite and gitweb. I did consider (and even had a vague preference for) cgit, but it wasn't available in Wheezy (not even in backports, and the backport looked tricky) and I haven't upgraded my VPS yet. I may reconsider cgit once I switch to Jessie.

The only wrinkle was that my VPS is shared with a friend and I didn't want to completely take over the gitolite and gitweb namespaces in case he ever wanted to setup git.hisdomain.com, so I needed something which was at least somewhat compatible with vhosting. gitolite doesn't appear to support such things out of the box but I found an interesting/useful post from Julius Plenz which included sufficient inspiration that I thought I knew what to do.

After a bit of trial and error here is what I ended up with:

Install gitolite

The gitolite website has plenty of documentation on configuring gitolite. But since gitolite is in Debian, it's even more trivial than the quick install makes out.

I decided to use the newer gitolite3 package from wheezy-backports instead of the gitolite (v2) package from Wheezy. I already had backports enabled so this was just:

# apt-get install gitolite3/wheezy-backports

I accepted the defaults and gave it the public half of the ssh key which I had created to be used as the gitolite admin key.

By default this added a user gitolite3 with a home directory of /var/lib/gitolite3. Since the username forms part of the URL used to access the repositories I didn't want it to include the 3, so I edited /etc/passwd, /etc/group, /etc/shadow and /etc/gshadow to say just gitolite, while leaving the home directory as gitolite3.
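The same rename can also be done with the shadow utilities instead of editing those files by hand; a minimal sketch (usermod -l changes only the login name and leaves the home directory untouched):

# usermod -l gitolite gitolite3
# groupmod -n gitolite gitolite3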

Now I could clone the gitolite-admin repo and begin to configure things.

Add my user

This was as simple as dropping the public half of my key into the gitolite-admin repo as keydir/ijc.pub, then git add, commit and push.
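Spelled out, that amounts to something like this (the key filename here is illustrative):

$ git clone gitolite@${VPSNAME}:gitolite-admin
$ cp ~/.ssh/id_rsa_ijc.pub gitolite-admin/keydir/ijc.pub   # filename illustrative
$ cd gitolite-admin
$ git add keydir/ijc.pub
$ git commit -m 'Add ijc'
$ git push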

Setup vhosting

Between the gitolite docs and Julius' blog post I had a pretty good idea what I wanted to do here.

I wasn't too worried about making the vhost transparent from the developer's (ssh:// URL) point of view, just from the gitweb and git clone side. So I decided to adapt things to use a simple $VHOST/$REPO.git schema.

I created /var/lib/gitolite3/local/lib/Gitolite/Triggers/VHost.pm containing:

package Gitolite::Triggers::VHost;

use strict;
use warnings;

use File::Slurp qw(read_file write_file);

sub post_compile {
    my %vhost = ();
    my @projlist = read_file("$ENV{HOME}/projects.list");
    for my $proj (sort @projlist) {
        $proj =~ m,^([^/\.]*\.[^/]*)/(.*)$, or next;
        my ($host, $repo) = ($1, $2);
        $vhost{$host} //= [];
        push @{$vhost{$host}} => $repo;
    }
    for my $v (keys %vhost) {
        write_file("$ENV{HOME}/projects.$v.list",
                   { atomic => 1 },
                   join("\n", @{$vhost{$v}}));
    }
}

1;

I then edited /var/lib/gitolite3/.gitolite.rc and ensured it contained:

LOCAL_CODE => "$ENV{HOME}/local",
POST_COMPILE => [ 'VHost::post_compile', ],

(I had to uncomment the first and add the second.)

All this trigger does is take the global projects.list, in which gitolite will list any repo which is configured to be accessible via gitweb, and split it into several vhost specific lists.
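For example, given a projects.list of:

hellion.org.uk/qcontrol.git
hisdomain.com/widgets.git
testing.git

(the hisdomain.com entry is hypothetical) the trigger writes qcontrol.git into projects.hellion.org.uk.list and widgets.git into projects.hisdomain.com.list, while testing.git, having no vhost prefix, fails the pattern match and is skipped.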

Create first repository

Now that the basics were in place I could create my first repository (for hosting qcontrol).

In the gitolite-admin repository I edited conf/gitolite.conf and added:

repo hellion.org.uk/qcontrol
    RW+ = ijc

After adding, committing and pushing I now have /var/lib/gitolite3/projects.list containing:

hellion.org.uk/qcontrol.git
testing.git

(the testing.git repository is configured by default) and /var/lib/gitolite3/projects.hellion.org.uk.list containing just:

qcontrol.git

For cloning the URL is:

gitolite@${VPSNAME}:hellion.org.uk/qcontrol.git

which is rather verbose (${VPSNAME} is quite long in my case too), so to simplify things I added to my .ssh/config:

Host gitolite
    Hostname ${VPSNAME}
    User gitolite
    IdentityFile ~/.ssh/id_rsa_gitolite

so I can instead use:

gitolite:hellion.org.uk/qcontrol.git

which is a bit less of a mouthful and almost readable.

Configure gitweb (http:// URL browsing)

Following the documentation's advice I edited /var/lib/gitolite3/.gitolite.rc to set:

UMASK => 0027,

and then:

$ chmod -R g+rX /var/lib/gitolite3/repositories/*

This arranges for members of the gitolite group to be able to read anything under /var/lib/gitolite3/repositories/*.

Then:

# adduser www-data gitolite

This adds the user www-data to the gitolite group so it can take advantage of those relaxed permissions. I'm not super happy about this, but since gitweb runs as www-data:www-data this seems to be the recommended way of doing things. I'm consoling myself with the fact that I don't plan on hosting anything sensitive... I also arranged things such that members of the group can only list the contents of directories from the vhost directory down, by setting g=x rather than g=rx on the higher level directories. Potentially sensitive files do not have group permissions at all either.
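Concretely, that means something like this (a sketch; the vhost directories below keep g=rx from the earlier chmod):

# chmod g=x /var/lib/gitolite3 /var/lib/gitolite3/repositories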

Next I created /etc/apache2/gitolite-gitweb.conf:

die unless $ENV{GIT_PROJECT_ROOT};
$ENV{GIT_PROJECT_ROOT} =~ m,^.*/([^/]+)$,;
our $gitolite_vhost = $1;
our $projectroot = $ENV{GIT_PROJECT_ROOT};
our $projects_list = "/var/lib/gitolite3/projects.${gitolite_vhost}.list";
our @git_base_url_list = ("http://git.${gitolite_vhost}");

This extracts the vhost name from ${GIT_PROJECT_ROOT} (it must be the last element) and uses it to select the appropriate vhost specific projects.list.
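For example, with GIT_PROJECT_ROOT set to /var/lib/gitolite3/repositories/hellion.org.uk (as in the vhost below), $gitolite_vhost becomes hellion.org.uk and gitweb reads /var/lib/gitolite3/projects.hellion.org.uk.list.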

Then I added a new vhost to my apache2 configuration:

<VirtualHost 212.110.190.137:80 [2001:41c8:1:628a::89]:80>
    ServerName git.hellion.org.uk
    SetEnv GIT_PROJECT_ROOT /var/lib/gitolite3/repositories/hellion.org.uk
    SetEnv GITWEB_CONFIG /etc/apache2/gitolite-gitweb.conf
    Alias /static /usr/share/gitweb/static
    ScriptAlias / /usr/share/gitweb/gitweb.cgi/
</VirtualHost>

This configures git.hellion.org.uk (don't forget to update DNS too) and sets the appropriate environment variables to find the custom gitolite-gitweb.conf and the project root.

Next I edited /var/lib/gitolite3/.gitolite.rc again to set:

GIT_CONFIG_KEYS => 'gitweb\.(owner|description|category)',

Now I can edit the repo configuration to be:

repo hellion.org.uk/qcontrol
    owner = Ian Campbell
    desc = qcontrol
    RW+ = ijc
    R = gitweb

That R permission for the gitweb pseudo-user causes the repo to be listed in the global projects.list and the trigger which we've added causes it to be listed in projects.hellion.org.uk.list, which is where our custom gitolite-gitweb.conf will look.

Setting GIT_CONFIG_KEYS allows those options (owner and desc are syntactic sugar for two of them) to be set here and propagated to the actual repo.
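Once pushed, those values end up in the repository's own config file and can be checked with something like (path assuming the default layout):

$ git config --file /var/lib/gitolite3/repositories/hellion.org.uk/qcontrol.git/config gitweb.owner
Ian Campbell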

Configure git-http-backend (http:// URL cloning)

After all that this was pretty simple. I just added this to my vhost before the ScriptAlias / /usr/share/gitweb/gitweb.cgi/ line:

ScriptAliasMatch \
    "(?x)^/(.*/(HEAD | \
            info/refs | \
            objects/(info/[^/]+ | \
                     [0-9a-f]{2}/[0-9a-f]{38} | \
                     pack/pack-[0-9a-f]{40}\.(pack|idx)) | \
            git-(upload|receive)-pack))$" \
    /usr/lib/git-core/git-http-backend/$1

This (which I stole straight from the git-http-backend(1) manpage) causes anything which git-http-backend should deal with to be sent there, and everything else to be sent to gitweb.

Having done that, access is enabled by editing the repo configuration one last time to be:

repo hellion.org.uk/qcontrol
    owner = Ian Campbell
    desc = qcontrol
    RW+ = ijc
    R = gitweb daemon

Adding R permissions for daemon causes gitolite to drop a stamp file in the repository which tells git-http-backend that it should export it.
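(The stamp file in question is the standard git-daemon-export-ok file in the repository directory; unless GIT_HTTP_EXPORT_ALL is set in the environment, git-http-backend declines to serve repositories which lack it.)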

Configure git daemon (git:// URL cloning)

I actually didn't bother with this: git-http-backend supports the smart HTTP mode, which should be as efficient as the git protocol, so I couldn't see any reason to run another network-facing daemon on my VPS.

FWIW, it looks like vhosting could have been achieved by using git daemon's --interpolated-path option.

Conclusion

There are quite a few moving parts, but they all seem to fit together quite nicely. In the end, apart from adding www-data to the gitolite group, I'm pretty happy with how things ended up.

Categories: Elsewhere

Another Drop in the Drupal Sea: DrupalCon LA Friday Recap

Planet Drupal - Sat, 16/05/2015 - 19:06

From my vantage point the sprint day was extremely well attended. I spent my day working on a patch I had submitted to the Flag module and working on OG Forum and OG Forum D7.

We had the traditional live commit in the afternoon.

There wasn't any announcement as to whether any more critical core bugs were squashed.

How many of you participated in the sprints? When did you head home? Are you participating on Saturday?

Categories: Elsewhere

Holger Levsen: 20150516-lts-march-and-april

Planet Debian - Sat, 16/05/2015 - 18:56
My LTS March and April

In March and April 2015 I sadly didn't get much LTS work done, for a variety of reasons. Most of these reasons make me happy, while at the same time I'm sad I had to reduce my LTS involvement; actually I even failed to use those few hours which were assigned to me. So I'll keep this blog post short too, as time is my most precious resource at the moment.

In March I only sponsored the axis upload and wrote DLA-169-1; besides that I spent some hours implementing JSON output for the security-tracker, which was more difficult than anticipated, because a) different people had different (at first unspoken) assumptions about what output they wanted, and b) since the security-tracker's database schema has grown over the years, getting the data out in a logically structured fashion ain't as easy as one would imagine...

In April I sponsored the openldap upload and wrote DLA-203-1 and then prepared debian-security-support 2015.04.04~~deb6u1 for squeeze-lts and triaged some of d-s-s's bugs. Adding support for oldoldstable (and thus keeping support for squeeze-lts) to the security-tracker was my joyful contribution for the very joyful Jessie release day.

So in total I only spent 7.5 (paid) hours on LTS in these two months, although I should have spent 10. The only thing I can say in my defense is that I've spent more time on LTS (supporting both users and contributors, on the list as well as on IRC), but this time ain't billable. Which I think is right, but it still eats into my "LTS time", and so sometimes I wish I could more easily ignore people and just concentrate on technical fixes...

Categories: Elsewhere

Craig Small: Debian, WordPress and Multi-site

Planet Debian - Sat, 16/05/2015 - 10:07

For quite some time, the Debian version of WordPress has had a configuration tweak that made it possible to run multiple websites on the same server. This came from a while ago when multi-site wasn’t available. While a useful feature, it does make the initial setup of WordPress for simple sites more complicated.

I'm looking at changing the Debian package slightly so that for single-site use it Just Works. I have also looked into the way WordPress handles the content, especially themes and plugins, to see if there is a way of updating them through the website itself. This probably won't suit everyone, but I think it's a better default.

The idea is to set up the Debian package something like this by default; if you want fancier stuff, it's all still there, just not set up for you. It isn't configured this way at the moment, and the current default is a little confusing, which I hope to change.

Multisite

The first step was to get my pair of websites into one. So first it was backing up time and then the removal of my config-websitename.php files in /etc/wordpress. I created a single /etc/wordpress/config-default.php file that used a new database.  This initial setup worked ok and I had the primary site going reasonably quickly.
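For reference, the multisite switch itself is standard upstream WordPress, driven by a handful of constants in the config file. A minimal sketch, assuming a subdomain-style network and the primary domain from the list below:

define('WP_ALLOW_MULTISITE', true);
define('MULTISITE', true);
define('SUBDOMAIN_INSTALL', true);
define('DOMAIN_CURRENT_SITE', 'enc.com.au');
define('PATH_CURRENT_SITE', '/');
define('SITE_ID_CURRENT_SITE', 1);
define('BLOG_ID_CURRENT_SITE', 1);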

The second site was a little trickier. The problem is that multisite does things like foo.example.com and bar.example.com, while I wanted example.com and somethingelse.com. There is a plugin, wordpress-mu-domain-mapping, that almost sorta-kinda works. While it let me make the second site with a different name, it didn't like aliases, especially if the alias was the first site.

Some evil SQL fixed that nicely: UPDATE wp_domain_mapping SET blog_id=1 WHERE id=2

So now I had:

  • enc.com.au as my primary site
  • rnms.org as a second site
  • dropbear.xyz as an alias for my primary site
Files and Permissions

We really have three separate sets of files in WordPress. These files come from three different sources, are updated in three different ways, and have different release cycles.

The first is the wordpress code which is shipped in the Debian package. All of this code lives in /usr/share/wordpress and is only changed if you update the Debian package, or you fiddle around with it. It needs to be readable to the webserver but not writable. The config files in /etc/wordpress are in this lot too.

Secondly, we have the user generated data. This is things like your pictures that you add to the blog. As they are uploaded through the webserver, it needs to be writable to it. These files are located in /var/lib/wordpress/wp-content/uploads

Third are the plugins and themes. These can either be unzipped and placed into a directory, or loaded directly in through the webserver. I used to do it the first way but am trying the second. These files are located in /var/lib/wordpress/wp-content

Ever tried to update your plugins and get the FTP prompt? This is because the wp-content directory is not writable. I adjusted the permissions and now when a plugin wants to update, I click yes and it magically happens!

You will have to reference the /var/lib/wordpress/wp-content subdirectory in two places (a sketch follows this list):

  • In your /etc/wordpress/config-default.php: the WP_CONTENT_DIR definition
  • In Apache or htaccess: either a symlink out of /usr/share/wordpress with FollowSymLinks turned on, or an Apache Alias, plus permission to access the directory.
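A minimal sketch of those two pieces (Apache 2.4 syntax; adjust to your vhost):

define('WP_CONTENT_DIR', '/var/lib/wordpress/wp-content');

Alias /wp-content /var/lib/wordpress/wp-content
<Directory /var/lib/wordpress/wp-content>
    Require all granted
</Directory>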
What broke

Images did, in a strange way. My media library is empty, but my images are still there. Something in the export and reimport did not work. For me it's a minor inconvenience, and down to moving from one system to another, but it is still there.

Categories: Elsewhere

Rogério Brito: A Small Python Project (coursera-dl) Activities

Planet Debian - Sat, 16/05/2015 - 06:54

Lately, I have been dedicating a lot of my time (well, at least compared to what I used to) to Free Software projects. In particular, I have spent a moderate amount of time with two projects written in Python.

In this post, I want to talk about the first and more popular project, called coursera-dl. To be honest, I think that I may have devoted much more time to it than to any other project in particular.

With it I started to learn (besides the practices that I already used in Debian) how to program in Python, how to use unit tests (I started with Python's built-in unittest framework, then progressed to nose, and I am now using pytest), and how to hook up the results of the tests with a continuous integration system (in this case, Travis CI).

I must say that I am sold on this idea of testing software (after being a skeptic for way too long) and I can say that I find hacking on other projects without proper testing a bit uncomfortable, since I don't know if I am breaking unrelated parts of the project.

My migration to pytest was the result of a campaign from pytest.org called Adopt Pytest Month, which a kind user of the project let me know about. I got a very skilled volunteer from pytest assigned to our project. Besides learning from their pull requests, one side-effect of this whole story was that I spent a moderate number of hours trying to understand how to properly package and distribute things on PyPI.

One tip learned along the way: contrary to the official documentation, use twine, not python setup.py upload. It is more flexible for uploading your package to PyPI.
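The resulting workflow is then roughly:

$ python setup.py sdist bdist_wheel
$ twine upload dist/*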

You can see the package on PyPI. Anyway, I made the first upload of the package to PyPI on the 1st of May and it already has almost 1500 downloads, which is far more than I expected.

A word of warning: there are other similarly named projects, but they don't seem to have as much of a following as we have. A speculation from my side is that this may be, perhaps, due to me spending a lot of time interacting with users in the bug tracker that GitHub provides.

Anyway, installation of the program is now as simple as:

pip install coursera

And all the dependencies will be neatly pulled in, without having to mess with multi-step procedures. This is a big win for the users.

Also, I even had an offer to package the program to have it available in Debian!

Well, despite all the time that this project demanded, I think that I have only good things to say, especially to the original author, John Lehmann.

If you like the project, please let me know, and consider yourselves invited to participate by lending a hand, testing/using the program or triaging some bugs.

Categories: Elsewhere

Norbert Preining: Plex Home Theater 1.4.1 for Debian Jessie and Sid

Planet Debian - Sat, 16/05/2015 - 05:28

Recently Plex Home Theater was updated to 1.4.1 with fixes for some errors, in particular concerning the new music handling introduced in 1.4.0. As with 1.4.0, I have compiled PHT for both jessie and sid, for both amd64 and i386.

Jessie

Add the following lines to your sources.list:

deb http://www.preining.info/debian/ jessie pht
deb-src http://www.preining.info/debian/ jessie pht

You can also grab the binaries directly here for amd64 and i386, and you can get the source package with

dget http://www.preining.info/debian/pool/pht/p/plexhometheater/plexhometheater_1.4.1-1~bpo8+1.dsc

Sid

Add the following lines to your sources.list:

deb http://www.preining.info/debian/ sid pht
deb-src http://www.preining.info/debian/ sid pht

You can also grab the binaries directly here for amd64 and i386, and you can get the source package with

dget http://www.preining.info/debian/pool/pht/p/plexhometheater/plexhometheater_1.4.1-1.dsc

The release file and changes file are signed with my official Debian key 0x860CDC13.

Enjoy!

Categories: Elsewhere

Forum One: DrupalCon LA Round-Up: Wrapping Up and Looking Ahead

Planet Drupal - Sat, 16/05/2015 - 01:34

A number of us from Forum One are sticking around for Friday’s sprints, but that’s a wrap on the third day of DrupalCon and the conference proper!

Wednesday and Thursday were chock-full of great sessions, BoFs, and all the small spontaneous meetings and conversations that make DrupalCons so fruitful, exhausting and energizing.

Forum One gave three sessions on Wednesday. John Brandenburg presented Maximizing Site Speed with Mercy Corps, a case study of our work on www.mercycorps.org focusing on performance optimization. Kalpana Goel of Forum One and Frédéric G. Marand presented Pain points of learning and contributing in the Drupal community, a session on how to even better facilitate code contributions to Drupal from community members. And finally Forum One’s Andrew Morton presented Content After Launch: Preparing a Monkey for Space, a survey of content considerations for project success before, during, and after the website build process. The other highlight from my perspective on Wednesday was a great talk by Wim Leers and Fabian Franz on improvements to Drupal performance/speed, and how to make your Drupal sites fly.

Then Thursday, Daniel Ferro and Dan Mouyard rounded out the seven Forum One sessions with their excellent presentation, To the Pattern Lab! Collaboration Using Modular Design Principles. The session describes our usage of Pattern Lab at Forum One to improve project workflow and collaboration between visual designers, front- and back-end developers, and clients. This approach has eased a lot of friction on our project teams. I’m particularly excited about how it’s allowed our front-end developers to get hacking much earlier in the project lifecycle. We were glad to see the presentation get a shout out from Brad Frost, one of the Pattern Lab creators. Other highlights for me on Thursday were the beloved Q&A with Dries and friends and sitting down over lunch with other Pacific Northwest Drupalers to make some important decisions about the PNW Drupal Summit coming to Seattle this fall.

In addition to looking ahead to DrupalCon Barcelona, the closing session revealed the exciting news that DrupalCon will be landing in Mumbai next year!

#DrupalCon is coming to Mumbai! Plus other photos from todays closing session https://t.co/Y3vWCQCSTu? pic.twitter.com/zEt4Y6VLxS

— DrupalCon LosAngeles (@DrupalConNA) May 15, 2015

And the always anticipated announcement of the next DrupalCon North America location… New Orleans!

And the next North American #DrupalCon will be…… pic.twitter.com/AXiFxv3gfW

— DrupalCon LosAngeles (@DrupalConNA) May 14, 2015

That news was ushered in soulfully by these gentlemen, Big Easy style, pouring out from the keynote hall into the convention center lobby.

Great way to announce #DrupalCon New Orleans! #DrupalConLA pic.twitter.com/3cRmV8jI1F

— Andy Hieb (@AndyHieb) May 14, 2015

And to finish off the day properly, many of us hooted and hollered at Drupal trivia night, MC’d by none other than Jeff Eaton.

Another fantastic #DrupalCon trivia night in progress… Woo! pic.twitter.com/AzavA2AFXi

— Jeff Eaton (@eaton) May 15, 2015

A great con was had by all of us here at Forum One… On to the sprints!

Categories: Elsewhere

Forum One: Hacking the Feds: Forum One Among the Winners at GSA Hack-a-Thon

Planet Drupal - Fri, 15/05/2015 - 20:25

Last Friday, we attended the Digital Innovation Hack-a-Thon hosted by the GSA… and we won. The federal tech website FCW even wrote an article about it.

Our team, made up of designers and developers from Forum One, along with Booz Allen Hamilton, Avar Consulting, and ICF International, worked on a solution for IAE's Vendor Dashboard for Contracting Officers. We were tasked with creating a vendor dashboard for displaying GSA data that would enable procurement officers to quickly and easily search for and identify potential vendors that have small-business or minority-owned status, search by other special categories, and view vendors' history.

How did we tackle the problem?

Our team initially split into smaller working groups. The first group performed a quick discovery session, talking with the primary stakeholder and even reaching out to some of the Contracting Officers we work with regularly. They identified pain points and looked at other systems, which we ended up integrating into our solution. As this group defined requirements, the second group created wireframes. We even took some time to perform quick usability testing with our stakeholders and iterate on our initial concept until it was time to present.

The other group dove into development. We carefully evaluated the data available from the API to understand the overlap and develop a data architecture. Using that data map, we decided to create a listing of contracts and ways to display an individual contract. We then expanded it to include alternative ways of comparing and segmenting contracts using other supporting data. Drupal did very well pulling in the data and allowed us to leverage its data listing and display tools. Most developers see Drupal as a powerful albeit time-intensive building tool, but it worked very well in this time-critical environment.

Our two groups rejoined frequently to keep everyone on the same page and make sure our solution was viable.

How much could we possibly accomplish in 6 hours?

More than you might think. Our solution presented the content in an organized, digestible way that allowed contracting officers to search and sort through information quickly and easily within one system. We created wireframes to illustrate our solution for the judges and stakeholders. We also stood up a Drupal site to house the data and explained the technical architecture behind our solution. Unfortunately, we didn't have a front-end developer participating in the hack-a-thon, so we weren't able to create a user interface, but our wireframes describe what the UI should eventually look like.

Some of us even took a quick break to catch a glimpse of the Arsenal of Democracy World War II Victory Capitol Flyover from the roof. It was also broadcast on the projectors in the conference room.

What did we learn?

It's interesting to see how others break down complex problems and iterate on solutions, especially when those solutions include additional requirements. Our solution was more complex than some of the other, more polished data visualizations, but we won the challenge in part because of the strategy behind our solution.

We're excited to see what GSA develops as an MVP, and we'll be keeping our ears open for the next opportunity to attend a hack-a-thon with GSA.

Finally, a big shout out to our teammates!
  • Mary C. J. Schwarz, Vice President at ICF International
  • Gita Pabla, Senior Digital Designer at Booz Allen Hamilton
  • Eugene Raether, IT Consultant at Booz Allen Hamilton
  • Robert Barrett, Technical Architect, Avar Consulting
Categories: Elsewhere

SitePoint PHP Drupal: Using Ajax Forms in Drupal 8

Planet Drupal - Fri, 15/05/2015 - 18:00

In this article, I am going to show you a clean way of using the Drupal 8 Ajax API without writing one line of JavaScript code. To this end, we will go back to the first custom form we built for Drupal 8 in a previous article and Ajaxify some of its behaviour to make it more user friendly.

An updated version of this form can be found in this repository under the name DemoForm (the demo module). The code we write in this article can also be found there but in a separate branch called ajax. I recommend you clone the repo and install the module in your development environment if you want to follow along.

DemoForm

Although poorly named, the DemoForm was very helpful in illustrating the basics of writing a custom form in Drupal 8. It handles validation, configuration and exemplifies the use of the Form API in general. Of course, it focuses on the basics and has nothing spectacular going on.

If you remember, or check the code, you’ll see that the form presents a single textfield responsible for collecting an email address to be saved as configuration. The form validation is in charge of making sure that the submitted email has a .com ending (a poor attempt at that but enough to illustrate the principle of form validation). So when a user submits the form, they are saving a new email address to the configuration and get a confirmation message printed to the screen.

In this article, we will move the email validation logic to an Ajax callback so that after the user has finished typing the email address, the validation gets automagically triggered and a message printed without submitting the form. Again, there is nothing spectacular about this behaviour and you will see it quite often in forms in the wild (typically to validate usernames). But it’s a good exercise for looking at Ajax in Drupal 8.
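As a rough sketch of the direction (a sketch rather than the article's code; the element and callback names here are hypothetical), the Form API side of this hangs an '#ajax' property on the textfield and points it at a callback on the form class:

$form['email'] = [
  '#type' => 'textfield',
  '#title' => $this->t('Email'),
  '#ajax' => [
    // Hypothetical callback name; the real one is in the article's repo.
    'callback' => '::validateEmailAjax',
    'event' => 'change',
  ],
];

with the callback returning an AjaxResponse (or a render array) that places the validation message on the page.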

Continue reading Using Ajax Forms in Drupal 8 on SitePoint.

Categories: Elsewhere

Another Drop in the Drupal Sea: DrupalCon LA Thursday Recap

Planet Drupal - Fri, 15/05/2015 - 17:38

There was a shortened day of sessions, finishing with the closing ceremony. In the morning I attended a discussion about documentation on D.O. I attended this session because the point of focus the group chose at my BOF was to do something to improve the documentation. The session was quite well attended, which apparently demonstrates that there's a good bit of interest in improving the documentation. So, I guess I'll be putting more of my time, energy and resources into getting involved.

Categories: Elsewhere

S. M. Bjørklund: How to migrate content from drupal 6 to 7 by using Migrate_d2d - Part 1

Planet Drupal - Fri, 15/05/2015 - 15:59

The normal way of performing a major upgrade in Drupal has traditionally been to run update.php, which fires off a lot of rather complex hook_update_N() tasks. These try to upgrade configuration and content from one major version to another, for example Drupal 6 to 7. This is all about to change in Drupal 8. Drupal 8 has the migrate module baked in, and an upgrade is no longer an upgrade but a migration of data and configuration from one system to another.

Categories: Elsewhere

Drupal Easy: DrupalEasy Podcast 154: DrupalCon Los Angeles - Day 3 Recap

Planet Drupal - Fri, 15/05/2015 - 15:29

Ryan managed to catch a few final interviews before leaving town on the last day of DrupalCon. He got in a traditional interview with a BlackMesh employee (who isn't Cathy Theys), namely Jason Ford. Also included is a 15-minute interview with Colleen Carrol of Palantir.net, who recaps her session about Sustainable Recruiting Practices.

Categories: Elsewhere

Paul Booker: Collecting total prices on a recipe form using field collections, jQuery / AHAH

Planet Drupal - Fri, 15/05/2015 - 15:26
The jQuery (scripts/mymodule.js):

(function($) {
  Drupal.behaviors.recipesForm = {
    attach: function (context, settings) {
      $(".field-name-field-recipe-quantity input[type=text]").focus(function() {
        $ingredient = $(this).parent().parent().parent().prev().find("select");
        nid = $ingredient.find('option:selected').val();
        $.get('/ingredient/price/get/' + nid, null, updateCost);
      });
      $(".field-name-field-recipe-quantity input[type=text]").blur(function() {
        total_cost = 0;
        $quantity = $(this);
        quantity_val = $(this).val();
        if (quantity_val && cost_per_kg) {
          var item = $(".field-name-field-recipe-cost");
          $cost = $(this).parent().parent().parent().parent().find(item).find("input[type=text]");
          cost_val = quantity_val * cost_per_kg;
          $cost.val(cost_val.toFixed(2));
        }
        $('.field-name-field-recipe-cost').each(function() {
          cost = $(this).find("input[type=text]").val();
          total_cost = total_cost + parseFloat(cost);
        });
        $total_cost = $('.field-name-field-total-cost').find("input[type=text]");
        $total_cost.val(total_cost.toFixed(2));
      });
      $(".field-name-field-recipe-ingredient select").change(function() {
        nid = $(this).find('option:selected').val();
        $.get('/ingredient/price/get/' + nid, null, updateCost);
        var item = $(".field-name-field-recipe-quantity");
        $quantity = $(this).parent().parent().parent().find(item).find("input[type=text]");
        $quantity.val(0);
        var item = $(".field-name-field-recipe-cost");
        $cost = $(this).parent().parent().parent().find(item).find("input[type=text]");
        $cost.val(0);
      });
    }
  };

  var updateCost = function(response) {
    cost_per_kg = response['data'];
  }
})(jQuery);

The module code (mymodule.module):

/**
 * Implements hook_menu().
 */
function mymodule_menu() {
  $items['ingredient/price/get'] = array(
    'page callback' => 'mymodule_get_ingredient_price_ajax',
    'type' => MENU_CALLBACK,
    'access arguments' => array('access content'),
  );
  return $items;
}

/**
 * Callback to return JSON encoded ingredient price for given nid.
 */
function mymodule_get_ingredient_price_ajax($nid) {
  $node = node_load($nid);
  $cost_per_kg = $node->field_cost_per_kg['und'][0]['value'];
  drupal_json_output(array('status' => 0, 'data' => $cost_per_kg));
  drupal_exit();
}

function mymodule_form_recipe_sheet_node_form_alter(&$form, &$form_state, $form_id) {
  foreach ($form['field_collection_ingredients']['und'] as $key => &$value) {
    if (is_numeric($key)) {
      $value['field_recipe_cost']["#disabled"] = TRUE;
    }
  }
  $form['field_total_cost']["#disabled"] = TRUE;
  drupal_add_js(drupal_get_path('module', 'mymodule') . '/scripts/mymodule.js');
}
Categories: Elsewhere

Drupal Easy: DrupalEasy Podcast 153: DrupalCon Los Angeles - Day 2 Recap

Planet Drupal - Fri, 15/05/2015 - 14:45

Ryan, Mike and Ted are joined by Dave Hall, Amitai Burstein, Damien McKenna, Tess Flynn, Kelley Curry and Brian Lewis in this musical and magical Day 2 of DrupalCon in Los Angeles.

Categories: Elsewhere

InternetDevels: Tips for Design when using Drupal

Planet Drupal - Fri, 15/05/2015 - 11:33

Meet the new blog post with tips on Drupal web design from our guest blogger Lalit Sharma, an SEO consultant who runs an SEO agency.

Drupal is a popular open-source platform loved by many designers. However, there are a few golden rules to follow when designing anything for Drupal sites, in order to ensure that developers have an easier time coding, that production speed is maintained, and that the client's pocket remains relatively bulky.

Categories: Elsewhere

Sune Vuorela: Getting a Q_INVOKABLE C++ function reevaluated by QML engine

Planet Debian - Fri, 15/05/2015 - 11:16

Unfortunately, with all the normal magic of QML property bindings, getting a property updated in a setup that involves return values from functions isn't really doable, as in this example:

Text {
    text: qtobject.calculatedValue()
}

I’m told there is a low priority feature request for a way of signalling that a function now returns a different value and all properties using it should be reevaluated.

I have so far discovered two different workarounds, which I will present here.

Using an extra property

Appending an extra property to trigger the reevaluation of the function is one way of doing it.

Text {
    text: qtobject.calculatedValue() + qtobject.emptyNotifierThing
}

with the following on the C++ side:

Q_PROPERTY(QString emptyNotifierThing READ emptyString NOTIFY valueChanged)

QString emptyString() const {
    return QString();
}

This is a bit more code to write and to remember to use, but it does get the job done.

Intermediate layer

Another way is to inject an intermediate layer, an extra object, that has the function. It can even be simplified by having a pointer to itself.

Text {
    text: qtobject.dataAccess.calculatedValue()
}

with the following on the C++ side:

Q_PROPERTY(QObject* dataAccess READ dataAccess NOTIFY valueChanged)

QObject* dataAccess() {
    return this;
}

It seems a bit simpler for the reader on the QML side, but also gets the job done.

I am not sure which way is the best one, but the intermediate layer has a nicer feeling to it when more complicated types are involved.

Categories: Elsewhere

DebConf team: DebConf Open Weekend (Posted by DebConf Content Team)

Planet Debian - Fri, 15/05/2015 - 07:35

The first two days of this year’s DebConf (August 15th and 16th) will constitute the Open Weekend. On these days, we are planning to have the Debian 22nd Birthday party, a Job Fair, and more than 20 hours of events and presentations, including some special invited speakers.

Given that we expect to have a broader and larger audience during the weekend, our goal is to have talks that are equally interesting for both Debian contributors and users.

If you want to present something that might be interesting to the larger Debian community, please go ahead and submit it. It can be for a talk of either 45 or 20 minutes; if you don't have content for a full-length talk, we encourage you to go for the half-length one. If you consider the event better suited for either the Open Weekend or the regular DebConf days, you may say so in the comment field. But keep in mind that all events might be rearranged by the content team to make sure they fit together nicely.

Call for proposals

The deadline to submit proposals is June 15th. Please submit your talk early with a good description and a catchy title. We look forward to seeing your proposals!

If you want to submit an event, please go ahead and read the original CfP on the DebConf15 website: http://debconf15.debconf.org/proposals.xhtml.

Categories: Elsewhere

Metal Toad: Amazon CloudFront with Drupal 8

Planet Drupal - Fri, 15/05/2015 - 02:04

May 14th, 2015, by Dylan Tack

Since I wrote my first review of CloudFront in 2012, Amazon has added support for three essential features.

What this means is that CloudFront is no longer just for static content; it's fully capable of delivering content from a dynamic CMS like Drupal. Here are the configs, step-by-step:

Configure your distribution and origin

This is fairly straightforward. I recommend using a CNAME for your origin (which could be a single instance, or an elastic load balancer). Ideally, your origin URL should not be accessible from the open internet, for several reasons:

  • Prevent the origin URL from getting crawled by search engines
  • Prevent DDoS attacks from being able to bypass the CDN
  • Prevent spoofing of the X-Forwarded-For header

Configure a default behavior

Noteworthy settings are:

  • "use origin cache headers" - This means CloudFront will honor the page lifetime set on /admin/config/development/performance within Drupal.
  • Whitelist "Host" and "CloudFront-Forwarded-Proto". This allows virtual hosts, and any SSL redirect logic on the origin to function correctly.
  • Whitelist your site's session cookie.

Drupal 8 workarounds

One of the remaining Drupal 8 critical issues interferes with CloudFront:
[meta] External caches mix up response formats on URLs where content negotiation is in use
As a result, some additional behaviors are needed to work around this. These settings instruct CloudFront to forward all client headers for specific paths.

Domain-sharding

If you plan to use a single domain for your entire site, you're done! On this site, we decided to keep the domain-sharding approach described in my previous post, so we need a little D8 code.

mt_custom.info.yml

name: Metal Toad Custom
description: Stuff that doesn't fit anywhere else.
package: Custom
type: module
core: 8.x
dependencies:

mt_custom.services.yml

services:
  mt_custom_event_subscriber:
    class: Drupal\mt_custom\EventSubscriber\MTCustomSubscriber
    arguments: ['@current_user']
    tags:
      - {name: event_subscriber}

mt_custom.module

use Drupal\Component\Utility\UrlHelper;

/**
 * Implements hook_file_url_alter().
 */
function mt_custom_file_url_alter(&$uri) {
  // Route static files to Amazon CloudFront, for anonymous users only.
  if (\Drupal::request()->server->get('HTTP_HOST') == 'www.metaltoad.com' &&
      \Drupal::currentUser()->isAnonymous() &&
      !\Drupal::request()->isSecure()) {

    // Multiple hostnames to parallelize downloads.
    $shard = crc32($uri) % 4 + 1;
    $cdn = "http://static$shard.metaltoad.com";

    $scheme = file_uri_scheme($uri);
    if ($scheme == 'public') {
      $wrapper = file_stream_wrapper_get_instance_by_scheme('public');
      $path = $wrapper->getDirectoryPath() . '/' . file_uri_target($uri);
      $uri = "$cdn/" . UrlHelper::encodePath($path);
    }
    else if (!$scheme && strpos($uri, '//') !== 0) {
      $uri = "$cdn/" . UrlHelper::encodePath($uri);
    }
  }
}

/**
 * Implements hook_css_alter().
 */
function mt_custom_css_alter(&$css) {
  // Mangle the paths slightly so that Drupal\Core\Asset\AssetDumper will
  // generate different keys on HTTPS. Necessary because the CDN URL varies
  // by protocol.
  if (\Drupal::request()->isSecure()) {
    foreach ($css as $key => $file) {
      if ($file['type'] === 'file') {
        $css[$key]['data'] = './' . $css[$key]['data'];
      }
    }
  }
}

src/EventSubscriber/MTCustomSubscriber.php

namespace Drupal\mt_custom\EventSubscriber;

use Symfony\Component\HttpFoundation\RedirectResponse;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Drupal\Core\Session\AccountInterface;

class MTCustomSubscriber implements EventSubscriberInterface {

  protected $account;

  public function checkForCloudFront(GetResponseEvent $event) {
    $req = $event->getRequest();

    /*
     * Make sure Amazon CloudFront doesn't serve dynamic content
     * from static*.metaltoad.com
     */
    if (strstr($req->server->get('HTTP_HOST'), 'static')) {
      if (!strstr($req->getPathInfo(), 'files/styles')) {
        header("HTTP/1.0 404 Not Found");
        print '404 Not Found';
        exit();
      }
    }
  }

  /**
   * {@inheritdoc}
   */
  static function getSubscribedEvents() {
    $events[KernelEvents::REQUEST][] = array('checkForCloudFront');
    return $events;
  }

  public function __construct(AccountInterface $account) {
    $this->account = $account;
  }

}
Categories: Elsewhere

Capgemini Engineering: Drupal integration patterns

Planet Drupal - Fri, 15/05/2015 - 01:00

As Drupal has evolved, it has become more than just a CMS. It is now a fully fledged Web Development Platform, enabling not just sophisticated content management and digital marketing capabilities but also any number of use cases involving data modelling and integration with an endless variety of applications and services. In fact, if you need to build something which responds to an HTTP request, then you can pretty much find a way to do it in Drupal.

“Just because you can, doesn’t mean you should.”

However, the old adage is true. Just because you can use a sledgehammer to crack a nut, that doesn't mean you're going to get the optimal nut-consumption experience at the end of it.

Drupal’s flexibility can lead to a number of different integration approaches, all of which will “work”, but some will give better experiences than others.

On the well trodden development path of Drupal 8, giant steps have been taken in making the best of what is outside of the Drupal community and “getting off the island”, and exciting things are happening in making Drupal less of a sledgehammer, and more of a finely tuned nutcracker capable of cracking a variety of different nuts with ease.

In this post, I want to explore ways in which Drupal can create complex systems, and some general patterns for doing so. You’ll see a general progression in line with that of the Drupal community in general. We’ll go from doing everything in Drupal, to making the most of external services. No option is more “right” than others, but considering all the options can help make sure you pick the approach that is right for you and your use case.

Build it in Drupal

One option, and probably the first that occurs to many developers, is to implement the business logic, data structures and administration of new applications or services using Drupal and its APIs. After all, Entity API and the schema system give us the ability to model custom objects and store them in the Drupal database; Views gives us the means to retrieve that data and display it in a myriad of ways. Modules like Rules, Features and CTools provide extensive options for implementing specific business rules to model your domain-specific data and application needs.

This is all well and good, and uses the strengths of Drupal core and the wide range of community contributed modules to enable the construction of complex sites with limited amounts of coding required, and little need to look outside Drupal. The downside can come when you need to scale the solution. Depending on how the functionality has been implemented you could run into performance problems caused by large numbers of modules, sub-optimal queries, or simply the amount of traffic heading to your database - which despite caching strategies, tuning and clustering is always likely to end up being the performance bottleneck of your Drupal site.

It also means your implementation is tightly coupled to Drupal - and worse, most probably the specific version of Drupal you’ve implemented. With Drupal 8 imminent this means you’re most likely increasing the amount of re-work required when you come to upgrade or migrate between versions.

It’s all PHP

Drupal sites can benefit hugely from being part of the larger PHP ecosystem. With Drush make, the Libraries API, Composer Manager, and others providing the means of pulling external, non-Drupal PHP libraries into a Drupal site, there are huge opportunities for building complexity in your Drupal solution without tying yourself to specific Drupal versions, or even to Drupal at all. This could become particularly valuable as we enter the transition period between Drupal 7 and 8.

In this scenario, custom business logic can be provided in a framework agnostic PHP library and a Naked Module approach can be used to provide the glue between that library and Drupal - utilising Composer to download and install dependencies.
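In practice that glue can be as thin as a single Composer requirement (a sketch; the package name is purely illustrative):

$ composer require acme/pricing-engine   # hypothetical library

with the naked module containing little more than the Drupal hooks that delegate to the library.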

This approach is becoming more and more widespread in the Drupal community with Commerce Guys (among others) taking a libraries first approach to many components of Commerce 2.x which will have generic application outside of Drupal Commerce.

The major advantage of building framework agnostic libraries is that if you ever come to re-implement something in another framework, or a new version of Drupal, the effort of migrating should be much lower.

Integrate

Building on the previous two patterns, one of Drupal’s great strengths is how easy it is to integrate with other platforms and technologies. This gives us great opportunity to implement functionality in the most appropriate technology and then simply connect to it via web services or other means.

This can be particularly useful when integrating with “internal” services - services that you don’t intend to expose to the general public (but may still be external in the sense of being SaaS platforms or other partners in a multi-supplier ecosystem). It is also a useful way to start using Drupal as a new part of your ecosystem, consuming existing services and presenting them through Drupal to minimise the amount of architectural change taking place at one time.

Building a solution in this componentised and integrated manner gives several advantages:

  • Separation of concerns - the development, deployment and management of the service can be run by a completely separate team working in a different bounded context. It also ensures logic is nicely encapsulated and can be changed without requiring multiple front-end changes.
  • Horizontal scalability - implementing services in alternate technologies lets us pick the most appropriate for scalability and resilience.
  • Reduce complex computation taking place in the web tier and let Drupal focus on delivering top quality web experience to users. For example, rather than having Drupal publish and transform data to an external platform, push the raw data into a queue which can be consumed by “non-Drupal” processes to do the transform and send.
  • Enable re-use of business logic outside of the web tier, on other platforms or with alternative front ends.
Nearly-Headless Drupal

Headless Drupal is a phrase that has gained a lot of momentum in the Drupal community - the basic concept being that Drupal purely responds with RESTful endpoints, and completely independent front-end code, using frameworks such as Angular.js, is used to render the data and completely separate content from presentation.

Personally, I prefer to think of a “nearly headless” approach - where Drupal is still responsible for the initial instantiation of the page, and a framework like Angular is used to control the dynamic portion of the page. This lets Drupal manage the things it’s good at, like menus, page layout and content management, whilst the “app” part is dropped into the page as another re-usable component and only takes over a part of the page.

For an example use case, you may have business requirements to provide data from a service which is also provided as an API for consumption by external parties or mobile apps. Rather than building this service in Drupal, which while possible may not provide optimal performance and management opportunities, this could be implemented as a standalone service which is called by Drupal as just another consumer of the API.

From an Angular.js (or insert frontend framework of choice) app, you would then talk directly to the API, rendering the responses dynamically on the front end, but still use Drupal to build everything and render the remaining elements of the page.

Summing up

As we’ve seen, Drupal is an incredibly powerful solution, providing the capability for highly-consolidated architectures encapsulated in a single tool, a perfect enabler for projects with low resources and rapid development timescales. It’s also able to take its place as a mature part of an enterprise architecture, with integration capabilities and rich programming APIs able to make it the hub of a Microservices or Service Oriented Architecture.

Each pattern has pros and cons, and what is “right” will vary from project to project. What is certain though, is that Drupal’s true strength is in its ability to play well with others and do so to deliver first class digital experiences.

New features in Drupal 8 will only continue to make this the case, with more tools in core to provide the ability to build rich applications, RESTful APIs for entities out of the box allowing consumption of that data on other platforms (or in a headless front-end), improved HTTP request handling with Guzzle improving options for consuming services outside of Drupal, and much more.

Drupal integration patterns was originally published by Capgemini at Capgemini on May 15, 2015.

Categories: Elsewhere

X-Team: DrupalCon Latin America through the eyes of an X-Teamer (Part 2)

Planet Drupal - Thu, 14/05/2015 - 23:25

Drupal left the island. Larry Garfield brought us an awesome speech about Drupal 8, which you can see here. Dries explained to us why Drupal 8 had to change. Larry showed us those changes, and everybody loved his demo, but the important thing came after that. The inline editor, content management improvements, rendering HTML5 are...

The post DrupalCon Latin America through the eyes of an X-Teamer (Part 2) appeared first on X-Team.

Categories: Elsewhere
