Feed aggregator
I sent this email to debian-private a few days ago, on the 10th anniversary of my Debian account creation:

Date: Fri, 14 Aug 2015 19:37:20 +0200
From: David Moreno
To: email@example.com
Subject: Retiring from Debian
User-Agent: Mutt/1.5.23 (2014-03-12)

[-- PGP output follows (current time: Sun 23 Aug 2015 06:18:36 PM CEST) --]
gpg: Signature made Fri 14 Aug 2015 07:37:20 PM CEST using RSA key ID 4DADEC2F
gpg: Good signature from "David Moreno "
gpg:                 aka "David Moreno "
gpg:                 aka "David Moreno (1984-08-08) "
[-- End of PGP output --]

[-- The following data is signed --]

Hi,

Ten years ago today (2005-08-14) my account was created:

https://nm.debian.org/public/person/damog

Today, I don't feel like Debian represents me and neither do I represent the project anymore. I had tried over the last couple of years to retake my involvement but lack of motivation and time always got in the way, so the right thing to do for me is to officially retire and gtfo.

I certainly learned a bunch from dozens of Debian people over these many years, and I'm nothing but grateful to all of them; I will for sure carry the project close to my heart — as I carry it with the Debian swirl I still have tattooed on my back ;)

http://damog.net/blog/2005/06/29/debian-tattoo/

I have three packages left that have not been updated in forever and you can consider orphaned now: gcolor2, libperl6-say-perl and libxml-treepp-perl.

With all best wishes,
David Moreno.
http://damog.net/

[-- End of signed data --]
I received a couple of questions about my decision here. I basically don’t feel like Debian represents my interests and neither do I represent the project – this doesn’t mean I don’t believe in free software; on the contrary, I think some of the best software advancements we’ve made as a society are thanks to it. I don’t necessarily believe in how the project has evolved, whether that has been the right way to regain relevancy and dominance, and whether it has remained primarily a way to feed dogmatism versus pragmatism. This is the perfect example of a tragic consequence. I was very happy to learn that the current Debian Conference being held in Germany got the highest attendance ever; hopefully that can be utilized in a significant and useful way.
Regardless, my contributions to Debian were never noteworthy so it’s also not that big of a deal. I just need to close cycles myself and move forward, and the ten year anniversary looked like a significant mark for that.
Poke me in case you wanna discuss some more. I’ll always be happy to. Especially over beer :)
Over the last few weeks, we've been working really hard on preparing training for Drupal 8.
We'll release the training close to the actual launch date of Drupal 8, but before then we have a great video to get you started.
This is a webinar we did with Acquia called, "10 Things Site Builders Need to Know Before Leaping to Drupal 8".
I talked about the many user-friendly features in Drupal 8, including the mobile-friendly admin interface and the in-place WYSIWYG editor, plus improvements in theming and module development:
As web sites and applications have become more complex, the need for auditing – at multiple points in the lifecycle of a project – has become ever more important.
Before delivery, a web project can be audited to ensure the ability to meet business goals or compliance with regulations. After delivery, an audit can identify problems and propose remedies. In a possible merger or acquisition, an audit can help evaluate the project’s relative benefits and liabilities.
Website auditing has become similar to financial auditing (which is separate and distinct from accounting and financial activities). It is similar to the practices applied in auditing management systems (see “There’s a Module Standard for That” sidebar).
Website auditors must apply these four principles:
- Judgment: They must be able to choose the scope and granularity of the audit, without wasting effort on discovering problems with no meaningful impact on the behavior and performance of the site; hence, a need for business acumen.
- Expertise: In order to determine whether or not best practices were followed by the original site developers, auditors must achieve a level of proficiency beyond that with which the site was delivered.
- Objectivity: Auditors cannot audit a site they themselves produced, or else they risk selective blindness – the inability to see problems they missed the first time around.
- Distance: Auditors cannot operate on a website developed by a company – especially their own – with which they have any kind of commercial or personal involvement.
Market studies show that site audits are often used as a loss leader by generalist Drupal agencies. Their objective: to set the stage for redevelopment and third-party maintenance work, where the main volume of business is done, by using “findings” from a short, low-cost audit to give the developer a technical advantage over competitors.
It’s easy to add node endpoints to your RESTful API - but there’s more to Drupal than nodes. This week we’ll add an endpoint for a taxonomy vocabulary.
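As a sketch of the shape this takes with the Drupal 7 RESTful module's ctools plugin conventions (the vocabulary name "tags", the plugin file name, and the class name are all assumptions for illustration, not the post's actual code), an endpoint for a taxonomy vocabulary is declared with a plugin definition and a small class:

```php
<?php
// tags__1_0.inc -- illustrative plugin definition for a "tags" vocabulary.
$plugin = array(
  'label' => t('Tags'),
  'resource' => 'tags',
  'name' => 'tags__1_0',
  'entity_type' => 'taxonomy_term',
  'bundle' => 'tags',
  'description' => t('Export the "tags" vocabulary.'),
  'class' => 'RestfulEntityTaxonomyTermTags',
);

// The class can stay minimal and inherit the module's generic
// taxonomy-term handling, exposing extra term fields only if needed.
class RestfulEntityTaxonomyTermTags extends \RestfulEntityBaseTaxonomyTerm {
  public function publicFieldsInfo() {
    $public_fields = parent::publicFieldsInfo();
    return $public_fields;
  }
}
```

In the module's own examples the plugin definition and the class live in separate files; they are shown together here only to keep the sketch compact.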
Not every student learns the same way, so teachers consistently have to find a way to instruct a classroom while also reaching students individually.
We were reminded of a teacher’s greatest challenge when we trained dozens of Acquia employees on Drupal 8. As my co-author Kent Gale and I detailed earlier in this series on Drupal 8 instruction, we separated employees into groups of two, with one person having some knowledge of the new platform and the other having no knowledge. Once their instruction ended, they split up and each teamed with two other employees – our version of chromosomal mitosis.
Our approach to training was structured. We had goals to achieve. But we also had to stay flexible throughout. Because experience, knowledge, and skill set differed with each employee, we had to connect with them individually while maintaining the larger class structure.
We had people with deep programming experience. We had middleware folks. We had site builders. We had front-enders. Because of that, the training program had to present a lot of material, but not so much that individuals wouldn’t learn. We trained with the expectation that not everyone would, or even needed to, become experts.
Consider the training of our “explainers,” the employees who explain our products to the public. We had to figure out what they could easily learn in only one to four hours of training. They needed to know enough to promote and answer questions about Drupal 8, but didn’t need to know as much as a support person, who received anywhere from 40 to 80 hours of training. Figuring out what the explainers needed to learn took some effort, but there was ample material to help us determine which path to follow.
Speaking of paths, your team doesn’t have to follow ours. Mitosis worked great for us, but it may pose a problem for your program if you have fewer employees, less time to train, or other considerations.
You need to find out what works best and that, as we’ve mentioned, takes time and effort, success and failure. Some employees like to be lone wolves and learn everything on their own, for example, so our process may not work for them.
Tools that track progress will help you ascertain what works and what doesn’t. Every company, no matter how large or small, faces time constraints, so these tools will guide you through the unknowns.
We used training as a key performance indicator (KPI) for employees. Shared ownership in this big Drupal 8 training project made sense if we all had to make a big leap together in understanding the new platform. Sometimes employees will sweep training under the rug because they believe putting out other fires is a priority.
We knew learning Drupal 8 would be a significant commitment; it’s a significant change, after all. But we couldn’t delay training. Drupal 8 was coming out and there was no time to lose. KPIs helped motivate and get everyone on the same page. There was a vested interest in making progress.

Blog series: Organizing to Rock with Drupal 8
Author: Thomas Howell
There are a bunch of things I mean to blog about, but as I have just got fully home from Heidelberg and DebConf15 this afternoon that seems most appropriate to start with. It’s a bit of a set of disjoint thoughts, but I figure I should write them down while they’re in my head.
DebConf is an interesting conference. It’s the best opportunity the Debian project has every year to come together and actually spend a decent amount of time with each other. As a result it’s a fairly full-on experience, with lots of planned talks as a basis and a wide range of technical discussions and general social interaction filling in whatever gaps are available. I always find it a thoroughly enjoyable experience, but equally I’m glad to be home and doing delightfully dull things like washing my clothes and buying fresh milk.
I have always been of the opinion that the key aspect of DebConf is the face time. It was thus great to see so many people there - we were told several times that this was the largest DebConf so far (~ 570 people IIRC). That’s good in the sense that it meant I got to speak to a lot of people (both old friends and new), but does mean that there are various people I know I didn’t spend enough, or in some cases any, time with. My apologies, but I think many of us were in the same situation. I don’t feel it made the conference any less productive for me - I managed to get a bunch of hacking done, discuss a number of open questions in person with various people and get pulled into various interesting discussions I hadn’t expected. In short, a typical DebConf.
Also I’d like to say that the venue worked out really well. I’ll admit I was dubious when I heard it was in a hostel, but it was well located (about a 30 minute walk into town, and a reasonable bus service available from just outside the door), self-contained with decent facilities (I’m a big believer in having DebConf talks + accommodation be as close as possible to each other) and the room was much better than expected (well, aside from the snoring but I can’t blame the DebConf organisers for that).
One of the surprising and interesting things for me that was different from previous DebConfs was the opportunity to have more conversations with a legal leaning. I expect to go to DebConf and do OpenPGP/general crypto related bits. I wasn’t expecting affirmation about the things I have learnt on my course over the past year, in terms of feeling that I could use that knowledge in the process of helping Debian. It provided me with some hope that I’ll be able to tie my technology and law skills together in a way that I will find suitably entertaining (as did various conversations where people expressed significant interest in the crossover).
Next year is in Cape Town, South Africa. It’s a long way (though I suppose no worse than Portland and I get to stay in the same time zone), and a quick look at flights indicates they’re quite expensive at the moment. The bid presentation did look pretty good though so as soon as the dates are confirmed (I believe this will happen as soon as there are signed contracts in place) I’ll take another look at flights.
In short, excellent DebConf, thanks to the organisers, lovely to see everyone I managed to speak to, apologies to those of you I didn’t manage to speak to. Hopefully see you in Cape Town next year.
A pure maintenance release 0.1.3 of the RcppDE package arrived on CRAN yesterday. RcppDE is a "port" of DEoptim, a popular package for derivative-free optimisation using differential evolution, to C++. By using RcppArmadillo, the code becomes a lot shorter and more legible.
At the Debian Conference 2015 I gave a talk about Continuous Delivery of Debian packages. My slides are available online (PDF, 753KB). Thanks to the fantastic video team there’s also a recording of the talk available: WebM (471MB) and on YouTube.
Every Drupal developer has heard stories of 'clients from hell' and collaborations that turned into nightmares. And it’s true… you’re going to come across some extremely difficult clients. No matter what you do, no matter how hard you work… you’re going to have problems with them.
Remind yourself: “I am not alone!”
All businesses – no matter what kind of company it is...
This blog post describes how to clear the views cache when inserting, updating or deleting a node. If we want to improve site performance, views caching is one of the options.

For example, let's consider a view that displays a list of records and is updated occasionally. We can then render the view's data from the cache, rather than querying the server each time, by setting up caching for that view. We can set the views cache in the Advanced section of the view's settings page. Drupal supports time-based caching by default, for periods of up to six days. If you want to cache the view for longer than six days, select the 'Custom' option and fill in the number of seconds in the 'Seconds' textbox.

Suppose you have cached a view for 30 minutes. Then it won't display updated data for up to 30 minutes, even if a new node that belongs in the view is added, because the view's output is served from the cache. In that situation, the user can't see the new data in the cached view. So we need to clear the views cache when we add a new node; then we can see the new data in the view while it is still rendered from the cache.

Let's see how to clear the views cache when a node is inserted:
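One way to do this is a small custom module (the module name mymodule, the node type article, and the view name latest_articles below are hypothetical, chosen only for the sketch). The node hooks clear the cached data for that view with cache_clear_all(); Views stores its cached results in the cache_views_data bin, keyed by the view name:

```php
<?php
// Helper: wildcard-clear every cache entry for the given view in the
// Views data cache bin (entry IDs there start with "VIEWNAME:").
function mymodule_clear_view_cache($view_name) {
  cache_clear_all($view_name . ':', 'cache_views_data', TRUE);
}

/**
 * Implements hook_node_insert().
 */
function mymodule_node_insert($node) {
  if ($node->type == 'article') {
    mymodule_clear_view_cache('latest_articles');
  }
}

/**
 * Implements hook_node_update().
 */
function mymodule_node_update($node) {
  if ($node->type == 'article') {
    mymodule_clear_view_cache('latest_articles');
  }
}

/**
 * Implements hook_node_delete().
 */
function mymodule_node_delete($node) {
  if ($node->type == 'article') {
    mymodule_clear_view_cache('latest_articles');
  }
}
```

The same helper covers the updating and deleting cases, so the cached view always reflects the latest content while other requests are still served from the cache.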
Earlier in the year we worked with an Australian household name to help them build a next-generation, more performant version of their energy-comparison site.
We used Blackfire.io to optimize the critical elements of the site.
Read on to find out more about our experience.
Triggered by all the bugs around font problems, I spent my weekend, instead of mountaineering, crawling through the TeX Live history for changes and fixes to dvipdfm-x. Thanks to Kakuto-san for his hints, I have pulled out the changes necessary to fix Type1 support in dvipdfm-x and have reincluded them in the Debian texlive-bin package. The uploaded binaries (version 2015.20150524.37493-6) are already compiled against the new C++ ABI, see the Debian transition, so most systems will still need to wait for the update to be installable.
At the same time I did an update to the whole set of arch: all packages (texlive-base, texlive-lang, texlive-extra, version 2015.20150823-1). This was triggered by a bug that seems to be caused by bad interplay between the fontspec and l3 packages. Furthermore, I needed to remove the fontconfig activation of the URW++ Base35 fonts, to ensure that fontconfig always returns the ones from the gsfonts package, instead of a mixture of TeX Live and gsfonts.
An unrelated bug fix: libpaper integration has been fixed and should work again. So for now all the bugs are hopefully settled and we are back to normal. What remains is trying to fix jessie, which is also broken in some respects.

Updated packages

acro, animate, babel-bosnian, babel-french, babel-latin, beamer-FUBerlin, beebe, breqn, chemformula, chet, cnltx, crossrefware, dantelogo, datetime2-it-fulltext, disser, drm, dvipdfmx-def, ecclesiastic, eledform, gradstudentresume, idxcmds, l3build, mcf2graph, media9, pageslts, pdfpages, reledmac, siunitx, tcolorbox, tex4ht, texlive-docindex, texlive-scripts, udesoftec, upmethodology, xindy.

New packages

blochsphere, e-french, fitbox, nar
Wouldn’t it be cool if you could use Rules to periodically set, change or modify certain Drupal site configurations and variables?
Yeah, it would be cool and now it is totally do-able with “Rules Set Site Variables”.
Below is my use case and a simple recipe for how to use Rules (and a few helper modules) to change site configurations periodically. I imagine you can use it for all kinds of other craziness on your Drupal site.

Use Case: Modifying Node.js Host Configuration Variable on Schedule
We are huge fans of Drupal and Node.js and use it on multiple integration projects. Combined together, Drupal provides powerful information handling and Node.js handles real-time scalability.
We use Heroku to host our free Node.js Chatroom Demo site. Unfortunately, Heroku is ending its policy of a forever-free dyno and moving towards limited free usage of 18 hours per day. It’s an understandable business change. For our free Drupal and Node.js chatroom demo, this means I’d need to pay $7 per month to keep the site up and show the [amazingness of Drupal and Node.js](http://int3c.com/tags/nodejs), or shut down the site.
So, instead of paying this (I’m cheap), I decided to setup two Heroku, Node.js instances and simply use a Drupal cron process to flip my node.js server twice a day.
The end result is that twice a day my Drupal site will change which node.js server it is using and thus prevent me from overusing my free, 18 hour daily limit.
Technically, I could have written some custom code, but being a lazy Drupal developer and site builder, I figured there had to be some way to do this with Rules.
Unfortunately there wasn’t at the time. The only similar solution was Conditional Variables, which would only set a session variable and not make a permanent change to the configuration.
My solution was to create "Rules Set Site Variables”, which provides a custom reaction where you set any Drupal site variable. You can read more about that module in "Announcing: Rules Set Site Variables Module.”
In order to solve my use case to switch Heroku Node.js server, the final solution involved using Rules, Rules Once Per Day and "Rules Set Site Variables” as well as a couple of site and rules configurations.
Let’s take a look at how.

Which Modules? Rules, Rules Once Per Day and Rules Set Site Variables
In order to accomplish this task of periodically changing a site configuration, I used three modules (besides whatever else I’m using for Node.js and other parts):
I could have used a different triggering method but Rules Once Per Day was the easiest. Rules Once Per Day creates a Drupal event, which you can control and set to a specific time and which you can use as the triggering event with Rules.
Using "Rules Set Site Variables”, you then have the option of creating a Rules Reaction to set a Drupal variable.
After enabling these modules, there were only a few simple configurations.

Cron and Rules Once Per Day
In order for this to work you are going to need a functioning cron task on your server. This article won’t go into how to configure cron, but essentially your server needs to run cron periodically in order to trigger rules and other things. In general you want cron to run every few minutes.
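For reference, a typical crontab entry that runs Drupal's cron every few minutes can look like this (the drush path and site root are assumptions, not part of this setup; adjust them to your server):

```shell
# Illustrative crontab line: trigger Drupal's cron every 5 minutes.
# Assumes drush is installed at /usr/local/bin/drush and the site
# lives in /var/www/mysite.
*/5 * * * * /usr/local/bin/drush --root=/var/www/mysite --quiet cron
```

Any other mechanism that hits cron.php on a schedule works just as well; the point is only that cron fires often enough for the Rules events below to trigger close to their scheduled time.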
Rules Once Per Day provides a simple configuration settings page where you can set when you want it to run according to your site time. This means that once a day this event will be triggered, and your rules will fire accordingly.
In my case, I use Rules Once Per Day to change the site configuration to one of the Node.js servers, create a future scheduled event in 12 hours, and send myself an email that the whole thing happened.

Configuring Rules to Change a Variable
Once you have created the event side of the rule, your next step is configuring the relevant action, in this case “Set Drupal Site Variable”:
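Under the hood (a sketch, not the module's literal code), a “Set Drupal Site Variable” action boils down to a variable_set() call; the variable name and value below are purely illustrative:

```php
<?php
// Hypothetical variable name and value: point the site at the second
// Heroku Node.js instance. The Rules action performs the equivalent of
// this call when the scheduled event fires.
variable_set('nodejs_server_url', 'https://my-backup-app.herokuapp.com');
```

This is also why the change survives across requests, unlike a session variable: variable_set() persists the value in the site's variable storage.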
As you can see, there are just two parts here: pick the variable you want to change and the text it will be set to.

Setting Up an Additional Rules Scheduled Component
Since I need to have this change every 12 hours, I need two scheduled events. The first scheduled event is from Rules Once a Day. The second event gets scheduled from the first one via a Rules Component.
A Rules Component is basically a conditional action you create that can then be triggered by other things. In this case, it gets scheduled via our original rule.
In order to add this to our original rule, we first need to create a new component and add the reaction events. In this case, I created another “Set Drupal Site Variable” action and another reaction to send myself an email.

Packaging it all together
Using Rules, Rules Once Per Day, Rules Components and Rules Set Site Variables, the final result is a series of configurable items that work together to switch my Heroku Node.js site server twice a day.
The rule configuration looks like this:

Conclusion: You can use Rules to change site configuration
With “Rules Set Site Variables”, you can change site configuration according to various site events. In our case we use scheduled cron events to switch our Heroku server and save some money.
On a wider level, any site builder can now use rules to modify their site’s configuration. Using a bit more configuration and some helper modules, you can make these changes according to a set schedule.
Live long and prosper, Drupal Site Builders.

Tags: drupal, Drupal Planet, Planet Drupal, Node.js, Rules, Site Builder
Author: Mark Koester (@markwkoester)
One nice thing during Drupal 7/8 development is the ability, thanks to the devel module, to get a list of all SQL queries run on a page. As I've been working quite a bit on MongoDB in PHP recently, I wondered how to obtain comparable results when using MongoDB in PHP projects. Looking at the D7 implementation, the magic happens in the Database class:

<?php
// Start logging on the default database.
\Database::startLog(DB_CHANNEL);

// ... the page runs its queries ...

// Get the log contents, typically in a shutdown handler.
$log = \Database::getLog(DB_CHANNEL);
With DBTNG, that's all it takes, and devel puts it to good use UI-wise. So is there an equivalent mechanism in MongoDB? Of course there is!
As I’m writing this, DebConf 15 is coming to an end. I spent most of my time improving the situation of the Haskell packages in Debian, by improving the tooling and upgrading our packages to match Stackage 3.0 and build against GHC 7.10. But that is mostly of special interest (see this mail for a partial summary), so I’d like to use this post to advertise a very small and simple package I just uploaded to Debian:
During one of the discussions here I noticed that it is rather tricky to make a locally built package available to apt-get. The latest version in unstable allows one to install a Debian package simply by running apt-get install on it, but in some cases, e.g. when you want a convenient way to list all packages that you made available for local use, this is insufficient.
So the usual approach is to create a local apt repository with your packages. Which is non-trivial: You can use dpkg-scanpackage, apt-ftparchive or reprepro. You need to create the directories, run the commands, add the repository to your local sources. You need to worry about signing it or setting the right options to make apt-get accept it without signing.
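To make the comparison concrete, the manual route can be sketched roughly like this (directory and file names are purely illustrative, and the snippet only stages the repository without touching your real apt configuration):

```shell
# Sketch of the manual route (all paths here are illustrative):
# a flat local repository indexed with dpkg-scanpackages.
REPO=/tmp/local-repo
mkdir -p "$REPO"

# 1. Copy your locally built .deb files into $REPO.

# 2. Generate the Packages index (dpkg-scanpackages is in dpkg-dev);
#    guarded so the sketch degrades gracefully when the tool is absent.
if command -v dpkg-scanpackages >/dev/null 2>&1; then
  (cd "$REPO" && dpkg-scanpackages . /dev/null > Packages) || true
fi

# 3. Point apt at the directory; [trusted=yes] avoids having to sign
#    a repository that only you can write to.
echo "deb [trusted=yes] file:$REPO ./" > "$REPO/local.list"

# 4. Activate it:
#    sudo cp "$REPO/local.list" /etc/apt/sources.list.d/
#    sudo apt-get update
```

It is exactly this busywork that the local-apt-repository package automates.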
It is precisely this work that my new package local-apt-repository automates for you: Once it is installed, you simply drop the .deb file into /srv/local-apt-repository/ and after the next apt-get update the package can be installed like any other package from the archive.
I chose to use the advanced features that systemd provides – namely activation upon path changes – so it works best with systemd as the init system.
If you want to contribute, or test it before it passes the NEW queue, check out the git repository.
Next week, I'll be running the "Echappee Belle" race: 144 km and 10,000 meters of positive climb in the French Alps (the Belledonne range, this time).
That will be, by far, my longest race ever and indeed a great challenge for me with very difficult tracks (when there are tracks).
I expect to run for about 48 hours, or even up to 55, with two nights out... or maybe less, as I'm in very good shape.
You can follow me on the live tracking site. The race starts on Friday August 28th, 06:00 CEST.
We recently had to debug a site for a customer who was using Apache Solr and that wonderful Drupal module combo that goes with it: Search API and Search API Solr.
We were pleased to find that these components have continued to evolve over the past years, much to the benefit of their users. Setting up Apache Solr for Drupal is now easier than ever before, whether you use the command line or the newish Solr user interface.
In the steps below we've gone mostly for the point and click install.
2) Once downloaded, unzip the .tgz or .zip and move it to your favourite application folder. I have fallen into the habit of abusing /Applications for this on my Mac, but you can use almost any folder that works for you.
3) Now cd into the epicentre of your Solr installation. When reading further documentation this is where it is assumed that you execute your Solr commands from:
4) A quick smoke test is to launch Solr in standalone (as opposed to cloud) mode
$ bin/solr start
Yay! Our Solr is up and running!
5) Let’s hook it up to Drupal by giving it a Search API configuration to work with. In your solr-5.2.1/server/solr directory create a new directory drupal-search and inside that a directory conf. Then drag all the files residing in /sites/all/modules/search_api_solr/solr-conf/5.x into drupal-search/conf.
Or using the command line:
$ mkdir server/solr/drupal-search
$ cp -r [DocumentRoot]/sites/all/modules/search_api_solr/solr-conf/5.x server/solr/drupal-search/conf
6a) Now open a browser window, visit localhost:8983/solr, and click the “No cores available – Go and create one” menu option in the bottom left. See the screenshot at the top of this article.
6b) Verify by picking "drupal-search" in the core selector drop-down. You should see something like this:
No errors? Great!
7) Revisit your Drupal site on the Search API config page, at admin/config/search/search_api and fill it out as shown below.
Press "Save settings" and you should see lots of green, like the screenshot below.
Check and tweak the configuration if necessary via the Edit tab at the top.
8) Made a mistake? You can delete your Solr core like so:
$ bin/solr delete -c drupal-search
Then try again from step 5) or 6).
File under: Planet Drupal
Having had personal contact with Amazon employees, and having read the NYT article Inside Amazon: Wrestling Big Ideas in a Bruising Workplace, I have only a few things to say:
- Most Japanese companies have different methods, but work hours are similar, and the exploitation is even worse (unpaid “voluntary service” overtime, etc.)
- I loved one comment in the MobileRead thread on the article, link to the post:
All you need to do is make one assumption: dishonest people will outperform honest people over the short term. Then every year Amazon’s “culling” will get rid of the honest people and keep the dishonest ones. Rinse and repeat. Soon only the sociopaths are left.
Best supporting quote from the article: “You learn how to diplomatically throw people under the bus.”
What remains – I guess I would be a horrible performer at Amazon, and I am proud of it.
For a small Drupal shop or an individual Drupal consultant, how do you grow? It seems that small Drupal shops face a glass ceiling when they want to move upward. They are not able to win larger projects because they are not big enough: without a team of developers behind them, stakeholders find it hard to trust them or have confidence in delivery. Should we solve this problem by working together in a partnership? Drupal development is very technically intensive work. Let us follow the way lawyers organize their practice: get together and build a strong team.
What are the benefits of running a Drupal shop as a partnership?
1) It is easy to set up, unless we want to form an LLP. As professional Drupal freelancers, we may already have clients. The initial partners sign an agreement and form a partnership with some existing customers in place.
2) A good-sized team gives confidence to customers. It is going to be easier to win bigger projects.
3) Having formed a partnership, we can recruit more junior developers and train them.
The challenge here is that we have never done this before, and we may not have an established model to follow. A comprehensive partnership agreement is needed. Here are some important things we need to think through before forming a partnership:
1) Types of Partnerships (General Partnership or Limited Liability Partnership)
2) Governance and Decision-Making
3) Partner Compensation
4) Capital Contribution
5) Overhead and Liabilities
6) Parental Leaves and Sabbaticals
7) Retirement and Termination.
Professional Drupal developers will benefit from practicing in partnership, as in other professional services. A reputable, good-sized team is capable of catching and delivering bigger and more profitable projects.
Drupal developers provide a highly skilled professional service. Lawyers provide a professional service related to law, and they organize law offices to deliver that service in a respectable way. Why not copy the way they do it to provide our Drupal services?
Referenced document: http://www.cba.org/cba/PracticeLink/WWP/agreement.aspx