Feed aggregator
The UDD bugs interface currently knows about the following release critical bugs:
- In Total: 178 bugs
- Affecting Jessie: 172 (key packages: 104). That's the number we need to get down to zero before the release. They can be split in two big categories:
  - Affecting Jessie and unstable: 128 (key packages: 80). Those need someone to find a fix, or to finish the work to upload a fix to unstable:
    - 19 bugs are tagged 'patch'. (key packages: 10) Please help by reviewing the patches, and (if you are a DD) by uploading them.
    - 8 bugs are marked as done, but still affect unstable. (key packages: 5) This can happen due to missing builds on some architectures, for example. Help investigate!
    - 101 bugs are neither tagged patch, nor marked done. (key packages: 65) Help make a first step towards resolution!
  - Affecting Jessie only: 44 (key packages: 24). Those are already fixed in unstable, but the fix still needs to migrate to Jessie. You can help by submitting unblock requests for fixed packages, by investigating why packages do not migrate, or by reviewing submitted unblock requests.
How do we compare to the Squeeze release cycle?

| Week | Squeeze | Wheezy | Jessie |
|------|---------|--------|--------|
| 43 | 284 (213+71) | 468 (332+136) | 319 (240+79) |
| 44 | 261 (201+60) | 408 (265+143) | 274 (224+50) |
| 45 | 261 (205+56) | 425 (291+134) | 295 (229+66) |
| 46 | 271 (200+71) | 401 (258+143) | 427 (313+114) |
| 47 | 283 (209+74) | 366 (221+145) | 342 (260+82) |
| 48 | 256 (177+79) | 378 (230+148) | 274 (189+85) |
| 49 | 256 (180+76) | 360 (216+155) | 226 (147+79) |
| 50 | 204 (148+56) | 339 (195+144) | ??? |
| 51 | 178 (124+54) | 323 (190+133) | 189 (134+55) |
| 52 | 115 (78+37) | 289 (190+99) | 147 (112+35) |
| 1 | 93 (60+33) | 287 (171+116) | 140 (104+36) |
| 2 | 82 (46+36) | 271 (162+109) | 157 (124+33) |
| 3 | 25 (15+10) | 249 (165+84) | 172 (128+44) |
| 4 | 14 (8+6) | 244 (176+68) | |
| 5 | 2 (0+2) | 224 (132+92) | |
| 6 | release! | 212 (129+83) | |
| 7 | release+1 | 194 (128+66) | |
| 8 | release+2 | 206 (144+62) | |
| 9 | release+3 | 174 (105+69) | |
| 10 | release+4 | 120 (72+48) | |
| 11 | release+5 | 115 (74+41) | |
| 12 | release+6 | 93 (47+46) | |
| 13 | release+7 | 50 (24+26) | |
| 14 | release+8 | 51 (32+19) | |
| 15 | release+9 | 39 (32+7) | |
| 16 | release+10 | 20 (12+8) | |
| 17 | release+11 | 24 (19+5) | |
| 18 | release+12 | 2 (2+0) | |
As I said, I did not let certain events that begin with “lea” and end with “ing” prevent me from organising a Debian/m68k hack weekend. Well, that weekend is now.
I’m too unorganised, and I spent too much time over the last few evenings organising things, so I have already built up a sleep deficit ☹ and the feedback was slow. (But so are the computers.) And someone I’d have loved to come was hurt and can’t make it.
On the plus side, several people I’ve long wanted to meet IRL are coming, either already today or tomorrow. I hope we all will have a lot of fun.
Legal disclaimer: “Debian/m68k” is a port of Debian™ to m68k. It used to be official, but now isn’t. It belongs to debian-ports.org, which may run on DSA hardware, but is not acknowledged by Debian at large, unfortunately. Debian is a registered trademark owned by Software in the Public Interest, Inc.
I stumbled upon this site thanks to Helga: Parable of the Polygons. On the site you can interactively find out how harmless choices can make a harmful world. I found it quite eye-opening. What caught me most, though it isn't part of the site, is that only unhappy polygons are willing to move. Those who are merely OK with their neighbourhood, but not really happy about it, won't move. Which made me try it out in my own way: trying to create the most diverse environment possible by temporarily making as many polygons as possible unhappy, to find out whether that makes as many polygons as possible happy in the long run.
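The dynamic on that site is essentially Schelling's model of segregation. Here is a rough Python sketch of the rule "only unhappy shapes move" (purely illustrative, not the site's actual code; the grid size, happiness threshold and movement rule are my own assumptions):

```python
import random

def schelling(width=20, n_agents=300, threshold=0.3, steps=50):
    """Simulate a Schelling-style grid: agents of two kinds move
    to a random empty cell whenever too few neighbours are like them."""
    # Empty cells hold None; agents are 0 or 1.
    grid = {(x, y): None for x in range(width) for y in range(width)}
    for i, cell in enumerate(random.sample(list(grid), n_agents)):
        grid[cell] = i % 2  # alternate the two kinds

    def unhappy(cell):
        kind = grid[cell]
        x, y = cell
        neighbours = [grid.get((x + dx, y + dy))
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if (dx, dy) != (0, 0)]
        same = sum(1 for n in neighbours if n == kind)
        occupied = sum(1 for n in neighbours if n is not None)
        return occupied > 0 and same / occupied < threshold

    for _ in range(steps):
        movers = [c for c in grid if grid[c] is not None and unhappy(c)]
        empties = [c for c in grid if grid[c] is None]
        random.shuffle(movers)
        for c in movers:
            if not empties:
                break
            dest = empties.pop(random.randrange(len(empties)))
            grid[dest], grid[c] = grid[c], None
            empties.append(c)  # the vacated cell becomes available
    return grid
```

Even with modest intolerance thresholds, runs of this sketch tend to end up visibly segregated, which is the point the site makes interactively.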
... which is actually part of the way I see my own life. I always sort of tried to get people to think. I mean, it's not that common that you see a male-looking person wearing a skirt. And ... since I moved out in July into a small intermediate flat, and thus a new neighbourhood, I found the confidence (in part also attributable to the confidence built up at these fine feminist conferences) to walk my hometown in a skirt. Only on a few occasions, when meeting up with friends, mostly in the evening or at night, but it was always a nice experience. To be honest, I only felt uncomfortable once, when there was a probably right-wing skinhead at the subway station. There were too many other people around, so I tried to avoid eye contact, but it didn't feel good.
Diversity is something that society needs. In all aspects. Also within the Debian project. I strongly believe that there can't be much innovation and moving forward if all people think in the same direction. That only means that potential alternative paths won't even get considered, and may get lost. That's one of the core parts of what makes the Free Software community lively and useful. People try different approaches, and in the end there will be adopters of what they believe is the better project. Projects pop up every now and then; others starve from loss of interest, users not picking them up, or developers spending their time on other things, and that's absolutely fine too. There is always something to be learned, even from those situations.
Speaking of diversity, there is a protest going on later today because the boss of a cafe here in Vienna considered it a good idea to kick out a lesbian couple because they kissed each other in greeting, told them that there is no place for their "otherness" in her traditional Viennese cafe, and said they should rather take it to a brothel. Yesterday she apologised for the tone she had used, saying she should have been more relaxed, as the CEO of that cafe. Which literally means she apologised only for the tone she used in her role, but not at all for the message she conveyed. So meh; I hope there will be many people at the protest. Yes, there is an anti-discrimination law around, but it only covers the workplace, not service areas. Welcome to Austria.
On the upside, a court struck down the ban on same-sex couple adoption just the other day. Hopefully there is still hope for this country. :)
It's 2015. locate still works by a linear scan through a flat file.
In December, 46 work hours were split equally among 4 paid contributors (note that Thorsten and Raphaël actually spent more hours, because they took over some hours that Holger did not do over the previous months). Their reports are available:
Compared to last month, the number of paid work hours has barely increased (we are at 48 hours per month). We still have a couple of new sponsors in the pipeline, but with the new year they have not completed the process yet. Hopefully next month will see a noticeable increase.
As usual, we are looking for more sponsors to reach our minimal goal of funding the equivalent of a half-time position. Those of you who are struggling to spend money in the last quarter due to budget overrun, now is a good time to see if you want to include Debian LTS support in your 2015 budget!
In terms of security updates waiting to be handled, the situation looks similar to last month: the dla-needed.txt file lists 30 packages awaiting an update (3 more than last month), and the list of open vulnerabilities in Squeeze shows about 56 affected packages in total. We are not managing to clear the backlog, but it's not getting significantly worse either.

Thanks to our sponsors
- Gold sponsors:
- Silver sponsors:
- AD&D – David Ayers – IntarS Austria
- Domeneshop AS
- Trollweb Solutions
- Université Lille 3
- Bronze sponsors:
Today is a much anticipated day. Some said it would never come. Others said that if it did come it could ultimately mean nothing. Still others, and myself, believe that it is a red-letter date that will be long celebrated as the day that Backdrop CMS 1.0 was launched.
Backdrop CMS is a comprehensive CMS for non-profits and small to medium-sized businesses. Not inconsequentially, it is a Drupal fork.
I am on the record as saying that Backdrop CMS supplies a need in the market place. I believe that Drupal 8 is charging ahead into new and innovative territory and leaving behind a disenfranchised segment of the Drupal community. This segment includes both those who need websites and those who create them. The Drupal 7 methodology is a proven success. Backdrop CMS builds upon that success and prepares the way for a continued bright and prosperous future.
Before I continue my take on all of this, let me give you a brief glimpse of the early days of Backdrop CMS.
- June 2012
  - Nate, Jen, and a few others have a conversation discussing the many changes in Drupal 8 and how those changes come with a cost. The idea comes up of getting "back" to the principles that led to Drupal's amazing success. The name Backdrop is first coined.
- June 27, 2013
  - Nate registers BackdropCMS.org and on the same date publishes the Backdrop repository.
  - The first tweet appears referencing Backdrop.
- September 11, 2013
  - Among the tweets is Nate's tweet announcing the purpose of the fork and the new BackdropCMS.org website.
  - Also in September, an IndieGoGo crowdfunding campaign is created (https://www.indiegogo.com/projects/backdrop-cms/x/5200252) to fund the development of Backdrop CMS. By its close on November 10th, it raises $6,685, including contributions from some highly visible Drupal community members.
- DrupalEasy Podcast- 9/16/13 - http://drupaleasy.com/podcast/2013/09/backdropeasy-podcast-114-no-crying-sprints
- Acquia Blog by Jesse Beach - “Drupal will not be ugly; we will not punish dissent” - https://www.youtube.com/watch?v=YxCDI-ONqDo&noredirect=1
- Drupalize.me Podcast- 9/20/13 https://www.lullabot.com/blog/podcasts/drupalizeme-podcast/26-backdrop-drupal-fork
- synapsesoftware.com Podcast - 9/24/13 - http://synapsesoftware.com/podcast/episode-6-how-choose-software-development-company
- Talking Drupal Podcast - 10/2/13 - https://www.youtube.com/watch?v=YxCDI-ONqDo&authuser=0
- Quora.com - 10/8/13 - “Will BackDrop CMS - the fork of Drupal 7 - be a viable alternative to Drupal 8 or will it fail? What factors will determine its success or failure?” http://www.quora.com/Will-BackDrop-CMS-the-fork-of-Drupal-7-be-a-viable-alternative-to-Drupal-8-or-will-it-fail-What-factors-will-determine-its-success-or-failure
- Modules Unraveled - 10/11/13 - https://modulesunraveled.com/podcast/081-backdrop-and-drupal8-discussion-jen-lampton-nate-haug-john-albin-wilkins-and-alex
I don’t know how many people want to weed through all of those blogs and podcasts, but I can tell you that many of us lived through it and it was exciting! I was glued to my Mac trying to keep track of it all. I wanted to hear what the Drupal Easy guys had to say. I wanted to hear what Nate and Jen had to say on their interviews. I wanted some sense of what people thought about it, how they reacted, etc. In short, it was very important to me that I maintained some sense of how the Drupal Community at large felt about the whole idea of Backdrop CMS.
Why care what the Drupal community feels about BackdropCMS?
Well… Back on 9/11/13 when the twitter-sphere lit up about BackdropCMS, I too tweeted out that I supported it. Within minutes of my tweet, I got an email from Dries asking me why I supported it. We ended up discussing many aspects of the whole situation. We didn’t really end the exchange with anything concrete other than the fact that we started with different perspectives and feelings and we ended with those same different feelings and perspectives. Since then, I've enjoyed many conversations with community members as we discuss the impact that Backdrop CMS is having or will have on the Drupal Community.
As the title so boldly states, Backdrop CMS 1.0 is here! This is going to change many things. I look forward to watching as more and more people discover this new and powerful tool and begin recommending it to their clients who already love Drupal and wish to enjoy increased functionality without the dramatic changes.
Frameworks are pretty and shiny and, like Drupal, can be fun, but you have to use them in the right context. So here's when to use a framework, and by default when not to as well.
A persistent problem that I encounter with hard disks is the capacity limit. If only hard disks could expand like the Tardis.
My current setup at home involves a HP Microserver. It has four drive bays carrying two SSDs (for home directories) and two Western Digital RE4 2TB drives for bulk data storage (photos, source tarballs and other things that don't change often). Each pair of drives is mirrored. I chose the RE4 because I use RAID1 and they offer good performance and error recovery control which is useful in any RAID scenario.
When I put in the 2TB drives, I created a 1TB partition on each for Linux md RAID1 and another 1TB partition on each for BtrFs.
Later I added the SSDs and chose BtrFs again, as it had been working well for me.

Where to from here?
Since getting a 36 megapixel DSLR that produces 100MB raw images and 20MB JPEGs I've been filling up that 2TB faster than I could have ever imagined.
I've also noticed that vendors are offering much bigger NAS and archive disks so I'm tempted to upgrade.
First I looked at the Seagate Archive 8TB drives. 2TB bigger than the nearest competition. Discussion on Reddit suggests they don't have Error Recovery Control / TLER however and that leaves me feeling they are not the right solution for me.
Then I had a look at the WD Red. Slightly less performant than the RE4 drives I run now, but with the possibility of 6TB per drive, and a little cheaper. Apparently they have TLER though, just like the RE4 and other enterprise drives.

Will 6 or 8TB create new problems?
This all leaves me scratching my head and wondering about a couple of things though:
- Will I run into trouble with the firmware in my HP Microserver if I try to use such a big disk?
- Should I run the whole thing with BtrFs and how well will it work at this scale?
- Should I avoid the WD Red and stick with RE4 or similar drives from Seagate or elsewhere?
If anybody can share any feedback it would be really welcome.
You can still make an important contribution to Drupal 8. Drupal Global Sprint 2015-New England takes place this Saturday, January 17, from 10 AM to 5 PM at Genuine in Boston. Acquia is co-sponsoring the event and we invite you to RSVP and jump into the community.
I've had some trouble using Twig's include statements in Drupal 8 theming. I'm not sure if this is a bug since it's at Beta 4, but it's sort of annoying. I include my content areas in page.html.twig in a separate include file in Drupal 6 and insert it into the area I need. For example, if I have a 3 column layout, I'm changing the Bootstrap classes from "col-md-12" to "col-md-9" and "col-md-3" (for a sidebar) if the sidebars have content in them. Inclu
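For context, the include-based layout pattern being described looks roughly like this in Twig (file, region, and class names here are illustrative, not the actual theme's code):

```twig
{# page.html.twig -- pull the main content area in from a separate file,
   and switch Bootstrap column classes when a sidebar has content #}
{% if page.sidebar_first %}
  <div class="col-md-9">{% include 'content-area.html.twig' %}</div>
  <div class="col-md-3">{{ page.sidebar_first }}</div>
{% else %}
  <div class="col-md-12">{% include 'content-area.html.twig' %}</div>
{% endif %}
```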
After releasing Weblate 2.0 with its Bootstrap-based UI, there were still a lot of things to improve. Weblate 2.1 brought more consistency in using buttons with colors and icons. Weblate 2.2 will bring improvements in other graphical elements.
One thing that has been in our issue tracker for quite a long time is providing our own renderer for the SVG status badge. So far Weblate has offered either a PNG badge or an external SVG rendered by shields.io. Relying on an external service was not good in the long term and also caused requests to a third-party server on many pages, which could be considered bad privacy-wise.
Since this week, Weblate can render the SVG badge on its own, matching the current style used by other services (e.g. Travis CI):
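Conceptually, server-side badge rendering only needs to emit a small SVG document. A rough sketch of the idea (not Weblate's actual code; the width arithmetic is a crude stand-in for real text measurement):

```python
def svg_badge(label, value, color="#4c1"):
    """Render a shields.io-style two-part status badge as an SVG string:
    a grey label box next to a colored value box."""
    lw = 6 * len(label) + 10   # crude width estimates
    vw = 6 * len(value) + 10
    return f'''<svg xmlns="http://www.w3.org/2000/svg" width="{lw + vw}" height="20">
  <rect width="{lw}" height="20" fill="#555"/>
  <rect x="{lw}" width="{vw}" height="20" fill="{color}"/>
  <g fill="#fff" font-family="DejaVu Sans,sans-serif" font-size="11" text-anchor="middle">
    <text x="{lw / 2}" y="14">{label}</text>
    <text x="{lw + vw / 2}" y="14">{value}</text>
  </g>
</svg>'''
```

Serving this directly removes the dependency on shields.io and the third-party requests that came with it.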
If you're running Spamassassin on Debian or Ubuntu, have you enabled automatic rule updates? If not, why not? If possible, you should enable this feature. It should be as simple as setting "CRON=1" in /etc/default/spamassassin. If you choose not to enable this feature, I'd really like to hear why. In particular, I'm thinking about changing the default behavior of the Spamassassin packages such that automatic rule updates are enabled, and I'd like to know if (and why) anybody opposes this.
Spamassassin hasn't been providing rules as part of the upstream package for some time. In Debian, we include a snapshot of the ruleset from an essentially arbitrary point in time in our packages. We do this so Spamassassin will work "out of the box" on Debian systems. People who install Spamassassin from source must download rules using Spamassassin's updates channel. The typical way to use this service is to use cron or something similar to periodically check for rule changes via this service. This allows the anti-spam community to quickly adapt to changes in spammer tactics, and for you to actually benefit from their work by taking advantage of their newer, presumably more accurate, rules. It also allows for quick reaction to issues such as the ones described in bugs 738872 and 774768.
If we do change the default, there are a couple of possible approaches we could take. The simplest would be to simply change the default value of the CRON variable in /etc/default/spamassassin. Perhaps a cleaner approach would be to provide a "spamassassin-autoupdates" package that would simply provide the cron job and a simple wrapper program to perform the updates. The Spamassassin package would then specify a Recommends relationship with this package, thus providing the default enabled behavior while still providing a clear and simple mechanism to disable it.
Unfortunately I could not go on stage at the 31st Chaos Communication Congress to present reproducible builds in Debian alongside Mike Perry from the Tor Project and Seth Schoen from the Electronic Frontier Foundation. I've tried to make up for it, though… and we have made amazing progress.

Wiki reorganization
What was a massive and frightening wiki page now looks really more welcoming:
Depending on what one is looking for, it should be much easier to find. There's now a high-level status overview on the landing page, maintainers can learn how to make their packages reproducible, enthusiasts can more easily find ways to help the project, and we have even started writing some history.

.buildinfo for all packages
New Year's Eve saw me hacking Perl to write dpkg-genbuildinfo. Similar to dpkg-genchanges, it's run by dpkg-buildpackage to produce .buildinfo control files. This is where the build environment and the hashes of source and binary packages are recorded. This script, integrated with dpkg, replaces the previous debhelper interim solution written by Niko Tyni.
We used to fix mtimes in control.tar and data.tar using a specific addition to debhelper named dh_fixmtimes. To better support the ALWAYS_EXCLUDE environment variable, and for pragmatic reasons, we moved the process into dh_builddeb.
Both changes were quickly pushed to our continuous integration platform. Before, only packages using dh would create a .buildinfo and thus eventually be considered reproducible. With these modifications, many more packages had their chance… and this shows:
Yes, with our experimental toolchain we are now at more than eighty percent! That's more than 17200 source packages!

srebuild
Given a .buildinfo file, srebuild first finds a timestamp of Debian sid on snapshot.debian.org which contains the requested packages in their exact versions. It then runs sbuild with the right architecture, as given by the .buildinfo file, and the right base system to upgrade from, as given by the version of the base-files package in the .buildinfo file. Using two hooks, it installs the right package versions and verifies that the installed packages are at the right versions before the build starts.

Understanding problems
Over 1700 packages have now been reviewed to understand why build results could not be reproduced on our experimental platform. The variations between the two builds are currently limited to time and file ordering, but this still has uncovered many problems. There are still toolchain fixes to be made (more than 180 packages for the PHP registry) which can make many packages reproducible at once, but others like C pre-processor macros will require many individual changes.
debbindiff, the main tool used to understand differences, has gained support for .udeb, TrueType and OpenType fonts, and PNG and PDF files. It's less likely to crash on problems with encodings or external tools. But most importantly for large packages, it has been made a lot faster, thanks to Reiner Herrmann and Helmut Grohne. Helmut has also been able to spot cross-compilation issues using debbindiff!

Targeting our efforts
It gives warm fuzzy feelings to hit the 80% mark, but it would be a bit irrelevant if this would not concern packages that matter. Thankfully, Holger worked on producing statistics for more specific package sets. Mattia Rizzolo has also done great work to improve the scripts generating the various pages visible on reproducible.debian.net.
All essential and build-essential packages, except gcc and bash, are considered reproducible or have patches ready. After some lengthy builds, I also managed to come up with a patch to make linux build reproducibly.

Miscellaneous
After my initial attempt to modify r-base to remove a timestamp in R packages, Dirk Eddelbuettel discussed the issue with upstream and came up with a better patch. The latter has already been merged upstream!
Identifiers generated by xsltproc have also been an issue. After reviewing my initial patch, Andrew Ayer came up with a much nicer solution. Its potential performance implications need to be evaluated before submission, though.
Chris West has been working on packages built with Maven amongst other things.
PDF generated by GhostScript, another painful source of troubles, is being worked on by Peter De Wachter.
Holger got X.509 certificates signed by the CA cartel for jenkins.debian.net and reproducible.debian.net. No more scary security messages now. Let's hope next year we will be able to get certificates through Let's Encrypt!

Let's make a difference together
As you can imagine with all that has happened in the past weeks, the #debian-reproducible IRC channel has been a cool place to hang out. It's very energizing to get together to share contributions, exchange tips, and discuss the hardest points. Mandatory quote:

* h01ger is very happy to see again and again how this is a nice learning circle...! i've learned a whole lot here too... in just 3 months... and its going on...!
Reproducible builds are not going to change anything for most of our users. They simply don't care how they get software on their computer. But they do care about getting the right software without having to worry about it. That's our responsibility as developers. Enabling users to trust their software is important, and a major contribution that we, as Debian, can make to the wider free software movement. Once Jessie is released, we should make a collective effort to make reproducible builds a highlight of our next release.
- Okay, I can already see huge benefits of utilizing these tools. But, I’d love to get your opinion on what the benefits are for Developers/Site-builders/Themers?
- There are two big benefits as I see them, and another not so apparent. First, a lot of these tasks are repetitive. And things like copying a database may take a bit of time. Or merging code. Or running tests. Etc. Anything that you can automate means time you can spend on other things. Second, not everyone is as experienced - or maybe they don’t have the permissions - to execute all the tasks. You don’t want mistakes, you don’t want to give everyone permissions. So you figure out the best way to do it and then you automate it. The last reason is not as obvious. I think a lot of times we hack things together in our development environments to get it working - but then may run into issues later on. We don’t spend the extra time because it’s temporary. By spending a little extra time getting it right, we have created a reusable pattern that we can use on our next project. By encapsulating our best practices, we not only have a quicker setup, but we have a better one too.
- Perfect. So, save time by automating tasks like copying a database. Prevent mistakes by limiting who has permissions to execute tasks, and automating them so that even those who do have permission can’t introduce user error. And by setting up a process that uses best practices, creating new environments is faster, and better than if I had to try to remember all of the steps myself.
- Exactly. And I’ll add, ansible can be used for each of installation, configuration, and orchestration. The examples we’ve talked about so far are orchestration - moving databases, code, etc. It can also be used to install Apache, Mysql, Mongodb, etc. Any set of system commands that are repeatable.
- Oh... So if you’ve got a server that you have full access to, you could actually wipe and rebuild the entire server AND Drupal site? We’re not limited to just configuring the Drupal site?
- Exactly. And throw Vagrant into the mix and now you can do that on your local machine using virtual machines. Imagine spinning up a brand new VM and within a few clicks you have your entire development environment with a fresh Drupal install all ready for you on a VM.
- Now, I do wonder who this is more geared toward: developers, site-builders, or themers. I understand that each of them can use these tools, and they would probably help them all with their daily tasks, but who do you see benefiting the most? Or do you have examples of people in each category that you know are using them?
- I think all three benefit from automation. For example, in a previous life where I didn’t use Ansible, my themer was insanely good at theming, but when it came to running commands remotely on a server to check out his work, he was a fish out of water. I wish I had written an Ansible playbook so that he could check his code out onto staging. Or even better, if I had set up Jenkins to run an Ansible playbook to automatically check out his work each time he committed. He wouldn’t have had to wait on me, sometimes a few days if I was not around. That said, he would not have been able to create the Ansible playbook.
- As for who is using Ansible, well, Twitter does - they use it to roll out updates across their hundreds of machines. And of course BlackMesh, the hosting company I work for, also does. The product Cascade I mentioned uses ansible and Jenkins to do a lot of the things we talked about today, only we set it up so you don’t have to.
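To make the installation example above concrete, a minimal Ansible playbook for the "install Apache and MySQL" case might look like this (a hypothetical sketch; the host group name and package list are assumptions for illustration):

```yaml
# Hypothetical playbook: install and start Apache and MySQL
# on a Debian-family host group called "webservers".
---
- hosts: webservers
  sudo: yes
  tasks:
    - name: Install packages
      apt: name={{ item }} state=present update_cache=yes
      with_items:
        - apache2
        - mysql-server

    - name: Make sure services are running
      service: name={{ item }} state=started enabled=yes
      with_items:
        - apache2
        - mysql
```

Because the playbook is declarative and idempotent, running it twice is harmless, which is exactly the "encapsulated best practice" benefit described above.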
The Drupal 7 Auto Assign Role module gives you a lot of flexibility in deciding which roles users receive on your Drupal 7 website. If you have ever needed to allow a user to select their own role, or to automatically assign a role to every user on your Drupal site, this is the module for you.
Happy birthday to Drupal! On this day in 2001, Drupal 1.0 was released.
This milestone is the perfect time to talk about some of the findings of our recent community survey. The survey findings offer a window into what community members are thinking as the project matures and evolves. It also gives us at the Drupal Association a way to better understand what we're doing right and what we could be doing better. There aren't many surprises (and that's a good thing), but all of the findings are educational. Here are three results we thought were particularly interesting and insightful.

Drupal 8 Will Be Broadly Adopted
In the survey, about 80% of respondents said they either plan to start using Drupal 8 as soon as it is released, or plan to adopt it at some point after release. Another 8% said they did not have specific plans to adopt, but do plan to evaluate Drupal 8.
Drupal.org Remains an Important and Heavily-Used Tool
The overwhelming majority of respondents said they use Drupal.org more than once per week. Most also say they are satisfied or somewhat satisfied with the site. While that result is encouraging, it does not change the important mission to improve the experience of the site and make it a better tool for everyone from first time visitors to those who spend the majority of their working time on the site.
We Need to Create Broader Awareness of Drupal Association Programs
Community members who took the survey have great awareness of DrupalCons. Awareness of the work we are doing on Drupal.org seems to be steadily growing. But awareness is relatively low for Community Grants and our Supporter Programs that provide a way for organizations to give back to the Project. That awareness is clearly something we need to improve to promote transparency.
If you would like to read the full results, you can access them here (2.8M PDF). Thanks for reading, and thanks for being a part of this amazing community.