When I started looking for a lightweight way of serving a music library over the LAN, I did not expect so many complications. I assumed that wanting something that runs on a SheevaPlug, installable straight from the Debian repository, was not a unique need. Apparently it kind of was.
Debian used to have mt-daapd (popcon: 165), but now it is available from oldstable only and upstream is dead. There is tangerine (popcon: 98) with its Mono dependencies and GUI which seemed to me overkill and more like a demo of a networked application written in Mono than a music library server. The most promising candidate was forked-daapd (popcon: 220) but it was far from being a true winner.
First, it had a series of dead upstreams. It was originally forked from mt-daapd (hence the name) by Julien Blache, who also served as the prior Debian maintainer. The code base was then forked again and converted to use Grand Central Dispatch, and that GCD fork slowly died off as well a few years ago. When I found the package it had been unmaintained for a few years and was based on the GCD branch, which prevented it from building on many architectures, and the server itself was crashing or quitting occasionally.
Luckily there still existed a well-maintained fork by Espen Jürgensen which could serve as a way out, but on closer examination it turned out that it had switched from GCD to libevent, of a version (1.4) which is present only in oldstable! And some say Debian’s software versions are ancient ;-). Moreover, it was not simply libevent 1.4-based: it included some heavily patched parts of it.
Espen also liked the idea of packaging his version in Debian, so we extracted the patches against libevent and slowly got them accepted into libevent’s master.
Forked-daapd’s master works best with libevent 2.1.4-alpha, but thanks to Espen the development branch now also works with libevent 2.0.x, giving up some performance and a minor feature.
It was a long journey, but Espen’s forked-daapd finally became ready to be used as the new upstream of the Debian package. So please welcome 20.0+git20140530+gc740e6e-1, the first version of forked-daapd to build on all architectures in a very long time, and a prime candidate for being the music library server in Jessie (and wheezy-backports, soon)!
Testing and bug reports are always welcome!
From the package description:

forked-daapd is an iTunes-compatible media server, originally intended as a rewrite of Firefly Media Server (also known as mt-daapd). It supports a wide range of audio formats, can stream video to iTunes, FrontRow and other compatible clients, has support for Apple's Remote iPhone/iPod application and can stream music to AirTunes devices like the AirPort Express. It also features RSP support for Roku's SoundBridge devices. Built-in, on-the-fly decoding support enables serving popular free music formats like FLAC, Ogg Vorbis or Musepack to those clients that do not otherwise support them.
DrupalCon Austin 2014 kicks off today in Austin, Texas, at the Austin Convention Center. More than 3,300 people are expected to attend the event, which offers educational and networking opportunities for the mix of developers, designers, IT managers, agencies and Fortune 500 companies that make up the Drupal community.
Changed in version 3.3.1: bufsize now defaults to -1 to enable buffering by default to match the behavior that most code expects. In versions prior to Python 3.2.4 and 3.3.1 it incorrectly defaulted to 0 which was unbuffered and allowed short reads. This was unintentional and did not match the behavior of Python 2 as most code expected.
So it was "unintentional", it seems, that the previous documentation clearly documented the default to be 0 and that the implementation matched the documentation. And it was "unintentional" that 0 was the only sane value for any non-trivial handling of pipes (without running into deadlocks).
Yay for breaking programs that follow the documentation! Yay for changing such an important setting between 3.2.3 and 3.2.4 and introducing deadlocks into programs.
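To make the deadlock concern concrete, here is a minimal sketch of the kind of interactive pipe handling the rant is about. The `cat` child process and the `ping` payload are illustrative choices, not from the post; the point is that with `bufsize=0` each write reaches the child immediately, so a synchronous write-then-read exchange cannot stall on data stuck in a buffer.

```python
import subprocess

# Interactive exchange with a child process over pipes.
# With a block-buffered pipe (the new bufsize=-1 default), a naive
# write/read loop risks deadlock: the parent blocks reading a reply
# while its own request sits unflushed in a buffer. bufsize=0
# (the old documented default) avoids that for this style of exchange.
proc = subprocess.Popen(
    ["cat"],                      # simply echoes stdin back to stdout
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    bufsize=0,                    # unbuffered: writes reach the child at once
)
proc.stdin.write(b"ping\n")       # no explicit flush needed with bufsize=0
reply = proc.stdout.readline()    # returns as soon as the child echoes
proc.stdin.close()                # send EOF so the child exits
proc.wait()
print(reply)                      # b'ping\n'
```

With the buffered default, the equivalent safe pattern is to use `proc.communicate()`, which feeds all input and drains all output without the parent ever blocking on a half-full buffer.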
A few people have observed dropouts using WebRTC, including with the new Debian community service at rtc.debian.org.
I've been going over some of these problems and made some improvements. If you tried it before and had trouble, please try again (rtc.debian.org is running with some of the fixes already).
In this case, you may see the picture of the other person for a split second and then the websocket link disconnects and JSCommunicator re-initializes itself.
People observing this problem had sometimes found that audio-only calls would work more frequently.
I believe one reason for this problem has been some incorrect handling of the OpenSSL error queue. I posted my observations about this on the resiprocate developer's list and included a fix for it in the recent 1.9.7 release.

Call starts, no remote picture appears, call stops after 20 seconds or so
A BYE message usually includes a reason such as this:
Reason: SIP ;cause=200; text="RTP Timeout"
This particular reason (RTP Timeout) may indicate that the TURN server is faulty, not running at all or is not reachable due to one or both users being behind a firewall.
If you experience this problem but some other reason code appears, please share your observations on the repro-users mailing list so it can be improved.
In the first post of this series, we looked at why Symfony is being included in Drupal 8, and what that means for longtime Drupal developers. Now that we've considered the implications, just what is getting incorporated into Drupal?
Drupal is not using Symfony as a full stack framework. Rather, it is taking advantage of specific components. This post is essentially a list of these components with a link and a brief description for each.
Title: High Performance Drupal
Authors: Jeff Sheltren & Narayan Newton
Title: Drupal for Designers & Drupal Development Tricks for Designers
Author: Dani Nordin
Want to use the most awesome version of Drupal on the most awesome Drupal hosting? ;) This guide is for you!

Let's Get Started!
First, head to http://www.acquia.com/acquia-cloud-free to set up your new site. This process takes a few minutes, but worry not; you can bide your time watching helpful tutorial videos! Once your site is provisioned, you'll be taken to the Acquia Cloud workflow page.
After two weeks working as OPW intern for Mozilla, it's time for a recap!
What exactly have I been doing in these two weeks?
Yes, this is the thing I'm most proud of.
I'm cheating a bit here, as strictly speaking, since the beginning of the internship I've triaged only
But I've decided to count from the beginning of my activity on Bugzilla, at the end of March, since I started working on that as part of the small contribution required to apply to OPW.
Therefore, it's all OPW related :)
Here's the grand total.
Right now, I've decided to work on an average of 5 bugs a day: it's mostly triage and/or verification, which is quite fun.
It consists of trying to get a more complete and detailed bug report for the developers: asking the reporter the right questions, ensuring that the bug is filed against the right product or component, and checking that all the information about platforms and versions is correct.
Or verifying that the bug isn't a duplicate, which involves doing some voodoo with Bugzilla quicksearch (I'm not so good with that yet, mostly because I'm not imaginative enough in the queries... but I'm getting better!)
Sometimes triaging means reading lots of documentation (to be sure that something is a bug and not a feature) and checking meta-bugs and release notes to be able to pinpoint the time when something was introduced and the reasoning behind it.
That takes a lot of time, but it makes you discover some funny things, like the Mighty Bouncing Unicorn.
And while I know it sounds a bit cruel, it's really good when you're verifying a fix and you find it's not totally ok, or that it triggered another bug.
I've been assured that feeling satisfied after that is an essential part of the sadistic QA work.
This started as a personal project even before I knew I had been selected for OPW, and it's now part of my internship: I've been writing a first draft of an FAQ for those who approach Bug Triaging and Verifying work in Mozilla for the first time.
It meant taking a whole lot of IRC logs and scanning them for the most frequently asked questions during bugdays; you can find my first draft here. I'll send an RFC about it to the dev-quality mailing list today and link it from the main Bugdays page.

Lessons learned
So, what have I learned in these two weeks?
That I'm pretty good at figuring things alone, but I like to have feedback on what I'm working on.
That testing things is an art, and perfectionism is a big plus.
That there are such things as stupid questions, but you have to ask them nonetheless.
That people in the Mozilla community are quite friendly and not scary at all. Not even in video! :)
I've been thinking about this a lot, and I think I'd like to have
a guide to all the technical terms in the UI (it took me a while to figure out what exactly the hamburger menu was, or to understand the difference between the awesomebar, the search bar and the new tab search). This is essential when triaging or verifying theme-related bugs, or UI bugs in general.
a big jargon/acronym file: m-c? UX? nightlies? Australis? WFM? STR? m-a? Mozilla people speak another language, especially in bug reports. You get familiar with it after a while, but at first glance it can be quite obscure.
They will probably become my next pet project.
Work on the ROS/Blender robotics control pipeline has culminated in the integration of the system with the tiny robotic Einstein head that had been living at Polytechnic University under the watchful eye of the OpenCog group. Back at Hanson Robotics, tiny Einstein was mercilessly ripped apart and his head attached to an absurdly long robot snake neck apparatus that mimics the configuration of the upcoming “Dmitroid” robot system. Once suitably mutated, tiny Einstein was able to make a few statements and even sing a couple of tunes.
Until recently I was very happy with my console mail client, Lumail, thinking I'd written it in a flexible manner, with lots of Lua-callable primitives.
Now I'm beginning to suspect that I might have gone down the wrong path at some point.
The user interface at startup consists of a list of mailboxes. The intention is that you select a mailbox and open it, which then takes you to a list of messages. (There is a cool and simple-to-use option to open the union of multiple mailboxes, which is something I use daily.)
Now the list of mailboxes is sorted alphabetically, so the user interface looks something like this:
Now the issue that triggered my rethink:
- Can it be possible for Lua to sort the maildir list? So I could arbitrarily have the Maildir .people.katy at the top of the list, always?
Sure, you think, it's just a list of strings. You could pass an array to a Lua on_sort_maildirs function, and then use the returned array/table as the display order. Simple.
Simple until you realize the user might want to do more than operate solely on the list of strings. Perhaps they wish to put folders with unread messages at the top. At which point you need a "count_unread( maildir )" function. (Which does exist.)
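Here is a sketch (in Python, for illustration, since the Lua-side API in question doesn't exist yet) of the kind of sort hook the post describes: a callback that receives the maildir names and returns them in the desired display order. The `count_unread` stub, the pinned `.people.katy` folder and the other names are the post's examples, not real Lumail code.

```python
def count_unread(maildir):
    # Stand-in for Lumail's count_unread() primitive; in the real client
    # this would query the maildir on disk. Values here are made up.
    unread = {".people.katy": 3, ".debian": 0, ".lists.lua": 1}
    return unread.get(maildir, 0)

def on_sort_maildirs(maildirs):
    """Return maildirs in display order: the pinned folder first,
    then folders with unread mail, then the rest alphabetically."""
    pinned = [m for m in maildirs if m == ".people.katy"]
    rest = sorted(
        (m for m in maildirs if m != ".people.katy"),
        key=lambda m: (count_unread(m) == 0, m),  # unread-first, then name
    )
    return pinned + rest

print(on_sort_maildirs([".debian", ".lists.lua", ".people.katy"]))
# → ['.people.katy', '.lists.lua', '.debian']
```

The catch the post identifies is exactly the `count_unread` call: as soon as the hook wants anything beyond the bare strings, the C++ side has to export more of the maildir object's members.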
Anyway, the realisation I had is that the CMaildir object on the C++ side isn't exposed to the Lua side. So the (useful) member functions which should be exported/visible are not.
Really what I'm trying to say is that I think I've implemented and exported useful Lua primitives, but actually many more parts of the implementation could be usefully exported - but are not, and that comes from the choice I made to expose "things" not "objects". If I'd exposed objects right from the start I'd have been in a better place.
I continued to toy with a basic GUI mail-client last week, but I've pretty much written that off as a useful way to spend my time. For the moment I'll leave email alone, I've done enough and despite niggles what I have is absolutely the best mail client for me.
(It is a shame that Mutt is so heavyweight and hard to deal with, and that notmuch never really took off.)
All people ever say is: thank you (a celebration of life) and please (an opportunity to make life more wonderful). Marshall Rosenberg
I have said "I love you" many times in my life, and many times I have failed to say it, because, for me, it is not an easy thing to say.
It is not easy when I have no idea what the other person will make of it: will they be frightened? Will they feel awkward around me afterwards? Will they disappear from my life?
But do I know what I myself mean when I say it?
I have said "I love you" because I thought you somehow expected it of me. "please, consider me worthy of you".
I have said "I love you" to beg for affection. "please, love me back".
I have said "I love you" because I was grateful to you for existing in my life. "thank you".
I now understand why it has not been easy for me to say "I love you" when I was feeling, or imagining, that I had to say it.
I now understand why I have sometimes made myself awkward, as I was begging.
I now understand why, when I said "I love you" out of gratitude, when I said it to celebrate that you exist in my life, that's when I felt no trouble, no fear, and when I felt that my words really were fitting with what I was feeling and what I was wanting to say.
Review: Debt, by David Graeber

Publisher: Melville House
Copyright: 2011, 2012
Printing: October 2012
ISBN: 1-61219-129-0
Format: Trade paperback
Pages: 453
You probably remember this story from introductory economics. Originally, people exchanged things through barter. I needed grain but had a cow. You had a farm but needed milk. I gave you milk for your grain. But this was tedious and awkward, and required finding the right combination of people who needed something the other person had on hand. Then, people invented money: small objects with an agreed-upon value that you could keep and use later when you needed something else. In other words, both a medium of exchange and a store of value. And that improved matters for a long time, although war-torn societies or societies that collapsed (the supposed "Dark Ages" often raises its head) would sometimes "revert to barter" for some period of time until things were stable enough for currency to reappear.
Following that story was usually some story about the emergence of money based on credit: goldsmith shops that started a side business in storing other people's coins and giving them receipts, leading to the receipts trading like the coins since the shops were trustworthy, and from there into the Bank of England and eventually the fractional reserve banking system, fiat money, and all the modern machinery of finance. All of this was dated from the early Enlightenment and the development of modern finance in Europe.
It's a very neat story, and it makes a great deal of intuitive sense. That barter system does feel like what you'd have to do without money, but one can immediately see how obnoxious and limiting it would be. These stories play into our intuitive sense of history: people started with emergent properties of the physical world (different people have different goods and different skills and want to exchange them) and then develop layers of abstraction on top of them, eventually leading to the sophistication of modern societies.
However, David Graeber is not an economist. He's an anthropologist: the profession devoted to understanding how people actually do things rather than how we've reconstructed our history based on our modern perspective. And, as he points out, engagingly and comprehensively, in chapter two of Debt, there is no evidence that this story about economic history has anything whatsoever to do with reality. And quite a bit of evidence that it does not. As best as we've been able to determine, not only is there no society in the world that deals with routine, day-to-day needs like grain and milk through barter, there never has been such a society in human history.
Instead, history and anthropology show that credit is, in a sense, older than currency: the earliest recorded economic transactions were built on a rich system of credit, but in the form of purchases from shopkeepers on credit, or credit clearinghouses through the local government or temple. Those credit records were often denominated in some standardized commodity — so many cattle or bushels of grain — that could create the impression that the economy worked on barter. But the anthropological evidence indicates that this was more an accounting technique than a practical currency. People rarely brought the named commodity to the temple to pay off some debt. Rather, there was an agreed conversion from various other goods to that standardized commodity, and its primary purpose was consistent bookkeeping. Specie — minted coins — appears to come later, and wax and wane throughout history depending on local circumstance. For example, Europe did not "revert to barter" after the collapse of the Roman Empire, but the inhabitants did stop using specie and returned to a system of credit and record-keeping... records that continued to be denominated in Roman currency, even though few people still used the actual coins.
That's one fascinating observation with which Graeber begins this book. The other is the question of why we have such a strong social and moral belief that people must pay their debts. This moral belief is ever-present in discussions about the 2008 financial collapse, and more broadly in discussions about the modern economy, but it's not as obvious of a belief as it might appear on the surface. After all, the banking and investment system is founded on the principle that not everyone will repay their debts, and therefore lenders receive a risk premium based on the likelihood that the debt won't be repaid. But, despite building the possibility of non-repayment into the system, debt forgiveness or intentional default is almost unthinkable and considered a huge moral problem. To this, Graeber brings the perspective of historical anthropology: human societies have struggled with the problems of debt and repayment from the beginning of recorded history, and have attempted a wide variety of solutions to those problems, including massive debt forgiveness. The jubilee described in the Bible was not novel; rather, it reflected a common practice to keep abuses of debt under control in ancient Sumer.
The subtitle of Debt is The First 5,000 Years, and this book is a historical survey. But Graeber puts off the history to first lay an intellectual groundwork for our understanding of debt, and I found that preliminary discussion extremely valuable. Most memorable was the way Graeber divides human economic relationships into communism (in the old sense, not the political sense), exchange, and hierarchy. Capitalism has consumed our economic analysis to such a degree that exchange is the only economic basis that gets much discussion, but the other two are both obvious and pervasive once Graeber points them out. Human civilization could not exist without all three. And Graeber also points out one aspect of exchange-based economics that had not previously occurred to me: it's the economic relationship that one creates with strangers. Debt has the unique characteristic that it can be discharged, at which point the relationship ceases. This has far more complex and far-reaching moral and social implications than one might initially realize, and Graeber did a wonderful job opening my eyes to some of the subtleties.
Debt is clearly a scholarly work, but Graeber's writing is clear and engaging. I found most of this book to be surprisingly easy reading. The hardest going was Graeber's discussion of societies that use a form of currency to arrange relationships between people (marriages, births, and deaths, primarily), but not day-to-day economic transactions. I suspect this area is closer to Graeber's areas of personal research and field work, which resulted in more technical detail. I'm still not sure I completely grasp the principles that Graeber was trying to communicate. But I was struck by the observation of alienation's role in turning a human being into a commodity, and how that links with debt's role as the economic transaction one has with strangers. Graeber covers slavery only glancingly, but makes some memorable points about the use of violence to rip someone out of their social context, and how that is necessary in human cultures before humans can be reduced to a commodity.
There's a lot here, and I've only scratched the surface. I haven't mentioned, for example, the fascinatingly elegant theory that coining money and then requiring taxes be paid in the same money is a simple and highly effective way of funding armies, an explanation for specie that is largely unproven but that I find more compelling than the ones I've previously heard. Approaching debt from an anthropological instead of economic perspective is surprisingly enlightening. Debt is primarily a historical and cultural discussion rather than a set of proposed solutions, but Graeber does effectively show that debt as a moral obligation is not an unquestionable moral stance, but rather has a long history as one side of a two-sided political debate. I also came away from this book more conscious of the social implications and costs of debt-structured interactions, and wanting to push more of the language of debt out of my day-to-day dealings.
Graeber is well known as one of the supporters of Occupy Wall Street, and Debt, while a well-defended academic work, certainly does advocate a position. But the academic analysis is more prominent than the advocacy, and I found his positions well-defended and well-argued. I do need to give the caveat that I don't have the anthropological background to distinguish the statements from Graeber that are well-established common knowledge in anthropology from the ones that are more controversial, and I would be a bit leery of taking this book as the final word on the topic. But it fully deserves its popularity and reputation as a thought-provoking and valuable contribution to the conversation. It's another book that I want to re-read someday to digest further, and the sort of book whose observations keep occurring to me in subsequent discussions or news stories.
If you're at all interested in the way in which we construct the morality around economics and debt, I think this is a book that you should read. It's thoughtful, challenging, and surprising, and it passes my acid test for books of this sort with flying colors: after reading it, you realize that many things are more complicated, more historical, and less novel than you had originally thought. Highly recommended.
Rating: 9 out of 10
Great time, super well organized by this year’s TCamp staff. Really outstanding. Lots of really amazing discussion, and I feel a lot of effort is finally jelling around Open Civic Data, which is an absolute thrill for me.
Can’t wait to see what the next few months bring!
Debian is a big system. At the time of writing, my local package list caches tell me that the unstable suite contains 21306 source packages, and 42867 binary packages on amd64. Among these 42867 binary packages, there is an unthinkable number of inter-package dependencies. For example, the dependency graph of the ruby package contains another 20-something packages.
A new version of any of these packages can potentially break some functionality in the ruby package.
And that dependency graph is very small. Looking at the dependency graph for, say, the rails package will make your eyes bleed. I tried it here, and GraphViz needed a PNG image of 7653×10003 pixels to draw it. It ain’t pretty. Installing rails on a clean Debian system will pull in another 109 packages as part of the dependency chain. Again, as new versions of those packages are uploaded to the archive, there is a probability that a backwards-incompatible change, or even a bug fix for an issue that was being worked around, might make some functionality in rails stop working. Even if that probability is low for each package in the dependency chain, with enough packages the probability of any of them causing problems for rails is quite high.
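A quick back-of-the-envelope calculation makes the point. The 1% per-package figure below is an assumed illustration, not a number from the post; only the 109-package count is:

```python
# Even if each dependency independently has only a 1% chance of
# introducing a breaking change in a given period, the chance that
# at least one of rails' 109 dependencies does is already large.
p = 0.01                      # assumed per-package breakage probability
n = 109                       # packages pulled in by installing rails
p_any = 1 - (1 - p) ** n      # P(at least one dependency breaks)
print(f"{p_any:.0%}")         # → 67%
```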
And still the rails dependency chain is not that big. libreoffice will pull in another 264 packages. gnome will pull in 1311 dependencies, and kde-full 1320 (!).
With a system this big, problems will arrive, and that’s a fact of life. As developers, what we can do is try to spot these problems as early as possible, and fixing them in time to make a solid release with the high quality Debian is known for.
While automated testing is not the proverbial Silver Bullet of Software Engineering, it is an effective way of finding regressions.
Back in 2006, Ian Jackson started the development of autopkgtest as a tool to test Debian packages in their installed form (as opposed to testing packages using their source tree).
In 2011, the autopkgtest test suite format was proposed as a standard for the Debian project, in what we now know as the DEP-8 specification.
Since then, some maintainers such as myself started experimenting with DEP-8 tests in their packages. There was an expectation in the air that someday, someone would run those tests for the entire archive, and that would be a precious source of QA information.
During the holiday break last year, I decided to give it a shot. I initially called the codebase dep8, but later renamed it to debci, since it could potentially also run other types of test suites in the future. Since early January, ci.debian.net has been running an instance of debci for the Debian Project.
Debian Continuous Integration will trigger tests at most 4 times a day, 3 hours after each dinstall run. It updates a local APT cache and looks for packages that declare a DEP-8 test suite. Each package with a test suite will then have its test suite executed, if there was any change in its dependency chain since the last test run. Test results are published at ci.debian.net every hour, and at the end of each batch a “global status” is updated.
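The trigger logic can be sketched roughly as follows. This is an illustration of the idea described above, not debci's actual code; the function name, the version mapping and the package names are all invented for the example:

```python
def needs_test(pkg, dep_versions_now, last_run):
    """Decide whether a package should be retested.

    dep_versions_now: {dependency: version} for the package's dependency
    chain, read from the freshly updated APT cache.
    last_run: the same mapping recorded when the package was last tested.
    A package is retested only if anything in that chain changed.
    """
    return dep_versions_now != last_run.get(pkg, {})

# Versions recorded at the previous batch (made-up data).
last = {"rails": {"ruby": "2.1", "rack": "1.5"}}

print(needs_test("rails", {"ruby": "2.1", "rack": "1.5"}, last))  # → False
print(needs_test("rails", {"ruby": "2.1", "rack": "1.6"}, last))  # → True
print(needs_test("newpkg", {"dep": "1.0"}, last))                 # → True (never tested)
```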
Maintainers can subscribe to a per package Atom feed to keep up with their package test results. People interested in the overall status can subscribe to a global Atom feed of events.
Since the introduction of Debian CI in mid-January 2014, we have seen an amazing increase in the number of packages with test suites. We had a little fewer than 200 packages with test suites back then, against around 350 now (early June 2014). The ratio of packages passing their test suite has also improved a lot, going from less than 50% to more than 75%.
There is documentation available, including a FAQ for package maintainers with further information about how the system works, how to declare test suites in their packages and how to reproduce test runs locally. Also available is development information about debci itself, to those inclined to help improve the system.
This is just the beginning. debci is under a good rate of development, and you can expect to see a constant flux of improvements. In particular, I would like to mention a few people who are making amazing contributions to the project:
- Martin Pitt has been working on improving debci to support parallel and distributed workers. Being the current autopkgtest maintainer, Martin also already got some bug fixes and fixes into autopkgtest motivated by debci use cases.
- Lucas Kanashiro is another GSoC student, who is working on investigating patterns among packages that fail their test suites, so that we can figure out how to fix them, or whether there are classes of failures that are actually caused by problems in the debci infrastructure.
For all you early birds arriving in Austin today, DrupalCon badge pickup and onsite registration opens at 3pm.
Can't make it tonight? Reg opens bright and early Monday morning at 7am and goes until 6:30pm.

Drupal Training
Training begins tomorrow at 9am, and there is still space in some classes. Register online to secure your spot before they sell out!