Mark Brown: We show up

Planet Debian - lun, 13/06/2016 - 11:50

It’s really common for pitches to management within companies about Linux kernel upstreaming to focus on the cost savings vendors get from having their code in the kernel, especially in the embedded space. These benefits are definitely real, especially for vendors trying to address the general market or extend the lifetime of their devices, but they are only part of the story. The other big thing that happens as a result of engaging upstream is that this is a big part of how other upstream developers become aware of what sorts of hardware and use cases are out there.

From this point of view it’s often the things that are most difficult to get upstream that are the most valuable to talk to upstream about. Of course it’s not quite that simple: a track record of engagement on the simpler drivers, and the knowledge and relationships built up in that process, make discussions about harder issues a lot easier. There are engineering and cost benefits that come directly from having code upstream, but it’s not just that; the more straightforward upstreaming is also an investment in making it easier to work with the community to solve the more difficult problems.

Fundamentally Linux is made by and for the people and companies who show up and participate in the upstream community. The more ways people and companies do that the better Linux is likely to meet their needs.

Catégories: Elsewhere

Cheppers blog: TCPDF module ported to Drupal 8

Planet Drupal - lun, 13/06/2016 - 11:36

A few months ago, I decided to port the TCPDF module for Drupal 8. My first thought was that it would be an easy task, but I ran into my first problem early, when I tried to pull the TCPDF library into Drupal. 

Catégories: Elsewhere

Simon Désaulniers: [GSOC] Week 2 - Report

Planet Debian - lun, 13/06/2016 - 06:22

I’ve been reworking the code for the queries I introduced in the first week.

What’s done
  • I have worked on value pagination and optimization of announce operations;
  • Fixed bugs like #72 and #73;
  • I’ve split the Query into Select and Where structures. This change was explained here.
What’s still work in progress
  • Value pagination;
  • Optimizing announce operations;
Catégories: Elsewhere

Iustin Pop: Elsa Bike Trophy 2016—my first bike race!

Planet Debian - lun, 13/06/2016 - 01:09

So today, after two months of intermittent training using Zwift and some actual outside rides, I did my first bike race. Not of 2016, not of 2000+, but like ever.

Which is strange, as I learned to ride a bike very young, and I did like biking. As it turned out, even though I didn't like running as a child, I participated in a number of running events over the years, but never in a biking one.

The event

Elsa Bike Trophy is a mountain bike event—cross-country, not downhill or anything crazy. It takes place in Estavayer-le-Lac and has two courses: one of 60km with 1'791m of altitude gain, and a smaller one of 30km with 845m of altitude gain. I went, of course, for the latter. 845m is more than I ever did in a single ride, so it was good enough for a first try. The web page says that this smaller course "is nervy, technical and leaves little respite" ("… est nerveux, technique et ne laisse que peu de répit"). I chose to think that was a bit of an exaggeration, and that it would be relatively easy (as I'm not too skilled technically).

The atmosphere there was like at the running races, with the exception of bike stuff being sold, and people warming up on very noisy rollers. I'm glad my trainer is many decibels quieter…

The long race started at 12:00, and the shorter one at 12:20. While waiting for the start I had two concerns in mind: whether I would be able to do the whole course (endurance), and whether it would be too cold (the weather kept moving towards rain). I also had a small concern about the state of the course, as the weather hadn't been very nice recently, but only a small one.

And then, after one hour plus of waiting, go!

Racing, with a bit of "swimming"

At first things went as expected. Starting on paved roads, moving towards the small town exit, a couple of 14% climbs, then more flat roads, then a nice and hard 18% short climb (I'll never again complain about < 10%!), then… entering the woods. It quickly became apparent that the ground in the forest was in much worse state than I had feared. Much worse, as in a few orders of magnitude.

About 5 minutes after entering the tree cover, my reasonably clean, reasonably light bike became a muddy, heavy monster. The pace, which until then had been quite OK, dropped to walking pace: the first rider who couldn't keep going up, because his wheel turned out of the track, blocked the one behind him, who had to stop, and so on until we were one line (or two, depending on how wide the trail was) of riders walking their bikes up. Walking your bike up on dry ground is no problem, and hiking through mud with good hiking shoes is also no problem, but walking up mud in biking shoes is a pain. Your foot slides and you waste half of your energy "swimming" in the mud.

Once the climb is over, you get on the bike, and of course the pedals and cleats are full of heavy mud, so it takes a while until you can actually clip in. Here the trail version of SPD was really useful, as I could pedal reasonably well without being clipped in; I just had to be careful not to push too hard.

Then maybe you exit the trail and get on a paved road, but the wheels are so full of mud that you are still very slow (and accelerate very slowly), until they shed enough of the mud to become somewhat more "normal".

After a bit of this "up through mud, flat and shedding mud", I came upon the first real downhill section. I would have been somewhat confident on dry ground, but here I got scared and got off my bike. Better safe than sorry was the motto for now.

And after this it was a repetition of the above: climbs, sometimes (rarely) on the bike, most times pushing it; fast flat sections through muddy terrain where any mistake in controlling the bike could send the front wheel flying, the mud being highly viscous; slow flat sections through very liquid mud where it definitely felt like swimming; and the occasional dry section.

My biggest fear, uphill/endurance, was unfounded. The most gains I made were on the dry uphills, where I had enough stamina to overtake. On flat ground I mostly kept order (i.e. neither being overtaken nor overtaking), but on downhill sections, I lost lots of time, and was overtaken a lot. Still, it was a good run.

And then, after about 20 of the 30 kilometres, I got tired enough of getting off the bike and back on, and also mentally tired and not careful enough, that I stopped getting off the bike on the downhills. And the feeling was awesome! It was actually much, much easier to flow through the mud and rocks and roots downhill, even when it was difficult (for me), like 40cm drops (estimated), than doing it on foot, where you slide without control and the bike can come crashing down on you. It was a liberating feeling, like finally having overcome the mud. I was so glad to have done a one-day training course with Swiss Alpine Adventure, as it really helped. Thanks Dave!

Of course, people were still overtaking me, but I also overtook some people (who were on foot; heh, I wasn't the only one, it seems). And things being easier, I had some more energy, so I was able to push a bit harder on the flats and dry uphill sections.

And then the remaining distance started shrinking, the last downhill was over, I entered the small town through familiar roads, a passer-by cried "one kilometre left", I pushed hard (I mean, as hard as I could after all that effort), and I reached the finish.

Oh, and my other concern, the rain? Yes, it did rain somewhat, and I was glad for it (I keep overheating); there was a single moment I felt cold, when exiting a nice cosy forest into a field where the wind was very strong—headwind, of course.

Lessons learned

I did learn a lot in this first event.

  • indoor training sessions only help with endurance (but they are good at that); they don't help with technique, and most importantly, they don't teach you how to handle the bike in inclement weather; biking to work on paved roads doesn't help either.
  • nevertheless, indoor training does help with endurance ☺
  • mud guards… before the race, I thought they'd help; during the race, I cursed the extra weight and their seeming uselessness; after the race, once I saw how other people looked, I realised that they did indeed help a lot—I was only dirty on my legs, mostly below the knee, but not on my upper body. Unsure whether I will use them again.
  • a dropper post is not needed if your seat is set in-between, but it sure would have been easier with one
  • installing your GPS on your handlebar with elastic bands on a section of non-constant diameter is a very bad idea, as it sits in an unstable equilibrium: any move towards the thinner section makes the mount very loose, and you lose time fixing it.

So, how did I do after all? As soon as I reached the finish and recovered my items, among them my phone, I checked the datasport page: I was ranked 59/68 in my category. Damn, I had hoped (and thought) I would do better. A similar percentage in the overall ranking for this distance.

That aside, it was mighty fun. So much fun I'd do it again tomorrow! I forgot the awesome atmosphere of such events, even in the back of the rankings.

And then, after I drove home and opened the datasport page on my workstation, I got very confused: the overall number of participants was different. And then I realised: not everybody had finished the race when I first checked (d'oh)! Final ranking: 59 out of 84 in my category, and 247/364 in the overall 30km ranking. That makes 70% and 67% respectively, which matches somewhat my usual running results from a few years back (though a bit worse). It is in any case better than what I originally thought, yay!

Also, Strava activity for some more statistics (note that my Garmin says it was not 800+ meters of altitude…):

I'd embed a Veloviewer nice 3D-map but I can't seem to get the embed option, hmm…

TODO: train more endurance, train more technique, train in more various conditions!

Catégories: Elsewhere

Sune Vuorela: Randa day 0

Planet Debian - lun, 13/06/2016 - 00:16

Sitting by Lake Zurich and reflecting on things was a great way to get started. http://manifesta.org/2015/11/pavillon-of-reflections-for-zurich-in-2016/

After spending a bit of time in a train, I climbed part of a mountain together with Adriaan – up to the snow where I could throw a snowball at him. We also designed a couple of new frameworks on our climbing trip. Maybe they will be presented later.

Catégories: Elsewhere

Jeff Geerling's Blog: Hosted Apache Solr — now for Drupal 8!

Planet Drupal - dim, 12/06/2016 - 23:42

After a few months of testing, I'm happy to announce Hosted Apache Solr now supports Search API Solr with Drupal 8! Both Search API and Search API Solr have been getting closer to stable releases, and more people have been requesting Drupal 8 search cores, so I decided to finish testing and updating support guides this weekend.

Catégories: Elsewhere

Mario Lang: A Raspberry Pi Zero in a Handy Tech Active Star 40 Braille Display

Planet Debian - dim, 12/06/2016 - 11:20

TL;DR: I put a $5 Raspberry Pi Zero, a Bluetooth USB dongle, and the required adapter cable into my new Handy Tech Active Star 40 braille display. An internal USB port provides the power. This has transformed my braille display into an ARM-based, monitorless, Linux laptop that has a keyboard and a braille display. It can be charged/powered via USB so it can also be run from a power bank or a solar charger, thus potentially being able to run for days, rather than just hours, without needing a standard wall-jack.


Some Background on Braille Display Form Factors

Braille displays come in various sizes. There are models tailored for desktop use (with 60 cells or more), models tailored for portable use with a laptop (usually with 40 cells), and, nowadays, there are even models tailored for on-the-go use with a smartphone or similar (with something like 14 or 18 cells).

Back in the old days, braille displays were rather massive. A 40-cell braille display was typically about the size of a 13" laptop. In modern times, manufacturers have managed to reduce the size of the internals such that a 40-cell display can be placed in front of a laptop or keyboard instead of placing the laptop on top of the braille display.

While this is a nice achievement, I personally haven't found it to be very convenient because you now have to place two physically separate devices on your lap. It's OK if you have a real desk, but, at least in my opinion, if you try to use your laptop as its name suggests, it's actually inconvenient to use a small form factor, 40-cell display.

For this reason, I've been waiting for a long-promised new model in the Handy Tech Star series. In 2002, they released the Handy Tech Braille Star 40, which is a 40-cell braille display with enough space to put a laptop directly on top of it. To accommodate larger laptop models, they even built in a little platform at the back that can be pulled out to effectively enlarge the top surface. Handy Tech has now released a new model, the Active Star 40, that has essentially the same layout but modernized internals.

You can still pull out the little platform to increase the space that can be used to put something on top.

But, most conveniently, they've designed in an empty compartment, roughly the size of a modern smartphone, beneath the platform. The original idea was to actually put a smartphone inside, but this has turned out (at least to me) to not be very feasible. Fortunately, they thought about the need for electricity and added a Micro USB cable terminating within the newly created, empty compartment.

My first idea was to put a conventional Raspberry Pi inside. When I received the braille display, however, we immediately noticed that a standard-sized rpi is roughly 3mm too high to fit into the empty compartment.

Fortunately, though, a co-worker noticed that the Raspberry Pi Zero was available for order. The Raspberry Pi Zero is a lot thinner, and fits perfectly inside (actually, I think there's enough space for two, or even three, of them). So we ordered one, along with some accessories like a 64GB SDHC card, a Bluetooth dongle, and a Micro USB adapter cable. The hardware arrived a few days later, and was immediately bootstrapped with the assistance of very helpful friends. It works like a charm!

Technical Details

The backside of the Handy Tech Active Star 40 features two USB host ports that can be used to connect devices such as a keyboard. A small form-factor, USB keyboard with a magnetic clip-on is included. When a USB keyboard is connected, and when the display is used via Bluetooth, the braille display firmware additionally offers the Bluetooth HID profile, and key press/release events received via the USB port are passed through to it.

I use the Bluetooth dongle for all my communication needs. Most importantly, BRLTTY is used as a console screen reader. It talks to the braille display via Bluetooth (more precisely, via an RFCOMM channel).

The keyboard connects through to Linux via the Bluetooth HID profile.

Now, all that is left is network connectivity. To keep the energy consumption as low as possible, I decided to go for Bluetooth PAN. It appears that the tethering mode of my mobile phone works (albeit with a quirk), so I can actually access the internet as long as I have cell phone reception. Additionally, I configured a Bluetooth PAN access point on my desktop machines at home and at work, so I can easily (and somewhat more reliably) get IP connectivity for the rpi when I'm near one of these machines. I plan to configure a classic Raspberry Pi as a mobile Bluetooth access point. It would essentially function as a Bluetooth to ethernet adapter, and should allow me to have network connectivity in places where I don't want to use my phone.

BlueZ 5 and PAN

It was a bit challenging to figure out how to actually configure Bluetooth PAN with BlueZ 5. I found the bt-pan python script (see below) to be the only way so far to configure PAN without a GUI.

It handles both ends of a PAN network, configuring a server and a client. Once instructed to do so (via D-Bus) in client mode, BlueZ will create a new network device - bnep0 - once a connection to a server has been established. Typically, DHCP is used to assign IP addresses for these interfaces. In server mode, BlueZ needs to know the name of a bridge device to which it can add a slave device for each incoming client connection. Configuring an address for the bridge device, as well as running a DHCP server + IP Masquerading on the bridge, is usually all you need to do.

A Bluetooth PAN Access Point with Systemd

I'm using systemd-networkd to configure the bridge device.


[NetDev]
Name=pan
Kind=bridge
ForwardDelaySec=0


[Match]
Name=pan

[Network]
Address=
DHCPServer=yes
IPMasquerade=yes

Now, BlueZ needs to be told to configure a NAP profile. To my surprise, there seems to be no way to do this with stock BlueZ 5.36 utilities. Please correct me if I'm wrong.

Luckily, I found a very nice blog post, as well as an accommodating Python script that performs the required D-Bus calls.

For convenience, I use a Systemd service to invoke the script and to ensure that its dependencies are met.


[Unit]
Description=Bluetooth Personal Area Network
After=bluetooth.service systemd-networkd.service
Requires=systemd-networkd.service
PartOf=bluetooth.service

[Service]
Type=notify
ExecStart=/usr/local/sbin/pan

[Install]
WantedBy=bluetooth.target


#!/bin/sh
# Ugly hack to work around #787480
iptables -F
iptables -t nat -F
iptables -t mangle -F
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
exec /usr/local/sbin/bt-pan --systemd --debug server pan

This last file wouldn't be necessary if IPMasquerade= were supported in Debian right now (see #787480).

After the obligatory systemctl daemon-reload and systemctl restart systemd-networkd, you can start your Bluetooth Personal Area Network with systemctl start pan.

Bluetooth PAN Client with Systemd

Configuring the client is also quite easy to do with Systemd.


[Match]
Name=bnep*

[Network]
DHCP=yes


[Unit]
Description=Bluetooth Personal Area Network client

[Service]
Type=notify
ExecStart=/usr/local/sbin/bt-pan --debug --systemd client %I --wait

Now, after the usual configuration reloading, you should be able to connect to a specific Bluetooth access point with:

systemctl start pan@00:11:22:33:44:55

Pairing via the Command Line

Of course, the server and client-side service configuration require a pre-existing pairing between the server and each of its clients.

On the server, start bluetoothctl and issue the following commands:

power on
agent on
default-agent
scan on
scan off
pair XX:XX:XX:XX:XX:XX
trust XX:XX:XX:XX:XX:XX

Once you've set scan mode to on, wait a few seconds until you see the device you're looking for scroll by. Note its device address, and use it for the pair and (optional) trust commands.

On the client, the sequence is essentially the same except that you don't need to issue the trust command. The server needs to trust a client in order to accept NAP profile connections from it without waiting for manual confirmation by the user.

I'm actually not sure if this is the optimal sequence of commands. It might be enough to just pair the client with the server and issue the trust command on the server, but I haven't tried this yet.

Enabling Use of the Bluetooth HID Profile

Essentially the same as above also needs to be done in order to use the Bluetooth HID profile of the Active Star 40 on Linux. However, instead of agent on, you need to issue the command agent KeyboardOnly. This explicitly tells bluetoothctl that you're specifically looking for a HID profile.

Configuring Bluetooth via the Command Line Feels Vague

While I'm very happy that I actually managed to set all of this up, I must admit that the command-line interface to BlueZ feels a bit incomplete and confusing. I initially thought that agents were only for PIN code entry. Now that I've discovered that "agent KeyboardOnly" is used to enable the HID profile, I'm not sure anymore. I'm surprised that I needed to grab a script from a random git repository in order to be able to set up PAN. I remember, with earlier version of BlueZ, that there was a tool called pand that you could use to do all of this from the command-line. I don't seem to see anything like that for BlueZ 5 anymore. Maybe I'm missing something obvious?


The data rate is roughly 120kB/s, which I consider acceptable for such a low power solution. The 1GHz ARM CPU actually feels sufficiently fast for a console/text-mode person like me. I'll rarely be using much more than ssh and emacs on it anyway.

Console fonts and screen dimensions

The default dimensions of the framebuffer on the Raspberry Pi Zero are a bit unexpected. fbset reports that the screen dimensions are 656x416 pixels (with, of course, no monitor connected). With a typical 8x16 console font, I got 82 columns and 26 lines.
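The column and line counts follow directly from dividing the framebuffer by the font cell size; a quick illustration of the arithmetic:

```python
# Framebuffer size reported by fbset on the Pi Zero, with a typical
# 8x16 console font: columns = width / font width, lines = height / font height.
width_px, height_px = 656, 416
font_w, font_h = 8, 16

cols = width_px // font_w    # 656 / 8  = 82 columns
rows = height_px // font_h   # 416 / 16 = 26 lines
print(cols, rows)  # 82 26
```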

With a 40-cell braille display, the 82 columns are very inconvenient. Additionally, as a braille user, I would like to be able to view Unicode braille characters in addition to the normal charset on the console. Fortunately, Linux supports 512 glyphs, while most console fonts only provide 256. console-setup can load and combine two 256-glyph fonts at once. So I added the following to /etc/default/console-setup to make the text console a lot more friendly to braille users:

SCREEN_WIDTH=80
SCREEN_HEIGHT=25
FONT="Lat15-Terminus16.psf.gz brl-16x8.psf"


You need console-braille installed for brl-16x8.psf to be available.

Further Projects

There's a 3.5mm audio jack inside the braille display as well. Unfortunately, there are no converters from Mini-HDMI to 3.5mm audio that I know of. It would be very nice to be able to use the sound card that is already built into the Raspberry Pi Zero, but, unfortunately, this doesn't seem possible at the moment. Alternatively, I'm looking at using a Micro USB OTG hub and an additional USB audio adapter to get sound from the Raspberry Pi Zero to the braille display's speakers. Unfortunately, the two USB audio adapters I've tried so far have run hot for some unknown reason. So I have to find some other chipset to see if the problem goes away.

A little nuisance, currently, is that you need to manually power off the Raspberry, wait a few seconds, and then power down the braille display. Turning the braille display off cuts power delivery via the internal USB port. If this is accidentally done too soon then the Raspberry Pi Zero is shut down ungracefully (which is probably not the best way to do it). We're looking into connecting a small, buffering battery to the GPIO pins of the rpi, and into notifying the rpi when external power has dropped. A graceful, software-initiated shutdown can then be performed. You can think of it as being like a mini UPS for Micro USB.

The image

If you are a happy owner of a Handy Tech Active Star 40 and would like to do something similar, I am happy to share my current (Raspbian Stretch based) image. In fact, if there is enough interest by other blind users, we might even consider putting a kit together that makes it as easy as possible for you to get started. Let me know if this could be of interest to you.


Thanks to Dave Mielke for reviewing the text of this posting.

Thanks to Simon Kainz for making the photos for this article.

And I owe a big thank you to my co-workers at Graz University of Technology who have helped me a lot to bootstrap really quickly into the rpi world.


My first tweet about this topic was just five days ago, and apart from the sound card not working yet, I feel like the project is already almost complete! By the way, I am editing the final version of this blog posting from my newly created monitorless ARM-based Linux laptop, via an ssh connection to my home machine.

Catégories: Elsewhere

Francois Marier: Cleaning up obsolete config files on Debian and Ubuntu

Planet Debian - sam, 11/06/2016 - 23:40

As part of regular operating system hygiene, I run a cron job which updates package metadata and looks for obsolete packages and configuration files.

While there is already some easily available information on how to purge unneeded or obsolete packages and how to clean up config files properly in maintainer scripts, the guidance on how to delete obsolete config files is not easy to find and somewhat incomplete.
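The filtering itself is trivial to script as well; here is a minimal Python sketch (illustrative only, not what my actual cron job uses) that picks the obsolete paths out of the dpkg-query output shown below:

```python
# Illustrative sketch: extract obsolete conffile paths from the output of
#   dpkg-query -W -f='${Conffiles}\n'
# Each conffile line looks like " /path md5sum [obsolete]".
def obsolete_conffiles(dpkg_output):
    paths = []
    for line in dpkg_output.splitlines():
        fields = line.split()
        # Only lines with the trailing "obsolete" marker are of interest.
        if len(fields) == 3 and fields[2] == "obsolete":
            paths.append(fields[0])
    return paths

sample = """\
 /etc/apparmor.d/abstractions/evince ae2a1e8cf5a7577239e89435a6ceb469 obsolete
 /etc/bash_completion.d/insserv 32975fe14795d6fce1408d5fd22747fd
"""
print(obsolete_conffiles(sample))  # ['/etc/apparmor.d/abstractions/evince']
```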

These are the obsolete conffiles I started with:

$ dpkg-query -W -f='${Conffiles}\n' | grep 'obsolete$'
 /etc/apparmor.d/abstractions/evince ae2a1e8cf5a7577239e89435a6ceb469 obsolete
 /etc/apparmor.d/tunables/ntpd 5519e4c01535818cb26f2ef9e527f191 obsolete
 /etc/apparmor.d/usr.bin.evince 08a12a7e468e1a70a86555e0070a7167 obsolete
 /etc/apparmor.d/usr.sbin.ntpd a00aa055d1a5feff414bacc89b8c9f6e obsolete
 /etc/bash_completion.d/initramfs-tools 7eeb7184772f3658e7cf446945c096b1 obsolete
 /etc/bash_completion.d/insserv 32975fe14795d6fce1408d5fd22747fd obsolete
 /etc/dbus-1/system.d/com.redhat.NewPrinterNotification.conf 8df3896101328880517f530c11fff877 obsolete
 /etc/dbus-1/system.d/com.redhat.PrinterDriversInstaller.conf d81013f5bfeece9858706aed938e16bb obsolete

To get rid of the /etc/bash_completion.d/ files, I first determined what packages they were registered to:

$ dpkg -S /etc/bash_completion.d/initramfs-tools
initramfs-tools: /etc/bash_completion.d/initramfs-tools
$ dpkg -S /etc/bash_completion.d/insserv
insserv: /etc/bash_completion.d/insserv

and then followed Paul Wise's instructions:

$ rm /etc/bash_completion.d/initramfs-tools /etc/bash_completion.d/insserv
$ apt install --reinstall initramfs-tools insserv

For some reason that didn't work for the /etc/dbus-1/system.d/ files and I had to purge and reinstall the relevant package:

$ dpkg -S /etc/dbus-1/system.d/com.redhat.NewPrinterNotification.conf
system-config-printer-common: /etc/dbus-1/system.d/com.redhat.NewPrinterNotification.conf
$ dpkg -S /etc/dbus-1/system.d/com.redhat.PrinterDriversInstaller.conf
system-config-printer-common: /etc/dbus-1/system.d/com.redhat.PrinterDriversInstaller.conf
$ apt purge system-config-printer-common
$ apt install system-config-printer

The files in /etc/apparmor.d/ were even more complicated to deal with because purging the packages that they come from didn't help:

$ dpkg -S /etc/apparmor.d/abstractions/evince
evince: /etc/apparmor.d/abstractions/evince
$ apt purge evince
$ dpkg-query -W -f='${Conffiles}\n' | grep 'obsolete$'
 /etc/apparmor.d/abstractions/evince ae2a1e8cf5a7577239e89435a6ceb469 obsolete
 /etc/apparmor.d/usr.bin.evince 08a12a7e468e1a70a86555e0070a7167 obsolete

I was however able to get rid of them by also purging the apparmor profile packages that are installed on my machine:

$ apt purge apparmor-profiles apparmor-profiles-extra evince ntp
$ apt install apparmor-profiles apparmor-profiles-extra evince ntp

Not sure why I had to do this but I suspect that these files used to be shipped by one of the apparmor packages and then eventually migrated to the evince and ntp packages directly and dpkg got confused.

If you're in a similar circumstance, you want want to search for the file you're trying to get rid of on Google and then you might end up on http://apt-browse.org/ which could lead you to the old package that used to own this file.

Catégories: Elsewhere

Simon Désaulniers: [GSOC] Week 1 - Report

Planet Debian - sam, 11/06/2016 - 19:06

I have been working on writing serializable structure for remote filtering of values on the distributed hash table OpenDHT. This structure is called Query.

What’s done

The implementation of the base design, along with other changes, has been made. You can see the evolution on the matter here.

The changes allow you to create a Query with a SQL-ish statement like the following:

Query q("SELECT * WHERE id=5");

You can then use this query like so:

get(hash, getcb, donecb, filter, "SELECT * WHERE id=5");
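OpenDHT itself is C++, but the statement grammar is simple enough to illustrate; here is a toy Python sketch (names and structure are mine, not the library's actual implementation) of splitting such a statement into the field-selection part and the filter conditions:

```python
# Toy illustration (not OpenDHT's real code) of parsing a SQL-ish
# statement like "SELECT * WHERE id=5" into a selection list and a
# dict of filter conditions.
def parse_query(statement):
    tokens = statement.split()
    assert tokens and tokens[0].upper() == "SELECT", "must start with SELECT"
    upper = [t.upper() for t in tokens]
    # Everything between SELECT and WHERE is the selection; the rest are filters.
    where_idx = upper.index("WHERE") if "WHERE" in upper else len(tokens)
    select = tokens[1:where_idx]              # e.g. ["*"]
    where = {}
    for cond in tokens[where_idx + 1:]:       # conditions like "id=5"
        key, _, value = cond.partition("=")
        where[key] = value
    return select, where

print(parse_query("SELECT * WHERE id=5"))  # (['*'], {'id': '5'})
```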

I verified the working state of the code with the dhtnode tool. I have also done some tests using our Python benchmark scripts.

What’s next
  • Value pagination;
  • Optimization of put operations with query for ids before put, hence avoiding potential useless traffic.

The Query is the key part for optimizing my earlier work on data persistence on the DHT. It will enhance the DHT in more than one respect. I have to point out that it would not have been possible to do this before the major refactoring we introduced in 0.6.0.

Catégories: Elsewhere

Shirish Agarwal: The road to debconf 2016, tourism and arts.

Planet Debian - sam, 11/06/2016 - 08:55

A longish blog post, please bear with me; a second part will be published a few days from now. My fixed visa finally arrived, yeah! But this story doesn’t start here, it starts about a year back. While I have been contributing to Debian in my free time over the years, and sometimes on paid time as well, I had never thought of going overseas: from the experiences of friends and relatives, I knew it isn’t easy to get all the permissions and paperwork done, to say the least (bureaucracy at work). But last year, when DebConf 15 was being launched, two or three friends of mine, who are doing their Ph.D.s in some computer/web field and currently living in Germany, goaded me to apply. The first few times I gave some standard excuses, but when they kept at it for a while, just to shut them up I applied to the DebConf team for food, accommodation and travel sponsorship.

I didn’t have high hopes, as there obviously are many more talented peers around me who understand FOSS and Debian at a much more fundamental, philosophical as well as technical level than I do. Much to my surprise though, about a month later (and around two or three weeks before the event was to take place) I got the bursary/sponsorship for food, accommodation as well as travel. I was unsure that the remaining time was enough to get a visa, hence I declined that time around.

That whole episode gave me the confidence that perhaps my application would be accepted if I applied again. Using my previous year’s understanding, I decided to give it a shot again, as this would also let me get a feel for visa bureaucracy and gain a bit of novice understanding of what factors go into choosing a flight; believe me, the latter part proved to be pretty confusing. While the visa business seems easy (the form at least is easy, and so is what they ask), it can be troublesome as far as the visa-processing process is concerned. It took the better part of a month to get the visa I needed. The in-between time is, and can be, a bit stressful, as you have committed funds for travel, i.e. the airline tickets, and are in limbo: you don’t know whether, if the visa is refused for some reason, your tickets will be refunded or not. A little history is needed, and hopefully is helpful, for anybody who’s applying for a short-stay visa to South Africa.

Exactly a month and a day ago, I applied for the visa. I had applied for a 17-day visa, as the flights within my budget were for those days only. The visa I got was for 10 days only, ending in the middle of the conference. I tried many avenues to get information and was told that I needed to write a correction letter sharing the correct time period and dates in BOLD, in a heavier weight/point, which is what I did and gave to the VFS office without any further payment on my part. It took a bit more time than the first time around, but the consulate co-operated. I can understand the oversight, as they probably get many visa requests along with special and urgent ones, so such occurrences can happen. I am and was happy that there was a recourse, rather than having to start from scratch, which probably would have made me more anxious after the first experience.

Apparently I was lucky that I had booked with Qatar Airways, as I later came to know that some other airlines don’t refund the money even after a visa rejection. I learned this pretty late in the game; otherwise I would have been much more stressed. Having committed, I knew I had to go the whole hog and overcome whatever barriers there were, at least as far as the visa part of the process is concerned.

Now, after a few days, I will probably start to worry about the actual travel: part nervousness, part excitement. Nervousness because it will be an alien land, and as I am obese, travelling economy on the 787-8 and 777-300 ER will be tricky. The 787-8 will probably be a rough ride, as it’s a 9 hr. journey and the seats are a mere 18″ wide, the cattle class as shared by one of our esteemed politicians. That blame lies with the Boeing 787 rather than anybody else. Hopefully, if there is a next time, I will make better choices.

Anyway, I have selected an aisle seat so that I will be able to walk every hour or so to keep the circulation in my legs going, as legroom in economy is not much, going by my domestic air travel experiences. If I do survive the travel, I will try to get some free time to explore South Africa and figure out how they are able to attract one and a half times the tourists that we get in a whole year, even though we are bigger (area-wise) than South Africa. I did find something positive for us as well; it’s not all doldrums all around.

I have also been thinking about whether I know any South African music and movies. I explored the djembe while growing up in my teens, but other than that, not much; the only music I have heard is Harry Belafonte. So I hope to bring some Indian movies and music with me, so that people who have not explored Bollywood or Indian classical music can sample some of it, though of course it will be a pale imitation of what the ‘Sawai Gandharva Music Festival’, for instance, offers. I also hope to hear and bring back some music and movies to learn more about South Africa.

Filed under: Miscellenous Tagged: #bollywood, #Debconf Germany 2015, #Debconf South Africa 2016, #Debconf15, #Debconf16, #djembe, #South-African consulate, music, tourism, visa
Catégories: Elsewhere

Hideki Yamane: Which compression do Debian packages use?

Planet Debian - sam, 11/06/2016 - 05:50
gzip: 4576
bzip2: 54
xz: 46250
none: 9
90% of packages use xz. Packages that still use bzip2 should migrate to xz.
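These counts can be reproduced in spirit: a .deb is an ar archive whose payload member is data.tar plus a compression extension (`ar t pkg.deb` lists the members). A minimal sketch; the tally function and the sample member names are illustrative, not the script that produced the numbers above:

```python
# Sketch: tally the payload compression used by a set of .deb files,
# given the names of their data.tar members. A .deb's payload is
# data.tar.{gz,bz2,xz}, or plain data.tar for "none".
import collections

def tally_compression(members):
    """Count payload compression types from data.tar member names."""
    counts = collections.Counter()
    for name in members:
        if name.startswith("data.tar"):
            ext = name.rsplit(".", 1)[-1]
            counts["none" if ext == "tar" else ext] += 1
    return counts

# Illustrative member names, not a real archive scan:
sample = ["data.tar.xz", "data.tar.gz", "data.tar.xz", "data.tar"]
print(dict(tally_compression(sample)))  # {'xz': 2, 'gz': 1, 'none': 1}
```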
Catégories: Elsewhere

Paul Tagliamonte: It's all relative

Planet Debian - sam, 11/06/2016 - 05:45

As nearly anyone who's worked with me will attest to, I've long since touted nedbat's talk Pragmatic Unicode, or, How do I stop the pain? as one of the most foundational talks, and required watching for all programmers.

The reason is that nedbat hits on something bigger -- something more fundamental than how to handle Unicode: it's how to handle data which is relative.

For those who want the TL;DR, the argument is as follows:

Facts of Life:

  1. Computers work with Bytes. Bytes go in, Bytes go out.
  2. The world needs more than 256 symbols.
  3. You need both Bytes and Unicode.
  4. You cannot infer the encoding of bytes.
  5. Declared encodings can be Wrong.

Now, to fix it, the following protips:

  1. Unicode sandwich
  2. Know what you have
  3. TEST
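In Python, the sandwich looks roughly like this (a minimal sketch; the UTF-8 encoding here is an assumption you would normally get from the protocol or file metadata):

```python
# Unicode sandwich: decode bytes at the edges, work on str inside,
# and encode back to bytes only on the way out.
raw = "héllo wörld".encode("utf-8")   # bytes arriving from outside

text = raw.decode("utf-8")            # bottom slice: bytes -> str, early
shouted = text.upper()                # filling: pure str operations
out = shouted.encode("utf-8")         # top slice: str -> bytes, late

print(shouted)                        # prints: HÉLLO WÖRLD
```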
Relative Data

I've started to think more about why we do the things we do when we write code, and one thing that continues to be a source of morbid schadenfreude is watching code break by failing to handle Unicode right. It's hard! However, watching what breaks lets you gain a bit of insight into how the author thinks, and what assumptions they make.

When you send someone Unicode, there are a lot of assumptions that have to be made. Your computer has to trust what you (yes, you!) entered into your web browser, your web browser has to pass that on over the network (most of the time without encoding information) to a server which reads that bytestream and makes a wild guess at what it should be. That server might save it to a database, and later interpolate it into an HTML template in a different encoding, producing garbled text (known as Mojibake) and a bad time for everyone involved.

Everything's awful, and the fact our computers can continue to display text to us is a goddamn miracle. Never forget that.

When it comes down to it, when I see a byte sitting on a page, I don't know (and can't know!) if it's Windows-1252, UTF-8, Latin-1, or EBCDIC. What's a poem to me is terminal garbage to you.
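The point is easy to demonstrate: the same byte sequence reads as different text under different assumed encodings (a small sketch):

```python
# The same bytes, decoded under two different assumed encodings.
data = "café".encode("utf-8")         # b'caf\xc3\xa9'

print(data.decode("utf-8"))           # café   (what the sender meant)
print(data.decode("latin-1"))         # cafÃ©  (classic mojibake)
```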

Over the years, hacks have evolved. We have magic numbers, and plain ole' hacks to just guess based on the content. Of course, like all good computer programs, this has lead to its fair share of hilarious bugs, and there's nothing stopping files from (validly!) being multiple things at the same time.

Like many things, it's all in the eye of the beholder.

Timezones

Just like Unicode, this is a word that can put your friendly neighborhood programmer into a series of profanity laden tirades. Go find one in the wild, and ask them about what they think about timezone handling bugs they've seen. I'll wait. Go ahead.

Rants are funny things. They're fun to watch. Hilarious to give. Sometimes just getting it all out can help. They can tell you a lot about the true nature of problems.

It's funny to consider the isomorphic nature of Unicode rants and Timezone rants.

I don't think this is an accident.

U̶n̶i̶c̶o̶d̶e̶ timezone Sandwich

Ned's Unicode Sandwich applies -- As early as we can, in the lowest level we can (reading from the database, filesystem, wherever!), all datetimes must be timezone qualified with their correct timezone. Always. If you mean UTC, say it's in UTC.

Treat any unqualified datetimes as "bytes". They're not to be trusted. Never, never, never trust 'em. Don't process any datetimes until you're sure they're in the right timezone.

This lets the delicious inside of your datetime sandwich handle timezones with grace, and finally, as late as you can, turn it back into bytes (if at all!). Treat locations as tzdb entries, and qualify datetime objects into their absolute timezone (EST, EDT, PST, PDT).

It's not until you want to show the datetime to the user again should you consider how to re-encode your datetime to bytes. You should think about what flavor of bytes, what encoding -- what timezone -- should I be encoding into?
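A sketch of that sandwich with the standard-library zoneinfo module (Python 3.9+); the `qualify` helper and the New York timestamp are illustrative assumptions:

```python
# Timezone sandwich: qualify naive datetimes as early as possible,
# work only with aware datetimes inside, format at the edge.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def qualify(dt, tz_name):
    """Refuse to guess: a naive datetime must be explicitly qualified."""
    if dt.tzinfo is not None:
        raise ValueError("datetime is already qualified")
    return dt.replace(tzinfo=ZoneInfo(tz_name))

# "Bytes" coming in: a naive timestamp we know (out of band) is NY time.
naive = datetime(2016, 6, 13, 12, 0)
aware = qualify(naive, "America/New_York")     # early: now trustworthy

as_utc = aware.astimezone(timezone.utc)        # inside: convert freely
print(as_utc.isoformat())                      # 2016-06-13T16:00:00+00:00
```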

Testing

Just like Unicode, testing that your code works with datetimes is important. Every time I think about how to go about doing this, I think about that one time that mjg59 couldn't book a flight starting Tuesday from AKL, landing in HNL on Monday night, because United couldn't book the last leg to SFO. Do you ever assume dates only go forward as time goes on? Remember timezones.

Construct test data, make sure someone in New Zealand's +13:45 can correctly talk with their friends in Baker Island's -12:00, and that the events sort right.

Just because it's Noon on New Years Eve in England doesn't mean it's not 1 AM the next year in New Zealand. Places a few miles apart may switch to Daylight Savings on different days. Indian Standard Time is not even aligned on the hour to GMT (+05:30)!

Test early, and test often. Memorize a few timezones, and challenge your assumptions when writing code that has to do with time. Don't use wall clocks to mean monotonic time. Remember there's a whole world out there, and we only deal with part of it.
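A concrete test along those lines, using tzdb entries (Pacific/Chatham is the +13:45 zone mentioned above; the event times are made up):

```python
# Aware datetimes sort correctly across zones: midnight on New Year's
# in the Chatham Islands (+13:45 in southern summer) comes *before*
# noon UTC on the previous calendar day.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

chatham_newyear = datetime(2016, 1, 1, 0, 0,
                           tzinfo=ZoneInfo("Pacific/Chatham"))
utc_noon = datetime(2015, 12, 31, 12, 0, tzinfo=timezone.utc)

events = sorted([utc_noon, chatham_newyear])
print(events[0] is chatham_newyear)   # the "later" calendar date sorts first
```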

It's also worth remembering, as Andrew Pendleton pointed out to me, that it's possible that a datetime isn't even unique for a place, since you can never know if 2016-11-06 01:00:00 in America/New_York (in the tzdb) is the first one, or second one. Storing EST or EDT along with your datetime may help, though!
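With zoneinfo, that ambiguity is visible through the `fold` attribute (PEP 495): the same wall-clock time maps to two different instants.

```python
# 2016-11-06 01:00 happens twice in America/New_York: once in EDT,
# and again an hour later in EST after the clocks fall back.
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")
first = datetime(2016, 11, 6, 1, 0, tzinfo=ny)           # fold=0 -> EDT
second = datetime(2016, 11, 6, 1, 0, fold=1, tzinfo=ny)  # fold=1 -> EST

print(first.tzname(), second.tzname())                   # EDT EST
delta = second.astimezone(timezone.utc) - first.astimezone(timezone.utc)
print(delta)                                             # 1:00:00
```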


Improper handling of timezones can lead to some interesting things, and failing to be explicit (or at least, very rigid) in what you expect will lead to an unholy class of bugs we've all come to hate. At best, you have confused users doing math, at worst, someone misses a critical event, or our security code fails.

I recently found what I regard to be a pretty bad bug in apt (for which David has prepared a fix that is pending upload, yay! Thank you!), which boiled down to documentation and code expecting datetimes in one timezone but accepting any timezone, and silently treating it as UTC.

The solution is to hard-fail, which is an interesting choice to me (as a vocal fan of timezone aware code), but at the least it won't fail by misunderstanding what the server is trying to communicate, and I do understand and empathize with the situation the apt maintainers are in.

Final Thoughts

Overall, my main point is although most modern developers know how to deal with Unicode pain, I think there is a more general lesson to learn -- namely, you should always know what data you have, and always remember what it is. Understand assumptions as early as you can, and always store them with the data.

Catégories: Elsewhere

Drupal Console: Help us complete the Drupal Console stable release

Planet Drupal - sam, 11/06/2016 - 00:52
If you are reading this you may already have an idea of what the Drupal Console is, but in case you do not, here is a brief explanation: Drupal Console is “The new CLI for Drupal”, a tool to generate boilerplate code and to interact with and debug Drupal.

Why are we asking for help? The Drupal Console, like many other open source projects, is created and maintained with the effort and free time of contributors and maintainers. That is great, but sometimes the community needs those tools sooner than the contributors can deliver in their own time alone. Examples of this are the Drupal 8 Accelerate program, the d8rules funding program, and the D8 Module Acceleration Program, which aim to bring those products to the community as soon as possible by raising money to sponsor contributors’ office hours.

Our situation is not so different. Even though the core maintainers have some hours per week sponsored by our employers or businesses, this time is not enough, and we need to use our personal time to continue the development. We need to assign more time to work through the issue queue, currently with 200+ pending issues and feature requests, to provide support on the Gitter channel, to do a much better job with the documentation, and to improve the test coverage. Therefore, we need some financial assistance to deliver our first stable release as soon as possible. For more information and to learn how to help us, please read the article in full.
Catégories: Elsewhere

Guido Günther: Debian Fun in May 2016

Planet Debian - ven, 10/06/2016 - 19:38
Debian LTS

May marked the thirteenth month I contributed to Debian LTS under the Freexian umbrella. I spent the 17.25 hours working on these LTS things:

  • Fixed CVE-2014-7210 in pdns resulting in DLA-492-1
  • Fixed the build failure of Icedove on armhf resulting in DLA 472-2
  • Forward ported our nss, nspr enhancements to the current versions in testing to continue the discussion on having the same nss and nspr versions in all suites, including some ABI compliance research (thanks abi-compliance-tester!), resulting in 824872.
  • Backported Icedove 45 and Enigmail to wheezy to check if we can continue to support it - we can, with minor tweaks. Upload will happen in June.
  • While at that, added some autopkgtests for Icedove 45, resulting in 809723 (already applied).
  • Released DLA-498-1 for ruby-active-model-3.2 to address CVE-2016-0753.
  • Reviewed the Updates of ruby-active-record-3.2 for CVE-2015-7577 and eglibc.
Other Debian stuff
  • Uploaded libvirt 1.3.4 to sid, 1.3.5~rc1 to experimental
  • Uploaded libosinfo 0.3.0 to sid
  • Uploaded git-buildpackage 0.7.4 to sid including experimental multiple tarball support for gbp buildpackage
Catégories: Elsewhere

Acquia Developer Center Blog: Multisite Governance, Site Delivery, and Other Issues Related to Managing Many Sites: Part 3

Planet Drupal - ven, 10/06/2016 - 18:34

This is Part 3 of an interview with Will Eisner, Senior Director, Product at Acquia. Will’s primary focus is on Acquia Cloud Site Factory, which helps organizations create and manage many sites, from a dozen to thousands.

Also sitting in on the interview, via conference line, was Sonya Kovacic, a Junior Product Manager at Acquia who also works on Site Factory.

Tags: acquia drupal planet
Catégories: Elsewhere

Acquia Developer Center Blog: Agile Training for the Government Product Owner

Planet Drupal - ven, 10/06/2016 - 18:06

TLDR: A new, free agile training course for government product owners has been released on the AGL Academy. Sign up now to participate in the introductory webinar scheduled for June 16, 2016 at 1PM ET (or view the webinar recording after that date).

Tags: acquia drupal planet
Catégories: Elsewhere

DrupalCon News: Drupal 8 in the Wild

Planet Drupal - ven, 10/06/2016 - 17:46

Intrepid developers of the Drupal community!

This year saw the bravest of explorers venture out into the harsh and unforgiving landscapes of the World (Wide Web).

Wearing only t-shirts from past DrupalCons, they put all of their trust in the hard work of their friends and colleagues, as they set out on a mission: to use Drupal 8 on real projects!

Catégories: Elsewhere

ImageX Media: How Can I Make It then How Can I Break It?

Planet Drupal - ven, 10/06/2016 - 17:42

This is the second in a series of posts recapping ImageX’s presentations at this year’s DrupalCon.

With so many testing methods available -- code static analysis checks, unit testing, functional testing, front-end performance testing, load testing, visual regression testing, etc. --  it can be difficult for a development team to choose which will work best for their project, particularly with limited time and budget available.

Catégories: Elsewhere

ImageX Media: DrupalCon 2016

Planet Drupal - ven, 10/06/2016 - 17:27

DrupalCon brings together thousands of people throughout the Drupal community who use, design for, develop for, and support the platform. It’s the heartbeat of the Drupal community, where advancements in the platform happen, learnings from its users are shared, and where connections are made that strengthen the community. 

Catégories: Elsewhere

ImageX Media: Higher Education Notes and Trends

Planet Drupal - ven, 10/06/2016 - 17:17


As a web agency that specializes in higher education, ImageX keeps its figurative finger on the pulse of the sector. Some weeks are busier than others for new data and studies being released, and this week definitely falls into the busy category. Let’s take a look at the week that was in higher education!

The Bill and Melinda Gates Foundation released a compelling student demographic breakdown of what the higher education landscape looks like in America:

Catégories: Elsewhere

