For those who are starving for updates while the big gcc5 transition brings the rest of Debian/sid to a halt, here is some fresh meat: a new TeX Live checkout. Nothing spectacularly new here, just the usual big bunch of updates and several new packages. Perhaps worth mentioning: luasseq has been reincorporated into the TeX Live packages. Thanks to the maintainer for his work till now!
From the long list of changes let me pick one update and one new package: pgf has been updated to 3.0.1, which is just a small change in version number but incorporates 1.5 years of fixes, and the list of fixes is long. A very interesting newcomer is pdfpagediff by C. V. Radhakrishnan. It allows visual comparison of two PDF files by overlaying them transparently, making even slight changes in the layout immediately visible.
Of course, these are not the only updates and newcomers; see the list below.

Updated packages
academicons, addlines, algorithms, archaic, babel, babel-estonian, biber, biblatex-source-division, bidi, bxjscls, chronology, cjk-ko, conteq, csplain, csquotes, curve2e, datatool, doclicense, dvips, eledmac, enotez, esami, eso-pic, etex-pkg, etoolbox, exsheets, fandol, fithesis, fontawesome, fontspec, forest, glossaries, greek-inputenc, jslectureplanner, kotex-oblivoir, kotex-utf, l3build, l3experimental, l3kernel, l3packages, latex, leadsheets, ledmac, lshort-english, luamplib, luasseq, luatexko, maths-symbols, media9, memoir, metrix, mhchem, minitoc, moderncv, modiagram, morefloats, mptopdf, msu-thesis, nameauth, ndsu-thesis, newpx, newtx, newtxsf, newtxtt, notes, ocgx2, pageslts, pdfpages, pdftex, pgf, phonrule, plain, polyglossia, pstricks, ptex, pxchfon, reflectgraphics, rsfso, sectionbox, siunitx, skrapport, standalone, tcolorbox, tetex, tex4ht, texfot, texinfo, texlive-scripts, todonotes, translations, tudscr, ucbthesis, unicode-math, xcharter, xgreek.

New packages
alertmessage, bewerbung, bidihl, bxpdfver, cloze, comicneue, copyedit, fcavtex, gradstudentresume, make4ht, mcf2graph, multiaudience, nmbib, pdfpagediff, quran, reledmac, roundrect, screenplay-pkg, shapes, tex4ebook.
This month I marked 485 packages for accept, rejected 87 of them, and had to send 18 emails to maintainers. The NEW queue is below 100 again, but you hardworking fellows don't take a break; you start the GCC5 transition instead. This is so much fun.
This month I was assigned a workload of 15 hours and again spent most of it working on a new upload of php5. I finally prepared the patches for the CVEs, only to realize that the number of failing tests had drastically increased. So back to the beginning, to find out why everything is broken now.
- [DLA 269-1] linux-ftpd-ssl security update
- [DLA 271-1] libunwind security update
- [DLA 280-1] ghostscript security update
- [DLA 281-1] expat security update
The patch for [DLA 269-1] was prepared by Mats Erik Andersson.
This month I also had another term of doing frontdesk work. So I answered questions on the IRC channel and looked for CVEs that are important for Squeeze LTS or could be ignored.
This month I could finally finish the harminv transition, and all affected packages have since migrated to testing.
I also uploaded a new version for pipexec.
Again, thanks a lot to all donors. I really appreciate this and hope that everybody is pleased with my commitment. Don't hesitate to make suggestions for improvements.
So, everybody knows that WPS (Wi-Fi Protected Setup) is broken. But sometimes you don't own the access point, and you just want the wireless to work. That happens, for example, when you're a guest in some place using an Orange Livebox and you don't have the WPA passphrase (usually because it's written somewhere you don't have access to, or because someone forgot to tell you).
The Livebox's WPS is of the “push button” kind: you press a button on the front for one second, and any device can connect within the next two minutes. That works fine with Android devices, for example, but it didn't work with my laptop and NetworkManager, which doesn't support WPS at all.
Fortunately, the underlying piece of software (wpa_supplicant) does support WPS, and even the “push button” style. And you can nicely ask it to reveal the passphrase to you with the following trick.
- Disconnect NetworkManager from the network, disable the wireless link, stop it; just make sure wpa_supplicant is not running;
- Put a stub wpa_supplicant.conf file with only the following content: update_config=1
- Start wpa_supplicant in the foreground with your stub config file: wpa_supplicant -iwlan0 -c wpa_supplicant.conf
- Start wpa_cli
- Scan the network: scan
- Get the results: scan_results and identify the bssid of the Livebox
- Press the WPS button on the Livebox
- Run wps_pbc <bssid>; some text should appear in the wpa_cli window, and it should eventually connect successfully (at that point you can even run a dhclient on wlan0)
- Run save_config
The last command will update your stub configuration file, adding a new network block with the passphrase in the clear. You can then use that passphrase inside NetworkManager if that's more convenient for you.
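To give an idea of what to expect, here is a sketch of what the updated wpa_supplicant.conf looks like after save_config. The SSID and passphrase below are made-up placeholders, and the exact set of fields written out can vary:

```conf
# Stub line you started with; lets wpa_cli's save_config rewrite this file.
update_config=1

# Network block added by save_config after the WPS exchange (values are examples).
network={
	ssid="Livebox-1234"
	psk="example-passphrase-in-the-clear"
	key_mgmt=WPA-PSK
}
```

The psk= line is the passphrase you can then enter into NetworkManager.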
There might be something easier, but at least it worked just fine for me during the holidays.
In my last post, I mentioned that libvpx had rate control issues; or more accurately, that libvpx as used by VLC (through FFmpeg) had. Seemingly that was actually fixed a while ago by Ilkka Ollakka; VLC now (well, since late 2013) automatically sets half a second VBV by default. I normally use one, but half a second is totally fine, too, and it will keep the bitrate use from spiraling out of control when you switch from e.g. a static slide to a fade.
Unfortunately, bitrate allocation has been much more thoroughly broken than that; since late 2012, VLC has fed completely bogus timestamp values into avcodec, causing bitrate allocation to be completely off. (E.g., I asked for 872 kbit/sec, and got bitrates wildly varying between 2 and 3 Mbit/sec.) The only reason why this wasn't detected was that the muxer got correct pts, and the encoder's pts are only really used for bitrate allocation, so the files ended up being valid and playing. Anyway, after I reported it, Ilkka Ollakka fixed that too, so now bitrate is much, much more stable:
Unfortunately, it also exposes another bug: on some inputs (e.g. my MPEG-TS source from live TV), the frame rate is (about half the time) not properly detected before the encoder is initialized, and so the encoder is started with the default timebase of 1/25, forcing a downconversion from 50 to 25 fps, since it refuses to send two frames with the same timestamp. Unfortunate, but at least easy to override by setting fps=50.
We are happy to announce that TecKids, a non-profit specialising in working with kids, teaching them technology, and fostering self-sustaining communities amongst them, will be holding a workshop from 15.08. to 18.08.
The focus is on kids aged 10 to 15, but kids aged 8 to 16 are welcome to attend if they can follow the course without their parents' supervision.
Admission is free of charge, but registration through TecKids’ web form is mandatory. Registration is open until Tuesday, 11 August.
The rough schedule for now is:
- game programming
You will be kept up to date on any changes to the schedule and other details by the TecKids team after registration. It is important that you check your mail before heading to DebConf in order to receive any last-minute information.
First, with the xz currently in Debian:

$ dpkg -s liblzma5
Status: install ok installed
Version: 5.1.1alpha+20120614-2.1

$ time `sudo cowbuilder --build fonts-horai-umefont_530-1.dsc`
real 1m9.842s
sys 0m3.244s
real 1m9.355s
sys 0m3.856s

xz 5.2 supports multithreading; to enable it in a local build, remove the --disable-thread option in debian/rules.

$ dpkg -s liblzma5
Status: install ok installed
Version: 5.2.1-0.1

$ time `sudo cowbuilder --build fonts-horai-umefont_530-1.dsc`
real 1m7.946s
sys 0m3.228s
real 1m8.794s
sys 0m3.556s
real 1m7.580s

Umm, the build time is not affected.

$ time `xz -T 0 fonts-horai-umefont_530.orig.tar`
sys 0m0.252s

$ time `xz -T 1 fonts-horai-umefont_530.orig.tar`
sys 0m0.124s

So multithreading itself is enabled.
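If you want to try the same comparison on your own machine, a minimal sketch (the sample file name and size are made up; note that xz only splits input into parallel blocks for sufficiently large files, so small inputs may still use a single thread even with -T0):

```shell
# Create a throwaway test file (name and size are examples).
head -c 4M /dev/zero > sample.bin

# -T0 uses as many threads as there are cores (xz >= 5.2);
# -T1 forces single-threaded mode for comparison.
# -k keeps the input file, -f overwrites an existing .xz.
time xz -T0 -k -f sample.bin
time xz -T1 -k -f sample.bin

# Verify the resulting archive is valid.
xz -t sample.bin.xz && echo "archive OK"
```

Comparing the two timings (and watching a CPU monitor during the run) shows whether the installed xz was actually built with threading support.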
I have updated NEWS, bumped the API and tagged in git; procps version 3.3.11 is now released!
This release we have fixed many bugs and made procps more robust for those odd corner cases. See the NEWS file for details. The most significant new feature in this release is the support for LXC containers in both ps and top.
The source files can be found at both sourceforge and gitlab at:
My thanks to the procps co-maintainers, bug reporters and merge/patch authors.

What's Next?
There has been a large amount of work on the library API. This is not visible in this release, as it lives on a separate git branch called newlib. The library is getting a complete overhaul and will look completely different from the old libproc/libprocps set. A decision hasn't been made on when the newlib branch will merge into master, but we will do it once we're happy that the library and its API have settled. This will be the largest change to procps' library in its 20-odd year history, but it means the library will follow common modern practices for libraries.
Typesetting a book is harder than I hoped. As the translation is mostly done, and a volunteer proof reader was going to check the text on paper, it was time this summer to focus on formatting my translated docbook based version of the Free Culture book by Lawrence Lessig. I've been trying to get both docbook-xsl+fop and dblatex to give me a good looking PDF, but in the end I went with dblatex, because its Debian maintainer and upstream developer were responsive and very helpful in solving my formatting challenges.
Last night, I finally managed to create a PDF that no longer made Lulu.com complain after uploading, and I ordered a text version of the book on paper. It is lacking a proper book cover and is not tagged with the correct ISBN number, but should give me an idea what the finished book will look like.
Instead of using Lulu, I did consider printing the book using CreateSpace, but ended up using Lulu because it had smaller book size options (CreateSpace seems to lack a pocket book size with extended distribution). I looked for a similar service in Norway, but have not seen anything so far. Please let me know if I am missing out on something here.
But I still struggle to decide the book size. Should I go for pocket book (4.25x6.875 inches / 10.8x17.5 cm) with 556 pages, Digest (5.5x8.5 inches / 14x21.6 cm) with 323 pages or US Trade (6x9 inches / 15.3x22.9 cm) with 280 pages? Fewer pages give a cheaper book, and a smaller book is easier to carry around. The test book I ordered was pocket book sized, to give me an idea how well that fits in my hand, but I suspect I will end up using a Digest sized book in the end to bring the price down further.
My biggest challenge at the moment is making nice cover art. My inkscape skills are not yet up to the task of replicating the original cover in SVG format. I also need to figure out what to write about the book on the back (will most likely use the same text as the description on web based book stores). I would love help with this, if you are willing to license the art source and final version using the same CC license as the book. My artistic skills are not really up to the task.
I plan to publish the book in both English and Norwegian, on paper, in PDF form as well as in EPUB and MOBI format. The current status can as usual be found on github in the archive/ directory. So far I have spent all my time on making the PDF version look good. Someone should probably do the same with the dbtoepub generated e-book. Help is definitely needed here, as I expect to run out of steam before I find time to improve the EPUB formatting.
Please let me know via github if you find typos in the book or discover translations that should be improved. The final proof reading is being done right now, and I expect to publish the finished result in a few months.
Slightly delayed, but here are the stats for week 5 of the DUCK challenge:
- Emmanuel Bourg uploaded gant
- Marco d'Itri uploaded libsystemd-dummy
- Christoph Egger uploaded sbcl
- Sandro Tosi uploaded basemap
- Hilko Bengen uploaded libbde
- Bas Couwenberg uploaded gpx2shp
- Michael Stapelberg uploaded dh-make-golang
- Jackson Doak uploaded pyicu
- Peter Pentchev uploaded stdsyslog
- Gianfranco Costamagna uploaded cld2
So we had 10 packages fixed and uploaded by 10 different uploaders. A big "Thank You" to you!!
Since the start of this challenge, a total of 59 packages were fixed.
Here is a quick overview:

            Week 1  Week 2  Week 3  Week 4  Week 5  Week 6  Week 7
# Packages      10      15      10      14      10       -       -
Total           10      25      35      49      59       -       -
What do the businesses in the Drupal community need to do to survive and grow in the next three years?
Drupal 8 will attract a new wave of agencies to compete in the market, and will also enable Drupal businesses to compete at a higher level. A time of big change and competition is coming, and it will no longer be enough just to say you 'do Drupal' and then sit back and watch sales roll in.
I recently moved from an agency specialising in building Drupal sites to one which is platform-agnostic and uses a variety of technologies. As my team was not very familiar with Drupal, I started writing some documentation on setting up locally, installing Drush and commonly used modules, and some other stuff so everyone could get up and running quickly. I've modified it to be even more beginner-friendly, for people who've never built websites before. This is sort of opinionated, so feel free not to follow along exactly.

The basic technology stack
As with most content management systems, Drupal has some system requirements in order to run, commonly known as a technology stack. This simply means you have to install...
A new version 0.1.0 of the drat package arrived on CRAN today. Its name stands for drat R Archive Template, and it helps with easy-to-create and easy-to-use repositories for R packages, and is finding increasing use by other projects.
This version 0.1.0 builds on the previous releases and now adds complete support for binaries on both Windows and OS X, extending what had been added in the previous release 0.0.4.
This and other new features are listed below:
- updated vignettes
- more complete support for binaries thanks to work by Jan Schulz, Matt Jones and myself
- new support to (optionally) archive existing packages thanks to Thomas Leeper
- various smaller fixes.
While disappointing in a bunch of ways, this is probably the correct decision. TODO stumbled into this space with a poor understanding of the problems that they were trying to solve. Nikki Murray pointed out that the initial draft lacked several of the key components that help ensure less privileged groups can feel that their concerns are taken seriously. This was mostly rectified last week, but nobody involved appeared to be willing to stand behind those changes in a convincing way. This wasn't helped by almost all of this appearing to land on Github's plate, with the rest of the TODO group largely missing in action. Where were Google in this? Yahoo? Facebook? Left facing an angry mob with nobody willing to make explicit statements of support, it's unsurprising that Github would try to back away from the situation.
But that doesn't remove their blame for being in the situation in the first place. The statement claims
"We are consulting with stakeholders, community leaders, and legal professionals", which is great. It's also far too late. If an industry body wrote a new kernel from scratch and deployed it without any external review, then discovered that it didn't work and only then consulted any of the existing experts in the field, we'd never take them seriously again. But when an industry body turns up with a new social policy, fucks up spectacularly and then goes back to consult experts, it's expected that we give them a pass.
Why? Because we don't perceive social problems as difficult problems, and we assume that anybody can solve them by simply sitting down and talking for a few hours. When we find out that we've screwed up we throw our hands in the air and admit that this is all more difficult than we imagined, and we give up. We ignore the lessons that people have learned in the past. We ignore the existing work that's been done in the field. We ignore the people who work full time on helping solve these problems.
We wouldn't let an industry body with no experience of engineering build a bridge. We need to accept that social problems are outside our realm of expertise and defer to the people who are experts.
 The repository history shows the majority of substantive changes were from Github, with the initial work appearing to be mostly from Twitter.
One long-standing problem in web authentication is having a single identity provider, so you don't have to remember which username (or email address) you used for that bugtracker or website you used three years ago, and ideally so you can tie everything to one login. Five years ago this problem seemed basically solved. There was OpenID, and while it may not have been great, it worked. You could have your own provider, your institution (university, company, foss project, ..) could have one, and you could use your university-provided ID for all university stuff.

Today's state
Looking at the problem again today and the situation seems to have changed. To the worse. A lot. People are actively removing OpenID support. There seemed to be a replacement with, at least, proper design goals: Mozilla's persona. However this seems to be a dead end, no-one (almost) actually supports it.
Then there is what people call OAuth2. However, there does not seem to be such a thing as OAuth2 at all, at least not for logging into websites. For example, phabricator supports 12 different OAuth2 systems, including Google, Facebook, Twitter, Amazon, Github and a whole bunch of other services. Each with a different implementation in the webapp, of course. And of course you cannot just have your university/company/.. provide an OAuth2 service for you to use -- you would need to write yet another adapter on the (foreign) website to talk to your implementation and your provider.
And the strange thing: people still seem to consider OAuth2 a replacement for OpenID even though it does not provide the core functionality of the older system. Plus, there does not seem to be any awareness of that at all.

Other features
Now of course, OpenID is not (and never was) the ultimate answer to the web authentication problem. The most obvious problem being user tracking. Your identity provider will see every website you log into, will see when you log into it and even be able to log into that website with your credentials.
Of course, this problem is fully inherited by OAuth2. And in contrast to OpenID you can no longer run your own provider whom you can fully trust and who already knows about your surfing habits (because it's actually you already). Mozilla's persona might have solved that, they at least intended to. But, again, persona seems quite dead.