Feed aggregator

jfhovinne pushed to feature/NEXTEUROPA-10441 at ec-europa/platform-dev

Devel - Thu, 12/05/2016 - 14:34
May 12, 2016 jfhovinne pushed to feature/NEXTEUROPA-10441 at ec-europa/platform-dev
  • f5f3b9d NEXTEUROPA-10441: Added readme.
Categories: Networks

OSTraining: Drupal 8 CookieConsent EU Law module

Planet Drupal - Thu, 12/05/2016 - 13:53

One of our OSTraining members asked how to add a cookie notification to a Drupal 8 site.

The CookieConsent module provides a solution for dealing with the EU Cookie Law.

It is particularly useful if you want to use the SuperCookie module.

Categories: Elsewhere

jfhovinne pushed to feature/NEXTEUROPA-10441 at ec-europa/platform-dev

Devel - Thu, 12/05/2016 - 13:38
May 12, 2016 jfhovinne pushed to feature/NEXTEUROPA-10441 at ec-europa/platform-dev
Categories: Networks

Wunderkraut blog: Dropcat - the configuration files

Planet Drupal - Thu, 12/05/2016 - 13:26

In a series of blog posts I am going to present our new tool for doing Drupal deploys. It was developed internally by the ops team at Wunderkraut Sweden: when we started doing Drupal 8 deploys we tried to rethink how we had mostly done Drupal deploys before, because we had some issues with what we already had. This is part 2.

The idea with dropcat is that you use it with options, with configuration files, or both. I would recommend using config files, with only minor settings passed as options.

You can use just a default settings file, which should be named dropcat.yml, or, as in most cases, one config file for each environment you have – dev, stage, prod etc.

You can use an environment variable, DROPCAT_ENV, to set which environment to use. To use the prod environment, set that variable in the terminal with:
export DROPCAT_ENV=prod

Normally we set this environment variable in our Jenkins build, but you can also pass it as a parameter to dropcat, like:
dropcat backup --env=prod

That will use the dropcat.prod.yml file.

By default dropcat uses dropcat.yml if you don't set an environment.
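To make the lookup concrete, here is a small shell sketch of how the config file name follows from DROPCAT_ENV. This mimics the behaviour described above; it is not dropcat's actual source code:

```shell
# Hypothetical sketch of the lookup described above -- not dropcat's own code.
resolve_dropcat_config() {
    if [ -n "${DROPCAT_ENV:-}" ]; then
        echo "dropcat.${DROPCAT_ENV}.yml"   # e.g. DROPCAT_ENV=prod -> dropcat.prod.yml
    else
        echo "dropcat.yml"                  # the default settings file
    fi
}

DROPCAT_ENV=prod
resolve_dropcat_config   # prints: dropcat.prod.yml
unset DROPCAT_ENV
resolve_dropcat_config   # prints: dropcat.yml
```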

There will be more in the next blog posts, but first let's look at a minimal config file. In our root dir we could have a dropcat.yml file with this config:

app_name: mysite
local:
  environment:
    tmp_path: /tmp
    seperator: _
    drush_folder: /home/myuser/.drush
remote:
  environment:
    server: mytarget.server.com
    ssh_user: myuser
    ssh_port: 22
    identity_file: /home/myuser/.ssh/id_rsa
    web_root: /var/www/webroot
    temp_folder: /tmp
    alias: mysite_latest_stage
site:
  environment:
    drush_alias: mysitestage
    backup_path: /backup
    original_path: /srv/www/shared/mysite_stage/files
    symlink: /srv/www/mysite_latest_stage/web/sites/default/files
    url: http://mysite.com
    name: mysitestage
mysql:
  environment:
    host: mymysql.host.com
    database: my_db
    user: my_db_user
    password: my_db_password
    port: 3306

The settings are grouped in a way that should explain what they are used for – local.environment is where we deploy from, remote.environment is where we deploy to, site.environment is for drush and symlinks (we use them for the files folder), and mysql.environment is for… yeah, you guessed correctly – MySQL/MariaDB.

app_name

This is the application name, used for creating a tar-file with that name (with some more information, like build date and build number).

local

These are the settings for where we deploy from; it could be a local machine, or a build server such as Jenkins.

tmp_path

Where we temporarily store files.

seperator

Used as the separator in the name of the folder to deploy, like myapp_DATE.


drush_folder

Where the drush settings live on the machine you deploy from, normally in your home folder (for Jenkins, normally /var/lib/jenkins/.drush). This is also the path where the drush alias is saved on dropcat prepare.

remote

server

The server you deploy your code to.

ssh_user

User to use for ssh to your remote server.

ssh_port

Port to use for ssh to your server.

identity_file

Which private ssh-key to use to login to your remote server

web_root

Path on the remote server that your site is deployed to.

temp_folder

Temp folder on remote server, used for unpacking tar file.

alias

Symlink alias for your site.


site

drush_alias

Name of your drush alias, used from the 'local' server. The drush alias is created as part of dropcat prepare.

backup_path

Backup path on the 'local' server. Used by dropcat backup.

original_path

An existing path to point a symlink to – we use it for the files folder.

symlink

Symlink path that points to original_path

url

URL for your site, used in the drush alias.

name

Name of site in drush alias.


mysql

host

Name of the db host.

database

Database to use

user

Database user

password

Password for the db user.

port

Port to use with mysql

We are still at a very abstract level; next time we will go through what is needed in a normal Jenkins build.

Categories: Elsewhere


Michal Čihař: Changed Debian repository signing key

Planet Debian - Thu, 12/05/2016 - 09:10

After getting complaints from apt and users, I've finally decided to upgrade the signing key on my Debian repository to something more decent than DSA. If you are using that repository, you will now have to fetch the new key to make it work again.

The old DSA key was there really because of my laziness, as I didn't want users to reimport the key, but I think it's really good that apt started to complain about it (it doesn't complain about DSA itself, but rather about the use of SHA1 signatures, which is the most you can get out of a DSA key).

Anyway, the new key ID is DCE7B04E7C6E3CD9 and the fingerprint is 4732 8C5E CD1A 3840 0419 1F24 DCE7 B04E 7C6E 3CD9. It's signed by my GPG key, so you can verify it this way. Of course, the instructions on my Debian repository page have been updated as well.

Filed under: Debian English | 2 comments

Categories: Elsewhere

Petter Reinholdtsen: Debian now with ZFS on Linux included

Planet Debian - Thu, 12/05/2016 - 07:30

Today, after many years of hard work from many people, ZFS for Linux finally entered Debian. The package status can be seen on the package tracker for zfs-linux and on the team status page. If you want to help out, please join us. The source code is available via git on Alioth. It would also be great if you could help out with the dkms package, as it is an important piece of the puzzle to get ZFS working.

Categories: Elsewhere

d7One: How to print orders in Commerce

Planet Drupal - Wed, 11/05/2016 - 18:53

In this tutorial or guide, I will share the best solutions I found for two basic Drupal Commerce use-cases and delve into their respective setup.

Commerce Kickstart 2 (CK2) is a great distribution for setting up an online store; it packs a lot of goodies out of the box. But it can't include everything: printing an order to PDF is not included, so one has to do some R&D for that.

Categories: Elsewhere

jfhovinne pushed to feature/NEXTEUROPA-10441 at ec-europa/platform-dev

Devel - Wed, 11/05/2016 - 17:55
May 11, 2016 jfhovinne pushed to feature/NEXTEUROPA-10441 at ec-europa/platform-dev
Categories: Networks

jfhovinne created repository jfhovinne/integration_couchdb

Devel - Wed, 11/05/2016 - 16:56
jfhovinne created repository jfhovinne/integration_couchdb May 11, 2016
Categories: Networks

Valuebound: Your First Step to Git

Planet Drupal - Wed, 11/05/2016 - 14:39

Hey! So you are here on this page trying to find or learn something about Git! Have you used a source code management system to synchronize your local code remotely before? Did you know that Git is the most powerful SCM? I was convinced, and yes it is!

I actually started SCM with SVN (Apache Subversion). In fact, I started with TortoiseSVN, a GUI tool for Windows. There are no commands to remember, so nothing to worry about: just right-click on your web root folder and choose whichever option you need! Sounds easy?
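Since this post is about taking a first step to Git, a minimal first Git session looks like this (the repository name, user name/email, and file are placeholders, not anything from the post):

```shell
# A minimal first Git session (names, email, and file are placeholders):
git init mysite                          # create a new repository
cd mysite
git config user.name  "Your Name"
git config user.email "you@example.com"
echo "Hello, Git" > README.md
git add README.md                        # stage the file
git commit -m "Initial commit"           # record the first snapshot
git log --oneline                        # one line per commit
```

Unlike the TortoiseSVN right-click workflow, everything here is a command, but these five or six commands already cover most day-to-day use.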

If you want to go with SVN, you can refer to these links.
http://www.tutorialspoint.com/svn/svn_basic_concepts.htm
http://…

Categories: Elsewhere

Elena 'valhalla' Grandi: GnuPG Crowdfunding and sticker

Planet Debian - Wed, 11/05/2016 - 14:22
GnuPG Crowdfunding and sticker

I've just received my laptop sticker from the GnuPG crowdfund http://goteo.org/project/gnupg-new-website-and-infrastructure: it is of excellent quality, but comes with HOWTO-like detailed instructions to apply it in the proper way.

This strikes me as oddly appropriate.

#gnupg
Categories: Elsewhere

Elena 'valhalla' Grandi: New gpg subkey

Planet Debian - Wed, 11/05/2016 - 14:21
New gpg subkey

The GPG subkey http://www.trueelena.org/about/gpg.html I keep for daily use was going to expire, and this time I decided to create a new one instead of changing the expiration date.

Doing so I've found out that gnupg does not support importing just a private subkey for a key it already has (on IRC I've heard that there may be more information about it on the gpg-users mailing list), so I've written a few notes on what I had to do on my website http://www.trueelena.org/computers/howto/gpg_subkeys.html, so that I can remember them next year.

The short version is:

* Create your subkey (in the full keyring, the one with the private master key)
* export every subkey (including the expired ones, if you want to keep them available), but not the master key
* (copy the exported key from the offline computer to the online one)
* delete your private key from your regular use keyring
* import back the private keys you have exported before.
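A command-level sketch of those steps (KEYID is a placeholder for your own key; gpg options vary slightly between versions, so treat this as a sketch rather than a recipe):

```
# On the offline machine (full keyring with the private master key):
gpg --edit-key KEYID                          # then: addkey ... save
gpg --export-secret-subkeys KEYID > subkeys.sec

# Copy subkeys.sec to the online machine, then there:
gpg --delete-secret-keys KEYID
gpg --import subkeys.sec
```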

#gnupg
Categories: Elsewhere

Julian Andres Klode: Backing up with borg and git-annex

Planet Debian - Wed, 11/05/2016 - 11:47

I recently found out that I have access to a 1 TB cloud storage drive by 1&1, so I decided to start taking off-site backups of my $HOME (well, backups at all, previously I only mirrored the latest version from my SSD to an HDD).

I initially tried obnam. Obnam seems like a good tool, but is insanely slow. Unencrypted it can write about 3 MB/s, which is somewhat OK, but even then it can spend hours forgetting generations (1 generation takes probably 2 minutes, and there might be 22 of them). In encrypted mode, the speed reduces a lot, to about 500 KB/s if I recall correctly, which is just unusable.

I found borg backup, a fork of attic. Borg backup achieves speeds of up to 15 MB/s which is really nice. It’s also faster with scanning: I can now run my bihourly backups in about 1 min 30s (usually backs up about 30 to 80 MB – mostly thanks to Chrome I suppose!). And all those speeds are with encryption turned on.

Both borg and obnam use some form of chunks from which they compose files. Obnam stores each chunk in its own file, borg stores multiple chunks (even from different files) in a single pack file which is probably the main reason it is faster.

So how am I backing up: My laptop has an internal SSD and an HDD.  I backup every 2 hours (at 09,11,13,15,17,19,21,23,01:00 hours) using a systemd timer event, from the SSD to the HDD. The backup includes all of $HOME except for Downloads, .cache, the trash, Android SDK, and the eclipse and IntelliJ IDEA IDEs.
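The bihourly schedule could be expressed as a systemd user timer roughly like this; the unit name and the service it would trigger are assumptions, since the post does not show the actual units:

```ini
# ~/.config/systemd/user/borg-backup.timer (hypothetical)
[Unit]
Description=Bihourly borg backup of $HOME

[Timer]
# 01:00, 09:00, 11:00 ... 23:00, matching the hours listed in the post
OnCalendar=*-*-* 01,09,11,13,15,17,19,21,23:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

The matching borg-backup.service would then run the actual borg create command against the repository on the HDD.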

Now the magic comes in: The backup repository on the HDD is monitored by git-annex assistant, which automatically encrypts and uploads any new files in there to my 1&1 WebDAV drive and registers them in a git repository hosted on bitbucket. All files are encrypted and checksummed using SHA256, reducing the chance of the backup being corrupted.

I’m not sure how the WebDAV thing will work once I want to prune things, I suspect it will then delete some pack files and repack things into new files which means it will spend more bandwidth than obnam would. I’d also have to convince git-annex to actually drop anything from the WebDAV remote, but that is not really that much of a concern with 1TB storage space in the next 2 years at least…

I also have an external encrypted HDD which I can take backups on, it currently houses a fuller backup of $HOME that also includes Downloads, the Android SDK, and the IDEs for quicker recovery. Downloads changes a lot, and all of them can be fairly easily re-retrieved from the internet as needed, so there’s not much point in painfully uploading them to a WebDAV backup site.

 


Filed under: Uncategorized
Categories: Elsewhere

Norbert Preining: Gaming: The Room series

Planet Debian - Wed, 11/05/2016 - 07:55

After I finished Monument Valley and some spin-offs, Google Play suggested The Room series of games (The Room, The Room II, The Room III), classic puzzle games with a common theme – one needs to escape from some confinement.

I have finished all three games; gameplay was very nice and smooth on my phone (Nexus 6P). The graphics and level of detail are often astonishing, and everything is well made.

But there is one drop of bitterness: you need a strong finger-tapping muscle! I really love solving the puzzles, but most of them were not really difficult. The real difficulty is finding everything by touching each and every knob and looking from all angles at all times. This latter part – the tedious hunt for things by tapping on often illogical places until you realize "ahh, there is something that turns" – is what I do not like.

I had the feeling that more than 60% of the game play is searching for things. Once you have found them, their use and the actual riddle is mostly straightforward, though.

The Room series somehow reminded me of the Myst series (Myst, Riven, Myst III etc), but as far as I recall the Myst series had more involved, more complicated riddles, and less searching. Also the recently reviewed Talos Principle and Portal series have clearly set problems that challenge your brain, not your finger-tapping muscle.

But all in all a very enjoyable series of games.

Final remark: I learned recently that there are real-world games like this, called "Escape Rooms". Somehow tempting to try one out …

Categories: Elsewhere

Virtuoso Performance: DrupalCon NOLA Tuesday call to action - migration sprints

Planet Drupal - Wed, 11/05/2016 - 03:33
DrupalCon NOLA Tuesday call to action - migration sprints

Again, in the interests of timeliness I'll stick to a simple chronological wrapup of the day. And in the interests of of-course-everyone-cares-what-Mike-eats, I will continue subjecting you to my culinary adventures - breakfast at the Clover Grill in the midst of tourist land (Bourbon Street). Good, basic diner food - eggs over easy with bacon and hash browns, the primary goal here was to make it quick and get to the convention center in time for the prenote (which I have somehow never managed to rouse myself in time for at previous DrupalCons).

And, as always (by reputation), the prenote was an extravaganza hosted by jam. So much energy on the stage, so many songwriters calling their lawyers... Highlights were Gábor unveiling a sweet, soulful voice, and the epic Code of Conduct song (performed 1.5 times, so no excuses for not getting it down).

That brings us to - ta-da! - DriesNote. As always, a lot of information presented succinctly - I'm sure others will cover many of his points, so I'll focus on my special interest - migration. In Dries' annual survey, site builders identified migration tools as their biggest need for Drupal 8, and he called out the Friday migration sprint.

Sprint all the migrates!

So... let's see how much progress we can make on core migration issues this week! Important things to note:

  1. You don't have to wait for Friday. The Sprint Lounge (rooms 275-277) is open every day. And, while as usual I checked off many, many sessions I'd like to attend, after sitting in a couple today where (through no fault of the presenters) I was mainly thinking about migration, I'm going to try to spend significant time every day (right up through Sunday morning) sprinting.
  2. You don't have to be in New Orleans! You can help remotely - drop into the #drupal-migrate IRC channel, or just pick issues from the core queue and dive in on your own.
  3. You don't have to know the migration framework - there are various ways you can help out (see below).

We already have 10 people officially signed up for migration sprinting (between the core and multilingual lists), so (particularly with more people joining) we can afford to split into multiple sprint teams:

  • Backwards-compatibility breakers - try to address any issues that may affect backwards compatibility, so migration implementors will be able to count on a stable API from 8.2.x forward. This was my priority coming in, and you'll find triaged issues on the Sprint triage tab of the Migration sprint spreadsheet.
  • I18n issues - penyaskito is already leading a migration sprint in this area - it overlaps with the BC-breakers on the epic Migrate D6 i18n nodes issue.
  • Migrate criticals - note that this overlaps some with the BC-breakers (the BC-breaker list has its migrate-criticals listed first), so look for issues not already covered there.
  • UI issues - Abhishek Anand, who did some of the work on the UI in contrib, will lead efforts to clean up remaining issues in core. He'll be in the sprint room Wednesday morning, as well as most of the day Friday, and you can also coordinate with him outside of those times (or if you're not here).
  • We have a lot of issues at the Needs review stage - let's see how many we can get to RTBC, or give constructive feedback, so we can move forward on stuff like node and user references.
Specifically, how can I help?
  • If you're at DrupalCon NOLA, come to the sprint room (275-277) any time Wednesday-Friday - I'll try to get there early and reserve a table just for migration. There are a couple of sessions I definitely want to catch, but I should be there for most non-lunch time, and there should generally be others there (especially Friday) when I'm not.
  • If you're remote, you can announce your presence in #drupal-migrate on IRC. Or just pick an issue to work on.

Either way, please put your name under "Who's working on it" in the spreadsheet so we don't duplicate effort (multiple people can be involved in one patch, but should coordinate).

Ways to help on a specific issue:

  • Write a patch (or discuss approaches to a patch) where there is none yet.
  • Review an existing "needs review" patch.
  • Manually test a "needs review" patch - set up a patched D8 environment and try running your site through the migration process (we'll give some help on setup here).
  • Add tests to a patch tagged "Needs tests".
  • Help solve any outstanding issues on a "needs work" patch.
  • Any other ideas you might have...

 

 

mikeryan Tue, 05/10/2016 - 20:33 Tags
Categories: Elsewhere

Reproducible builds folks: Reproducible builds: week 54 in Stretch cycle

Planet Debian - Tue, 10/05/2016 - 23:00

What happened in the Reproducible Builds effort between May 1st and May 7th 2016:

Media coverage

There was a surprising tweet last week: "Props to @FiloSottile for his nifty gvt golang tool. We're using it to get reproducible builds for a Zika & West Nile monitoring project." and to our surprise Kenn confirmed privately that he indeed meant "reproducible builds" as in "bit-by-bit identical builds". Wow. We're looking forward to learning more details about this; for now we just know that they are doing this basically for software quality reasons.

Two of the four GSoC and Outreachy participants for Reproducible builds posted their introductions to Planet Debian:

Toolchain fixes and other upstream developments

dpkg 1.18.5 was uploaded fixing two bugs relevant to us:

  • #719845 (make the file order within the {data,control}.tar.gz .deb members deterministic)
  • #819194 (add -fdebug-prefix-map to the compilers options)

This upload made it necessary to rebase our dpkg on the version on sid again, which Niko Tyni and Lunar promptly did. Then a few days later 1.18.6 was released to fix a regression in the previous upload, and Niko promptly updated our patched version again. Following this Niko Tyni found #823428: "dpkg: many packages affected by dpkg-source: error: source package uses only weak checksums".

Alexis Bienvenüe worked on tex related packages and SOURCE_DATE_EPOCH:

  • Alexis uploaded texlive-bin to our repo improving the existing patches.
  • pdftex upstream discussion by Alexis Bienvenüe began at the tex-k mailing list to make \today honour SOURCE_DATE_EPOCH. Upstream already committed enhanced versions of the proposed patches.
  • Similar discussion on the luatex side at luatex mailing list. Upstream is working on it, and already committed some changes.

Emmanuel Bourg uploaded jflex/1.4.3+dfsg-2, which removes timestamps from generated files.
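For context on the fixes above: SOURCE_DATE_EPOCH is simply a Unix timestamp exported into the build environment; tools that honour it embed it instead of the current time, so rebuilds produce identical output. A minimal illustration (the dpkg-parsechangelog line in the comment shows where Debian builds typically derive the value from):

```shell
# SOURCE_DATE_EPOCH is a Unix timestamp; honouring tools use it instead of "now".
# In Debian builds it is typically derived from the latest changelog entry, e.g.:
#   SOURCE_DATE_EPOCH="$(date -d "$(dpkg-parsechangelog -S Date)" +%s)"
export SOURCE_DATE_EPOCH=0                     # any fixed timestamp works for the demo
date -u -d "@${SOURCE_DATE_EPOCH}" +%Y-%m-%d   # prints: 1970-01-01
```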

Packages fixed

The following 285 packages have become reproducible due to changes in their build dependencies (mostly from GCC honouring SOURCE_DATE_EPOCH, see the previous week report): 0ad abiword abcm2ps acedb acpica-unix actiona alliance amarok amideco amsynth anjuta aolserver4-nsmysql aolserver4-nsopenssl aolserver4-nssqlite3 apbs aqsis aria2 ascd ascii2binary atheme-services audacity autodocksuite avis awardeco bacula ballerburg bb berusky berusky2 bindechexascii binkd boinc boost1.58 boost1.60 bwctl cairo-dock cd-hit cenon.app chipw ckermit clp clustalo cmatrix coinor-cbc commons-pool cppformat crashmail crrcsim csvimp cyphesis-cpp dact dar darcs darkradiant dcap dia distcc dolphin-emu drumkv1 dtach dune-localfunctions dvbsnoop dvbstreamer eclib ed2k-hash edfbrowser efax-gtk efax exonerate f-irc fakepop fbb filezilla fityk flasm flightgear fluxbox fmit fossil freedink-dfarc freehdl freemedforms-project freeplayer freeradius fxload gdb-arm-none-eabi geany-plugins geany geda-gaf gfm gif2png giflib gifticlib glaurung glusterfs gnokii gnubiff gnugk goaccess gocr goldencheetah gom gopchop gosmore gpsim gputils grcompiler grisbi gtkpod gvpe hardlink haskell-github hashrat hatari herculesstudio hpcc hypre i2util incron infiniband-diags infon ips iptotal ipv6calc iqtree jabber-muc jama jamnntpd janino jcharts joy2key jpilot jumpnbump jvim kanatest kbuild kchmviewer konclude krename kscope kvpnc latexdiff lcrack leocad libace-perl libcaca libcgicc libdap libdbi-drivers libewf libjlayer-java libkcompactdisc liblscp libmp3spi-java libpwiz librecad libspin-java libuninum libzypp lightdm-gtk-greeter lighttpd linpac lookup lz4 lzop maitreya meshlab mgetty mhwaveedit minbif minc-tools moc mrtrix mscompress msort mudlet multiwatch mysecureshell nifticlib nkf noblenote nqc numactl numad octave-optim omega-rpg open-cobol openmama openmprtl openrpt opensm openvpn openvswitch owx pads parsinsert pcb pd-hcs pd-hexloader pd-hid pd-libdir pear-channels pgn-extract phnxdeco php-amqp php-apcu-bc 
php-apcu php-solr pidgin-librvp plan plymouth pnscan pocketsphinx polygraph portaudio19 postbooks-updater postbooks powertop previsat progressivemauve puredata-import pycurl qjackctl qmidinet qsampler qsopt-ex qsynth qtractor quassel quelcom quickplot qxgedit ratpoison rlpr robojournal samplv1 sanlock saods9 schism scorched3d scummvm-tools sdlbasic sgrep simh sinfo sip-tester sludge sniffit sox spd speex stimfit swarm-cluster synfig synthv1 syslog-ng tart tessa theseus thunar-vcs-plugin ticcutils tickr tilp2 timbl timblserver tkgate transtermhp tstools tvoe ucarp ultracopier undbx uni2ascii uniutils universalindentgui util-vserver uudeview vfu virtualjaguar vmpk voms voxbo vpcs wipe x264 xcfa xfrisk xmorph xmount xyscan yacas yasm z88dk zeal zsync zynaddsubfx

Last week the 1000th bug usertagged "reproducible" was fixed! This means roughly 2 bugs per day since 2015-01-01. Kudos and huge thanks to everyone involved! Please also note: FTBFS packages have not been counted here and there are still 600 open bugs with reproducible patches provided. Please help bring that number down to 0!

The following packages have become reproducible after being fixed:

Some uploads have fixed some reproducibility issues, but not all of them:

Uploads which fix reproducibility issues, but currently FTBFS:

Patches submitted that have not made their way to the archive yet:

  • #823174 against ros-pluginlib by Daniel Shahaf: use printf instead of echo to fix implementation-specific behavior.
  • #823239 against gspiceui by Alexis Bienvenüe: sort list of object files for linking binary.
  • #823241 against unhide by Alexis Bienvenüe: sort list of source files passed to compiler.
  • #823393 against kdbg by Alexis Bienvenüe: fix changelog encoding and call grep in text mode.
  • #823452 against khronos-opengl-man4 by Daniel Shahaf: sort file lists deterministically.
Package reviews

54 reviews have been added, 6 have been updated and 44 have been removed in this week.

18 FTBFS bugs have been reported by Chris Lamb, James Cowgill and Niko Tyni.

diffoscope development

Thanks to Mattia, diffoscope 52~bpo8+1 is available in jessie-backports now.

tests.reproducible-builds.org
  • All packages from all tested suites have finally been built on i386.
  • Due to GCC supporting SOURCE_DATE_EPOCH sid/armhf has finally reached 20k reproducible packages and sid/amd64 has even reached 21k reproducible packages. (These numbers are about our test setup. The numbers for the Debian archive are still all 0. dpkg and dak need to be fixed to get the numbers above 0.)
  • IRC notifications for non-Debian related jenkins job results go to #reproducible-builds now, while Debian related notifications stay on #debian-reproducible. (h01ger)
  • profitbricks-build4-amd64 has been fully set up now and is running 398 days in the future. Next: update coreboot/OpenWrt/Fedora/Archlinux/FreeBSD/NetBSD scripts to use it. Help (in form of patches to existing shell scripts) very much welcome! (Other help is much welcome (and needed) too, but some things might take longer to merge or explain…)
Misc.

This week's edition was written by Reiner Herrmann, Holger Levsen and Mattia Rizzolo and reviewed by a bunch of Reproducible builds folks on IRC. Mattia also wrote a small ikiwiki macro for this blog to ease linking reproducible issues, packages in the package tracker and bugs in the Debian BTS.

Categories: Elsewhere

Leopathu: Create a custom Twig filter in Drupal 8

Planet Drupal - Tue, 10/05/2016 - 22:06
Twig can be extended in many ways; you can add extra tags, filters, tests, operators, global variables, and functions. You can even extend the parser itself with node visitors. In this blog post, I am going to show you how to create a new custom Twig filter in Drupal. For example, we are going to create a filter to remove numbers from a string; I will explain using a hello_world module. Create a hello_world folder in the modules/custom/ folder with the following files:
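As a sketch of where that post is headed: in Drupal 8 a custom Twig filter is a Twig extension class registered as a service tagged twig.extension. The class, filter name, and regex below are illustrative guesses, not the post's actual code:

```php
<?php
// src/TwigExtension/RemoveNumbers.php in a hypothetical hello_world module.
namespace Drupal\hello_world\TwigExtension;

class RemoveNumbers extends \Twig_Extension {

  public function getFilters() {
    // Registers a |remove_num filter usable in any Twig template.
    return [
      new \Twig_SimpleFilter('remove_num', [$this, 'removeNumbers']),
    ];
  }

  public function removeNumbers($string) {
    // Strip all digits from the given string.
    return preg_replace('/[0-9]+/', '', $string);
  }

  public function getName() {
    return 'hello_world.twig_extension';
  }

}
```

Registered in hello_world.services.yml with the twig.extension tag, `{{ 'abc123'|remove_num }}` would then render `abc`.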
Categories: Elsewhere

Jeff Geerling's Blog: Thoughts on the Acquia Certified Developer - Drupal 8 Exam

Planet Drupal - Tue, 10/05/2016 - 21:21

Another year, another Acquia Certification exam...

I'm at DrupalCon New Orleans, the first North American DrupalCon since the release of Drupal 8. In addition, this is the first DrupalCon where the Acquia Certified Developer - Drupal 8 Exam is being offered, so I decided to swing by the certification center (it's on the 3rd floor of the convention center, in case you want to take any of the certification exams this week!) and take it.

Categories: Elsewhere

Pages

Subscribe to jfhovinne aggregator