Elsewhere

Drupal Watchdog: Write a Migrate Process Plugin, Learn Drupal 8

Planet Drupal - Tue, 16/06/2015 - 19:24

A few of us were coaching Campbell Vertesi on porting the CSV source to Drupal 8 when he asked, as an aside, how to map the US states he had in a taxonomy vocabulary to taxonomy term IDs during a migration. Glad you asked! The answer gives us an example of quite a few concepts in Drupal 8, so let's dig in! We will go over the code line by line.

Plugins

This particular class is a plugin. Plugins are ordinary objects, placed in a predefined directory and carrying a little metadata. For example, field widgets and formatters are plugins: they get a field and return a form or a render array. We can swap the formatter freely; only the type and meaning of the inputs and the output are fixed. Another good example is image effects. Migrate uses plugins for everything: sources, processing, destinations. See more.

Namespaces, PSR-4

Line 8 contains a namespace declaration: the first part is Drupal, then the module name migrate_plus, then the rest. Typically a plugin's namespace continues with a Plugin part, then the name of the defining module (migrate) and finally the type of the plugin (process) if the defining module has several. Not every plugin type requires such a long namespace; entities, for example, simply use Entity after the module name: Drupal\taxonomy\Entity. Drupal 8 will look for classes of the migrate_plus module under modules/migrate_plus/src (and all the other usual places for modules), and the rest of the path matches the namespace -- this is specified by the PSR-4 standard, so this class lives in the directory modules/migrate_plus/src/Plugin/migrate/process (sneak preview: a few lines later we will find that the class name is TermReference, so the filename is TermReference.php).
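Putting those pieces together, the PSR-4 mapping for this plugin looks like this:

<?php
// File: modules/migrate_plus/src/Plugin/migrate/process/TermReference.php
// PSR-4: everything after "Drupal\migrate_plus" in the namespace mirrors
// the directory path under the module's src/ directory.
namespace Drupal\migrate_plus\Plugin\migrate\process;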

Use Statements

Lines 10-16 contain use statements. use some\namespace\class allows us to just write class in later code, and the Drupal coding standards require this. It really is just syntactic sugar; you can even use non-existing classes. As an aside, many of us have found the PhpStorm IDE very convenient for Drupal 8 development: for example, it takes care of the file placement and naming from the previous section and adds these use statements automatically for you.
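For instance, one such line might look like this (a hedged example; Drupal\migrate\Row is a real core class that process plugins refer to):

use Drupal\migrate\Row;

// Later code can now reference the class simply as "Row" instead of
// the fully qualified \Drupal\migrate\Row.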

Annotations

Lines 21-23 contain an annotation. Annotations are a very useful feature in sane languages (like Python), so much so that the PHP community has implemented them in user space… several times. Drupal 8 uses the Doctrine annotation syntax on classes and PHPUnit annotations on tests. Doctrine annotations are pretty close to a PHP array, except {} is used instead of array(). We can see a very simple example here: this class uses the MigrateProcessPlugin annotation, and the plugin definition is array('id' => 'term_reference'). Every plugin must have at least an id. In previous versions of Drupal you would have used a hook_migrate_process_info returning an array keyed by the same id and some data. Although the info hooks are gone, the alter hooks are still here: for example, migrate_process_info_alter is a valid hook (although at this moment undocumented, as its utility is severely limited). Other similar hooks, however, are much more useful, for example hook_entity_info_alter.
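A minimal sketch of such an annotation docblock, using the id mentioned above:

/**
 * @MigrateProcessPlugin(
 *   id = "term_reference"
 * )
 */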

MigrateProcessPlugin itself is a class in the Drupal\migrate\Annotation namespace and it’s useful to know this because this class is the nexus of information about process plugins.

Classes, Base Classes and Interfaces

Line 25 contains the class name, a base class and an interface. Interfaces are one of the fundamental building blocks of Drupal 8. An interface provides a contract by which the classes that implement it agree to provide certain functionality, so that they can be used the same way as any other class implementing the same interface. In other words, every implementing class will have certain methods which take a certain kind of input and provide a certain kind of output. Interfaces are absolutely fundamental to plugins, since code interacting with a plugin knows only about the methods the interface requires and nothing about the plugin's internals. Because of this, plugin types can require their plugins to implement a specific interface, and Drupal throws an exception if they don't.

Base classes are not a language feature; they are, however, typical of Drupal 8: these classes contain useful logic common to implementations of an interface. Extending them instead of implementing the interface from scratch is very strongly recommended (although not mandatory). Some interfaces do not have a base class, for example ContainerFactoryPluginInterface.
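Given the names we have seen so far, the declaration on line 25 plausibly reads like this (ProcessPluginBase, the migrate module's base class for process plugins, is my assumption; the article only tells us there is a base class and an interface):

class TermReference extends ProcessPluginBase implements ContainerFactoryPluginInterface {
  // ...
}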

Services, Injection

We will skip the constructor for now and talk about the create method starting on line 40, which is required for implementing ContainerFactoryPluginInterface; then we will cover the constructor briefly.

Previous versions of Drupal were often strongly coupled: hardwired function calls were the norm. In Drupal 8 a lot of functionality is provided by so-called services. There is a service for all sorts of things: working with entities, logging information, installing modules, etc. The container itself is an object, and its most used method (by far) is get, as visible on line 46. You can find the services provided by core here. Because the container provides so many things, it is not good practice to pass and store the container itself in an object: doing so makes a class harder to understand (and to test), as it can then depend on basically anything. Instead, only the static create method receives the container; it passes the necessary services to the constructor, and the class itself ends up with clean dependencies.

By far the most commonly used service is the entity manager: its getDefinition method gives us the entity type object, the equivalent of entity_get_info in Drupal 7. getStorage gives us the storage object, which in turn can query and load entities of a particular type. (The entity objects can then save themselves.) If we are not writing a nice little plugin, the entity manager can also be accessed as \Drupal::entityManager(). The Drupal class has methods for the most common functionality. Most of these methods just wrap a $container->get() call, so this list is also useful as a list of services. See more on services.

So the create method grabs the taxonomy term storage object and passes it to the constructor. The constructor in turn calls the base class constructor, which initializes the common plugin properties; our constructor then initializes our own properties. Most importantly, the term storage is now available to every method in the class.
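A sketch consistent with that description (property and parameter names are mine; entity.manager was the entity manager's service id in Drupal 8 at the time; use statements for ContainerInterface and EntityStorageInterface omitted):

public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
  // Grab the taxonomy term storage from the container and hand it over.
  return new static(
    $configuration,
    $plugin_id,
    $plugin_definition,
    $container->get('entity.manager')->getStorage('taxonomy_term')
  );
}

public function __construct(array $configuration, $plugin_id, $plugin_definition, EntityStorageInterface $term_storage) {
  // The base class constructor initializes the common plugin properties.
  parent::__construct($configuration, $plugin_id, $plugin_definition);
  // Our own property: the storage is now available to every method.
  $this->termStorage = $term_storage;
}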

Entity Query

We have a getTermId helper method, not required by any interface -- it cannot be, as interfaces contain public methods only. This method queries the term storage for the terms in the specified vocabulary. This perhaps looks familiar -- almost like a database query in Drupal 7. This, however, is for entities only, and the condition method is extremely powerful: for example, to find nodes posted by users who joined in the last hour, use condition('uid.entity.created', REQUEST_TIME - 3600, '>'). Also, while writing SQL queries directly was already discouraged in Drupal 7, in Drupal 8 it is safe to assume that touching the database directly is just doing it wrong.

The entity query returns a list of entity ids, and then we load those terms. The next interesting tidbit is $term->name->value: this is one of the ways to access a field value in D8, but it is mostly here for demonstration; the proper method, $term->label(), is strongly preferred. This $entity->fieldname->propertyname chain can continue: we can write $node->uid->entity->created->value to get the creation time of the node's author.

The entity query condition syntax closely mirrors this: change the arrows to dots, optionally drop the main property (in this case value), and you get the previously mentioned condition('uid.entity.created', ... to query the same thing. The Entity API is a really powerful feature of Drupal 8.
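Putting the last three paragraphs together, the helper might look roughly like this (the vocabulary id and the exact matching logic are my assumptions):

protected function getTermId($value) {
  // Query the term storage for all terms in the given vocabulary.
  $tids = $this->termStorage->getQuery()
    ->condition('vid', 'us_states')   // hypothetical vocabulary id
    ->execute();
  // Load the terms and return the id of the one whose label matches.
  foreach ($this->termStorage->loadMultiple($tids) as $tid => $term) {
    if ($term->label() == $value) {
      return $tid;
    }
  }
}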

Process Plugins

Finally we arrive at the transform method, the only method required of a process plugin. Migrate works by reading a row from a source plugin, running each property through a pipeline of process plugins, and then handing the resulting row to a destination plugin. Each process plugin gets the current value and returns a value. Core provides quite a number of these; a list can be found here. Most process plugins are really small: the average among the core process plugins is a mere 58 lines of code (LoC), and only one is above 100 LoC: the migration process plugin, used to look up previously migrated identifiers, and even that is only 196 LoC.

In our case the actual functionality is just one line of code after all this setup. Of course this doesn’t include error handling etc.
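In sketch form, with roughly the signature the process plugin interface requires:

public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
  // The single line of actual functionality: map the incoming state
  // name to its taxonomy term id.
  return $this->getTermId($value);
}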

So there you have it: in order to be able to run this single line of code, we needed to put a file in the right directory, containing the right namespace and classname, implement the right interfaces, get a service from the container and run an entity query.


Julien Danjou: Timezones and Python

Planet Debian - Tue, 16/06/2015 - 18:15

Recently, I've been fighting the never-ending issue of timezones. I never thought I would plunge into this rabbit hole, but hacking on OpenStack and Gnocchi, I fell into that trap easily, thanks to Python.

“Why you really, really, should never ever deal with timezones”

To get a glimpse of the complexity of timezones, I recommend that you watch Tom Scott's video on the subject. It's fun and it summarizes remarkably well the nightmare that timezones are and why you should stop thinking that you're smart.

The importance of timezones in applications

Once you've heard what Tom says, I think it gets pretty clear that a timestamp without any timezone attached does not give any useful information. It should be considered irrelevant and useless. Without the necessary context given by the timezone, you cannot infer what point in time your application is really referring to.

That means your application should never handle timestamps with no timezone information. It should try to guess the timezone, or raise an error, whenever no timezone is provided in its input.

Of course, you can decide that having no timezone information means UTC. This sounds very handy, but can also be dangerous in certain applications or languages – such as Python, as we'll see.

Indeed, in certain applications, converting timestamps to UTC and losing the timezone information is a terrible idea. Imagine that a user creates a recurring event every Wednesday at 10:00 in their local timezone, say CET. If you convert that to UTC, the event will end up being stored as every Wednesday at 09:00.

Now imagine that the CET timezone switches from UTC+01:00 to UTC+02:00: your application will compute that the event starts at 11:00 CET every Wednesday. Which is wrong, because as the user told you, the event starts at 10:00 CET, whatever the definition of CET is. Not at 11:00 CET. So CET means CET, not necessarily UTC+1.

As for endpoints like REST APIs, something I deal with daily, all timestamps should include timezone information. It's nearly impossible to know otherwise what timezone the timestamps are in: UTC? Server local? User local? There's no way to know.

Python design & defect

Python comes with a timestamp object named datetime.datetime. It can store date and time with microsecond precision, and is qualified as timezone "aware" or "unaware", depending on whether it embeds timezone information.

To build such an object based on the current time, one can use datetime.datetime.utcnow() to retrieve the date and time for the UTC timezone, and datetime.datetime.now() to retrieve the date and time for the current timezone, whatever it is.

>>> import datetime
>>> datetime.datetime.utcnow()
datetime.datetime(2015, 6, 15, 13, 24, 48, 27631)
>>> datetime.datetime.now()
datetime.datetime(2015, 6, 15, 15, 24, 52, 276161)


As you can notice, neither of these results contains timezone information. Indeed, the Python datetime API always returns unaware datetime objects, which is very unfortunate: as soon as you get one of these objects, there is no way to know what the timezone is, so they are pretty "useless" on their own.

Armin Ronacher proposes that an application always treat the unaware datetime objects from Python as UTC. As we just saw, that cannot be true for objects returned by datetime.datetime.now(), so I would not advise doing so. datetime objects with no timezone should be considered a "bug" in the application.

Recommendations

My recommendation list comes down to:

  1. Always use aware datetime objects, i.e. with timezone information. That makes sure you can compare them directly (aware and unaware datetime objects are not comparable) and will return them correctly to users. Leverage pytz to get timezone objects.
  2. Use ISO 8601 as input and output string format. Use datetime.datetime.isoformat() to return timestamps as string formatted using that format, which includes the timezone information.

In Python, that's equivalent to having:

>>> import datetime
>>> import pytz
>>> def utcnow():
...     return datetime.datetime.now(tz=pytz.utc)
>>> utcnow()
datetime.datetime(2015, 6, 15, 14, 45, 19, 182703, tzinfo=<UTC>)
>>> utcnow().isoformat()
'2015-06-15T14:45:21.982600+00:00'


If you need to parse strings containing ISO 8601 formatted timestamps, you can rely on the iso8601 module, which returns timestamps with correct timezone information. This makes timestamps directly comparable:

>>> import iso8601
>>> iso8601.parse_date(utcnow().isoformat())
datetime.datetime(2015, 6, 15, 14, 46, 43, 945813, tzinfo=<FixedOffset '+00:00' datetime.timedelta(0)>)
>>> iso8601.parse_date(utcnow().isoformat()) < utcnow()
True


If you need to store those timestamps, the same rule should apply. If you rely on MongoDB, it assumes that all the timestamps are in UTC, so be careful when storing them – you will have to normalize the timestamps to UTC.
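A sketch of that normalization with pytz (the timezone and values are just examples):

>>> import datetime
>>> import pytz
>>> paris = pytz.timezone('Europe/Paris')
>>> local = paris.localize(datetime.datetime(2015, 6, 16, 15, 0))
>>> local.astimezone(pytz.utc)
datetime.datetime(2015, 6, 16, 13, 0, tzinfo=<UTC>)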

For MySQL, nothing is assumed, it's up to the application to insert them in a timezone that makes sense to it. Obviously, if you have multiple applications accessing the same database with different data sources, this can end up being a nightmare.

PostgreSQL has a recommended data type for this, called timestamp with time zone, which can store the associated timezone and do all the computation for you. That's obviously the recommended way to store timestamps. This does not mean you should stop using UTC in most cases; it just means you can be sure the timezone is tracked from the moment the timestamp is written to the database, and you can check whether any other application inserted timestamps with a different timezone.

OpenStack status

As a side note, I've improved the OpenStack situation recently by changing the oslo.utils.timeutils module to deprecate some useless and dangerous functions. I've also added support for returning timezone-aware objects when using the oslo_utils.timeutils.utcnow() function. Unfortunately it's not possible to make that the default, for backward compatibility reasons, but it's there nevertheless, and using it is advised. Thanks to my colleague Victor for the help!

Have a nice day, whatever your timezone is!


DrupalCon News: Planning for Friends and Family at DrupalCon

Planet Drupal - Tue, 16/06/2015 - 17:59

As part of our extended Drupal family, many Drupalistas bring their spouse, significant other, friend or children along to DrupalCon. As we know, the Con is always jam-packed with sessions, BoFs and sprints that keep us busy; Barcelona will be no different. After the Drupalers have drained our brains at the convention center, we jaunt off to group dinners, sponsor parties or the coder lounge to continue getting our Drupal on.


Simon Josefsson: SSH Host Certificates with YubiKey NEO

Planet Debian - Tue, 16/06/2015 - 14:05

If you manage a bunch of server machines, you will undoubtedly have run into the following OpenSSH question:

The authenticity of host 'host.example.org (1.2.3.4)' can't be established.
RSA key fingerprint is 1b:9b:b8:5e:74:b1:31:19:35:48:48:ba:7d:d0:01:f5.
Are you sure you want to continue connecting (yes/no)?

If the server is a single-user machine, where you are the only person expected to login on it, answering “yes” once and then using the ~/.ssh/known_hosts file to record the key fingerprint will (sort-of) work and protect you against future man-in-the-middle attacks. I say sort-of, since if you want to access the server from multiple machines, you will need to sync the known_hosts file somehow. And once your organization grows larger, and you aren’t the only person that needs to login, having a policy that everyone just answers “yes” on first connection on all their machines is bad. The risk that someone is able to successfully MITM attack you grows every time someone types “yes” to these prompts.

Setting up one (or more) SSH Certificate Authority (CA) to create SSH Host Certificates, and having your users trust this CA, will allow you and your users to automatically trust the fingerprint of a host through the indirection of the SSH Host CA. I was surprised (but probably shouldn't have been) to find that deploying this is straightforward. Even setting this up with hardware-backed keys, stored on a YubiKey NEO, is easy. Below I will explain how to set this up for a hypothetical organization where two persons (sysadmins) are responsible for installing and configuring machines.

I’m going to assume that you already have a couple of hosts up and running and that they run the OpenSSH daemon, so they have a /etc/ssh/ssh_host_rsa_key* public/private keypair, and that you have one YubiKey NEO with the PIV applet and that the NEO is in CCID mode. I don’t believe it matters, but I’m running a combination of Debian and Ubuntu machines. The Yubico PIV tool is used to configure the YubiKey NEO, and I will be using OpenSC‘s PKCS#11 library to connect OpenSSH with the YubiKey NEO. Let’s install some tools:

apt-get install yubikey-personalization yubico-piv-tool opensc-pkcs11 pcscd

Every person responsible for signing SSH Host Certificates in your organization needs a YubiKey NEO. For my example, there will only be two persons, but the number could be larger. Each one of them will have to go through the following process.

The first step is to prepare the NEO. First mode switch it to CCID using some device configuration tool, like yubikey-personalization.

ykpersonalize -m1

Then prepare the PIV applet in the YubiKey NEO. This is covered by the YubiKey NEO PIV Introduction but I’ll reproduce the commands below. Do this on a disconnected machine, saving all files generated on one or more secure media and store that in a safe.

user=simon
key=`dd if=/dev/random bs=1 count=24 2>/dev/null | hexdump -v -e '/1 "%02X"'`
echo $key > ssh-$user-key.txt
pin=`dd if=/dev/random bs=1 count=6 2>/dev/null | hexdump -v -e '/1 "%u"'|cut -c1-6`
echo $pin > ssh-$user-pin.txt
puk=`dd if=/dev/random bs=1 count=6 2>/dev/null | hexdump -v -e '/1 "%u"'|cut -c1-8`
echo $puk > ssh-$user-puk.txt
yubico-piv-tool -a set-mgm-key -n $key
yubico-piv-tool -k $key -a change-pin -P 123456 -N $pin
yubico-piv-tool -k $key -a change-puk -P 12345678 -N $puk

Then generate an RSA private key for the SSH Host CA, and generate a dummy X.509 certificate for that key. The only use for the X.509 certificate is to make PIV/PKCS#11 happy — they want to be able to extract the public key from the smartcard, and do that through the X.509 certificate.

openssl genrsa -out ssh-$user-ca-key.pem 2048
openssl req -new -x509 -batch -key ssh-$user-ca-key.pem -out ssh-$user-ca-crt.pem

You import the key and certificate to the PIV applet as follows:

yubico-piv-tool -k $key -a import-key -s 9c < ssh-$user-ca-key.pem
yubico-piv-tool -k $key -a import-certificate -s 9c < ssh-$user-ca-crt.pem

You now have an SSH Host CA ready to go! The first thing you want to do is to extract the public key for the CA, and you use OpenSSH's ssh-keygen for this, specifying OpenSC's PKCS#11 module.

ssh-keygen -D /usr/lib/x86_64-linux-gnu/opensc-pkcs11.so -e > ssh-$user-ca-key.pub

If you happen to use YubiKey NEO with OpenPGP using gpg-agent/scdaemon, you may get the following error message:

no slots
cannot read public key from pkcs11

The reason is that scdaemon exclusively locks the smartcard, so no other application can access it. You need to kill scdaemon, which can be done as follows:

gpg-connect-agent "SCD KILLSCD" "SCD BYE" /bye

The output from ssh-keygen may look like this:

ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCp+gbwBHova/OnWMj99A6HbeMAGE7eP3S9lKm4/fk86Qd9bzzNNz2TKHM7V1IMEj0GxeiagDC9FMVIcbg5OaSDkuT0wGzLAJWgY2Fn3AksgA6cjA3fYQCKw0Kq4/ySFX+Zb+A8zhJgCkMWT0ZB0ZEWi4zFbG4D/q6IvCAZBtdRKkj8nJtT5l3D3TGPXCWa2A2pptGVDgs+0FYbHX0ynD0KfB4PmtR4fVQyGJjJ0MbF7fXFzQVcWiBtui8WR/Np9tvYLUJHkAXY/FjLOZf9ye0jLgP1yE10+ihe7BCxkM79GU9BsyRgRt3oArawUuU6tLgkaMN8kZPKAdq0wxNauFtH

Now all your users in your organization needs to add a line to their ~/.ssh/known_hosts as follows:

@cert-authority *.example.com ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCp+gbwBHova/OnWMj99A6HbeMAGE7eP3S9lKm4/fk86Qd9bzzNNz2TKHM7V1IMEj0GxeiagDC9FMVIcbg5OaSDkuT0wGzLAJWgY2Fn3AksgA6cjA3fYQCKw0Kq4/ySFX+Zb+A8zhJgCkMWT0ZB0ZEWi4zFbG4D/q6IvCAZBtdRKkj8nJtT5l3D3TGPXCWa2A2pptGVDgs+0FYbHX0ynD0KfB4PmtR4fVQyGJjJ0MbF7fXFzQVcWiBtui8WR/Np9tvYLUJHkAXY/FjLOZf9ye0jLgP1yE10+ihe7BCxkM79GU9BsyRgRt3oArawUuU6tLgkaMN8kZPKAdq0wxNauFtH

Each sysadmin needs to go through this process, and each user needs to add one line for each sysadmin. While you could put the same key/certificate on multiple YubiKey NEOs, to allow users to only have to put one line into their file, dealing with revocation becomes a bit more complicated if you do that. If you have multiple CA keys in use at the same time, you can roll over to new CA keys without disturbing production. Users may also have different policies for different machines, so that not all sysadmins have the power to create host keys for all machines in your organization.

The CA setup is now complete, however it isn't doing anything on its own. We need to sign some host keys using the CA, and to configure the hosts' sshd to use them. What you could do is something like this, for every host host.example.com that you want to create keys for:

h=host.example.com
scp root@$h:/etc/ssh/ssh_host_rsa_key.pub .
gpg-connect-agent "SCD KILLSCD" "SCD BYE" /bye
ssh-keygen -D /usr/lib/x86_64-linux-gnu/opensc-pkcs11.so -s ssh-$user-ca-key.pub -I $h -h -n $h -V +52w ssh_host_rsa_key.pub
scp ssh_host_rsa_key-cert.pub root@$h:/etc/ssh/

The ssh-keygen command will use OpenSC's PKCS#11 library to talk to the PIV applet on the NEO, and it will prompt you for the PIN. Enter the PIN that you set above. The output of the command would be something like this:

Enter PIN for 'PIV_II (PIV Card Holder pin)':
Signed host key ssh_host_rsa_key-cert.pub: id "host.example.com" serial 0 for host.example.com valid from 2015-06-16T13:39:00 to 2016-06-14T13:40:58

The host now has an SSH Host Certificate installed. To use it, you must make sure that /etc/ssh/sshd_config has the following line:

HostCertificate /etc/ssh/ssh_host_rsa_key-cert.pub

You need to restart sshd to apply the configuration change. If you now try to connect to the host, you will likely still use the known_hosts fingerprint approach. So remove the fingerprint from your machine:

ssh-keygen -R $h

Now if you attempt to ssh to the host with the -v parameter, you will see the following:

debug1: Server host key: RSA-CERT 1b:9b:b8:5e:74:b1:31:19:35:48:48:ba:7d:d0:01:f5
debug1: Host 'host.example.com' is known and matches the RSA-CERT host certificate.

Success!

One aspect that may warrant further discussion is the host keys. Here I only created host certificates for the hosts' RSA key. You could create host certificate for the DSA, ECDSA and Ed25519 keys as well. The reason I did not do that was that in this organization, we all used GnuPG's gpg-agent/scdaemon with YubiKey NEO's OpenPGP Card Applet with RSA keys for user authentication. So only the host RSA key is relevant.

Revocation of a YubiKey NEO key is implemented by asking users to drop the corresponding line for one of the sysadmins, and by regenerating the host certificates for the hosts that this sysadmin had created certificates for. This is one reason users should trust at least two CAs in your organization for signing host certificates, so that you can migrate away from one of them to the other without interrupting operations.


ThinkShout: A Tale of Two Devsigners

Planet Drupal - Tue, 16/06/2015 - 14:00

It’s June, which means Devsigner is just around the corner so, naturally, we’ve got design on the brain. What’s Devsigner? Well, I’m glad you asked. Devsigner is a conference held here in the Pacific Northwest geared towards front end developers and development-minded designers. Sessions focus on the relationship between design and web development, bridging the gap that separates the design from the code. The math looks like this: developer + designer = devsigner.

ThinkShout's own devsigners, Josh Riggs (User Experience Lead) and Eric Paxton (Front End Engineer), will be speaking at this conference at the end of the month. I sat down with Josh and Eric to learn a little bit more about their design process, and how we work with our nonprofit clients to ensure that their sites don't just work, but that they also deliver a fantastic user experience.

You two make up the dynamic design duo here at ThinkShout. What do your respective roles entail? How do you leverage your different skill sets?

Josh: My role as the UX lead right now is handling all aspects of user experience and visual design. I’m responsible for interpreting site maps and requirements, plus things like client/user needs and creating a user interface out of that. That starts with wireframing and ends with a visual design layer.

Eric: My role as Front End Engineer is very much in the implementation phase. Though I do advise in the discovery and budgeting phase, just so we can be sure that we can actually implement what the client wants. It’s nice because in the past, before joining the ThinkShout team, I’d done the whole gamut. From the requirements gathering phase to wireframing, and then the implementation. Here at ThinkShout, I’ve found my sweet spot. I do occasional wireframing, but I get to focus on lots of implementation. I also implement Josh’s designs. I write a lot of Javascript and Sass, basically.

Josh: Eric is like the alchemist. He takes the metals - the designs from me - and turns them into websites. There is actually a large spectrum in between where my responsibilities stop and Eric’s begin. We still talk about things like, how do we go from an idea being on a screen, to that idea being a functioning website? We’re constantly thinking about how to best utilize our respective skillsets, always reevaluating our process to improve upon it.

What’s a recent project that you’ve really enjoyed working on?

Eric: The SPLC (Southern Poverty Law Center) microsite. I thought that was very well done. Josh did a lot of the front end work on that and I came in and did the site optimization, which is what I’ll be talking about at Devsigner. I thought that went really smoothly because at that time, all the work he’d done in the browser went directly to implementation. We were able to take exactly what he’d designed and just build off of it.

Can you talk a little bit about what the design process for the SPLC microsite was like, Josh?

Josh: We happened to be working on that right around the same time as I was doing wireframes for the upcoming SPLC main site that we’re redesigning. We were already doing a lot of thinking about their content and what their needs were. Because the Selma: Bridge to the Ballot movie was coming out on the anniversary of the Selma March, we wanted to have this ready to go in time for that day. There was no way we were going to launch the whole SPLC site along with it - we were too early in development for that - so we decided to split that project up and give them a campaign microsite that would be easy to build while we continued to work on their main site.

A lot of that meant working with their team to define their content needs. I began with basic wireframes in Sketch, and uploaded them into Invision to give them interactivity. As SPLC came up with more fidelity to what their needs were, we solidified the visual designs. Luckily, they already had a lot of assets that their really great internal design team had created for the movie, so I was able to go off of that style. I took their visual style and applied it to the wireframes and at that point, I went to Eric for a consultation and said, "Ok, if we’re going to build this in Jekyll, what’s the best way to do this as far as the architecture goes?" Eric was a huge help in regards to file structure. He wrote a great rake script to automate all the Jekyll, Sass, and Javascript components. That’s when I jumped in and rebuilt what I’d done in Sketch, and added more fidelity with HTML and Sass. I then passed it onto to Eric so he could do his unicorn magic.

Eric: And that's a nice part about where our skills overlap: we can get closer to what we want. He's a better designer than I am. My strengths lie in the code. I've designed when I had to, but it's not my forte, so it's nice to have Josh's expertise. These skill sets complement each other. I feel comfortable handing over my implementation to design and saying, "Hey, can you polish the nav? Or the design?" Things like that.

What design trends do you want to see more of? Or less of?

Eric: I think flat design is getting boring. I’m starting to see a little bit more texture in the things we’ve done. Like patterns, not just flat design for the sake of flat design. There’s texture strategically used to make things look better. For instance, in the Capital Area Food Bank of Texas site, there’s a bit of a pattern in the footer. It’s not just a flat blue background with text. I really like patterns that are used to call out different sections of a design. It adds to it and brings something out of the page. It used to just be that admin interfaces were this flat. But now everything reflects that. Lots of rectangles. I personally like shapes and textures and patterns.

Josh: It’s tricky to know when to add life to what’s a very flat trend right now. I come from the old school world of web design, which was about how cool can you make your shadows look in Photoshop, how three-dimensional can you make things appear. Now that’s kind of like wearing skinny jeans in the late nineties, when you wouldn’t be caught dead wearing them. Or neon colors. So I think what’s happening is that it’s not just that flat design is popular. If you look at other design mediums, like automotive or architecture, there’s a phase with extreme ornate elements. You know, crazy fins, details, lights, every car had a custom badge. All that stuff. And then you have the modern era after that where everything gets streamlined and simplified. It’s more about the function over the form, and the function drives the form. You see the opposite in the Victorian era. Go walk along the St. Johns bridge and look up at a lamp. You’ll see these ornate, twisted little embellishments along the lamps. But the purpose of a lamp is to provide light. Those embellishments do nothing to support the function. They’re just there to make it look pretty.

I think we’re seeing a lot of that in digital design as it matures. We’re getting rid of the stuff that doesn’t support the function and focusing more on the intent of the users. While we’re taking that ornate-ness out of it, we’re also adding a lot more micro-interactions and animations. Things that actually help you do what you’re there to do. At first, I was kind of against that. But now that I think about it as post-modern design for the web, it makes more sense to me.

How do you advise nonprofits on this? Do these same trends benefit nonprofits as much as they do for-profits?

Eric: I think knowing your end user is what determines your path. A lot of nonprofits have similar goals as for-profits when it comes to their websites - they’re trying to tell a story and engage their users. But the main thing is, do the organizational goals reflect what the user is coming there for? For instance, we work with the LA Conservancy. They work to preserve historical buildings in LA. We didn’t just look at them, and then try to make their website look like a pretty building. But we also had this discussion in LA about form versus function. But I wonder, where does that meet in the middle? That’s what I struggle with. Because I do think there’s value in ornate elements like that. They set a mood. So I think that’s part of function - that ornateness sets the mood you want to present to your users to help them feel the connection to the organization’s cause.

Josh: Nearly every major design phase, whether it be automotive, architecture, art, whatever, there’s always a backlash to those current trends. So there will be backlash to flat web design. It may be a subculture, it may take over. But whenever something gets to be ubiquitous, there’s always someone who wants to do something totally different. It’ll be interesting to see what that is.

I feel like that’s the nature of creativity… We see something, we make it part of our process, plus a spark of something new.

Eric: We all have things we’re influenced by. To me, Google stands out. They’ve really led in the trends that people are using. There’s a level of depth to their designs that make me feel like I can reach out and grab it. It’s flat in some ways, but yeah, there’s definitely some depth.

Josh: Yeah, I think Google’s done a really great job. And you can see this happening in the app world. The current trend is also getting ubiquitous.

Devsigner is at the end of the month and you both are leading your own sessions. Can you tell us a bit about them?

Eric: My session is called "Optimization is User Experience." I think this is something everybody can use, which is why it’s listed as a beginner talk. We learn web design, we learn app design, we release these things to the world where we don’t have control over devices and users’ bandwidth, so it’s important to know that this beautiful thing you’ve created can be experienced correctly regardless of what device it’s viewed on.

Josh: So my session is based on something I’ve noticed. I worked on a lot of projects where there’s limited time, budget, or resources. Maybe there isn’t any resource for stock photography, or there’s just a really small team working on it. I’ve always had to find ways to be creative with what I have and with a small budget. I signed up to speak at Refresh Portland and I figured this might be a shared struggle and that other people could learn from my experience: how to stay under budget and still come up with a great, workable design. It’s called "Ballin’ on a Budget."

Want to dig deeper into design with Josh and Eric and pick their brains? Come to Devsigner, which takes place during June 27-28 at the Pacific Northwest College of Art in Portland, Oregon. Check out the full session schedule on the Devsigner site. You can also follow Josh and Eric on Twitter at @joshriggs and @epxtn.


InternetDevels: Best Drupal Video Player Modules

Planet Drupal - Tue, 16/06/2015 - 13:47

Greetings to all who want to add video integration to their Drupal website! Drupal module development never stops, offering us a large number of various modules for working with videos. I have hunted through a huge amount of Drupal video modules for you.

To begin with, you need to decide where you want to store your video, how you want to display it, etc.

Let's discuss the pros and cons of each method. Here we go!


KnackForge: Mitigating Apache Internal Dummy Connection issue

Planet Drupal - Tue, 16/06/2015 - 06:00

This is one of the issues that has been bothering us lately in one of our projects. I'm summarizing the list of causes and possible ways to fix or mitigate it. So what is Apache's internal dummy connection all about? The official wiki page explains it best; see the snippet below.

When the Apache HTTP Server manages its child processes, it needs a way to wake up processes that are listening for new connections. To do this, it sends a simple HTTP request back to itself. This request will appear in the access_log file with the remote address set to the loop-back interface (typically 127.0.0.1 or ::1 if IPv6 is configured). If you log the User-Agent string (as in the combined log format), you will see the server signature followed by "(internal dummy connection)" on non-SSL servers. During certain periods you may see up to one such request for each httpd child process.
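In the combined log format such an entry typically looks something like this (the timestamp and server signature here are illustrative):

::1 - - [16/Jun/2015:06:00:01 +0000] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.2.22 (Debian) (internal dummy connection)"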

#1: VirtualHost

As mentioned, Apache makes a call to itself. If your default VirtualHost is configured to serve a dynamic, database-driven site like Drupal, this will certainly result in increased resource utilization. Changing it to serve a static index.html makes the dummy HTTP request faster and less resource intensive. Even if you have directory listing, symbolic links and/or AllowOverride turned on, it is advisable to disable them.
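A minimal sketch of such a default VirtualHost (paths and names are placeholders):

<VirtualHost *:80>
    ServerName default.localhost
    DocumentRoot /var/www/default
    <Directory /var/www/default>
        # Serve a static index.html only: no listing, no symlinks,
        # no .htaccess processing.
        Options -Indexes -FollowSymLinks
        AllowOverride None
    </Directory>
</VirtualHost>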

#2: .htaccess Rewrite Rule

If the default VirtualHost can't be changed for some reason, mod_rewrite can stop the request from reaching Drupal with a rewrite rule, as sketched below.
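One hedged example, keying on the User-Agent signature quoted above and short-circuiting the request with a 403 before Drupal's front controller runs (place it above Drupal's own rules in .htaccess, where RewriteEngine is already on):

RewriteCond %{HTTP_USER_AGENT} ^.*internal\ dummy\ connection.*$ [NC]
RewriteRule .* - [F,L]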


Mediacurrent: Contrib Committee Status Review for May, 2015

Planet Drupal - Mon, 15/06/2015 - 22:18

As with most other Drupal development studios, our May was dominated by DrupalCon. For the first week we were doing final preparations - making sure everything was ready for our booth, adding the final polish to our presentations, and packing for the trip. Needless to say, it was an excellent week from many perspectives, and we look forward to DrupalCon being in New Orleans next year.


Lunar: Reproducible builds: week 7 in Stretch cycle

Planet Debian - Mon, 15/06/2015 - 19:33

What happened in the reproducible builds effort this week:

Presentations

On June 7th, Reiner Herrmann presented the project at the Gulaschprogrammiernacht 15 in Karlsruhe, Germany. Video and audio recordings in German are available, and so are the slides in English.

Toolchain fixes
  • Joachim Breitner uploaded ghc/7.8.4-9 which uses a hash of the command line instead of the pid when calculating a “random” directory name.
  • Lunar uploaded mozilla-devscripts/0.42 which now properly sets the timezone. Patch by Reiner Herrmann.
  • Dmitry Shachnev uploaded python-qt4/4.11.4+dfsg-1 which now outputs the list of imported modules in a stable order. The issue has been fixed upstream. Original patch by Reiner Herrmann.
  • Norbert Preining uploaded tex-common/6.00 which tries to ensure reproducible builds in files generated by dh_installtex.
  • Barry Warsaw uploaded wheel/0.24.0-2 which makes the output deterministic. Barry has submitted the fixes upstream based on patches by Reiner Herrman.

Daniel Kahn Gillmor's report on help2man started a discussion with Brendan O'Dea and Ximin Luo about standardizing a common environment variable that would provide a replacement for an embedded build date. After various proposals and research by Ximin about date handling in several programming languages, the best solution seems to define SOURCE_DATE_EPOCH with a value suitable for gmtime(3).
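In practice the variable would carry a single integer of seconds since the Unix epoch; a sketch with GNU date (the variable name is the one proposed in the discussion):

$ export SOURCE_DATE_EPOCH=$(date -d '2015-06-15 00:00:00 UTC' +%s)
$ date -u -d "@$SOURCE_DATE_EPOCH"
Mon Jun 15 00:00:00 UTC 2015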

  • Martin Borgert wondered if Sphinx could be changed in a way that would avoid having to tweak debian/rules in packages using it to produce HTML documentation.

Daniel Kahn Gillmor opened a new report about icont producing unreproducible binaries.

Packages fixed

The following 32 packages became reproducible due to changes in their build dependencies: agda, alex, c2hs, clutter-1.0, colorediffs-extension, cpphs, darcs-monitor, dispmua, haskell-curl, haskell-glfw, haskell-glib, haskell-gluraw, haskell-glut, haskell-gnutls, haskell-gsasl, haskell-hfuse, haskell-hledger-interest, haskell-hslua, haskell-hsqml, haskell-hssyck, haskell-libxml-sax, haskell-openglraw, haskell-readline, haskell-terminfo, haskell-x11, jarjar-maven-plugin, kxml2, libcgi-struct-xs-perl, libobject-id-perl, maven-docck-plugin, parboiled, pegdown.

The following packages became reproducible after getting fixed:

Some uploads fixed some reproducibility issues but not all of them:

Patches submitted which did not make their way to the archive yet:

reproducible.debian.net

A new variation has been introduced to better detect when a package captures its build environment. (h01ger)

The test on Debian packages works by building the package twice in a short time frame. But sometimes a mirror push can happen between the first and the second build, resulting in a package built against a different build environment. This situation is now properly detected and triggers a third build automatically. (h01ger)

OpenWrt, the distribution specialized in embedded devices like small routers, is now being tested for reproducibility. The situation looks very good for their packages, which seem mostly affected by timestamps in the tarballs. System images will require more work on debbindiff to be better understood. (h01ger)

debbindiff development

Reiner Herrmann added support for decompiling Java .class files and .ipk package files (used by OpenWrt). This is now available in version 22, released on 2015-06-14.

Documentation update

Stephen Kitt documented the new --insert-timestamp option, available since binutils-mingw-w64 version 6.2, which inserts a ready-made date in PE binaries built with mingw-w64.

Package reviews

195 obsolete reviews have been removed, 65 added and 126 updated this week.

New identified issues:

Misc.

Holger Levsen reported an issue with the locales-all package, which Provides: locales but is actually missing some of the files provided by locales.

Coreboot upstream has been quick to react after the announcement of the tests set up the week before: Patrick Georgi fixed all issues in a couple of days, and all Coreboot images are now reproducible (without a payload). SeaBIOS, one of the most frequently used payloads on PC hardware, can now be made reproducible too.

Paul Kocialkowski wrote to the mailing list asking for help on getting U-Boot tested for reproducibility.

Lunar had a chat with maintainers of Open Build Service to better understand the difference between their system and what we are doing for Debian.


Drupal Watchdog: Caffeinated Drupal

Planet Drupal - Mon, 15/06/2015 - 18:42

Once upon a time, I drank coffee purely to wake myself up in the morning or to stay awake during a late night coding marathon. Eventually, I gained an appreciation for the different flavors, smells, and textures to be found in different coffees and brewing methods. That appreciation has grown into a pursuit of the perfect cup of coffee which, while it may never be achieved, provides me with a fun hobby as well as an endless supply of amazing coffee.

Performance tuning a website is another of those endless pursuits wherein you may never actually reach a happy ending.

Is there such a thing as a perfectly performing website? The answer to that question is much like the perfect cup of coffee: perfection lies in the eye of the beholder. While we may not ever be able to achieve a perfectly performing website, we can certainly define goals for what would be considered a well performing site. And by precisely measuring aspects of the site’s performance, we can know if our adjustments are moving us in the right direction or not.

Of course, when defining performance goals, like any project, it’s best to begin at the beginning. In this case there’s no better place to start than a nice cup of Kenya Peaberry, brewed in a manual pour-over to bring out the amazing citrus fruitiness (with a touch of spice). Mmmm, if that’s not nirvana, it sure is close! Now we can jump right in.

Defining Performance Goals

As I mentioned, we need to define goals in order to know where we’re going with the performance tuning, otherwise we’re likely to get people working on random performance improvements that may or may not meet our business requirements. The more specific the goals, the better. Here are a few ideas to get us going:

  • The front page must load in under X seconds.
  • The site must support at least Y concurrent users.
  • Popular entry points to the site must load in under Z seconds.
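Whatever the exact targets, measure them the same way every time. As a hedged sketch, ApacheBench can exercise a page at a fixed concurrency and report response times (the URL and numbers are placeholders):

ab -n 1000 -c 50 http://www.example.org/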

The important point here is to create an authoritative list which will get everyone on the same page and ensure they understand exactly what they're working towards. Even if you are a team of one, this is still a great way to define an endpoint for your (current) performance work.


Microserve: Setting Up Drupal Bootstrap

Planet Drupal - Mon, 15/06/2015 - 18:02

For those looking for a reliable, responsive front-end framework on which to base their website or Drupal theme, Twitter Bootstrap can be hard to beat. Luckily there is an existing contributed theme available to take the hard work out of integrating Bootstrap and Drupal... well, nearly all the hard work.

This step-by-step tutorial hopes to serve as an extension to the existing documentation for Drupal Bootstrap, striving to fill in a few blanks and signpost the odd 'gotcha' that can potentially leave the novice banging their head against the monitor. It assumes you already have a decent grasp of the Drupal folder structure and a knowledge of the LESS CSS preprocessor.

Drupal Bootstrap Theme

Download the latest version of the Bootstrap Drupal Theme.
https://www.drupal.org/project/bootstrap

Unzip the contents into the sites/all/themes/ folder of your drupal site.

Copy the folder 'bootstrap_subtheme' and place the copy in the root of your regular sites/all/themes/ folder (You should now have two separate theme folders 'Bootstrap' and 'bootstrap_subtheme' at the same level in your theme folder structure).

Before anything else, rename the 'subtheme' copy to reflect the project you are working on. (for the purposes of this tutorial we'll name ours 'mytheme')

Bootstrap Editable Source Files

Bootstrap Drupal Theme provides the core framework to use bootstrap within Drupal, but we still need to include the latest working distribution of the editable bootstrap source files themselves.

In the future this should be possible using drush, but for now there are two methods for including these files: either linking to the CDN, which is convenient but does not give us full editability of the LESS files, or downloading the files to our theme to be used locally. Further info: https://www.drupal.org/node/1978010

We want to choose the second method...

  1. Download the latest distribution of bootstrap from: http://getbootstrap.com/getting-started/#download (Choose the second, 'SOURCE CODE' version.)
  2. Download to the root of your new sub_theme (mytheme).
  3. Unzip and rename the unzipped folder 'bootstrap'. (Yep this is where it can seem confusing, you will now have a new folder called 'bootstrap' inside your new bootstrap sub_theme)
  4. Inside your new subtheme edit the .info file. On the first line change 'name =' value to match your new theme name ('mytheme' in this instance).
  5. Now we need to tell the theme which method to use for including the Bootstrap distribution. Towards the bottom of the  .info file, uncomment all lines under the heading 'METHOD 1: Bootstrap Source Files'  (yes, all those JS files.)
LESS Preprocessor Method

Although you can run a (very restrictive) installation of Bootstrap using standard CSS, it's unlikely you'll want to pass up access to the wealth of in-built variables and mixins available in the core LESS files, so now we need to choose which method of LESS compilation we want to use.

If you wish to install and use a local LESS compiler, you can leave the .info file set to use /css/style.css and then set your preprocessor to compile all LESS files to this file.

*I recommend, however, using the Drupal LESS module to let Drupal do the compiling for you in the browser. For this method, change the 'Stylesheets' entry in the .info file to point directly to /less/style.less.
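For reference, the relevant lines of the sub-theme's .info file would then look something like this (theme name taken from this tutorial):

name = mytheme
stylesheets[all][] = less/style.less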

For this method to work, you will need to download and install the drupal Less module here:
https://www.drupal.org/project/less

Secondly download the Preprocessor library (lessphp) from:
http://lessphp.gpeasy.com/#integration-with-other-projects

to /sites/all/libraries/; unzip and rename the folder to 'lessphp'.

Enable the LESS module (if you haven't already) and go to /admin/config/development/less in the Drupal admin menu.

Choose 'less.php' as your LESS engine and turn on 'Developer Mode' (this will ensure LESS files are recompiled on each page load). *Make sure this is turned OFF before the site goes live.

Turn On The Theme

If you haven't already, enable your sub_theme and make it the default theme.

Disable the main Bootstrap theme (it doesn't need to be enabled for the subtheme to work.)

Clear your drupal cache and you should be good to go.

JQuery Update

For Bootstrap to run properly, you will need jQuery at version 1.7 or above. Make sure you have the jQuery Update module installed and set to 1.7 or above. (I've run Bootstrap on 1.10 without problems.)

You can change the version on the JQuery Update config page, or specifically for the theme, you can switch the version on your bootstrap sub_theme's theme settings page. 

*If you have selected a version of JQuery 1.7 or above and you're still getting drupal errors complaining that Bootstrap requires this version, you can choose to 'Suppress jQuery version error message' under Advanced on the sub_theme settings page. 

Missing Variable Errors?

Sometimes the Drupal Bootstrap theme can fall out of sync with the latest Bootstrap version.

If after enabling the subtheme you get lots of red errors about missing variables, do the following:

Inside your subtheme:

Make a COPY of the latest variables.less from the distribution files (mytheme/bootstrap/less/variables.less) and use it to REPLACE the version in your theme's custom files (mytheme/less/variables.less)
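In shell terms, using the paths from this tutorial:

cp mytheme/bootstrap/less/variables.less mytheme/less/variables.less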

This should stop bootstrap looking for out of date variables.

Page Templates

While you 'could' copy the page.tpl.php and html.tpl.php templates from Drupal core and set about adding all the necessary Bootstrap classes and regions to them, it makes much more sense to start off by making copies of the versions supplied inside the main Bootstrap parent theme, where most of the groundwork has been done for you.

You can find the templates at bootstrap/theme/ (where 'bootstrap' is the main parent theme installed from drupal.org); the page and html templates are inside the 'system' sub-folder.

Copy the templates you need to your sub_theme's template folder (create one if there isn't one already).

Bootstrap LESS Files

In your sub_theme, you will initially have the following LESS files:

  • bootstrap.less 
    Never edit this. Its only purpose is to import all of bootstrap's core less files - the integral part of the framework.
  • overrides.less
    You will sometimes want to edit some values in this file. It mainly contains Drupal-specific resets and 'overrides'.
  • variables.less
    This is where you can change the values of default bootstrap variables to set site wide typography, form styles, grid styles, branding etc. VERY USEFUL
  • styles.less
    This is initially empty other than a few import declarations. This, like a normal style.css or .less file, is where you will put the bulk of your project-specific custom LESS code.
  • header.less, content.less, footer.less
    I don't personally tend to find any use for these region-specific files; they can safely be deleted if you don't intend to use them. If you do delete them, also make sure to delete their import declarations from the top of 'style.less'.
Custom Variables

You could create a new LESS file for your own custom variables, but I find a lot of my custom variables are additions to existing Bootstrap variable structures (for instance, there is already a @brand-primary color value in variables.less, and I nearly always add a @brand-secondary color), so it makes sense to include them in the same file and flow. So I add my variables to the existing file, making one consolidated, semantic file.
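For instance (the color values here are placeholders):

// Stock Bootstrap variable, tweaked for the project:
@brand-primary:   #005599;
// Custom addition, kept right next to the variable it complements:
@brand-secondary: #ee6622;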

Custom Mixins

Mixins are a little different. You can just include them in style.less, or at the bottom of the existing overrides.less file. (You can include them anywhere really, but as you will often want to use variables within your mixins, it's advisable to declare them after all variables have been imported from Bootstrap and your own custom files.) I think the neatest way is to create a new custom LESS file and keep all the custom mixins separate. For instance, on my current project I've created a 'custom-mixins.less' file and imported it into style.less straight AFTER the existing imports, like so:

// Bootstrap library.
@import 'bootstrap.less';

// Base-theme overrides.
@import 'overrides.less';

// Theme specific.
@import 'custom-mixins.less';

Wait!? Where was variables.less in those import declarations? 

Well, one thing to be careful of is that you don't want to import the same file into more than one other LESS file directly; that would in essence mean the entire file is imported twice. Because variables.less has already been imported into overrides.less, its content is inherited through importing overrides.less into the file above.

Here's a diagram to try and better explain the bootstrap .less inheritance flow, mentioned above: 

In Conclusion

Hopefully these tips will be of use and help you navigate the initially daunting landscape of getting Drupal and Bootstrap to play nicely together.

This guide is based on the workflow I have personally found most logical and efficient, but if you have other methods or further tips to 'share with the class', feel free to leave a comment below.

My closing 'top tip' for developing with Bootstrap, Drupal or otherwise, is to always have the Bootstrap site open in a tab, for easy reference to its existing grid structure, variables, mixins, JS plugins and info.

Martin White

Acquia: Build Your Drupal 8 Team: The Forrester Digital Maturity Model

Planet Drupal - Mon, 15/06/2015 - 16:46

In business, technology is a means to an end, and using it effectively to achieve that end requires planning and strategy.

The Capability Maturity Model, designed for assessing the formality of a software development process, was initially described back in 1989. The Forrester Digital Maturity Model is one of several models that update the CMM for modern software development in the age of e-commerce and mobile development, when digital capability isn't an add-on but rather is fundamental to business success. The model emphasizes communicating strategy while putting management and control processes into place.

Organizations that are further along within the maturity model are more likely to repeatedly achieve successful completion of their projects.

Let's take a look at the stages of this model, as the final post in our Build Your Drupal 8 Team series.

Here are the stages:

Stage 1 is ad hoc development. When companies begin e-commerce development, there is no defined strategy, and the companies' products are not integrated with other systems. Most products are released in isolation and managed independently.

Stage 2 organizations follow a defined process model. The company is still reactive and managing projects individually, but the desired digital strategy has been identified.

Stage 3 is when the digital strategy and implementation is managed. An overall environment supportive for web and e-commerce development exists, and products are created within the context of that environment.

In Stage 4, the digital business needs are integrated. Products aren't defined in isolation, but rather are part of an overall strategic approach to online business. The company has a process for planning and developing the products and is focused on both deployment and ongoing support.

The final capability level, Stage 5, is when digital development is optimized. Cross-channel products are developed and do more than integrate: they are optimized for performance. The company is able to focus on optimizing the development team as well, with continuous improvement and agile development providing a competitive advantage.

Understanding where your company currently finds itself on the maturity scale can help you plan how you will integrate and adapt the new functionality of Drupal 8 into your development organization.

If you are an ad hoc development shop, adopting Drupal 8 and achieving its benefits may be very challenging for you. You may need to work with your team to move up at least one maturity level before you try to bring in the new technology.

In contrast, if your team is at stage 5, you can work on understanding how Drupal 8 will benefit not just your specific upcoming project, but also everything else that is going on within your organization.

Resources:

  • A comprehensive SlideShare presentation on Digital Maturity Models.
  • A blog post by Forrester's Martin Gill that mentions the Digital Maturity Model in the context of digital acceleration.
Tags: acquia drupal planet
Categories: Elsewhere

Petter Reinholdtsen: Graphing the Norwegian company ownership structure

Planet Debian - Mon, 15/06/2015 - 14:00

It is a bit of work to figure out the ownership structure of companies in Norway. The information is publicly available, but one needs to recursively look up ownership for all owners to figure out the complete ownership graph of a given set of companies. To save myself the work in the future, I wrote a script to do this automatically, outputting the ownership structure using the Graphviz/dotty format. The data source is web scraping from Proff, because I failed to find a useful source directly from the official keepers of the ownership data, Brønnøysundsregistrene.

To get an ownership graph for a set of companies, fetch the code from git and run it using the organisation number. I'm using the Norwegian newspaper Dagbladet as an example here, as its ownership structure is very simple:

% time ./bin/eierskap-dotty 958033540 > dagbladet.dot

real    0m2.841s
user    0m0.184s
sys     0m0.036s
%

The script accepts several organisation numbers on the command line, allowing a cluster of companies to be graphed in the same image. The resulting dot file for the example above looks like this. The edges are labeled with the ownership percentage, and the nodes use the organisation number as their name and the company name as the label:

digraph ownership {
  rankdir = LR;
  "Aller Holding A/s" -> "910119877" [label="100%"]
  "910119877" -> "998689015" [label="100%"]
  "998689015" -> "958033540" [label="99%"]
  "974530600" -> "958033540" [label="1%"]
  "958033540" [label="AS DAGBLADET"]
  "998689015" [label="Berner Media Holding AS"]
  "974530600" [label="Dagbladets Stiftelse"]
  "910119877" [label="Aller Media AS"]
}

To view the ownership graph, run "dotty dagbladet.dot" or convert it to a PNG using "dot -T png dagbladet.dot > dagbladet.png". The result can be seen below:

Note that I suspect the "Aller Holding A/S" entry to be incorrect data in the official ownership register, as that name is not registered in the official company register for Norway. The ownership register is sensitive to typos, and there seems to be no strict checking of the ownership links.

Let me know if you improve the script or find better data sources. The code is licensed according to GPL 2 or newer.

Update 2015-06-15: Since the initial post I've been told that "Aller Holding A/S" is a Danish company, which explains why it did not have a Norwegian organisation number. I've also been told that there is a web services API available from Brønnøysundsregistrene, for those willing to accept the terms or pay the price.

Categories: Elsewhere

Annertech: Web Development on Fire? Smoke testing a Drupal Website

Planet Drupal - Mon, 15/06/2015 - 12:57

Documenting code 10 years ago was always something that I wanted to do, but, let's face it: clients didn't give a damn, so unless you did it for free, it rarely happened. And I felt very sorry for the developer who had to fix any bugs without documentation (yes, even my code contains bugs from time to time!).

Categories: Elsewhere

Drupal core announcements: Recording from June 12th 2015 Drupal 8 critical issues discussion

Planet Drupal - Mon, 15/06/2015 - 11:56

It came up multiple times at recent events that it would be very helpful for people working significantly on Drupal 8 critical issues to get together more often to talk about the issues and unblock each other where discussion is needed. While these meetings do not by any means replace the issue queue discussions (just as in-person meetings at events do not), they do help to unblock things much more quickly. We also don't believe that the number of people working on critical issues, or who those people are, should be limited, so we did not want to keep the discussions closed. After our second meeting last week, here is the recording of the third meeting from today, in the hope that it helps more than just those who were on the call:

Unfortunately not all people invited made it this time. If you also have significant time to work on critical issues in Drupal 8 and we did not include you, let me know as soon as possible.

The issues mentioned were as follows:

Alex Pott
Rebuilding service container results in endless stampede: https://www.drupal.org/node/2497243
Twig placeholder filter should not map to raw filter: https://www.drupal.org/node/2495179

Francesco Placella
https://www.drupal.org/project/issues/search/drupal?project_issue_followers=&status[]=Open&priorities[]=400&version[]=8.x&component[]=entity+system&component[]=field+system&component[]=language+system&component[]=content_translation.module&component[]=language.module&component[]=views.module&issue_tags_op=%3D
FieldItemInterface methods are only invoked for SQL storage and are inconsistent with hooks: https://www.drupal.org/node/2478459

Lee Rowlands
Make block context faster by removing onBlock event and replace it with loading from a BlockContextManager: https://www.drupal.org/node/2354889

Francesco Placella
FieldItemInterface methods are only invoked for SQL storage and are inconsistent with hooks: https://www.drupal.org/node/2478459

Alex Pott
Rewrite \Drupal\file\Controller\FileWidgetAjaxController::upload() to not rely on form cache https://www.drupal.org/node/2500527

Gábor Hojtsy
Twig placeholder filter should not map to raw filter: https://www.drupal.org/node/2495179

Daniel Wehner
drupal_html_id() considered harmful; remove ajax_html_ids to use GET (not POST) AJAX requests: https://www.drupal.org/node/1305882

Francesco Placella
Node revisions cannot be reverted per translation: https://www.drupal.org/node/2453153
https://www.drupal.org/project/issues/search/drupal?project_issue_followers=&status[]=Open&priorities[]=400&version[]=8.x&issue_tags_op=%3D&issue_tags=D8+upgrade+path

Daniel Wehner
SA-CORE-2014-002 forward port only checks internal cache: https://www.drupal.org/node/2421503

Francesco Placella
Nat: it would be good to have your feedback on the proposed solution to the translation revisions issue, aside from its criticality (see https://www.drupal.org/node/2453153#comment-9991563 and following)

Fabian Franz
[PP-2] Remove support for #ajax['url'] and $form_state->setCached() for GET requests: https://www.drupal.org/node/2502785
Condition plugins should provide cache contexts AND cacheability metadata needs to be exposed: https://www.drupal.org/node/2375695
Make block context faster by removing onBlock event and replace it with loading from a BlockContextManager: https://www.drupal.org/node/2354889

Alex Pott
[meta] Identify necessary performance optimizations for common profiling scenarios: http://drupal.org/node/2470679

Nathaniel Catchpole
Core profiling scenarios: https://www.drupal.org/node/2497185
Node::isPublished() and Node:getOwnerId() are expensive: https://www.drupal.org/node/2498919
And User:getAnonymousUser() takes 13ms due to ContentEntityBase::setDefaultLangcode() (https://www.drupal.org/node/2504849) is similar.

Categories: Elsewhere

Jim Birch: Using CKFinder to organize image uploads by Content type in Drupal 7

Planet Drupal - Mon, 15/06/2015 - 11:00

As you may have noticed, /sites/default/files can quickly become a pretty busy place in your Drupal installation.  When creating image or file fields, we can add folders in the Drupal UI to organize the uploads.  But when we allow users to upload using the CKEditor WYSIWYG Editor, we have to work a bit harder to organize those uploads.

I am currently working on a project where we want to organize the uploads by content type. Certain users have access to certain content types, and we want to keep that separation going with the files. Our goal is to have the WYSIWYG uploads land in the same folder as the "featured image" field uploads for each content type, which is /sites/default/files/[content-type].

What I quickly learned was that IMCE is great in so many ways, and part of our normal Drupal install, but there is no obvious way to do this. You can use IMCE to organize uploads in a variety of different ways, like PHP date-based folders and user ID folders. You could even have a role-based system by creating an IMCE profile per role. But I couldn't figure out a way to organize by field or content type.

CKFinder to the rescue.  CKFinder is a premium file manager plugin for CKEditor.  When integrated with the CKEditor Drupal Module, both can be customized right in the Drupal UI.

Read more

Categories: Elsewhere

Alessio Treglia: How to have a successful OpenStack project

Planet Debian - Mon, 15/06/2015 - 10:30

It’s no secret that OpenStack is becoming the de-facto standard for private cloud and a way for telecom operators to differentiate themselves from big names such as Amazon or Google.
OpenStack has already been adopted in some specific projects, but wide adoption in enterprises is only starting now, mostly because people simply find it difficult to understand. VMware is still the obvious point of comparison, but OpenStack and cloud are different. While cloud implies virtualization, virtualization is not cloud.

Cloud is a huge shift for your organization and will forever change the way you work on IT projects, improving your IT dramatically and cutting costs.

In order to get the best of OpenStack, you need to understand deeply how cloud works. Moreover, you need to understand the whole picture beyond the software itself to provide new levels of agility, flexibility, and cost savings in your business.

Giuseppe Paterno’, a leading European consultant who was recently awarded by HP, wrote OpenStack Explained to guide you through the OpenStack technology and reveal his secret ingredient for a successful project. You can download the ebook for a small donation to provide emergency and reconstruction aid for Nepal. Your donation is certified by ZEWO, the Swiss federal agency that ensures that funds go to a real charity project.

… but hurry up: the ebook is a limited edition, and the offer ends in July 2015.

Donate & Download here: https://life-changer.helvetas.ch/openstack

Categories: Elsewhere

PreviousNext: How to index panelizer node pages using Drupal Apache Solr module

Planet Drupal - Mon, 15/06/2015 - 09:44

Apache Solr Search is a great module for integrating your Drupal site with the powerful Apache Solr search tool. Out of the box it can index nodes and their fields, but Panelizer pages won't be indexed. In this post I show how you can get around this by indexing the rendered HTML of a Panelizer node page.
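The full walkthrough lives in the linked post, but the general idea can be sketched roughly like this (a hedged sketch, not the post's exact code; the module name and the $entity->panelizer check are assumptions):

/**
 * Implements hook_apachesolr_index_document_build().
 */
function mymodule_apachesolr_index_document_build(ApacheSolrDocument $document, $entity, $entity_type, $env_id) {
  // Only act on panelized nodes.
  if ($entity_type == 'node' && !empty($entity->panelizer)) {
    // node_view() builds the configured display; for a panelized node
    // the 'full' view mode is rendered by Panelizer.
    $build = node_view($entity, 'full');
    $html = drupal_render($build);
    // Replace the document body with the cleaned-up rendered markup.
    $document->content = apachesolr_clean_text($html);
  }
}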

Categories: Elsewhere

Web Omelette: Drupal 8: custom data on configuration entities using the ThirdPartySettingsInterface

Planet Drupal - Mon, 15/06/2015 - 09:00

In this article we are going to look at how to use the ThirdPartySettingsInterface to add some extra data to existing configuration entities. For example, if you ever need to store some config together with a node type or a taxonomy vocabulary, this interface offers a great way to do so. Today we are going to see an example of this: we will add an extra field to the menu definition and store its value in this way.

There are a number of steps involved in this process. First, we need to alter the form with which the entity configuration data is added and saved. In the case of the menu entity there are two forms (one for adding and one for editing) so we need to alter them both. We can do something like this:

/**
 * Implements hook_form_alter().
 */
function my_module_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  if ($form_id === 'menu_add_form' || $form_id === 'menu_edit_form') {
    my_module_alter_menu_forms($form, $form_state, $form_id);
  }
}

Inside this general hook_form_alter() implementation we delegate the logic to a custom function if the form is one of the two we need. Alternatively, you can implement hook_form_FORM_ID_alter() for each of those forms and delegate from both, which would cut down a bit on the function calls; a quick sketch of that variant follows.
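A minimal sketch of the alternative (only the add form is shown; the edit form would get a matching my_module_form_menu_edit_form_alter()):

/**
 * Implements hook_form_FORM_ID_alter() for menu_add_form.
 */
function my_module_form_menu_add_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  my_module_alter_menu_forms($form, $form_state, $form_id);
}

Either way, the logic ends up in our custom function: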

/**
 * Handles the form alter for the menu_add_form and menu_edit_form forms.
 */
function my_module_alter_menu_forms(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  $menu = $form_state->getFormObject()->getEntity();
  $form['my_text_field'] = array(
    '#type' => 'textfield',
    '#title' => t('My text field'),
    '#description' => t('This is some extra data'),
    '#default_value' => $menu->getThirdPartySetting('my_module', 'my_text_field'),
    '#weight' => 1,
  );
  if (isset($form['links'])) {
    $form['links']['#weight'] = 2;
  }
  $form['#entity_builders'][] = 'my_module_form_menu_add_form_builder';
}

In here we do a couple of things. First, we retrieve the configuration entity object the form is currently editing. Then we define a new textfield and add it to the form. Next, we check whether the form has menu links on it (meaning it's probably the edit form), in which case we make their weight higher than that of our new field (just so the form looks nicer). And last, we add a new #entity_builder to the form, which will be triggered when the form is submitted.

The getThirdPartySetting() method on the entity object is provided by the ThirdPartySettingsInterface, which all configuration entities have by default if they extend from the ConfigEntityBase class. With this method we simply retrieve a value stored as a third party setting for a given module (my_module in this case). It will return NULL if none is set, so we don't even need to provide a default in this case.

Let us now turn to our #entity_builder which gets called when the form is submitted and is responsible for mapping data to the entity:

/**
 * Entity builder for the menu configuration entity.
 */
function my_module_form_menu_add_form_builder($entity_type, \Drupal\system\Entity\Menu $menu, &$form, \Drupal\Core\Form\FormStateInterface $form_state) {
  if ($form_state->getValue('my_text_field')) {
    $menu->setThirdPartySetting('my_module', 'my_text_field', $form_state->getValue('my_text_field'));
    return;
  }
  $menu->unsetThirdPartySetting('my_module', 'my_text_field');
}

Inside we check whether our textfield was filled in and, if so, save its value as a third party setting on the config entity object that is passed as an argument. If the form value is empty, we unset the third party setting to remove any lingering data that might still be stored there.
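Reading the value back later is just as simple. A minimal sketch (the 'main' menu ID and the fallback value are assumptions for illustration):

$menu = \Drupal\system\Entity\Menu::load('main');
// Returns the stored value, or 'fallback' if nothing was set.
$value = $menu->getThirdPartySetting('my_module', 'my_text_field', 'fallback');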

And that's pretty much it for the business logic. We can clear the cache and try this out by creating/editing a menu and storing new data with it. However, our job is not quite finished. We need to add our configuration schema so that it becomes translatable. Inside the /config/schema/my_module.schema.yml file of our module we need to add this:

system.menu.*.third_party.my_module:
  type: mapping
  label: 'My module textfield'
  mapping:
    my_text_field:
      type: text
      label: 'My textfield'

With this schema definition we are basically appending to the schema of the system.menu config entity by specifying some metadata about the third party settings our module provides. For more information on config schemas be sure to check out the docs on Drupal.org.

Now if we reinstall our module and turn on configuration translation, we can translate the values users add to my_text_field. You go to admin/config/regional/config-translation/menu, select a menu and when translating in a different language you see a new Third Party Settings fieldset containing all the translatable values defined in the schema.

Hope this helps.

Categories: Elsewhere

Chen Hui Jing: Developing Drupal sites as a team

Planet Drupal - Mon, 15/06/2015 - 02:00

A lot of people, myself included, start out with Drupal on their own, developing and building everything as a one-person operation. When we’re working by ourselves, there will be certain good practices that we neglect, either out of convenience (there’s no point doing X since I’m the only one touching this project), or out of ignorance (wow, I had no idea that was how Y was supposed to be used).

Working with a team of people to build a Drupal site (or any other development project) requires more structure and discipline to ensure the project doesn’t descend into a pile of spaghetti code. I’m going to try to summarise the processes that worked for my team thus far. I...

Categories: Elsewhere


Subscribe to jfhovinne aggregator - Elsewhere