Elsewhere

Gunnar Wolf: Mozilla: So our communitary echobox *does* resound with social issues

Planet Debian - Fri, 04/04/2014 - 03:16

I woke up to the news that, after a very short tenure, Brendan Eich steps down as the Mozilla CEO.

Why? Because of the community outcry. Because some years ago, Eich publicly supported (and donated funds to) a ban on any kind of marriage in California that was not between a man and a woman. The world has advanced enormously in this regard in recent years and decades, and so many individuals and organizations opposed the appointment and announced they would boycott Mozilla that neither he nor Mozilla could withstand the pressure any longer.

So, of course, it's sad the person had to resign. Many people talked about freedom of speech, about the freedom to hold his own personal opinion — But when it comes to the rights of minorities, particularly minorities that have suffered such harsh prejudice and abuse as gay, lesbian and all the other non-mainstream sexual and gender orientations, righting a wrong is much more important than preserving an individual's freedom of opinion. Besides, it's not just thinking or talking about something — The concrete proposition Eich supported (and which eventually made him resign) was about bringing the lives of thousands of people into a hellish state of uncertainty, going back to a society with no legal way to recognize their way of being, their love, their lives.

But anyway — What prompts me into writing this is that, once again, the Free Software (and related denominations) community has shown that a set of core values, seemingly shared by a very large number of our people with no coordination or correlation with what formally makes us a community (and which are thus emergent traits), is strong enough to create a critical mass, to achieve cohesion. And that ours is not just a technical community of people writing software at all layers of the stack, but –first and foremost– a group of social activists, committed to making the world better.

I will quote from Matthew Garrett's post on this topic, which is clearly more forceful and thorough than what I'm trying to come up with:

The Mozilla Manifesto discusses individual liberty in the context of use of the internet, not in a wider social context. Brendan's appointment was very much in line with the explicit aims of both the Foundation and the Corporation - whatever his views on marriage equality, nobody has seriously argued about his commitment to improving internet freedom. So, from that perspective, he should have been a fine choice.

But that ignores the effect on the wider community. People don't attach themselves to communities merely because of explicitly stated goals - they do so because they feel that the community is aligned with their overall aims. The Mozilla community is one of the most diverse in free software, at least in part because Mozilla's stated goals and behaviour are fairly inspirational. People who identify themselves with other movements backing individual liberties are likely to identify with Mozilla. So, unsurprisingly, there's a large number of socially progressive individuals (LGBT or otherwise) in the Mozilla community, both inside and outside the Corporation.

A CEO who's donated money to strip rights from a set of humans will not be trusted by many who believe that all humans should have those rights. It's not just limited to individuals directly affected by his actions - if someone's shown that they're willing to strip rights from another minority for political or religious reasons, what's to stop them attempting to do the same to you? Even if you personally feel safe, do you trust someone who's willing to do that to your friends? In a community that's made up of many who are either LGBT or identify themselves as allies, that loss of trust is inevitably going to cause community discomfort.

Categories: Elsewhere

Andrew Pollock: [life] Day 66: Story time, Science Friday, impromptu play date, scootering and the Hawthorne Markets

Planet Debian - Fri, 04/04/2014 - 02:42

Zoe slept through last night, which was lovely. We had nothing really planned for today.

The Bulimba Library has a story time thing at 10:30am on a Friday. I've never bothered to take Zoe because it would have been too much of a hassle to get her there after her Brazilian Jiu-Jitsu class, particularly by bike. Since today we didn't have a BJJ class, it made getting to story time much easier.

So we had a nice lazy start to the day, and then Zoe just wanted to procrastinate anyway, so since we were in no particular rush, I let her play in her room for a bit and I read for a while. We eventually had teeth and hair brushed and biked over to the library.

Not having been to the story time thing before, I assumed it was in the kids' area, but it turned out it was downstairs in the general purpose room, which we only discovered after story time had started. I'm certainly glad I never busted a gut to get to the library in time for it, because it wasn't anything particularly exciting. They did give all the kids a colouring sheet, and there was some colouring and a couple of songs, but really it was nothing to write home about.

I did run into Jacob and his mum Laura from Kindergarten. They live locally, and so I invited them over for lunch and a play date. They had some grocery shopping to do, and I had to go to the pharmacy to get a prescription filled, so we agreed to meet at our place in about an hour.

We biked to the pharmacy to get my prescription filled and some more sunscreen for Zoe, and the pharmacist gave her a free Chupa Chups.

We biked home, and grabbed some balloons from the convenience store next door for our Science Friday activity. I was slightly more organised today, and figured out what I wanted to do in the morning while Zoe was watching TV. We did yeast and sugar and warm water in a bottle with a balloon on top and watched it inflate itself with all the carbon dioxide produced by the yeast eating the sugar.

The only problem was the balloons we bought were total rubbish. They'd been on the shelf for too long and they'd all stuck to themselves and either popped when I tried to blow them up or had holes in them. So we marched back to the corner store in our lab coats to get some more, and the store keeper gave Zoe one of the little flashlights she keeps playing with in the bowl at the check out.

It was a good day for freebies.

We were able to complete our Science Friday activity before Laura and Jacob and his baby brother Ethan came over for lunch. Zoe didn't have a particularly good lunch; I think it was the distraction of having Jacob there as well.

After lunch, they couldn't really agree on anything to play. Laura said Jacob just plays out in the yard at Kindergarten the whole day. The one thing they both managed to agree on briefly was bubbles.

Ethan was getting tired and needed a nap by about 2pm, so they left. Zoe wanted to ride her scooter, but I had to clean up from lunch and the aftermath of the play date on the balcony first, so she went and played some games on her Nexus 7 while I did that, and then we headed out on the scooter.

It was about 2:30pm by this point, and the Hawthorne Markets, which have now moved to a Friday "twilight" thing, were due to start at 4pm, so I figured we could just kill time on the scooter.

It's funny how the scooter is now the in thing as of Tuesday. It's sat unused on the balcony for most of the year, but I'm glad she wants to use it now because it guarantees she moves faster than if she were to walk and want to be picked up every 5 minutes.

We scootered around to the playground and she played there until about 3:45pm, when we headed around to the markets.

We ran into Nicky Noo from the Ooniverse Family Cafe, who we'd met on Tuesday, and she made Zoe another dog balloon and she got her face painted.

After that, it was the obligatory jumping castle for a while. It was getting to be time to head home for Sarah to pick her up when I dropped the dog balloon on the ground and it popped. That made Zoe very sad. Then the lack of a decent lunch kicked in. Zoe wanted some poffertjes, but they would have taken too long, so I gave her a few of the free samples to tide her over, and we headed home.

Sarah was waiting for her when we got home, so we parted ways at that point.

I like how the days that have nothing planned often end up as full as, if not fuller than, the days when I do have something planned.

Categories: Elsewhere

Darren Mothersele: Drupal Theme Generator Update

Planet Drupal - Fri, 04/04/2014 - 01:00

It's been a week now since I demoed my proof-of-concept for an automated theme generator at the Drupal show-and-tell event, so I thought I'd collect together the feedback I've received so far and post an update.

Wrong Approach?

Almost unanimously positive feedback. In fact, it seems other people have been thinking along similar lines:

@mothersele dude! just saw http://t.co/GyV2m41eUe This is something that @jenlampton, @mortendk, @Cottser and I have discussed for 8.x twig!

— Mark Carver (@mark_carver) March 29, 2014

The one opposing view I have encountered wasn't actually against any of the ideas in the theme generator, but suggested that taking over Drupal's markup was wrong and that we should work with what Drupal provides. I know there are arguments for this, and if you want to go this route then you will need some other mechanism for documenting the conversion of your design to a Drupal theme. If you want to argue this case, I'd suggest first trying to have that discussion with Morten, as I'm going to assume that we're all OK with the concept of taking complete control of (completely rewriting) Drupal's markup output.

Annotation

In an earlier prototype I had started working with annotations inside HTML comments, but I found these increasingly harder to parse as the extractions became more sophisticated. Someone in conversation brought up ideas from KSS and suggested looking at CSS comments as an alternative.

I'm still proposing this as a possible approach (see Docblock), but for now I'm going to continue to annotate the markup (not the CSS) with x- attributes, as no one has had an issue with this, and at this stage it's easier to work with QueryPath to create the extractions based on these attributes. It seems that annotating the markup with x- attributes will be acceptable as long as they are stripped from the markup during the build process.
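To make the idea concrete, here is a rough sketch of what annotating markup with x- attributes could look like. The attribute names (x-component, x-region) are invented for illustration; they are not Hyde's actual vocabulary, and the build step would strip them from the final markup.

```html
<!-- Hypothetical annotated source design. The x-* attribute names below
     are illustrative assumptions, not the tool's real syntax. -->
<div class="news-listing" x-component="news-listing">
  <h2 class="news-listing__title" x-region="title">Latest news</h2>
  <!-- The extractor would treat this element as the repeating item. -->
  <article class="news-item" x-region="content">
    <a href="#" class="news-item__link">Story headline</a>
  </article>
</div>
```

A tool like QueryPath can then select elements by these attributes and carve the design into reusable template fragments.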

@rootwork @illepic @micahgodbolt @EvanLovely @mothersele Interesting! Do the data attributes get stripped out during the build step?

— Brad Frost (@brad_frost) March 28, 2014

It was great to get feedback from Brad Frost as his work on Atomic Design has been influential in the development of this process.

In code, or config

In this first proof-of-concept, the generated theme is held in memory; more precisely, it is persisted as a Drupal variable containing a single object that holds the result of all the 'extractions' from the source. The original intention was that this would actually be a ctools exportable, so that it could be exported and managed as part of the configuration management process for the site.

This is how the Panels flexible layout builder works. It has one parent layout plugin that programmatically declares child layout plugins based on the layouts you define using the layout builder tool. These child layouts are stored as exportable objects, so they can be exported using Features. The current Hyde theme generator approach is similar, except that the parent plugins (for layout or styles) programmatically declare child layout and style plugins based on the result of each extraction from the HTML source design.

Storing the result of the build in configuration or database raised some concerns, mainly over capturing the results in version control. These tweets summarise the issue:

@mothersele interesting implementation. But I believe that should definitely generate theme in code, not just DB @mcjim @MattFielding

— Tom Bamford (@waako) March 28, 2014

@waako If a prototype is always in sync with a Drupal theme, the markup *is* all in code right? // @mothersele @mcjim

— Matt Fielding (@MattFielding) March 28, 2014

Matt picks up on my original intention, in that the design/theme would be captured in code and be version-able because the translation is automatic from the design's HTML/CSS/JS.

The difficulty is in managing any changes that happen to the generated code once it becomes a Drupal theme. This is exactly the problem that using the theme generator is trying to solve. That it provides a documented, repeatable conversion process, so that design can become part of the (agile) development workflow.

However, it is going to be unavoidable that some tweaking will be needed. This covers a couple more issues that were raised at the Drupal show-and-tell event:

  • How to manage logic in template files?
  • How to capture Drupal's pre-process functions?

The approach I am looking at to solve this is one I've seen practised by other tools that involve code generation. For example, have you seen BDD using Behat? When you define a test scenario in Behat, it generates stub code for any unrecognised steps in your tests. For example, if you write a step like Given I am in a directory "foo", you would get the generated stub code:

    /**
     * @Given /^I am in a directory "([^"]*)"$/
     */
    public function iAmInADirectory($argument1)
    {
        throw new PendingException();
    }

I think the theme generator could do something similar for elements marked as requiring pre-processing in the template file. This needs some further thought and perhaps a couple of experiments.
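As a rough sketch, assuming a Drupal 7 theme, a generated pre-process stub might look like the following. The theme name "mytheme" and the component hook "card" are invented for illustration; they are not part of Hyde.

```php
/**
 * Hypothetical generated stub for an element marked as requiring
 * pre-processing. Implements hook_preprocess_HOOK() for the
 * illustrative "card" component template.
 */
function mytheme_preprocess_card(&$variables) {
  // Generated placeholder: map Drupal's render values onto the
  // design's markup here, then remove this exception.
  throw new Exception('Pending: pre-processing for "card" not yet implemented.');
}
```

Like Behat's PendingException, the thrown exception makes it obvious which stubs still need a human to fill them in.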

Terminology

I'm still struggling with naming conventions. If this is going to be a more general tool then it needs generally understandable terms (like 'component'). But we need to avoid overloading terms even more, as it's already quite confusing having SMACSS modules, Drupal modules, panels, blocks, boxes, styles, layouts. Urgh!

Next steps...

@mothersele @mark_carver I love it. Also love that it works w/ panels! Q: Are the layout plugins placed in the theme? @mortendk @Cottser

— Jen Lampton (@jenlampton) March 31, 2014

So, I'm going to revise the current proof-of-concept and produce a second prototype, this time as a Drush command that generates an actual Drupal theme. Rather than holding the extracted theme in configuration, it will generate a theme folder that will include all the usual Drupal theme files, plus any plugins for Panels layouts, styles, Display Suite etc., and the CSS/JS copied across from the source design.

This will allow Hyde to generate stub code for pre-processing or other programmatic tweaks that are needed to get Drupal's output to match the design markup. I also think people will be more accepting of this approach as it's probably more like how it is expected to work.

My worry is that people will then hack the generated theme, it will go out of sync with the source design markup, and that will break the whole process.

If you want to get involved, please drop me a line. I need input from designers, themers, and developers. In particular, I'd be interested to speak to anyone else already using Atomic Design and/or SMACSS on Drupal projects.

Categories: Elsewhere

PreviousNext: Object-oriented page callbacks for Drupal 7

Planet Drupal - Fri, 04/04/2014 - 00:45

In Drupal 8 we use object-oriented page and form callbacks to ease our programming burden. This is a nice improvement that allows us to encapsulate the functionality of one or many page callbacks into objects, with all the benefits that brings. Is it possible for us to use object-oriented page callbacks in Drupal 7? With a few tricks, yes it is. This article shows you how.

This is part of a continuing series of using Drupal 8 programming techniques in Drupal 7.
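As a minimal sketch of the general idea (not the article's actual implementation; the class, route and function names below are invented), one trick is to route hook_menu() through a thin procedural wrapper that instantiates a controller class:

```php
<?php

/**
 * Illustrative controller class: encapsulates a page callback.
 */
class HelloController {

  public function page() {
    return t('Hello from an object-oriented page callback.');
  }

}

/**
 * Implements hook_menu().
 */
function mymodule_menu() {
  $items['hello'] = array(
    'title' => 'Hello',
    // Delegate to a generic wrapper that instantiates the class.
    'page callback' => 'mymodule_object_page_callback',
    'page arguments' => array('HelloController', 'page'),
    'access callback' => TRUE,
  );
  return $items;
}

/**
 * Generic wrapper: instantiate $class and call $method on it.
 */
function mymodule_object_page_callback($class, $method) {
  $controller = new $class();
  return $controller->$method();
}
```

The wrapper keeps hook_menu() declarative while all real logic lives in the class, which can then be unit-tested or subclassed.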

Categories: Elsewhere

Matthew Garrett: Mozilla and leadership

Planet Debian - Fri, 04/04/2014 - 00:42
A post I wrote back in 2012 got linked from a couple of the discussions relating to Brendan Eich being appointed Mozilla CEO. The tldr version is "If members of your community don't trust their leader socially, the leader's technical competence is irrelevant". That seems to have played out here.

In terms of background[1]: in 2008, Brendan donated money to the campaign for Proposition 8, a Californian constitutional amendment that expressly defined marriage as being between one man and one woman[2]. Both before and after that he had donated money to a variety of politicians who shared many political positions, including the definition of marriage as being between one man and one woman[3].

Mozilla is an interesting organisation. It consists of the for-profit Mozilla Corporation, which is wholly owned by the non-profit Mozilla Foundation. The Corporation's bylaws require it to work to further the Foundation's goals, and any profit is reinvested in Mozilla. Mozilla developers are employed by the Corporation rather than the Foundation, and as such the CEO is responsible for ensuring that those developers are able to achieve those goals.

The Mozilla Manifesto discusses individual liberty in the context of use of the internet, not in a wider social context. Brendan's appointment was very much in line with the explicit aims of both the Foundation and the Corporation - whatever his views on marriage equality, nobody has seriously argued about his commitment to improving internet freedom. So, from that perspective, he should have been a fine choice.

But that ignores the effect on the wider community. People don't attach themselves to communities merely because of explicitly stated goals - they do so because they feel that the community is aligned with their overall aims. The Mozilla community is one of the most diverse in free software, at least in part because Mozilla's stated goals and behaviour are fairly inspirational. People who identify themselves with other movements backing individual liberties are likely to identify with Mozilla. So, unsurprisingly, there's a large number of socially progressive individuals (LGBT or otherwise) in the Mozilla community, both inside and outside the Corporation.

A CEO who's donated money to strip rights[4] from a set of humans will not be trusted by many who believe that all humans should have those rights. It's not just limited to individuals directly affected by his actions - if someone's shown that they're willing to strip rights from another minority for political or religious reasons, what's to stop them attempting to do the same to you? Even if you personally feel safe, do you trust someone who's willing to do that to your friends? In a community that's made up of many who are either LGBT or identify themselves as allies, that loss of trust is inevitably going to cause community discomfort.

The first role of a leader should be to manage that. Instead, in the first few days of Brendan's leadership, we heard nothing of substance - at best, an apology for pain being caused rather than an apology for the act that caused the pain. And then there was an interview which demonstrated remarkable tone deafness. He made no attempt to alleviate the concerns of the community. There were repeated non-sequiturs about Indonesia. It sounded like he had no idea at all why the community that he was now leading was unhappy.

And, today, he resigned. It's easy to get into hypotheticals - could he have compromised his principles for the sake of Mozilla? Would an initial discussion of the distinction between the goals of members of the Mozilla community and the goals of Mozilla itself have made this more palatable? If the board had known this would happen, would they have made the same choice - and if they didn't know, why not?

But that's not the real point. The point is that the community didn't trust Brendan, and Brendan chose to leave rather than do further harm to the community. Trustworthy leadership is important. Communities should reflect on whether their leadership reflects not only their beliefs, but the beliefs of those that they would like to join the community. Fail to do so and you'll drive them away instead.

[1] For people who've been living under a rock
[2] Proposition 8 itself was a response to an ongoing court case that, at the point of Proposition 8 being proposed, appeared likely to support the overturning of Proposition 22, an earlier Californian ballot measure that legally (rather than constitutionally) defined marriage as being between one man and one woman. Proposition 22 was overturned, and for a few months before Proposition 8 passed, gay marriage was legal in California.
[3] http://www.theguardian.com/technology/2014/apr/02/controversial-mozilla-ceo-made-donations-right-wing-candidates-brendan-eich
[4] Brendan made a donation on October 25th, 2008. This postdates the overturning of Proposition 22, and as such gay marriage was legal in California at the time of this donation. Donating to Proposition 8 at that point was not about supporting the status quo, it was about changing the constitution to forbid something that courts had found was protected by the state constitution.

Categories: Elsewhere

Wunderkraut blog: Does Acquia Certification give you personal ROI?

Planet Drupal - Thu, 03/04/2014 - 23:48

I’ve been recruiting many Drupal developers. The process is usually a mixture of random in-depth questions, a drupal.org profile review and pure intuition on the applicant’s fit for our company.

That’s why I felt genuinely interested in the recently published Acquia Certification program. Could a single test provide a trustworthy distinction between a seasoned and an inexperienced Drupal developer? I got to test it myself a couple of days ago.

The Acquia Certified Developer Exam promised to analyze a testee’s knowledge of Drupal as well as web development skills in general. The covered areas are so wide that it is pretty hard to cover all corners with 60 multiple-choice questions. Drupal was naturally present in most of them, while some of the other development topics were covered by only one or two.

The test time was limited to 90 minutes, all of which I used. The most time-consuming part was reading the questions themselves. I would have made the answer options tricky, not the questions. The questions were quite lengthy, trying to mimic the real-life cases of a freelancer. Enterprise-level questions would have had more focus on testing, scalability, deployment and documentation, not to mention drupal.org participation.

My overall feeling about the exam was positive. Although I would have preferred the test being split into more specialized, separate tests, the certification test gives what it promises. After completing my test, I got results which reflected my personal skills quite well.

For sure one cannot pass the test only by guessing, so the certificate is a valid testimony of prior in-depth Drupal experience. That’s why having an Acquia Certification on your CV will give you additional credibility and personal ROI. It will grow from personal ROI to corporate ROI as it becomes an acknowledged selling point as well as a decision criterion in tendering processes.

Categories: Elsewhere

Daniel Pocock: LogAnalyzer and rsyslog MongoDB support now in wheezy-backports

Planet Debian - Thu, 03/04/2014 - 17:12

LogAnalyzer is a powerful but simple log file analysis tool. The upstream web site gives an online demo.

It is developed in PHP, runs in Apache and has no other dependencies such as databases - it can read directly from the log files.

For efficiency, however, it is now trivial to make it work with MongoDB on Debian.

Using a database (including MongoDB and SQL backends) also means that severity codes (debug/info/notice/warn/error/...) are retained. These are not available from many log files. The UI can only colour-code and filter the messages by severity if it has a database backend.

Package status

The package entered Debian recently and has now migrated to wheezy-backports, so anybody on wheezy can use it.

Quick start with MongoDB

The version of rsyslog in Debian wheezy does not support MongoDB output. It is necessary to grab 7.4.8 from backports.

Some versions, up to 7.4.4 in backports, had bugs with MongoDB support - if you tried those, please try again now.

The backported rsyslog is a drop-in replacement for the standard rsyslog package and for users with a default configuration it is unlikely you will notice any difference. For users who customized the configuration, as always, make a backup before trying the new version.

  • Install all the necessary packages: apt-get install rsyslog-mongodb php5-mongo mongodb-server
  • Add the following to /etc/rsyslog.conf:

    module(load="ommongodb")
    *.* action(type="ommongodb" server="127.0.0.1")

  • Look for the MongoDB settings in /etc/loganalyzer/config.php and uncomment them. Comment out the stuff for disk log access.
  • Restart rsyslog and then browse your logs at http://localhost/loganalyzer
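To sanity-check the pipeline, you can write a test message and query MongoDB directly. This is a hedged sketch: the database and collection names ("logdb"/"log") are ommongodb's documented defaults; check your rsyslog.conf if you have overridden them.

```shell
# Write a test entry via syslog.
logger "loganalyzer test message"

# Count matching documents in MongoDB (assumes default db/collection
# names "logdb" and "log"; adjust to your configuration).
mongo logdb --eval 'printjson(db.log.find({msg: /loganalyzer test/}).count())'
```

If the count is zero, check that the backported rsyslog (7.4.8 or later) is the one actually running.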
Categories: Elsewhere

ThinkShout: Reflections on Drupal Day: Creating a One-Size-Fits-All Day for Nonprofit Professionals and Technologists

Planet Drupal - Thu, 03/04/2014 - 17:00

Originally published March 26 on NTEN.org

Learning a new technology can be incredibly intimidating, especially if you’re going at it alone. There’s great comfort in knowing that you’re not the only one with those particular questions or having this recurring, frustrating problem. Stranding yourself on a technological island is so unnecessary, especially given how accessible learning resources are these days. This is the beauty of the modern technology communities.

Specifically, the Drupal community. It’s everywhere, it’s friendly, and it’s full of helpful people excited to share their expertise and bring new talent into the fold. I spent the last four months preparing for Drupal Day, a Drupal-centric, day-long workshop that ThinkShout coordinates as part of NTEN’s Nonprofit Technology Conference (NTC). I didn’t quite understand the scope of this community until those months finally culminated in the big day.

The process was an interesting one for me especially, as it was not only my very first Drupal Day but also my first experience at the NTC. How do you create a one-size-fits-all day for a large group of people, both nonprofit professionals and technologists, with a wide range of technical competency levels?

It may not be a perfect fit, but so long as there are options, your attendees remain in control and are able to choose the sessions relevant to their interests. With the collaborative efforts of our sponsors and nonprofit feedback, we were able to put together a day jam-packed with content.

My experience with Drupal Day left me with a few key takeaways for those looking to dive into Drupal:

1. The Drupal community really is awesome.

Drupal.org is only the beginning, but it’s a fantastic beginning full of answers. There are forums, an archive of resources, and even a live chat if that’s more your speed. There’s a wealth of information available to you online, all of it curated by the people that know and love Drupal best. This community isn’t purely digital, either. If you live in a large city, chances are there’s a Drupal meetup near you. If you’d prefer to meet face to face, you can, whether it’s through a local event, a full-blown DrupalCon, or the nonprofit summits at NYC Camp and BADCamp. You can also access paid training on BuildAModule, but the best part is that you can meet Chris Shattuck, the BuildAModule instructor, in person at a ton of these events. You’re going to start recognizing people quickly, and it’s going to be more helpful than you might think.

2. Learn from others’ stories and share your own.

One of the draws of Drupal Day is that it’s a great opportunity to hear from nonprofit decision makers about their experiences with Drupal. This year, every single one of our speakers represented a nonprofit with a successful Drupal story and each came from different technological backgrounds. We chose speakers that we believed had great, impactful stories that Drupal Day attendees could learn from. This year, Erin Harrington from The Salmon Project, Jess Snyder from WETA, Porter Mason from UNICEF, Milo Sybrant from the International Rescue Committee, and Tony Kopetchny from Pew Charitable Trusts joined us to share their experiences. You can learn more about their projects by clicking through to their websites.

3. Every question is a good question.

There really aren’t any dumb questions, especially when it comes to Drupal. The community embraces newcomers and fosters a great environment for learning. No matter your technical competency level, they’ve got an answer for you. This is why we structured Drupal Day 2014 the way we did: nonprofit speakers in the morning talking about their personal accounts of their organization’s experience with Drupal, followed by an afternoon of twelve breakout sessions covering a variety of topics, where guests could move from classroom to classroom easily. We collaborated with our developer sponsors and nonprofit attendees to determine what information was most relevant to nonprofits. We crafted a day around the topics they wanted to learn about. Everything from Google Analytics to content creation had a place at Drupal Day.

The Drupal community is one that needs to be experienced to truly understand its value. It’s a wonderful stage for nonprofits, no matter where their organization is at technology-wise. Drupal Day is a prime example of that, but there are many more events on the horizon, which I highly recommend if you’re on the fence about diving into Drupal. Of course, I also encourage everyone to come out to Drupal Day at the next NTC and see just what exactly it feels like to be part of this fantastic community.

Categories: Elsewhere

drupalsear.ch: This week in Search API - Meeting notes 01/04/2014

Planet Drupal - Thu, 03/04/2014 - 16:59

On Tuesday, April 1st, we had our first Search API online meeting using Google Hangouts. The following is a summary of what was decided and said. It's also the start of "this week in Search API", so that you can follow along.

What have we done

  • Frederick fixed the batch process; it still needs to be tested.
  • During Drupal Dev Days we made 322 commits.
  • The heaviest hours were from 09:00 till 19:00, and activity went up again from 22:00 till 24:00. Thanks to freblasty we even had commits in the middle of the night, 9 of them at 04:00.
  • Wednesday was our most productive day with 75 commits. Saturday was our low point with only 36 commits; I guess that has something to do with the party on Friday evening…!
  • 20834 lines were added, 11996 were removed.
  • In total we had 15 contributors!
  • A blog post with more details will be posted soon.
  • http://drupalsear.ch/rocketship/all/all now also features the issues from the sandbox. If you see issues in "Needs review", please go in and review and/or give your opinion.

What will we do

In General more tests for the processors is our main focus
The indexing logic needs to be reviewed and tested
We’ll move to the proper Search API project once basic functionality is working and we want to work on features.
freblasty is going to move all the issues from the google doc to the sandbox issue queue.
drunkenmonkey will review if the information is clear enough on the main Search API Module page so that people are guided towards best practices and they can join the effort.
Nick_vh will figure out how to run Search API tests quickly locally using drupal.org Docker images.
Nick_vh will figure out how to proceed with a "This week in Search API" blog post (which is more or less what you are reading right now).
Nick_vh will register drupalsear.ch and link d8searchapi.acquia.com to it.

For your information

freblasty is on holiday and seems to have time to work on some harder issues. He needs some cheering!
We decided that we'll talk about multiple entity types per index next meeting.
Before committing/pushing, we will always run tests.

Categories: Elsewhere

Blink Reaction: How to Add the Current Date to a View in Drupal 7

Planet Drupal - Thu, 03/04/2014 - 15:31

Today, someone in IRC asked how to add the current date to a View they were working on. Seemed simple enough, and I offered to help, thinking it was just a matter of sticking a token in the View header. And it was...sort of. Incredibly, Google searches turned up only code-based solutions. Overkill. Here's a UI-based approach.

To add the current date to the top of a View, follow these steps:

Categories: Elsewhere

Gunnar Wolf: DrupalCamp Mexico City: April 23-25

Planet Debian - Thu, 03/04/2014 - 15:30

We are organizing a DrupalCamp in Mexico City!

As a Drupal user, I have so far attended two DrupalCamps (one in Guadalajara, Mexico, and one in Guatemala, Guatemala). They are –as Free Software conferences usually are– great, informal settings where many like-minded users and developers meet and exchange all kinds of contacts, information, and have a good time.

Torre de Ingeniería

This year, I am a (minor) part of the organizing team. DrupalCamp will be held in Torre de Ingeniería, UNAM — Just by Facultad de Ingeniería, where I teach. A modern, beautiful building in Ciudad Universitaria.

Talks, tracks

So, who is this for? You can go look at the accepted sessions; you will find they cover a lot of ground. Starting from the very introduction to how Drupal is structured and some tips on how to work with it (delivered by yours truly), through workflows for specific needs, to strongly development-oriented talks. The talks are structured along five tracks: "Training", "Theming", "Development", "Business", and "SymfonyDay".

"SymfonyDay"? Yes.

Drupal is a fast-evolving Free Software project. Most users are currently using versions 6 and 7, which are as different from each other as day and night... But the upcoming Drupal 8 brings even greater changes. One of the most interesting changes I can see is that Drupal will now be based on a full MVC framework, Symfony. One of the days of our DrupalCamp will be devoted to Symfony (dubbed the Symfony Day).

...And... Again, just look at the list of talks. You will find a great amount of speakers interested in coming here. Not just from Mexico City. Not just from Mexico. Not just from Latin America. I must say I am personally impressed.

Sponsors!

Of course, as with any volunteer-run conference: we are still looking for sponsors. We believe being a DrupalCamp sponsor will greatly increase your brand visibility in the community you want to work with. There are still a lot of expenses to cover to make this into all that we want. And surely, you want to be a part of this great project. There are many sponsor levels — surely you can be part of it!

Categories: Elsewhere

Commercial Progression: The Acquia Certified Developer exam

Planet Drupal - Thu, 03/04/2014 - 14:49

I recently took the Acquia Certified Developer exam and I’m proud to say that I passed by a significant margin. While it was by no means easy, it shouldn’t be insanely difficult for anyone who is a competent, experienced Drupal developer to get a passing score.

I originally heard about it through Acquia’s partner newsletter (Commercial Progression is an Acquia partner) but it was a blog post from Angie "webchick" Byron that really sold me on it. I really liked the idea that it would test practical knowledge and not just require memorizing a bunch of facts. I also felt that being offered by Acquia, one of the most recognized and trusted names for all things Drupal, really helps give it credibility that it might not have if some other company was offering it.

Onsite vs online

After reading about the Secure Sentinel software used for the online testing, I decided to go the onsite route. My biggest concern was that despite my best efforts to avoid disturbances, something would inevitably happen that would cause me to have to retake the exam. Rather than chance it, I decided to just do the onsite exam.

When you register for an onsite exam, you will receive an email with an authorization code and instructions. Essentially it boils down to "show up 15 minutes early with this email and 2 forms of ID". From there, you have to sign some forms (code of conduct and consent to be recorded) and hand over any items you brought with you (wallet, watch, cellphone, etc.), which they will store for you until you're finished, since you can't have any of that with you during the exam. Once all that's taken care of, they will get the computer set up and take you to the room to start your exam.

Exam prep

If somehow you haven’t seen it already, Webchick has put together an excellent study guide. The sample question is also worth a look but if you feel that you need to do a lot of studying, you should probably re-think whether you should be taking the test. Studying is good for brushing up on topics you might be a little rusty on but it’s no substitute for hands-on experience.

The exam

While I obviously can't really say anything about the content of the exam, I will say that it is focused on things that you will actually encounter and use, not just a bunch of random facts. If you have experience building, maintaining, and fixing Drupal sites using best practices, it will serve you well.

When I was taking the exam, there were questions that actually caused me to smile when I read them because I knew the answer right away. They pertained to scenarios that I had encountered and dealt with while working on Drupal sites. If you’re an experienced developer, don’t be surprised to see some familiar scenarios and issues on the exam.

The actual process was fairly smooth and the 90 minute time limit seems pretty reasonable. I was able to thoroughly read all of the questions, evaluate the answers, and go back and review the questions that I had flagged all with time to spare.

There was one minor issue where the testing system got stalled between questions for a minute or so. That ended up resolving itself just as I was about to call the proctor and the timer resumed from where it was previously so I didn’t actually lose any time. It seems like that was just a minor hiccup with the testing system and not really anyone’s fault.

Final thoughts

When you get your results after the exam, your score is broken down by topic so you can see how well you did in each of the four "domains" the exam covers. It would be nice to be able to see which questions you got wrong after you complete the exam, but I realize that might not be possible. One possible alternative might be to give a more detailed breakdown of the score using the sub-topics from the blueprint (e.g. "3.1 Given a scenario, demonstrate ability to create a custom theme or sub theme") so the people who take the exam have a better idea of areas where they may need to improve.

The best advice I can give to anyone taking this exam is don’t overthink things and go with your gut. Acquia is not trying to trick you. As long as you read the whole question you should be fine. If you’re an experienced developer and you’re unsure about the answer for a question even after you’ve gone back and reviewed it, the best thing you can do is just go with your first instinct as it’s probably correct.

Categories: Elsewhere

Wunderkraut blog: Slides from Vagrant+Puppet=TRUE from Drupal Dev Days in Szeged

Planet Drupal - Thu, 03/04/2014 - 13:37

At Drupal Dev Days I gave a talk about how we work with Vagrant and Puppet in development; here are the slides from that talk.

If you have any questions, please comment.


Categories: Elsewhere

InternetDevels: Flexible materials sorting with the help of Radioactivity module

Planet Drupal - Thu, 03/04/2014 - 13:30

When planning the architecture of a site with a large amount of content, developers often face this issue: how to implement flexible content sorting, and how to keep the most interesting articles from getting lost among the new content?

Here we will describe the solution we have implemented while handling this kind of task.

Read more
Categories: Elsewhere

Steve Kemp: Tagging images, and maintaining collections?

Planet Debian - Thu, 03/04/2014 - 13:02

I'm an amateur photographer, although these days I tend to drop the amateur prefix, given that I shoot people for cash at least once a month.

(It isn't my main job, and I'd never actually want it to be, because I'm certain I'd become unhappy hustling for jobs and doing the promotion thing.)

Anyway over the years I've built up a large library of images, mostly organized in a hierarchy of directories beneath ~/Images.

Unlike most photographers I don't use aperture, lighttable, or any similar library management. I shoot my images in RAW, convert to JPG via rawtherapee, and keep both versions of the images.

In short I don't want to mix the "library management" functions with the "RAW conversion" because I do regard them as two separate steps. That said I'm reaching a point where I do want to start tagging images, and finding them more quickly.

In the past I wrote a couple of simple tools to inject tags into the EXIF data of images, and then indexed them. But that didn't work so well in practice. I'm starting to think instead I should index images into sqlite:

  • Size.
  • Date.
  • Content hash.
  • Tags.
  • Path.

The downside is that this breaks utterly as soon as you move images around on-disk. Which is something my previous exif-manipulation was designed to avoid.

Anyway I'm thinking at the moment, but I know that the existing tools such as F-Spot, shotwell, DigiKam, and similar aren't suitable. So I either need to go standalone and use EXIF tags, accepting the fact that the tags I enter won't be visible to other tools, or I cope with the file-rename issues by attempting to update an existing sqlite database via hash/size/etc.

Categories: Elsewhere

Johannes Schauer: mapbender - maps for long-distance travels

Planet Debian - Thu, 03/04/2014 - 12:47

Back in 2007 I stumbled over the "Plus Fours Routefinder", an invention of the 1920s. It's worn on the wrist and allows the user to scroll through a map of the route they planned to take, rolled up on little wooden rollers.

At that point I thought: that's awesome for long trips where you either don't want to take electronics with you or where you are without any electricity for a long time. And creating such rollable custom maps of your route automatically using openstreetmap data should be a breeze! Nevertheless it seems nobody picked up the idea.

Years passed, and in a few weeks I'll go on a biking trip along the Weser, a river in northern Germany. For my last multi-day trip (which was through the Odenwald, an area in southern Germany) I printed a big map from openstreetmap data which contained the whole route. Openstreetmap data is fantastic for this because, in contrast to commercial maps, it doesn't only allow you to print just the area you need but also allows you to highlight your planned route and objects you would probably not find in most commercial maps, for example supermarkets to stock up on supplies or bicycle repair shops.

Unfortunately such big maps have the disadvantage that to show everything in the amount of detail that you want along your route, they have to be pretty huge and thus easily become an inconvenience because the local plotter can't handle paper as large as DIN A0 or because it's a pain to repeatedly fold and unfold the whole thing every time you want to look at it. Strong winds are also no fun with a huge sheet of paper in your hands. One solution would be to print DIN A4 sized map regions in the desired scale. But that has the disadvantage that either you find yourself going back and forth between subsequent pages because you happen to be right at the border between two pages or you have to print sufficiently large overlaps, resulting in many duplicate map pieces and more pages of paper than you would like to carry with you.

It was then that I remembered the "Plus Fours Routefinder" concept. Given a predefined route it only shows what's important to you: all things close to the route you plan to travel along. Since it's a long continuous roll of paper there is no problem with folding because as you travel along the route you unroll one end and roll up the other. And because it's a long continuous map there is also no need for flipping pages or large overlap regions. There is not even the problem of not finding a big enough sheet of paper because multiple DIN A4 sheets can easily be glued together at their ends to form a long roll.

On the left you see the route we want to take: the bicycle route along the Weser river. If I wanted to print that map on a scale that allows me to see objects in sufficient detail along our route, then I would also see objects in Hamburg (upper right corner) in the same amount of detail. Clearly a waste of ink and paper as the route is never even close to Hamburg.

As the first step, a smooth approximation of the route has to be found. It seems that the best way to do that is to calculate a B-Spline curve approximating the input data with a given smoothness. On the right you can see the approximated curve with a smoothing value of 6. The curve is sampled into 20 linear segments. I calculated the B-Spline using the FITPACK library to which scipy offers a Python binding.
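The post uses FITPACK through scipy (`scipy.interpolate.splprep`/`splev`) for this step; purely as an illustration of the idea, here is a hypothetical, dependency-free sketch that samples a uniform cubic B-spline over the raw route points. It lacks FITPACK's smoothing-factor control, but shows how a jagged GPS polyline becomes a smooth curve:

```python
def smooth_route(ctrl, samples=10):
    """Approximate a polyline with a uniform cubic B-spline.

    ctrl: list of (x, y) control points (the raw route).
    Returns a denser list of points on the smoothed curve; each group of
    four consecutive control points contributes one curve span.
    """
    out = []
    for i in range(len(ctrl) - 3):
        p0, p1, p2, p3 = ctrl[i:i + 4]
        for s in range(samples):
            t = s / samples
            # Cubic B-spline basis functions; they sum to 1, so every
            # sampled point is a convex combination of control points.
            b0 = (1 - t) ** 3 / 6
            b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6
            b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6
            b3 = t ** 3 / 6
            out.append((b0 * p0[0] + b1 * p1[0] + b2 * p2[0] + b3 * p3[0],
                        b0 * p0[1] + b1 * p1[1] + b2 * p2[1] + b3 * p3[1]))
    return out
```

Because the basis functions form a partition of unity, the smoothed curve always stays inside the convex hull of the route points, which is exactly what you want for a map corridor.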

The next step is to expand each of the line segments into quadrilaterals. The distance between the vertices of the quadrilaterals and the ends of the line segment they belong to is the same along the whole path and obviously has to be big enough such that every point along the route falls into one quadrilateral. In this example, I draw only 20 quadrilaterals for visual clarity. In practice one wants many more for a smoother approximation.

Using a simple transform, each point of the original map and the original path in each quadrilateral is then mapped to a point inside the corresponding "straight" rectangle. Each target rectangle has the height of the line segment it corresponds to. It can be seen that while the large-scale curvature of the path is lost in the result, fine details remain perfectly visible. The assumption here is that, while travelling a path several hundred kilometres long, it does not matter that large-scale curvature, which one cannot perceive anyway, is not preserved.
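A minimal stand-in for that per-segment mapping (not the author's code): expressing each map point in along-track/cross-track coordinates of its line segment places it inside the segment's "straight" target rectangle, with "along" as the rectangle's height axis and "across" as its width axis.

```python
import math

def segment_coords(a, b, p):
    """Map point p into the coordinate frame of segment a -> b.

    Returns (along, across): 'along' is the distance travelled along the
    segment direction, 'across' is the signed perpendicular offset from
    the segment line. Applying this to every map point in a segment's
    quadrilateral straightens that piece of the route.
    """
    vx, vy = b[0] - a[0], b[1] - a[1]   # segment direction vector
    wx, wy = p[0] - a[0], p[1] - a[1]   # point relative to segment start
    seg_len = math.hypot(vx, vy)
    along = (wx * vx + wy * vy) / seg_len   # scalar projection
    across = (wx * vy - wy * vx) / seg_len  # signed cross product
    return along, across
```

A degenerate (zero-length) segment would divide by zero here; a real implementation would drop such segments when sampling the spline.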

The transformation is done on a Mercator projection of the map itself as well as the data of the path. Therefore, this method probably doesn't work if you plan to travel to one of the poles.

Currently I transform openstreetmap bitmap data. This is not quite optimal as it leads to text on the map being distorted. It would be just as easy to apply the necessary transformations to raw openstreetmap XML data, but unfortunately I didn't find a way to render the resulting transformed map data as a raster image without setting up a database. I would've thought that it would be possible to have a standalone program reading openstreetmap XML and dumping out raster or SVG images without a round trip through a database. Furthermore, tilemill, one of the programs that seems to be among the least hassle to set up and produce raster images with, is stuck in an ITP, and the existing packaging attempt fails to produce a non-empty binary package. Since I have no clue about nodejs packaging, I wrote about this to the pkg-javascript-devel list. Maybe I can find a kind soul to help me with it.

The code that produced the images in this post is very crude, unoptimized, and kinda messy. If you don't care about that, it can be accessed here.

Categories: Elsewhere

Code Positive: Sage Pay

Planet Drupal - Thu, 03/04/2014 - 12:20

Sage Pay is one of the world’s most trusted payment solutions providers.

Our brief was to create a platform which could be used to build and maintain a website for each of the countries Sage Pay operates in, supporting existing customers and promoting Sage Pay services in those countries.

Background

Sage Pay has earned a reputation for security and good service. From startups to major brands, fifty thousand businesses of all sizes rely on Sage Pay to successfully process customer payments.

Sage Pay approached Code Positive to re-build the main corporate platform from which all Sage Pay country websites are built and maintained. The objectives were to present fresh new branding and make it possible for Sage Pay customers, partners, and developers to more quickly find solutions and information.

Code Positive provided consultancy, project planning, project management, development, and on-going support for the project.

Drupal was used to implement the project, and hosting was provided by Acquia.

Site Highlights

Marketing

The Payment Solutions section features visually rich explanations of the advantages of Sage Pay products, step-by-step guides on how to switch to Sage Pay, and information for startups and corporates. Each page has a clear call to action to connect the visitor to the Sage Pay sales team.

The Partner and Developer section of the site provides information on becoming a partner, and helps partners and developers find developer resources and information about 3rd party integrations. Similar to the Solutions section, navigation is visually driven and there is a good balance of imagery and content on the pages, with strong calls to action in the form of buttons, teasers with icons, and customer logos linked to case studies.

Customer and partner case studies highlight how different Sage Pay solutions have solved problems and met the needs of customers and partners - two different audiences for Sage Pay. The case studies grab the attention of the reader with quotes and logos from prominent customers and partners. Case studies are promoted throughout the site with case study carousels and lists of linked logo thumbnails.

Support

The support section makes it quick and easy for Sage Pay’s various audiences to find the help that they need.

It features support articles on various topics, integration guides explaining how Sage Pay products can be used with other applications, an online shoppers' FAQ for anyone making payments through Sage Pay services, and contact details for the 24/7 support services through which clients and partners can get more information.

Also available in the support section are logo downloads that can be added to a website to show it's using Sage Pay services, a beta testing programme registration, a glossary with definitions of terminology used on the site, and explanations of error codes and their suggested solutions.

One of the most important features of the site is the system monitor, which provides up-to-the-moment information on the status of Sage Pay services.

The support section has a search facility, including the option to search specifically in error codes so that integrators can quickly diagnose problems and find solutions.

The Sage Pay forum is hosted externally on Stack Overflow, which has excellent tools for technical support and a large and active developer community.

Content Strategy

The client’s brief had two almost contradictory requirements for the site’s content strategy:

  1. Flexibility to add a completely different mix of content to each page
  2. Rigidly defined content fields that would guide staff in entering content

A flexible content system would enable the marketing department to create whatever message was appropriate for each page. Rigid content fields would maintain consistency across the site, and allow content parts to be re-used on other pages, or hidden on mobile devices.

We squared this circle by analysing the planned site content to find common repeating patterns. We then implemented each pattern as a component type that could be used to add structured content to any page, in any order. The components included flexible configuration options that let Sage Pay control the position, styling, and order of the content as it is added to a page.

Most of the components were developed to have a one-to-one relationship with the page they would be added to, and we also provided a few components that could be re-used on multiple pages.

After a few weeks of use, feedback from the client enabled us to refine the user interface of the component system, to provide a system that is flexible, powerful, and easy to use.

Tokens

One of the challenges was to make it easy for Sage Pay to manage and maintain lots of pieces of information, like telephone numbers and prices, across multiple pages.

To solve this problem, we implemented a token system that enables the client to use placeholder tokens anywhere in their copy; these are automatically filled in with the associated information before being displayed on the web page.

The client can easily create new tokens as required, and updating the information associated with a token updates the information across the website.
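The real site is built in Drupal, so its token handling presumably builds on Drupal's own mechanisms; purely to illustrate the concept, here is a hypothetical sketch with made-up token names and values (they are not Sage Pay's actual data):

```python
import re

# Hypothetical token store: in the real site this would live in the
# database and be editable by the client through the admin UI.
TOKENS = {
    "support_phone": "0123 456 7890",
    "monthly_price": "£20",
}

# Placeholders look like [token_name] in the copy (an assumed syntax).
TOKEN_RE = re.compile(r"\[([a-z_]+)\]")

def render(copy, tokens=TOKENS):
    """Replace [token_name] placeholders in copy.

    Unknown tokens are left as-is, so a typo in the copy stays visible
    to editors instead of silently disappearing.
    """
    return TOKEN_RE.sub(lambda m: tokens.get(m.group(1), m.group(0)),
                        copy)
```

Updating a value in the store changes every rendered page that uses the token, which is the maintenance win the text describes.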

Media Management

The website makes extensive use of images, videos, and downloadable documents as part of its marketing and support features.

We used the Media module to implement a media management system that enables the client to add media assets to the site's asset library, re-use them across multiple pages, and track which pages each asset is used on.

The system also allows the client to update a media asset, such as a new image of a product, and have the site display the new version everywhere it used the old version.

Drupal Contributions

Node View Mode

Provides a field that can be used to select a view mode on each node.

The Results

The site has exceeded the client’s expectations for flexibility and ease of use in adding content. Sage Pay’s marketing and support teams are excited about the positive feedback from customers, developers and partners.

Drupal Consulting, Drupal Development, Drupal Maintenance
Categories: Elsewhere

Gerfried Fuchs: 2CELLOS

Planet Debian - Thu, 03/04/2014 - 12:08

A good friend just yesterday sent me a link to an hour-and-a-half-long live concert by 2CELLOS. And wow, I was deeply impressed. Terrific! Even Sir Elton John approves. I have to share them with you, too. :)

Enjoy!

P.S.: I sooo love them also for their pun in their second album title, In2ition. :D


Categories: Elsewhere

Aigars Mahinovs: Wireless photo workflow

Planet Debian - Thu, 03/04/2014 - 11:51

For a while now I've been looking for ways to improve my photo workflow — to simplify and speed up the process. Now I've got a new toy to help that along: a Toshiba FlashAir SD card with WiFi connectivity. I was pretty sure that the built-in workflows of the more automated solutions would not be a perfect fit for me, so I got this card, which has a more manual workflow and a reasonable API, so I could write my own.

Now I am trying to work out my requirements, the user stories if you will.

I see two distinct workflows: live event and travel pictures.

In both cases I want the images to retain the file names, Exif information and timing of the original photos, and also have embedded GPS information from the phone synced to the time the photo was taken. And if I take a burst of very similar photos, I want the uploading process to select and upload only the "best" one (a trivial heuristic being the file size), with an ability for me to later choose another one to replace it. There would need to be some way of syncing phone and camera time, especially considering that phones usually switch to the local time zone when travelling and cameras do not; maybe the original time the photo was taken would need to be changed to the local time zone, so that there are no photos that were taken during the day but have a timestamp of 23:45 GMT.
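The burst-selection heuristic could be sketched as follows; this is only an illustration of the idea, and the two-second grouping gap and the data shape are assumptions, not part of any existing tool:

```python
def pick_best_per_burst(photos, gap=2.0):
    """Group shots taken within `gap` seconds into bursts and keep the
    largest file of each burst (the trivial file-size heuristic).

    photos: iterable of (timestamp_seconds, size_bytes, filename).
    Returns one filename per burst, in shooting order.
    """
    best = []
    current = []  # shots belonging to the burst being assembled
    for shot in sorted(photos):
        if current and shot[0] - current[-1][0] > gap:
            # Gap too large: close the current burst, keep its biggest file.
            best.append(max(current, key=lambda s: s[1])[2])
            current = []
        current.append(shot)
    if current:
        best.append(max(current, key=lambda s: s[1])[2])
    return best
```

Keeping the full `(timestamp, size, filename)` records around would also satisfy the later requirement of letting the user swap in a different shot from the same burst after the fact.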

When I am in Live Event mode I would like the photos that I take to immediately start uploading to an event album that I create (or choose) at the start of the shoot with a preset privacy mode. This assumes that either I am willing to upload via the 3G of my phone or that I have access to a stable WiFi network on-site. It might be good if I could upload a scaled-down version of the pictures during the event and then later replace the image files with full-size images when the event is over and I am at home on my high-speed network. I probably don't need the full-size files on my phone.

When I am in Travel mode, I want to delay photo uploading until I am back at the hotel with its high-speed WiFi, but also have an option to share some snapshots immediately over 3G or a random cafe's WiFi. I am likely to take more photos than there is memory in my phone, so I would like to clear original files from the phone while keeping them on the SD card and in the cloud, but still keep enough metadata to allow re-uploading an image or choosing another image in a burst.

Now I need to flesh out the technical requirements from the above and write an Android app to implement them. Or maybe start by writing this in Python as a cross-platform command-line/desktop app and only later port it to Android when all the rough parts are ironed out. This will have the extra benefit that people will be able to run the same workflow on a laptop instead of a phone or tablet.

Let's assume that this is written in a pretty flexible way, allowing one to plug in backends for different WiFi SD cards and cloud services, plus plug-in points for things like instant display of the latest photo on the laptop screen in full-screen mode and other custom actions. What else would people love to see in something like this? What other workflow am I completely overlooking?

Categories: Elsewhere

Pages

Subscribe to jfhovinne aggregator - Elsewhere