Feed aggregator

Craig Small: killing a process in TCL

Planet Debian - Tue, 01/07/2014 - 14:40

Suppose you had spawned a process in TCL, knew its PID, and wanted to kill it? Sounds like a simple enough thing to do, right? This problem has plagued me for many months, because some things you can assume on a normal system do not hold true in strange environments, such as build daemons.

Seems simple enough, I started off with:

exec kill $pid

Except… not every environment has the kill binary and, with that piece of code, exec needs a real binary, not a shell builtin. The funny thing is that /bin/kill is in the procps package, which is the very package having the buildd problems.

So the next idea was to use command -v to check for the existence of kill and skip the tests that needed kill if it was not found. A good idea except, as I found out later, it also finds builtins. That put me right back at problem #1.
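The pitfall is easy to demonstrate from a shell; a minimal sketch (the /bin/kill check is illustrative, not the actual procps test-suite code):

```shell
# "command -v" succeeds for shell builtins as well as binaries on $PATH,
# so it cannot prove that a standalone /bin/kill exists:
command -v cd    # prints "cd" even though cd is only a builtin

# Checking the filesystem directly matches only a real executable file:
if [ -x /bin/kill ]; then
    echo "standalone kill found"
else
    echo "no /bin/kill on this system"
fi
```

Testing for the file itself is what finally distinguishes the binary from the builtin.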

There is a kill command in TCL, but it requires tclx. That seems excessive for just one simple little command. How can I run a shell from TCL that runs the kill builtin? On the command line, something like the line below would do it.

/bin/sh -c 'kill 1234'

I was closer, but then hit TCL quoting hell. No matter what I (initially) did, either the shell would complain or my variable would not be evaluated. In the end, I had to build the command line in a separate variable and pass that to exec. Not perfect, but at least it works now.
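The same trap can be reproduced in a plain shell, which may make the TCL behaviour easier to picture; this is only an analogy, not the test-suite code:

```shell
pid=1234
# Single quotes suppress variable expansion, so the command would see
# the literal string "$pid" rather than the number:
echo 'kill $pid'    # prints: kill $pid
# Double quotes expand the variable before the command runs:
echo "kill $pid"    # prints: kill 1234
```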

The resulting code (found in testsuite/config/unix.exp) looks like:

proc kill_process pid {
    set cmdline "kill $pid"
    if { [catch { exec /bin/sh -c $cmdline } msg] } {
        warning "Could not kill process: $msg\n"
    }
}

Perhaps there is a more elegant way; I’m certainly no star TCL programmer, but of all the combinations I tried, this was the only one that worked.

Categories: Elsewhere

Blink Reaction: Austin Wrap-up - Drupal 8: The Friendly Platform

Planet Drupal - Tue, 01/07/2014 - 13:31

For those of us living at the speed of Drupal each and every day, Austin seems light years away already. We’ve begun planning in earnest for Drupalcon Amsterdam and even Drupalcon Bogota and Drupalcon LA in 2015.

Categories: Elsewhere

Neil Williams: LAVA in Debian unstable

Planet Debian - Tue, 01/07/2014 - 13:27

LAVA has arrived in Debian unstable. There is no need for third party repositories (unless you want them): a simple apt install lava-server will do. What happens from that point isn’t so simple. I’ve made the single instance installation as straightforward as I could, but there is a lot more to do to get LAVA working in a useful manner. First, a little history to explain some of the hysterical raisins which may appear later.

Validation

So, you’ve made a change to the code, good. It compiles, yay! Before you ship it, does it pass the unit tests? Right, now you have an improved patch which does what you want and keeps the unit tests running. Validation is all about asking that awkward question: does your change work? Does it introduce side effects on other systems / services in the same code? Does it break programs which use services exported by the code? LAVA is one step along the road to system testing, starting at the bottom.

Automation

Well, you could do all that yourself: write the image to the device yourself, apply power yourself, connect to serial, copy test scripts to the booted image and somehow pull the results off. Much better if this is automated – maybe on every commit, or every push, or as often as the tests can be run on the number of devices available.

LAVA is the Linaro Automated Validation Architecture. Linaro is a not-for-profit building the future of the Linux kernel on ARM devices. LAVA was built for and by Linaro, so the initial focus is clearly on validating the Linux kernel on ARM devices. There are likely to be gotchas in the code for those wanting to use LAVA for other kernels – Linaro can’t support lots of different kernels, but changes which make it easier to use other kernels without impacting on validation of Linux would likely be accepted. The experience with using LAVA is all with ARM, but if there is interest in extending LAVA to work with devices of other architectures, again, patches would be welcome.

The development of the packaging to make LAVA suitable for Debian has also meant that LAVA will run on hardware other than x86, e.g. armv7.com. I’m running a mini lab at home based around an Arndale board; I’ve also got a mini lab at work based on a cubie2. Requirements for such setups are described in the documentation and in previous entries on this blog (principally you will need SATA, lots of RAM and as many CPU cores as you can find). If you want to run LAVA on other architectures, go ahead, and let us know if there are issues.

Available versions

The versions uploaded to Debian will (from now on) be production releases. The same code as is running on http://validation.linaro.org/ – development builds and test builds are supported using helpers in the lava-dev package. Releases to unstable will automatically migrate into Ubuntu Unicorn. I’ll continue building interim versions at the former locations on http://people.linaro.org/~neil.williams/, including builds for Ubuntu Trusty 14.04LTS as well as providing packages for Jessie until the uwsgi package can migrate. LAVA is looking to work with anyone who can prepare and maintain packages for other distributions like Fedora.

Bugs

LAVA is migrating from Launchpad bugs to http://bugs.linaro.org which is a bugzilla interface. Now that LAVA is also in Debian, anyone is welcome to use the Debian BTS which does not require any login or account setup. The maintainers (including me) will then forward those bugs as appropriate.
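For example, from an installed Debian system a report against the server package can be filed with the standard reportbug tool (no BTS account needed):

```shell
# File a bug against the LAVA server package via the Debian BTS;
# reportbug collects system information and prompts for the details.
reportbug lava-server
```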

Documentation

The immediate task is to update the documentation in the lava-server-doc package (and the Debian wiki) to reflect the availability of LAVA from Debian unstable and how to choose which release of LAVA to use in your systems. However, there is a large amount of documentation already available – click the Help link in the menu bar of any current LAVA instance. As with many projects, the docs have been written by the development team. If there are things which are unclear or if sections need to be moved around to make it easier for people new to LAVA to pick it up, please file bugs.

Hardware

It isn’t easy to run a LAVA lab; there is a lot more to it than simply installing lava-server. LAVA would not be possible without the lab team, and any other LAVA instance is going to need the same level of excellence in system administration, device support and cooperation. I’ve covered a little bit of that in previous entries on this blog about my home lab, but that is nothing compared to the work required to provide a working lab like the one in Cambridge. Access to that lab is restricted to people within Linaro, but LAVA is also reaching out to the community, and if there are tests or hardware you want to see within a LAVA instance, not necessarily in the main lab, then talk to us.

Contact

Bug reports are preferable but you can also find LAVA people on #linaro-lava on OFTC or contact us on the linaro-validation mailing list.

Future

There is a lot more work to do on LAVA yet. There are assumptions about partition layout within images (hysterical raisins) and issues with unused software being required for remote worker installations. Both are now part of the next major development work within LAVA.

DebConf14

There is a lot more to talk about with LAVA – if you are attending DebConf14 then there will be talks on LAVA and plenty of time to answer questions.

Categories: Elsewhere

Pixelite: Updating Drupal to use Google Analytics Universal tracking

Planet Drupal - Tue, 01/07/2014 - 06:58

So Google Analytics has a new version dubbed "Universal Analytics", which has a bunch of new features that could be handy for your website. I won't dive into exactly what they are here, as you can read about them on Google's own website.

In this post I will go through the steps to upgrade the Google Analytics 7.x-1.x module to the new 7.x-2.x version that supports Universal Analytics.

Update the Drupal module

If you read the Google Analytics module page you will spot that there are two different branches in use; to get Universal Analytics support you will need the 7.x-2.x version.

You can do this with Drush:

drush dl google_analytics-7.x-2.x
drush updb

Event tracking

If you have used custom event tracking in your website, a few changes are required.

Instead of

_gaq.push(['_trackEvent', 'category', 'action', 'opt_label', opt_value, opt_noninteraction]);

It is now

ga('send', 'event', 'category', 'action', 'opt_label', opt_value, {'nonInteraction': 1});

Handy grep command

If you want to find the offending lines of code, you can use grep

grep -nrI "_trackEvent" *

Custom variables are now dimensions and metrics

If you were using the old style custom variables, these are now completely gone, replaced with dimensions and metrics. You can read more about these on Google's website.

Instead of

_gaq.push(['_setCustomVar',
  1,               // Slot
  'Customer Type', // Name
  'Paid',          // Value
  1                // Scope (1 = User scope)
]);

It is now

ga('set', 'dimension1', 'Paid');

Drupal support of custom dimensions and metrics

The Drupal module has an active issue that allows you to configure this through the UI; unfortunately this is still only a patch at the moment, but it looks likely to be committed shortly (it may be already, if you are reading this now). For now I patched the Google Analytics module with Drush make:

; Google Analytics
projects[google_analytics][type] = module
projects[google_analytics][subdir] = contrib
projects[google_analytics][version] = 2.x
; Implement custom dimensions and custom metrics
; https://www.drupal.org/node/2136031
projects[google_analytics][patch][] = "http://www.drupal.org/files/issues/google_analytics-2136031-1-custom_dimensions.patch"

DoubleClick data

If you were using the additional data that DoubleClick integration provided, this is now supported; it is just a tickbox on the admin settings page.

To enable it

variable_set('googleanalytics_trackdoubleclick', 1);

Other new features in Universal Analytics

UserID tracking

This effectively allows you to track the same user across multiple devices. It comes in handy if your users can log in to your Drupal site, as they will likely log in from their mobile phones, tablets, etc. You can read more on Google's page about User ID tracking.

To enable it

variable_set('googleanalytics_trackuserid', 1);

Enhanced Link Attribution feature

Allows Google Analytics to differentiate URLs based on which link the user clicked on, which is really handy if you have many links pointing at the same page. You can read more on Google's page about Enhanced Link Attribution.

To enable it

variable_set('googleanalytics_tracklinkid', 1);

Finally

Run this little gem over your codebase to ensure no legacy Google Analytics code is lying around.

grep -nrI "_gaq" *
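As an aside, the variable_set() calls shown earlier can also be made from the command line; a sketch assuming a Drush release that still provides the vset shortcut:

```shell
# Equivalent to the variable_set() snippets above:
drush vset googleanalytics_trackdoubleclick 1
drush vset googleanalytics_trackuserid 1
drush vset googleanalytics_tracklinkid 1
drush cc all    # clear caches so the new settings take effect
```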

Let me know in the comments if you have any tips or tricks for the new Google Analytics.

Tags: drupal, drupalplanet, Google Analytics
Source: Drupal module for Google Analytics
Category: Tutorial
Categories: Elsewhere

Russ Allbery: Review: Ancillary Justice

Planet Debian - Tue, 01/07/2014 - 06:25

Review: Ancillary Justice, by Ann Leckie

Series: Imperial Radch #1
Publisher: Orbit
Copyright: 2013
ISBN: 0-316-24662-X
Format: Trade paperback
Pages: 416

As Ancillary Justice opens, Breq is on an icy planet trying to track down a person who has gone rather thoroughly to ground. As her search continues, the reader slowly learns the context of that search, which is only the latest step of a long and quixotic exercise in private determination. She was set on her path by events twenty years previous, told in interleaved flashbacks, that have left her greatly diminished and thrust outside all the context of her more than a thousand years of life. Breq was Justice of Toren, a starship and AI that possessed thousands of ancillaries, human bodies slaved to her mind. Now only this one remains, and nothing exists of her former life other than this goal.

It's always hard to write a review of a book that I loved this much. I want to find some way to grab the reader and shake them and say "you have to read this!" without giving away any of the delicious details. That's particularly difficult with Ancillary Justice, since one of the delights of this book is the slow unfolding of not only the plot but the background and motivations behind the plot. There is so much beneath the surface of Breq's methodical intent, and discovering all the nuances is utterly delightful. This is also a book that is very deeply concerned with identity, and which does one of the best jobs I've seen in science fiction of showing a non-human first-person protagonist: close enough to human to permit identification and comprehension, but far enough away for delightful sparks of unexpected insight or thought-provoking difference.

Breq is a ship, and AIs are central to this story, so the comparison that comes immediately to mind is with Iain M. Banks. Leckie's role for ship AIs is much different than the structure of Banks's Culture, but I think this is still an apt comparison. Like Banks, Leckie is writing large-scale space opera dominated by a meddling empire that follows some familiar human patterns but not others. Banks's Culture takes a less direct approach with its meddling; the Radch has a more Roman attitude towards preventative conquest and citizenship. But both deal with large-scale issues of politics, culture, and conquest. Both also write excellent AIs, but I think Leckie is more successful than Banks at giving her AIs inhuman properties and creating a sense of eerie alienness that sometimes fades almost entirely and sometimes comes sharply to the surface. The first-person perspective helps considerably there.

But where Ancillary Justice truly shone for me is in the interpersonal relationships, and in the insight they provided into character and motive. At the start of the book, Breq finds a drug addict dying in the cold and rescues her, bringing her along on her search for lack of a better option. Breq’s relationship with Seivarden is complex, difficult for both of them, badly asymmetric at the start, and develops into something brilliant. At the start of the book, Seivarden is easy to dislike, and Breq’s tone is refreshingly bracing and forthright. But through the course of the book Seivarden grows into something much more, in a way that I found both believable and incredibly compelling. And Leckie does this without falling into any of the typical relationship patterns, without introducing artificial romance (or, indeed, any romance at all, which is an excellent choice here), and without compromising the personalities of either character. It’s masterfully done.

One of the most amazing things about this book to me is that it's a first novel. I never would have guessed that from reading it. It's beautifully paced, the characterization is deep and compelling, and Leckie avoids any sign of the typical first-novel problem of stuffing the book with too much Stuff. Ancillary Justice is a capable and confident novel that builds a compelling world and even more compelling characters. I liked it more than I like most of the Culture novels, which is saying quite a lot, but Leckie offers much of the same scope with deeper and more personal characterization and a tighter plot.

I haven't yet remarked on one aspect of this book that every other review seems to remark on: its treatment of gender. The Radch do not recognize or care about gender distinctions, and therefore Breq struggles throughout the book with proper gender labeling in much the same way that a native English speaker tends to struggle with grammatical gender when learning a Romance language, except with more social consequences. Leckie has chosen to represent this in the novel by having Breq refer to everyone uniformly as "she." This has its pluses and minuses: it still supports a binary gender concept where a gender-neutral pronoun might not, but given all the negative reaction this book got just for using "she," a gender-neutral pronoun might be a bridge too far for a lot of readers. I thought it created a nice bit of alienation, a way of forcing the reader to look at gender markers from the outside and a way to point out how arbitrary many of them are. It was also interesting to see how surprised I was at various points in the book when it became obvious that some person Breq had been calling "she" in the first-person narration turned out, from story context, to probably be male.

That said, I think this part of the book is overblown in reviews. It's often the first thing people mention, but while it was a nice side bit of world-building, I don't think it's that central to the story. I'm particularly baffled by the handful of people who complained about it, since it's not intrusive and it quickly fades into the background apart from occasional necessary shifts of mental image. (It does create the impression of a world containing only women, but I found that a nice change from the more common impression in space opera of a world containing only men.)

Ancillary Justice has already won the Nebula and Arthur C. Clarke awards and tied for the BSFA award for best novel, and I'm happy to report that it deserves all of those. I haven't yet read all of the other Hugo nominees, but it's hard to imagine a world in which it won't top my ballot. This is a fantastic novel, by far the best thing I've read so far this year. I'm delighted that it's the first book of a trilogy, since I'm not done with either the world or the characters yet, but it stands well on its own and reaches a satisfying conclusion. I recommend it to everyone, but particularly to anyone who likes Banks, intelligent ships, or who is looking for thoughtful and complex space opera.

Followed by Ancillary Sword.

Rating: 10 out of 10

Categories: Elsewhere

Russell Coker: Blog Comments

Planet Debian - Tue, 01/07/2014 - 05:34

John Scalzi wrote an insightful post about the utility of blog comments with the way the Internet works nowadays [1]. He starts out focusing on hate comments that could reasonably be described as terrorism (death threats with the aim of preventing people writing about politics meet any reasonable definition of “terrorism”). Terrorists on the Internet are a significant problem but it’s one that doesn’t get much attention as it generally only affects people who aren’t straight-acting white men.

Blogging About Technology

One corner case that John doesn’t seem to consider is that of writing about technology. Issues related to programming often aren’t related to politics and are often testable so comments will be based on things that have been shown to work rather than stuff people invent or want to believe. I’ve received many useful and educational comments on my technical posts with little hostility. Even getting a snarky comment is rare when writing a strictly technical blog post.

The comments problem for technology blogging is spam. I’ve been using the WordPress plugin Block Spam by Math [2] (which is obsolete but still works) for years. Initially it stopped almost all spam, but now I’m getting at least 20 spam comments a day.

Extremely Popular Blogs

The comments section of a blog is sometimes described as a “conversation”. When a blog post gets comments from less than 10 people it is possible for them to have something that resembles a conversation with the author that is of benefit to other readers and doesn’t take excessive amounts of time for the author. When a blog is very popular and every post gets comments from 50+ people it’s not really possible. So a traditional blog comment section seems to work best when the blog is primarily read by a small well connected group of people who sometimes comment and some casual readers who never comment (but sometimes find value in the comments of others).

Discussions of blog comment systems usually include a reference to a post written by someone who disabled comments on their blog and found it to be a good thing; it always seems that the person who writes such a post has a large and varied audience whose comments would take a lot of time to moderate. John followed the usual form in this regard by linking to a reasonably popular SF author who would presumably have a lot of fans with good net access.

I’m not going to criticise anyone for disabling comments when their blog becomes really popular, but any advice that they have to offer about such things won’t apply to the vast majority of blogs. Due to the long-tail effect the small blogs would probably comprise the majority of all comments so in terms of the way the blog environment works I don’t think it makes much difference when the small minority of very popular blogs disable comments. The vast majority of blogs that I regularly read only have a small number of comments.

One thing that should be noted is that getting a lot of readers shouldn’t be the only measure of a successful blog. For example, some of my blog posts about SE Linux are aimed at a small audience of Linux programmers, and an even smaller number of people are qualified to comment. When I write a post that only a few dozen people can comment on with anything more than “please explain more because I don’t understand”, that doesn’t make it any less important. Sometimes the few dozen people who know a topic well need to work together to educate the few thousand who can implement the ideas for the benefit of millions of users of the software.

Disabling Comments on Contentious Posts

One interesting method John uses is to disable comments early when posting about contentious issues. It’s a general practice when running a blog to disable comments on posts after a certain period of time (3 months to 1 year seem to be common time limits for comments). This means that the moderators can concentrate on recent posts and not be bothered with spam bots hitting ancient posts as the interest in writing legitimate comments on an old post is vanishingly small. John has a practice of disabling comments after a couple of days when the comments start to lose quality.

No matter how contentious the issue is I’m not likely to get the 400+ comments that John gets. But the idea of closing comments quickly still has some merit for my blog and other blogs with less traffic.

Not Accepting Comments While Asleep

John has a practice of closing comments while he’s asleep to avoid allowing a troll to get 8 hours of viewing for a nasty comment. The most immediate down-side is that it inconveniences people who don’t want to wait 8 hours to comment and prioritises comments from people in the same time zone; this makes me think of Cory Doctorow’s novel Eastern Standard Tribe (which is available for free download and which I highly recommend) [3]. It seems that a better solution to that problem would be to have a team of moderators to watch things 24/7, which is what a lot of popular blogs that allow comments do. The WordPress capabilities model doesn’t support granting a user no special privileges other than moderating comments [4]; as WordPress is the most popular self-hosted blog software, this limits the possibilities for people moderating comments on other people’s blogs.

No variation of this would work for me. I have lots of things that require my ongoing attention and don’t want to add my blog to the list. If I have other things to work on for a few days I want to just not bother with my blog. This means that my blog needs to be able to run on autopilot for days at a time – however I do monitor my blog closely after publishing a post that is likely to attract nasty comments. One extra problem that I have is that the Android client for WordPress has problems in synchronising comments.

Using a Forum for Comments

Popular Planet installations such as Planet Debian and Planet Linux Australia syndicate more than a few blogs that have comments disabled. A forum installation for such a Planet would be useful to allow people to comment on all posts, and would also support bloggers who are thinking of disabling comments. While using a forum for blog comments has been proven to work well for Boing Boing, forums have their own issues of spam and anti-social behavior.

Debian already has a forum [5]; if a section of that was devoted to discussing blog posts from Planet Debian then it shouldn’t add much to the work of the forum administrators while providing a benefit to the community. If the Debian forum had such a section it would probably attract use from more Debian Developers; I would use that forum if it was a place to comment on blogs that don’t have a comment section, and I might also comment on other forum discussions.

It would be good if there was a forum for discussing Linux in Australia. I’m not volunteering to run it but I would help out if someone else wants to be the main sysadmin and I can offer free hosting.

Categories: Elsewhere

Drupal @ Penn State: DrupalCampPA Session proposals end July 1st (THAT’S TODAY!!)

Planet Drupal - Tue, 01/07/2014 - 04:16

In case you hadn't heard, Pittsburgh will be having its very first DrupalCamp this year. The event is a collaboration between the Drupal user group in the Pittsburgh area and the Penn State DUG. It's hosted on the University of Pittsburgh (Pitt) campus and the theme of this year's event is Bridging Higher Education & Industry.

http://drupalcamppa.org/

Categories: Elsewhere

Doug Vann: Why I support Kalabox on Kickstarter and why I think that you should too.

Planet Drupal - Tue, 01/07/2014 - 04:07

This Kickstarter campaign exists to take Kalabox 1.0 to the next level! Literally!

Let’s be honest. There are a LOT OF TOOLS out there to turn your computer into a web server and help you leverage sophisticated tools. They range in cost from free to cheap to pricey. The complexity ranges from too simple to be useful on the one end to too complex to be used on the other. Yes, there is some middle ground there, but at the end of the day you simply don’t have all the tools that the cool kids use. :-(

Now, here comes Kalabox!

KALABOX uses the tagline, “Advanced Dev tools For The People.”

I love this! I’ve always been the kind of geek who is happiest when technology makes a difference, like when introducing new technology makes humans happier and more productive! And this is exactly what Kalabox is already doing AND wants to do a whole lot more of.

The tagline is catchy, but the full definition of what Kalabox is gets me equally excited:

Kalabox is an “Integrated workflow solution for people who use Drupal.”

They're talking about US! If you’re reading this you probably use Drupal and if you’re not excited yet… Keep reading!

Here’s a bullet list of some Kalabox facts that caused me to reach for my credit card. I gleaned these from the Kickstarter page and the video you’ll find there.

  1. Both novices & pros can use it easily.
  2. Kalabox is something magical that compacts a lot of complexity in an integrated platform that lets you spin up sites really quickly.
  3. Kalabox builds a computer within your computer called a hypervisor. Launch it and you get a friendly dashboard to get things done.
  4. One click and you have a Drupal site on your computer.
  5. Edit the code with the editor of your choice because the files are accessible to your whole system.
  6. Integrates with Pantheon: look at your site list, pull one down, make updates and refresh. It’s everything you need to test code and go live in one spot.
  7. Under the hood are all the tools you would expect: Git, Xdebug, Puppet, Node.js, Vagrant, Drush, nginx, Ubuntu, SSH, Solr, APC, Webgrind, PHP, Samba, MySQL and phpMyAdmin.
  8. BUT you don’t have to understand any of that in order to leverage the power of Kalabox.
  9. Within 6 months of the launch of Kalabox 1.0 it had over 1000 downloads.
  10. Kalamuna, author of Kalabox, got tremendous feedback from a variety of users and learned valuable lessons about what teams are looking for as they collaborate on building really great websites.
  11. They want to integrate new and exciting technologies.
  12. They want to make it open source and share the love!
  13. They want to add Windows support!
  14. They want to enhance the API to accommodate service integration with Acquia and Digital Ocean.
  15. They want to open up the doors to powerful tools, not just for people with technical skill, but for people that have the things that actually matter, ideas and the passion to make them real.
  16. Kalabox provides a Node.js frontend so you can quickly spin up new Drupal sites, access utilities and tweak your environment without earning DevOps ninja-pants.
  17. They want to add Docker integration. Switching out the current underlying architecture from Vagrant/Puppet to Docker will vastly improve installation time, reduce moving parts and, more importantly, allow developers to easily and quickly swap between different underlying architectures in seconds. This means you can use your own tools with Kalabox, too!

 

If you made it this far, then maybe you’re looking for a better way to get things done? Maybe you’re looking for a tool that was built by people just like you, people who use Drupal?

Maybe you’re looking for Kalabox 2.0!

Check out their Kickstarter campaign here : https://www.kickstarter.com/projects/kalabox/kalabox-advanced-web-tools-for-the-people

Drupal Planet


Categories: Elsewhere


Subscribe to jfhovinne aggregator