I've spent a fair amount of time thinking about how to win back the Open Web, but in the case of digital distributors (e.g. closed aggregators like Facebook, Google, Apple, Amazon, Flipboard) superior, push-based user experiences have won the hearts and minds of end users, and enabled them to attract and retain audience in ways that individual publishers on the Open Web currently can't.
In today's world, there is a clear role for both digital distributors and Open Web publishers. Each needs the other to thrive. The Open Web provides distributors content to aggregate, curate and deliver to their users, and distributors provide the Open Web reach in return. The user benefits from this symbiosis, because it's easier to discover relevant content.
As I see it, there are two important observations. First, digital distributors have out-innovated the Open Web in terms of conveniently delivering relevant content; the usability gap between these closed distributors and the Open Web is wide, and won't be overcome without a new disruptive technology. Second, digital distributors haven't yet given individual publishers a profit motive compelling enough to divest their websites and fully embrace distributors.
However, this raises some interesting questions for the future of the web. What does the rise of digital distributors mean for the Open Web? If distributors become successful in enabling publishers to monetize their content, is there a point at which distributors create enough value for publishers to stop having their own websites? If distributors are capturing market share because of a superior user experience, is there a future technology that could disrupt them? And the ultimate question: who will win, digital distributors or the Open Web?
I see three distinct scenarios that could play out over the next few years, which I'll explore in this post.
This image summarizes different scenarios for the future of the web. Each scenario has a label in the top-left corner which I'll refer to in this blog post. A larger version of this image can be found at http://buytaert.net/sites/buytaert.net/files/images/blog/digital-distrib....

Scenario 1: Digital distributors provide commercial value to publishers (A1 → A3/B3)
Digital distributors provide publishers reach, but without tangible commercial benefits, they risk being perceived as diluting or even destroying value for publishers rather than adding it. Right now, digital distributors are in early, experimental phases of enabling publishers to monetize their content. Facebook's Instant Articles currently lets publishers retain 100 percent of the revenue from the ad inventory they sell. Flipboard, in an effort to stave off rivals like Apple News, has experimented with everything from publisher paywalls to native advertising as revenue models. Expect much more experimentation with different monetization models and dealmaking between publishers and digital distributors.
If digital distributors like Facebook succeed in delivering substantial commercial value to publishers, the publishers may fully embrace the distributor model and even divest their own websites' front ends, especially if they could make the vast majority of their revenue from Facebook rather than from their own websites. I'd be interested to see someone model out a business case for that tipping point. I can imagine a future upstart media company either divesting its website completely or starting from scratch to serve content directly to distributors (and being profitable in the process). This would be unfortunate news for the Open Web and would mean that content management systems need to focus primarily on multi-channel publishing, and less on their own presentation layer.
As we have seen in other industries, decoupling production from consumption in the supply chain can redefine industries. We also know that this introduces major risks, as it puts a lot of power and control in the hands of a few.

Scenario 2: The Open Web's disruptive innovation happens (A1 → C1/C2)
For the Open Web to win, the next disruptive innovation must focus on narrowing the usability gap with distributors. I've written about a concept called a Personal Information Broker (PIM) in a past post, which could serve as a way to responsibly use customer data to engineer similarly personal, contextually relevant experiences on the Open Web. Think of this as unbundling Facebook: you separate the personal information management system from the content aggregation and curation platform, and make the former available for everyone on the web to use. First, it would help close the user experience gap, because you could broker your personal information with every website you visit, and every website could instantly provide you a contextual experience regardless of any prior knowledge about you. Second, it would enable the creation of more distributors. I like the idea of a PIM making the era of a handful of closed distributors as short as possible. In fact, it's hard to imagine the future of the web without some sort of PIM. In a future post, I'll explore in more detail why the web needs a PIM, and what it may look like.

Scenario 3: Coexistence (A1 → A2/B1/B2)
Finally, in a third, combined scenario, neither publishers nor distributors dominate, and both continue to coexist. The Open Web both serves as a content hub for distributors and successfully uses contextualization to improve the user experience on individual websites.

Conclusion
Right now, since distributors are out-innovating the Open Web on relevance and discovery, publishers are somewhat at their mercy for traffic. However, whether there will ever be a profit motive significant enough for publishers to divest their websites completely remains to be seen. I can imagine that we'll continue in a coexistence phase for some time, since it's unreasonable to expect either the Open Web or digital distributors to fail. If we work on the next disruptive technology for the Open Web, it's possible that we can shift the pendulum in favor of “open” and narrow the usability gap that exists today. If I were to guess, I'd say that we'll see a move from A1 to B2 in the next 5 years, followed by a move from B2 to C2 over the next 5 to 10 years. Time will tell!
At Annertech, there are three things we take very seriously: website/server security, accessibility, and website load times/performance. This article will look at website performance with metrics from recent work we completed for Oxfam Ireland.
We use a suite of tools for performance testing, including ApacheBench, Yahoo's YSlow, and Google's PageSpeed Insights. Our favourite at the moment is New Relic, though it does come at a cost.
During Jacob Applebaum's talk at DebConf15, he noted that Debian should TLS-enable all services, especially the mirrors.
His reasoning was that when a high-value target downloads a security update for package foo, an adversary knows that the target is still running a vulnerable version of foo and can try to attack it before the security update has been installed.
In this specific case, though, TLS is not of much use. If the target downloads 4.7 MiB right after a 4.7 MiB security update has been released, or downloads from security.debian.org at all, it's still obvious what's happening. Even padding won't help much, as a 5 MiB download would also be suspicious. The mere act of downloading anything from the mirrors after an update has been released is reason enough to attempt an attack.
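The size-correlation argument is easy to make concrete. Here is a toy sketch (package names and sizes are made up for illustration) of how a passive observer could match an observed, encrypted transfer against the publicly known sizes in the archive:

```python
# Sketch: matching an observed (TLS-encrypted) transfer size against
# publicly known package sizes. All names and sizes are hypothetical.
known_packages = {
    "foo-security-update": 4_700_000,  # bytes
    "bar": 1_200_000,
    "baz": 9_800_000,
}

def likely_downloads(observed_bytes, tolerance=0.05):
    """Return packages whose size is within `tolerance` of the observation."""
    return [
        name for name, size in known_packages.items()
        if abs(observed_bytes - size) <= tolerance * size
    ]

print(likely_downloads(4_750_000))  # → ['foo-security-update']
```

Real traffic analysis has more noise (TLS record overhead, concurrent transfers), but with a public archive the candidate set is small enough that this kind of matching is plausible, which is exactly the point above.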
The solution is, of course, Tor.
weasel was nice enough to set up a hidden service on Debian's infrastructure; initially we agreed that he would just give me a VM and I would do the actual work, but he went the full way on his own. Thanks :) This service is not redundant: it uses a key which is stored on the local drive, the .onion address will change, and things are expected to break.
But at least this service exists now and can be used, tested, and put under some load: http://vwakviie2ienjx6t.onion/
I couldn't get apt-get to be content with a .onion in /etc/apt/sources.list and Acquire::socks::proxy "socks://127.0.0.1:9050"; in /etc/apt/apt.conf, but the torify wrapper worked like a charm. What follows is, to the best of my knowledge, the first ever download from Debian's "official" Tor-enabled mirror:

~ # apt-get install torsocks
~ # mv /etc/apt/sources.list /etc/apt/sources.list.backup
~ # echo 'deb http://vwakviie2ienjx6t.onion/debian/ unstable main non-free contrib' > /etc/apt/sources.list
~ # torify apt-get update
Get:1 http://vwakviie2ienjx6t.onion unstable InRelease [215 kB]
Get:2 http://vwakviie2ienjx6t.onion unstable/main amd64 Packages [7548 kB]
Get:3 http://vwakviie2ienjx6t.onion unstable/non-free amd64 Packages [91.9 kB]
Get:4 http://vwakviie2ienjx6t.onion unstable/contrib amd64 Packages [58.5 kB]
Get:5 http://vwakviie2ienjx6t.onion unstable/main i386 Packages [7541 kB]
Get:6 http://vwakviie2ienjx6t.onion unstable/non-free i386 Packages [85.4 kB]
Get:7 http://vwakviie2ienjx6t.onion unstable/contrib i386 Packages [58.1 kB]
Get:8 http://vwakviie2ienjx6t.onion unstable/contrib Translation-en [45.7 kB]
Get:9 http://vwakviie2ienjx6t.onion unstable/main Translation-en [5060 kB]
Get:10 http://vwakviie2ienjx6t.onion unstable/non-free Translation-en [80.8 kB]
Fetched 20.8 MB in 2min 0s (172 kB/s)
Reading package lists... Done
~ # torify apt-get install vim
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
  vim-common vim-nox vim-runtime vim-tiny
Suggested packages:
  ctags vim-doc vim-scripts cscope indent
The following packages will be upgraded:
  vim vim-common vim-nox vim-runtime vim-tiny
5 upgraded, 0 newly installed, 0 to remove and 661 not upgraded.
Need to get 0 B/7719 kB of archives.
After this operation, 2048 B disk space will be freed.
Do you want to continue? [Y/n]
Retrieving bug reports... Done
Parsing Found/Fixed information... Done
Reading changelogs... Done
(Reading database ... 316427 files and directories currently installed.)
Preparing to unpack .../vim-nox_2%3a7.4.826-1_amd64.deb ...
Unpacking vim-nox (2:7.4.826-1) over (2:7.4.712-3) ...
Preparing to unpack .../vim_2%3a7.4.826-1_amd64.deb ...
Unpacking vim (2:7.4.826-1) over (2:7.4.712-3) ...
Preparing to unpack .../vim-tiny_2%3a7.4.826-1_amd64.deb ...
Unpacking vim-tiny (2:7.4.826-1) over (2:7.4.712-3) ...
Preparing to unpack .../vim-runtime_2%3a7.4.826-1_all.deb ...
Unpacking vim-runtime (2:7.4.826-1) over (2:7.4.712-3) ...
Preparing to unpack .../vim-common_2%3a7.4.826-1_amd64.deb ...
Unpacking vim-common (2:7.4.826-1) over (2:7.4.712-3) ...
Processing triggers for man-db (126.96.36.199-5) ...
Processing triggers for mime-support (3.58) ...
Processing triggers for desktop-file-utils (0.22-1) ...
Processing triggers for hicolor-icon-theme (0.13-1) ...
Setting up vim-common (2:7.4.826-1) ...
Setting up vim-runtime (2:7.4.826-1) ...
Processing /usr/share/vim/addons/doc
Setting up vim-nox (2:7.4.826-1) ...
Setting up vim (2:7.4.826-1) ...
Setting up vim-tiny (2:7.4.826-1) ...
~ #
More services will follow. noodles, weasel, and I agreed that the project as a whole should aim to Tor-enable the complete package lifecycle, package information, and the website.
Maybe a more secure install option on the official images which, amongst other things, sets up apt, apt-listbugs, dput, reportbug, et al. to use Tor without further configuration could even be a realistic stretch goal.
I have just released a vastly improved new version of the Kobo Japanese Dictionary Enhancer. It allows you to enhance the Kobo Japanese dictionary with English translations.
The new version now provides 326,064 translated entries, which covers most non-compound words, including Hiragana ones. In my daily life reading Harry Potter and some other books in Japanese, I haven't found many untranslated words so far.
Please head over to the main page of the project for details and download instructions. If you need my help in creating the updated dictionary, please feel free to contact me.
I therefore spent some time finishing a couple of features in the editor for sources.debian.net. Here are some of the changes:
- Compare the source file with that of another version of the package
- And in order to present that: tabs! Editor tabs!
- At the same time: generated diffs are now presented in a new editor tab, from where you can download or email them
Get it for Chromium and Iceweasel.
If your browser performs automatic updates of the extensions (the default), you should soon be upgraded to version 0.1.0 or later, bringing all those changes to your browser.
Want to see more? Multi-file editing? In-browser storage of the editing session? That and more can be done, so feel free to join me and contribute to the Debian sources online editor!
Moodle is a free and open-source software learning management system written in PHP and distributed under the GNU General Public License. Moodle is used for blended learning, distance education, flipped classroom and other e-learning projects in schools, universities, workplaces and other sectors.
Our main objective was to manage all users from Drupal, i.e., to use Drupal as the front end for managing users. For this purpose, we have a Moodle plugin and a Drupal module. Drupal Services is a Moodle authorization plugin that allows for SSO between Drupal and Moodle. Moodle SSO provides the Drupal functionality required to let the Moodle learning management system share Drupal sessions.
In order to make SSO work, we need to ensure that the sites can share cookies, so the Drupal and Moodle sites should have URLs like drupal.example.com and moodle.example.com. To make the sites use a shared cookie, we need to set the value of $cookie_domain in the settings.php file on the Drupal site. In our case, the site URLs were drupal.example.com and moodle.example.com. For sub-domains of this type, the cookie_domain value can be set like this:

$cookie_domain = ".example.com";
Note: The dot before "example.com" is necessary.
Let's start with the steps that need to be followed to achieve SSO between Drupal and Moodle:
1. Moodle site
This post explains how to create a slideshow in Drupal. There are many ways and plugins available to create slideshows in Drupal, and I am going to discuss some methods that are efficient and useful.
1) Using Views slideshow module
2) Using jQuery cSlider plugin
3) Using Bootstrap carousel
1. Using Views slideshow module:
The modules required for this method are:
3) jQuery cycle plugin (download here and place it at sites/all/libraries/jquery.cycle/)
Enable the added modules. To create a views slideshow, create a new content type, for instance "Slideshow", with an image field to be used as the slideshow image.
Add multiple slideshow nodes with images. Then create a view block listing the slideshow content. Select "Slideshow" as the format and configure the transition effect via the Settings link.
After saving this view, place the view block in the necessary region at admin/structure/blocks.
2. Using jQuery cSlider plugin:
1) You can download this plugin from here. There is also a demo file in this plugin which can be used as a reference.
One of our key values at the Drupal Association is communication:
We value communication. We seek community participation. We are open and transparent.
One of the ways that we try to live this value is by making our numbers -- both operating targets and financials -- available to the public. The monthly board reports share basic financial numbers and all our operational metrics. Once a quarter, we release full financial reports for the previous quarter. You can access all this information at any time on the Association web site.
At the close of each year, we take the opportunity to have our financials reviewed (and sometimes audited). The review process ensures that we've represented our financials properly. This work takes some time. Though our fiscal year closes on 31 December, it takes six to eight weeks to get the final bits and pieces handled in our financial systems. The independent review or audit adds another 8+ weeks to the process of closing out our year. Then we have to review the findings with the Finance Committee and the full Board before we share them publicly. That's why it's August and we're just now sharing the 2014 reviewed financial statements with you.
In 2014 we also began tracking our progress towards several operational goals for the first time. Though we share those numbers every month in the board report, we pulled some of our favorite stats and stories together into an Annual Report to share the work that our financials represent.

What happened in 2014?
2014 was an investment year. Per our Leadership Plan and Budget for the year, our key focus was building an engineering team to first address technical debt on Drupal.org and then take on actual improvements to the site. We purposely built a budget that anticipated a deficit spend in order to fully fund the team. The intent was to also build some new revenue programs (like Drupal Jobs) that would ramp up and eventually allow us to fund the new staff without deficit spending. And that's what we did. We went from two full time staff focused on Drupal.org to ten.
The investment has been paying off. We spent a lot of 2014 playing catch up with technical debt, but also managed to improve site performance markedly while also increasing the portability of our infrastructure. On top of that, staff worked with community volunteers to release new features related to commit messages, profiles, and Drupal 8 release blockers. Most importantly, staff and the working groups prioritized upcoming work and published a strategic roadmap for improvements to Drupal.org.
We held two huge DrupalCons, one in Austin and one in Amsterdam, and planned for a third. Our very small team of events staff and a crew of remarkable volunteers hosted over 5,500 people across our two events, all while planning our first Con in Latin America. We had some stumbling blocks and learning opportunities, and have been applying what we learned to the three 2015 DrupalCons.
We launched Drupal Jobs. This was something the community asked for very clearly when we conducted a 2013 study. We’ve seen steady growth in usage since our quiet launch and will continue to refine the site, including our pricing models, so that it is accessible to Drupalers around the world.
We diversified our revenue streams. DrupalCons used to be 100% of our funding. Not only is this a risky business strategy, it puts undue pressure on the Cons to perform financially, leaving us little room to experiment or make decisions that may be right for attendees, but could negatively impact the bottom line. As we increase the funding sources for the Association, we can make more and different choices for these flagship programs and also grow new programs with the community.
We introduced branded content including white papers, infographics, and videos. These materials have been widely used by the community and have helped us better understand the Drupal.org audience. You can see a lot of this work on the Drupal 8 landing pages, where the key content pieces were downloaded thousands of times in 2014.
We released new vision, mission, and values statements for the Association. These tools are really useful in defining the focus of the organization and helping to guide how we get our work done. Working in a community of this size and diversity is extremely challenging. There is no choice we can make that will include everyone’s ideals, but our values help us make those decisions in a way that allows for transparency and open dialogue with the community. It’s something that we try to honor every day.

What about money in 2014?
As anticipated, we ran a deficit in 2014. However, we did manage to grow our overall revenue by about 6% from 2013 to 2014. This trend has continued into 2015, though not at the rate we had hoped. Still, we are now on track to support the investment we made in 2014 into the future. Another key win in 2014 is that we grew non-DrupalCon revenue to 21% of our total revenue. Diversifying our revenue streams reduces our financial risk and takes the pressure off of Cons, allowing us to experiment more.

I want all the details
Excellent! You can check out:
Even though the week of DebCamp took its toll and the stress level will not go down any time soon...
...DebConf15 has finally started! :)
Even though Debian moved to systemd as the default quite a while ago, I've stayed with sysv as I have somewhat custom setups (self-built trimmed-down kernels, separate /usr not pre-mounted by the initrd, etc.).
After installing a new system with Jessie and playing a bit with systemd on it a couple of months ago, I decided it was finally time to upgrade. Easier said than done ☹.
The first system I upgraded was a recent (~1 year old) install. It was a trimmed-down system with Debian's kernel, so everything went smoothly. So smoothly that I soon forgot I made the change, and didn't do any more switches for a while.
Systemd was therefore out of my mind until this past Friday, when I got a bug report about mt's rcS init script and shipping a proper systemd unit. The first step should be to actually start using systemd myself, so I said - let's convert some more things!
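Converting an rcS-style script usually boils down to a small unit file. A rough sketch, where the service name, paths and Type= are hypothetical and not mt's actual unit:

```ini
# /etc/systemd/system/example.service -- hypothetical sketch, not mt's real unit
[Unit]
Description=Example daemon converted from an old rcS init script
After=network.target

[Service]
# Assumes the daemon forks and writes a PID file; adjust Type= if it stays
# in the foreground (Type=simple needs no PIDFile at all).
Type=forking
ExecStart=/usr/sbin/exampled --daemon
PIDFile=/run/exampled.pid

[Install]
WantedBy=multi-user.target
```

After dropping the file in place, `systemctl daemon-reload` followed by `systemctl enable example.service` replaces the old update-rc.d dance.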
During the weekend I upgraded one system, still a reasonably small install, but older - probably 6-7 years. The first reboot into systemd flagged the fact that I had some force-loaded modules which no longer exist, a fact that was all too easy to ignore with sysv. Nice! The only downside was that there seems to be some race condition involving ntp, as it fails to start on boot (port listen conflict). I'll see if it repeats. Another small issue is that systemd doesn't like duplicate fstab entries (i.e. two devices which both refer to the same mount point), while this works fine for mount itself (when specifying the block device).
I said that after that system, I'd wait a while before upgrading the next. But it so happened that today another system had an issue and I had to reboot it (damn lost uptimes!). The kernel was old, so I booted into a newer one (this time compiled with the required systemd options), and I had a thought - what if I take the opportunity and also switch to systemd on this system?
Caution said to wait, since this was the oldest system - installed sometime during or before 2004. Plus it doesn't use an initrd (long story), and it has a split /usr. Caution… excitement… caution lost ☺ and I proceeded.
It turns out that systemd does warn about split /usr but itself has no problems. I learned that I also had very old sysfs entries that no longer exist, and which I didn't know about as sysv doesn't make it obvious. I also had a crypttab entry which was obsolete, and I forgot about it, until I met the nice red moving ASCII bar which—fortunately—had a timeout.
To be honest, I believed I'd have to rescue-boot and fix things on this "always-unstable" machine, on which I install and run random things, and which has a hackish /etc/fstab setup. I'm quite surprised it just worked. On unstable.
So thanks a lot to the Debian systemd team. It was much simpler than I thought, and now, on to exploring systemd!
P.S.: the sad part is that usually I'm a strong proponent of declarative configuration, but for some reason I was reluctant to migrate to systemd, partly on account of losing the "power" of shell scripts. Humans…
Periodically, there is a complaint that PHP conferences are just "the same old faces". That the PHP community is insular and is just a good ol' boys club, elitist, and so forth.
It's not the first community I've been part of that has had such accusations made against it, so rather than engage in such debates I figured, let's do what any good scientist would do: look at the data!
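The core of such an analysis is simple: gather speaker lineups across events and count repeat appearances. A minimal sketch, with entirely made-up sample data standing in for real conference lineups:

```python
from collections import Counter

# Hypothetical sample data: speaker lineups from several conference events.
lineups = {
    "confA-2013": ["alice", "bob", "carol"],
    "confA-2014": ["alice", "dave", "erin"],
    "confB-2014": ["bob", "alice", "frank"],
}

# Count how many events each speaker appeared at.
appearances = Counter(s for speakers in lineups.values() for s in speakers)

# "Same old faces" would show up as a large share of repeat speakers.
repeat_speakers = [s for s, n in appearances.items() if n > 1]
repeat_share = len(repeat_speakers) / len(appearances)

print(sorted(repeat_speakers))   # → ['alice', 'bob']
print(round(repeat_share, 2))    # → 0.33
```

With real scraped lineups, the interesting number is how that repeat share compares to the size of the overall speaker pool over time.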
I sent this email to debian-private a few days ago, on the 10th anniversary of my Debian account creation:

Date: Fri, 14 Aug 2015 19:37:20 +0200
From: David Moreno
To: firstname.lastname@example.org
Subject: Retiring from Debian
User-Agent: Mutt/1.5.23 (2014-03-12)

[-- PGP output follows (current time: Sun 23 Aug 2015 06:18:36 PM CEST) --]
gpg: Signature made Fri 14 Aug 2015 07:37:20 PM CEST using RSA key ID 4DADEC2F
gpg: Good signature from "David Moreno "
gpg: aka "David Moreno "
gpg: aka "David Moreno (1984-08-08) "
[-- End of PGP output --]

[-- The following data is signed --]

Hi,

Ten years ago today (2005-08-14) my account was created:
https://nm.debian.org/public/person/damog

Today, I don't feel like Debian represents me and neither do I represent the project anymore. I had tried over the last couple of years to retake my involvement but lack of motivation and time always got on the way, so the right thing to do for me is to officially retire and gtfo.

I certainly learned a bunch from dozens of Debian people over these many years, and I'm nothing but grateful with all of them; I will for sure carry the project close to my heart — as I carry it with the Debian swirl I still have tattooed on my back ;)

http://damog.net/blog/2005/06/29/debian-tattoo/

I have three packages left that have not been updated in forever and you can consider orphaned now: gcolor2, libperl6-say-perl and libxml-treepp-perl.

With all best wishes,
David Moreno.
http://damog.net/

[-- End of signed data --]
I received a couple of questions about my decision here. I basically don’t feel like Debian represents my interests and neither do I represent the project – this doesn’t mean I don’t believe in free software; on the contrary, I think some of the best software advancements we’ve made as a society are thanks to it. I don’t necessarily agree with how the project has evolved, whether it has gone the right way to regain relevancy and dominance, or whether it has remained primarily a vehicle for dogmatism over pragmatism. This is the perfect example of a tragic consequence. I was very happy to learn that the current Debian Conference being held in Germany got the highest attendance ever; hopefully that can be utilized in a significant and useful way.
Regardless, my contributions to Debian were never noteworthy so it’s also not that big of a deal. I just need to close cycles myself and move forward, and the ten year anniversary looked like a significant mark for that.
Poke me in case you wanna discuss some more. I’ll always be happy to. Especially over beer :)
Over the last few weeks, we've been working really hard on preparing training for Drupal 8.
We'll release the training close to the actual launch date of Drupal 8, but before then we have a great video to get you started.
This is a webinar we did with Acquia called "10 Things Site Builders Need to Know Before Leaping to Drupal 8".
I talked about the many user-friendly features in Drupal 8, including the mobile-friendly admin interface and the in-place WYSIWYG editor, plus improvements in theming and module development:
As web sites and applications have become more complex, the need for auditing – at multiple points in the lifecycle of a project – has become ever more important.
Before delivery, a web project can be audited to ensure the ability to meet business goals or compliance with regulations. After delivery, an audit can identify problems and propose remedies. In a possible merger or acquisition, an audit can help evaluate the project’s relative benefits and liabilities.
Website auditing has become similar to financial auditing (which is separate and distinct from accounting and financial activities). It is similar to the practices applied in auditing management systems (see “There’s a Module Standard for That” sidebar).
Website auditors must apply these four principles:
- Judgment: They must be able to choose the scope and granularity of the audit, without wasting effort on discovering problems with no meaningful impact on the behavior and performance of the site; hence the need for business acumen.
- Expertise: In order to determine whether or not best practices were followed by the original site developers, auditors must achieve a level of proficiency beyond that with which the site was delivered.
- Objectivity: Auditors cannot audit a site they themselves produced, or else they risk selective blindness – the inability to see problems they missed the first time around.
- Distance: Auditors cannot operate on a website developed by a company – especially their own – with which they have any kind of commercial or personal involvement.
Market studies show that site audits are often used as a loss leader by generalist Drupal agencies. Their objective: to set the stage for redevelopment and third-party maintenance work, where the main volume of business lies, by using “findings” from a short, low-cost audit to give the developer a technical advantage over competitors.
It’s easy to add node endpoints to your RESTful API - but there’s more to Drupal than nodes. This week we’ll add an endpoint for a taxonomy vocabulary.
Not every student learns the same way, so teachers consistently have to find a way to instruct a classroom while also reaching students individually.
We were reminded of a teacher’s greatest challenge when we trained dozens of Acquia employees on Drupal 8. As my co-author Kent Gale and I detailed earlier in this series on Drupal 8 instruction, we separated employees into groups of two, with one person having some knowledge of the new platform and the other having no knowledge. Once their instruction ended, they split up and each teamed with two other employees – our version of chromosomal mitosis.
Our approach to training was structured. We had goals to achieve. But we also had to stay flexible throughout. Because experience, knowledge, and skill set differed with each employee, we had to connect with them individually while maintaining the larger class structure.
We had people with deep programming experience. We had middleware folks. We had site builders. We had front-enders. Because of that, the training program had to present a lot of material, but not so much that individuals wouldn’t learn. We trained with the expectation that not everyone would, or even needed to, become experts.
Consider the training of our “explainers,” the employees who explain our products to the public. We had to figure out what they could easily learn in only one to four hours of training. They needed to know enough to promote and answer questions about Drupal 8, but didn’t need to know as much as a support person, who received anywhere from 40 to 80 hours of training. Figuring out what the explainers needed to learn took some effort, but there was ample material to help us determine which path to follow.
Speaking of paths, your team doesn’t have to follow ours. Mitosis worked great for us, but it may pose a problem for your program if you have fewer employees, less time to train, or other considerations.
You need to find out what works best and that, as we’ve mentioned, takes time and effort, success and failure. Some employees like to be lone wolves and learn everything on their own, for example, so our process may not work for them.
Tools that track progress will help you ascertain what works and what doesn’t. Every company, no matter how large or small, faces time constraints, so these tools will guide you through the unknowns.
We used training as a key performance indicator (KPI) for employees. Shared ownership in this big Drupal 8 training project made sense if we all had to make a big leap together in understanding the new platform. Sometimes employees will sweep training under the rug because they believe putting out other fires is a priority.
We knew learning Drupal 8 would be a significant commitment; it’s a significant change, after all. But we couldn’t delay training. Drupal 8 was coming out and there was no time for delay. KPIs helped motivate and get everyone on the same page. There was a vested interest in making progress.

Blog series: Organizing to Rock with Drupal 8
Author: Thomas Howell
There are a bunch of things I mean to blog about, but as I have just arrived home from Heidelberg and DebConf15 this afternoon, that seems the most appropriate place to start. It’s a bit of a set of disjointed thoughts, but I figure I should write them down while they’re in my head.
DebConf is an interesting conference. It’s the best opportunity the Debian project has every year to come together and actually spend a decent amount of time with each other. As a result it’s a fairly full-on experience, with lots of planned talks as a basis and a wide range of technical discussions and general social interaction filling in whatever gaps are available. I always find it a thoroughly enjoyable experience, but equally I’m glad to be home and doing delightfully dull things like washing my clothes and buying fresh milk.
I have always been of the opinion that the key aspect of DebConf is the face time. It was thus great to see so many people there - we were told several times that this was the largest DebConf so far (~ 570 people IIRC). That’s good in the sense that it meant I got to speak to a lot of people (both old friends and new), but does mean that there are various people I know I didn’t spend enough, or in some cases any, time with. My apologies, but I think many of us were in the same situation. I don’t feel it made the conference any less productive for me - I managed to get a bunch of hacking done, discuss a number of open questions in person with various people and get pulled into various interesting discussions I hadn’t expected. In short, a typical DebConf.
Also I’d like to say that the venue worked out really well. I’ll admit I was dubious when I heard it was in a hostel, but it was well located (about a 30 minute walk into town, and a reasonable bus service available from just outside the door), self-contained with decent facilities (I’m a big believer in having DebConf talks + accommodation be as close as possible to each other) and the room was much better than expected (well, aside from the snoring but I can’t blame the DebConf organisers for that).
One of the surprising and interesting things for me, different from previous DebConfs, was the opportunity to have more conversations with a legal leaning. I expect to go to DebConf and do OpenPGP/general crypto related bits. What I wasn’t expecting was the affirmation of the things I have learnt on my course over the past year, and the feeling that I could use that knowledge in the process of helping Debian. It provided me with some hope that I’ll be able to tie my technology and law skills together in a way that I will find suitably entertaining (as did various conversations where people expressed significant interest in the crossover).
Next year is in Cape Town, South Africa. It’s a long way (though I suppose no worse than Portland and I get to stay in the same time zone), and a quick look at flights indicates they’re quite expensive at the moment. The bid presentation did look pretty good though so as soon as the dates are confirmed (I believe this will happen as soon as there are signed contracts in place) I’ll take another look at flights.
In short, excellent DebConf, thanks to the organisers, lovely to see everyone I managed to speak to, apologies to those of you I didn’t manage to speak to. Hopefully see you in Cape Town next year.
A pure maintenance release, 0.1.3, of the RcppDE package arrived on CRAN yesterday. RcppDE is a "port" to C++ of DEoptim, a popular package for derivative-free global optimisation via differential evolution. By using RcppArmadillo, the code becomes a lot shorter and more legible.
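For readers unfamiliar with the underlying method, here is a minimal sketch of the classic DE/rand/1/bin differential evolution scheme that DEoptim and RcppDE implement. This is an illustrative toy in Python, not the package's actual code; the function name and parameter defaults are my own choices.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, iters=200, seed=42):
    """Minimise f over the box `bounds` with the DE/rand/1/bin scheme (toy sketch)."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialise the population uniformly within the bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # Pick three distinct members other than the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation: base vector plus a scaled difference vector.
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover; jrand forces at least one gene from the mutant.
            jrand = rng.randrange(dim)
            trial = [mutant[d] if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            # Clip to the bounds, then select greedily against the target.
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            s = f(trial)
            if s <= scores[i]:
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Minimise the 2-D sphere function, whose optimum is at the origin.
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 2)
```

Because the inner loops are pure element-wise arithmetic over small vectors, they translate naturally to Armadillo matrix operations, which is where the brevity gain in RcppDE comes from.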
At the Debian Conference 2015 I gave a talk about Continuous Delivery of Debian packages. My slides are available online (PDF, 753KB). Thanks to the fantastic video team there’s also a recording of the talk available: WebM (471MB) and on YouTube.