FOSDEM is almost here! And in an hour or so, I'm leaving for the airport.
My talk tomorrow is about Nageru, my live video mixer: HDMI/SDI signals in, a stream that doesn't look like crap out. Or, citing the abstract for the talk:

Nageru is an M/E (mixer/effects) video mixer capable of high-quality output on modest hardware. We'll go through the fundamental goals of the project, what we can learn from the outside world, performance challenges in mixing 720p60 video on an ultraportable laptop, and how all of this translates into a design and implementation that differs significantly from existing choices in the free software world.
Saturday 17:00, Open Media devroom (H.2214). Feel free to come and ask difficult questions. :-) (I've heard there's supposed to be a live stream, but there's zero public information on details yet. And while you can still ask difficult questions while watching the stream, it's unlikely that I'll hear them.)
FOSDEM 2016 starts tomorrow and I will be attending. I've not got off to a brilliant start with my flight being cancelled, though SAS have now rebooked me onto a later flight and I'm going to arrive in time for the start tomorrow morning. Unfortunately, I am going to miss the Friday beer event.
On the Saturday, the real-time communications devroom will be happening, and I am one of the devroom admins that helped to organise this. There will be a full day of talks and demonstrations about real-time communications using open standards and free software. I'm rather excited about this.
On the Sunday, I'll be giving a talk and quick demonstration in the distributions devroom about real-time communications in free software communities and why it is useful for your community.
If you're considering setting up some real-time communications infrastructure for your community but you're not going to be along to FOSDEM, there is a great guide for getting started at rtcquickstart.org.
This year, DrupalCon North America heads to New Orleans, and I can't tell you how excited I am. Together with all the other members of our local Louisiana Drupal user group, we are truly looking forward to hosting the Drupal community as much as you all are excited to come experience everything New Orleans has to offer. For those visiting for the first time, I hope your expectations are exceeded. For those of you who are familiar with New Orleans, we welcome you back, and I hope you will be comforted with old haunts.
There is a discussion underway about adopting a release schedule for Drupal 7 similar to the one Drupal 8 is using (i.e., a six-month feature release schedule, with pseudo-semantic versioning).
This is not a major change to current policy, but it would mean that new features or other potentially disruptive patches to Drupal 7 core would be limited to two windows per year, with the corresponding releases given a round number (for example, Drupal 7.50, Drupal 7.60, etc.) to indicate their importance.
I'd like to bring the discussion to a close soon so that if we do adopt the new schedule, the first such release can be slated for April 20 (the same day as the Drupal 8.1.0 release).
If you have any feedback on this proposal, please try to comment within the next week here: https://www.drupal.org/node/2598382
Sometimes the content improves accessibility but is considered visual noise (says the designer).
So as the developer you have a lot of ways to... pat the cat(?) But not all cats react the same to being patted :D
Over the past two months or so I have become a contributor to the Debian Project. This is something that I’ve wanted to do for a while. Firstly, just because I’ve got so much out of Debian over the last five or six years—both as a day-to-day operating system and a place to learn about computing—and I wanted to contribute something back. And secondly, in following the work of Joey Hess for the past three or four years I’ve come to share various technical and social values with Debian. Of course, I’ve long valued the project of making it possible for people to run their computers entirely on Free Software, but more recently I’ve come to appreciate how Debian’s mature technical and social infrastructure makes it possible for a large number of people to work together to produce and maintain high quality packages. The end result is that the work of making a powerful software package work well with other packages on a Debian system is carried out by one person or a small team, and then as many users who want to make use of that software need only apt-get it. It’s hard to get the systems and processes to make this possible right, especially without a team being paid full-time to set it all up. Debian has managed it on the backs of volunteers. That’s something I want to be a part of.
So far, most of my efforts have been confined to packaging addons for the Emacs text editor and the Firefox web browser. Debian has common frameworks for packaging these and lots of scripts that make it pretty easy to produce new packages (I did one yesterday in about 30 minutes). It’s valuable to package these addons because there are a great many advantages for a user in obtaining them from their local Debian mirror rather than downloading them from the de facto Emacs addons repository or the Mozilla addons site. Users know that trusted Debian project volunteers have reviewed the software—I cannot yet upload my packages to the Debian archive by myself—and the whole Debian infrastructure for reporting and classifying bugs can be brought to bear. The quality assurance standards built into these processes are higher than your average addon author’s, not that I mean to suggest anything about authors of the particular addons I’ve packaged so far. And automating the installation of such addons is easy as there are all sorts of tools to automate installations of Debian systems and package sets.
I hope that I can expand my work beyond packaging Emacs and Firefox addons in the future. It’s been great, though, to build my general knowledge of the Debian toolchain and the project’s social organisation while working on something that is both relatively simple and valuable to package. Now I said at the beginning of this post that it was following the work of Joey Hess that brought me to Debian development. One thing that worries me about becoming involved in more contentious parts of the Debian project is the dysfunction that he saw in the Debian decision-making process, dysfunction which eventually led to his resignation from the project in 2014. I hope that I can avoid getting quagmired and demotivated.
It’s strange but true: seven years after the PDF reference was published as an ISO standard (ISO-32000-1), there are still developers who think that the Portable Document Format is a closed document format owned by Adobe. PDF is often perceived as a strange, impenetrable document format. It’s high time we bury that idea and take a look at what’s going on in the world of PDF today...
This will be a momentous year for Drupal.
Some people may be predicting a year of security vulnerabilities. I believe we have a year of innovation ahead of us.
Drupal's ecosystem of contributed modules is playing catch-up since the release of version 8. It has been years since CCK and Views were just experiments in the contrib module space; they are now established as a foundation of Drupal core. Contrib (the wider community of Drupal developers) can now get back to innovating.
"Get back to innovating" isn't much of a prediction. I thought about it some more and came up with three things I think are likely to happen in 2016...

CMS-as-a-Service
The era of the monolithic, does-it-all CMS is coming to an end. The Drupal community talk a lot about progressive decoupling. But the idea of a fully decoupled backend is becoming established in other areas. Services like Contentful already provide a fully decoupled, headless CMS.
Systems become decoupled, we move to a microservice architecture, and we evaluate serverless options. It is conceivable that a Content API could become part of the infrastructure. Amazon are the leading Infrastructure-as-a-Service provider. A Content API (or CMS-as-a-Service) would fit in their suite of cloud computing services.
I did an experiment recently where I put Amazon's API Gateway in front of an EC2 instance running Drupal. This gives a more robust API on top of Drupal 8's REST support. Monitoring, traffic management, and flexible security controls are standard. This approach offers several advantages, including: Swagger support; CloudFront caching for performance; and input/output translation with data models defined with JSON schema.

Acquia IPO
Going public was always on the cards for Acquia, but they have said they are in no rush to IPO. This could be the year: adoption of D8 will bring changes to the user base and continued adoption at the enterprise level.
This will be accompanied by more consolidation in the Drupal world. One of the biggest risks I see is with Drupal companies taking on bigger projects. A single client becomes a large contributor to their revenue, in some cases I've heard as much as 70%. This is a risky situation to be in if you rely on one client for the majority of your business. The solution is for Drupal companies to come together to form larger entities.
This is just a continuation of an existing trend. Wunderkraut was the most high-profile merger in the community, followed by many more involving companies such as FFW, MediaCurrent, Phase2, and i-Kos.

Composer Support and Decoupled Components
Composer support in Drupal needs some work. There are some big wins to be had by embracing the Composer (and Packagist) workflow.
Commerce Guys are leading the way with Drupal Commerce. They have been factoring out components into separate libraries. Other PHP projects beyond Drupal are making use of them, and contributing to their development. Expect more contrib projects to factor out separate PHP packages of re-usable code. Then Drupal modules become just a thin layer of glue.
There has been a trend for PHP Frameworks to decouple their core components. The Symfony Components split from the full stack framework has meant much wider adoption. They are used in many PHP projects, including Drupal.
PHP-FIG exists to promote interoperability between frameworks. This year will see further initiatives to clean up Drupal's code, such as removal of anti-patterns like the service locator. The eventual aim will be to decouple components from Drupal core.
Can you imagine using Views on a non-Drupal project?
The process of debugging can be a difficult one, and the process of troubleshooting performance even more so. Luckily there are some great tools out there to help with improving the performance of web applications. Previously I wrote about generating flamegraphs from XHProf to visualize stack traces and identify bottlenecks. Flamegraphs are very helpful, but still require you to set up XHProf, download the tools for making the flamegraph SVG, and change your code appropriately to save the XHProf data. Luckily one of my favorite tools, Blackfire, provides continuous PHP performance testing.
One second can decide whether you get a conversion or not, and this is highly critical when it comes to eCommerce. Bottlenecks can happen in your catalog, when viewing products, or even in the Drupal Commerce order's life cycle. A great example can be found in this Drupal Commerce issue: https://www.drupal.org/node/2653904. It's a patch to improve the deletion of line item references from orders, cutting down on the number of saves for orders. There's profiling done with both XHProf and Blackfire.

What is Blackfire?
What about Blackfire makes it my favorite? Blackfire provides on-demand profiling, unlike XHProf, which is always collecting data. Blackfire has a PHP extension called the Probe that collects raw performance data when requested. The Agent then sends that data to Blackfire's website for processing and viewing. The Companion is a browser extension (Chrome only, currently) that triggers the Probe to collect performance data on a page.

What you get
It speaks for itself. Here is a Blackfire profiling result from a catalog page in the Commerce Kickstart 2 demo store (Drupal without caches enabled).
As you can see, we spend a lot of time in Views. Enabling the Views cache for our catalog (and the anonymous page cache) would greatly speed up our page's load time. Here are the results with page caching and search-based Views caching enabled. From 1.88s to 55ms!

How to get started
While there is some pretty amazing documentation on installing Blackfire's components, I'll provide a quick-start guide. Well, it's a quick start if you use Docker! Blackfire provides an image for their Probe and Agent. All you need to do is link it to your php-fpm container (or whatever container serves your PHP).
If you use Docker Compose, you need to add the following configuration and add a link to this container from your PHP container. You will just need to get your API credentials from your Blackfire account. Restart your app and you're good to go!

blackfire:
  image: blackfire/blackfire
  ports: ['8707']
  environment:
    BLACKFIRE_SERVER_ID: xx-xx-xx-xx-xxxx
    BLACKFIRE_SERVER_TOKEN: dsfsd!
    BLACKFIRE_LOG_LEVEL: 4
See the full documentation for Docker integration: https://blackfire.io/docs/integrations/docker
Last year's FOSDEM featured one talk about Reproducible Builds, while this year there will be at least four:
On Saturday, there will be Reproducible and Customizable Deployments with GNU Guix by Ludovic Courtès, which I will definitely be attending!
And on Sunday there will be three talks, and I plan to attend them all: a rather general one about the reproducible ecosystem by myself, followed by ElectroBSD - Getting a reproducible BSD out of the door by Fabian Keil, and finally Reproducible builds in FreeBSD packages by Baptiste Daroussin.
The FOSDEM organizers also reached out to me for an interview about all this reproducible stuff. I hope you'll like my answers; I certainly enjoyed the questions.
But there are many more interesting talks (hundreds, they say), so I'd appreciate it if you could share your pointers and explanations, whether here on the planet, on IRC, or IRL!
Each day, more Drupal 7 modules are being migrated over to Drupal 8 and new ones are being created for the Drupal community’s latest major release. In this series, the Acquia Developer Center is profiling some of the most prominent, useful modules available for Drupal 8. This week: Scheduled Updates.
There are two exciting Drupal community events happening in Portland soon. The first is the Drupal Global Sprint Day on January 30th - this coming Saturday - which is a day focused on bringing together volunteers to contribute work such as documentation, testing, code, and design to the Drupal project. The project needs improvements from a wide variety of skill sets, and it’s a great way for new folks to contribute to Drupal. The second is Drupal Global Training Day, a free Drupal 8 training for new community members. We’re thrilled to be involved with both!

Drupal Global Sprint
We’re hosting the Portland sprint at our office. Bring your projects and come code with us! If you've wanted to contribute to Drupal 8, but don't know how to begin, we’re happy to help you get started. New contributors are encouraged to attend, as we will be providing sprint training and new contributor onboarding, so don't worry if you've never contributed to Drupal before. The sprint starts at 9:00 am and goes until 5:00 pm. Programming help, snacks, coffee, tables, and wifi will all be provided by ThinkShout.

Drupal 8 Training
February 6th is the Drupal Global Training Day. We will be leading the Portland training at the Drupal Association headquarters, and it's open to everyone. This free training is ideal for new community members and people who are new to Drupal – but PHP developers not familiar with Drupal should also find the training valuable. The training includes coffee and snacks. Participants need only bring a laptop. Everything you need to know to get started will be discussed in detail at the event. We’ll cover:
- An introduction to CMS
- File management and databases
- Site building basics with content types, fields, and views
- Installation of modules and themes
- Deploying to your web host with Git
- Introduction to Drupal 8 theming with Twig templates
- Drupal 8 configuration management
These two global Drupal events offer something for Drupal folks of all skill levels, helping us to tap into Portland’s strong Drupal community. I hope you’ll join us for either (or both!) of these great events.
pidof is a program that reports the PID of a process with the given command line. It has an option, -x, which means “scripts too”; the idea is that it will also find shell scripts. Recently there was an issue raised saying pidof was not finding a shell script. Trying it out, pidof indeed could not find the sample script but found other scripts. What was going on?

What is a script?
Seems pretty simple really: a shell script is a text file that is interpreted by a shell program. At the top of the file you have a “hash bang line”, which starts with #! followed by the name of the shell that is going to interpret the text.
When you use the -x option, pidof uses the following code:

if (task.cmd &&
    !strncmp(task.cmd, cmd_arg1base, strlen(task.cmd)) &&
    (!strcmp(program, cmd_arg1base) ||
     !strcmp(program_base, cmd_arg1) ||
     !strcmp(program, cmd_arg1)))
What this means is: match if the process comm (task.cmd) matches the basename (path stripped) of argv, and one of the following holds:
- The given name matches the basename of argv; or
- The basename of the given name matches argv; or
- The given name matches argv
Most scripts I have come across start with a line like

#!/bin/sh
which means use the normal shell interpreter (on my system, dash). What was different about the test script was that it had a first line of

#!/usr/bin/env sh
which means run the program “sh” in a new environment. Could this be the difference? The first type of script has the following procfs files:

$ cat -e /proc/30132/cmdline
/bin/sh^@/tmp/pidofx^@
$ cut -f 2 -d' ' /proc/30132/stat
(pidofx)
The first command picks up argv “/tmp/pidofx”, while the second finds comm “pidofx”. The primary match is satisfied, as well as the first dot-point, because the basename of argv is “pidofx”.
What about the script that uses env?

$ cat -e /proc/30232/cmdline
bash^@/tmp/pidofx^@
$ cut -f 2 -d' ' /proc/30232/stat
(bash)
The comm “bash” does not match the basename of argv, so this process is not found.

How many execve?
So the proc filesystem is reporting the scripts differently depending on the first line, but why? The fields change depending on what process is running and that is dependent on the execve function calls.
A typical script has a single execve call; the strace output shows:

29332 execve("./pidofx", ["./pidofx"], [/* 24 vars */]) = 0
While the env variant has a few more:

29477 execve("./pidofx", ["./pidofx"], [/* 24 vars */]) = 0
29477 execve("/usr/local/bin/sh", ["sh", "./pidofx"], [/* 24 vars */]) = -1 ENOENT (No such file or directory)
29477 execve("/usr/bin/sh", ["sh", "./pidofx"], [/* 24 vars */]) = -1 ENOENT (No such file or directory)
29477 execve("/bin/sh", ["sh", "./pidofx"], [/* 24 vars */]) = 0
The first execve is the same for both, but then env is called and goes on its merry way to find sh. After trying /usr/local/bin and /usr/bin, it finds sh in /bin and execs this program. Because there are two successful execve calls, the procfs fields are different.

What Next?
So the mystery of pidof missing scripts now has a reasonable explanation. The problem is: how to fix pidof? There doesn’t seem to be a fix that isn’t a kludge. Hard-coding potential script names seems just evil, but there doesn’t seem to be a way to differentiate between a script using env and, say, “vi ./pidofx”.
If you have some ideas, comment below or on the issue on GitLab.
As you can now see on phpMyAdmin's security page, we've managed to issue 9 security announcements with today's release. Hopefully it won't continue this badly for the rest of the year.
Anyway, receiving such an extensive report was really challenging for us - correctly tracking and fixing all reported issues, and discovering which versions are affected. This proved to be quite difficult given that most of the affected code has been refactored in the meantime. But I'm quite happy we've managed to fix all issues on three supported branches in two weeks.
Another challenge (especially for Isaac) was that this all came with a change of our release manager, so forgive us some minor problems with the releases (especially the not-updated changelogs); we will do better next time!
PS: Updated packages are on their way to Debian and phpMyAdmin PPA.
PS2: It seems we've messed up a few more things, so expect quick follow-up releases for older versions.
Recently, at a summer-school-like event, we were discussing pen-and-paper role playing. I’m not sure if this was after a session of role-playing, but I was making the point that you don’t need much (or any) of the rules, scores, and dice if you are one of the story-telling role players, and it can actually be more fun this way.
As an example, I said, it can make sense if one of the players (and the game master, I suppose) reads up a lot about one aspect of the fantasy world, e.g. one geographical area, one cult, one person, and then this knowledge is used to create an exciting puzzle, even without any opponents.
I’m not quite sure, but I think I fell asleep shortly after, and I dreamed of such a role playing session. It was going roughly like this:
I (a human) and my fellows (at least a dwarf, not sure about the rest) went to some castle. It was empty, but scary. We crossed its hall and went into a room on the other side. It was locked towards the hall by a door that covered the door frame only partly, and suddenly we could see a large ogre, together with other foul folk not worth mentioning, hammering at the door. My group (which was a bit larger at that moment) all prepared to shoot arrows at him the moment he burst through the door. I had time to appreciate the ingenuity of us all waiting for him to burst through, so that none of the arrows would bounce off the door, but it did not help, and we ran from the castle, over a field, and through a forest, at the other side of which we could see, below a steep slope, a house, so we went there.
The path towards it was filled with tracks that looked surprisingly like car tracks. When we reached the spot there was no house any more, but rather a cold campsite. We saw digging tools, and helmets (strangely, baseball helmets) arranged in a circle, as if it were a burial site.
We set up camp there and slept.
It occurred to me that I must have been the rightful owner of the castle, and that it was taken from me by my brother and his wife, who denied my existence or something treacherous like that. When we woke up at the campsite, she was there, together with what must be my niece. My sister-in-law mocked us for fighting unsuccessfully at the castle, but my niece was surprised to see me, as I must look very similar to my brother. She said that her mother forbade it, but she nevertheless sneakily took something that looked like a Game Boy with a camera attachment and a CompactFlash card out of her mother’s purse, put the card in, and took a photo of me. This is when I realized that I would get my castle back.
At that moment, I woke up. I somewhat liked the story (and it was a bit more coherent in my mind than what I then wrote down here), so I wanted to write it down. I quickly fetched my laptop. My friends at the summer school were a bit worried, and I promised not to mention their names and concrete places, and started writing. They distracted me, so I searched for a place of my own, lay down (why? no idea), and continued writing. I had to write by touch on my belly, because my laptop was not actually there.
I also noticed that I was back at the campsite, and that I was still wearing the back protector I must have worn while fighting in the castle, and which I did not take off while sleeping at the campsite. Funnily, it was not proper medieval armour, but rather my snowboarding back protector.
At that moment, I woke up. I somewhat liked the story (and it was a bit more coherent in my mind than what I then wrote down here), so I wanted to write it down. I quickly got up, started my laptop, and wrote it down. And this is what you are reading right now.
Off to bed again, let’s see what happens next.
Tor Browser, for reasons beyond this blog post, is not part of Debian. To easily and securely use it, one can run

sudo apt-get install torbrowser-launcher

and then run torbrowser-launcher, which will download Tor Browser and, well, launch it. And this sometimes breaks when things change, which is rather frequently the case…
So yesterday I finally woke up to see what I wished to see every morning since November 15th last year, when I started testing torbrowser-launcher systematically on jenkins.debian.net:
This is the graphical status overview of the torbrowser tests on Debian. These tests install torbrowser-launcher on and from sid, stretch, jessie-backports, jessie and wheezy-backports, first download torbrowser via https and via tor, then launch it to finally connect to a Debian mirror via an onion address and then to www.debian.org. In addition to these, there are also tests installing the package from stretch on jessie, as well as the package from sid on stretch and jessie. There are also tests building the package from our git branches for various suites, and finally we also build a package based on the upstream git master branch merged with our sid packaging.
There are 17 different tests currently, and they are configured in just two files: a 220-line YAML file defining the jenkins jobs and 528 lines of bash script containing the actual test code. The tests use schroot, xvfb, kvkbd and gocr, and are executed either daily or weekly (those testing the package from ftp.d.o) or on every commit and at least once every month (those testing the package built from git).
I have been looking at this page at least briefly every day since setting up these tests two months ago. Back then, torbrowser-launcher was broken in many suites, and then it broke some more; instead of fixing it, I first made these tests so that at least in future there would be automatic notifications when things break. This has worked out rather nicely, and torbrowser-launcher got fixed over time too. Doing so in jessie proper took longest, as we missed 8.2. Once jessie was fixed, the fix for wheezy-backports was also finally accepted very quickly. That was this Monday, so yesterday, on Tuesday, I could enjoy an all-"green" page for the first time!
And then on Wednesday, which is now also yesterday, the nice green page became less green due to two new issues: #805173 really needs ca-certificates as a depends and not a recommends, and the rendering of the new 5.5 version of torbrowser changed slightly… Both issues have been addressed by now.
I'm curious how this page will look tomorrow morning, and when I'll consider the number of false negatives low enough that I'll happily enable notifications being sent to the pkg privacy maintainers mailing list. I think it's almost the case, and I will keep an eye on the results in the coming weeks and months.
Last but not least: if you want to run similar tests for your Debian project, I'd be glad to help! See you at FOSDEM?
The last two months finally saw rapid progress in the development of the Search API module's Drupal 8 version. Acquia generously agreed to fund all available time for both Joris Vercammen (borisson_) and me in December and January to work on this port and related modules (especially Facets).What we did
With this backing, we were able to make a lot of headway and got a lot of large blocking issues out of the way, among them an overhaul of the fields UI, some necessary major internal refactorings, and most of the Views integration. All of this is now baked into the new Alpha 12 release, created today. Over the next couple of days, we will also create releases for the other related modules with a working D8 version: Facets, Search API Solr Search, Search API attachments (Alpha 2) and maybe also Search API pages.
That way, we should be able to avoid a confusion of versions and conflicts for any users interested in trying out the current state of this module suite, or already starting to build a new site using it.
Going forward, we will also try to keep up this system of creating a set of compatible releases for future Alpha versions.
As noted in the release notes, though, be careful when building sites already with this module version, as there will be no upgrade paths until Beta and some changes until then are still likely to break the storage structure (and would thus lead to loss of configuration, unless handled correctly). Also, this release (like all other non-stable releases for any module) will not be covered by Drupal's Security Team, so any discovered security vulnerabilities would be reported, worked on and fixed publicly.
That being said, one of the greatest improvements in the module's D8 version, at least under the hood, is its vastly improved test coverage. That, along with Drupal.org's automated testing, enables us to be very confident in each new feature we add and each bug we fix, also improving the maintainability and speed of feature development in the future. And it hopefully makes it much less likely that any major bugs go unnoticed for long.
But there are also lots of improvements visible right on the surface: we carefully reviewed all major encountered problems and pitfalls with the module's D7 version and worked to make the new D8 version another large leap forward to support as many search use cases as possible, while still becoming much more user-friendly than the D7 version – probably one of the largest points of criticism overall.
So, how does it look for the further path towards a stable D8 release for the Search API (and, subsequently, for its numerous add-on modules)?
Currently, there are no immediate plans for further funding, so while I will of course still work on the port whenever I can, the pace will necessarily slacken a bit again. I also neglected maintenance of my various D7 modules in the last months, so there's a lot of catching-up to do there as well. (Incidentally, a great way to help this effort if you are not comfortable with D8 yet: just go into any of my modules' issue queues and try to answer support requests, reproduce or fix bugs, test patches, etc.)
However, while there are still a lot of beta blockers left, most of them are relatively minor compared to the ones we now resolved, so I think a first Beta release in March should be within reach. Then it will be a matter of determining the MVP for an initial stable release and working towards that – but I expect a much shorter period for Beta than it has been for Alpha, maybe only a month or two.
Florida DrupalCamp is coming up on March 5th, and DrupalEasy is happy to be involved as a sponsor and organizer. This year's event will be better than ever, with three amazing featured speakers flying in from three different countries! Karen Stevenson, Morten DK, and Jesus Manuel Olivas will be presenting double-length sessions on the latest Drupal 8-related topics.
Media partners are an important part of any DrupalCon. They help us spread the word about the event to people who might not have heard of it otherwise, and get to attend (and report on) the events. Our media partners are critical to DrupalCon's success, so we'd like to say a big thank you to all of our partners for DrupalCon Asia.