Elsewhere

Xeno Media: Posting to Slack, Publishing in Drupal

Planet Drupal - Thu, 25/08/2016 - 04:01
How Zoomdata employees share insights into company life

Xeno Media is pleased to announce our latest Drupal 7 contrib module, Slack to Drupal. The module imports pictures uploaded to Slack into Drupal 7 sites, allowing a community of users to add content to a site while managing their daily business collaboration through the Slack app.

Zoomdata, which makes visual analytics software for big data, tasked us with building a solution that lets employees submit images to the public website, sharing the company's unique, engaging culture to aid marketing and recruiting.

Various source platforms, including Instagram, Flickr, and Twitter, were originally considered. As we surveyed Zoomdata employees, though, we realized that Slack was the ideal source. Slack is fundamental to Zoomdata's work culture; its 200 employees and contractors throughout North America and Europe collaborate actively on Slack on an ongoing basis. Leveraging Slack as the source platform would allow employees to submit images in real time without breaking their typical work and collaboration workflows.

With that settled, we started researching how to integrate. Our developers researched Slack's API and proposed two approaches: 1) create a Slack "bot", a virtual user that our human users could interface with, or 2) integrate with a specific Slack channel. We elected the latter: we could more efficiently access the files in a specific channel, and Zoomdata appreciated having a single destination channel for users rather than clogging other channels with off-topic bot chatter.

With the Slack side figured out, we worked on the Drupal development. We are supporters of the Drupal Media initiative, and decided to integrate the Drupal Media 7.x-2.0 File Entity module, as we do on many of our client sites. The File Entity module creates an entity, much like a node, for each file in the system. This allows us to add fields such as Caption, Approval, Date, and Uploader. It also allows us to use and reuse the entities on other pieces of content and to create views of them. We called this new entity Slack Image.
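As a rough sketch of the import side of such an integration, the snippet below polls Slack's files.list Web API method for a channel and saves each returned image as a managed Drupal 7 file. This is a hypothetical illustration of the general approach, not the Slack to Drupal module's actual code: the module name, variable names and destination directory are placeholders.

<?php

/**
 * Implements hook_cron().
 *
 * Hypothetical sketch: poll one Slack channel for newly uploaded images and
 * save each one as a managed Drupal 7 file. Module, variable and directory
 * names are placeholders, not the Slack to Drupal source.
 */
function mymodule_cron() {
  $token = variable_get('mymodule_slack_token', '');
  $channel = variable_get('mymodule_slack_channel', '');
  $since = variable_get('mymodule_slack_last_run', 0);

  // Slack Web API: list image files uploaded to the channel since the last run.
  $url = 'https://slack.com/api/files.list?' . drupal_http_build_query(array(
    'token' => $token,
    'channel' => $channel,
    'ts_from' => $since,
    'types' => 'images',
  ));
  $response = drupal_http_request($url);
  if ($response->code != 200) {
    return;
  }
  $result = drupal_json_decode($response->data);
  if (empty($result['ok']) || empty($result['files'])) {
    return;
  }

  $directory = 'public://slack';
  file_prepare_directory($directory, FILE_CREATE_DIRECTORY);

  foreach ($result['files'] as $slack_file) {
    // Private Slack file URLs require the token in an Authorization header.
    $download = drupal_http_request($slack_file['url_private'], array(
      'headers' => array('Authorization' => 'Bearer ' . $token),
    ));
    if ($download->code == 200) {
      // Saves a managed file; fields such as Caption, Uploader and Approval
      // would then be set on the resulting file entity.
      file_save_data($download->data, $directory . '/' . $slack_file['name'], FILE_EXISTS_RENAME);
    }
  }
  variable_set('mymodule_slack_last_run', REQUEST_TIME);
}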

We also created an administration screen where an administrator can approve or disapprove images.  If images are disapproved, they are removed from the system and not imported again.  If approved, they are available where all the other File Entities are available.

For the Zoomdata public site, we created a view of the new Slack images that appears on their Careers page in a beautiful, modern, and responsive layout using Masonry Views, Colorbox, and GD infinite scroll plugin modules.

Our employees are always posting photos in Slack. I really wanted to share those photos with our customers, partners, prospective employees and vendors so they could get a view inside Zoomdata and know what a great team of people they’re partnering with. Jim, and the team at Xeno Media, made it possible by creating a fantastic Drupal website for us, and by developing Slack to Drupal.

Robyn Forman, Zoomdata’s VP of Digital Marketing.

Results so far have been very positive, with more than half of the company joining the channel and submissions coming from every office and department. Through Slack to Drupal, employees from throughout the organization have shown what an engaged, fun, and cutting-edge culture Zoomdata really has.


Roy Scholten: New process, new results

Planet Drupal - Thu, 25/08/2016 - 01:40

We’re probably misusing the term MVP when we try to frame what we would like to see make it into core. But the actual mode of working we use there is quite an achievement. We used to grind it out, with proposed changes discussed endlessly and a high risk of nothing being committed at all in the end. What we’re doing now is: agree up front that it’s a good idea to improve feature X or rework interface Y, and then focus on keeping the scope as small as possible.

Yes, I, J and K are also good ideas, but we’re trying to do X here, and while these are all related ideas that together would likely make for a nicer whole, we should really focus on shipping X, and X alone, before turning our attention to I, J and K. If at all, because while shiny, interface Y actually presents people with more problems, so maybe we should focus on that instead. Though it’s never that strongly a case of either/or, and we should definitely not stop iterating after the initial commit.

This is a very new and different way of working. Deliberately lowering our standards for the goal of introducing change. This is uncomfortable at times, but even that is good, because it means we’re stretching ourselves, which means we’re doing and learning new things. I’m excited and proud to see this happen. More like this.

Doing it like this means that Drupal 8.2:

  • Has content moderation tools (draft! review! publish! etc.)
  • Provides a new way to add new elements (blocks) to the page you’re on, without having to go to some far away corner in the admin section
  • Those elements (blocks! menus! logo & site name! etc.) can then also be configured in the context of the user facing page. A side tray will show up and expose the relevant settings.

Looking forward to learning how these additions will be received and how we can improve them. In the meantime, let’s add more useful and usable things to 8.3 (sample content! media handling! better dates! etc.).

Tags: drupal, ux, drupalplanet. Subtitle: This is a pretty radical change

Cocomore: 10 Do’s & Don’ts for Facebook Pages: This is what businesses should keep in mind

Planet Drupal - Thu, 25/08/2016 - 00:00

The digital point of contact, the electronic business card, the online meet-up for fans: a Facebook business page serves many functions. For this reason it’s important to know how to use it correctly. Here are 10 tips on how to work it.


DrupalCon News: An Insider's Guide to Visiting Dublin

Planet Drupal - Wed, 24/08/2016 - 23:54

Thinking of coming to DrupalCon Dublin this year? Why not extend your trip by a few days and stay a bit longer to take in some of the fabulous things you can go do and see in Dublin?

Here's our recommended list of things to do and see while here:

1. Guinness Storehouse


Drupal core announcements: Coding standards ratified changes and ongoing proposals

Planet Drupal - Wed, 24/08/2016 - 22:29

The TWG coding standards committee is announcing two coding standards changes for final discussion. These appear to have reached a point close enough to consensus for final completion. The new process for proposing and ratifying changes is documented on the coding standards project page.

Official coding standards updates now ratified:

Issues awaiting core approval:

Issues that just need a little TLC (you can help!):

These proposals will be re-evaluated during the next coding standards meeting currently scheduled for August 30th. At that point the discussion may be extended, or if clear consensus has been reached one or more policies may be dismissed or ratified and moved to the next step in the process.


FFW Agency: The ABC's of Drupal: Dev Ops, Display and Distribution

Planet Drupal - Wed, 24/08/2016 - 20:43
By Ray Saltini, Wed, 08/24/2016 - 18:43

For anyone who's ever looked up a definition of a Drupal term and been left wondering what it all means, here are some practical real world explanations you can use to navigate the Drupalverse. Watch this space and use comments to send us your feedback and requests.

The Discipline of Dev Ops

Dev Ops, short for Development and Operations, is the intersection between IT-managed hosting support and development. While it is a specialization in many organizations, senior developers, tech leads, and architects should be conversant in the systems and tools used by your IT team or provider.

One of the primary goals of Dev Ops is to create standardized operating system iterations that are consistently reliable and easily replicable. Your unique infrastructure or hosting service plays a big role in these systems, which is why they tend to be customized to each project.

Standardized Dev Ops systems are used to create local and remote development environments, as well as staging and production environments, which all function in the same way. Having good Dev Ops systems in place means that your organization can support continuous development practices like version control and automated testing.

For any site that’s even moderately complex, having Dev Ops standards is huge. You don’t have to try to become a Dev Ops genius yourself: instead, you can find an organization like FFW to provide the level of Dev Ops help and support that is appropriate for the size and scope of your project.

Defining a Display

Displays, unlike Dev Ops, are a little simpler. A Display in Drupal typically refers to how queried data is organized and shown to visitors. It is usually used in connection with a native database query referred to as a View.

One View (or database query) can have several Displays, each presenting the result set in a different way. For instance, a set of queried data can be output as:

  • a sortable table
  • a grid
  • consecutive field sets
  • a rotating banner
  • a calendar or list of upcoming events
  • points on a map

… and these are just a few examples of the many different kinds of Displays.
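As a quick Drupal 7 sketch of the idea, the snippet below renders two different Displays of the same hypothetical View. The machine names "events", "page_1" and "block_1" are invented for illustration; views_embed_view() is the standard helper the Views module provides for embedding a display from code.

<?php

// Render the page display of the "events" view, e.g. a sortable table.
print views_embed_view('events', 'page_1');

// Render the same underlying query through a different display, e.g. a grid,
// optionally passing a contextual filter value.
print views_embed_view('events', 'block_1', 'conferences');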

The Details Around Distributions

A Distribution is a pre-developed assembly of database data, code, and files. Distributions commonly include saved content, configuration settings, Drupal core, contributed and custom modules, libraries, and a custom theme. It’s basically a pre-built Drupal site.

Most people first become acquainted with Distributions as different iterations of Drupal that are built for specific use cases or verticals, such as e-commerce or publishing. Many distributions are robust, production-ready applications that will save you tremendous work. They let you take advantage of the distribution sponsor’s subject matter expertise.

There are other kinds of distributions, such as ones developed mainly for marketing purposes to showcase what Drupal can do and how Drupal can be used. Both of these types of distributions have value, but it is important to differentiate between the two.

Distributions can be vetted in much the same way that a Drupal module or theme can be vetted. When evaluating a Distribution, I always like to ask the following questions:

  • Who are the contributors?
  • What is their experience?
  • Is the project actively maintained and are new features or versions planned?

The other primary consideration when vetting a Distribution is how much complexity and effort is required to ‘unravel’ it. Many organizations have found that the more fully realized distributions are difficult to customize around their specific workflows, and are therefore more expensive to adapt than starting fresh with a more basic Drupal installation.

If you want to know more about Distributions, I recommend looking at Drupal’s distribution project pages and this documentation page.


Drupal.org blog: Upcoming Changes to the Front Page

Planet Drupal - Wed, 24/08/2016 - 20:22

In recent weeks we've been making several small changes to Drupal.org: precursors to bigger things to come. First, we moved the user activity links to a user menu in the header. Next, we're moving the search function from the header to the top navigation. These changes aren't just to recover precious pixels so you can better enjoy those extra long issue summaries—these are the first step towards a new front page on Drupal.org.

As the Drupal 8 life-cycle has moved from development, to release, to adoption, we have adapted Drupal.org to support the needs of the project in the moment. And today, the need of the moment is to support the adoption journey.

As we make these changes you'll see echoes of the visual style we used when promoting the release of Drupal 8.

  • The Drupal wordmark region will help to define Drupal, and promote trying a demo.

  • A ribbon will promote contextual CTAs like learning more about Drupal 8.

  • The news feed will be tweaked.

  • DrupalCon will have a permanent home on the front page.

  • Community stats and featured case studies will be carried over (but may evolve).

  • The home page sponsorship format may change.

  • We'll be phasing in a new font throughout the site: Ubuntu - which you've already seen featured in the new Documentation section.

Here's a teaser: a sneak preview of some of the new page elements and styles you'll see on the new home page.

Our first deployment will introduce the new layout and styles. Additional changes will follow as we introduce content to support our turn towards the adoption journey. Drupal evaluators beginning their adoption journey want to know who uses Drupal, and what business needs Drupal can solve. We will begin promoting specific success stories: solutions built in Drupal to meet a concrete need.

What's next?

We're continuing to refine our content model and editorial workflow for the new front page. You'll see updates in the Drupal.org change notifications as we get closer to deployment.

Wondering why we're making these changes now? This turn towards the adoption journey is part of our changing priorities for the next 12 months.


Chocolate Lily: Announcing Drutopia

Planet Drupal - Wed, 24/08/2016 - 19:46

Drutopia is an initiative within the Drupal project that prioritizes putting the best online tools into the hands of grassroots groups. By embracing the liberatory possibilities of free software and supporting people-centred economic models, Drutopia aims to revolutionize the way we work and cooperate.

Drutopia is at once an ethos of Drupal development and a fresh take on Drupal distributions for users to build upon, all based on a governance model that gives users a large role in the direction of the project.

Core values of the Drutopia initiative include:

  • Be inclusive regarding gender, gender identity, sexual orientation, ethnicity, ability, age, religion, geography and class.
  • Commit to protection of personal information and privacy and freedom from surveillance.
  • Put collaboration and cooperation above competition.
  • Prioritize human needs over private profit.
  • Foster non-hierarchical structures and collective decision-making.

Drutopia focuses on shared solutions. Drupal excels at providing the tools to develop and distribute specialized website platforms that can be freely shared, reused, and adapted. Of the three most-used free software content management systems (CMSs) – WordPress, Joomla!, and Drupal – only Drupal has the built-in ability to package and share highly developed distributions.

Distributions are essential in attracting and meeting the needs of groups that want to support the free software movement but don’t have the technical know-how or resources to create a site from scratch. For developers, too, distributions hold a lot of potential because they do the heavy lifting of initial setup, allowing developers and site builders to bypass many hours of unnecessary effort. Drupal distributions so far have been held back by a series of factors that Drutopia aims to address.

Drutopia is about returning to Drupal’s roots in free software and progressive social change. Since its founding years, the Drupal free software project has both reflected and contributed to the democratic potential of the internet: to empower citizens to freely collaborate and organize outside the control of governments and corporate media. Long before it powered Fortune 500 sites and whitehouse.gov, Drupal was a tool of choice for small, grassroots, change-oriented groups.

This initiative aims to reclaim Drupal for the communities and groups that have always been its core users and adopters and have contributed to much of its best innovation.

Join us at drutopia.org.


Frederick Giasson: Winnipeg City’s NOW [Data] Portal

Planet Drupal - Wed, 24/08/2016 - 19:33

The City of Winnipeg’s NOW (Neighbourhoods Of Winnipeg) Portal is an initiative to create a complete neighbourhood web portal for its citizens. At the core of the project is a set of about 47 fully linked, integrated and structured datasets of things of interest to Winnipeggers. The focal point is Winnipeg’s 236 neighbourhoods, which define the main structure of the portal. The portal has six main sections: topics of interest, maps, history, census, images and economic development. It is meant to be used by citizens to find things of interest in their neighbourhood, to learn its history, to see images of those things of interest, to find tools that help economic development, and so on.

The NOW portal is not new; Structured Dynamics was also its main technical contractor for its first release in 2013. However, we just finished helping the City’s NOW team migrate their older portal from OSF 1.x to OSF 3.x and from Drupal 6 to Drupal 7; we also trained them on the new system. Major improvements accompany this upgrade, but the user interface design is essentially the same.

First I will introduce each major section of the portal and explain its main features. Then I will discuss the improvements that come with the upgrade.

Datasets

A NOW portal user won’t notice any of this, but the main feature of the portal is the data it uses. The portal manages 47 datasets (and growing) of fully structured, integrated and linked data about things of interest to Winnipeggers. What the portal does is manage entities. Each kind of entity (swimming pools, parks, places, images, addresses, streets, etc.) is defined with multiple properties and values. Several of the entities reference other entities in other datasets (for example, an assessment parcel from the Assessment Parcels dataset references neighbourhood entities and property address entities from their respective datasets).

Because these datasets are fully structured and integrated, we can leverage those characteristics to create a powerful search experience: filtering the information on any of the properties, biasing searches depending on where a keyword match occurs, and so on.

Here is the list of all 47 datasets that currently exist in the portal:

  1. Aboriginal Service Providers
  2. Arenas
  3. Neighbourhoods of Winnipeg City
  4. Streets
  5. Economic Development Images
  6. Recreation & Leisure Images
  7. Neighbourhoods Images
  8. Volunteer Images
  9. Library Images
  10. Parks Images
  11. Census 2006
  12. Census 2001
  13. Winnipeg Internal Websites
  14. Winnipeg External Websites
  15. Heritage Buildings and Resources
  16. NOW Local Content Dataset
  17. Outdoor Swimming Pools
  18. Zoning Parcels
  19. School Divisions
  20. Property Addresses
  21. Wading Pools
  22. Electoral wards of Winnipeg City
  23. Assessment Parcels
  24. Libraries
  25. Community Centres
  26. Police Service Centers
  27. Community Gardens
  28. Leisure Centres
  29. Parks and Open Spaces
  30. Community Committee
  31. Commercial real estates
  32. Sports and Recreation Facilities
  33. Community Characterization Areas
  34. Indoor Swimming Pools
  35. Neighbourhood Clusters
  36. Fire and Paramedic Stations
  37. Bus Stops
  38. Fire and Paramedic Service Images
  39. Animal Services Images
  40. Skateboard Parks
  41. Daycare Nurseries
  42. Indoor Soccer Fields
  43. Schools
  44. Truck Routes
  45. Fire Stations
  46. Paramedic Stations
  47. Spray Parks Pads
Structured Search

The most useful feature of the portal, to me, is its full-text search engine. It is simple, clean and quite effective. The search engine is configured to try to give the most relevant results a NOW portal user may be searching for. For example, it will positively bias results that come from specific datasets, or matches that occur in specific property values. The goal of this biasing is to improve the quality of the returned results. This is relatively easy to do since the context of the portal is well known and everything is fully structured, so we can easily boost the scoring of search results.

Another major gain is that all search results are fully templated. The search engine does not simply return a title and a description for each result. It templates all the information the system has about each matched result and displays the most relevant details directly in the search results.

For example, if I search for an indoor swimming pool, in most cases it is because I want to call the front desk to get some information about the pool. This is why key information is displayed directly in the search results. That way, most users won’t even have to click on a result to find the information they were looking for.

Here is an example of a search for the keywords main street. As you can see, you get different kinds of results. Each result is templated to surface the core information about the entity. You can focus on particular kinds of entities, or filter by their location in specific neighbourhoods.

Templated Search Results

Now let’s see some of the kinds of entities that can be searched on the portal and how they are presented to users.

Here is an example of an assessment parcel located in the St. John’s neighbourhood. The address, the value, the type and the location of the parcel on a map are displayed directly in the search results.

Another kind of entity that can be searched is the property address. These are located on a map, and the value of the parcel, the building and the zoning of the address are displayed. The property is also linked to its assessment parcel entity, which can be clicked to get additional information about the parcel.

Another interesting type of entity that can be searched is the street. What is interesting in this case is that you get the complete outline of the street directly on a map: you know where it starts, where it ends and where it is located in the city.

There are more than a thousand geo-localized images of all kinds of things in the city that can be searched. A thumbnail of the image and the location of the thing it shows appear in the search results.

If you were searching for a nursery for your newborn child, you can quickly see the name, the location on a map and the phone number of the nursery directly in the search result.

These are just a few examples of the fifty or so different kinds of entities that can appear like this in the search results.

Mapping

The mapping tool is another powerful feature of the portal. You can search just as if you were using the full-text search engine (the top search box on the portal), but you will only get results that can be geo-localized on a map. You can also simply browse entities from a dataset, or filter entities by their properties and values. You can persist entities you find on the map and save the map for future reference.

The example below shows someone who searched for a street (Main Street) and persisted it on the map, then searched for other things, such as nurseries, and selected the ones near that street. This way a user can visualize the portal’s entities on a map to better understand where things are located in the city, what exists near a certain location or within a neighbourhood, and so on.

Census Analysis

Census information is vital to the good development of a city. It is necessary for understanding the trends of a sector and who populates it, so that the city and other organizations can plan their projects to have as much impact as possible.

These are some of the reasons why one of the main sections of the site is dedicated to census data. Key census indicators have been configured in the portal. Users can select different kinds of regions (neighbourhood clusters, community areas and electoral wards) to get the numbers for each of these indicators, and can then select multiple regions to compare them. A chart view and a table view are available for presenting the census data.

History, Images & Points of Interest

The City took the time to write the history of each of its neighbourhoods. In addition, it hired professional photographers to photograph the points of interest of the city, geo-localize them and write a description for each photo. Because of this dedication, users of the portal can learn a great deal about the city in general and about the neighbourhood they live in. This is what the History and Images sections of the website are about.

Historic buildings are displayed on a map and can be browsed from there.

Images of points of interest in the neighbourhood are also located on a map.

Find Your Neighbourhood

Ever wondered which neighbourhood you live in? No problem: go to the home page, put your address in the Find your Neighbourhood section and you will know right away. From there you can learn more about your neighbourhood, such as its history and its points of interest.

Your address will be located on a map, and your neighbourhood will be outlined around it. Not only will you know which neighbourhood you live in, you will also know where you live within it. From there you can click on the name of the neighbourhood to get to its page and start learning more about it: its history, photos of the points of interest that exist in your neighbourhood, and so on.

Browsing Content by Topic

Because all the content of the portal is fully structured, it is easy to browse it using a well-defined topic structure. The City developed its own ontology that is used to help users browse the content of the portal by topics of interest. In the example below, I clicked the Economic Development node and then the Land use topic. Finally I clicked the Map button to display things related to land use: in this case, zoning and assessment parcels are shown to the user.

This is another way to find meaningful and interesting content in the portal.

Depending on the topic you choose, and the kind of information related to it, you may end up with different options such as a map or a list of links to documents related to that topic.

Export Content

Now that I have given an overview of each of the main features of the portal, let’s go back to the geeky things. The first thing I said about this portal is that at its core, all the information it manages is fully structured, integrated and linked data. If you get to the page of an entity, you can see the underlying data that exists about it in the system: simply click the Export tab at the top of the entity’s page and you will have access to the description of that entity in multiple formats.

In the future, I hope the City will make the whole set of datasets fully downloadable; right now this information is only accessible through that per-entity export feature. I say "hope" because the NOW portal is fully disconnected from another initiative by the city, data.winnipeg.ca, which uses Socrata. The problem is that barely any of the datasets from NOW are available on data.winnipeg.ca, and the ones that do appear are the raw ones (semi-structured, undocumented, unintegrated and unlinked): none of the normalization, integration and linkage work done by the NOW team has been leveraged to improve the data.winnipeg.ca dataset catalog.

New with the upgrades

Those who are familiar with the NOW portal will notice a few changes. The user interface did not change that much, but multiple little things got improved in the process. I will cover the most notable of these changes.

The major changes are in the backend of the portal. The data management in OSF for Drupal 7 is incompatible with what was available in Drupal 6. The management of entities became easier and the configuration of OSF networks became a breeze. A revisioning system has been added, the user interface is more intuitive, and so on. There is no comparison possible. However, portal users won’t notice any of this, since these are all site administrator functions.

The first thing users will notice is the completely new full-text search experience. The underlying search engine is almost the same, but the presentation is far better. Every entity type now has its own template and is displayed in its own way in the search results. Most of the time results should be much more relevant, and filtering is easier and cleaner. The search experience is much better in my view.

The overall site performance is much better since different caching strategies have been put in place in OSF 3.x and OSF for Drupal. This means that most of the features of the portal should react more swiftly.

Every type of entity managed by the portal is now templated: its web page is templated in a specific way to optimize the information it conveys to users, along with its search result "mini page" when it gets returned as the result of a search query.

Multilingual content is now fully supported by the portal, although not every template has been translated yet. Expect a fully translated French version of the NOW portal in the future.

Creating a Network of Portals

One of the most interesting features that comes with this upgrade is that the NOW portal is now in a position to participate in a network of OSF instances. What does that mean? It means that the NOW portal could create partnerships with other local (regional, national or international) organizations to share datasets (and their maintenance costs).

Are there other organizations that use this kind of system? There is at least one other right in Winnipeg: MyPeg.ca, also developed by Structured Dynamics. MyPeg uses RDF to model its information and OSF to manage it. MyPeg is a non-profit organization that uses census (and other indicator) data to study the well-being of Winnipeggers. The team behind MyPeg.ca are research experts in indicator data, and their indicator datasets (which include census data) are top notch.

Let’s hypothesize that there were interest between the two groups in collaborating. Say the NOW portal would like to use MyPeg’s census datasets instead of its own, since they are more complete, more accurate and include a larger number of important indicators. What the NOW team basically wants is to outsource the creation and maintenance of the census/indicator data to a local, dedicated and highly professional organization. The only things they would need to do are:

  1. Formalize the relationship by signing a usage agreement
  2. Configure the MyPeg.ca OSF network in the NOW portal's OSF for Drupal instance
  3. Register the datasets the NOW portal wants to use from MyPeg.ca

Once these three steps are done, which takes no more than a couple of minutes, the system administrators of the NOW portal could start using the MyPeg.ca indicator datasets as if they existed on their own network. (The reverse could also be true for MyPeg.) Everything would be transparent to them. From then on, all the fixes and updates MyPeg.ca makes to its indicator datasets would immediately appear on the NOW portal and be accessible to its users.

This is one possibility for collaboration. Another would be to share the serialized datasets on a routine basis (every month, every six months, every year) so that the NOW portal can re-import the datasets from the files shared by MyPeg.ca. This is also possible because both organizations use the same ontology to describe the indicator data, which means no modification is required by the City to take the new information into account; it only has to import and update its local datasets. This is the beauty of ontologies.

Conclusion

The new NOW portal is a great service for the citizens of Winnipeg. It is also a really good example of a web portal that leverages fully structured, integrated and linked data, and of the features that should go along with a municipal data portal.


Don Armstrong: H3ABioNet Hackathon (Workflows)

Planet Debian - Wed, 24/08/2016 - 16:40

I'm in Pretoria, South Africa at the H3ABioNet hackathon, which is developing workflows for Illumina chip genotyping, imputation, 16S rRNA sequencing, and population structure/association testing. Currently I'm working with the imputation stream, and we're using Nextflow to deploy an IMPUTE-based imputation workflow with Docker and NCSA's OpenStack-based cloud (Nebula) underneath.

The OpenStack command line clients (nova and cinder) seem to be pretty usable to automate bringing up a fleet of VMs and the cloud-init package which is present in the images makes configuring the images pretty simple.

Now if I just knew of a better shared object store which was supported by Nextflow in OpenStack besides mounting an NFS share, things would be better.

You can follow our progress in our git repo: https://github.com/h3abionet/chipimputation


Mediacurrent: "Shrop" Talk at Drupal Camp Asheville 2016

Planet Drupal - Wed, 24/08/2016 - 15:55

On August 13th, I had the pleasure of enjoying another Drupal Camp Asheville. This has become one of my favorite Drupal camps because of the location and quality of camp organization. It has the right balance of structure, while maintaining a grassroots feel that encourages open discussion and sharing.


Drupal Bits at Web-Dev: Hook Update Deploy Tools: Node import FAQs

Planet Drupal - Wed, 24/08/2016 - 15:12

Using the Drupal module Hook Update Deploy Tools to move node content can be an important part of a deployment strategy.

What is the unique ID that connects an export to an import?

The node id is used to create the export file. After that, the filename and 'unique id' reference the alias of that node. When you import the node, the node id on the production site is determined by looking up the alias: if a matching alias is found, that is the node that gets updated; if no matching alias is found, a new node gets created. The alias becomes the unique id.

What are the risks of this import/export model?

At present the known risks are:

  1. If the exported node uses entity references that do not exist on prod, the entity reference will either not be made, or will reference whatever entity is using that entity id on prod. This can be mitigated by exporting your source node while using a recent copy of the production DB.
  2. If the exported node uses taxonomy terms that do not exist on prod, the tag may import incorrectly. This can be mitigated by exporting your source node while using a recent copy of the production DB.
  3. If you are using Pathauto and the existing pattern on the production site is different from the pattern on your sandbox, the imported node will end up with a different alias, resulting in an invalid import. The imported node will be deleted because it failed validation, and the hook_update_N will fail. This can be mitigated by exporting your source node while using a recent copy of the production DB.
  4. File attachments. There is currently no way to bring attached files along unless the files already exist with a matching fid on production.
What if I am using an entity reference or a taxonomy that does not exist on production?

See answers 1 and 2 under What are the risks of this import/export model?

Does the import show up as a revision?

Yes it does, and the revision log message notes that the content was imported with Hook Update Deploy Tools. The revision takes on the status of the exported node: if the exported node was unpublished, the imported revision will be unpublished.

What happens if the import does not validate?

If the import was to an existing node, the update revision will be deleted, returning the node to its last published revision. If the import was for a node that did not exist on the site, the node and its first revision will be deleted. In either case, if the import was run through a hook_update_N, that update will fail, allowing it to be re-run once the issue is resolved.

What if the alias or path is already in use by another node?

If the alias is in use by a node, that node will be updated by the import. The alias, not the nid, is the unique id that links them.

What if the alias or path is already in use by a View or used by a menu router?

If the alias is in use on the site by something other than a node, the import will be prevented. If the import is being run by a hook_update_N(), the update will fail and can be re-run when the issue is resolved.

Is there a limit to the number of nodes that can be imported this way?

Technically, there is no real limit. Realistically, it is not a great workflow for moving all of your content. This export/import method is best reserved for mission-critical pages, such as forms or thank-you pages that go along with a Feature deployment. It is also good for pages that often get destroyed during early site development, like style guides and example pages.


WDTutorials.com: Drupal 8 Tutorial #43 : Twig Tweak Module (Article + Video)

Planet Drupal - Wed, 24/08/2016 - 15:00

The Twig Tweak module adds some useful functions and filters for use in templates.


DrupalEasy: DrupalEasy Podcast 184 - PMA (Marc Drummond - Next Steps in Drupal Theming)

Planet Drupal - Wed, 24/08/2016 - 14:54

Direct .mp3 file download.

Marc Drummond (mdrummond), front-end developer at Lullabot, Drupal core contributor, and self-professed Star Wars expert, joins Kelley and Mike to discuss all the things the Drupal front-end community has been talking about lately. We also discuss the next major version of Drupal and whether or not a major Drupal contrib module will be deprecated, as well as our picks of the week.

Three Stories
  1. Proposal: Deprecate Field Collections for Drupal 8, focus on Entity Reference Revisions & Paragraphs.
  2. The Average Web Page (Data from Analyzing 8 Million Websites).
  3. There will never be a Drupal 9 vs. There will be a Drupal 9, and here is why.
Five Questions (answers only)
  1. Disney
  2. Docker for Mac
  3. Writing a fantasy novel
  4. Llama
  5. DrupalCamp Twin Cities
Intro Music
  • Chunk-y Town - performed by Marc Drummond at Twin Cities DrupalCamp 2016.
Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.


Gábor Hojtsy: Want to get issues resolved in Drupal core? Find community with an initiative!

Planet Drupal - Wed, 24/08/2016 - 14:36

In my previous post I explained why there will be a Drupal 9 even though we have previously unseen possibilities to add new things within Drupal 8.x.y. Now I'd like to dispel another myth, that initiatives are only there to add those new things.

Drupal 8 introduced initiatives to the core development process out of the recognition that core development had become too big to follow, understand or really get involved with in general. Because there are key areas that people want to work in, it makes sense to set up focused groups to organize work in those areas and support each other in smaller teams. So initiatives like Configuration Management, Views in Core, Web Services, Multilingual, etc. were set up and mostly worked well, in no small part because it is easier to devote yourself to improving web services capabilities or multilingual support than to "making Drupal better". Goals that are too abstract are harder to sign up for, and a team of a thousand people is harder to feel a member of.

Given the success of this approach, even after the release of Drupal 8.0.0, we continued using this model and there are now several groups of people working on making things happen in Drupal 8.x. Ongoing initiatives include API-first, Media, Migrate, Content Workflows and so on. Several of these are primarily working on fixing bugs and plugging holes. A significant part of Migrate and API-first work to date was about fixing bugs and implementing originally intended functionality for example.

The wonder of these initiatives is they are all groups of dedicated people who are really passionate about that topic. They not only have plan or meta issues linked in the roadmap but also have issue tags and have regular meeting times. The Drupal 8 core calendar is full of meetings happening almost every single workday (that said, somehow people prefer Wednesdays and avoid Fridays).

If you have an issue involving usability, a bug in a Drupal web service API, a missing migration feature and so on, your best choice is to bring it to the team already focused on that topic. The number and diverse areas of the teams already in place give you a very good chance that whatever you intend to work on is related to one or more of them. And since no issue gets done by one person (you need a reviewer and a committer at minimum), your only way to get something resolved is to seek interested parties as soon as possible. Does it sound like you are demanding time from these folks unfairly? I don't think so. As long as you are genuinely interested in solving the problem at hand, you are in fact contributing to the team, which is for the benefit of everyone. And who knows, maybe you will quickly become an integral team member as well.

Thanks for contributing and happy team-match finding!

Ps. If your issue is no match for an existing team, the friendly folks at #drupal-contribute in IRC are also there to help.


Zyxware Technologies: [Drupal-8] How to send a mail programmatically in Drupal-8

Planet Drupal - Wed, 24/08/2016 - 14:33

This article covers how to send email programmatically from your Drupal 8 site. There are two main steps: first, implement hook_mail() to define email templates; second, use the mail manager service to send emails using those templates. Below is an example of sending an email from a custom module.
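A minimal sketch of those two steps follows, assuming a custom module named mymodule and a 'notify' mail key (both names are placeholders). The hook_mail() implementation defines the template, and the plugin.manager.mail service sends it.

<?php

/**
 * Implements hook_mail().
 *
 * Defines the subject and body for the hypothetical 'notify' template.
 */
function mymodule_mail($key, &$message, $params) {
  switch ($key) {
    case 'notify':
      $message['subject'] = t('New content: @title', ['@title' => $params['title']]);
      $message['body'][] = $params['body'];
      break;
  }
}

/**
 * Sends the 'notify' email through the mail manager service.
 */
function mymodule_send_notification($to, $title, $body) {
  $mail_manager = \Drupal::service('plugin.manager.mail');
  $params = ['title' => $title, 'body' => $body];
  $langcode = \Drupal::currentUser()->getPreferredLangcode();
  $result = $mail_manager->mail('mymodule', 'notify', $to, $langcode, $params);
  return !empty($result['result']);
}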


Unimity Solutions Drupal Blog: Identification of an Open Source Video Annotations Tool for NVLI

Planet Drupal - Wed, 24/08/2016 - 14:00

As mentioned in our earlier blog on Video Annotations: A powerful and innovative tool for education, the most intriguing feature of the pilot version of NVLI is video annotation. UniMity Solutions assisted in building the annotation feature for audio and video assets. This involved identifying and integrating an open plugin that supported video and audio annotations, plus a generic annotation store module that was plugin-agnostic.


Zlatan Todorić: Take that boredom

Planet Debian - Wed, 24/08/2016 - 07:45

While I was bored at Defcon, I took the smallest VPS in DO's offering (512MB RAM, 20GB disk), configured nginx on it, bought the domain zlatan.tech and cp'ed my blog data to blog.zlatan.tech. I thought it would just be a boredom project that I'd tear apart in a day or two, but it is still there.

Not only that: the droplet came with Debian 8.5, but I just added unstable and experimental to it and upgraded, simply to experiment and see how long it takes me to break it. To make it even more adventurous (and also to force me to not take it too seriously, at least at this point) I did something Lars would scream about: I did not enable backups!

While having fun with it I added a Let's Encrypt certificate (wow, that was quite easy).

Then I installed and configured Tor, and ended up adding an .onion domain for it! It is: pvgbzphm622hv4bo.onion

My main blog is still going to be zgrimshell.github.io (for now at least) where I push my Nikola (static site generator written in python) generated content as git commits. To my other two domains (on my server) I just rsync the content now. Simple and efficient.

I must admit I like my blog layout. It is simple, easy to read, efficient and fast. I don't bother with comments, and writing a blog post in markdown (inside a terminal, as every well-behaved hacker citizen should) while compiling it with Nikola is a breeze (and yes, I did choose Nikola because of Nikola Tesla and python). I must also admit that nginx is a pretty nice webserver; no need to explain the beauty of git, and I can't recommend rsync enough.

If anyone is interested in doing the same I am happy to talk about it but these tools are really simple (as I enjoy simple things and by simple I mean small tools, no complicated configs and easy execution).


Drupal Bits at Web-Dev: Import nodes as as part of deployment using Hook Update Deploy Tools

Planet Drupal - Wed, 24/08/2016 - 05:52

With the 7.x-1.18 release of Hook Update Deploy Tools for Drupal 7 it is now possible to export a node on a development sandbox, commit the export file to the repository, then import it using either a hook_update_N() or the drush site-deploy-import command.

Pros:

  • No need to re-create a node on prod after a client approves it.
  • Early content that keeps getting wiped out by database snapshots (think style guides) can get re-created instantly with a single drush command.
  • Content imported into an existing node shows up as a revision.
  • Automated deployment is testable and repeatable in all dev environments.
  • No uuid required.
Workflow Example:

You have a styleguide you created on your sandbox and want to deploy it to the production site.

  1.  Create the node on your sandbox (node id = 1234).
  2. Export the node to an export file.
    drush site-deploy-export 1234
  3. The command creates an export file named for the alias of the node being exported
    ex: site-deploy/node_source/helpzZzstyle-guide.txt  ('zZz' represents '/')
  4. Create a hook_update_N() to import the file on deployment
     

    <?php
    /**
     * Import the style guide
     */
    function site_deploy_update_7129() {
      $nodes = array('help/style-guide');
      $message = HookUpdateDeployTools\Nodes::import($nodes);
      return $message;
    }
    ?>
  5. Commit the file and update hook to your repo.
  6. Push the code, run 'drush updb'
drush updb -y
Site_deploy  7129  Import the style guide

Site_deploy: Updated: node/1234: help/style-guide - successful.
Summary: Imported Nodes 1/1.  Completed the following:
   [help/style-guide] => Updated: node/1234
  
Performed update: site_deploy_update_7129

or the import can be performed by

drush site-deploy-import  help/style-guide

Drupal @ Penn State: Drupal 8 Theme Generation and Development Intro Using the Drupal Console

Planet Drupal - Wed, 24/08/2016 - 01:56

Here is a screencast of how to get started with Drupal 8 theme development.

In the video I cover the following (a short code sketch of the preprocess and Kint steps follows the list):

  • Using the Drupal Console to generate a theme from a base theme
  • Creating a libraries .yml file
  • Adding global CSS to your theme
  • Using Kint with the Devel module
  • Debugging Twig
  • Adding your own Twig file to your theme
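As a small illustration of the preprocess and Kint steps, here is a hypothetical mytheme.theme snippet; the theme name and the global-styling library are placeholders that would be defined in the theme's .libraries.yml file.

<?php

/**
 * Implements hook_preprocess_HOOK() for page templates.
 */
function mytheme_preprocess_page(&$variables) {
  // Attach a library defined in mytheme.libraries.yml (global CSS/JS).
  $variables['#attached']['library'][] = 'mytheme/global-styling';

  // Inspect the available template variables with Kint (requires the Devel
  // module's Kint sub-module).
  if (function_exists('kint')) {
    kint($variables);
  }
}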