Drupal

tombola: Drupal Sites Consolidation

Planet Drupal - 20 February 2015 - 3:19am
Where I am at

I have been intermittently working with a Drupal 7 multiple-site ‘platform’ for several years, which originally emerged from a single site. I am the sole maintainer. The platform is not based on a Drupal ‘multi-site’ configuration, as the shared-codebase model seemed more hindrance than feature. Instead the (four) sites share a common pattern of base configuration, each cloned (direct copies of files and db) from an early version of the build. A set of custom modules is kept up to date as one git repo that is cloned into each of the sites. Similarly, a base theme and sub-themes for all of the sites are held in a single repo and cloned into each site. More recently, I created an entirely separate site to aggregate and index these sister sites (via feeds) to provide a more advanced, yet loosely coupled, search facility.

The setup as described was intended to make maintaining these sites simpler: extended functionality could be built into some sites while the others are left without the baggage of additional modules and configuration, keeping them more performant and simpler to maintain. Sister sites can optionally share any new features, because the early build decisions are shared, so configuration is transferable and custom modules/features from the shared repo can simply be turned on. I opted not to construct a single site segmented by Spaces, or an equivalent pattern, because initially it was not clear how far the sites might diverge from one another.

This workflow has seemed to make a rough kind of sense until now, but I am returning to the decisions I made when creating the platform to try to get it shipshape now that each site has largely settled into a stable pattern of usage. I want to make sure that I have not incurred an impractical amount of technical debt, and that site maintenance is transferable, should it need to be. Another consideration is the approach of Drupal 8: once this transformative version is widely used, it will be much easier to migrate simplified, minimal sites.

Why rock the boat?

Drupal is endlessly configurable. A lot of this configuration is executed through the admin interface and captured in the database. This oft-criticised lack of separation between configuration and content should soon be alleviated by the adoption of Drupal 8, but for the moment (maybe until the end of 2015?) it does not seem sensible to move to D8.

The sites I have created have evolved to meet the requirements of their users. Due to reactive (and sometimes undocumented) measures taken in individual sites, similar workflow and layout objectives have sometimes been achieved in subtly different ways. Earlier in the process, these changes were captured in Drupal Features as much as was possible, but this became complex in itself.

Now that I can see the way these sites are being used, and content/design policies have emerged that structure (admin)...

Categories: Drupal

Freelock : How do you rate Maintainability?

Planet Drupal - 19 February 2015 - 5:04pm

Lately at Freelock, we've been improving our Drupal site assessment. For years we've analyzed Drupal sites built by others to identify how well they are built, what pitfalls/minefields lurk there, and where we need to be extremely careful with budget recommendations when extending functionality.

In the past couple months, we've overhauled it to include a snapshot rating of the site, to let our clients know what we think of their site in 7 crucial areas.

One of them that's often overlooked is Maintainability.

Site Assessment, maintenance, Deployment, Drupal Planet
Categories: Drupal

Making Drupal 8 fly

Dries Buytaert - 19 February 2015 - 11:57am

In my travels to talk about Drupal, everyone asks me about Drupal 8's performance and scalability. Modern websites are much more dynamic and interactive than they were 10 years ago, which makes it more difficult to build sites that are also fast. It made me realize that maybe I should write up a summary of some of the most exciting performance and scalability improvements in Drupal 8. After all, Drupal 8 will leapfrog many of its competitors in terms of how to architect and scale modern web applications. Many of these improvements benefit both small and large websites, but they also allow us to build even bigger websites with Drupal.

More precise cache invalidation

One of the strategies we employ in making Drupal fast is "caching". This means we try to generate pages or page elements one time and then store them so future requests for those pages or page elements can be served faster. If an item is already cached, we can simply grab it without going through the building process again (known as a "cache hit"). Drupal stores each cache item in a "cache bin" (a database table, Memcache object, or whatever else is appropriate for the cache backend in use).

In Drupal 7 and before, when one of these cache items changes and it needs to be re-generated and re-stored (the cache gets "invalidated"), you can only delete a specific cache item, clear an entire cache bin, or use prefix-based invalidation. None of these three methods allow you to invalidate all cache items that contain data of, say, user 200. The only method that is going to suffice is clearing the entire cache bin, and this means that usually we invalidate way too much, resulting in poor cache hit ratios and wasted effort rebuilding cache items that haven't actually changed.

This problem is solved in Drupal 8 thanks to the concept of "cache tags": each cache item can have any number of cache tags. A cache tag is a compact string that describes the object being cached. Thanks to this extra metadata, we can now delete all cache items that use the user:200 cache tag, for example. This means we've deleted all the cache items we must delete, but not a single one more: optimal cache invalidation!

Example cache tags for different cache IDs.
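In code, the idea looks roughly like this. The '#cache' render array key and Cache::invalidateTags() are real Drupal 8 APIs, but the surrounding function and variable names below are made up for the example, not taken from core:

    <?php
    use Drupal\Core\Cache\Cache;

    // Illustrative build function: the rendered output depends on one user,
    // so it is tagged accordingly when it is cached.
    function mymodule_user_snippet($account) {
      return [
        '#markup' => t('Welcome @name!', ['@name' => $account->label()]),
        '#cache' => [
          'tags' => ['user:' . $account->id()],
        ],
      ];
    }

    // When user 200 is saved, invalidate every cache item carrying this tag,
    // wherever it is stored, and nothing else.
    Cache::invalidateTags(['user:200']);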

And don't worry, we also made sure to expose the cache tags to reverse proxies, so that efficient and accurate invalidation can happen throughout a site's entire delivery architecture.

More precise cache variation

While accurate cache invalidation makes caching more efficient, there is more we did to improve Drupal's caching. We also make sure that cached items are optimally varied. If you vary too much, duplicate cache entries will exist with the exact same content, resulting in inefficient usage of caches (low cache hit ratios). For example, we don't want a piece of content to be cached per user if it is the same for many users. If you vary too little, users might see incorrect content as two different cache entries might collide. In other words, you don't want to vary too much nor too little.

In Drupal 7 and before, it was easy to program any cached item to vary by user, by user role, and/or by page, and this could even be configured through the UI for blocks. However, more targeted variations (such as by language, by country, or by content access permissions) were more difficult to program and not typically exposed in a configuration UI.

In Drupal 8, we introduced a Cache Context API to allow developers and site builders to express these variations and to make them automatically available in the configuration UI.
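As a small sketch of what this looks like in a render array (the 'user.roles' and 'languages:language_interface' context names exist in Drupal 8; the block of markup itself is illustrative):

    <?php
    // Illustrative: this output is shared between all users who have the same
    // roles and interface language, rather than being cached once per user.
    $build = [
      '#markup' => $role_specific_markup,
      '#cache' => [
        'contexts' => ['user.roles', 'languages:language_interface'],
      ],
    ];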

Server-side dynamic content substitution

Usually a page can be cached almost entirely except for a few dynamic elements. Often a page served to two different authenticated users looks identical except for a small "Welcome $name!" and perhaps their profile picture. In Drupal 7, this small personalization breaks the cacheability of the entire page (or rather, requires a cache context that's way too granular). Most parts of the page, like the header, the footer and certain blocks in the sidebars don't change often nor vary for each user, so why should you regenerate all those parts at every request?

In Drupal 8, thanks to the addition of #post_render_cache, that is no longer the case. Drupal 8 can render the entire page with some placeholder HTML for the name and profile picture. That page can then be cached. When Drupal has to serve that page to an authenticated user, it will retrieve it from the cache, and just before sending the HTML response to the client, it will substitute the placeholders with the dynamically rendered bits. This means we can avoid having to render the page over and over again, which is the expensive part, and only render those bits that need to be generated dynamically!
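Conceptually it works like this. A heavily simplified sketch follows: the '#post_render_cache' property existed in the Drupal 8 betas this post describes, but the placeholder markup, callback name and substitution logic below are illustrative rather than the exact core API:

    <?php
    // Illustrative: the page is rendered and cached with a neutral placeholder
    // instead of the per-user greeting.
    $page['welcome'] = [
      '#markup' => '<span class="welcome-placeholder"></span>',
      '#post_render_cache' => [
        'mymodule_substitute_welcome' => [[]],
      ],
    ];

    // Illustrative callback, run after the cached page has been retrieved and
    // just before the response is sent: only this tiny piece is rendered
    // per user.
    function mymodule_substitute_welcome(array $element, array $context) {
      $name = \Drupal::currentUser()->getUsername();
      $element['#markup'] = str_replace(
        '<span class="welcome-placeholder"></span>',
        t('Welcome @name!', ['@name' => $name]),
        $element['#markup']
      );
      return $element;
    }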

Client-side dynamic content substitution

Some things that Drupal has been rendering for the better part of a decade, such as the "new" and "updated" markers on comments, have always been rendered on the server. That is not ideal because these markers are different for every visitor and as a result, it makes caching pages with comments difficult.

The just-in-time substitution of placeholders with dynamic elements that #post_render_cache provides can help address this. In some cases, as with the comment markers, we can do even better and offload more work from the server to the client. In the case of comment markers, a certain comment is posted at a certain time, and that doesn't vary per user. By embedding the comment timestamps as metadata in the DOM with a data-comment-timestamp="1424286665" attribute, we enable client-side JavaScript to render the comment markers, by fetching (and caching on the client side) the "last read" timestamp for the current user and simply comparing these numbers. Drupal 8 provides some framework code and an API to make this easy.

A "Facebook BigPipe" render pipeline

With Drupal 8, we're very close to taking the client-side dynamic content substitution a step further, just like some of the world's largest dynamic websites do. Facebook has 1.35 billion monthly active users all requesting dynamic content, so why not learn from them?

The traditional page serving model has not kept up with the rise of highly personalized websites where different content is served to different users. In the traditional model, such as Drupal 7, the entire page is generated before it is sent to the browser: while Drupal is generating a page, the browser is idle, wasting its cycles doing nothing. When Drupal finishes generating the page and sends it to the browser, the browser kicks into action, and the web server sits idle. Facebook uses BigPipe instead. BigPipe delivers pages asynchronously; it parallelizes browser rendering and server processing. Instead of waiting for the entire page to be generated, BigPipe immediately sends a page skeleton to the client so it can start rendering that. Then the remaining content elements are requested and injected into their correct place. From the user's perspective the page is rendered progressively; the initial page content becomes visible much earlier, which improves the perceived speed of the site.

We've made significant improvements to the way Drupal 8 renders pages (presentation). By default, Drupal 8 core still implements the traditional model of assembling these pieces into a complete page in a single server-side request, but the independence of each piece and the architecture of the new rendering pipeline enable different "render strategies" to be experimented with: different methods for dynamic content assembly, such as BigPipe, Edge Side Includes, or other ideas for making optimal use of the client, server, content delivery networks and reverse proxies. In all those examples, the idea is that we can send the primary content first so the client can start rendering it. Then we send the remaining Drupal blocks, such as the navigation menu or a 'Related articles' block, and have the browser, content delivery network or reverse proxy assemble or combine these blocks into a page.

A snapshot of the Drupal 8 render pipeline diagram that highlights where alternative render strategies can be implemented.

Some early experiments by Wim Leers in Acquia's OCTO show that we can improve performance by a factor of about 2 compared to a recent Drupal 8 development snapshot. These breakthroughs are enabled by leveraging the various improvements we made to Drupal 8.

And much more

But that is not all. The Drupal community has done much more, including: complete asset dependency information (which allowed us to ensure zero JavaScript is loaded by default for anonymous users and to send less data on AJAX requests), pluggable CSS/JS aggregation and minification (to support better optimization algorithms), and more. We've also made sure Drupal 8 is fast by default, by shipping with better defaults: CSS/JS aggregation enabled, JS assets loaded from the bottom, block caching enabled, and so on.
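To make the asset dependency point concrete: in Drupal 8, every script and stylesheet belongs to a declared library with explicit dependencies, so nothing is attached to a page unless something actually asks for it. A minimal sketch of a mymodule.libraries.yml file (the library and file names here are made up):

    # mymodule.libraries.yml (illustrative)
    fancy-widget:
      js:
        js/fancy-widget.js: {}
      css:
        component:
          css/fancy-widget.css: {}
      dependencies:
        - core/jquery
        - core/drupal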

All in all, there is a lot to look forward to in Drupal 8!

Special thanks to Acquia's Wim Leers, Alex Bronstein and Angie Byron for their contributions to this blog post.

Categories: Drupal

Drupal core announcements: Drupal core updates for February 19th, 2015

Planet Drupal - 19 February 2015 - 10:30am

新年快樂 (Happy [Lunar] New Year)! Last week, the Drupal community converged on Bogotá, Colombia for DrupalCon Latin America 2015, and largely due to the pre– and post–DrupalCon sprints, the number of Drupal 8 critical issues has dropped to about 56! Also, all of the critical issues from the Drupal 8 core critical issues sprint at DrupalCamp NJ have been committed!

Some other highlights of the month were:

How can I help get Drupal 8 done?

See Help get Drupal 8 released! for updated information on the current state of the release and more information on how you can help. Webchick posted an excellent rundown of the remaining Drupal 8 critical issues on her personal blog which may also be helpful.

We're also looking for more contributors to help compile these posts. Contact xjm if you'd like to help!

Drupal 8 In Real Life

Whew! That's a wrap!

Do you follow Drupal Planet with devotion, or keep a close eye on the Drupal event calendar, or git pull origin 8.0.x every morning without fail before your coffee? We're looking for more contributors to help compile these posts. You could either take a few hours once every six weeks or so to put together a whole post, or help with one section more regularly. If you'd like to volunteer for helping to draft these posts, please follow the steps here!

Categories: Drupal

Zivtech: How to Patch Drupal Modules

Planet Drupal - 19 February 2015 - 9:00am

Have you ever worked with a Drupal developer who seemed to always be able to fix bugs by finding patches to apply seemingly instantaneously? At the risk of being dispelled from the Alliance of Magicians, I’ll share how I do it.

 

View All Issues

The first step in finding a patch for a contributed Drupal module is to get to that module's issue queue. The path to the queue is always drupal.org/project/issues/[machine-name]. By default, when you view a module's issue queue, only Open issues are shown. This hides all of the patches that have already been applied to the dev branch, even if they have not been included in any stable release. When hunting for patches, be sure to look at All statuses.

You can save time finding the issue queue and showing all issues with a Chrome search engine keyword.

My Drupal search keywords.

I have an “is” Chrome keyword, so I can type “is views” into Chrome and it takes me to https://www.drupal.org/project/issues/views?status=All
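For reference, a Chrome custom search engine like this is just a keyword plus a URL template in which Chrome substitutes whatever you type for %s; the one above presumably looks something like:

    Keyword: is
    URL:     https://www.drupal.org/project/issues/%s?status=All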

 

Search

First, look at the most recent issue titles at the top for something similar. If these don't look relevant, use the search box to search within the issue queue. Using just one or two keywords in this search typically works best. Again, scan the first few issue titles, opening anything that looks relevant in a new tab. Don't worry about the status of the issues or the number of comments (any relevant issue may provide a link to a better one).

 

Scan

Don’t attempt to read issues in their entirety (unless you are going to be chiming in to the issue, in which case for the love of god do read and understand the entire issue before doing so). Start by quickly reading the summary of the issue (which may be the original poster’s issue or a revised summary) and then scan down the page looking for links to other issues, patches, and the last updates on the issue. Open any links to review in a new tab.

Stereotyping the actors in the issue based on their apparent professionalism helps you scan faster. If there are no patches and all commenters seem confused, look for a different issue as it may be a duplicate: the inexperienced may not have found the other issue where the real action is. Make sure you can tell the difference between an amateur (inappropriate status changes, complaints, excess punctuation??, not enough or irrelevant details) and a struggling English speaker (unusual word choice or grammar, speling erors) as the latter is much more likely to have a useful point. If someone stands out in the issue as knowing their stuff (often the maintainer), spend your time focused on their comments.

If the issue seemed unhelpful, close the tab. If it may be related but you’re not sure yet, leave it open. If it looks like exactly your issue, proceed to try out the latest version of any patch attached.

Go through all your open tabs scanning them this way. If you’ve scanned them all without a clear solution, next try looking more closely at the issues you still have open, or try different search keywords (you likely have learned a few relevant words from what you’ve scanned so far).

 

Declare Victory

Once you find a promising patch, to best impress your colleagues with your amazing speed you should declare victory in your chat room at this point. This will help add to the illusion that you have unusual speed at patch hunting. It helps to have some gifs ready for this occasion.

 

Apply a Patch

Instead of downloading a patch and then moving it to the right location on your machine or virtual machine, if you've already got a terminal open to the right spot it's faster to copy the URL of the patch and use wget to download it. Don't worry if the patch is for the dev version but you're running the stable version: it will often apply anyway. The steps for patching the Foo module are:

    cd sites/all/modules/contrib/foo

    wget http://drupal.org/files/issues/foo.patch

    patch -p1 < foo.patch

Then you can use git diff (please be using git) to check that your patch applied properly.

Note that many patches only change a few lines of code. If this is the case you may find it quicker to edit the files manually.

 

Test a Patch

Now, slow down. This isn't the step to save time on. You need to carefully test whether the patch fixed your problem, and it's easy to get confused when you do this (did I clear the cache? Was I looking at the right environment? Did I attempt to reproduce it the same way?), which will get you completely off track.

 

Track a Patch

If the patch didn’t help or it broke something else, use git checkout to remove it. Read the issue the patch came from and if it still seems like the right issue, add a comment on your findings. Continue looking for patches or try another problem-solving technique (google? IRC? ask a colleague? debug the code? think of a workaround?)

If the patch solved your problem, chime in to the issue to say so (unless several folks have already said the same). If you now understand the involved code enough to review it, give it a code review and mark its status as Needs Work or Reviewed. If you want your patch to survive an update and not be a dirty hack (hint: you do) you need to track your patch according to your organization’s procedures. I highly recommend using a Drush Patch File workflow.
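As a rough sketch, a Drush patch-file workflow keeps a make-style file in the repository recording which patch applies to which project, so patches can be re-applied automatically after module updates (the file name, project and patch URL below are placeholders, not a specific issue):

    ; patches.make (illustrative)
    projects[foo][patch][] = "https://www.drupal.org/files/issues/foo-fix-some-bug-1234567-8.patch"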

Happy patching!

 

Postscript: Check the Attitude

Do you get frustrated every time there’s a bug in your open source software and you need to find a patch or some other solution? You need to accept that bugs are a normal part of all software, and work with the rest of us as a giant team finding and fixing them as a part of our working process. You will be able to find solutions much more quickly when you look for them with calm confidence rather than frustration.

Terms: Drupal, Drupal Planet
Categories: Drupal

Code Karate: Drupal 7 Interval Field module

Planet Drupal - 19 February 2015 - 8:46am
Episode Number: 193

The Drupal 7 Interval Field module provides a simple way to create a duration or interval field on any Drupal 7 fieldable entity. A common use for this might be on a content type that generally keeps track of dates. Sometimes it is easier to summarize a group of dates for a user or visitor using an interval field rather than selecting multiple dates.

An interval field is useful for keeping track of data such as:

Tags: Drupal, Fields, Drupal 7, Site Building, Drupal Planet
Categories: Drupal

CSS Delivery Optimizer

New Drupal Modules - 19 February 2015 - 8:35am

This module aims to implement optimized CSS delivery by inlining stylesheets which are critical for rendering 'above the fold' content and loading the rest asynchronously via JavaScript. This, if used well, should completely eliminate render-blocking CSS from the pages, improving perceived and absolute page load times.

Development is at an early stage at the moment, although the code is working and functional.

Categories: Drupal

Drupalize.Me: Write A Hello World Test for Drupal 7 with SimpleTest

Planet Drupal - 19 February 2015 - 6:00am

This tutorial covers writing a "Hello World" test for Drupal 7 using the SimpleTest framework that comes with Drupal core, and is based on the free video Learning Test Case Basics by Writing A Hello World Test.

Categories: Drupal

OpenLucius: Coding custom (compound) Drupal fields

Planet Drupal - 19 February 2015 - 5:38am

In a previous post I wrote about why you might want 'compound fields' in your Drupal installation. A compound field can be seen as a unified field: a field that contains multiple fields. Get it? :)

And now, as promised: how to build one

To clarify, below you will find an example of how to build a module in which you define a compound field. In this example I am creating a 'Video' field for which the following two fields are required:
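The excerpt cuts off before listing those two fields, but a minimal Drupal 7 Field API sketch of the pattern looks like this (the module name and the two example columns, a video URL and a video title, are hypothetical stand-ins; note that hook_field_schema() belongs in the module's .install file):

    <?php
    /**
     * Implements hook_field_info().
     */
    function mymodule_field_info() {
      return array(
        'mymodule_video' => array(
          'label' => t('Video'),
          'description' => t('A compound field holding several values at once.'),
          'default_widget' => 'mymodule_video_widget',
          'default_formatter' => 'mymodule_video_formatter',
        ),
      );
    }

    /**
     * Implements hook_field_schema().
     *
     * Each column becomes one sub-field stored on the same field item.
     */
    function mymodule_field_schema($field) {
      return array(
        'columns' => array(
          'video_url' => array('type' => 'varchar', 'length' => 255, 'not null' => FALSE),
          'video_title' => array('type' => 'varchar', 'length' => 255, 'not null' => FALSE),
        ),
      );
    }

    /**
     * Implements hook_field_is_empty().
     */
    function mymodule_field_is_empty($item, $field) {
      return empty($item['video_url']);
    }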

Categories: Drupal

Config Pages

New Drupal Modules - 19 February 2015 - 3:16am

DO I NEED IT?
At some point I got tired of creating custom pages using the menu and form APIs, writing tons of code just to have a page with an ugly form where the client can enter some settings, and as soon as the client wants to add some interactions to the page (drag & drop, AJAX, etc.) things start to get hairy. The same story applied to creating a dedicated content type just to theme a single page (like the homepage), and then explaining why you can only have one node of this type, or enforcing it programmatically.

If this sounds familiar, then this module may be just the thing you were looking for :)

Categories: Drupal

Nate Haug: Drupal and Backdrop meet Drush and Drupal Console

Planet Drupal - 18 February 2015 - 11:22pm
Drupal, backdrop, drush, drupal console

Tonight I was reading up on the history of the Drupal Console. This started after I saw a tweet that seemed to indicate that the Drupal Console project is headed towards full command-line control of Drupal. Until recently, the Drupal Console had been described mostly as a "scaffolding tool" for Drupal 8, which would help save developers from boilerplate code. But Console can now enable/disable maintenance mode, which looks like a move into full-blown site management.

This got me thinking, "don't we already have a command-line tool for managing Drupal?" Of course we do! It's Drush, the canonical tool for managing Drupal sites. I know I must not be the first to think of the overlap, and knowing that the Drupal community tends to react poorly to overlap, I did some searching on the history of these two projects. There are a lot of cross-links all over the place, but I think these two posts generally sum things up:

It may take a while to read through the posts and their internal links, but several similarities struck me between Drush/Console and Drupal/Backdrop.

  • The initial reaction was negative. Both Backdrop and Drupal Console received some negative response, claiming that the overlap was wasted work or would hurt either the project or developers.
  • The purpose for forking or rewriting was based around leveraging Symfony components. I find it interesting that Backdrop was created to avoid adopting Symfony, which in turn led to major refactoring and rewriting of major systems. In this case, Drupal Console could (loosely) be considered the fork for the opposite reason. Drush maintainers considered some Symfony underpinnings but decided (for now) that the amount of change and the syntax differences didn't really warrant implementation. Drush stayed the same (a la Backdrop) but Drupal Console adopted Symfony components (a la Drupal 8).
  • Symfony's paradigms infect all code it touches. I know "infect" is a negative term, but being a proponent of the infectious GPL license, I don't necessarily think of it in a negative light. Drupal 8 didn't set out to rewrite all its subsystems, but after adopting Symfony at the bottom-most layer, the concepts bubbled up into higher levels of the application. It seems to me that the fundamental incompatibility between procedural programming and dependency injection leads to inevitable rewriting of all code so it fits the dependency-injection model.
  • There are differences in compatibility. Moving from Drush to Console (if you were so inclined) will almost certainly require rewriting scripts and integrations, but Drush 7 to Drush 8 likely will be very compatible. On the opposite side, the jump from Drupal 7 to Drupal 8 requires significant rewriting, while moving from Drupal 7 to Backdrop maintains most compatibility.
  • Even though there's now a split, both projects are still communicating and, moreover, collaborating on solving problems. I think it's wonderful that the Drush and Console folks have shown no hostility. Instead you see the opposite: they're cross-communicating ways that each could leverage the other's approach. Backdrop and Drupal are collaborating in similar fashion, such as coordinating security releases and cross-porting patches/pull-requests between the projects.

So with all these comparisons, you can see that it's not simply an analogous "A is to B as C is to D" situation, but there are a lot of similarities between reactions, purpose, and intent. As Mark Ferree said, it'll be interesting to see where both projects end up.

Categories: Drupal

Media Gallery Extras

New Drupal Modules - 18 February 2015 - 8:53pm

Experimental Utility additions to Media Gallery.

Adds some additional tools for managing media galleries - especially large and bulky ones.

Categories: Drupal

PreviousNext: Decouple Design with Styleguide Driven Drupal Development

Planet Drupal - 18 February 2015 - 7:30pm

The traditional approach of directly styling default Drupal markup is not a scalable solution when we consider the volatility of the modern browser ecosystem. It is now necessary for front-end developers to abstract design patterns into manageable components. By using a styleguide we can automate the process of isolating and cataloguing patterns so they can be iterated and tested against independently. In this post I discuss ideas and put forward some informal rules around managing these components from within a styleguide.

Categories: Drupal

Chapter Three: Content Strategy for Drupal 8

Planet Drupal - 18 February 2015 - 5:40pm

We've been publishing a lot of technical blogs about Drupal 8 to educate and inspire the community. But what about the non-technical folk? How will Drupal 8 shift the way designers, content strategists and project managers plan websites? While many of the changes will not affect our day to day work, there are a few new terms and ways of thinking that can streamline the strategy process, save developers time and save clients money.



Entities: The Word of the Day

Entities are our new friend. They are easygoing and flexible. The sooner we get comfortable with the word Entity and begin using it with our teams, the sooner we can all reap the rewards of the budding relationship.

Categories: Drupal

Victor Kane: Transcript of my presentation on DurableDrupal Lean UX+Dev+DevOps at DrupalCon Latin America

Planet Drupal - 18 February 2015 - 3:46pm

Transcript of the presentation at DrupalCon Latin America 2015

Setting up a DurableDrupal (Durable Drupal) factory on the basis of a reusable Lean process

[Download the 18-page PDF at the bottom of the page]

Since the audio of the recording of my presentation came out at very low volume, I want to share the transcript in the hope that my message reaches as many people as possible.

The video of the presentation itself can be found here: https://www.youtube.com/watch?v=bNbkBvtQ8Z0

The slides: http://awebfactory.com/drupalcon2015lean/#/

For each slide, I include below the link to the slide and the corresponding text from the video.

In some cases there are unavoidable corrections, or I have extended the text for better comprehension. I have also translated important slides and added some comments to cover very important points that were not mentioned in the presentation for lack of time, such as Kanban and its characteristics.

As a consequence the article runs quite long, but I hope it proves useful to those interested in the question of a Lean process in Drupal development.

My plan is to soon publish a book that puts these concepts into practice with a concrete example of developing a web application in Drupal 7 and 8.

read more

Categories: Drupal

DrupalCon News: One week left to submit your DrupalCon sessions

Planet Drupal - 18 February 2015 - 10:06am

We are in the fourth week of session submissions for DrupalCon Los Angeles and only one week remains before the deadline. Now is your chance to shine! Send us your talk idea and you could find yourself presenting at the Drupal community's largest annual event this spring.

Categories: Drupal

InternetDevels: Drupal vulnerability or developers' carelessness?

Planet Drupal - 18 February 2015 - 7:16am

In October 2014, the Sektion Eins company discovered a vulnerability affecting all Drupal 7 releases. It allows performing arbitrary SQL queries against the database, even without having any permissions in the system. The security risk was rated highly critical. The corresponding core update was released on October 15; it upgraded the core to version 7.32 and eliminated this vulnerability. And now we'll talk about some other kinds of vulnerabilities.

Read more
Categories: Drupal

Dcycle: A quick intro to Docker for a Drupal project

Planet Drupal - 18 February 2015 - 7:05am

I recently added Docker support to Realistic Dummy Content, a project I maintain on Drupal.org. It is now possible to run ./scripts/dev.sh directly from the project directory (use the latest dev version if you try this), and have a development environment, sans MAMP.

I don't consider myself an expert in Docker, virtualization, DevOps and config management, but here, nonetheless, is my experience. If I'm wrong about something, please leave a comment!

Intro: Docker and DevOps

The DevOps movement, popularized in recent years, promises to include environment information along with application information in the same git repo for smoother development, testing, and production environments. For example, if your Drupal module requires version 5.4 of PHP, along with a given library, then that information should be somewhere in your git repo. Building an environment for testing, development or production should then use that information and not depend on anything which is unversioned. Docker is a tool which is anchored in the DevOps movement.

DevOps: the Config management approach

The family of tools which has been around for a while now includes Puppet, Chef, and Ansible. These are configuration management tools: they define environment information (the PHP version should be 5.3, Apache mod_rewrite should be on, etc.) and make sure a given environment conforms to that information.

I have used Puppet, along with Vagrant, to deliver applications, including my Jenkins server hosted on GitHub.

Virtualization and containers

With Puppet and Vagrant, you need to use virtualization: you create a virtual machine on your host machine. Docker uses containers, so resources are shared. The article Getting Started with Docker (Servers for Hackers, 2014/03/20) contains some graphics which demonstrate how much more efficient containers are compared to virtualization.

Puppet and Vagrant are slow; Docker is fast

Puppet and Vagrant together work for packaging software and environment configuration, but the combination is excruciatingly slow: it can take several minutes to launch an environment. My reaction to this has been to cringe every time I have to do it.

Docker, on the other hand, uses caching aggressively: if a server was already in a given state, Docker uses a cached version of it to move faster. So, when building a container, Docker goes through a series of steps and caches each step to make it lightning fast.

One example: launching the dev environment of my Jenkins projects on Mac OS takes over five minutes, but launching the dev environment of my Drupal project Realistic Dummy Content (which uses Docker) takes less than 15 seconds the first time it is run (once the server code has been downloaded) and, because of caching, less than one second on subsequent runs if no changes have been made.

Configuration management is idempotent, Docker is not

Before we move on, note that Docker is not incompatible with config management tools, but Docker does not require them. Here is why I think, in many cases, config management tools are not necessary.

Config management tools such as Puppet are idempotent: you define how an environment should be, and the tool runs whatever steps are necessary to make it that way. This sounds like a good idea in theory, but in practice I have come to the conclusion that this is not the way I think, and it forces me to relearn how to think about my environments. I suspect that many developers have a hard time wrapping their heads around idempotence.

Docker is not idempotent; it defines a series of steps to get to a given state. If you like idempotence, one of the steps can be to run a Puppet manifest; but if, like me, you think idempotence is overrated, then you don't need to use it. Here is what a Dockerfile looks like: I understood it at first glance, and it doesn't require me to learn a new way of thinking.
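For readers who have not seen one, here is a deliberately simplified, illustrative Dockerfile (not the actual one shipped with Realistic Dummy Content); each instruction is one step, and Docker caches the result of every step:

    # Illustrative only -- not the project's real Dockerfile.
    FROM ubuntu:14.04
    ENV DEBIAN_FRONTEND noninteractive

    # One cached layer installing Apache, PHP and MySQL.
    RUN apt-get update && apt-get install -y apache2 php5 php5-mysql php5-gd mysql-server git curl

    # Copy the module into an assumed Drupal docroot.
    COPY . /var/www/html/sites/all/modules/realistic_dummy_content

    EXPOSE 80
    CMD ["apachectl", "-D", "FOREGROUND"]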

The CoreOS project

The CoreOS project has seen the promise of Docker and containers. It is an OS which ships with Docker, Git, and a few other tools, but is designed so that everything you do happens within containers (using the included Docker, and eventually Rocket, a tool they are building). The result is that CoreOS is tiny: it takes 10 seconds to build a CoreOS instance on DigitalOcean, for example, but almost a minute to set up a CentOS instance.

Because Docker does not work on Mac OS without jumping through hoops, I decided to use Vagrant to set up a CoreOS VM on my Mac, which is speedy and works great.

Docker for deploying to production

We have seen that Docker can work for quickly setting up dev and testing environments. Can it be used to deploy to production? I don't see why not, especially if used with CoreOS. For an example see the blog post Building an Internal Cloud with Docker and CoreOS (Shopify, Oct. 15, 2014).

In conclusion, I am just beginning to play with Docker, and it just feels right to me. I remember working with Joomla in 2006, when I discovered Drupal and it just felt right, and I have made a career of it since then. I am having the same feeling now discovering Docker and CoreOS.

I am looking forward to your comments explaining why I am wrong about not liking idempotence, how to make config management and virtualization faster, and how and why to integrate config management tools with Docker!

Tags: blog, planet
Categories: Drupal