
Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

Blue Drop Awards: Blue Drop Awards 2015: Join Us!

4 December 2014 - 2:33pm

Although the Blue Drop Awards are still a few months away, preparation has already begun. However, this isn’t a one-man show; we need your help. There are many ways to get involved such as volunteering, becoming a sponsor, posting to your social media networks and more.

In case you were not aware, in 2012 a group of volunteers set out to create an annual program that aims to increase awareness of Drupal and its capabilities. Now in its fourth year, this community-nominated, publicly-voted event recognizes the contributions of individuals, companies, and projects. The Blue Drop Awards are 100% volunteer-organized, so we need your help to highlight and showcase the best Drupal websites and modules out there.

There are lots of ways to get involved! Currently we're looking for sponsors to help us continue this annual project and volunteers to help us run it.

We're looking for volunteers who are interested in helping with verifying nominations, web design, development, running the booth at DrupalCon LA, wrangling other volunteers and sponsors, and more.

Sponsors are the only way for us to make this a sustainable project, because it takes a significant amount of time and resources to put on every year. Luckily, there are tons of benefits to becoming a sponsor! For 2015 we have four sponsorship packages to choose from; sign up and we’ll get your brand in front of tens of thousands of people in the Drupal community. This is a great lead generation and branding opportunity for our sponsors.

Regardless of whether you decide to sponsor or volunteer, we can't wait to see all of the great Drupal nominees come February!

A big THANK YOU to our talent in this Drupal video: Doug Vann (DougVann.com), Arnold Leung (Appnovation), Michael Spinosa (Unleashed Technologies), Stephen Weinberg (Commerce Guys), Ben Finklea, Erik Wagner, Brian Solka, Alexander Popov, and Gilbert Sauceda (Volacci).

Tags: Planet Drupal, Drupal, Blue Drop Awards
Categories: Drupal

Drupal Association News: Global Training Days 2014 Wrap-up

4 December 2014 - 2:01pm

In our third year supporting the Drupal Global Training Days Initiative, we have seen more training companies, more community leaders, and more individuals participate than ever before. 

Each quarter, communities and training companies host Introduction to Drupal sessions, building better and brighter community members. Global Training Days (GTD) is an opportunity for local training companies and community leaders to build their local community by offering low-cost or free Introduction to Drupal training. The Drupal Association lists your training on our site, promotes GTDs, and is available to consult on planning and curriculum. Offering these trainings has anecdotally been shown to significantly grow individual participation in communities, build a pool of developers, and raise general awareness of the Drupal project.

This year, 35 countries hosted over 170 low-cost or free trainings, and our Asia-Pacific communities really embraced GTDs! We had 8 Indian companies host trainings in Srinagar, Bangalore, Ghaziabad, Delhi, Mumbai, Kolkata, and Gurgaon -- the most trainings in any single country. China, South Korea, Pakistan, New Zealand, Australia, and Japan also hosted training sessions, really highlighting our growth and momentum in Asia-Pacific.

Feeling inspired? Sign up to host a GTD at your DrupalCamp, meet-up, or local user group. Are you new to Drupal, or do you want to participate in an introductory training? Try reaching out to a local Drupal training company to host an event.

Make sure you save the following dates to host and participate in Global Training Days in 2015:

  • February 27/28
  • May 22/23
  • August 21/22 
  • November 20/21

For more details on why you should participate and how to spread the word about GTDs, check out our webcast and video. Let’s grow Drupal and our community one person at a time.

Amazing Drupal learning session #DrupalGTD #Drupal @DrupalMumbai @DrupalAssoc pic.twitter.com/8E2xNYxBgE

— Rachit Gupta (@tweet_rachit) September 6, 2014

Categories: Drupal

Marek Sotak: Making Drupal Commons look like facebook - part 1

4 December 2014 - 6:45am

Acquia Commons is a Drupal distribution for building communities within your organization. I have been able to play around with its competitor, Jive, and must admit that Commons stands up quite well next to it.

We were approached by Gary Conroy from Specsavers, who had introduced Commons within their organization and needed some additional tweaks to make it a bit more like Facebook. This will be a series of two blog posts: in the first we will show how we built functionality similar to Facebook's Like links, and in the second how we changed the status form widget.

Categories: Drupal

Drupal Commerce: How to switch your payment settings based on environment variables using Platform.sh

4 December 2014 - 5:00am

When working on a Commerce project which uses a payment gateway, you always need to make sure that your Staging and Development environments target the sandbox or test mode of your payment gateway, and that your Production site targets the live account.

This is actually true for any third-party service integration which provides a sandbox where you can test. The objective is to make sure you never send test data to a live account, no matter which service you're testing against.

For this tutorial, I will focus on payment method settings, but the principle remains the same for any other third-party integration.

I will start from an empty Drupal site hosted on Platform.sh and go through the following steps:

  • Enable and configure Paypal WPS payment method
  • Export its configuration to a settings.local.php file
  • Override its Sandbox configuration on the Staging environment
  • Write custom code to read the configuration from the settings.local.php

As you see, the goal (as always with Drupal) is to read the configuration from your code so that you can easily switch from a sandbox mode to a live mode.
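
To make the idea concrete, here is a minimal, hypothetical sketch (not the tutorial's actual code): the environment-specific values live in settings.local.php, and a small custom module reads them with variable_get() instead of hard-coding them. The variable and function names below are illustrative assumptions.

// In settings.local.php on the Staging environment (hypothetical variable names):
$conf['mymodule_paypal_wps_business'] = 'sandbox-merchant@example.com';
$conf['mymodule_paypal_wps_server'] = 'sandbox';

// In a small custom module, read the environment-specific values:
function mymodule_paypal_wps_settings() {
  return array(
    'business' => variable_get('mymodule_paypal_wps_business', ''),
    'server' => variable_get('mymodule_paypal_wps_server', 'sandbox'),
  );
}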

Categories: Drupal

tanay.co.in: Book - D8 Module Porting 101 - Porting Simple FB Connect Module to D8

4 December 2014 - 3:43am
Probably the first book on Drupal 8. But honestly, this is not worthy of being called a book; it is just a collection of notes that I took as I ported the simple_fb_connect module to D8. It felt too long for a blog post, hence the book! Read More @ http://www.tanay.co.in/blog/book-d8-module-porting-101-porting-simple-fb-connect-module-d8.html


Categories: Drupal

CMS Quick Start: Publishing Drupal 7 Content to Social Media: Part 2

4 December 2014 - 1:02am
Today we're going to look at how to push your site content to social media services using contributed Drupal modules. If you want full control over how social media integrates with your site, and want to allow extra features to be used on the site, this is the way to go. The major tradeoff is increased time for configuration and testing, and possibly troubleshooting if something doesn't work correctly.
 

Let's get started.


Categories: Drupal

Propeople Blog: Prototypes: A Better Approach to Development

3 December 2014 - 3:21pm

Traditional web design has always involved creating flat, two-dimensional designs in wireframes or high-definition design comps. While this process feels natural and intuitive for designers, it presents significant shortcomings when it comes to the increasing demand for modern, responsive websites.

Building a website in Drupal typically follows this process:

  1. create low-definition conceptual designs, wireframes, and sketches

  2. create high-definition design files in Photoshop

  3. configure the site and build out various functionalities

  4. create a base theme and apply it to complete functionalities

 

 

This process, tried-and-true as it may be, leads to a lot of challenges:

Synchronicity of frontend and backend development

Building functionality is tied to internal elements that sometimes are not exposed to the UI at all; for example, integrations with external systems or editorial workflows. This sort of work can represent hundreds of hours from a development perspective, but the theming required is marginal. On the other hand, some parts of web design work--like assembling pages with node listings or just the pages themselves--can involve 10 hours of development and 20 hours (or more!) of theming due to custom Javascript and layouts. In this situation it can be challenging for a busy frontend developer to know how to organize his/her time most effectively.

Clients can't test the site until most of the functionality is done

Theming only after a site’s functionality is ready leads to a problematic situation wherein a web design team can't actually show its client polished work until the end of the project. Theoretically, of course, we can show some functionality or provide elaborate descriptions, but the “bells and whistles” still tend to be missing until a project nears completion. This leads to a huge bottleneck effect as both clients and quality analysts turn their attention to testing a new website on multiple devices and platforms. At this stage, most of the bugs reported will be related to responsive behavior or small changes to theming. This can be a nightmare for frontend developers; they receive tons of work all at once, even as the project is about to finish.

Enabling frontend developers

Frontend developers are forced to learn a lot about preprocess functions and how Drupal works. This whittles down the time frontend Drupal developers have to introduce cool new technologies that exist elsewhere in the frontend world. Despite multiple attempts to make Drupal’s HTML output cleaner, Drupal still produces a lot of code that, most of the time, is superfluous. The truth is the frontend world develops at a faster pace than our PHP tools. New Javascript frameworks mature at lightning speed and we are not really all that good at adopting them. What we would really like to see is frontend developers progressing in their field, rather than getting bogged down by more Drupalisms. But how?

Prototypes to the rescue

Lately Propeople has been rewriting the web design process, building prototypes right after designs become available. In fact, the main deliverables of our design work are the prototypes themselves. Technically, the prototypes we build are sets of HTML pages. The idea here is for these prototypes to empower frontend developers to build a site how they see it, instead of how Drupal dictates. Prototypes free up frontend developers to use new technologies and to properly organize code. Another advantage is that prototypes can be built before Drupal functionality is finalized. In other words, clients can start test driving their website early on in the development process and have a clear idea of how the site is going to look and behave. By the time Drupal functionality is ready, all frontend bugs are resolved, specifically the ones related to responsive behavior. There are multiple PHP-based template engines to use, platesphp.com being one. Our developers use assemble.io.

 

Prototype-based theming

Let’s say you’ve got your prototypes just the way you want them. Now Drupal’s backend HTML should be made to match the prototypes as precisely as possible. The bad news: this is not as easy as it may sound. The good news: we’ve compiled some best practices to lead the way.

  1. Panels, panels, panels. We use a stack of Panels modules. It is pretty simple to build custom layouts from the prototypes, replacing static blocks with implemented panes.

  2. Custom panes with custom templates. In order to control the HTML of custom panes, we create a template file for each one (see the sketch after this list). This makes it easy to tweak the HTML of a single pane as needed. We even display custom panes in lieu of views panes, executing views under the hood. We skip rendering views in order to keep all the theming for one block in one template.

  3. Display entities as view modes. For most view modes (different kinds of teasers), we use separate custom templates.
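
As a rough illustration of point 2, a custom pane's markup can be routed through its own template file via hook_theme(); the module, theme hook, and template names below are hypothetical, not Propeople's actual code.

function mymodule_theme() {
  return array(
    // One dedicated template per custom pane keeps all of its HTML in one place.
    'mymodule_news_teaser_pane' => array(
      'variables' => array('nodes' => array()),
      'template' => 'mymodule-news-teaser-pane',
      'path' => drupal_get_path('module', 'mymodule') . '/templates',
    ),
  );
}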

Prototypes make building a website much more exciting and improve the web design process for everyone involved. Backend developers can actually watch the site’s theming being implemented as they complete functionality. Frontend developers have more freedom and fewer Drupalisms to remember. Best of all, prototypes increase client satisfaction because they allow clients to feel more involved in the implementation of their projects.

Tags: Propeople, Prototypes, Design. Service category: Technology. Topics: Tech & Development
Categories: Drupal

Drupal core announcements: Drupal core updates for December 3, 2014

3 December 2014 - 2:29pm
What's new with Drupal 8? Where's Drupal 8 at in terms of release?

Since the last Drupal Core Updates, we fixed 18 critical issues and 12 major issues, and opened 9 criticals and 18 majors. That puts us overall at 110 release-blocking critical issues and 705 major issues.

Part of managing Drupal 8 toward its release is continuously reassessing what must block a release of 8.0.0. (Remember, hundreds of thousands of people will benefit from all the great new functionality in Drupal 8, so we need to be smart about what does or doesn't hold up that release!) The chart below illustrates not only those newly discovered and newly fixed critical issues each week, but also issues that are promoted to critical and demoted from critical based on our latest understanding. For more information on what is (and isn't) release-blocking, see the handbook page on issue priority.

Current focus

The current top priority in Drupal 8 is to resolve issues that block a beta-to-beta upgrade path (critical issues tagged 'D8 upgrade path'). We also need core contributors to continue evaluating issues for the beta phase based on the beta changes policy.

Finally, keep an eye out for critical issues that are blocking other work.

How to get involved

If you're new to contributing to core, check out Core contribution mentoring hours. Twice per week, you can log into IRC and helpful Drupal core mentors will get you set up with answers to any of your questions, plus provide some useful issues to work on.

If you are interested in really digging into a tough problem and helping resolve a stagnating release blocker, or if you are currently stuck on a critical issue, join the #drupal-contribute IRC channel during weekly critical issue office hours on Fridays at 12:00 p.m. PST. See chx's office hours reports for an idea of what we've done so far!

If you'd like to contribute to a particular Drupal 8 initiative or working group, see the regularly scheduled meetings on the Drupal 8 core calendar. Google calendar ID: happypunch.com_eq0e09s0kvcs7v5scdi8f8cm70@group.calendar.google.com

You can also help by sponsoring independent Drupal core development.

Notable Commits

The best of git log --since "1 week ago" --pretty=oneline (70 commits in total):

  • Issue 2359369 by mpdonadio, Berdir, bdurbin: Render cache is not cleared when module is uninstalled - cache invalidation is always hard :)
  • Issue 2377281 by hussainweb, dawehner: Upgrade to Symfony 2.6 stable - getting close to the 2.7 LTS release
  • Issue 2342593 by znerol, grendzy, David_Rothstein: Remove mixed SSL support from core - aligning Drupal with the wider web trends regarding https
  • Issue 2369781 by larowlan: Ensure twig_debug output has needed sanitization - another critical security fix down
  • Issue 2384581 by cilefen, Wim Leers: Security: Update CKEditor library to 4.4.6 - brings some security improvements
  • Issue 2384163 by yched: Entity render cache is needlessly cleared when an Entity*Form*Display is modified - performance++
  • Issue 2368275 by martin107, dawehner, znerol, Crell, Wim Leers: EntityRouteEnhancer and ContentFormControllerSubscriber implicitly depend on too many services - ensuring our critical execution path is as lean as possible
  • Issue 2348459 by larowlan, alexarpen: Fields of type 'Text (formatted)' do NOT save values - a critical that was causing data loss when editor module was enabled
  • Issue 2235901 by alexpott, mdrummond, iMiksu, sun, Wim Leers: Remove custom theme settings from *.info.yml - theme system using config objects like everything else
  • Issue 2212335 by jhodgdon: Separate out NodeSearch::execute() into finding vs. processing results
  • Issue 2377397 by Wim Leers, alexpott: Themes should use libraries, not individual stylesheets - moving us towards a simplified ajax page state and a smaller JavaScript settings object, and hence increased performance

You can also always check the Change records for Drupal core for the full list of Drupal 8 API changes from Drupal 7.

Drupal 8 Around the Interwebs

Drupal 8 in "Real Life"

Whew! That's a wrap!

Do you follow Drupal Planet with devotion, or keep a close eye on the Drupal event calendar, or git pull origin 8.0.x every morning without fail before your coffee? We're looking for more contributors to help compile these posts. You could either take a few hours once every six weeks or so to put together a whole post, or help with one section more regularly. Read more about how you can volunteer to help with these posts!

Categories: Drupal

Appnovation Technologies: Uploading videos with Brightcove FTP Batch Provisioning

3 December 2014 - 2:26pm

Brightcove is a leading platform for cloud-based online video hosting and publishing.

Categories: Drupal

Drupal Association News: 2015 Leadership Plan and Budget

3 December 2014 - 2:12pm

During the last Drupal Association meeting, the board approved the 2015 Leadership Plan and Budget. We are very pleased to make the related documents available to you in their entirety.

However, these documents can be a lot to parse on their own, so let me provide some context and summary here in this post as well.

In short…

With a full engineering team and a bigger staff overall, 2015 presents more opportunities for the Association to directly impact the community. There are still far more projects and services we could pursue than we have the capacity for, so we will use a couple of strategies in our work that will help us find the right approaches quickly and in collaboration with the community. Can we do everything that needs to get done? No. But we can and will do more in 2015.

2015 Imperatives

As mentioned above, there are plenty of things we can be doing in 2015 to serve the community. Because we can’t do them all, we’ve identified three areas that are imperatives - if we succeed in these areas, we’ve set ourselves up to grow and do more for the community in the future. Our three imperatives are:

  • Drupal.org: We spent most of 2014 focused on paying off technical debt, making the site and services more portable while improving performance and stability. In 2015, we plan to make many more visible changes, aligned with the roadmap (https://www.drupal.org/roadmap) that was developed with community input and published last month. Drupal.org is the heart of our community, and we have a heavy lift to make it the useful tool it could be and to help the tools on Drupal.org better reflect our community processes and values. If we do this well, we will see a more engaged and growing community of developers and contributors.
  • Drupal 8 Release: We know you want Drupal 8 to be released just as much as we do. Additionally, there is a lot of opportunity during a product release that we want to take advantage of with the community. We’ll be looking to capitalize on the release to gain positive attention for the project that will result in more market share for Drupal and a growing developer community.
  • Operationalizing Revenue Programs: We’re still in a situation where most of the revenue we use to fund Drupal.org support and development, Community Cultivation Grants, and our other programs comes from the DrupalCons. Having a single major source of revenue is risky for any organization, but it also means that we are limited in terms of what we can do to improve upon or change DrupalCon formats. We are working on diversifying our revenue streams, with several new products introduced in 2014. In 2015, we need to operationalize and grow these revenue streams.
What does this mean?

That’s the work we want to focus on in 2015, but how will we do it all? The Leadership Plan and Budget lay this out in pretty complex detail, so here is a summary of our thinking and what this all represents:

  • You may recall that our original 2014 budget predicted a $750,000 deficit spend so that we could focus on building out a technical team to support our primary imperative - Drupal.org. As it turns out, we were unable to hire that team as quickly as we had hoped and we also did not utilize as much money for contractors on Drupal.org as anticipated (because it’s tough to manage contractors when you don’t have the staff). The result is that we will have a much smaller deficit in 2014.
  • It also means that we still have a lot of work to do on Drupal.org, and 2015 is the year that we will actually feel the financial impact of all those engineering hires. Additionally, we need to invest in our revenue team to build out our funding streams so we can sustain the team long-term.
  • Another investment we need to make is in Drupal 8. Not only do we want to support the release, we want to help ensure that the release happens as quickly as possible. Working with the Drupal 8 branch maintainers, we are developing a grants program modeled after the Community Cultivation Grants to help fund Drupal 8 development. As with the Community Cultivation Grants, this program is a true partnership with the community. Our role will be to provide the funds (up to $125,000) and logistical support. The branch maintainers will evaluate the proposals. We’ll have more details ready to be released next week, so stay tuned!
  • Continuing the shift we began in 2014, the Drupal.org investment in 2015 will be over $1 million - on par with the DrupalCons in terms of total expense. The bulk of this expense is in staffing, but we will make some small investments in hardware and services and a larger investment to develop a design system that will complement the user research and content strategy to fuel an iterative redesign.
  • When you put it all together, we are expecting another deficit spend, primarily to address the investments that were delayed in 2014. Though we did not have a third DrupalCon in 2014 and did not make any new investments in revenue-related staff until the fourth quarter of the year, we will still have managed to grow our gross revenue over 2013 by several hundred thousand dollars. This gives us great confidence that we can make up this deficit and support a revenue-neutral or positive budget for 2016.
A word on strategy

In 2015 we’re going to tackle everything we took on in 2014 and then some. Defining our imperatives helps us understand what work to focus on, but defining our strategies allows us to understand how we will do our work and ensure that it aligns with our values.

We have two main strategies for 2015: work with and highlight community contribution and treat every project as an opportunity to run a small, fast experiment. Our best work is done in partnership with the community. A great example is testbots. In 2014, the Association took on the upkeep of the primary testbot infrastructure so that community volunteers can focus their time on a new, more modern implementation. In all of our work, we will find the right ways to leverage and celebrate these kinds of partnerships.

We also know that in a community this large and complex, giant, comprehensive projects rarely succeed. No one can anticipate every need or use case, and no individual or small group can engineer the “right” solution for every facet. Our teams will instead break big projects down into small components and test solutions, from new ideas for Cons to new UX components. We’ll implement, gather data, and iterate - repeatedly.

Join us for more conversation

The Association staff and board are very excited about the potential this budget represents for progress in areas that matter most to all of us in the Drupal community. We invite you to join us for a community webcast on 18 December at 8am Pacific (that's 11am Eastern and 4pm in London) to discuss the plan in more detail. We’ll also make a recording available for anyone who can’t join us live.

Join the Webcast

Categories: Drupal

PreviousNext: Updating Panelizer content programmatically

3 December 2014 - 1:48pm

Panelizer is a great module for modifying the layout of a page on a per-node basis. However, its greatest strength can sometimes be its greatest weakness. We found this out the hard way when a client asked us to help them add a block on every single page of their site, directly beneath the h1 page title. Read on for how we approached this issue.

Categories: Drupal

Pixelite: Using git pre-commit hooks to keep your Drupal codebase clean

3 December 2014 - 12:57pm

All too often when peer reviewing code done by other Drupalers, I spot debug code left in the commit, waiting for the chance to be deployed to staging and break everything.

I started to read up on git hooks, paying particular attention to pre-commit:

This hook is invoked by git commit, and can be bypassed with --no-verify option. It takes no parameter, and is invoked before obtaining the proposed commit log message and making a commit. Exiting with non-zero status from this script causes the git commit to abort.

You can write your pre-commit hook in any language; bash seems the most sane choice due to the power of the text analysis tools at your disposal.

Where is the code

Here is a link to the github repository with the pre-commit hook:

git clone https://github.com/wiifm69/drupal-pre-commit.git

Features
  • Executes PHP lint across the PHP files that were changed
  • Checks PHP for a blacklist of function names (e.g. dsm(), kpr())
  • Checks JavaScript/CoffeeScript for a blacklist of function names (e.g. console.log(), alert())
  • Ignores deleted files from git and will not check them
  • Tells you all of the fails at the end (and stores a log)
  • Only lets the commit go ahead when there are no fails
Installation

cd /tmp
git clone https://github.com/wiifm69/drupal-pre-commit.git
cd drupal-pre-commit
cp scripts/pre-commit.sh [PATH_TO_YOUR_DRUPAL_PROJECT]/scripts
cd [PATH_TO_YOUR_DRUPAL_PROJECT]
ln -s ../../scripts/pre-commit.sh .git/hooks/pre-commit

Feedback

I am keen to hear from anyone else on how they do this, and if you have any enhancements to the code then I am happy to accept pull requests on github. Happy coding.

Tags: drupal, drupalplanet, git. Source: Git hooks. Category: Tutorial
Categories: Drupal

Acquia: Digital Government and Content on the Moon - Hugo Pickford-Wardle

3 December 2014 - 11:31am

Categories: Drupal

Drupal Commerce: Commerce 1.x enhances cart calculation and order management

3 December 2014 - 8:39am

Have you heard about all the great new features being developed for Commerce 2.x?

Well, there's lots to talk about for Commerce 1.x as well. Over the last few months, we've released a number of features that are definitely useful for the more than 50,000 existing Drupal Commerce stores, and good to know about for future shop implementations:


Categories: Drupal

LevelTen Interactive: How to set up a Drupal site on DigitalOcean

3 December 2014 - 8:29am

If you haven’t heard of DigitalOcean yet, I definitely recommend it as a web developer. DigitalOcean is a "simple and fast cloud hosting provider built for developers," which basically means pure awesomeness. You can spin up a cloud server in 55 seconds starting at just $5 a month: 20 GB SSD and 512 MB of RAM, five bucks, that’s it. They have a number of applications you can pre-install as well. Want to host your own Dropbox? Try out OwnCloud. Maybe you need a Ruby on Rails environment? They have you covered.

Categories: Drupal

Digett: 4 Challenges With a Webforms Project, Part IV

3 December 2014 - 8:27am

Welcome to the fourth and final part of my in-depth look at customizing some default behavior in Webforms and, in this installment, Drupal’s Bootstrap theme. Today we’ll look specifically at repurposing Bootstrap alerts and some more general, minor DOM manipulation.

(Here’s Part I, Part II, and Part III for easy reference if you’re late to the party.)

Challenge #4: Thank you, come again!

So, at this point in the project, the form behaves as desired right up until a successful submission.


Categories: Drupal

Dcycle: What is content? What is configuration?

3 December 2014 - 7:18am

What is content? What is configuration? At first glance, the question seems simple, almost quaint, the kind one finds oneself patiently answering for the benefit of Drupal novices: content is usually information like nodes and taxonomy terms, while content types, views and taxonomy vocabularies are usually configuration.

Content lives in the database of each environment, we say, while configuration is exportable via Features or other mechanisms and should live in the Git repo (this has been called code-driven development).

Still, a definition of content and configuration is naggingly elusive: why "usually"? Why are there so many edge cases? We're engineers, we need precision! I often feel like I'm trying to define what a bird is: every child knows what a bird is, but it's hard to define it. Ostriches can't fly; platypuses lay eggs but aren't birds.

Why the distinction?

I recently saw an interesting comment titled "A heretic speaks" on a blog post about code-driven development. It sums up some of the uneasiness about the place of configuration in Drupal: "Drupal was built primarily with site builders in mind, and this is one reason [configuration] is in the database".

In effect, the primary distinction in Drupal is between code (Drupal core and config), and the database, which contains content types, nodes, and everything else.

As more complex sites were being built, a new distinction had to be made between two types of information in the database: configuration and content. This was required to allow development in a dev-stage-production workflow where features being developed outside of a production site could be deployed to production without squashing the database (and existing comments, nodes, and the like). We needed to move those features into code and we called them "configuration".

Thus the features module was born, allowing views, content types, and vocabularies (but not nodes and taxonomy terms) to be developed outside of the database, and then deployed into production.

Drupal 8's config management system takes that one step further by providing a mature, central API to deal with this.

The devil is in the details

This is all fine and good, but edge cases soon begin to arise:

  • What about an "About us" page? It's a menu item (deployable) linking to a node (content). Is it config? Is it content?
  • What about a "Social media" menu and its menu items? We want a Facebook link to be deployable, but we don't want to hard-code the actual link to our client's Facebook page (which feels like content) -- we probably don't even know what that link is during development.
  • What about a block whose placement is known, but whose content is not? Is this content? Is it configuration?
  • What about a view which references a taxonomy term ID in a hard-coded filter? We can export the view, but the taxonomy term has an incremental ID and is not guaranteed to work on all environments.

The wrong answer to any of these questions can lead to a misguided development approach which will come back to haunt you afterward. You might wind up using incremental IDs in your code or deploying something as configuration which is, in fact, content.

Defining our terms

At the risk of irking you, dear reader, I will suggest doing away with the terms "content" and "configuration" for our purposes: they are just too vague. Because we want a formal definition with no edge cases, I propose that we use these terms instead (we'll look at each in detail a bit further on):

  • Code: this is what our deliverable is for a given project. It should be testable, versioned, and deployable to any number of environments.
  • Data: this is whatever is potentially different on each environment to which our code is deployed. One example is comments: On a dev environment, we might generate thousands of dummy comments for theming purposes, but on prod there might be a few dozen only.
  • Placeholder content: this is any data which should be created as part of the installation process, meant to be changed later on.
Code

This is what our deliverable is for a given project. This is important. There is no single answer. Let's take the following examples:

  • If I am a contributor to the Views contrib project, my deliverable is a system which allows users to create views in the database. In this case I will not export many particular views.

  • For another project, my deliverable may be a website which contains a set number of lists (views). In this case I may use features (D7) or config management (D8) to export all the views my client asked for. Furthermore, I may enable views_ui (the Views User interface) only on my development box, and disable it on production.

  • For a third project, my deliverable may be a website with a number of set views, plus the ability for the client to add new ones. In this case, only certain views will be in code, and I will enable the Views UI as a dependency of my site deployment module. The views my client creates on production will be data.

Data

A few years ago, I took a step back from my day-to-day Drupal work and thought about what my main pain points were and how to do away with them. After consulting with colleagues, looking at bugs which took longest to fix, and looking at major sources of regressions, I realized that the one thing all major pain points had in common were our deployment techniques.

It struck me that cloning the database from production to development was wrong. Relying on production data to do development is sloppy and will cause problems. It is better to invest in realistic dummy content and a good site deployment module, allowing the standardized deployment of an environment in a few minutes from any commit.

Once we remove data from the development equation in this way, it is easier to define what data is: anything which can differ from one environment to the next without overriding a feature.

Furthermore, I like to think of production as just another environment, there is nothing special about it.

A new view or content type created on production outside of our development cycle resides on the database, is never used during the course of development, and is therefore data.

Nodes and taxonomy terms are data.

What about a view which is deployed through features and later changed on another environment? That's a tough one; I'll get to it (see Overridden features, below).

Placeholder content

Let's get back to our "About us" page. Three components are involved here:

  • The menu which contains the "About us" menu item. These types of menus are generally deployable, so let's call them code.
  • The "About us" node itself which has an incremental nid which can be different on each environment. On some environments it might not even exist.
  • The "About us" menu item, which should link to the node.

Remember: we are not cloning the production database, so the "About us" node does not exist anywhere yet. For situations such as this, I will suggest the use of placeholder content.

For sake of argument, let's define our deliverable for this sample project as follows:

"Define an _About us_ page which is modifiable".

We might be tempted to figure out a way to assign a unique ID to our "About us" node to make it deployable, and devise all kinds of techniques to make sure it cannot be deleted or overridden.

I have an approach which I consider more logical for these situations:

First, in my site deployment module's hook_update_N(), create the node and the menu item, bypassing features entirely. Something like:

function mysite_deploy_update_7023() {
  // Create the "About us" node, bypassing features entirely.
  $node = new stdClass();
  $node->title = 'About us';
  $node->body[LANGUAGE_NONE][0]['format'] = 'filtered_html';
  $node->body[LANGUAGE_NONE][0]['value'] = 'Lorem ipsum...';
  $node->type = 'page';
  node_object_prepare($node);
  $node->uid = 1;
  $node->status = 1;
  $node->promote = 0;
  node_save($node);
  // Create the menu item pointing to the node we just saved.
  $menu_item = array(
    'link_path' => 'node/' . $node->nid,
    'link_title' => 'About us',
    'menu_name' => 'my-existing-menu-exported-via-features',
  );
  menu_link_save($menu_item);
}

If you wish, you can also implement hook_requirements() in your custom module, to check that the About us page has not been accidentally deleted, that the menu item exists and points to a valid path.
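
For instance, here is a minimal sketch of such a check (the lookup below is an assumption for illustration; adapt it to how your placeholder content is actually created):

function mysite_deploy_requirements($phase) {
  $requirements = array();
  if ($phase == 'runtime') {
    // Warn on the status report if the "About us" page has gone missing.
    $nid = db_query("SELECT nid FROM {node} WHERE type = :type AND title = :title", array(
      ':type' => 'page',
      ':title' => 'About us',
    ))->fetchField();
    $requirements['mysite_deploy_about_us'] = array(
      'title' => t('About us page'),
      'value' => $nid ? t('Present (node @nid)', array('@nid' => $nid)) : t('Missing'),
      'severity' => $nid ? REQUIREMENT_OK : REQUIREMENT_WARNING,
    );
  }
  return $requirements;
}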

What are the advantages of placeholder content?

  • It is deployable in a standard manner: any environment can simply run drush updb -y and the placeholder content will be deployed.
  • It can be changed without rendering your features (D7) or configuration (D8) overridden. This is a good thing: if our incremental deployment script calls features_revert() or drush fra -y (D7) or drush cim -y (D8), all changes to features are deleted. We do not want changes made to our placeholder content to be deleted.
  • It can be easily tested (see the sketch below). All we need to do is make sure our site deployment module's hook_install() calls all hook_update_N()s; then we can enable our site deployment module within our simpletest, and run any tests we want against a known good starting point.
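
A minimal sketch of that last point, reusing the update hook shown earlier (assuming the site deployment module is named mysite_deploy):

function mysite_deploy_install() {
  // Run every update so a fresh install matches an incrementally deployed site.
  mysite_deploy_update_7023();
}
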
Overridden features

Although it is easy to override features on production, I would not recommend it. It is important to define with your client and your team what is code and what is data. Again, this depends on the project.

When a feature gets overridden, it is a symptom that someone does not understand the process. Here are a few ways to mitigate this:

  • Make sure your features are reverted (D7) or your configuration is imported (D8) as part of your deployment process, and automate that process with a continuous integration server. That way, if anyone overrides a feature on a production, it won't stay overridden long.
  • Limit administrator permissions so that only user 1 can override features (this can be more trouble than it's worth though).
  • Implement hook_requirements() to check for overridden features, warning you on the environment's dashboard if a feature has been overridden.
Some edge cases

Now, with our more rigorous approach, how do our edge cases fare?

Social media menu and items: Our deliverable here is the existence of a social media menu with two items (Twitter and Facebook) whose links can be changed at any time on production without triggering an overridden feature. For this I would use placeholder content. Still, we need to theme each button separately, and our CSS does not know the incremental IDs of the menu items we are creating. I have successfully used the Menu attributes module to associate classes with menu items, allowing easy theming. Here is an example, assuming menu_attributes exists and menu-social has been exported as a feature.

/**
 * Add facebook and twitter menu items
 */
function mysite_deploy_update_7117() {
  $item = array(
    'link_path' => 'http://twitter.com',
    'link_title' => 'Twitter',
    'menu_name' => 'menu-social',
    'options' => array(
      'attributes' => array(
        'class' => 'twitter',
      ),
    ),
  );
  menu_link_save($item);
  $item = array(
    'link_path' => 'http://facebook.com',
    'link_title' => 'Facebook',
    'menu_name' => 'menu-social',
    'options' => array(
      'attributes' => array(
        'class' => 'facebook',
      ),
    ),
  );
  menu_link_save($item);
}

The above code creates the menu items linking to Facebook and Twitter home pages, so that content editors can put in the correct links directly on production when they have them.

Placeholder content is just like regular data but it's created as part of the deployment process, as a service to the webmaster.

A block whose placement is known, but whose content is not. It may be tempting to use the box module, which makes blocks exportable with Features. But in this case the block is more like placeholder content, so it should be deployed outside of features. And if you create your block programmatically, its ID is incremental and it cannot be deployed with the Context module; it should instead be placed in a region directly, again programmatically, in a hook_update_N().
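
Here is a hedged sketch of what that could look like in Drupal 7 (the block body, theme name, and region below are illustrative assumptions, not from the original post):

function mysite_deploy_update_7118() {
  // Create the placeholder block; db_insert() returns its incremental bid.
  $bid = db_insert('block_custom')
    ->fields(array(
      'info' => 'Homepage teaser (placeholder)',
      'body' => 'Lorem ipsum...',
      'format' => 'filtered_html',
    ))
    ->execute();
  // Assign it to a region directly, instead of exporting it via features.
  db_merge('block')
    ->key(array('module' => 'block', 'delta' => $bid, 'theme' => 'bartik'))
    ->fields(array(
      'status' => 1,
      'region' => 'sidebar_first',
      'pages' => '',
      'weight' => 0,
    ))
    ->execute();
}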

Another approach here is to create a content type and a view with a block display, fetching the last published node of that content type and displaying it at the right place. If you go that route (which seems a bit overengineered to me), you can then place your block with the context module and export it via features.

A view which references a taxonomy term ID in its filter: If a view requires access to a taxonomy term ID, then perhaps taxonomy is the wrong tool here. Taxonomy terms are data: they can be deleted, and their names can be changed. It is not a good idea for a view to reference a specific taxonomy term. (Your view can use taxonomy terms for contextual filters without a problem, but we don't want to hard-code a specific term in a non-contextual filter -- see this issue for an example of how I learned this the hard way; I'll get around to fixing that soon...)

For this problem I would suggest rethinking our use of a taxonomy term. Rather, I would define a select field with a set number of options (with defined keys and values). These are deployable and guaranteed not to change without triggering a features override, so our views can safely use them. If you are implementing this change on an existing site, you will need to update all nodes from the old to the new technique in a hook_update_N() -- and probably add an automated test to make sure you're updating the data correctly. This is one more reason to think things through properly at the outset of your project, not midway through.

In conclusion

Content and configuration are hard to define; I prefer the following definitions:

  • Code: deployable, deliverable, versioned, tested piece of software.
  • Data: anything which can differ from one environment to the next.
  • Placeholder content: any data which should be created as part of the deployment process.

In my experience, what fits in each category depends on each project. Defining these with your team as part of your sprint planning will allow you to create a system with fewer edge cases.

Tags: blog, planet
Categories: Drupal

Zengenuity: Decoupling Your Backend Code from Drupal (and Improving Your Life) with Wrappers Delight

3 December 2014 - 6:48am

If you've ever written a lot of custom code for a Drupal site, then you know it can be a tedious and error-prone experience. Your IDE doesn't know how Drupal's data structures are organized, and it doesn't have a way to extract information about configured fields to do any autocomplete or check data types. This leads to some frustrations:

  • You spend a lot of time typing out by hand all the keys in every array of doom you come across. It's tedious, verbose, and tiring.
  • Your code can contain errors your IDE won't alert you to. Simple typos can go unnoticed since the IDE has no idea how the objects and arrays are structured.
  • Your code is tightly coupled to specific field names, configured in the database. You must remember these, because your IDE can't autocomplete them.
  • Your code is tightly coupled to specific field types. (If you start off with a text field and then decide to switch to an email field, for example, you will find the value is now stored in a different key of the data array. You need to update all your custom code related to that field.)
  • It can be easy to create cross-site-scripting vulnerabilities in your code. You need to keep in mind all the field data that needs to be sanitized for output. It only takes one forgotten spot to open your site to attacks.

Wrappers Delight (https://www.drupal.org/project/wrappers_delight) is a development tool I've created to help address these issues, and make my life easier. Here's what it does:

  • Provides wrapper classes for common entity types, with getters and setters for the entities' base properties. (These classes are wrappers/decorators around EntityMetadataWrapper.)
  • Adds a Drush command that generates wrapper classes for the specific entity bundles on your site, taking care of the boilerplate getter and setter code for all the fields you have configured on the bundles.
  • Returns sanitized values by default from the generated getters for text fields. (Raw values can be returned with an optional parameter.)
  • Allows the wrapper classes to be customized, so that you can decouple your custom code from specific Drupal field implementations.

With Wrappers Delight, your custom code can be written to interface with wrapper classes you control instead of with Drupal objects directly. So, in the example of changing a text type field to an email type field, only the corresponding wrapper class needs to be updated. All your other code could work as it was written.

But wait, there's more!

Wrappers Delight also provides bundle-specific wrapper classes for EntityFieldQuery, which allow you to build queries (with field-level autocomplete) in your IDE, again decoupled from specific internal Drupal field names and formats. Whatever your decoupled CRUD needs may be, Wrappers Delight has you covered!

Getting Started with Wrappers Delight

To generate wrapper classes for all the content types on your site:

  1. Install and enable the Wrappers Delight module.
  2. Install Drush, if you don't already have it.
  3. At the command line, in your Drupal directory, run drush wrap node.
  4. This will generate a new module called "wrappers_custom" that contains wrapper classes for all your content types.
  5. Enable the wrappers_custom module, and you can start writing code with these wrapper classes.
  6. This process works for other entity types, as well: Users, Commerce Products, OG Memberships, Messages, etc. Just follow the Drush command pattern: drush wrap ENTITY_TYPE. For contributed entity types, you may need to enable a submodule like Wrappers Delight: Commerce to get all the base entity properties.
Using the Wrapper Classes

The wrapper classes generated by Wrappers Delight have getters and setters for the fields you define on each bundle, and they inherit getters and setters for the entity's base properties. The class names follow the pattern BundlenameEntitytypeWrapper. So, to use the wrapper class for the standard article node type, you would do something like this:

$article = new ArticleNodeWrapper($node);
$body_value = $article->getBody();
$image = $article->getImage();

Wrapper classes also support passing an ID to the constructor instead of an entity object:

$article = new ArticleNodeWrapper($nid);

In addition to getters that return standard data arrays, Wrappers Delight creates custom utility getters for certain field types. For example, for image fields, these will all work out of the box:

$article = new ArticleNodeWrapper($node);
$image_array = $article->getImage();
$image_url = $article->getImageUrl();
$image_style_url = $article->getImageUrl('medium');
$absolute_url = $article->getImageUrl('medium', TRUE);

// Get a full image tag (it's calling theme_image_style
// under the hood)
$image_html = $article->getImageHtml('medium');

Creating New Entities and Using the Setter Methods

If you want to create a new entity, wrapper classes include a static create() method, which can be used like this:

$values = array(
'title' => 'My Article',
'status' => 1,
'promote' => 1,
);
$article = ArticleNodeWrapper::create($values);
$article->save();

You can also chain the setters together like this:

$article = ArticleNodeWrapper::create();
$article->setTitle('My Article')
->setPublished(TRUE)
->setPromoted(TRUE)
->save();

Customizing Wrapper Classes

Once you generate a wrapper class for an entity bundle, you are encouraged to customize it to your specific needs. Add your own methods, edit the getters and setters to have more parameters or different return types. The Drush command can be run multiple times as new fields are added to your bundles, and your customizations to the existing methods will not be overwritten. Take note that Wrappers Delight never deletes any methods, so if you delete a field, you should clean up the corresponding methods (or rewrite them to get the data from other fields) manually.

Drush Command Options

The Drush command supports the following options:

  • --bundles: specify the bundles to export (defaults to all bundles for a given entity type)
  • --module: specify the module name to create (defaults to wrappers_custom)
  • --destination: specify the destination directory of the module (defaults to sites/all/modules/contrib or sites/all/modules)
Packaging Wrapper Classes with Feature Modules or Other Bundle-Supplying Modules

With the options listed above, you can export individual wrapper classes to existing modules by running a command like the following:

drush wrap node --bundles=blog --module=blog_feature

That will put the single wrapper class for blog in the blog_feature module. Wrappers Delight will be smart enough to find this class automatically on subsequent runs if you have enabled the blog_feature module. This means that once you do some individual exports, you could later run something like this:

drush wrap node

and existing classes will be updated in place and any new classes would end up in the wrappers_custom module.

Did You Say Something About Queries?

Yes! Wrappers Delight includes a submodule called Wrappers Delight Query that provides bundle-specific wrapper classes around EntityFieldQuery. Once you generate the query wrapper classes (by running drush wrap ENTITY_TYPE), you can use the find() method of the new classes to execute queries:

$results = ArticleNodeWrapperQuery::find()
->byAuthor($uid)
->bySomeCustomField($value1)
->byAnotherCustomField($value2)
->orderByCreatedTime('DESC')
->range(0, 10)
->execute();

The results array will contain objects of the corresponding wrapper type, which in this example is ArticleNodeWrapper. That means you can immediately access all the field methods, with autocomplete, in your IDE:

foreach ($results as $article) {
$output .= $article->getTitle();
$output .= $article->getImageHtml('medium');
}

You can also run queries across all bundles of a given entity type by using the base wrapper query class:

$results = WdNodeWrapperQuery::find()
->byAuthor($uid)
->byTitle('%Awesome%', 'LIKE')
->execute();

Note that results from a query like this will be of type WdNodeWrapper, so you'll need to check the actual bundle type and re-wrap the object with the corresponding bundle wrapper in order to use the bundle-level field getters and setters.

Wrapping Up

So, that's Wrappers Delight. I hope you'll give it a try and see if it makes your Drupal coding experience more pleasant. Personally, I've used it on four new projects since creating it this summer, and it's been amazing. I'm kicking myself for not doing this earlier. My code is easier to read, WAY easier to type, and more adaptable to changes in the underlying architecture of the project.

If you want to help me expand the project, here are some things I could use help with:

  • Additional base entity classes for common core and contrib entities like comments, taxonomy terms, and Workflow states.
  • Additional custom getter/setter templates for certain field types where utility functions would be useful, such as Date fields.
  • Feedback from different use cases. Try it out and let me know what could make it work better for your projects.

Post in the issue queue (https://www.drupal.org/project/issues/wrappers_delight) if you have questions or want to lend a hand.

Categories: Drupal

ERPAL: How we automate Drupal security updates

3 December 2014 - 5:05am

During the past few weeks, automated security updates have been one of the hotly debated topics in the Drupal community. Ever since Drupalgeddon, security automation has been one of the issues we should really try to solve in order to ensure Drupal's continued growth, especially in the enterprise world. Whereas content management systems such as WordPress already run automated updates in a background process, Drupal does not yet have such a feature. There are ongoing discussions at Drupal.org that point out the potential pros and cons of this feature. Personally, from the perspective of a Drupal professional, I think running Drupal module updates in the background could lead to several problems. There are a few reasons for this:

  • We somehow need to handle patched modules and cannot just override the complete module with an update
  • Letting Drupal rewrite its own codebase will open other security issues
  • Professionally developed Drupal projects use GIT (or another code versioning system) to maintain their codebase and handle the deployment process. Every update needs to be committed to the repository so that it’s not removed in the next deployment cycle
  • After updating a module, we should run our automated test scripts (for example, behat or selenium) to ensure the site didn't break with the update
  • To ensure quality we shouldn’t just run a complete update containing bug fixes and new features but only apply the patch relevant to security

The issue of applying security updates has become more and more time-sensitive because hackers start to attack vulnerable sites within hours of a security update release. Especially with enterprise web applications and large content sites with lots of users and traffic, this update process is really business critical. Even before the pressure of something like Drupalgeddon, we had already spent the last two years thinking about update automation. In this blog post I want to describe the technology and workflows we use to automate security updates in our Drupal projects while ensuring quality with automated and manual tests and the correct handling of patches.

Every site that we support for our clients sends hourly update reports to our internal ERPAL (you can replace “ERPAL” here with any other ticketing system) over an https connection and with additional encryption. For every available security update, we create a new branch in the project's Git repository along with an automatically related task.

Once the task has been created, we get all the security-relevant patches and code changes from the active modules' repositories and merge them into the modules of the project. These code changes are committed to the new so-called "feature branch". Using Jenkins and a system to build feature branch instances with a live database, the changes are now ready to test. The status of the ERPAL Task is automatically set to "ready to test". Now all automated tests will run, if any are available for the project. The result is documented with a comment on the initially created task.
Depending on the test mode of the project and the priority of the security update (e.g. “critical” or “highly critical”), the security patches are either deployed directly to live once all tests are passed or the task is assigned to the project manager with the status "ready to test". He can then test the complete patched Drupal installation on a separate feature branch test instance under live conditions. If all tests are passed, the task will be set to "test passed" and the customer receives a notification that the security of his site is up-to-date. The update branch is merged as a hotfix into the master branch and the site is deployed to the live server. After this process, the update branch is deleted and the test instance destroyed to clean up the system. The following graphic describes the behavior.

This system has several benefits, both for us and for our clients:

  • Security-relevant updates are applied within one hour
  • Quality is ensured by automated tests and, if needed, by a notification system indicating manual test steps
  • No need to involve developers to patch and deploy code
  • No website downtime
  • All steps are documented in ERPAL to make the process transparent: customers see it in their ERPAL account
  • No panic on critical updates; all workflows run as in a normal project and are delivered with compliance to our task workflow
  • Instant customer notification once updates are applied gives customers a good feeling ;-)

This system has been working well for 2.5 years. Working in cooperation with other Drupal shops to test the system, we want to make this security update automation system available for others to use as well. Therefore we will soon publish the whole solution as a service. If you want to become one of the few beta testers, or if you want to become a reseller to deliver the same security automation to your clients, you can sign up at our Drop Guard - The Drupal security update automation service page.

Categories: Drupal

Symphony Blog: Continue shopping button on Drupal Commerce cart

3 December 2014 - 12:34am

We had a Drupal project implementing a commerce site for a local store. As always, we used Drupal Commerce for this type of website. You may have seen that we have a lot of Drupal Commerce themes in our portfolio.

During the project, there was a minor request from our customer: add a Continue Shopping button to the cart. This feature is available in Ubercart, which many Drupal 6 users are familiar with, and most ecommerce sites have it as well, but it is not built into Drupal Commerce.

Searching the Drupal.org issue queues, I found a very helpful thread: Continue shopping in cart. Zorroposada presented custom code to achieve it.
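
The thread's exact code isn't reproduced in this teaser, but the general approach is a form alter on the cart form. Here is a rough, hypothetical sketch: the cart form ID prefix and the redirect path are assumptions, and the module name mymodule is a placeholder.

/**
 * Implements hook_form_alter().
 *
 * Sketch only: add a "Continue shopping" button to the Drupal Commerce cart
 * form (the default cart is a Views form, hence the form ID prefix check).
 */
function mymodule_form_alter(&$form, &$form_state, $form_id) {
  if (strpos($form_id, 'views_form_commerce_cart_form_') === 0) {
    $form['actions']['continue_shopping'] = array(
      '#type' => 'submit',
      '#value' => t('Continue shopping'),
      '#weight' => -10,
      // Skip cart validation so the button works even with unchanged quantities.
      '#limit_validation_errors' => array(),
      '#submit' => array('mymodule_continue_shopping_submit'),
    );
  }
}

function mymodule_continue_shopping_submit($form, &$form_state) {
  // Send the customer back to the catalog; adjust the path for your site.
  $form_state['redirect'] = '<front>';
}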


Categories: Drupal

