Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

Dries Buytaert: Frontend United keynote

2 June 2018 - 4:00am

I keynoted at Frontend United in the Netherlands about our work on Drupal's web services APIs and our progress toward a JavaScript-driven Drupal administration interface. Great event with lots of positive energy!

© Christoph Breidert

Categories: Drupal

Mobilefish.de: Import translations on profile or module installation

1 June 2018 - 2:11pm
Peter Majmesku, Fri, 06/01/2018 - 23:11

Let's say you have a custom module and you want to attach translation files to it, importing them when the module is installed or whenever you update the .po files. Make sure that the Interface Translation (locale) core module is installed.

Place the language files, such as de.po or fr.po, in a folder named translations inside the module. To load the translations, add the following lines to your example_module.info.yml:

'interface translation project': example_module
'interface translation server pattern': modules/custom/example_module/translations/%language.po

Note: more details about the interface translation properties can be found here.

To update your translations use the following Drush commands:

drush locale-check && drush locale-update && drush cr

To control how existing translations are updated, take a look at the settings page (/admin/config/regional/translate/settings). There you can choose to use local translation files only, or to overwrite any existing translation.

Categories: Drupal

Drupal Commerce: A May Full of Drupal Commerce Releases

1 June 2018 - 1:36pm

May was one of our most productive months to date. It was full of releases for the core Commerce modules, our standalone PHP libraries, and essential contributed modules that all work together to comprise Drupal Commerce. While I outlined the highlights in the roadmap issue on drupal.org, these wins are worth sharing more broadly to keep the rest of the Drupal community in the loop.

The biggest release of the month was Drupal Commerce 2.7, which included new features for currency formatting, address form configuration, and stored payment methods. It also fixed a handful of bugs that unblocked other module releases and updated core in response to improvements in our libraries and dependent modules.

We've long discussed how our standalone PHP libraries export expertise off the Drupal island. Addressing and Internationalization, which have each been downloaded over one million times, are our two shining stars. We rolled new releases for each of them in May, further improving Drupal Commerce's ability to solve the hardest parts of address entry, validation, and formatting, as well as currency localization. Refer to the price formatting change record from the 2.7 release to see how the new API is more flexible and performant as a result.

Additionally, we released Address 1.4 and Inline Entity Form 1.0 RC1. The latest Address release unlocks the customer profile’s address field to support collecting less detailed billing addresses. The Inline Entity Form release includes new product information management features, letting you duplicate product variations for faster product data entry.

Thanks to generous sponsorship from Authorize.Net themselves, we've been able to dedicate several weeks to improving their integration this year. The resulting Authorize.Net RC1 release now supports eCheck, Visa Checkout, and 3DSecure payments! We also included several bug fixes related to duplicate customer and payment profiles that appear when migrating from an old system to Drupal Commerce, for example.

While not fully released yet, our Technology Partner integration for Avalara's AvaTax is nearing beta. Jace Bennest from Acro Media contributed heavily by refactoring the module to properly use a TaxType plugin while my co-maintainer Matt Glaman contributed additional fixes to our port from the Drupal 7 integration to prepare it for certification. Thanks, Jace / Acro Media!

When Matt wasn't working on the above contribs, he was collaborating with Lisa Streeter from Commerce Guys to bring Commerce Reports to its first beta release for Drupal 8. The new version takes a completely different approach from the Drupal 7 version, using lessons we learned developing Lean Commerce Reports. It denormalizes transaction data when an order is placed to support report generation with or without the Views module, providing a better developer experience and much better performance.

We've also been hard at work improving the evaluator experience. The big release for that is Commerce Demo's beta1, which showcases what Drupal Commerce provides out of the box. It creates products and scaffolds out a full product catalog. To get the full effect, try it out with our default store theme, Belgrade. The new demo module gets us closer to what we had with Kickstart 2.x on Drupal 7: a learning resource for site builders and a way for agencies to more easily demo and sell Drupal Commerce.

Finally, I'm very excited to announce that Lisa Streeter is our new documentation lead! Expect some great things to come. She has already done fantastic work with the Commerce Recurring documentation and is working on revising our getting started, installation, and update docs.

Looking at June, we plan on finalizing the query level entity access API, which will allow us to better support marketplace and multi-store Drupal Commerce implementations. We expect to merge user registration after checkout completion, and we will also be focusing on address reuse / copying, Buy One Get One promotion offers, and more product management experience enhancements.

Categories: Drupal

Ashday's Digital Ecosystem and Development Tips: eSignatures with HelloSign and Drupal 8

1 June 2018 - 12:00pm

Previously, I wrote a bit about the HelloSign eSignature platform and how it can be integrated into a Drupal 7 website. As promised, a Drupal 8 version of the integration is now available and ready for use on cutting-edge websites everywhere. But this new version is much more than a one-to-one upgrade of the original module— we've leveraged some of Drupal 8's great new features to make using HelloSign with your site even easier than it was before. Here are just some of the highlights of the new release:

Categories: Drupal

ComputerMinds.co.uk: Rebranding ComputerMinds - Part 5: Development

1 June 2018 - 5:01am

Let's have a quick look through our development process on this project and pick out some of the more interesting bits. As briefly mentioned in the last article, we are using a Composer setup and all code is version controlled with Git on GitHub. All pretty standard stuff.

Frontend

In the previous article I briefly discussed how we set up Pattern Lab. Before getting stuck into the components that would make up the pages of the site, we first needed to set up some global variables and the grid. Variables allow us to reuse common values throughout the SCSS, and if we need to make a change we can do so centrally. We added variables for each of the colours, along with a colour palette map that would let us loop through all the colours wherever we needed to in the project, then variables for the padding used throughout, and finally font styles, after importing from Google Fonts.

 

CSS Grid

Although still relatively new, CSS Grid is a web standard and works in all modern browsers. It is so much simpler than grid libraries like Susy that we were keen to start using it on our projects, and this was the perfect one on which to try it out. Setup was simple, partly due to the simple grid in the designs but mostly due to the simplicity of CSS Grid itself. A few lines of SCSS and the grid wrapper was set up:

.grid {
  display: grid;
  grid-auto-rows: auto;
  grid-gap: 20px;
  grid-column-gap: 20px;
  grid-template-rows: minmax(0, auto);
}

This declares the grid, sets a consistent gap of 20px and sets a broad size range for the rows. As well as adding the .grid class to the wrapper element where we'd like a grid, we also need to add another class to define how many columns that grid should have. Defining a simple mapping in SCSS allowed me to create a loop to generate the column classes we needed:

// Column mapping
$columns: (
  one: 1,
  two: 2,
  three: 3,
  four: 4,
  five: 5,
  six: 6,
);

// Generate column classes
@each $alpha, $numeric in $columns {
  .grid--columns-#{$numeric} {
    grid-template-columns: repeat(#{$numeric}, 1fr);

    @include to-large {
      grid-template-columns: repeat(1, 1fr);
    }
  }
}

This loop generates a class for each of the potential numbers of columns we might need. The last @include in the above code simply resets the column definition, making all columns full width on smaller screens. Now all we needed to do was add two classes and we'd have a grid!

Occasionally, we'd need grid items to span more than one column. Using the same mapping as before, I created a simple loop to generate classes defining the different column spans. These classes could then be applied to the immediate children of the grid wrapper.

.grid__item {
  @include from-large {
    @each $alpha, $numeric in $columns {
      &--span-#{$alpha} {
        grid-column: auto / span #{$numeric};
      }
    }
  }
}

Now we have complete control over our grid. Here's an example of how it's used.

<div class="grid grid--columns-3">
  <div class="grid__item">First item</div>
  <div class="grid__item grid__item--span-two">Second item spanning two columns</div>
  <div class="grid__item grid__item--span-three">Third item spanning three columns</div>
</div>

 

Pattern Lab

In the previous article I mentioned the setup of Pattern Lab and Emulsify but didn't look into the actual development, so let's do that now! Although we're used to coding SCSS in a modular way here at CM, Pattern Lab's standalone components essentially work like modules, so we don't need to take too much extra care to produce nice clean code. Each SCSS file caters only for a small component on the page, and as such is usually small and specific.

But as well as including our pattern-specific code within each component's directory, we needed to make sure we also worked in a SMACSS-like way to reduce the CSS we were generating. We didn't want multiple classes applying the same styling, so any rules that would be reused consistently, like padding, were placed in a Base SCSS file inside the Base folder.

Of course, once we had defined our classes we needed to get them into the Pattern Lab Twig templates. As components will have variations, we can't just hard-code the classes into the templates; we need to pass them in as variables. Passing variables to Twig files is super simple, and with Emulsify 2.x there's now even Drupal Attributes support thanks to the addition of the BEM Twig extension. As we are likely to want to pass multiple classes to the same element, we can pass in a simple array of modifiers and render it out in the Twig template. So in a Drupal preprocess we can prepare some modifiers (we'll look at passing these on to the Pattern Lab Twig files later):

$variables['heading_modifiers'] = ['centered', 'no-space'];
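For context, an assignment like this would typically sit in a theme preprocess function. Here is a minimal sketch (the theme name and the preprocess hook shown are illustrative, not taken from the actual codebase):

/**
 * Implements hook_preprocess_HOOK() for the page title template (illustrative).
 */
function mytheme_preprocess_page_title(array &$variables) {
  // BEM modifiers to be passed through to the Pattern Lab heading component.
  $variables['heading_modifiers'] = ['centered', 'no-space'];
}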

And then in our Twig file we pass this through the BEM function:

{% set heading_base_class = heading_base_class|default('h' ~ heading_level) %}
<h{{ heading_level }} {{ bem(heading_base_class, heading_modifiers) }}>{{ heading }}</h{{ heading_level }}>

Which renders markup along these lines:

<h2 class="h2 h2--centered h2--no-space">Heading</h2>

 

Backend

The beauty of using Pattern Lab is the ability to work simultaneously on frontend and backend development. Before bringing more hands on deck, I was able to begin the backend of the site without getting anywhere close to completing the frontend. As mentioned earlier, the codebase was set up before the frontend work began, so we could jump straight in to the Emulsify theme. Using Composer allowed us to quickly get Drupal 8 and the contrib modules we needed, so when we were ready to start on the backend we could dive straight in too.

This site required nothing too complex in terms of backend development; the work was more a task of building content types and views to display content as per the designs. That said, we did utilise the Paragraphs module, allowing us to create reusable entities (or tiles, as we're used to calling them), as they are used extensively throughout the designs.

 

Configuration

Something that hasn't stayed the same in our Drupal 8 builds since the release is how we handle configuration. Gone are the days of bundling settings into features; Drupal 8 core comes with configuration management tools. In the early days, one of our senior developers created cm_config_tools, a module to give developers precise control over which config to export. Drupal 8 has progressed since then, and the timing of this project allowed us to use a new module, Configuration Split.

Configuration Split builds on Drupal Core's configuration management ability to export a whole set of a site's configuration by allowing us to define sets of configuration to be exported to separate directories. It's then possible to define in settings.php which directories to include when importing/exporting. As we were committing settings.php to git we could include the main config directory here and then have a local.settings.php (not committed to git) to define the database and any other config directories to include:

## Enable config split settings
$config['config_split.config_split.local_dev']['status'] = TRUE;
$config['config_split.config_split.local_overrides']['status'] = TRUE;

This means we can have configuration solely for use when developing (things like Devel and Field UI). It's also possible to locally override settings that are included in the main config export. This allows us to run local environments without fear of interfering with live functionality, such as affecting comments by changing the Disqus domain.
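For illustration, a local override in local.settings.php might look something like this (the Disqus configuration key shown is an assumption about that module's config schema, and the shortname value is made up):

// local.settings.php - not committed to git.
// Point Disqus at a throwaway shortname so local comments never hit the live forum.
$config['disqus.settings']['shortname'] = 'example-dev';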

Importing and exporting works the same way as Core's configuration management, by using Drush commands:

drush cim
drush cex

 

Templating

In a normal Drupal project, the markup (Twig files) would live within Drupal's templating system, with prepared variables rendered out where they were needed. With our component-based Pattern Lab, all of our markup was within the Pattern Lab structure, away from Drupal's /templates directory. Fortunately, including it is simple enough. First we needed to download and install the Component Libraries module. This allowed us to specify a different directory for our Twig files and also register Twig namespaces for those files. We do this in the theme's .info.yml file:

component-libraries:
  base:
    paths:
      - components/_patterns/00-base
  atoms:
    paths:
      - components/_patterns/01-atoms
  molecules:
    paths:
      - components/_patterns/02-molecules
  organisms:
    paths:
      - components/_patterns/03-organisms
  templates:
    paths:
      - components/_patterns/04-templates
  pages:
    paths:
      - components/_patterns/05-pages

Now that our Pattern Lab Twig files were included, we could begin to link them up to Drupal's templating system. Linking them is as simple as choosing which component you want to display and calling that component's Twig file from your Drupal template, passing in the variables from Drupal.

So if we wanted to display a page title as an H1, within page-title.html.twig inside Drupal's template directory we would call our Pattern Lab's heading component passing in the title and heading level:

{{ title_prefix }}
{% if title %}
  {% include "@atoms/02-text/00-headings/_heading.twig" with {
    "heading": title,
    "heading_level": 1,
  } %}
{% endif %}
{{ title_suffix }}

If we wanted to change the style of the heading we could also pass in an array of modifiers, as shown in the example further up the page. For more complex page components we can pass in an array to be looped over inside the component's Twig file. For example, if we wanted a listing of cards we could pass an array to a listing component's Twig template and, within that, loop through the array, each time calling another component's Twig template:

{% for item in content_array %}
  {% include "@molecules/card/01-card.twig" with {
    "card_img_src": item.image,
    "card_title": item.title,
    "card_body": item.body,
    "card_button_content": item.button_text,
    "card_button_url": item.button_url,
    "card_button_modifiers": item.button_mods,
    "card_url": item.url,
    "card_img_alt": item.image_alt,
  } %}
{% endfor %}

This is just a brief overview and a look at some interesting parts; there was obviously a lot more work that went into the site build! Now, as this website was built to replace our old site, we needed the content from the old site to be moved over. In the next article, Christian is going to talk through this process.

Categories: Drupal

Third & Grove: A Year Later and Drupal Commerce is Still in Existential Crisis

1 June 2018 - 5:00am
justin, Fri, 06/01/2018 - 08:00
Categories: Drupal

Agiledrop.com Blog: AGILEDROP: Top Drupal blog posts from May

1 June 2018 - 12:00am
Each month, we revisit our top Drupal blog posts of the month, giving you the chance to check out some of our favourites. This month was all about decoupled Drupal and JavaScript, so check it out! First on the list is Nightwatch in Drupal Core by Sally Young from Lullabot. In this blog post, she introduces us to Nightwatch, a functional testing framework that has been integrated into Drupal so we can test JavaScript with JavaScript itself. She explains what the features are and how you can try it out. We continue our list with Working toward a JavaScript-driven Drupal administration…
Categories: Drupal

Virtuoso Performance: Disabling functionality temporarily during migration

31 May 2018 - 8:25am
mikeryan, Thursday, May 31, 2018 - 10:25am

Continuing with techniques from the “Acme” project, the location content type had an address field and a geofield, with field_geofield configured to automatically determine latitude and longitude from the associated field_address - a fact I was initially unaware of. Our source data contained latitude and longitude already, which I mapped directly in the migration:

field_geofield:
  plugin: geofield_latlon
  source:
    - latitude
    - longitude

However, while testing the location migrations by repeatedly running the import, I soon started getting messages from the Google Maps API that my daily quota had been exceeded, and quickly tracked the problem down to the integration with field_address. Clearly, the calls out to Google Maps were both unnecessary and hazardous - how to prevent them? Fortunately, the migration system provides events which fire before and after each migration is executed. So, we subscribe to MigrateEvents::PRE_IMPORT to save the current settings and disable the external call:

public function onMigrationPreImport(MigrateImportEvent $event) {
  if ($event->getMigration()->id() == 'location') {
    $fields = \Drupal::entityTypeManager()->getStorage('field_config')->loadByProperties(['field_name' => 'field_geofield']);
    if ($fields) {
      /** @var \Drupal\field\Entity\FieldConfig $field */
      if ($field = $fields['node.location.field_geofield']) {
        $this->originalSettings = $field->getThirdPartySettings('geocoder_field');
        $field->setThirdPartySetting('geocoder_field', 'method', 'none');
        $field->save();
      }
    }
  }
}

And we subscribe to MigrateEvents::POST_IMPORT to restore the original settings:

public function onMigrationPostImport(MigrateImportEvent $event) {
  if ($event->getMigration()->id() == 'location') {
    $fields = \Drupal::entityTypeManager()->getStorage('field_config')->loadByProperties(['field_name' => 'field_geofield']);
    if ($fields) {
      /** @var \Drupal\field\Entity\FieldConfig $field */
      if ($field = $fields['node.location.field_geofield']) {
        foreach ($this->originalSettings as $key => $value) {
          $field->setThirdPartySetting('geocoder_field', $key, $value);
        }
        $field->save();
      }
    }
  }
}
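For completeness, here is a minimal sketch of how those two methods might be wired up as an event subscriber (the class name is illustrative, and the class still needs to be registered as a tagged event_subscriber service in the module's services.yml):

use Drupal\migrate\Event\MigrateEvents;
use Drupal\migrate\Event\MigrateImportEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class GeofieldToggleSubscriber implements EventSubscriberInterface {

  /**
   * The geocoder_field third-party settings saved before the import.
   *
   * @var array
   */
  protected $originalSettings = [];

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    return [
      MigrateEvents::PRE_IMPORT => 'onMigrationPreImport',
      MigrateEvents::POST_IMPORT => 'onMigrationPostImport',
    ];
  }

  // onMigrationPreImport() and onMigrationPostImport() as shown above.

}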

The thoughtful reader may note a risk here - what if someone were adding or editing a location node while this were running? The geofield would not be populated from the address field. In this case, this is not a problem - this is a one-time bulk migration (and no one should be making changes on a production website at such a time). In cases involving an ongoing feed where the feed data is used as-is on the Drupal site, it would also not be a problem, although if there were a practice of manually editing imported content there would be some risk.

Tags: Drupal Planet, Drupal, Migration. Use the Twitter thread below to comment on this post:

https://t.co/TRqRsWJPlA

— Virtuoso Performance (@VirtPerformance) May 31, 2018

 

Categories: Drupal

OpenSense Labs: How to strategize web personalization with Drupal

31 May 2018 - 5:33am
Shankar, Thu, 05/31/2018 - 18:03

You might have listened to your favourite band's new album on a music application, or streamed a critically acclaimed movie on a video platform. In both cases, you will notice suggestions curated especially for you based on your taste in music and movies. Personalized content is the way to go for providing a better user experience. Drupal has provisions for building personalization features into your site to tailor content to the interests of the user, thereby enhancing user engagement.

Source: Getty Images

Proper analysis of web personalization criteria and strategies should prove vital for digital firms. According to research from Econsultancy, 94 percent of in-house marketers agree that web personalization is critical to the current and future growth of their business.

What is web personalization anyway?

Source: Marketo

Web personalization means creating dynamic, personalized content based on attributes like the profile, behaviour and location of internet users, in order to provide them with a relevant website experience.

It is about understanding the interests of your users, tailoring the website to their profile, and offering them the content that is most relevant to them.

Because of the intricacies and traffic volume of some websites, digital marketers may find web personalisation convoluted and assume it will occupy a lot of their time. They may also think it is only for large enterprises with an enormous global presence, a huge team and an ever-growing budget. But with the right tools and strategies, it can be incorporated into any website, no matter what sort of business it serves, making the website more effective and boosting the return on investment.

How is web personalization meritorious?

Digital marketers constantly work on personalizing the way they interact with the customers and attain their objectives through customer satisfaction and retention. There are various ways you can reap the merits of web personalization.

  • Strengthens customers' loyalty to your brand: You can use it to shape how customers think about you and your brand. Research from Invespcro found that 45 percent of online users are more likely to shop on a website that shows personalized suggestions. You can compile the data collected from a user's interactions with your website and curate messages for your cross-channel marketing, helping to develop brand value and customer loyalty.
  • Enhances lead generation: HubSpot research found that personalized calls-to-action (CTAs) help generate leads. CTAs targeted at specific users had a 42 percent higher view-to-submission rate than ones that were the same for every user.
  • Offers insight into online visitors' preferences: You gain deep insight into who your online users are and what they prefer, which leads to a higher conversion rate. You can then channel your messages based on what a given user might want to know.
  • Shores up sales and revenue: Through loyalty programs, it can help boost your sales. It is not just about offering discounts and merchandise; it promotes more user engagement. For instance, you can alert customers when a product is back in stock, or encourage future purchases by sending notifications when a brand new product is launched. A report by McKinsey and Company states that acquisition costs fall by almost 50 percent and business revenue increases by 5-15 percent.
  • Increases conversion rate: Web personalization can help track demographics and behavioural patterns and convert an anonymous user into a potential customer.
  • Improves user engagement: Engagement and acquisition are two of the most important terms digital marketers live by. Personalizing the website lets you cultivate user engagement through cross-selling, upselling and customer loyalty. The E-Tailing Group says that extending personalisation across multiple channels can increase customer spending by 500 percent.
  • Disseminates targeted ads: It helps in broadcasting targeted cross-channel promotional campaigns. By compiling user preferences and the advertisements they click on, your website can spread targeted messages to enhance user engagement. Constellation Research (https://www.monetate.com/blog/constellation-research-declares-ai-driven-personalization-the-answer) says that a lack of content relevancy leads to response rates 83 percent lower than the average promotional campaign.
Web personalization with Drupal

Drupal 8 provides the perfect foundation for incorporating technologies that enable personalization, from marketing automation to web analytics. Drupal 8's web services initiative has streamlined the process of sharing meaningful information with external systems. There is no restriction on which marketing tools you can apply, as Drupal-managed content can be turned into standardized data-sharing formats.

A session at DrupalCon New Orleans 2016 threw light on how personalization can be effectively implemented on a Drupal site to increase user engagement.

This DrupalCon session talked about how the Acquia Lift module can deliver unprecedented insight into what customers want and don't want, in order to serve them personalized content. With such a system incorporated into the site, digital marketers get more control over the automation, testing and measurement of marketing activities.

The Acquia Lift module helps unify content and the insights collected about customers from several sources, in order to deliver in-context, personalized experiences across multiple platforms. It offers functionality such as a drag-and-drop user interface for targeting messages, content syndication, behavioural targeting, A/B testing, unified customer profiles, and the merging of anonymous and known online visitor profiles.

Let's see how we can strategize the integration of web personalization into a Drupal site.

How to implement web personalization?

1. Understand the classifications of web personalization
  • Feedback: Implicit and explicit feedback can be used to send personalized messages to target segments. Data is implicit when it is inferred from user interactions with the website: assessing page visits, responses to CTAs and behavioural patterns in menu navigation gives you implicit data about user behaviour, as does Google Analytics data about a user's geolocation, browser or device. Data a user provides directly, for example by filling out a form, is explicit.
  • Data sources: You can gain insight into user behaviour from first-party data sources such as user interactions with the website, email marketing, and marketing automation. Second-party data sources are first-party information held by a different entity; you can obtain it from trusted partners who are willing to share the data under some kind of agreement. Third-party data sources are external data providers, which can come in handy for building out user profiles.
  • Identity: Attributes like a user's IP address, location, device and browser are automatically detected, non-personal identification traits. A user's age, gender and interests are personal identification traits.
  • Online visitor profile: A user's interactions with the website, such as onsite searches, the dates and times of visits and responses to online forms, help build their user profile.
2. Creation of a content plan
  • Segmentation of audience: Personalizing your website lets you target your messages accordingly. Content that needs to be sent to different sorts of audience calls for segmenting those audiences. You might have a lot of whitepapers, case studies and ebooks on your website that are useful for B2B marketing, or a lot of advertisements promoting discounted popular products that suit consumer marketing. Sending the right content to the right audience is of paramount importance.
  • Efficacy in mapping out content: Once you have decided which segment of the audience you are targeting, it is important to figure out how to map content to it effectively. For instance, in B2B marketing you can build on metrics like awareness, interest, evaluation and commitment through infographics, case studies, live demos and advanced solutions respectively. In consumer marketing, you can build on metrics like awareness, interest and decision through product highlights, videos and special offers respectively.

Source: Marketo

  • Improvements to existing content: Being smart about how content is disseminated is essential; creating a cornucopia of content on its own won't solve your trials and tribulations. Personalized campaigns may require you to develop new content, but it is just as important to improve the content that already exists on the site. Repurpose content according to your audience segments, revisit titles and CTAs to suit each segment, add industry-specific research studies that resonate with your content, and convert large reports into short ebooks, infographics and so on.
  • Choosing the right place on your website: You can customize your homepage based on the region from which a user is accessing your website. Sometimes an internal page with product details has higher SEO rankings than the homepage, so it is fruitful to personalize that page with live demos relevant to those visitors.
3. Testing the web personalization efforts
  • A/B testing: Also known as split testing, this is the process of comparing two or more versions of promotional campaigns or advertorial messages across your marketing channels to understand which approach is working and which isn't.
  • Importance of A/B testing: Not only does it enhance the engagement of online users and the efficacy of promotional campaigns, it also improves digital marketers' awareness and expertise in understanding user preferences.
4. Measuring the success rate

Source: Marketo

You will notice the early signs of success in increased time on site, the amount of content consumed, and the number of returning visitors. Thereafter, contact quality gets better. Finally, you see an improvement in return on investment.

Source: Marketo

Case study

Drupal has a proven track record in the healthcare industry. So what does it mean for content to be “personalized”? Humana, a Fortune 500 healthcare company, personalized a Drupal microsite for one of their customers with the help of a digital agency in order to deliver relevant content.

The objective was to send targeted content to users and learn their interests. Humana wanted to personalize the website based on data including demographics and the click paths of its users. They pushed their existing online users towards a personal wellness analysis that provided insight into their behavioural patterns and interests.

Source: Acquia

When users filled out the personal wellness registration form, Humana got their demographic details. Users could also manually edit their settings through sliders, which let them choose the type of content they wanted to see on the website. This paved the way for mapping users with similar demographics and preferences.

Source: Acquia

They could then disseminate content by constructing different segments such as health and finance. In the health segment, for instance, they could categorise users on the basis of attributes like demographics (gender and age) and site activity (users who mostly clicked on health-related content).

Conclusion

Content strategies built on a one-size-fits-all approach will make your site feel banal and trite. Digital strategies have come full circle and now lean on web personalization tactics to provide relevant, meaningful content to users. Drupal 8 provides a magnificent platform for personalizing a website, and with the right strategies and plans in place you can build a real rapport between online users and your website.

Ping us at hello@opensenselabs.com to personalize your site and build a colossal online presence.

Categories: Drupal

LakeDrops Drupal Consulting, Development and Hosting: Own your data (again)

31 May 2018 - 5:03am
Jürgen Haas, Thu, 05/31/2018 - 14:03

My personal #gdpr today, May 25th 2018: completed my project to get back all my data from @Google, @evernote et al and host it all by myself with @Nextclouders, #joplin and dozens of other @OpenSourceOrg tools that come with the same convenience but with real privacy. Check!

Categories: Drupal

Promet Source: Should I Fix my Existing Site or Build a New Site from Scratch?

30 May 2018 - 7:57pm
Does an accessibility issue on my website mean I need to build a brand new one? This might be one of many questions rolling around in your head as you read the email or letter informing you that your site has an accessibility problem. Don't panic just yet. It could be something simple, but you need to have all the facts. You need a plan of attack, and that starts with a site audit.
Categories: Drupal

OSTraining: How to Use Google Webfonts in Your Drupal 8 Site

30 May 2018 - 11:07am

Although Drupal has a reputation for being a developers' platform, lots of users rely on Drupal's admin area for key tasks.

For typography in Drupal sites, the best way to change your site's fonts via the admin is a module called @font-your-face.

The @font-your-face module allows you to work with webfonts like Google Fonts or Font Squirrel. It also provides the ability to work with paid font services like Typekit or fonts.com.

In this tutorial, you’ll learn how to configure and use this module in Drupal 8.

Categories: Drupal

Drupal Europe: Drupal Europe Conference — Government Track

30 May 2018 - 9:49am
Photo by Drupal Association

Government touches the lives of us all in fundamental ways. It is essential that government is able to communicate with its citizens in an effective and inclusive manner.
This communication requires high quality tools and special considerations regarding:

  • exchanging information with each other and with citizens in an open manner
  • providing the ability for citizens to see how their government is run
  • protecting citizens’ data and privacy
  • providing modern and easy-to-use technologies for both citizens and authorities
  • contributing back their code and data, because it’s paid for by the citizens

Therefore, we have dedicated a special government track at the Drupal Europe Conference.

As you’ve probably read in one of our previous blog posts, industry verticals are a new concept being introduced at Drupal Europe, replacing the summits which typically took place on the Monday. At Drupal Europe, these industry verticals are integrated with the rest of the conference (same location, same ticket) and provide more opportunities to learn and exchange within each vertical throughout the three days.

The Government vertical track of the Drupal Europe Conference is focused on trends and innovations as well as all aspects of the current developments and challenges within the government space.

In an exciting mix of case studies, panel discussions and thematic sessions, the following burning topics will be discussed:

  • Open access, data, government and standards
  • Accessibility / Inclusivity
  • Digital-by-default citizen services
  • User experience design for digital services
  • Hosting and Security
  • Content Management and Usability of digital tools
  • and more

We strive to provide the best possible lineup of speakers and sessions, with a great variety of interesting topics, to create the best conference for attendees who work in or are interested in government.

The Call for Sessions is open, and we ask you to submit interesting session proposals to help create an awesome conference. Session proposals are not limited to Drupal; all topics related to the above are welcome.

Please also help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

If you want to participate in organisation or want to recommend speakers or topics please get in touch at program@drupaleurope.org.

In any case, we look forward to seeing you at Drupal Europe on September 10–14 in Darmstadt, Germany!

About Drupal Europe Conference

Drupal is one of the leading open source technologies empowering digital solutions in the government space around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — with a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Categories: Drupal

Acquia Developer Center Blog: A Specification Tool for Drupal 8

30 May 2018 - 9:28am

Revered management thinker Peter Drucker once wrote, “If you can’t replicate something because you don’t understand it, then it really hasn’t been invented; it’s only been done.” In many ways content modeling in Drupal has been done without being invented. For this reason, we’re developing a discipline for content modeling at Acquia. It’s drastically reducing both costs and defect rates for us.

Tags: acquia drupal planet
Categories: Drupal

Lullabot: Decoupled Drupal Hard Problems: Routing

30 May 2018 - 8:00am

As part of the Decoupled Hard Problems series, in this fourth article, I'll discuss some of the challenges surrounding routing, custom paths and URL aliases in decoupled projects. 

Decoupled Routing

It's a Wednesday afternoon, and I'm using the time that Lullabot gives me for professional development to contribute to Contenta CMS. Someone asks me a question about routing for a React application with a decoupled Drupal back-end, so I decide to share it with the rest of the Contenta Slack community and a lengthy conversation ensues. I realize the many tendrils that begin when we separate our routes and paths from a more traditional Drupal setup, especially if we need to think about routing across multiple different consumers. 

It's tempting to think about decoupled Drupal as a back-end plus a JS front-end application. In other words, a website. That is a common use case, probably the most common. Indeed, if we can restrict our decoupled architecture to a single consumer, we can move as many features as we want to the server side. Fantastic, now the editors who use the CMS have many routing tools at their disposal. They can, for instance, configure the URL alias for a given node. URL aliases allow content editors to specify the route of a web page that displays a piece of content. As Drupal developers, we tend to make no distinction between such pieces of content and the web page that Drupal automatically generates for it. That's because Drupal hides the complexity involved in making reasonable assumptions:

  •  It assumes that we need a web page for each node. Each of those has a route node/<nid> and they can have a custom route (aka URL alias).
  •  It means that it is okay to add presentation information in the content model. This makes it easy to tell the Twig template how to display the content (like field_position = 'top-left') in order to render it as the editor intended.

Unfortunately, when we are building a decoupled back-end, we cannot assume that our pieces of content will be displayed on a web page, even if our initial project is a website. That is because when we eventually need a second consumer, we will need to make amends all over the project to undo those assumptions before adding the new consumer.

Understand the hidden costs of decoupling in full. If those costs are acceptable—because we will take advantage of other aspects of decoupling—then a rigorous separation of concerns that assigns all the presentation logic to the front-end will pay off. It takes more time to implement, but it will be worth it when the time comes to add new consumers. While it may save time to use the server side to deal with routing on the assumption that our consumer will be a single website,  as soon as a new consumer gets added those savings turn into losses. And, after all, if there is only a website, we should strongly consider a monolithic Drupal site.


After working with Drupal or other modern CMSes, it's easy to assume that content editors can just input what they need for SEO purposes and all the front-ends will follow. But let's take a step back to think about routes:

  • Routes are critical only for website clients. Native applications can also benefit from them, but they can function with just the resource IDs on the API.
  • Routes are important for deep linking in web and native applications. When we use a web search engine in our phone and click a link, we expect the native app to open on that particular content if we have it installed. That is done by mapping the web URL to the app link.
  • Links are a great way to share content. We want users to share links, and then let the appropriate app on the recipient's mobile device open if they have it installed.

It seems clear that even non-browser-centric applications care about the routes of our consumers. Luckily, Drupal considers the URL alias to be part of the content, so it's available to the consumers. But our consumers' routing needs may vary significantly.

Routing From a Web Consumer

Let's imagine that a request to http://cms.contentacms.io/recipes/4-hour-lamb-stew hits our React application. The routing component will know that it needs to use the recipes resource and find the node that has a URL alias of /4-hour-lamb-stew. Contenta can handle this request with JSON API and Fieldable Path—both part of the distribution. With the response to that query, the React app builds all the components and displays the results to the user.

It is important to note the two implicit assumptions in this scenario. The first is that the inbound URL can be tokenized to extract the resource to query. In our case, the URL tells us that we want to query the /api/recipes resource to find a single item that has a particular URL alias. We know that because the URL in the React side contains /recipes/... What happens if the SEO team decides that the content should be under https://cms.contentacms.io/4-hour-lamb-stew? How will React know that it needs to query the /api/recipes resource and not /api/articles?

The second assumption is that there is a web page that represents a node. When we have a decoupled architecture, we cannot guarantee a one-to-one mapping between nodes and pages. Though it's common to have the content model aligned with the routes, let's explore an example where that's not the case. Suppose we have a seasonal page in our food magazine for the summer season (accessible under /summer). It consists of two recipes, and an article, and a manually selected hero image. We can build that easily in our React application by querying and rendering the content. However, everything—except for the data in the nodes and images—lives in the react application. Where does the editor go to change the route for that page?

On top of that, SEO will want it so that when a URL alias changes (either editorially or in the front-end code) a redirect occurs, so people using the old URL can still access the content. Note that a change in the node title could trigger a change in the URL alias via Pathauto. That is a problem even in the "easy" situation. If the alias changes to https://cms.contentacms.io/recipes/four-hour-stewed-lamb, we need our React application to still respond to the old https://cms.contentacms.io/recipes/4-hour-lamb-stew. The old link may have been shared in social networks, linked to from other sites, etc. The problem is that there is no recipe with an alias of /recipes/4-hour-lamb-stew anymore, so the Fieldable Path solution will not cover all cases.

Possible Solutions

In monolithic Drupal, we'd solve the aforementioned SEO issue by using the Redirect module, which keeps track of old path aliases and can respond to them with a redirect to the new one. In decoupled Drupal, we can use that same module along with the new Decoupled Router module (created as part of the research for this article).

The Contenta CMS distribution already includes the Decoupled Router module for routing as we recommend this pattern for decoupled routing.

Pages—or visualizations—that comprise a disconnected selection of entities—our /summer page example—are hard to manage from the back-end. A possible solution could be to use JSON API to query the entities generated by Page Manager. Another possible solution would be to create a content type, with its corresponding resource, specific for that presentation in that particular consumer. Depending on how specific that content type is for the consumer, that will take us to the Back-end For Front-end pattern, which incurs other considerations and maintenance costs.

For the case where multiple consumers claim the same route but have that route resolve to different nodes, we can try the Contextual Aliases module.

The Decoupled Router

Decoupled Router is an endpoint that receives a front-end path and tries to resolve it to an entity. To do so it follows as many redirects and URL aliases as necessary. In the example of /recipes/four-hour-stewed-lamb it would follow the redirect down to /recipes/4-hour-lamb-stew and resolve that URL alias to node:1234. The endpoint provides some interesting information about the route and the underlying entity.


In a previous post, we discussed how multiple requests degrade performance significantly. With that in mind, making an extra request to resolve the redirects and aliases seems less attractive. We can solve this problem using the Subrequests module. Like we discussed in detail, we can use response tokens to combine several requests in one.

Imagine that we want to resolve /bread and display the title and image. However, we don’t know if /bread will resolve into an article or a recipe. We could use Subrequests to resolve the path and the JSON API entity in a single request.

[Image: the Subrequests request used to resolve /bread]

In the request above, we provide the path we want to resolve. Then we get the following response.

[Image: the combined response containing the resolved route and the JSON API entity]

To summarize, we can use Decoupled Router in combination with Subrequests to resolve multiple levels of redirects and URL aliases and get the JSON API data all in a single request. This solution is generic enough that it serves in almost all cases.

Conclusion

Routing in decoupled applications becomes challenging because of three factors:

  • Instead of one route, we have to think about (at least) two, one for the front-end and one for the back-end. We can mitigate this by keeping them both in sync.
  • Multiple consumers may decide on different routing patterns. This can be mitigated by reaching an agreement among consumers. Another alternative is to use Contextual Aliases along with Consumers. When we want back-end changes that only affect a particular consumer, we can use the Consumers module to make that dependency explicit. See the Consumer Image Styles module—explained in a previous article—for an example of how to do this.
  • Some visualizations in some of the consumers don’t have a one-to-one correspondence with an entity in the data model. This is solved by introducing dedicated content types for those visualizations. That implies that we have access to both back-end and front-end. A custom resource based on Page Manager could work as well.

In general, whenever we need editorial control we'll have to turn to the back-end CMS. Unfortunately, the back-end affects all consumers, not just one. That may or may not be acceptable, depending on each project. We will need to make sure to consider this when thinking through paths and aliases on our next decoupled Drupal project.

Lucky for us, every project has constraints we can leverage. That is true even when working on the most challenging back-end of all—a public API that powers an unknown number of 3rd-party consumers. For the problem of routing, we can leverage these constraints to use the mitigations listed above.

Hopefully, this article will give you some solutions for your Decoupled Drupal Hard Problems.

Note: This article was originally published on November 29, 2017. Following DrupalCon Nashville, we are republishing (with updates) some of our key articles on decoupled or "headless" Drupal as the community as a whole continues to explore this approach further. Comments from the original will appear unmodified.

Photo by William Bout on Unsplash.

Categories: Drupal

ComputerMinds.co.uk: GDPR compliance steps for Drupal Developers

30 May 2018 - 7:54am

The new GDPR laws are here, hurrah!

Having a number of developers handling databases from a number of client sites could easily be a nightmare, but we at ComputerMinds spent quite some time thinking about how to get and keep everybody safe and squeaky clean on the personal data front.

Here's a quick run-down of the key things to be aware of - and a pretty poster to help you keep it all in mind :)

Remove personal data from your system

  1. Review all databases on your computer, making sure to also consider those .sql dump files still sitting in your downloads directory or your recycle bin/trash.
  2. If there are databases that you need to keep on your system, then you must sanitize them by encrypting, anonymizing or removing personal data.
  3. Review all testing / UAT environments and ensure they're running off sanitized databases where possible.

Stay clean by using sanitized databases

Set up some _drush_sql_sync_sanitize() hooks to deal with the personal data stored on your site. Then either have your Jenkins server use them to provide sanitized dumps, or ensure that your developers use them to sanitize databases immediately after importing.
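For reference, here is a minimal sketch of such a hook in a Drush 8 command file (the file, table and field names are purely illustrative and will differ per site):

/**
 * Implements hook_drush_sql_sync_sanitize().
 */
function example_drush_sql_sync_sanitize($site) {
  // Truncate Webform submission data (table name assumes Webform 8.x).
  drush_sql_register_post_sync_op(
    'example-webform',
    dt('Truncate webform submission data.'),
    'TRUNCATE webform_submission_data;'
  );
  // Blank out a custom telephone field on users (hypothetical field).
  drush_sql_register_post_sync_op(
    'example-user-phone',
    dt('Blank out telephone numbers on user accounts.'),
    "UPDATE user__field_telephone SET field_telephone_value = '';"
  );
}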

When setting up your hook, make sure to consider things like:

  • User table - clear out email addresses, usernames etc.
  • Custom fields on users - names, telephone numbers etc. that you've added.
  • Webform / contact form submissions - make sure that your Webform / contact form data gets cleared out. Webform 7.12 and above has these hooks included, but it's good to double-check.
  • Commerce order table - you'll need to remove personal data from the commerce orders.
  • Commerce profile tables - make sure that the personal data in the profiles gets anonymized or removed.
  • Commerce payment gateway callback tables - these will have detailed payment transaction data, and absolutely must be cleared out.
  • URL aliases & redirects - by default Drupal sets up aliases for users' usernames, so you'll need to review those tables.
  • Comments - these usually have name, email and website fields that will need clearing out. But their body content may also have personal data in too, so you might be better off just binning the lot.
  • Watchdog / logging tables - these take up lots of space, so you probably don't want to export them off the live site anyway, but think seriously about the personal data inside if you do decide you want to import them elsewhere. Truncate recommended.
  • Cache tables - these can be huge, so you probably don't want to export them off the live site anyway, but think seriously about the personal data inside if you do decide you want to import them elsewhere. Truncate recommended.

This is certainly not a complete list, but we can't tell you what custom fun you've implemented on your site - so it's down to you to go and check your tables!

Stay vigilant

  • Ensure future development environments and UAT/test environments are built using sanitized databases.
  • If you receive user data via email, immediately delete the email and attachments and reprimand the sender!
  • Talk to your clients about changes that need to be made to their sites.


 

Categories: Drupal

Dries Buytaert: Making the web easier and safer with the Web Authentication standard

30 May 2018 - 6:05am

Firefox 60 was released a few weeks ago and now comes with support for the upcoming Web Authentication (WebAuthn) standard.

Other major web browsers weren't far behind. Yesterday, the release of Google Chrome 67 also included support for the Web Authentication standard.

I'm excited about it because it can make the web both easier and safer to use.

Support for the Web Authentication standard will make the web easier, because it is a big step towards eliminating passwords on the web. Instead of having to manage passwords, we'll be able to use web-based fingerprints, facial authentication, voice recognition, a smartphone, or hardware security keys like the YubiKey.

It will also make the web safer, because U2F will help reduce or even prevent phishing, man-in-the-middle attacks, and credential theft. If you are interested in learning more about the security benefits of the Web Authentication standard, I recommend reading Adam Langley's excellent analysis.

When I have a bit more time for side projects, I'd like to buy a YubiKey 4C to see how it fits in my daily workflow, in addition to what it would look like to add Web Authentication support to Drupal and https://dri.es.

Categories: Drupal

TEN7 Blog's Drupal Posts: Episode 029: Wilbur Ince, Drupal Frontend Developer and Human Rights Activist

30 May 2018 - 5:54am
Wilbur Ince, Drupal Frontend Developer and Human Rights Activist, sits down with Ivan Stegic to discuss his career, road to Drupal and the valuable volunteer work he does.
Categories: Drupal

Jacob Rockowitz: Are we afraid to estimate our work in Drupal and open source?

30 May 2018 - 5:54am

Estimation is not a new concept nor is it a bad word.

Estimation is not a new topic for anyone in the Drupal or open source community. We do it every day at our jobs. We even discuss estimation techniques at our conferences. We provide our clients with estimates when building and contributing back to open source projects, yet we don't include estimates within our open source community and issue queues.

We all provide estimates to our clients - are we afraid to do it when it comes to Drupal and Open Source?

Before we take on this tough question, 'Are we afraid to estimate our work in Drupal and open source?', let's start off with a straightforward question: 'Why do we provide estimates to our clients?' The answer is just that our clients want to know how much something is going to cost and we want to know how much work is required to complete a project.

To give this discussion more context, let's begin with a very general definition of estimation

Here’s my hypothesis:

The Science of guessing - Drupal estimation techniques from project managers

While researching estimation within the Drupal community, I found a bunch of great presentations about project management and estimation. To me, "The Science of guessing - Drupal estimation techniques from project managers" by Shannon Vettes (svettes), Jakob Persson (solipsist), and Mattias Axelsson (acke), was the most comprehensive and inspiring presentation. Feel free to watch this presentation. I am going to pull a few slides from this presentation to help move through this exploration.

Every presentation I watched focused on estimation concerning managing expectations for…

Categories: Drupal
