Drupal

Amazon Elastic Transcoder

New Drupal Modules - 12 October 2017 - 8:09am

A video transcoder for the Video module. This project allows Drupal 7 website operators to transcode video using the Amazon Elastic Transcoder service on the AWS cloud. Services utilized include S3, SNS, and Elastic Transcoder.

Dependencies:

  • Video (2.x)
  • Amazon S3 (2.x-dev)
  • Amazon S3 CORS (optional w/ patches provided below)

Several patches are required to achieve full feature coverage. Issues will be listed below.

Categories: Drupal

Yet Another Blog Archive

New Drupal Modules - 12 October 2017 - 7:46am

Provides an "Archive Block", similar to blogger.com's, for the core Blog module.

Categories: Drupal

Noticeboard

New Drupal Modules - 12 October 2017 - 7:21am

A virtual notice board.
Used to provide users with information about time-sensitive items across the site.

Categories: Drupal

Vardot: SEO Checklist Before Launching Your Drupal Website

Planet Drupal - 12 October 2017 - 7:11am
Dmitrii Susloparov, Thu, 10/12/2017 - 17:11

Search Engine Optimization (SEO) might not be the first thing you think of when designing a new website, but building an optimized framework from the start will help you drive traffic to your site and keep it there. With our Drupal SEO checklist in hand you can build an excellent website that draws customers from launch day. In brief, here is a list of what to check before launch day; below we cover each point in more detail.

  • Check that all web pages have unique titles using the Page Title module

  • Check if XML Sitemap and Google News Sitemap are configured properly

  • Check if Redirect module is enabled and configured

  • Check if Global Redirect module is enabled and configured

  • Check that .htaccess redirects to the site with/without www

  • Check that the homepage title includes a slogan, and is descriptive for the function of the site

  • Check if Meta Tags is filled with descriptive information

  • Check that OG tags are filled correctly and with descriptive information.

  • Check if site's information appears well when shared on Facebook

  • Check if Path aliases patterns are meaningful

  • Check if Google Analytics is enabled and configured

  • Check if Page Title module is enabled and configured

  • Check if Google News Sitemap is enabled and configured

  • Check if Site verification is enabled and configured

  • Check if Search 404 module is enabled and configured

Drupal SEO: 12 Things that Will Improve Your Site's Ranking

Check that all web pages have unique titles...

...and make sure to write them correctly. All of your pages should be easily identifiable to the end user. Not only should they have unique titles, they should have meaningful titles. Having multiple pages with the same titles (like “Get in touch”, “Contact us” and “Make a booking”) will simply confuse your end users and search engine crawlers.

Not only do good page titles help customers who are already on your site, but they help with social sharing, and picking your site out of search engine results. Titles are the first element that any user will see, whether they come directly to your site, find it in a search engine, or see it shared on social media.

Writing good titles is extremely important, and having keywords in your title that match a user's search greatly improves the chances of them clicking on your page.

Ensuring all your pages have a unique name will help users navigate, boost your SEO ratings, and increase the chances that someone will type the right keywords into a search engine to bring them to your site.

You can set up unique page titles much more easily if you install the Drupal Page Title module.

10 Drupal Modules that Will Boost Your Website’s SEO

Check if XML Sitemap and Google News Sitemap are configured properly

The XML Sitemap module creates a robot friendly map of your site that Google and other search engines can crawl to categorise your website. There are a few settings you can alter for your site at admin/config/search/xmlsitemap and you can view the sitemap from http://yoursite.com/sitemap.xml.

You should configure XML Sitemap early in your site build for the best effect, but you can also alter the settings later on if needed.

Google News Sitemap offers a similar but different service that creates a Google-specific map, as the name suggests. These two modules work nicely side by side to make your site easy for search engines to crawl and index.

Please note that if your site contains AMPs, there is no need to create sitemaps for them. The rel=amphtml link is enough for Google to pick up on the accelerated mobile page version, which means you can easily gain traffic from Top Stories carousels and mobile search. Creating AMP on your Drupal site became easy with our step-by-step guide.

Check if Redirect module is enabled and configured

Redirect is a handy module for making sure users always make it to your site. It uses case-insensitive matching to help catch broken links with redirects and tracks how often users are hitting those redirects. You can use redirects to capture any broken links, set up promotional links, or simply capture typos users are entering when trying to access your site.

Check if Global Redirect module is enabled and configured

If you’re using Drupal 8 you can skip this one because the functionality has been rolled into the Redirect module. Otherwise install Global Redirect to work in tandem with Redirect to catch any broken links. Global Redirect will test all links with and without a trailing slash, ensure links are case-insensitive, and if a link is truly broken it will return a user to your home page, rather than an ugly 404 page that decreases the position of your site in SERPs.

Check that .htaccess redirects to site with/without www

Some users attempting to visit your site will navigate to www.yoursite.com, while others will simply type yoursite.com. By setting up your site to handle either request you can be sure you won’t miss any visitors.
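
Drupal's stock .htaccess already ships with commented-out rewrite rules for exactly this. As a minimal sketch of the kind of rules involved (the domain handling and HTTP vs. HTTPS scheme are assumptions; adapt them to your own setup, or prefer the non-www direction if that is your canonical host):

RewriteEngine on
# Redirect bare-domain requests to the www version (reverse the condition to prefer non-www).
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]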

Check that the homepage title includes a headline, logo and primary image and is descriptive for the function of the site

The headline as well as the slogan represent who you are as a business. Make your first impression a good one as this will also be visible on search engines. This is a good opportunity to stack your website with SEO friendly keywords, but don’t go overboard and sacrifice your image for it - keyword stuffing may not only decrease the trust index of your site, but also its conversion rates.

Ensure Metatags are filled with descriptive information

Writing SEO-optimized metatags is highly important, because they remain one of the top on-page ranking factors. Make sure to install the Metatag module on your site to have an easy, user friendly interface for updating metadata. With the module installed you can easily populate metadata with keywords, page descriptions, and more.

SEO tips for your Drupal site

The Metatag module will also give you extra control over how your site appears when shared on Twitter or Facebook.

Check that OG tags are filled correctly and with descriptive information.

OG tags are metatags specifically designed to ensure your site communicates nicely with Facebook. By setting these tags correctly you will be able to control exactly how your site appears on Facebook, including what images and what taglines are used.

Check if site's information appears well when shared on Facebook and Twitter

After configuring the Metatag module and OG tags, pop over to Facebook and make sure that your site shares the way you would like it to. It’s important to test this out now, before users start sharing your site around.

Similarly try tweeting a couple of your pages to see how well your Twitter Cards come through. If you don’t want to show your site to your audience until you are sure it is set up properly, you can check Twitter Cards using the Card Validator.

For more information on configuring Twitter cards, check out the Twitter user guides.

Check if Path aliases patterns are meaningful

By default Drupal will set your URLs to node/123 - while this works great for the database back end, it doesn’t work well for your end users, or for search engines.

You can use the Pathauto module to create rules and patterns for your URLs (for example, a pattern like blog/[node:title] turns node/123 into blog/my-first-post), which will significantly cut down on your maintenance time and simplify your site navigation.

Check if Google Analytics is enabled and configured

While having Google Analytics configured won’t improve your SEO, it will give you all the data you need to understand where your users are coming from and how they behave once they hit your site.

Installing the Google Analytics module makes setting up and configuring Google Analytics a breeze.

Check if Site verification is enabled and configured

The Site verification module makes it easy to check the boxes that tell search engines that your site is truly yours. Having your site verified will improve how search engines crawl your site, and for Google will allow you to access private search data. With site verification you will receive better data and better search engine rankings for just a few minutes' work.

Check if Search 404 module is enabled and configured

The Search 404 module is a saving grace for reducing your bounce rate, improving your SEO, and improving your customer experience. Instead of finding an "Error: Page not found" message in place of the content they were hoping for, users will be offered a search of your site based on the URL string. For example, if www.yoursite.com/great-seo-tips doesn't exist, this module will automatically search your site for "great seo tips" and show users the results.

Bottom line

While SEO may seem like a tricky subject to wrap your head around, the basics are easy with the right modules and the right guidance. Drupal is a great content management system for building search engine optimized websites.

With our SEO checklist you can get off on the right foot, and here at Vardot we love educating our customers to build top quality websites. If you’re looking for even more ways to improve your site's SEO, have a look at the SEO articles in our blog or get in touch with us.

Categories: Drupal

Lullabot: Incredible Decoupled Performance with Subrequests

Planet Drupal - 12 October 2017 - 6:52am

In my previous post, Modern Decoupling is More Performant, we discussed how saving HTTP round-trips has a very positive impact on performance. In particular, we demonstrated how the JSON API module could help your application by returning multiple entities in a single request. Doing so eliminates the need for making an individual request per entity. However, this is only possible when fetching entities (not when writing data), and only if those entities are related to the entry point (a particular entity or collection).

Sometimes you can solve this problem by writing a custom resource in the back-end every time, but that can lead to many custom resources, which impacts maintainability and is tiresome. If your API is public and you don’t have prior knowledge of what the consumers are going to do with it, it’s not even possible to write these custom endpoints.

The Subrequests module completes that idea by allowing ANY set of requests to be aggregated together. It can aggregate them even when one of them depends on a previous response. The module works with any request; it's not limited to REST or any other constraint. For simplicity, all the examples here will make requests to JSON API.

Why Do We Need It?

The main concept of the Subrequests module is that instead of sending multiple requests to your Drupal instance we will only send a single request. In this master request, we will provide the information about the requests we need to make in a JSON document. We call this document a blueprint.

A blueprint is a JSON document containing the instructions for Drupal to make all those requests in our name. The blueprint document contains a list of subrequest objects. Each subrequest object contains the information about a single request being aggregated in the blueprint.

Imagine that our consumer application has a decoupled editorial interface. This editorial interface contains a form to create an article. As part of the editorial experience, we want the form to create the article and a set of tags in the Drupal back-end.

Without using Subrequests, the consumer application would need to execute the following requests when the form is submitted:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user, based on the username present in the editorial app.
  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.
  • Create the article in the form using the user UUID and the newly created tags.

We can query for the user and the vocabulary in parallel. Once that is done, and using the information in the vocabulary response, we can create the tag entities. Once those are created, we can finally create the article. In total, we would be making five requests at three sequential levels. And, this is not even a complex example!

[Diagram from the original post showing the five requests across three sequential levels: the vocabulary and user lookups in parallel, then the two tag creations, then the article creation.]

A JavaScript pseudo-code for the form submission handler could look like:

console.log('Article creation started…');
Promise.all([
  httpRequest('GET', 'https://cms.contentacms.io/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags'),
  httpRequest('GET', 'https://cms.contentacms.io/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin'),
])
  .then(res => {
    const [vocab, user] = res;
    return Promise.all([
      Promise.resolve(user),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag1, headers),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag2, headers),
    ]);
  })
  .then(res => {
    const [user, tag1, tag2] = res;
    const body = buildBodyForArticle(formData, user, tag1, tag2);
    return httpRequest('POST', 'https://cms.contentacms.io/api/articles', body, headers);
  })
  .then(() => {
    console.log('Article creation finished!');
  });

Using Subrequests

Our goal is to have JavaScript pseudo-code that looks like:

console.log('Article creation started…');
const blueprint = buildBlueprint(formData);
httpRequest('POST', 'https://cms.contentacms.io/api/subrequests?_format=json', blueprint, headers)
  .then(() => {
    console.log('Article creation finished!');
  });

We've reduced our application code to a single POST request that contains a blueprint in the request body. We have reduced the problem to the blueprint creation. That is a big improvement in the developer experience of consumer applications.

Parallel Requests

In our current task we need to perform two initial HTTP requests that can be run in parallel:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user based on the username in the editorial app.

That translates to the following blueprint:

[
  {
    "requestId": "vocabulary",
    "action": "view",
    "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin",
    "headers": {"Accept": "application/vnd.api+json"}
  }
]

For each subrequest, we can observe that we are providing four keys:

  • requestId: A string used to identify the subrequest. This is an arbitrary value generated by the consumer application.
  • action: Identifies the action being performed. A "view" action will generate a GET request, a "create" action will generate a POST request, etc.
  • uri: The URL where the subrequest will be sent.
  • headers: An object containing the headers specific to this subrequest.

The response to this blueprint (after adjusting the permissions in Drupal to view users and vocabularies) will contain the responses to both subrequests:

{
  "vocabulary": {
    "headers": { "content-id": ["<vocabulary>"], "status": [200] },
    "body": "{\"data\":[{\"type\":\"vocabularies\",\"id\":\"47ce8895-0df6-44a4-af43-9ef3b2a924dd\",\"attributes\":{\"status\":true,\"dependencies\":{\"module\":[\"recipes_magazin\"]},\"_core\":\"HJlsFfKP4PFHK1ub6QCSNFmzAnGiBG7tnx53eLK1lnE\",\"name\":\"Tags\",\"vid\":\"tags\",\"description\":\"Use tags to group articles on similar topics into categories.\",\"hierarchy\":0,\"weight\":0},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies\\/47ce8895-0df6-44a4-af43-9ef3b2a924dd\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies?filter%5Bvid-filter%5D%5Bcondition%5D%5Bpath%5D=vid\\u0026filter%5Bvid-filter%5D%5Bcondition%5D%5Bvalue%5D=tags\"}}"
  },
  "user": {
    "headers": { "content-id": ["<user>"], "status": [200] },
    "body": "{\"data\":[{\"type\":\"users\",\"id\":\"a0b7af80-e319-4271-899f-f151d3fbfc8e\",\"attributes\":{\"internalId\":1,\"name\":\"admin\",\"mail\":\"admin@example.com\",\"timezone\":\"Europe\\/Madrid\",\"isActive\":true,\"createdAt\":\"2017-09-15T15:47:26+0200\",\"updatedAt\":\"2017-09-15T20:06:15+0200\",\"access\":1505565434,\"lastLogin\":\"2017-09-15T20:06:07+0200\"},\"relationships\":{\"roles\":{\"data\":[]}},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users\\/a0b7af80-e319-4271-899f-f151d3fbfc8e\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users?filter%5Badmin%5D%5Bcondition%5D%5Bpath%5D=name\\u0026filter%5Badmin%5D%5Bcondition%5D%5Bvalue%5D=admin\"}}"
  }
}

In the (simplified) response above we can see that for each subrequest, we have one key in the response object. That key is the same as our requestId in the blueprint. Each one of the subresponses contains the information about the response headers and the response body. Note how the response body is an escaped JSON object.
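
As a minimal consumer-side sketch of handling such a response: the example below assumes the blueprint array from this article is available in a blueprint variable, uses the browser's fetch API in place of the httpRequest helper from the earlier pseudo-code, and assumes the response is returned as JSON (the exact envelope depends on the negotiated format).

// Illustrative only: send the blueprint and decode one escaped subresponse body.
fetch('https://cms.contentacms.io/api/subrequests?_format=json', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(blueprint),
})
  .then(res => res.json())
  .then(subresponses => {
    // Each key matches a blueprint requestId; each body is an escaped JSON string.
    const vocabularyDoc = JSON.parse(subresponses.vocabulary.body);
    const vocabularyUuid = vocabularyDoc.data[0].id;
    console.log('Tags vocabulary UUID:', vocabularyUuid);
  });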

This blueprint is not sufficient to create an article with two tags, but it's a great start. Let's build on top of that to create the tags and the article.

Dependent Requests

The next task we need to execute is the creation of the two tag entities:

  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.

To do this, we will need to expand the blueprint. However, we don't know the vocabulary UUID at the time we are writing the blueprint. What we do know is that the vocabulary UUID will be in the subresponse to the vocabulary subrequest. In particular, we can find the UUID in data[0].id.

We will use that information to create a blueprint that can create tags. Since we don't know the actual value of the vocabulary UUID, we will use a replacement token. At some point, during the blueprint processing by Drupal, the token will be resolved to the actual UUID value.

Replacement Tokens

We can use replacement tokens anywhere in the body or the URI of our subrequests. For those to be resolved, a token needs to be formatted in the following way:

{{<requestId>.<"body"|"headers">@<json-path-expression>}}

In particular, the replacement token for our vocabulary UUID will be:

{{vocabulary.body@$.data[0].id}}

What this replacement says is:

  1. Use the subresponse for the vocabulary subrequest.
  2. Take the body from that subresponse.
  3. Extract the string under data[0].id by executing the JSON Path expression $.data[0].id. You can execute any JSON Path expression as long as it returns a string. JSON Path is a very powerful way to extract data from an arbitrary JSON object; in our case, the body of the subresponse to the vocabulary subrequest.

This is what our blueprint looks like after adding the subrequests to create the tag entities. Note the presence of the replacement tokens:

[
  {
    "requestId": "vocabulary",
    "action": "view",
    "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "action": "create",
    "requestId": "tags-1",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  },
  {
    "action": "create",
    "requestId": "tags-2",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  }
]

Note that to use a replacement token in a subrequest, we need to add a dependency on the subresponse that contains the information. That's why we added the waitFor key in our tag subrequests.
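
For completeness, here is one way the consumer's buildBlueprint() helper from the earlier pseudo-code could assemble these subrequest objects. This is an illustrative sketch only: the formData field names (firstTag, secondTag) are assumptions, and the article subrequest from the next section would be appended in the same way.

// Illustrative only: builds the tag-creation blueprint shown above from hypothetical form data.
function buildBlueprint(formData) {
  const tagSubrequest = (requestId, name) => ({
    requestId,
    action: 'create',
    uri: '/api/tags',
    headers: { 'Content-Type': 'application/vnd.api+json' },
    waitFor: ['vocabulary'],
    // JSON.stringify produces the escaped body string that the blueprint expects.
    body: JSON.stringify({
      data: {
        type: 'tags',
        attributes: { name },
        relationships: {
          vocabulary: {
            data: { type: 'vocabularies', id: '{{vocabulary.body@$.data[0].id}}' },
          },
        },
      },
    }),
  });

  return [
    {
      requestId: 'vocabulary',
      action: 'view',
      uri: '/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags',
      headers: { Accept: 'application/vnd.api+json' },
    },
    {
      requestId: 'user',
      action: 'view',
      uri: '/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin',
      headers: { Accept: 'application/vnd.api+json' },
    },
    tagSubrequest('tags-1', formData.firstTag),  // assumed form field
    tagSubrequest('tags-2', formData.secondTag), // assumed form field
  ];
}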

Finishing the Blueprint

Using the same principles that we used for the tags we can add the subrequest for:

  • Create the article in the form using the user UUID and the newly created tags.

That will leave our completed blueprint as:

[
  {
    "requestId": "vocabulary",
    "action": "view",
    "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "action": "create",
    "requestId": "tags-1",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  },
  {
    "action": "create",
    "requestId": "tags-2",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  },
  {
    "action": "create",
    "requestId": "article",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "body": "{\"data\":{\"type\":\"articles\",\"attributes\":{\"body\":\"Custom value\",\"default_langcode\":\"1\",\"langcode\":\"en\",\"promote\":\"1\",\"status\":\"1\",\"sticky\":\"0\",\"title\":\"Article Created via Subrequests!\"},\"relationships\":{\"tags\":{\"data\":[{\"id\":\"{{tags-1.body@$.data.id}}\",\"type\":\"tags\"},{\"id\":\"{{tags-2.body@$.data.id}}\",\"type\":\"tags\"}]},\"type\":{\"data\":{\"id\":\"article\",\"type\":\"contentTypes\"}},\"owner\":{\"data\":{\"id\":\"{{user.body@$.data[0].id}}\",\"type\":\"users\"}}}}}",
    "uri": "/api/articles",
    "waitFor": ["user", "tags-1", "tags-2"]
  }
]

More Powerful Replacements

Imagine that instead of creating an article for a single user, we wanted to create an article for each one of the users on the site. We cannot write a simple blueprint, like the one above, since we don't know how many users there are in the Drupal site. Hence, we cannot write an article creation subrequest for each user.

To solve this problem we can tweak the user subrequest, so instead of returning a single user it returns all the users in the site:

[
  …
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  …
]

Then in our replacement tokens, we can write a JSON Path expression that will return a list of user UUIDs, instead of a single string. Subrequests will accept JSON Path expressions that return either strings or an array of strings for the replacement tokens.

In our article creation subrequest we will need to change {{user.body@$.data[0].id}} to {{user.body@$.data[*].id}}. The Subrequests module will create a duplicate of the article subrequest for each replacement item. In our case, this means one copy of the article creation subrequest for each user in the user subresponse.
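
Concretely, the only change is in the article subrequest's owner relationship (shown here unescaped for readability; in the blueprint it lives inside the escaped body string):

"owner": {
  "data": { "id": "{{user.body@$.data[*].id}}", "type": "users" }
}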

The Final Response

The modified blueprint that generates one article per user will return a separate subresponse for each article created, one per user found in the user subresponse.

We can see how a single subrequest can generate n subresponses, and we can use each one of those to generate n other subresponses, etc. This highlights how powerful this technique is. In addition, we have seen that we can combine different types of operations. In our example, we mixed GET and POST in a single blueprint (to get the vocabulary and create the new tags).

Conclusion

Subrequests is a great way to fetch or write many resources in a single HTTP request. This allows us to improve performance significantly while maintaining almost the same flexibility that custom code provides.

Further Your Understanding

If you want to know more about the blueprint format you can read the specification. The Subrequests module comes with a JSON schema that you can use to validate your blueprint. You can find the schema here.
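
As an illustrative sketch only, a consumer could validate a blueprint against that schema with a JSON Schema validator such as Ajv before sending it. The require() path below is hypothetical; point it at your local copy of the schema file shipped with the Subrequests module, and assume blueprint holds the array built earlier.

// Illustrative: validate a blueprint against the module's JSON schema before POSTing it.
const Ajv = require('ajv');
const schema = require('./subrequests.payload.schema.json'); // hypothetical local copy

const ajv = new Ajv();
const validate = ajv.compile(schema);

if (!validate(blueprint)) {
  console.error('Invalid blueprint:', validate.errors);
} else {
  // Safe to POST the blueprint to the /subrequests endpoint.
}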

The hero image was downloaded from Frankenphotos and used without modifications under a CC BY 3.0 license.

Categories: Drupal

Mediacurrent: Webinar Recap: Security by Design - An Introduction to Drupal Security

Planet Drupal - 12 October 2017 - 6:29am

With cybercrime on the rise, securing data in Drupal has become a hot topic for developers and project stakeholders alike.

In our latest webinar, we were joined by three Drupal security experts from Townsend Security, Lockr and Mediacurrent who shared their approach for building a secure groundwork to protect site data in Drupal.

Categories: Drupal

Hook Rebuild

New Drupal Modules - 12 October 2017 - 5:47am
A Drupal 7 polyfill for the core hook_rebuild() added in Drupal 8

hook_rebuild() is called after a flush of all Drupal caches, allowing implementing modules to rebuild any required data structures using fresh data that is known to be pulled directly from its data source.

Categories: Drupal

Bitbucket Issues

New Drupal Modules - 12 October 2017 - 3:11am

The Bitbucket Issues module provides a Bitbucket core API layer for managing git issues from your Drupal website.

Installation

Install as usual.

Place the entirety of this directory in the /modules folder of your Drupal installation. Navigate to Administer > Extend. Check the 'Enabled' box next to 'Bitbucket Issues' and then click the 'Save Configuration' button at the bottom.

Populate the module settings form with the Git base URL and a Git user access token.

Categories: Drupal

Config default image

New Drupal Modules - 12 October 2017 - 3:07am

An image field formatter allowing you to set a default image that is deployable through configuration management. It stores a file path in config instead of a content UUID.

Categories: Drupal

Views fields as row classes

New Drupal Modules - 12 October 2017 - 3:05am
Categories: Drupal

Singleton

New Drupal Modules - 11 October 2017 - 1:49pm
Description

This module utilizes the Singleton design pattern by loading the petrknap/php-singleton PHP library through Composer. This module is only an API to allow you to utilize singletons for classes you create.

How to use

In your PHP class

Categories: Drupal

The evolution of Acquia's product strategy

Dries Buytaert - 11 October 2017 - 1:26pm

Four months ago, I shared that Acquia was on the verge of a shift equivalent to the decision to launch Acquia Fields and Drupal Gardens in 2008. As we entered Acquia's second decade, we outlined a goal to move from content management to data-driven customer journeys. Today, Acquia announced two new products that support this mission: Acquia Journey and Acquia Digital Asset Manager (DAM).

Last year on my blog, I shared a video that demonstrated what is possible with cross-channel user experiences and Drupal. We showed a sample supermarket chain called Gourmet Market. Gourmet Market wants its customers to not only shop online using its website, but to also use Amazon Echo or push notifications to do business with them. The Gourmet Market prototype showed an omnichannel customer experience that is both online and offline, in store and at home, and across multiple digital touchpoints. The Gourmet Market demo video was real, but required manual development and lacked easy customization. Today, the launch of Acquia Journey and Acquia DAM makes building these kind of customer experiences a lot easier. It marks an important milestone in Acquia's history, as it will accelerate our transition from content management to data-driven customer journeys.

Introducing Acquia Journey

I've written a great deal about the Big Reverse of the Web, which describes the transition from "pull-based" delivery of the web, meaning we visit websites, to a "push-based" delivery, meaning the web comes to us. The Big Reverse forces a major re-architecture of the web to bring the right information, to the right person, at the right time, in the right context.

The Big Reverse also ushers in the shift from B2C to B2One, where organizations develop a one-to-one relationship with their customers, and contextual and personalized interactions are the norm. In the future, every organization will have to rethink how it interacts with customers.

Successfully delivering a B2One experience requires an understanding of your user's journey and matching the right information or service to the user's context. This alone is no easy feat, and many marketers and other digital experience builders often get frustrated with the challenge of rebuilding customer experiences. For example, although organizations can create brilliant campaigns and high-value content, it's difficult to effectively disseminate marketing efforts across multiple channels. When channels, data and marketing software act in different silos, it's nearly impossible to build a seamless customer experience. The inability to connect customer profiles and journey maps with various marketing tools can result in unsatisfied customers, failed conversion rates, and unrealized growth.

Acquia Journey delivers on this challenge by enabling marketers to build data-driven customer journeys. It allows marketers to easily map, assemble, orchestrate and manage customer experiences like the one we showed in our Gourmet Market prototype.

It's somewhat difficult to explain Acquia Journey in words — probably similar to trying to explain what a content management system does to someone who has never used one before. Acquia Journey provides a single interface to define and evaluate customer journeys across multiple interaction points. It combines a flowchart-style journey mapping tool with unified customer profiles and an automated decision engine. Rules-based triggers and logic select and deliver the best-next action for engaging customers.

One of the strengths of Acquia Journey is that it integrates many different technologies, from marketing and advertising technologies to CRM tools and commerce platforms. This makes it possible to quickly assemble powerful and complex customer journeys.

Acquia Journey will simplify how organizations deliver the "best next experience" for the customer. Providing users with the experience they not only want, but expect will increase conversion rates, grow brand awareness, and accelerate revenue. The ability for organizations to build more relevant user experiences not only aligns with our customers' needs but will enable them to make the biggest impact possible for their customers.

Acquia's evolving product offering also puts control of user data and experience back in the hands of the organization, instead of walled gardens. This is a step toward uniting the Open Web.

Introducing Acquia Digital Asset Manager (DAM)

Digital asset management systems have been around for a long time, and were originally hosted through on-premise servers. Today, most organizations have abandoned on-premise or do-it-yourself DAM solutions. After listening to our customers, it became clear that large organizations are seeking a digital asset management solution that centralizes control of creative assets for the entire company.

Many organizations lack a single-source of truth when it comes to managing digital assets. This challenge has been amplified as the number of assets has rapidly increased in a world with more devices, more channels, more campaigns, and more personalized and contextualized experiences. Acquia DAM provides a centralized repository for managing all rich media assets, including photos, videos, PDFs, and other corporate documents. Creative and marketing teams can upload and manage files in Acquia DAM, which can then be shared across the organization. Graphic designers, marketers and web managers all have a hand in translating creative concepts into experiences for their customers. With Acquia DAM, every team can rely on one dedicated application to gather requirements, share drafts, consolidate feedback and collect approvals for high-value marketing assets.

On top of Drupal's asset and media management capabilities, Acquia DAM provides various specialized functionality, such as automatic transcoding of assets upon download, image and video mark-up during approval workflows, and automated tagging for images using machine learning and image recognition.

By using a drag-and-drop interface on Acquia DAM, employees can easily publish approved assets in addition to searching the repository for what they need.

Acquia DAM seamlessly integrates with both Drupal 7 and Drupal 8 (using Drupal's "media entities"). In addition to Drupal, Acquia DAM is built to integrate with the entirety of the Acquia Platform. This includes Acquia Lift and Acquia Journey, which means that any asset managed in the Acquia DAM repository can be utilized to create personalized experiences across multiple Drupal sites. Additionally, through a REST API, Acquia DAM can also be integrated with other marketing technologies. For example, Acquia DAM supports designers with a plug in to Adobe Creative Cloud, which integrates with Photoshop, InDesign and Illustrator.

Acquia's roadmap to data-driven customer journeys

Throughout Acquia's first decade, we've been primarily focused on providing our customers with the tools and services necessary to scale and succeed with content management. We've been very successful with helping our customers scale and manage Drupal and cloud solutions. Drupal will remain a critical component to our customer's success, and we will continue to honor our history as committed supporters of open source, in addition to investing in Drupal's future.

However, many of our customers need more than content management to be digital winners. The ability to orchestrate customer experiences using content, user data, decisioning systems, analytics and more will be essential to an organization's success in the future. Acquia Journey and Acquia DAM will remove the complexity from how organizations build modern digital experiences and customer journeys. We believe that expanding our platform will be good not only for Acquia, but for our partners, the Drupal community, and our customers.

Categories: Drupal

Dries Buytaert: The evolution of Acquia's product strategy

Planet Drupal - 11 October 2017 - 1:26pm

Four months ago, I shared that Acquia was on the verge of a shift equivalent to the decision to launch Acquia Fields and Drupal Gardens in 2008. As we entered Acquia's second decade, we outlined a goal to move from content management to data-driven customer journeys. Today, Acquia announced two new products that support this mission: Acquia Journey and Acquia Digital Asset Manager (DAM).

Last year on my blog, I shared a video that demonstrated what is possible with cross-channel user experiences and Drupal. We showed a sample supermarket chain called Gourmet Market. Gourmet Market wants its customers to not only shop online using its website, but to also use Amazon Echo or push notifications to do business with them. The Gourmet Market prototype showed an omnichannel customer experience that is both online and offline, in store and at home, and across multiple digital touchpoints. The Gourmet Market demo video was real, but required manual development and lacked easy customization. Today, the launch of Acquia Journey and Acquia DAM makes building these kind of customer experiences a lot easier. It marks an important milestone in Acquia's history, as it will accelerate our transition from content management to data-driven customer journeys.

Introducing Acquia Journey

I've written a great deal about the Big Reverse of the Web, which describes the transition from "pull-based" delivery of the web, meaning we visit websites, to a "push-based" delivery, meaning the web comes to us. The Big Reverse forces a major re-architecture of the web to bring the right information, to the right person, at the right time, in the right context.

The Big Reverse also ushers in the shift from B2C to B2One, where organizations develop a one-to-one relationship with their customers, and contextual and personalized interactions are the norm. In the future, every organization will have to rethink how it interacts with customers.

Successfully delivering a B2One experience requires an understanding of your user's journey and matching the right information or service to the user's context. This alone is no easy feat, and many marketers and other digital experience builders often get frustrated with the challenge of rebuilding customer experiences. For example, although organizations can create brilliant campaigns and high-value content, it's difficult to effectively disseminate marketing efforts across multiple channels. When channels, data and marketing software act in different silos, it's nearly impossible to build a seamless customer experience. The inability to connect customer profiles and journey maps with various marketing tools can result in unsatisfied customers, failed conversion rates, and unrealized growth.

Acquia Journey delivers on this challenge by enabling marketers to build data-driven customer journeys. It allows marketers to easily map, assemble, orchestrate and manage customer experiences like the one we showed in our Gourmet Market prototype.

It's somewhat difficult to explain Acquia Journey in words — probably similar to trying to explain what a content management system does to someone who has never used one before. Acquia Journey provides a single interface to define and evaluate customer journeys across multiple interaction points. It combines a flowchart-style journey mapping tool with unified customer profiles and an automated decision engine. Rules-based triggers and logic select and deliver the best-next action for engaging customers.

One of the strengths of Acquia Journey is that it integrates many different technologies, from marketing and advertising technologies to CRM tools and commerce platforms. This makes it possible to quickly assemble powerful and complex customer journeys.

Acquia Journey will simplify how organizations deliver the "best next experience" for the customer. Providing users with the experience they not only want, but expect will increase conversion rates, grow brand awareness, and accelerate revenue. The ability for organizations to build more relevant user experiences not only aligns with our customers' needs but will enable them to make the biggest impact possible for their customers.

Acquia's evolving product offering also puts control of user data and experience back in the hands of the organization, instead of walled gardens. This is a step toward uniting the Open Web.

Introducing Acquia Digital Asset Manager (DAM)

Digital asset management systems have been around for a long time, and were originally hosted through on-premise servers. Today, most organizations have abandoned on-premise or do-it-yourself DAM solutions. After listening to our customers, it became clear that large organizations are seeking a digital asset management solution that centralizes control of creative assets for the entire company.

Many organizations lack a single-source of truth when it comes to managing digital assets. This challenge has been amplified as the number of assets has rapidly increased in a world with more devices, more channels, more campaigns, and more personalized and contextualized experiences. Acquia DAM provides a centralized repository for managing all rich media assets, including photos, videos, PDFs, and other corporate documents. Creative and marketing teams can upload and manage files in Acquia DAM, which can then be shared across the organization. Graphic designers, marketers and web managers all have a hand in translating creative concepts into experiences for their customers. With Acquia DAM, every team can rely on one dedicated application to gather requirements, share drafts, consolidate feedback and collect approvals for high-value marketing assets.

On top of Drupal's asset and media management capabilities, Acquia DAM provides various specialized functionality, such as automatic transcoding of assets upon download, image and video mark-up during approval workflows, and automated tagging for images using machine learning and image recognition.

By using a drag-and-drop interface on Acquia DAM, employees can easily publish approved assets in addition to searching the repository for what they need.

Acquia DAM seamlessly integrates with both Drupal 7 and Drupal 8 (using Drupal's "media entities"). In addition to Drupal, Acquia DAM is built to integrate with the entirety of the Acquia Platform. This includes Acquia Lift and Acquia Journey, which means that any asset managed in the Acquia DAM repository can be utilized to create personalized experiences across multiple Drupal sites. Additionally, through a REST API, Acquia DAM can also be integrated with other marketing technologies. For example, Acquia DAM supports designers with a plug in to Adobe Creative Cloud, which integrates with Photoshop, InDesign and Illustrator.

Acquia's roadmap to data-driven customer journeys

Throughout Acquia's first decade, we've been primarily focused on providing our customers with the tools and services necessary to scale and succeed with content management. We've been very successful with helping our customers scale and manage Drupal and cloud solutions. Drupal will remain a critical component to our customer's success, and we will continue to honor our history as committed supporters of open source, in addition to investing in Drupal's future.

However, many of our customers need more than content management to be digital winners. The ability to orchestrate customer experiences using content, user data, decisioning systems, analytics and more will be essential to an organization's success in the future. Acquia Journey and Acquia DAM will remove the complexity from how organizations build modern digital experiences and customer journeys. We believe that expanding our platform will be good not only for Acquia, but for our partners, the Drupal community, and our customers.

Categories: Drupal

mark.ie: Drupal Camp Dublin is Next Week - Last Chance for Tickets

Planet Drupal - 11 October 2017 - 11:44am
Drupal Camp Dublin is Next Week - Last Chance for Tickets

Seems like just yesterday that we held DrupalCon in Dublin; now we're back with our annual Drupal Camp Dublin.

markconroy Wed, 10/11/2017 - 19:44

This year's Drupal Camp Dublin has a great line up of speakers from Ireland and abroad, covering such topics as:

  • Building multi-lingual, multi-region websites (Stella Power)
  • Working as a developer with attention-deficit disorder (ADD) (Levi Govaerts)
  • Planning for disruptions (Jochen Lillich)
  • Migrating from Drupal 4 to 5 to 6 to 7 to 8 (Alan Burke)
  • Automating deployments (Luis Rodriguez)
  • Working with Webform, Commerce, Paragraphs, Display Suite and more (Chandeep Khosa)
  • Live debugging a site that's giving issues (Anthony Lindsay)
  • Deploy with Fabric, and test driven development (Oliver Davies)
  • Design in the Browser (yours truly, me, Mark Conroy)
  • Teaching web development at third level (Ruairi O'Reilly)
  • The QA process (Daniel Shaw)
  • Getting started with Docker (Ed Crompton)
  • The new theme coming to Drupal core (Mark Conroy)

And then there's some socials, and our Drupal Ireland AGM, and at least one other talk not announced yet, and ... you get the idea.

The full schedule is available on our website. There are some tickets left (only €20); get them before they are all gone.

Categories: Drupal

myDropWizard.com: Drupal 6 version of netFORUM Authentication not affected by SA-CONTRIB-2017-077

Planet Drupal - 11 October 2017 - 11:37am

Today, there was a Moderately Critical security advisory for an Access Bypass vulnerability in the netFORUM Authentication module for Drupal 7:

netFORUM Authentication - Moderately critical - Access Bypass - SA-CONTRIB-2017-077

The module was bypassing the protections on the Drupal 7 user login form that deter brute-force attempts to log in to the site, and so constituted an Access Bypass vulnerability: login was less secure when using this module.

However, Drupal 6 (including Pressflow 6) doesn't have these same protections for the user login form, and so using this module is no less secure than using vanilla Drupal 6. Of course, these protections could be added to this module, and while this would be great security hardening, this doesn't represent a vulnerability - only a weakness which is also present (and widely known) in Drupal 6 core.

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Categories: Drupal

Mediacurrent: Mediacurrent Wins Three Nominations at 2017 Acquia Engage Conference

Planet Drupal - 11 October 2017 - 10:21am

Mediacurrent has been selected as a finalist for the 2017 Acquia Engage Awards in the categories of Financial Services, Travel and Tourism, and Digital Experience. These awards recognize the amazing sites and digital experiences that leading digital agencies are building with the Acquia Platform.

Categories: Drupal

Drupal blog: Drupal looking to adopt React

Planet Drupal - 11 October 2017 - 10:05am

This blog has been re-posted with permission from Dries Buytaert's blog. Please leave your comments on the original post.

Last week at DrupalCon Vienna, I proposed adding a modern JavaScript framework to Drupal core. After the keynote, I met with core committers, framework managers, JavaScript subsystem maintainers, and JavaScript experts in the Drupal community to discuss next steps. In this blog post, I look back on how things have evolved, since the last time we explored adding a new JavaScript framework to Drupal core two years ago, and what we believe are the next steps after DrupalCon Vienna.

As a group, we agreed that we had learned a lot from watching the JavaScript community grow and change since our initial exploration. We agreed that today, React would be the most promising option given its expansive adoption by developers, its unopinionated and component-based nature, and its suitability for building new Drupal interfaces in an incremental way. Today, I'm formally proposing that the Drupal community adopt React, after discussion and experimentation has taken place.

Two years ago, it was premature to pick a JavaScript framework

Three years ago, I developed several convictions related to "headless Drupal" or "decoupled Drupal". I believed that:

  1. More and more organizations wanted a headless Drupal so they can use a modern JavaScript framework to build application-like experiences.
  2. Drupal's authoring and site building experience could be improved by using a more modern JavaScript framework.
  3. JavaScript and Node.js were going to take the world by storm and that we would be smart to increase the amount of JavaScript expertise in our community.

(For the purposes of this blog post, I use the term "framework" to include both full MV* frameworks such as Angular, and also view-only libraries such as React combined piecemeal with additional libraries for managing routing, states, etc.)

By September 2015, I had built up enough conviction to write several long blog posts about these views (post 1, post 2, post 3). I felt we could accomplish all three things by adding a JavaScript framework to Drupal core. After careful analysis, I recommended that we consider React, Ember and Angular. My first choice was Ember, because I had concerns about a patent clause in Facebook's open-source license (since removed) and because Angular 2 was not yet in a stable release.

At the time, the Drupal community didn't like the idea of picking a JavaScript framework. The overwhelming reactions were these: it's too early to tell which JavaScript framework is going to win, the risk of picking the wrong JavaScript framework is too big, picking a single framework would cause us to lose users that favor other frameworks, etc. In addition, there were a lot of different preferences for a wide variety of JavaScript frameworks. While I'd have preferred to make a bold move, the community's concerns were valid.

Focusing on Drupal's web services instead

By May of 2016, after listening to the community, I changed my approach; instead of adding a specific JavaScript framework to Drupal, I decided we should double down on improving Drupal's web service APIs. Instead of being opinionated about what JavaScript framework to use, we would allow people to use their JavaScript framework of choice.

I did a deep dive on the state of Drupal's web services in early 2016 and helped define various next steps (post 1, post 2, post 3). I asked a few of the OCTO team members to focus on improving Drupal 8's web services APIs; funded improvements to Drupal core's REST API, as well as JSON API, GraphQL and OpenAPI; supported the creation of Waterwheel projects to help bootstrap an ecosystem of JavaScript front-end integrations; and most recently supported the development of Reservoir, a Drupal distribution for headless Drupal. There is also a lot of innovation coming from the community with lots of work on the Contenta distribution, JSON API, GraphQL, and more.

The end result? Drupal's web service APIs have progressed significantly the past year. Ed Faulkner of Ember told us: "I'm impressed by how fast Drupal made lots of progress with its REST API and the JSON API contrib module!". It's a good sign when a core maintainer of one of the leading JavaScript frameworks acknowledges Drupal's progress.

The current state of JavaScript in Drupal

Looking back, I'm glad we decided to focus first on improving Drupal's web services APIs; we discovered that there was a lot of work left to stabilize them. Cleanly integrating a JavaScript framework with Drupal would have been challenging 18 months ago. While there is still more work to be done, Drupal 8's available web service APIs have matured significantly.

Furthermore, by not committing to a specific framework, we are seeing Drupal developers explore a range of JavaScript frameworks and members of multiple JavaScript framework communities consuming Drupal's web services. I've seen Drupal 8 used as a content repository behind Angular, Ember, React, Vue, and other JavaScript frameworks. Very cool!

There is a lot to like about how Drupal's web service APIs matured and how we've seen Drupal integrated with a variety of different frameworks. But there is also no denying that not having a JavaScript framework in core came with certain tradeoffs:

  1. It created a barrier for significantly leveling up the Drupal community's JavaScript skills. In my opinion, we still lack sufficient JavaScript expertise among Drupal core contributors. While we do have JavaScript experts working hard to maintain and improve our existing JavaScript code, I would love to see more experts join that team.
  2. It made it harder to accelerate certain improvements to Drupal's authoring and site building experience.
  3. It made it harder to demonstrate how new best practices and certain JavaScript approaches could be leveraged and extended by core and contributed modules to create new Drupal features.

One trend we are now seeing is that traditional MV* frameworks are giving way to component libraries; most people seem to want a way to compose interfaces and interactions with reusable components (e.g. libraries like React, Vue, Polymer, and Glimmer) rather than use a framework with a heavy focus on MV* workflows (e.g. frameworks like Angular and Ember). This means that my original recommendation of Ember needs to be revisited.

Several years later, we still don't know what JavaScript framework will win, if any, and I'm willing to bet that waiting two more years won't give us any more clarity. JavaScript frameworks will continue to evolve and take new shapes. Picking a single one will always be difficult and to some degree "premature". That said, I see React having the most momentum today.

My recommendations at DrupalCon Vienna

Given that it's been almost two years since I last suggested adding a JavaScript framework to core, I decided to bring the topic back in my DrupalCon Vienna keynote presentation. Prior to my keynote, there had been some renewed excitement and momentum behind the idea. Two years later, here is what I recommended we should do next:

  • Invest more in Drupal's API-first initiative. In 2017, there is no denying that decoupled architectures and headless Drupal will be a big part of our future. We need to keep investing in Drupal's web service APIs. At a minimum, we should expand Drupal's web service APIs and standardize on JSON API. Separately, we need to examine how to give API consumers more access to and control over Drupal's capabilities.
  • Embrace all JavaScript frameworks for building Drupal-powered applications. We should give developers the flexibility to use their JavaScript framework of choice when building front-end applications on top of Drupal — so they can use the right tool for the job. The fact that you can front Drupal with Ember, Angular, Vue, React, and others is a great feature. We should also invest in expanding the Waterwheel ecosystem so we have SDKs and references for all these frameworks.
  • Pick a framework for Drupal's own administrative user interfaces. Drupal should pick a JavaScript framework for its own administrative interface. I'm not suggesting we abandon our stable base of PHP code; I'm just suggesting that we leverage JavaScript for the things that JavaScript is great at by moving relevant parts of our code from PHP to JavaScript. Specifically, Drupal's authoring and site building experience could benefit from user experience improvements. A JavaScript framework could make our content modeling, content listing, and configuration tools faster and more application-like by using instantaneous feedback rather than submitting form after form. Furthermore, using a decoupled administrative interface would allow us to dogfood our own web service APIs.
  • Let's start small by redesigning and rebuilding one or two features. Instead of rewriting the entirety of Drupal's administrative user interfaces, let's pick one or two features, and rewrite their UIs using a preselected JavaScript framework. This allows us to learn more about the pros and cons, allows us to dogfood some of our own APIs, and if we ultimately need to switch to another JavaScript framework or approach, it won't be very painful to rewrite or roll the changes back.
Selecting a JavaScript framework for Drupal's administrative UIs

In my keynote, I proposed a new strategic initiative to test and research how Drupal's administrative UX could be improved by using a JavaScript framework. The feedback was very positive.

As a first step, we have to choose which JavaScript framework will be used as part of the research. Following the keynote, we had several meetings at DrupalCon Vienna to discuss the proposed initiative with core committers, all of the JavaScript subsystem maintainers, as well as developers with real-world experience building decoupled applications using Drupal's APIs.

There was unanimous agreement that:

  1. Adding a JavaScript framework to Drupal core is a good idea.
  2. We want to have sufficient real-use experience to make a final decision prior to 8.6.0's development period (Q1 2018). To start, the Watchdog page would be the least intrusive interface to rebuild and would give us important insights before kicking off work on more complex interfaces.
  3. While a few people named alternative options, React was our preferred option, by far, due to its high degree of adoption, component-based and unopinionated nature, and its potential to make Drupal developers' skills more future-proof.
  4. This adoption should be carried out in a limited and incremental way so that the decision is easily reversible if better approaches come later on.

We created an issue on the Drupal core queue to discuss this more.

Conclusion

Drupal should support a variety of JavaScript libraries on the user-facing front end while relying on a single shared framework as a standard across Drupal administrative interfaces.

In short, I continue to believe that adopting more JavaScript is important for the future of Drupal. My original recommendation to include a modern JavaScript framework (or JavaScript libraries) for Drupal's administrative user interfaces still stands. I believe we should allow developers to use their JavaScript framework of choice to build front-end applications on top of Drupal and that we can start small with one or two administrative user interfaces.

After meeting with core maintainers, JavaScript subsystem maintainers, and framework managers at DrupalCon Vienna, I believe that React is the right direction to move for Drupal's administrative interfaces, but we encourage everyone in the community to discuss our recommendation. Doing so would allow us to make Drupal easier to use for site builders and content creators in an incremental and reversible way, keep Drupal developers' skills relevant in an increasingly JavaScript-driven world, and move us ahead with modern tools for building user interfaces.

Special thanks to Preston So for contributions to this blog post and to Matt Grill, Wim Leers, Jason Enter, Gábor Hojtsy, and Alex Bronstein for their feedback during the writing process.

Categories: Drupal

Commerce claim gift aid

New Drupal Modules - 11 October 2017 - 6:09am
Commerce claim gift aid

This simple module adds the ability for Commerce 2.0 shops to claim gift aid.

To determine whether or not an order item is eligible for gift aid, simply update an order item type by visiting the following URL:

admin/commerce/config/order-item-types

Check the checkbox and the rest is done for you.

If a user adds an order item that is eligible for gift aid, they will be given the choice of whether you can claim gift aid.

You can edit the text about gift aid in the commerce configuration screens.

Categories: Drupal

Periodically Mail

New Drupal Modules - 11 October 2017 - 4:41am

This module allows you to send emails at set intervals to users with a given role.

Example: You want to send emails every Monday.

To setup module visit: "/admin/config/services/periodically_mail" path.

Important: The mail is sent when cron runs.

Notice: For HTML emailing you can use extra modules like Mime Mail.

Categories: Drupal

Living style guide

New Drupal Modules - 11 October 2017 - 2:09am
Categories: Drupal
