Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

OpenSense Labs: Integrating Drupal and Salesforce? Surely a win-win for you.

1 February 2019 - 3:54am
By Vasundhra

The number eight in the Bible signifies resurrection and regeneration, a digit that implies "new beginnings." 

Much the same can be said of Drupal 8, which announced a new beginning for the content management system and for its ability to connect with a SaaS CRM like Salesforce. 

Salesforce is the heart of many businesses: it lets them handle their sales data in one place and gives the highest priority to customer growth. And now that the integration is tighter than ever before, Drupal 8 can take full advantage of it. 


So instead of wasting any more of your time and beating around the bush, let's explore the paths to integration and the key considerations involved. 

Benefits of Integrating Drupal and Salesforce

Managing Territories

When you have a team of salespeople, small or big, how do you keep track of which territories or accounts each of them is going after? 

With Salesforce, which can monitor and track almost anything you can imagine. Instead of managing old-school spreadsheets, a CRM like Salesforce can track and monitor all of these tasks for you, saving the time and resources that go into managing small and large teams alike. With this CRM you can:

  • Manage lead processing and territories better.
  • Assign leads to users according to data that makes business sense.
  • Send instant email notifications so reps can follow up with customers and prospects immediately.
  • Attain better overall efficiency. 

Tracking competitors and managing opportunities 

In this competitive world, it is important to track and manage your competitors, and Salesforce CRM helps you do exactly that. 

It diligently ensures that every opportunity is followed up on and not forgotten, thanks to its various built-in tools, and it lets you respond faster to any client that enquires about your services or products, which shows your customers that you care about their business.

Forecasting

A good CRM system gives you the ability not only to track exactly what is happening in your business but also to accurately forecast its growth or decline. For forecasting, Salesforce can also:

  • Calculate forecasts that include all the information from the sales team.
  • Differentiate between booked and recurring revenue.
  • Customize forecasts based on the parameters that make sense to the business.

Managing Orders

The Salesforce CRM allows you to truly manage end-to-end customer relationships. You can see everything from the first time you engage with a client to when they place an order and beyond. 

The best part about Salesforce CRM in terms of managing orders is that it can turn an estimate into an order with a single click of a button and produce customized or automated reports based on what you need to see.

Architectural approaches

There are different architectural approaches for thinking about data flow; each provides for different requirements and satisfies different needs:

  • Real-Time Push — Description: sends data immediately on entity create, update and delete. Strengths: fast, limited update lag, avoids UX impact, can avoid race conditions. Weakness: less durable and reliable.
  • Cron-Based Sync — Description: identifies records requiring sync on cron. Strengths: handles large volumes well, can be stopped and started as needed. Weakness: slow, lags, and risks update conflicts.
  • Work Queue — Description: a single point of integration receives data and acts on it. Strengths: reliable, performant, shorter time lag. Weakness: large changes create backlogs, and there is a risk of update conflicts.

Real-Time Push

With real-time integration, Drupal objects are exported to Salesforce immediately. You get feedback indicating whether the item failed to export and whether the data is available in Salesforce. This is a great option if you need the data in Salesforce as close to real time as possible.
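
As a rough sketch of this pattern (not the Salesforce Suite's actual implementation), a custom module could react to entity hooks and push straight away; the client service and its push() method below are hypothetical placeholders:

<?php

use Drupal\node\NodeInterface;

/**
 * Implements hook_ENTITY_TYPE_insert() for node entities.
 *
 * Pushes a node to a (hypothetical) Salesforce client service as soon as it
 * is created, so the record appears in Salesforce with minimal lag.
 */
function mymodule_node_insert(NodeInterface $node) {
  try {
    // Service name and push() method are placeholders for whatever API
    // client your project exposes.
    \Drupal::service('mymodule.salesforce_client')->push($node);
  }
  catch (\Exception $e) {
    // A failed push is logged rather than blocking the node save: this is
    // the "less durable and reliable" trade-off noted above.
    \Drupal::logger('mymodule')->error('Salesforce push failed: @message', [
      '@message' => $e->getMessage(),
    ]);
  }
}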

Cron Based Sync

Earlier, in Drupal 7, the asynchronous push left hiccups around error handling (which meant debugging and troubleshooting), optimization, API calls and so on.

In Drupal 8, the Salesforce cron-based push service constructs database queues, normalizes queue items, optimizes queue operations and implements error handling. 

Cron-based sync uses Drupal core's cron API to schedule synchronization between Salesforce and Drupal. 

Work Queues

With a queue-based batching system running in the background, many objects can be sent to Salesforce without all of them being pushed at the same time. In this architecture, instead of an object being sent to Salesforce as soon as it is created, edited or deleted, it goes into a queue where it waits alongside other items to be exported. 

The queued items are then picked up on a configurable schedule and exported to Salesforce in batches. Batching the data keeps synchronization orderly and increases performance by using fewer API calls.
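
A minimal sketch of the queue side of this pattern using Drupal 8's core Queue API; the worker below is hypothetical and only stands in for whatever export logic your integration provides:

<?php

namespace Drupal\mymodule\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Processes queued entity changes and exports them to Salesforce on cron.
 *
 * @QueueWorker(
 *   id = "mymodule_salesforce_push",
 *   title = @Translation("Salesforce push queue"),
 *   cron = {"time" = 60}
 * )
 */
class SalesforcePushWorker extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($data) {
    // $data would typically hold the entity type, ID and operation that were
    // queued when the entity was created, edited or deleted. The client
    // service is a placeholder, not a real module API.
    \Drupal::service('mymodule.salesforce_client')
      ->export($data['entity_type'], $data['entity_id'], $data['op']);
  }

}

Items would be queued from the entity hooks with something like \Drupal::queue('mymodule_salesforce_push')->createItem($data), and cron then drains the queue in batches.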

Approaches suitable for integration

There are many ways to move your data from your website to another application, some easier than others and some suitable for almost every project. Here are some approaches that are suitable for integrating Drupal and Salesforce.

Simple web forms

Salesforce lets you create simple HTML web forms (Web-to-Lead or Web-to-Case) that generate lead or case records in Salesforce when they are submitted. 

Any Salesforce administrator can create these forms and then paste them into Drupal for users to complete. 


While this method does not address every circumstance, there are specific situations where it is a good solution: 

  • Only basic user data or inquiry information needs to get into Salesforce.
  • There is little or no web development expertise available. 
  • Something quick and easy is needed. 

Third party form service 

There are ample form services, such as Formstack, Click & Pledge and Wufoo, that can pass data to Salesforce. You can either embed the form in Drupal or let the user click through to the platform.


This method is suitable when the following conditions apply:

  • There is a need to pass both user and transaction data into Salesforce. 
  • There is no need to move the information in both directions. 
  • You may want users to log in to submit a form, or to return to the form and provide more information later. 
  • You want a sophisticated solution that doesn't really need to be customized.

Salesforce Suite

The Salesforce Suite is a collection of Drupal modules that allows synchronization of data between Drupal and Salesforce in one or both directions. The suite also provides a mapping tool that can be used to define the integration field by field and object by object.


Salesforce forms

The simplest way to hook Drupal (or any other website) up with Salesforce is simply to link over to a form created by Salesforce. Any data the user enters goes directly into Salesforce, and Drupal is not involved at all.

This method is good for lead generation or a simple application form. One of the biggest advantages of Salesforce forms is that they are cheap and easy to use, and there is zero setup on the Drupal side besides providing a link to the form.

Salesforce mapping 

There might be instances where you have content that lives in both Drupal and Salesforce and needs to stay in sync. Salesforce mapping does that task for you. 

Salesforce mapping keeps a version of the data at both ends; whatever happens to one version happens to the other too. 

Rules can also be made to add, delete, push or pull data.  

 

  • Simple Web Forms — Cost: free. Direction: one direction, inbound to Salesforce. Complexity: DIY.
  • Third Party Form Service — Cost: low. Direction: one direction. Complexity: DIY or developer assistance.
  • Salesforce Suite — Cost: moderate to high. Direction: bi-directional. Complexity: developer assistance.
  • Salesforce Mapping — Cost: high. Direction: double-entering the same content in two places. Complexity: developer assistance.
  • Salesforce Forms — Cost: low. Direction: natural. Complexity: DIY.

Integrating with one direction versus two directions:

  • Useful when — One direction: you have to pass user data, transaction data, and specific node types, or you want to keep the integration simple. Two directions: data is entered directly into Salesforce and a more modern setup is wanted.
  • Advantage — One direction: limits complexity, and therefore liability and errors. Two directions: fewer duplicate records are created in Salesforce.
  • User experience — One direction: no updates required to impact UX. Two directions: users need sophisticated interaction, such as the ability to view offline the data they have entered.
  • Use cases — Donation forms and event registration, in both cases.

Drupal modules are here to ease the integration with Salesforce 

The Drupal Salesforce Suite module is a testament to both the ingenuity and passion of the Drupal community and the flexibility of Drupal as an enterprise platform. As a contributed module, the Salesforce Suite for Drupal enables out-of-the-box connection with Salesforce, no matter what your configuration is. It supports integration by synchronizing Drupal entities (e.g. users, nodes) with Salesforce objects (organizations, contacts).

The Drupal community, as a matter of fact, has contributed a lot to this area. It came together to sponsor the development of a suite of Salesforce integration modules that can deal with a variety of business needs. To rewrite the module, the community gathered the time and the resources, taking full advantage of the advances made in both the Drupal and Salesforce platforms. It has now been rearranged into a modular architecture that exposes core functionality via an API, enabling other systems, e.g. Springboard, Jackson River's fundraising platform, to build on it. 

Most importantly, the Drupal Salesforce Suite module uses OAuth 2.0 for access control. 

For non-technical users, the Drupal entity and Salesforce object mapping system provides the power to configure data maps between any objects in the two systems. Not only that, but synchronization between any Drupal entity and Salesforce object (e.g. Drupal users, donation receipts) has been made easy. The suite also presents a lightweight wrapper around the SOAP API, which has more capabilities for some use cases, using the same OAuth authorization. 

Examples of the Use case of Drupal and Salesforce Integration 

SpringBoard 

Springboard, Jackson River's innovative solution for online fundraising and marketing, is a packaged distribution of Drupal for non-profit organizations. It needed to accept online donations and wanted to use Drupal to power other user touch points such as petitions, email registration and more. Springboard presents a robust integration queue for bi-directional sync of data between Drupal and the Salesforce.com CRM.

RedHen CRM 

RedHen CRM has been designed for the needs of membership organizations and associations. The RedHen framework is extensible and flexible and can be leveraged to produce a broad range of CRM solutions. For instance, RedHen could be used as a lightweight sales pipeline management tool for small businesses. RedHen CRM can also be leveraged as an integration point between Drupal and much larger, enterprise CRM solutions such as Salesforce. 

Case study: Cornell University 

The university offers hundreds of opportunities to its students, including those living abroad. But to take advantage of these opportunities, students had to navigate a maze of departments and websites. To solve this, the Cornell University Experience Initiative (CUEI) came up with a plan to bring a "Netflix"-like experience to students: a customizable user guide making it easy for students to find the opportunities. 

Pantheon was chosen as the hosting platform. The team wanted to maintain their content with Drupal but also wanted to manage student applications and data with Salesforce CRM. They chose Message Agency as their partner to help conceptualize how Drupal and Salesforce would work together. Message Agency is also an architect of the Salesforce Suite, the set of Drupal modules that allows integration of these two powerful solutions.


Interested students come to the site to find things and explore. Drupal does a really good job at that task, but when it comes to actions and customization, Salesforce wins. This created a whole new paradigm of student communication and interaction. 

Centralizing information also gave Cornell opportunities where each department had its own page or site with its own content strategy. But before the website went live, the CUEI team tested the user experience with its most trusted stakeholders: Cornell students.

The feedback they received was overwhelming, with positive reviews about how great and well organized the website was. Pantheon also evaluated the site's performance under traffic load, given its complexity and image-heavy design. 

The Future 

The wide range of what Salesforce and Drupal make possible gives a vivid idea of how sales can be increased across marketing organizations. If you take one thing away from all of the above, it should be this: there is definitely an integration out there that will work for your organization's needs and budget, but it might not be as efficient as integrating Salesforce and Drupal.

If you get a Drupal-Salesforce integration deployed for your organization, there is no doubt that you will enjoy streamlined and optimized business processes in the short and the long term, boosting sales and making the entire process much more comfortable and effective. Keep in mind, though, that the flexibility and customizability of Salesforce can prove troublesome when it comes to the consistency of your back-end.

Conclusion 

Drupal installations are all unique because of the different modules and customizations they use, so the integration has to be set up by an expert on a case-by-case basis.

If you already have a Salesforce instance set up, we'll be happy to explore the appropriate integration options. If you're new to Salesforce, we can work with your Salesforce developers to make sure your data is structured in a way that minimizes the integration effort and costs. 

Ping us at hello@opensenselabs.com for any services related to Drupal and Salesforce integration.

Categories: Drupal

Digital Echidna: Thoughts on all things digital: Digital Echidna Recognized as a Top Development Firm in Canada

1 February 2019 - 1:02am
Since the beginning, our goal has been to balance technical expertise with creative flair when building and designing websites, applications, and digital platforms to deliver real solutions for clients. We continue to stand as leaders in our…
Categories: Drupal

Tandem's Drupal Blog: Lando + Contenta CMS + Nuxt Pt. 2

31 January 2019 - 4:00pm
February 01, 2019 Configure Contenta CMS and Nuxt to communicate Why? In Lando + Contenta CMS + Nuxt Pt. 1 we configured the infrastructure for local development of a headless Drupal app with a Nuxt frontend. In this article we will wire up the communication between the myapi Contenta CMS backend and the mynuxt Nuxt frontend apps. Configure Cont...
Categories: Drupal

Capgemini Engineering: How to update data models in Drupal 8

31 January 2019 - 4:00pm

In this article we will see how to update data models in Drupal 8, how to make the difference between model updating and content updating, how to create default content, and finally, the procedure to adopt for successful deployments to avoid surprises in a continuous integration/delivery Drupal cycle.

Before we start, I would encourage you to read the documentation of the hook hook_update_N() and to take into account all the possible impacts before writing an update.

Updating the database (executing hook updates and/or importing the configuration) is a very problematic task during a Drupal 8 deployment process, because the order in which structure and data updates are applied is not well defined in Drupal and can pose several problems if not completely controlled.

It is important to differentiate between a contributed module to be published on drupal.org aimed at a wide audience, and a custom Drupal project (a set of Drupal contrib/custom modules) designed to provide a bespoke solution in response to a client’s needs. In a contributed module it is rare to have a real need to create instances of configuration/content entities, on the other hand deploying a custom Drupal project makes updating data models more complicated. In the following sections we will list all possible types of updates in Drupal 8.

The Field module allows us to add fields to bundles. We must distinguish between the data structure that will be stored in the field (the static schema() method) and all the settings of the field and its storage that will be stored as configuration. All the dependencies related to the configuration of the field are stored in the field_config configuration entity and all the dependencies related to the storage of the field are stored in the field_storage_config configuration entity. Base fields are stored by default in the entity's base table.  

Configurable fields are the fields that can be added via the UI and attached to a bundle; they can be exported and deployed. Base fields are not managed by the field_storage_config and field_config configuration entities.
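
For context, a base field is declared in code on the entity class rather than through the UI; a minimal sketch (entity type annotation omitted for brevity, names hypothetical):

<?php

namespace Drupal\mymodule\Entity;

use Drupal\Core\Entity\ContentEntityBase;
use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

class ExampleItem extends ContentEntityBase {

  /**
   * {@inheritdoc}
   */
  public static function baseFieldDefinitions(EntityTypeInterface $entity_type) {
    $fields = parent::baseFieldDefinitions($entity_type);

    // A base field: its storage lives in the entity's base table and it is
    // not represented by field_storage_config / field_config entities.
    $fields['subtitle'] = BaseFieldDefinition::create('string')
      ->setLabel(t('Subtitle'))
      ->setSetting('max_length', 255);

    return $fields;
  }

}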

To update the entity definition or the definitions of its components (field definitions, for example, if the entity is fieldable) we can implement hook_update_N(). In this hook, don't use APIs that require a full Drupal bootstrap (e.g. database CRUD actions, services, …); to do this type of update safely we can use the methods proposed by the EntityDefinitionUpdateManagerInterface contract (e.g. updating the entity keys, updating a base field definition common to all bundles, …).
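
A hedged sketch of such an update, adding a new base field to an existing (hypothetical) example_item entity type via the entity definition update manager:

<?php

use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Adds a 'subtitle' base field to the example_item entity type.
 */
function mymodule_update_8101() {
  $definition = BaseFieldDefinition::create('string')
    ->setLabel(t('Subtitle'))
    ->setSetting('max_length', 255);

  // The entity definition update manager is safe to use here because it does
  // not rely on a fully bootstrapped entity system.
  \Drupal::entityDefinitionUpdateManager()
    ->installFieldStorageDefinition('subtitle', 'example_item', 'mymodule', $definition);
}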

To update existing entities, or existing field data in the case of a fieldable entity, following a modification of a definition, we can implement hook_post_update_NAME(). In this hook you can use all the APIs you need to update your entities.
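
And a sketch of the matching post-update hook that touches existing data once the definition is in place (entity type and field names are hypothetical; a real implementation would batch via $sandbox for large data sets):

<?php

/**
 * Populates the new 'subtitle' field on existing example_item entities.
 */
function mymodule_post_update_populate_subtitle(&$sandbox) {
  // Unlike hook_update_N(), the full API (entity storage, services, …) is
  // available here.
  $storage = \Drupal::entityTypeManager()->getStorage('example_item');
  foreach ($storage->loadMultiple() as $entity) {
    if ($entity->get('subtitle')->isEmpty()) {
      $entity->set('subtitle', $entity->label());
      $entity->save();
    }
  }
}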

To update the schema of a simple, complex configuration (a configuration entity) or a schema defined in a hook_schema() hook, we can implement hook_update_N().

In a custom Drupal project we are often led to create custom content types or bundles of custom entities (something we do not normally do in a contributed module, and rarely do in an installation profile). A site building action allows us to create this type of element, which will be exported afterwards in YAML files and then deployed to production using the Drupal configuration manager.

A bundle definition is a configuration entity that defines the global schema; we can implement hook_update_N() to update the model in this case, as I mentioned earlier. Bundles are instances that persist as Drupal configuration and follow the same schema. To update the bundles, the updated configuration must be exported using the configuration manager so it can be imported into production later. Several problems can arise:

  • If we add a field to a bundle, and want to create content during the deployment for this field, using the current workflow (drush updatedb -> drush config-import) this action is not trivial, and the hook hook_post_update_NAME() can’t be used since it’s executed before the configuration import.
  • The same problem can arise if we want to update fields of bundles that have existing data, the hook hook_post_update_NAME() which is designed to update the existing contents or entities will run before the configuration is imported. What is the solution for this problem? (We will look at a solution for this problem later in this article.)
Now the question is: How to import default content in a custom Drupal project?

Importing default content for a site is an action which is not well documented in Drupal. In an installation profile this import is often done in the hook_install() hook, because the content usually does not have a complex structure with levels of nested references; in some cases we can use the Default Content module. In a module, however, we can't create content in a hook_install() hook, simply because when installing a module the integrity of the configuration has not yet been imported.

In a recent project I used the drush php-script command to execute import scripts after the (drush updatedb -> drush config-import) sequence, but this command is not always available during the deployment process. The first idea that comes to mind is to subscribe to the event that is triggered after the import of the configuration, so we can create the content that will be available for site editors, but using an event is not a nice developer experience; hence the introduction of a new hook, hook_post_config_import_NAME(), that runs once after the database updates and configuration import. Another hook, hook_pre_config_import_NAME(), has also been introduced to fix performance issues.
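
If those hooks are available in your code base (they come from the proposal mentioned above, so check your Drupal version and deployment tooling, and note that the exact signature may differ), an implementation would simply follow the usual hook naming convention; a hedged sketch:

<?php

use Drupal\taxonomy\Entity\Term;

/**
 * Creates a default term once the configuration has been imported.
 *
 * The vocabulary is only guaranteed to exist after config import, which is
 * why this cannot safely live in hook_post_update_NAME().
 */
function mymodule_post_config_import_create_default_terms() {
  Term::create([
    'vid' => 'topics',
    'name' => 'General',
  ])->save();
}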

A workflow that works for me

To achieve a successful Drupal deployment in continuous integration/delivery cycles using Drush, the most generic workflow that I've found at the moment, while waiting for a deployment API in core, is as follows:

  1. drush updatedb
    • hook_update_N() : To update the definition of an entity and its components
    • hook_post_update_NAME() : To update entities when you have made an entity definition modification (entity keys, base fields, …)
  2. hook_pre_config_import_NAME() : CRUD operations (e.g. creating terms that will be taken as default values when importing configuration in the next step)
  3. drush config-import : Importing the configuration (e.g. new bundle field, creation of a new bundle, image styles, image crops, …)
  4. hook_post_config_import_NAME(): CRUD operations (e.g. creating contents, updating existing contents, …)

This approach works well for us, and I hope it will be useful for you. If you’ve got any suggestions for improvements, please let me know via the comments.

How to update data models in Drupal 8 was originally published by Capgemini at Capgemini Engineering on February 01, 2019.

Categories: Drupal

Aten Design Group: Getting Started with Drupal 8 Migrations

31 January 2019 - 3:02pm

We do a lot of Drupal 8 migrations here at Aten. From older versions of Drupal and WordPress, to custom SQL Server databases, to XML and JSON export files: it feels like we've imported content from just about every data source imaginable. Fortunately for us, the migration system in Drupal 8 is extremely powerful. It's also complicated. Here's a quick-start guide for getting started with your next migration to Drupal 8.

First, a caveat: we rarely perform simple one-to-one upgrades of existing websites. If that’s all you need, skip this article and check out this handbook on Drupal.org instead: Upgrading from Drupal 6 or 7 to Drupal 8.

It’s Worth the Steep Learning Curve

Depending on what you’re trying to do, using the migrate system might seem more difficult than necessary. You might be considering feeds, or writing something custom. My advice is virtually always the same: learn the migrate system and use it anyway. Whether you’re importing hundreds of thousands of nodes and dozens of content types or just pulling in a collection of blog posts, migrate provides powerful features that will save you a bunch of time in the long run. Often in the short run, for that matter.

Use the Drupal.org Migrate API Handbooks

There’s a ton of great information on Drupal.org in the Migrate API Handbooks. Be prepared to reference them often – especially the source, process, and destination plugin handbooks.

Basic Steps

Here’s a much simplified overview of the high-level steps you’ll use to set up your custom Drupal 8 migration:

All Migrations

  • Enable the migrate module (duh).
  • Install Migrate Tools to enable Drush migration commands.
  • Install Migrate Extras as well. It provides a bunch of, well, extras. I’d just assume you need it.
  • Create a custom module for your migration.
  • Use YAML configuration files to map fields from the appropriate source, specifying process plugins for necessary transformations, to the destination. The configuration files should exist in “my_migration_module/config/install/“.
    (Pro tip: you’ll probably do a lot of uninstalling and reinstalling your module to update the configuration as you build out your migrations. Use “enforced dependencies” so your YAML configurations are automatically removed from the system when your module is uninstalled, allowing them to be recreated – without conflicts – when you re-enable the module.)

Enforced dependencies in your YAML file will look something like this:

dependencies:
  enforced:
    module:
      - my_migration_module

See this issue on Drupal.org for more details on enforced dependencies, or refer to the Configuration Management Handbooks.

Drupal-to-Drupal Migrations

  • If you’re running a Drupal-to-Drupal migration, run the “migrate-upgrade” Drush command with the “--configure-only” flag to generate stub YAML configurations. Refer to this handbook for details: Upgrade Using Drush.
  • Copy the generated YAML files for each desired migration into your custom module’s config/install directory, renaming them appropriately and editing as necessary. As stated above, add enforced dependencies to your YAML files to make sure they are removed if your module is uninstalled.
Process Plugins

Process plugins are responsible for transforming source data into the appropriate format for destination fields. From correctly parsing images from text blobs, to importing content behind HTTP authentication, to merging sources into a single value, to all kinds of other transformations: process plugins are incredibly powerful. Further, you can chain process plugins together, making endless possibilities for manipulating data during migration. Process plugins are one of the most important elements of Drupal 8 migrations.
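
If none of the shipped plugins cover a transformation, writing your own is straightforward; a minimal hypothetical plugin that trims and lowercases a source value:

<?php

namespace Drupal\my_migration_module\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Trims whitespace and lowercases a source value.
 *
 * @MigrateProcessPlugin(
 *   id = "trim_lowercase"
 * )
 */
class TrimLowercase extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    // Cast to string so NULL or numeric sources don't blow up.
    return mb_strtolower(trim((string) $value));
  }

}

It can then be referenced in a migration's process pipeline as plugin: trim_lowercase, exactly like the core plugins shown below.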

Here are a few process plugin resources:

Continuously Migrate Directly from a Pantheon-Hosted Database

Most of our projects are hosted on Pantheon. Storing credentials for the source production database (for example, a D7 website) in our destination website (D8) code base – in settings.php or any other file – is not secure. Don’t do that. Usually, the preferred alternative is to manually download a copy of the production database and then migrate from that. There are plenty of times, though, where we want to perform continuous, automated migrations from a production source database. Often, complex migrations require weeks or months to complete. Running daily, incremental migrations is really valuable. For those cases, use the Terminus secrets plugin to safely store source database credentials. Here’s a great how-to from Pantheon: Running Drupal 8 Data Migrations on Pantheon Through Drush.

A Few More Things I Wish I’d Known

Here are a few more things I wish I had known about back when I first started helping clients migrate to Drupal 8:

Text with inline images can be migrated without manually copying image directories.

It’s very common to migrate from sources that have inline images. I found a really handy process plugin that helped with this. In my case, I needed to first do a string replace to make image paths absolute. Once that was done, I ran it through the inline_images plugin. This plugin will copy the images over during the migration.

body/value:
  - plugin: str_replace
    source: article_text
    search: /assets/images/
    replace: 'https://www.example.com/assets/images/'
  - plugin: inline_images
    base: 'public://inline-images'

Process plugins can be chained.

Process plugins can be chained together to accomplish some pretty crazy stuff. Sometimes I felt like I was programming in YAML. This example shows how to create taxonomy terms on the fly. Static_map allows you to map old values to new. In this case, if it doesn’t match, it gets a null value and is skipped. Finally, the entity_generate plugin creates the new taxonomy term.

field_webinar_track:
  - plugin: static_map
    source: webinar_track
    map:
      old_tag_1: 'New Tag One'
      old_tag_2: 'New Tag One'
    default_value: null
  - plugin: skip_on_empty
    method: process
  - plugin: entity_generate
    bundle_key: vid
    bundle: webinar_track

Dates can be migrated without losing your mind.

Dates can be challenging. Drupal core has the format_date plugin that allows specifying the format you are migrating from and to. You can even optionally specify the to and from time zones. In this example, we were migrating to a date range field. Date range is a single field with two values representing the start and end time. As you can see below, we target the individual values by specifying the individual value targets as ‘/’ delimited paths.

field_date/value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: start_date
field_date/end_value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: end_date

Files behind HTTP auth can be copied too.

One migration required copying PDF files as the migration ran. The download plugin allows passing in Guzzle options for handling things like basic auth. This allowed the files to be copied from an http authenticated directory without the need to have the files on the local file system first.

plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password

Constants & temporary fields can keep things organized.

Constants are essentially variables you can use elsewhere in your YAML file. In this example, base_path and file_destination needed to be defined. Temporary fields were also used to create the exact paths needed to get the correct remote filename and destination filename. My examples use an underscore to prefix the temporary field, but that isn’t required.

source:
  plugin: your_plugin
  constants:
    base_path: 'https://www.somedomain.com/members/pdf/'
    file_destination: 'private://newsletters/'

_remote_filename:
  plugin: concat
  source:
    - constants/base_path
    - filename
_destination_filename:
  plugin: concat
  source:
    - constants/file_destination
    - filename

plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password

This list of tips and tricks on Drupal Migrate just scratches the surface of what’s possible. Drupalize.me has some good free and paid content on the subject. Also, check out the Migrate API overview on drupal.org.

Further Reading

Like I said earlier, we spend a lot of time on migrations. Here are a few more articles from the Aten blog about various aspects of running Drupal 8 migrations. Happy reading!

Categories: Drupal

Chromatic: Custom Entity Methods in Twig Templates

31 January 2019 - 2:38pm

Entities and their methods are no longer limited to use within PHP, they are now available in Twig as well.

Categories: Drupal

Acquia Developer Center Blog: Building Usable Conversations: Effective Conversational Design

31 January 2019 - 7:39am

In this third installment of our series on conversational usability, we look at conversational design, an already well-explored area that is still burgeoning with emerging best practices.

Tags: acquia drupal planet
Categories: Drupal

OpenSense Labs: Brace Yourself, OpenEuropa Initiative is Here!

31 January 2019 - 6:14am
By Vasundhra

One of the episodes of Star Trek, “The Trouble With Tribbles,” is a graphic example of how small changes lead to monumental consequences over a short period of time.  

The episode depicted the effect of a new “species” on an established society, somewhat similar to the rise of open source software and its tools in today’s technology. 

Yet many of us aren’t cognizant of the reach and the influence that open source has on our personal and professional endeavours.


Thus, to solve this awareness issue, the OpenEuropa initiative was introduced. This Directorate-General for Informatics initiative aims at strengthening and adopting open source tools and practices in consolidating the European institutions' web presence. 

What does the initiative focus on?

In order to achieve these goals, the OpenEuropa initiative focuses on the following activities. 

  • Software components licensed under EUPL-1.2

The initiative focuses on building, maintaining and releasing loosely coupled, reusable software components that are licensed under EUPL-1.2. 

The European Union Public License is a free software license that has been created and approved by the European Commission. The goal of this license was to create an open source license in 23 languages for the European Union that also conforms to the copyright law of the member states of the European Union.

  • Open Source Strategies 

The initiative also focuses on building, maintaining and releasing full-fledged solutions and open source strategies for the European institutions. The specific objectives of these strategies are:

Equal treatment in the procedure 

Under this, open source solutions and proprietary solutions will be assessed on an equal basis, both being evaluated on the basis of total cost of ownership, including the prevailing costs. 

Contribution to communities 

The Commission services will actively participate in open source software communities to build the strong open source building blocks used in Commission software.

Clarifying legal aspects

For easy collaboration with open source communities, Commission developers benefit from the right legal coaching and advice on how to deal with intellectual property issues related to open source software.

Good Communication

The strategy places a strong emphasis on improved governance, increased use of open source in the domain of ICT security, and the alignment of this strategy. 

 

  • Web Services Architecture Overview 

The initiative provides a high-level architecture overview of web-related information systems. 

A web information system, or web-based information system, is an information system that uses internet web technologies to deliver information and services to users or other information systems; in other words, a software system whose main purpose is to publish and maintain data using hypertext-based principles. 
 

 

  • Open Source Projects 

The initiative contributes back to upstream open source projects. Each project complies with PHP-FIG standards and adheres to the best practices put forward by PHP: The Right Way.

PHP and Drupal projects released under the EUPL-1.2 license are:

 


OpenEuropa Coding Standards

OpenEuropa and its components are built with public contribution in mind. In order for all components and contributions to look and feel familiar, OpenEuropa has agreed to follow some coding standards. 

Although OpenEuropa does not maintain coding standards guidelines of its own, it does follow known standards such as PSR-1 and PSR-2.

The code review component has been created in order to make it easier for contributors to create new components or to modify existing ones. The coding standards are reviewed and validated with OpenEuropa code review across the different OpenEuropa components. 

Development Environment

The projects developed under the OpenEuropa initiative do not mandate a particular development environment, but they do rely on a common set of software packages:

  • PHP (required) — needed by Drush, Composer and the Task Runner.
  • Composer (required) — a package manager for PHP.
  • Git (required) — version control system.
  • Drush (required) — CLI (command line interface) integration with Drupal.
  • Robo (required) — required by the OpenEuropa Task Runner.
  • Node.js (required) — required to develop the OpenEuropa theme.


PHP: required by various tools, including Composer, Drush, Robo and Drupal itself.

Composer: used for managing the dependencies of a PHP project. All projects are required to use it, and a plus point is its natural integration with Drupal.org.

Git: the distributed version control system used as the foundation of the OpenEuropa ecosystem. 

Drush: the command-line shell and Unix scripting interface for Drupal, used to interact with a Drupal website from the command line.

Robo: a simple PHP task runner, inspired by Gulp and Rake, that aims to automate common tasks.

Node.js: required for the development of OpenEuropa themes. All development dependencies are defined in package.json. 

Automated Testing for functionalities

OpenEuropa requires automated tests to be written for every new feature or bugfix to ensure that the functionality keeps working as expected in the future. There are two types of tests:

User Stories

OpenEuropa practices Behaviour Driven Development to facilitate effective communication between business and development teams. User stories should be accompanied by test scenarios written in non-technical language. After the user stories are accepted, the scenarios can be used to perform automated tests that ensure the business requirements work. 

Unit Test 

Pull requests that do not result from user stories can be covered by unit tests rather than BDD user stories. Contributors should use the appropriate unit testing framework available for the programming language in which the component is developed. 
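
For the PHP components this usually means PHPUnit; a minimal sketch, where Slugifier is a hypothetical class standing in for the component code under test:

<?php

namespace OpenEuropa\ExampleComponent\Tests;

use PHPUnit\Framework\TestCase;

/**
 * Unit test covering a bugfix that did not originate from a user story.
 */
class SlugifierTest extends TestCase {

  /**
   * Tests that titles are converted to URL-safe slugs.
   */
  public function testSlugify(): void {
    // Slugifier is a placeholder for the real component class.
    $slugifier = new Slugifier();
    $this->assertSame('hello-world', $slugifier->slugify('Hello World!'));
  }

}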

Can Drupal components be tested as well?

In addition to the testing framework that comes with the Drupal core, OpenEuropa also uses Behat to describe business requirements. 

Behat is a test framework for behavior-driven development written in the PHP programming language. 

When a PR changes user-facing behavior, the expected end-user behavior must be described in a Behat scenario. 

  • Each user story is accompanied by a Behat scenario. The scenario is provided to the project stakeholders for acceptance testing. 
  • The target audience of these scenarios is the stakeholders. 
  • Every Behat test scenario is written in domain-specific business language and should only be used to describe expected user behavior as specified by the stakeholders.
  • Any code that does not directly affect the expected end-user behavior should be covered by unit tests instead.

Drupal 8 has introduced the concept of experimental modules, which are not covered by the usual support policies and are not fully supported, but are provided for testing purposes. They offer a wide range of functionality, from migration to site building. 

Due to the experimental nature of these modules, OpenEuropa has defined a set of policies for its components.

Minimum Stability Required

These modules follow the different levels of stability defined by Drupal: alpha, beta, RC and stable.
In order for OpenEuropa to provide stability, only experimental modules at beta stage or greater are allowed. 

This is because modules in beta and later stages have a fairly stable API; whenever the API is changed, great care is taken to provide a compatibility layer. 

Experimental modules in the alpha state

Although the rule above states that alpha modules are not allowed, they can still hold great potential value for customers. 

If, for a technical or business reason, an alpha module is justified for the project, it may be used. However, in such cases the component carries the same stability label as the experimental module it uses; that means if you are using an alpha module you are required to use the alpha label as well, until the related module's status changes. 

OpenEuropa Release Cycle

OpenEuropa releases its components following semantic versioning. Three types of releases are planned:

MAJOR

Incompatible API changes; very rare and planned in advance.

MINOR

Adds functionality and bug fixes in a backward-compatible manner.
 
PATCH 

Adds backward-compatible bug and security fixes and can be deployed immediately. No new functionality is introduced. 

Release Preparation and testing in Drupal

OpenEuropa Drupal components follow Drupal 8 core releases and will always be tested against: 

  • The current stable Drupal 8 core minor release (n).
  • The previous Drupal 8 core minor release (n-1).
  • The development branch of the upcoming Drupal 8 core minor release.

This allows the components to follow the same support cycle as Drupal core and to be better prepared for the next minor releases as they occur.

Release Support 

For Drupal components, the OpenEuropa team will have a support policy inspired by Drupal core:

  • Components support the current and previous Drupal core minor versions. New minor versions of the components are made compatible with these respective core versions.
  • When a new core minor version (n) is supported, support for release n-2 is dropped.

Conclusion 

Open source and its components have become really essential for building trust and safety around software and the web. The initiative contributes to the projects, the service-oriented architecture and the technical governance that drive the design and development of the components.

The initiative has emerged as a lightning bolt in this dark world of “unawareness”. It has not only harnessed the strengths of open source tools and practices but also established a strong web presence for the European institutions. 

Contact us at hello@opensenselabs.com with any queries or questions on this topic. Our professionals will guide you and provide you with the services you need.  

Categories: Drupal

Specbee: Artificial Intelligence & Drupal - The Smart Impact On Your Business

31 January 2019 - 4:31am
Over the years, AI has been instrumental in personalizing user experience and this is one of the prime reasons why brands are investing in Big Data as a key element of their customer offerings. However, organizations have failed to utilize the enormous potential of Artificial Intelligence by not embracing the technology completely in the field of website development.
Categories: Drupal

WeKnow: Improving Drupal and Gatsby Integration - The Drupal Boina Distribution

31 January 2019 - 1:00am
Improving Drupal and Gatsby Integration - The Drupal Boina Distribution

Drupal 8 has plenty of contributed modules to help you build a headless/decoupled web application. However, getting all of them set up correctly can be a daunting task. 

Understanding that this is an issue that should be addressed, and as mentioned previously in our blog posts in this “Improving Drupal and Gatsby Integration” series, we wrote and contributed two modules: Toast UI Editor and Build Hooks. But there are some other modules you will need (JSON-API, JSON-API Extras and Site Settings, to mention a few), as well as a minimum configuration you should take care of, to have a pleasant experience with your Drupal-Gatsby integration. 

By jmolivas
Categories: Drupal

Tandem's Drupal Blog: Lando is ready for the masses with RC2 release

31 January 2019 - 12:24am
February 01, 2019 We've waited, we've bided our time, we've gathered data and now we are ready to smite traditional local dev to ruin by unleashing the true power of Lando Holla!!! We are super pumped to announce the release of Lando 3.0.0-rc.2!. About midway through 2018 we reached a few of the milestones we were looking for: Over 5,000 monthl...
Categories: Drupal

Dries Buytaert: 2019 Australian Open 'aces' the digital experience with Acquia and Drupal

30 January 2019 - 6:00pm

Since I was young, I've been an avid tennis player and fan. I still play to this day, though maybe not as much as I'd like to.

In my teens, Andre Agassi was my favorite player. I've even sported some of his infamous headbands. I also remember watching him win the Australian Open in 1995.

In 2012, I traveled to Melbourne for a Drupal event, the same week the Australian Open was going on. As a tennis fan, I was lucky enough to watch Belgium's Kim Clijsters play.

Last weekend, the Australian Open wrapped up. This year, their website, https://ausopen.com, ran on Acquia and Drupal, delivered by the team at Avanade.

In a two-week timeframe, the site successfully welcomed tens of millions of visitors and served hundreds of millions of page views.

I'm very proud of the fact that many of the world's largest sporting events and media organizations (such as NBC Sports who host the Super Bowl and Olympics in the US) trust Acquia and Drupal as their chosen digital platform.

When the world is watching an event, there is no room for error!

Team Tennis Australia, Acquia and Avanade after the men’s singles final.

Many thanks to the round-the-clock efforts from Acquia's team in Asia Pacific, as well as our partners at Avanade!

Categories: Drupal

OSTraining: How to Display Flickr Images on Your Drupal 8 Site

30 January 2019 - 4:00pm

The Drupal 8 "Flickr" module allows you to insert Flickr images or photosets (albums) on your site, without the need of keeping the images on your server. This has a couple of advantages (we won’t discuss the disadvantages in this tutorial):

  • Less use of resources on your own server
  • Improvements on the performance of the site
  • You avoid copyright issues in your site, Flickr takes care of that
  • Make use of thousands of Flickr images available under CC License.
Categories: Drupal

DrupalEasy: Make 2019 the Year of Drupal Talent Development

30 January 2019 - 3:01pm

All sorts of organizations have made their predictions and proclamations about what 2019 will be the year of... Some say it is the Year of Optimism and growth for helicopters, others, …of the Hack & Slash, …of indigenous languages, …of the electric SUV, and even the International Year of the Salmon.  It all comes down to what someone sees as an important aspect of their field to promote, or what is or should be a trend.  DrupalEasy would therefore like to propose we make 2019 the Year of Drupal Talent Development.

As an active member of the community, we've observed, had great discussions and provided several sessions at Camps and Cons about the talent shortage in the community as well as how to build a Drupal career. We feel it is pretty important to Drupal. Right now, there are more than 2,000 jobs requiring Drupal skills on Indeed, and hundreds more continually being added. The Drupal Association also recognized the need, as we discovered while putting together this blog post, with the introduction of a Drupal Educational Opportunities Newsletter, which will help get the word out to those looking to further their skills, and those looking to start a career in Drupal.  

As a training organization, we not only train within the community, but bring in new people to develop a passion for Drupal. We’ve had over a decade to watch people who started out not being able to spell Drupal develop the commitment and skills to excel at it. We’ve seen our Drupal learning community grow, diversify, and expand internationally, as new students hone their skills and strive to contribute to the community while they do it. Through all of this talent development for Drupal and the community, most gratifying is seeing companies and organizations compete for the people building rewarding and fulfilling careers through experience and participation. 

How can we make it the Year of Drupal Talent Development?  A simple way is for each of us to simply share information on Drupal as a career, and the opportunities in Drupal, with those who may benefit from it. For those with the need and resources, it means providing the education and training needed for individuals or teams. There are some great resources for information about Drupal as a career.

The US Department of Labor has an Occupational Outlook Handbook that provides summaries of careers including salary ranges, anticipated job growth, and types of work environments.  Their entry on Web Developer careers, though not specific to Drupal, seems to track pretty well. There is also great salary information about Drupal-specific web development through the Indeed Salary Tool, as well as Glassdoor's version.

Of course, salary is just a part of it, so a few years ago, we put together a Drupal Career Resources page to provide an index of information, insight and news for those looking to get into Drupal. It is a quick way for those of us who are in the community to share a lot of information.    

We also truly believe that solid education in the ways of Drupal is key to getting and keeping people active in the community.  Mike Anello teaches our 12-week live Drupal Career Online course twice each year.  The career technical education course is licensed as a certificate program through the Florida Department of Education Commission for Independent Education.

Drupal Career Online is a comprehensive program that includes 2 class sessions and one co-working lab session each week, along with Mike's office hours and access to other training provider resources, most recently Drupalize.me.  Participants are also provided with rich learning resources including a lesson guide, class slides, links to go further in-depth into topics, and a screencast for every lesson, all accessible through a session-specific class web site. 

Prior to each DCO session, we hold Taste of Drupal mini-webinars to introduce people to Drupal, Drupal careers and our course.  There are 2 more sessions before the Spring 2019 session of Drupal Career Online kicks off on February 25th.  Those interested can sign up for a Taste of Drupal, or contact us to get more information.      

Categories: Drupal

John Svensson: Convert from Drupal 8 tarball to Composer

30 January 2019 - 12:46pm

If you were an early Drupal 8 adopter you might have downloaded and installed your Drupal 8 sites from a tarball or using Drush. We did as well, but the benefits of using Composer are so great that it's time to convert those sites into being Composer-managed.

Luckily, grasmash has built a great Composer plugin called Composerize Drupal which does all the heavy-lifting for us.

Here's how we did it:

Before you even begin, make sure you branch out:

$ git checkout -b chore/composerize-drupal

And then we installed the Composer plugin globally:

composer global require grasmash/composerize-drupal

Consider the plugin options available:

  • Use the --exact-versions option if the site is big and complex. Especially if you don't have any good test coverage to ensure your site doesn't break. The option sets the constraints of your composer.json to the exact versions of your currently downloaded modules.

Now we run the command:

composer composerize-drupal --composer-root=. --drupal-root=. --exact-versions

Next:

  • Update your .gitignore and ignore the vendor/, core/ and modules/contrib folders. If the files were already committed you also need to remove them: Ignore files that have already been committed to a Git repository
  • Re-apply your patches! Since all the core code and contrib. modules are managed by Composer you'll need to add those patches to your composer.json:
    Something like this:
"extra": { "enable-patching": true, "patches": { "drupal/core": { "2492171 - Adds transliteration to uploaded file and images": "_kodamera/patches/use_new_transliteration-2492171-72.patch" } } ...

When you run composer install they are automatically applied for you. No more manual work here. Yay!

Do some regression testing and if everything looks fine, you're done! Commit and deploy :)

Categories: Drupal

CTI Digital: Drupal Camp London 2019

30 January 2019 - 8:46am

Drupal Camp London is a 3-day event celebrating the users, designers, developers and advocates of Drupal and its community! Attracting 500 people from across Europe, after Drupalcon, it’s one of the biggest events in the Drupal Calendar.

Categories: Drupal

DrupalCon News: Community Connection - Garvita Kapur

30 January 2019 - 7:57am

We’re featuring some of the people in the Drupalverse! This Q&A series highlights individuals you could meet at DrupalCon.

Every year, DrupalCon is the largest gathering of people who belong to this community. To celebrate and take note of what DrupalCon means to them, we’re featuring an array of perspectives and fun facts to help you get to know your community.
 

Categories: Drupal

Blair Wadman: How do you place a block in a Twig template

30 January 2019 - 7:16am

There are various ways you can add blocks to regions in pages in Drupal. You could add it in the block interface, use Panels or Context. But what if you just want to place a block directly in a Twig template?

The simplest way to place a block in a Twig template is to use the Twig Tweak module. Twig Tweak is a very handy module that gives you a range of functions to make theming with Twig easier. Read on to find out how...

Categories: Drupal

Craft of Coding: Drupal on OpenShift: The business value of OpenShift

30 January 2019 - 6:50am

Looking to achieve production grade Drupal deployment using Kubernetes? Find out the business value of running your Drupal site on OpenShift, the industry’s most advanced Kubernetes distribution. Somewhere around 2017, I recall migrating my blog(running Drupal 8 at that time) to Kubernetes, just to test the then uncharted Kubernetes waters attempting to understand the buzz behind […]

The post Drupal on OpenShift: The business value of OpenShift appeared first on Craft of Coding.

Categories: Drupal
