This is a Views area handler plugin for exposed fields. It can be used in the Views header or footer, and it provides tokens for the names of the view's exposed fields. These tokens are replaced with the submitted values when a search is performed through the view's exposed form.
The tokens need to be added in the Display section of the plugin for the replacement to take effect. Currently, text, taxonomy, datetime, and boolean exposed fields are handled.
In this article we will see how to update data models in Drupal 8, how to distinguish between model updates and content updates, how to create default content, and finally the procedure to adopt for successful, surprise-free deployments in a continuous integration/delivery Drupal cycle.
Before we start, I encourage you to read the documentation of the hook_update_N() hook and to take into account all the possible impacts before writing an update.
Updating the database (executing hook updates and/or importing the configuration) is a problematic task during a Drupal 8 deployment process, because the order in which structure and data updates are applied is not well defined in Drupal, and this can cause several problems if it is not completely controlled.
It is important to differentiate between a contributed module published on drupal.org and aimed at a wide audience, and a custom Drupal project (a set of contrib/custom Drupal modules) designed to provide a bespoke solution in response to a client's needs. In a contributed module it is rare to have a real need to create instances of configuration/content entities; deploying a custom Drupal project, on the other hand, makes updating data models more complicated. In the following sections we will list all possible types of updates in Drupal 8.
The Field module allows us to add fields to bundles. We must distinguish between the data structure that will be stored in the field (the static schema() method) and all the settings of the field and its storage, which are stored as configuration. All the dependencies related to the configuration of the field are stored in the field_config configuration entity, and all the dependencies related to the storage of the field are stored in the field_storage_config configuration entity. Base fields are stored by default in the entity's base table.
Configurable fields are fields that can be added via the UI and attached to a bundle; they can be exported and deployed. Base fields, by contrast, are not managed by the field_config and field_storage_config configuration entities.
To update an entity definition or its component definitions (field definitions, for example, if the entity is fieldable), we can implement hook_update_N(). In this hook, don't use APIs that require a full Drupal bootstrap (e.g. database CRUD actions, services, …); to perform this type of update safely, use the methods provided by the EntityDefinitionUpdateManagerInterface contract (e.g. updating the entity keys, updating a base field definition common to all bundles, …).
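As a minimal sketch of such an update, assuming a hypothetical module mymodule that adds an illustrative 'subtitle' base field to nodes (module name, field name, and update number are all assumptions, not from the original article):

```php
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Adds a hypothetical 'subtitle' base field to the node entity type.
 */
function mymodule_update_8001() {
  $definition = BaseFieldDefinition::create('string')
    ->setLabel(t('Subtitle'))
    ->setRevisionable(TRUE);
  // Use the entity definition update manager instead of CRUD APIs,
  // since a full bootstrap is not guaranteed in hook_update_N().
  \Drupal::entityDefinitionUpdateManager()
    ->installFieldStorageDefinition('subtitle', 'node', 'mymodule', $definition);
}
```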
To update existing entities or field data of a fieldable entity following a definition change, we can implement hook_post_update_NAME(). In this hook you can use all the APIs you need to update your entities.
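A hedged sketch of such a post-update hook, assuming a hypothetical 'subtitle' field was added in an earlier update (the module, bundle, and field names are illustrative):

```php
use Drupal\node\Entity\Node;

/**
 * Populates the hypothetical 'subtitle' field on existing articles.
 */
function mymodule_post_update_populate_subtitle(&$sandbox) {
  // A full bootstrap is available here: entity queries, CRUD, services.
  $nids = \Drupal::entityQuery('node')
    ->condition('type', 'article')
    ->execute();
  foreach (Node::loadMultiple($nids) as $node) {
    $node->set('subtitle', $node->label());
    $node->save();
  }
}
```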
To update the schema of a simple or complex configuration (a configuration entity), or a schema defined in a hook_schema() implementation, we can implement hook_update_N().
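For a table defined in hook_schema(), such an update can use the Schema API directly; a sketch, assuming a hypothetical mymodule_data table and column:

```php
/**
 * Adds a hypothetical 'status' column to the mymodule_data table.
 */
function mymodule_update_8002() {
  $spec = [
    'type' => 'varchar',
    'length' => 64,
    'not null' => FALSE,
    'description' => 'Illustrative status column.',
  ];
  // Schema changes (unlike content CRUD) are safe in hook_update_N().
  \Drupal::database()->schema()->addField('mymodule_data', 'status', $spec);
}
```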
In a custom Drupal project we are often led to create custom content types or bundles of custom entities (something we do not normally do in a contributed module, and rarely do in an installation profile). A site building action allows us to create this type of element, which will then be exported to yml files and deployed to production using the Drupal configuration manager.
A bundle definition is a configuration entity that defines the global schema; as mentioned earlier, we can implement hook_update_N() to update the model in this case. Bundles are instances that persist as Drupal configuration and follow the same schema. To update bundles, the updated configuration must be exported using the configuration manager so that it can later be imported in production. Several problems can arise:
- If we add a field to a bundle and want to create content for this field during the deployment, this action is not trivial with the current workflow (drush updatedb -> drush config-import), and the hook_post_update_NAME() hook can't be used since it is executed before the configuration import.
- The same problem arises if we want to update fields of bundles that have existing data: the hook_post_update_NAME() hook, which is designed to update existing contents or entities, runs before the configuration is imported. What is the solution for this problem? (We will look at one later in this article.)
Importing default content for a site is an action that is not well documented in Drupal. In an installation profile this import is often done in the hook_install() hook, because the content usually does not have a complex structure with levels of nested references; in some cases the Default Content module can be used. In a module, however, we generally can't create content in a hook_install() hook, simply because when the module is installed the configuration it depends on has not yet been imported.
In a recent project I used the drush php-script command to execute import scripts after the (drush updatedb -> drush config-import) sequence, but this command is not always available during the deployment process. The first idea that comes to mind is to subscribe to the event triggered after the configuration import, in order to create the contents that will be available to the site editors, but using an event is not a nice developer experience; hence the introduction of a new hook, hook_post_config_import_NAME(), that runs once after the database updates and configuration import. Another hook, hook_pre_config_import_NAME(), has also been introduced to fix performance issues.

A workflow that works for me
To achieve successful Drupal deployments in continuous integration/delivery cycles using Drush, the most generic workflow I have found so far, while waiting for a deployment API in core, is as follows:
- drush updatedb
- hook_update_N() : To update the definition of an entity and its components
- hook_post_update_NAME() : To update existing entities after an entity definition modification (entity keys, base fields, …)
- hook_pre_config_import_NAME() : CRUD operations (e.g. creating terms that will be taken as default values when importing configuration in the next step)
- drush config-import : Importing the configuration (e.g. new bundle field, creation of a new bundle, image styles, image crops, …)
- hook_post_config_import_NAME(): CRUD operations (e.g. creating contents, updating existing contents, …)
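As a sketch of the pre-import step, and keeping in mind that hook_pre_config_import_NAME() and hook_post_config_import_NAME() are the hooks proposed in this article rather than Drupal core APIs, such a hook could create a default term like this (the vocabulary and term names are illustrative):

```php
use Drupal\taxonomy\Entity\Term;

/**
 * Creates a term used as a default value by configuration imported later.
 */
function mymodule_pre_config_import_create_default_term() {
  Term::create([
    'vid' => 'tags',
    'name' => 'Default tag',
  ])->save();
}
```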
This approach works well for us, and I hope it will be useful for you. If you’ve got any suggestions for improvements, please let me know via the comments.
We do a lot of Drupal 8 migrations here at Aten. From older versions of Drupal and WordPress, to custom SQL Server databases, to XML and JSON export files: it feels like we've imported content from just about every data source imaginable. Fortunately for us, the migration system in Drupal 8 is extremely powerful. It's also complicated. Here's a quick-start guide for getting started with your next migration to Drupal 8.
First, a caveat: we rarely perform simple one-to-one upgrades of existing websites. If that's all you need, skip this article and check out this handbook on Drupal.org instead: Upgrading from Drupal 6 or 7 to Drupal 8.

It's Worth the Steep Learning Curve
Depending on what you're trying to do, using the migrate system might seem more difficult than necessary. You might be considering Feeds, or writing something custom. My advice is virtually always the same: learn the migrate system and use it anyway. Whether you're importing hundreds of thousands of nodes and dozens of content types or just pulling in a collection of blog posts, migrate provides powerful features that will save you a bunch of time in the long run. Often in the short run, for that matter.

Use the Drupal.org Migrate API Handbooks
Here’s a much simplified overview of the high-level steps you’ll use to set up your custom Drupal 8 migration:
- Enable the migrate module (duh).
- Install Migrate Tools to enable Drush migration commands.
- Install Migrate Plus as well. It provides a bunch of, well, extras. Just assume you need it.
- Create a custom module for your migration.
- Use YAML configuration files to map fields from the appropriate source, specifying process plugins for necessary transformations, to the destination. The configuration files should exist in “my_migration_module/config/install/“.
(Pro tip: you’ll probably do a lot of uninstalling and reinstalling your module to update the configuration as you build out your migrations. Use “enforced dependencies” so your YAML configurations are automatically removed from the system when your module is uninstalled, allowing them to be recreated – without conflicts – when you re-enable the module.)
- If you’re running a Drupal-to-Drupal migration, run the “migrate-upgrade” Drush command with the “--configure-only” flag to generate stub YAML configurations. Refer to this handbook for details: Upgrade Using Drush.
- Copy the generated YAML files for each desired migration into your custom module’s config/install directory, renaming them appropriately and editing as necessary. As stated above, add enforced dependencies to your YAML files to make sure they are removed if your module is uninstalled.
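The enforced dependencies mentioned above are a short YAML fragment; a minimal sketch, assuming the module is named my_migration_module:

```yaml
# At the bottom of each migration YAML file in config/install/.
dependencies:
  enforced:
    module:
      - my_migration_module
```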
Process plugins are responsible for transforming source data into the appropriate format for destination fields. From correctly parsing images from text blobs, to importing content behind HTTP authentication, to merging sources into a single value, to all kinds of other transformations: process plugins are incredibly powerful. Further, you can chain process plugins together, making endless possibilities for manipulating data during migration. Process plugins are one of the most important elements of Drupal 8 migrations.
Here are a few process plugin resources:
- Migrate Process Plugins overview and resources from Drupal.org
- List of Core Migrate Process Plugins quick reference on Drupal.org
- Writing a Process Plugin guide to creating your own process plugin on Drupal.org (Pro tip: do a Google search first; the thing you’re trying to create likely already exists.)
Most of our projects are hosted on Pantheon. Storing credentials for the source production database (for example, a D7 website) in our destination website (D8) code base – in settings.php or any other file – is not secure. Don’t do that. Usually, the preferred alternative is to manually download a copy of the production database and then migrate from that. There are plenty of times, though, where we want to perform continuous, automated migrations from a production source database. Often, complex migrations require weeks or months to complete. Running daily, incremental migrations is really valuable. For those cases, use the Terminus secrets plugin to safely store source database credentials. Here’s a great how-to from Pantheon: Running Drupal 8 Data Migrations on Pantheon Through Drush.

A Few More Things I Wish I’d Known
Here are a few more things I wish I had known about back when I first started helping clients migrate to Drupal 8:

Text with inline images can be migrated without manually copying image directories.
It’s very common to migrate from sources that have inline images. I found a really handy process plugin that helped with this. In my case, I needed to first do a string replace to make image paths absolute. Once that was done, I ran it through the inline_images plugin. This plugin will copy the images over during the migration.

```yaml
body/value:
  - plugin: str_replace
    source: article_text
    search: /assets/images/
    replace: 'https://www.example.com/assets/images/'
  - plugin: inline_images
    base: 'public://inline-images'
```

Process plugins can be chained.
Process plugins can be chained together to accomplish some pretty crazy stuff. Sometimes I felt like I was programming in YAML. This example shows how to create taxonomy terms on the fly. Static_map allows you to map old values to new. In this case, if it doesn’t match, it gets a null value and is skipped. Finally, the entity_generate plugin creates the new taxonomy term.

```yaml
field_webinar_track:
  - plugin: static_map
    source: webinar_track
    map:
      old_tag_1: 'New Tag One'
      old_tag_2: 'New Tag One'
    default_value: null
  - plugin: skip_on_empty
    method: process
  - plugin: entity_generate
    bundle_key: vid
    bundle: webinar_track
```

Dates can be migrated without losing your mind.
Dates can be challenging. Drupal core has the format_date plugin that allows specifying the format you are migrating from and to. You can even optionally specify the to and from time zones. In this example, we were migrating to a date range field. Date range is a single field with two values representing the start and end time. As you can see below, we target the individual values by specifying them as ‘/’ delimited paths.

```yaml
field_date/value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: start_date
field_date/end_value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: end_date
```

Files behind HTTP auth can be copied too.
One migration required copying PDF files as the migration ran. The download plugin allows passing in Guzzle options for handling things like basic auth. This allowed the files to be copied from an HTTP authenticated directory without the need to have the files on the local file system first.

```yaml
plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password
```

Constants & temporary fields can keep things organized.
Constants are essentially variables you can use elsewhere in your YAML file. In this example, base_path and file_destination needed to be defined. Temporary fields were also used to create the exact paths needed to get the correct remote filename and destination filename. My examples use an underscore to prefix the temporary fields, but that isn’t required.

```yaml
source:
  plugin: your_plugin
  constants:
    base_path: 'https://www.somedomain.com/members/pdf/'
    file_destination: 'private://newsletters/'
```

```yaml
_remote_filename:
  plugin: concat
  source:
    - constants/base_path
    - filename
_destination_filename:
  plugin: concat
  source:
    - constants/file_destination
    - filename
```

```yaml
plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password
```
This list of tips and tricks for Drupal Migrate just scratches the surface of what’s possible. Drupalize.me has some good free and paid content on the subject. Also, check out the Migrate API overview on drupal.org.

Further Reading
Like I said earlier, we spend a lot of time on migrations. Here are a few more articles from the Aten blog about various aspects of running Drupal 8 migrations. Happy reading!
Entities and their methods are no longer limited to use within PHP; they are now available in Twig as well.
Allows managing app associations dynamically.
Handles the case where the site is multilingual: the language prefix won't be added for app-related links.
Blizzard has a reputation for filling its games with charming, colorful characters, and at the 2019 Game Developers Conference in March you'll get a rare look at how a new Blizzard hero is designed. ...
In this third installment of our series on conversational usability, we look at conversational design, an already well-explored area that is still burgeoning with emerging best practices.
In the Star Trek episode “The Trouble With Tribbles” there is a vivid example of how small changes lead to monumental consequences over a short period of time.
The episode depicted the effect of a new “species” on an established society, somewhat similar to the rise of open source software and its tools in today's technology.
Yet many of us are not cognizant of the reach and influence that open source has on our personal and professional lives.
What does the initiative focus on?
To solve this awareness issue, the OpenEuropa initiative was introduced. This initiative of the Directorate-General for Informatics aims at strengthening the adoption of open source tools and practices in consolidating the European institutions' web presence.
In order to achieve these goals, the OpenEuropa initiative focuses on the following activities.
- Software components licensed under EUPL-1.2
The initiative focuses on building, maintaining, and releasing loosely coupled, reusable software components that are licensed under EUPL-1.2.
The European Union Public Licence (EUPL) is a free software licence created and approved by the European Commission. The goal of this licence was to provide an open source licence available in 23 languages of the European Union that also conforms to the copyright laws of the EU member states.
- Open Source Strategies
The initiative also focuses on building, maintaining, and releasing full-fledged solutions and open source strategies for the European institutions. The specific objectives of these strategies are:
Equal treatment in the procedure
Under this objective, open source and proprietary solutions are assessed on an equal basis, both being evaluated on the basis of total cost of ownership, including exit costs.
Contribution to communities
The Commission's services will actively participate in open source software communities in order to strengthen the open source building blocks used in Commission software.
Clarifying legal aspects
For easy collaboration with open source communities, Commission developers benefit from proper legal coaching and advice on how to deal with intellectual property issues related to open source software.
The strategy places a strong emphasis on improved governance, increased use of open source in the domain of ICT security, and the alignment of this strategy.
- Web Services Architecture Overview
The initiative provides a high-level architecture overview of web-related information systems.
A web information system, or web-based information system, is an information system that uses internet web technologies to deliver information and services to users or other information systems: a software system whose main purpose is to publish and maintain data using hypertext-based principles.
- Open Source Projects
The initiative contributes back to upstream open source projects. Each project complies with PHP-FIG standards and adheres to the best practices put forward by PHP: The Right Way.
PHP and Drupal projects are released under the EUPL-1.2 license.
OpenEuropa Coding Standards
OpenEuropa and its components are built with public contribution in mind. In order for all components and contributions to look and feel familiar, OpenEuropa has agreed to follow a set of coding standards.
The code review component has been created to make it easier for contributors to create new components or modify existing ones. The coding standards are reviewed and validated by the OpenEuropa code review component across the different OpenEuropa components.

Development Environment
Projects developed under the OpenEuropa initiative do not mandate a specific development environment, but they depend on the following software packages:

| Tool | Required | Purpose |
| --- | --- | --- |
| PHP | Yes | Needed by Drush, Composer, and the Task Runner |
| Composer | Yes | A package manager for PHP |
| Git | Yes | Version control system |
| Drush | Yes | CLI (command line interface) integration with Drupal |
| Robo | Yes | Required by the OpenEuropa Task Runner |
| Node.js | Yes | Required to develop the OpenEuropa theme |
PHP: PHP is required by various tools, including Composer, Drush, Robo, and Drupal itself.
Composer: Composer is used for managing the dependencies of a PHP project. All projects are required to use it, and a plus is its native integration with Drupal.org.
Git: Git is the distributed version control system used as the foundation of the OpenEuropa ecosystem.
Drush: Drush is the command line shell and Unix scripting interface for Drupal, used to interact with a Drupal website through the command line.
Robo: Robo is a simple PHP task runner, inspired by Gulp and Rake, that aims to automate common tasks.
Node.js: Node.js is required for the development of OpenEuropa themes. All development dependencies are defined in package.json.

Automated Testing for Functionalities
OpenEuropa requires automated tests to be written for every new feature or bugfix to ensure that the functionality keeps working as expected in the future. There are two types of tests:
OpenEuropa practices Behaviour Driven Development (BDD) to facilitate effective communication between business and development teams. User stories should be accompanied by test scenarios written in non-technical language. After the user stories are accepted, these scenarios can be used to run automated tests that ensure the business requirements keep working.
Pull requests that do not stem from user stories can be covered by unit tests rather than BDD scenarios. Developers should use the appropriate unit testing framework available for the programming language in which the component is developed.

Can Drupal components be tested as well?
In addition to the testing framework that comes with Drupal core, OpenEuropa also uses Behat to describe business requirements. Behat is a test framework for behavior-driven development written in PHP.
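As a sketch of what such a scenario might look like, assuming a hypothetical news listing feature and the content-creation step definitions provided by the Behat Drupal Extension (the content type, path, and titles are illustrative):

```gherkin
Feature: News listing
  In order to stay informed
  As a site visitor
  I want to see published news articles

  Scenario: Anonymous users can see published news
    Given "news" content:
      | title         | status |
      | Example title | 1      |
    When I am on "/news"
    Then I should see "Example title"
```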
When a PR changes user-facing behavior, it is required to describe the expected end-user behavior in a Behat scenario.
- Each user story is accompanied by a Behat scenario, which is provided to project stakeholders for acceptance testing.
- The target audience of these scenarios is stakeholders.
- Every Behat test scenario is written in a domain-specific business language and should only be used to describe expected user behavior as specified by the stakeholders.
- Any code that does not directly affect expected end-user behavior should be covered by unit tests instead.
Drupal 8 introduced the concept of experimental modules: modules that ship with core but are not yet fully supported and are provided for testing purposes. They offer a wide range of functionality, from migration to site building.
Due to the experimental nature of these modules, OpenEuropa has defined a set of policies for its components.
Minimum Stability Required
These modules follow the stability levels defined by Drupal: alpha, beta, RC, and stable.
In order for OpenEuropa to provide stability, only experimental modules in their beta or later stage are allowed.
This is because modules in beta or later have a reasonably stable API; whenever the API changes, great care is taken to provide a compatibility layer.
Experimental modules in the alpha state
Although the rules state that alpha modules are not allowed, they can still hold great potential value for customers.
If, for a technical or business reason, an alpha module is justified for the project, it may be allowed as an exception. However, in such cases the components will carry the same stability label as the experimental modules they use: if you use an alpha module, your component is labelled alpha as well until the related module changes.

OpenEuropa Release Cycle
OpenEuropa releases its components following semantic versioning. Three types of releases are planned:
- Major releases: incompatible API changes; very rare and planned in advance.
- Minor releases: add functionality and bug fixes in a backward compatible manner.
- Patch releases: backward compatible bug/security fixes that can be deployed instantaneously; no new functionality is introduced.
Release Preparation and testing in Drupal
OpenEuropa Drupal components follow Drupal 8 core releases and are always tested against:
- The current Drupal 8 core minor release (n).
- The previous Drupal 8 core minor release (n-1).
- The development branch of the next Drupal 8 core minor release.
This allows components to follow the same support cycle as Drupal core and to be better prepared for new minor releases as they occur.
For Drupal components, the OpenEuropa team has a support policy inspired by Drupal core:
Components support current and previous Drupal Core minor versions. New minor versions for components are made compatible with these respective core versions.
When a new minor core version (n) becomes supported, support for release n-2 is dropped.
Open source and its components have become essential for building trust and safety around software and the web. The initiative contributes to the projects, service-oriented architecture, and technical governance that drive the design and development of its components.
The initiative has emerged as a lightning bolt in this dark world of “unawareness”. It has not only highlighted the strengths of open source tools and practices but also established a stronghold on the web presence of the European institutions.
This module allows you to visualize Drupal module dependencies in a hierarchical structure of all installed modules.
- You need at least vis.js v4.21.0; you can install it via Composer or manually. For instructions, see README.md.
Provides a token [consumers:current-name] that is replaced with the name of the consumer that requested the token.
This module contains an Inmail plugin that generates users from the sender of the incoming emails.
The Commerce Remove Cart VAT Tax module provides functionality to show the original product price and amount total on the cart page.
* By default, VAT tax is included in the price and total on the cart page.
* This module removes the VAT tax from the cart page and shows the original price.
* Install as you would normally install a contributed Drupal module.