This module integrates the TextRazor service with Drupal 8 to automatically classify your content into categories and topics using TextRazor's Natural Language Processing (NLP) technology.
Allows Entity Reference Revisions to be added, edited, and viewed within a modal dialog, without requiring a page reload.
Entity Browser ( https://www.drupal.org/project/entity_browser )
Entity Reference Revision ( https://www.drupal.org/project/entity_reference_revisions )
This is the latest post in the “Improving Drupal and Gatsby Integration” series. This time I will be talking about the Gatsby Boina Starter, which we are contributing to make your Drupal-Gatsby integration easier. The Boina starter ships with the main Gatsby configuration files you might need to get up and running on your Gatsby site.
jmolivas, Fri, 02/01/2019 - 17:47
This module makes it possible to create crops in combination with the Thumbor Effects module.
A Twitter timeline fetcher for Feeds.
This module is under active development and will receive periodic updates. It is usable, but since there is currently no stable release, I cannot guarantee that the most recent release will work.
Why should you trade Drupal's battle-tested content authoring and administration tools for a more interactive user experience?
The hotels we chose each offer an ideal hub—connecting you to a rewarding DrupalCon community experience.
As mentioned previously, we have been collaborating across the Drupal community on updating and expanding Drupal.org/community, and that work is ongoing. There are still wrinkles to resolve, such as how to make the menus on that page more obvious, but we are getting there.
Next step - community group sections
One of the things I was especially keen to do was to make areas available under /community for the groups of people that make our community work, and give them the tools and space to tell the world about:
- Who they are
- What they do
- How they work
- What their latest updates are
- How you can get involved.
Well, the framework to do this looks good and the first couple of sections are now available. You can see the following community groups already:
Each section will have standard home page content detailing the info above, as many content pages as the group can muster, and a blog that will go onto Drupal Planet.
Of course, a group will likely have content across many different parts of the Drupal.org website. I’m especially keen for all members of our community to be able to see what groups there are and how they work, in one easy-to-consume place. Our project values challenge us all to clearly define how our community functions: "We foster a learning environment, prefer collaborative decision-making, encourage others to get involved and to help lead our community."
What about the community group you are a member of?
If you represent a community group and would like to join the growing list of those with sections under /community, please get in contact.
I’m looking at globally-relevant groups right now - maybe in the future, we will look at what we can do to support local groups.
Imagine what could be possible when new members of our community come to /community and find right where they belong! I'm excited to see what's next.
The number eight in the Bible signifies resurrection and regeneration, a digit that implies “new beginnings”.
Much like the resurrection of Drupal, which loudly announced its new incarnation as a content management system, along with its ability to connect with a SaaS CRM like Salesforce.
Salesforce sits at the heart of many businesses, letting them handle their sales data in one place and putting the highest priority on customer growth. And now that the integration is tighter than ever before, Drupal 8 can take advantage of it too.
Benefits of Integrating Drupal and Salesforce
Instead of wasting any more of your time and beating around the bush, let's explore the paths that lead to this integration and the key considerations involved.
When you have a team of salespeople, small or big, how do you manage which territories or areas each of them is going after?
With the help of Salesforce, which can monitor and track almost anything you can imagine. Instead of managing those old-school spreadsheets, a CRM like Salesforce can help you track and monitor all your tasks, saving the time and resources spent managing small and large-scale teams. With the help of this CRM, you have the power to do many things, such as:
- Better management of lead processing and territories.
- Leads can be assigned to users according to data that makes business sense.
- Instant email notifications help reps follow up with customers and prospects immediately.
- It can help you attain better efficiency.
Tracking competitors and managing opportunities
In this competitive world, it is important to track and manage your competitors, and Salesforce CRM helps you do exactly that.
Through its various built-in tools, it diligently ensures that each and every opportunity is followed up on and not forgotten, and it lets you respond faster to any client that enquires about your services or products, which ultimately shows your customers that you care about their business.
A good CRM system gives you the ability, from a business point of view, to track exactly what is happening, and also to accurately forecast the growth or decline of your business. For forecasting, Salesforce can also:
- Calculate forecasts including all the information from the sales team.
- Differentiate between booked and recurring revenues.
- Customize forecasts based on the parameters that make sense to the business.
The Salesforce CRM allows you to truly manage end-to-end customer relationships. You can see everything, from the first time you engage with a client to when they place an order, and beyond.
The best part about Salesforce CRM in terms of managing orders is that it can easily turn an estimate into an order with a single click of a button, and produce customized or automated reports based on what you need to see.
Architectural approaches
There are different architectural approaches to data flow, each providing for different requirements and satisfying different needs:
- Real-Time Push. Sends data immediately on entity create, update, and delete. Strengths: fast, limited update lag, avoids UX delays, can avoid race conditions. Weaknesses: less durable and reliable.
- Cron-Based Sync. Identifies records requiring sync on cron. Strengths: handles large volumes well; can be stopped and started as needed. Weaknesses: slow, lags, risk of update conflicts.
- Work Queue. A single point of integration receives data and acts on it. Strengths: reliable, performant, shorter time lag. Weaknesses: large changes create backlogs; risk of update conflicts.
Real-Time Push
With real-time integration, Drupal objects are exported to Salesforce immediately. You get feedback indicating whether an item failed to export and whether the data is available in Salesforce. This can be a great option if you need the data in Salesforce as close to real time as possible.
Cron Based Sync
Earlier, in Drupal 7, the asynchronous push left hiccups around error handling (which involved debugging and troubleshooting), optimization, API calls, etc.
Now, in Drupal 8, a Salesforce cron-based push service has been introduced to construct database queues, normalize queue items, optimize queue operations, and implement error handling.
The cron-based sync lets Drupal's core cron API schedule synchronization from Salesforce to Drupal.
With the queue-based batching system running in the background, many objects can be sent to Salesforce as soon as possible, rather than all at the same time. Instead of an object being pushed to Salesforce the moment it is created, edited, or deleted, it goes into a queue, where it waits to be exported along with other items.
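As a rough illustration of this queue-based approach, a Drupal 8 queue worker along these lines could process queued entities in batches on cron. The plugin ID and the push service here are hypothetical; the actual Salesforce Suite implementation differs:

```php
<?php

namespace Drupal\mymodule\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Pushes queued entities to Salesforce on cron.
 *
 * @QueueWorker(
 *   id = "mymodule_salesforce_push",
 *   title = @Translation("Salesforce push queue"),
 *   cron = {"time" = 60}
 * )
 */
class SalesforcePush extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($data) {
    // $data holds the entity type and ID queued on insert/update/delete.
    $entity = \Drupal::entityTypeManager()
      ->getStorage($data['entity_type'])
      ->load($data['entity_id']);
    if ($entity) {
      // Hypothetical push service; the real module exposes its own API.
      \Drupal::service('mymodule.salesforce_client')->push($entity);
    }
  }

}
```

Because cron picks items off the queue on a schedule, exports are naturally batched instead of firing one API call per save.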
The queued items are then picked up on a configurable schedule and exported to Salesforce in batches. Batching the data helps synchronization and increases performance by using fewer API calls.
Approaches suitable for integration
There are many ways to move data from your website to another application, some of which are easy and allow integration in almost any project. Here are some approaches suitable for integrating Drupal and Salesforce.
Simple web forms
Salesforce lets you create simple HTML web forms (Web-to-Lead or Web-to-Case) that generate lead or case records in Salesforce when they are submitted.
Any Salesforce administrator can create these forms and then paste them into Drupal for users to complete.
While this method does not address every circumstance, there are specific situations where it is a good solution:
- A basic idea on the user data or inquiry information into Salesforce is needed.
- There is no or little expertise in web development.
- Something quick and easy is needed.
Third party form service
There are ample form services, like Formstack, Click & Pledge, and Wufoo, that can pass data to Salesforce. You can either embed the form in Drupal or let users click through to the platform.
This method is suitable when the following conditions apply:
- There is a need to pass both user and transaction data into Salesforce.
- There is no need to move information in both directions.
- You may want users to log in to submit a form, or to return to the form and provide more information later.
- You want sophisticated solutions that don’t really need to be customized.
Salesforce Suite
The Salesforce Suite is a collection of Drupal modules that allows synchronization of data between Drupal and Salesforce in one or both directions. The suite also provides a mapping tool that can be used to define the integration field by field and object by object.
Salesforce forms
The simplest way to hook Drupal (or any other website) up to Salesforce is to link to a form created in Salesforce. Any data the user enters goes directly into Salesforce; Drupal is not involved at all.
This method is good for lead generation or a simple application form. One of the biggest advantages of Salesforce forms is that they are cheap and easy to use, with zero setup on the Drupal side besides providing a link to the form.
Salesforce mapping
There might be instances where you have content that lives in both Drupal and Salesforce and needs to stay in sync. Salesforce mapping does that task: it keeps a version of the data at both ends, and whatever happens to one version happens to the other too.
Rules can also be made to add, delete, push or pull data.
How the approaches compare:
- Simple Web Forms. Cost: free. Direction: one direction, inbound to Salesforce. Complexity: DIY.
- Third Party Form Service. Cost: low. Direction: one direction. Complexity: DIY or developer assistance.
- Salesforce Suite. Cost: moderate to high. Direction: bi-directional. Complexity: developer assistance.
- Salesforce Mapping. Cost: high. Avoids double-entering the same content in two places. Complexity: developer assistance.
- Salesforce Forms. Cost: low. Complexity: DIY.
Integrating with different directions:
- Useful when: one-direction integration fits when data is entered directly into Salesforce or you want to keep the integration simple; two-direction integration fits when you have to pass user data, transaction data, and specific node types.
- Advantage: one direction limits complexity, and therefore liability and errors, and fewer duplicate records are created in Salesforce.
- User experience: one direction requires no updates that impact UX; two directions suit users who need sophisticated interaction, such as the ability to view offline data they have entered.
- Use cases: donation forms and event registration, in either case.
Drupal modules are here to ease the integration with Salesforce
The Drupal Salesforce Suite module is a testament both to the ingenuity and passion of the Drupal community and to the flexibility of Drupal as an enterprise platform. As a contributed module, the Salesforce Suite enables out-of-the-box connection with Salesforce, no matter what your configuration is. It supports integration by synchronizing Drupal entities (e.g. users, nodes) with Salesforce objects (e.g. organizations, contacts).
The Drupal community has, in fact, contributed a lot here. It came together to sponsor the development of the suite of Salesforce integration modules that can deal with a variety of business needs. To rewrite the module, the community gathered the time and resources, taking full advantage of advances made in both the Drupal and Salesforce platforms. It has now been rearranged into a modular architecture exposing core functionality via an API, enabling other systems such as Springboard, Jackson River’s fundraising platform. Most importantly, the Drupal suite module uses OAuth 2.0 for access control.
For non-technical users, the Drupal entity and Salesforce object mapping system provides the power to configure data maps between any objects in the two systems. Synchronization between any Drupal entity and Salesforce object (e.g. Drupal users, donation receipts) has also been made easy. The suite presents users with a lightweight wrapper around the SOAP API, which has more capabilities for some use cases, using the same OAuth authorization.
Examples of Drupal and Salesforce integration use cases
Springboard, Jackson River’s innovative solution for online fundraising and marketing, is a packaged distribution of Drupal for non-profit organizations. It needed to accept online donations and wanted to use Drupal to power other user touch points such as petitions, email registration, and more. Springboard provides a robust integration queue for bi-directional sync of data between Drupal and the Salesforce.com CRM.
RedHen CRM has been designed for the needs of membership organizations and associations. The RedHen framework is extensible and flexible and can be leveraged to produce a broad range of CRM solutions. For instance, RedHen could be used as a lightweight sales pipeline management tool for small businesses, or as an integration point between Drupal and much larger enterprise CRM solutions such as Salesforce.
Case study: Cornell University
The university offers hundreds of opportunities to students, including those living abroad. But to take advantage of those opportunities, students had to navigate a maze of departments and websites. To solve this, the Cornell University Experience Initiative (CUEI) came up with a plan to bring a “Netflix”-like experience to students: a customizable user guide making it easy for students to find opportunities.
Pantheon was chosen as the hosting platform. The team wanted to maintain their content in Drupal but manage student applications and data in the Salesforce CRM. They chose Message Agency as their partner to help conceptualize how Drupal and Salesforce would work together; Message Agency is also an architect of the Salesforce Suite, the set of Drupal modules that allows integration of these two powerful solutions.
Interested students come to the site to find and explore opportunities. Drupal does a really good job at that, but when it comes to actions and customization, Salesforce wins. Together they created a whole new paradigm of student communication and interaction.
Centralizing information also provided Cornell with opportunities: each department had its individual page or site with its own content strategy. Before the website went live, the CUEI team tested the user experience with its most trusted stakeholders: Cornell students.
The feedback they received was overwhelming, with positive reviews on how great and well organized the website was. Pantheon also evaluated the site’s performance under traffic load, given its complexity and image-heavy design.
The Future
The wide range of what Salesforce and Drupal make possible gives us a vivid idea of how sales can be increased across marketing organizations. If you take one point away from all of the above, it should be this: there is definitely an integration that will work for your organization's needs and budget, though it might not be as efficient as a full Salesforce-Drupal integration.
If you are able to get a Drupal-Salesforce integration deployed in your organization, there is no doubt that you will enjoy streamlined and optimized business processes in both the short and long term, boosting sales and making the entire process much more comfortable and effective. Bear in mind, though, that the flexibility and customizability of Salesforce can prove troublesome for the consistency of your back end.
Conclusion
Drupal installations are all unique because of the different modules and customizations they use, so each integration has to be set up individually by an expert.
If you already have a Salesforce instance set up, we'll be happy to explore the appropriate integration options. If you're new to Salesforce, we can work with your Salesforce developers to make sure your data is structured in a way that minimizes the integration effort and costs.
This module is a collection of useful Drush commands for cloning, creating, and copying fields into the node and profile entity type bundles of Drupal 8 from other node type bundles.
This module provides two useful commands for Drupal 8 fields:
1. Cloning/copying all the fields from one node type bundle to another node type bundle, i.e. attaching all the field instances from one bundle to another.
2. Creating all the fields from a node type bundle on a profile type bundle.
This is a views exposed fields area handler plugin. It can be used in the views header or footer. It provides tokens for the names of the view's exposed fields; these tokens get replaced with the searched values when the view's form is submitted.
The tokens need to be added in the Display section of the plugin to get results. Currently, text, taxonomy, datetime, and boolean exposed fields are handled.
Digital Echidna Recognized as a Top Development Firm in Canada
In this article we will see how to update data models in Drupal 8, how to distinguish between model updates and content updates, how to create default content, and, finally, the procedure to adopt for successful deployments that avoid surprises in a continuous integration/delivery Drupal cycle.
Before we start, I would encourage you to read the documentation of the hook hook_update_N() and to take into account all the possible impacts before writing an update.
Updating the database (executing hook updates and/or importing the configuration) is a very problematic task during a Drupal 8 deployment process, because the order of structure and data update actions is not well defined in Drupal and can pose several problems if not completely controlled.
It is important to differentiate between a contributed module to be published on drupal.org aimed at a wide audience, and a custom Drupal project (a set of Drupal contrib/custom modules) designed to provide a bespoke solution in response to a client’s needs. In a contributed module it is rare to have a real need to create instances of configuration/content entities, on the other hand deploying a custom Drupal project makes updating data models more complicated. In the following sections we will list all possible types of updates in Drupal 8.
The Field module allows us to add fields to bundles. We must distinguish between the data structure that will be stored in the field (the static schema() method) and all the settings of the field and its storage, which are stored as configuration. All the dependencies related to the configuration of the field are stored in the field_config configuration entity, and all the dependencies related to the storage of the field are stored in the field_storage_config configuration entity. Base fields are stored by default in the entity’s base table.
Configurable fields are the fields that can be added via the UI and attached to a bundle; they can be exported and deployed. Base fields are not managed by the field_storage_config and field_config configuration entities.
To update an entity definition or its component definitions (field definitions, for example, if the entity is fieldable) we can implement hook_update_N(). In this hook, don’t use APIs that require a full Drupal bootstrap (e.g. database CRUD actions, services, …); to do this type of update safely we can use the methods proposed by the EntityDefinitionUpdateManagerInterface contract (e.g. updating the entity keys, updating a base field definition common to all bundles, …).
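For example, a safe structural update that installs a new base field via the entity definition update manager might look like this sketch (the module and field names are hypothetical):

```php
<?php

use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Adds a hypothetical 'external_id' base field to the node entity type.
 */
function mymodule_update_8001() {
  $field = BaseFieldDefinition::create('string')
    ->setLabel(t('External ID'))
    ->setDescription(t('Identifier in a remote system.'))
    ->setSetting('max_length', 64);

  // Only structural changes here: no CRUD, no full-bootstrap services.
  \Drupal::entityDefinitionUpdateManager()
    ->installFieldStorageDefinition('external_id', 'node', 'mymodule', $field);
}
```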
To update existing entities or field data (in the case of a fieldable entity) following a modification of a definition, we can implement hook_post_update_NAME(). In this hook you can use all the APIs you need to update your entities.
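A sketch of such a content update in hook_post_update_NAME(), using the $sandbox batching pattern from core (the field and values are illustrative only):

```php
<?php

/**
 * Populates the hypothetical 'external_id' field on existing nodes.
 */
function mymodule_post_update_populate_external_id(&$sandbox) {
  $storage = \Drupal::entityTypeManager()->getStorage('node');
  if (!isset($sandbox['ids'])) {
    $sandbox['ids'] = $storage->getQuery()->accessCheck(FALSE)->execute();
    $sandbox['total'] = count($sandbox['ids']);
  }
  // Full APIs are available here, so load and save entities in small batches.
  foreach ($storage->loadMultiple(array_splice($sandbox['ids'], 0, 50)) as $node) {
    $node->set('external_id', 'legacy-' . $node->id());
    $node->save();
  }
  $sandbox['#finished'] = $sandbox['total']
    ? 1 - count($sandbox['ids']) / $sandbox['total']
    : 1;
}
```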
To update the schema of a simple or complex configuration (a configuration entity), or a schema defined in a hook_schema() hook, we can implement hook_update_N().
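For a table defined in hook_schema(), such a schema change could be applied like this (the table and column names are made up for illustration):

```php
<?php

/**
 * Adds a 'weight' column to the hypothetical mymodule_items table.
 */
function mymodule_update_8002() {
  $spec = [
    'type' => 'int',
    'not null' => TRUE,
    'default' => 0,
    'description' => 'Sort weight.',
  ];
  \Drupal::database()->schema()->addField('mymodule_items', 'weight', $spec);
}
```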
In a custom Drupal project we are often led to create custom content types or bundles of custom entities (something we do not normally do in a contributed module, and rarely do in an installation profile). A site building action allows us to create these elements, which are then exported to yml files and deployed to production using the Drupal configuration manager.
A bundle definition is a configuration entity that defines the global schema; we can implement hook_update_N() to update the model in this case, as mentioned earlier. Bundles are instances that persist as Drupal configuration and follow the same schema. To update bundles, the updated configuration must be exported using the configuration manager so it can be imported in production later. Several problems can arise:
- If we add a field to a bundle and want to create content for this field during the deployment, this action is not trivial using the current workflow (drush updatedb -> drush config-import), and hook_post_update_NAME() can’t be used since it is executed before the configuration import.
- The same problem arises if we want to update fields of bundles that have existing data: hook_post_update_NAME(), which is designed to update existing content or entities, runs before the configuration is imported. What is the solution for this problem? (We will look at one later in this article.)
Importing default content for a site is an action that is not well documented in Drupal. In an installation profile this import is often done in the hook_install() hook, because the content usually does not have a complex structure with levels of nested references; in some cases we can use the Default Content module. In a module, however, we generally can’t create content in a hook_install() hook, simply because when the module is installed the configuration has not yet been fully imported.
In a recent project I used the drush php-script command to execute import scripts after (drush updatedb -> drush config-import), but this command is not always available during the deployment process. The first idea that comes to mind is to subscribe to the event triggered after the configuration import, in order to create the content that will be available to site editors, but using an event is not a nice developer experience; hence the introduction of a new hook, hook_post_config_import_NAME(), that runs once after the database updates and configuration import. Another hook, hook_pre_config_import_NAME(), has also been introduced to fix performance issues.
A workflow that works for me
To achieve successful Drupal deployments in continuous integration/delivery cycles using Drush, the most generic workflow I’ve found so far, while waiting for a deployment API in core, is as follows:
- drush updatedb
- hook_update_N() : To update the definition of an entity and its components
- hook_post_update_N() : To update entities when you made an entity definition modification (entity keys, base fields, …)
- hook_pre_config_import_NAME() : CRUD operations (e.g. creating terms that will be taken as default values when importing configuration in the next step)
- drush config-import : Importing the configuration (e.g. new bundle field, creation of a new bundle, image styles, image crops, …)
- hook_post_config_import_NAME(): CRUD operations (e.g. creating contents, updating existing contents, …)
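Assuming the hook_post_config_import_NAME() hook described above is available in your project, a post-import content step might look like this sketch (the bundle and values are entirely illustrative):

```php
<?php

use Drupal\node\Entity\Node;

/**
 * Creates default content once database updates and config import are done.
 */
function mymodule_post_config_import_create_defaults() {
  // The new bundle and its fields exist by now, because the configuration
  // was imported in the previous deployment step.
  Node::create([
    'type' => 'landing_page',
    'title' => 'Welcome',
    'status' => 1,
  ])->save();
}
```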
This approach works well for us, and I hope it will be useful for you. If you’ve got any suggestions for improvements, please let me know via the comments.
We do a lot of Drupal 8 migrations here at Aten. From older versions of Drupal and Wordpress, to custom SQL Server databases, to XML and JSON export files: it feels like we’ve imported content from just about every data source imaginable. Fortunately for us, the migration system in Drupal 8 is extremely powerful. It’s also complicated. Here’s a quick-start guide for getting started with your next migration to Drupal 8.
First, a caveat: we rarely perform simple one-to-one upgrades of existing websites. If that’s all you need, skip this article and check out this handbook on Drupal.org instead: Upgrading from Drupal 6 or 7 to Drupal 8.
It’s Worth the Steep Learning Curve
Depending on what you’re trying to do, using the migrate system might seem more difficult than necessary. You might be considering feeds, or writing something custom. My advice is virtually always the same: learn the migrate system and use it anyway. Whether you’re importing hundreds of thousands of nodes and dozens of content types or just pulling in a collection of blog posts, migrate provides powerful features that will save you a bunch of time in the long run. Often in the short run, for that matter.
Use the Drupal.org Migrate API Handbooks
Here’s a much simplified overview of the high-level steps you’ll use to set up your custom Drupal 8 migration:
- Enable the migrate module (duh).
- Install Migrate Tools to enable Drush migration commands.
- Install Migrate Extras as well. It provides a bunch of, well, extras. I’d just assume you need it.
- Create a custom module for your migration.
- Use YAML configuration files to map fields from the appropriate source, specifying process plugins for necessary transformations, to the destination. The configuration files should exist in “my_migration_module/config/install/“.
(Pro tip: you’ll probably do a lot of uninstalling and reinstalling your module to update the configuration as you build out your migrations. Use “enforced dependencies” so your YAML configurations are automatically removed from the system when your module is uninstalled, allowing them to be recreated – without conflicts – when you re-enable the module.)
- If you’re running a Drupal-to-Drupal migration, run the “migrate-upgrade” Drush command with the “--configure-only” flag to generate stub YAML configurations. Refer to this handbook for details: Upgrade Using Drush.
- Copy the generated YAML files for each desired migration into your custom module’s config/install directory, renaming them appropriately and editing as necessary. As stated above, add enforced dependencies to your YAML files to make sure they are removed if your module is uninstalled.
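The enforced-dependencies pattern mentioned in the steps above looks like this in a migration's config/install YAML (the module and migration names are placeholders):

```yaml
# my_migration_module/config/install/migrate_plus.migration.articles.yml
id: articles
label: 'Article nodes'
dependencies:
  enforced:
    # Uninstalling the module removes this configuration with it, so it can
    # be recreated cleanly when the module is reinstalled.
    module:
      - my_migration_module
```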
Process plugins are responsible for transforming source data into the appropriate format for destination fields. From correctly parsing images from text blobs, to importing content behind HTTP authentication, to merging sources into a single value, to all kinds of other transformations: process plugins are incredibly powerful. Further, you can chain process plugins together, making endless possibilities for manipulating data during migration. Process plugins are one of the most important elements of Drupal 8 migrations.
Here are a few process plugin resources:
- Migrate Process Plugins overview and resources from Drupal.org
- List of Core Migrate Process Plugins quick reference on Drupal.org
- Writing a Process Plugin guide to creating your own process plugin on Drupal.org (Pro tip: do a Google search first; the thing you’re trying to create likely already exists.)
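For reference, the skeleton of a custom process plugin looks like the following; the plugin ID and the transformation itself are invented for illustration:

```php
<?php

namespace Drupal\my_module\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Trims and lowercases a source value.
 *
 * @MigrateProcessPlugin(
 *   id = "normalize_text"
 * )
 */
class NormalizeText extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    return mb_strtolower(trim($value));
  }

}
```

Like any other process plugin, it can then be referenced by its ID in a migration's process section and chained with others.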
Most of our projects are hosted on Pantheon. Storing credentials for the source production database (for example, a D7 website) in our destination website (D8) code base – in settings.php or any other file – is not secure. Don’t do that. Usually, the preferred alternative is to manually download a copy of the production database and then migrate from that. There are plenty of times, though, where we want to perform continuous, automated migrations from a production source database. Often, complex migrations require weeks or months to complete. Running daily, incremental migrations is really valuable. For those cases, use the Terminus secrets plugin to safely store source database credentials. Here’s a great how-to from Pantheon: Running Drupal 8 Data Migrations on Pantheon Through Drush.
A Few More Things I Wish I’d Known
Here are a few more things I wish I had known about back when I first started helping clients migrate to Drupal 8:
Text with inline images can be migrated without manually copying image directories.
It’s very common to migrate from sources that have inline images. I found a really handy process plugin that helped with this. In my case, I needed to first do a string replace to make image paths absolute. Once that was done, I ran it through the inline_images plugin. This plugin will copy the images over during the migration.

```yaml
body/value:
  - plugin: str_replace
    source: article_text
    search: /assets/images/
    replace: 'https://www.example.com/assets/images/'
  - plugin: inline_images
    base: 'public://inline-images'
```

Process plugins can be chained.
Process plugins can be chained together to accomplish some pretty crazy stuff. Sometimes I felt like I was programming in YAML. This example shows how to create taxonomy terms on the fly. Static_map allows you to map old values to new. In this case, if it doesn’t match, it gets a null value and is skipped. Finally, the entity_generate plugin creates the new taxonomy term.

```yaml
field_webinar_track:
  - plugin: static_map
    source: webinar_track
    map:
      old_tag_1: 'New Tag One'
      old_tag_2: 'New Tag One'
    default_value: null
  - plugin: skip_on_empty
    method: process
  - plugin: entity_generate
    bundle_key: vid
    bundle: webinar_track
```

Dates can be migrated without losing your mind.
Dates can be challenging. Drupal core has the format_date plugin that allows specifying the format you are migrating from and to. You can even optionally specify the to and from time zones. In this example, we were migrating to a date range field. Date range is a single field with two values representing the start and end time. As you can see below, we target the individual values by specifying them as ‘/’ delimited paths.

```yaml
field_date/value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: start_date
field_date/end_value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: end_date
```

Files behind HTTP auth can be copied too.
One migration required copying PDF files as the migration ran. The download plugin allows passing in Guzzle options for handling things like basic auth. This allowed the files to be copied from an HTTP-authenticated directory without the need to have the files on the local file system first.

```yaml
plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password
```

Constants & temporary fields can keep things organized.
Constants are essentially variables you can use elsewhere in your YAML file. In this example, base_path and file_destination needed to be defined. Temporary fields were also used to create the exact paths needed to get the correct remote filename and destination filename. My examples use an underscore to prefix the temporary field, but that isn’t required.

```yaml
source:
  plugin: your_plugin
  constants:
    base_path: 'https://www.somedomain.com/members/pdf/'
    file_destination: 'private://newsletters/'
_remote_filename:
  plugin: concat
  source:
    - constants/base_path
    - filename
_destination_filename:
  plugin: concat
  source:
    - constants/file_destination
    - filename
plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password
```
This list of tips and tricks for Drupal Migrate just scratches the surface of what’s possible. Drupalize.me has some good free and paid content on the subject. Also, check out the Migrate API overview on drupal.org.
Further Reading
Like I said earlier, we spend a lot of time on migrations. Here are a few more articles from the Aten blog about various aspects of running Drupal 8 migrations. Happy reading!
Entities and their methods are no longer limited to use within PHP; they are now available in Twig as well.
Allows managing app associations dynamically.
Handles the case where a site is multilingual: the language prefix won't be added for app-related links.