Drupal

NSW Feedback

New Drupal Modules - 8 November 2017 - 7:47pm

This module enables a feedback form on your website once your domains are registered with onegov.

Additionally, this module allows you to provide a custom path to the JavaScript file from your admin UI.

Thanks.

Categories: Drupal

Database Management

New Drupal Modules - 8 November 2017 - 3:56pm
Categories: Drupal

Bulk Uploader For Media Entities

New Drupal Modules - 8 November 2017 - 1:52pm

A module to bulk upload images to create media entities.

Categories: Drupal

Entity reference redirect

New Drupal Modules - 8 November 2017 - 1:27pm
Categories: Drupal

Elevated Third: Marketing Automation, Meet Drupal

Planet Drupal - 8 November 2017 - 12:06pm
Andy Mead, Wed, 11/08/2017 - 13:06

Oh, hi there. I’d be lying if I said I wasn’t expecting you. This is a blog after all. And supposedly people read these things, which is, supposedly, why you’re here. So pull up a seat (if you’re not already sitting) and I’ll tell you why Drupal is a great partner for Marketing Automation.

Ah, Marketing Automation. (Hereafter MA, because why read 7 syllables when you can read 2?) It’s arguably the most hyped business technology of the last decade or so, spoken about in hushed tones, as though simply subscribing to a platform will print money for you. Sadly that’s not the truth. But when used properly with digital strategy, it’s pretty good at what it does: capturing latent demand and turning it into sales. The tricky part is the modifying clause that opened the last sentence, “when used properly.”

What to expect from Marketing Automation?

Marketing Automation tools and platforms these days come loaded with bells and whistles: from custom reporting engines to fancy drag-n-drop campaign UIs, and WYSIWYGs that let marketers build digital assets like landing pages and emails. And yet, despite all that fanciness, it’s still really hard to do Marketing Automation right. Why? Well, leaving aside strategic questions (a massive topic on its own), my own experience with MA always left me wanting two things: expressibility and scalability.

Drupal + Marketing Automation

While publishing workflows in Marketing Automation tools have improved over the years, they still can’t compete with a CMS, particularly one as powerful as Drupal. Drupal empowers users to express content in terms that go far beyond simple landing pages.

In fact, Drupal is used today for just about anything you can imagine, from powering Fortune 500 marketing websites to running weather.com and acting as the backbone of custom web applications. What’s possible with Drupal is really up to you. Just ask the guy who built it.

So, fine. Drupal is great and everything. But how does it help your marketing? Well, because Drupal is so flexible, you can integrate it with almost anything: Google Analytics, Pardot, Marketo, Eloqua, Salesforce, and on, and on, and on. In a quickly changing technology landscape, that’s an incredible strength: Drupal can act as the nervous system for your marketing technology stack.

“Marketing technology stack?” Yeah, I don’t like business jargon, either. But, it’s a helpful way to think about digital marketing tools. Because they are just that: tools with strengths and weaknesses. You probably wouldn’t use a screwdriver to drive a nail into the wall. Sure, you could, but there’s a better tool for the job: a hammer. Likewise, your MA platform could power all your digital assets, but there’s a better tool for that job, too: Drupal.

The right tools for the job

In my experience, organizing these tools around their strengths brings better results. Here at Elevated Third, we’ve done that by connecting Drupal to Marketing Automation platforms like Pardot, Marketo, and SharpSpring, using it as the front end for the services that power marketing programs. Moreover, MA is only one piece of that puzzle. Want to use something like HotJar? Drupal is happy to.

Open source means flexibility 

So where does this flexibility come from? Drupal is open source software, and a massive developer community improves it daily. Arguably the greatest strength of open source software is its flexibility.

You don’t like the way something works? Easy. Let’s change it.

Is something broken? No problem, let’s fix it.

Got a new problem that off-the-shelf solutions don’t solve? Well, then, let’s build a solution for it.

Is Drupal the right tool for every job? I’d be lying (again) if I said it was. But it’s the right tool for jobs that require unique, flexible solutions. And it could be the right tool for your job, too. If you are curious, let's talk.

Categories: Drupal

Cheeky Monkey Media: The Drupal Checklist Every Developer Needs

Planet Drupal - 8 November 2017 - 11:49am
cody, Wed, 11/08/2017 - 19:49

Are you almost finished setting up your Drupal website? At a glance, everything might look ready to go.

But, before you hit "publish," you need to make sure you haven't made any mistakes.

A writer proofreads before they post an article. Similarly, a developer should double check their work.

The last thing you want is to go live with your site and have something go wrong. Finding problems before you launch can save some headaches and embarrassment.

We've compiled a pre-launch, Drupal checklist. When it's complete, you'll rest easy knowing that your website is ready to go.

Security

Security is first on this Drupal checklist because it's so important. Of course, you want to rest easy knowing that your site is secure when it launches. You also want your users to have peace of mind knowing that their information is safe.

Double checking your site's security will ensure that there's nothing you've missed that could make you vulnerable to hackers.

Categories: Drupal

Evolving Web: Profiling and Optimizing Drupal Migrations with Blackfire

Planet Drupal - 8 November 2017 - 11:34am

A few weeks ago, we at Evolving Web finished migrating the Princeton University Press website to Drupal 8. The project was over 70% migrations. In this article, we will see how Blackfire helped us optimize our migrations by changing around two lines of code.

Before we start
  • This article is mainly for PHP / Drupal 8 back-end developers.
  • It is assumed that you know about the Drupal 8 Migrate API.
  • Code performance is analyzed with a tool named Blackfire.
  • Front-end performance analysis is not in the scope of this article.
The Problem

Here are some of the project requirements related to the problem, to give you a better picture of what's going on:

  • A PowerShell script exports a bunch of data into CSV files on the client's server.
  • A custom migration plugin, PUPCSV, fetches the CSV files via SFTP.
  • Using hook_cron() in Drupal 8, we check hashes for each CSV.
  • If a file's MD5 hash changes, the migration is queued for import using the Drupal 8 Queue API.
  • The CSV files usually have 2 types of changes:
    • Certain records are updated here and there.
    • Certain records are added to the end of the file.
  • When a migration is executed, migrate API goes line-by-line, doing the following things for every record:
    • Read a record from the data source.
    • Merge data related to the record from other CSV files (kind of an inner join between CSVs).
    • Compute hash of the record and compare it with the hash stored in the database.
    • If a hash is not found in the database, the record is created.
    • If a hash is found and it has changed, the record is updated.
    • If a hash is unchanged, no action is taken.
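The per-record decision above can be sketched in plain PHP (a simplified illustration, not the project's actual code; the function name and hash scheme are hypothetical stand-ins for what migrate API does internally):

```php
// Decide what migrate should do with a record, given the hashes already
// stored in the database. Returns 'create', 'update', or 'skip'.
function decideRowAction(array $row, array $storedHashes, string $id): string {
  // Compute a hash of the merged record.
  $hash = md5(serialize($row));
  if (!isset($storedHashes[$id])) {
    // No hash in the database: the record is new.
    return 'create';
  }
  if ($storedHashes[$id] !== $hash) {
    // The hash changed: the record was updated at the source.
    return 'update';
  }
  // The hash is unchanged: no action is taken.
  return 'skip';
}
```

Note that migrate's real implementation hashes the row together with the source configuration, which is why changing the source settings forces a full re-import, as discussed later in this article.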

While running migrations, we noticed that it was taking too much time for migrations to go through the CSV files simply checking for changes in row hashes. For big migrations with over 40,000 records, migrate was taking several minutes to reach the end of the file, even on a high-end server. Since we were running migrations during cron (with queue workers), we had to ensure that any individual migration could be processed within the 3-minute PHP maximum execution time limit available on the server.

Analyzing migrations with Blackfire

At Evolving Web, we usually analyze performance with Blackfire before any major site launch. Usually, we run Blackfire with the Blackfire Companion, which is currently available for Google Chrome and Firefox. However, since migrations are executed using Drush, a command-line tool, we had to use the Blackfire CLI tool, like this:

$ blackfire run /opt/vendor/bin/drush.launcher migrate-import pup_subjects
Processed 0 items (0 created, 0 updated, 0 failed, 0 ignored) - done with 'pup_subjects'
Blackfire Run completed

Upon analyzing the Blackfire reports, we found some 50 unexpected SQL queries being triggered from somewhere within the PUPCSV::fetchNextRow() method. Quite surprising! PUPCSV refers to a migrate source plugin we wrote for fetching CSV files over FTP / SFTP. This plugin also tracks a hash of the CSV files, allowing us to skip a migration completely if the source files have not changed. If the source hash changes, the migration updates all rows, and when the last row has been migrated, we store the file's hash in the database from PUPCSV::fetchNextRow(). As a matter of fact, we are preparing another article about creating custom migrate source plugins, so stay tuned.

We found one database query per row even though no record was being created or updated. That didn't seem very harmful until we saw the Blackfire report.

Code before Blackfire

Taking a closer look at the PUPCSV::fetchNextRow() method, we found a call to MigrateSourceBase::count() that was taking 40% of the processing time, because it was being called for every row in the CSV. Since the source/cache_counts parameter was not set to TRUE in the migration YAML files, the count() method was iterating over all items to get a fresh count on each call! Thus, for a migration with 40,000 records, we were going through 40,000 x 40,000 records, and the PHP maximum execution time was being reached even before migrate could get to the last row! Here's a look at the code.

protected function fetchNextRow() {
  // If the migration is being imported...
  if (MigrationInterface::STATUS_IMPORTING === $this->migration->getStatus()) {
    // If we are at the last row in the CSV...
    if ($this->getIterator()->key() === $this->count()) {
      // Store source hash to remember the file as "imported".
      $this->saveCachedFileHash();
    }
  }
  return parent::fetchNextRow();
}

Code after Blackfire

We could have added the cache_counts parameter to our migration YAML files, but any change in the source configuration of the migrations would have made migrate API update all records in all migrations. This is because a row's hash is computed as something like hash($row + $source). We did not want migrate to update all records because we had certain migrations which sometimes took around 7 hours to complete. Hence, we decided to statically cache the total record count to get things back on track:

protected function fetchNextRow() {
  // If the migration is being imported...
  if (MigrationInterface::STATUS_IMPORTING === $this->migration->getStatus()) {
    // Get total source record count and cache it statically.
    static $count;
    if (is_null($count)) {
      $count = $this->doCount();
    }
    // If we are at the last row in the CSV...
    if ($this->getIterator()->key() === $count) {
      // Store source hash to remember the file as "imported".
      $this->saveCachedFileHash();
    }
  }
  return parent::fetchNextRow();
}

Problem Solved. Merci Blackfire!

After the changes, we ran Blackfire again and found things to be 52% faster for a small migration with 50 records.

For a bigger migration with 4,359 records, the import time dropped from 1m 47s to only 12s, a 98% improvement. Wondering why we didn't include a screenshot for the bigger migration? We did not (or rather could not) generate a report for the big migration for two reasons:

  • While working, Blackfire stores function calls and other information in memory, so running a huge migration with Blackfire can be a bit slow. Besides, our objective was to find the problem, and we could do that more easily while looking at smaller figures.
  • When running a migration with thousands of rows, the migration functions are called thousands of times! Blackfire collects data for each of these function calls, so the collected data sometimes becomes too heavy and Blackfire rejects the payload with an error message like this:
The Blackfire API answered with a 413 HTTP error () Error detected during upload: The Blackfire API rejected your payload because it's too big.

Which makes a lot of sense. In fact, for the other case study given below, we used the --limit=1 parameter to profile code performance for a single row.

A quick brag about another 50% improvement

Apart from this jackpot, we also found room for another 50% improvement (from 7h to 3h 32m) in one of our migrations, which was using the Touki FTP library. This migration was doing the following:

  • Going through around 11,000 records in a CSV file.
  • Downloading the files over FTP when required.

A Blackfire analysis of this migration revealed something strange. For every row, the following was happening behind the scenes:

  • If a file download was required, we were doing FTP::findFileByName($name).
  • To get the file, Touki was:
    • Getting a list of all files in the directory;
    • Creating a File object for every file;
    • Creating various permission, owner, and other objects for each File object;
    • Passing all the files through a callback to see if their name matched $name;
    • Returning the matching file and discarding all the other File objects.

Hence, for every file it downloaded, Touki FTP was creating 11,000 File objects, of which it used only one! To resolve this, we decided to use the lower-level FTP::get($source, $destination) method, which helped us bypass the 50,000 or more objects being created per record (approximately 11,000 * 50,000 or more for all records). This almost halved the import time for that migration when working with all 11,000 records! Here's a screenshot of Blackfire's report for a single row.
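The pattern can be illustrated with a toy sketch (these are not the actual Touki API calls; both function names and the array-based "objects" are hypothetical simplifications):

```php
// Naive lookup: materialize an object for every directory entry, then filter.
function findFileByNameNaive(array $listing, string $name): ?array {
  $objects = [];
  foreach ($listing as $entry) {
    // One File "object" (plus permission/owner data) per entry, even though
    // only one of them will ever be used.
    $objects[] = ['name' => $entry, 'perms' => '0644', 'owner' => 'ftp'];
  }
  foreach ($objects as $file) {
    if ($file['name'] === $name) {
      return $file;
    }
  }
  return NULL;
}

// Direct fetch: touch only the file you actually need.
function getFileDirect(string $name): array {
  return ['name' => $name];
}
```

The naive version does work, but its cost scales with the size of the directory listing rather than with the number of files you actually need.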

So the next time you think something fishy is going on with code you wrote, don't forget to use Blackfire! And don't forget to leave your feedback, questions, and even article suggestions in the comments section below.

More about Blackfire

Blackfire is a code profiling tool for PHP which gives you nice-looking reports about your code's performance. With the help of these reports, you can analyze the memory, time and other resources consumed by various functions and optimize your code where necessary. If you are new to Blackfire, you can try these links:

Apart from all this, the paid version of Blackfire lets you set up automated tests and gives you various recommendations, not only for Drupal but for various other PHP frameworks.

Next Steps
  • Try Blackfire for free on a sample project of your choice to see what you can find.
  • Watch video tutorials on Blackfire's YouTube channel.
  • Read the tutorial on creating custom migration source plugins written by my colleague (coming soon).
Categories: Drupal

CSV Importer

New Drupal Modules - 8 November 2017 - 11:29am
Categories: Drupal

React Admin UI

New Drupal Modules - 8 November 2017 - 9:55am

All issues and development take place on GitHub.

Setup

Ensure you have a moderately recent version of Node.js (8.9.1 LTS) and Yarn installed. Currently the bundle is not included, as this is under active development.

cd /js && yarn run build:js

Categories: Drupal

Lullabot: Styling the WYSIWYG Editor in Drupal 8

Planet Drupal - 8 November 2017 - 8:42am

Drupal 8 ships with a built-in WYSIWYG editor called CKEditor. It’s great to have it included in core, but I had some questions about how to control the styling. In particular, I wanted the styling in the editor to look like my front-end theme, even though I use an administration theme for the node form. I spent many hours trying to find the answer, but it turned out to be simple, if a little confusing.

In my example, I have a front-end theme called “Custom Theme” that extends the Bootstrap theme. I use core’s “Seven” theme as an administration theme, and I checked the box to use the administration theme for my node forms. 

My front-end theme adds custom fonts to Bootstrap and uses a larger than normal font size, so it’s distinctly different from the standard styling that comes with the WYSIWYG editor.

Front End Styling vs. WYSIWYG Styling

Out of the box, the styling in the editor looks very different from my front-end theme. The font family and line height are wrong, and the font size is too small.


It turns out there are two ways to alter the styling in the WYSIWYG editor: adding some information to the default theme’s info.yml file, or implementing hook_ckeditor_css_alter() in either a module or the theme. The kicker is that the info.yml changes go in the FRONT END theme, even though I’m using an admin theme on the node form.
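For completeness, the hook variant looks roughly like this (a minimal sketch; the module name mymodule and the stylesheet path are hypothetical, and the Editor type hint is dropped so the snippet stands alone):

```php
/**
 * Implements hook_ckeditor_css_alter().
 *
 * Adds the front-end theme's stylesheet to the CKEditor iframe.
 */
function mymodule_ckeditor_css_alter(array &$css, $editor = NULL) {
  $css[] = 'themes/custom/custom_theme/css/style.css';
}

// Standalone demonstration of the alter behavior.
$css = ['core/modules/ckeditor/css/ckeditor-iframe.css'];
mymodule_ckeditor_css_alter($css);
print_r($css);
```

In a real module the second parameter is the configured text editor entity, which you could inspect to alter CSS only for particular text formats.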

I added the following to my default theme's info file, custom_theme.info.yml. The font-family.css and style.css files are the front-end theme CSS files that I want to pass into the WYSIWYG editor. Even if I select the option to use the front-end theme for the node form, the CSS from that theme will not make it into the WYSIWYG editor without this change, so it is necessary whether or not you use an admin theme on the node form!

name: "Custom Theme"
description: A subtheme of Bootstrap theme for Drupal 8.
type: theme
core: 8.x
base theme: bootstrap
ckeditor_stylesheets:
  - https://fonts.googleapis.com/css?family=Open+Sans
  - css/font-family.css
  - css/style.css
libraries:
  ...

WYSIWYG Styling

After this change, the font styles in the WYSIWYG editor match the text in the primary theme.


When CKEditor builds the editor iframe, it checks to see which theme is the default theme, then looks to see if that theme has values in the info.yml file for ckeditor_stylesheets. If it finds anything, it adds those CSS files to the iframe. Relative CSS file URLs are assumed to be files in the front-end theme’s directory, or you can use absolute URLs to other files.

The contributed Bootstrap module does not implement ckeditor_stylesheets, so I had to create a sub-theme to take advantage of this. I always create a sub-theme anyway, to add in the little tweaks I want to make. In this case, my sub-theme also uses a Google font instead of the default font, and I can also pass that font into the WYSIWYG editor.

TaDa!

That was easy to do, but it took me quite a while to understand how it worked. So I decided to post it here in case anyone else is as confused as I was.

More Information

To debug this further and understand how to affect the styling inside the WYSIWYG editor, you can refer to the relevant code in two files in core. From ckeditor.module:

/**
 * Retrieves the default theme's CKEditor stylesheets.
 *
 * Themes may specify iframe-specific CSS files for use with CKEditor by
 * including a "ckeditor_stylesheets" key in their .info.yml file.
 *
 * @code
 * ckeditor_stylesheets:
 *   - css/ckeditor-iframe.css
 * @endcode
 */
function _ckeditor_theme_css($theme = NULL) {
  $css = [];
  if (!isset($theme)) {
    $theme = \Drupal::config('system.theme')->get('default');
  }
  if (isset($theme) && $theme_path = drupal_get_path('theme', $theme)) {
    $info = system_get_info('theme', $theme);
    if (isset($info['ckeditor_stylesheets'])) {
      $css = $info['ckeditor_stylesheets'];
      foreach ($css as $key => $url) {
        if (UrlHelper::isExternal($url)) {
          $css[$key] = $url;
        }
        else {
          $css[$key] = $theme_path . '/' . $url;
        }
      }
    }
    if (isset($info['base theme'])) {
      $css = array_merge(_ckeditor_theme_css($info['base theme']), $css);
    }
  }
  return $css;
}

and from Plugin/Editor/CKEditor.php:

/**
 * Builds the "contentsCss" configuration part of the CKEditor JS settings.
 *
 * @see getJSSettings()
 *
 * @param \Drupal\editor\Entity\Editor $editor
 *   A configured text editor object.
 *
 * @return array
 *   An array containing the "contentsCss" configuration.
 */
public function buildContentsCssJSSetting(Editor $editor) {
  $css = [
    drupal_get_path('module', 'ckeditor') . '/css/ckeditor-iframe.css',
    drupal_get_path('module', 'system') . '/css/components/align.module.css',
  ];
  $this->moduleHandler->alter('ckeditor_css', $css, $editor);
  // Get a list of all enabled plugins' iframe instance CSS files.
  $plugins_css = array_reduce($this->ckeditorPluginManager->getCssFiles($editor), function ($result, $item) {
    return array_merge($result, array_values($item));
  }, []);
  $css = array_merge($css, $plugins_css);
  $css = array_merge($css, _ckeditor_theme_css());
  $css = array_map('file_create_url', $css);
  $css = array_map('file_url_transform_relative', $css);
  return array_values($css);
}
Categories: Drupal

MQTT Integration

New Drupal Modules - 8 November 2017 - 8:30am

This module allows Drupal sites to connect to an MQTT broker. The module is still under development and as of now, only allows publish operations over TCP. Other protocols and operations will be added eventually.

Categories: Drupal

Response Code Condition

New Drupal Modules - 8 November 2017 - 7:20am

Provides a condition to show or hide a block depending on the response code.

Categories: Drupal

Block in form

New Drupal Modules - 8 November 2017 - 6:26am

This module allows you to add content blocks in a Drupal entity form.

Categories: Drupal

Valuebound: Enabling custom web font in Drupal website

Planet Drupal - 8 November 2017 - 4:21am

This blog will walk you through one of the contributed modules in the Drupal community that has been a sigh of relief for me whenever I was in trouble during site-building work. A couple of weeks back, I was assigned a task where the requirement was to enable the ‘Benton-sans Regular’ font throughout the site. Initially, I thought it would be an easy task that could be done quickly. But I was wrong.

No worries if you are facing similar difficulties! Here, I am going to discuss how you can enable the ‘Benton-sans Regular’ font seamlessly using the Drupal font-your-face module.

Categories: Drupal

Views Plain

New Drupal Modules - 8 November 2017 - 2:53am

Views display style that removes all the wrapper markup.

Installation

composer require drupal/views_plain:1.x-dev

Configuration

While editing a View, select the 'Plain' option under 'Format'.

Categories: Drupal

Flocon de toile | Freelance Drupal: Change the position of the meta data panel on the node form with Drupal 8

Planet Drupal - 8 November 2017 - 2:00am
Content metadata (menu settings, publishing options, URL path settings, and so on) is by default displayed on the node form in a side panel. This has the advantage of giving immediate visibility of these options while writing content. But there are use cases where the lateral position of this information is detrimental to the general ergonomics, because it reduces the space available for the content form. This can be the case, for example, if you use the Field Group module to structure and group the information you need to enter. No need for a Drupal expert here. Let's find out how we can make the position of this metadata customizable according to the needs and general ergonomics of a Drupal 8 project.
Categories: Drupal

Editor image attributes

New Drupal Modules - 8 November 2017 - 1:12am

Overview
Editor image attributes provides additional attributes to be added on editor image.

Supported attributes include
id
class
extra

Categories: Drupal

Agiledrop.com Blog: AGILEDROP: Top Drupal blogs from October

Planet Drupal - 8 November 2017 - 12:56am
October is over, so it's time we present the top Drupal blogs written in October by other authors. Let's start with How to maintain Drush commands for Drush 8 and 9 and Drupal Console with the same code base by Fabian Bircher from Nuvole. He shows us that the solution is actually really simple: it is all about separating the command discovery from the command logic. Check it out! Our second choice is Decoupled Drupal Hard Problems: Image Styles by Mateu Aguiló Bosch from Lullabot. He shows us the problems that arise when the back-end doesn't know anything about the front-end design. He presents a… READ MORE
Categories: Drupal

Savas Labs: The cost of investing in Drupal 7 - why it's time for Drupal 8

Planet Drupal - 7 November 2017 - 4:00pm

In the second of a two-part series, we investigate Drupal 8's present value and help highlight sometimes hidden costs of developing on an older platform. Continue reading…

Categories: Drupal

Morpht: Announcing Enforce Profile Field for Drupal 8

Planet Drupal - 7 November 2017 - 3:42pm


The Enforce Profile Field is a new module which allows editors to enforce the completion of one or more fields in order to access content on a site. It is now available for Drupal 8.

Sometimes you need to collect a variety of profile data from different users. The data may be needed for regulatory compliance or marketing reasons. In some cases you need a single field, and in others it may be several. You may also wish to collect the information only when a user accesses certain parts of the site.

The Enforce Profile Field module comes to the rescue in cases such as these, requiring users to complete their profile before being able to move on to the page they want to see. This may sound harsh; however, collecting data as you need it is a subtler approach than enforcing it all at registration time.

The implementation consists mainly of a new field type called "Enforce profile" and hook_entity_view_alter().
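The core of that check, deciding which enforced fields a user still has to fill in before hook_entity_view_alter() swaps out the content, can be sketched in plain PHP (a simplified illustration, not the module's actual code; the function and field names are hypothetical):

```php
// Return the enforced fields the user has not yet filled in.
function missingEnforcedFields(array $enforcedFields, array $profileValues): array {
  $missing = [];
  foreach ($enforcedFields as $field) {
    if (empty($profileValues[$field])) {
      $missing[] = $field;
    }
  }
  return $missing;
}
```

If the returned list is empty, the user sees the content; otherwise they are redirected to the profile tab listing the missing fields.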

The module works as follows:
  1. The site builder defines a “form display” for the user type bundle and specifies the fields associated with it to collect data.
    1. The fields should not be required, as this allows the user to skip them during registration and profile editing.
    2. In addition, the Form Mode Manager module can be used to display the “form display” as a "tab" on a user profile page.
  2. The site builder places an Enforce profile field onto an entity type bundle, such as a node article or page.
  3. The Enforce profile field requires some settings:
    1. A "User's form mode" to be utilized for additional field information extraction (created in the first step).
    2. The "Enforced view modes" that require some profile data to be filled in before they can be accessed. You should usually select the "Full content" view mode rather than view modes like "Teaser" or "Search".
  4. The editor creates content, an article or page, and can then select which fields need to be enforced.
    1. The editor is provided with multi-select of "User's form mode" fields.
    2. Selecting nothing means no access change and no profile data enforcement.
  5. A new user navigates to the content and is redirected to the profile tab and is informed that they need to complete the fields.
  6. Fields are completed, form submitted and the user redirected back to the content.
    1. If the user doesn't provide all the enforced fields, the profile tab is displayed again with a message indicating which fields still need to be filled in.
Why use the Enforce Profile Field to collect additional profile data?
  • You may need a customer's information to generate a coupon or access token.
  • You may just want to know better with whom you share information.
  • Your users know exactly which content requires their additional profile data, rather than having to satisfy a wide range of requirements during registration. It just makes things easier for them.
  • The new profile data can be synced to a CRM or other system if required.

Let us know what you think.
 

Categories: Drupal


Subscribe to As If Productions aggregator - Drupal