Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

Drudesk: UpTime Widget Drupal module to show website reliability

12 June 2019 - 7:00am

There are many beautiful words to tell your customers that your website is trustworthy, reliable, and transparent. But one small widget can say it better than a thousand words.

So let us introduce the UpTime Widget Drupal module. See how it could help you always stay aware of your website uptime, build customer trust, and stand out from competitors.

Categories: Drupal

Srijan Technologies: Make your Travel Business a Global Phenomenon with Drupal

12 June 2019 - 6:16am

The tour and travel business has started to catch up in the digital realm. In fact, it’s growing faster than the total travel market. It is predicted that by 2020, the overall tours and activities segment will grow to $183 billion.

A clear opportunity for businesses in the travel industry.

Categories: Drupal

InternetDevels: Artificial intelligence and Drupal 8: amazing opportunities & useful modules

12 June 2019 - 5:30am

It’s exciting to see how once unimaginable things become popular digital practices! A vivid example is artificial intelligence. We have shared with you an article about artificial intelligence coming to your apps thanks to cognitive services. What about Drupal websites — are they ready for AI? The answer is a definite yes! Let’s see how artificial intelligence and Drupal 8 come together.

Categories: Drupal

OSTraining: How to Integrate Telegram Chat With Drupal 8

11 June 2019 - 11:29pm

Telegram is an easy-to-use, free chat application that is rapidly winning fans all over the world.

There is a Telegram plugin for WordPress but there is not yet a Telegram module for Drupal.

In this tutorial, you will learn how to integrate the Telegram app with your Drupal 8 site using JavaScript from Re:plain.

Categories: Drupal

Gizra.com: Tools with Friendly Learning Curve: ddev

11 June 2019 - 10:00pm

Some years ago, a frontend developer colleague mentioned that we should introduce SASS, as it requires almost no preparation to start using it. Then, as we progressed, we could use more and more of it. He proved to be right. A couple of months ago, our CTO, Amitai, made a similar move. He suggested using ddev as part of rebuilding our starter kit for a Drupal 8 project. I had the same feeling, even though I did not know all the details about the tool. But introducing it felt right, and it quickly became evident that it would be beneficial.

Here’s the story of our affair with it.

For You

After the installation, a friendly command-line wizard (ddev config) asks you a few questions:

The configuration wizard holds your hand

It gives you an almost perfect configuration, and you can review the YAML files in the .ddev directory. In .ddev/config.yaml, pay attention to router_http_port and router_https_port: these ports should be free, but the default port numbers are almost certainly occupied by a local Nginx or Apache on your development system already.
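If that is the case, you can override the router ports in .ddev/config.yaml. A minimal sketch (the port numbers below are arbitrary free ports chosen for illustration, not defaults):

router_http_port: "8080"
router_https_port: "8443"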

After the configuration, ddev start creates the Docker containers you need, nicely pre-configured according to your selections. Even if your site was installed previously, you’ll be greeted by the installation process when you try to access the URL, because the database inside the container is empty, so you can install the site there (again) by hand.

You have a site inside ddev, congratulations!

For All of Your Coworkers

So now ddev serves the full stack for your site, but is it ready for teamwork? Not yet.

You probably have your own automation that bootstraps the local development environment (site installation, specific configurations, theme compilation, just to name a few); now it’s time to integrate that into ddev.

The config.yaml provides various directives to hook into the key processes.

A basic Drupal 8 example in our case looks like this:

hooks:
  pre-start:
    - exec-host: "composer install"
  post-start:
    # Install Drupal after start
    - exec: "drush site-install custom_profile -y --db-url=mysql://db:db@db/db --account-pass=admin --existing-config"
    - exec: "composer global require drupal/coder:^8.3.1"
    - exec: "composer global require dealerdirect/phpcodesniffer-composer-installer"
  post-import-db:
    # Sanitize email addresses
    - exec: "drush sqlq \"UPDATE users_field_data SET mail = concat(mail, '.test') WHERE uid > 0\""
    # Enable the environment indicator module
    - exec: "drush en -y environment_indicator"
    # Clear the cache, revert the config
    - exec: "drush cr"
    - exec: "drush cim -y"
    - exec: "drush entup -y"
    - exec: "drush cr"
    # Index content
    - exec: "drush search-api:clear"
    - exec: "drush search-api:index"

After the container is up and running, you might like to automate the installation. In some projects, that’s just the dependencies and the site installation, but sometimes you need additional steps, like theme compilation.

In a development team, you will probably have dev, stage, and live environments that you would like to routinely sync to your local environment for debugging and more. In this case, there are integrations with hosting providers, so all you need is a ddev pull and a short configuration in .ddev/import.yaml:

provider: pantheon
site: client-project
environment: test

After the files and database are in sync, everything in post-import-db will be applied, so we can drop the existing scripts we had for this purpose.

We still prefer to have a shell script wrapper in front of ddev, so we have even more freedom to tweak things and keep them automated. Most notably, ./install does a regular ddev start, which results in a fresh installation, while ./install -p saves the time of a full install if you would like to get a copy of a Pantheon environment.
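Our wrapper is project-specific, but a minimal sketch of such an ./install script could look like the following (the option handling and everything beyond the ddev start / ddev pull calls are assumptions for illustration):

#!/usr/bin/env bash
# Hypothetical wrapper around ddev.
#   ./install     fresh installation via the post-start hooks
#   ./install -p  pull the database and files from the Pantheon environment instead
set -e

PULL=0
while getopts "p" opt; do
  case "$opt" in
    p) PULL=1 ;;
  esac
done

ddev start

if [ "$PULL" -eq 1 ]; then
  # Relies on the provider settings in .ddev/import.yaml.
  ddev pull -y
fi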

For the Automated Testing

Now that the team is happy with the new tool, they might still be faced with some issues, but for us nothing was a blocker. The next step is to make sure that the CI also uses the same environment. Before doing that, you should think about whether it’s more important to try to match the production environment or to make Travis easily debuggable. If you execute realistic, browser-based tests, you might want to go with the first option and leave ddev out of the testing flow; but for us, it was desirable to spin up a local site identical to what’s inside Travis. And unlike our old custom Docker image, the maintenance of the image is now solved for us.

Here’s our shell script that spins up a Drupal site in Travis:

#!/usr/bin/env bash
set -e

# Load helper functionality.
source ci-scripts/helper_functions.sh

# -------------------------------------------------- #
# Installing ddev dependencies.
# -------------------------------------------------- #
print_message "Install Docker Compose."
sudo rm /usr/local/bin/docker-compose
curl -s -L "https://github.com/docker/compose/releases/download/1.22.0/docker-compose-$(uname -s)-$(uname -m)" > docker-compose
chmod +x docker-compose
sudo mv docker-compose /usr/local/bin

print_message "Upgrade Docker."
sudo apt -q update -y
sudo apt -q install --only-upgrade docker-ce -y

# -------------------------------------------------- #
# Installing ddev.
# -------------------------------------------------- #
print_message "Install ddev."
curl -s -L https://raw.githubusercontent.com/drud/ddev/master/scripts/install_ddev.sh | bash

# -------------------------------------------------- #
# Configuring ddev.
# -------------------------------------------------- #
print_message "Configuring ddev."
mkdir ~/.ddev
cp "$ROOT_DIR/ci-scripts/global_config.yaml" ~/.ddev/

# -------------------------------------------------- #
# Installing Profile.
# -------------------------------------------------- #
print_message "Install Drupal."
ddev auth-pantheon "$PANTHEON_KEY"

cd "$ROOT_DIR"/drupal || exit 1

if [[ -n "$TEST_WEBDRIVERIO" ]]; then
  # As we pull the DB always for WDIO, here we make sure we do not do a fresh
  # install on Travis.
  cp "$ROOT_DIR"/ci-scripts/ddev.config.travis.yaml "$ROOT_DIR"/drupal/.ddev/config.travis.yaml
  # Configures the ddev pull with Pantheon environment data.
  cp "$ROOT_DIR"/ci-scripts/ddev_import.yaml "$ROOT_DIR"/drupal/.ddev/import.yaml
fi

ddev start
check_last_command

if [[ -n "$TEST_WEBDRIVERIO" ]]; then
  ddev pull -y
fi
check_last_command

As you can see, we even rely on the hosting provider integration, but of course that’s optional. All you need to do after setting up the dependencies and the configuration is run ddev start; then you can launch tests of any kind.

All the custom bash functions above are adapted from https://github.com/Gizra/drupal-elm-starter/blob/master/ci-scripts/helper_functions.sh, and we are in the process of ironing out a starter kit for Drupal 8, needless to say, with ddev.

One key step is to make ddev non-interactive; see the global_config.yaml that the script copies:

APIVersion: v1.7.1
omit_containers: []
instrumentation_opt_in: false
last_used_version: v1.7.1

This way it does not ask about the data collection opt-in, which would break the non-interactive Travis session. If you are interested in using ddev pull as well, use encrypted environment variables to pass the machine token securely to Travis.
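A hedged example of what that looks like in .travis.yml (the secure string is whatever the Travis CLI generates, for instance via travis encrypt PANTHEON_KEY=<machine token> --add):

env:
  global:
    - secure: "encrypted-string-generated-by-the-travis-cli"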

The Icing on the Cake

ddev has a welcoming developer community. We got a quick and meaningful reaction to our first issue, and by the time of writing this blog post, we already have a merged PR to make ddev play nicely with Drupal-based webservices out of the box. Contributing to this project is definitely rewarding – there are 48 contributors and the number is growing.

The Scene of the Local Development Environments

Why ddev? Why not the more popular choices, Lando or Drupal VM? For us, the main reasons were the Pantheon integration and the pace of development. It definitely has momentum. In 2018, it was the 13th choice of local development environment amongst Drupal developers; in 2019, it is in 9th place according to the 2019 Drupal Local Development survey. This is what you sense when you try to contribute: the open and active state of the project. What’s for sure, based on the survey, is that Docker-based environments are nowadays the most popular. And with a frontend that hides all the pain of working with pure Docker/docker-compose commands, it’s clear why. Try it (again) these days - you can really forget the hassle and enjoy the benefits!


Categories: Drupal

Freelock : Layout Builders versus Content Management - are you making this mistake?

11 June 2019 - 5:27pm
By John Locke

Glitzy websites are all the rage these days. Everybody seems to be looking for ways to create multimedia-rich pages with ease. Yet there is a big downside to the current trend of page builders -- if you're not careful, you might end up making your long-term content management far harder than it should be.

Categories: Drupal

Hook 42: Drupal Core Initiative Meetings Recap - May 2019

11 June 2019 - 1:18pm
By Hook 42
Categories: Drupal

Horizontal Integration: Managing JavaScript and CSS with Config Entities

11 June 2019 - 10:57am
Every once in a while you have those special pages that require a little extra something. Some special functionality, just for that page. It could be custom styling for a marketing landing page, or a third-party form integration using JavaScript. Whatever the use case, you need to somehow sustainably manage JavaScript or CSS for those pages. Our client has some of these special pages. These are pages that live outside of the standard workflow and component library and require their own JS and CSS to pull them together. Content authors want to be able to manage these bits of…
Categories: Drupal

Kanopi Studios: How to work remotely and foster a happy, balanced life

11 June 2019 - 7:11am

Virtual. Remote. Distributed. Pick your label. This style of organization is becoming wildly popular and in demand among many agencies and organizations. It saves the cost of office space, allows for hiring the best talent possible regardless of location, can be a huge bonus to employees who require flexibility in their schedules, and saves everyone commuting time, assuming they don’t go to a shared work space. You can even wear what you want (being mindful of video chats, of course).

The flipside? While many folks have gone remote, some people find the experience quite isolating and disconnected. Does remote work make people happier? Does it make them more productive? From my experience running a remote-only agency, the answer is: not necessarily. Going for days without seeing another human in person can be extremely isolating and demotivating. And while it seems as though you’d have more time at your computer, and therefore would be more productive, often the opposite is true: it can be harder to find focused time for tasks if you are at home with multiple screens. It's even worse if you are distracted by anything at home (deliveries at your door, that laundry in the corner, etc.).

It can also be physically damaging: the human body is not designed to sit at a desk for long periods of time, and there’s less incentive to get up and move if you don’t have to move more than a few feet to your computer.

I know I’ve experienced all those issues. So I feel everyone’s pain. Literally.

The main reason Kanopi Studios exists is to support humans in every way: we support our clients by giving them great work so they can be successful online, but additionally Kanopi serves to support its employees so they are successful in both their work and home lives. We want our people to always be happy, fulfilled, and constantly evolving in a positive way. So it’s critical that we create an environment and culture that fosters practices that provide meaning, collaboration, and happiness regardless of location. It’s also critical that employees feel empowered to speak up if they are feeling the negative repercussions of remote work.

As CEO, it’s my job to give my staff the right tools and systems so that they are as happy and healthy as possible, and to create connectivity in Kanopi’s culture. Building and sustaining strong relationships requires a unique approach that makes use of a variety of tools to create the right work culture to combat the isolation.

There’s a session I give on this very topic, and the DrupalCon video is linked below. I cover how to be the best remote employee, as well as how to support your team if you are a leader of a remote team. I give key tactics to keep you (and all other staff) inspired, creative, productive and most importantly, happy! I hope you find it helpful in making your own work environment as connected and collaborative as possible, no matter where you are.

The post How to work remotely and foster a happy, balanced life appeared first on Kanopi Studios.

Categories: Drupal

Dries Buytaert: Drupal interview on Dutch business news network

11 June 2019 - 5:54am

Recently I was interviewed on RTL Z, the Dutch business news television network. In the interview, I talk about the growth and success of Drupal, and what is to come for the future of the web. Beware, the interview is in Dutch. If you speak Dutch and are subscribed to my blog (hi mom!), feel free to check it out!

Categories: Drupal

ComputerMinds.co.uk: Boost your speed with lazy images

11 June 2019 - 5:05am

Websites need to look pretty and be blazing fast. That often means lots of beautiful high-quality images, but they can be pretty enormous to download, making the page slow to load. Images are often one of the 'heaviest' parts of a website, dragging a visitor's experience down instead of brightening it up as intended. If a website feels even a tiny bit unresponsive, that tarnishes your message or brand. Most of us have sat waiting frustratedly for a website to work (especially on mobile), and given up to go elsewhere. Drupal can be configured to deliver appropriately-resized versions, but what's even better than that?

Lazy image loading

Don't send images to be downloaded at all until they're actually going to be seen! Browsers usually download everything for a page, even if it's out of sight 'below the fold'. We know we can do better than that on a modern website, with this technique called lazy image loading.

Lazily loading an image means only sending it for a user to download once they are scrolling it into view. Modern web browsers make this surprisingly simple to achieve for most images, although there are often a few that need special attention. When combined with optimisation from Kraken.io, and other responsive design tricks, performance can sky-rocket again. Check out our case study of NiquesaTravel.com for a great example using this.

Niquesa is a luxury brand for busy people, so the website experience needs to be smooth, even when used on the go over a mobile network. Perhaps more than that, SEO (search engine optimisation) is critical. Their bespoke packages need to show up well in Google searches. Google promotes websites that perform well on mobile devices - so if your site is slow, it needs to be sped up. It's not just that you'll lose out on competitive advantage and tarnish your brand: people simply won't find you.

You can see what Google thinks of your website performance by using their PageSpeed Insights tool. That gives you an overall score and lists specific improvements you can make. Niquesa asked us to boost their score, especially for mobile devices. So we looked to speed up anything slow, and to reduce the amount of things there are to download in the first place. Any website can use that approach too. Lazy image loading speeds up the initial page load, and reduces the amount to download.

This stuff should be standard on most websites nowadays. But many web projects began well before browsers supported this kind of functionality, so they still need it added in. As an ever-improving platform, the internet allows you to continually improve your site. There's no need to feel locked in to a slow site! Get in touch with us if you're interested in improving your website with lazy-loaded imagery. Who wouldn't want beautiful high-quality media and great performance on any device?

 

Can you teach me to be lazy?

Sure! Rather than using the normal src attribute to hold the image file location, use a data-src attribute. Browsers ignore that, so nothing gets downloaded. We then use the browser's Intersection Observer API to observe when the image is being scrolled into view. Our JavaScript can jump in at this point to turn that data-src attribute into a real src attribute, which means the browser will download the real image.

On its own, that wouldn't take very long to set up on most websites. But on top of this, we often go the extra mile to add some extra optimisations. These can take up the majority of the time when applying lazy loading to a website, as they are a great improvement for the user experience, but usually need crafting specifically for each individual project:

  • Images defined via style or srcset attributes (rather than a src attribute) and background images in CSS files, need similar handling. For example, use a data-style or data-srcset attribute.
  • Images that we expect to be immediately in view are excluded from any lazy loading, as it is right to show them immediately.
  • It may be important to keep a placeholder in place of the real image, perhaps either to keep a layout in place or in case javascript is not running. Styling may even need to be tweaked for those cases. Sadly it's not unusual for third-party javascript out of your control to break functionality on a page!
  • Dimensions may need some special handling, as Drupal will often output fixed widths & heights, but responsive design usually dictates that images may need to scale with browser widths. If the real image is not being shown, its aspect ratio may still need to be applied to avoid breaking some layouts.
  • Some design elements, like carousels, hide some images even when they are within the viewport. These can get their own lazy magic. One of our favourite carousel libraries, Slick, supports this with almost no extra work, but many designs or systems will need more careful bespoke attention.

Here is a basic example JavaScript implementation for Drupal:

(function($) {
  // Set up an intersection observer.
  Drupal.lazy_load_observer = new window.IntersectionObserver(function(entries) {
    for (var i in entries) {
      if (entries.hasOwnProperty(i) && entries[i].isIntersecting) {
        var $element = $(entries[i].target);
        // Take the src value from data-src.
        $element.attr('src', $element.attr('data-src'));
        // Stop observing this image now that it is sorted.
        Drupal.lazy_load_observer.unobserve(entries[i].target);
      }
    }
  }, {
    // Specify a decent margin around the visible viewport.
    rootMargin: "50% 200%"
  });

  // Get that intersection observer acting on images.
  Drupal.behaviors.lazy_load = {
    attach: function (context, settings) {
      $('img[data-src]', context).once('lazy-load').each(function() {
        Drupal.lazy_load_observer.observe(this);
      });
    }
  };
})(jQuery);

(This does not include a fallback for older browsers. The rootMargin property, which defines how close an element should be to the edge of the viewport before being acted on, might want tweaking for your design.)

Drupal constructs most image HTML tags via its image template, so a hook_preprocess_image can be added to a theme to hook in and change the src attribute to be a data-src attribute. If required, a placeholder image can be used in the src attribute there too. We tend to use a single highly-cacheable transparent 1x1 pixel lightweight image, but sometimes a scaled down version of the 'real' image is more useful.
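For example, a minimal sketch of such a preprocess hook in a custom theme might look like this (the theme name and placeholder path are hypothetical; adjust both to your project):

/**
 * Implements hook_preprocess_image().
 *
 * Sketch: move the real image location into data-src so the lazy-loading
 * JavaScript above can restore it, and point src at a tiny placeholder.
 */
function mytheme_preprocess_image(&$variables) {
  if (!empty($variables['attributes']['src'])) {
    $variables['attributes']['data-src'] = $variables['attributes']['src'];
    // Hypothetical highly-cacheable transparent 1x1 pixel placeholder.
    $variables['attributes']['src'] = '/themes/custom/mytheme/images/pixel.png';
  }
}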

The lazy loading idea can be applied to any page element, not just images. Videos are a good candidate - and I've even seen ordinary text loaded in on some webpages as you scroll further through long articles. Enjoy being lazier AND faster!

     

    Image: Private beach by Thomas

    Categories: Drupal

    heykarthikwithu: AES Encrypt & Decrypt

    11 June 2019 - 4:00am

Advanced Encryption Standard (AES), where we use “AES-256” to encrypt the data with a cipher. The encrypt & decrypt approach taken is the “Cipher Block Chaining” mode: “AES-256-CBC”.
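As a minimal sketch, AES-256-CBC encryption and decryption with PHP’s OpenSSL extension could look like the following (key derivation, sample values, and error handling are simplified for illustration and are not taken from the original post):

// Illustrative only: derive a 32-byte key and generate a random IV.
$key = hash('sha256', 'replace-with-your-secret', TRUE);
$iv  = openssl_random_pseudo_bytes(openssl_cipher_iv_length('aes-256-cbc'));

// Encrypt, then ship the IV alongside the ciphertext.
$ciphertext = openssl_encrypt('Hello Drupal', 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
$payload    = base64_encode($iv . $ciphertext);

// Decrypt by splitting the IV back off the payload.
$raw       = base64_decode($payload);
$iv_length = openssl_cipher_iv_length('aes-256-cbc');
$plaintext = openssl_decrypt(substr($raw, $iv_length), 'aes-256-cbc', $key, OPENSSL_RAW_DATA, substr($raw, 0, $iv_length));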

    heykarthikwithu Tuesday, 11 June 2019 - 16:30:04 IST
    Categories: Drupal

    AddWeb Solution: Move on to Drupal 8, Be Ready for Drupal 9!

    11 June 2019 - 1:02am

Change is the only constant, and yet what one fears the most is change. But it is rightly said about change: “Don’t be afraid of change. You may lose something good, but you may gain something better.” We’d like to say the same about the fear you hold about changing the current version of your Drupal 6/7 site to Drupal 8. Well, we also know that it’s more confusion than fear of change, since you’re stuck between two thoughts - whether to upgrade now to Drupal 8 or wait for Drupal 9. What if we told you we offer a solution that kills both birds with one stone?

     

    An Easy, Inexpensive & Drupal 9 Compatible Migration!


We have been an active Drupal community member for the past 6+ years, with 7+ Drupal projects supported, 5000+ successfully delivered international projects and 500+ international Drupal projects - out of which 100+ projects are Drupal migrations. Hence, we can help you migrate your current Drupal 6/7 site to Drupal 8, and do it in a way that means you will not have to spend a single penny migrating to Drupal 9 in the future. There’s a bunch of rational reasons to back this statement and offer of ours, which we’d like to share with you:
     

    • Change in Drupal Philosophy
      Previously, every Drupal upgrade was considered tedious and more of a technical task compared to counterpart CMS platforms. That changed when Drupal 8 was created with a philosophy of bridging the gap between the technical developer and a layman-like admin. Carrying this philosophy of positive change forward, Drupal 9 is going to bridge the upgrade gap by introducing compatibility between the older and newer versions - making the entire process effortless and inexpensive.
       

    • Upgrade-based Modules
      Compatibility between the older and newer versions of Drupal has largely depended on the modules and themes used while building the older version. Unless those modules and themes were upgraded, migration was a time-consuming, tedious task that required technical assistance. This has changed with the change in the content upgrade path, which makes migration easier if it is prepared for.
       

    • Drupal Core Deprecating Policy
      Drupal 8 is capable of introducing new APIs and features alongside the old ones. Once these new ones are launched, the old ones automatically become deprecated. Though these old APIs cannot be removed in a minor release of Drupal 8, they will be removed in the next major version, Drupal 9. Hence, if you migrate to Drupal 8 now, the migration to Drupal 9 can easily be done with just a handful of changes to make it compatible.
       

    Looking at the above three major reasons, it should be clear that migrating from Drupal 8 to Drupal 9 is far easier than migrating from Drupal 6/7 to Drupal 9. Dries Buytaert, the founder of Drupal, has also shared similar information about the planning being done for Drupal 9. According to him, Drupal 9 is basically being built in Drupal 8 instead of in a different codebase altogether. This implies that new features are added as backward-compatible code and experimental features, and once that code is stable, the old functionality is deprecated.
     

    Dries, in his blog on ‘Plan for Drupal 9’, has cited contributed module authors as one of the core reasons behind the easy migration from Drupal 8 to Drupal 9. On this, he says that module authors are already well-equipped for the upcoming technologies of Drupal 9, and hence they can work ahead of time in a manner that is Drupal 9 compatible. AddWeb, being one of these contributing members of the community, can assure you of an easy and inexpensive migration to Drupal 9 as and when it arrives.
     

    Why Vouch for Drupal 9?
    Now, after grasping all the above information regarding the upcoming major release of Drupal 9, you must be wondering what’s in Drupal 9 to vouch for. Let us throw some light on the same, to bring some clarity for you. Drupal 9 is all about eliminating the use of deprecated modules and APIs. Drupal 8, which depends on Symfony 3, will reach its end of life by November 2021. Hence, it is highly advisable to upgrade and avail the benefits of all that’s latest!
     

    Concluding Words:
    As an expert #Drupal-er and active community member, AddWeb is all set to offer you this amazing opportunity to migrate from your current Drupal 6/7 site to Drupal 8, in a way that makes the future migration to Drupal 9 super easy and inexpensive. Share your details with us here and let our Drupal Migration Experts get back to you. In case of any queries or suggestions, feel free to get in touch with us!

    Categories: Drupal

    Drupal blog: Commercial sponsorship and Open Source sustainability

    10 June 2019 - 5:15pm

    This blog has been re-posted and edited with permission from Dries Buytaert's blog.

    Recently, GitHub announced an initiative called GitHub Sponsors where open source software users can pay contributors for their work directly within GitHub.

    There has been quite a bit of debate about whether initiatives like this are good or bad for Open Source.

    On the one hand, there is the concern that the commercialization of Open Source could corrupt Open Source communities or harm contributors' intrinsic motivation and quest for purpose.

    On the other hand, there is the recognition that commercial sponsorship is often a necessary condition for Open Source sustainability. Many communities have found that to support their growth, as a part of their natural evolution, they need to pay developers or embrace corporate sponsors.

    Personally, I believe initiatives like GitHub Sponsors, and others like Open Collective, are a good thing.

    It helps not only with the long-term sustainability of Open Source communities, but also improves diversity in Open Source. Underrepresented groups, in particular, don't always have the privilege of free time to contribute to Open Source outside of work hours. Most software developers have to focus on making a living before they can focus on self-actualization. Without funding, Open Source communities risk losing or excluding valuable talent.

    Categories: Drupal

    Cheeky Monkey Media: There is no reason to wait until Drupal 9

    10 June 2019 - 2:48pm
By cody

    Why?

    Because instead of building a radically new version of Drupal in a separate codebase, Drupal 9 is being built in Drupal 8.

    You might be thinking… “Huh?!”

Well, what this means is that the upgrade experience will be as smooth as a baby’s bottom.

    Drupal 9 will essentially be just like another minor core update in Drupal 8. 

What is a minor core update? Quite simply, it’s the middle number in the version of Drupal you are running (the 7 in Drupal 8.7, for example).

    Core updates come out roughly every 6 months and keeping your site up-to-date with these is critical in making sure it’s well maintained.

    Categories: Drupal

    Hook 42: Field Notes: Drupal + Kubernetes with Lagoon

    10 June 2019 - 1:44pm
By Lindsey Gemmill
    Categories: Drupal

    Lullabot: Behind the Screens: Behind the Screens with Hussain Abbas

    10 June 2019 - 12:00am

    Bangalore to Seattle is no short trip, but Hussain Abbas made the journey, stopping at many Drupal camps along the way. He tells us why DrupalCon is so important, and where to find the best biryani.

    Categories: Drupal

    OpenSense Labs: Decoupled Drupal: Cornerstone of digital experiences

    9 June 2019 - 6:46pm
By Shankar

    The audience revels in the magnificent performances of the actors, picturesque visuals, breathtaking action sequences, an alluring background score, thoughtful dialogue, and the emotions attached to the narrative. To bring all of this to the screen in the best possible way, there is exceptional direction and screenplay behind the scenes, in addition to a massive swathe of people involved in different parts of the film. Apparently, a film works wonders when both the onscreen elements and the off-screen elements strike the right chord.


    A similar theory is of paramount significance in the case of web development. The rapid evolution of diverse end-user clients and applications has resulted in a plethora of digital channels to support. Monolithic architecture-powered websites leverage web content management solutions for disseminating content via a templating solution tightly coupled with the content management system on the backend. Propelled by the need to distribute content-rich digital interactions, application development and delivery (AD&D) professionals who support content management systems (CMS) are showing an inclination towards an API-first approach.
     
    Headless CMSes have been leading the way forward to provide a spectacular digital experience, and Drupal, being API-first, is a quintessential solution for implementing a headless architecture. Before we move forward, let’s briefly look at how significant content is for your online presence and how the headless CMS is fulfilling the needs of organisations.

    Content: Linchpin of ambitious digital experience

    It is difficult to envisage a digital screen without content as every single moment that we spend on a smartphone, laptop, tablet, or a smartwatch is enriched with digital content like images, text, video, product reviews and so on. Even when we talk to a voice assistant and inquire about something, its answers constitute words, links, pictures, maps etc. (again, it’s all content). The relevance quotient of that content should be top-of-the-line as it is the medium that enables users to experience their digital interactions. This makes content the linchpin of ambitious digital experiences.

    The relevance quotient of content should be top-of-the-line as it is the medium that enables users to experience their digital interactions

    Several content repositories are struggling to meet today’s digital requirements. When the world was just web and email, governance of dynamic content dissemination worked perfectly fine using a web CMS. A web CMS has been an astronomical solution for offering unique designs, WYSIWYG authoring, a workflow for approvals and translation, fantastic marketing capabilities and internet-scale delivery.

    Forrester’s The rise of the headless content management system report states that web CMSes have led to a cluster of content with markup and metadata. Moreover, if you optimise your content repository for HTML templates, you would have to undo all those optimisations in order to use the content elsewhere in a non-HTML format. Also, tightly coupled approaches did not need APIs (Application Programming Interfaces) connecting the repository to the delivery tier or the content editing and workflow tools. And selling the content repository and delivery environment together is great for web-only scenarios, but reusing the content in a mobile application or in email marketing would still require you to run the entire web CMS stack.

    This is where the need for a headless CMS kicks in. It uses modern storage, stateless interfaces and cloud infrastructure for the efficacious delivery of Internet-scale content experiences on any device.

    Uncloaking headless CMS

    Source: Forrester

    A headless CMS is a content component in a digital experience architecture that interacts with other components - authoring, delivery front ends, analytics tools - through loosely coupled APIs. It does not do any rendering of content, and the rendering is decoupled from the management interface, which is why the terms ‘headless’ and ‘decoupled’ are used interchangeably.

    Headless CMS stores the content, offers a user interface for the creation and management of content, and provides a mechanism for accessing the content through REST APIs as JSON

    While ‘head’ refers to the frontend rendering or the presentation of the content, the ‘body’ refers to the backend storage and the governance of the content.

    Headless CMS stores the content and offers a user interface for the creation and management of content. It provides a mechanism for accessing the content through REST APIs as JSON. So, it is also referred to as API-first CMS.

    Content can be delivered to and integrated with a third-party system like an e-commerce tool. Or it can be delivered to and exhibited using front-end technology in the browser, a mobile app or a syndication service. Headless CMS is a wonderful content-as-a-service solution.

    Source: Contentstack

    A traditional CMS handles the creation of content, its dissemination and its display together. It has a backend where users enter content, which is stored in a database, retrieved, rendered into HTML on the server and then delivered as fully rendered pages to the browser.

    In contrast, headless CMS decouples the rendering and presentation system thereby enabling you to replace it with frontend or other technologies of your choice. The CMS will be a content store and web application for the content producers and the content is delivered to the frontend or another system through an API.

    With the stupendous rise of headless architectures, a portion of the web is turning server-centric for data and client-centric for presentation. This has given momentum to the ascension of JavaScript frameworks, and on the server side it has led to the growth of JSON:API and GraphQL for better serving JavaScript applications with content and data. Among the different web services implementations like REST, JSON:API and GraphQL, when we consider request efficiency, JSON:API is the better option, as a single request is usually sufficient for most needs. JSON:API is also great for operational simplicity and works well when writing data.
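    As a rough illustration of that request efficiency (the image field name below is hypothetical), a single JSON:API request against Drupal can return a set of articles together with their related author and image entities by using the include parameter:

    GET /jsonapi/node/article?include=uid,field_image&page[limit]=10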

    Headless CMS decouples the rendering and presentation system thereby enabling you to replace it with frontend or other technologies of your choice

    Headless CMS is advantageous for the following reasons:

    • You can instantly start with headless with no hurdles.
    • It does not require you to alter your existing delivery tier as it seamlessly fits into the existing architecture
    • It is perfect for building web and mobile applications, as it allows practically any application - be it web, mobile, IoT (Internet of Things), smart TV or touchscreens - to pull and push content.
    • Frontend developers, backend developers, marketing and content editors can get started quickly and work autonomously.
    • You can give more power to the front-end developers, as they simply work with content APIs and do not have to learn the inner workings of the CMS or its templating system.
    • It follows the approach of ‘Create Once, Publish Everywhere’ thereby allowing you to reuse content for different channels.
    • It works tremendously well in a microservices environment and enables cross-functional teams to work via agile processes and get tasks done swiftly.
    Going the Drupal way

    Call it headless or decoupled, leveraging Drupal, as the central content service, is a magnificent solution to power your complete application and device ecosystem. Decoupled Drupal has the provision for omnichannel delivery of content that is quintessential for marketers and publishers.

    Decoupled Drupal has the provision for omnichannel delivery of content that is quintessential for marketers and publishers

    It enables the developer to leverage any technology for rendering the frontend experience instead of theming and presentation layers in Drupal. The Drupal backend exposes content to native applications, JavaScript application, IoT devices and other such systems. In addition to the modules for web service implementations like REST, GraphQL and JSON:API, Decoupled Drupal ecosystem also offers several other alternative modules that can be of huge help.

    Source: Dries Buytaert’s blog

    There are different approaches to decouple Drupal:

    Coupled Drupal

    In traditional Drupal, also referred to as coupled Drupal, a monolithic implementation is used in which Drupal has authority over both the frontend and the backend of your web application setup. Coupled Drupal is fantastic for content creators, especially when you are in dire need of achieving fast time to market without relying too much on front-end developers. Developers who love Drupal 8 and want it to own the entire stack still find it a great way of building a web application.

    Progressively decoupled Drupal

    Another way to utilise the power of Drupal is the progressively decoupled approach. It is a compelling approach for developing Drupal’s frontend, where the governance of contiguous experiences is handled by content editors, site assemblers and front-end developers. While content authors and site assemblers get the benefits of contextualised interfaces, content workflow, site preview and so on, remaining usable and integrated with Drupal as a whole, a portion of the page is dedicated to a JavaScript framework so front-end developers can work autonomously. Progressive decoupling helps in utilising Drupal’s rendering system while simultaneously using a JavaScript framework to power the client-side interactivity.

    Fully decoupled Drupal

    In fully decoupled Drupal, there is a complete separation between Drupal’s frontend and the backend. The Twig theme layer is replaced with a different frontend entirely. Native mobile or desktop applications, JavaScript single-page applications or IoT applications are some of the examples. RESTful API is leveraged by these applications to communicate with Drupal. RESTful API, which acts as a middle layer between frontend and backend, exposes resources as JSON or XML that can be queried or modified with the help of HTTP methods like GET, POST etc. Even though integral features like in-place editing and layout management are not available, the fully decoupled approach is preferred by developers as it offers ultimate authority over the frontend and is superb for those who are already experienced with the development of applications in frameworks like React, Vue etc.

    Increasing intricacy of JavaScript development has given birth to JAMstack (JavaScript, APIs, Markup) which has, in turn, resulted in another very much favoured approach called fully decoupled static sites. Enhanced performance, security and reduced complication for developers have made static sites a favourite option among many developers. For instance, Gatsby, a static site generator, can retrieve content from Drupal, generate a static site, and deploy it to a content delivery network (CDN) via specialised cloud provider like Netlify.

    Meritorious features of decoupled Drupal

    Following are some of the major benefits of decoupled Drupal:

    • Syndication of content: Whether it is a coupled approach or a decoupled approach, Drupal remains the hub while developing experience ecosystems with all of them ingesting content from one source of truth.
    • Full separation: Even though the monolithic and progressively decoupled approaches in Drupal have an implicit separation of concerns that is mostly invisible to the user, a fully decoupled architecture gives you an explicit separation between structured content that is governed by Drupal and its presentation, which is managed by consumer applications.
    • User experience: Decoupled architecture offers an amazing user-centred experience. For instance, a JavaScript framework can be more suited to the task when it comes to an interactive application which is in dire need of frequent re-renderings of content.
    • Work in parallel: Decoupling also brings efficacy to a pipelined development process which involves teams working in parallel. A team of front-end developers can develop applications against a dummy web service API that is used only for testing and not yet complete, whereas the backend team can administer the backend that exposes the API and the underlying processes behind it.
    Challenges of Decoupled Drupal

    Some of the major hurdles while decoupling Drupal are mentioned below:

    • Editing and governance: Drupal 8’s wonderful features like in-place editing, configuration menus constituting certain page components, and some modules that include contextualised tools for Drupal governance won’t be available.
    • Security: Although JavaScript and application frameworks have provisions for defending against cross-site scripting attacks, fully decoupled and progressively decoupled approaches put the obligation of carefully scrutinising the security implications on your team.
    • Point of failure: A fully decoupled architecture requires the use of stacks like MERN (MongoDB, Express, React, NodeJS) or MEAN (Angular instead of React), or other solutions that may be imperative for native mobile or IoT applications. That means it can be challenging to introduce an additional hosting stack into your firm’s infrastructure, and it can lead to an additional point of failure.
    • Layout management: Having to forgo modules like Panels and Display Suite can be an issue for developers, creating obstacles for marketing teams that do not have access to developers who can help implement layout changes.
    • Previews: It can be challenging if your editorial team wants a previewable content workflow as it is used to working with coupled CMS.
    • Notifications: In a fully decoupled architecture, Drupal system messages, which are frequently highlighted at the top of rendered pages, are not accessible. However, providing these messages in a progressively decoupled setup is not much of an issue.
    • Performance: The BigPipe module works tremendously well at enhancing web performance in Drupal, to the point where it can match the page-load performance of JavaScript applications. A fully decoupled architecture is devoid of this feature, but a progressively decoupled setup gives you the option of leveraging it.
    • Accessibility: Drupal won’t be providing ready-made frontend code or a roster of core interface components and interactions that can be relied upon, which means front-end developers have to build a suitable UX and ensure accessibility without the assistance of Drupal.
    Strategies employed while choosing decoupled Drupal

    Assessment of the organisational needs is instrumental to the decision-making process. Being abreast of the business requirements pertaining to building a robust digital presence helps you in forming an immaculate strategy while choosing decoupled Drupal.

    For instance, selecting decoupled Drupal might or might not be an astounding option for developing a single standalone website. It depends upon the functionalities that are deemed “really necessary” by your developers and content editors. In case you are developing multiple web experiences, a decoupled Drupal instance can either be leveraged as a content repository that is devoid of its public-facing frontend, or simply as a traditional site that concurrently acts as a content repository. It, again, depends upon how dynamic you want your web application to be, which would ultimately help in deciding on a JavaScript framework of choice or even a static site generator.

    Developing native mobile or IoT applications may require you to adopt a decoupled approach where you can expose web service APIs and consume that Drupal site as a central content service which is bereft of its own public-facing frontend.

    The significant thing to note here is the stupendous capability of Drupal to support almost any given use case, as it streamlines the process of developing decoupled Drupal.

    Case studies

    Some of the biggest names in different industries have chosen decoupled Drupal to power their digital presence.

    The Economist

    Established in 1843, The Economist, which set out to take part in “a severe contest between intelligence, which presses forward, and an unworthy, timid ignorance obstructing our progress”, has seen staggering growth over the years and has earned great recognition in the world of media. It chose decoupled architecture for building a Drupal backend for the native iOS and Android Espresso applications with the help of a digital agency.


    Drupal turned out to be an astronomical solution for the Economist editorial team. They could design iteratively and had a spectacular content creation and publishing workflow that met their requirements. It helped in incorporating features like automatic issue creation, content approval and the look and feel of interfaces, among others.

    The Drupal content creation interface was customised in a way that avoids formatting errors and enables content authors to focus on content. Editorial teams had a dashboard that helped in swiftly and efficaciously creating and publishing new issues. It also offered visual indicators of approval status, countdown timers for each region and quick links to all the articles.

    Produce Market Guide

    The website of Produce Market Guide (PMG), a resource for produce commodity information, fresh trends and data analysis, was rebuilt by OpenSense Labs. It involved interpolating a JavaScript framework into the Drupal frontend using progressively decoupled Drupal, which helped in creating a balance between the workflows of developers and content editors. The rebuilding process mainly comprised a progressively decoupled approach, React, and the Elasticsearch Connector module, among others.


    The process of mapping and indexing on the Elastic server required the ElasticSearch Connector and Search API modules. Building the Elastic backend architecture was followed by the development of a faceted search application with React and the integration of the app in Drupal as a block or template page. The project structure for the search was designed and built in a sandbox with modern tools like Babel and Webpack and third-party libraries like Searchkit.

    Moreover, Logstash and Kibana, which are based on Elasticsearch, were incorporated on the Elastic server, thereby helping in collecting, parsing, storing and visualising the data. The app in the sandbox was developed for production, and all the CSS/JS was incorporated inside Drupal as a block to make it a progressively decoupled feature. Following the principles of Agile and Scrum helped in building a user-friendly site for PMG with a search application that loads search results rapidly.

    Princess Cruises

    As one of the premiere cruise lines in the world, Princess Cruises innovatively metamorphosed their marketing landscape with the integration of decoupled Drupal. They went on to fundamentally change the way their guests accessed information while onboard their ships.


    The guests on their ships relied upon their smartphones to swiftly access information, purchase items and inform the management about anything. This led to the development of Princess@Sea, with the objective of transforming the Princess experience. It is a mobile application specifically designed to allow guests to plan their day, check the ship’s itinerary, scan through restaurant menus and book shore excursions on the go.

    With ships sailing in different parts of the world, the digital experience had to be reliable, which called for a centralised way of administering content across several channels and touchpoints. This would enable them to offer a uniform experience on mobile and on digital signage onboard the ship. Decoupled Drupal was chosen to serve content across multiple touchpoints and channels. Princess Cruises could create content once and publish it everywhere, thereby connecting every passenger to Princess@Sea - hence Drupal.

    NASA

    NASA, an independent agency of the executive branch of the federal government of the United States, went for a decoupled setup for the redevelopment of their site with the help of an agency. Drupal and Amazon Web Services (AWS) turned out to be a wonderful match for meeting the content needs of both NASA and the public, with user-driven APIs, dynamic host provisioning, scalability and security.


    NASA’s website is deployed across numerous AWS availability zones and supports almost 500 content editors updating over 2,000 pieces of content every day. On average, it receives nearly a million page views a day and has even gone on to handle a peak load of approximately 40,000,000 page views in a single day, with a groundbreaking feat of 2,000,000+ simultaneous users during NASA’s 2017 Total Solar Eclipse coverage.

    Conclusion

    Application development and delivery teams have already started exploring headless CMS tools along with numerous other sets of API-first microservices for building innovative solutions. These digital natives are adopting a do-it-yourself approach to digital experience architectures and dragging their organisations into the digital-first age.

    Headless throws open interesting possibilities and challenges traditional ways of doing things. For a lot of organisations, it is no longer a question of whether they should go headless or not, but more a matter of assessing where headless fits in their organisational setup. Moreover, the growth of microservices architecture will continue to give that extra push to headless or decoupled approaches.

    Decoupled Drupal is an outstanding solution for implementing headless architecture. It acts as a central hub, processing and curating content and data from other tools and services while simultaneously sharing its own content and data via APIs. With the stupendous flexibility, scalability and content authoring capabilities of headless approaches, digital firms can enjoy seamless creativity and innovation as they build their digital architectures.

    We have been perpetually working towards providing great digital experiences with our suite of services.

    Contact us at hello@opensenselabs.com to get the best out of decoupled Drupal and ingrain your digital presence with its superb capabilities.

    Categories: Drupal

    DrupalEasy: Drupal 8 and Composer - working with cloned dependencies

    7 June 2019 - 11:59pm

    If you use the Drupal Composer Drupal Project template for managing your Drupal 8 site’s codebase, and you commit dependencies to your Git repository, then you’ve probably run into issues involving cloned dependencies. Sometimes when requiring a dependency via Composer, you end up with a cloned version (which includes a .git directory) instead of a release version. 

    If you’re committing dependencies to your repository, then the .git directories associated with cloned dependencies cause an issue when you try to commit. A common resolution is to remove the .git directory from the dependency’s directory.

    While this solves the immediate issue, the next time you go to update Drupal core, you’ll likely see an error message along the lines of, “The .git directory is missing from /var/www/html/vendor/some/dependency, see https://getcomposer.org/commit-deps for more information”. How can we get past this?

    Here’s my workflow:

    1. Delete the entire /vendor/ directory.
    2. Run “composer install” to reinstall all dependencies. 
    3. Update Drupal core (normally with “composer update drupal/core webflo/drupal-core-require-dev "symfony/*" --with-dependencies”).
    4. Re-remove any .git directories for cloned dependencies.
    5. Commit the update.
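    In shell terms, the same workflow might look roughly like the commands below (the find path and commit message are illustrative; adjust them to your project layout, and remember that contributed modules cloned under the docroot may need the same .git cleanup):

    rm -rf vendor/
    composer install
    composer update drupal/core webflo/drupal-core-require-dev "symfony/*" --with-dependencies
    # Remove .git directories from any cloned dependencies before committing.
    find vendor -type d -name .git -prune -exec rm -rf {} +
    git add -A
    git commit -m "Update Drupal core and dependencies"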

    Ultimately the "proper" solution will be to not commit dependencies to the project repository. I agree that this is the best solution, but not everyone’s workflow currently supports this.

    There's also a great discussion in the Drupal Composer Drupal Project issue queue about alternate methods to deal with this issue. 

    Have a different workflow to deal with cloned dependencies? Share it in a comment below!

    Just getting started with managing your Drupal 8 project with Composer? Jeff Geerling has some super-helpful blog posts.

    Categories: Drupal

    OSTraining: Directly Upload and Link Files to the Text Editor Content in Drupal 8

    7 June 2019 - 5:54pm

    One of Drupal’s big advantages is the ability to structure content with the use of fields. However, from time to time you will want to link a file to your content without adding a field to the database for that purpose.

    The D8 Editor File Upload module provides this functionality by adding a button to the toolbar of the rich text editor (in this case CKEditor). This way it is possible to upload a file and present it within the content as a link. This tutorial will explain the usage of this module.

    Let’s start!

    Categories: Drupal
