Drupal

Plan for Drupal 9

Dries Buytaert - 12 December 2018 - 5:13am

At Drupal Europe, I announced that Drupal 9 will be released in 2020. Although I explained why we plan to release in 2020, I wasn't very specific about when we plan to release Drupal 9 in 2020. Given that 2020 is less than thirteen months away (gasp!), it's time to be more specific.

Shifting Drupal's six-month release cycle

We shifted Drupal 8's minor release windows so we can adopt Symfony's releases faster.

Before I talk about the Drupal 9 release date, I want to explain another change we made, which has a minor impact on the Drupal 9 release date.

As announced over two years ago, Drupal 8 adopted a 6-month release cycle (two releases a year). Symfony, a PHP framework that Drupal depends on, uses a similar release schedule. Unfortunately, Drupal's releases have historically fallen 1-2 months before Symfony's, which forced us to wait six months to adopt the latest Symfony release. To be able to adopt the latest Symfony releases faster, we are moving Drupal's minor releases to June and December. This will allow us to adopt the latest Symfony releases within one month. For example, Drupal 8.8.0 is now scheduled for December 2019.

We hope to release Drupal 9 on June 3, 2020

Drupal 8's biggest dependency is Symfony 3, which has an end-of-life date in November 2021. This means that after November 2021, security bugs in Symfony 3 will not get fixed. Therefore, we have to end-of-life Drupal 8 no later than November 2021. Or put differently, by November 2021, everyone should be on Drupal 9.

Working backwards from November 2021, we'd like to give site owners at least one year to upgrade from Drupal 8 to Drupal 9. While we could release Drupal 9 in December 2020, we decided it was better to try to release Drupal 9 on June 3, 2020. This gives site owners 18 months to upgrade. Plus, it also gives the Drupal core contributors an extra buffer in case we can't finish Drupal 9 in time for a summer release.

Planned Drupal 8 and 9 minor release dates.

We are building Drupal 9 in Drupal 8

Instead of working on Drupal 9 in a separate codebase, we are building Drupal 9 in Drupal 8. This means that we are adding new functionality as backwards-compatible code and experimental features. Once the code becomes stable, we deprecate any old functionality.
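
To make the pattern concrete, here is roughly what a Drupal 8 deprecation looks like. The module, function and service names below are hypothetical; only the @deprecated annotation and the @trigger_error() call follow core's actual convention:

/**
 * Returns a formatted label for an item.
 *
 * @deprecated in drupal:8.8.0 and is removed from drupal:9.0.0.
 *   Use \Drupal\mymodule\LabelFormatter::format() instead.
 */
function mymodule_format_label($item) {
  // Emits a deprecation notice that logs and static analysis tools can
  // surface, without breaking existing callers.
  @trigger_error('mymodule_format_label() is deprecated in drupal:8.8.0 and is removed from drupal:9.0.0. Use \Drupal\mymodule\LabelFormatter::format() instead.', E_USER_DEPRECATED);
  return \Drupal::service('mymodule.label_formatter')->format($item);
}

Releasing Drupal 9 is then largely a matter of deleting functions like this one, while the replacement APIs carry forward unchanged.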

Let's look at an example. As mentioned, Drupal 8 currently depends on Symfony 3. Our plan is to release Drupal 9 with Symfony 4 or 5. Symfony 5's release is less than one year away, while Symfony 4 was released a year ago. Ideally Drupal 9 would ship with Symfony 5, both for the latest Symfony improvements and for longer support. However, Symfony 5 hasn't been released yet, so we don't know the scope of its changes, and we will have limited time to try to adopt it before Symfony 3's end-of-life.

We are currently working on making it possible to run Drupal 8 with Symfony 4 (without requiring it). Supporting Symfony 4 is a valuable stepping stone to Symfony 5: it brings new capabilities to sites that choose to use it, and it reduces the amount of Symfony 5 upgrade work for Drupal core developers. In the end, our goal is for Drupal 8 to work with Symfony 3, 4 or 5 so we can identify and fix any issues before we start requiring Symfony 4 or 5 in Drupal 9.

Another example is our support for reusable media. Drupal 8.0.0 launched without a media library. We are currently working on adding a media library to Drupal 8 so content authors can select pre-existing media from a library and easily embed them in their posts. Once the media library becomes stable, we can deprecate the use of the old file upload functionality and make the new media library the default experience.

The upgrade to Drupal 9 will be easy

Because we are building Drupal 9 in Drupal 8, the technology in Drupal 9 will have been battle-tested in Drupal 8.

For Drupal core contributors, this means that we have a limited set of tasks to do in Drupal 9 itself before we can release it. Releasing Drupal 9 will only depend on removing deprecated functionality and upgrading Drupal's dependencies, such as Symfony. This will make the release timing more predictable and the release quality more robust.

For contributed module authors, it means they already have the new technology at their service, so they can work on Drupal 9 compatibility earlier (e.g. they can start updating their media modules to use the new media library before Drupal 9 is released). Finally, their Drupal 8 know-how will remain highly relevant in Drupal 9, as there will not be a dramatic change in how Drupal is built.

But most importantly, for Drupal site owners, this means that it should be much easier to upgrade to Drupal 9 than it was to upgrade to Drupal 8. Drupal 9 will simply be the last version of Drupal 8, with its deprecations removed. This means we will not introduce new, backwards-compatibility breaking APIs or features in Drupal 9 except for our dependency updates. As long as modules and themes stay up-to-date with the latest Drupal 8 APIs, the upgrade to Drupal 9 should be easy. Therefore, we believe that a 12- to 18-month upgrade period should suffice.

So what is the big deal about Drupal 9, then?

The big deal about Drupal 9 is … that it should not be a big deal. The best way to be ready for Drupal 9 is to keep up with Drupal 8 updates. Make sure you are not using deprecated modules and APIs, and where possible, use the latest versions of dependencies. If you do that, your upgrade experience will be smooth, and that is a big deal for us.

Special thanks to Gábor Hojtsy (Acquia), Angie Byron (Acquia), xjm (Acquia), and catch for their input in this blog post.

Categories: Drupal

SPARQL Entity Storage

New Drupal Modules - 12 December 2018 - 4:14am

Provides a SPARQL entity storage backend and entity query support for Drupal entities. ToBeExpanded...

Categories: Drupal

OpenSense Labs: Build HIPAA-compliant Website with Drupal

Planet Drupal - 12 December 2018 - 2:47am
Shankar - Wed, 12/12/2018 - 16:17

Healthcare. When you hear this word, you get a feeling of ‘care’ and ‘improvement’, because that is what the healthcare industry does: it cares for people and works to improve their health. And today's growing number of healthcare providers, payers and IT professionals need a HIPAA-compliant website that also ‘cares’ about processing, storing and transmitting protected health information.


Open source software aligns tremendously well with the requirements of the healthcare sector. Drupal, an open source content management framework, aligns very well with Health IT interoperability and is great for commercial applications at the enterprise level.

What is HIPAA compliance?

The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is legislation that was enacted to make it easier for US workers to retain healthcare insurance coverage when they change or lose their jobs.

The Health Insurance Portability and Accountability Act of 1996 is legislation that sets the standard for the protection of sensitive patient data

It sets the standard for the protection of sensitive patient data, and any organisation that handles protected health information (PHI) has to make sure that all the required physical, network and process security measures are adhered to. According to Amazon Web Services, PHI includes a very wide set of personally identifiable health and health-related data, including insurance and billing information, diagnosis data, clinical care data, and lab results such as images and test results.

The act also encourages the use of electronic health records (EHR) to improve the efficiency and quality of the US healthcare system through better information sharing.

It applies to covered entities (CEs), meaning anyone who provides treatment, payment or operations in healthcare, and to business associates, meaning anyone who can access patient information and supports treatment, payment or operations. Subcontractors, i.e. business associates of business associates, must also be in compliance.

How to make your website HIPAA compliant?

To make sure that your website is HIPAA-compliant, start by establishing new processes. Make sure that PHI is only accessible to authorised personnel, and establish processes for deleting, backing up and restoring PHI as needed. Emails containing PHI should be sent in an encrypted and secure manner.

Moreover, it is of great significance that you partner with web hosting companies that are themselves HIPAA-compliant and have processes for protecting PHI. Also, sign a business associate agreement with any third party that has access to your patients' PHI.

It is of paramount importance to buy and implement an SSL certificate for your website and to ensure that all web forms on your site are encrypted and safe.

Use Case: Drupal ensures HIPAA compliance

Drupal, one of the most security-focussed CMSs, comes with strong database encryption mechanisms. For high-security applications, Drupal can be configured for whole-database encryption. When whole-database encryption is not desirable, fine-grained control is available for safeguarding more specific information: user accounts, particular forms, and even the values of particular fields can be encrypted in an otherwise plaintext database.

Drupal's encryption system is configurable to adhere to the norms of PCI, HIPAA and state privacy laws, including offsite encryption key management

Drupal's encryption system is configurable to adhere to the norms of the Payment Card Industry (PCI), HIPAA and state privacy laws, including offsite encryption key management. Drupal is also a spectacular solution for an enterprise-grade healthcare system because it can be extended: it is possible to leverage Drupal as a content dissemination network, an intranet, or even as a way to incorporate several systems within a single platform.

Integrating Drupal's data layer with an electronic medical record (EMR) system via a RESTful API connection can dramatically enhance interoperability, unlocking important data from proprietary systems and their data silos.


Proprietary EMR systems are astronomically priced, with top-of-the-line standardised approaches and the ability to organise and store enormous amounts of data. But they are not great for customisation or interoperability. The dearth of interoperability in high-priced, single-vendor solutions, and the hurdles they present in an integrated healthcare delivery setting, result in HIPAA non-compliance. Even when files are electronic, the difficulty of moving them freely leads to frequent violations such as missing legal authorisation or unencrypted emails.

Drupal and EMR integration is an excellent example of a Drupal-powered healthcare technology that empowers staff to leverage critical and potentially life-saving data. By breaking down data silos, healthcare delivery systems can evolve from a reactive diagnostic model to a proactive preventative model.

Drupal can be layered on top of multiple EMR systems within a medical group, and the information can be compiled into one physician portal. The integration of Drupal and EMR systems can be achieved through numerous channels: API calls, XML or JSON feeds, and RESTful APIs.
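
As a rough sketch of what such an integration can look like in code, Drupal's HTTP client can pull JSON from an EMR service and map it onto Drupal entities. The endpoint URL, token storage, content type and field names here are assumptions for illustration, not any particular EMR's API:

$client = \Drupal::httpClient();
$token = \Drupal::config('emr_sync.settings')->get('api_token');

try {
  $response = $client->get('https://emr.example.com/api/v1/appointments', [
    'headers' => ['Authorization' => 'Bearer ' . $token],
  ]);
  $appointments = json_decode((string) $response->getBody(), TRUE);

  foreach ($appointments as $appointment) {
    // Map each EMR record onto a node of a custom "appointment" type,
    // which stays behind Drupal's role-based access controls.
    \Drupal\node\Entity\Node::create([
      'type' => 'appointment',
      'title' => $appointment['title'],
    ])->save();
  }
}
catch (\GuzzleHttp\Exception\RequestException $e) {
  \Drupal::logger('emr_sync')->error($e->getMessage());
}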

Drupal offers granular user access control; that is, it gives site administrators complete authority over who can see and who can modify different parts of a site. It operates on a system of extensible user roles and access permissions. Thus, with the help of role-based provisioning, Drupal can keep critical data protected behind a firewall in a HIPAA-secure environment.
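
As a minimal sketch of how that looks in a custom module (the module, content type and permission names are hypothetical), a node access hook can restrict a record type to users holding a specific permission:

use Drupal\Core\Access\AccessResult;
use Drupal\Core\Session\AccountInterface;
use Drupal\node\NodeInterface;

/**
 * Implements hook_node_access().
 */
function mymodule_node_access(NodeInterface $node, $op, AccountInterface $account) {
  // Only users with the "view patient records" permission may view
  // nodes of the patient_record content type.
  if ($node->bundle() === 'patient_record' && $op === 'view') {
    return AccessResult::allowedIfHasPermission($account, 'view patient records');
  }
  return AccessResult::neutral();
}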

Drupal can also be configured to query the database via web services integration on the basis of specific EHR authorisation requirements, and it does so while adhering to the user access permission controls. Therefore, the data remains safe and secure at all times.

Conclusion

Drupal is a magnificent solution for enterprises in the healthcare sector to help them process, store and transmit protected health information.

We have been steadfast in our goals to deliver a great digital experience with our expertise in Drupal development.

Contact us at hello@opensenselabs.com to build a HIPAA-compliant Drupal website.

Categories: Drupal

ComputerMinds.co.uk: Security risks as Drupal matures

Planet Drupal - 12 December 2018 - 1:59am

After reading this from Ars Technica, which describes how a developer offered to 'help' the maintainer of an NPM module - and then slowly introduced malicious code to it - I can't help but wonder if the Drupal community is vulnerable to the exact same issue. Let's discuss!

Please, don't touch my package

NPM modules have been hacked before, and it's not pretty when it happens. Because of the way we use packages, it's a lot easier for nasty code to get sucked into a LOT of applications before anyone notices. Attacks on the code 'supply chain', therefore, have tended to be high-profile and high-damage.

NPM is used as a source for a huge number of code projects, many of which use other bits of code from other NPM packages. Even moderately sized applications for PC, mobile or web can have hundreds or thousands of NPM packages pulled in. It's common for packages to depend on other packages, which depend on other packages, which need other packages, which require... you get the picture? There are so many fragments, layers and extra bits that NPM is used for that the developers of an application don't necessarily know all the packages being pulled into it. It's so easy to just type "npm install somefancypackageineed" without thinking and without vetting. NPM will just go and get everything for you, and you don't need to care.

That's how it should be, right? We should be able to just add code and know that it's safe, right? In a perfect world, that would be fine. But in reality there's an increasingly large amount of trust being given when you add a package to your application, and developers don't realise it. It's events like this that are making people aware again that they are including code in their projects that they either do not scrutinise or do not know exists.

Drupal's moment will come

Fortunately, Drupal is a little different to NPM. Whilst modules are often dependent on other modules, we tend to have far fewer layers going on. It's much easier to know what modules and dependencies you're adding when you include a new module. But that doesn't mean we're immune.

This particular incident came about when a tired, busy module maintainer was approached and offered help. It's a classic social engineering hack.

"Sure, I'll help you! [mwahaha]"

What struck me was that Drupal probably has hundreds of module maintainers in similar circumstances. Put yourself in those shoes, for a moment:
- You maintain an old Drupal 7 module
- It has a few thousand sites using it still
- You're busy, don't have time for it anymore

If somebody offered to sort it all out for you, what would you say? I'm pretty sure most would be ecstatic! Hurrah! But how would you vet your new favourite person in the whole world, before making them a co-maintainer and giving them the keys to the kingdom?

Alternatively, what of this:
- There is an old module, officially unmaintained
- It still has users
- The maintainer cannot be contacted

Drupal has a system for allowing people to be made maintainers of modules, when the original maintainer cannot be contacted. How are these people vetted? I'm sure there's some sort of check, but what if it's not enough?

In particular, I want to point out that as Drupal 7 ages, there will be more and more old, unmaintained and unloved modules still used by thousands of sites. If we forget them and fail to offer them sufficient protection, they will become vulnerable to attacks just like this. Drupal's moment will come.

This is an open source issue

It would be all too easy to run away screaming right now, having decided that open source technologies sound too dangerous. So I'll put in some positive notes!

That Drupal should be increasingly exposed to the possibility of social engineering and malevolent maintainers is no new issue. There are millions of open source projects out there, all exposed to exactly these issues. As the internet grows and matures and ages, these issues will become more and more common; how many projects out there have tired and busy maintainers?!

For now, though, it must be said that the open source communities of the world have done what few thought possible. We have millions of projects and developers around the world successfully holding onto their trusty foundations, Drupal included. Many governments, enterprises and organisations have embraced the open source way of working on the premise that although there is risk in working differently, there is great merit in the reward. To this day, open source projects continue to thrive and to challenge the closed-source world. It is the scrutiny and the care of the open source community that keeps it clear and safe. As long as we continue to support and love and use our open source communities and contributions, they will stay in good repair and good stead.

If you were thinking of building a Drupal site and are suddenly now questioning that decision, then a read of Drupal's security statement is probably worthwhile.

Know your cattle by name

The key mitigation for this risk, it should be said, is for developers to know what code is in their application. It's our job to care and so it's our job to be paranoid. But it's not always easy. How many times have you installed a module without checking every line of code? How many times have you updated a module without checking the diff in Git? It's not always practicable to scan thousands and thousands of lines of code, just in case - and you'd hope that it's not necessary - but that doesn't mean it's not a good idea.

Using Composer with Drupal 8 makes installing new modules as easy as using NPM, and exposes the same problems to some extent. Add in a build pipeline, and it's very easy to never even see a single line of the new code that you've added to your project. Am I poking a paranoia nerve, yet? ;)

For further fun, think back to other attacks in the last year where sources for external JS dependencies were poisoned, resulting in compromised sites that didn't have a single shred of compromised code committed - it was all in the browser. How's THAT for scary!

In short, you are at risk if:
- You install a module without checking every line of code
- You update a module without checking every line of code / the diff
- You use a DEV release of a module
- You use composer
- Your application pulls in external dependencies

These actions, these ways of working all create dark corners in which evil code can lie undetected.

The light shall save you

Fortunately, it can easily be argued that Drupal Core is pretty safe from these sorts of issues. Phew. Thanks to the wide community of people contributing and keeping keen eyes on watch, Core code can be considered as well-protected. Under constant scrutiny, there's little that can go wrong. The light keeps the dark corners away.

Contrib land, however, is a little different. The most popular modules not only have maintainers (well done, guys!), but many supporting developers and regular release cycles and even official 'Security Coverage' status. We have brought light and trust to the contrib world, and that's a really important thing.

But what does 'Security Coverage' really provide? Can it fail? What happens if there is a malicious maintainer? I wonder.

When the light goes out

Many modules are starting to see the sun set. As dust gathers on old Drupal 7 modules and abandoned D8 alpha modules, the dark corners will start to appear. 'Security Coverage' status will eventually be dropped, or simply forgotten about, and issue lists will pile up. Away from the safety of strong community, keen eyes and dedicated maintainers, what used to be the pride of the Drupal community will one day become a relic. We must take care to keep pride in our heritage, and not allow it to become a source of danger.

Should a Drupal module maintainer be caught out by a trickster and have their work hacked, what would actually happen? Well, for most old D7 modules we'd probably see a few thousand sites pull in the code without looking, and it would likely take some time for the vulnerability to be noticed, let alone fixed.

Fortunately, most developers need a good reason to upgrade modules, so they won't just pull in a new malicious release straight away. But there's always a way, right? What if the hacker nicely bundled all those issues in the queue into a nice release? Or simply committed some new work to the DEV branch to see who would pull it in? There are loads of old modules still running on dev without an official release. How many of us have used them without pinning to a specific commit?

Vigilance is my middle name!

I have tried to ask a lot of questions, rather than simply doom-mongering. There's not an obvious resolution to all of these questions, and that's OK. Many may argue that, since Drupal has never had an issue like this before, we must already have sufficient measures in place to prevent such a thing happening - and I disagree. As the toolkit used by the world's hackers gets ever larger and ever more complex, we cannot afford to be lax in our perspective on security. We must be vigilant!

Module maintainers, remain vigilant. Ask good questions of new co-maintainers. Check their history. See what they've contributed. Find out who they really are.

Developers, remain vigilant. Know your cattle. Be familiar with what goes in and out of your code. Know where it comes from. Know who wrote it.

Drupalers, ask questions. How can we help module maintainers make good decisions? How can we support good developers and keep out the bad?

Some security tips!
- Always know what code you're adding to your project and whether you choose to trust it
- Drupal projects not covered by the Security Team should be carefully reviewed before use
- Know what changes are being made when performing module updates and upgrades
- If using a DEV version of a module in combination with a build process, always pin to a specific git commit (rather than HEAD), so that you don't pull in new code unknowingly; see the sketch below
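
With Composer, pinning to a commit looks something like the line below; the module name and hash are placeholders, and depending on how the package is distributed you may need a source (git) install for the commit reference to take effect:

composer require drupal/example_module:1.x-dev#a1b2c3d --prefer-source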

Categories: Drupal

Blair Wadman: How to customise content in partial Twig templates

Planet Drupal - 12 December 2018 - 1:11am

Using partial Twig templates is a great way to organise your frontend code because you can reuse code fragments in multiple templates. But what happens if you want to use the same partial template in multiple places and customise its content slightly?

You can do this by passing variables to the included partial template using the with keyword.
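
A minimal sketch of the idea, with template and variable names invented for illustration:

{# card.html.twig: the reusable partial. #}
<div class="card">
  <h2>{{ heading }}</h2>
  <p>{{ body }}</p>
</div>

{# In a parent template, pass variables using "with". Adding "only"
   limits the partial's context to exactly these variables. #}
{% include 'card.html.twig' with { heading: 'Latest news', body: summary } only %}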

Categories: Drupal

Entity Switcher

New Drupal Modules - 12 December 2018 - 12:26am

This module allows you to toggle between two entities.

Live example

Here's a live example of this module working in the real world: https://www.unicef.es/unnombreunavida#dona

Categories: Drupal

GTM dataLayer

New Drupal Modules - 12 December 2018 - 12:11am

This module integrates Drupal with Google Tag Manager, allowing you to develop complex dataLayers.

Categories: Drupal

Commerce Single Euro Payments Area (SEPA)

New Drupal Modules - 12 December 2018 - 12:03am

This module provides a simple SEPA (Single Euro Payments Area) payment method with IBAN validation for Drupal Commerce. After the order is completed, an email with the SEPA Direct Debit mandate can be sent to the customer, containing all the necessary information that the customer must return completed and signed.

Categories: Drupal

Entity Library

New Drupal Modules - 11 December 2018 - 11:49pm

This module provides a config entity for dynamic library declarations.

Categories: Drupal

Simple Json Viewer

New Drupal Modules - 11 December 2018 - 10:54pm

Simple JSON Viewer is a module built to beautify given JSON.

Features:

  1. It can be used with any editor by adding the simple class "simple-json-viewer"
  2. It can be used in custom JS by calling the simpleJsonViewer function
  3. It can be used via AJAX by calling the simpleJsonViewer function

Usage 1:

Just add some JSON inside any HTML tag with the class name "simple-json-viewer":
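
Something like this, where the JSON payload is arbitrary:

<div class="simple-json-viewer">{"name": "example", "enabled": true, "items": [1, 2, 3]}</div>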

Categories: Drupal

Stripe Webhooks

New Drupal Modules - 11 December 2018 - 10:54pm

This module adds a Stripe webhook endpoint to receive notifications of the desired events and dispatch Symfony events.

Categories: Drupal

Loading Bar

New Drupal Modules - 11 December 2018 - 10:27pm

This module provides a highly flexible, SVG-based progress bar form element using the LoadingBar.js library.

Categories: Drupal

Kanopi Studios: Kanopi Studios is a Top Provider on Clutch

Planet Drupal - 11 December 2018 - 1:30pm


It’s not easy to find a development partner you can trust. Particularly if you’ve never been immersed in the world of web development, it may take you some time to learn the language. That can make it even more difficult to know whether your partner is really staying on track with what you want to accomplish.

Luckily, knowing what to look for in a business partner can save you from all of the potential troubles later on. Ratings and reviews sites like Clutch can help you get there. This platform focuses on collecting and verifying detailed client feedback, and then using a proprietary research algorithm to rank thousands of firms across their platform. Ultimately, Clutch is a resource for business buyers to find the top-ranked service providers that match their business needs.

Luckily for us, users on Clutch will also find Kanopi Studios at the top of the list to do just that. Kanopi has been working with Clutch for a few months to collect and utilize client feedback to find out what we should focus on in the coming year. Through the process, we've coincidentally been named among Clutch's top digital design agencies in San Francisco.

Here are some of the leading client reviews that led us to this recognition:

“They were fantastic overall. We had great success communicating to their team via video conferencing, and they were able to answer every question we had. They also worked quickly and were very efficient with their time, so we got a great value overall.”

“Kanopi Studios’ staff members are their most impressive assets — extremely intelligent, experienced, and personable. Building a website is never easy, but working with people you both respect and like makes a huge difference.”

“Kanopi Studios successfully migrated our Drupal platform while preserving all the content that we’ve built up over the years. They worked hard to achieve a responsive design that works well on both mobile and large desktop displays.”

Not only have these kind words earned us recognition on Clutch, but we’ve also gained the attention of the how-to focused platform, The Manifest (where we are listed among top Drupal developers in San Francisco), and the portfolio-focused site, Visual Objects (where we are gaining ground among top web design agencies site-wide).

Thank you, as always, to our amazing clients for the reviews and the support.

The post Kanopi Studios is a Top Provider on Clutch appeared first on Kanopi Studios.

Categories: Drupal

Commerce GoCardless Client

New Drupal Modules - 11 December 2018 - 1:12pm

The module integrates Drupal Commerce with GoCardless.com, creating direct debit mandates for new orders upon checkout. GoCardless is a recurring payments specialist and is very competitive compared with other payment services, charging just 1% per transaction, with a minimum of 20p (or an equivalent amount in other currencies).

Categories: Drupal

CKEditor Blockquote Attribution

New Drupal Modules - 11 December 2018 - 12:37pm

This module includes the Blockquote Attribution plugin, an extension of the standard CKEditor Blockquote button. This button inserts <figure> and <figcaption> markup around the <blockquote> element. This is the standard way to indicate that a piece of content is quoted from another source.
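
The resulting markup follows the standard HTML pattern for attributed quotations, along these lines (the quotation itself is invented for illustration):

<figure>
  <blockquote>
    <p>The only thing we have to fear is fear itself.</p>
  </blockquote>
  <figcaption>Franklin D. Roosevelt</figcaption>
</figure>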

Categories: Drupal

LinkedIn Insights

New Drupal Modules - 11 December 2018 - 12:23pm

The LinkedIn Insight Tag is a lightweight JavaScript tag that powers conversion tracking, retargeting, and web analytics for LinkedIn ad campaigns. The Insight Tag should be incorporated as a standard component of all of your website's pages to enable these LinkedIn Marketing Solutions features.

Categories: Drupal

Acro Media: Drupal 7 Commerce Performance Tuning

Planet Drupal - 11 December 2018 - 11:30am

When it comes to ecommerce, a fast site can make a big difference in overall sales. I recently went through an exercise to tune a Drupal 7 Commerce site for high traffic on a Black Friday sales promotion. In previous years, the site would die in the beginning of the promotion, which really put a damper on the sale! I really enjoyed this exercise, finding all the issues in Commerce and Drupal that caused the site to perform sub-optimally.

FYI, We also have a Drupal 8 Commerce Performance Tuning guide here.

Scenario

At baseline, this specific site's response time was 25 seconds and it could handle only about 1,000 orders an hour, with a very heavy percentage of 500s, timeouts and general unresponsiveness. CPU and memory utilization on the web and database servers was very high.

Fast-forward to the end of all the tuning and we were able to handle 12K-15K orders an hour! At that point the load generator couldn't generate any more load, the internet bandwidth on the load generators would get saturated, or something else external to the Drupal environment became the limiter, so we stopped trying to tune things. Horizontal scaling by adding additional webheads was linear: if we had added more webheads, they could have handled the traffic. The database server wasn't deadlocking, and its CPU and memory were very stable. CPU on the web servers would peak at ~80% utilization, then more capacity would get added by spinning up a new server. The entire time, response time hovered around 500-600ms.

Enough about the scenario. Let’s dive into things.

Getting Started

The first step in tuning a site for a high volume of users and orders is to build a script that creates synthetic users and populates and submits the form(s) to add item(s) to the cart, register new users, and input the shipping address and any other payment details. There are a couple of options for this. JMeter is very popular; I've used it in the past with pretty decent success. In my most recent scenario I used locust.io, because it was recommended as a good tool. I hadn't used it before and gave it a try; it worked well. There are other load testing tools available too.
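
A synthetic-shopper script can stay quite small. Here is the rough shape of one for locust (the paths and form fields are placeholders; a real Drupal Commerce flow also needs to scrape form_build_id and form_token from the add-to-cart form before posting it):

# locustfile.py
from locust import HttpUser, between, task

class Shopper(HttpUser):
    wait_time = between(1, 5)

    @task
    def browse_and_add_to_cart(self):
        # Placeholder paths: view a product, add it to the cart, hit checkout.
        self.client.get("/products/some-product")
        self.client.post("/cart/add/123", {"quantity": 1})
        self.client.get("/checkout")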

OK, now you are generating load on the site. Next, start tuning it. I used New Relic's APM monitoring to flag suspect transactions and PHP methods: transactions that take a long time or happen with great frequency are all good candidates for red flags. If you don't have access to New Relic, another option is Blackfire. Regardless of what you use for identifying slow transactions, use something.

Make sure that there's nothing crazy going on. In my case, there was a really badly performing query, called from the theme's template.php, that was getting run on every single page, even when it wasn't needed. Tuning that query gave us an instant speed-up in performance.

After that, we started digging into things. There are several core and contrib patches I'll mention, along with why and when you should consider applying them to your site.

In your specific commerce site, things might be different. You might have different payment gateways or external integration points. But the process of identifying pain points is the same: run a 30-60 minute load test, find long-running PHP functions, then fix them so they don't take as long.

As a first step, install the Memcache (or Redis) module and set it up for locking. Without that one step, you'll almost immediately run into deadlocks on the database's semaphore table. This is a critical first step: from my experience, deadlocks are the number one issue when running a site under load, and deadlocks on the semaphore table are probably the most common scenario. Do yourself a favor, install Memcache and avoid the problem entirely.
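
With the Drupal 7 Memcache module, that setup comes down to a few lines in settings.php. A minimal sketch, assuming the module lives under sites/all/modules/contrib (adjust the paths to your layout):

// Route locking and the default caches to Memcache so the semaphore
// table no longer lives in MySQL.
$conf['lock_inc'] = 'sites/all/modules/contrib/memcache/memcache-lock.inc';
$conf['cache_backends'][] = 'sites/all/modules/contrib/memcache/memcache.inc';
$conf['cache_default_class'] = 'MemCacheDrupal';
// The form cache needs strong consistency, so keep it in the database.
$conf['cache_class_cache_form'] = 'DrupalDatabaseCache';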

Then see if you can disable form caching on checkout and user registration. This helped save a TON of traffic against the database for forms that really don’t need to be cached. More about that later in specific findings.

One last thing before diving into some findings...

SHOW ENGINE INNODB STATUS

...will become your favorite friend. Use it to find deadlocks on your MySQL server.

Specific Findings

The following section describes specific problems and links to issues and patches related to the problems.
  • Do not attempt field storage write when field content did not change
    Commerce and Rules run and reprocess an order a lot. And then blindly save the results. If nothing has changed, why re-save everything again? So don’t. Apply this patch and see fewer deadlocks on order saves.
  • field_sql_storage_field_storage_load does use an unnecessary sort in the DB leading to a filesort
    Many times it makes sense to use your database to process the query. Until it doesn't make sense. In this case the sort leads to a filesort in MySQL (which you can discover using EXPLAIN) and to table locking and deadlocks. It is not that hard to do the sort in PHP. So do it.
  • Do not make entries in "cache_form" when viewing forms that use #ajax['callback'] (Drupal 7 port)
    This is a huge win, if you can pull it off. For transient form processing like login and checkout, disabling form cache is a huge relief to the DB. You might need to put the entire cart checkout onto a single page. No cart wizard. But the gains are pretty amazing.
  • If you are using captcha or anything with ajax on it on the login page, then you’ll need to make sure you are running the latest versions of Captcha and Recaptcha. See issues #2449209 and #2219993. Also, side note: if using the timing feature of recaptcha, the page this form falls on will not be cacheable and tends to bust page cache for important pages (like homepages that have a newsletter sign up form).
  • form_get_cache called when no_cache enabled
    You’ve done all that work to cut down on what is stored in cache. Great. But Drupal still wants to retrieve from cache. Let’s fix that. Cut down more DB calls.
  • commerce_payment_pane_checkout_form uses form_state values instead of input
    If your webshop is like most webshops, it is there to generate revenue. If you disable form caching on checkout, without this patch the values in your payment (including the ones for receiving payment) aren’t captured. Oops. Let’s fix that too.
  • Variable set stampede for js and css during asset building
    If you are using any auto scaling system and building out new servers when the site is under heavy load, you might already be using Advagg. But if you aren’t and are still using Drupal core’s asset system, spinning up a new system or two will cause some issues. Deadlocks galore when generating the CSS and JS aggregates. So either install Advagg or this patch.
  • Reduce database load by adding order_number during load
    Commerce and Rules really like to reprocess orders. An easy win is to reduce the number of one-off resaves and assign the order number after the first load.
  • Never use aggregation in maintenance mode
    While the site is under heavy load, the database sometimes becomes unreachable. Drupal treats this as maintenance mode. And tries to aggregate the JS/CSS and talk to the database. But the database isn’t reachable. It is a little ridiculous to aggregate JS/CSS on the maintenance page. And even more to try to talk to the database. So cut out that nonsense.
  • drupal_goto short circuits and doesn't set things to cache
    If you have any PHP classes you are using during the checkout, Drupal’s classloader auto loads them into memory. It then keeps track of where the files exist on the disk and this makes the next load of those classes just that much faster. Well, drupal_goto kills all this caching. And drupal_goto gets called when navigating through checkout.
Recap

Wow! That was a long list of performance enhancements. Here's a quick recap, though: identify the critical flow of your application, generate load on that flow, and use a profiler to find pain points in that process. Then start picking things off: look on drupal.org for existing issues, file bugs, and apply patches. Many of the issues discussed here will apply to your site; others won't, and you'll have different issues of your own.

Surprisingly, or maybe not surprisingly, the biggest wins in our discovery process were the low hanging fruit, not the complex changes. That query in the template.php was killing the site. After that, switching to use Memcache for the semaphore table and eliminating form cache for orders also cut down on a lot of chatter with the database.

I hope you too can tune that Drupal 7 Commerce site to be able to handle thousands of orders an hour. The potential exists in the platform, it is just a matter of giving performance bottlenecks a little attention and fine tuning for your particular use case. Of course, if you need a little help we'd be happy to assist. A little bit of time spent can have you reaping the rewards from then on.

Categories: Drupal

Acro Media: Drupal 8 Commerce Performance Tuning

Planet Drupal - 11 December 2018 - 11:30am

Here are some performance tuning tips and instructions for setting up a very performant Drupal 8 Commerce site using Varnish, Redis, Nginx and MySQL. I’ve got this setup running nicely for at least 13,000 concurrent users and it should scale well past that.

FYI, We also have a Drupal 7 Commerce Performance Tuning guide here.

Varnish Config

You’ll need some specific config for Drupal as well as some extra config to work nicely with BigPipe caching. These are standard for Varnish and Drupal and not specific to Commerce.
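
The BigPipe-specific part boils down to streaming, and never caching, the responses Drupal marks with its Surrogate-Control header. A minimal sketch, to be merged into an existing Drupal VCL:

sub vcl_backend_response {
  # BigPipe responses must stream so placeholder replacements reach the
  # browser as they render; don't buffer or cache them.
  if (beresp.http.Surrogate-Control ~ "BigPipe/1.0") {
    set beresp.do_stream = true;
    set beresp.ttl = 0s;
  }
}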

Drupal

You'll want to set up the Purge and Varnish Purge modules to handle tag-based cache invalidation; nothing here is unique to Commerce, so you can follow the standard instructions. You will, however, want to make sure your pages actually are cached, as modules or small misconfigurations can often make a page uncacheable. To work nicely with Varnish, you want the entire page to be cacheable so your web server doesn't even get hit. An underused module that I find very helpful is Renderviz, which will show you a 3D breakdown of which cache tags are attached to which parts of the page and can help you identify problem areas. I run...

renderviz('max-age', '0')

...to show me anything that can't be cached. Usually the parts you find can be corrected and made cacheable.

For example: in a recent round of performance testing, I found that a newsletter signup appearing at the bottom of every page had an overly aggressive honeypot setting, which rendered every page uncacheable. Changing the setting to only apply to the necessary forms, as well as correcting a language selector, turned tons of uncached pages into cacheable ones. Now these pages return in under 10ms and put zero load on my web servers or database.

Web Servers

PHP

Use the most modern version of PHP you can, preferably the latest stable. Never ever ever use PHP 5, which is terrible, terrible, terrible. Otherwise, make sure you have sufficient memory and allowed threads, and that will cover most of your PHP tuning. This is almost certainly the most resource-heavy part of your Drupal stack, but it is also easy to scale horizontally, pretty much indefinitely. And the more you can make use of Varnish, the less PHP will get used.

Nginx/Apache

Most of this is just making sure you can handle the number of connections. You may need to up the file limit...

ulimit -n

...of your web user to allow for more than 1024 connections per nginx instance.

Database

A Commerce site is usually more write-heavy than your standard site, as your users create lots of "content" (aka carts and orders). This will usually change your MySQL config a bit, although the majority of your queries will still be reads. A pretty simple way to tune your site is to run...

mysqltuner

...against it after getting some real traffic data for at least a couple of days, or after simulating high traffic. Its recommendations will get you a pretty good setup.

There is one other VERY important thing you need to do: change your transaction isolation level from REPEATABLE-READ to READ-COMMITTED. REPEATABLE-READ, MySQL's default, is much too aggressive at locking to work well with most Drupal sites, especially anything write-heavy. Without this change you will suffer constant deadlocks even at fairly low traffic levels. Frankly, I think this should be a flag on the status page, but my patch hasn't gotten any traction.
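
A quick way to check and change it at runtime (on MySQL 5.7 the variable is tx_isolation; newer versions also accept transaction_isolation):

SELECT @@GLOBAL.tx_isolation;
SET GLOBAL tx_isolation = 'READ-COMMITTED';

Then persist it in my.cnf under [mysqld] so it survives restarts:

transaction-isolation = READ-COMMITTED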

Cache Server

Nothing special here, but you are going to want to use a separate caching option. It could be Memcache, Redis or even just a separate MySQL database. Redis is nice and fast, but the biggest gain is simply splitting your cache away from the rest of your database so you can scale them independently.

Patches

There are a few specific patches that will be a great help to your performance.

_list cache tag invalidation

See: https://www.drupal.org/project/drupal/issues/2966607

Every entity type has an entity_type_list cache tag, which gets invalidated any time an entity of that type is added or changed, signalling that lists of that entity type need to be rebuilt. This happens a LOT, but it is a relatively simple query:

update cachetags set invalidation=invalidation+1 where tag='my_entity_list'

This is an update, which is a blocking query, nothing else can edit this row while this query is running, which wouldn’t be so bad except...

This query often gets run as part of a larger task, in our case placing an order. A big task like this runs in a transaction, which basically means we save up all the queries and run them at once so they can be rolled back if something goes wrong. This means, though, that this row stays locked for the whole duration of the transaction, not just the short time it takes this little query to run. If the invalidation happens near the start of the transaction, it can take a query that would take 0.002 seconds and make it take 0.500 seconds, for example. Now, if we have more than two of these happening a second, we start to build up a queue of these queries, which just keeps getting longer and longer until we start returning timeouts. Since this query is part of the bigger order transaction, it stops the whole order from being processed and can bring your checkout flow to a halt.

Thankfully, the above listed patch allows these cache invalidations to be deferred so as to not block large transactions. I think the update query for invalidating cache tags is still a bottleneck as you could eventually reach it without these long transactions, but at this point that problem is more hypothetical than something you will practically encounter.

Add index to profiles

See: https://www.drupal.org/project/profile/issues/3017788

As you start getting more and more customers and orders, you will get more profiles. Loading them, especially for anonymous users, will really start to slow down and become a bottleneck. The listed patch simply adds an index to prevent that. Please note, this is a patch for the Profile module, not Commerce itself.

Make language switcher block cacheable

See: https://www.drupal.org/project/drupal/issues/2232375

This issue is unfortunately on hold pending some large core changes, but once it does land, this will allow the language switcher block to be used without worry of it blocking full page caching.

Conclusion

You should be able to scale well above 10,000 concurrent users with these tips. If you encounter any other bottlenecks or bugs, I’d love to hear about them. If you want help with some performance improvements from Acro Media and yours truly, feel free to contact us.

Categories: Drupal

php[architect]: Better Practice – December 2018

Planet Drupal - 11 December 2018 - 9:10am

Practice and more practice are the keys to adopting modern software engineering practices. It doesn’t matter if you’re using WordPress or Drupal to manage website content, trying to learn unit testing, get hired, or looking for better ways to manage date and time data—there’s always an opportunity to learn how to do it better. This issue rounds up articles on each of these topics to help you deepen your problem-solving skills.

The post Better Practice – December 2018 appeared first on php[architect].

Categories: Drupal

Agiledrop.com Blog: How to Create a Node in Drupal 8 using REST

Planet Drupal - 11 December 2018 - 4:38am

In this post, I go through the steps involved in creating content (nodes) using RESTful web services in order to demonstrate their capabilities.
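
As a rough sketch of the approach, assuming the content REST resource is enabled for POST with the json format and basic authentication, and that an article content type exists (the site URL and credentials below are placeholders), creating a node is a single authenticated POST to /node?_format=json:

require 'vendor/autoload.php';

$client = new \GuzzleHttp\Client(['base_uri' => 'https://example.com']);

// Field values use Drupal's normalized structure: arrays of value items.
$response = $client->post('/node?_format=json', [
  'auth' => ['apiuser', 'apipassword'],
  'json' => [
    'type' => [['target_id' => 'article']],
    'title' => [['value' => 'My first REST node']],
    'body' => [['value' => 'Created via the REST API.']],
  ],
]);

// A successful request returns 201 Created plus the serialized node.
echo $response->getStatusCode();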

Categories: Drupal
