Planet Drupal

Subscribe to Planet Drupal feed
Drupal.org - aggregated feeds in category Planet Drupal
Updated: 23 hours 38 min ago

OpenSense Labs: Cognitive Search: A True Genius

17 May 2018 - 3:51am
By Shankar | Thu, 05/17/2018 - 16:21

A black spot on a white sheet of paper can be found with a quick glance. But what if you have to search for a black dot of a certain radius among a cluster of dots on a large sheet of white paper? Such is the need of the hour: you have to intelligently search for a piece of information amid a cornucopia of data in your systems. Cognitive search is revolutionizing the process of retrieving files.

Manually searching for a document stored somewhere in your system is a diminishing trend. Large enterprises are the ones showing the keenest inclination towards this disruptive technology.

Before we move on to how large organizations are looking to extract the merits of cognitive search, let’s understand what it is.

What is Cognitive Search anyway?

Forrester, a research and advisory firm, defines cognitive search and knowledge discovery as “the new generation of enterprise search solutions that employ Artificial Intelligence (AI) technologies such as natural language processing (NLP) and machine learning to ingest, understand, organize, and query digital content from multiple data sources”.

That is perhaps the best definition one can give of a cognitive information system. In short, it can extract the most relevant piece of information from large sets of data in the user's work context.

Platforms enabled with cognitive computing abilities can interact with the users in a natural manner. With experience, they can learn user preferences and behavioral patterns. This helps them establish links between related data from both internal and external sources.

How Beneficial is Cognitive Search?

So now we have an understanding of what it is and how it works. How can it turn out to be a great asset?

Tapping into large sets of data sources
  • Fetching the best piece of data out of voluminous data sources can seem tiring. Cognitive search can work wonders in extracting the most valuable information from large sets of intricate and varied data sources.
  • Whether the data is internal or external, it peeks inside everything that is available across your entire enterprise. It searches through both structured and unstructured data, lending deep and insightful search capabilities to your organization. This helps in making better business decisions.
Providing relevant knowledge
  • It comes packed with functionalities that lead us to meaningful and relevant information. Searching across all enterprise data may seem daunting, but it does so with ease.
  • Using NLP, it can gauge the scheme of things in text content like emails, blogs, reports, research work, and documents, and also in media content like meeting videos and audio recordings.
  • Once the understanding part is done, machine learning algorithms help it dig deeper and come up with insightful information. Company dictionaries and ontologies help it understand terminologies and their relationships.
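To make the relevance idea concrete, here is a minimal sketch of the classic keyword-relevance scoring (TF-IDF) that cognitive search builds on; the documents and query are invented for illustration, and real engines layer NLP and learned ranking on top of this.

```python
import math
from collections import Counter

def tf_idf_rank(documents, query):
    """Rank documents against a query with a bare-bones TF-IDF score:
    a term matters more when it is frequent in a document (TF) and
    rare across the collection (IDF)."""
    tokenized = {name: text.lower().split() for name, text in documents.items()}
    n_docs = len(tokenized)
    scores = {}
    for name, words in tokenized.items():
        counts = Counter(words)
        score = 0.0
        for term in query.lower().split():
            doc_freq = sum(1 for w in tokenized.values() if term in w)
            if doc_freq:
                tf = counts[term] / len(words)
                idf = math.log(n_docs / doc_freq) + 1.0  # +1 so ubiquitous terms still count
                score += tf * idf
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)

# Invented documents standing in for enterprise content sources.
docs = {
    "report": "quarterly revenue report revenue numbers",
    "email": "meeting notes about the revenue call",
    "memo": "office party planning memo",
}
ranking = tf_idf_rank(docs, "revenue")  # "report" ranks first
```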
Enhancing search results

Machine learning algorithms help in providing better search results.

  • To help digital marketers predict whether the advertisements they design are going to work, a supervised learning algorithm called Classification By Example can help. For instance, it can judge how people reacted to particular ad campaigns in the past to help marketers come up with something better this time around.
  • Marketers can ascertain a particular group of people and target them for their upcoming marketing campaigns. Clustering, an unsupervised learning algorithm, helps them in the process.
  • To understand the relationship between input and output variables and make predictions, a regression algorithm comes in handy. For instance, it can be used to build applications that estimate road traffic based on present weather conditions, or predict stock prices based on various economic factors.
  • A similarity algorithm can help you appoint an expert team for a business project based on the skills and competency levels shown in previous projects.
  • Personalized recommendations based on users' interests can be made with a recommendation algorithm. Based on previous history and usage patterns, it can recommend content a user would most likely want to consume.
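As an illustration of the supervised case, one simple reading of classification by example is a nearest-neighbour vote over past labeled examples. The sketch below is a bare-bones k-NN classifier in plain Python; the campaign data is invented for illustration.

```python
import math
from collections import Counter

def knn_classify(examples, query, k=3):
    """Classify `query` by majority vote among the k nearest labeled
    examples. `examples` is a list of (feature_vector, label) pairs:
    new items inherit the label of the past items they most resemble."""
    by_distance = sorted(
        examples,
        key=lambda pair: math.dist(pair[0], query),
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Invented data: past ad campaigns as (click_rate, share_rate) with outcomes.
campaigns = [
    ((0.9, 0.8), "worked"),
    ((0.8, 0.7), "worked"),
    ((0.2, 0.1), "failed"),
    ((0.1, 0.2), "failed"),
    ((0.7, 0.9), "worked"),
]
verdict = knn_classify(campaigns, (0.85, 0.75))  # resembles past successes
```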
How did Cognitive Search come into existence?

It is a valid question. One might wonder whether this was developed in a short span of time or whether it involved some amazing technologies behind the scenes. There was, of course, a long road that had to be traveled to reach this tech marvel.

By now you will have noticed that, as the definition mentions, machine learning and artificial intelligence are the frontrunners leading up to this masterpiece.

Almost all the search methods that exist today are in some way or other related to Google. Leverage Marketing came up with an interesting study. When Google first created a search engine in 1996, there were already several others in place. But Google's search was different. While other search engines delivered results only if they could find an exact match for the keyword in the search box, Google had a different algorithm.

Google gave a value to certain keywords. Keyword frequency determined the search results, which led to irrelevant content showing up on top. So, in the 2000s, Google devised several improved search techniques and finally incorporated machine learning into its search engine in 2015. That means Google would not just read what you typed in the search box but interpret what you really meant.

By developing its cognitive search method, Google's algorithm could understand keywords and rank results using past search results, browser history, user location, and other such parameters. This was its major step towards artificial intelligence.

This is how Google set the standard for search. Developers of office networks then contemplated similar search methods for their business needs, basing their work on Google's cognitive search and machine learning techniques. That is how cognitive search came into existence and did a splendid job of improving the search experience.

Challenges that lie ahead...

A three-pronged view of the challenges it might encounter and how to tackle them:

  • Expertise: Shortage of personnel required to develop and maintain this budding technology can be one of the main challenges that have to be overcome.
  • AI implementation:
    • Supervised machine learning helps in recognizing user patterns over time. Providing sufficient labeled training datasets from which these systems can learn is a huge challenge.
    • Unsupervised machine learning identifies existing user patterns. Systems with this capability face a major hurdle: gathering sufficient data, with enough human intervention for proper guidance and interpretation, to train the system.
  • Goal formulation: There have to be clearly formulated goals and outcomes. For instance, in reinforcement learning, systems perform several attempts and learn from the outcomes of those trials to make better decisions. The biggest task is to provide clear-cut goals and enough practice in a challenging environment.
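The unsupervised case can be made concrete with a tiny k-means sketch: the algorithm discovers groups in unlabeled data on its own, which is exactly where the guidance-and-interpretation challenge above arises, since a human must still decide what the groups mean. The customer data is invented for illustration.

```python
import math

def kmeans(points, k, iterations=10):
    """Minimal k-means sketch: assign each point to its nearest centroid,
    then move each centroid to the mean of the points assigned to it."""
    centroids = list(points[:k])  # simple deterministic initialization
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = tuple(
                    sum(dims) / len(members) for dims in zip(*members)
                )
    return centroids, clusters

# Invented data: customers as (visits_per_week, avg_spend) pairs.
customers = [(1, 5), (2, 6), (1, 4), (9, 80), (10, 90), (8, 85)]
centroids, clusters = kmeans(customers, k=2)
# The two groups separate into occasional and high-value customers;
# naming and acting on those groups is still a human decision.
```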
How can Cognitive Search improve Enterprise search?

Large enterprises are asking this question, as cognitive search can make life easier. We have seen that relevancy, meaningfulness, and completeness are required to get better search results. But solutions should also have enterprise qualities.

  • Understanding data: It should understand any data an enterprise flings at it. It should browse through the plenitude of data sources and understand both structured and unstructured data to produce better enterprise search results.
  • Scalability: It should be able to scale with the ever-increasing volume of enterprise data. Large enterprises have hundreds or thousands of applications, with vast amounts of data stored in the cloud or on-premises. Cognitive search solutions should deliver quality search results at that scale.
  • Manual fine-tuning: It uses natural language processing and machine learning algorithms to understand data, predict users' search patterns, improve the relevance of search results, and automatically tune them over time. Cognitive search solutions should still provide tools for administrators to manually tune search results. After all, AI is not perfect.
  • Building search applications: It should help developers build search applications. Instead of incorporating simple text-box search, business enterprises should be able to build applications that work like virtual digital assistants such as Google Now and Siri.
Use Cases

Sinequa, which provides a cognitive search and analytics platform to over 2,000 organizations, has a great cognitive computing solution. As a matter of fact, it is recognized as a leader in the Gartner 2017 Magic Quadrant for Insight Engines and in The Forrester Wave™: Cognitive Search and Knowledge Discovery Solutions, Q2 2017.

Sinequa partnered with content platform firm Box to enhance cross-platform enterprise search and analytics.

  • The partnership between Sinequa and Box helps enterprises aggregate human-generated data, explore information, and apply it in business decision-making. The integration not only lets customers mine their Box content but also allows searching across the organization's other repositories. This increases the value of the information retrieved by linking it with related content across different sources that were previously standalone silos.
  • Sinequa offers more than 150 connectors to various data sources, and with this partnership the relevancy of search results over the enterprise data corpus has only improved. Individuals can search across platforms for contextually relevant results from a single interface. The partnership thus promotes the cloud-first philosophy that is becoming the industry norm.
  • From an information management angle, the partnership is significant, with both companies having a huge presence among large enterprise clients. Box provides user-friendly cloud file sharing and sync functionality, and it has embedded enterprise governance features by building tools for reports, access controls, workflow, security policies, and so on. Partnering with Sinequa further strengthens Box's voluminous knowledge base by letting it maintain control settings on data while the content is being searched and assessed through Sinequa.
  • Native security and permission settings of connected repositories are preserved with this integration. Users can search the Box environment without disturbing the native control settings on the platforms where the information is located. That means the Sinequa search interface allows users to search freely for content while the granular security and permission settings of Box remain intact.
  • A 360-degree view of the customer is attained through this partnership. Sinequa's natural language processing and machine learning capabilities, powered by Apache Spark, along with its more than 150 connectors to content sources, let it form compound enterprise search results for detailed human understanding. If a user is searching for a specific subject in Box, he or she can view the most contextually relevant results from email, Salesforce, on-premises file shares, and other such sources. This reduces the time and effort required to find the right actionable insight and thus improves the yield of initiatives like gaining a 360-degree view of the customer.

Hewlett Packard Enterprise (HPE), a multinational enterprise information technology company, has its own cognitive computing solution. Delivering natural-language-based systems that meet the ever-increasing needs of users, providing answers that are precise, relevant, and trustworthy, is important. HPE IDOL Natural Language Question Answering is one such solution, with natural-language-processing-enabled features for large enterprises.

  • Accurate responses: The IDOL Answer Bank feature provides accurate, curated responses to predefined reference questions. For instance, it can be programmed to give you instructions on configuring a smartphone.
  • Fact-based answers: The IDOL Fact Bank feature provides answers based on established facts. For instance, it can give stock price details from structured data sources, or provide a company's annual report from unstructured data sources.
  • Text-based overviews: The IDOL Passage Extract feature gives you an overview. For instance, you can see the latest financial services with their rules and regulations, or recent news events.
  • Assessing questions and data sources: The IDOL Answer Server feature assesses the question and the various content sources to provide the best possible answer.
Cognitive Search solution providers

Forrester, in their research study for Cognitive Search and Knowledge Discovery Solutions, compiled a list of high performing vendor solutions.

  • HPE IDOL: This solution is built to analyze everything that is searched with it. With HPE's ambitions apparently not restricted to unstructured text, its cognitive computing platform also does deep analysis of speech, images, and video. It includes capabilities like understanding a question and answering it optimally, which can help developers build chatbots or virtual conversational assistants.
  • Coveo: Its major focus lies in contextual and relevant search results. It uses advanced analytics and machine learning algorithms to return the most contextual results for users' queries. It has also integrated with Salesforce using its cloud-based model.
  • Sinequa: It gives importance to natural language processing for better understanding of search queries and more relevant content discovery. Moreover, by incorporating Apache Spark, its analytics platform has received a further boost.
  • Attivio: It is suitable for the most complex search applications. It offers knowledge management, anti-money laundering, customer 360, and other such features. Developers can use structured query language to search the index.
  • IBM: It has leveraged the utilities of Watson Explorer by incorporating it in IBM's Watson Developer Cloud. Watson Explorer can be deployed in the cloud or on-premises. It is very helpful for customer 360 search applications, enterprise search, and claims processing.
  • Lucidworks: Its solution, called Fusion, has fantastic enterprise search features, 40 prebuilt connectors to applications like Salesforce and Slack, a better administration tool, and out-of-the-box machine learning algorithms for better knowledge discovery.
Summary

Cognitive search has emerged as the default standard for enterprise search. By analyzing a search query with its AI capabilities to give the most relevant and contextual output, it has caused a volte-face in the thinking of large enterprises.

Using internal and external content sources to provide the most relevant knowledge and enhance search results, it has been a huge aid to smart cross-platform search.

Google integrated it into its search algorithms to understand users' behavioral patterns and show better results. That is how cognitive search entered the enterprise world.

It has to break through the straitjacket of a few challenges to emerge as an improved technology in the coming years.

With a deep understanding of data, scalability in the face of challenges, manual tuning by administrators, and more, it can improve enterprise search.

Leading cognitive search solution providers like Sinequa, HPE, and Attivio, among others, have amazing platforms where customers can reap the benefits.

OpenSense Labs loves this tech genius. Contact us at hello@opensenselabs.com to learn more about this remarkable piece of technology.

Categories: Drupal

OpenSense Labs: Drupal Lays The Foundation For Every Enterprise

17 May 2018 - 3:33am
By Akshita | Thu, 05/17/2018 - 16:03

As an entrepreneur, you need a reliable, secure, and flexible platform to build your business on. It should be not only scalable but future-proof, able to sustain growing content without hampering the performance of your website.

Leaders worldwide are using the power of open source to innovate their platforms and improve their business metrics. Selecting the right technology means choosing solutions that will support an active and growing business over the long haul. It therefore requires careful consideration and foresight when choosing the CMS for your enterprise.

Fulfilling business requirements as well as technical needs, it is no wonder Drupal is used by 7 times as many top sites as its next two competitors combined (BuiltWith.com).

Let's simplify the word enterprise 

An oft-repeated word in the world of business, “enterprise” covers organizations of all shapes and sizes. Each such business comprises individual organizational units with a distinct need to build a firm with a unique identity and reputation of its own.

Even though the meaning may vary considerably, when it comes to web development and technology an enterprise website requires a particular set of abilities, such as accommodating a large and varied content base, handling heavy traffic, supporting microsites, and of course providing tight security.

Who uses Drupal CMS for their enterprise?

Drupal is fostering billion-dollar businesses under the aegis of its brand; a few well-known ones are:

  • Puma
  • Tesla Motors
  • Grammy
  • Pfizer
  • Timex
  • The Economist
  • Whole Foods
  • Honda (Brazil)
  • Johnson and Johnson
  • Shoretel
  • L'Oréal (India)

And a million more add to Drupal's credentials. Acknowledging that enterprise solutions often demand complex requirements, Drupal has it sorted for you.

Why Drupal For Your Enterprise?

Having covered the enterprises using Drupal, below are some of the solid technical reasons which make it an excellent candidate for an enterprise of any scale or vertical.

It is Easier To Build

As an online platform on which your business will be built, Drupal lets your need dictate the terms.

Providing easy-to-set-up solutions through distributions, it cuts development time in half.

Enabling companies to deploy core features and functionality rapidly, it allows easier customization as per their business requirements.

It is easier to choose the layout and themes for your Drupal website, as themes and appearances are just a click away. With features simplified to make non-developers comfortable around Drupal, the editorial capabilities have been made fluent and easy.

Drupal is Secure

Used by hundreds of thousands of websites, Drupal keeps its core code under constant review and salts and hashes stored passwords to strengthen the security of your website. Supported by experts and a large, continuously growing community, it has a dedicated security team to patch any probable security violation.

Frequent Updates

In case of a security update, the community ensures that you get notified the day patches are released. Security release windows fall every Wednesday for contributed projects, and on the third Wednesday of every month for core.

Even though the release window does not necessarily mean that a release will actually be rolled out on that date, it exists for the site administrators to know in advance the days they should look out for a possible security release.

Security Modules

In addition to the proven security of core, numerous contributed modules can strengthen the security of your website. These modules extend security by adding password complexity, login, and session controls, increasing cryptographic strength, and improving Drupal's logging and auditing functions. For detailed research on security-related modules, check the list of must-have security modules.

Security Team and Working Group

The security team works closely with the Drupal Security Working Group (SecWG), comprising dozens of experts from around the world who validate and respond to security issues. The aim is to ensure that core and the contributed project ecosystem provide world-class security, and to promote security best practices among community developers.

Its core is designed to prevent any possible security breach. Vulnerabilities in the core are coordinated with branch maintainers and individual project maintainers respectively.

Drupal has proven to be a secure solution for enterprise needs and is used by top-tier enterprises.

Drupal is Scalable and Flexible

Scalability and flexibility are other salient features that make Drupal popular among businesses. When it comes to web technology, enterprises require the ability to handle considerable traffic at all times, especially for media and entertainment sites.

It is built with core web technologies which have stood both the test of time and traffic spike.

Drupal’s extensibility via modules and distributions is at the heart of much of its success. While core sustains the bulk of the content, its way of streamlining the demands of new industries, letting them address their needs through custom modules and distributions, has earned it highly satisfactory customer reviews.

One matter that weighs on enterprises is the cost of maintenance. Many government and non-government organizations have migrated to Drupal to avoid the licensing and maintenance costs of proprietary systems.

Excels at Responsive Development and Quick Loading Time

According to Google’s official statement, more than 50 percent of search queries globally now come from mobile devices. People want to be able to find answers as fast as possible and various studies have proved that people really do care about the loading speed.

That is why a recent Google announcement says that page speed will be a ranking factor for mobile searches from July 2018. It is high time to treat the combination of performance and mobile responsiveness as a serious factor in improving visibility and revenue from the web.

Drupal 8 is built for a mobile-first world. Everything in version 8 supports mobile responsive design. Its administration screens and default themes are responsive, and it provides a responsive front-end theming framework for both developers and content authors.

Increasing the loading speed of your web pages opens numerous doors for business. And when users can view your Drupal website the same way on desktop and mobile devices, there is no room for second thoughts.

Mobile responsiveness helps you deliver the optimal mobile visitor experience. It supports the best responsive design practices and ensures that your users get a coherent experience anytime and every time.   

Supports Multi-site Functionalities

If your organization runs more than one site, maintenance and management can demand big bucks and time. But with the multi-site feature you can share one single Drupal installation (including core code, contributed modules, and themes) among several sites.

Enterprises, this way, can handle complex requirements from a single Drupal installation which implies that less time and resources are required to build your network of websites.

One can manage any number of sites across their organization or brand, crossing geographies and campaigns from a single platform that allows swift and uncomplicated site creation and deployment.

This is particularly useful for managing the core code since each upgrade only needs to be done once. While each site will have its own database and configuration settings to manage their own content, the sites would be sharing one code base and web document root.

The multisite feature works best for sites with the same features and functionalities. If your sites need different functionalities, it is better to set up each site independently.

For Every Enterprise

Realizing that the needs of every industry are different, Drupal has something for everyone.

Media and entertainment

Editing and Scalability

Media and entertainment websites worldwide use Drupal for their online platforms for seamless editing and scalability. The list of over one million organizations includes The Economist, ET Online, MTV (UK), The Grammys, The Emmys, Weather.com, The Beatles, and Warner Bros. Music.

Scalability is all about quantity: how many requests and how much information you can handle at any given time without breaking or bending. Supporting some of the world's most visited sites, Drupal is another name for scalability.

Allowing the easy content editing and management that media and entertainment websites look for, it provides it all through WYSIWYG editing with CKEditor, without any other weighty feature.

SaaS

Community solutions:

SaaS enterprises are using Drupal to build the platform for their product as well as a community to engage clients and followers. It is easy to develop the platform and then keep adding features in later phases.

Given that community platforms are one of the key needs of SaaS organizations, giving prospects a home and helping product and community grow alike, distributions like OpenSocial offer great help.

Zoho is one of the SaaS products using Drupal for its community platforms.

E-commerce

E-commerce functionalities

Providing easy payment gateways to conduct online transactions, Drupal ensures customer information passes seamlessly and remains safe.

Its core commerce payment module and distributions (Drupal Commerce and Commerce Kickstart) support the payment API for a smooth payment collection procedure through the checkout form.

Supporting PayPal Express Checkout and PayPal Credit along with Amazon Pay, it lets you reach a wider audience by letting shoppers complete checkout with the payment and shipping information stored in their Amazon accounts.

Tour and travel

For a potential traveler, your site shouldn’t look like just-another-information-brochure on the web. The need for an end-to-end solution to integrate all the minute details (from hotel booking to landing back) has never been greater.  

Booking Engine:

Drupal provides two of the best booking solutions for your website:

  • EasyBooking - Distribution
  • BAT - Module

A complete solution for your vacation portal, BAT allows you to build an exclusive booking engine for better customer relationship management. And EasyBooking gives your visitors a set of options to make room reservations, contact hotel administration, or just sign up for the hotel's newsletter to hear about special offers and discounts.

FMCG

Theming

A design which resonates with your brand, piques interest, and engages your visitors is what you should invest your resources in developing.

It’s the psychological effect which drives the visitor to make a transaction or to explore the possibilities offered throughout the interface. Every landing page matters.

Regardless of the products you showcase, Drupal themes provide sound navigation throughout categories and sections, with built-in hero banner sections and pop-ups that are fully customizable.

Additional modules can be used to build an industry-specific theme. To cope with varied demands, Drupal provides more than two thousand free, easy-to-use themes.

Government and Non-Government

Cost and Security:

In 2012, when the Georgian government shifted to Drupal, the first reason to dump its previous CMS (Vignette) was its rising maintenance costs. Running a total of 65 state websites on two different versions of that proprietary system proved costly in the long run.

Uncompromised security, another decisive factor for government websites, is why government organizations are opting for Drupal. Around 150 governments are already powered by it. Just like the Georgian government, many government and non-government agencies have found cost a significant factor in their choice.

Higher Education

Distributions:

To quickly build a higher-education website, distributions provide an easy path, halving development time and delivering features out of the box. Opigno and OpenEDU are two distributions widely used by higher-ed websites.

Drupal is the most widely used CMS in the education sector; no wonder top international universities like Harvard, Brown, Yale, Pennsylvania, and Columbia rely on it.

HealthCare and Life Sciences

Content and User access control:

It can conform to any workflow that can be programmed, with just a few configuration steps. You can identify different types of content, such as text, images, comments, file attachments, and any other information on your website, for easy content integration and management.

Drupal As an Enterprise Management System

The need for an intranet system cannot be emphasized enough. For your business to grow by leaps and bounds, it is necessary to establish clear communication within your organization.

As your business expands, so does the need for an intranet system to help store and share data. An ECMS differs from a web content management system in that the former is specifically designed for enterprises and is more dynamic.

Drupal allows building an ECMS in two ways: using its own modules and features, or through third-party integration. Its integration capabilities let the website serve as a central content management system connected with other necessary systems.

Drupal Is Easier To Manage

Drupal isn’t hard to use, but it can be hard to learn. Even though it requires more technical experience, it is capable of producing exceptionally advanced sites. A WYSIWYG editor and drag-and-drop functionality ease the process and help you start straight away.

The release of version 8 has made the platform easier to use even for non-developers (including content authors). Managing your website is easy, as the community provides the necessary documentation and answers in case you get stuck.

Summary

As one of the leading technologies in the market, Drupal gives your enterprise the features and flexibility to innovate according to your visitors' behavior and preferences.

We’d love to hear your thoughts. To get in touch, drop a mail at hello@opensenselabs.com and let us know how we can enhance your statistics with Drupal.

Categories: Drupal

Promet Source: Does an Accessibility Badge make my site WCAG 2.0 Compliant?

16 May 2018 - 9:25pm
Accessibility badges are gaining attention.
Categories: Drupal

Lullabot: Decoupled Drupal Hard Problems: Image Styles

16 May 2018 - 3:52pm

As part of the API-First Drupal initiative, and the Contenta CMS community effort, we have come up with a solution for using Drupal image styles in a decoupled setup. Here is an overview of the problems we sought to solve:

  • Image styles are tied to the designs of the consumer and therefore belong to the front-end. However, there are technical limitations in the front-end that make it impossible to handle them there.
  • Our HTTP API serves an unknown number of consumers, but we don't want to expose all image styles to all consumers for all images. Therefore, consumers need to declare their needs when making API requests.
  • The Consumers and Consumer Image Styles modules can solve these issues, but it requires some configuration from the consumer development team.
Image Styles Are Great

Drupal developers are used to the concept of image styles (aka image derivatives, image cache, resized images, etc.). We use them all the time because they are a way to optimize performance on our Drupal-rendered web pages. At the theme layer, the render system will detect the configuration on the image size and will crop it appropriately if the design requires it. We can do this because the back-end is informed of how the image is presented.

In addition to this, Drupal adds a token to the image style URLs. With that token, the Drupal server is saying: I know your design needs this image style, so I approve its use. This is needed to prevent a malicious user from filling up our disk by manually requesting every combination of image and image style. With this protection, only the combinations present in our designs are possible, because Drupal gives them its seal of approval. This is transparent to us, so our server is protected without us even realizing this was a risk.
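As a hedged sketch of the idea (not Drupal's exact algorithm - the real itok token is derived from the site's private key and hash salt), signing approved style/image combinations could look like this:

```python
import hmac
import hashlib

SECRET = b"site-private-key"  # stands in for Drupal's private key + hash salt

def derivative_token(style: str, uri: str) -> str:
    """Sign a (style, uri) pair so only approved combinations are served."""
    msg = f"{style}:{uri}".encode()
    # A short truncated digest is enough for illustration.
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:8]

def is_valid(style: str, uri: str, token: str) -> bool:
    """The server only generates the derivative if the token checks out."""
    return hmac.compare_digest(derivative_token(style, uri), token)

token = derivative_token("200x200", "public://boyFYUN8.png")
assert is_valid("200x200", "public://boyFYUN8.png", token)
# A different style with the same token is rejected, so an attacker
# cannot mint arbitrary style/image combinations.
assert not is_valid("800x600", "public://boyFYUN8.png", token)
```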

The monolithic architecture allows us to have the back-end informed about the design. We can take advantage of that situation to provide advanced features.

The Problem

In a decoupled application your back-end service and your front-end consumer are separated. Your back-end serves your content, and your front-end consumer displays and modifies it. Back-end and front-end live in different stacks and are independent of each other. In fact, you may be running a back-end that exposes a public API without knowing which consumers are using that content or how they are using it.

In this situation, we can see how our back-end doesn't know anything about the front-end(s) design(s). Therefore we cannot take advantage of the situation like we could in the monolithic solution.

The most intuitive solution would be to output all the available image styles when requesting images via JSON API (or core REST). This only works if we have a small set of consumers of our API and we know their designs. Imagine that our API serves three, and only three, consumers: A, B and C. If we did that, then when requesting an image from consumer A we would output all the variations for all the image styles for all the consumers. If each consumer has 10-15 image styles, that means 30-45 image style URLs, of which only one will be used.


This situation is not ideal, because a malicious user can still generate 45 images on our disk for each image available in our content. Additionally, if we add more consumers to our digital experience, the problem gets worse. Moreover, we don't want the presentation concerns of one consumer leaking into another. Finally, if we can't know the designs of all our consumers, this solution is not even on the table, because we don't know which image styles to add to our back-end.

On top of all these problems regarding the separation of concerns of front-end and back-end, there are several technical limitations to overcome. In the particular case of image styles, if we were to process the raw images in the consumer we would need:

  • An application runner able to do these operations. The browser is capable of this, but other more constrained devices are not.
  • Powerful hardware to compute image manipulations. APIs often serve content to hardware with limited resources.
  • A high-bandwidth environment. We would need to serve a very high-resolution image every time, even if the consumer will resize it to 100 x 100 pixels.

Given all this, we decided that this task was best suited to a server-side technology.

In order to solve this problem as part of the API-First initiative, we want a generic solution that works even in the worst case scenario. This scenario is an API served by Drupal that serves an unknown number of 3rd party applications over which we don't have any control.

How We Solved It

After some research about how other systems tackle this, we established that we need a way for consumers to declare their presentation dependencies. In particular, we want to provide a way to express the image styles that consumer developers want for their application. The requests issued by an iOS application will carry a token that identifies the consumer where the HTTP request originated. That way the back-end server knows to select the image styles associated with that consumer.
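To illustrate the idea from the consumer's side: the request simply carries an identifier for the consumer. The header name and UUID below are assumptions for illustration; the Consumers project page documents the actual mechanism for sending the consumer ID.

```python
from urllib.request import Request

def image_request(url: str, consumer_id: str) -> Request:
    # Tag the request so the back-end can select this consumer's image styles.
    # "X-Consumer-ID" is a hypothetical header name for this sketch.
    return Request(url, headers={"X-Consumer-ID": consumer_id})

req = image_request(
    "https://cms.contentacms.io/api/files/3802d937-d4e9-429a-a524-85993a84c3ed",
    "demo-consumer-uuid",
)
# urllib normalizes header capitalization when storing it.
assert req.get_header("X-consumer-id") == "demo-consumer-uuid"
```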


For this solution, we developed two different contributed modules: Consumers, and Consumer Image Styles.

The Consumers Project

Imagine for a moment that we are running Facebook's back-end. We defined the data model, we have created a web service to expose the information, and now we are ready to expose that API to the world. The intention is that any developer can join Facebook and register an application. In that application record, the developer does some configuration and tweaks some features so the back-end service can interact optimally with the registered application. As the managers of Facebook's web services, we cannot take special requests from every possible application. In fact, we don't even know which applications integrate with our service.

The Consumers module aims to replicate this feature. It is a centralized place where other modules can require information about the consumers. The front-end development teams of each consumer are responsible for providing that information.

This module adds an entity type called Consumer. Other modules can add fields to this entity type with the information they want to gather about the consumer. For instance:

  • The Consumer Image Styles module adds a field that allows consumer developers to list all the image styles their application needs.
  • Other modules could add fields related to authentication, like OAuth 2.0.
  • Others could gather information for analytics purposes.
  • Maybe even configuration to integrate with other 3rd party platforms, etc.
The Consumer Image Styles Project

Internally, the Consumers module takes a request containing the consumer ID and returns the consumer entity. That entity contains the list of image styles needed by that consumer. Using that list of image styles Consumer Image Styles integrates with the JSON API module and adds the URLs for the image after applying those styles. These URLs are added to the response, in the meta section of the file resource. The Consumers project page describes how to provide the consumer ID in your request.

{
  "data": {
    "type": "files",
    "id": "3802d937-d4e9-429a-a524-85993a84c3ed",
    "attributes": { … },
    "relationships": { … },
    "links": { … },
    "meta": {
      "derivatives": {
        "200x200": "https://cms.contentacms.io/sites/default/files/styles/200x200/public/boyFYUN8.png?itok=Pbmn7Tyt",
        "800x600": "https://cms.contentacms.io/sites/default/files/styles/800x600/public/boyFYUN8.png?itok=Pbmn7Tyt"
      }
    }
  }
}

To do that, Consumer Image Styles adds an additional normalizer for the image files. This normalizer adds the meta section with the image style URLs.
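A consumer can then pick the pre-approved derivative URL for the style it needs straight out of the meta section. A minimal Python sketch, using a trimmed version of the response shown above:

```python
import json

# Trimmed version of the JSON API file resource from the example response.
payload = json.loads("""
{
  "data": {
    "type": "files",
    "id": "3802d937-d4e9-429a-a524-85993a84c3ed",
    "meta": {
      "derivatives": {
        "200x200": "https://cms.contentacms.io/sites/default/files/styles/200x200/public/boyFYUN8.png?itok=Pbmn7Tyt",
        "800x600": "https://cms.contentacms.io/sites/default/files/styles/800x600/public/boyFYUN8.png?itok=Pbmn7Tyt"
      }
    }
  }
}
""")

def derivative_url(resource: dict, style: str, fallback=None):
    """Return the server-approved URL for a style, or a fallback if absent."""
    return resource["meta"].get("derivatives", {}).get(style, fallback)

url = derivative_url(payload["data"], "200x200")
assert "styles/200x200" in url
```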

Conclusion

We recommend having a strict separation between the back-end and the front-end in a decoupled architecture. However, there are some specific problems, like image styles, where the server needs to have some knowledge about the consumer. In these very few occasions the server should not implement special logic for any particular consumer. Instead, we should have the consumers add their configuration to the server.

The Consumers project will help you provide a unified way for app developers to include this information on the server. Consumer Image Styles and OAuth 2.0 are good examples where that is necessary, and examples of how to implement it.

Further Your Understanding

If you are interested in alternative ways to deal with image derivatives in a decoupled architecture, there are other options that may incur extra costs but are still worth checking out: Cloudinary, Akamai Image Converter, and Origami.

Note: This article was originally published on October 25, 2017. Following DrupalCon Nashville, we are republishing (with updates) some of our key articles on decoupled or "headless" Drupal as the community as a whole continues to explore this approach further. Comments from the original will appear unmodified.

Hero Image by Sadman Sakib. Also thanks to Daniel Wehner for his time spent on code and article reviews.

Categories: Drupal

TEN7 Blog's Drupal Posts: Episode 028: Exploring Flight Deck, Docker containers for Drupal development with Tess Flynn

16 May 2018 - 6:00am
Tess Flynn sits down with Ivan Stegic to discuss TEN7's Flight Deck, a set of Docker containers for local Drupal development. Flight Deck is lightweight, simple, and Docker-native, allowing you to stand up a local development environment quickly after installing Docker.
Categories: Drupal

Jacob Rockowitz: Our journeys within our community

16 May 2018 - 4:39am

To begin to address sustainability in Drupal and Open Source, it’s important to explore our journeys within the community. We need to examine how we work together to grow and build our software and community.

This is going to be one of the most challenging blog posts I have ever written because I am uncomfortable with the words: roles, maintainers, contributor and mentoring. All of these words help establish our Open Source projects and communities. Over the past two years, while working on the Webform module I have learned the value of how each of these aspects relates to one another and to our Open Source collaboration and community.

Why am I uncomfortable with these words?

I am uncomfortable with these words because my general mindset and work habit are very independent and individualistic, but living on this island does not work well when it comes to Open Source. And changing my mindset and habits are things that I know need to happen.

Like many programmers, I went to art school where I learned the importance of exploring and discovering one's individual creative process. Another thing I had in common with many people who went to art school - I needed to figure out how to make a living. I went to the Brooklyn Public Library and started surfing this new thing called the World Wide Web. I was curious, confident and intrigued enough to realize that this was something I could and wanted to do - I could get a job building websites.

I built my first website, http://jakesbodega.com, using MS FrontPage while reading the HTML Bible and tinkering on a computer in the basement of my folks’ big blue house. After six months of self-teaching, I got my first job coding HTML at a small company specializing in Broadway websites. Interestingly, with the boom of the Internet, everyone's roles were constantly changing as companies grew to accommodate more...Read More

Categories: Drupal

Axelerant Blog: DrupalCamp Mumbai 2018: A Recap

16 May 2018 - 12:46am


DrupalCamp Mumbai was held on 28th-29th April at IIT Bombay, bringing developers, students, managers, and organizations together and providing them the opportunity to interact, share knowledge, and help the community grow.

Categories: Drupal

Hook 42: April Accessibility (A11Y) Talks

15 May 2018 - 4:32pm

This month’s Accessibility Talk was an encore presentation of the panel’s Core Conversation at DrupalCon Nashville: Core Accessibility: Building Inclusivity into the Drupal Project
Helena McCabe, Catherine McNally, and Carie Fisher discussed the fundamentals of accessibility and how they can be injected further into the Drupal project. All three are accessibility specialists in their fields.

Categories: Drupal

Commerce Guys: Human Presence protects Drupal forms after Mollom

15 May 2018 - 3:01pm

On April 2, 2018, Acquia retired Mollom, a spam fighting tool built by Drupal founder Dries Buytaert. As Dries tells the story, Mollom was both a technical and financial success but was ultimately shut down to enable Acquia to deploy its resources more strategically. At its peak, Mollom served over 60,000 websites, including many of ours!

Many sites are looking for alternatives now that Mollom is shut down. One such service Commerce Guys integrated earlier this year in anticipation of Mollom's closing is Human Presence, a fraud prevention and form protection service that uses multiple overlapping strategies to fight form spam. In the context of Drupal, this includes protecting user registration and login forms, content creation forms, contact forms, and more.

Similar to Mollom, Human Presence evaluates various parameters of a visitor's session to decide if the visitor is a human or a bot. When a protected form is submitted, the Drupal module requests a "human presence" confidence rating from the API (hence the name), and if the response does not meet a configurable confidence threshold, it will block form submission or let you configure additional validation steps if you choose. For example, out of the box, the module integrates the CAPTCHA module to rebuild the submitted form with a CAPTCHA that must be completed before the form will submit.
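The decision flow described above can be sketched as follows; the function name and threshold value are illustrative, not part of the module's actual API:

```python
# Hedged sketch of the confidence-threshold gating described above.
# A real integration would get `confidence` from the Human Presence API.
def handle_submission(confidence: float, threshold: float = 0.7,
                      captcha_passed: bool = False) -> str:
    if confidence >= threshold:
        return "accept"        # confident enough that this is a human
    if captcha_passed:
        return "accept"        # fallback validation step succeeded
    return "block"             # likely a bot: block, or rebuild with a CAPTCHA

assert handle_submission(0.95) == "accept"
assert handle_submission(0.30) == "block"
assert handle_submission(0.30, captcha_passed=True) == "accept"
```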

We believe Human Presence is a great tool to integrate on its own or in conjunction with other standalone modules like Honeypot. Furthermore, they're joining other companies like Authorize.Net, Avalara, and PayPal as Drupal Commerce Technology Partners. Their integration includes support for protecting shopping cart and checkout forms, and we are looking for other ways they can help us combat payment fraud in addition to spam.

Learn more about Human Presence or reach the company's support engineer through their project page on drupal.org.

Categories: Drupal

Acquia Developer Center Blog: Decoupling Drupal 8 with JSON API

15 May 2018 - 8:51am

As we saw in the previous post, core REST only allows individual entities to be retrieved, and Views REST exports only permit GET requests, not unsafe methods. But application developers often need greater flexibility and control, such as the ability to fetch collections, sort and paginate them, and access related entities that are referenced.

In this column, we'll inspect JSON API, part of the contributed web services ecosystem surrounding Drupal 8, which provides more extensive features relevant to application developers, including relationships and complex operations such as sorting and pagination.
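As a taste of that flexibility, here is a hedged sketch of building a JSON API collection request with sorting, pagination, and an included relationship. The base URL is hypothetical; the query parameter names follow the JSON API specification.

```python
from urllib.parse import urlencode

BASE = "https://example.com/jsonapi/node/article"  # hypothetical site

# Standard JSON API query parameters for a sorted, paginated collection.
query = urlencode({
    "sort": "-created",     # newest first
    "page[limit]": "10",    # ten items per page
    "page[offset]": "20",   # skip the first two pages
    "include": "uid",       # embed the referenced author entity
})
url = f"{BASE}?{query}"
assert "sort=-created" in url
assert "page%5Blimit%5D=10" in url   # brackets are percent-encoded
```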

Tags: acquia drupal planet
Categories: Drupal

Virtuoso Performance: Importing specific fields with overwrite_properties

15 May 2018 - 8:50am
Importing specific fields with overwrite_properties mikeryan Tuesday, May 15, 2018 - 10:50am

While I had planned to stretch out my posts related to the "Acme" project, there are currently some people with questions about using overwrite_properties - so, I've moved this post forward.

By default, migration treats the source data as the system of record - that is, when reimporting previously-imported records, the expectation is to completely replace the destination side with fresh source data, discarding any interim changes which might have been made on the destination side. However, sometimes, when updating you may want to only pull specific fields from the source, leaving others (potentially manually-edited) intact. We had this situation with the event feed - in particular, the titles received from the feed may need to be edited for the public site. To achieve that, we used the overwrite_properties property on the destination plugin:

destination:
  plugin: 'entity:node'
  overwrite_properties:
    - 'field_address/address_line1'
    - 'field_address/address_line2'
    - 'field_address/locality'
    - 'field_address/administrative_area'
    - 'field_address/postal_code'
    - field_start_date
    - field_end_date
    - field_instructor
    - field_location_name
    - field_registration_price
    - field_remaining_spots
    - field_synchronized_title

When overwrite_properties is present, nothing changes when importing a new entity - but, if the destination entity already exists, the existing entity is loaded, and only the fields and properties enumerated in overwrite_properties will be, well, overwritten. In our example, note in particular field_synchronized_title - on initial import, both the regular node title and this field are populated from ClassName, but on updates only field_synchronized_title receives any changes in ClassName. This prevents any unexpected changes to the public title, but does make the canonical title from the feed available should an editor care to review and decide whether to modify the public title to reflect any changes.

Now, in this case we are creating the entities initially through this migration, so we know via the map table when a previously-migrated entity is being updated and overwrite_properties should be applied. Another use case is when the entire purpose of your migration is to update specific fields on pre-existing entities (i.e., entities not created by this migration). In that case, you need to map the IDs of the entities to be updated, otherwise the migration will simply create new entities. So, if you had a "nid_to_update" property in your source data, you would include

process:
  nid: nid_to_update

in your migration configuration. The destination plugin will then load that existing node and alter only the specified overwrite_properties in it.
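The update semantics can be sketched in a few lines - this is a simplified illustration of the behavior, not the destination plugin's actual implementation:

```python
# Only the enumerated properties are copied from the fresh source row onto
# the existing destination entity; everything else is left intact.
def apply_overwrite_properties(existing: dict, source: dict, props: list) -> dict:
    updated = dict(existing)
    for path in props:
        target, src = updated, source
        *parents, leaf = path.split("/")   # e.g. 'field_address/locality'
        for key in parents:
            target = target.setdefault(key, {})
            src = src[key]
        target[leaf] = src[leaf]
    return updated

node = {"title": "Hand-edited title", "field_synchronized_title": "Old feed title"}
row = {"title": "Feed title", "field_synchronized_title": "Feed title"}
result = apply_overwrite_properties(node, row, ["field_synchronized_title"])
assert result["title"] == "Hand-edited title"            # manual edit preserved
assert result["field_synchronized_title"] == "Feed title"  # feed field refreshed
```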

Tags: Drupal Planet, Drupal, Migration

Use the Twitter thread below to comment on this post:

Importing specific fields with overwrite_properties https://t.co/0H3W1Ll0ts

— Virtuoso Performance (@VirtPerformance) May 15, 2018

 

Categories: Drupal

Virtuoso Performance: Drupal 8 migration from a SOAP API

15 May 2018 - 8:12am
Drupal 8 migration from a SOAP API mikeryan Tuesday, May 15, 2018 - 10:12am

Returning from my sabbatical, as promised I’m catching up on blogging about previous projects. For one such project, I was contracted by Acquia to provide migration assistance to a client of theirs [redacted, but let’s call them Acme]. This project involved some straightforward node migrations from CSV files, but more interestingly required implementing two ongoing feeds to synchronize external data periodically - one a SOAP feed, and the other a JSON feed protected by OAuth-based authentication. There were a number of other interesting techniques employed on this project which I think may be broadly useful and haven’t previously blogged about - all-in-all, there was enough to write about on this project that rather than compose one big epic post, I’m going to break things down in a series of posts, spread out over several days so as not to spam Planet Drupal. In this first post of the sequence, I’ll cover migration from SOAP. The full custom migration module for this project is on Gitlab.

A key requirement of the Acme project was to implement an ongoing feed, representing classes (the kind people attend in person, not the PHP kind), from a SOAP API to “event” nodes in Drupal. The first step, of course, was to develop (in migrate_plus) a parser plugin to handle SOAP feeds, based on PHP’s SoapClient class. This class exposes functions of the web service as class methods which may be directly invoked. In WSDL mode (the default, and the only mode this plugin currently supports), it can also report the signatures of the methods it supports (via __getFunctions()) and the data structures passed as parameters and returned as results (via __getTypes()). WSDL allows our plugin to do introspection and saves the need for some explicit configuration (in particular, it can automatically determine the property to be returned from within the response).

migrate_example_advanced (a submodule of migrate_plus) demonstrates a simple example of how to use the SOAP parser plugin - the .yml is well-documented, so please review that for a general introduction to the configuration. Here’s the basic source configuration for this specific project:

source:
  plugin: url
  # To remigrate any changed events.
  track_changes: true
  data_fetcher_plugin: http  # Ignored - SoapClient does the fetching itself.
  data_parser_plugin: soap
  # The method to invoke via the SOAP API.
  function: GetClientSessionsByClientId
  # Within the response, the object property containing the list of events.
  item_selector: SessionBOLExternal
  # Indicates that the response will be in the form of a PHP object.
  response_type: object
  # You won’t find ‘urls’ and ‘parameters’ in the source .yml file (they are inserted
  # by a web UI - the subject of a future post), but for demonstration purposes
  # this is what they might look like.
  urls: http://services.example.com/CFService.asmx?wsdl
  parameters:
    clientId: 1234
    clientCredential:
      ClientID: 1234
      Password: service_password
    startDate: 08-31-2016
  # Unique identifier for each event (section) to be imported, composed of 3 columns.
  ids:
    ClassID:
      type: integer
    SessionID:
      type: integer
    SectionID:
      type: integer
  fields:
    - name: ClientSessionID
      label: Session ID for the client
      selector: ClientSessionID
  ...

Of particular note is the three-part source ID defined here. The way this data is structured, a “class” contains multiple “sessions”, which each have multiple “sections” - the sections are the instances that have specific dates and times, which we need to import into event nodes, and we need all three IDs to uniquely identify each unique section.
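As a rough illustration of why all three IDs are needed (in Python, with made-up values), the composite key behaves like a tuple of the three IDs:

```python
# Sketch of how a three-part source ID uniquely keys each section,
# mirroring the ids: configuration above. Values are invented.
def section_key(row: dict) -> tuple:
    return (row["ClassID"], row["SessionID"], row["SectionID"])

rows = [
    {"ClassID": 1, "SessionID": 10, "SectionID": 100},
    {"ClassID": 1, "SessionID": 10, "SectionID": 101},  # same session, new section
]
keys = {section_key(r) for r in rows}
assert len(keys) == 2   # each section is a distinct migration row
```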

Not all of the data we need for our event nodes is in the session feed, unfortunately - we also want to capture some of the class-level data. So, while the base migration uses the SOAP parser plugin to get the session rows to migrate, we need to fetch the related data at run time by making direct SOAP calls ourselves. We do this in our subscriber to the PREPARE_ROW event - this event is dispatched after the source plugin has obtained the basic data per its configuration, and gives us an opportunity to retrieve further data to add to the canonical source row before it enters the processing pipeline. I won’t go into detail on how that data is retrieved since it isn’t relevant to general migration principles, but the idea is: since the full set of class data is not prohibitively large, and multiple sessions may reference the same class data, we fetch it all when the first source row is processed and cache it for reference by subsequent rows.
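The caching strategy can be sketched like this - a hedged illustration where fetch_all_classes() stands in for the direct SOAP call:

```python
# Fetch all class-level data once, on the first row processed, then enrich
# every subsequent row from the in-memory cache. Data values are invented.
_class_cache = None

def fetch_all_classes() -> dict:
    # Pretend SOAP call; returns class-level data keyed by ClassID.
    return {1: {"ClassName": "Welding 101"}, 2: {"ClassName": "Pottery"}}

def prepare_row(row: dict) -> dict:
    global _class_cache
    if _class_cache is None:            # only the first row pays for the fetch
        _class_cache = fetch_all_classes()
    row.update(_class_cache[row["ClassID"]])
    return row

assert prepare_row({"ClassID": 1})["ClassName"] == "Welding 101"
assert prepare_row({"ClassID": 2})["ClassName"] == "Pottery"  # served from cache
```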

Community contributions

SOAP Source plugin - Despite the title (from the original feature request), it was implemented as a parser plugin.

Altering migration configuration at import time - the PRE_IMPORT event

Our event feed permits filtering by the event start date - by passing a ‘startDate’ parameter in the format 12-31-2016 to the SOAP method, the feed will only return events starting on or after that date. At any given point in time we are only interested in future events, and don’t want to waste time retrieving and processing past events. To optimize this, we want the startDate parameter in our source configuration to be today’s date each time we run the migration. We can do this by subscribing to the PRE_IMPORT event.

In acme_migrate.services.yml:

services:
  ...
  acme_migrate.update_event_filter:
    class: Drupal\acme_migrate\EventSubscriber\UpdateEventFilter
    tags:
      - { name: event_subscriber }

In UpdateEventFilter.php:

class UpdateEventFilter implements EventSubscriberInterface {

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    $events[MigrateEvents::PRE_IMPORT] = 'onMigrationPreImport';
    return $events;
  }

The migration system dispatches the PRE_IMPORT event before the actual import begins executing. At that point, we can insert the desired date filter into the migration configuration entity and save it:

  /**
   * Set the event start date filter to today.
   *
   * @param \Drupal\migrate\Event\MigrateImportEvent $event
   *   The import event.
   */
  public function onMigrationPreImport(MigrateImportEvent $event) {
    // $event->getMigration() returns the migration *plugin*.
    if ($event->getMigration()->id() == 'event') {
      // Migration::load() returns the migration *entity*.
      $event_migration = Migration::load('event');
      $source = $event_migration->get('source');
      $source['parameters']['startDate'] = date('m-d-Y');
      $event_migration->set('source', $source);
      $event_migration->save();
    }
  }

Note that the entity get() and set() functions only operate directly on top-level configuration properties - we can’t get and set, for example, ‘source.parameters.startDate’ directly. We need to retrieve the entire source configuration, modify our one value within it, and set the entire source configuration back on the migration.

Tags: Drupal Planet, Drupal, Migration

Use the Twitter thread below to comment on this post:

Drupal 8 migration from a SOAP API https://t.co/hf8LGiATsh

— Virtuoso Performance (@VirtPerformance) May 15, 2018
Categories: Drupal

Web Wash: Managing Media Assets using Core Media in Drupal 8

15 May 2018 - 8:00am

There's a lot of momentum to fix media management in Drupal 8 thanks to the Media Entity module. By using a combination of Media Entity, Entity Embed, Entity Browser and some media providers such as Media entity image, you could add decent media handling in Drupal 8.

Then in Drupal 8.4, the Media Entity functionality was moved into a core module called Media. However, the core module was hidden by default. Now in Drupal 8.5 it's no longer hidden and you can install it yourself.

In this tutorial, you'll learn how to install and configure the Media module in Drupal 8 core. This tutorial is an updated version of the How to Manage Media Assets in Drupal 8 tutorial where we cover Media Entity.

Configuring Entity Embed and Entity Browser for the core Media module is essentially the same as with Media Entity. So if you have experience using Media Entity, then you'll be fine using the core Media module.

Categories: Drupal

Hook 42: Giddy Up! Hook 42 Moseys over to Texas Drupal Camp

15 May 2018 - 7:52am

Dust off your saddle and get prepared to optimize your workflow. There is a lot packed into 3 days in Austin. Pull on your chaps, fasten your leathers, dig in your spurs and head on over to Texas Drupal Camp. On Thursday, make sure you check out the trainings and sprints. On Friday and Saturday, catch all of the keynotes and sessions.

Our own Ryan Bateman will be at Texas Drupal Camp to share his presentation about visual regression testing.

Texas Drupal Camp is Thursday, May 31st through Saturday, June 2nd at the Norris Conference Center in beautiful Austin, TX.

Categories: Drupal

Valuebound: Drupal 8 - Extending module using Plugin Manager

15 May 2018 - 12:56am

We often write and contribute modules, but have you ever considered how a module's features can be extended? In Drupal 8, we can make our modules extendable by using the Plugin Manager. For this, you first need to know what a plugin and a plugin type are, and how they work. Have a look.

So what is a Plugin?

In short, a plugin is a small piece of swappable functionality.

What is a Plugin Type?

A plugin type is a categorization or grouping of plugins that perform similar functionality. The Drupal 8 plugin system has three base elements:

  1. Plugin Types

    The central controlling class that defines the ways plugins of this type will be discovered, instantiated and…
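The general pattern - a central manager that discovers and instantiates swappable plugins - can be sketched in a framework-agnostic way. This is an illustration of the concept, not Drupal's actual plugin API:

```python
# A minimal plugin manager: one instance per plugin type, with
# registration standing in for Drupal's annotation-based discovery.
class PluginManager:
    def __init__(self):
        self._plugins = {}

    def register(self, plugin_id):
        def decorator(cls):
            self._plugins[plugin_id] = cls   # "discovery" by registration
            return cls
        return decorator

    def create_instance(self, plugin_id, **config):
        return self._plugins[plugin_id](**config)

breakfast = PluginManager()   # one plugin type: breakfast plugins

@breakfast.register("toast")
class Toast:
    def __init__(self, slices=1):
        self.slices = slices

item = breakfast.create_instance("toast", slices=2)
assert item.slices == 2
```

Swapping in another plugin of the same type is just another registered class; callers only ever deal with the manager and a plugin ID.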

Categories: Drupal

Joachim's blog: The quick and dirty debug module

14 May 2018 - 11:28pm

There's a great module called the debug module. I'd give you the link… but it doesn't exist. Or rather, it's not a module you download. It's a module you write yourself, and write again, over and over again.

Do you ever want to inspect the result of a method call, or the data you get back from a service, the result of a query, or the result of some other procedure, without having to wade through the steps in the UI, submit forms, and so on?

This is where the debug module comes in. It's just a single page which outputs whatever code you happen to want to poke around with at the time. On Drupal 8, that page is made with:

  • an info.yml file
  • a routing file
  • a file containing the route's callback. You could use a controller class for this, but it's easier to have the callback just be a plain old function in the module file, as there's no need to drill down a folder structure in a text editor to reach it.

(You could quickly whip this up with Module Builder!)

Here's what my router file looks like:

joachim_debug:
  path: '/joachim-debug'
  defaults:
    _controller: 'joachim_debug_page'
  options:
    _admin_route: TRUE
  requirements:
    _access: 'TRUE'

My debug module is called 'joachim_debug'; you might want to call yours something else. Here you can see we're granting access unconditionally, so that whichever user I happen to be logged in as (or none) can see the page. That's of course completely insecure, especially as we're going to output all sorts of internals. But this module is only meant to be run on your local environment and you should on no account commit it to your repository.

I don't want to worry about access, and I want the admin theme so the site theme doesn't get in the way of debug output or affect performance.

The module file starts off looking like this:

opcache_reset();

function joachim_debug_page() {
  $build = [
    '#markup' => "aaaaarrrgh!!!!",
  ];

  /*
  // ============================ TEMPLATE

  return $build;
  */

  return $build;
}

The commented-out section is there for me to quickly copy and paste a new section of code anytime I want to do something different. I always leave the old code in below the return, just in case I want to go back to it later on, or copy-paste snippets from it.

Back in the Drupal 6 and 7 days, the return of the callback function was merely a string. On Drupal 8, it has to be a proper render array. The return text used to be 'It's going wrong!' but these days it's the more expressive 'aaaaarrrgh'. Most of the time, the output I want will be the result of dsm() call, so the $build is there just so Drupal's routing system doesn't complain about a route callback not returning anything.

Here are some examples of the sort of code I might have in here.

// ============================ Route provider
$route_provider = \Drupal::service('router.route_provider');
$path = 'node/%/edit';
$rs = $route_provider->getRoutesByPattern($path);
dsm($rs);

return $build;

Here I wanted to see what the route provider service returns. (I have no idea why; this is just something I found in the very long list of old code in my module's menu callback, pushed down by newer stuff.)

// ============================ order receipt email
$order = entity_load('commerce_order', 3);
$build = [
  '#theme' => 'commerce_order_receipt',
  '#order_entity' => $order,
  '#totals' => \Drupal::service('commerce_order.order_total_summary')->buildTotals($order),
];

return $build;

I wanted to work with the order receipt emails that Commerce sends. But I don't want to have to make a purchase, complete an order, and then look in the mail log just to see the email! This is quicker: all I have to do is load up my debug module's page (mine is at the path 'joachim-debug', which is easy for me to remember; you might want to have yours somewhere else), and vavoom, there's the rendered email. I can tweak the template, change the order, and just reload the page to see the effect.

As you can see, it's quick and simple. There are no safety checks, so if you ever put code here that does something destructive (such as an entity_delete() - useful for quickly deleting entities in bulk), be sure to comment out the code once you're done with it, or your next reload might blow up! And of course, it's only ever to be used on your local environment; never on shared development sites, and certainly never on production!

I once read that a crucial requirement for programming, and more specifically for ease of learning to program with a language or framework, is being able to see and understand the outcomes of the code you are writing. In Drupal 8 more than ever, being able to understand the systems you're working with is vital. There are tools such as debuggers and the Devel and Devel Contrib modules' information pages, but sometimes quick and dirty does the job too.

Categories: Drupal

AddWeb Solution: Reasons To Prove Why Drupal Commerce Is Best Choice For Ecommerce Website

14 May 2018 - 11:24pm

The concept of a global village is getting more and more real with the advancement of the 'online' world, and online shops play a major part in this advancement. But along with the growing need to build online stores, the number of platforms for building them has also grown.

Here’s where our experience and expertise come into the picture. After 500+ man-hours spent building 10+ eCommerce websites, we’ve come to the conclusion that Drupal is indeed the best choice for building an eCommerce website. So here are 11 realistic reasons to guide you in choosing the best platform for your eCommerce website, which is undoubtedly Drupal Commerce.

 

1. An Array of Inbuilt Features 
Drupal comes preloaded with all the features required for building an eCommerce website, viz. a product management system, payment modes, cart management, et al.

 

2. Time-Saving 
Development time is reduced, since the time spent first developing and then custom-integrating two separate systems is eliminated.
 

3. SEO Friendly 
Drupal is SEO friendly and hence helps your website rank higher in search engines.

 

4. Negligible Traffic Issues 
Heavy traffic is never an issue with Drupal, since it is backed by a robust system for handling traffic.
 

5. Social Media Integration 
Social media platforms like Facebook, Instagram, Twitter, LinkedIn, etc. come pre-integrated with Drupal. 

 

6. High on Security 
Drupal is strong on security and comes with inbuilt solutions for securing the data/information on your website. 

 

7. Super Easy Data Management 
Data management becomes easy with Drupal, since it is the best content management system. 

 

8. Feasible for eCommerce Websites
It is easy to build and run a Drupal-based eCommerce website, whether for a small enterprise or a large business house. 

 

9. Inbuilt Plugins for Visitor Analysis  
The inbuilt plugins for visitor reporting and analytics help you easily evaluate your website without any external support. 

 

10. Customization
Drupal is flexible enough to make your website a customized one. 

 

11. Every Single Line of Code is Free!
Drupal firmly believes in maintaining the integrity at the core of the Open Source community, where nothing is chargeable and every line of code is free for everyone to use. 


And you thought we were trying to sell it just because ‘We Drupal Everyday’? Well, good that now you’re aware of the selfless efforts we make to solve your tech-related confusions! We at AddWeb are Friends of Drupal Development.

Categories: Drupal

Chapter Three: Introducing React Comments

14 May 2018 - 10:57am

Commenting system giant Disqus powers reader conversations on millions of sites, including large publishers like Rolling Stone and the Atlantic. So when Disqus quietly introduced ads into their free plans last year, there was some understandable frustration.

Why did @disqus just add a bunch of ads to my site without my permission? https://t.co/CzXTTuGs67 pic.twitter.com/y2QbFFzM8U

— Harry Campbell (@TheRideshareGuy) February 1, 2017

 

Categories: Drupal

CTI Digital: NWDUG Drupal Contribution Sprints

14 May 2018 - 9:48am

Last weekend I attended my first ever Drupal Sprint organised by NWDUG.

Categories: Drupal
