Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

Gizra.com: A Simple Decoupled Site with Drupal 8 and Elm

21 May 2018 - 10:00pm

This is going to be a simple exercise to create a decoupled site using Drupal 8 as the backend and an Elm app in the frontend. I pursue two goals with this:

  • Evaluate how easy it will be to use Drupal 8 to create a RESTful backend.
  • Show a little bit of how to set up a simple project with Elm.

We will implement a very simple functionality. On the backend, just a feed of blog posts with no authentication. On the frontend, we will have a list of blog posts and a page to visualize each post.

Our first step will be the backend.

Before we start, you can find all the code I wrote for this post in this GitHub repository.

Drupal 8 Backend

For the backend, we will use Drupal 8 and the JSON API module to create the API that we will use to feed the frontend. The JSON API module follows the JSON API specification and can currently be found in a contrib project. But as announced in the last DrupalCon Driesnote, the goal is to move it into core as an experimental module in the Drupal 8.6.x release.

But even before that, we need to set up Drupal in a way that is easy to version and to deploy. For that, I have chosen to go with the Drupal Project composer template. This template has become one of the standards for site development with Drupal 8 and it is quite simple to set up. If Composer is already installed, then it is as easy as this:

composer create-project drupal-composer/drupal-project:8.x-dev server --stability dev --no-interaction

This will create a folder called server with our code structure for the backend. Inside this folder, we now have a web folder, which is where we have to point our webserver, and it is also where we have to put all of our custom code. For this case, we will try to keep the custom code as minimal as possible. Drupal Project also comes with the two best friends of Drupal 8 development: Drush and Drupal Console. If you don’t know them, Google them to find out more about what they can do.

After installing our site, we need to install our first dependency, the JSON API module. Again, this is quite easy, inside the server folder, we run the next command:

composer require drupal/jsonapi:2.x

This will accomplish two things: it will download the module and it will add it to the composer files. If we are versioning our site on git, we will see that the module does not appear on the repo, as all vendors are excluded using the gitignore provided by default. But we will see that it has been added to the composer files. That is what we have to commit.

With the JSON API module downloaded, we can move back to our site and start with site building.

Configuring Our Backend

Let’s try to keep it as simple as possible. For now, we will use a single content type that we will call blog and it will contain as little configuration as possible. As we will not use Drupal to display the content, we do not have to worry about the display configuration. We will only have the title and the body fields on the content type, as Drupal already holds the creation and author fields.

By default, the JSON API module already generates the endpoints for the Drupal entities and that includes our newly created blog content type. We can check all the available resources: if we access the /jsonapi path, we will see all the endpoints. This path is configurable, but it defaults to jsonapi and we will leave it as is. So, with a clean installation, these are all the endpoints we can see:

JSON API default endpoints
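
As a rough illustration, with the default configuration each entity collection lives under a path of the form /jsonapi/{entity_type}/{bundle}, so our blog nodes would be reachable with something like this (the host name here is just the local site used later in this post):

curl -H "Accept: application/vnd.api+json" http://drelm.local/jsonapi/node/blog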

But, for our little experiment, we do not need all those endpoints. I prefer to only expose what is necessary, no more and no less. The JSON API module provides zero configurable options on the UI out of the box, but there is a contrib module that allows us to customize our API. This module is JSON API Extras:

composer require drupal/jsonapi_extras:2.x

JSON API Extras offers us a lot of options, from disabling an endpoint to changing the path used to access it, or renaming the exposed fields or even the resource. Quite handy! After some tweaking, I disabled all the unnecessary resources and most of the fields from the blog content type, reducing it to just the few we will use:

JSONAPI blog resource

Feel free to play with the different options. You will see that you are able to leave the API exactly as you need.

Moving Our Configuration to Version Control

If you have experience with Drupal 7, you probably used the Features module to export configuration to code. But one of the biggest improvements of Drupal 8 is the Configuration Management Interface (CMI). This system provides a generic engine to export all configuration to YAML files. But even if this system works great, it is still not the most intuitive or easiest way to export the config. Using it as a base, there are now several options that expand the functionality of CMI and provide an improved developer experience. The two biggest players in this game are Config Split (https://www.drupal.org/project/config_split) and the good old Features (https://www.drupal.org/project/features).
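
For reference, a plain CMI export is just one command away with Drush; assuming Drush 9 (as shipped with Drupal Project), something like this writes all active configuration to the sync directory as YAML:

drush config:export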

Both options are great, but I decided to go with my old friend Features (maybe because I’m used to its UI). The first step is to download the module:

composer require drupal/features:3.x

One of the really cool functionalities that the Drupal 8 version of the Features module brings is that it can instantly create an installation profile with all our custom configuration. With just a few clicks we have exported all the configuration we did in the previous steps; but not only that, we have also created an installation profile that will allow us to replicate the site easily. You can read more about Features in the official documentation on drupal.org (https://www.drupal.org/docs/8/modules/features/building-a-distribution-with-features-3x).

Now we have the basic functionality of the backend. There are some things we should still do, such as restricting access to the backend interface to prevent login or registration on the site, but we will not cover that in this post. Now we can move to the next step: the Elm frontend.

Sidenote

I used Features in this project to give it a try and play a bit. If you are trying to create a real project, you might want to consider other options. Even the creators of the Features module suggest not using it for this kind of situation, as you can read here.

The Frontend

As mentioned, we will use Elm to write this app. If you do not know what it is, Elm is a pure functional language that compiles to JavaScript and is used to create reliable webapps.

Installing Elm is easy. You can build it from source, but the easiest and recommended way is to just use npm. So let’s do it:

npm install -g elm

Once we install Elm, we get four different commands:

  • elm-repl: an interactive Elm shell, that allows us to play with the language.
  • elm-reactor: an interactive development tool that automatically compiles our code and serves it on the browser.
  • elm-make: to compile our code and build the app we will upload to the server.
  • elm-package: the package manager to download or publish elm packages.

For this little project, we will mostly use elm-reactor to test our app. We can begin by starting the reactor and accessing it on the browser. Once we do that, we can start coding.

elm-reactor

Our First Elm Program

If you wish to make an apple pie from scratch, you must first invent the universe. (Carl Sagan)

We start by creating a src folder that will contain all our Elm code, and here we start the reactor with elm-reactor. If we go to our browser and access http://localhost:8000, we will see our empty folder. Time to create a Main.elm file in it. This file will be the root of our codebase and everything will grow from here. We can start with the simplest of all Elm programs:

module Main exposing (main)

import Html exposing (text)


main =
    text "Hello world"

This might seem simple, but when we access the Main.elm file in the reactor, there is some magic going on. The first thing we will notice is that we now have a page working. It is simple, but it is an HTML page generated with Elm. But that’s not the only thing that happened. In the background, elm-reactor noticed we imported the Html package, created an elm-package.json file, added the package as a dependency and downloaded it.

This might be a good moment to do the first commit of our app. We do not want to include the vendor packages from Elm, so we create a .gitignore file and add the elm-stuff folder there. Our first commit will include only three things: the Main.elm file, the .gitignore and the elm-package.json file.
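
For reference, the ignore file only needs the single entry mentioned above to keep those packages out of the repository:

elm-stuff/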

The Elm Architecture

Elm is a language that follows a strict pattern called The Elm Architecture (https://guide.elm-lang.org/architecture/). We can summarize it in three simple components:

  • Model, which represents the state of the application.
  • Update, how we update our application.
  • View, how we represent our state.

Given our small app, let’s try to represent our code with this pattern. Right now, our app is static and has no functionality at all, so there are not a lot of things to do. But, for example, we could start moving the text we show on the screen to the model. The view will be the content we have on our main function, and as our page has no functionality, the update will do nothing at this stage.

type alias Model =
    String


model : Model
model =
    "Hello world"


view : Model -> Html Msg
view model =
    text model


main =
    view model

Now, for our blog, we need two different pages. The first one will be the listing of blog posts and the second one, a page to visualize an individual post. To simplify, let’s keep the blog entries as just a string for now. Our model will evolve into a list of Posts. In our state, we also need to store which page we are on. Let’s create a variable to store that information and add it to our model:

type alias Model =
    { posts : List Post
    , activePage : Page
    }


type alias Post =
    String


type Page
    = BlogList
    | Blog


model : Model
model =
    { posts = [ "First blog", "Second blog" ]
    , activePage = BlogList
    }

And we need to update our view too:

view : Model -> Html Msg
view model =
    div [] (List.map viewPost model.posts)


viewPost : Post -> Html Msg
viewPost post =
    div [] [ text post ]

We now have the possibility to create multiple pages! We can create our update function that will modify the model based on the different actions we do on the page. Right now, our only action will be navigating the app, so let’s start there:

type Msg = NavigateTo Page

And now, our update will update the activePage of our model, based on this message:

update : Msg -> Model -> ( Model, Cmd Msg )
update msg model =
    case msg of
        NavigateTo page ->
            ( { model | activePage = page }, Cmd.none )

Our view should be different now depending on the active page we are viewing:

view : Model -> Html Msg
view model =
    case model.activePage of
        BlogList ->
            viewBlogList model.posts

        Blog ->
            div [] [ text "This is a single blog post" ]


viewBlogList : List Post -> Html Msg
viewBlogList posts =
    div [] (List.map viewPost posts)

Next, let’s wire the update into the rest of the code. First, the views fire the message that changes the page:

viewPost post = div [ onClick <| NavigateTo Blog ] [ text post ]

And as a last step, we replace the main function with a more complex function from the Html package (but still a beginner program):

main : Program Never Model Msg
main =
    beginnerProgram
        { model = model
        , view = view
        , update = update
        }

But we still have not properly represented the single blogs on their individual pages. We will have to update our model once again along with our definition of Page:

type alias Model =
    { posts : Dict PostId Post
    , activePage : Page
    }


type alias PostId =
    Int


type Page
    = BlogList
    | Blog PostId


model : Model
model =
    { posts = Dict.fromList [ ( 1, "First blog" ), ( 2, "Second blog" ) ]
    , activePage = BlogList
    }

And with some minor changes, we have the views working again:

view : Model -> Html Msg
view model =
    case model.activePage of
        BlogList ->
            viewBlogList model.posts

        Blog postId ->
            div [ onClick <| NavigateTo BlogList ] [ text "This is a single blog post" ]


viewBlogList : Dict PostId Post -> Html Msg
viewBlogList posts =
    div [] (Dict.map viewPost posts |> Dict.values)


viewPost : PostId -> Post -> Html Msg
viewPost postId post =
    div [ onClick <| NavigateTo <| Blog postId ] [ text post ]

We do not yet see any change on our site, but we are ready to replace the placeholder text of the individual pages with the content of the real Post. And here comes one of the cool functionalities of Elm, and one of the reasons why Elm has no runtime exceptions. We have a postId and we can get the Post from the list of posts we have in our model. But when getting an item from a Dict, we always risk trying to get a non-existing item. If we call a function on this non-existing item, it usually causes errors, like the infamous undefined is not a function. In Elm, if a function may or may not return a value, it returns a special type called Maybe.

view : Model -> Html Msg
view model =
    case model.activePage of
        BlogList ->
            viewBlogList model.posts

        Blog postId ->
            let
                -- This is our Maybe variable. It could be annotated as `Maybe Post` or a full definition as:
                -- type Maybe a
                --     = Just a
                --     | Nothing
                post =
                    Dict.get postId model.posts
            in
                case post of
                    Just aPost ->
                        div [ onClick <| NavigateTo BlogList ] [ text aPost ]

                    Nothing ->
                        div [ onClick <| NavigateTo BlogList ] [ text "Blog post not found" ]

Loading the Data from the Backend

We have all the functionality ready, but we have to do something else before loading the data from the backend. We have to update our Post definition to match the structure of the backend. On the Drupal side, we left a simple blog data structure:

  • ID
  • Title
  • Body
  • Creation date

Let’s update the Post, replacing it with a record to contain those fields. After the change, the compiler will tell us where else we need to adapt our code. For now, we will not care about dates and we will just take the created field as a string.

type alias Post =
    { id : PostId
    , title : String
    , body : String
    , created : String
    }


model : Model
model =
    { posts = Dict.fromList [ ( 1, firstPost ), ( 2, secondPost ) ]
    , activePage = BlogList
    }


firstPost : Post
firstPost =
    { id = 1
    , title = "First blog"
    , body = "This is the body of the first blog post"
    , created = "2018-04-18 19:00"
    }

Then, the compiler shows us where we have to change the code to make it work again:

Elm compiler helps us find the errors

-- In the view function:
case post of
    Just aPost ->
        div []
            [ h2 [] [ text aPost.title ]
            , div [] [ text aPost.created ]
            , div [] [ text aPost.body ]
            , a [ onClick <| NavigateTo BlogList ] [ text "Go back" ]
            ]


-- And improve a bit the `viewPost`, becoming `viewPostTeaser`:
viewBlogList : Dict PostId Post -> Html Msg
viewBlogList posts =
    div [] (Dict.map viewPostTeaser posts |> Dict.values)


viewPostTeaser : PostId -> Post -> Html Msg
viewPostTeaser postId post =
    div [ onClick <| NavigateTo <| Blog postId ] [ text post.title ]

As our data structure now reflects the data model we have on the backend, we are ready to import the information from the web service. For that, Elm offers us a system called Decoders. We will also add a contrib package to simplify our decoders:

elm package install NoRedInk/elm-decode-pipeline

And now, we add our Decoder:

postListDecoder : Decoder PostList
postListDecoder =
    dict postDecoder


postDecoder : Decoder Post
postDecoder =
    decode Post
        |> required "id" string
        |> required "title" string
        |> required "body" string
        |> required "created" string
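
A quick note on what this snippet assumes: the helpers come from the core Json.Decode module plus the pipeline package we just installed, and PostList is simply an alias for our dictionary of posts. Since Json.Decode.dict always produces String keys, the sketch below treats the post IDs as strings at this point (the code in the repository may differ slightly):

import Dict exposing (Dict)
import Json.Decode exposing (Decoder, dict, string)
import Json.Decode.Pipeline exposing (decode, required)


-- Assumed alias; Json.Decode.dict yields a Dict with String keys.
type alias PostList =
    Dict String Post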

As our data will now come from a request, we need to update our Model again to represent the different states a request can have:

type alias Model =
    { posts : WebData PostList
    , activePage : Page
    }


type WebData data
    = NotAsked
    | Loading
    | Error
    | Success data

In this way, the Elm language will protect us, as we always have to consider all the different ways the data request can fail. We now have to update our view to work with this new state:

view : Model -> Html Msg
view model =
    case model.posts of
        NotAsked ->
            div [] [ text "Loading..." ]

        Loading ->
            div [] [ text "Loading..." ]

        Success posts ->
            case model.activePage of
                BlogList ->
                    viewBlogList posts

                Blog postId ->
                    let
                        post =
                            Dict.get postId posts
                    in
                        case post of
                            Just aPost ->
                                div []
                                    [ h2 [] [ text aPost.title ]
                                    , div [] [ text aPost.created ]
                                    , div [] [ text aPost.body ]
                                    , a [ onClick <| NavigateTo BlogList ] [ text "Go back" ]
                                    ]

                            Nothing ->
                                div [ onClick <| NavigateTo BlogList ] [ text "Blog post not found" ]

        Error ->
            div [] [ text "Error loading the data" ]

We are ready to decode the data; the only thing left is to do the request. Most requests on a site happen when clicking a link (usually a GET) or when submitting a form (POST / GET); with AJAX, we do requests in the background to fetch data that was not needed when the page was first loaded, but is needed afterwards. In our case, we want to fetch the data at the very beginning, as soon as the page is loaded. We can do that with a command, or as it appears in the code, a Cmd:

fetchPosts : Cmd Msg
fetchPosts =
    let
        url =
            "http://drelm.local/jsonapi/blog"
    in
        Http.send FetchPosts (Http.get url postListDecoder)
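
To receive the answer, the Msg type needs the FetchPosts variant that fetchPosts sends its result to, and update has to store that result in the model. A minimal sketch, using the names from the snippets above (the repository may organize this differently):

type Msg
    = NavigateTo Page
    | FetchPosts (Result Http.Error PostList)


-- Inside update, next to the NavigateTo case:
FetchPosts (Ok posts) ->
    ( { model | posts = Success posts }, Cmd.none )

FetchPosts (Err _) ->
    ( { model | posts = Error }, Cmd.none )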

But we have to use a new program function to pass the initial commands:

main : Program Never Model Msg
main =
    program
        { init = init
        , view = view
        , update = update
        , subscriptions = subscriptions
        }

Let’s forget about the subscriptions, as we are not using them:

subscriptions : Model -> Sub Msg
subscriptions model =
    Sub.none

Now, we just need to update our initial data; our init variable:

model : Model
model =
    { posts = NotAsked
    , activePage = BlogList
    }


init : ( Model, Cmd Msg )
init =
    ( model
    , fetchPosts
    )

And this is it! When the page is loaded, the program will use the command we defined to fetch all our blog posts! Check it out in the screencast:

Screencast of our sample app

If at some point that request is too heavy, we could change it to fetch just titles plus summaries, or just a few posts. We could add another fetch when we scroll down, or we can fetch the full post when we invoke the update function. Did you notice that the signature of update ends with ( Model, Cmd Msg )? That means we can put commands there to fetch data instead of just Cmd.none. For example:

update : Msg -> Model -> ( Model, Cmd Msg )
update msg model =
    case msg of
        NavigateTo page ->
            let
                command =
                    case page of
                        Blog postId ->
                            fetchPost postId

                        BlogList ->
                            Cmd.none
            in
                ( { model | activePage = page }, command )
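
The fetchPost function itself would mirror fetchPosts; a hypothetical version, assuming a FetchPost message and reusing postDecoder for a single post, could look something like this:

fetchPost : PostId -> Cmd Msg
fetchPost postId =
    let
        -- The real endpoint may expect a UUID rather than this numeric id;
        -- this is only a sketch.
        url =
            "http://drelm.local/jsonapi/blog/" ++ toString postId
    in
        Http.send FetchPost (Http.get url postDecoder)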

But let’s leave all of this implementation for a different occasion.

And that’s all for now. I might have missed something, as the frontend part grew a bit more than I expected, but check the repository, as the code there has been tested and is working fine. If you have any questions, feel free to add a comment and I will try to reply as soon as I can!

End Notes

I did not dwell too much on the syntax of Elm, as there is already plenty of documentation on the official page. The goal of this post is to understand how a simple app is created from the very start and to see a simple example of the Elm Architecture.

If you try to follow this tutorial step by step, you may find an issue when trying to fetch the data from the backend while using elm-reactor. I had that issue too, and it is a browser defense against Cross-site request forgery (https://es.wikipedia.org/wiki/Cross-site_request_forgery). If you check the repo, you will see that I replaced the default function for GET requests, Http.get, with a custom function to prevent this.

I also didn’t add any CSS styling because the post would be too long, but you can find plenty of information on that elsewhere.


Categories: Drupal

Mark Shropshire: CharDUG Presents Drupal 8 Training Series

21 May 2018 - 8:09pm

During the past few CharDUG (Charlotte Drupal User Group) meetings, I realized that we have a real need to help our existing Charlotte area Drupalers using Drupal 7 to move on to Drupal 8. There is also a huge opportunity to train those completely new to web development and Drupal. Out of some recent conversations, I have put together a training series that will span the next 6 months and become the focus of our monthly meetings on the 2nd Wednesday of each month.

The CharDUG Drupal 8 Training Series is a comprehensive set of training workshops to get attendees up to speed with all aspects of Drupal 8. Whether you are brand new to Drupal, focused on content management, a frontend or backend developer, or a DevOps engineer, this series contains what you will need to utilize Drupal 8 to its potential. Attendees should bring friends, laptops, and questions. There is no need to attend all sessions, though it is recommended for those new to Drupal. An outline of what to expect is provided below. If you have questions or suggestions, don’t hesitate to reach out on meetup.com or @chardug.

Drupal 8: Installation and Features
  • June 2018
    • What is Drupal?
      • Community
      • Open Source
      • Contributing to Drupal
    • How to install Drupal 8
    • Important features of note
    • Changes since Drupal 7
    • Drupal release schedule
Drupal 8: Site Building
  • July 2018
    • Installing Drupal 8
    • Using Composer with Drupal
    • How to build a site without any development
      • Installing modules and themes
    • Top Drupal 8 contributed modules
    • Users, roles, and permissions
    • Configuring Drupal
Drupal 8: Managing Content
  • August 2018
    • Content management concepts
      • Content types
      • Content authoring experience
      • Content moderation
      • Permissions and roles
      • Content scheduling
    • Creating and editing content
    • Blocks
    • Layout Builder
    • Paragraphs
Drupal 8: Layout and Theming
  • September 2018
    • Layout options
    • Responsive design
    • Images and media
    • Developing a Drupal theme
    • Drupal and responsive layouts
    • Decoupled Drupal frontends
Drupal 8: Module Development
  • October 2018
    • Developing a module
      • Menus and routes
      • Permissions
      • Creating pages and admin forms
      • Event subscribers
    • Writing and running tests
Drupal 8: Deployment and Next Steps
  • November 2018
    • Deploying to production
      • Development workflows
      • Security
        • Guardr
      • SEO
    • Next steps for Drupal
      • Drupal 9
      • New initiatives
      • Decoupled Drupal
Categories: Drupal

Jeff Geerling's Blog: Hosted Apache Solr's Revamped Docker-based Architecture

20 May 2018 - 6:18pm

I started Hosted Apache Solr almost 10 years ago, in late 2008, so I could more easily host Apache Solr search indexes for my Drupal websites. I realized I could also host search indexes for other Drupal websites too, if I added some basic account management features and a PayPal subscription plan—so I built a small subscription management service on top of my then-Drupal 6-based Midwestern Mac website and started selling a few Solr subscriptions.

Back then, the latest and greatest Solr version was 1.4, and now-popular automation tools like Chef and Ansible didn't even exist. So when a customer signed up for a new subscription, the pipeline for building and managing the customer's search index went like this:

Original Hosted Apache Solr architecture, circa 2009.

Categories: Drupal

Jeff Beeman: Rebuilding jeffbeeman.com: My local development environment and workflow

20 May 2018 - 1:20pm
Last week I talked about setting up a new project using BLT, Dev Desktop, and Lightning. Today, I’ll talk more about my local environment setup and give a brief overview of my development and deployment workflow.
Categories: Drupal

DrupalEasy: Using the Markup module to add content to your entity forms

19 May 2018 - 5:11am

Have you ever been building a form and found yourself wishing that you could insert additional help text - or even other forms of content (images, video) inline with the form? While each field's "Description" field is useful, sometimes it isn't enough.

The Markup module solves this problem in an elegant way by providing a new "Markup" field type.

 

This field doesn't expose any input widgets to the end user, rather it just allows for site builders to add additional markup (content) to an entity form.

 

The markup isn't saved with the resulting entity - it's just there to provide additional information to the user filling out the form.

Granted, this has always been possible by writing a small custom module utilizing hook_form_alter(), but having it as a field type makes it much more convenient to use.

 

Categories: Drupal

Ashday's Digital Ecosystem and Development Tips: 5 Reasons to Upgrade to Drupal 8 Today

18 May 2018 - 11:57am

Drupal 8 has been available now for more than two years, but if your site is up and running on Drupal 6 or 7, you may be wondering… why should I upgrade? And why now?

Categories: Drupal

OPTASY: How to Integrate Alexa with Your Drupal 8 Website: A Step-by-Step Guide

18 May 2018 - 9:05am

Just imagine: a user asks Amazon Alexa to read out loud to him/her the headline of your latest blog post! Or maybe to look for a specific section on your Drupal site! Or, even better: quit imagining this and start implementing it instead! Right on your website. And here's how you integrate Alexa with your Drupal 8 website via the Alexa integration APIs.

A 7-step tutorial:
 

  • on how to get Alexa to “talk to” your site users/online customers
  • on turning your site's content into the needed “raw material” for setting up your custom Alexa skills
  • on how you can leverage Drupal 8's outstanding third-party integration capabilities to “fuel” your implementation plan with
     

So, here's how it's done: 
 

Categories: Drupal

Zivtech: Drupal 8 Content Moderation Tips & Tricks

18 May 2018 - 8:19am

The Content Moderation core module was marked stable in Drupal 8.5. Think of it like the contributed module Workbench Moderation in Drupal 7, but without all the Workbench editor Views that never seemed to completely make sense. The Drupal.org documentation gives a good overview.

Content Moderation requires the Workflows core module, allowing you to set up custom editorial workflows. I've been doing some work with this for a new site for a large organization, and have some tips and tricks.

Less Is More

Resist increases in roles, workflows, and workflow states and make sure they are justified by a business need. Stakeholders may ask for many roles and many workflow states without knowing the increased complexity and likelihood of editorial confusion that results.

If you create an editorial workflow that is too strict and complex, editors will tend to find ways to work around the  system. A good compromise is to ask that the team tries something simple first and adds complexity down the line if needed.

Try to use the same workflow on all content types if you can. It makes a much simpler mental model for everyone.

Transitions are Key

Transitions between workflow states will be what you assign as permissions to roles. Typically, you'll want to lock down who can publish content, allowing content contributors to create new drafts only.

Categories: Drupal

Drupal blog: Working toward a JavaScript-driven Drupal administration interface

18 May 2018 - 8:18am

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

As web applications have evolved from static pages to application-like experiences, end-users' expectations of websites have become increasingly demanding. JavaScript, partnered with effective user-experience design, enable the seamless, instantaneous interactions that users now expect.

The Drupal project anticipated this trend years ago and we have been investing heavily in making Drupal API-first ever since. As a result, more organizations are building decoupled applications served by Drupal. This approach allows organizations to use modern JavaScript frameworks, while still benefiting from Drupal's powerful content management capabilities, such as content modeling, content editing, content workflows, access rights and more.

While organizations use JavaScript frameworks to create visitor-facing experiences with Drupal as a backend, Drupal's own administration interface has not yet embraced a modern JavaScript framework. There is high demand for Drupal to provide a cutting-edge experience for its own users: the site's content creators and administrators.

At DrupalCon Vienna, we decided to start working on an alternative Drupal administrative UI using React. Sally Young, one of the initiative coordinators, recently posted a fantastic update on our progress since DrupalCon Vienna.

Next steps for Drupal's API-first and JavaScript work

While we made great progress improving Drupal's web services support and improving our JavaScript support, I wanted to use this blog post to compile an overview of some of our most important next steps:

1. Stabilize the JSON API module

JSON API is a widely-used specification for building web service APIs in JSON. We are working towards adding JSON API to Drupal core as it makes it easier for JavaScript developers to access the content and configuration managed in Drupal. There is a central plan issue that lists all of the blockers for getting JSON API into core (comprehensive test coverage, specification compliance, and more). We're working hard to get all of them out of the way!

2. Improve our JavaScript testing infrastructure

Drupal's testing infrastructure is excellent for testing PHP code, but until now, it was not optimized for testing JavaScript code. As we expect the amount of JavaScript code in Drupal's administrative interface to dramatically increase in the years to come, we have been working on improving our JavaScript testing infrastructure using Headless Chrome and Nightwatch.js. Nightwatch.js has already been committed for inclusion in Drupal 8.6, however some additional work remains to create a robust JavaScript-to-Drupal bridge. Completing this work is essential to ensure we do not introduce regressions, as we proceed with the other items in our roadmap.

3. Create designs for a React-based administration UI

Having a JavaScript-based UI also allows us to rethink how we can improve Drupal's administration experience. For example, Drupal's current content modeling UI requires a lot of clicking, saving and reloading. By using React, we can reimagine our user experience to be more application-like, intuitive and faster to use. We still need a lot of help to design and test different parts of the Drupal administration UI.

4. Allow contributed modules to use React or Twig

We want to enable modules to provide either a React-powered administration UI or a traditional Twig-based administration UI. We are working on an architecture that can support both at the same time. This will allow us to introduce JavaScript-based UIs incrementally instead of enforcing a massive paradigm shift all at once. It will also provide some level of optionality for modules that want to opt-out from supporting the new administration UI.

5. Implement missing web service APIs

While we have been working for years to add web service APIs to many parts of Drupal, not all of Drupal has web services support yet. For our React-based administration UI prototype we decided to implement a new permission screen (i.e. https://example.com/admin/people/permissions). We learned that Drupal lacked the necessary web service APIs to retrieve a list of all available permissions in the system. This led us to create a support module that provides such an API. This support module is a temporary solution that helped us make progress on our prototype; the goal is to integrate these APIs into core itself. If you want to contribute to Drupal, creating web service APIs for various Drupal subsystems might be a great way to get involved.

6. Make the React UI extensible and configurable

One of the benefits of Drupal's current administration UI is that it can be configured (e.g. you can modify the content listing because it has been built using the Views module) and extended by contributed modules (e.g. the Address module adds a UI that is optimized for editing address information). We want to make sure that in the new React UI we keep enough flexibility for site builders to customize the administrative UI.

All decoupled builds benefit

All decoupled applications will benefit from the six steps above; they're important for building a fully-decoupled administration UI, and for building visitor-facing decoupled applications.

Step | Useful for decoupling of visitor-facing front-ends | Useful for decoupling of the administration backend
1. Stabilize the JSON API module | ✔ | ✔
2. Improve our JavaScript testing infrastructure | ✔ | ✔
3. Create designs for a React-based administration UI | | ✔
4. Allow contributed modules to use React or Twig | ✔ | ✔
5. Implement missing web service APIs | ✔ | ✔
6. Make the React UI extensible and configurable | | ✔

Conclusion

Over the past three years we've been making steady progress to move Drupal to a more API-first and JavaScript centric world. It's important work given a variety of market trends in our industry. While we have made excellent progress, there are more challenges to be solved. We hope you like our next steps, and we welcome you to get involved with them. Thank you to everyone who has contributed so far!

Special thanks to Matt Grill and Lauri Eskola for co-authoring this blog post and to Wim Leers, Gabe Sullice, Angela Byron, and Preston So for their feedback during the writing process.

Categories: Drupal

mark.ie: My Approach to PatternLab?

18 May 2018 - 7:43am

I'm sometimes asked for an overview of my general approach to PatternLab. Simple: put everything for each component in the same directory!


When working with PatternLab, which I use for all my Drupal themes, including the theme for this website, I don’t use the full atomic approach. I don't use the approach of atoms > molecules > organisms > etc. I’m sure many people seriously disagree with me on that (I do think it's a very clever concept). Instead, I’ve renamed things to match the language we use with our clients.

I tried talking about atoms and molecules to some clients and their eyes glazed over. Clients do not want a science lesson. They do not want to be told that we are going to take two of these atoms, and mix them with one of these atoms, and eventually we'll have water. No, they want to know what their final website is going to look like. When I changed the conversation and started talking about ‘Building Blocks’ (what we call our Drupal paragraph types), site blocks (Drupal's search block, branding block), display types (Drupal's view modes such as teaser, search result), etc, they immediately understood. Then we started hearing things like, "Oh, so we can create a page by adding a number of different building blocks?" and "I see, so the search results page is made up of a group of pages using the 'Search Result' display type?". And my response, "Yes!". You see, we are using plain English to ease understanding.

Another aspect of my approach that I really like is that _everything_ for each of my components is within the same directory. For example, if it’s a nested paragraph component such as an accordion (so we need a paragraph type called 'Accordion' and one called 'Accordion Item'), each template and css and js and readme and json and yaml is all in the same folder. That means when I want to reuse one in another project, I don’t need to remember what sub-particles (atoms/molecules) are used to create the organism. It also means my CSS is scoped to that specific component and doesn’t bleed out of it, so making changes or adding new features is very easy: you just scope the new component's CSS to it, so it won't affect other previously-created components.

Now the top bar of my PatternLab that used to say Atoms | Molecules | Organisms, etc has tabs for:

  • Base
    • Colours
    • Spacing
    • Breakpoints
  • Basic Elements
    • Headings
    • Paragraphs
    • Lists
  • Site Blocks (Drupal Blocks)
    • Search Block
    • Login Block
    • Branding Block
  • Building Blocks (Paragraph Types)
    • Accordion
    • Image with Text
    • Video
  • Content
    • Display Types (View Modes)
      • Teaser
      • Card
      • Search Result
    • Lists (Views)
      • Blog
      • Search Results
    • Content Types
      • Basic Page
      • Blog
      • Event
  • Page Sections (Regions)
    • Header
    • Footer
    • Sidebar
  • Sample Pages
    • Homepage
    • Blog Listing Page
    • Blog Node

After that, I have Backstop.js set up to regression test all of these, so each time I create a new component I can quickly run the visual regression tests and check that nothing has broken. Since all my CSS/JS is scoped to each individual component, it's rare that anything is.

Categories: Drupal

Specbee: 5 Reasons Why Media Industry is Choosing Drupal CMS Over Other Platforms in 2018

18 May 2018 - 6:24am
The high demand for creating a seamless digital experience across numerous devices and channels has challenged the classic approach of the media industry. The changing technology landscape and business models have forced companies to think out-of-the-box to drive customer interaction and achieve competitive differentiation.
Categories: Drupal

Zoocha Blog: User login in Drupal 8 with AWS Rekognition

18 May 2018 - 4:43am
We've recently attended the AWS London summit and this year it seemed like the focus was AI and machine learning. One of the services I've been meaning to play with is Rekognition, which can do face detection and compare two faces, among many other things. While sitting in one…
Categories: Drupal

OpenSense Labs: Drupal ensuring the Web Accessibility Standards

18 May 2018 - 2:48am

Just like land, air, and water are meant for everyone, the web was designed to work for all people and remove any hindrance, irrespective of people's surroundings and capabilities. But when web standards are not inclusive of all, the effect of an individual's incapacity becomes a barrier, creating quite the paradox.

Before completing this blog, my ignorance led me to believe that web accessibility was limited to ‘accessibility only for people with disability’. Another thing that I was coaxed to believe was that it is almost synonymous with visibility issues. But it is as much for a person with auditory disabilities as it is for a person with cognitive or neurological disabilities. However, I realized I was not the only one associating such wrong notions with disabilities and web accessibility. Lack of awareness and taboos associated with disabilities often mislead us.

To ensure that people with disabilities have equal and inclusive access to the resources on the web, governments and agencies follow certain guidelines in order to establish equal accessibility for all without any bias.

What are Web Accessibility Standards and why do they matter?

“Web Content Accessibility Guidelines (WCAG) is developed through the World Wide Web Consortium process with a goal of providing a single shared standard for web content accessibility that meets the needs of individuals, organizations, and governments internationally.”

The WCAG explains how web content can be made more accessible to people. Here the word "content" refers to any and every kind of information in a web page, such as text (including headings and captions), images, sounds, codes, markup - anything that defines the layout and framework.

Take examples of physical infrastructure like ramps and digital vision signboards, which can be used by anyone; in a similar fashion, web accessibility is for everyone.

When you go out at noon, the level of contrast can be as much of an issue for a person with 6/6 vision as it can be for a person with visibility issues. Similarly, older people face problems with changing abilities due to aging, as do people with “temporary disabilities” such as a broken arm or lost glasses. Thus, web accessibility standards not only ensure justice for people with disabilities but are inclusive of all.

According to the Convention on the Rights of Persons with Disabilities by the United Nations, enjoying equal human rights is a fundamental freedom. To ensure the dignity of people with disability is not a subject of ridicule, governments across the globe signed a treaty for easy web accessibility. 

How does Drupal help?

A person may face an issue either when building a website or when using it. The WCAG ensures that both the times the guidelines are followed. The World Wide Web Consortium (W3C) guidelines are then divided into two: ATAG 2.0 and WCAG 2.0. Authoring Tool Accessibility Guidelines (ATAG 2.0) addresses authoring tools and Web Content Accessibility Guidelines (WCAG 2.0) addresses Web content and is used by developers, authoring tools, and accessibility evaluation tools. 

Drupal conforms to both guidelines. The initiative started with Drupal 7 accessibility and the community has been committed to ensuring accessibility for all ever since.

What Drupal does...

The community has an accessibility team which works to identify barriers both at the code level and at the awareness level, and to resolve them. For people using assistive technologies to browse the web, Drupal is built to encourage and support semantic markup (which now comes out of the box in Drupal 8).

The improvements are meant for both the visitor and the administrator, in areas such as:

  • Color contrast and intensity
  • Drag and Drop functionality
  • Adding skip navigation to core themes
  • Image handling
  • Form labeling
  • Search engine form and presentation
  • Removing duplicate or null tags
  • Accessibility for Developers
Modules For Accessibility

Following are some of the Drupal modules which will assist you in keeping up with the accessibility standards. 

  1. Automatic Alt text
    The basic principle at work here is the idea of easy perceivability. Any and every piece of information should thus be presented in a way that is easily perceivable by the user. Non-text information like images and video needs its content described in the form of text for screen readers to read.



    The Automatic Alt text module automatically generates an alternative text for images when no alt text has been provided by the user. This module works great for the websites and portals with user-generated content where the users may even not be aware of the purpose and importance of the Alternative text. 

    It describes the content of the image in one sentence but it doesn’t provide face recognition. 
     
  2. Block ARIA Landmark Roles
    Inspired by Block Class, Block ARIA Landmark Roles adds additional elements to the block configuration forms that allow users to assign an ARIA landmark role to a block.
     
  3. CKEditor Abbreviation
    The CKEditor Abbreviation module adds a button to CKEditor which helps in inserting and editing abbreviations in a given text. If an existing abbr tag is selected, the context menu also contains a link to edit the abbreviation.

    Abbr tag defines the abbreviation or an acronym in the content. Marking up abbreviations can give useful information to browsers, translation systems, and help boost search-engines.
     
  4. CKEditor Accessibility Checker
    The CKEditor Accessibility Checker module enables the Accessibility Checker plugin in your WYSIWYG editor. A plugin, the module lets you inspect the accessibility level of content created and immediately solve any accessibility issues that are found.
     
  5. High Contrast
    On April 13, 2011, Joseph Dolson published an article "Web Accessibility: 10 Common Developer Mistakes" stating the most common mistakes related to web accessibility and quoted that most of the issues have "more to do with a failure to understand what constitutes accessible content than with a failure to understand the technology"

    In most of the surveys, poor contrast level is often cited as the most commonly overlooked feature by the developers.

    High Contrast module, provides a quick solution to allow the user to switch between the active theme and a high contrast version of it helping them pull out of the problem.

  6. htmLawed
    According to the "Ten Common Accessibility Problems" an article by Roger Hudson, failure to use HTML header elements appropriately is one of the key accessibility issues. 

    The htmLawed module utilizes the htmLawed PHP library to limit and filter HTML for consistency with site administrator policy and standards and for security. Use of the htmLawed library allows for highly customizable control of HTML markup.

  7. Style Switcher
    The Style Switcher module takes the fuss out of creating themes or building sites with alternate stylesheets. Most of the accessibility issues have been confronted at the theming level. With this module, themers can provide a theme with alternate stylesheets. Site builder can add other alternate stylesheets right in the admin section to bring it under the right guidelines of accessibility. Allowing special styling of some part of the site, the module presents all those styles as a block with links. So any site user is able to choose the style of the site he/she prefers.

  8. Text Resize
    The handiest feature, giving end users just the right autonomy to resize the text to suit their eyesight. The Text Resize module provides end users with a block that can be used to quickly change the font size of text on your Drupal site.

    It includes two buttons that can increase and decrease the size of the printed text on the page.

  9. Accessibility
    A module for the developer, Accessibility module gives you a list of available Accessibility tests, (most of which are) aligned with one or more guidelines like WCAG 2.0 or Section 508. 

    It immediately informs the site maintainer about a missing “alt” attribute on an image, or whether headers are used appropriately. Further, each test can be customized to fit your site’s specific challenges, and you can customize the messages users see for each test so that you can provide tips on fixing accessibility problems within the context of your site’s editing environment.

Drupal 8 Features for Accessibility 

Other than the modules that can assist you to overcome web compatibility issues, here is a list of top Drupal 8 features for easier web accessibility. 

  1. Semantics in the Core
    When an assistive device scans a web page for information, it extracts the data about the Document Object Model (DOM), or the HTML structure of the page. No further information is read by the screen reader.

    Often these assistive devices only allow a user to select to read the headings on the page or only the links. It prioritizes according to the hierarchy in which the headings and links are presented making browsing easier for users of assistive devices. 

    Drupal 8 is based on HTML5. HTML5, which presents new and better semantic components, is in fact one of five major initiatives outlined for Drupal 8 development. It allows theme developers to control where to use the new semantic elements and to opt out entirely if they so choose.

    When we compose semantically correct HTML, we’re telling the browser and the assistive technology what type of content it is dealing with and how that information relates to other content. By doing this, assistive technology can carry out its job much more easily, since it has a structure that it can work with.
     
  2. Aural Alerts
    Often page updates are expressed visually through color changes and animations. But listening to a site is a very different experience from seeing it, therefore, Drupal provides a method called “Drupal.announce()”. This helps make page updates obvious in a non-visual manner. This method creates an aria-live element on the page.

    This also lets the user know of any alert box appearing along with providing instructions to screen reader users about the tone as well. Text attached to the page is read by the assistive technologies. Drupal.announce accepts a string to be read by an audio UA. 
     
  3. Controlled Tab Order
    Accessibility issues also crop up when users navigate the web with different input methods. Not every user uses a mouse to navigate the website. The TabbingManager, in Drupal, is an awesome medium for directing both non-visual and non-mouse users to the prime elements on the page in a logical order. It thus permits more control when exploring complex UIs.

    The tabbing manager helps in defining explicit tab order. It also allows elements besides links and form to receive keyboard focus. Without breaking the tab order it places the elements in a logical navigation flow as if it were a link on the page.
     
  4. Accessible Inline Form Errors
    It is important to provide the necessary feedback to users about the results of their form submission, both when it is successful and when it is not. This incorporates in-line feedback that is typically provided after form submission.

    Notifications have to be concise and clear. The error message, in particular, should be easy to understand and provide simple instructions on how the situation can be resolved. And in case of successful submission, a message to confirm would do. 

    Drupal forms have become considerably more accessible with the addition of accessible inline form errors. It is now easier for everyone to identify what errors they might have made when filling in a web form.

  5. Fieldsets
    Fieldset labels are utilized as a mechanism for grouping related sections of forms. An effectively implemented label gives a visual outline around the group of form fields. This can be extremely valuable for individuals with cognitive disabilities, as it effectively breaks the form into subsections, making it easier to understand.

    Drupal now uses fieldsets for radios & checkboxes in the Form API. This helps further improve forms in Drupal.

Conclusion

However good the features Drupal offers, in the end it is up to organizations to strategize and build their websites and applications around web accessibility.

We ensure that our different teams and interaction work together in order to make the Web more accessible to people with disabilities. At OpenSense Labs we design and develop the web technologies to ensure universal accessibility. Connect with us at hello@opensenselabs.com to make the web a better place. 

Categories: Drupal

Agiledrop.com Blog: AGILEDROP: A Short Introduction to Headless Drupal

18 May 2018 - 2:39am
It is a well-known fact by now that Drupal is a very flexible and an extremely agile CMS. Even though it is arguably the most customizable CMS of all, the awesome people behind Drupal aren’t just resting on their laurels due to this. If you’ve been keeping up with any Drupal news or in fact CMS news at all, you might have heard of the term ‘headless’. While the term might sound very odd, it defines something really awesome. So let’s take a look at what headless Drupal is in this post and why it’s so great.   Difference Between Headless and ‘Normal’ Drupal When you run a default Drupal… READ MORE
Categories: Drupal

Valuebound: GraphQL: A Beginners Guide

18 May 2018 - 12:08am

GraphQL is the new frontier in Application Programming Interfaces (APIs) - a query language for your API and a set of server-side runtimes (implemented in various backend languages) for executing queries. Further, it isn't tied to any specific database or storage engine; instead, it is backed by your existing code and data.

If you are a JavaScript developer, chances are you have heard of it but are not quite sure what it is. To help you out, I have written this blog post so that you can easily figure out what exactly GraphQL is and how to make the most of it. When you complete this GraphQL blog cum tutorial, you will be able to answer:

  • What is GraphQL

Categories: Drupal

Dries Buytaert: Working toward a JavaScript-driven Drupal administration interface

17 May 2018 - 10:42am

As web applications have evolved from static pages to application-like experiences, end-users' expectations of websites have become increasingly demanding. JavaScript, partnered with effective user-experience design, enable the seamless, instantaneous interactions that users now expect.

The Drupal project anticipated this trend years ago and we have been investing heavily in making Drupal API-first ever since. As a result, more organizations are building decoupled applications served by Drupal. This approach allows organizations to use modern JavaScript frameworks, while still benefiting from Drupal's powerful content management capabilities, such as content modeling, content editing, content workflows, access rights and more.

While organizations use JavaScript frameworks to create visitor-facing experiences with Drupal as a backend, Drupal's own administration interface has not yet embraced a modern JavaScript framework. There is high demand for Drupal to provide a cutting-edge experience for its own users: the site's content creators and administrators.

At DrupalCon Vienna, we decided to start working on an alternative Drupal administrative UI using React. Sally Young, one of the initiative coordinators, recently posted a fantastic update on our progress since DrupalCon Vienna.

Next steps for Drupal's API-first and JavaScript work

While we made great progress improving Drupal's web services support and improving our JavaScript support, I wanted to use this blog post to compile an overview of some of our most important next steps:

1. Stabilize the JSON API module

JSON API is a widely-used specification for building web service APIs in JSON. We are working towards adding JSON API to Drupal core as it makes it easier for JavaScript developers to access the content and configuration managed in Drupal. There is a central plan issue that lists all of the blockers for getting JSON API into core (comprehensive test coverage, specification compliance, and more). We're working hard to get all of them out of the way!

2. Improve our JavaScript testing infrastructure

Drupal's testing infrastructure is excellent for testing PHP code, but until now, it was not optimized for testing JavaScript code. As we expect the amount of JavaScript code in Drupal's administrative interface to dramatically increase in the years to come, we have been working on improving our JavaScript testing infrastructure using Headless Chrome and Nightwatch.js. Nightwatch.js has already been committed for inclusion in Drupal 8.6, however some additional work remains to create a robust JavaScript-to-Drupal bridge. Completing this work is essential to ensure we do not introduce regressions, as we proceed with the other items in our roadmap.

3. Create designs for a React-based administration UI

Having a JavaScript-based UI also allows us to rethink how we can improve Drupal's administration experience. For example, Drupal's current content modeling UI requires a lot of clicking, saving and reloading. By using React, we can reimagine our user experience to be more application-like, intuitive and faster to use. We still need a lot of help to design and test different parts of the Drupal administration UI.

4. Allow contributed modules to use React or Twig

We want to enable modules to provide either a React-powered administration UI or a traditional Twig-based administration UI. We are working on an architecture that can support both at the same time. This will allow us to introduce JavaScript-based UIs incrementally instead of enforcing a massive paradigm shift all at once. It will also provide some level of optionality for modules that want to opt-out from supporting the new administration UI.

5. Implement missing web service APIs

While we have been working for years to add web service APIs to many parts of Drupal, not all of Drupal has web services support yet. For our React-based administration UI prototype we decided to implement a new permission screen (i.e. https://example.com/admin/people/permissions). We learned that Drupal lacked the necessary web service APIs to retrieve a list of all available permissions in the system. This led us to create a support module that provides such an API. This support module is a temporary solution that helped us make progress on our prototype; the goal is to integrate these APIs into core itself. If you want to contribute to Drupal, creating web service APIs for various Drupal subsystems might be a great way to get involved.
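As a hedged sketch of the kind of endpoint such a support module could expose (the module, class and method names below are hypothetical, not the actual module), the user.permissions service and its getPermissions() method are the standard Drupal 8 way to collect every permission declared by enabled modules:

  <?php

  namespace Drupal\admin_api_support\Controller;

  use Drupal\Core\Controller\ControllerBase;
  use Symfony\Component\HttpFoundation\JsonResponse;

  /**
   * Illustrative controller: expose all defined permissions as JSON.
   */
  class PermissionsController extends ControllerBase {

    public function listPermissions() {
      // The user.permissions service aggregates the permissions declared by
      // every enabled module (*.permissions.yml files and callbacks).
      $permissions = \Drupal::service('user.permissions')->getPermissions();
      return new JsonResponse($permissions);
    }

  }

A matching route definition in the module's routing YAML would then expose this controller at a path of your choosing.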

6. Make the React UI extensible and configurable

One of the benefits of Drupal's current administration UI is that it can be configured (e.g. you can modify the content listing because it has been built using the Views module) and extended by contributed modules (e.g. the Address module adds a UI that is optimized for editing address information). We want to make sure that in the new React UI we keep enough flexibility for site builders to customize the administrative UI.

All decoupled builds benefit

All decoupled applications will benefit from the six steps above; they're important for building a fully-decoupled administration UI, and for building visitor-facing decoupled applications.

Useful for decoupling of visitor-facing front-ends and/or the administration backend:

  • 1. Stabilize the JSON API module: visitor-facing front-ends ✔, administration backend ✔
  • 2. Improve our JavaScript testing infrastructure: visitor-facing front-ends ✔, administration backend ✔
  • 3. Create designs for a React-based administration UI: administration backend ✔
  • 4. Allow contributed modules to use React or Twig: visitor-facing front-ends ✔, administration backend ✔
  • 5. Implement missing web service APIs: visitor-facing front-ends ✔, administration backend ✔
  • 6. Make the React UI extensible and configurable: administration backend ✔

Conclusion

Over the past three years we've been making steady progress to move Drupal to a more API-first and JavaScript-centric world. It's important work given a variety of market trends in our industry. While we have made excellent progress, there are more challenges to be solved. We hope you like our next steps, and we welcome you to get involved with them. Thank you to everyone who has contributed so far!

Special thanks to Matt Grill and Lauri Eskola for co-authoring this blog post and to Wim Leers, Gabe Sullice, Angela Byron, and Preston So for their feedback during the writing process.

Categories: Drupal

Wim Leers: Two big milestones in API-First Drupal

17 May 2018 - 6:12am

Two big “maintainability” milestones have been hit in the past few days:

1. rest.module now is in a maintainable state

For the first time ever, the issue tracker for the rest.module Drupal core component fits on a single page: https://www.drupal.org/project/issues/drupal?component=rest.module. 48 (now 46) open issues, of which several are close to RTBC, so that number will likely still go down. Breakdown:

  • 19 of those 48 are feature requests.
  • 6 are “plan” issues.
  • At least 10 issues are related to “REST views” — for which we are only fixing critical bugs.
  • And 12 are postponed — blocked on other subsystems usually. (There is overlap among those breakdown bullets.)

Finally the REST module is getting to a place where it is maintainable and we’re not extinguishing whatever the current fire is! It’s been a long road, but we’re getting there!

2. Instilling API-First responsibility

Two days ago, #2910883: Move all entity type REST tests to the providing modules landed, which is a huge milestone in making Drupal 8 truly API-First: every module now owns its REST test coverage, which conveys the fact that every component/module is responsible for its own REST API-compatibility, rather than all responsibility lying in the rest.module component!

This is largely made possible by \Drupal\Tests\rest\Functional\EntityResource\EntityResourceTestBase, which is a base test class that each entity type should subclass to test how it behaves when exposed via REST. Every entity type in core has tests based on it, and contrib and custom entity types are encouraged to do the same!
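For a contrib or custom entity type, such a subclass looks roughly like the sketch below. The entity type ("sponsor"), module name and values are made up for illustration, and only the most important overrides are shown; in core, concrete test classes then combine a base class like this with format- and authentication-specific traits.

  <?php

  namespace Drupal\Tests\my_module\Functional;

  use Drupal\my_module\Entity\Sponsor;
  use Drupal\Tests\rest\Functional\EntityResource\EntityResourceTestBase;

  // Illustrative only: a custom "sponsor" entity type exposed via REST.
  abstract class SponsorResourceTestBase extends EntityResourceTestBase {

    public static $modules = ['my_module'];

    protected static $entityTypeId = 'sponsor';

    protected static $patchProtectedFieldNames = [];

    protected function createEntity() {
      // Create the entity instance the base class will GET, POST and PATCH.
      $sponsor = Sponsor::create(['label' => 'Llama']);
      $sponsor->save();
      return $sponsor;
    }

    // The base class also expects the entity's normalization
    // (getExpectedNormalizedEntity()) and a POST payload
    // (getNormalizedPostEntity()); both are entity-type specific and
    // omitted here for brevity.

  }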

API-First ecosystem

But the rest.module being stable is not the only thing that matters — it’s a key part, but not the only part of API-First Drupal. The remaining challenges lie elsewhere in Drupal: the issues tagged API-First Initiative are now mainly in the modules providing field types, entity types, Entity/Field API and Configuration API.

The good thing is that fixes to any of those always help the entire API-First ecosystem.

If you want to follow along a bit more closely, or want the news right as it happens, follow REST: top priorities for Drupal 8.6.x!

Categories: Drupal

AddWeb Solution: Drupal 8, Driesnote And A Lot More From Our Recent Visit To DrupalCon, Nashville 2018

17 May 2018 - 4:01am

“When the bond is strong, relationships last long!” said no one in particular, because some facts are too true to need citing. But why are we talking about this here and not Drupal? Well, because it is about Drupal; to be precise, our relationship with Drupal!


We realised this quite recently while on the way to our 10th Drupal event since our inception in the year 2012. In these 5 years, we’ve attended 9 Drupal events, sponsored 5 of them and volunteered at 8 of them. Then how could we miss this one in Nashville?! And here we are, back from DrupalCon, Nashville 2018, with a bagful of memories and experiences to share. So, if you weren’t there, read on to discover all that was there!

 

Quick Overview of the Event Timeline

The five-day event that DrupalCon was had traditionally been divided into three major sections: Summits, Sessions and a Contribution Day.
 

  • The opening day had summits and training sessions, along with the opening reception.
  • Day two had programming summits and sessions planned, along with the much-anticipated session of Dries’ Keynote. 
  • The third and fourth days were when multiple sessions were held, followed by special events and trivia nights in the evenings. 
  • The closing day of the event was all about Drupal contributions, just as it is at any other Drupal event.

But what made this particular event stand out for me from the previous ones were the booths of both Joomla and WordPress. Now that’s how competitors come together for the sake of saving the open source community from its proprietary counterparts. 

 

Major Takeaways from the Event

 

1) Business Summit:
As a regular attendee of Drupal events, we knew the significance of the paid business summits held on the first day of the event, so we had registered for them in advance. After attending them, we realised that the core of these summits lay in two major things: one, that Drupal alone was not enough, and two, the need to promote that Drupal is not merely a CMS platform for building websites but a wide and highly ‘Ambitious Digital Experience’ in itself. 

 

2) Dries Keynote:
Amidst the commencement of several sessions on the second day at DrupalCon, Nashville 2018, most of us waited for that one crucial session: the #Driesnote, the keynote by Dries Buytaert, founder of Drupal. It revolved around three prime agendas: the Drupal 8 update, growing adoption of Drupal and fostering the Drupal community. He confirmed how Drupal is capable of doing everything digital and called it the most ‘Ambitious Digital Experience’ of our time. Plus, decoupled Drupal has opened up the opportunity to get the best of Drupal and combine it with any other user-friendly front-end framework. 


He also declared that to take Drupal 8 in the right direction, it has become necessary to take up marketing techniques for promotion. Hence, the Drupal Association has launched the ‘Promote Drupal Initiative’ to raise $100,000, of which they have already bagged $54,000 so far. 


Much more significant stuff was shared in the Dries Keynote; to watch its entire recording, click here.
 

3) Sessions:
The third and fourth day consisted of multiple sessions running in parallel. We attended quite a lot of sessions and re-bonded with old Drupal friends over a cup of coffee. ‘Delegating Work: A Zippy Guide to Releasing Your Death Grip on Control’ by Hannah Del Porto from Brick Factory, ‘Debugging Effectively’ by Colin O’Dell from Unleashed Technology and ‘Beyond Websites: Drupal as Data Pipeline for Digital Signage’ by Mike Madison from Acquia are a few of the sessions that we thoroughly enjoyed.

 

And of course, both nights had bumper social events to end the day, along with our fun fellow Drupalers. Since we met many old friends from the Drupal community, we made quite some memories with those fellas! 

 

DrupalCon, Nashville 2018 had one more day to go before concluding, which was the Contribution Day. But unfortunately, our already planned schedule did not permit us to stay back and be a part of the concluding day. One does feel this loss, especially when they have been a contributor in the past and know the importance of being a part of such a day. But anyway, the experience that we earned in the previous 4 days was quite enriching. It has opened up a lot of hope and opportunities for people like us, who are hardcore Drupal enthusiasts! 
 

Categories: Drupal

OpenSense Labs: Cognitive Search: A True Genius

17 May 2018 - 3:51am
Cognitive Search: A True Genius Shankar Thu, 05/17/2018 - 16:21

A black spot on a white sheet of paper can be found with a quick glance. But what if you have to search for a black dot of a certain radius among a cluster of dots on a large sheet of white paper? Such is the need of the hour, where you have to intelligently search for a piece of information in a cornucopia of data in your system. Cognitive search is revolutionizing the process of retrieving those files.

There is a diminishing trend of manually searching for a document stored somewhere in your system. Large enterprises are the ones showing a keen inclination towards this disruptive technology.

Before we move on to how large organizations are looking to extract the merits of cognitive search, let’s understand what it is.

What is Cognitive Search anyway?

Forrester, a research and advisory firm, defined cognitive search and knowledge discovery as “the new generation of enterprise search solutions that employ Artificial Intelligence (AI) technologies such as natural language processing (NLP) and machine learning to ingest, understand, organize, and query digital content from multiple data sources”.

That is the best definition one can give to describe a cognitive information system. In short, it can extract the most relevant piece of information from large sets of data in their work context.

Platforms enabled with cognitive computing abilities can interact with the users in a natural manner. With experience, they can learn user preferences and behavioral patterns. This helps them establish links between related data from both internal and external sources.

How Beneficial is Cognitive Search?

So now we have an understanding of what it is and how it works. How can it turn out to be a great asset?

Tapping into large sets of data sources
  • Fetching the best piece of data out of voluminous data sources can seem tiring. Cognitive search can work wonders in extracting the most valuable piece of information from large sets of intricate and varied data sources.
  • Whether the data is internal or external, it peeks inside everything that is available in your entire enterprise. It also searches through structured and unstructured data and lends deep and insightful search capabilities to your organization. This helps in making better decisions in the business.
Providing relevant knowledge
  • It comes packed with a lot of functionalities that lead us to find meaningful and relevant information. Doing a search across the enterprise data may seem daunting, but it does that with ease. 
  • Using NLP, it can gauge and get to know the scheme of things vis-à-vis text content like email, blog, report, research work, and document and also media content like meeting videos and its audio recordings.
  • Once it is done with the understanding part, machine learning algorithms help it do deeper research and come up with insightful information. Company dictionaries and ontologies help with understanding the terminologies and their relationships.
Enhancing search results

Machine learning algorithms help in providing better search results.

  • To help digital marketers predict if the advertisements designed by them are going to work or not, a supervised learning algorithm called Classification By Example can help. For instance, it can help them judge how people reacted to particular ad campaigns in the past to help them come up with something better this time around.
  • Marketers can ascertain a particular group of people and target them for their upcoming marketing campaigns. Clustering, an unsupervised learning algorithm, helps them in the process.
  • To understand the relationship between input and output variables and make predictions, a regression algorithm comes in handy. For instance, it can be used to build applications that determine road traffic based on the present weather conditions. Also, based on various economic factors, it can help predict stock prices.
  • A similarity algorithm can help you appoint an expert team for a business project based on their skills and competency levels in previous projects.
  • Personalized recommendations based on the interests of the users can be made using a recommendation algorithm. Based on previous history and usage patterns, it can recommend content which a user would most likely want to consume.
How did Cognitive Search come into existence?

It is a valid question. One would ponder whether this was developed in a short span of time or whether it involved some amazing technologies behind the scenes. It was, of course, a very long road that had to be traveled to reach this tech marvel.

By now, you have already noticed that, as the definition mentions it, machine learning and artificial intelligence are the frontrunners in leading up to this masterpiece.

Almost all the search methods that exist today are in some way or other related to Google. Leverage Marketing came up with an interesting study. When Google first created a search engine in 1996, there were already several others in place. But Google’s search was different. While other search engines delivered results only if they could find an exact match for the keyword in the search box, Google had a different algorithm.

Google gave a value to certain keywords. So, keyword frequency determined the search results, which led to irrelevant content showing up on top. So, in the 2000s, Google devised several improved search techniques. It finally incorporated machine learning into its search engine in 2015. That means Google would not just read what you have written in the search box but interpret what you really mean when you type it.

By developing its cognitive search method, Google’s search algorithm could understand keywords and factor in rankings, past search results, browser history, user location and other such parameters. This was its major leaning towards artificial intelligence.

This is how Google standardized internal search. Office network developers then contemplated developing search methods for their own business needs, basing their pursuit on Google’s cognitive search and machine learning techniques. That is how cognitive search came into existence and did a splendid job of improving the search experience.

Challenges that lie ahead...

Here is a three-pronged look at the challenges it might encounter and how to tackle them:

  • Expertise: Shortage of personnel required to develop and maintain this budding technology can be one of the main challenges that have to be overcome.
  • AI implementation:
    • Supervised machine learning helps in recognizing user patterns over time. Providing sufficient labeled training datasets from which these systems can learn is a huge challenge.
    • Unsupervised machine learning identifies existing user patterns. Systems with this capability face a major hurdle: providing sufficient data, along with the intervention needed for proper guidance and interpretation, to train the system is a challenge.
  • Goal formulation: There have to be clear goals and outcomes formulated. For instance, in reinforcement learning, systems perform several attempts and learn from the outcome of the trials to take better decisions. The biggest task is to provide clear-cut goals and enough practice to the systems in a challenging environment.
How can Cognitive Search improve Enterprise search?

Large enterprises are asking this question, as this will make life easier. We have seen that relevance, meaningfulness, and completeness are required to get better search results. But the solutions should also have enterprise qualities.

  • Understanding data: It should understand any data that an enterprise would fling at it. It should browse through the plenitude of data sources, understand both structured and unstructured data to come up with better enterprise search results.
  • Scalability: It should be able to scale with the ever-increasing density of enterprise data. Large enterprises have hundreds and thousands of applications with several bytes of data stored in the cloud or on-premises. Cognitive search solutions should deliver better quality search results.
  • Manual fine-tuning: It uses natural language processing and machine learning algorithms to understand data, predict users’ search patterns, improve the relevance of search results and automatically tune them over a period of time. Cognitive search solutions should also provide tools for administrators to manually tune search results. After all, AI is not perfect.
  • Building search applications: It should help developers to develop search applications. Instead of incorporating simple text box search methods, business enterprises should be able to build an application that works like virtual digital assistants such as Google Now and Siri.
Use Cases

Sinequa, which provides a cognitive search and analytics platform for over 2000 organizations, has a great cognitive computing solution. As a matter of fact, it is recognized as a leader in the Gartner 2017 Magic Quadrant for Insight Engines and the Forrester Wave™: Cognitive Search and Knowledge Discovery Solutions Q2 2017.

Sinequa partnered with content platform firm Box to enhance cross-platform enterprise search and analytics.

  • The partnership between Sinequa and Box helped in leveraging the aggregation of human-generated data in an enterprise, exploring information, and applying it in business decision-making. This integration not only lets customers mine their Box content but also allows searching across an organization’s other repositories. This increases the value of the information searched by linking it with related content across different sources, which was previously locked away in standalone silos.
  • Sinequa offers more than 150 connectors to various data sources. And with this partnership, the relevancy of search results for the enterprise data corpus has only improved. Individuals can search across platforms for contextually relevant results from a single interface. Hence, the partnership has promoted the enterprise cloud-first philosophy that is becoming the norm in the industry.
  • From the angle of information management, the partnership is significant, with both companies having a huge presence among large enterprise clients. Box provides user-friendly cloud file sharing and sync functionality. It decided to embed enterprise governance features for content by building tools for reports, access controls, workflow, security policies and so on. Partnering with Sinequa further strengthened Box’s voluminous knowledge base by allowing it to maintain control settings on data while the content is being searched and assessed in Sinequa.
  • Native security and permission settings of connected repositories are preserved to a great extent with this integration. Users can search the Box environment without disturbing the native control settings on the respective platforms where the information is located. That means the Sinequa search interface lets users search for content freely while the granular security and permission settings of Box remain intact.
  • A 360-degree view of the customer is attained through this partnership. Sinequa’s natural language processing and machine learning capabilities, powered by Apache Spark, and its more than 150 connectors to content sources let it assemble compound enterprise search results for detailed human understanding. If a user is searching for a specific subject in Box, he or she can view the most contextually relevant results from email, Salesforce, on-premise file shares and other such sources. This reduces the time and effort required to find the right actionable insight and improves the yield from initiatives like gaining a 360-degree view of the customer.

Hewlett Packard Enterprise (HPE), a multinational enterprise information technology company, has its own cognitive computing solution. Delivering natural-language-based systems that meet the ever-increasing needs of users and provide answers that are precise, relevant and trustworthy is important. HPE IDOL Natural Language Question Answering is one such solution that comes with natural-language-processing-enabled features for large enterprises.

  • Accurate response: The IDOL Answer Bank feature helps in providing accurate and curated responses to predefined reference questions. For instance, it can be programmed to give you instructions on configuring a smartphone.
  • Fact-based answers: The IDOL Fact Bank feature helps in providing answers based on proper facts. For instance, it can give stock price details through structured data sources, or it can provide a company’s annual report through unstructured data sources.
  • Text-based overview: IDOL Passage Extract feature helps in giving you an overview. For instance, you can see the latest financial services and their rules and regulations or the news events.
  • Assessing questions and data sources: The IDOL Answer Server feature assesses the questions and various content sources to provide the best possible answer.
Cognitive Search solution providers

Forrester, in their research study of Cognitive Search and Knowledge Discovery Solutions, compiled a list of high-performing vendor solutions.

  • HPE IDOL: This solution is built to analyze everything that is searched using it. With HPE’s intentions apparently not restricted to unstructured text, its cognitive computing platform also does a deep analysis of speech, images, and video. It includes capabilities like gauging a question and optimally answering it, which can help developers in developing chatbots or virtual conversational assistants.
  • Coveo: Its major focus lies in contextual and relevant search results. It uses advanced analytics and machine learning algorithms to return the most contextual results for the queries made by the user. It has also integrated with Salesforce using its cloud-based model.
  • Sinequa: It gives importance to natural language processing for a better understanding of search queries and the relevance of content discovery. Moreover, by incorporating Apache Spark, its analytics platform has received a further boost.
  • Attivio: It is suitable for most complex search applications. It offers knowledge management, anti-money laundering, customer 360, and other such features. Developers can use the structured query language to search the index.
  • IBM: It has leveraged the utilities of Watson Explorer by incorporating it in IBM’s Watson Developer Cloud. Watson explorer can be deployed in the cloud or on-premises. It is very helpful for customer 360 search applications, enterprise search, and claims processing.
  • Lucidworks: Their solution called Fusion has fantastic enterprise search features, 40 prebuilt connectors to applications like Salesforce and Slack, better administration tool, and out-of-the-box machine learning algorithms to come up with better knowledge discovery.
Summary

Cognitive search has emerged as the default standard for enterprise search. By analyzing a search query using its AI capabilities to give the most relevant and contextual output, it has led to a volte-face in the thinking of large enterprises.

Using internal and external content sources to provide the most relevant knowledge and enhance search results, it has been a huge helping aid in the smart cross-platform search.

Google’s search engine integrated it into its algorithms to understand users’ behavioral patterns and show results. This is how cognitive search came into existence in the enterprise world.

It has to break through the straitjackets of a few challenges to come out as an improved technology in the coming years.

With a deep understanding of data, scalability, manual tuning by administrators and more, it can improve enterprise search.

Leading cognitive search solution providers like Sinequa, HPE and Attivio, among others, have amazing platforms where customers can reap the benefits.

OpenSense Labs loves this tech genius. Contact us at hello@opensenselabs.com to understand more about this remarkable piece of technology.

Categories: Drupal

OpenSense Labs: Drupal Lays The Foundation For Every Enterprise

17 May 2018 - 3:33am
Drupal Lays The Foundation For Every Enterprise Akshita Thu, 05/17/2018 - 16:03

As an entrepreneur, you need a reliable, secure, and flexible platform to build your business on. Not only should it be scalable, it should be future-proof, able to sustain growing content without hampering the performance of your website.

Leaders worldwide are using the power of open source to innovate their platforms and improve their business statistics. Selecting the right technology means working on solutions that will support an active and growing business over the long haul. Therefore, it requires careful consideration and foresight when choosing the CMS for your enterprise.

Fulfilling business requirements as well as meeting technical needs, it is no wonder Drupal is used by 7 times as many top sites as its next two competitors combined (BuiltWith.com).

Let's simplify the word enterprise 

An oft-repeated word in the world of business, “enterprise” covers organizations of all shapes and sizes. All such businesses comprise individual organizational units with a distinct need to build a firm with a unique identity and reputation of its own.

Even though the meaning may vary considerably, when it comes to web development and technology an enterprise website requires a particular set of abilities, such as accommodating a large and varied content base, handling traffic, supporting microsites, and of course providing tight security.

Who uses Drupal CMS for their enterprise?

Drupal is fostering billion-dollar businesses under the aegis of its brand; a few well-known ones are:

  • Puma
  • Tesla Motors
  • Grammy
  • Pfizer
  • Timex
  • The Economist
  • Whole Food
  • Honda (Brazil)
  • Johnson and Johnson
  • Shoretel
  • L'Oreal (India)

And a million more add to Drupal's credentials. Acknowledging that enterprise solutions often demand complex requirements, Drupal has it sorted for you.

Why Drupal For Your Enterprise?

Having covered the enterprises using Drupal, below are some of the solid technical reasons which make it an excellent candidate for any enterprise of any scale or vertical.

It is Easier To Build

As an online platform on which your business will be built, Drupal lets your need dictate the terms.

Providing easy-to-set-up solutions with distributions, it cuts development time in half.

Enabling companies to deploy core features and functionality rapidly, it allows easier customization as per their business requirements.

It is easier to choose the layout and themes for your Drupal website, as themes and appearances are just a click away. With features simplified to make non-developers comfortable around Drupal, the editorial capabilities have been made fluent and easy.

Drupal is Secure

Used by hundreds of thousands of websites, Drupal keeps its core code under constant review and salts and hashes stored passwords to strengthen the security of your website. Supported by experts, and a large and continuously growing community, it has a dedicated security team to patch any probable security violation.

Frequent Updates

In case of any security update, the community ensures that you get notified the day patches are released. Security release windows fall every Wednesday for contributed projects and, usually, on the third Wednesday of every month for core.

Even though the release window does not necessarily mean that a release will actually be rolled out on that date, it exists for the site administrators to know in advance the days they should look out for a possible security release.

Security Modules

In addition to the proven security of core, numerous contributed modules can strengthen the security of your website. These modules extend security by adding password complexity, login and session controls, increasing cryptographic strength, and improving Drupal's logging and auditing functions. For detailed research on security-related modules, check the list of must-have security modules.

Security Team and Working Group

The security team works closely with the Drupal Security Working Group (SecWG), comprising dozens of experts from around the world, to validate and respond to security issues. The aim is to ensure that core and the contributed project ecosystem provide world-class security and to promote sound security practices among community developers.

Its core is designed to prevent any possible security breach. Vulnerabilities in core and in contributed projects are coordinated with branch maintainers and individual project maintainers respectively.

Drupal has proven to be a secure solution for enterprise needs and is used by top-tier enterprises.

Drupal is Scalable and Flexible

Scalability and flexibility are other salient features that make Drupal popular among businesses. When it comes to web technology, enterprises require the ability to handle considerable traffic at all times, especially for media and entertainment sites.

It is built with core web technologies which have stood both the test of time and of traffic spikes.

Drupal’s ability to extend the framework via its modules and distributions is at the heart of much of its success. While core sustains the bulk of the content, Drupal streamlines the demands of new industries by allowing them to address their needs in the form of custom modules and distributions, which has earned it more satisfactory customer reviews.

One matter that weighs on enterprises is the cost of maintenance. Many government and non-government organizations have migrated to Drupal to avoid the licensing and maintenance costs of proprietary systems.

Excels at Responsive Development and Quick Loading Time

According to Google’s official statement, more than 50 percent of search queries globally now come from mobile devices. People want to be able to find answers as fast as possible and various studies have proved that people really do care about the loading speed.

And that is why a recent Google release says that page speed will be a ranking factor for mobile searches from July 2018. It’s high time that you take the combination of performance and mobile responsiveness as a serious factor for improving visibility and revenue from the web.

Drupal 8 is built for a mobile-first world. Everything in version 8 supports mobile responsive design. Its admin and default designs are responsive for both developers and content authors, and it provides a responsive front-end theming framework.

Increasing the loading speed of your web page opens numerous doors for business. And when users can view your Drupal website the same way on desktop and mobile devices, there is no reason for second thoughts.

Mobile responsiveness helps you deliver the optimal mobile visitor experience. It supports the best responsive design practices and ensures that your users get a coherent experience anytime and every time.   

Supports Multi-site Functionalities

Given that your organization is running more than one site, maintenance and management would require big bucks and time. But with the multi-site feature you can share one single Drupal installation (which includes core code, contributed modules, and themes) among several sites.

Enterprises, this way, can handle complex requirements from a single Drupal installation which implies that less time and resources are required to build your network of websites.

One can manage any number of sites across their organization or brand, crossing geographies and campaigns from a single platform that allows swift and uncomplicated site creation and deployment.

This is particularly useful for managing the core code since each upgrade only needs to be done once. While each site will have its own database and configuration settings to manage their own content, the sites would be sharing one code base and web document root.

The multi-site feature is best used for sites with the same features and functionality. But if the sites need different functionality, it is better to set up each site independently.
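As a minimal sketch of how that mapping works (the hostnames and directory names below are placeholders), the sites/sites.php file tells one code base which site directory, and therefore which settings.php and database, to use for each incoming host:

  <?php
  // sites/sites.php: map hostnames to site directories under sites/.
  $sites['www.brand-one.example'] = 'brand_one';
  $sites['www.brand-two.example'] = 'brand_two';

  // Each directory (sites/brand_one, sites/brand_two) then carries its own
  // settings.php with its own database credentials, while core code,
  // contributed modules and themes are shared by every site.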

For Every Enterprise

Realizing that the needs of every industry are different, Drupal has something for everyone.

Media and entertainment

Editing and Scalability

Media and entertainment websites worldwide use Drupal for their online platforms for seamless editing and scalability. The list of over one million organizations includes The Economist, ET Online, MTV(UK), The Grammy, The Emmy, The Weather.com, The Beatles, and Warner Bros Music.

Scalability is all about quantity: how many requests and how much information you can handle at any given time without breaking or bending. Supporting some of the world’s most visited sites, Drupal is another name for scalability.

Allowing the easy content editing and management that media and entertainment websites look for, it provides it all with WYSIWYG editing via CKEditor, without any other weighty feature.

SaaS

Community solutions:

SaaS enterprises are using Drupal to build the platform for their product as well as a community to engage with clients and followers. It is easy to develop the platforms and then keep adding features in later phases.

Given that community platforms are one of the key needs of SaaS organizations, providing a home for prospects and helping the product and community grow alike, distributions like OpenSocial offer great help.

Zoho is one of the SaaS products using Drupal for its community platforms.

E-commerce

E-commerce functionalities

Providing easy payment gateways to conduct online transactions, Drupal ensures that customer information passes seamlessly and remains safe.

The Commerce Payment module and distributions (Drupal Commerce and Commerce Kickstart) support the payment API for a smooth payment collection procedure through the checkout form.

Supporting PayPal Express Checkout and PayPal Credit along with Amazon Pay, it lets you reach a wider audience by letting your shoppers complete checkout using the payment and shipping information stored in their Amazon accounts.

Tour and travel

For a potential traveler, your site shouldn’t look like just-another-information-brochure on the web. The need for an end-to-end solution to integrate all the minute details (from hotel booking to landing back) has never been greater.  

Booking Engine:

Providing two of the best booking solutions for your website:

  • EasyBooking - Distribution
  • BAT - Module

A complete solution for your vacation portal, BAT allows you to build an exclusive booking engine for better customer relationship management. EasyBooking gives your visitors a set of options to make room reservations, contact hotel administration, or simply sign up for the hotel’s newsletter to stay aware of special offers and discounts.

FMCG

Theming

A design which resonates with your brand, captures interest and engages your visitors is what you should invest your resources in developing.

It’s the psychological effect which drives the visitor to make a transaction or to explore provided possibilities throughout the interface. Every landing page matters.

Regardless of your showcased products, Drupal themes provide sound navigation throughout the categories and sections, with built-in hero banner sections and pop-ups which are fully customizable.

Additional modules can be used to build an industry-specific theme. In order to cope with varied demands, Drupal provides more than two thousand easy and free-to-use themes on the go.

Government and Non-Government

Cost and Security:

In 2012 when the Georgian government shifted to Drupal, the first reason to dump its previous CMS (Vignette) was its rising maintenance costs. 

Running a total of 65 state websites on two different versions of this proprietary system proved to be costly in the long run.

Another decisive factor for government websites is uncompromised security, which is why government organizations are opting for Drupal. Around 150 governments are already powered by it. Just like the Georgian government, costs have been a significant factor affecting the choice of government and non-government agencies.

Higher Education

Distributions:

To quickly build your higher education website, distributions provide an easy opportunity, halving the development time and providing ready-made features. Opigno and OpenEDU are two of the distributions widely used by higher-ed websites.

Drupal is the most widely used CMS in the education sector; no wonder top international universities like Harvard, Brown, Yale, Pennsylvania, and Columbia rely on it.

HealthCare and Life Sciences

Content and User access control:

It can conform to any workflow that can be programmed with just a few of the available configurations. You can identify different types of content such as text, images, comments, file attachments, and any other information on your website for easy content integration and management.

Drupal As an Enterprise Management System

The need for an intranet system cannot be emphasized enough. For your business to grow by leaps and bounds, it is necessary to establish clear communication within your organization.

As your business expands, the need for an intranet system which can help in the storage and sharing of data increases. An ECMS is different from a web content management system in that the former is specifically designed for enterprise websites and is more dynamic.

Drupal allows building ECMS in two ways, either by using its modules and features or with the third party configuration. Its integration capabilities help the website to serve as a central content management system integrated with other necessary advancements.

Drupal Is Easier To Manage

Drupal isn’t hard to use, but it can be hard to learn. Even though it requires more technical experience, it is capable of producing exceptionally advanced sites. There is a WYSIWYG editor and drag-and-drop functionality to ease the process and help you start straight away.

The release of version 8 has made the platform easier to use even for non-developers (including content authors). Managing your website is easy, as the community provides you with the necessary documentation and answers in case you get stuck.

Summary

Being one of the leading technologies in the market, Drupal gives your enterprise the features and flexibility to innovate as per your visitor behavior and preferences.

We’d love to hear your thoughts. To get in touch, drop a mail at hello@opensenselabs.com and let us know how we can enhance your statistics with Drupal.

Categories: Drupal
