Drupal

FZ Songbook

New Drupal Modules - 12 February 2019 - 6:33am

The Songbook module is a Drupal 7 module.

It displays files written in the ChordPro format; see the ChordPro documentation for the syntax.

The Songbook module can be downloaded from its project page.

Install module

Download the module and copy it into the folder that contains your other contributed modules.
Enable it at admin/modules.
At admin/config/content/formats/full_html, enable the FZ songbook filter checkbox.

Categories: Drupal

Drupal OAuth Server ( OAuth Provider) - Single Sign On ( SSO )

New Drupal Modules - 12 February 2019 - 4:24am

OAuth Server (OAuth 2.0 Provider) enables Single Sign On (SSO) to your client apps using Drupal. It allows you to use your Drupal site as an OAuth server and expose OAuth-protected APIs. The primary goal of this OAuth Server / OAuth Provider module is to let users interact with client applications and services such as Google, Facebook, AWS Cognito, Azure AD, Salesforce and many more without requiring them to store sensitive credentials.
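
As a rough illustration of what a client application does in such an SSO setup, the sketch below shows the standard OAuth 2.0 authorization-code token exchange against a Drupal site acting as the provider. The /oauth/token path, client credentials and redirect URI are hypothetical placeholders rather than values defined by this module.

```typescript
// Hypothetical sketch of the standard OAuth 2.0 authorization-code exchange
// against a Drupal site acting as the OAuth provider. The endpoint path,
// client id/secret and redirect URI below are placeholders.
async function exchangeCodeForToken(code: string): Promise<string> {
  const body = new URLSearchParams({
    grant_type: 'authorization_code',
    code,                                          // code received at the redirect URI
    redirect_uri: 'https://app.example.com/callback',
    client_id: 'my-client-id',
    client_secret: 'my-client-secret',
  });

  const response = await fetch('https://drupal.example.com/oauth/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body,
  });

  const token = await response.json();
  return token.access_token;                       // sent as a Bearer token on API calls
}
```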

Categories: Drupal

Warmer

New Drupal Modules - 12 February 2019 - 3:08am

This module provides all the necessary infrastructure to orchestrate your cache warming processes.

You can warm the cache of your critical entities (and more!) right after you deploy to production. Additionally cron will keep them warm for you.

Categories: Drupal

Commerce Sezzle pay

New Drupal Modules - 12 February 2019 - 2:58am

Categories: Drupal

Dependant Reference Method

New Drupal Modules - 12 February 2019 - 12:21am
About

This module clones the approach of the Business Rules module so that an Entity Reference field instance can be filtered through an entity reference view display, and it adds an AJAX-driven dependent field on the same entity create/update form. The dependent field's value is forwarded as an argument to the view's contextual filters, which makes the filtering much faster.

Categories: Drupal

Spritespin

New Drupal Modules - 11 February 2019 - 10:47am

Integration with the SpriteSpin JavaScript library: http://spritespin.ginie.eu/
1. Get the JS file from http://unpkg.com/spritespin@x.x.x/release/spritespin.js, where x.x.x is the version number. Tested with version 4.0.6.
2. Put the JS file in your libraries folder so that it is located at libraries/spritespin/spritespin.js.
3. Install the Colorbox module if it is not installed already.
4. Install the Spritespin module as usual.

Categories: Drupal

Drupal core announcements: 8.7.0-alpha1 deadline moved to March 11

Planet Drupal - 11 February 2019 - 9:42am

Last fall, we adjusted our minor release date for Drupal 8.7.0 from March 6 to May 1. This was done as part of moving Drupal's minor release schedule toward a consistent schedule that will have minor releases in the first weeks of June and December each year. (See Plan for Drupal 9 for more information on why we are making this change.)

However, the change to the 8.7.0 release date means that DrupalCon Seattle now falls in the middle of important preparation phases for the minor release. In order to ensure community members have adequate time to prepare and test the release without interfering with DrupalCon Seattle events, we've moved the alpha and beta phases for the release one week earlier:

  • 8.7.0-alpha1 will now be released the week of March 11. The alpha phase will last two weeks until the release window for beta1.
  • 8.7.0-beta1 will now be released the week of March 25. The beta phase will now last three weeks (including the week of DrupalCon) instead of two. The beta phase will still end when the release candidate window begins.
  • The release candidate (RC) and release dates are unchanged. The RC window still begins April 15 and the scheduled release date is still May 1.

(Read more about alpha, beta, and RC phases.)

Categories: Drupal

Connection Plugins

New Drupal Modules - 11 February 2019 - 8:47am

The aim of this module is to provide a bridge so that users can more easily put together API connections when wiring decoupled applications together. In Drupal terms, the solution is to provide several different plugin types.

Categories: Drupal

Basic Data

New Drupal Modules - 11 February 2019 - 8:21am

Basic Data is a content entity that ships with an additional data property. The entity type is basic_data and you may add any fielded bundles you'd need.

The reason to use a basic_data entity rather than a node is simple: nodes do not come with the required data property. If you have no arbitrary data to store, consider using Migrate or some custom code to create nodes with fields that model the appropriate structured content instead.

Categories: Drupal

Simple Slim API

New Drupal Modules - 11 February 2019 - 6:41am

This module creates an API for Drupal nodes and users using the Slim Framework.

http://www.slimframework.com

Slim is a PHP micro framework that helps you quickly write simple yet powerful web applications and APIs.
At its core, Slim is a dispatcher that receives an HTTP request, invokes an appropriate callback routine, and returns an HTTP response.
That’s it.

Categories: Drupal

Acquia Developer Center Blog: Building Usable Conversations: Conversational Content Strategy

Planet Drupal - 11 February 2019 - 6:34am

In this fourth installment of our series on conversational usability, we're turning our attention to conversational content strategy, an underserved area of conversational interface design that is rapidly growing due to the number of enterprises eager to convert the text trapped in their websites into content that can be consumed through voice assistants and chatbots.

Tags: acquia drupal planet
Categories: Drupal

Dries Buytaert: Headless CMS: REST vs JSON:API vs GraphQL

Planet Drupal - 11 February 2019 - 5:59am

The web used to be server-centric in that web content management systems managed data and turned it into HTML responses. With the rise of headless architectures a portion of the web is becoming server-centric for data but client-centric for its presentation; increasingly, data is rendered into HTML in the browser.

This shift of responsibility has given rise to JavaScript frameworks, while on the server side, it has resulted in the development of JSON:API and GraphQL to better serve these JavaScript applications with content and data.

In this blog post, we will compare REST, JSON:API and GraphQL. First, we'll look at an architectural, CMS-agnostic comparison, followed by evaluating some Drupal-specific implementation details.

It's worth noting that there are of course lots of intricacies and "it depends" when comparing these three approaches. When we discuss REST, we mean the "typical REST API" as opposed to one that is extremely well-designed or following a specification (not REST as a concept). When we discuss JSON:API, we're referring to implementations of the JSON:API specification. Finally, when we discuss GraphQL, we're referring to GraphQL as it is used in practice. Formally, it is only a query language, not a standard for building APIs.

The architectural comparison should be useful for anyone building decoupled applications regardless of the foundation they use because the qualities we will evaluate apply to most web projects.

To frame our comparisons, let's establish that most developers working with web services care about the following qualities:

  1. Request efficiency: retrieving all necessary data in a single network round trip is essential for performance. The size of both requests and responses should make efficient use of the network.
  2. API exploration and schema documentation: the API should be quickly understandable and easily discoverable.
  3. Operational simplicity: the approach should be easy to install, configure, run, scale and secure.
  4. Writing data: not every application needs to store data in the content repository, but when it does, it should not be significantly more complex than reading.

We summarized our conclusions in the table below, but we discuss each of these four categories (or rows in the table) in more depth below. If you aggregate the colors in the table, you see that we rank JSON:API above GraphQL and GraphQL above REST.

  • Request efficiency
    REST: Poor; multiple requests are needed to satisfy common needs, and responses are bloated.
    JSON:API: Excellent; a single request is usually sufficient for most needs, and responses can be tailored to return only what is required.
    GraphQL: Excellent; a single request is usually sufficient for most needs, and responses include exactly what was requested.
  • Documentation, API explorability and schema
    REST: Poor; no schema, not explorable.
    JSON:API: Acceptable; generic schema only; links and error messages are self-documenting.
    GraphQL: Excellent; precise schema and excellent tooling for exploration and documentation.
  • Operational simplicity
    REST: Acceptable; works out of the box with CDNs and reverse proxies; few to no client-side libraries required.
    JSON:API: Excellent; works out of the box with CDNs and reverse proxies; no client-side libraries needed, but many are available and useful.
    GraphQL: Poor; extra infrastructure is often necessary, client-side libraries are a practical necessity, and specific patterns are required to benefit from CDNs and browser caches.
  • Writing data
    REST: Acceptable; HTTP semantics give some guidance, but specifics are left to each implementation; one write per request.
    JSON:API: Excellent; how writes are handled is clearly defined by the spec; one write per request, though support for multiple writes is being added to the specification.
    GraphQL: Poor; how writes are handled is left to each implementation and there are competing best practices; it is possible to execute multiple writes in a single request.

If you're not familiar with JSON:API or GraphQL, I recommend you watch the following two short videos. They will provide valuable context for the remainder of this blog post:

Request efficiency

Most REST APIs tend toward the simplest implementation possible: a resource can only be retrieved from one URI. If you want to retrieve article 42, you have to retrieve it from https://example.com/article/42. If you want to retrieve article 42 and article 72, you have to perform two requests; one to https://example.com/article/42 and one to https://example.com/article/72. If the article's author information is stored in a different content type, you have to do two additional requests, say to https://example.com/author/3 and https://example.com/author/7. Furthermore, you can't send these requests until you've requested, retrieved and parsed the article requests (you wouldn't know the author IDs otherwise).

Consequently, client-side applications built on top of basic REST APIs tend to need many successive requests to fetch their data. Often, these requests can't be sent until earlier requests have been fulfilled, resulting in a sluggish experience for the website visitor.

GraphQL and JSON:API were developed to address the typical inefficiency of REST APIs. Using JSON:API or GraphQL, you can use a single request to retrieve both article 42 and article 72, along with the author information for each. It simplifies the developer experience, but more importantly, it speeds up the application.
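
To make that difference concrete, here is a rough client-side sketch. The REST URLs mirror the example above; the JSON:API request assumes Drupal's default /jsonapi paths, an article node type and a uid author reference, all of which are illustrative.

```typescript
// Typical REST pattern: sequential round trips; the author requests can only
// be sent after the article responses reveal the author IDs.
async function loadWithRest(): Promise<void> {
  const a42 = await fetch('https://example.com/article/42').then((r) => r.json());
  const a72 = await fetch('https://example.com/article/72').then((r) => r.json());
  await fetch(`https://example.com/author/${a42.author_id}`);
  await fetch(`https://example.com/author/${a72.author_id}`);
}

// JSON:API pattern: one request returns the articles and, thanks to the
// `include` parameter, embeds the referenced author entities as well.
async function loadWithJsonApi(): Promise<void> {
  const doc = await fetch(
    'https://example.com/jsonapi/node/article?include=uid',
    { headers: { Accept: 'application/vnd.api+json' } },
  ).then((r) => r.json());
  // doc.data holds the articles, doc.included holds their authors.
  console.log(doc.data.length, doc.included.length);
}
```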

Finally, both JSON:API and GraphQL have a solution to limit response sizes. A common complaint against typical REST APIs is that their responses can be incredibly verbose; they often respond with far more data than the client needs. This is both annoying and inefficient.

GraphQL eliminates this by requiring the developer to explicitly add each desired resource field to every query. This makes it difficult to over-fetch data but easily leads to very large GraphQL queries, making (cacheable) GET requests impossible.

JSON:API solves this with the concept of sparse fieldsets: lists of desired resource fields. These behave in much the same fashion as GraphQL's field selection; however, when they're omitted, JSON:API will typically return all fields. An advantage, though, is that when a JSON:API query gets too large, the sparse fieldsets can simply be omitted so that the request remains cacheable.
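
As a sketch of what the two approaches look like on the wire (the resource type, field and schema names here are illustrative, not a specific Drupal schema):

```typescript
// JSON:API sparse fieldsets: request only the article's title and creation
// date, plus the author's name. Omitting `fields` returns all fields.
const sparseUrl =
  'https://example.com/jsonapi/node/article'
  + '?include=uid'
  + '&fields[node--article]=title,created'
  + '&fields[user--user]=name';

// GraphQL: every desired field must be listed explicitly in the query body,
// so nothing extra is returned, but the query itself can grow large.
const graphqlQuery = `
  {
    articles {
      title
      created
      author {
        name
      }
    }
  }
`;
```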

  • Multiple data objects in a single response
    REST: Usually, but every implementation is different (for Drupal: a custom "REST Export" view or custom REST plugin is needed).
    JSON:API: Yes.
    GraphQL: Yes.
  • Embed related data (e.g. the author of each article)
    REST: No.
    JSON:API: Yes.
    GraphQL: Yes.
  • Only needed fields of a data object
    REST: No.
    JSON:API: Yes; servers may choose sensible defaults, and developers must be diligent to prevent over-fetching.
    GraphQL: Yes; strict, which eliminates over-fetching but, at the extreme, can lead to poor cacheability.

Documentation, API explorability and schema

As a developer working with web services, you want to be able to discover and understand the API quickly and easily: what kinds of resources are available, what fields does each of them have, how are they related, etc. But also, if this field is a date or time, what machine-readable format is the date or time specified in? Good documentation and API exploration can make all the difference.

  • Auto-generated documentation
    REST: Depends; if using the OpenAPI standard.
    JSON:API: Depends; if using the OpenAPI standard (formerly Swagger).
    GraphQL: Yes; various tools available.
  • Interactivity
    REST: Poor; navigable links rarely available.
    JSON:API: Acceptable; observing the available fields and links in its responses enables exploration of the API.
    GraphQL: Excellent; autocomplete feature, instant results or compilation errors, complete and contextual documentation.
  • Validatable and programmable schema
    REST: Depends; if using the OpenAPI standard.
    JSON:API: Depends; the JSON:API specification defines a generic schema, but a reliable field-level schema is not yet available.
    GraphQL: Yes; a complete and reliable schema is provided (with very few exceptions).

GraphQL has superior API exploration thanks to GraphiQL (demonstrated in the video above), an in-browser IDE of sorts, which lets developers iteratively construct a query. As the developer types the query out, likely suggestions are offered and can be auto-completed. At any time, the query can be run and GraphiQL will display real results alongside the query. This provides immediate, actionable feedback to the query builder. Did they make a typo? Does the response look like what was desired? Additionally, documentation can be summoned into a flyout, when additional context is needed.

On the other hand, JSON:API is more self-explanatory: APIs can be explored with nothing more than a web browser. From within the browser, you can browse from one resource to another, discover its fields, and more. So, if you just want to debug or try something out, JSON:API is usable with nothing more than cURL or your browser. Or, you can use Postman (demonstrated in the video above) — a standalone environment for developing on top of any HTTP-based API. Constructing complex queries requires some knowledge, however, and that is where GraphQL's GraphiQL shines compared to JSON:API.

Operational simplicity

We use the term operational simplicity to encompass how easy it is to install, configure, run, scale and secure each of the solutions.

The table should be self-explanatory, though it's important to make a remark about scalability. To scale a REST-based or JSON:API-based web service so that it can handle a large volume of traffic, you can use the same approach websites (and Drupal) already use, including reverse proxies like Varnish or a CDN. To scale GraphQL, you can't rely on HTTP caching as with REST or JSON:API without persisted queries. Persisted queries are not part of the official GraphQL specification but they are a widely-adopted convention amongst GraphQL users. They essentially store a query on the server, assign it an ID and permit the client to get the result of the query using a GET request with only the ID. Persisted queries add more operational complexity, and it also means the architecture is no longer fully decoupled — if a client wants to retrieve different data, server-side changes are required.
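
A sketch of the difference, with hypothetical endpoints, query ID and parameter name (persisted-query conventions vary between GraphQL servers):

```typescript
// Ordinary GraphQL: the full query travels in a POST body, which reverse
// proxies and CDNs cannot cache the way they cache GET responses.
async function adHocQuery(): Promise<unknown> {
  const res = await fetch('https://example.com/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: '{ articles { title } }' }),
  });
  return res.json();
}

// Persisted query: the query text is stored server side under an ID, so the
// client can issue a plain, cacheable GET request that only references it.
async function persistedQuery(): Promise<unknown> {
  const res = await fetch('https://example.com/graphql?queryId=articles-with-titles');
  return res.json();
}
```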

  • Scalability: additional infrastructure requirements
    REST: Excellent; same as a regular website (Varnish, CDN, etc.).
    JSON:API: Excellent; same as a regular website (Varnish, CDN, etc.).
    GraphQL: Usually poor; only the simplest queries can use GET requests, and to reap the full benefit of GraphQL, servers need their own tooling.
  • Tooling ecosystem
    REST: Acceptable; lots of developer tools available, but for the best experience they need to be customized for the implementation.
    JSON:API: Excellent; lots of developer tools available; tools don't need to be implementation-specific.
    GraphQL: Excellent; lots of developer tools available; tools don't need to be implementation-specific.
  • Typical points of failure
    REST: Fewer; server, client.
    JSON:API: Fewer; server, client.
    GraphQL: Many; server, client, client-side caching, client and build tooling.

Writing data

For most REST APIs and for JSON:API, writing data is as easy as fetching it: if you can read information, you also know how to write it. Instead of using the GET HTTP request type, you use POST and PATCH requests. JSON:API improves on typical REST APIs by eliminating differences between implementations: there is just one way to do things, which enables better generic tooling and less time spent on server-side details.
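
For example, a create request under the JSON:API specification looks roughly like the sketch below; the resource type and field names follow Drupal's node--article convention but are illustrative, and authentication is omitted.

```typescript
// Creating a resource per the JSON:API spec: POST a `data` object carrying a
// `type` and its `attributes`. Updates use PATCH against the resource's URL.
async function createArticle(title: string): Promise<unknown> {
  const res = await fetch('https://example.com/jsonapi/node/article', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/vnd.api+json',
      Accept: 'application/vnd.api+json',
      // Authentication headers omitted for brevity.
    },
    body: JSON.stringify({
      data: {
        type: 'node--article',
        attributes: { title },
      },
    }),
  });
  return res.json();
}
```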

The nature of GraphQL's write operations (called mutations) means that you must write custom code for each write operation; unlike the JSON:API specification, GraphQL doesn't prescribe a single way of handling write operations to resources, so there are many competing best practices. In essence, the GraphQL specification is optimized for reads, not writes.

On the other hand, the GraphQL specification supports bulk/batch operations automatically for the mutations you've already implemented, whereas the JSON:API specification does not. The ability to perform batch write operations can be important. For example, in our running example, adding a new tag to an article would require two requests; one to create the tag and one to update the article. That said, support for bulk/batch writes in JSON:API is on the specification's roadmap.

  • Writing data
    REST: Acceptable; every implementation is different, and there is no bulk support.
    JSON:API: Excellent; JSON:API prescribes a complete solution for handling writes, and bulk operations are coming soon.
    GraphQL: Poor; GraphQL supports bulk/batch operations, but writes can be tricky to design and implement, and there are competing conventions.

Drupal-specific considerations

Up to this point we have provided an architectural and CMS-agnostic comparison; now we also want to highlight a few Drupal-specific implementation details. For this, we can look at the ease of installation, automatically generated documentation, integration with Drupal's entity and field-level access control systems and decoupled filtering.

Drupal 8's REST module is practically impossible to set up without the contributed REST UI module, and its configuration can be daunting. Drupal's JSON:API module is far superior to Drupal's REST module at this point. It is trivial to set up: install it and you're done; there's nothing to configure. The GraphQL module is also easy to install but does require some configuration.

Client-generated collection queries allow a consumer to filter an application's data down to just what they're interested in. This is a bit like a Drupal View except that the consumer can add, remove and control all the filters. This is almost always a requirement for public web services, but it can also make development more efficient because creating or changing a listing doesn't require server-side configuration changes.

Drupal's REST module does not support client-generated collection queries. It requires a "REST Views display" to be set up by a site administrator, and since these need to be manually configured in Drupal, a client can't craft its own queries with the filters it needs.

With JSON:API and GraphQL, clients are able to perform their own content queries without the need for server-side configuration. This means that they can be truly decoupled: changes to the front end don't always require a back-end configuration change.
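
For instance, with Drupal's JSON:API module a consumer can assemble its own listing entirely on the client side, roughly as sketched below; the field names are illustrative, and the filter, sort and pagination parameters follow the JSON:API specification.

```typescript
// A client-built collection query: filter, sort and page the results without
// any server-side view or configuration change.
const listingUrl =
  'https://example.com/jsonapi/node/article'
  + '?filter[status]=1'                      // only published articles
  + '&sort=-created'                         // newest first
  + '&page[limit]=10'                        // first ten results
  + '&fields[node--article]=title,created';  // keep the response small

fetch(listingUrl, { headers: { Accept: 'application/vnd.api+json' } })
  .then((r) => r.json())
  .then((doc) => console.log(`${doc.data.length} articles loaded`));
```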

These client-generated queries are a bit simpler to use with the JSON:API module than they are with the GraphQL module because of how each module handles Drupal's extensive access control mechanisms. By default JSON:API ensures that these are respected by altering the incoming query. GraphQL instead requires the consumer to have permission to simply bypass access restrictions.

Most projects using GraphQL that cannot grant this permission use persisted queries instead of client-generated queries. This means a return to a more traditional Views-like pattern because the consumer no longer has complete control of the query's filters. To regain some of the efficiencies of client-generated queries, the creation of these persisted queries can be automated using front-end build tooling.

  • Ease of installation and configuration
    REST: Poor; requires the contributed REST UI module, and it is easy to break clients by changing configuration.
    JSON:API: Excellent; zero configuration!
    GraphQL: Poor; more complex to use, and may require additional permissions, configuration or custom code.
  • Automatically generated documentation
    REST: Acceptable; requires the contributed OpenAPI module.
    JSON:API: Acceptable; requires the contributed OpenAPI module.
    GraphQL: Excellent; GraphQL Voyager included.
  • Security: content-level access control (entity and field access)
    REST: Excellent; content-level access control respected.
    JSON:API: Excellent; content-level access control respected, even in queries.
    GraphQL: Acceptable; some use cases require the consumer to have permission to bypass all entity and/or field access.
  • Decoupled filtering (client can craft queries without server-side intervention)
    REST: No.
    JSON:API: Yes.
    GraphQL: Depends; only in some setups and with additional tooling/infrastructure.

What does this mean for Drupal's roadmap?

Drupal grew up as a traditional web content management system but has since evolved for this API-first world and industry analysts are praising us for it.

As Drupal's project lead, I've been talking about adding out-of-the-box support for both JSON:API and GraphQL for a while now. In fact, I've been very bullish about GraphQL since 2015. My optimism was warranted; GraphQL is undergoing a meteoric rise in interest across the web development industry.

Based on this analysis, we rank JSON:API above GraphQL and GraphQL above REST. As such, I want to change my recommendation for Drupal 8 core. Instead of adding both JSON:API and GraphQL to Drupal 8 core, I believe only JSON:API should be added. While Drupal's GraphQL implementation is fantastic, I no longer recommend that we add GraphQL to Drupal 8 core.

On the four qualities by which we evaluated the REST, JSON:API and GraphQL modules, JSON:API has outperformed its contemporaries. Its web standards-based approach, its ability to handle reads and writes out of the box, its security model and its ease of operation make it the best choice for Drupal core. Additionally, where JSON:API underperformed, I believe that we have a real opportunity to contribute back to the specification. In fact, one of the JSON:API module's maintainers and co-authors of this blog post, Gabe Sullice (Acquia), recently became a JSON:API specification editor himself.

This decision does not mean that you can't or shouldn't use GraphQL with Drupal. While I believe JSON:API covers the majority of use cases, there are valid use cases where GraphQL is a great fit. I'm happy that Drupal is endowed with such a vibrant contributed module ecosystem that provides so many options to Drupal's users.

I'm excited to see where both the JSON:API specification and Drupal's implementation of it go in the coming months and years. As a first next step, we're preparing the JSON:API module to be added to Drupal 8.7.

Special thanks to Wim Leers (Acquia) and Gabe Sullice (Acquia) for co-authoring this blog post and to Preston So (Acquia) and Alex Bronstein (Acquia) for their feedback during the writing process.

Categories: Drupal

Delphi

New Drupal Modules - 11 February 2019 - 5:35am
Introduction

The Delphi module provides an implementation of the Delphi method for http://armstrong.wharton.upenn.edu. With some work it could be useful to other sites.

Categories: Drupal

Walkme

New Drupal Modules - 11 February 2019 - 12:45am

About "Walkme"

WalkMe’s Digital Adoption Platform (DAP) makes it effortless to use any software,
website, or app. Combined with proactive, step-by-step guidance, our comprehensive
solution analyzes and automates processes so users can complete tasks easily in
the moment of need.

Installation

Before installing this module, go to walkme.com, generate your script, and publish it.
After the module is installed, navigate to admin/config/system/walkme and paste the WalkMe script into the Script field.

Categories: Drupal

Agiledrop.com Blog: Our blog posts from January 2019

Planet Drupal - 11 February 2019 - 12:16am

We've prepared an overview of all our blog posts from January 2019; have a look!

Categories: Drupal

OpenSense Labs: Small enterprises, Rooting for Goliath? Here is How to Bag Big projects

Planet Drupal - 10 February 2019 - 8:29pm
Wisdom doesn’t automatically come with old age. Nothing does - except wrinkles. It’s true that some wines improve with age, but only if the grapes were good in the first place.
-Abigail Van Buren

A reflection on life experience adds generously to one's store of wisdom, because, let's face it, wisdom and savvy can come from anyone and anywhere. The famous quote "Age is just a number" does justice to this idea.

In the same way, the common assumption that "bigger is better" is being disproved by small agencies handling big projects. Gone are the days when large enterprises ruled the market and bagged all the big projects. Today, small agencies are winning big-name accounts and interesting projects far more often, and the trend is forecast to continue.

For a Drupal agency with big aspirations, deciding which projects to go after can be a task in itself, but earning the trust of the CxOs of big organizations is an even bigger one.

To help with both winning and handling such work, here are some ways to land those big projects.

First things First - How to meet big clients?

Just because you are a small agency or organization does not mean your clients have to be small. Landing a large organization not only boosts a small business's revenue but also increases efficiency across your team and organization.

  • Use client reference to introduce your process

Big companies may seem like monolithic entities, but they are made up of hundreds or thousands of individuals, many of whom have the power to make decisions.

It is therefore important that your research is top-notch and accurately tells you who to contact within the company you've targeted. Referrals and existing contacts can help with this, and some companies also publish the details of at least one senior employee on their websites.

You still need to be creative to figure out exactly who the right person is: look through the company's publications and newspaper mentions to see whose name comes up.
You can also rely on people who can introduce you to the big tech giants.

  • Indulge in cold calling

Telemarketing and cold calling continue to be essential sales disciplines. In many sales organizations, old-school door knocking is no longer very productive, and when it comes to big organizations, especially those with large territory assignments, cold calling becomes the hero. Prospecting by phone remains a great complement to your overall lead generation efforts.

  • Be an expert, then be a solution to their needs

If you want the big players to trust you with their projects, a sense of what the work means to you must be established, along with a clear vision for the future. In fact, according to the Employee Job Satisfaction and Engagement survey, nearly 77% of employees said it was important to their job satisfaction and engagement to have a clear understanding of their organization’s vision and mission.

Start with your team

Now that you have big names in your sights, start by developing a strong team and skill set, starting with:

  • A team of Generalists 

Generalists are people who have a particular skill but are flexible enough to adapt to any situation and are ready to learn new skills. In the case of Drupal websites, a generalist should be able to handle both the back end and the front end.

In other words, having generalists is beneficial for your organization: they can handle many kinds of tasks effectively.

 

  • Services are important 

Focus on the set of services and assistance you will be providing to the client. Your team will become specialists with time and experience. Treat a big enterprise like royalty.

A big enterprise is a customer that always expects great service and will not put up with long waits or poor responses from your representatives.

Be honest about your projects and their goals. If your customers find that you are dishonest about your services, they will lose faith in you and may even spread negative feedback about your business.

  • Categorizing your projects

To ensure that the complexity of the project is achieved, categorize the project into the following:

Small projects: These can easily be tracked just by getting updates. A project is classified as small when the relationships between tasks are basic and detailed planning or organization is not required.

Charter required projects: These are projects that require some level of approval other than the first line manager, but do not include significant financial investment. A summary of major deliverables is usually enough for management approval.

Large projects: The project network is broad and complicated. There are many task interdependencies. With these projects, simplification where possible is everything. 

  • Planning 

Planning a project helps in achieving objectives and meeting deadlines on time. It pushes team members to keep working hard until the goals are met. Planning also gives the organization a clear sense of direction.

Increases efficiency: Planning helps make maximum use of all available resources. It reduces the waste of precious resources, avoids duplication of effort, and aims to give the greatest return at the lowest possible cost.

Reduces risks: Large projects carry many associated risks. Planning helps to forecast these risks and to take the necessary precautions to avoid them.

Facilitates coordination: The plans of all departments of an organization should be well coordinated with each other, just as its short-term, medium-term and long-term plans should be.

Aids in organizing: Organizing brings together all available resources, and it is not possible without planning, since planning tells us how many resources are needed and when they are needed. In this way, planning aids effective organizing.

Keeps good control: Actual performance is compared against the plan, and deviations (if any) are found and corrected. It is impossible to achieve such control without the right planning, so planning is necessary to keep good control.

 

  • The scope of the Project 

Perhaps the most difficult part of managing a large project with a small team is distinguishing between a task and an actual project. For small project teams to be successful with large projects, the manager should always know the status of the project and how much of its scope has been achieved.

  • Excellent Relationship with the vendor

The most important part of managing big projects with small teams is to establish a meaningful relationship across the organization.

A solid relationship can make the difference between a project that becomes reality and one that remains conceptual. If your business doesn't concentrate on a product or service that is important for reaching your clientele, you need a partner that does.

Next come the methodologies

Large organizations usually follow classical methodologies, which involve a lot of unnecessary documentation. For small agencies, the following methodologies help greatly in handling large projects.

  • Agile
Agile was developed for projects that require both speed and flexibility. The method is split into sprints: short cycles for producing specific features.

Agile is highly interactive, allowing for fast adjustments throughout a project. It is mostly applied in software development projects, in large part because it makes it simpler to identify issues quickly.

Agile is valuable because it allows changes to be made early in the development process, rather than having to wait until testing is complete.

  • Scrum

Scrum is a variation of the agile framework that is iterative in nature and relies on scrum sessions to evaluate priorities. Scrum meetings and 30-day sprints are used to limit work to prioritized tasks.

Small teams can be gathered to concentrate on a particular task independently and then meet with the scrum master to assess progress or results and reprioritize backlogged tasks.

 

  • Waterfall 

This is a basic, sequential methodology from which Agile and similar approaches evolved. It is commonly practiced in many industries, especially in software projects.

Waterfall has been a staple project management methodology for years and is used by many project managers. It consists of static phases (analysis, design, testing, implementation, and maintenance) that are executed in a specific order.

  • Critical Path Method 

CPM is an orderly, systematic method that breaks down project development into specific but related actions. 

This methodology can be used to prioritize a project's activities, assess risks and allocate resources accordingly. It encourages teams to identify milestones, task dependencies, and deadlines efficiently.

The critical path is the longest sequence of dependent tasks in a project: the chain of tasks that must be completed on time for the project to meet its deadline.

Culture is Fundamental to Succeed

How do you explain to your client that the team won't be working this week because of DrupalCon, a DrupalCamp or another community event?

You can only explain it by being clear about your thoughts and ideas. The community plays a vital role in everything here.

Explain to your team members that it is beneficial for them to help improve Drupal as a platform, and introduce them to the community culture. Help your team members create profiles on drupal.org and make sure they receive credit for their contributions to patches and modules.

Closing the project 

Project closing might look like an insignificant task in your project management journey, but it is in fact a critical part of delivering a successful project. To help you get this step right, here are four things you need to know about closing a project effectively.

Trace project deliverables: Effective closure means that you have completed all the deliverables to the satisfaction of the project's sponsor.

Reward team members: As your project comes to a close, always make sure to acknowledge, recognize and appreciate the contributions of your team members.

Close-out reports: A detailed close-out report should contain details about the process used during the project, the mistakes made, the lessons learned, and how successful the project was in achieving its initial goals.

Finance: Big clients are usually slow to pay, so try to keep an agile budget for large projects.

Turning from technical provider to strategic solution partner

As with any investment portfolio, an organization’s investment in Run, Optimise and Innovate initiatives must be balanced and aligned with the organization’s risk tolerance and the role expected of IT. If an organization considers itself to be more conservative, it is expected to see a higher ratio of Run to Optimise and Innovate spending. More progressive organizations will have more Optimise spending, and “leading edge” organizations will have more Innovate spending.

Conclusion 

Yes, Goliath the Gittite is, and will always be, the best-known giant in the Bible. He is described as 'a champion out of the camp of the Philistines, whose height was six cubits and a span'.

Befriending Goliath not only gave a sense of power to anyone standing with him but also granted them security.

Landing large enterprises with big projects is the first step towards success; the right approach and follow-through will take you the rest of the way.

OpenSense Labs' development methodologies focus specifically on approaches to Drupal development that enhance efficiency and improve project delivery.

Contact us at hello@opensenselabs.com to accomplish those large projects you have always desired.

Categories: Drupal

User Account Language Negotiation

New Drupal Modules - 10 February 2019 - 1:23pm

This module does two things differently from Drupal 8 core:

  1. Allows language switchers to save the new language in the user account of the logged-in user.
  2. In addition, it shows each language in its own translation (its native name).

You need this module only if your multilingual site has users that are typically logged in.

Categories: Drupal

Shakespeare Backend

New Drupal Modules - 10 February 2019 - 11:01am
Categories: Drupal

OpenSense Labs: Leveraging NightwatchJS in Drupal

Planet Drupal - 10 February 2019 - 8:34am

The night watchman can get a new image as an alert guard if he senses correctly that something does not look right. Imagine a man dressed like a stockbroker and carrying a soft leather briefcase and a valise. His walking style is sort of tentative and squirrelish which is not the way a stockbroker walks. On being asked for an ID by the watchman, he scurries off towards the entry gate of a building and drops his valise on to the floor before being finally captured by the watchman who later finds out that he was trying to rob someone from the building.


Much like the night watchman, whose sound judgement keeps the building safe from such robbers, there is a solution in the digital landscape that helps in writing browser tests easily and safely. NightwatchJS is an exceptional option for running browser tests, and its inclusion in Drupal core has made things easier for Drupalists.

Understanding NightwatchJS


NightwatchJS is an automated testing framework for web applications and websites. It is written in Node.js and uses the W3C WebDriver API (formerly known as Selenium WebDriver) to perform commands and assertions on DOM elements.

Nightwatch.js is an integrated, easy to use End-to-End testing solution for browser-based apps and websites, written on Node.js. - Nightwatchjs.org

As a complete browser (end-to-end) testing solution, NightwatchJS aims to streamline the process of setting up continuous integration and writing automated tests. It can also be used to write Node.js unit tests. It has a clean syntax that helps you write tests rapidly using Node.js and CSS or XPath selectors. Its out-of-the-box command-line test runner can execute tests sequentially or in parallel, by group, by tag or individually. It also supports the Mocha runner out of the box.
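
To give a feel for that syntax, here is a minimal, hypothetical sketch of a Nightwatch test. The file name, target URL and expected title are placeholders, and a TypeScript-aware runner is assumed; plain JavaScript with module.exports works the same way.

```typescript
// tests/frontPage.ts: a minimal sketch of a Nightwatch browser test.
const frontPageTests = {
  'front page loads and has the expected title': (browser: any) => {
    browser
      .url('http://localhost')           // hypothetical local test site
      .waitForElementVisible('body')     // wait until the page has rendered
      .assert.titleContains('Drupal')    // assert on the <title> element
      .end();                            // close the browser session
  },
};

export default frontPageTests;
```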

NightwatchJS has its own cloud testing platform, NightCloud.io, in addition to supporting other cloud testing providers like SauceLabs and BrowserStack. It manages Selenium and WebDriver services automatically in a separate child process and has great support for working with the Page Object Model. Moreover, with its out-of-the-box JUnit XML reporting, it is possible to integrate your tests into the build process with systems like Jenkins.

NightwatchJS in Drupal

The JavaScript Modernisation Initiative paved the way for the addition of NightwatchJS to Drupal core (in version 8.6) as the new standard framework for unit and functional testing of JavaScript. It helps ensure that changes to the system do not break expected functionality, and it also lets you write tests for your contributed modules and themes. It can be included in the build process to ensure that regressions rarely creep into production.

You can try NightwatchJS in Drupal 8.6 by following the instructions on GitHub. They show how to test core functionality as well as how to test your existing sites, modules, and themes by providing your own custom commands, assertions and tests. It is worth checking out the Nightwatch API documentation and the NightwatchJS developer guide for creating custom commands and assertions.

NightwatchJS tests will be run by Drupal CI and will be viewable in the test log for core developers and module authors. For your own projects, tests can easily be run in, for instance, CircleCI, which gives you access to artefacts like screenshots and console logs.

Conclusion

While Drupal 8 has extensive back-end test coverage, NightwatchJS offers a more modern platform that will make Drupal more familiar to PHP and JavaScript developers. 

Offering amazing digital experience has been our biggest objective and we have been doing that with a suite of services.

Contact us at hello@opensenselabs.com and let us know how can we help you achieve your digital transformation dreams.

Categories: Drupal

OpenSense Labs: Improving remote communications: Rocket.Chat for Drupal

Planet Drupal - 10 February 2019 - 7:54am

Virtual private servers are fantastic for running your own cloud applications and give you authority over your private data. That data can potentially be leaked when you communicate via services like text messaging. One way to ensure greater privacy is to host your own messaging system, and this is where Rocket.Chat comes into the picture.


Rocket.Chat is a genuinely open source implementation of an HTTP-based chat solution that provides convenience and greater freedom at the same time. It can be a marvellous solution for remote communication, especially for open source communities.

Rocket Chat: A close look


The official site of Rocket.Chat describes it as open source team communication software that offers an alternative for remote communication, replacing the likes of email, HipChat and Slack. It improves productivity through effective team communication and collaboration. It helps the team share files, chat in real time, or even hold audio or video conference calls with screen sharing. Being an open source solution, it gives you the option of customising, extending or adding new functionality to meet your expectations.

Rocket.Chat is an open source team communication software which offers an alternative to remote communications by replacing the likes of email, HipChat and Slack

 


With Rocket.Chat, you can do away with cc/bcc and make use of Rocket.Chat channels and private groups, bringing more transparency to team communication. By using @username, you can pull in relevant participants and notify them quickly, and when you need to inform all the members of a group about an important matter, @all can be used. Participants can join or leave at any time and see the full chat history. Moreover, Rocket.Chat offers a secure workspace with username restrictions and greater transparency for admins. You can join leading blockchain proponents like Hyperledger, Brave and Aragon, among others, in migrating from Slack and Atlassian to Rocket.Chat.

Essential features

Following are some of the major features of Rocket.Chat:

  • Unlimited: Rocket.Chat has the provision for unlimited users, channels, searches, messages, guests and file uploads.
  • Authentication mechanism: It offers different authentication mechanisms like LDAP Group Sync, 2-factor authentication, end-to-end encryption, single sign-on and dozens of OAuth providers.
  • Real-time Chat: Its Live Chat feature allows you to add real-time chat widgets to any site or mobile applications. This brings more efficacy in team communication and also ensures top-notch customer service.
  • Message translation: It utilises machine learning for automatically translating messages in real-time.
  • Use across platforms: It can be used on all platforms with its web, desktop and mobile applications, LiveChat clients, and SDK (Software Development Kit).
  • Marvellous customisation: You can alter it to meet your requirements. Incoming and outgoing WebHook integrations can be added, the user interface can be personalised by overriding any of the built-in styles, and you can take advantage of its REST API, LiveChat API or Real-time API (a small example using the REST API follows this list).
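
As a small illustration of that REST API, the sketch below logs in and posts a message; the server URL, account and channel are hypothetical, and the endpoints follow Rocket.Chat's documented REST API.

```typescript
// A minimal sketch of posting a message through Rocket.Chat's REST API.
// The server URL, credentials and channel below are placeholders.
const BASE = 'https://chat.example.com';

async function postMessage(text: string): Promise<void> {
  // 1. Log in to obtain an auth token and user id.
  const login = await fetch(`${BASE}/api/v1/login`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ user: 'bot', password: 'secret' }),
  }).then((r) => r.json());

  // 2. Post the message using the returned credentials.
  await fetch(`${BASE}/api/v1/chat.postMessage`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Auth-Token': login.data.authToken,
      'X-User-Id': login.data.userId,
    },
    body: JSON.stringify({ channel: '#general', text }),
  });
}

postMessage('Deployment finished.');
```
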
Rocket.Chat in Drupal

The Rocket.Chat module, available for both Drupal 7 and Drupal 8, helps a Drupal site integrate Rocket.Chat. It consists of a base module that holds the configuration and a LiveChat submodule that provides a block for placing the LiveChat widget on a page.
 
The maintainers of this module recommend running Drupal and Rocket.Chat behind a TLS (Transport Layer Security) proxy or a web server with TLS capabilities, and taking care to avoid mixing HTTPS and HTTP. Enabling LiveChat on the Rocket.Chat instance allows you to use the LiveChat feature.

Conclusion

One of the quintessential aspects of open source communities is remote communication. Rocket.Chat offers a great alternative to the likes of Slack, and this module lets you enable it in Drupal.
 
We are strongly inclined towards digital innovation and have been delivering it with our expertise in Drupal development.

Contact us at hello@opensenselabs.com to understand more on Rocket.Chat and transform your team communication and collaboration.

Categories: Drupal
