If I have said it once, I have said it a hundred times: I LOVE monster books. Any and all monster books. I even grab monster books for games I don't play. So when Eric Bloat, the head monster hunter at Bloat Games, asked if I would be interested in looking over his new monster book (Kickstarting today) for his FANTASTIC Dark Places and Demogorgons, I screamed HELL YES at my computer.
Now I promise to be fair here, but a couple of words of full disclosure first. I was sent this book as part of an agreement for a review. That is no big deal; I get a lot of books this way and I always try to be fair. Secondly, well, look above. I am predisposed to like monster books, I already love DP and D, and cryptids are a TON of fun. So please keep all this in mind.
Dark Places and Demogorgons: Cryptid Manual is a digest-sized book weighing in at 90 or so pages. Some bits look like redacted government documents and blood-spattered hunter's notebooks. It's actually pretty cool looking, if not 100% original (see Chill, Supernatural and Conspiracy X). That being said, it is also 100% EXPECTED. That's exactly how I want my '80s monster hunting guide to look.
The interior and the cover feature two-color art (blacks and reds) on glossy pages. Now the gloss might just be my pre-release copy, or not. In any case the color, the art, and the layout are all a leap ahead of all the previous DP and D books in terms of style and look. If this is the future of their books, then the future looks good.
A little over 50 monsters fill this book. They use the same stat block as DP and D so that also means they are roughly compatible with Swords and Wizardry (I'd say about 99%) and most other OSR-flavored games. Given the size of the book it fits in nicely with my Swords and Wizardry Whitebox games, so I have another monster book now for that! Each monster gets a page. Some exceptions occur with the Bigfoots and the E.T.s, but still, it's a good bit for each one.
There are also templates in the back of the book that work like the monster templates from 3.x. So you can apply the Vampire, Werewolf or (my favorite) Radioactive template, among others, to any monster. Radioactive Bigfoots? Hell yes! There is also a table of enhancements and how they change your monster. So now it's Agile Radioactive Bigfoots!
There are also some conditions ported over (more or less) from 3.x; they are very, very useful and I am happy to see them here.
OK, what are some of my favorites? There is the Almasti, which I also used in Ghosts of Albion; they have a special place in my heart. I'll likely include Almasti Shamans in my DP and D games like I did with Ghosts. Old faves like the Bunyip and Chupacabra. Holy crap, there is a Crocoduck!
I have to admit I nearly shot coffee out of my nose when I first saw that. Worth the price of the book alone in my mind. The Flatwoods Monster, all the various extraterrestrials (Nordics, Reptilians, LGMs, Greys), Hellhounds, the Hodag (love those things!), Jersey Devils, Skin Walkers, and the Wendigo. So plenty really, and many more. The monsters mostly come from modern cryptids, but there are some classics from myths and local legends too.
This book is great, really. While I may have been predisposed to like it, it really delivered and then some. The art is great and fun, the layout is top notch, and the monsters are just too much fun.
While reading it I could not help but think how well this would also work with White Star or other White Box derived games. So even if you don't play DP and D (and you really should, it's just too much fun) you can still get a lot of enjoyment out of this book. For example, the Cryptid Manual is 90% compatible with Swords and Wizardry White Box. There is not a lot of overlap in monsters, so this makes the CM a perfect monster book for S and W WB players. Also, there are a lot of "new" monsters in S and W for the DP and D player/GM. Who's to say that an alien life form couldn't resemble an orc or a wyvern?
In fact, this is true for nearly every clone. The clone game provides monsters for DP and D, and the Cryptid Manual provides new monsters for your clone of choice. You just need to justify why they are there.
Worth picking up.
Provides functionality for adding redirect type info to redirects via a taxonomy-based selection field.

Motivation
Va.gov has several legacy redirects that need to be migrated into Drupal. Some are server level, some are template based.
Va.gov uses a headless setup, with GraphQL queries generated for consumption via a Metalsmith frontend. In the redirect query, information about the original redirect source is required, so this module associates a new vocabulary taxonomy term with each redirect to indicate the redirect type.
This module dispatches an event when configuration has changed and has been saved.
Here is an example of how to subscribe to this event:
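A minimal sketch of such a subscriber, in the usual Drupal 8 event-subscriber pattern. The event name `my_module.config_saved` and the module name are hypothetical placeholders; use the actual event class or constant this module defines.

```php
<?php

namespace Drupal\my_module\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Reacts to the configuration-saved event dispatched by this module.
 *
 * NOTE: the event name below is a hypothetical placeholder; subscribe
 * to the actual event class/constant this module dispatches. Register
 * this class in my_module.services.yml as a service tagged with
 * { name: event_subscriber }.
 */
class ConfigSavedSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents() {
    // Map the event name to the method that handles it.
    return [
      'my_module.config_saved' => 'onConfigSaved',
    ];
  }

  /**
   * Logs a message when changed configuration has been saved.
   */
  public function onConfigSaved($event) {
    \Drupal::logger('my_module')->notice('Changed configuration was saved.');
  }

}
```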
Education is just like planting a seed, a seed that goes through different stages, each with a significant and important role. If any stage is missed, it destroys the life cycle of the plant. It is not what is poured into a student that counts, but what is planted.
There is no secret to the fact that education has the power to change lives. For a successful career, every student needs to go through the learning stage of knowledge, confidence, academics and technical skills so that they can grow efficiently. A college education is one such element that contributes highly to these steps of learning.
Therefore, to achieve these steps of knowledge, campus management software has been introduced.
But what exactly is OpenCampus?
OpenCampus, an open source campus management solution, is one such piece of management software that has made life easy for students, teachers, authorities and the other people that follow down the chain. Such a system has brought standardisation and steadiness within the organisation.
OpenCampus is a technical model that contributes highly to the outlook and the network of universities. It was developed as the first open adoption campus management solution in Drupal. OpenCampus is designed to cover the whole life cycle of students.
In Germany and Austria, more than 30 universities are using this software and it is highly contributing to their needs and requirements.
With the help of the OpenCampus software, you can manage everything. From managing courses to recording achievements, the application does it all and is considered among the most versatile applications of its kind. It allows the mapping of complex procedures, such as the allocation of students into smaller classes in medical or dentistry programs.
The framework of OpenCampus is based on the open source technology Drupal, and it lets customers create their own applications with smooth integration of third-party products such as Moodle.

Features provided by OpenCampus
Features and benefits:

Application and Admissions
- Transparent and multi-stage application process
- Dynamic list views
- Automatic e-mail notifications
- Smart forms

Course and Event Management
- Parallel small groups
- Automation of complex course sequences
- Uploading of documents, term papers, and personal calendars

Exams
- Exam questions
- Written tests and online evaluation
- Seating plans

Records of Achievements
- Easy modifications following revision of the examination
- Automatic generation of course certificates
- Lists of synopses

Evaluations
- Evaluation via app
- Flexible configuration
- Automatic evaluation reports

Community
- Integration of students and faculty
- Forums and online discussions
Application and Admission
The process involving applications as well as admissions has been made really simple with the help of OpenCampus. The software presents applicants with a simple tool that uploads and manages all the necessary information in one single place.
Course and Event Management
The OpenCampus course and event management software is one of the most powerful and flexible of its kind. The module handles everything from simple seminars (with location, appointments and lecturer) to the automation of complex courses. It also supports multilevel architectures with multi-language pages and direct budget control.
Exams
The software is an innovative web-based solution that grants users extensive functionality for creating a multi-level architecture for any exam. All aspects of exam preparation, from an online mock test to the seating arrangement, are managed seamlessly with OpenCampus.
Records of Achievements
OpenCampus performance management presents the complete study achievements of students in a clear view. The data of the other modules, such as OpenCampus Exams and OpenCampus Event Management, is also stored in this location. Easy modification following revision of an examination, automatic generation of course certificates and the listing of synopses are some of the features offered here.
Evaluations
Continuous and seamless evaluation is the key to ensuring the quality of teaching and offerings at a university. Users can evaluate standardised courses and receive qualitative as well as quantitative feedback on different areas of teaching. They benefit from simple creation options for questionnaires and reports, as course management is fully integrated into the system.
Room Management
The OpenCampus software also has dedicated room management support. Users can manage their bookings of event and laboratory rooms and their equipment. As the software is mobile responsive, this is even more efficient and handy.
Why customers choose OpenCampus
Higher education institutes are bound by various responsibilities and data that have to be managed accurately and in complete synchronisation. OpenCampus bags all the trophies here by handling the administration of both students and faculty. There are many reasons why OpenCampus is chosen by universities. Some of them are:
- It presents unique process mapping: OpenCampus is the only software that manages the complex processes of universities.
- It comes with a comprehensive feature set: the OpenCampus software offers extensive functionalities and features to its customers.
- Open adaptive system: OpenCampus is an adaptive system to which additional modules can easily be added at any time on the open source platform.
- Established and experienced: more than 25 universities, each with at least 3,000 students, are using OpenCampus.
Research institutes need to manage multiple studies with individual data sets, processing rules and different types of permissions. But there is no "official" or "standard" technology that presents an easy-to-use environment for constructing databases and user interfaces for clinical trials or research studies. Thus, the software solutions in use ranged from tools explicitly made for a specific study to cost-intensive commercial Clinical Trial Management Systems (CTMS).
With OpenCampus Research, an open adoption software (OAS) solution, users are provided with a standard environment for state-of-the-art research database management at a very low cost.
The architecture of the open adoption software (OAS) allows users, after a brief introduction, to develop their own web-based data management systems.
The implementation provides the following features:
- Basic Architecture
OpenCampus content is basically of three types: forms, trees, and containers. Any type of research project or clinical trial can be mapped with this model, and everything is fully configurable through the graphical user interface.
- Taxonomies
There are many taxonomies that allow the user to classify content with terms gathered within vocabularies. With the help of taxonomies, field contents can be stored not just as text, but also as references linked to predefined values.
- Multicenter Concept
The approach of the OpenCampus software works really well here. A single study administrator assigns permissions to center coordinators. Center coordinators then independently distribute access and permissions to the data managers responsible for entering the data.
The multicenter concept can be extended with additional features such as node state levels or individual data processing guidelines, which ensure that certain quality management actions are executed during data processing.
- Entity References
One core element of the data storage approach in the OpenCampus OAS concept is that it allows nodes to be connected to each other. The link between these nodes is called an entity reference. With the help of entity references, data from many studies can be combined (merged), enabling meta-analyses to be executed just by creating a new output view.
- Data Security
There are two major options in terms of security: customers can fill in forms online, or the information can be submitted on premises, preserving doctor-patient confidentiality.
Thus, with the help of the OpenCampus system, a steady environment was provided to the research center and the people working in it, along with database design and pattern design.

Conclusion
OpenCampus is not just software for small clerical tasks; it goes beyond that, offering a three-way interactive platform for students, teachers, and parents. It not only saves the time of the administrative staff and their pupils, but also allows fees to be paid online and keeps everyone informed about important information around the university.
Opensense Labs believes that the contemporary system of education will spread a new level of superiority in the education sector. Ping us at email@example.com to know more about OpenCampus. The services provided by our organisation would help you solve all your queries.
Lightweight utility module to help users to print Commerce Orders.
Uses browsers' built-in Print to file functionality.
No third-party library needed.
Provides a table formatter for the Field Collection module.
This module provides both a field formatter and a widget for the Field Collection entity type.
Integrates ActivityPub in your Drupal website so you can interact with your site on the Fediverse.
Currently development is happening on Github at https://github.com/swentel/activitypub and is synced back for bug fixes and releases. Create issues on Github.
The story of a historical character acquires a plethora of accretions over the centuries. So, we have numerous incidents and episodes from his or her life but not the complete picture. Representing historical characters on stage could thus lead to a fractured narrative. There has to be synchronisation with the pre-recorded dialogue, and it should not distract the actors from emoting.
Need for Content Synchronisation
The synchronisation is also of great significance in the digital scene. Content publishing in Drupal is of utmost importance with the expanding possibilities of content creation itself. It is even more crucial when it has to be synchronised between Drupal sites. Why is content synchronisation needed in Drupal?
Drupal 8 offers numerous tools for streamlining content creation and moderation. For instance, the Content Moderation module lets you expand on Drupal’s ‘unpublished’ and ‘published’ states for content and enables you to have a published version that is live and have a different working copy that is undergoing assessment before it is published.
In case you need to share content or media across multiple sites, different solutions are available in Drupal that come with content synchronisation capabilities to help you keep development, staging and production in superb sync by automating the safe provisioning of content, code, templates and digital assets between them. Multiple synchronisation tasks can be created and scheduled to occur automatically in the future, targeting different destination servers and sites. Or synchronisation tasks can be performed manually via the user interface.
Following are some of the major modules that are worth considering for content synchronisation necessities:

Deploy - Content Staging
The Deploy module enables users to easily stage content from one Drupal site to another and automatically governs dependencies between entities like node references. Its rich, extensible API helps in different content staging situations. It is great for performing cross-site content staging, and using Deploy with RELAXed Web Services helps in staging content between different Drupal websites. It also works with the Workspace module to offer a workspace preview system for single-site content staging, and the API offered by RELAXed Web Services is spectacular for building fully decoupled sites. With Multiversion, all content entities can be revisioned. Workbench Moderation ensures that when you moderate a workspace, content is replicated automatically when approved.
Entity Sync
The Entity Sync module enables you to share entities like nodes, field collections, taxonomies, media etc. between Drupal instances. It lets you share entities with the help of JSON API and offers a user interface for leveraging the endpoints provided by the JSON API module.

CMS Content Sync
The CMS Content Sync module offers content synchronisation functionality between Drupal sites with the help of a NodeJS-based Sync Core. It makes possible the synchronisation of enormous amounts of data, consisting of content and media assets, that can't be managed by Drupal itself. It is wonderful for content staging, as it enables you to test code updates with your content and publish code and content concurrently. It manages the publishing so that you can focus completely on the creation of content. It also offers content syndication functionality, and your entire content can be updated and deleted centrally. Moreover, it allows you to connect any of your sites to a Content Pool, to which you can push your content and media items and from which remote sites can be allowed to import that content easily.
Content Synchronisation
Exporting a single content item or a number of content items from one environment to another efficaciously is possible with the help of the Content Synchronisation module. That means you can export and import full site content, or a single content item. The difference between the site content and the content in the YAML files can be viewed, and entities can be imported with a parent/child relationship.

Acquia Content Hub
The distribution and discovery of content from any source, in order to power fantastic multi-channel digital experiences, can be done using the Acquia Content Hub module. It enables you to connect Drupal sites to the Acquia Content Hub service. Acquia Content Hub, a cloud-based, centralised content dissemination and syndication solution, lets you share and enrich content throughout a network of content sources using extensible, open APIs.
These are some of the most significant solutions available in the enormous list of Drupal modules that are specifically built for enhancing synchronisation of content between Drupal sites.
We have been constantly working towards the provisioning of marvellous digital experiences with our expertise in Drupal development.
Let us know at firstname.lastname@example.org how you want us to help you build innovative solutions using Drupal.
A scientist can be awarded a Nobel prize for an amazing scientific breakthrough made in his or her research. More often than not, the leading scientist is backstopped in the research by a team of hugely talented assistants.
Traversing visual regression testing and BackstopJS
Talking about backstopping, you need something as a backstop in the digital landscape too. When you make an alteration to your website, you have to be sure it is devoid of unintended side effects. This is where BackstopJS comes into the picture. As an intuitive tool, it enables swift configuration and helps you get up and rolling quickly. Before we look at how it can be leveraged with Drupal, let's dive deeper into visual regression testing and BackstopJS.
Visual regression testing focuses on identifying visual alterations between iterations or versions of a site. Reference images are created for every component as it is built, enabling a comparison over time to monitor alterations. Developers can run this test on their local development environment after alterations are performed, to make sure that no regression issues transpire because of the changes.
Visual regression testing is hugely beneficial in enabling developers to get more test coverage than they could achieve manually, thereby ensuring that alterations do not cause regression impact on other components. It provides a more detailed comparison than you would get while reviewing the site manually. Even without a full understanding of the project, developers know what the website should look like before the alterations.
BackstopJS, an open source project, is a great tool for performing visual regression testing. It is leveraged for running visual tests with the help of headless browsers to capture screenshots. It was created by Garris Shipon and originally ran with the help of the PhantomJS or SlimerJS headless browser libraries. It now supports screen rendering with headless Chrome, and you can add your own interactions using Puppeteer and ChromyJS scripting.
It offers an excellent comparison tool for identifying and highlighting differences and lets you set up several breakpoints for testing responsive sites. Moreover, it utilises simple CSS selectors to identify what to capture. It provides an in-browser reporting user interface with a customisable layout, scenario display filtering, etc. Furthermore, it comes with integrated Docker rendering and works superbly with continuous integration and source control. Also, it gives you CLI and JUnit reports.

Workflow and installation
It’s pretty easy to install and configure BackstopJS which involves:
Installation (global): npm install -g backstopjs
Installation (local): npm install --save-dev backstopjs
Configuration: backstop init (creates backstop.json template)
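For illustration, a trimmed-down backstop.json might look like this (the URLs, labels and selectors are placeholder values; `backstop init` generates the full template for you):

```json
{
  "id": "my_drupal_site",
  "viewports": [
    { "label": "phone", "width": 375, "height": 667 },
    { "label": "desktop", "width": 1440, "height": 900 }
  ],
  "scenarios": [
    {
      "label": "Homepage",
      "url": "http://localhost/",
      "referenceUrl": "http://staging.example.com/",
      "selectors": ["header", ".main-content", "footer"],
      "misMatchThreshold": 0.1
    }
  ],
  "paths": {
    "bitmaps_reference": "backstop_data/bitmaps_reference",
    "bitmaps_test": "backstop_data/bitmaps_test",
    "html_report": "backstop_data/html_report"
  },
  "engine": "puppeteer",
  "report": ["browser"]
}
```

Each scenario captures the listed selectors at every viewport size, so this sketch produces six comparisons per test run.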
Following is the workflow of BackstopJS:
- backstop init: A new BackstopJS instance is set up by specifying URLs, cookies, screen sizes, DOM selectors, interactions among others
- backstop test: By creating a set of test screengrabs and comparing them with your reference screengrabs, you can check the alterations that show up in a visual report.
- backstop approve: If all looks fine after the test, you can approve it
A Drupal Community event in 2018 talked about a Drupal 8 module called Backstop Generator for creating backstop.json configuration files on the basis of site’s unique content.
This Drupal 8 module exposes an administrative configuration form to create a BackstopJS visual testing profile based on a Drupal website's content. It assists you in creating backstop scenarios from Drupal pages and defining random pages to include as scenarios, and you can toggle viewport sizes on and off. It produces a backstop.json file that you place into your backstop directory, replacing the existing backstop.json file. It requires the Serialisation, HAL and REST modules. It is worth noting that this module is not covered by Drupal's security advisory policy.
Backstop Generator can do a lot more, and you can contribute towards building more interesting features: join the issue queue on Drupal.org to submit a patch or report a bug.

Conclusion
Deploying frontend alterations can be troublesome. Visual regression testing software like BackstopJS ensures that our changes are accurate and contained and can be great for Drupal sites.
We have been offering a suite of services to help digital firms fulfil their dreams of digital transformation.
Contact us at email@example.com and let us know how you want us to be part of your digital transformation plans.
Views contextual filter for Date fields.
Supports only Unix-timestamp date fields.
Available contextual filters:
- Date in the form of YYYY.
- Date in the form of MM (01 - 12).
- Date in the form of DD (01 - 31).
- Date in the form of WW (01 - 53).
- Date in the form of YYYYMM.
- Date in the form of CCYYMMDD.
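As an illustration of how such contextual filters are typically consumed, a view with a year and a month argument could be reached at URLs like the following (the `/archive` path and the values are hypothetical examples, not part of the module):

```
/archive/2018        YYYY argument matches all of 2018
/archive/2018/05     YYYY then MM arguments narrow to May 2018
/archive/20180528    a single CCYYMMDD argument matches one day
```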
R programming can be an astronomical solution for foretelling re-booking rates by leveraging previous guest ratings and thereby automating guest/host matching. That's exactly what analysts at Airbnb, an online marketplace and hospitality service provider, have done with it. The company has leveraged R to drive numerous company initiatives with the help of an internal R package called Rbnb.
A sip of history
As a growing organisation, Airbnb’s inclination towards R for enhancing its business operations speaks volumes of this programming language. It offers powerful analytics, statistics and visualisations. In combination with Drupal, R programming language can be a spectacular option for an innovative solution. Let’s take a look at the origins of R and its significance before heading over to the integration of Drupal and R programming.
Microsoft delineates that strolling along the history of R takes us back to the '90s, when it was implemented by Robert Gentleman and Ross Ihaka (faculty members at the University of Auckland). It became an open source project in 1995, and the R Core Group has handled the project since 1997. The first version was released in the year 2000. It was closely modelled on the S language for statistical computing.

R programming: An exploration
R is a language and environment for statistical computing and graphics. - r-project.org
R, as an open source programming language and software environment, is magnificent for statistical computing and graphics. Prominently, it is used by data scientists and statisticians alike. It has the support of the R Foundation for Statistical Computing and a huge community of open source developers. Those who are accustomed to using GUI-focussed programs like the Statistical Package for the Social Sciences (SPSS) and the Statistical Analysis System (SAS) can find the nascent stages of using R difficult, as it leverages a command line interface. In this case, RStudio can be meritorious.
R offers a wide array of statistical techniques like linear and non-linear modelling, classical statistical tests, time-series analysis, classification, clustering among others. In addition to this, it also provides graphical techniques and is very extensible.
R has had remarkable growth, which can be seen in the following figure.

Source: Stack Overflow
Even in the Redmonk programming language rankings, which compare languages' appearances on GitHub (usage) and Stack Overflow (support), R maintains its position near the top.
Source: Stack Overflow
Academics, healthcare and government segments maintain preeminence when it comes to the industries that visit R questions the most in the U.S. and U.K.
The Burtch Works survey shows that R is a second choice for data scientists, and its flexibility makes it great for predictive analytics. Research practitioners who want to do both sorts of analysis, and concurrently apply machine learning to their marketing research processes in future, will find R a great option. (Source: Nebu)

Why choose R?
Following are some of the pivotal reasons that state the importance of R:

Documentation
Online resources for R, like message boards, are superbly supported and well documented.

Analysis
R helps in performing data acquisition, cleaning and analysis all in one place.

Data Visualisation
R has fantastic tools for data visualisation, analysis and representation.

Figure: A ggplot2 result | Source: IBM

This is a good example of data visualisation provided by the R package ggplot2. It displays the intricate relationship between yards per game and touchdowns as well as first downs.

Easy to learn
You can quickly get up to speed, as R has an easy learning curve. For instance, with the simplicity of the language, you can easily create three random samples and make a bar graph of that data.

Figure: Three random samples | Source: IBM

Machine learning
R makes machine learning a lot more easy and approachable, with a superabundance of R packages for machine learning like MICE (for taking care of missing values) and CARET (for classification and regression training), among others.

Drupal in the picture
A digital agency integrated Drupal with R for an insurance company that envisions offering a highly personalised matching service to help people select the right insurance program. The company leveraged R to build an algorithm that calculates a compatibility score. The main notion was to be able to efficaciously match prospective customers to insurance carriers based on the customer's needs and preferences.
There are intricacies involved in the calculation of the compatibility score, which is a function of elements like user persona and price sensitivity, among others. Numerous unique customer personas were identified in the process, built from demographic factors (gender, age, education etc.) and car ownership details (car type, mileage etc.). Once a prospect is matched to a particular persona, it is then mapped to each of the insurance providers on the basis of a score and customer satisfaction survey ratings.
Moreover, different scores are calculated for tracking the sensitivity of the customer to price, customer service levels, etc. This is done through a web service that links to the insurance carriers' information and offers a quote for the customer on the basis of the numerous information points provided. Finally, consolidation of all these scores into two different parameters gives a final score that helps the prospect select the right insurance carrier.
An insurance portal was built in Drupal containing the customer persona questionnaire that prospects fill in to be able to search for the best carrier for their spending styles and other preferences. Once the information is entered by the prospect, it is passed from Drupal to R in JSON format. A plethora of customer data already exists on the R side and is also leveraged by the algorithm. Quotes pulled from multiple carriers are processed via the web service. On the basis of the algorithm, R processes all the data and sends back the best carrier match options for the prospect, who can then select a carrier based on his preferences.
R, along with Drupal, is a marvellous option for the data experts to do everything ranging from the mapping of social and marketing trends online to building financial and climate models for giving that extra push to our economies and communities.
We have been offering a suite of services for fulfilling the digital transformation endeavours of our partners.
Let us know at firstname.lastname@example.org how you want us to be a part of your digital innovation plans.
iRobot, a leading global consumer robot company, had a spectacular debut on Amazon Prime Day as it sold thousands of its Roomba robotic vacuums. Eventually, with the greater focus on its central value proposition vis-à-vis offering leading-edge robots to relieve customers from humdrum chores and the increasing number of connected customers, iRobot decided to move its mission-critical platform to the Amazon Web Services (AWS) Cloud. By leveraging the powerful tools and integrations of AWS, they were able to use a serverless architecture that offered an efficacious combination of scalability, global availability and breadth of services.
The emergence of serverless computing
Why is a big organisation like iRobot opting for serverless computing? Using a serverless architecture powered by AWS IoT and AWS Lambda kept the cost of the cloud platform low, negated the need for subscription services and let a smaller team handle the solution. The need to maintain physical infrastructure and systems software also vanishes. Drupal, a leading player in the open source content management framework market, has long been an option for building innovative solutions, and it can be a great fit for implementing a serverless architecture. Before diving right in, let's traverse the road that delineates how the concept of serverless came into existence.
To look at how serverless came into existence, freeCodeCamp has compiled an interesting story that takes us back to the 1950s. This was the period when the computing paradigm called mainframes burst onto the scene. Eventually, the mid-2000s witnessed the advent of a new paradigm called cloud computing.
Source: ResearchGate
It was in the 2010s that the concept of serverless architecture popped up. The report Serverless is more: From PaaS to present cloud computing has an interesting compilation that distinguishes six main dimensions, as seen in the figure above, of critical breakthroughs that paved the way for the emergence of serverless.

Serverless: In-depth
Source: freeCodeCamp
What is Serverless Computing?
Serverless computing enables you to write and deploy code without having to manage the underlying infrastructure. Developers rely on cloud-based servers, infrastructure and operating systems. Although it is referred to as serverless, servers are still involved; being a fully managed service, the setup, capacity planning and server management are handled by the cloud provider.

Serverless architectures are application designs that incorporate third-party "Backend as a Service" (BaaS) services, and/or that include custom code run in managed, ephemeral containers on a "Functions as a Service" (FaaS) platform. By using these ideas, and related ones like single-page applications, such architectures remove much of the need for a traditional always-on server component.
- Martin Fowler
Martin Fowler notes that there is no single clear definition of serverless. On one side, the term was first used to describe applications that incorporate third-party, cloud-hosted applications and services to handle server-side logic and state, called (Mobile) Backend as a Service (BaaS). On the other side, it can also mean applications in which the server-side logic is written by the developer and run in stateless compute containers that are fully managed by a third party, called Functions as a Service (FaaS).

Benefits of Serverless
Following are some of the merits of serverless computing, as stated by AWS:
- Governance: There are no servers to manage, and no software or runtime to install, maintain or administer.
- Scalability: Your application scales automatically. Capacity can also be adjusted flexibly by toggling the units of consumption (such as throughput or memory) rather than units of individual servers.
- Cost: You pay for consistent throughput or execution duration rather than by server unit.
- Availability: It offers built-in availability and fault tolerance.
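To illustrate the FaaS model discussed above, here is a minimal Lambda-style function in Python. The event fields and the response shape follow the common API-gateway pattern, but the specific names used here are illustrative assumptions rather than any particular production API:

```python
import json

def handler(event, context):
    """A minimal FaaS-style function: parse the incoming event, do a tiny
    piece of work, and return a response. The cloud provider takes care of
    provisioning, scaling and server management; this code only holds the
    business logic."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Invoking the function locally with a sample event.
print(handler({"name": "Drupal"}, None))
```

Note that the function is stateless: everything it needs arrives in the `event`, which is what lets the platform run any number of copies in parallel.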
A combination of Amazon CloudFront, a web service that speeds up the distribution of static and dynamic web content; Lambda@Edge, which extends serverless compute to the CloudFront network; and Drupal as a powerful headless CMS can be fantastic for building a serverless architecture. The seamless integration between CloudFront, Lambda@Edge and headless Drupal delivers the lowest latency and a personalised experience to users.
AWS has delineated how to accelerate Drupal content with Amazon CloudFront. It showed how to deploy CloudFront to cache and accelerate Drupal content with the help of a globally distributed set of CloudFront nodes. Every CloudFront distribution consists of one or more origin locations; an origin is where the Drupal content resides. Drupal 8 was deployed by running the supplied AWS CloudFormation stacks, with Amazon Elastic Compute Cloud (EC2), Amazon Elastic File System (EFS), Amazon Relational Database Service (RDS) and Amazon Aurora coming in handy as well. It was all wrapped up in a highly available design spanning several Availability Zones and configured to auto scale using Amazon EC2 Auto Scaling groups.
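As a sketch of how personalisation can happen at the edge in this setup, here is a minimal Lambda@Edge viewer-request handler (Python runtime) sitting in front of a headless Drupal origin. The cookie name and the rewritten path are hypothetical, chosen only to show the mechanism:

```python
def handler(event, context):
    """Minimal Lambda@Edge viewer-request sketch: inspect the incoming
    CloudFront request and rewrite its URI before it reaches the cache or
    the Drupal origin."""
    request = event["Records"][0]["cf"]["request"]
    headers = request.get("headers", {})
    # Route returning visitors (identified by a hypothetical cookie) to a
    # personalised path served by the Drupal origin.
    cookies = headers.get("cookie", [])
    if any("returning=1" in c.get("value", "") for c in cookies):
        request["uri"] = "/personalised" + request["uri"]
    return request
```

Because the function runs at the CloudFront node nearest the visitor, the personalisation decision adds no round trip to the origin, which is what keeps latency low.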
A report by Markets and Markets estimates the market size of serverless architecture at USD 4.25 billion in 2018, and expects it to grow at a Compound Annual Growth Rate (CAGR) of 28.6% to reach USD 14.93 billion by 2023. The elimination of the need to administer servers (minimising infrastructure costs and streamlining deployment, governance and execution), the shift from DevOps to serverless computing, and the burgeoning state of microservices architecture have all been contributing to its growth.
Source: Grand View Research
A report by Grand View Research states that the automation and integration services segment plays a crucial role in establishing a serverless architecture. While the monitoring services segment is expected to see the highest CAGR of 28.8%, the small and medium enterprises (SME) segment will grow at a CAGR of 28.6% over the forecast period of 2015 to 2025. The banking, financial services and insurance (BFSI) segment is anticipated to retain its dominance in this market. Moreover, North America, with the strong presence of the U.S., is growing at a great pace, and Asia-Pacific is also expected to witness a CAGR of 26% over the forecast period.
Research and Markets, in a report, states that serverless computing does come with certain challenges during its deployment. As Cloud Service Providers (CSPs) control the underlying infrastructure, users are unable to customise or optimise it. Further, the organisation has no authority over the infrastructure, which raises the risk involved in hosting several customers on the same platform. Also, consumers have no control over penetration tests and vulnerability scanning of the infrastructure, which raises compliance concerns for adopters and acts as a restraint on market growth.
The Chief Technology Officer (CTO) of Amazon, Werner Vogels, said in his 2016 keynote, “Before, your servers were like pets. If they became ill you had to nurture them back to health. Then, with cloud, they were cattle, you put them out to pasture and got yourself a new one. In serverless, there is no cattle, only your application. You don't even have to think about nurturing back to health or getting new ones, all the execution is taken care of.”
Serverless computing can be a great solution in combination with Drupal. A serverless platform enables you to distribute your web application globally, running in dozens of data centres across the globe with customers served from the one nearest to them.
Be mindful of the fact that serverless is not the right approach for every problem; consider your business requirements before taking the plunge.
We have been on a constant pursuit of offering a magnificent digital presence to our partners using our expertise in Drupal development.
To decide if serverless is the right fit for your business, let us know at email@example.com so that we can help you scale your digital business.
Security public service announcements: SA-CORE-2019-003 Notice of increased risk and Additional exploit path - PSA-2019-02-22
This Public Service Announcement is a follow-up to SA-CORE-2019-003. This is not an announcement of a new vulnerability. If you have not updated your site as described in SA-CORE-2019-003 you should do that now.
There are public exploits now available for this SA.
As far as we know, this is not being mass exploited at this time.
In the original SA we indicated this could be mitigated by blocking POST, PATCH and PUT requests to web services resources; however, there is now a new way to exploit this using GET requests.
The best mitigation is:
- If you are using Drupal 8.6.x, upgrade to Drupal 8.6.10.
- If you are using Drupal 8.5.x or earlier, upgrade to Drupal 8.5.11.
- Be sure to install any available security updates for contributed projects after updating Drupal core.
This only applies to your site if:
- The site has the Drupal 8 core RESTful Web Services (rest) module enabled.
- The site has another web services module enabled, like JSON:API in Drupal 8, or Services or RESTful Web Services in Drupal 7, or custom code that allows entity updates via non-form sources.
This module provides some magic blocks.
- A block that removes the X-Frame-Options header so that single pages can be embedded without changing the protection of the other pages.
- A block that gives all links in a page the target=_blank behavior.
When implementing Drupal or any CMS for that matter, you can jump-start development and save time with pre-built themes. However, these themes can also hamstring your efforts to deliver content in the way you envisioned.
An advantage of pre-built themes is that they are usually already tested and will work as designed right out of the gate. On the other hand, building a custom theme ties your web presence tightly to your brand identity and feels more like a singular experience. In this post we will try to determine whether you should build a theme to match a custom design, or use a pre-built theme and fit your needs into the pre-packaged solution.