UI to control the form elements provided by the bootstrap_forms theme
In this tutorial, I'm going to explain how you can use the new Group module to organize your site's users. Group is an extremely powerful Drupal 8 module.
At the basic level, Group allows you to add extra permissions to content.
At the more advanced level, this module is potentially a Drupal 8 replacement for Organic Groups.
TGIF! We hope the work week has treated you well.
The Social Media Share module allows the user to share the current page to different social media platforms. It is rendered as a block, so you can place it anywhere on your site. It is flexible enough to share any page of the site, whether it is a node, term, panel, view page, and so on.
This post is an excerpt from the topics covered by our DrupalCon Dublin training: Drupal 8 Development - Workflows and Tools.
During the recent Nuvole presentations at Drupal Dev Days Milan 2016 and Drupalaton Hungary 2016 we received a number of questions on how to properly setup a Drupal 8 project with Composer. An interesting case where we discovered that existing practices are completely different from each other is: "What is the best way to deploy a Composer-based Drupal 8 project?".
We'll quickly discuss some options and describe what works best for us.

What to commit
You should commit:
- The composer.json file: this is obvious when using Composer.
- The composer.lock file: this is important since it will allow you to rebuild the entire codebase at the same status it was at a given point in the past.
The fully built site is commonly left out of the repository. But this also means that you need to find a way for rebuilding and deploying the codebase safely.

Don't run Composer on the production server
You would clearly never run composer update on the production server, as you want to be sure that you will be deploying the same code you have been developing upon. For a while, we considered it to be enough to have Composer installed on the server and run composer install to get predictable results from the (committed) composer.lock file.
Then we discovered that this approach has a few shortcomings:
- The process is not robust. A transient network error or timeout might result in a failed build, thus introducing uncertainty factors into the deploy scripts. Easy to handle, but still not desirable as part of a delicate step such as deployment.
- The process will inevitably take long. If you run composer install in the webroot directly, your codebase will be unstable for a few minutes. This is orders of magnitude longer than a standard update process (i.e., running drush updb and drush cim) and it may affect your site availability. This can be circumvented by building in a separate directory and then symlinking or moving directories.
- Even composer install can be unpredictable, especially on servers with restrictions or running different versions of Composer or PHP; in rare circumstances, a build may succeed but yield a different codebase. This can be mitigated by enforcing (e.g., through Docker or virtualization) a dev/staging environment that matches the production environment, but you are still losing control over a relatively lengthy process.
- You have no way of properly testing the newly built codebase after building it and before making it live.
- Composer simply does not belong on a production server. It is a tool with a different scope, unrelated to the main tasks of a production server.
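The "build in a separate directory, then symlink" workaround mentioned above can be sketched with plain shell. All paths and release names here are hypothetical, purely for illustration:

```shell
# Sketch of the build-elsewhere-then-symlink idea; all paths are hypothetical.
set -e
ROOT=$(mktemp -d)                       # stand-in for /var/www in this sketch
mkdir -p "$ROOT/releases/build-1" "$ROOT/releases/build-2"
echo "old" > "$ROOT/releases/build-1/index.html"
echo "new" > "$ROOT/releases/build-2/index.html"

# The live webroot is just a symlink to the current release.
ln -s "$ROOT/releases/build-1" "$ROOT/current"

# ... the new build completes (e.g. composer install into build-2),
# gets verified, and then the switch is a single symlink update:
ln -sfn "$ROOT/releases/build-2" "$ROOT/current"

cat "$ROOT/current/index.html"          # prints: new
```

The webroot is never in a half-built state: it points at the old release until the instant the symlink is repointed.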
After ruling out the production server, where should the codebase be built then?
Building it locally (i.e., using a developer's environment) can't work: besides the differences between the development and the production (--no-dev) setup, there is the risk of overlooking small patches applied only to the local codebase. And a totally clean build is always necessary anyway.
We ended up using Continuous Integration for this task. Besides the standard CI job, which runs after any push to the branches under active development, performs a clean installation, and runs automated tests, another CI job builds the full codebase from the master branch and the composer.lock file. This allows sharing the build between developers, fast deployment to production through a tarball or rsync, and opportunities to actually test the upgrade for maximum safety (with a process like: automatically import the production database, run database updates, import the new configuration, and run a subset of automated tests to ensure that basic site functionality has no regressions).
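As a rough illustration of such a build job (the post does not name a specific CI system, so GitLab CI syntax is used here purely as an example; job and artifact names are hypothetical):

```yaml
# Hypothetical CI job: build the codebase from the committed composer.lock
# on the master branch and keep the result as a deployable artifact.
build:codebase:
  only:
    - master
  script:
    - composer install --no-dev --prefer-dist
    - tar -czf site-build.tar.gz --exclude=site-build.tar.gz .
  artifacts:
    paths:
      - site-build.tar.gz
```

The resulting tarball is what gets shared between developers and shipped to production, so the production server itself never runs Composer.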
Slides from our recent presentations, mostly focused on Configuration Management but covering part of this discussion too, are below.

Attachment: Slides: Configuration Management in Drupal 8
Continuing on his thread exploring the rules of games, game dev Stieg Hedlund compares and contrasts similar issues in other media and shares lessons learned. ...
Styling the HTML <select> tag to appear similar in all the different browsers is a task unto itself. It seems that on each new site, I find myself revisiting this post by Ivor Reić for a CSS-only solution. My task for today is to use this idea to theme an exposed filter on a view.
The first thing we need to do is add a div around the select. We can do this by editing the select's twig template from Drupal 8 core's stable theme. Copy the file from
Then add the extra <div class="select-style"> and closing </div> like so.
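A minimal sketch of the override (the inner markup is abbreviated here; keep the body of core's original template unchanged):

```twig
{# select.html.twig override: only the wrapper div is new #}
<div class="select-style">
  <select{{ attributes }}>
    {# ... option rendering from the original core template, unchanged ... #}
  </select>
</div>
```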
Here is the LESS file that I compile, which includes Ivor's CSS plus some adjustments I added to even out the exposed filter. Each rule is commented, explaining what it does.
I will compile this into my final CSS and we are good to go. The display of the form and the select list should be pretty close to what I want across all modern browsers. Adjust as needed for your styles and design.
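For reference, the core of the CSS-only technique, as I understand it, looks roughly like this in plain CSS (the arrow image path is a placeholder; class names match the wrapper div added to the twig template):

```css
/* Sketch of the CSS-only technique: push the native dropdown arrow out of
   view past an overflow-hidden wrapper, then draw a custom arrow on the
   wrapper itself. */
.select-style {
  position: relative;
  overflow: hidden;               /* clips the native arrow pushed outside */
  border: 1px solid #ccc;
  border-radius: 3px;
  background: #fff url("arrow.png") no-repeat 95% 50%; /* placeholder image */
}
.select-style select {
  width: 110%;                    /* push the native arrow out of view */
  border: none;
  background: transparent;        /* let the wrapper's custom arrow show */
  -webkit-appearance: none;       /* also suppress the arrow where supported */
  appearance: none;
}
```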
So, this post is a little later than I would like. GenCon ended on August 7th, but in between then and now, I had to drive back from Indianapolis and then go through the circus of closing on a house and moving into said house. I still don’t have all my stuff in the new house, but close enough. It’s been a crazy two weeks.
Anyway, GenCon was awesome as always, but man was it hot. I’ve been attending GenCon since 2006 and I think this year gave us the most consistently hot weather of all those times. Every day was 90+ degrees with 90% humidity. While all of the events are inside, you still have to get from one location to another. Saturday reaching only 85 felt wonderful.
The heat certainly didn’t hurt attendance, though. While the numbers didn’t top last year’s record, there were still 60k people in attendance. Sometimes, looking around the convention center and through the dealer’s hall, it absolutely felt like there were that many people around. I was really glad to get a downtown hotel this year so there was no messing with commuting or parking.
Because of everything else going on at GenCon, I didn’t get to play or run as many games as I did at Origins, but I still got some great games in. I ran three sessions of Lone Wolf for Cubicle 7 and had three great groups of players. The first group actually had a couple from my home town that had played in one of my games at a small local convention. Also, for the very first time, I had a player ‘recognize’ me as a Gnome. That was a pretty cool moment among some awesome games.
For the games I played, I was able to get over to Indie Games on Demand twice. I highly recommend hitting them up for games. They’ve got a great system down and it’s a great way to play some games you may not otherwise get to experience. This year I got to play Masks and Lady Blackbird. I’m eagerly awaiting my copy of Masks to show up in my mailbox.
Soooo much cosplay. While it doesn’t quite hit San Diego ComicCon levels, the folks who put effort into their cosplay for GenCon do a great job. This is just a fraction of the pictures I took and that’s a fraction of the number of great costumes I saw. There were a bunch of lady Ghostbusters, but I never got a chance to get a pic of one.
The two costumes I was happiest to see were Tali and the Warden from Mass Effect and Dragon Age respectively. I’m a huge Bioware fangirl and seeing those games still be popular enough to spawn cosplay warms my nerdy heart.
I can’t talk about GenCon and not mention Pokemon Go. The popularity of the game was obvious just looking at the dealer’s hall and seeing the giant Pikachu floating over a booth dedicated to Pokemon swag. Or, if you popped open the game itself, you’d see every Pokestop tagged with a lure throughout downtown pretty much nonstop from Wednesday to Sunday.
The gyms throughout the city kept changing hands at an alarming rate. I do have the screenshot to prove that I very (VERY) briefly held a gym all on my own. The Haunter wasn’t my highest level character, but was the highest one I had left in full health after barely scraping by taking over the gym. I think my claim lasted all of 30 seconds.
My highlight of the convention, though, was getting to spend time with my fellow Gnomes. On the night of the Ennie Awards, we met beforehand for a lovely dinner at St. Elmo’s, where we were served by the amazing and magical Billy the Waiter. This year was all the more special with the addition of the new Gnomes getting to join in on the festivities.
All in all, it was a great convention and I’m looking forward to 2017 and the 50th anniversary of GenCon. Even with the heat and the crazy crowds, you can count on me making it there again.
Did you get a chance to hit GenCon this year? What were your highlights of the convention?
This module provides an Excel encoder for the Drupal 8 Serialization API. This
enables the XLS format to be used for data output (and potentially input,
eventually). For example:
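The post's example is not included here, but usage presumably follows the standard Drupal 8 Serialization API pattern, something like the following hedged sketch (the `serializer` service is core Drupal 8; the `'xls'` format name is an assumption based on the module's description, and the data is made up):

```php
<?php
// Hypothetical usage sketch: serialize an array of rows through the
// serializer service, requesting the Excel format this module registers.
// The 'xls' format name and the sample data are assumptions.
$rows = [
  ['name' => 'Ada', 'score' => 95],
  ['name' => 'Grace', 'score' => 92],
];
$serializer = \Drupal::service('serializer');
$output = $serializer->serialize($rows, 'xls');
```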
On the VDMA website (Association of German Machinery and Plant Engineering) various professional associations are specifically listed with their individual information. To provide each page with information from the Tango Backend, a specific interface has been developed: The so-called Tango REST interface. In the seventh part of our series “The Central Data Hub of VDMA” we will introduce this interface, its technical realization and its functions.
Consider the marketing efforts of one worldwide corporation. Until recently, each brand and global region built and hosted its own websites independently, often without a unified coding and branding standard. The result was a disparate collection of high-maintenance, costly brand websites.

A Thousand Sites: One Goal
The organization has created nearly a thousand sites in total, but those sites were not developed at the same time or with the same goals. That’s a pain point. To solve this problem, the company decided to standardize all of its websites onto a single reference architecture, built on Drupal.
The objectives of the new proprietary platform include universal standards, a single platform that can accommodate regional feature sets, automated testing, and a feature set sufficient for 95% of use cases across the company’s websites globally.
While building a custom platform is a great step forward, it must then be implemented, and staff needs to be brought up to speed. To train staff on technical skills and platforms, often the best solution is to outsource the training to experts who step in, take over training and propel the effort forward quickly.
As part of an embedded team, an outsourced trainer is an adjunct team member, attending all of the scrum meetings, with a hand in the future development of the training materials.

Train Diverse Audiences
A company may invest a lot of money into developing custom features, and trainers become a voice for the company, showing people how easy it is to implement, how much it is going to help, and how to achieve complex tasks such as activation processes. The goal is to get people to adopt the features and platform. Classroom-style training allows for exercises on live sites and familiarity with specific features.

The Training Workflow
Trainers work closely with the business or feature owner to build a curriculum. It’s important to determine the business needs that inspired the change or addition.
Starting with an initial outline, trainers and owners work together. Following feedback, more information gets added to flesh it out. This first phase can take four to five sessions to get the training exactly right for the business owner. For features that follow, the process becomes streamlined. It's more intuitive because the trainer has gotten through all the steps and heard the pain points, but it’s important to always consult the product owner. Once there is a plan, the trainers rehearse the curriculum to see what works, what doesn’t work, what’s too long, and where they need to cut things.

Training Now & Future
Training sessions may be onsite or remote. It is up to the business to decide if attendance is mandatory. Some staffers may wish to attend just to keep up with where the business is going.
Sessions are usually two hours with a lot of time for Q&A. With trainings that are hands-on, it’s important to factor in time for technical difficulties and different levels of digital competence.
Remote trainings resemble webinars. Trainers also create videos to enable on demand trainings. They may be as simple as screencasts with a voiceover, but others have a little more work involved. Some include animations to demo tasks in a friendlier way before introducing a more static backend form. It is the job of the trainer to tease out what’s relevant to a wide net of audiences.
The training becomes its own product that can live on. The recorded sessions are valuable to onboard and train up future employees. Trainers add more value to existing products and satisfy management goals.