Don't Miss: The challenges and rewards of designing local multiplayer games

Social/Online Games - Gamasutra - 7 August 2018 - 11:56am

We talk to the developers of Super Pole Riders, Overcooked and Spaceteam about the appeal of developing a local multiplayer game, and the benefits over taking your game online. ...

Categories: Game Theory & Design

How micro is your microservice?

Planet Drupal - 7 August 2018 - 9:02am

"Microservices" have been all the rage for the past several years. They're the new way to make applications scalable, robust, and break down the old silos that kept different layers of an application at odds with each other.

But let's not pretend they don't have costs of their own. They do. And, in fact, they are frequently, perhaps most of the time, not the right choice. There are, however, other options besides one monolith to rule them all and microservice-all-the-things.

What is a microservice?

As usual, let's start with the canonical source of human knowledge, Wikipedia:

"There is no industry consensus yet regarding the properties of microservices, and an official definition is missing as well."

Well that was helpful.

Still, there are common attributes that tend to typify a microservice design:

  • Single-purpose components
  • Linked together over a non-shared medium (usually a network with HTTP or similar, but technically inter-process communication would qualify)
  • Maintained by separate teams
  • And released (or replaced) on their own, independent schedule

The separate teams part is often overlooked, but shouldn't be. The advantages of the microservice approach make it clear why:

  • Allows the use of different languages and tools for different services (PHP/MongoDB for one and Node/MySQL for another, for instance)
  • Allows small, interdisciplinary teams to manage targeted components (that is, the team has one coder, one UI person, and one DB monkey rather than having a team of coders, a team of UI people, and a team of DB monkeys)
  • Allows different components to evolve and scale independently
  • Encourages strong separation of concerns

Most of those benefits tie closely to Conway's Law:

Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure.

A microservice approach works best when you have discrete teams that can view each other as customers or vendors, despite being within the same organization. And if you're in an organization where that's the case then microservices are definitely an approach to consider.

However, as with any architecture, there are tradeoffs. Microservices have costs:

  • Adding network services to your system introduces the network as a point of failure.
  • Points of failure should always be plural: a network, even a virtual and containerized one, has many, many points of failure.
  • The network will always be at least 10x slower than calling a function, even on a virtual network. If you're using a shared-nothing framework like PHP, you also have to factor in the process startup cost of every microservice.
  • If you need to move some logic from one microservice to another, it's 10x harder than moving it from one library to another within an application.
  • You need to staff multiple interdisciplinary teams.
  • Teams need to coordinate carefully to avoid breaking any informal APIs.
  • APIs between services must be coarse-grained, since chatty, fine-grained calls over the network are prohibitively slow.
  • Needing new information from another team involves a much longer turnaround time than just accessing a database.
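That latency cost is easy to demonstrate even without a real network: an HTTP round trip over loopback dwarfs an in-process function call. A quick sketch (Python here purely for illustration; the absolute numbers vary by machine, but the ratio is reliably large):

```python
import http.server
import threading
import time
import urllib.request

def work() -> int:
    # The trivial "business logic" we'll call both directly and over HTTP.
    return 2 + 2

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = str(work()).encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the benchmark output clean

# Serve on an ephemeral loopback port in a background thread.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

N = 200

t0 = time.perf_counter()
for _ in range(N):
    work()
local = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(N):
    urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
remote = time.perf_counter() - t0

server.shutdown()
print(f"local calls: {local:.6f}s, loopback HTTP: {remote:.6f}s")
```

Even with no real wire involved, the HTTP path pays for serialization, socket setup, and request parsing on every call, which is exactly why chatty service boundaries hurt.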

Or, more simply: Microservices add complexity. A lot of complexity. That means a lot more places where things can go wrong. A common refrain from microservice skeptics (with whom I agree) is

"if one of your microservices going down means the others don't work, you don't have a microservice; you have a distributed monolith."

To be sure, that doesn't mean you shouldn't use microservices. Sometimes that is the right approach to a problem. However, the scale at which that's the case is considerably higher than most people realize.

What's the alternative?

Fortunately, there are other options between the extremes of a single monolith and a fleet of separate applications that happen to talk to each other. There's no formal term for these yet, but I will refer to them as "clustered applications".

A clustered application:

  • Is maintained by a single interdisciplinary team
  • Is split into discrete components that run as their own processes, possibly in separate containers
  • Deploys as a single unit
  • May be in multiple languages but usually uses a single language
  • May share its datastore(s) between processes

This "in between" model has been with us for a very long time. The simplest example is also the oldest: cron tasks. Especially in the PHP world, many applications have had a separate cron process from their web request/response process for literally decades. The web process exists as, essentially, a monolith, but any tasks that can be pushed off to "later" get saved for later. The cron process, which could share some, all, or none of the same code, takes care of the "later". That could include sending emails, maintenance tasks, refreshing 3rd party data, and anything else that doesn't have to happen immediately upon a user request for the response to be generated.
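The cron pattern is simple enough to sketch in a few lines. The sketch below (Python for illustration; the table and function names are made up) simulates both halves in one process: the web handler only records the work, and the cron-invoked function drains it later.

```python
import sqlite3

# Shared datastore: in reality the web process and the cron process are
# separate processes touching the same database.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE pending_email (id INTEGER PRIMARY KEY, recipient TEXT, body TEXT)"
)

def handle_web_request(recipient: str) -> str:
    # The request handler only records that an email is needed...
    db.execute(
        "INSERT INTO pending_email (recipient, body) VALUES (?, ?)",
        (recipient, "Thanks for signing up!"),
    )
    db.commit()
    return "202 Accepted"  # respond immediately; the email goes out "later"

def cron_tick() -> int:
    # ...and the cron process, run on a schedule, does the slow work.
    rows = db.execute("SELECT id, recipient, body FROM pending_email").fetchall()
    for row_id, recipient, body in rows:
        print(f"sending to {recipient}: {body}")  # stand-in for an SMTP call
        db.execute("DELETE FROM pending_email WHERE id = ?", (row_id,))
    db.commit()
    return len(rows)

handle_web_request("alice@example.com")
handle_web_request("bob@example.com")
sent = cron_tick()
```

The key property is that the response never waits on the email: the only cost on the request path is one cheap INSERT.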

Moving up a level from cron are queue workers. Again, the idea is to split off any tasks that do not absolutely need to be completed before a response can be generated and push them to "later". In the case of a queue worker "later" is generally sooner than with a cron job but that's not guaranteed. The workers could be part and parcel of the application, or they could be a stand-alone application in the same language, or they could be in an entirely different language. A PHP application with a Node.js worker is one common pattern, but it could really be any combination.
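A queue worker follows the same shape, with a queue replacing the database table. A toy version (Python for illustration; an in-memory queue stands in for a real broker like RabbitMQ):

```python
import queue
import threading

jobs = queue.Queue()
done = []  # record of completed work, for the demo

def worker() -> None:
    # The worker blocks on the queue and handles jobs as they arrive --
    # "later", but usually much sooner than an hourly cron run.
    while True:
        job = jobs.get()
        if job is None:  # sentinel: shut down
            break
        done.append(f"processed {job}")
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()

# The request path just enqueues and returns; the response never waits
# on the work itself.
jobs.put("resize-avatar:42")
jobs.put("send-welcome-email:alice")

jobs.join()   # demo only: a real web process wouldn't wait for the worker
jobs.put(None)
t.join()
```

Swapping the in-memory queue for a broker also lets the worker be a separate process, in the same language or a different one, which is exactly the PHP-app-with-Node-worker pattern described above.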

Another variant is to make an "Admin" area of a site a separate application from the front-end. It would still be working on the same database, but it's possible then to have two entirely separate user pools, two different sets of access control, two different caching configurations, etc. Often the admin could be built as just an API with a single-page-app frontend (since all users will be authenticated with a known set of browser characteristics and no need for SEO) while the public-facing application produces straight HTML for better performance, scalability, cacheability, accessibility, and SEO.

Similarly, one could make a website in Django but build a partner REST API in a separate application, possibly in Go to squeeze the last drop of performance out of your system.

There's an important commonality to all of these examples: Any given web request runs through exactly one of them at a time. That helps to avoid the main pitfall of microservices, which is adding network requests to every web request. The fewer internal IO calls you have the better; just ask anyone who's complained about an application making too many SQL queries per request. The boundaries where it's reasonable to "cut" an application into multiple clustered services are anywhere there is, or can be, an asynchronous boundary.

There is still additional complexity overhead beyond a traditional monolith: while an individual request only needs one working service and there's only one team to coordinate, there are still multiple services to manage. The communication paths between them are still points of failure, even if they're much more tolerant of latency. There could also be an unpredictable delay between actions; an hourly cron could run 1 minute or 59 minutes after the web request that gave it an email to send. A queue could fill up under heavy traffic. Queues are not always perfectly reliable.

Still, that cost is lower than the overhead of full separate-team microservices while offering many (but not all) of the benefits in terms of separation of concerns and allowing different parts of the system to scale and evolve mostly independently. (You can always throw more worker processes at the queue even if you don't need more resources for web requests.) It's a model well worth considering before diving into microservices.

How do I do either of these on our platform?

I'm so glad you asked! Our platform is quite capable of supporting both models. While our CPO might yell at me for this, I would say that if you want to do "microservices" you need multiple projects.

Each microservice is supposed to have its own team, its own datastore, its own release cycle, etc. Doing that in a single project, with a single Git repository, is rather counter to that design. If your system is to be built with 4 microservices, then that's 4 projects; however, bear in mind that's a logical separation. Since they're all on the same platform, and presumably in the same region, they're still physically located in the same data center. The latency between them shouldn't be noticeably different than if they were in the same project.

Clustered applications, though, are where our platform especially shines. A project can have multiple applications in a single Git repository, either in the same language or in different languages. They can share the same data store or not.

To use the same codebase for both the web front-end and a background worker (which is very common), we support the ability to spin up the same built application image as a separate worker container. Each container is the same codebase but can have different disk configuration, different environment variables, and start a different process. However, because they all run the same code base it's only a single code base to maintain, a single set of unit tests to write, etc.

And of course cron tasks are available on every app container for all the things cron tasks are good for.

Within a clustered application processes will usually communicate either by sharing a database (be it MariaDB, PostgreSQL, or MongoDB) or through a queue server, for which we offer RabbitMQ.

Mixing and matching is also entirely possible. In a past life (in the bad old days, before platforms like ours existed) I built a customer site that consisted of an admin curation tool built in Drupal 7 that pulled data in from a 3rd party, allowed users to process it, and then exported pre-formatted JSON to Elasticsearch. That exporting was done via a cron job, however, to avoid blocking the UI. A Silex application then served a read-only API off the data in Elasticsearch, far faster than a Drupal request could possibly have done.

Were I building that system today it would make a perfect case for a multi-app project: A Drupal app container, a MySQL service, an Elasticsearch service, and a Silex app container.

Please code responsibly

There are always tradeoffs in different software design decisions. Sometimes the extra management, performance, and complexity overhead of microservices is worth it. Sometimes it's... not, and a tried-and-true monolith is the most effective solution.

Or maybe there's an in-between that will get you a better balance between complexity, performance, and scalability. Sometimes all you need is "just" a clustered application.

Pick the approach that fits your needs best, not the one that fits the marketing zeitgeist best. Don't worry, we can handle all of them.

Larry Garfield 7 Aug, 2018
Categories: Drupal

Community: Governance Task Force Community Update, August 2018

Planet Drupal - 7 August 2018 - 8:40am

This is a public update on the work of the Governance Task Force.

We have progressed into what we are calling the “Engagement Phase” of our schedule: interviewing community members and working groups, and soliciting feedback at meetups and camps. To date we have interviewed at least 18 people (including community members, liaisons, and leadership) and 3 groups, with at least 15 more interviews being scheduled.


If you would like to participate in an interview, please contact any member of the Governance Task Force or sign up using this Google form.

The purpose of interviews is to meet with people individually to get feedback and ideas, and have a conversation about community governance (non-technical governance). Many governance-related discussions have occurred in the past, but we want to make sure everyone has an opportunity to be heard, since group discussions are not always conducive to individual thoughts. Notes taken during the interview are available to, and editable by, the person interviewed, and are not shared outside the Governance Task Force. If you have any concerns about a language barrier, privacy, or anything else about participating in an interview, contact us. We will do what we can to work with you.


The individual interviews are a new step in this governance process, but we do have access to a lot of information that was already compiled from prior discussions. Many town hall style discussions were held over the past year, and we are using all of that information. As we progress into the “Analysis Phase” we are using that information to develop user stories and ideas that will help inform our eventual proposal. Once the interviews are concluded, their analysis will be merged with the existing information.

Drupal Europe

Rachel, Ela, and Stella will be providing an update on the task force’s efforts at Drupal Europe. Findings will be shared and there will be an open discussion to hear from attendees to inform our efforts.

Ongoing Feedback

The task force is committed to working transparently and delivering a well-rounded proposal for both the community and leadership to review. We believe the proposal presents a great opportunity to help evolve community governance and inform next steps. Should you want to contact the Governance Task Force, feel free to reach out to any member of the group via Slack, or in any public place you find our members.

We’ve also set up a Google form for direct feedback. If you do not want to participate in an interview, but do want to contribute your individual thoughts, use this form. You can answer as many or as few questions as you like. You can also submit the form anonymously. This form will stay active throughout the proposal process, so if you have thoughts to share at a later date, you can still use it.

Adam Bergstein
David Hernandez
Ela Meier
Hussain Abbas
Lyndsey Jackson
Rachel Lawson
Stella Power

Categories: Drupal

Thoughts: Last month in Drupal - July 2018

Planet Drupal - 7 August 2018 - 8:22am
July has been and gone, so here we take a look back at all the best bits of news that have hit the Drupal community over the last month.

Drupal Development

Dries Buytaert discussed why more and more large corporations are beginning to contribute to Drupal. He shares an extended interview with Pfizer Director Mike Lamb, who explains why his development team has ingrained open source contribution into the way they work. Drupal 8.5.5 was released in July; this patch release for Drupal 8 contained a number of bug fixes, along with documentation and testing improvements. It was announced that Drupal 8.6.0 will be released on September 5th; the alpha version was released the week beginning July 16th, and the beta the week of July 29th. This release will bring with it a number of new features, and Drupal released a roadmap of all the fixes and features they aim to have ready for it.

Events

Drupal Europe announced 162 hours of sessions and 9 workshops for the event on Tuesday, Wednesday and Thursday. They also urge anyone with ideas for social events at this year's event to submit them, to help fill out the social calendar with community-led ideas. On August 17-19, New York will play host to the second Decoupled Drupal Days. For those that don’t know, Decoupled Drupal Days gathers technologists, marketers and content professionals who build and use Drupal as a content service -- for decoupled front ends, content APIs, IoT, and more. DrupalCamp Colorado recently took place. The event proved popular as usual, and this year's keynote, “The Do-ocracy Dilemma and Compassionate Contribution”, was delivered by Acquia Director of Research and Innovation Preston So. Preston discusses why a more compassionate approach to contribution is so critical when it comes to managing open-source projects, crafting conference lineups, enabling a successful team, and building a winning business.
New Modules

New modules, updates and projects were of course released throughout July. The pick of the bunch includes:

  • Commerce 8.x-2.8 - the e-commerce suite sees a number of bug fixes
  • google_analytics 8.x-2.3 - a number of bug fixes
  • Drupal 8.5.5 - a patch release with a number of bug fixes and testing improvements

That is the end of this month's round up. Keep an eye out for next month's, where we cover all the latest developments and important news affecting the wider Drupal community. Missed last month's round up? Check it out on the Ixis site now.
Categories: Drupal



Country, State and City Field

New Drupal Modules - 7 August 2018 - 7:56am

This module adds a field containing three other fields (Country, State and City).

This field has two form displays:

Country State Widget

When the user chooses a country, the state field is shown; then, when the user selects a state, the city field is shown.

Country State Autocomplete Widget

The user searches for the city using autocomplete. When a city is chosen, the field gets the state and country.

Categories: Drupal

The 4D Finch House - by Justin Reeve Blogs - 7 August 2018 - 6:56am
There's more to What Remains of Edith Finch than first meets the eye. This article examines how the game's creative level design evokes feelings of quiet contemplation in the player.
Categories: Game Theory & Design

LEVY: Designing for Accessibility_02 - by Daniel St Germain Blogs - 7 August 2018 - 6:53am
My name is Dan St. Germain and while in undergrad, I worked with four other students on creating a blind/deaf accessible video game called LEVY. These blog posts will be dedicated to explaining the different UI, UX, and design decisions that were made.
Categories: Game Theory & Design

Story Analysis - Part 3 - Nier & Bonus - by Nathan Savant Blogs - 7 August 2018 - 6:52am
A series of critical analysis of the narrative and mechanical design in narrative games, continuing with Nier: Automata and a final bonus examination.
Categories: Game Theory & Design

Domain Driven Camera System - by Cameron Nicoll Blogs - 7 August 2018 - 6:50am
A concept for allowing easy control of how the Camera feels across multiple areas
Categories: Game Theory & Design

How a computer learns to dribble: Practice, practice, practice

Virtual Reality - Science Daily - 7 August 2018 - 6:49am
Basketball players need lots of practice before they master the dribble, and it turns out that's true for computer-animated players as well. By using deep reinforcement learning, players in video basketball games can glean insights from motion capture data to sharpen their dribbling skills.
Categories: Virtual Reality

First Fortnite, Now Fallout '76 - The "30% Standard Fee" is Under Attack - by Jay Powell Blogs - 7 August 2018 - 6:47am
Ok, Fortnite and Fallout '76 aren't the first EVER to skip the traditional stores, but they are signalling another shift in the industry. Epic and Bethesda are skipping Google Play and Steam respectively and they won't be the last to do it.
Categories: Game Theory & Design

From light novels to eSports: the business model behind The King’s Avatar - by Henri Brouard Blogs - 7 August 2018 - 6:47am
Chinese gaming giant Tencent released an anime about eSports. The series is an ideal marketing tool to promote Tencent products in China and overseas.
Categories: Game Theory & Design

Quickly Learn to Create Isometric Games Like Clash of Clans or AOE - by Vivek Tank Blogs - 7 August 2018 - 6:46am
With the advancement of technology in computer games, 3D games are becoming very common. But years back when there was no support for 3D elements, it was quite hard to develop 3D games but solution found at that time was an Isometric view, of 2D elements.
Categories: Game Theory & Design

Genre As A Tool For Meaning

Gnome Stew - 7 August 2018 - 6:30am

by kellepics on Pixabay

Genre is a powerful tool. Fantasy, Horror, Sci Fi, Historical Fiction, Anime… the definition of genre is broad and wiggly, but no matter how you’re defining it, genre plays an interesting role in how we tell our stories. While present in all media, genre is specifically a focus of tabletop roleplaying games and LARPs, where a realistic setting is the odd one out (except in the original concept of Nordic LARP, where the rule is no dragons, no NYC). This means our stories have so much potential to be packed full of meaning about ideas that spawn from the cultural consciousness.

How Humans Make Meaning

Every game has a message. Games are stories, especially roleplaying games, where we play characters and interact with narrative and create tales together. We are collaborative storytellers when we play roleplaying games. Stories are how we humans make meaning of our world. All stories have meaning; all roleplaying games have meaning.

Depending on who you are, you often get the privilege of “just telling stories” without thinking about their meaning, and there’s an implicit power in that ability. How much you admit the meaning in the stories you are telling, how intentional the meaning in that story is, and what your subconscious, unthinking mind creates in a story all convey messages. Meaning is conveyed through storytelling.

A Love Of Genre

Storytelling is a way of sharing images, characters, and journeys in a world that the storyteller wants to see. Genre and speculative fiction allow us to imagine so many different ways this could occur, outside the boundaries of modern life. What’s so awesome about genre is that we can wrestle with the big themes and big ideas that we face in contemporary life without having to skirt away from the heaviness of those themes. Genres give us freedom to imagine different worlds and rules of existence and realities… which is why speculative fiction is such a meaningful tool for feminists and marginalized folks in particular. We can imagine better futures for ourselves.

Genre is an alibi for these meanings in stories.  We can wrap up these meanings in afrofuturism and speculative feminism, space opera and other worlds, other times, fantastical times. In these other worlds we can imagine what these burning questions in our modern lives might be like in a different scenario than the one in the real world. That imagining can lead us to real solutions in our minds and hearts.

Genre Games

In Call of Cthulhu players approach the horrific and the unknown to try and see more than humans can see, and are punished because of this curiosity. In Dungeons & Dragons, players travel to different locations and use their wits and weapons to solve puzzles, find treasure, to gain power over time. In Blades in the Dark criminals survive in a dark city by making their fortunes against all odds. These all have implicit meaning behind them, and paint the world with different brushes to purposefully tell specific stories about specific types of people.

If we were to drop these themes into stories of straight drama, they’d be too on the nose. Who would want to play a game about traveling to a different country, attacking a group of people it’s decided are evil without really getting to know them, and stealing their treasure so you can become more powerful, for example? When put in a realistic context, it’s easy to see the colonialist meaning behind that story. Genre acts as an alibi for stories, though! If you’re fighting dragons, it’s easy to tell they’re evil, right?

Genre As Intentional Tool

While unintentional meaning can arise from genre stories that don’t consider the meaning behind the story, many genre games do a great job of using genre as an alibi to talk about serious things. In the tradition of Octavia Butler and Ursula K. Le Guin with their social science fiction, roleplaying games can tell meaningful fantastical stories about our current lives and identities.

The Blades in the Dark example above is derived heavily from the TV show The Wire, which is about how marginalized people don’t have many choices when the system doesn’t support them, and criminal action is the only way to survive. Mutants in the Night, much like the X-Men, uses the sci fi concept of mutants to highlight the lives of marginalized folks and how to fight back against systems of oppression. Monsterhearts uses monster romance as a metaphor for realizing you’re queer as a teenager. Kagematsu takes a typical samurai tale and subverts it by making the women of the village the main characters, thus portraying the gendered assumptions of emotional work.


The potential of genre as a tool to tell stories about contemporary issues is huge! Especially in roleplaying games, where we act out the lived experiences of the characters in the tale, and gain empathy through doing so. The fantastical and the unreal have great power in our imaginations. What meaning do your stories tell? What genre games tell your favorite types of stories about our contemporary lives? Let me know in the comments!


Categories: Game Theory & Design

Amazee Labs: Transparent Database Sanitization with GDPR-dump

Planet Drupal - 7 August 2018 - 5:12am
Transparent Database Sanitization with GDPR-dump

With GDPR in full effect, sanitization of user data is a fairly hot topic. Here at Amazee we take our clients’ and our clients’ clients’ privacy seriously, so we have been investigating several possible approaches to anonymizing data.

In the Drupal world, and the PHP world more generally, there are several options available. Here, though, I’d like to discuss one we think is particularly cool.

Blaize Kaye Tue, 08/07/2018 - 14:12

At Amazee Labs’ Global Maintenance, we work with several different projects per day. We move data from our production to staging and dev servers, and from our servers to our local development environments. Especially on legacy systems, site-specific configuration details often exist only in the databases, and even if that weren’t the case, the issues we’re investigating routinely require that we dig into the database as it (more or less) is on the production servers. Anonymization is crucial for our day to day work.

So our consideration here is: how do we balance productivity while keeping things anonymous?

One way of achieving this is to make anonymization transparent to the developer. Essentially, we want our developers to be able to pull down the live database as it exists at the moment they pull it down, and have it be anonymized.

How can we achieve this?

Well, one way is to analyse the daily workflow to see if there are any points through which the data has to flow before it reaches the developer.

It turns out that, if you’re working with MySQL, this “final common path” that the data flows through is the mysqldump utility.

If you’re running backups, chances are you’re using mysqldump.

If you’re doing a drush sql-sync there’s a call to mysqldump right at the heart of that process.

Mysqldump is everywhere.

The question is, though, how do we anonymize data using mysqldump?

The standard mysqldump binary doesn’t support anonymization of data, and short of writing some kind of plugin, this is a non-starter.

Fortunately for us, Axel Rutz came up with an elegant solution, namely, a drop in replacement for the mysqldump binary, which he called gdpr-dump. A few of us here at Amazee loved what he was doing, and started chipping in.

The central idea is to replace the standard mysqldump with gdpr-dump so that any time the former is called, the latter is called instead.

Once the mysqldump call has been hijacked, so to speak, the first order of business is to make sure that we are actually able to dump the database as expected.

This is where mysqldump-php comes in. It’s the library on which the entire gdpr-dump project is based. It provides a pure PHP implementation of mysqldump as a set of classes. On its own, it simply dumps the database, just as the native mysqldump cli tool does.

A great starting point, but it only gets us part of the way.

What we’ve added is the ability to describe which tables and columns in the database being dumped you would like to anonymize. If, for instance, you have a table of user data with names, emails, telephone numbers, etc., you can describe the structure of this table to gdpr-dump and it will generate fake, but realistic-looking, data using the Faker library.

This requires some upfront work, mapping the tables and columns, but once it is done you’re able to call mysqldump in virtually any context, and it will produce an anonymized version of your database.
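The idea behind such a mapping can be sketched independently of gdpr-dump itself. The toy below (Python for illustration; the table, column, and function names are all hypothetical, and a trivial stand-in replaces the Faker library) shows the core move: as each row is written out, columns listed in the map are swapped for fake values, and everything else passes through untouched.

```python
import random

# Hypothetical mapping in the spirit of gdpr-dump's configuration:
# table -> column -> kind of fake data to substitute.
ANONYMIZE = {
    "users": {"name": "name", "mail": "email"},
}

FIRST_NAMES = ["Alice", "Bob", "Carol", "Dave"]

def fake(kind: str) -> str:
    # Stand-in for the Faker library: realistic-looking, but fake, values.
    name = random.choice(FIRST_NAMES)
    return f"{name.lower()}@example.com" if kind == "email" else name

def dump_row(table: str, row: dict) -> dict:
    # Called for every row as the dump is written: columns listed in the
    # mapping are replaced; everything else passes through untouched.
    rules = ANONYMIZE.get(table, {})
    return {col: fake(rules[col]) if col in rules else val
            for col, val in row.items()}

row = {"uid": 7, "name": "Larry", "mail": "larry@real-domain.example"}
safe = dump_row("users", row)
```

Because the substitution happens inside the dump path itself, every consumer downstream (drush sql-sync, backups, local imports) gets sanitized data with no changes to its own workflow.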

There is still a lot of thinking and work to be done, but we think it’s worth investing time in this approach. The fact that it can be used transparently is its most compelling aspect - being able to simply swap out mysqldump with gdpr-dump and have the anonymization work without having to change any of the dependent processes.

If any of this piques your interest and you’re looking for more details about how you might be able to use gdpr-dump in your own workflow, feel free to check out the project (and submit PRs).

Categories: Drupal

Date all day

New Drupal Modules - 7 August 2018 - 3:51am

Provides a field type and field widget extending the date range field to allow editors to set that a date has no time, meaning that it takes place all day.
This offers functionality similar to the date_all_day module in Drupal 7, which is part of the Date project, ported to Drupal 8.

Categories: Drupal

ADCI Solutions: Visual regression testing with BackstopJS

Planet Drupal - 7 August 2018 - 3:26am

The larger a project, the more time you will spend on regression testing after each change. But there are a lot of tools which can help you to reduce efforts for this process. One of them is BackstopJS.

Get acquainted with BackstopJS

Categories: Drupal

Queue monitor

New Drupal Modules - 7 August 2018 - 3:01am

Queue Monitor monitors the status of Drupal queues. A queue is processed immediately if it has items; otherwise the monitor waits.

Listen to a specified queue:

$ drush queue_monitor:run myqueue

Listen to all queues:

$ drush queue_monitor:runall
Categories: Drupal



