So far in our series of posts about helping you become a better Drupal developer, we’ve talked a lot about contribution from an individual point of view. But Drupal is community first! Even if you believe you’ve already reached your potential as a developer, remember the people before you who subscribed to the motto: come for the code, stay for the community. Continue reading...
The post Grow as a Drupal developer: embrace the community! appeared first on Manifesto.
We've heard of test-driven development, behaviour-driven development, feature-driven development and someone has probably invented buzzword-driven development by now. Here's my own new buzzword phrase: review-driven development. At ComputerMinds, we aim to put our work through peer reviews to ensure quality and to share knowledge around the team. Chris has recently written about why and how we review our work. We took some time on our last team 'CMDay' to discuss how we could make doing peer reviews better. We found ourselves answering this question: Why is reviewing hard? How can we make it easier?
We had recently run into a couple of specific scenarios that had triggered our discussion. In one, pressure to complete the work had meant reviews were rushed or incomplete. The other scenario had involved such a large set of code changes that reviewing them all at the end was almost impossible. I'm glad of the opportunity to reflect on our practice. Here are some of the points we have come away with - please add your own thoughts in the comments section below.

1. Coders, help your reviewers
The person that does the development work is the ideal person to make a review easy. The description field of a pull request can be used to write a summary of changes, and to show where the reviewer should start. They can provide links back to the ticket(s) in the project's issue tracking system (e.g. Redmine/Jira), and maybe copy across any relevant acceptance criteria. The coder can chase a colleague to get a review, and then follow up to continue discussions, as it is inevitable that reviewers will have questions.

2. Small reviews are easier
Complicated changes may be just as daunting to review as to build. So break them up into smaller chunks that can be reviewed more easily. This has the massive benefit of forcing a developer to really understand what they're doing. A divide & conquer approach can make for a better implementation and is often easier to maintain too, so the benefits aren't only felt by reviewers.

3. Review early & often
Some changes can get pretty big over time. They may not be easy to break up into separate chunks, but work on them could well be broken up into iterations, building on top of each other. Early iterations may be full of holes or @TODO comments, but they still reveal much about the developer's intentions & understanding. So the review process can start as early as the planning stage, even when there's no code to actually review. Then as the changes to code take shape, the developer can continually return to the same person every so often. That reviewer's contextual knowledge grows along with the changes, helping them understand what's going on and provide a better review.

4. Anyone can review
Inevitably some colleagues are more experienced than others - but we believe reviews are best shared around. Whether talking about your own code, or understanding someone else's code, experience is spread across the team. Fresh eyes are sometimes all that's needed to spot issues. Other times, it's merely the act of putting your own work up for scrutiny that forces you to get things right.

5. Reviewers, be proactive!
Developers like to be working, not waiting for feedback. Once they've got someone to agree to review their work, they have probably moved on to solving their next problem. However well they may have written up their work, it's best for the reviewer to chase the developer and talk through the work, ideally face-to-face. Even if the reviewer then goes away to test the changes, or there's another delay, it's best for the reviewer to be as proactive as possible. Clarify as much as needed. Chase down the answers. Ask seemingly dumb questions. Especially if you trust the developer - that probably means you can learn something from them too!

6. Use the tools well
Some code changes can be ignored or skipped through easily - things like the boilerplate code around features exports in Drupal 7, or changes to composer.lock files. Pointers from the developer to the reviewer about which files/changes are important are really helpful. Reviewers themselves can also get wise as to what to focus on. Tools can help with this - hiding whitespace changes in diffs, the files tab of PRs on GitHub, or three-way merge tools, for example. Screenshots or videos are essential for communicating between developer & reviewer about visual elements when they can't meet face-to-face.

7. What can we learn from drupal.org?
The patch-based workflow that we are forced to use on drupal.org doesn't get a lot of good press. (I'm super excited for the GitLab integration that will change this!) But it has stood the test of time. There are lessons we can draw from our time spent in its issue queues and contributing patches to core and contrib projects. For example, patches often go through two types of review, which I'd call 'focussed nitpicks' and the wider 'approach critiques'. It can be too tempting to write code that only fulfils precise acceptance criteria, or merely passes tests - but reviewers are humans, each with their own perspectives to anticipate. Aiming for helpful reviews can be even more useful for all involved in the long run than merely aiming to resolve a ticket.

8. Enforcing reviews
We tailor our workflow for each client and project - different amounts of testing, project management and process are appropriate for each one. So 'review-driven development' isn't a strict policy to be enforced, but a way of thinking about our work. When it is helpful, we use GitHub's functionality to protect branches and require reviews or merges via pull requests. This helps us to transparently deliver quality code. We also find this workflow particularly handy because we can broadcast notifications in Slack of new pull requests or merges that will trigger automatic deployments.

What holds you back from doing reviews? What makes a review easier?
I've only touched on some of the things we've discussed and there's bound to be even more that we haven't thought of. Let us know what you do to improve peer reviewing in the comments!
It is all too easy to delete users and their content without realising how damaging that can be. This module tries to prevent that from happening, and also adds a user-deletion method that allows you to assign all of the content made by the user you are deleting to another user.
In a nutshell, this module adds:
- an extra option to assign content to a new user, and to make it the default
- warning styles and extra text for the core option that deletes content along with the user
Meet Gabriele Maira, also known as Gabi by friends and as Gambry by the Drupal community. With over 15 years of experience working with PHP and over 10 working with Drupal, Gabriele is currently the PHP/Drupal Practice lead at the London-based Manifesto Digital. Read about his beginnings with open source and why he thinks every Drupal developer should attend a Sprint at least once in their life.
Instances of advanced search can be seen in many different fields. Take space research, for instance. In their perpetual effort to explore the Universe and search for Earth-like planets or stars similar to our Sun, scientists wind up discovering interesting things. NASA’s Hubble Space Telescope was used to spot a star called Icarus, named after the Greek mythological figure; it is the most distant star ever viewed, located halfway across the universe.
Planting the seed in 2004
On the other side of the spectrum, Apache Solr is making huge strides with its advanced search capabilities. Enterprise-level search is a quintessential necessity for your online presence to be able to thrive in this digital age. Big organisations like Apple, Bloomberg, Marketo and Roku, among others, are opting for Apache Solr for its advanced search features. The amalgamation of Drupal and Apache Solr can be a remarkable solution for a magnificent digital presence.
John Thuma, in one of his blog posts, stated that tracing the history of Apache Solr takes us back to the year 2004, when it was an in-house project at CNET Networks, used to provide search functionality for the company’s website. CNET Networks then donated it to the Apache Software Foundation in 2006.
Later, the Apache Lucene and Solr projects merged in 2010, with Solr becoming a sub-project of Lucene. It has witnessed a great many alterations since then and is now a very significant component in the market.

Uncloaking Apache Solr
Solr is a standalone enterprise search server with a REST-like API. You put documents in it (called "indexing") via JSON, XML, CSV or binary over HTTP. You query it via HTTP GET and receive JSON, XML, CSV or binary results. - Lucene.apache.org
As an enterprise-capable, open source search platform, Apache Solr is based on the Apache Lucene search library and is one of the most widely deployed search platforms in the world.
It is written in Java and offers both a RESTful XML interface and a JSON API that enables the development of search applications. Its perpetual development by an enormous community of open source committers under the direction of the Apache Software Foundation has been a great boost.
Apache Solr is often debated alongside Elasticsearch. There is even a dedicated website called solr-vs-elasticsearch that compares the two on various parameters. It notes that both solutions support integration with open source content management systems like Drupal. Which one you select depends on your organisation’s needs.
For instance, if your team includes many Java programmers, or you already use ZooKeeper and Java in your stack, you can opt for Apache Solr. On the contrary, if your team consists of PHP/Ruby/Python/full-stack programmers, or you already use Kibana or the ELK stack (Elasticsearch, Logstash, Kibana) for handling logs, you can choose Elasticsearch.

Characteristics of Apache Solr
Following are the features of Apache Solr:

Advanced search capabilities
- Spectacular matching capabilities: Apache Solr is characterised by advanced full-text search capabilities. It enables spectacular matching, comprising phrases, grouping, wildcards, joins and so on, across any data type.
- A wide array of faceting algorithms: It has the support for faceted search and filtering that enables you to slice and dice your data as needed.
- Location-based search: It offers out-of-the-box geospatial search functionalities.
- Multi-tenant architecture: It offers multiple search indices, which streamline the process of segregating content and users.
- Suggestions while querying: There is support for auto-complete while searching (typeahead search), spell checking and many more.
Optimised for high-volume traffic

It is optimised for colossal spikes in traffic. Also, Solr is built on Apache ZooKeeper, which makes it easy to scale up or down. It has built-in support for replication, distribution, rebalancing and fault tolerance.

Support for standards-based open interfaces and data formats
It uses standards-based open interfaces like XML, JSON and HTTP. Furthermore, you do not have to waste time converting all your data to a common representation, as Solr supports JSON, CSV, XML and many more formats out of the box.

Responsive admin UI
It provides an out-of-the-box admin user interface that makes it easier to administer your Solr instances.

Streamlined monitoring
Solr publishes a truckload of metric data via JMX that assists you in getting more insight into your instances. Moreover, logging is monitorable, as the log files can be easily accessed from the admin interface.

Magnificent extensions
It has an extensible plugin architecture that makes it simple to plug in both index-time and query-time plugins. It also provides optional plugins for indexing rich content, detecting language, clustering search results, among others.

Configuration management
Its flexibility and adaptability for easy configuration are top-notch. It also offers advanced configurable text analysis: there is support for most of the widely spoken languages in the world, and a plethora of analysis tools that make indexing and querying your content flexible.

Performance optimisation
It has been tuned to handle the largest of sites, and its out-of-the-box caches have fine-grained controls that assist in optimising performance.

Amazing indexing capabilities
Solr leverages Lucene’s near-real-time indexing capabilities, ensuring that users see content whenever they want to. Also, its built-in Apache Tika integration simplifies the process of indexing rich content like Microsoft Word and Adobe PDF documents.

Schema management
You can leverage Solr’s data-driven schemaless mode in the early stages of development, and lock it down for production.

Security
Solr has robust built-in security features such as SSL (Secure Sockets Layer), authentication and role-based authorisation.

Storage
Lucene’s advanced storage options, such as codecs and directories, ensure that you can fine-tune data storage to suit your application.

Leverage Apache UIMA
Content can be enhanced with its advanced annotation engines. It incorporates Apache UIMA for leveraging NLP (Natural Language Processing) and other tools in your application.

Integrating Apache Solr with Drupal
Drupal’s impressive flexibility empowers digital innovation and gives users the power to build almost anything. It provides for the integration of your website with the Solr platform: Drupal’s Search API Solr Search module provides a Solr backend for the Drupal Search API module.
To begin with, you need Apache Solr installed on your server. This is followed by validating the Solr server’s status from the terminal, and then installing the Search API Solr Search module using Composer.

Once the Search API Solr Search module is installed, configuration of Solr ensues. This involves the creation of a collection, which is basically a logical index linked to a config set. (Source: OSTraining)

Then Drupal’s default search module is uninstalled, to avoid any performance issues, and the Search API Solr Search module is enabled. You can then move on to configuring the Search API. Finally, you can test the Search API Solr Search module. (Source: OSTraining)

Case study
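The steps above can be sketched on the command line. This is a minimal, hypothetical sequence assuming a local Solr install on the default port, a Composer-based Drupal 8 project and Drush; the collection name `drupal` is an example, not a requirement.

```shell
# 1. Validate the Solr server's status from the terminal
#    (assumes Solr is listening on the default port 8983).
curl "http://localhost:8983/solr/admin/cores?action=STATUS"

# 2. Install the Search API Solr Search module with Composer.
composer require drupal/search_api_solr

# 3. Create a collection - a logical index linked to a config set.
#    The name "drupal" is just an example.
bin/solr create -c drupal

# 4. Uninstall Drupal's default search module and enable the new one
#    (using Drush).
drush pm:uninstall search
drush pm:enable search_api_solr
```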
The Rainforest Alliance (RA), an international non-profit organisation working towards strong forests, healthy agricultural landscapes, and thriving communities through creative collaboration, leveraged the power of Drupal to revamp its website with the help of a digital agency.
RA has built a repository of structured content to support its mission; the content is primarily exhibited as long-form text with a huge variety of metadata and assets associated with each piece of content. It wanted to revamp the site and enable the discovery of new content through the automatic selection of related content. It also required advanced permission features and publishing workflows.
Drupal turned out to be an astounding choice for fulfilling RA’s requirement of portable and searchable content. It was also a great fit because of its deep integration with Apache Solr, which enabled a nuanced content-relation engine. Solr was leveraged to power various search interfaces. Furthermore, Drupal’s wonderful content workflow features made it a perfect choice.
Solr offered ‘more like this’ (MLT) functionality that was more robust than simply tagging content and showing other content with the same taxonomy terms. The Search API Solr Search module, which provides a Solr backend for the Search API module, was utilised to provide the interface to govern the servers and indexes. Then, with a custom block, MLT was leveraged to assist in generating related-content lists.
The Page Manager module, in combination with the Layout Plugin and Panels modules, was used to build specialised landing pages, many of them with their own layouts. Different modules from within Drupal's media ecosystem were very beneficial in administering images, embedding videos, and so on. Entity Embed, Entity Browser and Inline Entity Form made for a great editorial experience for content teams.

Conclusion
Apache Solr is a great solution for enabling enterprise-level search and can make a world of difference in combination with Drupal for your digital presence.
We have been empowering our partners’ digital transformation efforts with our expertise in Drupal development.
Ping us at firstname.lastname@example.org to extract advanced Solr features with Drupal.
Elasticsearch has proved meritorious for The Guardian, one of the most renowned news organisations, by giving them the freedom to build a stupendous analytics system in-house rather than depending on a generic, off-the-shelf analytics solution. Their traditional analytics package was extremely sluggish, consuming an enormous amount of time. The Elasticsearch-powered solution has turned into an enterprise-wide analytics tool and has helped them understand how their content is being consumed.
Why would such a large organisation as The Guardian choose Elasticsearch for its business workflow? Elasticsearch is all about full-text search, structured search, analytics, the intricacies of human language, geolocation and relationships. Drupal, one of the leading content management frameworks, is a magnificent solution for empowering digital innovation and can help in implementing Elasticsearch. Before we look at Drupal’s capability to implement an Elasticsearch ecosystem, let’s unwrap Elasticsearch first.
Elasticsearch is an open source, broadly distributable, RESTful search and analytics engine which is built on Apache Lucene. It can be accessed through an extensive and elaborate API. It enables incredibly fast searches for supporting your data discovery applications. It is used for log analytics, full-text search, security intelligence, business analytics and operational applications.

Elasticsearch is a distributed, scalable, real-time search and analytics engine - Elastic.io
It enables you to store, search and analyse voluminous amounts of data swiftly and in near real time. In general, it is leveraged as the underlying engine for powering applications that have sophisticated search features and requirements.
How does Elasticsearch work? With the help of the API or ingestion tools like Logstash, data is sent to Elasticsearch in the form of JSON documents. The original document is automatically stored by Elasticsearch, and a searchable reference to the document is added to the cluster’s index. The Elasticsearch API can then be utilised to search and retrieve the document. Kibana, an open-source visualisation tool, can also be used with Elasticsearch to visualise the data and create interactive dashboards.
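As an illustration of that flow, here is a hypothetical set of requests against a local Elasticsearch instance (assumed to be listening on `localhost:9200`); the `articles` index and the document contents are made up for this example.

```shell
# Index a JSON document over HTTP; Elasticsearch stores the original
# and adds a searchable reference to it in the cluster's index.
curl -X PUT "localhost:9200/articles/_doc/1" \
  -H "Content-Type: application/json" \
  -d '{"title": "Drupal and Elasticsearch", "body": "Near real-time search"}'

# Retrieve the stored document by ID...
curl "localhost:9200/articles/_doc/1"

# ...or run a full-text search against the index.
curl "localhost:9200/articles/_search?q=title:drupal"
```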
Elasticsearch is often debated alongside Apache Solr. There is even a dedicated website called solr-vs-elasticsearch that compares the two on various metrics. Both solutions support integration with open source content management systems like Drupal. Which one you select depends on your organisation’s needs.
For instance, if your team includes many Java programmers, or you already use ZooKeeper and Java in your stack, you can opt for Apache Solr. On the contrary, if your team includes PHP/Ruby/Python/full-stack programmers, or you already use Kibana or the ELK stack (Elasticsearch, Logstash, Kibana) for handling logs, you can choose Elasticsearch.

Merits of Elasticsearch
Following are some of the benefits of Elasticsearch:
- Speed: Elasticsearch helps in leveraging and accessing all the data at a fast clip. Also, it makes it simple to rapidly build applications for multiple use cases.
- Performance: Being highly distributable, it allows the processing of a colossal amount of data in parallel and swiftly finds the best matches for your search queries.
- Scalability: Elasticsearch can easily operate at any scale without compromising power and performance. It allows you to move from prototype to production seamlessly. It scales horizontally to handle multiple events per second while simultaneously managing the distribution of indices and queries across the cluster for efficient operations.
- Integration: It comes integrated with visualisation tool Kibana. It also offers integration with Beats and Logstash for streamlining the process of transforming source data and loading it into Elasticsearch cluster.
- Safety: It detects failures for keeping the cluster and the data safe and available. With cross-cluster replication, a secondary cluster can be leveraged as a hot backup.
- Real-time operations: Elasticsearch operations like reading or writing data are usually performed in less than a second.
- Flexibility: It can pliably handle application search, security analytics, metrics, logging among others.
For designing a full Elasticsearch ecosystem in Drupal, Elasticsearch Connector, which is a set of modules, can be utilised. It leverages the official Elasticsearch PHP library and was built with the objective of handling large sets of data at scale. It is worth noting that this module is not covered by the security advisory policy.
The Elasticsearch Connector module can be utilised with a Drupal 8 installation and configured so that Elasticsearch receives the content changes. At first, you need to download a stable release of Elasticsearch and start it. You can then move ahead and set up Search API. This is followed by connecting Drupal to Elasticsearch with the help of the Elasticsearch Connector module, which involves the creation of a cluster: the collection of node servers where all the data will be stored and indexed.
This is succeeded by the configuration of Search API. It offers an abstraction layer that allows Drupal to push content alterations to different servers, such as Elasticsearch, Apache Solr, or any other provider that has a Search API-compatible module. The indexes are created in each of those servers with the help of Search API. These indexes are like buckets where data can be pushed and searched in different ways. Subsequently, indexing of content and processing of data is done.
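Sketched on the command line, the first of those steps might look like the following. This is a hypothetical sequence assuming a Composer-based Drupal 8 site with Drush; the Elasticsearch version shown is only an example, and the cluster, servers and indexes are then configured in Drupal's admin UI.

```shell
# 1. Download a stable Elasticsearch release and start it
#    (6.4.0 is purely an example; -d runs it as a daemon).
./elasticsearch-6.4.0/bin/elasticsearch -d

# 2. Add the Search API and Elasticsearch Connector modules.
composer require drupal/search_api drupal/elasticsearch_connector

# 3. Enable both modules; the cluster and indexes are then created
#    in the admin UI (Configuration → Search and metadata).
drush pm:enable search_api elasticsearch_connector
```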
We rebuilt the website of PMG using progressively decoupled Drupal, React and Elasticsearch Connector module among others.
To do the mapping and indexing on the Elastic server, the Elasticsearch Connector and Search API modules were leveraged. The development of the Elastic backend architecture was followed by building the faceted search application with React and incorporating the app in Drupal as a block or template page.
The project structure for the search was designed and developed in the sandbox with modern tools like Babel and Webpack and third-party libraries like Searchkit. Searchkit is a suite of React components that interact directly with your ElasticSearch cluster where every component is built using React and can be customised as per your needs. Searchkit was of immense help in this project.
Logstash and Kibana, which are based on Elasticsearch, were integrated on the Elastic server. This helped in collecting, parsing, storing and visualising the data. The app in the sandbox was built for production, and all the CSS/JS was integrated inside Drupal as a block, thereby making it a progressively decoupled feature.
The world is floating over a cornucopia of data. There is simply no end to the growth in the amount of data that is flowing through and produced by our systems. Existing technology has laid emphasis on how to store and structure warehouses replete with data.
But when it comes to making decisions in real time informed by that data, you need something like Elasticsearch for searching and analysing data in real time. Drupal, with its suite of modules, can be a wonderful solution for implementing an Elasticsearch ecosystem.
We have been steadfast in our goals of empowering digital innovation with our suite of services.
Contact us at email@example.com to reap the rewards of Elasticsearch and ingrain your digital presence with advanced search capabilities.
This module adds extra options to customize the Entity Embed Dialog:
1) You can choose a custom dialog title to display during the selection step.
2) You can choose a custom dialog title to display during the embed step.
3) You can choose a custom label for the "back" button that returns to the selection step.
4) You can select a view to display the entity embed, which allows for a nicer UI experience for the editor. You can customize the view (or create your own) to add images and an edit button (a default view is available out of the box with the module).
What happened since last month? In a nutshell:
- 2.0 & 2.1 released :)
- Usage continues to rise: ~400 → ~1700 sites, 50% of those on 2.0 1
- Gabe proposed JSON:API profiles for versioning/revisions and multiple-arity relationships, to take advantage of the upcoming 1.1 version of the spec
- New core patch to bring JSON:API to Drupal core: #2843147-101
- Several refactors of internals 2 that pave the path for hypermedia links and partial caching
- JSON:API Extras is kept in sync — about 300 of you use that with JSON:API 2.x.
Work-arounds for two very common use cases are no longer necessary: decoupled UIs that are capable of previews and image uploads3.
- File uploads work similarly to Drupal core’s file uploads in the REST module, with the exception that a simpler developer experience is available when uploading files to an entity that already exists.
- Revision support is for now limited to retrieving the working copy of an entity using ?resourceVersion=rel:working-copy. This enables the use case we hear about the most: previewing draft Nodes. 4 Browsing all revisions is not yet possible due to missing infrastructure in Drupal core. With this, JSON:API leaps ahead of core’s REST API.
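For instance, retrieving the working copy of an article might look like the hypothetical request below; the host and UUID are placeholders.

```shell
# Fetch the latest revision (working copy) of a node via JSON:API,
# using the resourceVersion query parameter described above.
curl \
  -H "Accept: application/vnd.api+json" \
  "https://example.com/jsonapi/node/article/9bbe5a47-0000-4f77-9171-000000000000?resourceVersion=rel:working-copy"
```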
Please share your experience with using the JSON:API module!
Note that usage statistics on drupal.org are an underestimation! Any site can opt out from reporting back, and composer-based installs don’t report back by default. ↩︎
Unfortunately only Node and Media entities are supported, since other entity types don’t have standardized revision access control. ↩︎
The File Replace module is a small utility that gives site administrators the ability to replace files while keeping the file URI intact. This is useful in cases where a file is linked or used directly but needs to be updated occasionally.

Installation and usage
Install the module as you would any contrib module, preferably with composer:
composer require drupal/file_replace
Do you have too many games just collecting dust on your bookshelves?
Is it a struggle to get your players to try something new?
Are you just too busy to finish reading that new tome of a game book?
You are not alone.
As Convention Coordinator for the Indie Game Developer Network and as a self-published game designer, I travel to monthly conventions to sell and run role-playing games. I’m always asking what games people are playing, running, and are interested in. And, let me tell you, there are plenty of others who feel the same as you. They face the same challenges, the same struggles.
You don’t have to go it alone.
Together, I think we can build/borrow a system to help solve all of our (current) problems. It won’t cure them overnight, but it will treat the problems and help to create a foundation for other like-minded locals. Together, we can unite to build more than a gaming group. We can create a movement! One to tackle the difficulties of having too many unplayed games, of luring players to try something different, or the recurring learning curve of each new game. First things first, we start by building a community. To meet up and game together, we’ll need a pool of potential players and Game Masters. Thanks to social media, this is surprisingly easy to start, though difficult to master.

Build a Community?
We need a place for people to gather in order to build interest in our idea—our movement. With the utility of social media, we can easily recruit, message, and share ideas (posts) at times that are convenient for one another. A community can be fostered in something as simple as a Facebook Group. You could start today.

Add the people you game with, the people that you know game locally, and the Game Masters that run events at local stores or gaming hangouts. Most game stores have their event schedules posted on a website or community board in their store. Take advantage of these to help you find where people are gaming and who is facilitating these games. Talk to your local game store owners about what you’re working on and who else they think you should talk to. Don’t assume that every game played will be publicly posted.

Also, dig around for other local gaming groups on Facebook. Search the name of local game stores for groups that share their name. Search for groups that prominently state your city, town, or region’s name with the keywords RPG, Tabletop, Geek, or D&D. You may be surprised to find how much is already going on right under your nose.
How Will This Solve My Problems?
A model that I’ve adopted in Northwest Indiana is that of Games on Demand. You may have heard of their work or participated in a game with them at Gen Con, Origins, or PAX. Their model consists of several Game Masters, each offering up two or more different games per time slot. Players that attend the event(s) pick from the games offered on a first come, first served basis. They generally select from table tents that give a blurb of each game with a picture. As games are selected, the choices begin to narrow, and focus shifts to filling the games selected.
To solve your problems, you need to foster an environment that builds demand for new games and that attracts players who want to be involved in those games. The Games on Demand model is attractive to players and Game Masters looking to play and run new games. It has an easy-to-understand structure that clearly defines what a Game Master needs to prepare for (two different games, two different one-shot sessions, expect new players). Not to mention, with other Game Masters sharing games, you can learn more of them without having to read or research them one at a time.

What Do I Need for This to Work?
- Game Masters: With the help of another Game Master or two, you could offer up to six different games for a game night. Each Game Master already has games they know how to run and games on their shelf they are just dying to put to good use. Creating a community with game nights and a Games on Demand model provides the opportunity you’ve all been waiting for. Give yourself and other Game Masters the opportunity to share all of the cool games you’ve been collecting.
- Public Places to Play: Potential players need to feel safe before they will join you for a game. Playing with strangers can be very intimidating, especially if it is at a private home where you don’t know anyone. Reach out to your local game stores or anywhere else people are gaming in your area (coffee shops, tabletop-friendly bars, churches, the local library) and schedule a time for 2-3 free tables. You may be asked to institute a rule that everyone buy their drinks and snacks from the establishment. That’s only fair; besides, you want the opportunity for foot traffic. Shopping customers and regulars can be recruited to play and invited to future events. They may even have friends…
- Game Day Events or Meetups: What good is the community if we have nothing to rally around? Create events to attract people in your local area. Ask your friends to share the events with their friends and families. Spread the word on Facebook in your local groups with similar interests. If you have the skills to make a flyer, put some up on your local school and game store bulletin boards. It takes time for people to understand what you’re doing and also to carve out time to attend. Don’t be discouraged! Some people will need time to find themselves in a situation where they are also looking for new people to game with or new games to play. I have had more than one Game Master tell me, “I’ll be there when I’m no longer running two games a week.” That’s fair!
My local contingent is called the Tales of the 219. Yep, that’s our area code! Northwest Indiana Story Gamers just didn’t win over the hearts and minds of our founding members.
On January 31st, we mark our one-year anniversary of running events at local game stores on a monthly (and more) schedule. Setting out, I found other Game Masters interested in diversifying the games that they play and struggling to find players for their games not named D&D or Pathfinder. Sharing my vision for a more vibrant, inclusive, and variety-rich gaming community, I reached out to local game stores. To their surprise, we didn’t want money or to sell attendees something physical. We just wanted to grow the community of role-players in Northwest Indiana, to bridge RPG enthusiasts beyond their favorite game store, gaming group, or routine game of choice.
I remember speaking to Matt and Jared, two of my friends and early adopters of our vision.
“This (Tales of the 219) is probably going to be like four of us running games for each other for a year or two. But, one day, there will be others, and they’ll talk fondly about how they found the Tales of the 219. We’ll be like forefathers that paved the way to make a more vibrant and diverse role-playing game community possible. This will work. We just need to be persistent and deal with the inevitable trying times that will come with the occasional successes.”
I’m happy to inform you that it never did end up being just the four of us playing each others’ games. We’ve hosted events at six different game stores and one local convention for a total of fifteen events in 2018. Attendance varies from 5-15 individuals with an average of 8-9 folks attending per gathering. At our local convention Arcticon, we held eight games seating over 40 players! It’s a good feeling to not only play more games but to help others experience brilliant games they never knew existed.
Some things I’ve learned so far:
- Meetup.com was an excellent tool for engaging and managing a growing role-playing game community. It isn’t anymore. Some Meetup accounts are doing very well and holding strong to the format, but they are mostly groups that have been around for years. All the action is on Facebook these days, even if respondents ARE flaky. If you aren’t familiar with people marking “Interested” instead of “Going” for your events, get used to it. Phone-based Facebook users really gotta dig to find the elusive “Going” option. So, don’t be too hard on those that mark “Interested”. Searching Meetup for other gaming groups in your area can be very useful, though. You can reach out to them and talk about consolidating efforts.
- D&D can be a powerful tool for recruiting and finding new players to join your burgeoning RPG community. It can also lead to exactly what you may have been trying to avoid—people only interested in D&D (or Pathfinder). I’ve spoken to a few of the larger role-playing game Meetup leaders about using D&D as a gateway for players. It will inflate your community numbers but may not convert many people over to trying out new games. People like to like D&D in this day and age! Experiment at your own risk.
- Going to where the people are is worth it. Finding your people takes good word of mouth, a little luck, or consistency. Hopefully, you can find two of the three! Sometimes, it takes several events at a location to find the people who will become a core part of your new community. Most of us have busy adult lives and might skip a few of these events until it fits nicely in our schedule.
- A lot of role-players choose not to be on Facebook or social media in general. Form an email list to keep them in the loop. You may be stunned by how many people that is. I certainly was!
- It’s a win-win for game stores: you attract RPG enthusiasts to different stores and show them new games to purchase and run for others. Don’t be afraid to approach them. Also, don’t be surprised at how many still want telephone calls to schedule or are bad at email conversations. I like reaching out with Facebook Messenger, as that tool trains businesses to reply promptly. It has been very effective, for the most part.
- Giving prizes to first-time attendees and to Game Masters for running games has been far less effective at attracting players than I had hoped. I’m always giving away full games to new players, and it doesn’t seem to really sway whether they return or not. Convenience seems to be king.
Don’t have a game store nearby? Can’t find role-players locally? Have you thought about building an online community or joining something like the Gauntlet? An earlier conception of this idea, this movement, was to build a role-playing game group with a rotating Game Master dynamic. On G+, we called ourselves the Janus GM Project. The members would announce games they wanted to run for the next round of games (like 3 rounds a year) and then we would vote on our favorite titles per Game Master. Each Game Master then knew what game to start reading, with months to prepare, read up, or research the game. We’d play each game every other week for a 3-5 session story arc. It was a ton of fun and very effective! We played over twenty-five new games in about two and a half years.
What Are You Really Doing This For?
Maybe you’re a Game Master overloaded with games that just need to be played. Maybe you’re a game designer who wants to build an audience that will playtest and buy your game(s). Maybe you’re a new player looking for the game that is uniquely your fit, your niche.
Don’t settle for the status quo. Build a Games on Demand community where you live! Grow the community YOU want to be a part of. Build something for the future players of your neighborhood.
I’ll be there for you. Together, we can build a network that communicates and shares best practices. You just need to be persistent and share your love for RPGs with those near you. I believe you can do it, and this is a fun way to grow the hobby we love so dear.
Special thanks to the crew at the Tales of the 219 (Sout, Matt, Jared, Adrienne, Pedro, Tom), my local game stores (By the Board Games & Entertainment, The Librarium Cafe, Galactic Greg’s, Tenth Planet), and the RPG enthusiasts of the region! Thank you for the added joy to my life!
I want to know:
How have you brought people together in your local area?
How do you attract new players or sway your gaming group into trying something new?
Where do you game and who else could you invite?
This module is helpful for developers. It is often necessary to restrict code so that it runs only on a development site or only on a staging site, and this module makes that possible. Once configured, you can restrict your code to run on whichever environments you choose: Development, Staging, or Production (Live).
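The general pattern behind this kind of module can be sketched as follows. This is a minimal illustration of environment-gated code, not the module's actual API; the `APP_ENV` variable name and the environment names `development`/`staging`/`production` are assumptions for the example.

```python
import os

def run_only_on(allowed, func, env=None):
    """Call func() only when the current environment is in `allowed`.

    `env` defaults to the APP_ENV environment variable (a hypothetical
    setting for this sketch); anything else is treated as production.
    """
    current = env or os.environ.get("APP_ENV", "production")
    if current in allowed:
        return func()
    return None

# Example: enable a debug helper on development and staging only.
run_only_on({"development", "staging"}, lambda: print("debug tools on"),
            env="development")   # runs
run_only_on({"development", "staging"}, lambda: print("debug tools on"),
            env="production")    # skipped
```

The module presumably wraps this check in configuration, so the allowed environments are set in the admin UI rather than hard-coded.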
This module allows you to send warnings and more serious errors from the error log via email to a specified address. Optionally, every PHP log entry can be sent as well, since PHP entries generally indicate a real problem on the site.
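The selection logic such a module performs can be sketched generically. This is not the module's code; it is a hypothetical illustration of filtering log entries by severity, with an option to forward every PHP-type entry regardless of level. The entry format (dicts with `type` and `severity` keys) is an assumption for the example.

```python
import logging

def entries_to_email(entries, threshold=logging.WARNING, include_all_php=False):
    """Return the log entries worth emailing to the configured address.

    entries: list of dicts with 'type' and 'severity' (a logging level).
    include_all_php: if True, forward every 'php' entry regardless of
    severity, on the view that PHP entries usually signal a real problem.
    """
    selected = []
    for entry in entries:
        if entry["severity"] >= threshold:
            selected.append(entry)       # warning or worse: always email
        elif include_all_php and entry["type"] == "php":
            selected.append(entry)       # optional: email every PHP entry
    return selected

log = [
    {"type": "cron", "severity": logging.INFO,  "message": "Cron ran"},
    {"type": "php",  "severity": logging.INFO,  "message": "Undefined index"},
    {"type": "php",  "severity": logging.ERROR, "message": "Fatal error"},
]
entries_to_email(log)                        # only the ERROR entry
entries_to_email(log, include_all_php=True)  # both php entries
```

Actually emailing the selected entries would then be a separate step (e.g. via the site's mail system), omitted here.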