Newsfeeds

Lullabot: Decoupled Drupal Hard Problems: Schemas

Planet Drupal - 23 May 2018 - 8:59am

The Schemata module is our best approach so far for providing schemas for our API resources. Unfortunately, this solution is often not good enough. That is because the serialization component in Drupal is so flexible that we can’t anticipate the final form our API responses will take, meaning the schema that our consumers depend on might be inaccurate. How can we improve this situation?

This article is part of the Decoupled hard problems series. In past articles, we talked about request aggregation solutions for performance reasons, and how to leverage image styles in decoupled architectures.

TL;DR
  • Schemas are key for an API's self-generated documentation.
  • Schemas are key for the maintainability of the consumer’s data model.
  • Schemas are generated from Typed Data definitions using the Schemata module. They are expressed in the JSON Schema format.
  • Schemas are statically generated, but normalizers are determined at runtime.

Why Do We Need Schemas?

A database schema is a description of the data a particular table can hold. Similarly, an API resource schema is a description of the data a particular resource can hold. In other words, a schema describes the shape of a resource and the datatype of each particular property.

Consumers of data need schemas in order to set their expectations. For instance, the schema tells the consumer that the body property is a JSON object that contains a value that is a string. A schema also tells us that the mail property in the user resource is a string in the e-mail format. This knowledge empowers consumers to add client-side form validation for the mail property. In general, a schema will help consumers to have a prior understanding of the data they will be fetching from the API, and what data objects they can write to the API.
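
As an illustration, a trimmed-down JSON Schema fragment describing those two properties could look like the following (the schemas Schemata actually generates are more verbose, so treat this only as a sketch):

{
  "type": "object",
  "properties": {
    "body": {
      "type": "object",
      "properties": {
        "value": { "type": "string" }
      }
    },
    "mail": {
      "type": "string",
      "format": "email"
    }
  }
}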

We are using the resource schemas in Docson and Open API to generate automatic documentation. When we enable JSON API and Open API, we get a fully functional and accurately documented HTTP API for our data model. Whenever we make changes to a content type, they are reflected in the HTTP API and the documentation automatically, all thanks to the schemas.

A consumer could fetch the schemas for all the resources it needs at compile time, or fetch them once and cache them for a long time. With that information, the consumer can generate its models automatically without developer intervention. That means that with a single implementation, all of our consumers’ models are done forever. There is probably already a library for our consumer’s framework that does this.

More interestingly, since our schemas come with type information, they can be type safe. That is important for languages like Swift, Java, TypeScript, Flow, Elm, etc. Moreover, if the model in the consumer is auto-generated from the schema (one model per resource), then minor updates to the resource are automatically reflected in the model. We can start to use the new model properties in Angular, iOS, Android, etc.

In summary, having schemas for our resources is a huge improvement for the developer experience. This is because they provide auto-generated documentation of the API and auto-generated models for the consumer application.

How Are We Generating Schemas In Drupal?

One of Drupal 8's API improvements was the introduction of the Typed Data API. We use this API to declare the data types for a particular content structure. For instance, there is a data type for a Timestamp that extends an Integer. The Entity and Field APIs combine these into more complex structures, like a Node.
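
As a quick sketch (not part of the original article's code), we can inspect those Typed Data definitions on any content entity. The node ID used below is an assumption:

use Drupal\node\Entity\Node;

// Sketch: print the data type of every field on a node (assumes node 1 exists).
$node = Node::load(1);
foreach ($node->getFieldDefinitions() as $field_name => $definition) {
  // Prints pairs such as "nid: integer", "title: string", "created: created".
  print $field_name . ': ' . $definition->getType() . PHP_EOL;
}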

JSON API and REST in core can expose entity types as resources out of the box. When these modules expose an entity type, they do so based on the Typed Data and Field APIs. Since the process for exposing entities is known, we can anticipate the schemas for those resources.

In fact, assuming that resources are a serialization of the Field API and Typed Data is the only thing we can do. The base for JSON API and REST in core is Symfony's serialization component. This component is broken down into normalizers, as explained in my previous series. These normalizers transform Drupal's inner data structures into other, simpler structures. After this transformation, all knowledge of the data type or structure is lost, because the normalizer classes do not declare the new types and shapes the typed data has been transformed into. This loss of information is where the big problem lies with the current state of schemas.

The Schemata module provides schemas for JSON API and core REST. It does so by serializing the entity and its typed data. It is only able to do this because it knows about the implementation details of these two modules. It knows, for instance, that the nid property is an integer and that it has to be nested under data.attributes in JSON API, but not for core REST. If we were to support another format in Schemata, we would need to add an ad-hoc implementation for it.
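
For example, the same nid value surfaces differently depending on the module, which is why Schemata needs per-format knowledge (simplified, illustrative fragments):

In JSON API:

{
  "data": {
    "type": "node--article",
    "attributes": {
      "nid": 1
    }
  }
}

In core REST (JSON):

{
  "nid": [
    { "value": 1 }
  ]
}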

The big problem is that schemas are static information. That means that they can't change during the execution of the program. However, the serialization process (which transforms the Drupal entities into JSON objects) is a runtime operation. It is possible to write a normalizer that turns the number four into 4 or "four" depending on whether the time of execution falls on an even minute. Even though this example is bizarre, it shows that determining the schema upfront without other considerations can lead to errors. Unfortunately, we can’t assume anything about the data after it’s serialized.
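
To make that bizarre example concrete, here is a minimal sketch of such a normalizer (the class is hypothetical and deliberately contrived):

use Symfony\Component\Serializer\Normalizer\NormalizerInterface;

// Hypothetical normalizer: the output type changes at runtime, so no
// statically generated schema can describe it accurately.
class EvenMinuteNumberNormalizer implements NormalizerInterface {

  public function normalize($object, $format = NULL, array $context = []) {
    return ((int) date('i') % 2 === 0) ? 4 : 'four';
  }

  public function supportsNormalization($data, $format = NULL) {
    return $data === 4;
  }

}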

We can either make normalization less flexible—forcing data types to stay true to the pre-generated schemas—or we can allow the schemas to change during runtime. The second option clearly defeats the purpose of setting expectations, because it would allow a resource to potentially differ from the original data type specified by the schema.

The GraphQL community is opinionated on this and drives the web service from their schema. Thus, they ensure that the web service and schema are always in sync.

How Do We Go Forward From Here?

Happily, we are already trying to come up with a better way to normalize our data and infer the schema transformations along the way. Nevertheless, whenever a normalizer is injected, whether by a third-party contrib module or by an improved normalization shipped with backward compatibility, the Schemata module cannot anticipate it. In those scenarios, Schemata will potentially provide the wrong schema. If we are to base the consumer models on our schemas, then the schemas need to be reliable. At the moment they are reliable in JSON API, but only at the cost of losing flexibility with third-party normalizers.

One of the attempts to support data transformations, and the impact they have on the schemas, is Field Enhancers in JSON API Extras. They represent simple transformations via plugins. Each plugin defines how the data is transformed and how the schema is affected. This happens in both directions: when the data goes out, and when consumers write back to the API and the transformation needs to be reversed. Whenever we need a custom transformation for a field, we can write a field enhancer instead of a normalizer. That way, schemas will remain correct even when the transformation of the data implies a change in the schema.
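
Conceptually, a field enhancer bundles the outgoing transformation, its inverse for writes, and the schema fragment the transformed data conforms to. The following sketch shows the idea with a plain class; it is not the actual JSON API Extras plugin API, whose base classes, annotations, and method signatures differ:

// Conceptual sketch only, not the real plugin interface.
class DateTimeEnhancer {

  // Outgoing: turn the stored timestamp (an integer) into an RFC 3339 string.
  public function transform($value) {
    return date(\DateTime::RFC3339, $value);
  }

  // Incoming: reverse the transformation when a consumer writes back.
  public function undoTransform($value) {
    return strtotime($value);
  }

  // How the schema is affected: the property is no longer an integer.
  public function getOutputJsonSchema() {
    return ['type' => 'string', 'format' => 'date-time'];
  }

}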


We are very close to being able to validate responses in JSON API against schemas when Schemata is present. It will only happen in development environments (where PHP’s asserts are enabled). Site owners will be able to validate that the schemas are correct for their site, with all their custom normalizers. That way, when a site owner builds an API or makes changes, they'll be able to validate the normalized resource against the purported schema. If there is any misalignment, a log message will be recorded.
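
A sketch of what that validation could look like, assuming the justinrainbow/json-schema library is available and that $response (the normalized resource, as an array) and $schema (the schema object obtained from Schemata) already exist; both variable names are assumptions for illustration:

use JsonSchema\Validator;

// Sketch: the closure only runs where PHP assertions are enabled (dev).
assert(call_user_func(function () use ($response, $schema) {
  $validator = new Validator();
  // Convert nested arrays to objects, as the validator expects.
  $data = json_decode(json_encode($response));
  $validator->validate($data, $schema);
  foreach ($validator->getErrors() as $error) {
    \Drupal::logger('schemata')->warning('@property: @message', [
      '@property' => $error['property'],
      '@message' => $error['message'],
    ]);
  }
  return $validator->isValid();
}));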

Ideally, we want the certainty that schemas are correct all the time. While the community converges on the best solution, we have these intermediate measures to gain reasonable certainty that our schemas are in sync with our responses.

Join the discussion in the #contenta Slack channel or come to the next API-First Meeting and show your interest there!

Note: This article was originally published on November 3, 2017. Following DrupalCon Nashville, we are republishing (with updates) some of our key articles on decoupled or "headless" Drupal as the community as a whole continues to explore this approach further. Comments from the original will appear unmodified.

Hero photo by Oliver Thomas Klein on Unsplash.

Categories: Drupal

Chromatic: DrupalCon Nashville Recap

Planet Drupal - 23 May 2018 - 8:56am

It’s hard to believe DrupalCon Nashville was over a month ago! We have been busy here at Chromatic ever since, but we wanted to give a recap of the conference from our point of view.

Categories: Drupal

Late Pledges Open For The Reckoners

Tabletop Gaming News - 23 May 2018 - 8:00am
Kickstarters come and go. It’s easy to miss one in all the shuffle. Thankfully, there are usually Late Pledge opportunities. However, those don’t last forever, either. So, if you missed out on getting The Reckoners as part of the Kickstarter, don’t miss out on also getting it during the Late Pledge time. From the website: Games […]
Categories: Game Theory & Design

National Geographic's brilliant cover

Dries Buytaert - 23 May 2018 - 7:55am

One of the best covers I've seen. Iconic!

Categories: Drupal

Doxie Dash Card Game Up On Kickstarter

Tabletop Gaming News - 23 May 2018 - 7:00am
Ooooooooooh, I can’t get a long, little doggie. I can’t even get one that’s small. I can’t get a long, little doggie. I can’t get a doggie at all! Thank you, Yosemite Sam, for that. Doxie Dash is a game all about long, little doggies. Play as a dachshund (or doxie, for short) and do […]
Categories: Game Theory & Design

Gnome Stew Notables – Meguey Baker

Gnome Stew - 23 May 2018 - 6:39am

Welcome to the next installment of our Gnome Spotlight: Notables series. The notables series is a look at game developers in the gaming industry doing good work. The series will focus on game creators from underrepresented populations primarily, and each entry will be a short bio and interview. We’ve currently got a group of authors and guest authors interviewing game creators and hope to bring you many more entries in the series as it continues on. If you’ve got a suggestion for someone we should be doing a notables article on, send us a note at headgnome@gnomestew.com. – Head Gnome John

Meet Meguey

Meguey Baker is a quilter, roleplaying game designer, and independent publisher. Notably she has created Apocalypse World, A Thousand and One Nights, and Psi-Run.

Talking With Meguey

1) Tell us a little bit about yourself and your work.

I am a huge history fan and work in as many history museums as I can possibly fit in my life, maybe more. I love stories and storytelling, and find people endlessly fascinating. These two things combine to inform my game design. Flavor that with raising 3 kids, being a sex ed teacher, and living in New England, and it’s pretty much me. Also, rocks are the best thing after toast, which is the best thing. People are not things, so there’s that.

2) What project are you most proud of?

I am very proud of the work I did with the Girl Effect to help re-weave connections between girls in Ethiopia. Working with Jessica Hammer and Julia Ellingboe and Giulia Barbano and John Stavropoulos was just great, and I was able to put so much of my whole self into the work, combining dance and song, sex education work, and shaping and holding space for people to discover their own voices. I am also very pleased with the way that Apocalypse World has helped so many people find a framework to bring forward their own vision and share their designs. Outside of game design, I’m glad to have had the chance to do conservation work on some very challenging and historically valuable textiles, and to spend 12 years helping families navigate postpartum depression.

3) What themes do you like to emphasize in your game work?

Throughout everything, the underlying principle of my work is to seek underrepresented voices and stories and find ways to amplify and share them. So in game design, I think a lot about things like accessibility to diverse players, play in different settings, play across generations and languages and other cultural division lines. The last big project I did before Apocalypse World 2nd ed was Playing Nature’s Year, which is a good example of this desire to find new places and play games with new people. I also sometimes have a sharpness in the game that is unexpected; people have looked at 1001 Nights thinking it’s all fantastic magical stories, which is true, but it’s also potentially cut-throat small group politics.

4) What mechanics do you like best in games?

2d6, add a stat. Ok, beyond being flippant, I like mechanics that are designed with that exact game in mind, that are not just "Oh, I’ll use 2d6, add a stat, that’ll work" because sometimes it really won’t. The mechanics of the game should be thoughtfully designed the same way the setting of the artwork or the staging of the play is designed; sometimes they are very, very similar, and sometimes really not.

5) How would you describe your game design style?

Search for the clearest way to facilitate the most compelling story, then do that. Be ready to kill my darlings, set things aside, and re-interpret my ideas until the game does what I want it to do.

6) How does gender fit into your games?

Everyone should be able to play my games and find themselves in them. If they can’t, I better be exceedingly intentionally clear about that and understand why I did that. That said, I default to a feminist viewpoint on the world, including gender issues, so that is reflected in my games.

7) Do your kids influence your game design?

Parenting is a big deal and an influential part of our design process, first as parents of young kids looking for ways to play in brief chunks of time that had to occasionally flex around kid needs, and now as the kids all become young adults and we are gaming with them and watching them in their own design process. They and their friends in the Baker House Band are now one of our primary test groups, and we have increasingly good design relationships with them as we take game design on as a family business. Later this summer, our son Elliot intends to Kickstart his game Tiny&Chrome, which is a mini Lego road war game and entirely his own design. That is super gratifying as a parent, and a rare treat as a designer to see someone’s development as a game designer so closely, from “pretend we are dinosaurs!” as a toddler through to young adulthood.

One of my other main areas of work is in small local history museums. The way that we tell and reinterpret the story of the past has always intrigued me, and roleplaying games are one way to revisit the same story from different angles and through different lenses: what if Jane Austen was a zombie hunter? What if the steam age ran on magic? What if we centered Native American narratives in looking at North American history? What would an original story of the Lawrence mill strikes look like? Could I play those people? Could I design games to support those stories? Uncovering the connections between one aspect of history and other aspects, like the change in local textile use after the arrival of the railroad, is such an incredibly interesting design space for me. I’m working now on a major exhibit on the industrial history of our town, and I’m designing site-specific games to engage visitors with their local history. I’m excited to see what comes next.

8) How did you get into game design?

Star Wars came out and I wanted to play a Jedi, so I asked my DM if we could play D&D but in Star Wars. She said yes. I was 7. She was 10. Her brother was 9 and made us lightsabers out of paper towel tubes. Seriously. I don’t remember not writing and designing games and plays and stories. Once it was the 1990s and print shops and copiers existed, it was a matter of time, and then when the internet and .pdfs existed, it was a sure thing. I designed 1001 Nights in notebooks while strolling around town with my youngest baby, then typed it up and sent it out into the world.

9) What one thing would you change in gaming?

I would somehow give access and time to queer folks, women, people of color, non-English speakers looking to connect with English speaking audiences, and folks with no regular time to sit at a computer and write or take part in on-line community. In practical terms, I want conventions to be cheaper and more accessible for more people. In magic wand terms, I want states and nations to value storytelling and connecting to other people enough to fund major NEA-type projects to get role playing games to every school and every library and every museum and community center and summer camp in the world.

10) What are you working on now?

just launched in Kickstarter, so that’s the current big focus. After that, there’s this suite of site-specific games I’m working on that’s pretty fun and has some applicability in the “magic wand” area I mentioned above.

11) Who/What games are some of your influences?

AD&D, first and foremost. Shadowrun and Cyberpunk and Ars Magica. Lots of school yard games and lots of old string games and puzzle games. I played games with my sister non-stop as a kid, so she’s a big influence, even though we haven’t played together in 30 years except in fleeting moments. We live nearby, but life is very full, and playing games with family is vulnerable and tricky sometimes. Emily Care Boss, of course, and Vincent Baker. Esther Clinton, who no-one knows about, but who is a folklorist and was crucial to my college experience of gaming. More recently, I found a lot of fruitful ground in Dialect, by Kathryn Hymes and Hakan Seyalioglu, Tree House Dreams by Gray Pawn Games and Alas for the Awful Sea! by Storybrewers Veronica and Haley, and Ben Dutter’s Perseverant.

Thanks for joining us for this entry in the notables series. You can find more in the series here, and please feel free to drop us any suggestions for people we should interview at headgnome@gnomestew.com.

 

Categories: Game Theory & Design

User Statistics

New Drupal Modules - 23 May 2018 - 6:35am

Provides user statistics such as post count and login count, along with IP address tracking.

Categories: Drupal

Paizo Previews the Wizard Class From Pathfinder

Tabletop Gaming News - 23 May 2018 - 6:00am
One of the big news stories of late has been the announcement that Pathfinder is getting a new edition. But what’s changing? Well, the core classes are getting an overhaul. So, what can you expect? At PaizoCon, players will get a chance to see these new classes first-hand, but even those of us who won’t be in […]
Categories: Game Theory & Design

Datadog P

New Drupal Modules - 23 May 2018 - 5:34am

Integrates Datadog into Drupal.

Categories: Drupal

Axelerant Blog: Women at Axelerant: Chapter Two

Planet Drupal - 23 May 2018 - 4:23am


I sat down to speak with the amazing women of Axelerant, and they each shared their unique perspectives about what it's like being professionals in their field. In this chapter, Mridulla, Akanksha, Sabreena, and Nikita expound on this—and in their own words.

Categories: Drupal

The RPGnet Interview: David Donachie, Solipsist

RPGNet - 23 May 2018 - 12:00am
A talk with the designer of Solipsist and a contributor to many others.
Categories: Game Theory & Design

Gizra.com: Understanding Media Management with Drupal Core

Planet Drupal - 22 May 2018 - 10:00pm

But I just want to upload images to my site…

There is a clear difference between what a user expects from a CMS when they try to upload an image, and what they get out of the box. This is something that we hear all the time, and yet we, as a Drupal community, struggle to do it right.

There are no simple answers as to why Drupal has issues with media management. As technology evolves, newer and simpler tools raise the bar on what users expect to see in their apps. Take Instagram, for example: an entire team of people (not just devs) is focused on making the experience as simple as possible.

Therefore, it’s natural to expect that everyone wants this type of simplicity everywhere. However, implementing these solutions is not always trivial, as you will see.

Continue reading…

Categories: Drupal

Virtuoso Performance: Configuring migrations via a form

Planet Drupal - 22 May 2018 - 7:29pm
By mikeryan, Tuesday, May 22, 2018 - 09:29pm

Frequently, there may be parts of a migration configuration which shouldn’t be hard-coded into your YAML file - some configuration may need to be changed periodically, some may vary according to environment (for example, a dev environment may access a dev or test API endpoint, while prod needs to access a production endpoint), or you may need a password or other credentials to access a secure endpoint (or for a database source which you can’t put into settings.php). You may also need to upload a data file for input into your migration. If you are implementing your migrations as configuration entities (a feature provided by the migrate_plus module), all this is fairly straightforward - migration configuration entities may easily be loaded, modified, and saved based on form input, implemented in a standard form class.

Uploading data files

For this project, while other CSV source files were static enough to go into the migration module itself, we needed to periodically update the blog data during the development and launch process. A file upload field is set up in the normal way:

$form['acme_blog_file'] = [
  '#type' => 'file',
  '#title' => $this->t('Blog data export file (CSV)'),
  '#description' => $this->t('Select an exported CSV file of blog data. Maximum file size is @size.', ['@size' => format_size(file_upload_max_size())]),
];

And saved to the public file directory in the normal way:

$all_files = $this->getRequest()->files->get('files', []);
if (!empty($all_files['acme_blog_file'])) {
  $validators = ['file_validate_extensions' => ['csv']];
  if ($file = file_save_upload('acme_blog_file', $validators, 'public://', 0)) {

So, once we’ve got the file in place, we need to point the migration at it. We load the blog migration, retrieve its source configuration, set the path to the uploaded file, and save it back to active configuration storage.

    $blog_migration = Migration::load('blog');
    $source = $blog_migration->get('source');
    $source['path'] = $file->getFileUri();
    $blog_migration->set('source', $source);
    $blog_migration->save();
    drupal_set_message($this->t('File uploaded as @uri.', ['@uri' => $file->getFileUri()]));
  }
  else {
    drupal_set_message($this->t('File upload failed.'));
  }
}

It’s important to understand that get() and set() only operate directly on top-level configuration keys - we can’t simply do something like $blog_migration->set('source.path', $file->getFileUri()), so we need to retrieve the whole source configuration array and set the whole array back on the entity.

Endpoints and credentials

The endpoint and credentials for our event service are configurable through the same webform. Note that we obtain the current values from the event migration configuration entity to prepopulate the form:

$event_migration = Migration::load('event');
$source = $event_migration->get('source');
if (!empty($source['urls'])) {
  if (is_array($source['urls'])) {
    $default_value = reset($source['urls']);
  }
  else {
    $default_value = $source['urls'];
  }
}
else {
  $default_value = 'http://services.example.com/CFService.asmx?wsdl';
}
$form['acme_event'] = [
  '#type' => 'details',
  '#title' => $this->t('Event migration'),
  '#open' => TRUE,
];
$form['acme_event']['event_endpoint'] = [
  '#type' => 'textfield',
  '#title' => $this->t('CF service endpoint for retrieving event data'),
  '#default_value' => $default_value,
];
$form['acme_event']['event_clientid'] = [
  '#type' => 'textfield',
  '#title' => $this->t('Client ID for the CF service'),
  '#default_value' => @$source['parameters']['clientId'] ?: 1234,
];
$form['acme_event']['event_password'] = [
  '#type' => 'password',
  '#title' => $this->t('Password for the CF service'),
  '#default_value' => @$source['parameters']['clientCredential']['Password'] ?: '',
];

In submitForm(), we again load the migration configuration, insert the form values, and save:

$event_migration = Migration::load('event');
$source = $event_migration->get('source');
$source['urls'] = $form_state->getValue('event_endpoint');
$source['parameters'] = [
  'clientId' => $form_state->getValue('event_clientid'),
  'clientCredential' => [
    'ClientID' => $form_state->getValue('event_clientid'),
    'Password' => $form_state->getValue('event_password'),
  ],
  'startDate' => date('m-d-Y'),
];
$event_migration->set('source', $source);
$event_migration->save();
drupal_set_message($this->t('Event migration configuration saved.'));

Note that we also reset the startDate value while we’re at it (see the previous SOAP blog post).

Tags: Drupal Planet, Drupal, Migration

Use the Twitter thread below to comment on this post:

Configuring migrations via a form https://t.co/EZTiUKBazX

— Virtuoso Performance (@VirtPerformance) May 22, 2018

 

Categories: Drupal

Kalamuna Blog: Drupalistas Spent Our Entire Swag Budget. Where did the Money Go?

Planet Drupal - 22 May 2018 - 3:09pm

This April at DrupalCon Nashville, in addition to wanting to meet colleagues and soak up the great talks, we wanted to create a forum for the international Drupal community to do good. That’s why we used our sponsor booth wall as a space for attendees to promote nonprofits that work for causes that matter to them.

Categories: Articles, Community, Drupal, Nonprofits. Author: Shannon O'Malley
Categories: Drupal

CKEditor Text Transform

New Drupal Modules - 22 May 2018 - 2:45pm

This module integrates the CKEditor Text Transform Selection plugin.

Categories: Drupal

New Releases Available For Avatars of War

Tabletop Gaming News - 22 May 2018 - 2:00pm
It might be called the Empire of Men, but there are some kick-ass women lining up on the battlefield, too. This month’s releases from Avatars of War include the Warrior Priestess and Sunna of Sonnstahl. To celebrate these ladies’ arrival, everything for the Empire of Men faction is on sale for 15% off this week. From […]
Categories: Game Theory & Design

EA acquires GameFly's cloud gaming service

Social/Online Games - Gamasutra - 22 May 2018 - 1:54pm

Electronic Arts announced its acquisition of the cloud gaming technology assets and personnel of GameFly today. ...

Categories: Game Theory & Design

Email To Image

New Drupal Modules - 22 May 2018 - 1:23pm

The Email To Image module counters email scraping, the technique of harvesting email addresses from page source code in order to send spam. It encrypts the email address and the CSS style data, generates an image from that data, and replaces the email text with the generated hash and image. When the user clicks the mailto link, the email address is therefore never passed as plain text to the local email client in the OS; only the encrypted hash is sent to a custom form that takes care of sending the email.

Categories: Drupal

BYU Slideshow

New Drupal Modules - 22 May 2018 - 1:03pm

Allows the user to add an image slideshow paragraph type for embedding slideshows.

Categories: Drupal

Rebel Minis Releases Qwik: A Game of the Wastelands

Tabletop Gaming News - 22 May 2018 - 1:00pm
Rebel Minis is pleased to announce that they’ve released Qwik: A Game of the Wastelands. It’s a new fantasy sports game set in a post-apocalyptic world. Hey, when not running away from nuclear mutants, you still wanna just play around some, right? Grab the dog skull and get in there! From the announcement: Rebel Minis […]
Categories: Game Theory & Design


Subscribe to As If Productions aggregator