
Newsfeeds

Services Address Field

New Drupal Modules - 20 January 2015 - 9:34pm

Resources to assist in the usage of Address Field with Services.

Categories: Drupal

ThinkShout: The Clutter and the Deceptively Simple

Planet Drupal - 20 January 2015 - 4:00pm

2015 is poised to be a great year for nonprofit technology and the adoption of digital tools to advance the causes we love. While I can’t say that I see too many groundbreaking innovations on the immediate horizon, I do believe that this will be a year of implementation and refinement. Building upon trends that we saw arise last year in the consumer industry and private sector, 2015 will be the year that many nonprofits leap into digital engagement strategies and begin to leverage new tools that will create fundamental change in the way that they interact with their constituencies.

Of course, as always happens when a growing sector first embraces new tools, the nonprofit technology world will see more than its fair share of awkward clunkiness this year, mainly as "software as a service" product companies rebrand their offerings for nonprofits and flash shiny objects at the earnest and hungry organizations we all support.

But as a more general and appealing trend, I believe that we’ll see a slimming down and a focus on polish this coming year. Visual storytelling and "long form" journalism are hopefully on the rise in the nonprofit digital world. We should see more, and better, integrations between web applications, data management systems, and social networks. These integrations will power more seamless and personalized user experiences. Rather than tossing up an incongruent collection of web interfaces and forms delivered by different paid service platforms, nonprofits will be able to present calls-to-action through more beautiful and less cumbersome digital experiences.

Below are some additional thoughts regarding the good stuff (and some of the bad) that we’re likely to see this year. If you have any additional predictions, please share your thoughts!

Visual Storytelling and the Resurgence of Long-Form Journalism

I don’t know about you, but I’m tired of my eyeballs popping out of my head every time I visit a nonprofit’s homepage that attempts to cram 1,000 headlines above the fold. I’m tired of the concept of "a fold" altogether. And don’t get me started on slideshow carousels as navigation: It’s 2015, folks!

Fortunately, we are seeing an elegant slowdown in the pace of writing for the web. Audiences are getting a little more patient, particularly when presented with clean design, pleasing typography, and bold imagery. We’re also seeing nonprofits embrace visual storytelling, investing in imagery and content over whistles and bells and widgets.

Medium and Exposure are my two favorite examples of impactful long-form journalism and visual storytelling on the web. These deceptively simple sites leverage cutting-edge JavaScript and other complex technologies to get out of the way and let content and visuals speak for themselves.

As an added benefit, adopting this more long-form storytelling approach may help your SEO. Google took bold steps in late 2014 to reward websites that focus on good content. With its release of Panda 4.1, the latest update to its content-quality algorithm, nonprofits that prioritize long-form writing and quality narrative will start to see significant benefits.

We’re already seeing nonprofits adopt this approach, including one of my new favorites, The Marshall Project. This site cuts away the usual frills and assumes an intelligent audience that will do the work to engage with the content. Don’t get me wrong: The Marshall Project website is slick and surprisingly complex from an engineering and user experience perspective – but its designers have worked hard to bring the content itself to the surface as the most compelling call-to-action.

Interconnectivity

2015 will be a big year for APIs in the CMS space. Teasing out those acronyms, we will see content management systems, like Drupal and WordPress, release powerful tools allowing them to talk with other web applications and tools. Indeed, a new web services layer is a central and much-anticipated feature of the upcoming Drupal 8 release. WordPress made similar strides late last year with the early release of its own REST API.

Leveraging these APIs, 2015 will bring the nonprofit sector more mobile applications that share data and content with these organizations’ websites. The costs for developing these integrations should decrease relative to the usefulness of such solutions, which will hopefully lead to more experimentation and mobile investment among nonprofits. And as mentioned previously, because these new applications will have access to more constituent data across platforms, they will lend themselves to more robust and personalized digital experiences.

On the less technical and more DIY front, 2015 will be marked by the maturation of 3rd-party services that allow non-developers to integrate their online tools. In its awesome post about technology trends in 2015, the firm Frog Design refers to this development as the "emergence of the casual programmer." Services like Zapier and my new favorite, IFTTT, will allow nonprofits to make more out of social networks and services like Google Apps, turn disparate data into actionable analytics, see the bigger picture across networks, and make more data-driven decisions.

More Big (And Perhaps Clunky) Web Apps

If you’ve been following ThinkShout for a while now, you probably know that we are big fans of Salesforce because of its great API and commitment to open data. We maintain the Salesforce Integration Suite for Drupal. At this point, the majority of our client work involves some sort of integration between the Drupal CMS and the Salesforce CRM.

As proponents of data-driven constituent engagement, we couldn’t be more excited to see the nonprofit sector embrace Salesforce and recognize the importance of constituent relationship management (CRM) and CRM-CMS integration. Because of the power of the Salesforce Suite, we can build powerful, gorgeous tools in Drupal that sync data bidirectionally and in real time with Salesforce.

That said, part of the rise of Salesforce in the nonprofit sector over the last two years has been driven by the vacuum created by Blackbaud’s purchase of Convio. And now, with the recent releases of Salesforce’s NGO Connect and Blackbaud’s Raiser’s Edge NXT, both "all-in-one" fundraising solutions with limited website integration potential (in my opinion…), we’re going to see more and more of an arms race between these two companies as they try to "out-featurize" each other in marketing to nonprofits. In other words, in spite of the benefits of integrating Drupal and Salesforce, we’re going to see big nonprofit CRM vendors like Salesforce and Blackbaud push competing solutions that try to do everything in their own proprietary and sometimes clunky ecosystems.

The Internet of Things

The Internet of Things (IoT), or the interconnectivity of embedded Internet devices, is not a new concept for 2015. We’ve seen the rise of random smart things, from TVs to refrigerators, over the last few years. While the world’s population is estimated to reach 7.7 billion in 2020, the number of Internet-connected devices is predicted to hit 26 billion that same year. Apple’s announcement of its forthcoming Watch last year heralded the first meaningful generation of wearable technology. Of course, that doesn’t necessarily mean that you’ll want to wear this stuff just yet, depending upon your fashion sense...

(Image from VentureBeat’s coverage of the 2015 Consumer Electronics Show last week. Would you wear these?)

However, the advent of the wearable Internet presents many opportunities to the nonprofit sector, both as a delivery device for micro-campaigns and targeted appeals, and as a tool for collecting information about an organization’s constituency. Our colleagues at BlueSpark Labs recently wrote about how these technologies will allow organizations to build websites that are really "context-rich systems." For example, with an Internet-connected watch synced up to a nonprofit’s website, that organization could potentially monitor a volunteer athlete’s speed and heart rate during a workout. These contextualized web experiences could drive deeper feelings of commitment among donors and other nonprofit supporters.

(Fast Company envisions how the NY Times might cover election results on the Apple Watch.)

Privacy and Security

While not exactly a trend in nonprofit technology, I will be interested to see how the growing focus on Internet privacy and security will affect online fundraising and digital engagement strategies this year.

(A poster for the film The Interview, which, as most of you probably know, incited a major hack of Sony Pictures and spurred international dialogue about cybersecurity.)

We are seeing more and more startups providing direct-to-consumer privacy and security offerings. This last year, Apple released Apple Pay, which adds security, as well as convenience, to both online and in-person credit card purchases. And Silent Circle just released Blackphone, an encrypted cell phone with a sophisticated and secure operating system built on top of the Android platform.

How might this focus on privacy and security affect the nonprofit sector? It’s hard to say for sure, but nonprofits should anticipate the need to pay for more routine security audits and best practices regarding maintenance of their web properties, especially as these tools begin to collect and leverage more constituent data. They should also consider how their online fundraising tools will begin to support new online payment formats, such as Apple Pay, as well as virtual currencies like Bitcoin.

And Away We Go…

At ThinkShout, we’ve already rolled up our sleeves and are excitedly working away to implement many of these new strategies and approaches for our clients in 2015. What are you looking forward to seeing in the world of nonprofit tech this year? What trends do you see on the horizon? Let us know. And consider swinging by the "Drupal Day for Nonprofits" event that we’re organizing on March 3rd in Austin, TX, as part of this year’s Nonprofit Technology Conference. We hope to dream with you there!

Categories: Drupal

Drupal core announcements: MidWest Developers Summit

Planet Drupal - 20 January 2015 - 11:33am
Start: 2015-08-12 (All day) - 2015-08-15 (All day) America/Chicago
Sprint Organizers: gdemet, webchick, xjm

This is a placeholder to get MWDS on the calendar.
Wednesday August 12 - Saturday August 15

All sprint days. No sessions.
Focus on getting Drupal 8 released (and some key contrib ports to Drupal 8).

More details soon.

Will be hosted in Chicago at http://palantir.net/
2211 N Elston Avenue
Suite 400
Chicago, Illinois 60614

Categories: Drupal

Video: 'Reclaiming my soul' or 'Why I quit making F2P social games'

Social/Online Games - Gamasutra - 20 January 2015 - 11:22am

Experienced game maker Caryl Shaw (EA, Kixeye, Ngmoco, Telltale Games) shares what she's learned in transitioning back and forth between AAA and F2P development in this GDC Next 2014 talk. ...

Categories: Game Theory & Design

Reminder: Just over 24 hours left to register early for GDC 2015

Social/Online Games - Gamasutra - 20 January 2015 - 9:57am

With just over a day left until early registration for GDC 2015 ends on January 21, conference organizers are encouraging anyone interested in attending to register now at the discounted rate. ...

Categories: Game Theory & Design

Redirect Token

New Drupal Modules - 20 January 2015 - 9:14am

Use tokens in "To" paths in Redirect module.

See https://www.drupal.org/node/1331582

Configure Redirect as normal

Go to /admin/config/search/redirect and add or edit a redirect.

Example of path replacement

From: account/edit
To: user/[current-user:uid]/edit

Categories: Drupal

Shomeya: Model Your Data with Drupal and Domain-driven Design

Planet Drupal - 20 January 2015 - 9:00am

One of the things I've blogged about recently when talking about my upcoming book Model Your Data with Drupal is domain-driven design. While domain-driven design is important and something that I hope to touch on in the future, I've decided it's too much to cover in one book, and I'm refocusing Model Your Data with Drupal on basic object-oriented principles.

If you want to know more about this decision and what will be covered in Model Your Data with Drupal, read on.

Categories: Drupal

Drupal Watchdog: Something Borrowed, Something Drupal

Planet Drupal - 20 January 2015 - 8:59am
Feature


Drupal 8 represents a radical shift, both technically and culturally, from previous versions. Perusing the Drupal 8 code base, you may find many parts unfamiliar. One bit in particular, though, is especially unusual: a new directory named /core/vendor. What is this mysterious place, and who is vending?

The "vendor" directory represents Drupal's largest cultural shift. It is where Drupal's 3rd party dependencies are stored. The structure of that directory is a product of Composer, the PHP-standard mechanism for declaring dependencies on other packages and downloading them as needed. We won't go into detail about how Composer works; for that, see my article in the September 2013 issue of Drupal Watchdog, Composer: Sharing Wider.

But what 3rd party code are we actually using, and why?

Crack open your IDE if you want, or just follow along at home, as we embark on a tour of Drupal 8's 3rd party dependencies. (We won't be going in alphabetical order.)

Guzzle

Perhaps the easiest to discuss is Guzzle. Guzzle is an HTTP client for PHP; that is, it allows you to make outbound HTTP requests with far more flexibility (and a far, far nicer API) than using curl or some other very low-level library.

Drupal had its own HTTP client for a long time... sort of. The drupal_http_request() function has been around longer than I have, and served as Drupal's sole outbound HTTP utility. Unfortunately, it was never very good. In fact, it sucked. HTTP is not a simple spec, especially HTTP 1.1, and supporting it properly is difficult. drupal_http_request() was always an afterthought, and lacked many features that some users needed.
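To make the contrast concrete, here is a minimal sketch of an outbound request in Drupal 8 using the Guzzle-backed http_client service; the URL, the module name in the logger channel, and the bare-bones error handling are placeholders for illustration, not anything prescribed by Drupal core.

<?php
// A minimal sketch, not production code: fetch and decode JSON using
// Drupal 8's Guzzle-backed http_client service. The URL is a placeholder.
try {
  $response = \Drupal::httpClient()->get('https://example.com/api/items', [
    'timeout' => 5,
  ]);
  // Cast the response body to a string before decoding.
  $items = json_decode((string) $response->getBody(), TRUE);
}
catch (\Exception $e) {
  // 'mymodule' is a hypothetical logger channel.
  \Drupal::logger('mymodule')->error($e->getMessage());
}

Compare that with hand-rolling headers, redirects, and timeouts yourself, and the appeal of a dedicated client is obvious.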

Categories: Drupal

The Game Outcomes Project, Part 4: Crunch Makes Games Worse - by Paul Tozour

Gamasutra.com Blogs - 20 January 2015 - 8:39am
Part 4 in a 5-part series analyzing the results of the Game Outcomes Project survey, which polled hundreds of game developers to determine how teamwork, culture, leadership, production, and project management contribute to game project success or failure.
Categories: Game Theory & Design

Skewmesh Tutorial: Remove Skewing in Normal map bakes - by Peter Kojesta

Gamasutra.com Blogs - 20 January 2015 - 8:38am
This is a technique we developed to remove skewing on normal maps for mechanical models. It completely eliminates poorly skewed normals.
Categories: Game Theory & Design

Doctor Who's Effect on Narrative and Episodic Storytelling - by Josh Bycer

Gamasutra.com Blogs - 20 January 2015 - 8:37am
The hit science fiction show Doctor Who is the foundation for today's discussion on episodic storytelling and what game designers can learn about how the show has altered its narrative structure and pacing over the years.
Categories: Game Theory & Design

Loot Quest: From Ruminations to Release - by Elliot Pinkus

Gamasutra.com Blogs - 20 January 2015 - 8:37am
After nearly 2 years of development, it's interesting to look back at my old Moleskine notebook with the 6 scribbled pages where Loot Quest was conceived. I'll reflect on how far it has come and where the iterations took us.
Categories: Game Theory & Design

On downloadable press kits and info the press needs - by Lena LeRay

Gamasutra.com Blogs - 20 January 2015 - 8:36am
I tweeted yesterday that if I get a game pitch with a press kit that needs to be downloaded, I can be almost certain that the info I need isn't in it. That doesn't mean you should load the email down with tons of info. Here's an elaboration.
Categories: Game Theory & Design

Let’s Actually Do Something about Internet Hate - by Livio De La Cruz

Gamasutra.com Blogs - 20 January 2015 - 8:34am
Practical tips for developers, community managers, and anyone else who wants to help solve the growing problem of internet hate in our communities.
Categories: Game Theory & Design

Design dilemma in prop behaviour - by Peter Kjaer

Gamasutra.com Blogs - 20 January 2015 - 8:32am
What happens to a prop when a block it stands on is temporarily removed and later reappears? This poses an interesting design challenge that has a big impact on both gameplay and implementation. I give you my thoughts and would love to hear yours!
Categories: Game Theory & Design

The North American Conference on Video Game Music: Q&A with Keynote Speaker Winifred Phillips - by Winifred Phillips

Gamasutra.com Blogs - 20 January 2015 - 8:30am
An excerpt from the live Q&A session held on Jan. 18, 2015 during the North American Conference on Video Game Music. Topics include game music production, career building, live performance and issues related to game music study.
Categories: Game Theory & Design

Reclaiming the Caves on the Borderlands

New RPG Product Reviews - 20 January 2015 - 8:29am
Publisher: Sacrosanct Games
Rating: 2
I picked this up because I do enjoy seeing what others can do with such well-trodden ground as the Keep and the Caves of Chaos. The cover claims to be 5th Edition compatible and uses the current OGL to get there. Personally, if I were a publisher, I would stay away from this; it is murky legal ground right now, and not ground I would tread on. But let's move on.
The cover is nice and drew me in right away. The book is 24 pages, but subtract 1 for the cover, 1 for the OGL, 1 for a blank page, 1 for an ad, 4 for maps, and 1 more for a character sheet. So, 15 pages of text.

The Good: There is a good section on pages 4 and 5 on playing humanoid races such as orcs, bugbears, gnolls, and so on. Just the stats, nothing really on "how to play them." No big deal; these have been the standard baddies for the last 40 years. We know them.
There is detail on how the caves are controlled and what the expected losses of the various groups of humanoids living in the caves are over time.

The Not As Good For Me: The caves and the rooms themselves are not detailed. There are blanks left for the DM to write in what is there, from monsters to items. The main conceit here is that the inhabitants of the Keep have now taken over the Caves. It is all very sandboxy, which is fine, but not what I was expecting. I am perfectly fine with sandboxes, but that is not why I buy PDFs. I buy graph paper for that.

The Bad: This PDF uses scanned images from the original map of the Caves of Chaos from B2. It has been run through Photoshop and some alterations have been made, but I can overlay a scan of the blue/white Caves map and line it up perfectly (including grids) with the "Reclaiming" maps. Not very professional at all.

The Ugly: Additionally, there is a really bad scan of the old D&D Basic-era character sheet. It has been edited (poorly) to make it more in line with 5th Edition, but honestly it is just plain ugly. The artist would have been better off starting from scratch and making a 5e sheet that looked a bit like the Basic one rather than including this. Better still would be not to include one at all. It is just ugly, shows really poor Photoshop skills, and is a copyright infringement to boot.
So in the end, despite some promise and high hopes, this falls really flat.
Categories: Game Theory & Design

Blink Reaction: Give me a Swiss knife, Pleeease!

Planet Drupal - 20 January 2015 - 8:25am
All the annoying CSS stuff we don't want to do, in one tool. One tool for stylesheets.
Categories: Drupal

EchoDitto Tech Blog: Code Management in Drupal 7 using Features, Ctools, and Panels

Planet Drupal - 20 January 2015 - 8:22am

Code structure is something most Drupal developers wrestle with. There are tons of modules out there that make our lives easier (Views, Display Suite, etc.), but managing database configuration while maintaining a good workflow is no easy task. Today I'm going to talk about a few approaches I use in my work here at Echo. We will be using a simple use case of creating a paginated list of blog posts. To start, we're going to talk about the workflow from a high level, then we'll get into the modules that leverage Drupal in a way that makes sense. Finally, we'll have some code samples to help guide things along.

Workflow

This will vary a bit based on what you need, but the idea behind this is that we never want to redo our work. Ideally we'd like to design a View or a piece of functionality once on our local, then package it and push it up. Features is a big driving force behind this. Beyond that, we want things like page structures and custom code to have a place to live that makes sense. So, for this example we will be considering the idea of a paginated list of blog posts. This is a heavy hammer to be swinging at such a solved task, but we will get into why this is good later on.

  • Create a new Feature that requires ctools and panels (and not views!)
  • Open up the generated .module file and declare the ctool plugin directory
  • Create the plugins/content_types/blog_posts.inc file
  • Define the needed functions within blog_posts.inc to make it work
  • Add the newly created content type to a page in Page Manager
  • Add everything we care about to the Feature and export it for deployment
Installation

This only assumes that you have a working Drupal installation and some knowledge of how to install modules. In this case, we will be using drush to accomplish this, but feel free to pick your poison here. Simply run the following commands and answer yes when prompted.

drush dl ctools ds features panels strongarm
drush en ctools ds features panels strongarm page_manager

What we have done here is install and enable a strong foundation on which we can start to scaffold our site. Note that I won't be getting into folder structure too much, but there are some more steps before this you would have to take to ensure contrib, custom, and features all make it to their own place. We wave our hands at this for now.

Features

The first thing we're going to do is generate ourselves a Feature. Simply navigate to Structure -> Features -> Create Feature and you will see a screen that looks very similar to this. Fill out a name, and have it require ctools and panels for now.

This will generate a mostly empty feature for us. The important parts we want here are the ability to turn it on and off in the Features UI, and the structure (which we didn't have to create manually!): a .module and .info file, ready to go. That being said, we're going to open it up and tell it where to find the plugins. The code to do that is below, along with a screenshot of the directory structure to make sure you're on the right track. Go ahead and create the plugins directory and associated file as well.

function blog_posts_ctools_plugin_directory($owner, $plugin_type) {
  return 'plugins/' . $plugin_type;
}

Chaos Tools

Known more commonly as ctools, this is the module that gives us this plugin structure. For our purposes, we've already made the directory and file structure needed. Now all we have to do is create ourselves a plugin. There are three key parts to this: the plugin definition, the render function, and the form function. These are all defined in the .inc file mentioned above. There are plenty of resources online that get into the details, but basically we're going to define everything that gets rendered in code and leverage things like Display Suite and the theme function for pagination. This is what we wind up with:

<?php

/**
 * Plugin definition
 */
$plugin = array(
  'single' => TRUE,
  'title' => t('Blog Post Listing'),
  'description' => t('Custom blog listing.'),
  'category' => t('Custom Views'),
  'edit form' => 'blog_post_listing_edit_form',
  'render callback' => 'blog_post_listing_render',
  'all contexts' => TRUE,
);

/**
 * Render function for blog listing
 * @author Austin DeVinney
 */
function blog_post_listing_render($subtype, $conf, $args, &$context) {
  // Define the content, which is built throughout the function.
  $content = '';

  // Query for blog posts.
  $query = new EntityFieldQuery();
  $query->entityCondition('entity_type', 'node', '=')
    ->entityCondition('bundle', 'blog_post', '=')
    ->propertyCondition('status', NODE_PUBLISHED, '=')
    ->pager(5);

  // Fetch results, and load all nodes.
  $result = $query->execute();

  // If we have results, build the view.
  if (!empty($result)) {
    // Build the list of nodes.
    $nodes = node_load_multiple(array_keys($result['node']));
    foreach ($nodes as $node) {
      $view = node_view($node, 'teaser');
      $content .= drupal_render($view);
    }

    // Add the pager.
    $content .= theme('pager');
  }
  // Otherwise, show no results.
  else {
    $content = "No blog posts found.";
  }

  // Finally, we declare a block and assign it the content.
  $block = new stdClass();
  $block->title = 'Blog Posts';
  $block->content = $content;
  return $block;
}

/**
 * Function used for editing options on page. None needed.
 * @author Austin DeVinney
 */
function blog_post_listing_edit_form($form, &$form_state) {
  return $form;
}

Some things to note here. We're basically making a view by hand using EntityFieldQuery. It's a nifty way to write entity queries a bit more easily, and it comes with some useful how-tos on Drupal.org. We also offload all rendering to work with Display Suite and use the built-in pagination that Drupal provides. All things considered, I'm really happy with how this comes together.
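If you haven't used EntityFieldQuery before, conditions chain fluently, so refining the listing is just a couple of extra method calls. As a hypothetical example (field_tags and $tid are illustrative names, not part of the build above), here is how you might narrow the same query to one taxonomy term and sort newest-first:

<?php
// Hypothetical refinement of the query above: 'field_tags' and $tid
// are example names used for illustration only.
$query = new EntityFieldQuery();
$query->entityCondition('entity_type', 'node')
  ->entityCondition('bundle', 'blog_post')
  ->propertyCondition('status', NODE_PUBLISHED)
  ->fieldCondition('field_tags', 'tid', $tid)
  ->propertyOrderBy('created', 'DESC')
  ->pager(5);
$result = $query->execute();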

Panels

Finally, we need to add this to the page manager with panels. Browse to Structure -> Pages -> Add custom page and it will provide you with a step-by-step process to make a new page. All we're going to do here is add our newly created content type to the panel, as shown here.

And now, we're all ready to export to the Feature we created. Go on back and recreate the feature, and you're ready to push your code live. After everything is said and done, you should have a working blog with pagination.


Motivation

Obviously, this example is extremely basic. We could have done this in a View in far less time. Why would we ever want to use this? That's a great question, and I'd like to elaborate on why this is important. Views are great and solve this problem just as well. They export nicely with Features and can even play with Panels (if you want to use Views as blocks or content panes). That being said, this is more about laying out how custom code can work with a lot of Drupal's best practices. Imagine instead that we have a complicated third-party API we're trying to query, and we want our "view" to react to it. What if we want a small, code-driven block that we can place discretely with panels? The use cases go on, of course.
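To make that API-driven case concrete, here is a rough sketch of a render callback backed by a remote service instead of EntityFieldQuery; the endpoint, the function name, and the response shape are all hypothetical, but the block-building pattern is the same one used above.

<?php
/**
 * Hypothetical render callback whose "view" is backed by a remote API
 * instead of EntityFieldQuery. The endpoint URL is a placeholder.
 */
function api_listing_render($subtype, $conf, $args, &$context) {
  $block = new stdClass();
  $block->title = 'Remote Items';
  $block->content = 'No items found.';

  $response = drupal_http_request('https://api.example.com/items?limit=5');
  if ($response->code == 200) {
    $items = array();
    // Assume the (hypothetical) API returns a JSON array of objects
    // that each carry a title property.
    foreach (json_decode($response->data) as $item) {
      $items[] = check_plain($item->title);
    }
    if ($items) {
      // Reuse Drupal's item_list theme rather than hand-rolling markup.
      $block->content = theme('item_list', array('items' => $items));
    }
  }
  return $block;
}

Swap the plugin definition's render callback to point at a function like this, and the rest of the Feature, Page Manager, and Panels workflow stays exactly the same.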

There are many ways to solve problems in Drupal. This is just my take on a very clean and minimal code structure that allows developers to be developers and drive things with their code, rather than being stuck clicking around in menus.

Tags: drupal, drupal 7, ctools, panels, features, technology, maintainability
Categories: Drupal

Dcycle: Multiple git remotes, the --depth parameter and repo size

Planet Drupal - 20 January 2015 - 7:31am

When building a Drupal 7 site, one oft-used technique is to keep the entire Drupal root under git (for Drupal 8 sites, I favor having the Drupal root one level up).

Starting a new project can be done by downloading an unversioned copy of D7, and initializing a git repo, like this:

Approach #1

drush dl
cd drupal*
git init
git add .
git commit -am 'initial project commit'
git remote add origin ssh://me@mygit.example.com/myproject

Another trick I learned from my colleagues at the Linux Foundation is to get Drupal via git and set up two remotes, like this:

Approach #2

git clone --branch 7.x http://git.drupal.org/project/drupal.git drupal
cd drupal
git remote rename origin drupal
git remote add origin ssh://me@mygit.example.com/myproject

This second approach lets you push changes to your own repo, and pull changes from the Drupal git repo. This has the advantage of keeping track of Drupal project commits, and your own project commits, in a unified git history.

git push origin 7.x
git pull drupal 7.x

If you are tight for space though, there might be one inconvenience: Approach #2 keeps track of the entire Drupal 7.x commit history, for example we are now tracking in our own repo commit e829881 by natrak, on June 2, 2000:

git log | grep e829881 --after-context=4
commit e8298816587f79e090cb6e78ea17b00fae705deb
Author: natrak <>
Date: Fri Jun 2 18:43:11 2000 +0000

    CVS drives me nuts *G*

All of this information takes disk space: Approach #2 takes 156 MB, vs. 23 MB for Approach #1. This may add up if you are working on several projects, especially if each project has several environments for feature branches. If you have a continuous integration server tracking multiple projects and spawning new environments for each feature branch, several gigabytes of disk space can be used.

If you want to streamline the size of your git repos, you might want to try the --depth option of git clone, like this:

Approach #3

git clone --branch 7.x --depth 1 http://git.drupal.org/project/drupal.git drupal
cd drupal
git remote rename origin drupal
git remote add origin ssh://me@mygit.example.com/myproject

Adding the --depth parameter here reduces the initial size of your repo to 18 MB in my test, which, interestingly, is even less than Approach #1. Even though your repo is still linked to the Drupal git repo, running git log will show that the entire history is not being stored.

Tags: blogplanet
Categories: Drupal