It’s a Wrap for Sim4Act

For the past two years I’ve been working to improve access to an approach that I believe could solve many of our society’s ills. The approach leverages games and computer simulations in tandem to make complex problems accessible and understandable to a wide audience and to bridge the many gaps between citizens and experts and between implicit and scientific knowledge. These gaps promote controversy and prevent innovation on so many of the wicked environmental, infrastructure, and social problems we currently face.

The vehicle for this work was Simulation for Collective Action (Sim4Act), a private company I founded with the highly accomplished arts and crafts master and GIS and remote sensing expert, Dr. Paolo Campo. The goal was to make the above approach available outside of its traditional settings (natural resource development projects funded by international organizations) to clients such as cities and water districts that suffer from many of the same wicked planning problems.

To accomplish this we developed prototype games; talked to a lot of people across Europe, Asia, and the United States; generated a lot of excitement; and were even finalists for the prestigious Echoing Green Climate Fellowship. Maybe we could have made it work if we’d kept chasing US federal and EU grants, and we had some very heady days around California’s new groundwater planning requirements, which looked like a perfect fit. But at the end of the day, sufficient interest among local-level clients never materialized.

A start-up is a search for a business model, and sadly we’re not close enough to having one to keep going. I, for my part, am going to get embedded in an organization, i.e. find a job. (My business partner Paolo is also looking.)
I’ve got some ideas about how to keep pursuing our vision of using games and simulations to help people collectively solve “wicked” problems, but first, gonna find that job (tips and contact suggestions most certainly welcome).

From Game to Dynamic Simulation: Collective Learning in Maps and Pictures

Simple model of the basin used to support discussion in the first workshop

A simple model of the situation under study is presented to stakeholders well acquainted with the issue. Here we’re concerned with flooding, and the interaction between agriculture, mining, erosion, rainfall, and dam water releases. The knowledgeable stakeholders discuss among themselves and provide feedback.

Components from Playing 1st Workshop

Now it’s time to play! In this role-playing game the participants play as lowland farmers, upland farmers, or dam managers under conditions of normal or heavy rainfall. The lowland farmers plant either rice or vegetables and are vulnerable to flooding caused by erosion from upland farming and/or unfortunately timed releases from the dam.
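The core dynamic the game explores can be sketched as a toy model. Everything here is hypothetical: the function, the coefficients, and the channel-capacity threshold are my own invention for illustration, not the actual rules used in the workshops.

```python
def flood_level(rainfall, upland_plots_farmed, dam_release, channel_capacity=10.0):
    """Toy dynamic: erosion from upland farming silts up the river, so the
    same rainfall and dam releases produce more lowland flooding.
    All numbers are invented for illustration, not taken from the game."""
    erosion = 0.5 * upland_plots_farmed  # sediment washed off cleared slopes
    return max(0.0, rainfall + dam_release + erosion - channel_capacity)

# A heavy-rain year with intensive upland farming and a badly timed dam
# release floods the paddies; a normal year with forested slopes does not.
heavy = flood_level(rainfall=8, upland_plots_farmed=6, dam_release=2)
normal = flood_level(rainfall=4, upland_plots_farmed=0, dam_release=2)
```

The point of even a sketch this crude is that it makes the cross-scale linkage discussable: upland decisions show up in lowland outcomes.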


Map of low-level flooding in Phonsi Village, from the second workshop

Of course that first group didn’t have all the information. Now we go out to the villages, and get them to give input on how they are affected by the floods, e.g. where flooding occurs and negative and positive effects such as destruction of fishing equipment or increases in soil productivity.

Village plus basin combined map from 3rd workshop

Now we’re ready to link the river basin scale and village level dynamics in a game that includes an even wider group of stakeholders (see the white village maps). Here, white stones represent water, and black stones represent sedimentation.

Action cards from the third workshop

Of course, what drives sedimentation, as well as people’s vulnerability to the impacts of flooding, are human actions. The game allows people to take actions based on those they would take in real life.

 

Final Presentation of simulation

The common understanding of the drivers and impacts of flooding has now been developed and checked multiple times. This understanding is formalized in a computer simulation which helps policy makers plan, analyze, and predict the effects of different management options.

Presentation of Game

And for those who want a more dynamic and engaged understanding of the flooding issue, an electronic version of the game is also delivered.

 

When Words and Things Lost Their Way, and Beer

One of the professors at my grad school, Roger Bohn, writes an interesting blog called Art2Science where, among other things, he lays out how the medical industry could massively improve its safety record and outcomes by learning from the aviation industry. Early pilots, just like today’s doctors, depended extensively on their own judgement and frequently made deadly errors; standardized practices such as checklists led to far fewer crashes.

Lately, however, he’s run into some issues with his terminology. While the phrase “art to science” can generally capture the sense of replacing intuition and superstition with a general, comprehensive, and systematic understanding, it leaves a lot of vagueness as to what the practitioners are specifically doing. We can talk about the art of aviation and the art of practicing medicine, but these don’t give us much insight into what pilots or doctors do on a day-to-day basis, or how they can improve. And while science probably has something to do with their improvement, we expect our pilots and doctors to be improving based on the lessons of science, not to actually become scientists. I do think he’s in for a world of hurt in trying to come up with precise terminology though, and here’s why:

The reason “art to science” works is that we all have a general understanding of how such a process proceeds. The tinkerer and the craftsman were put out of business by the scientist and the engineer (by way of the corporation and the modern state). The apprenticeship gives way to formal education with a standardized curriculum. Brewers were able to make beer before yeast was discovered, but their belief that fermentation was a blessing from God is now a bit laughable. And even if an alchemist were able to re-discover how to make porcelain, we’d be unlikely to employ their services over the chemist’s, who could explain (after the fact) how this was accomplished. “Art to science” captures, in a general sense, the progress in each of these specific areas.

The problem with moving beyond the general understanding to more precise terminology is that many of the developments in the above areas corresponded with the undoing of our language. Well, not precisely, but they undid our confidence in a certain picture of how language worked, one in which words have fixed correspondence with specific things in the world outside our heads.

This picture of language is, according to Wittgenstein, best captured by Saint Augustine’s account of learning language. Something is pointed out and a sound is uttered, and he learned to associate that sound with that thing. We can imagine pointing out the window of a moving car at a sheep and saying “sheep,” and the child (in the back seat for safety reasons) repeating “sheep.” Later, we drive past a cow and the child says “sheep,” and we correct them, saying “cow,” but nonetheless congratulate ourselves on having such intelligent progeny, able to identify a category of things we call “four-legged animals,” a concept the kid will get to in due time.

In certain domains, the natural sciences made progress cleaning up the mumbo-jumbo relating our words to things in the external world. There was a time when whether a rabbit was a fish or not was a matter of theological and philosophical debate. Because the Church determined it was a fish, French monks went to extreme efforts to domesticate this excitable creature, as it could be consumed on Fridays. Natural philosophers in Prussia called the sun a kitchen furnace and the pyramids volcanoes, but all this ridiculousness was done away with by the late 19th century. When Ishmael insists that a whale is a fish, there is a sense of poetic and tragic assertion, just as when the hero of Notes from Underground rails against the immovable wall that is science.

But while they were able to kill off natural philosophy, the natural sciences and the scientific method more generally ran into their own wall when it came to clarifying and studying the problems of human social, ethical, and political life. What was “good,” or “true” was still largely a matter of interpretation, and as a result in these domains philosophy and an applied derivative, ideology (and increasingly propaganda), still held sway.

Developing an ideal language that would be free of the frustrating ambiguities of natural language was the last great project of philosophy and one at which the discipline failed (and consequently lost any claim of being a scientific discipline).

This picture of language is as pernicious as it is powerful and there’s a reason that one of the most important philosophical works of the last century was devoted to helping people see past it.

Its power comes from its usefulness in scientific contexts. “What causes fermentation?” is a great scientific question that can be answered with precision. We can talk not only about yeast, but about types of yeast, and while there is plenty to be discussed, there’s certainly no need to consider our picture of language, much less God’s judgement on the brewer.

But once we move beyond a precise scientific domain, language quickly starts to lose its precision in the sense conveyed in the above picture. And we run the danger of falling into interesting but endless philosophical rabbit holes (What is Truth? What is Freedom?) if we cannot accept a bit of imprecision and the need to clarify based on audience and context. Dr. Bohn has proposed to replace the term “art” with “craft,” borrowing from the progression outlined above, in which the craftsmen were put out of business by modern science and engineering. Along these lines we might think of a master brewer in medieval Germany as a practitioner of a craft, one he apprenticed in under a previous master. But Anheuser-Busch wouldn’t trust any of these men to run one of its brewing facilities unless they went back to school for a rudimentary education in brewing science.

At the same time, however, craft brewing has Anheuser-Busch scared and defensive and is (finally) putting the modern German beer industry on notice. Does this mean that science is giving way to craft? Well, not exactly; craft here has more to do with the sense of small-scale/artisan/high-quality (though it doesn’t necessarily mean any of those either). Craft has simply acquired a new meaning, which is related to, but different from, its old meaning. Not a problem; we give new things the names of old things all the time.

But we can see the problem for Dr. Bohn: this new meaning of craft already presents potential for confusion. Craft is at the same time what is out of date and what is cutting edge. This is perhaps not insurmountable, but it is enough to undermine precision. And finding a precise term for better, more scientific management may prove even more difficult.

The rise of the sciences has not meant the end of confusion. One of the most dangerous places to be is between people with precise (but varying) terminologies and the common uses of language; “babbling equilibria” (to use Elinor Ostrom’s terminology) are a common occurrence.

Games, Gamification, Serious Games, and Simulation: Playing with the Terminology

At some point, between a youth misspent playing Dungeons & Dragons and war games, and an adulthood in nominally more adult pursuits such as studying game theory and observing and taking part in real-life political games, I acquired a certain body of knowledge. And despite my best efforts, this knowledge has become increasingly useful and relevant. So for those for whom it didn’t seem natural to compose a 7th grade book report on Where the Red Fern Grows in the form of a game (one player played the ’coons and naturally tried to escape the other players’ hounds), I’m going to try to jot down some basics about gaming and related concepts and their potential for worthwhile and less worthwhile activities.

Reflecting for a minute, I can see why there is room for confusion. The term “game” is applied very widely and loosely: anything played with dice and cards, whether recreational or gambling, Roman gladiatorial games, “Don’t play games with me,” “Don’t hate the player, hate the game,” all the way to the twitchy first-person shooters that put Tipper Gore in a fit.

The truth is, there isn’t a good single definition of what a game is. It’s something some people “play,” and it probably has some element of fun, challenge, and/or competition to it. It may be trivial (“only a game”) or it may be awful (“playing games with people’s lives”), and it may be removed from or intersect the real world in various ways. Gamification, serious games, game theory, and simulation are key concepts for understanding how we can design and take advantage of intersections between the world of gaming and the real world. And I’m going to explain them primarily with illustrative examples because, just as with games, they are rather fuzzy concepts that resist simple definition.

An Epically Crap Game and Gamification

So Monopoly is a game. No denying it. You roll dice, draw cards, buy and develop properties, take money from your opponents, and hopefully (finally) bankrupt them all and emerge the winner. It is also crap from a design perspective: a board game should be fun, but Monopoly games often stretch on forever even after the eventual winner is clear, resulting in frustration and boredom. Still, most people have positive associations with Monopoly. This is because there is something inherently pleasurable about sitting down with your friends and family, opening a box and setting it up, and then rolling dice, drawing random cards, making sets, buying and developing properties, and taking others’ money.

Let’s talk for a moment about set making. A big part of Monopoly is getting sets of properties that are all the same color (e.g. Boardwalk and Park Place) or collecting all the railroads or utilities. This activity is rewarded in the game: by making a complete set you can extract more money from other players. Different sets have different values, but any set of properties is better than a mix of unmatched colors. Set making is an activity we see across a number of games, particularly card games. Kids start with Go Fish!, where you try to get cards of the same face value, and then some move on to poker, where bluffing, probabilities, and money make for a more adult experience. But at the end of the day you’re still trying to make sets (e.g. a flush, four of a kind, a straight).
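Set making is also easy to express in code, which is part of why card games digitize so well. Here is a minimal sketch of my own (a toy, not any real poker library) that checks a five-card hand for the sets mentioned above:

```python
from collections import Counter

def classify(hand):
    """Classify a five-card hand into a few of the 'sets' poker rewards.
    Cards are (rank, suit) tuples, with ranks 2-14 (14 = ace)."""
    ranks = sorted(card[0] for card in hand)
    suits = {card[1] for card in hand}
    counts = sorted(Counter(ranks).values())  # e.g. [1, 4] = four of a kind
    if len(suits) == 1:
        return "flush"
    if counts == [1, 4]:
        return "four of a kind"
    if all(b - a == 1 for a, b in zip(ranks, ranks[1:])):
        return "straight"
    return "no set"

classify([(9, "h"), (9, "s"), (9, "d"), (9, "c"), (2, "h")])  # four of a kind
```

A real evaluator would rank all hand types and handle edge cases like straight flushes; this just illustrates that “making a set” is a crisp, checkable pattern, which is exactly what gives it its psychological grip.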

McDonald’s famous Monopoly campaign used the explicit theme from Monopoly as well as set making to increase customer engagement.

This suggests that there is something about the activity of set making in itself. It’s fun to make a full house or a straight. You feel like you’ve done something. In fact, a big challenge of poker is figuring out how to value this accomplishment vis a vis what your opponent has done. A beginner may very well get overly attached to his or her hand. This set making activity has a certain grip on human psychology, which means it has the potential to influence behavior.

And here we have the basis for gamification. Gamification is only loosely related to pure gaming, in that the designer isn’t aiming for pure recreation or competition. Rather, he or she takes lessons from what people find fun (or even irritating) to motivate and modify behavior in non-game settings.

Far more productive than Monopoly for this hot and emerging field has been the fantasy role-playing game genre. In Dungeons & Dragons, for example, players collect experience points to “level up.” Now your online avatar can do the same thing by leaving helpful reviews on TripAdvisor.

Like a mage or warrior in a fantasy role-playing game, MargaretV earns points to level up on TripAdvisor


Points, levels, badges, leaderboards, and little bars that show you are 87% of the way to completing your profile have become staples of modern web communities. Community designers know very well that these give users a way to measure their progress and their status vis-à-vis other users, and that through gamification they (LinkedIn, Yahoo! Answers, Amazon.com) can get people to undertake tasks they would not otherwise do on a voluntary basis.

Of course, gamification insights don’t have to be used only to drive quarterly profits and shareholder value. Points, competition on leaderboards, and card collection can also be used to change behavior in important real-world settings. A great example is iChoose, a competition between employees at the Miron Construction Company to improve the sustainability of their daily routines. The routine behaviors were distinctly unsexy in themselves: not turning on the TV, taking re-usable grocery bags to the store, unplugging a second refrigerator. But the designers created cards that one could keep for completing certain actions, which gave a nice sense of accomplishment. And these cards had point values that were totaled up and displayed on a running online leaderboard so the employees could compare their progress to that of their fellow workers. Through cards and competition, gamification gave context, meaning, and a sense of purpose to what would otherwise be mundane actions, and thus created a real-world game.
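The cards-to-leaderboard mechanic is simple enough to sketch. The card names and point values below are invented for illustration; the actual iChoose cards and scoring aren’t described in this post.

```python
from collections import defaultdict

# Hypothetical action cards and point values, loosely in the spirit of
# iChoose; these are my own invention, not the program's real scoring.
CARD_POINTS = {"skip TV night": 5, "reusable bags": 3, "unplug 2nd fridge": 10}

def leaderboard(completed_cards):
    """completed_cards: iterable of (employee, card_name) pairs.
    Returns (employee, total_points) ranked highest first."""
    totals = defaultdict(int)
    for employee, card in completed_cards:
        totals[employee] += CARD_POINTS[card]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

plays = [("Ana", "unplug 2nd fridge"), ("Ben", "reusable bags"),
         ("Ana", "skip TV night"), ("Ben", "skip TV night")]
# Ana totals 15 points, Ben 8 — and now unplugging a fridge has a score
```

The design work is all in the point values: they turn incomparable chores into a single currency that can be ranked, which is what makes the competition legible.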

Getting Serious 

And here we start to cross the boundary into so-called “serious games.” Actually, I don’t like the term serious game, as it is ambiguous as to what is being taken seriously. People play games such as Go and chess very seriously. I’ve known a few people who have paid their bills with poker. What we’re really getting at, as with the Miron iChoose game, is that a serious game has some sort of beneficial social impact that we care about outside of the gaming context. The social impact of the iChoose game was to promote more sustainable behaviors around the household. But not all serious games are real-world games. Often a board game or a digital game is used as a medium in which players can experiment and learn what the designers believe to be socially important lessons. Actually, Monopoly itself has its origins as a serious game. The Landlord’s Game was created in 1904 to show players the pitfalls of unequal wealth and the socially destructive nature of monopoly. The original had two versions of play, one in which players pursued only profit and greed. The folly of this version was supposed to be self-evident, highlighted by the second version in which social justice prevailed. Parker Brothers dropped the socially just version when it acquired the rights in 1935.

Greed and rapaciousness may just be more fun.

A Design Interlude

I said above that Monopoly is crap. From a game designer’s perspective it is “unbalanced”: due to early luck someone can establish an insurmountable lead and there is little others can do to catch up. There is little of interest to do once the properties have been bought, and nothing to do except roll dice if you were unable to acquire a set of properties. And worst of all, it can go on forever. One Monopoly game reportedly went on for 10 weeks. Most players just get bored and quit long before reaching that point.

Due to its poor design and popularity, any mention of Monopoly is sure to invite eye rolling from your favorite board game geek. If you want to restore your now-agitated geek to equilibrium, and demonstrate that you are not completely in the Dark Ages when it comes to gaming culture, you should immediately follow any Monopoly reference by noting how much you enjoy “Settlers.” Settlers of Catan, that is, which Wired dubbed a “Monopoly Killer” and “the perfect German game.”

Settlers is not altogether unlike Monopoly: it is also an economic development game, and many of the basic elements and mechanics we talked about with gamification and Monopoly (dice rolling, set making, stealing resources from opponents) are also in Settlers. But the experience of the players is completely different. It can easily be played in an intense hour and a half, and every player is usually convinced that he or she would have won if it hadn’t been for a run of poor luck or a boneheaded and/or vicious decision from an opponent. With Settlers you don’t have to worry about being bored, but you may have to be prepared to explain to a very peeved significant other why you “robbed” him or her so frequently.

And here we have a certain central truth. Game design is a design exercise, and we can design a game for any number of purposes, be they recreational or commercial: to lessen the environmental impact of household decisions, for competition, to build a sense of community, to educate, and even to convey truth and beauty. And we can accomplish more than one of these ends at the same time. A game can be both serious and recreational; in fact, if you hope that anyone will play your game, it’s a good idea to give them a pleasurable and interesting experience.

Our limitations here are not any clear delineations between serious, real-world, and recreational games; we have seen how Monopoly is derived from a serious game, and Settlers can be turned into an educational game on the economics, social dynamics, and environmental consequences of fossil fuel exploitation with a nice little free add-on called “Oil Springs.”

What really matters with games, be they serious, recreational, or real-world, is the design behind them. While games are better suited to some purposes than others, the limitations aren’t clearly set; rather, they depend on the competence of the designer and his or her team, and on the game design discipline itself. And in recent years the discipline has broken through barriers previously thought insurmountable, contributing to fundamental medical research (players of Foldit solved the riddle of how an HIV enzyme is folded in three weeks), while abstract and mathematical game theory in social science research is increasingly giving way to games with real people in laboratory settings. But these are only recent developments; indeed, the most serious and destructive of human enterprises has a long and very developed practice with games.

Practice Battles

There is a fundamental problem with running physical combat scenarios in the real world: equipment and property get destroyed, and people get killed, which makes it devilishly hard to do multiple runs. Nevertheless, war is a high-stakes affair and one in which training, planning, tactics, and strategy are of high import. It is also an uncertain affair in which innovation and experimentation can be either fatal or highly rewarding. To deal with these challenges, people have been endlessly creative in finding ways to prepare themselves and their armies through the use of activities, games, and simulations.

On a very basic level we can see certain activities and sports, from jousting and fencing to target shooting, as gamifications of the basic physical skills needed to be successful in combat. Competition and rules are introduced in order to motivate investment and focus on activities that would otherwise be overly intense and dangerous (when done for real) or even mundane and dull (putting bullets in targets is more fun if you have others to measure your prowess against).

The two top intellectual board games have martial origins: Go and chess are built around territorial control and the killing and capture of opposing pieces. Both of these games are, however, highly abstract; while they may teach high-level concepts around strategy, they yield few practical lessons. Real-world castles don’t tend to move around. But some chess players in the 17th century, in present-day Germany, started to try their hand at building more realistic representations of battle. The modern war game was born when a version complete with scale maps and dice was shown to King Friedrich Wilhelm III, who was convinced of its value. Prussia’s decisive victory in the Franco-Prussian War (1870-1871) ensured that war games and simulations became important tools for military planners, from WWII through the Cold War to the present day. The Pentagon recently played out a war with Russia in the Baltic (with somewhat dispiriting results).

Like all the categories in this piece, the boundary between simulations and games is blurred. One can simulate a historical battle, an anticipated future battle, or a completely fictional battle (say, against aliens) on a board or on a computer, either to prepare for an actual battle, to train, or merely to entertain. Many simulations are designed primarily for pure entertainment; one need only look at the popular and commercial “Sim” series to get an idea of how many variations they can take (SimCity, SimAnt, SimLife). The primary distinction is that a simulation always tries to represent some external situation.

The power of a simulation is that it allows us to deal with real-life situations in a controlled and safe setting. Military trainers and planners discovered this out of necessity quite early on, but there is plenty of room for using simulation in any setting where the stakes are high and so are uncertainty and complexity. And in recent years we’ve seen simulations extended to firefighting and emergency response training and to urban, environmental, and landscape planning.

Advances in computing power and ICT mean that we can now simulate far more parameters and contingencies than Prussian military planners could with their maps and dice. Agent-based models, for example, let us set up rules for individual critters or people, set them loose in a computer environment, and observe the emergent behavior that arises from their individual decisions and local interactions. Such simulations have led to a better understanding of the spread of rabies through foxes and to more realistic modeling of commuter behavior for transportation planning. The U.S. Navy is trusting that its new flight simulators can provide such a realistic experience that it can use them as substitutes for flying actual aircraft, and thus save money.
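To make “rules for individuals, emergent behavior for the system” concrete, here is a minimal agent-based epidemic sketch, in the spirit of (but far simpler than) the fox-rabies models. Every parameter and rule here is invented for illustration:

```python
import random

def run_abm(n_agents=60, steps=80, infect_radius=0.1, p_infect=0.6,
            recover_after=10, seed=42):
    """Agents random-walk on the unit square; infected agents pass the
    disease to nearby susceptibles and recover after a fixed time.
    All parameters are invented for illustration."""
    rng = random.Random(seed)  # seeded so runs are reproducible
    agents = [{"x": rng.random(), "y": rng.random(), "state": "S", "timer": 0}
              for _ in range(n_agents)]
    agents[0]["state"] = "I"  # patient zero
    for _ in range(steps):
        for a in agents:  # local rule 1: small random movement
            a["x"] = min(1.0, max(0.0, a["x"] + rng.uniform(-0.05, 0.05)))
            a["y"] = min(1.0, max(0.0, a["y"] + rng.uniform(-0.05, 0.05)))
        for a in agents:  # local rule 2: infect neighbors, then timed recovery
            if a["state"] != "I":
                continue
            for b in agents:
                near = (a["x"] - b["x"]) ** 2 + (a["y"] - b["y"]) ** 2 < infect_radius ** 2
                if b["state"] == "S" and near and rng.random() < p_infect:
                    b["state"] = "I"
            a["timer"] += 1
            if a["timer"] >= recover_after:
                a["state"] = "R"
    # the epidemic curve is an emergent property of the two local rules
    return {s: sum(1 for a in agents if a["state"] == s) for s in "SIR"}
```

No agent is told how big the outbreak should be; the susceptible/infected/recovered totals emerge from movement and proximity, which is the whole point of the approach.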

As noted above, these may or may not be games. The U.S. Navy is looking to create a realistic experience, but advances in realism for training purposes will most likely be applied to entertainment at some point. And while agent-based models were developed for scientific purposes, it wasn’t long before someone used a popular modeling language (NetLogo) to create Tetris and The Legend of Zelda. Similarly, the data for the models themselves can be derived from games with real-life participants. I’m a big fan of, and am building a company around, an approach that uses tabletop role-playing games in conjunction with agent-based models.

End Game

Well, I hope this has given the reader something of an overview of some very rapidly expanding fields, and more importantly a little familiarity with some very hot but perhaps confusing concepts. While there is plenty of writing on the subject from both the scholarly and the design perspectives, I’d suggest the interested reader leave the reading aside for a while, because, well, it isn’t all that much fun. Check out what people are doing (a Google search for your industry and the Games for Change Festival are good starts), and try to play games that interest you. Attend a game-a-thon. And most of all, try to enjoy what you’re doing. Because for all the talk about “serious” games, a central insight in all of this is that fun and playful thinking can bring engagement, understanding, and creativity to very serious topics. So even if games aren’t your thing, they might very well be worth playing around with.


Lessons from the Failure of the Stormwater Rule: The Need for Meaningful Public Participation and Relevant Analysis

Those of you who know me know that I am very proud of the four and a half years I worked at the U.S. EPA’s Office of Water, first as an Oak Ridge Institute for Science and Education Fellow and then as an environmental protection specialist and economist. Fewer of you will know that every major effort I worked on as a federal employee more or less failed. This was not due to any failings on the part of the people there; the staff and management remain one of the smartest and most dedicated groups I have ever met. And even the undeniable budgetary and partisan political troubles cannot take the full share of the blame. No, our failures stemmed from how poorly institutions and organizations set up in the 1970s are suited to dealing with 21st-century environmental problems.

So what I took out of my time at U.S. EPA were the lessons of failure. And nothing taught more lessons than the high profile failure of the Stormwater Rule.

Stormwater 101: Trees and dirt have a different hydrology than roads and buildings. This has environmental implications.


Taking dramatic action to deal with the damage to U.S. waterbodies caused by stormwater running off buildings and roads wasn’t a bad idea. Dramatic changes were necessary in how stormwater was managed and regulated in the United States to prevent further degradation of those waterbodies. The U.S. EPA had more or less left the issue to the states for many years, and while some had made significant and exciting progress, many were doing virtually nothing.

Our specific regulatory approach itself wasn’t bad or uninformed either; it had been developed by some very smart people with decades of experience in the issue area itself and in the nuts and bolts of how to get things done in Washington, D.C. In many ways we were implementing a tried and tested U.S. EPA approach to environmental regulation: let the states experiment, and then use the lessons from that experimentation to set a minimum technical standard that the straggling states would have to meet.

No, the problem was the institutional context in which we were operating, which, as I noted above, was still tuned to fixing the environmental problems of the 1970s, where the problem could be easily traced to some big building with a pipe spewing gross stuff that killed the fish.

Before U.S. EPA management and the White House could come to a final decision about the proposed regulation, however, the agency was going to have to go through the full regulatory process, which means that even the most sensible sounding and pragmatic policy (such as we thought ours was) has a lot of hoops to jump through before it becomes law.

This is because decisions, particularly decisions by public authorities, often have unintended consequences, and attempts to solve a problem in one area may lead to frustration and misunderstanding, and even adversely affect public health, the environment, and people’s livelihoods in other areas. The solution is to create an opportunity for affected and interested parties to make their voices heard, and in the U.S. a federal requirement that authorities include the public in their decision-making process has existed at least since the National Environmental Policy Act of 1969. And “public participation” requirements in some form or another are pretty commonplace in Europe as well, as I would imagine they should be in any nominally democratic society.

And we can imagine the ideal public participation process. New information is brought to the public authorities, they reconsider and redesign their policy appropriately, and the harms the policy would have caused are avoided or at least minimized. Perhaps the process takes a bit longer, and the final policy costs a bit more, but overall society is richer, more equitable, and just plain better off. But how often is this ideal realized?

I can’t say exactly. On the margins, when only minor tweaks to the policy are required, public input and participation can be very effective: exempting some small population from the regulation or giving them extra protection, say by not requiring businesses below a certain size to fill out a lengthy report to the government, or by putting some extra resources into inspecting potentially toxic substances in schools.

But when the changes to the anticipated policy become more than little tweaks, simple participation may not lead to any meaningful improvement of the policy, and may just lead to confusion and distrust. Some problems just aren’t that easy to characterize or solve, and the public authority may very well find its attempt to solve the problem characterized as being a problem in itself. With these “Wicked Problems,” even formulating what the problem is, is a problem; often problems are not understood until after a solution has been formulated.

Let’s say for example you have too much waste piling up for the city dump to handle. The problem may be that the dump is too small, that you need a recycling program, or simply that the citizens should not consume as much stuff (NRC 1996).

As stakeholder groups (waste haulers and environmental activists in the above example) are likely to favor different solutions, they likely won’t be able to agree on what the problem was in the first place.

So we moved forward with our formulation of the solution (which was tied to our formulation of the problem, i.e. the need for U.S. EPA to set a standard) for the damage caused by stormwater runoff from streets and buildings. Our solution was a regulatory requirement that newly developed and redeveloped land try to mimic natural hydrology. We believed this could be implemented in a relatively cost effective manner, as we had seen a lot of very attractive projects that had used green infrastructure, such as rain gardens and green roofs, and new “low impact development” techniques to accomplish just this. Some had even saved money and commanded higher sale prices because, well, people like plants.

We held listening sessions across the country where people commented on our proposed changes to administrative law. Some people liked our program, some didn’t. And many just brought a lot of new information to the table. For example, a major barrier to the use of green infrastructure was that it wasn’t allowed under local building codes. For something like this we could issue guidance and allow local authorities some time to revise these codes. Other issues weren’t so easily resolved. We were only looking at regulating new development and redevelopment within urbanized areas; what if this perversely pushed more sprawl as developers sought to avoid the costs of regulation? Could the new techniques, which infiltrate rainwater and anything it is carrying into the ground, lead to groundwater contamination? And how burdensome would this be for the construction industry; would it substantially increase home prices?

These questions and many others were debated inside and outside of the U.S. EPA for the entire rulemaking process, and many of the debates go on even though the rule itself died in March 2014. And the process and the extensive information gathering, modeling, and analysis that were connected to it were not able to build any kind of broad consensus for or against the regulation we were proposing.

Which sounds funny. Having collected and analyzed all this information we should have “known” more and so it should have been clearer whether the regulation was a good or bad idea. But we hadn’t dealt with the fact that we had a wicked problem on our hands, or rather we were trying to deal with one aspect of a whole host of interconnected issues relating to urban development, and we couldn’t convince stakeholders that we could pursue our goals without stepping on their toes and causing problems in the issue areas they cared about.

Now, as I learned in our efforts to make sense of what happened, the problem of building consensus and understanding around government actions which involve difficult to interpret science and political disagreement was not unique to this effort. Back in 1996 the National Research Council (NRC) took a stab at addressing similar issues which had been encountered by a number of U.S. entities, including the U.S. Departments of Defense, Health and Human Services, and Agriculture, the EPA, the U.S. Nuclear Regulatory Commission, the American Industrial Health Council, and the Electric Power Research Institute, in its report Understanding Risk: Informing Decisions in a Democratic Society.

The thinking in the report remains state of the art today and it contains many concrete examples of where the process to inform the public and develop the appropriate scientific analysis was done correctly, and where it was done poorly. If the recommendations of the report had been implemented broadly at U.S. EPA, we would have been far more likely to have avoided the expensive and very public failure that was the process to establish a Stormwater Rule.

And process is the key word here. The final rule could be challenged in court if we did not comply with certain federal requirements, which included when notices have to be published in the Federal Register, how long comment periods need to be open, what additional statutes have to be considered, and what analyses have to be conducted. Where we fell down with the Stormwater Rule is that we formulated the broad outlines of the problem, and the solution, relatively early, and then treated the (admittedly substantial) process requirements that had been given to us as boxes to be checked, or hurdles to be jumped, on our way to a final decision. Don’t get me wrong, we did listen and conducted the elements of the process with complete sincerity, but how open we could be to changing our approach was limited by the fact that we knew basically where we wanted to go, and we didn’t have much time to get there: we started work in earnest in 2009 and planned to propose by the end of 2012. That is a very, very aggressive schedule for a government action of the size we were conceiving.

In the above mentioned report (Understanding Risk) the NRC is emphatic that the process must be gotten right (NRC 1996, p. 22), and the process element which the NRC warns most strongly against using without broad based deliberation was exactly the element which killed our rulemaking, that is the cost benefit analysis (NRC 1996, p. 104) and the many “judgements that are implicit” in the techniques that makes such an analysis possible.

Now the superficial reason that the cost benefit analysis killed the Stormwater Rule was that costs were higher than benefits, and no doubt having higher benefits than costs would have strengthened the political hand of U.S. EPA management and made them more likely to issue the rule. But getting high “monetized benefits” around water issues is notoriously difficult, and many regulations have been issued in which costs were much higher than benefits (I recall a rule of thumb for water rules being that costs should be no more than three times benefits).

The reason that federal agencies are allowed to issue regulations even when a cost benefit analysis is unfavorable is that it may still make sense to take the action. The meaning of costs and benefits in the context of a formal analysis is deceptively narrow and technical, and understanding precisely the final numbers of any given analysis requires not only a significant background in economics but some familiarity with how the methodologies were implemented in that specific study. In the environmental context, these terms are even less intuitive because one must rely on “non-market valuation,” which depends on a very specific and even more technical body of research.

To see quickly how “cost” and “benefit” are used in everyday speech as opposed to in a formal analytic setting, consider that while many of us consider increasing equality to be a “benefit,” there is no way that equality can be considered in the framework of a cost benefit analysis. Other qualitative factors of interest, such as sustainability, resiliency, or innovativeness, might be translated into monetized terms in a cost benefit analysis, but after the translation process such terms would likely bear only a tangential relation to our everyday usage, and thus the analysis would likely mislead in broader debate.

The point is: cost benefit analysis is only useful if decision makers and stakeholders understand how to interpret the results. And when it comes to complicated environmental regulations this is rarely if ever the case. Air regulations often have much higher benefits than costs, but this is because small particulate matter is likely to lead to human deaths. Now it does seem that avoided deaths should translate into HUGE benefits; obviously human deaths are first and foremost what we wish to prevent. But discussing human death in a cost benefit analysis requires putting a monetary value on human life; in the case of the U.S. government, standard values are between $7 million and $9 million per human life (these come from very technical studies about how much money Italian miners demand for more dangerous work). Check your intuition on these numbers for a second. Then consider that simply multiplying one of these numbers by the number of expected deaths likely forms the core of the several billion dollars in potential benefits reported in the news regarding a big new air regulation.
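
For a rough sense of where those headline numbers come from, here is a back-of-the-envelope sketch of the multiplication described above. The value-of-a-statistical-life range is the one quoted in the text; the number of avoided deaths is invented purely for illustration, not from any actual rulemaking.

```python
# Hypothetical "monetized mortality benefits" calculation of the kind that
# anchors the benefits side of a major air regulation's cost benefit analysis.

VSL_LOW = 7_000_000   # $7 million per statistical life (low end of U.S. range)
VSL_HIGH = 9_000_000  # $9 million per statistical life (high end of U.S. range)

def monetized_mortality_benefits(avoided_deaths_per_year, vsl):
    """Core of a typical benefits estimate: deaths avoided times VSL."""
    return avoided_deaths_per_year * vsl

avoided_deaths = 500  # invented estimate of premature deaths avoided per year

low = monetized_mortality_benefits(avoided_deaths, VSL_LOW)
high = monetized_mortality_benefits(avoided_deaths, VSL_HIGH)
print(f"Annual monetized benefits: ${low:,.0f} to ${high:,.0f}")
```

Even a few hundred statistical deaths avoided per year puts the monetized benefits in the billions, which is why mortality dominates the benefits side of big air rules.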

If you have any skepticism about placing a value of $7 million to $9 million on each individual human life, you probably would have preferred to look simply at the number of people likely to die without the new regulation. Then you might be able to make a judgement call based on what you thought was a manageable cost for industry. The problem we are now faced with is that we as a society don’t have a nice analysis to make the implicit judgement for us. What you decide is “manageable” will say something about your judgement, and thus also about you, and your judgement is likely to be inextricably linked to whether your child suffers from asthma or you have to install the control technology on your coal plant, whether you’re a Greenpeace volunteer or a Koch Industries lobbyist. Very quickly what seemed to be an entirely technical matter has become a political one.

And here are the Scylla and Charybdis which any decision making process which is at the same time scientific and yet political must steer between.

I’m going with the overly political process running into the old Scylla monster.

Overly technically and analytically oriented decision making processes threaten to confuse and obscure understanding of relevant stakeholders, the public, and even the decision makers themselves; overly politically oriented processes may never be able to arrive at a consensus on the facts at hand. But in a well tuned process, consensus and understanding go hand in hand, just as technical analyses can support and inform political deliberation.

Now designing and carrying out such a process is more of an art than a science. But it is an art that can be informed by prior cases, as the “Understanding Risk” report shows. An example of a successful effort comes from the Man and the Biosphere Program (MAB), which was organized by the U.S. Department of State. I assure you, given the scope of the issues involved, success was not assured:

In a MAB activity over several years, more than 100 natural and social scientists from various federal and state agencies and from universities have considered policy options for managing surface water so as to maintain a sustainable ecosystem in and around Florida’s Everglades (Harwell et al., in press). Changes in the ecosystem and possible responses to them entail risks to endangered species, to drinking water quality in nearby metropolitan areas, and to the livelihoods of sugar growers. The scientists considered all these risks carefully, but from a perspective different from that typical of risk assessments.

They defined the problem not as one of estimating and reducing risks, but as one of developing a shared vision of desired conditions of the ecosystem. They then identified development strategies consistent with such a vision and proposed governance structures that could adaptively manage the social-ecological system as it changed and as new knowledge developed. They considered several scenarios for change in human management of the ecosystem and analyzed them in terms of their compatibility with goals of sustainable economic and social development and with a widely shared vision of ecosystem use. The MAB effort is noteworthy for its problem driven approach, particularly its extensive and explicit efforts to understand the decisions to be made, rather than presuming that decision makers would gain the understanding they needed from estimates of the ecological, health, and economic costs and benefits of previously defined choices. In fact, the process generated policy options that had not previously been considered and might be more acceptable, both socially and ecologically, than any that might otherwise have been considered (NRC 1996, p. 18-19). (Emphasis added)

In addition to specific cases such as the one mentioned above, there is a good deal of general practical knowledge out there, and agencies would do well to direct more of their attention to the “craft” of meaningful public participation. Such effort shouldn’t be seen as a distraction from scientific work to inform policy, but rather as a necessary step toward ensuring that the data gathered and analyzed answer the relevant questions. And what the relevant questions are is never clear at the outset of a policy making process that involves a complicated mix of politics and science.

And those of you who know me very well or read my post at the beginning of the year, have heard of the huge potential I see in participatory modeling, particularly participatory modeling exercises aided by computer simulation. Such exercises have been shown to help stakeholders from diverse backgrounds to come to a common representation and understanding of the problem, and thus a common and trusted vision for a way forward. As I develop my work with an approach that uses these exercises, my experiences with the Stormwater Rule and stormwater issues more generally will certainly be foremost on my mind. And I invite any practitioners or researchers struggling with these issues to contact me through the comment form on this blog or at chris.moore [at] ecologic.eu

Stern, Paul C., and Harvey V. Fineberg, eds. Understanding Risk: Informing Decisions in a Democratic Society. National Research Council. Washington, DC: National Academy Press, 1996. http://www.nap.edu/openbook.php?isbn=030905396X

Concept Sketch: Civil Society 3.0

Representative democracy has never been sufficient to formulate informed and legitimate legislation and policies. Elected representatives and governing institutions have always required help from civil society:

Civil Society 1.0: The town hall meeting. Everyone gets together in a room and debates directly with the elected representative. A robust and independent media is also indispensable.

Civil Society 2.0: In response to the growing complexity of society and policy issues, a whole group of organizations grew up to inform policy. Think tanks, academic institutes, issue advocacy nonprofits, and industry associations achieved power and influence by filling gaps and finding niches: conducting research, proposing and opposing policy, representing core constituencies, and providing issue expertise.

Civil Society 3.0: Cross cutting persistent issues such as poverty and emerging issues such as climate change adaptation require means of building consensus and collective action that overwhelm Civil Society 2.0 actors. The experts and institutions built for 20th century policy problems lack the flexibility, creativity, and legitimacy to address 21st century challenges unaided. Blogging and social media allow new thought leaders and influencers to emerge, and online forums facilitate dialogue and organization. Increasingly, ideas will be crowdsourced (e.g. MIT Climate CoLab) and cross cutting issues will be addressed, not by building wholly new organizations and expertise, but rather by reorganizing existing actors, often those active in Civil Society 2.0. Increasingly important are “systems leadership” and “collective impact,” facilitated by organizations (e.g. FSG and The Intersector Project) and embedded in “backbone organizations” (such as the Greater Cincinnati Foundation).

Level up to Systems: From Design to Game Design Thinking

Design thinking has enjoyed incredible success in the for-profit sector and is increasingly extended to products and services aimed at positive social impact. To tackle systemic and persistent “wicked” problems, however, design thinking will have to be enhanced with techniques from game design.

Where there are start-ups, there is design thinking, and the reason why is relatively clear. When developing a radically new product or service, there are a lot of risks and unknowns, which means even brilliant start-up ideas won’t make sense as an ongoing business concern, at least not as originally conceived.

Design thinking mitigates such risks, or at least increases the chances of revealing them before you’ve dropped millions of dollars in development. Empathizing with the user, maintaining a laser-like focus on his or her experience, and developing rough and ready prototypes for feedback often means that either an idea fails quickly, in which case you move on to the next one, or the final product or service offers exactly the features and experience that the user wants.

The design thinking approach can be contrasted with waterfall development, a model rooted in the age of mass manufacturing.

Waterfall Development (c/o Wikipedia)

The waterfall model was originally adopted for software development, and the reasoning was also fairly clear: it follows far more closely than design thinking our ideal of a rational course of action. Figure out what the end user wants up front and then figure out the most efficient way to get it to them. Easy peazy, lemon squeezy.

The hard part is when the user doesn’t know what he or she wants up front. Or thinks that she knows, but changes her mind once she starts using it. Doesn’t like this feature, would really like to have that one. Or even worse, maybe there is just something “off.” Here design becomes less a rational or linear process and more one of empathy, intuition, and experimentation, and this is where design thinking excels.

If we check out design thinking’s Wikipedia entry we see it is associated with addressing wicked problems, “a class of social system problems which are ill-formulated, where the information is confusing, where there are many clients and decision makers, and where the ramifications in the whole system are thoroughly confusing” (Rittel 1972). Wicked problems include persistent poverty, which has proven to be more than a match for efficient manufacturing processes, and many of our current environmental problems, in whose creation the successes of mass production played, and continue to play, their role.

But while design thinking has shaken up the hotel industry through Airbnb and IDEO, and the leader in the practice is now moving to disrupt the remarkably resilient consulting industry, we have yet to see exciting new businesses make progress on the problems we really care about, our persistent and evolving social and environmental problems. And while social impact accelerators and incubators exist and are producing all sorts of exciting companies and ideas, we have not yet seen social start-ups scale to make inroads on the pressing problems of our day. Even the most successful are currently nowhere near disrupting, or acting as a reasonable complement or alternative to, our current, highly inefficient and unsatisfactory political and policy making processes.

The reason is that design thinking was developed primarily for consumer products: it gives the consumer what he or she wants. But there is no single product or service to ensure that water is used in a socially equitable and sustainable way, or to eliminate poverty, i.e. the problems that Rittel found to be truly wicked. Fortunately there is an approach developed explicitly for wicked problems, and it has been used with success by a community of practitioners working in renewable resource management since 2000.

The approach, ComMod, has an orientation that will not be completely foreign to practitioners of design thinking. Empathy is inherent, as is prototyping. The difference is the type of client it is aimed at, and the type of design problems it aims to take on. It is geared towards solving problems in social and ecological systems by assisting with the design of institutions, and thus by necessity it is equipped to deal with a higher degree of complexity and uncertainty than design thinking, as well as to cope with very diverse values among the clients, i.e. the users and managers of the resource system. The client is not an individual consumer; the client is society, or rather the segment of society that is concerned about or involved with a particular wicked problem.

To deal with the complexity endemic to issues such as protecting biodiversity and promoting best practices of land and water management, ComMod puts researchers at the center of the process. Clear decisions on such issues are of little use if they do not have an adequate scientific and technical basis. But as these are wicked problems, and thus not clearly formulated, it is not clear from the outset where the researchers should devote their time. Different stakeholders will have different understandings of the underlying problem, as well as of what the solutions are likely to be, and thus different takes on which factual matters need further research and which are irrelevant. If the researcher simply begins research without consulting the stakeholders, her work is likely to reflect her disciplinary background and research interests, which may be academically interesting and useful. But if she consults the stakeholders she gets a problem similar to the one our design thinkers deal with: stakeholders will be unable to give a common or coherent view of what the goals and requirements of the research should be.

And this is where the game design and prototyping come in. Starting with one stakeholder group the researchers are able to get one perspective on the underlying problem, and possible solutions, and then are able to represent that understanding of the system of interest through a role playing game. This artifact, the game, can then be played by other stakeholders in the system and critiqued and iteratively improved. Stakeholders with diverse backgrounds are now working from a common object, understanding the assumptions and important facts of others, and seeing their own views incorporated and critiqued.

They also engage with and “play” with the problem and each other, which can allow them to temporarily put aside their own frame of reference, and thus better understand those of others and become more open to creative solutions. This avoids the situation often endemic to wicked problems, where stakeholders continuously argue past each other. It also looks a lot like the “second generation” model of planning advocated by Rittel and Webber, one that is an “argumentative process in the course of which an image of the problem and of the solution emerges gradually among the participants, as a product of incessant judgment, subjected to critical argument” (Rittel and Webber, 1973).

A ComMod river basin game (c/o Paolo Campo)

Prototyping isn’t everything. The final research product to support decision making comes from computer simulations based on the final prototype. But this research product is far more likely to be relevant to the important decisions at hand, as well as understood and accepted by the involved parties, because they played a role in creating it.
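
ComMod doesn’t prescribe any particular simulation platform, but to make the game-to-simulation step concrete, here is a purely illustrative toy: the rules of a hypothetical river basin role playing game (players withdraw water in upstream-to-downstream order) encoded as a minimal agent-based simulation. All names and numbers are invented and are not from any actual ComMod case.

```python
import random

random.seed(42)  # reproducible runs for discussion with stakeholders

class Farmer:
    """A player from the role playing game, encoded as a simulation agent."""
    def __init__(self, name, demand):
        self.name = name
        self.demand = demand   # water units wanted each season
        self.harvests = 0

    def withdraw(self, available):
        taken = min(self.demand, available)
        if taken >= self.demand:
            self.harvests += 1  # full demand met means a successful harvest
        return available - taken

def simulate(farmers, seasons=20):
    """Play out seasons of variable river flow; list order is upstream first."""
    for _ in range(seasons):
        flow = random.randint(8, 16)  # seasonal flow entering the basin
        for farmer in farmers:
            flow = farmer.withdraw(flow)
    return {f.name: f.harvests for f in farmers}

basin = [Farmer("upstream", 6), Farmer("midstream", 6), Farmer("downstream", 6)]
print(simulate(basin))
```

Even this toy reproduces the dynamic stakeholders experience around the game board: upstream agents reliably meet their demand while downstream agents bear the shortfall, which is exactly the kind of shared observation a ComMod group can then debate and redesign rules around.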

ComMod has been applied successfully in Africa, Asia, Europe, South America, and Oceania; it has been used for issues surrounding agriculture, biodiversity, water scarcity and flooding, livestock, fisheries, and forest management. That it is not more widely used and known is likely a function of language (ComMod stands for Companion Modelling, itself a somewhat awkward translation of La Modélisation Comme Outil D’Accompagnement) and of the fact that the practicing community has remained largely in academia. Both of these are changing, however, as new geographic settings bring new linguistic settings and as private practice increases. Lisode is a private firm that serves French language clients, and Sim4Act, which will bring ComMod to English language clients (full disclosure: I am a co-founder of Sim4Act), is currently in its start-up phase.

These are welcome developments, as the increased use of ComMod is important for communities facing traditional wicked and commons problems such as resource scarcity and ecosystem management, or ones trying to grapple with new problems such as climate change adaptation. ComMod promises to be an important tool in the hands of policymakers and citizens trying to deal with politically contentious risk management problems where science is necessary but not sufficient to drive decision making. It fits neatly with the recommendations the National Research Council (NRC) made in its report Understanding Risk: Informing Decisions in a Democratic Society, which drew on issues encountered by a number of U.S. entities, including the U.S. Departments of Defense, Health and Human Services, and Agriculture, the EPA, the U.S. Nuclear Regulatory Commission, the American Industrial Health Council, and the Electric Power Research Institute. ComMod fits the bill of the “analytic-deliberative” process that the NRC recommends, and offers an excellent alternative to cost benefit analysis, an approach the NRC openly warns against as bureaucratically convenient but often poor at informing the public.

In an era of persistent and multiplying wicked problems, it’s time to extend design thinking’s successes, built on empathy and prototyping, to our urgent social and environmental problems through game design thinking. It’s time to scale up from individual users to systems.

Rittel, Horst W. J. 1972a. On the Planning Crisis: Systems Analysis of the First and Second Generations. Reprinted from: Bedrifts Økonomen (Norway), No. 8, October, 1972. Reprint 107. Berkeley: University of California at Berkeley, Institute of Urban and Regional Development, as cited in Buchanan 1992

Rittel, Horst WJ, and Melvin M. Webber. “Dilemmas in a general theory of planning.” Policy sciences 4.2 (1973): 155-169.

 

Republican Smartness Stratego: Who Trumps Trump?

“I went to the Wharton School of Business. I’m, like, a really smart person.” -Donald Trump

So I’ve been trying to figure out how I measure up to the Republican front runner in terms of intelligence. I mean, I didn’t go to the Wharton School, because I didn’t study business, but I did go to UC Berkeley for undergrad, so that’s got to count for something, right? And from the few episodes of The Apprentice I watched, I gotta say his business acumen didn’t seem all that great; he mostly seemed to fire whoever took responsibility for their actions.

“If you’re so smart, why aren’t you rich?” you might ask me. And then maybe I’d point out that I’m not interested in money, that I could make money if I wanted to, and bring up the story about the Greek philosopher guy who rented all the olive presses just before a good harvest he’d predicted ’cause he was really smart, and thus proved that smart people can make money if they want to.

“Yeah, idiot snot lefty intellectuals like you got Obama elected and are ruining America. If we’d put Carly or the Donald in charge we’d have some real economic recovery!” you would reply right before you knocked me out cold with Jack Welch’s biography.

If I were still conscious I would have noted that Mr. Trump is #405 on Forbes list. And #1 on the list is Bill Gates. And that guy dropped out of college!

This is getting hard. So maybe let’s stick to rating the Republican candidates. We’ve got enough of ’em that we could populate an entire Stratego set and send it into battle against the clearly underpopulated Democrats! We just have to work out how to rank ’em. If we’re going to rank in terms of intellect, as the Donald does, here’s what I got:

#1 Carson (the marshal) - freakin’ neurosurgeon.

#2 Cruz (the general) - He knows he’s not going to win, but will get a load of publicity in the effort.

#3 Bush (the colonel) - He’s the smart brother.

#4 Rubio (the major) - Smart enough to know being the smart brother ain’t enough to wrap this one up.

#5 Fiorina (the captain) - Got fired and got $21 million. She ain’t stupid!

#6 Paul (the lieutenant) - Keeping the family biz alive with a no-hope run.

#7 Huckabee (the sergeant) - This is how you sell a book these days!

#8 Kasich (the miner) - The only one who had a comment that defused Trump during the Fox News debate.

#9 Graham, Santorum, Perry… (the scouts) - Cannon fodder.

#The Bomb - Trump! (But for which side?)

Illustrating With Keynote

Graphic Test- Naturally Driven Change

A natural driver of hydrological change.

I’ve been looking into visual ways to communicate my work and have been playing a bit in Apple’s Keynote, which has an easy to use and attractive set of shapes and colors. I’m kinda tickled with the result of my test and plan to work with Keynote a bit more. But I would certainly like to hear of any other cheap and easy to use illustration packages out there.

Managing Power Inequalities in Policy Making: An Encouraging Case

Those with the power make the rules, and with a few notable exceptions, the rules usually benefit those with the power. This is a general truth in human affairs.

For those of us who live in democratic societies, this often means living with the contradiction of nominal equality before the law and in political power (“one man, one vote”) versus reality: the few who wield economic power have a disproportionate voice in the political process through their lobbyists, and extra protection before the law with the help of their high-end lawyers.

Requiring that everyone have a seat at the table, and thus a voice in the process, is a common way to ensure that actions in the public arena do not neglect society’s less fortunate. But even if the less fortunate are able to make it to the table, a seat alone doesn’t make you an equal, and certainly doesn’t guarantee that your interests will be represented in any final decision. Power dynamics have a way of playing themselves out around a table, too.