A Coupled Infrastructure Perspective on Cycling

I recently saw the Dutch documentary Why We Cycle, which provides an integrated perspective on the role of cycling in Dutch society. Not only does cycling allow people to get from A to B in a densely populated country like the Netherlands, it also provides other benefits, such as cognitive and physical exercise, and it stimulates random encounters between people from different parts of society, reinforcing the egalitarian societal landscape. People meet and strike up a conversation while waiting for a traffic light to turn green.
Obviously the biophysical context, a flat country, enables the use of bikes (but note that it rains every other day, and the Dutch continue to use the bicycle even when it snows). The bicycle infrastructure in the Netherlands is excellent, while the dense occupation of space makes it impossible for people in cities to use cars conveniently. As such, the bicycle is also a cost-effective solution to mobility problems. And since the Prime Minister and the King also ride bicycles, there is a strong social norm to use the bicycle for short to medium-distance trips.

Learning how to learn

I just finished a great book by Barbara Oakley on learning how to learn. As I experience as a professor teaching courses that students consider difficult, many students do not know how to study effectively. Just reading books will not bring you far in learning mathematics. Dr. Oakley failed mathematics and science courses when she was in school, then learned Russian and became a translator in the army. Now she is a full professor of engineering at Oakland University. What happened? She learned how to learn effectively the material she was interested in.
An important lesson is to alternate between focused and unfocused attention. Your brain keeps working on problems when you are not concentrating on them. It is important to start studying early and make use of this background brain processing. Cramming for many hours at the last minute before an exam is very ineffective. And practice. Do your homework and practice. You don't acquire skills in sports or music by trying something once; you have to practice!
For a brief overview of her book, see her TED talk.

Bridge maintenance as a collective action problem

Collapsed bridge on the I-10

On July 19th a bridge collapsed on the I-10, the highway that connects Phoenix with Los Angeles. Heavy rain caused flash flooding, which eroded the eastbound bridge until it gave way. As a consequence the I-10, the main highway between Arizona and California, was closed for a week, and travelers had to drive a few hundred miles extra to reach their destinations.

Is this bridge collapse a rare accident, or can we expect more due to the increased intensity of rainfall events and the lack of maintenance of bridges (and infrastructure in general)?
This bridge collapse coincided with my reading of the book “Too Big to Fall” by Barry LePatner, a construction lawyer from New York City. LePatner discusses a number of cases in detail, such as the collapse of the I-35W bridge in Minneapolis, and provides a historical analysis of the incentives to build and maintain infrastructure. Unfortunately, the incentives are tailored to building new roads and bridges (where the costs are shared with the federal government), while maintenance tends to be postponed. Furthermore, there has been a lack of coordination on how inspections should be done. While there are now standard inspections, new technologies could be used to create smart infrastructure that provides more frequent information on key indicators of the structural health of bridges.
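To make the incentive problem concrete, here is a toy calculation (my own sketch, not from LePatner's book; the cost-share and dollar figures are hypothetical assumptions):

```python
# Toy illustration of why cost-sharing rules can bias a state toward new
# construction over maintenance. All numbers are hypothetical assumptions.

FEDERAL_SHARE_NEW_BUILD = 0.8    # assumed federal match on new construction
FEDERAL_SHARE_MAINTENANCE = 0.0  # assumed: maintenance paid by the state alone

def state_cost(total_cost, federal_share):
    """Out-of-pocket cost to the state after the federal match."""
    return total_cost * (1.0 - federal_share)

replace_bridge = 100.0  # total cost of a new bridge (millions, hypothetical)
maintain_bridge = 40.0  # total cost of maintaining the old one (hypothetical)

print(state_cost(replace_bridge, FEDERAL_SHARE_NEW_BUILD))     # 20.0
print(state_cost(maintain_bridge, FEDERAL_SHARE_MAINTENANCE))  # 40.0
# Maintenance is cheaper for society as a whole, yet the state pays twice as
# much out of pocket for it, so maintenance tends to be postponed.
```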

The book provides interesting material for starting to look at the provision of bridge services as a collective action problem, in which current perverse incentives lead to an underprovision of maintenance and inspections. Together with the increase in extreme weather events due to climate change, we can expect this to have major consequences for our society. Unfortunately these topics do not get the attention they deserve in political debates. Strategies have to be developed for coping with failing infrastructure and increased vulnerabilities.

Who rules the world?

The new book by Paul Steinberg, a professor of Political Science and Environmental Policy at Harvey Mudd College, is entitled Who Rules the Earth? The book is an engaging discussion of how rules rule our lives and our interaction with the environment. Contrary to what many people may assume, rules are constructed because individuals care about a problem and try to find solutions. For example, building codes have a potentially big environmental impact, and thus changing those rules or creating new standards (like LEED) can have a major effect. Based on personal observations and stories, Professor Steinberg shows us that rules are everywhere and are key to understanding how we can solve environmental challenges.
His students created a website where you can find animations, news updates and a game to immerse yourself even more in rules and the environment. Although rules and regulations may sound like a boring topic, Steinberg does a great job of making them engaging and raising awareness.

Eco: how to save the world?

In my previous post I lamented the lack of resource dynamics in (video) games. Some of you let me know about resource constraints in some games, but many of you recognized the lack of ecological realism. Now a new multi-player game has been announced: Eco, which will focus on the fragility of the world we live in. It is a survival game built around the resources you share with others. The ecosystem consists of predators and prey. Eat or be eaten. As human avatars you need to harvest resources to survive, but if you overuse the resources, you and the whole world will be affected. In fact, the developer mentions that the server might be wiped if the world collapses. This might sound dramatic, especially since many lifeforms would outlive humans, but it is an interesting option. No restart is possible, and there are no extra lives.
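To give a sense of the kind of dynamics such a game has to simulate, here is a minimal sketch of a predator-prey system with human harvesting (my own toy illustration with assumed parameter values, not Eco's actual engine):

```python
# Minimal predator-prey system with human harvesting (discrete-time
# Lotka-Volterra). A toy illustration; all parameter values are assumed.

def step(prey, predators, harvest):
    growth = 0.5       # prey birth rate
    predation = 0.02   # rate at which predators consume prey
    efficiency = 0.01  # conversion of eaten prey into new predators
    death = 0.3        # predator death rate
    new_prey = prey + growth * prey - predation * prey * predators - harvest
    new_predators = predators + efficiency * prey * predators - death * predators
    return max(new_prey, 0.0), max(new_predators, 0.0)

prey, predators = 100.0, 10.0
for t in range(50):
    prey, predators = step(prey, predators, harvest=20.0)
    if prey == 0.0:
        print(f"World collapsed at step {t}: prey hunted to extinction")
        break
```

At this harvest level the prey crashes within a handful of steps; dial the harvest down and the system persists. That is the trade-off Eco promises to put in players' hands.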
I wonder how the developers will deal with trolls, those gamers who purposely want to collapse the system. Anyway, check out the trailer and see whether this would be an interesting alternative to the usual robust game worlds.

Resource dynamics in video games

During the recent winter break some younger family members introduced me to Clash of Clans and Boom Beach, strategy games you can play on your iPad or other devices. You make investment decisions about defensive and offensive infrastructure as well as infrastructure to extract resources. For example, in Boom Beach (see figure below) you occupy an island and use gold and wood to build and support your army (to attack the islands of other players and rob their resources). There is a sawmill that generates construction material without reducing the amount of forest on the island. In fact, once you have collected sufficient gold from the unlimited gold resource, you can increase the capacity of the sawmill, which still does not affect the number of trees on the landscape.
So we can conclude that the game has no relevant renewable (let alone non-renewable) resource dynamics. Just invest in better technology to extract the resources and everything will be fine. Why does this bother me? I don't want to spoil the game. In fact, these games are entertaining and addictive, which is exactly why millions of people play them. But the lack of relevant resource dynamics affects people's perception of how to solve resource problems. I understand that it might be more challenging to develop a game with limited resources that still attracts millions of players (everyone wants to grow their army to stay in the game). On the other hand, it is like assuming gravity does not exist for the convenience of the game dynamics.
It might be an interesting challenge for the gaming industry to try to capture relevant resource dynamics, such that people not only learn to develop complex strategies to combat other armies but also gain a better understanding of the trade-off between the short-term benefits of resource extraction and the long-term consequences for a livable planet.
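The missing ingredient is not complicated to model. Here is a sketch of what a sawmill with real resource dynamics could look like, using logistic regrowth (all parameter values are my own assumptions, not taken from any existing game):

```python
# A forest with logistic regrowth and a sawmill whose harvest capacity can be
# upgraded. A toy sketch; all parameters are assumed for illustration.

def regrow(forest, capacity=1000.0, rate=0.1):
    """Logistic regrowth: fast for a healthy forest, slow for a depleted one."""
    return forest + rate * forest * (1.0 - forest / capacity)

forest = 500.0
sawmill = 20.0  # wood harvested per tick; sustainable at this level
for tick in range(200):
    forest = regrow(forest - min(sawmill, forest))
    if tick == 50:
        sawmill = 40.0  # the player upgrades the sawmill
    if forest <= 1.0:
        print(f"Forest collapsed at tick {tick}")
        break
```

With the maximum sustainable yield at rate * capacity / 4 = 25 wood per tick, harvesting 20 can go on forever, but the upgrade to 40 sends the forest into collapse: exactly the short-term gain versus long-term consequence that the current games leave out.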

Archiving practice for model code of agent-based models

There is increasing concern over the repeatability and reproducibility of computational science (see also here, here, here, here and here). If the computational scientific enterprise wants to be cumulative, more transparency is required, including the archiving of computer code in public repositories. This also holds for agent-based modeling, an increasingly popular methodology in the social and life sciences.

I show here some initial results of an analysis of the practice of archiving agent-based models. Five journals were selected that regularly publish research using agent-based models: Advances in Complex Systems, Computational and Mathematical Organization Theory, Ecological Modelling, Environmental Modeling and Software, and the Journal of Artificial Societies and Social Simulation. Using the ISI Web of Science, we searched for all articles in those five journals in the years 2010 to 2014 using the search term “agent-based model*”. This resulted in 255 articles on September 5, 2014, of which 56 were disregarded since they did not discuss an agent-based model itself.

Percentage of archived models per year

Out of the 199 remaining articles, 135 were found not to provide the computational model's source code. 21 articles referred to an institutional or individual homepage; in 5 of these cases the link resulted in a 404 not found error and we recorded the code as not available. In 17 cases the code was included as an electronic appendix of the journal. Only 31 articles provided the model code in a public archive, of which 26 were stored in the CoMSES Computational Model Library. The other 5 models were archived in repositories such as Bitbucket, GitHub, Google Code, SourceForge and the NetLogo community models site.
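For transparency, the headline shares behind the figures follow directly from these counts (a quick tally; the numbers are the ones reported above):

```python
# The headline shares follow directly from the counts reported above.
articles = 199        # articles that discuss an agent-based model
public_archive = 31   # model code deposited in a public archive
comses = 26           # of which in the CoMSES Computational Model Library

print(f"publicly archived: {100.0 * public_archive / articles:.1f}%")            # ~15.6%
print(f"CoMSES share of archived models: {100.0 * comses / public_archive:.1f}%") # ~83.9%
```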

Percentage of archived models per journal

Over the years there has been improvement in model archiving. In 2010, 75% of the models were not archived. The increasing availability of public archives has enabled authors to archive their models more frequently, and in 2014 50% of the models were archived. The majority of those models are archived at OpenABM. Still, as we can see, most models are not archived. One journal has championed model archiving, with more than 50% of its publications associated with a publicly archived model, whereas the other journals have archiving percentages between 10% and 20%.

Since most research is sponsored with tax money, sponsors sometimes explicitly require that the data, including software code, be made publicly available. We find that papers from the two main sponsors (16 funded by the European Commission and 21 by the National Science Foundation) show low compliance with these best practices. In both cases we find that only 15% of the models are available in public archives, significantly lower than for articles that do not list a sponsor (29%) or list other sponsors (24%).

Currently the scope of the analysis is being extended to about 3000 articles (using the search term “agent-based model*” without restrictions on years and journals). Besides getting a better picture of current archiving practices, we also hope this activity leads to more awareness of the problem and of the need for journals to strengthen their requirements for archiving code and documentation in public repositories.

Open Science: science for the 21st century experiences roadblocks from 19th-century incentive structures

The book Reinventing Discovery by Michael Nielsen is a joy to read. Technological development enables the production of knowledge by large groups of people making small contributions, instead of the isolated inventions of 19th-century science. Nielsen discusses examples like the Polymath Project, where mathematicians – from Fields Medalists to high school students – collaborate to solve problems, Galaxy Zoo (classifying galaxies) and Foldit (folding proteins). These success stories brought large numbers of people together – often non-specialists – to solve problems faster and better than individuals could. Key components for success are the ability to split the problem into smaller modules, and clear measures of performance (such as getting a score when folding proteins).
So citizens are starting to solve science problems, but a harder problem is to get scientists to do research in the open, open science, which enables others to build on it. The incentive structures in science are perverse for stimulating discovery. In fact, we use incentive structures from the 19th century, ignoring the potential increase in knowledge production if we used a 21st-century approach. The networked open science Nielsen aspires to faces major challenges because scientists have incentives to publish results in high-profile journals but get no recognition for sharing data and/or computer code, which are often the main outcomes of a project.
As we experience in the development of OpenABM, scientists like to download the models of others but are reluctant to archive their own work. Furthermore, journals are reluctant to raise their standards of transparency, and sponsors like the National Science Foundation require data management plans but do not invest sufficiently in the cyberinfrastructure to make this possible.
Although there is self-organization at small scales in the science community, the sponsors need to step up and invest seriously to make a science for the 21st century possible.

The re-emergence of the roving bandit

Mancur Olson introduced the concepts of roving and stationary bandits to explain why dictators – stationary bandits – have a self-interest in keeping the society they rule productive, in order to maximize the rent they can collect from the population. This is in contrast to roving bandits, who have no incentive to provide protection to the population or to keep the land productive. Roving bandits just plunder and steal; a stationary bandit uses taxation.
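Olson's logic can be put in a toy model (my own sketch, not Olson's formal treatment): announced taxation discourages production, so a bandit who stays and taxes the same population year after year prefers a moderate rate, while a bandit who plunders once and moves on takes everything.

```python
# Toy illustration of Olson's stationary vs roving bandit logic (my sketch).
# Assumption: the higher the announced tax rate t, the less people produce.

def output(t):
    """Yearly production given an anticipated tax rate t in [0, 1]."""
    return 100.0 * (1.0 - t)

def yearly_revenue(t):
    return t * output(t)

# The stationary bandit announces a rate and collects year after year, so he
# picks the rate that maximizes t * output(t); on this grid that is t = 0.5.
best_t = max((t / 100.0 for t in range(101)), key=yearly_revenue)

# The roving bandit plunders unannounced (production was made expecting no
# tax) and takes everything, then moves on: a one-time haul of output(0).
one_time_haul = output(0.0)

years = 5
print(f"stationary bandit: t = {best_t:.2f}, {years}-year take = {years * yearly_revenue(best_t):.0f}")
print(f"roving bandit: one-time haul = {one_time_haul:.0f}")
# Over five years the stationary bandit collects 125 against the rover's 100,
# which is why he has a stake in keeping the society productive.
```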
With the development of nation states during the last few centuries, we seemed to have entered a period without roving bandits. Some countries have democratic systems, others autocratic ones. But in recent years we seem to be observing the re-emergence of roving bandits, as a nasty side effect of panacea thinking about the benefits of market solutions and democratic systems. Note that democracy is typically portrayed as voting, not as the involvement of people in decision making, which is the key to democracy according to Vincent Ostrom.
We have multinational companies that move around to avoid taxation and exploit natural resources. When resources are depleted, they can move on to other countries. Whether it is shrimp farming in rice fields in Southeast Asia, gold mines in Africa, or soybean production in Latin America, the multinational companies lack the incentives to care about the long-term well-being of the people, and local governments are often too weak to implement and enforce regulations that reduce the negative impacts.

Former stationary bandits

At a different scale, we see roving bandits emerging in North Africa and the Middle East. The so-called Arab Spring led to the removal of some dictators who oppressed a large part of their population. Unfortunately we now learn that those dictators provided some security compared to the anarchy currently ruling in ‘countries’ like Libya, Syria and Iraq. Those dictators were able to suppress the violence between the different ethnic and religious groups within their countries. These oppressive regimes favored their own ethnic and religious groups, but compared to the current anarchy, we may wonder whether everyone was not better off under them.

Douglass North and his colleagues published a book in 2009 on the difficulty societies have in transitioning from controlling social order via violence to controlling it via votes and democratic institutions. Those transitions have been rare, have taken a long time, and are embedded in a long history of developing the social norms that support democratic institutions. As we have seen in recent years, removing dictators is not by itself a way to establish a less violent and more prosperous society. I don't have a solution to this problem, but we should at least learn from history and avoid creating more anarchy, as has happened in recent years.

The biology of trust

The book “The Moral Molecule” by Paul Zak is a great discussion of the research on the relation between oxytocin and decision making. Using the trust game and measurements of oxytocin in the blood before and after decisions are made, Dr. Zak shows that a larger increase in oxytocin correlates with higher levels of trust. When oxytocin levels are artificially increased, participants raise their level of cooperation in experimental games. Dr. Zak also did experiments with non-traditional groups such as people who have been abused, religious groups, and tribes in Papua New Guinea.
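For readers unfamiliar with the trust game: an investor can send part of an endowment to a trustee, the experimenter triples the transfer, and the trustee decides how much to return. A minimal sketch of the payoffs (the $10 endowment and 3x multiplier are the common convention in the literature, not necessarily Zak's exact design):

```python
# Minimal sketch of trust game payoffs. The $10 endowment and 3x multiplier
# are the common convention in the literature, not necessarily Zak's exact
# design; the trustee's own endowment is ignored for simplicity.

def trust_game(endowment, sent, returned_fraction):
    """Payoffs for investor and trustee after one round of the trust game."""
    assert 0 <= sent <= endowment
    tripled = 3 * sent                      # the experimenter triples the transfer
    returned = returned_fraction * tripled  # the trustee decides how much to return
    investor = endowment - sent + returned
    trustee = tripled - returned
    return investor, trustee

# Trusting only pays off for the investor if the trustee reciprocates:
print(trust_game(endowment=10, sent=10, returned_fraction=0.5))  # (15.0, 15.0)
print(trust_game(endowment=10, sent=10, returned_fraction=0.0))  # (0.0, 30.0)
print(trust_game(endowment=10, sent=0, returned_fraction=0.0))   # (10, 0)
```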
The result is a fascinating discussion of how hormones like oxytocin and testosterone affect decision making. These hormone levels are shaped by genetics and social context, and they change over the course of our lives. A better understanding of how hormone levels (raised, for example, by the frequent hugging practiced in Dr. Zak's lab) affect decision making has consequences for how we organize our social interactions. Although new technologies lead to “social snacking”, they do not replace the biological responses to physical and face-to-face contact with other people. If we want to cope with increasingly complex collective action problems at higher levels of scale, we need to take the biological context of our decision making into account, alongside the efficiency of information exchange offered by new technologies.