- Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE): a Complexity Science-Based Investigation into the Smart Grid Concept http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G059969/1
- Preventing wide-area blackouts through adaptive islanding of transmission networks http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G059101/1 and http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G060169/1
- SCALE (SMALL CHANGES LEAD TO LARGE EFFECTS): Changing Energy Costs in Transport and Location Policy http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G057737/1
- Future Energy Decision Making for Cities - Can Complexity Science Rise to the Challenge? http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G05956X/1 and http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G059780/1
Tuesday, June 09, 2009
Energy Challenges for Complexity Science
Wednesday, May 27, 2009
Improvise Visualisation: Visual analysis of simulations

This can be an interesting way to 'understand' the relationship between the parameters and solution spaces of the simulations, including agent-based simulations.
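For instance, one simple way to prepare a simulation for this kind of visual analysis is to sweep its parameter space and write parameter-outcome pairs to a file that the visualisation tool can then link and brush. A minimal sketch (the model and its parameters are invented placeholders):

import java.io.FileWriter;
import java.io.IOException;

// Sweeps two hypothetical model parameters and records an outcome per run,
// producing a CSV that visualisation tools can load for parameter-space analysis.
public class ParameterSweep {
    // Stand-in for a real simulation: returns some aggregate outcome.
    static double runModel(double birthRate, double deathRate) {
        return birthRate / (birthRate + deathRate); // placeholder outcome
    }

    public static void main(String[] args) throws IOException {
        try (FileWriter out = new FileWriter("sweep.csv")) {
            out.write("birthRate,deathRate,outcome\n");
            for (int bi = 1; bi <= 10; bi++) {
                for (int di = 1; di <= 10; di++) {
                    double b = bi / 10.0, d = di / 10.0;
                    out.write(b + "," + d + "," + runModel(b, d) + "\n");
                }
            }
        }
    }
}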
Vlasios
Thursday, May 21, 2009
Energy & Spatial Agent-based models

Currently, a group of colleagues (CIBS and STORM) at LondonMet is working on developing a spatial agent-based model of the global energy system using the R statistical package and Repast Simphony (I will post more information about it over the next six months or so).
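To give a flavour of what such a model involves, here is a schematic sketch of my own (it is not the CIBS/STORM model itself; all names and numbers are invented): producer and consumer agents sit on a lattice, and producers adjust output towards demand each tick.

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Schematic spatial energy model (illustrative only): producer and consumer
// agents on a lattice; each tick producers adjust output towards total demand.
public class EnergyModelSketch {
    static class Agent {
        int x, y; boolean producer; double level;
        Agent(int x, int y, boolean producer, double level) {
            this.x = x; this.y = y; this.producer = producer; this.level = level;
        }
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        List<Agent> agents = new ArrayList<>();
        for (int i = 0; i < 100; i++)
            agents.add(new Agent(rng.nextInt(20), rng.nextInt(20),
                                 rng.nextBoolean(), rng.nextDouble()));

        for (int tick = 0; tick < 20; tick++) {
            double demand = 0, supply = 0; int producers = 0;
            for (Agent a : agents) {
                if (a.producer) { supply += a.level; producers++; }
                else demand += a.level;
            }
            // Producers move output 10% of the way towards their share of demand.
            for (Agent a : agents)
                if (a.producer) a.level += 0.1 * (demand / producers - a.level);
            System.out.printf("tick %2d: demand %.1f supply %.1f%n", tick, demand, supply);
        }
    }
}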
In the meantime, I have found some sources that can be helpful to people who develop agent-based models of the energy or electricity system:
1) Agent-Based Models of Energy Investment Decisions by Tobias Wittmann. Abstract:
At the start of the 21st century, societies face the challenge of securing an efficient and environmentally sound supply of energy for present and future generations. Sector deregulation, the emergence of novel distributed technologies, firms focusing on these new options and competing in selected markets, and the requirements to reduce energy-related greenhouse gas emissions might change the structure of energy systems significantly. Densely populated urban areas, which allow for the operation of sophisticated energy infrastructures, are the most likely to see essential changes in their energy infrastructure.
This book develops a new model to study the development of urban energy systems. It combines a technical, highly resolved energy system model with an agent-based approach. The technical, highly resolved energy model is used to simulate the operation of technologies. Different agents are developed to capture the investment decisions of actors. Two classes of actors are distinguished: private and commercial actors. The decisions of private actors are modeled using a bounded-rational decision model which can be parameterized by socio-demographic surveys. The decisions of commercial actors are approached with a rational choice model, but taking into account different perspectives of firms with regard to future market developments.
A proof-of-concept implementation demonstrates the potential of the developed approach. Diffusion curves for conversion technologies and efficiency upgrades in the residential sector were obtained and the overall energy savings were calculated. Further, the impact of firms' competition on diffusion curves could be estimated and different business models were tested.
2) Achieving A Sustainable Global Energy System: Identifying Possibilities Using Long-Term Energy Scenarios by Asami Miketa (Author), Keywan Riahi (Author), Richard Alexander Roehrl (Author), Leo Schrattenholzer (Editor). Abstract:
Sustainable development and global climate change have figured prominently in scientific analysis and international policymaking since the early 1990s. This book formulates technology strategies that will lead to environmentally sustainable energy systems, based on an analysis of global climate change issues using the concept of sustainable development. The authors focus on environmentally compatible, long-term technology developments within the global energy system, while also considering aspects of economic and social sustainability. The authors analyze a large number of alternative scenarios and illustrate the differences between those that meet the criteria for sustainable development and those that do not. As a result of their analysis, they identify a variety of promising socio-economic and environmental development paths that are consistent with sustainable development. One sustainable-development scenario and its policy implications are then presented in detail from a technology change perspective. The authors propose ambitious targets for technology adoption that are judged to achieve the desired socio-economic and environmental goals. Although the optimal policy mix to pursue these targets is clearly country-specific, the authors suggest that energy-related R&D that leads to technology performance improvements and the promotion of technology adoption in niche markets are the policy options which will yield the most significant long-term benefits. Policymakers, economists and researchers working on sustainability, energy economics, and technology change and innovation will welcome this topical and highly readable book.
3) The AMES Wholesale Power Market Test Bed by Hongyan Li, Junjie Sun, and Leigh Tesfatsion. Details: http://www.econ.iastate.edu/tesfatsi/AMESMarketHome.htm.
4) Electricity Market Complex Adaptive System (EMCAS) by Argonne. Details: http://www.dis.anl.gov/projects/emcas.html.
Vlasios
Thursday, April 02, 2009
Economics and Computation (by Kenneth L. Judd)
"
Dear Colleagues,
As you all know, the so-called leaders of academic economics have little respect for efforts to bring modern numerical and computational methods to economics. I have created a website that discusses and documents my experiences, particularly with journals. I have no illusions about the likelihood of this changing their behavior, but it does clearly show their attitude. It may also help you deal with colleagues who similarly oppose building computational expertise in economics and inflate the value of publications in particular journals.
The website is at
http://sites.google.com/site/economicsandcomputation/
I know that some will not be comfortable with this confrontational approach. In my opinion, this is appropriate given the insulting and hostile treatment that computational economists frequently experience.
Ken
"
Saturday, February 07, 2009
Econophysics: Economics needs a scientific revolution

The following interesting essay is from the Econophysics Forum. It is written by Jean-Philippe Bouchaud, who is head of research at Capital Fund Management and a physics professor at École Polytechnique in France. See, however, the essay by Jesper Stage, "Speaking up for economic-sciences modelling" (Nature 456, 570).
Compared to physics, it seems fair to say that the quantitative success of the economic sciences is disappointing. Rockets fly to the moon, energy is extracted from minute changes of atomic mass without major havoc, global positioning satellites help millions of people to find their way home. What is the flagship achievement of economics, apart from its recurrent inability to predict and avert crises, including the current worldwide credit crunch?
Why is this so? Of course, modelling the madness of people is more difficult than the motion of planets, as Newton once said. But the goal here is to describe the behaviour of large populations, for which statistical regularities should emerge, just as the law of ideal gases emerges from the incredibly chaotic motion of individual molecules. To me, the crucial difference between physical sciences and economics or financial mathematics is rather the relative role of concepts, equations and empirical data. Classical economics is built on very strong assumptions that quickly become axioms: the rationality of economic agents, the invisible hand and market efficiency, etc. An economist once told me, to my bewilderment: "These concepts are so strong that they supersede any empirical observation." As Robert Nelson argued in his book, Economics as Religion, the marketplace has been deified.
Physicists, on the other hand, have learned to be suspicious of axioms and models. If empirical observation is incompatible with the model, the model must be trashed or amended, even if it is conceptually beautiful or mathematically convenient. So many accepted ideas have been proven wrong in the history of physics that physicists have grown to be critical and queasy about their own models. Unfortunately, such healthy scientific revolutions have not yet taken hold in economics, where ideas have solidified into dogmas that obsess academics as well as decision-makers high up in government agencies and financial institutions. These dogmas are perpetuated through the education system: teaching reality, with all its subtleties and exceptions, is much harder than teaching a beautiful, consistent formula. Students do not question theorems they can use without thinking. Though scores of physicists have been recruited by financial institutions over the last few decades, these physicists seem to have forgotten the methodology of natural sciences as they absorbed and regurgitated the existing economic lore, with no time or liberty to question its foundations.
The supposed omniscience and perfect efficacy of a free market stems from economic work in the 50s and 60s, which with hindsight looks more like propaganda against communism than a plausible scientific description. In reality, markets are not efficient, humans tend to be over-focused in the short-term and blind in the long-term, and errors get amplified through social pressure and herding, ultimately leading to collective irrationality, panic and crashes. Free markets are wild markets. It is foolish to believe that the market can impose its own self-discipline, as was promoted by the US Securities and Exchange Commission in 2004 when it allowed banks to pile up new debt.
Reliance on models based on incorrect axioms has clear and large effects. The Black-Scholes model was invented in 1973 to price options assuming that price changes have a Gaussian distribution, i.e. the probability of extreme events is deemed negligible. Twenty years ago, unwarranted use of the model to hedge the downfall risk on stock markets spiraled into the October 1987 crash: a -23% drop in a single day, dwarfing the recent hiccups of the markets. Ironically, it is the very use of the crash-free Black-Scholes model that destabilized the market! This time around, the problem lay in part in the development of structured financial products that packaged sub-prime risk into seemingly respectable high-yield investments. The models used to price them were fundamentally flawed: they underestimated the probability that multiple borrowers would default on their loans simultaneously. In other words, these models again neglected the very possibility of a global crisis, even as they contributed to triggering one. The financial engineers who developed these models did not even realize that they helped the credit mongers of the financial industry to smuggle their products worldwide; they were not trained to decipher what their assumptions really meant.
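A back-of-the-envelope calculation shows just how negligible "negligible" is here. Assuming a daily volatility of roughly 1% (my assumption, for illustration; it is not stated in the essay), a one-day drop of 23% is a 23-sigma event, whose probability under a Gaussian can be computed from the standard asymptotic tail bound:

// Upper-tail probability of a standard normal, P(X > x), for large x,
// via the asymptotic approximation P(X > x) ~ phi(x)/x. Returned in log10
// because the probability itself underflows a double at 23 sigma.
public class GaussianTail {
    static double log10TailProb(double x) {
        double ln = -0.5 * x * x - Math.log(x * Math.sqrt(2 * Math.PI));
        return ln / Math.log(10);
    }

    public static void main(String[] args) {
        // A -23% day with ~1% daily volatility is roughly a 23-sigma move.
        System.out.printf("log10 P = %.0f%n", log10TailProb(23.0)); // about -117
    }
}

That is a probability of roughly one in 10^117: an event that a Gaussian world says should never occur in the lifetime of the universe, yet it happened. Fat-tailed distributions assign such moves small but non-absurd probabilities.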
Surprisingly, there is no framework in classical economics to understand "wild" markets, even though their existence is so obvious to the layman. Physics, on the other hand, has developed several models allowing one to understand how small perturbations can lead to wild effects. The theory of complexity, developed in the physics literature over the last thirty years, shows that although a system may have an optimum state (such as a state of lowest energy), it is sometimes so hard to identify that the system in fact never settles there. This optimal solution is not only elusive, it is also hyper-fragile to small changes in the environment, and therefore often irrelevant to understanding what is going on. There are good reasons to believe that this complexity paradigm should apply to economic systems in general and financial markets in particular. Simple ideas of equilibrium and linearity (the assumption that small actions produce small effects) do not work. We need to break away from classical economics and develop altogether new tools, as attempted in a still patchy and disorganized way by "behavioral" economists and "econophysicists". But their fringe endeavour is not taken seriously by mainstream economics.
While work is done to improve models, regulation also needs to improve. Innovations in financial products should be scrutinized, crash-tested against extreme scenarios and approved by independent agencies, just as we have done with other potentially lethal industries (chemical, pharmaceutical, aerospace, nuclear energy, etc.). In view of the present mayhem spilling over from the financial industry into everyday life, a parallel with these other dangerous human activities seems relevant.
Most of all, there is a crucial need to change the mindset of those working in economics and financial engineering. They need to move away from what Richard Feynman called Cargo Cult Science: a science that follows all the apparent precepts and forms of scientific investigation, while still missing something essential. An overly formal and dogmatic education in the economic sciences and financial mathematics is part of the problem. Economic curriculums need to include more natural science. The prerequisites for more stability in the long run are the development of a more pragmatic and realistic representation of what is going on in financial markets, and a focus on data, which should always supersede perfect equations and aesthetic axioms.
An edited version of this essay appeared in Nature (455, 1181, 30 October 2008)
Monday, February 02, 2009
Emergent Macroeconomics: An agent-based approach to business fluctuations

Although I have not finished the book yet, I have found it really interesting. It gives good examples of the interactions between microeconomics and macroeconomics. It achieves this by using agent-based models to establish sound microfoundations for macroeconomics.
Abstract (From the Publisher): This book contributes substantively to the current state-of-the-art of macroeconomics by providing a method for building models in which business cycles and economic growth emerge from the interactions of a large number of heterogeneous agents. Drawing from recent advances in agent-based computational modeling, the authors show how insights from dispersed fields like the microeconomics of capital market imperfections, industrial dynamics and the theory of stochastic processes can be fruitfully combined to improve our understanding of macroeconomic dynamics.
A copy can be downloaded from: http://www.dea.unian.it/gallegati/Emergent_Macroeconomics.pdf
Wednesday, January 14, 2009
Agent-based models (new book by Nigel Gilbert)
I found chapter 3 (using agent-based models in social science) really useful as it gives practical steps in developing an agent-based model. Chapter 2 is also important as it discusses explicitly the concept of time (as well as the concepts of agents and environment). Although he discusses time briefly, his treatment is important because the topic is usually underestimated. Within the GIScience literature, Donna Peuquet's book (Representation of Space and Time) provides a comprehensive discussion that may benefit the development of agent-based models.
My personal view is that time has been trivialized in most of the agent-based models that I know. Time should be added to the list of significant research areas (e.g., space/GIS, Learning and Simulation of Language - chapter 5) that agent-based modellers need to address.
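To make the point about time concrete, contrast the ubiquitous fixed-tick loop with discrete-event scheduling, in which each agent acts at its own, possibly irregular, moment. A minimal sketch of the latter (the agent names are invented; this is not tied to any particular toolkit):

import java.util.PriorityQueue;

// Discrete-event scheduling: events fire in time order rather than in
// uniform ticks, so agents can act at heterogeneous, irregular intervals.
public class EventClockSketch {
    record Event(double time, String agent) {}

    public static void main(String[] args) {
        PriorityQueue<Event> queue =
            new PriorityQueue<>((a, b) -> Double.compare(a.time(), b.time()));
        queue.add(new Event(0.7, "consumer-1"));
        queue.add(new Event(0.2, "producer-A"));
        queue.add(new Event(1.5, "consumer-2"));

        while (!queue.isEmpty()) {
            Event e = queue.poll();
            System.out.printf("t=%.1f: %s acts%n", e.time(), e.agent());
            // An agent would typically reschedule itself here, e.g.
            // queue.add(new Event(e.time() + nextDelay(), e.agent()));
        }
    }
}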
Wednesday, October 29, 2008
PhD thesis of the Object-Field Model

The full PhD of the Object-Field Model can be accessed from http://vega.soi.city.ac.uk/~fd776/phd/PhD_VoudourisV.pdf OR http://ssrn.com/abstract=1292262
ABSTRACT
The need for a conceptually unifying data model for the representation of geospatial phenomena has already been acknowledged. Recognising that the data model employed by and large determines what can be done by way of analysis, and the methods by which the analysis can be undertaken, there has been some activity in developing unifying data models for geospatial representation in digital form. Some successes have been reported. Nevertheless, progress has been slow, especially at the conceptual and logical levels of abstraction of geospatial data models.
Concepts and ideas from cognitive and perceptual psychology as well as GIScience and GISystems literature are examined within the context of geospatial data modelling and reasoning. Drawing on and combining these concepts, ideas and successes with an empirical approach which proposes generalities by induction, this thesis suggests the fused Object-Field model with uncertainty and semantics at the conceptual, logical and physical levels of abstraction. The logical level has been formalised in the Unified Modelling Language (UML) class diagram and the physical level has been implemented in Java programming language.
The purpose of the Object-Field model is to better support the representation and reasoning of geospatial phenomena, particularly indeterminate phenomena such as town centres and land cover changes. It is shown that many of the concepts required to better represent geospatial phenomena can be derived from a single foundation that is termed the elementary-geoParticle which is regarded as indivisible, has no parts and serves as the standard for integrating the dual continuous-field and discrete-object conceptualisations by means of aggregation. A second concept is introduced, termed Traditional Scientific and Concept spaces of the Object-Field model and shown to provide a useful foundation for collaborative reasoning. The traditional scientific space is a mathematical representation of observational data and the concept space is a representation of conceptualisations, meanings and interpretations of the traditional scientific space. A third concept is also introduced, termed the Hierarchical Uncertainty and Semantic components of the Object-Field model that ‘populates’ the concept space with variable levels of uncertainty and semantics. Sketching is also suggested as a way to represent, record and manage conceptualisation uncertainty as it is an element of uncertainty that is frequently overlooked, yet has a significant impact in the way in which subjects understand and use geospatial data. Given that conceptualisation uncertainty is a subjective process that varies between individuals, this form of uncertainty has particular importance in the collaborative decision-making of indeterminate phenomena.
This thesis constructs technical and theoretical scientific knowledge for the design and development of geospatial models that aim to support the human decision-making process of indeterminate phenomena by means of multiple conceptualisations and interpretations. The theoretical knowledge is embodied in the UML formalization of the Object-Field model and the technical knowledge is embodied in the Object-Field GISystems Prototype.
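As a rough illustration of the aggregation idea (my own sketch, not the thesis's actual Java prototype), elementary-geoParticles can be aggregated either into a continuous-field view or into a discrete-object view of the same phenomenon:

import java.util.List;

// Illustrative sketch: elementary-geoParticles as the single foundation,
// aggregated either into a continuous-field view or a discrete-object view.
public class ObjectFieldSketch {
    // Indivisible carrier of location, attribute value, and uncertainty.
    record GeoParticle(double x, double y, double value, double uncertainty) {}

    // Field view: estimate the attribute at any point from nearby particles
    // (inverse-distance weighting, purely for illustration).
    static double fieldValue(List<GeoParticle> ps, double x, double y) {
        double num = 0, den = 0;
        for (GeoParticle p : ps) {
            double d = Math.hypot(p.x() - x, p.y() - y) + 1e-9;
            num += p.value() / d;
            den += 1.0 / d;
        }
        return num / den;
    }

    // Object view: an aggregation of particles whose value passes a threshold,
    // e.g. a 'town centre' delineated from a continuous surface.
    static List<GeoParticle> object(List<GeoParticle> ps, double threshold) {
        return ps.stream().filter(p -> p.value() >= threshold).toList();
    }

    public static void main(String[] args) {
        List<GeoParticle> ps = List.of(
            new GeoParticle(0, 0, 1.0, 0.1),
            new GeoParticle(1, 0, 0.4, 0.2),
            new GeoParticle(0, 1, 0.8, 0.1));
        System.out.println(fieldValue(ps, 0.5, 0.5));
        System.out.println(object(ps, 0.5).size());
    }
}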
Friday, October 17, 2008
The Object-Field model with Uncertainty and Semantics
Sunday, October 12, 2008
Understanding Anasazi using Agent-Based Modelling
More information is included in chapters 4-6 of the Generative Social Science book by Epstein and others. A Java-based application of the Anasazi can be accessed from http://ascape.sourceforge.net/
Friday, October 10, 2008
SimCity Creator for Wii
I wonder whether these games will help in raising awareness of the complexity of spatial modelling (as Google Earth/Maps seems to do) or whether they will trivialize this complexity by oversimplifying it.
See also: http://www.freehand.co.uk/games/seera/ by SOUTH EAST ENGLAND REGIONAL ASSEMBLY
Thursday, September 11, 2008
UNDERSTANDING SUSTAINABLE DEVELOPMENT AND URBAN ECONOMIC GROWTH: EXPLORATIONS WITH AN AGENT-BASED LABORATORY

I am currently working on the idea that there is no conflict between the concepts of sustainable development and economic growth, as some people suggest.
If we accept that 'sustainable development' is a catch-all term for intergenerational welfare, then optimal allocation of the capital stock can support both sustainability and economic growth. In this case, the capital stock includes both environmental capital (such as fossil fuels and clean water) and man-made capital (such as schools and hospitals).
If the above is accepted, then what is the difference between economic theories of welfare maximisation and sustainable development? I try to answer this question within a triad conceptual framework of economy, effectiveness and efficiency, and a spatial agent-based computational laboratory.
But what is 'sustainable development'? WCED (1987) and UK Defra (2008) define sustainable development as “development that meets the needs of the present without compromising the ability of future generations to meet their own needs”. Although this definition seems clear, it does not provide a conceptual basis for measuring sustainable development in a systematic way (Beckerman 2003). For example, the intra-generational needs of people coevolve in space and time without necessarily satisfying all present needs at any point in space and time.
Pezzey (1992) argues that sustainability is related to measures that sustain an improvement in the quality of life, a view also supported by Faucheux et al (1996), who emphasise the need for intergenerational equity in the context of non-negative change in economic welfare per capita. These definitions signal a shift in defining sustainability by promoting the concept of 'welfare' as an all-embracing central variable, as argued above. Beckerman (2003) argues that since the whole problem is the selection of means towards 'sustainability' and the assessment of these means, the concept of sustainable development has nothing to add to (if it does not subtract from) the classical economic objective of welfare maximisation, because of the precautionary principle.
I try to critically analyze these premises within the 'triad framework' (economy, efficiency and effectiveness). Based on this triad framework, I propose ways to measure sustainable development in the context of urban economic growth using a spatial agent-based computational laboratory. I view this laboratory as a step forward in addressing what the UK DOE (1996) argues: it is not clear what sustainable development means, thus it is difficult to know how to measure it or which policies promote it.
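One minimal way to formalise 'welfare as the central variable', in my own illustrative notation rather than that of the cited authors, is the standard intergenerational welfare problem with the capital stock split as above:

\max_{\{c_t\}} \; \sum_{t=0}^{\infty} \beta^t \, U(c_t)
\quad \text{subject to} \quad
K_{t+1} = (1-\delta) K_t + f(K_t) - c_t,
\qquad K_t = K^{\mathrm{env}}_t + K^{\mathrm{man}}_t

On this reading, Pezzey's criterion adds the constraint that welfare per capita be non-declining, U(c_{t+1}) \ge U(c_t); the agent-based laboratory can then test which allocations between K^{env} and K^{man} satisfy it while still growing output.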
Saturday, September 06, 2008
COMBINING THE ADVANTAGES OF AGENT-BASED AND EQUATION-BASED APPROACHES
Bobashev and Epstein (2007) published a relevant paper: A Hybrid Epidemic Model: Combining the Advantages of Agent-based and Equation-based Approaches (see also Heterogeneity and Network Structure in the Dynamics of Diffusion: Comparing Agent-Based and Differential Equation Models by Rahmandad and Sterman 2006)
Abstract
Agent-based models (ABMs) are powerful in describing structured epidemiological processes involving human behavior and local interaction. The joint behavior of the agents can be very complex, and tracking the behavior requires a disciplined approach. At the same time, equation-based models (EBMs) can be more tractable and allow for at least partial analytical insight. However, inadequate representation of the detailed population structure can lead to spurious results, especially when the epidemic process is beginning and individual variation is critical. In this paper, we demonstrate an approach that combines the two modeling paradigms and introduces a hybrid model that starts as agent-based and switches to equation-based after the number of infected individuals is large enough to support a population-averaged approach. This hybrid model can dramatically reduce computational time and, more fundamentally, allows for the mathematical analysis of emerging structures generated by the ABM.
Details: http://www.brookings.edu/papers/2007/winter_hybridmodel_epstein.aspx
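The switching logic the abstract describes can be sketched in a few lines. Below is a stylised SIR version of the idea, assuming a well-mixed population, unit time steps and an arbitrary switch threshold of 100 infected (all my own simplifications, not the paper's calibration):

import java.util.Random;

// Stylised hybrid SIR: individual-level (agent-based) stochastic dynamics
// while infections are few, switching to deterministic equations (Euler-
// integrated ODEs) once the infected count supports population averaging.
public class HybridSIR {
    public static void main(String[] args) {
        Random rng = new Random(1);
        int n = 10000, infected = 5, recovered = 0;
        double beta = 0.3, gamma = 0.1, switchAt = 100;

        // Phase 1: agent-based. Each infected individual independently
        // transmits and recovers; stochastic variation matters here.
        while (infected > 0 && infected < switchAt) {
            int newInf = 0, newRec = 0;
            for (int i = 0; i < infected; i++) {
                double s = (double) (n - infected - recovered) / n;
                if (rng.nextDouble() < beta * s) newInf++;
                if (rng.nextDouble() < gamma) newRec++;
            }
            infected += newInf - newRec;
            recovered += newRec;
        }

        // Phase 2: equation-based. Continue with the standard SIR ODEs.
        double S = n - infected - recovered, I = infected, R = recovered;
        for (int t = 0; t < 200 && I > 1; t++) {
            double dS = -beta * S * I / n, dR = gamma * I;
            S += dS; R += dR; I += -dS - dR;
        }
        System.out.printf("final size: %.0f recovered of %d%n", R, n);
    }
}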
Is this a step towards verifying and validating agent-based computational models? (see also Empirical validation of agent-based models)
Thursday, September 04, 2008
Abstracts from the Geospatial Analysis session at RGS-IBG Conference
Note: the Geospatial Analysis: GIS & Agent-Based Models session is also expected to run next year, at RGS-IBG 2009.
ABSTRACTS
Modelling Perceptions of Street Safety to Increase Access to Public Transport; Claire Ellul, Ben Calnan (Cities Institute, London Metropolitan University)
In the context of public transportation, “the provision of a permeable public space contributes to an inclusive journey environment” (Azmin-Fouladi 2007). However, when planning or modelling an urban environment, architectural vision and planning principles often take precedence over the way buildings and urban features make people feel. In particular, the identification of specific urban features that contribute towards a feeling of safety and security is not generally considered.
Our research aims to redress this imbalance by providing planners and local authorities with the means to identify potential barriers to the permeability of public space. It is argued that the removal of negatively-impacting features and the resulting increase in perception of safety will increase the use of public transportation.
We present two key outputs of this process. Firstly, we have developed an Index of Permeability (IoP) for the urban environment, where each relevant urban feature visible from a specific location has been assigned a weighting (through a process of consultation). This weighting contributes towards the overall index of permeability for the point. Secondly, we present a GIS-based implementation of this index using isovists (which identify the urban features visible from a specific point), extending the index to create a surface of permeability.
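On one plausible reading of this abstract (the feature names and weights below are invented for illustration), the point-level index is a weighted sum over the features visible from that point:

import java.util.List;
import java.util.Map;

// Illustrative Index of Permeability: sum consultation-derived weights of
// the urban features visible (via an isovist) from a given point.
public class IndexOfPermeability {
    public static void main(String[] args) {
        // Hypothetical weights: positive features raise perceived safety.
        Map<String, Double> weight = Map.of(
            "street_light", 0.8, "cctv", 0.5, "blank_wall", -0.6, "alley", -0.9);
        List<String> visibleFromPoint = List.of("street_light", "alley", "cctv");
        double iop = visibleFromPoint.stream()
                        .mapToDouble(f -> weight.getOrDefault(f, 0.0)).sum();
        System.out.println("IoP at point = " + iop); // 0.4
    }
}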
System wide cultural districts: mapping and clustering the tangible and intangible cultural assets for the policy design of the regional clusters in the Veneto Region, Italy; Pier Luigi Sacco, Guido Ferilli (IUAV University), Massimo Buscema, Terzi Stefano (Semeion Research Center)
In previous research carried out by Sacco et al. the notion of system-wide cultural districts has been introduced and analyzed. In particular, system-wide cultural districts are horizontally integrated local clusters of economic activities in which culture plays a key strategic role as a social activator of innovative processes and practices, as well as an attractor of talent and resources, a factor of social cohesion and of networking, and of course as a sector with its own value added.
In other, related research from the same group, Artificial Neural Networks (ANN) techniques have been adopted to investigate to what degree they were able to single out emergent industrial districts of various kinds in selected areas of the Italian territory.
In this paper, we combine these two strands of research in a project carried out under the initiative of the Veneto Region, one of Italy's outstanding productive regions. In the first phase of the project, the spatial distribution and clustering of all cultural activities and facilities with a non-occasional character has been mapped. This has led us to identify qualitatively a certain number of emergent culture-driven clusters. In the second phase, a battery of innovative ANN techniques has been employed to identify the 'centroids' of the cultural clusters and to check to what extent they overlap with the poles of the Region's overall productive systems.
Finally, an analogous analysis has been conducted for specific cultural sectors – visual arts, performing arts, museums, and so on, to investigate to what extent they tend to gravitate upon specific cultural clusters and to what extent they are useful to define prospective local specializations by means of a specific policy design process.
Revealing the fuzzy geography of an urban locality; Richard Flemmings (Blom Aerofilms Ltd & Birkbeck University of London)
The delineation of urban geographical boundaries can be problematic, particularly when unitary authority boundaries do not represent perceived reality. The lack of agreement between perception and the reality of political boundaries makes an urban locality a fuzzy geography. This fuzzy geography can be exploited, for example by estate agents who wish to alter an area's perceived extent to increase property values. By giving such fuzzy boundaries definition, better clarity can be achieved between estate agent and customer.
A method is proposed here that gives definition to the boundary of an imprecise region using the internet as the information source. Kernel density estimation is used to transform geo-tagged internet search results into a continuous surface. This is both compared with and combined with a kernel density estimation of relevant Ordnance Survey MasterMap® cartographic text labels. A composite Index of Urban Locality is given to represent the fuzzy boundary of Clifton, Bristol. The resulting continuous surface is graded based on membership. Thus, the extent to which a location is or is not within the urban locality is depicted. The success of this output has been verified using estate agents' interpretations of the boundary of Clifton. The Index of Urban Locality has also been applied to the region of Bedminster, Bristol, with some success.
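The core of the method, a kernel density estimate over geo-tagged points evaluated on a grid, can be sketched as follows (the coordinates, bandwidth and grid size are placeholders, not the paper's values):

// Gaussian kernel density estimation over geo-tagged points: each point
// spreads its influence over a surface; the graded surface approximates
// the fuzzy boundary of the locality.
public class KdeSketch {
    public static void main(String[] args) {
        double[][] points = {{0.2, 0.3}, {0.25, 0.35}, {0.7, 0.8}}; // placeholders
        double h = 0.1; // bandwidth (placeholder)
        int grid = 5;
        for (int i = 0; i < grid; i++) {
            for (int j = 0; j < grid; j++) {
                double x = (double) i / (grid - 1), y = (double) j / (grid - 1);
                double density = 0;
                for (double[] p : points) {
                    double dx = (x - p[0]) / h, dy = (y - p[1]) / h;
                    density += Math.exp(-0.5 * (dx * dx + dy * dy))
                               / (2 * Math.PI * h * h * points.length);
                }
                System.out.printf("%.2f ", density);
            }
            System.out.println();
        }
    }
}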
Geospatial Modelling and Collaborative Reasoning of Indeterminate Phenomena: The Object-Field Model with Uncertainty and Semantics; Vlasios Voudouris (London Metropolitan Business School & City University London)
The need for a conceptually unifying geospatial data model for the representation of geospatial phenomena has already been acknowledged. Recognising that the data model employed by and large determines what can be done by way of analysis, and the methods by which the analysis can be undertaken, there has been some activity in developing unifying data models for geospatial representation in digital form. Some successes have been reported. Nevertheless, progress has been slow, especially at the conceptual and logical levels of abstraction of geospatial data models.
Concepts and ideas from cognitive and perceptual psychology as well as GIScience and GISystems literature are examined within the context of geospatial data modelling and reasoning. Drawing on and combining these concepts, ideas and successes with an empirical approach, this work presents the fused Object-Field model with uncertainty and semantics at the conceptual and logical levels of abstraction.
The purpose of the Object-Field model is to better support the representation and collaborative reasoning of geospatial phenomena, particularly indeterminate phenomena such as town centres. It is shown that many of the concepts required to better represent geospatial phenomena can be derived from a single foundation that is termed the elementary-geoParticle. This serves as the standard for integrating the dual continuous-field and discrete-object data models by means of aggregation.
GIS and Built Form: Using Pattern Recognition for Energy Efficiency Models; Donald Alexander, Simon Lannon, Orly Linovski (Cardiff University)
Much of what has been written about residential development in the UK relies on anecdotal evidence (Whitehand and Carr 1999). Little ‘on-the-ground’ research has been conducted due to the significant time required for investigating development through building records and other municipal data. A wide variety of research often requires detailed building information that has previously only been obtainable through walk-by surveys or building records. This paper examines alternative methods for determining building age using pattern recognition algorithms.
This model has wide ranging applications including researching urban development patterns, conducting urban design studies and assessing energy efficiency. This paper specifically focuses on the use of building data for energy efficiency studies. Modelling software has been developed to quantify energy emissions but requires detailed information of the built environment and age of buildings (Jones et al. 2000). It is proposed that pattern recognition algorithms can be used to automate the collection of this data from GIS and aerial photos.
To develop this technique, two study areas in Wales were chosen as case studies. These areas were surveyed manually to establish a baseline for assessing the built form characteristics of each development that could be incorporated into the algorithm. This paper will present the results of the development characteristic study, as well as the efficacy of using these to determine the age of dwellings.
Saturday, August 30, 2008
Towards a General Field model and its order in GIS
Abstract
Geospatial data modelling is dominated by the distinction between continuous-field and discrete-object conceptualizations. However, the boundary between them is not always clear, and the field view is more fundamental in some respects than the object view. By viewing a set of objects as an object field and unifying it with conventional field models, a new concept, the General Field (G-Field) model, is proposed. In this paper, the properties of G-Field models, including domain, range, and categorization, are discussed. As a summary, a descriptive framework for G-Field models is proposed. Then, some common geospatial operations in geographic information systems are reconsidered from the G-Field perspective. The geospatial operations are classified into order-increasing operations and non-order-increasing operations, depending on changes induced in the G-Field's order. Generally, the order can be viewed as an indicator of the level of information extraction of geospatial data. It is thus possible to integrate the concept of order with a geo-workflow management system to support geographic semantics.
The paper can be downloaded from:
http://www.geog.ucsb.edu/%7Egood/papers/451.pdf
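In code terms, the G-Field idea might be rendered as a single interface covering both the field and object views, with each operation tracked by the order it induces. This is my own sketch of the paper's concept, not the authors' formalism:

// Sketch of a General Field: any mapping from location to value, with an
// 'order' recording the level of information extraction applied so far.
public class GFieldSketch {
    interface GField {
        double value(double x, double y);
        int order(); // raw measurement = 0; each extraction step raises it
    }

    // An order-increasing operation: classify a field against a threshold,
    // returning a 0/1 field one order higher than its input.
    static GField classify(GField in, double threshold) {
        return new GField() {
            public double value(double x, double y) {
                return in.value(x, y) >= threshold ? 1 : 0;
            }
            public int order() { return in.order() + 1; }
        };
    }

    public static void main(String[] args) {
        GField elevation = new GField() { // a raw, order-0 field
            public double value(double x, double y) { return x + y; }
            public int order() { return 0; }
        };
        GField highGround = classify(elevation, 1.0);
        System.out.println(highGround.value(0.7, 0.6) + " order " + highGround.order());
    }
}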
------
References
Cova, T.J. and Goodchild, M.F. (2002) Extending geographical representation to include fields of spatial objects. International Journal of Geographical Information Science, 16, pp. 509-532.
Kjenstad, K. (2006) On the integration of object-based models and field-based models in GIS. International Journal of Geographical Information Science, 20, pp. 491-509.
Voudouris, V. (2008) Geospatial Modelling and Collaborative Reasoning of Indeterminate Phenomena: The Object-Field Model with Uncertainty and Semantics. Presented at the RGS-IBG International Conference 2008.
Voudouris, V. and Marsh, S. (2007) Geovisualization and GIS: A Human Centred Approach. In Ferri, F. (ed.) Visual Languages for Interactive Computing: Definitions and Formalizations. Idea Group Inc.
Voudouris, V., Fisher, P.F. and Wood, J. (2006) Capturing Conceptualization Uncertainty Interactively using Object-Fields. In Kainz, W., Reid, A. and Elmes, G. (eds.) 12th International Symposium on Spatial Data Handling (Vienna, Austria). Springer-Verlag.
Voudouris, V., Wood, J. and Fisher, P.F. (2005) Collaborative geoVisualization: Object-Field Representations with Semantic and Uncertainty Information. In Meersman, R., Tari, Z., Herrero, P. et al. (eds.) On the Move to Meaningful Internet Systems OTM 2005, Lecture Notes in Computer Science (LNCS), Vol. 3762. Springer, Berlin.
IJGIS Valediction by Peter Fisher

Peter Fisher's Valediction is very interesting and promising for those who work in the area of geospatial data modelling (or representation), as it is the foundation of all else that is possible or can be done.
A pdf version of Fisher's Valediction can be accessed from http://www.informaworld.com/smpp/content~content=a784379045~db=all~order=page or you can read the html version below:
IJGIS Valediction
1. Introduction
I have been editing the International Journal of Geographical Information Science for the last 14 years. I was first associated with the journal in this role in 1994 during the publication of volume 8. Then, it was publishing 6 issues per year with a target length of 600 pages, which allowed approximately 30 articles to be included. In 1996, this increased to 800 pages, and in 2005 to 1200 pages. The journal now carries nearly 60 articles per year, and will have a modest increase in volume again next year.
I took over the journal from the capable hands of Professor Terry Coppock, who worked diligently to establish the journal as one which would persist and become the premier journal of record for those working on the development and application of geographical information systems, whatever their background. I have endeavoured to maintain the status of the journal, and I believe that I have. Elsewhere (Fisher 2007b), I have listed some of the competitor journals of IJGIS. IJGIS is unusual because it is listed in the Journal Citation Rankings of ISI's Web of Knowledge in four subject areas (Geography, Physical Geography, Information Science, and Computer Science). Among competing journals, IJGIS has had the highest impact factor over a number of years.
Volume 11 saw the journal published under its present title, when the name was changed from IJGISystems to IJGIScience, in recognition of the fact that the journal had always been engaged in the publication of research into the science of geographical information which underpins the systems that are in widespread use.
2. A personal view of research published in IJGIS
The research that has been published in IJGIS over the years can be divided many ways, but I choose to look at it as is illustrated in figure 1. These are the themes I see which have persisted through the 14 years. The structure identified here was first articulated at a presentation at the AGILE 2007 Annual conference in Aalborg. I would like to thank the organizers (including Lars Bodum and Monica Wachowicz) for inviting me to give that presentation.

Figure 1. Personal view of general research topics published within IJGIS.
To me, the most important research theme is that of Representation. It is the foundation of all else that is possible or can be done with geographical information. I view it as having five components:
Spatial Information Theory addresses how we conceptualize spatial information, and is absolutely central to GIScience. It has been a persistent theme with issues of RESELS, geoatoms, object orientation, and multiscale and multiresolution information as part of it.
Issues of Uncertainty in its broadest sense may be the most common research topic published in IJGIS. This includes probabilistic and fuzzy formalisms, error modelling, rough sets, and semantic uncertainty, among others.
Researchers have long bemoaned the lack of Temporality in geographical databases, but over the 14 years, many papers have been published in this area.
IJGIS has not been slow in publishing the results of Ontological research both from a database construction point of view and from a semantic understanding point of view.
Finally, and perhaps a smaller component than is desirable, is the research on Geometric representation.
The second broad topic is modelling, which, for convenience, I divide into:
analytical and statistical modelling, including network modelling and spatial statistics; and
process modelling, including modelling of social and environmental processes and the technology of those models.
Visualization has always been a major theme within GIScience, and of course, cartography, and computer cartography in particular, is one of the antecedents of the field. Many interesting papers and special issues have been published on topics from this field, including generalization, visual analytics, geocollaboration, and interactive mapping.
Cognitive studies and usability are concerned with how we relate to the world and the information about that world. There are increasing studies on usability, but my personal view is that studies on spatial cognition should ground much research in GIScience but have not been published in IJGIS, with a very few exceptions. I hope that the future may see more such research linking these areas.
A final persistent theme has been that of data policy with which I bracket social construction of information. The first has been researched in many ways and most recently within the umbrella of Spatial Data Infrastructures. The social construction argument is seen by some as anti-scientific, but in my view it is part of all information, as some recent studies have demonstrated, and those studies have shown some potential for working with different world views within GIScience.
Over the 14 years, paper submissions on some research topics have ceased. Parallel processing, on which a special issue was published in Volume 10, has become a low-level system issue, with barely a mention of the topic in more recent issues of IJGIS. Similarly, Interoperability was the topic of a special issue in Volume 12, but it too has not been addressed directly in much writing in IJGIS since. The topic remains important, but within the research published within IJGIS, it has been subsumed within the interoperability of data, or within the developing area of Web technologies. Another person might see papers on Web technologies as another emerging component of the IJGIS research literature, but currently I see Web issues as ones that touch many of the other topics raised, particularly Visualisation and Data Policy.
Discussion of the structure outlined here has led others to suggest to me that the World Wide Web, Location Based Services and Global Change are so-called 'killer applications' for GIScience, and so might be viewed as themes for structuring the field. These are all interesting areas for research with their own challenges and problems. However, I would rather see these as important areas for application, along with many others, rather than as driving forces. I believe that when an application becomes a driver, it moulds the science, and I do not believe that all applications will fit one mould. Therefore, it is necessary to keep the independence of the core issues as central to GIScience, and not view GIScience as the issues of any one application.
3. Issues in producing the Journal
Many issues could be mentioned in the production process, but two stand out for me. The first is with respect to reviewers, and the second is the preparation of graphics.
3.1 Problems with reviewers
The most intractable problem in managing any peer-reviewed journal is making timely decisions on articles. This process is a trade-off between the need for reviewers to have time to read a paper, and an author's wish to have a rapid response, as well as the editor's wish to 'have a life' and do some of their own research. The most frustrating part of managing the process is that reviewers repeatedly promise to complete a review within a particular time period, but fail to do so. This can be for understandable reasons, but when the reviewer then promises to do the review by some new date but fails, and promises again and again, the process becomes very frustrating for everyone.
When I first started editing, a member of the editorial board said to me 'I hope to complete three reviews for every paper I publish'. Three to one is the minimum ratio of reviewed to published papers to which all active researchers should commit. Because of rejections, the ratio actually needs to be considerably higher. Unfortunately, there are people who will never return a review, no matter how many times they promise, and there are others who will always return a review, once they have said they will. Research productivity and administrative responsibility are no indicator of group membership: some of the busiest people are the most reliable. But if you publish one paper, you should commit to reviewing at least three papers, and you should review them as if you were the author, in a prompt and timely manner.
3.2 Problems in graphics
Authors should be more careful in their design of graphics, graphs, and maps. Perhaps the worst are the graphs generated in modern spreadsheets. One particular spreadsheet package uses grey backgrounds so that graphs are highly visible on the screen, but when these are printed, the grey tends to obscure the actual graph, as do such ephemera as the grid lines and oversized point markers. Unfortunately, many authors seem to be ignorant of the design guidelines of Edward Tufte (1983), which should be studied with care by all involved in illustrating scientific articles. Authors should be prepared to make multiple changes to graphs in the process of preparing an article, using smaller symbols and clear, white backgrounds. Similarly, many authors use grey fills for boxes in flow diagrams. On the whole, these are completely redundant and only obscure the text within the boxes. Boxes should be white, with the outline used to code the boxes, if that is desirable.
In the print technology used by the publishers, colour continues to be expensive, but colour in the electronic version of papers is free. This means that as much colour as an author wishes to include can be carried in any article, but the print version of that article may render all those graphics in greys. The problem with this is that many colours will produce the same grey, so that if information is colour-coded, but the print version is in grey, the coding may not carry over. Therefore, authors need to continue to be careful in their use of colour and, where necessary, may need to consult experts in the use of colour.
4. Is it still research?
There are many interesting and challenging research topics to be addressed in GIScience, but there are some topics which might be considered to be passé for publication in IJGIS. Without wishing to put off researchers, I would like to mention two here.
First is the annual assault on the editors of papers documenting yet another instance of a raster-GIS implementation of the Universal Soil Loss Equation (USLE), by authors who have not read the literature well enough. This topic was first addressed in the 1980s by, for example, Spanner et al. (1983), and papers being submitted in 2007 are very little different. I am not saying that soil erosion modelling is passé, but as scientific research the USLE is, both within and outside GIS. On the contrary, research relating to more advanced soil erosion models is welcome, and excellent contributions have been included in the Journal, when they meet the review standards.
Similarly, many articles have been written on comparisons of a modest number of surface interpolation algorithms in an experimental situation (whether from point observations or contours and using IDW, spline, and kriging, perhaps). Papers continue to be submitted doing no more. It is easy to conceive of such an experiment, but it is a real challenge to make it original and different from previous experiments, and to demonstrate that the conclusions can be generalized to other contexts. Generation of digital elevation models is no longer dependent on the interpolation of values from sparse point observations or contour lines, but has moved over to measurement-based remote sensing devices such as Lidar and Ifsar. Interpolation remains important for these technologies, but the issues have changed. Future experiments need to be demonstrably relevant.
5. Thanks
During the 14 years I have been working on the journal, I estimate that about 625 papers will have been published, which means that something of the order of 1800 papers have been submitted. A number of people in various roles have been involved, and I would like to record my thanks to them all (in spite of having listed many in a previous acknowledgement; Fisher 2007a):
First are those people whose work has been published in the journal over the last 14 years. I thank them for taking the time to conduct the interesting research they have submitted, and for writing it up. Almost without exception, they have taken criticism from reviewers, and papers have gone through changes in the review process. We believe that the published papers which result are better than those originally submitted, but making the changes can nonetheless be painful for the authors. It has been a pleasure for my colleagues and me to see this work through the review process.
Because each paper is sent to at least three reviewers, approximately 5400 requests for reviews have been dispatched. I am ashamed to say that I have no idea how many reviewers this equates to, because I do not know how many have been asked more than once, although I suspect it is the majority. My thanks go to all those who have responded with reviews when requested. The work involved in taking the time and care to consider and critique the work of others cannot be overstated, but it can also be most rewarding. Foremost among these reviewers have been members of the editorial board.
All journals have two classes of author: those whose work is accepted, and those whose work is rejected. The acceptance rate is approximately 30% of submissions, so the latter group is about twice the size of the former (except that, of course, some authors are in both categories). Having taken the effort to conduct the research and write the paper, to then have it rejected for publication is always very dispiriting. These authors are the unacknowledged facilitators of the peer-review process, and I would like to take this opportunity to thank them all, even though their work has come to nothing and will not be published in this journal.
During the 14 years, 16 special issues have been published, and a number more are in preparation. The editors of these issues are numerous, but they are acknowledged by being the authors of guest editorials.
It has been my pleasure to work with a number of other people in editorial roles, including Eric Anderson, Steven Guptill, Marc Armstrong, Harvey Miller and now Mark Gahegan as North American Editors (now Editor for the Americas), and Dave Abel and now Brian Lees as editors for the Western Pacific (now Editor for Australasia and Eastern Asia). I have worked with Neil Stuart, Nick Tate, and Lex Comber as Book Review Editors.
Throughout my period as editor, the Publisher's principal representative has been Richard Steele. Direct managerial contacts for the journal have been Meloney Bartlett, Rachel Sangster, and Virginia Klaessen. On the production side, managing the work of anonymous typesetters and copy editors, are the people with whom authors have communicated about proofs (whether they know it or not). They have been David Chapman, Sophie Middleton, Heidi Cormode, and currently James Baldock.
Finally, I must thank Jill Fisher, who has given continuing support and assistance in communicating with authors and reviewers.
The system of peer review, which is the current paradigm for scholarly publication, would not work without all these players; all are crucial to the process. My thanks to all these people in their various roles, from reviewers and authors, to editors and production managers, and to anyone else I should have named but have not. The last 14 years would not have been possible without each and every one of you.
I would like to close by offering my very best wishes for the continuing success of the journal to the future editorial team, including Brian Lees (Australian Defense Force Academy, University of New South Wales) as both Editor-in-Chief and Editor for Australasia and Asia, Mark Gahegan (Pennsylvania State University) as Editor for the Americas, and Sytze de Bruin and Monica Wachowicz (Wageningen University) as Editors for Europe and Africa. I hope that they find it in as good condition as Terry Coppock left it for me.
References
1. Fisher, P.F. (2007a) Preface. In Fisher, P. (ed.) Classics from IJGIS: Twenty Years of the International Journal of Geographical Information Science and Systems, pp. v-vi. Taylor & Francis, London.
2. Fisher, P.F. (2007b) 20 years of IJGIS: Choosing the classics. In Fisher, P. (ed.) Classics from IJGIS: Twenty Years of the International Journal of Geographical Information Science and Systems, pp. 1-6. Taylor & Francis, London.
3. Spanner, M.A., Strahler, A.H. and Estes, J.E. (1983) Soil loss prediction in a geographic information system format. In Papers Selected for Presentation at the 17th International Symposium on Remote Sensing of Environment, Volume 1, pp. 89-102. Environment Research Institute of Michigan, Ann Arbor.
4. Tufte, E.R. (1983) The Visual Display of Quantitative Information. Graphics Press, Cheshire, CT.
Thursday, May 01, 2008
Unlocking Economic Systems with Agent-Based Computational Economics: The EU Leasing Market

This is the work that I (with Haris) presented at the RGS-IBG International Conference 2008.
Studies of economic systems must consider how to handle interdependent feedback interactions of micro behaviors, interaction patterns and macroscopic regularities. The Agent-Field framework is an approach for agent-based computational economics. In this framework, models of economic systems are viewed as a collection of multi-scale and structured agents operating in indeterminate economic environments conceptualized as continuous, differentiable fields with variable levels of spatial uncertainty. We propose a formalization of the Agent-Field framework using the Unified Modeling Language. We explore potential advantages and disadvantages of the framework for the study of economic systems using the EU leasing market. This enables us to formulate an initial frame representation of major economic agents for the EU leasing market. We predicted the direction that the Central and Eastern European cluster of high-growth economies can be expected to take as its economies move towards higher prosperity levels. Within the scope of the work, it has been shown that the Agent-Field framework offers an intuitive rather than an abstract process for modeling economic systems. This intuitive process requires a deeper understanding of the interactions between the economic environment and the agents within it. The Agent-Field approach seems ontologically well founded for the growing field of agent-based computational economics.
Monday, April 07, 2008
Geospatial Analysis: GIS & Agent-Based Models

This year I am organising the Geospatial Analysis: GIS & Agent-Based Models session at the RGS-IBG Annual Conference 2008 in London. We hope that the session will attract interest from users of GIS and agent-based models for the analysis of geospatial phenomena, and particularly those who are interested in the fusion of these two areas. The deadline for submissions to this session is 17th April 2008. Abstracts should be sent to v.voudouris@londonmet.ac.uk
Sunday, February 03, 2008
On the Integration: GIS with Agent-Based Models

ArcGIS now interacts with Repast using the Agent Analyst:
The Agent Analyst is a free and open source ArcGIS extension that allows ArcGIS users to build geographically aware agent-based models. Agent Analyst achieves this goal by integrating the free and open source Recursive Porous Agent Simulation Toolkit (Repast) into ArcGIS (see here for details).
This offers interesting opportunities for both the agent-based community (see Batty, 2005) and the GIS community (see Repast Vector GIS Integration for details). In my PhD thesis, I suggest a way of integrating agent-based models with GIS using the object-field model (details will be posted soon).
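The basic shape of such an integration, agents reading from and reacting to a geographic layer, can be illustrated without any toolkit at all. The following is a generic, self-contained sketch; it is not the Agent Analyst or Repast API:

import java.util.Random;

// Generic sketch of a geographically aware agent: it perceives a raster
// layer (stand-in for a GIS coverage) and moves towards higher cell values.
public class GisAgentSketch {
    public static void main(String[] args) {
        Random rng = new Random(7);
        double[][] layer = new double[10][10]; // stand-in for a GIS raster
        for (double[] row : layer)
            for (int j = 0; j < row.length; j++) row[j] = rng.nextDouble();

        int x = 5, y = 5;
        for (int step = 0; step < 10; step++) {
            int bestX = x, bestY = y;
            for (int dx = -1; dx <= 1; dx++)      // inspect the 8-neighbourhood
                for (int dy = -1; dy <= 1; dy++) {
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < 10 && ny >= 0 && ny < 10
                            && layer[nx][ny] > layer[bestX][bestY]) {
                        bestX = nx; bestY = ny;
                    }
                }
            x = bestX; y = bestY; // hill-climb on the layer
            System.out.printf("step %d: (%d,%d) value %.2f%n", step, x, y, layer[x][y]);
        }
    }
}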
Reference
Batty, M (2005), 'Approaches to Modelling in GIS: Spatial Representation and Temporal Dynamics'. In Maguire, Batty and Goodchild (eds.): GIS, Spatial Analysis and Modelling, ESRI Press