# Papers

Despite all our great advances in science, technology and financial innovation, many societies today are struggling with a financial, economic and public spending crisis, over-regulation and mass unemployment, as well as a lack of sustainability and innovation. Can we still rely on conventional economic thinking, or do we need a new approach?

I argue that, as the complexity of socio-economic systems increases, networked decision-making and bottom-up self-regulation will become increasingly important features. I explain why, besides the "homo economicus" with strictly self-regarding preferences, natural selection has also created a "homo socialis" with other-regarding preferences. While the "homo economicus" optimizes its own prospects in isolation, the decisions of the "homo socialis" are self-determined but interconnected, a fact that may be characterized by the term "networked minds". Notably, the "homo socialis" manages to earn higher payoffs than the "homo economicus".

I show that the "homo economicus" and the "homo socialis" imply different kinds of dynamics and distinct aggregate outcomes. Therefore, next to the traditional economics of the "homo economicus" ("economics 1.0"), a complementary theory must be developed for the "homo socialis". This economic theory might be called "economics 2.0" or "socionomics". The names are justified because the Web 2.0 is currently promoting a transition to a new market organization, which benefits from social media platforms and could be characterized as a "participatory market society". To thrive, the "homo socialis" requires suitable institutional settings, such as particular kinds of reputation systems, which are sketched in this paper. I also propose a new kind of money, so-called "qualified money", which may overcome some of the problems of our current financial system.
1 vote
We introduce the concept of self-healing in the field of complex networks. Obvious applications range from infrastructural to technological networks. We introduce self-healing capabilities by exploiting the presence of redundant links to recover the connectivity of the system, through the application of distributed communication protocols that provide the "smartness" of the system. We analyze the interplay between redundancies and smart reconfiguration protocols in improving the resilience of networked infrastructures to multiple failures; in particular, we measure the fraction of nodes still served for increasing levels of network damage. We study the effects of different connectivity patterns (planar square grids, small-world, scale-free networks) on the healing performance. The study of small-world topologies shows that the introduction of some long-range connections into the planar grids greatly enhances the resilience to multiple failures, giving results comparable to the most resilient (but less realistic) scale-free structures.
1 vote
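The healing mechanism above can be sketched with a toy simulation: a planar square grid carrying hypothetical dormant diagonal backup links that a distributed protocol activates after random link failures. The grid size, failure fraction and backup pattern are illustrative assumptions, not the paper's setup.

```python
import random
from collections import deque

random.seed(1)

def grid_edges(n):
    """Edges of an n x n planar square grid; nodes are (i, j) tuples."""
    edges = set()
    for i in range(n):
        for j in range(n):
            if i + 1 < n:
                edges.add(((i, j), (i + 1, j)))
            if j + 1 < n:
                edges.add(((i, j), (i, j + 1)))
    return edges

def served_fraction(edges, n, source=(0, 0)):
    """Fraction of nodes still reachable from a source node (BFS)."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    seen, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) / (n * n)

n = 10
active = grid_edges(n)
# Hypothetical redundancy: dormant diagonal backup links that the
# self-healing protocol switches on once damage is detected.
backup = {((i, j), (i + 1, j + 1)) for i in range(n - 1) for j in range(n - 1)}

failed = set(random.sample(sorted(active), int(0.4 * len(active))))
frac_damaged = served_fraction(active - failed, n)
frac_healed = served_fraction((active - failed) | backup, n)
```

Comparing `frac_damaged` and `frac_healed` for increasing failure levels reproduces, in miniature, the paper's "fraction of nodes still served" observable.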
Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, notably the possibility of modeling emergent phenomena within large populations. As a consequence, the quantity in need of calibration is often a distribution over the population whose relation to the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework for the calibration of agent-based models with distributional output, based on indirect inference. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.
1 vote
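The core idea of indirect inference — when the likelihood is intractable, match auxiliary statistics of the simulated output distribution to those of the data — can be illustrated on a deliberately simple stand-in model. The Bernoulli-stopping "model", the agent count and the grid search below are assumptions for illustration, not the paper's Wikipedia setup.

```python
import random

random.seed(7)

def simulate(p, n_agents=2000):
    """Toy agent-based model: each agent contributes until a Bernoulli(p)
    stopping event; the output is a whole distribution of contribution
    counts rather than a single summary number."""
    counts = []
    for _ in range(n_agents):
        c = 0
        while random.random() > p:
            c += 1
        counts.append(c)
    return counts

def auxiliary_stats(sample):
    """Auxiliary statistics for indirect inference: mean and variance."""
    m = sum(sample) / len(sample)
    v = sum((x - m) ** 2 for x in sample) / len(sample)
    return m, v

observed = simulate(0.25)            # pseudo-data generated with a known p
target = auxiliary_stats(observed)

def distance(p):
    """Squared distance between simulated and observed auxiliary statistics."""
    m, v = auxiliary_stats(simulate(p))
    return (m - target[0]) ** 2 + (v - target[1]) ** 2

p_hat = min((i / 100 for i in range(5, 96)), key=distance)
```

Because the pseudo-data were generated with `p = 0.25`, the minimiser `p_hat` lands close to that value, which is the basic consistency check behind the method.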
The ability to understand and eventually predict the emergence of information and activation cascades in social networks is core to research on complex socio-technical systems. However, the complexity of social interactions makes this a challenging enterprise. Previous works on cascade models assume that the emergence of this collective phenomenon is related to the activity observed in the local neighborhood of individuals, but do not consider what determines the willingness to spread information in a time-varying process. Here we present a mechanistic model that accounts for the temporal evolution of the individual state in a simplified setup. We model the activity of the individuals as a complex network of interacting integrate-and-fire oscillators. The model reproduces the statistical characteristics of cascades in real systems and provides a framework to study the time evolution of cascades in a state-dependent activity scenario.
1 vote
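One way to make the integrate-and-fire mechanism concrete is a toy network of threshold units: each node carries a slowly accumulated state, and a firing node sends a pulse that can push neighbours over threshold, producing a cascade. The network structure, threshold and coupling strength below are illustrative assumptions, not the paper's model.

```python
import random

random.seed(3)

N, THRESHOLD, COUPLING = 200, 1.0, 0.3

# Hypothetical sparse random network: each node pulses 4 random neighbours.
neighbours = {i: random.sample([j for j in range(N) if j != i], 4)
              for i in range(N)}
state = [random.random() for _ in range(N)]   # sub-threshold 'willingness'

def drive_and_cascade(i):
    """Drive node i over threshold and propagate the resulting cascade."""
    fired, stack = set(), [i]
    state[i] = THRESHOLD
    while stack:
        u = stack.pop()
        if u in fired or state[u] < THRESHOLD:
            continue
        fired.add(u)
        state[u] = 0.0                        # integrate-and-fire reset
        for v in neighbours[u]:
            if v not in fired:
                state[v] += COUPLING          # pulse received from u
                if state[v] >= THRESHOLD:
                    stack.append(v)
    return len(fired)

sizes = [drive_and_cascade(random.randrange(N)) for _ in range(500)]
```

The distribution of `sizes` is the state-dependent analogue of the cascade-size statistics discussed in the abstract: early drives find many nodes near threshold and trigger large cascades, while drives just after a large cascade find reset nodes and stay small.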
Heavy-tailed distributions of meme popularity occur naturally in a model of meme diffusion on social networks. Competition between multiple memes for the limited resource of user attention is identified as the mechanism that poises the system at criticality. The popularity growth of each meme is described by a critical branching process, and asymptotic analysis predicts power-law distributions of popularity with very heavy tails (exponent $\alpha<2$, unlike preferential-attachment models), similar to those seen in empirical data.
1 vote
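A critical branching process like the one invoked above is straightforward to simulate: each "share" of a meme spawns a Poisson-distributed number of further shares with mean 1, and the total progeny across many memes exhibits the heavy tail. The Poisson offspring law and the size cap are assumptions of this sketch.

```python
import math
import random

random.seed(11)

def poisson(lam):
    """Knuth's method for a Poisson-distributed offspring count."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def meme_size(mean_offspring=1.0, cap=10 ** 5):
    """Total number of shares of one meme: a branching process started
    from a single seed; mean_offspring = 1 is the critical point."""
    total = active = 1
    while active > 0 and total < cap:
        children = sum(poisson(mean_offspring) for _ in range(active))
        total += children
        active = children
    return total

sizes = [meme_size() for _ in range(2000)]
```

At criticality most memes die quickly, but the sample contains rare, very large popularity values — the heavy tail that the asymptotic analysis in the paper characterises.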
This paper provides a substantial reconceptualization of the serial clearing of the product market on the basis of structural axioms. The change of premises is required simply because from the accustomed premises only the accustomed conclusions can be derived, and these are known to be inapplicable in the real world. This holds in particular for the still-popular idea that the working of a market can be described in terms of the triad demand function–supply function–equilibrium. Structural axiomatization provides a complete and consistent picture of interrelated product market events.
1 vote
We study a phenomenological model for the continuous double auction, equivalent to two independent $M/M/1$ queues. The continuous double auction defines a continuous-time random walk for trade prices. The conditions for ergodicity of the auction are derived and, as a consequence, three possible regimes in the behavior of prices and logarithmic returns are observed. In the ergodic regime, prices are unstable and one can observe an intermittent behavior in the logarithmic returns. On the contrary, non-ergodicity triggers stability of prices, even if two different regimes can be seen.
1 vote
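The queueing picture can be sketched directly: the bid and ask sides of the book are two independent M/M/1 queues, and a toy coupling turns queue imbalance into price ticks. The rates, the embedded-chain discretisation and the imbalance rule are illustrative assumptions, not the paper's exact mechanism.

```python
import random

random.seed(5)

LAM, MU = 0.8, 1.0     # order arrival and execution rates; LAM < MU => ergodic
STEPS = 50000

def mm1_path(lam, mu, steps):
    """Embedded chain of an M/M/1 queue: each event is an arrival with
    probability lam / (lam + mu), otherwise a departure (if non-empty)."""
    q, path = 0, []
    p_arrival = lam / (lam + mu)
    for _ in range(steps):
        if random.random() < p_arrival:
            q += 1
        elif q > 0:
            q -= 1
        path.append(q)
    return path

bids = mm1_path(LAM, MU, STEPS)    # queue of resting buy limit orders
asks = mm1_path(LAM, MU, STEPS)    # queue of resting sell limit orders

# Toy coupling: the trade price ticks toward the thinner side of the book.
price, prices = 100.0, []
for b, a in zip(bids, asks):
    if a < b:
        price += 0.01
    elif a > b:
        price -= 0.01
    prices.append(price)
```

With `LAM < MU` the queues are ergodic and keep emptying, so the price keeps moving — the intermittent-return regime; setting `LAM > MU` makes the queues grow without bound, freezing the imbalance and stabilising the price, in the spirit of the non-ergodic regime described above.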
We have analyzed the Indices of Industrial Production (Seasonal Adjustment Index) for a long period of 240 months (January 1988 to December 2007) to develop a deeper understanding of economic shocks. The angular frequencies, estimated using the Hilbert transform, are almost identical for the 16 industrial sectors. Moreover, partial phase locking was observed for the 16 sectors. This is direct evidence of synchronization in the Japanese business cycle. We also show that the information about economic shocks is carried by the phase time series. The common shock and individual shocks are separated using the phase time series; the former dominates the economic shocks in 1992, 1998 and 2001. The obtained results suggest that the business cycle may be described as the dynamics of coupled limit-cycle oscillators exposed to common shocks and random individual shocks.
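The phase-extraction step can be illustrated on synthetic data: two noisy series sharing a common cycle, with phases obtained from the analytic signal (an FFT construction equivalent to what `scipy.signal.hilbert` computes). The cycle length, noise level and phase-locking index below are assumptions of this sketch, not the paper's production data.

```python
import numpy as np

rng = np.random.default_rng(0)

def analytic_signal(x):
    """Analytic signal via the FFT (the construction behind
    scipy.signal.hilbert): zero negative frequencies, double positive ones."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(spectrum * h)

# Two synthetic 'sectoral' series: a common cycle plus idiosyncratic noise,
# standing in for detrended monthly production indices (240 months).
t = np.arange(240)
common = np.sin(2 * np.pi * t / 48)            # hypothetical 48-month cycle
s1 = common + 0.2 * rng.standard_normal(240)
s2 = common + 0.2 * rng.standard_normal(240)

phase1 = np.unwrap(np.angle(analytic_signal(s1)))
phase2 = np.unwrap(np.angle(analytic_signal(s2)))

# Phase-locking value: |<exp(i*(phase1 - phase2))>| approaches 1 when the
# two series are phase synchronized.
plv = float(np.abs(np.exp(1j * (phase1 - phase2)).mean()))
```

A `plv` close to 1 signals the kind of partial phase locking reported for the 16 sectors; shocks would show up as common jumps in the unwrapped phase series.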
We present and discuss a stochastic model of financial asset dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous auto-regressive component and a random rescaling factor embodying exogenous influences. Mathematical properties like the increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with auto-regressive models widely used in finance, and the possibility of partially resolving the endogenous and exogenous components of the volatility, with consistent results when applied to historical series.
1 vote
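The product structure described above — an endogenous auto-regressive component times an exogenous random rescaling factor — can be sketched in a few lines. The AR(1) form, the lognormal scale and the parameter values are illustrative assumptions, not the calibrated model.

```python
import math
import random

random.seed(2)

phi = 0.3           # AR(1) coefficient of the endogenous component
scale_sigma = 0.5   # volatility of the exogenous lognormal rescaling factor
steps = 20000

endo, returns = 0.0, []
for _ in range(steps):
    # Endogenous auto-regressive component (unit variance by construction).
    endo = phi * endo + math.sqrt(1 - phi ** 2) * random.gauss(0, 1)
    # Exogenous random rescaling factor multiplying the elementary return.
    scale = math.exp(random.gauss(0, scale_sigma))
    returns.append(endo * scale)
```

Even this stripped-down version produces fat-tailed, clustered returns: the rescaling factor mixes Gaussians into a heavy-tailed marginal, while the AR component correlates return magnitudes over time.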
This paper sets up a methodology for approximately solving optimal investment problems using duality methods combined with Monte Carlo simulations. In particular, we show how to tackle high dimensional problems in incomplete markets, where traditional methods fail due to the curse of dimensionality.
1 vote
One of the most important features of spatial networks such as transportation networks, power grids, the Internet, and neural networks is the existence of a cost associated with the length of links. Such a cost has a profound influence on the global structure of these networks, which usually display a hierarchical spatial organization. The link between local constraints and large-scale structure has, however, not been elucidated, and we introduce here a generic model for the growth of spatial networks based on the general concept of cost-benefit analysis. This model depends essentially on a single scale and produces a family of networks ranging from the star graph to the minimum spanning tree, characterised by a continuously varying exponent. We show that spatial hierarchy emerges naturally, with structures composed of various hubs controlling geographically separated service areas, and appears as a large-scale consequence of local cost-benefit considerations. Our model thus provides the first building blocks for a better understanding of the evolution of spatial networks and their properties. We also find that, surprisingly, the average detour is minimal in the intermediate regime, as a result of a large diversity in link lengths. Finally, we estimate the important parameters for various world railway networks and find that --remarkably-- they all fall in this intermediate regime, suggesting that spatial hierarchy is a crucial feature for these systems and probably possesses an important evolutionary advantage.
1 vote
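A cost-benefit growth rule of this kind can be sketched as follows: each new node attaches to the existing node j minimising d(i, j) + beta * L(j), where L(j) is a network distance back to the root. Small beta favours short local links (a minimum-spanning-tree-like structure); large beta collapses the network onto a hub. This cost function and the parameter values are one plausible reading of the model, for illustration only.

```python
import math
import random

random.seed(4)

def cost_benefit_growth(n_nodes, beta):
    """Sequential growth: node i joins by linking to the existing node j
    that minimises d(i, j) + beta * root_dist(j), where root_dist is the
    network distance from j to the root node 0."""
    pts = [(random.random(), random.random()) for _ in range(n_nodes)]
    root_dist = {0: 0.0}
    edges = []
    for i in range(1, n_nodes):
        j = min(root_dist,
                key=lambda k: math.dist(pts[i], pts[k]) + beta * root_dist[k])
        edges.append((i, j))
        root_dist[i] = math.dist(pts[i], pts[j]) + root_dist[j]
    return edges

star = cost_benefit_growth(100, 10.0)      # heavy path-length penalty -> hub
local = cost_benefit_growth(100, 0.0)      # pure distance -> short local links
hub_degree = sum(1 for _, j in star if j == 0)
```

With a strong penalty every node connects directly to the root (by the triangle inequality, any detour through another node is strictly worse), while beta = 0 reproduces nearest-neighbour attachment; intermediate beta interpolates between the two extremes, which is where the hub-and-service-area hierarchy appears.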
We analyze realized volatilities constructed using high-frequency stock data on the Tokyo Stock Exchange. In order to avoid the non-trading-hours issue in volatility calculations, we define two realized volatilities calculated separately in the two trading sessions of the Tokyo Stock Exchange, i.e. the morning and afternoon sessions. After calculating the realized volatilities at various sampling frequencies, we evaluate the bias from microstructure noise as a function of sampling frequency. Taking this bias into account, we examine returns standardized by realized volatilities and confirm that price returns on the Tokyo Stock Exchange are described approximately by Gaussian time series with time-varying volatility, i.e. consistent with the mixture-of-distributions hypothesis.
1 vote
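The sampling-frequency bias is easy to reproduce on simulated data: observe an efficient log-price contaminated with i.i.d. microstructure noise, and compute the realized variance at fine and sparse sampling. All values below (session length, true volatility, noise level) are illustrative assumptions.

```python
import math
import random

random.seed(8)

# One trading session of 1-second efficient log-prices with constant true
# volatility, observed with i.i.d. microstructure noise.
N, sigma, noise_sd = 10800, 0.01, 0.0005
dt = 1.0 / N
log_p = [0.0]
for _ in range(N):
    log_p.append(log_p[-1] + sigma * math.sqrt(dt) * random.gauss(0, 1))
obs = [p + random.gauss(0, noise_sd) for p in log_p]

def realized_variance(prices, step):
    """Sum of squared returns sampled every `step` observations."""
    sampled = prices[::step]
    return sum((b - a) ** 2 for a, b in zip(sampled, sampled[1:]))

rv_fine = realized_variance(obs, 1)      # finest sampling: noise bias dominates
rv_sparse = realized_variance(obs, 300)  # ~5-minute sampling shrinks the bias
true_var = sigma ** 2
```

At the finest sampling the noise contributes roughly twice its variance per return, swamping the integrated variance; sparse sampling trades a little estimation variance for a much smaller bias, which is the trade-off the paper quantifies per session.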
The stochastic volatility (SV) model is one of the volatility models that infer the latent volatility of asset returns. Bayesian inference of the SV model is performed with the hybrid Monte Carlo (HMC) algorithm, which is superior to other Markov chain Monte Carlo methods in sampling the volatility variables. We perform HMC simulations of the SV model for two liquid stock returns traded on the Tokyo Stock Exchange and measure the volatilities of those stock returns. We then calculate the accuracy of the volatility measurement using the realized volatility as a proxy for the true volatility, and compare the SV model with the GARCH model, another standard volatility model. Using the accuracy calculated with the realized volatility, we find that empirically the SV model performs better than the GARCH model.
1 vote
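The SV model being estimated can at least be simulated cheaply; standardising the simulated returns by the latent volatility recovers an approximately standard normal series, mirroring the paper's standardised-returns check. The parameter values are typical illustrative choices, and the HMC estimation step itself is omitted.

```python
import math
import random

random.seed(9)

# Standard SV model: h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t,
# r_t = exp(h_t / 2) * eps_t, with eta_t, eps_t ~ N(0, 1).
mu, phi, sigma_eta, T = -9.0, 0.97, 0.15, 5000
h = mu
returns, vols = [], []
for _ in range(T):
    h = mu + phi * (h - mu) + sigma_eta * random.gauss(0, 1)
    vol = math.exp(h / 2)
    vols.append(vol)
    returns.append(vol * random.gauss(0, 1))

# Returns standardised by the (here known) latent volatility are ~N(0, 1);
# in the paper the realized volatility plays the role of this proxy.
standardised = [r / v for r, v in zip(returns, vols)]
sample_var = sum(z * z for z in standardised) / T
```

In estimation the latent `h` path is unknown, which is what makes HMC attractive: the whole volatility path can be updated jointly using gradients of the log-posterior, instead of one variable at a time.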
The question in the title came to my mind one day as I held in one hand a paper in nuclear physics and in the other a paper in finance, and surprisingly concluded that the same formula appears in both articles*. Phenomena from apparently completely different fields of research were solved with the help of the same equation. Things get even weirder when I add that the formula I was talking about is the time-independent Schrödinger equation.
1 vote
Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
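A canonical member of this model family is the minority game, which can be written in a few lines: agents choose via their best-scoring lookup-table strategy on the recent outcome history, and the side in the minority wins. The sizes N, M, S and the virtual scoring rule below follow the standard setup of the model.

```python
import random

random.seed(6)

N, M, S, T = 101, 3, 2, 2000        # agents, memory, strategies each, rounds
HIST = 2 ** M                       # number of possible history strings

# Each strategy maps a history index to an action in {-1, +1}.
strategies = [[[random.choice((-1, 1)) for _ in range(HIST)] for _ in range(S)]
              for _ in range(N)]
scores = [[0] * S for _ in range(N)]
history = 0
attendance = []

for _ in range(T):
    actions = []
    for a in range(N):
        best = max(range(S), key=lambda s: scores[a][s])
        actions.append(strategies[a][best][history])
    A = sum(actions)                        # aggregate 'attendance'
    winner = -1 if A > 0 else 1             # the minority side wins
    for a in range(N):
        for s in range(S):                  # virtual scoring of all strategies
            scores[a][s] += strategies[a][s][history] * winner
    history = ((history << 1) | (1 if winner == 1 else 0)) % HIST
    attendance.append(A)

volatility = sum(A * A for A in attendance) / T   # the standard sigma^2 observable
```

Sweeping the control parameter 2^M / N moves `volatility` through the phase transition that the statistical-mechanics treatment reviewed above explains analytically.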
The main aim of this work is to incorporate selected findings from behavioural finance into a Heterogeneous Agent Model using the Brock and Hommes (1998) framework. Behavioural patterns are injected into an asset pricing framework through the so-called `Break Point Date', which allows us to examine their direct impact. In particular, we analyse the dynamics of the model around the behavioural break. Price behaviour of 30 Dow Jones Industrial Average constituents covering five particularly turbulent U.S. stock market periods reveals an interesting pattern in this respect. To replicate it, we apply numerical analysis using the Heterogeneous Agent Model extended with the selected findings from behavioural finance: herding, overconfidence, and market sentiment. We show that these behavioural breaks can be well modelled via the Heterogeneous Agent Model framework and that they extend the original model considerably. Various modifications lead to significantly different results, and the model with behavioural breaks is also able to partially replicate price behaviour found in the data during turbulent stock market periods.
1 vote
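A minimal two-type Brock and Hommes (1998) sketch: fundamentalists expect reversion to the fundamental price while trend-followers extrapolate, and the fractions of the two types update through a discrete-choice rule on past forecast errors. The parameter values, the memoryless fitness measure and the stabilising trend strength g < R are assumptions of this illustration, not the paper's extended model.

```python
import math
import random

random.seed(12)

beta_ioc = 2.0      # intensity of choice
R = 1.01            # gross risk-free return
g = 0.95            # trend-extrapolation strength (g < R keeps the sketch stable)

x = [0.1, 0.12]     # price deviations from the fundamental
for t in range(2, 500):
    # Forecast performance: negative squared error of last period's forecasts.
    u_fund = -(x[t - 1] - 0.0) ** 2               # fundamentalists predicted 0
    u_trend = -(x[t - 1] - g * x[t - 2]) ** 2     # trend-followers extrapolated
    # Discrete-choice (logit) fraction of fundamentalists.
    w_f = math.exp(beta_ioc * u_fund)
    w_c = math.exp(beta_ioc * u_trend)
    n_fund = w_f / (w_f + w_c)
    # Market clearing: deviation = discounted average expectation plus noise.
    expectation = n_fund * 0.0 + (1 - n_fund) * g * x[t - 1]
    x.append(expectation / R + 0.001 * random.gauss(0, 1))
```

A behavioural break of the kind studied in the paper would be injected by changing g, beta_ioc, or the fitness rule at a chosen `Break Point Date` and comparing the dynamics on either side of it.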