
# Papers

Despite all our great advances in science, technology and financial innovation, many societies today are struggling with financial, economic and public spending crises, over-regulation, and mass unemployment, as well as a lack of sustainability and innovation. Can we still rely on conventional economic thinking, or do we need a new approach?

I argue that, as the complexity of socio-economic systems increases, networked decision-making and bottom-up self-regulation will become increasingly important. It will be explained why, besides the "homo economicus" with strictly self-regarding preferences, natural selection has also created a "homo socialis" with other-regarding preferences. While the "homo economicus" optimizes its own prospects in isolation, the decisions of the "homo socialis" are self-determined but interconnected, a fact that may be characterized by the term "networked minds". Notably, the "homo socialis" manages to earn higher payoffs than the "homo economicus".

I show that the "homo economicus" and the "homo socialis" imply different kinds of dynamics and distinct aggregate outcomes. Therefore, next to the traditional economics of the "homo economicus" ("economics 1.0"), a complementary theory must be developed for the "homo socialis". This economic theory might be called "economics 2.0" or "socionomics". The names are justified because the Web 2.0 is currently promoting a transition to a new market organization, which benefits from social media platforms and could be characterized as a "participatory market society". To thrive, the "homo socialis" requires suitable institutional settings, such as particular kinds of reputation systems, which will be sketched in this paper. I also propose a new kind of money, so-called "qualified money", which may overcome some of the problems of our current financial system.
1 vote · pdf, other (11 views, 7 downloads, 0 comments)
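The payoff claim can be made concrete with a toy game that is not taken from the paper: in a one-shot Prisoner's Dilemma with the standard payoffs (assumed here: T=5, R=3, P=1, S=0), an other-regarding utility that puts weight `alpha` on the partner's payoff makes mutual cooperation a best response, so "networked minds" end up with higher material payoffs than strict self-optimizers. A minimal sketch, with `alpha` and all payoff values as illustrative assumptions:

```python
# Toy sketch (not the paper's model): one-shot Prisoner's Dilemma with
# standard payoffs and an other-regarding utility
# u = (1 - alpha) * own + alpha * other.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def utility(my, other, alpha):
    own, their = PAYOFF[(my, other)]
    return (1 - alpha) * own + alpha * their

def best_response(other, alpha):
    """Action maximizing the (possibly other-regarding) utility."""
    return max("CD", key=lambda my: utility(my, other, alpha))

# alpha = 0 ("homo economicus"): defection dominates -> payoff (1, 1).
# alpha = 0.5 ("homo socialis"): (C, C) is an equilibrium -> payoff (3, 3).
for alpha in (0.0, 0.5):
    print(alpha, best_response("C", alpha), best_response("D", alpha))
```

With `alpha = 0` the unique equilibrium is mutual defection, yielding 1 each; with `alpha = 0.5` cooperation is a best response to either action, so mutual cooperation yields 3 each.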
We introduce the concept of self-healing in the field of complex networks. Obvious applications range from infrastructural to technological networks. By exploiting the presence of redundant links in recovering the connectivity of the system, we introduce self-healing capabilities through the application of distributed communication protocols that grant the system its "smartness". We analyze the interplay between redundancies and smart reconfiguration protocols in improving the resilience of networked infrastructures to multiple failures; in particular, we measure the fraction of nodes still served for increasing levels of network damage. We study the effects of different connectivity patterns (planar square grids, small-world, scale-free networks) on the healing performance. The study of small-world topologies shows that the introduction of a few long-range connections in the planar grids greatly enhances the resilience to multiple failures, giving results comparable to the most resilient (but less realistic) scale-free structures.
1 vote · pdf, other (14 views, 10 downloads, 0 comments)
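The served-fraction measurement can be sketched in plain Python (the grid size, number of failed nodes, and number of redundant shortcuts below are arbitrary choices, not the paper's settings): since adding small-world shortcuts to a planar grid only enlarges the edge set, the set of nodes still reachable from a supply node after failures can only grow.

```python
import random
from collections import deque

def grid_edges(n):
    """Edges of an n x n planar square grid."""
    edges = set()
    for r in range(n):
        for c in range(n):
            if r + 1 < n: edges.add(((r, c), (r + 1, c)))
            if c + 1 < n: edges.add(((r, c), (r, c + 1)))
    return edges

def served_fraction(nodes, edges, failed, source):
    """Fraction of surviving nodes still reachable from the source."""
    alive = set(nodes) - failed
    adj = {v: [] for v in alive}
    for a, b in edges:
        if a in alive and b in alive:
            adj[a].append(b); adj[b].append(a)
    seen, queue = {source}, deque([source])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in seen:
                seen.add(w); queue.append(w)
    return len(seen) / len(alive)

rng = random.Random(0)
n = 10
nodes = [(r, c) for r in range(n) for c in range(n)]
base = grid_edges(n)
# Redundant "small-world" shortcuts available for smart reconfiguration.
shortcuts = {tuple(sorted(rng.sample(nodes, 2))) for _ in range(40)}
failed = set(rng.sample([v for v in nodes if v != (0, 0)], 30))

f_grid = served_fraction(nodes, base, failed, (0, 0))
f_sw = served_fraction(nodes, base | shortcuts, failed, (0, 0))
print(f"grid: {f_grid:.2f}  with shortcuts: {f_sw:.2f}")
```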
Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, notably the possibility of modeling emergent phenomena within large populations. As a consequence, the quantity in need of calibration is often a distribution over the population whose relation to the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework, grounded in indirect inference, for the calibration of agent-based models with distributional output. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.
1 vote · pdf, ps, other (9 views, 5 downloads, 0 comments)
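A stripped-down version of such a simulation-based calibration loop can be written in a few lines. The toy model (geometric agent activity), the auxiliary statistics (mean and variance of the distributional output), and the grid search are all illustrative assumptions, not the paper's choices:

```python
import random

def simulate(p, n_agents, rng):
    """Toy agent-based model: each agent's activity count ~ Geometric(p)."""
    out = []
    for _ in range(n_agents):
        k = 0
        while rng.random() > p:
            k += 1
        out.append(k)
    return out

def aux_stats(sample):
    """Auxiliary statistics summarizing the distributional output."""
    m = sum(sample) / len(sample)
    v = sum((x - m) ** 2 for x in sample) / len(sample)
    return m, v

rng = random.Random(42)
observed = simulate(0.3, 4000, rng)      # stand-in for real data
target = aux_stats(observed)

# Indirect inference: pick the parameter whose simulated auxiliary
# statistics best match those of the observed distribution. A fixed
# simulation seed (common random numbers) keeps the objective smooth.
def distance(p):
    m, v = aux_stats(simulate(p, 4000, random.Random(7)))
    return (m - target[0]) ** 2 + (v - target[1]) ** 2

grid = [i / 100 for i in range(10, 60)]
p_hat = min(grid, key=distance)
print("calibrated p =", p_hat)
```

The recovered `p_hat` should land close to the true value 0.3 used to generate the pseudo-observed data.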
The ability to understand and eventually predict the emergence of information and activation cascades in social networks is core to complex socio-technical systems research. However, the complexity of social interactions makes this a challenging enterprise. Previous work on cascade models assumes that the emergence of this collective phenomenon is related to the activity observed in the local neighborhood of individuals, but does not consider what determines the willingness to spread information in a time-varying process. Here we present a mechanistic model that accounts for the temporal evolution of the individual state in a simplified setup. We model the activity of the individuals as a complex network of interacting integrate-and-fire oscillators. The model reproduces the statistical characteristics of cascades in real systems and provides a framework for studying the time evolution of cascades in a state-dependent activity scenario.
1 vote · pdf, ps, other (8 views, 3 downloads, 0 comments)
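A minimal integrate-and-fire sketch on a random directed graph shows how a single threshold crossing can trigger a multi-node activation cascade. The drive amplitude, coupling strength `eps`, and graph construction below are arbitrary assumptions, not the paper's calibration:

```python
import random

def run_cascades(n, k, eps, steps, seed=1):
    """Integrate-and-fire units on a random k-out-neighbor graph. Each
    step a random node is driven; crossing threshold 1 makes it fire,
    reset by 1, and push eps of activation onto its neighbors, which
    may fire in turn (at most once per cascade)."""
    rng = random.Random(seed)
    nbrs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    state = [rng.random() for _ in range(n)]
    sizes = []
    for _ in range(steps):
        i = rng.randrange(n)
        state[i] += 0.2                      # external drive
        frontier, fired = [i], set()
        while frontier:                      # relax: propagate firings
            j = frontier.pop()
            if state[j] >= 1.0 and j not in fired:
                fired.add(j)
                state[j] -= 1.0
                for w in nbrs[j]:
                    state[w] += eps
                    frontier.append(w)
        if fired:
            sizes.append(len(fired))
    return sizes

sizes = run_cascades(n=200, k=4, eps=0.2, steps=5000)
print("cascades:", len(sizes), "max size:", max(sizes))
```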
Heavy-tailed distributions of meme popularity occur naturally in a model of meme diffusion on social networks. Competition between multiple memes for the limited resource of user attention is identified as the mechanism that poises the system at criticality. The popularity growth of each meme is described by a critical branching process, and asymptotic analysis predicts power-law distributions of popularity with very heavy tails (exponent $\alpha<2$, unlike preferential-attachment models), similar to those seen in empirical data.
1 vote · other (21 views, 9 downloads, 0 comments)
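The critical-branching mechanism is easy to reproduce: a Galton-Watson process with mean-one Poisson offspring (a standard choice for illustration, not necessarily the paper's exact setup) generates mostly tiny cascades plus occasional very large ones, the signature of a heavy-tailed popularity distribution.

```python
import math, random

def poisson(rng, lam):
    """Knuth's multiplication method (fine for small lambda)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def cascade_size(rng, mean_offspring=1.0, cap=10**5):
    """Total progeny of a Galton-Watson branching process with Poisson
    offspring; mean 1 puts the process exactly at criticality."""
    size = active = 1
    while active and size < cap:
        new = sum(poisson(rng, mean_offspring) for _ in range(active))
        size += new
        active = new
    return size

rng = random.Random(3)
sizes = sorted(cascade_size(rng) for _ in range(2000))
# Criticality: most memes stay tiny, a few become very popular.
print("median:", sizes[1000], "99th pct:", sizes[1980], "max:", sizes[-1])
```

For the critical case the median total size is tiny (the probability of size 1 is $e^{-1} \approx 0.37$), while the maximum over a few thousand runs is typically orders of magnitude larger.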
This paper provides a substantial reconceptualization of the serial clearing of the product market on the basis of structural axioms. The change of premises is required simply because from the accustomed premises only the accustomed conclusions can be derived, and these are known to be inapplicable in the real world. This holds in particular for the still popular idea that the working of a market can be described in terms of the triad demand function–supply function–equilibrium. Structural axiomatization provides a complete and consistent picture of interrelated product market events.
1 vote · pdf, ps, other (22 views, 9 downloads, 0 comments)
We study a phenomenological model for the continuous double auction, equivalent to two independent $M/M/1$ queues. The continuous double auction defines a continuous-time random walk for trade prices. The conditions for ergodicity of the auction are derived and, as a consequence, three possible regimes in the behavior of prices and logarithmic returns are identified. In the ergodic regime, prices are unstable and the logarithmic returns display intermittent behavior. On the contrary, non-ergodicity brings stability of prices, even though two different regimes can still be distinguished.
1 vote · pdf, ps, other (17 views, 4 downloads, 0 comments)
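The ergodicity condition in question is the standard $M/M/1$ stability condition $\lambda < \mu$. A short event-driven simulation of one queue (one side of the auction; the rates below are arbitrary) confirms that in the ergodic regime the fraction of time the server is busy converges to $\lambda/\mu$:

```python
import random

def mm1_busy_fraction(lam, mu, t_end, seed=5):
    """Event-driven M/M/1 simulation; returns the fraction of time the
    server is busy. The queue is ergodic iff lam < mu, in which case
    the busy fraction converges to lam / mu."""
    rng = random.Random(seed)
    t, q, busy = 0.0, 0, 0.0
    next_arrival = rng.expovariate(lam)
    next_service = float("inf")
    while t < t_end:
        if next_arrival <= next_service:     # arrival event
            dt, t = next_arrival - t, next_arrival
            if q > 0:
                busy += dt
            q += 1
            if q == 1:
                next_service = t + rng.expovariate(mu)
            next_arrival = t + rng.expovariate(lam)
        else:                                # service completion
            dt, t = next_service - t, next_service
            busy += dt
            q -= 1
            next_service = t + rng.expovariate(mu) if q else float("inf")
    return busy / t

# One of the two independent queues (buy or sell side of the auction):
rho = mm1_busy_fraction(lam=0.6, mu=1.0, t_end=50000)
print(f"empirical utilization {rho:.3f} vs lam/mu = 0.600")
```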
We have analyzed the Indices of Industrial Production (Seasonal Adjustment Index) over a long period of 240 months (January 1988 to December 2007) to develop a deeper understanding of economic shocks. The angular frequencies, estimated using the Hilbert transform, are almost identical for the 16 industrial sectors. Moreover, partial phase locking was observed for the 16 sectors. These findings are direct evidence of synchronization in the Japanese business cycle. We also show that the information about economic shocks is carried by the phase time series. The common shock and individual shocks are separated using the phase time series; the former dominates the economic shocks in all of 1992, 1998 and 2001. The obtained results suggest that the business cycle may be described as the dynamics of coupled limit-cycle oscillators exposed to common shocks and random individual shocks.
2 votes · pdf, ps, other (47 views, 24 downloads, 0 comments)
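The phase-extraction step can be reproduced with a plain DFT implementation of the Hilbert transform. The synthetic 40-month cosine below is a stand-in for a sector's production index, not the actual data; only the 240-month length matches the paper:

```python
import cmath, math

def analytic_signal(x):
    """Analytic signal via the DFT (discrete Hilbert transform): zero
    the negative-frequency bins and double the positive ones."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2
        elif k > n / 2:
            X[k] = 0
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

# Toy monthly series for one sector: a clean 40-month cycle, 240 months.
n, omega = 240, 2 * math.pi / 40
series = [math.cos(omega * t) for t in range(n)]
phase = [cmath.phase(v) for v in analytic_signal(series)]

# Instantaneous angular frequency = phase increment per month.
dphi = [(phase[t + 1] - phase[t]) % (2 * math.pi) for t in range(n - 1)]
est = sorted(dphi)[len(dphi) // 2]           # median increment
print(f"estimated angular frequency {est:.4f} vs true {omega:.4f}")
```

Comparing the phase time series extracted this way across sectors is what reveals the partial phase locking reported in the paper.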
We present and discuss a stochastic model of financial asset dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous auto-regressive component and a random rescaling factor embodying exogenous influences. Mathematical properties like stationarity and ergodicity of the increments can be proven. Thanks to the relatively low number of parameters, model calibration can conveniently be based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with auto-regressive models widely used in finance, and the possibility of partially resolving the endogenous and exogenous components of the volatility, with consistent results when applied to historical series.
1 vote · pdf, ps, other (26 views, 15 downloads, 0 comments)
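The endogenous-times-exogenous structure can be caricatured in a few lines: an AR(1) log-volatility (endogenous component) multiplied by an i.i.d. Gaussian shock (exogenous rescaling). All parameter values below are invented for illustration, yet even this toy version exhibits the volatility clustering the calibrated model reproduces:

```python
import math, random

def simulate_returns(n, phi=0.95, sigma_eta=0.3, seed=11):
    """Toy multiplicative model: auto-regressive log-volatility
    (endogenous) times an i.i.d. Gaussian shock (exogenous)."""
    rng = random.Random(seed)
    h, out = 0.0, []
    for _ in range(n):
        h = phi * h + sigma_eta * rng.gauss(0, 1)   # endogenous component
        out.append(math.exp(h) * rng.gauss(0, 1))   # exogenous rescaling
    return out

def autocorr(x, lag):
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t + lag] - m) for t in range(len(x) - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

r = simulate_returns(20000)
acf_raw = autocorr(r, 1)
acf_abs = autocorr([abs(v) for v in r], 1)
# Volatility clustering: raw returns nearly uncorrelated, absolute
# returns strongly positively autocorrelated.
print(f"acf(r, 1) = {acf_raw:+.3f}  acf(|r|, 1) = {acf_abs:+.3f}")
```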
This paper sets up a methodology for approximately solving optimal investment problems using duality methods combined with Monte Carlo simulations. In particular, we show how to tackle high dimensional problems in incomplete markets, where traditional methods fail due to the curse of dimensionality.
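The primal half of such a methodology can be sketched in the simplest possible setting: one period, log utility, a single lognormal stock, and a grid search over allocations, with the expected utility estimated by Monte Carlo. All parameter values are assumptions, and the dual side of the method, which would produce matching upper bounds on the value function, is omitted here:

```python
import math, random

def best_allocation(mu=0.03, sigma=0.3, n_paths=20000, seed=17):
    """One-period log-utility problem: wealth w(f) = 1 - f + f * G with
    lognormal gross return G. Common random numbers across the grid of
    allocations f keep the Monte Carlo comparison stable."""
    rng = random.Random(seed)
    gross = [math.exp(mu - 0.5 * sigma**2 + sigma * rng.gauss(0, 1))
             for _ in range(n_paths)]

    def expected_log_wealth(f):
        return sum(math.log(1 - f + f * g) for g in gross) / n_paths

    grid = [i / 20 for i in range(21)]        # f in {0, 0.05, ..., 1}
    return max(grid, key=expected_log_wealth)

f_star = best_allocation()
# Continuous-time benchmark: the Merton fraction mu / sigma^2 = 1/3.
print("Monte Carlo allocation:", f_star)
```

The Monte Carlo optimum should land near the Merton benchmark $\mu/\sigma^2 \approx 0.33$; in high dimensions, where such grid searches fail, the duality bounds the paper develops take over.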