# Papers

Despite all our great advances in science, technology and financial innovation, many societies today are struggling with a financial, economic and public spending crisis, over-regulation, and mass unemployment, as well as a lack of sustainability and innovation. Can we still rely on conventional economic thinking, or do we need a new approach?

I argue that, as the complexity of socio-economic systems increases, networked decision-making and bottom-up self-regulation will become increasingly important features. It will be explained why, besides the "homo economicus" with strictly self-regarding preferences, natural selection has also created a "homo socialis" with other-regarding preferences. While the "homo economicus" optimizes its own prospects in isolation, the decisions of the "homo socialis" are self-determined but interconnected, a fact that may be characterized by the term "networked minds". Notably, the "homo socialis" manages to earn higher payoffs than the "homo economicus".

I show that the "homo economicus" and the "homo socialis" imply different kinds of dynamics and distinct aggregate outcomes. Therefore, next to the traditional economics of the "homo economicus" ("economics 1.0"), a complementary theory must be developed for the "homo socialis". This economic theory might be called "economics 2.0" or "socionomics". The names are justified because the Web 2.0 is currently promoting a transition to a new market organization, which benefits from social media platforms and could be characterized as a "participatory market society". To thrive, the "homo socialis" requires suitable institutional settings, such as particular kinds of reputation systems, which will be sketched in this paper. I also propose a new kind of money, so-called "qualified money", which may overcome some of the problems of our current financial system.
We introduce the concept of self-healing in the field of complex networks. Obvious applications range from infrastructural to technological networks. By exploiting the presence of redundant links to recover the connectivity of the system, we introduce self-healing capabilities through the application of distributed communication protocols that provide the "smartness" of the system. We analyze the interplay between redundancies and smart reconfiguration protocols in improving the resilience of networked infrastructures to multiple failures; in particular, we measure the fraction of nodes still served for increasing levels of network damage. We study the effects of different connectivity patterns (planar square grids, small-world, scale-free networks) on the healing performance. The study of small-world topologies shows that the introduction of some long-range connections in the planar grids greatly enhances the resilience to multiple failures, giving results comparable to the most resilient (but less realistic) scale-free structures.
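The interplay between dormant redundant links and post-damage reconfiguration can be sketched in a few lines (a toy illustration, not the paper's distributed protocol; the 70/30 split into active and backup links, the damage level, and the single source node are all illustrative assumptions):

```python
import random
from collections import deque

def served_fraction(nodes, edges, source):
    """Fraction of nodes reachable from the source over the given edges (BFS)."""
    adj = {n: [] for n in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    seen, queue = {source}, deque([source])
    while queue:
        for m in adj[queue.popleft()]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return len(seen) / len(nodes)

random.seed(3)
side = 10
nodes = [(x, y) for x in range(side) for y in range(side)]
grid = [((x, y), (x + 1, y)) for x in range(side - 1) for y in range(side)] + \
       [((x, y), (x, y + 1)) for x in range(side) for y in range(side - 1)]

# Split the grid into active links and a dormant redundant backup set.
random.shuffle(grid)
cut = int(0.7 * len(grid))
active, dormant = grid[:cut], grid[cut:]

# Damage the network: a quarter of the active links fail at random.
surviving = random.sample(active, int(0.75 * len(active)))

before = served_fraction(nodes, surviving, (0, 0))
after = served_fraction(nodes, surviving + dormant, (0, 0))  # healing: switch on backups
print(f"served before healing: {before:.2f}, after: {after:.2f}")
```

Since activating backups can only add connectivity, the served fraction never decreases after healing.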
Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, namely the possibility of modeling emergent phenomena within large populations. As a consequence, often the quantity in need of calibration may be a distribution over the population whose relation with the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework for the calibration of agent-based models with distributional output based on indirect inference. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.
The ability to understand and eventually predict the emergence of information and activation cascades in social networks is core to complex socio-technical systems research. However, the complexity of social interactions makes this a challenging enterprise. Previous works on cascade models assume that the emergence of this collective phenomenon is related to the activity observed in the local neighborhood of individuals, but do not consider what determines the willingness to spread information in a time-varying process. Here we present a mechanistic model that accounts for the temporal evolution of the individual state in a simplified setup. We model the activity of the individuals as a complex network of interacting integrate-and-fire oscillators. The model reproduces the statistical characteristics of the cascades in real systems, and provides a framework to study time-evolution of cascades in a state-dependent activity scenario.
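A fire-once threshold cascade gives a minimal caricature of the integrate-and-fire mechanism (this simplifies the paper's oscillator model: no leak, no reset, and the random graph, thresholds, and unit charge are illustrative assumptions):

```python
import random

def cascade_size(neighbors, thresholds, seed_node, charge=1.0):
    """Each firing node adds `charge` to its neighbours' state; a node fires
    (once) when its accumulated state crosses its threshold."""
    state = {n: 0.0 for n in neighbors}
    fired = {seed_node}
    frontier = [seed_node]
    while frontier:
        nxt = []
        for node in frontier:
            for m in neighbors[node]:
                if m not in fired:
                    state[m] += charge
                    if state[m] >= thresholds[m]:
                        fired.add(m)
                        nxt.append(m)
        frontier = nxt
    return len(fired)

random.seed(7)
n = 200
# Random graph with average degree ~4 as a stand-in for the interaction network.
neighbors = {i: set() for i in range(n)}
while sum(len(v) for v in neighbors.values()) < 4 * n:
    a, b = random.sample(range(n), 2)
    neighbors[a].add(b)
    neighbors[b].add(a)

thresholds = {i: random.uniform(0.5, 2.0) for i in range(n)}
sizes = [cascade_size(neighbors, thresholds, random.randrange(n)) for _ in range(50)]
print("mean cascade size:", sum(sizes) / len(sizes))
```

With thresholds between 0.5 and 2.0, some nodes fire after a single input and others need two, which is what lets cascades either die out locally or percolate.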
Heavy-tailed distributions of meme popularity occur naturally in a model of meme diffusion on social networks. Competition between multiple memes for the limited resource of user attention is identified as the mechanism that poises the system at criticality. The popularity growth of each meme is described by a critical branching process, and asymptotic analysis predicts power-law distributions of popularity with very heavy tails (exponent $\alpha<2$, unlike preferential-attachment models), similar to those seen in empirical data.
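The critical branching picture is easy to reproduce: a Galton-Watson process with mean offspring exactly 1 (here Poisson, an illustrative choice) already yields cascade sizes with a very heavy tail: most memes die young, while a few become enormous.

```python
import math
import random

def poisson(lam):
    # Knuth's method; fine for small lam.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p < L:
            return k
        k += 1

def cascade_total(mean_offspring=1.0, cap=10000):
    """Total progeny of a Galton-Watson branching process
    (total popularity of one meme), truncated at `cap`."""
    alive, total = 1, 1
    while alive and total < cap:
        alive = sum(poisson(mean_offspring) for _ in range(alive))
        total += alive
    return min(total, cap)

random.seed(1)
sizes = [cascade_total() for _ in range(2000)]
share_big = sum(s >= 100 for s in sizes) / len(sizes)
print("median size:", sorted(sizes)[1000], " share reaching 100+:", share_big)
```

At criticality the median cascade is tiny while the maximum is orders of magnitude larger, the signature of a power-law popularity distribution.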
This paper provides a substantial reconceptualization of the serial clearing of the product market on the basis of structural axioms. The change of premises is required simply because from the accustomed premises only the accustomed conclusions can be derived and these are known to be inapplicable in the real world. This holds in particular for the still popular idea that the working of a market can be described in terms of the triad demand function–supply function–equilibrium. Structural axiomatization provides the complete and consistent picture of interrelated product market events.
We study a phenomenological model for the continuous double auction, equivalent to two independent $M/M/1$ queues. The continuous double auction defines a continuous-time random walk for trade prices. The conditions for ergodicity of the auction are derived and, as a consequence, three possible regimes in the behavior of prices and logarithmic returns are observed. In the ergodic regime, prices are unstable and one can observe an intermittent behavior in the logarithmic returns. On the contrary, non-ergodicity triggers stability of prices, even if two different regimes can be seen.
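The ergodicity condition is the textbook one for an $M/M/1$ queue, arrival rate $\lambda$ below service rate $\mu$; a short simulation of the continuous-time chain (rates are illustrative, not calibrated to any market) recovers the stationary mean queue length $\rho/(1-\rho)$ with $\rho = \lambda/\mu$:

```python
import random

def simulate_mm1(lam, mu, steps=200000, seed=0):
    """Simulate an M/M/1 queue event by event; return the time-averaged length."""
    rng = random.Random(seed)
    q, area, t = 0, 0.0, 0.0
    for _ in range(steps):
        rate = lam + (mu if q > 0 else 0)   # total event rate in this state
        dt = rng.expovariate(rate)          # time to the next event
        area += q * dt
        t += dt
        if rng.random() < lam / rate:       # arrival wins the race
            q += 1
        else:                               # departure (only possible if q > 0)
            q -= 1
    return area / t

lam, mu = 0.5, 1.0                          # ergodic regime: lam < mu
rho = lam / mu
avg = simulate_mm1(lam, mu)
print("simulated mean queue length:", avg, " theory:", rho / (1 - rho))
```

When $\lambda \ge \mu$ the queue length drifts off to infinity and no stationary average exists, which is the non-ergodic regime discussed in the abstract.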
We have analyzed the Indices of Industrial Production (Seasonal Adjustment Index) over a long period of 240 months (January 1988 to December 2007) to develop a deeper understanding of economic shocks. The angular frequencies, estimated using the Hilbert transform, are almost identical across the 16 industrial sectors. Moreover, partial phase locking was observed for the 16 sectors. These observations provide direct evidence of synchronization in the Japanese business cycle. We also show that the information about economic shocks is carried by the phase time series. The common shock and individual shocks are separated using the phase time series; the former dominates the economic shocks in 1992, 1998 and 2001. The obtained results suggest that the business cycle may be described as the dynamics of coupled limit-cycle oscillators exposed to common shocks and random individual shocks.
We present and discuss a stochastic model of financial asset dynamics based on the idea of an inverse renormalization-group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous auto-regressive component and a random rescaling factor embodying exogenous influences. Mathematical properties like stationarity of the increments and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance in terms of obtaining closed formulas for derivative pricing. Two further important features are the possibility of making contact, in certain limits, with auto-regressive models widely used in finance, and the possibility of partially resolving the endogenous and exogenous components of the volatility, with consistent results when applied to historical series.
This paper sets up a methodology for approximately solving optimal investment problems using duality methods combined with Monte Carlo simulations. In particular, we show how to tackle high dimensional problems in incomplete markets, where traditional methods fail due to the curse of dimensionality.
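For contrast with the duality approach, the brute-force baseline it improves upon can be sketched directly: a plain Monte Carlo grid search over allocations for a one-period log-utility investor (the lognormal asset and all parameter values are illustrative assumptions, not the paper's setup; this naive search is exactly what breaks down in high dimensions):

```python
import math
import random

def expected_log_utility(weight, n_paths=20000, mu=0.05, sigma=0.3, r=0.01, seed=42):
    """Monte Carlo estimate of E[log wealth] after one period for a fixed
    fraction `weight` in a lognormal risky asset, the rest at the risk-free rate.
    A fixed seed gives common random numbers across candidate weights."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        gross_risky = math.exp(mu - 0.5 * sigma ** 2 + sigma * rng.gauss(0, 1))
        wealth = weight * gross_risky + (1 - weight) * math.exp(r)
        total += math.log(wealth)
    return total / n_paths

weights = [i / 10 for i in range(11)]
best = max(weights, key=expected_log_utility)
print("best risky-asset weight on the grid:", best)
```

With these numbers the log-optimal fraction is interior, so the search should land mid-grid; in many dimensions the grid explodes combinatorially, which is the curse of dimensionality the paper's duality method sidesteps.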
One of the most important features of spatial networks such as transportation networks, power grids, the Internet, and neural networks is the existence of a cost associated with the length of links. Such a cost has a profound influence on the global structure of these networks, which usually display a hierarchical spatial organization. The link between local constraints and large-scale structure has, however, not been elucidated, and we introduce here a generic model for the growth of spatial networks based on the general concept of cost-benefit analysis. This model depends essentially on one single scale and produces a family of networks which range from the star graph to the minimum spanning tree and which are characterised by a continuously varying exponent. We show that spatial hierarchy emerges naturally, with structures composed of various hubs controlling geographically separated service areas, and appears as a large-scale consequence of local cost-benefit considerations. Our model thus provides the first building blocks for a better understanding of the evolution of spatial networks and their properties. We also find that, surprisingly, the average detour is minimal in the intermediate regime, as a result of a large diversity in link lengths. Finally, we estimate the important parameters for various world railway networks and find that, remarkably, they all fall in this intermediate regime, suggesting that spatial hierarchy is a crucial feature for these systems and probably possesses an important evolutionary advantage.
We analyze realized volatilities constructed from high-frequency stock data on the Tokyo Stock Exchange. In order to avoid the non-trading-hours issue in volatility calculations, we define two realized volatilities calculated separately for the two trading sessions of the Tokyo Stock Exchange, i.e. the morning and afternoon sessions. After calculating the realized volatilities at various sampling frequencies, we evaluate the bias from microstructure noise as a function of the sampling frequency. Taking this bias into account, we examine returns standardized by realized volatilities and confirm that price returns on the Tokyo Stock Exchange are described approximately by Gaussian time series with time-varying volatility, i.e. they are consistent with the mixture-of-distributions hypothesis.
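The frequency dependence of the bias is easy to see in simulation: adding i.i.d. microstructure noise to a random-walk efficient price inflates realized volatility more, the finer the sampling (all magnitudes below are illustrative, not TSE estimates):

```python
import math
import random

def realized_volatility(prices, step):
    """RV = sum of squared log returns, sampled every `step` ticks."""
    sampled = prices[::step]
    return sum(math.log(b / a) ** 2 for a, b in zip(sampled, sampled[1:]))

random.seed(5)
true_var = 1e-8      # per-tick variance of the efficient log price
noise = 2e-4         # std of microstructure noise on the observed log price

eff = [0.0]
for _ in range(10000):
    eff.append(eff[-1] + random.gauss(0, math.sqrt(true_var)))
obs = [100 * math.exp(x + random.gauss(0, noise)) for x in eff]

rvs = {step: realized_volatility(obs, step) for step in (1, 10, 100)}
for step, rv in rvs.items():
    print(f"sampling every {step:3d} ticks: RV = {rv:.2e}")
```

Coarser sampling averages the noise away (the noise term scales with the number of sampled returns), at the cost of a noisier estimate of the integrated variance; this is the bias-variance trade-off behind choosing a sampling frequency.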
The stochastic volatility (SV) model is one of the volatility models that infer the latent volatility of asset returns. Bayesian inference for the SV model is performed with the hybrid Monte Carlo (HMC) algorithm, which is superior to other Markov chain Monte Carlo methods in sampling the volatility variables. We perform HMC simulations of the SV model for two liquid stock returns traded on the Tokyo Stock Exchange and measure the volatilities of those stock returns. We then calculate the accuracy of the volatility measurement using the realized volatility as a proxy for the true volatility, and compare the SV model with the GARCH model, another commonly used volatility model. Using the accuracy calculated with the realized volatility, we find that empirically the SV model performs better than the GARCH model.
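For reference, the data-generating process of a canonical log-AR(1) SV model can be written in a few lines (parameter values are illustrative; this sketches the model itself, not the paper's HMC estimation):

```python
import math
import random
import statistics

def simulate_sv(n=20000, mu=-1.0, phi=0.97, tau=0.15, seed=11):
    """Basic stochastic volatility model:
       h_t = mu + phi*(h_{t-1} - mu) + tau*eta_t,   r_t = exp(h_t/2)*eps_t,
    with eta_t, eps_t independent standard normals."""
    rng = random.Random(seed)
    h = mu
    returns = []
    for _ in range(n):
        h = mu + phi * (h - mu) + tau * rng.gauss(0, 1)
        returns.append(math.exp(h / 2) * rng.gauss(0, 1))
    return returns

r = simulate_sv()
m2 = statistics.fmean(x ** 2 for x in r)
m4 = statistics.fmean(x ** 4 for x in r)
print("sample excess kurtosis:", m4 / m2 ** 2 - 3)
```

Even though the innovations are Gaussian, the time-varying volatility produces clearly positive excess kurtosis, the fat tails that both SV and GARCH models are designed to capture.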
The question in the title crossed my mind one day as I held a paper on nuclear physics in one hand and a paper on finance in the other, and concluded, to my surprise, that the same formula appeared in both articles. Phenomena from apparently completely different fields of research were solved with the help of the same equation. Things get even weirder when I say that the formula in question is the time-independent Schrödinger equation.
Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
The main aim of this work is to incorporate selected findings from behavioural finance into a Heterogeneous Agent Model using the Brock and Hommes (1998) framework. Behavioural patterns are injected into an asset pricing framework through the so-called 'Break Point Date', which allows us to examine their direct impact. In particular, we analyse the dynamics of the model around the behavioural break. The price behaviour of 30 Dow Jones Industrial Average constituents covering five particularly turbulent U.S. stock market periods reveals interesting patterns in this respect. To replicate them, we apply numerical analysis using the Heterogeneous Agent Model extended with the selected findings from behavioural finance: herding, overconfidence, and market sentiment. We show that these behavioural breaks can be well modelled via the Heterogeneous Agent Model framework and that they extend the original model considerably. Various modifications lead to significantly different results, and the model with behavioural breaks is also able to partially replicate the price behaviour found in the data during turbulent stock market periods.
We investigate the relation between economic growth and equality in a modified version of the agent-based asset exchange model (AEM). The modified model is a driven system that, for a range of parameter space, is effectively ergodic in the limit of an infinite system. We find that the belief that "a rising tide lifts all boats" does not always hold; rather, the effect of growth on the wealth distribution depends on the nature of the growth. In particular, we find that the rate of growth, the way the growth is distributed, and the percentage of wealth exchanged determine the degree of equality. We find strong numerical evidence that there is a phase transition in the modified model, and for a part of parameter space the modified AEM acts like a geometric random walk.
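A minimal yard-sale-style variant of the AEM (our own simplification; the exchange fraction, growth rate, and the two injection schemes are illustrative assumptions, not the paper's exact model) shows how the Gini coefficient responds to how growth is distributed:

```python
import random

def gini(wealth):
    """Gini coefficient of a wealth list (0 = perfect equality)."""
    w = sorted(wealth)
    n = len(w)
    return 2 * sum((i + 1) * x for i, x in enumerate(w)) / (n * sum(w)) - (n + 1) / n

def run_aem(n=100, steps=20000, exchange=0.1, growth=0.0, spread="equal", seed=2):
    """Yard-sale-style exchange: a random pair stakes a fraction of the poorer
    agent's wealth on a fair coin flip; growth, if any, is injected each step,
    spread either equally or to a single random agent."""
    rng = random.Random(seed)
    w = [1.0] * n
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        stake = exchange * min(w[i], w[j])
        if rng.random() < 0.5:
            w[i], w[j] = w[i] + stake, w[j] - stake
        else:
            w[i], w[j] = w[i] - stake, w[j] + stake
        if growth:
            bonus = growth * sum(w)
            if spread == "equal":
                for k in range(n):
                    w[k] += bonus / n
            else:
                w[rng.randrange(n)] += bonus
    return gini(w)

g0 = run_aem()
geq = run_aem(growth=1e-4)
print("Gini without growth:", round(g0, 3), " with equally spread growth:", round(geq, 3))
```

Pure exchange drives wealth condensation (rising Gini); injecting the same total growth equally across agents counteracts it, illustrating why the way growth is distributed, not just its rate, matters.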
We consider hundreds of thousands of individual economic transactions to ask: how predictable are consumers in their merchant visitation patterns? Our results suggest that, in the long-run, much of our seemingly elective activity is actually highly predictable. Notwithstanding a wide range of individual preferences, shoppers share regularities in how they visit merchant locations over time. Yet while aggregate behavior is largely predictable, the interleaving of shopping events introduces important stochastic elements at short time scales. These short- and long-scale patterns suggest a theoretical upper bound on predictability, and describe the accuracy of a Markov model in predicting a person's next location. We incorporate population-level transition probabilities in the predictive models, and find that in many cases these improve accuracy. While our results point to the elusiveness of precise predictions about where a person will go next, they suggest the existence, at large time-scales, of regularities across the population.
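A first-order Markov predictor of the kind evaluated in such studies, with a population-level back-off, can be sketched on a toy visitation history (the merchants and the routine are invented for illustration):

```python
from collections import Counter, defaultdict

def train_markov(visits):
    """First-order Markov model: count transitions between consecutive merchants."""
    trans = defaultdict(Counter)
    for a, b in zip(visits, visits[1:]):
        trans[a][b] += 1
    return trans

def predict_next(trans, current, fallback):
    """Predict the most frequent successor; back off to the population favourite."""
    if trans[current]:
        return trans[current].most_common(1)[0][0]
    return fallback

# Toy visitation history: a coffee -> office -> grocery routine with a gym habit.
history = ["coffee", "office", "grocery"] * 30 + ["gym", "grocery"] * 10
trans = train_markov(history)
overall = Counter(history).most_common(1)[0][0]

hits = sum(predict_next(trans, a, overall) == b for a, b in zip(history, history[1:]))
print(f"in-sample next-visit accuracy: {hits / (len(history) - 1):.2f}")
```

The residual errors come exactly from the interleaving of routines (grocery is sometimes followed by coffee, sometimes by the gym), the short-time-scale stochasticity the abstract points to.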
We study a subset of the movie collaboration network, imdb.com, where only adult movies are included. We show that there are many benefits in using such a network, which can serve as a prototype for studying social interactions. We find that the strength of links, i.e., how many times two actors have collaborated with each other, is an important factor that can significantly influence the network topology. We see that when we link all actors in the same movie with each other, the network becomes small-world, lacking a proper modular structure. On the other hand, by imposing a threshold on the minimum number of links two actors should have to be in our studied subset, the network topology becomes naturally fractal. This occurs due to a large number of meaningless links, namely, links connecting actors that did not actually interact. We focus our analysis on the fractal and modular properties of this resulting network, and show that the renormalization group analysis can characterize the self-similar structure of these networks.
The focus of this work is on developing probabilistic models for user activity in social networks by incorporating the social network influence as perceived by the user. For this, we propose a coupled Hidden Markov Model, where each user's activity evolves according to a Markov chain with a hidden state that is influenced by the collective activity of the friends of the user. We develop generalized Baum-Welch and Viterbi algorithms for model parameter learning and state estimation for the proposed framework. We then validate the proposed model using a significant corpus of user activity on Twitter. Our numerical studies show that with sufficient observations to ensure accurate model learning, the proposed framework explains the observed data better than either a renewal process-based model or a conventional uncoupled Hidden Markov Model. We also demonstrate the utility of the proposed approach in predicting the time to the next tweet. Finally, clustering in the model parameter space is shown to result in distinct natural clusters of users characterized by the interaction dynamic between a user and his network.
This paper develops an agent-based model to examine the emergent dynamic properties of share market price formation over time, with a view to financial market stability under alternative accounting regimes. In the model, individual heterogeneous investors interact with each other and with institutional devices, namely an accounting system (related to the business firm) and a price system (related to the Share Exchange). These interactions provide transmission mechanisms through which firm-specific (accounting signal) and market-driven (aggregate price) drivers can act. A baseline simulation analysis assesses financial market stability under three alternative accounting designs, namely two kinds of historical cost accounting regime and one kind of fair value (mark-to-market) accounting regime. The former prove better at stabilizing the financial system against market volatility and exuberance under perfectly balanced conditions between speculative and fundamentalist beliefs and intentions. An evolutionary analysis is then developed by varying the relative degree of speculative attitudes. Historical cost accounting regimes further prove to make the financial system more resilient to speculative waves occurring at the inter-individual level. The baseline findings are further corroborated through experimental analysis in ten artificial financial systems. This mathematical institutional economic analysis has general implications both for designing accounting systems aimed at enhancing financial market stability and preventing pro-cyclicality, and for the study of the accounting information process in the formation of share market prices over time.
The increasing interdependencies between the world’s technological, socio-economic, and environmental systems have the potential to create global catastrophic risks. We may have to re-design many global networks, otherwise they could turn into "global time bombs".
In this paper we argue that if we want to find a more satisfactory approach to tackling the major socio-economic problems we are facing, we need to thoroughly rethink the basic assumptions of macroeconomics and financial theory. Making minor modifications to the standard models to remove "imperfections" is not enough; the whole framework needs to be revisited.
Economists are fond of physicists' powerful tools. As a popular mindset, toolism is as old as economics, but the transplanted tools failed to produce the successes they had achieved in their original environment. Economists therefore looked more and more to the math department for inspiration. Now the tide is turning again. The ongoing crisis discredits standard economics and offers econophysics the chance for a comeback. Modern econophysics commands the most powerful tools, and there are many occasions for their application. The present paper argues that it is not a change of tools that is most urgently needed, but a paradigm change.
Motivated by empirical data, we develop a statistical description of the queue dynamics for large-tick assets, based on a two-dimensional Fokker-Planck (diffusion) equation that explicitly includes state dependence, i.e. the fact that the drift and diffusion depend on the volume present on both sides of the spread. "Jump" events, corresponding to sudden changes of the best limit price, must also be included as birth-death terms in the Fokker-Planck equation. All quantities involved in the equation can be calibrated using high-frequency data on best quotes. One of our central findings is that the dynamical process is approximately scale invariant, i.e., the only relevant variable is the ratio of the current volume in the queue to its average value. While the latter shows intraday seasonalities and strong variability across stocks and time periods, the dynamics of the rescaled volumes is universal. In terms of rescaled volumes, we find that the drift has a complex two-dimensional structure, which is the sum of a gradient contribution and a rotational contribution, both stable across stocks and time. This drift term is entirely responsible for the dynamical correlations between the ask queue and the bid queue.
This paper investigates the relevance of the no-Ponzi-game condition for public debt (i.e. the public debt growth rate has to be lower than the real interest rate, a necessary assumption for Ricardian equivalence) and of the transversality condition for the GDP growth rate (i.e. the GDP growth rate has to be lower than the real interest rate). First, in the unbalanced panel of 21 countries from 1961 to 2010 available in the OECD database, those two conditions were simultaneously validated in only 29% of the cases under examination. Second, those two conditions were more frequent in the 1980s and the 1990s, when monetary policies were more restrictive. Third, in tune with the Keynesian view, a real interest rate above GDP growth corresponds to 75% of the cases in which the debt/GDP ratio increases, but to only 43% of the cases in which it decreases (fiscal consolidations).
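The arithmetic behind these conditions is the standard debt-dynamics recursion $b_{t+1} = b_t (1+r)/(1+g) - s$, where $b$ is the debt/GDP ratio, $r$ the real interest rate, $g$ real GDP growth, and $s$ the primary surplus as a share of GDP. With $r > g$ the ratio snowballs absent a surplus; with $r < g$ it melts away (the numbers below are illustrative, not from the OECD panel):

```python
def debt_ratio_path(b0, r, g, primary_surplus, years):
    """Debt/GDP path under b_{t+1} = b_t*(1+r)/(1+g) - s."""
    b, path = b0, [b0]
    for _ in range(years):
        b = b * (1 + r) / (1 + g) - primary_surplus
        path.append(b)
    return path

# Same initial 60% debt ratio, no primary surplus, 20 years.
snowball = debt_ratio_path(0.60, r=0.04, g=0.02, primary_surplus=0.0, years=20)
melt = debt_ratio_path(0.60, r=0.02, g=0.04, primary_surplus=0.0, years=20)
print(f"r > g: debt ratio after 20y = {snowball[-1]:.2f}")
print(f"r < g: debt ratio after 20y = {melt[-1]:.2f}")
```

This is why the sign of $r - g$, rather than the debt level alone, drives the 75%/43% asymmetry reported above.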
One of the fundamental principles driving diversity or homogeneity in domains such as cultural differentiation, political affiliation, and product adoption is the tension between two forces: influence (the tendency of people to become similar to others they interact with) and selection (the tendency to be affected most by the behavior of others who are already similar). Influence tends to promote homogeneity within a society, while selection frequently causes fragmentation. When both forces are in effect simultaneously, it becomes an interesting question to analyze which societal outcomes should be expected.

In order to study the joint effects of these forces more formally, we analyze a natural model built upon active lines of work in political opinion formation, cultural diversity, and language evolution. Our model posits an arbitrary graph structure describing which "types" of people can influence one another: this captures effects based on the fact that people are only influenced by sufficiently similar interaction partners. In a generalization of the model, we introduce another graph structure describing which types of people even so much as come in contact with each other. These restrictions on interaction patterns can significantly alter the dynamics of the process at the population level.

For the basic version of the model, in which all individuals come in contact with all others, we achieve an essentially complete characterization of (stable) equilibrium outcomes and prove convergence from all starting states. For the other extreme case, in which individuals only come in contact with others who have the potential to influence them, the underlying process is significantly more complicated; nevertheless we present an analysis for certain graph structures.
The advancement of various fields of science depends on the actions of individual scientists via the peer-review process. The referees' work patterns and the stochastic nature of their decision making relate both to the particular features of refereeing and to universal aspects of human behavior. Here, we show that the time a referee takes to write a report on a scientific manuscript depends on the final verdict. The data are compared to a model in which the review takes place amid an ongoing competition to complete an important composite task alongside a large number of concurrent ones: a "deadline effect". In peer review, human decision making and task completion combine long-range predictability with stochastic variation due to a large degree of ever-changing external "friction".
We study the time evolution of the ranking and spectral properties of the Google matrix of the English Wikipedia hyperlink network during the years 2003-2011. The statistical properties of the ranking of Wikipedia articles via PageRank and CheiRank probabilities, as well as the matrix spectrum, are shown to have stabilized over 2007-2011. Special emphasis is placed on the ranking of Wikipedia personalities and universities. We show that the PageRank selection is dominated by politicians, while 2DRank, which combines PageRank and CheiRank, gives more weight to personalities from the arts. The Wikipedia PageRank of universities recovers 80 percent of the top universities of the Shanghai ranking over the considered time period.
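PageRank itself is plain power iteration on the hyperlink graph; CheiRank is the same computation on the graph with all links reversed (the tiny graph below is illustrative, not Wikipedia data):

```python
def pagerank(links, damping=0.85, iters=100):
    """Power iteration for PageRank on a dict mapping node -> list of out-links."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}
        for v, outs in links.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:                       # dangling node: spread its mass uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

# Tiny hyperlink graph: C is the most "cited" page.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
pr = pagerank(links)
print(max(pr, key=pr.get))
```

Running the same function on the reversed link dict would yield the CheiRank-style ordering, which rewards pages with many outgoing links instead of incoming ones.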
The following fundamental properties are proved to be true if a financial market is exhaustive: (i) every event which is measurable by the price history at time $T$ is independent of $G_t$ conditional on the current price history $H_t$, where $G_t$ is a superset of $H_t$; (ii) every event which is measurable by $G_t$ is independent of $H_T$ conditional on $H_t$. These properties are especially useful for asset valuation, portfolio optimization and risk management. An exhaustive market with respect to $\{F_t\}$ is free of dominance and there are no free lunches with vanishing risk under $\{F_t\}$. Moreover, it is complete with respect to every information flow which is contained in $\{F_t\}$, and the growth-optimal portfolio at time $t$ is determined only by the past asset prices. This means any other information which is contained in $F_t$ and available to the investor at time $t$ is irrelevant.
We introduce a simple agent-based model which allows us to analyze three stylized facts, namely a fat-tailed size distribution of companies, a 'tent-shaped' growth rate distribution, and the scaling relation of the growth rate variance with firm size, as well as the causality between them. This is achieved under the simple hypothesis that firms compete for a scarce quantity (either aggregate demand or workforce) which is allocated probabilistically. The model allows us to relate the size and growth rate distributions. We compare the results of our model to simulations with other scaling relationships and to similar models, and relate it to existing theory.
Financial markets are prominent examples for highly non-stationary systems. Sample averaged observables such as variances and correlation coefficients strongly depend on the time window in which they are evaluated. This implies severe limitations for approaches in the spirit of standard equilibrium statistical mechanics and thermodynamics. Nevertheless, we show that there are similar generic features which we uncover in the empirical return distributions for whole markets. We explain our findings by setting up a random matrix model.
The probability distribution of the number of ties of an individual in a social network follows a scale-free power law. However, how this distribution arises has not been conclusively demonstrated in direct analyses of people's actions in social networks. Here, we perform a causal inference analysis and find an underlying cause for this phenomenon. Our analysis indicates that the heavy-tailed degree distribution is causally determined by a similarly skewed distribution of human activity. Specifically, the degree of an individual is entirely random, following a "maximum entropy attachment" model, except for its mean value, which depends deterministically on the volume of the user's activity. This relation cannot be explained by interactive models, like preferential attachment, since the observed actions are not likely to be caused by interactions with other people.
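Under one reading of "maximum entropy attachment" (our illustrative interpretation, not necessarily the paper's exact specification), the degree is a geometric random variable, the maximum-entropy distribution on the non-negative integers for a given mean, with that mean tied to the user's activity; a skewed activity distribution then induces a skewed degree distribution:

```python
import random
import statistics

def max_entropy_degree(activity, beta=2.0, rng=random):
    """Sample a degree from the max-entropy distribution on {0, 1, 2, ...}
    with mean beta*activity, i.e. a geometric law."""
    mean = beta * activity
    p = 1 / (1 + mean)          # success probability so that E[degree] = mean
    k = 0
    while rng.random() > p:     # count failures before the first success
        k += 1
    return k

random.seed(9)
# Heavy-tailed (Pareto) activity volumes; beta is an illustrative coupling.
activities = [random.paretovariate(1.5) for _ in range(20000)]
degrees = [max_entropy_degree(a) for a in activities]
print("mean activity:", statistics.fmean(activities),
      " mean degree:", statistics.fmean(degrees))
```

The degree of any single user is maximally random given its mean; the population-level heavy tail is inherited entirely from the activity distribution, with no interaction between users, which is the point of contrast with preferential attachment.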
How much did a network change since yesterday? How different is the wiring between Bob's brain (a left-handed male) and Alice's brain (a right-handed female)? Graph similarity with known node correspondence, i.e. the detection of changes in the connectivity of graphs, arises in numerous settings. In this work, we formally state the axioms and desired properties of the graph similarity functions, and evaluate when state-of-the-art methods fail to detect crucial connectivity changes in graphs. We propose DeltaCon, a principled, intuitive, and scalable algorithm that assesses the similarity between two graphs on the same nodes (e.g. employees of a company, customers of a mobile carrier). Experiments on various synthetic and real graphs showcase the advantages of our method over existing similarity measures. Finally, we employ DeltaCon to real applications: (a) we classify people to groups of high and low creativity based on their brain connectivity graphs, and (b) do temporal anomaly detection in the who-emails-whom Enron graph.
Punishment may deter antisocial behavior. Yet to punish is costly, and the costs often do not offset the gains that are due to elevated levels of cooperation. However, the effectiveness of punishment depends not only on how costly it is, but also on the circumstances defining the social dilemma. Using the snowdrift game as the basis, we have conducted a series of economic experiments to determine whether severe punishment is more effective than mild punishment. We have observed that severe punishment is not necessarily more effective, even if the cost of punishment is identical in both cases. The benefits of severe punishment become evident only under extremely adverse conditions, when cooperation is highly improbable in the absence of sanctions. If cooperation is likely, mild punishment is no less effective, leads to higher average payoffs, and is thus the much preferred alternative. The presented results suggest that the positive effects of punishment stem not only from the imposed fines, but may also have a psychological background. Small fines can do wonders in motivating us to choose cooperation over defection, but without the paralyzing effect that may be brought about by large fines. The latter should be utilized only when absolutely necessary.
Modern ICT (Information and Communication Technology) has developed a vision where the "computer" is no longer associated with the concept of a single device or a network of devices, but rather the entirety of situated services originating in a digital world, which are perceived through the physical world. It is observed that services with explicit user input and output are increasingly being replaced by a computing landscape sensing the physical world via a huge variety of sensors, and controlling it via a plethora of actuators. The nature and appearance of computing devices is changing to be hidden in the fabric of everyday life, invisibly networked, and omnipresent, with applications largely based on the notions of context and knowledge. Interaction with such globe-spanning, modern ICT systems will presumably be more implicit, at the periphery of human attention, rather than explicit, i.e. at the focus of human attention. Socio-inspired ICT assumes that future, globe-scale ICT systems should be viewed as social systems. Such a view challenges research to identify and formalize the principles of interaction and adaptation in social systems, so as to be able to ground future ICT systems on those principles. This position paper therefore is concerned with the intersection of social behaviour and modern ICT, creating or recreating social conventions and social contexts through the use of pervasive, globe-spanning, omnipresent and participative ICT.
1 vote
The key feature of online social networks (OSN) is the ability of users to become active, make friends and interact via comments, videos or messages with those around them. This social interaction is typically perceived as critical to the proper functioning of these platforms; therefore, a significant share of OSN research in the recent past has investigated the characteristics and importance of these social links, studying the networks' friendship relations through their topological properties, the structure of the resulting communities and identifying the role and importance of individual members within these networks. <br />In this paper, we present results from a multi-year study of the online social network Digg.com, indicating that the importance of friends and the friend network in the propagation of information is less than originally perceived. While we do note that users form and maintain a social structure along which information is exchanged, the importance of these links and their contribution is very low: users with even a nearly identical overlap in interests react on average only with a probability of 2% to information propagated and received from friends. Furthermore, in only about 50% of the stories that became popular out of the entire body of 10 million news items do we find evidence that the social ties among users were a critical ingredient to the successful spread. Our findings indicate the presence of previously unconsidered factors that are able to drive and steer the dynamics of such OSNs: the temporal alignment between user activities and the existence of additional logical relationships beyond the topology of the social graph.
"In the next century, planet earth will don an electronic skin. It will use the Internet as a scaffold to support and transmit its sensations. This skin is already being stitched together. It consists of millions of embedded electronic measuring devices: thermostats, pressure gauges, pollution detectors, cameras, microphones, glucose sensors, EKGs, electroencephalographs. These will probe and monitor cities and endangered species, the atmosphere, our ships, highways and fleets of trucks, our conversations, our bodies--even our dreams ....What will the earth's new skin permit us to feel? How will we use its surges of sensation? For several years--maybe for a decade--there will be no central nervous system to manage this vast signaling network. Certainly there will be no central intelligence...some qualities of self-awareness will emerge once the Net is sensually enhanced. Sensuality is only one force pushing the Net toward intelligence". These statements are quoted from an interview with Cherry Murray, Dean of the Harvard School of Engineering and Applied Sciences and Professor of Physics. It is interesting to note the timeliness and high predictive power of these statements. In particular, we would like to point to the relevance of the question "What will the earth's new skin permit us to feel?" to the work we are going to discuss in this paper. There are many additional compelling questions, as for example: "How can the electronic earth's skin be made more resilient?"; "How can the earth's electronic skin be improved to better satisfy the needs of our society?"; "What can the science of complex systems contribute to this endeavour?"
The FuturICT project is a response to the European Flagship Call in the Area of Future and Emerging Technologies, which is planning to spend 1 billion EUR on each of two flagship projects over a period of 10 years. FuturICT seeks to create an open, global but decentralized, democratically controlled information platform that will use online data and real-time measurements together with novel theoretical models and experimental methods to achieve a paradigm shift in our understanding of today's strongly interdependent and complex world and make our techno-socio-economic systems more flexible, adaptive, resilient, sustainable, and livable through a participatory approach.
1 vote
Biological competition is widely believed to result in the evolution of selfish preferences. The related concept of the "homo economicus" is at the core of mainstream economics. However, there is also experimental and empirical evidence for other-regarding preferences. Here we present a theory that explains both self-regarding and other-regarding preferences. Assuming conditions promoting non-cooperative behaviour, we demonstrate that intergenerational migration determines whether evolutionary competition results in a "homo economicus" (showing self-regarding preferences) or a "homo socialis" (having other-regarding preferences). Our model assumes spatially interacting agents playing prisoner's dilemmas, who inherit a trait determining "friendliness", but mutations tend to undermine it. Reproduction is ruled by fitness-based selection without a cultural modification of reproduction rates. Our model calls for a complementary economic theory for "networked minds" (the "homo socialis") and lays the foundations for an evolutionarily grounded theory of other-regarding agents, explaining individually different utility functions as well as conditional cooperation.
1 vote
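A heavily simplified, illustrative sketch of such an evolutionary dynamic: agents carry a heritable friendliness trait, cooperate once the trait makes cooperation utility-maximizing, and reproduce in proportion to payoff, with mutation of the inherited trait. The payoff values, the cooperation threshold, and the ring topology are assumptions made here for brevity; the paper's model uses spatial prisoner's dilemmas with local reproduction.

```python
import random

T, R, P, S = 1.3, 1.0, 0.1, 0.0    # illustrative PD payoffs (T > R > P > S)
THRESHOLD = (T - R) / (T - S)      # friendliness above this makes cooperation
                                   # the better act against a cooperator

def act(rho):
    # An agent with other-regarding weight rho cooperates once rho is large
    # enough; a crude stand-in for the paper's utility maximization.
    return "C" if rho > THRESHOLD else "D"

def payoff(a, b):
    return {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}[(a, b)]

def generation(pop, mu=0.05, rng=random):
    # pop: friendliness values on a ring; each agent plays its right neighbour.
    n = len(pop)
    fit = [payoff(act(pop[i]), act(pop[(i + 1) % n])) + 0.01 for i in range(n)]
    total = sum(fit)
    new = []
    for _ in range(n):  # fitness-proportional reproduction with mutation
        r, acc = rng.random() * total, 0.0
        for i in range(n):
            acc += fit[i]
            if acc >= r:
                child = pop[i] + rng.gauss(0.0, mu)
                new.append(min(1.0, max(0.0, child)))
                break
    return new
```

Iterating `generation` shows how mutation injects friendliness into a population of defectors, after which selection can amplify it when cooperators happen to cluster.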
A microeconomic model is developed, which accurately predicts the shape of personal income distribution (PID) in the United States and the evolution of that shape over time. The underlying concept is borrowed from geo-mechanics and thus can be considered as mechanics of income distribution. The model allows the resolution of empirical and definitional problems associated with personal income measurements. It also serves as a firm foundation for defining income inequality measures as secondary derivatives of the personal income distribution. It is found that in relative terms the PID in the US has not been changing since 1947. Effectively, the Gini coefficient has been almost constant during the last 60 years, as reported by the Census Bureau.
1 vote
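For reference, the Gini coefficient mentioned above can be computed directly from a sample of incomes. This is the standard textbook formula, not the paper's estimation procedure.

```python
def gini(incomes):
    # Gini coefficient from sorted incomes x_(1) <= ... <= x_(n):
    # G = 2 * sum(i * x_(i)) / (n * sum(x)) - (n + 1) / n
    xs = sorted(incomes)
    n = len(xs)
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2 * weighted / (n * sum(xs)) - (n + 1) / n
```

Perfect equality gives 0; for two people where one earns everything, the coefficient is 0.5 (the maximum for n = 2 is (n - 1) / n).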
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording, and analyzing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series and over 9000 time-series analysis algorithms are analyzed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines, and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heart beat intervals, speech signals, and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
1 vote
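The reduced feature-based representation of time series described above can be illustrated with a deliberately tiny feature set (mean, standard deviation, lag-1 autocorrelation). The actual work evaluates thousands of such features, so this is only a toy sketch.

```python
import math

def features(x):
    # A minimal feature vector in the spirit of highly comparative
    # time-series analysis: mean, standard deviation, lag-1 autocorrelation.
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    sd = math.sqrt(var)
    ac1 = (sum((x[i] - mu) * (x[i + 1] - mu) for i in range(n - 1))
           / ((n - 1) * var)) if var > 0 else 0.0
    return [mu, sd, ac1]

def distance(x, y):
    # Compare two series by the Euclidean distance between their features,
    # independent of their lengths or sampling.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(features(x), features(y))))
```

Even these three features separate qualitatively different signals: a slowly varying series has lag-1 autocorrelation near 1, an alternating one near -1.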
This editorial opens the special issues that the Journal of Statistical Physics has dedicated to the growing field of statistical physics modeling of social dynamics. The issues include contributions from physicists and social scientists, with the goal of fostering a better communication between these two communities.
1 vote
Citation numbers and other quantities derived from bibliographic databases are becoming standard tools for the assessment of productivity and impact of research activities. Though widely used, their statistical properties have not yet been well established. This is especially true in the case of bibliometric indicators aimed at the evaluation of individual scholars, because large-scale data sets are typically difficult to retrieve. Here, we take advantage of a recently introduced large bibliographic data set, Google Scholar Citations, which collects the entire publication record of individual scholars. We analyze the scientific profile of more than 30,000 researchers, and study the relation between the h-index, the number of publications and the number of citations of individual scientists. While the number of publications of a scientist has a rather weak relation with his/her h-index, we find that the h-index of a scientist is strongly correlated with the number of citations that she/he has received, so that the number of citations can effectively be used as a proxy of the h-index. Allowing for the h-index to depend on both the number of citations and the number of publications, we find only a minor improvement.
1 vote
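As a reminder of the definition underlying the abstract above, the h-index of a scholar is the largest h such that h of their papers have at least h citations each. A direct computation:

```python
def h_index(citations):
    # Sort citation counts in descending order; h is the last rank i
    # at which the i-th paper still has at least i citations.
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h
```

For example, a scholar with papers cited 10, 8, 5, 4 and 3 times has h = 4: four papers with at least four citations each, but not five with at least five.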
The patterns of life exhibited by large populations have been described and modeled both as a basic science exercise and for a range of applied goals such as reducing automotive congestion, improving disaster response, and even predicting the location of individuals. However, these studies previously had limited access to conversation content, rendering changes in expression as a function of movement invisible. In addition, they typically use the communication between a mobile phone and its nearest antenna tower to infer position, limiting the spatial resolution of the data to the geographical region serviced by each cellphone tower. We use a collection of 37 million geolocated tweets to characterize the movement patterns of 180,000 individuals, taking advantage of several orders of magnitude of increased spatial accuracy relative to previous work. Employing the recently developed sentiment analysis instrument known as the hedonometer, we characterize changes in word usage as a function of movement, and find that expressed happiness increases logarithmically with distance from an individual's average location.
1 vote
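The reported logarithmic relationship between expressed happiness and distance from an individual's average location can be checked with an ordinary least-squares fit of score = a + b * ln(distance). The function and the synthetic data below are illustrative assumptions, not the paper's pipeline.

```python
import math

def fit_log(distances, scores):
    # Ordinary least-squares fit of scores = a + b * ln(distance).
    xs = [math.log(d) for d in distances]
    n = len(xs)
    mx, my = sum(xs) / n, sum(scores) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, scores))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b
```

A positive fitted slope b on real data would correspond to the paper's finding that happiness grows with the logarithm of distance travelled.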