# Papers

In this paper, we contribute to the literature on energy market co-movement by studying its dynamics in the time-frequency domain. The novelty of our approach lies in the application of wavelet tools to commodity market data. Most economic time series analysis is done in either the time domain or the frequency domain separately. Wavelet analysis combines these two fundamental approaches, allowing study of the time series in the time-frequency domain. Using this framework, we propose a new, model-free way of estimating time-varying correlations. In the empirical analysis, we connect our approach to the dynamic conditional correlation approach of Engle (2002) on the main components of the energy sector. Namely, we use crude oil, gasoline, heating oil, and natural gas on a nearest-future basis over a period of approximately 16 and a half years, beginning on November 1, 1993 and ending on July 21, 2010. Using wavelet coherence, we uncover interesting dynamics of correlations between energy commodities in the time-frequency space.
In this paper, we show how the sampling properties of Hurst exponent estimation methods change in the presence of heavy tails. We run extensive Monte Carlo simulations to find out how rescaled range analysis (R/S), multifractal detrended fluctuation analysis (MF-DFA), detrending moving average (DMA) and the generalized Hurst exponent approach (GHE) estimate the Hurst exponent on independent series with different heavy tails. For this purpose, we generate independent random series from stable distributions with stability exponent $\alpha$ ranging from 1.1 (heaviest tails) to 2 (Gaussian normal distribution) and estimate the Hurst exponent using the different methods. R/S and GHE prove to be robust to heavy tails in the underlying process. GHE provides the lowest variance and bias in comparison to the other methods, regardless of the presence of heavy tails in the data and of the sample size. Utilizing this result, we apply a novel approach of the intraday time-dependent Hurst exponent and estimate the Hurst exponent on high-frequency data for each trading day separately. We obtain Hurst exponents for the S&P500 index for the period from 1983 to November 2009 and discuss the surprising result, which uncovers how the market's behavior changed over this long period.
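A minimal sketch of the GHE idea on simulated stable series (assuming `scipy.stats.levy_stable` for the stable draws; the lag range and sample sizes here are illustrative choices of ours, not the paper's setup):

```python
import numpy as np
from scipy.stats import levy_stable

def generalized_hurst(x, q=1.0, taus=range(1, 20)):
    """Estimate H(q) from the scaling of q-th order moments of increments:
    K_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^(q * H(q))."""
    x = np.asarray(x, dtype=float)
    taus = np.asarray(list(taus))
    kq = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
    slope, _ = np.polyfit(np.log(taus), np.log(kq), 1)
    return slope / q

rng = np.random.default_rng(0)
# Independent stable increments: alpha = 1.1 (heaviest tails) up to 2 (Gaussian)
for alpha in (1.1, 1.5, 2.0):
    series = np.cumsum(levy_stable.rvs(alpha, 0.0, size=5000, random_state=rng))
    print(f"alpha={alpha}: H(1) estimate = {generalized_hurst(series):.3f}")
```

Note that q = 1 keeps the moment finite for every alpha above 1, which is one reason moment-based estimators can remain usable under heavy tails.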
In this paper we propose a new approach to estimation of the tail exponent in financial stock markets. We begin the study with the finite-sample behavior of the Hill estimator under $\alpha$-stable distributions. Using large Monte Carlo simulations, we show that the Hill estimator overestimates the true tail exponent and can hardly be used on small samples. Utilizing our results, we introduce a Monte Carlo-based method of estimation for the tail exponent. Our proposed method is not sensitive to the choice of tail size and also works well on small data samples. The new estimator gives unbiased results with symmetrical confidence intervals. Finally, we demonstrate the power of our estimator on international stock market indices, estimating the tail exponent over the two separate periods 2002-2005 and 2006-2009.
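For reference, the plain Hill estimator the paper starts from can be sketched in a few lines (the Pareto sample and the choice of k below are our own illustrative values):

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the tail exponent alpha using the k largest
    order statistics of |x|."""
    s = np.sort(np.abs(np.asarray(x, dtype=float)))[::-1]  # descending
    gamma = np.mean(np.log(s[:k]) - np.log(s[k]))          # mean log-excess
    return 1.0 / gamma

rng = np.random.default_rng(0)
x = 1.0 + rng.pareto(1.5, size=100_000)    # exact Pareto tail, alpha = 1.5
print(hill_estimator(x, k=2000))           # close to 1.5 on this ideal sample
# On small samples, or with k chosen poorly, the estimate drifts away from
# the true exponent, which is the bias the paper's method is built to correct.
```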
We propose a stochastic process driven by memory effects, with novel distributions including both exponential and leptokurtic heavy-tailed distributions. One class of distributions is derived analytically from the continuum limit of a discrete binary process with renormalized auto-correlation; its closed-form moment generating function is obtained, so the cumulants can be calculated and shown to be convergent. The other class of distributions is investigated numerically. The combination of two such processes with opposite signs of memory under a regime-switching mechanism does produce power-law decay, which strongly suggests that memory is an alternative origin of heavy tails.
We consider an illiquid financial market where a risk-averse investor has to liquidate a portfolio within a finite time horizon [0,T] and can trade continuously at a traditional exchange (the "primary venue") and in a dark pool. At the primary venue, trading yields a linear price impact. In the dark pool, no price impact costs arise but order execution is uncertain, modeled by a multi-dimensional Poisson process. We characterize the costs of trading by a linear-quadratic functional which incorporates both the price impact costs of trading at the primary exchange and the market risk of the position. The liquidation constraint implies a singularity of the value function of the resulting minimization problem at the terminal time T. Via the HJB equation and a quadratic ansatz, we obtain a candidate for the value function which is the limit of a sequence of solutions of initial value problems for a matrix differential equation. We show that this limit exists by using an appropriate matrix inequality and a comparison result for Riccati matrix equations. Additionally, we obtain upper and lower bounds of the solutions of the initial value problems, which allow us to prove a verification theorem. If a single asset position is to be liquidated, the investor slowly trades out of her position at the primary venue, with the remainder being placed in the dark pool at any point in time. For multi-asset liquidations this is generally not the case; it can, e.g., be optimal to oversize orders in the dark pool in order to turn a poorly balanced portfolio into a portfolio bearing less risk.
In a recent paper, we analyzed the self-assembly of a complex cooperation network. The network was shown to approach a state where every agent invests the same amount of resources. Nevertheless, highly-connected agents arise that extract extraordinarily high payoffs while contributing comparably little to any of their cooperations. Here, we investigate a variant of the model, in which highly-connected agents have access to additional resources. We study analytically and numerically whether these resources are invested in existing collaborations, leading to a fairer load distribution, or in establishing new collaborations, leading to an even less fair distribution of loads and payoffs.
The goals of this paper are to present criteria that allow one to quantify a priori the attack stability of real-world correlated networks of finite size, and to check how these criteria correspond to analytic results available for infinite uncorrelated networks. As a case study, we consider the public transportation networks (PTN) of several major cities of the world. To analyze their resilience against attacks, either the network nodes or edges are removed in specific sequences (attack scenarios). During each scenario the size S(c) of the largest remaining network component is observed as a function of the removed share c of nodes or edges. To quantify the PTN stability with respect to different attack scenarios, we use the area below the curve described by S(c) for c \in [0,1], recently introduced (Schneider, C. M., et al., PNAS 108 (2011) 3838) as a numerical measure of network robustness. This measure captures the network's reaction over the whole attack sequence. We present results of the analysis of PTN stability against node- and link-targeted attacks.
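The area-below-S(c) measure is easy to compute on any graph; a sketch using `networkx` (the test graph and attack scenarios below are our own toy choices, not the PTN data):

```python
import random
import networkx as nx

def robustness(G, targeted=True, seed=0):
    """Area under S(c), the relative size of the largest component as a
    share c of nodes is removed (Schneider et al., PNAS 108 (2011) 3838)."""
    rnd = random.Random(seed)
    G = G.copy()
    n0 = G.number_of_nodes()
    sizes = []
    while G.number_of_nodes() > 0:
        sizes.append(len(max(nx.connected_components(G), key=len)) / n0)
        if targeted:
            # recalculated attack: remove the current highest-degree node
            node = max(G.degree, key=lambda kv: kv[1])[0]
        else:
            node = rnd.choice(list(G.nodes))
        G.remove_node(node)
    return sum(sizes) / n0   # discrete approximation of the integral of S(c)

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
print("targeted attack:", robustness(G, targeted=True))
print("random failure: ", robustness(G, targeted=False))
```

Smaller areas mean faster fragmentation, so the targeted scenario typically scores below random failure on the same graph.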
Populations are seldom completely isolated from their environment. Individuals in a particular geographic or social region may be considered a distinct network due to strong local ties, but will also interact with individuals in other networks. We study the susceptible-infected-recovered (SIR) process on interconnected network systems, and find two distinct regimes. In strongly-coupled network systems, epidemics occur simultaneously across the entire system at a critical infection strength $\beta_c$, below which the disease does not spread. In contrast, in weakly-coupled network systems, a mixed phase exists below $\beta_c$ of the coupled network system, where an epidemic occurs in one network but does not spread to the coupled network. We derive an expression for the network and disease parameters that allow this mixed phase and verify it numerically. Public health implications of communities comprising these two classes of network systems are also mentioned.
We introduce a new measure of the activity of financial markets that provides direct access to their level of endogeneity. This measure quantifies how much of the price changes are due to endogenous feedback processes, as opposed to exogenous news. For this, we calibrate the self-excited conditional Poisson Hawkes model, which combines in a natural and parsimonious way exogenous influences with self-excited dynamics, to the E-mini S&P 500 futures contracts traded in the Chicago Mercantile Exchange from 1998 to 2010. We find that the level of endogeneity has increased significantly over this period: the fraction of price changes resulting from revealed exogenous information has fallen from about 70% in 1998 to less than 30% since 2007. In analogy with nuclear plant safety, which is concerned with avoiding "criticality", our measure provides a direct quantification of the distance of the financial market from a critical state, defined precisely as the limit of diverging trading activity in the absence of any external driving.
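In the Hawkes model, the endogeneity level is the branching ratio n, the integral of the self-excitation kernel. A sketch with illustrative parameters of ours (not the calibrated E-mini values): simulate an exponential-kernel Hawkes process by Ogata's thinning and recover n from the event count, since E[N(T)] ≈ μT/(1−n):

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng):
    """Ogata thinning for lambda(t) = mu + sum_{t_i<t} alpha*exp(-beta*(t-t_i)).
    Branching ratio (endogeneity) n = alpha / beta."""
    t, excess, events = 0.0, 0.0, []
    while True:
        lam_bar = mu + excess          # upper bound: intensity decays until next event
        w = rng.exponential(1.0 / lam_bar)
        t += w
        excess *= np.exp(-beta * w)    # decay the excitation over the waiting time
        if t > T:
            return np.asarray(events)
        if rng.random() * lam_bar <= mu + excess:   # accept with prob lambda(t)/lam_bar
            events.append(t)
            excess += alpha            # each event adds kernel mass alpha/beta

rng = np.random.default_rng(7)
mu, alpha, beta, T = 0.5, 0.8, 1.0, 2000.0     # true branching ratio n = 0.8
events = simulate_hawkes(mu, alpha, beta, T, rng)
n_hat = 1.0 - mu * T / len(events)             # from E[N(T)] ~ mu*T / (1 - n)
print(f"true n = {alpha / beta}, estimated n = {n_hat:.2f}")
```

As n approaches 1, the event rate diverges, which is the "critical state" the abstract's distance measure refers to.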
Online social networking technologies enable individuals to simultaneously share information with any number of peers. Quantifying the causal effect of these technologies on the dissemination of information requires not only identification of who influences whom, but also of whether individuals would still propagate information in the absence of social signals about that information. We examine the role of social networks in online information diffusion with a large-scale field experiment that randomizes exposure to signals about friends' information sharing among 253 million subjects in situ. Those who are exposed are significantly more likely to spread information, and do so sooner than those who are not exposed. We further examine the relative role of strong and weak ties in information propagation. We show that, although stronger ties are individually more influential, it is the more abundant weak ties that are responsible for the propagation of novel information. This suggests that weak ties may play a more dominant role in the dissemination of information online than currently believed.
In this paper we analyze Gresham's Law, in particular, how the rate of inflow or outflow of currencies is affected by the demand elasticity of arbitrage and the difference in face value ratios inside and outside of a country under a bimetallic system. We find that these equations are very similar to those used to describe drift in systems of free charged particles. In addition, we look at how Gresham's Law would play out with multiple currencies and multiple countries under a variety of connecting topologies.
This paper deals with the disciplinary dimensions of a very new field called econophysics and shows that, despite the fact that econophysics is regularly described as an interdisciplinary approach, it is in fact a multidisciplinary field. Beyond this observation, we note that recent developments suggest that econophysics could evolve into a more integrated field. We have therefore taken a prospective approach by analyzing how this field could become more transdisciplinary. We show that a common scheme is attainable and we investigate the possibilities of transdisciplinary econophysics.
We introduce a new method for the detection of long-range cross-correlations and multifractality - multifractal height cross-correlation analysis (MF-HXA) - based on the scaling of qth-order covariances. MF-HXA is a bivariate generalization of the height-height correlation analysis of Barabasi & Vicsek [Barabasi, A.L., Vicsek, T.: Multifractality of self-affine fractals, Physical Review A 44(4), 1991]. The method can be used to analyze long-range cross-correlations and multifractality between two simultaneously recorded series. We illustrate the power of the method on both simulated and real-world time series.
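A rough sketch of the qth-order covariance scaling (one plausible reading of the definition, hedged: function names, lag range and the exact form of K_q are our own; the construction reduces to the univariate height-height correlation when the two series coincide):

```python
import numpy as np

def mf_hxa_exponent(x, y, q=2.0, taus=range(1, 30)):
    """Scaling exponent H_xy(q) read off from
    K_q(tau) = <|x(t+tau)-x(t)|^(q/2) * |y(t+tau)-y(t)|^(q/2)> ~ tau^(q*H_xy(q))."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    taus = np.asarray(list(taus))
    kq = [np.mean(np.abs(x[t:] - x[:-t]) ** (q / 2) *
                  np.abs(y[t:] - y[:-t]) ** (q / 2)) for t in taus]
    slope, _ = np.polyfit(np.log(taus), np.log(kq), 1)
    return slope / q

rng = np.random.default_rng(0)
a = np.cumsum(rng.standard_normal(20000))
# Against itself, a plain random walk should sit near H = 0.5
print(mf_hxa_exponent(a, a))
```

A q-dependent H_xy(q) then signals multifractal cross-correlation between the two series.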
In this paper, we present the results of Monte Carlo simulations for two popular techniques of long-range correlation detection - classical and modified rescaled range analyses. The focus is put on the effect of different distributional properties on the ability of the methods to efficiently distinguish between short- and long-term memory. To do so, we analyze the behavior of the estimators for independent, short-range dependent, and long-range dependent processes with innovations from 8 different distributions. We find that, apart from a combination of very high levels of kurtosis and skewness, both estimators are quite robust to distributional properties. Importantly, we show that R/S is biased upwards (yet not strongly) for short-range dependent processes, while M-R/S is strongly biased downwards for long-range dependent processes regardless of the distribution of innovations.
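A compact sketch of the classical R/S estimator (the windowing scheme below is an illustrative choice of ours, not the paper's exact setup):

```python
import numpy as np

def rs_hurst(x, min_window=16):
    """Classical rescaled range (R/S) estimate of the Hurst exponent."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_w, log_rs = [], []
    w = n
    while w >= min_window:
        ratios = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            profile = np.cumsum(seg - seg.mean())   # cumulative deviations
            r = profile.max() - profile.min()       # range of the profile
            s = seg.std()                           # standard deviation
            if s > 0:
                ratios.append(r / s)
        log_w.append(np.log(w))
        log_rs.append(np.log(np.mean(ratios)))
        w //= 2
    slope, _ = np.polyfit(log_w, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
# i.i.d. noise: estimate lands near 0.5, with the small-sample upward
# bias (Anis-Lloyd) that Monte Carlo studies of R/S quantify
print(rs_hurst(rng.standard_normal(2**14)))
```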
Trade is a fundamental pillar of the economy and a form of social organization. Its empirical characterization at the worldwide scale is represented by the World Trade Web (WTW), the network built upon the trade relationships between the different countries. Several scientific studies have focused on the structural characterization of this network, as well as its dynamical properties, since records of the network's structure exist at different times in history. In this paper we study an abstract scenario for the development of global crises on top of the structure of connections of the WTW. Assuming a cyclic dynamics of national economies and the interaction of different countries according to their import-export balances, we are able to investigate, using a simple model of pulse-coupled oscillators, the synchronization of crises at the worldwide scale. We focus on the level of synchronization measured by an order parameter at two different scales: one for the global system, and another for the mesoscales defined through the topology. We use the WTW network structure to simulate a network of integrate-and-fire oscillators for six different snapshots between 1950 and 2000. The results reinforce the idea that globalization accelerates the global synchronization process, and the analysis at the mesoscopic level shows that this synchronization differs before and after globalization periods: after globalization, the effect of communities is almost nonexistent.
For fat-tailed distributions (i.e. those that decay more slowly than an exponential), large deviations not only become relatively likely, but the way in which they are realized changes dramatically: a finite fraction of the whole sample deviation is concentrated on a single variable. Large deviations are not the accumulation of many small deviations; rather, they are dominated by a single large fluctuation. The regime of large deviations is separated from the regime of typical fluctuations by a phase transition where the symmetry between the points in the sample is {\em spontaneously broken}. This phenomenon has been discussed in the context of mass transport models in physics, where it takes the form of a condensation phase transition. Yet the phenomenon is far more general. For example, in risk management of large portfolios, it suggests that one should expect losses to concentrate on a single asset: when extremely bad things happen, it is likely that there is a single factor on which bad luck concentrates. Along similar lines, one should expect that bubbles in financial markets do not gradually deflate but rather burst abruptly, and that on the rainiest day of a year, precipitation concentrates on a given spot. Analogously, when applied to biological evolution, we are led to infer that, if fitness changes for individual mutations have a broad distribution, the large deviations that lead to better-fit species are not likely to result from the accumulation of small positive mutations. Rather, they are likely to arise from large rare jumps.
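The condensation of a sum on its largest term is easy to see numerically (a toy illustration with our own parameter choices):

```python
import numpy as np

rng = np.random.default_rng(42)

def max_share(sample):
    """Fraction of the total sum carried by the single largest draw."""
    return sample.max() / sample.sum()

n, reps = 1000, 200
thin = np.mean([max_share(rng.exponential(size=n)) for _ in range(reps)])
fat = np.mean([max_share(1 + rng.pareto(0.8, size=n)) for _ in range(reps)])
print(f"exponential: {thin:.3f}   pareto(alpha=0.8): {fat:.3f}")
# For the exponential sample the largest term is a vanishing ~log(n)/n share;
# for the fat-tailed sample a finite fraction condenses on one variable.
```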
Understanding the statistical properties of recurrence intervals of extreme events is crucial to risk assessment and management of complex systems. The probability distributions and correlations of recurrence intervals for many systems have been extensively investigated. However, the impacts of microscopic rules of a complex system on the macroscopic properties of its recurrence intervals are less studied. In this Letter, we adopt an order-driven stock market model to address this issue for stock returns. We find that the distributions of the scaled recurrence intervals of simulated returns have a power law scaling with stretched exponential cutoff and the intervals possess multifractal nature, which are consistent with empirical results. We further investigate the effects of long memory in the directions (or signs) and relative prices of the order flow on the characteristic quantities of these properties. It is found that the long memory in the order directions (Hurst index $H_s$) has a negligible effect on the interval distributions and the multifractal nature. In contrast, the power-law exponent of the interval distribution increases linearly with respect to the Hurst index $H_x$ of the relative prices, and the singularity width of the multifractal nature fluctuates around a constant value when $H_x<0.7$ and then increases with $H_x$. No evident effects of $H_s$ and $H_x$ are found on the long memory of the recurrence intervals. Our results indicate that the nontrivial properties of the recurrence intervals of returns are mainly caused by traders' behaviors of persistently placing new orders around the best bid and ask prices.
We investigate large changes, or bursts, of continuous stochastic signals when the exponent of multiplicativity is higher than one. Earlier we proposed a general nonlinear stochastic model which can be transformed into a Bessel process with known first-hitting (first-passage) time statistics. Using these results we derive the PDF of the burst duration for the proposed model. We confirm the analytical expressions by numerical evaluation and discuss the bursty behavior of returns in financial markets in the framework of modeling by nonlinear SDEs.
Human dynamical social networks encode information and are highly adaptive. To characterize the information encoded in the fast dynamics of social interactions, here we introduce the entropy of dynamical social networks. By analysing a large dataset of phone-call interactions we show evidence that the dynamical social network has an entropy that depends on the time of day during a typical weekday. Moreover, we show evidence for the adaptability of human social behavior, presenting data on the duration of phone-call interactions that deviates significantly from the statistics of the duration of face-to-face interactions. This adaptability of behavior corresponds to a different information content of the dynamics of social human interactions. We quantify this information by means of the entropy of dynamical networks on realistic models of social interactions.
We study cross-country GDP losses due to financial crises in terms of frequency (number of loss events per period) and severity (loss per occurrence). We use the Loss Distribution Approach (LDA) to estimate a multi-country aggregate GDP loss probability density function and the percentiles associated with extreme events due to financial crises. We find that output losses arising from financial crises are strongly heterogeneous and that currency crises lead to smaller output losses than debt and banking crises. Extreme global financial crisis episodes, occurring with a one percent probability every five years, lead to losses between 2.95% and 4.54% of world GDP.
By using Random Matrix Theory, we build covariance matrices between stocks of the BM&F-Bovespa (Bolsa de Valores, Mercadorias e Futuros de S\~ao Paulo) which are cleaned of some of the noise due to the complex interactions between the many stocks and the finiteness of available data. We also use a regression model in order to remove the market effect due to the common movement of all stocks. These two procedures are then used to build stock portfolios based on Markowitz's theory, trying to obtain better predictions of future risk based on past data. This is done for years of both low and high volatility of the Brazilian stock market, from 2004 to 2010. The results show that the use of regression to subtract the market effect on returns greatly increases the accuracy of the prediction of risk, and that, although the cleaning of the correlation matrix often leads to portfolios that better predict risks, in periods of high volatility of the market this procedure may fail to do so.
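A minimal sketch of the eigenvalue-cleaning step (Marchenko-Pastur filtering in the spirit of Laloux et al.; the abstract's regression-based removal of the market mode is omitted, and the pure-noise test data is our own placeholder):

```python
import numpy as np

def clean_correlation(returns):
    """Replace eigenvalues inside the Marchenko-Pastur noise band with
    their average, keeping the 'signal' eigenvalues intact."""
    T, N = returns.shape
    C = np.corrcoef(returns, rowvar=False)
    lam_plus = (1 + np.sqrt(N / T)) ** 2       # upper edge of the MP bulk
    vals, vecs = np.linalg.eigh(C)
    noise = vals < lam_plus
    if noise.any():
        vals[noise] = vals[noise].mean()       # flatten the noise band
    C_clean = vecs @ np.diag(vals) @ vecs.T
    d = np.sqrt(np.diag(C_clean))
    return C_clean / np.outer(d, d)            # restore a unit diagonal

rng = np.random.default_rng(0)
R = rng.standard_normal((500, 50))             # pure-noise "returns", T=500, N=50
C_clean = clean_correlation(R)
```

The cleaned matrix can then be fed to a Markowitz optimizer in place of the raw sample correlation matrix.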
In this paper, we use the generalized Hurst exponent approach to study the multiscaling behavior of different financial time series. We show that this approach is robust and powerful in detecting different types of multiscaling. We observe a puzzling phenomenon where an apparent increase in multifractality is measured in time series generated from shuffled returns, where all time correlations are destroyed, while the return distributions are conserved. This effect is robust and is reproduced in several real financial data including stock market indices, exchange rates and interest rates. In order to understand the origin of this effect we investigate different simulated time series by means of the Markov switching multifractal (MSM) model, autoregressive fractionally integrated moving average (ARFIMA) processes with stable innovations, fractional Brownian motion and Levy flights. Overall we conclude that the multifractality observed in financial time series is mainly a consequence of the characteristic fat-tailed distribution of the returns, and that time correlations tend to decrease the measured multifractality.
We study the evolution of public cooperation on two interdependent networks that are connected by means of a utility function, which determines to what extent payoffs in one network influence the success of players in the other network. We find that the stronger the bias in the utility function, the higher the level of public cooperation. Yet the benefits of enhanced public cooperation on the two networks are just as biased as the utility functions themselves. While cooperation may thrive on one network, the other may still be plagued by defectors. Nevertheless, the aggregate level of cooperation on both networks is higher than the one attainable on an isolated network. This positive effect of biased utility functions is due to the suppressed feedback of individual success, which leads to a spontaneous separation of characteristic time scales of the evolutionary process on the two interdependent networks. As a result, cooperation is promoted because the aggressive invasion of defectors is more sensitive to the slowing down than the build-up of collective efforts in sizable groups.
In all empirical-network studies, the observed properties of economic networks are informative only if compared with a well-defined null model that can quantitatively predict the behavior of such properties in constrained graphs. However, predictions of the available null-model methods can be derived analytically only under assumptions (e.g., sparseness of the network) that are unrealistic for most economic networks like the World Trade Web (WTW). In this paper we study the evolution of the WTW using a recently proposed family of null network models. The method allows one to obtain analytically the expected value of any network statistic across the ensemble of networks that preserve on average some local properties, and are otherwise fully random. We compare expected and observed properties of the WTW in the period 1950-2000, when either the expected number of trade partners or the total country trade is kept fixed and equal to the observed quantities. We show that, in the binary WTW, node-degree sequences are sufficient to explain higher-order network properties such as disassortativity and clustering-degree correlation, especially in the last part of the sample. Conversely, in the weighted WTW, the observed sequences of total country imports and exports are not sufficient to predict higher-order patterns of the WTW. We discuss some important implications of these findings for international-trade models.
We present conditions under which positive alpha exists in the realm of active portfolio management, in contrast to the controversial result in (Jarrow, 2010, pg. 20), which implicates delegated portfolio management by surmising that positive alphas are illusionary. Specifically, we show that the critical assumption used in (Jarrow, 2010, pg. 20) to derive the illusionary alpha result is based on a zero set for CAPM with Lebesgue measure zero. So conclusions based on that assumption may well have probability measure zero of occurrence. Technically, the existence of [Tanaka] local time on that set implies the existence of positive alphas. In fact, we show that positive alpha exists under the same scenarios of "perpetual event swap" and "market systemic event" that Jarrow (2010) used to formulate the illusionary positive alpha result. First, we prove that as long as asset price volatility is greater than zero, systemic events like a market crash will occur in finite time almost surely, thus creating an opportunity to hedge against that event. Second, we find that Jarrow's "false positive alpha" variable constitutes the portfolio manager's reward for trading strategy. For instance, we show that positive alpha exists if portfolio managers develop hedging strategies based on either (1) an exotic [barrier] option on the underlying asset, with barrier hitting time motivated by the "market systemic" event, or (2) a swaption strategy for the implied interest rate risk inherent in Jarrow's triumvirate of riskless rate of return, factor sensitivity exposure, and constant risk premium for a perpetual event swap.
We study in detail the turnout rate statistics for 77 elections in 11 different countries. We show that the empirical results established in a previous paper for French elections appear to hold much more generally. We find in particular that the spatial correlation of turnout rates decays logarithmically with distance in all cases. This result is quantitatively reproduced by a decision model that assumes that each voter makes up his or her mind as a result of three influence terms: one totally idiosyncratic component, one city-specific term with short-ranged fluctuations in space, and one long-ranged correlated field which propagates diffusively in space. A detailed analysis reveals several interesting features: for example, different countries have different degrees of local heterogeneity and seem to be characterized by different propensities for individuals to conform to the cultural norm. We furthermore find clear signs of herding (i.e. strongly correlated decisions at the individual level) in some countries, but not in others.
On December 16, 2011, Zynga, the well-known social game development company, went public. This event follows other recent IPOs in the world of social networking companies, such as Groupon, LinkedIn and Pandora, to name a few. With a valuation close to 7 billion USD at the time it went public, Zynga became the biggest web IPO since Google. This recent enthusiasm for social networking companies, and in particular Zynga, raises the question of whether or not they are overvalued. The common denominator of all these IPOs is that many estimates of their valuation have been circulating, without any specifics given about the methodology or assumptions used to obtain those numbers. To bring more substance to the debate, we propose a two-tiered approach. First, we introduce a new model to forecast the global user base of Zynga, based on the analysis of the individual dynamics of its major games. Next, we model the revenues per user using a logistic growth function, a standard model for growth in competition. This allows us to bracket the valuation of Zynga using three different scenarios: 4.17 billion USD in the base case, 5.16 billion in the high-growth and 7.02 billion in the extreme-growth scenario, respectively. Thus, only the unlikely extreme-growth scenario could potentially justify today's 6.6 billion USD valuation of Zynga. This suggests that Zynga has been overpriced at its IPO.
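Fitting a logistic growth curve of the kind used for the revenues-per-user model is a standard least-squares exercise (all numbers below are synthetic placeholders of ours, not Zynga data):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.arange(12, dtype=float)                 # e.g. quarters since launch
rng = np.random.default_rng(3)
observed = logistic(t, 10.0, 0.8, 5.0) + rng.normal(0, 0.2, size=t.size)

(K, r, t0), _ = curve_fit(logistic, t, observed, p0=(8.0, 0.5, 4.0))
print(f"fitted capacity K = {K:.2f}")          # recovers roughly 10 on this toy data
```

The fitted capacity K is what caps the long-run revenue projection, which is why the choice of growth scenario drives the valuation brackets.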
Propagation of balance-sheet or cash-flow insolvency across financial institutions may be modeled as a cascade process on a network representing their mutual exposures. We derive rigorous asymptotic results for the magnitude of contagion in a large financial network and give an analytical expression for the asymptotic fraction of defaults, in terms of network characteristics. Our results extend previous studies on contagion in random graphs to inhomogeneous directed graphs with a given degree sequence and arbitrary distribution of weights. We introduce a criterion for the resilience of a large financial network to the insolvency of a small group of financial institutions and quantify how contagion amplifies small shocks to the network. Our results emphasize the role played by "contagious links" and show that institutions which contribute most to network instability in case of default have both large connectivity and a large fraction of contagious links. The asymptotic results show good agreement with simulations for networks with realistic sizes.
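A toy version of the cascade mechanism on a weighted directed exposure network (a deterministic illustration of the default rule only; the paper's contribution is the rigorous asymptotic analysis of such cascades on random graphs, and the graph, capitals and weights below are our own):

```python
import networkx as nx

def default_cascade(exposures, capital, initially_defaulted):
    """Iterate balance-sheet contagion: a node defaults once its total
    exposure to already-defaulted counterparties reaches its capital.
    Edge (u, v, weight=w) means u is exposed to v for amount w."""
    defaulted = set(initially_defaulted)
    changed = True
    while changed:
        changed = False
        for node in exposures.nodes:
            if node in defaulted:
                continue
            loss = sum(w for _, cpty, w in exposures.out_edges(node, data="weight")
                       if cpty in defaulted)
            if loss >= capital[node]:
                defaulted.add(node)
                changed = True
    return defaulted

# Tiny example: a chain of "contagious links" where one failure wipes out the rest
G = nx.DiGraph()
G.add_weighted_edges_from([("A", "B", 5.0), ("B", "C", 5.0)])
capital = {"A": 4.0, "B": 4.0, "C": 4.0}
print(default_cascade(G, capital, {"C"}))   # all three nodes end up in default
```

An edge whose weight alone exceeds the holder's capital is a "contagious link" in the abstract's sense: the counterparty's default suffices to propagate.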
We study the cross-correlation matrix $C_{ij}$ of inventory variations of the most active individual and institutional investors in an emerging market to understand the dynamics of inventory variations. We find that the distribution of the cross-correlation coefficients $C_{ij}$ has a power-law form in the bulk followed by exponential tails, and that there are more positive coefficients than negative ones. In addition, inventory variations are more likely to be strongly correlated between two individuals or between two institutions than between an individual and an institution. We find that the largest and second largest eigenvalues ($\lambda_1$ and $\lambda_2$) of the correlation matrix cannot be explained by random matrix theory, and that the projections of inventory variations on the first eigenvector $u(\lambda_1)$ are linearly correlated with stock returns, where individual investors play a dominant role. The investors are classified into three categories based on the cross-correlation coefficients $C_{VR}$ between inventory variations and stock returns. Half of the individuals are reversing investors, who exhibit evident buy and sell herding behaviors, while 6% of individuals are trending investors. For institutions, only 10% and 8% of investors are trending and reversing investors, respectively. A strong Granger causality is unveiled from stock returns to inventory variations, which means that a large proportion of individuals hold the reversing trading strategy and a small part of individuals hold the trending strategy. Compared with the case of the Spanish market, Chinese investors exhibit both common and market-specific behaviors. Our empirical findings are of scientific significance for the understanding of investors' trading behaviors and for the construction of agent-based models of stock markets.