refereed journal articles

Fritsch, Markus, Andrew Adrian Pua, and Joachim Schnurbus. “pdynmc: A Package for Estimating Linear Dynamic Panel Data Models Based on Nonlinear Moment Conditions.” The R Journal 13, no. 1 (2021): 218–231.

If you do not have Stata, this is one free way to estimate linear dynamic panel data models in R. The design goals were (a) to give the user time to think about what to estimate, (b) to be flexible but not too flexible, and (c) to allow for nonlinear moment conditions of the Ahn and Schmidt (1995) type.
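
As a rough illustration (my sketch, not an excerpt from the paper), a call along the following lines estimates an AR(1) employment equation on the Arellano–Bond data with the nonlinear moment conditions switched on; the dataset and the argument values are assumptions and should be checked against the package documentation.

library(pdynmc)
data("EmplUK", package = "plm")    # Arellano-Bond employment panel; assumes plm is installed
EmplUK$emp <- log(EmplUK$emp)      # work with log employment

m <- pdynmc(
  dat = EmplUK, varname.i = "firm", varname.t = "year",
  use.mc.diff = TRUE, use.mc.lev = FALSE, use.mc.nonlin = TRUE,  # add Ahn-Schmidt (1995) conditions
  include.y = TRUE, varname.y = "emp", lagTerms.y = 1,
  w.mat = "iid.err", std.err = "corrected",
  estimation = "onestep", opt.meth = "BFGS"                      # nonlinear conditions need numerical optimization
)
summary(m)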

working papers

nonlinear panel data econometrics

“Should We Use IV to Estimate Dynamic Linear Probability Models with Fixed Effects?” (download here)

In general, no, because (a) the implicit weights may not reflect the research question and (b) the limiting quantities can sometimes lie outside the identified set. Large sample sizes in either dimension do not change the message. But if you want to test whether there is zero first-order state dependence, then feel free to use default software routines.
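
For concreteness, the kind of specification at issue (an illustrative sketch, not the paper's notation) is a dynamic linear probability model with fixed effects,

\[ y_{it} = \rho\, y_{i,t-1} + x_{it}'\beta + \alpha_i + \varepsilon_{it}, \qquad y_{it} \in \{0,1\}, \]

where \(\rho\) measures first-order state dependence and the null of zero state dependence is \(H_0\colon \rho = 0\).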

“Simultaneous Equations for Discrete Outcomes: Coherence and Completeness Using Panel Data.” (download here)

Modeling jointly determined discrete outcomes via simultaneous equations requires a so-called coherency condition to guarantee the existence of a unique reduced form. This condition effectively converts a model where the endogenous variables are jointly determined into a model that is triangular or recursive. In the spirit of a suggestion by Lewbel (2007), I propose using panel data to decide how the coherency condition holds without restricting error supports or imposing triangularity for all observations.
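
As a textbook illustration of the issue (not the model in the paper), consider two jointly determined binary outcomes

\[ y_{1} = \mathbf{1}\{\gamma_1 y_{2} + x_1'\beta_1 + u_1 > 0\}, \qquad y_{2} = \mathbf{1}\{\gamma_2 y_{1} + x_2'\beta_2 + u_2 > 0\}. \]

Without restrictions on the error supports, a unique solution \((y_1, y_2)\) exists for every realization of \((u_1, u_2)\) only if \(\gamma_1 \gamma_2 = 0\), i.e., the system is triangular. The paper uses panel data to decide how this condition holds without imposing triangularity for all observations.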

“Estimation and Inference in Dynamic Nonlinear Fixed Effects Panel Data Models by Projection.” (download here)

I develop a bias reduction method for the estimators of structural parameters of a panel data model (whether linear or nonlinear) using a projection argument. A corrected score is calculated by projecting the score vector for the structural parameters onto the orthogonal complement of a space characterized by incidental parameter fluctuations.
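
Schematically (a sketch of the projection idea rather than the exact construction in the paper), with structural parameter \(\theta\) and incidental parameters \(\alpha_i\), the corrected score replaces the score for \(\theta\) with its residual from a projection onto the incidental-parameter directions,

\[ u^{\perp}_{i}(\theta) = u_{\theta,i} - \mathrm{E}\!\left[u_{\theta,i}\, u_{\alpha_i}'\right]\mathrm{E}\!\left[u_{\alpha_i}\, u_{\alpha_i}'\right]^{-1} u_{\alpha_i}, \]

so that estimating equations based on \(u^{\perp}_{i}\) are, to first order, insensitive to fluctuations in the \(\alpha_i\).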

“The Role of Sparsity in Panel Data Models.” (download here)

Sparsity of the fixed effects means that some fixed effects are exactly zero, while others are bounded and still others are large. In a first step, the proposed estimator detects the large fixed effects; I tune the regularization parameter to encourage sparsity and allow for contemporaneously exogenous regressors. In a second step, I remove the detected large non-zero fixed effects so that pooled OLS may be applied.
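
A stylized version of the two-step idea (an illustrative sketch; the penalty and the tuning in the paper are more involved) is the penalized first step

\[ (\hat\beta, \hat\alpha) = \arg\min_{\beta, \alpha}\; \sum_{i=1}^{n}\sum_{t=1}^{T}\left(y_{it} - x_{it}'\beta - \alpha_i\right)^2 + \lambda \sum_{i=1}^{n} |\alpha_i|, \]

followed by a second step in which the units with detected non-zero \(\hat\alpha_i\) are removed before pooled OLS is applied to the remaining observations.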

linear panel data econometrics

“Practical Aspects of Using Quadratic Moment Conditions in Linear AR(1) Panel Data Models.” (with Markus Fritsch and Joachim Schnurbus, link here)

We study the estimation of the lag parameter of linear dynamic panel data models with first-order dynamics based on the quadratic Ahn and Schmidt (1995) moment conditions. Our contribution is twofold: First, we show that extending the standard assumptions by mean stationarity and time series homoscedasticity and employing these assumptions in estimation restores standard asymptotics and mitigates the non-standard distributions found in the literature. Second, we consider an IV estimator based on the quadratic moment conditions that consistently identifies the true population parameter under standard assumptions. Standard asymptotics hold for the estimator when the cross section dimension is large and the time series dimension is finite. We also suggest a data-driven approach to obtain standard errors and confidence intervals that preserves the time series dependence structure in the data.
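
In the standard notation of this literature, for the AR(1) model \(y_{it} = \rho y_{i,t-1} + \eta_i + \varepsilon_{it}\) with composite error \(u_{it}(\rho) = y_{it} - \rho y_{i,t-1}\), the Ahn and Schmidt (1995) conditions take the form

\[ \mathrm{E}\!\left[u_{iT}(\rho)\, \Delta u_{it}(\rho)\right] = 0, \qquad t = 2, \dots, T-1, \]

which are quadratic in \(\rho\) because both factors are linear in \(\rho\).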

“Large-\(n\), Large-\(T\) Properties of an IV Estimator Based on the Ahn-Schmidt Moment Conditions.” (with Markus Fritsch and Joachim Schnurbus, link here)

We propose an instrumental variables (IV) estimator based on nonlinear (in parameters) moment conditions for estimating linear dynamic panel data models and derive the large sample properties of the estimator. We assume that the only explanatory variable in the model is one lag of the dependent variable and consider the setting where the absolute value of the true lag parameter is smaller than or equal to one, the cross section dimension is large, and the time series dimension is either fixed or large. Estimation of the lag parameter involves solving a quadratic equation, and we find that the lag parameter is point identified in the unit root case; otherwise, two distinct roots (solutions) result. We propose a selection rule that identifies the consistent root asymptotically in the latter case and derive the asymptotic distribution of the estimator for the unit root case and for the case when the absolute value of the lag parameter is smaller than one.
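
Concretely (a stylized single-condition version), the sample analogue

\[ \frac{1}{N}\sum_{i=1}^{N}\left(y_{iT} - \rho\, y_{i,T-1}\right)\left(\Delta y_{it} - \rho\, \Delta y_{i,t-1}\right) = 0 \]

is a quadratic equation in \(\rho\) and generically has two roots; the selection rule picks out the root that is consistent for the true lag parameter.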

time series econometrics

“Entropy-based Normality Testing for Time Series Data.” (with Shahzad Munir), submitted.

We propose a normality test that is valid in time series settings. We show that, in a broad class of stationary processes, the proposed test statistic is asymptotically normal and does not require any kernel smoothing to consistently estimate its asymptotic variance. We also provide a Monte Carlo analysis to evaluate the finite-sample performance of the proposed test relative to existing moment-based tests in the time series literature. Our test performs better than Bai and Ng (2005) and other moment-based tests, but almost matches the performance of Lobato and Velasco (2004). The power of the test comes from deviations of the kurtosis from 3; as a result, it has no power against asymmetric alternatives whose kurtosis matches that of the normal distribution.

“A Monte Carlo Analysis of Fixed-\(b\) Modifications to the Normality Tests of Bai and Ng (2005).” (with Shahzad Munir)

We propose a convenient fixed-\(b\) implementation of the skewness, kurtosis, and normality tests of Bai and Ng (2005) to avoid resorting to simulated critical values and data-driven bandwidth calculations. We show through a detailed Monte Carlo study that our proposal of combining a Bartlett correction factor with a default bandwidth performs well and is comparable to alternative procedures. Therefore, our proposal should encourage wider adoption by practitioners.
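
For reference (a sketch of the standard construction that fixed-\(b\) procedures build on, not the specific correction factor we propose), the Bartlett-kernel long-run variance estimator with bandwidth \(M = bT\) is

\[ \hat\Omega = \hat\gamma(0) + 2\sum_{j=1}^{T-1}\max\!\left\{1 - \frac{j}{M},\, 0\right\}\hat\gamma(j), \]

and fixed-\(b\) asymptotics treat \(b \in (0,1]\) as fixed rather than shrinking with \(T\), which is what makes a default bandwidth usable without data-driven selection.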

empirical applications

“Revisiting Habits and Heterogeneity in Demands.” (with Markus Fritsch and Joachim Schnurbus, link here)

We conduct a narrow replication of Browning and Collado (Journal of Applied Econometrics 2007; 22(3): 625-640). They estimate a linear panel AR(1) version of an Engel curve for six consumption composites using iterated GMM. We find that the coefficient estimates and standard errors differ from the reported results when we use their instrument set; in particular, we find habit formation in non-durable services and no state dependence in small durables. Despite finding evidence of weak instruments, our results support most of the claims made in the original paper; like the original, we are also unable to detect intertemporal dependence strong enough to resolve existing macro puzzles.

“Market Pricing of Fundamentals at the Shanghai Stock Exchange: Evidence from a Dividend Discount Model with Adaptive Expectations.” (with Mingyang Li and Linlin Niu, link here)

We document evidence of fundamental pricing at the Shanghai Stock Exchange using panel data of listed stocks from 1997 to 2018. We estimate key parameters, namely the price elasticity of dividends and the expectation adjustment parameter, from a dividend discount model with adaptive expectations. The reduced form can be estimated using linear dynamic panel data methods, which, given the size of the panel, requires us to apply methods that correct for incidental parameter bias and to use a subset of stocks that do not completely drive the aggregate dynamics of the exchange. The resulting price elasticity of dividends is about 0.37 based on annual data and about 0.51 based on quarterly data, both of which are sizable given high PD and PE ratios in the market. Our results imply that slow expectation adjustment contributes to “bubble-like” price patterns. We also show that prices react significantly to macro information related to the discount rate, but these effects are very sensitive to the information set used.
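
One standard way such a model maps into a dynamic panel regression (a sketch for intuition; the exact specification in the paper differs in details) is partial adjustment of log prices toward a dividend-based target,

\[ p^{*}_{it} = \alpha_i + \beta\, d_{it}, \qquad p_{it} - p_{i,t-1} = \lambda\left(p^{*}_{it} - p_{i,t-1}\right) + e_{it}, \]

which yields \(p_{it} = (1-\lambda)\, p_{i,t-1} + \lambda\beta\, d_{it} + \lambda\alpha_i + e_{it}\): a linear dynamic panel equation from which the price elasticity of dividends \(\beta\) and the expectation adjustment parameter \(\lambda\) can be recovered.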

pedagogical articles

“Teaching Advanced Topics in Econometrics using an Introductory Textbook: The Case of Dynamic Panel Data Methods.” (with Markus Fritsch and Joachim Schnurbus), submitted.

We show how to use the well-established introductory econometrics textbook by Stock and Watson (2019) as scaffolding for teaching or studying dynamic panel data methods. Using an introductory textbook lowers entry barriers and addresses a broad range of needs. First, we unlock access to linear dynamic panel data methods in a way that fits lecture formats and also suits seminar-type courses, capstones, or independent study by students selecting research topics that require dynamic panel data methods. Second, we design a case study building on the textbook's illustration of cigarette demand to show how economic and methodological issues are interrelated. Third, we design a second case study to show how to evaluate current empirical practices from a theoretical standpoint, based on methods covered in an introductory econometrics course along with our scaffolding. We design both case studies to boost students' confidence in working with technical material and to provide instructors with more opportunities to develop the econometric thinking needed to communicate actively with applied economists. We also point instructors and students to further pathways for reusing and extending our case studies in other contexts.