Student's-t with two degrees of freedom: no variance but asymptotically Normal
Posted: November 24, 2025 in Educational Material | Tags: Asymptotics, Bessel function, Econometrics, Generalized Central Limit Theorem, Lambert W function, Stable distribution, student's t distribution
Here is a direct proof, using the characteristic function, that sums of iid random variables from the Student's-t distribution with two degrees of freedom, which has a finite mean but no variance, nevertheless converge to the standard Normal distribution.
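The claim can be eyeballed with a quick Monte Carlo sketch (my addition, not part of the post's proof). When the tail index is exactly 2, the truncated variance grows like log n, so the appropriate norming for the sums is sqrt(n log n) rather than sqrt(n):

```python
import numpy as np

# Monte Carlo sketch (not from the post): sums of iid t(2) draws,
# normalized by sqrt(n * log n), the standard norming when the
# truncated variance grows logarithmically.
rng = np.random.default_rng(0)
n, reps = 100_000, 500
z = np.empty(reps)
for i in range(reps):
    s = rng.standard_t(df=2, size=n).sum()   # sum of iid t(2) draws
    z[i] = s / np.sqrt(n * np.log(n))        # normalized sum

# z should look approximately standard Normal; convergence is slow
# (logarithmic rate), so the sample std sits somewhat above 1 for finite n
print(round(z.mean(), 2), round(z.std(), 2))
```

The logarithmic rate is the practical price of the missing variance: even with n = 100,000 per sum, the normalized sums are only roughly, not sharply, standard Normal.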
Derivation of the Cauchy distribution
Posted: September 12, 2025 in Educational Material | Tags: Cauchy distribution, Distributions, Mellin transform, Normal distribution, Skew Normal Distribution, Statistics
Here is the derivation of the (standard) Cauchy distribution (CDF and pdf), as the ratio of two standard Normals. It makes a nice link with the Skew Normal distribution, and with Mellin transforms.
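The result is easy to check by simulation (a quick sketch, not part of the derivation): the ratio of two independent standard Normals should track the standard Cauchy CDF, F(x) = 1/2 + arctan(x)/π.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(200_000)
v = rng.standard_normal(200_000)
r = u / v  # ratio of two independent standard Normals

def cauchy_cdf(x):
    # standard Cauchy CDF: F(x) = 1/2 + arctan(x)/pi
    return 0.5 + np.arctan(x) / np.pi

# the empirical CDF of the ratio matches the Cauchy CDF pointwise
for x0 in (-2.0, 0.0, 1.0, 3.0):
    print(x0, round((r <= x0).mean(), 3), round(cauchy_cdf(x0), 3))
```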
Little Things Economics & Econometrics
Posted: September 9, 2023 in Educational Material, Uncategorized | Tags: Econometrics, Economics, Statistics
Two volumes.
Economics, 292 pages, 106 entries, Little Things you should know about, and you usually don’t.
Econometrics, 342 pages, 172 entries, Little Things you should know about, and most likely you don’t.
Go to the book page to download them (free of charge of course).
Bandwidth selection for kernel density estimation of fat-tailed and skewed distributions
Posted: March 20, 2023 in Research, Research papers | Tags: Asymmetric Laplace distribution, bandwidth selection, fat-tailed distribution, kernel density estimation, skewness
Daniel J. Henderson, Alecos Papadopoulos & Christopher F. Parmeter (2023): Bandwidth selection for kernel density estimation of fat-tailed and skewed distributions, Journal of Statistical Computation and Simulation. DOI:10.1080/00949655.2023.2173194
This is a paper together with D.J. Henderson and C.F. Parmeter. Bottom line: it is time to more or less abandon the Silverman rule-of-thumb bandwidth selection in kernel density estimation, in favor of rules based on other reference densities, like the Asymmetric Laplace we present in the paper. They perform better in Mean Squared Error terms. Check it out at https://doi.org/10.1080/00949655.2023.2173194
ABSTRACT: Applied researchers using kernel density estimation have worked with optimal bandwidth rules that invariably assumed that the reference density is Normal (optimal only if the true underlying density is Normal). We offer four new optimal bandwidth rules-of-thumb based on other infinitely supported distributions: Logistic, Laplace, Student's-t and Asymmetric Laplace. Additionally, we propose a pseudo-rule-of-thumb (ROT) bandwidth based on a Gram-Charlier expansion of the unknown reference density that is linked to the empirical skewness and kurtosis of the data. The intellectual investment needed to implement these new optimal bandwidths is practically zero. We discuss the behaviour of these bandwidths as it links differences in skewness and kurtosis to the Normal reference ROT. We further propose model selection criteria for bandwidth choice when the true underlying density is unknown. The performance of these new ROT bandwidths is assessed in a variety of Monte Carlo simulations as well as in two empirical illustrations, the well-known data set of annual snowfall in Buffalo, New York, and a timely example on stock market trading.
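For reference, the Normal-based rule the paper benchmarks against is Silverman's h = 0.9 · min(s, IQR/1.34) · n^(−1/5). A minimal sketch of that baseline follows (the paper's new rules-of-thumb are not reproduced here):

```python
import numpy as np

def silverman_bandwidth(x):
    # Normal-reference rule-of-thumb: optimal only if the true density
    # is Normal, which is the paper's motivation for alternative
    # references (Logistic, Laplace, Student's-t, Asymmetric Laplace).
    x = np.asarray(x, dtype=float)
    s = x.std(ddof=1)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 0.9 * min(s, iqr / 1.34) * x.size ** (-1 / 5)

rng = np.random.default_rng(2)
h = silverman_bandwidth(rng.standard_normal(1_000))
print(round(h, 3))
```

For fat-tailed or skewed data, the min(s, IQR/1.34) guard helps, but the Normal reference still drives the constant, which is precisely what the paper's alternative reference densities change.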
The noise error component in stochastic frontier analysis
Posted: January 10, 2023 in Research, Research papers | Tags: Identification, Statistical Dependence, Stochastic frontier analysis, Stochastic Noise, Wrong skewness
Published online on the last day of the year 2022, in Empirical Economics. It will eventually be part of a volume in honor of Peter Schmidt.
Check https://doi.org/10.1007/s00181-022-02339-w
Abstract
With a little help from a handful of scholars, the noise component of the composed error in a production model created the stochastic frontier analysis field. But after that glorious moment, it was confined to obscurity. We review what little research has been done on it. We present two cases where it torments us from the shadows, by sabotaging identification and by distorting the sample skewness. We examine the relation between predicted noise and predicted inefficiency. For the Normal-Half Normal and the Normal-Exponential error specification, we provide its conditional expectation as predictor and we examine its distribution in relation to the marginal law. We also derive the conditional distribution of the noise and we compute confidence intervals and the probability of over-predicting it. Finally, we present a model where the noise, as the carrier of uncertainty, directly induces inefficiency. We conclude by showcasing our theoretical results through an empirical illustration.
Keywords: Noise · Stochastic frontier · Identification · Wrong skewness · Dependence
JEL Classification C46 · C21 · C51 · D24
Quantile methods for Stochastic Frontier Analysis
Posted: November 10, 2022 in Research, Research papers, Uncategorized | Tags: Econometrics, efficiency, Production Economics, Quantile regression, Stochastic frontier analysis
Finally. This journey started in June 2020 at the North American Productivity Workshop (NAPW) in Miami, Florida, USA (virtual due to covid-19). There were some papers on applying quantile methods to stochastic frontier analysis (SFA), and I publicly commented that something just didn't feel in place: not technical stuff, but conceptual issues. I started thinking and writing about it, and one year later, at the next NAPW (again virtual) in June 2021, I presented "Quantile regression in Stochastic Frontier Analysis: some fundamental considerations", where I was rather pessimistic about the applicability of quantile methods in SFA. The organizer of the workshop, Chris Parmeter, was also writing a piece on the topic, having his own concerns about whether quantile regression and SFA can co-exist. We initially discussed putting together a "symposium", a small collection of papers around the subject in a journal, but we ended up instead writing a monograph together, which has just been published. It is not just a review: it has new (valid) tools, an empirical application, and many open issues for further research.
The abstract goes:
Quantile regression has become one of the standard tools of econometrics. We examine its compatibility with the special goals of stochastic frontier analysis. We document several conflicts between quantile regression and stochastic frontier analysis. From there we review what has been done up to now, we propose ways to overcome the conflicts that exist, and we develop new tools to do applied efficiency analysis using quantile methods in the context of stochastic frontier models. The work includes an empirical illustration to reify the issues and methods discussed, and catalogs the many open issues and topics for future research.
Here is the ToC:
1 Introduction
I Where We Are
2 The Relation Between Conditional Quantiles and the Regression Function
3 Basics of Quantile Regression: The Independence Case
4 Where Quantile Regression and Stochastic Frontier Analysis Clash
5 Reconciling Quantile Regression with Stochastic Frontier Models
6 Likelihood-Based Quantile Estimation
II What We Can Do
7 The Corrected Q-Estimator
8 Quantile-Dependent Efficiency
9 From the Composite Error Term to Inefficiency: A Fundamental Result
10 Quantile Estimation and Inference with Dependence
11 An Empirical Application
III For the Road
12 Challenges Ahead
13 Summary and Concluding Remarks
References
The full text is at http://dx.doi.org/10.1561/0800000042 (pay-walled).
Click here to download front material, chapter 1, 12, 13 and references.
Trade liberalization and growth: replication with quantile regression
Posted: December 13, 2021 in Research, Research papers, Uncategorized | Tags: Econometrics, growth, Quantile regression, tariff reduction, trade liberalization, Uruguay round
This has been accepted and published in Empirical Economics: Papadopoulos, A. (2021), "Trade liberalization and growth: a quantile moderator for Hoyos' (2021) replication study of Estevadeordal and Taylor (2013)", Empirical Economics, https://doi.org/10.1007/s00181-021-02159-4.
Estevadeordal and Taylor (2013) is unique in that the authors dug deeply into raw data to construct indicators of tariff reductions in the 1990’s for several countries. Hoyos (2021) is a “critical replication” study that challenges their main finding (that tariff reduction accelerated growth in the 1990’s), on account of the empirical exercise being “non-robust”.
My paper is indeed a moderator in this debate, because it shows that the non-robustness results of Hoyos are valid, but they do not invalidate the overall conclusion of Estevadeordal and Taylor. In order to understand this, one should apply quantile regression, which is what I do.
The abstract reads:
We examine whether Hoyos’ (Empir Econ, 2021) critical replication of Estevadeordal and Taylor (Rev Econ Stat 95(5):1669–1690, 2013) that dealt with trade liberalization and growth, does provide, as the author claims, clear evidence that the estimation results and the conclusions of the original should be discarded, the first as “nonrobust,” the second as relying on the former. We find that robustness is indeed an issue. We correct for it using quantile regression and we obtain results that explain both papers and support a modified argument that tariff reduction in the 1990’s contributed to growth acceleration, when other determinants of growth were conducive to such acceleration.
The Effects of Management on Production: A Survey of Empirical Studies
Posted: July 18, 2021 in Research, Research papers | Tags: efficiency, Management, Production, Production Economics
Eleven months later, the chapter has just been released online.
Contributed chapter to Handbook of Production Economics vol. 2, Springer,
https://doi.org/10.1007/978-981-10-3450-3_45-1. Online: July 16, 2021.
ABSTRACT: We review econometric studies that attempt to estimate the effects of management on production, be it on output, productivity, or efficiency. We group the studies mainly by a methodological criterion: whether they treat management as a latent variable, whether they proxy it by some other variable(s), or whether they attempt to construct a direct measure of management and use it as a regressor in an econometric model. A large part of the literature uses data from small-size agriculture, while in recent years national surveys have started to collect more systematically data related to management and management practices from various industries. Rather than being mentioned by telegraphic references, most of the studies presented are given a somewhat detailed summary so that the reader can acquire a good sense of the methodological choices made, the estimation techniques adopted, and the results obtained on the effects of management.
Modeling dependence in two-tier stochastic frontier models
Posted: July 14, 2021 in Research, Research papers | Tags: Copula, Statistical Dependence, two-tier stochastic frontier
This paper just got aired on the website of the Journal of Productivity Analysis, https://doi.org/10.1007/s11123-021-00611-2. It is jointly written with Chris F. Parmeter and Subal Kumbhakar.
Abstract: The two-tier stochastic frontier model has seen widespread application across a range of social science domains. It is particularly useful in examining bilateral exchanges where unobserved side-specific information exists on both sides of the transaction. These buyer and seller specific informational aspects offer opportunities to extract surplus from the other side of the market, in combination also with uneven relative bargaining power. Currently, this model is hindered by the fact that identification and estimation relies on the potentially restrictive assumption that these factors are statistically independent. We present three different models for empirical application that allow for varying degrees of dependence across these latent informational/bargaining factors.
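The paper's three dependence models are not reproduced here, but the general idea of coupling the two latent one-sided components can be sketched with a Gaussian copula (an illustrative stand-in, not the authors' specification):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)
n, rho = 20_000, 0.5

# Draw correlated standard Normals, map them through the Normal CDF to
# get dependent Uniforms (a Gaussian copula), then invert to Exponential
# marginals for the two side-specific informational components.
# This is a generic illustration, not the paper's three models.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
phi = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0))))
uu = phi(z)
w = -np.log(1.0 - uu[:, 0])  # seller-side one-sided component
u = -np.log(1.0 - uu[:, 1])  # buyer-side one-sided component
# composed error of a two-tier frontier would then be eps = v + w - u,
# with v symmetric noise; here w and u are statistically dependent
print(round(np.corrcoef(w, u)[0, 1], 2))
```

The copula separates the marginal laws of the two components (kept Exponential here) from their dependence, which is what makes estimation feasible once the independence assumption is dropped.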



