By Michael Levin
Overcrowding in United States hospitals’ emergency departments (EDs) has been identified as a significant barrier to receiving high-quality emergency care, as many overcrowded EDs struggle to properly triage, diagnose, and treat emergency patients in a timely and effective manner. Priority is now being placed on research that explores the effectiveness of possible solutions, such as heightened adoption of IT to advance operational workflow and care services related to diagnostics and information accessibility, with the goal of improving what is called throughput efficiency. However, the high costs of technological process innovation, as well as usability challenges, still impede widespread and rapid implementation of these disruptive solutions. This paper contributes to the pursuit of better understanding the value of adopting health IT (HIT) to improve ED throughput efficiency.
Using hospital visit data, I investigate two ways in which ED throughput activity changes with increased HIT sophistication. First, I use a probit model to estimate any statistically and economically significant decrease in the probability of ED mortality resulting from greater HIT sophistication. Second, my analysis turns to workflow efficiency, using a negative binomial regression model to estimate the impact of HIT sophistication on reducing ED waiting room times. The results show a negative and statistically significant (p < 0.01) association between the presence of HIT and the probability of mortality in the ED. However, the marginal impact of an increase in sophistication from basic to advanced HIT functionality was not meaningful. Finally, I do not find a statistically significant impact of HIT sophistication on expected waiting room time. Together, these findings suggest that although technological progress is trending in the right direction to ultimately have a far-reaching impact on ED throughput, more progress must be made before HIT can directly move the needle on healthcare’s greatest challenges.
Advisors: Professor Ryan McDevitt, Professor Michelle Connolly | JEL Codes: I1, I18, O33
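The probit specification named above can be sketched on simulated data. The variables and coefficients below (a three-level HIT index, a binary mortality outcome) are illustrative stand-ins, not the paper’s data or estimates:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulate a binary ED-mortality outcome driven by a hypothetical HIT index:
# 0 = no HIT, 1 = basic functionality, 2 = advanced functionality.
rng = np.random.default_rng(0)
n = 5000
hit_level = rng.integers(0, 3, n)
latent = 0.5 - 0.4 * hit_level + rng.standard_normal(n)   # true slope: -0.4
mortality = (latent > 0).astype(float)

X = np.column_stack([np.ones(n), hit_level])

def neg_loglik(beta):
    # Probit: P(y = 1 | x) = Phi(x'beta); maximize the log-likelihood
    p = np.clip(norm.cdf(X @ beta), 1e-10, 1 - 1e-10)
    return -np.sum(mortality * np.log(p) + (1 - mortality) * np.log(1 - p))

res = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
beta_hat = res.x   # slope estimate should be close to the true -0.4
```

The negative and significant slope on the HIT index mirrors the sign of the association the abstract reports, though on fabricated data.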
By Jacob Chasan
A new kernel[1] is in town. The current industry standard for resource allocation on computers does not take the user’s preferences into account; rather, programs are given access to resources based on the time at which each requested to be run. Although this system can lead to solutions that minimize the time it takes for a program to receive an allocation, it often leads to an incentive misalignment between the programs and the user. This misalignment is exacerbated because current queue-based systems have no inherent mechanism to prevent a tragedy-of-the-commons problem, whereby programs take more resources from the system than the value they provide to the user. By shifting to a market-based approach, in which computing resources are allocated to programs based on how much utility the user receives from each program, the incentives of the programs and the user align. With inherent market mechanisms keeping the incentives aligned, this new paradigm leads to weakly higher utility for the user.
[1] As described in subsequent parts of this paper, the kernel is the core program within an operating system, which is given the authority to allocate hardware resources amongst the programs on the computer.
Advisors: Professor Benjamin C. Lee, Professor Atila Abdulkadiroglu, Professor Michelle Connolly | JEL Codes: C8, C80
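The market-based idea can be illustrated with a toy proportional-share rule, where each program’s bid stands in for the utility the user derives from it. The program names and bids below are hypothetical, not part of the paper:

```python
# Toy market-based allocator: the kernel grants CPU shares in proportion
# to each program's bid (a stand-in for the user's utility from it).
def market_allocate(bids, total_cpu=1.0):
    total = sum(bids.values())
    return {prog: total_cpu * bid / total for prog, bid in bids.items()}

# A queue-based scheduler would serve these in arrival order regardless of
# value to the user; here the high-utility editor gets the largest share.
shares = market_allocate({"editor": 6.0, "indexer": 3.0, "updater": 1.0})
```

Because shares scale with stated utility rather than arrival time, a low-value program cannot crowd out a high-value one, which is the incentive-alignment point the abstract makes.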
By Andie Carroll
As nonprofits work to serve their communities, they must choose a location that best suits their needs and the needs of the population they aim to serve. Locational characteristics such as median income and population density have been shown to affect how many nonprofits choose to locate in a given area. However, few studies have examined the impact of locational characteristics on how nonprofits survive and thrive. This study examines the impact of geographic and demographic factors on nonprofit survival and success through a case study of El Sistema USA (ESUSA), a nationwide network of music education programs with the goal of helping underserved youth. It analyzes panel survey data from 131 El Sistema-inspired programs in the U.S. from 2005 to 2018, along with demographic data from the American Community Survey, charitable giving data from the IRS, and GIS data compiled through a review of ESUSA program websites. Using regression models of ESUSA program survival and success (defined as more students served and higher program budgets), this study finds that ESUSA programs in areas of greater need are more likely to survive and thrive.
Advisors: Professor Lorrie Schmid, Professor Michelle Connolly | JEL Codes: L3, L31, D23
Navigating the Maize of Poverty: Intra-Household Allocation and Investment in Children’s Human Capital in Tanzania
By Saheel Chodavadia
Intra-household resource allocation influences investment in children’s human capital and hence influences long-term poverty levels. I study how climate shocks in Tanzania shift intra-household bargaining power and investment in children’s human capital. Past empirical work finds that bargaining power is associated with income, assets, education, and other often unobservable factors. Anthropological evidence from Tanzania suggests that male decision-makers in poor households control most income and own most assets. Conditioning on changes in total household resources due to climate shocks, I find evidence consistent with climate shocks increasing female bargaining power through a reduction in male decision-makers’ income. Specifically, climate shocks in households with more educated women increase investment in children’s education and improve anthropometric measures of health. Lastly, I comment on the usefulness of relative education as a proxy for bargaining power in contexts where data and cultural limitations obscure decision-makers’ distinct assets and income streams.
Advisors: Professor Robert Garlick, Professor Michelle Connolly | JEL Codes: D0, D13, I20
The Upstream and Downstream Effects of Government Industrial Policy in the Rare Earth Elements Industry
By Charles Daniel
The Chinese government has found considerable success in stimulating economic modernization through its industrial policy. The development of the rare earths industry, in both upstream and downstream markets, exemplifies this success. Rare earths are a group of metals whose natural properties make them critical for many pieces of modern technology. Upstream, Chinese raw rare earth producers extracted minimal output in 1985; by 2001 they accounted for more than 90 percent of global production. China stimulated this growth beginning in 1990 with implicit and explicit subsidies for rare earth producers, which enabled them to enter the market and produce at lower marginal costs than other world firms. These lower costs enabled Chinese producers to assume a market-leading position, and this paper explains the resulting developments in the upstream rare earth market through the Stackelberg model, which describes sequential quantity competition. In 2006, China introduced an additional policy of export quotas on rare earths, intended to benefit downstream Chinese firms. These firms depend on rare earths as inputs for the final goods (such as batteries and personal electronics) they produce. After the quota announcement, Chinese downstream firms benefitted from continued unrestricted access to rare earths, while non-Chinese downstream firms faced higher costs on the world market for rare earth inputs. This paper uses the Bertrand model, in which firms compete on prices, to examine the subsequent effects on these downstream markets. While Chinese rare earth producers were harmed by the export quotas, the combination of the subsidy and the export quotas enabled China to achieve its economic goals: first, to gain leverage in the rare earths industry, and second, to transition its economy toward higher-value products and services.
Advisors: Professor Alexander Pfaff, Professor Michelle Connolly | JEL Codes: L5, L52, L13
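The sequential quantity competition described above has a standard closed form under linear inverse demand. The numbers below are illustrative parameters, not estimates from the rare earths market; the subsidy appears as the leader’s lower marginal cost:

```python
# Stackelberg duopoly sketch: inverse demand P = a - b*(qL + qF).
# Illustrative parameters only; the subsidized leader has the lower cost.
a, b = 100.0, 1.0
c_leader, c_follower = 10.0, 30.0

# Follower's best response (from its first-order condition):
#   qF(qL) = (a - cF - b*qL) / (2b)
# Substituting into the leader's profit and solving its FOC gives:
q_leader = (a + c_follower - 2 * c_leader) / (2 * b)
q_follower = (a - c_follower - b * q_leader) / (2 * b)
price = a - b * (q_leader + q_follower)
profit_leader = (price - c_leader) * q_leader
profit_follower = (price - c_follower) * q_follower
```

With these numbers the leader produces 55 units to the follower’s 7.5, showing how a cost advantage plus first-mover status translates into a market-leading output share.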
By Pranav Ganapathy
We propose and evaluate an auction mechanism for the priority review voucher program. The 2007 voucher program rewards drug developers for regulatory approval of novel treatments for neglected tropical diseases. Previous papers have proposed auctioning vouchers for the priority review voucher program but have offered neither a mathematical model nor a framework. We present a mechanism design problem with one pharmaceutical company producing one drug for a neglected tropical disease. The mechanism that maximizes the regulator’s expected surplus is a take-it-or-leave-it offer, with three different offers based on low, intermediate, and high neglected disease burdens. We demonstrate how mechanism design can be applied to settings in which the buyer pays for public access to a product with regulatory speed. Finally, this paper may be useful to policymakers seeking to improve access to voucher drugs through modifications of the program.
Advisors: Professor David Ridley, Professor Giuseppe Lopomo, Professor Michelle Connolly | JEL Codes: I1, D44, D82
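As a generic illustration of a take-it-or-leave-it offer (a textbook posted-price example, not the paper’s actual model), consider a seller facing a buyer whose private value is uniform on [0, 1]; the offer maximizing expected revenue p·(1 − p) is p = 1/2:

```python
import numpy as np

# Posted-price (take-it-or-leave-it) illustration: buyer value v ~ U(0, 1),
# the buyer accepts iff v >= p, so expected revenue is p * (1 - p).
prices = np.linspace(0.0, 1.0, 10001)
expected_revenue = prices * (1 - prices)
best_p = prices[np.argmax(expected_revenue)]   # analytic optimum: 1/2
```

The paper’s regulator problem is richer (its offer varies with disease burden), but the same logic of trading off price against acceptance probability drives a take-it-or-leave-it mechanism.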
By Ralph Lawton
Natural disasters can have catastrophic personal and economic effects, particularly in low-resource settings. Major natural disasters are becoming more frequent, so a rigorous understanding of their effects on long-term economic wellbeing is fundamental to mitigating their impacts on exposed populations. In this paper, I investigate the effects of the 2004 Indian Ocean tsunami on real consumption and assets at the individual level. I also examine the heterogeneity of those impacts, and the related effects on inequality. Taking individual-specific heterogeneity into account with fixed effects, I find that individuals living in heavily damaged areas experience major declines in real consumption and assets, and do not recover in the long term. These results are strikingly different from results that do not consider price effects, as well as from previously published macroeconomic results. I also find significant heterogeneity by age, education level, pre-tsunami socioeconomic status, and whether an individual went into a refugee camp. The tsunami resulted in large, long-term declines in asset inequality, and a temporary increase in consumption inequality that returns to near pre-tsunami levels in the long run.
Advisors: Professor Duncan Thomas, Professor Michelle Connolly | JEL Codes: D1, D15, H84
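The fixed-effects strategy mentioned above can be sketched with the within estimator on a simulated two-wave panel. The individual effects, damage measure, and coefficient below are illustrative, not the survey data:

```python
import numpy as np

# Within (fixed-effects) estimator on simulated panel data: demeaning each
# individual's observations sweeps out the individual-specific effect alpha.
rng = np.random.default_rng(3)
n_ind, n_per = 300, 2                      # two waves per individual
alpha = rng.normal(0, 2, n_ind)            # unobserved individual heterogeneity
damage = rng.uniform(0, 1, (n_ind, n_per))
consumption = alpha[:, None] - 1.0 * damage + rng.normal(0, 0.5, (n_ind, n_per))

# Demean within each individual, then run pooled OLS on the deviations.
y_w = consumption - consumption.mean(axis=1, keepdims=True)
x_w = damage - damage.mean(axis=1, keepdims=True)
beta_fe = (x_w * y_w).sum() / (x_w ** 2).sum()   # should be near the true -1.0
```

Naive pooled OLS on the raw data would be contaminated by any correlation between alpha and damage exposure; demeaning removes that channel, which is the point of the fixed-effects design.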
Forecasting Corporate Bankruptcy: Applying Feature Selection Techniques to the Pre- and Post-Global Financial Crisis Environments
By Parker Levi
I investigate the use of feature selection techniques to forecast corporate bankruptcy in the years before, during, and after the global financial crisis. Feature selection is the process of selecting a subset of relevant features for use in model construction. While other empirical bankruptcy studies apply similar techniques, I focus specifically on the effect of the 2007-2009 global financial crisis. I conclude that the set of bankruptcy predictors shifts from accounting variables before the financial crisis to market variables during and after the financial crisis for one-year-ahead forecasts. These findings provide insight into the development of stricter lending standards in the financial markets that occurred as a result of the crisis. My analysis applies the Least Absolute Shrinkage and Selection Operator (LASSO) method as a variable selection technique and Principal Components Analysis (PCA) as a dimensionality reduction technique. In comparing these methods, I conclude that LASSO outperforms PCA in terms of prediction accuracy and offers more interpretable results.
Advisors: Professor Andrew Patton, Professor Michelle Connolly | JEL Codes: G1, G01, G33
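The LASSO selection step can be sketched with a small coordinate-descent implementation on simulated data. The features are generic stand-ins (two informative predictors plus noise columns), not the paper’s accounting or market variables:

```python
import numpy as np

# LASSO via cyclic coordinate descent with soft-thresholding, minimizing
# (1/2n)||y - Xb||^2 + lam * ||b||_1. Only the first two features matter.
rng = np.random.default_rng(1)
n, p = 500, 6
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

def lasso_cd(X, y, lam, iters=200):
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            # Partial residual excluding feature j's current contribution
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

beta_hat = lasso_cd(X, y, lam=0.3)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-8)   # indices LASSO keeps
```

The penalty zeroes out the noise columns while retaining (and slightly shrinking) the informative ones, which is exactly the selection behavior exploited in the abstract; PCA, by contrast, would rotate all six features into components rather than discard any.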
By Kevin Ma and Matthew Treiber
This paper explores the secondary resale market for high-end and limited-edition sneakers, specifically analyzing the determinants of the value at which sneakers trade in the secondary market. While it is common knowledge that the sneaker resale market is a thriving and active secondary market, there is little to no empirical research about what exactly causes such sneakers to sell for exorbitant prices in the resale market. The study utilizes a hedonic pricing approach to investigate the determinants of sneaker resale price. We use a dataset of sneaker resale transactions from the online marketplace StockX between 2016 and 2020 as the basis for our research. We find that the amount of “hype” surrounding a sneaker, as well as supply scarcity, are statistically significant determinants of the resale price premium a particular sneaker commands in the secondary market. This work adds to the sparse literature on the sneaker resale industry and brings an econometric approach to determining the price a given pair of sneakers commands in the resale market.
Advisors: Professor Kyle Jurado, Professor Michelle Connolly, Professor Grace Kim | JEL Codes: C2, C20, J19
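A hedonic regression of this kind can be sketched as an OLS of the log resale premium on product attributes. The attributes and coefficients below are simulated stand-ins, not StockX estimates:

```python
import numpy as np

# Hedonic pricing sketch: decompose the (log) resale premium into implicit
# prices of attributes. Both attributes here are hypothetical proxies.
rng = np.random.default_rng(2)
n = 1000
scarcity = rng.uniform(0, 1, n)   # 1 = very limited release
hype = rng.uniform(0, 1, n)       # e.g., a scaled search-interest index
log_premium = 0.8 * scarcity + 1.2 * hype + rng.normal(0, 0.3, n)

# OLS: each slope is the implicit marginal price of that attribute
X = np.column_stack([np.ones(n), scarcity, hype])
coef, *_ = np.linalg.lstsq(X, log_premium, rcond=None)
```

In the hedonic framework, the estimated slopes are read as the market’s implicit valuation of scarcity and hype, mirroring the significance tests the abstract describes.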