November 4th, 2017

Untitled

FEEDBACK LOOPS AND SOCIAL MEDIA | MESO-LEVEL CAUSES | ETHNOGRAPHY OF BUREAUCRACY

FEED FEEDBACK

Sociologist Zeynep Tufekci engages with Adam Mosseri, who runs the Facebook News Feed

Tufekci: “…Facebook does not ask people what they want, in the moment or any other way. It sets up structures, incentives, metrics & runs with it.”

Mosseri: “We actually ask 10s of thousands of people a day how much they want to see specific stories in the News Feed, in addition to other things.”

Tufekci: “That’s not asking your users, that’s research on your product. Imagine a Facebook whose customers are users—you’d do so much differently. I mean asking all people, in deliberate fashion, with sensible defaults—there are always defaults—even giving them choices they can change…Think of the targeting offered to advertisers—with support to make them more effective—and flip the possibilities, with users as customers. The users are offered very little in comparison. The metrics are mostly momentary and implicit. That’s a recipe to play to impulse.”

The tweets are originally from Zeynep Tufekci in response to Benedict Evans (link), but the conversation is much easier to read in Hamza Shaban’s screenshots here.

See the end of this newsletter for an extended comment from Jay.

  • On looping effects (paywall): “This chapter argues that today's understanding of causal processes in human affairs relies crucially on concepts of ‘human kinds’ which are a product of the modern social sciences, with their concern for classification, quantification, and intervention. Child abuse, homosexuality, teenage pregnancy, and multiple personality are examples of such recently established human kinds. What distinguishes human kinds from ‘natural kinds’, is that they have specific ‘looping effects’. By coming into existence through social scientists' classifications, human kinds change the people thus classified.” Link. ht Jay

THE MESO-LEVEL

Mechanisms and causes between micro and macro

Daniel Little, the philosopher of social science behind Understanding Society, has written numerous posts on the topic. Begin with this one from 2014:

“It is fairly well accepted that there are social mechanisms underlying various patterns of the social world — free-rider problems, communications networks, etc. But the examples that come readily to mind are generally specified at the level of individuals. The new institutionalists, for example, describe numerous social mechanisms that explain social outcomes; but these mechanisms typically have to do with the actions that purposive individuals take within a given set of rules and incentives.

“The question here is whether we can also make sense of the notion of a mechanism that takes place at the social level. Are there meso-level social mechanisms? (As always, it is acknowledged that social stuff depends on the actions of the actors.)”

In the post, Little defines a causal mechanism and a meso-level mechanism, then offers example research.

“…It is possible to identify a raft of social explanations in sociology that represent causal assertions of social mechanisms linking one meso-level condition to another. Here are a few examples:

  • Al Young: decreasing social isolation causes rising inter-group hostility (link)
  • Michael Mann: the presence of paramilitary organizations makes fascist mobilization more likely (link)
  • Robert Sampson: features of neighborhoods influence crime rates (link)
  • Chuck Tilly: the availability of trust networks makes political mobilization more likely (link)
  • Robert Brenner: the divided sovereignty system of French feudalism impeded agricultural modernization (link)
  • Charles Perrow: legislative control of regulatory agencies causes poor enforcement performance (link)”

More of Little’s posts on the topic are here. ht Steve Randy Waldman

⤷ Full Article

December 9th, 2017

Un Coup de dés

HIGHER EDUCATION | EXPLANATION, PART II

THE FUTURE OF UNDERGRADUATE EDUCATION

A new report argues that quality, not access, is the pivotal challenge for colleges and universities

From the American Academy of Arts and Sciences, a 112-page report with "practical and actionable recommendations to improve the undergraduate experience":

"Progress toward universal education has expanded most recently to colleges and universities. Today, almost 90 percent of high school graduates can expect to enroll in an undergraduate institution at some point during young adulthood and they are joined by millions of adults seeking to improve their lives. What was once a challenge of quantity in American undergraduate education, of enrolling as many students as possible, is now a challenge of quality—of making sure that all students receive the rigorous education they need to succeed, that they are able to complete the studies they begin, and that they can do this affordably, without mortgaging the very future they seek to improve."

Link to the full report. Co-authors include Gail Mellow, Sherry Lansing, Mitch Daniels, and Shirley Tilghman. ht Will, who highlights a few of the report's recommendations that stand out:

  • From page 40: "Both public and private colleges and universities as well as state policy-makers [should] work collaboratively to align learning programs and expectations across institutions and sectors, including implementing a transferable general education core, defined transfer pathway maps within popular disciplines, and transfer-focused advising systems that help students anticipate what it will take for them to transfer without losing momentum in their chosen field."
  • From page 65: "Many students, whether coming straight out of high school or adults returning later to college, face multiple social and personal challenges that can range from homelessness and food insecurity to childcare, psychological challenges, and even imprisonment. The best solutions can often emerge from building cooperation between a college and relevant social support agencies."
  • From page 72: "Experiment with and carefully assess alternatives for students to manage the financing of their college education. For example, income-share agreements allow college students to borrow from colleges or investors, which then receive a percentage of the student’s after-graduation income."
  • On a related note, see this 2016 paper from the Miller Center at the University of Virginia: "Although interest in the ISA as a concept has ebbed and flowed since Milton Friedman first proposed it in the 1950s, today it is experiencing a renaissance of sorts as new private sector partners and institutions look to make the ISA a feasible option for students. ISAs offer a novel way to inject private capital into higher education systems while striking a balance between consumer preferences and state needs for economic skill sets. The different ways ISAs can be structured make them highly suitable as potential solutions for many states’ education system financing problems." Link. (A toy sketch of ISA repayment mechanics follows this list.)
  • Meanwhile, Congress is working on the reauthorization of the Higher Education Act: "Much of the proposal that House Republicans released last week is controversial and likely won’t make it into the final law, but the plan provides an indication of Congressional Republicans’ priorities for the nation’s higher education system. Those priorities include limiting the federal government’s role in regulating colleges, capping graduate student borrowing, making it easier for schools to limit undergraduate borrowing — and overhauling the student loan repayment system. Many of those moves have the potential to create a larger role for private industry." Link.
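
Both ISA items above gloss the same mechanic: funding up front in exchange for a fixed share of post-graduation income. Here is a minimal sketch of how such a repayment stream might be computed; the function, rates, and incomes are hypothetical illustrations, not terms from the report or the Miller Center paper.

```python
# Minimal sketch of income-share agreement (ISA) repayment mechanics.
# All parameters are hypothetical.

def isa_payments(incomes, share=0.05, payment_cap=None):
    """Yearly payments for a stream of projected post-graduation incomes.

    incomes: projected annual income for each year of the ISA term
    share: fraction of income owed each year (0.05 = 5%)
    payment_cap: optional ceiling on total repayment
    """
    payments, total = [], 0.0
    for income in incomes:
        payment = share * income
        if payment_cap is not None:
            payment = min(payment, payment_cap - total)  # never exceed the cap
        payments.append(payment)
        total += payment
    return payments

# A graduate earning $40k, $45k, $50k over a three-year term at a 5% share:
print(isa_payments([40_000, 45_000, 50_000]))
# [2000.0, 2250.0, 2500.0]
```

The design point, and the source of the "balance between consumer preferences and state needs" the Miller Center paper mentions, is that repayment scales with realized income rather than with a fixed loan balance.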
⤷ Full Article

January 6th, 2018

Anonymous Power Game

FAVORITE ECONOMICS PAPERS | ISAS IN THE NEWS | MICROSTRUCTURE

THE YEAR IN ECONOMICS

Nominations from top economists, including selections by Raj Chetty, Sendhil Mullainathan, and Angus Deaton

One favorite from this excellent round-up is by Hulten and Nakamura on metrics, selected by Diane Coyle (we previously sent her Indigo Prize paper):

Accounting for Growth in the Age of the Internet: The Importance of Output-Saving Technical Change by Charles Hulten and Leonard Nakamura

Main finding: Living standards may be growing faster than GDP growth.
Nominating economist: Diane Coyle, University of Manchester
Specialization: Economic statistics and the digital economy
Why? “This paper tries to formalize the intuition that there is a growing gap between the standard measure of GDP, capturing economic activity, and true economic welfare and to draw out some of the implications.”

Robert Allen's "Absolute Poverty: When Necessity Displaces Desire" is another metrics-related piece on the list.

Also noteworthy, on the future of work:

Valuing Alternative Work Arrangements by Alexandre Mas and Amanda Pallais

Main finding: The average worker does not value an Uber-like ability to set their own schedule.
Nominating economist: Emily Oster, Brown University
Specialization: Health economics and research methodology
Why? “This paper looks at a question increasingly important in the current labor market: How do people value flexible work arrangements? The authors have an incredibly neat approach, using actual worker hiring to generate causal estimates of how workers value various employment setups.”

Full piece by DAN KOPF here.

⤷ Full Article

March 3rd, 2018

New Fables

UNIVERSITIES AND RESEARCH | OVERSIGHT FOR ALGORITHMS | UNEMPLOYMENT

IVORY MECHANICS

Regional parochialism and the production of knowledge in universities

"Scholarly understanding of how universities transform money and intellect into knowledge remains limited. At present we have only rudimentary measures of knowledge production's inputs: tuition and fees, government subsidies, philanthropic gifts, and the academic credentials of students and faculty. Output measures are equally coarse: counts of degrees conferred; dissertations, articles and books completed; patents secured; dollars returned on particular inventions. As for the black box of knowledge production in between: very little."

From the introduction to a new book on American social science research that aims to uncover the institutional pathways that produce (and favor) certain areas of research.

It continues:

"The rise of 'global' discourse in the US academy has coevolved with fundamental changes in academic patronage, university prestige systems, and the international political economy. America's great research institutions are now only partly servants of the US nation-state. This fact has very large implications for those who make their careers producing scholarly knowledge."

Link to the introduction.

  • A short interview with co-authors Mitchell L. Stevens and Cynthia Miller-Idriss. "Sociology department chairs said frankly that they deliberately steer graduate students away from international study because such projects on non-U.S. topics are less likely to have purchase on the tenure-line job market.… The tenure process is largely mediated by disciplines, and because those disciplines prioritize their own theoretical abstractions, contextual knowledge loses out." Link.
  • A paper examines previous attempts to map the parochialism of a discipline, finding that “conventional measures based on nation-state affiliation capture only part of the spatial structures of inequality.” Employed therein: novel visualizations and mapping the social network structures of authorship and citation. Link. Relatedly, a September 2017 post by Samuel Moyn on parochialism in international law. Link.
  • And a link we sent last fall, by Michael Kennedy, on interdisciplinarity and global knowledge cultures. Link.
⤷ Full Article

March 24th, 2018

Do I See Right?

FAIRNESS IN MACHINE LEARNING | METARESEARCH | MICROSTRUCTURE OF VIOLENCE

DISTINCT FUSION

Tracking the convergence of terms across disciplines

In a new paper, CHRISTIAN VINCENOT looks at the process by which two synonymous concepts in the study of agent-based complex systems (ACS) developed independently in separate disciplines, and how they were brought together.

“I analyzed research citations between the two communities devoted to ACS research, namely agent-based (ABM) and individual-based modelling (IBM). Both terms refer to the same approach, yet the former is preferred in engineering and social sciences, while the latter prevails in natural sciences. This situation provided a unique case study for grasping how a new concept evolves distinctly across scientific domains and how to foster convergence into a universal scientific approach. The present analysis based on novel hetero-citation metrics revealed the historical development of ABM and IBM, confirmed their past disjointedness, and detected their progressive merger. The separation between these synonymous disciplines had silently opposed the free flow of knowledge among ACS practitioners and thereby hindered the transfer of methodological advances and the emergence of general systems theories. A surprisingly small number of key publications sparked the ongoing fusion between ABM and IBM research.”

Link to a summary and context. Link to the abstract. ht Margarita
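
To make "hetero-citation" concrete, here is a toy version of a cross-community citation measure: for each community, the share of its outgoing citations that land in the other community. The paper's actual metrics are more elaborate, and the paper labels and citation pairs below are invented.

```python
# Toy cross-community citation measure for the ABM and IBM literatures.
# Papers and citations are fabricated for illustration.

from collections import defaultdict

community = {"p1": "ABM", "p2": "ABM", "p3": "IBM", "p4": "IBM"}
citations = [("p1", "p2"), ("p1", "p3"), ("p3", "p4"), ("p4", "p2")]

out_total = defaultdict(int)  # citations made by each community
out_cross = defaultdict(int)  # citations that cross to the other community

for citing, cited in citations:
    src, dst = community[citing], community[cited]
    out_total[src] += 1
    if src != dst:
        out_cross[src] += 1

for c in sorted(out_total):
    print(f"{c}: {out_cross[c]}/{out_total[c]} citations cross over "
          f"({out_cross[c] / out_total[c]:.0%})")
```

Tracked by publication year, a rising cross-over share is one way to see the "progressive merger" the paper describes.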

  • Elsewhere in metaresearch, a new paper from James Evans’s Knowledge Lab examines influence by means other than citations: “Using a computational method known as topic modeling—invented by co-author David Blei of Columbia University—the model tracks ‘discursive influence,’ or recurring words and phrases through historical texts that measure how scholars actually talk about a field, instead of just their attributions. To determine a given paper’s influence, the researchers could statistically remove it from history and see how scientific discourse would have unfolded without its contribution.” Link to a summary. Link to the full paper. (A toy version of the removal idea is sketched below.)
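
A rough sketch of the topic-modeling step underlying "discursive influence," using scikit-learn's LDA on an invented four-document corpus. Averaging the remaining topic mixtures is only a crude stand-in for the authors' counterfactual removal, which is statistically more careful.

```python
# Fit LDA to a toy corpus, then compare the corpus-level topic mixture
# with and without one paper -- a crude proxy for "removing it from history."
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "agent based model of social networks",
    "individual based model of fish populations",
    "topic models for citation analysis",
    "network analysis of scholarly influence",
]

X = CountVectorizer().fit_transform(docs)
theta = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

full = theta.mean(axis=0)               # discourse with every paper included
without_first = theta[1:].mean(axis=0)  # drop paper 0 from the corpus
print("shift in discourse:", np.abs(full - without_first).sum())
```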
⤷ Full Article

April 14th, 2018

Inventions that Changed the World

R&D HISTORY | ML AND ECONOMICS | SCANLON ON INEQUALITY

METARESEARCH AND DEVELOPMENT

Changes in R&D funding and allocation

In a new report on workforce training and technological competitiveness, a task force led by former Commerce Secretary Penny Pritzker describes recent trends in research and development investment. Although “total U.S. R&D funding reached an all-time high of nearly $500 billion in 2015, nearly three percent of U.S. gross domestic product,” the balance in funding has shifted dramatically to the private sector: “federal funding for R&D, which goes overwhelmingly to basic scientific research, has declined steadily and is now at the lowest level since the early 1950s.” One section of the report contains a striking chart of this shift.

Link to the full report. ht Will

  • A deeper dive into the report's sourcing leads to a fascinating repository of data from the American Association for the Advancement of Science on the U.S. government's investments in research since the 1950s. Alongside the shift from majority federal to majority private R&D funding, the proportion of investments across different academic disciplines has also changed significantly. One table shows that the share of federal R&D funding for environmental science, engineering, and math/computer science has grown the most, from a combined 43.2% in 1970 to 54.8% in 2017. Meanwhile, funding for social science research has decreased the most. In 1970, the social sciences received 4.3% of the government's R&D funding; but in 2017, that share had fallen to 1.8%. Much more data on public sector R&D investments is available from the AAAS here.
  • A March 2017 article in Science explains some of these shifts.
  • A section of a 1995 report commissioned by the U.S. Senate Committee on Appropriations charts and contextualizes the explosion of federal research and development funding in the immediate aftermath of the Second World War.
  • A study from the Brookings Institution finds that federal funding for research and development accounts for up to 2.8 percent of GDP in some of the largest metropolitan areas in America. The authors have fifty ideas for how municipalities can capture more of the economic impact generated by that R&D.
  • Michael comments: "With the diminishing share (4.3% to 1.8% of total government research) of halved expenditures—and business not naturally inclined to conduct this kind of research (except in, as one would expect, instances of direct business application like surge pricing and Uber)—social science research appears to no longer have a natural home."
⤷ Full Article

June 23rd, 2018

Yielding Stone

FAIRNESS IN ALGORITHMIC DECISION-MAKING | ADMINISTRATIVE DATA ACCESS

VISIBLE CONSTRAINT

Including protected variables can make algorithmic decision-making more fair 

A recent paper co-authored by JON KLEINBERG, JENS LUDWIG, SENDHIL MULLAINATHAN, and ASHESH RAMBACHAN addresses algorithmic bias, countering the "large literature that tries to 'blind' the algorithm to race to avoid exacerbating existing unfairness in society":  

"This perspective about how to promote algorithmic fairness, while intuitive, is misleading and in fact may do more harm than good. We develop a simple conceptual framework that models how a social planner who cares about equity should form predictions from data that may have potential racial biases. Our primary result is exceedingly simple, yet often overlooked: a preference for fairness should not change the choice of estimator. Equity preferences can change how the estimated prediction function is used (such as setting a different threshold for different groups) but the estimated prediction function itself should not change. Absent legal constraints, one should include variables such as gender and race for fairness reasons.

"Our argument collects together and builds on existing insights to contribute to how we should think about algorithmic fairness.… We empirically illustrate this point for the case of using predictions of college success to make admissions decisions. Using nationally representative data on college students, we underline how the inclusion of a protected variable—race in our application—not only improves predicted GPAs of admitted students (efficiency), but also can improve outcomes such as the fraction of admitted students who are black (equity).

Across a wide range of estimation approaches, objective functions, and definitions of fairness, the strategy of blinding the algorithm to race inadvertently detracts from fairness."

Read the full paper here.
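
To make the "same estimator, different use" point concrete, here is a minimal synthetic sketch: one predictor is fit with the protected attribute included, and the equity preference enters only at the decision stage, through group-specific thresholds. The data-generating process and threshold values are fabricated, not taken from the paper.

```python
# Fit the best available predictor (protected attribute included),
# then express equity preferences via group-specific decision thresholds.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000
group = rng.integers(0, 2, n)            # protected attribute (0/1)
score = rng.normal(0, 1, n)              # e.g. a test score
# Synthetic outcome: success depends on both score and group membership.
success = (score + 0.5 * group + rng.normal(0, 1, n) > 0).astype(int)

X = np.column_stack([score, group])      # do NOT blind the estimator
p = LogisticRegression().fit(X, success).predict_proba(X)[:, 1]

thresholds = {0: 0.5, 1: 0.4}            # hypothetical planner's choice
admit = p > np.vectorize(thresholds.get)(group)
for g in (0, 1):
    print(f"group {g}: admission rate {admit[group == g].mean():.2f}")
```

Blinding would mean dropping `group` from `X`; the paper's argument is that fairness goals are better pursued at the threshold step than by degrading the estimator.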

⤷ Full Article

July 7th, 2018

Quodlibet

RANDOMIZED CONTROLLED TRIALS | HIERARCHY & DESPOTISM

EVIDENCE PUZZLES

The history and politics of RCTs 

In a 2016 working paper, JUDITH GUERON recounts and evaluates the history of randomized controlled trials (RCTs) in the US, drawing on her own experience developing welfare experiments at MDRC and HHS:

“To varying degrees, the proponents of welfare experiments at MDRC and HHS shared three mutually reinforcing goals. The first was to obtain reliable and—given the long and heated controversy about welfare reform—defensible evidence of what worked and, just as importantly, what did not. Over a pivotal ten years from 1975 to 1985, these individuals became convinced that high-quality RCTs were uniquely able to produce such evidence and that there was simply no adequate alternative. Thus, their first challenge was to demonstrate feasibility: that it was ethical, legal, and possible to implement this untried—and at first blush to some people immoral—approach in diverse conditions. The other two goals sprang from their reasons for seeking rigorous evidence. They were not motivated by an abstract interest in methodology or theory; they wanted to inform policy and make government more effective and efficient. As a result, they sought to make the body of studies useful, by assuring that it addressed the most significant questions about policy and practice, and to structure the research and communicate the findings in ways that would increase the potential that they might actually be used." 

⤷ Full Article

July 21st, 2018

High Noon

ACTUARIAL RISK ASSESSMENT | SOCIAL SCIENCE DATA

ALTERNATIVE ACTUARY

History of risk assessment, and some proposed alternate methods 

A 2002 paper by ERIC SILVER and LISA L. MILLER on actuarial risk assessment tools provides a history of statistical prediction in the criminal justice context, and issues cautions now central to the contemporary conversation on algorithmic fairness:

"Much as automobile insurance policies determine risk levels based on the shared characteristics of drivers of similar age, sex, and driving history, actuarial risk assessment tools for predicting violence or recidivism use aggregate data to estimate the likelihood that certain strata of the population will commit a violent or criminal act. 

"To the extent that actuarial risk assessment helps reduce violence and recidivism, it does so not by altering offenders and the environments that produced them but by separating them from the perceived law-abiding populations. Actuarial risk assessment facilitates the development of policies that intervene in the lives of citizens with little or no narrative of purpose beyond incapacitation. The adoption of risk assessment tools may signal the abandonment of a centuries-long project of using rationality, science, and the state to improve upon the social and economic progress of individuals and society."

Link to the paper.
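
At its simplest, the actuarial logic Silver and Miller describe assigns each person the aggregate outcome rate of their stratum. A toy sketch, with fabricated records:

```python
# Stratum-based ("actuarial") risk: individuals inherit the observed
# rate of their group. Records are fabricated for illustration.
from collections import defaultdict

# (age_band, prior_offenses, reoffended)
records = [
    ("18-25", "0-1", True),  ("18-25", "0-1", False),
    ("18-25", "2+",  True),  ("18-25", "2+",  True),
    ("26-40", "0-1", False), ("26-40", "0-1", False),
    ("26-40", "2+",  True),  ("26-40", "2+",  False),
]

counts = defaultdict(lambda: [0, 0])  # stratum -> [reoffenses, total]
for age, priors, reoffended in records:
    counts[(age, priors)][0] += int(reoffended)
    counts[(age, priors)][1] += 1

def predicted_risk(age, priors):
    """The individual's own circumstances play no role beyond membership."""
    k, n = counts[(age, priors)]
    return k / n

print(predicted_risk("18-25", "2+"))  # 1.0 -- the stratum's history, wholesale
```

The docstring is the critique in miniature: prediction by stratum severs risk from any "narrative of purpose" about the individual.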

A more recent paper, presented at FAT* in 2018 and co-authored by CHELSEA BARABAS, KARTHIK DINAKAR, JOICHI ITO, MADARS VIRZA, and JONATHAN ZITTRAIN, makes several arguments reminiscent of Silver and Miller's work. The authors argue for a causal inference framework for risk assessment, aimed at answering the question of "what interventions work":

"We argue that a core ethical debate surrounding the use of regression in risk assessments is not simply one of bias or accuracy. Rather, it's one of purpose.… Data-driven tools provide an immense opportunity for us to pursue goals of fair punishment and future crime prevention. But this requires us to move away from merely tacking on intervenable variables to risk covariates for predictive models, and towards the use of empirically-grounded tools to help understand and respond to the underlying drivers of crime, both individually and systemically."

Link to the paper. 

  • In his 2007 book Against Prediction, lawyer and theorist Bernard Harcourt provided detailed accounts and critiques of the use of actuarial methods throughout the criminal legal system. In place of prediction, Harcourt proposes a conceptual and practical alternative: randomization. From a 2005 paper on the same topic: "Instead of embracing the actuarial turn in criminal law, we should rather celebrate the virtues of the random: randomization, it turns out, is the only way to achieve a carceral population that reflects the offending population. As a form of random sampling, randomization in policing has significant positive value: it reinforces the central moral intuition in the criminal law that similarly situated individuals should have the same likelihood of being apprehended if they offend—regardless of race, ethnicity, gender or class." Link to the paper. (And link to another paper of Harcourt's in the Federal Sentencing Reporter, "Risk as a Proxy for Race.") 
  • A recent paper by Megan Stevenson assesses risk assessment tools: "Despite extensive and heated rhetoric, there is virtually no evidence on how use of this 'evidence-based' tool affects key outcomes such as incarceration rates, crime, or racial disparities. The research discussing what 'should' happen as a result of risk assessment is hypothetical and largely ignores the complexities of implementation. This Article is one of the first studies to document the impacts of risk assessment in practice." Link.
  • A compelling piece of esoterica cited in Harcourt's book: a doctoral thesis by Deborah Rachel Coen on the "probabilistic turn" in 19th century imperial Austria. Link.
⤷ Full Article

July 28th, 2018

Jetty

QUANTITATIVE ECONOMIC HISTORY | BEHAVIORAL ECON POLICY | CHINA IN THE 20TH CENTURY

BANKING AS ART

On the history of economists in central banks 

A recent paper by FRANÇOIS CLAVEAU and JÉRÉMIE DION applies quantitative methods to the historical study of central banks, demonstrating the transition of central banking from an "esoteric art" to a science, the growth of economics research within central banking institutions, and the corresponding rise in the dominance of central banks in the field of monetary economics. From the paper: 

"We study one type of organization, central banks, and its changing relationship with economic science. Our results point unambiguously toward a growing dominance of central banks in the specialized field of monetary economics. Central banks have swelling research armies, they publish a growing share of the articles in specialized scholarly journals, and these articles tend to have more impact today than the articles produced outside central banks."

Link to the paper, which contains a vivid 1929 dialogue between Keynes and Sir Ernest Musgrave Harvey of the Bank of England, who asserts, "It is a dangerous thing to start giving reasons." 
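
The paper's headline quantity, a community's share of journal output over time, is simple to state. A toy version with invented records (the authors' corpus and methods are, of course, far richer):

```python
# Share of articles authored from central banks, by year.
# The records below are invented for illustration.
import pandas as pd

articles = pd.DataFrame({
    "year":         [1990, 1990, 1990, 2010, 2010, 2010, 2010],
    "central_bank": [False, False, True, True, True, False, True],
})

share = articles.groupby("year")["central_bank"].mean()
print(share)  # 1990: ~0.33, 2010: 0.75 -- a rising series is what
              # "growing dominance" looks like in such data
```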

ht the always-excellent Beatrice Cherrier, who highlighted this work in a brief thread and included some visualizations, one of which shows the publishing rate of central banking researchers.

  • Via both Cherrier and the paper, a brief Economist article on the significance of the central banking conference in Jackson Hole, hosted by the Federal Reserve Bank of Kansas City: "Davos for central bankers." Link. (And link to an official history of the conference.)
  • Another paper co-authored by Claveau looks at the history of specialties in economics, using quantitative methods to map the importance of sets of ideas through time. "Among our results, especially noteworthy are (1) the clear-cut existence of ten families of specialties, (2) the disappearance in the late 1970s of a specialty focused on general economic theory, (3) the dispersal of the econometrics-centered specialty in the early 1990s and the ensuing importance of specific econometric methods for the identity of many specialties since the 1990s, and (4) the low level of specialization of individual economists throughout the period in contrast to physicists as early as the late 1960s." Link.
⤷ Full Article