Academic Papers

Approximately Packing Dijoins via Nowhere-Zero Flows

  • Author(s)

    Gérard Cornuéjols, Siyue Liu, R. Ravi

  • Summary

    A new paper by Tepper School doctoral student Siyue Liu and faculty members Gérard Cornuéjols and R. Ravi, "Approximately Packing Dijoins via Nowhere-Zero Flows," explores Woodall's conjecture, which posits that in any digraph, the smallest dicut size equals the largest number of disjoint dijoins (a dijoin is a subset of the arcs of a digraph that meets every dicut). The authors demonstrate a significant finding: a digraph with minimum dicut size τ contains at least τ/6 disjoint dijoins if its underlying undirected graph supports a nowhere-zero 6-flow. They further establish that greater connectivity in the underlying undirected graph yields a better approximation ratio for packing disjoint dijoins. The core of the paper's argument transforms the problem of packing dijoins into one of packing strongly connected digraphs. The researchers connect this to "nowhere-zero (circular) k-flows," a concept from graph theory in which edges are oriented and assigned nonzero flow values that are conserved at every vertex. They show that if a digraph's underlying undirected graph admits a nowhere-zero k-flow, an augmented, specially weighted version of the digraph can be decomposed into two disjoint, strongly connected digraphs. This decomposition allows them to extract a specific number of disjoint arborescences, which in turn yields the existence of disjoint dijoins. The paper thus provides an approximate answer to Woodall's conjecture and offers new perspectives by reformulating it in terms of packing strongly connected orientations. This research offers theoretical insights that apply to various practical fields, especially those involving network design and optimization.
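
    For reference, the two statements discussed above can be sketched informally in LaTeX as follows; the notation (τ for the minimum dicut size) follows this summary rather than the paper's exact formulation.

    ```latex
    % Woodall's conjecture, as described above: in any digraph D = (V, A),
    % the minimum dicut size equals the maximum number of disjoint dijoins.
    \[
      \min_{\text{dicut } C \subseteq A} |C|
      \;=\;
      \max\{\, k : J_1, \dots, J_k \subseteq A \text{ are pairwise disjoint dijoins} \,\}
    \]
    % Main result discussed above: if the underlying undirected graph of D
    % admits a nowhere-zero 6-flow and the minimum dicut size is \tau, then
    % D contains at least \tau/6 pairwise disjoint dijoins.
    \[
      \text{nowhere-zero } 6\text{-flow on the underlying graph of } D
      \;\Longrightarrow\;
      D \text{ contains at least } \tfrac{\tau}{6} \text{ disjoint dijoins.}
    \]
    ```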

Fast Convergence of Frank-Wolfe Algorithms on Polytopes

  • Author(s)

    Elias Wirth, Javier Peña, Sebastian Pokutta

  • Summary

    The paper introduces a unified framework for understanding the convergence rates of various Frank-Wolfe algorithms. Frank-Wolfe algorithms are a class of optimization methods particularly useful when dealing with complex constraints, like those found in "polytopes" (geometric shapes with flat sides, like cubes or pyramids), where traditional projection methods are computationally expensive. The vanilla Frank-Wolfe algorithm, while effective, can be slow in certain scenarios. To address this, variants such as Frank-Wolfe with away steps, blended pairwise steps, and in-face directions have been developed, offering faster "linear" convergence under specific conditions. This research presents a novel "template" that explains how the convergence speeds of these different Frank-Wolfe algorithms, from the basic version to the advanced variants, are determined. This template is based on two key characteristics of the optimization problem itself: an "error bound" and an "extended curvature." Crucially, these properties are "affine-invariant," meaning they remain consistent even if the problem's geometric representation is stretched or skewed, a significant improvement over prior analyses that often relied on specific geometric properties like norms. The paper establishes new convergence rates that bridge the gap between slower sublinear rates and faster linear rates, depending on the strength of the error bound property. It also refines existing results by linking convergence to the "local facial distance," which considers the geometry of the optimal solution set rather than the entire polytope, making the analysis more precise and dimension-independent in many cases. This research would be particularly useful for mathematicians, computer scientists, and engineers working in optimization, machine learning, and operations research. In particular, those developing or applying optimization algorithms for large-scale problems with complex constraints, such as in data analysis, artificial intelligence, or resource allocation, could leverage these findings. The unified framework and sharper convergence guarantees can lead to more efficient and robust optimization methods, allowing for faster and more reliable solutions to such problems.
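
    To make the basic method concrete, here is a minimal Python sketch of the vanilla Frank-Wolfe iteration on the probability simplex (a simple polytope). The quadratic objective, the open-loop step size, and the simplex feasible set are illustrative assumptions, not choices taken from the paper.

    ```python
    import numpy as np

    def frank_wolfe_simplex(grad, x0, n_iters=200):
        """Vanilla Frank-Wolfe on the probability simplex {x >= 0, sum(x) = 1}.

        grad : callable returning the gradient of the objective at x.
        x0   : feasible starting point on the simplex.
        """
        x = x0.copy()
        for t in range(n_iters):
            g = grad(x)
            # Linear minimization oracle over the simplex: the minimizer of
            # <g, s> over the simplex is the vertex e_i with i = argmin_i g_i.
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0
            # Standard open-loop step size; away-step and blended pairwise
            # variants replace this update to obtain faster convergence.
            gamma = 2.0 / (t + 2.0)
            x = (1.0 - gamma) * x + gamma * s
        return x

    # Illustrative use: minimize ||x - b||^2 over the 4-dimensional simplex.
    b = np.array([0.7, 0.2, 0.1, -0.3])
    x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b), np.ones(4) / 4)
    print(x_star)
    ```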

Child Welfare Services in the United States: An Operations Research Perspective

  • Author(s)

    Vincent W. Slaugh, Ludwig Dierks, Alan Scheller-Wolf, Andrew C. Trapp, M. Utku Ünver

  • Summary

    Federal, state, local, and nonprofit agencies in the United States spend more than $40 billion annually to serve vulnerable children. Interactions with the child welfare system can change children’s lives significantly, ideally protecting children from harm and helping families thrive. In a chapter in the edited volume Nonprofit Operations and Supply Chain Management: Theory and Practice, Tepper School of Business researcher Alan Scheller-Wolf and his co-authors suggest that improving operational decisions can be a critical factor in increasing the efficiency, effectiveness, and fairness of child welfare systems. The chapter describes challenges for managing processes at three salient stages of the child welfare system: investigations and pre-removal services, foster care, and adoption. The chapter highlights relevant operations-focused insights from child welfare researchers and the limited research on child welfare by economists and operations researchers. It also identifies important issues to consider in future research and connects them to studies on operations management from related domains. The chapter emphasizes that making better operational decisions can direct limited resources to families needing services, prevent unfounded investigations of blameless families, and help frontline child welfare workers in their critical and demanding jobs.

Multi-Armed Bandits with Endogenous Learning Curves: Applications to Split Liver Transplantation and Personalized Marketing

  • Author(s)

    Yanhan (Savannah) Tang, Andrew Li, Alan Scheller-Wolf

  • Summary

    Many consumers learn about new products through exposure and user experience, commonly referred to as consumer learning. Repeated exposure to and increased knowledge of a product or ad can lead to increased liking, preference, or positive attitudes, known as the mere exposure effect. Marketers can leverage consumer learning and the mere exposure effect by running consistently personalized advertisements to encourage consumers to make purchases and build customer loyalty. In a new study, researchers formulated a multi-armed bandit (MAB) model and applied it to the allocation of split liver transplantation (SLT) at organ transplant centers and to personalized marketing. The model included provisions ensuring that the choices of arms were subject to fairness constraints, to guarantee equity in liver transplantation or preserve variety in personalized marketing. The study's algorithms showed superior numerical performance compared to standard MAB algorithms in settings where learning through experience (the mere exposure effect) and fairness or variety-seeking concerns exist. The use of this model has broad implications for SLT (e.g., using algorithms to help evaluate strategies to increase the adoption of SLT) as well as for other applications (e.g., using algorithms to harness consumer learning and the mere exposure effect to boost sales and brand loyalty).
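
    As a rough illustration of the two ingredients described above (rewards that improve with exposure, and a fairness constraint on how often each arm is played), here is a minimal Python sketch of a generic fairness-constrained bandit heuristic. The minimum-play-fraction rule, the UCB index, and the exposure-dependent reward model are illustrative assumptions and are not the authors' algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fair_ucb(base_means, horizon=2000, min_fraction=0.1, learning_boost=0.2):
        """Generic UCB heuristic with a minimum play fraction per arm.

        Each arm's success probability grows with its own play count (a crude
        stand-in for an endogenous learning curve), and every arm is forced up
        to a minimum share of pulls (a crude fairness/variety rule).
        """
        n_arms = len(base_means)
        counts = np.zeros(n_arms)
        rewards = np.zeros(n_arms)
        for t in range(1, horizon + 1):
            # Fairness: if some arm falls below its guaranteed share, pull it.
            behind = np.where(counts < min_fraction * t)[0]
            if behind.size > 0:
                arm = int(behind[0])
            else:
                # Standard UCB1 index on the empirical means.
                means = rewards / np.maximum(counts, 1)
                bonus = np.sqrt(2 * np.log(t) / np.maximum(counts, 1))
                arm = int(np.argmax(means + bonus))
            # Endogenous learning: reward improves with accumulated exposure.
            p = min(1.0, base_means[arm] + learning_boost * counts[arm] / horizon)
            rewards[arm] += rng.binomial(1, p)
            counts[arm] += 1
        return counts / horizon  # empirical play fractions per arm

    print(fair_ucb(np.array([0.3, 0.5, 0.4])))
    ```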

The Nonstationary Newsvendor with (and Without) Predictions

  • Author(s)

    Lin An, Andrew A. Li, Benjamin Moseley, R. Ravi

  • Summary

    In a new study, researchers examined an updated version of the classic newsvendor problem. In the classic model, a newsvendor must select a quantity of inventory before observing demand, with the demand randomly drawn from a known distribution. Motivated by applications such as cloud provisioning and staffing, the researchers considered a new model, which they termed the nonstationary newsvendor, in which newsvendor-type decisions must be made sequentially in the face of demand drawn from a stochastic process that is both unknown and nonstationary. The study analyzed the nonstationary newsvendor both with and without predictions. The researchers validated their findings with experiments on three real data sets, showing that their policies closed nearly three-quarters of the gap between the best approaches based on nonstationarity alone and on predictions alone.
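
    For context, the classic single-period newsvendor decision mentioned above has a well-known closed form: order the critical-fractile quantile of the demand distribution. The Python sketch below illustrates only that classic baseline; the normal demand model and the cost parameters are illustrative assumptions, and this is not the nonstationary model studied in the paper.

    ```python
    from scipy.stats import norm

    def newsvendor_quantity(mu, sigma, underage_cost, overage_cost):
        """Classic newsvendor: order the critical-fractile quantile
        q* = F^{-1}(c_u / (c_u + c_o)) of the demand distribution F.
        Here F is Normal(mu, sigma), purely for illustration.
        """
        critical_fractile = underage_cost / (underage_cost + overage_cost)
        return norm.ppf(critical_fractile, loc=mu, scale=sigma)

    # Example: mean demand 100, std 20, lost margin 5 per unit short,
    # holding/salvage loss 2 per unit left over.
    print(newsvendor_quantity(mu=100, sigma=20, underage_cost=5, overage_cost=2))
    ```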