- Configuration Balancing for Stochastic Requests (Franziska Eberle, Anupam Gupta, Nicole Megow, Ben Moseley, and Rudy Zhou)
Mathematical Programming, to appear.
@article{EberleGuptaMegow+2024,
author = {Franziska Eberle and Anupam Gupta and Nicole Megow and Ben Moseley and Rudy Zhou},
title = {Configuration Balancing for Stochastic Requests},
journal = {Mathematical Programming},
year = {to appear},
doi = {10.1007/s10107-024-02132-w},
}
The configuration balancing problem with stochastic requests generalizes well-studied resource allocation problems such as load balancing and virtual circuit routing. We are given $m$ resources and $n$ requests; each request has multiple possible \emph{configurations}, each of which increases the load of each resource by some amount. The goal is to select one configuration for each request to minimize the \emph{makespan}: the load of the most-loaded resource. In the stochastic setting, the amount by which a configuration increases the resource load is uncertain until the configuration is chosen, but we are given a probability distribution. We develop both offline and online algorithms for configuration balancing with stochastic requests. When the requests are known offline, we give a non-adaptive policy for configuration balancing with stochastic requests that $O(\frac{\log m}{\log \log m})$-approximates the optimal adaptive policy, which matches a known lower bound for the special case of load balancing on identical machines. When requests arrive online in a list, we give a non-adaptive policy that is $O(\log m)$-competitive. Again, this result is asymptotically tight due to information-theoretic lower bounds for special cases (e.g., for load balancing on unrelated machines). Finally, we show how to leverage adaptivity in the special case of load balancing on \emph{related} machines to obtain a constant-factor approximation offline and an $O(\log \log m)$-approximation online. A crucial technical ingredient in all of our results is a new structural characterization of the optimal adaptive policy that allows us to limit the correlations between its decisions.
- Generalized Assignment and Knapsack Problems in the Random-Order Model (Max Klimm and Martin Knaack)
IPCO 2025 – Proc. 26th Conference on Integer Programming and Combinatorial Optimization.
@inproceedings{KlimmKnaack2025,
author = {Max Klimm and Martin Knaack},
title = {Generalized Assignment and Knapsack Problems in the Random-Order Model},
booktitle = {IPCO 2025 – Proc. 26th Conference on Integer Programming and Combinatorial Optimization},
year = {2025},
}
We study different online optimization problems in the random-order model. There is a finite set of bins with known capacity and a finite set of items arriving in a random order. Upon arrival of an item, its size and its value for each of the bins are revealed, and it has to be decided immediately and irrevocably to which bin the item is assigned, or whether to not assign the item at all. In this setting, an algorithm is $\alpha$-competitive if the total value of all items assigned to the bins is at least an $\alpha$-fraction of the total value of an optimal assignment that knows all items beforehand. We give an algorithm that is $\alpha$-competitive with $\alpha = (1-\ln(2))/2 \approx 1/6.52$, improving upon the previous best algorithm with $\alpha \approx 1/6.99$ for the generalized assignment problem and the previous best algorithm with $\alpha \approx 1/6.65$ for the integral knapsack problem. We then study the fractional knapsack problem, where we have a single bin and it is also allowed to pack items fractionally. For that case, we obtain an algorithm that is $\alpha$-competitive with $\alpha = 1/e \approx 1/2.71$, improving on the previous best algorithm with $\alpha = 1/4.39$. We further show that this competitive ratio is best possible for this model.
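The random-order model is easy to simulate. As an illustration only (this is the classical observe-then-threshold idea in the spirit of secretary algorithms, not the paper's algorithm; all names and the sampling fraction are assumptions), the sketch below watches a prefix of the stream to learn a density threshold and then packs the single fractional bin:

```python
import random

def fractional_opt(items, cap):
    # offline optimum: pack by decreasing value density, splitting the last item
    total = 0.0
    for value, size in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if cap <= 0:
            break
        take = min(size, cap)
        total += value * take / size
        cap -= take
    return total

def observe_then_pack(stream, cap, sample=0.37):
    # watch the first ~37% of the stream without packing, record the best
    # density seen, then fractionally pack later items that clear that bar
    k = int(len(stream) * sample)
    bar = max((v / s for v, s in stream[:k]), default=0.0)
    total = 0.0
    for value, size in stream[k:]:
        if cap <= 0:
            break
        if value / size >= bar:
            take = min(size, cap)
            total += value * take / size
            cap -= take
    return total

random.seed(7)
items = [(random.uniform(1, 10), random.uniform(1, 5)) for _ in range(200)]
random.shuffle(items)                      # items arrive in random order
alg = observe_then_pack(items, cap=20.0)
opt = fractional_opt(items, cap=20.0)
```

The online value can never exceed the offline fractional optimum; the paper's contribution is a scheme whose ratio provably never falls below $1/e$.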
- Valid Cuts for the Design of Potential-based Flow Networks (Pascal Börner, Max Klimm, Annette Lutz, Marc Pfetsch, Martin Skutella, and Lea Strubberg)
IPCO 2025 – Proc. 26th Conference on Integer Programming and Combinatorial Optimization.
@inproceedings{BoernerKlimmLutz+2025,
author = {Pascal Börner and Max Klimm and Annette Lutz and Marc Pfetsch and Martin Skutella and Lea Strubberg},
title = {Valid Cuts for the Design of Potential-based Flow Networks},
booktitle = {IPCO 2025 – Proc. 26th Conference on Integer Programming and Combinatorial Optimization},
year = {2025},
}
The construction of a cost-minimal network for flows obeying physical laws is an important problem in the design of electricity, water, hydrogen, and natural gas infrastructures. We formulate this problem as a mixed-integer non-linear program. Its non-convexity, due to the potential flows, together with the binary variables indicating the decision to build a connection, makes these problems challenging to solve. We develop a novel class of valid inequalities on the fractional relaxations of the binary variables. Further, we show that this class of inequalities can be separated in polynomial time for solutions to a fractional relaxation. This makes it possible to incorporate the inequalities into a branch-and-bound algorithm. The advantage of these inequalities is finally demonstrated in a computational study on the design of real-world gas transport networks.
- Packing a Knapsack with Items Owned by Strategic Agents (Javier Cembrano, Max Klimm, and Martin Knaack)
WINE 2024 – Proc. 20th Conference on Web and Internet Economics.
@inproceedings{CembranoKlimmKnaack2024,
author = {Javier Cembrano and Max Klimm and Martin Knaack},
title = {Packing a Knapsack with Items Owned by Strategic Agents},
booktitle = {WINE 2024 – Proc. 20th Conference on Web and Internet Economics},
year = {2024},
}
This paper considers a scenario where a mechanism designer is interested in selecting items with maximum total value under a knapsack constraint. The items, however, are controlled by strategic agents who aim to maximize the total value of their items in the knapsack. This is a natural setting, e.g., when agencies select projects for funding, companies select products for sale in their shops, or hospitals schedule MRI scans for the day. A mechanism governing the packing of the knapsack is strategyproof if no agent can benefit from hiding items they control from the mechanism. We are interested in mechanisms that are strategyproof and $\alpha$-approximate in the sense that they always approximate the maximum value of the knapsack by a factor of $\alpha \in [0,1]$. First, we give a deterministic mechanism that is $\smash{\frac{1}{3}}$-approximate. For the special case where all items have unit density, we design a $\smash{\frac{1}{\phi}}$-approximate mechanism where $1/\phi \approx 0.618$ is the inverse of the golden ratio. This result is tight as we show that no deterministic strategyproof mechanism with a better approximation exists. We further give randomized mechanisms with approximation guarantees of $1/2$ for the general case and $2/3$ for the case of unit densities. For both cases, no strategyproof mechanism can achieve an approximation guarantee better than $1/(5\phi -7)\approx 0.917$.
- Impartial Selection Under Combinatorial Constraints (Javier Cembrano, Max Klimm, and Arturo Merino)
WINE 2024 – Proc. 20th Conference on Web and Internet Economics.
@inproceedings{CembranoKlimmMerino2024,
author = {Javier Cembrano and Max Klimm and Arturo Merino},
title = {Impartial Selection Under Combinatorial Constraints},
booktitle = {WINE 2024 – Proc. 20th Conference on Web and Internet Economics},
year = {2024},
}
Impartial selection problems are concerned with the selection of one or more agents from a set based on mutual nominations from within the set. To avoid strategic nominations of the agents, the axiom of impartiality requires that the selection of each agent is independent of the nominations cast by that agent. This paper initiates the study of impartial selection problems where the nominations are weighted and the set of agents that can be selected is restricted by a combinatorial constraint. We call a selection mechanism $\alpha$-optimal if, for every instance, the ratio between the total sum of weighted nominations of the selected set and that of the best feasible set of agents is at least $\alpha$. We show that a natural extension of a mechanism studied for the selection of a single agent remains impartial and $\frac{1}{4}$-optimal for general independence systems, and we generalize upper bounds from the selection of multiple agents by parameterizing them by the girth of the independence system. We then focus on independence systems defined by knapsack and matroid constraints, giving impartial mechanisms that exploit a greedy order of the agents and achieve approximation ratios of $\frac{1}{3}$ and $\frac{1}{2}$, respectively, when agents cast a single nomination. For graphic matroids, we further devise an impartial and $\frac{1}{3}$-optimal mechanism for an arbitrary number of nominations.
- The Polyhedral Geometry of Truthful Auctions (Michael Joswig, Max Klimm, and Sylvain Spitz)
Mathematical Programming, 2024.
@article{JoswigKlimmSpitz2024,
author = {Michael Joswig and Max Klimm and Sylvain Spitz},
title = {The Polyhedral Geometry of Truthful Auctions},
journal = {Mathematical Programming},
year = {2024},
doi = {10.1007/s10107-024-02168-y},
}
The difference set of an outcome in an auction is the set of types that the auction mechanism maps to the outcome. We give a complete characterization of the geometry of the difference sets that can appear for a dominant strategy incentive compatible multi-unit auction, showing that they correspond to regular subdivisions of the unit cube. Similarly, we describe the geometry for affine maximizers for $n$ players and $m$ items, showing that they correspond to regular subdivisions of the $m$-fold product of $(n-1)$-dimensional simplices. These observations are then used to construct mechanisms that are robust in the sense that the sets of items allocated to the players change only slightly when the players' reported types are changed slightly.
- Impartial Selection with Additive Guarantees via Iterated Deletion (Javier Cembrano, Felix Fischer, David Hannon, and Max Klimm)
Games and Economic Behavior, 144:203–224, 2024.
@article{CembranoFischerHannon+2024,
author = {Javier Cembrano and Felix Fischer and David Hannon and Max Klimm},
title = {Impartial Selection with Additive Guarantees via Iterated Deletion},
journal = {Games and Economic Behavior},
year = {2024},
doi = {10.1016/j.geb.2024.01.008},
volume = {144},
pages = {203--224},
}
Impartial selection is the selection of an individual from a group based on nominations by other members of the group, in such a way that individuals cannot influence their own chance of selection. For this problem, we give a deterministic mechanism with an additive performance guarantee of $O(n^{(1+\kappa)/2})$ in a setting with $n$ individuals where each individual casts $O(n^{\kappa})$ nominations, where $\kappa\in[0,1]$. For $\kappa=0$, i.e., when each individual casts at most a constant number of nominations, this bound is $O(\sqrt{n})$. This matches the best-known guarantee for randomized mechanisms and a single nomination. For $\kappa=1$ the bound is $O(n)$. This is trivial, as even a mechanism that never selects provides an additive guarantee of $n-1$. We show, however, that it is also best possible: for every deterministic impartial mechanism there exists a situation in which some individual is nominated by every other individual and the mechanism either does not select or selects an individual not nominated by anyone.
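The impartiality axiom is concrete enough to check exhaustively on tiny instances. The sketch below is illustrative and not from the paper: `plurality` with fixed tie-breaking is a deliberately non-impartial strawman, and `is_impartial` brute-forces whether any agent can change its own selection status by re-voting.

```python
from itertools import combinations, product

def subsets(pool):
    # all possible nomination sets an agent can cast over the other agents
    return [frozenset(c) for r in range(len(pool) + 1)
            for c in combinations(pool, r)]

def plurality(profile):
    # select the most-nominated agent, ties broken by lowest index
    n = len(profile)
    scores = [sum(i in noms for noms in profile) for i in range(n)]
    return scores.index(max(scores))

def is_impartial(mechanism, n):
    # impartial iff no agent changes whether *it* is selected by re-voting
    options = [subsets([j for j in range(n) if j != i]) for i in range(n)]
    for profile in product(*options):
        for i in range(n):
            for alt in options[i]:
                deviation = list(profile)
                deviation[i] = alt
                if (mechanism(profile) == i) != (mechanism(tuple(deviation)) == i):
                    return False
    return True
```

For three agents the check exposes plurality as non-impartial (an agent can withdraw a nomination that breaks a tie against itself), whereas a constant mechanism trivially passes.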
- Deterministic Impartial Selection with Weights (Javier Cembrano, Svenja M. Griesbach, and Maximilian J. Stahlberg)
ACM Transactions on Economics and Computation, 12(3):10:1–10:22, 2024.
@article{CembranoGriesbachStahlberg2024,
author = {Javier Cembrano and Svenja M. Griesbach and Maximilian J. Stahlberg},
title = {Deterministic Impartial Selection with Weights},
journal = {ACM Transactions on Economics and Computation},
year = {2024},
volume = {12},
number = {3},
pages = {10:1--10:22},
doi = {10.1145/3677177},
arxiv = {2310.14991},
}
In the impartial selection problem, a subset of agents up to a fixed size $k$ among a group of $n$ is to be chosen based on votes cast by the agents themselves. A selection mechanism is \emph{impartial} if no agent can influence its own chance of being selected by changing its vote. It is \emph{$\alpha$-optimal} if, for every instance, the ratio between the votes received by the selected subset and the votes received by the subset of size $k$ with the highest number of votes is at least $\alpha$. We study deterministic impartial mechanisms in a more general setting with arbitrarily weighted votes and provide the first approximation guarantee, roughly $1/\lceil 2n/k\rceil$. When the number of agents to select is large enough compared to the total number of agents, this yields an improvement on the previously best known approximation ratio of $1/k$ for the unweighted setting. We further show that our mechanism can be adapted to the impartial assignment problem, in which multiple sets of up to $k$ agents are to be selected, with a loss in the approximation ratio of $1/2$.
- Fractionally Subadditive Maximization under an Incremental Knapsack Constraint with Applications to Incremental Flows (Yann Disser, Max Klimm, Annette Lutz, and David Weckbecker)
SIAM Journal on Discrete Mathematics, 38(4), 2024.
@article{DisserKlimmLutz+2024,
author = {Yann Disser and Max Klimm and Annette Lutz and David Weckbecker},
title = {Fractionally Subadditive Maximization under an Incremental Knapsack Constraint with Applications to Incremental Flows},
journal = {SIAM Journal on Discrete Mathematics},
year = {2024},
volume = {38},
number = {4},
doi = {10.1137/23M1569265},
}
We consider the problem of maximizing a fractionally subadditive function under an increasing knapsack constraint. An incremental solution to this problem is given by an order in which to include the elements of the ground set, and the competitive ratio of an incremental solution is defined by the worst ratio over all capacities relative to an optimum solution of the corresponding capacity. We present an algorithm that finds an incremental solution of competitive ratio at most $\max\{3.293\sqrt{M},2M\}$, under the assumption that the values of singleton sets are in the range $[1,M]$, and we give a lower bound of $\max\{2.618, M\}$ on the attainable competitive ratio. In addition, we establish that our framework captures potential-based flows between two vertices, and we give a lower bound of $\max\{2,M\}$ and an upper bound of $2M$ for the incremental maximization of classical flows with capacities in $[1,M]$ which is tight for the unit capacity case.
- Book Embeddings of $k$-Framed Graphs and $k$-Map Graphs (Michael A. Bekos, Giordano Da Lozzo, Svenja M. Griesbach, Martin Gronemann, Fabrizio Montecchiani, and Chrysanthi Raftopoulou)
Discrete Mathematics, 347:113690, 2024.
@article{BekosLozzoGriesbach+2024,
author = {Michael A. Bekos and Giordano Da Lozzo and Svenja M. Griesbach and Martin Gronemann and Fabrizio Montecchiani and Chrysanthi Raftopoulou},
title = {Book Embeddings of $k$-Framed Graphs and $k$-Map Graphs},
journal = {Discrete Mathematics},
doi = {10.1016/j.disc.2023.113690},
volume = {347},
number = {1},
pages = {113690},
year = {2024},
}
An embedding of a graph in a book, called book embedding, consists of a linear ordering of its vertices along the spine of the book and an assignment of its edges to the pages of the book, so that no two edges on the same page cross. The book thickness of a graph is the minimum number of pages over all its book embeddings. For planar graphs, a fundamental result is due to Yannakakis, who proposed an algorithm to compute embeddings of planar graphs in books with four pages. Our main contribution is a technique that generalizes this result to a much wider family of nonplanar graphs, namely to $k$-map graphs. In fact, our technique can deal with any nonplanar graph having a biconnected skeleton of crossing-free edges whose faces have bounded degree. We prove that this family of graphs has book thickness bounded in $k$, and as a corollary, we obtain the first constant upper bound for the book thickness of optimal $2$-planar graphs.
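The crossing condition in a book embedding is purely combinatorial: two edges on the same page cross exactly when their endpoints strictly interleave along the spine. A small validity checker (an illustrative sketch, not code from the paper) makes the definition concrete, using K4, whose book thickness is 2 because its two chords must not share a page:

```python
from itertools import combinations

def crosses(e, f, pos):
    # two edges cross iff their endpoints strictly interleave on the spine
    a, b = sorted((pos[e[0]], pos[e[1]]))
    c, d = sorted((pos[f[0]], pos[f[1]]))
    return a < c < b < d or c < a < d < b

def is_book_embedding(order, pages):
    # pages: one edge list per page; valid iff no same-page pair crosses
    pos = {v: i for i, v in enumerate(order)}
    return all(not crosses(e, f, pos)
               for page in pages
               for e, f in combinations(page, 2))

order = [0, 1, 2, 3]                                    # spine ordering of K4
two_pages = [[(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)],  # chord (0,2) fits here
             [(1, 3)]]                                  # chord (1,3) gets its own page
one_page = [[(0, 1), (1, 2), (2, 3), (0, 3), (0, 2), (1, 3)]]
```

With the spine order `0,1,2,3`, the chords `(0,2)` and `(1,3)` interleave, so the one-page assignment is rejected while the two-page assignment is accepted.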
- Accelerating Matroid Optimization through Fast Imprecise Oracles (Franziska Eberle, Felix Hommelsheim, Alexander Lindermayr, Zhenwei Liu, Nicole Megow, and Jens Schlöter)
NeurIPS 2024 – Proc. 38th Annual Conference on Neural Information Processing Systems.
@inproceedings{EberleHommelsheimLindermayr+2024,
author = {Franziska Eberle and Felix Hommelsheim and Alexander Lindermayr and Zhenwei Liu and Nicole Megow and Jens Schlöter},
title = {Accelerating Matroid Optimization through Fast Imprecise Oracles},
year = {2024},
booktitle = {NeurIPS 2024 – Proc. 38th Annual Conference on Neural Information Processing Systems},
arxiv = {2402.02774},
}
Querying complex models for precise information (e.g., traffic models, database systems, large ML models) often entails intense computations and results in long response times. Thus, weaker models which give imprecise results quickly can be advantageous, provided inaccuracies can be resolved using few queries to a stronger model. In the fundamental problem of computing a maximum-weight basis of a matroid, a well-known generalization of many combinatorial optimization problems, algorithms have access to a clean oracle to query matroid information. We additionally equip algorithms with a fast but dirty oracle modelling an unknown, potentially different matroid. We design and analyze practical algorithms which use only a few clean queries relative to the quality of the dirty oracle, while maintaining robustness against arbitrarily poor dirty matroids, approaching the performance of classic algorithms for the given problem. Notably, we prove that our algorithms are, in many respects, best possible. Further, we outline extensions to other matroid oracle types, non-free dirty oracles, and other matroid problems.
- Bicriterial Approximation for the Incremental Prize-Collecting Steiner-Tree Problem (Yann Disser, Svenja M. Griesbach, Max Klimm, and Annette Lutz)
ESA 2024 – Proc. 32nd European Symposium on Algorithms, pp. 47:1–47:16.
@inproceedings{DisserGriesbachKlimm+2024,
author = {Yann Disser and Svenja M. Griesbach and Max Klimm and Annette Lutz},
title = {Bicriterial Approximation for the Incremental Prize-Collecting Steiner-Tree Problem},
booktitle = {ESA 2024 – Proc. 32nd European Symposium on Algorithms},
year = {2024},
pages = {47:1--47:16},
doi = {10.4230/LIPIcs.ESA.2024.47},
}
We consider an incremental variant of the rooted prize-collecting Steiner-tree problem with a growing budget constraint. While no incremental solution exists that simultaneously approximates the optimum for all budgets, we show that a bicriterial $(\alpha,\mu)$-approximation is possible, i.e., a solution that, for all $B \in \mathbb{R}_{\geq 0}$, with budget $B+\alpha$ is a $\mu$-approximation of the optimum solution with budget $B$. For the case that the underlying graph is a tree, we present a polynomial density-greedy algorithm that computes a $(\chi,1)$-approximation, where $\chi$ denotes the eccentricity of the root vertex in the underlying graph, and show that this is best possible. An adaptation of the density-greedy algorithm for general graphs is $(\gamma,2)$-competitive, where $\gamma$ is the maximal length of a vertex-disjoint path starting in the root. While this algorithm is not polynomial, it can be adapted to a $(\gamma,3)$-competitive algorithm that runs in polynomial time. We further devise a capacity-scaling algorithm that guarantees a $(3\chi,8)$-approximation and, more generally, a $\smash{((4\ell - 1)\chi, \frac{2^{\ell + 2}}{2^{\ell}-1})}$-approximation for every fixed $\ell \in \mathbb{N}$. A polynomial variant of this algorithm yields a $\smash{((4\ell - 1)\chi, \frac{2^{\ell + 3}}{2^{\ell}-1})}$-approximation.
- Optimizing Throughput and Makespan of Queuing Systems by Information Design (Svenja M. Griesbach, Max Klimm, Philipp Warode, and Theresa Ziemke)
ESA 2024 – Proc. 32nd European Symposium on Algorithms, pp. 62:1–62:18.
@inproceedings{GriesbachKlimmWarode+2024,
author = {Svenja M. Griesbach and Max Klimm and Philipp Warode and Theresa Ziemke},
title = {Optimizing Throughput and Makespan of Queuing Systems by Information Design},
booktitle = {ESA 2024 – Proc. 32nd European Symposium on Algorithms},
year = {2024},
pages = {62:1--62:18},
doi = {10.4230/LIPIcs.ESA.2024.62},
}
We study the optimal provision of information for two natural performance measures of queuing systems: throughput and makespan. A set of parallel links (queues) is equipped with deterministic capacities and stochastic travel times, where the latter depend on a realized scenario, and the number of scenarios is assumed to be constant. A continuum of flow particles (users) arrives at the system at a constant rate. A system operator knows the realization of the scenario and may (partially) reveal this information via a public signaling scheme to the flow particles. Upon arrival, the flow particles observe the signal issued by the system operator, form an updated belief about the realized scenario, and decide on a link to use. Inflow into a link exceeding the link's capacity builds up in a queue that increases the travel time on the link. Dynamic inflow rates are in a Bayesian dynamic equilibrium when the expected travel time along all links with positive inflow is equal at every point in time. For a given time horizon $T$, the throughput induced by a signaling scheme is the total volume of flow that leaves the links in the interval $[0,T]$. We show that the public signaling scheme that maximizes the throughput may involve irrational numbers and provide an additive polynomial time approximation scheme (PTAS) that approximates the optimal throughput up to an arbitrary additive constant $\epsilon>0$. The algorithm solves a Lagrangian dual of the signaling problem with the ellipsoid method, whose separation oracle is implemented by a cell decomposition technique. We also provide a multiplicative fully polynomial time approximation scheme (FPTAS) that does not rely on strong duality and, thus, also allows computing the optimal signals. It uses a different cell decomposition technique together with a piecewise convex under-estimator of the optimal value function.
Finally, we consider the makespan of a Bayesian dynamic equilibrium, which is defined as the last point in time when a given total value of flow has left the system. Using a variational inequality argument, we show that full information revelation is a public signaling scheme that minimizes the makespan.
- Information Design for Congestion Games with Unknown Demand (Svenja M. Griesbach, Martin Hoefer, Max Klimm, and Tim Koglin)
AAAI 2024 – Proc. 38th Annual Conference on Artificial Intelligence, pp. 9722–9730.
@inproceedings{GriesbachHoeferKlimm+2024,
author = {Svenja M. Griesbach and Martin Hoefer and Max Klimm and Tim Koglin},
title = {Information Design for Congestion Games with Unknown Demand},
booktitle = {AAAI 2024 – Proc. 38th Annual Conference on Artificial Intelligence},
year = {2024},
pages = {9722--9730},
doi = {10.1609/aaai.v38i9.28830},
}
We study a novel approach to information design in the standard traffic model of network congestion games. It captures the natural condition that the demand is unknown to the users of the network. A principal (e.g., a mobility service) commits to a signaling strategy, observes the realized demand and sends a (public) signal to agents (i.e., users of the network). Based on the induced belief about the demand, the users then form an equilibrium. We consider the algorithmic goal of the principal: Compute a signaling scheme that minimizes the expected total cost of the induced equilibrium. We concentrate on single-commodity networks and affine cost functions, for which we obtain the following results. First, we devise a fully polynomial-time approximation scheme (FPTAS) for the case that the demand can only take two values. It relies on several structural properties of the cost of the induced equilibrium as a function of the updated belief about the distribution of demands. We show that this function is piecewise linear for any number of demands, and monotonic for two demands. Second, we give a complete characterization of the graph structures for which it is optimal to fully reveal the information about the realized demand. This signaling scheme turns out to be optimal for all cost functions and probability distributions over demands if and only if the graph is series-parallel. Third, we propose an algorithm that computes the optimal signaling scheme for any number of demands whose time complexity is polynomial in the number of supports that occur in a Wardrop equilibrium for some demand. Finally, we conduct a computational study that tests this algorithm on real-world instances.
- O(1/$\epsilon$) Is the Answer in Online Weighted Throughput Maximization (Franziska Eberle)
STACS 2024 – Proc. 41st International Symposium on Theoretical Aspects of Computer Science, pp. 32:1–32:15.
@inproceedings{Eberle2024,
author = {Franziska Eberle},
booktitle = {STACS 2024 – Proc. 41st International Symposium on Theoretical Aspects of Computer Science},
title = {O(1/{\(\epsilon\)}) Is the Answer in Online Weighted Throughput Maximization},
pages = {32:1--32:15},
year = {2024},
doi = {10.4230/LIPICS.STACS.2024.32},
}
We study a fundamental online scheduling problem where jobs with processing times, weights, and deadlines arrive online over time at their release dates. The task is to preemptively schedule these jobs on a single or multiple (possibly unrelated) machines with the objective of maximizing the weighted throughput, i.e., the total weight of jobs that complete before their deadline. To overcome known lower bounds for the competitive analysis, we assume that each job arrives with some slack $\varepsilon > 0$; that is, the time window for processing job $j$ on any machine $i$ on which it can be executed has length at least $(1+\varepsilon)$ times $j$'s processing time on machine $i$. Our contribution is a best possible online algorithm for weighted throughput maximization on unrelated machines: Our algorithm is $O\big(\frac1\varepsilon\big)$-competitive, which matches the lower bound for unweighted throughput maximization on a single machine. Even for a single machine, it was not known whether the problem with weighted jobs is "harder" than the problem with unweighted jobs. Thus, we answer this question and settle weighted throughput maximization on a single machine with a best-possible competitive ratio of $\Theta\big(\frac1\varepsilon\big)$. While we focus on non-migratory schedules, on identical machines, our algorithm achieves the same (up to constants) performance guarantee when compared to an optimal migratory schedule.
- Scheduling On a Stochastic Number of Machines (Moritz Buchem, Franziska Eberle, Hugo Kooki Kasuya Rosado, Kevin Schewior, and Andreas Wiese)
APPROX 2024 – Proc. International Workshop on Approximation Algorithms for Combinatorial Optimization Problems.
@inproceedings{BuchemEberle+2024,
author = {Moritz Buchem and Franziska Eberle and Hugo Kooki Kasuya Rosado and Kevin Schewior and Andreas Wiese},
booktitle = {APPROX 2024 – Proc. International Workshop on Approximation Algorithms for Combinatorial Optimization Problems},
series = {LIPIcs},
title = {Scheduling On a Stochastic Number of Machines},
year = {2024},
arxiv = {2407.15737},
}
We consider a new scheduling problem on parallel identical machines in which the number of machines is initially unknown but follows a given probability distribution. The actual number of machines is revealed only after all jobs have been assigned to a given number of bags. Subsequently, the jobs need to be assigned to the machines without splitting the bags. This is the stochastic version of a related problem introduced by Stein and Zhong [SODA 2018, TALG 2020] and is motivated, for example, by bundling jobs that need to be scheduled by data centers. We present two PTASs for the stochastic setting, computing job-to-bag assignments that (i) minimize the expected maximum machine load and (ii) maximize the expected minimum machine load (as in the Santa Claus problem), respectively. The former result follows by careful enumeration combined with known PTASs. For the latter result, we introduce an intricate dynamic program that we apply to a suitably rounded instance.
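Evaluating a fixed job-to-bag assignment under a machine-count distribution is straightforward to sketch. The snippet below is illustrative only: it uses the LPT heuristic as a stand-in for the optimal bag-to-machine assignment (the paper's PTAS is far more involved), and all names are assumptions.

```python
import heapq

def lpt_makespan(bags, m):
    # place unsplittable bags on m machines, largest first, always onto the
    # currently least-loaded machine (LPT heuristic)
    loads = [0.0] * m          # a list of zeros is already a valid min-heap
    for b in sorted(bags, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + b)
    return max(loads)

def expected_makespan(bags, dist):
    # dist maps each possible number of machines to its probability
    return sum(p * lpt_makespan(bags, m) for m, p in dist.items())
```

For instance, bags of sizes `[3, 2, 2, 1]` yield makespan 4 on two machines and 8 on one, so a fair coin over one or two machines gives an expected makespan of 6.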
- Spatiotemporal Reconstruction of Ancient Road Networks Through Sequential Cost–Benefit Analysis (Maximilian J. Stahlberg, Guillaume Sagnol, Benjamin Ducke, and Max Klimm)
PNAS Nexus, 2(2), 2023.
@article{StahlbergSagnolDucke+2022,
author = {Maximilian J. Stahlberg and Guillaume Sagnol and Benjamin Ducke and Max Klimm},
title = {Spatiotemporal Reconstruction of Ancient Road Networks Through Sequential Cost--Benefit Analysis},
journal = {PNAS Nexus},
year = {2023},
volume = {2},
number = {2},
doi = {10.1093/pnasnexus/pgac313},
}
The construction of ancient road networks spanned generations and exhibits temporal path dependence that is not fully captured by established network formation models that are used to support archaeological reasoning. We introduce an evolutionary model that explicitly captures the sequential nature of road network formation: A central feature is that connections are added successively and according to an optimal cost–benefit trade-off with respect to existing connections. In this model, the network topology emerges rapidly from early decisions, a trait that makes it possible to identify plausible road construction orders in practice. Based on this observation, we develop a method to compress the search space of path-dependent optimization problems. We use this method to show that the model's assumptions on ancient decision making allow the reconstruction of partially known road networks from the Roman era in good detail and from sparse archaeological evidence. In particular, we identify missing links in the major road network of ancient Sardinia that are in good agreement with expert predictions.
- On the Robustness of Potential-Based Flow Networks (Max Klimm, Marc E. Pfetsch, Rico Raber, and Martin Skutella)
Mathematical Programming, 197:337–374, 2023.
@article{KlimmPfetschRaber+2022a,
author = {Max Klimm and Marc E. Pfetsch and Rico Raber and Martin Skutella},
doi = {10.1007/s10107-021-01760-w},
journal = {Mathematical Programming},
title = {On the Robustness of Potential-Based Flow Networks},
year = {2023},
volume = {197},
pages = {337--374},
}
Potential-based flows provide a simple yet realistic mathematical model of transport in many real-world infrastructure networks such as, e.g., gas or water networks, where the flow along each edge depends on the difference of the potentials at its end nodes. We call a network topology robust if the maximal node potential needed to satisfy a set of demands never increases when demands are decreased. This notion of robustness is motivated by infrastructure networks where users first make reservations for certain demands that may be larger than the actual flows sent later on. In these networks, node potentials correspond to physical quantities such as pressures or hydraulic heads and must be guaranteed to lie within a fixed range, even if the actual amounts are smaller than the previously reserved demands. Our main results are a precise characterization of robust network topologies for the case of point-to-point demands via forbidden node-labeled graph minors, as well as an efficient algorithm for testing robustness.
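On a path, the model is easy to make concrete. Assuming a Weymouth-type law $p_u - p_v = \beta_e\, q_e |q_e|$ on each edge $e$ (one common instantiation; the paper treats general potential-based flows, and the function name below is an assumption), the source potential needed to push a demand to the sink is monotone in the demand, which is exactly the robustness property on this trivial topology:

```python
def source_potential(betas, demand, sink_potential=0.0):
    # walk the path edge by edge, accumulating the potential drop
    # beta * q * |q| required to sustain the flow q = demand
    p = sink_potential
    for beta in betas:
        p += beta * demand * abs(demand)
    return p
```

With edge coefficients `[1.0, 2.0]`, demand 2 requires source potential 12 while demand 1 requires only 3; on more complex topologies this monotonicity can fail, which is what the forbidden-minor characterization pins down.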
- Speed-robust scheduling: sand, bricks, and rocks (Franziska Eberle, Ruben Hoeksma, Nicole Megow, Lukas Nölke, Kevin Schewior, and Bertrand Simon)
Mathematical Programming, 197(2):1009–1048, 2023.
@article{EberleHoeksmaMegow+2023,
author = {Franziska Eberle and Ruben Hoeksma and Nicole Megow and Lukas N{\"{o}}lke and Kevin Schewior and Bertrand Simon},
title = {Speed-robust scheduling: sand, bricks, and rocks},
journal = {Mathematical Programming},
volume = {197},
number = {2},
pages = {1009--1048},
year = {2023},
doi = {10.1007/s10107-022-01829-0},
}
The speed-robust scheduling problem is a two-stage problem where, given $m$ machines, jobs must be grouped into at most $m$ bags while the processing speeds of the machines are unknown. After the speeds are revealed, the grouped jobs must be assigned to the machines without being separated. To evaluate the performance of algorithms, we determine upper bounds on the worst-case ratio of the algorithm's makespan and the optimal makespan given full information. We refer to this ratio as the robustness factor. We give an algorithm with a robustness factor $2-\frac{1}{m}$ for the most general setting and improve this to $1.8$ for equal-size jobs. For the special case of infinitesimal jobs, we give an algorithm with an optimal robustness factor equal to $\frac{e}{e-1}\approx 1.58$. The particular machine environment in which all machines have either speed $0$ or $1$ was studied before by Stein and Zhong (ACM Trans.\ Alg.\ 2020). For this setting, we provide an algorithm for scheduling infinitesimal jobs with an optimal robustness factor of $\frac{1+\sqrt{2}}{2}\approx 1.207$. It lays the foundation for an algorithm matching the lower bound of $\frac{4}{3}$ for equal-size jobs.
- Reduction of Potential-Based Flow Networks (Max Klimm, Marc Pfetsch, Rico Raber, and Martin Skutella)
Mathematics of Operations Research, 48(4):2287–2303, 2023.
@article{KlimmPfetschRaber+2023,
author = {Max Klimm and Marc Pfetsch and Rico Raber and Martin Skutella},
title = {Reduction of Potential-Based Flow Networks},
journal = {Mathematics of Operations Research},
volume = {48},
number = {4},
year = {2023},
pages = {2287--2303},
doi = {10.1287/moor.2022.1338},
}
We consider potential-based flow networks with terminal nodes at which flow can enter or leave the network and physical properties such as voltages or pressures are measured and controlled. We study conditions under which such a network can be reduced to a smaller, equivalent network with the same behavior at the terminal nodes. Potential-based flow networks are widely used to model infrastructure networks such as electricity, gas, or water networks. In contrast to Kron's reduction for electrical networks, we prove that, in general, potential-based flow networks with at least three terminals cannot be reduced to smaller networks whose size only depends on the number of terminals. On the other hand, we show that it is possible to represent a special class of potential-based flow networks by a complete graph on the terminals, and we establish a characterization of networks that can be reduced to a path network. Our results build on fundamental properties of effective resistances proved in this paper, including explicit formulae for their dependence on edge resistances of the network and their metric properties.
- Online Throughput Maximization on Unrelated Machines: Commitment is No Burden (Franziska Eberle, Nicole Megow, and Kevin Schewior)
ACM Transactions on Algorithms, 19(1):10:1–10:25, 2023.
@article{EberleMegowSchewior2023,
author = {Franziska Eberle and Nicole Megow and Kevin Schewior},
title = {Online Throughput Maximization on Unrelated Machines: Commitment is No Burden},
journal = {ACM Transactions on Algorithms},
volume = {19},
number = {1},
pages = {10:1--10:25},
year = {2023},
}
We consider a fundamental online scheduling problem in which jobs with processing times and deadlines arrive online over time at their release dates. The task is to determine a feasible preemptive schedule on a single or multiple possibly unrelated machines that maximizes the number of jobs that complete before their deadline. Due to strong impossibility results for competitive analysis on a single machine, we require that jobs contain some \emph{slack} $\varepsilon>0$, which means that the feasible time window for scheduling a job is at least $1+\varepsilon$ times its processing time on each eligible machine. Our contribution is two-fold: (i) we give the first non-trivial online algorithms for throughput maximization on unrelated machines, and (ii), as the main focus of our paper, we answer the question of how to handle commitment requirements, which enforce that a scheduler has to guarantee the completion of admitted jobs at a certain point in time. This is very relevant, e.g., for providing cloud-computing services, as it disallows last-minute rejections of critical tasks. We present an algorithm for unrelated machines that is $\Theta\big(\frac{1}{\varepsilon}\big)$-competitive when the scheduler must commit upon starting a job. Somewhat surprisingly, this is the same optimal performance bound (up to constants) as for scheduling without commitment on a single machine. If commitment decisions must be made before a job's slack becomes less than a $\delta$-fraction of its size, we prove a competitive ratio of $\mathcal{O}\big(\frac{1}{\varepsilon - \delta}\big)$ for $0 < \delta < \varepsilon$. This result nicely interpolates between commitment upon starting a job and commitment upon arrival. For the latter commitment model, it is known that no (randomized) online algorithm admits any bounded competitive ratio. While we mainly focus on scheduling without migration, our results also hold when comparing against a migratory optimal solution in the case of identical machines.
- Nonconvex Equilibrium Models for Energy Markets: Exploiting Price Information to Determine the Existence of an Equilibrium (Julia Grübel, Olivier Huber, Lukas Hümbs, Max Klimm, Martin Schmidt, and Alexandra Schwartz)
Optimization Methods and Software, 38(1), 2023.
@article{GruebelHuberHuembs+2023,
author = {Julia Gr{\"u}bel and Olivier Huber and Lukas H{\"u}mbs and Max Klimm and Martin Schmidt and Alexandra Schwartz},
journal = {Optimization Methods and Software},
title = {Nonconvex Equilibrium Models for Energy Markets: Exploiting Price Information to Determine the Existence of an Equilibrium},
year = {2023},
volume = {38},
number = {1},
doi = {10.1080/10556788.2022.2117358},
}
Motivated by examples from the energy sector, we consider market equilibrium problems (MEPs) involving players with nonconvex strategy spaces or objective functions, where the latter are assumed to be linear in market prices. We propose an algorithm that determines if an equilibrium of such an MEP exists and that computes an equilibrium in case of existence. Three key prerequisites have to be met. First, appropriate bounds on market prices have to be derived from necessary optimality conditions of some players. Second, a technical assumption is required for those prices that are not uniquely determined by the derived bounds. Third, nonconvex optimization problems have to be solved to global optimality. We test the algorithm on well-known instances from the power and gas literature that meet these three prerequisites. There, nonconvexities arise from considering the transmission system operator as an additional player besides producers and consumers who, e.g., switches lines or faces nonlinear physical laws. Our numerical results indicate that equilibria often exist, especially for the case of continuous nonconvexities in the context of gas market problems.
- Incremental Maximization via Continuization (Yann Disser, Max Klimm, Kevin Schewior, and David Weckbecker)
ICALP 2023 – Proc. 50th International Colloquium on Automata, Languages and Programming.
@inproceedings{DisserKlimmSchewior+2023,
arxiv = {2305.01310},
author = {Yann Disser and Max Klimm and Kevin Schewior and David Weckbecker},
title = {Incremental Maximization via Continuization},
booktitle = {ICALP 2023 – Proc. 50th International Colloquium on Automata, Languages and Programming},
year = {2023},
doi = {10.4230/LIPIcs.ICALP.2023.47},
}
We consider the problem of finding an incremental solution to a cardinality-constrained maximization problem that not only captures the solution for a fixed cardinality, but also describes how to gradually grow the solution as the cardinality bound increases. The goal is to find an incremental solution that guarantees a good competitive ratio against the optimum solution for all cardinalities simultaneously. The central challenge is to characterize maximization problems where this is possible, and to determine the best-possible competitive ratio that can be attained. A lower bound of $2.18$ and an upper bound of $\varphi + 1 \approx 2.618$ are known on the competitive ratio for monotone and accountable objectives [Bernstein et al., Math. Prog., 2022], which capture a wide range of maximization problems. We introduce a continuization technique and identify an optimal incremental algorithm that provides strong evidence that $\varphi+1$ is the best-possible competitive ratio. Using this continuization, we obtain an improved lower bound of $2.246$ by studying a particular recurrence relation whose characteristic polynomial has complex roots exactly beyond the lower bound. Based on the optimal continuous algorithm combined with a scaling approach, we also provide a $1.772$-competitive randomized algorithm. We complement this with a randomized lower bound of $1.447$ via Yao's principle.
- The Polyhedral Geometry of Truthful Auctions (Michael Joswig, Max Klimm, and Sylvain Spitz)
IPCO 2023 – Proc. 24th Conference on Integer Programming and Combinatorial Optimization, pp. 231–245.
@inproceedings{JoswigKlimmSpitz2023,
arxiv = {2211.01907},
author = {Michael Joswig and Max Klimm and Sylvain Spitz},
title = {The Polyhedral Geometry of Truthful Auctions},
booktitle = {IPCO 2023 – Proc. 24th Conference on Integer Programming and Combinatorial Optimization},
year = {2023},
pages = {231--245},
doi = {10.1007/978-3-031-32726-1_17},
}
The difference set of an outcome in an auction is the set of types that the auction mechanism maps to the outcome. We give a complete characterization of the geometry of the difference sets that can appear for a dominant strategy incentive compatible multi-unit auction, showing that they correspond to regular subdivisions of the unit cube. This observation is then used to construct mechanisms that are robust in the sense that the set of items allocated to a player changes only slightly when the player's reported type is changed slightly.
- Configuration Balancing for Stochastic Requests (Franziska Eberle, Anupam Gupta, Nicole Megow, Benjamin Moseley, and Rudy Zhou)
IPCO 2023 – Proc. 24th Conference on Integer Programming and Combinatorial Optimization, Springer, pp. 127–141.
@inproceedings{EberleGuptaMegow+2023,
author = {Franziska Eberle and Anupam Gupta and Nicole Megow and Benjamin Moseley and Rudy Zhou},
title = {Configuration Balancing for Stochastic Requests},
booktitle = {IPCO 2023 – Proc. 24th Conference on Integer Programming and Combinatorial Optimization},
series = {Lecture Notes in Computer Science},
volume = {13904},
pages = {127--141},
publisher = {Springer},
year = {2023},
doi = {10.1007/978-3-031-32726-1\_10},
}
The configuration balancing problem with stochastic requests generalizes well-studied resource allocation problems such as load balancing and virtual circuit routing. We are given $m$ resources and $n$ requests; each request has multiple possible \emph{configurations}, each of which increases the load of each resource by some amount. The goal is to select one configuration for each request to minimize the \emph{makespan}: the load of the most-loaded resource. In the stochastic setting, the amount by which a configuration increases the resource load is uncertain until the configuration is chosen, but we are given a probability distribution. We develop both offline and online algorithms for configuration balancing with stochastic requests. When the requests are known offline, we give a non-adaptive policy for configuration balancing with stochastic requests that $O(\frac{\log m}{\log \log m})$-approximates the optimal adaptive policy, which matches a known lower bound for the special case of load balancing on identical machines. When requests arrive online in a list, we give a non-adaptive policy that is $O(\log m)$-competitive. Again, this result is asymptotically tight due to information-theoretic lower bounds for special cases (e.g., for load balancing on unrelated machines). Finally, we show how to leverage adaptivity in the special case of load balancing on \emph{related} machines to obtain a constant-factor approximation offline and an $O(\log \log m)$-approximation online. A crucial technical ingredient in all of our results is a new structural characterization of the optimal adaptive policy that allows us to limit the correlations between its decisions.
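To make the objective concrete, here is a minimal sketch (instance and names invented, not from the paper) of evaluating the makespan of a configuration choice in the deterministic setting: each chosen configuration adds its load vector to the resources, and the makespan is the load of the most-loaded resource.

```python
# Hypothetical illustration of the configuration balancing objective.
def makespan(configs, choice):
    """configs[i][j] is the load vector of configuration j of request i;
    choice[i] picks a configuration index for request i."""
    m = len(configs[0][0])          # number of resources
    loads = [0.0] * m
    for i, j in enumerate(choice):
        for r in range(m):
            loads[r] += configs[i][j][r]
    return max(loads)               # load of the most-loaded resource

# Two resources, two requests with two configurations each (made-up numbers).
configs = [
    [(2.0, 0.0), (0.0, 3.0)],  # request 0
    [(1.0, 1.0), (0.0, 2.0)],  # request 1
]
print(makespan(configs, [0, 0]))  # loads (3, 1) -> 3.0
print(makespan(configs, [1, 0]))  # loads (1, 4) -> 4.0
```

In the stochastic setting studied in the paper, the load vectors are random and only their distributions are known before a configuration is chosen.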
- Improved Approximation Algorithms for the Expanding Search Problem (Svenja M. Griesbach, Felix Hommelsheim, Max Klimm, and Kevin Schewior)
ESA 2023 – Proc. 31st European Symposium on Algorithms, pp. 54:1–54:15.
@inproceedings{GriesbachHommelsheimKlimm+2023,
author = {Svenja M. Griesbach and Felix Hommelsheim and Max Klimm and Kevin Schewior},
title = {Improved Approximation Algorithms for the Expanding Search Problem},
booktitle = {ESA 2023 – Proc. 31st European Symposium on Algorithms},
year = {2023},
pages = {54:1--54:15},
doi = {10.4230/LIPIcs.ESA.2023.54},
}
A searcher faces a graph with edge lengths and vertex weights, initially having explored only a given starting vertex. In each step, the searcher adds an edge to the solution that connects an unexplored vertex to an explored vertex. This requires an amount of time equal to the edge length. The goal is to minimize the weighted sum of the exploration times over all vertices. We show that this problem is hard to approximate and provide algorithms with improved approximation guarantees. For the general case, we give a $(2e+\varepsilon)$-approximation for any $\varepsilon > 0$. For the case that all vertices have unit weight, we provide a $2e$-approximation. Finally, we provide a PTAS for the case of a Euclidean graph. Previously, for all cases only an $8$-approximation was known.
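The objective described above can be stated as a short computation. A hedged sketch (graph, weights, and function names invented here) of evaluating the weighted sum of exploration times for a given expanding-search sequence:

```python
# Hypothetical illustration of the expanding search objective.
def expanding_search_cost(edges, lengths, weights, root, order):
    """edges: list of (u, v); order: indices into `edges`.
    Each step adds an edge joining an explored vertex to an unexplored one;
    the new vertex's exploration time is the total edge length spent so far."""
    explored = {root}
    elapsed = 0.0
    cost = 0.0  # the root is explored at time 0
    for i in order:
        u, v = edges[i]
        if (u in explored) == (v in explored):
            raise ValueError("edge must connect an explored to an unexplored vertex")
        new = v if u in explored else u
        elapsed += lengths[i]
        cost += weights.get(new, 0.0) * elapsed
        explored.add(new)
    return cost

# Tiny star example: root r, leaves a (weight 3) and b (weight 1).
edges = [("r", "a"), ("r", "b")]
lengths = [2.0, 5.0]
weights = {"a": 3.0, "b": 1.0}
print(expanding_search_cost(edges, lengths, weights, "r", [0, 1]))  # 3*2 + 1*7 = 13.0
print(expanding_search_cost(edges, lengths, weights, "r", [1, 0]))  # 1*5 + 3*7 = 26.0
```

The example shows why the order matters: exploring the heavy vertex first gives cost 13 instead of 26. The hard part, addressed by the paper, is choosing a near-optimal order.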
- Improved Bounds for Single-Nomination Impartial Selection (Javier Cembrano, Felix Fischer, and Max Klimm)
EC 2023 – Proc. 24th ACM Conference on Economics and Computation, pp. 449.
@inproceedings{CembranoFischerKlimm2023,
author = {Javier Cembrano and Felix Fischer and Max Klimm},
title = {Improved Bounds for Single-Nomination Impartial Selection},
booktitle = {EC 2023 – Proc. 24th ACM Conference on Economics and Computation},
year = {2023},
doi = {10.1145/3580507.3597693},
pages = {449},
arxiv = {2305.09998},
}
We give new bounds for the single-nomination model of impartial selection, a problem proposed by Holzman and Moulin (Econometrica, 2013). A selection mechanism, which may be randomized, selects one individual from a group of $n$ based on nominations among members of the group; a mechanism is impartial if the selection of an individual is independent of nominations cast by that individual, and $\alpha$-optimal if under any circumstance the expected number of nominations received by the selected individual is at least $\alpha$ times that received by any individual. In a many-nominations model, where individuals may cast an arbitrary number of nominations, the so-called permutation mechanism is $1/2$-optimal, and this is best possible. In the single-nomination model, where each individual casts exactly one nomination, the permutation mechanism does better and prior to this work was known to be $67/108$-optimal but no better than $2/3$-optimal. We show that it is in fact $2/3$-optimal for all $n$. This result is obtained via tight bounds on the performance of the mechanism for graphs with maximum degree $\Delta$, for any $\Delta$, which we prove using an adversarial argument. We then show that the permutation mechanism is not best possible; indeed, by combining the permutation mechanism, another mechanism called plurality with runner-up, and some new ideas, $2105/3147$-optimality can be achieved for all $n$. We finally give new upper bounds on $\alpha$ for any $\alpha$-optimal impartial mechanism. They improve on the existing upper bounds for all $n \geq 7$ and imply that no impartial mechanism can be better than $76/105$-optimal for all $n$; they do not preclude the existence of a $(3/4-\varepsilon)$-optimal impartial mechanism for arbitrary $\varepsilon > 0$ if $n$ is large.
- Complexity of Equilibria in Binary Public Goods Games on Undirected Graphs (Max Klimm and Maximilian J. Stahlberg)
EC 2023 – Proc. 24th ACM Conference on Economics and Computation, pp. 938–955.
@inproceedings{KlimmStahlberg2023,
arxiv = {2301.11849},
author = {Max Klimm and Maximilian J. Stahlberg},
title = {Complexity of Equilibria in Binary Public Goods Games on Undirected Graphs},
booktitle = {EC 2023 – Proc. 24th ACM Conference on Economics and Computation},
year = {2023},
pages = {938--955},
doi = {10.1145/3580507.3597780},
}
We study the complexity of computing equilibria in binary public goods games on undirected graphs. In such a game, players correspond to vertices in a graph and face a binary choice of performing an action, or not. Each player's decision depends only on the number of neighbors in the graph who perform the action and is encoded by a per-player binary pattern. We show that games with decreasing patterns (where players only want to act up to a threshold number of adjacent players doing so) always have a pure Nash equilibrium and that one is reached from any starting profile by following a polynomially bounded sequence of best responses. For non-monotonic patterns of the form $10^k10^*$ (where players want to act alone or alongside $k+1$ neighbors), we show that it is NP-hard to decide whether a pure Nash equilibrium exists. We further investigate a generalization of the model that permits ties of varying strength: an edge with integral weight $w$ behaves as $w$ parallel edges. While, in this model, a pure Nash equilibrium still exists for decreasing patterns, we show that the task of computing one is PLS-complete.
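For the decreasing-pattern case, the convergent best-response dynamics can be sketched in a few lines. This is a hedged illustration (graph and thresholds invented): player $v$ wants to act iff fewer than a threshold number of its neighbors act, matching the "act up to a threshold" description above.

```python
# Hypothetical illustration: best-response dynamics for a binary public goods
# game with a decreasing threshold pattern. Per the result above, such
# dynamics reach a pure Nash equilibrium after polynomially many updates.
def best_response_dynamics(adj, thresholds, profile):
    profile = dict(profile)
    changed = True
    while changed:
        changed = False
        for v in adj:
            acting = sum(profile[u] for u in adj[v])
            best = 1 if acting < thresholds[v] else 0  # decreasing pattern
            if best != profile[v]:
                profile[v] = best
                changed = True
    return profile

# Path a - b - c; everyone wants to act only if no neighbor does (threshold 1).
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
thresholds = {"a": 1, "b": 1, "c": 1}
eq = best_response_dynamics(adj, thresholds, {"a": 0, "b": 0, "c": 0})
print(eq)  # the endpoints act, the middle player does not
```

In the equilibrium found here, the two endpoints act while the middle player, seeing two acting neighbors, does not; no player can improve by deviating.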
- Deterministic Impartial Selection with Weights (Javier Cembrano, Svenja M. Griesbach, and Maximilian J. Stahlberg)
WINE 2023 – Proc. 19th Conference on Web and Internet Economics, Springer Nature Switzerland, pp. 151–168.
@inproceedings{CembranoGriesbachStahlberg2023,
author = {Javier Cembrano and Svenja M. Griesbach and Maximilian J. Stahlberg},
title = {Deterministic Impartial Selection with Weights},
editor = {Jugal Garg and Max Klimm and Yuqing Kong},
booktitle = {WINE 2023 – Proc. 19th Conference on Web and Internet Economics},
series = {LNCS},
volume = {14413},
publisher = {Springer Nature Switzerland},
address = {Cham},
year = {2023},
pages = {151--168},
doi = {10.1007/978-3-031-48974-7\_9},
arxiv = {2310.14991},
}
In the impartial selection problem, a subset of agents up to a fixed size $k$ among a group of $n$ is to be chosen based on votes cast by the agents themselves. A selection mechanism is impartial if no agent can influence its own chance of being selected by changing its vote. It is $\alpha$-optimal if, for every instance, the number of votes received by the selected subset is at least an $\alpha$ fraction of the votes received by the subset of size $k$ with the highest number of votes. We study deterministic impartial mechanisms in a more general setting with arbitrarily weighted votes and provide the first approximation guarantee, roughly $1/\lceil 2n/k\rceil$. When the number of agents to select is large enough compared to the total number of agents, this yields an improvement on the previously best known approximation ratio of $1/k$ for the unweighted setting. We further show that our mechanism can be adapted to the impartial assignment problem, in which multiple sets of up to $k$ agents are to be selected, with a loss in the approximation ratio of $1/2$.
- WINE 2023 – Proc. 19th Conference on Web and Internet Economics (Jugal Garg, Max Klimm, Yuqing Kong, eds.), Springer, 2023.
@proceedings{GargKlimmKong2023,
doi = {10.1007/978-3-031-48974-7},
editor = {Jugal Garg and Max Klimm and Yuqing Kong},
publisher = {Springer},
series = {LNCS},
title = {WINE 2023 – Proc. 19th Conference on Web and Internet Economics},
volume = {14413},
year = {2023},
}
- Impartial Rank Aggregation (Javier Cembrano, Felix Fischer, and Max Klimm)
Preprint, 2023.
@techreport{CembranoFischerKlimm+2023,
author = {Javier Cembrano and Felix Fischer and Max Klimm},
title = {Impartial Rank Aggregation},
year = {2023},
arxiv = {2310.13141},
}
We study functions that produce a ranking of $n$ individuals from $n$ such rankings and are impartial in the sense that the position of an individual in the output ranking does not depend on the input ranking submitted by that individual. When $n \geq 4$, two properties concerning the quality of the output in relation to the input can be achieved in addition to impartiality: individual full rank, which requires that each individual can appear in any position of the output ranking; and monotonicity, which requires that an individual cannot move down in the output ranking if it moves up in an input ranking. When $n \geq 5$, monotonicity can be dropped to strengthen individual full rank to weak unanimity, requiring that a ranking submitted by every individual must be chosen as the output ranking. Mechanisms achieving these results can be implemented in polynomial time. Both results are best possible in terms of their dependence on $n$. The second result cannot be strengthened further to a notion of unanimity that requires agreement on pairwise comparisons to be preserved.
- Multidimensional Political Apportionment (Javier Cembrano, Jose Correa, and Victor Verdugo)
Proceedings of the National Academy of Sciences, 119(15):e2109305119, 2022.
@article{CembranoCorreaVerdugo2022,
author = {Javier Cembrano and Jose Correa and Victor Verdugo},
doi = {10.1073/pnas.2109305119},
journal = {Proceedings of the National Academy of Sciences},
number = {15},
pages = {e2109305119},
title = {Multidimensional Political Apportionment},
volume = {119},
year = {2022},
}
Deciding how to allocate the seats of a deliberative assembly is one of the most fundamental problems in the political organization of societies and has been widely studied for more than two centuries. The idea of proportionality is at the core of most approaches to tackle this problem, and this notion is captured by the divisor methods, such as the Jefferson/D'Hondt method. In a seminal work, Balinski and Demange extended the single-dimensional idea of divisor methods to the setting in which the seat allocation is simultaneously determined by two dimensions and proposed the so-called biproportional apportionment method. The method, currently used in several electoral systems, is, however, limited to two dimensions and the question of extending it is considered to be an important problem both theoretically and in practice. In this work, we initiate the study of multidimensional proportional apportionment. We first formalize a notion of multidimensional proportionality that naturally extends that of Balinski and Demange. By means of analyzing an appropriate integer linear program we are able to prove that, in contrast to the two-dimensional case, the existence of multidimensional proportional apportionments is not guaranteed and deciding their existence is a computationally hard problem ($\mathsf{NP}$-complete). Interestingly, our main result asserts that it is possible to find approximate multidimensional proportional apportionments that deviate from the marginals by a small amount. The proof arises through the lens of discrepancy theory, mainly inspired by the celebrated Beck–Fiala theorem. We finally evaluate our approach by using the data from the recent 2021 Chilean Constitutional Convention election.
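The single-dimensional divisor method mentioned above is easy to state algorithmically. A minimal sketch of the Jefferson/D'Hondt method (vote counts invented for illustration): seats are assigned one at a time to the party with the largest current quotient votes/(seats won + 1).

```python
# Illustrative D'Hondt allocation; made-up vote counts.
import heapq

def dhondt(votes, seats):
    allocation = {p: 0 for p in votes}
    # Max-heap on quotients, negated for Python's min-heap.
    heap = [(-v, p) for p, v in votes.items()]
    heapq.heapify(heap)
    for _ in range(seats):
        _, p = heapq.heappop(heap)           # party with largest quotient
        allocation[p] += 1
        # Next quotient for p uses divisor (seats won + 1).
        heapq.heappush(heap, (-votes[p] / (allocation[p] + 1), p))
    return allocation

print(dhondt({"A": 100, "B": 80, "C": 30}, 8))  # {'A': 4, 'B': 3, 'C': 1}
```

The biproportional and multidimensional variants studied in the paper additionally require the allocation to respect marginals along several dimensions simultaneously, which is where the hardness results arise.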
- Packing Under Convex Quadratic Constraints (Max Klimm, Marc E. Pfetsch, Rico Raber, and Martin Skutella)
Mathematical Programming, 192:361–386, 2022.
@article{KlimmPfetschRaber+2022b,
author = {Max Klimm and Marc E. Pfetsch and Rico Raber and Martin Skutella},
doi = {10.1007/s10107-021-01675-6},
journal = {Mathematical Programming},
pages = {361--386},
title = {Packing Under Convex Quadratic Constraints},
volume = {192},
year = {2022},
}
We consider a general class of binary packing problems with a convex quadratic knapsack constraint. We prove that these problems are $\mathsf{APX}$-hard to approximate and present constant-factor approximation algorithms based upon two different algorithmic techniques: a rounding technique tailored to a convex relaxation in conjunction with a non-convex relaxation, and a greedy strategy. We further show that a combination of these techniques can be used to yield a monotone algorithm leading to a strategyproof mechanism for a game-theoretic variant of the problem. Finally, we present a computational study of the empirical approximation of these algorithms for problem instances arising in the context of real-world gas transport networks.
- Equilibria in Multi-Class and Multi-Dimensional Atomic Congestion Games (Max Klimm and Andreas Schütz)
Mathematics of Operations Research, 47(4):2547–3399, 2022.
@article{KlimmSchuetz2022,
author = {Max Klimm and Andreas Sch{\"u}tz},
doi = {10.1287/moor.2021.1223},
journal = {Mathematics of Operations Research},
title = {Equilibria in Multi-Class and Multi-Dimensional Atomic Congestion Games},
year = {2022},
volume = {47},
number = {4},
pages = {2547--3399},
}
This paper studies the existence of pure Nash equilibria in atomic congestion games with different user classes, where the cost of each resource depends on the aggregated demand of each class. A set of cost functions is called consistent for this class if all games with cost functions from the set have a pure Nash equilibrium. We give a complete characterization of consistent sets of cost functions, showing that the only consistent sets are sets of certain affine functions and sets of certain exponential functions. This characterization is also extended to a larger class of games where each atomic player may control flow that belongs to different classes.
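For context, equilibrium existence in the classic single-class case follows from Rosenthal's potential function, which every best response strictly decreases. A hedged sketch (single-class only, with an invented two-link instance; the multi-class setting of the paper is more delicate):

```python
# Illustrative single-class atomic congestion game with Rosenthal's potential
#   Phi(s) = sum_r sum_{k=1}^{n_r(s)} c_r(k).
def resource_loads(strategies):
    loads = {}
    for s in strategies.values():
        for r in s:
            loads[r] = loads.get(r, 0) + 1
    return loads

def rosenthal_potential(strategies, costs):
    loads = resource_loads(strategies)
    return sum(sum(costs[r](k) for k in range(1, n + 1)) for r, n in loads.items())

def player_cost(player, strategies, costs):
    loads = resource_loads(strategies)
    return sum(costs[r](loads[r]) for r in strategies[player])

def best_response_dynamics(strategies, choices, costs):
    strategies = dict(strategies)
    improved = True
    while improved:           # terminates: each switch decreases the potential
        improved = False
        for p in strategies:
            for alt in choices[p]:
                trial = dict(strategies)
                trial[p] = alt
                if player_cost(p, trial, costs) < player_cost(p, strategies, costs):
                    strategies = trial
                    improved = True
                    break
    return strategies

# Two players, two parallel links with per-unit costs c(k) = k and c(k) = 2k.
costs = {"e1": lambda k: k, "e2": lambda k: 2 * k}
choices = {1: [("e1",), ("e2",)], 2: [("e1",), ("e2",)]}
start = {1: ("e2",), 2: ("e2",)}
eq = best_response_dynamics(start, choices, costs)
print(eq)  # player 1 moves to the cheap link; no further improvement exists
```

The dynamics stop exactly at a pure Nash equilibrium; the paper's contribution is characterizing when such equilibria are still guaranteed once costs depend on per-class demands.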
- Parametric Computation of Minimum Cost Flows with Piecewise Quadratic Costs (Max Klimm and Philipp Warode)
Mathematics of Operations Research, 47(1):812–846, 2022.
@article{KlimmWarode2021,
author = {Max Klimm and Philipp Warode},
doi = {10.1287/moor.2021.1151},
journal = {Mathematics of Operations Research},
number = {1},
pages = {812--846},
title = {Parametric Computation of Minimum Cost Flows with Piecewise Quadratic Costs},
volume = {47},
year = {2022},
}
We develop algorithms solving parametric flow problems with separable, continuous, piecewise quadratic, and strictly convex cost functions. The parameter to be considered is a common multiplier on the demand of all nodes. Our algorithms compute a family of flows that are each feasible for the respective demand and minimize the costs among the feasible flows for that demand. For single commodity networks with homogeneous cost functions, our algorithm requires one matrix multiplication for the initialization, a rank-1 update for each nondegenerate step, and the solution of a convex quadratic program for each degenerate step. For nonhomogeneous cost functions, the initialization requires the solution of a convex quadratic program instead. For multi-commodity networks, both the initialization and every step of the algorithm require the solution of a convex program. Since each step is mirrored by a breakpoint in the output, this yields output-polynomial algorithms in every case.
- Generalized Permutahedra and Optimal Auctions (Michael Joswig, Max Klimm, and Sylvain Spitz)
SIAM Journal on Applied Algebra and Geometry, 6(1):711–739, 2022.
@article{JoswigKlimmSpitz2022,
author = {Michael Joswig and Max Klimm and Sylvain Spitz},
title = {Generalized Permutahedra and Optimal Auctions},
journal = {SIAM Journal on Applied Algebra and Geometry},
year = {2022},
volume = {6},
number = {1},
pages = {711--739},
doi = {10.1137/21M1441286},
}
We study a family of convex polytopes, called SIM-bodies, which were introduced by Giannakopoulos and Koutsoupias (2018) to analyze so-called Straight-Jacket Auctions. First, we show that the SIM-bodies belong to the class of generalized permutahedra. Second, we prove an optimality result for the Straight-Jacket Auctions among certain deterministic auctions. Third, we employ computer algebra methods and mathematical software to explicitly determine optimal prices and revenues.
- Online Load Balancing with General Reassignment Cost (Sebastian Berndt, Franziska Eberle, and Nicole Megow)
Operations Research Letters, 50(3):322–328, 2022.
@article{BerndtEberleMegow2022,
author = {Sebastian Berndt and Franziska Eberle and Nicole Megow},
doi = {10.1016/j.orl.2022.03.007},
journal = {Operations Research Letters},
number = {3},
pages = {322--328},
title = {Online Load Balancing with General Reassignment Cost},
volume = {50},
year = {2022},
}
We investigate a semi-online variant of load balancing with restricted assignment. In this problem, we are given $n$ jobs, which need to be processed by $m$ machines, with the goal of minimizing the maximum machine load. Since strong lower bounds rule out any competitive ratio of $o(\log n)$, we may reassign jobs at a certain job-individual cost. We generalize a result by Gupta, Kumar, and Stein (SODA 2014) by giving an $O(\log \log mn)$-competitive algorithm with constant amortized reassignment cost.
- PICOS: A Python interface to conic optimization solvers (Guillaume Sagnol and Maximilian J. Stahlberg)
Journal of Open Source Software, 7(70):3915, 2022.
@article{SagnolStahlberg2022,
author = {Guillaume Sagnol and Maximilian J. Stahlberg},
doi = {10.21105/joss.03915},
journal = {Journal of Open Source Software},
number = {70},
pages = {3915},
title = {{PICOS}: A {Python} interface to conic optimization solvers},
volume = {7},
year = {2022},
}
Many convex and mixed integer constrained optimization problems can be compiled into a canonical form and handed to an off-the-shelf optimization solver for numeric solution. However, the necessary reformulations can be technically challenging, and a program written to use one solver cannot easily be made to use another. PICOS is a well-established Python meta-interface to many proprietary and open source optimization solvers that allows problems to be input in a natural algebraic form and that handles solver selection and the required transformations transparently. This enables users to focus more on the high-level optimization model and its application and less on technicalities.
- Supersonic Overland Without a Sonic Boom: Quantitative Assessment of Mach-Cutoff Flight (Bernd Liebhardt, Florian Linke, and Martin Knaack)
Journal of Aircraft, 59(5):1257–1266, 2022.
@article{LiebhardtLinkeKnaack,
author = {Bernd Liebhardt and Florian Linke and Martin Knaack},
title = {Supersonic Overland Without a Sonic Boom: Quantitative Assessment of Mach-Cutoff Flight},
journal = {Journal of Aircraft},
volume = {59},
number = {5},
pages = {1257--1266},
year = {2022},
doi = {10.2514/1.C036637},
}
When flying at low supersonic speeds, rising temperatures and favorable winds can deflect the sonic boom shock waves so that they do not reach the ground. This work presents a computation methodology that incorporates atmospheric sonic ray tracing, topography, and realistic three-dimensional atmospheres in order to optimize supersonic overland cruise with respect to speed. Flight missions are simulated on several suitable city pairs using numerous atmospheric conditions. Flight times and fuel consumption are compared to subsonic high-speed missions.
- Competitive Algorithms for Symmetric Rendezvous on the Line (Max Klimm, Guillaume Sagnol, Martin Skutella, and Khai Van Tran)
SODA 2022 – Proc. 33rd ACM-SIAM Symposium on Discrete Algorithms, pp. 329–347.
@inproceedings{KlimmSagnolSkutella+2022,
author = {Max Klimm and Guillaume Sagnol and Martin Skutella and Khai Van Tran},
booktitle = {SODA 2022 – Proc. 33rd ACM-SIAM Symposium on Discrete Algorithms},
doi = {10.1137/1.9781611977073.16},
pages = {329--347},
title = {Competitive Algorithms for Symmetric Rendezvous on the Line},
year = {2022},
}
In the Symmetric Rendezvous Search on the Line with Unknown Initial Distance, two identical agents are placed on the real line with their distance, the other's location, and their orientation unknown to them. Moving along the line at unit speed and executing the same randomized search strategy, the agents' goal is to meet up as early as possible. The expected meeting time obviously depends on the unknown initial distance and orientations. The quality of a randomized search strategy is thus measured by its competitive ratio, that is, the ratio of the expected meeting time and the earliest possible meeting time (half the initial distance). We present a class of successively refined randomized search strategies together with a rigorous mathematical analysis of their continuously improved competitive ratios. These strategies all rely on the basic idea of performing an infinite sequence of steps of geometrically increasing size in random directions, always returning to the agent's initial position before starting the next step. In addition, our more refined strategies use two novel ideas. First, remembering their past random choices, the agents randomly choose the direction of the next step in a Markov-chain-like manner. Second, choosing the next few random directions in advance, each agent may combine consecutive steps in the same direction into one longer step. As our main result, we show that this combination of looking into the past as well as into the future leads to a substantially improved competitive ratio of 13.93 compared to the previously best known bound of 24.85 (Ozsoyeller et al. 2013).
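The basic building block of these strategies is easy to simulate. Below is a minimal Monte-Carlo sketch of the plain geometric-steps strategy only (not the refined Markov-chain or look-ahead variants from the paper); the base 2.0, the time horizon, and all other parameter values are illustrative choices, so the estimated ratio is not one of the paper's bounds.

```python
import random

def trajectory(origin, orient, base, max_time, rng):
    """Waypoints (time, position) of one agent running the basic strategy:
    step i walks distance base**i in a fresh random direction at unit speed,
    then returns to the agent's initial position."""
    pts, t, i = [(0.0, origin)], 0.0, 0
    while t < max_time:
        length = base ** i
        direction = orient * rng.choice((-1, 1))
        pts.append((t + length, origin + direction * length))  # walk out
        pts.append((t + 2 * length, origin))                   # walk back
        t += 2 * length
        i += 1
    return pts

def position(pts, t):
    """Linear interpolation of a piecewise-linear trajectory."""
    for (t0, p0), (t1, p1) in zip(pts, pts[1:]):
        if t0 <= t <= t1:
            return p0 if t1 == t0 else p0 + (p1 - p0) * (t - t0) / (t1 - t0)
    return pts[-1][1]

def meeting_time(p1, p2):
    """First time the two piecewise-linear trajectories coincide, if any."""
    times = sorted({t for t, _ in p1} | {t for t, _ in p2})
    prev_t = times[0]
    prev_gap = position(p1, prev_t) - position(p2, prev_t)
    for t in times[1:]:
        gap = position(p1, t) - position(p2, t)
        if prev_gap == 0.0:
            return prev_t
        if prev_gap * gap <= 0.0:  # gap is linear on [prev_t, t]; find its root
            return prev_t + (t - prev_t) * abs(prev_gap) / (abs(prev_gap) + abs(gap))
        prev_t, prev_gap = t, gap
    return None

def estimate_ratio(d=1.0, base=2.0, trials=200, seed=7):
    """Estimate E[meeting time] / (d/2) for initial distance d."""
    rng = random.Random(seed)
    total, met = 0.0, 0
    for _ in range(trials):
        p1 = trajectory(0.0, rng.choice((-1, 1)), base, 50 * d, rng)
        p2 = trajectory(d, rng.choice((-1, 1)), base, 50 * d, rng)
        T = meeting_time(p1, p2)
        if T is not None:
            total += T
            met += 1
    return (total / met) / (d / 2)
```

Since the agents close the gap at relative speed at most 2, every simulated meeting time is at least $d/2$, so the estimated ratio is always at least 1.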
- Public Signals in Network Congestion Games (Svenja M. Griesbach, Martin Hoefer, Max Klimm, and Tim Koglin)
EC 2022 – Proc. 23rd ACM Conference on Economics and Computation, pp. 736.
@inproceedings{GriesbachHoeferKlimm+2022,
arxiv = {2205.09823},
author = {Svenja M. Griesbach and Martin Hoefer and Max Klimm and Tim Koglin},
booktitle = {EC 2022 – Proc. 23rd ACM Conference on Economics and Computation},
doi = {10.1145/3490486.3538349},
pages = {736},
title = {Public Signals in Network Congestion Games},
year = {2022},
}
We consider a largely untapped potential for the improvement of traffic networks that is rooted in the inherent uncertainty of travel times. Travel times are subject to stochastic uncertainty resulting from various parameters such as weather condition, occurrences of road works, or traffic accidents. Large mobility services have an informational advantage over single network users as they are able to learn traffic conditions from data. A benevolent mobility service may use this informational advantage in order to steer the traffic equilibrium into a favorable direction. The resulting optimization problem is a task commonly referred to as signaling or Bayesian persuasion. Previous work has shown that the underlying signaling problem can be NP-hard to approximate within any non-trivial bounds, even for affine cost functions with stochastic offsets. In contrast, we show that in this case, the signaling problem is easy for many networks. We tightly characterize the class of single-commodity networks, in which full information revelation is always an optimal signaling strategy. Moreover, we construct a reduction from optimal signaling to computing an optimal collection of support vectors for the Wardrop equilibrium. For two states, this insight can be used to compute an optimal signaling scheme. The algorithm runs in polynomial time whenever the number of different supports resulting from any signal distribution is bounded by a polynomial in the input size. Using a cell decomposition technique, we extend the approach to a polynomial-time algorithm for multi-commodity parallel link networks with a constant number of commodities, even when we have a constant number of different states of nature.
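For intuition, full information revelation simply means that users play the Wardrop equilibrium of each realized state. A minimal sketch for two parallel links with affine latencies; the instance and the two state offsets are invented for illustration and are not taken from the paper.

```python
def wardrop_two_links(a1, b1, a2, b2, d):
    """Wardrop equilibrium for demand d on two parallel links with affine
    latencies l1(x) = a1*x + b1 and l2(x) = a2*x + b2 (a1 + a2 > 0).
    Returns (flow on link 1, equilibrium latency experienced by the flow)."""
    x1 = (a2 * d + b2 - b1) / (a1 + a2)  # solves l1(x1) = l2(d - x1)
    x1 = min(max(x1, 0.0), d)            # clip when only one link is used
    latency = a1 * x1 + b1 if x1 > 0 else a2 * d + b2
    return x1, latency

# Full revelation with two equally likely states of nature: the stochastic
# offset b2 of link 2 is announced, and users equilibrate per state.
states = [1.0, 0.25]                       # invented offsets b2 per state
costs = [wardrop_two_links(1.0, 0.0, 0.0, b2, 1.0)[1] for b2 in states]
```

The first state is the classic Pigou instance ($\ell_1(x) = x$, $\ell_2(x) = 1$, demand 1), where the equilibrium routes everything on link 1 at latency 1.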
- Impartial Selection with Additive Guarantees via Iterated Deletion (Javier Cembrano, Felix A. Fischer, David Hannon, and Max Klimm)
EC 2022 – Proc. 23rd ACM Conference on Economics and Computation, pp. 1104–1105.
@inproceedings{CembranoFischerHannon+2022,
author = {Javier Cembrano and Felix A. Fischer and David Hannon and Max Klimm},
booktitle = {EC 2022 – Proc. 23rd ACM Conference on Economics and Computation},
doi = {10.1145/3490486.3538294},
pages = {1104--1105},
title = {Impartial Selection with Additive Guarantees via Iterated Deletion},
year = {2022},
}
Impartial selection is the selection of an individual from a group based on nominations by other members of the group, in such a way that individuals cannot influence their own chance of selection. We give a deterministic mechanism with an additive performance guarantee of $\mathcal{O}(n^{(1+\kappa)/2})$ in a setting with $n$ individuals where each individual casts $\mathcal{O}(n^\kappa)$ nominations, where $\kappa \in [0,1]$. For $\kappa =0$, i.e. when each individual casts at most a constant number of nominations, this bound is $\mathcal{O}(\sqrt{n})$. This matches the best-known guarantee for randomized mechanisms and a single nomination. For $\kappa=1$ the bound is $\mathcal{O}(n)$. This is trivial, as even a mechanism that never selects provides an additive guarantee of $n-1$. We show, however, that it is also best possible: for every deterministic impartial mechanism there exists a situation in which some individual is nominated by every other individual and the mechanism either does not select or selects an individual not nominated by anyone.
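To see why impartiality is restrictive, consider the naive mechanism that selects a vertex of maximum indegree. The following small sketch (this is not the paper's mechanism; the three-vertex example is constructed for illustration) shows that a vertex's own nomination can decide whether that vertex is selected, violating impartiality.

```python
def select_max_indegree(n, edges):
    """Plurality selection: the vertex of maximum indegree wins, with ties
    broken in favor of the lowest index. This mechanism is NOT impartial."""
    indeg = [0] * n
    for _, v in edges:
        indeg[v] += 1
    return max(range(n), key=lambda v: (indeg[v], -v))

# Fixed nominations: 2 -> 1 and 1 -> 0. Depending on whom vertex 0
# nominates, vertex 0 itself is or is not selected.
base = [(2, 1), (1, 0)]
winner_a = select_max_indegree(3, base + [(0, 1)])  # 0 nominates 1
winner_b = select_max_indegree(3, base + [(0, 2)])  # 0 nominates 2
```

With the nomination `(0, 1)` vertex 1 has indegree 2 and wins; with `(0, 2)` all indegrees tie at 1 and the tie-break selects vertex 0, so vertex 0 influences its own chance of selection.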
- Robustification of Online Graph Exploration Methods (Franziska Eberle, Alexander Lindermayr, Nicole Megow, Lukas Nölke, and Jens Schlöter)
AAAI 2022 – Proc. 36th Annual Conference on Artificial Intelligence, pp. 9732–9740.
@inproceedings{EberleLindermayrMegow+2022,
author = {Franziska Eberle and Alexander Lindermayr and Nicole Megow and Lukas Nölke and Jens Schlöter},
booktitle = {AAAI 2022 – Proc. 36th Annual Conference on Artificial Intelligence},
doi = {10.1609/aaai.v36i9.21208},
pages = {9732--9740},
title = {Robustification of Online Graph Exploration Methods},
year = {2022},
}
Exploring unknown environments is a fundamental task in many domains, e.g., robot navigation, network security, and internet search. We initiate the study of a learning-augmented variant of the classical, notoriously hard online graph exploration problem by adding access to machine-learned predictions. We propose an algorithm that naturally integrates predictions into the well-known Nearest Neighbor (NN) algorithm and significantly outperforms any known online algorithm if the prediction is of high accuracy while maintaining good guarantees when the prediction is of poor quality. We provide theoretical worst-case bounds that gracefully degrade with the prediction error, and we complement them by computational experiments that confirm our results. Further, we extend our concept to a general framework to robustify algorithms. By interpolating carefully between a given algorithm and NN, we prove new performance bounds that leverage the individual good performance on particular inputs while establishing robustness to arbitrary inputs.
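The Nearest Neighbor rule underlying the algorithm can be sketched as follows, here on a fully known metric for illustration only; the paper's setting reveals the graph online and blends NN with predictions, both of which this sketch omits.

```python
import itertools

def nearest_neighbor_tour(dist, start=0):
    """Greedy Nearest Neighbor visiting order on a symmetric distance matrix:
    always move to the closest unvisited vertex."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour, cur, length = [start], start, 0.0
    while unvisited:
        nxt = min(unvisited, key=lambda v: dist[cur][v])
        length += dist[cur][nxt]
        tour.append(nxt)
        unvisited.remove(nxt)
        cur = nxt
    return tour, length

def optimal_walk(dist, start=0):
    """Brute-force cheapest walk visiting all vertices from start
    (exploration objective: no return to the start)."""
    n = len(dist)
    best = float("inf")
    for perm in itertools.permutations(set(range(n)) - {start}):
        cost, cur = 0.0, start
        for v in perm:
            cost += dist[cur][v]
            cur = v
        best = min(best, cost)
    return best

# Five points on a line; here NN happens to find the optimal walk.
pts = [0, 1, 3, 6, 10]
dist = [[abs(a - b) for b in pts] for a in pts]
tour, nn_len = nearest_neighbor_tour(dist)
opt = optimal_walk(dist)
```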
- Optimal Impartial Correspondences (Javier Cembrano, Felix A. Fischer, and Max Klimm)
WINE 2022 – Proc. 18th Conference on Web and Internet Economics, pp. 187–203.
@inproceedings{CembranoFischerKlimm2022,
author = {Javier Cembrano and Felix A. Fischer and Max Klimm},
booktitle = {WINE 2022 – Proc. 18th Conference on Web and Internet Economics},
title = {Optimal Impartial Correspondences},
year = {2022},
doi = {10.1007/978-3-031-22832-2_11},
pages = {187--203},
}
We study mechanisms that select a subset of the vertex set of a directed graph in order to maximize the minimum indegree of any selected vertex, subject to an impartiality constraint that the selection of a particular vertex is independent of the outgoing edges of that vertex. For graphs with maximum outdegree $d$, we give a mechanism that selects at most $d+1$ vertices and only selects vertices whose indegree is at least the maximum indegree in the graph minus one. We then show that this is best possible in the sense that no impartial mechanism can only select vertices with maximum indegree, even without any restriction on the number of selected vertices. We finally obtain the following trade-off between the maximum number of vertices selected and the minimum indegree of any selected vertex: when selecting at most $k$ vertices out of $n$, it is possible to only select vertices whose indegree is at least the maximum indegree minus $\lfloor(n-2)/(k-1)\rfloor+1$.
- Multi-Leader Congestion Games with an Adversary (Tobias Harks, Mona Henle, Max Klimm, Jannik Matuschke, and Anja Schedel)
AAAI 2022 – Proc. 36th Annual Conference on Artificial Intelligence, pp. 5068–5075.
@inproceedings{HarksHenleKlimm+2022,
author = {Tobias Harks and Mona Henle and Max Klimm and Jannik Matuschke and Anja Schedel},
booktitle = {AAAI 2022 – Proc. 36th Annual Conference on Artificial Intelligence},
doi = {10.1609/aaai.v36i5.20439},
pages = {5068--5075},
title = {Multi-Leader Congestion Games with an Adversary},
year = {2022},
}
We study a multi-leader single-follower congestion game where multiple users (leaders) choose one resource out of a set of resources and, after observing the realized loads, an adversary (single-follower) attacks the resources with maximum loads, causing additional costs for the leaders. For the resulting strategic game among the leaders, we show that pure Nash equilibria may fail to exist and therefore, we consider approximate equilibria instead. As our first main result, we show that the existence of a $K$-approximate equilibrium can always be guaranteed, where $K \approx 1.1974$ is the unique solution of a cubic polynomial equation. To this end, we give a polynomial time combinatorial algorithm which computes a $K$-approximate equilibrium. The factor $K$ is tight, meaning that there is an instance that does not admit an $\alpha$-approximate equilibrium for any $\alpha < K$. Thus $\alpha = K$ is the smallest possible value of $\alpha$ such that the existence of an $\alpha$-approximate equilibrium can be guaranteed for any instance of the considered game. Secondly, we focus on approximate equilibria of a given fixed instance. We show how to compute efficiently a best approximate equilibrium, that is, with smallest possible $\alpha$ among all $\alpha$-approximate equilibria of the given instance.
- Maximizing a Submodular Function with Bounded Curvature Under an Unknown Knapsack Constraint (Max Klimm and Martin Knaack)
APPROX 2022 – Proc. International Workshop on Approximation Algorithms for Combinatorial Optimization Problems.
@inproceedings{KlimmKnaack2022,
author = {Max Klimm and Martin Knaack},
title = {Maximizing a Submodular Function with Bounded Curvature Under an Unknown Knapsack Constraint},
booktitle = {APPROX 2022 – Proc. International Workshop on Approximation Algorithms for Combinatorial Optimization Problems},
year = {2022},
doi = {10.4230/LIPIcs.APPROX/RANDOM.2022.49},
}
This paper studies the problem of maximizing a monotone submodular function under an unknown knapsack constraint. A solution to this problem is a policy that decides which item to pack next based on the past packing history. The robustness factor of a policy is the worst-case ratio of the solution obtained by following the policy and an optimal solution that knows the knapsack capacity. We develop an algorithm with a robustness factor that is decreasing in the curvature $c$ of the submodular function. For the extreme case $c = 0$, corresponding to a modular objective, it matches the previously known and best possible robustness factor of $1/2$. For the other extreme case of $c = 1$, it yields a robustness factor of $\approx 0.35$, improving over the best previously known robustness factor of $\approx 0.06$.
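The curvature $c$ can be computed directly from its standard definition, $c = 1 - \min_j \big(f(N) - f(N \setminus \{j\})\big)/f(\{j\})$ for a monotone $f$ with $f(\emptyset) = 0$. A small sketch using coverage functions; both instances are invented for illustration.

```python
def total_curvature(f, ground):
    """Total curvature c = 1 - min_j (f(N) - f(N \\ {j})) / f({j})
    for a monotone submodular f with f(empty set) = 0."""
    N = frozenset(ground)
    return 1 - min((f(N) - f(N - {j})) / f(frozenset({j})) for j in ground)

def coverage(sets):
    """Coverage function f(S) = size of the union of the chosen sets
    (a standard example of a monotone submodular function)."""
    return lambda S: len(set().union(*(sets[j] for j in S))) if S else 0

# Disjoint sets give a modular function (c = 0); a fully redundant set
# drives the curvature to its maximum (c = 1).
c_disjoint = total_curvature(coverage({0: {1}, 1: {2}, 2: {3}}), [0, 1, 2])
c_overlap = total_curvature(coverage({0: {1, 2}, 1: {2, 3}, 2: {3, 4}}), [0, 1, 2])
```

In the second instance the set $\{2,3\}$ is entirely covered by the other two sets, so its marginal contribution to the full ground set is zero and the curvature equals 1.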
- The Space Complexity of Undirected Graph Exploration (Yann Disser and Max Klimm)
Chapter in Algorithms for Big Data (Hannah Bast, Claudius Korzen, Ulrich Meyer, Manuel Penschuck, eds.), 2022.
@incollection{DisserKlimm2022,
author = {Yann Disser and Max Klimm},
title = {The Space Complexity of Undirected Graph Exploration},
booktitle = {Algorithms for Big Data},
doi = {10.1007/978-3-031-21534-6_8},
year = {2022},
series = {LNCS},
volume = {13201},
editor = {Hannah Bast and Claudius Korzen and Ulrich Meyer and Manuel Penschuck},
}
We review the space complexity of deterministically exploring undirected graphs. We assume that vertices are indistinguishable and that edges have a locally unique color that guides the traversal of a space-constrained agent. The graph is considered to be explored once the agent has visited all vertices. We revisit results for this setting showing that $\Theta(\log n)$ bits of memory are necessary and sufficient for an agent to explore all $n$-vertex graphs. We then illustrate that, if agents only have sublogarithmic memory, the number of (distinguishable) agents needed for collaborative exploration is $\Theta(\log \log n)$.
- Competitive Kill-and-Restart and Preemptive Strategies for Non-Clairvoyant Scheduling (Sven Jäger, Guillaume Sagnol, Daniel Schmidt genannt Waldschmidt, and Philipp Warode)
Preprint, 2022.
@techreport{JaegerSagnolSchmidtgenanntWaldschmidt+2022,
arxiv = {2211.02044},
author = {Sven Jäger and Guillaume Sagnol and Daniel Schmidt genannt Waldschmidt and Philipp Warode},
title = {Competitive Kill-and-Restart and Preemptive Strategies for Non-Clairvoyant Scheduling},
year = {2022},
}
We study kill-and-restart and preemptive strategies for the fundamental scheduling problem of minimizing the sum of weighted completion times on a single machine in the non-clairvoyant setting. First, we show a lower bound of $3$ for any deterministic non-clairvoyant kill-and-restart strategy. Then, we give for any $b > 1$ a tight analysis for the natural $b$-scaling kill-and-restart strategy as well as for a randomized variant of it. In particular, we show a competitive ratio of $(1+3\sqrt{3})\approx 6.197$ for the deterministic and of $\approx 3.032$ for the randomized strategy, by making use of the largest eigenvalue of a Toeplitz matrix. In addition, we show that the preemptive Weighted Shortest Elapsed Time First (WSETF) rule is $2$-competitive when jobs are released online, matching the lower bound for the unit weight case with trivial release dates for any non-clairvoyant algorithm. Using this result as well as the competitiveness of round-robin for multiple machines, we prove performance guarantees $<10$ for adaptations of the $b$-scaling strategy to online release dates and unweighted jobs on identical parallel machines.
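A simplified sketch of the deterministic $b$-scaling kill-and-restart rule on a single machine with trivial release dates and unit weights; the probing order and the starting budget are illustrative choices, and none of the paper's tight analysis is reproduced here.

```python
def b_scaling(proc_times, b=2.0):
    """Non-clairvoyant b-scaling kill-and-restart (simplified sketch).
    In phase i, each unfinished job is probed for b**i time units; a job
    completes iff its full processing time fits into the probe, since no
    progress is retained across killed probes. Returns the sum of
    completion times."""
    unfinished = list(range(len(proc_times)))
    t, total, i = 0.0, 0.0, 0
    while unfinished:
        budget = b ** i
        still = []
        for j in unfinished:
            if proc_times[j] <= budget:
                t += proc_times[j]       # run to completion
                total += t
            else:
                t += budget              # probe wasted, job killed
                still.append(j)
        unfinished, i = still, i + 1
    return total

def opt_spt(proc_times):
    """Clairvoyant optimum for unit weights: Shortest Processing Time first."""
    t = total = 0.0
    for p in sorted(proc_times):
        t += p
        total += t
    return total
```

On the instance `[3, 1, 2, 7]`, the sketch pays for wasted probes and is a constant factor away from the clairvoyant optimum of 23.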
- Travelling on Graphs with Small Highway Dimension (Yann Disser, Andreas E. Feldmann, Max Klimm, and Jochen Könemann)
Algorithmica, 83(5):1352–1370, 2021.
@article{DisserFeldmannKlimm+2021,
author = {Yann Disser and Andreas E. Feldmann and Max Klimm and Jochen K{\"o}nemann},
doi = {10.1007/s00453-020-00785-5},
journal = {Algorithmica},
number = {5},
pages = {1352--1370},
title = {Travelling on Graphs with Small Highway Dimension},
volume = {83},
year = {2021},
}
We study the Travelling Salesperson (TSP) and the Steiner Tree problem (STP) in graphs of low highway dimension. This graph parameter roughly measures how many central nodes are visited by all shortest paths of a certain length. It has been shown that transportation networks, on which TSP and STP naturally occur for various applications in logistics, typically have a small highway dimension. While it was previously shown that these problems admit a quasi-polynomial time approximation scheme on graphs of constant highway dimension, we demonstrate that a significant improvement is possible in the special case when the highway dimension is 1. Specifically, we present a fully polynomial-time approximation scheme (FPTAS). We also prove that both TSP and STP are weakly $\mathsf{NP}$-hard for these restricted graphs.
- Pure Nash Equilibria in Resource Graph Games (Tobias Harks, Max Klimm, and Jannik Matuschke)
Journal of Artificial Intelligence Research, 72:185–213, 2021.
@article{HarksKlimmMatuschke2021,
author = {Tobias Harks and Max Klimm and Jannik Matuschke},
doi = {10.1613/jair.1.12668},
journal = {Journal of Artificial Intelligence Research},
pages = {185--213},
title = {Pure {Nash} Equilibria in Resource Graph Games},
volume = {72},
year = {2021},
}
This paper studies the existence of pure Nash equilibria in resource graph games, a general class of strategic games succinctly representing the players' private costs. These games are defined relative to a finite set of resources and the strategy set of each player corresponds to a set of subsets of resources. The cost of a resource is an arbitrary function of the load vector of a certain subset of resources. As our main result, we give complete characterizations of the cost functions guaranteeing the existence of pure Nash equilibria for weighted and unweighted players, respectively. For unweighted players, pure Nash equilibria are guaranteed to exist for any choice of the players' strategy space if and only if the cost of each resource is an arbitrary function of the load of the resource itself and linear in the load of all other resources where the linear coefficients of mutual influence of different resources are symmetric. This implies in particular that for any other cost structure there is a resource graph game that does not have a pure Nash equilibrium. For weighted games where players have intrinsic weights and the cost of each resource depends on the aggregated weight of its users, pure Nash equilibria are guaranteed to exist if and only if the cost of a resource is linear in all resource loads, and the linear factors of mutual influence are symmetric, or there is no interaction among resources and the cost is an exponential function of the local resource load. We further discuss the computational complexity of pure Nash equilibria in resource graph games showing that for unweighted games where pure Nash equilibria are guaranteed to exist, it is $\mathsf{coNP}$-complete to decide for a given strategy profile whether it is a pure Nash equilibrium. For general resource graph games, we prove that the decision whether a pure Nash equilibrium exists is $\mathsf{\Sigma_2^p}$-complete.
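Existence of pure Nash equilibria in small finite games can always be checked by brute force. A minimal sketch; the two-player, two-resource congestion game below is an invented toy instance (with Rosenthal-style load-dependent costs), not a resource graph game in full generality.

```python
import itertools

def pure_nash_equilibria(strategies, cost):
    """Enumerate all pure Nash equilibria of a finite game.
    strategies[i] lists player i's strategies; cost(i, profile) is player
    i's cost under the given strategy profile (a tuple)."""
    equilibria = []
    for profile in itertools.product(*strategies):
        if all(cost(i, profile) <= min(
                   cost(i, profile[:i] + (s,) + profile[i + 1:])
                   for s in strategies[i])
               for i in range(len(strategies))):
            equilibria.append(profile)
    return equilibria

# Toy unweighted congestion game: each player picks one of two resources,
# and pays the load (number of users) of the chosen resource.
strategies = [[0, 1], [0, 1]]
cost = lambda i, profile: profile.count(profile[i])
eqs = pure_nash_equilibria(strategies, cost)
```

Here the two profiles spreading the players over distinct resources are exactly the pure Nash equilibria, in line with Rosenthal's classical existence result for unweighted congestion games.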
- Speed-Robust Scheduling - Sand, Bricks, and Rocks (Franziska Eberle, Ruben Hoeksma, Nicole Megow, Lukas Nölke, Kevin Schewior, and Bertrand Simon)
IPCO 2021 – Proc. 22nd Conference on Integer Programming and Combinatorial Optimization, Springer, pp. 283–296.
@inproceedings{EberleHoeksmaMegow+2021,
author = {Franziska Eberle and Ruben Hoeksma and Nicole Megow and Lukas N{\"{o}}lke and Kevin Schewior and Bertrand Simon},
title = {Speed-Robust Scheduling - Sand, Bricks, and Rocks},
booktitle = {IPCO 2021 – Proc. 22nd Conference on Integer Programming and Combinatorial Optimization},
series = {Lecture Notes in Computer Science},
volume = {12707},
pages = {283--296},
publisher = {Springer},
year = {2021},
doi = {10.1007/978-3-030-73879-2_20},
}
The speed-robust scheduling problem is a two-stage problem where, given $m$ machines, jobs must be grouped into at most $m$ bags while the processing speeds of the machines are unknown. After the speeds are revealed, the grouped jobs must be assigned to the machines without being separated. To evaluate the performance of algorithms, we determine upper bounds on the worst-case ratio of the algorithm's makespan and the optimal makespan given full information. We refer to this ratio as the robustness factor. We give an algorithm with a robustness factor $2-\frac1m$ for the most general setting and improve this to $1.8$ for equal-size jobs. For the special case of infinitesimal jobs, we give an algorithm with an optimal robustness factor equal to $\frac e{e-1}\approx 1.58$. The particular machine environment in which all machines have either speed $0$ or $1$ was studied before by Stein and Zhong (ACM Trans.\ Alg.\ 2020). For this setting, we provide an algorithm for scheduling infinitesimal jobs with an optimal robustness factor of $\frac{1+\sqrt{2}}2\approx 1.207$. It lays the foundation for an algorithm matching the lower bound of $\frac43$ for equal-size jobs.
- Fractionally Subadditive Maximization under an Incremental Knapsack Constraint (Yann Disser, Max Klimm, and David Weckbecker)
WAOA 2021 – Proc. 19th Workshop on Approximation and Online Algorithms.
@inproceedings{DisserKlimmWeckbecker2021,
author = {Yann Disser and Max Klimm and David Weckbecker},
booktitle = {WAOA 2021 – Proc. 19th Workshop on Approximation and Online Algorithms},
doi = {10.1007/978-3-030-92702-8_13},
title = {Fractionally Subadditive Maximization under an Incremental Knapsack Constraint},
year = {2021},
editor = {J. Koenemann and B. Peis},
}
We consider the problem of maximizing a fractionally subadditive function under a knapsack constraint that grows over time. An incremental solution to this problem is given by an order in which to include the elements of the ground set, and the competitive ratio of an incremental solution is defined by the worst ratio over all capacities relative to an optimum solution of the corresponding capacity. We present an algorithm that finds an incremental solution of competitive ratio at most $\max\{3.293\sqrt{M}, 2M\}$, under the assumption that the values of singleton sets are in the range $[1,M]$, and we give a lower bound of $\max\{2.449, M\}$ on the attainable competitive ratio. In addition, we establish that our framework captures potential-based flows between two vertices, and we give a tight bound of 2 for the incremental maximization of classical flows with unit capacities.
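For small instances with integer sizes, the competitive ratio of a fixed inclusion order can be evaluated exactly. A minimal sketch for an additive (modular) objective, a special case of fractionally subadditive; the three-item instance is invented for illustration.

```python
def knapsack_opt(items, cap):
    """Exact 0/1 knapsack value via DP over integer capacities."""
    best = [0] * (cap + 1)
    for value, size in items:
        for c in range(cap, size - 1, -1):
            best[c] = max(best[c], best[c - size] + value)
    return best[cap]

def competitive_ratio(items, order):
    """Worst ratio OPT(C) / ALG(C) over integer capacities C, where the
    incremental solution ALG packs the longest prefix of `order` that
    fits entirely into capacity C."""
    total_size = sum(size for _, size in items)
    worst = 1.0
    for cap in range(items[order[0]][1], total_size + 1):
        alg = used = 0
        for j in order:
            value, size = items[j]
            if used + size > cap:
                break
            alg, used = alg + value, used + size
        worst = max(worst, knapsack_opt(items, cap) / alg)
    return worst

# Items (value, size) with value = size; packing smallest-first is
# penalized at capacity 2, where the prefix only fits the first item.
ratio = competitive_ratio([(1, 1), (2, 2), (3, 3)], [0, 1, 2])
```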
- Fully Dynamic Algorithms for Knapsack Problems With Polylogarithmic Update Time (Franziska Eberle, Nicole Megow, Lukas Nölke, Bertrand Simon, and Andreas Wiese)
FSTTCS 2021 – Proc. 41st Conference on Foundations of Software Technology and Theoretical Computer Science, pp. 18:1–18:17.
@inproceedings{EberleMegowNölke+2021,
author = {Franziska Eberle and Nicole Megow and Lukas Nölke and Bertrand Simon and Andreas Wiese},
booktitle = {FSTTCS 2021 – Proc. 41st Conference on Foundations of Software Technology and Theoretical Computer Science},
doi = {10.4230/LIPIcs.FSTTCS.2021.18},
pages = {18:1--18:17},
title = {Fully Dynamic Algorithms for Knapsack Problems With Polylogarithmic Update Time},
year = {2021},
}
Knapsack problems are among the most fundamental problems in optimization. In the Multiple Knapsack problem, we are given multiple knapsacks with different capacities and items with values and sizes. The task is to find a subset of items of maximum total value that can be packed into the knapsacks without exceeding the capacities. We investigate this problem and special cases thereof in the context of dynamic algorithms and design data structures that efficiently maintain near-optimal knapsack solutions for dynamically changing input. More precisely, we handle the arrival and departure of individual items or knapsacks during the execution of the algorithm with worst-case update time polylogarithmic in the number of items. As the optimal and any approximate solution may change drastically, we maintain implicit solutions and support polylogarithmic time query operations that can return the computed solution value and the packing of any given item. While dynamic algorithms are well-studied in the context of graph problems, there is hardly any work on packing problems (and generally much less on non-graph problems). Motivated by the theoretical interest in knapsack problems and their practical relevance, our work bridges this gap.
- Multidimensional Apportionment through Discrepancy Theory (Javier Cembrano, Jose Correa, and Victor Verdugo)
EC 2021 – Proc. 22nd ACM Conference on Economics and Computation, pp. 287–288.
@inproceedings{CembranoCorreaVerdugo2021,
author = {Javier Cembrano and Jose Correa and Victor Verdugo},
booktitle = {EC 2021 – Proc. 22nd ACM Conference on Economics and Computation},
doi = {10.1145/3465456.3467596},
pages = {287--288},
title = {Multidimensional Apportionment through Discrepancy Theory},
year = {2021},
}
- Proportional Apportionment: A Case Study From the Chilean Constitutional Convention (Javier Cembrano, Jose Correa, Gonzalo Diaz, and Victor Verdugo)
EAAMO 2021 – Proc. 1st ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization, pp. 1–9.
@inproceedings{CembranoCorreaDiaz+2021,
author = {Javier Cembrano and Jose Correa and Gonzalo Diaz and Victor Verdugo},
booktitle = {EAAMO 2021 – Proc. 1st ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization},
doi = {10.1145/3465416.3483295},
pages = {1--9},
title = {Proportional Apportionment: A Case Study From the Chilean Constitutional Convention},
year = {2021},
}
How to elect the representatives in legislative bodies is a question that every modern democracy has to answer. This design task has to consider various elements so as to fulfill the citizens' expectations and contribute to maintaining a healthy democracy. In this work, we present five proposals of electoral methods based on proportional representation that extend the notion of proportionality to several dimensions: For instance, simultaneous proportionality in political, geographical, and gender representation. We also consider including two additional desirable properties for electoral methods: A minimum threshold to obtain political representation, and the incorporation of plurality voting, guaranteeing the election of the highest voted candidate in each district. We use the Chilean Constitutional Convention election (May 15–16, 2021) results as a testing ground, and compare the apportionment obtained under each method according to four criteria: Proportionality, representativeness, robustness, and voting power. We conclude that it is feasible to design and implement an electoral method satisfying all mentioned properties. Our findings may be useful in assessing electoral designs in other contexts as well.
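As a one-dimensional baseline, classical highest-averages methods such as D'Hondt apportion seats proportionally to votes; the paper's methods extend proportionality to several dimensions simultaneously, which this standard-method sketch does not attempt. The vote numbers below are invented.

```python
def dhondt(votes, seats):
    """D'Hondt highest-averages apportionment: repeatedly award the next
    seat to the party with the largest quotient votes / (seats won + 1)."""
    alloc = {party: 0 for party in votes}
    for _ in range(seats):
        best = max(votes, key=lambda p: votes[p] / (alloc[p] + 1))
        alloc[best] += 1
    return alloc
```

For example, `dhondt({"A": 100, "B": 80, "C": 30}, 8)` awards the eight seats by the quotient sequence 100, 80, 50, 40, 33.3, 30, 26.7, 25.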
- Approximate Parametric Computation of Minimum-Cost Flows with Convex Costs (Per Joachims, Max Klimm, and Philipp Warode)
Preprint, 2021.
@techreport{JoachimsKlimmWarode2021,
arxiv = {2203.13146},
author = {Per Joachims and Max Klimm and Philipp Warode},
title = {Approximate Parametric Computation of Minimum-Cost Flows with Convex Costs},
year = {2021},
}
This paper studies a variant of the minimum-cost flow problem in a graph with convex cost functions, where the demands at the vertices are functions depending on a one-dimensional parameter $\lambda$. We devise two algorithmic approaches for the approximate computation of parametric solutions for this problem. The first approach transforms an instance of the parametric problem into an instance with piecewise quadratic cost functions by interpolating the marginal cost functions. The new instance can be solved exactly with an algorithm we developed in prior work. In the second approach, we compute a fixed number of non-parametric solutions and interpolate the resulting flows yielding an approximate solution for the original, parametric problem. For both methods we formulate explicit bounds on the step sizes used in the respective interpolations that guarantee relative and absolute error margins. Finally, we test our approaches on real-world traffic and gas instances in an empirical study.
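The second approach can be sketched on the smallest interesting instance: two parallel edges with convex quadratic-plus-linear costs, where the exact parametric solution has a closed form (piecewise linear in $\lambda$ due to clipping at zero). All numerical values are invented for illustration.

```python
def exact_split(a1, b1, a2, b2, lam):
    """Optimal flow on edge 1 for demand lam, two parallel edges with
    costs a*x^2 + b*x: equalize marginals 2*a1*x + b1 = 2*a2*(lam-x) + b2,
    clipped to the feasible interval [0, lam]."""
    x = (2 * a2 * lam + b2 - b1) / (2 * (a1 + a2))
    return min(max(x, 0.0), lam)

def interpolate(grid, flows, lam):
    """Second approach, sketched: linearly interpolate flows computed at
    finitely many demand values to approximate the parametric solution."""
    for x0, x1, y0, y1 in zip(grid, grid[1:], flows, flows[1:]):
        if x0 <= lam <= x1:
            return y0 + (y1 - y0) * (lam - x0) / (x1 - x0)
    return flows[-1]

# Instance with a breakpoint at lam = 1: for smaller demands all flow uses
# edge 1. A coarse grid misses the breakpoint and incurs an error there;
# adding the breakpoint to the grid removes it.
a1, b1, a2, b2 = 1.0, 0.0, 1.0, 2.0
coarse = [0.0, 2.0]
fine = [0.0, 1.0, 2.0]
coarse_flows = [exact_split(a1, b1, a2, b2, g) for g in coarse]
fine_flows = [exact_split(a1, b1, a2, b2, g) for g in fine]
```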
- A General Framework for Handling Commitment in Online Throughput Maximization (Lin Chen, Franziska Eberle, Nicole Megow, Kevin Schewior, and Cliff Stein)
Mathematical Programming, 183(1):215–247, 2020.
@article{ChenEberleMegow+2020,
author = {Lin Chen and Franziska Eberle and Nicole Megow and Kevin Schewior and Cliff Stein},
title = {A General Framework for Handling Commitment in Online Throughput Maximization},
journal = {Mathematical Programming},
volume = {183},
number = {1},
pages = {215--247},
year = {2020},
doi = {10.1007/s10107-020-01469-2},
}
We study a fundamental online job admission problem where jobs with deadlines arrive online over time at their release dates, and the task is to determine a preemptive single-server schedule which maximizes the number of jobs that complete on time. To circumvent known impossibility results, we make a standard slackness assumption by which the feasible time window for scheduling a job is at least $1+\varepsilon$ times its processing time, for some $\varepsilon > 0$. We quantify the impact that different provider commitment requirements have on the performance of online algorithms. Our main contribution is one universal algorithmic framework for online job admission both with and without commitments. Without commitment, our algorithm with a competitive ratio of $O(1/\varepsilon)$ is the best possible (deterministic) for this problem. For commitment models, we give the first non-trivial performance bounds. If the commitment decisions must be made before a job’s slack becomes less than a $\delta$-fraction of its size, we prove a competitive ratio of $O(\varepsilon / ((\varepsilon - \delta)\delta^2))$, for $0 < \delta < \varepsilon$. When a provider must commit upon starting a job, our bound is $O(1/\varepsilon^2)$. Finally, we observe that for scheduling with commitment the restriction to the “unweighted” throughput model is essential; if jobs have individual weights, we rule out competitive deterministic algorithms.
- Hiring Secretaries over Time: The Benefit of Concurrent Employment (Yann Disser, John Fearnley, Martin Gairing, Oliver Göbel, Max Klimm, Daniel Schmand, Alexander Skopalik, and Andreas Tönnis)
Mathematics of Operations Research, 45(1):323–352, 2020.
@article{DisserFearnleyGairing+2020,
author = {Yann Disser and John Fearnley and Martin Gairing and Oliver G{\"o}bel and Max Klimm and Daniel Schmand and Alexander Skopalik and Andreas T{\"o}nnis},
doi = {10.1287/moor.2019.0993},
journal = {Mathematics of Operations Research},
number = {1},
pages = {323--352},
title = {Hiring Secretaries over Time: The Benefit of Concurrent Employment},
volume = {45},
year = {2020},
}
We consider a stochastic online problem where $n$ applicants arrive over time, one per time step. Upon the arrival of each applicant, their cost per time step is revealed, and we have to fix the duration of employment, starting immediately. This decision is irrevocable; that is, we can neither extend a contract nor dismiss a candidate once hired. In every time step, at least one candidate needs to be under contract, and our goal is to minimize the total hiring cost, which is the sum of the applicants' costs multiplied by their respective employment durations. We provide a competitive online algorithm for the case that the applicants' costs are drawn independently from a known distribution. Specifically, the algorithm achieves a competitive ratio of 2.965 for the case of uniform distributions. For this case, we give an analytical lower bound of 2 and a computational lower bound of 2.148. We then adapt our algorithm to stay competitive even in settings with one or more of the following restrictions: (i) at most two applicants can be hired concurrently; (ii) the distribution of the applicants' costs is unknown; (iii) the total number $n$ of time steps is unknown. On the other hand, we show that concurrent employment is a necessary feature of competitive algorithms by proving that no algorithm has a competitive ratio better than $\Omega(n/ \log n)$ if concurrent employment is forbidden.
- Broadcasting a File in a Communication Network (Kai-Simon Goetzmann, Tobias Harks, and Max Klimm)
Journal of Scheduling, 23(2):211–232, 2020.
@article{GoetzmannHarksKlimm2020,
author = {Kai{-}Simon Goetzmann and Tobias Harks and Max Klimm},
doi = {10.1007/s10951-020-00643-w},
journal = {Journal of Scheduling},
number = {2},
pages = {211--232},
title = {Broadcasting a File in a Communication Network},
volume = {23},
year = {2020},
}
We study the problem of distributing a file, initially located at a server, among a set of $n$ nodes. The file is divided into $m \geq 1$ equally sized packets. After downloading a packet, nodes can upload it to other nodes, possibly to multiple nodes in parallel. Each node, however, may receive each packet from a single source node only. The upload and download rates between nodes are constrained by node- and server-specific upload and download capacities. The objective is to minimize the makespan. This problem was first proposed and analyzed by Mundinger et al. (J. Sched. 11:105–120, 2008) under the assumption that uploads obey the fair sharing principle, that is, concurrent upload rates from a common source are equal at any point in time. Under this assumption, the authors devised an optimal polynomial time algorithm for the case where the upload capacity of the server and the nodes' upload and download capacities are all equal. In this work, we drop the fair sharing assumption and derive an exact polynomial time algorithm for the case when upload and download capacities per node and among nodes are equal. We further show that the problem becomes strongly $\mathsf{NP}$-hard for equal upload and download capacities per node that may differ among nodes, even for a single packet. For this case, we devise a polynomial time $(1+ 2\sqrt{2})$-approximation algorithm. Finally, we devise two polynomial time algorithms with approximation guarantees of $5$ and $2 + \lceil \log_2 \lceil n/m \rceil \rceil /m$, respectively, for the general case of $m$ packets.
- Complexity and Parametric Computation of Equilibria in Atomic Splittable Congestion Games via Weighted Block Laplacians (Max Klimm and Philipp Warode)
SODA 2020 – Proc. 31st ACM-SIAM Symposium on Discrete Algorithms, pp. 2728–2747.
@inproceedings{KlimmWarode2020,
author = {Max Klimm and Philipp Warode},
booktitle = {SODA 2020 – Proc. 31st ACM-SIAM Symposium on Discrete Algorithms},
doi = {10.1137/1.9781611975994.166},
pages = {2728--2747},
title = {Complexity and Parametric Computation of Equilibria in Atomic Splittable Congestion Games via Weighted Block {Laplacians}},
year = {2020},
}
We settle the complexity of computing an equilibrium in atomic splittable congestion games with player-specific affine cost functions $l_{e,i}(x) = a_{e,i}x + b_{e,i}$, showing that it is $\mathsf{PPAD}$-complete. To prove that the problem is contained in $\mathsf{PPAD}$, we develop a homotopy method that traces an equilibrium for varying flow demands of the players. A key technique is to describe the evolution of the equilibrium locally by a novel block Laplacian matrix. This leads to a path following formulation where states correspond to supports that are feasible for some demands and neighboring supports are feasible for increased or decreased flow demands. A closer investigation of the block Laplacian system allows us to orient the states, giving rise to unique predecessor and successor states and thus putting the problem into $\mathsf{PPAD}$. For the $\mathsf{PPAD}$-hardness, we reduce from computing an approximate equilibrium of a bimatrix win-lose game. As a byproduct of our reduction, we further show that computing a multi-class Wardrop equilibrium with class-dependent affine cost functions is $\mathsf{PPAD}$-complete as well. Our $\mathsf{PPAD}$-completeness proof also yields an algorithm that computes all equilibria parametrized by the players' flow demands. For player-specific costs, this computation may require several increases and decreases of the demands, leading to an algorithm that runs in polynomial space but exponential time. For player-independent costs, only demand increases are necessary. If the coefficients $b_{e,i}$ are in general position, this yields an algorithm that computes all equilibria as a function of the flow demand and runs in time polynomial in the output size.
- Packing Under Convex Quadratic Constraints (Max Klimm, Marc E. Pfetsch, Rico Raber, and Martin Skutella)
IPCO 2020 – Proc. 21st Conference on Integer Programming and Combinatorial Optimization, pp. 266–279.
@inproceedings{KlimmPfetschRaber+2020,
author = {Max Klimm and Marc E. Pfetsch and Rico Raber and Martin Skutella},
booktitle = {IPCO 2020 – Proc. 21st Conference on Integer Programming and Combinatorial Optimization},
doi = {10.1007/978-3-030-45771-6\_21},
pages = {266--279},
title = {Packing Under Convex Quadratic Constraints},
year = {2020},
}
We consider a general class of binary packing problems with a convex quadratic knapsack constraint. We prove that these problems are $\mathsf{APX}$-hard to approximate and present constant-factor approximation algorithms based upon three different algorithmic techniques: (1) a rounding technique tailored to a convex relaxation in conjunction with a non-convex relaxation whose approximation ratio equals the golden ratio; (2) a greedy strategy; (3) a randomized rounding method leading to an approximation algorithm for the more general case with multiple convex quadratic constraints. We further show that a combination of the first two strategies can be used to yield a monotone algorithm leading to a strategyproof mechanism for a game-theoretic variant of the problem. Finally, we present a computational study of the empirical approximation of the three algorithms for problem instances arising in the context of real-world gas transport networks.
- Optimally Handling Commitment Issues in Online Throughput Maximization (Franziska Eberle, Nicole Megow, and Kevin Schewior)
ESA 2020 – Proc. 28th European Symposium on Algorithms, pp. 41:1–41:15.
@inproceedings{EberleMegowSchewior2020,
author = {Franziska Eberle and Nicole Megow and Kevin Schewior},
booktitle = {ESA 2020 – Proc. 28th European Symposium on Algorithms},
doi = {10.4230/LIPIcs.ESA.2020.41},
pages = {41:1--41:15},
title = {Optimally Handling Commitment Issues in Online Throughput Maximization},
year = {2020},
}
We consider a fundamental online scheduling problem in which jobs with processing times and deadlines arrive online over time at their release dates. The task is to determine a feasible preemptive schedule on $m$ machines that maximizes the number of jobs that complete before their deadline. Due to strong impossibility results for competitive analysis, it is commonly required that jobs contain some slack $\varepsilon > 0$, which means that the feasible time window for scheduling a job is at least $1+\varepsilon$ times its processing time. In this paper, we answer the question of how to handle commitment requirements, which force a scheduler to guarantee, at a certain point in time, the completion of admitted jobs. This is highly relevant, e.g., for providing cloud-computing services, as it disallows last-minute rejections of critical tasks. We present the first online algorithm for handling commitment on parallel machines for arbitrary slack $\varepsilon$. When the scheduler must commit upon starting a job, the algorithm is $\Theta(1/\varepsilon)$-competitive. Somewhat surprisingly, this is the same optimal performance bound (up to constants) as for scheduling without commitment on a single machine. If commitment decisions must be made before a job's slack becomes less than a $\delta$-fraction of its size, we prove a competitive ratio of $O(1/(\varepsilon - \delta))$ for $0 < \delta < \varepsilon$. This result nicely interpolates between commitment upon starting a job and commitment upon arrival. For the latter commitment model, it is known that no (randomized) online algorithm admits a bounded competitive ratio.
- Book Embeddings of Nonplanar Graphs with Small Faces in Few Pages (Michael A. Bekos, Giordano Da Lozzo, Svenja M. Griesbach, Martin Gronemann, Fabrizio Montecchiani, and Chrysanthi Raftopoulou)
SoCG 2020 – Proc. 36th International Symposium on Computational Geometry, pp. 16:1–16:17.
@inproceedings{BekosDaLozzoGriesbach+2020,
address = {Dagstuhl, Germany},
annote = {Keywords: Book embeddings, Book thickness, Nonplanar graphs, Planar skeleton},
author = {Michael A. Bekos and Giordano Da Lozzo and Svenja M. Griesbach and Martin Gronemann and Fabrizio Montecchiani and Chrysanthi Raftopoulou},
booktitle = {SoCG 2020 – Proc. 36th International Symposium on Computational Geometry},
doi = {10.4230/LIPIcs.SoCG.2020.16},
editor = {Sergio Cabello and Danny Z. Chen},
pages = {16:1--16:17},
title = {Book Embeddings of Nonplanar Graphs with Small Faces in Few Pages},
urn = {urn:nbn:de:0030-drops-121749},
volume = {164},
year = {2020},
}
An embedding of a graph in a book, called book embedding, consists of a linear ordering of its vertices along the spine of the book and an assignment of its edges to the pages of the book, so that no two edges on the same page cross. The book thickness of a graph is the minimum number of pages over all its book embeddings. For planar graphs, a fundamental result is due to Yannakakis, who proposed an algorithm to compute embeddings of planar graphs in books with four pages. Our main contribution is a technique that generalizes this result to a much wider family of nonplanar graphs, which is characterized by a biconnected skeleton of crossing-free edges whose faces have bounded degree. Notably, this family includes all 1-planar and all optimal 2-planar graphs as subgraphs. We prove that this family of graphs has bounded book thickness, and as a corollary, we obtain the first constant upper bound for the book thickness of optimal 2-planar graphs.
- Nobel, Milgrom und Wilson (Max Klimm)
Chapter in Mitteilungen der Deutschen Mathematiker-Vereinigung, 2020.
@incollection{Klimm2020,
author = {Max Klimm},
booktitle = {Mitteilungen der Deutschen Mathematiker-Vereinigung},
doi = {10.1515/dmvm-2020-0062},
number = {4},
pages = {193--200},
title = {Nobel, Milgrom und Wilson},
volume = {28},
year = {2020},
}
Paul Milgrom and Robert Wilson received the Nobel Memorial Prize in Economic Sciences for their contributions to auction theory and the development of new auction formats. We give an overview of the mathematical analysis of auctions and of the influence of Milgrom and Wilson on these theories. In particular, we address the question of which principles auctions should be designed by in order to achieve the highest possible revenue.
- SAGT 2020 – Proc. 13th International Symposium on Algorithmic Game Theory (Tobias Harks, Max Klimm, eds.), Springer, 2020.
@proceedings{HarksKlimm2020,
doi = {10.1007/978-3-030-57980-7},
editor = {Tobias Harks and Max Klimm},
publisher = {Springer},
series = {LNCS},
title = {SAGT 2020 – Proc. 13th International Symposium on Algorithmic Game Theory},
volume = {12283},
year = {2020},
}
- Tight Bounds for Undirected Graph Exploration with Pebbles and Multiple Agents (Yann Disser, Jan Hackfeld, and Max Klimm)
Journal of the ACM, 66(6):40:1–40:41, 2019.
@article{DisserHackfeldKlimm2019,
author = {Yann Disser and Jan Hackfeld and Max Klimm},
doi = {10.1145/3356883},
journal = {Journal of the ACM},
number = {6},
pages = {40:1--40:41},
tier2 = {1},
title = {Tight Bounds for Undirected Graph Exploration with Pebbles and Multiple Agents},
volume = {66},
year = {2019},
}
We study the problem of deterministically exploring an undirected and initially unknown graph with $n$ vertices either by a single agent equipped with a set of pebbles or by a set of collaborating agents. The vertices of the graph are unlabeled and cannot be distinguished by the agents, but the edges incident to a vertex have locally distinct labels. The graph is explored when all vertices have been visited by at least one agent. In this setting, it is known that for a single agent without pebbles $\Theta(\log n)$ bits of memory are necessary and sufficient to explore any graph with at most $n$ vertices. We are interested in how the memory requirement decreases as the agent may mark vertices by dropping and retrieving distinguishable pebbles or when multiple agents jointly explore the graph. We give tight results for both questions showing that for a single agent with constant memory $\Theta(\log \log n)$ pebbles are necessary and sufficient for exploration. We further prove that using collaborating agents instead of pebbles does not help as $\Theta(\log \log n)$ agents with constant memory each are necessary and sufficient for exploration. For the upper bounds, we devise an algorithm for a single agent with constant memory that explores any $n$-vertex graph using $\mathcal{O}(\log \log n)$ pebbles, even when $n$ is not known a priori. The algorithm terminates after polynomial time and returns to the starting vertex. We further show that the algorithm can be realized with additional constant-memory agents rather than pebbles, implying that $\mathcal{O}(\log \log n)$ agents with constant memory can explore any $n$-vertex graph. For the lower bound, we show that the number of agents needed for exploring any graph with at most $n$ vertices is already $\Omega(\log \log n)$ when we allow each agent to have at most $\mathcal{O}((\log n)^{1-\epsilon})$ bits of memory for any $\epsilon > 0$.
Our argument also implies that a single agent with sublogarithmic memory needs $\Theta(\log \log n)$ pebbles to explore any $n$-vertex graph.
- Distance-Preserving Graph Contractions (Aaron Bernstein, Karl Däubel, Yann Disser, Max Klimm, Torsten Mütze, and Frieder Smolny)
SIAM Journal on Discrete Mathematics, 33(3):1607–1636, 2019.
@article{BernsteinDaeubelDisser+2019,
author = {Aaron Bernstein and Karl D{\"a}ubel and Yann Disser and Max Klimm and Torsten M{\"u}tze and Frieder Smolny},
doi = {10.1137/18M1169382},
journal = {SIAM Journal on Discrete Mathematics},
number = {3},
pages = {1607--1636},
tier2 = {3},
title = {Distance-Preserving Graph Contractions},
volume = {33},
year = {2019},
}
Compression and sparsification algorithms are frequently applied in a preprocessing step before analyzing or optimizing large networks/graphs. In this paper we propose and study a new framework for contracting edges of a graph (merging vertices into supervertices) with the goal of preserving pairwise distances as accurately as possible. Formally, given an edge-weighted graph, the contraction should guarantee that for any two vertices at distance $d$, the corresponding supervertices remain at distance at least $\varphi(d)$ in the contracted graph, where $\varphi$ is a tolerance function bounding the permitted distance distortion. We present a comprehensive picture of the algorithmic complexity of the contraction problem for affine tolerance functions $\varphi(x) = x/\alpha + \beta$, where $\alpha \geq 1$ and $\beta \geq 0$ are arbitrary real-valued parameters. Specifically, we present polynomial-time algorithms for trees as well as hardness and inapproximability results for different graph classes, precisely separating easy and hard cases. Further, we analyze the asymptotic behavior of contractions and find efficient algorithms to compute (nonoptimal) contractions despite our hardness results.
- Greedy Metric Minimum Online Matchings with Random Arrivals (Martin Gairing and Max Klimm)
Operations Research Letters, 47(2):88–91, 2019.
@article{GairingKlimm2019,
author = {Martin Gairing and Max Klimm},
doi = {10.1016/j.orl.2019.01.002},
journal = {Operations Research Letters},
number = {2},
pages = {88--91},
title = {Greedy Metric Minimum Online Matchings with Random Arrivals},
volume = {47},
year = {2019},
}
We consider online metric minimum bipartite matching problems with random arrival order and show that the greedy algorithm assigning each request to the nearest unmatched server is $n$-competitive, where $n$ is the number of requests. This result is complemented by a lower bound exhibiting that the greedy algorithm has a competitive ratio of at least $n^{(\ln 3 - \ln 2)/\ln 4} \approx n^{0.292}$, even when the underlying metric space is the real line.
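The greedy rule analyzed above is simple enough to state in a few lines. The following is an illustrative sketch (not from the paper), using the real line as the metric space; all names are assumptions for the example.

```python
def greedy_matching(servers, requests):
    """Match each request, in arrival order, to the nearest still-unmatched
    server. Returns the matching as (request, server) pairs and its total cost."""
    free = list(servers)
    matching, cost = [], 0.0
    for r in requests:
        s = min(free, key=lambda x: abs(x - r))  # nearest unmatched server
        free.remove(s)
        matching.append((r, s))
        cost += abs(s - r)
    return matching, cost

# Arrival order matters: here greedy pays 0.4 + 1.1 = 1.5,
# while the offline optimum pays 0.6 + 0.1 = 0.7.
matching, cost = greedy_matching(servers=[1, 2], requests=[1.4, 0.9])
```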
- A more fine-grained complexity analysis of finding the most vital edges for undirected shortest paths (Cristina Bazgan, Till Fluschnik, André Nichterlein, Rolf Niedermeier, and Maximilian J. Stahlberg)
Networks, 73(1):23–37, 2019.
@article{BazganFluschnikNichterlein+2019,
arxiv = {1804.09155},
author = {Cristina Bazgan and Till Fluschnik and Andr{\'e} Nichterlein and Rolf Niedermeier and Maximilian J. Stahlberg},
doi = {10.1002/net.21832},
journal = {Networks},
number = {1},
pages = {23--37},
title = {A more fine-grained complexity analysis of finding the most vital edges for undirected shortest paths},
volume = {73},
year = {2019},
}
We study the $\mathsf{NP}$-hard shortest path most vital edges problem arising in the context of analyzing network robustness. For an undirected graph with positive integer edge lengths and two designated vertices $s$ and $t$, the goal is to delete as few edges as possible in order to increase the length of the (new) shortest $st$-path as much as possible. This scenario has been studied from the viewpoint of parameterized complexity and approximation algorithms. We contribute to this line of research by providing refined computational tractability as well as hardness results. We achieve this by a systematic investigation of various problem-specific parameters and their influence on the computational complexity. Charting the border between tractability and intractability, we also identify numerous challenges for future research.
- On Index Policies in Stochastic Minsum Scheduling (Franziska Eberle, Felix Fischer, Jannik Matuschke, and Nicole Megow)
Operations Research Letters, 47(3):213–218, 2019.
@article{EberleFischerMatuschke+2019,
author = {Franziska Eberle and Felix Fischer and Jannik Matuschke and Nicole Megow},
doi = {10.1016/j.orl.2019.03.007},
journal = {Operations Research Letters},
number = {3},
pages = {213--218},
title = {On Index Policies in Stochastic Minsum Scheduling},
volume = {47},
year = {2019},
}
Minimizing the sum of completion times when scheduling jobs on $m$ identical parallel machines is a fundamental scheduling problem. Unlike the well-understood deterministic variant, it is a major open problem how to handle stochastic processing times. We show for the prominent class of index policies that no such policy can achieve a distribution-independent approximation factor. This strong lower bound holds even for simple instances with deterministic and two-point distributed jobs. For such instances, we give an $O(m)$-approximate list scheduling policy.
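As a point of reference for the list scheduling policy mentioned above, the following is a minimal sketch (not taken from the paper) of generic list scheduling for the minsum objective: jobs are taken in a fixed list order, and the next job always starts on the machine that becomes idle first. Function and variable names are illustrative, and processing times are deterministic here for simplicity.

```python
import heapq

def list_schedule(processing_times, m):
    """Greedy list scheduling: process jobs in the given list order, always
    starting the next job on the machine that becomes idle first.
    Returns the sum of completion times (the minsum objective)."""
    machines = [0.0] * m                   # finish time of the last job per machine
    heapq.heapify(machines)
    total = 0.0
    for p in processing_times:
        start = heapq.heappop(machines)    # earliest idle machine
        finish = start + p
        total += finish
        heapq.heappush(machines, finish)
    return total
```

For deterministic instances, running this with jobs sorted by shortest processing time first recovers the classical optimal SPT rule; the stochastic difficulty analyzed in the paper lies in choosing the list order when processing times are random.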
- Computing all Wardrop Equilibria parametrized by the Flow Demand (Max Klimm and Philipp Warode)
SODA 2019 – Proc. 30th ACM-SIAM Symposium on Discrete Algorithms, pp. 917–934.
@inproceedings{KlimmWarode2019,
author = {Max Klimm and Philipp Warode},
booktitle = {SODA 2019 – Proc. 30th ACM-SIAM Symposium on Discrete Algorithms},
doi = {10.1137/1.9781611975482.56},
pages = {917--934},
title = {Computing all {Wardrop} Equilibria parametrized by the Flow Demand},
year = {2019},
}
We develop an algorithm that computes for a given undirected or directed network with flow-dependent piecewise linear edge cost functions all Wardrop equilibria as a function of the flow demand. Our algorithm is based on Katzenelson's homotopy method for electrical networks. The algorithm uses a bijection between vertex potentials and flow excess vectors that is piecewise linear in the potential space and where each linear segment can be interpreted as an augmenting flow in a residual network. The algorithm iteratively increases the excess of one or more vertex pairs until the bijection reaches a point of non-differentiability. Then, the next linear region is chosen in a simplex-like pivot step and the algorithm proceeds. We first show that this algorithm correctly computes all Wardrop equilibria in undirected single-commodity networks along the chosen path of excess vectors. We then adapt our algorithm to also work for discontinuous cost functions, which allows us to model directed edges and/or edge capacities. Our algorithm is output-polynomial in non-degenerate instances where the solution curve never hits a point where the cost function of more than one edge becomes non-differentiable. For degenerate instances we still obtain an output-polynomial algorithm computing the linear segments of the bijection by a convex program. The latter technique also allows us to handle multiple commodities.
- A General Framework for Handling Commitment in Online Throughput Maximization (Lin Chen, Franziska Eberle, Nicole Megow, Kevin Schewior, and Cliff Stein)
IPCO 2019 – Proc. 20th Conference on Integer Programming and Combinatorial Optimization, Springer, pp. 141–154.
@inproceedings{ChenEberleMegow+2019,
author = {Lin Chen and Franziska Eberle and Nicole Megow and Kevin Schewior and Cliff Stein},
title = {A General Framework for Handling Commitment in Online Throughput Maximization},
booktitle = {IPCO 2019 – Proc. 20th Conference on Integer Programming and Combinatorial Optimization},
pages = {141--154},
publisher = {Springer},
year = {2019},
doi = {10.1007/978-3-030-17953-3_11},
}
We study a fundamental online job admission problem where jobs with deadlines arrive online over time at their release dates, and the task is to determine a preemptive single-server schedule which maximizes the number of jobs that complete on time. To circumvent known impossibility results, we make a standard slackness assumption by which the feasible time window for scheduling a job is at least $1+\varepsilon$ times its processing time, for some $\varepsilon > 0$. We quantify the impact that different provider commitment requirements have on the performance of online algorithms. Our main contribution is one universal algorithmic framework for online job admission both with and without commitments. Without commitment, our algorithm achieves a competitive ratio of $O(1/\varepsilon)$, which is best possible among deterministic algorithms for this problem. For commitment models, we give the first non-trivial performance bounds. If commitment decisions must be made before a job's slack becomes less than a $\delta$-fraction of its size, we prove a competitive ratio of $O(\varepsilon / ((\varepsilon - \delta)\delta^2))$, for $0 < \delta < \varepsilon$. When a provider must commit upon starting a job, our bound is $O(1/\varepsilon^2)$. Finally, we observe that for scheduling with commitment the restriction to the "unweighted" throughput model is essential; if jobs have individual weights, we rule out competitive deterministic algorithms.
- Travelling on Graphs with Small Highway Dimension (Yann Disser, Andreas Emil Feldmann, Max Klimm, and Jochen Könemann)
WG 2019 – Proc. 45th Workshop on Graph-Theoretic Concepts in Computer Science, pp. 175–189.
@inproceedings{DisserFeldmannKlimm+2019,
author = {Yann Disser and Andreas Emil Feldmann and Max Klimm and Jochen K{\"o}nemann},
booktitle = {WG 2019 – Proc. 45th Workshop on Graph-Theoretic Concepts in Computer Science},
doi = {10.1007/978-3-030-30786-8\_14},
editor = {Ignasi Sau and Dimitrios M. Thilikos},
pages = {175--189},
series = {LNCS},
title = {Travelling on Graphs with Small Highway Dimension},
volume = {11789},
year = {2019},
}
We study the Travelling Salesperson Problem (TSP) and the Steiner Tree Problem (STP) in graphs of low highway dimension. This graph parameter was introduced by Abraham et al. [SODA 2010] as a model for transportation networks, on which TSP and STP naturally occur for various applications in logistics. It was previously shown [Feldmann et al. ICALP 2015] that these problems admit a quasi-polynomial time approximation scheme (QPTAS) on graphs of constant highway dimension. We demonstrate that a significant improvement is possible in the special case when the highway dimension is 1, for which we present a fully polynomial-time approximation scheme (FPTAS). We also prove that STP is weakly $\mathsf{NP}$-hard for these restricted graphs. For TSP we show $\mathsf{NP}$-hardness for graphs of highway dimension 6, which answers an open problem posed in [Feldmann et al. ICALP 2015].
- The Online Best Reply Algorithm for Resource Allocation Problems (Max Klimm, Daniel Schmand, and Andreas Tönnis)
SAGT 2019 – Proc. 12th International Symposium on Algorithmic Game Theory, pp. 200–215.
@inproceedings{KlimmSchmandToennis2019,
author = {Max Klimm and Daniel Schmand and Andreas T{\"o}nnis},
booktitle = {SAGT 2019 – Proc. 12th International Symposium on Algorithmic Game Theory},
doi = {10.1007/978-3-030-30473-7\_14},
editor = {Dimitris Fotakis and Evangelos Markakis},
pages = {200--215},
series = {LNCS},
title = {The Online Best Reply Algorithm for Resource Allocation Problems},
volume = {11801},
year = {2019},
}
We study the performance of a best reply algorithm for online resource allocation problems with a diseconomy of scale. In an online resource allocation problem, we are given a set of resources and a set of requests that arrive in an online manner. Each request consists of a set of feasible allocations and an allocation is a set of resources. The total cost of an allocation vector is given by the sum of the resources' costs, where each resource's cost depends on the total load on the resource under the allocation vector. We analyze the natural online procedure where each request is allocated greedily to a feasible set of resources that minimizes the individual cost of that particular request. In the literature, this algorithm is also known as a one-round walk in congestion games starting from the empty state. For unweighted resource allocation problems with polynomial cost functions with maximum degree $d$, upper bounds on the competitive ratio of this greedy algorithm were known only for the special cases $d \in \{1,2,3\}$. In this paper, we show a general upper bound on the competitive ratio of $d(d/W(\frac{1.2d-1}{d+1}))^{d+1}$ for the unweighted case where $W$ denotes the Lambert-$W$ function on $\mathbb{R}_{\geq 0}$. For the weighted case, we show that the competitive ratio of the greedy algorithm is bounded from above by $(d/W(\frac{d}{d+1}))^{d+1}$.
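The one-round walk described above admits a compact sketch. The following is an illustrative Python rendering (not from the paper) of the unweighted greedy procedure under per-unit resource costs $l_r(x) = x^d$; all identifiers are assumptions for the example.

```python
def online_best_reply(requests, d):
    """One-round walk from the empty state: each request (a list of feasible
    allocations, each an iterable of resource ids) greedily picks the
    allocation minimizing its own individual cost under per-unit costs
    l_r(x) = x**d. Unweighted case: a request adds load 1 to each chosen
    resource. Returns the final loads and the total (social) cost."""
    load = {}
    for alternatives in requests:
        # individual cost of allocation A: sum of l_r(load after joining) over r in A
        best = min(alternatives,
                   key=lambda A: sum((load.get(r, 0) + 1) ** d for r in A))
        for r in best:
            load[r] = load.get(r, 0) + 1
    total = sum(x * x ** d for x in load.values())  # sum_r load_r * l_r(load_r)
    return load, total
```

For instance, with $d = 2$, a first request fixed to resource `a` and a second request choosing between `a` and `b`, the second request avoids the congested resource `a` because its marginal cost there would be $(1+1)^2 = 4$ versus $1$ on `b`.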
- Sensitivity Analysis for Convex Separable Optimization Over Integral Polymatroids (Tobias Harks, Max Klimm, and Britta Peis)
SIAM Journal on Optimization, 28(3):2222–2245, 2018.
@article{HarksKlimmPeis2018,
author = {Tobias Harks and Max Klimm and Britta Peis},
doi = {10.1137/16M1107450},
journal = {SIAM Journal on Optimization},
number = {3},
pages = {2222--2245},
title = {Sensitivity Analysis for Convex Separable Optimization Over Integral Polymatroids},
volume = {28},
year = {2018},
}
We study the sensitivity of optimal solutions of convex separable optimization problems over an integral polymatroid base polytope with respect to parameters determining both the cost of each element and the polytope. Under convexity and a regularity assumption on the functional dependency of the cost function with respect to the parameters, we show that reoptimization after a change in parameters can be done by elementary local operations. Applying this result, we derive that starting from any optimal solution, there is a new optimal solution to new parameters such that the $L_1$-norm of the difference of the two solutions is at most two times the $L_1$-norm of the difference of the parameters. We apply these sensitivity results to a class of noncooperative games with a finite set of players where a strategy of a player is to choose a vector in a player-specific integral polymatroid base polytope defined on a common set of elements. The players' private cost functions are regular, convex-separable, and the cost of each element is a nondecreasing function of the own usage of that element and the overall usage of the other players. Under these assumptions, we establish the existence of a pure Nash equilibrium. The existence is proven by an algorithm computing a pure Nash equilibrium that runs in polynomial time whenever the rank of the polymatroid base polytope is polynomially bounded. Both the existence result and the algorithm generalize and unify previous results appearing in the literature. We finally complement our results by showing that polymatroids are the maximal combinatorial structure enabling these results. For any nonpolymatroid region, there is a corresponding optimization problem for which the sensitivity results do not hold. In addition, there is a game where the players' strategies are isomorphic to the nonpolymatroid region and that does not admit a pure Nash equilibrium.
- Bottleneck Routing with Elastic Demands (Tobias Harks, Max Klimm, and Manuel Schneider)
Operations Research Letters, 46(1):93–98, 2018.
@article{HarksKlimmSchneider2018,
author = {Tobias Harks and Max Klimm and Manuel Schneider},
doi = {10.1016/j.orl.2017.11.007},
journal = {Operations Research Letters},
number = {1},
pages = {93--98},
title = {Bottleneck Routing with Elastic Demands},
volume = {46},
year = {2018},
}
Bottleneck routing games are a well-studied model to investigate the impact of selfish behavior in communication networks. In this model, each user selects a path in a network for routing her fixed demand. The disutility of a user only depends on the most congested link visited. We extend this model by allowing users to continuously vary the demand rate at which data is sent along the chosen path. As our main result we establish tight conditions for the existence of pure strategy Nash equilibria.
- Demand-Independent Optimal Tolls (Riccardo Colini-Baldeschi, Max Klimm, and Marco Scarsini)
ICALP 2018 – Proc. 45th International Colloquium on Automata, Languages and Programming, pp. 151:1–151:14.
@inproceedings{Colini-BaldeschiKlimmScarsini2018,
author = {Riccardo Colini{-}Baldeschi and Max Klimm and Marco Scarsini},
booktitle = {ICALP 2018 – Proc. 45th International Colloquium on Automata, Languages and Programming},
doi = {10.4230/LIPIcs.ICALP.2018.151},
editor = {Ioannis Chatzigiannakis and Christos Kaklamanis and D{\'a}niel Marx and Donald Sannella},
pages = {151:1--151:14},
series = {LIPIcs},
title = {Demand-Independent Optimal Tolls},
volume = {107},
year = {2018},
}
Wardrop equilibria in nonatomic congestion games are in general inefficient as they do not induce an optimal flow that minimizes the total travel time. Network tolls are a prominent and popular way to induce an optimum flow in equilibrium. The classical approach to find such tolls is marginal cost pricing which requires the exact knowledge of the demand on the network. In this paper, we investigate under which conditions demand-independent optimum tolls exist that induce the system optimum flow for any travel demand in the network. We give several characterizations for the existence of such tolls both in terms of the cost structure and the network structure of the game. Specifically we show that demand-independent optimum tolls exist if and only if the edge cost functions are shifted monomials as used by the Bureau of Public Roads. Moreover, non-negative demand-independent optimum tolls exist when the network is a directed acyclic multi-graph. Finally, we show that any network with a single origin-destination pair admits demand-independent optimum tolls that, although not necessarily non-negative, satisfy a budget constraint.
- Distance-Preserving Graph Contractions (Aaron Bernstein, Karl Däubel, Yann Disser, Max Klimm, Torsten Mütze, and Frieder Smolny)
ITCS 2018 – Proc. 9th Innovations in Computer Science Conference, pp. 51:1–51:14.
@inproceedings{BernsteinDaeubelDisser+2018,
author = {Aaron Bernstein and Karl D{\"a}ubel and Yann Disser and Max Klimm and Torsten M{\"u}tze and Frieder Smolny},
booktitle = {ITCS 2018 – Proc. 9th Innovations in Computer Science Conference},
doi = {10.4230/LIPIcs.ITCS.2018.51},
editor = {Anna R. Karlin},
pages = {51:1--51:14},
series = {LIPIcs},
title = {Distance-Preserving Graph Contractions},
volume = {94},
year = {2018},
}
Compression and sparsification algorithms are frequently applied in a preprocessing step before analyzing or optimizing large networks/graphs. In this paper we propose and study a new framework for contracting edges of a graph (merging vertices into super-vertices) with the goal of preserving pairwise distances as accurately as possible. Formally, given an edge-weighted graph, the contraction should guarantee that for any two vertices at distance $d$, the corresponding super-vertices remain at distance at least $\varphi(d)$ in the contracted graph, where $\varphi$ is a tolerance function bounding the permitted distance distortion. We present a comprehensive picture of the algorithmic complexity of the contraction problem for affine tolerance functions $\varphi(x) = x/\alpha + \beta$, where $\alpha \geq 1$ and $\beta \geq 0$ are arbitrary real-valued parameters. Specifically, we present polynomial-time algorithms for trees as well as hardness and inapproximability results for different graph classes, precisely separating easy and hard cases. Further, we analyze the asymptotic behavior of the size of contractions, and find efficient algorithms to compute (non-optimal) contractions despite our hardness results.
- Complexity and Approximation of the Continuous Network Design Problem (Martin Gairing, Tobias Harks, and Max Klimm)
SIAM Journal on Optimization, 27(3):1554–1582, 2017.
@article{GairingHarksKlimm2017,
author = {Martin Gairing and Tobias Harks and Max Klimm},
doi = {10.1137/15M1016461},
journal = {SIAM Journal on Optimization},
number = {3},
pages = {1554--1582},
title = {Complexity and Approximation of the Continuous Network Design Problem},
volume = {27},
year = {2017},
}
We revisit a classical problem in transportation, known as the (bilevel) continuous network design problem, CNDP for short. Given a graph for which the latency of each edge depends on the ratio of the edge flow and the capacity installed, the goal is to find an optimal investment in edge capacities so as to minimize the sum of the routing costs of the induced Wardrop equilibrium and the investment costs for installing the edges' capacities. While this problem is considered to be challenging in the literature, its complexity status was still unknown. We close this gap, showing that CNDP is strongly $\mathsf{NP}$-hard and $\mathsf{APX}$-hard, both on directed and undirected networks and even for instances with affine latencies. As for the approximation of the problem, we first provide a detailed analysis for a heuristic studied by Marcotte for the special case of monomial latency functions [P. Marcotte, Math. Prog., 34 (1986), pp. 142–162]. We derive a closed form expression of its approximation guarantee for arbitrary sets of latency functions. We then propose a different approximation algorithm and show that it has the same approximation guarantee. Then, we prove that using the better of the two approximation algorithms results in a strictly improved approximation guarantee for which we derive a closed form expression. For affine latencies, for example, this best-of-two approach achieves an approximation factor of $49/41 \approx 1.195$, which improves on the factor of $5/4$ that has been shown before by Marcotte.
- Impartial Selection and the Power of Up to Two Choices (Antje Bjelde, Felix A. Fischer, and Max Klimm)
ACM Transactions on Economics and Computation, 5(4):21:1–21:20, 2017.
@article{BjeldeFischerKlimm2017,
author = {Antje Bjelde and Felix A. Fischer and Max Klimm},
doi = {10.1145/3107922},
journal = {ACM Transactions on Economics and Computation},
number = {4},
pages = {21:1--21:20},
title = {Impartial Selection and the Power of Up to Two Choices},
volume = {5},
year = {2017},
}
We study mechanisms that select members of a set of agents based on nominations by other members and that are impartial in the sense that agents cannot influence their own chance of selection. Prior work has shown that deterministic mechanisms for selecting any fixed number $k$ of agents are severely limited and cannot extract a constant fraction of the nominations of the $k$ most highly nominated agents. We prove here that this impossibility result can be circumvented by allowing the mechanism to sometimes but not always select fewer than $k$ agents. This added flexibility also improves the performance of randomized mechanisms, for which we show a separation between mechanisms that make exactly two or up to two choices and give upper and lower bounds for mechanisms allowed more than two choices.
- Packing a Knapsack of Unknown Capacity (Yann Disser, Max Klimm, Nicole Megow, and Sebastian Stiller)
SIAM Journal on Discrete Mathematics, 31(3):1477–1497, 2017.
@article{DisserKlimmMegow+2017,
author = {Yann Disser and Max Klimm and Nicole Megow and Sebastian Stiller},
doi = {10.1137/16M1070049},
journal = {SIAM Journal on Discrete Mathematics},
number = {3},
pages = {1477--1497},
title = {Packing a Knapsack of Unknown Capacity},
volume = {31},
year = {2017},
}
We study the problem of packing a knapsack without knowing its capacity. Whenever we attempt to pack an item that does not fit, the item is discarded; if the item fits, we have to include it in the packing. We show that there is always a policy that packs a value within factor 2 of the optimum packing, irrespective of the actual capacity. If all items have unit density, we achieve a factor equal to the golden ratio $\varphi \approx 1.618$. Both factors are shown to be best possible. In fact, we obtain the above factors using packing policies that are universal in the sense that they fix a particular order of the items in the beginning and try to pack the items in this order, without changing the order later on. We give efficient algorithms computing these policies. On the other hand, we show that, for any $\alpha > 1$, the problem of deciding whether a given universal policy achieves a factor of $\alpha$ is $\mathsf{coNP}$-complete. If $\alpha$ is part of the input, the same problem is shown to be $\mathsf{coNP}$-complete for items with unit densities. Finally, we show that it is $\mathsf{coNP}$-hard to decide, for given $\alpha$, whether a set of items admits a universal policy with factor $\alpha$, even if all items have unit densities.
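The universal-policy model is easy to simulate; the following minimal sketch shows the mechanics. The instance and the ordering by value are illustrative only; they are not the 2-approximate order constructed in the paper.

```python
# Sketch of the universal-policy model: a policy fixes an item order once,
# then attempts the items in that order for whatever capacity is realized.
# An item that does not fit is discarded; an item that fits must be packed.
# The ordering used here (non-increasing value) is purely illustrative --
# it is NOT the 2-approximate order constructed in the paper.

def pack(order, sizes, values, capacity):
    """Simulate a universal policy for one realized capacity."""
    total_value, remaining = 0, capacity
    for i in order:
        if sizes[i] <= remaining:     # fits: the policy must include it
            remaining -= sizes[i]
            total_value += values[i]
    return total_value

sizes  = [4, 3, 2, 1]
values = [5, 4, 3, 2]
order  = sorted(range(len(sizes)), key=lambda i: -values[i])
for capacity in (1, 3, 5, 10):
    print(capacity, pack(order, sizes, values, capacity))
```

The point of the model is that the order is committed to before the capacity is known, so a single order must perform well against every possible capacity simultaneously.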
- Brief Announcement: Approximation Algorithms for Unsplittable Resource Allocation Problems with Diseconomies of Scale (Antje Bjelde, Max Klimm, and Daniel Schmand)
SPAA 2017 – Proc. 29th ACM Symposium on Parallelism in Algorithms and Architectures, pp. 227–229.
@inproceedings{BjeldeKlimmSchmand2017,
author = {Antje Bjelde and Max Klimm and Daniel Schmand},
booktitle = {SPAA 2017 – Proc. 29th ACM Symposium on Parallelism in Algorithms and Architectures},
doi = {10.1145/3087556.3087597},
editor = {Christian Scheideler and Mohammad Taghi Hajiaghayi},
pages = {227--229},
title = {Brief Announcement: Approximation Algorithms for Unsplittable Resource Allocation Problems with Diseconomies of Scale},
year = {2017},
}
We study general resource allocation problems with a diseconomy of scale. Given a finite set of commodities that request certain resources, the cost of each resource grows superlinearly with the demand for it, and our goal is to minimize the total cost of the resources. In large systems with limited coordination, it is natural to consider local dynamics where in each step a single commodity switches its allocated resources whenever the new solution after the switch has smaller total cost over all commodities. This yields a deterministic and polynomial time algorithm with approximation factor arbitrarily close to the locality gap, i.e., the worst case ratio of the cost of a locally optimal and a globally optimal solution. For costs that are polynomials with non-negative coefficients and maximal degree $d$, we provide a locality gap for weighted problems that is tight for all values of $d$. For unweighted problems, the locality gap asymptotically matches the approximation guarantee of the currently best known centralized algorithm [Makarychev, Sviridenko FOCS 2014] but only requires local knowledge of the commodities.
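The local dynamics described above can be sketched in a few lines. This is a toy unweighted instance under illustrative assumptions (each commodity chooses one of two candidate resource sets, unit demands, resource cost load², a degree-2 polynomial with non-negative coefficients); it only shows the shape of the dynamics, not the paper's analysis.

```python
# Sketch of single-commodity improvement dynamics: a commodity switches
# its allocated resource set whenever the switch lowers the total cost
# summed over all resources. Resource cost here is load**2 (illustrative).

def total_cost(choice, options, num_resources):
    load = [0] * num_resources
    for c, pick in enumerate(choice):
        for r in options[c][pick]:
            load[r] += 1
    return sum(l ** 2 for l in load)

def local_search(options, num_resources):
    choice = [0] * len(options)            # start: every commodity on option 0
    improved = True
    while improved:                        # repeat until a local optimum
        improved = False
        for c in range(len(options)):
            for alt in range(len(options[c])):
                trial = choice[:c] + [alt] + choice[c + 1:]
                if total_cost(trial, options, num_resources) < total_cost(choice, options, num_resources):
                    choice, improved = trial, True
    return choice

# Two commodities, each choosing resource 0 or resource 1; the dynamics
# spread them out (cost 2) instead of piling both on one resource (cost 4).
options = [[(0,), (1,)], [(0,), (1,)]]
final = local_search(options, 2)
```

The quality of the local optimum reached this way is exactly what the locality gap measures.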
- MatchTheNet - An Educational Game on 3-Dimensional Polytopes (Multimedia Contribution) (Michael Joswig, Georg Loho, Benjamin Lorenz, and Rico Raber)
SoCG 2017 – Proc. 33rd International Symposium on Computational Geometry, pp. 66:1–66:5.
@inproceedings{JoswigLohoLorenz+2017,
author = {Michael Joswig and Georg Loho and Benjamin Lorenz and Rico Raber},
booktitle = {SoCG 2017 – Proc. 33rd International Symposium on Computational Geometry},
doi = {10.4230/LIPIcs.SoCG.2017.66},
editor = {Boris Aronov and Matthew J. Katz},
pages = {66:1--66:5},
series = {LIPIcs},
title = {MatchTheNet - An Educational Game on 3-Dimensional Polytopes (Multimedia Contribution)},
volume = {77},
year = {2017},
}
We present an interactive game which challenges a single player to match 3-dimensional polytopes to their planar nets. It is open source, and it runs in standard web browsers.
- Optimal Impartial Selection (Felix A. Fischer and Max Klimm)
SIAM Journal on Computing, 44(5):1263–1285, 2015.
@article{FischerKlimm2015,
author = {Felix A. Fischer and Max Klimm},
doi = {10.1137/140995775},
journal = {SIAM Journal on Computing},
number = {5},
pages = {1263--1285},
title = {Optimal Impartial Selection},
volume = {44},
year = {2015},
}
We study a fundamental problem in social choice theory, the selection of a member of a set of agents based on impartial nominations by agents from that set. Studied previously by Alon et al. [Proceedings of TARK, 2011, pp. 101–110] and by Holzman and Moulin [Econometrica, 81 (2013), pp. 173–196], this problem arises when representatives are selected from within a group or when publishing or funding decisions are made based on a process of peer review. Our main result concerns a randomized mechanism that in expectation selects an agent with at least half the maximum number of nominations. This is best possible subject to impartiality and resolves a conjecture of Alon et al. Further results are given for the case where some agent receives many nominations and the case where each agent casts at least one nomination.
- Equilibria in a Class of Aggregative Location Games (Max Klimm and Tobias Harks)
Journal of Mathematical Economics, 61(1):211–220, 2015.
@article{KlimmHarks2015,
author = {Max Klimm and Tobias Harks},
doi = {10.1016/j.jmateco.2015.09.006},
journal = {Journal of Mathematical Economics},
number = {1},
pages = {211--220},
title = {Equilibria in a Class of Aggregative Location Games},
volume = {61},
year = {2015},
}
Consider a multimarket oligopoly, where firms have a single license that allows them to supply exactly one market out of a given set of markets. How does the restriction to supply only one market influence the existence of equilibria in the game? To answer this question, we study a general class of aggregative location games where a strategy of a player is to choose simultaneously both a location out of a finite set and a non-negative quantity out of a compact interval. The utility of each player is assumed to depend solely on the chosen location, the chosen quantity, and the aggregated quantity of all other players on the chosen location. We show that each game in this class possesses a pure Nash equilibrium whenever the players' utility functions satisfy the assumptions of negative externality, decreasing marginal utility, continuity, and location symmetry. We also provide examples exhibiting that, if one of the assumptions is violated, a pure Nash equilibrium may fail to exist.
- Computing Network Tolls with Support Constraints (Tobias Harks, Ingo Kleinert, Max Klimm, and Rolf H. Möhring)
Networks, 65(3):262–285, 2015.
@article{HarksKleinertKlimm+2015,
author = {Tobias Harks and Ingo Kleinert and Max Klimm and Rolf H. M{\"o}hring},
doi = {10.1002/net.21604},
journal = {Networks},
number = {3},
pages = {262--285},
title = {Computing Network Tolls with Support Constraints},
volume = {65},
year = {2015},
}
Reducing traffic congestion via toll pricing has been a central topic in the operations research and transportation literature and, recently, it has been implemented in several cities all over the world. Since, in practice, it is not feasible to impose tolls on every edge of a given traffic network, we study the resulting mathematical problem of computing tolls on a predefined subset of edges of the network so as to minimize the total travel time of the induced equilibrium flow. We first present an analytical study for the special case of parallel edge networks highlighting the intrinsic complexity and nonconvexity of the resulting optimization problem. We then present algorithms for general networks for which we systematically test the solution quality for large-scale network instances. Finally, we discuss the related optimization problem of computing tolls subject to a cardinality constraint on the number of edges that have tolls.
- Improving the Hk-Bound on the Price of Stability in Undirected Shapley Network Design Games (Yann Disser, Andreas Emil Feldmann, Max Klimm, and Matúš Mihalák)
Theoretical Computer Science, 562:557–564, 2015.
@article{DisserFeldmannKlimm+2015,
author = {Yann Disser and Andreas Emil Feldmann and Max Klimm and Mat{\'u}{\v s} Mihal{\'a}k},
doi = {10.1016/j.tcs.2014.10.037},
journal = {Theoretical Computer Science},
pages = {557--564},
title = {Improving the Hk-Bound on the Price of Stability in Undirected Shapley Network Design Games},
volume = {562},
year = {2015},
}
In this article we show that the price of stability of Shapley network design games on undirected graphs with $k$ players is at most $\frac{k^3(k+1)/2 - k^2}{1 + k^3(k+1)/2 - k^2}H_k = (1 - \mathcal{O}(1/k^4))H_k$, where $H_k$ denotes the $k$-th harmonic number. This improves on the known upper bound of $H_k$, which is also valid for directed graphs but for these, in contrast, is tight. Hence, we give the first non-trivial upper bound on the price of stability for undirected Shapley network design games that is valid for an arbitrary number of players. Our bound is proved by analyzing the price of stability restricted to Nash equilibria that minimize the potential function of the game. We also present a game with $k=3$ players in which such a restricted price of stability is $1.634$. This shows that the analysis of Bilò and Bove (2011) is tight. In addition, we give an example for three players that improves the lower bound on the (unrestricted) price of stability to $1.571$.
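To make the improvement over $H_k$ concrete, the factor in front of $H_k$ can be evaluated exactly for small $k$; a quick numeric check using exact rationals:

```python
# Numeric sketch of the bound from the abstract: the factor
# (k^3(k+1)/2 - k^2) / (1 + k^3(k+1)/2 - k^2) multiplying H_k is
# strictly below 1 for every k >= 2, so the bound improves on H_k.
from fractions import Fraction

def harmonic(k):
    """k-th harmonic number H_k as an exact rational."""
    return sum(Fraction(1, i) for i in range(1, k + 1))

def bound(k):
    a = Fraction(k**3 * (k + 1), 2) - k**2
    return a / (1 + a) * harmonic(k)

for k in (2, 3, 10):
    assert bound(k) < harmonic(k)
```

For $k = 2$, for instance, the factor is $8/9$, giving the bound $8/9 \cdot H_2 = 4/3$ instead of $H_2 = 3/2$; as $k$ grows the factor approaches 1 at rate $1 - \mathcal{O}(1/k^4)$, matching the abstract.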
- Scheduling Bidirectional Traffic on a Path (Yann Disser, Max Klimm, and Elisabeth Lübbecke)
ICALP 2015 – Proc. 42nd International Colloquium on Automata, Languages and Programming, pp. 406–418.
@inproceedings{DisserKlimmLuebecke15,
author = {Yann Disser and Max Klimm and Elisabeth L{\"u}bbecke},
booktitle = {ICALP 2015 – Proc. 42nd International Colloquium on Automata, Languages and Programming},
doi = {10.1007/978-3-662-47672-7\_33},
editor = {Magn{\'u}s M. Halld{\'o}rsson and Kazuo Iwama and Naoki Kobayashi and Bettina Speckmann},
pages = {406--418},
series = {LNCS},
title = {Scheduling Bidirectional Traffic on a Path},
volume = {9134},
year = {2015},
}
We study the fundamental problem of scheduling bidirectional traffic along a path composed of multiple segments. The main feature of the problem is that jobs traveling in the same direction can be scheduled in quick succession on a segment, while jobs in opposing directions cannot cross a segment at the same time. We show that this tradeoff makes the problem significantly harder than the related flow shop problem, by proving that it is $\mathsf{NP}$-hard even for identical jobs. We complement this result with a PTAS for a single segment and non-identical jobs. If we allow some pairs of jobs traveling in different directions to cross a segment concurrently, the problem becomes $\mathsf{APX}$-hard even on a single segment and with identical jobs. We give polynomial algorithms for the setting with restricted compatibilities between jobs on a single and any constant number of segments, respectively.
- Impartial Selection and the Power of up to Two Choices (Antje Bjelde, Felix A. Fischer, and Max Klimm)
WINE 2015 – Proc. 11th Conference on Web and Internet Economics, pp. 146–158.
@inproceedings{BjeldeFischerKlimm2015,
author = {Antje Bjelde and Felix A. Fischer and Max Klimm},
booktitle = {WINE 2015 – Proc. 11th Conference on Web and Internet Economics},
doi = {10.1007/978-3-662-48995-6\_11},
editor = {Evangelos Markakis and Guido Sch{\"a}fer},
pages = {146--158},
series = {LNCS},
title = {Impartial Selection and the Power of up to Two Choices},
volume = {9470},
year = {2015},
}
We study mechanisms that select members of a set of agents based on nominations by other members and that are impartial in the sense that agents cannot influence their own chance of selection. Prior work has shown that deterministic mechanisms for selecting any fixed number of agents are severely limited, whereas randomization allows for the selection of a single agent that in expectation receives at least $1/2$ of the maximum number of nominations. The bound of $1/2$ is in fact best possible subject to impartiality. We prove here that the same bound can also be achieved deterministically by sometimes but not always selecting a second agent. We then show a separation between randomized mechanisms that make exactly two or up to two choices, and give upper and lower bounds on the performance of mechanisms allowed more than two choices.
- Bottleneck Routing with Elastic Demands (Tobias Harks, Max Klimm, and Manuel Schneider)
WINE 2015 – Proc. 11th Conference on Web and Internet Economics, pp. 384–397.
@inproceedings{HarksKlimmSchneider2015,
author = {Tobias Harks and Max Klimm and Manuel Schneider},
booktitle = {WINE 2015 – Proc. 11th Conference on Web and Internet Economics},
doi = {10.1007/978-3-662-48995-6\_28},
editor = {Evangelos Markakis and Guido Sch{\"a}fer},
pages = {384--397},
series = {LNCS},
title = {Bottleneck Routing with Elastic Demands},
volume = {9470},
year = {2015},
}
Bottleneck routing games are a well-studied model to investigate the impact of selfish behavior in communication networks. In this model, each user selects a path in a network for routing their fixed demand. The disutility of a user depends only on the most congested link visited. We extend this model by allowing users to continuously vary the demand rate at which data is sent along the chosen path. As our main result we establish tight conditions for the existence of pure strategy Nash equilibria.
- Sharing Non-anonymous Costs of Multiple Resources Optimally (Max Klimm and Daniel Schmand)
CIAC 2015 – Proc. 9th International Conference on Algorithms and Complexity, pp. 274–287.
@inproceedings{KlimmSchmand2015,
author = {Max Klimm and Daniel Schmand},
booktitle = {CIAC 2015 – Proc. 9th International Conference on Algorithms and Complexity},
doi = {10.1007/978-3-319-18173-8\_20},
editor = {Vangelis Th. Paschos and Peter Widmayer},
pages = {274--287},
series = {LNCS},
title = {Sharing Non-anonymous Costs of Multiple Resources Optimally},
volume = {9079},
year = {2015},
}
In cost sharing games, the existence and efficiency of pure Nash equilibria fundamentally depend on the method that is used to share the resources' costs. We consider a general class of resource allocation problems in which a set of resources is used by a heterogeneous set of selfish users. The cost of a resource is a (non-decreasing) function of the set of its users. Under the assumption that the costs of the resources are shared by uniform cost sharing protocols, i.e., protocols that use only local information of the resource's cost structure and its users to determine the cost shares, we exactly quantify the inefficiency of the resulting pure Nash equilibria. Specifically, we show tight bounds on prices of stability and anarchy for games with only submodular and only supermodular cost functions, respectively, and an asymptotically tight bound for games with arbitrary set-functions. While all our upper bounds are attained for the well-known Shapley cost sharing protocol, our lower bounds hold for arbitrary uniform cost sharing protocols and are even valid for games with anonymous costs, i.e., games in which the cost of each resource only depends on the cardinality of the set of its users.
- Linear, Exponential, but Nothing Else (Max Klimm)
Chapter in Gems of Combinatorial Optimization and Graph Algorithms (Andreas S. Schulz, Martin Skutella, Sebastian Stiller, Dorothea Wagner, eds.), Springer, 2015.
@incollection{Klimm2015,
author = {Max Klimm},
booktitle = {Gems of Combinatorial Optimization and Graph Algorithms},
doi = {10.1007/978-3-319-24971-1\_11},
editor = {Andreas S. Schulz and Martin Skutella and Sebastian Stiller and Dorothea Wagner},
pages = {113--123},
publisher = {Springer},
title = {Linear, Exponential, but Nothing Else},
year = {2015},
}
We consider two seemingly unrelated resource allocation problems and show that they share a deep structural property. In the first problem, we schedule jobs on a single machine to minimize the sum of the jobs' cost where each job's cost is determined by a job-specific function of its completion time. In the second problem, we consider weighted congestion games and are interested in the existence of pure Nash equilibria. We show that the classes of delay cost functions for which the scheduling problem admits a priority rule are exactly the classes of resource cost functions that guarantee the existence of a pure Nash equilibrium in weighted congestion games. These classes of cost functions are those that contain only linear functions or exponential functions, but not both.
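The scheduling side of this equivalence can be illustrated for linear delay costs, where the priority rule is the classical Smith rule: sequence jobs by non-decreasing ratio of processing time to weight. The brute-force check below on a toy instance is illustrative and not taken from the chapter.

```python
# Sketch for the linear case: with delay cost w_j * C_j, ordering jobs by
# the ratio p_j / w_j (Smith's rule) minimizes the total cost, which we
# confirm against all permutations of a tiny instance.
from itertools import permutations

def total_cost(order, p, w):
    """Sum of w_j * C_j when jobs run in the given order on one machine."""
    t, cost = 0, 0
    for j in order:
        t += p[j]          # completion time of job j
        cost += w[j] * t
    return cost

p, w = [3, 1, 2], [1, 2, 1]                       # processing times, weights
smith = sorted(range(3), key=lambda j: p[j] / w[j])
best = min(total_cost(list(o), p, w) for o in permutations(range(3)))
assert total_cost(smith, p, w) == best
```

For exponential delay cost functions an analogous ratio-based priority rule exists; the chapter's point is that linear and exponential functions are the only classes with this property, and that the same dichotomy governs equilibrium existence in weighted congestion games.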
- Optimal Impartial Selection (Felix A. Fischer and Max Klimm)
EC 2014 – Proc. 15th ACM Conference on Economics and Computation, pp. 803–820.
@inproceedings{FischerKlimm2014,
author = {Felix A. Fischer and Max Klimm},
booktitle = {EC 2014 – Proc. 15th ACM Conference on Economics and Computation},
doi = {10.1145/2600057.2602836},
editor = {Moshe Babaioff and Vincent Conitzer and David A. Easley},
pages = {803--820},
title = {Optimal Impartial Selection},
year = {2014},
}
We study the problem of selecting a member of a set of agents based on impartial nominations by agents from that set. The problem was studied previously by Alon et al. and by Holzman and Moulin and has important applications in situations where representatives are selected from within a group or where publishing or funding decisions are made based on a process of peer review. Our main result concerns a randomized mechanism that in expectation selects an agent with at least half the maximum number of nominations. Subject to impartiality, this is best possible.
- Packing a Knapsack of Unknown Capacity (Yann Disser, Max Klimm, Nicole Megow, and Sebastian Stiller)
STACS 2014 – Proc. 31st International Symposium on Theoretical Aspects of Computer Science, pp. 276–287.
@inproceedings{DisserKlimmMegow+2014,
author = {Yann Disser and Max Klimm and Nicole Megow and Sebastian Stiller},
booktitle = {STACS 2014 – Proc. 31st International Symposium on Theoretical Aspects of Computer Science},
doi = {10.4230/LIPIcs.STACS.2014.276},
editor = {Ernst W. Mayr and Natacha Portier},
pages = {276--287},
series = {LIPIcs},
title = {Packing a Knapsack of Unknown Capacity},
volume = {25},
year = {2014},
}
We study the problem of packing a knapsack without knowing its capacity. Whenever we attempt to pack an item that does not fit, the item is discarded; if the item fits, we have to include it in the packing. We show that there is always a policy that packs a value within factor 2 of the optimum packing, irrespective of the actual capacity. If all items have unit density, we achieve a factor equal to the golden ratio $\varphi \approx 1.618$. Both factors are shown to be best possible. In fact, we obtain the above factors using packing policies that are universal in the sense that they fix a particular order of the items and try to pack the items in this order, independent of the observations made while packing. We give efficient algorithms computing these policies. On the other hand, we show that, for any $\alpha > 1$, the problem of deciding whether a given universal policy achieves a factor of $\alpha$ is $\mathsf{coNP}$-complete. If $\alpha$ is part of the input, the same problem is shown to be $\mathsf{coNP}$-complete for items with unit densities. Finally, we show that it is $\mathsf{coNP}$-hard to decide, for given $\alpha$, whether a set of items admits a universal policy with factor $\alpha$, even if all items have unit densities.
- Complexity and Approximation of the Continuous Network Design Problem (Martin Gairing, Tobias Harks, and Max Klimm)
APPROX 2014 – Proc. 17th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, pp. 226–241.
@inproceedings{GairingHarksKlimm2014,
author = {Martin Gairing and Tobias Harks and Max Klimm},
booktitle = {APPROX 2014 – Proc. 17th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems},
doi = {10.4230/LIPIcs.APPROX-RANDOM.2014.226},
editor = {Klaus Jansen and Jos{\'e} D. P. Rolim and Nikhil R. Devanur and Cristopher Moore},
pages = {226--241},
series = {LIPIcs},
title = {Complexity and Approximation of the Continuous Network Design Problem},
volume = {28},
year = {2014},
}
We revisit a classical problem in transportation, known as the continuous (bilevel) network design problem, CNDP for short. Given a graph for which the latency of each edge depends on the ratio of the edge flow and the capacity installed, the goal is to find an optimal investment in edge capacities so as to minimize the sum of the routing cost of the induced Wardrop equilibrium and the investment cost for installing the capacity. While this problem is considered challenging in the literature, its complexity status was still unknown. We close this gap, showing that CNDP is strongly $\mathsf{NP}$-complete and $\mathsf{APX}$-hard, both on directed and undirected networks and even for instances with affine latencies. As for the approximation of the problem, we first provide a detailed analysis for a heuristic studied by Marcotte for the special case of monomial latency functions (Math. Program., Vol. 34, 1986). We derive a closed form expression of its approximation guarantee for arbitrary sets of latency functions. We then propose a different approximation algorithm and show that it has the same approximation guarantee. However, we show that using the better of the two approximation algorithms results in a strictly improved approximation guarantee for which we derive a closed form expression. For affine latencies, e.g., this algorithm achieves a 49/41-approximation which improves on the 5/4 that has been shown before by Marcotte. We finally discuss the case of hard budget constraints on the capacity investment.
- Approximate Pure Nash Equilibria in Weighted Congestion Games (Christoph Hansknecht, Max Klimm, and Alexander Skopalik)
APPROX 2014 – Proc. 17th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, pp. 242–257.
@inproceedings{HansknechtKlimmSkopalik2014,
author = {Christoph Hansknecht and Max Klimm and Alexander Skopalik},
booktitle = {APPROX 2014 – Proc. 17th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems},
doi = {10.4230/LIPIcs.APPROX-RANDOM.2014.242},
editor = {Klaus Jansen and Jos{\'e} D. P. Rolim and Nikhil R. Devanur and Cristopher Moore},
pages = {242--257},
series = {LIPIcs},
title = {Approximate Pure Nash Equilibria in Weighted Congestion Games},
volume = {28},
year = {2014},
}
We study the existence of approximate pure Nash equilibria in weighted congestion games and develop techniques to obtain approximate potential functions that prove the existence of $\alpha$-approximate pure Nash equilibria and the convergence of $\alpha$-improvement steps. Specifically, we show how to obtain upper bounds for the approximation factor $\alpha$ for a given class of cost functions. For example, for concave cost functions the factor is at most $3/2$, for quadratic cost functions it is at most $4/3$, and for polynomial cost functions of maximal degree $\ell$ it is at most $\ell + 1$. For games with two players we obtain tight bounds which are as small as, for example, $1.054$ in the case of quadratic cost functions.
- Multimarket Oligopolies with Restricted Market Access (Tobias Harks and Max Klimm)
SAGT 2014 – Proc. 7th International Symposium on Algorithmic Game Theory, pp. 182–193.
@inproceedings{HarksKlimm2014,
author = {Tobias Harks and Max Klimm},
booktitle = {SAGT 2014 – Proc. 7th International Symposium on Algorithmic Game Theory},
doi = {10.1007/978-3-662-44803-8\_16},
editor = {Ron Lavi},
pages = {182--193},
series = {LNCS},
title = {Multimarket Oligopolies with Restricted Market Access},
volume = {8768},
year = {2014},
}
We study the existence of Cournot equilibria in multimarket oligopolies under the additional restriction that every firm may sell its product only to a limited number of markets simultaneously. This situation naturally arises if market entry is subject to a valid license and each firm holds a fixed number of licenses only, or, equivalently, if the firms' short-term assets only suffice to serve up to a certain number of markets. We allow for firm-specific market reaction functions modeling heterogeneity among products. As our main result, we show the existence of a Cournot equilibrium under the following assumptions, stated informally: (i) cost functions are convex; (ii) the marginal return functions strictly decrease for strictly increased own quantities and non-decreased aggregated quantities; (iii) for every firm, the firm-specific price functions across markets are identical up to market-specific shifts. While assumptions (i) and (ii) are frequently imposed in the literature on single market oligopolies, only assumption (iii) seems limiting. We show, however, that if it is violated, there are games without a Cournot equilibrium.
- Resource Competition on Integral Polymatroids (Tobias Harks, Max Klimm, and Britta Peis)
WINE 2014 – Proc. 10th Conference on Web and Internet Economics, pp. 189–202.
@inproceedings{HarksKlimmPeis2014,
author = {Tobias Harks and Max Klimm and Britta Peis},
booktitle = {WINE 2014 – Proc. 10th Conference on Web and Internet Economics},
doi = {10.1007/978-3-319-13129-0\_14},
editor = {Tie{-}Yan Liu and Qi Qi and Yinyu Ye},
pages = {189--202},
series = {LNCS},
title = {Resource Competition on Integral Polymatroids},
volume = {8877},
year = {2014},
}
We study competitive resource allocation problems in which players distribute their demands integrally over a set of resources subject to player-specific submodular capacity constraints. Each player has to pay for each unit of demand a cost that is a non-decreasing and convex function of the total allocation of that resource. This general model of resource allocation generalizes both singleton congestion games with integer-splittable demands and matroid congestion games with player-specific costs. As our main result, we show that in such general resource allocation problems a pure Nash equilibrium is guaranteed to exist by giving a pseudo-polynomial algorithm computing a pure Nash equilibrium.
- Congestion Games with Higher Demand Dimensions (Max Klimm and Andreas Schütz)
WINE 2014 – Proc. 10th Conference on Web and Internet Economics, pp. 453–459.
@inproceedings{KlimmSchuetz2014,
author = {Max Klimm and Andreas Sch{\"u}tz},
booktitle = {WINE 2014 – Proc. 10th Conference on Web and Internet Economics},
doi = {10.1007/978-3-319-13129-0\_39},
editor = {Tie{-}Yan Liu and Qi Qi and Yinyu Ye},
pages = {453--459},
series = {LNCS},
title = {Congestion Games with Higher Demand Dimensions},
volume = {8877},
year = {2014},
}
We introduce a generalization of weighted congestion games in which players are associated with $k$-dimensional demand vectors and resource costs are functions $c : \mathbb{R}^k_{\geq 0} \to \mathbb{R}$ of the aggregated demand vector of the players using the resource. Such a cost structure is natural when the cost of a resource depends on different properties of the players' demands, e.g., total weight, total volume, and total number of items. A complete characterization of the existence of pure Nash equilibria in terms of the resource cost functions is given for all $k \in \mathbb{N}$.
- Computing Pure Nash and Strong Equilibria in Bottleneck Congestion Games (Tobias Harks, Martin Hoefer, Max Klimm, and Alexander Skopalik)
Mathematical Programming, 141(1-2):193–215, 2013.
@article{HarksHoeferKlimm+2013,
author = {Tobias Harks and Martin Hoefer and Max Klimm and Alexander Skopalik},
doi = {10.1007/s10107-012-0521-3},
journal = {Mathematical Programming},
number = {1-2},
pages = {193--215},
title = {Computing Pure Nash and Strong Equilibria in Bottleneck Congestion Games},
volume = {141},
year = {2013},
}
Bottleneck congestion games properly model the properties of many real-world network routing applications. They are known to possess strong equilibria, a strengthening of Nash equilibrium that is resilient against coalitional deviations. In this paper, we study the computational complexity of pure Nash and strong equilibria in these games. We provide a generic centralized algorithm to compute strong equilibria, which has polynomial running time for many interesting classes of games, such as matroid or single-commodity bottleneck congestion games. In addition, we examine the more demanding goal of reaching equilibria in polynomial time using natural improvement dynamics. Using unilateral improvement dynamics, pure Nash equilibria can be reached efficiently in matroid games. In contrast, computing even a single coalitional improvement move in matroid and single-commodity games is strongly $\mathsf{NP}$-hard. In addition, we establish a variety of hardness results and lower bounds on the duration of unilateral and coalitional improvement dynamics. These continue to hold even for convergence to approximate equilibria.
- Strong Equilibria in Games with the Lexicographical Improvement Property (Tobias Harks, Max Klimm, and Rolf H. Möhring)
International Journal of Game Theory, 42(2):461–482, 2013.
@article{HarksKlimmMoehring2013,
author = {Tobias Harks and Max Klimm and Rolf H. M{\"o}hring},
doi = {10.1007/s00182-012-0322-1},
journal = {International Journal of Game Theory},
number = {2},
pages = {461--482},
title = {Strong Equilibria in Games with the Lexicographical Improvement Property},
volume = {42},
year = {2013},
}
We study a class of finite strategic games with the property that every deviation of a coalition of players that is profitable to each of its members strictly decreases the lexicographical order of a certain function defined on the set of strategy profiles. We call this property the lexicographical improvement property (LIP) and show that, in finite games, it is equivalent to the existence of a generalized strong potential function. We use this characterization to derive existence, efficiency, and fairness properties of strong equilibria (SE). As our main result, we show that an important class of games that we call bottleneck congestion games has the LIP and thus the above-mentioned properties. For infinite games, the LIP implies neither the existence of a generalized strong potential nor the existence of an SE. We therefore introduce the slightly more general concept of the pairwise LIP and prove that whenever the pairwise LIP is satisfied for a continuous function, an SE exists. As a consequence, we show that splittable bottleneck congestion games with continuous facility cost functions possess an SE.
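As an illustrative sanity check (not taken from the paper), one can brute-force a tiny unweighted bottleneck congestion game with strictly increasing resource costs and verify the LIP restricted to coalitions of size one: every profitable unilateral deviation strictly decreases the non-increasingly sorted vector of resource costs in lexicographical order. All game data below (resources, cost multipliers, strategies) are hypothetical.

```python
import itertools

# Hypothetical unweighted bottleneck congestion game: resource cost
# c_r(x) = mult[r] * x (strictly increasing in the load x); a player's
# cost is the maximum cost over the resources she uses.
mult = {"a": 1, "b": 2, "c": 3}
strategies = [                      # two strategies (resource sets) per player
    [{"a"}, {"b", "c"}],
    [{"a", "b"}, {"c"}],
    [{"b"}, {"a", "c"}],
]

def load(profile, r):
    return sum(1 for s in profile if r in s)

def resource_costs(profile):
    # non-increasingly sorted vector of resource costs; Python compares
    # lists lexicographically, which is exactly the order used by the LIP
    return sorted((mult[r] * load(profile, r) for r in mult), reverse=True)

def player_cost(profile, i):
    return max(mult[r] * load(profile, r) for r in profile[i])

# LIP for coalitions of size one: every profitable unilateral deviation
# strictly decreases the sorted cost vector lexicographically.
improving = 0
for profile in itertools.product(*strategies):
    profile = list(profile)
    for i in range(len(strategies)):
        for alt in strategies[i]:
            dev = profile[:i] + [alt] + profile[i + 1:]
            if player_cost(dev, i) < player_cost(profile, i):
                improving += 1
                assert resource_costs(dev) < resource_costs(profile)
assert improving > 0   # the check is not vacuous in this instance
```

Since a sequence of strategy profiles cannot decrease lexicographically forever in a finite game, this is exactly why the LIP forces improvement dynamics to terminate in a (strong) equilibrium.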
- Improving the $H_k$-Bound on the Price of Stability in Undirected Shapley Network Design Games (Yann Disser, Andreas Emil Feldmann, Max Klimm, and Matúš Mihalák)
CIAC 2013 – Proc. 8th International Conference on Algorithms and Complexity, pp. 158–169.
@inproceedings{DisserFeldmannKlimm+2013,
author = {Yann Disser and Andreas Emil Feldmann and Max Klimm and Mat{\'u}{\v s} Mihal{\'a}k},
booktitle = {CIAC 2013 – Proc. 8th International Conference on Algorithms and Complexity},
doi = {10.1007/978-3-642-38233-8\_14},
editor = {Paul G. Spirakis and Maria J. Serna},
pages = {158--169},
series = {LNCS},
title = {Improving the {$H_k$}-Bound on the Price of Stability in Undirected Shapley Network Design Games},
volume = {7878},
year = {2013},
}
In this paper we show that the price of stability of Shapley network design games on undirected graphs with $k$ players is at most $\frac{k^3(k+1)/2 - k^2}{1 + k^3(k+1)/2 - k^2}H_k = (1 - \mathcal{O}(1/k^4))H_k$, where $H_k$ denotes the $k$-th harmonic number. This improves on the known upper bound of $H_k$, which also holds for directed graphs, where it is tight. Hence, we give the first non-trivial upper bound on the price of stability for undirected Shapley network design games that is valid for an arbitrary number of players. Our bound is proved by analyzing the price of stability restricted to Nash equilibria that minimize the potential function of the game. We also present a game with $k=3$ players in which this restricted price of stability is $1.634$. This shows that the analysis of Bilò and Bove (Journal of Interconnection Networks, Volume 12, 2011) is tight. In addition, we give an example with three players that improves the lower bound on the (unrestricted) price of stability to $1.571$.
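The improvement in the bound above can be checked numerically (a quick sketch, not part of the paper): for every $k \ge 2$ the prefactor $\frac{k^3(k+1)/2 - k^2}{1 + k^3(k+1)/2 - k^2}$ is strictly below $1$, so the new bound lies strictly below $H_k$. Exact rational arithmetic avoids any floating-point doubt.

```python
from fractions import Fraction

def harmonic(k):
    """k-th harmonic number H_k = 1 + 1/2 + ... + 1/k, as an exact rational."""
    return sum(Fraction(1, i) for i in range(1, k + 1))

def improved_bound(k):
    """The bound (k^3(k+1)/2 - k^2) / (1 + k^3(k+1)/2 - k^2) * H_k from the abstract."""
    a = Fraction(k ** 3 * (k + 1), 2) - k ** 2
    return a / (1 + a) * harmonic(k)

# The bound is strictly below H_k for every k >= 2, but the gap
# shrinks like O(1/k^4), consistent with (1 - O(1/k^4)) * H_k.
for k in (2, 3, 10, 100):
    assert improved_bound(k) < harmonic(k)
```

For $k=2$, for example, the prefactor is $8/9$, giving the bound $8/9 \cdot H_2 = 4/3$ versus $H_2 = 3/2$.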
- Congestion Games with Player-Specific Costs Revisited (Martin Gairing and Max Klimm)
SAGT 2013 – Proc. 6th International Symposium on Algorithmic Game Theory, pp. 98–109.
@inproceedings{GairingKlimm2013,
author = {Martin Gairing and Max Klimm},
booktitle = {SAGT 2013 – Proc. 6th International Symposium on Algorithmic Game Theory},
doi = {10.1007/978-3-642-41392-6\_9},
editor = {Berthold V{\"o}cking},
pages = {98--109},
series = {LNCS},
title = {Congestion Games with Player-Specific Costs Revisited},
volume = {8146},
year = {2013},
}
We study the existence of pure Nash equilibria in congestion games with player-specific costs. Specifically, we provide a thorough characterization of the maximal sets of cost functions that guarantee the existence of a pure Nash equilibrium. For the case that the players are unweighted, we show that it is necessary and sufficient that, for every resource and for every pair of players, the corresponding cost functions are affine transformations of each other. For weighted players, we show that in addition one needs to require that all cost functions are affine or all cost functions are exponential. Finally, we construct a four-player singleton weighted congestion game in which the cost functions are identical among the resources and differ only by an additive constant among the players, and we show that it does not have a pure Nash equilibrium. This answers an open question of Mavronicolas et al., who showed that such games with at most three players always have a pure Nash equilibrium.
- Characterizing the Existence of Potential Functions in Weighted Congestion Games (Tobias Harks, Max Klimm, and Rolf H. Möhring)
Theory of Computing Systems, 49(1):46–70, 2011.
@article{HarksKlimmMoehring2011,
author = {Tobias Harks and Max Klimm and Rolf H. M{\"o}hring},
doi = {10.1007/s00224-011-9315-x},
journal = {Theory of Computing Systems},
number = {1},
pages = {46--70},
title = {Characterizing the Existence of Potential Functions in Weighted Congestion Games},
volume = {49},
year = {2011},
}
Since Rosenthal's pioneering paper, much work has been done to determine classes of games that admit a potential. First, we study the existence of potential functions for weighted congestion games. Let $\mathcal{C}$ be an arbitrary set of locally bounded functions and let $\mathcal{G}(\mathcal{C})$ be the set of weighted congestion games with cost functions in $\mathcal{C}$. We show that every weighted congestion game $G \in \mathcal{G}(\mathcal{C})$ admits an exact potential if and only if $\mathcal{C}$ contains only affine functions. We also give a similar characterization for $w$-potentials, with the difference that here $\mathcal{C}$ consists either of affine functions or of certain exponential functions. Finally, we extend our characterizations to weighted congestion games with facility-dependent demands and with elastic demands, respectively.
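The affine direction of the characterization can be made concrete with the standard exact potential for weighted congestion games with costs $c_r(x) = a_r x + b_r$ and player costs $\pi_i(s) = \sum_{r \in s_i} w_i\, c_r(W_r)$, where $W_r$ is the total weight on $r$: $\Phi(s) = \sum_r \big( \tfrac{a_r}{2}(W_r^2 + \sum_{i : r \in s_i} w_i^2) + b_r W_r \big)$. The sketch below brute-forces a tiny hypothetical game (all weights, costs, and strategies invented for illustration) and checks that every unilateral deviation changes $\Phi$ by exactly the deviator's cost change.

```python
import itertools

# Hypothetical weighted congestion game with affine costs c_r(x) = a_r*x + b_r;
# player i's cost is sum over her used resources of w_i * c_r(W_r), where W_r
# is the total weight on resource r.
weights = [1.0, 2.0, 3.0]
costs = {"r1": (2.0, 1.0), "r2": (1.0, 0.0), "r3": (3.0, 2.0)}   # (a_r, b_r)
strategies = [
    [{"r1"}, {"r2"}],
    [{"r1", "r3"}, {"r2"}],
    [{"r2"}, {"r3"}],
]

def load(profile, r):
    return sum(w for w, s in zip(weights, profile) if r in s)

def player_cost(profile, i):
    return sum(weights[i] * (costs[r][0] * load(profile, r) + costs[r][1])
               for r in profile[i])

def potential(profile):
    # standard exact potential for affine cost functions:
    # Phi = sum_r [ a_r/2 * (W_r^2 + sum_{i uses r} w_i^2) + b_r * W_r ]
    phi = 0.0
    for r, (a, b) in costs.items():
        W = load(profile, r)
        sq = sum(w ** 2 for w, s in zip(weights, profile) if r in s)
        phi += a / 2 * (W ** 2 + sq) + b * W
    return phi

# Exactness: every unilateral deviation changes Phi by exactly the
# deviator's cost change.
for profile in itertools.product(*strategies):
    profile = list(profile)
    for i in range(len(strategies)):
        for alt in strategies[i]:
            dev = profile[:i] + [alt] + profile[i + 1:]
            d_cost = player_cost(dev, i) - player_cost(profile, i)
            assert abs((potential(dev) - potential(profile)) - d_cost) < 1e-9
```

The "only if" direction of the theorem says that no analogous function can exist once $\mathcal{C}$ contains a non-affine cost function; replacing the affine costs above by, say, quadratics would make the exactness check fail for suitable instances.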
- Congestion Games with Variable Demands (Tobias Harks and Max Klimm)
TARK 2011 – Proc. 13th Conference on Theoretical Aspects of Rationality and Knowledge, pp. 111–120.
@inproceedings{HarksKlimm2011a,
author = {Tobias Harks and Max Klimm},
booktitle = {TARK 2011 – Proc. 13th Conference on Theoretical Aspects of Rationality and Knowledge},
doi = {10.1145/2000378.2000391},
editor = {Krzysztof R. Apt},
pages = {111--120},
title = {Congestion Games with Variable Demands},
year = {2011},
}
We initiate the study of congestion games with variable demands, where the (variable) demand has to be assigned to exactly one subset of resources. The players' incentives to use higher demands are stimulated by non-decreasing and concave utility functions. The payoff of a player is defined as the difference between the utility of the demand and the associated cost on the used resources. Although this class of non-cooperative games captures many elements of real-world applications, to our knowledge it has not previously been studied in this generality. We study the fundamental problem of the existence of pure Nash equilibria (PNE for short) in congestion games with variable demands. We call a set of cost functions $C$ consistent if every congestion game with variable demands and cost functions in $C$ possesses a PNE. We say that $C$ is FIP consistent if every such game possesses the $\alpha$-Finite Improvement Property for every $\alpha > 0$. Our main results provide a complete characterization of the consistency of cost functions, revealing structural differences to congestion games with fixed demands (weighted congestion games), where even inhomogeneously exponential functions are FIP consistent. Finally, we study consistency and FIP consistency of cost functions in a slightly different class of games, where every player experiences the same cost on a resource (uniform cost model). We give a characterization of consistency and FIP consistency showing that only homogeneously exponential functions are consistent, while no functions are FIP consistent.
- Optimal File Distribution in Peer-to-Peer Networks (Kai-Simon Goetzmann, Tobias Harks, Max Klimm, and Konstantin Miller)
ISAAC 2011 – Proc. 22nd International Symposium on Algorithms and Computation, pp. 210–219.
@inproceedings{GoetzmannHarksKlimm+2011,
author = {Kai{-}Simon Goetzmann and Tobias Harks and Max Klimm and Konstantin Miller},
booktitle = {ISAAC 2011 – Proc. 22nd International Symposium on Algorithms and Computation},
doi = {10.1007/978-3-642-25591-5\_23},
editor = {Takao Asano and Shin{-}Ichi Nakano and Yoshio Okamoto and Osamu Watanabe},
pages = {210--219},
series = {LNCS},
title = {Optimal File Distribution in Peer-to-Peer Networks},
volume = {7074},
year = {2011},
}
We study the problem of distributing a file initially located at a server among a set of peers. Peers who have downloaded the file can upload it to other peers. The server and the peers are connected to each other via a core network. The upload and download rates to and from the core are constrained by user- and server-specific upload and download capacities. Our objective is to minimize the makespan. We derive exact polynomial-time algorithms for the case when upload and download capacities per peer and among peers are equal. We show that the problem becomes strongly $\mathsf{NP}$-hard for equal upload and download capacities per peer that may differ among peers. For this case we devise a polynomial-time $(1 + 2\sqrt{2})$-approximation algorithm. To the best of our knowledge, neither $\mathsf{NP}$-hardness nor approximation algorithms were known before for this problem.
- Demand Allocation Games: Integrating Discrete and Continuous Strategy Spaces (Tobias Harks and Max Klimm)
WINE 2011 – Proc. 7th Conference on Web and Internet Economics, pp. 194–205.
@inproceedings{HarksKlimm2011b,
author = {Tobias Harks and Max Klimm},
booktitle = {WINE 2011 – Proc. 7th Conference on Web and Internet Economics},
doi = {10.1007/978-3-642-25510-6\_17},
editor = {Ning Chen and Edith Elkind and Elias Koutsoupias},
pages = {194--205},
series = {LNCS},
title = {Demand Allocation Games: Integrating Discrete and Continuous Strategy Spaces},
volume = {7090},
year = {2011},
}
In this paper, we introduce a class of games, which we term demand allocation games, that combines the characteristics of finite games such as congestion games and continuous games such as Cournot oligopolies. In a strategy profile, each player may choose both an action out of a finite set and a non-negative demand out of a convex and compact interval. The utility of each player is assumed to depend solely on the action, the chosen demand, and the aggregated demand on the chosen action. We show that this general class of games possesses a pure Nash equilibrium whenever the players' utility functions satisfy the assumptions of negative externality, decreasing marginal returns, and homogeneity. If one of these assumptions is violated, a pure Nash equilibrium may fail to exist. We demonstrate the applicability of our results by giving several concrete examples of games that fit into our model.