
Interval scheduling

Interval scheduling is a fundamental problem in computer science and operations research, involving the selection of the maximum number of non-overlapping time intervals from a given set of intervals, each defined by a start time s_j and finish time f_j. The problem arises in scenarios such as assigning jobs to a single machine or scheduling meetings in a conference room, where compatibility requires that no two selected intervals overlap (i.e., for any two selected intervals j and k, either f_j \leq s_k or f_k \leq s_j). The unweighted version, which maximizes the number of selected intervals, is efficiently solved by a greedy algorithm that sorts the intervals in non-decreasing order of their finish times and iteratively selects the interval with the earliest finish time that does not conflict with previously chosen ones. This algorithm achieves optimality, as proven by showing that any optimal solution can be transformed into the greedy solution without reducing the number of intervals, and it runs in O(n \log n) time due to the sorting step followed by a linear scan. A notable variant is weighted interval scheduling, where each interval has an associated weight w_j, and the goal shifts to maximizing the total weight of the selected non-overlapping intervals rather than their count. This version requires dynamic programming for an optimal solution, computing the maximum value for each interval by considering whether to include it or to fall back on the optimal subproblem ending before its start time, also in O(n \log n) time after sorting. Interval scheduling problems find broad applications in operations research, such as hotel reservations, factory machine scheduling, and timetabling, where fixed time windows must be respected to minimize conflicts or the number of machines used. Extensions include multi-machine scheduling and constraints on flexible availabilities, which introduce additional complexity but build on the core greedy and dynamic programming techniques.

Fundamentals

Problem Definition

Interval scheduling is a fundamental optimization problem in algorithm design and operations research, involving the selection of tasks or jobs to be performed on a single resource without conflicts. Each job is represented as an interval defined by a start time s_i and a finish time f_i, where s_i < f_i, with times typically assumed to be real-valued or integer. The core objective is to select the largest possible subset of these intervals such that no two selected intervals overlap, thereby maximizing the utilization of the resource. Non-overlap is ensured by the constraint that for any two selected intervals i and j, either f_i \leq s_j or f_j \leq s_i. This setup models scenarios such as scheduling meetings in a single room or allocating time slots to non-conflicting activities on one machine. The problem has two primary variants. In the unweighted case, the goal is to maximize the cardinality, or number, of selected intervals. In the weighted case, each interval i is associated with a positive weight w_i (representing value, profit, or priority), and the objective shifts to maximizing the total sum of weights \sum w_i over the selected non-overlapping intervals. The roots of interval scheduling trace back to early scheduling problems in operations research, with seminal formulations emerging in the 1950s through works such as Dantzig and Fulkerson's tanker scheduling, and further development in the algorithms literature by the 1970s; it was formalized in complexity-theoretic terms by Garey and Johnson in their 1979 guide to NP-completeness.
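The non-overlap condition can be stated directly in code. The following minimal Python sketch (the tuple representation and the names compatible and is_feasible are illustrative, not drawn from any library) checks whether two intervals, and by extension a candidate selection, are conflict-free:
from typing import List, Tuple

Interval = Tuple[float, float]  # (start s_i, finish f_i) with s_i < f_i

def compatible(a: Interval, b: Interval) -> bool:
    # Two intervals are compatible when one finishes no later than the other starts.
    return a[1] <= b[0] or b[1] <= a[0]

def is_feasible(selection: List[Interval]) -> bool:
    # A feasible selection consists of pairwise compatible (non-overlapping) intervals.
    return all(compatible(selection[i], selection[j])
               for i in range(len(selection)) for j in range(i + 1, len(selection)))

# Intervals (0, 3) and (3, 5) merely touch at time 3, so both may be selected together.
assert is_feasible([(0.0, 3.0), (3.0, 5.0)])
assert not is_feasible([(0.0, 3.0), (2.0, 4.0)])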

Key Terminology and Assumptions

In interval scheduling, each job or activity is modeled as an interval on the real line, typically denoted as [s_i, f_i) where s_i is the start time and f_i is the finish time, with s_i < f_i. Two intervals are compatible if they do not overlap, i.e., their intersection is empty (they are disjoint), and thus can be scheduled without conflict on the same resource. A feasible schedule consists of a set of mutually compatible intervals that can be selected together without any overlaps. Key assumptions underpin the problem formulation: intervals are often presorted by non-decreasing finish times (f_1 \leq f_2 \leq \cdots \leq f_n) to facilitate algorithmic processing, though this is not inherent to the input. Time is treated as continuous, allowing start and finish times to be any real numbers, and there is no preemption—each interval must run contiguously from start to finish or not at all if not selected. Let n denote the number of intervals, and for each interval i, the processing time is p_i = f_i - s_i, though this duration is not always central to the objective. Several edge cases arise in practice. If the input set of intervals is empty, the optimal feasible schedule is also empty, yielding zero selected intervals. When all intervals overlap with one another, at most one can be included in any feasible schedule; in the weighted variant, this would be the interval with the maximum weight. For identical intervals sharing the exact same start and finish times, they fully overlap and thus at most one can be selected, as compatibility requires disjointness.
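The edge cases above can be checked with a short sketch under the half-open convention [s_i, f_i); the helper overlaps below is illustrative rather than a library routine:
def overlaps(a, b):
    # Half-open intervals [s, f) intersect unless one finishes no later than the other starts.
    return not (a[1] <= b[0] or b[1] <= a[0])

# Identical intervals fully overlap, so at most one copy can appear in a feasible schedule.
assert overlaps((2.0, 4.0), (2.0, 4.0))
# Back-to-back intervals share only the excluded boundary point and are therefore compatible.
assert not overlaps((0.0, 2.0), (2.0, 4.0))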

Single-Resource Maximization

Unweighted Case

In the unweighted case of single-resource interval scheduling, the objective is to select the maximum number of mutually non-overlapping intervals from a given set. The standard approach employs a greedy algorithm that sorts the intervals by their finish times and iteratively selects the one with the earliest finish time that does not overlap with previously selected intervals. Specifically, the algorithm proceeds as follows:
  1. Sort the n intervals in non-decreasing order of finish times, denoted as f_1 \leq f_2 \leq \cdots \leq f_n.
  2. Initialize an empty set S to store the selected intervals.
  3. Select the first interval (with the earliest finish time) and add it to S.
  4. For each subsequent interval i, if its start time s_i is at or after the finish time of the last interval in S, add it to S.
  5. Repeat until all intervals are considered.
This procedure can be expressed in pseudocode as:
GREEDY-INTERVAL-SELECTOR(intervals):
    sort intervals by increasing finish time
    S = empty set
    last_finish = -infinity
    for each interval i in sorted order:
        if start_time(i) >= last_finish:
            add i to S
            last_finish = finish_time(i)
    return S
The algorithm runs in O(n \log n) time, dominated by the initial sorting step, with the subsequent compatibility checks taking O(n) time. The optimality of this greedy choice is established via an exchange argument. Consider any optimal solution O with m intervals and let G be the greedy solution, both listed in order of finish time, and suppose |G| < m. Let r be the largest index such that the first r intervals of G and O coincide. The next interval in O, say o_{r+1}, finishes no earlier than the greedy's next choice g_{r+1}, since the greedy picks the earliest-finishing compatible interval. Replacing o_{r+1} with g_{r+1} in O therefore yields another feasible solution of size m that matches G in its first r+1 intervals, contradicting the maximality of r. Thus, |G| = m. To illustrate the importance of sorting by finish time, consider the following set of four intervals: A ([0,5]), B ([1,3]), C ([3.5,4]), and D ([2,6]). Sorting by earliest finish time yields B, C, A, D. The greedy algorithm selects B (finish 3) and then C (start 3.5 ≥ 3), yielding two non-overlapping intervals. In contrast, sorting by earliest start time and greedily selecting yields A first (start 0), after which no other interval is compatible, yielding only one interval.
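For reference, a minimal runnable Python version of the pseudocode above, applied to the four-interval example (the function and variable names are illustrative), looks as follows:
def greedy_interval_selector(intervals):
    # intervals: list of (start, finish) pairs; returns a maximum set of non-overlapping intervals.
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):  # earliest finish time first
        if start >= last_finish:
            selected.append((start, finish))
            last_finish = finish
    return selected

# The example from the text: A = [0,5], B = [1,3], C = [3.5,4], D = [2,6].
example = [(0, 5), (1, 3), (3.5, 4), (2, 6)]
print(greedy_interval_selector(example))  # [(1, 3), (3.5, 4)], i.e., B and C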

Weighted Case

In the weighted case of single-resource interval scheduling, each interval j is associated with a positive weight w_j in addition to its start time s_j and finish time f_j, and the objective is to select a compatible subset that maximizes the total weight \sum w_j. Unlike the unweighted case, where a greedy algorithm suffices, the weighted variant requires dynamic programming to achieve optimality, as simply selecting the maximum number of intervals may overlook higher-weight combinations. The dynamic programming formulation assumes the n intervals are sorted in non-decreasing order of finish times, so f_1 \leq f_2 \leq \cdots \leq f_n. Define OPT(j) as the maximum total weight achievable using a compatible subset of the first j intervals. The base case is OPT(0) = 0. For j \geq 1, the recurrence is OPT(j) = \max \left( OPT(j-1),\ w_j + OPT(p(j)) \right), where p(j) is the largest index i < j such that interval i is compatible with j (i.e., f_i \leq s_j); if no such i exists, p(j) = 0. This captures the choice of either excluding interval j or including it and appending the optimal solution up to its latest compatible predecessor. To compute p(j) efficiently, preprocess the sorted intervals and use binary search on the finish times for each query, requiring O(\log n) time per p(j) and O(n \log n) total preprocessing and computation time. The DP table is then filled in O(n) time iteratively from j = 1 to n, yielding an overall time complexity of O(n \log n). A naive implementation that finds each p(j) by a linear scan instead of binary search runs in O(n^2) time. To reconstruct the optimal solution after computing the OPT table, backtrack from j = n: if OPT(j) = w_j + OPT(p(j)), include interval j and recurse on p(j); otherwise, recurse on j-1. This traces the decisions that led to the maximum weight. As an illustrative example, consider five intervals sorted by finish time, with start times s = [0, 1, 2, 3, 0], finish times f = [1, 2, 3, 4, 5], and weights w = [10, 10, 10, 10, 50]. The compatible predecessors are p = [0, 1, 2, 3, 0]. The DP table is computed as follows:
j    OPT(j-1)    w_j + OPT(p(j))    OPT(j)
1    0           10 + 0 = 10        10
2    10          10 + 10 = 20       20
3    20          10 + 20 = 30       30
4    30          10 + 30 = 40       40
5    40          50 + 0 = 50        50
The optimal solution selects the fifth interval alone, for total weight 50. In contrast, applying the unweighted greedy algorithm (earliest finish time, disregarding weights) selects the first four intervals, yielding only 40, which demonstrates why that approach fails here.
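A compact Python sketch of this dynamic program, using the standard bisect module for the binary search that computes p(j) (the function and variable names are illustrative), reproduces the example above:
import bisect

def weighted_interval_scheduling(jobs):
    # jobs: list of (start, finish, weight); returns (max total weight, one optimal selection).
    jobs = sorted(jobs, key=lambda j: j[1])          # sort by finish time
    n = len(jobs)
    finishes = [j[1] for j in jobs]
    # p[j] = number of jobs finishing no later than job j starts (its 1-indexed predecessor).
    p = [bisect.bisect_right(finishes, jobs[j][0]) for j in range(n)]

    opt = [0] * (n + 1)                              # opt[j] = best weight using the first j jobs
    for j in range(1, n + 1):
        start, finish, weight = jobs[j - 1]
        opt[j] = max(opt[j - 1], weight + opt[p[j - 1]])

    # Backtrack to recover the decisions behind the maximum weight.
    selected, j = [], n
    while j > 0:
        start, finish, weight = jobs[j - 1]
        if weight + opt[p[j - 1]] >= opt[j - 1]:
            selected.append(jobs[j - 1])
            j = p[j - 1]
        else:
            j -= 1
    return opt[n], list(reversed(selected))

# Example from the text: the single heavy interval beats the four light ones.
jobs = [(0, 1, 10), (1, 2, 10), (2, 3, 10), (3, 4, 10), (0, 5, 50)]
print(weighted_interval_scheduling(jobs))  # (50, [(0, 5, 50)])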

Grouped Interval Scheduling

Decision Problem Complexity

The decision version of the grouped interval scheduling problem asks whether there exists a selection of exactly one interval from each group such that no two selected intervals overlap, thereby forming a feasible schedule that represents all groups. This problem is NP-complete when the maximum group size is at least 3. The proof relies on a reduction from the numerical three-dimensional matching problem, demonstrating that deciding feasibility becomes computationally intractable as group sizes increase beyond 2. Specifically, Nakajima and Hakimi established NP-completeness for the case where each group (or job) has 3 or more possible intervals, even under additional constraints such as unit lengths. In contrast, the decision problem is solvable in polynomial time when all groups contain at most 2 intervals. This can be achieved by modeling the instance as a flow network, where groups are represented on one side and discretized time slots (derived from the sorted endpoints of all intervals) on the other, with capacities ensuring at most one interval per group and no overlaps within time slots. The maximum flow in this network determines whether a feasible assignment exists for all groups, computable with bipartite matching algorithms such as Hopcroft–Karp in O(n^{1.5}) time, where n is the number of intervals. The key dichotomy theorem states that the grouped interval scheduling decision problem is NP-complete if the maximum group size is at least 3 and polynomial-time solvable otherwise. This result, originating from foundational work in the 1980s and refined in subsequent analyses, highlights the threshold at group size 2 for tractability.

Optimization Problem Hardness

The optimization version of grouped interval scheduling requires selecting a maximum-cardinality (unweighted) or maximum-weight set of non-overlapping intervals such that at most one interval is chosen from each group. This problem is NP-hard even in the unweighted case when each group contains at least two intervals. The proof follows from showing that the decision variant—determining whether at least k groups can be covered by selecting one non-overlapping interval per group—is NP-complete for maximum group size 2, via a reduction establishing equivalence to known NP-complete scheduling problems. Furthermore, the problem is APX-hard (formerly known as MAX SNP-hard) when the maximum group size is at least 2, implying that no polynomial-time approximation scheme exists unless P = NP. This hardness is shown by an L-reduction from a known MAX SNP-hard problem. In the special case where all groups have size 1, the problem reduces to classic single-resource interval scheduling maximization, which is solvable exactly in polynomial time: via a greedy algorithm for the unweighted case and dynamic programming for the weighted case, both in O(n \log n) time after sorting. Notably, a gap exists relative to the decision problem of whether a selection covering one interval from every group (without overlaps) is feasible: this covering decision is polynomial-time solvable when the maximum group size is at most 2 (via network flow or matching techniques), yet the optimization problem of maximizing the number of covered groups remains NP-hard in this setting.

Approximation Techniques

Greedy and Polynomial Approximations

In grouped interval scheduling, a simple greedy algorithm provides a 2-approximation for the unweighted maximization problem. The algorithm proceeds by repeatedly selecting the compatible interval (non-overlapping with previously selected intervals and belonging to an unused group) that finishes earliest among all remaining candidates; after selection, all intervals from that group are discarded. This extends the classic earliest-finish-time greedy strategy for single-resource interval scheduling by incorporating the group constraint to ensure at most one selection per group. The algorithm runs in O(n log n) time when implemented efficiently using a priority queue or by presorting intervals by finish time and scanning linearly. The approximation guarantee is proven by partitioning the groups into those from which the greedy algorithm selected an interval (a set G of size |G|) and those from which it did not. The optimal solution can select at most |G| intervals from the selected groups (at most one per group) and at most another |G| from the unselected groups, since each such interval must overlap one of the greedy selections (otherwise the greedy would have chosen it), and the finish point of each greedy selection can be covered by at most one of the pairwise-disjoint optimal intervals. Thus the optimal value is at most 2|G|, yielding the 2-approximation ratio. This directly adapts the proof for the single-resource case while accounting for the group restrictions. The bound is tight, as there exist instances where the greedy algorithm selects only half the optimal number of groups. An improved approximation algorithm achieves a ratio arbitrarily close to e/(e-1) \approx 1.582 for the general unweighted case, using a combination of block partitioning of the timeline and randomized rounding. For special cases where each group contains at most two intervals, the problem admits an exact polynomial-time solution via network flow modeling for maximum selection. The intervals can be represented as vertices of a graph, with jobs (groups) as edges connecting their two possible intervals, and the additional structure from the interval ordering allows a maximum matching computation in O(n^2) time using standard flow algorithms, ensuring non-overlapping selections. This solvability holds more broadly for the decision version of selecting a fixed number of intervals when the maximum number of intervals per group is two.
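A short Python sketch of this grouped greedy rule follows (the data layout and names are illustrative; each interval carries a group identifier, and the routine enforces at most one selection per group):
def grouped_greedy(intervals):
    # intervals: list of (start, finish, group); returns pairwise non-overlapping intervals,
    # at most one per group, chosen by earliest finish time (the 2-approximation rule).
    used_groups = set()
    last_finish = float("-inf")
    selection = []
    for start, finish, group in sorted(intervals, key=lambda iv: iv[1]):
        if group not in used_groups and start >= last_finish:
            selection.append((start, finish, group))
            used_groups.add(group)
            last_finish = finish
    return selection

# Example: group "a" offers two placements; the greedy keeps its early one and then group "c".
example = [(0, 2, "a"), (1, 3, "b"), (4, 5, "a"), (2, 6, "c")]
print(grouped_greedy(example))  # [(0, 2, 'a'), (2, 6, 'c')]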

Linear Programming Methods

Linear programming provides a powerful framework for approximating the grouped scheduling problem, where intervals are partitioned into groups and at most one interval per group can be selected, subject to non-overlap constraints, to maximize the total weight. The standard LP relaxation introduces a variable x_i \in [0,1] for each interval i, representing the fractional selection of interval i. The objective is to maximize \sum_i w_i x_i, where w_i is the weight of interval i. Constraints ensure non-overlap by requiring \sum_{i \in C} x_i \leq 1 for each maximal clique C in the conflict graph (equivalently, for each time point covered by the intervals, though the formulation yields a polynomial-sized LP). Additionally, group constraints enforce \sum_{i \in g} x_i \leq 1 for each group g. This LP can be solved in polynomial time using interior-point methods or the ellipsoid algorithm, as the constraint matrix has a polynomial number of inequalities due to the structure of interval graphs. The integrality gap of this relaxation is at most 2, meaning the optimal LP value is at most twice the optimal integer solution, even with the added group constraints tightening the feasible region compared to the standard interval graph independent set LP (which has gap 1). With groups, the gap remains bounded by 2, though it can be tighter for instances with low group densities (few intervals per group). To obtain an integral solution, randomized rounding techniques can be applied to the LP solution. One approach selects, for each group independently, an interval i \in g with probability x_i / \sum_{j \in g} x_j (or simply x_i if the group sum is normalized), then discards conflicting selections greedily by earliest deadline or a similar tie-breaking rule. This yields an expected value within a constant factor of the LP optimum, and refinements such as conditional probabilities that respect the overlap constraints improve the constant. For the special case of equal weights per job (where weights are uniform within groups), an LP-based rounding achieves a 2-approximation by solving the relaxation and deterministically selecting non-conflicting intervals guided by the fractional values. Deterministic pipage rounding, which iteratively rounds fractional variables while preserving the objective under the multilinear extension, can also be adapted to partition matroid constraints (the groups) combined with the interval structure, yielding O(1)-approximations such as 2.5 in cases with bounded interval densities. When the number of intervals per group is fixed (say, at most k), more advanced techniques yield better guarantees. For k = 2, approximations of 4/3 are possible via LP relaxations over feasible block schedules followed by randomized rounding. Recent improvements include 1.5-approximations for low-density instances (where the maximum overlap is bounded), using local search on partial solutions combined with LP guidance to swap intervals within groups for improved non-overlap satisfaction. For fixed k, constant-factor approximations exist, such as k^k / (k^k - (k-1)^k) + \epsilon, via iterative block partitioning and LP rounding.
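As a concrete illustration, the following Python sketch builds and solves this LP relaxation with scipy.optimize.linprog on a tiny handmade instance (the data layout is illustrative; clique constraints are generated only at interval start points, which suffices because every maximal clique of an interval graph contains the latest start point of its members):
import numpy as np
from scipy.optimize import linprog

# Each candidate interval: (start, finish, weight, group id).
intervals = [(0, 2, 3.0, "a"), (1, 3, 2.0, "a"), (2, 4, 4.0, "b"), (0, 4, 5.0, "b")]
n = len(intervals)

A_ub, b_ub = [], []

# Clique (non-overlap) constraints: one row per distinct start point t, covering all
# intervals whose half-open span [s, f) contains t.
for t in sorted({s for s, _, _, _ in intervals}):
    A_ub.append([1.0 if s <= t < f else 0.0 for s, f, _, _ in intervals])
    b_ub.append(1.0)

# Group constraints: at most one (fractional) unit selected per group.
for g in {grp for _, _, _, grp in intervals}:
    A_ub.append([1.0 if grp == g else 0.0 for _, _, _, grp in intervals])
    b_ub.append(1.0)

# linprog minimizes, so negate the weights to maximize the total weight.
c = [-w for _, _, w, _ in intervals]
res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=[(0.0, 1.0)] * n, method="highs")
print("fractional optimum:", -res.fun)
print("x =", np.round(res.x, 3))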

Classic Scheduling Variants

The activity selection problem is mathematically identical to the unweighted case of interval scheduling, in which the objective is to select the maximum number of mutually non-overlapping activities, each defined by a fixed time interval, to maximize resource utilization on a single processor. This classic problem is optimally solved using a greedy algorithm that sorts the activities by their finishing times and iteratively selects the one with the earliest finish time that does not conflict with previously chosen activities, achieving an optimal solution in O(n \log n) time due to sorting. The greedy strategy for this problem emerged in the mid-20th century alongside foundational work on greedy algorithms. In contrast, the weighted interval scheduling variant assigns a profit or weight to each interval and seeks to maximize the total weight of a non-overlapping subset, which can be efficiently computed using dynamic programming by considering intervals in sorted order and computing the maximum value up to each point. Notably, extensions like the job interval selection problem—where each job offers multiple possible execution intervals—render the weighted version NP-hard, with the 0-1 knapsack problem reducing to it as a special case when all job deadlines coincide. A related classic variant is deadline scheduling, where each job specifies a release time r_i (earliest start) and a deadline d_i (latest finish), often with a processing time, transforming the problem into selecting non-overlapping intervals [r_i, d_i] to meet all deadlines without violation. This is equivalent to interval scheduling through a time-reversal transformation: negate all times to swap the roles of starts and finishes, setting the new finish time f_i' = -r_i and start time s_i' = -d_i, then apply the earliest-finish greedy rule on the reversed timeline to check feasibility or maximize the number of on-time jobs. Such formulations arise in real-time systems and are APX-hard in general when processing times vary, but remain tractable for fixed time windows via the above equivalence. Storage allocation represents another foundational connection, modeling memory requests as time intervals during which a block of storage must be exclusively reserved, requiring non-overlapping assignments along a linear address space to avoid conflicts. In the offline dynamic storage allocation problem, requests arrive with predefined intervals of need, and the task is to assign contiguous blocks such that no two active requests overlap in both time and address space, solvable via coloring techniques for the unweighted case. This variant differs from higher-dimensional packing, as it confines conflicts to a one-dimensional address space over time rather than multi-resource bins. These classic variants highlight key distinctions from more advanced interval scheduling extensions: they assume single-resource usage without job grouping or multiple identical processors, maintaining one-dimensional temporal conflicts unlike two-dimensional bin packing, where spatial dimensions introduce additional overlap constraints.
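The time-reversal symmetry mentioned above can be sanity-checked in a few lines. The sketch below (illustrative names; it assumes each job occupies its entire window [r_i, d_i], the fixed-window case described in the text) runs the earliest-finish greedy on the original windows and on their negated images and confirms that both select the same number of jobs:
def eft_count(intervals):
    # Earliest-finish-time greedy; returns how many non-overlapping intervals it selects.
    count, last_finish = 0, float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:
            count += 1
            last_finish = finish
    return count

windows = [(0, 3), (2, 5), (4, 7), (1, 8)]         # (release time r_i, deadline d_i)
reversed_windows = [(-d, -r) for r, d in windows]  # s_i' = -d_i, f_i' = -r_i
assert eft_count(windows) == eft_count(reversed_windows)
print(eft_count(windows))  # 2: at most two of these fixed windows are pairwise disjoint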

Graph-Theoretic Connections

Interval graphs provide a fundamental graph-theoretic framework for modeling interval scheduling problems. An interval graph is an undirected graph in which each vertex corresponds to an interval on the real line, and an edge exists between two vertices exactly when their corresponding intervals overlap. In this representation, the problem of selecting a maximum set of non-overlapping intervals—known as the unweighted interval scheduling problem—translates directly to finding a maximum independent set in the interval graph, as an independent set consists precisely of vertices whose intervals do not intersect. For the unweighted case, the maximum independent set in an interval graph can be computed efficiently using a greedy algorithm that sorts the intervals by their ending times and iteratively selects the one with the earliest end time that does not overlap with previously selected intervals. This approach runs in O(n log n) time, where n is the number of intervals, and is optimal; by the perfect graph properties of interval graphs, the size of the maximum independent set equals the minimum number of cliques needed to cover all vertices. In the weighted case, the maximum weight independent set problem on interval graphs, which corresponds to weighted interval scheduling, can be solved using dynamic programming. Since interval graphs are chordal—a superclass in which perfect elimination orders enable efficient algorithms—the algorithm processes vertices in sorted order of interval endpoints, computing the maximum weight selectable set up to each point in time after sorting. This leverages the tree-like structure implicit in the chordal elimination scheme, avoiding the exponential complexity of general graphs. The grouped variant of interval scheduling, where intervals are partitioned into groups and at most one interval per group can be selected without overlaps, introduces additional constraints that can be modeled by augmenting the interval graph with extra structure for each group. This formulation also ties into graph coloring, as the minimum number of machines needed to schedule all intervals relates to the chromatic number of the conflict graph, which for interval graphs equals the size of the maximum clique; under group constraints, however, the optimization problems generally lose this tractability, as discussed in the sections on grouped interval scheduling. The decision version's complexity connects to coloring problems in more general intersection graphs, highlighting what changes when the conflict structure deviates from a pure interval structure. Interval graphs can be recognized in linear time using PQ-tree algorithms, which test for the consecutive ones property of the maximal-clique matrix and construct an interval representation if one exists. This recognition technique, originally developed by Booth and Lueker in 1976, enables efficient verification of whether a given graph admits an interval model, facilitating applications in diverse fields. In bioinformatics, interval graphs model DNA fragment overlaps in physical mapping and copy-number aberration detection, where maximum independent sets identify non-overlapping genomic regions of interest.
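The correspondence between non-overlapping selections and independent sets can be illustrated directly. The small Python sketch below (illustrative names) builds the conflict edges of an interval graph and verifies that the earliest-finish greedy output is indeed an independent set:
from itertools import combinations

def conflict_edges(intervals):
    # Edge between two intervals exactly when they overlap (interval graph adjacency).
    return {(i, j) for (i, a), (j, b) in combinations(enumerate(intervals), 2)
            if not (a[1] <= b[0] or b[1] <= a[0])}

def eft_greedy(intervals):
    # Earliest-finish-time greedy; returns the indices of the selected intervals.
    chosen, last_finish = [], float("-inf")
    for idx, (start, finish) in sorted(enumerate(intervals), key=lambda p: p[1][1]):
        if start >= last_finish:
            chosen.append(idx)
            last_finish = finish
    return chosen

intervals = [(0, 5), (1, 3), (3.5, 4), (2, 6)]
edges = conflict_edges(intervals)
selected = eft_greedy(intervals)

# No conflict edge has both endpoints selected, so the selection is an independent set.
assert all(not (u in selected and v in selected) for u, v in edges)
print(selected)  # [1, 2], i.e., intervals (1, 3) and (3.5, 4)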

Extensions and Applications

Multi-Resource Scheduling

Multi-resource interval scheduling generalizes the single-resource problem to k identical machines, where the goal is to assign a subset of the given intervals to the machines such that no two intervals assigned to the same machine overlap in time, maximizing the number of selected intervals in the unweighted case or the total weight in the weighted case. This problem arises in settings where multiple identical machines can handle non-overlapping tasks simultaneously, with each machine representing one resource. The single-resource case corresponds to k = 1, which is solvable optimally via greedy selection in the unweighted case or dynamic programming in the weighted case. In the unweighted case, an optimal solution can be computed in O(n log n) time using a greedy algorithm that sorts the intervals by their ending times and assigns each interval to a machine that is free at its start time, skipping it if no such machine exists. This approach, akin to multiprocessor scheduling heuristics, ensures no machine is idle when a compatible interval is available. For fixed k, the weighted variant is solvable in polynomial time via dynamic programming, where the state tracks the current interval and the availability times of the k machines, leading to O(n^2 k) time complexity by considering compatible assignments in sorted order. List scheduling and bin-packing-inspired heuristics, such as sorting intervals by length and assigning to the least-loaded machine, provide practical near-optimal solutions for larger k, though the exact greedy method is preferred for optimality. When all intervals must be scheduled on the k machines, the focus shifts to minimizing the maximum machine load, defined as the maximum total length of intervals assigned to any single machine (since non-overlapping assignments ensure a machine's busy time equals the sum of its assigned lengths). This problem is NP-hard, but a 2-approximation exists using a greedy assignment strategy that sorts intervals by decreasing length and assigns each to the machine with the current minimum load that can accommodate it without overlap. For large k, a polynomial-time approximation scheme (PTAS) achieves a (1 + ε)-approximation for any fixed ε > 0, leveraging structural properties of interval graphs and dynamic programming on subsets. The grouped multi-resource variant requires selecting at most one interval per group (where each group represents alternatives for a task) such that the selected intervals can be assigned to k machines without overlaps on any machine, maximizing the number of represented groups. This problem is NP-hard even for k = 2 and groups containing at least 2 intervals, as it generalizes single-machine grouped interval scheduling, which reduces from 3-partition.
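A minimal Python sketch of the unweighted k-machine greedy described above follows (names are illustrative; each machine is tracked by the finish time of its last assigned interval, and ties are broken by reusing the machine that freed up most recently):
def k_machine_greedy(intervals, k):
    # Earliest-finish-time greedy for k identical machines: assign each interval to a machine
    # that is free at its start time, skipping it if every machine is still busy.
    machines = [[] for _ in range(k)]
    last_finish = [float("-inf")] * k           # finish time of each machine's latest interval
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        free = [m for m in range(k) if last_finish[m] <= start]
        if free:
            # Reuse the machine that became free most recently (a best-fit tie-break).
            m = max(free, key=lambda idx: last_finish[idx])
            machines[m].append((start, finish))
            last_finish[m] = finish
    return machines

example = [(0, 3), (1, 4), (2, 5), (3, 6), (4, 7)]
for m, schedule in enumerate(k_machine_greedy(example, k=2)):
    print(f"machine {m}: {schedule}")  # four of the five intervals fit on two machines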

Real-World Uses

Interval scheduling finds extensive application in resource allocation problems where tasks or events occupy time intervals and must be selected to avoid overlaps. In meeting room booking systems, the unweighted interval scheduling algorithm is commonly used to maximize the number of reservations for a single room by greedily selecting the meeting that ends earliest among compatible options. For instance, Google Calendar employs variants of this approach to handle basic scheduling without priorities, ensuring efficient use of shared spaces. When priorities are involved, such as assigning higher-value events like executive meetings over routine ones, weighted interval scheduling is applied to optimize total value, as implemented in enterprise tools like Outlook's resource booking features. Broadcast scheduling leverages interval scheduling to manage airtime for television and radio channels, grouping advertisements or programs into non-overlapping slots to maximize advertising revenue. In this context, the problem is modeled as selecting weighted intervals (e.g., ad spots with revenue values) across multiple channels, using approximations to ensure high-value content airs without conflicts, as seen in systems developed by broadcasters for dynamic playlist generation. More recent applications include hyperparameter tuning, where trials are scheduled as time intervals on computational resources to avoid overlaps and maximize completed experiments within deadlines; frameworks like Tune employ interval scheduling variants for this purpose in distributed training setups. Similarly, in the 2020s, electric vehicle (EV) charging station allocation has adopted interval scheduling to assign charging slots as non-overlapping intervals, optimizing for user demand and grid capacity. Practical implementations often rely on open-source libraries for efficiency. Google's OR-Tools provides solvers applicable to weighted and grouped variants, enabling custom optimizations in production environments such as event planning.
