We propose a novel hybrid method to solve the network-constrained stochastic unit commitment problem. We target realistic large-scale instances including hundreds of thermal generation units, thousands of transmission lines and nodes, and a large number of stochastic renewable generation units. This scheduling problem is formulated as a two-stage stochastic programming problem with continuous and binary variables in the first stage and only continuous variables in the second stage. We develop a hybrid solution method that decomposes the original problem into a master problem, which includes the unit commitment and dispatch decisions, and per-scenario subproblems representing dispatch with transmission constraints. The proposed decomposition embeds a column-and-constraint generation step within the traditional Benders decomposition framework. The performance of the proposed decomposition technique is compared with the solution of the extensive form via branch-and-cut and the Benders decomposition available in commercial solvers, as well as with conventional Benders decomposition variants. Our computational experiments show that the proposed method generates bounds of superior quality and finds solutions for instances where the other approaches fail.
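To make the decomposition structure concrete, the following is a minimal sketch of a Benders-style cutting-plane loop for a toy two-stage commitment problem with two units and two demand scenarios. The data, the load-shedding penalty used to guarantee complete recourse, and the SciPy-based solvers are illustrative assumptions; the embedded column-and-constraint generation step and the network constraints of the actual method are omitted.

```python
# Hypothetical toy instance: two thermal units, two demand scenarios.
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint, Bounds

fixed_cost = np.array([100.0, 80.0])   # first-stage commitment costs
var_cost   = np.array([20.0, 30.0])    # dispatch costs
cap        = np.array([60.0, 40.0])    # unit capacities
penalty    = 1e3                       # load-shedding penalty (complete recourse)
demand     = np.array([50.0, 80.0])    # one demand value per scenario
prob       = np.array([0.5, 0.5])      # scenario probabilities

def scenario_value_and_subgradient(x, d):
    """Solve the dispatch LP for one scenario; return its value and a
    subgradient of the recourse cost with respect to the commitment x."""
    c = np.concatenate([var_cost, [penalty]])   # variables z = [y1, y2, shed]
    A_ub = np.array([[-1.0, -1.0, -1.0],        # y1 + y2 + shed >= d
                     [ 1.0,  0.0,  0.0],        # y1 <= cap1 * x1
                     [ 0.0,  1.0,  0.0]])       # y2 <= cap2 * x2
    b_ub = np.array([-d, cap[0] * x[0], cap[1] * x[1]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    duals = res.ineqlin.marginals               # sensitivities w.r.t. b_ub
    return res.fun, duals[1:] * cap             # d(value)/d(x_i)

x = np.array([1.0, 1.0])                        # initial commitment
cuts = []                                       # (g, rhs): theta >= g.x + rhs
for it in range(20):
    # subproblems: expected recourse value and an aggregated optimality cut
    vals, grads = zip(*(scenario_value_and_subgradient(x, d) for d in demand))
    Q = float(np.dot(prob, vals))
    g = np.sum([p * gr for p, gr in zip(prob, grads)], axis=0)
    cuts.append((g, Q - g @ x))
    upper = fixed_cost @ x + Q
    # master: min f.x + theta  s.t.  theta >= g.x + rhs for all cuts, x binary
    c_master = np.concatenate([fixed_cost, [1.0]])
    A = np.array([np.concatenate([cut_g, [-1.0]]) for cut_g, _ in cuts])
    ub = np.array([-rhs for _, rhs in cuts])
    res = milp(c_master,
               constraints=LinearConstraint(A, -np.inf, ub),
               integrality=np.array([1, 1, 0]),
               bounds=Bounds([0, 0, 0], [1, 1, np.inf]))
    x, lower = res.x[:2], res.fun
    if upper - lower < 1e-6:
        break
print("commitment:", x, "cost:", upper)
```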
In this paper, we consider an integrated production and outbound distribution scheduling problem with a single production site, and its extension to multiple plants. A set of orders must be satisfied: the required pieces of a single product are first processed on a single machine in a plant and then delivered to the customers by a single vehicle before their lifespans expire. The goal is to minimize the makespan of the solution, that is, the return time of the vehicle after its last trip. We propose an elementary variable neighborhood search to solve the problem, using two new local search operators. Our computational results show that this simple procedure outperforms the existing, sometimes complex, approaches on the widely used benchmark dataset. We also review the existing computational results and demonstrate that in some cases the comparisons in the literature are invalid because the data were rounded differently. By re-evaluating the available solutions, we provide a fair comparison for each rounding method. We also consider the extension of the problem to multiple plants and adapt our solution approach to it. Our experiments show that our method is competitive in terms of solution quality with the existing solution approach for this problem.
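As a rough illustration of the algorithmic template, here is a generic variable neighborhood search skeleton over an order permutation. The toy objective, the relocate/swap neighborhoods, and the instance data are placeholders; they do not correspond to the paper's two new local search operators or its benchmark instances.

```python
# Illustrative VNS skeleton; data and objective are invented for the sketch.
import random

random.seed(0)
proc = [random.randint(1, 9) for _ in range(12)]    # processing times
due  = [random.randint(20, 60) for _ in range(12)]  # illustrative lifespans

def cost(perm):
    """Toy makespan-like objective: completion time plus lateness penalties."""
    t, penalty = 0, 0
    for job in perm:
        t += proc[job]
        penalty += max(0, t - due[job]) * 10
    return t + penalty

def shake(perm, k):
    """Perturb the solution by k random relocations."""
    p = perm[:]
    for _ in range(k):
        i, j = random.sample(range(len(p)), 2)
        p.insert(j, p.pop(i))
    return p

def local_search(perm):
    """Descent with pairwise swaps, accepting any improving move."""
    improved = True
    while improved:
        improved = False
        for i in range(len(perm) - 1):
            for j in range(i + 1, len(perm)):
                cand = perm[:]
                cand[i], cand[j] = cand[j], cand[i]
                if cost(cand) < cost(perm):
                    perm, improved = cand, True
    return perm

best = local_search(list(range(12)))
for _ in range(200):                      # VNS main loop
    k = 1
    while k <= 3:                         # k_max = 3 shaking strengths
        cand = local_search(shake(best, k))
        if cost(cand) < cost(best):
            best, k = cand, 1             # move and restart from k = 1
        else:
            k += 1
print("best order:", best, "cost:", cost(best))
```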
In networks consisting of agents communicating with a central coordinator and working together to solve a global optimization problem in a distributed manner, the agents are often required to solve private proximal minimization subproblems. Such a setting typically relies on a decomposition method to solve the global distributed problem, resulting in extensive communication overhead. In networks where communication is expensive, it is crucial to reduce the communication overhead of the distributed optimization scheme. Gaussian processes (GPs) are effective at learning the agents' local proximal operators, thereby reducing the communication between the agents and the coordinator. We propose combining this learning method with adaptive uniform quantization in a hybrid approach that can achieve further communication reduction. Because the data are quantized, the GP algorithm is modified to account for the statistics of the introduced quantization noise. We further improve our approach by orthogonalizing the quantizer's input to address the inherent correlation among its components, and we use dithering to ensure that the noise introduced by the quantizer is uncorrelated with its input. We propose multiple measures to quantify the trade-off between the communication cost reduction and the accuracy/optimality of the optimization solution. Under such metrics, our proposed algorithms can achieve significant communication reduction for distributed optimization with acceptable accuracy, even at low quantization resolutions. This result is demonstrated by simulations of a distributed sharing problem with quadratic cost functions for the agents.
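The small sketch below illustrates only the uniform quantization with subtractive dither ingredient, showing empirically that the quantization error is nearly uncorrelated with the quantizer input. The GP-based learning of proximal operators, the noise-aware GP modification, and the orthogonalization step are not reproduced, and the signal statistics are assumed.

```python
# Adaptive uniform quantization with subtractive dither (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, n_bits, lo, hi):
    """Uniform quantizer on [lo, hi] with subtractive dither."""
    step = (hi - lo) / (2 ** n_bits)
    dither = rng.uniform(-step / 2, step / 2, size=x.shape)
    idx = np.clip(np.round((x + dither - lo) / step), 0, 2 ** n_bits - 1)
    x_hat = lo + idx * step - dither      # receiver subtracts the shared dither
    return x_hat, idx.astype(int)         # idx is what is actually transmitted

x = rng.normal(size=10_000)
lo, hi = x.min(), x.max()                 # adaptive range taken from the data
for bits in (2, 4, 6):
    x_hat, _ = dithered_quantize(x, bits, lo, hi)
    err = x_hat - x
    corr = np.corrcoef(x, err)[0, 1]
    print(f"{bits} bits: MSE={np.mean(err**2):.2e}, corr(input, error)={corr:+.3f}")
```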
This paper presents exact Semi-Definite Program (SDP) reformulations for infinite-dimensional moment optimization problems involving a new class of piecewise Sum-of-Squares (SOS)-convex functions and projected spectrahedral support sets. These reformulations show that the optimal value and an optimal probability measure of the original moment problem can be found by solving a single SDP. This is achieved by establishing an SOS representation for the non-negativity of a piecewise SOS-convex function over a projected spectrahedron. Finally, as an application and a proof-of-concept illustration, the paper presents numerical results for the newsvendor and revenue maximization problems with higher-order moments by solving their equivalent SDP reformulations. These reformulations promise a flexible and efficient approach to solving these models. The main novelty of the present work relative to recent research is that, for the first time, moment problems with piecewise SOS-convex functions are solved via their numerically tractable exact SDP reformulations.
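As a generic illustration of the SOS-to-SDP machinery underlying such reformulations (not the paper's moment reformulation itself), the sketch below uses CVXPY to search for a positive semidefinite Gram matrix certifying that a fixed univariate polynomial is a sum of squares; the polynomial and the solver choice are assumptions made for illustration.

```python
# Gram-matrix (SOS) feasibility SDP for p(x) = x^4 - 2x^3 + 3x^2 - 2x + 1,
# i.e. find Q >= 0 with p(x) = [1, x, x^2] Q [1, x, x^2]^T.
import cvxpy as cp
import numpy as np

Q = cp.Variable((3, 3), symmetric=True)
constraints = [
    Q >> 0,
    Q[0, 0] == 1,                 # constant term
    2 * Q[0, 1] == -2,            # coefficient of x
    2 * Q[0, 2] + Q[1, 1] == 3,   # coefficient of x^2
    2 * Q[1, 2] == -2,            # coefficient of x^3
    Q[2, 2] == 1,                 # coefficient of x^4
]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status)                # "optimal" certifies that p is SOS
print(np.round(Q.value, 3))
```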
Classifications organize entities into categories that capture similarities within a category and discern dissimilarities among categories, and they are a powerful means of organizing information in support of analysis. We propose a new classification scheme premised on the reality of imperfect data. Our computational model uses uncertain data envelopment analysis to define a classification's proximity to equitable efficiency, which is an aggregate measure of intra-similarity within a classification's categories. Our classification process faces two overriding computational challenges: a loss of convexity and a combinatorially explosive search space. We overcome the first challenge by establishing lower and upper bounds on the proximity value and then searching this range with a first-order algorithm. We address the second challenge by adapting the p-median problem to initiate our exploration and then employing an iterative neighborhood search to finalize a classification. We conclude by classifying the thirty stocks in the Dow Jones Industrial Average into performance tiers, classifying prostate treatments into clinically effectual categories, and dividing airlines into peer groups.
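As a simplified illustration of the initialization idea only, the sketch below builds a p-median seed by greedy insertion and refines it with an interchange (swap) neighborhood search; the random planar data and the plain distance objective are stand-ins for the uncertain-DEA proximity measure used in the actual classification process.

```python
# Illustrative p-median seeding plus swap neighborhood search.
import numpy as np

rng = np.random.default_rng(1)
pts = rng.random((30, 2))                         # 30 entities in the plane
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
p = 4                                             # number of categories

def assignment_cost(medians):
    """Each entity joins its closest median; return the total distance."""
    return D[:, list(medians)].min(axis=1).sum()

# greedy initialization: repeatedly add the median that reduces cost the most
medians = set()
while len(medians) < p:
    best = min(set(range(len(pts))) - medians,
               key=lambda j: assignment_cost(medians | {j}))
    medians.add(best)

# interchange (swap) neighborhood search: exchange one median for a non-median
improved = True
while improved:
    improved = False
    for out in list(medians):
        for inn in set(range(len(pts))) - medians:
            cand = (medians - {out}) | {inn}
            if assignment_cost(cand) < assignment_cost(medians):
                medians, improved = cand, True
                break
        if improved:
            break
print("medians:", sorted(medians), "cost:", round(assignment_cost(medians), 3))
```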

