10:30 AM - 10:55 AM
A typology of logistics service providers in Canada
This work analyzes the content of 100 websites of Canadian logistics service providers (LSPs) to identify their value propositions, defined as the services offered and the outcomes promised to customers. Applying clustering techniques to these data makes it possible to build a typology of LSPs.
10:55 AM - 11:20 AM
Towards station-level demand prediction for effective rebalancing in bike-sharing systems
Bike-sharing systems are now an efficient means of transportation. The proposed model uses temporal and weather features to model network behavior: it extracts the main behaviors, characterizes them, and rebuilds a station-wise prediction. Applied to the Montreal network, the model loses 20% fewer trips than the operator.
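As an illustration of the general approach (not the authors' model, whose details are not given in the abstract), one might regress a station's hourly departures on temporal and weather features; the cyclic hour-of-day encoding and the temperature feature here are assumptions for the sketch:

```python
import numpy as np

def make_features(hours, temps):
    """Feature matrix: bias, cyclic hour-of-day encoding, temperature."""
    hours = np.asarray(hours, dtype=float)
    temps = np.asarray(temps, dtype=float)
    angle = 2 * np.pi * hours / 24.0
    return np.column_stack([np.ones_like(hours), np.sin(angle), np.cos(angle), temps])

def fit_station_model(hours, temps, departures):
    """Least-squares fit of one station's departures on the features."""
    X = make_features(hours, temps)
    coef, *_ = np.linalg.lstsq(X, np.asarray(departures, dtype=float), rcond=None)
    return coef

def predict(coef, hours, temps):
    """Predicted departures for new (hour, temperature) pairs."""
    return make_features(hours, temps) @ coef
```

Such per-station forecasts are what a rebalancing policy would then consume to decide where to move bikes.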
11:20 AM - 11:45 AM
Less is more: Basic variable neighborhood search heuristic for balanced minimum sum-of-squares clustering
Balanced clustering addresses the problem of finding homogeneous and well-separated subsets of equal cardinality from a set of data points. We present a basic variable neighborhood search heuristic for balanced minimum sum-of-squares clustering. Computational experiments and statistical tests show that the proposed algorithm outperforms the current state-of-the-art algorithm.
11:45 AM - 12:10 PM
A scalable algorithm for the solution of large clustering problems
Clustering consists of finding homogeneous and well-separated subsets, called clusters, from a set of given objects. The literature presents numerous clustering criteria to be maximized for separation and minimized for homogeneity. In this paper, we propose a global optimization method for clustering problems with respect to clustering criteria that satisfy three simple properties. We exemplify the use of our method on the diameter minimization clustering problem, which is strongly NP-hard. Our algorithm can solve problems containing more than 500,000 objects while consuming only moderate amounts of time and memory. The size of the problems that can be solved using our algorithm is two orders of magnitude larger than the largest problems solved by the previous state-of-the-art exact methods.
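To make the diameter criterion concrete, here is a brute-force baseline (an illustration only, not the paper's method): the diameter of a cluster is its largest pairwise distance, and the objective is to minimize the maximum diameter over clusters. Exact scalable methods prune this search space rather than enumerating it as below.

```python
import itertools
import math

def diameter(cluster):
    """Largest pairwise Euclidean distance within a cluster (0 if < 2 points)."""
    return max((math.dist(a, b) for a, b in itertools.combinations(cluster, 2)),
               default=0.0)

def min_diameter_partition(points, k):
    """Exhaustive search over all k^n assignments, minimizing the maximum
    cluster diameter. Exponential: usable on tiny instances only."""
    best_val, best_assign = math.inf, None
    for assign in itertools.product(range(k), repeat=len(points)):
        clusters = [[p for p, c in zip(points, assign) if c == j] for j in range(k)]
        val = max(diameter(c) for c in clusters)
        if val < best_val:
            best_val, best_assign = val, assign
    return best_val, best_assign
```

The contrast between this enumeration and an algorithm handling 500,000 objects is exactly the gap that global optimization methods with structural pruning must close.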