
Workshop on Algorithm Engineering as a New Paradigm


    Oct 30 - Nov 2, 2000
    RIMS, Kyoto University, Kyoto, Japan

    Sponsored by the Research Institute for Mathematical Sciences, Kyoto
    University, and the Grant-in-Aid for Scientific Research on Priority
    Areas, Ministry of Education, Science, Sports and Culture of Japan


Invited Talks:

October 30


13:00-14:00 	Stefan Naeher (Universitaet Trier)
  A Generic Tool for Interactive Visualization of Geometric Algorithms (GeoWin)

Abstract:
The visualization of algorithms and data structures is very important
in the design and implementation of algorithms, especially in the area
of computational geometry. It is useful for the presentation of results,
teaching, debugging and testing, the design of worst-case or degenerate 
inputs, and it often helps to discover problems and find new solutions.
In this paper we present {\em GeoWin}, a C++ data type
for the visualization and manipulation of geometric objects
and show how to use it in the presentation and animation of 
geometric algorithms.
By the use of templates and object-oriented programming, GeoWin is a 
{\em generic} visualization and animation tool that can be used for 
arbitrary types of geometric objects, as long as they provide a
small set of basic operations. This makes GeoWin
independent of any concrete geometric kernel. Currently, it is used
as a visualization tool for both the CGAL and LEDA projects.
------------------------------------------------------------------------------


14:00-15:00	Tamal Krishna Dey (The Ohio State University)
  Detecting Undersampling in Surface Reconstruction: Theory and Experiments

Abstract:
Current surface reconstruction algorithms
perform satisfactorily on well-sampled, smooth surfaces without boundaries.
However, these algorithms have severe problems
when met with undersampling. Undersampling
is prevalent in real data, since they often sample only part
of the boundary of an object, or are derived from
a nonsmooth surface. In this paper we present an algorithm
to detect the regions of undersampling. This information can be 
used to reconstruct surfaces with boundaries, and also to
detect the locality of sharp features in nonsmooth surfaces. 
We report the effectiveness of the algorithm
with a number of experimental results. Theoretically, we justify
the algorithm under some mild assumptions that are valid
for most practical data. 
------------------------------------------------------------------------------


15:30-16:30	Mario Szegedy (Rutgers University)
  CKit: A Preprocessor for Algorithmic Experiments

Abstract:
CKit is a general purpose preprocessor that I originally designed
for and tested in coding-theoretic and combinatorial experiments.
In this article I explore the most important features of this
preprocessor, describe its design principles, and illustrate its
novel use with examples. I introduce the notions of {\em macros
with associative power} and {\em lexical inheritance}, and
give examples of their use in CKit.
------------------------------------------------------------------------------


16:30-17:30	Wolfgang Slany (Technische Universitaet Wien)
  Theory and Practice of Shift Scheduling

Abstract:
Generating high-quality schedules for a rotating workforce is a critical
task in all situations where a certain staffing level must be guaranteed,
such as in industrial plants, hospitals, or airline companies. Results
from ergonomics indicate that shift schedules have a profound impact on
the health and satisfaction of employees as well as on their performance
at work. Moreover, shift schedules must satisfy legal requirements and
should also meet the objectives of the employing organization. Shift
scheduling comprises several sub-problems such as the allocation of
workers to shifts or the design of the shifts themselves that present
interesting challenges both from practical (our algorithms are sold in a
successful commercial package) as well as theoretical (hardness,
approximability) viewpoints. I will describe some of our results in both
areas. In particular, we studied in depth the (non-)approximability of the
shift design problem, a network flow variant that has many applications
beyond shift scheduling.
------------------------------------------------------------------------------

October 31


 9:30-10:30	Ingo Wegener (University of Dortmund)
  On the Design and Analysis of Evolutionary Algorithms

Abstract:
Evolutionary algorithms are randomized search heuristics
for the optimization of functions $f$. They have applications 
if time, money or skills are not available to design
a problem-specific algorithm or in black box optimization
where $f$ is not known and information on $f$ can only be
gained by sampling. Many experiments with evolutionary
algorithms have been performed but only few theoretical
results are known. Such a theory can be built up only by
people with experience in the analysis of algorithms. Here
a simple evolutionary algorithm for the optimization of
functions $f:\{0,1\}^n \to \mathbb{R}$ is investigated. The
algorithm works with population size 1 and mutations only.
Most results can be generalized to larger populations. It is 
shown that such algorithms cannot optimize all unimodal functions
efficiently. However, linear functions can be optimized in an 
expected time of $O(n \log n)$. The effect of varying mutation 
probabilities is studied. Finally, the first example is presented
where it is proven that uniform crossover can improve the expected
runtime from superpolynomial to polynomial and also the first
example where it is proven that 1-point-crossover can improve the
expected runtime from exponential to polynomial.
(Some of the results were obtained jointly with Stefan Droste and
Thomas Jansen.)
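
For readers unfamiliar with the setting, the following minimal Python sketch
shows the kind of algorithm analyzed in the talk: the standard (1+1) EA with
population size 1, mutation only, and elitist selection. The mutation
probability 1/n and the OneMax example are standard textbook choices, not
taken from the abstract.

    import random

    def one_plus_one_ea(f, n, max_iters=10000):
        # (1+1) EA: keep a single bit string, flip each bit independently
        # with probability 1/n, and accept the offspring if it is not worse.
        # p = 1/n is the standard choice; the talk also studies varying
        # mutation probabilities.
        x = [random.randint(0, 1) for _ in range(n)]
        fx = f(x)
        for _ in range(max_iters):
            y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]
            fy = f(y)
            if fy >= fx:            # elitist acceptance
                x, fx = y, fy
        return x, fx

    # Example: OneMax, a linear function optimized in expected O(n log n) steps.
    best, value = one_plus_one_ea(lambda bits: sum(bits), n=50)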
------------------------------------------------------------------------------


10:30-11:30	Oscar H. Ibarra (University of California)
  Some Complexity Issues in Parallel Computing

Abstract:
We give an overview of the computational complexity of linear and
mesh-connected cellular arrays with respect to well-known models of
sequential and parallel computation. We discuss one-way communication
versus two-way communication, serial input versus parallel input, and
space-efficient simulations. In particular, we look at the parallel
complexity of cellular arrays in terms of the PRAM theory and its
implications, e.g., to the parallel complexity of recurrence equations
and loops. We also point out some important and fundamental open
problems that remain unresolved.

Next, we investigate the solvability of some reachability and safety
problems concerning machines operating in parallel and cite some
possible applications. Finally, we briefly discuss the complexity of
the "commutativity analysis" technique that is used in the areas of
parallel computing and parallelizing compilers.

Keywords: cellular array, computational complexity, P-complete,
parallel complexity, recurrence equations, reachability, safety,
commutativity analysis.
------------------------------------------------------------------------------

November 1


 9:30-10:30	James Ferdinand Geelen (University of Waterloo)
  An Algebraic Approach to Matching Problems

Abstract:
In Tutte's seminal paper on matching, he associates a skew-symmetric
matrix with a graph; this matrix is now known as the {\em Tutte matrix}.
The rank of the Tutte matrix is exactly twice the size of a maximum
matching in the graph. This formulation easily leads to
an efficient randomized algorithm for matching. The Tutte matrix
is also useful in obtaining a min-max theorem and an efficient
deterministic algorithm. We review these results and look at
similar formulations of other problems; namely, linear matroid
intersection, linear matroid parity, path matching, and matching forests.
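
For reference (a standard definition, not quoted from the abstract): for a
graph $G=(V,E)$ with $V=\{1,\dots,n\}$, the Tutte matrix is the $n \times n$
skew-symmetric matrix $T=(t_{ij})$ with $t_{ij}=x_{ij}$ if $\{i,j\}\in E$ and
$i<j$, $t_{ij}=-x_{ji}$ if $\{i,j\}\in E$ and $i>j$, and $t_{ij}=0$ otherwise,
where the $x_{ij}$ are indeterminates. Over the field of rational functions in
these indeterminates, $\mathrm{rank}(T)=2\nu(G)$, where $\nu(G)$ is the maximum
matching size; the randomized algorithm follows by substituting random field
elements and appealing to the Schwartz-Zippel lemma.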
------------------------------------------------------------------------------


10:30-11:30	David Williamson (IBM T. J. Watson Research Center)
  A Short Primer on the Primal-Dual Method for Approximation Algorithms

Abstract:
The primal-dual method is a well-known tool used in the design of
algorithms for problems in combinatorial optimization.  Typically it
is used for problems whose solution can be characterized by linear
programs (for example, shortest paths, network flows, bipartite
matchings).  However, in the last decade a wealth of results has shown
that a modified version of the method is also useful for designing
approximation algorithms for a wide variety of NP-hard optimization
problems.  An approximation algorithm is one that runs in polynomial
time and produces a solution whose value is within some factor of the
optimal solution.  The primal-dual method has been shown to give good
approximation algorithms for the set cover problem, the vertex cover
problem, network design problems, feedback vertex set problems, the
uncapacitated facility location problem, the k-median problem, and
others.  Additionally, experimental results show that the method
results in practical heuristics, whose solution values are typically
quite close to the optimal value.

In this talk, I will review the primal-dual method for approximation
algorithms and some interesting results that have been derived from
it.
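
To make the method concrete for one of the problems mentioned above, here is a
minimal Python sketch of the classical primal-dual algorithm for set cover; it
gives an f-approximation, where f is the maximum number of sets containing any
element. The instance format is hypothetical and the sketch is only
illustrative, not taken from the talk.

    def primal_dual_set_cover(elements, sets, cost):
        # sets: dict name -> frozenset of elements; cost: dict name -> float.
        # Raise the dual variable of an uncovered element until some set
        # containing it becomes tight, then add that set to the cover.
        y = {e: 0.0 for e in elements}        # dual variables (one per element)
        slack = dict(cost)                    # c_S minus the dual load on S
        cover, covered = [], set()
        for e in elements:
            if e in covered:
                continue
            containing = [S for S in sets if e in sets[S]]
            delta = min(slack[S] for S in containing)   # assumes a feasible instance
            y[e] += delta
            for S in containing:
                slack[S] -= delta
            tight = min(containing, key=lambda S: slack[S])  # slack is now 0 here
            cover.append(tight)
            covered |= sets[tight]
        return cover

    # Hypothetical usage:
    # primal_dual_set_cover({1, 2, 3, 4},
    #                       {'A': frozenset({1, 2}), 'B': frozenset({2, 3, 4})},
    #                       {'A': 1.0, 'B': 2.0})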
------------------------------------------------------------------------------


13:00-14:00	Celso Ribeiro (Catholic University of Rio de Janeiro)
  A Hybrid GRASP with Perturbations and Adaptive Path-Relinking for the
  Steiner Problem in Graphs

Abstract:
We propose and describe a hybrid GRASP with weight perturbations and
adaptive path-relinking heuristic (HGP-PR) for the Steiner problem in
graphs. In this multi-start approach, the greedy randomized construction
phase of a GRASP is replaced by the combination of several
construction heuristics with a weight perturbation strategy that combines
intensification and diversification elements, as in a strategic oscillation
approach. The improvement phase circularly explores two different
local search strategies: the first uses a node-based neighborhood
for local search, while the second uses a key-path-based neighborhood.
An adaptive path-relinking technique is applied to a set of elite
solutions as a post-optimization strategy. Computational experiments
on a large set of benchmark problems of three different classes are
reported. We first illustrate the effectiveness of preprocessing
procedures for several classes of test problems. Next, we present
computational results illustrating the contribution of each algorithmic
feature to the robustness of the complete algorithm. Finally, we show
that our algorithm outperforms other heuristics in the literature,
obtaining consistently better or comparably good solutions for all
classes of test problems.
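
As background for readers unfamiliar with GRASP, the sketch below shows the
generic multi-start skeleton with a path-relinking post-optimization phase in
Python. It is only the general template, not the HGP-PR heuristic itself: the
weight perturbations, circular local search, and adaptivity described above
are abstracted into user-supplied callables.

    def grasp_with_path_relinking(construct, local_search, relink, cost,
                                  iterations=100, elite_size=10):
        # construct()     -> a (greedy randomized) feasible solution
        # local_search(s) -> a locally optimal solution obtained from s
        # relink(s, t)    -> best solution found on a path from s towards t
        # cost(s)         -> objective value to be minimized
        elite = []
        for _ in range(iterations):
            s = local_search(construct())
            elite.append(s)
            elite = sorted(elite, key=cost)[:elite_size]   # keep the best solutions
        best = elite[0]
        # post-optimization: path-relinking between pairs of elite solutions
        for s in elite:
            for t in elite:
                if s is not t:
                    candidate = relink(s, t)
                    if cost(candidate) < cost(best):
                        best = candidate
        return best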
------------------------------------------------------------------------------


14:00-15:00	Ibrahim Osman (American University of Beirut)
  Metaheuristics: A general framework

Abstract:
The most successful meta-heuristics are briefly reviewed together with their
associated components. A unified meta-heuristic framework will be
presented. It will be shown how the existing meta-heuristics can fit into
the presented framework. In addition, the framework invites further research
into designing new, innovative and unexplored meta-heuristics. Finally, we
conclude by highlighting current trends and future research directions
in this active area of the science of heuristics.
------------------------------------------------------------------------------

November 2


 9:30-10:30	Shay Kutten (Faculty of Industrial Engineering and Management,
                       Technion, Israel)
  On Time Adaptivity and Stabilization

Abstract:
We study the scenario where a transient fault hits a minority
of the nodes in a distributed system by corrupting their state.
The notions of {\em time adaptivity} and {\em fault locality}
were introduced to take advantage of the fact that the
number of faults in such realistic scenarios may be small.
A stabilizing distributed protocol is called \emph{adaptive} if its
recovery time from a state-corrupting fault is proportional to the
number of processors hit by the fault, rather than to the total 
number of nodes.

We describe time adaptive protocols, and upper and lower bounds
on their stabilization time and their fault resilience. We describe
basic problems, both in the static case and in the interactive case,
to which more general problems can be reduced. We then solve
these problems optimally. We describe the main techniques, as well
as the new definitions, such as that of the amount of fault resilience
that can be obtained, the different kinds of stabilization (state
stabilization versus output stabilization), and the additional
desirable properties for such protocols.

This area is rather new, and a lot of additional research is still needed.
------------------------------------------------------------------------------


10:30-11:30	Gerhard Woeginger (Technical University Graz)
  Preemptive Scheduling with Rejection

Abstract:
We consider the problem of preemptively scheduling a set of
$n$ jobs on $m$ (identical, uniformly related, or unrelated)
parallel machines. The scheduler may reject a subset of the
jobs and thereby incur job-dependent penalties for each rejected
job, and he must construct a schedule for the remaining jobs
so as to optimize the preemptive makespan on the $m$ machines
plus the sum of the penalties of the jobs rejected.

We provide a complete classification of these problems with
respect to complexity and approximability. Our main results
are various approximation schemes, constant factor approximation
algorithms, and inapproximability results for different
variants of this problem.
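
For the identical-machine case (only an illustration; the talk also covers
uniformly related and unrelated machines), the objective can be written
explicitly. By McNaughton's wrap-around rule, the optimal preemptive makespan
of a job set $S$ on $m$ identical machines is
$\max\{\max_{j\in S} p_j,\ \frac{1}{m}\sum_{j\in S} p_j\}$, so, writing $w_j$
for the rejection penalty of job $j$ (notation chosen here, not taken from the
abstract), the problem is to compute
$$
\min_{S \subseteq J} \left( \max\Big\{ \max_{j\in S} p_j,\ \frac{1}{m}\sum_{j\in S} p_j \Big\} + \sum_{j \in J \setminus S} w_j \right).
$$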
------------------------------------------------------------------------------


Liang ZHAO
<last update 2000/10/5 >