DS-1993-06: Tromp, John (1993) Aspects of Algorithms and Complexity. Doctoral thesis, University of Amsterdam.
Text (Samenvatting): DS-1993-06.samenvatting.txt (5kB)
Abstract
This thesis is a compilation of several studies within the area of theoretical
computer science. Most deal with algorithms and their associated complexity
measures like time and space, but there is also the odd measure, such as the
use of energy in electronic circuits.
Chapter 2 is concerned with the following problem. Given a finite set of
strings, $w_1,w_2, \ldots, w_n$, find their shortest common superstring $w$.
It is `known' that no algorithm, given a set of words of total length $m$,
can always find the shortest superstring within $m^{k}$ steps for any fixed
constant $k$; technically speaking, the problem is not solvable in polynomial
time (unless P equals NP). There does exist an algorithm that finds a
reasonably short superstring in approximately $m^3$ steps; it is used in DNA
sequencing, that is, in determining the exact order of nucleotides in DNA
molecules. In this chapter it is shown that the length of the superstring
found by this algorithm is bounded by four times the length of the shortest
possible superstring.
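The algorithm meant here is presumably the well-known greedy merging
heuristic: repeatedly join the two words with the largest overlap until a
single string remains. A minimal Python sketch of that heuristic follows; the
function names and the tiny example are my own illustration, not code from
the thesis.

```python
def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is also a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_superstring(words: list[str]) -> str:
    """Repeatedly merge the pair of words with the largest overlap."""
    # Words that are substrings of other words never affect the superstring.
    words = [w for w in words if not any(w != v and w in v for v in words)]
    while len(words) > 1:
        best = (-1, 0, 1)  # (overlap length, left index, right index)
        for i, a in enumerate(words):
            for j, b in enumerate(words):
                if i != j:
                    k = overlap(a, b)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        merged = words[i] + words[j][k:]
        words = [w for t, w in enumerate(words) if t not in (i, j)] + [merged]
    return words[0] if words else ""

print(greedy_superstring(["cde", "abcd", "deab"]))  # prints "abcdeab"
```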
In Chapter 3 the central topic is the memory complexity of colouring a
picture. An algorithm is derived that solves this problem with constant
memory use, in contrast to current practice of using memory proportional to
the size of the figure to be coloured. With only a constant amount of memory
at our disposal, the picture appears to us like a gigantic labyrinth, and the
part in which we find ourselves must be completely walled in.
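For contrast, here is a minimal Python sketch (names of my own choosing, not
code from the thesis) of the conventional approach: an explicit-stack flood
fill whose working memory can grow in proportion to the region being
coloured, which is exactly the cost that the constant-memory algorithm of
this chapter avoids.

```python
def flood_fill(grid: list[list[int]], row: int, col: int, new_colour: int) -> None:
    """Conventional region colouring with an explicit stack; in the worst case
    the stack holds a number of entries proportional to the region's area."""
    old_colour = grid[row][col]
    if old_colour == new_colour:
        return
    stack = [(row, col)]
    while stack:
        r, c = stack.pop()
        if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == old_colour:
            grid[r][c] = new_colour
            stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])

picture = [[0, 0, 1],
           [0, 1, 1],
           [1, 1, 1]]
flood_fill(picture, 0, 0, 2)
print(picture)  # [[2, 2, 1], [2, 1, 1], [1, 1, 1]]
```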
Chapter 4 is concerned with upper bounds on the switching energy needed to
compute threshold and counting functions with VLSI circuits. Through some
redundancy in the number of switching elements, we show how to reduce the
number of wires that switch on a change of inputs by a factor equal to the
number of digits in the number of inputs to the circuit (roughly the
logarithm of the number of inputs).
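As a toy illustration of the quantity being bounded, namely how many output
wires switch when a single input changes, the Python sketch below compares
two encodings of the counting function's output. The Gray-code variant is my
own example of how a change of representation can reduce switching; it is not
the redundancy technique used in the thesis.

```python
def popcount_bits(inputs: list[int], width: int) -> list[int]:
    """Binary encoding of the number of 1-inputs (the counting function)."""
    n = sum(inputs)
    return [(n >> i) & 1 for i in range(width)]

def gray_bits(inputs: list[int], width: int) -> list[int]:
    """Gray-code encoding of the same count; consecutive counts differ in one bit."""
    n = sum(inputs)
    g = n ^ (n >> 1)
    return [(g >> i) & 1 for i in range(width)]

def wires_switched(encode, inputs: list[int], flip: int, width: int) -> int:
    """How many output wires change when input number `flip` is toggled."""
    before = encode(inputs, width)
    after = encode(inputs[:flip] + [1 - inputs[flip]] + inputs[flip + 1:], width)
    return sum(b != a for b, a in zip(before, after))

inputs = [1, 1, 1, 0, 0, 0, 0, 0]                    # count = 3
print(wires_switched(popcount_bits, inputs, 3, 4))   # 3 -> 4 flips three binary output wires
print(wires_switched(gray_bits, inputs, 3, 4))       # with Gray code only one wire flips
```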
Chapter 5 introduces a new kind of computer, a parallel version of the
`storage modification machine'. The memory of this machine consists of cells,
each pointing at several other cells. The machine has the ability to address
a set of cells on which certain operations can be carried out, such as the
creation of new cells or the redirection of pointers. This turns out to be a
very powerful computer, which could, for instance, solve chess in a
relatively small number of steps (although the number of cells used would be
as large as the number of possible chess games).
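A minimal Python sketch of this kind of pointer memory is given below: cells
carry labelled pointers to other cells, and the basic operations create cells
and redirect pointers. The class and method names are my own, and the sketch
is sequential, whereas the machine of this chapter addresses and manipulates
whole sets of cells in parallel.

```python
class Cell:
    """A memory cell holding labelled pointers to other cells."""
    def __init__(self):
        self.ptr: dict[str, "Cell"] = {}

class PointerMemory:
    """Toy sequential pointer memory: cells can be created and their pointers
    redirected. Illustrative only; the thesis machine operates on sets of
    cells in parallel."""
    def __init__(self):
        self.origin = Cell()              # distinguished starting cell

    def new_cell(self, at: Cell, label: str) -> Cell:
        """Create a fresh cell and make `at`'s pointer `label` point to it."""
        c = Cell()
        at.ptr[label] = c
        return c

    def redirect(self, at: Cell, label: str, target: Cell) -> None:
        """Redirect the pointer `label` of cell `at` to an existing cell."""
        at.ptr[label] = target

mem = PointerMemory()
a = mem.new_cell(mem.origin, "left")
b = mem.new_cell(mem.origin, "right")
mem.redirect(a, "next", b)                            # a's "next" pointer now reaches b
print(mem.origin.ptr["left"].ptr["next"] is b)        # True
```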
The last four chapters are devoted to communication protocols that give
different processes the idea that they are sharing a single memory: a process
can read what other processes write. In the simplest case the memory consists
of a single bit, written by one process and read by one other. This building
block serves as the basis for bigger and better memories: more values, more
readers, more writers. A unique property of these protocols is that they are
{\em wait-free\/}: one process does not have to wait for another process to
complete its memory operation.
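To convey the flavour of such constructions, the sketch below shows a
classical wait-free construction in the style of Lamport: a single-writer,
single-reader register holding one of $k$ values, built from $k$ one-bit
registers that the writer and reader traverse in opposite directions. It is
an illustration of the genre rather than one of the protocols developed in
these chapters.

```python
class MultiValuedRegister:
    """A k-valued single-writer, single-reader register built from k one-bit
    registers, in the style of Lamport's classical construction. Shown only to
    convey the flavour of wait-free protocols, not as one of the thesis's own."""

    def __init__(self, k: int):
        self.bit = [0] * k        # the k one-bit building blocks
        self.bit[0] = 1           # the register initially holds the value 0

    def write(self, v: int) -> None:
        """Wait-free write: at most k bit operations, never waits for the reader."""
        self.bit[v] = 1                      # announce the new value ...
        for i in range(v - 1, -1, -1):
            self.bit[i] = 0                  # ... then clear the lower bits, downwards

    def read(self) -> int:
        """Wait-free read: scan upwards; the writer's discipline guarantees
        that a set bit is found within k steps."""
        i = 0
        while self.bit[i] == 0:
            i += 1
        return i

reg = MultiValuedRegister(k=8)
reg.write(5)
print(reg.read())  # 5
```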
Item Type: Thesis (Doctoral)
Report Nr: DS-1993-06
Series Name: ILLC Dissertation (DS) Series
Year: 1993
Depositing User: Dr Marco Vervoort
Date Deposited: 14 Jun 2022 15:16
Last Modified: 14 Jun 2022 15:16
URI: https://eprints.illc.uva.nl/id/eprint/1963