Mathematics (Faculty of)
Welcome to the Faculty of Mathematics community.
This community and its collections are organized using the University of Waterloo's Faculties and Academics structure. In this structure:
- Communities are Faculties or Affiliated Institutions
- Collections are Departments or Research Centres
Research outputs are organized by type (e.g., Master's Thesis, Article, Conference Paper).
New collections following this structure will be created upon request.
Browsing Mathematics (Faculty of) by Issue Date
Now showing 1 - 20 of 3096
Item: Contributions to the study of general relativistic shear-free perfect fluids: an approach involving Cartan's equivalence method, differential forms and symbolic computation (University of Waterloo, 1993). Lang, Jérôme Michel.
It has been conjectured that general relativistic shear-free perfect fluids with a barotropic equation of state, and such that the energy density, µ, and the pressure, p, satisfy µ + p ≠ 0, cannot simultaneously be rotating and expanding (or contracting). A survey of the known results about this conjecture is included herein. We show that the conjecture holds true under either of the following supplementary conditions: 1) the Weyl tensor is purely magnetic with respect to the flow velocity vector, or 2) dp/dµ = −1/3. Any hypersurface-homogeneous shear-free perfect fluid which is not space-time homogeneous and whose acceleration vector is not parallel to the vorticity vector belongs to one of three invariantly defined classes, labelled A, B and C. It is found that the Petrov types which are allowed in each class are as follows: for class A, type I only; for class B, types I, II and III; and for class C, types I, D, II and N. Two-dimensional pseudo-Riemannian space-times are classified in a manner similar to that of the Karlhede classification of four-dimensional general-relativistic space-times. In an appendix, the forms package (a differential forms package for the Maple program) is described.

Item: Efficient Simulation of Message-Passing in Distributed-Memory Architectures (University of Waterloo, 1996). Demaine, Erik.
In this thesis we propose a distributed-memory parallel-computer simulation system called PUPPET (Performance Under a Pseudo-Parallel EnvironmenT). It allows the evaluation of parallel programs run in a pseudo-parallel system, where a single processor is used to multitask the program's processes, as if they were run on the simulated system. This allows development of applications and teaching of parallel programming without the use of valuable supercomputing resources. We use a standard message-passing language, MPI, so that when desired (e.g., development is complete) the program can be run on a truly parallel system without any changes. There are several features in PUPPET that do not exist in any other simulation system. Support for all deterministic MPI features is available, including collective and non-blocking communication. Multitasking (more processes than processors) can be simulated, allowing the evaluation of load-balancing schemes. PUPPET is very loosely coupled with the program, so that a program can be run once and then evaluated on many simulated systems with multiple process-to-processor mappings. Finally, we propose a new model of direct networks that ignores network traffic, greatly improving simulation speed and often not significantly affecting accuracy.
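The pseudo-parallel idea above (one real processor multitasking the simulated processes) can be made concrete with a toy scheduler. The sketch below is not PUPPET and uses no MPI; it simply interleaves generator-based "processes" that exchange messages through simulated mailboxes, and every name in it (Send, RECV, worker, simulate) is invented for the illustration.

```python
# Minimal sketch, not PUPPET: multitask message-passing "processes" on one
# real processor, in the spirit of pseudo-parallel simulation.
from collections import deque, namedtuple

Send = namedtuple("Send", "dest payload")   # request: deliver payload to process `dest`
RECV = "recv"                               # request: block until a message arrives

def worker(rank, peer):
    """A toy message-passing process: send a greeting to a peer, then wait for one."""
    yield Send(peer, f"hello from {rank}")
    msg = yield RECV
    print(f"process {rank} got: {msg!r}")

def simulate(process_factories):
    """Round-robin scheduler: multitask all simulated processes on a single processor."""
    procs = {rank: make(rank) for rank, make in process_factories.items()}
    mailbox = {rank: deque() for rank in procs}
    waiting = {rank: False for rank in procs}       # True while blocked in a receive
    resume_value = {rank: None for rank in procs}   # value fed into the next resume
    live = set(procs)
    while live:
        for rank in sorted(live):
            if waiting[rank]:
                if not mailbox[rank]:
                    continue                        # still blocked: nothing delivered yet
                resume_value[rank] = mailbox[rank].popleft()
                waiting[rank] = False
            try:
                request = procs[rank].send(resume_value[rank])
            except StopIteration:
                live.discard(rank)
                continue
            resume_value[rank] = None
            if isinstance(request, Send):
                mailbox[request.dest].append(request.payload)
            elif request is RECV:
                waiting[rank] = True

simulate({0: lambda r: worker(r, 1), 1: lambda r: worker(r, 0)})
```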
Item: Simulated Overloading using Generic Functions in Scheme (University of Waterloo, 1997). Cox, Anthony.
This thesis investigates extending the dynamically-typed, functional programming language Scheme with simulated overloading in order to permit the binding of multiple, distributed definitions to function names. Overloading facilitates the use of an incremental style of programming in which functions can be defined with a base behaviour and then extended with additional behaviour as it becomes necessary to support new data types. A technique is demonstrated that allows existing functions to be extended, without modification, therefore improving code reuse. Using the primitives provided by Scheme, it is possible to write functions that perform like the generic routines (functions) of the programming language EL1. These functions use the type of their arguments to determine, at run-time, the computation to perform. It is shown that by gathering the definitions for an overloaded function and building a generic routine, the language appears to provide overloading. A language extension that adds the syntax necessary to instruct the system to gather the distributed set of definitions for an overloaded function and incrementally build an equivalently applicable generic function is described. A simple type inference algorithm, necessary to support the construction of generic functions, is presented and detailed. Type inference is required to determine the domain of an overloaded function in order to generate the code needed to perform run-time overload resolution. Some limitations and possible extensions of the algorithm are discussed.
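Run-time overload resolution of the kind described, dispatching on the types of the arguments and built up incrementally from distributed definitions, can be sketched in a few lines. Python stands in for Scheme here; the Generic class, its define decorator, and the area example are illustrative only, not the thesis's language extension.

```python
# Minimal sketch of run-time overload resolution: a "generic function" keeps a
# table of implementations keyed by argument types and picks one by inspecting
# its arguments when called. All names are illustrative.
class Generic:
    def __init__(self, name):
        self.name = name
        self.methods = {}            # tuple of types -> implementation

    def define(self, *types):
        """Attach one more definition to the same name (incremental extension)."""
        def register(fn):
            self.methods[types] = fn
            return fn
        return register

    def __call__(self, *args):
        for types, fn in self.methods.items():
            if len(types) == len(args) and all(isinstance(a, t) for a, t in zip(args, types)):
                return fn(*args)
        raise TypeError(f"{self.name}: no applicable method for {args!r}")

area = Generic("area")

@area.define(int, int)
def _(w, h):                         # base behaviour: rectangles given as two ints
    return w * h

@area.define(float)
def _(radius):                       # later extension: circles, added without editing the above
    return 3.141592653589793 * radius ** 2

print(area(3, 4))    # 12, dispatched on (int, int)
print(area(2.0))     # ~12.566, dispatched on (float,)
```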
Item: Classification of Nilpotent Lie Algebras of Dimension 7 (over Algebraically Closed Field and R) (University of Waterloo, 1998). Gong, Ming-Peng.
This thesis is concerned with the classification of 7-dimensional nilpotent Lie algebras. Skjelbred and Sund published in 1977 their method of constructing all nilpotent Lie algebras of dimension n given those algebras of dimension < n, and their automorphism groups. By using this method, we construct all nonisomorphic 7-dimensional nilpotent Lie algebras in the following two cases: (1) over an algebraically closed field of arbitrary characteristic except 2; (2) over the real field R. We have compared our lists with three of the most recent lists (those of Seeley, Ancochea-Goze, and Romdhani). While our list in case (1) over C differs greatly from that of Ancochea-Goze, which contains too many errors to be usable, it agrees with that of Seeley apart from a few corrections that should be made in his list. Our list in case (2) over R contains all the algebras on Romdhani's list, which omits many algebras.

Item: Folding Orthogonal Polyhedra (University of Waterloo, 1999). Sun, Julie.
In this thesis, we study foldings of orthogonal polygons into orthogonal polyhedra. The particular problem examined here is whether a paper cutout of an orthogonal polygon with fold lines indicated folds up into a simple orthogonal polyhedron. The folds are orthogonal and the direction of each fold (upward or downward) is also given. We present a polynomial time algorithm to solve this problem. Next we consider the same problem with the exception that the directions of the folds are not given. We prove that this problem is NP-complete. Once it has been determined that a polygon does fold into a polyhedron, we consider some restrictions on the actual folding process, modelling the case when the polyhedron is constructed from a stiff material such as sheet metal. We show an example of a polygon that cannot be folded into a polyhedron if folds can only be executed one at a time. Removing this restriction, we show another polygon that cannot be folded into a polyhedron using rigid material.

Item: Multi-Resolution Approximate Inverses (University of Waterloo, 1999). Bridson, Robert.
This thesis presents a new preconditioner for elliptic PDE problems on unstructured meshes. Using ideas from second generation wavelets, a multi-resolution basis is constructed to effectively compress the inverse of the matrix, resolving the sparsity vs. quality problem of standard approximate inverses. This finally allows the approximate inverse approach to scale well, giving fast convergence for Krylov subspace accelerators on a wide variety of large unstructured problems. Implementation details are discussed, including ordering and construction of factored approximate inverses, discretization and basis construction in one and two dimensions, and possibilities for parallelism. The numerical experiments in one and two dimensions confirm the capabilities of the scheme. Along the way I highlight many new avenues for research, including the connections to multigrid and other multi-resolution schemes.
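The general pattern behind such preconditioners, an explicit approximate inverse applied as a matrix-vector product inside a Krylov solver, can be shown with standard tools. The sketch below does not reproduce the multi-resolution (wavelet) construction from the thesis; a truncated Neumann-series approximate inverse and a 1D Poisson matrix stand in for it, assuming NumPy and SciPy are available.

```python
# Sketch only: precondition conjugate gradients with an explicit approximate
# inverse M ~ A^{-1}, applied as a matrix-vector product. The approximate
# inverse here is a simple truncated Neumann series, not the multi-resolution
# basis construction described in the thesis.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")  # 1D Poisson
b = np.ones(n)

# M ~ A^{-1}: a few terms of D^{-1} * sum_k (I - D^{-1} A)^k, kept explicit.
Dinv = sp.diags(1.0 / A.diagonal())
R = sp.identity(n) - Dinv @ A
M = Dinv + R @ Dinv + R @ R @ Dinv

iters = {"plain": 0, "preconditioned": 0}
def count(key):
    def cb(xk):
        iters[key] += 1
    return cb

x_plain, _ = spla.cg(A, b, callback=count("plain"))
x_prec, _ = spla.cg(A, b, M=M, callback=count("preconditioned"))
print(iters)   # the approximate inverse should cut the iteration count noticeably
```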
Item: Coherent Beta Risk Measures for Capital Requirements (University of Waterloo, 1999). Wirch, Julia Lynn.
This thesis compares insurance premium principles with current financial risk paradigms and uses distorted probabilities, a recent development in premium principle literature, to synthesize the current models for financial risk measures in banking and insurance. This work attempts to broaden the definition of value-at-risk beyond the percentile measures. Examples are used to show how the percentile measure fails to give consistent results, and how it can be manipulated. A new class of consistent risk measures is investigated.
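The inconsistency of the percentile measure can be seen in a standard toy example (the numbers below are invented for illustration and are not from the thesis): two small independent risks each need zero capital under a 95% value-at-risk, yet their pooled portfolio does not, so the measure is not subadditive; a tail-average (distortion-type) measure behaves consistently here.

```python
# Illustrative numbers only: two independent loans, each losing 100 with
# probability 0.04. The percentile measure (VaR) is not subadditive on this
# example; the tail average (TVaR) is.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
loss_a = np.where(rng.random(n) < 0.04, 100.0, 0.0)
loss_b = np.where(rng.random(n) < 0.04, 100.0, 0.0)

def var(losses, level=0.95):
    """Percentile measure: the `level` quantile of the loss distribution."""
    return np.quantile(losses, level)

def tvar(losses, level=0.95):
    """Tail value-at-risk: average of the worst (1 - level) share of outcomes."""
    k = int(np.ceil((1 - level) * len(losses)))
    return np.sort(losses)[-k:].mean()

# Each loan alone: 96% chance of no loss, so the 95% percentile is 0.
# Pooled: about a 7.8% chance of at least one default, so the percentile jumps to 100.
print(var(loss_a), var(loss_b), var(loss_a + loss_b))      # ~0, ~0, ~100: not subadditive
print(tvar(loss_a), tvar(loss_b), tvar(loss_a + loss_b))   # ~80, ~80, ~103: subadditive here
```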
Item: Differential Equations and Depth First Search for Enumeration of Maps in Surfaces (University of Waterloo, 1999). Brown, Daniel.
A map is an embedding of the vertices and edges of a graph into a compact 2-manifold such that the remainder of the surface has components homeomorphic to open disks. With the goal of proving the Four Colour Theorem, Tutte began the field of map enumeration in the 1960's. His methods included developing the edge deletion decomposition, developing and solving a recurrence and functional equation based on this decomposition, and developing the medial bijection between two equinumerous infinite families of maps. Beginning in the 1980's, Jackson, Goulden and Visentin applied algebraic methods to the enumeration of non-planar and non-orientable maps, obtaining results of interest for mathematical physics and algebraic geometry, and formulated the Quadrangulation Conjecture and the Map-Jack Conjecture. A special case of the former is solved by Tutte's medial bijection. The latter uses Jack symmetric functions, which are a topic of active research. In the 1960's, Walsh and Lehman introduced a method of encoding orientable maps. We develop a similar method, based on depth first search and extended to non-orientable maps. With this, we develop a bijection that extends Tutte's medial bijection and partially solves the Quadrangulation Conjecture. Walsh extended Tutte's recurrence for planar maps to a recurrence for all orientable maps. We further extend the recurrence to include non-orientable maps, and express it as a partial differential equation satisfied by the generating series. By appropriately interpolating the differential equation and applying the depth first search method, we construct a parameter that empirically fulfils the conditions of the Map-Jack Conjecture, and we prove some of its predicted properties. Arques and Beraud recently obtained a continued fraction form of a specialisation of the generating series for maps. We apply the depth first search method with an ordinary differential equation to construct a bijection whose existence is implied by the continued fraction.

Item: The Model Theory of Algebraically Closed Fields (University of Waterloo, 2000). Cook, Daniel.
Model theory can express properties of algebraic subsets of complex n-space. The constructible subsets are precisely the first order definable subsets, and varieties correspond to maximal consistent collections of formulas, called types. Moreover, the topological dimension of a constructible set is equal to the Morley rank of the formula which defines it.

Item: A Framework for Machine-Assisted Software Architecture Validation (University of Waterloo, 2000). Lichtner, Kurt.
In this thesis we propose a formal framework for specifying and validating properties of software system architectures. The framework is founded on a model of software architecture description languages (ADLs) and uses a theorem-proving based approach to formally and mechanically establish properties of architectures. Our approach allows models defined using existing ADLs to be validated against properties that may not be expressible using the original notation and tool-set. The central component of the framework is a conceptual model of architecture description languages. The model formalizes a salient, shared set of design categories, relationships and constraints that are fundamental to these notations. An advantage of an approach based on a conceptual model is that it provides a uniform view of design information across a selection of languages. This allows us to construct alternate formal representations of design information specified using existing ADLs. These representations can then be mechanically validated to ensure they meet their specific formal requirements. After defining the model we embed it in the logic of the PVS theorem-proving environment and illustrate its utility with a case study. We first demonstrate how the elements of a design are specified using the model, and then show how this representation is validated using machine-assisted proof. Our approach allows the correctness of a design to be established against a wide range of properties. We illustrate with structural properties, behavioural properties, relationships between the structural and behavioural specification, and dynamic, or evolving, aspects of a system's topology.

Item: A survey on Traitor Tracing Schemes (University of Waterloo, 2000). Chen, Jason.
When intellectual properties are distributed over a broadcast network, the content is usually encrypted in a way such that only authorized users, who have a certain set of keys, can decrypt the content. Some authorized users may be willing to disclose their keys in constructing a pirate decoder which allows illegitimate users to access the content. It is desirable to determine the source of the keys in a pirate decoder, once one is captured. Traitor tracing schemes were introduced to help solve this problem. A traitor tracing scheme usually consists of: a scheme to generate and distribute each user's personal key, a cryptosystem used to protect session keys that are used to encrypt/decrypt the actual content, and a tracing algorithm to determine one source of the keys in a pirate decoder. In this thesis, we survey the traitor tracing schemes that have been suggested. We group the schemes into two groups: symmetric schemes, in which the session key is encrypted and decrypted using the same key, and asymmetric schemes, in which the session key is encrypted and decrypted using different keys. We also explore the possibility of a truly public scheme, in which the data supplier knows the encryption keys only. A uniform analysis is presented on the efficiency of these schemes using a set of performance parameters.

Item: Hadez, a Framework for the Specification and Verification of Hypermedia Applications (University of Waterloo, 2000). Morales-Germán, Daniel.
In recent years, several methodologies for the development of hypermedia applications have been proposed. These methodologies are, primarily, guidelines to be followed during the design process. They also indicate what deliverables should be created at each of their stages. These products are usually informally specified, in the sense that they have neither formal syntax nor formally defined semantics, and they are not required to pass validity tests. Hadez formally specifies the design of a hypermedia application, supports the verification of properties of the specification, and promotes the reuse of design. Hadez is an object-oriented specification language with formal syntax and semantics. Hadez is based on the formal specification languages Z and Z++, with extensions unique to hypermedia. It uses set theory and first order predicate logic. It divides the specification of a hypermedia application into three main parts: its conceptual schema, which describes the domain-specific data and its relationships; its structural schema, which describes how this data is combined and gathered into more complex entities, called composites; and the perspective schema, which uses Abstract Design Perspectives (artifacts unique to Hadez) to indicate how these composites are mapped to hyperpages, and how the user interacts with them. Hadez provides a formal framework in which properties of a specification can be specified and answered. The specification of an application should not constrain its implementation and, therefore, it is independent of the platform on which the application is to be presented. As a consequence, the same design can be instantiated into different applications, each for a different hypermedia platform. Hadez can be further extended with design patterns. Patterns enable reuse by capturing good solutions to well-known problems. Hadez characterizes patterns and makes their use readily available to the designer. Furthermore, Hadez is process independent, and is intended to be used with any of the main hypermedia design methodologies: EROM, HDM, OOHDM or RMM.

Item: Matrix Formulations of Matching Problems (University of Waterloo, 2000). Webb, Kerri.
Finding the maximum size of a matching in an undirected graph and finding the maximum size of a branching in a directed graph can be formulated as matrix rank problems. The Tutte matrix, introduced by Tutte as a representation of an undirected graph, has rank equal to the maximum number of vertices covered by a matching in the associated graph. The branching matrix, a representation of a directed graph, has rank equal to the maximum number of vertices covered by a branching in the associated graph. A mixed graph has both undirected and directed edges, and the matching forest problem for mixed graphs, introduced by Giles, is a generalization of the matching problem and the branching problem. A mixed graph can be represented by the matching forest matrix, and the rank of the matching forest matrix is related to the size of a matching forest in the associated mixed graph. The Tutte matrix and the branching matrix have indeterminate entries, and we describe algorithms that evaluate the indeterminates as rationals in such a way that the rank of the evaluated matrix is equal to the rank of the indeterminate matrix. Matroids in the context of graphs are discussed, and matroid formulations for the matching, branching, and matching forest problems are given.
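The rank statement for the Tutte matrix can be checked numerically by the standard randomized substitution (in the spirit of Lovász's approach, not an algorithm from the thesis): replace each indeterminate with a random number and compare the rank with the matching size. With random values the equality holds only with high probability. The sketch below assumes NumPy and NetworkX and uses the Petersen graph as an example.

```python
# Quick numerical check of "rank of the Tutte matrix = vertices covered by a
# maximum matching", using random values in place of the indeterminates.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
G = nx.petersen_graph()
n = G.number_of_nodes()

T = np.zeros((n, n))
for i, j in G.edges():
    x = rng.integers(1, 10_000)      # random stand-in for the indeterminate x_ij
    T[i, j], T[j, i] = x, -x         # skew-symmetric by construction

rank = np.linalg.matrix_rank(T)
matching = nx.max_weight_matching(G, maxcardinality=True)
print(rank, 2 * len(matching))       # both should be 10: the Petersen graph has a perfect matching
```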
Item: A Probabilistic Approach to Image Feature Extraction, Segmentation and Interpretation (University of Waterloo, 2000). Pal, Christopher Joseph.
This thesis describes a probabilistic approach to image segmentation and interpretation. The focus of the investigation is the development of a systematic way of combining color, brightness, texture and geometric features extracted from an image to arrive at a consistent interpretation for each pixel in the image. The contribution of this thesis is thus the presentation of a novel framework for the fusion of extracted image features producing a segmentation of an image into relevant regions. Further, a solution to the sub-pixel mixing problem is presented based on solving a probabilistic linear program. This work is specifically aimed at interpreting and digitizing multi-spectral aerial imagery of the Earth's surface. The features of interest for extraction are those of relevance to environmental management, monitoring and protection. The presented algorithms are suitable for use within a larger interpretive system. Some results are presented and contrasted with other techniques. The integration of these algorithms into a larger system is based firmly on a probabilistic methodology and the use of statistical decision theory to accomplish uncertain inference within the visual formalism of a graphical probability model.

Item: A Formalization of an Extended Object Model Using Views (University of Waterloo, 2000). Nova, Luis C. M.
Reuse of software designs, experience and components is essential to making substantial improvements in software productivity, development cost, and quality. However, the many facets of reuse are still rarely used in the various phases of the software development lifecycle because of a lack of adequate theories, processes, and tools to support consistent application of reuse concepts. There is a need for approaches, including definitions, models and properties of reuse, that would provide explicit guidance to a software development team in applying reuse. In particular there is a need to provide abstractions that clearly separate the various functional concerns addressed in a software system. Separating concerns simplifies the identification of the software components that can benefit from reuse and can provide guidance on how reuse may be applied. In this thesis we present an extended model related to the separation of concerns in object-oriented design. The model, called views, indicates how an object-oriented design can be clearly separated into objects and their corresponding interfaces. In this model objects can be designed so that they are independent of their environment, because adaptation to the environment is the responsibility of the interface or view. The view can be seen as expressing the semantics for the 'glue' that joins components or objects together to create a software system. Informal versions of the views model have already been successfully applied to operational and commercial software systems. The objective of this thesis is to provide the views notion with a theoretical foundation to address reuse and separation of concerns. After clearly defining the views model we show the formal approach to combining the objects, interfaces (views), and their interconnection into a complete software system. The objects and interfaces are defined using an object calculus based on temporal logic, while the interconnections among objects and views are specified using category theory. This formal framework provides the mathematical foundation to support the verification of the properties of both the components and the composite software system. We then show how verification can be mechanized by converting the formal version of the views model into higher-order logic and using PVS to support mechanical proofs.

Item: A survey of the trust region subproblem within a semidefinite framework (University of Waterloo, 2000). Fortin, Charles.
Trust region subproblems arise within a class of unconstrained methods called trust region methods. The subproblems consist of minimizing a quadratic function subject to a norm constraint. This thesis is a survey of different methods developed to find an approximate solution to the subproblem. We study the well-known method of Moré and Sorensen and two recent methods for large sparse subproblems: the so-called Lanczos method of Gould et al. and the Rendl and Wolkowicz algorithm. The common ground for exploring these methods will be semidefinite programming. This approach has been used by Rendl and Wolkowicz to explain their method and the Moré and Sorensen algorithm; we extend this work to the Lanczos method. The last chapter of this thesis is dedicated to some improvements made to the Rendl and Wolkowicz algorithm and to comparisons between the Lanczos method and the Rendl and Wolkowicz algorithm. In particular, we show some weaknesses of the Lanczos method and show that the Rendl and Wolkowicz algorithm is more robust.
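The subproblem being surveyed, minimizing a quadratic subject to a norm constraint, is easy to state in code. The sketch below is not the Moré-Sorensen, Lanczos, or Rendl-Wolkowicz method; it just solves a small instance by bisection on the Lagrange multiplier of the constraint, ignoring the hard case, and all names are illustrative.

```python
# Minimal sketch of the trust region subproblem: minimize 0.5*x^T H x + g^T x
# subject to ||x|| <= delta. Easy case only, via bisection on the multiplier.
import numpy as np

def trust_region_step(H, g, delta, tol=1e-10):
    eigmin = np.linalg.eigvalsh(H)[0]
    if eigmin > 0:
        x = np.linalg.solve(H, -g)
        if np.linalg.norm(x) <= delta:              # interior (unconstrained) solution
            return x
    lo = max(0.0, -eigmin) + 1e-12                  # (H + lam*I) must be positive definite
    hi = lo + 1.0
    norm = lambda lam: np.linalg.norm(np.linalg.solve(H + lam * np.eye(len(g)), -g))
    while norm(hi) > delta:                         # grow until the step is short enough
        hi *= 2.0
    while hi - lo > tol:                            # ||x(lam)|| decreases as lam grows
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if norm(mid) > delta else (lo, mid)
    return np.linalg.solve(H + hi * np.eye(len(g)), -g)

H = np.array([[2.0, 0.0], [0.0, -1.0]])             # indefinite, so the constraint is active
g = np.array([1.0, 1.0])
x = trust_region_step(H, g, delta=1.0)
print(x, np.linalg.norm(x))                         # a boundary step with ||x|| ~= 1
```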
Item: On the Solution of the Hamilton-Jacobi Equation by the Method of Separation of Variables (University of Waterloo, 2000). Bruce, Aaron.
The method of separation of variables facilitates the integration of the Hamilton-Jacobi equation by reducing its solution to a series of quadratures in the separable coordinates. The case in which the metric tensor is diagonal in the separable coordinates, that is, orthogonal separability, is fundamental. Recent theory by Benenti has established a concise geometric (coordinate-independent) characterisation of orthogonal separability of the Hamilton-Jacobi equation on a pseudo-Riemannian manifold. It generalises an approach initiated by Eisenhart and developed by Kalnins and Miller. Benenti has shown that the orthogonal separability of a system via a point transformation is equivalent to the existence of a Killing tensor with real simple eigenvalues and orthogonally integrable eigenvectors. Applying a moving frame formalism, we develop a method that produces the orthogonal separable coordinates for low dimensional Hamiltonian systems. The method is applied to a two dimensional Riemannian manifold of arbitrary curvature. As an illustration, we investigate Euclidean 2-space and the two dimensional surfaces of constant curvature, recovering known results. Using our formalism, we also derive the known superseparable potentials for Euclidean 2-space. Some of the original results presented in this thesis were announced in [8, 9, 10].

Item: Static Conflict Analysis of Transaction Programs (University of Waterloo, 2000). Zhang, Connie.
Transaction programs consist of read and write operations issued against the database. In a shared database system, one transaction program conflicts with another if it reads or writes data that another transaction program has written. This thesis presents a semi-automatic technique for pairwise static conflict analysis of embedded transaction programs. The analysis predicts whether a given pair of programs will conflict when executed against the database. There are several potential applications of this technique, the most obvious being transaction concurrency control in systems where it is not necessary to support arbitrary, dynamic queries and updates. By analyzing transactions in such systems before the transactions are run, it is possible to reduce or eliminate the need for locking or other dynamic concurrency control schemes.
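The conflict relation used here has a simple set-based core once each program's read and write sets are known. The thesis derives that information statically from embedded programs; the sketch below only shows the pairwise test on hand-written read/write sets, with all program and table names invented for the example.

```python
# Pairwise conflict test on known read/write sets (illustrative names only).
from dataclasses import dataclass, field
from itertools import combinations

@dataclass
class TxnProgram:
    name: str
    reads: set = field(default_factory=set)
    writes: set = field(default_factory=set)

def conflicts(p, q):
    """p conflicts with q if either one writes data the other reads or writes."""
    return bool(p.writes & (q.reads | q.writes) or q.writes & (p.reads | p.writes))

programs = [
    TxnProgram("post_payment", reads={"account"}, writes={"account", "ledger"}),
    TxnProgram("monthly_report", reads={"ledger"}),
    TxnProgram("update_profile", writes={"customer"}),
]

for p, q in combinations(programs, 2):
    print(p.name, q.name, "conflict" if conflicts(p, q) else "no conflict")
# post_payment/monthly_report conflict on "ledger"; update_profile conflicts with neither.
```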
Item: Variational Spectral Analysis (University of Waterloo, 2000). Sendov, Hristo.
We present results on smooth and nonsmooth variational properties of symmetric functions of the eigenvalues of a real symmetric matrix argument, as well as absolutely symmetric functions of the singular values of a real rectangular matrix. Such results underpin the theory of optimization problems involving such functions. We answer the question of when a symmetric function of the eigenvalues allows a quadratic expansion around a matrix, and then the stronger question of when it is twice differentiable. We develop simple formulae for the most important nonsmooth subdifferentials of functions depending on the singular values of a real rectangular matrix argument and give several examples. The analysis of the above two classes of functions may be generalized in various larger abstract frameworks. In particular, we investigate how functions depending on the eigenvalues or the singular values of a matrix argument may be viewed as the composition of symmetric functions with the roots of hyperbolic polynomials. We extend the relationship between hyperbolic polynomials and self-concordant barriers (an extremely important class of functions in contemporary interior point methods for convex optimization) by exhibiting a new class of self-concordant barriers obtainable from hyperbolic polynomials.

Item: An Approximation Algorithm for Character Compatibility and Fast Quartet-based Phylogenetic Tree Comparison (University of Waterloo, 2000). Tsang, John.
Phylogenetic analysis, or the inference of evolutionary history, is done routinely by biologists and is one of the most important problems in systematic biology. In this thesis, we study two computational problems in the area. First, we study the evolutionary tree reconstruction problem under the character compatibility (CC) paradigm and give a polynomial time approximation scheme (PTAS) for a variation of the formulation called fractional character compatibility (FCC), which has been proven to be NP-hard. We also present a very simple algorithm called the Ordinal Split Method (OSM) to generate bipartitions given sequence data, which can serve as a front-end to the PTAS. The performance of the OSM and the validity of the FCC formulation are studied through simulation experiments. The second part of this thesis presents an efficient algorithm to compare evolutionary trees using the quartet metric. Different evolutionary hypotheses arise when different data sets are used or when different tree inference methods are applied to the same data set. Tree comparisons are routinely done by biologists to evaluate the quality of their tree inference experiments.
The quartet metric has many desirable properties but its use has been hindered by its relatively heavy computational requirements. We address this problem by giving the first O(n^2) time algorithm to compute the quartet distance between two evolutionary trees.
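For scale, a direct computation of the quartet metric examines every four-leaf subset, which is O(n^4) work; the thesis's contribution is an O(n^2) algorithm. The brute-force sketch below just makes the metric concrete on two invented five-leaf trees, assuming NetworkX for the tree representation.

```python
# Brute-force quartet distance, O(n^4): count four-leaf subsets on which the
# two trees induce different topologies. Tree shapes and leaf names are invented.
from itertools import combinations
import networkx as nx

def quartet_topology(tree, a, b, c, d):
    """Return the resolved pairing ab|cd, ac|bd or ad|bc induced by the tree."""
    for (x, y), (u, v) in (((a, b), (c, d)), ((a, c), (b, d)), ((a, d), (b, c))):
        p1 = set(nx.shortest_path(tree, x, y))
        p2 = set(nx.shortest_path(tree, u, v))
        if not p1 & p2:                       # disjoint paths <=> this pairing holds
            return frozenset({frozenset({x, y}), frozenset({u, v})})
    return None                               # unresolved (star) quartet

def quartet_distance(t1, t2, leaves):
    return sum(quartet_topology(t1, *q) != quartet_topology(t2, *q)
               for q in combinations(leaves, 4))

leaves = ["A", "B", "C", "D", "E"]
t1 = nx.Graph([("A", "x"), ("B", "x"), ("x", "y"), ("C", "y"), ("y", "z"), ("D", "z"), ("E", "z")])
t2 = nx.Graph([("A", "x"), ("B", "x"), ("x", "y"), ("D", "y"), ("y", "z"), ("C", "z"), ("E", "z")])
print(quartet_distance(t1, t2, leaves))       # 2: the trees disagree on {A,C,D,E} and {B,C,D,E}
```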