Digitized University of Waterloo Theses
Permanent URI for this collection: https://hdl.handle.net/10012/21195
The following collection includes theses created by UW graduate students prior to 2010 that have since been digitized. Please note that not all theses written by UW graduate students have been digitized. Current graduate students should submit their work only to the Theses collection.
Browsing Digitized University of Waterloo Theses by Title
Now showing 1 - 20 of 689
Item 4D induced matter from non-compact 5D Kaluza-Klein gravity (University of Waterloo, 2000) Sajko, W. J.
The traditional constraints associated with five-dimensional Kaluza-Klein gravity are removed, so that the 5D metric can depend on the extra coordinate and that coordinate can be non-compact. The assumption that the 5D theory is vacuum, R_AB = 0, provides the minimal set of field equations needed to induce matter in 4D from 5D via dimensional reduction. This reduction is carried out for two general types of 5D metric: 1) the traditional Kaluza-Klein metric, which unifies gravity, electromagnetism and a scalar field, and 2) a conformal, extra-coordinate-dependent metric, which induces an effective cosmological constant and realistic neutral matter. Physical aspects such as test-particle motion, the weak-field limit and gravitational waves, and the energy (from a Hamiltonian perspective) and conserved quantities associated with scalar-tensor theories of gravity are studied in detail. It is found that 5D relativity is a rich extension of 4D gravity that unifies geometry with 4D matter.

Item An a priori resource-based classification methodology for specialty/secondary ambulatory patients (University of Waterloo, 1997) Khamalah, Joseph Nalukulu

Item The abstract media model (University of Waterloo, 1998) Suhanic, West M. L.
The goal of this thesis was to create a comprehensive model which could be used as a framework for thinking about, working with, and understanding the many elements of media. This model is called the Abstract Media Model (AMM). The AMM has been built using a systems design methodology so that not only were the overall technical aspects of problems carefully considered, but there was equal concern for the economic, social, human, and political parameters. In addition to providing a media framework, the AMM gave rise to:
- a new media representation called the Abstract Media Code (AMC). The AMC is important as it can represent many media types.
The AMC, because of its time-code foundation, also serves as a bridge: it links the existing media production community, with its huge investment in analogue-based tools, techniques and methodologies, to the emerging digital world. The AMC is also an example of convergence in media representations and as such can support the technology convergence we are currently witnessing.
- a decontextualised, service-oriented media model which is implemented, in prototype form, using distributed object computing (DOC).
- a new type of database system built using neural net technology which utilises the Abstract Media Code.
Each of these accomplishments was made possible by the partition-based design of the AMM. This type of design has resulted in the AMM providing a fragmented and decontextualised view of media. The fragmentation and decontextualisation of media is important as it is the basis for recombination, which in turn presents opportunities for technological, social and cultural innovation.

Item Accounting for misclassification in binary longitudinal data (University of Waterloo, 1999) Rosychuk, Rhonda Jean

Item Active filtering of AC harmonics from HVDC converters (University of Waterloo, 1997) Plaisant, André Luiz da Rosa

Item An activity-based travel needs model for the elderly (University of Waterloo, 1998) Hildebrand, Eric David
Over the coming decades, a significant increase in the number of elderly people requiring travel will occur as the demographic profile of Canadians shifts, thereby affecting all aspects of transportation demand. Furthermore, cohort effects are anticipated which may see tomorrow's elderly leading more active lives and travelling to more activities than today's aged. The current lack of a detailed description of elderly travel characteristics and behaviours, particularly one that examines the issue at a level involving activity engagement, was a deficiency addressed by this research.
A further product of this study was the development and testing of a simplified activity-based modelling framework. The framework was designed to describe elderly travel characteristics and demand, with the added benefit of providing a tool that can evaluate transportation-related impacts of proposed policies. Comparisons of activity participation of the elderly with younger age groups showed that although the daily number of activities remains relatively constant, beginning around age 75 there is a significant decrease in the number to which they travel. There are also significant changes in the types of activities to which the elderly travel compared with the younger age groups. Furthermore, the daily number of trip tours was shown to increase for those 65 to 75 years of age before it steadily declines with advancing age. The average number of activities accessed in each trip tour was found to decrease significantly beginning at about age 65. Having been traditionally addressed as a relatively homogeneous group by transportation planners, the elderly were shown to possess extremely varied characteristics. Cluster analyses were undertaken to identify subpopulations of the elderly from a sample of 1,150 who responded to an activity-based survey conducted in Portland, Oregon. To identify different lifestyle groups, exploratory analyses were undertaken to delineate clusters based on socio-demographic, travel, and activity engagement variables. The final cluster solution chosen to provide a categorical basis for the modelling framework identified six distinct lifestyle groups based on socio-demographic variables. These clusters were also found to have statistically significant differences in travel behaviour and activity engagement patterns.
The clusters identified are characterized as those who remain active in the workforce, the mobility impaired, the elderly who live with their grown offspring, the disabled who drive, and those who either live alone or with a spouse and continue to drive. The activity-based model was developed using discrete-event, stochastic simulation (or microsimulation) as a platform. Through a sequential process, the model stochastically assigns each individual a daily itinerary of activities. Trip tours are estimated based on the type and quantity of activities requiring travel. All model assignments are conditioned on each individual's cluster membership. Although the model is operationalized at a relatively rudimentary level, it provides a base structure that can be enhanced in subsequent versions. The model framework successfully replicated all facets of the base data set used for its development. Elements of travel behaviour synthesized for individuals being modelled included total daily activities (with and without travel), activities engaged in by class (with and without travel), total daily trip tours, and mode splits. Comparing model outputs with observed base data, both the number of activities requiring travel and the total daily trip tours were overestimated by 3.7 percent for all of the elderly combined. The travel model was also applied to a smaller external data set (data from a different study area not used for model development) for validation. The number of activities requiring travel and the number of trip tours were overestimated by 9.2 and 10.5 percent, respectively.
Differences between model outputs and observed values are the combined result of the stochastic nature of the modelling framework, aggregation effects (i.e., assigning individuals to clusters with predefined characteristics), model inaccuracies (e.g., use of regression models to predict the number of trip tours), and an incomplete set of constraining rules which govern daily activity itineraries. Two test applications of the model explored its ability to evaluate the impacts of a road pricing policy and a mandatory license retesting program on the different segments of the elderly. Results from a stated-adaptation survey for road pricing were used to modify the underlying empirical distributions embedded in the base model. The model was rerun and the results compared with the original outputs. The analysis allowed the varied impacts of increased travel costs to be compared between the six elderly lifestyle clusters. This first test application illustrated the importance of having a statistically significant sample from a stated-response survey to represent each lifestyle cluster. Future applications should rely on stratified sampling techniques for stated-response surveys. The second test application examined the potential impacts associated with the implementation of a mandatory relicensing program for those older than 80. Given that the clusters were delineated based on several general socio-demographic variables, the model was not able to fully isolate the activity and travel patterns of this target group based only on age and driver's license variables. The test case reinforced the importance of defining clusters based on the end use of the model. For specific uses of the model, defining clusters on dimensions other than general socio-demographic variables will sometimes be necessary. The research has provided a more comprehensive understanding of the varied lifestyles, activity patterns, and subsequent travel behaviour and needs of the elderly.
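The sequential, cluster-conditioned stochastic assignment described earlier can be sketched in a few lines. This is a hypothetical illustration only: the cluster names, empirical distributions, and activities-per-tour figures below are invented, not taken from the thesis.

```python
import random

# Invented per-cluster empirical distributions: probability of each daily
# count of activities that require travel.
ACTIVITY_COUNT_DIST = {
    "active_workforce": {1: 0.1, 2: 0.3, 3: 0.4, 4: 0.2},
    "mobility_impaired": {0: 0.4, 1: 0.4, 2: 0.2},
}

# Invented average number of activities accessed per trip tour, by cluster.
ACTIVITIES_PER_TOUR = {"active_workforce": 2.0, "mobility_impaired": 1.2}

def sample_from(dist, rng):
    """Draw one value from a discrete empirical distribution."""
    r = rng.random()
    cum = 0.0
    for value, p in dist.items():
        cum += p
        if r < cum:
            return value
    return value  # guard against floating-point rounding of probabilities

def assign_itinerary(cluster, rng):
    """Assign one individual a daily itinerary, conditioned on cluster membership."""
    n_activities = sample_from(ACTIVITY_COUNT_DIST[cluster], rng)
    # Estimate trip tours from the quantity of travel-requiring activities.
    n_tours = max(0, round(n_activities / ACTIVITIES_PER_TOUR[cluster]))
    return {"activities": n_activities, "tours": n_tours}

rng = random.Random(42)
day = assign_itinerary("mobility_impaired", rng)
```

A full model would add activity types, mode choice, and the constraining rules that govern daily itineraries, as the abstract describes.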
Furthermore, it has been shown that a categorical approach using lifestyle groups with unique activity and travel characteristics can be successfully combined within an activity-based framework. Although this approach was applied specifically to the elderly, it can be extended to other heterogeneous groups, including the population as a whole. The successful development and validation of a simplified activity-based model have given this field of study a much-needed demonstration of an operational activity-based modelling framework. It has been shown that even a simplified framework can synthesize the linkages between activity patterns and corresponding trip-making.

Item Adaptation in bipedal locomotion, insights from dynamic modelling, numerical optimization, and neuro-fuzzy-genetic programming (University of Waterloo, 1998) Armand, Mehran

Item Adaptive digital image compression based on segmentation and block classification (University of Waterloo, 1997) El-Sakka, Mahmoud R.
Over the last few decades, many good image compression schemes have been developed. The performance of these schemes varies from low to high compression ratios with low to high levels of degradation of the decompressed images. Since the end users of decompressed images are usually human beings, it is natural that attempts should be made to incorporate some of the human visual system's properties into the encoding schemes to achieve even further compression with less noticeable degradation. This thesis presents a new digital image compression scheme which exploits one of the human visual system's properties, namely that images are recognized by their regions, to achieve high compression ratios. It also assigns each image region a variable bit count proportional to the amount of information the region conveys to the viewer.
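In the spirit of the region-based, variable-bit-count idea described above, here is a hypothetical sketch: blocks are classified by pixel variance and each class is given a different bit budget. The thresholds and bit budgets below are invented for illustration; they are not the thesis's actual classifier or encoder.

```python
import numpy as np

def classify_block(block, smooth_thresh=25.0, edge_thresh=400.0):
    """Classify a block by pixel variance (thresholds are illustrative only):
    low variance -> smooth, very high variance -> edge, otherwise texture."""
    v = float(np.var(block))
    if v < smooth_thresh:
        return "smooth"
    if v > edge_thresh:
        return "edge"
    return "texture"

# Illustrative bit budgets: more bits for regions carrying more information.
BITS_PER_CLASS = {"smooth": 8, "texture": 48, "edge": 96}

def bit_budget(image, block_size=8):
    """Tile the image into fixed-size blocks and total the class-dependent bit counts."""
    h, w = image.shape
    total = 0
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            cls = classify_block(image[y:y + block_size, x:x + block_size])
            total += BITS_PER_CLASS[cls]
    return total

flat = np.full((16, 16), 128.0)  # a uniform image: every block is smooth
```

Note that this sketch tiles fixed-size blocks; the thesis itself segments into variable-sized blocks.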
The new scheme copes with image non-stationarity by adaptively segmenting the image into variable-sized block regions using a quad-tree segmentation approach, and classifying the blocks into statistically and perceptually different classes: a smooth class, a textural class, and an edge class. Blocks in each class are encoded separately. For smooth blocks, a new adaptive prediction technique is used to encode block averages, while an optimized DCT-based technique is used to encode both edge and textural blocks. Based on extensive testing and comparisons with other existing compression techniques, the performance of the new scheme surpasses the performance of the JPEG standard and goes beyond its compression limits. In most test cases, the new compression scheme achieves a maximum compression ratio at least twice that of JPEG, while exhibiting lower objective and subjective image degradation. Moreover, the performance of the new block-based compression is comparable to that of the state-of-the-art wavelet-based compression technique and provides a good alternative when adaptability to image content is of interest.

Item An adaptive ecosystem approach to rehabilitation and management of the Cooum River environmental system in Chennai, India (University of Waterloo, 2000) Bunch, Martin J.
This research investigates the application of an adaptive ecosystem approach to the problem of the Cooum River and environs in Chennai (formerly Madras), India. The Cooum River is an extremely polluted urban stream that flows into the Bay of Bengal through the heart of Chennai, India's fourth-largest metropolis. During the dry (non-monsoon) season, the upper reaches of the river are dry, and flow in the river may be attributed primarily to the production of sewage by the city's population. The river is essentially a foul-smelling open sewer.
Complexity of the problem is due as much to human factors (population growth, poverty, uncontrolled urban development, jurisdictional conflicts, modes of behaviour of the citizenry, and institutional culture) as to physical characteristics of the system (flat topography, tidal action, blockage of the river mouth by sand bar formation, and monsoon flooding). Uncertainty in the situation is both structural (regarding main processes and activities in the system and the nature of relationships among the various actors and elements) and parametric (having to do with scarcity, poor quality and restricted access to data). The work has drawn upon methods and techniques of Adaptive Environmental Management and Soft Systems Methodology to put the ecosystem approach into practice and address the problem. Specifically, this has involved a series of workshops which have brought together planners, researchers, NGOs, and other stakeholders in a participatory process oriented toward problem definition, system identification and conceptualization, determination of objectives for management, and the generation and exploration of management interventions. In addition, a central component of the program has been the development of a loosely coupled GIS, environmental simulation model, and decision support module. This is based upon a framework provided by participants in the first workshop in the series, and operationalizes a common understanding of the system. In addition to generating new insight into the nature of the problem situation, the research has provided a potentially useful tool to planners, managers and researchers in Chennai in the form of a GIS database and decision support system (DSS). Aside from the tool itself, it was found that the process of developing a conceptual model and attempting to represent it in the DSS made a significant contribution to understanding of the Cooum system.
In particular, this process forced assumptions to be stated explicitly and publicly, highlighted areas of uncertainty, and led to new understanding in participants' conception of the problem situation. The program of research also provided a much-needed forum for open debate and exchange of information, removed from the restrictive institutional culture of government departments.

Item An adaptive framework for sensor planning in a coordinated multi-agent environment (University of Waterloo, 2001) Hodge, Lovell A.

Item Adaptive local statistics filtering (University of Waterloo, 1997) Adriannse, Robert

Item Addressing Northern decision-making capacity, the case of health advisories and the Labrador Inuit (University of Waterloo, 1999) Furgal, C.

Item The adoption of new university technology for product innovation, a core competence perspective (University of Waterloo, 2001) Van den Berghe, Larry

Item Advances in mathematical modelling of multicomponent free-radical polymerizations in bulk, solution and emulsion (University of Waterloo, 1999) Gao, Jun
A computer package has been developed to simulate free-radical multicomponent polymerization in bulk, solution and emulsion. The simulation package consists of two models: one for bulk and solution polymerization, and the other for emulsion polymerization. Great emphasis has been placed on making both models general and reliable. This has been achieved through in-depth kinetic studies, critical model evaluation and extensive model testing. The models have been gradually enhanced and extended from a homopolymerization case to two comprehensive multicomponent bulk/solution/emulsion models. Databases of physicochemical parameter values for both models have been developed in parallel. The bulk/solution model's database includes 12 monomers, and the emulsion model's database consists of 5 monomers.
Both databases also include many initiators, solvents (in the bulk/solution model's database only), chain transfer agents and emulsifiers (in the emulsion model's database only). Such extensive databases allow the models to simulate multicomponent polymerizations for a wide range of reaction recipes. In the first stage of model development, the bulk/solution model was developed and extensively tested with a total of 15 copolymer systems. Several important aspects of copolymerization kinetics were discussed. In most model-testing cases, model predictions turned out to be very satisfactory, and this confirms the reliability of the package. The literature review on copolymerization kinetics and the model testing presented in this thesis are believed to be the most extensive so far in the literature. In the second stage of model development, terpolymerization kinetics in bulk/solution over the entire conversion range were investigated in detail. The bulk/solution copolymerization model was extended to simulate terpolymerization in bulk/solution and tested over the entire conversion range. This is the first time that a terpolymer system has been modelled over the entire conversion range. Testing has been performed with the very challenging (and widely used commercially) system of butyl acrylate/methyl methacrylate/vinyl acetate in bulk and solution (toluene). Due to the scarcity of available experimental data in the literature, we were not able to test the model more extensively with other terpolymerizations; however, the system in question was extremely challenging as a test case. In the third stage of model development, a general and comprehensive emulsion model has been developed. This emulsion model is one of the very few that can simulate emulsion homopolymerization as well as copolymerization under a very wide range of reaction and operation conditions.
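As a generic illustration of the kind of kinetics such a simulator integrates, the sketch below uses the textbook steady-state rate law for free-radical homopolymerization, Rp = kp[M]·sqrt(2·f·kd·[I]/kt). It is not the thesis's model, and every rate constant below is an invented, order-of-magnitude value.

```python
import math

# Invented, order-of-magnitude rate constants (illustration only).
kd = 1.0e-5   # initiator decomposition, 1/s
f = 0.6       # initiator efficiency
kp = 500.0    # propagation, L/(mol*s)
kt = 1.0e7    # termination, L/(mol*s)

def simulate_conversion(M0, I0, t_end, dt=1.0):
    """Explicit-Euler integration of monomer conversion under the
    steady-state radical assumption: Rp = kp * [M] * sqrt(2*f*kd*[I]/kt)."""
    M, I, t = M0, I0, 0.0
    while t < t_end:
        radicals = math.sqrt(2.0 * f * kd * I / kt)  # steady-state radical conc.
        M -= kp * M * radicals * dt                  # propagation consumes monomer
        I -= kd * I * dt                             # first-order initiator decay
        t += dt
    return 1.0 - M / M0                              # fractional conversion

x = simulate_conversion(M0=5.0, I0=0.01, t_end=3600.0)
```

A real package like the one described must add, among much else, the gel effect, copolymer composition equations, and (for emulsion) particle nucleation and radical desorption.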
The model can describe the most important physicochemical phenomena (micelle formation, particle nucleation, absorption and desorption of radicals, monomer partitioning, gel effect, etc.) occurring in emulsion polymerization. Difficult and challenging subjects in emulsion polymerization kinetics, such as monomer partitioning through thermodynamic equilibrium, particle nucleation and desorption, have been treated satisfactorily in a general fashion. This model can predict important reaction characteristics (conversion profile and rate of polymerization) and polymer/latex properties (number of particles, particle size, molecular weight averages, copolymer composition and sequence, etc.). The emulsion model has been tested with monomers of very different characteristics, such as styrene (a typical "case 2" monomer with very low water solubility and no desorption), vinyl acetate (a typical "case 1" monomer with high water solubility and significant desorption) and methyl methacrylate (a typical "case 3" monomer that exhibits a strong gel effect). The model has also been tested for the copolymer system styrene/methyl methacrylate. In most cases, simulation results compare satisfactorily with experimental data collected either from the literature or from this laboratory.
After this systematic effort in refining and testing our multicomponent simulation model/package/database, we strongly believe that the package can provide a very flexible and useful tool to guide academic and industrial research and development, as extensively demonstrated in Gao and Penlidis (1996, 1998) for homo- and copolymerizations, and in the present thesis for terpolymerizations and the emulsion case.

Item After the first few seconds, stereotype activation over the course of time (University of Waterloo, 1999) Adams, Barbara D.

Item Agent of imperial change, James MacQueen and the British Empire, 1778-1870 (University of Waterloo, 1997) Pardue, Jeffrey David
This thesis examines the long and varied career of James MacQueen (1778-1870), a passionate and seemingly indefatigable Scotsman who spent his life attempting to consolidate British imperial power in the old empire of the West Indies, and to introduce it to what he hoped would be part of a new empire in Africa. He was always a dedicated imperialist, but his work in four specific capacities is highlighted: as a pro-slavery polemicist during the last decade of the emancipation debate, 1823-33; as an agent for the Colonial Bank, 1836-38; as founder and general superintendent of the Royal Mail Steam Packet Company, 1837-44; and as a geographer, 1820-70. In each of these endeavours MacQueen acted as an "agent"--an imperial go-between--attempting to bind Great Britain closer to two of its peripheries. Although his grand plans for expansion into Africa never found much support, he did help consolidate central power through the establishment of metropolitan-controlled and government-chartered companies in the Caribbean colonies, and by filling in the so-called "blank spots" of Africa. On the one hand, MacQueen brought the metropolis to the periphery, and on the other, he brought the periphery to the metropolis.
As one born during the American War of Independence who died on the eve of the partition of Africa, MacQueen lived in a period that historians once deemed "anti-imperial." More recent scholars have revised this view, mainly by redefining imperialism, and this thesis continues along this line by delineating some of the subtler mechanics of Empire--specifically, those with which MacQueen was involved: labour, banking, communications, and geography.

Item Air sampling with solid phase microextraction (University of Waterloo, 1998) Martos, Perry Anthony
There is an increasing need for simple yet accurate air sampling methods. The acceptance of new air sampling methods requires compatibility with conventional chromatographic equipment, and the new methods have to be environmentally friendly and simple to use, with detection limits, accuracy and precision equal to or better than those of standard methods. Solid phase microextraction (SPME) satisfies these conditions. Analyte detection limits, accuracy and precision of analysis with SPME are typically better than with conventional air sampling methods. Yet air sampling with SPME requires no pumps or solvents; it is re-usable, extremely simple to use, completely compatible with current chromatographic equipment, and requires only a small capital investment. The first SPME fiber coating used in this study was poly(dimethylsiloxane) (PDMS), a hydrophobic liquid film, used to sample a large range of airborne hydrocarbons such as benzene and octane. Quantification without an external calibration procedure is possible with this coating. The physical and chemical properties of this coating are well understood and are quite similar to those of the siloxane stationary phase used in capillary columns. The log of the analyte distribution coefficient for PDMS is linearly related to the chromatographic retention index and to the inverse of temperature.
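A hypothetical sketch of that calibration idea: estimate log K from the retention index via a linear relationship, then convert the mass extracted on the fiber to an air concentration using C_air = n / (K * V_fiber). The slope, intercept, and fiber-coating volume below are invented for illustration; they are not the thesis's fitted values.

```python
# Invented linear-calibration constants (illustration only):
# log10(K) = A + B * RI at a fixed reference temperature.
A, B = -1.5, 0.0042
V_FIBER_ML = 0.000612  # illustrative PDMS coating volume, mL

def distribution_coefficient(retention_index):
    """Estimate the fiber/air distribution coefficient K from a GC retention index."""
    return 10.0 ** (A + B * retention_index)

def air_concentration(mass_ng, retention_index):
    """Airborne concentration (ng/mL) from the mass extracted on the fiber:
    C_air = n / (K * V_fiber)."""
    K = distribution_coefficient(retention_index)
    return mass_ng / (K * V_FIBER_ML)

c = air_concentration(mass_ng=5.0, retention_index=650.0)
```

The temperature dependence (log K linear in 1/T) would enter as a correction to A and B at temperatures other than the reference.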
Therefore, the chromatogram from the analysis of the PDMS air sampler itself yields the calibration parameters which are used to quantify unknown airborne analyte concentrations (ppbv to ppmv range). The second fiber coating used in this study was PDMS/divinylbenzene (PDMS/DVB), onto which o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine (PFBHA) was adsorbed for the on-fiber derivatization of gaseous formaldehyde (ppbv range), with and without external calibration. The oxime formed from the reaction can be detected with conventional gas chromatographic detectors. Typical grab sampling times were as short as 5 seconds. With 300 seconds of sampling, the formaldehyde detection limit was 2.1 ppbv, better than that of any other 5-minute sampling device for formaldehyde. The first-order rate constant for product formation was used to quantify formaldehyde concentrations without a calibration curve. This spot sampler was used to sample the headspace of hair gel, particle board, plant material and coffee grounds for formaldehyde and other carbonyl compounds, with extremely promising results. The SPME sampling devices were also used for time-weighted average sampling (30 minutes to 16 hours). Finally, the four new SPME air sampling methods were field-tested with side-by-side comparisons to standard air sampling methods, demonstrating the tremendous utility of SPME as an air sampler.

Item Alcohol effects on visual attention, the impact of information processing (University of Waterloo, 2001) Carscadden, Judith Leslie

Item Alkylation of 1-butene with isobutane using EMT and Y zeolites (University of Waterloo, 2000) Walker, Gail Robertson

Item Analogical reasoning in academic and social problem solving (University of Waterloo, 1997) Lee, Linda D. H.