University of Waterloo > Electronic Theses and Dissertations (UW)
|Title: ||Automation of Sleep Staging|
|Authors: ||Maggard, Jessie Yang|
|Approved Date: ||21-Jan-2010 |
|Date Submitted: ||2009 |
|Abstract: ||This thesis primarily addresses the automation of sleep-versus-awake detection, which is sometimes accomplished by differentiating the individual sleep stages prior to clustering. It documents experiments in areas where performance can be improved, including classifier design and feature selection from EEG, EOG, and contextual information.
In terms of classifiers, the multilayer perceptron (MLP) neural network was found to outperform the continuous Hidden Markov Model, achieving an accuracy of 91.91%; further gains require better feature sets and more training data.
Improved EEG features based on time-frequency representations were optimized to differentiate Awake with 93.52% sensitivity and 94.60% specificity, REM with 96.12% sensitivity and 93.63% specificity, Stages II and III with 96.81% sensitivity and 89.28% specificity, and Stages III and IV with 93.60% sensitivity and 90.43% specificity.
Because of the limited data set, applying contextual information was demonstrated by example: a One-Cycle-Duo-Direction model was built and shown to improve the EEG-feature results by up to 10%. This level of performance is comparable to, if not better than, the human-scorer accuracy of 88% to 94%.
This thesis improved several aspects of sleep-staging automation, but resource limitations prevented the full potential of these improvements from being demonstrated. Developing them further will require additional data sets annotated by sleep-staging experts.|
|Program: ||Electrical and Computer Engineering|
|Department: ||Electrical and Computer Engineering|
|Degree: ||Master of Applied Science|
|Appears in Collections:||Faculty of Engineering Theses and Dissertations |
All items in UWSpace are protected by copyright, with all rights reserved.