- Research article
- Open Access
Data enhancement for co-morbidity measurement among patients referred for sleep diagnostic testing: an observational study
© Ronksley et al; licensee BioMed Central Ltd. 2009
Received: 05 March 2009
Accepted: 15 July 2009
Published: 15 July 2009
Observational outcome studies of patients with obstructive sleep apnea (OSA) require adjustment for co-morbidity to produce valid results. The aim of this study was to evaluate whether the combination of administrative data and self-reported data provided a more complete estimate of co-morbidity among patients referred for sleep diagnostic testing.
A retrospective observational study of 2149 patients referred for sleep diagnostic testing in Calgary, Canada. Self-reported co-morbidity was obtained with a questionnaire; administrative data and validated algorithms (when available) were also used to define the presence of these co-morbid conditions within a two-year period prior to sleep testing.
Patient self-report of co-morbid conditions had varying levels of agreement with those derived from administrative data, ranging from substantial agreement for diabetes (κ = 0.79) to poor agreement for cardiac arrhythmia (κ = 0.14). The enhanced measure of co-morbidity using either self-report or administrative data had face validity, and provided clinically meaningful trends in the prevalence of co-morbidity among this population.
An enhanced measure of co-morbidity using self-report and administrative data can provide a more complete measure of co-morbidity among patients with OSA when agreement between the two sources is poor. This methodology will aid in adjusting for these coexisting conditions in observational studies in this area.
Obstructive Sleep Apnea (OSA) is a disorder characterized by periods of cessation of breathing during sleep with intermittent hypoxemia and sleep fragmentation. Population-based studies estimate the prevalence of OSA to be approximately 3 to 7% for adult males and 2 to 5% for adult females in the general population [1–4]. Furthermore, patients with OSA commonly have other medical conditions, including hypertension, stroke, cardiovascular disease, and cardiac arrhythmia [5–14].
Given the increased morbidity associated with OSA, observational studies of patients with OSA must also adjust for these co-morbid conditions to determine the independent effect of OSA on the outcomes of interest. The use of self-reported data from questionnaires or interviews is a common method of determining the presence of co-morbid conditions because of its efficiency and relatively low cost. However, the reliability and accuracy of these data are questionable [15–21]. In addition, the validity of self-reported conditions, using medical records as the gold standard, varies depending on the medical conditions in question and the target population under investigation [15–21].
Administrative data are another source from which to determine the presence of co-morbid conditions. While agreement between self-reported medical conditions and those obtained from administrative databases also varies [22–27], combining self-reported clinical data with administrative data has been proposed as a method to increase the completeness and accuracy of co-morbidity measurement [28–30]. This enhanced measure of co-morbidity has been shown to provide a valid assessment for patients with coronary heart disease and those undergoing coronary artery bypass graft surgery [31–33]. Previous studies have assessed co-morbidity in OSA patients in the years prior to diagnosis [34, 35]. However, many of these studies have relied on administrative records alone to determine co-morbidity, which may underestimate co-morbidity within these populations. Given the importance of co-morbidity in observational studies of OSA patients, and the limited literature on combining data sources to measure co-morbidity, the purpose of this study was to evaluate whether the combination of administrative data and self-reported data provided a more complete estimate of co-morbidity among patients referred for sleep diagnostic testing.
This project is part of a larger retrospective study investigating health care utilization among patients with OSA. We included all adult patients (> 18 years old) referred for sleep diagnostic testing at either a hospital location in Calgary, Alberta, or private home care facilities within the Calgary Health Region between July 2005 and August 2007. Virtually all sleep diagnostic testing for the city of Calgary and surrounding areas (population approximately 1.3 million) is conducted in these facilities. All patients who underwent polysomnography (PSG) or ambulatory monitoring for the presence of OSA were invited to participate in the study. We excluded non-Alberta residents, patients previously diagnosed with OSA, and those who were referred but did not undergo diagnostic testing.
Obstructive Sleep Apnea
We used PSG and ambulatory monitoring to identify OSA in participants. Although PSG is considered the 'gold standard' diagnostic test for OSA, ambulatory monitoring devices have shown excellent agreement, sensitivity, and specificity relative to PSG. In addition, the use of ambulatory monitoring has been validated as a clinical management tool [37, 38].
We stratified patients by OSA severity, based on their sleep test results, using the respiratory disturbance index (RDI). The RDI was defined as the number of apneas and hypopneas per hour of sleep. Apnea was defined as a cessation of airflow for at least 10 seconds. Hypopnea was defined as an abnormal respiratory event lasting 10 seconds or more, with at least a 30% reduction in thoracoabdominal movement or airflow compared to baseline, and associated with at least a 4% oxygen desaturation. OSA severity categories were: no OSA (RDI < 5 events/hr), mild OSA (RDI 5–14.9 events/hr), moderate OSA (RDI 15–29.9 events/hr), and severe OSA (RDI ≥ 30 events/hr). This classification system is well accepted in both clinical practice and the medical literature [39, 40]. The date of the sleep study was used to define the index date.
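The severity classification above can be expressed as a simple threshold function. This is a sketch only; the cut-points are taken directly from the text, and the function name is our own:

```python
def classify_osa_severity(rdi: float) -> str:
    """Map a respiratory disturbance index (events/hr) to the
    OSA severity categories used in this study."""
    if rdi < 5:
        return "no OSA"
    elif rdi < 15:
        return "mild"       # RDI 5-14.9
    elif rdi < 30:
        return "moderate"   # RDI 15-29.9
    else:
        return "severe"     # RDI >= 30
```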
Determination of Co-morbidities and Clinical Characteristics from Surveys
Baseline clinical and demographic information was collected for all participants prior to sleep diagnostic testing. This included: age, sex, height, weight, body mass index (BMI), neck circumference, and postal code. Each participant also completed the Epworth Sleepiness Scale (ESS), a self-administered questionnaire that provides a measure of daytime sleepiness. Co-morbidity was determined through the use of a questionnaire administered by trained personnel within the clinics, and patients were asked to self-report the presence of nine specific co-morbidities: hypertension, asthma, depression, cardiac arrhythmia, myocardial infarction, chronic obstructive pulmonary disease (COPD), diabetes, heart failure, and stroke. Patients were also required to provide a list of their current medications at the time of the survey. This study was approved by the Ethics Review Board of the University of Calgary, and patients gave written informed consent to participate.
Determination of Co-morbidities from Administrative Data Sources
Using the patient's unique Provincial Health Number (PHN), the cohort was linked to two Alberta Health and Wellness administrative databases, the hospitalization discharge database, and the physician claims database. For each patient, all hospitalization and physician claims information was obtained for a two-year period prior to sleep diagnostic testing.
The hospital inpatient data source contains details regarding hospitalizations, including admission date, discharge date, length of stay, 25 diagnostic codes (ICD-10), and 10 procedure codes for each admission. The physician claims registry contains information on physician services, including dates and locations of visits, diagnostic codes (ICD-9-CM), and provider specialty. The claims data cover the majority of residents in the province, excluding a small proportion of special population groups (i.e., members of the Armed Forces, the Royal Canadian Mounted Police (RCMP), and federal inmates), who account for approximately 1% of the total population.
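The linkage step described above amounts to restricting each patient's records to a fixed lookback window before the sleep-test index date. A minimal sketch, with hypothetical record keys ('phn', 'service_date') standing in for the real database schemas:

```python
from datetime import date, timedelta

def claims_in_lookback(claims, phn, index_date, years=2):
    """Return the claims for one patient (keyed by Provincial
    Health Number) that fall within the lookback window before
    the index date. `claims` is a list of dicts with hypothetical
    keys 'phn' and 'service_date'; the actual Alberta Health and
    Wellness databases have their own layouts."""
    start = index_date - timedelta(days=365 * years)
    return [c for c in claims
            if c["phn"] == phn and start <= c["service_date"] < index_date]
```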
ICD-9-CM and ICD-10 Codes to Define Co-morbidity Among Patients Referred for Sleep Diagnostic Testing

| Condition | Algorithm source | Case definition | ICD-10 diagnostic codes | ICD-9-CM diagnostic codes |
| --- | --- | --- | --- | --- |
| Hypertension (with and without complication) | Tu et al. | 2 physician claims in 3 years | I10.x, I11.x–I13.x, I15.x | |
| Diabetes (with and without complication) | Hux et al. | 1 hospitalization or 2 physician claims in 2 years | E10.0–E10.9, E11.0–E11.9, E12.0–E12.9, E13.0–E13.9, E14.0–E14.9 | |
| Asthma | Huzel et al. | 1 or more physician claims in 2 years | J45.0, J45.1, J45.8, J45.9 | 490.0, 491.0, 492.0, 493.0 |
| Myocardial infarction | Austin et al. | Primary discharge diagnosis of AMI in hospitalization database | I21.x, I22.x, I25.2 | |
| Congestive heart failure | Lee et al. | Primary discharge diagnosis of CHF in hospitalization database | I09.9, I11.0, I13.0, I13.2, I25.5, I42.0, I42.5–I42.9, I43.x, I50.x, P29.0 | |
| Cerebrovascular accident/transient ischemic attack | Kokotailo and Hill | Primary discharge diagnosis of stroke in hospitalization database | H34.1, I63.x, I64.x, I61.x, I60.x, G45.x | 362.3, 430.x, 431.x, 433.x1, 434.x1, 435.x, 436 |
| COPD | No validated algorithm | 1 physician claim or hospitalization in 2 years | | 491.21, 493.2, 496 |
| Depression | No validated algorithm | 1 physician claim or hospitalization in 2 years | F20.4, F31.3–F31.5, F32.x, F33.x, F34.1, F41.2, F43.2 | 296.2, 296.3, 296.5, 300.4, 309.x, 311 |
| Cardiac arrhythmia | No validated algorithm | 1 physician claim or hospitalization in 2 years | I44.1–I44.3, I45.6, I45.9, I47.x–I49.x, R00.0, R00.1, R00.8, T82.1, Z45.0, Z95.0 | 426.0, 426.1, 426.7, 426.9, 426.10, 426.12, 427.0–427.4, 427.6–427.9, 785.0, 996.01, 996.04, V45.0, V53.3 |
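As an illustration, one of the tabulated case definitions (the Hux et al. diabetes algorithm: one hospitalization or two physician claims within the lookback window) might be applied roughly as follows. This is a simplified sketch, not the study's implementation: the inputs are plain ICD code strings already restricted to the two-year window, and the ICD-9-CM prefix 250 is a commonly used diabetes code range that is not itself listed in the extracted table.

```python
def meets_diabetes_algorithm(hosp_codes, claim_codes):
    """Hux et al. style definition: 1 hospitalization (ICD-10)
    or 2 physician claims (ICD-9-CM) carrying a diabetes code
    within the two-year lookback. Record layouts and code
    prefixes here are simplifying assumptions."""
    icd10_dm = tuple(f"E1{i}" for i in range(5))  # E10-E14
    icd9_dm = ("250",)                            # assumed ICD-9-CM prefix
    hosp_hits = [c for c in hosp_codes if c.startswith(icd10_dm)]
    claim_hits = [c for c in claim_codes if c.startswith(icd9_dm)]
    return len(hosp_hits) >= 1 or len(claim_hits) >= 2
```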
Patient characteristics were described using means and standard deviations for normally distributed variables; for highly skewed or non-normal distributions, medians and inter-quartile ranges (IQR) were reported. Means and proportions were compared using analysis of variance and chi-square tests, respectively. In addition, the proportions of patients presenting with specific co-morbidities identified in the questionnaire were calculated.
To assess the agreement between self-reported co-morbidity and administrative databases, we calculated the proportion of subjects with each co-morbid condition based on: self-report only, administrative data sources only, both self-report and administrative data, and either self-report or administrative data. To evaluate consistency between self-report and administrative data, the Kappa (κ) statistic and 95% confidence intervals were calculated. The Kappa statistic is an index of the degree of agreement between two raters and can be thought of as the chance-corrected proportional agreement; a value of +1 indicates perfect agreement, while a value of 0 indicates no agreement beyond that expected by chance. Kappa values were interpreted as: < 0.40, poor to fair agreement; 0.40–0.60, moderate agreement; 0.61–0.80, substantial agreement; and 0.81–1.00, almost perfect agreement.
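As a concrete illustration, Cohen's kappa for one condition can be computed directly from the 2×2 agreement counts. This is a minimal sketch; the variance approximation shown is a common large-sample formula and not necessarily the one used by the authors' Stata routine:

```python
import math

def kappa_2x2(a, b, c, d):
    """Cohen's kappa for agreement between self-report and
    administrative data on one condition. a = positive in both
    sources, b = self-report only, c = administrative only,
    d = negative in both. Returns (kappa, lo, hi), where the
    bounds are an approximate 95% confidence interval."""
    n = a + b + c + d
    po = (a + d) / n                                     # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    se = math.sqrt(po * (1 - po) / n) / (1 - pe)         # large-sample SE
    return kappa, kappa - 1.96 * se, kappa + 1.96 * se
```

The "both" and "either" proportions described above follow from the same counts: a/n and (a + b + c)/n, respectively.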
In addition, McNemar's test of paired proportions was performed. This procedure compares two dependent or correlated proportions; it is a test of marginal homogeneity based on the discordant pairs, and a statistically significant result indicates a difference between the two proportions. Finally, to assess the validity of the enhanced measures of co-morbidity, an analysis was performed in which patients were stratified by severity of OSA to determine trends in the prevalence of the co-morbid conditions. All statistical analyses were conducted using Stata 10.0 (StataCorp, College Station, Texas).
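The McNemar statistic is computed from the same 2×2 discordant counts. A minimal sketch, assuming the uncorrected version of the statistic (the paper does not state whether a continuity correction was applied):

```python
def mcnemar_statistic(b, c):
    """McNemar chi-square statistic on the discordant pairs
    (b = self-report positive / administrative negative,
    c = the reverse). Shown without continuity correction;
    compare against the 3.84 critical value for p < 0.05 on
    one degree of freedom."""
    return (b - c) ** 2 / (b + c)
```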
[Table: Baseline characteristics of the full cohort (n = 2149) and by OSA severity group (n = 432, 738, 443, 536): male, n (%); age, yrs, mean (SD); BMI, kg/m2, median (IQR); Epworth Sleepiness Score, mean (SD); current smoker, n (%).]
Comparison of Co-Morbidity Determined by Self-Report and Administrative Data Algorithms
Agreement between Self-reported Co-Morbidity and Administrative Measure of Co-Morbidity
[Table: for each condition, the proportion of patients with the condition present in both data sources and in either data source; conditions without a defined administrative algorithm are flagged with an asterisk.]
When "both" self-reported and administrative measures of co-morbidity were required to define each condition, proportions for all nine conditions were much lower when compared to a definition that required "either" self-report or administrative measure. For example, the proportion of patients with hypertension was 25.1% when "both" were used and 43.2% when "either" was used.
Co-Morbidity Measurement by OSA Severity
Self-reported Co-Morbidity and Administrative Measure of Co-Morbidity Stratified by OSA Severity
[Table: prevalence of each co-morbid condition by data source and OSA severity group (n = 432, 738, 443, 536 for each source).]
Enhanced Measure of Co-Morbidity using Either Self-Report or Administrative Databases Stratified by OSA Severity
[Table: prevalence of each co-morbid condition defined as present in either self-report or the administrative databases (enhanced measure), by OSA severity group (n = 432, 738, 443, 536).]
In this large cohort of patients referred for sleep testing we determined that patient self-report of nine co-morbid conditions had varying levels of agreement with that derived from administrative data. Specifically, agreement was highest for diabetes and hypertension, and lowest for cardiac arrhythmia and stroke. An enhanced measure of co-morbidity using either self-report or administrative data demonstrated face validity and clinically meaningful trends of increasing prevalence by OSA severity. These results suggest that when agreement between data sources is poor, a combination of sources should be used when defining co-morbidity in OSA patients, as use of either source alone may result in an underestimate of the prevalence of these conditions. Specifically, using "either" self-report or administrative measure will increase the sensitivity of the estimate of co-morbidity.
We found that among patients referred for sleep testing, self-report of diabetes and hypertension had the highest agreement with the administrative data derived definitions for these conditions. These findings are similar to those reported for administrative data and survey data from an adult sample of the Canadian Community Health Survey (CCHS) in Manitoba, Canada, where agreement between the two sources was highest for diabetes (κ > 0.70) and hypertension (κ > 0.50), and lowest for non-specific heart disease (κ = 0.38). Cricelli et al. also found good agreement between self-reported diabetes and hypertension and administrative data sources. The consistency of self-reported and administrative data for these two conditions likely reflects the fact that both have clear, objective diagnostic criteria and require ongoing medical treatment. More generally, agreement between self-reported measures of chronic disease and administrative data depends on the specific condition.
We found very poor agreement between self-report and administrative data for the presence of cardiac arrhythmia and stroke. Underreporting of cardiac arrhythmia likely occurred because respondents were unaware of the diagnosis or unfamiliar with this medical term as used on the self-report questionnaire. Though cardiac arrhythmia is common in patients with OSA, with prevalence values ranging from 35–48% [13, 14], accurate self-reporting is more likely for conditions that require frequent contact with a health professional; cardiac arrhythmia is not one of these conditions. The prevalence obtained with the enhanced definition of cardiac arrhythmia in our study (32.2%) is similar to the known prevalence in this population, and is therefore likely an accurate reflection of this co-morbidity within the cohort. The poor agreement between the two sources for stroke was also an interesting finding. We speculate that the discrepancies between administrative data and self-report for identifying stroke are due to the lower sensitivity of the administrative algorithm (67%), which underestimates the true prevalence from this source. Again, the combination of the two sources likely provides a more accurate representation of stroke prevalence in this clinical population.
The measure of co-morbidity using the enhanced combination of data sources found that as OSA severity increased, the prevalence of hypertension, diabetes, and myocardial infarction also increased. This dose-response relationship for these specific conditions by OSA severity has been documented in previous studies [5, 10, 53–55] and provides support for the face validity of our enhanced measures of co-morbidity.
The results of our study should be interpreted in the context of its limitations. First, for three of the conditions of interest (depression, cardiac arrhythmia, and COPD), validated administrative algorithms were unavailable. Using an algorithm of at least one physician claim or hospitalization in a two-year period may have resulted in some misclassification and over-reporting of these conditions. Second, we did not have a gold standard to determine whether the enhanced measures are more valid than a single data source alone. However, the increasing prevalence of conditions by OSA severity, consistent with the literature, does provide evidence of face validity. Finally, our study was limited to a single geographic region (Calgary Health Region) and only included patients referred for sleep diagnostic testing. These patients likely have more severe morbidity, limiting the generalizability of these results to other clinic-based sleep cohorts in North America.
We found that administrative data in combination with survey data has the potential to create a more complete measure of co-morbidity among patients referred for sleep diagnostic testing, particularly when agreement between the two sources is poor. Given the resources required to obtain clinical data, data enhancement with administrative data may be valuable to other researchers. Although future studies are required to validate co-morbidities based on data enhancement, these results suggest that this methodology can aid in adjusting for these coexisting conditions in observational studies in this area.
This study was supported by operating grants from the Alberta Heritage Foundation for Medical Research and the Calgary Health Region. Brenda Hemmelgarn and Hude Quan are supported by New Investigator Awards from the Canadian Institutes of Health Research and by Population Health Investigator Awards from the Alberta Heritage Foundation for Medical Research. Paul Ronksley is supported by the Achievers in Medical Science Graduate Scholarship.
- Young T, Palta M, Dempsey J, Skatrud J, Weber S, Badr S: The occurrence of sleep-disordered breathing among middle-aged adults. N Engl J Med. 1993, 328: 1230-1235. 10.1056/NEJM199304293281704.
- Young T, Peppard PE, Gottlieb DJ: Epidemiology of obstructive sleep apnea: a population health perspective. Am J Respir Crit Care Med. 2002, 165: 1217-1239. 10.1164/rccm.2109080.
- Bixler EO, Vgontzas AN, Lin HM, Ten Have T, Rein J, Vela-Bueno A, Kales A: Prevalence of sleep-disordered breathing in women: effects of gender. Am J Respir Crit Care Med. 2001, 163: 608-613.
- Udwadia ZF, Doshi AV, Lonkar SG, Singh CI: Prevalence of sleep disordered breathing and sleep apnea in middle-aged urban Indian men. Am J Respir Crit Care Med. 2004, 169: 168-173. 10.1164/rccm.200302-265OC.
- Nieto FJ, Young TB, Lind BK, Shahar E, Samet JM, Redline S, D'Agostino RB, Newman AB, Lebowitz MD, Pickering TG: Association of sleep-disordered breathing, sleep apnea, and hypertension in a large community-based study. JAMA. 2000, 283: 1829-1836. 10.1001/jama.283.14.1829.
- Davies CWH, Crosby JH, Mullins RL, Barbour C, Davies RJO, Stradling JR: Case-control study of 24 hour ambulatory blood pressure in patients with obstructive sleep apnoea and normal matched control subjects. Thorax. 2000, 55: 736-740. 10.1136/thorax.55.9.736.
- Peppard PE, Young T, Palta M, Skatrud J: Prospective study of the association between sleep-disordered breathing and hypertension. N Engl J Med. 2000, 342: 378-384. 10.1056/NEJM200005113421901.
- Yaggi HK, Concato J, Kernan WN, Lichtman JH, Brass LM, Mohsenin V: Obstructive sleep apnea as a risk factor for stroke and death. N Engl J Med. 2005, 353: 2034-2041. 10.1056/NEJMoa043104.
- Arzt M, Young T, Finn L, Skatrud JB, Bradley TD: Association of sleep-disordered breathing and the occurrence of stroke. Am J Respir Crit Care Med. 2005, 172: 1447-1451. 10.1164/rccm.200505-702OC.
- Marin JM, Carrizo SJ, Vicente E, Agusti A: Long-term cardiovascular outcomes in men with obstructive sleep apnoea-hypopnoea with or without treatment with continuous positive airway pressure: an observational study. Lancet. 2005, 365: 1046-1053.
- Newman AB, Nieto FJ, Guidry U, Lind BK, Redline S, Shahar E, Pickering TG, Quan SF, Sleep Heart Health Study Research Group: Relation of sleep-disordered breathing to cardiovascular disease risk factors: The Sleep Heart Health Study. Am J Epidemiol. 2001, 154: 50-59. 10.1093/aje/154.1.50.
- Shahar E, Whitney CW, Redline S, Lee ET, Newman AB, Javier Nieto F, O'Connor GT, Boland LL, Schwartz JE, Samet JM: Sleep-disordered breathing and cardiovascular disease: cross-sectional results of the Sleep Heart Health Study. Am J Respir Crit Care Med. 2001, 163: 19-25.
- Guilleminault C, Connolly SJ, Winkle RA: Cardiac arrhythmia and conduction disturbances during sleep in 400 patients with sleep apnea syndrome. Am J Cardiol. 1983, 52: 490-494. 10.1016/0002-9149(83)90013-9.
- Mehra R, Benjamin EJ, Shahar E, Gottlieb DJ, Nawabit R, Kirchner HL, Sahadevan J, Redline S: Association of nocturnal arrhythmias with sleep-disordered breathing: The Sleep Heart Health Study. Am J Respir Crit Care Med. 2006, 173: 910-916. 10.1164/rccm.200509-1442OC.
- Linet MS, Harlow SD, McLaughlin JK, McCaffrey LD: A comparison of interview data and medical records for previous medical conditions and surgery. J Clin Epidemiol. 1989, 42: 1207-1213. 10.1016/0895-4356(89)90119-4.
- Okura Y, Urban LH, Mahoney DW, Jacobsen SJ, Rodeheffer RJ: Agreement between self-report questionnaires and medical record data was substantial for diabetes, hypertension, myocardial infarction and stroke but not for heart failure. J Clin Epidemiol. 2004, 57: 1096-1103. 10.1016/j.jclinepi.2004.04.005.
- Haapanen N, Miilunpalo S, Pasanen M, Oja P, Vuori I: Agreement between questionnaire data and medical records of chronic diseases in middle-aged and elderly Finnish men and women. Am J Epidemiol. 1997, 145: 762-769.
- Colditz GA, Martin P, Stampfer MJ, Willett WC, Sampson L, Rosner B, Hennekens CH, Speizer FE: Validation of questionnaire information on risk factors and disease outcomes in a prospective cohort study of women. Am J Epidemiol. 1986, 123: 894-900.
- Olsson L, Svardsudd K, Nilsson G, Ringqvist I, Tibblin G: Validity of a postal questionnaire with regard to the prevalence of myocardial infarction in a general population sample. Eur Heart J. 1989, 10: 1011-1016.
- Harlow SD, Linet MS: Agreement between questionnaire data and medical records: the evidence for accuracy of recall. Am J Epidemiol. 1989, 129: 233-248.
- Katz JN, Chang LC, Sangha O, Fossel AH, Bates DW: Can morbidity be measured by questionnaire rather than medical record review?. Med Care. 1996, 34: 73-84. 10.1097/00005650-199601000-00006.
- Susser SR, McCusker J, Belzile E: Comorbidity information in older patients at an emergency visit: self-report vs. administrative data had poor agreement but similar predictive validity. J Clin Epidemiol. 2008, 61: 511-515. 10.1016/j.jclinepi.2007.07.009.
- Van Doorn C, Bogardus S, Williams C, Concato J, Towle V, Inouye S: Risk adjustment for older hospitalized persons: a comparison of two methods of data collection for the Charlson index. J Clin Epidemiol. 2001, 54: 694-701. 10.1016/S0895-4356(00)00367-X.
- Humphries KH, Rankin JM, Carere RG, Buller CE, Kiely FM, Spinelli JJ: Co-morbidity data in outcomes research: are clinical data derived from administrative databases a reliable alternative to chart review?. J Clin Epidemiol. 2000, 53: 343-349. 10.1016/S0895-4356(99)00188-2.
- Cricelli C, Mazzaglia G, Samani F, Marchi M, Sabatini A, Nardi R, Ventriglia G, Caputi AP: Prevalence estimates for chronic diseases in Italy: Exploring the differences between self-report and primary care databases. J Public Health Med. 2003, 25: 254-257. 10.1093/pubmed/fdg060.
- West SL, Richter N, Melfi CA, McNutt M, Nennstiel ME, Mauskopf JA: Assessing the Saskatchewan database for outcome research studies of depression and its treatment. J Clin Epidemiol. 2000, 53: 823-831. 10.1016/S0895-4356(99)00237-1.
- Robinson JR, Young TK, Roos LL, Gelskey DE: Estimating the burden of disease. Comparing administrative data and self-reports. Med Care. 1997, 35: 932-947. 10.1097/00005650-199709000-00006.
- Norris CM, Ghali WA, Knudtson ML, Naylor CD, Saunders LD: Dealing with missing data in observational health care outcomes analyses. J Clin Epidemiol. 2000, 54: 377-383. 10.1016/S0895-4356(99)00181-X.
- Faris PD, Ghali WA, Brant R, Norris CM, Galbraith PD, Knudtson ML, APPROACH Investigators (Alberta Provincial Program for Outcome Assessment in Coronary Heart Disease): Multiple imputation versus data enhancement for dealing with missing data in observational health outcome analyses. J Clin Epidemiol. 2002, 55: 184-191. 10.1016/S0895-4356(01)00433-4.
- Lix L, Yogendran M, Burchill C, Metge C, McKeen N, Moore D, Bond R: Defining and Validating Chronic Diseases: An Administrative Data Approach. 2006, Manitoba Centre for Health Policy. Winnipeg
- Hannan EL, Kilburn H, Lindsey ML, Lewis R: Clinical versus administrative databases for CABG surgery. Does it matter?. Med Care. 1992, 30: 892-907. 10.1097/00005650-199210000-00002.
- Parker JP, Li Z, Damberg CL, Danielsen B, Carlisle DM: Administrative versus clinical data for coronary artery bypass graft surgery report cards: The view from California. Med Care. 2006, 44: 687-695. 10.1097/01.mlr.0000215815.70506.b6.
- Roos LL, Stranc L, James RC, Li J: Complications, comorbidities, and mortality: Improving classification and prediction. Health Serv Res. 1997, 32: 229-242.
- Smith R, Ronald J, Delaive K, Walld R, Manfreda J, Kryger MH: What are obstructive sleep apnea patients being treated for prior to this diagnosis?. Chest. 2002, 121: 164-172. 10.1378/chest.121.1.164.
- Greenberg-Dotan S, Reuveni H, Simon-Tuval T, Oksenberg A, Tarasiuk A: Gender differences in morbidity and health care utilization among adult obstructive sleep apnea patients. Sleep. 2007, 30: 1173-1180.
- Vazquez J, Tsai W, Flemons W, Masuda A, Brant R, Hajduk E, Whitelaw WA, Remmers JE: Automated analysis of digital oximetry in the diagnosis of obstructive sleep apnea. Thorax. 2000, 55: 302-307. 10.1136/thorax.55.4.302.
- Whitelaw WA, Brant RF, Flemons WW: Clinical usefulness of home oximetry compared to polysomnography for assessment of sleep apnea. Am J Resp Crit Care Med. 2005, 15: 188-193.
- Mulgrew AT, Fox N, Ayas NT, Ryan CF: Diagnosis and initial management of obstructive sleep apnea without polysomnography. Ann Intern Med. 2007, 146: 157-166.
- American Academy of Sleep Medicine Task Force: Sleep-related breathing disorders in adults: recommendations for syndrome definition and measurement techniques in clinical research. Sleep. 1999, 22: 667-689.
- American Academy of Sleep Medicine: International classification of sleep disorders, Diagnostic and coding manual. 2001, American Academy of Sleep Medicine. Chicago
- Johns MW: A new method for measuring daytime sleepiness: the Epworth sleepiness scale. Sleep. 1991, 14: 540-545.
- Alberta Health and Wellness: Health Trends in Alberta: A Working Document. 2007, Alberta Health and Wellness. Edmonton
- Tu K, Campbell NRC, Chen ZL, Cauch-Dudek KJ, McAlister FA: Accuracy of administrative databases in identifying patients with hypertension. Open Med. 2007, 1: e3-5.
- Hux JE, Ivis F, Flintoft V, Bica A: Diabetes in Ontario: Determination of prevalence and incidence using a validated administrative data algorithm. Diab Care. 2002, 25: 512-516. 10.2337/diacare.25.3.512.
- Huzel L, Roos LL, Anthonisen NR, Manfreda J: Diagnosing asthma: the fit between survey and administrative database. Can Respir J. 2002, 9: 407-412.
- Austin PC, Daly PA, Tu JV: A multicenter study of the coding accuracy of hospital discharge administrative data for patients admitted to cardiac care units in Ontario. Am Heart J. 2002, 144: 290-296. 10.1067/mhj.2002.123839.
- Lee DS, Donovan L, Austin PC, Gong Y, Liu PP, Rouleau JL, Tu JV: Comparison of coding of heart failure and comorbidities in administrative and clinical data for use in outcomes research. Med Care. 2005, 43: 182-188. 10.1097/00005650-200502000-00012.
- Kokotailo RA, Hill MD: Coding of stroke and stroke risk factors using international classification of diseases, revisions 9 and 10. Stroke. 2005, 36: 1776-1781. 10.1161/01.STR.0000174293.17959.a1.
- Quan H, Sundararajan V, Halfon P, Fong A, Burnand B, Luthi J, Saunders LD, Beck C, Feasby TE, Ghali WA: Coding algorithms for defining comorbidities in ICD-9-CM and ICD-10 administrative data. Med Care. 2005, 43: 1130-1139. 10.1097/01.mlr.0000182534.19832.83.
- The International Classification of Disease, 9th revision, Clinical Modification. 1996, World Health Organization. Geneva, Switzerland, 4
- The International statistical classification of diseases and related health problems, tenth revision (ICD-10). 1992, World Health Organization. Geneva, Switzerland
- Landis JR, Koch GG: The measurement of observer agreement for categorical data. Biometrics. 1977, 33: 159-174. 10.2307/2529310.
- Kapur V, Blough DK, Sandblom RE, Hert R, de Maine JB, Sullivan SD, Psaty BM: The medical cost of undiagnosed sleep apnea. Sleep. 1999, 22: 749-755.
- Reichmuth KJ, Austin D, Skatrud JB, Young TB: Association of sleep apnea and type II diabetes. Am J Respir Crit Care Med. 2005, 172: 1590-1595. 10.1164/rccm.200504-637OC.
- Punjabi NM, Sorkin JD, Katzel LI, Goldberg AP, Schwartz AR, Smith PL: Sleep disordered breathing and insulin resistance in middle-aged and overweight men. Am J Respir Crit Care Med. 2002, 165: 677-682.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1471-2288/9/50/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.