Aims
To investigate factors that influence the willingness of inactive nurses to return to nursing in a crisis situation and to identify aspects that need to be considered with regard to a possible deployment.
Design
A deductive and inductive qualitative content analysis of semi-structured focus group interviews.
Methods
Semi-structured focus group interviews were conducted in October and November 2021 with inactive or marginally employed nurses, nurses who had been inactive for some time, and nursing home managers. The participating inactive nurses had or had not declared their willingness to be deployed during the COVID-19 pandemic. Data were analysed using qualitative content analysis.
Results
Participants regarded communication as essential for an informed decision for or against a temporary return to nursing and for potential or actual deployments. To feel safe, inactive nurses need to know what to expect and what is expected of them, for example regarding required training and responsibilities. Considering their current employment status, some flexibility in the deployment conditions is needed.
A remaining attachment to care can trigger a sense of duty. Knowledge of (regular) working conditions in nursing can lead to both a desire to support former colleagues and a refusal to be exposed to these conditions again.
Conclusion
Past working experiences and the current employment situation play a major role in the willingness of inactive nurses to return to nursing in a crisis situation. Unbureaucratic arrangements must be provided for those who are willing to return.
Summary Statement
What is already known - In crisis situations, not every inactive nurse is willing or able to return to nursing; the ‘silent reserve’ may therefore not be as large as suspected.
What this paper adds - Inactive nurses need to know what to expect and what is expected of them when deciding whether to return to active patient care during a crisis situation.
Implications for practice/policy - Inactive nurses need to be informed and should be offered free training and refresher courses to ensure patient safety.
Impact
This research shows that inactive nurses are not a silent workforce that can be activated at any time. Those who are able and willing to return to direct patient care in crisis situations need the best possible support, during and between crises.
Reporting Method
This study adhered to COREQ guidelines.
No Patient or Public Contribution
Patients and members of the public were not involved in this study, as the aim was to gain insight into the motivations and attitudes of inactive nurses.
The impact of the COVID-19 pandemic on social-emotional developmental risks (SE-DR) of preschool children is largely unknown. Therefore, the aim of this prospective longitudinal dynamic cohort study was to assess changes in preschoolers’ SE-DR from before the pandemic to after the first COVID-19 wave. SE-DR were assessed annually with the instrument “Dortmund Developmental Screening for Preschools” (DESK). Longitudinal DESK data from 3- to 4-year-old children who participated both in survey wave (SW) three (DESK-SW3, 2019) and SW four (DESK-SW4, 2020) from August 1 to November 30 were used, respectively. Additionally, data from previous pre-pandemic SW were analyzed to contextualize the observed changes (SW1: 2017; SW2: 2018). A total of N = 786 children were included in the analysis. In the pre-pandemic DESK-SW3, the proportion of children with SE-DR was 18.2%, whereas in DESK-SW4 after the first COVID-19 wave, the proportion decreased to 12.4% (p = 0.001). Thus, the prevalence rate ratio (PRR) was 0.68. Compared to data from previous SW (SW1-SW2: PRR = 0.88; SW2-SW3: PRR = 0.82), this result represents a notable improvement. However, only short-term effects were described, and the study region had one of the highest preschool return rates in Germany. Further studies are needed to examine long-term effects of the pandemic on preschoolers’ SE-DR.
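The prevalence rate ratio reported above is simply the ratio of the two prevalence proportions. A minimal sketch, using only the proportions stated in the abstract:

```python
# Prevalence rate ratio (PRR) between two survey waves,
# using the proportions of children with SE-DR reported in the abstract.

def prevalence_rate_ratio(p_after: float, p_before: float) -> float:
    """Ratio of the prevalence after vs. before the event of interest."""
    return p_after / p_before

# DESK-SW4 (12.4% after the first COVID-19 wave) vs. DESK-SW3 (18.2% pre-pandemic)
prr = prevalence_rate_ratio(0.124, 0.182)
print(round(prr, 2))  # 0.68, matching the reported PRR
```

A PRR below 1 indicates that the prevalence decreased between the two waves.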
This dynamic cohort was established to evaluate the targeted individual promotion of children affected by developmental risks as part of the German federal state law for child day-care and preschools in Mecklenburg-Western Pomerania. The project has been conducted in preschools in regions with a low socio-economic profile since 2011. Since 2017, the revised standardized Dortmund Developmental Screening for Preschools (DESK 3–6 R) has been applied. Developmental risks of 3- to 6-year-old children in the domains of motor, linguistic, cognitive and social competencies are monitored, and the cohort is followed up annually. In 2020, n = 7,678 children from n = 152 preschools participated; at baseline (2017), n = 8,439 children participated. Because of the defined age range of this screening, 3,000 to 4,000 5- to 6-year-old children leave the cohort annually, while an approximately equal number of 3-year-old children enters the cohort per survey wave. N = 702 children participated in all 4 survey waves. On the basis of DESK 3–6 R scores available from the survey waves 2017 to 2019, expected values for the 2020 survey wave can be computed and compared with the measured values to evaluate the effects of the COVID-19 pandemic (i.e. parental home care due to COVID-19-related restrictions).
Background
Chronic kidney disease (CKD) is a common condition, particularly at older ages. To prevent progression of the disease and its complications, guideline-based outpatient care of patients with CKD should be pursued. Quality indicators (QIs) can be used to measure and assess the quality of care. To date, no QIs for CKD exist in Germany. The aim of this work was to develop QIs for assessing the quality of outpatient care of patients over 70 years of age with non-dialysis-dependent CKD.
Material and Methods
Based on the national S3 guideline on CKD and a review of international QIs, a list of QIs was compiled. The selected QIs were divided into two sets: one based on routine data (e.g. health insurance claims data) and one based on data collected in practices (chart review). Experts from various disciplines and a patient representative rated the QIs in a Delphi process comprising a two-stage online survey in October 2021 and January 2022 and a concluding consensus conference in March 2022. In addition, rankings of the most important QIs were compiled for each set.
Results
One incidence indicator and one prevalence indicator were defined a priori and were not put to a vote. A further 21 QIs were voted on by the experts. For each QI set, the seven most important indicators were selected. Only one QI was rated by the expert panel as unsuitable for additional use in adults under 70 years of age.
Discussion
The QIs are intended to make it possible to examine the quality of outpatient care of patients with CKD, with the goal of optimizing guideline-conforming outpatient care.
Background
Adolescents and young adults (AYAs) with chronic conditions face a transfer, defined as the actual shift from paediatric to adult-oriented health care. Transition competence, defined as the self-perceived knowledge, skills and abilities regarding the transition process, is considered highly useful.
Aim
This study was designed to investigate the impact of transition competence before and after the transfer on disease-specific quality of life (QoL) and health care satisfaction of AYAs with diabetes.
Results
In total, a sample of N = 90 AYAs with diabetes self-reported their transition competence, diabetes-specific QoL and satisfaction with care. Multiple linear regressions were used to analyse the impact of transition competence on satisfaction with care and QoL. Transition competence positively influenced the outcomes of satisfaction with care and QoL.
Conclusion
Young adults with diabetes showed higher transition competence scores than adolescents with diabetes.
In rural areas, healthcare providers, patients and relatives have to cover long distances. For specialised ambulatory palliative care (SAPV), a supply radius of at most 30 km is recommended. The aim of this study was to analyse whether there are regional disparities in the supply of SAPV and whether supply is associated with the distance between the SAPV team’s site and the patient’s location. Anonymised data of the Association of Statutory Health Insurance Physicians of the Federal State of Mecklenburg-Western Pomerania (M-V) were retrospectively analysed for the period 2014–2017. Identification as a palliative patient was based on palliative-specific items from the ambulatory reimbursement catalogue. In total, 6940 SAPV patients were identified, of whom 48.9% were female; the mean age was 73.3 years. For 28.3% of the identified SAPV patients (n = 1961), the SAPV teams had a travel distance of >30 km. With increasing distance, the average number of treatment days per patient increased. There are regional disparities in the provision of SAPV services in M-V, and local structures have an important impact on regional supply patterns. The distance between the SAPV team’s site and the patient’s location is not the only determining factor; other causes must be considered.
Introduction: The aim of this study was to test whether brief alcohol interventions at general hospitals work equally well for males and females and across age-groups.
Methods: The current study includes a reanalysis of data reported in the PECO study (testing delivery channels of individualized motivationally tailored alcohol interventions among general hospital patients: in PErson vs. COmputer-based) and is therefore exploratory in nature. At-risk drinking general hospital patients aged 18–64 years (N = 961) were randomized to in-person counseling, computer-generated individualized feedback letters, or assessment only. Both interventions were delivered on the ward and 1 and 3 months later. Follow-ups were conducted at months 6, 12, 18, and 24. The outcome was grams of alcohol/day. Study group × sex and study group × age interactions were tested as predictors of change in grams of alcohol/day over 24 months in latent growth models. If rescaled likelihood ratio tests indicated improved model fit due to the inclusion of interactions, moderator level-specific net changes were calculated.
Results: Model fit was not significantly improved due to the inclusion of interaction terms between study group and sex (χ2[6] = 5.9, p = 0.439) or age (χ2[6] = 5.5, p = 0.485).
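As a plausibility check, p-values for χ² statistics with 6 degrees of freedom can be recomputed from the χ² survival function, which has a closed form for even degrees of freedom. The sketch below yields values close to, though not identical with, the reported p-values, since the study used rescaled likelihood ratio tests:

```python
import math

def chi2_sf_even_df(x: float, df: int) -> float:
    """Survival function P(X > x) of the chi-squared distribution for even df:
    P(X > x) = exp(-x/2) * sum_{k=0}^{df/2 - 1} (x/2)^k / k!."""
    assert df % 2 == 0 and df > 0
    half = x / 2.0
    return math.exp(-half) * sum(half ** k / math.factorial(k) for k in range(df // 2))

# Interaction terms from the abstract (6 degrees of freedom each)
p_sex = chi2_sf_even_df(5.9, 6)  # study group x sex, reported p = 0.439
p_age = chi2_sf_even_df(5.5, 6)  # study group x age, reported p = 0.485
```

Both p-values lie far above 0.05, consistent with the conclusion that the interaction terms do not improve model fit.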
Discussion: Both in-person counseling and computer-generated feedback letters may work equally well among males and females as well as among different age-groups. Therefore, widespread delivery of brief alcohol interventions at general hospitals may be unlikely to widen sex and age inequalities in alcohol-related harm.
Background
Multimedia multi-device measurement platforms may make the assessment of prevention-related medical variables with a focus on cardiovascular outcomes more attractive and time-efficient. The aim of the studies was to evaluate the reliability (Study 1) and the measurement agreement with a cohort study (Study 2) of selected measures of such a device, the Preventiometer.
Methods
In Study 1 (N = 75), we conducted repeated measurements in two Preventiometers for four examinations (blood pressure measurement, pulse oximetry, body fat measurement, and spirometry) to analyze their agreement and derive (retest-)reliability estimates. In Study 2 (N = 150), we compared somatometry, blood pressure, pulse oximetry, body fat, and spirometry measurements in the Preventiometer with corresponding measurements used in the population-based Study of Health in Pomerania (SHIP) to evaluate measurement agreement.
Results
Intraclass correlation coefficients (ICCs) ranged from .84 to .99 across all examinations in Study 1. Whereas bias was not an issue for most examinations in Study 2, the limits of agreement for most examinations were very large compared with the results of similar method-comparison studies.
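The limits of agreement reported in such method-comparison studies are conventionally computed Bland-Altman style, as the mean difference between paired measurements ± 1.96 standard deviations of the differences. A sketch on hypothetical paired readings (the values below are illustrative, not study data):

```python
import statistics

def limits_of_agreement(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired systolic blood pressure readings (device A vs. device B)
device_a = [120, 131, 118, 140, 125, 133, 122, 128]
device_b = [122, 129, 121, 138, 124, 136, 121, 130]
bias, lower, upper = limits_of_agreement(device_a, device_b)
```

Narrow limits of agreement indicate that the two devices can be used interchangeably; very wide limits, as observed in Study 2, argue against that.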
Conclusion
We observed a high retest-reliability of the assessed clinical examinations in the Preventiometer. Some disagreements between Preventiometer and SHIP examinations can be attributed to procedural differences in the examinations. Methodological and technical improvements are recommended before using the Preventiometer in population-based research.
Background
The national Network Genomic Medicine (nNGM) Lung Cancer provides comprehensive and high-quality multiplex molecular diagnostics and standardized personalized treatment recommendation for patients with advanced non-small cell lung cancer (aNSCLC) in Germany. The primary aim of this study was to investigate the effectiveness of the nNGM precision medicine program in terms of overall survival (OS) using real-world data (RWD).
Methods
A historical nationwide cohort analysis of patients with aNSCLC and initial diagnosis between 04/2019 and 06/2020 was conducted to compare treatment and OS of patients with and without nNGM-participation. Patients participating within the nNGM (nNGM group) were selected based on a prospective nNGM database. The electronic health records (EHR) of the prospective nNGM database were case-specifically linked to claims data (AOK, German health insurance). The control group was selected from claims data of patients receiving usual care without nNGM-participation (non-nNGM group). The minimum follow-up period was six months.
Findings
Overall, n = 509 patients in the nNGM group and n = 7213 patients in the non-nNGM group met the inclusion criteria. Patients participating in the nNGM had a significantly improved OS compared to the non-nNGM group (median OS: 10.5 months vs. 8.7 months, p = 0.008, HR = 0.84, 95% CI: 0.74–0.95). The 1-year survival rates were 46.8% (nNGM) and 41.3% (non-nNGM). The use of approved tyrosine kinase inhibitors (TKI) in the first-line setting was significantly higher in the nNGM group than in the non-nNGM group (nNGM: 8.4% (43/509) vs. non-nNGM: 5.1% (366/7213), p = 0.001). Overall, patients receiving first-line TKI treatment had significantly higher 1-year OS rates than patients treated with PD-1/PD-L1 inhibitors and/or chemotherapy (67.2% vs. 40.2%, p < 0.001).
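Median overall survival values such as those above are read off Kaplan-Meier curves. A minimal estimator sketch on hypothetical survival times (the study itself derived OS from linked registry and claims data, and the hazard ratio comes from a Cox model, not shown here):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up time per patient (e.g. months);
    events: 1 if death observed, 0 if censored.
    Returns parallel lists of event times and survival probabilities.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve_t, curve_s = [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        group = [e for tt, e in data if tt == t]
        deaths = sum(group)
        if deaths:  # survival only drops at observed deaths, not at censorings
            surv *= 1 - deaths / at_risk
            curve_t.append(t)
            curve_s.append(surv)
        at_risk -= len(group)
        i += len(group)
    return curve_t, curve_s

# Hypothetical survival times in months (all deaths observed, no censoring)
times = [2, 3, 3, 5, 5, 8, 10, 12]
events = [1] * len(times)
ts, ss = kaplan_meier(times, events)
# Median OS: first time at which survival drops to 0.5 or below
median_os = next(t for t, s in zip(ts, ss) if s <= 0.5)
```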
Interpretation
This is the first study to demonstrate a significant survival benefit and higher utilization of targeted therapies for aNSCLC patients participating within nNGM. Our data indicate that precision medicine programs can enhance collaborative personalized lung cancer care and promote the implementation of treatment innovations and the latest scientific knowledge into clinical routine care.
Funding
The study was funded by the AOK Federal Association Germany.
Introduction
In response to the COVID-19 pandemic, a general lockdown was enacted across Germany in March 2020. As a consequence, patients with mental health conditions received limited or no treatment in day hospitals and outpatient settings. To ensure continuity of care, the necessary technological preparations were made to enable the implementation of telemedical care via telephone or video conferencing, and this option was then used as much as possible. The aim of this study was to investigate the satisfaction and acceptance with telemedical care in a heterogeneous patient group of psychiatric outpatients in Germany during the first COVID-19 lockdown.
Methods
In this observational study, patients in ongoing or newly initiated outpatient psychiatric therapy as well as those who had to be discharged from the day clinic ahead of schedule received telemedical treatment via telephone. Data collection to assess the patients’ and therapists’ satisfaction with and acceptance of the telemedical care was adjusted to the treatment setting.
Results
Of 60 recruited patients, 57 could be included in the analysis. Overall, 51.6% of the patients and 52.3% of their therapists reported that the discussion of problems and needs worked just as well over the phone as in face-to-face consultations. In the subgroup of patients who were new to therapy because they had been discharged from hospital early, acceptance was higher, and telemedicine was rated as equally good in 87.5% of contacts. Both patients and therapists felt that telemedical care during lockdown was an alternative to usual therapy in the outpatient clinic and that the option of telemedical care should continue for the duration of the coronavirus pandemic.
Discussion
The results show a clear trend towards satisfaction with and acceptance of telemedicine care in a heterogeneous group of unselected psychiatric patients. Although the number of patients is small, the results indicate that the mostly positive results of telemedicine concepts in research projects can probably be transferred to real healthcare settings.
Conclusions
Telemedicine can be employed in healthcare for psychiatric patients either as an alternative treatment option to maintain continuity of care or as a potential addition to regular care.
Aims
To examine whether inactive nurses are willing to return to nursing during the COVID-19 pandemic, the reasons for or against their decision, and other potentially relevant factors.
Design
Cross-sectional online survey.
Methods
We developed a questionnaire, addressing registration, professional experiences, anticipations, and internal and external factors that might affect the decision of inactive nurses to return to nursing during the pandemic. Between 27 April and 15 June 2020, we recruited participants in Germany via social networks, organizations and institutions and asked them to forward the link to wherever other inactive nurses might be reached.
Results
Three hundred and thirty-two participants (73% female) could be included in the analysis. The majority of the participants (n = 262, 79%) were general nurses. The main reason for registering was ‘want to do my bit to manage the crisis’ (n = 73, 22.8%). More than two thirds of the participants (n = 230, 69%) were not or not yet registered. Of the 220 participants who gave reasons for not registering, 112 (49%) selected that they ‘could not see a necessity at that time’. The few inactive nurses who were deployed reported a variety of experiences.
Conclusions
Different factors influence the nurses’ decision to register or not. A critical factor for their decision was previous experiences that had made them leave the job and prevented a return—even for a limited time in a special situation.
Impact
From the responses of the participants in this study, it can be deduced that: negative experiences made while working in nursing influence the willingness to volunteer for a deployment; only one-third of the inactive nurses would be willing to return to the nursing profession to help manage the Corona pandemic; policymakers and nursing leaders should not rely on the availability of inactive nurses in a crisis.
Epidemiological data reveal a need for prevention measures specifically targeted at children with low SES. In the German federal state of Mecklenburg-Western Pomerania, preschools in socially deprived regions can apply for additional funds to support children with developmental risks. Mandatory criteria for obtaining these funds include an annual assessment of all children using the “Dortmund Developmental Screening for Preschools (DESK 3–6 R)”. This instrument can detect and monitor developmental risks in the domains of fine motor skills, gross motor skills, language, cognition, and social development. In this study, we examine the domain “Attention and concentration”, which is included for the 5- to 6-year-old age group, using data from two consecutive survey waves (sw). Research questions: (1) Does the prevalence rate ratio (PRR) indicate an improvement over time? (2) Is the rate of improvements (developmental risk at sw1, no developmental risk at sw2) higher than the rate of deteriorations (no developmental risk at sw1, developmental risk at sw2)? A prospective cohort analysis was conducted (n = 940). The prevalence rate of a developmental risk in this DESK domain decreased over time (PRR = 0.78; p = 0.019). The rate of improvements was 8.47 times higher than the rate of deteriorations. The results provide evidence of the effectiveness of targeted intervention measures in preschools focusing on skills that improve attention and concentration. This is notable considering the short time interval and the categorization method of DESK scores. Nevertheless, over the same time period, the DESK results of some children deteriorated. Preschools therefore also have to be aware that it is natural for some children to show modest declines in their skills over time. German Clinical Trials Register, ID: DRKS00015134, registered on 29 October 2018, retrospectively registered.
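The comparison of improvement and deterioration rates can be reproduced from a 2×2 transition table between the two survey waves. A sketch with hypothetical counts (the actual cell counts behind the reported ratio of 8.47 are not given in the abstract):

```python
# Hypothetical transition counts between survey waves sw1 and sw2;
# these are illustrative numbers, not the study's published data.
risk_sw1 = 200       # children with a developmental risk at sw1
improved = 120       # of those, no longer at risk at sw2
no_risk_sw1 = 740    # children without a developmental risk at sw1
deteriorated = 52    # of those, newly at risk at sw2

rate_improved = improved / risk_sw1             # share of at-risk children who improved
rate_deteriorated = deteriorated / no_risk_sw1  # share of risk-free children who deteriorated
ratio = rate_improved / rate_deteriorated       # analogous to the reported 8.47
```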
Background
Pregnancy and the postpartum period are times when women are at increased risk for depression and mental problems. This may also negatively affect the foetus. Thus, there is a need for interventions with low-threshold access and care. Telemedicine interventions are a promising approach to address these issues. This systematic literature review examined the efficacy of telemedicine interventions for pregnant women and/or new mothers to address mental health-related outcomes. The primary objective was to analyse whether telemedicine interventions can reduce mental health problems in pregnant women and new mothers. The secondary aim was to clarify the impact of type of interventions, their frequency and their targets.
Methods
Inclusion criteria: randomized controlled trials, with participants being pregnant women and/or new mothers (with infants up to twelve months), involving telemedicine interventions of any kind (e.g. websites, apps, chats, telephone), and addressing any mental health-related outcomes like depression, postnatal depression, anxiety, stress and others. Search terms were pregnant women, new mothers, telemedicine, RCT (randomised controlled trials), mental stress as well as numerous synonyms including medical subject headings. The literature search was conducted within the databases PubMed, Cochrane Library, Web of Science and PsycINFO. Screening, inclusion of records and data extraction were performed by two researchers according to the PRISMA guidelines, using the online tool CADIMA.
Results
Forty-four articles were included. A majority (62%) reported significantly improved mental health-related outcomes for participants receiving telemedicine interventions compared with controls. In particular, (internet-delivered) cognitive behavioural therapy was successful for depression and stress, and peer support improved outcomes for postnatal depression and anxiety. Interventions with preventive approaches and interventions aimed at symptom reduction were largely successful. For the most part, there was no significant improvement in symptoms of anxiety.
Conclusion
Telemedicine interventions evaluated within RCTs were mostly successful. However, they need to be designed to specifically target a certain mental health issue because there is no one-size-fits-all approach. Further research should focus on which specific interventions are appropriate for which mental health outcomes in terms of intervention delivery modes, content, target approaches, etc. Further investigation is needed, in particular with regard to anxiety.
Background
Early diagnosis is mandatory for the medical care of children and adolescents with pediatric-onset inflammatory bowel disease (PIBD). International guidelines (‘Porto criteria’) of the European Society for Pediatric Gastroenterology, Hepatology and Nutrition recommend medical diagnostic procedures in PIBD. Since 2004, German and Austrian pediatric gastroenterologists document diagnostic and treatment data in the patient registry CEDATA-GPGE on a voluntary basis. The aim of this retrospective study was to analyze whether the registry CEDATA-GPGE reflects the Porto criteria and to what extent diagnostic measures of PIBD according to the Porto criteria are documented.
Methods
Data of CEDATA-GPGE were analyzed for the period January 2014 to December 2018. Variables representing the Porto criteria for initial diagnostic were identified and categorized. The average of the number of measures documented in each category was calculated for the diagnoses CD, UC, and IBD-U. Differences between the diagnoses were tested by Chi-square test. Data on possible differences between data documented in the registry and diagnostic procedures that were actually performed were obtained via a sample survey.
Results
There were 547 patients included in the analysis. The median age of patients with incident CD (n = 289) was 13.6 years (IQR: 11.2–15.2), of patients with UC (n = 212) 13.1 years (IQR: 10.4–14.8) and of patients with IBD-U (n = 46) 12.2 years (IQR: 8.6–14.7).
The variables identified in the registry fully reflect the recommendations of the Porto criteria. Only the disease activity indices PUCAI and PCDAI were not directly provided by participants but were calculated from the obtained data. The category ‘Case history’ was documented most often (78.0%), while the category ‘Imaging of the small bowel’ was documented least frequently (39.1%). In patients with CD, the categories ‘Imaging of the small bowel’ (χ2 = 20.7, Cramer-V = 0.2, p < 0.001) and ‘Puberty stage’ (χ2 = 9.8, Cramer-V = 0.1, p < 0.05) were documented more often than in patients with UC and IBD-U.
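The χ² statistics with Cramér's V effect sizes reported above can be computed from a contingency table of documentation counts. A sketch with hypothetical counts (not the registry's actual cell counts):

```python
def chi_square_and_cramers_v(table):
    """Pearson chi-squared statistic and Cramér's V for a contingency table
    given as a list of rows of counts."""
    rows, cols = len(table), len(table[0])
    total = sum(sum(r) for r in table)
    row_sums = [sum(r) for r in table]
    col_sums = [sum(table[i][j] for i in range(rows)) for j in range(cols)]
    chi2 = 0.0
    for i in range(rows):
        for j in range(cols):
            expected = row_sums[i] * col_sums[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    v = (chi2 / (total * (min(rows, cols) - 1))) ** 0.5
    return chi2, v

# Hypothetical documentation counts (documented / not documented) by diagnosis
table = [[150, 139],   # CD
         [80, 132],    # UC
         [15, 31]]     # IBD-U
chi2, v = chi_square_and_cramers_v(table)
```

Cramér's V rescales χ² to the range 0 to 1, so small values such as the reported 0.1–0.2 indicate modest associations despite significant p-values.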
Conclusion
The registry fully reproduces the guideline’s recommendations for the initial diagnosis of PIBD. The proportion of documented diagnostic examinations varied within the diagnostic categories and between the diagnoses. Despite technological innovations, time and personnel capacities at the participating centers and the study center are necessary to ensure reliable data entry and to enable researchers to derive important insights into guideline-based care.
Background: The aim of our study was to investigate associations of spleen volume with blood count markers and lipid profile in the general population.
Materials & methods: Cross-sectional data from 1,106 individuals aged 30–90 years from the population-based Study of Health in Pomerania (SHIP-START-2) were analyzed. Blood count markers included red blood cell (RBC) counts, hemoglobin, platelet count, and white blood cell (WBC) counts. Lipid profile included total-cholesterol, high-density lipoprotein-cholesterol (HDL-C), and low-density lipoprotein-cholesterol (LDL-C) as well as triglycerides. Linear regression models adjusted for age, sex, body height, and weight were used to associate standardized spleen volume with blood counts and lipid profile markers.
Results: Spleen volume was positively associated with RBC (β = 0.05; 95% confidence interval [CI] = 0.03 to 0.08) and hemoglobin (β = 0.05; 95% CI = 0.01 to 0.09) but inversely with platelet count (β = −16.3; 95% CI = –20.5 to −12.1) and WBC (β = −0.25; 95% CI = −0.37 to −0.14). Furthermore, spleen volume showed inverse associations with total cholesterol (β = −0.17; 95% CI = −0.24 to −0.09), HDL-C (β = −0.08; 95% CI = −0.10 to −0.05), and LDL-C (β = −0.12; 95% CI = −0.17 to −0.06). There was no significant association of spleen volume with triglycerides.
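Standardized regression coefficients such as the β values above come from regressing the outcome on a z-standardized predictor. A simplified, unadjusted sketch (the study additionally adjusted for age, sex, height, and weight; the data below are hypothetical):

```python
import statistics

def standardized_beta(x, y):
    """Slope from regressing y on the z-standardized predictor x
    (simple regression, no covariate adjustment)."""
    mx, sx = statistics.mean(x), statistics.stdev(x)
    z = [(v - mx) / sx for v in x]           # z-standardize the predictor
    my = statistics.mean(y)
    cov = sum(zi * (yi - my) for zi, yi in zip(z, y)) / (len(x) - 1)
    return cov / statistics.variance(z)      # variance of z is 1 by construction

# Hypothetical spleen volumes (mL) and platelet counts (Gpt/L)
spleen = [150, 180, 210, 250, 300, 340]
platelets = [280, 270, 262, 250, 242, 230]
beta = standardized_beta(spleen, platelets)  # negative, as in the abstract
```

The β then expresses the expected change in the outcome per one standard deviation increase in spleen volume.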
Conclusion: Our study showed that the spleen volume is associated with markers of the blood count and lipid profile in the general population.
Background: The global obesity epidemic is a major public health concern, and accurate diagnosis is essential for identifying at-risk individuals. Three-dimensional (3D) body scanning technology offers several advantages over the standard practice of tape measurements for diagnosing obesity. This study was conducted to validate body scan data from a German population-based cohort and explore clinical implications of this technology in the context of metabolic syndrome. Methods: We performed a cross-sectional analysis of 354 participants from the Study of Health in Pomerania that completed a 3D body scanning examination. The agreement of anthropometric data obtained from 3D body scanning with manual tape measurements was analyzed using correlation analysis and Bland–Altman plots. Classification agreement regarding abdominal obesity based on IDF guidelines was assessed using Cohen’s kappa. The association of body scan measures with metabolic syndrome components was explored using correlation analysis. Results: Three-dimensional body scanning showed excellent validity with slightly larger values that presumably reflect the true circumferences more accurately. Metabolic syndrome was highly prevalent in the sample (31%) and showed strong associations with central obesity. Using body scan vs. tape measurements of waist circumference for classification resulted in a 16% relative increase in the prevalence of abdominal obesity (61.3% vs. 52.8%). Conclusions: These results suggest that the prevalence of obesity may be underestimated using the standard method of tape measurements, highlighting the need for more accurate approaches.
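Cohen's kappa, used above to quantify classification agreement beyond chance, can be sketched as follows; the classifications are hypothetical, not study data:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for agreement between two raters or methods
    (binary or nominal labels)."""
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    expected = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Hypothetical abdominal-obesity classifications (1 = obese per IDF cut-off)
tape = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
scan = [1, 1, 0, 1, 1, 0, 1, 0, 1, 1]
kappa = cohens_kappa(tape, scan)
```

Kappa is 1 for perfect agreement and 0 when observed agreement equals the level expected by chance.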
Guidelines and Standard Frameworks for AI in Medicine: Protocol for a Systematic Literature Review
(2023)
Background: Applications of artificial intelligence (AI) are pervasive in modern biomedical science, and the number of published algorithms and AI models for different target diseases and conditions is continuously increasing. While this situation undoubtedly improves the performance of AI models, health care providers are increasingly unsure which AI model to use, owing to the multiple alternatives for a specific target and the “black box” nature of AI. Moreover, the fact that studies rarely use guidelines in developing and reporting AI models poses additional challenges in trusting and adapting models for practical implementation.
Objective: This review protocol describes the planned steps and methods for a review of the synthesized evidence regarding the quality of available guidelines and frameworks to facilitate AI applications in medicine.
Methods: We will commence a systematic literature search using medical subject headings terms for medicine, guidelines, and machine learning (ML). All available guidelines, standard frameworks, best practices, checklists, and recommendations will be included, irrespective of the study design. The search will be conducted on web-based repositories such as PubMed, Web of Science, and the EQUATOR (Enhancing the Quality and Transparency of Health Research) network. After removing duplicate results, a preliminary scan for titles will be done by 2 reviewers. After the first scan, the reviewers will rescan the selected literature for abstract review, and any incongruities about whether to include the article for full-text review or not will be resolved by the third and fourth reviewer based on the predefined criteria. A Google Scholar (Google LLC) search will also be performed to identify gray literature. The quality of identified guidelines will be evaluated using the Appraisal of Guidelines, Research, and Evaluation (AGREE II) tool. A descriptive summary and narrative synthesis will be carried out, and the details of critical appraisal and subgroup synthesis findings will be presented.
Results: The results will be reported using the PRISMA (Preferred Reporting Items for Systematic Review and Meta-Analyses) reporting guidelines. Data analysis is currently underway, and we anticipate finalizing the review by November 2023.
Conclusions: Guidelines and recommended frameworks for developing, reporting, and implementing AI studies have been developed by different experts to facilitate the reliable assessment of validity and consistent interpretation of ML models for medical applications. We postulate that a guideline supports the assessment of an ML model only if the quality and reliability of the guideline are high. Assessing the quality and aspects of available guidelines, recommendations, checklists, and frameworks—as will be done in the proposed review—will provide comprehensive insights into current gaps and help to formulate future research directions.
International Registered Report Identifier (IRRID): DERR1-10.2196/47105
Background
Elective surgeries are among the most common health stressors in later life and pose a significant risk to functional and mental health, making them an important target of research into healthy aging and physical resilience. Large-scale longitudinal research, mostly conducted in non-clinical samples, has supported the predictive value of self-rated health (SRH) for both functional and mental health. SRH may therefore have the potential to predict favorable adaptation after significant health stressors, that is, physical resilience. So far, no study has examined the interplay between SRH, functional and mental health and their relative importance for health changes in the context of health stressors. The present study aimed to address this gap.
Methods
We used prospective data of 1,580 inpatients (794 complete cases) aged 70 years or older of the PAWEL study, collected between October 2017 and May 2019 in Germany. Our analyses were based on SRH, functional health (Barthel Index) and self-reported mental health problems (PHQ-4) before and 12 months after major elective surgery. To examine changes and interrelationships in these health indicators, bivariate latent change score (BLCS) models were applied.
Results
Our analyses provided evidence for improvements in SRH, functional and mental health from pre- to post-surgery. BLCS models based on complete cases and the total sample pointed to a complex interplay of SRH, functional health and mental health with bidirectional coupling effects. Better pre-surgery SRH was associated with improvements in functional and mental health, and better pre-surgery functional and mental health were associated with improvements in SRH from pre- to post-surgery. Effects of pre-surgery SRH on changes in functional health were smaller than those of functional health on changes in SRH.
Conclusions
Meaningful changes in SRH, functional and mental health and their interplay were demonstrated for the first time in a clinical setting. Our findings provide preliminary support for SRH as a physical resilience factor associated with improvements in other health indicators after health stressors. Longitudinal studies with more timepoints are needed to fully understand the predictive value of SRH for multidimensional health.
Trial registration
PAWEL study, German Clinical Trials Register, number DRKS00013311. Registered 10 November 2017 – Retrospectively registered, https://www.drks.de/drks_web/navigate.do?navigationId=trial.HTML&TRIAL_ID=DRKS00013311.
Background
Long periods of uninterrupted sitting, i.e., sedentary bouts, and their relationship with adverse health outcomes have moved into the focus of public health recommendations. However, evidence on associations between sedentary bouts and adiposity markers is limited. Our aim was to investigate associations of the daily number of sedentary bouts with waist circumference (WC) and body mass index (BMI) in a sample of middle-aged to older adults.
Methods
In this cross-sectional study, data were collected from three different studies that took place in the area of Greifswald, Northern Germany, between 2012 and 2018. In total, 460 adults from the general population aged 40 to 75 years and without known cardiovascular disease wore tri-axial accelerometers (ActiGraph Model GT3X+, Pensacola, FL) on the hip for seven consecutive days. A wear time of ≥ 10 h on ≥ 4 days was required for analyses. WC (cm) and BMI (kg/m²) were measured in a standardized way. Separate multilevel mixed-effects linear regression analyses were used to investigate associations of sedentary bouts (1 to 10 min, >10 to 30 min, and >30 min) with WC and BMI. Models were adjusted for potential confounders including sex, age, school education, employment, current smoking, season of data collection, and composition of accelerometer-based time use.
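The bout categories described above amount to run-length counting over a minute-by-minute sedentary classification. The following is a minimal sketch of that idea (a hypothetical illustration only; real accelerometer processing relies on validated count cut-points and wear-time algorithms not shown here):

```python
def count_sedentary_bouts(is_sedentary):
    """Count uninterrupted sedentary bouts by length category.

    is_sedentary: sequence of booleans, one per minute of wear time.
    Returns a dict with counts of 1-10 min, >10-30 min, and >30 min bouts.
    """
    counts = {"1-10": 0, ">10-30": 0, ">30": 0}
    run = 0
    for minute in list(is_sedentary) + [False]:  # sentinel closes the last run
        if minute:
            run += 1
        else:
            if 1 <= run <= 10:
                counts["1-10"] += 1
            elif 10 < run <= 30:
                counts[">10-30"] += 1
            elif run > 30:
                counts[">30"] += 1
            run = 0
    return counts

# Toy day: an 8-minute, a 20-minute, and a 45-minute sedentary bout
day = [True] * 8 + [False] * 5 + [True] * 20 + [False] * 5 + [True] * 45
print(count_sedentary_bouts(day))  # → {'1-10': 1, '>10-30': 1, '>30': 1}
```

The same counting logic, applied per valid wear day and averaged, yields the daily bout numbers reported in the Results.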
Results
Participants (66% female) were on average 57.1 (standard deviation, SD 8.5) years old, and 36% had a school education of >10 years. The mean number of sedentary bouts per day was 95.1 (SD 25.0) for 1-to-10-minute bouts, 13.3 (SD 3.4) for >10-to-30-minute bouts, and 3.5 (SD 1.9) for >30-minute bouts. Mean WC was 91.1 cm (SD 12.3) and mean BMI was 26.9 kg/m² (SD 3.8). The daily number of 1-to-10-minute bouts was inversely associated with BMI (b = -0.027; p = 0.047), and the daily number of >30-minute bouts was positively associated with WC (b = 0.330; p = 0.001). All other associations were not statistically significant.
Conclusion
The findings provide some evidence on favourable associations of short sedentary bouts as well as unfavourable associations of long sedentary bouts with adiposity markers. Our results may contribute to a growing body of literature that can help to define public health recommendations for interrupting prolonged sedentary periods.
Trial registration
Study 1: German Clinical Trials Register (DRKS00010996); study 2: ClinicalTrials.gov (NCT02990039); study 3: ClinicalTrials.gov (NCT03539237).
Background: Thorough data stewardship is a key enabler of comprehensive health research. Processes such as data collection, storage, access, sharing, and analytics require researchers to follow elaborate data management strategies properly and consistently. Studies have shown that findable, accessible, interoperable, and reusable (FAIR) data leads to improved data sharing in different scientific domains.
Objective: This scoping review identifies and discusses concepts, approaches, implementation experiences, and lessons learned in FAIR initiatives in health research data.
Methods: The Arksey and O’Malley stage-based methodological framework for scoping reviews was applied. PubMed, Web of Science, and Google Scholar were searched for relevant publications. Articles written in English, published between 2014 and 2020, and addressing FAIR concepts or practices in the health domain were included. The 3 data sources were deduplicated using reference management software. Two independent authors reviewed the eligibility of each article based on defined inclusion and exclusion criteria. A charting tool was used to extract information from the full-text papers. The results were reported using the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) guidelines.
Results: A total of 2.18% (34/1561) of the screened articles were included in the final review. The authors reported FAIRification approaches, which include interpolation, inclusion of comprehensive data dictionaries, repository design, semantic interoperability, ontologies, data quality, linked data, and requirement gathering for FAIRification tools. Challenges and mitigation strategies associated with FAIRification, such as high setup costs, data politics, technical and administrative issues, privacy concerns, and difficulties in sharing health data owing to its sensitive nature, were also reported. We found various workflows, tools, and infrastructures designed by different groups worldwide to facilitate the FAIRification of health research data. We also uncovered a wide range of problems and questions that researchers are trying to address by using these workflows, tools, and infrastructures. Although the concept of FAIR data stewardship in the health research domain is relatively new, almost all continents have been reached by at least one network trying to achieve health data FAIRness. Documented outcomes of FAIRification efforts include peer-reviewed publications, improved data sharing, facilitated data reuse, return on investment, and new treatments. Successful FAIRification of data has informed the management and prognosis of various diseases such as cancer, cardiovascular diseases, and neurological diseases. Efforts to FAIRify data on a wider variety of diseases have been ongoing since the COVID-19 pandemic.
Conclusions: This work summarises projects, tools, and workflows for the FAIRification of health research data. The comprehensive review shows that implementing the FAIR concept in health data stewardship carries the promise of improved research data management and transparency in the era of big data and open research publishing.
International Registered Report Identifier (IRRID): RR2-10.2196/22505
Background: Fatigue, dyspnea, and lack of energy and concentration are commonly interpreted as indicative of symptomatic anemia and may thus play a role in diagnostic and therapeutic decisions. Objective: To investigate the association between symptoms commonly attributed to anemia and the actual presence of anemia. Methods: Data from two independent cohorts of the Study of Health in Pomerania (SHIP) were analyzed. Interview data, laboratory data, and physical examination findings were individually linked with claims data from the Association of Statutory Health Insurance Physicians. A complete case analysis using logistic regression models was performed to evaluate the association of anemia with symptoms commonly attributed to it. The models were adjusted for confounders such as depression, medication, insomnia, and other medical conditions. Results: A total of 5979 participants (53% female, median age 55 years) were included in the analysis. Of those, 30% reported fatigue, 16% lack of energy, 16% lack of concentration, and 29% dyspnea and/or weakness. Anemia was prevalent in about 6% (379). The symptoms were more prevalent in participants with anemia; however, participants with anemia were older and had a poorer health status. In multivariate logistic regression models, there was no association between anemia and the symptoms fatigue, lack of concentration, dyspnea, and/or weakness. Anemia was associated with lack of energy in the multivariate analysis (OR: 1.45; 95% CI: 1.13–1.86). Other factors such as depression, insomnia, and medication were more strongly associated with the symptoms. Conclusion: The clinical symptoms commonly attributed to anemia are unspecific and highly prevalent in both non-anemic and anemic persons. Even in the presence of anemia, other causes such as depression, heart failure, asthma, and COPD, which are more closely associated with the symptoms, should be considered. Further diagnostic research is warranted to explore the association of symptoms in different subgroups and settings in order to support clinical decision making.
Background
Multiple sclerosis (MS) is an autoimmune inflammatory disease of the central nervous system that often leads to premature incapacity for work. The MSnetWork project therefore implements a new form of care and pursues the goal of maintaining or even improving the state of health of MS patients and positively influencing their ability to work as well as their participation in social life. A network of neurologists, occupational health and rehabilitation physicians, psychologists, and social insurance providers offers patients targeted services that have not previously been part of standard care. Treatment options will be identified and initiated according to the patient’s needs.
Methods
The MSnetWork study is designed as a multicenter randomized controlled trial, with two parallel groups (randomization at the patient level with 1:1 allocation ratio, planned N = 950, duration of study participation 24 months). After 12 months, the patients in the control group will also receive the interventions. The primary outcome is the number of sick leave days. Secondary outcomes are health-related quality of life, physical, affective and cognitive status, fatigue, costs of incapacity to work, treatment costs, out-of-pocket costs, self-efficacy, and patient satisfaction with therapy.
Intervention effects are analyzed by a parallel-group comparison between the intervention and the control group. Furthermore, the long-term effects within the intervention group will be observed and a pre-post comparison of the control group, before and after receiving the intervention in MSnetWork, will be performed.
Discussion
Due to the multiple approaches to patient-centered, multidisciplinary MS care, MSnetWork can be considered a complex intervention. The study design and linkage of comprehensive, patient-specific primary and secondary data in an outpatient setting enable the evaluation of this complex intervention, both on a qualitative and quantitative level. The basic assumption is a positive effect on the prevention or reduction of incapacity for work as well as on the patients’ quality of life. If the project proves to be a success, MSnetWork could be adapted for the treatment of other chronic diseases with an impact on the ability to work and quality of life.
Trial registration
The MSnetWork trial was retrospectively registered in the German Clinical Trials Register (DRKS) on 8 July 2022 under the ID DRKS00025451.
This is the first study to analyze the association of accelerometer-measured patterns of habitual physical activity (PA) and sedentary behavior (SB) with serum BDNF in individuals with coronary heart disease. A total of 30 individuals (mean age 69.5 years; 80% men) participated in this pre-post study that aimed to test a multi-behavioral intervention. All participants underwent standardized measurement of anthropometric variables, blood collection, a self-administered survey, and accelerometer-based measurement of PA and SB over seven days. Serum BDNF concentrations were measured using an enzyme-linked immunosorbent assay (ELISA) kit. We applied separate multiple linear regression analyses to estimate the associations of baseline SB pattern measures and light and moderate-to-vigorous PA with serum BDNF (n = 29). Participants spent 508.7 ± 76.5 min/d in SB, 258.5 ± 71.2 min/d in light PA, and 21.2 ± 15.2 min/d in moderate-to-vigorous PA. Per day, individuals had 15.5 ± 3.2 bouts of 10 to 30 min of SB (average length: 22.2 ± 2.1 min) and 3.4 ± 1.2 bouts of >30 min of SB (average length: 43.8 ± 2.4 min). Regression analysis revealed no significant associations between any of the accelerometer-based measures and serum BDNF. The findings of this study did not reveal an association of accelerometer-measured PA and SB pattern variables with serum BDNF in individuals with coronary heart disease. In addition, our data revealed considerable variation in PA and SB, which should be considered in future studies.
Multivariate analysis of independent determinants of ADL/IADL and quality of life in the elderly
(2022)
Background
This study evaluated the determinants of disability and quality of life in elderly people who participated in the multi-centre RubiN project (Regional ununterbrochen betreut im Netz) in Germany.
Methods
Baseline data of subjects aged 70 years and older from the RubiN project were used, and only subjects with complete data sets were considered for the ensuing analysis (complete case analysis, CCA).
Disability was examined using the concepts of ADL (activities of daily living) and IADL (instrumental activities of daily living). Subjects exhibiting one or more deficiencies in ADL or IADL were considered ADL- or IADL-disabled, respectively. Quality of life was assessed using the WHOQOL-BREF and the WHOQOL-OLD. Applying multivariate analysis, sociodemographic factors, psychosocial characteristics as well as functional, nutritional and cognitive status were explored as potential determinants of disability and quality of life in the elderly.
Results
One thousand three hundred seventy-five subjects from the RubiN project had complete baseline data. ADL and IADL disability were both associated with the respective other construct of disability, sex, a reduced cognitive and functional status as well as domains of the WHOQOL-BREF. Furthermore, ADL disability was related to social participation, while IADL disability was linked to age, education and social support. Sex, ADL and IADL disability, income, social support and social participation as well as functional status were predictors of the domain ‘Physical Health’ (WHOQOL-BREF). The facet ‘Social Participation’ (WHOQOL-OLD) was affected by both ADL and IADL disability, income, social participation, and nutritional as well as functional status.
Conclusions
Several potential determinants of disability and quality of life were identified and confirmed in this study. Attention should be drawn to prevention schemes as many of these determinants appear to be at least partly modifiable.
Background
Although chronic kidney disease (CKD) is highly prevalent in the general population, little research has been conducted on CKD management in ambulatory care.
The objective was to assess management and quality of care by evaluating CKD coding in ambulatory care, patient diagnosis awareness, frequency of monitoring, and whether appropriate patients are referred to nephrology.
Methods
Clinical data from the population-based cohort Study of Health in Pomerania (SHIP-START) were matched with claims data of the Association of Statutory Health Insurance Physicians. Quality of care was evaluated according to international and German recommendations.
Results
Data from 1778 participants (56% female, mean age 59 years) were analysed. 10% had an eGFR < 60 ml/min/1.73 m² (mean age 74 years), 15% had albuminuria, and 21% had CKD as defined by KDIGO. Of these, 20% were coded and 7% self-reported having CKD. Coding increased with GFR stage (G3a 20%, G3b 61%, G4 75%, G5 100%). Serum creatinine and urinary dipstick testing were billed for the majority of participants regardless of renal function. Testing frequency partially surpassed recommendations. Nephrology consultation was billed in few cases with stage G3b–G4.
Conclusion
CKD coding increased with stage and was performed reliably in stages ≥ G4, while CKD awareness was low. Adherence to monitoring and referral criteria varied, depending on the applicability of the monitoring criteria. To assess quality of care, consensus on monitoring, patient education, referral criteria and coordination of care needs to be established, accounting for patient-related factors, including age and comorbidity.
Trial registration
This study was prospectively registered as DRKS00009812 in the German Clinical Trials Register (DRKS).
Legal advice and care-effective use of care and case management: limits, risks and need for change
(2022)
Introduction
An important task of care and case managers is to support geriatric patients in obtaining social services in medical, nursing, therapeutic and social fields. To this end, they advise and represent their patients.
Methods
The documentation of patient contacts with case managers of a physicians’ network was evaluated. In particular, activities involving legal advice were analysed in detail, compared with the current legal situation in Germany, and evaluated. In addition, qualitative expert interviews were conducted. The content and requirements of legal services law were determined by applying legal interpretation methods (especially wording, telos, and systematics). The results of the documentation analysis were compared with these legal requirements.
Results
Care and case management involves activities in some fields of action without having a legal basis in legal services law. As a consequence, these services may not be provided, and uninsured and uninsurable liability risks arise.
Discussion
With the introduction of care and case management into standard care, both social law and the Legal Services Act must be adapted to enable the legally compliant use of care and case managers. Otherwise, certain services that are useful for the care of patients may not be provided.
The incidence and prevalence of pediatric-onset inflammatory bowel disease (PIBD) are on the rise worldwide. Initial symptoms are often recognized with a delay, which reduces the quality of life and may lead to an increased rate of complications. The aim of this study was to determine the diagnostic delay in PIBD and to identify potential influencing factors. Data from the German-Austrian patient registry CEDATA-GPGE for children and adolescents with PIBD were therefore analyzed for the period January 2014 to December 2018. There were 456 children identified in the data, thereof 258 children (57%) with Crohn’s disease (CD) and 198 children (43%) with ulcerative colitis (UC). The median age was 13.3 years (interquartile range (IQR) = 10.9−15.0), and 44% were female. The median diagnostic delay was 4.1 months (IQR = 2.1–7.0) in CD and 2.4 months (IQR = 1.2–5.1) in UC (p = 0.01). UC was associated with earlier diagnosis than CD (p < 0.001). Only a few factors influencing the diagnostic delay could be verified, e.g., nocturnal abdominal pain and whether video capsule endoscopy was performed. Diagnostic delay improved over the years in participating centers, but the level of awareness needs to remain high even for common symptoms like abdominal pain.
Background
The Federal Ministry of Education and Research of Germany (BMBF) funds a network of university medical centres (NUM) to support COVID-19 and pandemic research at the national level. The “COVID-19 Data Exchange Platform” (CODEX), as part of NUM, establishes a harmonised infrastructure that supports the research use of COVID-19 datasets. The broad consent (BC) of the Medical Informatics Initiative (MII) has been agreed by all German federal states and forms the legal basis for data processing. All 34 participating university hospitals (NUM sites) work on a harmonised infrastructural and legal basis for the data protection-compliant collection and transfer of their research datasets to the central CODEX platform. Each NUM site ensures that the exchanged consent information conforms to the already-balloted HL7 FHIR consent profiles and the interoperability concept of the MII Task Force “Consent Implementation” (TFCI). The Independent Trusted Third Party (TTP) of the University Medicine Greifswald supports data protection-compliant data processing and provides the consent management solution gICS.
Methods
Based on a stakeholder dialogue, a required set of FHIR functionalities was identified and technically specified with the support of official FHIR experts. Next, a “TTP-FHIR Gateway” for the HL7 FHIR-compliant exchange of consent information using gICS was implemented. The last step comprised external integration tests and the development of a pre-configured consent template for the BC at the NUM sites.
Results
A FHIR-compliant gICS-release and a corresponding consent template for the BC were provided to all NUM sites in June 2021. All FHIR functionalities comply with the already-balloted FHIR consent profiles of the HL7 Working Group Consent Management. The consent template simplifies the technical BC rollout and the corresponding implementation of the TFCI interoperability concept at the NUM sites.
Conclusions
This article shows that an HL7 FHIR-compliant and interoperable nationwide exchange of consent information could be built using the consent management software gICS and the provided TTP-FHIR Gateway. The initial functional scope of the solution covers the requirements identified in the NUM-CODEX setting. The semantic correctness of these functionalities was validated by project partners from the Ludwig-Maximilian University in Munich. The production rollout of the solution package to all NUM sites has started successfully.
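For readers unfamiliar with the exchanged format, the following sketch shows the rough shape of an HL7 FHIR R4 Consent resource as a Python dictionary. All identifiers, codes, references, and dates below are hypothetical placeholders for illustration and do not reproduce the actual MII broad consent profile:

```python
# Minimal sketch of an HL7 FHIR R4 Consent resource. Placeholder values only;
# the balloted MII/NUM consent profiles add further mandatory elements.
consent = {
    "resourceType": "Consent",
    "status": "active",
    "scope": {
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/consentscope",
            "code": "research",
        }]
    },
    "category": [{
        "coding": [{
            "system": "http://loinc.org",
            "code": "57016-8",  # privacy policy acknowledgment document
        }]
    }],
    "patient": {"reference": "Patient/example-pseudonym"},  # placeholder
    "dateTime": "2021-06-15",
    "provision": {
        "type": "permit",  # permit or deny per consented policy module
        "period": {"start": "2021-06-15", "end": "2026-06-15"},
    },
}
print(consent["resourceType"], consent["status"])
```

A gateway such as the TTP-FHIR Gateway would serialize structures of this kind to JSON and validate them against the balloted consent profiles before exchange.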
Teaching is amongst the six professions with the highest stress levels and lowest job satisfaction, leading to a high turnover rate and teacher shortages. During the pandemic, teachers and school principals were confronted with new regulations and teaching methods. This study aims to examine post-pandemic stress levels, as well as resilience factors to proactively cope with stress and thoughts of leaving the profession among teachers and school principals. We used a cross-sectional online survey. The validated instruments Perceived Stress Scale (PSS-10) and Proactive Coping Subscale (PCI) were used. We included 471 teachers and 113 school principals in the analysis. Overall, respondents had a moderate stress level. During the pandemic, every fourth teacher (27.2%) and every third principal (32.7%) had serious thoughts of leaving the profession. More perceived helplessness (OR = 1.2, p < 0.001), less self-efficacy (OR = 0.8, p = 0.002), and poorer coping skills (OR = 0.96, p = 0.044) were associated with a higher likelihood of thoughts of leaving the profession for teachers, whereas for school principals, only higher perceived helplessness (OR = 1.2, p = 0.008) contributed significantly. To prevent further teacher attrition, teachers and school principals need support to decrease stress and increase their ability to cope.
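The odds ratios reported above act multiplicatively on the odds, not directly on the probability. The sketch below illustrates the conversion; the baseline probability is a hypothetical anchor loosely based on the reported 27.2% prevalence, not a model prediction from the study:

```python
def shift_probability(p_baseline, odds_ratio, units=1):
    """Apply a per-unit odds ratio to a baseline probability.

    Converts probability -> odds, multiplies by odds_ratio**units,
    converts back to a probability.
    """
    odds = p_baseline / (1 - p_baseline) * odds_ratio ** units
    return odds / (1 + odds)

# Hypothetical anchor: 27% baseline probability of thoughts of leaving.
# With the reported OR of 1.2 per point of perceived helplessness,
# a 5-point higher score multiplies the odds by 1.2**5 (about 2.49).
p5 = shift_probability(0.27, 1.2, units=5)
print(round(p5, 2))  # → 0.48
```

This is why a seemingly modest OR of 1.2 can translate into a substantial probability difference across the range of a multi-point scale.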
Severity of alcohol dependence and mortality after 20 years in an adult general population sample
(2022)
Objectives
To estimate mortality according to the severity of alcohol dependence, assessed by two approaches: the frequency of alcohol dependence symptoms (FADS) and the number of alcohol dependence criteria (NADC).
Methods
A random sample of adult community residents aged 18 to 64 in northern Germany had been interviewed in 1996. Among the 4075 study participants at baseline, vital status was ascertained 20 years later for 4028. The FADS was assessed with the Severity of Alcohol Dependence Scale among the 780 study participants who had one or more symptoms of alcohol dependence or abuse and vital status information. The NADC was estimated with the Munich Composite International Diagnostic Interview among the 4028 study participants with vital status information. Cox proportional hazards models were used.
Results
The age-adjusted hazard ratio for the FADS (value range: 0–79) was 1.02 (95% confidence interval, CI: 1.016–1.028), for the NADC (value range: 0–7) it was 1.25 (CI: 1.19–1.32).
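Under the proportional hazards assumption, a per-unit hazard ratio compounds multiplicatively across a score range, which is what the dose-dependent interpretation in the Conclusions rests on. A small illustration using the reported point estimates (illustrative arithmetic only, ignoring confidence intervals):

```python
def compounded_hazard_ratio(per_unit_hr, units):
    """Hazard ratio for a difference of `units` on a score,
    assuming a log-linear (proportional hazards) effect."""
    return per_unit_hr ** units

# NADC: per-criterion HR of 1.25 over the 0-7 range, i.e. a person
# meeting all 7 criteria vs. a person meeting none.
print(round(compounded_hazard_ratio(1.25, 7), 2))  # → 4.77

# FADS: the per-point HR of 1.02 compounds the same way over its 0-79 range.
```

The compounding shows why even a small per-unit HR implies a markedly elevated mortality hazard at the upper end of the severity scale.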
Conclusions
The FADS and NADC predicted time to death in a dose-dependent manner in this adult general population sample.
The Apolipoprotein E (APOE) gene polymorphism (rs429358 and rs7412) shows a well-established association with lipid profiles, but evidence on its effect on cardiovascular disease is still conflicting. Therefore, we examined the association of different APOE alleles with common carotid artery intima-media thickness (CCA-IMT), carotid plaques, and incident myocardial infarction (MI) and stroke. We analyzed data from 3327 participants aged 20–79 years of the population-based Study of Health in Pomerania (SHIP) from Northeast Germany with a median follow-up time of 14.5 years. Linear, logistic, and Cox regression models were used to assess the associations of the APOE polymorphism with CCA-IMT, carotid plaques, and incident MI and stroke, respectively. In our study, the APOE E2 allele was associated with a lower CCA-IMT at baseline compared to E3 homozygotes (β: −0.02 [95% CI −0.04, −0.004]). Over the follow-up, 244 MI events and 218 stroke events were observed. The APOE E2 and E4 alleles were not associated with incident MI (E2 HR: 1.06 [95% CI 0.68, 1.66]; E4 HR: 1.03 [95% CI 0.73, 1.45]) or incident stroke (E2 HR: 0.79 [95% CI 0.48, 1.30]; E4 HR: 0.96 [95% CI 0.66, 1.38]) in any of the models adjusting for potential confounders. However, the positive association between CCA-IMT and incident MI was more pronounced in E2 carriers than in E3 homozygotes. Thus, our study suggests that while the APOE E2 allele may predispose individuals to a lower CCA-IMT, E2 carriers may be more prone to MI than E3 homozygotes as CCA-IMT increases. The APOE E4 allele had no effect on CCA-IMT, plaques, MI or stroke.
Background
A redistribution of tasks between specialized nurses and primary care physicians, i.e., models of advanced nursing practice, has the potential to improve the treatment and care of the growing number of people with dementia (PwD). Especially in rural areas with limited access to primary care physicians and specialists, these models might improve PwD’s quality of life and well-being. However, such care models are not available in Germany in regular healthcare. This study examines the acceptance, safety, efficacy, and health economic efficiency of an advanced nursing practice model for PwD in the primary care setting in Germany.
Methods
InDePendent is a two-arm, multi-center, cluster-randomized controlled intervention study. Inclusion criteria are age ≥70 years, cognitive impairment (DemTect ≤8) or a formal dementia diagnosis, and living in one’s own home. Patients will be recruited by general practitioners or specialists. Randomization is carried out at the physician level in a ratio of 1:2 (intervention vs. waiting-control group). After study inclusion, all participants will receive a baseline assessment and a follow-up assessment after 6 months. Patients in the intervention group will receive advanced dementia care management for 6 months, carried out by specialized nurses who will take over certain tasks usually performed by primary care physicians. This includes a standardized assessment of the patients’ unmet needs and the generation and implementation of an individualized care plan to address these needs in close coordination with the GP. PwD in the waiting-control group will receive routine care for 6 months and subsequently become part of the intervention group. The primary outcome is the number of unmet needs after 6 months, measured with the Camberwell Assessment of Need for the Elderly (CANE). The primary analysis after 6 months will be carried out using multilevel models and will be based on the intention-to-treat principle. Secondary outcomes are quality of life, caregiver burden, acceptance, and cost-effectiveness. In total, n=465 participants are needed to detect significant differences in the number of unmet needs between the intervention and control groups.
Discussion
The study will provide evidence about the acceptance, efficacy, and cost-effectiveness of an innovative interprofessional concept based on advanced nursing care. Results will contribute to the implementation of such models in the German healthcare system. The goal is to improve the current treatment and care situation for PwD and their caregivers and to expand nursing roles.
Background
The care of palliative patients is delivered as non-specialized and specialized care, in outpatient and inpatient settings. However, palliative care is largely provided as General Outpatient Palliative Care (GOPC). This study aimed to investigate whether the survival curves of GOPC patients differed from those of the more intensive palliative care modalities and whether GOPC was appropriate in terms of timing.
Methods
The study is based on claims data from a large statutory health insurance. The analysis included 4177 patients who received palliative care starting in 2015 and who were fully insured 1 year before and 1 year after palliative care or until death. The probability of survival was observed for 12 months. Patients were classified into group A, which consisted of patients who received palliative care only with GOPC, and group B including patients who received inpatient or specialized outpatient palliative care. Group A was further divided into two subgroups. Patients who received GOPC on only 1 day were assigned to subgroup A1, and patients who received GOPC on two or more days were assigned to subgroup A2. The survival analysis was carried out using Kaplan-Meier curves. The median survival times were compared with the log-rank test.
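The Kaplan-Meier curves referred to above are built from the product-limit estimator. The following is a self-contained sketch of that estimator on toy data (not the study data):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times:  follow-up time per patient
    events: 1 if death was observed, 0 if the patient was censored
    Returns [(time, survival_probability)] at each observed event time.
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            surv *= 1 - deaths / at_risk  # multiply by conditional survival
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)  # drop deaths + censored
    return curve

# Toy cohort of 8 patients (days, event indicator):
times  = [35, 47, 90, 137, 200, 217, 365, 365]
events = [1,  1,  0,  1,   1,   1,   0,   0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The log-rank test used to compare the groups' median survival then contrasts observed and expected event counts across these curves.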
Results
The survival curves differed between groups A and B, except in the first quartile of the survival distribution. The median survival was significantly longer in group A (137 days, n = 2763) than in group B (47 days, n = 1424, p < 0.0001) and shorter in group A1 (35 days, n = 986) than in group A2 (217 days, n = 1767, p < 0.0001). The survival rate during the 12-month follow-up was higher in group A (42%) than in group B (11%) and lower in group A1 (38%) than in group A2 (44%).
Conclusions
The results of the analysis suggest insufficient care for patients who received their first palliative care shortly before death, especially those who received GOPC on only 1 day and no further palliative care until death or the end of the 12-month follow-up. Palliative care should start as early as necessary and continue until the end of life.
Background
Clinical practice guidelines recommend specialist referral according to different criteria. The aim was to assess recommended and observed referral rates and the associated health care expenditure according to recommendations from:
• Kidney Disease Improving Global Outcomes (KDIGO,2012)
• National Institute for Health and Care Excellence (NICE,2014)
• German Society of Nephrology/German Society of Internal Medicine (DGfN/DGIM,2015)
• German College of General Practitioners and Family Physicians (DEGAM,2019)
• Kidney failure risk equation (NICE,2021)
Methods
Data of the population-based cohort Study of Health in Pomerania were matched with claims data. The proportion of subjects meeting the referral criteria and the corresponding health care expenditures were calculated and projected to the population of Mecklenburg-Vorpommern.
Results
Data from 1927 subjects were analysed. The overall proportion of subjects meeting referral criteria ranged from 4.9% (DEGAM) to 8.3% (DGfN/DGIM). The majority of patients eligible for referral were ≥ 60 years old. In subjects older than 60 years, differences were even more pronounced, with rates ranging from 9.7% (DEGAM) to 16.5% (DGfN/DGIM). Estimated population-level costs varied between €1,432,440 (DEGAM) and €2,386,186 (DGfN/DGIM). Of the 190 patients with eGFR < 60 ml/min, 15 had a risk of end-stage renal disease > 5% within the next 5 years.
Conclusions
Applying different referral criteria results in different referral rates and costs. Recommended referral rates exceed the actually observed consultation rates. Criteria need to be evaluated in terms of the available workforce and resources, and with regard to over- and underutilization of nephrology services.
In 2009, the Democratic Republic of Congo (DRC) started its journey towards achieving Universal Health Coverage (UHC). This study examines the evolution of financial risk protection and health outcome indicators in the context of the DRC’s commitment to UHC. To measure the effects of this commitment, we analyse whether changes have occurred over the last two decades and, if applicable, when these changes happened. We used five variables as indicators to measure the financial risk protection component and retained three indicators to measure health outcomes. To identify time-related effects, we applied the parametric approach of breakpoint regression to detect whether the UHC journey has brought change and when exactly that change occurred.
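Breakpoint regression of the kind described above can be illustrated with a simple grid search: fit an ordinary least-squares line to each candidate pair of segments and keep the split with the smallest total squared error. This is a generic sketch with invented data, not the study’s model or parameterization:

```python
def ols(xs, ys):
    """Closed-form simple linear regression; returns (intercept, slope, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    intercept = my - slope * mx
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return intercept, slope, sse


def find_breakpoint(xs, ys, min_seg=3):
    """Return the x at which splitting the series into two separately
    fitted OLS segments minimises the total squared error."""
    best_x, best_sse = None, None
    for k in range(min_seg, len(xs) - min_seg + 1):
        sse = ols(xs[:k], ys[:k])[2] + ols(xs[k:], ys[k:])[2]
        if best_sse is None or sse < best_sse:
            best_x, best_sse = xs[k], sse
    return best_x
```

With a flat series that jumps at x = 5, the search recovers that breakpoint; parametric implementations additionally provide confidence intervals for its location.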
Although there is a slight improvement in the financial risk protection indicators, we found that the adopted strategies have fostered access to healthcare for the wealthiest quantile of the population while neglecting the majority of the poorest. The government has not striven persistently over the past decade to meet its commitment to allocate adequate funds to health expenditure. In addition, the support from donors appears to be unstable, unpredictable and unsustainable. We found a slight improvement in health outcomes attributable to direct investment in building health centres by the private sector and international organizations. Overall, our findings reveal that the prevention of catastrophic health expenditure is still not sufficiently prioritized by the country, particularly for the majority of the poorest. Our work therefore suggests that the DRC’s UHC journey has contributed slightly to improving the financial risk protection and health outcome indicators, but much effort remains to be made.
Background
Data collected during routine health care and ensuing analytical results bear the potential to provide valuable information to improve the overall health care of patients. However, little is known about how patients prefer to be informed about the possible usage of their routine data and/or biosamples for research purposes before reaching a consent decision. Specifically, we investigated the setting, the timing and the responsible staff for the information and consent process.
Methods
We performed a quasi-randomized controlled trial and compared the method by which patients were informed either in the patient admission area following patient admission by the same staff member (Group A) or in a separate room by another staff member (Group B). The consent decision was hypothetical in nature. Additionally, we evaluated if there was the need for additional time after the information session and before taking the consent decision. Data were collected during a structured interview based on questionnaires where participants reflected on the information and consent process they went through.
Results
Questionnaire data were obtained from 157 participants in Group A and 106 participants in Group B. Overall, participants in both groups were satisfied with their experienced process and with the way information was provided. They reported that their (hypothetical) consent decision was freely made. Approximately half of the interested participants in Group B did not show up in the separate room, while all interested participants in Group A could be informed about the secondary use of their routine data and left-over samples. No participants, except for one in Group B, wanted to take extra time for their consent decision. The hypothetical consent rate for both routine data and left-over samples was very high in both groups.
Conclusions
The willingness to support medical research by allowing the use of routine data and left-over samples seems to be widespread among patients. Information concerning this secondary use may be given by trained administrative staff immediately following patient admission. Patients mainly prefer making a consent decision directly after the information is provided and discussed. Furthermore, fewer patients are informed when the process is organized in a separate room.
Background
Missing data are ubiquitous in randomised controlled trials. Although sensitivity analyses for different missing data mechanisms (missing at random vs. missing not at random) are widely recommended, they are rarely conducted in practice. The aim of the present study was to demonstrate sensitivity analyses for different assumptions regarding the missing data mechanism for randomised controlled trials using latent growth modelling (LGM).
Methods
Data from a randomised controlled brief alcohol intervention trial were used. The sample included 1646 adults (56% female; mean age = 31.0 years) from the general population who had received up to three individualized alcohol feedback letters or assessment only. Follow-up interviews were conducted after 12 and 36 months via telephone. The main outcome for the analysis was change in alcohol use over time. A three-step LGM approach was used. First, evidence about the process that generated the missing data was accumulated by analysing the extent of missing values in both study conditions, the missing data patterns, and the baseline variables that predicted participation in the two follow-up assessments using logistic regression. Second, growth models were calculated to analyse intervention effects over time. These models assumed that data were missing at random and applied full-information maximum likelihood estimation. Third, the findings were safeguarded by incorporating model components to account for the possibility that data were missing not at random. For this purpose, Diggle-Kenward selection, Wu-Carroll shared parameter, and pattern mixture models were implemented.
Results
Although the true data-generating process remained unknown, the evidence was unequivocal: both the intervention and control groups reduced their alcohol use over time, but no significant group differences emerged. There was no clear evidence of intervention efficacy, neither in the growth models that assumed the data were missing at random nor in those that assumed the data were missing not at random.
Conclusion
The illustrated approach allows the assessment of how sensitive conclusions about the efficacy of an intervention are to different assumptions regarding the missing data mechanism. For researchers familiar with LGM, it is a valuable statistical supplement to safeguard their findings against the possibility of nonignorable missingness.
Background
Numerous wearables are used in research contexts to record cardiac activity, although their validity and usability have not been fully investigated. The objective of this study is a cross-model comparison of data quality in different realistic use cases (cognitive and physical tasks). Recording quality is expressed by the ability to accurately detect the QRS complex, the amount of noise in the data, and the quality of the RR intervals.
Methods
Five ECG devices (eMotion Faros 360°, Hexoskin Hx1, NeXus-10 MKII, Polar RS800 Multi and SOMNOtouch NIBP) were attached and tested simultaneously in 13 participants. Test conditions included measurements during rest, treadmill walking/running, and a cognitive 2-back task. Signal quality was assessed by a new local morphological quality parameter, morphSQ, which is defined as a weighted peak noise-to-signal ratio on a percentage scale. QRS detection performance was evaluated with eplimited on synchronized data by comparison to ground-truth annotations. A modification of the Smith-Waterman algorithm was used to assess the RR interval quality and to classify incorrect beat annotations. Evaluation metrics include the positive predictive value, the false negative rate, and the F1 score for beat detection performance.
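A Smith-Waterman-style local alignment of two RR-interval series makes extra, missing, or shifted beats visible as gaps and mismatches. The following is a generic illustration of the underlying dynamic-programming recurrence; the tolerance and scoring values are assumptions for the example, not the modified parameters used in the study:

```python
def smith_waterman_rr(ref, test, tol=10, match=2, mismatch=-1, gap=-2):
    """Best local-alignment score between two RR-interval series (ms).

    Two intervals within `tol` ms of each other count as a match;
    gaps model inserted or deleted beat annotations.
    """
    n, m = len(ref), len(test)
    H = [[0] * (m + 1) for _ in range(n + 1)]  # score matrix, 0-initialised
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if abs(ref[i - 1] - test[j - 1]) <= tol else mismatch
            H[i][j] = max(0,                      # restart alignment locally
                          H[i - 1][j - 1] + s,    # (mis)match
                          H[i - 1][j] + gap,      # deletion in test series
                          H[i][j - 1] + gap)      # insertion in test series
            best = max(best, H[i][j])
    return best
```

Tracing back from the best cell (omitted here) yields the classification of each beat as matched, inserted, or deleted.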
Results
All devices achieved sufficient signal quality in non-movement conditions. Across all experimental phases, insufficient quality, expressed by morphSQ values below 10%, was found in only 1.22% of the recorded beats with the eMotion Faros 360°, whereas the rate was 8.67% with the Hexoskin Hx1. Nevertheless, QRS detection performed well across all devices, with positive predictive values between 0.985 and 1.000. False negative rates ranged between 0.003 and 0.017. The eMotion Faros 360° achieved the most stable results among the tested devices, with only 5 false positive and 19 misplaced beats across all recordings identified by the Smith-Waterman approach.
Conclusion
Data quality was assessed by two new approaches: analyzing the noise-to-signal ratio using morphSQ, and analyzing the RR interval quality using the Smith-Waterman approach. Both methods deliver comparable results. However, the Smith-Waterman approach allows the direct comparison of RR intervals without the need for signal synchronization, whereas morphSQ can be computed locally.
Background
Since the onset of the COVID-19 pandemic, children have been mentally and physically burdened, particularly due to school closures, with an associated loss of learning. Efficient testing strategies with high sensitivity are therefore necessary to keep schools open. Apart from individual rapid antigen testing, various methods have been investigated, such as PCR-based pool testing of nasopharyngeal swab, gargle, or saliva samples. Previous validation studies have found PCR-based saliva swab pool testing to be an effective screening method; however, its acceptability and feasibility for widespread implementation in the school setting have not been comprehensively evaluated among stakeholders.
Methods
In this pilot study, SARS-CoV-2 saliva swab pool testing of up to 15 swabs per pool was conducted in ten primary and special schools in Mecklenburg-Western Pomerania, Germany, over a period of one month. Thereafter, parents, teachers and school principals of the participating schools as well as the participating laboratories were surveyed about the feasibility and acceptability of this method, its large-scale implementation and challenges. Data were analyzed quantitatively and qualitatively.
Results
During the study period, 1,630 saliva swab pools were analyzed, of which 22 tested SARS-CoV-2 positive (1.3%). A total of N = 315 participants took part in the survey. Across all groups, the saliva swab pool testing method was perceived as more child-friendly (>87%), convenient (>82%), and easier (>81%) compared to rapid antigen testing by an anterior nasal swab. Over 80% of all participants favored widespread, regular use of the saliva swab method.
Conclusion
In school settings in particular, a high acceptability of the test method is crucial for a successful SARS-CoV-2 surveillance strategy. All respondents clearly preferred the saliva swab method, which can be used safely without complications in children six years of age and older. Hurdles and suggestions for improvement of an area-wide implementation were outlined.
The structure and content of the training phase following completion of medical school, referred to in most countries as postgraduate medical training, varies between countries. The purpose of this article is to give national and international readers an overview of the organisation and structure of postgraduate medical training in Germany.
The content and duration of postgraduate training in Germany are stipulated by state medical boards, officially termed associations (Landesärztekammer). In a periodically updated decree, the federal German medical association (Bundesärztekammer) provides a template for postgraduate medical training structure (Musterweiterbildungsordnung), which is adapted by the state medical associations. Admission to postgraduate medical training in Germany takes place by way of open, free-market selection. Based on the traditional assumption that junior doctors acquire all necessary clinical skills “on the job”, formal education in the form of seminars, lectures, or preorganised, detailed rotation plans through various specialties or wards is largely absent. Requirements for postgraduate medical training focus on the fulfilment of broad categories of rotations rather than specific content or gaining competencies. With few exceptions, no structured educational programs with curricular learning objectives exist. Limited funding impedes program development and expansion. Junior doctors bear the primary organisational responsibility in their training, which often results in extended training times and dissatisfaction. Structured training programs which prioritise skill-building and formal education are needed to support junior doctors and ensure their competence in primary and specialty care.
Background: Geriatric patients are often treated by several health care providers at the same time. The spatial, informational, and organizational separation of these health care providers can hinder the effective treatment of these patients.
Objective: This study aimed to develop a regional health information exchange (HIE) system to improve HIE in geriatric treatment. This study also evaluated the usability of the regional HIE system and sought to identify barriers to and facilitators of its implementation.
Methods: The development of the regional HIE system followed the community-based participatory research approach. The primary outcomes were the usability of the regional HIE system, expected implementation barriers and facilitators, and the quality of the developmental process. Data were collected and analyzed using a mixed methods approach.
Results: A total of 3 focus regions were identified, 22 geriatric health care providers participated in the development of the regional HIE system, and 11 workshops were conducted between October 2019 and September 2020. In total, 12 participants responded to a questionnaire. The main results were that the regional HIE system should support the exchange of assessments, diagnoses, medication, assistive device supply, and social information. The regional HIE system was expected to be able to improve the quality and continuity of care. In total, 5 adoption facilitators were identified. The main points were adaptability of the regional HIE system to local needs, availability to different patient groups and treatment documents, web-based design, trust among the users, and computer literacy. A total of 13 barriers to adoption were identified. The main expected barriers to implementation were lack of resources, interoperability issues, computer illiteracy, lack of trust, privacy concerns, and ease-of-use issues.
Conclusions: Participating health care professionals shared similar motivations for developing the regional HIE system, including improved quality of care, reduction of unnecessary examinations, and more effective health care provision. An overly complicated registration process for health care professionals and the patients’ free choice of their health care providers hinder the effectiveness of the regional HIE system, resulting in incomplete patient health information. However, the web-based design of the system bridges interoperability problems that exist owing to the different technical and organizational structures of the health care facilities involved. The regional HIE system is better accepted by health care professionals who are already engaged in an interdisciplinary, geriatric-focused network. This might indicate that pre-existing cross-organizational structures and processes are prerequisites for using HIE systems. The participatory design supports the development of technologies that are adaptable to regional needs. Health care providers are interested in participating in the development of an HIE system, but they often lack the required time, knowledge, and resources.
Background
In the German health care system, parents with an acutely ill child can visit an emergency room (ER) 24 hours a day, seven days a week. At the ER, the patient receives a medical consultation. Many parents use these facilities as they do not know how urgently their child requires medical attention. In recent years, paediatric departments in smaller hospitals have been closed, particularly in rural regions. As a result of this, the distances that patients must travel to paediatric care facilities in these regions are increasing, causing more children to visit an ER for adults. However, paediatric expertise is often required in order to assess how quickly the patient requires treatment and select an adequate treatment. This decision is made by a doctor in German ERs. We have examined whether remote paediatricians can perform a standardised urgency assessment (triage) using a video conferencing system.
Methods
Only acutely ill patients who were brought to a paediatric emergency room (paedER) by their parents or carers, without prior medical consultation, have been included in this study. First, an on-site paediatrician assessed the urgency of each case using a standardised triage. In order to do this, the Paediatric Canadian Triage and Acuity Scale (PaedCTAS) was translated into German and adapted for use in a standardised IT-based data collection tool. After the initial on-site triage, a telemedicine paediatrician, based in a different hospital, repeated the triage using a video conferencing system. Both paediatricians used the same triage procedure. The primary outcome was the degree of concordance and interobserver agreement, measured using Cohen’s kappa, between the two paediatricians. We have also included patient and assessor demographics.
Results
A total of 266 patients were included in the study; 227 cases were eligible for the concordance analysis. In n = 154 cases (68%), the on-site and telemedicine paediatricians’ urgency assessments were concordant. In n = 50 cases (22%), the telemedicine paediatrician rated the urgency of the patient’s condition higher (overtriage); in 23 cases (10%), the assessment indicated a lower urgency (undertriage). Nineteen medical doctors, mostly trained paediatric specialists, were included in the study; some acted as both on-site and telemedicine doctors. Cohen’s weighted kappa was 0.64 (95% CI: 0.49–0.79), indicating substantial agreement between the specialists.
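Cohen’s weighted kappa, as reported above, corrects the (weighted) observed agreement for the chance agreement implied by each rater’s marginal distribution, with disagreements penalized by their distance on the ordinal scale. A small generic implementation (linear weights by default; illustrative only, not the study’s analysis code):

```python
def weighted_kappa(r1, r2, categories, weights="linear"):
    """Cohen's weighted kappa for two raters over ordered categories."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    # observed joint distribution of the two raters' ratings
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]  # rater 1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater 2 marginals

    def w(i, j):
        d = abs(i - j) / (k - 1)
        return 1 - (d * d if weights == "quadratic" else d)

    po = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)
```

Perfect agreement yields 1, chance-level agreement 0, and systematic disagreement negative values; values around 0.6–0.8 are conventionally read as substantial agreement.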
Conclusions
Telemedical triage can assist in providing acute paediatric care in regions with a low density of paediatric care facilities. The next steps are further developing the triage tool and implementing telemedicine urgency assessment in a larger network of hospitals in order to improve the integration of telemedicine into hospitals’ organisational processes. The processes should include intensive training for the doctors involved in telemedical triage.
The associations of thyroid function parameters with non-alcoholic fatty liver disease (NAFLD) and hepatic iron overload are not entirely clear. We cross-sectionally investigated these associations among 2734 participants of two population-based cross-sectional studies of the Study of Health in Pomerania. Serum levels of thyroid-stimulating hormone (TSH), free tri-iodothyronine (fT3), and free thyroxine (fT4) were measured. Liver fat content (by proton-density fat fraction) and hepatic iron content (by transverse relaxation rate; R2*) were assessed by quantitative MRI. Thyroid function parameters were related to hepatic fat and iron contents using median and logistic regression models adjusted for confounders. There were no associations between serum TSH levels and liver fat content, NAFLD, or hepatic iron overload. Serum fT4 levels were inversely associated with liver fat content, NAFLD, hepatic iron content, and hepatic iron overload. Serum fT3 levels as well as the fT3 to fT4 ratio were positively associated with hepatic fat content, NAFLD, and hepatic iron content, but not with hepatic iron overload. Associations between fT3 levels and liver fat content were strongest in obese individuals, in whom we also observed an inverse association between TSH levels and NAFLD. These findings might result from a higher conversion of fT4 to the biologically active form fT3. Our results suggest that a subclinical hyperthyroid state may be associated with NAFLD, particularly in obese individuals. Furthermore, thyroid hormone levels seem to be more strongly associated with increased liver fat content than with hepatic iron content.
Background
Few studies have assessed trajectories of alcohol use in the general population, and even fewer have assessed the impact of brief intervention on these trajectories. Especially for low-risk drinkers, it is unclear which trajectories occur, whether they benefit from intervention, and if so, when and for how long. The aims were, first, to identify alcohol use trajectories among at-risk and among low-risk drinkers; second, to explore potential effects of brief alcohol intervention; and third, to identify predictors of trajectories.
Methods
Adults aged 18-64 years were screened for alcohol use at a municipal registration office. Those with alcohol use in the past 12 months (N = 1646; participation rate: 67%) were randomized to assessment plus computer-generated individualized feedback letters or assessment only. Outcome was drinks/week assessed at months 3, 6, 12, and 36. Alcohol risk group (at-risk/low-risk) was determined using the Alcohol Use Disorders Identification Test–Consumption. Latent class growth models were estimated to identify alcohol use trajectories among each alcohol risk group. Sex, age, school education, employment status, self-reported health, and smoking status were tested as predictors.
Results
For at-risk drinkers, a light-stable class (46%), a medium-stable class (46%), and a high-decreasing class (8%) emerged. The light-stable class tended to benefit from intervention after 3 years (Incidence Rate Ratio, IRR=1.96; 95% Confidence Interval, CI: 1.14–3.37). Male sex, higher age, more years of school, and current smoking decreased the probability of belonging to the light-stable class (p-values<0.05). For low-risk drinkers, a very light-slightly increasing class (72%) and a light-increasing class (28%) emerged. The very light-slightly increasing class tended to benefit from intervention after 6 months (IRR=1.60; 95% CI: 1.12–2.28). Male sex and more years of school increased the probability of belonging to the light-increasing class (p-value < 0.05).
Conclusion
Most at-risk drinkers did not change, whereas the majority of low-risk drinkers increased alcohol use. There may be effects of alcohol feedback, with greater long-term benefits among persons with low drinking amounts. Our findings may help to identify refinements in the development of individualized interventions to reduce alcohol use.
Background
The COVID-19 pandemic and the imposed lockdowns severely affected routine care in general and specialized physician practices.
Objective
To describe the long-term impact of the COVID-19 pandemic on the provision of physician services and disease recognition in German physician practices, and the perceived causes of the observed changes.
Design
Observational study based on medical record data and survey data of general practitioners and specialists' practices.
Participants
996 general practitioners (GPs) and 798 specialist practices, which together documented 6.1 million treatment cases for the medical record data analyses, and 645 physicians for the survey data analyses.
Main measures
Within the medical record data, consultations, specialist referrals, hospital admissions, and documented diagnoses were extracted for the pandemic (March 2020–September 2021) and compared to corresponding pre-pandemic months in 2019. The additional online survey was used to assess changes in practice management during the COVID-19 pandemic and physicians' perceived main causes of affected primary and specialized care provision.
Main results
Hospital admissions (GPs: −22% vs. specialists: −16%), specialist referrals (−6% vs. −3%) and recognized diseases (−9% vs. −8%) decreased significantly over the pandemic. GP consultations initially decreased (2020: −7%) but had recovered by the end of 2021 (+3%), while specialists’ consultations had not (−2%). Physicians saw changes in patient behavior, such as appointment cancellations, as the main cause of the decrease. At the same time, they also reported substantial modifications of practice management, such as reduced (nursing) home visits (41%) and opening hours (40%), suspended checkups (43%), and delayed consultations for high-risk patients (71%).
Conclusion
The pandemic left its mark on the provision and utilization of primary and specialized healthcare. Both patient behavior and organizational changes in practice management may have caused the decrease in services and their incomplete recovery. Evaluating the long-term effect on patient outcomes and identifying potential improvements are vital to better prepare for future pandemic waves.
Variability of Thyroid Measurements from Ultrasound and Laboratory in a Repeated Measurements Study
(2020)
Background: Variability of measurements in medical research can be due to different sources. Quantification of measurement errors facilitates probabilistic sensitivity analyses in future research to minimize potential bias in epidemiological studies. We aimed to investigate the variation of thyroid-related outcomes derived from ultrasound (US) and laboratory analyses in a repeated measurements study. Subjects and Methods: Twenty-five volunteers (13 females, 12 males) aged 22–70 years were examined once a month over 1 year. US measurements included thyroid volume, goiter, and thyroid nodules. Laboratory measurements included urinary iodine concentrations and serum levels of thyroid-stimulating hormone (TSH), free triiodothyronine (fT3), free thyroxine (fT4), and thyroglobulin. Variations in continuous thyroid markers were assessed as the coefficient of variation (CV), defined as the mean of the individual CVs with bootstrapped confidence intervals, and as intraclass correlation coefficients (ICCs). Variations in dichotomous thyroid markers were assessed by Cohen’s kappa. Results: CV was highest for urinary iodine concentrations (56.9%), followed by TSH (27.2%), thyroglobulin (18.2%), thyroid volume (10.5%), fT3 (8.1%), and fT4 (6.3%). The ICC was lowest for urinary iodine concentrations (0.42), followed by fT3 (0.55), TSH (0.64), fT4 (0.72), thyroid volume (0.87), and thyroglobulin (0.90). Cohen’s kappa values for the presence of goiter or thyroid nodules were 0.64 and 0.70, respectively. Conclusion: Our study provides measures of variation for thyroid outcomes, which can be used for probabilistic sensitivity analyses of epidemiological data. The low intraindividual variation of serum thyroglobulin in comparison to urinary iodine concentrations emphasizes the potential of thyroglobulin as a marker of the iodine status of populations.
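The two variation measures used above can be computed directly: the CV as the mean of the per-subject coefficients of variation, and a one-way random-effects ICC from the usual ANOVA decomposition. A generic sketch for a balanced design with toy data (not the study’s 25-subject, 12-visit measurement matrix):

```python
import statistics


def mean_individual_cv(series_by_subject):
    """CV as defined above: mean of the per-subject CVs, in percent."""
    cvs = [statistics.stdev(vals) / statistics.mean(vals) * 100
           for vals in series_by_subject]
    return statistics.mean(cvs)


def icc_oneway(series_by_subject):
    """One-way random-effects ICC(1,1) from the ANOVA mean squares.

    Assumes a balanced design: every subject has the same number of
    repeated measurements.
    """
    n = len(series_by_subject)               # subjects
    k = len(series_by_subject[0])            # measurements per subject
    means = [statistics.mean(s) for s in series_by_subject]
    grand = statistics.mean(means)           # valid because design is balanced
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((v - m) ** 2
              for s, m in zip(series_by_subject, means)
              for v in s) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Low within-subject scatter relative to between-subject scatter drives the CV down and the ICC toward 1, matching the thyroglobulin pattern reported above.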
Abstract
The increasing global prevalence of dementia demands concrete actions that are aimed strategically at optimizing processes that drive clinical innovation. The first step in this direction requires outlining hurdles in the transition from research to practice. The different parties needed to support translational processes have communication mismatches; methodological gaps hamper evidence‐based decision‐making; and data are insufficient to provide reliable estimates of long‐term health benefits and costs in decisional models. Pilot projects are tackling some of these gaps, but appropriate methods often still need to be devised or adapted to the dementia field. A consistent implementation perspective along the whole translational continuum, explicitly defined and shared among the relevant stakeholders, should overcome the “research‐versus‐adoption” dichotomy, and tackle the implementation cliff early on. Concrete next steps may consist of providing tools that support the effective participation of heterogeneous stakeholders and agreeing on a definition of clinical significance that facilitates the selection of proper outcome measures.
Background
Over the course of the COVID-19 pandemic, previous studies have shown that the physical as well as the mental health of children and adolescents significantly deteriorated. Future anxiety caused by the COVID-19 pandemic and its associations with quality of life has not previously been examined in school children.
Methods
As part of a cross-sectional web-based survey at schools in Mecklenburg-Western Pomerania, Germany, two years after the outbreak of the pandemic, school children were asked about COVID-19-related future anxiety using the German epidemic-related Dark Future Scale for children (eDFS-K). Health-related quality of life (HRQoL) was assessed using the self-reported KIDSCREEN-10. The eDFS-K was psychometrically analyzed (internal consistency and confirmatory factor analysis) and thereafter examined as a predictor of HRQoL in a general linear regression model.
Results
A total of N = 840 children and adolescents aged 8–18 years were included in the analysis. The eDFS-K demonstrated adequate internal consistency (Cronbach’s α = 0.77), and the confirmatory factor analysis further supported the one-factor structure of the four-item scale with an acceptable model fit. Over 43% of the students were found to have low HRQoL. In addition, 47% of the students reported sometimes to often having COVID-19-related fears about the future. Children with COVID-19-related future anxiety had significantly lower HRQoL (B = −0.94, p < 0.001). Other predictors of lower HRQoL were older age (B = −0.63, p < 0.001) and female (B = −3.12, p < 0.001) or diverse (B = −6.82, p < 0.001) gender.
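Cronbach’s α, as reported for the eDFS-K, can be computed from the item variances and the variance of the summed scale score. A generic sketch with invented item scores (not the eDFS-K data):

```python
import statistics


def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items -- one list of scores per item, with respondents in the
             same order across items.
    """
    k = len(items)
    item_var_sum = sum(statistics.variance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # scale sum per respondent
    return k / (k - 1) * (1 - item_var_sum / statistics.variance(totals))
```

When all items co-vary perfectly, the scale variance dominates the summed item variances and α approaches 1; values around 0.7–0.8, as here, are conventionally read as adequate.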
Conclusion
Two years after the outbreak of the pandemic, school-aged children continue to exhibit low HRQoL, which is further exacerbated in the presence of COVID-19-related future anxiety. Intervention programs with an increased focus on mental health also addressing future anxiety should be provided.
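The internal-consistency figure reported for the eDFS-K (Cronbach's α = 0.77) is a simple function of item and total-score variances. A minimal sketch of the computation, using made-up item responses rather than the study data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

# hypothetical responses to a 4-item scale (rows: respondents)
scores = np.array([
    [1, 2, 1, 2],
    [2, 2, 2, 3],
    [3, 3, 3, 3],
    [4, 3, 4, 4],
    [2, 1, 2, 1],
    [5, 4, 4, 5],
])
alpha = cronbach_alpha(scores)  # items co-vary strongly, so alpha is high
```

Perfectly parallel items yield α = 1, and uncorrelated items push α towards 0, which is why α is read as a lower bound on scale reliability.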
Objectives: An inverse relationship between education and cardiovascular risk has been described; however, the combined association of education, income, and neighborhood socioeconomic status with macrovascular disease is less clear. The aim of this study was to evaluate the association of educational level, equivalent household income and area deprivation with macrovascular disease in Germany.
Methods: Cross-sectional data from two representative German population-based studies, SHIP-TREND (n = 3,731) and KORA-F4 (n = 2,870), were analyzed. Multivariable logistic regression models were applied to estimate odds ratios and 95% confidence intervals for the association between socioeconomic determinants and macrovascular disease (defined as self-reported myocardial infarction or stroke).
Results: The study showed a higher odds of prevalent macrovascular disease in men with low and middle educational level compared to men with high education. Area deprivation and equivalent income were not related to myocardial infarction or stroke in any of the models.
Conclusion: Educational level, but not income or area deprivation, is significantly related to macrovascular disease in men. Effective prevention of macrovascular disease should therefore start with investing in individual education.
Background
In non-randomized studies (NRSs) in which a continuous outcome variable (e.g., depressive symptoms) is assessed at baseline and follow-up, it is common to observe imbalance in the baseline values between the treatment/exposure group and the control group. This may bias the study and, consequently, the meta-analysis (MA) estimate. These estimates may differ across the statistical methods used to deal with this issue. Analysis of individual participant data (IPD) allows standardization of methods across studies. We aimed to identify methods used in published IPD-MAs of NRSs for continuous outcomes, and to compare different methods for accounting for baseline values of outcome variables in IPD-MA of NRSs using two empirical examples from the Thyroid Studies Collaboration (TSC).
Methods
For the first aim, we systematically searched MEDLINE, EMBASE, and Cochrane from inception to February 2021 to identify published IPD-MAs of NRSs that adjusted for baseline outcome measures in the analysis of continuous outcomes. For the second aim, we applied analysis of covariance (ANCOVA), change score, propensity score and the naïve approach (which ignores the baseline outcome data) in IPD-MA from NRSs on the association between subclinical hyperthyroidism and depressive symptoms and renal function. We estimated the study and meta-analytic mean difference (MD) and relative standard error (SE). We used both fixed- and random-effects MA.
Results
Ten of 18 (56%) included studies used the change score method, seven (39%) used ANCOVA and one (5%) the propensity score. The study estimates were similar across methods in studies whose groups were balanced at baseline with regard to the outcome variables, but differed in studies with baseline imbalance. In our empirical examples, ANCOVA and change score yielded study results in the same direction, whereas the propensity score did not. In our applications, ANCOVA provided more precise estimates than the other methods, both at the study and the meta-analytic level. Heterogeneity was highest when the change score was used as outcome, moderate for ANCOVA and null with the propensity score.
Conclusion
ANCOVA provided the most precise estimates at both study and meta-analytic level and thus seems preferable in the meta-analysis of IPD from non-randomized studies. For the studies that were well-balanced between groups, change score, and ANCOVA performed similarly.
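The core contrast between ANCOVA and the change-score method under baseline imbalance can be reproduced with simulated data. This is a minimal sketch with made-up effect sizes and a single simulated study, not the TSC analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n).astype(float)       # exposure indicator
baseline = rng.normal(size=n) + 0.5 * group       # groups imbalanced at baseline
followup = 0.6 * baseline + 0.3 * group + rng.normal(size=n)  # true effect: 0.3

def group_coef(X, y):
    """OLS coefficient of the group column (column 1 of the design matrix X)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

ones = np.ones(n)
# ANCOVA: follow-up regressed on group, adjusting for the baseline value
ancova = group_coef(np.column_stack([ones, group, baseline]), followup)
# change score: (follow-up minus baseline) regressed on group alone
change = group_coef(np.column_stack([ones, group]), followup - baseline)
# under baseline imbalance the change score is biased; ANCOVA recovers ~0.3
```

Because the follow-up value regresses toward the mean (slope 0.6 < 1 here), subtracting the full baseline over-corrects, and the group difference in baseline leaks into the change-score estimate; ANCOVA estimates the regression slope instead of assuming it is 1.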
Person-centered care (PCC) requires knowledge about patient preferences. An analytic hierarchy process (AHP) is one approach to quantify, weigh and rank patient preferences that is suitable for People living with Dementia (PlwD), because it relies on simple pairwise comparisons of individual criteria from a complex decision problem. The objective of the present study was to design and pretest a dementia-friendly AHP survey. Methods: Two expert panels, consisting of n = 4 Dementia Care Managers and n = 4 physicians, ensured content validity, and “thinking-aloud” interviews with n = 11 PlwD and n = 3 family caregivers ensured the face validity of the AHP survey. Following a semi-structured interview guide, PlwD were asked to assess appropriateness and comprehensibility. Data, field notes and partial interview transcripts were analyzed with a constant comparative approach, and feedback was incorporated continuously until PlwD had no further comments or difficulties with survey completion. Consistency ratios (CRs) were calculated with Microsoft® Excel and ExpertChoice Comparion®. Results: Three main categories with sub-categories emerged: (1) Content: clear task introduction, (sub)criteria description, criteria homogeneity, (sub)criteria appropriateness, retest questions and sociodemography for heterogeneity; (2) Format: survey structure, pairwise comparison sequence, survey length, graphical design (incl. AHP scale), survey procedure explanation, survey assistance and response perspective; and (3) Layout: easy wording, short sentences and visual aids. Individual CRs ranged from 0.08 to 0.859, and the consolidated CR was 0.37 (0.038). Conclusions: Our formative qualitative study provides initial data for the design of a dementia-friendly AHP survey. Consideration of our findings may contribute to face and content validity in future quantitative preference research in dementia.
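Consistency ratios of the kind reported above are derived from the principal eigenvalue of a pairwise-comparison matrix. A minimal sketch following Saaty's method; the random indices and the example judgement matrices are standard/hypothetical, not taken from the study:

```python
import numpy as np

# Saaty's random indices for matrix orders 3..7
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def consistency_ratio(A):
    """CR of a positive reciprocal pairwise-comparison matrix A."""
    n = A.shape[0]
    lam_max = np.linalg.eigvals(A).real.max()   # principal eigenvalue
    ci = (lam_max - n) / (n - 1)                # consistency index
    return ci / RI[n]

# perfectly consistent judgements derived from weights w: A[i, j] = w[i] / w[j]
w = np.array([1.0, 2.0, 4.0])
A_consistent = np.outer(w, 1 / w)
# an intransitive judgement set (A[0, 2] would need to be 8 for consistency)
A_inconsistent = np.array([[1.0, 2.0, 4.0],
                           [0.5, 1.0, 4.0],
                           [0.25, 0.25, 1.0]])
```

For a perfectly consistent matrix the principal eigenvalue equals the matrix order, so CR = 0; CR values above roughly 0.1 are conventionally read as inconsistent judgements, which puts several of the individual CRs reported above over the threshold.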
Background: Person-centered care (PCC) requires knowledge about patient preferences. This formative qualitative study aimed to identify (sub)criteria of PCC for the design of a quantitative, choice-based instrument to elicit patient preferences for person-centered dementia care. Method: Interviews were conducted with n = 2 dementia care managers, n = 10 People living with Dementia (PlwD), and n = 3 caregivers (CGs), which followed a semi-structured interview guide including a card game with PCC criteria identified from the literature. Criteria cards were shown to explore the PlwD’s conception. PlwD were asked to rank the cards to identify patient-relevant criteria of PCC. Audios were verbatim-transcribed and analyzed with qualitative content analysis. Card game results were coded on a 10-point-scale, and sums and means for criteria were calculated. Results: Six criteria with two sub-criteria emerged from the analysis; social relationships (indirect contact, direct contact), cognitive training (passive, active), organization of care (decentralized structures and no shared decision making, centralized structures and shared decision making), assistance with daily activities (professional, family member), characteristics of care professionals (empathy, education and work experience) and physical activities (alone, group). Dementia-sensitive wording and balance between comprehensibility vs. completeness of the (sub)criteria emerged as additional themes. Conclusions: Our formative study provides initial data about patient-relevant criteria of PCC to design a quantitative patient preference instrument. Future research may want to consider the balance between (sub)criteria comprehensibility vs. completeness.
Metabolites are intermediates or end products of biochemical processes involved in both health and disease. Here, we take advantage of the well-characterized Cooperative Health Research in South Tyrol (CHRIS) study to perform an exome-wide association study (ExWAS) on absolute concentrations of 175 metabolites in 3294 individuals. To increase power, we imputed the identified variants into an additional 2211 genotyped individuals of CHRIS. In the resulting dataset of 5505 individuals, we identified 85 single-variant genetic associations, of which 39 have not been reported previously. Fifteen associations emerged at ten variants with >5-fold enrichment in CHRIS compared to non-Finnish Europeans reported in the gnomAD database. For example, the CHRIS-enriched ETFDH stop gain variant p.Trp286Ter (rs1235904433-hexanoylcarnitine) and the MCCC2 stop lost variant p.Ter564GlnextTer3 (rs751970792-carnitine) have been found in patients with glutaric acidemia type II and 3-methylcrotonylglycinuria, respectively, but the loci have not been associated with the respective metabolites in a genome-wide association study (GWAS) previously. We further identified three gene-trait associations, where multiple rare variants contribute to the signal. These results not only provide further evidence for previously described associations, but also describe novel genes and mechanisms for diseases and disease-related traits.
Background: Multimorbidity is a common issue in aging societies and is usually associated with dementia in older people. Physical activity (PA) may be a beneficial nonpharmacological strategy for patients with complex health needs. However, insufficient PA is predominant in this population. Thus, there is an evident need to expand the knowledge on potential determinants influencing PA engagement among elderly persons at risk of dementia and multimorbidity. Methods: We used baseline data from the multicenter, cluster-randomized controlled AgeWell.de study. The main aim was to describe PA engagement and identify potential PA determinants in a sample of community-dwelling Germans aged 60–77 years old with an increased risk of dementia and multimorbidity. Results: Of the 1030 included participants, approximately half (51.8%) engaged in PA ≥2 times/week for at least 30 min at baseline. We identified self-efficacy (beta = 0.202, p < 0.001) and BMI (beta = −0.055, p < 0.001) as potential PA determinants. Conclusions: The identified determinants, self-efficacy and BMI, are consistent with those reported in the literature. Specific knowledge on PA determinants and stages of change in persons at risk of dementia and multimorbidity might guide the development of effective future prevention measures and health services tailored to this population. Trial registration: German Clinical Trials Register (reference number: DRKS00013555).
Dementia is a leading cause of disability and dependency in older people worldwide. As the number of people affected increases, so does the need for innovative care models. Dementia care management (DCM) is an empirically validated approach for improving the care and quality of life for people with dementia (PwD) and caregivers. The aim of this study is to investigate the influencing factors and critical pathways for the implementation of a regionally adapted DCM standard in the existing primary care structures in the German region of Siegen-Wittgenstein (SW). Utilizing participatory research methods, five local health care experts as co-researchers conducted N = 13 semi-structured interviews with 22 local professionals and one caregiver as peer reviewers. Data collection and analysis were based on the Consolidated Framework for Implementation Research (CFIR). Our results show that among the most mentioned influencing factors, three CFIR constructs can be identified as both barriers and facilitators: Patients’ needs and resources, Relative advantage, and Cosmopolitanism. The insufficient involvement of relevant stakeholders is the major barrier and the comprehensive consideration of patient needs through dementia care managers is the strongest facilitating factor. The study underlines the vital role of barrier analysis in site-specific DCM implementation.
Data quality assessments (DQA) are necessary to ensure valid research results. Despite the growing availability of tools of relevance for DQA in the R language, a systematic comparison of their functionalities is missing. Therefore, we review R packages related to data quality (DQ) and assess their scope against a DQ framework for observational health studies. Based on a systematic search, we screened more than 140 R packages related to DQA in the Comprehensive R Archive Network. From these, we selected packages which target at least three of the four DQ dimensions (integrity, completeness, consistency, accuracy) in a reference framework. We evaluated the resulting 27 packages for general features (e.g., usability, metadata handling, output types, descriptive statistics) and the possible assessment’s breadth. To facilitate comparisons, we applied all packages to a publicly available dataset from a cohort study. We found that the packages’ scope varies considerably regarding functionalities and usability. Only three packages follow a DQ concept, and some offer an extensive rule-based issue analysis. However, the reference framework does not include a few implemented functionalities, and it should be broadened accordingly. Improved use of metadata to empower DQA and user-friendliness enhancement, such as GUIs and reports that grade the severity of DQ issues, stand out as the main directions for future developments.
Knowledge on differences in the severity and symptoms of infections with the SARS-CoV-2 Omicron variants BA.2 (Pango lineage B.1.1.529.2) and BA.5 (Pango lineage B.1.1.529.5) is still scarce. We retrospectively investigated epidemiological data available from the public health authorities in Mecklenburg-Western Pomerania, Northeast Germany, between April and July 2022. Comparative analyses revealed significant differences between the recorded symptoms of BA.2- and BA.5-infected individuals and strong correlations between symptoms. In particular, the symptoms ‘chills or sweating’, ‘freeze’ and ‘runny nose’ were reported more frequently in BA.2 infections. In contrast, ‘other clinical symptoms’ appeared more frequently in Omicron infections with BA.5. However, the results obtained in this study provide no evidence that BA.5 has a higher pathogenicity or causes a more severe course of infection than BA.2. To our knowledge, this is the first report on clinical differences between the current Omicron variants BA.2 and BA.5 using public health data. Our study highlights the value of timely investigations of data collected by public health authorities to gather detailed information on the clinical presentation of different SARS-CoV-2 subvariants at an early stage.
This study aims to describe social network and social participation and to assess associations with depressive symptoms in older persons with increased risk for dementia in Germany. We conducted a cross-sectional observational study in primary care patients (aged 60–77) as part of a multicenter cluster-randomized controlled trial (AgeWell.de). We present descriptive and multivariate analyses for social networks (Lubben Social Network Scale and subscales) and social participation (item list of social activities) and analyze associations of these variables with depressive symptoms (Geriatric Depression Scale). Of 1030 included patients, 17.2% were at risk for social isolation (Lubben Social Network Scale < 12). Looking at the subscales, a reduced non-family network was found almost twice as often as a reduced family network. Patients with depressive symptoms had significantly smaller social networks than patients without depression (p < 0.001). They rather engaged in social activities of low involvement level or no weekly social activity at all (p < 0.001). The study shows associations of depressive symptoms with a decreased social network and less social participation in elderly participants. Sufficient non-family contacts and weekly social activities seem to play an important role in mental health and should be encouraged in elderly primary care patients.
Discovery of novel eGFR-associated multiple independent signals using a quasi-adaptive method
(2022)
A decreased estimated glomerular filtration rate (eGFR) leading to chronic kidney disease is a significant public health problem. Kidney function is a heritable trait, and recent application of genome-wide association studies (GWAS) has successfully identified multiple eGFR-associated genetic loci. To increase the statistical power for detecting independent associations in GWAS loci, we improved our recently developed quasi-adaptive method estimating SNP-specific alpha levels for conditional analysis, and applied it to the GWAS meta-analysis results of eGFR among 783,978 European-ancestry individuals. Among known eGFR loci, we revealed 19 new independent association signals that were subsequently replicated in the UK Biobank (n = 408,608). These associations had remained undetected by conditional analysis using the established conservative genome-wide significance level of 5 × 10⁻⁸. Functional characterization of known index SNPs and novel independent signals using colocalization of conditional eGFR association results and gene expression in cis across 51 human tissues identified two potentially causal genes in kidney tissues, TSPAN33 and TFDP2, and three candidate genes in other tissues: SLC22A2, LRP2, and CDKN1C. These colocalizations were not identified in the original GWAS. By applying our improved quasi-adaptive method, we successfully identified additional genetic variants associated with eGFR. Considering these signals in colocalization analyses can increase the precision of revealing potentially functional genes of GWAS loci.
Introduction
Stroke is the leading neurological cause of adult long-term disability in Europe. Even though functional consequences directly related to neurological impairment are well studied, post-stroke trajectories of functional health according to the International Classification of Functioning, Disability and Health are poorly understood. Particularly, no study investigated the relationship between post-stroke trajectories of activities of daily living (ADL) and self-rated health (SRH). However, such knowledge is of major importance to identify patients at risk of unfavourable courses. This prospective observational study aims to investigate trajectories of ADL and SRH, and their modifying factors in the course of the first year after stroke.
Methods and analysis
The study will consecutively enrol 300 patients admitted to a tertiary care hospital with acute ischaemic stroke or transient ischaemic attack (TIA; Age, Blood Pressure, Clinical Features, Duration of symptoms, Diabetes score ≥3). Patient inclusion is planned from May 2021 to September 2022. All participants will complete an interview assessing ADL, SRH, mental health, views on ageing and resilience-related concepts. Participants will be interviewed face-to-face 1–5 days post-stroke/TIA in the hospital; and will be followed up after 6 weeks, 3 months, 6 months and 12 months via telephone. The 12-month follow-up will also include a neurological assessment. Primary endpoints are ADL operationalised by modified Rankin Scale scores and SRH. Secondary outcomes are further measures of ADL, functional health, physical activity, falls and fatigue. Views on ageing, social support, resilience-related concepts, affect, frailty, illness perceptions and loneliness will be examined as modifying factors. Analyses will investigate the bidirectional relationship between SRH and ADL using bivariate latent change score models.
Ethics and dissemination
The study has been approved by the institutional review board of the University Medicine Greifswald (Ref. BB 237/20). The results will be disseminated through scientific publications, conferences and media. Moreover, study results and potential implications will be discussed with patient representatives.
Trial registration number NCT04704635.
Objective
Whole-body MRI (wb-MRI) is increasingly used in research and screening but little is known about the effects of incidental findings (IFs) on health service utilisation and costs. Such effects are particularly critical in an observational study. Our principal research question was therefore how participation in a wb-MRI examination with its resemblance to a population-based health screening is associated with outpatient service costs.
Design
Prospective cohort study.
Setting
General population Mecklenburg-Vorpommern, Germany.
Participants
Analyses included 5019 participants of the Study of Health in Pomerania with statutory health insurance data. 2969 took part in a wb-MRI examination in addition to a clinical examination programme that was administered to all participants. MRI non-participants served as a quasi-experimental control group with propensity score weighting to account for baseline differences.
Primary and secondary outcome measures
Outpatient costs (total healthcare usage, primary care, specialist care, laboratory tests, imaging) during 24 months after the examination were retrieved from claims data. Two-part models were used to compute treatment effects.
Results
In total, 1366 potentially relevant IFs were disclosed to 948 MRI participants (32% of all participants); most concerned masses and lesions (769 participants, 81%). Costs for outpatient care during the 2-year observation period amounted to an average of €2547 (95% CI 2424 to 2671) for MRI non-participants and to €2839 (95% CI 2741 to 2936) for MRI participants, indicating an increase of €295 (95% CI 134 to 456) per participant which corresponds to 11.6% (95% CI 5.2% to 17.9%). The cost increase was sustained rather than being a short-term spike. Imaging and specialist care related costs were the main contributors to the increase in costs.
Conclusions
Communicated findings from population-based wb-MRI substantially impacted health service utilisation and costs. This introduced bias into the natural course of healthcare utilisation and should be accounted for in any longitudinal analyses.
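Propensity-score weighting of the kind used to balance MRI participants and non-participants can be sketched with simulated data. The confounder, participation model and cost effect below are all invented for illustration; the actual study used claims data and two-part models:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
x = rng.normal(size=n)                                # confounder (e.g. morbidity)
t = rng.binomial(1, 1 / (1 + np.exp(-0.8 * x)))       # MRI participation
cost = 2500 + 400 * x + 300 * t + rng.normal(0, 200, n)  # made-up effect: +300 EUR

# fit the propensity score: logistic regression of t on x via Newton-Raphson
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    W = mu * (1 - mu)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (t - mu))

ps = 1 / (1 + np.exp(-X @ beta))
w = np.where(t == 1, 1 / ps, 1 / (1 - ps))            # inverse-probability weights
ipw = (np.average(cost[t == 1], weights=w[t == 1])
       - np.average(cost[t == 0], weights=w[t == 0]))
naive = cost[t == 1].mean() - cost[t == 0].mean()     # confounded comparison
```

The unweighted group comparison absorbs the confounder effect (sicker people are both more likely to participate and more costly here), while the weighted contrast recovers the simulated participation effect.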
The Study of Health in Pomerania (SHIP), a population-based study from a rural state in northeastern Germany with a relatively poor life expectancy, supplemented its comprehensive examination program in 2008 with whole-body MR imaging at 1.5 T (SHIP-MR). We reviewed more than 100 publications that used the SHIP-MR data and analyzed which sequences already produced fruitful scientific outputs and which manuscripts have been referenced frequently. Among the imaging sequences, T1-weighted structural imaging of the brain and a gradient-echo sequence for R2* mapping yielded the highest scientific output; regarding specific body parts examined, most scientific publications focused on MR sequences involving the brain and the (upper) abdomen. We conclude that population-based MR imaging in cohort studies should define more precise goals when allocating imaging time. In addition, quality control measures might include recording the number and impact of published work, preferably on a bi-annual basis and starting 2 years after initiation of the study. Structured teaching courses may enhance the desired output in areas that appear underrepresented.
Introduction: With the increased emergence of SARS-CoV-2 variants, the impact on schools and preschools remains a matter of debate. To ensure that schools and preschools are kept open safely, the identification of factors influencing the extent of outbreaks is of importance.
Aim: To monitor dynamics of COVID-19 infections in schools and preschools and identify factors influencing the extent of outbreaks.
Methods: In this prospective observational study we analyzed routine surveillance data of Mecklenburg-Western Pomerania, Germany, from calendar week (CW) 32, 2020 to CW19, 2021 regarding SARS-CoV-2 infection events in schools and preschools considering changes in infection control measures over time. A multivariate linear regression model was fitted to evaluate factors influencing the number of students, teachers and staff tested positive following index cases in schools and preschools. Due to an existing multicollinearity in the common multivariate regression model between the variables “face mask obligation for children” and “face mask obligation for adults”, two further separate regression models were set up (Multivariate Model Adults and Multivariate Model Children).
Results: We observed a significant increase in secondary cases in preschools in the first quarter of 2021 (CW8 to CW15, 2021), and simultaneously a decrease in secondary cases in schools. In the multivariate regression analysis, the strongest predictor of the extent of the outbreaks was the teacher/caregiver mask obligation (B = −1.9; 95% CI: −2.9 to −1.0; p < 0.001). Furthermore, adult index cases (adult-only or child+adult combinations) increased the likelihood of secondary cases (B = 1.3; 95% CI: 0.9 to 1.8; p < 0.001). The face mask obligation for children also showed a significant reduction in the number of secondary cases (B = −0.6; 95% CI: −0.9 to −0.2; p = 0.004).
Conclusion: The present study indicates that outbreak events at schools and preschools are effectively contained by an obligation for adults and children to wear face masks.
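Multicollinearity of the kind described between the two mask-obligation variables is commonly diagnosed with variance inflation factors (VIF), which is one standard way to justify fitting separate models. A minimal sketch with hypothetical predictors, not the surveillance data:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of a predictor matrix X."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        # regress column j on all remaining columns (plus an intercept)
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        r2 = 1 - ((y - Z @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        out[j] = 1 / (1 - r2)
    return out

# toy predictors: the first two are nearly collinear, the third is independent
rng = np.random.default_rng(3)
x1 = rng.normal(size=2000)
x2 = x1 + 0.1 * rng.normal(size=2000)      # strong overlap with x1
x3 = rng.normal(size=2000)
vifs = vif(np.column_stack([x1, x2, x3]))
```

A VIF near 1 indicates an independent predictor; values above roughly 5 to 10 (as the near-collinear pair produces here) are the usual trigger for dropping or separating variables, mirroring the two-model strategy in the Methods.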
(1) Background: Predicting chronic low back pain (LBP) is of clinical and economic interest, as LBP leads to disability and health service utilization. This study aims to build a competitive and interpretable prediction model; (2) Methods: We used clinical and claims data of 3837 participants of a population-based cohort study to predict future LBP consultations (ICD-10: M40.XX–M54.XX). Best subset selection (BSS) was applied in repeated random samples of training data (75% of the data); scoring rules were used to identify the best subset of predictors. The prediction accuracy of BSS was compared to random forest and support vector machines (SVM) in the validation data (25% of the data); (3) Results: The best subset comprised 16 out of 32 predictors. Previous occurrence of LBP increased the odds for future LBP consultations (odds ratio (OR) 6.91 [5.05; 9.45]), while concomitant diseases reduced the odds (1 vs. 0, OR: 0.74 [0.57; 0.98]; >1 vs. 0: 0.37 [0.21; 0.67]). The area under the curve (AUC) of BSS was acceptable (0.78 [0.74; 0.82]) and comparable with SVM (0.78 [0.74; 0.82]) and random forest (0.79 [0.75; 0.83]); (4) Conclusions: Regarding prediction accuracy, BSS is competitive with established machine-learning approaches. Nonetheless, considerable misclassification is inherent and further refinements are required to improve predictions.
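Best subset selection amounts to an exhaustive search over predictor combinations, each scored by a criterion. A toy sketch with a Newton-fitted logistic model and AIC as the score (the study used scoring rules on repeated random training samples; data and scale here are invented):

```python
import numpy as np
from itertools import combinations

def fit_logit(X, y, iters=30):
    """Logistic regression via Newton-Raphson; X must include an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))
        W = mu * (1 - mu) + 1e-9
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - mu))
    return beta

def aic(X, y, beta):
    mu = np.clip(1 / (1 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
    loglik = np.sum(y * np.log(mu) + (1 - y) * np.log(1 - mu))
    return 2 * X.shape[1] - 2 * loglik

def best_subset(X, y, max_size):
    """Exhaustive search over predictor subsets, scored by AIC."""
    n, p = X.shape
    best_score, best_idx = np.inf, ()
    for k in range(1, max_size + 1):
        for idx in combinations(range(p), k):
            Xs = np.column_stack([np.ones(n), X[:, idx]])
            score = aic(Xs, y, fit_logit(Xs, y))
            if score < best_score:
                best_score, best_idx = score, idx
    return best_idx

# toy data: only features 0 and 2 drive the outcome
rng = np.random.default_rng(2)
X = rng.normal(size=(1500, 5))
y = rng.binomial(1, 1 / (1 + np.exp(-(2 * X[:, 0] - 2 * X[:, 2]))))
selected = best_subset(X, y, max_size=2)
```

The exhaustive search is what makes BSS interpretable but expensive: with 32 candidate predictors, as in the study, 2³² subsets exist, which is why practical implementations use branch-and-bound or restrict subset sizes.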
Background
Vulnerable groups, e.g. persons with mental illness, neurological deficits or dementia, are often excluded as participants from research projects because obtaining informed consent can be difficult and tedious. This may have the consequence that vulnerable groups benefit less from medical progress. Vulnerable persons are often supported by a legal guardian in one or more demands of their daily life. We examined the attitudes of legal guardians and legally supervised persons towards medical research and the conditions and motivations to participate in studies.
Methods
We conducted a cross-sectional study with standardized surveys of legal guardians and legally supervised persons. Two separate questionnaires were developed for the legal guardians and the supervised persons to assess previous experiences with research projects and the reasons for participation or non-participation. The legal guardians were recruited through various guardianship organizations. The supervised persons were recruited through their legal guardians and from a previous study among psychiatric patients. The data were analysed descriptively.
Results
Altogether, 82 legal guardians and 20 legally supervised persons were recruited. Of these, 13 legal guardians (15.6%) and 13 legally supervised persons (65.0%) had previous experience with research projects. The majority of the guardians with experience in research projects had consented to the participation of their supervised persons (n = 12 guardians, 60.0%; in total n = 16 approvals). The possible burden on the participating person was given as the most frequent reason not to participate, both by the guardians (n = 44, 54.4%) and by the supervised persons (n = 3, 30.0%). The most frequent motivation to provide consent to participate in a research study was the desire to help other patients by gaining new scientific knowledge (guardians: n = 125, 78.1%; supervised persons: n = 10, 66.6%).
Conclusions
Overall, an open attitude towards medical research can be observed among both legal guardians and supervised persons. Perceived risks and seeing no purpose in a study are reasons for not participating in medical research projects.
Background: It has not been investigated whether there are associations between urinary iodine (UI) excretion measurements taken some years apart, nor whether such an association remains after adjustment for nutritional habits. The aim of the present study was to investigate the relation between iodine-creatinine ratios (ICR) measured at two time points 5 years apart. Methods: Data from 2,659 individuals from the Study of Health in Pomerania were analyzed. Analysis of covariance and Poisson regressions were used to associate baseline with follow-up ICR. Results: Baseline ICR was associated with follow-up ICR. In particular, a baseline ICR >300 µg/g was related to an ICR >300 µg/g at follow-up (relative risk, RR: 2.20; p < 0.001). The association was stronger in males (RR: 2.64; p < 0.001) than in females (RR: 1.64; p = 0.007). In contrast, a baseline ICR <100 µg/g was associated with an ICR <100 µg/g at follow-up only in males, and only when considering unadjusted ICR. Conclusions: We detected only a weak correlation with respect to low ICR. Studies assessing iodine status in a population should take into account that an individual with a low UI excretion in one measurement is not necessarily permanently iodine deficient. On the other hand, a current high ICR could have been predicted by a high ICR 5 years earlier.
Background: Caring for people with dementia (PwD) challenges both the health care system and family caregivers, and can only be managed through interprofessional medical and nursing care. Aim: The AHeaD study examined the attitudes of general practitioners (GPs) and nurses towards delegating tasks previously performed by GPs to nurses in the outpatient care of PwD. Methods: In four focus group discussions with 10 GPs and 13 nurses, attitudes towards delegating specific tasks were examined by content analysis, and opportunities for and barriers to implementation were identified. Results: GPs supported the delegation of specific tasks such as blood sampling, assessments and their monitoring, or follow-up prescriptions for nursing aids. "Classic" medical tasks (e.g., diagnosing diseases, initial prescription of medication) were seen as remaining in GPs' hands. Nurses called for more appreciation and recognition in the nurse-GP relationship and criticized a lack of trust and insufficient communication. Both sides pointed to tight time budgets that are hardly oriented towards the real needs of PwD. Conclusion: Implementing a redistribution of tasks requires a legal and financial framework, time resources, concrete task descriptions and closer collaboration between the professional groups involved. Innovative concepts could contribute to a sensible use of both professions' resources and strengthen the care of PwD.
Quality of life (QoL) is a core patient-reported outcome in healthcare research, alongside primary clinical outcomes. A conceptual, operational, and psychometric elaboration of QoL in the context of telemedical applications (TM) is needed, because standardized instruments to assess QoL do not sufficiently represent essential aspects of the intended outcomes of TM. The overall aim is to develop an instrument that can adequately capture QoL in TM. For that purpose, an extended working model of QoL will be derived. Subsequently, an instrument will be developed and validated that captures those aspects of QoL that are influenced by TM. The initial exploratory study section includes (a) a systematic literature review, (b) a qualitative survey for concept elicitation, and (c) pre-testing using cognitive debriefings with patients and an expert workshop. The second, quantitative section consists of an online expert survey and two patient surveys for piloting and validation of the newly developed instrument. The resulting questionnaire will assess central experiences of patients regarding telemedical applications and their impact on QoL more sensitively. Its use as an adjunct instrument will lead to a more appropriate evaluation of TM and contribute to the improvement of care tailored to patients' individual needs.
CFTR encodes a chloride and bicarbonate channel expressed at the apical membrane of polarized epithelial cells. Transepithelial sodium transport mediated by the amiloride-sensitive sodium channel ENaC is thought to contribute to the manifestation of CF disease. Thus, ENaC is a therapeutic target in CF and a valid cystic fibrosis modifier gene. We characterized SCNN1B as a genetic modifier in three independent cohorts of F508del-CFTR homozygous patients. By fine-mapping, we localized a regulatory element at SCNN1B to the genomic segment rs168748-rs2303153-rs4968000 (Pbest = 0.0177), consistently observing the risk allele rs2303153-C and the contrasting benign allele rs2303153-G in all three patient cohorts. Furthermore, our results show that expression levels of SCNN1B are associated with the rs2303153 genotype in intestinal epithelia (P = 0.003). Our data confirm that the well-established biological role of SCNN1B can be recognized by an association study on informative endophenotypes in the rare disease cystic fibrosis and call attention to the reproducible results that can be obtained in association studies of small, albeit carefully characterized, patient populations.
Abstract
Background
Opioid use for chronic non‐cancer pain (CNCP) is under debate. In the absence of pan‐European guidance on this issue, a position paper was commissioned by the European Pain Federation (EFIC).
Methods
The clinical practice recommendations were developed by eight scientific societies and one patient self‐help organization under the coordination of EFIC. A systematic literature search in MEDLINE (up until January 2020) was performed. Two categories of guidance are given: Evidence‐based recommendations (supported by evidence from systematic reviews of randomized controlled trials or of observational studies) and Good Clinical Practice (GCP) statements (supported either by indirect evidence or by case‐series, case–control studies and clinical experience). The GRADE system was applied to move from evidence to recommendations. The recommendations and GCP statements were developed by a multiprofessional task force (including nursing, service users, physicians, physiotherapy and psychology) and formal multistep procedures to reach a set of consensus recommendations. The clinical practice recommendations were reviewed by five external reviewers from North America and Europe and were also posted for public comment.
Results
The European Clinical Practice Recommendations give guidance for combination with other medications, the management of frequent (e.g. nausea, constipation) and rare (e.g. hyperalgesia) side effects, for special clinical populations (e.g. children and adolescents, pregnancy) and for special situations (e.g. liver cirrhosis).
Conclusion
If a trial with opioids for chronic noncancer pain is conducted, detailed knowledge and experience are needed to adapt the opioid treatment to a special patient group and/or clinical situation and to manage side effects effectively.
Significance
If a trial with opioids for chronic noncancer pain is conducted, detailed knowledge and experience are needed to adapt the opioid treatment to a special patient group and/or clinical situation and to manage side effects effectively. A collaboration of medical specialties and of all health care professionals is needed for some special populations and clinical situations.
Copattern of depression and alcohol use in medical care patients: cross-sectional study in Germany
(2020)
Objective
To predict depressive symptom severity and presence of major depression along the full alcohol use continuum.
Design
Cross-sectional study.
Setting
Ambulatory practices and general hospitals from three sites in Germany.
Participants
Consecutive patients aged 18–64 years were proactively approached for an anonymous health screening (participation rate = 87%, N = 12 828). Four continuous alcohol use measures were derived from an expanded Alcohol Use Disorders Identification Test (AUDIT): alcohol consumption in grams per day and per occasion, excessive consumption in days per month, and the AUDIT sum score. Depressive symptoms were assessed for the worst 2-week period in the last 12 months using the Patient Health Questionnaire (PHQ-8). Negative binomial and logistic regression analyses were used to predict depressive symptom severity (PHQ-8 sum score) and presence of major depression (PHQ-8 sum score ≥ 10) from the alcohol use measures.
Results
Analyses revealed that depressive symptom severity and presence of major depression were significantly predicted by all alcohol use measures after controlling for sociodemographics and health behaviours (p<0.05). The relationships were curvilinear: lowest depressive symptom severity and odds of major depression were found for alcohol consumptions of 1.1 g/day, 10.5 g/occasion, 1 excessive consumption day/month, and those with an AUDIT score of 2. Higher depressive symptom severity and odds of major depression were found for both abstinence from and higher levels of alcohol consumption. Interaction analyses revealed steeper risk increases in women and younger individuals for most alcohol use measures.
Conclusion
Findings indicate that alcohol use and depression in medical care patients are associated in a curvilinear manner and that moderation by gender and age is present.
Analysis based on claims data showed no clinical benefit from the AGR intervention regarding the investigated outcomes. The slightly worse outcomes may reflect limitations in matching based on claims data, which may have insufficiently reflected morbidity and psychosocial factors. It is possible that the intervention group had poorer health status at baseline compared to the control group.
Background
In combination with systematic routine screening, brief alcohol interventions have the potential to promote population health. Little is known on the optimal screening interval. Therefore, this study pursued 2 research questions: (i) How stable are screening results for at‐risk drinking over 12 months? (ii) Can the transition from low‐risk to at‐risk drinking be predicted by gender, age, school education, employment, or past week alcohol use?
Methods
A sample of 831 adults (55% female; mean age = 30.8 years) from the general population was assessed 4 times over 12 months. The Alcohol Use Disorders Identification Test—Consumption was used to screen for at-risk drinking at each assessment. Participants were categorized either as low-risk or at-risk drinkers at baseline and 3, 6, and 12 months later. Stable and unstable risk status trajectories were analyzed descriptively and graphically. Transitioning from low-risk drinking at baseline to at-risk drinking at any follow-up was predicted using a logistic regression model.
Results
Consistent screening results over time were observed in 509 participants (61%). Of all baseline low‐risk drinkers, 113 (21%) received a positive screening result in 1 or more follow‐up assessments. Females (vs. males; OR = 1.66; 95% confidence intervals [95% CI] = 1.04; 2.64), 18‐ to 29‐year‐olds (vs. 30‐ to 45‐year‐olds; OR = 2.30; 95% CI = 1.26; 4.20), and those reporting 2 or more drinking days (vs. less than 2; OR = 3.11; 95% CI = 1.93; 5.01) and heavy episodic drinking (vs. none; OR = 2.35; 95% CI = 1.06; 5.20) in the week prior to the baseline assessment had increased odds for a transition to at‐risk drinking.
Conclusions
Our findings suggest that the widely used screening interval of 1 year may be questionable for at-risk alcohol use, although generalizability may be limited because higher-educated people were overrepresented in our sample.
Abstract
Purpose
This study aims to assess the implementation of published research, contraindications, and warnings on the prescription of dual renin-angiotensin system (RAS) blockade in ambulatory care in Germany.
Methods
Cohort study based on health claims data of 6.7 million subjects from 2008 to 2015. Yearly prevalence and incidence for dual RAS blockade with (a) angiotensin‐converting enzyme inhibitors and angiotensin‐receptor blockers (ACEI + ARB) and (b) aliskiren and ACEI or ARB (aliskiren + ACEI/ARB) were calculated. We assessed prescriber specialty and associations between discontinuing dual RAS blockade with specialist (internal medicine, cardiology, nephrology) visits and hospital discharge in the previous year.
Results
A total of 2 984 517 patients were included (mean age 51.4 ± 18.4 SD years, 48.5% male). Prescription rates for ACEI + ARB decreased from 0.6% (n = 17 907) to 0.4% (n = 12 237) and for aliskiren + ACEI/ARB from 0.23% (n = 6634) to 0.03% (n = 818). Incident prescriptions decreased from 0.23% (n = 6705) to 0.19% (n = 5055) for ACEI + ARB and from 0.1% (n = 2796) to 0.005% (n = 142) for aliskiren + ACEI/ARB; 59% of ACEI + ARB and 48% of aliskiren + ACEI/ARB combinations were prescribed by only one physician. Of those, 73% (ACEI + ARB) and 58% (aliskiren + ACEI/ARB) were primary care providers (PCPs). Discontinuing dual RAS blockade was associated with specialist care and hospital discharge in the previous year (specialist care: RR 1.4, 95% CI 1.3-1.6; hospital visit: RR 1.5, 95% CI 1.3-1.6).
Conclusions
Our results suggest a delayed uptake of treatment recommendations for ACEI + ARB and a greater impact of Dear Doctor letters addressing PCPs directly compared with published research, contraindications, and warnings. Targeted continuing medical education, practice software alerts, and stronger involvement of pharmacists might improve the implementation of medication safety recommendations in ambulatory care.
Scope
Previous work identified three metabolically homogeneous subgroups of individuals (“metabotypes”) using k‐means cluster analysis based on fasting serum levels of triacylglycerol, total cholesterol, HDL cholesterol, and glucose. The aim is to reproduce these findings and describe metabotype groups by dietary habits and by incident disease occurrence.
Methods and results
1744 participants from the KORA F4 study and 2221 participants from the KORA FF4 study are assigned to the three metabotype clusters previously identified by minimizing the Euclidean distances. In both KORA studies, the assignment of participants results in three metabolically distinct clusters, with cluster 3 representing the group of participants with the most unfavorable metabolic characteristics. Individuals of cluster 3 are further characterized by the highest incident disease occurrence during follow‐up; they also reveal the most unfavorable diet with significantly lowest intakes of vegetables, dairy products, and fibers, and highest intakes of total, red, and processed meat.
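The assignment step described above — mapping each participant to the nearest previously derived cluster — can be sketched as follows; the centroid values are invented placeholders, not the published metabotype means:

```python
import numpy as np

# Sketch of assigning participants to previously derived metabotype clusters
# by minimal Euclidean distance, as done for the KORA F4/FF4 samples.
# Centroid values below are invented placeholders. Columns: triacylglycerol,
# total cholesterol, HDL cholesterol, glucose (z-standardised, so Euclidean
# distances are comparable across variables).
centroids = np.array([
    [-0.5, -0.4,  0.6, -0.4],   # cluster 1: favourable profile
    [ 0.1,  0.3, -0.1,  0.0],   # cluster 2: intermediate profile
    [ 1.2,  0.9, -0.8,  1.1],   # cluster 3: unfavourable profile
])

def assign_metabotype(z_scores: np.ndarray) -> np.ndarray:
    """Return the 1-based index of the nearest centroid for each row."""
    # (n, 1, 4) minus (1, 3, 4) broadcasts to (n, 3, 4) pairwise differences
    d = np.linalg.norm(z_scores[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1) + 1

participants = np.array([
    [-0.6, -0.3,  0.5, -0.2],   # near the favourable centroid
    [ 1.0,  1.1, -0.9,  1.3],   # near the unfavourable centroid
])
print(assign_metabotype(participants))  # -> [1 3]
```

Because the centroids are fixed from the original analysis, this reproduces cluster membership in a new sample without re-running k-means, which is what makes the validation across cohorts possible.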
Conclusion
The three metabotypes originally identified in an Irish population are successfully reproduced. In addition to this validation approach, the observed differences in disease incidence across metabotypes represent an important new finding that strongly supports the metabotyping approach as a tool for risk stratification.
Introduction: Hearing and vision loss are highly prevalent in elderly adults, and thus frequently occur in conjunction with cognitive impairments. Studies have shown that hearing impairment is associated with a higher risk of dementia. However, evidence concerning the association between vision loss and dementia, as well as the co-occurrence of vision and hearing loss and dementia, has been inconclusive.
Objectives: To assess the associations between (i) either hearing or vision loss and the risk of dementia, and (ii) the combination of both sensory impairments and the risk of dementia.
Methods: This case-control study was based on a 5-year data set that included patients aged 65 years and older who had initially been diagnosed with dementia diseases by one of 1,203 general practitioners in Germany between January 2013 and December 2017. In total, 61,354 identified dementia cases were matched to non-dementia controls, resulting in a sample size of 122,708 individuals. Hearing loss and vision loss were identified using the ICD-10 diagnoses documented in the general practitioners’ files prior to the initial dementia diagnosis. Multivariate logistic regression models were fitted to evaluate the associations between visual and/or hearing impairment and the risk of dementia and controlled for sociodemographic and clinical variables.
Results: Hearing impairment was documented in 11.2% of patients with a dementia diagnosis and 9.5% of patients without such a diagnosis. Some form of vision impairment was documented in 28.4% of patients diagnosed with dementia and 28.8% of controls. Visual impairment was not significantly associated with dementia (OR = 0.97, 95% CI: 0.97–1.02, p = 0.219). However, patients with hearing impairment were at a significantly higher risk of developing dementia (OR = 1.26, 95% CI: 1.15–1.38, p < 0.001), a finding that very likely drove the observed significant association between the combination of both visual and hearing impairments and the risk of dementia (OR = 1.14, 95% CI: 1.04–1.24, p = 0.005).
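As a rough arithmetic cross-check, a crude (unadjusted) odds ratio can be recomputed from the proportions reported above; the published OR of 1.26 is adjusted for sociodemographic and clinical covariates, so the crude value differs:

```python
import math

# Crude (unadjusted) odds ratio for hearing impairment and dementia,
# recomputed from the proportions reported in the abstract: 11.2% of
# 61,354 dementia cases vs. 9.5% of 61,354 matched controls. The paper's
# OR of 1.26 is covariate-adjusted, so this crude estimate differs.
n_cases = n_controls = 61_354
exp_cases = round(0.112 * n_cases)        # hearing-impaired dementia cases
exp_controls = round(0.095 * n_controls)  # hearing-impaired controls

odds_cases = exp_cases / (n_cases - exp_cases)
odds_controls = exp_controls / (n_controls - exp_controls)
or_crude = odds_cases / odds_controls

# Woolf's method: approximate 95% CI on the log odds ratio
se = math.sqrt(1 / exp_cases + 1 / (n_cases - exp_cases)
               + 1 / exp_controls + 1 / (n_controls - exp_controls))
lo, hi = (math.exp(math.log(or_crude) + z * se) for z in (-1.96, 1.96))
print(f"crude OR = {or_crude:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# -> crude OR = 1.20 (95% CI 1.16-1.25)
```

The crude estimate of about 1.20 sits below the adjusted 1.26, consistent with the adjustment for matching variables and covariates in the fitted logistic models.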
Discussion: This analysis adds important evidence that contributes to the limited body of knowledge about the association between hearing and/or vision loss and dementia. It further demonstrates that, of the two, only hearing impairment affects patients’ cognition and thus contributes to dementia risk.
Mendelian randomization (MR) is a framework for assessing causal inference using cross-sectional data in combination with genetic information. This paper summarizes statistical methods that are commonly applied and straightforward to use for conducting MR analyses, including those taking advantage of the rich dataset of SNP-trait associations revealed in the last decade through large-scale genome-wide association studies. Using these data, powerful MR studies are possible. However, the causal estimate may be biased if the assumptions of MR are violated. The source and type of this bias are described, along with a summary of the mathematical formulas that help estimate the magnitude and direction of the potential bias in a specific research setting. Finally, methods for relaxing the assumptions and for conducting sensitivity analyses are discussed. Future research in the field of MR includes the assessment of non-linear causal effects and the automatic detection of invalid instruments.
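One of the standard summary-data methods the paragraph refers to is the inverse-variance-weighted (IVW) estimator, sketched here on invented SNP summary statistics (not from any real GWAS):

```python
import numpy as np

# Inverse-variance-weighted (IVW) Mendelian randomization estimate over
# several SNPs, a standard summary-data MR method. The beta/SE values
# below are invented for illustration only.
beta_x = np.array([0.12, 0.08, 0.15, 0.05])        # SNP -> exposure effects
beta_y = np.array([0.024, 0.018, 0.028, 0.011])    # SNP -> outcome effects
se_y = np.array([0.010, 0.012, 0.009, 0.015])      # SEs of beta_y

# Per-SNP Wald ratios: the causal effect implied by each instrument alone
wald = beta_y / beta_x

# IVW: precision-weighted average of the Wald ratios
w = (beta_x / se_y) ** 2
ivw = np.sum(w * wald) / np.sum(w)
se_ivw = 1.0 / np.sqrt(np.sum(w))
print(f"IVW causal estimate {ivw:.3f} (SE {se_ivw:.3f})")
```

The IVW estimate is only unbiased if all instruments are valid (no pleiotropy, no confounding of the SNP-outcome path), which is exactly the assumption-violation scenario the paper's bias formulas and sensitivity analyses address.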
Activation of trace amine-associated receptor 1 (TAAR1) in the endocrine pancreas is involved in weight regulation and glucose homeostasis. The purpose of this study was the identification and characterization of potential TAAR1 variants in patients with overweight/obesity and disturbed glucose homeostasis. Screening for TAAR1 variants was performed in 314 obese or overweight patients with impaired insulin secretion. The detected variants were functionally characterized with respect to TAAR1 cell surface expression and signaling properties, and their allele frequencies were determined in the population-based Study of Health in Pomerania (SHIP). Three heterozygous carriers of the single nucleotide missense variants p.Arg23Cys (R23C, rs8192618), p.Ser49Leu (S49L, rs140960896), and p.Ile171Leu (I171L, rs200795344) were detected in the patient cohort. While p.Ser49Leu and p.Ile171Leu were found in obese/overweight patients with slightly impaired glucose homeostasis, p.Arg23Cys was identified in a patient with a complete loss of insulin production. Functional in vitro characterization revealed wild-type-like function for I171L, a partial loss of function for S49L, and a complete loss of function for R23C. The frequency of the R23C variant among 2018 non-diabetic control individuals aged 60 years and older in the general population-based SHIP cohort was lower than in the analyzed patient sample. Both variants are rare in the general population, indicating a recent origin in the general gene pool and/or the consequence of pronounced purifying selection, in line with the obvious detrimental effect of the mutations. In conclusion, our study provides hints for the existence of naturally occurring TAAR1 variants with potential relevance for weight regulation and glucose homeostasis.
Background: According to the literature, ductoscopy is gaining increasing importance in the diagnosis of intraductal anomalies in cases of pathologic nipple discharge. In a multicenter study, the impact of this method was assessed in comparison with that of standard diagnostics. Patients and Methods: Between 09/2006 and 05/2009, a total of 214 patients from 7 German breast centers were included. All patients underwent elective ductoscopy and subsequent ductal excision because of pathologic nipple discharge. Ductoscopy was compared with the following standard diagnostics: breast sonography, mammography, magnetic resonance imaging (MRI), galactography, cytologic nipple swab, and ductal lavage cytology. The histological and imaging results were compared and contrasted with the results obtained from the nipple swab and cytologic assessment. Results: Sonography had the highest sensitivity (82.9%), followed by MRI (82.5%), galactography (81.3%), ductoscopy (71.2%), lavage cytology (57.8%), mammography (57.1%), and nipple swab (22.8%). Nipple swabs had the highest specificity (85.5%), followed by lavage cytology (85.2%), ductoscopy (49.4%), galactography (44.4%), mammography (33.3%), sonography (17.9%), and MRI (11.8%). Conclusion: Currently, ductoscopy provides a direct intraoperative visualization of intraductal lesions. Sensitivity and specificity are similar to those of standard diagnostics. The technique supports selective duct excision, in contrast to the unselective technique according to Urban. Therefore, ductoscopy extends the interventional/diagnostic armamentarium.
Do We Need to Rethink the Epidemiology and Healthcare Utilization of Parkinson's Disease in Germany?
(2018)
Epidemiological aspects of Parkinson's disease (PD), co-occurring diseases and medical healthcare utilization of PD patients are still largely elusive. Based on claims data of 3.7 million statutory insurance members in Germany in 2015 the prevalence and incidence of PD was determined. PD cases had at least one main hospital discharge diagnosis of PD, or one physician diagnosis confirmed by a subsequent or independent diagnosis or by PD medication in 2015. Prevalence of (co-)occurring diseases, mortality, and healthcare measures in PD cases and matched controls were compared. In 2015, 21,714 prevalent PD cases (standardized prevalence: 511.4/100,000 persons) and 3,541 incident PD cases (standardized incidence: 84.1/100,000 persons) were identified. Prevalence of several (co-)occurring diseases/complications, e.g., dementia (PD/controls: 39/13%), depression (45/22%), bladder dysfunction (46/22%), and diabetes (35/31%), as well as mortality (10.7/5.8%) differed between PD cases and controls. The annual healthcare utilization was increased in PD cases compared to controls, e.g., regarding mean ± SD physician contacts (15.2 ± 7.6/12.2 ± 7.3), hospitalizations (1.3 ± 1.8/0.7 ± 1.4), drug prescriptions (overall: 37.7 ± 24.2/21.7 ± 19.6; anti-PD medication: 7.4 ± 7.4/0.1 ± 0.7), assistive/therapeutic devices (47/30%), and therapeutic remedies (57/16%). The standardized prevalence and incidence of PD in Germany as well as mortality in PD may be substantially higher than reported previously. While frequently diagnosed with co-occurring diseases/complications, such as dementia, depression, bladder dysfunction and diabetes, the degree of healthcare utilization shows large variability between PD patients. These findings encourage a rethinking of the epidemiology and healthcare utilization in PD, at least in Germany. Longitudinal studies of insurance claims data should further investigate the individual and epidemiological progression and healthcare demands in PD.
Effectiveness of Varenicline as an Aid to Smoking Cessation in Primary Care: An Observational Study
(2012)
Aims: Although varenicline is commonly prescribed in primary care, information on smoking-related comorbidities and the effectiveness of varenicline in this context in Germany is scarce. This study assessed the effectiveness and safety of varenicline in a large sample of patients seeking smoking cessation treatment through their general practitioners. The frequency of comorbidities was also evaluated. Methods: This was a 12-week, prospective, observational, non-comparative phase IV trial conducted in Germany. Abstinence rates at week 12 were evaluated by verbal reporting using the nicotine use inventory. Results: Overall, 1,391 subjects were enrolled; 1,177 received study medication and were evaluated for effectiveness and safety. At the end of the study, 71.1% (95% confidence interval 68.5–73.7) of subjects were abstinent. There were a total of 205 all-causality adverse events; 2.2% were classified as serious or severe. There were no fatal adverse events. At inclusion, 66.7% of participants had at least 1 concurrent comorbidity, with chronic obstructive pulmonary disease (35.5%), hypertension (29.6%) and depression (10.4%) being the most commonly reported. Conclusion: These real-world data indicate that varenicline is an effective and well-tolerated smoking cessation treatment when used in the primary care setting, including in patients with smoking-related comorbidities.
Context: 3,5-Diiodo-L-thyronine (3,5-T2) is a thyroid hormone metabolite which exhibited versatile effects in rodent models, including the prevention of insulin resistance or hepatic steatosis typically forced by a high-fat diet. With respect to euthyroid humans, we recently observed a putative link between serum 3,5-T2 and glucose but not lipid metabolism. Objective: The aim of the present study was to widely screen the urine metabolome for associations with serum 3,5-T2 concentrations in healthy individuals. Study Design and Methods: Urine metabolites of 715 euthyroid participants of the population-based Study of Health in Pomerania (SHIP-TREND) were analyzed by 1H-NMR spectroscopy. Multinomial logistic and multivariate linear regression models were used to detect associations between urine metabolites and serum 3,5-T2 concentrations. Results: Serum 3,5-T2 concentrations were positively associated with urinary levels of trigonelline, pyroglutamate, acetone and hippurate. In detail, the odds for intermediate or suppressed serum 3,5-T2 concentrations doubled owing to a 1-standard-deviation (SD) decrease in urine trigonelline levels, or increased by 29-50% in relation to a 1-SD decrease in urine pyroglutamate, acetone and hippurate levels. Conclusion: Our findings in humans confirmed the metabolic effects of circulating 3,5-T2 on glucose and lipid metabolism, oxidative stress and enhanced drug metabolism as postulated before based on interventional pharmacological studies in rodents. Of note, 3,5-T2 exhibited a unique urinary metabolic profile distinct from previously published results for the classical thyroid hormones.
Background: Iodine deficiency disorders (IDD) represent a global health threat to individuals and societies. IDD prevention programmes have been introduced in many parts of the world. However, challenges remain, particularly in Europe, due to fragmentation and diversity of approaches that are not harmonized. Objectives: This review is dedicated to the public-health impact of IDD prevention programmes. It sums up experiences collected by the EUthyroid consortium so far and provides information on stakeholders that should be involved in actions directed to improve the impact of IDD prevention. Methods: A joint European database for combining registry-based outcome and monitoring data was established, together with tools for harmonizing study methods. Methods for analyzing thyroglobulin from a dried blood spot are available for assessing the iodine status in the general population and at-risk groups. Mother-child cohorts are used for in-depth analysis of the potential impact of mild-to-moderate iodine deficiency on the neurocognitive development of the offspring. A decision-analytic model has been developed to evaluate the long-term effectiveness and cost effectiveness of IDD prevention programmes. Results: EUthyroid has produced tools and infrastructure to improve the quality of IDD monitoring and follows a dissemination strategy targeting policymakers and the general public. There are tight connections to major stakeholders in the field of IDD monitoring and prevention. Conclusions: EUthyroid has taken steps towards achieving a euthyroid Europe. Our challenge is to inspire a greater sense of urgency in both policymakers and the wider public to address this remediable deficit caused by IDD.
Background: Securing future blood supply is a major issue of transfusion safety. In this prospective 10-year longitudinal study we enrolled all blood donation services and hospitals of the federal state Mecklenburg-Western Pomerania. Methods and Results: From 2005 to 2015 (a time period with major demographic effects), whole blood donation numbers declined by 18%. In male donors this paralleled the demographic change, while donation rates of females declined 12.4% more than expected from demography. In parallel, red cell transfusion rates per 1,000 population decreased from 56 in 2005 to 51 in 2015 (-8.4%), primarily due to fewer transfusions in patients >60 years. However, transfusion demand declined much less than blood donation numbers (-13.5% versus -18%), and the population >65 years (with the highest transfusion demand) will further increase. The key question is whether the decline in transfusion demand observed over the previous years will continue, thereby compensating for reduced blood donation numbers due to the demographic change. The population structure of Mecklenburg-Western Pomerania reflects all Eastern German federal states, while the Western German federal states will reach similar ratios of the age groups 18-64 years / ≥65 years about 10 years later. Conclusions: Regular monitoring of age- and sex-specific donation and transfusion data is urgently required to allow transfusion services strategic planning for securing the future blood supply.
Previous studies on the antimicrobial activity of cold atmospheric pressure argon plasma showed varying effects against mecA+ or mecA− Staphylococcus aureus strains. This observation may have important clinical and epidemiological implications. Here, the antibacterial activity of argon plasma was investigated against 78 genetically different S. aureus strains, stratified by mecA, luk-P, agr1-4, or the cell wall capsule polysaccharide types 5 and 8. kINPen09® served as the plasma source for all experiments. On agar plates, mecA+ luk-P− S. aureus strains showed decreased susceptibility to plasma compared to other S. aureus strains. This study underlines the high complexity of microbial defence against antimicrobial treatment and confirms a previously reported strain-dependent susceptibility of S. aureus to plasma treatment.
Background: The plasminogen activator system plays a key role in ovarian cancer (OC) tumor progression. The plasminogen activator inhibitor type 1 (PAI-1) and the recently identified PAI-1 RNA-binding protein 1 (PAI-RBP1) are primary regulators of plasminogen activation and thus are putative biomarkers for OC progression. Methods: One hundred fifty-six OC patients were analyzed for the presence of PAI-1 and PAI-RBP1, which was subsequently correlated with clinicopathological parameters. Primary cells obtained from OC patient samples were used in fluorescence microscopy analysis to examine PAI-1 and PAI-RBP1 distribution. Results: PAI-1 and PAI-RBP1 were found to be predictive markers for OC patients' outcome. PAI-1 levels significantly correlated with volume of ascites, FIGO staging, and lymph node status. PAI-RBP1 expression significantly correlated with age at first diagnosis, histological tumor type, presence of distant metastasis (pM), and recurrence. PAI-1 showed a trend toward association with progression-free survival, and PAI-RBP1 was significantly associated with it. Notably, PAI-1 protein in recurrent OC tissues was exclusively localized in the nucleus. Conclusion: This study has shown that the combination of PAI-1 and PAI-RBP1 may represent a novel prognostic factor for OC. Prospective trials are needed.
Mean platelet volume is more important than age for defining reference intervals of platelet counts
(2019)
Introduction
Bipolar disorder (BD) is characterized by recurrent episodes of depression and mania and affects up to 2% of the population worldwide. Patients suffering from bipolar disorder have a reduced life expectancy of up to 10 years. The increased mortality might be due to a higher rate of somatic diseases, especially cardiovascular diseases. There is also evidence for an increased rate of diabetes mellitus in BD, but the reported prevalence rates vary widely.
Material and Methods
Eighty-five patients with bipolar disorder were recruited within the framework of the BiDi study (Prevalence and clinical features of patients with Bipolar Disorder at High Risk for Type 2 Diabetes (T2D), at prediabetic state and with manifest T2D) in Dresden and Würzburg. T2D and prediabetes were diagnosed by measuring HbA1c and performing an oral glucose tolerance test (oGTT), which is currently the gold standard for diagnosing T2D. The BD sample was compared to an age-, sex-, and BMI-matched control population (n = 850) from the Study of Health in Pomerania (SHIP-Trend cohort).
Results
Patients suffering from BD had a T2D prevalence of 7%, which was not significantly different from that of the control group (6%). Contrary to our hypothesis, pathological fasting glucose and impaired glucose tolerance were more frequent in controls than in BD patients. Nondiabetic and diabetic bipolar patients differed significantly in age, BMI, number of depressive episodes, and disease duration.
Discussion
When controlling for BMI, there was no significantly increased rate of T2D in BD in our study. We thus suggest that overweight and obesity might mediate the association between BD and diabetes. Underlying causes could be shared risk genes, medication effects, and lifestyle factors associated with depressive episodes. As the latter two can be modified, weight changes in BD should be monitored and adequate measures taken to prevent the alarming loss of life years in BD patients.