Telemedicine at the Emergency Site – Evaluated by emergency team members in simulated scenarios
(2015)
The hypothesis of this study is that emergency medicine can benefit from telemedicine whenever paramedics at a remote emergency site request consultation or mentoring by a distant emergency doctor. The hypothesis was evaluated semi-qualitatively, in accordance with the protocol of the EU project, in the setting of a medical simulation centre. Paramedics worked through standardized simulated emergency scenarios and connected to emergency doctors for teleconsultation and telementoring by video and audio link through a newly developed real-time HD-video system called the LiveCity camera. Paramedics and emergency doctors regarded the simulated scenarios as realistic and relevant and took the simulation seriously. The following conclusions can be drawn: 1.) Emergency team members encounter situations at the emergency site in which they would like help from a more experienced colleague, especially with diagnostics and treatment. 2.) Telemedical contact with an emergency doctor makes paramedics feel confirmed in their work and more secure, including in legal respects. Paramedics do not feel controlled by telemedicine or like a puppet on a string. Their relationship with the patient is not substantially disturbed by the doctor, and their course of action is not substantially disrupted. The tele-emergency doctors do not feel like puppet masters; they continue to feel like doctors and do not perceive themselves as intruders within the emergency team. 3.) Emergency team members call for a telemedical system that transmits vital signs and provides an audio and video connection. 4.) The LiveCity camera is an effective telemedical tool. The audio quality is good and orientation on the screen is easy. Paramedics state that filming the emergency site is easy, does not restrict the field of vision, and lets them show and tell the emergency doctors everything they want to convey, so the emergency doctors receive additional information.
While the LiveCity camera is mostly perceived as not too heavy, it is not easy to operate, is very failure-prone, and can disturb communication among team members at the emergency site. Nevertheless, it is not perceived as an additional burden. 5.) Telemedicine is largely appreciated by the members of the emergency team. Connecting the tele-emergency doctor to the remote paramedics leads to a perceived faster start of therapy and is considered helpful, improving both the situation and the quality of patient care. Adherence to medical guidelines, and therefore quality, increased when the paramedics were connected to an emergency doctor through the telemedicine link. In general, the quality of diagnostics, the correctness of diagnoses, and the quality of therapy were rated higher. The majority of paramedics would call a tele-emergency doctor in cases in which they would not normally activate medical support. The emergency team members largely agree in perceiving the tele-emergency doctor system as useful, and they can imagine working in a tele-emergency system. In conclusion, the general hypothesis of this study is supported in most respects: emergency medicine benefits from telemedical support via video and audio link, as studied here with the newly developed real-time HD-video system called the LiveCity camera, whenever paramedics at a remote emergency site request consultation or mentoring by a distant emergency doctor.
Simulations of Short Model Peptides and Practically Relevant Modeled Titanium Implant Surfaces
(2014)
One aim of this work was to generate an unrestrained force-field model that includes carbon contamination, in order to make adsorption simulations more realistic and more comparable with experimental data. Another purpose was to find out how the specific recognition of small linker proteins on titanium dioxide works. In this work both a fixed and an unrestrained rutile (100) model were used, and critical properties were observed that are not related to the surface alone. The rigid water layers on top of the oxide are very important for protein and peptide adsorption. Therefore, the first objects of discussion were the properties of the water layers and how they can be influenced. The charge distribution on the surface was found to have a strong effect on them: depending on the charges of the surface atoms or of the functional groups resulting from the hydroxylation equilibrium, the first water layer in particular becomes more rigid or smoother. This has a strong effect on biomolecule adsorption, because the peptides need to penetrate these water layers to establish direct interaction points. The correct description of the surface in molecular dynamics simulations therefore strongly influences the results: the better the model, the better the findings can be compared with experimental ones. In addition, carbon contamination was mimicked by a monolayer of pentanol molecules. This agrees very well with experimental data (e.g. contact angle) and makes the oxide model more hydrophobic. Interactions of proteins and peptides in experiments or in medical use are often observed under normal air conditions, which means that the scaffold is (i) hydroxylated by water and (ii) carbon-contaminated within a short period of time. Investigations were therefore carried out to find out how the contamination influences the adsorption of peptides formerly known as good or bad binders (TiOBP1; TiOBP2).
It was found that TiOBP1 is able to bind the different surface modifications very well, which coincides with experimental observations. The mode of adsorption (direct or indirect) depends on the properties of the water layers. The first layer on highly charged surface models is so rigid that the peptide cannot adsorb directly. On the carbon-contaminated oxide model, adsorption becomes possible through a reduced flexibility of the secondary-structure motif. In the case of TiOBP2, adsorption on the clean surface model results in only weak binding or even no interaction, whereas on the carbon-contaminated dioxide the formerly known bad binder is able to interact with the pentanol monolayer. No direct adsorption is observed, but the hydrophobic side chains can orient themselves toward the hydrophobic layer without significant changes in the secondary-structure motif. An additional test peptide (minTBP) adsorbs without being affected by the contamination. This raises the question of whether the ratio of hydrophobic to hydrophilic amino acids influences the adsorption ability on clean versus contaminated surfaces. For experimental applications it could be of interest to generate peptides (GEPIs) that bind both surface types without changing their secondary-structure motifs, since, as we know, functionality is based on these structures. In the case of the PHMB polymer, adsorption was observed to depend on the hydroxylation ratio and therefore on the charge density of the rutile (100) surface. Analysis of the simulations substantiated findings from experiments: PHMB interacts with the negatively charged surface via the first water layer, as a film. The new force-field model describing the rutile (100) titanium dioxide surface, together with the additional carbon-contamination model of one pentanol monolayer, thus fits the experimental data very well.
The adsorption studied on these surfaces indicates that the contamination, as expected, makes the surface more hydrophobic and influences the adsorption behavior of the tested peptides, especially the secondary structure of TiOBP1. This indeed supports experimental investigations. Peptides that, for example, link organic and inorganic parts should adsorb well on both clean and contaminated surfaces while keeping their functionality. Furthermore, experimental data can be substantiated by atomistic simulations, as in the case of PHMB adsorption.
Simulation-Based Analysis of Surgical Processes, Using the Example of a Basic and Standard Care Hospital
(2013)
The economic analysis of processes and of the utilization of individual resources plays an increasingly important role in hospitals. Tools from the field of operations research can substantially support the representation, evaluation, and design of processes in general and of surgical processes in particular. Discrete-event simulation in particular is considered a promising method for supporting important analyses in hospitals. This thesis examines the surgical processes of a basic and standard care hospital. Using a stochastic discrete-event simulation, existing process flows are modeled, and the effects of changed parameters are simulated and analyzed by means of scenario calculations. A special focus lies on the examination of staffing and spatial resources as well as of important process indicators, complemented by a consideration of the resulting costs. The thesis shows that a discrete-event simulation can represent the surgical processes of a basic and standard care hospital and that the effects of process changes can be examined by means of various scenarios. In these scenarios, both the utilization of the various resources and other important process indicators can be influenced. For example, the range of services currently performed in the hospital could be carried out in a smaller number of operating rooms, so that freed-up resources would be available for alternative use. Depending on the scenario, the mean overall utilization of the operating rooms could thus be increased considerably. Discrete-event simulation proves to be an excellent tool for analyzing important questions in hospitals and thus serves as helpful support for decision-making processes.
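The discrete-event idea behind such scenario calculations can be illustrated with a minimal sketch. The room count, case times, and metrics below are invented for illustration; the thesis's actual model is far richer:

```python
import heapq

def simulate_or_day(n_rooms, surgeries):
    """Minimal discrete-event sketch of one operating-room day.

    surgeries: list of (arrival_time, duration) in minutes.
    Returns (mean_wait, utilization) -- illustrative metrics only.
    """
    free_at = [0.0] * n_rooms          # min-heap: time each room becomes free
    heapq.heapify(free_at)
    total_wait = 0.0
    busy_time = 0.0
    makespan = 0.0
    for arrival, duration in sorted(surgeries):
        room_free = heapq.heappop(free_at)   # next room to become available
        start = max(arrival, room_free)      # case waits if no room is free yet
        total_wait += start - arrival
        end = start + duration
        busy_time += duration
        makespan = max(makespan, end)
        heapq.heappush(free_at, end)
    utilization = busy_time / (n_rooms * makespan) if makespan else 0.0
    return total_wait / len(surgeries), utilization
```

A scenario calculation then amounts to re-running the model with, say, a smaller `n_rooms` and comparing mean waiting time and utilization between the current and the modified configuration.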
We present classical and hybrid modeling approaches for genetic regulatory networks, focusing on promoter analysis for negatively and positively autoregulated networks. The main aim of this thesis is to introduce an alternative mathematical approach to modeling gene regulatory networks based on piecewise deterministic Markov processes (PDMPs). During somitogenesis, a process describing early segmentation in vertebrates, molecular oscillators play a crucial role as part of a segmentation clock. In mice, these oscillators are called Hes1 and Hes7 and are commonly modeled by a system of two delay differential equations including a Hill function, which describes the repression of the genes by their own gene products. The Hill coefficient, a measure of the nonlinearity of the binding processes in the promoter, is usually assumed to be equal to two, based on the fact that Hes1 and Hes7 form dimers. However, by standard arguments applied to binding analysis, we show that a higher Hill coefficient is reasonable. This leads to results different from those in the literature and requires a more sophisticated model. For the Hes7 oscillator we present a system of ordinary differential equations including a Michaelis-Menten term describing a nonlinear degradation of the proteins via the ubiquitin pathway. As demonstrated by the Hes1 and Hes7 oscillators, promoter behavior can strongly influence the dynamical behavior of genetic networks. Since purely deterministic systems cannot reveal phenomena caused by inherent random fluctuations, we propose a novel approach based on PDMPs. Such models make it possible to treat the binding of transcription factors to binding sites in a promoter as random processes, while all other processes, such as synthesis, degradation, or dimerization of the gene products, are modeled deterministically.
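A delay model of this type can be sketched in a few lines. The equations follow the standard two-variable Hes1 form (mRNA m, protein p, delayed Hill repression); all parameter values below are illustrative assumptions, not the thesis's fitted values:

```python
def simulate_hes1(t_end=500.0, dt=0.1, tau=18.7, h=2.0,
                  alpha=1.0, mu=0.03, k=1.0, gamma=0.03, p0=100.0):
    """Sketch of a delay-differential Hes1-type model:

        dm/dt = alpha / (1 + (p(t - tau) / p0)**h) - mu * m
        dp/dt = k * m - gamma * p

    Integrated with explicit Euler; a ring buffer supplies the delayed
    protein value p(t - tau). Parameters are illustrative only.
    Returns the (m, p) trajectories.
    """
    n_delay = int(round(tau / dt))
    p_hist = [0.0] * n_delay            # history buffer: p(t) = 0 for t < 0
    m, p = 0.0, 0.0
    ms, ps = [m], [p]
    for i in range(int(t_end / dt)):
        p_delayed = p_hist[i % n_delay]                 # p from tau time units ago
        repression = alpha / (1.0 + (p_delayed / p0) ** h)
        m_new = m + dt * (repression - mu * m)
        p_new = p + dt * (k * m - gamma * p)
        p_hist[i % n_delay] = p          # store current p for reading at t + tau
        m, p = m_new, p_new
        ms.append(m)
        ps.append(p)
    return ms, ps
```

Raising the Hill coefficient `h` in such a model sharpens the repression switch, which is exactly the parameter whose value (two versus higher) the thesis re-examines.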
We present and discuss a simulation algorithm for PDMPs and apply it to three types of genetic networks: an unregulated gene, a toggle switch, and a positively autoregulated network. The different regulation characteristics are analyzed and compared by numerical means. Furthermore, we determine analytical solutions for the stationary distributions of one negatively and three positively autoregulated networks. Based on these results, we analyze the attenuation of noise in a negative feedback loop and the question of graded versus binary response in autocatalytic networks.
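The hybrid character of such a process can be sketched for the simplest case named above, an unregulated gene with a two-state (telegraph) promoter: the promoter switches at random, while the protein level flows deterministically between switches. All rate constants below are invented for illustration:

```python
import math
import random

def simulate_pdmp(t_end, k_on=0.5, k_off=0.3, beta=10.0, gamma=1.0, seed=1):
    """Sketch of a PDMP for an unregulated gene (illustrative parameters).

    The promoter switches stochastically between OFF (s=0) and ON (s=1);
    between switches the protein level x follows the deterministic ODE
        dx/dt = beta * s - gamma * x,
    which is integrated exactly via its closed-form solution.
    Returns the trajectory as a list of (time, state, x) at the jump times.
    """
    random.seed(seed)
    t, s, x = 0.0, 0, 0.0
    traj = [(t, s, x)]
    while t < t_end:
        rate = k_on if s == 0 else k_off        # switching rate in current state
        tau = min(random.expovariate(rate),     # exponential waiting time
                  t_end - t)
        # exact deterministic flow of x over [t, t + tau] with fixed state s
        decay = math.exp(-gamma * tau)
        x = x * decay + (beta * s / gamma) * (1.0 - decay)
        t += tau
        if t < t_end:
            s = 1 - s                           # the jump: promoter switches
        traj.append((t, s, x))
    return traj
```

Because the flow between jumps is solved in closed form, no ODE time-stepping error enters; more realistic networks (toggle switch, autoregulation) would make the switching rates depend on x itself, which requires sampling the jump time from a state-dependent intensity.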
The aim of this thesis is to address the problem of rising quality expectations and cost pressure in the health-care system and, against this background, to examine two decisive processes in the hospital more closely: the polyclinic as an outpatient medical unit and the ward as an inpatient nursing unit. At present, these two units are still largely independent of each other, or they are arranged sequentially in the treatment process. The efficiency and effectiveness of health-care services depend decisively on the development of the division of labor and on the interplay between the outpatient and inpatient sectors. In this thesis, these organically grown organizational structures are broken up: a change of perspective takes place, from traditional structures to a process-oriented view. Using various process-optimization approaches, the process of inpatient care on the ward is to be partially integrated into the care process of the polyclinic. A central point is the identification and evaluation of synergy potentials through the pooling or relocation of the resources nursing staff and medical staff. In addition to the quantifiable consequences for resource use, the redesign of the processes is also intended to achieve improvements in the medical treatment workflows and in quality assurance. A core idea of the process optimization is a classification of the process object, the patient, into mobile and immobile inpatients. Using a simulation with the program MedModel, the process changes in the current and target states are evaluated and compared. The success of the intended process optimization is measured along the process-performance dimensions of cost, quality, and time.