Communities in this Repository
Select a community to browse its collections.
Tourmaline chemistry (EMPA); Geochronology: U-Pb LA-ICPMS, Ar-Ar; S-Isotopes; B-Isotopes; Sample List
13 datafiles of the digital database
Supporting Information for Tajik Basin and Southwestern Tian Shan, Northwestern India-Asia Collision Zone: 2. Timing of Basin Inversion, Tian Shan Mountain Building, and Relation to Pamir-Plateau Advance and Deep India-Asia Indentation Abdulhameed et al., 2020, Tectonics
Contents of this File Supporting information S1 Tables S1 to S3 Figure S1 to Figure S4 Dataset S1 with Tables S4 to Table S6
Supporting Information for Tajik Basin and Southwestern Tian Shan, Northwestern India-Asia Collision Zone: 3. Pre- to Syn-orogenic Retro-foreland Basin Evolution in the Eastern Tajik Depression and Linkage to the Pamir Hinterland Dedow et al., Tectonics, 2020
Contents of this file Supporting information S1. Tables S1 and S2 provided as separate files
AGUPublications: Tectonics Supporting Information for: Tajik Basin and Southwestern Tian Shan, Northwestern India-Asia Collision Zone: 1. Structure, Kinematics, and Salt-tectonics in the Tajik Fold-thrust Belt of the Western Foreland of the Pamir Łukasz Gągała1,2, Lothar Ratschbacher1, Jean-Claude Ringenbach3, Sofia-Katarina Kufner4, Bernd Schurr4, Ralf Dedow1, Sanaa Abdulhameed1, Edouard Le Garzic3, Mustafo Gadoev5, and Ilhomjon Oimahmadov5 1Geologie, Technische Universität Bergakademie Freiberg, Freiberg, Germany, 2Present address: Hellenic Petroleum, Marousi, Greece, 3E2S-UPPA, CNRS, Univ. Pau & Pays Adour, Pau, France, 4GFZ German Research Center for Geosciences, Potsdam, Germany, 5Institute of Geology, Earthquake Engineering and Seismology, Tajik Academy of Sciences, Dushanbe, Tajikistan Contents of this File: Supporting information S1; Figures S1 to S9
Supporting Information to Gagala et al., Tectonics, 2020
Four zircon Raman bands were previously calibrated to give consistent estimates of the accumulated self-irradiation α-dose in unannealed volcanic samples. Partial annealing of radiation damage produces inconsistent values because the relative annealing sensitivities of the different bands differ from their relative accumulation rates. The damage estimate based on the external rotation band (DER) is the most sensitive, and that based on the internal bending mode, ν2(SiO4) (D2), is the least sensitive to annealing. The D2/DER ratio thus provides an estimate of the extent of annealing that a zircon sample has experienced. Further, it characterizes not only the sample's Raman age and thermal history but also its state of radiation damage during its geologic history, and therefore the manner in which this state influences other thermochronologic methods. Meaningful interpretation of the thermal signal and of the zircon Raman age requires that the spectra be free of measurement artifacts. The major artifacts result from micrometer-scale gradients of the damage densities within a zircon grain due to uranium and thorium zoning. The sampled volume may span different densities, producing overlapping spectra that cause apparent peak broadening and overestimated damage densities and zircon Raman ages. The D3/D2 ratio calculated from the ν3(SiO4) and ν2(SiO4) bands, the most and least affected by the overlap, is an efficient indicator of a meaningless signal. It reveals overlap in annealed and unannealed samples, because the ν2(SiO4) and ν3(SiO4) bands have similar responses to annealing. Multi-band Raman maps can be converted to damage-ratio maps for screening zircon mounts and for selecting suitable spots for thermochronologic investigations.
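The two screening ratios described above can be illustrated with a short sketch. The per-band damage estimates below are hypothetical placeholder values; the actual per-band calibrations are not reproduced here.

```python
# Minimal sketch of the two screening ratios, using hypothetical per-band
# damage-density estimates; the band calibrations themselves are not shown.

def screening_ratios(d2, d3, der):
    """Return (D2/DER, D3/D2) from band-based damage estimates.

    d2  -- damage estimate from the internal bending mode nu2(SiO4)
    d3  -- damage estimate from the nu3(SiO4) band
    der -- damage estimate from the external rotation band
    """
    return d2 / der, d3 / d2

# Unannealed, artifact-free case: all bands agree, both ratios near 1.
annealing_ratio, overlap_ratio = screening_ratios(0.30, 0.30, 0.30)
assert abs(annealing_ratio - 1.0) < 1e-9
assert abs(overlap_ratio - 1.0) < 1e-9

# Partially annealed case: DER anneals fastest, so D2/DER rises above 1,
# while D3/D2 stays near 1 because both bands respond similarly to annealing.
annealing_ratio, overlap_ratio = screening_ratios(0.25, 0.24, 0.10)
print(annealing_ratio, overlap_ratio)  # D2/DER > 1 flags partial annealing
```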
Supplement Texts S1, S2, and S3 Figures S1, S2, S3, and S4 Tables T1 and T2
Determination of closure temperatures for damage accumulation for three Raman bands in zircon.
Tables 1 to 3 giving Raman data of annealed zircons
The plate-bounding Alpine Fault in New Zealand is an 850 km long transpressive continental fault zone that is late in its earthquake cycle. The Deep Fault Drilling Project (DFDP) aims to deliver insight into the geological structure of this fault zone and its evolution by drilling and sampling the Alpine Fault at depth. We have acquired and processed reflection seismic data to image the subsurface around the drill site: 1) a 2D seismic line in 2011 and 2) an extended 3D walkaway Vertical Seismic Profiling data set in 2016. The resulting velocity models and seismic images of the upper 5 km show complex subsurface structures around the Alpine Fault zone, comprising both the local structures of the glacial valley and the tectonic fault structures. The results provide a reliable basis for a seismic site characterisation at the DFDP-2 drill site and correlate with preliminary cutting and logging results from the drilling. Thus, the information derived from the seismic data sets is crucial for further structural and geological investigations of the architecture of the Alpine Fault in this area.
This seismic data set was acquired in 2016 by a joint effort of several partners. A detailed field report by Townend et al. (2016) describes the field procedures and available data. Please see the included PDF file for more details.
Data sets for a seismic P-wave velocity model from a 3D VSP survey at the Alpine Fault DFDP-2 drill site in the Southern Alps, New Zealand, containing the first-arrival travel times used and the final P-wave velocity model.
For the purpose of investigating the atomization of respiratory mucus during phonation, a new experimental setup was designed which emulates the vocal folds, their oscillating movement and the expiratory air flow in a simplified manner. The primary atomization of an artificial mucus can be observed. Based on the shadowgraphy measurements carried out, droplet size spectra were evaluated and the influence of the parameters flow rate, oscillation frequency and amplitude was investigated. Furthermore, high-speed recordings allowed the visualization and discussion of the droplet formation mechanisms.
Supplemental material and supporting information for the publication: Fritzsche et al. (2022), “Toward unraveling the mechanisms of aerosol generation during phonation” (DOI: 10.1063/5.0124944).
To investigate how choir singing can be made safer during a pandemic, CO2 concentrations were recorded with sensors during different choir rehearsals. The underlying assumption is that the CO2 concentration correlates with potentially virus-laden aerosols emitted during singing. The measurement results of two choir rehearsals of the Collegium Musicum of TUBAF were evaluated in detail and published in the Journal of Voice. All data underlying the publication, the evaluation scripts, the results, and the publication itself are included here.
All raw data of the CO2 measurements and their evaluation contained in the publication "How safe is singing under pandemic conditions? - CO2-measurements as simple method for risk estimation during choir rehearsals" in the Journal of Voice.
This project holds supplementary data for numerical simulations of periodic surface structures created by direct laser interference patterning. These simulations were performed with OpenFOAM and compared to experimental measurements.
Supplementary Data for the following Publication: M. Heinrich, B. Voisiat, A.F. Lasagni, R. Schwarze Numerical simulation of periodic surface structures created by direct laser interference patterning PLOS ONE (2023)
This collection contains all necessary data and code to reproduce the graphs found in the main text of the PhD thesis (Dissertation) "Nonlinear Parameter Estimation of Experimental Cake Filtration Data" by Thomas Buchwald.
This collection belongs to the doctoral thesis "Nonlinear Parameter Estimation of Experimental Cake Filtration Data". Most of the content consists of Jupyter notebooks containing the Python code that reproduces graphs found in the thesis from the original experimental data. The lab practice dataset of 500 filtration experiments is contained in the folder for section 3.5. The notebooks are not strictly sorted by section; in any case, the Readme will guide you to the notebook that produces a certain graph if it is not part of the main notebook of that specific section. The original Python environment was set up with Anaconda. Please use the provided .yml file to create a Python environment that contains all the necessary packages. Some notebooks may not work with the most current versions of the packages, so updating is not necessarily a good idea.
The 3D particle analysis offers the possibility to capture several particle characteristics simultaneously. However, when using X-ray tomography over several size scales (nano- and micro CT), it is not possible to derive quantitative information about the material composition of samples directly from the image grey values. Within the project, this is to be done by correlating the 3D volume with 2D elemental analysis (EDXS) utilizing a FIB-SEM workflow.
Saxolite and Talcum are two materials with comparable X-ray attenuation properties and thus cannot be distinguished by a direct comparison of image grey values.
Reliable information about the micro-processes during filtration and dewatering of filter cakes allows more accurate statements about process development and design in any industrial application with solid-liquid separation units. Distributed particle properties such as shape, size, and material influence the porous network structure with considerable local fluctuations in vertical and horizontal alignment in the cake forming apparatus. The present work relates to a wide range of particle sizes and particle shapes and presents their effects on integral, but preferably local, structural parameters of cake-forming filtration. Current models for the relationship between particle properties and resulting porous structure remain inaccurate. Therefore, the central question focuses on the model-based correlation between the obtained data and characteristic cake and process parameters. In combination with X-ray computed tomography and microscopy (ZEISS Xradia 510), data acquisition on the structural build-up of filter cakes is possible on a small scale (filter area 0.2 cm²) and a conventional laboratory scale (filter area 20 cm², VDI 2762 pressure nutsch). Thereby, the work focuses on structural parameters at the local level before, during, and after cake dewatering, such as porosity, coordination number, three-phase contact angle, characteristics of pores and isolated liquid regions, the liquid load of individual particles, tortuosity, and capillary length, and the corresponding spatial distributions. Seven different particle systems in the range of 20 and 500 µm, suspended in aqueous solutions with additives for contrast enhancement, served as the initial raw materials for the filter cake build-up.
Image data processing from 16-bit greyscale images with a resolution of 2 to 4 µm/voxel edge length includes various operations from denoising filters and shape enhancement with two-stage segmentation to identify air, solid particles, and liquid phase, resulting in a machine learning-based automated approach. Subsequent modeling and correlation of measured parameters rely on experimentally verified quantities from mercury porosimetry, laser diffraction, dynamic image analysis, static and dynamic droplet contour analysis, as well as filtration and capillary pressure tests according to VDI guidelines. The tomography measurements provide microscopic information about the porous system, quantified using characteristic key parameters and distribution functions.
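The two-stage segmentation into air, solid, and liquid can be illustrated with a minimal grey-value sketch. The thresholds and grey values below are hypothetical; the actual pipeline described above additionally uses denoising, shape enhancement, and a machine-learning step that are not reproduced here.

```python
import numpy as np

def segment_three_phase(grey, t_air_liquid, t_liquid_solid):
    """Minimal two-stage grey-value segmentation of a 16-bit tomogram.

    Stage 1 splits air (darkest voxels) from the condensed phases;
    stage 2 splits the contrast-enhanced liquid from the solid particles.
    Returns a label volume: 0 = air, 1 = liquid, 2 = solid.
    """
    labels = np.zeros(grey.shape, dtype=np.uint8)
    condensed = grey >= t_air_liquid                   # stage 1
    labels[condensed & (grey < t_liquid_solid)] = 1    # stage 2: liquid
    labels[condensed & (grey >= t_liquid_solid)] = 2   # stage 2: solid
    return labels

# Synthetic 16-bit slice with three grey-value plateaus (illustrative only).
grey = np.array([[1000, 20000], [40000, 41000]], dtype=np.uint16)
labels = segment_three_phase(grey, t_air_liquid=10000, t_liquid_solid=30000)
porosity = float(np.mean(labels == 0))  # air fraction of the imaged volume
print(labels, porosity)
```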
Appendices to the data collections of the following publications: - Publication A: Study on the influence of solids volume fraction on filter cake structures using micro tomography - Publication B: Neighborhood Relationships of Widely Distributed and Irregularly Shaped Particles in Partially Dewatered Filter Cakes - Publication C: Insight into filter cake structures using micro tomography: The dewatering equilibrium - Publication D: Network model of porous media – Review of old ideas with new methods - Publication E: Wetting behavior of porous structures: Three-dimensional determination of the contact angle after filter cake dewatering using X-ray microscopy The data collection contains tomographic images (ZEISS Xradia 510 Versa) of partially dewatered filter cakes according to VDI 2762. The images include filter cake structures from additional particle systems such as mica, limestone, quartz, dolomite and glass particles and thus complement the reference system of aluminum oxide particles (publications A to E). All particles are between 50 and 200 µm in size, and their shape varies from spherical to cubic to fibrous and plate-like.
Prediction of micro processes, filter cake build-up and porous media flow is a key challenge to describe macroscopic parameters like filter cake resistance. This is based on a precise description, not only of the disperse solid fraction, but of the distributed properties of the voids between the particles. Lab experiments are carried out with alumina and limestone, which differ in particle size distribution (PSD) and resulting filter cake structure. Filter cakes of both materials are characterized by standardized lab tests and, additionally, alumina cakes are measured with X-ray microscopy (XRM). Focusing on distributed process key parameters, the data give a deeper understanding of the laboratory experiments. The solid volume fraction inside the feed strongly influences the particle sedimentation and typically leads to a top layer of fine particles in the final filter cake, which has a negative influence on subsequent process steps. The top layers seal the filter cake for washing liquid and increase the capillary entry pressure for gas differential pressure dewatering. The influence on cake structure can be seen in a change of porosity, particle size and shape distribution over the height of the filter cake. In all measurements, homogeneous filter cake structures could only be achieved by increasing the solid volume fraction inside the suspension above a certain percentage, at which particle-size-related sedimentation effects could be neglected and only zone sedimentation occurred. XRM offers the chance to quantify these effects, which previously could only be described qualitatively.
A more thorough understanding of the properties of bulk material structures in solid–liquid separation processes is essential to better understand and optimize industrially established processes, such as cake filtration, whose process outcome is mainly dependent on the properties of the bulk material structure. Here, changes of bulk properties like porosity and permeability can originate from local variations in particle size, especially for non-spherical particles. In this study, we mix self-similar fractions of crushed, irregularly shaped Al2O3 particles (20 to 90 μm and 55 to 300 μm) to bimodal distributions. These mixtures vary in volume fraction of fines (0, 20, 30, 40, 50, 60 and 100 vol.%). The self-similarity of both systems serves to improve parameter correlation in the case of multimodally distributed particle systems. We use nondestructive 3D X-ray microscopy to capture the filter cake microstructure directly after mechanical dewatering, whereby we give particular attention to packing structure and particle–particle relationships (porosity, coordination number, particle size and corresponding hydraulically isolated liquid areas). Our results reveal widely varying distributions of local porosity and particle contact points. An average coordination number (here 5.84 to 6.04) is no longer a sufficient measure to describe the significant bulk porosity variation (in our case, 40 and 49%). Therefore, the explanation of the correlation is provided on a discrete particle level. While individual particles < 90 μm had only two or three contacts, others > 100 μm had up to 25. Due to this higher local coordination number, the liquid load of corresponding particles (liquid volume/particle volume) after mechanical dewatering increases from 0.48 to 1.47.
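The two particle-level quantities used above, coordination number and liquid load, can be sketched in a few lines. The contact pairs and volumes below are made-up example values, not data from the study.

```python
from collections import Counter

def coordination_numbers(contacts, n_particles):
    """Per-particle coordination number from a list of contact pairs (i, j)."""
    counts = Counter()
    for i, j in contacts:
        counts[i] += 1
        counts[j] += 1
    return [counts[k] for k in range(n_particles)]

def liquid_load(liquid_volumes, particle_volumes):
    """Liquid load per particle: attached liquid volume / particle volume."""
    return [lv / pv for lv, pv in zip(liquid_volumes, particle_volumes)]

# Four particles with four contacts between them (illustrative only).
contacts = [(0, 1), (0, 2), (0, 3), (1, 2)]
cn = coordination_numbers(contacts, 4)
print(cn, sum(cn) / len(cn))  # per-particle values and their average
```

As the abstract notes, the average alone hides the wide per-particle spread, which is why the distribution itself is the more informative quantity.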
In recent years, non-destructive X-ray microscopy (XRM) has become a common method to characterize particle systems in various scientific fields: Besides the size and shape of particles in bulk powders, the insight into filter cake structures provides additional information about micro processes during filtration and dewatering. Distributed particle properties mainly influence the porous network build-up with possible local deviation in vertical and horizontal alignment. This article focusses on the model-based correlation between the distributed particle properties and characteristic network parameters like tortuosity, pore radii and preferred capillaries for dewatering, using tomography data as model input. Therefore, cake-forming filtration experiments were carried out with a down-scaled, self-constructed in-situ pressure nutsch. The entire tomographic dataset consists of seven individual scans at certain desaturation steps at different pressure levels. For the experiments, a lognormally distributed particle system (crushed Al2O3) in the range of 55 to 200 μm inside an aqueous suspension was used, containing additives for contrast enhancement. Image data processing based on reconstructed 360° projections allows the identification of the background, solid particles and liquid phase by a two-step segmentation. The subsequent modelling uses experimentally verified particle size distributions from laser diffraction measurements (integral value), 2D analysis (limited number of particles) as well as tomographic analysis, based on calculated single-particle volumes given by the voxel dataset (all particles within the scanned volume). To characterize the porous network, a developed tetrahedron model is first applied to follow the shortest way through the porous matrix, then again to calculate the widest capillary related to the pore entrance.
Furthermore, with information about the pore throat distribution and the wetting line from the tetrahedron side faces, the force balance is evaluated. This results in an entrance pressure distribution, the capillary pressure curve. Experimental data from filter cakes built according to VDI 2762 and from mercury intrusion tests are taken as the reference for validation.
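Under the common Young-Laplace simplification, the entry pressure of a cylindrical throat is p = 2γ·cos(θ)/r, so a throat-radius distribution maps directly to an entrance pressure distribution. A minimal sketch with illustrative values (surface tension of water, perfect wetting, made-up radii; the tetrahedron-based force balance described above is more general than this):

```python
import math

def entry_pressure(radius_m, gamma=0.072, theta_deg=0.0):
    """Capillary entry pressure from the Young-Laplace equation,
    p = 2*gamma*cos(theta)/r, with gamma in N/m and r in m."""
    return 2.0 * gamma * math.cos(math.radians(theta_deg)) / radius_m

# Illustrative pore-throat radii (m), not values from the dataset.
radii = [10e-6, 25e-6, 50e-6]
pressures = [entry_pressure(r) for r in radii]
print([round(p) for p in pressures])  # -> [14400, 5760, 2880] (Pa)
```

Sorting such per-throat pressures by magnitude yields the modelled capillary pressure curve that is then compared against VDI 2762 and mercury intrusion data.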
The paper takes up the old ideas of describing porous media with several tube and network models. The well-known models from the literature gave a good concept of dewatering equilibria resulting in capillary pressure curves and pore size distributions (PoSD). However, numerical methods and measurement techniques were not sophisticated enough to evaluate the models appropriately. In this work, a numerical method based on statistics is introduced to validate the network model of Fatt from 1956: The porous filter cake structure is implemented as a matrix whose elements represent the pore size correlating with the capillary entrance pressure of each pore. The input for the calculations can be any mathematical approximation of a PoSD, which can be derived from capillary pressure tests or micro computer tomography (μCT) analysis of the filter cake. A procedure based on the concept of Fatt is presented to generate dewatering equilibria for different applied pressures. For this purpose, the elements of the matrix are checked for being 'dewatered' with regard to their size, position, the applied pressure level, and the progress of dewatering. The network model known from the literature is improved by implementing additional conditions for the description of physical phenomena, such as the formation of residual bridge liquid or hydrodynamically isolated areas. X-ray microscopy, mercury intrusion tests and laboratory desaturation experiments using semipermeable membranes for capillary pressure tests are used to validate the pore size distribution. The different results are integrated into the matrix model as starting parameters. For the laboratory experiments, the PoSD is calculated from the measured capillary pressure curve, using the distributed tube model and the Young-Laplace equation on an equal basis to the established mercury intrusion analysis.
With the tomography measurements, however, it is possible to determine the PoSD using different defined geometry elements fitting inside the pore space. The force balance is evaluated at the pore entrance by using the wetting line of the pore throat. The direct measurement of the void geometry allows the calculation of the pressure distribution without the Young-Laplace assumptions. In this way, the difference between experimentally measured and modelled PoSDs is emphasised to validate the old (and improved) ideas of network models describing porous media.
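The matrix procedure can be sketched as a small percolation-style calculation: an element is dewatered only if the applied pressure exceeds its capillary entrance pressure and it is reachable from the gas side through already-dewatered pores. This is a minimal illustration, not the authors' implementation; it assumes perfect wetting, 4-neighbour connectivity, and gas access from the top row only, and it omits the residual-bridge-liquid refinements mentioned above.

```python
GAMMA = 0.072  # N/m, surface tension of water (illustrative)

def p_entry(r):
    """Young-Laplace entry pressure of a throat of radius r, perfect wetting."""
    return 2.0 * GAMMA / r

def dewater(radii, applied_pressure):
    """Fatt-style accessibility check on a matrix of pore radii (m).

    An element drains if its entry pressure is exceeded AND it connects
    to the top boundary through a path of already-drained elements.
    """
    rows, cols = len(radii), len(radii[0])
    open_pore = [[p_entry(radii[i][j]) <= applied_pressure for j in range(cols)]
                 for i in range(rows)]
    dewatered = [[False] * cols for _ in range(rows)]
    stack = [(0, j) for j in range(cols) if open_pore[0][j]]
    while stack:
        i, j = stack.pop()
        if dewatered[i][j]:
            continue
        dewatered[i][j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols and open_pore[ni][nj]:
                stack.append((ni, nj))
    return dewatered

# 3x3 pore-radius matrix (m): the wide bottom row would drain at this
# pressure, but narrow throats above seal it off -- a hydraulically
# isolated region that stays saturated.
radii = [[5e-6, 50e-6, 5e-6],
         [5e-6, 5e-6, 5e-6],
         [50e-6, 50e-6, 50e-6]]
print(dewater(radii, applied_pressure=10000.0))
```

Sweeping the applied pressure over a range and recording the drained fraction at each step reproduces a dewatering equilibrium curve in the spirit of the model.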
The wetting behavior of remaining isolated liquid bridges between particle interfaces determines the efficiency of filter cake dewatering. Micro-processes during and after dewatering can be traced by means of direct X-ray microtomography (ZEISS Xradia 510 Versa) providing insights into the filter cake structure. We measure the local contact angle between the immiscible phases on the pore scale after in-situ filter cake dewatering. By tracing the three-phase contact line and the two perpendicular vectors belonging to the solid and liquid surface, the contact angle is obtained from their scalar product at every mesh-node. The range of the resulting distribution and curvature increases with the degree of roughness, becoming more obvious for larger contact angles. The occurring roughness causes a naturally water-repellent surface and leads to low liquid saturations. The resulting angular distribution serves for a more accurate prediction of multiphase flow in pore networks as input for further pore model enhancement.
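The scalar-product step at each mesh node reduces to basic vector arithmetic. The sketch below only shows that step; how the two perpendicular vectors are extracted from the solid and liquid surface meshes is not reproduced, and the example vectors are arbitrary.

```python
import math

def contact_angle(n_solid, n_liquid):
    """Angle (degrees) between two surface vectors at a mesh node of the
    three-phase contact line, obtained from their scalar product."""
    dot = sum(a * b for a, b in zip(n_solid, n_liquid))
    norm = (math.sqrt(sum(a * a for a in n_solid))
            * math.sqrt(sum(b * b for b in n_liquid)))
    return math.degrees(math.acos(dot / norm))

print(contact_angle((0.0, 0.0, 1.0), (0.0, 1.0, 1.0)))  # ~45 degrees
```

Evaluating this at every node along the traced contact line yields the angular distribution whose spread, per the abstract, grows with surface roughness.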
To get good image segmentation results in tomographic analysis methods, particles have to be separated from each other while being embedded into a matrix. Low X-ray absorbing nano-graphite below the voxel resolution of the system is mixed with the analysis particles to form a shell and avoid direct particle-particle contact. After mixing with epoxy resin, the paste is sucked into a polymeric tube of 2 mm inner diameter. After hardening, the formed cylinder was scanned with an X-ray microscope (ZEISS Xradia Versa 510) with the following parameters: 80 keV, 7 W, 360°, 2001 projections, 1 s exposure time, 2 µm voxel size, binning 2. After scanning, the following filters were applied during reconstruction with the ZEISS reconstructor: Beam Hardening Correction 0.05, Smoothing 0.7. A detailed overview of the parameters is summarized as screenshots within the repository.
Glass Particles (soda-lime glass), particle size distribution 5 to 50 µm
This repository includes the reconstructed TIFF-files from multiple scale tomographic analysis of particles in an embedding matrix (overall volume and 3 sub-volumes in 2 different magnification steps)
including - 1 LARGE sample with low-resolution data (micro-CT, VERSA510), split into I, II, III, IV, V - 3 SUB-samples at MEDIUM resolution (cut from the LARGE sample above, scanned with micro-CT, ZEISS VERSA510) - 3 SUB-samples at HIGH resolution (same as MEDIUM, scanned with nano-CT, ZEISS ULTRA810)
This project is concerned with the description of multidimensional partition curves. When particle collectives are separated, the split into concentrate and refuse fractions seldom depends on a single particle characteristic alone. The aim of this research is to reveal how different separation apparatuses partition particle collectives according to multiple, often independent characteristics.
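A multidimensional partition curve generalises the classical partition number T = (mass reporting to concentrate)/(mass in feed) from one property class to a grid of property classes. A minimal sketch with hypothetical size/density classes and masses (not data from the project):

```python
def partition_numbers(feed, concentrate):
    """Partition number per (size class, density class) cell:
    T(class) = mass to concentrate / mass in feed."""
    return {cls: concentrate.get(cls, 0.0) / m_feed
            for cls, m_feed in feed.items() if m_feed > 0}

# Hypothetical masses per (size, density) class in feed and concentrate.
feed = {("fine", "light"): 10.0, ("fine", "dense"): 10.0,
        ("coarse", "light"): 10.0, ("coarse", "dense"): 10.0}
concentrate = {("fine", "dense"): 8.0, ("coarse", "dense"): 9.0,
               ("coarse", "light"): 3.0}
T = partition_numbers(feed, concentrate)
print(T[("fine", "dense")], T[("coarse", "light")])  # 0.8 0.3
```

Here the separation clearly acts on both properties at once: density dominates, but coarse light particles still partly report to the concentrate, which a one-dimensional partition curve would hide.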
Data for the article "Beschreibung von Trennoperationen mit mehrdimensionalen Partikeleigenschaftsverteilungen" ("Description of separation operations with multidimensional particle property distributions"), to be published in Chemie Ingenieur Technik in 2022.
PARROT stands for an oPen Access aRchive for paRticle discrete tOmographic daTasets (https://parrot.tu-freiberg.de). Although certainly not the most obvious acronym, the parrot is meant to represent the different expressions of particle properties: just as a bird grasps individual, discrete parts of its food with its beak, we would like to provide a repository of a significantly large number of discrete particles from various practically important particle systems. The implementation is a pilot study and makes no claim to completeness, which in this case will never be possible. Rather, it is about the general way of implementation: Which use cases exist in practice, which data and metadata are required for them, and how can this be mapped in a database structure that allows users convenient access and generates added value? All data collections are results of tomographic measurements and form the data basis of the PARROT database.
Tomographic Dataset. Includes reconstructed raw TIFF and the corresponding segmented TIFF stack. Acquisition and reconstruction parameters and particle size distribution of the original particle sample are summarized as image.
Tomographic Dataset. Includes reconstructed raw TIFF and the corresponding segmented TIFF stack. Acquisition and reconstruction parameters and particle size distribution of the original particle sample are summarized as image.
Tomographic Dataset. Includes reconstructed raw TIFF and the corresponding segmented TIFF stack. Acquisition and reconstruction parameters and particle size distribution of the original particle sample are summarized as image.
Tomographic Dataset. Includes reconstructed raw TIFF and the corresponding segmented TIFF stack. Acquisition and reconstruction parameters and particle size distribution of the original particle sample are summarized as image.
Tomographic Dataset. Includes reconstructed raw TIFF and the corresponding segmented TIFF stack. Acquisition and reconstruction parameters and particle size distribution of the original particle sample are summarized as image.
Tomographic Dataset. Includes reconstructed raw TIFF and the corresponding segmented TIFF stack. Acquisition and reconstruction parameters and particle size distribution of the original particle sample are summarized as image.
Supplemental material for presentations at the ESAFORM 2023 conference.
To investigate the impact of melt conditioning and filtration on intermetallic phases in AlSi9Cu3, µCT studies were conducted. The research data obtained show microstructural features as well as the superposition of fatigue cracks with their initial microstructure.
The topic of the "Green City" is currently experiencing a renaissance and most recently received comprehensive attention in the Green Paper published in 2015 by the Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB). These wide-ranging deliberations also include considerations on the quantitative and qualitative development of provision in the area of recreational planning. The existing frameworks in this respect are over 40 years old and, not least against the background of changed conditions, require updating and empirical verification, above all with regard to the presumably quite different quantitative and qualitative requirements that different population groups place on the design of recreation-relevant areas and on their accessibility. The aim of this dissertation, whose digital appendix is accessible via Opara, was to obtain reliable data on age- and user-group-specific catchment areas of parks as well as on age- and user-group-dependent recreation patterns on the basis of comprehensive empirical surveys, and to interpret these data with regard to identified influencing factors. The focus was thus on the systematic recording and interpretation of usage patterns of selected public parks in the city of Dresden, supplemented by an extension of existing area data to include primarily qualitative aspects; hence not a theoretical engagement with the sociological aspects of open-space use, but a hypothesis-driven empirical study.
This data collection contains the digital appendix to the dissertation "Grundlagen für Erholungsplanung in der Stadt. Eine empirische Untersuchung zu Nutzungsmustern ausgewählter öffentlicher Parkanlagen in Dresden." (Seidler 2016). It comprises a map section and supplementary appendices, for example graphics, photo documentation, counting sheets, and documentation of the path-length determination in the parks studied and of the density determination in the surrounding urban areas.
The following data repository contains all software and data models used in the dissertation "Ein ontologiebasiertes Verfahren zur automatisierten Bewertung von Bauwerksschäden in einer digitalen Datenumgebung" (An ontology-based method for the automated assessment of structural damage in a digital data environment). The corresponding abstract of the dissertation is given below: New technologies in the field of building and damage detection lead to an automation of inspection processes and an associated increase in efficiency. However, an adequate digitisation of the recorded building condition into a BIM model is currently not possible without problems. A main cause is the lack of specifications for a digital model that can represent recorded damage. One problem here is fuzziness in the information modelling, which usually does not occur in BIM procedures for new buildings. Fuzzy information, such as the classification of detected damage or the assumption of further hidden damage, is currently evaluated manually by experts, which often requires a laborious evaluation of contextual information in a multitude of distributed building documents. An automated assessment of detected damage based on the building context is not yet implemented in practice. This dissertation presents a concept for representing structural damage in a digital, generically structured damage model. The developed concept offers solutions for problems of current damage modelling, such as the management of heterogeneous documentation data, the versioning of damage objects, or the processing of damage geometry. The modular schema of the damage model consists of a generic core component that enables a general description of damage, independent of specifying factors such as the affected building type or building material.
To define domain-specific information, the core component can be supplemented by corresponding extension schemas. As the preferred serialisation option, the damage model is implemented in a knowledge-based ontology. This allows an automated assessment of the modelled damage and context information using digitised knowledge. For the evaluation of fuzzy damage information, a knowledge-based assessment procedure is presented. The damage assessment system developed here enables a classification of detected damage as well as the inference of implicit assessment information relevant for further maintenance planning. In addition, the procedure allows the assumption of undetected damage that may occur inside the structure or in places that are difficult to access. The ontological assessment considers not only damage characteristics but also information regarding the building context, such as the affected component or material type or the prevailing environmental conditions. Finally, to illustrate the developed specifications and methods, they are applied to two test scenarios.
New technologies in the field of building and damage detection lead to an automation of inspection processes and thus an increase in efficiency. However, an adequate digitalisation of the recorded building data into a BIM model is currently not possible without problems. One main reason for this is the lack of specifications for a digital model that can represent recorded damages. A primary problem here are uncertainties and fuzzy data in information modelling, which usually do not occur when applying BIM to new buildings. Fuzzy information, such as the classification of detected damages or the assumption of further hidden damages, is currently evaluated manually by experts, which often requires a complex evaluation of contextual information in a multitude of distributed building documents. An automated evaluation of detected damages based on the building context is currently not applied or implemented in practice. In this thesis, a concept for the representation of structural damages in a digital, generically structured damage model is presented. The developed concept offers solutions for problems of current damage modelling, e.g. the management of heterogeneous documentation data, the versioning of damage objects, or the processing of damage geometry. The modular scheme of the damage model consists of a generic core component, which allows a general description of damages, independent of specifying factors such as the type of construction or building material concerned. For the definition of domain-specific information, the core component can be supplemented by corresponding extension schemes. As a preferred serialisation option, the damage model is implemented in a knowledge-based ontology. This allows an automated evaluation of the modelled damage and context information using digitised knowledge. For the evaluation of fuzzy damage information, a knowledge-based evaluation procedure is presented.
The developed damage evaluation system allows a classification of detected damages as well as the inference of implicit evaluation information relevant for further maintenance planning. In addition, the method allows the assumption of undetected damages that may potentially occur inside the structure or in places that are difficult to reach. The ontological assessment considers not only damage characteristics but also information regarding the building context, such as the affected component or material type as well as prevailing environmental conditions. To illustrate the developed specifications and methods, the whole concept is applied to two test scenarios.
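As a purely illustrative sketch of the kind of knowledge-based inference described above, the following plain-Python rules combine damage features with building context to classify a damage and infer a hidden-damage hypothesis. All class names, rule conditions, and keys are hypothetical examples, not the thesis ontology.

```python
# Illustrative rule-based inference standing in for the ontological damage
# evaluation; every rule and label here is a made-up example.

def evaluate_damage(damage, context):
    """Classify a detected damage and infer implicit follow-up
    information from damage features plus building context."""
    inferred = set()
    # Rule 1: a linear surface discontinuity on concrete -> crack.
    if damage["pattern"] == "linear" and context["material"] == "concrete":
        inferred.add("Crack")
        # Rule 2: cracks on reinforced members under moist exposure
        # suggest a hidden reinforcement-corrosion risk.
        if context.get("reinforced") and context.get("exposure") == "moist":
            inferred.add("SuspectedReinforcementCorrosion")
    # Rule 3: planar loss of material with exposed aggregate -> spalling.
    if damage["pattern"] == "planar" and damage.get("exposed_aggregate"):
        inferred.add("Spalling")
    return inferred

result = evaluate_damage(
    {"pattern": "linear"},
    {"material": "concrete", "reinforced": True, "exposure": "moist"},
)
```

In the actual system such rules live in the ontology and are applied by a reasoner, so that context from distributed building documents can be exploited uniformly.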
DFG project number: 254872581 (follow-up of project HE2933/8-1). The majority of constitutive models used nowadays to describe the behaviour of granular materials such as sands are continuum models based on phenomenological approaches. In order to describe some of the phenomena occurring on the macroscopic scale, e.g. an abrupt change of stiffness due to a load reversal, these constitutive models use phenomenological state variables (e.g. back stress in elasto-plasticity or the intergranular strain concept in hypoplasticity) which often lack a clear physical meaning. The mechanisms that control the macroscopic behaviour, and as such the different phenomena that can be observed on the continuum scale, must be sought at the grain scale, with the interactions of individual particles playing the key role. X-ray μ-computed tomography (CT) allows for 3D imaging of natural soil samples under various loading conditions and is used in this project. In order to extract information on the structure of the granular material, different image analysis approaches can be used, and their accuracy is evaluated with respect to the limited resolution. Mechanical experiments in the x-ray CT scanner have been carried out on natural sands in the ongoing project. During macroscopic loading, the sand specimens were scanned using a laboratory x-ray scanner in order to assess the grain-scale behaviour in situ and link it with the macroscopic observations. The evolution of the microstructure can be linked to the evolution of the phenomenological variables, e.g. the intergranular strain in hypoplasticity for changes in loading direction, leading to a possible micromechanical enhancement of these concepts.
Establishing a link between micromechanical variables, such as the fabric tensors describing the structure, and the macromechanical observations can not only enhance our understanding of different phenomena occurring on the continuum scale, but also enable an incorporation of these effects into phenomenological approaches in a more straightforward and reliable way.
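A second-order fabric tensor of the kind mentioned above can be sketched in a few lines: it is the average of the outer products of the (unit) contact-normal vectors, and its deviatoric part quantifies anisotropy. The example normals below are assumed toy data, not project measurements.

```python
import numpy as np

def fabric_tensor(normals):
    """F = (1/N) * sum_k n_k (outer) n_k over unit contact normals n_k."""
    n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    return np.einsum("ki,kj->ij", n, n) / len(n)

# Toy case: all contacts aligned with the z-axis -> fully anisotropic fabric.
F = fabric_tensor(np.array([[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]]))

# trace(F) = 1 by construction; the deviatoric part measures anisotropy.
dev = F - np.trace(F) / 3.0 * np.eye(3)
```

For an isotropic contact distribution F approaches (1/3)·I and the deviatoric part vanishes, which is one way such a micromechanical variable can be fed back into phenomenological models.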
This work develops a strategy to benchmark image analysis tools that can be used for the determination of contact fabric from tomographic images. The discrete element method is used to create and load a reference specimen for which the fabric and its evolution are precisely known. Chosen states of this synthetic specimen are turned into realistic images taking into account inherent image properties, such as the partial volume effect, blur, and noise. The application of the image analysis tools to these images validates the findings of the metrological study and highlights the importance of addressing the identified shortcomings, i.e., the systematic over-detection of contacts and the strong bias of orientations when using common watersheds.
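The degradation chain that turns a synthetic binary specimen into a realistic tomographic image can be sketched as a Gaussian point-spread function (producing blur and partial-volume-like intermediate grey values) followed by additive detector noise. Kernel width and noise level below are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Binary image of a single spherical "grain" on a 64^3 voxel grid.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
binary = ((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2).astype(float)

# Blur / partial volume effect: convolution with a Gaussian PSF.
blurred = gaussian_filter(binary, sigma=1.5)

# Detector noise: additive Gaussian noise on the grey values.
noisy = blurred + rng.normal(0.0, 0.03, blurred.shape)
```

Contact-detection tools are then run on images like `noisy`, and their output is compared against the exactly known DEM contact fabric.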
3D-hydronumerical models are growing in popularity throughout various disciplines to achieve a better understanding of complex flow in hydraulic structures. Due to increasing computational resources over the last decades, these models enable us today to economically simulate a broad range of situations where laboratory or analytical approaches are limited. However, an issue with 3D-hydronumerical simulations is the numerous variables to be handled by the user: slight modifications of numerical parameters may lead to differing results. Furthermore, certain constructional designs of fishways were identified in laboratory studies for their unpredictable flow regimes when minor geometrical parameters are modified. These volatile designs might affect the suitability of fishways in situ, even though they comply with common constructional guidelines. Hence, this article presents the evaluation of a total of 10 slightly modified vertical-slot fishway simulations, varied across two different turbulence models (URANS and DES), to identify (un)stable fishway designs. The evaluation indicates that some fishway designs built worldwide are affected in 3D-hydronumerical models by both the constructional modifications and the simulation approach. Unstable designs, with their consequences for fish passage criteria, either have to be avoided in future designs or used deliberately to diversify flow regimes in technically built fishways.
The project goal is the development of innovative techniques for spatio-temporally high-resolution monitoring and small-scale simulation of extreme events. In a cooperation between the chairs of Hydrology, Meteorology, Geoinformatics, and Photogrammetry of TU Dresden, new types of operational monitoring systems will be developed. Existing monitoring networks will be densified using modern low-cost sensors, specific remote sensing data, and geographical information systems. Additionally, historical analyses and predictive modelling of small-scale extreme events under different climate scenarios will help to predict the expected effects of climate change. The developed information will serve as a basis for upcoming early warning systems and future adaptation strategies.
Small-scale and headwater catchments are mostly ungauged, even though their observation could help to improve the understanding of hydrological processes. However, it is expensive to build and maintain conventional measurement networks. Thus, the heterogeneous characteristics and behavior of catchments are currently not fully observed. This study introduces a method to capture water stage with a flexible low-cost camera setup. By considering the temporal signature of the water surface, water lines are automatically retrieved via image processing. The image coordinates are projected into object space to estimate the actual water stage. This requires high-resolution 3D models of the river bed and bank area, which are calculated in a local coordinate system with SfM, employing terrestrial as well as UAV imagery. A medium-scale and a small-scale catchment are investigated to assess the accuracy and reliability of the introduced method. Results reveal that the average deviation between the water stages measured with the camera gauge and a reference gauge is below 6 mm in the medium-scale catchment. Trends of water stage changes are captured reliably in both catchments. The developed approach uses a low-cost camera design in combination with image-based water level measurements and high-resolution topography from SfM. In future, adding tracking algorithms can help to densify existing gauging networks.
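The projection of a detected water-line pixel into object space can be sketched as a ray-surface intersection: the pixel is back-projected through the camera model and intersected with the bank surface from the 3D model, here idealised as a single plane. Camera pose, intrinsics, and the plane are made-up example values, not the study's calibration.

```python
import numpy as np

# Assumed example camera: intrinsics K, identity rotation R, centre C.
K = np.array([[1200.0, 0.0, 960.0],   # focal length / principal point [px]
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
C = np.zeros(3)

def intersect_ray_with_surface(pixel, plane_point, plane_normal):
    """Back-project `pixel` to a viewing ray and intersect it with a plane
    approximating the bank surface; in a georeferenced model the vertical
    coordinate of the returned point gives the water stage."""
    d = R.T @ np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    t = ((plane_point - C) @ plane_normal) / (d @ plane_normal)
    return C + t * d

# Ray through the principal point hitting a plane 5 units from the camera.
p = intersect_ray_with_surface(np.array([960.0, 540.0]),
                               plane_point=np.array([0.0, 0.0, 5.0]),
                               plane_normal=np.array([0.0, 0.0, 1.0]))
```

In practice the SfM mesh replaces the single plane, but the per-pixel geometry is the same ray intersection.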
In this paper an automatic approach is proposed to measure flow velocity with an uncooled thermal camera. Hot water is used as thermal tracer. The introduced tracking algorithm utilizes the pyramidal Lucas-Kanade method and is especially suitable for thermal image data. The performance of the new tool is compared to traditional image-based tracking tools, i.e. PIVlab and PTVlab. Experiments are performed in the laboratory for three different flow velocities. Afterwards, tests are conducted in a small stream to illustrate the suitability of the tool for field measurements. Results of the laboratory experiments as well as of the field experiments show that our tracking algorithm, applied to imagery from a thermal camera, outperforms commonly used tracking methods. Our tool provides velocity fields with very high resolution and is in close agreement with reference measurements, whereas PTVlab and PIVlab tend to overestimate and underestimate flow velocities, respectively.
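At each pyramid level and window, the Lucas-Kanade method mentioned above reduces to solving a 2x2 least-squares system built from spatial and temporal image gradients. The following minimal, single-window, non-pyramidal sketch illustrates only that core step on synthetic "thermal" frames; it is not the published tool.

```python
import numpy as np

def lucas_kanade_window(frame1, frame2):
    """Estimate one flow vector (u, v) for the whole window by solving
    the Lucas-Kanade normal equations A v = b."""
    Iy, Ix = np.gradient(frame1)          # spatial gradients (axis 0 = y)
    It = frame2 - frame1                  # temporal gradient
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)          # flow (u, v) in pixels

# Synthetic warm tracer blob moving 0.5 px in x between two frames.
yy, xx = np.mgrid[0:64, 0:64].astype(float)

def blob(cx):
    return np.exp(-((xx - cx) ** 2 + (yy - 32.0) ** 2) / (2 * 8.0 ** 2))

u, v = lucas_kanade_window(blob(30.0), blob(30.5))
```

The pyramidal variant repeats this step from coarse to fine image resolution so that displacements larger than about one pixel can be recovered.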
An automatic workflow is introduced, including an image-based tracking tool, to measure surface flow velocities in rivers. The method is based on PTV and comprises an automatic definition of the search area for particles to track. Tracking is performed in the original images. Only the final tracks are geo-referenced, intersecting the image observations with the water surface in object space. Detected particles and corresponding feature tracks are filtered considering particle and flow characteristics to mitigate the impact of sun glare and outliers. The method can be applied to different perspectives, including terrestrial and aerial (i.e. UAV) imagery. To account for camera movements, images can be co-registered in an automatic approach. In addition to velocity estimates, discharge is calculated using the surface velocities and the wetted cross-section derived from surface models computed with SfM and multi-media photogrammetry. The workflow is tested at two river reaches (paved and natural) in Germany. Reference data is provided by ADCP measurements. At the paved river reach, the highest deviations of flow velocity and discharge reach 5% and 4%, respectively. At the natural river, deviations are larger (26% and 20%, respectively) due to the irregular cross-section shapes hindering an accurate comparison of ADCP- and image-based results. The provided tool enables the measurement of surface flow velocities independently of the perspective from which images are acquired. With the contact-less measurement, spatially distributed velocity fields can be estimated and river discharge in previously ungauged and unmeasured regions can be calculated.
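The discharge step described above follows the classical velocity-area principle: surface velocities are scaled to depth-averaged velocities and integrated over the wetted cross-section. The sketch below uses a common surface-to-mean factor of 0.85 and made-up segment measurements; both are illustrative assumptions, not the paper's data.

```python
def discharge(widths, depths, surface_velocities, k=0.85):
    """Velocity-area method: Q = sum_i k * v_surf_i * d_i * w_i over
    cross-section segments, with k scaling surface to mean velocity."""
    return sum(k * v * d * w
               for w, d, v in zip(widths, depths, surface_velocities))

# Three assumed segments (widths in m, depths in m, velocities in m/s).
Q = discharge(widths=[1.0, 1.0, 1.0],
              depths=[0.5, 1.0, 0.5],
              surface_velocities=[0.4, 0.6, 0.4])
```

In the workflow, the depths come from the SfM / multi-media photogrammetry surface models and the surface velocities from the PTV tracks.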
A workflow is introduced to automatically measure water stages based on image measurements using deep learning. So far, most camera gauges do not provide the robustness needed to achieve accurate water stage measurements under changing environmental conditions. The suggested novel approach is based on two CNNs (i.e. FCN and SegNet) to identify water in imagery. The image information is transformed into metric water level values by intersecting the extracted water contour with a 3D model. The workflow allows for the densification of river monitoring networks based on low-cost camera gauges in various scenarios.
The data belong to the publication Wordell-Dietrich, P., Don, A., Wotte, A., Rethemeyer, J., Bachmann, J., Helfrich, M., Kirfel, K., & Leuschner, C. (2019). Vertical partitioning of CO2 production in a forest soil. Biogeosciences Discussions, April, 1–27. https://doi.org/10.5194/bg-2019-143. The dataset contains: 1) measurements of volumetric water content, soil temperature, carbon dioxide in three soil profiles in a beech forest in Germany 2) isotopic data of carbon dioxide in the soil atmosphere 3) diffusion parameters Detailed information on methods and calculations can be found in the publication.
The collection contains data on volumetric water content, soil temperature, and carbon dioxide concentration from three subsoil observatories. It also contains data on soil respiration measurements, isotopic data of the soil atmosphere, and diffusion parameters for CO2 flux modelling. The file "column description" contains information on the headings and units for each data file.
Goal: To study the complex interactions between vegetation and the composition, spatial distribution, and capability of the soil bacterial community to produce extracellular polymeric substances (EPS), and the effects of these EPS on the stability of soil aggregates. (Funded by the Deutsche Forschungsgemeinschaft (DFG) - project number 316446092)
Data / results obtained from the first sampling campaign on the selected sites in Rambla Honda and near the town of Alboloduy, Almeria, Spain.
The final article can be found via https://doi.org/10.3389/fenvs.2020.00051
Data / results obtained from the second sampling campaign on the selected sites near Tabernas (Rambla Honda) and near the town of Alboloduy, Almeria (Spain).
Data belonging to the paper "Identifying and quantifying geogenic organic carbon in soils – the case of graphite", published in SOIL journal (Copernicus). The research paper can be accessed via: https://doi.org/10.5194/soil-2019-30
Data underlying the figures of the research paper: "Identifying and quantifying geogenic organic carbon in soils – the case of graphite" (SOIL Copernicus journal) https://doi.org/10.5194/soil-5-1-2019
For detailed data description, please refer to: Zeh, L., Igel, T. M., Schellekens, J., Limpens, J., Bragazza, L., & Kalbitz, K. (2020). Vascular plants affect properties and decomposition of moss-dominated peat, particularly at elevated temperatures. Biogeosciences Discussions, 1-29.
Supplementary data to the publication: Wanninger, L., Heßelbarth, A. (2020): GNSS code and carrier phase observations of a Huawei P30 smartphone: quality assessment and centimeter-accurate positioning; GPS Solutions 24:64, https://doi.org/10.1007/s10291-020-00978-z (open access). The data set consists of GNSS observations of a Huawei P30 smartphone. They were collected in 8 static observation sessions with an overall duration of 77 h. Each session data set includes raw data gathered by GNSSLogger and RINEX files for the Huawei P30 and a local reference station. The data sets are stored in two separate zip archives.
Supplementary data to the publication: Wanninger, L., Heßelbarth, A. (2020): GNSS code and carrier phase observations of a Huawei P30 smartphone: quality assessment and centimeter-accurate positioning; GPS Solutions 24:64, https://doi.org/10.1007/s10291-020-00978-z (open access). The data set consists of GNSS observations of a Huawei P30 smartphone. They were collected in 8 static observation sessions with an overall duration of 77 h. Each session data set includes raw data gathered by GNSSLogger and RINEX files for the Huawei P30 and a local reference station. The data sets are stored in two separate zip archives. This archive contains the RINEX observations of the Huawei P30, RINEX observations of the local reference station "BZW1" equipped with SEPT POLARX5 / JAVRINGANT_DM JVDM, broadcast ephemerides in RINEX format, and antenna corrections in ANTEX format. All RINEX files containing Huawei P30 observations use the station name "HUAW". The five rooftop sessions with different Huawei P30 orientations were used to calibrate the phase center of the smartphone. This was repeated with the DRB2 rotational device, which enables observations in four azimuthal orientations per minute. Our results of the Huawei P30 phase center calibration, valid for the setup shown in the paper (Fig. 1), are found in the ANTEX directory. The two field sessions are complemented with additional observations by a LEICA GRX1200+ GNSS / NAX3G+C NONE at the same stations as used by the Huawei P30. In field session 1 the station is named "1000"; in the second session it is called "2000".
Supplementary data to the publication: Wanninger, L., Heßelbarth, A. (2020): GNSS code and carrier phase observations of a Huawei P30 smartphone: quality assessment and centimeter-accurate positioning; GPS Solutions 24:64, https://doi.org/10.1007/s10291-020-00978-z (open access). The data set consists of GNSS observations of a Huawei P30 smartphone. They were collected in 8 static observation sessions with an overall duration of 77 h. Each session data set includes raw data gathered by GNSSLogger and RINEX files for the Huawei P30 and a local reference station. The data sets are stored in two separate zip archives. This archive contains the GNSSLogger raw data of the Huawei P30 GNSS measurements for all 8 sessions.
Use of U-Pb dating of zircons and other methods, such as EMMA and whole-rock geochemistry, to determine the provenance of the aeolian components, especially the zircons, in cover beds in the southwestern USA. From the origin of the zircons, an attempt will be made to reconstruct palaeo-wind directions. It must be verified whether the geologically well-established method of zircon dating is applicable here, since geologists regard the method as reliable only for ages above about 100 Ma. Owing to the enormous half-lives of the uranium isotopes, younger zircons lack the mutual confirmation of the two decay systems. Down to about 1 Ma, this can be secured via a thorium correction. The study area can be divided into two major units: the Great Basin as an endorheic basin and the Colorado Plateau. River systems are not relevant in this (arid/semi-arid) area, and no fluvial transport is discernible. Models of current and palaeoclimate research construct a westerly wind in the area, which is assumed to have crossed the Cordillera, more than 4000 m high. However, other archives in the area and several huge palaeolakes (Lahontan and Bonneville) demonstrate that more moisture reached the region. This moisture cannot have been brought in over a 4000 m high mountain range.
All measured ages, used and published.
"Geovisual analysis of VGI for understanding people’s behaviour in relation to multi-faceted context" Volunteered Geographic Information (VGI) in the form of actively and passively generated spatial content offers extensive potential for a wide range of applications. Realising this potential however requires methods which take account of the specific properties of such data, for example its heterogeneity, quality, subjectivity, spatial resolution and temporal relevance. The creation and production of such content through social media platforms is an expression of human behaviour, and as such influenced strongly by events and external context. In this project we will develop geovisual analysis methods which show how actors interact in LBSM, and how their interactions influence, and are influenced by, their physical and social environment and relations.
This collection contains Supporting Information for the publication "From sunrise to sunset - Exploring landscape preference through global reactions to ephemeral events captured in georeferenced social media" (PLOS). Abstract: Events profoundly influence human-environment interactions. Through repetition, some events manifest and amplify collective behavioral traits, which significantly affects landscapes and their use, meaning, and value. However, the majority of research on reaction to events focuses on case studies, based on spatial subsets of data. This makes it difficult to put observations into context and to isolate sources of noise or bias found in data. As a result, inclusion of perceived aesthetic values, for example, in cultural ecosystem services, as a means to protect and develop landscapes, remains problematic. In this work, we focus on human behavior worldwide by exploring global reactions to sunset and sunrise using two datasets collected from Instagram and Flickr. By focusing on the consistency and reproducibility of results across these datasets, our goal is to contribute to the development of more robust methods for identifying landscape preference using geo-social media data, while also exploring motivations for photographing these particular events. Based on a four-facet context model, reactions to sunset and sunrise are explored for Where, Who, What, and When. We further compare reactions across different groups, with the aim of quantifying differences in behavior and information spread. Our results suggest that a balanced assessment of landscape preference across different regions and datasets is possible, which strengthens representativity and supports exploring the How and Why in particular event contexts. The process of analysis is fully documented, allowing transparent replication and adaptation to other events or datasets.
Green in the City – Information and Navigation for Urban Green Spaces – Data Collection, Service Infrastructure, and the "meinGrün" App
This collection includes Supplementary Materials from the IfK results for meinGrün, made available on mCloud: 1. Metadata Dresden dataset 2. Metadata Heidelberg dataset 3. Use Case Web Portal Heatmap "Großer Garten"
Release files for the published paper.
The MATLAB code can be used to simulate an underwater laser triangulation system. Editing the input parameters in the script suimulation.m will generate plots of the system, including the resulting measurement image.
Measurement images from an experiment with an underwater triangulation sensor.
In “Artificial Intelligence for Cold Regions” (AI-CORE) we will develop a collaborative approach for applying Artificial Intelligence (AI) methods in earth observation and thereby breaking new ground for researching the cryosphere. Rapidly changing ice sheets and thawing permafrost are big societal challenges, hence quantifying these changes and understanding the mechanisms are of major importance. Given the vast extent of polar regions and the availability of exponentially increasing satellite remote sensing data, intelligent data analysis is urgently required to exploit the full information in satellite time series. So far, extensive competences in data science, AI implementation, and processing infrastructures are decentralized and distributed among the individual Helmholtz centers. In the era of big data, cloud computing and extensive earth observation programs, a core challenge is to establish and consolidate a joint platform for AI applications by combining existing competences and infrastructures with new developments serving especially AI applications. Four geo-scientific use cases from cryosphere research will be used to demonstrate the new collaborative AI approach. These use cases are challenging due to diverse, extensive, and inhomogeneous input data and their high relevance is given in the context of climate change. To address these case studies, several AI methods will be developed, tested, evaluated, and implemented in the data processing infrastructure of the project members by combining all distributed capabilities into a joint platform. A “best practice” approach will be identified to solve each of the individual research questions. Once established, this knowledge and the AI-CORE platform can be used even beyond the exemplary use cases to address current and upcoming challenges in data processing, management, data science, and big data. The experience of this collaborative approach will be of very high value for the next research program PoF IV. 
Furthermore, the networking and knowledge exchange among AI-CORE members will facilitate synergies between methods-based research and direct AI applications beyond the immediate use cases.
Outlet glaciers in Greenland experience a combination of seasonal and climate-driven change. Nearby glaciers exhibit very different retreat and advance behavior despite being situated in similar climatic conditions. This highlights the demand to essentially improve our understanding of the driving mechanisms and to provide a basis for parameterizations of oceanic forcing that are fed into mass-loss projections. Temporal changes of glacial flow velocities are presumably linked to the evolution of the subglacial hydrological system. Depending on the type of subglacial system, the temporal acceleration of the glacier is represented by different characteristics. While this is typically investigated only along a central flow line, the spatial distribution contains more information on the cause of the acceleration. In a similar way, the spatial pattern of acceleration due to changes at the calving front is likely driven by upstream propagation of changes in stresses. Hence, understanding the mechanisms in detail requires an analysis of different physical variables in high temporal and spatial resolution and combination with ice modelling. With the new generation of satellites the era of big data has started in glaciology, and new efficient methods to analyze change patterns are required.
Goal: Characterise the influence of aquifer properties and external stresses on DNAPL source zone architecture by deriving transformation techniques to convert complex to effective source geometries through a combination of laboratory-scale experiments and numerical modelling. Partners are the Helmholtz-Centre for Environmental Research-UFZ, the Indian Institute of Technology Delhi, the Department of Civil and Environmental Engineering at the University of Illinois Urbana-Champaign, U.S., and the Faculty of Civil, Architectural and Environmental Engineering at the University of Texas at Austin, U.S.
This data collection includes the Python script used for image processing and analysis as described in the article "Quantification of uncertainties from image processing and analysis in laboratory-scale DNAPL migration experiments evaluated by reflective optical imaging" by Engelmann et al. submitted to Journal "Water" in 2019. Exemplary raw images generated from laboratory-scale tank experiments for DNAPL migration are included as well.
The DFG basic research project "Regen als Grundwassertracer" (rain as a groundwater tracer) aims to scientifically validate the use of precipitation water as a groundwater tracer. To this end, the project investigates and models the extent to which the combined consideration of the inherent properties of natural waters, e.g. stable isotope signature, ionic composition, and temperature, can be used for targeted detection. This multi-tracer approach should make it possible to reliably characterise aquifers, including their structural set-up, using a toxicologically and ecologically very benign method. For this purpose, comprehensive analyses of the transport behaviour are carried out both at small scale and at larger scale under field conditions. The investigations focus primarily on assessing possible factors influencing the "stability" of the tracer signal during subsurface passage. This is done primarily in the laboratory by means of one- and multi-dimensional flow-through experiments combined with long-term batch experiments. Not only sediment-induced changes of the isotope and ion composition and structure-related transport and mixing processes are assessed for their relevance, but also chemical precipitation and dissolution reactions as well as density- and viscosity-related effects resulting from the water injection and the associated temperature change. In addition, the influence of the injection mode of the tracer water into the hydrogeological subsurface, and of natural fluctuations, on the transport behaviour is investigated at field scale. Beyond the qualitative and quantitative analysis of the individual influencing factors mentioned above at laboratory and field scale, a model-based assessment of the overall effect of superimposed processes is also carried out.
Stable isotope analysis is widely used in environmental tracer studies, e.g. for groundwater flow and discharge quantification. In this context, this study presents an inexpensive approach for the combined use of deuterium (2H) and oxygen-18 (18O) as active semiartificial groundwater tracers by a direct injection of snowmelt into aquifers. This dual isotope approach takes advantage of isotope signature differences between typical groundwater and precipitation water. The aim of this study is the experimental demonstration at laboratory and field scale. For this, two column flow experiments were performed using δ2H and δ18O values of snowmelt for breakthrough detection. The differences of the isotope signature between the snowmelt and groundwater were ∆(δ2H) ≈ 61.0 ‰ and ∆(δ18O) ≈ 8.2 ‰. Breakthrough was observed to be almost congruent with a sodium chloride tracer, indicating conservative transport. The low electrical conductivity (EC) of snowmelt (45 µS/cm, i.e. ∆EC ≈ 486 µS/cm relative to groundwater) was used as an additional easy-to-measure breakthrough indicator. However, the snowmelt EC breakthrough suffered from a slight retardation due to ion exchange. Based on these results, a push-drift-pull tracer test with snowmelt, additionally labeled with uranine, was realized at the field site Pirna, Germany. In the pull phase, a significant isotopic depletion was observed with peak differences of ∆Peak(δ2H) ≈ 24.2 ‰ and ∆Peak(δ18O) ≈ 3.2 ‰, which equals approx. 40 % of the initial difference. The isotope breakthrough was almost the same as the breakthrough of uranine, indicating conservative behavior, while the EC breakthrough was again affected by ion exchange.
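The "approx. 40 %" recovery quoted above follows from a standard two-endmember linear mixing calculation: the tracer fraction in a sample is its delta-value offset from groundwater divided by the full groundwater-snowmelt offset. The sketch below uses the δ2H differences reported in the abstract (24.2 ‰ peak difference out of a 61.0 ‰ initial offset), with the groundwater endmember set to 0 ‰ for illustration.

```python
def tracer_fraction(delta_sample, delta_groundwater, delta_tracer):
    """Two-endmember linear mixing:
    f = (d_sample - d_gw) / (d_tracer - d_gw)."""
    return (delta_sample - delta_groundwater) / (delta_tracer - delta_groundwater)

# Peak difference of 24.2 permil against an initial 61.0 permil offset
# (values expressed relative to the groundwater endmember):
f = tracer_fraction(delta_sample=-24.2,
                    delta_groundwater=0.0,
                    delta_tracer=-61.0)
# f is approx. 0.40, i.e. the ~40 % recovery stated in the abstract.
```

The same formula applies to δ18O (3.2 ‰ of 8.2 ‰, again roughly 40 %), which is what makes the dual-isotope breakthrough internally consistent.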
The goal of the project is the research and development of a holistic system for the construction and operation of small group wastewater treatment plants with combined infiltration of the treated wastewater and of the accruing rainwater. The infiltration is to be dimensioned taking into account the site-specific hydrogeological and ecological conditions.
This data collection includes the Python script used for model data preparation, processing and assessment as described in the article "Evaluation of Decentralized, Closely-Spaced Precipitation Water and Treated Wastewater Infiltration" by Händel et al. submitted to Journal "Water" in 2018. Python script input data as generated from Hydrus 2D/3D models as well as resulting plots as used in the previously mentioned article are included.
The junior research group addresses three sets of questions. A historical-architectural complex deals with the investigation and communication of interactions between the urban landscape and its depiction, using the architectural history of the city of Dresden in the 20th century as an example. Linked to this is a second, methodological complex, which addresses the related research-methodological requirements for digital image and plan repositories and the technical support options derived from them. Building on this, an informational-technical complex addresses needs-oriented information modelling and its technical implementation, using the Deutsche Fotothek as an example. The goal of the photogrammetric investigations is the development of automated workflows for the geo- and time-referencing of historical photographs of the SLUB Fotothek based on metadata and image features, and the creation of a photogrammetric 3D city model. This model is to be evaluated with respect to the achievable accuracy, model complexity, and quality. Furthermore, methods for feature extraction and pose estimation for AR display, as well as interfaces to AR/4D browsers, are to be provided. Further information can be found on the project website http://urbanhistory4d.org/wordpress/.
This dataset contains eight triples of historical images for four different sights. Images were chosen with respect to their possible matching quality. The images show combined differences in illumination, field of view, viewpoints, blurring, and slight rotation. Some of the images show building reflections in water or extreme shadowing. After digitization, the images are saved in full quality as *.tif files with a maximum side length of 3543 pixels. Since the interior orientation could not be determined for all image triples, the trifocal tensor is provided, calculated using Ressl's method (Ressl, 2003). Additional metadata, a copyright disclaimer, and permalinks are provided in License.txt. The purpose of the dataset is the evaluation of different feature detection and matching methods using the given orientation via the trifocal tensor. Point transfer can be calculated using the equation on p. 382 of Multiple View Geometry in Computer Vision (Hartley and Zisserman, 2003). Another method uses the corrected fundamental matrices calculated from the trifocal tensor via eq. 15.8 on p. 374 of the same book. Ressl, C., 2003. Geometry, constraints and computation of the trifocal tensor. TU Wien. Hartley, R. and Zisserman, A., 2003. Multiple view geometry in computer vision. Cambridge University Press.
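The point-transfer equation referenced above (Hartley and Zisserman, p. 382) can be sketched in a few lines of NumPy. This is an illustrative sketch, not part of the dataset; the tensor layout `T[i, j, k] = T_i^{jk}` and the function name are assumptions:

```python
import numpy as np

def transfer_point(T, x1, line2):
    """Transfer a point from view 1 to view 3 via the trifocal tensor.

    T     : (3, 3, 3) array, assumed layout T[i, j, k] = T_i^{jk}
    x1    : homogeneous point in the first image
    line2 : any line through the corresponding point in the second image
            (but not the epipolar line)
    Returns the homogeneous point x''^k = x^i l'_j T_i^{jk} in the third image.
    """
    x3 = np.zeros(3)
    for i in range(3):
        # sum over j: l'_j T_i^{jk}, then weight by x^i
        x3 += x1[i] * (T[i].T @ line2)
    return x3
```

A convenient choice for `line2` is the cross product of the measured point in the second image with an arbitrary second point, which yields a line through it.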
Supplementary image dataset of the ISPRS International Journal of Geo-Information publication "Fully Automated Pose Estimation of Historical Images in the Context of 4D Geographic Information Systems Utilizing Machine Learning Methods". Historical image benchmark dataset including feature matches and four scene reconstructions. The dataset can be used for testing and evaluating feature matching methods on exclusively historical images. Additionally, the dataset can easily be extended, as all tie point information is provided.
The ListDB provides a uniform standard for collecting and describing video-based traffic observations, e.g. using drones. This project provides selected data sets, i.e. recorded videos including metadata. || (German) Publication: https://nbn-resolving.org/urn:nbn:de:bsz:14-qucosa2-833252 || ListDB Gitlab: w3id.org/listdb
Selected videos including metadata from traffic observations according to the ListDB format (w3id.org/listdb/). Videos were collected in Dresden, Germany, and show the traffic at selected intersections.
Assessment of infrastructural, traffic-related, and network-planning characteristics in accident models, and estimation of the influence of objective and subjective traffic safety on the route choice of cyclists.
This is supplementary material for the article “Blockchain technology in operations & supply chain management: a content analysis” published in Logistics Research in 2021.
This data is supplementary material to the article "Production planning and scheduling in multi-factory production networks: a systematic literature review".
This appendix provides further information on the comparison and evaluation of the identified maturity models. In addition, the three questionnaires used in the case study are included.
The joint project "Diffusion of Digital Technologies in Vocational Education and Training through Learning Venue Cooperation" (DiBBLok) examines the digitalization of the learning venues vocational school and training company, with a particular focus on learning venue cooperation in vocational training, and contributes to a better understanding of digitalization processes in vocational education practice.
The DiBBLok project data collection contains research data of the Media Center (MZ) of TU Dresden and of the Chair of Educational Technology of the Faculty of Education of TU Dresden, which, in independent work packages and with methodologically independent approaches, studied institutions of vocational education within the BMBF-funded research project DiBBLok between 2019 and 2022. The archive 'Datenarchiv-DiBBLok-MZ' contains the data of the survey conducted by the Media Center of TU Dresden. The archive 'Datenarchiv-DiBBLok-BT' contains a selection of data from the case studies carried out at vocational schools across Germany by the Educational Technology group of TU Dresden. In accordance with the guidelines of OpARA, the research data archive of TU Dresden, only datasets collected in anonymized form were selected.
The DFG project "Concepts of Security in Antiquity" investigated the value attached to the notion of security in archaic and classical Greek history as well as in the transition period from the Roman Republic to the Principate. The overall project was led by Prof. Dr. Martin Jehne. Project homepage: https://tu-dresden.de/gsw/phil/ige/ag/forschung/ehemalige-projekte/dfg-projekt-sicherheit.
Within the project "Concepts of Security in Antiquity", a comprehensive database was compiled under the direction of Daniel Pauling, containing all source passages from the Greek archaic and classical periods in which the keyword ἀσφάλεια (asphaleia) is used, a term rendered in German as "Sicherheit" (security). Pauling's dissertation "Ἀσφάλεια. Die Entwicklung der Sicherheitsvorstellungen und der Diskurs über Sicherheit im archaischen und klassischen Griechenland" was based on this research database. It is freely available via the open-access server Qucosa as well as in the thesis repository of the SLUB Dresden. (Link to follow.) The source passages were collected, archived, and intensively indexed on several levels. This was originally done in the MS Access 2016 file format (Quellensammlung_Asphaleia_v1.0.1.accdb). This database is made freely available to the research community here. To ensure sustainable and long-term usability, the Access database has also been published as a valid TEI XML 1.0 document (Quellensammlung_Asphaleia_v1.0.1.xml). Access to the database in both file formats is facilitated by the documentation in PDF format (Quellensammlung_Asphaleia_BeschreibungUndSchlagwortverzeichnis.pdf; Acrobat Reader or similar is required), which describes and explains all fields and XML elements of the database as well as the possible values of the records contained in them.
Media supplement of the magazine ARRAY2019
For the 2019 issue of Array, we focus on the idea of "Agency" in electronic and computer music, explored through the artistic and theoretical reflections of composers, performers, engineers, and musicologists. How do algorithms and artificial intelligences create a particular character through the decisions they make? How do we interpret the intention of nonhuman agents in the process of musical creation and analysis? Is it possible to tell the difference between the intention of outside agencies and a projection of our own biases? How does the surrounding context integrate into the work itself? The writings present a collection of contemporary approaches and perspectives from the field, examining topics ranging from the agency of digital signal processing and sonic analysis algorithms to the design of inclusive instrument systems, object-based composition, and relational aesthetics. This issue is accompanied by a set of media examples: http://dx.doi.org/10.25532/OPARA-45
Media supplement to ARRAY2019, magazine of the International Computer Music Association (ICMA)
Replication data for the publication "The dirty work of boundary maintenance: Der Topos der sicheren Grenzen im neurechten Diskurs"
CARInA is a German-language speech corpus containing speech material of the German Spoken Wikipedia Corpus. It is organized by completeness and speakers. The folder "Complete" contains all speech material which is annotated at the orthographic, morphosyntactic, broad phonetic, narrow phonetic, and prosodic speech levels. The folder "WorkInProgress" contains all material with at least one incomplete annotation level.
In the project "Functional Safety of Modular Plants", the static discipline of functional safety is aligned with the dynamic aspects of modular automation through the development of procedure models and safety concepts.
Safety lifecycle models describe activities and procedures for the development and operation of safety-related devices and systems in order to avoid systematic errors. This publication describes the development and evaluation of safety lifecycle models for process modules and modular plants. The development is based on a document analysis of the standard series DIN EN 61508 and DIN EN 61511. The evaluation was carried out in several expert workshops.
Supplementary material for a journal article in MDPI Applied Sciences.
This repository contains the measurement traces for the paper "Comparison of UPF acceleration technologies and their tail-latency for URLLC".
High-speed processes can lead to significant technological advantages such as increased formability, reduced springback, or an improved quality of cutting edges. For conventional forming processes, quasi-static conditions are a good approximation and numerical process optimisation is state of the art. However, there is still a need for research in the field of material characterisation for high-speed forming and cutting processes. Production technologies with high velocities lead to high strain rates, and the dependency of strain hardening and failure behaviour on the forming velocity cannot be neglected. Therefore, data on the material behaviour at high strain rates are required for modelling high-velocity processes. The challenge here is the measurement of relevant process quantities, owing to the short process time, which requires a very high sampling rate, and to the limited size and accessibility of the specimen. In this context, an inverse method for determining material characteristics at high strain rates was developed. The approach is to measure auxiliary test parameters, which are easier to obtain, and to use them as input data for an inverse numerical simulation. Two devices were implemented for different ranges of strain rates: a pneumatically driven device for strain rates up to 1,000 1/s and an electromagnetically driven accelerator for strain rates up to 100,000 1/s. The developed method is presented in detail in [1]. [1] Psyk et al.: Determination of Material and Failure Characteristics for High-Speed Forming via High-Speed Testing and Inverse Numerical Simulation. https://doi.org/10.3390/jmmp4020031
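The core of such an inverse identification — adjusting model parameters until simulated quantities reproduce the measured ones — can be sketched with a generic rate-dependent flow-stress law. This is an illustrative least-squares fit (here with a Cowper-Symonds rate term), not the authors' actual FE-based procedure; all names, starting values, and bounds are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def cowper_symonds(strain_rate, sigma0, C, p):
    """Quasi-static flow stress sigma0 scaled by a Cowper-Symonds rate term."""
    return sigma0 * (1.0 + (strain_rate / C) ** (1.0 / p))

def identify(strain_rates, measured_stress, x0=(200.0, 1000.0, 4.0)):
    """Inverse identification: find (sigma0, C, p) minimizing the residual
    between measured and modelled flow stress."""
    res = least_squares(
        lambda x: cowper_symonds(strain_rates, *x) - measured_stress,
        x0, bounds=([1.0, 1.0, 1.0], [1e4, 1e7, 20.0]))
    return res.x
```

In the real procedure the forward model is a numerical process simulation rather than a closed-form law, but the optimization loop has the same shape.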
LAMpAS technology satisfies the increasing demand for products with novel surface performance at an affordable cost by combining high-power ultrashort laser sources with new optical concepts for fast materials processing with minimal heat impact on the workpiece. Inspired by natural surfaces, LAMpAS offers well-defined surface patterns with controlled length scales and feature sizes smaller than 1 µm that can provide advanced surface functions. LAMpAS technology can provide a wide range of surfaces with novel functionalities.
Recently, monitoring systems have become crucial components in industrial-scale laser machines to increase process reliability and efficiency. Particularly, monitoring methods have the potential to optimize and ensure the quality of laser surface patterning by indirectly characterizing the surface topography. Here, a diffraction measurement system, based on scatterometry, is used to determine the mean depth of laser-induced periodic surface structures (LIPSS) on stainless steel by analyzing the characteristics of the resulting diffraction patterns. To this end, LIPSS were produced with a ps-pulsed laser system operating at a wavelength of 1064 nm. The results reveal that the mean depth of LIPSS can be extracted from the intensity of the captured diffraction orders down to approximately 14 nm. This compact monitoring tool can be easily adapted to industrial-scale laser systems to improve the quality control and stability of surface microtexturing processes.
The combination of direct laser interference patterning (DLIP) with laser-induced periodic surface structures (LIPSS) enables the fabrication of functional surfaces reported for a wide spectrum of materials. The process throughput is usually increased by applying higher average laser powers. However, this causes heat accumulation impacting the roughness and shape of produced surface patterns. Consequently, the effect of substrate temperature on the topography of fabricated features requires detailed investigations. In this study, steel surfaces were structured with line-like patterns by ps-DLIP at 532 nm. To investigate the influence of substrate temperature on the resulting topography, a heating plate was used to adjust the temperature. Heating to 250 °C led to a significant reduction of the produced structure depths, from 2.33 to 1.06 µm. The reduction is associated with the appearance of a different LIPSS type, depending on the grain orientation of the substrates and laser-induced superficial oxidation. This study revealed a strong effect of substrate temperature, which is also to be expected when heat accumulation effects arise from processing surfaces at high average laser power.
Metallic samples with unique micro- and nano-scale surface structures can easily be fabricated with Direct Laser Interference Patterning. As in all laser processes, the material interacts with the laser radiation and, as a result, thermal effects occur. These effects have a significant influence on the resulting quality of the surface patterns. In this study, the thermal effects occurring during Direct Laser Interference Patterning of stainless steel and aluminum sheets are investigated. The experimental setup consisted of a picosecond pulsed laser source operating at 532 nm wavelength, combined with a two-beam interference optical head. An infrared camera in an off-axis position is used to detect the thermal radiation of the laser process while varying process parameters such as laser power and repetition rate. The obtained results reveal a correlation between the signal recorded by the infrared camera and the achieved surface quality. They show an impact of the thermal effects on the quality of the surfaces and on the amount of solidified material on the resulting line-like pattern. Threshold values of the detected infrared signal are determined to classify the obtained surface conditions.
Recently, process monitoring has emerged as a breakthrough technology in industrial laser machine applications to enhance process stability and economic efficiency while ensuring high-quality processed parts and significantly reducing the scrap rate. Furthermore, the latest advances in monitoring systems open a broad range of new opportunities to increase the capabilities of laser surface structuring. In this study, stainless steel and aluminum substrates are structured with a line-like geometry by Direct Laser Interference Patterning. A high-speed infrared camera is used to detect the thermal effects throughout the laser process. Simultaneously, a diffraction measurement system is implemented to analyze the quality of the fabricated periodic patterns by comparing the diffraction order characteristics. This specific combination of systems allows remarkably high-performance process monitoring and quality assurance. The obtained results reveal a correlation of the signals detected by the infrared camera and the intensity of the recorded diffraction orders with the quality of the obtained surface.
The objectives of the project are the development of fundamental digital methods for monitoring and increasing the reliability of highly integrated mechatronic systems that can be transferred to other engineering problems. The methods are to be developed within the framework of the project using the electric bicycle as an example, always with a view to the transferability and utilization of the research results to other vehicles with electric drives. These methods are a prerequisite for new business models of system providers that link product, application and service.
This dataset was created to provide measurements of a non-linear dynamic system with multiple input and output signal channels for the development and testing of virtual sensing and forward prediction algorithms. It also includes predictions of a frequency response function model, which can be used as a benchmark for comparison with novel algorithmic approaches. The dataset contains measurements from a three-component servo-hydraulic fatigue test bench for suspension hydro-mounts. The sensor setup of this test bench consists of three inertia-compensated force sensors and three displacement sensors. Various non-linear influences affect the system behavior. The hydro-mounts are filled with oil to provide highly non-linear damping, while the pendulum kinematics of the test bench introduce non-linear interactions between the excitations in different spatial directions. The most impactful non-linearity results from the system stiffness.
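As an illustration of how such a frequency response function benchmark can be used for forward prediction, here is a minimal single-channel sketch based on SciPy's Welch-type spectral estimators. It is a generic H1 example under a linear-system assumption, not the model shipped with the dataset; all function names are assumptions:

```python
import numpy as np
from scipy import signal

def frf_predict(x_train, y_train, x_new, fs, nperseg=1024):
    """Estimate an H1 frequency response function from a training record
    and forward-predict the response to a new excitation."""
    # H1 estimator: H = Sxy / Sxx on the Welch frequency grid
    f, Pxx = signal.welch(x_train, fs=fs, nperseg=nperseg)
    _, Pxy = signal.csd(x_train, y_train, fs=fs, nperseg=nperseg)
    H = Pxy / Pxx
    # interpolate H onto the FFT grid of the new excitation and apply it
    X = np.fft.rfft(x_new)
    freqs = np.fft.rfftfreq(len(x_new), d=1.0 / fs)
    Hi = np.interp(freqs, f, H.real) + 1j * np.interp(freqs, f, H.imag)
    return np.fft.irfft(Hi * X, n=len(x_new))
```

For a strongly non-linear system such as the hydro-mount bench, this linear FRF prediction is exactly the kind of baseline a novel virtual-sensing algorithm should outperform.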
This dataset provides acceleration and strain measurements from a sensor-equipped eBike, which were collected for the development of new methods for fatigue damage monitoring and maneuver identification tasks.
Particle foams are characterized by a unique combination of low density, high mechanical energy absorption under compressive loading, great freedom of shaping, and low manufacturing costs. They are thus predestined for a wide range of applications, currently including sports shoes and safety-relevant interior components of vehicles. By selecting the processing parameters during foaming, the properties of the cell structure, and thus the behavior of the molded parts, can be specifically tailored to the respective application. According to the current state of research, however, significant uncertainties remain regarding the structure-property relationships of particle foams. Although computational methods are available that are suitable for predicting the macroscopic material behavior under arbitrary multiaxial loading states, they rest on substantial simplifications, such as neglecting local stress and strain maxima on the mesoscale of the foam. Under global compressive loading, for example, bending, buckling, or tensile failure occur locally within the cell structure. The mechanical behavior is also significantly influenced by the enclosed cell gas and its compression. When molded parts are cyclically loaded over a longer period, empirical findings show cyclic creep of the part. Neither the interaction between cell gas and cell structure nor the phenomenology of cyclic creep has been sufficiently clarified so far. The project focuses on the experimental and numerical investigation of the mechanical behavior of particle foams under quasi-static and cyclic compressive loading.
Using a test rig for ambient-pressure-dependent compression tests, to be developed within the project, together with X-ray tomographic analyses, the interactions between cell gas, cell structure, and base polymer on the one hand and the time-dependent mechanical properties of the particle foam on the other are to be analyzed. The focus is on the viscoelastic properties and creep behavior of the particle foam as well as the global and local stress-compression curves under repeated loading and unloading. The analysis of the cell morphology forms the basis for the numerical simulation of the particle foams, initially simulating individual loading and unloading steps while accounting for the cell structure, the cell gas, and viscoelasticity. In particular, inelastic effects and instabilities at the local level are to be detected and their impact on the global behavior identified. The work thus lays important foundations for an improved understanding of the material, which is necessary for a more resource-efficient and reliable design of particle foam structures under long-term loading.
Bead foams with their hierarchical geometrical structure are a challenge in statistical reconstruction and finite element modelling. For the purpose of providing fundamental micro- and meso-structural descriptors of expanded polypropylene bead foams of different density, 3D-images from X-ray computed tomography (CT) are acquired. The data provided comprise reconstructed volume data as well as segmented data of three specimens of different density. Coarse scans of the specimen volume, definitions of certain regions of interests (ROI) as well as high resolution scans of the ROIs are included.
Goal: Development of a Deorbit Kit (DK) and related software based on Low Work function Tether (LWT) technology with TRL 4. Total Budget: 3M€ (Funded by the European Commission). Partners: UC3M, IKTS, UNIPD, TUD, SENER, and ATD. Potential impact: reversible and free of consumables in-space propulsion technology.
Thermionic Emission measurement data of thick film C12A7:e- paste (t=100 µm) on titanium substrates. Samples prepared by Fraunhofer IKTS. Tests conducted by Institute of Aerospace Engineering (TU Dresden).
Reuse of articles in the Dissertation with permission from Elsevier (via RightsLink, 2017).
The orchestration of modular plants concerns the linking of different modular process equipment assemblies (PEAs) from both a process engineering and an automation engineering perspective. A multitude of aspects must be taken into account.
Two-photon NAD(P)H/FAD fluorescence lifetime imaging data of intact Drosophila melanogaster tissues including salivary gland and fat body cells of third instar larvae, enterocytes, as well as sperm stored in the male seminal vesicle and the female seminal receptacle.
Sperm metabolism is fundamental to sperm motility and male fertility. Its measurement is still in its infancy and recommendations do not exist as to whether or how to standardize laboratory procedures. Here, using the sperm of an insect, the common bedbug, Cimex lectularius, we demonstrate that standardization of sperm metabolism is required with respect to the artificial sperm storage medium and a natural medium, the seminal fluid. We used fluorescence lifetime imaging microscopy (FLIM) in combination with time-correlated single-photon counting (TCSPC) to quantify sperm metabolism based on the fluorescent properties of autofluorescent coenzymes, NAD(P)H and FAD. Autofluorescence lifetimes (decay times) differ for the free and protein-bound state of the co-enzymes, and their relative contributions to the lifetime signal serve to characterize the metabolic state of cells. We found that artificial storage medium and seminal fluid separately, and additively, affected sperm metabolism. In a medium containing sugars and amino acids (Grace's Insect Medium), sperm showed increased glycolysis compared to a commonly used storage medium, phosphate-buffered saline (PBS). Adding seminal fluid to the sperm additionally increased oxidative phosphorylation, likely reflecting increased energy production of sperm during activation. Our study provides a protocol to measure sperm metabolism independently from motility, stresses that protocol standardizations for sperm measurements should be implemented and, for the first time, demonstrates that seminal fluid alters sperm metabolism. Equivalent protocol standardizations should be imposed on metabolic investigations of human sperm samples.
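The decomposition into free and protein-bound lifetime components that this analysis relies on can be illustrated with a minimal fit of a bi-exponential decay to a TCSPC histogram. This is a simplified sketch (no instrument response function, no deconvolution), and the function names and starting values are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    """Bi-exponential fluorescence decay model."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

def bound_fraction(t, counts, p0=(1.0, 0.4, 1.0, 2.5)):
    """Fit a bi-exponential decay and return the amplitude fraction of the
    slow component (conventionally attributed to the protein-bound coenzyme)."""
    popt, _ = curve_fit(biexp, t, counts, p0=p0, maxfev=10000)
    a1, tau1, a2, tau2 = popt
    if tau1 > tau2:  # order components so that tau1 is the fast (free) one
        a1, tau1, a2, tau2 = a2, tau2, a1, tau1
    return a2 / (a1 + a2)
```

Shifts in this amplitude fraction between media are the kind of readout used to compare glycolytic versus oxidative metabolic states.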
We show two characters that can be used to separate bedbug nymphs. Given the bedbug's increasing status as a global pest and as a research model, we discuss the economic and biological consequences of these findings.
Sperm viability in relation to buffer and genotype in Drosophila melanogaster
Research data and source code to reproduce the results proposed in 'Liquid Crystals on Deformable Surfaces'.
Source Code
Two-dimensional random metal networks possess unique electrical and optical properties, such as almost total optical transparency and low sheet resistance, which are closely related to their disordered structure. Here we present a detailed experimental and theoretical investigation of their plasmonic properties, revealing Anderson (disorder-driven) localized surface plasmon (LSP) resonances of very large quality factors and spatial localization close to the theoretical maximum, which couple to electromagnetic waves. Moreover, they disappear above a geometry-dependent threshold at ca. 1.6 eV in the investigated Au networks, explaining their large transparencies in the optical spectrum.
Jupyter notebook for the evaluation of XRR data utilizing a fast Fourier transform and a multi-Gaussian fitting routine to determine the thickness of ultra-thin ALD films within the initial growth regime. One example measurement is included.
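The FFT-based thickness extraction can be illustrated with a short sketch: Kiessig fringes oscillate with period Δq = 2π/t in the reflectivity curve, so the film thickness t appears as a peak in the Fourier spectrum. This is a generic reconstruction of the idea, not the notebook's actual code; the function name and the single-Gaussian peak refinement (in place of the notebook's multi-Gaussian routine) are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def _gauss(f, a, f0, s):
    return a * np.exp(-((f - f0) ** 2) / (2.0 * s ** 2))

def film_thickness_fft(q, reflectivity, n_pad=8):
    """Estimate film thickness from the Kiessig-fringe period via FFT.

    q must be uniformly spaced (e.g. 1/Angstrom); the returned thickness is
    in the reciprocal unit (Angstrom).
    """
    r = reflectivity - reflectivity.mean()   # remove the DC component
    r = r * np.hanning(len(r))               # reduce spectral leakage
    n = n_pad * len(r)                       # zero-padding refines the grid
    spec = np.abs(np.fft.rfft(r, n=n))
    freq = np.fft.rfftfreq(n, d=q[1] - q[0])  # cycles per unit q
    k = spec[1:].argmax() + 1                 # strongest non-DC peak
    # refine the peak position with a Gaussian fit around the maximum
    lo, hi = max(k - 10, 1), k + 11
    df = freq[1] - freq[0]
    popt, _ = curve_fit(_gauss, freq[lo:hi], spec[lo:hi],
                        p0=(spec[k], freq[k], 5.0 * df))
    return 2.0 * np.pi * popt[1]              # period 2*pi/t  ->  t
```

Real XRR curves additionally decay steeply with q, so background removal before the FFT matters more than in this idealized sketch.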
Supplementary material to the publication: "Fast fourier transform and multi-Gaussian fitting of XRR data to determine the thickness of ALD grown thin films within the initial growth regime"
This is supplementary information to the manuscript: "Sustainable Thermoelectric Materials Predicted by Machine Learning"
Physical Review Letters 2022 accepted Chaotic resonance modes in dielectric cavities: Product of conditionally invariant measure and universal fluctuations Roland Ketzmerick (1), Konstantin Clauß (1,2), Felix Fritzsch (1,3), and Arnd Bäcker (1) 1 Technische Universität Dresden, Institut für Theoretische Physik and Center for Dynamics, 01062 Dresden, Germany 2 Department of Mathematics, Technical University of Munich, Boltzmannstr. 3, 85748 Garching, Germany 3 Physics Department, Faculty of Mathematics and Physics, University of Ljubljana, Ljubljana, Slovenia We conjecture that chaotic resonance modes in scattering systems are a product of a conditionally invariant measure from classical dynamics and universal exponentially distributed fluctuations. The multifractal structure of the first factor depends strongly on the lifetime of the mode and describes the average of modes with similar lifetime. The conjecture is supported for a dielectric cavity with chaotic ray dynamics at small wavelengths, in particular for experimentally relevant modes with longest lifetime. We explain scarring of the vast majority of modes along segments of rays based on multifractality and universal fluctuations, which is conceptually different from periodic-orbit scarring.
Supplemental Material
Conflicts between long-term goals (e.g., maintaining health, achieving good grades) and immediate desires or strong habits (e.g., to smoke; eat a tasty dessert; to watch TV) are frequent in everyday life. Failures of self-control in such conflict situations are sources of a wide range of harmful behaviors including substance-related and addictive disorders (SAD), which incur immense personal and societal costs. The long-term aim of project C1 is to investigate whether impaired cognitive control, performance-monitoring, value-based decision-making and dysfunctional interactions between the underlying brain systems constitute vulnerability factors and/or mediating mechanisms underlying non-pathological daily self-control failures (SCFs) as well as addictive behaviors. In the first funding period (07/2012-06/2016) project C1 launched a prospective cohort study using a multi-level approach that combines (i) a comprehensive clinical assessment, (ii) behavioral task batteries assessing cognitive control and decision-making functions, (iii) task-related and resting state fMRI, and (iv) smartphone-based experience sampling of daily SCFs. In the first funding phase, from a representative community sample we recruited three groups of participants (each n = 100; age 20 - 26) with (a) symptoms of non-substance related and (b) substance-related addictive disorders and (c) syndrome-free controls. Hypothesis-driven cross-sectional analyses revealed that reduced error-related activity in brain areas involved in performance-monitoring and salience processing (anterior insula; aINS) and inhibitory control (right inferior frontal gyrus; IFG) as well as insufficient modulation of neural value signals in the ventromedial prefrontal cortex (vmPFC) by long-term goals predicted higher proneness to daily SCFs. Moreover, reduced conflict-related brain activity in performance-monitoring areas was associated with repeated unsuccessful attempts to quit smoking.
These findings are consistent with a working model according to which deficient performance-monitoring, insufficient recruitment of cognitive control networks in response to conflicts or errors, and insufficient top-down modulation of value signals by long-term goals increase proneness to commit daily SCFs and show symptoms of SAD. Based on these encouraging results, in the second funding period project C1 will be expanded into a prospective-longitudinal cohort study with yearly clinical follow-up assessments and continued multi-level assessments 3 and 5 years after initial recruitment. This will provide the unique opportunity to examine with a cross-lagged panel design whether daily SCFs and SAD can be predicted by (a) cognitive control competencies as derived from latent variable analyses of our task battery and by (b) activity in brain areas involved in performance-monitoring, cognitive control, and value-based decision-making. This will not only allow us to investigate with sufficient statistical power (1) commonalities and differences in cognitive control functions between subgroups of SAD, but also (2) to address the central unresolved question whether cognitive control and performance-monitoring impairments are causally involved in the development of real-life SCFs and SAD.
In this paper, we tested whether impulsive decision-making (1) differs between individuals with substance use disorders (SUD) or non-substance-related addictive disorders (ND) and healthy controls and (2) predicts the course of SUD and ND severity after one year.
Project B5 addresses the modulation of cognitive control by stress. During the second funding period, the project intended to investigate the longitudinal relationship of chronic stress and executive functioning (EF), also known as cognitive control. For this purpose, a population-based sample of 516 participants (Mage = 38.4 years, SD = 8.9 years, range 25 to 55 years; 67.2% female) was recruited out of 8,400 randomly selected individuals from the population registry of the City of Dresden. This sample was followed up every six months until summer 2019 (i.e., six assessment waves), collecting data from three EF tasks, a set of questionnaires, as well as hair and saliva samples. The EF tasks applied were a Number-Letter task (Rogers & Monsell, 1995), a Go/Nogo task (Wolff et al., 2016), and a spatial 2-Back task (Friedman et al., 2008). The questionnaires administered were the Perceived Stress Scale (PSS; Sheldon Cohen & Williamson, 1988), the Screening Scale of the Trier Inventory for the Assessment of Chronic Stress (SSCS; Schulz & Schlotz, 1999), the depression scale of the Patient Health Questionnaire (PHQ-9; Löwe et al., 2002), the short version of the Big Five Inventory (BFI-K; Rammstedt & John, 2005), the short form of the Need for Cognition Scale (NFC; Bless et al., 1994), and the Multidimensional Mood State Questionnaire (MDBF; Steyer et al., 1997). Furthermore, several socioeconomic data were acquired, including education, employment status, household income, and health situation. To facilitate data collection, the questionnaires were self-administered in the domestic setting.
The CRC 940/2 subproject B6 on "Individual differences in effort discounting and adjustments in volitional control" aims to account for the role of individual differences in effort discounting and demand avoidance in order to better predict individual control adjustments across a variety of tasks. Across several studies, the nature of self-reported dispositional effort investment will be systematized (Study 1), it will be determined whether demand avoidance in behavioural tasks can be considered a stable disposition that is related to self-reported effort investment (Study 2), and it will be examined whether individuals high in effort investment and demand avoidance actually invest more effort in task processing and how this effort investment is adjusted depending on task difficulty and payoff in typical cognitive control tasks (Study 3). Furthermore, project B6 aims to model these control adjustments based on individual differences in cost and benefit representations in further studies.
In this pilot study for Study 1 of CRC 940/2 subproject B6, we collected data from a large sample via an online survey using questionnaires to assess the following personality traits: Need for Cognition, Self-Control, Effortful Control, and Action and State Orientation, as well as Generalized Self-Efficacy and the Big Five personality traits. The main research question was to examine to what extent Need for Cognition and Self-Control are related and whether there would be evidence for mediation or moderation effects. The results of these analyses will be submitted for publication (Grass, J. et al., Thinking in action: Need for Cognition predicts self-control together with action orientation). In this collection, the data and analysis routines as well as additional information to assess the validity of our results are provided.
These files provide first level fMRI data for reproducing the results of "Should we keep some distance from distancing? Regulatory and post-regulatory effects of emotion downregulation" by Kersten Diers and colleagues (2021). Due to the upload limit on OSF, the materials have been partitioned. Behavioral data, second level fMRI data, ROI masks, materials such as the experimental design and paradigms, the preprint and scripts can be found at OSF: https://osf.io/mg5ac/.
Model-based analysis of neural representations of decision evidence during perceptual decision making
Selecting goals and successfully pursuing them in an uncertain and dynamic environment is an important aspect of human behaviour. In order to decide which goal to pursue at what point in time, one has to evaluate the consequences of one's actions over future time steps by forward planning. However, when the goal is still temporally distant, detailed forward planning can be prohibitively costly. One way to select actions at minimal computational cost is to use heuristics. It is an open question how humans mix heuristics with forward planning to balance computational costs against goal-reaching performance. To test a hypothesis about the dynamic mixing of heuristics with forward planning, we used a novel stochastic sequential two-goal task. Comparing participants' decisions with an optimal full planning agent, we found that at the early stages of goal-reaching sequences, in which both goals are temporally distant, on average 42% (SD = 19%) of participants' choices deviated from the agent's optimal choices. Only towards the end of the sequence did participants' behaviour converge to near-optimal performance. Subsequent model-based analyses showed that participants used heuristic preferences when the goal was temporally distant and switched to forward planning when the goal was close.
Humans adaptively integrate forward planning and heuristic preferences during goal pursuit
This collection accompanies the paper "Context-dependent risk aversion adaptation for single subjects: a model-based approach". It contains the experimental data obtained by Sven Breitmeyer, Florian Ott and Elena Ruge in 2018.
Experimental data for all 35 subjects of the experiment.
This project contains Unity metadata and questionnaire data, and, where available, eye-tracking data from three studies that are part of the doctoral project of Judith Josupeit.
The project has been funded by the German Research Foundation (AS 497/1-1), awarded to Dr. Eva Asselmann (TU Dresden). The PI status was transferred to Prof. Dr. Katja Beesdo-Baum as of 01.10.2017 because Dr. Eva Asselmann left the TU Dresden and relocated to the Humboldt-Universität zu Berlin (now at HMU Health and Medical University Potsdam). She continued to be involved in the project as a cooperation partner and contributed to the data analyses and publications.
Documentation for paper "Ecological Momentary Assessment and Applied Relaxation: Results of a Randomized Indicated Preventive Trial in Individuals at Increased Risk for Mental Disorders" including study protocol, raw data and analytic code.
Doctoral project of Helena Laudel, M.Sc. Supervision: Prof. Dr. Susanne Narciss, Chair of Psychology of Learning and Instruction
Laudel, Helena; Narciss, Susanne (2023): The effects of internal feedback and self-compassion on the perception of negative feedback and post-feedback learning behavior. In: Studies in Educational Evaluation 77, p. 101237. https://doi.org/10.1016/j.stueduc.2023.101237 Abstract: Negative feedback confronts learners with errors or failure but holds great learning potential. However, learners might perceive it as self-threatening and thus react maladaptively. Feedback theories recommend prompting internal feedback prior to external feedback, and self-compassion has been found to support adaptive reactions to failure. Thus, this study examined in a 2 × 2 factorial design the effects of prompting internal feedback or self-compassion, or both, on feedback perception and post-feedback learning behavior. Participants (N = 210) completed a brief difficult reasoning test and received failure feedback. Perceived acceptance and fairness of the feedback were higher in the internal feedback and self-compassion conditions compared to the control condition with no prompts. The intervention effects were higher for participants with high perceived competence and low trait self-compassion. No significant effects on post-feedback learning behavior were observed. The results highlight the relevance of internal feedback processes for feedback perception.
This collection contains data that underlie the findings of the publication "Reward modulates the association between sensory noise and brain activity during perceptual decision-making". It includes the normalized and smoothed fMRI images, first level and group level stats, behavioral data (percent correct and reaction times) and Bayesian model parameter estimates.
Body odors play a subtle but crucial role in many social situations. They are influenced by genetic connections, hormonal changes, current inflammatory processes, and diet, among other factors. Because the apocrine sweat glands are adrenaline-sensitive, physiological and emotional arousal also alter body odor. Therefore, body odors enable the recognition and discrimination of different body states, emotions or diseases in our fellow human beings. Although this often happens unconsciously, pilot survey data show marked differences in the way we describe body odors; e.g., a body odor that originates from exercise is most often described as "sweaty," whereas a body odor from a sick person is referred to as "biting." The goal of this study is to develop a valid and reliable matrix that captures how people perceive and describe different body odors. To this end, an online survey will be conducted including a minimum of n=1000 participants from different countries within and beyond Europe. Each country or language with at least n=100 participants will be included in the sample. Each participant is asked, in their respective language, to name three or more words or phrases that describe body odors in four different states (healthy, sick, stressed, after exercise) and from five body parts (armpit, mouth, feet, and male and female genitalia). The resulting body odor vocabulary from each language will be compared, and the most often named words in each language will be used to generate a body odor description matrix. The study is part of the EU-funded project "Smart Electronic Olfaction for Body Odor Diagnostics (SMELLODI)", with the overall goal of digitizing olfaction and making it usable for health applications, e.g. for patients with olfactory disorders.
This data collection will contain all intermediate and final datasets of the free descriptions of body odors from the SMELLODI online study.
Relevant data from the paper "Studies about the dietary impact on 'free' glycation compounds in human saliva" by Manig et al., 2022
Relevant raw data on MRP in saliva (PLOS ONE)
We present a facile method for the determination of the electromagnetic field enhancement of nanostructured TiN electrodes using surface-enhanced Raman spectroscopy. As a model system, TiN with a partially collapsed nanotube structure obtained from the nitridation of TiO2 nanotube arrays was used. Using surface-enhanced Raman scattering (SERS) spectroscopy, the electromagnetic field enhancement factors (EFs) of the substrate across the optical region were determined. The non-surface-binding SERS reporter group azidobenzene was chosen, for which contributions from the chemical enhancement effect can be minimized. Derived EFs correlated with the electronic absorption profile and reached 3.9 at 786 nm excitation. Near-field enhancement and far-field absorption simulated with rigorous coupled wave analysis showed good agreement with the experimental observations. From the calculations, the major optical activity of TiN was concluded to originate from collective localized plasmonic modes at ca. 700 nm arising from the specific nanostructure.
Different research groups conduct clinical research in the field of feto/neonatal medicine at the Saxony Center for Feto/Neonatal Health (SCFNH). The research focusses on fetal and neonatal determinants of healthy development in humans.
A new method of surfactant administration, combining advantages of INSURE and LISA/MIST was developed at the TU Dresden. Retrospective patient data is analyzed to evaluate this approach.
Upload and edit test data. Purpose: testing usability and introduction as an archive location for data from the DIZ.
Test data
This dataset contains 1,550 patients' answers to questionnaires administered before dental treatment in a dental clinic. The data are divided by sex (male/female) and by age. The level of dental anxiety can be derived from the responses to the Dental Anxiety Scale (DAS), and the level of psychological distress from the responses to the Brief Symptom Inventory-18 (BSI-18).
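The structure described above (scores split by sex and age) can be illustrated with a minimal Python sketch. The field names (`sex`, `age`, `das`, `bsi18`) and the age bins are assumptions for illustration only, not the dataset's actual variable names:

```python
# Hypothetical sketch of summarizing questionnaire scores per sex and age
# group; field names and age bins are illustrative assumptions.
from collections import defaultdict
from statistics import mean

def age_group(age: int) -> str:
    """Bin an age into a coarse group label (assumed cut points)."""
    if age <= 30:
        return "<=30"
    if age <= 50:
        return "31-50"
    return ">50"

def summarize(records):
    """Mean DAS and BSI-18 scores per (sex, age group) cell."""
    cells = defaultdict(lambda: {"das": [], "bsi18": []})
    for r in records:
        key = (r["sex"], age_group(r["age"]))
        cells[key]["das"].append(r["das"])
        cells[key]["bsi18"].append(r["bsi18"])
    return {k: {name: round(mean(vals), 2) for name, vals in scores.items()}
            for k, scores in cells.items()}

# Tiny synthetic example (not real patient data):
records = [
    {"sex": "f", "age": 25, "das": 12, "bsi18": 0.8},
    {"sex": "f", "age": 27, "das": 10, "bsi18": 0.6},
    {"sex": "m", "age": 45, "das": 9,  "bsi18": 0.5},
]
summary = summarize(records)
```

Each cell of `summary` then holds the mean questionnaire scores for one sex-by-age stratum, mirroring the grouping described for the dataset.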
The aim of this research project is to gain an improved understanding of the variability of middle ear morphology and to develop a statistical simulation model from image data. To this end, micro-CT data from human temporal bone specimens were acquired, segmented and evaluated.
Investigation of chromosome segregation in male meiotic spindles in C. elegans.
Datasets of male meiotic spindles: 1 - Metaphase; 2 - Anaphase onset; 3 - Anaphase 1; 4 - Anaphase 2; 5 - Anaphase 3; 6 - Anaphase 4.
This is the collection of all fully reconstructed mammalian spindles used throughout the publication "Mammalian kinetochores are semi-directly connected to spindle poles by k-fibers of variable morphology".
Supplementary Information for the publication “Hair concentrations of endocannabinoids in individuals with acute and weight-recovered anorexia nervosa”
Background: The Common Sense Model (CSM) identifies cognitive and emotional representations that influence the recovery process. In contrast to previous studies that related individual representations to outcome variables, we used cognitive schemata to predict the recovery process after a total hip replacement (THR). Method: The purpose of this prospective study was to examine the importance of these schemata for functionality three and six months after THR. One additional predictive model was tested: emotional representation was extended to include depression and anxiety. 317 patients with primary THR were interviewed at t0. Preoperatively, illness perception was assessed with the Illness Perception Questionnaire-Revised, and function with the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC). 292 patients participated at all time points. A two-stage cluster analysis revealed two schemata. Regressions were calculated using generalized linear models. Results: THR led to an increase in functionality in both schemata after three and six months. Before THR, two cognitive schemata were found: schema one: medium identity, long duration, many consequences, low personal and treatment control, and low coherence; schema two: low identity, short timeline, low consequences, and high personal and treatment control. Patients with schema two had better functionality three months after surgery than those with schema one. Only the addition of depression led to a slight improvement in prediction. Conclusions: Surgery outcome could be improved substantially if the illness representation of patients with schema one and depression could be changed by a preoperative intervention.
Data scope: 51 variables in SPSS format; 327 persons; longitudinal assessment at three time points: before surgery, and 3 and 6 months after surgery.
Raw (ht3) and pre-analysed (bin) data of smFRET measurements for high-content, multiwell-plate smFRET experiments.
In this project we studied the structural properties of OmpX in buffer and in complex with the chaperones SurA and Skp. We used single-molecule FRET, FCS, FLCS and nsFCS to determine the structural dynamics across many different time scales. This deposit carries the raw data and the first analysis steps, like burst identification.
This collection contains all raw data for the study of the interaction of OmpX with chaperones SurA and Skp3. The data was collected on a custom-built confocal TCSPC instrument as described in Hartmann and Krainer et al. Molecules 2014 and Hartmann and Krainer et al. Analytical Chemistry 2015.
A place to publish complementary data for papers related to the doctoral thesis of Rigel Alves.
Complementary data for Paper 3
Collaborative Research Centre/Transregio 205 - The Adrenal: Central Relay in Health and Disease
Molecular pathology and treatment of Cushing’s disease
NEXTGenIO will solve the I/O bottleneck by bridging the gap between memory and storage. It will use Intel's new Optane DC Persistent Memory, which sits between conventional memory and disk storage. NEXTGenIO will design the hardware and software to exploit this new memory technology. The goal is to build a system with 100x faster I/O than current HPC systems, a significant step towards Exascale computation. The advances that Optane DC Persistent Memory and NEXTGenIO represent are transformational across the computing sector.
This collection contains configurations, startup scripts and documentation for the following applications: Gromacs, Kronos, MONC, OpenFOAM examples, PyCOMPSs examples, and Persistent Memory Development Kit (PMDK) examples.
This collection contains: deliverables, newsletters, presentations, white papers, work package interactions, and intermediate reports.
The efficient parallel execution of scientific applications is a key challenge in high-performance computing (HPC). With growing parallelism and heterogeneity of compute resources as well as increasingly complex software, performance analysis has become an indispensable tool in the development and optimization of parallel programs. As waiting or idle time can propagate over multiple levels of parallelism, e.g. from a delayed task on an accelerator over host threads to another compute node, the actual cause of an inefficiency might be difficult to find. This thesis proposes a framework for the systematic performance analysis of scalable, heterogeneous applications, which covers process- and thread-level parallelism as well as computation offloading. It addresses two essential aspects that have so far been neglected: potential inefficiencies with computation offloading and generic analyses across programming models. Furthermore, established analyses are combined in such a way that inefficiencies and program regions can be prioritized to enable a more focused optimization process. The analysis results are presented in the form of a program region profile, a summary of all inefficiencies, and timelines. The core analyses are implemented independently of specific application programming interfaces (APIs), and further inefficiencies and wait states can be detected by adding new analysis rules. The framework is applied to synthetic and real-world programs to validate its applicability, correctness, and scalability.
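The kind of wait-state rule the abstract describes can be illustrated with a toy example. The "late sender" pattern below (a receiver that blocks before the matching send occurs) is a classic inefficiency in message-passing programs; this sketch is an illustration of the concept, not the framework's actual implementation:

```python
# Toy illustration of a simple wait-state analysis rule ("late sender"):
# a receiver that enters a blocking receive before the matching send is
# issued accumulates waiting time until the send occurs. This is a concept
# sketch, not the thesis framework's implementation.
def late_sender_wait(recv_enter: float, send_enter: float) -> float:
    """Waiting time attributed to the receiver (0 if the send came first)."""
    return max(0.0, send_enter - recv_enter)

# Example: the receiver blocks at t=2.0 s, but the sender only reaches
# the matching send at t=5.5 s, so the receiver idles for 3.5 s.
wait = late_sender_wait(recv_enter=2.0, send_enter=5.5)
```

Rules of this shape can be evaluated per event pair in a trace, and the accumulated waiting times then feed a summary that ranks program regions by their optimization potential.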
CASITA is a tool for the automatic analysis of OTF2 trace files that have been generated with Score-P. It determines program activities with high impact on the total program runtime and the load balancing. CASITA generates an OTF2 trace with additional information such as the critical path, waiting time, and the cause of wait states. The same metrics are used to generate a summary profile which rates activities according to their potential to improve the program runtime and the load balancing. A summary of inefficient patterns exposes waiting times in the individual programming models and APIs.
The BeLLiSU study deals with the lifelong, in-service learning of Sachunterricht (primary science and social studies) teachers, which is an indispensable component and part of the self-conception of the teaching profession (the third phase of teacher education). More specifically, the work focuses on the exploration and description of (1) the forms of learning that Sachunterricht teachers use to further develop their professional competence for the subject and (2) their understanding of the concept of a current, multi-perspective Sachunterricht (subject-didactic knowledge). A further aim is to examine possible associations of selected personal and contextual characteristics with the in-service learning of these teachers.
Raw images for the publication "Large and stable - actin aster networks formed via entropic forces"