Supporting Information for Tajik Basin and Southwestern Tian Shan, Northwestern India-Asia Collision Zone: 2. Timing of Basin Inversion, Tian Shan Mountain Building, and Relation to Pamir-Plateau Advance and Deep India-Asia Indentation Abdulhameed et al., 2020, Tectonics
Contents of this file: Supporting Information S1; Tables S1 to S3; Figures S1 to S4; Dataset S1 with Tables S4 to S6
Supporting Information for Tajik Basin and Southwestern Tian Shan, Northwestern India-Asia Collision Zone: 3. Pre- to Syn-orogenic Retro-foreland Basin Evolution in the Eastern Tajik Depression and Linkage to the Pamir Hinterland Dedow et al., Tectonics, 2020
Contents of this file: Supporting Information S1. Tables S1 and S2 are provided as separate files.
AGU Publications: Tectonics Supporting Information for: Tajik Basin and Southwestern Tian Shan, Northwestern India-Asia Collision Zone: 1. Structure, Kinematics, and Salt-tectonics in the Tajik Fold-thrust Belt of the Western Foreland of the Pamir Łukasz Gągała1,2, Lothar Ratschbacher1, Jean-Claude Ringenbach3, Sofia-Katarina Kufner4, Bernd Schurr4, Ralf Dedow1, Sanaa Abdulhameed1, Edouard Le Garzic3, Mustafo Gadoev5, and Ilhomjon Oimahmadov5 1Geologie, Technische Universität Bergakademie Freiberg, Freiberg, Germany, 2Present address: Hellenic Petroleum, Marousi, Greece, 3E2S-UPPA, CNRS, Univ. Pau & Pays Adour, Pau, France, 4GFZ German Research Center for Geosciences, Potsdam, Germany, 5Institute of Geology, Earthquake Engineering and Seismology, Tajik Academy of Sciences, Dushanbe, Tajikistan Contents of this File: Supporting information S1; Figures S1 to S9
The plate-bounding Alpine Fault in New Zealand is an 850 km long transpressive continental fault zone that is late in its earthquake cycle. The Deep Fault Drilling Project (DFDP) aims to deliver insight into the geological structure of this fault zone and its evolution by drilling and sampling the Alpine Fault at depth. We have acquired and processed reflection seismic data to image the subsurface around the drill site: 1) a 2D seismic line in 2011 and 2) an extended 3D walkaway vertical seismic profiling (VSP) data set in 2016. The resulting velocity models and seismic images of the upper 5 km show complex subsurface structures around the Alpine Fault zone, comprising both the local structures of the glacial valley and the tectonic fault structures. The results provide a reliable basis for a seismic site characterisation at the DFDP-2 drill site and correlate with preliminary cutting and logging results from the drilling. Thus, the information derived from the seismic data sets is crucial for further structural and geological investigations of the architecture of the Alpine Fault in this area.
Data sets for a seismic P-wave velocity model from a 3D VSP survey at the Alpine Fault DFDP-2 drill site in the Southern Alps, New Zealand, containing the first-arrival travel times used and the final P-wave velocity model.
This repository includes the reconstructed TIFF files from a multi-scale tomographic analysis of particles in an embedding matrix (overall volume and three sub-volumes at two different magnification steps).
Including:
- 1 LARGE sample with low-resolution data (micro-CT, ZEISS VERSA510), split into parts I, II, III, IV, V
- 3 SUB-samples at MEDIUM resolution (cut down from the LARGE sample above, scanned with micro-CT, ZEISS VERSA510)
- 3 SUB-samples at HIGH resolution (same as MEDIUM, scanned with nano-CT, ZEISS ULTRA810)
The topic of the "Green City" is currently experiencing a renaissance and most recently received comprehensive recognition in the Green Paper published in 2015 by the Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB). These wide-ranging discussions also include considerations on the quantitative and qualitative development of provision in the field of recreation. The existing frameworks in this regard are over 40 years old and, not least against the background of changed conditions, require updating and empirical review, above all with respect to the presumably quite different quantitative and qualitative demands that different population groups place on the design of recreation-relevant areas and on their accessibility. The aim of this dissertation, whose digital appendix is accessible via Opara, was to obtain reliable data on age- and user-group-specific catchment areas of parks and on age- and user-group-dependent recreation patterns on the basis of comprehensive empirical surveys, and to interpret these with regard to identified influencing factors. The focus was thus, supplemented by an extension of existing area data to include primarily qualitative aspects, the systematic recording and interpretation of usage patterns of selected public parks in the city of Dresden; hence not a theoretical engagement with the sociological aspects of open-space use, but a hypothesis-driven empirical study.
This data collection contains the digital appendix to the dissertation "Grundlagen für Erholungsplanung in der Stadt. Eine empirische Untersuchung zu Nutzungsmustern ausgewählter öffentlicher Parkanlagen in Dresden." (Seidler 2016). It comprises a map section and supplementary appendices, for example graphics, photo documentation, counting sheets, and documentation of the path-length determination in the studied parks and of the density determination in the surrounding urban areas.
DFG project number: 254872581 (follow-up of project HE2933/8-1). The majority of constitutive models used nowadays to describe the behaviour of granular materials such as sands are continuum models based on phenomenological approaches. In order to describe some of the phenomena occurring on the macroscopic scale, e.g. an abrupt change of stiffness due to a load reversal, these constitutive models use phenomenological state variables (e.g. back stress in elasto-plasticity or the intergranular strain concept in hypoplasticity) which often lack a clear physical meaning. The mechanisms that control the macroscopic behaviour and, as such, the different phenomena that can be observed on the continuum scale must be sought at the grain scale, with the interactions of individual particles playing the key role. X-ray μ-computed tomography (CT) allows for 3D imaging of natural soil samples under various loading conditions and is used in this project. In order to extract information on the structure of the granular material, different image analysis approaches can be used, and their accuracy is evaluated with respect to the limited resolution. Mechanical experiments in the X-ray CT scanner have been carried out on natural sands in the ongoing project. During macroscopic loading, the sand specimens were scanned using a laboratory X-ray scanner in order to assess the grain-scale behaviour in situ and link it with the macroscopic observations. The evolution of the microstructure can be linked to the evolution of the phenomenological variables, e.g. the intergranular strain in hypoplasticity for changes in loading direction, leading to a possible micromechanical enhancement of these concepts.
Establishing a link between micromechanical variables, such as the fabric tensors describing the structure, and the macromechanical observations can not only enhance our understanding of different phenomena occurring on the continuum scale, but also enable the incorporation of these effects into phenomenological approaches in a more straightforward and reliable way.
This work develops a strategy to benchmark image analysis tools that can be used for the determination of contact fabric from tomographic images. The discrete element method is used to create and load a reference specimen for which the fabric and its evolution are precisely known. Chosen states of this synthetic specimen are turned into realistic images taking into account inherent image properties, such as the partial volume effect, blur and noise. The application of the image analysis tools on these images validates the findings of the metrological study and highlights the importance of addressing the identified shortcomings, i.e., the systematic over-detection of contacts and the strong bias of orientations when using common watersheds.
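The over-detection problem can be illustrated with a toy voxel experiment: two just-tangent spherical "grains" share exactly one contact voxel, but inflating each grain by a single voxel, as blur and the partial volume effect effectively do before labelling, multiplies the apparent contact area. The grid size, radii, and one-voxel inflation below are illustrative assumptions, not the benchmark's actual parameters.

```python
import numpy as np

# Two tangent spheres on a 40^3 voxel grid (centres 12 apart, radius 6).
n = 40
z, y, x = np.mgrid[0:n, 0:n, 0:n]
d1 = (x - 14) ** 2 + (y - 20) ** 2 + (z - 20) ** 2
d2 = (x - 26) ** 2 + (y - 20) ** 2 + (z - 20) ** 2

grain1, grain2 = d1 <= 6 ** 2, d2 <= 6 ** 2        # ground-truth labels
true_contact = np.count_nonzero(grain1 & grain2)    # a single shared voxel

# Mimic blur/partial-volume inflation: each grain grows by one voxel.
inflated1, inflated2 = d1 <= 7 ** 2, d2 <= 7 ** 2
apparent_contact = np.count_nonzero(inflated1 & inflated2)
```

Counting the overlap voxels before and after inflation shows the apparent contact area growing by more than an order of magnitude, which is the kind of systematic bias the benchmark quantifies.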
The project goal is the development of innovative techniques for spatio-temporally high-resolution monitoring and small-scale simulation of extreme events. In a cooperation between the chairs of Hydrology, Meteorology, Geoinformatics, and Photogrammetry at TU Dresden, new types of operational monitoring systems will be developed. Existing monitoring networks will be densified using modern low-cost sensors, specific remote sensing data, and geographical information systems. Additionally, historical analyses and predictive modelling of small-scale extreme events under different climate scenarios will help to predict the expected effects of climate change. The developed information will serve as a basis for upcoming early warning systems and future adaptation strategies.
Small-scale and headwater catchments are mostly ungauged, even though their observation could help to improve the understanding of hydrological processes. However, it is expensive to build and maintain conventional measurement networks. Thus, the heterogeneous characteristics and behavior of catchments are currently not fully observed. This study introduces a method to capture water stage with a flexible low-cost camera setup. By considering the temporal signature of the water surface, water lines are automatically retrieved via image processing. The image coordinates are projected into object space to estimate the actual water stage. This requires high-resolution 3D models of the river bed and bank area, which are calculated in a local coordinate system with structure-from-motion (SfM), employing terrestrial as well as UAV imagery. A medium- and a small-scale catchment are investigated to assess the accuracy and reliability of the introduced method. Results reveal that the average deviation between the water stages measured with the camera gauge and a reference gauge is below 6 mm in the medium-scale catchment. Trends of water stage changes are captured reliably in both catchments. The developed approach uses a low-cost camera design in combination with image-based water level measurements and high-resolution topography from SfM. In the future, adding tracking algorithms can help to densify existing gauging networks.
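The projection step can be sketched as a ray-model intersection: a viewing ray through the detected water-line pixel is intersected with the SfM model of the bank, and the intersection elevation is the water stage. The camera pose, ray direction, and planar bank below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Camera 3 m above the local datum; ray direction through the water-line
# pixel (already rotated into object space, assumed here).
cam = np.array([0.0, 0.0, 3.0])
ray = np.array([1.0, 0.0, -1.0])
ray = ray / np.linalg.norm(ray)

# Bank modelled as an inclined plane z = 1.0 - 0.2 * x (assumed SfM fit).
# Solve cam_z + s*ray_z == 1.0 - 0.2 * (cam_x + s*ray_x) for the ray scale s:
s = (1.0 - cam[2] - 0.2 * cam[0]) / (ray[2] + 0.2 * ray[0])
hit = cam + s * ray          # 3D point where the ray meets the bank
water_stage = hit[2]         # water-line elevation (m) -> 0.5 here
```

In the actual workflow the bank is a dense SfM surface rather than a plane, so the intersection is found numerically, but the geometric principle is the same.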
In this paper, an automatic approach is proposed to measure flow velocity with an uncooled thermal camera. Hot water is used as a thermal tracer. The introduced tracking algorithm utilizes the pyramidal Lucas-Kanade method and is especially suitable for thermal image data. The performance of the new tool is compared to traditional image-based tracking tools, i.e. PIVlab and PTVlab. Experiments are performed in the laboratory for three different flow velocities. Afterwards, tests are conducted in a small stream to illustrate the suitability of the tool for field measurements. Results of the laboratory experiments as well as of the field experiments show that our tracking algorithm, applied to imagery from a thermal camera, outperforms commonly used tracking methods. Our tool provides velocity fields with very high resolution and is in close agreement with reference measurements, whereas PTVlab and PIVlab tend to overestimate and underestimate flow velocities, respectively.
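At the core of such a tracker is the Lucas-Kanade least-squares solution for the displacement of a small window between two frames; a minimal single-level NumPy sketch is shown below. The pyramidal variant used in the paper repeats this estimate coarse-to-fine; the function name and the synthetic "thermal" frames are illustrative assumptions.

```python
import numpy as np

def lucas_kanade(img0, img1, pt, win=7):
    """Single-level Lucas-Kanade: least-squares displacement of a window."""
    x, y = int(pt[0]), int(pt[1])
    h = win // 2
    # Spatial gradients of the first frame and the temporal gradient
    Ix = np.gradient(img0.astype(float), axis=1)
    Iy = np.gradient(img0.astype(float), axis=0)
    It = img1.astype(float) - img0.astype(float)
    # Stack the gradients inside the window and solve A [u, v]^T = -It
    ix = Ix[y - h:y + h + 1, x - h:x + h + 1].ravel()
    iy = Iy[y - h:y + h + 1, x - h:x + h + 1].ravel()
    it = It[y - h:y + h + 1, x - h:x + h + 1].ravel()
    A = np.stack([ix, iy], axis=1)
    uv, *_ = np.linalg.lstsq(A, -it, rcond=None)
    return uv  # (u, v) displacement in pixels

# Synthetic "thermal" frames: a warm Gaussian blob drifting 1 px to the right
yy, xx = np.mgrid[0:64, 0:64]
frame0 = np.exp(-((xx - 30) ** 2 + (yy - 32) ** 2) / 40.0)
frame1 = np.exp(-((xx - 31) ** 2 + (yy - 32) ** 2) / 40.0)
u, v = lucas_kanade(frame0, frame1, (30, 32))   # u close to 1, v close to 0
```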
An automatic workflow is introduced, including an image-based tracking tool, to measure surface flow velocities in rivers. The method is based on particle tracking velocimetry (PTV) and comprises an automatic definition of the search area for particles to track. Tracking is performed in the original images; only the final tracks are geo-referenced, intersecting the image observations with the water surface in object space. Detected particles and corresponding feature tracks are filtered considering particle and flow characteristics to mitigate the impact of sun glare and outliers. The method can be applied to different perspectives, including terrestrial and aerial (i.e. UAV) imagery. To account for camera movements, images can be co-registered automatically. In addition to velocity estimates, discharge is calculated using the surface velocities and the wetted cross-section derived from surface models computed with SfM and multi-media photogrammetry. The workflow is tested at two river reaches (paved and natural) in Germany. Reference data is provided by ADCP measurements. At the paved river reach, the highest deviations of flow velocity and discharge reach 5% and 4%, respectively. At the natural river, deviations are larger (26% and 20%, respectively) due to the irregular cross-section shapes hindering an accurate comparison of ADCP- and image-based results. The provided tool enables the measurement of surface flow velocities independently of the perspective from which images are acquired. With the contact-less measurement, spatially distributed velocity fields can be estimated and river discharge in previously ungauged and unmeasured regions can be calculated.
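Combining surface velocities with the wetted cross-section amounts to a velocity-area calculation; a minimal sketch is shown below. The verticals, depths, and the 0.85 surface-to-depth-averaged velocity coefficient are common textbook assumptions, not values from the study.

```python
# Velocity-area sketch: discharge = sum over verticals of
# (depth-averaged velocity x subsection area). All numbers are made up.
surface_v = [0.20, 0.45, 0.60, 0.48, 0.22]   # m/s at five verticals
depths    = [0.15, 0.40, 0.55, 0.42, 0.18]   # m, from the SfM cross-section
spacing   = 0.5                              # m between verticals
k = 0.85                                     # surface-to-mean velocity ratio

discharge = sum(k * v * d * spacing for v, d in zip(surface_v, depths))
# discharge in m^3/s (about 0.33 for these numbers)
```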
Goal: To study the complex interactions between vegetation and the composition, spatial distribution, and capability of the soil bacterial community to produce extracellular polymeric substances (EPS), and the effects of these EPS on the stability of soil aggregates. (Funded by the Deutsche Forschungsgemeinschaft (DFG) - Project number 316446092)
Data/results obtained from the first sampling campaign at the selected sites in Rambla Honda and near the town of Alboloduy, Almería, Spain.
Data belonging to the paper "Identifying and quantifying geogenic organic carbon in soils – the case of graphite", published in SOIL journal (Copernicus). The research paper can be accessed via: https://doi.org/10.5194/soil-2019-30
Data underlying the figures of the research paper: "Identifying and quantifying geogenic organic carbon in soils – the case of graphite" (SOIL Copernicus journal) https://doi.org/10.5194/soil-5-1-2019
Goal: Characterise the influence of aquifer properties and external stresses on DNAPL source zone architecture by deriving transformation techniques to convert complex to effective source geometries through a combination of laboratory-scale experiments and numerical modelling. Partners are the Helmholtz-Centre for Environmental Research-UFZ, the Indian Institute of Technology Delhi, the Department of Civil and Environmental Engineering at the University of Illinois Urbana-Champaign, U.S., and the Faculty of Civil, Architectural and Environmental Engineering at the University of Texas at Austin, U.S.
This data collection includes the Python script used for image processing and analysis as described in the article "Quantification of uncertainties from image processing and analysis in laboratory-scale DNAPL migration experiments evaluated by reflective optical imaging" by Engelmann et al., submitted to the journal Water in 2019. Exemplary raw images generated from laboratory-scale tank experiments for DNAPL migration are included as well.
The DFG basic-research project "Rain as a groundwater tracer" aims to scientifically validate the use of precipitation water as a groundwater tracer. To this end, the project investigates, and will reproduce in models, the extent to which the combined consideration of the inherent properties of natural waters, e.g. stable isotope signature, ionic composition, and temperature, can be used for targeted detection. This multitracer approach should make it possible to reliably characterise aquifers, including their structural composition, using a toxicologically and ecologically very benign method. For this, comprehensive analyses of transport behaviour are carried out both at the small scale and at larger scales under field conditions. The investigations focus primarily on assessing possible factors influencing the "stability" of the tracer signal during passage through the subsurface. This is done primarily in the laboratory by means of one- and multi-dimensional flow-through experiments combined with long-term batch tests. Not only sediment-induced changes in the isotope and ion composition and structure-induced transport and mixing processes are assessed for their relevance, but also chemical precipitation and dissolution reactions as well as density- and viscosity-related effects resulting from the water injection and the associated temperature change. In addition, the influence of the injection mode of the tracer water into the hydrogeological subsurface and of natural fluctuations on the transport behaviour is investigated at the field scale. Beyond the qualitative and quantitative analysis of the individual influencing factors mentioned above at laboratory and field scale, a model-based assessment of the overall effect resulting from superimposed processes is also carried out.
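As a hedged sketch of the kind of 1D transport calculation behind such flow-through experiments, the leading term of the Ogata-Banks solution of the advection-dispersion equation gives the breakthrough curve of a conservative tracer. Velocity, dispersion coefficient, and column length below are illustrative values, not project parameters.

```python
import math

def breakthrough(x, t, v, D):
    """Relative concentration C/C0 at distance x and time t for a step
    input (leading term of the Ogata-Banks solution, 1D advection-dispersion)."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

# Half of the input concentration arrives when the mean front v*t reaches x:
c = breakthrough(x=0.5, t=50.0, v=0.01, D=1e-5)   # C/C0 = 0.5 at x = v*t
```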
Stable isotope analysis is widely used in environmental tracer studies, e.g. for groundwater flow and discharge quantification. In this context, this study presents an inexpensive approach for the combined use of deuterium (2H) and oxygen-18 (18O) as active semi-artificial groundwater tracers by direct injection of snowmelt into aquifers. This dual-isotope approach takes advantage of isotope signature differences between typical groundwater and precipitation water. The aim of this study is the experimental demonstration at laboratory and field scale. For this, two column flow experiments were performed using δ2H and δ18O values of snowmelt for breakthrough detection. The differences in isotope signature between the snowmelt and groundwater were ∆(δ2H) ≈ 61.0 ‰ and ∆(δ18O) ≈ 8.2 ‰. The breakthrough was observed to be almost congruent with that of a sodium chloride tracer, indicating conservative transport. The low electrical conductivity (EC) of snowmelt (45 µS/cm, i.e. ∆EC ≈ 486 µS/cm relative to groundwater) was used as an additional easy-to-measure breakthrough indicator. However, the snowmelt EC breakthrough suffered from a slight retardation due to ion exchange. Based on these results, a push-drift-pull tracer test with snowmelt, additionally labeled with uranine, was realized at the field site Pirna, Germany. In the pull phase, a significant isotopic depletion was observed, with peak differences of ∆Peak(δ2H) ≈ 24.2 ‰ and ∆Peak(δ18O) ≈ 3.2 ‰, which equals approx. 40 % of the initial difference. The isotope breakthrough was observed to be almost the same as the breakthrough of uranine, indicating conservative behavior, while the EC breakthrough was again affected by ion exchange.
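The roughly 40 % figure follows directly from the stated peak and initial isotope differences; a one-line check using only the numbers quoted above:

```python
# Peak recovery fractions from the differences stated in the abstract (permil)
delta_init_2H, delta_peak_2H = 61.0, 24.2
delta_init_18O, delta_peak_18O = 8.2, 3.2

frac_2H = delta_peak_2H / delta_init_2H     # about 0.40
frac_18O = delta_peak_18O / delta_init_18O  # about 0.39
```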
The goal of the project is the research and development of a holistic system for the construction and operation of clustered small wastewater treatment plants with combined infiltration of the treated wastewater and the accruing rainwater. The infiltration is to be dimensioned taking into account the site-specific hydrogeological and ecological conditions.
This data collection includes the Python script used for model data preparation, processing, and assessment as described in the article "Evaluation of Decentralized, Closely-Spaced Precipitation Water and Treated Wastewater Infiltration" by Händel et al., submitted to the journal Water in 2018. Python script input data as generated from Hydrus 2D/3D models as well as the resulting plots used in the previously mentioned article are included.
The junior research group addresses three sets of questions. A historical-architectural complex deals with researching and communicating the interactions between the urban landscape and its depiction, using the architectural history of the city of Dresden in the 20th century as an example. Linked to this is a second, methodological complex, which addresses the associated research-methodological requirements for digital image and plan repositories and the technical support options derived from them. Building on this, an informational-technical complex addresses needs-oriented information modelling and its technical implementation, using the Deutsche Fotothek as an example. The goal of the photogrammetric investigations is the development of automated workflows for the geo- and time-referencing of historical photographs of the SLUB Fotothek based on metadata and image features, and the creation of a photogrammetric 3D city model. This model is to be evaluated with regard to achievable accuracy, model complexity, and quality. Furthermore, methods for feature extraction and pose estimation for AR visualisation, as well as interfaces to AR/4D browsers, are to be provided. Further information can be found on the project website http://urbanhistory4d.org/wordpress/.
This dataset contains eight triples of historical images for four different sights. Images were chosen with respect to their possible matching quality. The images show combined differences in illumination, field of view, viewpoint, blurring, and slight rotation. Some of the images show building reflections in water or extreme shadowing. After digitization, the images are saved in full quality as *.tif files with a maximum side length of 3543 pixels. Since no inner orientation could be determined for all image triples, the trifocal tensor is provided, calculated using Ressl's method (Ressl, 2003). Additional metadata, a copyright disclaimer, and permalinks are provided in License.txt. The purpose of the dataset is the evaluation of different feature detection and matching methods using the given orientation via the trifocal tensor. Point transfer can be calculated using the equation on p. 382 of Multiple View Geometry in Computer Vision (Hartley and Zisserman, 2003). Another method uses the corrected fundamental matrices calculated from the trifocal tensor via eq. 15.8 on p. 374 of the same book. Ressl, C., 2003. Geometry, constraints and computation of the trifocal tensor. TU Wien. Hartley, R. and Zisserman, A., 2003. Multiple View Geometry in Computer Vision. Cambridge University Press.
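Point transfer via two fundamental matrices can be sketched in a few lines of NumPy: the transferred point in the third image is the intersection (homogeneous cross product) of the two epipolar lines. The toy purely-translating cameras below are chosen only so the fundamental matrices are known in closed form; with this dataset, the F matrices derived from the provided trifocal tensor would be used instead.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Toy geometry: identity cameras translated by t2 and t3, so F = [t]_x
t2, t3 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
F31 = skew(t3)        # epipolar geometry from view 1 into view 3
F32 = skew(t3 - t2)   # epipolar geometry from view 2 into view 3

X = np.array([1.0, 2.0, 5.0])    # 3D point (off the trifocal plane)
x1, x2 = X, X + t2               # homogeneous projections in views 1 and 2
x3_true = X + t3

# Epipolar transfer: intersect the two epipolar lines in view 3
x3 = np.cross(F31 @ x1, F32 @ x2)
x3 = x3 / x3[2]                  # normalise homogeneous coordinates
```

Note that this transfer degenerates for points on the trifocal plane, which is why the trifocal tensor equation on p. 382 is the more robust alternative.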
This data is supplementary material to the article "Production planning and scheduling in multi-factory production networks: a systematic literature review".
The DFG project "Sicherheitsvorstellungen in der Antike" (Concepts of Security in Antiquity) investigated the value attached to the concept of security in archaic and classical Greek history as well as in the transition period from the Roman Republic to the Principate. The overall project was led by Prof. Dr. Martin Jehne. Project homepage: https://tu-dresden.de/gsw/phil/ige/ag/forschung/ehemalige-projekte/dfg-projekt-sicherheit.
Within the project "Sicherheitsvorstellungen in der Antike", an extensive database was compiled under the direction of Daniel Pauling, containing all source passages from the Greek archaic and classical periods in which the keyword ἀσφάλεια (asphaleia) is used, which is rendered in German as "Sicherheit" (security). On this research data basis, Pauling's dissertation "Ἀσφάλεια. Die Entwicklung der Sicherheitsvorstellungen und der Diskurs über Sicherheit im archaischen und klassischen Griechenland" was written. It is freely available via the open-access server Qucosa as well as in the university publications collection of the SLUB Dresden. (Link to follow) The source passages were collected, archived, and intensively indexed on several levels. This was originally done in MS Access 2016 format (Quellensammlung_Asphaleia_v1.0.1.accdb). This database is made freely available here to the research community. To ensure sustainable and permanent usability, the Access database has also been published as a valid TEI XML 1.0 document (Quellensammlung_Asphaleia_v1.0.1.xml). Access to the database in both file formats is facilitated by the documentation in PDF format (Quellensammlung_Asphaleia_BeschreibungUndSchlagwortverzeichnis.pdf; Acrobat Reader or similar is required), in which all fields and XML elements of the database as well as the possible values of the records are described and explained.
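A minimal sketch of querying such a TEI XML export with Python's standard library is shown below. The element layout used here (entries as item elements inside a list) is purely illustrative; the actual structure of Quellensammlung_Asphaleia_v1.0.1.xml is documented in the accompanying PDF.

```python
import xml.etree.ElementTree as ET

# TEI P5 documents live in this namespace; findall needs it spelled out.
TEI_NS = {"tei": "http://www.tei-c.org/ns/1.0"}

# Inline stand-in for the real export (structure assumed for illustration)
sample = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
  <text><body><list>
    <item n="1">Thucydides 1.32</item>
    <item n="2">Herodotus 7.50</item>
  </list></body></text>
</TEI>"""

root = ET.fromstring(sample)
passages = [item.text for item in root.findall(".//tei:item", TEI_NS)]
```

Against the published file one would call `ET.parse("Quellensammlung_Asphaleia_v1.0.1.xml")` instead and adapt the XPath to the element names documented in the PDF.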
Media supplement to the magazine ARRAY2019
For the 2019 issue of Array, we focus on the idea of "Agency" in electronic and computer music, explored through the artistic and theoretical reflections of composers, performers, engineers, and musicologists. How do algorithms and artificial intelligences create a particular character through the decisions they make? How do we interpret the intention of nonhuman agents in the process of musical creation and analysis? Is it possible to tell the difference between the intention of outside agencies and a projection of our own biases? How does the surrounding context integrate into the work itself? The writings present a collection of contemporary approaches and perspectives from the field, examining topics ranging from the agency of digital signal processing and sonic analysis algorithms, to the design of inclusive instrument systems, object-based composition, and relational aesthetics. This issue is accompanied by a set of media examples: http://dx.doi.org/10.25532/OPARA-45
Media supplement to ARRAY2019, the magazine of the International Computer Music Association (ICMA)
Reuse of articles in the Dissertation with permission from Elsevier (via RightsLink, 2017).
The orchestration of modular plants concerns the linking of different modular process units (PEAs), from both a process engineering and an automation engineering perspective. A multitude of aspects must be taken into account.
Two-photon NAD(P)H/FAD fluorescence lifetime imaging data of intact Drosophila melanogaster tissues, including salivary gland and fat body cells of third instar larvae, enterocytes, as well as sperm stored in the male seminal vesicle and the female seminal receptacle.
We show two characters that can be used to separate bedbug nymphs. Given the bedbug's increasing status as a global pest and as a research model, we discuss the economic and biological consequences of these findings.
Research data and source code to reproduce the results proposed in 'Liquid Crystals on Deformable Surfaces'.
Conflicts between long-term goals (e.g., maintaining health, achieving good grades) and immediate desires or strong habits (e.g., to smoke; eat a tasty dessert; to watch TV) are frequent in everyday life. Failures of self-control in such conflict situations are sources of a wide range of harmful behaviors including substance-related and addictive disorders (SAD), which incur immense personal and societal costs. The long-term aim of project C1 is to investigate whether impaired cognitive control, performance-monitoring, value-based decision-making and dysfunctional interactions between the underlying brain systems constitute vulnerability factors and/or mediating mechanisms underlying non-pathological daily self-control failures (SCFs) as well as addictive behaviors. In the first funding period (07/2012-06/2016) project C1 launched a prospective cohort study using a multi-level approach that combines (i) a comprehensive clinical assessment, (ii) behavioral task batteries assessing cognitive control and decision-making functions, (iii) task-related and resting state fMRI, and (iv) smartphone-based experience sampling of daily SCFs. In the first funding phase, from a representative community sample we recruited three groups of participants (each n = 100; age 20-26) with (a) symptoms of non-substance-related and (b) substance-related addictive disorders and (c) syndrome-free controls. Hypothesis-driven cross-sectional analyses revealed that reduced error-related activity in brain areas involved in performance-monitoring and salience processing (anterior insula; aINS) and inhibitory control (right inferior frontal gyrus; IFG) as well as insufficient modulation of neural value signals in the ventromedial prefrontal cortex (vmPFC) by long-term goals predicted higher proneness to daily SCFs. Moreover, reduced conflict-related brain activity in performance-monitoring areas was associated with repeated unsuccessful attempts to quit smoking.
These findings are consistent with a working model according to which deficient performance-monitoring, insufficient recruitment of cognitive control networks in response to conflicts or errors, and insufficient top-down modulation of value signals by long-term goals increase proneness to commit daily SCFs and show symptoms of SAD. Based on these encouraging results, in the second funding period project C1 will be expanded into a prospective-longitudinal cohort study with yearly clinical follow-up assessments and continued multi-level assessments 3 and 5 years after initial recruitment. This will provide the unique opportunity to examine with a cross-lagged panel design whether daily SCFs and SAD can be predicted by (a) cognitive control competencies as derived from latent variable analyses of our task battery and by (b) activity in brain areas involved in performance-monitoring, cognitive control, and value-based decision-making. This will not only allow us to investigate with sufficient statistical power (1) commonalities and differences in cognitive control functions between subgroups of SAD, but also (2) to address the central unresolved question whether cognitive control and performance-monitoring impairments are causally involved in the development of real-life SCFs and SAD.
In this paper, we tested whether impulsive decision-making (1) differs between individuals with substance use disorders (SUD) or non-substance-related addictive disorders (ND) and healthy controls and (2) predicts the course of SUD and ND severity after one year.
The CRC 940/2 subproject B6 on "Individual differences in effort discounting and adjustments in volitional control" aims at providing an account for the role of individual differences in effort discounting or demand avoidance in order to better predict individual control adjustments across a variety of tasks. Across several studies, the nature of self-reported dispositional effort investment will be systematized (Study 1), it will be determined whether demand avoidance in behavioural tasks can be considered as a stable disposition that is related to self-reported effort investment (Study 2), and it will be examined whether individuals high in effort investment and demand avoidance actually invest more effort in task processing and how this effort investment is adjusted depending on task difficulty and payoff in typical cognitive control tasks (Study 3). Furthermore, project B6 aims at modeling these control adjustments based on individual differences in cost and benefit representations in further studies.
In this pilot study for Study 1 of CRC 940/2 subproject B6, we collected data from a large sample via an online survey using questionnaires that assess the following personality traits: Need for Cognition, Self-Control, Effortful Control, and Action and State Orientation, as well as Generalized Self-Efficacy and the Big Five personality traits. The main research question was to examine to what extent Need for Cognition and Self-Control are related and whether there is evidence for mediation or moderation effects. The results of these analyses will be submitted for publication (Grass, J. et al., Thinking in action: Need for Cognition predicts self-control together with action orientation). In this collection, the data and analysis routines as well as additional information to assess the validity of our results are provided.
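A moderation effect of the kind examined here can be tested with an interaction term in an ordinary least-squares regression. The following is a minimal sketch with simulated data, not the study's actual analysis; the variable names (NFC, AO, SC) and all effect sizes are illustrative assumptions.

```python
import numpy as np

# Illustrative moderation test: does Action Orientation (AO) moderate the
# relation between Need for Cognition (NFC) and Self-Control (SC)?
rng = np.random.default_rng(0)
n = 500
nfc = rng.normal(size=n)   # standardized NFC scores (simulated)
ao = rng.normal(size=n)    # standardized AO scores (simulated)
# Simulated outcome with an assumed true interaction effect of 0.3
sc = 0.4 * nfc + 0.2 * ao + 0.3 * nfc * ao + rng.normal(scale=0.5, size=n)

# OLS with an interaction term: SC ~ NFC + AO + NFC*AO
X = np.column_stack([np.ones(n), nfc, ao, nfc * ao])
beta, *_ = np.linalg.lstsq(X, sc, rcond=None)
b0, b_nfc, b_ao, b_inter = beta
print(f"interaction coefficient: {b_inter:.2f}")
```

A clearly nonzero interaction coefficient would indicate moderation; mediation would instead be tested with a separate path model.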
Model-based analysis of neural representations of decision evidence during perceptual decision making
Selecting goals and successfully pursuing them in an uncertain and dynamic environment is an important aspect of human behaviour. In order to decide which goal to pursue at what point in time, one has to evaluate the consequences of one's actions over future time steps by forward planning. However, when the goal is still temporally distant, detailed forward planning can be prohibitively costly. One way to select actions at minimal computational cost is to use heuristics. It is an open question how humans mix heuristics with forward planning to balance computational costs with goal-reaching performance. To test a hypothesis about the dynamic mixing of heuristics with forward planning, we used a novel stochastic sequential two-goal task. Comparing participants' decisions with an optimal full-planning agent, we found that at the early stages of goal-reaching sequences, in which both goals are temporally distant, on average 42% (SD = 19%) of participants' choices deviated from the agent's optimal choices. Only towards the end of the sequence did participants' behaviour converge to near-optimal performance. Subsequent model-based analyses showed that participants used heuristic preferences when the goal was temporally distant and switched to forward planning when the goal was close.
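The core idea of switching between a cheap heuristic and costly forward planning can be illustrated with a toy agent. This is a minimal sketch, not the authors' model: the 1D grid world, the distance threshold, and the deterministic transitions are all illustrative assumptions (the actual task was stochastic and had two goals).

```python
# Toy agent that acts heuristically when the goal is far and plans
# exhaustively over the remaining steps when the goal is near.

def plan_value(pos, goal, steps_left):
    """Exhaustive forward planning: 1 if the goal is still reachable, else 0."""
    if pos == goal:
        return 1
    if steps_left == 0:
        return 0
    return max(plan_value(pos + a, goal, steps_left - 1) for a in (-1, 1))

def choose_action(pos, goal, steps_left, planning_horizon=3):
    if abs(goal - pos) > planning_horizon:
        # Goal temporally distant: use a cheap directional heuristic
        return 1 if goal > pos else -1
    # Goal close: full forward planning over the remaining steps
    return max((-1, 1), key=lambda a: plan_value(pos + a, goal, steps_left - 1))

pos, goal = 0, 6
for t in range(6, 0, -1):  # six decision steps
    pos += choose_action(pos, goal, t)
print(pos)  # → 6: the agent reaches the goal
```

The switch point (`planning_horizon`) controls the trade-off between computational cost and accuracy that the study attributes to participants.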
Humans adaptively integrate forward planning and heuristic preferences during goal pursuit
This collection accompanies the paper "Context-dependent risk aversion adaptation for single subjects: a model-based approach". It contains the experimental data obtained by Sven Breitmeyer, Florian Ott and Elena Ruge in 2018.
Investigation of chromosome segregation in male meiotic spindles in C. elegans.
Datasets of male meiotic spindles:
1 - Metaphase
2 - Anaphase onset
3 - Anaphase 1
4 - Anaphase 2
5 - Anaphase 3
6 - Anaphase 4
Background: The Common Sense Model (CSM) identifies cognitive and emotional representations that influence the recovery process. In contrast to previous studies that related individual representations to outcome variables, we used cognitive schemata to predict the recovery process after a total hip replacement (THR). Method: The purpose of this prospective study was to examine the importance of these schemata for functionality three and six months after THR. One additional predictive model was tested, in which the emotional representation was extended to include depression and anxiety. 317 patients with primary THR were interviewed at t0. Preoperatively, illness perception was assessed with the Illness Perception Questionnaire-Revised and function with the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC). 292 patients participated at all time points. A two-stage cluster analysis revealed two schemata. Regressions were calculated with generalized linear model analyses. Results: THR led to an increase in functionality in both schemata after three and six months. Before THR, two cognitive schemata were found: schema one: medium identity, long duration, many consequences, low personal and treatment control, and low coherence; schema two: low identity, short timeline, few consequences, and high personal and treatment control. Patients with schema two had better functionality three months after surgery than those with schema one. Only the addition of depression led to a slight improvement in prediction. Conclusions: Surgery outcome could be improved substantially if the illness representation of patients with schema one and depression could be changed by a preoperative intervention.
Collaborative Research Centre/Transregio 205 - The Adrenal: Central Relay in Health and Disease
NEXTGenIO addresses the I/O bottleneck in high-performance computing by bridging the gap between memory and storage. It will use Intel's new Optane DC Persistent Memory, which sits between conventional memory and disk storage. NEXTGenIO will design the hardware and software to exploit this new memory technology. The goal is to build a system with 100x faster I/O than current HPC systems, a significant step towards Exascale computation. The advances that Optane DC Persistent Memory and NEXTGenIO represent are transformational across the computing sector.
The efficient parallel execution of scientific applications is a key challenge in high-performance computing (HPC). With growing parallelism and heterogeneity of compute resources as well as increasingly complex software, performance analysis has become an indispensable tool in the development and optimization of parallel programs. As waiting or idle time can propagate over multiple levels of parallelism, e.g. from a delayed task on an accelerator over host threads to another compute node, the actual cause of an inefficiency can be difficult to find. This thesis proposes a framework for the systematic performance analysis of scalable, heterogeneous applications, which covers process- and thread-level parallelism as well as computation offloading. It addresses two essential aspects that have so far been neglected: potential inefficiencies with computation offloading and generic analyses across programming models. Furthermore, established analyses are combined in such a way that inefficiencies and program regions can be prioritized to enable a more focused optimization process. The analysis results are presented in the form of a program region profile, a summary of all inefficiencies, and timelines. The core analyses are implemented independently of application programming interfaces (APIs), and further inefficiencies and wait states can be detected by adding new analysis rules. The framework is applied to synthetic and real-world programs to validate its applicability, correctness, and scalability.
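The prioritization idea can be illustrated with a small sketch: aggregate the waiting time recorded for each program region and rank regions by their potential to reduce total runtime. This is an illustrative toy, not the thesis implementation; the event tuples and region names below are made-up sample data.

```python
from collections import defaultdict

# Made-up trace events: (region, compute_time, waiting_time) in seconds
events = [
    ("MPI_Recv",  0.1, 2.5),
    ("compute_a", 4.0, 0.0),
    ("MPI_Recv",  0.1, 1.5),
    ("kernel_b",  1.0, 0.8),  # e.g. a host thread waiting on an accelerator
]

# Aggregate waiting time per program region
waiting = defaultdict(float)
for region, _, wait in events:
    waiting[region] += wait

# Regions with the most accumulated waiting time come first: these are the
# candidates with the highest potential for runtime improvement.
ranking = sorted(waiting.items(), key=lambda kv: kv[1], reverse=True)
print(ranking[0])  # → ('MPI_Recv', 4.0)
```

A real analysis would additionally attribute each wait state to its cause (e.g. the delayed activity on the critical path) rather than only summing per region.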
CASITA is a tool for the automatic analysis of OTF2 trace files that have been generated with Score-P. It determines program activities with a high impact on the total program runtime and on load balancing. CASITA generates an OTF2 trace with additional information such as the critical path, waiting time, and the cause of wait states. The same metrics are used to generate a summary profile that rates activities according to their potential to improve the program runtime and load balancing. A summary of inefficient patterns exposes waiting times in the individual programming models and APIs.