Dr. Matthew Beauregard - Baylor University
Adaptive Splitting Methods in Application to a Solid Fuel Ignition Model
Various types of partial differential equations play increasingly important roles in the study of theoretical and numerical combustion. The heat distribution of a premixed, solid-fuel combustor can be decoupled from the activation energy, leading to a singular, nonlinear, and degenerate reaction-diffusion equation. A Peaceman-Rachford splitting based adaptive method is developed. Spatial adaptation is accomplished through modified equidistribution principles that stem from a priori solution information, generating non-uniform, exponentially evolving grids. Rigorous numerical analysis is given to ensure the effectiveness, efficiency, and numerical stability of the developed scheme. Simulation experiments illustrate these accomplishments. A brief history is provided, and many open problems are highlighted throughout the discussion.
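The equidistribution idea behind such spatial adaptation can be sketched in a few lines: choose a monitor function and place grid points so that each subinterval carries an equal share of its integral. The following Python sketch (arc-length monitor on a model steep-layer profile; an illustration of the principle, not the adaptive scheme from the talk) shows how points cluster where the solution varies rapidly:

```python
import numpy as np

def equidistribute(x, monitor, n):
    """Redistribute n+1 grid points on [x[0], x[-1]] so that the
    monitor function integral is equal on every subinterval."""
    # cumulative integral of the monitor function (trapezoidal rule)
    M = np.concatenate(([0.0],
        np.cumsum(0.5 * (monitor[1:] + monitor[:-1]) * np.diff(x))))
    # invert: new points sit at equal increments of the cumulative integral
    targets = np.linspace(0.0, M[-1], n + 1)
    return np.interp(targets, M, x)

# Example: concentrate points near the steep layer at x = 0
x = np.linspace(-1.0, 1.0, 401)
u = np.tanh(20 * x)                               # steep-layer model profile
monitor = np.sqrt(1.0 + np.gradient(u, x) ** 2)   # arc-length monitor
xa = equidistribute(x, monitor, 40)
# spacing near the layer is much smaller than near the boundaries
```

The monitor function here is generic; the talk's a priori choices would replace it.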
Optical Sensor Development at MTSU: Computation Guiding Experiment: Robertson Reading
This talk describes the development of an optical sensing platform for detecting chemical and biological reactions that has been developed at MTSU over the past decade. The sensor is based on the resonant excitation of surface electromagnetic waves in multilayer dielectrics. The ultimate goal of the project is experimental (fabricating and characterizing prototype sensors); however, numerical simulation plays a pivotal role in experimental design and in the interpretation of results. Although the project is an interdisciplinary effort with collaborators in chemistry and biology, the bulk of the work presented here will describe the physics associated with understanding the nature of surface waves, how their properties can be used to realize sensing, and how and why numerical modeling is crucial to the success of the research.
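Multilayer dielectric stacks of this kind are commonly simulated with the characteristic (transfer) matrix method. Below is a minimal Python sketch at normal incidence, with illustrative layer indices and a quarter-wave Bragg mirror example; the values are hypothetical, not any actual MTSU design:

```python
import numpy as np

def reflectance(n_layers, d_layers, n_in, n_out, wavelength):
    """Normal-incidence reflectance of a dielectric multilayer via the
    characteristic-matrix method (Born & Wolf convention)."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength        # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_out])
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Quarter-wave Bragg mirror at 600 nm: reflectance approaches 1
lam = 600.0
n_hi, n_lo = 2.3, 1.45                  # assumed high/low indices
stack_n = [n_hi, n_lo] * 8
stack_d = [lam / (4 * n) for n in stack_n]
R = reflectance(stack_n, stack_d, 1.0, 1.52, lam)
```

The same machinery, extended to oblique incidence and polarization, is what underlies surface-wave dispersion calculations for sensor design.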
Granulomas and Model-based derivation of intermitotic time distributions
Granuloma: A granuloma is a collection of immune cells that contains bacteria or other foreign material. An example is provided by the granulomas of Tuberculosis, a disease that infects a third of the world's population. Although 90% of Tuberculosis cases are latent, 10% result in active infection. I will present a simple model of a generic granuloma and discuss efforts to discover why granulomas break down to cause active infections. Model-based derivation of intermitotic time distributions: The time it takes a cell to divide, or intermitotic time (IMT), is highly variable, even under homogeneous environmental conditions. I will present a multistep stochastic model of the cell cycle and discuss how the model can be used to explain variability in IMT distributions and study the effect of drug treatment.
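One standard way a multistep model explains IMT variability: if a cell must traverse k exponentially distributed stages, the IMT is Erlang distributed, with coefficient of variation 1/sqrt(k). A toy simulation (assumed stage count and rate, not the talk's model):

```python
import random
import statistics

# Hypothetical multistep cell-cycle model: the cell passes through k
# exponential stages, so the intermitotic time (IMT) is Erlang(k, rate).
random.seed(1)
k, rate = 8, 0.5          # assumed number of stages and per-stage rate
imts = [sum(random.expovariate(rate) for _ in range(k))
        for _ in range(20000)]
mean = statistics.mean(imts)            # ~ k / rate = 16
cv = statistics.stdev(imts) / mean      # ~ 1 / sqrt(k) ~ 0.35
```

More stages give a narrower (less variable) IMT distribution, which is one lever such models use to fit observed variability.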
From Approximation Theory to High Dimensional Data Analysis
In this talk, I'll first briefly mention my research experience in approximation theory, including spline functions, wavelets, and applications. Then, I'll share my research experience at Vanderbilt Ingram Cancer Center on medical data analysis. Finally, I'll describe some recent research topics on hyper-spectral type data processing and applications.
Here are some papers for references:
Hong-Reading 1; Hong-Reading 2; Hong-Reading 3; Hong-Reading 4.
Efficient numerical methods and high performance computing for solving Nonlinear Schrödinger Equations Liang Reading 1
Liang Reading 2
We compare several numerical methods for solving Nonlinear Schrödinger (NLS) equations. Finite difference, quartic spline, discontinuous Galerkin (DG), and local DG methods are implemented for the spatial discretization. The exponential time differencing (ETD) methods with Padé (1,1) and Padé (2,2) approximations are employed for the temporal discretization. These ETD methods have been proven to be unconditionally stable, and their convergence rates will be shown in the numerical experiments. We will also discuss some parallel processing algorithms and compare sequential C programs with CUDA C programs running on a GPU cluster.
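As a point of reference, a first-order ETD step for a 1D focusing NLS equation fits in a few lines of Python. This sketch uses the exact Fourier-space exponential rather than the Padé (1,1) or (2,2) approximants discussed in the talk, and all parameters are illustrative:

```python
import numpy as np

# 1D focusing NLS  i u_t + u_xx + 2|u|^2 u = 0 on a periodic domain,
# advanced with a first-order ETD (exponential Euler) step.
L_dom, N = 40.0, 256
x = np.linspace(-L_dom / 2, L_dom / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L_dom / N)
Lhat = -1j * k ** 2                     # symbol of the linear part i d^2/dx^2
h, steps = 1e-3, 1000

u = 1.0 / np.cosh(x)                    # exact soliton initial data
E = np.exp(h * Lhat)
# phi_1(hL) = (e^{hL} - 1) / (hL), with the k = 0 limit handled explicitly
phi1 = np.where(np.abs(Lhat) > 1e-14,
                (E - 1) / (h * Lhat + (Lhat == 0)), 1.0)

for _ in range(steps):
    Nhat = np.fft.fft(2j * np.abs(u) ** 2 * u)   # nonlinear term, Fourier side
    u = np.fft.ifft(E * np.fft.fft(u) + h * phi1 * Nhat)

# for the sech soliton, |u| is time-invariant, so max|u| should stay near 1
```

Replacing `E` and `phi1` with Padé-approximated rational functions of `h * Lhat` is exactly what makes such schemes practical when the exponential is not diagonalizable cheaply.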
An Introduction to Stylometry Analysis Wu Reading 1
Self-assembly of Model Microtubules
Self-assembly plays a central role in producing ordered superstructures. The crucial question is to identify the necessary features that a macromolecular monomer must have in order to drive self-assembly into a desired complex structure. In this talk I will present our recent work on the self-assembly of model microtubules. The model monomer has a wedge-shape with lateral and vertical binding sites. Using MD simulations, we calculated a diagram of the self-assembled structures from these monomers. A modified Flory-Huggins theory was developed to predict the boundaries between different structures that match well with simulation results. We found that to form tubules the interaction strengths must be in a limited range. In addition, helical tubes are frequently formed even though the monomer is nonchiral. The occurrence of the helical tubes is related to the large overlap of energy distributions for nonhelical and helical tubes. To enhance structural control of the self-assembly, we added chirality and a lock-and-key mechanism to the model. We could control both the pitch of the helicity and the twist deformation of the tube by modifying the locations of the binding sites and their interaction strengths. Our results shed new light on the structure of in vitro microtubules formed with various numbers of protofilaments of tubulins, which also exhibit similar twisted structures and various pitches, and have determined the fundamental features of macromolecular monomers for self-assembly into a tubular structure. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Molecular Quantum Modeling: State of the Art and Future Outlook
Density functional theory (DFT) is the main modeling methodology of atomistic simulations at the quantum mechanical level. DFT faces challenges in computational efficiency and accuracy of prediction, and we have made progress in meeting these challenges in the last few years. The electronic Coulomb interaction is one of the computational bottlenecks in DFT, and we have developed a set of methods [1, 2] that speed up the Coulomb calculation by several times without error. The new methods use Fourier series as auxiliary functions to treat the smooth electron density and the long-range part of the compact electron density, improving the scaling from quartic (O(N^4)) to quadratic (O(N^2)) with respect to the number of basis functions. For the calculation of exchange-correlation, the other computational bottleneck, we have developed a method [3] that shifts the calculation associated with the smooth electron density to an evenly spaced cubic grid and speeds up the calculation by up to 10 times with no loss of accuracy. We believe that the combination of the above methods yields the fastest accurate DFT integration scheme for molecules. In addition to computational efficiency, a major challenge to DFT is accurate prediction for strongly correlated systems. We have recently devised a scheme for efficient self-consistent calculation of nondynamic correlation with Becke's B05 model [4]. The performance of B05 is illustrated through chemical problems that have been difficult for mainstream DFT and wavefunction methods.
Looking to the future, I plan to (1) speed up the DFT calculations by many more times without sacrificing accuracy so that they can be applied to realistic molecular systems and materials with dynamics; (2) characterize the strong correlation qualitatively and quantitatively and apply the latest development to the study of molecular magnets and catalysis; (3) apply DFT methods and other molecular modeling techniques to the study of ligand-protein binding [5]; (4) design a novel software structure for molecular simulations that lowers the barrier between idea formulation and software implementation.
1. L. Fusti-Molnar and J. Kong, Fast Coulomb calculations with Gaussian functions. J. Chem. Phys., 2005. 122: 74108.
2. C.-M. Chang and J. Kong, Ewald mesh method for quantum mechanics calculations. J. Chem. Phys., 2012. 136: 114112.
3. J. Kong, S.T. Brown, and L. Fusti-Molnar, Efficient computation of the exchange-correlation contribution in density functional theory through multiresolution. J. Chem. Phys., 2006. 124: 094109.
4. E. Proynov, F. Liu, Y. Shao, and J. Kong, Improved self-consistent and resolution-of-identity approximated Becke'05 density functional model of nondynamic electron correlation. J. Chem. Phys., 2012. 136: 034102.
5. N.N. Nasief, H. Tan, J. Kong, and D. Hangauer, Water mediated ligand functional group cooperativity: The contribution of a methyl group to binding affinity is enhanced by a COO⁻ group through changes in the structure and thermodynamics of the hydration waters of ligand-thermolysin complexes. J. Med. Chem., 2012. 55: 8283.
Computing Kekulé number and Clar number of Fullerenes, Nanotubes and Nanotori
Fullerenes and nanotubes are typical nano-materials. They have been the subject of intense research, both for their unique chemistry and for their technological applications, especially in materials science, electronics, and nanotechnology. The Kekulé number (the number of perfect matchings, i.e., Kekulé structures, of the molecular graph) and the Clar number (the maximum number of mutually disjoint resonant hexagons) play a key role in the resonant stability of fullerenes and nanotubes. In this talk, we will discuss methods of computing the Kekulé number and the Clar number of fullerenes, nanotubes, and nanotori.
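For small molecular graphs, the Kekulé number (the count of perfect matchings) can be computed by direct branching; realistic fullerene computations require far more sophisticated methods, but a minimal Python sketch makes the quantity concrete:

```python
def count_perfect_matchings(n, edges):
    """Count perfect matchings (Kekule structures) of a graph on
    vertices 0..n-1 by branching on the lowest unmatched vertex."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    def rec(unmatched):
        if not unmatched:
            return 1                      # every vertex is matched
        u = min(unmatched)                # branch on the lowest free vertex
        return sum(rec(unmatched - {u, v})
                   for v in adj[u] if v in unmatched)

    return rec(frozenset(range(n)))

# Benzene ring (C6): two Kekule structures (alternating double bonds)
hexagon = [(i, (i + 1) % 6) for i in range(6)]
print(count_perfect_matchings(6, hexagon))   # -> 2
```

Branching on the lowest unmatched vertex keeps the recursion from exploring the same partial matching twice; for fullerene-sized graphs one would instead use transfer-matrix or polynomial-based techniques.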
Mathematical Models of a Zoonotic Infectious Disease: Hantavirus
Approximately 75% of human infectious diseases originate from an animal reservoir, many caused by viruses such as SARS coronavirus, avian influenza viruses, rabies virus, West Nile virus and hantaviruses. Human diseases originating from a nonhuman animal reservoir are referred to as zoonoses, and the transmission of infection from an animal reservoir to another species is referred to as a spillover infection. In this presentation, some deterministic and stochastic mathematical approaches developed for the study of the viral pathogen hantavirus are presented. Hantavirus, carried by wild rodents, can be transmitted to humans through inhalation of viral particles from rodent excreta. Whereas hantavirus infection in the rodent reservoir causes little impact on rodent survival, infection in humans results in hantavirus cardiopulmonary syndrome, a frequently fatal disease. Application of mathematical models to study the dynamics at the population and the cellular levels has increased our understanding of the mechanisms for viral persistence in the reservoir host and has led to new investigations about the potential role of the spillover infection in emerging diseases.
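The simplest deterministic caricature of persistence in the reservoir is an SI model for the rodent population. The sketch below uses hypothetical parameter values, not any of the models presented in the talk; it shows the endemic prevalence 1 - d/beta that emerges when transmission exceeds mortality:

```python
# Minimal deterministic SI model for hantavirus in a rodent reservoir
# (a hypothetical sketch):
#   dS/dt = b*N - d*S - beta*S*I/N
#   dI/dt = beta*S*I/N - d*I      (infection does not raise rodent mortality)
b, d, beta = 0.5, 0.5, 1.2        # assumed birth, death, transmission rates
S, I = 99.0, 1.0                  # one infected rodent in a population of 100
dt = 0.01
for _ in range(int(50 / dt)):     # forward Euler to t = 50
    N = S + I
    dS = b * N - d * S - beta * S * I / N
    dI = beta * S * I / N - d * I
    S, I = S + dt * dS, I + dt * dI
# prevalence I/N approaches the endemic level 1 - d/beta when beta > d
```

With births balancing deaths (b = d) the total population stays constant, so the dynamics reduce to a logistic equation for the prevalence, a standard first step before adding stochasticity or spillover compartments.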
Adaptive sparse grid generalized stochastic collocation methods for PDEs with high-dimensional random input data
Our modern treatment of predicting the behavior of physical and engineering problems often relies on approximating solutions in terms of high-dimensional spaces, particularly when the input data (coefficients, forcing terms, boundary conditions, geometry, etc.) are affected by a large amount of uncertainty. The goal of the mathematical and computational analysis becomes the prediction of statistical moments (mean value, variance, covariance, etc.) or even the whole probability distribution of some responses of the system (quantities of physical interest), given the probability distribution of the input random data. For higher accuracy, the computer simulation must increase the number of random variables (stochastic dimensions) and expend more effort approximating the quantity of interest within each individual dimension. The resulting explosion in computational effort is a symptom of the curse of dimensionality. Adaptive sparse grid generalized stochastic collocation (gSC) techniques yield non-intrusive methods to discretize and approximate these high-dimensional problems with a feasible number of unknowns, leading to usable methods.
It is the aim of this talk to survey the fundamentals and analysis of an adaptive sparse grid gSC method utilizing both local and global polynomial approximation theory. We will present both a priori and a posteriori approaches to adapt the anisotropy of the sparse grids to applications of both linear and nonlinear stochastic PDEs. Rigorously derived error estimates, for the fully discrete problem, will be described and used to compare the efficiency of the method with several other techniques. Numerical examples illustrate the theoretical results and are used to show that, for moderately large dimensional problems, the adaptive sparse grid gSC approach is extremely efficient and superior to all examined methods, including Monte Carlo.
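The point-count savings of sparse grids over full tensor grids can be demonstrated directly. The following Python sketch builds an isotropic Smolyak grid from nested Clenshaw-Curtis points (a generic construction, not the adaptive anisotropic method of the talk) and compares its size with the full tensor grid at the same 1D resolution:

```python
import numpy as np
from itertools import product

def cc_points(level):
    """Nested Clenshaw-Curtis abscissae on [-1, 1]."""
    if level == 0:
        return np.array([0.0])
    n = 2 ** level
    return np.cos(np.pi * np.arange(n + 1) / n)

def sparse_grid(dim, level):
    """Isotropic Smolyak sparse grid: union of tensor grids whose 1D
    levels sum to at most `level` (valid because the CC points nest)."""
    pts = set()
    for idx in product(range(level + 1), repeat=dim):
        if sum(idx) <= level:
            for p in product(*(cc_points(l) for l in idx)):
                pts.add(tuple(round(c, 12) for c in p))
    return pts

dim, level = 4, 3
sparse = len(sparse_grid(dim, level))
full = (2 ** level + 1) ** dim    # full tensor grid at the same resolution
print(sparse, full)               # sparse grid is far smaller
```

The gap between the two counts widens rapidly with dimension, which is exactly the curse-of-dimensionality argument the abstract makes.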
Meta-Analysis of Rare Genetic Variants via Single-Variant Summary Statistics
Meta-analysis, which combines summary statistics from a series of independent studies, has become a norm in discovering common genetic variants associated with complex human diseases. Obtaining summary statistics is much more appealing than collecting individual participant data because it protects the privacy of genetic information, avoids cumbersome integration of genotype and phenotype data from different studies, and increases the number of available studies. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. However, the various gene-level tests for rare variants present unprecedented challenges, because the test results from different studies may not be compatible and collating multivariate summary statistics (i.e., the components of the test statistic and their covariance matrix) for certain tests is practically inconvenient. To circumvent these problems, we propose to collate only the single-variant summary statistics and to estimate the correlation matrix of test statistics from an internal reference study or a publicly available database, such as the 1000 Genomes or HapMap. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data.
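For a single variant, the classical way to combine per-study summary statistics is fixed-effect inverse-variance weighting; the gene-level procedure proposed in the talk builds on such single-variant statistics. A toy Python illustration with hypothetical study values:

```python
import math

def meta_analyze(betas, ses):
    """Fixed-effect inverse-variance meta-analysis of one variant's
    per-study effect estimates (beta) and standard errors (se)."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se, beta / se          # pooled effect, SE, z-statistic

# Three hypothetical studies of the same rare variant
betas = [0.30, 0.25, 0.40]
ses = [0.10, 0.15, 0.12]
beta, se, z = meta_analyze(betas, ses)
# the pooled standard error is smaller than any single study's
```

Precise studies dominate the pooled estimate, and the pooled variance is the harmonic combination of the study variances, which is why meta-analysis gains power without pooling individual-level data.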
Thermal Averaging of Atoms in Molecules
In this computational study, we topologically analyze thermally averaged properties of the Electron Density (ED) and compare these properties to those of the static ED. We calculate the single point optimized (static) geometry of Acetamide and construct an ensemble of wavefunction based densities from harmonic nuclear vibrations. This yields a thermally averaged density without invoking the convolution approximation. We use 10,000 thermally perturbed geometries to generate the large ensemble of wavefunctions. The corresponding EDs are then used to evaluate the location of the Bond Critical Point (BCP) between various atoms and compare these locations to those obtained from the static wavefunction representation. In doing so we are able to generate averaged properties, including thermal parameters and ellipsoids, of these BCPs and compare them to the related properties of the static ED.
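The effect of this kind of ensemble averaging can be illustrated with a toy one-mode model: sampling harmonic displacements and averaging a curvature-bearing property shifts the mean away from its static value. All numbers below are assumed for illustration, not from the Acetamide calculation:

```python
import random
import math

# Toy thermal averaging: sample displacements of one harmonic mode from
# its classical Boltzmann distribution and average a property over the
# ensemble, mimicking the perturbed-geometry averaging described above.
random.seed(0)
kT = 0.00095          # hartree, roughly 300 K (assumed)
k_force = 0.5         # hartree/bohr^2, assumed force constant
sigma = math.sqrt(kT / k_force)

def prop(x):          # hypothetical geometry-dependent property
    return 1.0 + 0.2 * x + 3.0 * x * x

samples = [prop(random.gauss(0.0, sigma)) for _ in range(10000)]
static = prop(0.0)                      # property at the equilibrium geometry
thermal = sum(samples) / len(samples)   # ensemble (thermal) average
# the thermal average exceeds the static value by ~3*sigma^2 (curvature term)
```

The linear term averages away while the quadratic term survives, which is the basic reason thermally averaged electron-density properties can differ systematically from their static counterparts.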