Hydrodynamic and thermodynamic modeling: Using tools such as analytical ultracentrifugation and dynamic light scattering, my research group focuses on elucidating the structure and function of macromolecules, assemblies, nanoparticles, and synthetic polymers. To this end, we have developed several numerical methods and computational approaches, including parallel distributed computing, for the analysis of experimental data from these hydrodynamic techniques. This long-term effort has resulted in the UltraScan software suite. Our goal in creating this software is to implement novel technology in a user-friendly data analysis environment so that the methods can be applied by any investigator, even one without extensive expertise in computing, mathematics, or biophysics.
A major emphasis in developing these new methods is placed on the global approach, which takes advantage of the added information content of multiple datasets from different experimental methods and experimental conditions. The global approach presents new challenges with respect to optimization algorithms and requires new paradigms to deal with the large amounts of data from combined experiments (such as multi-speed, multi-wavelength, and multi-concentration sedimentation velocity and equilibrium experiments).
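To make the global idea concrete, the sketch below (not UltraScan code; all names, models, and values are illustrative assumptions) fits three simulated datasets recorded at different loading concentrations with one shared parameter and per-dataset baseline offsets, so the shared parameter is constrained by all of the data simultaneously.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical example: three "datasets" measured at different loading
# concentrations share one global parameter (k) but have their own
# baseline offsets. A global fit estimates k from all data at once.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
true_k, true_baselines, concs = 0.35, [0.02, -0.01, 0.05], [0.5, 1.0, 2.0]

def model(t, k, conc, baseline):
    # Toy exponential model standing in for a real sedimentation model.
    return conc * np.exp(-k * t) + baseline

datasets = [model(t, true_k, c, b) + 0.01 * rng.standard_normal(t.size)
            for c, b in zip(concs, true_baselines)]

def residuals(params):
    # params = [shared k, baseline_1, baseline_2, baseline_3]
    k, baselines = params[0], params[1:]
    return np.concatenate([model(t, k, c, b) - d
                           for c, b, d in zip(concs, baselines, datasets)])

fit = least_squares(residuals, x0=[0.1, 0.0, 0.0, 0.0])
print("global estimate of shared parameter k:", fit.x[0])
```

The same pattern scales to the real case, where the shared quantities are physical parameters of the sedimenting system, each dataset contributes its own experimental nuisance parameters, and the combined residual vector becomes very large.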
Our current efforts focus on the development of novel adaptive space-time finite element solutions to the partial differential equations describing sedimentation velocity experiments at a very detailed level. These solutions extend beyond ideal, non-interacting systems to multi-component reactions, concentration-dependent non-ideality, slow kinetics and reaction equilibria determinations, co-sedimenting solutes, and methods for the spectral decomposition of dissimilar absorbing components such as nucleic acids, proteins, and molecules containing unique chromophores. We complement this work with novel analysis methods that utilize the latest advances in technology and instrumentation, among them parallel computational approaches using Linux Beowulf clusters. Such tools are required to model the large and computationally demanding systems of experimental data in a global approach.
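For orientation only, the sketch below integrates the underlying transport equation for sedimentation velocity (the Lamm equation) for a single ideal, non-interacting species with a simple explicit finite-volume scheme; it is not the adaptive space-time finite element method described above, and the sedimentation coefficient, diffusion coefficient, rotor speed, and cell geometry are assumed, illustrative values.

```python
import numpy as np

# Minimal explicit finite-volume sketch of the Lamm equation
#   dC/dt = -(1/r) d/dr [ r * (s*w^2*r*C - D*dC/dr) ]
# for one ideal, non-interacting species (illustrative parameter values).
s     = 4.0e-13                  # sedimentation coefficient [s]
D     = 6.0e-7                   # diffusion coefficient [cm^2/s]
omega = 50000 * 2 * np.pi / 60   # 50,000 rpm converted to rad/s
r     = np.linspace(5.8, 7.2, 201)   # meniscus to cell bottom [cm]
dr    = r[1] - r[0]
C     = np.ones_like(r)          # uniform loading concentration at t = 0

r_half = 0.5 * (r[:-1] + r[1:])      # cell-interface radii
v_half = s * omega**2 * r_half       # sedimentation velocity at interfaces

dt, steps = 0.5, 7200                # 0.5 s steps, one hour of sedimentation
for _ in range(steps):
    adv = v_half * C[:-1]                    # upwind advective flux (toward bottom)
    dif = -D * (C[1:] - C[:-1]) / dr         # central diffusive flux
    J   = r_half * (adv + dif)               # r * total flux at each interface
    J   = np.concatenate(([0.0], J, [0.0])) # closed ends: zero flux at both walls
    C   = C - dt * (J[1:] - J[:-1]) / (r * dr)

print("C near meniscus / middle / bottom:", C[0], C[100], C[-1])
```

Replacing this toy scheme with adaptive space-time finite elements, adding reaction terms, non-ideality, and multiple co-sedimenting components, and then fitting many such simulations to combined datasets globally is what makes parallel computing infrastructure necessary.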