To illustrate the nature and scope of the research conducted by ASP personnel, we provide short descriptions of individual research projects in this section. The ASP seeks to maintain involvement in the full complement of NCAR research. Thus, almost all of the research projects that we highlight were conducted in collaboration with NCAR scientists in other divisions and programs, and in many cases more thematic descriptions of the research programs are available in the reports of those NCAR units. We link this Annual Scientific Report to those of related divisions, institutes, and programs as appropriate.
The Advanced Study Program supports postdoctoral fellows who work throughout NCAR. These fellowships permit scientists near the beginning of their careers to work with considerable independence, to collaborate with the NCAR scientific staff, to maintain links between NCAR and the university community (to which most of these fellows return), and to add energy and creativity to the NCAR research programs. The research of some of the postdoctoral fellows is described below.
Weather and Mesoscale Meteorology
Simulations of severe storms: George Bryan (ASP/MMM) used high-resolution numerical simulations to study convective systems that produce severe weather. Utilizing NCAR supercomputers, Bryan explicitly simulated turbulent processes within clouds, including the entraining eddies that dilute convective clouds. He found that environmental perturbations modulate convective system structure and evolution. For example, a standard idealized numerical simulation without perturbations was compared to one with small-amplitude random perturbations. The design of the latter simulation was guided by the fact that the atmosphere is constantly perturbed by variations in terrain, land use, and boundary layer turbulence. The results showed that the convective system in the simulation without perturbations was stronger: Cloud tops were 4 km higher, and total surface precipitation was 25% greater than in the simulation with perturbations. This result occurred because perturbations facilitate the generation of turbulent eddies, which enhance entrainment and dilute the cores of the large convective cells. Because the simulation with perturbations is probably more realistic, this result suggests that standard idealized simulations with smooth environments and coarse resolution produce convection that is anomalously strong.
Topics in mesoscale meteorology: Huaqing Cai (ATD) completed a detailed study of a dryline that was observed during the IHOP field program of 2002. This study demonstrated the important role of the dryline as an initiator of convection. Cai also applied fractal geometry to comparisons between tornadic and nontornadic mesocyclones, in order to characterize their differences and help understand which characteristics lead to tornadoes.
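Dimension estimates of this kind are typically obtained by box counting: cover the feature with boxes of decreasing size s, count the occupied boxes N(s), and fit the slope of log N(s) versus log s. The following minimal Python sketch illustrates the general technique (it is not Cai's analysis code, and its parameters are purely illustrative):

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the box-counting (fractal) dimension of a 2-D point set.

    For each box size s, counts the number N(s) of occupied boxes,
    then fits log N(s) ~ -D log s to obtain the dimension D.
    """
    points = np.asarray(points, dtype=float)
    counts = []
    for s in sizes:
        # Assign each point to a box by integer division of its coordinates
        boxes = np.floor(points / s)
        counts.append(len(np.unique(boxes, axis=0)))
    slope = np.polyfit(np.log(sizes), np.log(counts), 1)[0]
    return -slope
```

Points along a curve yield D near 1, while points filling an area yield D near 2; a tornadic-versus-nontornadic comparison would look for systematic differences in such estimates.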
Land Surface-Hydrology Model Development and Testing: During 2004, David Gochis (ASP/RAP) contributed to the continuing development and testing of the hydrologically enhanced version of the Noah land surface model (now called 'Noah-distributed'). Gochis and Fei Chen (RAP) conducted various tests to assess how the model's representation of surface hydrological states and fluxes depends on spatial scale, using data from the CASES-97 field campaign. Adaptations included adding both a full Noah-model disaggregation scheme and a routing disaggregation scheme to the standard Noah land surface model. Key results indicate that resolving horizontal routing processes at progressively finer spatial resolution of terrain features produces marked heterogeneity in land surface conditions. Even in comparatively flat terrain, such as that of CASES, the differences in hydrological states and fluxes between routing and non-routing simulations can be appreciable and need to be accounted for in long-term climate simulations and data assimilation algorithms.
Data assimilation using mesonet observations: Joshua Hacker (ASP/MMM) found that surface-layer (screen-height) observations, such as those in a typical mesonet, are under-utilized in current data assimilation (DA) algorithms because of weak coupling with the free atmosphere aloft. Yet simulations and short-range forecasts of near-surface conditions could benefit from these data. The ensemble Kalman filter data assimilation algorithm, which uses anisotropic, flow-dependent covariance information to spread the influence of an observation, is appropriate for this task. Hacker, in collaboration with Chris Snyder (MMM), used a column planetary boundary layer (PBL) model to successfully assimilate simulated observations. They also performed parameter estimation experiments to show that observations can be used to correct erroneous parameters in such a model.
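The core of an ensemble Kalman filter update can be sketched compactly. The Python below is a generic, minimal (perturbed-observations) EnKF analysis step, not the Hacker-Snyder implementation; all array shapes and values in the usage are purely illustrative. The ensemble-estimated cross covariance is what spreads a screen-height observation's influence to unobserved levels aloft:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_var, H):
    """Minimal perturbed-observations EnKF analysis step.

    ensemble    : (n_state, n_members) array of forecast states
    obs         : (n_obs,) observation vector
    obs_err_var : scalar observation-error variance
    H           : (n_obs, n_state) linear observation operator
    """
    n_state, n_members = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    X = ensemble - x_mean                    # state-space perturbations
    Y = H @ X                                # observation-space perturbations
    # Flow-dependent covariances estimated from the ensemble itself:
    Pyy = (Y @ Y.T) / (n_members - 1) + obs_err_var * np.eye(len(obs))
    Pxy = (X @ Y.T) / (n_members - 1)
    K = Pxy @ np.linalg.inv(Pyy)             # Kalman gain
    # Perturbing the observations keeps the analysis spread statistically correct
    rng = np.random.default_rng(0)
    obs_pert = obs[:, None] + rng.normal(0.0, np.sqrt(obs_err_var),
                                         (len(obs), n_members))
    return ensemble + K @ (obs_pert - H @ ensemble)
```

In a toy setting, observing only the lowest of three correlated "levels" pulls the unobserved levels toward values consistent with the ensemble covariance, which is exactly the mechanism that couples surface observations to the atmosphere aloft.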
Adaptive observations and data assimilation: In data assimilation, observations are combined with other sources of information about the present state of the atmosphere, such as a previous forecast, to create an analysis that provides the initial condition for a new forecast. Observations of the true state of the atmosphere are essential to obtaining a good forecast, but it is impossible to assimilate observations covering the entire atmosphere. Christine Johnson (ASP/CGD/MMM) sought to develop ways to determine which observations are most influential in the analysis and assimilation and are therefore most important to include. She addressed this question by considering how information from observations interacts with the model dynamics within advanced data assimilation schemes, such as four-dimensional variational data assimilation. She employed simple mid-latitude storm track models to compare various adaptive-observation strategies.
Applications of two-moment representations of microphysics in mesoscale models: Axel Seifert (ASP/MMM/RAP) tested the value of two-moment microphysical representations in several different settings, including precipitation formation in deep convective storms over Florida and in mesoscale convective systems. A systematic comparison of different bulk microphysical schemes, including the two-moment scheme, revealed that the effect on precipitation from mesoscale convective systems was relatively minor. However, he found that there was a substantial difference in anvil formation and on transport of ice to the upper troposphere when the two-moment scheme was used.
The production of gravity waves by convection: Jadwiga Beres (ASP/ACD/CGD/HAO) implemented a new method of specifying the gravity wave spectrum above convection into the Whole Atmosphere Community Climate Model (WACCM). This parameterization is interactive with the underlying convection and introduces realistic spatial and temporal distributions of gravity wave activity while relating gravity wave characteristics to the underlying wave source. The new scheme improves the structure of the tropical stratospheric and mesospheric Semi-Annual Oscillations (SAO) in WACCM and gives insight into the wave motions that might be responsible for extra-tropical mesospheric forcing. The newly implemented scheme also provides a new direction for gravity wave parameterizations, as the gravity wave forcing is no longer fixed and will follow changes in tropospheric climate. This allows for more accurate studies of the effects of changing climate on the middle atmosphere and of the feedbacks between the troposphere and the middle atmosphere. (Cf. Beres et al., J. Geophys. Res., submitted, 2004.)
Stochastic Parameterization: Judith Berner (ASP/CGD) explored the use of stochastic representations to parameterize unknown or unresolved processes in models. She explored obtaining such stochastic representations in two different ways: via finer-scale models that can characterize such processes explicitly, and via analysis of what unresolved processes are needed to obtain the correct moments for resolved processes. Both approaches provide an opportunity to represent, not only the mean, but also the variance in parameterizations, and so to characterize an important aspect of model error.
Storm tracks and northern anticyclones in the CCSM: Richard Cullather (ASP/CGD), in collaboration with James Hurrell (CGD), compared some features of weather as produced in the CCSM to observations. They showed that anticyclones in the model are weaker and of shorter duration than in observations, and that the North Atlantic storm track is more zonal in the simulations than in observations. In the course of this work, they also developed a feature-tracking algorithm to help automate the analysis and make it more objective.
Multifractal characterization of tropical convective systems: Wen-wen Tung (ASP/CGD) determined multifractal dimensions for complex convective systems in the tropics, for time scales ranging from diurnal to synoptic-scale. Tung used a multiplicative cascade model to reproduce the dimensions. The techniques and results can be used to judge how skillfully a model represents tropical convection, particularly in regard to extreme events.
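A multiplicative cascade of the general kind used in such studies can be illustrated in a few lines. The sketch below is a generic conservative binomial cascade (illustrative parameters, not Tung's model): mass is repeatedly redistributed between the two halves of each interval, producing a highly intermittent, multifractal measure.

```python
import numpy as np

def binomial_cascade(p=0.7, levels=10, seed=42):
    """Conservative binomial multiplicative cascade on [0, 1].

    Starts from unit mass on one interval; at each level every interval
    splits in two, and fractions p and 1-p of the parent mass go to the
    two halves (which half receives p is chosen at random).
    """
    rng = np.random.default_rng(seed)
    measure = np.array([1.0])
    for _ in range(levels):
        # Randomly assign the large weight p to the left or right child
        left = np.where(rng.random(measure.size) < 0.5, p, 1.0 - p)
        measure = np.column_stack((measure * left,
                                   measure * (1.0 - left))).ravel()
    return measure
```

Because the moments of such a measure scale nonlinearly with moment order, a single dimension does not characterize it; fitting the full spectrum of scaling exponents is what makes the description "multifractal", and comparing those exponents against observed convection provides the model-skill diagnostic described above.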
Steps toward a model of the dynamo effect: Pablo Mininni (ASP/GTP) investigated the origin of magnetic fields in planets and stars, which is usually attributed to the dynamo effect, a magnetohydrodynamic (MHD) instability governed by two parameters: the magnetic Reynolds number RM (ratio of induction to Ohmic dissipation) and the magnetic Prandtl number PM (ratio of fluid viscosity to magnetic diffusivity). Little is known for fluids with small PM, although they are common in nature. The ability to probe small PM, and to examine in detail the large-scale behavior of turbulent flows, depends critically on the ability to resolve a large number of spatial and temporal scales, or else to model them adequately. Theory demands that computations of turbulent flows reflect a clear scale separation between the energy-containing, inertial, and dissipative ranges. To this end, and to reach a better understanding of MHD turbulence and dynamo action, a combination of direct numerical simulations (DNS), the new Lagrangian-averaged model for MHD (or LAMHD, in collaboration with Annick Pouquet (CGD), David C. Montgomery (Dartmouth University), and D. Holm (Los Alamos National Laboratory)), and LES (in collaboration with Jean-François Pinton (Ecole Normale Supérieure de Lyon), Yannick Ponty (Observatoire de la Cote d'Azur), and Helene Politano (Observatoire de la Cote d'Azur)) is being adapted to this problem. After validating the models, this combination will be suited to the study of the low magnetic Prandtl number regime at high resolution, as is needed to model the geo-dynamo.
Lagrangian velocity correlations and tropospheric dispersion: Jai Sukhatme (ASP/GTP) estimated the Lagrangian velocity correlation functions (LVCFs) associated with mid-latitude tropospheric flow from daily wind data. A decomposition of the velocity field into time-mean and transient (or eddy) components helped to illustrate the nature of the LVCFs. Meridional and zonal characteristics were considered separately. The zonal LVCF was found to be non-exponential and, for a broad set of intermediate timescales, better described as a power law. The implied long-time correlation in the zonal flow results in a super-diffusive zonal absolute diffusion regime. On the other hand, the meridional LVCF decays rapidly to zero. Interestingly, before approaching zero it shows a region of negative correlation. A physical argument based on the rotational inhibition of latitudinal excursions, mediated through the time-mean flow, can account for this anticorrelation. As a result the meridional absolute diffusion, apart from showing the classical asymptotic ballistic and diffusive regimes, displays transient sub-diffusive behavior.
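The quantities involved are straightforward to estimate from a velocity time series. The sketch below is a generic illustration (not Sukhatme's analysis): it computes a normalized velocity correlation function and the absolute dispersion, and the test drives it with an Ornstein-Uhlenbeck velocity, whose correlation function is exponential (unlike the power-law zonal case described above) and whose dispersion shows the classical ballistic regime at short times.

```python
import numpy as np

def velocity_correlation(v, max_lag):
    """Normalized velocity correlation R(tau) = <v'(t) v'(t+tau)> / <v'^2>."""
    v = v - v.mean()
    var = np.mean(v * v)
    return np.array([np.mean(v[:v.size - lag] * v[lag:]) / var
                     for lag in range(max_lag)])

def absolute_dispersion(v, dt, lags):
    """Mean-square displacement of x(t) = time integral of v, at given lags."""
    x = np.cumsum(v) * dt
    return np.array([np.mean((x[lag:] - x[:x.size - lag]) ** 2)
                     for lag in lags])
```

Super-diffusive, diffusive, or sub-diffusive behavior is then read off from how the mean-square displacement scales with the lag: faster than linear, linear, or slower than linear, respectively.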
Studies of Solar and Upper-Atmosphere Physics
Ion-induced production of new particles by Galactic Cosmic Rays: Jan Kazil (ASP/HAO) and Edward Lovejoy (NOAA) used new laboratory measurements of particle nucleation by sulfuric acid and ion production rates from Galactic Cosmic Rays to predict the rate of new particle formation in the troposphere and stratosphere and the dependence of that rate on the solar cycle (Kazil and Lovejoy, 2004).
Causes of high spectral width in ionospheric Doppler-radar measurements: The SuperDARN (Super Dual Auroral Radar Network) array consists of coherent high-frequency radars that can measure the spectral width of the Doppler shift produced in the ionosphere. Large spectral widths, greater than 200 m/s, are sometimes observed, and the transition from low to high values has been interpreted as an indicator of the boundary between open and closed magnetic field lines. Using intercomparisons among radars, Emma Kavanagh (ASP/HAO) showed that the spectral width depends on the pointing angle of the radar, and this called into question its use as a proxy for magnetic-field-line boundaries. Her study considered possible causes of this directional dependence and suggested that it may arise from directional variability in the electric field in these regions.
Instabilities of downflow plumes in the solar atmosphere: In collaboration with Mark Rast (HAO), Matthias Rempel (ASP/HAO) used the FLASH code (from the University of Chicago) to study the stability of downflow plumes in a strongly stratified atmosphere. He found that the instabilities in plumes simulated in the adaptive-grid FLASH code evolve in ways that are fundamentally different from instabilities in corresponding simulations that use the fixed-grid finite-difference code of Rast. The difference arises from different ways in which viscosity is represented in the two simulations.
Water-soluble organic nitrogen in atmospheric aerosols: Kimberly Mace (ASP/ACD) monitored aerosols near a Colorado dairy by collecting size-resolved samples and analyzing them for water-soluble organic nitrogen compounds. She also set up an aerosol sampler in Puerto Rico for similar sampling, and she investigated lab techniques for detecting these compounds using gas chromatography with mass spectroscopy.
Numerical and Computational Techniques
Adaptive mesh refinement (AMR) techniques for modeling: Christiane Jablonowski (ASP/SCD) developed an adaptive dynamical core for general circulation models (GCMs) that can statically and dynamically adapt its horizontal resolution based on user-defined refinement criteria. The block-structured AMR technique has been applied to a revised version of NCAR/NASA’s Lin-Rood dynamical core. This hydrostatic dynamics package is based on a conservative finite-volume discretization that can be run as a 2D shallow water model or a 3D dynamical core. The figure below shows an example of a dynamically adapted shallow water experiment. It displays the flow over an idealized mountain at model day 10. The adapted blocks were guided by a gradient-based refinement criterion (see also http://www.scd.ucar.edu/css/staff/cjablono/amr.html). The research is done in collaboration with Michael Herzog, Joyce Penner, Robert Oehmke, Quentin Stout, and Bram van Leer (all of the University of Michigan).
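A gradient-based refinement criterion of the general kind mentioned above can be sketched simply: blocks whose maximum gradient magnitude exceeds a threshold are flagged for refinement. The Python below is a toy illustration only (the block size, threshold, and the Gaussian "mountain" in the usage are invented for the example), not the actual criterion used in the Lin-Rood AMR core:

```python
import numpy as np

def flag_blocks(field, block_size, threshold):
    """Flag grid blocks for refinement where |grad(field)| is large.

    Returns a boolean array with one entry per block; True means the
    block's maximum gradient magnitude (per grid index) exceeds the
    threshold, so the block should be refined rather than coarsened.
    """
    gy, gx = np.gradient(field)          # per-index derivatives
    gmag = np.hypot(gx, gy)
    ny, nx = field.shape
    flags = np.zeros((ny // block_size, nx // block_size), dtype=bool)
    for j in range(flags.shape[0]):
        for i in range(flags.shape[1]):
            block = gmag[j * block_size:(j + 1) * block_size,
                         i * block_size:(i + 1) * block_size]
            flags[j, i] = block.max() > threshold
    return flags
```

For an isolated mountain, only the blocks covering the steep flanks are flagged, which is the behavior visible in the adapted-block figure: resolution concentrates where the flow field varies rapidly.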
2D flow over an isolated mountain with the dynamically adaptive finite-volume dynamical core. The plot shows the geopotential height field at model day 10; the overlaid blocks show the adaptation, guided by a gradient-based refinement threshold.
Education and Communication of Science
Improving scientific visualizations and educational material in digital libraries: Kirsten Butcher (ASP/SCD and DLESE/NSDL) studied the cognitive processes involved in learning with multimedia materials. This research has been targeted toward understanding learning with digital library materials, scientific diagrams and visualizations in addition to text. Through interactions with those designing digital libraries, she has helped guide such design in directions that will improve student learning with digital resources. She has also been studying how memory constraints, inference generation, and the interaction of visualization characteristics with learners’ background knowledge affect learning through scientific visualizations and how such visualizations can be improved to communicate their messages more effectively.
Topics addressing societal responses to stresses: Shui Bin (ESIG) studied several aspects of the ways in which societies respond to stresses, including those caused by emissions to the atmosphere:
- Global Carbon Cycle and Trade. The current system for accounting for carbon dioxide emissions is based on a within-border-measurement (WBM) rationale and so does not represent the full global impact of a country’s consumption-driven CO2 emissions. Shui Bin and Robert Harriss (ESIG) used a case study of the carbon emissions embodied in trade among North American Free Trade Agreement (NAFTA) countries to argue that a proposed within-border-consumption (WBC) accounting framework would provide a better estimate of a country’s real contribution to global carbon emissions, because such a measure would capture the global effect of domestic consumption.
- The Role of Embodied Carbon in Sino-US Trade. The economic and political aspects of Sino-US trade have been discussed intensively, but the associated environmental impacts (including carbon dioxide emissions embodied in the Sino-US trade flows) have drawn little attention. Shui Bin and Harriss used 1997-2003 data and two analyses of scenarios to discuss the significant ways in which carbon emissions are associated with Sino-US trade.
- Consumer responses to environmental impacts. In collaboration with Hadi Dowlatabadi and others in ESIG, Shui has pursued several studies of the ways in which human consumption responds to environmental impacts. More than 60% of US energy use and CO2 emissions are a consequence of consumers’ direct and indirect consumption activities, so an objective of this work is to help direct attention to the important role of the consumer in assessment of environmental impacts. Her work in this area has stressed that impact is not always represented well by income (as is sometimes assumed), that consumer choices on such matters as energy sources for water heaters can reduce energy consumption significantly (often with economic benefits to consumers), and that a consumer emissions inventory is a valuable tool when reduction strategies are considered.
Re-creation of Mann, Bradley, Hughes (MBH, 1998 & 1999) Reconstructions of N. Hemisphere Climate over the Last Millennium: In collaboration with Caspar Ammann (CGD), Doug Nychka (GSP), and Claudia Tebaldi (GSP), Eugene Wahl (ASP/ESIG) led an effort to replicate the MBH98 (1400-1980) reconstruction of the northern hemisphere annual average temperature. The 1400-1980 reconstruction has been replicated nearly exactly, showing it is highly robust to a number of simplifications purposefully introduced relative to the original algorithm. This substantial replication is the first of its kind, which is important in that the MBH99 reconstruction for the entire period 1000-1980 has been featured prominently in the Third IPCC Report and elsewhere as key evidence concerning anomalous hemispheric warming in the 20th century.
Ethics of Generation and Use of Climate and Short-term Weather Forecasts: Eugene Wahl also worked with Rebecca Morss (ESIG and MMM) to analyze assumptions, methods, limitations, and uses of weather and climate forecasts from an ethical perspective. Criteria and methods were drawn from modern applied ethics (in particular, the “Georgetown School” of analytical criteria and the method of iterative “reflective specification” developed by John Rawls), which have been developed into a framework specifically oriented to quandaries that arise in the preparation and provision of forecasts. A specific case was examined, the great flood of the Red River of the North in 1997, in which decisions made by forecasters had particularly important ethical dimensions.
Matt Dunn's PhD research project is the chemical characterization of newly formed atmospheric particles, which he is pursuing via thermal desorption chemical ionization mass spectrometry. An early result of this work has been the measurement of nanoparticle size distributions in Mexico City and the observation of new particle formation and particle growth in an urban setting. (Cf. Dunn et al., 2004.)
Katja Dzepina used measurements from the Mexico City Metropolitan Area 2003 (MCMA-2003) field campaign, which took place from March 29 to May 4, 2003, to show that non-refractory aerosols larger than about 1 µm were about 2/3 organic carbon (both primary and secondary) and about 1/3 inorganic species (mainly nitrate, sulfate, and ammonium, with a smaller contribution from chloride). Organic aerosols thus dominated the aerosol concentration in Mexico City during this period.
GTP graduate student Jonathan Graham is investigating turbulence in magnetohydrodynamic (MHD) fluids. He is currently examining the so-called alpha model's capability to reproduce intermittency in the absence of direct, fully resolved calculations. Intermittency can be temporal, like the intermittency of solar flares, or like an intermittent problem with your car that is never there when the mechanic looks at it. More rigorously, it is an enhancement of the likelihood of rare events, either temporal or spatial (as is the case for 2-dimensional MHD). This intermittency is a fundamental element of turbulence for which there is no theory, only empirical relations. It is therefore anticipated that this work will make an essential contribution to the attempt to develop a theory of turbulence.
Graduate research fellow Sara Lance (ASP/ACD) is investigating the effect of organic-containing aerosol on aerosol hygroscopicity and cloud droplet growth, both of which have important implications for climate change. Lance will be using a humidified tandem differential mobility analyzer to measure the hygroscopic growth of aerosols generated and sampled in an NCAR lab. A new continuous flow streamwise thermal gradient cloud condensation nuclei (CCN) counter will be used to observe the CCN activity of these aerosols. A thermal desorption chemical ionization mass spectrometer and possibly an aerosol mass spectrometer will also be used to concurrently measure the particle chemical composition. It is anticipated that these instruments will be deployed for the Megacities Impact on Regional And Global Environments (MIRAGE) campaign in Mexico City in 2006.
Justin Peter's PhD research project, which he completed in FY2004 under the guidance of Steven Siems (Monash University) and Jorgen Jensen (ATD), examined the effects of clouds on the aerosol size distribution using measurements collected during the ACE-Asia field experiment. Two case studies were used to illustrate the effects: a cumulus cloud and an extratropical cyclone. For the cumulus study, mixing diagrams of conserved variables (wet equivalent potential temperature and total water mixing ratio) were used to identify the fraction of cloud-base air that had combined with environmental air entrained into the cloud. By using aerosol size distributions measured below and around the cumulus cloud, Peter extended this classical approach to deduce the origins of detrained air. His approach was to predict the shape of the aerosol size distribution for different sources and then compare the predictions to observations to determine the likely true sources. The measurements showed evidence of size-dependent scavenging of the aerosol, with up to a 20% decrease in the concentration of ultra-fine and fine-mode particles and up to a 40% decrease in the accumulation mode. The cloud not only caused a general reduction in aerosol concentration, but also substantially modified the aerosol size distribution in the lower free troposphere. This effect may significantly influence the properties of clouds subsequently formed on the detrained aerosol. In addition, it was observed that the concentration of ultra-fine particles in the detrained air was greater than that predicted by a simple model of in-cloud scavenging, as would be the case if ultra-fine particles are produced in cloud.
Senior Research Associates
Sources of mercury in the atmosphere: In the past year Hans Friedli (ASP) and Lawrence Radke (CGD) completed a study of mercury sources and transport from Southeast Asia that was conducted as part of the ACE-Asia field project. This and a related study demonstrate that Southeast Asia is the major contributor to anthropogenic mercury pollution. Other studies by Friedli and collaborators of mercury reservoirs in boreal forests have shown that more than 90% of the mercury pools are located in the organic soil and so are likely to be fully or partly released during major wildfires. The pools are very large and parallel the carbon storage in the boreal areas.
Possible mitigation of global warming by modification of cloud albedo: John Latham (ASP) continued exploration of a novel idea for the amelioration of global warming by the advertent and controlled enhancement of the albedo A and longevity L of low-level maritime clouds. Detailed calculations coupled with computer modeling with the UK Meteorological Office GCM support the quantitative validity of the proposed technique, which involves increasing the droplet concentration in such clouds, with a corresponding increase in both A and L and thus cooling. The idea involves the dissemination at the ocean surface of small seawater droplets in sufficient quantities to act as the dominant CCN on which cloud droplets form. Satellite control of the overall dissemination rate is envisaged. Latham’s collaborators include Keith Bower and Tom Choularton (both of UMIST, Manchester, UK), Alan Blyth, Alan Gadian, and Mike Smith (all of University of Leeds, UK), Stephen Salter (University of Edinburgh, UK), and Andy Jones (Hadley Climate Centre, Meteorological Office, UK). If this technique were to prove workable on the scales required, it could be of great societal importance.
Deducing ice content in thunderstorms from satellite observations of lightning: Latham and collaborators, including Hugh Christian, Walt Petersen, and Wiebke Deierling (all of NASA/MSFC), Alan Gadian and Alan Blyth (both of University of Leeds, UK), Rumjana Mitzeva (University of Sofia, Bulgaria), Scott Ellis (ATD), and Jim Dye (MMM), are continuing their examination of the extent to which it is possible to determine thundercloud ice characteristics from satellite observations of lightning, which are now routinely made on a global scale using NASA/MSFC devices. A specific goal is to ascertain whether measurements of lightning frequency f can yield estimates of precipitating and non-precipitating ice fluxes. Their computations, and particularly recent data analysis, support their hypothesis that f is roughly proportional to the product of the downward flux fg of graupel through the body of the thundercloud and the upward flux fi of ice crystals into its anvil. This raises the possibility of determining, on a global basis, values of fg and/or fi from the lightning measurements. Such information could have considerable climatological and nowcasting importance, particularly with respect to flooding.
Jerry Mahlman (ASP) provided expert testimony on global warming and served on several review panels and advisory panels during the year. In his role as science advisor for the National Geographic issue on global warming, he contributed advice and helped guide this story toward an accurate and clear representation of the science issues involved.
Geophysical Turbulence Program
The Geophysical Turbulence Program (GTP) hosted two workshops this year in addition to the regular seminar series and an active visitor program. These workshops are described in the Executive Summary.
Annick Pouquet (GTP), postdoctoral fellow Pablo Mininni (ASP/GTP/University of Buenos Aires), and graduate fellow Jonathan Graham (GTP/University of Colorado) initiated a project to model MHD turbulence. A set of comparisons for 2D MHD has been made between the DNS code, which solves the primitive unmodified MHD equations, and the so-called alpha model developed by Holm and his collaborators. The comparisons are favorable for the evolution of the long-wavelength parts of the spectra.
Dynamo action in the regime of a low ratio of kinematic viscosity to magnetic diffusivity, or low magnetic Prandtl number PM, as in the molten part of the Earth's interior, is presently being investigated. The critical magnetic Reynolds number for dynamo action increases abruptly, roughly eightfold, for values of PM below unity when compared to PM=1, and then reaches a plateau. The problem is linked to turbulence, which renders the dynamo difficult to start by blurring, so to speak, the magnetic field lines.
In addition, part-time graduate student Wilfred Thompson completed work on a server-client tool, GProbe, which provides a GASpAR user with on-the-fly diagnostics, enabling the user to monitor run performance. The monitor is written in Java, with a MySQL database on the server enabling the client to perform playback and display. Thompson also modified the GBin I/O software within GASpAR to accommodate proper output synchronization on multiprocessors when the number of finite elements on each processor may differ (even substantially).
The Turbulence Numerics Team within GTP, under the leadership of Annick Pouquet and Duane Rosenberg (GTP), has continued a multi-year project to develop an object-oriented Geophysical and Astrophysical Spectral element Adaptive Refinement (GASpAR) code for application to turbulent flows. Like most spectral-element codes, GASpAR combines the efficiency of finite-element methods with the accuracy of spectral methods, and it is designed to be flexible enough for a range of geophysics and astrophysics applications where turbulence or other complex multiscale problems arise. The computational core is based on spectral element operators, which are represented as objects. The formalism accommodates both conforming and nonconforming elements, and their associated data structures for handling inter-element communications in a parallel environment. Many aspects of this code are a synthesis of existing methods; however, GTP has focused on a new formulation of dynamic adaptive mesh (nonconforming h-type) refinement. This year has seen the addition of the adaptive algorithms and the associated connectivity algorithms and demonstrations of the effectiveness of the code on a number of test problems. It is anticipated that the 2D code will enter production in calendar 2004 (early FY2005).
Consequences of the “alpha model,” also called the “Lagrangian-averaged” model, for two-dimensional incompressible magnetohydrodynamic (MHD) turbulence were explored by Annick Pouquet. This model is an extension of the smoothing procedure in fluid dynamics which filters velocity fields locally while leaving their associated vorticities unsmoothed, and has proved useful for high Reynolds number turbulence computations. Several known effects (selective decay, dynamic alignment, inverse cascades, and the probability distribution functions of fluctuating turbulent quantities) in magnetofluid turbulence have been considered. Numerical solutions of the primitive MHD equations were compared to their alpha-model counterparts' performance for the same flows in regimes where available resolution is adequate to explore both. The hope is to justify the use of the alpha model in regimes that lie outside currently available resolution, as will be the case in particular in three-dimensional geometry or for magnetic Prandtl numbers differing significantly from unity. The investigation, using direct numerical simulations with a standard and fully parallelized pseudo-spectral method and periodic boundary conditions in two space dimensions, focused on the role that such a modeling of the small scales using the Lagrangian-averaged framework plays in the large-scale dynamics of MHD turbulence. Several flows were examined, and for all of them one can conclude that the statistical properties of the large-scale spectra were recovered, whereas small-scale detailed phase information (such as the location of structures) is lost.
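Schematically, the smoothing in the Lagrangian-averaged (alpha-model) framework relates the rough and filtered fields through a Helmholtz operator with filter length $\alpha$ (the standard form of the filter; the notation here is generic rather than taken from the study):

```latex
\mathbf{u} = \left(1 - \alpha^{2}\nabla^{2}\right)\mathbf{u}_{s},
\qquad
\mathbf{B} = \left(1 - \alpha^{2}\nabla^{2}\right)\mathbf{B}_{s}.
```

The dynamics are advanced using the smoothed fields $\mathbf{u}_{s}$, $\mathbf{B}_{s}$ in the advection terms, so that scales smaller than $\alpha$ are filtered while the associated vorticity and current density remain unsmoothed.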
A heuristic derivation of the power-law spectra that obtain in anisotropic magnetohydrodynamic (MHD) flows was given by Pouquet; it is applicable for both weak and strong turbulence. To that end, the concept of critical balance introduced in Goldreich and Sridhar (1995) was generalized; it exploits a relationship between the perpendicular and parallel wavenumber scaling exponents of the energy spectrum, the directions referring to that of an imposed uniform magnetic field. As a result of this analysis, it is possible to recover with the same phenomenological approach the three standard energy spectra for fluid and MHD turbulence, namely the Kolmogorov (1941), the Iroshnikov (1963) - Kraichnan (1965), and the weak Alfven wave turbulence constant-flux solutions derived in Galtier et al. (2000). Furthermore, in the latter case of weak MHD turbulence, the first heuristic derivation was given of the power-law spectrum for the precursor to the Kolmogorov constant-flux solution, which was discovered numerically in Galtier et al. (2000). Finally, similar relationships for other types of wave motions, such as whistler or inertial waves, were also outlined (paper submitted to Phys. Rev. Lett.).
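For reference, the three constant-flux spectra recovered by this phenomenology take their standard forms (with $\varepsilon$ the energy dissipation rate and $v_{A}$ the Alfvén speed of the imposed field):

```latex
E(k) \sim \varepsilon^{2/3}\, k^{-5/3}
\quad \text{(Kolmogorov 1941)},
\qquad
E(k) \sim \left(\varepsilon\, v_{A}\right)^{1/2} k^{-3/2}
\quad \text{(Iroshnikov--Kraichnan)},
\qquad
E(k_{\perp}, k_{\parallel}) \sim f(k_{\parallel})\, k_{\perp}^{-2}
\quad \text{(weak Alfv\'en-wave turbulence, Galtier et al. 2000)}.
```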
The origin of magnetic fields in planets and stars is usually attributed to the dynamo effect, a magnetohydrodynamic instability governed by two parameters: the magnetic Reynolds number Rm (ratio of induction to Ohmic dissipation) and the magnetic Prandtl number Pm (ratio of fluid viscosity to magnetic diffusivity). Little is known for fluids with small Pm, although they are common in nature, the difficulty stemming from the turbulence associated with regimes where self-generation may develop. Here, Pouquet and collaborators considered numerically a flow that generates a dynamo at Pm=1 and measured the evolution of the dynamo threshold for fluids down to Pm=0.01, using a combination of numerical and modeling schemes. They found that the development of turbulence induces a tenfold increase in the critical magnetic Reynolds number Rmc, and that Rmc remains roughly constant as Pm is further lowered. Thresholds predicted by kinematic simulations using time-averaged velocities yield correct estimates of Rmc.
Aimé Fournier (GTP) has developed new multi-resolution aspects of the spectral-element method, including error estimators and triadic interactions. The latter express the fundamental nonlinearity of fluid dynamics that accounts for possibly intermittent turbulent energy and enstrophy cascades across scales.
Jai Sukhatme's (ASP/GTP) research has been in the broad field of transport and mixing. On one hand, his work has been devoted to abstract issues, such as studying the asymptotic self-similar state of the advection-diffusion equation in periodic domains. On the other hand, a set of problems currently under investigation deals with transport and mixing in realistic tropospheric flows. The aim is to quantify tropospheric transport and to make contact with the available literature on transport in ideal turbulent fields. Details of some of these projects can be found at his webpage: http://www.asp.ucar.edu/~jai