We are pleased to announce that the 2018 IMS Annual Meeting will be held in conjunction with the 12th Vilnius Conference on Probability Theory and Mathematical Statistics on July 2–6, 2018, in Vilnius, the capital of Lithuania and one of the most beautiful cities in Europe.

The Vilnius Conferences on Probability and Mathematical Statistics have a long and successful history. The first Vilnius Conference on Probability Theory and Mathematical Statistics was organized in June 1973, becoming the first big international meeting on probability theory and mathematical statistics in the former Soviet Union. Today, the conference is established as a must-attend international quadrennial event for many researchers in the field.

Many prominent probabilists and statisticians will attend the conference. There will be more than 180 invited talks. The Wald Lectures will be delivered by Luc Devroye; the 2018 Le Cam Lecturer is Ruth Williams*; the Neyman Lecturer is Peter Bühlmann*; and the Schramm Lecturer is Yuval Peres. Six Medallion lectures will be given at this meeting by Jean Bertoin, Svante Janson*, Thomas Mikosch, Sonia Petrone*, Richard Samworth and Allan Sly. The Vilnius Lecture will be delivered by the Lithuanian probabilist Liudas Giraitis.

The conference coincides with an important anniversary in Lithuania’s history. On the 16th of February, 1918, an Act of Independence declared an independent State of Lithuania. Vilnius, where the heritage of the past and the achievements of the present in science and culture co-exist in harmony, will be full of Lithuanian centennial events throughout the year. Vilnius has unique baroque buildings and the largest Old Town in central and eastern Europe, which has preserved its medieval network of streets and typical spaces, reminding us of the beautiful ancient Italian towns where Baroque was born. The Old Town of Vilnius, located in an amphitheatre of breathtaking nature, has been on the UNESCO World Heritage List since 1994.

Conference delegates will have an opportunity to explore Vilnius during special tours organized on the 4th of July, and to taste delicious Lithuanian food during a Conference dinner on the same day.

The organizers of the Conference are the Lithuanian Mathematical Society, Vilnius University and IMS. The Program Co-chairs are Peter Bühlmann (IMS) and Vygantas Paulauskas (Vilnius). The Organizing Committee Co-chairs are Erwin Bolthausen (IMS) and Remigijus Leipus (Vilnius).

You can read more on the conference website: http://ims-vilnius2018.com.

* See some more Special Invited Lecture previews on pages 8–13 of the June/July 2018 Bulletin.

The 2018 Carver Medal is awarded to **Byeong Uk Park**, for his efforts and dedication in heading the local organizing committee of the inaugural Asian Pacific Rim Meeting (IMS-APRM) in Seoul in 2009, and in chairing the scientific program committees of the next two meetings in 2012 and 2014, thereby establishing this meeting as a regular part of the IMS conference series and increasing the presence of IMS in the region. Byeong Uk Park is a Professor of Statistics at Seoul National University in South Korea.

Park earned his BSc in 1982 and MSc in 1984, both from Seoul National University (SNU), and his PhD in 1987 from the University of California, Berkeley, under the supervision of Peter Bickel. He was a Visiting Assistant Professor at the University of North Carolina at Chapel Hill in 1987–88, and since then has been on the faculty of SNU’s Department of Statistics, including serving as its Chair from 1999 to 2002.

Among his honors and awards, he is an invited lecturer this year at the International Congress of Mathematicians; he gave the Laplace Lecture at the 9th World Congress in Probability and Statistics in Toronto in 2016; he has received SNU’s Scholarly Research Award (2017), the National Academy of Science Award (2014), the Korean Order of Service Merit (2014), SNU’s College of Natural Science Research Grand Prize (2013), the First Korean Statistical Society Gallup–Korea Award (2006), and the First Junior Statistician Award of the Korean Statistical Society (1992). He is a Fellow of IMS, ASA and the Korean Academy of Science and Technology, and an Elected member of the International Statistical Institute.

On hearing about the award, Professor Park said, “I am honored to receive the Carver Medal. At the same time I feel indebted to all Korean colleagues who helped me in organizing the inaugural IMS-APRM, and also to the IMS leadership at that time for making this conference series possible and letting me actualize it.”

Byeong Uk Park will receive the Carver Medal at the Presidential Address and Awards session at the IMS Vilnius meeting on July 2: http://ims-vilnius2018.com/program/

Those elected this year bring the total number of active members to 2,382 and the total number of foreign associates to 484. Foreign associates are nonvoting members of the Academy, with citizenship outside the United States.

Trevor Hastie is professor of statistics and John A. Overdeck Professor of Mathematical Sciences and Biomedical Sciences at Stanford University.

Simon Tavaré is director of the Cancer Research UK Cambridge Institute, at the University of Cambridge, UK.

The National Academy of Sciences recognizes achievement in science by election to membership, and, with the National Academies of Engineering and Medicine, provides science, engineering, and health policy advice to the US government and other organizations.

Philip Dawid, Emeritus Professor of Statistics at the University of Cambridge, has made fundamental contributions to both the philosophical underpinnings and the practical applications of statistics. His theory of conditional independence is a keystone of modern statistical theory and methods, and he has demonstrated its usefulness in a host of applications, including computation in probabilistic expert systems, causal inference, and forensic identification. His co-authored book *Probabilistic Networks and Expert Systems* won the first DeGroot Prize, and he was awarded the Royal Statistical Society’s Guy Medal in Silver in 2001. For many years Philip Dawid was Professor of Probability and Statistics at University College London. He has served as Editor of *Biometrika* and of the *Journal of the Royal Statistical Society (Series B)*, and as President of ISBA.

Nancy Reid is University Professor and Canada Research Chair in Statistical Theory and Applications at the University of Toronto. Her research interests include statistical theory, likelihood inference, design of studies, and statistical science in public policy. Nancy is a Fellow of IMS, RSS, the Royal Society of Canada, and the American Association for the Advancement of Science, and a Foreign Associate of the National Academy of Sciences. In 2015 she was appointed Officer of the Order of Canada.

**Alexander Aue**, Professor, University of California, Davis

**Sourav Chatterjee**, Professor of Mathematics and Statistics, Stanford University

**Ivan Corwin**, Professor of Mathematics, Columbia University

**Christopher Field**, Professor Emeritus of Statistics, Dalhousie University

**Peter Hoff**, Professor of Statistical Science, Duke University

**Bing-Yi Jing**, Professor, Hong Kong University of Science and Technology

**Geurt Jongbloed**, Professor of Statistics, Delft University of Technology

**Piotr Kokoszka**, Professor, Colorado State University

**Steven Kou**, Director of the Risk Management Institute and Class ’62 Professor of Mathematics, National University of Singapore

**Antonio Lijoi**, Professor of Statistics, Bocconi University, Italy

**Sayan Mukherjee**, Professor, Duke University

**Sofia Charlotta Olhede**, Professor of Statistics, Honorary Professor of Computer Science, and Director of UCL’s Centre for Data Science, UK

**Davy Paindaveine**, Professor of Statistics, Université libre de Bruxelles

**Giovanni Peccati**, Professor in Stochastic Analysis and Financial Mathematics, Luxembourg University

**Elvezio Ronchetti**, Professor of Statistics, University of Geneva

**David A. Schoenfeld**, Professor of Medicine, Harvard Medical School; Professor of Biostatistics, Harvard T. H. Chan School of Public Health

**Sunder Sethuraman**, Professor of Mathematics, University of Arizona

**Huixia Judy Wang**, Professor of Statistics, The George Washington University

**Aihua Xia**, Professor, The University of Melbourne

**Jianfeng Yao**, Professor, The University of Hong Kong


**Bin Yu** is Chancellor’s Professor in the Departments of Statistics and of Electrical Engineering and Computer Sciences at the University of California at Berkeley. Growing up in China, Yu received her Bachelor’s degree in Mathematics from Peking University in 1984, and completed her Master’s (1987) and PhD (1990) in Statistics at UC Berkeley. Prior to becoming a faculty member at Berkeley in 1992, she was an assistant professor at the University of Wisconsin–Madison. She has held visiting positions at several universities and institutes, including Yale, ETH Zürich, Bell Labs, the Poincaré Institute and INRIA, and since 2005 has been a founding co-director of the Microsoft Joint Lab on Statistics and Information Technology at Peking University in China.

Yu’s research interests include statistical inference, machine learning, and information theory; currently she focuses on statistical machine learning methodologies and on theory and algorithms for solving high-dimensional data problems. Her collaborations are highly interdisciplinary and include scientists from genomics, neuroscience, precision medicine, and political science.

Yu has received numerous awards and honors. She is a member of the US National Academy of Sciences, became a Guggenheim Fellow in 2006, and is a Fellow of the ASA, IMS, AAAS, and IEEE; she was the IMS President in 2013–2014, and the IMS Rietz Lecturer in 2016. Her impressive list of professional activities includes serving on many editorial boards, such as *PNAS*, *Journal of Machine Learning Research*, *Technometrics*, *Annals of Statistics*, and *JASA*.

The award to Bin Yu will be presented by Shirley Mills, Chair of the COPSS Elizabeth L. Scott Award Committee.

**Susan A. Murphy** is Professor of Statistics and Computer Science at the Harvard John A. Paulson School of Engineering and Applied Sciences, and Radcliffe Alumnae Professor at the Radcliffe Institute, Harvard University. Prior to joining Harvard in 2017, she was the H.E. Robbins Distinguished University Professor of Statistics, Professor of Psychiatry, and Research Professor at the Institute for Social Research at the University of Michigan. Growing up in southern Louisiana, Murphy received her BSc in Mathematics from Louisiana State University in 1980, and her PhD in Statistics from UNC–Chapel Hill in 1989. She was a faculty member at Penn State (1989–97) before joining the University of Michigan.

Murphy’s research interests include experimental design and causal inference to inform sequential decision making. She developed the sequential, multiple assignment, randomized trial (SMART) design, which is used by scientists and clinical researchers to build better treatments for a broad range of health problems including ADHD, autism and depression. Her research lab focuses on methods for improving real-time sequential decision-making in mobile health, e.g. methods and algorithms that can be deployed on wearable devices, to deliver individually tailored treatments.

Among many honors, Murphy was inducted into the US National Academy of Sciences in 2016 and the National Academy of Medicine in 2014. She was awarded a MacArthur Fellowship in 2013, and is a Fellow of ASA, IMS and the College on Problems in Drug Dependence. She served as a co-editor of the *Annals of Statistics* (2007–09), was the 2015 IMS Wald Lecturer, and is the current President of the Bernoulli Society.

Susan A. Murphy will deliver the 2018 Fisher Lecture [*abstract below*] entitled “*The Future: Stratified Micro-randomized Trials with Applications in Mobile Health*” at 4pm on Wednesday, August 1, following the presentation of the Fisher award by Alicia Carriquiry, Chair of the COPSS Fisher Lecture Committee.

**Susan Murphy, 2018 R.A. Fisher Lecture**

*The Future: Stratified Micro-randomized Trials with Applications in Mobile Health*

Technological advances in mobile devices and wearable sensors make it possible to deliver treatments anytime and anywhere to users like you and me. Increasingly, the delivery of these treatments is triggered by detections or predictions of vulnerability and receptivity. These observations are likely to have been impacted by prior treatments. Furthermore, the treatments are often designed to have an impact on users over a span of time during which subsequent treatments may be provided. Here we discuss our work on the design of a mobile health smoking cessation study in which the above two challenges arose. This work involves the use of multiple online data analysis algorithms. Online algorithms are used in the detection, for example, of physiological stress. Other algorithms are used to forecast, at each vulnerable time, the remaining number of vulnerable times in the day. These algorithms are then inputs into a randomization algorithm that ensures that each user is randomized to each treatment an appropriate number of times per day. We develop the stratified micro-randomized trial, which involves not only the randomization algorithm but also a precise statement of the meaning of the treatment effects and the primary scientific hypotheses, along with primary analyses and sample size calculations. Considerations of causal inference, and of the potential causal bias incurred by inappropriate data analyses, play a large role throughout.
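The budget-balancing idea in the last steps can be illustrated with a toy simulation. Everything below is a hypothetical, simplified sketch (the rule, the budget, and the oracle forecast are all illustrative assumptions, not the algorithm of the talk): at each vulnerable time, treat with probability equal to the remaining daily treatment budget divided by the forecast number of remaining vulnerable times.

```python
import random

random.seed(7)

def treatment_probability(remaining_budget, forecast_remaining):
    """Hypothetical randomization rule: spend the remaining daily treatment
    budget evenly over the forecast number of remaining vulnerable times."""
    if forecast_remaining <= 0:
        return 1.0 if remaining_budget > 0 else 0.0
    return max(0.0, min(1.0, remaining_budget / forecast_remaining))

def simulate_day(n_vulnerable, daily_budget):
    """Walk through one user-day of vulnerable times and count treatments."""
    treated = 0
    for i in range(n_vulnerable):
        remaining_budget = daily_budget - treated
        forecast = n_vulnerable - i        # oracle forecast, for illustration only
        p = treatment_probability(remaining_budget, forecast)
        if random.random() < p:
            treated += 1
    return treated

# Over many simulated user-days, the number of treatments per day tracks the budget.
days = [simulate_day(n_vulnerable=10, daily_budget=2) for _ in range(5000)]
print(sum(days) / len(days))
```

With an oracle forecast as above, this rule reduces to classical sequential sampling without replacement and delivers exactly the budgeted number of treatments every day; with noisy forecasts of the remaining vulnerable times it would only do so on average.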

This will be the first time the JSM has had lectures named after women. The JSM has been held annually since 1840. It is the largest gathering of statisticians in North America, and one of the largest in the world. Each year there are over 6,000 participants from over 50 countries.

The Elizabeth L. Scott Lecture and the F.N. David Lecture will be included in the COPSS portfolio, which already includes the prestigious Fisher Lecture [*given this year by Susan Murphy*]. According to Nick Horton, Chair of COPSS, “One of the main tasks for COPSS involves granting awards that highlight the work of notable statisticians. I’m proud that starting in 2019, at least one of the lectures at the JSM will be named after a woman. This is long overdue.”

The Caucus for Women in Statistics (CWS) spearheaded the effort to establish the lectureships, along with the ASA LGBT Concerns Committee, ASA Committee on Women, SSC Committee on Women, ISI Committee on Women, and IBS/ENAR/WNAR. The COPSS Executive Committee voted unanimously to approve it.

The idea that too few women receive national recognitions for their research and scholarship is not new. The National Science Foundation in 2010 established an AWARDS project “to investigate and improve the process of granting awards and prizes for scholarly achievement” in disciplines like statistics. This project led to many association reforms.

Establishing a new named lecture slot at the JSM for the Scott Lecture and David Lecture is another significant step forward in advancing the statistics profession. It adds a face to the profession’s ongoing and growing commitment to diversity and inclusion. 2018 CWS President Shili Lin remarked: “I’m so excited and grateful that the long overdue recognitions for women in statistics in the form of two named lectures are finally here, and here to stay!”

The new lectures will be given by the winners of the COPSS awards named after outstanding women statisticians: Elizabeth L. Scott (1917–1988) and F. N. David (1909–1993). The two awards are given in alternate years. Both awards have high profiles and are highly sought within the statistics profession.

The first lecture will be the F.N. David Lecture at the 2019 JSM in Denver, Colorado, July 27–August 1. ASA Committee on Women Chairperson Kimberly Sellers remarked: “Already looking forward to JSM 2019!”


**Ruth J. Williams** studied mathematics at the University of Melbourne, where she earned BSc (Hons) and MSc degrees, and then at Stanford University, where she earned her PhD. Following a postdoc at the Courant Institute in New York, she took up a position as an Assistant Professor at the University of California, San Diego (UCSD). She is currently a Distinguished Professor of Mathematics and holds the Charles Lee Powell Chair in Mathematics I at UCSD.

*Ruth Williams’ research in probability theory and its applications includes work on reflecting diffusion processes in non-smooth domains, heavy traffic limit theorems for multiclass queueing networks, and fluid and diffusion approximations for the analysis and control of more general stochastic networks, including those described by measure-valued processes. Her current research includes the study of stochastic models of complex networks, for example, those arising in Internet congestion control and systems biology.*

*Among her honors, she is an elected member of the National Academy of Sciences, an elected fellow of the American Academy of Arts and Sciences, and a fellow of AAAS, AMS, IMS and INFORMS. In 2012 she served as President of the IMS, and in 2016, jointly with Martin Reiman, she was awarded the John von Neumann Theory Prize by INFORMS. She delivered an IMS Special Invited paper (now called the Medallion lecture) in 1994.*

*Ruth Williams will give this Le Cam Lecture at the IMS Annual Meeting in Vilnius, Lithuania, on Monday July 2, 2018.*

Stochastic models of complex networks with limited resources arise in a wide variety of applications in science and engineering, e.g., in manufacturing, transportation, telecommunications, computer systems, customer service facilities, and systems biology. Bottlenecks in such networks cause congestion, leading to queueing and delay. Sharing of resources can lead to entrainment effects. Understanding the dynamic behavior of such modern stochastic networks presents challenging mathematical problems.

While there are exact steady-state analyses for certain network models under restrictive distributional assumptions, most networks cannot be analyzed exactly. Accordingly, it is natural to seek more tractable approximate models. Two types of approximations which have been used to study the stability and performance of some stochastic networks are fluid and diffusion models. There is now a substantial theory of such approximate models for queueing networks in which service to a queue is given to the job at the head of the line (HL networks). However, for queueing networks with non-HL service, such as processor sharing or random order of service, and for more general stochastic networks that do not have a conventional queueing network structure, the development of approximate models (and of rigorous scaling limit theorems to justify them) is still in its early stages.

This talk will describe some recent developments and open problems in this area. A key feature will be dimension reduction, resulting from entrainment due to resource sharing. Examples will be drawn from bandwidth sharing and enzymatic processing.

For background reading, see the survey article: R. J. Williams, Stochastic Processing Networks, *Annu. Rev. Stat. Appl.* **3** (2016): 323–45.

**Peter Bühlmann** is Professor of Statistics and Mathematics at ETH Zürich. Previously (1995–97), he was a Neyman Visiting Assistant Professor at the University of California at Berkeley. His current main research interests are in causal and high-dimensional inference, computational statistics, machine learning, and applications in bioinformatics and computational biology. He is a Fellow of IMS and ASA, and he served as Co-Editor of the *Annals of Statistics* from 2010 to 2012. He received an honorary doctorate from the Université Catholique de Louvain in 2017, and is the recipient of the 2018 Royal Statistical Society Guy Medal in Silver.

*Peter will give this Neyman Lecture at the IMS Annual Meeting in Vilnius, Lithuania, on Tuesday July 3, 2018.*

**Jerzy Neyman: my starting point.** Jerzy Neyman (1923) considered agricultural field experiments where unobserved “potential yields” from a plant variety are modeled with a fully randomized assignment mechanism [5]. Historically, it appears that Neyman was the first to formalize causal inference using potential outcomes. It turned out to be an important milestone on which many developments, methods and algorithms build, cf. [4].

**Causality: it’s about predicting the answer to a “What if I do” question.** Loosely speaking, the main task in causality is to predict a potential outcome under a certain treatment or in a certain environment, based on data where this treatment has not been observed. In many modern applications, we are faced with such prediction tasks. For example, in genomics: what would be the effect of knocking down (the activity of) a gene on the growth rate of a plant? We want to make a prediction without having data on such a gene knock-out (i.e., no data for this particular perturbation). Similar questions arise in economics, e-commerce and many other areas.

**Invariance in heterogeneous data.** Structural equation models are another framework for the same causal inference task as in the potential outcome setting. There is a key invariance assumption in this framework, first formalized by Trygve Haavelmo (the Norwegian economist and 1989 Nobel Laureate) in 1943 [1]. “Invariance” is the first word of the lecture’s title because it is a crucial center point: we will focus on an invariance principle in the context of heterogeneous and potentially “large-scale” data. In a nutshell, Haavelmo [1] had recognized that:

causal model structures ⇒ invariance of a certain type.

One new line of thinking is the reverse relation, namely:

invariance of a certain type ⇒ causal model structures.

With access to large-scale heterogeneous data, we can simply estimate invariance from the data which then leads to estimated causal structures; or in other words, we infer causality from a special well-defined notion of “stability” or invariance [2]. Here, heterogeneity is not a nuisance but a “real friend” since it enables the search for invariance (a.k.a. stationary structures) within heterogeneous (a.k.a. non-stationary) data.
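The reverse implication, searching for predictors whose residual distribution stays invariant across environments, can be illustrated with a toy structural equation model. The simulation below is a simplified sketch in the spirit of this idea, with made-up variables and a hand-rolled regression, not the actual methodology of [2]:

```python
import random
from statistics import variance

random.seed(0)

def simulate(env_scale, n=20000):
    """Toy linear SEM X -> Y -> Z: the environment rescales X and the noise
    feeding Z, but never touches the causal mechanism Y = 2X + noise."""
    data = []
    for _ in range(n):
        x = random.gauss(0.0, env_scale)        # X is perturbed across environments
        y = 2.0 * x + random.gauss(0.0, 1.0)    # invariant causal mechanism
        z = y + random.gauss(0.0, env_scale)    # Z is a noisy descendant of Y
        data.append((x, y, z))
    return data

def ols_residual_var(pairs):
    """One-dimensional least squares of y on x; returns the residual variance."""
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    beta = sum((p[0] - mx) * (p[1] - my) for p in pairs) / \
        sum((p[0] - mx) ** 2 for p in pairs)
    alpha = my - beta * mx
    return variance([p[1] - (alpha + beta * p[0]) for p in pairs])

env1, env2 = simulate(1.0), simulate(2.0)

# Regressing Y on its causal parent X: residual variance is invariant (~1 in both).
v_causal = [ols_residual_var([(x, y) for x, y, z in env]) for env in (env1, env2)]
# Regressing Y on its descendant Z: residual variance changes with the environment.
v_noncausal = [ols_residual_var([(z, y) for x, y, z in env]) for env in (env1, env2)]

print(v_causal, v_noncausal)
```

Only the regression on the true causal parent produces residuals whose distribution is stable across the two environments; in this sense the heterogeneity is what makes the causal structure detectable.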

**From invariance to novel robustness: anchor regression.** Strict invariance and causality may be too ambitious in the context of “large-scale” and poorly collected data. But when relaxing to soft invariance, we still obtain interesting robustness results for prediction. The “What if I do” question from causality is related to robustness with respect to a class of new scenarios or perturbations which are not observed in the data. A novel methodology relying on causal regularization, called anchor regression, provides new answers [3].

The lecture will highlight some fundamental connections between invariance, robustness and causality. We will illustrate that the novel insights and methods are useful in a variety of applications involving “large-scale” data.

**Acknowledgment.** Many of the ideas presented in the lecture come from my collaborators Nicolai Meinshausen, Jonas Peters and Dominik Rothenhäusler.

**References**

[1] Haavelmo, T. (1943). The statistical implications of a system of simultaneous equations. *Econometrica*, **11**:1–12.

[2] Peters, J., Bühlmann, P., and Meinshausen, N. (2016). Causal inference using invariant prediction: identification and confidence intervals (with discussion). *J. Royal Statistical Society, Series B*, **78**:947–1012.

[3] Rothenhäusler, D., Bühlmann, P., Meinshausen, N., and Peters, J. (2018). Anchor regression: heterogeneous data meets causality. Preprint arXiv:1801.06229.

[4] Imbens, G. and Rubin, D. (2015). *Causal Inference for Statistics, Social, and Biomedical Sciences.* Cambridge University Press.

[5] Splawa-Neyman, J. ([1923] 1990). On the application of probability theory to agricultural experiments. Essay on principles. Section 9. Translated and edited by D.M. Dabrowska and T.P. Speed from the Polish original, which appeared in *Roczniki Nauk Rolniczyc, Tom X* (1923): 1–51 (*Annals of Agricultural Sciences*). *Statistical Science*, **5**:465–472.

**Anthony Davison** is Professor of Statistics at the Ecole Polytechnique Fédérale de Lausanne (EPFL). Between obtaining his PhD from Imperial College London in 1984 and moving to EPFL in 1996, he worked at the University of Texas at Austin, at Imperial College London and at the University of Oxford. He has published on a variety of topics in statistical theory and methods, including small-sample likelihood inference, bootstrap methods and the statistics of extremes. He is author or co-author of several books.

*He has served the statistical profession as Editor of Biometrika, as Joint Editor of Journal of the Royal Statistical Society, series B, and in various other roles. In 2009 he was made laurea honoris causa in statistical science by the University of Padova. He is a Fellow of IMS and the ASA, and in 2015 received the Royal Statistical Society’s Guy Medal in Silver.*

*Anthony will give this Medallion Lecture at the JSM in Vancouver.*

Statistics of extremes deals with the estimation of the risk of events that have low probabilities but potentially very damaging consequences, such as stock market gyrations, windstorms, flooding and heatwaves. Typically, few events relevant to the phenomenon of interest have ever been observed, and their probabilities must be estimated by extrapolation well outside any existing data, using appropriate probability models and statistical methods. Two broad approaches to the analysis of extremes are the use of block maxima, for example, annual maximum rainfall series; and the use of threshold exceedances, whereby only those observations that exceed some high threshold contribute to tail estimation. Key difficulties are that relevant events are typically scarce, so as much information as possible must be squeezed from those data that are available, and that any models based on limiting arguments are likely to be mis-specified for finite samples. There is an extensive literature on all aspects of this problem, from a wide range of perspectives.
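The two data-extraction approaches can be sketched on simulated data. The series, block length and threshold level below are illustrative choices, not recommendations from the lecture:

```python
import math
import random

random.seed(42)

# Hypothetical daily series: 50 "years" x 365 days of unit-Frechet draws,
# generated by inverse transform of U ~ Uniform(0,1); purely illustrative data.
years, days = 50, 365
series = [[1.0 / (-math.log(random.random())) for _ in range(days)]
          for _ in range(years)]

# Approach 1: block maxima -- one value per year, to be modelled by the
# generalized extreme-value (GEV) family.
block_maxima = [max(year) for year in series]

# Approach 2: threshold exceedances -- keep only the excesses over a high
# threshold u, to be modelled by the generalized Pareto family.
flat = [x for year in series for x in year]
u = sorted(flat)[int(0.99 * len(flat))]          # empirical 99th percentile
excesses = [x - u for x in flat if x > u]

print(len(block_maxima), len(excesses))
```

The trade-off described in the text is visible here: the block-maxima approach discards all but 50 observations, while thresholding retains a few hundred tail values, squeezing more information from the same sample at the price of choosing a threshold.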

In the scalar case, classical arguments suggest that inference for block maxima and exceedances over high thresholds should be based respectively on the generalized extreme-value and generalized Pareto distributions, and these are widely used in applications. Extensions of these arguments to more complex settings suggest that max-stable processes should provide suitable models for maxima of spatial and spatio-temporal data, and that Pareto processes are appropriate for modelling multivariate exceedances, appropriately defined. Although max-stable processes were introduced around 1980, there had been few attempts to fit them to data until recently, due both to a dearth of suitable models and to computational considerations. Such processes are generally specified through their joint distribution functions, leading to a combinatorial explosion when attempting to construct a full likelihood function, so workarounds, such as use of composite likelihoods or other low-dimensional summaries, have been proposed for both parametric and nonparametric inference. These approaches are increasingly being deployed in applications, but they are statistically inefficient, and the resulting loss of precision matters in settings where the final uncertainty is invariably too large for comfort.

A further difficulty is that basing inference on maxima, which typically stem from merging several unrelated occurrences, obscures the detailed structure of individual events. Since these may show typical patterns that are important in risk assessment, attention has recently turned to inference based on multivariate exceedances, which in principle allow more informative modelling to be undertaken. Functionals that determine risks of particular interest are used to select the events most relevant to the estimation of these risks, and the tail probabilities corresponding to such risks are then estimated.

This lecture will survey recent work in the area and then show how detailed modelling for high-dimensional settings can be undertaken using Pareto processes, generalized versions of threshold exceedances, suitably-defined likelihoods and gradient scoring rules.

The work is joint with numerous others, and in particular with Raphaël de Fondeville.

**Anna De Masi** is Professor of Probability and Statistics at the University of L’Aquila in Italy, where she has been coordinator of the PhD program “Mathematics and Models” since 2013. Her research interests cover topics such as the macroscopic behavior of interacting particle systems and phase transition phenomena in equilibrium and non-equilibrium statistical mechanics. She is among the founders of the mathematical analysis of hydrodynamic limits involving stochastic evolutions of particle systems and kinetic limits [see the collaborative surveys “A survey of the hydrodynamical behaviour of many-particle systems,” Studies in Statistical Mechanics, Vol. 11, North-Holland (1984); and “Mathematical methods for hydrodynamical limits,” Lecture Notes in Mathematics 1501, Springer-Verlag (1991)]. With various collaborators, she has analyzed, using probabilistic techniques, phenomena such as separation of phases, spinodal decomposition, and the development and motion of interfaces. Her recent interests focus on boundary-driven systems in the presence of phase transition and their relation with free boundary problems.

*Anna’s Medallion Lecture will be at the Stochastic Processes and their Applications meeting in Gothenburg in June.*

To explain the title, consider a thought experiment where there is a gas in a container with a wall in the middle that keeps the density on the left smaller than that on the right. If we take out the wall, the gas diffuses and at the end the density becomes uniform. However with “astronomically” small probability or after “astronomically” long times we may see again regions with different densities.

A system exhibiting such behaviour is the Ising model with nearest-neighbour ferromagnetic interaction and Kawasaki dynamics. At high temperatures initial inhomogeneities disappear due to the diffusive behaviour of the system [1], and the appearance of inhomogeneities is only due to a large deviation. However, if we lower the temperature we see the opposite: spatially homogeneous initial states evolve into states with regions having different magnetization.

This is not only a mathematical abstraction: consider a binary alloy consisting of atoms of Fe and Cr combined in a face-centred cubic lattice. At *T* = 1200 K the system performs normal diffusion, but at *T* = 800 K the mixture separates into regions, one predominantly Fe and the other Cr. This is widely used in chemical engineering to purify metals: loosely speaking, in the above example we slice the crystal into parts made predominantly of Fe and others predominantly of Cr. Such phenomena go under the name of “uphill diffusion” (see for instance [2]); they occur in more general alloy mixtures and are widely used in industrial applications.

The purpose of my talk is to address these questions in the framework of a rigorous analysis. Something can be done but many intriguing questions remain open, and I will try to focus on them, hoping that the audience will get interested and maybe involved. I will restrict to the Ising model with Kawasaki dynamics, which is a Markov process with nearest-neighbour spin exchanges, so that the canonical Gibbs measure (with n.n. ferromagnetic interaction) is invariant. The evolution takes place in a finite region [0, *N*]^{*d*} ∩ ℤ^{*d*}, with boundary reservoirs that fix the magnetization at values *m*_{−} and *m*_{+} on the two opposite sides.

This is the usual set-up for the Fick law. The question is the sign of the current in the stationary state. In *d* = 1 at infinite temperature the current satisfies the Fick law (it runs opposite to the magnetization gradient) and is therefore negative. In this case we prove a large deviation estimate on the probability that it is instead positive (also when *m*_{±} are slowly varying in time), see [3]. In the case *d* = 2, when the temperature is subcritical (and there is a phase transition), we have observed via computer simulations [4] that the stationary current may become positive, flowing from the reservoir with the smaller magnetization to the one with the larger. We have some theory to explain the phenomenon but few mathematical proofs.

Several other questions will be addressed in my talk.

**References**

[1] S.R.S. Varadhan and H.-T. Yau. Diffusive limit of lattice gas with mixing conditions. *Asian J. Math.* **1** (1997), 623–678.

[2] R. Krishna. Uphill diffusion in multicomponent mixtures. *Chem. Soc. Rev.* **44** (2015), 2812.

[3] A. De Masi and S. Olla. Quasi-static hydrodynamic limit. *J. Stat. Phys.* **161** (2015); and Quasi-static large deviations, in preparation, 2018.

[4] M. Colangeli, C. Giardinà, C. Giberti and C. Vernia. Non-equilibrium 2D Ising model with stationary uphill diffusion. *Phys. Rev. E*, to appear, 2018.

**Svante Janson** is Professor of Mathematics at Uppsala University. He obtained his PhD in Mathematics from Uppsala University in 1977, and has remained there ever since, except for short stays at other places. His thesis and early research were in harmonic analysis and functional analysis, but for a long time his main interest has been probability theory. In particular, he is interested in the study of random combinatorial structures such as random graphs, trees, permutations, and so on, where he generally tries to find interesting limit theorems or limit objects. He also works on Pólya urns and branching processes. He has written three books, and over three hundred mathematical papers. Svante is a member of the Royal Swedish Academy of Sciences, the Royal Society of Sciences at Uppsala, the Swedish Mathematical Society, and the Swedish Statistical Association. He has been awarded the Rudbeck medal by Uppsala University and the Celsius medal by the Royal Society of Sciences at Uppsala.

*This Medallion Lecture will be at the IMS Annual Meeting in Vilnius, Lithuania, on Thursday July 5, 2018.*

Branching processes generate random trees in a natural way, which can be varied by, e.g., conditioning or stopping the branching process. Moreover, many families of random trees that are defined in other ways turn out to be equivalent to random trees defined by branching processes in some way or another. Branching processes have thus become one of the main probabilistic tools for studying random trees.

One central family of random trees is that of conditioned Galton–Watson trees, which come from a Galton–Watson process conditioned to have a given total size. We study the asymptotic behaviour of these random trees: locally close to the root, locally close to a random node, and globally after suitable rescaling.
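As a hedged sketch (my illustration, not code from the references): the simplest way to sample a conditioned Galton–Watson tree is by rejection, growing unconditioned trees and keeping the first whose total progeny equals the target size *n*. The offspring law `p` and the retry cap below are arbitrary illustrative choices.

```python
import random

def sample_offspring(p, rng):
    """Draw from the offspring distribution p = [p_0, p_1, ...]."""
    u = rng.random()
    for k, pk in enumerate(p):
        u -= pk
        if u < 0:
            return k
    return len(p) - 1

def conditioned_gw(p, n, rng=random, max_tries=100000):
    """Rejection sampler for a Galton-Watson tree conditioned on total size n:
    grow trees node by node and keep the first with exactly n nodes.
    Returns the offspring counts in exploration order, or None on failure."""
    for _ in range(max_tries):
        counts, pending, size = [], 1, 0    # start with the root pending
        while pending and size <= n:
            pending -= 1
            size += 1
            k = sample_offspring(p, rng)    # children of the current node
            counts.append(k)
            pending += k
        if size == n and pending == 0:      # accept: exactly n nodes
            return counts
    return None
```

With a critical offspring law (mean one), such as `p = [0.25, 0.5, 0.25]`, the accepted offspring counts always sum to *n* − 1, since a tree with *n* nodes has *n* − 1 edges; rejection is practical only for moderate *n*, which is why the asymptotic theory is needed for large trees.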

Other, quite different, families of random trees can be constructed by stopping a (supercritical) Crump–Mode–Jagers (CMJ) process when it reaches the desired size. We study the local and global behaviour of such trees too.

There are also open problems in this field, and these lead to open problems and conjectures about fluctuations for functionals of CMJ branching processes.

See further, e.g., [1, 2] and the references therein.

**References**

[1] Holmgren, Cecilia and Janson, Svante. Fringe trees, Crump–Mode–Jagers branching processes and *m*-ary search trees. *Probability Surveys* **14** (2017), 53–154.

[2] Janson, Svante. Simply generated trees, conditioned Galton–Watson trees, random allocations and condensation. *Probability Surveys* **9** (2012), 103–252.

**Sonia Petrone** is a Professor of Statistics at Bocconi University, Milan, Italy. Her research is in Bayesian statistics, covering foundational aspects as well as methods and applications. Foundational themes in her work include exchangeability and decisions under risk. Her main methodological interests are in the area of Bayesian nonparametrics, including mixtures and latent variable models, density estimation, nonparametric regression and predictive methods. She has been President of the International Society for Bayesian Analysis (ISBA) and is an elected member of the IMS Council. She has been a co-editor of *Bayesian Analysis* and is currently an associate editor of *Statistical Science*. She is a Fellow of ISBA. Sonia will be giving her Medallion Lecture at the IMS Annual Meeting in Vilnius, on Monday July 2.

Bayesian Statistics has its foundations in the concept of probability as the rule for quantifying uncertainty, and in the consequent solution of the learning process through conditional probabilities. People usually distinguish two main learning goals: in the inferential approach, the focus of learning is on the model’s parameters; in the predictive approach, the focus of learning is on future events, given the past. Predictive learning is central in many applications, and has a foundational appeal: one should express probabilities on observable facts, (non-observable) parameters being just a link in the probabilistic chain that leads from past to future events. Beyond foundations, the focus on prediction is important for fully understanding the implications of modeling assumptions.

Bayesian predictive learning is solved through the conditional distribution *P*_{*n*} of the next observation, given the data observed so far.
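As an illustrative sketch (my example, not from the lecture): the canonical exchangeable predictive rule is the Blackwell–MacQueen urn of the Dirichlet process, where the next observation is a fresh draw from the base measure G0 with probability α/(α + *n*) and repeats a uniformly chosen past observation otherwise. The concentration `alpha` and the Gaussian base measure below are arbitrary choices.

```python
import random

def polya_urn_sample(n, alpha, base_draw, rng=random):
    """Sample X_1, ..., X_n from the Blackwell-MacQueen urn, whose predictive
    distribution is P_k = (alpha * G0 + sum_i delta_{X_i}) / (alpha + k)."""
    xs = []
    for k in range(n):
        if rng.random() < alpha / (alpha + k):
            xs.append(base_draw(rng))    # fresh draw from the base measure G0
        else:
            xs.append(rng.choice(xs))    # reinforce a uniformly chosen past value
    return xs
```

For example, `polya_urn_sample(200, 2.0, lambda r: r.gauss(0.0, 1.0))` produces an exchangeable sequence with repeated values, because past observations are reinforced; the reinforced-process constructions discussed below generalize exactly this kind of predictive rule.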

In more complex problems, stochastic dependence structures beyond exchangeability are needed. Still, forms of symmetry, or partial exchangeability, may hold. Powerful predictive constructions have been proposed for partially and Markov-exchangeable data, and successfully applied in a wide range of fields. A challenge, nowadays, is having partially exchangeable predictive rules that remain computationally tractable for increasingly complex applications. In the information-versus-computation trade-off, possibly sub-optimal but computationally more tractable approximations of exchangeable predictive rules receive renewed interest.

In my lecture, I will present predictive constructions, beyond exchangeability, based on (interacting) stochastic processes with time-dependent or random reinforcement. These processes are not, in general, (partially) exchangeable, but may still have convergent predictive distributions and be asymptotically exchangeable. They may model evolutionary phenomena that deviate from exchangeability but reach an exchangeable steady state. Interestingly, they may also offer a ‘quasi-Bayes’ recursive predictive learning rule that approximates an exchangeable rule and is computationally simpler. I will illustrate this potential in the basic case of nonparametric mixture models.


https://www.niss.org/events/2018-niss-writing-workshop-junior-researchers-jsm

Prakash Chakraborty, Purdue University, USA, will be attending **SPA 2018**, the 40th Conference on Stochastic Processes and Their Applications, in Gothenburg, Sweden, from June 11–15.

Also June 11–15 this year is the **4th Conference of the International Society for Nonparametric Statistics** in Salerno, Italy. Attending this meeting will be Kellie Ottoboni, University of California, Berkeley, USA.

Ewain Gwynne (Massachusetts Institute of Technology, USA) and Ronan Herry (LAMA in Paris-Est and MRU in Luxembourg) will both travel to the **IMS annual meeting in Vilnius, Lithuania** (July 2–6). All the travel award winners will be mentioned in the Presidential Address and Awards Ceremony at this meeting, on the Monday evening, along with the winners of the IMS Travel Awards that were announced in the previous issue.

A further six travel awardees will be using their funds to travel to **JSM in Vancouver**, July 28–August 2. They are: Xiaowu Dai, University of Wisconsin–Madison, USA; Luella Fu, University of Southern California, USA; Chengchun Shi, North Carolina State University, USA; Jingshen Wang, University of Michigan–Ann Arbor, USA; Kashif Yousuf, Columbia University, USA; and Yuanyuan Zhang, University of Manchester, UK.

Finally, a Hannan award will enable Yifan Cui, University of North Carolina at Chapel Hill, USA, to travel to next year’s **ENAR** meeting, which takes place March 24–27, 2019, in Philadelphia, USA.

If you will be attending any of these meetings, please go and introduce yourself to the winners!

**Consider contributing to the award funds**

If you’d like to support the funds for the Hannan Graduate Student Travel Awards or the IMS New Researcher Travel Awards, or any other IMS funds, please take a moment to review the information at http://www.imstat.org/contribute-to-the-ims/

Thank you!

Lawrence David Brown, one of the leading statisticians of our time, passed away peacefully on February 21, 2018, at the age of 77, after a long battle with cancer. We are all deeply saddened by the loss of Larry, a teacher, a mentor, a colleague, and a friend. Larry preserved his unfailing fortitude and good humor to his last day.

Larry was the Miers Busch Professor of Statistics at the Wharton School at the University of Pennsylvania. He published five books and over 170 papers in leading statistics and probability journals. Larry was known for his extensive work in decision theory, especially on the admissibility of estimators of one or more parameters, and its connection to recurrence and partial differential equations. Larry also made fundamental contributions to the theory of nonparametric function estimation, including asymptotic equivalence theory, and minimax and adaptation theory. He also worked on a broad range of other topics including sequential analysis, properties of exponential families, foundations of statistical inference, conditional confidence, interval estimation and Edgeworth expansions, bioequivalence, and analysis of census data and call-center data.

Larry was the recipient of many honors for his profound contributions to the field of statistics. He was President of the Institute of Mathematical Statistics in 1992–1993, co-editor of *The Annals of Statistics* from 1995–1997, and gave the prestigious Wald Memorial Lectures in 1985. In 1993, Purdue University awarded him an honorary D.Sc. degree in recognition of his distinguished achievements. He won the Wilks Memorial Award of the American Statistical Association in 2002 and the C.R. and B. Rao Prize in 2007. He was a member of the US National Academy of Sciences and the American Academy of Arts and Sciences. Larry provided exemplary public service for a number of organizations. He served on several panels and committees of the National Research Council (NRC) and the National Academy of Sciences, in particular over many years for the 2000 and 2010 censuses, and then as Chair of the NRC Committee on National Statistics from 2010 to 2018. He gave testimony concerning the 2000 US Census to the US Senate Governmental Affairs Committee in 1997, and to the US House of Representatives Committee in 1998.

Larry was much loved by his colleagues and his students, many of whom benefited tremendously from his wisdom and kindness. Larry highly valued teaching and mentoring students and new researchers, for whom he always found time and great energy despite his many obligations. He supervised 37 PhD students, many of whom hold leading positions in the United States and abroad. In addition to his own students, Larry also mentored many postdocs and junior faculty. He was the winner of the Provost’s Award for Distinguished Ph.D. Teaching and Mentoring at the University of Pennsylvania in 2011.

Larry was born on December 16, 1940, in Los Angeles, California. His parents moved to Alexandria, VA, during World War II, then returned to California. His father, Louis Brown, was a successful tax lawyer and later a professor of law at the University of Southern California, where he worked tirelessly on behalf of client services and conflict prevention, for which he coined the phrase ‘preventive law.’ His mother, Hermione Kopp Brown, studied law in Virginia and then in Los Angeles and became one of the leading women lawyers in Los Angeles in the field of entertainment law, with an emphasis on estate planning. Larry inherited their dedication to service, their mental acuity and resourcefulness, and their selfless good spirits. Larry graduated from Beverly Hills High School in 1957 and from the California Institute of Technology in 1961, and earned his PhD in mathematics from Cornell University three years later. Initially hired at the University of California, Berkeley, he then taught in the mathematics department at Cornell University from 1966–72 and 1978–94 and in the statistics department at Rutgers University from 1972–78. In 1994, he moved to the Wharton School at the University of Pennsylvania, serving as the Miers Busch Professor of Statistics, and taught his last course there in the fall of 2017.

His passion for his work was matched by his devotion to his family. His wife Linda Zhao survives him, as do their sons Frank and Louie, their daughter Yiwen Zhao, his daughters from his first marriage, Yona Alper and Sarah Ackman, his brothers Marshall and Harold and their wives Jane and Eileen, and 19 grandchildren.

Larry is deeply missed by all of us.

*— Written by James Berger (Duke University), Tony Cai (The Wharton School, University of Pennsylvania), and Iain Johnstone (Stanford University)*


Herman Rubin, Professor of Statistics and Mathematics at Purdue University, passed away in West Lafayette, IN, on April 23, 2018; he was 91. Herman Rubin was among the last remaining great polymaths of the twentieth century. To all who knew him, or had heard about him, Herman was an inexplicable outlier in numerous ways. His unique ability to understand a new problem and arrive at the answer almost instantly baffled even the smartest mathematician. He never forgot a fact, theorem or proof that he had seen. He would solve a complete stranger’s problem without expecting co-authorship or anything else in return. He would fight for someone who opposed him at every step. He would always stand by his principles. Herman Rubin’s death marks the nearing end of a unique era, following the Second World War, that saw the simultaneous emergence of a distinctive group of supremely talented statisticians who would shape the foundations of the subject for decades to come.

Herman Rubin was born October 27, 1926 in Chicago, Illinois. He obtained his PhD in Mathematics from the University of Chicago in 1948, at the age of 21; he was a student of Paul Halmos. He served on the faculties of Stanford University, the University of Oregon, Michigan State University and Purdue University. After a stint at the Cowles Commission, he formed a productive intellectual affinity with Ted Anderson, Charles Stein and Ingram Olkin; at this time, he also became professionally close to David Blackwell and Meyer Girshick. With Ted Anderson, he wrote two phenomenal papers on fundamental multivariate analysis that worked out the fixed-sample as well as asymptotic distribution theory of MLEs in factor analysis models and structural equation models. These results have entered into all standard multivariate analysis and econometrics texts, and have remained there for more than half a century. Herman’s most famous and classic contribution to inference is the widely used and fundamental idea of monotone likelihood ratio families. Anyone who has taken a course on testing hypotheses knows the absolute fundamentality of the idea and of the results in the 1956 paper with Samuel Karlin. It was this work that led to Sam Karlin’s hugely influential TP2 and variation diminishing families, with shadows of Isaac Schoenberg and Bill Studden lurking in the background.

Following this period of a series of fundamental papers in statistics, we see a shift: he makes novel entries into various aspects of probability and asymptotics. With J. Sethuraman, he develops the theory of moderate deviations. With Herman Chernoff, he attacks the (then completely novel) problem of estimating locations of singularities, where the asymptotics are completely new. With Prakasa Rao as his student, he takes up the problem of cube-root asymptotics for monotone densities. With C.R. Rao, he gives the classic Rao–Rubin characterization theorem. And, to many, the jewel in the crown is the invention of the idea of a Stratonovich integral.

Herman really did enjoy particular problems, as long as they were not mundane. Classic examples are his quite shocking papers with Rick Vitale showing that sets of independent events characterize an underlying probability measure, degeneracies aside; the pretty work with Jeesen Chen and Burgess Davis on how non-uniform a uniform sample can look to the eye; the papers with Tom Sellke on roots of smooth characteristic functions; the Bayesian formulation of quality control with Meyer Girshick; the hilariously bizarre, but hard, problem of estimating a rational mean; his papers with Andrew Rukhin on the positive normal mean; the work on the notorious Binomial *N* problem; and Bayesian robustness of frequentist non-parametric tests… among others. Herman never considered who would cite or read a result; if he wanted to solve a problem, he did.

Herman was probably one of the few lifelong Bayesians, but axiomatically so. He really did take most of the Savagian theory and axioms literally; he expanded on them later. An expansion was published in *Statistics and Decisions* (I believe with extremely active help from Jim Berger). He would not budge an inch from his conviction that the loss and the prior are inseparable. He would refuse to discuss what an appropriate loss function is; he would insist you ask the client. He would nevertheless want to see the full risk function of a procedure and would study Bayes through the lens of Bayes risk—and even exclusively Bayes risk, namely the double integral. On asymptotic behaviors of procedures, he did not appear to care for second-order terms; time and again, his concern was only with calculating the limit. A glorious example of this is his work with J. Sethuraman on efficiency defined through Bayes risks; this was so novel that it entered into the classic asymptotic text of Robert Serfling. He came back to it many years later in joint work with Kai-Sheng Song in an *Annals of Statistics* article.

In some ways, Herman was a paradox. He would publicly say only Bayes procedures should be used, yet oppose the use of a single prior with all his teeth. He would be technically interested in the robustness of traditional frequentist procedures, although he would portray them as coming out of wrong formulations. (Well-known examples are his oft-cited papers with Joe Gastwirth on the performance of the *t*-test under dependence.) He did not have a personal desire to burn the midnight oil writing a comprehensive review of some area, but he would be an invaluable asset to anyone who did. An example is his review of infinitely divisible distributions with Arup Bose (and this writer). An all-time classic is his text *Equivalents of the Axiom of Choice*, jointly written with his wife Jean E. Rubin, Professor of Mathematics at Purdue. Jim Berger thanks Herman profusely in the preface of his classic Springer book on decision theory and Bayesian analysis; Charles Stein acknowledges Herman (and Herbert Robbins) in his first shrinkage paper.

There was a fairly long period in the history of statistics when nearly every paper written in his home department bore Herman’s contributions. He never asked for, or received, credit for them. He epitomized the term scholar in its literal sense. With the passing of Herman Rubin, a shining beacon of knowledge and wisdom has gone out. Herman was a consummate master of simulation, characteristic functions, and infinitely divisible distributions. He kept to himself a mountain of facts and results on these and other topics that never saw the light of day. There was never a person who did not respect Herman Rubin’s brain; even Paul Erdős did. Herman never stopped thinking of good problems, and loved discussing them. He personified Albert Einstein’s quote, “Intellectual growth should commence at birth and cease only at death.”

The IMS *Lecture Notes–Monograph Series* published a collection of research articles in Herman Rubin’s honor in 2004; numerous leaders of the profession wrote an original article for this collection. As well as being an IMS and AMS Fellow, he was also a Fellow of the American Association for the Advancement of Science, and a member of Sigma Xi.

Herman had sophisticated taste in music and literature. He was often seen in classical concerts and operas. He helped mathematical causes financially. Herman was probably one of the very few people anywhere who could work out the NY Times crossword puzzle on any day in about an hour.

Herman is survived by his son Arthur, and daughter Leonore. His wife Jean died in 2002.

*— Written by Anirban DasGupta, Purdue University*

For the past 24 months, we have been working towards the establishment of a Peter Hall Prize within the IMS. Our proposal for such a prize has recently been approved by the IMS Council, with the following council resolution: “IMS Council approves the permanent institution of a Peter Hall Prize to be awarded annually in recognition of early career research accomplishments and the promise in statistics, broadly construed.”

It is now time to raise money for the endowment. We hope you will consider making a generous donation to help set up a permanent IMS Prize in Peter Hall’s name. These donations not only memorialize Peter’s research achievements and outstanding service to the profession, but also help recognize early career researchers.

There are three different ways you can send your donation:

1. By credit card, via the IMS website: https://secure.imstat.org/secure/orders/donations.asp

2. By US dollar check, made payable to “IMS – Peter Hall Endowment Fund.” Mail to: *IMS – Peter Hall Endowment Fund, 3163 Somerset Drive, Shaker Heights, OH 44122*.

3. By electronic transfer or wire: please email Elyse Gustafson, erg@imstat.org, for bank transfer details.

For US residents, the donations will be tax-deductible. Per Internal Revenue Code Section 170, a contribution to an IMS Fund is considered a charitable contribution and is tax deductible under US tax laws. We confirm that this donation does not benefit you in any monetary or material fashion.

IMS will invest the fund as it does all of its long-term accounts. It would be helpful if you could let us know the amount of your donation by emailing Jon Wellner, jaw@stat.washington.edu. While this information would assist us in keeping track of progress toward the initial $40,000 goal, providing it is of course completely optional. We look forward to receiving your donations and thank you very much in advance for your support!

Sincerely yours,

*IMS Ad-Hoc Committee for a Peter Hall Award or Prize:*

*Jon Wellner, Richard Davis, Alison Etheridge, Iain Johnstone, and Xiao-Li Meng, with Honorary Member Jeannie Hall*

**Obituaries and websites:**

https://www.science.org.au/fellowship/fellows/biographical-memoirs-1/peter-gavin-hall-1951-2016

http://bulletin.imstat.org/2016/03/obituary-peter-gavin-hall-1951-2016/

http://peterhallmemorial.ucdavis.edu

The lectureship was established in perpetuity in memory of Dr. Myrto Lefkopoulou, a faculty member and graduate of Harvard School of Public Health. Dr. Lefkopoulou died of cancer in 1992 at the age of 34 after a courageous two-year battle. She was deeply beloved by friends, students, and faculty.

Each year the Myrto Lefkopoulou Lectureship is awarded to a promising statistician who has made contributions to either collaborative or methodologic research in the applications of statistical methods to biology or medicine, and/or who has shown excellence in the teaching of biostatistics. Ordinarily, the lectureship is given to a statistician who has earned a doctorate in the last fifteen years. The lecture is presented to a general scientific audience as the first Department colloquium of each academic year. The lectureship includes travel to Boston, a reception following the lecture, and an honorarium of $1000.

Past recipients are listed at https://www.hsph.harvard.edu/biostatistics/myrto-award/

**Nominations**

Please send nominations (which should include a letter of nomination and a CV) via email to sandelma@hsph.harvard.edu or by mail to:

*Myrto Lefkopoulou Committee, Harvard T.H. Chan School of Public Health, Department of Biostatistics, Building 2, 4th Floor, 655 Huntington Avenue, Boston, MA 02115*

All nominations must be received by **June 1, 2018**.

**Seventeenth Annual Janet L. Norwood Award for outstanding achievement by a woman in the statistical sciences**

The Department of Biostatistics and the School of Public Health, University of Alabama at Birmingham (UAB) is pleased to request nominations for the Seventeenth Annual Janet L. Norwood Award for Outstanding Achievement by a Woman in the Statistical Sciences. The award will be conferred on Wednesday, September 19, 2018. The award recipient will be invited to deliver a lecture at the UAB award ceremony, and will receive all expenses, the award, and a $5,000 prize.

Eligible individuals are women who have completed their terminal degree, have made extraordinary contributions and have an outstanding record of service to the statistical sciences, with an emphasis both on their own scholarship and on teaching and leadership of the field in general and of women in particular, and who, if selected, are willing to deliver a lecture at the award ceremony. For additional details about the award, please visit our website at http://www.soph.uab.edu/awards/norwoodaward.

**How to nominate:** Please send a full curriculum vitae accompanied by a letter of not more than two pages in length describing the nature of the candidate’s contributions. Contributions may be in the area of development and evaluation of statistical methods, teaching of statistics, application of statistics, or any other activity that can arguably be said to have advanced the field of statistical science. Self-nominations are acceptable.

Please send nominations to: norwoodawd@uab.edu

Deadline for receipt of nominations is **Friday, June 29, 2018**. Electronic submissions of nominations are encouraged. The winner will be announced by July 25.

Previous recipients, from 2002 onwards, are: Jane F. Gentleman, Nan M. Laird, Alice S. Whittemore, Clarice R. Weinberg, Janet Turk Wittes, Marie Davidian, Xihong Lin, Nancy Geller, Adrienne Cupples, Lynne Billard, Nancy Flournoy, Kathryn Roeder, Judith D. Singer, Judith D. Goldberg, Francesca Dominici, and last year, Sally C. Morton.

The Institute of Mathematical Statistics has selected **Philip A. Ernst** as the winner of this year’s Tweedie New Researcher Award. Dr. Ernst received his PhD in 2014 from the Wharton School of the University of Pennsylvania and is now an Assistant Professor of Statistics at Rice University: http://www.stat.rice.edu/~pe6/. Philip’s research interests include exact distribution theory, stochastic control, optimal stopping, mathematical finance and statistical inference for stochastic processes.

The IMS Travel Awards Committee selected Philip “for his fundamental contributions to exact distribution theory, in particular for his elegant resolution of the Yule’s nonsense correlation problem, and for his development of novel stochastic control techniques for computing the value of insider information in mathematical finance problems.”

Philip Ernst will present the Tweedie New Researcher Invited Lecture at the IMS New Researchers Conference, held this year at Simon Fraser University from July 26–28 (immediately before JSM). Visit http://groups.imstat.org/newresearchers/conferences/nrc.html for more information about the New Researchers Conference.

The other invited speakers at the New Researchers Conference are: IMS President **Alison Etheridge**, Oxford University; IMS President-Elect **Xiao-Li Meng**, Harvard University; **Marc Suchard**, University of California, Los Angeles; **Hongyu Zhao**, Yale University; and **Jennifer Hill**, New York University.

The Tweedie award is named for Richard L. Tweedie (1947–2001), the Australian-born professor of biostatistics and head of the Division of Biostatistics at the University of Minnesota, who mentored many young colleagues at work and through professional society activities.
