Wednesday, December 12, 2012

Sir Anthony Leggett on Majorana Fermions in p_x + ip_y Fermi Superfluids

Professor Sir Tony Leggett, Nobel Laureate, speaks on Majorana fermions in p_x + ip_y Fermi superfluids in his 2012 Lecture Series at the University of Waterloo.

Slides for this lecture series are available at:

Tuesday, July 17, 2012

Upcoming Blog Carnival of Cosmology 27 July 2012 at Galileo's Pendulum

Dr. MR Francis is organizing a Blog Carnival for Cosmology on July 27 2012, with the special topic of Dark Energy. Here I'm blogging all my tweets with the #DarkEnergy hashtag from the past month:

Thursday, July 5, 2012

The Higgs Is Different: It's Spinless!

Yesterday's much-anticipated announcement of the discovery, at CERN, of a particle that is "consistent with the Higgs boson" has generated an unprecedented amount of comment, both technical and popular. Even the most casual observer of the scene over the past year could have anticipated what was coming; indeed, in tweeting a summary of the Phenomenology 2012 Symposium at the University of Pittsburgh last May, even I noted that the CMS and ATLAS experiments at CERN, as well as the CDF experiment at Fermilab, had found a resonance at 125 GeV consistent with the Higgs.

While many post-July 4, 2012 commentators have focused on the role of the Higgs in particle physics, as the short-lived 'heavy' particle (field) that gives other, lighter particles, from quarks to electrons, their masses; while others have noted such things as the difficulty of deciding who, among the six physicists who published papers predicting and/or analyzing the phenomenon of the Higgs, should get the Nobel Prize(s) [since no more than three can share it in any given year]; and while others yet have focused on the analogies in condensed matter physics (with superconductivity and superfluidity), I found precious little commentary on what is truly different about the Higgs: it is a boson, and a scalar boson at that, a spinless (i.e., spin-zero) particle.

Jonathan Bagger's piece in the ILC Newsline, however, lays this out in crystal clear prose:

Every other fundamental particle discovered to date – the quarks, leptons and gauge bosons of the standard model – has spin, an intrinsically quantum mechanical property that determines its fate. The Higgs, however, does not. It is an entirely new form of matter. The spin of the quarks and leptons is ultimately responsible for the structure of matter, including the properties of nuclei and the electronic structures that govern all of chemistry. The spin of the gauge bosons gives rise to the forces of nature, ranging from electricity and magnetism to nuclear reactions and gravity. The Higgs, though, is different; it has no spin. Its spinless state allows it to condense and fill the vacuum, much like steam condenses to form the sea. It is this Higgs condensate that is responsible for mass: particles travelling through the condensate experience a drag that slows their motion and gives them mass. The more the drag, the greater the mass.
But even more importantly, he lays out the possible cosmological consequences of the other Higgs-like particles predicted in supersymmetry and Grand Unified Theories for Dark Matter, and Dark Energy:
Higgs-like particles are ubiquitous in theories of physics that extend beyond the standard model.  They are predicted by supersymmetry and by theories of grand unification.  Their condensates contribute to the dark energy that is accelerating the expansion of the universe, and they determine the geometry of the extra dimensions in string theory.  Higgs-like particles might even be responsible for cosmological inflation, the change in time of dark energy, the missing dark matter, or even the puzzling properties of neutrinos.
The reference to Higgs-like particles must be emphasized: he does not mean the Higgs particle currently under intense discussion (the "Standard Model Higgs", the particle-field responsible for generating the masses of the two W bosons and the Z boson through electroweak symmetry breaking). He refers instead to similar particles, yet to be discovered, that would be responsible for mass generation in supersymmetric and Grand Unified theories of symmetry breaking. The statement that 'Higgs-like particles might even be responsible for cosmological inflation' also needs clarification: shortly after Alan Guth of MIT originally suggested cosmological inflation, the scalar field required for inflation, subsequently called the 'inflaton', was initially identified with the electroweak Higgs field; after further theoretical developments, however, the inflaton is no longer thought to derive solely from the electroweak Higgs. [A related field, the curvaton, has also been proposed: while not driving inflation itself, it creates curvature perturbations in spacetime and thus might play a role in, for example, the formation of structure at cosmological scales. In some theoretical models the curvaton acts independently of the inflaton, for example after the inflaton has decayed, while in others the two fields act jointly in creating the primordial perturbations.] The origin of the field that drives inflation itself, however, remains to be explained. Here the proposal that Higgs-like particles from supersymmetric or Grand Unified symmetry breaking could be responsible for cosmological inflation, and analogously could play a role in explaining Dark Energy (an acceleration in the rate of expansion of the universe similar to inflation, but much smaller in magnitude), can still be considered.
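For concreteness, the mass generation accomplished by the electroweak Higgs mechanism can be stated compactly. These are the standard tree-level relations (added here for reference, not from Bagger's essay), with the Higgs vacuum expectation value v ≈ 246 GeV and electroweak gauge couplings g, g′:

```latex
% Standard tree-level electroweak mass relations (stated here for reference):
% the gauge boson masses follow from the Higgs vacuum expectation value
% v \approx 246~\mathrm{GeV}, and each fermion mass from its Yukawa coupling y_f.
\begin{align}
  m_W &= \tfrac{1}{2}\, g\, v \\
  m_Z &= \tfrac{1}{2}\,\sqrt{g^2 + g'^2}\; v \\
  m_f &= \frac{y_f\, v}{\sqrt{2}}
\end{align}
```

In Bagger's 'condensate drag' picture, the coupling strength plays the role of the drag: the more strongly a particle couples to the Higgs condensate, the larger its mass, while the photon, which does not couple, stays massless.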

With the near-complete convergence today of cosmological and particle-physics questions, any discussion of the significance of the recent confirmation of the (electroweak) Higgs is incomplete without also mentioning the possible significance of generalized Higgs mechanisms and Higgs particle-fields in cosmology, and for this we have Professor Bagger's insightful essay to thank. Prof. Bagger also argues that the proposed electron-positron International Linear Collider (ILC) could become a Higgs factory, producing enough Higgs particles for a full characterization of their properties, and for settling related questions such as whether the Higgs is part of a family of particles, or even whether the current discovery at CERN, while consistent with all the known properties of the electroweak Higgs, might be a Higgs impostor! (One of the minor disappointments of the discovery is that nothing 'new' was found; the particle seems to be a 'mere' confirmation of something predicted nearly half a century ago.)

A few personal notes - (i) Prof. Bagger is a faculty member at Johns Hopkins, my PhD alma mater, though I have known of him from much earlier [during 1985-86, he was a postdoc in the SLAC Theory Group, while I was a graduate student with the Mark II Experiment, and he was already well known then for his work on supersymmetry]. I also vividly remember the seminar he presented when he visited Johns Hopkins before formally joining the Department - back in 1989!
(ii) Peter Ware Higgs was born in the year of my father's birth (and exactly one month later!); and the Higgs mechanism was first proposed in 1962, the year of my own birth, though by Anderson; Higgs himself followed two short years later. [For a long time, the idea was known as the Anderson-Higgs mechanism. The six physicists who published papers on the subject at nearly the same time are Englert, Brout, Higgs, Guralnik, Hagen and Kibble. Nambu, Jona-Lasinio and Goldstone did the fundamental groundwork.]
(iii) I first encountered the Higgs mechanism back in '83; Peter Higgs is 83 now. And on a final personal note, I was named 'Satyen' for Satyen Bose by my mother, herself a scientist. Satyen Bose had been named National Professor in India three years before I was born, and his fame in India at the time, as the discoverer of the Bose quantum statistics applicable to all integer-spin particles, was very high (the generic words 'boson' and 'fermion', however, appear to have first been used by Paul Dirac in 1946). Incidentally, Bose's paper on his calculation of the statistical distribution of energies in the photon gas, translated into German and transmitted to Zeitschrift für Physik by Albert Einstein himself, was received by ZfP on 2 July 1924, almost exactly 88 years ago this day!
Update: The Higgs event detected might be consistent both with minimal (MSSM) and next-to-minimal (NMSSM) supersymmetry.

Update, March 2014, on Prof. Jonathan Bagger: he has just been named the next Director of TRIUMF. The appointment, among other things, signals the very strong interest, both at TRIUMF and at other Canadian particle physics groups, in the International Linear Collider collaboration.

Monday, June 25, 2012

Workshop for New Physics & Astronomy Faculty - June 25-28 2012

The Workshop for New Physics and Astronomy Faculty gets under way this week at the American Center for Physics in College Park, MD. Now in its eleventh year, the Workshop was started by the American Physical Society (APS) and the American Astronomical Society (AAS), in cooperation with the American Association of Physics Teachers (AAPT), in response to the sad reality that beginning faculty often neglect their teaching duties in the quest to establish their research credentials; that this neglect is most likely to occur at the research-intensive universities which also graduate the largest number of physics undergraduates; and that it often translates into significant attrition in physics enrollment levels. The workshop is now funded by the National Science Foundation.

On the first day, the workshop proper gets going with Eric Mazur and Angelica Natera of Harvard University on Peer Instruction, a method of learner-centered teaching with peer-originated, interactive, real-time feedback that is especially suited to large undergraduate classes; Peer Instruction has demonstrated measurably increased learning outcomes in introductory physics in both large and small institutional contexts. The second day brings several sessions on learner-centered teaching, digital libraries, and lecture tutorials, and my old friend and Hopkins Physics co-alum Andy Gavrin of Indiana University - Purdue University Indianapolis speaks on "How to Get Your Students to Prepare for Every Class." The third day includes a variety of sessions: on upper-level physics, on using Physlets (Java applets illustrating physics concepts), on problem solving, and more.

All in all, the beginning physics faculty in attendance, as well as the conference facilitators and lecturers should have an interesting, educational (in all senses) and fun time.

Friday, June 15, 2012

Four Ways of Teaching General Relativity to Undergraduates

As someone who had cracked open his father's copy of The Principle of Relativity (which contained Einstein's original paper on General Relativity) at age 13, presented an undergraduate paper on White Holes in Astrophysics as a freshman at age 17, opened Weinberg's classic tome at age 18, and delivered a graduate-level seminar on the Binary Pulsar 1913+16 at age 20, I read the recent article by Professors Nelson Christensen and Thomas Moore on teaching General Relativity to Undergrads with a great deal of interest. They present four pedagogical approaches to teaching undergrads, and to quote from their article, the four approaches are:
  • The adjusted math-first approach.
  • The calculus-only approach.
  • The physics-first approach.
  • The intertwined + active-learning approach.
When I was an undergrad, there was a severe paucity of books on General Relativity directed at an undergraduate audience. The traditional textbook was Weinberg's Gravitation and Cosmology, and one had to wait through a full three-year sequence in mathematical physics, including tensors and elements of differential geometry, before getting to the course that taught it. Needless to say, for the interested and motivated student, this was rather a long time to wait! Today, textbooks such as Bernard Schutz's A First Course in General Relativity, and others cited by Christensen and Moore, go a long way towards filling this gap; they also add much more material on recent developments such as gravitational wave detection and the gravitational lensing of optical images, which adds considerably to their appeal.

Undergraduate interest in general relativity today comes not only from cosmology but also from topics such as the Global Positioning System (GPS). I have created a blog widget on the left that contains links to presentations and resources on teaching undergraduates general relativity, organized by the American Association of Physics Teachers, which I have found extremely interesting and useful.

Sunday, April 29, 2012

Brookhaven Lab's Paul Sorensen on Recreating the Early Universe at RHIC

Paul Sorensen of the Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC) describes the quark-gluon plasma created in the collisions of gold nuclei accelerated to energies of up to 100 GeV per nucleon at RHIC. To the surprise of the investigators at the RHIC collaborations PHENIX, BRAHMS, STAR and PHOBOS, the quark-gluon plasma behaves more like a liquid than a gas: its constituent quarks and gluons behave not as free particles but interact strongly with each other and, moreover, display very strongly correlated motion, so that the fluid flows with almost no viscosity.
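The near-zero viscosity can be made quantitative. The usual benchmark (added here for reference, not from Sorensen's talk) is the Kovtun-Son-Starinets (KSS) conjectured lower bound on the ratio of shear viscosity η to entropy density s, which the RHIC fluid approaches more closely than any other known fluid:

```latex
% Kovtun-Son-Starinets (KSS) conjectured lower bound on the ratio of
% shear viscosity \eta to entropy density s, from gauge/gravity duality:
\begin{equation}
  \frac{\eta}{s} \;\geq\; \frac{\hbar}{4\pi k_B}
\end{equation}
```

Estimates place the RHIC quark-gluon plasma within a small multiple of this bound, far below everyday fluids such as water; this is the quantitative sense in which it is called a near-perfect liquid.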

The quark-gluon plasma created at RHIC represents a recreation of the conditions of the early universe, and the results of the experiments are being analyzed for answers to questions about how the universe evolved into its present state. For example, RHIC experiments have found evidence of both parity- and CP-violating events in the quark-gluon plasma; CP violation is one channel by which the observed matter-antimatter asymmetry in the universe could have arisen. Had this asymmetry not existed, the entire universe as we know it, with its stars, galaxies and life itself, would not have been possible.

But theoretical analyses also reveal dynamical analogies with the initial evolution of the universe: the decaying inflaton field responsible for cosmological inflation has its analogue in the decaying glasma field, for example. And among other possibilities, the evolution of a nearly inviscid hydrodynamics from the quark-gluon plasma, it is conjectured, may include a transient Bose-Einstein condensate, a phenomenon also known in the context of inflaton dynamics.

Tuesday, January 31, 2012

USNRC and EPRI Announce New Seismic Source Characterization Model for NPPs

The US Nuclear Regulatory Commission (USNRC), together with the Electric Power Research Institute (EPRI) and the US Department of Energy (USDOE), today released details of a new model for calculating the seismic risk for Nuclear Power Plants (NPPs) in the Central & Eastern United States (C&E US). This replaces the EPRI Report NP-4276, Seismic Hazard Methodology for the Central and Eastern United States, of July 1986, and the Lawrence Livermore National Laboratory model, Seismic Hazard Characterization of 69 Nuclear Plant Sites East of the Rocky Mountains (Bernreuter, D.L., et al., 1989, NUREG/CR-5250, Volumes 1–8), and is the result of a four-year joint EPRI-USNRC project to revise the estimates of the ground motion that can be expected at a given NPP location in the C&E US.

Speaking broadly, the new model results in greater ground motion estimates for a given NPP location, with the greatest increases obtained for nuclear power plants in the vicinity of the New Madrid (TN) and Charleston (SC) fault systems, based on a seven-plant sample selected for detailed study by the USNRC. The new, higher ground motion estimates do not by themselves translate into a higher nuclear safety risk for NPPs at those locations: each NPP must recalculate its safety risk based on the details of its own design and plant layout, relative to the enhanced ground motion risk it faces.

The USNRC is asking the NPPs it regulates to re-evaluate their seismic risk based on this new model, and the model will also be used to assess the seismic risk for new nuclear plants in the region during the licensing process. While the new seismic and ground motion risk estimates have been in development for the last several years, the Commission's direction in this regard is also part of the regulatory initiatives responding to the events at the Fukushima nuclear power plant following the Tohoku earthquake and ensuing tsunami of 11 March 2011.

Tuesday, January 24, 2012

World Economic Forum Global Risks Report 2012

The World Economic Forum Annual Meeting begins in Davos-Klosters, Switzerland, tomorrow, January 25 2012. In advance of the meeting, the Forum has published its 7th Annual Global Risks Report, created with its partners - Marsh & McLennan, Swiss Re, the Wharton Center for Risk Management, and Zurich Financial Services.

A most interesting read, the report develops 5 major global risk categories – Economic, Environmental, Geopolitical, Societal and Technological, and reports results of a broad survey of risk perceptions among representatives from 5 broad categories of Stakeholder Groups – Business, Academia, NGO, Government, and International Organization. Within the 5 risk categories are a total of 50 risk scenarios, roughly 10 in each risk category, each differing in likelihood and impact – e.g., in the Economic risk category: Chronic Fiscal Imbalances is considered to have the highest likelihood-impact combination; while Major Financial Systemic Failure is considered less likely, but considered to have the highest impact. Unmanageable Inflation or Deflation is considered least likely, while Unforeseen Negative Consequences of Regulations is considered to have the least impact.
The report proceeds to define, in each of the 5 Risk Categories, a Center of Gravity (CoG) – as the risk scenario with the highest (judgment-weighted) combination of likelihood and impact. Thus, the Economic Risk CoG is Chronic Fiscal Imbalances, while the Environmental, Geopolitical, Societal and Technological CoGs are respectively Rising Greenhouse Gas Emissions, Global Governance Failure, Unsustainable Population Growth and Critical Infrastructural Systems Failure. The survey also included a feature where respondents could write in ‘X-factors’ – risk scenarios that had unknown likelihood and impact, but which were nevertheless felt important enough to be thought about. This resulted in risk scenarios such as Volcanic Winter, Mega-accidents, and Neotribalism.
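As a sketch of the selection logic just described: within each risk category, the CoG is the scenario with the highest combined likelihood-impact score. The scenario names below echo the report, but the numerical scores are invented for illustration, and the simple product weighting is an assumption; the report's actual judgment-weighting scheme may differ.

```python
# Hypothetical sketch of Center-of-Gravity (CoG) selection per risk category.
# Scores are (likelihood, impact) pairs on an arbitrary 1-5 scale -- invented
# for illustration, not taken from the WEF survey data.

risks = {
    "Economic": {
        "Chronic Fiscal Imbalances": (4.0, 4.0),
        "Major Financial Systemic Failure": (3.2, 4.4),
        "Unmanageable Inflation or Deflation": (2.5, 3.0),
    },
    "Environmental": {
        "Rising Greenhouse Gas Emissions": (4.1, 3.8),
        "Volcanic Winter": (1.2, 3.5),
    },
}

def centers_of_gravity(risks):
    # One simple weighting choice: combined score = likelihood * impact;
    # the CoG is the scenario maximizing that score within its category.
    return {
        category: max(scenarios, key=lambda s: scenarios[s][0] * scenarios[s][1])
        for category, scenarios in risks.items()
    }
```

With the invented scores above, this picks Chronic Fiscal Imbalances and Rising Greenhouse Gas Emissions as the Economic and Environmental CoGs, matching the report's actual conclusions for those two categories.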
The report develops a series of risk constellations, where the cascading effect of different consequential risk scenarios across the 5 categories is explored. Three major cases are examined – a socio-economic dystopia, a governance dystopia and a technological dystopia, in each case setting out the different combinations of the 50 risk scenarios which could lead to each. The concept of 'critical connectors' is elucidated as the set of risk scenarios which link to the CoG of more than one risk category. Four critical connectors, all of them Economic risk scenarios, link 3 or more of the 5 CoGs.
Finally, the report presents detailed data and analyses of the survey itself, and I found the data on the differential risk perceptions across the 5 Stakeholder groups, as well as across geographic affiliations, particularly interesting. For example, on geographic variation, Europeans rank Chronic Fiscal Imbalances as more likely than do Middle Easterners and North Africans, while Asians see Unmanageable Inflation or Deflation as more likely than do either Europeans or North Americans. Across stakeholder classes, Business saw the likelihood of Negative Consequences of Regulation as higher than did Academia, while NGOs saw Negative Consequences of Nanotechnology as more likely than did Academia. Even more interestingly, subject matter experts (across the 5 stakeholder classes) ranked the likelihood of the scenarios within their area of expertise higher than generalists did, across the board (with the exception of nanotechnology, where generalists ranked the likelihood of unforeseen negative consequences higher than did the subject matter experts). This is very interesting: on macroeconomic, socio-economic or environmental issues, where the risk is more easily grasped by generalists, the general level of concern appears lower than might be warranted, while on 'esoteric' technological issues, which by their nature are harder to grasp properly, the general level of concern appears higher than might be warranted strictly on an existing-knowledge basis.
I have sketched here a rather broad summary of the report, but it is well worth a detailed read. In addition to the themes I have outlined, the report also contains a Special Section on the Great East Japan Earthquake of 11 March 2011 (the Tohoku quake). In the video clip below, David Cole, Chief Risk Officer at Swiss Re, talks about the WEF's Global Risks Report 2012. He points out that risk assessments conducted by governments and companies in the past have been inadequate, subjecting nations to extreme economic risks, and he urges that a Country Risk Officer be appointed for each country, who would aggregate and prioritize different kinds of risks and bring them to the attention of policymakers. In another clip within the same playlist, Axel Lehmann, Chief Risk Officer of Zurich Financial Services, emphasizes that no single individual, company, or even government can fully appreciate all aspects of the risks involved, and urges the formation, on as many levels as possible, of public-private partnerships for risk identification, analysis and mitigation. Erwann Michel-Kerjan, Director of the Wharton Risk Management Center, points out in another clip that high-level decision makers must become familiar with all kinds of risks, not only the ones that their training or background predisposes them to consider. He also emphasizes that the other side of risk is always opportunity, and that the winners are those who not only protect themselves from the negative consequences of risk events but positively profit from them:

Saturday, January 21, 2012

Next Steps in Seismic Hazard / Earthquake Loss Assessment Models

Just as the events at the Fukushima nuclear plant following the Tohoku earthquake of 11 March 2011 pointed to new directions in nuclear plant safety assessment (see my earlier blogpost), so also the property/casualty losses following the quake point to logical next steps in earthquake CAT loss models.

The recent Swiss Re report, Lessons from Recent Major Earthquakes, highlighted a number of ways that CAT models could improve their loss estimates for portfolios insured against earthquakes. Emphasizing first of all that 2011 set records both for total economic losses from earthquakes ($226 B) and for insured claims ($47 B), it underlined that the Tohoku earthquake of 11 March 2011, with insured claims of $35 B, was the most expensive natural CAT of all time, not just among earthquakes.

Next, the report turned to perceived inadequacies in current generation CAT loss estimation models. The Swiss Re report pointed out that while most CAT models used by property/casualty underwriters appeared to have adequately modeled property losses following from ground shaking alone, they typically underestimated (if they modeled them at all) the losses resulting from secondary loss agents – (i) the tsunami(s) following, (ii) the seismic aftershocks, (iii) soil liquefaction (iv) business interruption (BI) and (v) contingent business interruption (CBI). Losses due to fires following earthquakes, another secondary loss agent, however, appear to be well modeled.

Tsunamis
Where CAT modelers had considered tsunamis following quakes, the height, the consequent inland penetration, and the damaging force of the tsunami were all underestimated. This was true both for the Tohoku quake in Japan and for the recent earthquakes in Chile.

Seismic Aftershocks
A major seismic event is often followed by aftershocks for a considerable period afterward. In some cases, a single aftershock can be more damaging than the original event; and very often the cumulative impact of the aftershocks is greater than that of the original event. Such cumulative effects and the clustering of smaller magnitude events following the original quake are important contributions to total losses, and need to be modeled more carefully.
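A standard empirical starting point for modeling the temporal clustering described above is the modified Omori law for aftershock rate decay. The sketch below uses illustrative, not fitted, parameter values, and numerically integrates the rate to show how the expected aftershock count accumulates over time:

```python
# Modified Omori law for aftershock rate decay: n(t) = K / (t + c)**p,
# with t in days after the mainshock. K, c, p below are illustrative
# placeholders, not values fitted to any real aftershock sequence.

def omori_rate(t, K=100.0, c=0.5, p=1.1):
    """Aftershock rate (events/day) t days after the mainshock."""
    return K / (t + c) ** p

def expected_aftershocks(days, K=100.0, c=0.5, p=1.1, steps=10000):
    """Expected number of aftershocks in the first `days` days,
    obtained by trapezoid-rule integration of the Omori rate."""
    dt = days / steps
    total = 0.0
    for i in range(steps):
        t0, t1 = i * dt, (i + 1) * dt
        total += 0.5 * (omori_rate(t0, K, c, p) + omori_rate(t1, K, c, p)) * dt
    return total
```

Because the rate decays slowly (roughly as 1/t), the expected count keeps growing for weeks to months after the mainshock, which is why cumulative aftershock losses can rival or exceed the mainshock loss in a CAT model.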

Soil Liquefaction
This is a phenomenon in which, after an earthquake, the soil loses its normal resistance to plastic deformation and begins to flow like a fluid with a temporally and spatially variable viscosity. It was observed both in the aftermath of the Tohoku quake and in the recent Christchurch quakes in New Zealand, although in the Tohoku quake the tsunami damage far overwhelmed the damage from soil liquefaction. As a secondary loss agent, soil liquefaction affects total property replacement costs in several ways: through damage from subterranean flooding, through the costs of land restoration, and, in the case of large structures, through damage from differential settlement (caused by spatial viscosity variations in the liquefied soil). In geospatial modeling of liquefaction potential, important risk factors include a shallow ground water table and properties built on reclaimed land, near a river bank, or on poorly consolidated sandy soils, which are most prone to liquefaction. Many of these risk factors are present in precisely the urban areas where large commercial properties are usually built.

Business Interruption (BI) and Contingent Business Interruption (CBI)
BI losses are usually underestimated by models because the models underestimate the time period over which production facilities can remain damaged; CBI losses are usually underestimated because the models capture supply chain dependencies, location, and other geographic factors insufficiently well.

These considerations point to logical next steps for earthquake CAT loss estimation model developers to undertake as improvements to their models. While the Tohoku earthquake and the tsunami that followed was the most devastating CAT in history, it is worth remarking that, from the modelers' perspective, it was also geophysically the best-recorded CAT of all time. Following the devastating Kobe quake of 1995, a large, dense, high-bandwidth, high-connectivity and high-sensitivity network of ground motion sensors was set up. This network spanned the area impacted by the Tohoku quake and tsunami, both on land and in the ocean, generating a significant amount of data on the spatial distribution and magnitude of the ground shaking intensity following the quake, both above ground and on the ocean bed. In addition, following the Sumatra-Andaman earthquake-tsunami of 2004, a network of tsunami sensors and deep-water pressure gauges was also set up. The result is a rich dataset that modelers can now use to calibrate their ground motion loss estimation modules, and the correlation of earthquake moment magnitude with the size of the tsunami it can generate. On a larger scale, however, the lesson of the Tohoku earthquake-tsunami is likely still to be that the historical record of tsunamis following earthquakes is as yet too sparse to permit confidence about the correlation between seismic moment magnitude and temporal return periods. Nevertheless, CAT modelers can still proceed to remove the underestimation bias in the loss potential from secondary loss agents that was seen following recent earthquakes.
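As a toy illustration of what calibrating a magnitude-tsunami relation might look like: the event pairs below are invented placeholders, not real data, and the log-linear functional form is assumed purely for the sketch, not claimed to be what any CAT vendor uses.

```python
# Hypothetical calibration sketch: ordinary least squares fit of
# log10(max tsunami run-up height) against earthquake moment magnitude,
# log10(H) = a * Mw + b. All (Mw, height) pairs are invented placeholders.
import math

events = [(7.5, 0.8), (8.0, 2.5), (8.5, 6.0), (9.0, 15.0)]  # (Mw, run-up in m)

def fit_log_linear(events):
    """Least-squares fit of log10(height) vs. moment magnitude Mw."""
    xs = [mw for mw, _ in events]
    ys = [math.log10(h) for _, h in events]
    n = len(events)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    a = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    b = ybar - a * xbar
    return a, b

def predict_height(mw, a, b):
    """Predicted run-up height (m) for a given moment magnitude."""
    return 10 ** (a * mw + b)
```

With real events, the scatter about such a fit, and its sensitivity to adding or removing single events, would itself quantify the sparseness concern raised above.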

Friday, January 20, 2012

USDOE Issues Funding Opportunity Announcement for SMRs

The US Department of Energy (USDOE) issued a draft Funding Opportunity Announcement (FOA) this week, intended to support activities related to the design and licensing of Small Modular Reactors (SMRs): reactors with an electric output of 300 MWe or less that can be manufactured off-site, transported to the point of construction, and assembled on-site with work largely limited to the system integration of components for operation.
Importantly, the USDOE is interested in designs with passive safety (e.g., against consequences of a nuclear accident) as well as inherent safety (e.g., against natural catastrophes such as earthquakes, windstorms or floods), in addition to designs with long inter-refueling periods, low capital cost outlays, low maintenance and operating costs, and high proliferation resistance. The stated intention is to support up to 2 reactor designs through the USNRC design and licensing process, with the ability to be deployed ‘expeditiously’ being an important merit criterion. 2022, a decade from now, is the target year for commercial operation.

Proponents may choose to pursue licensing from the USNRC under either 10 CFR 50 or 10 CFR 52. Stakeholders are encouraged to form consortia, and proponents are encouraged to form design-centered working groups (DCWGs) across the supply and value chains (e.g., SMR manufacturers, power utilities, local bodies); the activities funded by USDOE are required to draw at least 50% of their total required resources from the proponents' own sources. The total amount of funding available from USDOE is estimated at $452 M, subject to Congressional appropriations. The current draft FOA will be issued in final form after feedback from, and consultation with, stakeholders.