Lecture 5: Dark Energy – Is it Einstein’s cosmological constant after all?
Professor Ofer Lahav
The talk reviewed the history of Dark Energy (tracing it back to Newton!) and presented results from new surveys, including the “Dark Energy Survey”.
Professor Lahav discussed whether Dark Energy is just Einstein’s Cosmological Constant, and the implications for fundamental physics. He also illustrated how optical imaging surveys designed for cosmological studies turned out to be useful for follow-ups of Gravitational Wave events.
My notes from the lecture (if they don’t make sense then it is entirely my fault)
Using images from NASA’s orbiting Hubble Space Telescope, an international team of astronomers has deduced in unprecedented detail the distributions of dark matter within three clusters of galaxies.
Dark matter map of KiDS survey region (region G12); Credit: KiDS survey
Improving Weak Lensing Mass Map Reconstructions using Gaussian and Sparsity Priors: Application to DES SV
Mass map reconstruction from weak gravitational lensing recovers the underlying matter distribution in the Universe from measurements of galaxy shapes. Images of distant galaxies are deformed by the inhomogeneous matter distribution along the line of sight. Any matter can contribute to the lensing effect, making it a direct probe of non-visible dark matter.
What accelerates the Universe?
A simple but strange Universe
The weak-field limit of Einstein’s General Relativity: a = -GM/r^2 + (Λ/3) r
Newton had already mentioned the possibility of a force term linear in distance.
Einstein, in 1917, modified Newton’s idea.
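As a rough back-of-the-envelope illustration (my own numbers, not from the lecture), one can ask where the Λ term in the weak-field acceleration balances the Newtonian attraction. Restoring factors of c, setting -GM/r^2 + (Λc^2/3)r = 0 gives r = (3GM/Λc^2)^(1/3):

```python
# Illustrative sketch: radius at which the repulsive Lambda term equals
# the Newtonian attraction in the weak-field acceleration
#   a(r) = -G*M/r**2 + (Lambda * c**2 / 3) * r   (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
LAMBDA = 1.1e-52     # cosmological constant, m^-2 (approximate observed value)
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # metres per megaparsec

def balance_radius_m(mass_kg):
    """Solve -G*M/r^2 + (Lambda*c^2/3)*r = 0 for r."""
    return (3 * G * mass_kg / (LAMBDA * C**2)) ** (1 / 3)

# For a Milky-Way-scale mass (~1e12 solar masses) the balance radius comes
# out at roughly a megaparsec, far outside the galaxy: Lambda only matters
# on cosmological scales.
print(f"{balance_radius_m(1e12 * M_SUN) / MPC:.2f} Mpc")
```

The point of the sketch is that Λ is utterly negligible on solar-system and galactic scales, which is why it took cosmological observations to detect it.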
In cosmology, the cosmological constant (usually denoted by the Greek capital letter lambda: Λ) is the energy density of space, or vacuum energy, that arises in Albert Einstein’s field equations of general relativity. It is closely associated with the concepts of dark energy and quintessence.
Einstein originally introduced the concept in 1917 to counterbalance the effects of gravity and achieve a static universe, a notion which was the accepted view at the time. Einstein abandoned the concept in 1931 after Hubble’s discovery of the expanding universe.
Newton’s law of gravitation could be rewritten, and Einstein noted that his proposed modification of Newton’s law of gravitation allows for an infinite space filled with a uniform distribution of matter: ∇²Φ − λΦ = 4πκρ
Φ is the gravitational potential, ρ is the density of matter in a volume V, λ denotes a universal constant, κ is the Einstein constant and ∇² is the Laplacian operator
Thus, a simple modification of Newton’s law of gravitation has overcome the problem of the equilibrium of matter in an infinite, static universe.
Modified GR for a static Universe
Probes of dark energy include standard candles, standard rulers, clusters and gravitational lensing
To find distances in space, astronomers use objects called “standard candles.” Standard candles are objects that give a certain, known amount of light. Because astronomers know how bright these objects truly are, they can measure their distance from us by analysing how dim they appear.
For short distances in space — within our galaxy or within our local group of nearby galaxies — astronomers use a type of star called a Cepheid variable as a standard candle. These young stars pulse with a brightness that tightly relates to the time between pulses. By observing the way the star pulses, astronomers can calculate its actual brightness.
But beyond the local group of galaxies, telescopes can’t make out individual stars. They can only discern large groups of stars. To measure distances to far-flung galaxies, therefore, astronomers need to find incredibly bright objects.
So astronomers turn to exploding stars, called supernovae. Supernovae, which occur within a galaxy about every 100 years, are among the brightest events in the sky. When a star explodes, it releases so much energy that it can briefly outshine all the stars in its galaxy. In fact, we can sometimes see a supernova occur even if we can’t see its home galaxy.
To determine distances, astronomers use Type Ia supernovae. A Type Ia supernova occurs in a binary system (two stars orbiting one another) in which one of the stars must be a white dwarf, the dense carbon-oxygen remnant of a star that was about the size of our Sun. The other can be a giant star or even a smaller white dwarf.
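The inverse-square dimming that underlies the standard-candle method can be sketched with the distance modulus m − M = 5 log10(d / 10 pc); the numbers below are illustrative (M ≈ −19.3 is roughly the peak absolute magnitude of a Type Ia supernova):

```python
# Standard-candle sketch: a known absolute magnitude M plus a measured
# apparent magnitude m gives the distance via m - M = 5*log10(d / 10 pc).
def distance_pc(apparent_mag, absolute_mag):
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# A Type Ia supernova (peak M ~ -19.3) observed at apparent magnitude 24
# lies at a luminosity distance of roughly 4.6 Gpc.
d = distance_pc(24.0, -19.3)
print(f"{d / 1e9:.1f} Gpc")
```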
To examine the way the universe behaved in the past, astronomers look at extremely distant objects, such as supernovae in galaxies billions of light-years away.
In the 1920s, astronomer Edwin Hubble used a Cepheid variable as a “standard candle” to measure distances to other galaxies, and discovered that the universe was expanding. The idea of the expanding universe revolutionised astronomy. If the universe was expanding, it must at one time have been smaller. That concept led to the Big Bang theory that the universe began as a tiny point that suddenly and swiftly expanded to create everything we know today.
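For nearby galaxies (z much less than 1) Hubble’s discovery reduces to the linear law v = H0·d, so a measured redshift gives a distance; a minimal sketch with an illustrative H0:

```python
# Hubble's law v = H0 * d; for z << 1 the recession velocity is v ~ c*z.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # km/s/Mpc, illustrative value

def distance_mpc(z):
    """Distance implied by Hubble's law for a small redshift z."""
    return C_KM_S * z / H0

print(f"{distance_mpc(0.01):.0f} Mpc")  # z = 0.01 -> about 43 Mpc
```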
Once Einstein knew the universe was expanding, he discarded the cosmological constant as an unnecessary fudge factor. He later called it the “biggest blunder of his life,” according to his fellow physicist George Gamow.
Today astronomers refer to one theory of dark energy as Einstein’s cosmological constant. The theory says that dark energy has been steady and constant throughout time and will remain that way.
A second theory, called quintessence, says that dark energy is a new force and will eventually fade away just as it arose.
If the cosmological constant is correct, Einstein will once again have been proven right — about something even he thought was a mistake.
Albert Einstein (14 March 1879 – 18 April 1955) was a German-born theoretical physicist who developed the theory of relativity, one of the two pillars of modern physics (alongside quantum mechanics).
Edwin Powell Hubble (November 20, 1889 – September 28, 1953) was an American astronomer. He played a crucial role in establishing the fields of extragalactic astronomy and observational cosmology and is regarded as one of the most important astronomers of all time.
To measure the expansion of the Universe, cosmologists utilize standardized reference objects. One such standard reference is standard rulers. These are objects or features where the actual size of all objects of the same type is the same. By comparing this size to the apparent size of the objects in the sky, we get a measure of how far away the objects are from us. Combined with an estimate of the relative size of the Universe at the time the object sent the light out, we can then map the expansion history of the Universe.
The most prominent use of standard rulers today are the “baryon acoustic oscillations”, overdensities in the distribution of matter in the Universe which occur at regular intervals and therefore provide a standard ruler. These were first measured in 2005 when the Sloan Digital Sky Survey published results from its large survey of galaxies, showing that galaxies are, on average, preferentially found at a distance of about 500 million light years from each other. This standard ruler has its origin in the quantum fluctuations also causing the cosmic microwave background anisotropy, for which the typical fluctuation size (the “acoustic scale”) constitutes another standard ruler.
In cosmology, baryon acoustic oscillations (BAO) are fluctuations in the density of the visible baryonic matter (normal matter) of the universe, caused by acoustic density waves in the primordial plasma of the early universe. In the same way that supernovae provide a “standard candle” for astronomical observations, BAO matter clustering provides a “standard ruler” for length scale in cosmology. The length of this standard ruler is given by the maximum distance the acoustic waves could travel in the primordial plasma before the plasma cooled to the point where it became neutral atoms (the epoch of recombination), which stopped the expansion of the plasma density waves, “freezing” them into place. The length of this standard ruler (~490 million light years in today’s universe) can be measured by looking at the large scale structure of matter using astronomical surveys. BAO measurements help cosmologists understand more about the nature of dark energy (which causes the acceleration of the expansion of the universe) by constraining cosmological parameters.
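The standard-ruler logic can be sketched numerically: in a flat LCDM model the comoving distance is D(z) = (c/H0) ∫ dz′/√(Ωm(1+z′)³ + ΩΛ), and the angle the ~150 Mpc (~490 Mly) BAO ruler subtends at redshift z is roughly r_BAO/D(z). The parameters below are illustrative:

```python
import math

# Sketch: angular size of the BAO standard ruler in flat LCDM
# (illustrative parameters, simple midpoint integration).
C = 299_792.458            # km/s
H0, OM, OL = 70.0, 0.3, 0.7
R_BAO = 150.0              # comoving BAO scale in Mpc (~490 Mly)

def comoving_distance_mpc(z, steps=10_000):
    dz = z / steps
    integral = sum(dz / math.sqrt(OM * (1 + (i + 0.5) * dz) ** 3 + OL)
                   for i in range(steps))
    return (C / H0) * integral

z = 0.57   # a typical BOSS galaxy redshift
theta_deg = math.degrees(R_BAO / comoving_distance_mpc(z))
print(f"BAO ruler subtends ~{theta_deg:.1f} degrees at z = {z}")
```

Measuring this characteristic angle at many redshifts maps out D(z), and hence the expansion history.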
The amount of matter in the Universe, which is dominated by the unseen substance called dark matter, and the properties of dark energy (what astronomers call cosmological parameters) affect the rate of expansion of the Universe and, therefore, how the distances to objects change with time. If the cosmological parameters used are incorrect and a cluster is inferred to be receding faster than it really is, the cluster will appear larger and fainter than theory predicts, a consequence of this “Russian doll” property; if it is inferred to be receding more slowly, the cluster will appear smaller and brighter.
Latest results confirm earlier studies that the amount of dark energy has not changed over billions of years. They also support the idea that dark energy is best explained by the “cosmological constant,” which Einstein first proposed and is equivalent to the energy of empty space.
The galaxy clusters in a large sample ranged in distance from about 760 million to 8.7 billion light years from Earth, providing astronomers with information about the era in which dark energy caused the once-decelerating expansion of the Universe to begin accelerating.
A gravitational lens is a distribution of matter (such as a cluster of galaxies) between a distant light source and an observer that is capable of bending the light from the source as the light travels towards the observer. This effect is known as gravitational lensing, and the amount of bending is one of the predictions of Albert Einstein’s general theory of relativity. (Classical physics also predicts the bending of light, but only half that predicted by general relativity.)
Weak lensing occurs when the light from a distant galaxy passes a good distance from a massive galaxy, galaxy cluster, or dark-matter concentration, or closer to less-massive objects. It produces a slight distortion in the shape of a distant galaxy. The effect is so subtle that you can’t notice a difference just by looking at the galaxy. Instead, astronomers must analyse the shapes of millions of galaxies to search for patterns. These patterns will allow them to produce three-dimensional maps of the distribution of matter throughout the universe.
These maps will clearly show the distribution of dark matter. But they also will help scientists understand the nature of dark energy.
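A toy numerical illustration of why millions of galaxy shapes are needed (synthetic data, not the DES pipeline): each galaxy’s intrinsic ellipticity is large and random, but a coherent lensing shear adds the same small offset to all of them, so it survives averaging:

```python
import random

# Toy weak-lensing sketch: recover a tiny coherent shear buried in
# large random intrinsic galaxy shapes by averaging many measurements.
random.seed(42)
TRUE_SHEAR = 0.02      # the small coherent distortion to recover
SHAPE_NOISE = 0.3      # typical intrinsic ellipticity scatter
N_GALAXIES = 200_000

measured = [random.gauss(0.0, SHAPE_NOISE) + TRUE_SHEAR
            for _ in range(N_GALAXIES)]
estimate = sum(measured) / len(measured)

# The statistical error shrinks as 1/sqrt(N), so with 2e5 galaxies the
# estimate lands close to the input shear of 0.02.
print(f"recovered shear ~ {estimate:.3f}")
```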
The Dark Energy Survey (DES) is a visible and near-infrared survey that aims to probe the dynamics of the expansion of the Universe and the growth of large-scale structure. The collaboration is composed of research institutions and universities from the United States, Brazil, the United Kingdom, Germany, Spain, and Switzerland.
The survey uses the 4-meter Victor M. Blanco Telescope located at Cerro Tololo Inter-American Observatory (CTIO) in Chile, outfitted with the Dark Energy Camera (DECam). This camera allows for more sensitive images in the red part of the visible spectrum and in the near infrared, in comparison to previous instruments.
DECam has one of the widest fields of view (2.2-degree diameter) available for ground-based optical and infrared imaging. The survey will image 5,000 square degrees of the southern sky in a footprint that overlaps with the South Pole Telescope and Stripe 82 (in large part avoiding the Milky Way). The survey will take five years to complete, and the survey footprint will nominally be covered ten times in five photometric bands (g, r, i, z, and Y). DES officially began in August 2013 and completed its second season in February 2015.
There have been over 160 DES papers published. Over 400 scientists from 7 countries work on the project.
The Baryon Oscillation Spectroscopic Survey (BOSS) will map out the baryon acoustic oscillation (BAO) signature with unprecedented accuracy and greatly improve the constraints on the acceleration of the expansion rate of the Universe. The BOSS survey will use all of the dark and grey time at the Apache Point Observatory (APO) 2.5-m telescope for five years from 2009-2014, as part of Sloan Digital Sky Survey III.
The Mid-Scale Dark Energy Spectroscopic Instrument (DESI) is a new instrument for conducting a spectrographic survey of distant galaxies. Its main components are a focal plane containing 5000 fibre-positioning robots, and a bank of spectrographs which are fed by the fibres. The new instrument will enable an experiment to probe the expansion history of the Universe and the mysterious physics of dark energy.
The instrument is funded by the U.S. Department of Energy Office of Science and currently under construction. It will be located at 6880 ft. on the Mayall Telescope on top of Kitt Peak in the Sonoran Desert, 55 miles from Tucson, Arizona.
One of the six lenses to be used was made at UCL
The Hobby–Eberly Telescope (HET) is a 10-meter (30-foot) aperture telescope located at the McDonald Observatory. It is one of the largest optical telescopes in the world and combines a number of features that differentiate it from most telescope designs, resulting in greatly lowered construction costs.
The telescope was upgraded for use in the Hobby–Eberly Telescope Dark Energy Experiment (HETDEX), which will provide the first observations to allow narrowing of the list of possible explanations for dark energy.
Using the powerful Japanese Subaru telescope, the Hyper Suprime-Cam (HSC) survey collaboration team has made and analysed the deepest wide field map of the three-dimensional distribution of matter in the Universe.
Subaru PFS will conduct a cosmological survey over 1400 square degrees of the sky and measure the distribution of galaxies within an unprecedentedly large volume of 9 (Gpc/h)³. The cosmology survey will enable us to measure the Hubble expansion rate of the universe and the dark energy density precisely.
eBOSS will use the SDSS telescope/spectrograph to extend BAO measurements to z > 0.6.
eBOSS will precisely measure the expansion history of the Universe throughout eighty percent of cosmic history, back to when the Universe was less than three billion years old, and improve constraints on the nature of dark energy. “Dark energy” refers to the observed phenomenon that the expansion of the Universe is currently accelerating, which is one of the most mysterious experimental results in modern physics.
Euclid is a visible to near-infrared space telescope currently under development by the European Space Agency (ESA) and the Euclid Consortium. The objective of the Euclid mission is to better understand dark energy and dark matter by accurately measuring the acceleration of the universe.
The Large Synoptic Survey Telescope (LSST) is a wide-field survey reflecting telescope with an 8.4-meter primary mirror, currently under construction, that will photograph the entire available sky every few nights.
LSST evolved from the earlier concept of the Dark Matter Telescope, mentioned as early as 1996.
It will contribute to the study of the structure of the universe by observing thousands of supernovae, both nearby and at large redshift, and by measuring the distribution of dark matter through gravitational lensing.
The Wide Field Infrared Survey Telescope (WFIRST) is a NASA infrared space observatory currently under development. WFIRST was recommended in 2010 by United States National Research Council Decadal Survey committee as the top priority for the next decade of astronomy. On February 17, 2016, WFIRST was approved for development and launch.
Dark Energy Survey Year 1 Results: Cross-correlation between DES Y1 galaxy weak lensing and SPT+Planck CMB weak lensing https://arxiv.org/pdf/1810.02441.pdf
Constraints on a measure of the clumpiness of the matter distribution (S8) and the fractional density of the Universe in matter (Ωm) from the three combined DES Y1 measurements (blue), Planck CMB measurements (green), and their combination (red).
No evidence for dynamical dark energy in two models https://arxiv.org/pdf/1709.01074.pdf
Age of the Universe https://lambda.gsfc.nasa.gov/education/graphic_history/age.cfm
https://lambda.gsfc.nasa.gov/education/graphic_history/index.cfm As observations have accumulated to help refine or reject theoretical predictions, a consensus or “standard” 6-parameter Cosmological-constant (Lambda – Λ) Cold Dark Matter (CDM) model has been successfully applied to the interpretation of a range of cosmologically-oriented observations.
DES Collaboration https://kicp-workshops.uchicago.edu/DES-2017/
Density split statistics: Cosmological constraints from counts and lensing in cells in DES Y1 and SDSS data https://journals.aps.org/prd/abstract/10.1103/PhysRevD.98.023507
Baryonic Acoustic Oscillations in SDSS and DESI using the intergalactic medium absorption http://cosmology.lbl.gov/talks/duMasdesBourboux_18.pdf
The dark energy survey data release 1 https://www.darkenergysurvey.org/wp-content/uploads/2018/01/DR1Release.pdf
Wide-Field Lensing Mass Maps from DES Science Verification Data: Methodology and Detailed Analysis https://arxiv.org/pdf/1504.03002.pdf
Three-dimensional dark matter map estimated by the weak lensing technique. The dark matter is concentrated in dense clumps. We can identify massive dark matter halos (indicated by orange circles). The area shown in this figure is approximately 30 square degrees (a total of 160 square degrees were observed this time). The distribution map without the orange circles is available here. Credit: NAOJ/University of Tokyo.
The key results from DES support Einstein’s cosmological constant.
From the 1930s until the late 1990s, most physicists assumed the cosmological constant to be equal to zero. That changed with the surprising discovery in 1998 that the expansion of the universe is accelerating, implying the possibility of a positive nonzero value for the cosmological constant.
Tension in LCDM
Measured values of the Hubble constant actually vary between methods.
The Hubble constant can be measured by looking at fluctuations in the cosmic microwave background, the scale at which galaxies cluster, and by comparing the redshift of distant galaxies with their distance as determined by Cepheid variables and supernovae. It’s this last method that led to the initial discovery of dark energy. Each of these methods provides an independent measure of the Hubble parameter, and their agreement with each other is a way to validate the LCDM model.
Of course, these different measurements don’t all give exactly the same value. There’s a bit of variation in them, which is known as tension in the model. You would think that as our measurements get better the different methods would converge to a specific value, but that isn’t what’s happening. The latest results from the Planck spacecraft (which looks at the CMB) give a Hubble constant value of about 67 – 68 (km/s)/Mpc, while new results from Cepheid variables and supernovae give a value of about 71 – 75 (km/s)/Mpc. What’s troubling is that both of these measurements are based on good data, so they should both be accurate within a few percent. Given the uncertainty of the measurements you could say that they “agree” statistically, but the best values from each of these methods clearly do not agree well. This tension between results has always been lingering in the data, but it seems to get worse as we get better data.
So what’s going on? The short answer is we aren’t sure. It could be that there is some bias in one or both of the methods that we haven’t accounted for. Planck, for example, has to account for gas and dust between us and the cosmic background, and that may be skewing the results. It could be that the supernovae we use as standard candles to measure galactic distance aren’t as standard as we think. It could also be that our cosmological model isn’t quite right. The current model presumes that the universe is flat, and that cosmic expansion is driven by a cosmological constant. We have measurements to support those assumptions, but if they are slightly wrong that could account for the discrepancy as well.
This disagreement between measurements (signals have different amplitudes) isn’t enough to completely discard the LCDM model. All of the observations we have support the idea that the model is broadly correct. But there’s something in the details we aren’t getting right, and until we figure that out there will always be a bit of tension.
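The degree of disagreement is usually quoted in standard deviations; assuming (for illustration) Gaussian, independent errors, the tension between two measurements is |H1 − H2| / √(σ1² + σ2²):

```python
import math

# Sketch: quantify the Hubble tension in standard deviations,
# assuming independent Gaussian errors (values are illustrative,
# close to those quoted in the text).
def tension_sigma(h1, err1, h2, err2):
    return abs(h1 - h2) / math.sqrt(err1**2 + err2**2)

# CMB-based value vs local Cepheid/supernova value
print(f"{tension_sigma(67.4, 0.5, 73.5, 1.7):.1f} sigma")
```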
Paper: Adam G. Riess, et al. A 2.4% Determination of the Local Value of the Hubble Constant. arXiv:1604.01424 [astro-ph.CO] (2016)
Possible outcomes? Open questions on dark energy?
In cosmology, the equation of state of a perfect fluid is characterized by a dimensionless number w, equal to the ratio of its pressure p to its energy density ρ:
w = p/ρ
It is closely related to the thermodynamic equation of state and ideal gas law.
The equation of state of ordinary non-relativistic matter (e.g. cold dust) is w = 0, which means that it is diluted as ρ ∝ a⁻³ ∝ V⁻¹, where V is the volume. This means that the energy density falls off as the inverse of the volume, which is natural for ordinary non-relativistic matter.
The equation of state of ultra-relativistic matter (e.g. radiation, but also matter in the very early universe) is w = 1/3, which means that it is diluted as ρ ∝ a⁻⁴. In an expanding universe, the energy density decreases more quickly than the volume expansion, because radiation has momentum and, by the de Broglie hypothesis, a wavelength, which is red-shifted.
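Both dilution laws are special cases of the general result ρ ∝ a^(−3(1+w)) for a constant equation of state, which also shows why w = −1 gives a density that never dilutes:

```python
# rho(a)/rho(a=1) = a**(-3*(1+w)) for constant equation of state w:
# w = 0 recovers a^-3 (matter), w = 1/3 gives a^-4 (radiation),
# and w = -1 gives a^0 = 1, a density that does not dilute (Lambda).
def density_ratio(a, w):
    return a ** (-3 * (1 + w))

for w, label in [(0.0, "matter"), (1 / 3, "radiation"), (-1.0, "Lambda")]:
    print(f"{label:9s} at a = 0.5: rho ratio = {density_ratio(0.5, w):g}")
```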
Dark energy equation of state
Pressure/density = w(a) = w0 + wa(1 − a)
The equation-of-state parameter governs the rate at which the dark energy density evolves. For a perfect, unchanging vacuum energy, we have w = −1: the pressure is equal in magnitude and opposite in sign to the energy density. If w is a little bit greater than −1 (e.g., −0.9 or −0.8), the dark energy density will slowly decrease as the universe expands. This would be the case, for example, if the dark energy were the potential energy of some slowly-rolling scalar field (sometimes called “quintessence”). Current experimental bounds tell us that w = −1 is the central preferred value, but there is room for improvement.
Cosmic inflation and the accelerated expansion of the universe can be characterised by the equation of state of dark energy. In the simplest case, the equation of state of the cosmological constant is −1. More generally, the expansion of the universe is accelerating for any equation of state w < −1/3. The accelerated expansion of the Universe was indeed observed. According to observations, the value of the equation of state of the cosmological constant is near −1.
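For the time-varying parametrization w(a) = w0 + wa(1 − a) quoted above, integrating the fluid equation gives the standard result ρ_DE(a)/ρ_DE,0 = a^(−3(1+w0+wa)) · exp(−3wa(1 − a)); a minimal sketch:

```python
import math

# Dark energy density evolution for w(a) = w0 + wa*(1 - a):
#   rho(a)/rho0 = a**(-3*(1 + w0 + wa)) * exp(-3*wa*(1 - a))
def de_density_ratio(a, w0, wa):
    return a ** (-3 * (1 + w0 + wa)) * math.exp(-3 * wa * (1 - a))

# A pure cosmological constant (w0 = -1, wa = 0) never dilutes:
print(de_density_ratio(0.5, -1.0, 0.0))        # 1.0
# A quintessence-like w0 = -0.9 was ~23% denser at half the present size:
print(f"{de_density_ratio(0.5, -0.9, 0.0):.2f}")
```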
Hypothetical phantom energy would have an equation of state w < −1, and would cause a Big Rip. Using the existing data, it is still impossible to distinguish between phantom w < −1 and non-phantom w ≥ −1.
Is there a fundamental reason for w = −1 (lambda)? Is it on the right-hand side or the left-hand side of Einstein’s equation?
DES: more than dark energy
Non-dark-energy overview
Dark energy could be merely mimicking the cosmological constant, a scalar field changing so slowly that we have not yet been able to detect it. Or (whisper it quietly) perhaps dark energy does not even exist.
The Dark Energy Survey: more than dark energy – an overview https://arxiv.org/pdf/1601.00329.pdf
The paper illustrates, using early data (from ‘Science Verification’, and from the first, second and third seasons of observations), what DES can tell us about the solar system, the Milky Way, galaxy evolution, quasars, dwarf satellites (e.g. Large Magellanic Cloud) and other topics.
The Large Magellanic Cloud (LMC) is a satellite galaxy of the Milky Way.
A quasar (also known as a QSO or quasi-stellar object) is an extremely luminous active galactic nucleus (AGN). QSOs can be strongly lensed.
A multiple-image quasar is a quasar whose light undergoes gravitational lensing, resulting in double, triple or quadruple images of the same quasar. The first such gravitational lens to be discovered was the double-imaged quasar Q0957+561 (or Twin Quasar) in 1979.
A superluminous supernova (SLSN, plural superluminous supernovae or SLSNe), also known as a hypernova, is a type of stellar explosion with a luminosity 10 or more times higher than that of standard supernovae. They are potentially strong galactic sources of gravitational waves.
Gravitational wave detection is ready to pick up a slight shift in spacetime stirred up by the blast. Detecting such gravitational waves or a surfeit of supernova neutrinos would lead to a distinct leap in scientists’ knowledge, and provide new windows into supernovas. All that’s needed now is the explosion.
The first direct observation of gravitational waves was made on 14 September 2015 and was announced by the LIGO and Virgo collaborations on 11 February 2016.
DES was used to look for an optical counterpart to this event. The LIGO trigger had the equivalent magnitude of an 8.3 earthquake.
The first binary neutron star merger was discovered through both gravitational waves and light.
A kilonova (also called a macronova or r-process supernova) is a transient astronomical event that occurs in a compact binary system when two neutron stars or a neutron star and a black hole merge into each other. Kilonovae are thought to emit short gamma-ray bursts and strong electromagnetic radiation due to the radioactive decay of heavy r-process nuclei that are produced and ejected fairly isotropically during the merger process.
The rapid neutron-capture process, or so-called r-process, is a set of nuclear reactions that in nuclear astrophysics is responsible for the creation (nucleosynthesis) of approximately half the abundances of the atomic nuclei heavier than iron, usually synthesizing the entire abundance of the two most neutron-rich stable isotopes of each heavy element.
The term kilonova was introduced by Metzger et al. in 2010 to characterize the peak brightness, which they showed reaches 1000 times that of a classical nova. They are 1/10th to 1/100th the brightness of a typical supernova, the self-detonation of a massive star.
The first kilonova to be found was detected as a short gamma-ray burst, sGRB 130603B, by instruments on board the Swift Gamma-Ray Burst Explorer and KONUS/WIND spacecraft and then observed using the Hubble Space Telescope.
In October 2018, astronomers reported that GRB 150101B, a gamma-ray burst event detected in 2015, may be analogous to the historic GW170817, a gravitational wave event detected in 2017, and associated with the merger of two neutron stars. The similarities between the two events, in terms of gamma ray, optical and x-ray emissions, as well as to the nature of the associated host galaxies, are considered “striking”, and this remarkable resemblance suggests the two separate and independent events may both be the result of the merger of neutron stars, and both may be a hitherto-unknown class of kilonova transients. Kilonova events, therefore, may be more diverse and common in the universe than previously understood, according to the researchers.
The first observational suggestion of a kilonova came in 2008 following the short-hard gamma-ray burst GRB 080503, where a faint object appeared in optical and infrared light after one day and rapidly faded. Another kilonova was suggested in 2013, in association with the short-duration gamma-ray burst GRB 130603B, where the faint infrared emission from the distant kilonova was detected using the Hubble Space Telescope.
On October 16, 2017, the LIGO and Virgo collaborations announced the first simultaneous detections of gravitational waves (GW170817) and electromagnetic radiation (GRB 170817A, SSS17a) of any phenomena, and demonstrated that the source was a kilonova caused by a binary neutron star merger. This short GRB was followed by a longer transient visible for weeks in the optical electromagnetic spectrum (AT 2017gfo) located in a relatively nearby galaxy, NGC 4993. This visible light faded away.
Dr. Edo Berger, an astronomer at Harvard University and a member of the panel, said the observed kilonova showed “the direct fingerprints of the (creation of) heavy elements for the first time ever, producing 16,000 times the mass of the Earth in heavy elements and tens of times the mass of the Earth in gold and platinum.”
NGC 4993 (also catalogued as NGC 4994) is a lenticular galaxy located about 140 million light-years away in the constellation Hydra. It was discovered on 26 March 1789 by William Herschel and is a member of the NGC 4993 Group.
NGC 4993 is the site of the first astronomical event detected in both electromagnetic and gravitational radiation, the collision of two neutron stars, a discovery given the Breakthrough of the Year award for 2017 by the journal Science. Detecting a gravitational wave event associated with the gamma-ray burst provided direct confirmation that binary neutron star collisions produce short gamma-ray bursts.
These two photos show two moments in time surrounding the merging of two neutron stars. In the left image, taken about one day after the merger, the optical afterglow of the resulting explosion is visible as a small star at roughly the 11 o’clock position on the outskirts of the galaxy NGC 4993. In the right image, taken about two weeks later, the optical afterglow has completely faded away. Images: Dark Energy Survey.
Was the binary neutron star formation triggered by two galaxies merging?
https://www.darkenergysurvey.org/wp-content/uploads/2017/10/HolzGW.pdf A gravitational-wave standard siren measurement of the Hubble constant
GW170817 gave a measurement of H0 = 70.0 (+12.0, −8.0) km s⁻¹ Mpc⁻¹ (68% credible interval). This is consistent with both the Planck and SN Ia values, which are themselves in tension with each other.
The speed of the gravitational wave was obtained from GW170817: there was a time delay of 1.7 seconds between the gamma-ray burst and the gravitational wave. This rules out some (but not all) modified gravity models.
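The strength of this constraint is easy to see with illustrative numbers: the signal travelled for roughly 130 million years from NGC 4993, so a 1.7 s arrival difference bounds any fractional speed difference between gravity and light at about one part in 10¹⁵:

```python
# Back-of-envelope GW-speed bound from GW170817 (illustrative numbers):
# a 1.7 s gamma-ray delay after ~130 million years of travel time.
SECONDS_PER_YEAR = 3.156e7
travel_time_s = 130e6 * SECONDS_PER_YEAR   # light-travel time from NGC 4993
delay_s = 1.7

fractional_bound = delay_s / travel_time_s
print(f"|v_gw - c| / c <~ {fractional_bound:.0e}")
```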
Dark Energy Spectroscopic Instrument (DESI) – 10 times BOSS. An attempt to answer the big neutrino questions? Neutrino mass from cosmological surveys
The effect of massive neutrinos on the matter power spectrum https://arxiv.org/pdf/1006.0689.pdf
http://www.homepages.ucl.ac.uk/~ucapola/Lahav_BigData_SEPnet_21Sep2016B.pdf From the Big Bang to Big Data
https://sciencebusiness.net/news/79927/Square-Kilometre-Array-prepares-for-the-ultimate-big-data-challenge Square Kilometre Array prepares for the ultimate big data challenge
The Square Kilometre Array (SKA) project is an international effort to build the world’s largest radio telescope, with eventually over a square kilometre (one million square metres) of collecting area. The scale of the SKA represents a huge leap forward in both engineering and research & development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavours in history, the SKA will bring together a wealth of the world’s finest scientists, engineers and policy makers to bring the project to fruition.
Photometric supernova classification with machine learning https://arxiv.org/pdf/1603.00882.pdf Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible.
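A toy version of the idea (entirely synthetic data and hypothetical features, not the method of the paper): represent each supernova by a couple of light-curve features and classify with a simple nearest-centroid rule trained on a small labelled set:

```python
import random

# Toy photometric classification sketch: synthetic (peak magnitude,
# decline rate) features for two supernova classes, classified by
# distance to the class centroids of a labelled training set.
random.seed(0)

def simulate(kind):
    # hypothetical feature distributions, for illustration only
    if kind == "Ia":
        return (random.gauss(-19.3, 0.3), random.gauss(1.1, 0.1))
    return (random.gauss(-17.5, 0.8), random.gauss(0.7, 0.2))

train = [(simulate(k), k) for k in ["Ia"] * 200 + ["core-collapse"] * 200]

def centroid(label):
    pts = [f for f, k in train if k == label]
    return tuple(sum(c) / len(pts) for c in zip(*pts))

centroids = {k: centroid(k) for k in ("Ia", "core-collapse")}

def classify(features):
    return min(centroids,
               key=lambda k: sum((a - b) ** 2
                                 for a, b in zip(features, centroids[k])))

print(classify((-19.2, 1.05)))  # bright, Ia-like features
```

Real pipelines use far richer features and classifiers (random forests, neural networks), but the principle of training on a spectroscopically confirmed subset is the same.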
Twenty-eight years of monitoring LCDM. It is important to test it further.
Dark energy: how the paradigm shifted http://www.homepages.ucl.ac.uk/~ucapola/PWJan10lahav.pdf
The idea that our universe is dominated by mysterious dark energy was revealed by two paradigm-shifting studies of supernovae published in 1998. Lucy Calder and Ofer Lahav explain how the concept had in fact been brewing for at least a decade before – and speculate on where the next leap in our understanding might lie
http://iopscience.iop.org/article/10.1088/1742-6596/718/2/022004/pdf Multi-messenger astronomy: gravitational waves, neutrinos, photons, and cosmic rays
In the next decade, multi-messenger astronomy will probe the rich physics of transient phenomena in the sky, such as the mergers of neutron stars and/or black holes, gamma-ray bursts, and core-collapse supernovae.