Tag Archives: Precise

Israeli military says it’s carrying out a “precise and targeted” ground operation in Gaza’s Al-Shifa hospital – CBS News

  1. Israeli military says it’s carrying out a “precise and targeted” ground operation in Gaza’s Al-Shifa hospital CBS News
  2. Israel shows alleged Hamas ‘armory’ under children’s hospital in Gaza. Local health officials dismiss the claims CNN
  3. WH: U.S. intel confirms Hamas using Al-Shifa hospital to conceal military operations MSNBC
  4. Hamas commits war crimes in hospitals and mosques, but world says nothing New York Post
  5. The Palestine Red Crescent strongly condemns the false claims by the occupying forces about armed individuals launching projectiles from inside Al-Quds Hospital [EN/AR] – occupied Palestinian territory ReliefWeb

Read original article here

Scientists Reveal The Most Precise Map of All The Matter in The Universe : ScienceAlert

A gargantuan effort by a huge international team of scientists has just given us the most precise map of all the matter in the Universe obtained to date.

By combining data from two major surveys, the international collaboration has revealed where the Universe does and doesn’t keep all its junk – not just the normal matter that makes up the planets, stars, dust, black holes, and galaxies, but the dark matter, too: the mysterious invisible mass generating more gravity than the normal matter can account for.

The resulting map, showing where the matter has congregated over the 13.8-billion-year lifespan of the Universe, will be a valuable reference for scientists looking to understand how the Universe evolved.

Indeed, the results already show that the matter isn’t distributed quite how we thought it was, suggesting there could be something missing from the current standard model of cosmology.

According to the current models, at the point of the Big Bang, all the matter in the Universe was condensed into a singularity: a single point of infinite density and extreme heat that suddenly burst and spewed forth quarks that rapidly combined to form a soup of protons, neutrons and nuclei. Hydrogen and helium atoms came a few hundred thousand years later; from these, the entire Universe was made.

How these early atoms spread out, cooled, clumped together, formed stars and rocks and dust, is detective work based on how the Universe around us appears today. And one of the major clues we’ve used is where all the matter is now – because scientists can then work backwards to figure out how it got there.

But we can’t see all of it. In fact, most of the matter in the Universe – around 75 percent – is completely invisible to our current detection methods.

We’ve only detected it indirectly, because it creates stronger gravitational fields than there should be just based on the amount of normal matter. This manifests in such phenomena as galaxies spinning faster than they should, and a little quirk of the Universe we call gravitational lensing.

When something in the Universe has enough mass – for example, a cluster of thousands of galaxies – the gravitational field around it becomes strong enough to influence the curvature of space-time itself.

That means that any light that travels through that region of space does so along a curved path, resulting in warped and magnified light. These lenses, too, are stronger than they should be if they were only being created by normal matter.
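To get a feel for the scale of the effect, the bending of light by a compact mass can be approximated with the point-lens deflection formula α = 4GM/(c²b). The cluster mass and impact parameter in the sketch below are round-number assumptions chosen only for illustration, not values from the surveys discussed here:

```python
# Illustrative only: light deflection by a point-mass lens, alpha = 4*G*M / (c^2 * b).
# The cluster mass and impact parameter are assumed round numbers, not survey values.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
Mpc = 3.086e22       # megaparsec, m

M = 1e15 * M_sun     # assumed mass of a rich galaxy cluster
b = 1.0 * Mpc        # assumed impact parameter of the passing light ray

alpha_rad = 4 * G * M / (c**2 * b)
print(f"deflection ~ {alpha_rad * 206265:.0f} arcseconds")   # ~40 arcsec for these inputs
```

A dark-matter-dominated cluster bends light noticeably more than its visible stars and gas alone could, which is what makes lensing a tracer of the invisible mass.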

To map the matter in the Universe, researchers compared gravitational lens data collected by two different surveys – the Dark Energy Survey, which collected data in near-ultraviolet, visible, and near-infrared wavelengths; and the South Pole Telescope, which collects data on the cosmic microwave background, the faint traces of radiation left over from the Big Bang.

Maps of the sky compiled from data from the Dark Energy Survey (left) and the South Pole Telescope (right). (Yuuki Omori)

By cross-comparing these two datasets taken by two different instruments, the researchers can be much more certain of their results.
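As a toy illustration of why the cross-check helps, consider two noisy maps of the same underlying field: noise that is independent between the two instruments averages away in the cross-correlation, while it inflates either map's auto-correlation. The numbers below are synthetic and purely schematic:

```python
# Toy demonstration: cross-correlating two independent maps of the same sky suppresses
# instrument noise that is uncorrelated between them. These are synthetic 1-D "maps",
# not Dark Energy Survey or South Pole Telescope data.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(size=100_000)                       # shared underlying field (toy)
map_a = signal + 2.0 * rng.normal(size=signal.size)     # survey A: signal + its own noise
map_b = signal + 2.0 * rng.normal(size=signal.size)     # survey B: signal + independent noise

print(f"auto-correlation of A : {np.mean(map_a * map_a):.2f}   (true signal power is 1.0)")
print(f"cross-correlation A*B : {np.mean(map_a * map_b):.2f}   (noise largely cancels)")
```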

“It functions like a cross-check, so it becomes a much more robust measurement than if you just used one or the other,” says astrophysicist Chihway Chang of the University of Chicago, who was the lead author on one of three papers describing the work.

Lead authors on the two other papers are physicist Yuuki Omori of Kavli Institute for Cosmological Physics and the University of Chicago, and telescope scientist Tim Abbott of NOIRLab’s Cerro Tololo Inter-American Observatory.

The resulting map, based on galaxy positions, lensing of galaxies, and lensing of the cosmic microwave background, can then be extrapolated to infer the matter distribution in the Universe.

This map can then be compared to models and simulations of the evolution of the Universe to see if the observed matter distribution matches theory.

The researchers did run some comparisons, and found that their map mostly matched current models. But not quite. There were some very slight differences between observation and prediction; the matter distribution, the researchers found, is less clumpy, more evenly spaced out than models predict.

This suggests that our cosmological models could use a tweak.

That’s not really a surprise – there are a few mismatches between cosmological observation and theory that seem to suggest we’re missing a trick or two, somewhere; and the team’s findings are consistent with previous work – but the more accurate and complete our data is, the more likely we are to resolve these discrepancies.

There’s more work to be done; the findings aren’t certain, yet. Adding more surveys will help refine the map, and validate (or overturn) the team’s findings.

And, of course, the map itself will help other scientists conduct their own investigations into the mysterious, murky history of the Universe.

The research has been published in Physical Review D. The three papers are available on preprint server arXiv and can be found here, here, and here.

Read original article here

Future Space Telescopes Could be 100 Meters Across, Constructed in Space, and Then Bent Into a Precise Shape

It is an exciting time for astronomers and cosmologists. Since the launch of the James Webb Space Telescope (JWST), astronomers have been treated to the most vivid and detailed images of the Universe ever taken. Webb's powerful infrared imagers, spectrometers, and coronagraphs will allow for even more in the near future, including everything from surveys of the early Universe to direct imaging studies of exoplanets. Moreover, several next-generation telescopes will become operational in the coming years with 30-meter (~98.5 feet) primary mirrors, adaptive optics, spectrometers, and coronagraphs.

Even with these impressive instruments, astronomers and cosmologists look forward to an era when even more sophisticated and powerful telescopes are available. For example, Zachary Cordero of the Massachusetts Institute of Technology (MIT) recently proposed a telescope with a 100-meter (328-foot) primary mirror that would be autonomously constructed in space and bent into shape by electrostatic actuators. His proposal was one of several concepts selected this year by the NASA Innovative Advanced Concepts (NIAC) program for Phase I development.

Cordero is the Boeing Career Development Professor in Aeronautics and Astronautics at MIT and a member of the Aerospace Materials and Structures Lab (AMSL) and Small Satellite Center. His research integrates his expertise in processing science, mechanics, and design to develop novel materials and structures for emerging aerospace applications. His proposal is the result of a collaboration with Prof. Jeffrey Lang (from MIT's Electronics and the Microsystems Technology Laboratories) and a team of three students with the AMSL, including Ph.D. student Harsh Girishbhai Bhundiya.


Their proposed telescope addresses a key issue with space telescopes and other large payloads that are packaged for launch and then deployed in orbit. In short, size and surface precision tradeoffs limit the diameter of deployable space telescopes to tens of meters. Consider the recently launched JWST, the largest and most powerful telescope ever sent to space. To fit into its payload fairing (atop an Ariane 5 rocket), the telescope was designed so that it could be folded into a more compact form.

This included its primary mirror, secondary mirror, and sunshield, which all unfolded once the space telescope was in orbit. Meanwhile, the primary mirror (the most complex and powerful ever deployed) measures 6.5 meters (21 feet) in diameter. Its successor, the Large UV/Optical/IR Surveyor (LUVOIR), will have a similar folding assembly and a primary mirror measuring 8 to 15 meters (26.5 to 49 feet) in diameter – depending on the selected design (LUVOIR-A or -B). As Bhundiya explained to Universe Today via email:

“Today, most spacecraft antennas are deployed in orbit (e.g., Northrop Grumman’s Astromesh antenna) and have been optimized to achieve high performance and gain. However, they have limitations: 1) They are passive deployable systems. I.e. once you deploy them you cannot adaptively change the shape of the antenna. 2) They become difficult to slew as their size increases. 3) They exhibit a tradeoff between diameter and precision. I.e. their precision decreases as their size increases, which is a challenge for achieving astronomy and sensing applications that require both large diameters and high precision (e.g. JWST).”

While many in-space construction methods have been proposed to overcome these limitations, detailed analyses of their performance for building precision structures (like large-diameter reflectors) are lacking. For the sake of their proposal, Cordero and his colleagues conducted a quantitative, system-level comparison of materials and processes for in-space manufacturing. Ultimately, they determined that this limitation could be overcome using advanced materials and a novel in-space manufacturing method called Bend-Forming.


This technique, invented by researchers at the AMSL and described in a recent paper co-authored by Bhundiya and Cordero, relies on a combination of Computer Numerical Control (CNC) deformation processing and hierarchical high-performance materials. As Harsh explained it:

“Bend-Forming is a process for fabricating 3D wireframe structures from metal wire feedstock. It works by bending a single strand of wire at specific nodes and with specific angles, and adding joints to the nodes to make a stiff structure. So to fabricate a given structure, you convert it into bending instructions which can be implemented on a machine like a CNC wire bender to fabricate it from a single strand of feedstock. The key application of Bend-Forming is to manufacture the support structure for a large antenna on orbit. The process is well-suited for this application because it is low-power, can fabricate structures with high compaction ratios, and has essentially no size limit.”
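The quote above describes converting a wireframe into bending instructions. As a rough, hypothetical sketch of that conversion (the real AMSL planner also handles joints, feed lengths, and collision avoidance), one can walk along the node coordinates and compute the wire feed and bend angle at each interior node:

```python
# Hypothetical sketch: turn a 3-D wireframe path into "feed, then bend" instructions by
# computing the angle between successive wire segments at each interior node.
# The node coordinates are invented for illustration.
import numpy as np

nodes = np.array([            # toy open path along a wireframe, in metres
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [1.0, 1.0, 1.0],
])

for i in range(1, len(nodes) - 1):
    v_in, v_out = nodes[i] - nodes[i - 1], nodes[i + 1] - nodes[i]
    cos_t = np.dot(v_in, v_out) / (np.linalg.norm(v_in) * np.linalg.norm(v_out))
    bend_deg = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
    feed_mm = np.linalg.norm(v_in) * 1000
    print(f"node {i}: feed {feed_mm:.0f} mm of wire, then bend {bend_deg:.0f} degrees")
```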

In contrast to other in-space assembly and manufacturing approaches, Bend-Forming is low-power and is uniquely enabled by the extremely low-temperature environment of space. The resulting smart structures leverage multifunctional materials to achieve new combinations of size, mass, stiffness, and precision, breaking the design paradigms that limit conventional truss or tension-aligned space structures.

In addition to their native precision, large Bend-Formed structures can use their electrostatic actuators to contour a reflector surface with sub-millimeter precision. This, said Harsh, will increase the precision of their fabricated antenna in orbit:

“The method of active control is called electrostatic actuation and uses forces generated by electrostatic attraction to precisely shape a metallic mesh into a curved shape which acts as the antenna reflector. We do this by applying a voltage between the mesh and a ‘command surface’ which consists of the Bend-Formed support structure and deployable electrodes. By adjusting this voltage, we can precisely shape the reflector surface and achieve a high-gain, parabolic antenna.”
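A back-of-envelope way to see why modest voltages are enough: in the parallel-plate approximation, the attractive pressure between the mesh and the command surface is P = ε₀V²/(2d²). The voltage and gap below are assumptions chosen only for scale, not the team's design values:

```python
# Parallel-plate estimate of electrostatic actuation pressure, P = eps0 * V^2 / (2 * d^2).
# The voltage and gap are assumed values for scale only.
eps0 = 8.854e-12     # vacuum permittivity, F/m
V = 1.0e3            # assumed command voltage, volts
d = 0.01             # assumed mesh-to-electrode gap, metres (1 cm)

pressure = eps0 * V**2 / (2 * d**2)
print(f"electrostatic pressure ~ {pressure * 1e3:.0f} mPa")
# A few tens of millipascals: negligible on Earth, but enough to trim a lightweight
# mesh in micro-gravity, and continuously adjustable by changing V.
```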

An arrangement of 3 exoplanets to explore how the atmospheres can look different based on the chemistry present and incoming flux. Credit: Jack H. Madden, used with permission

Harsh and his colleagues deduce that this technique will allow for a deployable mirror measuring more than 100 meters (328 ft) in diameter that could achieve a surface precision of 100 m/m and a specific area of more than 10 m²/kg. This capability would surpass existing microwave radiometry technology and could lead to significant improvements in storm forecasts and an improved understanding of atmospheric processes like the hydrologic cycle. This would have significant implications for Earth Observation and exoplanet studies.

The team recently demonstrated a 1-meter (3.3 ft) prototype of an electrostatically-actuated reflector with a Bend-Formed support structure at the 2023 American Institute of Aeronautics and Astronautics (AIAA) SciTech Conference, which ran from January 23rd to 27th in National Harbor, Maryland. With this Phase I NIAC grant, the team plans to mature the technology with the ultimate aim of creating a microwave radiometry reflector.

Looking ahead, the team plans to investigate how Bend-Forming can be used in geostationary orbit (GEO) to create a microwave radiometry reflector with a 15 km (9.3 mi) field of view, a ground resolution of 35 km (21.75 mi), and a proposed frequency span of 50 to 56 GHz, in the super-high and extremely-high frequency range (SHF/EHF). This will enable the telescope to retrieve temperature profiles from exoplanet atmospheres, a key characteristic allowing astrobiologists to measure habitability.

“Our goal with the NIAC now is to work towards implementing our technology of Bend-Forming and electrostatic actuation in space,” said Harsh. “We envision fabricating 100-m diameter antennas in geostationary orbit that have Bend-Formed support structures and electrostatically-actuated reflector surfaces. These antennas will enable a new generation of spacecraft with increased sensing, communication, and power capabilities.”

Further Reading: NASA

Read original article here

The most precise astronomical test of electromagnetism yet

Credit: NASA

There’s an awkward, irksome problem with our understanding of nature’s laws which physicists have been trying to explain for decades. It’s about electromagnetism, the law of how atoms and light interact, which explains everything from why you don’t fall through the floor to why the sky is blue.

Our theory of electromagnetism is arguably the best physical theory humans have ever made—but it has no answer for why electromagnetism is as strong as it is. Only experiments can tell you electromagnetism’s strength, which is measured by a number called α (aka alpha, or the fine-structure constant).
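For the curious, α is itself a combination of other measured constants, e²/(4πε₀ħc), and evaluating that combination, as in the short sketch below, gives the familiar value near 1/137; what theory cannot do is tell us why it takes that value.

```python
# Evaluate alpha = e^2 / (4*pi*eps0*hbar*c) from measured constants (CODATA values via scipy).
from scipy.constants import e, epsilon_0, hbar, c, pi

alpha = e**2 / (4 * pi * epsilon_0 * hbar * c)
print(f"alpha ~ {alpha:.9f}  (about 1/{1/alpha:.3f})")   # ~0.007297353, i.e. ~1/137.036
```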

The American physicist Richard Feynman, who helped come up with the theory, called this “one of the greatest damn mysteries of physics” and urged physicists to “put this number up on their wall and worry about it.”

In research just published in Science, we decided to test whether α is the same in different places within our galaxy by studying stars that are almost identical twins of our sun. If α is different in different places, it might help us find the ultimate theory, not just of electromagnetism, but of all nature’s laws together—the “theory of everything.”

We want to break our favorite theory

Physicists really want one thing: a situation where our current understanding of physics breaks down. New physics. A signal that cannot be explained by current theories. A sign-post for the theory of everything.

The sun’s rainbow: sunlight is here spread into separate rows, each covering just a small range of colors, to reveal the many dark absorption lines from atoms in the Sun’s atmosphere. Credit: N.A. Sharp / KPNO / NOIRLab / NSO / NSF / AURA, CC BY

To find it, they might wait deep underground in a gold mine for particles of dark matter to collide with a special crystal. Or they might carefully tend the world’s best atomic clocks for years to see if they tell slightly different time. Or smash protons together at (nearly) the speed of light in the 27-km ring of the Large Hadron Collider.

The trouble is, it’s hard to know where to look. Our current theories can’t guide us.

Of course, we look in laboratories on Earth, where it’s easiest to search thoroughly and most precisely. But that’s a bit like the drunk only searching for his lost keys under a lamp-post when, actually, he might have lost them on the other side of the road, somewhere in a dark corner.

Stars are terrible, but sometimes terribly similar

We decided to look beyond Earth, beyond our solar system, to see if stars which are nearly identical twins of our sun produce the same rainbow of colors. Atoms in the atmospheres of stars absorb some of the light struggling outwards from the nuclear furnaces in their cores.

Only certain colors are absorbed, leaving dark lines in the rainbow. Those absorbed colors are determined by α—so measuring the dark lines very carefully also lets us measure α.

Hotter and cooler gas bubbling through the turbulent atmospheres of stars make it hard to compare absorption lines in stars with those seen in laboratory experiments. Credit: NSO / AURA / NSF, CC BY

The problem is, the atmospheres of stars are moving—boiling, spinning, looping, burping—and this shifts the lines. The shifts spoil any comparison with the same lines in laboratories on Earth, and hence any chance of measuring α. Stars, it seems, are terrible places to test electromagnetism.

But we wondered: if you find stars that are very similar—twins of each other—maybe their dark, absorbed colors are similar as well. So instead of comparing stars to laboratories on Earth, we compared twins of our sun to each other.
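Schematically, each absorption line responds to a change in α with its own sensitivity, while the star's bulk motion shifts every line by the same amount; fitting both effects at once is what separates a real change in α from ordinary Doppler motion. The sensitivities and "measured" shifts in this sketch are invented for illustration and are not the values used in the published analysis:

```python
# Schematic two-parameter fit: line i shifts by v_i ~ v_star + c * k_i * (d_alpha/alpha),
# where v_star is the star's ordinary Doppler velocity (a nuisance parameter) and k_i is
# the line's sensitivity to alpha. Sensitivities and shifts are invented, not published values.
import numpy as np

c_kms = 299_792.458
k = np.array([0.00, 0.02, -0.03, 0.05])        # assumed per-line sensitivity coefficients
rng = np.random.default_rng(1)
true_dalpha = 2e-8                             # pretend variation used to build the toy data
v_meas = 0.30 + c_kms * k * true_dalpha + rng.normal(0, 1e-4, size=k.size)   # km/s

A = np.column_stack([np.ones_like(k), c_kms * k])      # columns: v_star, d_alpha/alpha
(v_star_fit, dalpha_fit), *_ = np.linalg.lstsq(A, v_meas, rcond=None)
print(f"recovered d_alpha/alpha ~ {dalpha_fit:.1e}")   # roughly recovers the injected 2e-8
```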

A new test with solar twins

Our team of student, postdoctoral and senior researchers, at Swinburne University of Technology and the University of New South Wales, measured the spacing between pairs of absorption lines in our sun and 16 “solar twins”—stars almost indistinguishable from our sun.

The rainbows from these stars were observed on the 3.6-meter European Southern Observatory (ESO) telescope in Chile. While not the largest telescope in the world, the light it collects is fed into probably the best-controlled, best-understood spectrograph: HARPS. This separates the light into its colors, revealing the detailed pattern of dark lines.

HARPS spends much of its time observing sun-like stars to search for planets. Handily, this provided a treasure trove of exactly the data we needed.

The ESO 3.6-meter telescope in Chile spends much of its time observing Sun-like stars to search for planets using its extremely precise spectrograph, HARPS. Credit: Iztok Bončina / ESO, CC BY

From these exquisite spectra, we have shown that α was the same in the 17 solar twins to an astonishing precision: just 50 parts per billion. That’s like comparing your height to the circumference of Earth. It’s the most precise astronomical test of α ever performed.
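A quick check on that analogy, assuming an equatorial circumference of Earth of roughly 40,000 km: 50 parts per billion of it is about the height of a person.

```python
# 50 parts per billion of Earth's circumference (~40,075 km), in metres.
print(40_075_000 * 50e-9)   # ~2.0 m, roughly a person's height
```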

Unfortunately, our new measurements didn’t break our favorite theory. But the stars we’ve studied are all relatively nearby, only up to 160 light years away.

What’s next?

We’ve recently identified new solar twins much further away, about halfway to the center of our Milky Way galaxy.

In this region, there should be a much higher concentration of dark matter—an elusive substance astronomers believe lurks throughout the galaxy and beyond. Like α, we know precious little about dark matter, and some theoretical physicists suggest the inner parts of our galaxy might be just the dark corner we should search for connections between these two “damn mysteries of physics.”

If we can observe these much more distant suns with the largest optical telescopes, maybe we’ll find the keys to the universe.

More information:
Michael T. Murphy et al, A limit on variations in the fine-structure constant from spectra of nearby Sun-like stars, Science (2022). DOI: 10.1126/science.abi9232

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
‘One of the greatest damn mysteries of physics’: The most precise astronomical test of electromagnetism yet (2022, November 11)
retrieved 11 November 2022
from https://phys.org/news/2022-11-greatest-damn-mysteries-physics-precise.html




Read original article here

Supernova Explosions Reveal Precise Details of Dark Energy and Dark Matter

Artist’s impression of two white dwarf stars merging and creating a Type Ia supernova. Credit: ESO/L. Calçada

An analysis of more than two decades’ worth of supernova explosions convincingly boosts modern cosmological theories and reinvigorates efforts to answer fundamental questions.

Astrophysicists have performed a powerful new analysis that places the most precise limits ever on the composition and evolution of the universe. With this analysis, dubbed Pantheon+, cosmologists find themselves at a crossroads.

Pantheon+ convincingly finds that the cosmos is made up of about two-thirds dark energy and one-third matter — predominantly in the form of dark matter — and is expanding at an accelerating pace over the last several billion years. However, Pantheon+ also cements a major disagreement over the pace of that expansion that has yet to be solved.

By putting prevailing modern cosmological theories, known as the Standard Model of Cosmology, on even firmer evidentiary and statistical footing, Pantheon+ further closes the door on alternative frameworks accounting for dark energy and dark matter. Both are bedrocks of the Standard Model of Cosmology but have yet to be directly detected. They rank among the model’s biggest mysteries. Following through on the results of Pantheon+, researchers can now pursue more precise observational tests and hone explanations for the ostensible cosmos.

G299 was left over by a particular class of supernovas called Type Ia. Credit: NASA/CXC/U.Texas

“With these Pantheon+ results, we are able to put the most precise constraints on the dynamics and history of the universe to date,” says Dillon Brout, an Einstein Fellow at the Center for Astrophysics | Harvard & Smithsonian. “We’ve combed over the data and can now say with more confidence than ever before how the universe has evolved over the eons and that the current best theories for dark energy and dark matter hold strong.”

Brout is the lead author of a series of papers describing the new Pantheon+ analysis, published jointly on October 19 in a special issue of The Astrophysical Journal.

Pantheon+ is based on the largest dataset of its kind, comprising more than 1,500 stellar explosions called Type Ia supernovae. These bright blasts occur when white dwarf stars, the remnants of stars like our Sun, accumulate too much mass and undergo a runaway thermonuclear reaction.

The breakthrough discovery in 1998 of the universe’s accelerating growth was thanks to a study of Type Ia supernovae in this manner. Scientists attribute the expansion to an invisible energy, therefore monikered dark energy, inherent to the fabric of the universe itself. Subsequent decades of work have continued to compile ever-larger datasets, revealing supernovae across an even wider range of space and time, and Pantheon+ has now brought them together into the most statistically robust analysis to date.

“In many ways, this latest Pantheon+ analysis is a culmination of more than two decades’ worth of diligent efforts by observers and theorists worldwide in deciphering the essence of the cosmos,” says Adam Riess, one of the winners of the 2011 Nobel Prize in Physics for the discovery of the accelerating expansion of the universe and the Bloomberg Distinguished Professor at Johns Hopkins University (JHU) and the Space Telescope Science Institute in Baltimore, Maryland. Riess is also an alum of Harvard University, holding a PhD in astrophysics.

“With this combined Pantheon+ dataset, we get a precise view of the universe from the time when it was dominated by dark matter to when the universe became dominated by dark energy.” — Dillon Brout

Brout’s own career in cosmology traces back to his undergraduate years at JHU, where he was taught and advised by Riess. There Brout worked with then-PhD-student and Riess-advisee Dan Scolnic, who is now an assistant professor of physics at Duke University and another co-author on the new series of papers.

Several years ago, Scolnic developed the original Pantheon analysis of approximately 1,000 supernovae.

Now, Brout and Scolnic and their new Pantheon+ team have added some 50 percent more supernovae data points in Pantheon+, coupled with improvements in analysis techniques and addressing potential sources of error, which ultimately has yielded twice the precision of the original Pantheon.

“This leap in both the dataset quality and in our understanding of the physics that underpin it would not have been possible without a stellar team of students and collaborators working diligently to improve every facet of the analysis,” says Brout.

Taking the data as a whole, the new analysis holds that 66.2 percent of the universe manifests as dark energy, with the remaining 33.8 percent being a combination of dark matter and matter. To arrive at even more comprehensive understanding of the constituent components of the universe at different epochs, Brout and colleagues combined Pantheon+ with other strongly evidenced, independent, and complementary measures of the large-scale structure of the universe and with measurements from the earliest light in the universe, the cosmic microwave background.

“With these Pantheon+ results, we are able to put the most precise constraints on the dynamics and history of the universe to date.” — Dillon Brout

Another key Pantheon+ result relates to one of the paramount goals of modern cosmology: nailing down the current expansion rate of the universe, known as the Hubble constant. Pooling the Pantheon+ sample with data from the SH0ES (Supernova H0 for the Equation of State) collaboration, led by Riess, results in the most stringent local measurement of the current expansion rate of the universe.

Pantheon+ and SH0ES together find a Hubble constant of 73.4 kilometers per second per megaparsec with only 1.3% uncertainty. Stated another way, for every megaparsec, or 3.26 million light years, the analysis estimates that in the nearby universe, space itself is expanding at more than 160,000 miles per hour.
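That miles-per-hour figure is just a unit conversion of the quoted Hubble constant, as a quick check shows:

```python
# Convert 73.4 km/s per megaparsec into miles per hour per megaparsec.
H0 = 73.4                              # km/s/Mpc, quoted above
mph = H0 * 3600 / 1.609344             # km/s -> km/h -> miles/h
print(f"~{mph:,.0f} miles per hour per megaparsec")   # a bit over 164,000 mph
```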

However, observations from an entirely different epoch of the universe’s history predict a different story. Measurements of the universe’s earliest light, the cosmic microwave background, when combined with the current Standard Model of Cosmology, consistently peg the Hubble constant at a rate that is significantly less than observations taken via Type Ia supernovae and other astrophysical markers. This sizable discrepancy between the two methodologies has been termed the Hubble tension.

The new Pantheon+ and SH0ES datasets heighten this Hubble tension. In fact, the tension has now passed the important 5-sigma threshold (about one-in-a-million odds of arising due to random chance) that physicists use to distinguish between possible statistical flukes and something that must accordingly be understood. Reaching this new statistical level highlights the challenge for both theorists and astrophysicists to try and explain the Hubble constant discrepancy.

“We thought it would be possible to find clues to a novel solution to these problems in our dataset, but instead we’re finding that our data rules out many of these options and that the profound discrepancies remain as stubborn as ever,” says Brout.

The Pantheon+ results could help point to where the solution to the Hubble tension lies. “Many recent theories have begun pointing to exotic new physics in the very early universe, however, such unverified theories must withstand the scientific process and the Hubble tension continues to be a major challenge,” says Brout.

Overall, Pantheon+ offers scientists a comprehensive look back through much of cosmic history. The earliest, most distant supernovae in the dataset gleam forth from 10.7 billion light years away, meaning from when the universe was roughly a quarter of its current age. In that earlier era, dark matter and its associated gravity held the universe’s expansion rate in check. Such a state of affairs changed dramatically over the next several billion years as the influence of dark energy overwhelmed that of dark matter. Dark energy has since flung the contents of the cosmos ever farther apart and at an ever-increasing rate.

“With this combined Pantheon+ dataset, we get a precise view of the universe from the time when it was dominated by dark matter to when the universe became dominated by dark energy,” says Brout. “This dataset is a unique opportunity to see dark energy turn on and drive the evolution of the cosmos on the grandest scales up through present time.”

Studying this changeover now with even stronger statistical evidence will hopefully lead to new insights into dark energy’s enigmatic nature.

“Pantheon+ is giving us our best chance to date of constraining dark energy, its origins, and its evolution,” says Brout.

Reference: “The Pantheon+ Analysis: Cosmological Constraints” by Dillon Brout, Dan Scolnic, Brodie Popovic, Adam G. Riess, Anthony Carr, Joe Zuntz, Rick Kessler, Tamara M. Davis, Samuel Hinton, David Jones, W. D’Arcy Kenworthy, Erik R. Peterson, Khaled Said, Georgie Taylor, Noor Ali, Patrick Armstrong, Pranav Charvu, Arianna Dwomoh, Cole Meldorf, Antonella Palmese, Helen Qu, Benjamin M. Rose, Bruno Sanchez, Christopher W. Stubbs, Maria Vincenzi, Charlotte M. Wood, Peter J. Brown, Rebecca Chen, Ken Chambers, David A. Coulter, Mi Dai, Georgios Dimitriadis, Alexei V. Filippenko, Ryan J. Foley, Saurabh W. Jha, Lisa Kelsey, Robert P. Kirshner, Anais Möller, Jessie Muir, Seshadri Nadathur, Yen-Chen Pan, Armin Rest, Cesar Rojas-Bravo, Masao Sako, Matthew R. Siebert, Mat Smith, Benjamin E. Stahl and Phil Wiseman, 19 October 2022, The Astrophysical Journal.
DOI: 10.3847/1538-4357/ac8e04



Read original article here

Most Precise Measurements of the Universe Suggest That Something Is Very Wrong

Researchers have found that the universe is expanding at an entirely different rate than previously thought, a groundbreaking discovery that could undermine our current understanding of the cosmos.

In a new paper published in The Astrophysical Journal, an international team of researchers studied the light emitted from 1,550 different supernovae — some near our own Milky Way and some located in the furthest reaches of the universe billions of light-years away — to study the composition and expansion rate of the universe.

Their resulting analysis, dubbed Pantheon+, includes some of the most comprehensive measurements ever made.

“With these Pantheon+ results,” Dillon Brout, co-author and researcher at Harvard’s Center for Astrophysics, told The Harvard Gazette, “we are able to put the most precise constraints on the dynamics and history of the universe to date.”

Excitingly, their findings corroborate some existing theories regarding dark matter, a mysterious yet abundant substance which scientists have yet to observe or measure directly, and dark energy, a hypothetical form of energy that behaves like the opposite of gravity.

Their research posits that the entire universe is roughly made up of two-thirds dark energy and one-third matter, the latter of which is mostly made up of dark matter.

“We’ve combed over the data,” Brout continued, “and can now say with more confidence than ever before how the universe has evolved over the eons and that the current best theories for dark energy and dark matter hold strong.”

But at the same time, the study fails to remedy one of the greatest discrepancies in the field of astronomy: the Hubble tension, or the apparent mismatch between previous, locally-measured estimates of the universe’s expansion rate and the measurement derived from the cosmic microwave background, electromagnetic remnants of the earliest known stages of the universe.

The new research suggests that, for every megaparsec of distance, the nearby universe is expanding at roughly 160,000 miles per hour, while previous measurements that take the cosmic microwave background into account concluded it was expanding far slower than that.

And while Pantheon+ may have confirmed the discrepancy, it didn’t exactly provide any answers for it.

“We thought it would be possible to find clues to a novel solution to these problems in our dataset,” Brout told the Harvard Gazette, “but instead we’re finding that our data rules out many of these options and that the profound discrepancies remain as stubborn as ever.”

“It certainly indicates,” the researcher told Agence France-Presse, “that potentially something is fishy with our understanding of the universe.”

Clearly, Pantheon+ has opened more doors than it has closed. But that, in a way, is the beauty of the scientific process — the research done by Brout and his team could still lay the groundwork for a number of future discoveries.

“We, as scientists, thrive on not understanding everything,” Brout continued to AFP. “There’s still potentially a major revolution in our understanding, coming potentially in our lifetimes.”

READ MORE: Most precise accounting yet of dark energy and dark matter [The Harvard Gazette]

More on astronomers being wrong about stuff: Scientists Puzzled Because James Webb Is Seeing Stuff That Shouldn’t Be There

Read original article here

The most precise accounting yet of dark energy and dark matter

Credit: NASA/CXC/U.Texas

Astrophysicists have performed a powerful new analysis that places the most precise limits yet on the composition and evolution of the universe. With this analysis, dubbed Pantheon+, cosmologists find themselves at a crossroads.

Pantheon+ convincingly finds that the cosmos is composed of about two-thirds dark energy and one-third matter—mostly in the form of dark matter—and is expanding at an accelerating pace over the last several billion years. However, Pantheon+ also cements a major disagreement over the pace of that expansion that has yet to be solved.

By putting prevailing modern cosmological theories, known as the Standard Model of Cosmology, on even firmer evidentiary and statistical footing, Pantheon+ further closes the door on alternative frameworks accounting for dark energy and dark matter. Both are bedrocks of the Standard Model of Cosmology but have yet to be directly detected and rank among the model’s biggest mysteries. Following through on the results of Pantheon+, researchers can now pursue more precise observational tests and hone explanations for the ostensible cosmos.

“With these Pantheon+ results, we are able to put the most precise constraints on the dynamics and history of the universe to date,” says Dillon Brout, an Einstein Fellow at the Center for Astrophysics | Harvard & Smithsonian. “We’ve combed over the data and can now say with more confidence than ever before how the universe has evolved over the eons and that the current best theories for dark energy and dark matter hold strong.”

Brout is the lead author of a series of papers describing the new Pantheon+ analysis, published jointly today in a special issue of The Astrophysical Journal.

Pantheon+ is based on the largest dataset of its kind, comprising more than 1,500 stellar explosions called Type Ia supernovae. These bright blasts occur when white dwarf stars—remnants of stars like our Sun—accumulate too much mass and undergo a runaway thermonuclear reaction.

Because Type Ia supernovae outshine entire galaxies, the stellar detonations can be glimpsed at distances exceeding 10 billion light years, or back through about three-quarters of the universe’s total age. Given that the supernovae blaze with nearly uniform intrinsic brightnesses, scientists can use the explosions’ apparent brightness, which diminishes with distance, along with redshift measurements as markers of time and space.

That information, in turn, reveals how fast the universe expands during different epochs, which is then used to test theories of the fundamental components of the universe.
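The arithmetic behind these "markers of time and space" is the inverse-square law, usually written as the distance modulus m - M = 5 log10(d / 10 pc). The sketch below uses a commonly quoted fiducial peak magnitude for a Type Ia supernova and an invented apparent magnitude, just to show how an observed brightness turns into a distance:

```python
# Standard-candle sketch: apparent peak magnitude + assumed intrinsic magnitude -> distance.
# M ~ -19.3 is a commonly quoted fiducial Type Ia peak magnitude; m_app is invented here.
M_abs = -19.3
m_app = 23.0

d_pc = 10 ** ((m_app - M_abs) / 5 + 1)      # luminosity distance in parsecs
print(f"d ~ {d_pc/1e9:.1f} billion parsecs (~{d_pc*3.26/1e9:.0f} billion light years)")
```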

The breakthrough discovery in 1998 of the universe’s accelerating growth was thanks to a study of Type Ia supernovae in this manner. Scientists attribute the expansion to an invisible energy, therefore monikered dark energy, inherent to the fabric of the universe itself. Subsequent decades of work have continued to compile ever-larger datasets, revealing supernovae across an even wider range of space and time, and Pantheon+ has now brought them together into the most statistically robust analysis to date.

“In many ways, this latest Pantheon+ analysis is a culmination of more than two decades’ worth of diligent efforts by observers and theorists worldwide in deciphering the essence of the cosmos,” says Adam Riess, one of the winners of the 2011 Nobel Prize in Physics for the discovery of the accelerating expansion of the universe and the Bloomberg Distinguished Professor at Johns Hopkins University (JHU) and the Space Telescope Science Institute in Baltimore, Maryland. Riess is also an alum of Harvard University, holding a Ph.D. in astrophysics.

Brout’s own career in cosmology traces back to his undergraduate years at JHU, where he was taught and advised by Riess. There Brout worked with then-Ph.D.-student and Riess-advisee Dan Scolnic, who is now an assistant professor of physics at Duke University and another co-author on the new series of papers.

Several years ago, Scolnic developed the original Pantheon analysis of approximately 1,000 supernovae.

Now, Brout and Scolnic and their new Pantheon+ team have added some 50 percent more supernovae data points in Pantheon+, coupled with improvements in analysis techniques and addressing potential sources of error, which ultimately has yielded twice the precision of the original Pantheon.

“This leap in both the dataset quality and in our understanding of the physics that underpin it would not have been possible without a stellar team of students and collaborators working diligently to improve every facet of the analysis,” says Brout.

Taking the data as a whole, the new analysis holds that 66.2 percent of the universe manifests as dark energy, with the remaining 33.8 percent being a combination of dark matter and matter.

To arrive at even more comprehensive understanding of the constituent components of the universe at different epochs, Brout and colleagues combined Pantheon+ with other strongly evidenced, independent and complementary measures of the large-scale structure of the universe and with measurements from the earliest light in the universe, the cosmic microwave background.

Another key Pantheon+ result relates to one of the paramount goals of modern cosmology: nailing down the current expansion rate of the universe, known as the Hubble constant. Pooling the Pantheon+ sample with data from the SH0ES (Supernova H0 for the Equation of State) collaboration, led by Riess, results in the most stringent local measurement of the current expansion rate of the universe.

Pantheon+ and SH0ES together find a Hubble constant of 73.4 kilometers per second per megaparsec with only 1.3% uncertainty. Stated another way, for every megaparsec, or 3.26 million light years, the analysis estimates that in the nearby universe, space itself is expanding at more than 160,000 miles per hour.

However, observations from an entirely different epoch of the universe’s history predict a different story. Measurements of the universe’s earliest light, the cosmic microwave background, when combined with the current Standard Model of Cosmology, consistently peg the Hubble constant at a rate that is significantly less than observations taken via Type Ia supernovae and other astrophysical markers. This sizable discrepancy between the two methodologies has been termed the Hubble tension.

The new Pantheon+ and SH0ES datasets heighten this Hubble tension. In fact, the tension has now passed the important 5-sigma threshold (about one-in-a-million odds of arising due to random chance) that physicists use to distinguish between possible statistical flukes and something that must accordingly be understood. Reaching this new statistical level highlights the challenge for both theorists and astrophysicists to try and explain the Hubble constant discrepancy.

“We thought it would be possible to find clues to a novel solution to these problems in our dataset, but instead we’re finding that our data rules out many of these options and that the profound discrepancies remain as stubborn as ever,” says Brout.

The Pantheon+ results could help point to where the solution to the Hubble tension lies. “Many recent theories have begun pointing to exotic new physics in the very early universe, however such unverified theories must withstand the scientific process and the Hubble tension continues to be a major challenge,” says Brout.

Overall, Pantheon+ offers scientists a comprehensive look back through much of cosmic history. The earliest, most distant supernovae in the dataset gleam forth from 10.7 billion light years away, meaning from when the universe was roughly a quarter of its current age. In that earlier era, dark matter and its associated gravity held the universe’s expansion rate in check.

Such a state of affairs changed dramatically over the next several billion years as the influence of dark energy overwhelmed that of dark matter. Dark energy has since flung the contents of the cosmos ever-farther apart and at an ever-increasing rate.

“With this combined Pantheon+ dataset, we get a precise view of the universe from the time when it was dominated by dark matter to when the universe became dominated by dark energy,” says Brout. “This dataset is a unique opportunity to see dark energy turn on and drive the evolution of the cosmos on the grandest scales up through present time.”

Studying this changeover now with even stronger statistical evidence will hopefully lead to new insights into dark energy’s enigmatic nature.

“Pantheon+ is giving us our best chance to date of constraining dark energy, its origins, and its evolution,” says Brout.




More information:
Dillon Brout et al, The Pantheon+ Analysis: Cosmological Constraints, The Astrophysical Journal (2022). DOI: 10.3847/1538-4357/ac8e04
Provided by
Harvard-Smithsonian Center for Astrophysics

Citation:
The most precise accounting yet of dark energy and dark matter (2022, October 19)
retrieved 20 October 2022
from https://phys.org/news/2022-10-precise-accounting-dark-energy.html




Read original article here

Three decades of space telescope observations converge on a precise value for the Hubble constant

This collection of 36 images from NASA’s Hubble Space Telescope features galaxies that are all hosts to both Cepheid variables and supernovae. These two celestial phenomena are both crucial tools used by astronomers to determine astronomical distance, and have been used to refine our measurement of the Hubble constant, the expansion rate of the universe. The galaxies shown in this photo (from top row, left to bottom row, right) are:  NGC 7541, NGC 3021, NGC 5643, NGC 3254, NGC 3147, NGC 105, NGC 2608, NGC 3583, NGC 3147, Mrk 1337, NGC 5861, NGC 2525, NGC 1015, UGC 9391, NGC 691, NGC 7678, NGC 2442, NGC 5468, NGC 5917, NGC 4639, NGC 3972, The Antennae Galaxies, NGC 5584, M106, NGC 7250, NGC 3370, NGC 5728, NGC 4424, NGC 1559, NGC 3982, NGC 1448, NGC 4680, M101, NGC 1365, NGC 7329, and NGC 3447. Credit: NASA, ESA, Adam G. Riess (STScI, JHU)

Completing a nearly 30-year marathon, NASA’s Hubble Space Telescope has calibrated more than 40 “milepost markers” of space and time to help scientists precisely measure the expansion rate of the universe—a quest with a plot twist.

Pursuit of the universe’s expansion rate began in the 1920s with measurements by astronomers Edwin P. Hubble and Georges Lemaître. In 1998, this led to the discovery of “dark energy,” a mysterious repulsive force accelerating the universe’s expansion. In recent years, thanks to data from Hubble and other telescopes, astronomers found another twist: a discrepancy between the expansion rate as measured in the local universe compared to independent observations from right after the big bang, which predict a different expansion value.

The cause of this discrepancy remains a mystery. But Hubble data, encompassing a variety of cosmic objects that serve as distance markers, support the idea that something weird is going on, possibly involving brand new physics.

“You are getting the most precise measure of the expansion rate for the universe from the gold standard of telescopes and cosmic mile markers,” said Nobel Laureate Adam Riess of the Space Telescope Science Institute (STScI) and the Johns Hopkins University in Baltimore, Maryland.

Riess leads a scientific collaboration investigating the universe’s expansion rate called SHOES, which stands for Supernova, H0, for the Equation of State of Dark Energy. “This is what the Hubble Space Telescope was built to do, using the best techniques we know to do it. This is likely Hubble’s magnum opus, because it would take another 30 years of Hubble’s life to even double this sample size,” Riess said.

Riess’s team’s paper, to be published in the Special Focus issue of The Astrophysical Journal, reports on completing the biggest and likely last major update on the Hubble constant. The new results more than double the prior sample of cosmic distance markers. His team also reanalyzed all of the prior data, with the whole dataset now including over 1,000 Hubble orbits.

When NASA conceived of a large space telescope in the 1970s, one of the primary justifications for the expense and extraordinary technical effort was to be able to resolve Cepheids, stars that brighten and dim periodically, seen inside our Milky Way and external galaxies. Cepheids have long been the gold standard of cosmic mile markers since their utility was discovered by astronomer Henrietta Swan Leavitt in 1912. To calculate much greater distances, astronomers use exploding stars called Type Ia supernovae.

Combined, these objects built a “cosmic distance ladder” across the universe and are essential to measuring the expansion rate of the universe, called the Hubble constant after Edwin Hubble. That value is critical to estimating the age of the universe and provides a basic test of our understanding of the universe.
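The first rung of that ladder can be sketched numerically: a Cepheid's pulsation period gives its intrinsic brightness through the period-luminosity (Leavitt) relation, and comparing with its apparent brightness gives the distance to its host galaxy, which in turn calibrates any Type Ia supernova seen there. The relation's coefficients below are approximate published values, and the period and apparent magnitude are invented:

```python
# Distance-ladder sketch: Leavitt law (approximate V-band calibration) plus distance modulus.
# Coefficients are approximate; the period and apparent magnitude are invented examples.
import math

P_days = 30.0                                          # assumed Cepheid pulsation period
M_abs = -2.43 * (math.log10(P_days) - 1.0) - 4.05      # approximate period-luminosity relation
m_app = 26.0                                           # assumed apparent magnitude

d_pc = 10 ** ((m_app - M_abs) / 5 + 1)
print(f"Cepheid host-galaxy distance ~ {d_pc/1e6:.0f} Mpc")   # ~17 Mpc for these made-up inputs
```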

Starting right after Hubble’s launch in 1990, the first set of observations of Cepheid stars to refine the Hubble constant was undertaken by two teams: the HST Key Project, led by Wendy Freedman, Robert Kennicutt, Jeremy Mould, and Marc Aaronson, and another by Allan Sandage and collaborators. Both used Cepheids as milepost markers to refine the distance measurement to nearby galaxies. By the early 2000s the teams declared “mission accomplished” by reaching an accuracy of 10 percent for the Hubble constant, 72 plus or minus 8 kilometers per second per megaparsec.

In 2005 and again in 2009, the addition of powerful new cameras onboard the Hubble telescope launched “Generation 2” of the Hubble constant research as teams set out to refine the value to an accuracy of just one percent. This was inaugurated by the SHOES program. Several teams of astronomers using Hubble, including SHOES, have converged on a Hubble constant value of 73 plus or minus 1 kilometer per second per megaparsec. While other approaches have been used to investigate the Hubble constant question, different teams have come up with values close to the same number.

The SHOES team includes long-time leaders Dr. Wenlong Yuan of Johns Hopkins University, Dr. Lucas Macri of Texas A&M University, Dr. Stefano Casertano of STScI and Dr. Dan Scolnic of Duke University. The project was designed to bracket the universe by matching the precision of the Hubble constant inferred from studying the cosmic microwave background radiation leftover from the dawn of the universe.

“The Hubble constant is a very special number. It can be used to thread a needle from the past to the present for an end-to-end test of our understanding of the universe. This took a phenomenal amount of detailed work,” said Dr. Licia Verde, a cosmologist at ICREA and the ICC-University of Barcelona, speaking about the SHOES team’s work.

The team measured 42 of the supernova milepost markers with Hubble. Because they are seen exploding at a rate of about one per year, Hubble has, for all practical purposes, logged as many supernovae as possible for measuring the universe’s expansion. Riess said, “We have a complete sample of all the supernovae accessible to the Hubble telescope seen in the last 40 years.” Like the lyrics from the song “Kansas City,” from the Broadway musical Oklahoma!, Hubble has “gone about as fur as it c’n go!”

Weird Physics?

The expansion rate of the universe was predicted to be slower than what Hubble actually sees. By combining the Standard Cosmological Model of the Universe and measurements by the European Space Agency’s Planck mission (which observed the relic cosmic microwave background from 13.8 billion years ago), astronomers predict a lower value for the Hubble constant: 67.5 plus or minus 0.5 kilometers per second per megaparsec, compared to the SHOES team’s estimate of 73.

Given the large Hubble sample size, there is only a one-in-a-million chance astronomers are wrong due to an unlucky draw, said Riess; that is a common threshold for taking a problem seriously in physics. This finding is untangling what was becoming a nice and tidy picture of the universe’s dynamical evolution. Astronomers are at a loss for an explanation of the disconnect between the expansion rate of the local universe versus the primeval universe, but the answer might involve additional physics of the universe.
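Treating the two quoted values as independent Gaussian measurements gives a rough feel for the size of the disagreement; the real analyses propagate correlated systematics, so this is only the back-of-envelope version:

```python
# Rough tension between the local and early-universe Hubble constants quoted in this article.
local, local_err = 73.0, 1.0      # km/s/Mpc, SHOES-type local measurement
early, early_err = 67.5, 0.5      # km/s/Mpc, Planck + Standard Model prediction

sigma = abs(local - early) / (local_err**2 + early_err**2) ** 0.5
print(f"tension ~ {sigma:.1f} sigma")   # ~5 sigma, the "one-in-a-million" level discussed above
```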

Such confounding findings have made life more exciting for cosmologists like Riess. Thirty years ago they started out to measure the Hubble constant to benchmark the universe, but now it has become something even more interesting. “Actually, I don’t care what the expansion value is specifically, but I like to use it to learn about the universe,” Riess added.

NASA’s new Webb Space Telescope will extend Hubble’s work by showing these cosmic milepost markers at greater distances or with sharper resolution than what Hubble can see.




Provided by
NASA’s Goddard Space Flight Center

Citation:
Three decades of space telescope observations converge on a precise value for the Hubble constant (2022, May 19)
retrieved 20 May 2022
from https://phys.org/news/2022-05-decades-space-telescope-converge-precise.html




Read original article here

Most Precise Ever Measurement of W Boson Mass Differs From Standard Model Prediction

Most precise ever measurement of W boson mass shows tension with the Standard Model.

After 10 years of careful analysis and scrutiny, scientists of the CDF collaboration at the U.S. Department of Energy’s Fermi National Accelerator Laboratory announced on April 7, 2022, that they have achieved the most precise measurement to date of the mass of the W boson, one of nature’s force-carrying particles. Using data collected by the Collider Detector at Fermilab, or CDF, scientists have now determined the particle’s mass with a precision of 0.01% — twice as precise as the previous best measurement. It corresponds to measuring the weight of an 800-pound gorilla to 1.5 ounces.
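The gorilla comparison is just the 0.01% figure restated, as a one-line check shows:

```python
# 0.01 percent of an 800-pound gorilla, expressed in ounces.
print(800 * 16 * 0.0001)   # 1.28 oz, roughly the quoted 1.5 ounces
```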

The new precision measurement, published in the journal Science, allows scientists to test the Standard Model of particle physics, the theoretical framework that describes nature at its most fundamental level. The result: The new mass value shows tension with the value scientists obtain using experimental and theoretical inputs in the context of the Standard Model.

The Collider Detector at Fermilab recorded high-energy particle collisions produced by the Tevatron collider from 1985 to 2011. About 400 scientists at 54 institutions in 23 countries are still working on the wealth of data collected by the experiment. Credit: Fermilab

“The number of improvements and extra checking that went into our result is enormous,” said Ashutosh V. Kotwal of Duke University, who led this analysis and is one of the 400 scientists in the CDF collaboration. “We took into account our improved understanding of our particle detector as well as advances in the theoretical and experimental understanding of the W boson’s interactions with other particles. When we finally unveiled the result, we found that it differed from the Standard Model prediction.”

If confirmed, this measurement suggests the potential need for improvements to the Standard Model calculation or extensions to the model.

Scientists have now determined the mass of the W boson with a precision of 0.01%. This is twice as precise as the previous best measurement and shows tension with the Standard Model.

The new value is in agreement with many previous W boson mass measurements, but there are also some disagreements. Future measurements will be needed to shed more light on the result.

“While this is an intriguing result, the measurement needs to be confirmed by another experiment before it can be interpreted fully,” said Fermilab Deputy Director Joe Lykken.

The W boson is a messenger particle of the weak nuclear force. It is responsible for the nuclear processes that make the sun shine and particles decay. Using high-energy particle collisions produced by the Tevatron collider at Fermilab, the CDF collaboration collected huge amounts of data containing W bosons from 1985 to 2011.

The W boson is the messenger particle of the weak nuclear force. It is responsible for the nuclear processes that make the sun shine and particles decay. CDF scientists are studying the properties of the W boson using data they collected at the Tevatron Collider at Fermilab. Credit: Fermi National Accelerator Laboratory

CDF physicist Chris Hays of the

The mass of a W boson is about 80 times the mass of a proton, or approximately 80,000 MeV/c². Scientists of the Collider Detector at Fermilab collaboration have achieved the world’s most precise measurement. The CDF value has a precision of 0.01 percent and is in agreement with many W boson mass measurements. It shows tension with the value expected based on the Standard Model of particle physics. The horizontal bars indicate the uncertainty of the measurements achieved by various experiments. The LHCb result was published after this paper was submitted and is 80,354 ± 32 MeV/c². Credit: CDF collaboration

“Many collider experiments have produced measurements of the W boson mass over the last 40 years,” said CDF co-spokesperson Giorgio Chiarelli, Italian National Institute for Nuclear Physics (INFN-Pisa). “These are challenging, complicated measurements, and they have achieved ever more precision. It took us many years to go through all the details and the needed checks. It is our most robust measurement to date, and the discrepancy between the measured and expected values persists.”

The collaboration also compared their result to the best value expected for the W boson mass using the Standard Model, which is 80,357 ± 6 MeV/c². This value is based on complex Standard Model calculations that intricately link the mass of the W boson to the measurements of the masses of two other particles: the top quark, discovered at the Tevatron collider at Fermilab in 1995, and the Higgs boson, discovered at the Large Hadron Collider at CERN in 2012.

Read original article here

An Unexpected Boson Measurement Is Threatening The Standard Model of Physics

After a decade of meticulous measurements, scientists announced Thursday that a fundamental particle – the W boson – has a significantly greater mass than theorized, shaking the foundations of our understanding of how the Universe works.

 

Those foundations are grounded by the Standard Model of particle physics, which is the best theory scientists have to describe the most basic building blocks of the Universe, and what forces govern them.

The W boson governs what is called the weak force, one of the four fundamental forces of nature, and therefore a pillar of the Standard Model.

However, new research published in Science said that the most precise measurement ever made of the W boson directly contradicts the model’s prediction.

Ashutosh Kotwal, a physicist at Duke University who led the study, told AFP that the result had taken more than 400 scientists over 10 years to scrutinize four million W boson candidates out of a “dataset of around 450 trillion collisions”.

These collisions – made by smashing particles together at mind-bending speeds to study them – were done by the Tevatron collider in the US state of Illinois.

It was the world’s highest-energy particle accelerator until 2009, when it was supplanted by the Large Hadron Collider near Geneva, which famously observed the Higgs boson a few years later.

The Tevatron stopped running in 2011, but the scientists at the Collider Detector at Fermilab (CDF) have been crunching numbers ever since.

A chart showing the fundamental particles of the Standard Model. (ScienceAlert)

‘Fissures’ in the model

Harry Cliff, a particle physicist at Cambridge University who works at the Large Hadron Collider, said the Standard Model is “probably the most successful scientific theory that has ever been written down”.

“It can make fantastically precise predictions,” he said. But if those predictions are proved wrong, the model cannot merely be tweaked.

 

“It’s like a house of cards, you pull on one bit of it too much, the whole thing comes crashing down,” Cliff told AFP.

The Standard Model is not without its problems.

For example, it doesn’t account for dark matter, which along with dark energy is thought to make up 95 percent of the Universe. It also says that the Universe should not have existed in the first place, because the Big Bang ought to have annihilated itself.

On top of that, “a few fissures have recently been exposed” in the model, physicists said in a companion Science article.

“In this framework of clues that there are missing pieces to the Standard Model, we have contributed one more, very interesting, and somewhat large clue,” Kotwal said.

Jan Stark, physicist and director of research at the French CNRS institute, said “this is either a major discovery or a problem in the analysis of data,” predicting “quite heated discussions in the years to come”.

He told AFP that “extraordinary claims require extraordinary evidence”.

‘Huge deal’

The CDF scientists said they had determined the W boson’s mass with a precision of 0.01 percent – twice as precise as previous efforts.

They compared it to measuring the weight of a 350-kilogram (800-pound) gorilla to within 40 grams (1.5 ounces).

 

They found the boson was different from the Standard Model’s prediction by seven standard deviations, which are also called sigma.

Cliff said that if you were flipping a coin, “the chances of getting a five sigma result by dumb luck is one in three and a half million”.
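Those odds follow from the Gaussian tail probability that particle physicists attach to each sigma level; the sketch below uses the one-sided convention common in the field:

```python
# Convert sigma levels to tail probabilities (one-sided Gaussian convention).
from scipy.stats import norm

for n_sigma in (5, 7):
    p = norm.sf(n_sigma)
    print(f"{n_sigma} sigma: p ~ {p:.1e}  (about 1 in {1/p:,.0f})")
# 5 sigma: ~1 in 3.5 million, matching the quote; 7 sigma: ~1 in 780 billion.
```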

“If this is real, and not some systematic bias or misunderstanding of how to do the calculations, then it’s a huge deal because it would mean there’s a new fundamental ingredient to our universe that we haven’t discovered before,” he said.

“But if you’re going to say something as big as we’ve broken the Standard Model of particle physics, and there’s new particles out there to discover, to convince people of that you probably need more than one measurement from more than one experiment.”

CDF co-spokesperson David Toback said that “it’s now up to the theoretical physics community and other experiments to follow up on this and shed light on this mystery”.

And after a decade of measurements, Kotwal isn’t done yet.

“We follow the clues and leave no stone unturned, so we’ll figure out what this means.”

© Agence France-Presse

 

Read original article here