The future of scientific experimentation in physics 

Key idea: The computing power of modern computers can simulate events many orders of magnitude greater than what we’d be able to model in the real world, at a fraction of the cost.

Original author and publication date: Nicolas Zimmerman (The Daily Campus) – October 28, 2022

Futurizonte Editor’s Note: Nobody knows what new discoveries are right around the corner. And nobody knows what we don’t know that we don’t know.

From the article:   

Our age of highly advanced technology and unprecedented computing power in the 21st century has performed wonders for our never-ending pursuit of understanding the world around us. Scientific discovery has always encompassed the process of formulating a theory to explain an observed phenomenon, then designing an experiment to test the legitimacy of that theory and, perhaps, improve upon it. It’s a perpetual positive feedback loop that has led to many extraordinary discoveries and complex theories. The scientific method is quickly being revolutionized as we give birth to a new era of experimentation, one that merges with computer science. In a world still highly dependent on a few sparse state-of-the-art laboratories, where most of the world’s significant research gets done, large gaps in productivity remain to be filled across many areas of study.

Merging modeling and simulation with the scientific method is crucial to producing the most significant experimental results, while simultaneously offering an accessible platform for scientists to ground their research.

Modeling and simulation live at the intersection of theory and experimentation. On one hand, a model and its associated implementation in software, such as a simulation tool, are an expression of theory. On the other hand, executing that software, i.e., running the simulation, and collecting its results can be viewed as a virtual experiment. It’s worth noting that modeling and simulation should never be viewed as a substitute for real-world experimentation. Instead, they are highly valued in scientific discovery for providing additional insights that are often impractical or impossible to obtain through real-world experiments or theoretical analysis. Better yet, modern laptops are more than equipped to handle simulation software that can explore many complex scenarios, such as muons traveling through our atmosphere, X-rays passing through the human body or gamma rays interacting with a gas in a controlled chamber.
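
To make the idea of a virtual experiment concrete, here is a minimal sketch of the kind of simulation a laptop handles easily: a Monte Carlo model of cosmic-ray muons surviving the trip through the atmosphere. The production altitude, muon speed, and sample size are illustrative assumptions, not values from the article.

```python
# Minimal Monte Carlo sketch of cosmic-ray muons surviving to sea level.
# Assumptions (illustrative, not from the article): production altitude
# ~15 km, mean proper lifetime 2.197 microseconds, a single fixed speed.
import numpy as np

C = 299_792_458.0     # speed of light, m/s
TAU = 2.197e-6        # mean muon proper lifetime, s
ALTITUDE = 15_000.0   # assumed production altitude, m
BETA = 0.998          # assumed muon speed as a fraction of c

rng = np.random.default_rng(seed=42)
n_muons = 1_000_000

# Time of flight from production altitude to sea level, lab frame.
flight_time = ALTITUDE / (BETA * C)

# Relativistic time dilation stretches the decay clock by gamma.
gamma = 1.0 / np.sqrt(1.0 - BETA**2)

# Sample exponential decay times in the lab frame; count survivors.
decay_times = rng.exponential(scale=gamma * TAU, size=n_muons)
survivors = np.count_nonzero(decay_times > flight_time)
print(f"surviving fraction (with dilation): {survivors / n_muons:.4f}")

# For contrast: the classical prediction without time dilation, which
# yields essentially no survivors at this sample size.
classical = np.count_nonzero(rng.exponential(TAU, n_muons) > flight_time)
print(f"surviving fraction (no dilation):   {classical / n_muons:.2e}")
```

Running a million virtual muons takes a fraction of a second, and changing an assumption, say the production altitude, is a one-line edit rather than a new detector run.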

As we peer deeper into the microscopic and macroscopic scales of the universe, we tend to spend large sums of money on new equipment so we have the capacity to make measurements that would otherwise be impossible. Post-World War II cash and prestige encouraged the construction of powerful and expensive particle accelerators, which took the form of facilities including the Stanford Linear Accelerator Center (SLAC) in Menlo Park, California, Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, the European Organization for Nuclear Research (CERN) near Geneva and several others. This was a drastic move away from traditional tabletop experimentation, and it brought new struggles in how experiments get conducted. Researchers and scientists from across the world spend months, even years, pitching proposals just to acquire a designated block of time at these facilities. Setting up a perfect experiment, let alone executing it, can be such an arduous task! This is why we need complex simulations to try to predict the outcome of physical experiments.

The computing power of modern computers can simulate events many orders of magnitude greater than what we’d be able to model in the real world, at a fraction of the cost.

Determining how likely certain outcomes are in the presence of unknowns is how we strive to reduce uncertainties in both computational and real-world applications.
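
One common way to do this is Monte Carlo uncertainty propagation: sample the uncertain inputs many times, push each sample through the model, and read the uncertainty off the spread of the outputs. The pendulum formula below is standard physics, but the measured values and their uncertainties are invented for illustration; none of this comes from the article.

```python
# Hedged sketch of Monte Carlo uncertainty propagation: estimate how
# measurement uncertainties in the inputs translate into uncertainty
# in a derived quantity. Input values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=0)
n_samples = 100_000

# Pendulum example: g = 4*pi^2 * L / T^2, with Gaussian uncertainties
# on the measured length L (meters) and period T (seconds).
L = rng.normal(loc=1.000, scale=0.005, size=n_samples)
T = rng.normal(loc=2.006, scale=0.010, size=n_samples)

g = 4.0 * np.pi**2 * L / T**2

# The sample mean and standard deviation summarize the propagated
# uncertainty in the derived quantity.
print(f"g = {g.mean():.3f} +/- {g.std():.3f} m/s^2")
```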

READ the full article here