Why has no major theory emerged comparable to relativity or quantum mechanics, both of which appeared at the beginning of the last century?
Is basic research developing well, or is it stuck in a bottleneck? And if its development is incomplete, what is causing the slowdown?
This is indeed a good question, and one the author has been thinking about for the past five years. Taking this opportunity, I might as well share my personal views.
Has the development of basic theory stalled?
This is a topic that is often raised but just as often misunderstood. In fact, the issue is quite complicated.
Based on the basic theory of the philosophy of science, the author believes that basic research, represented by physics, is still developing and progressing; but this progress is cumulative rather than disruptive.
The overall development of physics is still in the period of normal science. In other words, the existing physics paradigm is still in the mature stage of its life and has not yet encountered serious anomalies or crises. Its mission at this stage is to use the existing paradigm to solve more problems, not to overthrow the old paradigm; that era has not yet arrived.
Combing through the history of science, we can clearly see that humanity waited 218 years between the establishment of the Newtonian paradigm (1687) and the establishment of the Einsteinian paradigm (1905). During this period, countless scientists worked under the Newtonian paradigm and achieved outstanding results, but no one's prestige surpassed Newton's. The Newtonian paradigm only truly encountered challenges at the end of the 19th century, when the famous "two dark clouds" appeared in the sky of physics.
The modern physics paradigm has existed for only a little more than 100 years, far from the withering stage of a theory's life. Using this paradigm, humanity has made countless great achievements over the past century, and so far the paradigm has not met any weighty anomalies or crises.
Therefore, modern physics still has plenty of room for development; it has not reached the so-called bottleneck stage.
So what makes the development of physical theory seem to be slowing down? The author believes this cognitive illusion has two causes:
First, after each scientific revolution, the research space opened up by the new paradigm expands exponentially, much as astronomy's field expanded from the solar system to the Milky Way. Two pieces of evidence for this expansion are:
1. The exponential growth of results. According to the American scientometrician Derek John de Solla Price (1922-1983), scientific and technical literature has grown exponentially in modern times (with the growth rate tapering off in later periods).
If we link this substantial increase in the number of publications to the expansion of academic space, we can propose a hypothesis: after each scientific revolution, the academic space it opens up also grows exponentially.
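The hypothesis above rests on simple exponential arithmetic. As a toy illustration (the 15-year doubling time is an assumed figure for the sketch, not a value taken from Price's data):

```python
# Toy model of exponential growth in a body of scientific literature.
# The 15-year doubling time is an illustrative assumption only.

def literature_size(years_elapsed: float, doubling_time: float = 15.0,
                    initial: float = 1.0) -> float:
    """Relative size of the literature after `years_elapsed` years."""
    return initial * 2 ** (years_elapsed / doubling_time)

# After a century at a 15-year doubling time, the literature grows
# by a factor of 2**(100/15), roughly a hundredfold.
print(round(literature_size(100)))  # → 102
```

The same formula applies to the hypothesized growth of academic space: a few doublings are enough to dwarf the research range of any earlier era.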
If this hypothesis holds, then the academic space released by the physics revolution of the early 20th century is exponentially larger than the academic space of the Newtonian revolution. Consider how narrow the range of physics research was in the early 20th century compared with today: at the time, research was concentrated in a few fields such as nuclear physics and statistical thermodynamics.
According to the latest national standard Classification and Code of Disciplines (GB/T 13745-2008), physics, as a first-level discipline, comprises 16 second-level disciplines and 108 third-level disciplines.
In this sense, the research space of physics today has expanded dozens of times compared with 100 years ago. In other words, many fields still await deep cultivation under the contemporary physics paradigm; theoretical physics is far from the stage where there is nothing left to do. And since these studies are all work within the existing paradigm, they are unlikely to produce disruptive theories.
2. The exponential growth of computing power. According to Moore's law, the processing performance of computers doubles roughly every 18 to 24 months while the price remains the same.
In short, human computing power roughly doubles every two years. Even so, many scientific problems remain unsolved. This indirectly proves that the research space of physics is enormous: even with such powerful computing support, countless pieces of work have yet to be carried out.
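The cumulative effect of Moore's-law-style doubling is easy to quantify. A minimal sketch, assuming an 18-month doubling period (one common statement of the law):

```python
# Cumulative performance growth under Moore's-law-style doubling.
# The 18-month doubling period is one common formulation; the law
# is an empirical observation, not a physical guarantee.

def moore_factor(years: float, doubling_months: float = 18.0) -> float:
    """Performance multiplier after `years` years of steady doubling."""
    return 2 ** (years * 12.0 / doubling_months)

# Over 30 years: 20 doublings, i.e. a 2**20 ≈ million-fold increase.
print(f"{moore_factor(30):,.0f}")  # → 1,048,576
```

Even a million-fold increase in computing power has not exhausted the open problems of physics, which is the point the paragraph above is making.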
Second, the advantage conferred by the growing number of scientists has been diluted by the huge research space, so the probability of major breakthroughs has not increased significantly.
There are more physicists alive today than the total number of physicists from Newton to the mid-20th century. Why, then, have so many people still failed to achieve a revolutionary theoretical breakthrough?
It is not that modern physicists are less clever. Rather, modern physics is highly specialized, and its many subdivided research fields have dispersed this large population of physicists, diluting research capability and strength.
These subdivided fields are almost entirely new, still fall under the paradigm of modern physics, and started relatively late, so research has not yet touched the boundaries of the new paradigm. Like sailors on a vast sea who see no land and begin to wonder whether they are still moving forward, we intuitively feel that the progress of physics has slowed, when in fact it is still advancing rapidly. The perceived slowdown is merely the verdict of a cognitive illusion.
On the question of basic research and the gathering of talent, people often cite a famous photograph: the group photo of the fifth Solvay Conference, held in October 1927, which brought together dozens of great physicists, including Einstein, Marie Curie, Planck, Bohr, Born, and Schrödinger. Why were there so many great scientific figures in that period, yet they seem so rare today?
This is a typical case of misreading one era through the conditions of another.
The 1920s were the period in which the modern physics revolution established its new paradigm. In those early days, under the searchlight of the new paradigm, the entire space of physics was full of uncultivated academic wasteland, and opportunities abounded. The historical paradox is that at precisely that moment, the smartest minds all turned their attention to this uncultivated land; their fields were highly convergent, and there was no diversion of intellect. The result was an extreme concentration of research ability and wisdom, which produced many fruitful results.
In revolutionary periods, heroes naturally come to define their era; in peacetime, heroes appear in other forms. That era had Einstein and Schrödinger; ours has Bill Gates, Jobs, and Musk. Can you say these contemporaries are not heroes?
In recent years, society as a whole has reached a consensus on strengthening basic research. So what is basic research actually for?
Setting aside the linear model of the US science-policy architect Vannevar Bush, in which basic research promotes technological development, which in turn revitalizes industry, basic research also provides the basic norms, standards, and cognitive baseline for all scientific and technological activity. Take physics as an example.
Physics has always been the most fully developed and mature discipline in the scientific landscape, and the field in which human intelligence is most prominently displayed.
The achievements of physics have set a model for all scientific research and constitute the best knowledge sample (the highest potential energy) of what human wisdom can accomplish.
The level of development of physics represents a country's cognitive height in the natural sciences, and its state directly reveals where the country's cognitive shortcomings lie.
Tianwen-1 completes fourth midcourse orbit correction and returns first image of Mars
At 20:00 on February 5, 2021, the engine of the Tianwen-1 probe, China's first Mars exploration mission, was ignited, successfully completing the fourth midcourse correction of the Earth-Mars transfer phase and ensuring that Mars capture proceeds as planned.
As of that date, Tianwen-1 had been in flight for about 197 days, was about 184 million kilometers from Earth and about 1.1 million kilometers from Mars, and had flown about 465 million kilometers. All probe systems were in good condition.
Before this, Tianwen-1 had acquired its first image of Mars at a distance of about 2.2 million kilometers from the planet.
Potential mass range of dark matter narrowed, helping to focus the search
According to a report on Phys.org on the 27th, British scientists have, for the first time, used the fact that gravity acts on dark matter to calculate the mass range of dark matter from first principles. The range is much narrower than previously thought. The study narrows the range of potential masses for dark matter particles, constrains the search range for future dark matter detectors, and may also help scientists discover potential new forces.
Dark matter emits neither light nor electromagnetic waves, does not participate in electromagnetic interactions, and cannot be directly "seen" with any optical or electromagnetic observation equipment. Scientists have confirmed its existence through its effects on galaxies. Observations of the cosmic microwave background give the proportions of the universe's contents: ordinary matter accounts for only 4.9%, dark matter for 26.8%, and dark energy for 68.3%. Dark energy and dark matter are often called the two dark clouds in the physics sky of the early 21st century.
In the latest study, the research team led by Professor Xavier Calmet of the School of Mathematical and Physical Sciences at the University of Sussex assumed that gravity is the only force acting on dark matter and calculated that the mass of dark matter particles lies between 10⁻³ eV (electron volts) and 10⁷ eV, a window much narrower than the theoretically expected range of 10⁻²⁴ eV to 10²⁸ eV.
"This is the first time anyone has used quantum gravity to calculate the mass range of dark matter," Calmet said. The study shows that unless a hitherto unknown force is acting on it, dark matter cannot be "ultralight" or "superheavy" as some theories claim.
The researchers believe the work will benefit physicists in two ways: first, by focusing the search region for dark matter; second, by helping to reveal whether mysterious unknown forces exist in the universe. If the mass of dark matter is someday found to lie outside the range predicted by the Sussex team, that would indicate that dark matter is affected by forces other than gravity.
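As a rough back-of-the-envelope check (not part of the study itself), the two mass windows quoted above can be compared by the number of decades, i.e. orders of magnitude, they span:

```python
import math

# Orders of magnitude spanned by the two dark-matter mass windows
# quoted in the article (all values in eV).
old_lo, old_hi = 1e-24, 1e28  # prior theoretical window
new_lo, new_hi = 1e-3, 1e7    # window from the Sussex calculation

old_span = math.log10(old_hi / old_lo)  # ~52 decades
new_span = math.log10(new_hi / new_lo)  # ~10 decades

print(f"old window: ~{old_span:.0f} decades, "
      f"new window: ~{new_span:.0f} decades")
```

So the calculation shrinks the search window from roughly 52 orders of magnitude to roughly 10, which is why it is described as a substantial narrowing.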
Some marine species face collapse: sea surface temperatures keep rising amid multiple pressures
A zoology study published on the 28th in the British journal Nature pointed out that some marine species face a crisis of "population collapse": since 1970, the global abundance of oceanic sharks and rays (elasmobranchs) has declined by 71%, and more than three-quarters of these species are now threatened with extinction. Another study published at the same time pointed to the ocean's "climate crisis": global sea surface temperature has been rising for the past 12,000 years.
The ocean faces multiple pressures: climate change, rising sea temperatures, pollution, overfishing, and ocean acidification have all been affecting its health. Scientists currently believe that the extinction risk of marine species comes mainly from overfishing, but declines in individual species have always been difficult to measure. Although declines in oceanic and coastal elasmobranch populations in various regions of the world had been documented before, there had been no global analysis.
This time, a team at Simon Fraser University in Canada estimated the relative abundance of 18 oceanic shark and ray species from 1970 to 2018 and assessed the extinction risk of all 31 oceanic shark and ray species. They found that global abundance fell by 71.1% over that period. Of the 31 species, 24 are threatened with extinction, and 3 shark species have declined so sharply that they are now classified as critically endangered, the most threatened category on the International Union for Conservation of Nature (IUCN) Red List.
The researchers attribute these declines to fishing pressure, which increased 18-fold over the period. The team called for immediate action to prevent "population collapse", specifically urging governments to implement fishing limits to help species recover.
In the climate report published the same day in Nature, a team from Rutgers University in New Jersey pointed to the ocean's "climate crisis": according to the report, global average annual sea surface temperature has been rising for the past 12,000 years.
The researchers reinterpreted two of the latest climate records, devising a method to assess the seasonal bias of individual records and then calculate the annual mean sea surface temperature. They found that warming from 12,000 to 6,500 years ago was driven by the retreat of the ice sheets, while recent warming is driven by rising greenhouse gas emissions. Current temperatures are the highest of the past 12,000 years, comparable to those of the last interglacial period about 125,000 years ago.
Beyond suggesting that emissions have driven the continued rise in sea surface temperature, the study also fills a long-standing gap between climate models and the data used to reconstruct Holocene climate change.