The Development History of Semiconductors




At the end of the 19th century, scientists began to study the properties and behavior of the electron. In 1897, the British physicist J. J. Thomson discovered the electron, laying the foundation for later semiconductor research. At the time, however, very little was known about its practical applications.

At the beginning of the 20th century, research on semiconductor materials gradually took shape. In 1919, the German physicist Hermann Stoll reported the semiconductor properties of silicon, and scientists subsequently began to investigate how semiconductor materials could be used to control the flow of current. In 1926, the physicist Julius Lilienfeld patented the concept of a semiconductor amplifier, an early field-effect device, marking the beginning of semiconductor technology.

However, the development of semiconductor technology was not smooth. In the 1920s and 1930s, the understanding of semiconductors was still limited and fabrication processes were very complex. Around 1940, researchers at Bell Laboratories in the United States discovered the p-n junction in silicon, a result regarded as a milestone of modern semiconductor technology. The p-n junction made it possible to control the flow of current and thus to build practical semiconductor devices.

In the late 1940s and the 1950s, semiconductor technology achieved decisive breakthroughs. In December 1947, researchers John Bardeen and Walter Brattain at Bell Laboratories in the United States built the first transistor, which is regarded as a foundational milestone of modern electronics. The transistor greatly reduced the size and power consumption of electronic devices, spurring the rapid development of electronic technology.

In the late 1950s and 1960s, the integrated circuit was developed. An integrated circuit combines many transistors and other electronic components on a single chip, achieving higher integration density and smaller size. In 1965, Gordon Moore, who later co-founded Intel, put forward the famous "Moore's Law", predicting that the number of transistors on an integrated circuit would double at a roughly constant interval (about every two years in its later formulation). This prediction has held for decades and has driven the rapid development of semiconductor technology.
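To give a sense of the compounding this implies, the short Python sketch below projects transistor counts under an assumed two-year doubling period; the starting count, starting year, and the years printed are illustrative assumptions for demonstration only, not figures from this article.

# Illustrative sketch: transistor counts under an assumed two-year doubling period.
def projected_transistors(start_count, start_year, year, doubling_period_years=2.0):
    """Project the transistor count in `year`, assuming the count doubles
    every `doubling_period_years` years, starting from `start_count` in `start_year`."""
    elapsed = year - start_year
    return start_count * 2 ** (elapsed / doubling_period_years)

if __name__ == "__main__":
    # Hypothetical starting point: a chip with 2,300 transistors in 1971,
    # roughly the scale of an early microprocessor.
    for y in (1971, 1981, 1991, 2001, 2011, 2021):
        print(y, f"{projected_transistors(2300, 1971, y):,.0f}")

Under these assumptions the projection reaches about 75 million transistors by 2001 and roughly 77 billion by 2021, which conveys the scale of exponential growth that Moore's Law describes.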

As semiconductor technology has advanced, the performance of electronic devices has continued to improve. In the 1970s, the emergence of the personal computer brought semiconductor technology into widespread use. In the 1980s and 1990s, with the rise of the Internet, it was applied extensively in communications and information technology. Since the beginning of the 21st century, its applications in fields such as artificial intelligence, the Internet of Things, and new energy have continued to expand, providing strong support for the development of modern technology.

From the first transistor to today's integrated circuits, the progress of semiconductor technology has driven the development of electronic devices and steadily improved their performance. As the technology continues to advance, it will find ever wider application across many fields and help create a better future for humanity.

