
DDT and the Environment in the 20th Century

DDT (dichlorodiphenyltrichloroethane) was a widely used insecticide beginning in the 1940s, first during WWII and then in agriculture. In the 1950s it was deployed to combat the spread of malaria by killing mosquitoes, but the campaign failed in tropical regions, and widespread use drove high rates of DDT resistance in mosquito populations. American scientists had been concerned about the possible dangers of DDT since it first came into use, but the issue did not gain broad attention until Rachel Carson published Silent Spring in 1962. The book argued that pesticides, including DDT, were causing great harm to the environment and to human health. Silent Spring soon garnered wide public attention, and JFK ordered an investigation of Carson’s claims. The EDF (Environmental Defense Fund) was founded with the aim of banning DDT, which had been found to be toxic to marine organisms and a chief cause of the thinning of birds’ eggshells. Its agricultural use was banned in the United States in 1972, but DDT is still used for disease vector control. For example, it is sprayed on the inside walls of houses to kill or repel mosquitoes, a method said to greatly reduce environmental damage.

DDT is thought to be a major cause of the decline of birds of prey like bald eagles and peregrine falcons. Eggshell thinning makes their eggs more prone to breakage and embryo death. DDT and its breakdown products are also chemically similar to estrogens and can cause hormonal changes in animals, so they are believed to damage the reproductive system and reduce reproductive success. Some speculate that DDT is carcinogenic, but the CDC reports otherwise. There is also an ongoing debate between those who oppose the use of DDT for malaria control on environmental grounds and those who support it in order to save more lives. Regardless, DDT use is frowned upon in the US and most other nations, yet it continues in controlled settings.

Computers in the 20th Century

The first electronic digital computer was created in the late 1930s by John Vincent Atanasoff. While it wasn’t programmable, the machine could solve systems of linear equations and used a paper-card writer/reader to store intermediate results. It established three principles that became the basis for future computers: it used binary digits to represent data, performed its calculations electronically, and kept computing and memory as separate systems. Although the machine was never fully developed because Atanasoff left Iowa State College for World War II assignments, the same period also saw the creation of the first binary digital computers and an early programmable calculator, the Z2.
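To make “solving linear equations” concrete, here is a minimal sketch of Gaussian elimination, the kind of arithmetic such a machine automated. It is only an illustration in modern Python; the actual machine worked in binary and processed its equations very differently, so none of the names or steps below reconstruct its real method.

```python
# Toy Gaussian elimination: the style of computation early machines automated.
def solve_linear_system(a, b):
    """Solve a small dense system a*x = b by elimination."""
    n = len(b)
    # Forward elimination: zero out the entries below each pivot.
    for col in range(n):
        # Swap in the row with the largest pivot to keep the arithmetic stable.
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        b[col], b[pivot] = b[pivot], b[col]
        for row in range(col + 1, n):
            factor = a[row][col] / a[col][col]
            for k in range(col, n):
                a[row][k] -= factor * a[col][k]
            b[row] -= factor * b[col]
    # Back substitution: solve from the last equation upward.
    x = [0.0] * n
    for row in reversed(range(n)):
        known = sum(a[row][k] * x[k] for k in range(row + 1, n))
        x[row] = (b[row] - known) / a[row][row]
    return x

# Example: 2x + y = 5 and x + 3y = 10 give x = 1, y = 3.
print(solve_linear_system([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```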

The creator of this calculator was Konrad Zuse, who went on to build his next technological wonder, aptly named the Z3. An electromechanical computer that became operational in 1941, the Z3 was the world’s first programmable, fully automatic digital computer. Its successor, the Z4, later became the world’s first commercial digital computer.

As computers became more sophisticated and more widespread, organizations began using them to simplify tasks and work more efficiently. Telephone exchange networks were converted into electronic data-processing systems, and the US Navy developed an electromechanical analog computer, the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo at a moving target. The world’s first electronic digital programmable computer, the Colossus, was also built during World War II; it was designed by the engineer Thomas Flowers to help crack German ciphers.
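For a rough sense of the trigonometry involved, the sketch below computes a torpedo lead (“deflection”) angle from the law of sines: the target and the torpedo must cover their respective legs of a triangle in the same time. The function name, units, and simplifications here are mine for illustration; a real Torpedo Data Computer tracked a continuously changing solution with far more inputs.

```python
import math

def lead_angle_deg(target_speed, torpedo_speed, track_angle_deg):
    """Deflection angle off the line of sight, from the law of sines:
    sin(lead) = (target_speed / torpedo_speed) * sin(track_angle)."""
    ratio = (target_speed / torpedo_speed) * math.sin(math.radians(track_angle_deg))
    if abs(ratio) > 1:
        raise ValueError("Target too fast to intercept on this geometry.")
    return math.degrees(math.asin(ratio))

# Example: a 20-knot target crossing at a 60-degree track angle,
# engaged with a 45-knot torpedo, needs about 22.6 degrees of lead.
print(round(lead_angle_deg(20, 45, 60), 1))
```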

In the 1950s, the first computers designed to aid businesses appeared. Eckert and Mauchly created the UNIVAC (UNIVersal Automatic Computer), which used magnetic tape storage instead of punched cards for its data. In Britain, J. Lyons & Company built its own computer, LEO I, to calculate the company’s weekly payroll. The first home computer, the Altair 8800, was not marketed to the public until 1975. For around $400, hobbyists could own their very own machine, one that came without a keyboard, a monitor, or its own programming language. Two young men decided to try their hands at writing a programming language for the new computer: Bill Gates and Paul Allen, who started the project by forming a partnership called Microsoft. Soon after, Apple Computer, founded by electronics hobbyists Steve Jobs and Steve Wozniak, released the Apple II, a desktop personal computer for the mass market that featured a keyboard, video output, and random-access memory (RAM).

What started as an enormous machine that could only solve simple math problems soon blossomed into something truly life-changing. Computers grew more and more complex, and it took less than a century for them to become part of everyday life. They are easy to use and make once-tedious tasks simple, and the Internet lets us share information with each other at a staggering speed that would have been unthinkable only a short time ago. This technological innovation has opened up a vast world of knowledge and information for everyone in the modern world. Where will all of this advancement lead, you ask? Well, we haven’t come up with an algorithm for that just yet.

-Sofia