In the 1950s, there were four television networks in the United States. Because of the frequencies allotted to television, the signals could be received only within a "line of sight" of the transmitting antenna. People living in remote areas, especially remote mountainous areas, couldn't receive the programs that were already becoming an important part of U.S. culture.
In 1948, people living in remote valleys in Pennsylvania solved their reception problems by putting antennas on hills and running cables down to their houses. By the early 1990s, cable television had reached nearly half the homes in the United States. These days, the same technology once used by remote hamlets and select cities allows viewers all over the country to access a wide variety of programs and channels that meet their individual needs and desires.
The earliest cable systems were, in effect, strategically placed antennas with very long cables connecting them to subscribers' television sets. Because the signal from the antenna became weaker as it traveled through the length of cable, cable providers had to insert amplifiers at regular intervals to boost the signal and keep it strong enough for viewing. According to Bill Wall, technical director for subscriber networks at Scientific-Atlanta, a leading maker of equipment for cable television systems, limitations in these amplifiers remained a significant issue for cable system designers over the next three decades.
"In a cable system, the signal might have gone through 30 or 40 amplifiers before reaching your house, one every 1,000 feet or so," Wall says. "With each amplifier, you would get noise and distortion. Plus, if one of the amplifiers failed, you lost the picture. Cable got a reputation for not having the best quality picture and for not being reliable." In the late 1970s, cable television would find a solution to the amplifier problem. By then, they had also developed technology that allowed them to add more programming to cable service.