Author profile

Ithiel de Sola Pool (1917–1984)

Author of Technologies of Freedom

18 works 174 members 3 reviews

About the author

Ithiel de Sola Pool (1917-1984) directed the research program in communications at the Massachusetts Institute of Technology for thirty years. Lloyd S. Etheredge, a political scientist and psychologist, is project director at the Policy Sciences Center Inc.

Common Knowledge

Canonical name
Pool, Ithiel de Sola
Birthdate
1917-10-26
Date of death
1984-03-11
Gender
male
Relationships
De Sola Pool, David (father)


Reviews

This is a difficult book to review. Should we see it in the light of the fifties and sixties, when it was written, or look at it from the vantage of more than 50 years of progress? Several terms used in the book seem anachronistic. Today we call them pollsters, but throughout the book the authors refer to them as pollers; even today's spell checkers don't recognize pollers as a word. Much more important is the use of the term simulation. I'm not sure that anything they did would be called a simulation today. Yes, they pulled lots of data together. Yes, they used it to model the U.S. population, but today that's not what we call simulation. I believe everything they did would be called deterministic rather than probabilistic or stochastic.

The real strength of this book is that it is the best source for learning how they pulled together the data to produce the reports they delivered to the Kennedy campaign. This is the source for the nerd who needs to go beyond the descriptions provided in Jill Lepore's If Then. If Then is much richer in providing the background for their work and in establishing who did what, when, and where. But if you want to understand what they actually did, this is the source. If you'd rather read a muckraking novel, Eugene Burdick's The 480 tells a glossed-over version. Indeed, the authors of this book wanted to clear the air and tell people what they really did. Each gives you a piece of the puzzle. This book gives you the glue.

The authors recognized that computers could be used to pull together more than just one survey, something we consider simple today but which was novel in the fifties. They pulled together approximately 60 surveys from the fifties. Surprisingly, they did not include surveys conducted during the campaign of 1960. Today that would be sacrilegious; the latest data is considered highly important. The authors admit that part of their logic was the limited amount of time, which prohibited them from including recent data. Surprisingly, they went further and decided it was a good thing they did not include recent data: they wanted to believe their model captured what underlay the campaign, and thus should not include part of the campaign itself. This is one of the more telling instances of the authors putting a positive spin on what others would likely point to as a negative.

Perhaps the most creative feature of what they did was the reliance on grouping the data. They broke the population of the U.S. into 480 different demographic groups. Group #1 was Democrat, Eastern, Protestant, Rural, Professional & White Collar. Group #480 was Independent, Border states, No religion. The rest were all the permutations between the two. That's the basic case structure. The variable structure is also interesting. For each of the 480 groups they created 50 issue "clusters". Some of these are simple vote reports, such as having voted for Eisenhower or Stevenson in 1952, or whether the group voted for the Democratic or Republican congressional candidate in 1954. Recall of behavioral acts was straightforward. More challenging were attitudinal measures. Questions were not always the same. Here's where some of the magic started. Instead of using the categories used in the surveys, they recoded the data. Essentially they captured a positive, a neutral, and a negative response; other levels of nuance and discrimination were tossed. Once the data was recoded, the group was summarized as simple percentages.
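The case-and-variable structure described above can be sketched in a few lines. The attribute lists and survey categories below are illustrative placeholders, not the actual Simulmatics codebook; the real scheme crossed similar dimensions and pruned the full product down to exactly 480 voter types.

```python
from collections import Counter
from itertools import product

# Illustrative attribute lists (not the actual codebook).
PARTY = ["Democrat", "Republican", "Independent"]
REGION = ["Eastern", "Southern", "Border", "Midwestern", "Far Western"]
RELIGION = ["Protestant", "Catholic", "Jewish", "No religion"]
RESIDENCE = ["Rural", "Urban"]

# The basic case structure: every permutation of the attributes.
# The real scheme pruned a cross like this down to 480 types.
voter_types = list(product(PARTY, REGION, RELIGION, RESIDENCE))

def recode(answer):
    """Collapse arbitrary survey categories into positive/neutral/negative."""
    positive = {"strongly agree", "agree", "approve"}
    negative = {"strongly disagree", "disagree", "disapprove"}
    a = answer.lower()
    if a in positive:
        return "positive"
    if a in negative:
        return "negative"
    return "neutral"

def summarize(group_answers):
    """Summarize one group on one issue cluster as simple percentages."""
    counts = Counter(recode(a) for a in group_answers)
    n = len(group_answers)
    return {k: 100.0 * counts[k] / n
            for k in ("positive", "neutral", "negative")}
```

For example, `summarize(["agree", "disagree", "no opinion", "agree"])` reports the group as 50% positive, 25% negative, 25% neutral on that cluster, exactly the kind of three-bucket percentage table the book describes.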

The authors had another challenge to overcome: not all surveys had all the questions. Here they used imputation based on assumptions. If one survey contained both questions, they used the correlation they found there; if another survey had just one of the questions, they assumed the same correlation held and used it to impute what the missing question would have produced. Fancy, but questionable.
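A minimal sketch of that borrowed-relationship imputation, simplifying the "correlation" to a conditional rate for clarity (the function names and data shapes here are my own, not the book's):

```python
from collections import defaultdict

def conditional_rates(pairs):
    """From a survey containing both questions, estimate
    P(question B answered positive | answer to question A)."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for a, b in pairs:
        totals[a] += 1
        positives[a] += (b == "positive")
    return {a: positives[a] / totals[a] for a in totals}

def impute_positive_share(answers_a, rates):
    """In a survey that has only question A, impute the share that would
    have answered B positively -- assuming the relationship observed in
    the other survey carries over unchanged, which is the questionable part."""
    return sum(rates[a] for a in answers_a) / len(answers_a)
```

Usage: fit `conditional_rates` on the survey that has both questions, then apply `impute_positive_share` to the survey that has only one.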

They also had interesting approaches to certain groups. Since Jews barely appeared in many surveys, they often set the Jewish groups to zero. Even more radical was how they handled Black, or Negro, groups as they called them. Since several states practiced segregation and various forms of total voter suppression (remember, this is the fifties), they eliminated those groups for those states.

Another challenge was coming up with estimates by state. Their focus was the Presidential race, so they needed an estimate for each state. Yet most of the data they used came from national polls and did not contain enough data about any single state to produce a reliable estimate. They developed the concept of a pseudo-state, or combined state: they used cases from the other states in the region in which the state appeared to produce estimates for that state. Again, a questionable assumption.
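The pseudo-state idea amounts to pooling respondents by region. A toy sketch, with a made-up region map standing in for whatever regional scheme the authors actually used:

```python
# Hypothetical region assignments for illustration only.
REGION_OF = {"NY": "Eastern", "PA": "Eastern", "CT": "Eastern",
             "KY": "Border", "TN": "Border"}

def pseudo_state_share(respondents, state):
    """Estimate a state's Democratic share from every respondent in its
    region -- the pseudo-state assumption that a state behaves like the
    rest of its region, since no single state had enough cases."""
    region = REGION_OF[state]
    pool = [r for r in respondents if REGION_OF[r["state"]] == region]
    return sum(r["dem"] for r in pool) / len(pool)
```

Note that under this scheme every state in a region gets essentially the same estimate, which is exactly why the reviewer calls the assumption questionable.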

Models were also important, especially cross-pressure. Since the 1960 race included a Catholic candidate, they recognized that groups of Protestant Democrats would likely underperform relative to what they would do for a Protestant candidate, and that groups of Catholics, both Republican and Democratic, would overperform for Kennedy. The parameters they used in their models often involved best guesses rather than empirical estimates. To the authors' credit, they performed sensitivity testing on many of their assumptions to see how much their results depended on them. Often they found the simplest model was sufficient.
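A toy version of such a cross-pressure adjustment, with invented defection and gain parameters standing in for the authors' best guesses, plus the kind of sensitivity sweep the review mentions:

```python
def adjusted_dem_share(baseline, religion, party, defection=0.2, gain=0.3):
    """Shift a group's baseline Democratic share for a Catholic candidate.
    The defection and gain parameters are invented for illustration,
    standing in for the best-guess model parameters the book describes."""
    if religion == "Protestant" and party == "Democrat":
        return baseline * (1.0 - defection)        # cross-pressured defection
    if religion == "Catholic":
        return baseline + (1.0 - baseline) * gain  # overperformance for Kennedy
    return baseline

def sensitivity(baseline, religion, party, values):
    """Crude sensitivity test: rerun the model over a range of parameter
    values to see how much the result depends on the guess."""
    return [adjusted_dem_share(baseline, religion, party,
                               defection=v, gain=v) for v in values]
```

Sweeping `sensitivity(0.5, "Protestant", "Democrat", [0.0, 0.1, 0.2, 0.3])` shows directly how sensitive a group's estimate is to the assumed defection rate, which is the spirit of the tests the authors ran.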

Perhaps the most important part of the book comes at the very end. In the section Lessons from the past and for the future, the authors recognize that computers would continue to get faster and would be able to handle much more data. They admit that the need to summarize the cases would go away, and that preserving individual-level information would lead to richer models of electoral behavior. Amen.
… (more)
Ed_Schneider | Jan 28, 2021 |
Pool examines the impact of communications technology on civil rights and the rights protected by the First Amendment. He argues that restrictive government regulation is not inevitable.

He briefs many court cases and supplies a detailed index.
keylawk | Nov 25, 2007 |
This is a lovely book: the author talks about how the telephone helped destroy Victorian culture. It was the telephone, you know, and Lady Chatterley's Lover. Substitute "email" for "telephone", or, more significantly, "synthetic world" for "telephone".
humdog | Feb 17, 2007 |


Statistics

Works
18
Members
174
Popularity rank
#123,126
Rating (stars)
4.5
Reviews
3
ISBNs
26
Languages
3
