Darren Walker, president of the Ford Foundation, recently told the New York Times that economic opportunities for poor people in the United States may have been better in the 1960s than they are today. Could this be true?
Walker, an African-American, was born in 1959 in Lafayette, Louisiana, to a single mother and grew up in small towns in Texas, including Ames, an all-black town. His prospects might not have seemed bright, but in fact he attended the University of Texas at Austin, where he earned a law degree; he joined a prominent law firm, then the Union Bank of Switzerland (UBS), and subsequently entered the nonprofit world.
In the Times interview, Walker gave credit for his success to his mother, who was a nurse’s assistant, and to the federal childhood program Head Start. “I’m grateful to America, because I was a boy at a time when America believed in little poor black boys and girls living on dirt roads in shotgun houses in small towns across this country.”
He explains: “In 1965, I was sitting on the porch with my mother and a lady approached and told my mother about a new program called Head Start. And I was fortunate enough to be in the first class of Head Start, in the summer of 1965.”
1965? That was the year of the Watts (Los Angeles) riots, two years after Martin Luther King’s March on Washington, three years before King’s assassination, and a year after three young men were killed in Mississippi for trying to bring voting rights to blacks.
I have enjoyed nearly all my courses at NC State, but I have sometimes been disappointed with my fellow students. Frequently, they fail to speak up. Maybe they aren't prepared or, for some reason, they just don't want to talk. This occurs mostly with undergraduates, but graduate students, too, can avoid participating in discussion for long periods of time.
I know it’s frustrating to the professors, some of whom go to great lengths to encourage discussion—requiring students to write short essays for each class or having a student present a five-minute précis of the day’s readings. Sometimes these work and sometimes they don’t. Some instructors also give pop quizzes to push students to be prepared—although no professors of mine have used this tactic. Oh, and then there’s grading attendance and participation. That doesn’t seem to work at all.
I recently came across a guide for college instructors in the Chronicle of Higher Education that sheds some light on this problem.[1] Written by Jay Howard, a sociologist who has studied classroom interaction, “How to Hold a Better Class Discussion” explains that two “classroom norms” protect students from having to speak up.[2]
I thought it was an original discovery of mine—the notion that present-day concerns often direct historians to study particular aspects of history.
In the 1960s, for example, economists and politicians were trying to help newly independent, but underdeveloped, countries grow. Sidney Pollard, a historian, thought that a better understanding of the first great period of development, the Industrial Revolution, would “help forecast, and pave the way for, the next steps to be taken by living economies”; thus its study had “a severely practical basis.”[1]
While composing my post, however, I learned that historian David Cannadine had already written an essay detailing how present concerns shape investigations of the past. Writing in 1984, he identified four waves of historical analysis of the Industrial Revolution up to that time. He ingeniously tied each one of them to economic conditions at the time of writing.
While historians like Pollard had treated the Industrial Revolution as a model for future development, by the mid-1970s disillusionment about economic growth had set in. “[T]he British Industrial Revolution is now depicted in a more negative light,” Cannadine wrote, “as a limited, restricted, piecemeal phenomenon, in which various things did not happen or where, if they did, they had far less effect than was previously supposed.”[2]
Should such periodic re-investigation make us wonder about the validity of historical findings? That is, are those findings anachronistic (a word that historians dread)? An anachronism is something inappropriately included when depicting a particular period of time. For example, Shakespeare has a clock chime in Julius Caesar. Clocks did not chime in ancient Rome.[3] Academically, anachronism occurs when historians “pose the past in a form that would have been alien to the period we are describing.”[4]
The seventeenth century in Europe was bloody and violent. Some examples: a continental war that went on for thirty years (1618-1648), three British civil wars (1639-1651), naval wars between England and the Netherlands (1652-1674), and military efforts to rein in France’s Louis XIV and the Spanish Hapsburgs.
At the same time, however, economic changes were quietly occurring, laying a foundation for the Industrial Revolution. That’s the little-known subject of this post.
“What happened in the seventeenth and eighteenth centuries was a wholesale shift of industry, including rather sophisticated sectors, from city to countryside,” writes Jan de Vries in his informative book Economy of Europe in an Age of Crisis, 1600-1750.[1]
This shift from cities to rural areas is not the typical “Industrial Revolution” story, which says that peasants were forced off the farm and into the cities, making them available for burgeoning industrial factories. To some extent that did happen later, but manufacturing in England and other parts of western Europe started in rural areas, not cities. Here’s how, according to de Vries.
[Photo credit: Campus facility (UA023.005), Special Collections Research Center, North Carolina State University Libraries, Raleigh, North Carolina.]
Like the child who pointed out that the emperor had no clothes, Lawrence Biemiller admits that the great wave of Modernist buildings on academic campuses—constructed from the 1960s until very recently—has not been a success. We may think of universities as places of ivy-covered brick walls and quaint quads, but the fact is that for decades, universities chose to construct stark “form follows function” buildings admired by architects, but rarely by students.
Here at North Carolina State University, Harrelson Hall, built in 1962, was torn down in 2016. Even the NC State website describes Harrelson as “a circular freak of a building that flummoxed students with its spiral ramps, windowless classrooms and ductwork that whooshed like a subway tunnel.”
Harrelson was over 50 years old when it was taken down, but I frequently walk by a newer building, the Ricks Hall Addition, built in 2009. It is Modernist—a rectangular box connected on the second floor to the 1922 Ricks Hall, which boasts Ionic columns. The only similarity I can see to the original building is the color of the brick. I see nothing pleasing about it.