How Sweet It Was

Darren Walker, president of the Ford Foundation, recently told the New York Times that economic opportunities for poor people in the United States may have been better in the 1960s than they are today. Could this be true?

Walker, an African-American, was born in 1959 in Lafayette, Louisiana, to a single mother and grew up in small towns in Texas, including Ames, an all-black town. His prospects might not have seemed bright, but in fact he attended the University of Texas at Austin, where he earned a law degree; he joined a prominent law firm, then the Union Bank of Switzerland (UBS), and subsequently entered the nonprofit world.

In the Times interview, Walker credited his success to his mother, a nurse’s assistant, and to the federal early-childhood program Head Start. “I’m grateful to America, because I was a boy at a time when America believed in little poor black boys and girls living on dirt roads in shotgun houses in small towns across this country.”

He explained: “In 1965, I was sitting on the porch with my mother and a lady approached and told my mother about a new program called Head Start. And I was fortunate enough to be in the first class of Head Start, in the summer of 1965.”

1965? That was the year of the Watts (Los Angeles) riots, two years after Martin Luther King’s March on Washington, three years before King’s assassination, and a year after three young men were killed in Mississippi for trying to bring voting rights to blacks.

An Untraditional Tradition

Editor’s note: This is a guest post by Jay Schalin, director of policy analysis at the James G. Martin Center for Academic Renewal.

As a professional commentator on higher education, I encounter the phrase “shared governance” ad nauseam. It is the concept in which the faculty, the trustees, and the administration are roughly coequal partners in higher education decision-making, each having dominance in its own sphere of activities. What makes this so annoying is that one of the common justifications for it is that it is our “traditional” form of governing universities, and therefore any attempt to question it is out of bounds.

But is it really our tradition? Or is it something that was grafted onto another tradition of governance during the Progressive Era?

One of the bases for this traditional shared-governance narrative is the claim that American colleges descend from the scholar-centered University of Paris, and that American colleges therefore originated as “communities of scholars.” Higher education historian Edwin Duryea, in his 2000 book The Academic Corporation: A History of College and University Governing Boards, claims that “The Parisian model appeared in Oxford and Cambridge that, in turn, carried over to America.”[1]

Silence in the Classroom

I have enjoyed nearly all my courses at NC State, but I have sometimes been disappointed with my fellow students. Frequently, they fail to speak up. Maybe they aren’t prepared or, for some reason, they just don’t want to talk. This occurs mostly with undergraduates, but graduate students, too, can avoid participating in discussion for long periods.

I know it’s frustrating to the professors, some of whom go to great lengths to encourage discussion—requiring students to write short essays for each class or having a student present a five-minute précis of the day’s readings. Sometimes these tactics work and sometimes they don’t. Some instructors also give pop quizzes to persuade students to come prepared—although no professors of mine have used that tactic. Oh, and then there’s grading attendance and participation. That doesn’t seem to work at all.

I recently came across a guide for college instructors in the Chronicle of Higher Education that sheds some light on this problem.[1] Written by Jay Howard, a sociologist who has studied classroom interaction, “How to Hold a Better Class Discussion” explains that two “classroom norms” protect students from having to speak up.[2]

Does the Present Shed Light on the Past?

I thought it was an original discovery of mine—the notion that present-day concerns often direct historians to study particular aspects of history.

In the 1960s, for example, economists and politicians were trying to help newly independent, but underdeveloped, countries grow. Sidney Pollard, a historian, thought that a better understanding of the first great period of development, the Industrial Revolution, would “help forecast, and pave the way for, the next steps to be taken by living economies”; thus its study had “a severely practical basis.”[1]

While composing my post, however, I learned that historian David Cannadine had already written an essay detailing how present concerns shape investigations of the past. Writing in 1984, he identified four waves of historical analysis of the Industrial Revolution up to that time. He ingeniously tied each one of them to economic conditions at the time of writing.

While historians like Pollard had treated the Industrial Revolution as a model for future development, by the mid-1970s disillusionment about economic growth had set in. “[T]he British Industrial Revolution is now depicted in a more negative light,” Cannadine wrote, “as a limited, restricted, piecemeal phenomenon, in which various things did not happen or where, if they did, they had far less effect than was previously supposed.”[2]

Should such periodic re-investigation make us wonder about the validity of historical findings? That is, are those findings anachronistic (a word that historians dread)? An anachronism is something inappropriately included in a depiction of a particular period of time. For example, Shakespeare has a clock chime in Julius Caesar, but clocks did not chime in ancient Rome.[3] In scholarship, anachronism occurs when historians “pose the past in a form that would have been alien to the period we are describing.”[4]

Good News about the 1600s, Part I

The seventeenth century in Europe was bloody and violent. Some examples: a continental war that went on for thirty years (1618-1648), three British civil wars (1639-1651), naval wars between England and the Netherlands (1652-1674), and military efforts to rein in France’s Louis XIV and the Spanish Hapsburgs.

At the same time, however, economic changes were quietly occurring, laying a foundation for the Industrial Revolution. That’s the little-known subject of this post.

“What happened in the seventeenth and eighteenth centuries was a wholesale shift of industry, including rather sophisticated sectors, from city to countryside,” writes Jan de Vries in his informative book Economy of Europe in an Age of Crisis, 1600-1750.[1]

This shift from cities to rural areas is not the typical “Industrial Revolution” story, which says that peasants were forced off the farm and into the cities, making them available for burgeoning industrial factories. To some extent that did happen later, but manufacturing in England and other parts of western Europe started in rural areas, not cities. Here’s how, according to de Vries.
