If you are anywhere close to my age, you probably took a college course called “World History” even though it was primarily about Europe. Today some students take courses in “World History” that are actually designed to cover the world. This reflects a genuine effort by historians to “go global.” However, it is not as easy—nor perhaps as valuable—as it seems.
The field of world history got its start in the 1960s, perhaps with William McNeill’s book The Rise of the West (a powerful book I wrote about here).[1] In spite of the name, his book was an effort to get beyond thinking about Europe and offshoots like the United States. In fact, McNeill viewed world history as starting with the Middle Eastern civilizations of the Sumerians and Egyptians and as dramatically changed by the Mongols, who swept south and west from the Asian steppes in the 1200s. Only after 1500 did Europe begin to dominate.
Last semester I took a graduate seminar in “Thinking about World History in the Early Modern Era.” As the name implies, the class was a creative effort to figure out how, at the college level, to study the whole world in a single period, in this case the early modern era (usually defined as roughly 1500 to 1800). Each student had to devise a syllabus for teaching such a course. (In a previous post I discussed the difficulties of breaking up history into meaningful periods, but this post is about trying to encompass the world in one of those periods.)
I don’t have a problem with teaching world history during the early modern period. I do have a problem with the entire concept of world history as it has been developed over the past few decades. Many barrels of ink (a metaphor, of course) have been spilled trying to define the discipline.
For years I’ve heard about the academic pressure to publish. Now, as a graduate student, I’ve come across some of the results of that pressure: books that make an interesting subject dull.[1] I’ll consider one of them, Sacred Gifts, Profane Pleasures, in this post.
To be sure, my professors have taken pains to assign only books they consider important and relevant, the “cream of the crop.” (One professor advised his class that if we didn’t like these, we would hate the ones he had rejected.) Nevertheless, a few clunkers come through. Well, I consider them clunkers. As an editor (past and present), I am frustrated when I see tremendous talent combined with disappointing execution.
The book I’m commenting on was praised on its cover as “superior and fascinating.” It reflects enormous research (12 years’ worth), including the meticulous gathering of visual artifacts across two continents and several centuries. And it exhibits heroic efforts to come up with new interpretations. But, in my view, its impact is limited by the need to meet the academic goals that lead to tenure and a full professorship.
Marcy Norton’s Sacred Gifts, Profane Pleasures: A History of Tobacco and Chocolate in the Atlantic World[2] is the story of how tobacco and chocolate, substances that were part of pre-Columbian social and religious rituals in Mexico and Central America, became popular products in Europe during the 1600s.
Editor’s note: This is a guest post by Jay Schalin, director of policy analysis at the James G. Martin Center for Academic Renewal.
As a professional commenter on higher education, I encounter the phrase “shared governance” ad nauseam. That is the idea that the faculty, the trustees, and the administration are roughly coequal partners in higher education decision-making, each dominant in its own sphere of activities. What makes this so annoying is that a common justification for it is that it is our “traditional” form of governing universities, and that therefore any attempt to question it is out of bounds.
But is it really our tradition? Or is it something that was grafted onto another tradition of governance during the Progressive Era?
One of the bases for this traditional shared governance narrative is the claim that American colleges descend from the scholar-centered University of Paris, and that American colleges therefore originated as “communities of scholars.” Higher education historian Edwin Duryea, in his 2000 book The Academic Corporation: A History of College and University Governing Boards, claims that “The Parisian model appeared in Oxford and Cambridge that, in turn, carried over to America.”1
The seventeenth century in Europe was bloody and violent. Some examples: a continental war that went on for thirty years (1618-1648), three British civil wars (1639-1651), naval wars between England and the Netherlands (1652-1674), and military efforts to rein in France’s Louis XIV and the Spanish Hapsburgs.
At the same time, however, economic changes were quietly occurring, laying a foundation for the Industrial Revolution. That’s the little-known subject of this post.
“What happened in the seventeenth and eighteenth centuries was a wholesale shift of industry, including rather sophisticated sectors, from city to countryside,” writes Jan de Vries in his informative book The Economy of Europe in an Age of Crisis, 1600-1750.[1]
This shift from cities to rural areas is not the typical “Industrial Revolution” story, which says that peasants were forced off the farm and into the cities, making them available for burgeoning industrial factories. To some extent that did happen later, but manufacturing in England and other parts of western Europe started in rural areas, not cities. Here’s how, according to de Vries.