Serious study of leadership behavior got under way around the same time that Western nations were stumbling toward World War One, and it came to fruition shortly after Nazism and fascism were vanquished in World War Two. In the middle of the twentieth century, as the Information Age and the global Anthropocene (literally “the age of human beings”) dawned in the aftermath of two grotesquely violent world wars, psychologists, historians, religious leaders, and average citizens began asking each other some gut-wrenching questions: “Who in the world do we think we are? What in the world have we done to each other? How in the world can we do better?” By this time, philosophers, social scientists, and visionaries had proposed that there could be useful answers to these questions if we simply took them seriously and worked together to pursue them — in other words, if we could get in the habit of seeking the truth with each other, speaking the truth to each other, and working together in common cause. After all, now that we know we have the power to destroy the earth and end human history, why can’t we create a better world and chart a new path? Perhaps we could forge or find such a path through courage, compassion, continuous learning, and a commitment to the human community.

Though the average human life span in 1950 was more than twice that of people who lived when Aristotle was tutoring little Alexander the Not-Yet-Great, and though most people in the developed world enjoyed the benefits of electricity, rapid transportation, democratic governance, significant leisure time, instantaneous global communication, and indoor plumbing, the vast majority recoiled in horror and disgust at the recent devastation wreaked by followers of authoritarian tyrants driven by racism, nationalism, ancient superstitions, delusions of grandeur, and crackpot ideologies. Out of all that turmoil, social scientists eventually identified universal levels of human need, predictable stages of human development, and other critical factors contributing to the mess we found ourselves in and pointing to strategies for escape to a more humane future. Meanwhile, political leaders from all the world’s major nations agreed to support a new global organization dedicated to peaceful collaboration based on a Universal Declaration of Human Rights. (The League of Nations, an earlier global organization founded right after World War One, proved toothless and useless. The newer United Nations is also rather toothless but has proved much more useful.)

Despite all we had learned about effective leadership by the middle of the twentieth century, the tragic irony of two world wars and the specter of global annihilation seemed to undermine any sense of optimism about our collective ability to act wisely upon our knowledge. In the aftermath of World War II, as thoughtful women and men reflected on the horrifying destruction wrought by millions following leaders they apparently respected and admired, the whole question of human destiny and its relation to leadership, followership, and citizenship called for redefinition.  In the years since then, at least in the United States, we’ve witnessed a gradual but consistent loss of confidence in our leaders.  Yet upon further reflection, we must also recognize that a loss of confidence does not necessarily signify a loss of long-term faith, and it certainly doesn’t signify a loss of hope. 

Our experience with large-scale mobilization for world war has ultimately helped us learn how effective leadership for the common good might work and how the skills and values of effective leadership can be developed. We’ve also begun to realize that the destructive potential unleashed by world war in the first half of the twentieth century reflects an even more powerful creative potential within the human community. Leaders in all times and places are called on to keep hope alive; at this point in human history, leaders need to help us all recognize that if we can destroy ourselves, then we can surely create ourselves as well.

If we wanted to find valid signs of hope and self-renewal in the wake of World War Two, we would have to consider the challenges facing the losers at least as much as those facing the winners. Without a doubt, the world wars made us all losers in profound ways, but probably the three biggest losers of the Second World War in terms of physical destruction and loss of life were Germany, Japan, and Russia. Russia, of course, was actually on the winning side and did not see any need for a thoroughgoing renovation of its whole system until more than 40 years after the war. Germany and Japan, however, had to start rebuilding literally from the ground up, salvaging whatever they could from the wreck of their past and integrating it with new ideas for the future. European economies were rebuilt largely through the Marshall Plan, an ambitious program of aid and reconstruction financed by the United States and named for the American general and secretary of state George Catlett Marshall. The case of Japanese renewal offers a different but equally intriguing example of leadership and collaboration, particularly but not exclusively within the realm of commercial and industrial development.

As the human parade of generations and millennia has marched on, each transformational epoch has been shorter than the one that came before it. The prehistoric era may have featured a million years of human activity. The Agricultural Revolution emerged about 10,000 years ago, the Axial Age about 3,000 years ago, the Renaissance about 600 years ago, the Enlightenment about 350 years ago, the Industrial Revolution around 200 years ago, and the Information Age within the last 100 years. The term “Information Age” applies largely to the emergence of digital technology, which facilitates the gathering, interpretation, and distribution of enormous piles of information. It also refers to the globalization of communication technologies, from radio and television to the Internet and cell phones.

The Information Age has also ushered in what some have called the “Anthropocene,” an era in which human influence has spread across the entire planet. On the positive side, the Information Age and the Anthropocene promise creative solutions to lingering problems through reason, acceptance of responsibility for future generations, and global collaboration. On the negative side, they pose monumental challenges rooted in lingering human arrogance and greed, especially in light of burgeoning economic inequality, environmental pollution, climate change, and the availability of weapons of mass destruction.

The original concepts and forms of democracy have evolved considerably from the days when white male landowners and slaveholders in the former American colonies outlined their vision of a democratic society. Thanks to the contributions of the Information Age, our vision of democracy is now much broader and deeper than it was just a dozen generations ago, and it is expanding and evolving still. In the wake of world wars, visions of peaceful democratic leadership on a global scale attract our imagination and invite us to rid the world of old visions of violent conquest and tribal dominance.

Backing up a bit, the world’s first formal attempt to study management behavior came with the founding of the Wharton School at the University of Pennsylvania in 1881. Management is not quite the same as full-fledged leadership: leaders generally help to formulate goals, aspirations, and strategies, while managers attempt to achieve goals, fulfill aspirations, and apply tactical operations to challenges chosen by leaders. Leaders focus on general long-term strategies and goals, while managers focus on specific tactics and short-term objectives. Nonetheless, the founding of a prestigious institution of higher learning devoted to the study of management behavior was a big step forward.

The German sociologist Max Weber (1864-1920) built a large following within the academic and business communities with his arguments in favor of large-scale hierarchical organizations, often called “bureaucracies,” managed according to a set of rational principles. The American management guru Frederick Taylor also attracted considerable attention with his 1911 publication, The Principles of Scientific Management, based on extensive time-and-motion studies of workers engaged in physical labor. From 1924 to 1932, industrial psychologist Elton Mayo and his associates conducted a series of workplace studies at the Hawthorne plant of the Western Electric Company near Chicago. The studies were originally intended to determine the levels of lighting at which workers were most productive.

Oddly enough, Mayo discovered that workers became more productive when the lighting was turned up, and then became more productive again when the lighting was turned back down. It turned out that workers were responding not to the lighting but to the fact that somebody seemed to be paying attention to them. The resulting “Hawthorne effect” gave rise to a movement within the management community called the “human relations movement,” which called attention to the actual relationships between workers and managers, not just to the supposed traits of the managers and leaders. We’ll have more to say about the Hawthorne effect and the human relations movement in the chapter on “Groups, Organizations, and Networks.”

After World War Two, the emphasis in psychology shifted from analyzing dysfunction, as in Freudian psychoanalysis, to preventing dysfunction and even building proactive strength, resilience, creativity, and effective leadership skills. Part of this effort lay in identifying the sources of the bad behavior that had led us to the near-apocalypse of world war; the rest of it lay in identifying the sources of strength, resilience, and creativity that come with our human birthright. We will spend more time on this material in future units within the Transforming Leadership curriculum, but for now it is worth noting briefly the contributions of major thinkers and thought leaders of this period.

One of the most influential studies emerging from World War Two was conducted by Theodor Adorno and his associates, who investigated the psychological roots of fascism by surveying and interviewing roughly two thousand American adults in the years right after the war. Their book was published in 1950 under the title The Authoritarian Personality. Though it eventually met criticism for some of its methods and conclusions, it has stood the test of utility and led to further studies that have deepened our understanding of human behavior. In brief, the central features of the authoritarian personality include conventionalism, submission to authority, anti-intellectualism, “toughness,” cynicism, and aggressiveness. These elements and others were arranged on what Adorno and his associates called an “F scale,” where “F” stands for “fascism.” Though this study did not help to identify any positive properties of the human psyche, at least it helped to highlight the most dangerous tendencies. The F scale and its successors can still be used to identify potential Nazi sympathizers, Klan recruits, and racists in general.

In the section of this curriculum addressing individual human development and personal mastery, we’ll look into the details of some remarkable research on human motivation, life-cycle development, and moral reasoning. That research will clarify and bolster the insights of other writers noted below.