Jeffrey Pfeffer's Blog
August 27, 2016
The One Important Thing Missing From the Health Care Debate
You may have seen the big news on health care: Aetna is going to stop offering individual health insurance policies on the health exchanges in 11 of the 15 states where it operates. Aetna follows the lead of another large health insurer, UnitedHealth, which announced in April that it was withdrawing from almost all of the health insurance marketplaces where it operated.
The problem: the companies were losing money (or weren’t making enough) on the health insurance coverage they offered. Meanwhile, health insurance rates for individuals are soaring, with insurance companies seeking rate increases of 20 to 40%. And on many health exchanges, with the withdrawal or bankruptcy of health insurance providers, consumers will have little choice—frequently just one plan to choose from.
As for employers, they’re trying to shift some of their costs onto employees through higher premiums, larger deductibles, and greater co-pays. My employer, Stanford University with its $22 billion endowment, raised the co-pay for seeing a specialist last year by 50% and the cost of refilling a mail order prescription for a non-formulary drug by 33%.
Medicine takes a backseat to costs
More than a decade ago, an aptly titled article, “Monetized Medicine: From the Physical to the Fiscal,” noted the extension of industrial engineering logic from the factory to the hospital and physician’s office as health care became “market driven” in an attempt to hold down runaway health care costs. Although this effort was clearly a failure in reality, it was a symbolic success as doctors and other health care providers have lost power while economists and professional managers have gained increasing control over the health care system and the language used to discuss health care. For instance, a Google Trends search I did shows that since 2004, searches for “health care costs” far exceed searches for “health care outcomes,” and an even bigger gap exists using Google’s Ngram viewer to consider the frequency of these terms in books.
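For readers who want to reproduce that kind of comparison, here is a minimal sketch using the third-party pytrends package (a community wrapper around Google Trends); the keyword list and timeframe are illustrative assumptions, not the exact query behind the figures mentioned above.

    # Minimal sketch: compare relative Google Trends search interest for two phrases.
    # Assumes the third-party "pytrends" package (pip install pytrends).
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=360)
    keywords = ["health care costs", "health care outcomes"]  # illustrative phrases
    pytrends.build_payload(keywords, timeframe="2004-01-01 2016-08-01")

    interest = pytrends.interest_over_time()  # relative interest over time, scaled 0-100
    print(interest[keywords].mean())          # average relative interest for each phrase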
Should we care about language, terminology, and what makes the news? Should we care that health care coverage is mostly about exchanges, competition, cost curves, costs, and, yes, politics? (Surveys show that support for or opposition to Obamacare is strongly predicted by political party identification.) I believe the answer is “yes.”
Ignoring patients’ health
When Don Petersen was nearing the end of his career as CEO of Ford, he gave a talk at Stanford about his successful efforts to turn around the company, which was bleeding red ink when he took over. He noted that in his first senior management meeting as CEO (which he admitted was not that much different from any other meeting he had attended), a long period passed before the word “car” or “automobile” was mentioned.
Petersen noted that as long as a car company didn’t talk about cars but instead about margins, returns, and so forth, it was doomed—because it would place insufficient emphasis on product. This example echoed a comparison, recounted in In Search of Excellence and in speeches by Tom Peters, of two oil companies’ success in discovering oil. Peters commented that the oil company that did better than its competitor in finding oil and building its reserves was distinguished not by equipment or technical expertise but by how much time at senior meetings was devoted to the topic of oil discovery.
Language matters. Language focuses people’s attention. What senior managers talk about affects what their subordinates measure and emphasize, in the quest for leaders’ approval and individual career success. What isn’t talked about gets ignored and what is ignored suffers from inattention.
So here’s what I wish would get more attention from the news media and for that matter from employers and health insurers in their decision making: human health and well-being.
Surge in bankruptcies
Every article on the number of uninsured—and there are still millions even after the passage of the Affordable Care Act—and every decision that affects people’s access to health insurance should note that research estimated that in the late 2000s, people without health insurance were 40% more likely to die than those with health insurance, even after statistically controlling for age, income, education, health status, body mass index, exercise, smoking, and alcohol use. That means, given the number of uninsured at the time, that there were approximately 45,000 excess deaths annually because U.S. residents lacked health insurance.
Every article and discussion held within companies and by their health benefits consultants about cost shifting and increases in insurance premiums and copayments should consider the fact that in 2007, more than 60% of all personal bankruptcies filed stemmed from medical costs, and that between 2001 and 2007, the share of bankruptcies “attributable to medical problems rose by 49.6%.” And as employers consider shifting ever more costs to individuals, they might consider research showing that “increases in deductibles will lead to an overall decrease in optimal care-seeking behavior as families juggle healthcare costs with a weak economy and stagnating wages.”
Maybe the fact that health care has become more about money than health helps explain why Americans’ life expectancy has stopped increasing and, for some demographic groups, has started to decline.
As George Orwell noted in his famous essay, “Politics and the English Language,” “political speech and writing are largely the defence of the indefensible” and “language can also corrupt thought.” That seems to be precisely what has occurred in much health care decision making and discussion as people’s lives and financial well-being get short shrift. While human health and well-being, and human life, may need to be balanced against economic realities, it just might be time for money and economic considerations in health care to be balanced by concern for human life and suffering.
(This post was originally published on Fortune on August 26, 2016)
August 17, 2016
How to Mythologize Yourself: Lessons from Donald Trump
There are many lessons to be gleaned from the Donald Trump phenomenon. This may be one of the most important: perception becomes reality, so you need to tell a convincing narrative persistently, frequently, and well.
Donald Trump is running for president in part on the claim that he’s been a really rich and wildly successful businessperson. But as former Wall Street Journal reporter Neil Barsky, who covered Trump’s business travails between 1985 and 1991, recently wrote, “In reality, Mr. Trump was a walking disaster as a businessman for much of his life.” It’s not just the casino bankruptcies, frequent litigation, and the bad press surrounding Trump University. Even Trump’s negotiating skills have been questioned. Trump’s ghostwriter for The Art of the Deal has come forward with regrets about his role in creating the “myth” of Trump’s business acumen.
The typical Trumpian response to bad press: Threats of legal action. Nevertheless, Trump’s brand as a rich and successful business executive persists and just may be stronger than ever. That’s because Trump repeats the narrative endlessly, drops names of other successful people he hangs around with, takes on all the trappings of power and pomp—in short, he acts and plays the part almost flawlessly, so observers naturally enough buy in to the narrative.
But it’s not just Trump—or politics—where telling a convincing narrative matters. I frequently tell my friend, business writer John Byrne, that his description of Jack Welch in the best-selling Jack: Straight from the Gut has as much to do with Welch’s reputation as a business genius as any actual business accomplishments (see Jim Collins’ bestseller, Good to Great, for a discussion of GE’s thoroughly mediocre performance).
Silicon Valley is no different in teaching the importance of telling a convincing story with consummate skill. Former New York Times technology writer Nick Bilton noted in a 2013 article that the Valley is filled with “creation myths,” narratives that sometimes, maybe invariably, leave some people completely out of the story and substantially exaggerate the roles played by others. (I assign this article to students on the very first day of my class on power.) Almost no one remembers that Ronald Wayne was the third co-founder of Apple, that Jawed Karim was also involved in the creation of YouTube, that Reggie Brown was pushed out of Snapchat, and that there was litigation over whether Facebook was as much the sole creation of Mark Zuckerberg as is now assumed.
Bilton’s reporting on Jack Dorsey and the founding of Twitter, skillfully told in Hatching Twitter: A True Story of Money, Power, Friendship, and Betrayal, is insightful not only for its description of the power struggles and political maneuvering but also for what it reveals about the importance of PR—narrative—in gaining power and wealth. As Bilton wrote, after Dorsey was forced out of Twitter, he “went on a media campaign to promote the idea that he and [Evan] Williams had switched roles.” In his retelling of Twitter’s founding story, Dorsey “completely erased Glass from any involvement in the genesis of the company” while slowly, with each retelling, increasing his own role and its importance. Quoting one former Twitter employee, Bilton penned, “The greatest product Jack Dorsey ever made was Jack Dorsey.”
The implications for leaders and everyone else seem clear: build relationships with the media, have a compelling personal narrative, and promulgate it and yourself relentlessly. A former Stanford business school student and friend, Marcelo Miranda, cultivated the Brazilian business press from the very beginning of his career, writing articles for the local media—and persisting in the face of rejection—and being always willing to take reporters’ calls when his colleagues weren’t. After he was featured on the cover of one of Brazil’s leading business magazines in a story about CEOs of the future, it was not surprising that he soon became CEO of an important Brazilian construction and engineering company.
And then there are the downsides to being written out of the picture. One friend let his cofounder do the media relations and was not mentioned at all in a story on their medical company. He left not long thereafter. Another former student co-founded a social media company that raised over $100 million in venture funding and at one point had a valuation of over $800 million. But her cofounder never talked about the venture and it was never mentioned in articles about him, and the business was sold at not a great price. When she later sought funding for a new start-up, it was as if she were starting from scratch, with little memory of or recognition for her past achievements.
Lesson: not only do you need a public relations strategy and ongoing personal brand building, you can never cease that activity.
Public relations people tell me the importance of personal branding has never been greater, and that smart executives understand the need to devote sufficient time to this effort. My Stanford colleague and former television producer Allison Kluger is going to be teaching a short course in the spring 2017 quarter, along with Tyra Banks, on how to build a personal brand. Banks has reinvented herself from model and actress into businesswoman and consumer brand icon, and nicely illustrates how to take control and own one’s personal narrative.
My students sometimes say that when they become a CEO or land in some other very senior role, they will worry about public relations and their image. My response: at that point, they won’t need to.
Get a PR strategy and build your story early—it will help make subsequent success more likely because people will use the brand and narrative to guide what information they seek, remember, and forget or overlook. It worked for Donald Trump and Jack Dorsey, among others. It might just be useful for you, too.
(This post was originally published on Fortune on August 16, 2016)
July 22, 2016
The Wisdom of Embracing Your Adversaries
I am often struck by the response of many people to the horrific acts of terrorism that are occurring all too frequently: advocating striking back—and hard—not just at the perpetrators, who are mostly dead by that point anyway, but at their communities. So there are calls to ban Muslim immigration and travel, to continue profiling people of color, and to pursue other forms of retaliation that marginalize opponents and reduce contact with adversaries.
Holding aside the legality and morality of such recommendations, evidence and anecdotes suggest that striking back forcefully is sometimes the wrong approach. Here’s why, along with a different way of thinking about how to win in the inevitable conflicts you will find in workplaces and in the world.
The most fundamental principle comes from Nobel Prize-winner Thomas Schelling’s The Strategy of Conflict, nicely paraphrased in a lyric from Bob Dylan’s “Like a Rolling Stone”: “When you got nothing, you got nothing to lose.” People who have no stake or say in the system, who are pushed to the edge and given no reasonable options, will fight with any and all means at their disposal because they see few alternatives and perceive no cost to going all in. Therefore, a more effective approach is to ensure that adversaries have a graceful out and do confront potential losses if they choose to mount a full-fledged war.
For instance, when Frances Conley, the first woman to pursue a surgical internship at Stanford Hospital and the first woman to become a tenured professor in neurosurgery at a U.S. medical school, resigned her full professorship at Stanford (a resignation she later rescinded) over sexism in the medical school, she became a visible and articulate public figure and a hero to many around the country concerned with the challenges facing female medical students and physicians. She also opposed the medical school dean’s choice of neurosurgery department chair, an action that threatened to undermine the dean’s control and power.
Conley’s notoriety, professional accomplishments, and leadership qualities meant that she soon found herself a candidate in searches for administrative positions such as dean or department chair at numerous academic medical centers, as well as a focus of media attention. Those administrative opportunities would come with a vetting process. So the question was: what would the dean, David Korn, do if and when he was asked about Conley?
I know Conley well from her time in the one-year, full-time Sloan program at Stanford’s Business School, and I wrote a case study on her. When I interviewed David Korn for the case, I asked him why he did not provide Conley glowing references for those outside jobs and thereby strategically “outplace” her into a better position elsewhere, where she would not be in conflict with him or threaten his power. His reply was that he simply could not bring himself to speak highly of an adversary and help that person succeed, even if that was the best thing for him to do. This course of action turned out to be a mistake.
Once Conley saw she was not going to get a position elsewhere and, being in a tenured role, now had nothing to lose, she went all in on the struggle for better treatment of women medical students and faculty. She filed a complaint with the EEOC, continued to garner publicity and push for change at Stanford, wrote Walking Out on the Boys, a book that became a Bay Area bestseller, and, most importantly, sparked internal investigations into management practices at Stanford’s medical school. The publicity and internal fact-finding eventually led to Korn’s departure under ambiguous circumstances a few years later, first for a senior role at the Association of American Medical Colleges.
Contrast Korn’s approach with that of two-term San Francisco mayor and 14-year speaker of the California Assembly Willie Brown. As James Richardson describes in Willie Brown: A Biography, when Brown won a bitterly contested battle with fellow Democrats for the role of Assembly speaker in the early 1980s, his response was to “take care of” his opponents. Some he worked to get elected to the California Senate; for others he found administrative jobs. His principal rival for the speakership was Howard Berman from the Los Angeles area. Brown worked hard to get Berman elected to Congress, where Berman served for decades. This move prompted Berman to comment that Willie Brown was the best speaker he had never voted for.
Instead of exacting vengeance, Brown eliminated his opponents in the best way possible, sending them to good positions—elsewhere—so they were not in his way in the Assembly but still had resources and successful careers to risk if they continued to fight him. And, of course, these former rivals might even come to appreciate Brown’s help.
A related, but somewhat distinct, second principle for successfully coping with adversaries is captured by the aphorism, “keep your friends close, but your enemies closer,” a saying often attributed to the great Chinese general Sun-tzu, author of The Art of War.
People mostly dislike conflict and, following the hedonic principle, seek pleasure and avoid pain. Thus, a natural response is to avoid being in much, if any, contact with adversaries. But avoiding opponents precludes attempts to co-opt them. And just as important, the less contact someone has with an adversary the less information that individual has about the adversary’s goals, strategies, and behaviors. A lack of information is never useful in a struggle.
No one was better at staying close to adversaries than President Lyndon Johnson. Johnson disliked and distrusted FBI Director J. Edgar Hoover, but he kept Hoover on in his role. As author David Halberstam recounted, Johnson, resigning himself to the difficulty of firing Hoover, could say, “Well, it’s probably better to have him inside the tent pissing out, than outside pissing in.”
Achieving a good outcome in a conflict sometimes requires tempering the instincts to devastate and demonize one’s opponent and to minimize contact with adversaries. Those with nothing to lose have nothing to risk. And not engaging with enemies leaves you without knowledge of what they are up to.
The Buddha long ago noted yet another reason to not demonize opponents when he commented, “Hatred does not cease by hatred, but only by love; this is the eternal rule.”
(This post was originally published on Fortune on July 19, 2016)
July 11, 2016
Why Neither Brexit Nor Trump Will Fix the World’s Job Problems
Britain’s vote to exit the European Union and the success of Donald Trump’s candidacy, with its stated intent to redo trade deals and get rid of immigrants, may result in part from the reality of disappearing (manufacturing) jobs, stagnating wages, and increasing economic insecurity. But the explanation for the job and income problems—globalization and unregulated borders—is mostly incorrect. And by blaming external forces, we let the real culprits—companies and governments—off the hook.
Over the years, there’s been a lot of rigorous, peer-reviewed research on the effects of globalization and immigration. Unfortunately, not much of the evidence is reflected in the discussions of these issues.
For instance, a study of the effects of immigration on employment in Germany concluded that there was “no detrimental effect.” An analysis looking at the effect of immigration on income in the U.S. and Europe found that the absolute value of any effects was small. A comprehensive review of the extensive research on immigration effects on labor markets concluded that “the probability that immigrants increase unemployment is low in the short run and zero in the long run” and that immigration does negatively affect wages but primarily, if not exclusively, for less-skilled people and earlier immigrants themselves.
The many studies of globalization frequently come to different conclusions about its effects. According to one analysis, an occupation’s exposure to global trade decreased real wages by between 12% and 17% in the U.S., an effect that came from people having to move from high-wage manufacturing occupations to lower-paid positions elsewhere in the economy. But a methodologically sophisticated study of the effects of both trade and migration in 30 OECD countries concluded that “immigration and trade do not have a significant effect on income per capita in the short run.”
So we should stop focusing on factors where the evidence of harm is at best mixed and where the opportunity for substantive change is low. Few people believe that Britain can or will turn its back on economic integration with Europe. And as German Chancellor Angela Merkel and others have said, if Britain wants to be part of Europe, the benefits will come with obligations.
Even fewer people believe that Trump can unilaterally rewrite multilateral trade agreements, sometimes codified in treaties, or exclude the U.S. from participating in the global economy without incurring substantial economic damage.
Instead, let’s look at what we know about what people want—good jobs—and what those good jobs are and how they relate to income. The Gallup organization provides relevant information on these questions. (Full disclosure: I am collaborating with Gallup, in an unpaid capacity, on some research.)
First, as Gallup CEO Jim Clifton writes, “what the whole world wants is a good job.” This priority represents a change from the historical emphases on peace, freedom, and family. Gallup defines a good job as one where someone works 30 or more hours per week for a paycheck—a definition that excludes subsistence farming, selling trinkets or food by the side of the road, or similar activities. Gallup assesses the prevalence of good jobs in an economy with a measure it calls the payroll-to-population ratio. “Adults who are self-employed, working part time, unemployed, or out of the workforce are not counted as payroll-employed in the Payroll to Population metric.” So much for the benefits of the “gig” economy, which, according to Gallup, is not producing good jobs.
Here’s the payoff: Gallup finds that GDP per capita, median household income, and median per-capita income are all strongly associated with the “proportion of adult residents in each country who say they are employed full time for an employer,” the payroll-to-population measure. Simply put, people want good jobs. Good jobs entail working essentially full time for an employer, and they are strongly correlated with measures of economic well-being.
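To make the metric concrete, here is a minimal sketch of how a payroll-to-population style figure could be computed from survey records; the field names, example records, and 30-hour threshold are illustrative assumptions based on Gallup’s published definition, not Gallup’s actual code or data.

    # Minimal sketch: a payroll-to-population style metric computed from survey records.
    # Field names and sample records are illustrative; Gallup's methodology differs in detail.
    from dataclasses import dataclass

    @dataclass
    class Respondent:
        works_for_employer: bool  # employed by an employer (not self-employed)
        hours_per_week: float     # usual weekly hours

    def payroll_to_population(adults: list) -> float:
        """Share of the adult population working 30+ hours per week for an employer."""
        good_jobs = sum(1 for r in adults if r.works_for_employer and r.hours_per_week >= 30)
        return 100.0 * good_jobs / len(adults)

    sample = [
        Respondent(True, 40),   # full time for an employer: counts as a good job
        Respondent(True, 20),   # part time: does not count
        Respondent(False, 50),  # self-employed: does not count
        Respondent(False, 0),   # out of the workforce: does not count
    ]
    print(f"Payroll-to-population: {payroll_to_population(sample):.1f}%")  # 25.0%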
So, what affects the payroll-to-population measure, or the number of good jobs in a society? Many things, but among the more important are government policies and company decisions about the nature of employment.
Decades ago, employers began to shift work from employees to contractors, part-timers, people provided by temporary help agencies, and other kinds of contingent workers. In the last decade, the proportion of people working in a contingent fashion has increased by 50%. And the “number of Americans using these alternate work arrangements rose by 9.4 million,” according to the New York Times, a number greater than the rise in overall employment. This means that “there was a small net decline in the number of workers with conventional jobs.”
As Gerald Davis at the University of Michigan said to me, the economy first transitioned from careers—long-term attachments between employers and employees—to jobs with less permanence and mutual commitment. And now, companies are moving from hiring people for jobs to obtaining people to do very specific, often limited (in time and scope) tasks, such as making deliveries or writing a specific piece of code or a technical manual.
Employers use contingent work arrangements because, as the U.S. Department of Labor has noted, contingent workers frequently receive less pay and fewer benefits than permanent workers. Moreover, employers do not have to make Social Security payments or contribute to workers’ compensation or unemployment insurance funds for contingent labor. And, since the people doing the work are not employees, employers are free of obligations under labor and employment laws, such as paying overtime or observing due process and notification requirements when firing people.
These employer decisions are in many instances discretionary. For years, numerous industrial relations and human resource scholars have described what have come to be called “low road” and “high road” work arrangements. For instance, Costco pays more and offers benefits to a larger portion of its workforce than Sam’s Club (part of Wal-Mart) does. Even though that would seem to raise costs, Costco has lower turnover and more knowledgeable, skilled, and productive employees, so it actually is more profitable on a per-employee basis. An analysis of airlines contrasted those that emphasized control over their workforce with those that sought to generate employee commitment, and those that avoided labor unions with those that sought accommodation with them. The researchers concluded that “airlines (and other enterprises) still have some scope for exercising strategic choice, in spite of their institutional and regulatory context.”
Under the banner of labor market flexibility and competitiveness, government—particularly in the U.S. but also around the world—has stepped back from intervening in the employment relationship. This lack of involvement has left employers to do what they want, which is mostly to increase the use of contingent work arrangements and cut benefits. And as the portion of workers covered by collective bargaining contracts has dramatically declined, unions have been unable to stem this tide.
Some might argue that the worldwide competition that has come with globalization has reduced employers’ discretion to use full-time employees and that immigration and the potential to move facilities offshore have provided a reserve labor force that weakens employees’ bargaining power. That may be true, but employers still have choices. Some pursue high-commitment, high-performance, high road strategies that entail competing on service, productivity, and innovation. Others opt for cost-based, low road business strategies. The same holds true for countries. Gallup data ranks the U.S. 15th in the payroll-to-population measure, behind countries such as Sweden, Denmark, Canada, Singapore, and the Czech Republic. Institutions and legal frameworks matter.
I don’t see how pulling out of the EU, pulling out of trade pacts, or curtailing immigration is going to affect employers’ decisions about using contingent labor or change the legal and institutional frameworks governing the employment relation for the better. As in many other domains, people are mistaking effects for causes.
(This post was originally published on Fortune on July 5, 2016)
June 21, 2016
Donald Trump: The Unproductive Narcissist
Yes, Donald Trump is a narcissist. He puts his name—often in huge letters—on everything from buildings to a 757 airplane. He constantly brags about his wealth and business acumen, sometimes exaggerating his accomplishments.
Trump eschews the modesty that best-selling author Jim Collins identified in Good to Great as characterizing Level 5 leaders: people who, with fierce resolve but a willingness to stay in the background, lead their companies to incredible levels of performance. But then again, Trump helps make the case that we often seek leaders with qualities and behaviors quite different from what we claim we want.
Narcissism—“a personality trait encompassing grandiosity, arrogance, self-absorption, entitlement, fragile self-esteem, and hostility”—has many advantages. Because of their apparent self-confidence and extraversion, narcissists seem to embody the qualities ascribed to leaders. Much research shows that narcissists are more likely to be selected into leadership roles in both formal and informal settings.
In an insightful article on the pros and cons of narcissistic leaders, Michael Maccoby, author of The Productive Narcissist, noted that many successful political and business leaders—people such as Larry Ellison, Jack Welch, Steve Jobs, and yes, Donald Trump—display narcissistic tendencies. Maccoby argued that contemporary business leaders, with their high profiles and active public relations campaigns, are more narcissistic than their predecessors, with a “large number of narcissists at the helm of corporations today.” Maccoby’s conjecture aligns with evidence showing an increase in average levels of narcissism in the general population and among college students, and with data indicating that business students are more narcissistic than students who major in other subjects.
Narcissistic traits, such as a fixation on a leader’s pet projects, can be useful, if not necessary, for making the bold changes and big bets that help redefine companies and industries. Stanford business school professor Charles O’Reilly and his colleagues noted that, “narcissists are more likely to be seen as inspirational, succeed in situations that call for change, and be a force for creativity.”
But Maccoby and others have also identified traits that hinder the effectiveness of narcissistic leaders: being overly sensitive to criticism, poor listening skills, a lack of empathy, an intense desire to compete, and an aversion to mundane details. A meta-analysis of the research on narcissistic leaders concluded that narcissism predicts leader emergence more strongly than leadership effectiveness. More importantly, there is a curvilinear relationship between narcissism and a leader’s effectiveness. In other words, some degree of narcissism is good, but beyond a certain point it’s probably harmful.
So what constrains narcissism so it doesn’t reach unproductive levels? External constraints and reality checks. Steve Jobs both ran a publicly traded company and got fired. Larry Ellison also ran a publicly traded company with a presumably independent board of directors and, moreover, was sanctioned by the SEC for accounting irregularities, including exaggerating sales. Jack Welch worked his way up the hierarchy at General Electric, ran a public company with a board, and had people just below him who went on to become CEOs in their own right. He successfully worked with other strong leaders who could have potentially challenged him.
Politicians have to win over not only the public but also supporters in their own party and donors. Having watched fundraising at Stanford, I can tell you that nothing helps the money flow as much as the ability to flatter a potential donor endlessly—which requires submerging one’s own ego. It works the same way with political fundraising.
Trump has not run a public company with an independent board. While he has certainly suffered reversals—casino bankruptcies as one example—he has not been fired. He has not had to curry favor with bosses or board members to get, or keep, his job. He has not run for elected office before, with the requirement of sucking up to donors and submerging his ego. He has not been in a role, such as governor or mayor, where to get things done he needed the support of legislators or city council members. In short, Donald Trump has reached his late 60s without facing the situational exigencies that could potentially curb his larger-than-life narcissistic behaviors—scenarios that many other productive narcissists have encountered.
It is, of course, always possible to learn the behaviors that make interpersonal influence work. People can learn to listen; to build empathic understanding; to not be the first one, or the loudest one, to speak; to work collaboratively in teams; and to temper competitive, disdainful behavior. Whether, at this stage in life, Donald Trump can do any of this remains an open question. If he doesn’t, I foresee a bumpy road ahead for the Trump show.
(This post was originally published on Fortune on June 20, 2016)
Why Deception Is Probably the Single Most Important Leadership Skill
The world is awash in claims of the benefits of truthfulness, candor, and transparency. A Google search using the phrase “benefits of candor” returned 30,500 entries, with just six for the opposite phrase, “costs of candor.” The kumbaya nature of leadership advice shows through.
But before you run off and tell everyone precisely what you are thinking and feeling, here are a few pieces of evidence in favor of the opposite approach.
Expectation Effects
About 50 years ago, a Harvard social psychologist and a San Francisco school principal studied Pygmalion effects in the classroom. They found that students who had been labeled, on the basis of fictitious test results, as likely to experience spurts in intellectual growth showed increases in measured IQ over the course of the school year. The effect was particularly pronounced for children in the first and second grades. This research led to a boom in similar studies, first in education and then in management and leadership.
An Israeli academic, Dov Eden, conducted a number of studies demonstrating that when leaders communicated high expectations for individuals ranging from sales people to military personnel, those individuals performed at a higher level than people not subjected to similarly high expectations. A subsequent systematic review of the scientific literature confirmed the effects of expectations on performance and found that the effects were more pronounced for people who had previously been poor performers.
There are at least two mechanisms by which expectations have an effect on a person’s performance. One is called defensive effort. People who are told they won’t do well will, reasonably enough, not try very hard. Why waste energy on a fruitless quest? On the other hand, people who are told they are likely to succeed will invest more time and energy because they expect a payoff from their efforts.
Second, people, including teachers and supervisors, behave differently toward people depending on what they are told about those people. One article noted that when a person is provided with stereotype-cuing information about another individual with whom they expect to interact—for instance information about physical attractiveness, intelligence, and so forth—their behavior changes in ways that act to confirm the stereotype. For instance, people who thought they were interacting with a physically attractive person were more sociable, friendly, and likable than those who thought they were interacting with a less attractive individual.
In many cases, for positive expectations to improve performance, leaders or teachers must deliver false or bogus information to the targets. If poor performers are going to improve because they are told they are expected to do well, leaders may have to say things they may not believe.
Placebo Effects
A related phenomenon in medicine is the placebo effect—people who believe they have been given some drug or treatment respond more strongly just because they think they received a potent treatment. For instance, a study of the administration of a stimulant (not a placebo) to cocaine abusers found that the physiological metabolic response was some 50% higher in people who were told they were being given the stimulant compared with people who received the identical dosage but were told they were being given a placebo.
A recent article in the New England Journal of Medicine noted that the therapeutic encounter—the doctor in the white coat, the other symbols and settings of medicine, and the apparent administration of some treatment—activated certain parts of the brain and affected patients’ levels of endorphins and dopamine. The article argued that some of these effects on neurotransmitters were identical to what was achieved when patients took actual drugs.
The potency of the placebo effect coupled with the tremendous contemporary problem of opiate addiction has led to the recommendation to sometimes use “fake” pills to treat patients’ pain. The idea is to achieve pain relief without the administration (and availability) of addictive narcotics.
Once again, for the placebo effect to work, there must be deception. If someone says you are getting a sugar pill, the placebo effect won’t operate and there will be no benefit to the patient.
Self-Fulfilling Prophecies
Placebo and expectation effects are examples of self-fulfilling prophecies—the concept that a certain idea produces behaviors that make the idea, even if originally false, become true. The classic example would be a run on a bank. If people believe a bank is on the verge of failing, they will rush to get their money out, which then causes the bank to fail.
For businesses to succeed, they need the support of investors, the purchases of customers, and the talent and energy of employees. But none of these parties will want to be associated with a company that is going to fail. So, one of the most important tasks of a leader is to convince others that the organization can and will be successful and that it deserves their support. Leaders who convincingly display confidence can attract the support that makes the confident posture become true, as the company becomes successful because others believe it will be and act on that basis.
Sometimes, as Intel co-founder and former CEO Andy Grove once told a Harvard Business School conference in the San Francisco Bay Area, this requires leaders to display confidence that they may not feel and to act as if they know what they are doing even if they don’t.
As quoted in a book I wrote with Bob Sutton, Grove argued that leaders needed to use deception to create the conditions for success: “Part of it is self-discipline and part of it is deception. And the deception becomes reality. Deception in the sense that you pump yourself up and put a better face on things than you start off feeling. But after a while, if you act confident, you become more confident. So the deception becomes less of a deception.”
Grove also emphasized that leaders should not display uncertainty and insecurity, even if, to quote him again, “none of us have a real understanding of where we are heading.”
Forget for a moment the self-interested benefits that may come to people who deceive others for their own advantage. Suppose leaders have the purest of intentions and just want other people to succeed to fulfill the lofty expectations others may have of them. Or maybe leaders want their organizations to succeed because success inspires others to put in more effort and stay at the company. Or perhaps doctors want to improve treatment outcomes by tapping into the placebo effect.
In all of these instances, people need to be able to convincingly prevaricate—which is one reason I sometimes say that the ability to lie convincingly may be the single most important management skill. Simply put, many situations in management—and medicine—rely on the operation of the self-fulfilling prophecy. The sooner we recognize this and incorporate it into leadership training, the better off we will be.
(This post was originally published on Fortune on June 2, 2016)
May 19, 2016
To Fix High Drug Prices, Stop the Merger Madness
Sky-high prescription drug prices have angered both politicians and patients, and for good reason. Medication prices rose by more than six times the rate of inflation between 2006 and 2013.
While drug pricing is a complicated issue, the co-occurrence of soaring drug prices and an all-time record year for mergers in the pharmaceutical and biotechnology industry may be more than mere coincidence. 2015 saw the largest number and value of acquisitions ever in pharma and biotech—168 announced deals with 30 transactions exceeding $1 billion. The total value of mergers and acquisitions in 2015 was more than $300 billion, easily surpassing 2014’s record of $250 billion.
Over the years, large, economically significant players in the drug industry have disappeared in takeovers—Warner-Lambert was acquired by Pfizer, SmithKline Beecham was gobbled up by Glaxo Wellcome, Wyeth went to Pfizer, Schering-Plough was bought by Merck—the list goes on and on. Of course, the troubled Valeant made buying other companies, and then raising prices, a core component of its business strategy.
The drug business is not the only industry to see both many mergers and signs of anti-competitive behavior. In 2015, the Justice Department opened an investigation into possible collusion by airlines to limit capacity—and thereby help to ensure higher and more stable prices. Just as in pharma, airline mergers have been plentiful, with companies such as Northwest Airlines, TWA, Republic Airlines, Western Airlines, and AirTran being absorbed and Continental merging with United, Air France with KLM, and British Airways with Iberia.
Economics teaches that under conditions of perfect competition, firms earn normal profits and “prices will be kept low by competitive pressures.” To achieve perfect competition, you need many firms to operate in the same business. Therefore, antitrust policy is supposed to ensure that competitive markets remain that way by limiting mergers that reduce market competition. However, as even casual observation attests, the number and value of takeovers has soared over the years, with only a tiny minority of acquisitions drawing regulatory scrutiny.
Somewhat surprisingly, as Princeton economist Orley Ashenfelter noted, there has been little empirical evaluation of the effects of mergers on consumer prices. A study of five mergers by Ashenfelter and Federal Trade Commission economist Daniel Hosken found that in four of the cases, there was evidence of an increase in some consumer prices.
Fewer competitors make it easier for companies to tacitly coordinate behavior in anticompetitive ways. A smaller number of firms simplifies the task of monitoring what others are doing. And with fewer rivals, companies can more easily cooperate with each other without having to explicitly communicate.
Large firms, of course, tend to operate in more markets. Operating in multiple market segments raises the prospects for “mutual forbearance.” The idea is that when companies compete across multiple fronts, they may restrain their competitive moves in one market out of concern that there could be retaliation by competitors in other markets.
I am bemused as Congress bemoans high drug prices—and for that matter, crummy airline service—without asking the fundamental question: why? Of course companies will raise prices if and when they can and, for that matter, do everything else possible to increase their profits. That’s precisely what good managers are supposed to do.
While it’s the job of business people to increase profits, it’s the job of politicians and regulators to ensure that markets remain as competitive as possible. Right now, the business people seem to be doing their jobs far better than the politicians and regulators are doing theirs.
(This post was originally published on Fortune on May 17, 2016)
May 1, 2016
Great Leaders Do Not Need to Be Authentic
Ted Cruz has accused Donald Trump of, among other things, not being a “true conservative.” Instead, Trump seems willing to tack to the prevailing winds to garner support and, for that matter, to make deals. For instance, he has supported the Clintons and other Democrats in the past, has articulated somewhat ambiguous and possibly evolving positions on a woman’s right to choose (abortion rights), and while currently demonizing immigrants, Trump has hired plenty of them to work at his properties in the past.
All of this raises the question: Is Trump “authentic”? But that query raises an even more critical one: Should we even care?
Leaders need to be pragmatic—to say and do what is required to obtain and hold onto power and to accomplish their objectives. As an essay on the 500th anniversary of the publication of Machiavelli’s The Prince noted, we like to believe in virtuous leaders. But “in a world where so many are not good, you must learn to be able to not be good.” That’s because one of the crucial aims of a leader is to keep their job. After all, one cannot accomplish much without a position—a power base—from which to do so.
As you can tell, I am not a fan of the authentic leadership idea. First of all, authentic leadership is a construct with numerous dimensions, definitions, and measurements, which makes it impossible to study empirically.
Second, one component in many definitions is relational transparency—being honest with others so they know what you think of them. But this is often a horrible idea. A former student of mine once worked at a company that supposedly encouraged employees to share their honest feedback with others. She gave her boss at the time some (constructive) criticism. You can guess what happened next—the boss moved to get her insubordinate subordinate fired. Flattery is almost certainly a surer way of obtaining support than telling others what you honestly think of them.
The ability to subordinate one’s views and feelings is a critical skill for advancing and surviving in the workplace. As Gary Loveman, former CEO and current chair of the gaming company Caesars, told my Paths to Power class at Stanford, when he joined Harrah’s (now Caesars) in 1998 as chief operating officer, he knew that, as someone in a senior position, he had to make critical relationships work. For him, one of those relationships was with the CFO, a very skilled, experienced executive who wanted the company to succeed but also did not want Loveman to stand in the way of his becoming COO or CEO himself. So serving that relationship, humbly, was on Loveman’s list of things to do every day. Loveman would visit the CFO in his office. Most importantly, he would ask the CFO for his opinions—what did he think of some casino general manager? What did he think of a new marketing initiative? Loveman asked for the CFO’s opinion to ensure that he felt valued and listened to, regardless of whether Loveman believed the CFO’s views would be valid or useful. As Loveman noted, early in your career, in junior roles, you can afford to like or not like people and let your feelings be known. As you move up the organizational hierarchy, you need to get others on your side. To make relationships work, you sometimes need to conceal your true feelings or opinions.
Although we often don’t like to admit it, many of the most revered and successful leaders throughout history have had hidden agendas and were willing to cut deals with ideological opponents to advance their cause. Abraham Lincoln for a time did not reveal his ultimate goal to free all the slaves in the U.S. Nor was Lincoln fully honest about the location of the Southern delegation coming to Washington to negotiate the end of the Civil War. Nelson Mandela, the father of modern South Africa, at times advocated for violence and radical views and at other times took peace-making and pro-business positions.
Willie Brown, the two-term former mayor of San Francisco who is widely considered one of the most skilled politicians in California, was first elected speaker of the California Assembly with the votes of conservative Republicans, even though Brown had previously worked to reduce the penalties for possessing small amounts of marijuana and had promoted gay rights. Brown and then-governor Ronald Reagan got along famously because Reagan was a pragmatist. And Reagan let Brown influence his budget priorities. To this day, Brown has enormous influence in the Bay Area and in California because he gets along with politicians of all stripes. And getting along requires being strategic in your interactions.
Simply put, authenticity places too much value on being true to yourself. Leaders need to be true less to themselves than to what others and the situation require of them in the moment. As INSEAD professor Herminia Ibarra insightfully noted, “By viewing ourselves as works in progress and evolving our professional identities through trial and error, we can develop a personal style that … suits our organizations’ changing needs.”
(This post was originally published on Fortune on April 29, 2016)
April 15, 2016
Why ‘Modern’ Work Culture Makes People So Miserable
Dan Lyons’ account of his time at the software company HubSpot describes a workplace in which employees are disposable, “treated as if they are widgets to be used up and discarded.” And HubSpot is scarcely unique: The description of Amazon’s work environment is just one of many similar cases. An increasing number of companies offer snacks, foosball, and futuristic jargon to keep employees’ minds off their long hours and omnipresent economic insecurity.
Whether that works, and for how long, is an open question.
Of course, in the new economy, ever fewer companies hire people like Lyons as employees in the first place. Many prefer to use independent contractors for much of their work. A recent working paper by labor economists Lawrence Katz and Alan Krueger concluded that the proportion of the U.S. labor force in alternative work arrangements—working for temporary help agencies or as independent contractors, for example—expanded by some 50% in the decade from 2005 to 2015. Moreover, they wrote, “all of the net employment growth in the U.S. economy from 2005 to 2015 appears to have occurred in alternative work arrangements.”
Three facts about the new deal at work and the new work arrangements are important. First, these new work arrangements are actually old, much like how work was organized before the modern employment relationship originated. Second, these new/old work arrangements represent a natural progression of company policies, begun decades ago, that told people companies owed them nothing beyond the promised pay and, perhaps, work that would make them more employable. And third, by leaving people to navigate the contemporary labor market essentially alone, these arrangements disrupt an old and important reason for working—the opportunity to be part of a group.
The New Work Arrangements are Actually Old
As numerous labor historians and sociologists have documented, before there were large companies employing people and using elaborate human resource policies to govern recruitment, compensation, benefits, and training, work was mostly performed by small entrepreneurs using family labor (as on farms) or by contract labor paid on a piece-work basis.
The modern employment relationship emerged primarily because of enlightened employer self-interest. As UCLA economist Sanford Jacoby documented in his book Employing Bureaucracy, rule-based control over labor emerged as a response to the inefficiencies caused by a capricious employment relationship that left employees alienated and turnover extremely high. Jacoby’s Modern Manors detailed how generous benefits such as paid vacations and pensions arose under a system called “welfare capitalism” as a way to forestall unionization and government regulation. As an example of this, Henry Ford’s famous $5 a day wage and a concomitant reduction in daily work hours from 9 to 8 arose because Ford saw that he needed to pay more to retain workers and ensure they would show up. His famous automobile assembly line couldn’t run with people missing.
Although it is coupled with more computer surveillance and fancy scheduling platforms to pair people with work, today’s use of independent contractors paid on a piece-rate system represents a return to the work arrangements of 140 years ago, not some new managerial innovation.
Employees Beware
Second, the need for employees to fend for themselves is also a hoary idea. The number of employers offering some form of employment security—a no-layoff policy—has declined drastically. What once was a practice cited by many companies on the Great Place to Work list now exists almost nowhere.
Decades ago, employers in Silicon Valley and elsewhere began telling people that employees were responsible for their own careers. A company could at best deliver on promised pay and benefits and hopefully provide workers with jobs that would build their skills and make them more employable (possibly elsewhere).
Almost 30 years ago, Northwestern professor Paul Hirsch wrote Pack Your Own Parachute to provide advice to workers in an era of mergers, downsizing, and outsourcing. Fifteen years later, Dan Pink’s Free Agent Nation achieved best-seller status with its combination of practical suggestions on how to navigate an increasingly market-based labor market along with inspiring stories of free agents who loved their new autonomy.
The idea of reciprocity inside companies—repaying employees’ loyalty to their employer with company loyalty to its workers—is mostly gone. Carnegie Mellon professor Denise Rousseau co-authored a paper reporting that within two years of joining their employer, more than half of the people surveyed reported that the implicit and explicit psychological contracts with their employer had been violated. Research by Virginia business school professor Peter Belmi and me found that when people were told they were making decisions about whether or not to reciprocate a favor, their motivation to do so depended on whether they thought they were in an organizational or an interpersonal context. People put in an organizational mind-set were much less likely to reciprocate.
Companies started to cut employees loose quite a while ago. What we see today is just a continuation of a trend to treat people as human resources, as assets to be acquired and discarded according to the return for doing so.
What’s Missing?
What’s missing from the current labor market is a sense of humanity—as contrasted with lots of emphasis on efficiency, costs, and productivity.
Human beings are social creatures. We crave companionship, seek to be part of communities, and thrive on social support. Not surprisingly, solitary confinement is increasingly under fire as cruel and excessive punishment. Research consistently finds a relationship between social support and health; social support in the workplace can buffer workplace stressors and contributes to physical and mental well-being. Working as free agents for multiple employers can separate people from workmates and from a community that provides both job satisfaction and social support.
Pope Francis’s recent message on family life recognizes the inhumanity and destructiveness of many contemporary workplaces. Describing families as being “under siege” by the pressures of modern life, the Pope noted, “In many cases, parents come home exhausted, not wanting to talk, and many families no longer even share a common meal.” He commented that many families “often seem more caught up with securing their future than with enjoying the present,” a situation aggravated by concerns about financial security and steady employment.
What’s Next
As Jacoby and others have noted, the benign workplace situation of higher wages, employment security, decent benefits, and due process protections largely originated from employers’ desires to control their own work practices. Employers did not want policies subject to either collective bargaining agreements or government regulation. The diminishing role of both the state and organized labor in the contemporary economy—in the U.S. and abroad—undoubtedly has much to do with the evolution of labor market arrangements.
Today, individual companies face a free-rider/collective action problem. If they offer a better deal, beyond a certain point the companies incur costs that their competitors do not. This idea of matching what others do—and no more—has set off a race to the bottom that only seems to abate when challenged by a tightening labor market or political actions such as the current groundswell for higher minimum wages and more paid family leave.
But human needs for safety and security—a part of Maslow’s hierarchy—and for social interaction are primal, existing regardless of competitive pressures, unions, government regulations, and the unemployment rate. This disconnect, between human needs and work arrangements that fulfill them, is one of the costs exacted in contemporary labor market arrangements. And it’s one reason for the anger so visibly playing out in elections all over the industrialized world.
(This post was originally published on Fortune on April 12, 2016)
April 1, 2016
Sorry, Uber. Customer Service Ratings Cannot Replace Managers
The so-called “gig” economy is mostly filled with companies that have few to no employees actually providing the companies’ primary services. Full-time employees at companies such as Uber, Airbnb, Postmates, TaskRabbit, and DoorDash provide public relations, legal services, and marketing, and of course the technical development and maintenance of the software platforms that make the in-home chefs, rides, or rental accommodations possible.
These platform-as-business-model enterprises raise an interesting question: If the people who provide the core services are independent contractors, and if these independent contractors have no supervisors or bosses, how are they managed so the companies can deliver the high quality service necessary to build a good reputation and strong customer retention?
The answer: these people are managed by customer ratings. People who are highly rated are kept on board, and those who are not are dropped. Moreover, customer ratings are increasingly used to evaluate workers even in companies with employees, as the use of customer experience surveys grows. For instance, it is almost impossible to call any business these days and not, as part of the phone tree experience, be asked whether you are willing to do a brief survey at the end of the call. The argument: data on individual performance, derived from surveys and ratings, can replace management and supervision. It’s as simple as that.
But, of course, it isn’t that simple. Here’s why.
First of all, ratings provided by people who may use widely varying criteria and who are not trained in how to do assessments are almost certainly unreliable and invalid. I have shown that there was little correspondence between restaurant ratings on TripAdvisor and lists of Michelin-starred restaurants, and I have reviewed the extensive research demonstrating zero correlation between student ratings of teachers and objective measures of what students learned. Recently, three University of Colorado marketing professors published a study using the Amazon ratings of 1,272 products across some 120 different product categories. They found that the Amazon ratings did not converge either with Consumer Reports ratings or with resale values for products where there was a resale market. Moreover, the consumer ratings showed high dispersion, meaning that there was so much variation across raters that the reliability of the ratings was questionable.
Second, using ratings to drop workers assumes that the people who have been “dismissed” can be readily replaced, and presumably by better performers. That Uber and Lyft have offered various bonuses to sign up drivers suggests that, even in the gig economy, workers may be scarce. And by firing a poor performer and then getting a replacement from essentially the same labor pool, you are relying on random luck to find someone who is going to do a better job.
Intuitively believing that ratings could not do the job of real, human supervisors, I talked to Adi Bittan to further explore my misgivings. Bittan, a former Stanford MBA student, is the co-founder of Owner Listens, a company whose mission is to provide real-time, detailed feedback so that customer issues can be addressed before customers post negative reviews or leave unhappy, never to return. In addition to the two problems already mentioned, here are some more that emerged in our conversation.
Few ratings provide actionable information. Positive or negative ratings arise from numerous behaviors, and just seeing a score doesn’t tell someone either what to keep doing or what to change. Because of the wide variations among individual ratings, people face conflicting messages—there is simply too much noise in the data to know how to respond. For instance, people using a car service may not have liked the radio volume, the choice of music, the cleanliness of the car—who knows what? Receiving comments on specific behaviors is essential for understanding what and how to change.
Next, ratings are not normalized for the person doing the rating (although they could be). Just as some teachers are hard graders and others are easier, some people give high ratings and others rate more stringently. A “4” on a five-point scale means something completely different coming from someone who normally gives 5’s than from someone who mostly gives 3’s.
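Such normalization is straightforward to do. Here is a minimal sketch that re-expresses each score relative to the rater’s own average and spread; the ratings data and the five-point scale are made up purely for illustration.

    # Minimal sketch: normalize ratings by each rater's own tendencies (a per-rater z-score).
    # The data and five-point scale are illustrative; real systems need many ratings per rater.
    from collections import defaultdict
    from statistics import mean, pstdev

    ratings = [  # (rater, worker, score on a five-point scale)
        ("alice", "driver_1", 5), ("alice", "driver_2", 5), ("alice", "driver_3", 4),
        ("bob",   "driver_1", 3), ("bob",   "driver_2", 4), ("bob",   "driver_3", 3),
    ]

    scores_by_rater = defaultdict(list)
    for rater, _, score in ratings:
        scores_by_rater[rater].append(score)

    def normalized(rater, score):
        """Express a score relative to this rater's own mean, in units of their spread."""
        mu = mean(scores_by_rater[rater])
        sd = pstdev(scores_by_rater[rater]) or 1.0  # avoid dividing by zero for uniform raters
        return (score - mu) / sd

    # A "4" is below Alice's usual standard but above Bob's.
    print(round(normalized("alice", 4), 2), round(normalized("bob", 4), 2))  # -1.41 1.41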
Here’s the biggest problem: ratings, unlike supervisors, can’t provide coaching or training about how to improve. As we come to the end of “March Madness,” picture replacing successful basketball coaches with a rating system for players. Yes, such ratings could (and do) distinguish player ability. But those ratings would not provide inspiration and motivation at difficult moments during games and, more importantly, the instruction that helps even talented individuals develop to their full potential.
Bittan believes that ratings are helpful in the case of the simplest, most basic services, when there is less complexity and less learning and training inherent to the task. For more complex and complicated tasks and services, though, there is no substitute for effective supervision.
So, before you think of replacing your managers with rating systems, remember this: according to Brandeis University professor Jody Hoffer Gittell’s book, The Southwest Airlines Way, Southwest, for many years a leader in customer service, had more supervisors with fewer direct reports than its competitors. That’s because Southwest’s managers were expected to provide coaching, which requires more, rather than fewer, real human beings.
(This post was originally published on Fortune on March 31, 2016)
