Michael C. Perkins's Blog

February 5, 2023

Doctors are Demoralized by Our Health System

Doctors have long diagnosed many of our sickest patients with “demoralization syndrome,” a condition commonly associated with terminal illness that’s characterized by a sense of helplessness and loss of purpose. American physicians are now increasingly suffering from a similar condition, except our demoralization is not a reaction to a medical condition, but rather to the diseased systems for which we work.

The United States is the only large high-income nation that doesn’t provide universal health care to its citizens. Instead, it maintains a lucrative system of for-profit medicine. For decades, at least tens of thousands of preventable deaths have occurred each year because health care here is so expensive.

During the Covid-19 pandemic, the consequences of this policy choice have intensified. One study estimates at least 338,000 Covid deaths in the United States could have been prevented by universal health care. In the wake of this generational catastrophe, many health care workers have been left shaken.

“For me, doctoring in a broken place required a sustaining belief that the place would become less broken as a result of my efforts,” wrote Dr. Rachael Bedard about her decision to quit her job at New York City’s Rikers Island prison complex during the pandemic. “I couldn’t sustain that belief any longer.”

Thousands of U.S. doctors, not just at jails but also at wealthy hospitals, now appear to feel similarly. One report estimated that in 2021 alone, about 117,000 physicians left the work force, while fewer than 40,000 joined it. This has worsened a chronic physician shortage, leaving many hospitals and clinics struggling. And the situation is set to get worse. One in five doctors says he or she plans to leave practice in the coming years.

To try to explain this phenomenon, many people have leaned on a term from pop psychology for the consequences of overwork: burnout. Nearly two-thirds of physicians report they are experiencing its symptoms.

But the burnout rhetoric misses the larger issue in this case: What’s burning out health care workers is less the grueling conditions we practice under, and more our dwindling faith in the systems for which we work. What has been identified as occupational burnout is a symptom of a deeper collapse. We are witnessing the slow death of American medical ideology.

It’s revealing to look at the crisis among health care workers as at least in part a crisis of ideology — that is, a belief system made up of interlinking political, moral and cultural narratives upon which we depend to make sense of our social world. Faith in the traditional stories American medicine has told about itself, stories that have long sustained what should have been an unsustainable system, is now dissolving.

During the pandemic, physicians have witnessed our hospitals nearly fall apart as a result of underinvestment in public health systems and uneven distribution of medical infrastructure. Long-ignored inequalities in the standard of care available to rich and poor Americans became front-page news as bodies were stacked in empty hospital rooms and makeshift morgues. Many health care workers have been traumatized by the futility of their attempts to stem recurrent waves of death, with nearly one-fifth of physicians reporting they knew a colleague who had considered, attempted or died by suicide during the first year of the pandemic alone.

Although deaths from Covid have slowed, the disillusionment among health workers has only increased. Recent exposés have further laid bare the structural perversity of our institutions. For instance, according to an investigation in The New York Times, ostensibly nonprofit charity hospitals have illegally saddled poor patients with debt for receiving care to which they were entitled without cost and have exploited tax incentives meant to promote care for poor communities to turn large profits.

Hospitals are deliberately understaffing themselves and undercutting patient care while sitting on billions of dollars in cash reserves. Little of this is new, but doctors’ sense of our complicity in putting profits over people has grown more difficult to ignore.

Resistance to self-criticism has long been a hallmark of U.S. medicine and the industry it has shaped. From at least the 1930s through today, doctors have organized efforts to ward off the specter of “socialized medicine.” We have repeatedly defended health care as a business venture against the threat that it might become a public institution oriented around rights rather than revenue.

This is in part because doctors were told that if health care were made a public service, we would lose our professional autonomy and make less money. For a profession that had fought for more than a century to achieve elite status, this resonated.

And so doctors learned to rationalize a deeply unequal health care system that emphasizes personal, rather than public, moral responsibility for protecting health. We sit at our patients’ bedsides and counsel them on their duty to counteract the risks of obesity, heart disease and diabetes, for example, while largely ignoring how those diseases are tied to poor access to quality food because of economic inequities. Or, more recently, we find ourselves advising patients on how to modulate their personal choices to reduce their Covid risk while they work in jobs with dismal safety practices and labor protections.

Part of what draws us into this norm is that doctors learn by doing — that is, via apprenticeship — in which we repeat what’s modeled for us. This is, to a degree, a necessary aspect of training in an applied technical field. It is also a fundamentally conservative model for learning that teaches us to suppress critical thinking and trust the system, even with its perverse incentives.

It becomes difficult, then, to recognize the origins of much of what we do and whose interests it serves. For example, a system of billing codes invented by the American Medical Association as part of a political strategy to protect its vision of for-profit health care now dictates nearly every aspect of medical practice, not only producing endless administrative work but also subtly shaping treatment choices.

Addressing the failures of the health care system will require uncomfortable reflection and bold action. Any illusion that medicine and politics are, or should be, separate spheres has been crushed under the weight of over 1.1 million Americans killed by a pandemic that was in many ways a preventable disaster. And many physicians are now finding it difficult to quash the suspicion that our institutions, and much of our work inside them, primarily serve a moneymaking machine.

Doctors can no longer be passive witnesses to these harms. We have a responsibility to use our collective power to insist on changes: for universal health care and paid sick leave but also investments in community health worker programs and essential housing and social welfare systems.

Neither major political party is making universal health care a priority right now, but doctors nonetheless hold considerable power to initiate reforms in health policy. We can begin to exercise it by following the example of colleagues at Montefiore Medical Center in the Bronx who, like thousands of doctors before them, recently took steps to unionize. If we can build an organizing network through doctors’ unions, then proposals to demand universal health care through use of collective civil disobedience via physicians’ control over health care documentation and billing, for example, could move from visions to genuinely actionable plans.

Regardless of whether we act through unions or other means, the fact remains that until doctors join together to call for a fundamental reorganization of our medical system, our work won’t do what we were promised it would do, nor will it prioritize the people we claim to prioritize. To be able to build the systems we need, we must face an unpleasant truth: Our health care institutions as they exist today are part of the problem rather than the solution.

-NYT

October 20, 2022

HOW DEMOCRATS LOST THE WORKING CLASS

Have you ever wondered what happened to the New Deal? I did. And it wasn’t until I read Matt Stoller’s book, Goliath, that I found out.

It goes back to my Boomer generation entering Congress in 1975.

Here they encountered old-time populists like Wright Patman, Chairman of the Banking Committee. Patman had never gone to college, but he had been a crusading economic populist during the Great Depression. What made him unappealing to the newbies was his support of segregation and the Vietnam War; the young Boomers, by contrast, had been weaned on campus politics, television, and affluence.

“The populism of the 1930s doesn’t really apply to the 1970s,” argued Pete Stark, a California member.

For more than a decade, Patman had represented a Democratic political tradition stretching back to Thomas Jefferson, an alliance of the agrarian South and the West against Northeastern capital.
For decades, Patman had sought to hold financial power in check, investigating corporate monopolies, high interest rates, the Federal Reserve, and big banks. But the banking allies on the committee had had enough of Patman’s hostility to Wall Street.

As one critic complained, Patman “committed one cardinal sin as chairman. ... He wants to investigate the big bankers.”

According to Stoller, not all on the left were swayed. Barbara Jordan, the renowned representative from Texas, spoke eloquently in Patman’s defense. Ralph Nader raged at the betrayal of a warrior against corporate power. And California’s Henry Waxman, one of the few populist Boomers, broke with his class, puzzled by all the liberals who opposed Patman’s chairmanship. Still, Patman was out. Of the three chairmen who were forced out, Patman lost by the biggest margin.

Since that time, a majority of Democratic Boomers have become pro-bank. This was seen again in Hillary Clinton’s cozy relationship with Goldman Sachs, as well as with the numerous banks that contributed to her campaign.

In 1968, there was a great debate about the future of the Democratic Party. Robert F. Kennedy sought to win the primary with a “black-blue” coalition of black “have-nots” and working-class whites. He sought continuity in the policies of protecting independent farmers, shopkeepers, and workers, all of which formed the heart of the New Deal—yet he also wanted to end the war in Vietnam and expand racial justice. But Kennedy’s strategy to merge these ideas disappeared when he was assassinated.

Democratic strategist Fred Dutton forged a new coalition. By quietly cutting back the influence of unions, Dutton sought to eject the white working class from the Democratic Party, which he saw as “a major redoubt of traditional Americanism and of the anti-Negro, anti-youth vote.” The future, he argued, lay in a coalition of African Americans, feminists, and affluent, young, college-educated whites.

With key intellectuals in the Democratic Party increasingly agreeing with Republican thought leaders on the virtues of corporate concentration, the political economic debate changed drastically. Henceforth, the economic leadership of the two parties would increasingly argue not over whether concentrations of wealth were threats to democracy or to the economy, but over whether concentrations of wealth would be centrally directed through the public sector or managed through the private sector—a big-government redistributionist party versus a small-government libertarian party.

“The Neoliberal Club” emerged. Disciples like Gary Hart, Bill Bradley, Bill Clinton, Bruce Babbitt, Richard Gephardt, Michael Dukakis, Al Gore, Paul Tsongas, and Tim Wirth were all essentially representatives of the Baby Boom generation. Most Democratic presidential candidates for the next 25 years came from this pool of leaders.

Meanwhile, when Reagan came into office, one of his most extreme acts was to eliminate the New Deal anti-monopoly framework. He continued Carter’s deregulation of finance, but Reagan also stopped a major antitrust case against IBM and adopted Robert Bork’s view of antitrust as policy. The result was a massive merger boom and massive concentration in the private sector.

Later, Bill Clinton stripped antitrust out of the Democratic platform; it was the first time a reference to monopoly power was not in the platform since 1880. He also championed the repeal of the Glass-Steagall Act that separated commercial banking from investment banking.

However, in 2000, the American people didn’t reward the Democrats with majorities in Congress or an Oval Office victory. In particular, the rural parts of the country in the South, which had been a traditional area of Democratic strength up until the 1970s, were strongly opposed to this new Democratic Party. White working-class people did not perceive the benefits of the “greatest economy ever.”

It turns out, according to a McKinsey study, that a disproportionately large share of the productivity gains from the remarkable computerization of the economy came from just one company: Walmart, the new A&P. The mega store’s economic influence “reached levels not seen by a single company since the 19th century.”

The gains of the 1990s, it turns out, were not structural, but illusory. Early in Bush’s term, the stock-market bubble burst and wages collapsed. A few years later, a global banking crisis, induced by a financial sector that had steadily gained power for 40 years, erupted. Concentration of power in the private sector, it turned out, had its downsides.

By 2008, the ideas that took hold in the 1970s had been Democratic orthodoxy for two generations. “Left-wing” meant opposing war, supporting social tolerance, advocating environmentalism, and accepting corporatism and big finance while also seeking redistribution via taxes. The Obama administration was ideologically consistent with the rejection of the old populism.

In the last seven years, another massive merger boom has occurred, with concentrations accruing in the hospital, airline, telecommunications, and technology industries. This is the world of the Democratic Boomers and the libertarian and statist thinkers who shaped their intellectual understanding of it.

Trump’s emergence would not be a surprise to someone like Patman, or to most New Dealers. They would note that the real-estate mogul’s authoritarianism is not new in American culture; it is ubiquitous. It is consistent with how the commercial sphere has developed since the 1970s. Americans feel a lack of control: They are at the mercy of distant forces, their livelihoods dependent on the arbitrary whims of power. Patman once attacked chain stores as un-American, saying, “We, the American people, want no part of monopolistic dictatorship in … American business.” Having yielded to monopolies in business, the nation must now face the un-American threat to democracy Patman warned they would sow.


=========


Matt Stoller’s book, Goliath

https://www.goodreads.com/book/show/4...

Companion book by Thomas Frank

https://www.goodreads.com/book/show/2...

June 10, 2022

If America fails to punish its insurrectionists, it could see a wave of domestic terror

We must not repeat the mistakes of the years after the 1860s war for white supremacy that we call the civil war

by Steve Phillips, The Guardian

The last time the United States failed to properly punish insurrectionists, they went on to form the Ku Klux Klan, unleash a reign of murderous domestic terrorism, and re-establish formal white supremacy in much of the country for more than 100 years. As the House select committee investigating the January 6 Capitol attack begins televised hearings this week, the lessons from the post-civil war period offer an ominous warning for this current moment and where we go from here.

It is often difficult to sustain the requisite sense of urgency about past events, however dramatic and shocking they may have been at the time. Memories fade, new challenges arise and the temptation to put it all behind us and move on is strong. On top of all that, Republicans quickly and disingenuously called for “unity”, mere days after failing to block the peaceful transfer of power. If we want to preserve our fragile democracy, however, Congress and the president must learn from history and not make the same mistakes their predecessors did in the years after the 1860s war for white supremacy that we call the civil war.

In 1860, many people believed that America should be a white nation where Black people could be bought and sold and held in slavery. The civil war began when many of the people who held that view refused to accept the results of that year’s presidential election. They first plotted to assassinate President-elect Abraham Lincoln (five years later, they would succeed). Then they seceded from the Union, and shortly thereafter started shooting and killing people who disagreed with them. By the end of the war, 2% of the entire country’s population had been killed, the equivalent of roughly 7 million deaths relative to today’s US population.

Despite the rampant treason and extraordinary carnage of the war, the country’s political leaders had little appetite for punishing their Caucasian counterparts who had done their level best to destroy the United States of America. After Confederate sympathizer John Wilkes Booth successfully assassinated Lincoln in 1865, Andrew Johnson ascended to the highest office in the land. Johnson, a southerner who “openly espoused white supremacy”, “handed out pardons indiscriminately” to Confederate leaders and removed from the south the federal troops protecting newly freed African Americans.

The historian Lerone Bennett Jr captured the tragedy of the moment in his book Black Power USA: The Human Side of Reconstruction, 1867-1877, writing: “Most Confederate leaders expected imprisonment, confiscation, perhaps even banishment. Expecting the worst, they were willing to give up many things in order to keep some. If there was ever a moment for imposing a lasting solution to the American racial problem, this was it. But the North dawdled and the moment passed. When the Confederates realized that the North was divided and unsure, hope returned. And with hope came a revival of the spirit of rebellion … this was one of the greatest political blunders in American history.”

With that revival of white supremacist hope came ropes and robes and widespread domestic terrorism. Mere months after the ostensible end of the civil war in April 1865, half a dozen young white southern Confederate war veterans gathered in Pulaski, Tennessee, in December 1865 to discuss what to do with their lives, and they decided to form a new organization called the Ku Klux Klan. The first Grand Wizard of the KKK, Nathan Bedford Forrest, was a Confederate general who had been pardoned by Johnson. In less than one year, Forrest would go on to orchestrate “336 cases of murder or assault with intent to kill on freedmen across the state [of Georgia] from January 1 through November 15 of 1868”.

[Forrest, BTW, was Shelby Foote's hero in his Civil War history. It tells us all we need to know about the trustworthiness of Foote's book, or lack thereof. Ken Burns has come out recently and admitted he was naive in abetting Foote in the Civil War doc]

The effectiveness of the domestic terrorism in crushing the country’s nascent multiracial democracy was unsurprising and undeniable. In Columbia county, Georgia, 1,222 votes had been cast for the anti-slavery party in April 1868; after the reign of terror that year, the party received just one vote in November.

Lest we think this was all a long time ago, the House committee hearings are about to remind us all that we had an insurrection just last year. Not only did a violent mob attack the country’s elected leaders and attempt to block the peaceful transfer of power, but even after the assault was repelled, 147 Republicans – the majority of the Republican members in Congress – refused to accept the votes of the American people in their attempt to overthrow the elected government of the United States of America.

And far from being chastened, the enemies of democracy in the Republican party have only become emboldened, like their Confederate counterparts of the last century. Just as happened in the years after the civil war when the prospect of large-scale Black voting threatened white power and privilege, the defenders of white nationalism have engaged in a legislative orgy of passing pro-white public policies, from trying to erase evidence of racism and white supremacy from public school instruction to laws making it increasingly difficult for people of color to cast ballots. As journalist Ron Brownstein has warned, “The two-pronged fight captures how aggressively Republicans are moving to entrench their current advantages in red states, even as many areas grow significantly more racially and culturally diverse. Voting laws are intended to reconfigure the composition of today’s electorate; the teaching bans aim to shape the attitudes of tomorrow’s.”

All of this is happening because the insurrectionists have not and believe they will not be punished. But it doesn’t have to be this way. Democrats control Congress and the White House, and they can take strong and decisive action to ensure appropriate consequences for people who seek to undermine democracy. The House of Representatives impeached Donald Trump in 2021 for incitement of insurrection, and Congress can still invoke the 14th amendment’s provision banning from office any person who has “engaged in insurrection”.

All those who aided and abetted Trump’s insurrection should face the full force of the laws that are designed to protect the multiracial democracy that the majority of Americans want. The fate of democracy in America is quite literally at stake.

June 5, 2022

Memory Laws

Russian policies belong to a growing international body of what are called “memory laws”: government actions designed to guide public interpretation of the past. Such measures work by asserting a mandatory view of historical events, by forbidding the discussion of historical facts or interpretations or by providing vague guidelines that lead to self-censorship.

This spring, memory laws arrived in America. Republican state legislators proposed dozens of bills designed to guide and control American understanding of the past. As of this writing, five states (Idaho, Iowa, Tennessee, Texas and Oklahoma) have passed laws that direct and restrict discussions of history in classrooms. The Department of Education of a sixth (Florida) has passed guidelines with the same effect.

The most common feature among the laws, and the one most familiar to a student of repressive memory laws elsewhere in the world, is their attention to feelings.

History is not therapy, and discomfort is part of growing up. As a teacher, I cannot exclude the possibility, for example, that my non-Jewish students will feel psychological distress in learning how little the United States did for Jewish refugees in the 1930s. I know from my experience teaching the Holocaust that it often causes psychological discomfort for students to learn that Hitler admired Jim Crow and the myth of the Wild West.

Teachers in high schools cannot exclude the possibility that the history of slavery, lynchings and voter suppression will make some non-Black students uncomfortable. The new memory laws invite teachers to self-censor, on the basis of what students might feel — or say they feel. The memory laws place censorial power in the hands of students and their parents. It is not exactly unusual for white people in America to express the view that they are being treated unfairly; now such an opinion could bring history classes to a halt.

The memory laws arise in a moment of cultural panic when national politicians are suddenly railing against “revisionist” teachings.

A hundred years after the Tulsa massacre, almost to the day, the Oklahoma Legislature passed its memory law. Oklahoman educational institutions are now forbidden to follow practices in which “any individual should feel discomfort, guilt, anguish or any other form of psychological distress” on any issue related to race. (This has already led to at least one community college canceling a class on race and ethnicity.)

My experience as a historian of mass killing tells me that everything worth knowing is discomfiting; my experience as a teacher tells me that the process is worth it. Trying to shield young people from guilt prevents them from seeing history for what it was and becoming the citizens that they might be. Part of becoming an adult is seeing your life in its broader settings. Only that process enables a sense of responsibility that, in its turn, activates thought about the future.

Democracy requires individual responsibility, which is impossible without critical history. It thrives in a spirit of self-awareness and self-correction.

Authoritarianism, on the other hand, is infantilizing: We should not have to feel any negative emotions; difficult subjects should be kept from us. Our memory laws amount to therapy, a talking cure. In the laws’ portrayal of the world, the words of white people have the magic power to dissolve the historical consequences of slavery, lynchings and voter suppression. Racism is over when white people say so.

We start by saying we are not racists. Yes, that felt nice. And now we should make sure that no one says anything that might upset us. The fight against racism becomes the search for a language that makes white people feel good. The laws themselves model the desired rhetoric. We are just trying to be fair. We behave neutrally. We are innocent.

-Timothy Snyder

June 1, 2022

A Pulitzer winner who was actually a thief

Researched and documented in detail, this piece shows that Wallace Stegner’s plagiarism was flat-out theft. Non-writers don’t seem to get it. What if you came home from vacation to find your house, cars, and everything else gone? Plagiarism is even worse: the writing is the artist’s personal creation, and someone with no ideas and no talent is taking it as their own.

https://www.newyorker.com/books/page-...

===============

Definition of plagiarism. Given what's explained above, three out of the four bullet points would apply.

https://www.plagiarism.org/article/wh...

May 22, 2022

Who Killed Jane Stanford? (WSJ)

Stanford University opened its doors to students in 1891, just two years before its co-founder Leland Stanford died. For a dozen years after her husband’s death, Jane Stanford almost single-handedly financed and oversaw the fledgling institution—including fending off a legal challenge to the Stanford estate that went to the U.S. Supreme Court—on its journey to becoming one of the world’s wealthiest and most exclusive universities.

Despite these accomplishments, Jane Stanford has been largely overlooked by historians of the American West. But writers and readers with an interest in lurid crime are fascinated by her horrifying demise. In February of 1905, she died an agonizing death in a Honolulu hotel room; she had ingested strychnine. Jane Stanford’s poisoning remains one of the most dramatic unsolved murders of its day.

In “Who Killed Jane Stanford? A Gilded Age Tale of Murder, Deceit, Spirits, and the Birth of a University,” the historian Richard White turns his sights on this dramatic true-crime story. An emeritus professor at Stanford and author of “Railroaded,” a history of the transcontinental rail boom, Mr. White is also a brilliant, acerbic guide into a world that resonates with the present.

“In an age of surreal conspiracy theories,” Mr. White writes in the preface, the cover-up of Jane Stanford’s unsolved murder “is a reminder that conspiracies can be quite real.” Offering a detective story with more twists and turns than a Dashiell Hammett novel, Mr. White leads us through his research into the labyrinth. Along the way, Mr. White uncovers a century-long campaign kicked off by the university’s first president to cloud the circumstances of Jane Stanford’s death. But he fails to make us care much for any of this dastardly cast of characters—including the victim.

“Who Killed Jane Stanford?” puts to rest any lingering doubts that the university’s co-founder was, in fact, murdered. After surviving an initial poisoning attempt at her Nob Hill mansion, Stanford fled to Waikiki. She was accompanied by her long-serving private secretary, Bertha Berner, and a maid. After a pleasant picnic on the Pali—a cliff overlooking the Pacific—with freshly baked gingerbread, Stanford and Berner returned to their hotel, had an early supper, and retired for the evening. Not long after, Berner and the maid heard Jane crying out for help and saw her clinging to the doorframe. She realized she’d been poisoned, exclaiming “This is a horrible death to die!”

Many of the more gruesome details of Stanford’s death already had been uncovered by a retired Stanford neurology professor, Robert W.P. Cutler. For his slim 2003 book “The Mysterious Death of Jane Stanford,” Cutler obtained and analyzed the toxicologist’s report and coroner’s inquest from Hawaii but declined to name a killer. Acknowledging his debt to Cutler, Mr. White reveals the “why” of the story, pulling out the lens for a wider view. He uses Jane Stanford’s death, as he puts it, “to reveal the politics, power struggles, and scandals of Gilded Age San Francisco.”

As a leading chronicler of that era, Mr. White is an adept and engaging tour guide to this corrupt and vivid world, as well as to Jane Stanford’s devotion to spiritualism. I found myself envying the undergraduates in Mr. White’s classroom, in which he used the murder to teach historical research methods. He brought the class to the university archives, where an archivist pulled Jane Stanford’s death mask from its box and showed it to the students, “who reacted with audible gasps as if her corpse had walked in the room. At that moment Jane Stanford leaped across the century and came among us like an apparition at one of her séances.”

Mr. White, however, is not seeking justice for the victim. “I wish I could say that seeing the mask of Jane Stanford’s face, only a little more than a day dead, sparked a desire in me to see justice done. It didn’t. She was dead. I saw an aged woman and I initially wondered not who killed her, but why?” Jane Stanford becomes a means to an end for Mr. White, her death a frame for a tale of social turmoil that detours into San Francisco’s underworld and the links that connected bribe-taking police with the Chinese gangs known as “tongs.”

On the whole, the portrait rendered of Stanford, who became one of California’s leading philanthropists, is not appealing. She appears here as a selfish and unlikable rich woman, who spent the final decades of her life focused on aggrandizing herself, her dead husband, and their much-mourned son, Leland Jr, who had died of typhoid at age 15. In his epilogue, Mr. White confesses to sympathizing with at least one suspect, who endured Stanford’s demands and cruelty. The subtitle of Mr. White’s book might as well have been “Why she deserved what she got.”

The result is that the narrative largely relegates its subject to the role of victim—rather than to her rightful place as the forceful woman who kept the university alive in its early years. That emphasis may, in part, be due to the incomplete or distorted material Mr. White had to work with. Adding another layer of mystery to the case, he notes that “rarely have I encountered more documents that have vanished and more collections and reports that have gone missing than in this research.” Those absences were intentional, as Mr. White details.

“Who Killed Jane Stanford?” shows us how the university’s first president, David Starr Jordan, working alongside Jane’s brother Charles Lathrop, pulled every possible lever in corrupt San Francisco to make sure the police investigation into her death petered out and the newspapers dropped the murder story. Both men wanted to foreclose challenges to her bequests that a murder charge could trigger.

Just as Mr. White is unsympathetic to Stanford, he’s equally searing in his treatment of Jordan and Lathrop. San Francisco detectives “wanted to eliminate the murder, not the murderer,” observed the sheriff in Hawaii who first oversaw the case. They succeeded, as did Jordan, who had been about to lose his post as university president. Stanford’s death meant that didn’t happen: Jordan remained in leadership at the university for another eleven years. Mr. White declares him an “accessory after the fact.”

Mr. White has done an astonishing job of sifting through the available clues—and turning up an impressive array of new details, including a mysterious pharmacist with shifting addresses. He teases us through nearly 300 pages before naming the person he believes was responsible for Stanford’s murder—though towards the end, he admits, “the evidence is circumstantial.” Mr. White does not produce the smoking gun to definitively solve this famous murder. But he does—in the parlance of early 20th century detectives—thoroughly “sweat” the historical record. In this fascinating “whydunit,” he makes a convincing case for why Jane Stanford’s murder was covered up for so long.

May 19, 2022

WSJ: crypto is a libertarian scam

To its advocates, cryptocurrency is, at its heart, a libertarian project to free mankind from the shackles of government—most of all its power to debase a “fiat” currency by printing more of it. Do Kwon, the South Korean creator of the stablecoin TerraUSD, regularly equated fiat currency to “state violence.”

So when inflation took off, crypto’s supporters were triumphant. Bitcoin’s value “is telling us that the central banks are bankrupt, that we are at the end of the fiat money regime,” venture capitalist and bitcoin investor Peter Thiel declared in April.

Then a funny thing happened. As the Federal Reserve responded to rising inflation by raising interest rates, fiat currency rallied big time. Bitcoin has fallen 30% against the dollar since Mr. Thiel’s comments. TerraUSD, which is supposed to trade one-for-one with the dollar, now trades eight-for-one. In fiat money terms, crypto’s total value has plummeted by 56%, or $1.6 trillion, since November.

Perhaps this is just another of crypto’s many temporary downdrafts. Or perhaps rising interest rates have exposed the hollowness of crypto’s libertarian promise.

Bubbles are a regular byproduct of our financial system, from dot-com stocks in the late 1990s to subprime mortgages in the mid-2000s to green technology recently. Crypto was different: It sought to replace the financial system altogether with one that was faster, cheaper, less under the thumb of government and more accessible to the poor.

It has had 13 years to make that case, and failed. Bitcoin comprises just 0.2% of international remittances, according to Manuel Orozco of the Inter-American Dialogue, a U.S.-based think tank. El Salvador made bitcoin legal tender last September and heavily subsidized its adoption. Usage has since plunged; only 20% of companies in El Salvador accept it and less than 5% of sales are conducted in bitcoin, according to an April study. The poor, it turns out, don’t need a new currency: They need cheaper ways to use the old one. Crypto makes day-to-day transactions more expensive, not less. Bitcoin ATM fees can range from 7% to 20%, and transaction charges from $1.78 to $62. The only lines of business to truly embrace crypto are those allergic to oversight, such as ransomware and sanctions busting.

Having failed as a medium of exchange, crypto survives as an asset class: Today, crypto is primarily used to trade other crypto. Here, too, libertarian arguments are made for crypto’s superiority over more regulated assets like equities. A stock “is a government-linked entity,” Mr. Thiel said. “Woke companies are sort of quasi-controlled by the government in a way that bitcoin never will be.”

Brian Brooks, chief executive of Bitfury Group, a bitcoin-mining company, and a former Trump-appointed bank regulator, told Congress last year: “Unlike the IPO boom, unlike venture capital, [crypto] doesn’t require that you know a guy, or that you be well-connected, or that you be an accredited investor to participate. This is a chance for underrepresented communities to be in on the wealth creation stage of some new thing, as opposed to coming in at the end.” That, he said, is why “there are more minority investors than white investors in crypto.”

There are, of course, profound differences between stocks and crypto. Stocks have intrinsic value: they are a claim on a company’s cash flow. A stock’s price may be out of whack with that cash flow, but at least you can make a judgment. Stocks can go to zero and investors can lose fortunes. But those risks are mitigated by regulations: companies must disclose information material to their share price, mutual funds must report their assets, and securities brokers and their customers must meet certain criteria. This regulation has costs, including barriers to entry.

Instead of standardized regulatory filings, cryptocurrency issuers publish jargony “white papers” to the internet. Aside from some stablecoins, cryptocurrencies are backed by no tangible assets, so even outlandish predictions of their value are unfalsifiable. Crypto promoters argue crypto isn’t a security and shouldn’t be regulated as such, and have spent and recruited heavily to make those views heard in Washington. So while regulators have pushed back and brought enforcement cases, laissez-faire has by and large prevailed at the federal level.

That means barriers to entry and investor protections are low. TerraUSD’s meltdown illustrates the perils. Stablecoins typically peg themselves to the dollar and hold a reserve of actual dollars in a bank deposit to redeem the coins. TerraUSD was an algorithmic stablecoin backed only by another coin called Luna and by a now-depleted reserve fund of bitcoin and other cryptocurrencies, i.e., nothing tangible.
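The peg mechanics being contrasted here are worth spelling out. Below is a minimal sketch in Python of the two designs: a reserve-backed coin redeems against actual dollars held on deposit, while an algorithmic coin like TerraUSD redeems against freshly minted units of a sister token, so heavy redemptions inflate that token's supply and can feed a collapse in its price. The function names, starting figures, and the assumed 50%-per-wave price drop are illustrative assumptions, not the actual Terra protocol.

# A minimal sketch contrasting two stablecoin peg designs. All names and
# numbers here are illustrative assumptions, not the actual Terra protocol.

def redeem_reserve_backed(coins, reserve_usd):
    """Reserve-backed coin: redemptions are paid out of real dollars on deposit."""
    paid = min(coins, reserve_usd)        # can only pay what the reserve holds
    return paid, reserve_usd - paid       # (dollars paid, reserve remaining)

def redeem_algorithmic(coins, sister_price, sister_supply):
    """Algorithmic peg: burning 1 coin mints $1 worth of a sister token.
    The only 'backing' is the market value of the newly minted tokens."""
    minted = coins / sister_price         # tokens minted to honor $1 per coin
    return minted, sister_supply + minted # heavy redemptions inflate supply

if __name__ == "__main__":
    # Toy death spiral: assume each $1 billion redemption wave halves the
    # sister token's price as the market absorbs the dilution.
    price, supply = 80.0, 350e6
    for wave in range(1, 6):
        minted, supply = redeem_algorithmic(1e9, price, supply)
        price *= 0.5
        print(f"wave {wave}: minted {minted:>14,.0f} tokens, price ${price:.2f}")

Each wave has to mint more tokens than the last to honor the same dollar value; once the sister token's price collapses, there is nothing left to redeem against.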

With libertarian logic, Mr. Kwon once argued this made TerraUSD superior to regular stablecoins, which are “held hostage to whoever feels like they have control over the underlying bank deposits.” TerraUSD offered “decentralization purity in the sense that there’s nobody that can freeze your assets…It’s a lot more robust from regulation,” Mr. Kwon said. Of course, that meant when the combined value of TerraUSD and Luna went from $48 billion to under $3 billion in less than two weeks, there wasn’t much in the way of assets for investors, either. (Mr. Kwon has announced a plan to distribute 1 billion tokens of a new version of Luna to existing Luna and TerraUSD holders and developers.)

Investors, including from underrepresented communities, who shared in crypto’s wealth creation are now sharing in its wealth destruction. Caveat emptor, one might say. Except, as Timothy Massad, former chairman of the Commodity Futures Trading Commission, notes, “We’ve decided that caveat emptor in the financial markets is not a good way to grow markets overall…Financial access and inclusion needs to come with a reasonable framework of investor and consumer protection.”

March 11, 2022

Memory Laws

from Timothy Snyder....


Russian policies belong to a growing international body of what are called “memory laws”: government actions designed to guide public interpretation of the past. Such measures work by asserting a mandatory view of historical events, by forbidding the discussion of historical facts or interpretations or by providing vague guidelines that lead to self-censorship.

This spring, memory laws arrived in America. Republican state legislators proposed dozens of bills designed to guide and control American understanding of the past. As of this writing, five states (Idaho, Iowa, Tennessee, Texas and Oklahoma) have passed laws that direct and restrict discussions of history in classrooms. The Department of Education of a sixth (Florida) has passed guidelines with the same effect.

The most common feature among the laws, and the one most familiar to a student of repressive memory laws elsewhere in the world, is their attention to feelings.

History is not therapy, and discomfort is part of growing up. As a teacher, I cannot exclude the possibility, for example, that my non-Jewish students will feel psychological distress in learning how little the United States did for Jewish refugees in the 1930s. I know from my experience teaching the Holocaust that it often causes psychological discomfort for students to learn that Hitler admired Jim Crow and the myth of the Wild West.

Teachers in high schools cannot exclude the possibility that the history of slavery, lynchings and voter suppression will make some non-Black students uncomfortable. The new memory laws invite teachers to self-censor, on the basis of what students might feel — or say they feel. The memory laws place censorial power in the hands of students and their parents. It is not exactly unusual for white people in America to express the view that they are being treated unfairly; now such an opinion could bring history classes to a halt.

The memory laws arise in a moment of cultural panic when national politicians are suddenly railing against “revisionist” teachings.

A hundred years after the Tulsa massacre, almost to the day, the Oklahoma Legislature passed its memory law. Oklahoman educational institutions are now forbidden to follow practices in which “any individual should feel discomfort, guilt, anguish or any other form of psychological distress” on any issue related to race. (This has already led to at least one community college canceling a class on race and ethnicity.)

My experience as a historian of mass killing tells me that everything worth knowing is discomfiting; my experience as a teacher tells me that the process is worth it. Trying to shield young people from guilt prevents them from seeing history for what it was and becoming the citizens that they might be. Part of becoming an adult is seeing your life in its broader settings. Only that process enables a sense of responsibility that, in its turn, activates thought about the future.

Democracy requires individual responsibility, which is impossible without critical history. It thrives in a spirit of self-awareness and self-correction.

Authoritarianism, on the other hand, is infantilizing: We should not have to feel any negative emotions; difficult subjects should be kept from us. Our memory laws amount to therapy, a talking cure. In the laws’ portrayal of the world, the words of white people have the magic power to dissolve the historical consequences of slavery, lynchings and voter suppression. Racism is over when white people say so.

We start by saying we are not racists. Yes, that felt nice. And now we should make sure that no one says anything that might upset us. The fight against racism becomes the search for a language that makes white people feel good. The laws themselves model the desired rhetoric. We are just trying to be fair. We behave neutrally. We are innocent.

January 12, 2022

Aldous Huxley and Eric Blair (Orwell) at Eton

Eton College, June 1918. The almost blind Aldous Huxley was by all accounts a truly hopeless teacher. He was tall and gaunt, dressed after the fashion of Oscar Wilde, and was considered an eccentric fop. Blair was one of the few who admired him. Emboldened by Huxley’s myopia, the boys at the back of the class were playing cards. In an effort to quell their increasingly rowdy behaviour, made worse by the fact it was the last day of the school year, Huxley set the class regular tests, which, he assured them unconvincingly, would count toward their standing in the grades.

This test was simple: Whom do you consider the ten greatest men now living?

‘Pens down, gentlemen,’ Huxley said, feebly. ‘Pass your papers to the front.’ He stood before the blackboard, flicking rapidly through the papers, which he held close to his face. ‘Who is going to volunteer to tell me their answer?’

‘I can, sir.’ It’s Blair [George Orwell]. He wasn’t by nature a keen student, but thought the class’s mocking of the afflicted Huxley scandalous.

‘Yes, Blair, I can see you.’ He handed Blair’s paper back to him. ‘Your top ten, please.’

Sitting at his desk, he read it out. ‘Wells, obviously, sir. Shaw, equally obviously. Galsworthy. Jack London. Henri Barbusse.’

‘Barbusse, Blair? Bravo,’ Huxley said. ‘But where on earth did you get a copy of Le Feu?’ The anti-war novel was considered almost seditious.

‘From the provost, sir.’

‘That’s five so far, Blair. Continue.’

‘Bertrand Russell—’

‘Oh, Blair!’ one boy called out, ‘this is just too funny. I knew you were a Shavian and a Red, but a conchie [conscientious objector]? This is hilarious.’ The card-playing boys threw crumpled papers at him.

He dodged the missiles and kept going. ‘Keir Hardie—’

‘Dead!’ someone yelled out. ‘You of all people should know that, Blair.’

‘Bukharin—’


‘Never heard of him!’

‘Editor of Pravda,’ said Huxley. ‘Reputedly the most brilliant of all the Bolsheviks.’

More laughter and missiles.

‘Trotsky.’

‘That’s nine, Blair.’

‘And Lenin.’

‘Lenin indeed, Mr Blair. I’ve just had a glance at all sixteen papers, and fifteen name Lenin. Why do you think that is?’

‘He represents the future happiness and freedom of man, sir.’

‘Do you really think so? Blair, does the future happiness and freedom of man lie in Jacobinism? Did Cromwell and his Ironsides deliver happiness?’

‘It lies in equality.’

‘Imposed by force? As under Robespierre and the Terror? You’ve read Carlyle?’

‘Yes, sir. Violence … well, it seems necessary, sometimes.’

‘Ah, so you think the two things – happiness and equality – are the same? What about freedom?’

‘The poor are always unfree.’

‘I take it you got that from Jack London? What if, instead of mandating communism, we gave the people what they wanted?’

‘You mean equality, sir?’

‘No. What they’re actually asking for. I mean happiness – peace, nice clothes, an annual holiday at the seaside, an ice every day, free beer, oriental mistresses, everyone with their own motorcar and aeroplane. No work, no need to think or worry.’


‘A society based on the principle of hedonism?’

‘Yes, Blair. Shallow, gutless hedonism. Happiness! With little to complain about or agitate for, people will be easily governed, don’t you think? Isn’t that ultimately what Mr Lenin is promising the Russians – a complete absence of material hardship for everyone forever? Universal happiness? An end to politics?’

‘He’s promising it to all the working classes of the world, sir, not just the Russians,’ added Cyril Connolly, his friend. ‘Or was that Trotsky? I can’t remember.’

‘Which will make it all just that much more difficult to achieve, Connolly, when the dictatorship eventually ends,’ Huxley continued. ‘And if its goal of material progress and equality fails, what will be left? Mr Blair, what do you think will be left?’

‘I don’t know, sir.’

‘Think about it, Blair. What will be left will be the very things you started with: force and terror. The dictatorship of the proletariat, forever.’

https://www.goodreads.com/book/show/3...

April 12, 2021

the fallacy of “whataboutism”

WSJ, June 2017

In his interview with NBC’s Megyn Kelly on Sunday, Russian President Vladimir Putin employed the tried-and-true tactic of “whataboutism.” When asked about Russia’s reported meddling in American elections, he changed the subject to U.S. interference abroad: “Put your finger anywhere on a map of the world, and everywhere you will hear complaints that American officials are interfering in internal election processes.”

As Michael McFaul, a former ambassador to Russia, observed on Twitter, “Putin plays classic whataboutism card when asked about interference in US elections.”

“Whataboutism” is another name for the logical fallacy of “tu quoque” (Latin for “you also”), in which an accusation is met with a counter-accusation, pivoting away from the original criticism. The strategy has been a hallmark of Soviet and post-Soviet propaganda, and some commentators have accused President Donald Trump of mimicking Mr. Putin’s use of the technique. When former Fox News host Bill O’Reilly called Mr. Putin “a killer” in an interview in February, Mr. Trump responded, “We got a lot of killers—what, you think our country’s so innocent?”

The term was popularized by articles in 2007 and 2008 by Edward Lucas, senior editor at the Economist. Mr. Lucas, who served as the magazine’s Moscow bureau chief from 1998 to 2002, saw “whataboutism” as a typical Cold War style of argumentation, with “the Kremlin’s useful idiots” seeking to “match every Soviet crime with a real or imagined western one.”

But the roots of “whataboutism” actually go back to the decadeslong sectarian struggle between unionists and nationalists in Northern Ireland, known as the Troubles.

On Jan. 30, 1974, the Irish Times published a letter to the editor from Sean O’Conaill, a history teacher from the town of Coleraine in Northern Ireland. Mr. O’Conaill wrote of “the Whatabouts,” his name for “the people who answer every condemnation of the Provisional I.R.A. with an argument to prove the greater immorality of the ‘enemy.’ ”

Three days later, in the same newspaper, John Healy picked up the theme in his “Backbencher” column, citing Mr. O’Conaill’s letter. “We have a bellyful of Whataboutery in these killing days, and the one clear fact to emerge is that people, Orange and Green, are dying as a result of it,” he wrote.

Commentators on the Troubles embraced the term “whataboutery” and frequently mentioned it in the ensuing years of strife. The “whataboutism” variant appeared as early as 1993, in Tony Parker’s book “May the Lord in His Mercy Be Kind to Belfast”: “And I’d no time at all for ‘What aboutism’—you know, people who said ‘Yes, but what about what’s been done to us?’ ”

Reached by email, Mr. O’Conaill told me that he is surprised at how the “whatabout” idea has spread and mutated since his 1974 letter to the editor. “The phenomenon was waiting to be named, and others were soon observing it too, and naming it much the same way,” he said. As for the “whataboutism” of Messrs. Putin and Trump, he added, “I claim no responsibility whatever for their shenanigans.”

John Oliver (short)....

https://www.youtube.com/watch?v=RS82J...