IN CASE YOU MISSED THIS
SO YOU'RE FAMILIAR WITH "ZUCK TALK", RIGHT?
by John Herrman
So, in spoken language, there are these things that just sort of show up over time, and then it seems like they’re everywhere, and so we call them trends, right? So in a world where there is more recorded speech than ever, and, um, more access to all of this speech, these changes can happen very fast, but they can also be harder to isolate, right? So there’s actually a whole field about this, and it’s actually called linguistics, and it’s a really good tool for understanding the world around us.

Right?

Maybe you know someone who talks like this. It’s a disorienting speaking style, one that marries supreme confidence with nervous filler words and a fear of pauses. Maybe you overhear this voice talking to a date about meme stocks.

Maybe you hear it pitching a counterintuitive regulatory proposal on TV, or on a podcast, explaining which complicated things are actually simple and which simple things are actually complicated. Maybe it’s an executive on an earnings call, in an interview or pacing around a stage, delivering a Jobsian message in a Gatesian tone.

Maybe you hear Mark Zuckerberg, the head of Facebook. The style didn’t originate with him, nor is he responsible for its spread. He may, however, be its most visible and successful practitioner.

During his frequent public appearances, Mr. Zuckerberg can be heard expounding on all sorts of topics in this manner: the future of tech (“in terms of augmented reality, right, so there is virtual reality. …”); the early days of his social network (“there was no feed, right?”); human progress (“right, so, I mean life expectancy has gone up from about 50 to about 75”); Facebook’s mission (“you know, what I care about is giving people the power to share, giving every person a voice so we can make the world more open and connected. Right?”); “the history of science” (“most big scientific breakthroughs are driven by new tools, right, new ways of seeing things, right?”).

This is the voice of someone — in this case, and often, a man — who is as comfortable speaking about virtually any subject as he is uncomfortable speaking at all. (This is not the careful, measured voice of Sheryl Sandberg, nor the cheerily blustering awkwardness of Elon Musk.) It is, by default, one of the defining communication styles of its time. Right?

So.

ZuckTalk is a style of unpolished speech exhibited in contexts where polish is customary. It’s a linguistic hooded sweatshirt in a metaphorical boardroom. It is more than a collection of tics, but its tics are crucial to understanding it.
One: So. Another: Right? In their Zuckerbergian ultimate form, combined as a programmatic if-then connective move: Right? So.

Linguistic observers have noted for years the apparent rise of “so” in connection with the popularization of certain subjects and modes of speech. In 2010, in The New York Times, Anand Giridharadas announced the arrival of a new species of the unassuming word.

“‘So’ may be the new ‘well,’ ‘um,’ ‘oh’ and ‘like.’ No longer content to lurk in the middle of sentences, it has jumped to the beginning,” he wrote, crediting the journalist Michael Lewis with documenting its use among programmers at Microsoft more than a decade earlier.

In 2015, in a story for “Fresh Air” on NPR, Geoff Nunberg, the program’s longtime linguist, explained this use of “so” as a cue used by “people who can’t answer a question without first bringing you up to speed on the back story.” Hence his name for it: back story “so.”

Syelle Graves, a linguist and the assistant director of the Institute for Language Education in Transcultural Context at the Graduate Center of the City University of New York, wrote her dissertation on the rise and uses of this particular “so.” Analyzing a sampling of spontaneous, unwritten American speech from 1990 to 2011, she concluded that this usage of “so” had indeed increased significantly, often as a stand-in for “well.”

By examining online posts, she also found that people were not only noticing its spread — they were also often irritated by it. “One of the most surprising results was that some public posters associated back story ‘so’ with women, but just as many associated it with men,” Dr. Graves wrote in an email.
Later, Dr. Graves conducted a survey in which subjects responded to recordings of men and women providing identical answers to questions, with “so” and “well” spliced in at the beginning. “In a nutshell, the woman who answered with back story ‘so’ was rated as less authoritative, more trendy and more like a ‘valley girl’ than the exact same woman who answered questions with ‘well,’” she said.

“The man who answered questions with back story ‘so’ was less likable, more condescending and more like a ‘tech bro’ than the exact same recording of the exact same man who answered with ‘well,’” she said.
Speakers loosely associated with either of California’s apparently linguistically verdant valleys — Silicon in the north, San Fernando in the south — were generally “perceived as less intelligent, less professionally competent and less mature, among other things.”

Right?

Well into the era of “so,” another linguistic trend was receiving much more attention: vocal fry.

The term describes a manner of speaking — also known as “creaky voice” — that carries with it a number of gendered connotations. Studies have suggested that women with vocal fry are often judged as less competent, less intelligent and less qualified than those without.

In popular culture, vocal fry became a joke, then its defense a minor cause; in countless YouTube comment sections, it was a way for sexist people to briefly masquerade as concerned prescriptive linguists in order to complain, once again, about how women talk.

Male-coded speaking styles are subject to somewhat less scrutiny. That’s not to say they go completely unnoticed. Users on Quora, a sort of professional-class Yahoo! Answers, which is popular among employees in tech and tech-adjacent industries and skews male, have returned again and again to the same question: “When and why did everyone start ending sentences with ‘right?’”

This is what’s called a question-tag “right,” similar to a British “innit,” a Canadian “eh” or a French “n’est-ce pas.” (See also: “Correct?” “Is it not?” “No?” “OK?”)

To hear Quora users tell it, “right” is endemic in their worlds. “I suspect that this speaking technique may have possibly developed as a result of the proliferation of podcasts, TED Talks and NPR-type radio programs,” one user wrote. “Because they are not interested in what you have to say, they only want you to affirm/confirm what they are saying.”
“It could be linked to narcissism or a borderline personality disorder,” another user wrote. “Seems to be very common among the Silicon Valley intelligentsia,” a third said.

Micah Siegel, a venture capitalist and former Stanford professor, joined one Quora thread with an unusually specific theory. “My take is that this is a classic speech virus,” he wrote. “I believe it started in the particle physics community in the early 1980s, spread to the solid state physics community in the mid 1980s and then to the neuroscience community in the late 1980s. It appears to have gone mainstream just in the past few years. I am not sure what caused this latest jump.”

Mr. Siegel isn’t alone in observing the prevalence of “right?” among academics in the sciences; a 2004 paper by the linguist Erik Schleef found far higher usage of related forms of “OK” and “right” in natural science lectures than in humanities lectures, speculating that science instructors need to “check on understanding more often than humanities instructors.”

One plausible answer to Mr. Siegel’s question about what caused “right” to enter “mainstream” speech is that people from academic backgrounds like his — familiar with a culture of talks and presentations, most comfortable in settings with specialized shared expertise — are now public figures. They built companies and products that, rather quickly, became extremely powerful well beyond the worlds in which they were created.

However credible one finds the linguistic lab-leak theory, “right” and its many variants achieved wide community spread. In 2018, writing for The Cut, Katy Schneider diagnosed Mark Cuban with severe rightness.

“He disguises the ‘right’ as a question, but really it’s the opposite: a flat, affectless confirmation of whatever he himself just said, a brief affirmative pause between one confident statement and the next,” she wrote. Soon, she heard it everywhere, “used frequently by pundits, podcast hosts, TED Talk speakers.”

Mignon Fogarty, the host of the “Grammar Girl” podcast and the author of seven books about language, cautions that, when it comes to changes in language, annoyance and recognition are often intertwined. “When you don’t like someone, it’s easy to criticize their speech as a way of manifesting that,” she said. As someone who records a weekly audio program on language, she knows that firsthand.

In 2014, after receiving complaints about how often she began sentences with “so,” Ms. Fogarty suggested a story idea to one of her contributors: Is this habit condescending? The writer was Dr. Graves, and the answer, it turned out, was complicated.

So.

For a young, rising Facebook founder to talk in a way that whizzes through premises on the way to a pitch was, among other things, part of the job. Mr. Zuckerberg’s former speechwriter Kate Losse described his manner of speaking in her memoir, “The Boy Kings,” as “a combination of efficient shorthand and imperialist confidence.” Also: “flat” but with a “boyish cadence.”

The job, however, has changed. Which may be why, as a style of speaking, ZuckTalk is starting to sound … a little old? Or maybe just ubiquitous.
Even Mr. Zuckerberg seems to have noticed. According to transcripts from Marquette University’s Zuckerberg Files project, the distilled “right? so” construction is, after a peak in 2016 — much to talk about! plenty to explain! — falling out of favor in the Facebook creator’s lexicon.

In the world he helped create, however, “right” and “so” are right at home. They’re tools for the explainers among us and have proliferated as such: in media interviews, seminars, talks and speeches. Now, thanks to social media — the ever-prompting machine — everyone has the chance, or need, to explain themselves in front of an audience.

“So” is comfortable in front of the YouTube video; “right” handily punctuates the Instagram Live; a “right? so” maneuver erases dead air on a podcast. These turns of phrase aren’t likely to go away soon, so we might as well get used to them. Right?




When the computer scientist and mathematician Lenore Blum announced her resignation from Carnegie Mellon University in 2018, the community was jolted. A distinguished professor, she’d helped found the Association for Women in Mathematics and made seminal contributions to the field. But she said she found herself steadily marginalized from a center she’d helped create — blocked from important decisions, dismissed and ignored. She explained at the time: “Subtle biases and microaggressions pile up, few of which on their own rise to the level of ‘let’s take action,’ but are insidious nonetheless.”

It’s an experience many women can relate to. But how much does everyday sexism at work matter? Most would agree that outright discrimination when it comes to hiring and advancement is a bad thing, but what about the small indignities that women experience day after day? The expectation that they be unfailingly helpful; the golf rounds and networking opportunities they’re not invited to; the siphoning off of credit for their work by others; unfair performance reviews that penalize them for the same behavior that’s applauded in men; the “manterrupting”?

When I was researching my book “The End of Bias: A Beginning,” I wanted to understand the collective impact of these less visible forms of bias, but data were hard to come by. Bias doesn’t happen once or twice; it happens day after day, week after week. To explore the aggregate impact of routine gender bias over time, I teamed up with Kenny Joseph, a computer science professor at the University at Buffalo, and a graduate student there, Yuhao Du, to create a computer simulation of a workplace. We call our simulated workplace “NormCorp.” Here’s how it works.

NormCorp is a simple company. Employees do projects, either alone or in pairs. These succeed or fail, which affects a score we call “promotability.” Twice a year, employees go through performance reviews, and the top scorers at each level are promoted to the next level.

NormCorp employees are affected by the kinds of gender bias that are endemic in the workplace. Women’s successful solo projects are valued slightly less than men’s, and their successful joint projects with men accrue them less credit. They are also penalized slightly more when they fail. Occasional “stretch” projects have outsize rewards, but as in the real world, women’s potential is underrecognized compared with men’s, so they must have a greater record of past successes to be assigned these projects. A fraction of women point out the unfairness and are then penalized for the perception that they are “self-promoting.” And as the proportion of women decreases, those who are left face more stereotyping.
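The article does not include the researchers’ code, but rules like these are concrete enough to sketch in miniature. What follows is a minimal, illustrative Python sketch, not the NormCorp model itself: the class names, the 3 percent penalty, the success rate and the promotion fraction are assumptions chosen only to show how a promotability score, biased project credit and periodic reviews might fit together. Stretch projects, joint-project credit and the self-promotion penalty described above are omitted here.

```python
# Minimal, illustrative sketch of a NormCorp-style simulation.
# All names and parameter values (BIAS, LEVELS, success rate, promotion
# fraction) are assumptions for illustration, not the researchers' code.
import random

LEVELS = 6                # entry level 0 .. executive level 5
BIAS = 0.03               # small penalty applied to women's credited work
PROMOTE_FRACTION = 0.15   # share of each level promoted at a review

class Employee:
    def __init__(self, eid, gender):
        self.eid = eid
        self.gender = gender          # "F" or "M"
        self.level = 0
        self.promotability = 0.0

    def do_project(self):
        """One project per cycle: success raises promotability, failure lowers it.
        Women's successes are credited slightly less and their failures
        penalized slightly more, mirroring the biases described above."""
        success = random.random() < 0.6
        if success:
            self.promotability += 1.0 - (BIAS if self.gender == "F" else 0.0)
        else:
            self.promotability -= 1.0 + (BIAS if self.gender == "F" else 0.0)

def review(employees):
    """Twice-yearly review: within each level, promote the top scorers.
    Levels are processed top-down so no one is promoted twice in one review."""
    for level in range(LEVELS - 2, -1, -1):
        cohort = [e for e in employees if e.level == level]
        cohort.sort(key=lambda e: e.promotability, reverse=True)
        for e in cohort[: max(1, int(len(cohort) * PROMOTE_FRACTION))]:
            e.level += 1

def run(years=10, n=500):
    employees = [Employee(i, "F" if i % 2 == 0 else "M") for i in range(n)]
    for _ in range(years * 2):        # two review cycles per year
        for e in employees:
            e.do_project()
        review(employees)
    top = [e for e in employees if e.level >= LEVELS - 2]
    women = sum(1 for e in top if e.gender == "F")
    print(f"Women in the top two levels: {women}/{len(top)}")

if __name__ == "__main__":
    random.seed(0)
    run()
```

The point of a sketch like this is structural: a small, repeated discount on credited work is applied at every project and every review, which is how tiny biases can compound into large gaps at the top.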

We simulated 10 years of promotion cycles at NormCorp based on these rules and tracked how women’s representation changed over time.

These biases have all been demonstrated across various professional fields. One working paper study of over 500,000 physician referrals showed that women surgeons receive fewer referrals after successful outcomes than male surgeons. Women economists are less likely to receive tenure the more they co-author papers with men. An analysis at a large company found that women’s, as well as minority men’s, performance was effectively “discounted” compared with that of white men.

And women are penalized for straying from “feminine” personality traits. An analysis of real-world workplace performance evaluations found that more than three-quarters of women’s critical evaluations contained negative comments about their personalities, compared with 2 percent of men’s. If a woman whose contributions are overlooked speaks up, she may be labeled a self-promoter, and consequently face further obstacles to success. She may also become less motivated and committed to the organization. The American Bar Association found that 70 percent of women lawyers of color considered leaving or had left the legal profession entirely, citing being undervalued at work and facing barriers to advancement.

Our model does not take into account women, such as Lenore Blum, who quit their jobs after experiencing an unmanageable amount of bias. But it visualizes how these penalties add up over time for women who stay, so that by the time you reach more senior levels of management, there are fewer women left to promote. These factors not only prevent women from reaching the top ranks of their company; for those who do, they also make the career path longer and more demanding.

When we dig into the trajectory of individual people in our simulation, stories begin to emerge. With just 3 percent bias, one employee — let’s call her Jenelle — starts in an entry-level position and makes it to the executive level, but it takes her 17 performance review cycles (eight and a half years) to get there, and she needs 208 successful projects to make it. “William” starts at the same level, but he gets to the executive level much faster — after only eight performance reviews, and with half as many successful projects as Jenelle has by the time she becomes an executive.
Our model shows how large organizational disparities can emerge from many small, even unintentional biases happening frequently over a long period of time. Laws are often designed to address large events that happen infrequently and can be easily attributed to a single actor — for example, overt sexual harassment by a manager — or “pattern and practice” problems, such as discriminatory policies. But women’s progress is hindered even without one egregious incident, or an official policy that is discriminatory.

Gender bias takes on different dimensions depending on other intersecting aspects of a person’s identity, such as race, religion, ethnicity, sexual orientation, disability and more. Another American Bar Association study found that white women and men of color face similar hurdles to being seen as competent, but women of color face more than either group.

Backlash, too, plays out differently for women of different racial groups, points out Erika Hall, an Emory University management professor. A survey of hundreds of women scientists she helped conduct found that Asian American women reported the highest amount of backlash for self-promotion and assertive behavior. An experimental study by the social psychologist Robert Livingston and colleagues, meanwhile, found that white women are more penalized for demonstrating dominant behavior than Black women. Our model does not account for the important variations in bias that women of different races experience.

So what’s to be done? Diversity trainings are common in companies, educational institutions and health care settings, but these may not have much effect when it comes to employees’ career advancement. The sociologists Frank Dobbin and Alexandra Kalev found that after mandatory diversity trainings, the likelihood that women and men of color became managers either stayed the same or decreased, possibly because of backlash. Some anti-bias trainings have been shown to change behavior, but any approach needs to be evaluated, as psychologist Betsy Levy Paluck has said, “on the level of rigorous testing of medical interventions.”

We also explored a paradox. Research shows that in many fields, a greater proportion of men correlates with more bias against women. At the same time, in fields or organizations where women make up the majority, men can still experience a “glass escalator,” being fast-tracked to senior leadership roles. School superintendents, who work in the women-dominated field of education but are more likely to be men, are one example. To make sense of this, we conceptualized bias at work as a combination of both organizational biases that can be influenced by organizational makeup and larger societal biases.
What we found was that if societal biases are strong compared with those in the organization, a powerful but brief intervention may have only a short-term impact. In our simulation, we tested this by introducing quotas — requiring that the majority of promotions go to women — in the context of low, moderate or no societal bias. We made the quotas time-limited, as real-world efforts to combat bias often take the form of short-term interventions.

Our quotas changed the number of women at upper levels of the corporate hierarchy in the short term, and in turn decreased the gender biases against women rising through the company ranks. But when societal biases were still a persistent force, disparities eventually returned, and the impact of the intervention was short-lived.
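As a rough illustration of what a time-limited quota looks like inside a simulation of this kind, here is a small, self-contained Python sketch. The cutoff cycle, the quota share and the example scores are invented for illustration; they are not the researchers’ values, and the function below is a generic promotion rule, not their model.

```python
# Illustrative sketch of a time-limited quota layered onto promotion decisions.
# The cutoff cycle, quota share and "biased" example scores are assumptions.
def promote_with_quota(candidates, n_slots, cycle, quota_until=10, quota_share=0.6):
    """candidates: list of (gender, score) tuples. Returns the promoted subset.
    While the quota is active, a share of slots is reserved for the
    highest-scoring women; afterwards promotion is purely score-ranked,
    so any persistent (societal) scoring bias reasserts itself."""
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    if cycle >= quota_until:                 # quota window has closed
        return ranked[:n_slots]
    women = [c for c in ranked if c[0] == "F"]
    others = [c for c in ranked if c[0] != "F"]
    n_women = min(len(women), round(n_slots * quota_share))
    picked = women[:n_women] + others[: n_slots - n_women]
    return sorted(picked, key=lambda c: c[1], reverse=True)

# Example: scores are slightly depressed for women, as in the model above.
pool = [("F", 9.7), ("M", 10.0), ("F", 9.5), ("M", 9.9), ("M", 9.8), ("F", 9.4)]
print(promote_with_quota(pool, n_slots=3, cycle=2))   # quota active
print(promote_with_quota(pool, n_slots=3, cycle=12))  # quota expired
```

Even in this toy version, the pattern described above shows up: while the quota runs, women are promoted despite slightly discounted scores; once the window closes, ranking by those biased scores takes over again.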

What works? Having managers directly mentor and sponsor women improves their chance to rise. Insisting on fair, transparent and objective criteria for promotions and assignments is essential, so that decisions are not ambiguous and subjective, and goal posts aren’t shifting and unwritten. But the effect of standardizing criteria, too, can be limited, because decision-makers can always override these decisions and choose their favored candidates.

Ultimately, I found in my research for the book, the mindset of leaders plays an enormous role. Interventions make a difference, but only if leaders commit to them. One law firm I profiled achieved 50 percent women equity partners through a series of dramatic moves, from overhauling and standardizing promotion criteria, to active sponsorship of women, to a zero-tolerance policy for biased behavior. In this case, the chief executive understood that bias was blocking the company from capturing all the available talent. Leaders who believe that the elimination of bias is essential to the functioning of the organization are more likely to take the kind of active, aggressive, and long-term steps needed to root out bias wherever it may creep into decision making.




HOW EVERYDAY SEXISM HARMS WOMEN
by Jessica Nordell and Yaryna Serkez
The Fight for Asian American Studies
After a year that put a spotlight on anti-Asian racism, students around the country have been petitioning their schools to create curriculums that reflect the moment.

On a Saturday afternoon in September, the kind of day most college students would spend sprawled on a quad, soaking up the moments that still feel like summer, the Dartmouth Asian American Student Collective was getting organized. Its members had gathered to finalize a mission statement and a petition to circulate across campus.

Their goal? Persuade the administration of Dartmouth College to create an Asian American studies program.

Lily Ren, who led the meeting with her classmate Maanasi Shyno, said that taking classes that centered Asian American experiences at Dartmouth helped her better understand her own identity. “Because I was so transformed by these classes, I thought: How many other students didn’t have the opportunity to also learn so much, just because there were so few of them offered and you couldn’t major or minor in it?” she said in an interview in November.

In the group’s statement, which was released in October with the petition, its members outline why they believe such a program is necessary today, citing widespread incidents of anti-Asian racism and violence. To date, the group has collected nearly 1,200 signatures from students, parents and faculty.
The fight for Asian American studies at Dartmouth dates back several decades and is part of a larger academic movement that began in the 1960s. Though there have been minor victories at Dartmouth — new classes, new hires — change has been incremental, and a full program has yet to be formalized.

But this time could be different. After all, Asian American studies programs have often come into being during times of social unrest and change, as a result of student activism.

Decades of Activism at Dartmouth

Ms. Shyno and Ms. Ren, both 20 and double majors in sociology and gender studies, were not always student activists. In April, however, they were attending a virtual town hall with Asian American alumni and faculty when, halfway through, someone asked who was leading the student movement at Dartmouth today. The response was a long silence, they said.

“All the students working on this prior burned out or graduated, which is unfortunately what happens with student movements,” Ms. Shyno said. “That’s when we decided that we could be the ones to start it up again.”

The Dartmouth Asian Pacific American Alumni Association has compiled a history of such activism in a timeline that dates back to 1979. The timeline also refers to peer institutions, like Cornell University, which started the first Asian American studies program in the Ivy League in 1987, and Northwestern University, which introduced its own program in 1999 — a few years after students participated in a nearly monthlong hunger strike.

In 2001, seven Dartmouth professors proposed a list of initiatives, including an Asian American studies minor and a building to serve as a centralized hub for related programs. At the same time, the issue was gaining attention among students.

“When I arrived on campus, it was probably the first time, like many other Asian Americans, where I was really exposed to the history and study of our community,” said Morna Ha, who graduated from Dartmouth in 2004 and now serves as the alumni group’s chair of the subcommittee on Asian American studies.

As an undergraduate, Ms. Ha led an Asian American studies task force. “The work that we were doing then was similar, unfortunately, to a lot of the work that’s being done now,” she said.

After Ms. Ha graduated, several professors were hired to focus on Asian American studies. “But unfortunately,” she said, “Dartmouth has a really terrible track record of retaining these experts.”

Aimee Bahng, who started teaching in Dartmouth’s English department in 2009 and specialized in Asian American literature, was denied tenure in 2016. Students, concerned about the prospect of losing a mentor and faculty member of color, started organizing, posting on Twitter under the hashtags #Fight4FacultyOfColor and #DontDoDartmouth. The story made headlines, and there was even a Change.org petition written by Dartmouth faculty that received close to 4,000 signatures.
Diana Lawrence, the associate vice president for communications at Dartmouth, said in a statement: “Mathematically, there is no significant difference in tenure rates between women and men or between white and BIPOC faculty at Dartmouth.” The statement added that the college is “making notable progress in its efforts to recruit and retain faculty of color.”

Ms. Lawrence also wrote that “Dartmouth students who wish to major in Asian American studies may choose to do so regardless of whether there is a program currently in place. The College offers many courses in that area, including opportunities to study abroad.”

Ms. Bahng, now an associate professor of gender and women’s studies at Pomona College, said that the status of tenure holds the stakes of defining whether “a faculty member will be protected and job-secure in their effort to teach these subjugated knowledges and marginalized histories.”

“The push that Dartmouth is currently in,” Ms. Bahng added, “is informed by students and faculty members who’ve been through this before.”

The ‘Model Minority’ Myth

Twenty-one percent of Dartmouth’s class of 2025 is Asian American, according to the college’s admissions site. And across the United States, Asian Americans are the fastest-growing racial or ethnic group, almost doubling in size to 18.9 million from about 10.5 million between 2000 and 2019, according to the Pew Research Center.

Still, Asian American identity has been fraught in education. Because Asian Americans have been viewed as the “model minority” — stereotyped as high academic and financial achievers — institutions have not always considered them a protected class.
In 2020, a Washington State school district was the center of a controversy for excluding students of Asian descent from a category labeled “students of color” in a 2019 performance report. “While our intent was never to ignore Asian students as ‘students of color’ or ignore any systemic disadvantages they too have faced, we realize our category choices caused pain and had racist implications,” the district later responded in a statement.

Eng-Beng Lim, an associate professor of women’s, gender and sexuality studies at Dartmouth, said that same attitude could in part be why the college has yet to form a program. “My students have reported stories of how the pushback from the upper administrators included sentiments about how Asian Americans are not a minority group,” he said.


In “Electable: Why America Hasn’t Put a Woman in the White House … Yet,” the NBC News Capitol Hill correspondent Ali Vitali describes Amy Klobuchar’s well-honed political origin story. In her book, out this month, Vitali explains that Klobuchar, the Minnesota senator, was kicked out of the hospital just a day after giving birth to her daughter. This was a not-uncommon cost-cutting measure for insurance companies in the past, and Klobuchar was sent home even though her baby had to stay because of complications.

“It was the match to Klobuchar’s political fire,” Vitali writes, and it inspired her, as a private citizen, to lobby for a guaranteed 48 hours in the hospital together for moms and their newborns (a requirement later enshrined by the Newborns’ and Mothers’ Health Protection Act). This maternal activism kick-started Klobuchar’s legislative career, and ultimately led her to run for office.

She wasn’t the only candidate to tie her maternal identity to her “political fire” during the 2020 presidential election, Vitali points out. Senator Kirsten Gillibrand of New York “centered her candidacy on womanhood: framing policy proposals around how they’d impact families, promising that ‘as a mom, I’ll fight for your family as hard as I fight for my own.’”

While I was reading Vitali’s book last week, news broke that some of the climate and tax provisions of Democrats’ original Build Back Better Plan, left for dead a few weeks ago, were being brought back to life as part of a pared-down bill called the Inflation Reduction Act. This is good news for anyone who cares about climate change and its impact on our children’s futures, but it’s still disappointing that several proposed investments from earlier drafts of Democrats’ plan for child care, universal preschool, child tax credits and elder care support have been dropped like a hot rock.

And I couldn’t help but notice that the senators mentioned in The Times’s coverage as having helped to resurrect the package — Joe Manchin, Chuck Schumer, Mark Warner, John Hickenlooper and Chris Coons — were all men who are ostensibly past needing parental leave, preschool or child care for their immediate families. In October, when The Washington Post reported on the “last-ditch effort by Democratic women to pressure Manchin and salvage paid family and medical leave,” it was moms leading that good fight, including Gillibrand and Senator Patty Murray of Washington.

Ideally, legislators who aren’t caretakers of young children would still see the profound value of things such as paid leave and child tax credits, which are also essential for the health of the next generation and society in general. Hopefully, they would also acknowledge that nearly half of Americans, including 41 percent of Republicans, think our country doesn’t do enough for parents, according to Pew Research. In my dreams, paid leave, which is available in nearly every other country, would not just be tacked onto enormous budget bills only to be sacrificed in the horse-trading process.

Since mothers are out in front fighting for these supports, we probably need more of them in positions of true power in our legislative and executive bodies. (I asked dads to start shouting about paid leave back in November, but they seem to have lost their voices.) As the midterms approach, I thought it would be a good moment to take a temperature check on how voters perceive candidates who are mothers. The overarching feeling, as Vitali put it to me when we spoke, is that mothers running for office are much better off than they once were, “but still with a lot of progress left to make.”

A particular bright spot is that more women are starting their political careers younger than their predecessors did, which may set them up to be in more powerful positions later on. Another is that motherhood is increasingly in the foreground of campaigns. In The Atlantic in 2018, Annika Neklason explained that “moms are not only seeking political seats, but seeking them explicitly, and proudly, as moms; in this year’s election cycle, motherhood has become an asset to be flaunted in progressive campaigns, resolving a decades-old tension for women seeking to enter electoral politics.”

It’s not just Democratic women, either. Elise Stefanik, the third-ranking Republican in the House, is also a new mom. In March, The Times’s Annie Karni reported on a House Republican retreat where Stefanik “was running the show, working the room with her 7-month-old son on her hip.” In 2019, Stefanik was part of a bipartisan group that introduced paid leave legislation that would allow families to receive advance child tax credits up to $5,000 during the first year of a child’s life or the first year after a child’s adoption — not as generous as I’d like to see at the federal level, but better than what we have now, which is nothing. And though paid leave is often framed as a Democratic priority, according to a Morning Consult poll from September: “Consistently, more than half of Republican women support paid family and medical leave, even when it’s framed as a Democratic proposal. Republican men, meanwhile, haven’t always been on board but are coming around on the idea.”

Politicians bringing their newborns to work and taking parental leave while in office is something new. It used to be that mothers mostly ran for office when their kids were older, said Corrine McConnaughy, a political scientist at Princeton University. “Nancy Pelosi is famously a mother of five, but also — as was not atypical of women navigating politics in her generation — waited until her kids were grown and then entered politics,” McConnaughy said. It matters that women are starting earlier, because unlike male politicians — Pete Buttigieg, who ran for president after a mere two terms as mayor of a small city, comes to mind — they “feel they need to be more qualified to succeed,” said Jennifer Lawless, a professor of politics at the University of Virginia. “They’re not going to throw their hat into the ring when they’ve been in the Senate for two years.”

In previous generations, there was criticism of women who aimed for high office while their kids were still at home. In her excellent book, “The Political Consequences of Motherhood,” Jill Greenlee, an associate professor of politics at Brandeis University, describes the way Geraldine Ferraro, the three-term New York congresswoman, faced a “chorus of criticism” while running for vice president in 1984, along the lines of: “I’m not voting for her because she belongs in the home, she belongs back with her kids, what the hell is she doing this for?”
“Ferraro and her family were the subject of public scrutiny, as was (and is) often the case when women step into new political roles,” Greenlee writes. “This forced Ferraro and her defenders to demonstrate her devotion as a mother while also promoting her professional credentials.”

By the time Sarah Palin, who was then Alaska’s Republican governor, ran for vice president in 2008, there was less cultural resistance to the idea of a mother in that role, though there was still intense, at times unfair, scrutiny of Palin’s family. Palin, who embraced a “hockey mom” image, herself declared “that she was part of a generation of women who have become used to juggling work and family and would not shy away from a political challenge,” Greenlee notes.

In the intervening 14 years, we keep moving forward, but a full acceptance of mothers as political powerhouses will take more time. Last year, Stefanik had to rebut a news report that suggested she might struggle to handle her legislative responsibilities as a new mom. According to a 2017 research paper from the Barbara Lee Family Foundation, voters still have concerns about women being able to balance family and political responsibilities.

The foundation presented survey respondents with four fictional candidates with no partisan identifiers: a married man with young children, a married woman with a young child, a single mother of young children and a never-married woman without children. Then, voters were presented with critiques “which focused on their ability to manage their family life and at the same time be effective office holders.” When the candidates pushed back, voters found the male candidate to be the most “convincing.” Though voters recognized “a double standard for moms,” they still participated in enforcing the double standard.

Still, “a more diverse array of women are putting themselves forward as candidates,” said Lawless, and the more different kinds of mothers who prove that they can govern, the more it will become a nonissue. Vitali mentioned Katie Porter of California and Abigail Spanberger of Virginia, both Democrats, as members of the House who are mothers of school-age children and part of the national conversation. “Women are taking up this space as mothers and power brokers” that they didn’t used to, Vitali said. Maybe that will eventually lead to building back something that won’t topple.


Would We Have Paid Parental Leave if More Moms Were in Congress?
by Jessica Grose



I recently turned 83, and while there are many joys to getting older, getting out of taxis is not one of them.

What you don’t want to do is get your left foot caught under the front right seat before you try to swing your right foot toward the door; otherwise, you’ll topple over while attempting to pay the fare, possibly injuring your ankle, and causing the maneuver to go even more slowly. If you make it past the taxi door, there is still the one-foot jump to the street. You’re old. You could fall. Happens all the time.
And that’s when it’s just you in the taxi. If some other old person is with you — a friend, a spouse — there’s a real possibility of never getting out of the vehicle. You might live out the rest of your days in the back seat, watching Dick Cavett do real estate ads on a loop.

“Old People Getting Out of Taxis.” I was thinking of making a film with that title, if I knew how to make a film. Figure it would run four hours. I asked an actor friend, also old, if he’d star in it. His response: “If I can get out of my chair.”
It’s no joke, old age. It just looks funny. Mel Brooks latched on to this in his 1977 film “High Anxiety” with Professor Lilloman (pronounced “little old man”), a stock character who moves at a turtle’s pace, mumbles and whines as he goes, equally irritated and irritating.

I used to find the professor a lot funnier than I do now. Slow? Merely to rise to my feet in a restaurant takes so much angling and fulcrum searching, the waitstaff takes bets on whether I will do it at all.

Old age isn’t what the books promised it would be. Literature is littered with old people for whom the years have brought some combination of wisdom, serenity, authority and power — King Lear, the ageless priest in Shangri-La, Miss Marple, Mr. Chips, Mrs. Chips (I made that up), Dickens’s Aged P, crazy Mrs. Danvers. In fiction, old folks are usually impressive and in control. In life, something less.
I can’t think of anyone who has come to me for wisdom, serenity, authority or power. People do come to sell me life insurance for $9 a month and medicines such as Prevagen, which is advertised on TV as making one sharper and improving one’s memory. Of course, that is beneficial only to those who have more things they wish to remember than to forget.

One thing I need to remember is which day for which doctor. Two years ago, my wife and I moved back to New York City after 24 years of living by the sea. The city is safer, we thought — just in case we may ever need to be near medical facilities. Since our move, not a day has passed without one of us seeing a doctor, arranging to see one or thinking or talking about seeing one.


What They Don’t Tell You About Getting Old
by Roger Rosenblatt
Hey, it’s election season! Think about it: A year from now, we should know who the next president is going to be and …

Stop beating your head against the wall. Before we start obsessing over the candidates, let’s spend just a few minutes mulling the big picture. Really big. Today, we’re going to moan about the Electoral College.

Yes! That … system we have for actually choosing a president. The one that makes who got the most votes more or less irrelevant. “The exploding cigar of American politics,” as Michael Waldman of the Brennan Center for Justice called it over the phone.

Whoever gets the most electoral votes wins the White House. And the electoral votes are equal to the number of representatives and senators each state has in Washington. Right now that means — as I never tire of saying — around 193,000 people in Wyoming get the same clout as around 715,000 people in California.
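For anyone who wants to see that arithmetic spelled out, here is the back-of-the-envelope version, a sketch using the rounded per-elector figures quoted above rather than exact census counts:

```python
# Back-of-the-envelope check of the figures above (rounded, for illustration).
people_per_elector_wy = 193_000   # roughly one electoral vote per 193,000 Wyomingites
people_per_elector_ca = 715_000   # roughly one electoral vote per 715,000 Californians

ratio = people_per_elector_ca / people_per_elector_wy
print(f"A Wyoming vote carries about {ratio:.1f}x the weight of a California vote")
# -> about 3.7x, i.e. "almost four times"
```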

It’s possible the system was quietly hatched as a canny plot by the plantation-owning Southerners to cut back on the power of the cities. Or it’s possible the founders just had a lot on their minds and threw the system together at the last minute. At the time, Waldman noted, everybody was mainly concerned with making sure George Washington was the first president.

Confession: I was hoping to blame the whole Electoral College thing on Thomas Jefferson, who’s possibly my least favorite founding father. You know — states’ rights and Sally Hemings. Not to mention a letter he once wrote to his daughter, reminding her to wear a bonnet when she went outside because any hint of the sun on her face would “make you very ugly and then we should not love you so much.” But Jefferson was someplace in France while all this Electoral College stuff was going on, so I’m afraid it’s not his fault.

Anyway, no matter how it originally came together, we’ve now put the loser of the popular vote in office five times. Three of those elections were more than a century ago. One involved the Republican Rutherford B. Hayes, who won in 1876 even though the electoral vote was virtually tied and Samuel Tilden easily won the popular vote. But the Republicans made a deal with Southern Democrats to throw the election Hayes’s way in return for a withdrawal of federal troops from the South, which meant an end to Reconstruction and another century of disenfranchisement for Black voters in the South.

Really, every time I get ticked off about the way things are going in our country, I keep reminding myself that Samuel Tilden had it worse. Not to mention the Black voters, of course.

Here’s the real, immediate worry: Our current century is not even a quarter over and we’ve already had the wrong person in the White House twice. George W. Bush lost the popular vote to Al Gore in 2000 — many of you will remember the manic counting and recounting in Florida, which was the tipping point state. (Gore lost Florida by 537 votes, in part thanks to Ralph Nader’s presence on the ballot. If you happen to see Robert Kennedy Jr. anytime soon, remind him of what hopeless third-party contenders can do to screw up an election.)

And then Hillary Clinton beat Donald Trump decisively in the popular vote — by about 2.8 million votes, coming out ahead by 30 percentage points in California and 22.5 percentage points in New York. But none of that mattered when Trump managed to eke out wins by 0.7-point margins in Wisconsin and Pennsylvania, not to mention his 0.3-point victory in Michigan.

By the way, does anybody remember what Clinton did when she got this horrible news? Expressed her dismay, then obeyed the rules and conceded. Try to imagine how Trump would behave under similar circumstances.

OK, don’t. Spare yourselves.

Sure, every vote counts. But it’s hard not to notice that every vote seems to count a whole lot more if you happen to be registered in someplace like Michigan, where the margin between the two parties is pretty narrow. After her loss, Clinton did wonder how much difference it might have made if she’d taken “a few more trips to Saginaw.”

On the other side of the equation, Wyoming is the most Republican state, with nearly 60 percent of residents identifying with the G.O.P. and just about a quarter saying they’re Democrats. Nobody is holding their breath to see which way Wyoming goes on election night.

But if you’re feeling wounded, Wyoming, remember that presidential-election-wise, every citizen of Wyoming is worth almost four times as much as a Californian.






The Exploding Cigar of American Politics
An Update on the Electoral College
by Gail Collins
The notable fact about the anti-Israel campus demonstrations is that they are predominantly an elite phenomenon. Yes, there have been protests at big state schools like the University of Nebraska, but they have generally been small, tame and — thanks to administrators prepared to enforce the rules — short-lived. It’s Stanford, Berkeley, Yale, Penn, Harvard, Columbia and many of their peers that have descended to open bigotry, institutional paralysis and mayhem.

Two questions: Why the top universities? And what should those on the other side of the demonstrations — Jewish students and alumni most of all — do about it?

Regarding the first question, some argue that the furor over the campus protests is much ado about not much. The demonstrators, they say, represent only a small fraction of students. The ugliest antisemitic expressions occasionally seen at these events are mainly the work of outside provocateurs. And the student protesters (some of whom are Jewish) are acting out of youthful idealism, not age-old antisemitism. As they see it, they aim only to save Palestinian lives and oppose the involvement of their universities in the abuses of a racist Israeli state.

There’s something to these points. With notable exceptions, campus life at these schools is somewhat less roiled by protest than the media makes it seem. Outside groups, as more than one university president has told me, have played an outsize role in setting up encampments and radicalizing students. And few student demonstrators, I’d wager, consciously think they harbor an anti-Jewish prejudice.

But this lets the kids off the hook too easily.

Students who police words like “blacklist” or “whitewash” and see “microaggressions” in everyday life ignore the entreaties of their Jewish peers to avoid chants like “globalize the intifada” or “from the river to the sea.” Students who claim they’re horribly pained by scenes of Palestinian suffering were largely silent on Oct. 7 — when they weren’t openly cheering the attacks. And students who team up with outside groups that are in overt sympathy with Islamist terrorists aren’t innocents. They’re collaborators.

How did the protesters at elite universities get their ideas of what to think and how to behave?

They got them, I suspect, from the incessant valorization of victimhood that has been a theme of their upbringing, and which many of the most privileged kids feel they lack — hence the zeal to prove themselves as allies of the perceived oppressed. They got them from the crude schematics of Diversity, Equity and Inclusion training seminars, which divide the world into “white” and “of color,” powerful and “marginalized,” with no regard for real-world complexities — including the complexity of Jewish identity. They got them from professors who think academic freedom amounts to a license for political posturing, sometimes of a nakedly antisemitic sort. They got them from a cheap and easy revision of history that imagines Zionism is a form of colonialism (it’s decidedly the opposite), that colonialism is something only white people do, and that as students at American universities, they can cheaply atone for their sins as guilty beneficiaries of the settler-colonialism they claim to despise.

They also got them from university administrators whose private sympathies often lie with the demonstrators, who imagine the anti-Israel protests as the moral heirs to the anti-apartheid protests and who struggle to grasp (if they even care) why so many Jewish students feel betrayed and besieged by the campus culture.

That’s the significance of the leaked images of four Columbia University deans exchanging dismissive and sophomoric text messages during a panel discussion in May on Jewish life on campus, including the suggestion that a panelist was “taking full advantage of this moment” for the sake of the “fundraising potential.”

Columbia placed three of the deans on leave. Other universities, like Penn, have belatedly moved to ban encampments. But those steps have a grudging and reactive feel — more a response to Title VI investigations of discrimination and congressional hearings than a genuine acknowledgment that something is deeply amiss with the values of a university. At Harvard, two successive members of the task force on antisemitism resigned in frustration. “We are at a moment when the toxicity of intellectual slovenliness has been laid bare for all to see,” wrote Rabbi David Wolpe in his resignation announcement.

That’s the key point. More dismaying than the fact that student protesters are fellow traveling with Hamas is that with their rhyming chants and identical talking points, they sound more like Maoist cadres than critical thinkers. As the sociologist Ilana Redstone, author of the smart and timely book “The Certainty Trap,” told me on Monday, “higher education traded humility and curiosity for conviction and advocacy — all in the name of being inclusive. Certainty yields students who are contemptuous of disagreement.”

And so the second question: What are Jewish students and alumni to do?

It’s telling that the Columbia deans were caught chortling during exactly the kind of earnest panel discussion that the university convened presumably to show alumni they are tackling campus antisemitism. They were paying more lip service than attention. My guess is that they, along with many of their colleagues, struggle to see the problem because they think it lies with a handful of extremist professors and obnoxious students.

But the real problem lies with some of the main convictions and currents of today’s academia: intersectionality, critical theory, post-colonialism, ethnic studies and other concepts that may not seem antisemitic on their face but tend to politicize classrooms and cast Jews as privileged and oppressive. If, as critical theorists argue, the world’s injustices stem from the shadowy agendas of the powerful and manipulative few against the virtuous masses, just which group is most likely to find itself villainized?

Not even the most determined university president is going to clean out the rot — at least not without getting rid of the entrenched academic departments and tenured faculty members who support it. That could take decades. In the meantime, Jews have a history of parting company with institutions that mistreated them, like white-shoe law firms and commercial banks. In so many cases, they went on to create better institutions that operated on principles of intellectual merit and fair play — including many of the universities that have since stumbled.

If you are an Ivy League megadonor wondering how to better spend the money you no longer want to give a Penn or a Columbia — or just a rising high school senior wondering where to apply — maybe it’s time to forgo the fading prestige of the old elite for the sake of something else, something new. That’s a subject for a future column.

Bret Stephens is an Opinion columnist for The Times, writing about foreign policy, domestic politics and cultural issues.
Should American Jews Abandon Elite Universities?
by Bret Stephens








Just a week ago, it seems, a new America began. I’ve struggled ever since to figure out what the apparent sudden revolution in our politics means.

I keep coming back to the Ernest Hemingway quote about how bankruptcy happens. He said it happens in two stages, first gradually and then suddenly.

That’s how scholars say fascism happens, too—first slowly and then all at once—and that’s what has been keeping us up at night.

But the more I think about it, the more I think maybe democracy happens the same way, too: slowly, and then all at once. 

At this country’s most important revolutionary moments, it has seemed as if the country turned on a dime. 

In 1763, just after the end of the French and Indian War, American colonists loved that they were part of the British empire. And yet, by 1776, just a little more than a decade later, they had declared independence from that empire and set down the principles that everyone has a right to be treated equally before the law and to have a say in their government.

The change was just as quick in the 1850s. In 1853 it sure looked as if the elite southern enslavers had taken over the country. They controlled the Senate, the White House, and the Supreme Court. They explicitly rejected the Declaration of Independence and declared that they had the right to rule over the country’s majority. They planned to take over the United States and then to take over the world, creating a global economy based on human enslavement. 

And yet, just seven years later, voters put Abraham Lincoln in the White House with a promise to stand against the Slave Power and to protect a government “of the people, by the people, and for the people.” He ushered in “a new birth of freedom” in what historians call the second American revolution. 

The same pattern was true in the 1920s, when it seemed as if business interests and government were so deeply entwined that it was only a question of time until the United States went down the same dark path to fascism that so many other nations did in that era. In 1927, after the execution of immigrant anarchists Nicola Sacco and Bartolomeo Vanzetti, poet John Dos Passos wrote: “they have clubbed us off the streets they are stronger they are rich they hire and fire the politicians the newspaper editors the old judges the small men with reputations….” 

And yet, just five years later, voters elected Franklin Delano Roosevelt, who promised Americans a New Deal and ushered in a country that regulated business, provided a basic social safety net, promoted infrastructure, and protected civil rights.

Every time we expand democracy, it seems we get complacent, thinking it’s a done deal. We forget that democracy is a process and that it’s never finished.

And when we get complacent, people who want power use our system to take over the government. They get control of the Senate, the White House, and the Supreme Court, and they begin to undermine the principle that we should be treated equally before the law and to chip away at the idea that we have a right to a say in our government. And it starts to seem like we have lost our democracy. 

But all the while, there are people who keep the faith. Lawmakers, of course, but also teachers and journalists and the musicians who push back against the fear by reminding us of love and family and community. And in those communities, people begin to organize—the marginalized people who are the first to feel the bite of reaction, and grassroots groups. They keep the embers of democracy alive.

And then something fans them into flame. 

In the 1760s it was the Stamp Act, which said that men in Great Britain had the right to rule over men in the American colonies. In the 1850s it was the Kansas-Nebraska Act, which gave the elite enslavers the power to rule the United States. And in 1929 it was the Great Crash, which proved that the businessmen had no idea what they were doing and had no plan for getting the country out of the Great Depression.

The last several decades have felt like we were fighting a holding action, trying to protect democracy first from an oligarchy and then from a dictator. Many Americans saw their rights being stripped away…even as they were quietly becoming stronger. 

That strength showed in the Women’s March of January 2017, and it continued to grow—quietly under Donald Trump and more openly under the protections of the Biden administration. People began to organize in school boards and state legislatures and Congress. They also began to organize over TikTok and Instagram and Facebook and newsletters and Zoom calls. 

And then something set them ablaze. The 2022 Dobbs v. Jackson Women’s Health Organization decision stripped away from the American people a constitutional right they had enjoyed for almost fifty years, and made it clear that a small minority intended to destroy democracy and replace it with a dictatorship based in Christian nationalism. 

When President Joe Biden announced just a week ago that he would not accept the Democratic nomination for president, he did not pass the torch to Vice President Kamala Harris.

He passed it to us. 

It is up to us to decide whether we want a country based on fear or on facts, on reaction or on reality, on hatred or on hope.

It is up to us whether it will be fascism or democracy that, in the end, moves swiftly, and up to us whether we will choose to follow in the footsteps of those Americans who came before us in our noblest moments, and launch a brand new era in American history.

A New American

by Heather Cox Richardson
On its face, there’s nothing necessarily political about the mantra that the customer is always right. It can buck up the patience of an exasperated shopkeeper dealing with a finicky patron or push complacent manufacturers to think harder about evolving consumer tastes. It fosters a service culture that, as visitors to the United States often remark, is notable for its niceness.

But the idea that the customer is always right also contains a worldview, a kind of market fundamentalism that typifies much of the American right today. The more pervasive it becomes, the more pernicious it gets — and the more it diminishes the very values conservatives claim to hold dear.

When are customers “always right”? When they want the beige interior, not the black one, or the subway tiles for the downstairs bathroom but not the upstairs one, or the sauce on the side — that is, anywhere within the broad spectrum of personal preferences that typify most consumer choices.

The problem starts when our decisions aren’t merely subjective — that is, when questions of truth, moral or factual, are involved. This is a particular concern when it comes to two beleaguered American institutions that have come to grief in recent years by bowing too often to the demands of their customers: universities and the news media.

There was a time when being a college student meant that you willingly submitted to the rules, expectations and judgments of a professor or a department. You didn’t get to grade your teachers at the end of the term: What mattered to the university was their opinion of you, not yours of them. The relationship was unabashedly hierarchical. As a student, you were presumed ignorant, but teachable. You paid the university for the opportunity to become a little less ignorant.

Much of this has been overturned in recent years. Students today, whose parents often pay fortunes for their education, are treated like valuable customers, not lowly apprentices. University curriculums have moved away from core requirements — the idea that there are things all educated people ought to have read, understood and discussed together — to a kind of mix-and-match set of offerings. Liberal arts have endured frequent budgetary cuts for not being seen to have practical benefits — that is, skills that are valued in the job market.

The result has been the hollowing out of higher ed. Professors cater to students with inflated grades and diminished expectations. At Yale, nearly four out of five grades are in the “A” range. At Princeton, studying Latin or Greek is no longer a requirement for classics majors. During the recent student protests, I kept wondering: Where did these kids get their sense of total certitude? Part of it is youthful idealism, and part of it stems from ideological currents in elite academia. But an equal part comes from applying the “customer is always right” maxim to education, which replaces critical thinking with the ceaseless affirmation of emotional choice.

As for news media, here too there was a time when Walter Cronkite could end his program by saying, “And that’s the way it is…”, and be largely believed. His authority derived from the accuracy and quality of his reports. But his audience also understood that the news wasn’t simply what they wanted it to be. Facts shaped opinions, not the other way around.

That’s a bygone world. Conservatives, including me, have long complained that the “mainstream media” too often present the news with a left-tilting slant. But the right’s answer hasn’t been to seek out or create news media that provide straighter news or a better balance of opinion. It’s been to turn the tables.

This has proved immensely profitable, especially on cable TV, radio airwaves and now podcasts. It has given previously disaffected consumers a much wider range of options for where they obtain their news, or at least the version of it that does the least to contradict their beliefs. But what it has produced isn’t a better-informed country. It’s a land of cacophony, confusion and conspiracy theories. When market forces provide you with alternative cushions or chocolates, the world is better for it. When those same forces provide you with alternative facts, it isn’t. Can we reverse the trend?

In “Memoirs of Hadrian,” the novelist Marguerite Yourcenar has her Roman emperor-protagonist observe: “There is more than one kind of wisdom, and all are essential in the world; it is no bad thing that they should alternate.” The wisdom of customers, crowds and markets has a lot to recommend it. But there’s also a wisdom rooted in knowledge, expertise and experience that collectively goes by the name of authority. It’s time to restore it.

What if higher education responded to plummeting public confidence by demanding a whole lot more of its students, especially through extensive core requirements? Or if professors gave grades that reflected actual performance? Or if administrators responded to rule-breaking with summary expulsions? What if the news media, also facing declining levels of trust, stopped catering to their least literate readers, stopped caring about their angriest ones, stopped publishing dumbed-down versions of the news, and stopped acting as if journalism were just another form of entertainment?

Maybe moves like these would spell the death of academia and the news media. I think they would help save them both. The words today’s consumers almost never hear — “You are wrong” — are sometimes the ones they most yearn for, without knowing it.


Conservatives Think The Market Always Gets It Right. It Doesn't

by Bret Stephens

Dear Help Person,

I live in New York and nobody cares how I vote in the big, huge, potentially history-changing presidential race. All because of the Electoral College! Why do we have that thing and how can we get rid of it?

— Wants to Live in Pennsylvania

Dear Wants: Stop whining. True, the only people poll-watchers will really pay attention to are from a handful of states where the political division is very close. Nobody cares if about two million New Yorkers cast votes for Kamala Harris because nobody cares about the popular vote.

Snap question: Who won the popular vote in 2016?

Answer: Hillary Clinton, by about 2.9 million. But she lost a few states, including Michigan and Wisconsin, by itsy-bitsy margins, and that knocked her out of the White House. Blame the Electoral College.

How did we get stuck with this thing?

The Electoral College was created at the nation’s constitutional convention after a long, long struggle. Led by the Committee of Unfinished Parts.

Wait a minute. The committee of what?

Yeah, that’s the story. The founding fathers couldn’t decide whether the president should be elected by Congress, by the state legislatures, by the people or by something else. They finally settled on something else.

“They squished it all together,” said Robert Alexander, an expert on the Electoral College at Bowling Green State University. Alexander likes to explain that the founders’ solution “looks like Frankenstein’s monster.”

These were the same guys who gave the Senate the power to do everything from trying impeachments to confirming members of the Supreme Court. That’s the chamber in which California and Wyoming have exactly the same number of votes, despite the fact that California’s population is approximately 68 times as large.

What can I tell you, people? True, a lot of you are going to feel cranky when you diligently go to the polls on Election Day, say hello to all your neighbor-voters, and then go home to watch the whole world hang by its fingernails to see what people who live in Pennsylvania and Arizona decide.

But don’t blame the swing states. Blame the Committee of Unfinished Parts.

Sick of the Electoral College? Stop Whining.

by Gail Collins
Yes, another article on THE ELECTORAL COLLEGE!

We need to be up on this! 
Read it and understand just what might happen in this year's election!