
Trump Says He 'Paid Zero' for the Government's $11 Billion Stake in Intel. Here's the Downside.


President Donald Trump negotiated a deal last week for the U.S. government to take a substantial ownership stake in an American company. Despite his assurances, Trump's socialistic transaction is a terrible deal not only for the parties involved, but for the entire U.S. economy.

"It is my Great Honor to report that the United States of America now fully owns and controls 10% of INTEL, a Great American Company that has an even more incredible future," Trump posted Friday on Truth Social. "The United States paid nothing for these Shares, and the Shares are now valued at approximately $11 Billion Dollars. This is a great Deal for America and, also, a great Deal for INTEL. Building leading edge Semiconductors and Chips, which is what INTEL does, is fundamental to the future of our Nation."

"I PAID ZERO FOR INTEL, IT IS WORTH APPROXIMATELY 11 BILLION DOLLARS," Trump added on Monday. "All goes to the USA. Why are 'stupid' people unhappy with that?"

As of this writing, Intel's market cap is around $110 billion, so a 10 percent stake would indeed be worth $11 billion. But despite what Trump says, this was not a freebie.

"Under terms of the agreement, the United States government will make an $8.9 billion investment in Intel common stock," the company announced. "The government's equity stake will be funded by the remaining $5.7 billion in grants previously awarded, but not yet paid, to Intel under the U.S. CHIPS and Science Act and $3.2 billion awarded to the company as part of the Secure Enclave program…. The $8.9 billion investment is in addition to the $2.2 billion in CHIPS grants Intel has received to date, making for a total investment of $11.1 billion."

Intel added that "under the terms of today's announcement, the government agrees to purchase 433.3 million primary shares of Intel common stock at a price of $20.47 per share, equivalent to a 9.9 percent stake in the company." According to the Financial Times, that was "below Friday's closing price of $24.80, but about the level where they traded early in August. Intel's board had approved the deal, which does not need shareholder approval."
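The announced figures are internally consistent; here is a quick back-of-the-envelope check, a sketch using only the numbers quoted above:

```python
# Sanity-check the figures quoted in Intel's announcement (values in dollars).
chips_grants_unpaid = 5.7e9   # remaining, not-yet-paid CHIPS Act grants
secure_enclave_award = 3.2e9  # Secure Enclave program award
equity_investment = chips_grants_unpaid + secure_enclave_award
print(equity_investment / 1e9)  # 8.9 -- the announced $8.9B stock purchase

shares_purchased = 433.3e6    # primary shares of common stock
price_per_share = 20.47
# ~8.87, consistent with the $8.9B funded by the two grant pools
print(round(shares_purchased * price_per_share / 1e9, 2))

chips_grants_paid = 2.2e9     # CHIPS grants Intel has already received
# 11.1 -- the "total investment of $11.1 billion"
print((equity_investment + chips_grants_paid) / 1e9)

market_cap = 110e9            # Intel's approximate market cap as of this writing
# 11.0 -- why a ~10 percent stake is "valued at approximately $11 Billion"
print(round(0.10 * market_cap / 1e9, 1))
```

In other words, the "$11 billion" Trump cites is the market value of the shares, while the government's side of the ledger is $8.9 billion in redirected grant money plus $2.2 billion already paid out.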

The Financial Times added that under the agreement, "the US will also receive a five-year warrant, which allows it to purchase an additional 5 per cent of the group at $20 a share," but only "if Intel jettisons majority ownership of its foundry business, which makes chips for other companies." Trump may be expanding state ownership of private industry, but at least he seems to have no interest in seizing the means of production.

The CHIPS Act grants were approved under Trump's predecessor, President Joe Biden. Before leaving office, Biden's administration rushed to finalize many such grants, even though Intel was the worst-performing tech stock of 2024; the government ultimately agreed to pay less than it initially allocated after the company failed to hit certain milestones.

Instead of rescinding those grants, as Trump reportedly threatened to do, he demanded a tenth of the business, making the U.S. government Intel's largest shareholder.

Every part of this transaction flies in the face of any sincere interpretation of free markets, including the Biden administration's original sin of approving billions of dollars for a struggling company. It is perhaps telling that, as Reason's Eric Boehm noted last week, the idea that the U.S. government should take a piece of Intel in exchange for CHIPS Act funding was first floated by Sen. Bernie Sanders (I–Vt.). Trump and his allies are now issuing talking points that could have come from the socialist senator himself.

If the U.S. government insists upon dishing out taxpayer money to private companies, is there any reason it shouldn't, as U.S. Secretary of Commerce Howard Lutnick put it to CNBC, get "a piece of the action"?

There are many reasons, in fact. "The most immediate risk is that Intel's decisions will increasingly be driven by political rather than commercial considerations," Scott Lincicome of the Cato Institute wrote Sunday in The Washington Post. "With the U.S. government as its largest shareholder, Intel will face constant pressure to align corporate decisions with the goals of whatever political party is in power."

Not only that, Lincicome writes, but "Intel's U.S.-based competitors…might find themselves at a disadvantage when vying for government contracts or subsidies, winning trade or tax relief, or complying with federal regulations. Private capital might in turn flow to Intel (and away from innovation leaders in the semiconductor ecosystem) not for economic reasons but because Uncle Sam now has a thumb on the scale."

Such market distortions may seem abstract, but they can have devastating consequences for the American industrial economy. "Will investors and entrepreneurs stay away from critical industries that might also see the U.S. government eager to get more involved?" Lincicome wonders. "Will future presidents, Republican or Democrat, use this noncrisis precedent to carry out their own adventures into corporate ownership with their own economic and social priorities attached?"

Indeed, White House National Economic Council director Kevin Hassett told CNBC on Monday that he's "sure at some point there'll be more transactions, if not in this industry, [then] in other industries."

Trump has made several such deals just since reentering office in January. He leaned on Intel competitors Nvidia and AMD to give 15 percent of proceeds from Chinese sales to the government; he demanded veto power over U.S. Steel as part of its sale to the Japanese company Nippon Steel; and MP Materials, which operates a rare earth mine in the U.S., got a $400 million government investment that made the Department of Defense its largest shareholder.

In his Monday morning Truth Social post defending the Intel agreement, Trump said, "I will make deals like that for our Country all day long."

But as Lincicome notes, Republicans likely won't be in power forever; in time, a Democratic president would have the same influence on Intel—and beyond.

"This is a product of both parties forgetting a cardinal rule of politics: don't give yourself powers you don't want your opponents to have," writes Ryan Young, an economist at the Competitive Enterprise Institute. "The Democrats who passed the CHIPS Act likely did not foresee Republicans using it to essentially nationalize Intel. Similarly, Republicans cheering government takeovers of chipmakers will somehow be surprised if Democrats invoke similar powers in the health insurance, energy, and other industries when they are in power again."

The post Trump Says He 'Paid Zero' for the Government's $11 Billion Stake in Intel. Here's the Downside. appeared first on Reason.com.

freeAgent (Los Angeles, CA, 2 days ago): Trump blatantly lies all the time. Does he believe the things he says? If so, he's incompetent and should be removed from office.

America’s Mines Are Literally Throwing Away Critical Metals – Mother Jones

acdha (Washington, DC, 4 days ago): Imagine if we had sensible industrial policy as opposed to a Maoist Great Leap Backward to the age of coal?

Toward a Shallower Future

Art by GPT-5. Prompt: “an image that evokes ‘The Triumph of Death’, with similar buildings and a similar layout, but where everyone is healthy and happy”

Today I saw my friend Noor Siddiqui getting some grief on the internet. Noor is the founder and CEO of Orchid, a company that will select your embryos for IVF in order to avoid passing on genetic diseases. As someone with a number of friends who have genetic disorders, this seems highly appealing — I think most parents would want their child not to have to suffer the same innate handicaps that they suffered. Noor was recently interviewed about her company by Ross Douthat of the New York Times.

When she tweeted out a link to the interview, Noor asked:

What if your baby never walks? What if they are never able to live independently? What if you could have stopped it… but chose not to? That’s the question @OrchidInc’s embryo screening forces. You optimize everything…career, diet, skincare…but you’re going to chance it on your child’s genome, one of the most significant determinants of their health?

A lot of people on X got mad at this, calling it “eugenics”, claiming that it invalidated the life of people born with genetic disorders, and generally saying that Noor’s vision of healthy babies is dystopian.

The argument reminded me of one of my favorite essays that I’ve ever written — my New Year’s post in January 2024. It was about how lots of people have the instinct to value human suffering, and to disdain technological solutions that make the struggles of the past obsolete. I thought I’d repost it, because I think it applies to the controversy over embryo screening as well.


“I would love to live to be 50 years old.” — Keith Haring

Yes, this post starts with the latest ridiculous contretemps on the social media platform formerly known as Twitter. But I promise, it gets more interesting!

The latest contretemps revolves around a famous painting: Keith Haring’s Unfinished Painting. Painted in 1989, it represented the artist’s impending death from AIDS. Haring died the following year, at the age of 31.

It’s an incredibly haunting, tragic image. The streaks of paint falling from the fragment of a pattern immediately evoke tears, blood, disintegration, futility; they emphasize just how much of the canvas was left blank. It’s a reminder of how much of our potential as individuals is wasted, and of an almost-forgotten pandemic that claimed 700,000 lives in the U.S. alone.

The other day, a pseudonymous account named DonnelVillager1 posted an AI-generated image that “completes” the pattern in the upper left of Haring’s painting:

DonnelVillager’s post — perfectly calculated to simulate ingenuousness, while actually poking fun at art appreciators — was itself a masterwork of internet pranksterism. It was instantly condemned by tens of thousands of angry Twitter users for “desecrating” Haring’s art. Defenders responded that DonnelVillager’s trollish tweet was itself a work of art, and that the furious response proved that AI art has the potential to be transgressive and to tweak the cultural orthodoxy’s tail.

Normally I would just shake my head at one more social media food fight and move on. But this reply by my friend Daniel caught my eye:

Of course, Daniel is also poking fun, but in a very important way, he’s right. If AIDS had never existed — or if HIV treatments had come just a little sooner — Haring might have created something like DonnelVillager’s AI image. After all, a fair amount of Haring’s other work did look like that.

And yes, without AIDS, Haring very well might never have produced anything as haunting or evocative as Unfinished Painting. His art might have remained forever cheerful and whimsical, peppered with the occasional political statement. This June, William Poundstone wrote that “Everybody loves Keith Haring, but nobody takes him seriously…The [latest] exhibition does not exactly demolish the notion that Haring was repetitious.” The AI image that DonnelVillager created is an incredibly shallow thing — an unthinking regurgitation of meaningless patterns in a Haring-like style by a large statistical model. But without the pressure of a life cut short, Haring’s art might never have been as deep as it was.

Yet that would have been a good trade. Unfinished Painting is a great work of art, but it wasn’t worth the price of Haring’s life. Without AIDS, the world might have been a bit shallower, with less tragedy for humans to struggle against. But no one in their right mind wishes for tragedies to continue just so that human life can continue to be filled with pathos. Adversity is not worth the price of adversity. Even a world where Keith Haring lived to old age, but every one of his paintings was pointless AI-generated crap, would have been preferable to the world we actually got.

This got me thinking about the meaning of progress.

One of my grandfathers was a bombardier in the European theater of World War 2. He came back uninjured, but the stress of so many near-death experiences, and so many dead friends, drove him to lifelong alcoholism. Once, in the 1990s, I heard a conservative pundit claim that young Americans had become soft and weak because they had never had to face adversity like the World War 2 generation did. I asked my grandfather what he thought of that. After uttering something unprintable, he said: “I did that [stuff] so you wouldn’t have to.”

In a letter to his wife in 1780, John Adams, one of America’s founders, expressed a sentiment that was very similar to what my grandfather felt — and with which many veterans undoubtedly agree. He wrote:

I must study politics and war, that our sons may have liberty to study mathematics and philosophy. Our sons ought to study mathematics and philosophy, geography, natural history and naval architecture, navigation, commerce and agriculture in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry and porcelain.

Embedded in these statements is the belief that the trials and challenges of the world are potentially impermanent; that rather than something to be endured again and again ad infinitum, they are something that can and should be conquered and put behind us forever. It’s the belief that with effort, we can create a durably better world.

That’s not a trivial assumption. Humans have always dreamed of creating a better world, but for most of our history, the world stubbornly refused to get better at anything faster than a snail’s pace. A human in China or Europe or the Middle East in the year 1400 didn’t live a significantly better life than one in 400 B.C. Civilizations would rise, but then they would fall, smashed back to earth by something that looked suspiciously like a Malthusian ceiling. As a Frenchman in the year 1000 you could dream of creating God’s kingdom on Earth, but short of supernatural intervention, you could not reasonably dream of a world without smallpox, bedbugs, or senile erectile dysfunction.

Then, of course, something changed. By now you’ve all seen the graph where world GDP creeps along and then explodes upward like a hockey stick; I won’t post it again. Instead I’ll post this one:

For American women to die from pregnancy used to be a normal occurrence; then in the 1930s and 1940s it became an extreme rarity. Suddenly, a fundamental fact of human suffering that had stubbornly resisted change since time immemorial simply gave way. We fought and lost, and fought and lost, and then one day we fought and won.

The proximate reason for the abrupt decline in maternal mortality was the discovery of antibiotics in 1928, and the development of medical practices like blood transfusions whose safety depends on antibiotics. But although penicillin was discovered by accident, it didn’t simply appear out of nowhere; its discovery required the edifice of an industrial society that took centuries to build. The victory over maternal mortality was achieved by a long struggle, not by a happy accident. (In fact, in some countries, maternal mortality began to fall in the 1800s, thanks to the wealth created by industrialization.)

A romantic could argue, if they were so inclined, that the conquest of maternal mortality has made the world a shallower place. In the early 1800s, you could tell stories whose emotional power rested — explicitly or silently — on the universal knowledge that childbirth meant mortal danger. Today, our high school English teachers have to explain this to us when we read Jane Austen or Emily Brontë, just so we understand, on an intellectual level, how brave the women in their novels were.

Such conquests have become commonplace. HIV was a death sentence in 1995; the next year, David Ho and his team unveiled a new combination drug therapy that turned it into a manageable chronic disease. And now, almost three decades later, Unfinished Painting is already becoming something that most people need explained to them; we still understand the meaning of terminal disease, but the context of AIDS, and especially what it meant to gay people in the political climate of the 1980s, is already fading from living memory into dry history.

As the world becomes safer — as one after another edifice of human suffering crumbles before the collective might of science, technology, and industrial society — it becomes harder to harness the emotional power of tragedy, risk, adversity, and heroism. The lives of more individuals become childlike, pure, and unmarked — or at least a little bit more so than before.

I first realized this years ago, while watching Disney’s The Little Mermaid. In the original 1837 Danish fairy tale, the mermaid wagers her life on a chance to win the love of a prince; she fails, and her life is forfeit to the evil sea witch. In the 1989 Disney movie, the same exact thing happens — except instead of passively accepting defeat, the mermaid and the prince simply stab the sea witch in the chest with the broken prow of a ship, and live happily ever after.

Perhaps this is the kind of resolution that could only feel natural and satisfying in America, a country that grew up after the Industrial Revolution. Some call Hollywood endings shallow, but they reflect our everyday reality in the modern world; what is David Ho’s defeat of AIDS, but the stabbing of an evil sea witch in the chest?

Nor, I think, are we simply on a temporary upswing. Some romanticists imagine that society is a cycle, where hard times create strong men, who create good times, which create weak men, who create hard times. But whether or not that sort of institutional cycle exists, the technologies discovered during the last upswing will be preserved. Countries may collapse, but humanity will not forget antibiotics.

Nor is there any sign that this process will be naturally limited by humans’ inability to appreciate the improvements in their material lives. Life satisfaction keeps rising with GDP, with no satiation point in sight. Contrary to popular myth, suicide rates tend strongly to fall as countries become richer. The higher measured rate of depression in developed nations is likely due not to ennui, but to better diagnosis.

Some romanticists feel the urge to knock over the edifice of industrial society intentionally, in order to kick against the seeming shallowness of modern life — to return humanity to a world of toil and struggle, in order to ennoble us. But these dark romantics are rightfully recognized in fiction and public discourse as villains. The heroes of our stories are the people like David Ho — the ones who fought to hoist humanity up from the muck so that future generations could be a little more childlike, the ones who studied politics and war so that our grandchildren may study statuary, tapestry, and porcelain.

Romanticists need to accept that the nobility of suffering has always been a coping mechanism — a way to sustain hope through the long twilight of apparent futility. And they need to accept that heroism is always inherently self-destroying — that saving the world requires that the world is worth having been saved.

And they must at least try to understand that in a more general sense, happiness isn’t truly shallow — it just has a different kind of depth. The passions of people raised in a kinder, gentler world may be alien and incomprehensible to the older generation, but they are no less intense, and the culture around them is no less complex. Adversity forces us to rise to its challenge, but abundance allows us to discover who we might become, and that is a different sort of adventure.

Looking back on my own life so far, I remember the happy child I was, before clinical depression changed me. Depression is horrible, but it added a richness and depth to the person I am today, and I appreciate the value of those changes. But if that happy child had gotten a chance to grow up without depression, I think he would have been changed in different ways, and under the tutelage of gentler teachers, would have become no less worthwhile and interesting of a person.

So it must be with humanity. The modern world of push-button marvels has lost something, but it has gained more than it has lost. By celebrating it, we honor the countless millennia of heroes who worked in some small way to bring it about, even as we dedicate ourselves to continuing their great enterprise. Our legacy is to fill the Universe with children who laugh more than we were allowed to.



1

Interestingly, DonnelVillager’s handle is one of the things that inspired me to write this post. It’s a reference to one of my favorite video games, Fire Emblem: Awakening. The character Donnel is a simple villager who is forced to go fight in an apocalyptic war after his father is killed by bandits. If you take care to level him up, he becomes a very powerful hero, but at the end of the war he goes back to his farming village and lives out a simple life, giving up fighting and adventure forever. His story serves as a reminder that struggle is not done for struggle’s sake.

mareino (Washington, District of Columbia, 2 days ago): "Adversity forces us to rise to its challenge, but abundance allows us to discover who we might become, and that is a different sort of adventure."

America has only one real city


Americans who go to Tokyo or Paris or Seoul or London are often wowed by the efficient train systems, dense housing, and walkable city streets lined with shops and restaurants. And yet in these countries, many secondary cities also have these attractive features. Go to Nagoya or Fukuoka, and the trains will be almost as convenient, the houses almost as dense, and the streets almost as attractive as in Tokyo.

The U.S. is very different. We have New York City, and that’s about it. People from Chicago or Boston may protest that their own cities are also walkable, but transit use statistics show just how big the gap is between NYC and everybody else:

Source: Census Bureau via @StatisticUrban

Chicago, Boston, and the rest have their old urban cores with a few train lines and some shopping streets. But for the most part, even these cities are car-centric sprawl. You can also see this in the population density numbers; New York simply towers over all the rest:1

Source: Census Bureau via Wikipedia

There’s simply no other town in America that looks and feels like NYC.

Some of the reasons for this are historical. NYC became a big city before the rise of the mass-market passenger car, so it had to use transit to move people around; many cities, like L.A., Houston, and Phoenix, saw their growth happen later. America’s car-friendly policies, abundant land, and desire for suburban living created the car-centric development pattern that we see in many cities in the West and South today.

But many older cities don’t have this excuse. For example, take Philadelphia. In 1910, NYC was only three times bigger than Philly; by 1960 it was almost four times as big, and by 2010 it was five times as big. In other words, Philadelphia had its big growth spurt earlier than NYC did, but its outcome in terms of walkability and transit is just much weaker, with fewer than 20% of Philadelphians using transit for their commute. Very little of downtown Philly looks like Manhattan.

The reason NYC is so much bigger than every other city in America is partly mathematical — every country tends to have one city that towers over the rest in terms of total population. And it’s partly economic — Ed Glaeser has a great essay on the industrial history of NYC. But those reasons can’t explain why NYC is so much denser than other cities. In fact, because NYC includes such an unusually large percent of its metropolitan area (44%, compared to less than 33% for other major cities), you might naively expect it to be less dense — San Francisco is just the tiny metropolitan core of the Bay Area, while NYC includes Staten Island and other outlying areas. Yet NYC is still far denser than SF or any other large American city.
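The "mathematical" reason mentioned above is the rank-size regularity (Zipf's law) in city populations: the r-th largest city in a country tends to have roughly 1/r the population of the largest. A minimal sketch, where the 20-million figure is an illustrative, rounded stand-in for the largest metro's population rather than a precise statistic:

```python
def rank_size_population(rank: int, largest: float = 20e6) -> float:
    """Expected population of the rank-th largest city under the rank-size rule."""
    return largest / rank

# Under the rule, the No. 2 city is about half the size of No. 1,
# the No. 3 city about a third, and so on.
for rank in range(1, 5):
    print(rank, round(rank_size_population(rank) / 1e6, 1))
```

This is why one dominant city is the norm everywhere; what the rule does not predict, and what policy explains, is the density gap between NYC and everyone else.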

The reason NYC is America’s only truly dense large city is due to policy. Other cities have restrictive zoning codes that limit floor-area ratios, impose citywide height limits, impose parking minimums, and restrict certain areas to single-family homes.2 For example, here’s a map showing just how much of San Francisco’s land (in pink) is zoned to allow only single-family homes:

Source: SF Planning Dept. via San Francisco Public Press

Keep in mind that this is America’s second-densest big city. New York City really stands alone in terms of allowing tall buildings.

New York City is also unique in having an extensive subway system. In terms of miles of rail, NYC has more than any other American city, but just as important is the shape of the network. NYC’s subway is a dense grid that covers all of Manhattan and much of Brooklyn; other cities tend to have commuter rail systems that connect the city center directly to outlying areas but which aren’t as useful for getting around within the central city. For example, here are train maps for NYC, San Francisco, and Boston:

Source: MTA

American cities are no longer able to build subways. This is partly because we’ve outlawed the cheap methods used to build them:

The Works in Progress Newsletter, "Why we stopped building subways cheaply": The Linear No Threshold model says that there is no safe level of radiation exposure. There is overwhelming evidence it is false, yet it inspires the ALARA principle, which makes nuclear power unaffordable worldwide. (Lead article from Issue 19 of Works in Progress.)

But a lot of it is because of the same problems of low state capacity and excessive citizen input that block every other construction project in America.

In other words, America has only one New York because no other American city wants to become like New York. Throughout the country, “Manhattanization” is a scary term that gets thrown at any developer who wants to increase density.

And yet the number of Americans who want to live in NYC is not small; it’s huge. NYC 1-bedroom rents have been soaring, even as they stagnate nationwide:

Source: Zumper via CRE Daily

Someone wants to live in NYC, obviously. Partly that’s because of the enormous consumption benefits for the young wealthy childless people who love living in cities. And partly that’s because dense cities allow industrial clustering effects — everyone knows that if you want to hire good employees in banking, publishing, corporate law, and so on, it helps to be in NYC.

Is one city enough to hold all of the Americans who want to live in big, dense cities, as well as all of the Americans who need to live there for work? It is not. The middle class is being pushed out of NYC at a rapid clip. Americans are trying to pile into other cities, but NIMBYism isn’t letting those cities build many new houses to accommodate them; as a result, rents in other cities go up faster than wages.

America needs more than one NYC. It needs Chicago, Philadelphia, and other big old cities with existing walkable urban cores to step up and Manhattanize themselves, so that the country won’t just have one Manhattan.

How can this be done? The first step is simply to adopt NYC-style big floor-area ratios, as well as all the city’s other permissive building policies. Allow more density, and some density will get built.

The second thing these cities can do is to build more trains. Because the “cut and cover” policies that build subways cheaply are always very unpopular, this probably also means building elevated trains and surface rail. NIMBYism will have to be overcome, but that’s true of just about anything that anyone wants to get done. Cities should also focus on building trains that allow their residents to get around the city, rather than just get into and out of the city; this means constructing trains in a grid or web pattern.

Another idea is that if other big cities can reduce crime, their citizens will be less apprehensive about allowing more density and transit. NYC is one of America’s safest big cities, with a homicide rate of less than 4 per 100,000 residents as of 2024. Chicago, in contrast, was at 17.5, and Philadelphia at 16.9. San Francisco has a fairly low homicide rate of 6.4, but it still has a big problem with public disorder, including fentanyl use, homeless encampments, store raids, and general lawlessness. Reducing this public disorder — as well as crime in general — would make it far more appealing to live in a dense area, to walk down shop-lined streets, to take the train, and so on.

Some Americans instinctively recoil from calls to make more cities like NYC. They prefer their single-family homes, their cars, their strip-malls and lawns. Fine. But those people should consider that if America had one or two more New York-style cities, the people who want to live in that sort of city would move there, freeing up more space for everyone else.

The U.S. needs both dense cities and suburbs, in order to satisfy all the different Americans who want different lifestyles. We are overweight on Los Angeles type cities, and underweight on NYC type cities. We need to restore balance, by converting more of our big old cities into gleaming new Manhattans.



1

This is not true of, say, Japanese cities. Osaka is actually about twice as dense as Tokyo. That’s partly an artifact of how density is measured; Tokyo is more of an office town, where people commute in from residential areas outside the city proper.

2

NYC has a few other innovative policies that allow it to achieve greater density. These include density bonuses, special-purpose districts, as-of-right development, and the ability to sell unused floor-area ratio so that nearby buildings can use it.


Hundreds Of HHS Staff Sign Letter Begging RFK Jr. To Stop Making Them Targets With Misinformation


As you will recall, a single gunman opened fire on a CDC campus in Atlanta earlier this month, claiming to have been injured by COVID vaccines. The rhetoric he had used prior to the shooting closely aligned with what RFK Jr. had been spouting for years. While Kennedy took nearly a day even to comment publicly on the shooting, local CDC leadership was fielding questions from the Atlanta team that amounted to asking how the organization was going to stop the flow of misinformation from Kennedy’s mouth, which had made them targets for this gunman in the first place. They got their answer when Kennedy commented publicly the next week, reiterating all the same rhetoric that had caused them to be targeted.

It’s perhaps not surprising then that hundreds of CDC staff signed an open letter essentially begging Kennedy to stop putting them in potential crosshairs.

More than 750 employees across the Department of Health and Human Services sent a signed letter to members of Congress and Health Secretary Robert F. Kennedy Jr. on Wednesday morning, calling on the secretary to stop spreading misinformation.

The letter states the deadly shooting that occurred at the Atlanta headquarters of the Centers for Disease Control and Prevention on Aug. 8 was “not random” and was driven by “politicized rhetoric.”

The signatories are accusing Kennedy of endangering the lives of HHS employees by spreading misinformation.

It’s a cry for help coming from within the organization that Kennedy is responsible for. These are people worried that their lives are being put at risk by Kennedy and his ilk, all due to the irresponsible claims he’s made for years, and continues to make to this day.

But if you were expecting empathy from the leadership of HHS, you’ll be sorely disappointed. Instead of that empathy, a statement from HHS apparently accuses signatories to that letter of “politicizing” the mass shooting that targeted them.

In a statement to ABC News, HHS said, “Secretary Kennedy is standing firmly with CDC employees — both on the ground and across every center — ensuring their safety and well-being remain a top priority. In the wake of this heartbreaking shooting, he traveled to Atlanta to offer his support and reaffirm his deep respect, calling the CDC ‘a shining star among global health agencies.'”

“For the first time in its 70-year history, the mission of HHS is truly resonating with the American people — driven by President Trump and Secretary Kennedy’s bold commitment to Make America Healthy Again,” the statement continued. “Any attempt to conflate widely supported public health reforms with the violence of a suicidal mass shooter is an attempt to politicize a tragedy.”

This is bullshit. CDC staff are not politicizing the shooting; they’re begging to not be made targets. That statement is so far afield from the actual request in the letter that I don’t even know how to respond to it, other than to say that it’s quite obvious Kennedy is refusing to moderate or alter his rhetoric. Conspiracy theories appear to be more important to him than the lives of those under his employ.

This is governmental malpractice. It needs to stop. It won’t stop unless someone in a position of power does something about it.


"No more harbor seal for me, thanks. I'm full."

On two separate occasions, Jared Towers was in his research vessel watching killer whales off the coast of Vancouver Island when the orcas dropped their prey directly in front of him and his colleagues.

The encounters he describes as "rare" and awe-inspiring have led to a new study published in the peer-reviewed Journal of Comparative Psychology, detailing researchers' experiences with killer whales apparently sharing their food with humans...

Towers and his colleagues began an investigation that led to the study published on Monday, which examines 34 instances in which killer whales around the world appeared to offer their prey to humans...

In all but one of the situations, the study says the whales were observed waiting for people to respond before either recovering or abandoning their prey.

"These weren't mistakes. They weren't like the killer whales accidentally dropped the food. They wanted to see how people responded," Towers says.

The study does not rule out any selfish motivations behind the behaviour. But Towers says he feels the apparent prey sharing is "altruistic" and "pro-social."
More details at The Canadian Press.

mareino (Washington, District of Columbia, 3 days ago): They are trying to establish First Contact, dummies. Eat the seal and give the Vulcan Salute.