People Don't Understand Affordability Requirements

Shane Phillips, a researcher at the UCLA Lewis Center for Regional Policy Studies, released a tool (using the Terner Center at UC Berkeley’s data) that shows the trade-off between mandating higher percentages of low-income housing and the production of market-rate housing in privately financed developments. As the percentage of inclusionary units increases in new developments, the number of market-rate homes declines. Most importantly, once an inclusionary requirement reaches 20-25%, it stops producing additional low-income units and starts decreasing the number of low-income homes built, because the overall volume of market housing shrinks. A higher percentage does not mean the quantity, the real number of low-income homes, is higher. You’d be surprised how many people in housing discourse are mathematically challenged by the concept of fractions.
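To make the fractions point concrete, here is a minimal Python sketch, and only a sketch: the decline curve below is a hypothetical stand-in for the Terner Center pro-forma data behind Phillips’s tool. The arithmetic is the point: low-income units are a percentage of a shrinking base of feasible projects.

```python
# Hypothetical model of the inclusionary trade-off. The linear
# decline in feasible production is illustrative, not real data.

def feasible_units(share, baseline=1000):
    """Total homes built at a given inclusionary share; production
    is assumed (hypothetically) to hit zero at a 40% requirement."""
    return baseline * max(0.0, 1.0 - share / 0.40)

for share in [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]:
    total = feasible_units(share)
    low_income = share * total  # a fraction of a shrinking base
    print(f"{share:.0%} mandate -> {total:4.0f} total homes, "
          f"{low_income:3.0f} low-income homes")
```

Under this made-up curve, the count of low-income homes peaks around a 20% mandate and falls after that; the exact peak depends on local economics, but no percentage can outrun a collapsing base.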

Inclusionary zoning started in the early 1970s in California to encourage suburban communities to integrate by mandating a percentage of low-income housing in new rental developments. When President Nixon froze public housing construction, some urban communities turned to inclusionary zoning as a supplement. It became popular among progressive jurisdictions as a way to open middle-class-oriented rental housing to low-income residents.

Within ten years of implementation, pundits, researchers, and many of its original sponsors conceded it didn’t build much low-income housing. Here’s Donald Terner, Housing Director under Jerry Brown and one of inclusionary zoning’s original sponsors, explaining its primary problem upon roll-out.

Local government offers the developer incentives to make [affordability requirements] economically feasible, says Terner. Without greater density, government loans, land write-downs or parking regulation waivers, inclusionary zoning was "well-intended nonsense."

— SF Examiner, 2/9/1986

An excerpt from a book on the East Bay Area’s left-wing economic revolution in the 1970s explains how the socialists in Berkeley who introduced one of California’s first ordinances knew their 25% inclusionary requirements weren't feasible.

The requirement that low-to-moderate income housing be provided in any development was included to guarantee that new development would not be exclusively for wealthy residents. But proponents also understood that no private, speculative developer would either desire to provide lower priced housing, or be able to afford such inclusions without subsidies.

Inclusionary zoning is, as Donald Terner said, a well-intended tool, one sometimes weaponized by exclusionary communities, or by proponents, to act as an unofficial housing moratorium, with voters often none the wiser. Most people, when asked what percentage of new developments should be low-income, have no idea about financial feasibility or where the funding comes from, so whoever says the highest number wins.

That’s how you get results like Portland’s, where a 20% inclusionary policy that promised 555 low-rent homes a year delivered only 189 a year. In 2016, San Francisco voters passed a nice-sounding 25% affordability requirement. The Controller’s Office predicted it would create fewer low-income homes, but many politicians and activists dismissed this as neoliberal fearmongering. Like clockwork, just two years later the city saw a 75% reduction in low-income housing funding because the revenue it collects from housing construction tanked. Finally admitting the policy didn’t work, the Board of Supervisors reversed it last year as homelessness skyrocketed and construction became completely unfeasible. A painful lesson in mathematics: making percentages higher does not inherently make the quantity higher.

While San Francisco’s outcome was unintended by voters, some exclusionary communities abuse inclusionary zoning knowing exactly what they’re doing. Atherton, the wealthy Silicon Valley suburb, mandated 25% inclusionary housing as a way to subvert state law forcing it to build homes. Republican-run Huntington Beach has announced its intent to do the same.

Let's be honest: there is no city in the United States, or anywhere on Earth, where unfunded mandates for subsidized housing in new developments produce a lot of low-income housing. Most places with high inclusionary rates are among the most unaffordable cities in the world and produce the fewest homes.

Inclusionary zoning higher than 5% or even 10% is only feasible or profitable in very expensive markets. San Francisco’s own Planning Department found that with high inclusionary requirements, the only places where building housing is not revenue-negative are the highest of high-rent neighborhoods. Rents are set by the market, by what enough people are willing to pay, and market rents must generate enough revenue to cover the cost of construction, debt service, and a profit margin.
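Here is a minimal pro-forma sketch of that constraint, with hypothetical numbers (real underwriting involves financing terms, vacancy, operating costs, and much more): the blended rent across market-rate and restricted units has to clear the all-in monthly cost of delivering each unit.

```python
# Hypothetical feasibility check: does blended rent cover the
# per-unit cost of construction debt, operations, and margin?

AFFORDABLE_RENT = 1200   # assumed restricted rent, $/month
ALL_IN_COST = 3000       # assumed all-in per-unit cost, $/month

def project_pencils(market_rent, inclusionary_share):
    blended = (market_rent * (1 - inclusionary_share)
               + AFFORDABLE_RENT * inclusionary_share)
    return blended >= ALL_IN_COST

for market_rent in (3200, 4500):      # mid-market vs. luxury rents
    for share in (0.10, 0.20):
        verdict = "pencils" if project_pencils(market_rent, share) else "doesn't pencil"
        print(f"${market_rent}/mo market rent at {share:.0%}: {verdict}")
```

With these assumed numbers, a 20% requirement only pencils at luxury rents, which is exactly the pattern described next.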

Inclusionary mandates approaching 20% mostly materialize in places where projects are profitable enough to cross-subsidize those inclusionary units, and those projects are usually expensive luxury housing. That’s why most of the yield from inclusionary zoning comes from places like downtown Los Angeles or San Francisco rather than middle-class areas, where rents aren’t high enough to cross-subsidize a fifth of the homes.

San Francisco’s real estate market, for example, is dealing with two shocks: high interest rates meant to deter so-called over-construction, and the outmigration of higher-income professionals from the city. There are enough luxury condos and high-end rentals in downtown S.F. relative to demand, so construction went from modest to almost nothing. While middle-class developers can build in cheaper places like Texas or Arizona, only high-end luxury developers can afford the cost of construction in San Francisco and much of California. When the median new home in San Francisco costs well over $1 million per unit to build, thanks to high material costs, high interest rates, and an expensive approval process, you cannot price a home any lower than that.

We should look to Vienna, Austria, for how to approach integrated housing development. American puff pieces about Vienna never engage with the substance or specifics of how Vienna achieves low-income housing ratios as high as 40% in new developments. Chiefly, it is not an unfunded neoliberal mandate expecting developers to provide public goods in exchange for incentives. Vienna builds social housing via a 1% income tax that feeds a dedicated housing fund. Municipal agencies then use these funds to foster competition among client developers. This is also why development looks better in Vienna than in most American cities (that, and its building regulations are modern and more evidence-based). Whereas inclusionary zoning essentially taxes new housing to pay for subsidized units, Vienna taxes the city’s wealth to pay private developers, with loans and land leasing, in exchange for high-quality social housing on the city’s terms.

The unfunded-mandate version of inclusionary zoning we practice in expensive U.S. cities doesn’t work at producing sizable amounts of low-income housing, and we knew that within years of its creation. It was a neoliberal solution to dwindling HUD investment in public housing and unsuccessful integration. It’s as silly as mandating that grocery stores sell 20% of their food below market price to solve food insecurity, or that 20% of gas stations sell below-market fuel to solve transportation. The debate should be over whether to subsidize the market producers, create public goods, or subsidize people in need.

I don’t believe inclusionary zoning is bad. Once we acknowledge that subsidized housing requires subsidies, and that taxing new housing is suboptimal compared to taxing income or land, the actual drivers of high rents and home prices, we can start to realize some solutions. Realistically, unfunded inclusionary requirements should not exceed 10% of a project and should be traded for swift approvals, greater density, and fee exemptions. Our most successful inclusionary programs are density bonuses, so let’s build off that. In the 1990s, some California cities, including progressive Berkeley, approached inclusionary zoning by offering loans to developers that they would pay back out of realized rents. Smart!

Inclusionary zoning is not an affordability policy; it’s an integration policy that is constantly misused. Unfunded mandates can’t replace the hard work of taxing monied interests, chiefly landowners and wealthy residents (which includes all real estate holders as well as developers), to pay for public goods. Cities wrongly depend on the borrowed capital of multifamily developers, not even house flippers or single-family developers, to subsidize housing. Rents and home prices derive not from who builds housing but from who owns housing and the land under it. We should subsidize low-income housing by taxing landowners and the high incomes inflating housing costs, not by taxing the thing we lack: housing.

Stop making people do the wrong jobs

NextCity recently published a hot take by Steffen Berr tying the ways in which the US is failing at reducing pedestrian deaths to the misaligned training that most transportation engineers in the US receive. Berr explains that a transportation engineer “is really a civil engineer who has received a little exposure to the transportation sector.” Due to the structure of accredited degree programs, “In a best-case scenario, a civil engineer will only take three transportation classes during their bachelor’s degree. In the worst case, they’ll only take one: Introduction to Highway Engineering. To put this into perspective, the most educated professionals we entrust to design and run our roads and streets have received only half of a minor with a handful of credits on the topic.”

Berr goes on to address the reasonable objection that in many fields, people learn on the job. But what transportation engineers learn on the job, per Berr, is not things like how to choose the most appropriate intersection for the desired use, how the road system should be laid out at a network/route level, or how to fix congestion (none of which, he argues, they learn in school either.) Instead, they learn “how to navigate the impressive amounts of bureaucracy that have been built up in the industry, memorize an impressive vocabulary of technical jargon, practice with design software like AutoCAD to produce engineering plans, and how to copy the current engineering standards. There is no exposure to deep levels of theory that can help our future professionals create original solutions to fundamental problems like safety, congestion, emissions and ethics.” 

I’m less interested in Berr’s point about the wrong degree requirements than I am in his observation about what the job of transportation engineer actually is. As Stafford Beer observed, “the purpose of a system is what it does,” and by analogy, the purpose of a job is not its stated goals but what the people who do it actually do day to day.1 When talking to people who’ve never worked in government, the biggest disconnect is usually a lack of understanding of the actual jobs of public servants. A rather dramatic illustration of this comes from a Mercatus Center podcast with Lant Pritchett in which he shares an anecdote about advocating for evidence-based policy in the Indian bureaucracy. 

After they had done the RCT [randomized control trial] showing that this Balsakhi program of putting tutors in the schools really led to substantial gains and learning achievement and reading outcomes, he took it to the secretary of education of the place in which they had done the RCT. And he said, “Oh, by the way, I have the solution to your problem of low learning levels, or at least part of the solution. Look, we’ve got this powerful evidence that this works to improve learning outcomes by putting these volunteer tutors in and pulling the low-learning kids out.”

The response of the secretary of education was, “What do you think my job is? Why do you think that this is a solution to a problem I have? Look around my office. See these piles and piles of files that keep me busy 60 hours a week and not one of these files is about a child not learning. I’m under no pressure about that problem. If I try and transfer a teacher, I’ve got a court case on my hand. If I try and close a school, I got a court case on my hand. My job is to administer the existing education policy such that there’s policy compliance. Super kudos to you for this cute little study you’ve done. It has nothing to do with my job as secretary of education.”

Ouch. And that’s a secretary of an agency serving a country with 1.5 billion people.

I suspect a lot of public servants in the US will read that and think “My job is not quite as bad as that but it sure feels that way a lot.” The people I know maintain enough connection to the actual mission to avoid such a meltdown (though I find the secretary’s frankness refreshing.) But both these stories help explain a conundrum that many who care about effective government (or, shall we say, state capacity) struggle to explain: the contradiction between the dedication, smarts, and creativity of most public servants and the sometimes terrible outcomes they are associated with, like the recent tragic lapses in administering student loans by the US Department of Education. (Or in Berr’s world, the 40,000 traffic deaths we’re stuck with every year while countries like the Netherlands have dropped their own already low number by 46%.2) To be sure, there are often extraordinary outcomes (hello Direct File!), and we notice them far less often, to our own detriment. But while it’s impossible to give government a meaningful overall grade, if its job is to meet challenges we face (national security, climate change, an effective safety net, etc.), we are at risk of falling dangerously short. The problem isn’t that public servants are doing a bad job, it’s that they’re doing a great job — at the wrong jobs.

The (unnamed in this context) Indian Secretary of Education seems to agree: “My job is to administer the existing education policy such that there’s policy compliance.” I highly doubt that’s the job he thought he was getting, or the job he wanted to do. Berr is on the same general theme when he says that what transportation engineers learn on the job is “how to operate in the industry effectively as it has been currently set up.” Note his use of the word effectively. Effective towards what? Not towards reducing traffic deaths or congestion levels. “All the experience in the world of copying and pasting a standard invented fifty years ago is useless when the problems that the standard was invented to resolve have changed,” he says. “Understanding this sheds a lot of light as to why 40,000 people are still dying on our roads every year and why your local city insists on laying down sharrows [which are known to be ineffective and often dangerous] in their latest round of “safety improvements.” Quite frankly, it’s because we have no idea what we are doing.”

This is a useful nuance as I develop a framework for building state capacity. One of my admittedly obvious and oversimplified tenets is that systems have both “go energy” and “stop energy,” much as a car has a gas pedal and a brake. You wouldn’t drive a car without a brake, but you also wouldn’t drive a car in which the brake was pressed all the time, even when you were trying to accelerate. This is a good metaphor for how we’re dealing with the implementation of CHIPS, IRA, and the Infrastructure Bill, for example, where the clear intent is speed and scale but the public servants responsible are held back from that by the brakes of overly zealous compliance functions. I hear a version of this at every agency I visit: “Congress tells us to do something. Then the compliance offices keep us from doing that very thing.” (And side note for further discussion: This is an issue of representation, voice, and democracy.) The stop energy in our government is currently a lot bigger than it should be. We’re hitting the gas but we’re not accelerating because we’re pressing the brake at the same time. 

Lots of people in government have “stop energy” jobs. We need them, and we need them to be good at them. I don’t want to live in a country where our government doesn’t exercise “stop authority.” I try to remember not to complain when my flight is delayed because I really don’t want to die in a plane crash, and a rigidly implemented checklist is a big part of how we keep safe (the current epidemic of doors and engine cowlings blowing off notwithstanding). I also really like being pretty confident that a pill I’m taking has been tested and not tampered with. I like thinking our nuclear arsenal is protected. You know, little things like that.

Stop energy is critical. Rigid adherence to protocol is usually lifesaving. But it must exist in balance. I recently learned the Navy concept of “front of sub/back of sub.” The back of a nuclear submarine, where the nukes live, is run by the book. You don’t deviate from the checklist. You don’t innovate. You don’t question. The front of the sub, on the other hand, is responsible for navigating through dark waters. You have to improvise. You have to make judgment calls. There are manuals and checklists, for sure, but the nature of the work calls for a different approach, and the Navy recognizes that the cultures of front and back have evolved appropriately to meet distinct needs. 

There are times, of course, when you’ll need front of sub judgment in a back of sub context. If the plane I was on was about to be bombed by an enemy combatant (unlikely in my life, I hope), I would be okay with the pilot using her discretion to cut a corner or two on the takeoff checklist, because the very thing that checklist is there to protect (the lives of the people on board) would be under threat from a different vector. Taking every precaution in that scenario could be reckless. That’s a bit how I feel about the NEPA reviews and other bureaucratic processes that are holding back building the infrastructure we need to move to a low-carbon economy. I wish for the public servants in charge to see the threat of inaction – those species the checklist is trying to protect are threatened by temperature rise as much or more than they are by the project in question – and make good judgment calls about getting the plane off the runway a lot quicker, so to speak. This feels like a domain where back of sub culture has more hold than it should given the circumstances. And to Berr’s point, we can’t rely on back of sub culture when the checklist and protocols it uses no longer serve the purpose.

Of course, “stop energy” roles can themselves be balanced – if only I had a dime for every discussion about the value of lawyers who get to yes and the frustrations with those who seem to do nothing but block. The analogy breaks down a bit here because the items on a pre-flight checklist are binary – they are either red or green – whereas the ad hoc checklists that lawyers assemble to ensure compliance before signing off on an action are almost always shades of gray – they can be open to lots of interpretations. Any given lawyer, or compliance officer, or ethics cop can treat their role with appropriate balance, reserving their stop authority only when the risks truly outweigh the benefits. But getting the culture of a team, department, or agency to balance stop and go correctly at a macro level is extremely difficult. It’s rare to see leadership really change that balance, or for it to stick. It’s a retail approach, hugely dependent on personalities and circumstances.

What would a wholesale approach to getting back into balance look like? One answer should be a simple matter of top-down workforce planning, of the kind our Office of Personnel Management should be empowered to do: fewer stop energy jobs relative to go energy jobs. Hire more doers than brakers, both in how the position is defined and in the characteristics of the people selected for the job. But that proposal needs several important caveats. Of course, every great employee is some mix of these energies – a “go only” employee would be exhausting and dangerous in all but the most extreme circumstances – so we’re talking about a general orientation. More importantly, having fewer brakers will only result in enormous backlogs if they retain the same stopping power. But there are plenty of functions where it’s possible to safely move from default no to default yes, possibly with an after-the-fact correction mechanism.3 Instead of requiring form redesigns to go through a long White House approval process before they can be made available to the public, for instance, allow agencies to apply the appropriate level of scrutiny and sign-off for the form at hand and develop a process for catching and quickly fixing anything determined to be detrimental. This example speaks to the issue of multiple levels of safeguards. Loosening a safeguard that operates at the top level of federal government may not make much difference to overall stop energy if agencies, or in turn their subcomponents, or even teams, react by strengthening their own safeguard processes. There might be something like a Law of Conservation of Safeguards at play here. But it’s still worth considering the value of moving to default yes processes where appropriate.

Of course, the question of the nature of the job public servants are tasked with is about much more than just stop vs go. It’s about what kind of work we’ve decided to invest in. I go into some depth about this in Chapter 5 of Recoding America as it relates to our lack of investment in digital competencies and how ideologies about private sector superiority led to a big outsourcing push just as digital was beginning to massively transform society.

…these internal competencies in digital became necessary just as we were jettisoning internal competencies of all sorts, not developing them. Instead of digital competency, government has developed extensive processes and procedures for procurement of digital work, and the ins and outs of procurements sometimes seem more complex and technical than the latest programming languages.

This points to another way to understand the disconnect between high employee performance and the outcomes our government produces (or fails to), especially relative to the investment made.4 Take procurement. I know a lot of people in procurement who are really good at their jobs. Some of them are considered really good because they’re great at the “back of sub” tasks of making sure every box is checked, and a manager might feel compelled to give them a high performance rating because of their thoroughness and dedication, even if the people who need the thing being acquired are frustrated by the slowness and rigidity of the process, and even if the thing that is ultimately acquired has checked all the boxes but doesn’t actually work. (For an example of this, see Chapter 4 of Recoding America.) But many of these procurement professionals operate according to “front of sub” principles, and are enormously creative and mission-driven. The other public servants who rely on them to procure things value them enormously. They may or may not receive high ratings, if the manager is judging them based on a “back of sub” approach. But procurement processes simply should not be as complex and burdensome as they have become. Both of these kinds of procurement professionals are doing a job that simply shouldn’t exist in its current form.

Especially with the looming threat of the return of Schedule F under a possible Trump administration, there’s a lot of talk of public sector employee performance and protections. I agree strongly with Donald Kettl, who has said about the left’s silence on civil service reforms in the face of Schedule F: “You can’t fight something with nothing.” I hope to be part of proposing a something there, something that improves government’s ability to fill many open positions and to effectively and ethically manage the workforce. But we could succeed entirely at that and still fail to meet the challenges in front of us if the jobs we fill are the wrong jobs.

Another of my admittedly obvious and oversimplified principles of how to build state capacity is that there are really only three things you can do:

  • You can have more of the right people

  • You can focus them on the right things

  • You can burden them less.

There is obviously quite a lot to say about each of those things, and they are all deeply intertwined. A big reason we don’t have more of the right people is that we overburden both the people responsible for hiring and the applicants, focusing both on the wrong things. We overburden public servants generally because we have designed too many of their jobs to stop bad things instead of to enable the things we desperately need. We are too often asking if public servants are doing a good job instead of understanding and questioning the nature of the jobs they’ve been hired to do. 

We need a much more robust understanding of how to fix the problem of hiring the right people to do the wrong jobs. We need wholesale strategies for tuning the dial between front of sub and back of sub, between stop and go, between brake and gas, and refocusing the job of public servants on the work that’s most directly meaningful towards the outcomes we want. We need staffers in agencies who act as if the climate crisis is the enemy plane that’s about to bomb us. We need transportation engineers whose actual job – as practiced on a daily basis, at scale – is to reduce congestion and pollution and improve and save lives. We need Secretaries of Education who have time in their day to look at the study on improving learning achievement, and maybe even take action on it. We need all of this now.

Imagine a world in which this — not just enforcing rules, not even just helping agencies fill open jobs, but ensuring that federal government fills the right jobs — was the mandate of an empowered and deeply collaborative Office of Personnel Management. They couldn’t do it alone, of course — it’s agencies that define the jobs they think they need and Congress that throws down law after law they must comply with, feeding the need for compliance. The White House Office of Management and Budget adds its own reporting and compliance burdens. Each would need to buy in on an agenda of building state capacity and do their part. But this is what workforce planning should really be, and in 2025, we will need it more than ever. If Biden gets a second term, this is the kind of ambitious agenda he should set.

EPA issues four rules limiting pollution from fossil fuel power plants | Ars Technica

Today, the US Environmental Protection Agency announced a suite of rules that target pollution from fossil fuel power plants. In addition to limits on carbon emissions and a tightening of existing regulations on mercury releases, additional rules target coal ash waste left over from power generation and contaminants in the water used during the operation of power plants. While some of these regulations will affect the operation of plants powered by natural gas, most directly target the use of coal and will likely be the final nail in the coffin for the already dying industry.

The decision to release all four rules at the same time goes beyond simply getting the pain over with at once. Rules governing carbon emissions are expected to influence the emissions of other pollutants like mercury, and vice versa. As a result, the EPA expects that creating a single plan for compliance with all the rules will be more cost-effective.

Targeting carbon

The regulations that target carbon dioxide emissions have been in the works for roughly a year. The rules came in response to a Supreme Court decision in West Virginia v. EPA, which ruled that Clean Air Act regulations had to target individual power plants rather than giving states flexibility regarding how to meet broader standards. As a result, the new rules target carbon dioxide the only way they can: Plants can either switch to burning non-fossil fuels such as green hydrogen, or they can capture their carbon emissions.

The EPA did recognize, however, that the decline of coal was handling some of the issue on its own. No new plants have been built in years, and most of the existing ones are growing increasingly old and expensive compared to cheap natural gas and renewables, leading to widespread closures. So the EPA set up tiers of rules based on how long plants were expected to be operating. If a coal plant would be shut within a decade or two anyway, it could simply continue operating as it had or meet less stringent requirements.

In the final rule, this has been simplified down into three categories. Any plant that will cease operations before 2032 will get an exemption. Those that will shut prior to 2039 will have to meet less stringent requirements, equivalent to replacing 40 percent of their fuel with natural gas. Anything operating past 2039 will have to eliminate 90 percent of its carbon emissions.

Natural gas plants will face similar tiers of stringency, but this time based on how often they're in use. Plants that operate at less than 20 percent of their capacity, such as those that simply fill in during periods of low renewable energy production, can meet regulations simply by adopting low-emissions fuel. Those that run between 20 and 40 percent of the time have to meet operational efficiency standards, while anything that's operational over 40 percent of the time will have to eliminate 90 percent of its emissions.
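As a reading aid, the tier logic as this article summarizes it can be sketched in a few lines of Python; the actual rule text contains far more conditions and definitions than this.

```python
# Compliance tiers as summarized in this article (illustrative only).

def coal_requirement(retirement_year: int) -> str:
    if retirement_year < 2032:
        return "exempt"
    if retirement_year < 2039:
        return "less stringent: equivalent of 40% natural gas co-firing"
    return "eliminate 90% of carbon emissions"

def gas_requirement(capacity_factor: float) -> str:
    if capacity_factor < 0.20:
        return "adopt low-emissions fuel"
    if capacity_factor <= 0.40:
        return "meet operational efficiency standards"
    return "eliminate 90% of carbon emissions"

print(coal_requirement(2031))  # exempt
print(coal_requirement(2045))  # eliminate 90% of carbon emissions
print(gas_requirement(0.15))   # adopt low-emissions fuel
```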

Additional changes will allow plants some temporary exemptions from regulations if they're deemed critical to maintaining grid stability.

Should the rules survive court challenges, it's unlikely that more than a handful of coal plants will continue operations. Since burning coal produces a large range of pollutants, this will provide substantial non-climate benefits. The EPA estimates that in two decades, there will be significant declines in nitrogen oxide and sulfur dioxide pollution, fewer particulates, and less mercury released into the environment. Over the intervening years, this will avoid 1,200 premature deaths, nearly 360,000 asthma problems, and roughly 50,000 lost work days, all of which adds up to substantial economic benefits.

Thanks to tax incentives for carbon capture contained in the Inflation Reduction Act and the continuing fall in the price of renewables, the EPA estimates that meeting the standards will result in a "negligible impact on electricity prices."


Limitations on mercury have existed for some time, and the EPA has been working on tightening those rules since shortly after Biden entered office. The rule being announced today targets the burning of lignite, a softer form of coal that burns inefficiently due to a high level of contaminants. Lignite-fired plants will see existing limits on mercury emissions drop by 70 percent; all coal plants will see limits on other toxic metals fall by 67 percent. Plants will also be required to install real-time monitoring systems and make their data available to the public.

Overall, this will cut mercury, arsenic, and lead emissions, with obvious benefits for public health; the EPA expects to see a lower risk of fatal heart attacks, cancer, and developmental delays in children. As an added benefit, compliance will also cut carbon emissions.

Separately, coal plants will see tighter regulations on the discharge of water. Water is used to move the material left behind when coal is burned, termed "fly ash," out of the combustion area and into longer-term storage. It's also used in the machinery that removes pollutants (including mercury and sulfur) from the exhaust gasses of coal plants. During these processes, the water frequently picks up the toxic contaminants that are associated with coal use.

The EPA is also tightening the limits on contaminants allowed in this water before it is returned to the environment. Again, coal-fired plants that will be closed within the next decade will be allowed to continue operating under present restrictions until their closing; only those kept open longer will need to meet the new requirements. "Following rigorous analysis, EPA has determined that this final rule will have minimal effects on electricity prices," the agency says. "EPA’s analysis shows that the final rule will provide billions of dollars in health and environmental benefits each year."

The final rule being announced today is largely closing a loophole in the existing rules regarding fly ash, which contains lots of toxic metals that can leach into the groundwater near storage facilities. Existing rules regulate many of the storage areas, but the agency has identified a number of inactive disposal sites at active coal plants, a situation that fell outside existing regulations. (Existing regulations targeted active disposal sites at operating plants and inactive sites at shuttered facilities.) The new rule brings these exceptions into the same regulatory scheme that governs the rest of the storage sites.

Sending signals

As noted above, the EPA argues that tying these regulations together will help those running coal-fired plants sort out how to meet them. "EPA is providing a predictable regulatory outlook for power companies, including opportunities to reduce compliance complexity, and clear signals to create market and price stability," the agency says.

Given that all four of these regulations target coal-burning plants, those "clear signals" are that coal is going away. It was doing so on its own, but the added regulations narrow the opportunities for coal plants to operate profitably.

Given the outsize impacts of coal pollution on public health, this also makes the EPA's economic case much easier. The vast costs of the health impacts will always dwarf the costs of compliance, especially in this case, where many plants will close for economic reasons before they even need to worry about compliance.

But the real battle will come in maintaining the rules governing carbon emissions in natural gas plants through court challenges and changes in administration. Natural gas is economically competitive, and it is currently playing key roles in both eliminating coal from the grid and balancing out the intermittent production from renewables. But long-term, our climate goals require that its emissions go away as well.

Given that these rules may not survive elections and the courts, it's not clear that the EPA's announcement is as direct a signal as our climate needs it to be.

A Close Examination of the Most Infamous Public Toilet in America

We think of adding regulation as something liberals do and removing regulation as something conservatives do. But that is only part of the story.

Self-driving cars are underhyped

The Obama administration saw a flurry of tech sector hype about self-driving cars. Not being a technical person, I had no ability to assess the hype on the merits, but companies were putting real money into it, and so I wrote pieces looking at the labor market, land use, and transportation policy implications. But the tech turned out to be way overhyped, progress was much slower than advertised, and then Elon Musk further poisoned the water by marketing some limited (albeit impressive) self-driving software as “Full Self-Driving.”

That whole experience seems to have left most people with the sense that self-driving cars are 10 years away and always will be.

I am still not a technical person, but at this point I am prepared to make a technical judgment: The current conventional wisdom is wrong and autonomous vehicle technology has become underhyped. The key mistake I’ve noticed people making is they don’t seem to realize that autonomous taxis are no longer a hypothetical future technology. They exist, and you can ride in them. Waymo has been operating in San Francisco and Phoenix for a while now and is expanding soon to Austin and to a sort of awkward-to-describe-accurately swathe of Los Angeles County.

The technology that exists is not without its limits. Waymo’s Driver software relies on detailed local mapping to work, which is not how humans drive cars. As a result, you can’t expand Waymo into an arbitrary geography without a significant capital outlay upfront. That’s a real business model challenge for the company, because this is obviously also a highly regulated space. In purely business terms, you would want to start with the most lucrative geographies to generate cash flow and expand out from there. But in practice, Waymo has to balance where it makes business sense to operate against where it can get permission. What’s more, it seems to be targeting only warm cities, which suggests a known problem with snow and ice.

So it’s not a solved problem; we’re still talking about a world of hype and potential more than a reality. But the reality is that if you have a business meeting in Downtown Phoenix, you can take a driverless taxi to the airport when you wrap up. Not in the future. Not hypothetically. But right now. We’re looking primarily at questions of business models and economics, not technology. Driverless cars are on the road as we speak, and more are coming in the very near future.

The freeway frontier

One reason not a lot of people know about the world’s actually existing driverless taxis is that even in the places where they operate, there often isn’t a good reason to take them.

The big problem is that Waymo’s vehicles aren’t authorized to drive on freeways. That means that in most cases, going somewhere in a Waymo One driverless taxi would be slower than going in an Uber or driving yourself. That’s fun if you’re a curious tourist, a journalist, or just a technophile who enjoys adventure. But what they have is essentially a product that is technically impressive but practically inferior to the existing alternatives. The inability to use freeways is a particularly crippling issue in Phoenix, given its urban design, but San Francisco also has a strong freeway spine for a lot of the most common trips you might want to take. And in San Francisco, the service area doesn’t include the airport, which in every city I’m familiar with is a really important pillar of the taxi market.

Importantly, though, this isn’t because there’s any technical problem with operating driverless cars on freeways.

The somewhat paradoxical thing about city versus freeway driving is that freeway driving is higher-stakes (because the cars are going much faster), but it’s cognitively and logistically easier. In the city, even when everyone is following the rules, you have all kinds of conflicts between moving cars and parking cars, between turning cars and pedestrians, between cars and bikes sharing lanes, and God knows what else. What’s more, it’s just not the case that everyone is following the rules. People unloading trucks double-park all the time. Pedestrians frequently cross mid-block. Cyclists don’t always stop at stop signs. And the rule violations compound in complicated ways: The truck parks in the bike lane, so the cyclist swerves into the traffic lane. Because the vehicles are all moving relatively slowly, you’re less likely to have a spectacular crash. But because the situation is more chaotic, it is a much more technically difficult problem.

Regulators have, reasonably, wanted Waymo to demonstrate success with the harder-but-lower-stakes problem before tackling the easier-but-higher-stakes problem.

But Waymo has gotten regulatory permission to expand onto freeways and as far south as Sunnyvale, just outside San Jose. When this service rolls out it will be much more useful than the existing version of Waymo, letting people get rides to the airport and to the major Silicon Valley corporate campuses at reasonable speeds. They are also testing service on Phoenix freeways, which again will make it an actual competitive product that people might want to use for reasons other than curiosity. Meanwhile, I’ve seen humans driving Waymo cars around DC, which is apparently how their buildout works. The point is, driverless cabs are really happening, to a greater extent than most people seem to know.

Autonomous vehicles could be a huge deal

During the Obama-era AV hype cycle, the prospect of driverless cars was mostly presented as a threat. I remember former SEIU president Andy Stern making a big deal about the possibility that automation would eliminate a lot of truck driver jobs and create a disemployment crisis.

One nice thing about having a moderately inflationary full employment economy is that it helps focus political attention on questions of productivity and growth. It would of course be sad for cab drivers to lose their jobs. But the current American economy has low unemployment and a decent number of job openings, which means they’re likely to find new work. And if driverless cars bring down the price of taxi rides, that reduces inflation, which reduces interest rates and that benefits everyone — in particular, it reduces the cost of investment in the future of the American economy.

Taxi rides are not a particularly large swathe of the Consumer Price Index.

But Stern’s truck drivers are a much bigger deal. The price of much of what we buy is at least in part a function of the cost of shipping, often via truck. More cost-effective shipping would be a huge deal for the economy and a big boost to productivity and disinflation. Again, every lost job is sad for the people who lose it, which is again a reason it’s good to have a low-unemployment economy. But it looks even better when you consider the implications of making it cheaper to ship stuff: by knocking out part of the cost of doing business, you’re clearly creating more demand for workers at other stages of the process.

Another area I’m excited for is the humble city bus. Because a bus runs a fixed route across a constrained geography, it’s ideally suited to this technology. Reducing bus operating costs lets you run more — and more frequent — routes, which is a huge win for riders. Transit agencies are currently facing big financial deficits while also struggling to attract drivers at current wages and also facing pressure to add security personnel. So in this case, there’s no big labor market issue at all, it’s just a win.

Most exciting of all, though, driverless cars (at least theoretically) have the potential to revolutionize land use. The current business model problem for Waymo is trying to compete with Uber, Lyft, and traditional taxi companies. But the ultimate promise of self-driving, if the geography becomes expansive enough, is to make hiring a driverless taxi cost-competitive with owning your own car. If it’s cheap enough to take driverless cabs basically everywhere you go rather than owning a car, then you don’t need a parking space. And if a critical mass of people start going the robotaxi route, then your destination doesn’t need a parking space either. That makes all kinds of construction projects cheaper and would be a huge shot in the arm to the overall economy.
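The arithmetic behind that promise is a simple per-mile comparison. A back-of-envelope sketch, where every figure is an assumption for illustration rather than anything from Waymo or this article:

```python
# Hypothetical ownership-vs-robotaxi cost comparison.

ANNUAL_OWNERSHIP_COST = 10_000  # assumed: payments, insurance, fuel,
                                # parking, maintenance ($/year)
ANNUAL_MILES = 10_000           # assumed miles driven per year

per_mile_owning = ANNUAL_OWNERSHIP_COST / ANNUAL_MILES  # $1.00/mile

for fare in (2.50, 1.00, 0.60):  # assumed robotaxi fares, $/mile
    verdict = "cheaper than owning" if fare < per_mile_owning else "still pricier than owning"
    print(f"${fare:.2f}/mile robotaxi: {verdict}")
```

The sketch’s point is that fares have to fall well below today’s ride-hail prices before giving up the owned car, and its parking space, becomes the default choice.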

We are currently a long way from the operating range of Waymo’s product being large enough to even contemplate that. The ability of a human driver to roam across an arbitrary and unfamiliar geography has a lot of value, even if most of us mostly just drive familiar streets near where we live.

Regulatory choices matter

The key thing about this is that anything to do with moving vehicles on public thoroughfares is, correctly, going to be a pretty tightly regulated space.

Right now, one cap on driverless taxi expansion is the need to get stepwise regulatory approval for expansion. But another cap is just the reality that even though human-driven cars kill people every day, anyone operating in the driverless space knows that a single fatality could prompt a massive backlash. Within its existing service areas, human drivers get into crashes 3 to 6 times as often as Waymo vehicles. That’s good news. What would be better would be to hear clearer statements from taxi regulators and state transportation departments that they are going to hold driverless vehicles to some kind of objective and reasonable “safer than humans on average” standard rather than pulling the plug the first time something bad happens.

Human driving is obviously much too useful to ban, but it’s genuinely incredibly deadly — tons of people die every day, through no fault of their own, due to error on the part of other human drivers, and plenty others die as a result of their own error.

When it comes to the safety of driverless cars, letting the perfect be the enemy of the “better than the status quo” is going to be a huge loss for public health and economic productivity. And a lack of clear standards, or even verbal commitments, creates an unduly uncertain landscape for expansion. This map is a helpful guide to the unclear statutory landscape for driverless cars, but keep in mind that land area isn’t really the relevant criterion here. North Dakota is very large, but as a taxi market, it’s tiny compared to New York City.

Meanwhile, there is already some explicit backlash. The governor of Kentucky vetoed a driverless car bill as some kind of Teamsters Union thing (he got overridden). More insidiously, there is an effort afoot in California to establish local control over driverless car regulation. As we know from the housing landscape, the principle of localism is a good way to block stuff without explicitly saying you want to block stuff. An unfortunate aspect of the American labor paradigm is that if specific unionized workplaces lose jobs, that’s bad for the union, even if the technological shift creates jobs and raises wages on average. As a result, union leaders are essentially duty-bound to oppose productivity-enhancing innovation, regardless of the merits.

That said, I think the labor backlash is in important ways a good sign. People mobilize against real threats, and those with something at stake can see that this technology is real.

In my view, the more uncertain regulatory issue relates to land use. Innovations in transportation have historically been a very big deal economically. But the reason they are a big deal is they facilitate changes in land use. When trains were invented, people built dense towns near train stations. When cars were invented, people built sprawling settlements along roadways. If you look at the pre-car world, there were no parking lots and no garages. If you said “well, of course people can use cars, but nobody can build anything that’s out of character with the existing environment,” then cars would not have become very useful. In the opposite direction, the promise of driverless shouldn’t just be that people can watch TV shows while commuting — it should be to allow the nature of communities to transform to use space more efficiently. But for generations now, we’ve been weirdly prescriptive about the intersection of land use and transportation, not only allowing structures that accommodate the personal automobile but requiring them. That leaves AVs stuck in the niche taxi market when they should be truly transformational.

Like Eminem and Elvis Presley, Here's Why Caitlin Clark Is the Newest Great White Hope

Caitlin Clark has generated tons of interest in women’s basketball because of her long-range shooting and deft passing. She’s been likened to Golden State Warriors great Stephen Curry, and that comparison ain’t crazy...that’s how special Clark is.

mareino comments: Good article. I appreciate The Root emphasizing that Clark herself didn't do anything wrong -- it's the market around her that's unfair.