
Washington had a surge of Independent voters. What does that mean?

Here are the last 10 years of Survey USA statewide poll results charted out (background data), focusing only on how respondents identified their partisan affiliation.

[Chart: partisan self-identification in Survey USA Washington State polls over the last decade]

Basically, following the trendlines, both the Republican and Democratic parties have lost market share, and three times since 2006 there have been more self-identified independents than anything else. In the most recent survey, from last fall, independent identification holds a big lead.

It is worth noting that independents have always been strong in Cascadia, but I'm convinced we're seeing something different in this trend.

What could have caused this?

I have a couple of theories, but I'm far from totally convinced by them.

I think the Top Two primary had something to do with this, especially in combination with a 2010 redistricting process that had a lot to do with protecting incumbents and little to do with creating districts competitive between the traditional left and right.

So, since the first Top Two primary in 2008 and the first post-redistricting races in 2012, we're seeing more legislative-level races that aren't competitive between the two major parties. What do members of a minority ideology do when left in the cold without a standard bearer? I think it's possible they drop the partisan standard altogether.

I think there's also something in how we structure party politics around here that encourages not identifying as a partisan. Basically, political parties, meaning the local county and legislative district organizations, aren't forces in the lives of most voters or even most activists.

Campaigns can be built, volunteers recruited, and advertising funded without much help from local party officials. The web has a lot to do with this, but the fact that the basic unit of party structure is an obscure elected official called a precinct committee officer probably doesn't help.

What does this mean?

I think we're already seeing the impacts of what a stable non-partisan-identifying plurality, or even majority, could mean in Washington State. With little buy-in for their actual policies, the Thurston County commission is now made up of conservative independents. There was also an independent elected to the Grays Harbor County commission, in a more conservative but still usually solidly Democratic county.

Also, in Grays Harbor, you saw voters support a Republican for president for the first time since the Democratic party was near its death in Washington State in the 1920s. My guess is that they voted for Trump not because he was running as a Republican, but because he was running as a non-partisan under a partisan label.

What this could mean in the future is two things:

One, maybe Bill Bryant could have won if he'd shed the partisan banner. At 41 percent and growing, the independent population in Washington serves as a much handier base than a shrinking third-place identification. It also seemed to me that Bryant ended up running not really as a conservative, but as a better version of the centrist pro-government governor we already have.

And two, on the local level, even more independents. I hope.

It is one thing for three anti-growth-regulation independents to be elected in a county that voted overwhelmingly for an urban environmentalist for lands commissioner. That (plus the way we voted for the independents across the county) means that enough voters didn't know what policies they were actually supporting and just pulled the lever for the non-partisan.

But what happens when there are two non-labeled candidates in a race? What shortcuts do voters use to make their decision? Or do low-information voters drop out and leave the election to the voters who have their minds made up?

Why I Hope California Goes Ahead With Medicare For All


As part of my attempts to reduce anxiety-loops related to media consumption, when the argument broke out about Obamacare eight years ago I purchased a number of books about healthcare around the world to better understand the global context and options.

I find Americans tend to argue that there’s ‘market’-driven healthcare and ‘socialist’ healthcare. Europe has ‘socialist’ healthcare, which is expensive and supported by high taxes. America has lower taxes and spends more on defense, so it uses ‘market’ healthcare that its citizens pay for.

Often, the argument between left and right Americans comes down to higher taxes for better healthcare versus using the ‘market.’ Many Americans who have healthcare via their jobs are also somewhat uninformed about what American healthcare looks like and how it works. The number of people I’ve talked to who have day jobs and employer healthcare, and who are upset about Obamacare market exchanges being forced on them when they’re not even using them, is somewhat astounding to me.

Talking to Europeans and other folk around the world, I also noticed that people took their healthcare for granted and saw it as invisible, or talked about the downsides. It wasn’t until I would outline how it worked in the US that I got horrified reactions (“I knew it was bad, but fuck me,” was one friend’s response via email).

As far as I can tell, the American system is an amalgamation of a number of different healthcare approaches, all followed somewhat haphazardly. It actually uses elements of both ‘socialized’ healthcare and ‘market’ healthcare. But that duality isn’t altogether right, as far as I can tell.

The book that laid it all out the best is The Healing of America, which I really recommend anyone who opens their mouth about healthcare options read.

Different Types of Healthcare Models

There are basically four approaches to offering healthcare that humanity has tried around the world. Wikipedia summarizes them here:

The Bismarck Model

This is the model followed in Germany; in its rudimentary form it was laid out by Otto von Bismarck. The system uses private initiatives to provide medical services, and insurance coverage is also mainly provided through private companies. However, the insurance companies operate as non-profits and are required to sign up all citizens without any conditions. At the same time, all citizens (barring a rich minority in the case of Germany) are required to sign up for one health insurance plan or another. The government plays a central role in determining payments for various health services, thus keeping decent control over costs.

The Beveridge Model

This model, adopted by Britain, is closest to socialized medicine, according to the author. Here almost all health care providers work as government employees, and the government acts as the single payer for all health services. Patients incur no out-of-pocket costs, but the system is under pressure due to rising costs.

The National Health Insurance Model

The Canadian model has a single-payer system like Britain’s; however, the health care providers work mostly as private entities. The system has done a good job of keeping costs low and providing health care to all. The major drawback of this system comes from the ridiculously long waiting times for several procedures. The author, T.R. Reid, would have had to wait 18 months for his shoulder treatment in Canada.

The Out of Pocket Model

This is the kind of model followed in most poor countries. There is no wide public or private system of health insurance. People mostly pay for the services they receive ‘out of pocket’. However, this leaves many underprivileged people without essential health care. Almost all countries with such a system have a much lower life expectancy and high infant mortality rates. The author gives his experience with the system in India, and a brief description of the ancient medical system of Ayurveda.

So by the writer’s estimation, the USA mixes in bits and pieces from all four of those models.

Healthcare models the US uses simultaneously:

  • The Bismarck Model for people under 65 who are in the workforce. Although not non-profit, as in the cheaper and more successful Bismarck systems, for-profit companies work with employers to get health insurance set up in the US. 64% of the US population, according to the US Census, is covered by this for-profit Bismarck model; the Kaiser Family Foundation claims it’s 49%.
  • The Beveridge Model for veterans, active military personnel, and Native Americans. This is where the government directly hires the doctors and builds the hospitals. This is how the UK creates national health care (and is actually sort of what Americans think socialized healthcare is). 0.5% of the population is active military, 5.2% are veterans, and about 0.5% are Native Americans eligible for that coverage, so up to 6% of the US population is covered by this centralized government healthcare model.
  • The National Health Insurance Model in the US is used for anyone 65 or older and for the poor: this is Medicare and Medicaid. The government acts as the insurer, collecting payments (either through taxes or direct premiums) and negotiating with private hospitals and doctors. According to Kaiser, 14% of the US population is on Medicare, 20% is on Medicaid, and 2% is on other public assistance (like CHIP, which gets children access to healthcare when their parents have none). Canada uses the NHI model; it’s even called ‘Medicare’ there, and it’s basically Medicare for all, even though it’s decried as socialism by the American right wing.
  • The Out of Pocket Model is used in the US for poor folk who have slipped between all those other systems, and it is often advocated for by right-wing folk.

So, 36% of the US uses some form of the NHI model, and 50-60% uses some form of the Bismarck model, but through lightly regulated for-profit systems. Every other place that uses the Bismarck model (all or part of Germany, France, Belgium, the Netherlands, Japan, and Switzerland) doesn’t actually do socialized medicine; they just heavily regulate the companies that provide coverage and demand that they cover all citizens and offer minimum benefits.
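That 36% figure is simply the sum of the NHI-style program shares from the Kaiser numbers above, as a quick check:

\[
14\%\ \text{(Medicare)} + 20\%\ \text{(Medicaid)} + 2\%\ \text{(other public assistance)} = 36\%.
\]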

Canada and the UK, which offer what some might imagine as socialized medicine, do it through two radically different mechanisms: Canada creates a national health insurance program via the government (Medicare), while the UK government directly hires doctors and builds hospitals.

Few of the above, even in Europe, are actually truly socialized medicine, by the way. The UK comes the closest. Socialism is ‘seizing the means of production from private capital.’

What is ‘Single Payer?’

Okay, a number of debates are about ‘single payer’ and socialized healthcare vs ‘market’ healthcare.

Single payer means the government acts as an insurer and collects all the payments, whether via a tax, or via a set payment, and then pays private hospitals or doctors for your treatment. Having a single source means the government can negotiate down costs.

Medicare and Medicaid are single payer. The UK and Canada are single-payer models. Canada is Medicare for all. A third of the US system is single payer; it’s just that most Americans don’t realize this, because it’s a wonky term. Many people hear ‘single payer’ and they don’t think ‘Medicare’; they think ‘Canada’ or ‘Europe,’ even though Europe has a mix of systems.

Who likes their healthcare the most?

Funnily enough, UK patients tend to self-report as liking their healthcare the best:

[Chart: patient-reported satisfaction with healthcare by country; the UK first, Switzerland second]

But that doesn’t mean the more socialized the healthcare, the happier the people. Switzerland has a fairly lean Bismarck model that the US would recognize, and it is second on that chart up there. The difference is that they regulate the ever-loving hell out of it and require (mandate) that everyone buy coverage, something the US keeps shying away from.

Who lives the longest?

People in Japan live the longest. Switzerland is next, followed by Singapore, then Australia, Spain, Iceland, Italy, Israel, Sweden, and France, with the Republic of Korea rounding out the top 10.

Now, whenever I post that, someone links me to a piece about how much more public transportation those countries have, or how much better their diets are. Sure, it’s not healthcare alone. But healthcare is the single largest influence on a civilization’s life expectancy. The fact that the USA is #31 on the life expectancy list and dropping (one of the few developed nations where life expectancy is actually reversing in some areas) demonstrates the power of healthcare over quality and longevity of life.

But can America afford healthcare?

Often I hear an argument that goes “well, the US spends so much on defense we’d have to give up other things to have the government create socialized medicine, socialized medicine is too expensive.”

Well, an argument against the complicated amalgam of systems the US currently has isn’t an argument for socialized healthcare, and in any case, no other system is more expensive than the US system.

Here’s what countries spend, both in taxes via government and via private systems, visualized on a graph:

[Graph: health spending by country, public and private]

You can see that in government spending alone, the US spends as much as Switzerland, the Netherlands, Sweden, Ireland, Austria, Denmark, and Belgium, and more than the UK. So we don’t have to spend any more than we’re already spending; we just need to change what we’re doing.

Also, all of those systems get dramatically better results for longevity and patient-reported happiness.

Whoa, why is American healthcare so expensive?

There are a lot of reasons. A big one is that America is one of the few countries that assumes health insurance companies should be big, profitable businesses. Most countries look at healthcare as a service: fire departments, police, and teachers aren’t big for-profit businesses, but services for the community, and those countries work backward from that assumption. America’s education system also puts a huge burden on medical professionals, who take on a lot of debt and then charge more. The US also has a legal system that allows big lawsuits, which means doctors take out expensive malpractice insurance.

There are many other pain points as well, but another huge one is this:

The entire US system is actually socialized, and it was socialized by President Ronald Reagan in the 1980s with something called EMTALA. I have a long post about that here.

Short version: the US used to require payment or proof of insurance before you went into the ER. Reagan changed that to legally force ERs to take care of anyone who came in. Thus, the moral contract America legalized was that all people should be taken care of.

What Reagan never did was decide how we would pay for it. We’ve been arguing ever since. But hospitals are still admitting people. And since many Americans don’t have insurance for preventative care, they use the ER as their doctor. ERs pass this cost on to any American who has insurance by fiddling with billing to make sure the hospital as a whole makes a profit.

I thus sometimes make the argument that American health insurance is (to borrow some right-wing framing about healthcare) a ‘socialist’ unfunded mandate.

So what do I think we should do?

Funny you should ask.

This is of interest to me:

One of my friends who is a nurse retweeted this, and it caught my attention because of the history of how Canada came to adopt the NHI model. In 1947, the Canadian province of Saskatchewan rolled out an act that guaranteed free hospital care, thanks to one Tommy Douglas. They couldn’t quite do universal health care, the original vision, due to funds at the time. Alberta came next, with medical coverage for 90% of its population. In 1957, Canada’s federal government created a 50% cost-sharing plan, and by 1961 all the provinces were using that plan to create universal programs. In 1966 it was expanded further.

That hints to me that all we need is one big state to do something similar in the US. Vermont looked into it after Obamacare was passed, since that law has a provision allowing a state to take its federal health funds and pool them all into one giant pot if it is creating a universal healthcare system. That’s basically the Canada path.

I also think using Medicare as the vehicle is smart.

Medicare has a great brand. In the US, 75% of its users report satisfaction, making it one of the more well-liked American institutions.

Further, using the existing Medicare program as the vehicle for growth would bring down costs for its older users by bringing younger, healthier people into the Medicare pool.

Lastly, Medicare, even though it covers older folks who are higher-risk by default, is pretty damn cheap in comparison to workforce insurance and self-employment health insurance. Part A (basic emergency care and hospitalizations) is free, Part B (doctors and preventative care) is about $150/month, and Part D, for drugs, is about $50. I’d jump on that.

And none of this means employers have to stop offering great healthcare plans to sweeten employment deals. In the UK, and all throughout Europe, people who make extra money bolt private health insurance plans on top of the public options so that they can get the care they want in the style they want. Medicare has a Part C, which is where you can add on a more Cadillac private insurance setup.

But having the option so you can get out of a shitty employer healthcare plan, or move around, be portable? That sounds great.

One Canadian province setting it up got other provinces to look over and say ‘hmmm,’ and the idea spread. If California got rolling, it wouldn’t be long before Washington and Oregon joined up and the entire west coast was covered. They’d draw a lot of small business over there.

I’ll be rooting for California.


The Wrong Healthcare Issue


Right now, the House Republicans are fighting to get enough votes to pass their bill to repeal and replace the Affordable Care Act, aka “Obamacare.” The Democrats are staunchly opposed. Both sides are arguing over the affordability of healthcare and access to healthcare insurance.

As far as I can see, they’re both circling the wrong tree, chasing each other’s tails. Insurance is only a symptom of the greater problem, and trying to deal with symptoms is not only expensive but also postpones dealing with the real problem, which continues to worsen. That problem? Healthcare costs. People need insurance because healthcare costs in the U.S. are effectively the highest in the world, and the vast majority of Americans don’t get as good healthcare as people in nations spending far less on it.

In 2015, U.S. health care costs were $3.2 trillion, making healthcare one of the largest U.S. industries, at nearly eighteen percent of Gross Domestic Product. Fifty-five years ago, healthcare comprised only five percent of GDP.
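Those two figures are mutually consistent; working backward from them,

\[
\$3.2\ \text{trillion} \div 0.18 \approx \$17.8\ \text{trillion},
\]

which is indeed roughly U.S. GDP in 2015 (about $18 trillion).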

Part of the reason for the cost increase is emergency room treatment, the most expensive single aspect of current healthcare, making up one-third of all health care costs in America. And a significant proportion of emergency room care occurs because people can’t get or afford other treatment for various reasons.

Another component of rising costs is the continuing increase in the costs of drugs and medical devices. According to Forbes, the healthcare technology industry was the most profitable U.S. industry sector of all in 2015, notching an average profit margin of 21%, with the most profitable company of all being Gilead Sciences, at a 53% profit margin. And no wonder, given that the list price for the top-20-selling drugs in the U.S. averages more than twice as much as for the same drugs in the E.U. or Canada.

While the pharmaceutical industry pleads high research and development costs, a GlobalData study showed that the ten largest pharmaceutical companies in the world in 2013 spent a total of $86.9 billion on sales and marketing, as opposed to $35.5 billion on research and development, almost two and a half times as much on marketing as R&D. Those ten companies had an average profit margin of 19.4%, ranging individually from 10% to 43%, with half making 20% or more. And since Medicare is prohibited by law from negotiating drug prices for its 55 million beneficiaries, the program must pay whatever price drug makers set.
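The stated ratio checks out:

\[
\$86.9\ \text{billion} \div \$35.5\ \text{billion} \approx 2.45.
\]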

The U.S. medical technology market exceeds $150 billion a year in sales, and in 2015 the gross profit margin for the medical equipment and supplies industry averaged 12.1%, according to data from CSImarket.com.

Studies of doctors’ compensation show that, over the past twenty years, physician compensation in general has increased far less than all other components of healthcare. In fact, annual earnings actually declined for the typical physician between 2000 and 2010, while annual earnings for physician assistants and pharmacists have increased at a greater rate. More to the point, as a percentage of total national healthcare costs, U.S. physician wages are small, approximately 9%, a number among the lowest in the developed world.

Hospitals’ costs have increased significantly, but not because they’re making money. A Health Affairs study analyzed the income and costs of more than 3,000 hospitals nationwide and found that fifty-five percent of hospitals lost money on each patient they served in 2013. This raises the question of whether non-profit hospitals are paying more and more, possibly too much, for the high-priced administrators apparently required by the bureaucratic and legal maze generated by the interweaving of private and public medical systems, government regulations, and insurance company requirements. Studies indicate that administrative costs make up twenty to thirty percent of the United States health care bill, far higher than in any other country. American insurers, meanwhile, spent $606 per person on administrative costs, more than twice as much as in any other developed country and more than three times as much as in many, according to a study by the Commonwealth Fund.

Then add to that the skyrocketing costs of malpractice insurance and the often excessive court judgments in medical tort claims cases. While the amount is subject to dispute, it’s not inconsiderable and also adds to costs.

Unfortunately, neither the Affordable Care Act nor any proposed Republican replacement will do anything to deal with what I’ve mentioned, and what I’ve mentioned are only the most obvious causes of ever-increasing health care costs.


All Together Now


This is how to stop demagogues and extremists: rebuild community.

By George Monbiot, published in the Guardian 8th February 2017

Without community, politics is dead. But communities have been scattered like dust in the wind. At work, at home, both practically and imaginatively, we are atomised.

Politics, as a result, is experienced by many people as an external force, dull and irrelevant at best, oppressive and frightening at worst. It is handed down from above rather than developed from below. There are exceptions – the Sanders and Corbyn campaigns for example – but even they seemed shallowly rooted by comparison to the deep foundations of solidarity that movements grew from in the past, and may disperse as quickly as they gather.

It is in the powder of shattered communities that anti-politics swirls, raising towering dust devils of demagoguery and extremism. These tornadoes threaten to tear down whatever social structures still stand.

When people are atomised and afraid, they feel driven to defend their own interests against other people’s. In other words, they are pushed away from intrinsic values such as empathy, connectedness and kindness, and towards extrinsic values such as power, fame and status. The problem created by the politics of extreme individualism is self-perpetuating.

Conversely, a political model based only on state provision can leave people dependent, isolated and highly vulnerable to cuts. The welfare state remains essential: it has relieved levels of want and squalor that many people now find hard to imagine. But it can also, inadvertently, erode community, sorting people into silos to deliver isolated services, weakening their ties to society.

This is the third in my occasional series on possible solutions to the many crises we face. It explores the ways in which we could restore political life by restoring community life. This doesn’t mean ditching state provision, but complementing it with something that belongs neither to government nor to the market, but exists in a different sphere, a sphere we have neglected.

There are hundreds of colourful examples of how this might begin, such as community shops, development trusts, food assemblies, community choirs, free universities, time banking, Transition Towns, potluck lunch clubs, local currencies, men’s sheds (in which older men swap skills and make new friends), turning streets into temporary playgrounds (like the Playing Out project), secular services (such as Sunday Assembly), lantern festivals, fun palaces and technology hubs.

Turning such initiatives into a wider social revival means creating what practitioners call “thick networks”: projects that proliferate, spawning further ventures and ideas that weren’t envisaged when they started. They then begin to develop a dense participatory culture that becomes attractive and relevant to everyone, rather than mostly to socially active people with time on their hands.

A study commissioned by the London borough of Lambeth sought to identify how these thick networks are most likely to develop. The process typically begins with projects that are “lean and live”: they start with very little money, and evolve rapidly through trial and error. They are developed not by community heroes working alone, but by collaborations between local people. These projects create opportunities for “micro-participation”: people can dip in and out of them without much commitment.

When enough of such projects have been launched, they catalyse a deeper involvement, generating community businesses, co-operatives and hybrid ventures, which start employing people and generating income. A tipping point is reached when 10 to 15% of local residents are engaging regularly. Community then begins to gel, triggering an explosion of social enterprise and new activities that starts to draw in the rest of the population. The mutual aid these communities develop functions as a second social safety net.

The process, the study reckons, takes about three years. The result is communities that are vibrant and attractive to live in, that generate employment, that are environmentally sustainable and socially cohesive, in which large numbers of people are involved in decision-making. Which sounds to me like where we need to be.

The exemplary case is Rotterdam, where, in response to the closure of local libraries, in 2011 a group of residents created a reading room out of an old Turkish bathhouse. The project began with a festival of plays, films and discussions, then became permanently embedded. It became a meeting place where people could talk, read and learn new skills, and soon began, with some help from the council, to spawn restaurants, workshops, care cooperatives, green projects, cultural hubs and craft collectives.

These projects inspired other people to start their own. One estimate suggests that there are now 1300 civic projects in the city. Deep cooperation and community building now feels entirely normal there. Both citizens and local government appear to have been transformed.

There are plenty of other schemes with this potential. Walthamstow, in east London, could be on the cusp of a similar transformation, as community cafes, cooking projects, workshops and traffic calming schemes begin to proliferate into a new civic commons. Incredible Edible, which began as a guerrilla planting scheme in Todmorden, in West Yorkshire, growing fruit and vegetables in public spaces and unused corners, has branched into so many projects that it is widely credited with turning the fortunes of the town around, generating start-ups, jobs and training programmes. A scheme to clean up vacant lots in the Spanish city of Zaragoza soon began creating parks, playgrounds, bowling greens, basketball courts and allotments, generating 110 jobs in 13 months.

The revitalisation of community is not a substitute for the state, but it does reduce its costs. The Lambeth study estimates that supporting a thick participatory culture costs around £400,000 for 50,000 residents: roughly 0.1% of local public spending. It is likely to pay for itself many times over, by reducing the need for mental health provision and social care and suppressing crime rates, recidivism, alcohol and drug dependency.

Participatory culture stimulates participatory politics. In fact, it is participatory politics. It creates social solidarity while proposing and implementing a vision of a better world. It generates hope where hope seemed absent. It allows people to take back control.

Most importantly, it can appeal to anyone, whatever their prior affiliations might be. It begins to generate a kinder public life, built on intrinsic values. By rebuilding society from the bottom up, it will eventually force parties and governments to fall into line with what people want. We can do this. And we don’t need anyone’s permission to begin.

www.monbiot.com



Political Appeal and Innumeracy


U.S. federal spending in 2016 was roughly $4 trillion, and revenues were slightly over $3.4 trillion, leaving a deficit of around $600 billion. Of total spending, $2.6 trillion was mandatory spending on programs such as Social Security, Medicare, and Medicaid. Spending on these programs cannot be cut without major changes in federal law, and since 77% of all Americans oppose such cuts, it’s highly unlikely that major cuts will occur any time soon. Then add to that some $260 billion in mandatory interest payments on the federal debt, and essentially 72% of federal spending cannot be effectively cut, at least at present. That leaves $1.1 trillion in discretionary spending, that is, spending that can be increased or decreased by Congress.
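As a quick check of that arithmetic, using the rounded figures above:

\[
\frac{\$2.6\,\text{T (mandatory)} + \$0.26\,\text{T (interest)}}{\$4.0\,\text{T (total spending)}} \approx 0.715 \approx 72\%,
\qquad
\$4.0\,\text{T} - \$2.86\,\text{T} \approx \$1.1\,\text{T of discretionary spending.}
\]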

Unhappily, the vast majority of Americans have no real understanding of even these basic numbers, especially Fox News viewers, 49% of whom declared in a recent poll that cutting “waste and fraud” would eliminate “the national debt” [which now stands at $14.4 trillion]. A number of polls over the years have shown that most Americans believe that 25% of the federal budget goes to foreign aid [it’s less than one percent], and that five percent of all federal spending goes to PBS and NPR [in fact, roughly a tenth of one percent does].

The real numbers are more daunting. The largest component of discretionary spending is defense, and while the DOD’s “official” budget is slightly under $600 billion, various contingency funds and defense activities funded in other forms and by other agencies [for example, the Coast Guard is funded outside the DOD budget] bring the total annual cost of U.S. defense much higher, as high as $900 billion according to some sources. But even assuming $600 billion for defense, that leaves $500 billion for everything else, including agriculture, energy, education, transportation, federal lands management, national parks, environmental protection, veterans benefits, welfare payments, and a whole lot more.

Trump’s proposed tax cut would reduce federal revenues by $500 billion, according to the Tax Foundation, on top of that $600 billion deficit. So even if he could persuade Congress to cut non-defense discretionary spending by 50% (in essence gutting most federal agencies), the deficit would increase to nearly $900 billion. And that doesn’t count the additional infrastructure spending he’s proposed, which initial estimates suggest would range from $500 billion to over a trillion dollars over ten years, or $50 billion to $100 billion a year.
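Following the paragraph’s own figures, with the 50% cut applied to the roughly $500 billion of non-defense discretionary spending identified above:

\[
\underbrace{\$600\,\text{B}}_{\text{current deficit}} + \underbrace{\$500\,\text{B}}_{\text{tax cut}} - \underbrace{\$250\,\text{B}}_{50\%\ \text{spending cut}} = \$850\,\text{B} \approx \text{nearly } \$900\,\text{B}.
\]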

Proponents of the Trump plans claim that all the new investment and jobs will increase tax revenues, and some probably will, but not anywhere close to enough to deal with the federal deficit that increases the national debt – and the interest that must be paid on it – each year.

Based on a 2014 study by Standard & Poor’s, if Congress were to pass a $50 billion a year infrastructure bill, that legislation would create an additional 1.1 million jobs. Construction workers make an average of around $35,000 a year, and, under the best estimate of the Trump tax plan, those 1.1 million workers would pay around $4,000 each in federal income taxes, adding up to roughly $4.4 billion. Economists like to point to the multiplier effect, i.e., how many additional jobs are created by one new job. According to the IMF, under present conditions the multiplier effect is hovering around one: one additional job created somewhere in the economy for each new job created by investment. So… fifty billion dollars of infrastructure investment might create somewhere over two million jobs and possibly add $9 billion or so in tax revenues while costing $50 billion. Even if the multiplier effect is five times as much as the IMF says, the infrastructure proposal is at best a break-even proposition, and, as such, might be a good idea. BUT… it won’t do much to reduce the current deficit, let alone the increase in the deficit that will occur as a result of more federal spending on defense and the likely coming increase in interest rates.
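Sketching that arithmetic under the stated assumptions (roughly $4,000 in federal income tax per new worker, and a jobs multiplier near one):

\[
1.1\,\text{M jobs} \times \$4{,}000 \approx \$4.4\,\text{B}; \qquad
2 \times 1.1\,\text{M jobs} \times \$4{,}000 \approx \$8.8\,\text{B} \ll \$50\,\text{B per year.}
\]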

The other bottleneck in increasing jobs is the mismatch between available workers and the available jobs. According to research from human resources consultancy Randstad Sourceright, a survey of more than 400 U.S. executives found a skills gap impacting their businesses. Four-fifths of those executives said that a shortage of sufficiently skilled workers will affect their companies in the next 12 months. Complaints of hard-to-fill factory jobs are backed up by Bureau of Labor Statistics data: 324,000 manufacturing spots were open in November, up from 238,000 a year earlier.

Another problem the Trump approach doesn’t address is that job creation isn’t equal. Right now, employees of high-tech companies receive almost 12% of all employee compensation, but there are only seven million of them, and the average salary is close to $105,000, more than double the salary of the average industrial or manufacturing employee, or triple that of a construction worker. In addition, the tech industries are only adding about 200,000 employees a year. That doesn’t do much for the nearly 15 million unemployed or underemployed Americans, or the roughly three million college graduates each year. The largest numbers of jobs are in the lower-paid service industries, yet all the investment money putatively freed up by the tax cuts will be going to tech-heavy companies, whose jobs comprise less than 5% of total U.S. employment.

Massive tax cuts, more defense spending, a major infrastructure initiative… all to be paid for by new jobs and cuts in such federal programs as PBS, NPR, the Endowments for the Arts and Humanities, foreign aid, and the like? The numbers don’t add up, even if the political appeal does, perhaps because most Americans don’t seem to understand the numbers, or care to.


WSTC Seeking Input Through 1/18

Here's an opportunity to have your voice heard:
The Washington State Transportation Commission would like to get your input on how our transportation system is working and ways to improve it. The survey is open until January 18. It includes questions about priorities for funding, like bigger highways vs. bike/ped infrastructure, and takes just a few minutes to complete.