When you play a game you want to know the rules. You don’t, for instance, play American football by the rules of European football – otherwise known here as “soccer” – just because “Football’s football.” You could get hurt playing without a helmet, after all. And it’s pretty much the same in politics – you don’t just say “Politics is politics” or “A party’s a party” and then go out and play American politics by European rules either. You – or your cause – could get hurt doing that too.
As the 2012 presidential campaign warms up, increased calls for another shot at a “third party” presidential candidacy are inevitable. After all, the party holding the White House has switched and yet America’s disparity of wealth and income appears to grow unchecked; military spending continues at a pace nearly matching that of the entire rest of the planet put together; and the pointless and increasingly obviously unwinnable war in Afghanistan that began under George Bush will pass the ten-year mark bigger than ever – with Barack Obama at the helm.
Why not then just start afresh with a new party, like people in other countries do when they don’t like the parties they’ve got? Well, the simple answer is that parties can function quite differently in different situations. And we can’t consider an approach toward the current situation truly political – as opposed to philosophical – unless it takes the measure of the system in which it operates. So, while a third party may intuitively seem to be the “really radical” way to go, if it doesn’t work well in our system, it’s not. Outrage, however justified, is never a substitute for strategy.
Were we in Germany, for example, we’d be dealing with political parties with very different characteristics, operating under very different rules. So, when some on the German left found the politics of the Social Democratic Party disappointing, inadequate, or maybe not even left wing at all, they started a new party; first the Green Party and more recently the Left Party. These moves were quite logical within a system that allows parties to combine their respective parliamentary delegations to form a coalition government when no one of them has a majority – as is usually the case. A new party might realistically hope, then, to first become a junior coalition partner – and have some of its program adopted – and later even become the larger party. All of this can be done without great worry that a vote for the new party might inadvertently facilitate the worst possible outcome, namely a Chancellor from the party whose policies the new party’s voters favor the least (in this case, probably the Christian Democrats).
An American presidential election unfortunately offers no such assurance. There are no provisions for coalition governments. The White House goes to the winner of the vote of the Electoral College, the makeup of which is determined by pluralities of popular votes in the various states. Come in first in the state and get all of its electoral votes, even if you don’t have a majority. (Maine and Nebraska distribute their electoral votes on a congressional-district rather than statewide basis.) All of which means that in the U.S. a “third party” vote can unintentionally facilitate the election of a President from the least-liked party – probably the Republicans in the case of a “third party” of the left. Where German (or French or Italian) “third party” voters have reasonable assurance that their vote will actually increase the prospect of blocking the least desired electoral outcome, American “third party” voters do not. Ignore that fact and you might as well be playing American football without a helmet.
Are there circumstances that might outweigh these considerations? Well, there could be. The most common argument for not worrying too much about whether “third party” efforts might result in a Republican president is that there’s no essential difference between the Republicans and the Democrats. Let’s look.
So far as domestic politics go, the stark profile that Republicans are currently presenting in the U.S. House of Representatives and various state capitals, most famously Madison, Wisconsin, would seem to make this argument a fairly difficult one to press at the moment. When it comes to labor rights, for instance, Democrats may disappoint, but Republicans destroy – not a trivial distinction. While Democrats may fail to press forward aggressively on women’s rights, Republicans defund Planned Parenthood. And so on.
Since my goal is analysis rather than rhetoric, I don’t want to ignore the fact that Massachusetts’ Democratic-controlled House of Representatives has since attempted to match the anti-union efforts of their Republican counterparts in Wisconsin. There’s no question but that the Democrats can make it very hard to defend them. But no matter how many times we’re moved to say, “They’re almost as bad as the Republicans,” the “almost” does matter.
And then there is the matter of the day-to-day consequences of appointments to bodies such as the Supreme Court and the National Labor Relations Board, an area where there may be the broadest agreement that there is a real difference between the effects of electing one of the “major” parties or the other.
On the foreign policy side, the argument for the rough equivalence of the parties can be a lot stronger, though. For instance, the recent U.S. veto of a U.N. Security Council resolution that declared Israeli West Bank settlements illegal came as no big surprise, since Democratic and Republican administrations have both done such things for decades. And not only is Obama’s pursuit of the Afghanistan War more aggressive than Bush’s – as promised – but he has also authorized American bombing in Pakistan, Yemen and Libya, the latter serving as a reminder of the Democrats’ embrace of “humanitarian” military intervention during the Clinton Administration.
And yet, there has been a difference – certainly on the congressional level, anyhow. The invasion and occupation of Iraq, which stands out as the premier atrocity even in a decade of unceasing American military action, was initiated by a Republican president and opposed by most Democrats in the House of Representatives. And even when it’s been Obama initiating military action in Libya, it’s been Democrats who have been the most vigorous in calling him on his failure to consult Congress.
It also seems hard to argue that any Republican likely to replace Obama wouldn’t be even worse on foreign policy. For instance, while the Obama Administration’s pursuit of Wikileaks founder Julian Assange and its treatment of alleged leaker Bradley Manning certainly give us nothing to cheer about, consider the stance of presumed Republican presidential contender Mike Huckabee: “Whoever in our government leaked that information is guilty of treason, and I think anything less than execution is too kind a penalty.”
Candidate Newt Gingrich, who was for the Libya bombing before he was against it, believes “We certainly have to be prepared to use military force” to oust the government of Iran and in years past has called for legislation “that recognizes that we are entering World War III and serves notice that the United States will use all its resources to defeat our enemies – not accommodate, understand, or negotiate with them, but defeat them.”
Speaking of the possible development of a nuclear program in Iran, whose government he calls an “unalloyed evil,” former Massachusetts Governor Mitt Romney laments that “Unfortunately, for reasons that are unfathomable to me, our government has signaled that the military option is effectively off the table.”
Former Minnesota Governor Tim Pawlenty tells the President to “Stop apologizing for our country,” as “we undermine Israel, the U.K., Poland, the Czech Republic and Colombia, among other friends. Meanwhile, we appease Iran, Russia and adversaries in the Middle East, including Hamas and the Muslim Brotherhood.”
Minnesota Congresswoman Michele Bachmann believes that if “we reject Israel, then there is a curse that comes into play.”
And, of course, Sarah Palin’s views are well known.
In short, so far as foreign policy goes, while it might not be such an open and shut case as domestic policy, if you think it’s bad now … (There is one Republican presidential candidate who does differ from all of the above, however – Texas Representative Ron Paul. But Paul will not receive the nomination, in no small part because his sane views on foreign policy are so far out of tune with the bulk of his party. It will also constitute a tremendous failing on the part of antiwar forces within the Democratic Party if Paul and former New Mexico governor Gary Johnson should be the only candidates in either major party calling for an immediate end to the Afghanistan and Iraq wars and occupations.)
Still, some may argue that even if the Republicans are worse than the Democrats, the Democratic Party is nevertheless a corporate-dominated entity that is an unworthy and/or unworkable vehicle for social change. While not wishing to discourage anyone from hurling righteous brickbats at the party’s current leadership in Congress and the White House, I think arguing that the “essence” of the Democratic Party somehow precludes our useful participation in it also fails to take into account the actual structure of American political parties.
Where parties in many other countries are “disciplined,” in the sense that their elected representatives are expected to vote that party’s position, American parties famously are not. (The best source on this may well be the humorist Will Rogers, whose remarks on the topic included, “I’m not a member of any organized political party, I’m a Democrat!”) Apart from voting for the party’s candidate for Speaker or Majority Leader, it’s largely understood that American legislators will not be bound by any party strictures. Representatives like Dennis Kucinich or Barbara Lee may vote “off” from the majority of their party colleagues time and time again, yet they are in no way prevented from doing so. In a sense, the members of the House and Senate, dependent on their own fundraising devices as they largely are, could be seen as constituting 535 independent parties.
Likewise, presidents routinely govern without consulting the wishes of their party. Does anyone really think there is a Democratic Party structure telling Obama what to do? Or that Republican Party bosses directed Bush?
THINGS NEED TO GET WORSE?
And then you may also hear the argument that things need to get worse before they get better. So even if a third party candidacy did facilitate the election of a Republican who was the greater of two evils, it might have the effect of waking people up to what’s really going on. For instance, didn’t Wisconsin and the American labor movement come to life after Scott Walker was elected governor? Unfortunately, the most infamous formulation of this notion comes from Weimar-era Germany: “Nach Hitler kommen wir” (After Hitler, us) – in other words, some on the German left thought that once people saw how bad the right wing really was, they’d turn to them. You know how that worked out. And while nothing so dramatic may happen here, it seems that if there were anything much to that theory, you’d figure people would be pretty wide awake by now after eight years of George Bush.
I’m no doubt giving short shrift to a range of other arguments here, but one additional argument that does come to mind is from people who say they just can’t bring themselves to vote for a Democrat because they would feel tainted by the very act. And ultimately you can’t argue with an individual’s feelings on that score – but that’s a personal statement, not a political act.
Of course, there are those who simply find the notion of making big change within the Democratic Party a dreary prospect – a high school classmate responded to my argument for challenging Obama in the primaries by citing Einstein’s definition of insanity as doing the same thing over and over again and thinking you’re going to get a different result. A fair enough assessment of recent left wing Democratic Party primary efforts, I’ll concede. Unfortunately, it’s a spot-on critique of recent left wing third party campaigns as well.
This is not the place to rehash all of the elements that led up to the Supreme Court decision declaring George Bush the winner of the 2000 election, but it seems undeniable that the perceived effect of Ralph Nader’s candidacy upon the outcome caused many potential supporters to simply apply the Einstein dictum and pay little attention to his subsequent efforts – or those of any third party candidate.
The context of Nader’s 2000 candidacy may be worth recalling, though. The Democratic primaries that year produced the most soporific race to occur in a year absent a sitting Democratic president in a very long time: Al Gore against Bill Bradley. Anyone out there remember what they disagreed on? As a result, Nader’s effort produced enough buzz to prompt a bit of serious consideration of how one might utilize the Electoral College system for a kind of “tactical voting,” a concept unfamiliar here, but fairly well known in the United Kingdom.
Although quite dissimilar overall, the British and American electoral systems do share the feature of not directly electing the head of state, but instead choosing those who do elect that person – Members of Parliament in the U.K. and Presidential Electors in the U.S. – and doing so by a simple plurality in each district. In the latter years of the last Conservative government, the fact that their votes had no impact outside of their own district led some U.K. voters to act very differently than they would if their votes were totaled nationally. Aided by the availability of reliable polling information, Labourites and Liberal Democrats frequently voted for whichever of the two parties appeared to have the better chance of defeating the Tory in their particular district. (The recent Conservative-Liberal Democrat coalition has shelved such tactics for the time being, of course.)
Likewise, in the U.S., some proposed that if you liked Nader and you lived in New York, where Gore was sure to win, or Idaho, where Bush was sure to win, you should just go ahead and vote for Nader. But if you lived in a state where the outcome was not so clear – like, say, Florida – you should vote for Gore because he would be better than Bush, even if he was far less than ideal. Websites for negotiating interstate Gore–Nader vote swapping even sprang up before the government shut them down – on grounds that would later fail to pass muster in federal court. But talk of utilizing the Electoral College system for progressive ends pretty much came to a halt when the 2000 Nader vote exceeded Bush’s margin of victory in Florida and New Hampshire, and it hasn’t been revived since.
All of this is not fundamentally an argument against either Ralph Nader or “third parties” in general. So far as Nader goes, the only thing that really bothered me about his most recent candidacy is that his announcement provided an opportunity for people who I don’t think could carry his briefcase to denounce him for ruining their lives’ work.
So far as “third parties” go, there have been some obvious notable successes on the local level, particularly in non-partisan elections. In San Francisco, for instance, over the past decade, two Greens have won seats on the city’s Board of Supervisors, two on the School Board and one on the Community College Board, while Green Party member Matt Gonzalez came within five points of defeating Democrat Gavin Newsom for mayor. (Two of the city’s chartered Democratic Clubs even endorsed Gonzalez, prompting an unsuccessful drive to de-charter them that ultimately established the right of the Clubs to endorse freely in nonpartisan elections. Four of the five successful Greens, by the way, have since left the Party; three to become Democrats.)
And then there is the wholly remarkable case of Bernie Sanders, who has won election to the United States Senate as an independent, in the process achieving sufficient stature that it would be a Democratic opponent rather than Sanders who would be deemed the spoiler should a three way race result in the seat going to a Republican.
Significantly, however, since the time Sanders reached Congress he has never embraced a “third party” presidential campaign, standing back from the Nader candidacy even in 2000, when in the early stages it looked to have the potential to exceed ten percent of the popular vote and really put the Greens on the map.
In the end, the 2000 Nader campaign actually played out quite similarly to Henry Wallace’s 1948 Progressive Party candidacy. Former Vice President Wallace, who would have become president following the death of Franklin Delano Roosevelt, had the 1944 convention not pushed him off the Democratic ticket in favor of Harry Truman, was likewise early on expected to garner at least ten percent of the vote in a four way race with the now-incumbent Truman, New York Republican Governor Tom Dewey and South Carolina Governor Strom Thurmond. The same dynamic as would develop fifty-two years later came into play, though, and fear of electing Dewey overrode lack of enthusiasm for Truman. Wallace’s vote sank to under three percent, just as Nader’s would.
In retrospect, if the 2000 election were considered a test for the American people on the use of the Electoral College, you’d have to say we flunked it. Hence, the growing popularity of the probably ultimately more desirable strategy of ditching the eighteenth century “College” entirely (which would, though, only intensify the danger of “third party” votes producing undesirable outcomes.)
Obviously nothing lasts forever and the current structure of the American political system won’t either. Still, it took a civil war to effect the last major alteration in the political landscape – the rise of the Republican Party. Likewise, we probably won’t see the next realignment until a significant portion of one party’s members – elected officials included – are ready to jump ship en masse – a possibility that does not seem to be on the immediate horizon.
However, a serious backlash among President Obama’s true believers does seem unavoidable, particularly among those who voted for him because of who they wanted him to be rather than who he was. “Third parties” can be particularly appealing to the relatively newly radicalized, who often want to put as much distance as possible between themselves and what they have just rejected. For one thing, a bold new party venture sure can seem a lot more glamorous than slogging through Democratic Party primaries.
In the end though, all of us, old or new, have to ask ourselves the same questions about the effectiveness of our political choices. And knowing the rules of the political system is ultimately a lot more important than knowing the rules of a game because so much more is at stake.
The media continues to produce bombastic reports regarding health care reform, full of scare tactics from both sides. One of the buzzwords used to scare people is the word – RATIONING. This word has been used to describe negative aspects of the UK’s National Health Service (NHS) – or of the proposed “public” system in the US – and it implies the perils of such a system, as if a health care czar sits in the coliseum giving a thumbs up or a thumbs down on medical procedures based on what side of the bed he got out of. I have a one-word retort which the British might use to refute such allegations: BOLLOCKS.
You might call it rationing, but I call it common sense and logical decision making based on costs and benefits for health care procedures. And I have a perfect example of a person who has experienced rationing in the UK and lived to tell about it – Me.
My Personal Health Care Story
As I have noted in my four previous articles at Demockracy.com, my view on this subject is based on my direct experience with the health systems in both the US and Europe. Unlike most pundits on this subject, I have worked and lived in the UK and the US and experienced both health care systems first hand, as an employee, as an executive, as a corporate board member, as an owner, and, most importantly, as a patient. Four years ago, I was diagnosed with cancer while living in the UK and received treatment and advice from both the US and UK systems. The differences in the way these two systems treated my disease were telling – and at the end of the day the UK’s “rationing” system appeared to do everything the US plan offered at a fraction of the cost. Let me explain in detail.
In 2005, I was diagnosed with tonsil cancer while living in the UK. After my initial diagnosis and shock, I sat down with my UK doctors and discussed the treatment plan offered by the NHS. The proposed plan essentially consisted of the following:
1. A surgical procedure to cut the tumor from my tonsil area, plus a major neck dissection that would remove the tissue surrounding the tumor along with lymph nodes from my neck and shoulder area.
2. Four weeks of rest to recuperate from the surgery.
3. Then, I would undergo seven weeks of focused radiotherapy to further eradicate pesky cancer cells in my neck and throat region.
4. And finally, in concert with the radiotherapy, I would receive several doses of chemotherapy over the same seven-week period to further blast away any cancer cells that had invaded my body.
All of that seemed perfectly reasonable to me; but to make doubly sure I was getting the correct treatment, I flew to the US to get a second opinion from a highly respected leading specialty hospital in New York. I was somewhat intrigued that this second opinion on the diagnosis and treatment of the cancer was essentially the same as the NHS program. Intrigued, because I was still under the impression that US health care was better – or at least different – and that I would be offered another option, perhaps more expensive, but perhaps with better outcomes. But, no, the diagnosis and treatment plan were almost identical.
I followed the treatment plan in the UK – surgery, chemo, radio, all provided free of charge by the NHS. I have dual citizenship in the UK and the US and had lived in the UK for six years, so this was all perfectly within the rules of the NHS. And after 12 weeks of treatment, I was weakened and tired but satisfied that my doctors and I had done all we could to combat the cancer. So up to this point, there was essentially zero difference in the way a US or UK doctor would have treated me.
Now came the aftercare plan. I was told by both US and UK doctors that the tumor had a 50% chance of reappearing – a percentage high enough to cause me and my family many, many sleepless nights. Doctors on both sides of the Atlantic agreed that vigorous aftercare monitoring was needed to check whether the cancer would reappear. In the US, the doctors suggested that the best way to monitor a reappearance of the cancer was a series of PET/CT scans; they would use the latest and greatest technology to peer inside my body to see if the cancer had returned. They suggested a PET/CT scan once every six months for two years and then perhaps once per year until year 5. In all, this would mean 7-8 PET/CT scans, which at about $4,000 per scan would mean a total cost of about $30,000. That sounded like a lot of money, but, hey – it was my life we were talking about, and there was a 50% chance of the tumor coming back, so it sounded like a no-brainer to spend the money and get my ticket out of Camp Cancer.
I then met with my UK doctor and explained the proposed US treatment plan for aftercare. I still remember my doctor giving a wry smile that suggested there was a simpler way (the smile, too, of someone who does not have a bottomless pit of money and funding). She agreed with the need for close monitoring to see if the cancer reappeared, but then explained her treatment plan to me, which was decidedly low tech. For the first year, she would have me back in her office every 60 days to perform a thorough physical examination. She explained that if indeed the tumor did recur, 99 times out of 100 it would reappear in the neck or throat area. Since this area is relatively exposed and easy to see (down the throat with a scope) or feel (by touching the neck area), she would be able to see or feel the tumor before it reached the size of one centimeter (maybe even a bit less, depending on its location) – which is about the smallest size at which a PET/CT scan can detect a tumor. Furthermore, she explained that although PET/CT scans can be a very useful tool, they often show false positives (white spots on the scan that turn out not to be malignant tumors), and these false positives just lead to more aggravation and stress for both the patient and the doctor (and also more costs, as the health system has to perform more tests to determine that a false positive is indeed false). Not totally convinced, I asked more questions about PET/CT scans and their use; my UK doctor then told me that if the initial tumor had been on an internal organ (the lung or pancreas, for example), then the PET/CT scan would be a very useful tool, because if the tumor reappeared it would be internal, not in a place where a doctor’s visual or tactile examination could reveal its presence.
Upon further questioning regarding the benefits of the PET/CT scan, the UK doctor did accept that in the 1-in-100 chance that the tumor reappeared in another part of the body (not the neck or throat, but an organ such as the liver, the lung, or the pancreas), her physical inspection of the neck and throat area would not likely find it, and the PET/CT scan would give the patient an earlier indication that trouble had returned. However, she also noted that if a PET/CT scan found the tumor in the liver or the pancreas, that early detection does not usually lead to a better outcome for the patient. Unfortunately, if the cancer is found in another organ, it has probably metastasized and is spreading throughout the body. The sad reality of this situation is that one can run all the tests and scans in the world, but in all probability the patient has terminal cancer, and the question becomes not if but when – a sad conclusion, but medicine and health care do not always have happy endings.
Now I am generally a skeptic on such matters, and I was still operating under the mindset that US health care is better than UK health care. But I listened to what the UK doctor said, and I believed her. After all, the UK’s public system had not skimped one bit when it came to the surgery, chemo, and radiotherapy – these are expensive procedures, but the benefit they provide is very demonstrable and intuitive. By contrast, the expensive aftercare offered by the US seemed to be a lot of work, effort, and technology for little result. Indeed, I had “found” the initial tumor when I felt a “lump” on my neck about the size of a pea, and it was relatively easy to feel once you knew what you were looking for.
The Costs of Health Care
And, being the accountant that I am, I did the math on the treatment plans. The US and UK aftercare plans were similar in terms of trips to the doctor. However, to recap, the US plan included 7-8 PET/CT scans, which would cost about $30,000. My UK doctor said that 99 times out of 100 a doctor’s visual/physical inspection would find the tumor as soon as the PET/CT scan would. But one time out of 100 the scan would find a tumor that had reappeared in some other part of the body – and even then, in most cases, the likely outcome for the patient was the same: terminal cancer, with the doctors able to prolong life for a bit. So for our sample of 100 patients in the US who opt for regular PET/CT scans in their aftercare treatment plan, they and their doctors will certainly feel better about all the money and technology being spent to combat the disease. However, these 100 patients will cost the system about $3 million ($30,000 times the 100-patient sample = $3 million); and this extra $3 million will provide little or no benefit when compared with the low-tech, low-cost approach.
Now let’s go back to the US doctor who is presented with this argument; let’s say he or she agrees with it and uses the British low-tech, low-cost method. Two years later, 50 of his or her 100 patients have cancer (a 50% recurrence rate in both populations) – and of those 50, maybe two or three or four think they have been treated wrongly and decide to sue for malpractice. They engage a lawyer and spend a lot of money on courts and legal proceedings. And ultimately the doctor has to stand in front of 12 jurors while a litigation lawyer – highly practiced in creating courtroom drama – tries to make the doctor look like a villain. Undoubtedly, during testimony, the lawyer will pointedly ask the doctor this question: “So you decided not to use the PET/CT, a technologically advanced procedure designed to identify cancer at early stages, and you chose not to perform this test to save the company a few thousand dollars – and because of your penny-pinching treatment plan, the patient did not get all the tests available to medical science, and now my client is dying of cancer because you would not allow the PET/CT scan?” These words may not be exactly true, but the lawyer is very good at bending and stating the facts in such a way as to garner sympathy for his client. (And who really can blame him; he is simply doing his job.)
So the US doctor contemplates the legal scenario above … and guess what – he or she decides to order the PET/CT scan. And, by the way, when ordering the scan the doctor also gets to charge the patient $500 to read and interpret the scan report – just another little incentive to do more, not less. But who can blame the doctor, who correctly justifies the decision as helping to save lives, reducing the threat of litigation, making patients happier, and making a bit more money.
Back in the UK, the doctor has little or no liability from litigation, for a whole series of reasons. The most relevant is that the NHS has done the cost/benefit analysis and clearly sets treatment protocols based on it. The doctor did it by the book, so there is little chance of liability.
And, finally, let’s look at it from the patient’s point of view. I imagine many may accept the logic of this argument. However, when it comes down to individual decisions on whether to do the PET/CT scans, many will still opt for doing the tests – nothing wrong with that – it’s a free country. But what the patient should decide is whether he or she is prepared to spend $30,000 of his or her own money on these tests. I personally do not want to spend my tax dollars on a government-run system that spends this kind of money on tests or programs with little or no benefit. So the patient can bankroll this $30,000 option by either dipping into his or her savings or opting into a fancy, pay-all, private insurance plan with all the bells and whistles that will pay for such luxuries. Such an insurance plan will undoubtedly cost several thousand dollars more per year, but you get what you pay for.
Now back to the math and the big picture of US health care. In our little sample of 100 throat cancer patients, the US system spends approximately $3 million more than the low-tech approach. Across the country, about 30,000 people per year are diagnosed with tonsil/throat cancer in the US. So expanding our sample from 100 to 30,000 means the US spends perhaps $900 million per year on PET/CT scans for throat and neck cancer patients, and this extra money provides little if any benefit in patient outcomes. $900 million is therefore largely unnecessarily spent in the US on this one little disease category. The American doctor often opts for the more expensive, more technical solution, not so much for the welfare of the patient, but so as to be seen as doing as much as possible for patients who are ill. This makes the patient happy, the doctor happy, the health care companies rich, and feeds into the common misperception in US medicine that more invasive care equates to better care.
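Being an accountant’s argument, the math above is easy to check. A quick back-of-envelope sketch, using only the figures quoted in this article (the per-scan price, scan counts, and annual diagnosis numbers are the article’s own estimates, not clinical data):

```python
# Back-of-envelope check of the article's cost figures.
# All inputs are the article's estimates, not clinical or billing data.
scans_per_patient = 7.5        # "7-8 PET/CT scans" over five years
cost_per_scan = 4_000          # "about $4,000 per scan"

per_patient = scans_per_patient * cost_per_scan   # ~$30,000 per patient

sample = 100                   # the article's 100-patient sample
sample_cost = per_patient * sample                # ~$3 million

annual_diagnoses = 30_000      # "about 30,000 people per year" (US)
annual_cost = per_patient * annual_diagnoses      # ~$900 million

print(f"Per patient:   ${per_patient:,.0f}")
print(f"Per 100:       ${sample_cost:,.0f}")
print(f"Per year (US): ${annual_cost:,.0f}")
```

The three figures in the article – $30,000 per patient, $3 million per hundred patients, and roughly $900 million nationally per year – all follow from the same two inputs.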
$900 million – almost $1 billion spent with no recognizable benefit for one condition. And as the saying goes, “a billion here and a billion there – pretty soon we are talking real money.” This one economic example, repeated over and over for other disease categories, is certainly one of the major reasons that the US spends twice as much per capita on health care as most other developed countries with no demonstrable benefit to the population.
Rationing of health care services – bring it on. Let’s stop spending money on health care treatments which do not provide a real benefit. More money does not always mean better care.
You might call it rationing, but I call it common sense – making logical, informed decisions about health care procedures is an achievable goal that can make health care affordable to all Americans. So let’s look behind the scare tactics and buzz words, do what is right, and allow for Affordable Health Care for all Americans.
Mr. Barack Obama
President of the United States of America
1600 Pennsylvania Avenue
Washington, DC 20500
Dear Mr. President:
I have a little problem. My little problem, however, is part of a big problem – Health Care in America. It is a problem for all Americans. I want to help you fix this problem. My individual experiences make me both passionate and uniquely qualified to help change health care in America to make our country a better place to live and work.
First, a short history of my little problem: I am a 49-year-old American who moved to the UK ten years ago. I was Chief Financial Officer (CFO) of an international health care company, and I accepted an assignment in Glasgow, Scotland for a six-month stint that somehow turned into ten years. While staying true to my American roots, I have enjoyed my stay in the UK and enjoyed a successful career – except for one little glitch when, in 2005, I was diagnosed with tonsil cancer. Six months of intensive treatment (surgery, chemotherapy, and radiotherapy) was followed by three years of aftercare. All of my care was provided efficiently and effectively by the wonderful, capable doctors and nurses working for the National Health Service (NHS) here in Scotland. In some ways, I consider myself fortunate that I was diagnosed with this terrible disease while in the UK, where I had access to the NHS to provide care.
But now, my little problem: I would like to move back to the U.S., but cannot – because no insurance company will come near a cancer survivor like me. So, every summer, I visit my family for two weeks, and every April 15th, I send the IRS a check for my U.S. taxes for the privilege of being a U.S. citizen. And now I wait until I am 65, when I will be covered by Medicare, to return to my home – that, of course, is sixteen years away. This, Mr. President, is no way to run a country.
Ironically, since I have worked as a financial executive for various health care companies for over 20 years – I understand all too well why insurance and health care companies run the other way when they see me coming. As a cancer survivor, I am a big, fat financial liability waiting to happen. I also know that there are ways I can “sneak” into the U.S. health care system by getting a job with an employer with an insurance plan or otherwise getting into a group plan and hoping that preexisting conditions do not trip me up sometime in the future. But then again, I know those insurance underwriters are smart and vigilant (that is what they are paid to do); they are continually finding ways to exclude the high risk patients from their insured population. And even if I do get into an insurance plan, I would imagine the fine print of the policy would find a way to exclude me if I became a major liability. So, is this really any way to live my life? Is this really any way for anyone to live their life? Is this any way to run the greatest country in the world?
So for now, I will remain here in the UK. I know the UK is not perfect, and it has its own health care issues. But I know one thing is certain – if I show up at the hospital in the UK diagnosed with cancer again, I will be treated, and I will not be financially ruined. The peace of mind this gives me is more than enough reason for me to stay here in the UK until we solve our little health care problem in America.
As I said at the beginning, my little problem is really an American problem. The real problem is that millions of other Americans have similar or worse tales of woe and do not have a solution. They come in all shapes and sizes. Someone gets ill and then loses his or her job and health insurance coverage; someone’s illness is excluded from insurance because of the fine print in the policy, for example because of preexisting conditions that were either conveniently or inconveniently forgotten at the beginning of a policy. The circumstances are wide and varied. Far too many people in America live in fear or ignorance of a health care event which can be catastrophic to themselves and their family. The free market system of American health care has developed into a multi-headed hydra which is designed more for making profits than for caring for the sick – or even keeping people from getting sick in the first place.
I am passionate about helping fix the problem. I offer my services to you as cancer survivor, as an experienced financial executive, and as an American who wants to make the country a better place in which to live. I will consider any role in your administration (or indeed anywhere in the U.S.) which will put me in a position to help fix health care in America. I want to come home and help, but the irony is that I can’t come home until I get my health insurance sorted.
My compensation for providing such services is simple. My compensation will be to once again live in a country where if I (or anyone) am diagnosed with cancer (or any major disease), I will be treated and I will not be financially ruined. The peace of mind that comes with this end result will be more than enough reward for any service that I can provide.
Thank you for your consideration.
I remain a U.S. citizen, proud of America, and missing my home.
P.S. I have also written a two-part series for Demockracy.com on the subject that goes into more detail regarding the problems and solutions of U.S. health care. If you are so inclined, please look at these pieces and let us begin the work to fix American health care:
During the long years of the Cold War, not many dared to question the US military budget. Since then, however, the budget has continued to expand, often sending troops overseas to situations that were created by previous diplomatic blunders. Some of those blunders have directly created the morasses that we attempt to extricate ourselves from today. As such, let’s take a look at some of the history of what the CIA refers to as blowback for the U.S.
Brief Blowback History
In 1953, Iran – formerly known as Persia – had a functioning democratic system. A successful coup by the CIA and British Intelligence overthrew the democratically elected government and consolidated power in the hereditary Shah. His abuses and misrule led directly to the Islamic Revolution and the problems we have encountered with Iran’s Islamic government ever since. In the early 1980s, the US, in a fit of pique, encouraged Iraq to invade Iran and supplied it with arms in the resulting war. This assistance helped solidify Saddam Hussein’s military ambitions and indirectly encouraged his invasion of Kuwait in 1990, all of which led to the mess in Iraq today.
Meanwhile, during the 1980s, the military assistance given to the tribes opposing the Russian occupation of Afghanistan led to the Taliban taking over the country. These people, who were responsible for 9/11 (despite the Bush administration’s claims to the contrary), are the people we continue to fight today in Afghanistan. In addition, they have also brought the war on terror to the nuclear-armed country of Pakistan.
Bill Clinton didn’t help matters when, in the midst of the Monica Lewinsky affair, he launched Tomahawk missiles against a suspected Al Qaeda munitions site in Sudan and the Tora Bora site in Afghanistan where Osama Bin Laden was thought to be. This was in retaliation for the earlier bombings of the US embassies in Kenya and Tanzania. One of the Tomahawks destroyed a human and veterinary pharmaceutical plant in Sudan, killing at least 20 Sudanese and putting many out of work. The Sudanese government immediately cut off all ties with the U.S. and released an important Al Qaeda suspect they had been about to hand over to the U.S. The Tomahawks in Afghanistan missed Bin Laden entirely – he was in Kabul at the time. He in turn sold an unexploded Tomahawk to the Chinese for 10 million dollars. Worse, almost all of Africa, which had been outraged over the embassy bombings by Al Qaeda, swung against US policy after the missile strikes. Sound familiar?
In addition, it is clear to most of the world, though rarely reported in the US, that huge military assistance to Israel keeps it so dominant that it often refrains from entering into meaningful dialogue with the Palestinians or other nations in the region. Without meaningful, legitimate political channels, arguably, that may in turn have indirectly led to the cult of the indefensible and grotesque suicide bomber.
Similar situations of blowback have occurred on all continents. It is alleged that the policy of supporting vain, immoral megalomaniacs as leaders in the more unstable areas of the world could be summed up as, “We don’t care if he’s a bastard so long as he’s our bastard”.
In too many situations today, previous meddling in the internal affairs or politics of other countries has led directly or indirectly to the messes we now face. If intervention leads to revolution or serious instability in the country involved, it is often inevitable that the beneficiaries of the situation will be the worst possible choices. It takes many generations for the situation to settle down and for the voices of reason to make headway over the radicals, who are always the initial power base. The French Revolution, the Russian Revolution, and the Persian [Iranian] Revolution are all cases in point.
To return to the end of the Cold War, there was at that time, along with a feeling of relief that we were all suddenly safe, a hope that the troops could come home, and be discharged. That of course never happened. Why not?
The Military Industrial Complex
Today the US spends 46% of the total world’s military budget. The next four nations – the UK, France, Japan, and China – spend between 4-5% each. The US military budget has risen from 250 billion dollars in 2001 to over 700 billion in 2008. Thus, the sensible solution to help our failing economy would logically be to cut the military budget and bring everyone home. Wouldn’t that give us ironclad security at home? Maybe we could even make our inner cities safe and bring down the horrendous murder rate from the 17,000 yearly victims it is today.
Of course that is about as realistic as overall world peace. But why?
The answer to why that apparently sensible solution is currently a pipe dream was first given by President Eisenhower in 1961. Eisenhower was the first President, as a former General, to recognize the power of the Military Industrial Complex.
That term refers to an overly friendly relationship between the government, the military, munitions manufacturers, and defense contractors. All in this relationship benefit financially, and unfortunately peace can get in the way. Eisenhower, as a military man, saw what could occur when future Presidents without military experience tried to go up against this Complex. They would be easily maneuvered by the military to react where no reaction was necessary, and to keep the US military equipped with constantly updated equipment and every new technology. Today, there is a defense contractor in every state of the Union. If there are cutbacks, you can be sure these workers will be out in force rallying senators and representatives at every level. The President will be lambasted across the nation and the Republicans will make hay. Any President who takes on this issue will be lauded by history, but is unlikely to win a second term.
Will Barack Obama be able to break this endless cycle of blowback? If recent history is a good predictor, it certainly won’t be easy. For the sake of the rest of the world, let’s hope for the best.
As previously stated, the purpose of this two part series is to set forth my views on changing health care delivery in America to make it more efficient, more effective, and, most importantly, more compassionate.
In Part 1 of this series, Health Care in America: A Time for Change, I laid out my personal experiences that led me to write this series and outlined the problem. In Part 2 of this series, I will explore ways of making health care better for all citizens of the US, starting with the concept of triage.
TRIAGE – WHERE IS HAWKEYE WHEN YOU NEED HIM
In the old sitcom M*A*S*H, Hawkeye Pierce would perform triage for his MASH unit. His job in triage was to separate the injured into three groups:
1. Those who needed care immediately to deal with a life threatening or a rapidly deteriorating situation.
2. Those who needed care, but who could wait for a time period with little or no effect on the patients’ wellbeing.
3. Those who could not be helped by health care (either they were not sick, or they were so badly injured and sick that normal health care procedures could not improve the situation).
Hawkeye heroically performed triage for the MASH unit, and by doing so, used the unit’s limited resources for maximum effect. And why was he a hero? Because he made tough decisions about prioritizing the needs of patients based on triage. The system was not perfect, but decisions were made, and the doctors got on with the job of caring for the sick and wounded. And through appropriate triage, care was provided most efficiently given the limited resources at hand.
In the UK, this triage function is essentially how the national health care system works. Resources within the NHS are indeed limited, and everyone knows it. Indeed, this obvious limitation of resources does cause issues (e.g., waiting lists). However, it also forces the people and the doctors in the system to focus on what is important; it forces the doctors to make the tough decisions necessary to give care to those who are most in need. The classic triage function is entrusted, not to insurance companies, lawyers, or accountants, but to medical professionals. Indeed, this is what doctors are paid to do – not just provide care, but provide care to the sick with recognition of the limitations of resources. They are not just health care technicians; they also perform a much more important role – TRIAGE.
Indeed, in the case of my son’s broken arm and my cancer treatment, the triage system worked just as it is supposed to work. In both of these situations, the doctors recognized a problem that needed to be addressed quickly, and even though their resources were limited, we were cared for quickly and efficiently with the resources at hand.
In America, this triage system has been distorted by the market system – a market system which has less to do with medical priorities than it does with economics, litigation, and profit. The market system ensures that those with the most money get the best health care in a timely fashion; those without money get what they can get. The market system also ensures that those with access to lawyers will receive health care, often unnecessary health care. Large sums of money in America are spent on defensive health care, where diagnostic procedures or tests are performed for the sole purpose of defending against potential lawsuits. Finally, the market system, through fee-for-service payment, ensures that many wasteful procedures, tests, and office visits will occur, not necessarily for the benefit of the patient, but for the benefit of the provider, who will earn more money by performing more procedures and tests.
The table below is a simple demonstration of the inefficiencies of the US health care system.
$ Spent Per Capita on Health Care (USD) | Health Care as % of GDP | CT Units per 1 million persons | MRI Units per 1 million persons | Infant Mortality per 1,000 births | Life Expectancy (years)
Source: OECD Health Data 2005
The first two columns of the table show how the US spends approximately twice as much on health care as comparable Western countries. The third and fourth columns are an indication of how that money is spent. The money is spent on fancy machines and diagnostic tools (CT and MRI); these columns show that US usage of such scanners is 3-6 times higher than in other countries in Europe. Although these types of tests are a useful tool in diagnosing certain diseases, their use in the USA is certainly out of proportion when compared to other countries. Indeed, I believe the high use of such technology is not driven by patient need, but rather by profit motivation and the fear of litigation.
Finally, the last two columns show that for all the money spent and the technological advances (such as MRI and CT), the US lags behind other countries when it comes to two objective measurements of health care – infant mortality and life expectancy.
In summary, the market system of US health care forces costs to rise and rise and rise again with no objective benefit to the population. These costs are driven by all the players in the system:
1. Lawyers – who through the threat of litigation lead many doctors to perform unnecessary and non-cost effective treatments.
2. Managers (motivated by profit) – who want to provide more care (as long as it is covered by insurance, Medicare or Medicaid programs) because more care leads to more procedures leads to more revenue, which in turn leads to more profits.
3. Insurance Companies (and HMOs) – who tend to provide more care (more coverage) and increase premiums incrementally across their insurance pools. This is especially true since the costs of increased premiums are often negotiated with employers, and the costs are invisible to the employee (the actual customer). Ultimately, more care means more revenue, which usually leads to more profits for the insurance company or the HMO.
4. Doctors – who (bless them) want to provide more care because that is what keeps their patients healthy; but we also must remember that sometimes, doctors also have a financial motivation whereby more health care and more procedures will lead to more money in their pocket. In addition, the current system gives no financial incentive for doctors to coordinate care with other providers.
These four players in the system are all motivated to provide more health care and more expensive health care. The system is fixed to continually increase because there is no one in the system who is manning the brakes!
THE WAY FORWARD
The solution to these issues is simple in theory and more complicated in practice. This solution is a return to triage. Provide health care to the ones who need it, when they need it. The solution, however, given the entrenched positions of each of the players, will not be a quick fix. The US health care system has evolved and been shaped by the market, culture, and technology for over 100 years. A miracle cure will not happen overnight; any new law or system will need to be assimilated into the culture and have its own evolutionary process. However, change needs to happen, and that change must address the incentives and motivations of each of the constituent parties (the doctors, the managers, the insurance companies, and the lawyers).
However, the strategy for health care reform is simple: a return to triage – this entails three steps.
1. Define the resources (set the budgets): At a regional (manageable) level (state, county, or city) define the budget and resources which are available to each entity in any given year. Money, operating rooms, MRIs, hospitals, and all of the other resources available and the cost thereof must be defined and budgeted.
2. Define the health care needs of the population being serviced: At a regional level (state, county, or city), an actuarial study will need to define the health care needs of that particular population; any population of 50,000 persons or so can be studied from an actuarial viewpoint, and a very good estimate of health care needs of that population can be developed.
3. Allow medical professionals to make resource decisions: Doctors and medical professionals then need to make medical decisions on how to use these resources to best service the population. This will be a hard job; there is no doubt about it. However, it is the essence of triage and what doctors should be trained to do and, indeed, what they are paid to do. A limited budget means that not everyone can get an MRI; not everyone can have the expensive course of drugs that has a marginal effect. These are difficult decisions which need to be made, but without a budget limitation, the decision will not be made. These medical professionals, who should be appointed to significant staggered terms to avoid political winds, will have the flexibility to spend money on programs that provide more bang for the buck. They, in effect, will perform triage for the American health care system.
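The three steps above can be illustrated with a deliberately oversimplified sketch: a fixed regional budget, a set of cases, and medical professionals funding the most urgent care first. All case names, priorities, and dollar figures below are hypothetical illustrations, not real protocols or prices.

```python
# A deliberately simplified sketch of triage under a fixed regional budget:
# fund the highest-priority cases first, skipping anything that no longer
# fits the remaining budget. Hypothetical figures only.

def triage_allocate(cases, budget):
    """Fund cases in priority order (1 = most urgent), skipping any case
    whose cost exceeds the remaining budget."""
    funded, remaining = [], budget
    for case in sorted(cases, key=lambda c: c["priority"]):
        if case["cost"] <= remaining:
            funded.append(case["name"])
            remaining -= case["cost"]
    return funded, remaining

cases = [
    {"name": "emergency surgery", "priority": 1, "cost": 40_000},
    {"name": "chemotherapy course", "priority": 1, "cost": 60_000},
    {"name": "routine MRI", "priority": 3, "cost": 2_500},
    {"name": "marginal-benefit drug", "priority": 3, "cost": 90_000},
]

funded, left = triage_allocate(cases, budget=110_000)
print(funded)  # the urgent cases are funded; the marginal-benefit drug is not
```

The point of the sketch is the one made in the text: once a budget limit exists, someone has to rank the cases, and the marginal-benefit items are what get cut.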
Finally, three other political decisions need to be implemented to allow this strategy to work.
First, health care must be universal and include all citizens. There needs to be an acceptance and realization that we as a country will take care of the basic health care needs of our citizenry. Patient finances should not be the determining factor when providing basic health care. This in turn will make it much easier to define the population and the health care needs of that population (because it is everyone, except those who opt out for private health care). This will also eliminate the universal problems of gaps in coverage when one changes jobs, issues relating to pre-existing conditions, and other problems caused by a private insurance system.
Second, all citizens will be allowed to opt out into a private system. If a person wants to spend his or her money on a special medical procedure, medical “gap” insurance, or whatever he thinks is appropriate, he can. It is envisaged that only the relatively wealthy will opt for this type of coverage, but anyone who wants to spend his money on more health care can do so. (This would be analogous to a private school system whereby parents can choose to opt out of the public system for a fee.)
And finally, and perhaps most importantly, we need to greatly reduce the dollars which are spent on lawyers and procedures performed only to placate the lawyers. Through major tort reform, we need to stop paying the lawyers to police the system. We must eliminate large litigation payouts and thereby eliminate most of the defensive medicine that today is necessary simply to provide an appropriate defense for many doctors. Fair reparation should be paid in certain cases where mistakes are made. However, the multimillion dollar payouts, and more importantly from a cost standpoint, all the waste which comes from defensive medicine in response to such lawsuits need to be eliminated.
These of course are only strategy statements. Work, work, and more work must be accomplished before this strategy can become reality. As such, these are no more than first steps meant to start a dialogue. I welcome discussion and debate which can lead to developing an American health care system which can be the best, the most efficient, and the most compassionate in the world.
Several months ago, in my first piece as a writer for Demockracy, I talked about my perspective as a police officer who is against the War on Drugs. In the months that followed, the article became very popular on this Web site and across the social networks, and I have had several requests to follow up on it, talk more about my career experiences, and share my insights on this misbegotten war on drugs. In this follow-up article, I will explore some of the personal experiences that have led me to my current conclusions. I hope you enjoy it, and please share any comments that you may have.
As I’ve shared in the past, I am a retired policeman from Vancouver, B.C., and I represent LEAP, Law Enforcement Against Prohibition. We are a worldwide organization of police officers, corrections personnel, judges, and many others who work in different areas of law enforcement, both active and retired. We currently number some 8,500 members. Our advisory board is made up of one US governor, four sitting US Federal District Court judges, five former police chiefs, the ex-mayor of Vancouver, B.C., Senator Larry Campbell, the former Attorney General of Colombia, and, from the UK, a former Chief Constable and the former head of narcotics task forces for all of England. We do not support drug use and realize that in an ideal world we would be better off without it. What we do believe is that “The War on Drugs” has created most of our problems with drugs and addiction today. Addiction is a disease, not a crime.
More On My Experience
With that said, let me tell you a bit more about myself and why I have come to these conclusions. I joined the Vancouver Police Department in 1973 and served for 28 years. The date of my joining is important: the year before, in 1972, the Canadian royal commission known as the Le Dain Commission concluded that, due to the high costs of enforcement and the relatively benign effects of marijuana, there should be a gradual withdrawal of criminal sanctions over time, resulting eventually in the legalization of marijuana. All in-depth studies going back to the Indian Hemp Drugs Commission of 1894-95 have come to the same conclusion about marijuana. However, the Canadian Parliament chose to ignore those recommendations.
As I recall, there was little focus on drugs when I went through the Vancouver Police Training Academy. (I did, however, learn that reasonable force extended to choking a dealer to prevent his swallowing the evidence, and that the ponytails favored by so-called hippies made a very effective handle to restrain them.) After completing training, I discovered that drug enforcement was mainly left to the individual officer’s discretion. No high-level traffickers were ever investigated; enforcement was done only at the street level. Those who centered their activities on drug enforcement, however, made substantial overtime from court appearances. This discretionary approach has never been the policy of the federal police, the R.C.M.P. They, unlike municipal departments, receive considerable federal funding to enforce the drug laws and do so enthusiastically.
One individual I worked with during my early years routinely arrested individuals on the basis of a dirty hash pipe or a spoon with enough residual heroin to analyze. It was not unusual to bring in 4 or 5 individuals from a rundown hotel room on the basis of a small baggie of weed. At that time, the hotel clerks would tell us the rooms where they suspected the occupants of drug use and hand us the keys, while we turned a blind eye to the other illegal activities carried out by the hotel managers and staff. (I suspect these hotel managers were probably the largest traffickers in the buildings and, according to more than one source, charged prostitutes a premium for brief hotel stays.) Drug charges in Vancouver often resulted in some officers doubling their wages from the overtime and court time involved. The drawback was that there was less police presence on the streets to handle the ongoing and routine crime of downtown Vancouver.
In 1995, I started the Vancouver Police Anti-Fencing Unit. Addicts tend to concentrate in the low-rent districts, as do the pawnshops that often supply the addicts with money. The dealers are normally right outside the pawnshop doors to complete the equation. The average addict at that time was spending between $100-$200 daily on his or her addiction. Unfortunately, pawnshops normally pay only 10 cents on the dollar; therefore, to support their habits, the addicts have to steal $1,000-$2,000 worth of property. The evidence of stolen property in these pawnshops was so rife as to be almost ludicrous. I remember at one time entering a pawnshop when an addict came in with an armload of stolen property from London Drugs. While negotiating the price with the pawnbroker, he was ripping the London Drugs labels off CDs with his teeth, as he had no spare hands to do the job.
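The pawnshop arithmetic in the paragraph above can be made explicit (using the author's street-level estimates from the mid-1990s):

```python
# The pawnshop arithmetic from the paragraph above, made explicit.
# Figures are the author's street-level estimates from the mid-1990s.
daily_habit_low, daily_habit_high = 100, 200   # USD spent on drugs per day
pawn_rate = 0.10                               # pawnshops pay ~10 cents on the dollar

stolen_low = daily_habit_low / pawn_rate
stolen_high = daily_habit_high / pawn_rate
print(f"Property stolen per day: ${stolen_low:,.0f}-${stolen_high:,.0f}")
# → Property stolen per day: $1,000-$2,000
```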
There is, unfortunately, a small percentage of people who, through nurture or genetics, always seem to fall to the bottom and are unable to survive without their self-medications. They have no time for treatment, as their days are filled with theft to support their addiction, finding a dealer, and, after purchasing their drug of choice, never knowing the quality of their purchase. We cannot help these individuals by locking them away. We must not kid ourselves; in jails, drugs are readily available. Generally, the prison system tolerates drugs, as they tend to calm the inmates. The substance that jail staff actually fear is alcohol, which leads to riots and destruction. I was told by numerous prison guard colleagues that alcohol is so valued by some of the old alcoholics in jail that they will often attempt to import considerable quantities of drugs just to trade for alcohol, which is much harder to find inside.
As a policeman, I attended many untimely drug related deaths in the downtown eastside area of Vancouver where I spent much of my career. Overdoses of various drugs were very common. No one paid much heed, and most were not too traumatic to me, as relatives were usually far away, often in Northern BC or other Provinces, and it was up to the local RCMP detachments to notify them. That area in Vancouver is the poorest area in Canada according to tax returns and acts as a magnet to those who have run away from home due to abuse, sexual and domestic. Few of them had any local support in Vancouver. These individuals rapidly became involved in the drug culture of the area and many died there. It was impossible to determine if the drug deaths were a result of long-term abuse, mixing too many drug cocktails or the strength of the drug being greater than expected, either by deliberation, such as we hear of with a hot cap, or by accident.
It was only when I attended deaths out of the usual pattern that the reality of the horror really set in. A one-time partner of mine lost his 16-year-old daughter to a drug overdose. Unfortunately, her dealer did not monitor her slide into abuse. He did not offer her counseling or monitor the purity and strength of the drugs he sold. He was probably an addict himself, dealing to support his habit. The outrage is that he and thousands more are still out there, selling their products everywhere, to our children.
Raw opium increases in price by several orders of magnitude from the poppy fields to the addicts in North America. Coke is not quite as profitable, and the other drugs even less so, but anyone can rapidly rise to enjoy the lifestyle of, say, a successful surgeon or lawyer with no educational requirements, experience, or skills, and very little work required. The only way we can break this cycle, ensure a uniform product, help those who request it, and monitor those who need help is to legalize the product, heavily regulate it, and supply it to those in need.
Why don’t we go out and arrest all drug dealers? We could arrest them all and you know what will happen? There will be fights, stabbings, shootings and deaths, AND tomorrow new dealers will be there to carry on business as usual. When you arrest a drug dealer, the only thing you create is a job opportunity. As an example, there was recently an investigation of an individual planning to blow up a city block in Surrey, BC, in order to rid it of all the drug dealers there. Some may believe that his point of view could be justified. The only problem was that he himself was a dealer and hoped to take over all the business with the others gone.
Ask yourself: if heroin or cocaine were legal, would you use them? I wouldn’t. No one who is rational and has aspirations for a meaningful life is going to. In fact, 99% of all people tell us that they wouldn’t. The first drug laws were enacted because 1-3% of the population was believed to be addicted to drugs. By addicted, I mean unable to hold meaningful work and behave in a socially responsible way. Today, after countless millions have been arrested and billions of dollars spent, the percentage of addicts is still estimated at between 1 and 3% of the population.
Let’s take the money from the criminals, reduce property crimes, reduce prostitution, reduce disease, and give our social agencies the funds to really have an impact on society. Above all, let’s give that 1-3% a chance of a real life.
The purpose of this two-part series is to set forth my views on changing health care delivery in America to make it more efficient, more effective, and, most importantly, more compassionate.
In Part 1 of this series, I will explore my personal experiences that led me to write this series and outline the problem. In Part 2, I will lay out my solutions for a way forward to solve the health care problem in America.
My credentials for these views come from both personal and professional experience. I have 20 years of experience as a financial executive and CFO in the health care industry in America and Europe. Throughout that career, it was my job as a CFO to create value (i.e., make a profit) through the marketing and delivery of health care to the general population. This involved understanding the rules and complexities of both private and public health care systems from a financial viewpoint. In the later stages of that career, I founded, owned, and managed health care companies in both the US and the UK and experienced firsthand how the corporate world prospers in both market-based and government-supported systems. In addition, while living in both Europe and America during this time, I experienced health care as a patient on both sides of the Atlantic.
In 2005, my health care experience became more personal when I was diagnosed with cancer while residing in the UK. I was treated for the disease in both the UK and the US and directly experienced how each of these countries dealt with the diagnosis, treatment, and aftercare of a person with a major health issue. As a patient with a major illness, I suddenly had a very different perspective on what constitutes best practice when it comes to delivering health care. After one year of treatment and three years of aftercare, I am now a cancer survivor and am on a mission to bring about affordable, efficient health care to all citizens of the US.
Overall, I believe that my 20 years of business and health care experience gives me the expertise to help make a difference in health care delivery in America. My experience as a cancer survivor in America and Europe makes me want to make that difference.
My Vision (or How I Learned to Stop Worrying and Embrace National Health Care)
First, let me state my bias. I believe the UK health care system is better than the US system in many ways:
1. It is more efficient than the US system in terms of costs per capita.
2. It provides better outcomes than the US system (based on measures such as life expectancy and infant mortality rates).
3. It is more compassionate than the US system because all citizens are cared for regardless of income or net worth.
4. It allows for rich people to “opt out” and go private.
What’s not to like – better, more compassionate care for less money, and the ability to pay more to get an even higher standard of service?
My vision for US health care is certainly affected by my experience with the National Health Service (NHS) in the UK and Europe. The UK health care system is far from perfect, and the purpose of this series is not to critique that system. I also know that individual anecdotes seldom tell the whole story. Nevertheless, my health care experiences in a foreign country are worth mentioning.
In 1999, soon after moving to the UK, my son, aged 7, fell and broke his arm. I rushed him to the emergency room in a fair amount of anxiety, not just because of the injury, but because I did not know how the medical system worked. He was in obvious pain, and upon arrival, the attending nurse quickly gave my son some drugs for the pain. After a 30-minute wait, the doctor diagnosed a broken arm, an x-ray confirmed the diagnosis, and after about 2 hours, my son left the emergency room with a cast on his arm. Both he and I were tired, but relieved that everything was going to be OK. While at the ER, we did not fill out forms, and there was no mention of money or insurance. The only thing that appeared to matter was that my son was injured and in pain, and the doctors acted on his injury. A light bulb went on in my head – this might be a better way.
Five years later, while still living in the UK, I was diagnosed with tonsil cancer. Six months of intensive treatment followed, including two surgeries, radiotherapy, and chemotherapy. Three years of follow-up care (still ongoing) came afterward. Interestingly, in terms of health care administration, my bout with cancer exactly paralleled my son’s broken arm incident. There were no forms, no mention of money or insurance, no rundown of what was covered and what was not. More importantly, there were no discussions of employment gaps, pre-existing conditions, or how future insurance coverage would be affected. Instead, the only thing that appeared to matter was that I was ill, and the doctors acted on that illness. I imagine there was some administration and paperwork somewhere, but I didn’t see it. All I saw was a focus, between doctor and patient, on the care, well-being, and options of the cancer patient – me.
The light bulb in my head was now a spotlight in my face. This had to be a better way.
The Problem – Health Care in America
American health care, like many other facets of American life, can lay claim to being the best in the world. America arguably has the best doctors, the best equipment, the best medical schools, the best research and development, and the best hospitals in the world. Many US hospitals are known throughout the world as “The Place” to go to ensure the best health care possible. The Mayo Clinic and the Sloan Kettering Institute are two examples of organizations that lead the world in health care practices. However, from the standpoint of efficiency, effectiveness, and, perhaps most importantly, compassion, the US system falls well short when compared to other countries.
For example, studies have shown that the US spends about twice the amount on health care per capita compared to other economically developed countries (EDCs). (These other EDCs generally use socialized or government-sponsored health care systems.) More interestingly, of the total US expenditure, about half is actually spent by the government, which generally foots a large portion of the bill for the over-65s (through Medicare) and the “non-wealthy” (through Medicaid). (I put non-wealthy in quotes because nearly 40 percent of the uninsured population in the US reside in households that earn $50,000 or more, so this group is not the indigent poor.) So, even though popular opinion holds that the US relies primarily on private health care, the US government spends about the same as other developed countries on a minority of its population, even before one factors in private expenditure. The US already has a national health care system whether it knows it or not.
So the issue is not whether the US should or should not have a national health care system – the US already has one. The issue is how the US as a country can:
1. Spend twice as much on health care as other similar countries, AND
2. Have approximately 1/6th of its population (approaching 50 million people) with no insurance or health care plan other than a trip to the nearest emergency room when trouble occurs. This segment of the population lives in either ignorance or fear of the liabilities that could arise if their health takes a turn for the worse.
But let’s put these statistics aside and get to the real issue – the human issue of people who are suffering not only from sickness or disease, but also from the anxiety caused by the personal financial repercussions of injury or illness. When you or a loved one is sick and unsure whether you have the financial wherewithal to deal with the sickness, the financial and personal issues can become more important than the sickness itself.
So what is wrong with American health care? One thing that is not wrong is money – as stated previously, twice as much is spent on health care in America as in comparable countries. What IS wrong is that this money is being spent on the wrong things, and I will sum that problem up in one word – TRIAGE. I will explore this and much more in Part 2 of this series.
The election of Barack Obama has signaled a potential turning point for the people of the United States. Millions have been inspired by Obama’s hopeful message of change, and for the first time ever, a man of color occupies the nation’s top position. Many Americans of all classes and creeds, to say nothing of race, are looking forward in hope to an improved economy, an end to the wars in Iraq and Afghanistan, and a more respectable reputation abroad. With all this talk of change, however, more attention needs to be paid to the concerns of women in the United States. With so many competing pressures at home and abroad, many women’s groups and feminists fear that gender equality will not be a priority of the Obama administration. This essay explores three imminent problems American women face today: pay inequity, lack of representation in leadership positions, and limitations on the right to choose. Congress and past presidents have given some attention to these matters; however, much work awaits us if there is to be significant improvement in these areas. Under the new Obama administration, there may never be a better time to make such progress.
While the women’s movement is not as active as it was during its heyday of the 1960s and 1970s, women have by no means achieved equal status with men. One of the most frustrating problems facing women today is unequal pay. Although employers are barred from discrimination based on gender, and equal pay has been the law since 1963, women still make, on average, 77 cents for every dollar a man earns. Race and class intersect with gender to structurally disadvantage poor women of color the most. While some middle-class white women can almost touch the glass ceiling, far too many working-class women with darker skin tend to scrub the floors below. There are numerous reasons for women’s lower wages, including but not limited to: the fact that discrimination is extremely difficult to prove in court, the devaluation of women’s work, inadequate family leave policies, and ideals of masculinity/femininity that prevent women from occupying the “top” positions. Solutions to the problem must be multifaceted and should address inequalities that occur at different stages of life. For example, girls need encouragement to develop their interests in traditionally “masculine” enterprises like math and science, and women who become mothers should be entitled to a reasonable amount of paid leave.
Currently, the Family and Medical Leave Act (FMLA) grants only twelve unpaid weeks to employees who wish to care for a family member, although some employers may choose to offer more time off and pay for it. While the FMLA represents some progress in addressing the needs of women and families, the law is embarrassingly stingy compared to the more generous policies of other western countries. In the United Kingdom, for example, women are entitled to 26 weeks of paid maternity leave. Paternity leave should also be encouraged. Providing leave only to mothers reinforces the idea that women are, and should be, the primary caregivers of all children. If we strive to make caregiving an equally shared responsibility in order to open up opportunities to women, then men should also be incorporated into such leave policies. Clearly, U.S. family leave policy is severely out of touch and inadequate to meet the real needs of working women and families. Family leave policies are generally very popular politically, and there is real opportunity for significant bills to be passed in this area over the next four years under President Obama.
In addition to pay equity issues, women are also severely underrepresented in public service. At the time of this writing, women hold a mere 16% of congressional seats and govern only 8 states. In most other forms of political participation, women are equal or nearly equal to men, and in some cases even surpass their male counterparts (e.g., women tend to vote at higher rates than men). However, when it comes to the most powerful and prestigious positions in our society, such as elected office, the number of women is dismally low. Observers have attributed the low number of women in office to a plethora of reasons, including the political opportunity structure, gender discrimination, and the fact that women simply do not run at the same rates as men. However, childcare responsibilities may be the primary reason why women dangle at the bottom of the political rope. Women are more likely than men to begin their political careers later in life, after their children are grown. Thus, politically minded women often lack the time (and energy) to gain the experience required for higher office. However, with the election of a black president, along with Hillary Clinton’s and Sarah Palin’s historic bids, women may have more of a chance than ever to break the glass ceiling of politics. At the very least, the 2008 elections taught young girls that the President doesn’t always have to be white and male. The President can be black or, possibly, female. Aside from providing excellent role models, the Obama administration should encourage women to run for office by providing extra support for mothers. As with pay equity, paid family leave is essential, and a compensation program for elder caregivers (the large majority of whom are women) might be considered.
In addition, because the seeds of Senators and Presidents are first planted at the grassroots level, Obama’s famous technological grassroots organizing techniques could be expanded to help recruit women candidates to run at all levels, including for offices such as city council and the state legislature. Many groups, such as Emily’s List, already have the infrastructure in which to invest such new techniques. With any luck, such a program would help cultivate tomorrow’s female Senators, Governors, and even Presidents.
Finally, many women also lack control over their own bodies. Reproductive rights and freedoms have slowly but steadily been eroded over the years, and Roe v. Wade may hang by a thread. There is a plethora of restrictions on access to abortion in some areas of the country, and the women least likely to be able to access abortion services are often poor women and women of color. In general, poor women are more likely than affluent women to carry an unwanted pregnancy to term, whether because of indigence or the lack of reproductive health providers in their area. Poor women are also more likely than affluent women to have abortions later in their pregnancies, which increases the likelihood of complications and health risks. In the area of abortion rights, President Obama may have the power, through appointment, to help shape broader access to abortion in the coming decades. He can appoint strong pro-choice justices to the Supreme Court and judges to the federal courts and put pressure on certain states to provide more funding to indigent women who want or need abortions. Obama may also consider becoming a champion for policies that prevent unwanted pregnancies from occurring in the first place. Such policies would promote comprehensive sexual education for all public school children, stressing protection and accountability. Abstinence-only education in schools has been shown to be inadequate and ineffective in preventing premarital sex. Such policies would also make contraceptives more widely available to at-risk youth. Also, in order to prevent abortions, women need to be able to freely choose whether or not to become mothers. Without adequate public assistance, childcare subsidies, and paid family leave, many women feel forced by their economic situation to terminate a pregnancy rather than live in poverty.
We have to remember that reproductive freedom is not only about the right to abortion, but also about providing women with the means, resources, and opportunities to choose whether to raise a child.
This essay points to three major issues that affect women’s lives – pay inequity, lack of representation in the political world, and restrictions on reproductive freedom. Although women’s issues may not be perceived as being as important as they once were, the urgency of the problems that afflict women in 2009 is just as strong as it was several decades ago. While the substance of each issue discussed is different, the roots of these problems are similar. Women are still the primary caregivers of children and elderly parents, despite the fact that both men and women have increasingly expressed agreement with egalitarian ideas about sharing domestic responsibility. To compound the problem, women are not allotted enough reproductive liberty to freely choose motherhood and are not allotted the political liberty to be represented by their peers. Additionally, the problems discussed are not discrete, independent issues, but rather are very much interrelated with one another. More women in public office are likely to encourage more family-friendly policy, and attention to reproductive freedom may increase. More reproductive freedom is likely to allow women to take advantage of the same opportunities as men, including moving up in the working and political worlds. These problems and their solutions thus cannot be addressed individually in a vacuum, but must be addressed together as part of a comprehensive plan to elevate the status of women.
The inequalities women face in the United States are symptomatic of an unequal democracy. If women do not possess as much power, influence, and control over their own lives as men do, then we cannot say that “we the people” rule our country. Obama’s election inspired a hope for a more egalitarian society in which freedom and prosperity could flourish. In order to accomplish this utopian vision, attention needs to be paid to what Simone de Beauvoir termed the “second sex.” Women’s needs should be taken into account, not only to raise the status of women, but also to create a society of true equals. I have hope that the next four years can begin to lead us in that very direction.
On a rainy Saturday afternoon this past November, San Francisco said its final goodbye to the Abraham Lincoln Brigade. Or at least it said goodbye to the one veteran of the brigade who could make it – hundred-year-old Hilda Roberts, one of about sixty American women who served the Republican cause in the Spanish Civil War. Apparently, a couple of other vets had planned on being there, but the weather kept them away, and there’s not a large pool to draw on – only some twenty-two to twenty-four of the veterans are thought to still be alive.
The San Francisco event was a commemoration of a much larger leave-taking that took place seventy years earlier, almost to the day. For that farewell, remembered in Spain as the Despedida, the crowd numbered in the tens of thousands, as Spaniards filled the streets of Barcelona for a last look at the departing International Brigades, the 35,000 or so volunteers from 53 countries who had come to defend the Spanish Republic from General Francisco Franco’s military uprising two years earlier. Among the departing were the Americans – about 2,800 had come, less the roughly 800 who died in Spain – who subsequently became known as the Veterans of the Abraham Lincoln Brigade.
At the time, Spain seemed a microcosm of all the world’s conflicting ideas on one peninsula in Europe. Within five years of the 1931 fall of the monarchy that had ruled it almost continuously for centuries, Spain’s disparate points of view had crystallized into two opposing coalitions: the Popular Front of Socialists, Communists, and left-wing Republicans; and the National Front of Christian Democrats, fascists, and monarchists. Five months after the Popular Front’s electoral victory, the two sides would be transformed into the warring Republicans and Nationalists when army officers in the Spanish colony of Morocco began the uprising that would end democracy in the country for nearly four decades.
It is hard to convey today what Spain meant to the world in those days, but perhaps the title of Andre Malraux’s novel about the Civil War does it best: it is called Man’s Hope. And the fact that the events, while certainly not clearly recalled or understood, have never entirely receded from popular memory came to the fore in the most recent presidential election, when both Barack Obama and John McCain claimed Republican sympathizer Ernest Hemingway’s Spanish Civil War novel For Whom the Bell Tolls as one of their favorite books.
Just a few years ago, the Bay Area Veterans, while few in number themselves, were holding annual reunion events at the Oakland Hilton or the Kaiser Center that drew crowds in the high hundreds. Speakers like Ariel Dorfman and Molly Ivins talked of the relevance of the Spanish war to the events of the day, and the whole audience joined members of the San Francisco Mime Troupe in singing “Viva la Quince Brigada!” and the other songs of the Spanish Republic. But seventy years is a mighty long time to keep an organization going when there’s no source of new members. A recent obituary for Jack Shafran noted that the 91-year-old was “one of the youngest volunteers in the Lincoln Brigade.” So the decision was made to dissolve the Veterans either upon the death of two of the group’s remaining activists, Moses Fishman and Abraham Sorodin, or on the seventieth anniversary of the Despedida, whichever came first. The organization’s work would be continued by the Abraham Lincoln Brigade Archives.
By the time of the San Francisco event, both of those veterans had in fact died, so there was no longer an organization known as the Veterans of the Abraham Lincoln Brigade. The singing would be thin on the choruses of the old songs at the final event, and the 150-seat Delancey Street Theater was less than half full for a showing of a British newsreel of the Despedida, apparently never before seen in the US. On the screen, Dolores Ibárruri Gómez, better known as La Pasionaria (“the passion flower”), delivered her famous send-off speech to thousands of the then-young volunteers. As a rule, Ibárruri’s speeches included the Republican cry, “No pasaran!” – they shall not pass. But there were no such illusions on that day in Barcelona. Franco’s Nationalists had indeed passed, and the Internationals were being sent home because the cause was lost. Instead, Ibárruri told them, “Sois la leyenda.” You are legend. And legends they would be, for the rest of their days.

People used to cite the phrase “May you live in interesting times” as an ancient Chinese curse. It seems, however, that this widely cited bit of eastern wisdom may have originated on the east coast of the United States, for it appears to be neither ancient nor Chinese. In fact, the earliest evidence anyone can find of its use dates to 1936, the year the Spanish Civil War began. And maybe that’s about right, because the veterans of that war embodied this apparently modern curse as well as anyone ever has.
When the western democracies refused to aid Spain’s fight against the military uprising, the Internationals came without the sanction of their governments. (The only significant foreign assistance the Republic received came from the Stalin-era Soviet Union.) Afterwards, some volunteers, like the Italians and the Germans who constituted the largest bloc, couldn’t go home. In Spain, they had fought against their own governments because, unlike France, the United Kingdom, or the United States, Mussolini’s and Hitler’s governments had not hesitated to assist their ally Franco – and to take the opportunity to hone their military operations for the larger conflict on the horizon. Others, like the Americans, were able to return home but were now considered suspect as the times got ever more “interesting.” It seems they had been “premature anti-fascists.” They were anti-Hitler before being anti-Hitler was cool.
The interesting times continued. When US Attorney General Thomas Clark decided to warn the nation about the subversive organizations in its midst in 1947, he did so by releasing a list in alphabetical order, starting with the Abraham Lincoln Brigade. And since a good number of the volunteers had, in fact, been Communist Party members, they faced “are you now, or have you ever been” questioning for decades.
By the Vietnam War era, the Spanish Civil War was a largely forgotten event in the US. Most of the participants in the big antiwar demonstrations of the day would likely not have noticed the group of old men and a few women, marching behind a Veterans of the Abraham Lincoln Brigade banner. But they were always there, probably the most antiwar group of veterans you were ever going to meet. And an activist core continued on, and on, and on. When the Reagan Administration subverted the Sandinista government in Nicaragua in the 1980s, the Lincolns were well past the age of volunteering to fight the Contras, so they sent an ambulance down instead.
In her speech on that long-ago afternoon in Barcelona, La Pasionaria went on to exhort the 13,000 Internationals who were there: “When the olive tree of peace puts forth its leaves entwined with the laurels of the Spanish Republic’s victory, come back.” Considering the Republic’s desperate military position at the time, this seemed like so much bravado. And, at the time, it was. But not in the long run. As the viewers of Saturday Night Live would be reminded week after week, in 1975 General Francisco Franco finally died. And more importantly, with him died his dictatorship. Two years later, La Pasionaria, returned from exile, was elected to represent Asturias in the first post-Franco government.
Still, Spain was reluctant to revisit its Civil War in those first post-Franco years, and it would be nearly another two decades before the volunteer veterans were invited back. But in November 1996, sixty years after the war’s start and three years after Ibárruri herself had died, 400 of them returned to finally see the olive trees of peace and receive a hero’s welcome at the “Homenaje,” the homecoming. Twelve years later, a mere twenty-three of them were on hand for the seventieth anniversary Despedida commemoration in Spain, their numbers having plunged worldwide just as they have in the Bay Area.
But things have continued to change in Spain, and the reluctance to confront the crimes of the Franco era has declined with the passing of those personally involved. A recently passed law mandates the removal of symbols of the Franco era from public buildings and funds the unearthing of Civil War-era mass graves. And it begins the real Homenaje: as of the end of 2008, all descendants of those Spaniards forced to leave the country from the beginning of the war through 1975 may claim the Spanish citizenship denied them by Francisco Franco’s war and dictatorship. And although most of the veterans did not live to see it, and had to content themselves with being legends in their own time, this is the final victory of the Abraham Lincoln Brigade.