Sunday, May 31, 2015

A Billionaire's Thoughts on the Minimum Wage

In the words of Warren Buffett, in an article he wrote for The Wall Street Journal, America's widening wealth gap is not a result of the rich being incredibly rich, but an "inevitable consequence of an advanced market-based economy", explaining that America's "economic engine" constantly "requires more high-order talents while reducing the need for commodity-like tasks". This basically means that highly educated workers are in demand and uneducated workers are not. One unfortunate result is that nearly 15 percent of the U.S. population now lives below the poverty line. Buffett says that the solution to this problem is not education, however. Though quality schools should be available to all, in reality there is a limited number of positions for tech geniuses like Steve Jobs or economic experts like Buffett himself. Education is a far more complex, longer-term solution to the wealth gap than what Buffett believes should be done: "a carefully crafted expansion of the Earned Income Tax Credit (EITC)".

The current EITC is not doing enough. It is a refundable federal income tax credit for people of lower to middle income that averages just below $3,000 a year per family. Three thousand dollars a year is little money compared to what nearly doubling the minimum wage (to $15 an hour, as some groups propose) would put in workers' pockets. The problem with doubling the minimum wage, however, is that almost all companies would have to reduce employment, crushing many of the workers possessing only basic skills, as Buffett puts it. Some would be happy with their $15 an hour, but many would be unemployed entirely. Buffett proposes that minimum wage workers get more back from the government in the form of an EITC, saying that this provides an "incentive for workers to improve their skills" (because, unlike welfare programs, this requires a source of income) and, more importantly, "does not distort market forces, thereby maximizing employment".
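
To get a rough feel for the gap between those two numbers, here is a minimal back-of-the-envelope sketch in Python. The $7.25 federal minimum wage and the 2,000-hour work year are my own illustrative assumptions, not figures from Buffett's article:

```python
# Rough comparison: what a $15 minimum wage would add to a full-time
# worker's annual pay vs. the average EITC benefit per family.
# Assumed inputs (mine, for illustration only):
CURRENT_WAGE = 7.25     # federal minimum wage, dollars per hour
PROPOSED_WAGE = 15.00   # proposed minimum wage, dollars per hour
HOURS_PER_YEAR = 2000   # roughly 40 hours/week for 50 weeks
AVG_EITC = 3000         # approximate average EITC per family, dollars per year

annual_raise = (PROPOSED_WAGE - CURRENT_WAGE) * HOURS_PER_YEAR
print(f"Extra pay from a $15 wage: ${annual_raise:,.0f} per year")  # $15,500
print(f"Average EITC benefit:      ${AVG_EITC:,.0f} per year")
# The raise is worth roughly five times the average EITC, but only for
# workers who keep their jobs, which is exactly the risk Buffett points to.
```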

It sounds great. The government gives people more money, and with business left unaffected, everybody has a job and the economy can grow unhindered. It sounds too great. The glaring question is: where will the money come from, and what will suffer because of it? How politicians respond to this question will determine whether or not an expansion of the current EITC can be successful. Ideally, the result of a successful expansion will be so beneficial to the economy that no government programs will be negatively impacted: government money spent on the EITC will be made up for with money saved on other welfare programs. And as Buffett put it: "America will deliver a decent life for anyone willing to work".

Thursday, May 28, 2015

Business's Influence Over Science

As part of my Junior Theme research, I uncovered the influence that the pharmaceutical industry can have over the conclusions of scientific studies. Through my research, I found that scientific studies funded by the pharmaceutical industry disproportionately support drug companies. For instance, in 2006, GlaxoSmithKline paid all eleven of the authors of the study that influenced the FDA's approval of its drug, Avandia. Four of the authors were even company employees who held company stock (Washington Post). Because of that conflict of interest, the evidence used to prove Avandia's safety is incredibly biased.

This is not a problem isolated to the pharmaceutical industry, however. Companies in the sugar industry also heavily compensate scientists who research the health effects of high sugar consumption. According to a segment from John Oliver's Last Week Tonight, 88.3 percent of independent studies on the relationship between sugar and weight gain have found direct links between the two. However, "the vast majority of studies" that have received funding from companies in the sugar industry find "the exact opposite of that". Other studies corroborate this bold claim. One such study from PLOS Medicine reveals that a study with funding from a food or drink company is five times more likely not to support the connection between sugar and weight gain than a study without such funding. While it seems unfair to jump to the conclusion that large corporations such as Coca-Cola or PepsiCo are manipulating science, the money they spend funding scientific research must have some impact, because the majority of science disagrees with them.

Nobody can blame Coca-Cola for defending itself, but the motive is there: the company has funded scientists directly in response to accusations that its products contribute to America's obesity crisis. Luckily, in 2015, the majority of consumers understand that excessive sugar consumption is bad for their health, so it should not be a major concern that food and drink companies are trying to trick people on this particular point. The concerning part is that food and drink companies have the ability to influence science at all. If companies can cause researchers to contradict a claim as widely accepted as "sugar causes obesity", then they could certainly influence lesser known or emerging health knowledge. I wonder what else food and drink companies might not want the public to know.


Wednesday, May 27, 2015

The Ethics of Maternity Leave

As a result of the Civil Rights Act of 1964 and the Pregnancy Discrimination Act of 1978, employers were barred from discriminating against women in the workplace due to pregnancy. This was a tremendous step forward at the time, guaranteeing women job security during a mandated unpaid leave. However, the protection of pregnant women's rights has not come very far since then. According to a segment from John Oliver's Last Week Tonight, the only countries in the entire United Nations that do not promise women paid maternity leave are the United States and Papua New Guinea. In the United States, even the federally mandated unpaid leave is fairly limited. It only applies to companies with 50 or more employees, and to employees who are full-time, salaried, and have been with a company for over a year. Of course, many women are provided with paid maternity leave by their employers, but according to Oliver that number is only about 60 percent of women.

This means that many of the new mothers in the remaining 40 percent are forced back into work early for financial reasons. These mothers must immediately learn to juggle a newborn child on top of their full-time job. But it shouldn't be like that.

Last year, the Democratic Party created the Family and Medical Insurance Leave Act (FMIL), which specifically outlines that workers receive 12 weeks of paid family leave (maternity and paternity) at two thirds of normal monthly wages. However, as of the end of last year, it had not been passed, as zero Republicans have offered their support so far. As Oliver explained it, this is because many people believe that at this point in time "'the country's businesses are saddled with too many regulatory burdens'", and Republicans believe the bill is "'anti-business'" and "'anti-growth'". This seems insensitive, but there must be some truth behind it. Only three states, California, Rhode Island, and New Jersey, currently mandate paid maternity leave. But even if the plan of the FMIL Act really is harmful to business, is that the kind of country that Americans want to live in?

Should business come before family values? Or should the United States join the rest of the world and mandate paid maternity leave for new mothers to welcome their children to the world?


Sunday, May 17, 2015

The Overuse of Medical Care

While modern medical breakthroughs have saved millions of lives, the reality may be that certain medical treatments are also widely overused. According to a recent New Yorker article, "Overkill", in 2010, "the Institute of Medicine issued a report stating that waste" or unnecessary healthcare services "accounted for thirty per cent of healthcare spending". Such unnecessary services include diagnostic tests, pharmaceuticals, and procedures. The result is more Americans spending tremendous sums of money out of pocket when they don't need to, and more money going to Medicare when it could be going to education, for instance.

According to the article, this trend has two primary causes, the first of which is a mentality shared by both doctors and patients. The author of the piece, Atul Gawande, a surgeon, believes that doctors feel an obligation to test for every possibility of illness, saying that "as a doctor, I am far more concerned about doing too little than too much". This is a mentality that might reasonably come along with being in the healthcare field. People take their health very seriously, as they should, and that often leads to unnecessary precautions, both issued by doctors and willingly accepted by patients.

The other cause of healthcare waste comes not from caution but from ignorance. As explained by Gawande, it is what economists call "information asymmetry": the simple truth that doctors know far more about treatments and procedures than patients do. Because of this, many patients tend to blindly follow a doctor's advice, regardless of whether the doctor truly believes it is in the patient's best interest, the doctor is mistaken, or the advice will "enhance a doctor's income". While it may not be entirely malicious, this power that doctors hold results in an overdiagnosis of disease. The United States is "treating hundreds of thousands more people each year for diseases", "yet only a tiny reduction in death, if any, has resulted".

This trend is directly paralleled by prescription drug use. As I have learned from my extensive Junior Theme research, although prescription drug use has increased nearly 50 percent in the last decade and a half, America is getting statistically sicker. This increase is in large part due to the pharmaceutical industry making it in the best interest of doctors to prescribe its medications, by paying them to promote drugs to other doctors and even directly paying them kickbacks for prescriptions. Apparently, the glaring conflicts of interest do not stop there.

What, if anything, should be done to change the way that the medical profession is run? What can be done to make the medical industry more consumer friendly?

Thursday, May 14, 2015

The Creation of the Wealth Gap

Recently on Last Week Tonight with John Oliver, there was a segment examining the growing income disparity in America, which is currently at its highest level since the 1920s. According to Oliver, the highest-earning 1% of Americans is currently making nearly 20% of all available income and controlling 35% of all wealth. The numbers are staggering; the more interesting aspect of this income inequality, however, is the reason why it exists.

Oliver claims that such inequality exists because of policies in America that benefit wealthier people. These include "cutting income tax and capital gains tax for the richest in half". He believes that these policies, which favor the few over the majority, exist because of "one of America's greatest qualities: optimism". This is a broad and general claim; however, Oliver backs it up. He explains that, according to a poll from Pew Research, 60% of Americans believe that they can "get ahead" if they are "willing to work hard".

Supporting a low capital gains tax perfectly exemplifies Americans' confidence in hard work. According to a Wall Street Journal article, some experts believe that the capital gains tax should stay low because it encourages investments in growing businesses by increasing the payoff. This may make both business owners and investors happy, but it really only affects people of the middle-upper to upper echelons of society. Only a small portion of individuals receive direct benefits from a lower capital gains tax. So who voted for it?

Maybe this is where John Oliver's theory comes in. As he sarcastically put it, people know "the game is rigged" and believe that is "why it's going to be so sweet when [they] win the thing", meaning that people understand that a low capital gains tax favors few people, but they don't care, because 60% of Americans believe they will "get ahead" once they start working hard. However, that is an unrealistic expectation. With nearly 15% of Americans beneath the poverty line, and the upper class drifting farther and farther from the middle, perhaps American voters should convert a little bit of their optimism into realism.

Thursday, April 23, 2015

How Drug Companies Cheat the Patent System

According to The New York Times, Teva Pharmaceutical Industries recently paid $512 million to settle claims that the company was paying off generic drug companies to keep their cheaper products off the market. The drug company's intent, as explained by the article, was to be able to keep selling its sleep disorder drug, Provigil, for hundreds of dollars without competition from generic copycat drugs. While this would obviously be great for Teva, the reality is that American consumers are forced to pay ridiculous sums for their medications for far longer. When the release of generic drugs is delayed, many ill people must simply forgo medication that much longer.

This is not the first time this practice has been seen in the industry. The New York Times also cites an instance in 2011 in which Cephalon "paid generic drug manufacturers more than $200 million" to delay sales of their generic drugs until 2012. It also says that drug purchasers argued that "were it not for deals with generic companies, the drug [in this instance] would have faced competition in 2006", meaning that Cephalon's pricing remained absurdly high for six years longer than it should have.

According to a former Federal Trade Commission policy director, the attention being brought to this issue is "a great result for consumers". But paying off generic companies is not the only way that Big Pharma companies can use their money to slip past the system. Companies such as GlaxoSmithKline, Pfizer, and Johnson & Johnson have all paid sums greater than a billion dollars to settle claims for offenses such as off-label promotion and paying "kickbacks" to doctors. Hopefully, raising awareness about the Teva incident will subsequently raise awareness about these questionable business practices as well.


Monday, April 20, 2015

Is College No Longer Worth Investing In?

In the last two decades alone, the amount of both people and money going into higher education has skyrocketed. According to a recent article in The Economist, the percentage of college-age students actually enrolled in college increased from 14% to 32% over that period, while the OECD countries increased their spending on higher education from 1.3% to 1.6% of total GDP. That means millions of students and billions and billions of dollars. This is undoubtedly great news for global education as a whole; however, the subsequent increase in demand for degrees may be turning America's higher education into a money-driven system, not a system focused on learning.

While the article admits that a bachelor's degree does still promise a 15% return on investment, it also states that, in a recent study on "academic achievement", "45 percent of students made no gains in their first two years of university". Though such a lack of learning may be hard to believe, it might not matter so much. Another study on "recruitment by professional-services firms" found that firms selected candidates from the "most-prestigious universities because of those universities' tough selection procedures", not because of "what candidates might have learned" in their time at that university. This is because America's higher education system lacks a "clear measure of educational output". The article therefore suggests that, when considering post-college employment, college really is all about getting in.

However, this goes directly against every single one of the reassuring, don't-obsess-over-college talks I have gotten in high school. This information seems to reaffirm everything that I foolishly thought about college going into high school: that I must always work my ass off because I have to get into the most selective university I can, or the world will end.

What about the piece by Frank Bruni that was the most-emailed New York Times article for two weeks straight, "How to Survive the College Admissions Madness"? That article seems to directly disagree with The Economist in a number of ways. It quotes Sam Altman, "one of the best known providers of first-step seed money for tech start-ups", as saying that the school whose graduates most stand out is "'the University of Waterloo'", not one of the country's most selective private schools. It also describes the obsession over "getting into the Stanfords of the world" as "getting crazier and more corrosive", rather than as something important for employment, as The Economist would lead you to believe.

So which assessment of America's college system is closer to the truth? Which one do you think the college system should strive for?


Sunday, April 19, 2015

What Drug Companies Don't Tell You

The F.D.A.'s recent approval of a new drug, named Corlanor and developed by the drug and biotech company Amgen, may keep millions of Americans suffering from heart disease out of the hospital.

Whereas many "new" drugs released are basically the same as older medications, Corlanor actually does promise to have a different effect on its patients, those suffering from chronic heart failure. According to The New York Times, "the drug works by inhibiting what is known as the 'funny current' in the heart's natural pacemaker". Based on the study that won the drug the F.D.A.'s approval, the drug promises "a reduction in risk [of hospitalization for heart failure] of 26 percent".

This sounds like fairly good news, but the problems lie in the way the study was conducted. First of all, it says in the New York Times article itself that the study was paid for by Servier, a French pharmaceutical company. Although Servier and Amgen may be different companies, this is still a tremendous conflict of interest because the companies have collaborated in the past. Even if that wasn't the case, the study would still be inherently biased because, as I have learned from my Junior Theme book, Our Daily Meds by Melody Petersen, studies that drug companies pay for tend to disproportionately favor the interests of the company. According to the book, one way that drug companies can make this happen is by requiring those who conduct the studies to test the drug against a placebo instead of against a competing drug or an older drug.

Interestingly enough, that is exactly what was done in the study cited in The New York Times. The article says that "[patients] were randomly assigned to take either the ivabradine [Corlanor] or a placebo". The obvious problem with this is that Corlanor is really being compared with nothing. Therefore, the "promising" results really only mean that taking this $4,500-a-year drug is better than nothing.

Corlanor has also shown signs of being unsafe in trials. The article states that cardiologists are "lukewarm" about the drug's approval, in part because it has been shown to drop heart rates "dangerously low" in some patients.

Based on the flawed studies and with its potential risks, should Corlanor have even been approved? Should the FDA hold companies like Amgen to higher standards?


Sunday, April 12, 2015

University.com

With the average annual cost of tuition at a private university reaching more than $31,000, and average graduation debt reaching $40,000, more and more students are turning to the internet for more affordable degrees. Though some look down upon online degrees as inherently inferior, both state schools and elite universities alike are beginning to integrate online courses as a supplement to their core curriculum. Arizona State, Columbia, Penn State, and Harvard are a few of the pioneers who have been able to save their students both time and money through this educational revolution (The Economist).

The integration of online courses into college curriculums may help solve some of the biggest problems in higher education. According to a recent article in The Economist, "The Log-On Degree", Arizona State has "almost doubled undergraduate enrollment since 2002" through online courses, and increased the share of students who graduate within four years from one-half to two-thirds. All the while, tuition has stayed "reasonably low" at $10,000 in-state.

Online courses are increasing college enrollment and keeping students on track to graduation through their convenience and personalization. The article explains that many college courses with online components feature an "eAdvisor system" which tells students whether they are on track or not, while "prompts and explanations ensure that teachers do not have to keep going over the basics". This allows universities to increase college enrollment without having to fund new buildings or pay expensive professors for as much of their time.  And when universities save money, so do the students. 

The question is: where will online courses take us in the future? Will Americans take advantage of the benefits that online courses offer as a supplement to a college education, or will college tuition continue to rise as America's educational system refuses to change?

Saturday, April 4, 2015

The Problem with the Religious Freedom Act- Part I

The Religious Freedom Restoration Act (RFRA), which was signed into law in Indiana in late March, allows businesses to use their religious affiliations as an excuse to refuse service to certain customers. While some say that the act provides further protection for the First Amendment freedom of religion, others believe that the act legally supports discrimination, particularly against members of the L.G.B.T. community. As one can imagine, this act has become a hot-button issue on a national level.

The wording of the act itself is interesting. It explains that the government may not "substantially burden" a person's freedom of religion, defining a burden as an action that "constrains, inhibits, curtails or denies" the exercise of religion by a person. All of those words could potentially encompass a great deal of actions. The act is protecting the expression of a right with far too much wiggle room. By that I mean that religion is something that is constantly up for interpretation. As history has blatantly revealed, there will never be a consensus on what religion to believe in, or even on what a certain line of belief specifically means. Christianity alone has dozens of subsets, each with its own values and beliefs. The problem is that nobody regulates religious beliefs, and therefore nobody controls what is protected by the RFRA. Anybody could claim that a certain person is constraining, inhibiting or curtailing their religious freedom if there is no set definition for religious beliefs. Laws cannot be based upon such subjective grounds.

Religion is also something that must be constantly revised and reinterpreted. Back in the pre-Civil War United States, the Bible was used to support slavery. The Bible also contains extremely sexist passages: "I do not permit a woman to teach or to have authority over a man; she must be silent" (1 Timothy 2:12). Should the RFRA really be legally justifying these beliefs, allowing people to discriminate in business and service?

Friday, April 3, 2015

Are Latinos the Future of America?

The real fight over immigration may no longer be whether we should open or close our borders, but rather what should be done with the immigrants who are already here. A recent article from The Economist makes clear that the 48 million documented and many more undocumented Hispanic immigrants, on course to becoming the population majority, could give the United States an advantage over other nations in the near future.

In the coming decades, the article predicts that the relatively young and rapidly growing Latino demographic will bring the median age down to "a spritely 41" (11 years younger than Germany's) and allow America's population and labour force to continue expanding, while those of countries such as China will shrink. The result would be a lower percentage of the population reaping the government benefits of old age, and a higher percentage fueling the country's economy.

The article suggests that it would be in Americans' best interest to "not squander . . . the rise of Latinos", and instead take care to educate the demographic. Certain states are making the right move, making college more affordable for students with "good grades but the wrong legal status". Yet other states, including Arizona and Georgia, refuse to give undocumented students in-state tuition rates, and others, such as Alabama and North Carolina, refuse to even permit them enrollment. This is a tricky issue, as I partially feel that U.S. citizens should be given priority. However, the reality might be that "the whole country will suffer" if the majority of the population in 2044 is "poorer and worse educated than [today's] American average".

Setting aside the moral issues regarding illegal immigrants receiving in-state tuition, I believe from an economic standpoint that the country should follow the path of states such as California, New Mexico and Texas, which go as far as providing undocumented Hispanic students with state financial aid. Although it might not make the most sense to reward those coming to the United States illegally, it may be in the country's best interest to give in.


Sunday, March 22, 2015

Progress Towards Gender Equality

Ever since the 1970s, the ratio of female to male students enrolled in college has been steadily climbing. From 1985 to this year, the percentage of female college students has increased from 46% to 56%. Those numbers would have been inconceivable just a few decades ago; today, however, girls are dominating education, surpassing their male counterparts in standardized testing in reading, writing and science, and closing the gap in math (The Economist).

According to a recent article in The Economist, "The Weaker Sex", this shift in academic performance and college enrollment is not caused solely by girls. The reality is that teenage boys are falling behind in school, living in a social environment where it is "not cool for them to perform". Boys are "50% more likely than girls to fail to achieve basic proficiency in any of maths, reading and science". It is predicted that the share of women in college may grow even further, "rising to 58%" of all students.

The most pressing question that The Economist article raises is: "So are women now on their way to becoming the dominant sex?" Based on these numbers, one might jump to the conclusion: yes. However, though the numbers show great progress for women in education, the battle for gender equality is not necessarily over. It is essential to consider that school is not the only area in which women must overcome injustices.

There is a glaring gender disparity in income that seems entirely unexplained. The truth is that women only earn three quarters of what their male counterparts are paid. If women are superior, more ambitious students, then why is that? Assuming a high level of education translates into job performance, the income difference is ridiculous.

Also, of all the C.E.O.'s of the Fortune 500 companies, only 24, or 4.8%, were female as of last year. Although I do not mean to suggest that this is the best factor to measure by, it is certainly telling about women's leadership roles in business. The lack of females in charge of big companies may show how women are still discriminated against when it comes to expressing authority or leadership, despite the fact that women are more academically qualified as a sex.

Are these factors ones that will change fairly soon, as an older generation of men and women are replaced in the workforce? Or will the unjust inequality persist if nothing is done to correct it?

Saturday, March 21, 2015

Are Our Veterans Being Cheated?

In order to partially repay veterans who have sacrificed years of their lives to defend our country, the government created the Servicemembers Civil Relief Act, or S.C.R.A., in order to protect veterans from repossession or foreclosure without a court order (New York Times). However, the brutal truth is that this law may have been infringed upon more than 15,000 times in 2012 alone. This means that in a single year thousands of war veterans returned home from duty to find that they had not just lost their innocence, and maybe one or more of their limbs, but also their cars or homes.

As explained by The New York Times, this injustice occurs because of a legal practice known as arbitration, a system "where the courtroom rules of evidence do not apply". Though, supposedly, arbitration is "more efficient and less expensive than court", the reality is that the tactic is a way around the United States' typical justice system (i.e. no judge or jury is involved), and, because of simple economics, arbitrators (the stand-in judges) are more likely to side with large companies than with individuals. Many also agree to arbitration reluctantly because of the high costs of court. Even though regular civilians can be legally subjected to it, veterans are protected from arbitration, and the legal disadvantages that come with it, by the laws of the S.C.R.A. (New York Times).

A former Air Force attorney, Colonel John S. Odom, was quoted in the NYT article saying that "mandatory arbitration threatens to take these laws [of the S.C.R.A.] and basically tear them up". The phrase "tear them up" suggests that the military doesn't just feel that arbitration avoids or circumvents their laws, but that it actually insults or dishonors veterans returning home. And I agree with that viewpoint. Thousands of U.S. veterans come home from wars abroad every year, many of them carrying a host of new life challenges, including P.T.S.D., physical injuries, and financial struggles. Any company, such as Nissan, which was cited in the article, that uses arbitration as a means to strip service members of their legal rights is certainly dishonoring them.


Wednesday, March 18, 2015

The Hypocrisy of American Values

America was a country founded by colonists looking to escape the oppressive aristocracies of England. It was a nation that believed "all men are created equal". It was a nation that once took in "huddled masses yearning to breathe free", desperate immigrants looking for equal opportunity and a fresh start. However, ironically, America is now ranked last, among developed democracies, in equality.

According to a recent article in The New Yorker, "Richer and Poorer", economic inequality in the United States is greater than in "any other democracy in the developed world" (as measured by the Gini index, a scale from 0 to 1 used to measure income inequality). The Gini rating for the United States has steadily climbed from ".397" in 1965 to ".476" in 2013. According to the article, "it's no longer possible to deny that [this change] exists". The new key question is why. Why is America becoming more unequal, and what does it mean for the future?
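
For anyone curious how a Gini rating like ".476" is actually produced, here is a minimal illustrative sketch in Python. The toy incomes are my own made-up numbers, not census data:

```python
def gini(incomes):
    """Gini coefficient: 0 means perfect equality, 1 means one person has everything."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # Standard formula for a sorted sample: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([50_000] * 5))                                # 0.0, everyone earns the same
print(gini([10_000, 20_000, 30_000, 40_000, 400_000]))   # ~0.64, highly unequal
```

Real figures like the .397 and .476 above come from running a calculation like this over millions of household incomes.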

The article provides one answer that I did not at all expect: Congress. Sure, the approval rating and the productivity of Congress are near all-time lows, but could it also be creating inequality?

As explained in the article, data from a series of international comparison studies shows that it might. Based on the findings of Alfred Stepan and Juan J. Linz, there is a strong correlation between a given country's Gini rating and two other political factors: a nation's number of "veto players" (branches of government able to check and balance one another), and a nation's ratio of citizens to representatives in the legislature. The higher the number of veto players, the higher a nation's Gini rating tends to be. Similarly, the higher the ratio of citizens to representatives, the higher the Gini rating (The New Yorker). The United States is no exception to this pattern; it leads in all three measures. The United States is the only democracy in the world with four "veto players", and it also has the most "malapportioned" representation in the legislature.

What is interesting, however, is that all of that information about Congress and inequality is included in the last two paragraphs of the seven page article. Though it says that "the problem . . . lies with Congress", the article barely explains it at all. Apart from the fancy, complex statistics, I do not see the connection between the United States' general political formation and heightened economic inequality. Therefore, at this point I don't buy it. The article presents an interesting idea: the effect that general political structure has on equality. But, I do not believe that The New Yorker definitively identified what is making the United States the most unequal democracy in the world. That criminal still desperately needs to be found. 

Saturday, February 28, 2015

Ending the Segregation of History

Although it is usually meant in a respectful and tolerant way, saying something along the lines of "in honor of Black History Month" is actually quite the opposite, in my opinion. Black History Month is, in a way, a continuation of racial segregation. By simply existing, Black History Month implies that "black" history is separate from, and not to be included in, "regular" history.

In 1926, when Black History Month's predecessor, Negro History Week, was created, the government could have easily gone down a path of integration: changing school curriculums to further incorporate black history. Instead, however, the government confined the area of study to its own separate and extremely unequal time in the school curriculum, the month of February, coincidentally the shortest month of the year. (This wasn't the government's only attempt at "separate but equal".) Not only is a separation implied, but it is quite literal. There is also the implication, though perhaps not intentional, that black history deserves one twelfth the time that "regular" history does.

Perhaps this subtle degradation of black history is why Morgan Freeman has a similar view on the month meant to honor and recognize the historical impact of people of color. In an interview on 60 Minutes, the topic is briefly discussed (brief enough to watch the whole thing). Freeman expresses that it feels as if black history is not so much being recognized as being "relegated to one month". In those words, it sounds almost as if the history of a people is being contained, not celebrated.

As I mentioned above, Black History Month grew out of Negro History Week, which may be a sign of progress. Not just because of the more respectful word choice, but also because of the longer time spent to recognize black history. Nevertheless, the best way to recognize and observe the history would be to treat it as nothing special, just history. Black history will truly get what it deserves when it is simply called, in the words of Morgan Freeman, American history. 
 

Sunday, February 22, 2015

The Benefits of Hallucinogens

Former President Richard Nixon's "War on Drugs" was a complete failure. The government intervention, originally intended to free the streets of harmful drugs, has in fact not decreased drug use in the United States at all. In actuality, the "War" has wasted billions and billions of taxpayers' dollars, and caused more U.S. citizens to be imprisoned than ever before.

An additional effect of the "War on Drugs" is the increase in the stigma associated with drug use today. As a result of ubiquitous nationwide anti-drug campaigning, drugs that were once ingredients in soft drinks, and even in the Pope's wines, were recast as terrible poisons that cause instant death.

Though I agree that in most cases the changed image of drug use is very good, a recent article from The New Yorker, "The Trip Treatment", reveals one way in which the change may have done harm. The changed image of drugs, and consequently the "War on Drugs" as a whole, may have taken away a key treatment option for people suffering from mental illnesses.

According to the article, research from the 1950s revealed that hallucinogens (also known as psychedelics), specifically LSD and psilocybin, the active ingredient in magic mushrooms, were "useful in treating anxiety, addiction, and depression". The article goes on to explain that, between 1953 and 1957, the government allocated "four million dollars to fund a hundred and sixteen studies of LSD", and the results of the studies were "frequently positive", even if some weren't perfect in design. By the 1960s, LSD and other similar drugs were used "successfully" to treat alcoholism and end-of-life anxiety in medical patients.

Unfortunately for any patients who might have been treated after 1970, Richard Nixon signed the Controlled Substances Act, "prohibiting the use of [psychedelics] for any purpose". Research was therefore abandoned, and all of the promising results that had been acquired over the years were "all but erased from the field of psychiatry". It was at this point that hallucinogens switched from being candidates for curing disease to drugs that only ruin lives. It is because the latter belief is so widely accepted that many find the topic of this post so surprising. And that belief was created entirely as a result of Nixon's act and, more generally, the War on Drugs.

Had Nixon's act not been signed, research and clinical trials with psychedelics would have continued, and psychedelics may have gone on to become common drugs, tremendously helping to ease the suffering of those with mental illnesses, or perhaps not helping at all. The point is that we will never know. Because of government intervention, one can only speculate what the effects of psychedelic drugs may have been.


Thursday, February 12, 2015

Should there be a Naked Statue of Bill Cosby?

After being accused of sexual assault by more than thirty women, America's favorite comedian and renowned pop culture figure, Bill Cosby, was put to shame. Despite the fact that he has not yet been convicted of these alleged crimes, his once magnificent image has been tarnished nonetheless. His place in history as a pioneer for the portrayal of African Americans on television may even be ruined.

However, one high school artist from Massachusetts, Rodman Edwards, believes that Cosby's punishment of shame, ruin and inevitable prison time is not enough. Edwards feels that two statues honoring Cosby, one of which is in Walt Disney World and the other in the TV Hall of Fame in California, should be removed and replaced (in a different location, such as an art museum) with something intended to shame the criminal: enormous bronze statues of the man standing naked with Fat Albert in place of his penis.

Rodman Edwards' Digital Proposal
As humorous as the proposal is, it is genuinely being considered. The statue will be presented at the Cory Allen Contemporary Art showroom on February 20th, and the idea is being proposed to the Academy of Television Arts & Sciences Hall of Fame.

While it may be reasonable to argue that the previous works honoring Cosby should be taken down, I am not so sure about arguing for this sort of replacement. Cosby's actions may have warranted many decades in prison, but should a buffoonish statue of him be added to his punishment as well? Maybe a better question would be: is this the way the situation should be handled?

With the severity of Cosby's alleged crimes taken into consideration, I am not sure that this kind of statue gets the job done properly. I feel as if it takes a series of horrible rapes and turns them into something that can be punished with an unusual joke, one that has no connection at all to the crimes except for maybe the lack of clothing. Also, it is important to consider that a statue is permanent, something put in place to preserve the memory of honorable people. And quite simply, though he had two statues of his own before, Cosby is sadly no longer a person to be remembered with one.

What do you guys think about statues such as these being displayed in a contemporary art museum?


Sunday, February 1, 2015

What America WANTS in a Sniper

The recent blockbuster American Sniper, directed by Clint Eastwood and starring Bradley Cooper, is supposed to be based on the autobiography of ex-Navy SEAL Chris Kyle. However, in truth, the film strays from the storyline of the book significantly. Having read the book and seen the movie, both of which I would highly recommend despite the criticism about to follow, I know that not only are omissions made and events dramatized, but the characteristics and actions of protagonist Chris Kyle are different than shown in his best-selling autobiography.

A few additions may not seem so significant, but some of the changes struck me as too big to ignore. Take the portrayal of fellow SEAL Ryan Job, for instance. In the movie, Ryan Job is shot in the face by an enemy sniper and dies in a military hospital soon afterward. In real life, Job was seriously wounded but still made it home to his wife and children. I couldn't help but wonder why the writer and/or director decided to make this change. One reason might have been that it set the protagonist Chris Kyle on a quest for revenge against the foreign enemy, which is well explained by this Slate article. So maybe a living man was shown to be dead in order to portray Kyle as a more noble figure, one who avenges his fallen comrades even if it means taking a fourth tour of duty in Iraq, one more than normal.

This possible intention, making Kyle appear to be a better person in the movie, actually made sense after considering how different his stance on war is in the book. While in the movie Kyle goes to war entirely to protect his country and save American lives, in the book he explains the excitement and enjoyment that his line of work brings him. He admits many of his weaknesses on the battlefield, including thinking with his emotions, not his head, and also reveals his ruthlessness as a soldier: "You'd have a violent explosion, a fire, and then no more enemy. Gotta love it". In the movie, a product designed by producers and intended to win the public's approval, Kyle is never shown to have this love for violence and death.

But is this change okay to make? It does make for a more relatable and more noble hero figure. But, in a way, it may be hiding the truth about our military from American civilians. By portraying our soldiers as extremely virtuous and good-willed people, the movie may be presenting a skewed depiction of American involvement in the Middle East. And this may lead Americans to further believe that what our country is doing is always right. As Kyle explains himself in his memoirs, though, he is not one-hundred percent perfect. He turns saving American lives into a bloodsport of killing as many enemies as possible. Yet, unless you read the book, you would never know the truth.

Is it okay for producers to change these aspects of a story when it is based on real life events, especially something as significant as the war in Iraq?

Tuesday, January 27, 2015

Few Men Are Created Equal

This past week, an article in The Economist entitled "America's New Aristocracy" examined what might be America's educational aristocracy. And although American ideals are strongly against inherited privileges (at least in theory), this aristocracy seems quite real. Certain children's educations are very different from others' because of class differences; however, it may not simply be because of the educational advantages of more money. Children of wealthier parents may be gaining an advantage because of their parents' brains.

The process begins before the children are even born. The article claims that, "far more than in previous generations", smart and successful people are marrying other smart and successful people, a trend called "assortative mating" which leads to "bright children" and "stable homes". But this natural aptitude for intelligence is simply the foundation.

The intriguing claim is that wealthier parents are creating a better environment for their children's brains to develop, completely for free. According to the article, "children of professionals hear 32m more words by the age of four than those of parents on welfare". And keep in mind that over a third of the population was on welfare as of 2012. Therefore, regardless of which families have the money to afford neighborhoods with better schools or expensive tutors, children of upper-middle to upper class families generally live in an environment where their young, malleable brains are much more stimulated.
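
To get a sense of the scale of that "32m more words" figure, here is a quick, purely illustrative calculation (the four-year window comes from the article; the daily breakdown is my own arithmetic):

```python
# How big is a 32-million-word gap by age four, on a per-day basis?
WORD_GAP = 32_000_000   # extra words heard by age four (figure from the article)
DAYS = 4 * 365          # four years, in days

print(f"{WORD_GAP / DAYS:,.0f} extra words per day")  # ~21,918 words every day
# That is roughly the length of a short novella, heard aloud, every single day.
```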

This difference between classes is unique. Nobody can say that it gives wealthier children an unfair academic advantage because, unlike with money, parents across all class levels have the ability to speak with their children. Though some busy parents may not have enough time to see their children, that issue is not specific to class; McDonald's store managers, surgeons, and businessmen alike all have to work late hours occasionally. Wealthy children should not be held back, as some argue they should be with access to test prep, because that would only lower the typical student's ability. Instead, underprivileged or struggling children should be aided.

Encouraging and educating working-class parents about the importance of brain stimulation for growing children could be essential. And if the problem cannot be partially corrected in the early home, then at the least, public schools should receive equal funding from state taxes. When underprivileged children are already starting behind, the playing field should at least be leveled in the public school system. Otherwise, only a portion of the population, the educational aristocracy, will continue to dominate college admissions. And eventually fair-paying jobs. And at that point the cycle of the educational aristocracy will begin again.

Monday, January 12, 2015

The New Fad: Obesism

This past weekend, I attended a show, Panic on Cloud Nine, at Chicago's The Second City comedy club, which provided some critical social commentary on sexuality in 2015. In one skit, a group of girls at a slumber party go around revealing secrets. The girls who reveal that they are secretly lesbian, transgender, or even secretly men are all met with huge acceptance, the other girls saying that it's not a big deal to be part of the "LGBTQ" community. The other girls kept saying, "It's 2015!" However, the skit ended with the last girl sharing her secret: "I think I might be gaining some weight". To this, the other girls shouted things like, "Fat ass!" and "You're disgusting!" I took this to mean that, in 2015, somebody can have any color of sexuality they please and no one will bother them, but if you're obese, you will still receive all of the hurtful, prejudiced treatment that previous generations loved to use so much.

Despite the fact that this was a part-improv comedy show (a hilariously funny one, by the way), I wondered to what extent this is true in my society today. I wonder if people with weight problems have truly been skipped or missed by the movement of tolerance and acceptance that has come with my generation. Of course, obese people are not denied any rights by U.S. law the way homosexuals were denied the right to marry, but I wonder if they have been denied the right to fair treatment or freedom from bullying or hazing in some way.

So, I researched the issue on the omniscient internet, and found that overweight or obese individuals do face more struggles than their healthier-weight peers. According to a CNN article, obese middle-school children are sixty-five percent more likely to be bullied. The article then quotes Dr. Matthew Davis, a primary care physician and director of a children's hospital, on his thoughts about the potential underlying causes of this trend: "'We always have to keep in mind how we're modeling respect for others around multiple issues, including weight,' he says. 'Imagine how many signals kids get about weight just by hearing conversations by adults or seeing advertisements on TV. The messages are everywhere in terms of trying to control weight and be a different size than you are right now.'" Maybe Photoshop, the ubiquity of modern advertising, and the messages it sends about body image have led children to this conclusion: fat is not desirable, skinny is attractive.

If children are truly influenced into being anti-obesity from a young age, then perhaps obese people are facing injustice, though not because of something like homophobia or racism, as with other groups that have faced discrimination, but because of the modern age we live in. Obese children are not actually a demographic that missed out on the tolerance of the young generation; they might be the next generation to be discriminated against in the ever-expanding world of digital perfection.

Thursday, January 8, 2015

Racism Disguised as Diversity

In American Studies class today, we explored the issue of tokenism in modern television. After seeing my teacher Mr. Bolos's presentation on tokenism, specifically in network dramas, most of my class and I were convinced that tokenism is a very real phenomenon. We also agreed on the conclusion that tokenism is a means for TV shows to appear diverse, and therefore for the networks, writers, and advertisers to appear tolerant and accepting.

However, in network sitcoms, I think that the purpose of the "token minority" character goes beyond that single use. Although it may not be the case across the board, in a high number of the sitcoms that actually do include a minority, the minority character is exploited for the ethnic and racial jokes that their inclusion in the show socially permits. All minorities, it would seem, except for African Americans (that seems to be the one race that networks are scared to touch). For instance, Gloria in ABC's Modern Family, Han in CBS's Two Broke Girls, and Timmy in CBS's Rules of Engagement all have ethnic jokes made at their expense. However, Donna, an African American on NBC's Parks and Recreation, never has her race mentioned. I have seen several episodes and scoured YouTube for clips of a black joke, yet found nothing. Meanwhile, ethnicity is a major component of the other not-white-and-not-black characters: Gloria (Hispanic), Han (Asian), and Timmy (Indian).

On Modern Family, Gloria is constantly mispronouncing English words and sayings and being corrected by the cast. Gloria is also asked if she is legally in the country on multiple occasions, and it is mentioned that she has been deported twice. Han's character on Two Broke Girls is one of the most exploited I have seen. Practically all of his screen time is of him speaking with an exaggerated Chinese accent, being emasculated by women for his height and apparent agelessness, or being otherwise negatively stereotyped. One of the main characters even says in front of him, "You can't tell an Asian he failed. He'll go out back and throw himself on his sword". Timmy, in Rules of Engagement, also faces jokes about his Indian ethnicity. He is confused with Indian Americans, and questioned over the correctness of his English, despite the fact that it is very proper. He is also overworked as an assistant to his boss, portrayed as very obedient and too passive to stand up for himself, all the while being accompanied by a fittingly boyish name.

Though they may be considered offensive to some, because of the laugh reel that is played after the ethnic jokes (laugh reels are not featured in Modern Family), the shows must believe that these are lines to be laughed at. Therefore, assuming that directors and producers link more laughs to more viewership, as they reasonably would, these jokes are specifically included for the purpose of driving up ratings and making money. Due to the extremely high prevalence of ethnic humor only when these minority characters are on screen, I suspect that the only purposes of minority actors in sitcoms are diversity and the extra laughs. It is possible that the performers are given their jobs on the shows mainly for those reasons. When the characters are used to repeat the same offensive jokes over and over again, as they very much are in these three shows, it is hard to believe that the actors are on set for any other reason.

If this is really the case, has television truly diversified?

Saturday, January 3, 2015

Discrimination in Healthcare

Mr. Kramer at the Lee Specialty Clinic
Courtesy of The New York Times
Because of the accepting and tolerant community that I am so lucky to live in, I take it as a given that everyone is understanding of the special requirements of the mentally disabled. As I witness at my high school, New Trier, on a daily basis, those with conditions such as autism and Down syndrome receive assistance from an entire team of trained paraprofessionals and a growing number of student volunteers. To my surprise, however, this might be something that is special about my high school. Many facilities, not just schools but also dentist's and doctor's offices, do not accept or provide care to mentally disabled patients.

This issue is explored in a recent article by The New York Times. Mimi Kramer, a single mom working as a housekeeper, shares her experience of trying to find medical care for her thirty-three-year-old son, who suffers from both autism and cerebral palsy. She tells reporters that she "has literally sat there with a phone book and called one [doctor] after another to try to get him [her son] in". She says that the response she gets most often is that "the [practice she calls] is not taking any new Medicaid patients once they hear that he [her son] is challenged" (New York Times). And although I am unable to find a statistic to support this hardship as a trend (perhaps because the mentally disabled population is very small, and because of the stigma surrounding it), the article claims that the mentally disabled are "the most medically underserved population in the country".

Reading about this issue, I could hardly believe it was current. Doctor's and dentist's offices that refuse service to special needs people sound like they belong in 1914, not 2014. Refusing service on the basis of ability also seems like a direct violation of civil rights laws. Though the reforms put in place after the civil rights movements of the 1960s did not protect the mentally disabled (or homosexuals, for that matter), there is no reason for a change not to be made today.

Luckily, a change is beginning. It has been started not by the US government but by good-willed, individual facilities. The Lee Specialty Clinic in Kentucky is one such pioneer. According to the article, the facility is one of the few of its kind designed specifically to treat those with intellectual disabilities: "The 17,000 square foot facility offers certain amenities [tailored to the needs of the special patients]. A reception area with natural light and easy-to-clean cushions. Extra wide halls. Scales designed to weigh people in wheelchairs. An overhead tram to lift patients into dental chairs." Thanks to institutions like this, Ms. Kramer's son and others with mental disabilities are able to receive medical attention like everyone else. However, many families must drive hours to receive this type of care, and most still have no access to it at all. If only families across the country, regardless of neighborhood or income level, could receive the specialized care that New Trier High School and the Lee Specialty Clinic offer, the care that everybody deserves.