Sal's Take
Sunday, April 19, 2015
Vatican Hosts Climate Change Summit
Earlier this week, the Vatican announced on its website its intention to organize a summit titled "Protect the Earth, Dignify Humanity: The Moral Dimensions of Climate Change and Sustainable Development." The move appears to be part of a recent line of decisions by Pope Francis to take a more progressive stance on issues that have long drawn sharp criticism of the church.
Climate change denial in particular has been strongly tied to the Religious Right here in the United States, which has often been accused of being anti-science and 'behind the times'. From denial of evolutionary theory (which, surprisingly, is accepted by only about 40% of Americans) to presidential candidates' outright rejection of man-made climate change, the conservative bloc of American politics, which is intimately tied to the Religious Right, has taken a beating as a result of its progress-arresting views.
The relatively progressive political stance taken by Pope Francis is a breath of fresh air for a party that has slowly been losing credibility to its growing irrelevance. Though the majority of the United States' Christian base is Protestant, and the views expressed by the Pope therefore do not necessarily translate into American politics, Pope Francis enjoys a cozy 90% approval rating among American Catholics and a 70% approval rating among the general American population. The sign could hardly be more obvious: if the GOP wants to survive past the Baby Boomers, this may be a good indication of what needs to change.
As was put nicely by Rebecca Leber, a writer for the New Republic:
"That's certainly not the popular view of science and faith in America. On climate change, deniers like Senator James Inhofe have argued it's hubris to think humans can alter the atmosphere. Florida Governor Rick Scott has used religion to dodge a question on if he thinks humans have caused global warming. On evolution, Republicans like Senator Marco Rubio and Louisiana Governor Bobby Jindal have pled ignorance in order to avoid offending creationists. Lately Republicans have been invoking the same excuse, over and over again: When asked a question that touches upon science, they dodge the question by saying “I’m not a scientist.”
Francis isn't a scientist either. If he can have some faith in it, can't the GOP, too?"
For our own sake, I certainly hope so.
Saturday, April 11, 2015
Why Do Students Cheat?
Parents helping students cheat in Bihar, India
This past month has been a strange time for education. In Atlanta, 11 educators were charged in what has been described as the largest cheating scandal in American history, while in Bihar, India, over a thousand people were detained (and 600 students expelled) for widespread cheating on state examinations.
Some students feel the need to cheat, and it’s important to ask why. The Atlanta and Bihar cheating scandals have something important to remind us of, and it likely isn’t about the education system. The problem, rather, may lie with the private sector. In an increasingly mechanized workforce over three billion strong, one has to ask, “what sort of chances do I have?”
They aren't good. Here’s the proof.
In a study conducted by USA Today, an estimated 70% of low-skill jobs, those requiring only a high school diploma, are at risk of being completely replaced by automation within 10 to 20 years. An additional 46% of mid-level jobs, those requiring some training up to a Bachelor’s degree, are also at risk of replacement. The safest bet seems to lie in the high-skill arena, which will experience the smallest shock, with 8% of jobs at risk. This of course does not consider jobs created as a result of automation, which, unsurprisingly, are generated mostly in the high-skill sector as well.
Assuming you do manage to keep a mid- or low-skill job through the Robot Apocalypse, the numbers still don’t support the decision. The pay gap between those with no more than a high school education and those with a college degree is rising. At $17,500 per year (as of 2013), the gap is the largest it has ever been, and it appears that the chasm will continue to grow for the foreseeable future.
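To put that yearly figure in perspective, here is a rough, back-of-the-envelope sketch; the 40-year career length is my own assumption, and the gap is held flat at its 2013 value rather than growing as the data suggest it will:

```python
# Back-of-the-envelope sketch: hold the reported 2013 gap of $17,500/year
# flat over an assumed 40-year working life (both figures are assumptions
# for illustration; the reporting suggests the gap will actually keep growing).
ANNUAL_GAP_USD = 17_500
CAREER_YEARS = 40

lifetime_gap = ANNUAL_GAP_USD * CAREER_YEARS
print(f"Approximate lifetime earnings gap: ${lifetime_gap:,}")
# -> Approximate lifetime earnings gap: $700,000
```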
Jobs requiring low levels of education are not only disappearing, but are becoming less and less lucrative with time. So it seems that, to keep any prospect of a tenable career, one’s best chance lies in obtaining a so-called 'high-skill job'.
All you need is a Bachelor’s Degree.
Here is where the problem lies.
With the cost of a Bachelor's degree looming around $23,410 for in-state students and $46,272 for those in private education, students from lower-income backgrounds just can't afford the sticker price. Top schools are also getting more competitive, accepting fewer students overall and making scholarships more inaccessible for the average student. With such a dangerous combination of selectivity and unaffordability, the few available opportunities are met with fierce competition. Students are forced to grapple with applicants both at home and abroad, competing with the best and brightest globally for the scarce positions available.
Domestic students may be able to alleviate the need for scholarships by attending schools in-state, but for international students coming from impoverished backgrounds, scholarships are a must. These scholarships are often far more limited in number than those available domestically, are in higher demand, and often give priority to domestic applicants.
All this for a degree that has the same scaled earning potential as it did in the 1960s.
All considered, cheating seems to be a highly rational choice. With such tremendous odds stacked against the average student right out of high school, following the rules at the expense of a secure future just doesn't make sense. Concepts like honor and responsibility begin to hold less weight when your limited chance of future success rests on a single exam, and sometimes the academic pressure cooker demands sacrifices.
Unfortunately, there isn't much that can be done to put out the fire. As competitive careers pull applicants from around the world and the percentage of educated workers grows, the occupational vise will continue to demand more and more from the average worker. Educational attainment will inevitably lose value over time as supply grows from every corner of the globe. Job growth may be our only hope, and even that can't keep up with population growth (much less our robot overlords).
As long as these conditions continue to exist, there is little doubt that cheating will continue. And why shouldn't it? The demand on the average student is tremendous, and it's growing at a startling pace. I can't help but think back to that old joke they often tell liberal arts majors, "Would you like fries with that degree?" Even that, it seems, isn't a guarantee anymore.
...and we haven't even considered the competition at the Masters and PhD level.
Sunday, April 5, 2015
An Update on HSBC
Since I have such love and admiration for our old friend HSBC, I wanted to discuss a recently released Justice Department document describing HSBC as 'too slow' in the reforms it promised under its 2012 deferred prosecution agreement for funding terrorist organizations. It seems the same HSBC that is being charged with money laundering at its Swiss branch has, to the surprise of many, been lazy about cleaning up its act. The stupidity required to think that this sort of display is having any effect on the bank's business behavior, or that any sort of serious investigation is being conducted, is tremendous. The cited article further discusses the main office's connection to the current Swiss branch investigation:
"The bank said the Swiss unit at the center of the tax-evasion allegations had a “different culture” that “was not fully integrated into HSBC,” being largely formed from acquisitions.
HSBC’s monitor said that top leaders at the bank are doing a good job of taking responsibility for the bank’s anti-money-laundering and sanctions compliance program, but criticized other staffers, the filing said. Senior managers at the U.S. unit’s global banking and markets business inappropriately pushed back against negative findings from internal auditors and others, the filing said. Individuals involved in this incident had their bonuses reduced."
Reassuringly enough, that other branch had nothing to do with them, and we continue to keep our faith in HSBC in good order. What I really wanted to focus on is the heavy-handed action taken by the US Justice Department against those senior managers who pushed back against internal auditors, namely, the taking of their bonuses. Or rather, the reduction of their bonuses; so it seems that Bill in HR will no longer be able to afford the car of his dreams and must now settle for a Porsche. There may still be hope, though: it appears the Feds are having a difficult time accessing HSBC's computers:
"Mr. Cherkasky said the bank’s compliance technology remains “an area of material weakness,” the filing said. These systems are fragmented, which prevents bank investigators from easily reviewing customer records when evaluating suspicious activity. Mr. Cherkasky said these systems needed improvement in his initial report on the bank’s progress last year.
HSBC has good plans to improve these technology problems, but actually executing them will be “difficult, expensive and time consuming,” according to the filing."
Unfortunately, the burden once again falls on our friend to fund the required computer upgrades, and how HSBC will manage to implement a task of such difficulty, time commitment and expense is beyond me.
Remember kids, crime doesn't pay.
P.S. Surprisingly, the kindly UK government appears to have forgiven many of the tax evaders in the HSBC laundering scandal.
Monday, March 30, 2015
State Secrets and Court Cases
The United States Government recently dismissed a "defamation suit by Greek shipping magnate Victor Restis against a shady advocacy group called United Against Nuclear Iran". It's not particularly strange to hear about the United States government dismissing a court case for national security reasons, but this one does not appear to involve either a government group or a government contractor, setting an upsetting precedent and forcing us to revisit the power the state secrets privilege grants our Executive Branch (for you Americans out there). The state secrets privilege has historically been the target of tremendous criticism, mostly for its ability to subvert other forms of law in order to 'protect national security'. It's a tricky issue: such privileged information can prove harmful if divulged recklessly, but the degree of trust this forces us to place in our legislators can be unsettling. The potential for abuse is tremendous, and as the Guardian's take on the lawsuit dismissal points out, this potential has been realized again and again.
Take, for example, the Electronic Frontier Foundation's lawsuit against the NSA after Edward Snowden leaked details of the United States Government's massive surveillance program:
"And it doesn’t even matter if the basic facts of the
case are already public.
Take, for instance, the case brought by the Electronic
Frontier Foundation (EFF) – my former employer – challenging NSA phone
surveillance. It’s been going on since long before the Edward Snowden
revelations, and the government has long been invoking the state secrets
privilege to argue the court should dismiss EFF’s case before ever getting to
the question of whether the NSA’s mass surveillance programs are
unconstitutional. Given that the Obama administration has profusely proclaimed
its commitment to transparency after all the NSA information went public, you’d
think it would look preposterous if they continued to claim it’s a secret in
court.
Nope. Instead, Obama’s justice department doubled down on
protecting the truth about the NSA. In EFF’s case, the government is still
claiming that NSA’s phone and internet surveillance programs are secret and
cannot be challenged in open court."
This sort of 'trust us, we can't tell you, it's for your own good' form of law puts a tremendous amount of power in the hands of the United States government and does not appear to hold it to any standard except its own opinion. The very point of a national constitution is to keep the ruling power in check and force all participants in a nation to play by the same rules. You would think this would be obvious, but the ability to confer in secret and dismiss cases based on 'national security' carries the terrifying power to enforce any agenda that such information might disrupt.
Though the defense is still subject to judicial review, that review has been continually challenged since the post-9/11 Bush administration. The following is from a 2006 article regarding the Bush administration's spike in uses of the state secrets privilege:
"In response to every lawsuit, the administration has
invoked the previously rare (but, now common) “state secrets” privilege–an
extraordinary doctrine which holds that a court should refuse to rule on any
matter which the executive claims would risk disclosure of critical state
secrets. The Reporters’ Committee for Freedom of the Press reported that while
the Executive Branch asserted the privilege approximately 55 times in total
between 1954 (the privilege was first recognized in 1953) and 2001, it has
asserted it 23 times in the four years after 9/11. The Washington Post’s Dana
Priest reported in May that the administration has invoked the doctrine five
times in the past year alone.
As Andrew Zajac wrote in the Chicago Tribune, “The Bush
administration is aggressively wielding a rarely used executive power known as
the state-secrets privilege in an attempt to squash hard-hitting court
challenges to its anti-terrorism campaign. … Judges almost never challenge the
government’s assertion of the privilege, and it can be fatal to a plaintiff’s
case.”
Beyond abuse of the state secrets doctrine, the White House,
through its loyalists on the Senate Judiciary Committee, has continuously
blocked efforts by senators such as Republican Arlen Specter to direct the
secret Foreign Intelligence Surveillance Act (FISA) court to rule on the
legality and constitutionality of the administration’s warrantless
eavesdropping activities. And in each instance where a court would have an
opportunity to rule on the program, the administration has invoked procedural
doctrines to keep the court from doing so."
Though the Executive Branch's challenge to this sort of judicial review ultimately proved fruitless, the implications a successful challenge could have produced are terrifying, and in light of the recent resurgence of the state secrets privilege we must be aware of the unnerving degree of control such a privilege can produce.
I want to be fair to this issue, though, and I'm forced to admit that an easy solution likely does not exist. There is no doubt that some classified information could harm civilians if put in the wrong hands, and I'm sure that I can get along just fine without knowing nuclear launch codes or the United States military's latest technical project. But when such secrets concern domestic life, like PRISM and phone tapping, the line becomes much blurrier. So what do you guys think about the state secrets privilege?
Thursday, March 12, 2015
The Emergence of Western Individualism
Individualism, a strong focus on the self versus the group, is an integral part of the Western world, arguably one of the central ideas that it bears. The importance of this central idea leaks into every facet of American dialogue and appears consistently in all sorts of major issues within the United States. LGBT rights, gun control, taxation and racial discrimination all bear strong ties to the rights of the individual and the assumed undeniability of those rights. The basis of the idea of the ‘individual’, not as a discrete unit but as a locus of priority and particular significance, is an unusual one, not shared by the majority of the world. Despite the friction that individualism can create with incompatible establishments, hubs of individuality are home to some of the most stable and wealthy governments on the planet.
I’ll argue in this blog post that individualism, especially the Western sort, arose from a combination of resource abundance and a unique philosophical tradition dating back to the Renaissance. I’ll explore the implications these investigations have for Western foreign policy toward collectivist nations, and how these two systems can work together harmoniously for the benefit of both. As strong cultural figures can often be used as ‘markers’ of the significant ideas of a certain place and time, I’ll point to the ideas of philosophers during the Enlightenment as purveyors of the important ideas of this period of increased individualism, and chart these out until we reach the strong individualistic ethic we have now. Finally, the ‘why’ of individualism will be developed: why it occurred and what benefits it holds over a collectivist culture, while showing that the distinction between the two types of country is simply a direct result of maximizing societal stability.
I’ll divide this post into three parts, with Parts I and II lending support to Part III, together explaining how the contemporary individual emerged and why it’s important.
Part I: Origins of Collectivism and the Cost of Individuality
Before we flesh out these topics too much, I want to explore why we started from a place of collectivistic culture rather than an individualistic one: what sort of relationship do these two systems have? As I pointed out in my most recent blog post about the idea of a moralistic god, acting against the priorities of the group in pre-modern societies could often put the group in great danger. Stability was of paramount importance in early human societies, as survival lay in the cooperation of the group against an unpredictable environment. Stealing a neighbor’s cow could spell starvation for your neighbor and his family, as well as sever any insurance against famine you might have enjoyed from your neighbor’s goodwill. This brings to mind the famous Bible verse popularly used by John Smith: “He who does not work, neither shall he eat”. This idea is central to the workings of a highly collectivist society: contribute and you shall be cared for.
This focus on stability and obedience pushed the importance of idea systems that could increase group coherence and foster group identity, allowing the accomplishment of common goals. Individualism and individual identity had no place where survival was far from guaranteed. The idea of ‘individualistic goals’ was a dangerous and costly one; early societies were often very close to famine even in years of surplus (Minc, 1986), and could not afford to invest resources in individual cultivation. A worker taking unnecessary time off or cheating the system would be a great danger to all those who depended on his or her efforts. Acting in self-interest had a high potential cost and often low benefit, and, as a rule, these behaviors were seen as dangerous and punished (Allchin, 2009). We should not conclude that the individual was strongly oppressed in these early systems, though, as the very basic goal of survival was one shared by the group as a whole and all the individuals in it (Allchin, 2009). The main point is that organized behavior is the most efficient way to try out new methods of foraging and agriculture (Minc, 1986, p. 43), and to ensure optimum survival for all involved via assumed reciprocity and synchronized behavior in an environment of scarcity.
It’s not surprising, then, that the individual in early societies would define the self by their relationships to others and the role they fit into (Lieber, 1990), rather than by anything self-determined. How well one did the job that was set out for them (whether through ancestry or through societal need) would evoke greater personal positive reinforcement than doing a job that was fun or that made one feel impassioned. This leads to a collective or role-defined sense of self, as is seen in the Medieval and Greek art below:
In “Monks Singing in the Office” (Olivetan Gradual, 1439-1447) above, we can see the singers blend together as a functional unit while the conductor, likely a monk of higher status, serves his role separately. The picture simply serves to depict the idea of singing in a choir and gives no significance to the individuals within it. It was found in a gradual (the Olivetan Gradual), a book of religious hymns, which further emphasizes the functional rather than individualistic value of the artwork. This functional emphasis is also seen in the right painting’s (Last Judgement by Fra Angelico) blending together of individuals in favor of an overall theme, that being hell and the suffering it entails. This pulls us away from considering how each victim is suffering, in favor of promoting the more general theme that ‘people are suffering and this place is horrible’. This is unsurprising considering the painting was found on a wall of a monastery (the Santa Maria degli Angeli Monastery in Florence, Italy) as part of a larger painting showing Heaven and Hell. The demon is emphasized in this painting much like the conducting monk in “Monks Singing in the Office”, existing as a reference to the overall theme rather than for his own sake (we don’t care how he feels or thinks, we care about what he represents) (Angelico, 1431-1435).
The Greek vase painting compilation above also appears to place no significance on the individuals themselves, in favor of promoting the battle as a whole (Gigantomachia, War of the Giants, 400-390 BC).
Though the Greeks and Romans did show individuals in art, individual emphasis was often reserved for depictions of gods and great individuals who symbolized the group as a whole (for example, Caesar and the Roman State). This overall view of the lowered significance of the individual in pre-Enlightenment art is echoed by the Metropolitan Museum of Art’s entry on Portraiture in Renaissance and Baroque Europe (in their Heilbrunn Timeline of Art History):
“A portrait is typically defined as a representation of a specific individual, such as the artist might meet in life. A portrait does not merely record someone's features, however, but says something about who he or she is, offering a vivid sense of a real person's presence.
The traditions of portraiture in the West extend back to antiquity and particularly to ancient Greece and Rome, where lifelike depictions of distinguished men and women appeared in sculpture and on coins. After many centuries in which generic representation had been the norm, distinctive portrait likenesses began to reappear in Europe in the fifteenth century. This change reflected a new growth of interest in everyday life and individual identity as well as a revival of Greco-Roman custom. The resurgence of portraiture was thus a significant manifestation of the Renaissance in Europe.” (Sorabella, "Portraiture in Renaissance and Baroque Europe")
In art, an emphasis on the individual began to emerge during the Renaissance, though the individualistic ethic that eventually led to our contemporary conceptions of self did not take off until the British Enlightenment, which occurred at the tail end of the Renaissance and almost 200 years after it began (“What’s the Difference Between the Renaissance and the Enlightenment?”, 2015). To return to the group dynamic model from Part I, this model of collectivism assumes that environments of scarcity and higher risk of threat push a society towards stronger interdependent relationships; by the same token, when a society exists in a state of large surplus, individualism becomes much less costly. As such, the development of individualistic ideas should have become much more likely and favorable as agricultural efficiency increased and basic needs became more accessible.
Although it is difficult to assess agricultural surplus and the percentage of people housed before these sorts of statistics were recorded, we do currently tend to find a very strong tie between national wealth, quality of life and individualism. With wealth suggested as the causal agent for individualism in a culture, the inverse also holds: poorer nations tend to be more collectivistic (Hofstede, 2001; Triandis et al., 1986), with collectivistic culture allowing for more efficient allocation of scarce resources. Though the causal association cannot be confirmed to be the same in the past, for the sake of this argument we will assume the proposed mechanism holds for the Medieval and imperial powers of the past.
Part II: Philosophical Origins of Individuality and Self-Differentiation
It should be no surprise, then, that such a big gap existed between the first appearances of individualism and its surge into a major theme. This period also saw the rise of imperial nations, most notably the French and British Empires. As imperial powers, and especially their capitals, had greater access to goods via consolidation of agriculture and taxation, we should expect additional bumps in individualism to occur during this time. Both the British and French Empires led to the development of individuality in their own ways, via additions to important philosophies and/or historical events that were centered on individual rights.
The French promoted the individual most notably via the French Revolution, which is often cited as the beginning of the rise of individual rights in France. Britain's contributions were more indirect and subtle, mostly through its Empiricist tradition, with Empiricist thinkers eventually spreading ideas of liberty and the right of the individual not to be crushed under the heel of the state. In fact, the Empiricist movement is responsible for many of the ideas introduced into the United States' founding documents, via the tremendous impact of John Locke on Thomas Jefferson (Foundation of American Government, 2015).
As I am much more familiar with the major philosophers of this period (rather than the ideas of the French Revolution), I will use three thinkers, Rene Descartes, John Locke and Immanuel Kant, as the ‘markers of their time’ mentioned in my opening paragraph; this should allow me to outline the view of the individual held by Enlightenment-era thinkers.
The earliest of these philosophers, Rene Descartes, was a French citizen and a part of the Rationalist tradition in France (a related, often adversarial tradition to the Empiricist tradition, helping shape it by association). He is often cited for his phrase cogito ergo sum, or ‘I think, therefore I am’, and it is this phrase I would like to examine to point to the views of individualism expressed during his time (the mid-17th century). Descartes wrote a series of short writings, his Meditations, in which he challenged his most basic and innate conceptions to see if they were susceptible to any degree of doubt. Famously, at the end of his thought experiment, he found that he could not prove anything beyond at least some degree of doubt, except that he existed (this method being called Cartesian Doubt). The reasoning was that in order to doubt, something had to be doing the doubting, and that thing was him. Interestingly, what is often excluded when students are introduced to this text are Descartes’s responses to his finding. The quote below is from Descartes’s Third Meditation, and is very telling of the views of God and his relationship to the individual held at the time:
“I must inquire whether there is a God, as soon as an opportunity of doing so shall present itself; and if I find that there is a God, I must examine likewise whether he can be a deceiver; for, without the knowledge of these two truths, I do not see that I can ever be certain of anything. And that I may be enabled to examine this without interrupting the order of meditation I have proposed to myself [which is, to pass by degrees from the notions that I shall find first in my mind to those I shall afterward discover in it], it is necessary at this stage to divide all my thoughts into certain classes, and to consider in which of these classes truth and error are, strictly speaking, to be found.
…if God did not in reality exist--this same God, I say, whose idea is in my mind--that is, a being who possesses all those lofty perfections, of which the mind may have some slight conception, without, however, being able fully to comprehend them, and who is wholly superior to all defect [and has nothing that marks imperfection]: whence it is sufficiently manifest that he cannot be a deceiver, since it is a dictate of the natural light that all fraud and deception spring from some defect.” (Descartes)
This response is important because of the bridge it forms between the perceived experience one has and the way the world is. In Descartes’s view, the world is perceived as it is, and if this were not so, there would be no God, or God would be imperfect. The truth value of this argument is unimportant; what matters is the assumption it creates, that others perceive the world in the same manner as he does, with the connections between the outside world and the self differing only in scope from person to person. The individual is isolated from reality in the same respect as everyone else, and individual differences are of no importance, because the world is as we see it. Though I do not claim that Descartes was unaware of differences in height, angle of view, intelligence, etc., and their effects on perceptions of the world, he was not aware of aspects of the world that are hidden from human experience (except perhaps that which God has chosen to hide). The Neoclassicist art movement, which took off during the Age of Enlightenment, celebrated reality as it was perceived, painting many naturalistic scenes and emphasizing extreme realism (Perry et al.). These works reflect the same views and the importance of our shared ties to the ‘world as it was seen’.
John Locke, an important British Empiricist who wrote shortly after Descartes’s death, expressed views very similar in their implications to those of Descartes. The following quote is from Book I of Locke’s An Essay Concerning Human Understanding:
“For though the comprehension of our understandings comes exceeding short of the vast extent of things, yet we shall have cause enough to magnify the bountiful Author of our being, for that proportion and degree of knowledge he has bestowed on us, so far above all the rest of the inhabitants of this our mansion. Men have reason to be well satisfied with what God hath thought fit for them, since he hath given them (as St. Peter says) pana pros zoen kaieusebeian, whatsoever is necessary for the conveniences of life and information of virtue; and has put within the reach of their discovery, the comfortable provision for this life, and the way that leads to a better. How short soever their knowledge may come of an universal or perfect comprehension of whatsoever is, it yet secures their great concernments, that they have light enough to lead them to the knowledge of their Maker, and the sight of their own duties.” (Locke, 1689, p. 2)
For the sake of relevance, I will delve into Locke’s views and the purpose of the text this quote was derived from only superficially. The quote is found in the introduction to the first book of Locke’s An Essay Concerning Human Understanding, and was meant to act as a very obvious and assumed claim before Locke proceeded with his extensive analysis. It serves to explain that although we may not have access to the realm of God and perfect understanding, we are all granted access to the same pot of knowledge, knowledge that allows us to serve God and keep us happy during our time on Earth. Further on in the Essay, individual differences in perception are not accounted for, and objects of perception are assumed to be particular forms of a universal, that being a ‘perfect’ conception of an object of observation. If we perceive a red kitchen table, we are viewing a specific form of a ‘universal table’, an idealized conception of the table (similar to Plato’s ideal forms, for those familiar). These universals are shared by everyone and do not appear to be subject to interpretation in Locke’s view, further tying us to the ‘world as it is’ (to the extent that is relevant to us) in the same manner, courtesy of the Christian God.
Immanuel Kant was one of the first major philosophers to argue for a viewpoint that allowed for individualistic interpretation, as a result of something called the noumenal/phenomenal distinction, a view central to his philosophy. The noumenal realm is the world as it is, without the constraints of interpretation and our senses: the objective world. The phenomenal realm is the world as we see it. When building knowledge, on Kant’s account, we are simply understanding the phenomenal realm in greater detail while learning nothing of the noumenal, and the varied formation of phenomenal systems can lead to different systems of understanding depending on the subject. This led to the inevitable implication that the systems built in our own heads, as a result of assimilating information in different environments, are different from those built up by others. Kant’s views were widely accepted at the time, and as a result he remains a central figure in the development of Industrial-era Western philosophy, both American and European (Kant’s death came only a few decades before the beginning of the American Industrial Revolution).
Kant’s views of the phenomenal led to a school of investigation called phenomenology, which studied consciousness and its relationship to the world around it, and which had a precursory relationship to psychology. More importantly, after Kant’s philosophy took off, the individualistic ethic and the school of Existentialism formed soon after. This period also marked the beginning of industrialization, when quality of life began to climb and food production and distribution made their largest increases since the advent of agriculture. It’s difficult to establish causality here: did the increased quality of life reduce the need for others in a survival sense, leading to more individualistic schools of philosophy, or did they develop independently? I’m not too sure; I’ll leave it to the reader to explore further.
Part III: The Importance of Individualism for Economic and Foreign Policy
As a result of industrialization, basic needs became much more widely met, and the ‘cost of individuality’ was no longer fatal to the group; individuality even garnered certain advantages. Individuality actually provided a benefit, once a cost had been paid, via the so-called division of labor. One could now specialize in a specific task or field and perform it to a greater degree of mastery than anyone before them. These specialized individuals, when working as a unit, could achieve tasks much more comprehensive and complex than ever before, but they required a costly training period before this benefit could be accessed. Sukkoo Kim, in his paper Division of Labor and the Rise of Cities: Evidence from U.S. Industrialization, 1850-1880, suggests that the importance of division of labor also led to the development of cities, as labor matching is easier with a more localized workforce. Factors like higher wages served to attract rural workers into cities, and workers were also able to choose what they wanted to invest their time in fostering; whether it be welding, writing, cooking, etc., it still fit some sort of role desired by society (Kim, 2006). Non-essential tasks could be given higher priority when survival was no longer a serious concern, and self-cultivation and leisure became more and more accessible to greater parts of the population, leading to a higher and more secure quality of life.
Though individualism seems to tie well to higher prosperity, we should dig a little deeper to see if it leads to a happier population (money can’t buy happiness, right?). Collectivism seems, by necessity of the strong ties fostered by mutual dependence, to build stronger communities, while loneliness and narcissism seem to be obstacles that can make an individualistic existence undesirable. The idea of the predatory businessman selling out his community to make a quick buck comes to mind, putting himself above the group for his sole benefit.
According to a 2014 study published by Helliwell et al. in the Journal of Happiness Studies, areas with stronger communities and a higher emphasis on ‘social capital’ have happier citizens on average. They also found that following the 2008 financial crisis, individualistic nations experienced some of the largest drops in happiness, signaling a worse response to crisis in nations with fewer social support networks in place. This makes sense and supports the idea that collectivistic ideals lead to better social security and stability, while being forced to face crisis alone leads to more difficulty with recovery. It also fits well with the description of group resilience given in Part I, where the group can act as a safety net and less affected members can aid those most affected (Helliwell, 2014).
So why would prosperity lead to individualism if it leads to a less happy population in general? Although a collectivist model can handle crisis to a greater degree than an individualist system, an individualist system can reduce the likelihood and severity of these crises, reducing or nullifying the advantages a collectivist system allows. Division of labor allows for the building of more elaborate infrastructure and greater cumulative efforts in research and development, which allows industrialized nations to protect against both internal and external threats much more effectively. Collectivist cultures could not foster such specialized efforts in so many different arenas. As such, individualized countries may foster a greater degree of unhappiness during crisis, or even in a general sense, but there seems to be a strong stability advantage as a trade-off. This stability is both a requirement for individualism to exist in the first place and, once such a system takes root, an aspect that is continually reinforced by individualism (Hofstede, 2001).
Before I continue, I should point out that individualism, as I am defining it in this blog post, is described in terms of a person acting authentically, or in a manner that aligns with individually determined values. An individual’s values can come from the group (in fact, most of the time they do to some extent), but in situations where individual and societal values conflict, individual values should take precedence. That isn’t to say that we should assume everyone would act in a manner akin to the businessman above, as empathy (and other altruistic tendencies) occur naturally in humans. Rather, the values and terms one comes to describe as important are up to the individual, and determined by chosen alignment to external values. It has further been suggested that well-being in individualistic societies is tied to working with or for the sake of others (Helliwell, 2014), while others find that individualism predicts happiness while wealth does not (as such, the goals people work toward often aren’t monetary) (Fischer & Boer, 2011). This further supports the notion that individualistic societies do not necessarily devolve into a free-for-all (though some regulation may need to exist to protect against the authentically, purely self-interested).
When authenticity and division of labor are both considered in light of the social model in Part I, we begin to see that authenticity actually works for the group. The collectivist agenda is to optimize the survival of the group, and this agenda does not change during a switch to an individualistic ethic; the two do not have to be at odds. As the same principles dictate higher-level purpose for people in both cultures, most importantly order, social well-being, maximization of happiness, safety, and so on, we should not view the two cultures as having dissimilar ends. By allowing more individuals to work as nodes of exploration and innovation, individualistic cultures maximize our control and understanding of our environment as a unit. Comfort is also increased, as basic needs can be met for a large base of individuals with fewer workers and increased automation. Time can now be spent finding a niche and exploring and understanding our environment and ourselves. Identity becomes self-managed, and individuals can become who they want through their own efforts; thus social mobility is increased.
Complete social mobility in any industrialized nation is at the moment only an ideal, but we are seeing increased debate over one’s ability to meet self-defined goals with a fairer chance; this stands at the heart of much of the LGBT, racial and gender-based debate on equality prevalent in many industrialized nations. The meeting of individual goals (“being yourself”) takes strong precedence when authenticity promotes a stronger social infrastructure than one based on tradition and authority. This building up of social infrastructure helps everyone involved, with equal mobility allowing individuals to become skilled in something they enjoy while giving society increased resilience through a diverse workforce. If done well, individualism does not have to lead to lower happiness and well-being, and several individualistic nations report being among the happiest in the world (Standish & Witters, 2014).
These ideas have important implications for foreign policy and should be considered when collectivistic and individualistic nations interact. The difference in emphasis between the traditional and the personal can cause stark differences in priority between these two sorts of nations. The United States’s strong emphasis on individual rights, accessibility of information and ‘the pursuit of happiness’ only makes sense in light of both the tradition described in Part II and the ‘reduced burden of individuality’ explained in Part I. We should resist the temptation to think nations will, once given enough surplus and security, eventually become highly individualistic. Western individualism seems to be a unique combination of an emergent tradition and a lowered cost of individual cultivation; it is only one of many possible systems that can emerge from economic abundance and stability.
I will claim that cultivating stability in a nation serves to make that nation more open to new ideas, lowering the social and economic risk that adopting new ideas poses. Experimentation is possible in abundant and stable systems of government and undesirable when basic needs are a group’s top priority. Introduced deviations from social norms can be perceived as dangerous and intrusive in countries experiencing issues of safety and stability, and rightfully so, as they funnel resources away from basic needs. Foreign policy should keep these differences in priority in mind when dealing with nations of differing degrees of self-perceived stability, by helping to foster stability in places lacking it, and by facilitating cultural and economic exchange with those that perceive stability.
Attempts to introduce elections and democratic systems to fragile states often fail for this very reason. Despite the views of the citizen base, individuals may prefer to put those views aside in favor of a perceived group dynamic, one that will promote stability and cooperation in a nation badly needing it. It seems instead that aiding in infrastructure development and economic stability serves as a more effective method of enabling cultural exchange, a goal that can aid both the donor nation and the recipient. It may seem counterproductive not to focus on social issues in developing countries, and though I am not advocating a complete abandonment of monitoring social rights in other nations, the only way enduring change can occur is when stability is already in place. This model is obviously simplistic, and corrupt regimes, diplomatic refusal and differences in infrastructure priority are among the host of issues that can make stability a difficult point to reach; nonetheless, reaching this point provides massive benefits for all involved. This makes institutions like the United States Peace Corps, Médecins Sans Frontières and the United Nations invaluable allies in this mission.
Diplomatic relationships with collectivistic nations allow the refinement of ideas through new social mediums and the enrichment of culture on both ends. Further, common missions can be undertaken when two nations reach stability, and joint projects like the current CERN and International Space Station efforts become more viable. Labor can also be divided between larger populations, facilitating more extensive and complex economic and social projects. The principle of ‘helping others to help yourself’ is important here, as structural investment in nations badly needing it can provide valuable allies and partners in the future, and can allow faster development of projects that are important to all of mankind. Much in the vein of collectivistic culture, we should help those who need help, as they can help us when we do.
Works Cited
Allchin, D. (2009). The evolution of morality. Evolution: Education and Outreach, 2(4), 590-601.
Angelico, F. (1431-1435). Last Judgment (detail of Hell) [Online image], tempera on wood. Museo di San Marco, Florence. Retrieved March 11, 2015, from http://www.artcolorstore.com/images/painting/Last-Judgement-detail-Hell-Early-Renaissance-Oil-Painting-ACS02546.jpg
Descartes, R. Discourse on the Method of Rightly Conducting the Reason, and Seeking Truth in the Sciences [Project Gutenberg EBook].
Fischer, R., & Boer, D. (2011, June 14). Money can't buy happiness. Retrieved March 11, 2015, from http://www.apa.org/news/press/releases/2011/06/buy-happiness.aspx
Foundation of American Government. (2015, January 1). Retrieved March 11, 2015, from http://www.ushistory.org/gov/2.asp
The Gigantomachia, War of the Giants [Online image]. (400-390 BC). Retrieved March 11, 2015, from http://www.theoi.com/Gallery/L20.1.html
Helliwell, J. F., Huang, H., & Wang, S. (2014). Social capital and well-being in times of crisis. Journal of Happiness Studies, 15(1), 145-162.
Hofstede, G. (2001). Culture's Consequences: Comparing Values, Behaviors, Institutions, and Organizations Across Nations (2nd ed.). Thousand Oaks, CA: Sage Publications.
Kim, S. (2006). Division of labor and the rise of cities: evidence from US industrialization, 1850-1880. Journal of Economic Geography, 6(4), 469-491.
Locke, J. (1689). An Essay Concerning Human Understanding. Retrieved March 12, 2015.
Minc, L. D. (1986). Scarcity and survival: the role of oral tradition in mediating subsistence crises. Journal of Anthropological Archaeology, 5(1), 39-113.
Olivetan Gradual (1439-1447). "Monks Singing the Office" [Online image]. Retrieved March 11, 2015, from http://www.gg-art.com/news/photoshow/43225l1.html
Perry, M., Hollinger, P., & Baker, J. (n.d.). Humanities in the Western Tradition - Chapter Summary 18. Retrieved March 10, 2015, from http://college.cengage.com/humanities/perry/humanities/1e/students/summaries/ch18.html
Rider-Bezerra, S. (2014, September 14). Introducing Beinecke MS 1184: The Olivetan Gradual. Retrieved March 10, 2015.
Sorabella, J. (2007, August). Portraiture in Renaissance and Baroque Europe. In Heilbrunn Timeline of Art History. New York: The Metropolitan Museum of Art. Retrieved from http://www.metmuseum.org/toah/hd/port/hd_port.htm
Standish, M., & Witters, D. (2014, September 16). Country well-being varies greatly worldwide. Retrieved March 10, 2015, from http://www.gallup.com/poll/175694/country-varies-greatly-worldwide.aspx
Triandis, H. C., Bontempo, R., Betancourt, H., Bond, M., Leung, K., Brenes, A., ... & Montmollin, G. D. (1986). The measurement of the etic aspects of individualism and collectivism across cultures. Australian Journal of Psychology, 38(3), 257-267.
What's the Difference Between the Renaissance and the Enlightenment? (2015, January 9). Retrieved March 11, 2015, from http://www.slate.com/blogs/quora/2015/01/09/what_s_the_difference_between_the_renaissance_and_the_enlightenment.html
Sunday, March 8, 2015
God and Societal Complexity
I think when we examine early societies we often find a lot of diversity across groups: traditional foods, religious ceremonies and courting rituals all seem to vary wildly from population to population. Yet the metathemes, things like religion, conserved beliefs (tradition and other "passed down practices") and spoken language, seem to fit a sort of category of universality among all human populations. Though the idea of universals (in this case "cultural universals") among human groups is a fairly controversial view (with opponents referred to as "relativists"), I tend to subscribe to the former school. That's why I find this article in Nature, on complex societies evolving without belief in an all-powerful deity, particularly fascinating, and a pointed criticism of the idea of certain axiomatic cultural universals, at least in the sense of an all-powerful god.
In studying a set of 96 Austronesian cultures, Professor Watts from the University of Auckland in New Zealand checked for two common sets of religious systems: Moralizing High Gods (MHGs) and beliefs in systems of supernatural punishment (BSPs). They found 6 and 37 cultures fitting into each group respectively, with BSPs helping (though not guaranteeing) the development of societies to higher levels of complexity, and MHGs evolving only after societies had reached a certain degree of political complexity. The explanation below for this phenomenon is taken from the article:
"So what are MHGs for? “They are tools of control used by purveyors of religion to cement their grip on power,” says Pagel. “As soon as you have a large society generating lots of goods and services, this wealth can be put to use by someone who can grab the reins of power. The most immediate way to do this is to align yourself with a supreme deity and then make lists of things people can and cannot do, and these become ‘morals’ when applied to our social behaviour." "
The idea that humans can 'naturalize' these ethical principles, in the form of metaphysical systems (like the karma system mentioned in the article, or "accumulated good", both BSPs) or a ruler-like entity, is not one that had crossed my mind before, but it makes a lot of sense when seen as having strong stabilizing potential for pro-social behaviors in early societies. As these systems follow the need for stability rather than create it, a society can arguably develop other methods to create stability. Such a need was much more important in much earlier societies, as anti-social or self-serving behavior could play a massive role in creating conditions that lead to the deterioration of the society as a whole, which could spell famine, civil unrest and disease for the affected social group. Death is usually not on the table for a modern family with a disobedient child or obstructive neighbor, but when living in an environment where one's survival depends on one's crop yield or, in a worst-case scenario, a neighbor's generosity, such ties played an absolutely vital role in individual survival.
Further, it seems that other, more efficient methods of creating this stability can render these systems somewhat obsolete. This is of course speculative, but it seems to follow, as their purpose is then better met elsewhere. A study by Phil Zuckerman, a professor of sociology, titled Atheism: Contemporary Rates and Patterns, examined differences in rates of atheism across countries and asks this very same question; here's the relevant excerpt:
"What accounts for the staggering differences in rates of non-belief between nations? For instance, why do most nations in Africa, South America, and Southeast Asia contain almost no atheists, while many European nations contain an abundance of non-believers? There are various explanations (Zuckerman, 2004; Paul, 2002; Stark and Finke, 2000; Bruce, 1999). One leading theory comes from Norris and Inglehart (2004), who argue that in societies characterized by plentiful food distribution, excellent public healthcare, and widely accessible housing, religiosity wanes. Conversely, in societies where food and shelter are scarce and life is generally less secure, religious belief is strong. Through an examination of current global statistics on religiosity as they relate to income distribution, economic inequality, welfare expenditures, and basic 19 measurements of lifetime security (such as vulnerability to famines, natural disasters, etc.), Inglehart and Norris (2004) convincingly argue that despite numerous factors possibly relevant for explaining different rates of religiosity world-wide, “the levels of societal and individual security in any society seem to provide the most persuasive and parsimonious explanation” (p.109).iii Of course, there are anomalies, such as Vietnam (81% non-believers in God) and Ireland (4- 5% non-believers in God). But aside from these two exceptions, the correlation between high rates of individual and societal security/well-being and high rates of non-belief in God remains strong."
This hypothesis seems compatible with the views expressed in the Nature article above, and offers a functionalist perspective on the purpose of moralizing religions. It would be interesting to see whether this also holds for belief in a non-moralizing deity or an intrinsic, non-moral sense of order in the universe. So what do you guys think? Did spirituality as a whole evolve in us to create order, or does it play some other, more complex purpose?
In studying a set of 96 Austronesian cultures Professor Watts from the University of Auckland in New Zealand checked for two common sets of religious systems, Moralizing High Gods (MHGs) and beliefs in systems of supernatural punishment (BSPs). They found 6 and 37 fitting into each group respectively, with BSPs helping (though not guaranteeing) the development of societies in higher levels of complexity and MHGs evolving after societies had reached a certain degree of political complexity. The explanation below for this phenomena is taken from the article:
"So what are MHGs for? “They are tools of control used by purveyors of religion to cement their grip on power,” says Pagel. “As soon as you have a large society generating lots of goods and services, this wealth can be put to use by someone who can grab the reins of power. The most immediate way to do this is to align yourself with a supreme deity and then make lists of things people can and cannot do, and these become ‘morals’ when applied to our social behaviour." "
The idea that humans can 'naturalize' these ethical principles, in the form metaphysical systems (like the karma system as mentioned in the article or "accumulated good", both BSPs) or a ruler like entity is not one that's ever crossed me before, but makes a lot of sense when seems as playing a strong stabilization potential for pro-social behaviors in early societies. As these systems follow the need for stability rather than create them, a society can arguably develop other methods to create stability.Such a need was much more important in much earlier societies, as anti-social or self serving behavior could play a massive role is creating conditions that lead to the deterioration of the society as a whole, which could spell famine, civil unrest and disease for the effected social group. Death is usually not on the table for a modern family with a disobedient child or obstructive neighbor, but when living in a environment where one's own survival depends on their crop yield or in a worst case scenario, their neighbor's generosity, such ties played an absolutely vital role in individual survival.
Further, it seems that other, more efficient methods of creating this stability could render these systems somewhat obsolete. This is of course speculative, but it seems to follow: their purpose is simply better met elsewhere. A study by Phil Zuckerman, a professor of sociology, titled Atheism: Contemporary Rates and Patterns, examined differences in rates of atheism across countries and asks this very same question. Here's the relevant excerpt:
"What accounts for the staggering differences in rates of non-belief between nations? For instance, why do most nations in Africa, South America, and Southeast Asia contain almost no atheists, while many European nations contain an abundance of non-believers? There are various explanations (Zuckerman, 2004; Paul, 2002; Stark and Finke, 2000; Bruce, 1999). One leading theory comes from Norris and Inglehart (2004), who argue that in societies characterized by plentiful food distribution, excellent public healthcare, and widely accessible housing, religiosity wanes. Conversely, in societies where food and shelter are scarce and life is generally less secure, religious belief is strong. Through an examination of current global statistics on religiosity as they relate to income distribution, economic inequality, welfare expenditures, and basic 19 measurements of lifetime security (such as vulnerability to famines, natural disasters, etc.), Inglehart and Norris (2004) convincingly argue that despite numerous factors possibly relevant for explaining different rates of religiosity world-wide, “the levels of societal and individual security in any society seem to provide the most persuasive and parsimonious explanation” (p.109).iii Of course, there are anomalies, such as Vietnam (81% non-believers in God) and Ireland (4- 5% non-believers in God). But aside from these two exceptions, the correlation between high rates of individual and societal security/well-being and high rates of non-belief in God remains strong."
This hypothesis seems compatible with the views expressed in the Nature article above, and it offers a functionalist account of the purpose of moralizing religions. It would be interesting to see whether this also holds for belief in a non-moralizing deity, or in an intrinsic, non-moral order to the universe. So what do you guys think? Did spirituality as a whole evolve in us to create order, or does it serve some other, more complex purpose?
Wednesday, March 4, 2015
The FCC Classifies the Internet as a Public Utility
I know the last few posts on this blog have been a bit on the bleak side, so I'm glad to write this post about the FCC's decision to finally classify the internet as a public utility. This came after the FCC received over three million complaints over Comcast's attempt to institute a program that would create "fast lanes", where clients who paid more had faster access to certain sites, and "slow lanes", where customers who did not pay the extra fee would have certain sites throttled (given lower bandwidth). This model is very similar to the cable and satellite TV model, where access is regulated by payment (though it was never particularly clear whether you would be required to pay for access to certain sites, or whether access to those sites would simply be throttled).
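To give a rough sense of how a "fast lane"/"slow lane" scheme could work at a technical level, here's a minimal Python sketch of per-tier bandwidth throttling using a token bucket. The tiers, rates, and numbers are hypothetical and purely illustrative; this is a toy model of the general technique, not a description of Comcast's actual systems.

```python
# A toy model of tiered bandwidth throttling using a token bucket.
# All rates and sizes are hypothetical, chosen only to illustrate the idea.
import time

class TokenBucket:
    """Allows roughly `rate` bytes per second, with short bursts up to `capacity`."""
    def __init__(self, rate_bytes_per_sec, capacity_bytes):
        self.rate = rate_bytes_per_sec
        self.capacity = capacity_bytes
        self.tokens = capacity_bytes
        self.last = time.monotonic()

    def allow(self, nbytes):
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at the bucket's capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        # Packet would be delayed or dropped -> lower effective bandwidth.
        return False

# Hypothetical tiers: the "fast lane" gets ten times the sustained rate
# and a much larger burst allowance than the "slow lane".
fast_lane = TokenBucket(rate_bytes_per_sec=10_000_000, capacity_bytes=1_000_000)
slow_lane = TokenBucket(rate_bytes_per_sec=1_000_000, capacity_bytes=100_000)

packet_size = 150_000  # bytes
print("fast lane allows packet:", fast_lane.allow(packet_size))  # True
print("slow lane allows packet:", slow_lane.allow(packet_size))  # False
```

The point of the sketch is simply that the same traffic gets through easily on one tier and is held back on the other; which sites land in which tier is a business decision, not a technical necessity.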
Though Comcast does not claim to throttle access to sites currently, it has charged streaming sites, most notably Netflix, for the high data usage those sites require. The period in which this fee was being demanded coincided with a slowdown in access to Netflix via Comcast, and speeds spiked again after Netflix agreed to pay (see graph below).
Much like radio, internet infrastructure (phone lines, fibre optic cables, etc.) is now subject to heavy regulation of who may access it and how it may be used. Before this decision, the lack of regulation had led, even in major metropolitan areas, to "provider zones", where living in a certain area forces the use of one ISP or another, typically Comcast, Time Warner, or AT&T U-Verse. This, along with the largest ISPs agreeing not to extend into each other's "provider zones", has resulted in higher prices, decreased competition, and, on a global scale, some of the slowest internet in the developed world.
The following quote from the New York Times puts it quite succinctly:
"The reason the United States lags many countries in both speed and affordability, according to people who study the issue, has nothing to do with technology. Instead, it is an economic policy problem — the lack of competition in the broadband industry.
“It’s just very simple economics,” said Tim Wu, a professor at Columbia Law School who studies antitrust and communications and was an adviser to the Federal Trade Commission. “The average market has one or two serious Internet providers, and they set their prices at monopoly or duopoly pricing.”
For relatively high-speed Internet at 25 megabits per second, 75 percent of homes have one option at most, according to the Federal Communications Commission — usually Comcast, Time Warner, AT&T or Verizon. It’s an issue anyone who has shopped for Internet knows well, and it is even worse for people who live in rural areas. It matters not just for entertainment; an Internet connection is necessary for people to find and perform jobs, and to do new things in areas like medicine and education."
This newly adopted rule both limits the extent to which ISPs can throttle and serves to break up the hold major ISPs have on the market. It's no surprise, then, that giants like Google, as it attempts to roll out its Google Fiber service, and Netflix, which suffers directly under the current model, heavily support the FCC's decision.
With the internet playing such a vital role in the day-to-day lives of adults everywhere, it seems very difficult to justify stifling an essential public resource in order to create an atmosphere of artificial scarcity. Holding back the development of an incredibly important piece of infrastructure to make a quick buck can only hurt both national and global development, in a nation where 74.4% of citizens report having direct access to the internet at home. Although I often feel regulatory agencies fail to, well, regulate in an effective manner, this decision marks a step in the right direction for the everyday consumer and serves as an important example of the government working for the people, not against them.