Friday, January 27, 2012
Blog Post 3
“Second Variety” was a very good story. In my opinion, the moral of it
is that the creation of atomic weapons that can bring about the apocalypse will
one day surely bring about the apocalypse. All it takes is one action, just one
press of a button to initiate it. I must admit, however, that I sometimes
disagree with this. I definitely do not believe the U.S. will fire one off
first again (even though we HAD to do so during WWII, despite what some people
may say). I don’t even think the governments of countries like Iran and North
Korea would. Why? Because even though the dictators who control those countries
may be complete assholes to their own people, they’re definitely not stupid. If
anything, they’re greedy as hell. Most people in power, including those in our
own country, care only about two things: power and wealth. If they were to fire
off a nuke today (as opposed to the weak ones dropped on Hiroshima and
Nagasaki) and start a chain reaction of countries firing off nukes at one
another in the process, then they wouldn’t have the means to continue acquiring
more power and wealth because the majority of the people on the planet would be
dead. Sure they (the world’s rulers) could travel to the moon, or move
underground, but with everyone gone, is their money really worth anything? Do
they really possess the kind of power they had when 7 billion people roamed the
planet? Obviously not. At the advent of nuclear technology, people used to say that these bombs would be able to deter war. That, too, obviously hasn’t turned out to be the case. If anything, the destructive power of nuclear bombs simply deters people from using them. With
that said, I think “Second Variety” is a great story. Sure it’s entertaining,
but its message is also one that should definitely be heeded. The reason I say
this is that if a nuclear weapon (or its launch codes) were to get into the hands of the wrong person, say, a religious fanatic who doesn’t care about power or wealth or anything this world has to offer, then we’re all definitely screwed. From
that standpoint, I couldn’t be in more agreement with the moral of this story.
Thursday, January 26, 2012
Blog Post 1
I really enjoyed “Liar” because of how it illustrated the
complexity of human emotional cognition. At the heart of this complexity is
‘conflict.’ In class, we discussed how conflict is necessary to every story.
This makes perfect sense considering how important a role it plays in our existence. Without conflict, would imperfection exist in our reality? Would we have any concept of emotion? Maybe happiness, though when you stop to think about it, most, if not all, happiness results from some sort of resolved conflict.
Whether as big as an international crisis, dispute, etc., or as minuscule as the frustration that accompanies learning how to tie a fishing knot for the
first time, conflict is what stimulates our emotions, and thus, is what makes
us human.
Again, as we discussed in class, there are two types of
conflicts: external (man vs. man or man vs. nature) and internal (man vs.
himself). “Liar” does an exceptional job exposing the symbiotic relationship
between the two, specifically through the characters Peter Bogert and Susan
Calvin. Bogert’s internal conflict is his nagging desire to succeed Alfred
Lanning as the head of U.S. Robot & Mechanical Men, Inc., while Calvin’s is
her love for a fellow employee, Milton Ashe. Their external conflicts are revealed
as a result of their confiding in a mind-reading robot named Herbie. With
respect to Bogert, Herbie informs him that he will soon become head of U.S.
Robot & Mechanical Men; for Calvin, that Ashe feels the same way about her
as she does for him. What both characters (Bogert and Calvin) foolishly forget
is that Herbie is bound to the Three Laws of Robotics: “(1) a robot may not
injure a human being or, through inaction, allow a human being to come to harm;
(2) a robot must obey the orders given it by human beings except where such
orders would conflict with the First Law; and (3) a robot must protect its own
existence as long as such protection does not conflict with the First or Second
Laws” (282). As it turns out, Herbie was forced to lie to them in order to obey
these three laws, resulting in possibly disastrous consequences for Bogert’s career, as well as Calvin’s anger at having fooled herself into believing that
human beings had finally created something that could solve a mystery as great
as that of their own emotional cognition. In essence, she lied to herself.
To me, this all illustrates how conflict is a central aspect
of our existence as human beings. Without conflict, emotion cannot exist, and
without emotion, we would not be human, but something completely different,
like Herbie. The point is that human beings may be able to create things such
as fire, the wheel, the printing press, the washing machine, the car, the computer, etc. to
make our lives easier and more convenient, but we will never be able to create
anything that will completely correct an imperfect reality. Conflict is the
root of that imperfection, and so it will always remain.
This movie (Equilibrium) relates well to the story and
everything I’ve just written; it’s also one of my favorites. The song
(Archetype by Fear Factory) is also a personal favorite of mine. Chances are
some, if not most, of you will get a headache (either real or imagined) from listening to heavy metal, so here’s a link to the lyrics in case you want to watch the movie but mute the sound (there’s no dialogue anyway).
Friday, January 20, 2012
Blog Post 2
The character that interested me the most this week was Elena. In class, we talked about how the plot of a story is driven by a character’s need, desire, or goal to resolve a personal or external conflict. Elena’s conflict is a personal one in that she
doesn’t feel there’s anything special or significant about her life or life in
general. She’s troubled by her perception that everyone around her views the
human existence as something significant, something more than a set of predetermined algorithms. The algorithms Elena perceives refer specifically to the ‘proper/expected’ ways people communicate with
each other. For example, when someone asks how you’re doing, it’s normal to
give a response along the lines of “I’m doing well,” even if it isn’t true. In
other words, the question doesn’t inquire about a person’s condition, so much
as it is an acknowledgement of their presence. Communication like this, which
is socially and culturally ingrained, creates a conflict for Elena because she
doesn’t see it as real/sincere. Combined with her depression (stemming from the passing of her infant daughter, Aimee), this notion of insincerity leads Elena to view all communication as artificial, just like her dolls, to the point where she believes she can predict what people are going to say before they even say it. What she doesn’t realize is that her depressed mental state (and its effect on herself and others) is the reason why she can predict people’s responses. Nonetheless, she attempts to fill the void by making a doll
exactly like her daughter. Unfortunately, she doesn’t see that a machine will
never be able to replace her daughter because a machine isn’t as complex as a
human. Elena has fallen so far into this state of depression-rooted insanity
that she isn’t able to acknowledge this difference; she can’t see past her
perception of human beings as biological ‘machines’ whose existences are
constructed by a set of predetermined algorithms. As a result, the resolution
is left unknown to the reader, though suicide is a possible and fair speculation.