During our last class meeting, I thought about all the ideas and points of view we have learned throughout this course. Understanding and learning from people of different backgrounds really enlightened me, not only as a graduate student but also simply as a person.
It is interesting to compare what we said during the first class meeting with what we said at the last. Realizing, at least for myself, how much influence digital media have, I would no longer stand so firmly on my declarations from the beginning of this semester. Two prime examples: Are we still so eager to accept the truthfulness of the “stuff” we read online? And knowing how much our lives are integrated with the Internet, are we going to take personal information and privacy so lightly now?
Regardless, there is no doubt that digital media will only make inroads into our lives, our society, and our culture.
As we talked about the blending of cyber culture and real-life culture, and the impact that Google or any other archiving system has, we kept coming back to the idea that cyber life and real life are drawing closer and closer together. There is no doubt that, given the speed of the Internet’s development and our constant connection to it, our online lives can continue to live long after our physical bodies abandon us. But how big a role does Google play in this, I wonder?
The answer came to me when I read this post from the Wall Street Journal. http://online.wsj.com/article/SB10001424052702304510004575186310354202110.html?mod=WSJ_hpp_sections_business True, this is just a rather run-of-the-mill press release from Google’s CFO about their profits, but consider this: if Google wants to, it can measure the health of the U.S. economy through the trends and volume of searches relating to job hunting. It can even index blog posts, forums, tweets, and any other social communication to determine whether people are talking about finding a job, getting a job, or being fired from one. Think about it: nowadays, applying for even a basic low-skilled job requires an online search and submission, which in turn means Google knows about it. More and more telecommunications are moving to IP-based systems, and Google is investing not only in massive data centers but also in the equipment to become an Internet service provider.
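To make the idea concrete, here is a toy sketch of how a pile of search queries could be turned into a crude labor-market signal. Everything here is hypothetical — the sample queries, the keyword lists, and the "distress share" metric are my own invention, not anything from Google's actual systems:

```python
from collections import Counter

# Hypothetical sample of search queries (illustrative, not real logs)
queries = [
    "entry level jobs near me",
    "resume template",
    "best pizza recipe",
    "unemployment benefits application",
    "how to file for unemployment",
    "weather tomorrow",
]

# Hand-picked keywords hinting at job hunting or job loss (my assumptions)
JOB_HUNT_TERMS = ("job", "resume", "hiring")
JOB_LOSS_TERMS = ("unemployment", "laid off", "fired")

def classify(query: str) -> str:
    """Tag a query as job-loss, job-hunting, or other."""
    q = query.lower()
    if any(term in q for term in JOB_LOSS_TERMS):
        return "job_loss"
    if any(term in q for term in JOB_HUNT_TERMS):
        return "job_hunt"
    return "other"

counts = Counter(classify(q) for q in queries)
# Share of queries that signal labor-market distress
distress_share = (counts["job_hunt"] + counts["job_loss"]) / len(queries)
print(counts, round(distress_share, 2))
```

A real system would obviously normalize against total search volume and track the share over time, but even this toy version shows how trivially search logs become an economic thermometer.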
Wow… the possibilities are endless. Good and evil…
Although the line between spectator and user has always been blurred, I believe the gray area is going to grow. One thing we have to take into consideration is that technology is driving a lot of these changes. As technology advances, the gray area becomes larger until one side takes over completely.
At the beginning of mass Internet consumption, websites were mostly static. Surfers came to a site, got the information or entertainment they sought, and left. Social interaction was restricted to specially designed software such as ICQ, IRC, and a limited number of forums. With the advancement of faster connections, Web 2.0, Flash, and other technologies, websites became places where surfers were not only spectating but also participating. Whether it is leaving feedback on an eBay auction, posting to a blog, replying to a friend request, or chatting via Meebo, these activities consolidated what was previously decentralized.
I have no doubt that with the push of HTML5, Internet2, peer-to-peer networking, and the like, every surfer will become a user whether they like it or not. Take, for example, the site Chatroulette. It is just a small step away from total automation, where all you have to do is hit that bookmark. It just might be that the future of the Internet will be an all-or-nothing situation with no spectators: only user, or not a user at all.
As we discussed the many sides of the Internet in class, some of the more prominent issues, such as online identity, privacy, and addiction, were talked about at length. Coming from an IT background, it seems to me that the Internet is a very hard thing to categorize, unlike newspapers or television.
Every “cave” of the Internet behaves differently, which requires users to act and behave differently. For example, in IRC rooms anyone can shed their real life and submerge themselves in fantasy. What is even more powerful is that the same person can play multiple roles at the same time and uproot their previous background with a simple click. On the other hand, social sites such as Facebook offer a totally different way of interacting with the Internet, one where you have to safeguard your identity because the people you befriend tend to be people you know in real life.
In between these two extremes are games and sites such as World of Warcraft, which, by the way, attracts tons and tons of players across the world. On a personal note, I got “dragged” and “drugged” into playing World of Warcraft for a period of time. For me, it was a surreal experience: having the urge to play all the time, and, while at work, searching for anything related to the game. I can’t speak for every player, but for me there was a sense of accomplishment in reaching the next level within the game. The satisfaction of getting new gear to show off to friends. The ability to contribute to a group effort where everyone within the party is important.
I have no doubt the game designers crafted this game carefully to enhance these experiences, in order to make a person like me log on as soon as I get off work and play until 4 AM. I believe many WoW players willingly embrace an illusion of stature that they cannot find in real life. After all, the storyline within WoW makes you feel like you are changing the world.
“Hi, my name is Edward and I was a WoW addict.”
This article just blows me away. I don’t even know what to say.
Instead of understanding and re-engineering teaching methods, universities and professors are banning laptops from their classrooms. I wholeheartedly understand the distractions and opportunities to cheat that technology such as laptops brings, but I don’t see the benefit of a total ban. More than likely, these law students will use laptops once they are professionals. To think otherwise is just foolish.
Personally, I would be turned off if a professor made such a harsh demand.
In Benkler’s section on relevance/accreditation, he explains how Slashdot’s comments section works and how “peer review” offers both relevance and accreditation without the need for professionals. Although I agree with Benkler that this is an interesting case study for relevance and accreditation, I don’t agree that this method, or others such as Digg, really offers either.
Because of political, social, and economic factors, users within these systems tend to form groups. When a story or article is posted for comments, these groups start to follow a group mentality and rate the same way. In the worst case, groups will use their voting power to suppress or elevate content based on their own ideas.
When a normal user is exposed to these conditions of suppressed voices (Slashdot) or suppressed articles (Digg), the complete picture becomes incomplete. On a grander level, if all of our data feeds are pre-evaluated by other individuals or groups, and given the almost uncontrollable flood of data feeds from everywhere, aren’t we going to drift further into ‘group think’?
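The suppression mechanism is easy to picture in code. Here is a minimal sketch of Slashdot-style threshold filtering; the comments, scores, and default threshold below are all made up for illustration (Slashdot’s real moderation scale runs from -1 to 5, and its defaults differ):

```python
# Hypothetical comment scores after a voting bloc has done its work
comments = [
    {"text": "Insightful dissent", "score": -1},  # buried by coordinated downvotes
    {"text": "Popular opinion",    "score": 4},
    {"text": "Neutral remark",     "score": 1},
]

def visible(comments, threshold=1):
    """Return only comments at or above the reader's score threshold."""
    return [c["text"] for c in comments if c["score"] >= threshold]

# A reader browsing at the default threshold never sees the dissent.
print(visible(comments))
```

The dissenting comment is technically still there, but for any reader browsing at the default threshold it might as well not exist; only a reader who deliberately lowers the threshold ever encounters it.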
One thing that stood out for me in this week’s reading was the area of virtual reality. As someone who has spent way too much time on games such as Doom and Myst, I can attest to the attractiveness of virtual reality and its associated media.
Two things we have to keep in mind. One, I believe virtual reality has hit a wall in terms of growth and development. I am not sure if it is a technological barrier or a lack of interest, but nonetheless it has had less of a presence in new media lately. Two, we are still unable to climb out of the “uncanny valley.”
The uncanny valley, first described by Masahiro Mori, says that as artificially created human representations approach 100% likeness, humans become repelled by them. It is interesting that popular media creations such as the droids from Star Wars and the Cylons from Battlestar Galactica are all far from looking 100% human, while movies such as Final Fantasy and Beowulf really look “fake.”
I’m not sure how, or if, it is even possible to provide total virtual immersion. 3D movies and 3D Blu-ray are just one small step in that direction, but maybe it would take something like the Matrix’s direct cerebral-cortex connection to truly experience virtual reality.
Something funny to watch…
This week’s reading took us on a trip back in time to peek into some of the great minds of the past and read how they viewed the development of thinking machines. No doubt these great minds shaped what we are using today.
One thing that troubled me is “Lady Lovelace’s Objection” (Turing 59). Having been a programmer for over 12 years, I have a strong opinion about whether we can create true thinking machines. In my opinion, there is no way for us to write a truly intelligent program even remotely comparable to the human mind. The inability to automatically form associations between two different ideas really hinders the whole idea of artificial intelligence.
A few years ago, programmers designed software with the simple goal of capturing whatever logic it needed to complete its task. Under that model, users had to be trained to use the software as intended. Currently, there is a push toward making software fit natural human processes, which makes software and hardware feel more seamless with human nature. But hardware and software can only go so far, and will never create a true thinking machine.
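To illustrate what I mean by Lovelace’s objection: a toy rule-based “thinking machine” (the rules below are entirely my own, hypothetical example) can only answer with what was explicitly programmed into it. It cannot form an association that is not already in its table:

```python
# A toy rule-based responder: it "originates nothing," exactly as
# Lady Lovelace's Objection describes. (Hypothetical rules.)
RULES = {
    "hello": "Hi there!",
    "bye": "Goodbye!",
}

def respond(message: str) -> str:
    """Look the message up in the rule table; no rule, no thought."""
    return RULES.get(message.lower(), "I don't understand.")

print(respond("Hello"))                       # matches a rule
print(respond("Is a greeting like a wave?"))  # no association possible
```

A human hearing the second question would immediately connect greetings and waving; the program has no mechanism for making that leap, and adding more rules only postpones the failure.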
What is new and what is old? Isn’t media just media, only at an evolving stage?
This question popped into my mind when I was reading Lister’s part on “Simulation Games” (pp. 42–44). I wondered out loud whether we are really trying to justify the existence of new media by its technical and personalized aspects. Sure, we can quantify it using technical measurements, but do we need to?
Watching any child playing with a train set, building blocks, or make-believe games will give us an insight: we have been doing “simulations” for ages. Prior to computers, prior to new media, and prior to fancy toys, kids (and adults) had been simulating on whatever media they could find! It is true that new media and technology dramatically enhance the complexity of these simulations, but is that really the definition of media?