
[Apologies: This is a long academical essay that is intended to be the last time I write on a topic that has bothered me for years. It is a little heady, but I hope it's comprehensible.]

    A colleague asked me what I thought about the current generation of college students. Now, every generation is “a generation of vipers” (Mt 23:33), and a fun game is going through Bartlett's Familiar Quotations or the like and searching out “young people” or “today's youth” quotes. No elder generation is particularly pleased with the redecorating the new kids do. A wise person, therefore, takes care not to cry wolf. When Douglas Coupland and the rest started up their talk of “Generation X” and the like, I thought it was hooey (especially since that 'generation' began three years after I did). I observed that growing up with a computer did not seem to make much of a difference. In fact, I was disappointed in how small a difference it made in folks who were in their teens in 1984 (I was past twenty). Since my life had been a series of unfulfilled promises of future innovation and Utopia, I attributed generational changes to yet another floor in the tower of human folly. Until about five years ago, I thought all of the “generation this'n'that” talk was bunk, but I'm not entirely sure now that there isn't a big, big change coincident with generation.

    First, the talk is proliferating, but it is also generating actual research. “Millennials” are a group that is not merely being studied, but is actually showing cognitive and behavioral differences from prior groups. Second, the changes that we're seeing are not found only in them; they are merely more acute and pronounced among them. In other words, whatever it is that makes this generation so odious to traditionalists is all over the environment and simply concentrated in the young. The young are unmediated and unreflexive (lacking “meta,” if you will).

    Nicholas Carr's The Shallows: What the Internet Is Doing to Our Brains is a provocative, and I think essentially true (if premature), summary of one facet of the generation. Too few people read the book, or too few took it to heart. Carr examines purely the informational and cognitive effect of the Internet. If we imagine that a topic can be mapped spatially, we could think that it would have, at the top, “apple trees,” and then “uses for,” “history of,” “diseases and pests,” “types,” “cultivation,” and a number of other topics one level below. Each of those subtopics has below it a tree of further topics. This branching continues until the information forms a taxonomy that is familiar to all of us who grew up in the paper book age. It is the Linnaean system, after all.

    The Internet has squashed the hierarchy of knowledge into a single search line. The web has, Carr argues, made knowledge simultaneous and eliminated the capacity of learners and thinkers to hold taxonomies. We do not develop depth, in other words, because our information is all planar. Additionally, he focuses on the myths of “multitasking.” Multitasking is supposed to be a compensatory skill that allows for dual attention, like the superior aliens in Arthur C. Clarke's Childhood's End who could read a book and have a conversation at the same time. However, neurologists report that no one multitasks. In fact, “millennials” are worse at multitasking than their elders, and all they do is, as we used to say when denigrating Windows 3.1, “task switching.” There have been confirmations since the 2009 Stanford study as well: swapping tasks (cell phone, driving; phone, class, notes, chat) means doing several things very, very poorly. What's more, the younger the person, the more the person believes his or her performance is excellent. Thus, as Carr argues, the interconnections are creating a flat, unremembered, and therefore unprocessed cognition.

    My colleague's question was along a different line. He had noticed something else. He said, “The i-phone, i-pad. . . the I is the important thing.”

    He's not entirely wrong, but it would be a terrible mistake to retreat to the comfortable shell of the cultural conservative's charge of egoism. Let's leave that for George Will's monomania. Saying things like that is slightly more useful than throwing a spear into the ocean: the tide is coming in anyway (a reference to Caligula, not Cnut).

     The reason that I believe we are right to note that Something Happened is that it happened, and is happening, to all segments of American and other technophilic cultures. It may be easier to see in the United States, where there are virtually no cultural institutions or governmental backstops against cultural change, than in European nations, where traditions and various quasi-governmental or governmental institutions and requirements might slow a change like this. (E.g., studying for A- and O-levels might be a national reinforcement for traditional, taxonomic knowledge and layered thinking.)

    Prior analyses miss the point: neither the computer nor “the Internet” is responsible for the "short now" and the alterations we are seeing in consciousness. The Internet is the connection between various networks – primarily e-mail, Usenet, and the world wide web. It is little more than TCP/IP and the hardware of routers and hubs that should amaze a space alien and NSA officer alike. An apparent "acceleration" has been palpable since the 1790's, when S.T. Coleridge and W. Wordsworth complained in the "Preface" to the second edition of Lyrical Ballads that newspapers were making the world transient.

     No: let us be frank: what has made the change is the cell phone and the devouring of “the Internet” by the world wide web and the decision to open the web to .com's. When its history is written (on flash card), the allocation of the .com's will be the moment that changed everything. What's more, my bedraggled generation knew it at the time. We sported “The Internet's full: Go Away!” t-shirts when AOL was allowed onto e-mail. We were not thrilled that CompuServe was going to invade our paradise. The idea that graphics were going to be expected on an HTML site struck me as vulgar. Even after every thirteen-year-old wrote Wikipedia articles on the “company” that he had made up with his best friend (Zack n' Tay Limited Game Designers, yo) and went into war mode because it was deleted that night, the mind of the generation did not change.

Supplement vs. Replacement

    Carr's central thesis – the loss of depth to knowledge – was underway the moment hypertext's promise began to be diverted into a syntax for pictograms, but there was no guarantee that it would dominate the American and Canadian mind. Hypertext and graphics are fundamentally in tension. I was angry in 1997 because I saw pictures taking over, when hypertext offers limitless learning. The promise of hypertext had been an eternally expanding, encyclopedic structure of knowledge whereby I could make all of my footnotes link, all of my allusions link, all of my obscure jokes link to a page where I explain them, and, were I a fiction author, I could offer layers of additional comment by hypertext. HTML could mean layer upon layer of meaning placed additively onto a piece of information, with the links acting as valves that would suggest a hierarchy of topics, but it was never used that way much, due to the preference for graphics and the Babel-like quarrel of languages, fonts, and monitors. That model of hypertext – lateral linking to supplemental information – suffered at the hands of a commercial impulse to replace knowledge, and the .com HTML virtually strangled it in its crib with a world wide web where every site was illustrated first and foremost, and the HTML was a method of getting the pictures to stick and click.

    The non-profit and educational use of hypertext to supplement never quite died, even if it never developed. It is part of Wikipedia, for example, and some non-profit bloggers use it (e.g. DailyKos, Crooks and Liars, ThinkProgress). However, the commercialization of the world wide web meant that a revenue model was built on advertising, and advertising could be metered in “clicks” and “eyeballs.” Therefore, it was and remains in the interest of each newspaper and commercial entertainment site to limit supplemental information. For example, to supplement my allusion to Caligula, I linked to a web page on British history, and it, being a non-profit, had links to other web pages; if you click on one of these, I “lose” you. If I were making money off of every click, then, I would be a fool to give that link. Instead, I would have guidelines like the following:

1. If you need to explain a term, link to an on-site page.
2. A pop-up that can be sponsored with a single line is ideal for an explainer, as it will not risk losing the “eyeball” or “click.”
3. Other links should be to other exciting stories or compelling teases that are on the site.
4. Toward the top and bottom of any content should be an image or tease that will lead to another click on the site.
From a commercial point of view, HTML and XML coding should be streamlined specifically not to supplement the information, but rather to be null (the explainers) or to replace one revenue click with another. A person who goes to Sports Illustrated online may go to four stories and have no idea of what he or she has read. (There is no alternative but trying it for yourself.)
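
    To make the contrast concrete, here is a minimal sketch of the two linking models (the page names, paths, and class names are hypothetical, invented only for illustration). The first snippet is the supplemental model, which risks “losing” the eyeball to another site; the second is the commercial model, which keeps every click on the property.

        <!-- Supplemental model: the link adds a layer of context, even though it leads away from the site -->
        <p>... a reference to <a href="http://example-history.org/caligula">Caligula</a>, not Cnut ...</p>

        <!-- Commercial model: a null, on-site explainer plus a tease that replaces one revenue click with another -->
        <p>... a reference to <a href="/glossary/caligula" class="popup-explainer">Caligula</a>, not Cnut ...</p>
        <a class="tease" href="/stories/another-story">More: you won't believe what this emperor did next</a>

    The markup is nearly identical; only the destinations differ, which is why the shift from supplement to replacement was so easy to miss.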

My cell phone ate my house!

    The first appearance of the mobile phone was as a luxury item, a feature of ostentatious display worthy of Thorstein Veblen. Nowadays, satirists ridicule the way that such a commonplace and cheapening feature of everyday life was once so impressive, but after its first appearance as an alluring object of desire, the phone reappeared as an object of pure utility, and this was the second phase of its life: business. Executives were not branded with these phones, but deliverymen and drivers were, at least initially, so that dispatchers and bosses could track their locations. Later, middle managers got the phones to be “in pocket” when they left work. The phone was not a reward, but a sort of voluntary ankle bracelet monitor. Next, and perhaps currently (although I doubt it), Christine Rosen documents how men employed cell phones as lek items in their mating dances in singles' bars. Rosen's research was from 2004, and I suspect that, these days, the gender display has altered, as I observe women, rather than men, using cell phones as items of competition and display. Each of these moves shows a masking and mutation of the underlying function and quality of the technology as tool; the utility of the device is in masquerade. If we look through history, we can also see that this is routine: the important thing to any technology is to obscure its utility (my beloved personal computer was a way to play cool games, after all, and, secondarily and resentfully, a way to run spreadsheets).

    In the earliest days, cell phone carriers advertised and promoted a high revenue stream that was already mature in Japan: the text message. Americans were slow to warm to it, but they succumbed eventually. Text messaging is one of the innovations that curmudgeonly computer users of the first PC generations already had strong feelings about. After all, it was the spawn of AOL and Yahoo instant messaging, and it encouraged the sort of vapidity that we associated with the Wrong Sort of People. Consequently, there was a conservative backlash in the online world that made it “unhip.” The backlash simply aged out, and the cell phone sublimated the entire medium (the Internet itself).

    “Convergence” was the dream of telecoms from the moment of AT&T's deregulation. One of the interesting features of the reporting, book, and film Enron: The Smartest Guys in the Room is that Enron's fall was due to dreams of graft through convergence. What provoked investigation into the firm was the splatter of fictional accounting and the overt evil of creating a shortage in energy so as to profit from a futures market in energy, but what made for the run-up in the shares in the first place was Enron's claim that it would provide video on demand via the Internet through a proprietary pipe.

     You see, the idea of owning the pipeline by which consumers would get their “content” has been the mouth- and pants-watering concept that has driven investors in telecoms for decades. Imagine a day when the consumer pays $1 to the phone or cable provider to watch a movie, and $40 a month to the phone or cable provider, and then can pay an extra $5 a month to the phone or cable provider for super speed, and then $.05 per text message. Meanwhile, the publisher of the film or television show will also have to pay the cable or phone company a very large amount of money to get the product to those consumers, because the pipeline will be proprietary, and the consumers will be watching ads (and paying). You can be forgiven for thinking of your Kindle now.

    Is it any wonder, then, that the Palm Pilot is all but gone? Are you surprised that Blackberry is dying? The question was whether the telephone would eat the PDA (personal digital assistant) or the PDA would eat the telephone. At home, there was never any question: the television was going to eat the personal computer from the very beginning. The cable and satellite companies had considerable interests in making the television the single portal for all things in the home, including the telephone. The phone companies had, and have, an interest in ensuring that the telephone replaces the computer. Since these are merged and integrated corporations (e.g. Sony producing movies, games, computers, televisions, and phones), there is ample power behind the push. Thus, the personal computer is now a laptop, and the laptop is getting more and more PDA-like, so that it can be easily replaced by the telephone.

    If I were dystopian, or a technology writer, I would predict that the future would have no PC's in it. Desktop computers would only exist as servers. All consumers would have notebook computers, and these would be phones, and phones would be laptops. Thus, everyone would always pay a monthly connection price for telephony and Internet, and client-side software (programs one actually owns and controls rather than rents) would all but disappear. I suppose such predictions hardly take a crystal ball these days, but there is still every chance of being wrong.

My Cell Phone Is My Home

    The cell phone has produced two mutations of mental space. The first showed in its early use by draymen: it makes the person always “at work.” Work ceases to have a physical definition in the mind of the worker, because there is no way to leave its anxieties and obligations. People have always taken work home, but taking the workplace home is a new feature. As workers have moved into offices and changed their work from physical to mental activity, the work has come to consist of managing personalities, performing business maintenance, and analyzing productive and consumptive patterns. These intense and anxious activities now travel by the web, e-mail, and, most consistently, the instant message. The phone is always on so that the worker can access leisure, and that paradoxically means never having any leisure, because she or he always intersects with “at work.”

    The Instant Message is an always-on application. It is “terminate and stay resident” in mental space. Unlike e-mail, an instant message interrupts. It rings like a phone. Hence, it could “go off” at any moment; it is involuntary. It therefore introduces the other spatial mutation: it makes every place “home.” Just as we never before took the workplace home, we never before took home to work. The network of friends and family that constitutes relief and personality maintenance is available for reference at any moment via the instant message, in exchange for being available as a resource for each member of that network to the same degree. Those who turn their status to “unavailable” will lose their place in the network, and those who turn off their devices will lose the support they seek.

    These spatial mutations have led us to a familiar sight. Where fast food restaurants famously had “no personal calls on company time” policies, they now have counter workers with open cell phones who are distracted by customers. Executives are troubled both in meetings and in leisure by the presence of the other inside the phone. The young in their high school and college classes have surreptitious conversations not by whispering, but by texting, and co-workers go to lunch together to sit silently across from one another and stare at phone screens. The maintenance expectation of the phone's presence is such that children turning off their phones can create fears of disaster among parents.

    What has happened is that the device has amalgamated all spaces into one space. Rather than saying that it is an expanding ego, it is better to think of it as a vanishing self. The “I” of the smart phone user is not growing vacuously selfish, but rather numbly evacuated, as it has been robbed of home and replaced by a non-space that cannot be considered a person and that has no rights -- just obligations.

Narcissus/Narcosis

    Marshall McLuhan reminds us that the myth of Narcissus is not about a narcissist. Narcissus was a beautiful youth who fell in love with his reflection. He did not love himself, but rather loved his image. If you cannot read McLuhan's prose, at least see McLuhan's Wake.

    Each piece of technology offers us a beautiful image of ourselves at first. It anesthetizes us. It says, “I am an object of beauty, not a smoke-belching automobile.” It says, “I am the most desirable thing in the world, and I will make you yourself, only better and more beautiful and stronger.” And it delivers on this promise, as well. It is only later, much later, that we discover that our workplaces are twenty miles from our homes and that our air is dirty and that we pay for gasoline every month. Technology is the mechanical Maria of "Metropolis."

    Just as the personal computer's actual attraction in its early days was games, and the earliest attractions of the Internet seem to have been pornography (Usenet alt.binaries for photos long before the web), so they were the mosquito's saliva. The cell phone said, “I am 'Star Trek' communicator cool,” and “Why wait to get on the web?” and “Why not stay in touch?” It also said, “Be stronger than you are: never be stuck in an emergency again.” (The cell phone users bewildered on 9/11/01 were a thing I won't forget.) Facebook said, “Get Grandma to see the baby” and “Keep in touch with your kooky college friends” and “Share your wiiiiild opinions with your friends, without those nasty people there.” It was only later that we realized that police were routinely turning on the GPS feature of smart phones without warrants and that telecoms were routinely giving the FBI locations (eight million times in a year for a single company). Only later did we realize that Facebook had generated 11,000 pages of data on a person and retained them six months after the account had been deleted.

    Only later did we realize that our mental spaces had collapsed.

    The collapse of the hierarchical organization of information that the flat plane of the Google search bar introduced, the borderless, one-site, replace-over-add organization of the commercial web experience, and the encapsulating, app-friendly, user-cosseting approach of Facebook have together meant a change in the way that we can think. Nicholas Carr said, on “The Colbert Report,” that he would be unable to write his undergraduate thesis today, because he could not concentrate so deeply anymore. Whether or not that is true (whether we can regain old structures as easily as we lose them is an open question), a change seems to be underway for us adults, and it undeniably shapes the young.

    Walter Ong, S.J.'s Orality and Literacy proposed the thesis that literate societies think differently than pre-literate ones. There is a great deal of evidence for his observation, and McLuhan spoke of the same thing. Oral societies have long, long memories. They memorize epic poems word perfect. They hold catalogs in their memories. However, they sacrifice analytical functions for this capacity. They are not inductively flexible, because they are not ready to forget. (To accept a concept like the germ theory, one must be ready to abandon humors and airs.) Literate societies displace their memories onto written words and can be fantastic analysts, but they lose the capacity for long knowledge and layered culture in the process. In essence, they accept a shorter “now” in exchange for a more logical one.

    What, then, happens when we displace the word onto the search bar? Strictly speaking, we are not literate anymore. We are not relying on books. We are relying on searches. When did you last look something up with finger and eye? What happens to memory when there is never, ever a need to reach for a book?

    Let me repeat that for a moment and add all of our elements together: home, work, school, bank – all of these are the phone. “You” are nowhere and all places simultaneously, just as each of those places is “here” and “now,” although time is not stacking in memory, but rather on a remotely stored server 'wall' or blog. There is no shelf of books, no order for those books, no searching through such a shelf, and no top or bottom. Nor do you have to travel to a place, or even find an Internet site to search on, to gain information. It is possible to say, as more than one student of mine has said, “I don't go on the Internet. I just use Facebook.” This is because the distinctions have no conceptual meaning and no active difference. There is no place, no site, because all are one. When that occurs, then what happens to memory? What happens to organizing time?

    If we want to be truly chilled, what happens to culture? Narcissus has an image of himself reflected to him so that he believes he is seeing beauty, believes that he is seeing the beauty of himself, but it is only an illusion. Narcissus, we recall, starved to death. He could not look up from the small reflection before him to know where he was, what time it was, or who he was.

There is no condemning of this or that thing. The tool is not the problem, after all. It is our tool-mind, our innately cybernetic culture -- we form the tool to resemble us, and soon we resemble the tool. Perhaps we can realize why we are hungry.

Originally posted to A Frayed Knot on Tue May 29, 2012 at 10:10 AM PDT.

Also republished by Readers and Book Lovers, The Royal Manticoran Rangers, and Community Spotlight.


  •  Give us some conclusions (2+ / 0-)
    Recommended by:
    Neuroptimalian, ozsea1

    A writer needs to say something new, or say it in a novel way, and I'm not entirely sure you've done that. It really took me some digging to ferret out your thesis here amongst the verbiage.

    "Technology is changing our minds, probably for the worse." is what I got. You give us a rehash of Mcluhan on technology, "The medium is the message" is a message we've heard before.

    In addition, you seem to be basing your conclusions on one book, by a man who is simply not any kind of a scientist. His writing is not based on any sort of scientific studies and his focus seems to be "General Lud was right! Technology is bad, it is turning us into idiotic zombies!"

    In conclusion, you managed to take a very, very long time to say very little that has not already been said, and none of it is backed up with any sort of science, only anecdote and essay. Sorry for the negative review but technology is my field and this is twenty minutes of my life I'm never getting back, so I feel compelled to register a complaint.

    •  I hope you lose much more time (7+ / 0-)

      A writer needs to be a writer and not a blunderbuss. I anticipate an audience that is sophisticated, but as Tolstoy said, it is not possible to explain anything, no matter how simple, to a person who has already made up his mind that he knows what it is.

      I do not give a rehash of McLuhan but adapt McLuhan's observations to a temporality and commercial model that revolves around a zero point, atemporal phenomenon. This is not "the medium is the message," but rather that there is no message when convergence is involved. If you do not believe the Stanford study or any of the neurology is "science," then your definitions must be restricted to what confirms your opinion.

      "General Lud?" Do you have any idea of history? Ned Lud was not a general. Furthermore, his protest was not anti-technological: it was pro-labor and anti-capitalist. Finally, if you think in terms of technology being "good" OR "bad," then you are a fool indeed.

       Technology has no qualitative value, and there is not going to be a conclusion. Simpletons write up prescriptions or product reviews. Wise people recognize what the cultural forces invested in a technology are doing and what the technology is doing to cultural expression and how these conjoin. I can only hope that you are discomfited over and over again by people who refuse to give you platitudinous conclusions. Since you won't even click a link, you need to reorganize your mental space a bit.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Tue May 29, 2012 at 11:14:44 AM PDT

      [ Parent ]

      •  Seth's not a fool... (1+ / 0-)
        Recommended by:
        ozsea1

        He's a fellow smarty, and a good guy with a good heart too. We knew each other in person when he was in the Bay Area.

        He's coming across slightly grouchy today.  I was grouchy earlier this morning but I'm not at the moment:-)

        "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

        by G2geek on Tue May 29, 2012 at 11:58:43 AM PDT

        [ Parent ]

        •  Sorry. Luddite, All said before, Just McLuhan? (9+ / 0-)

          I'm not going to praise my own work or defend its novelty, but to see nothing new in what I've done? To then dismiss the whole thing as a naive and "unscientific" anti-technology piece? To go for the ultimate philistine insult of "15 minutes of my life back?" That practically screams "I'm a journalist, and I don't read much."

          First, I reject the very, very childish pro-/anti- technology. It's boring, and it leads to the ridiculous state we're in, where journalists and bloggers either write about how X technology will bring peace on earth or how it will make our heads explode. Those worldviews are worthless -- both of them.

          Analysis means analysis. It means looking for the why of the what, and I think there is some serious extension to the intersect of capitalism, corporatism, executive growth, and the simultaneous technologic erosion of self, place, and time. It took the one to help the other, and the other aids the one, and the two together really seem to portend narcosis.

          Every reductio ad absurdum will seem like a good idea to some fool or another.

          by The Geogre on Tue May 29, 2012 at 12:49:15 PM PDT

          [ Parent ]

          •  i'm with you on this one. (5+ / 0-)
            Recommended by:
            The Geogre, ozsea1, bkamr, native, walkshills

            I think you made a bunch of good points, and i'd reply to them in some detail but for doing this in the middle of my work day in between programming stuff and taking client calls.  

            The best times to run lengthy think-pieces are in the evening when people are off work and can give them more attention.

            Anyway, Seth's a friend of mine and I don't want to get in the middle of an interpersonal dynamic.  My point was basically to say you shouldn't take it personally, but apparently that did not come across effectively in its previous iterations.

            "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

            by G2geek on Tue May 29, 2012 at 01:32:59 PM PDT

            [ Parent ]

      •  "Technology has no qualitative value" (1+ / 0-)
        Recommended by:
        The Geogre

        you should specify when you say this.

        Clothing is a technology. Toothbrushes are a technology. Running water is technology. I'd say all of these have qualitative values.

        I did rec the diary but almost all of this I'd read before in other forms. Yes, we're distracted, and increasingly "dumb." What is the conclusion/solution?

        I'm struck by how the meanest, cruelest, nastiest people brag about how they live in a Christian nation. It's rather telling.

        by terrypinder on Tue May 29, 2012 at 12:44:33 PM PDT

        [ Parent ]

        •  That is not what I was saying (7+ / 0-)

          First, I would say that yes, technology has no qualitative value. Running water can be clean, potable water, or it can be a way to take water from poor people. The toothbrush requires dental health culture. The quality comes from the human culture, not the technology.

          Hobbes said "true and false are attributes of human speech, not things. Where humans are not, there neither are true or false." The same could be said of good and bad.

          No, we are not increasingly dumb. In fact, I would say that our intelligence must be unaffected. However, our mental organization is affected, and severely. Nor are we necessarily distracted. Distraction implies that there is a foreground activity.

           My argument is that our reorganization, or at least the reorganization suggested by the present capital, corporate, and technological convergence, is toward an obliteration of priority itself. This seems to me, and probably to all of us, like a complete disaster. I cannot imagine analytic thought being possible without historical and organizational thinking.

          However, it's possible that the evaporated place and time may make us emotive or unemotive. I cannot tell, but I believe that, short of self-imposed strictures, the change is already underway, and not because "we are distracted," but because major corporations make money off of it and we don't have homes anymore.

          Every reductio ad absurdum will seem like a good idea to some fool or another.

          by The Geogre on Tue May 29, 2012 at 01:32:25 PM PDT

          [ Parent ]

          •  "the Miasma" (2+ / 0-)
            Recommended by:
            walkshills, The Geogre

            This is the term I've come up with to explain the inability of our society to come to grips with the present and all the difficulties that modernity is presenting.  I feel like you've defined an essential part of this - the changing way we think and the blurring of boundaries such that we can't define anything anymore.  Anyway, I really enjoyed this...more on this please.

            The Tree of Liberty must be refreshed from time to time with the tears of morons.

            by Doolittle on Wed May 30, 2012 at 09:00:26 AM PDT

            [ Parent ]

            •  Orwell/Huxley/Postman (0+ / 0-)

              I am working on an Orwell/Huxley/Postman analysis. It's more literary than this, but it's done for the payoff of a political and economic analysis.

              The way I work, I'll write the thing and then take a week to find all the typos and messes, and then it'll take me longer to find all the places where I assume everyone knows what I'm talking about but, in fact, I'm referring to something that's either specialist stuff or in only my own head.

              I'm a-working on it, though, and Summer is writin' season (and job applyin' for, I guess).

              Every reductio ad absurdum will seem like a good idea to some fool or another.

              by The Geogre on Thu May 31, 2012 at 07:55:22 AM PDT

              [ Parent ]

    •  yo bro' what's got you all grouchy today? n/t (1+ / 0-)
      Recommended by:
      Chi

      "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

      by G2geek on Tue May 29, 2012 at 11:42:58 AM PDT

      [ Parent ]

    •  When people state their belief ... (0+ / 0-)

      that they were "promised" things (flying cars being a great example) that were never promises, merely ideas and dreams, you shouldn't be surprised to find little cognitive depth is in play.  ; )

      "Two things are infinite: the universe and human stupidity, and I am not sure about the universe." -- Albert Einstein

      by Neuroptimalian on Tue May 29, 2012 at 04:33:19 PM PDT

      [ Parent ]

      •  I remember predictions (3+ / 0-)
        Recommended by:
        Chi, ozsea1, aravir

        There have always been several layers, and the villains are journalists, and especially technology writers. As I've said, they today fall into "This? This is going to be great! Just get the new iPhad 9 and plug it into your HDTV with a new HDMI cable in your integrated mancave, and it will make life perfect" or the "This single program is going to bring down the power network and make all children stupid" camps.

        This is because they have to find a take away. They have to have a good or bad. They have to have a promise. Listen to a scientist get interviewed. "So, when will this be a vaccine?" "Uh, we just noticed the molecule." "Do you think we'll have a drug within a year?" "This is all just a discovery right now, so..."

        I remember the promises indeed. Furthermore, I can take you to Alvin Toffler for solid predictions of how the #1 problem was going to be too much leisure. The home robot was going to be everywhere. Just because scientists have always been cautious, do not think that we did not hear what we heard. In fact, just go read Future Shock. From 1970 - 1980, futurists made all sorts of "by 2000" predictions that were hilariously utopian.

        Every reductio ad absurdum will seem like a good idea to some fool or another.

        by The Geogre on Tue May 29, 2012 at 05:48:29 PM PDT

        [ Parent ]

        •  Predictions are merely the opinions ... (1+ / 0-)
          Recommended by:
          ozsea1

          of humans with (usually) unremarkable IQs.  I prefer to arrive at my own conclusions after studying the issue rather than set myself up for disappointment or worse.

          "Two things are infinite: the universe and human stupidity, and I am not sure about the universe." -- Albert Einstein

          by Neuroptimalian on Tue May 29, 2012 at 06:53:51 PM PDT

          [ Parent ]

  •  I am going to keep (8+ / 0-)

    my phone.   I almost never go more than 15 miles from home in any event on a day to day basis, so they'll find me if they want me.  I can leave the phone out of hearing distance and I do.  

    Humans are almost always owned by their possessions, regardless of what they are.   Learning to own our possessions is an important part of gaining wisdom.  Few of us reach it, fewer seem to want it.

    Which brings us back to the shallow, non-reflective nature of internet information.   Information isn't knowledge, certainly not wisdom.  We still have to engage our brains.  Quite frankly, I don't think most people in most generations have done that to any serious degree.

    •  here, try this: (7+ / 0-)

      http://users.ox.ac.uk/...

      Reprint of a paper on Gödelian incompleteness, human minds, and machine simulations of human minds.  (Author argues that minds are noncomputable.)  

      Brainfood for smarties, and nothing but good ol' text.

      There's plenty more where that comes from.

      Pick any topic on Wikipedia (philosophy and the sciences suit me fine) and follow the "see also" links at the bottom (something I predicted in the early 1980s), and you'll find all kinds of good brainfood.  

      As with everything else, it all depends on where you look.

      "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

      by G2geek on Tue May 29, 2012 at 11:56:46 AM PDT

      [ Parent ]

      •  I click links (3+ / 0-)
        Recommended by:
        The Geogre, G2geek, ozsea1

        and I hate too many pictures unless it is identify the bug, bird, tree leaf variety.

        But I am always amazed at people who show up at a forum or blog site and ask, how do I find 'X'.   If google/yahoo, etc. got you to a forum, it can get you to the rest of the information sites as well.   People can be very lazy intellectually.  Me included.

      •  Encyclopedic v lexical (4+ / 0-)
        Recommended by:
        G2geek, ybruti, Susan from 29, ozsea1

        There is an old distinction found in Umberto Eco's Opera Aperta on the difference between lexical and encyclopedic information. The one is delimiting, and the other is inclusive. Lexical information attempts to distinguish one meaning from a vast array, while encyclopedic knowledge attempts to contextualize an already known piece of information in all possible contexts.

        The early HTML offered such a possibility, and it was frightening what it was going to do. Of course it didn't last very long, but it's still out there. The dream was for real HTML editions. A very serious HTML Moby Dick, for example, would have links to all the scholarly articles that discuss the various elements, as well as original materials Melville used, as well as supplemental texts on the age, as well as historical context, as well as biographical material, as well as contemporary reactions, and this would be for every line of the text.
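
        A minimal sketch of what one annotated sentence of such an edition might look like (all of the file names and link targets here are hypothetical, invented only for illustration):

            <!-- One sentence of the text, with layered supplemental links added rather than replacing it -->
            <p id="ch1-s1">Call me <a href="notes/ishmael.html">Ishmael</a>.
              <a href="criticism/narrator.html">[criticism]</a>
              <a href="sources/genesis-16.html">[Melville's sources]</a>
              <a href="context/whaling-1851.html">[historical context]</a>
            </p>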

        Every reductio ad absurdum will seem like a good idea to some fool or another.

        by The Geogre on Tue May 29, 2012 at 01:07:00 PM PDT

        [ Parent ]

        •  though one should be careful about.... (3+ / 0-)
          Recommended by:
          nickrud, ozsea1, ChocolateChris

          ... over-linking, because that just creates another overload.

          The way I intended "See Also" in the early 80s, texts would have footnotes or end-notes, and those would be links.  Except the mouse hadn't been invented yet, so they would be numbered, and you'd press a key and enter the number to go to the linked item.  The numbers on the pages for the footnotes would be something like "speed dial" numbers that stood in for the actual index numbers (like telephone numbers or Dewey Decimal numbers) of the items.

          So you might see something like this:

          "... thereby demonstrating the dualistic nature of photons via the two-slit experiment (1)..." and at the bottom of the page, (1) would include a brief note about what was linked e.g. "See also Two-slit experiment"  So you'd press a key and then the number 1, which would call up the speed-dial embedded in the page, and that would bring up the "linked" item.  And there would be a horizontal row of numbers at the bottom of the screen whereby you could have more than one "thing" available at once by typing another key followed by the one-digit number of the relevant page or document (much as we think of "tabs" today).  You could also get at those with the left and right cursor keys.  The idea was to create the equivalent of having multiple books open to whatever pages, and go back and forth between the books.

          You could also download store catalogs overnight (with pictures of goods), and thereafter be able to place orders with the stores.  You'd fill out an online form for items to be delivered, and pay for them with electronic bank transfers, which of course would be transmitted in encrypted form.    

          A lot of people were thinking along those lines back then.  Or at least I have to believe they were.  

          Some day soon I may get the chance to build parts of that thing, if nothing else, as a historical curiosity.  

          "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

          by G2geek on Tue May 29, 2012 at 01:56:29 PM PDT

          [ Parent ]

      •  Love the link (2+ / 0-)
        Recommended by:
        G2geek, aravir

        Right in my wheelhouse: 1960's rationalist anti-rationalist neohumanism (if we must give it a name).

        I love that stuff and feel great kinship to it. That was my native soil, in a way -- all of the Kierkegaardians and Americans inventing Zen. The backlash against Behaviorism did not end with Arthur Koestler's The Ghost in the Machine (another thing that people mocked because it wasn't written by a scientist). Koestler made many naive mistakes, but other people read it and kept the fire. Paradox's heart is the human irrationality.

        Heck, we can calculate with numbers that we don't know, like pi. What kind of crazy person puts that into an equation? [Then again, philosophy of mind folks get... well... there is a local one, and I just have to smile and get to a neutral topic, like how silly Julian Jaynes is.]

        Every reductio ad absurdum will seem like a good idea to some fool or another.

        by The Geogre on Tue May 29, 2012 at 01:39:00 PM PDT

        [ Parent ]

        •  re. Jaynes: (2+ / 0-)
          Recommended by:
          The Geogre, aravir

          He was writing long before we figured out (only recently) that dyslexics (among them yours truly) have their verbal processing spread across four smaller areas at the "corners" of the brain (from an overhead plan view) rather than in one larger area on the side opposite the writing hand.

          So it seems to me that finding somewhat falsifies Jaynes, or at least creates exceptions.

          Consciousness is one of the toughest nuts to crack.  My inclination is to think that David Chalmers' "interactionist theory of mind" is correct (minds are composed of information interacting with certain types of physical structures such as neurons; where "information" is a fundamental constituent of the universe, a less radical position than Wheeler), and that Penrose & Hameroff's "orchestrated objective-reduction" theory of consciousness is correct (neural computation carried out at the level of proteins in the skeletal structures of neurons, susceptible to quantum mechanical effects and thereby not wholly deterministic and not reducible to algorithms; this would also provide a physical basis for free will).  

          Though on the other hand, simply postulating a variation of mind/body dualism solves a lot of it neatly, though has the problem of being not quite unfalsifiable but at least difficult to operationalize falsifiably.  

          "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

          by G2geek on Tue May 29, 2012 at 02:05:59 PM PDT

          [ Parent ]

          •  Recovered dualisms (3+ / 0-)
            Recommended by:
            ozsea1, G2geek, aravir

            Julian Jaynes was a joke to me when he came out, because his proof was stuff from Homer that was quite literally untrue and material from the Old Testament that was plainly false. He didn't have data, and he wanted the corpus callosum to develop way, way too recently.

            For example, he wanted Homer, 800 BC, to have a bicameral mind, where his right brain only came to his left brain as a vision/voice of a god. Such a person cannot empathize, for example, and yet there is Iliad (the earlier poem) VI (lines 470-1), where Hector is going to take his son into his arms. However, his infant son cries, and Hector realizes that the child is frightened by the horsehair helmet, laughs, and takes it off. This feat of imagining a man imagining the perceptions of a child is impossible by Jaynes's version.

            Jaynes wants all the "primitive" Israelites to be grunting along, hearing their right hemispheres, and thinking they've found God.

            He wasn't aiming for much, there.

            As for me, I haven't a clue. All I know is that we end up recovering dualism in all our endeavors, even those where we physicalize consciousness, because there is a central paradox. The observer must account for itself, and the mind must account for its own definition of itself before it can say where it is, and, so long as the mind is defining the mind, the definition is paradox.

            Any logical positivist would reject the idea out of hand, and so we are always, always testifying to the fact that there is something not physical in the calling of the concept "mind."

            Every reductio ad absurdum will seem like a good idea to some fool or another.

            by The Geogre on Tue May 29, 2012 at 04:18:50 PM PDT

            [ Parent ]

            •  ooh, nicely done, very much so. (1+ / 0-)
              Recommended by:
              aravir

              Your takedown of Jaynes is well-put and conclusive.  Though I'm not sure I'd agree that patients whose corpus callosum has been severed (some epilepsy cases that don't respond to medication) lose the capacity for empathy, because there don't seem to be clinical data demonstrating loss of empathy as a personality change for those patients.  Though it may be the case that once a person has developed the ability to model the subjectivity of others, that ability sticks with them like any other.  

              "Recovering dualism," you're hinting at Gödel there, or I'm reading something into what you said, because I just realized that Gödelian incompleteness rules out "mind as mechanism" theories including strong AI.  (Why I didn't get this earlier, I have no idea, since I've been studying this stuff since an early age.)  The computations that occur within the system are incomplete without some kind of agent outside the system.  

              If per Wheeler, information is "the" fundamental constituent of the universe (and he was a hard-core skeptic about anything even vaguely "spiritual"), or per Chalmers, information is "a" fundamental constituent of the universe, then the answer is that information (in a semantic sense as well as or instead of a Shannon sense) is the agent outside the system.  In which case it has to be self-organized and willful in order to have any kind of consistent causal relationship with the operations inside the system.  In which case that's what religion calls "the soul."  

              This is all theory backed by logic without sufficient empirical results yet, but at least it points to where the empirical results might be sought.  Though Hameroff's people are working on empirically falsifiable hypotheses at this point, and as might be expected, they are getting mixed results, with some support and some falsifications.  From all of this will emerge a more accurate theory, which is all to the good.

              "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

              by G2geek on Wed May 30, 2012 at 12:40:24 AM PDT

              [ Parent ]

              •  Hinting Godel & Frege & Russell (1+ / 0-)
                Recommended by:
                G2geek

                No system may be verified by itself -- Frege.
                Language systems invalidate the logical systems and cannot be uprooted -- Wittgenstein.
                Mathematics must be incomplete by reason -- Godel.
                Paradox survives even a linguistic purge, and there are mathematical statements that are unresolvable paradox -- Russell.

                To that, I would add Hegel's notion of history: the historical moment is such that it determines where we are and what we think, and we only know anything at all about what and where we are by examining the past, because no one inside the moment can see it.

                What is observing? How is the observing thing accounting for itself? Is it doing it with itself? If there is supposed to be an outside system for reference, then what came up with that, except the very same object of enquiry?

                The enquiry shows us just this: the mind has a concept of "mind," which means that the mind thinks there is a thingness there that is different. It splits into two and then various minds argue about the quality of the thing they wish to prove exists.

                Every reductio ad absurdum will seem like a good idea to some fool or another.

                by The Geogre on Wed May 30, 2012 at 04:34:32 AM PDT

                [ Parent ]

    •  Thetic qualities (2+ / 0-)
      Recommended by:
      ozsea1, Leftcandid

      (In phenomenology, the thetic quality is the quality of enacting the soulfulness and corresponds to the lived life.)

      You certainly may keep your phone. The question is whether your mental space will be infected by it. What you say is true, in that people have carried anxieties from work home and from home to work, but the cell phone's contact list and always-on status mean that the workplace and homeplace have no separation unless you physically or mentally unlink them.

      Can you also keep your books? Can you keep your time? The question that today poses for all of us, and this includes me as much as anyone -- after all, I never said I was without a cell phone or text message account -- is whether we can retain structure and distinction when culture (home/friend), convergence (phone = all boxes) and capital (each corporation invested) and production (all businesses seeking 'productivity' and marketing by these means) are seeking to eliminate it?

      The fight is inside.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Tue May 29, 2012 at 01:01:42 PM PDT

      [ Parent ]

      •  sure my mental space (3+ / 0-)
        Recommended by:
        Dixiedemocrat, The Geogre, ozsea1

        and my physical spaces are affected by it.   But I can actually think about it, analyze it and set limits on it.  That seems to be the problem: people who let the phone set the limits.  Those people also probably can't discipline their kids, train the dog or set limits on any of their other relationships.  Seems to be another skill to learn to manage modern life.  Why do we let phones be more important than people?  That's been happening since it was invented.  No matter what we're doing, we let the phone train us to drop it and answer.

      •  The workplace and the home (1+ / 0-)
        Recommended by:
        ozsea1

        for the vast majority of people, for just about always, has been the same place. Especially for women. I think you're overreacting to the technology.

        Try to shout at the right buildings for a few months.

        by nickrud on Tue May 29, 2012 at 06:41:17 PM PDT

        [ Parent ]

        •  Ah! Pre-capitalist and capitalist models (1+ / 0-)
          Recommended by:
          Evolution

          I think you're discounting capitalism, which I was trying not to do. While people can say that they have "heard it before" and mean that they have heard conclusions before, I try to analyze and synthesize.

           1. Pre-capitalist productive models involved a merge of work and home that was complete. In The Making of the English Working Class there is a good discussion of the weaver's trade, for example. Both carders and weavers worked from home. As they did so, they only had contact with the buyer or contract owner quarterly. Consequently, their home was home on a day-to-day basis. They had child labor, female labor, elder labor, and anything else, exactly and only as the economic need and family situation dictated.

           Pre-capitalist agricultural workers were and are family oriented as well. Read "The Thresher's Labour" by Stephen Duck, if you want a first-hand account of what farm work was like in England. Sharecropping life was similar, in that the whole family worked in the field. Again, though, this is a completely different economic, class, and productive formation, as there is no workplace, in the capitalist sense.

          2. The capitalist alienation of work from its product
          Baseline exploitation comes from centralizing the workers to a single place owned by the capitalist. When no man or woman weaves, but all go to the factory to stand behind the power loom, which is owned by the company, then the worker is alienated from his or her craft. Marx picked this one up right away.

          The workplace is an invention of capitalism. At the high point of American capitalism, in the 1960's expansion, we saw the pretty buildings, the stratified replicas of social class on each floor and the floors reflecting social hierarchy, and the workers inside the building were forbidden from bringing family. There were rules against personal calls on company time as recently as ten years ago. Men were encouraged to have alternate pseudo mates in the 1960's and 1970's.

          The "housewife" was occupying a precapitalist position, a feudal one.

          No: what is happening now is not nothing. In effect, we are bringing the alienation of work from its product home and bringing the substitute family/social structures of "workplace" into our primary social spaces, and all so that we can take our friends and home with us into that work space.

          Every reductio ad absurdum will seem like a good idea to some fool or another.

          by The Geogre on Wed May 30, 2012 at 04:51:14 AM PDT

          [ Parent ]

          •  You act like being able to (0+ / 0-)

            have your home and friends with you at work is a bad thing.

            Anyway, if you want to use marxist analysis you might want to keep in mind it's the managerial class that takes work home with them, not the working class. I have somehow ended up in that group (not from trying) and yes, I get calls on weekends and nights. But none of my 'subordinates' do.

            Try to shout at the right buildings for a few months.

            by nickrud on Wed May 30, 2012 at 08:15:51 AM PDT

            [ Parent ]

            •  I think you're excusing (0+ / 0-)

              Having home and friends with you at work is not actually having them, but rather having a system of obligations in a social network with you. "They" are only there so long as "you" are there for them, and you must allocate units of attention and personality/social maintenance to each of them to the degree that you wish to receive it in exchange, while there is an invisible system of (researchable) rules for how much burden one member can place on the others.

              Similarly, the workplace has moved from the old manufacturing and productive modes of the early 20th century to an office-bound and intellectual model, which means that the work being done, even by the lower ranks, is stress-inducing, personality-managing, and analytical. Taking the work contacts home is taking the work home, because the work's obligations were not so much physical as intellectual.

              Bad? By itself, I don't think it's bad or good. However, I think it creates bad when current iterations of capitalism intensify the alienation of workers, and the workers, by themselves and of themselves, lose the capacity to construct culture or to analyze in time. It means powerlessness, and that's very much bad, no matter how nice it feels.

              Every reductio ad absurdum will seem like a good idea to some fool or another.

              by The Geogre on Thu May 31, 2012 at 08:11:48 AM PDT

              [ Parent ]

  •  If I did have a conclusion (7+ / 0-)

    I would conclude that the forces that have already aligned -- commercial interests that see the greatest profit in subscription-everything and Internet-based everything, the profit-driven push of SMTP, the profit-driven push of the pictographic web -- and the current ones -- 4G phones and the resuscitation of the tablet, with an ever-increasing acceptance of the erosion of rights and definition -- mean that this generation of students, more than this generation of adults, has no capacity for historical thinking or linear concepts. There is no particular recipe that I could offer, as it takes a serious case of delusions to believe that I could do anything about it, or even that individuals could do much to others.

    There is, however, a self-prescription. There are matters of self-discipline that individuals can follow. They're not unlike the "kill your TV" folks in that they require people to purge their own mental environments of the elements they believe harm their ability to concentrate.

    Every reductio ad absurdum will seem like a good idea to some fool or another.

    by The Geogre on Tue May 29, 2012 at 11:25:11 AM PDT

    •  here's another thesis for you: Regress. (5+ / 0-)

      Cellphone audio is equivalent to 1925 landline audio.  The shitty compression algorithm also edits out all of the emotional subtleties of speech, making conversations less satisfying and more likely to produce conflicts.  

      Watching TV on a 4" diagonal screen goes right back to 1935 "mechanical" TVs with a spinning disc behind the screen.  This in an era when actual TVs are bigger than ever before.  

      Typing on a teeny-tiny keyboard or a tablet's "virtual" keyboard slows down textual communication to the speed of the Morse telegraph circa the late 1800s.  Squinting to read text on a teeny screen is a good way to develop eyesight problems.

      And "texting" itself is just a telegraph re-hash, complete with the use of ridiculous abbreviations to stay within the character limits and to contain costs, so update that portion of the telegraph to the 1910s.

      All of these "shiny objects" add up to what I call "digital monkey traps."  Most of them come along with extra added surveillance built right in.  

      As for me, I'll stick to the landline telephone where I can hear every nuance of voice, the full-sized laptop screen where I can read text and watch video without squinting, the full-sized keyboard on which I can type as fast as you can talk, and no extra surveillance beyond what I already get for my taxpayer's dollars.  

      "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

      by G2geek on Tue May 29, 2012 at 11:51:19 AM PDT

      [ Parent ]

      •  If I WERE on a McLuhan trip... (2+ / 0-)
        Recommended by:
        G2geek, Susan from 29

        I'd point out that the tetrad of media effects says that each technology extends something, accidentally revives something, makes something else obsolete, and reverses itself. (I think those are them.)

        Anyway, the point is that each new technology accidentally makes us recreate an old one, he said. Now, I always thought that was interesting as an observation but bogus as a law. I.e. it always seemed to be true, but I couldn't see any reason for it to be true.

        Cell phones revive the telegraph! It's beautiful!

        Every reductio ad absurdum will seem like a good idea to some fool or another.

        by The Geogre on Tue May 29, 2012 at 01:12:27 PM PDT

        [ Parent ]

        •  yep, neo-telegrafo. (3+ / 0-)
          Recommended by:
          Susan from 29, EthrDemon, ozsea1

          Hell, I even called it "instagraph" in fiction some years back, but then someone picked up the name and turned it into a commercial website, blah blah blah... which won't be the first time...  oh well.

          What I find just downright nuts is that people are willing to tolerate shitty audio, tiny video, squinty text, and tiny & virtual keyboards that slow down writing to a snail's pace, and then proclaim loudly how wonderful it all is.  

          It "is" arguably wonderful on the road.  But at the house or in the office it's nuts.  As in: car seats are nice in a car, but they're uncomfortable & inconvenient around the dinner table in a house.  

          "Minus two votes for the Democrat" equals "plus one vote for the Republican." Arithmetic doesn't care about your feelings.

          by G2geek on Tue May 29, 2012 at 01:39:17 PM PDT

          [ Parent ]

      •  The commentary is at least as fascinating (1+ / 0-)
        Recommended by:
        The Geogre

        as the diary that inspired it.

        Your comment rates at least a couple of orders of mojo magnitude beyond the mean.

        My grandfather, who was a switching engineer for Bell Telephone in the 1930s - 1970s, would have thoroughly enjoyed conversing with you.

        "What have you done for me, lately?" ~ Lady Liberty

        by ozsea1 on Tue May 29, 2012 at 07:27:08 PM PDT

        [ Parent ]

  •  OK. This is clever. (3+ / 0-)
    Recommended by:
    The Geogre, tardis10, Leftcandid

    SethRightmer is a foil you've deployed to buttress your essay, right? Oh My. (The commenter is perhaps unfamiliar with the poster's style: sort of a thoughtful meander on the part of the poster, with the reader as an invited guest.  It's old fashioned, to be sure.)

    Almost nothing has a name.

    by johanus on Tue May 29, 2012 at 12:05:49 PM PDT

    •  Awww, thank you. (1+ / 0-)
      Recommended by:
      ozsea1

      I am old fashioned, it's true.

      I confess: my essay models alternate between Joseph Addison and Myles na gCopaleen.  (Actually, I can't stand Addison, and I can't link you right to Myles, as he's under copyright -- bless his alcoholic self -- but I'd recommend anyone get The Best of Myles and have a hoot.) Oh, and John Berryman's "Dream Songs."

      The biggest thing is that I don't like journalistic certainty and self-satisfaction. Whatever David Gregory is, that's what I want to un-be. Whatever George F. Will is, that's what shouldn't be.

      I think, though, that the reader was just in a bad mood and may have read quickly. It's alright. I wish he hadn't been insulting, and I kind of wish I hadn't been nasty back.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Tue May 29, 2012 at 01:19:38 PM PDT

      [ Parent ]

  •  Interesting. (3+ / 0-)

    Tho some of this could be said about any technological advancement.  After all, the book meant the village elder was no longer revered.  The printing press ushered in the Reformation.

    Some of it is facile but unsatisfying: i.e., 'forgetting'?  Nope.  It's not about content but context.  That is, it matters not what the theory or data is or is about; what matters is the system whereby Truth is judged.  We supposedly pursue the rational, scientific method, and thus do not 'forget' the theories of the past but discard them when they are proven false.

    The meta, tho, is fascinating. Are we really being hurt by a collapse of reliance on the Dewey Decimal system?  Srsly?  Or is this just another example of Capitalism's ability to commoditize everything?  You see 'young people' only concerned for images - as if the 1st 'permanent' communication weren't a picture (caves, you know) - and worry they will lose the ability to think.  I see the development of a semi-sentient info sphere that makes information available to all, freeing Mind to analyze and choose.

    Yes, the Masses will (sometimes, perhaps often) use it stupidly.  When have they not?  Susan Collins outsells Lord Byron by a mile.  I dare say a heck of a lot more folks know the former than the latter.  If I thought it was keeping them from contemplating the Mysteries of the Universe, I'd be worried.  'Course, I know 99% would just find something ever more inane to waste time on, and more likely to harm me.

    But, in the end, if I want to 'leave work', I just turn my phone off.  Just like if you don't want your kids to see something on the TV.

    As for 'changing the way we think'... um, nope again.  The software doesn't change the hardware.  The problem isn't that folks don't go looking for primary sources, it's that they never did.  The difference is that the illusion of 'Wise Men' and 'Opinion Elites' has collapsed into what it always was: money.

    This only matters so long as information is restricted.  That is, if someone owns the pipeline.  And there you do have an important point.  Of course, the solution is antitrust of the comm industry.  But, given Citizens United, I wouldn't hold my breath.

    And in the end even this worry is as ephemeral as the technology it relies on and that created it.  Wait 20 years, when the pirate cellphone wars will be raging. :)

    •  I would like to go point by point, but (0+ / 0-)

      Our hardware is changed by our software, because the brain is rather unusual. It is a plastic organ, and its morphology actually changes in response to what we do with it. (This is one reason the "gay brain" research has been a mess.)

      My example of forgetting was facile, but the point was not. The medieval mind had trouble forgetting its certainties. It could be, and was, presented with inductive proof and could not accept the evidence of its senses, despite having a highly developed logical practice. When one's mental organization is vertically deep, there is no simple conclusion, because each conclusion is part of another, and that was one of the problems. As the population became more literate, though, it was easier to displace memory onto the written word.

      I do not think we are losing Dewey, but rather losing the capacity to create taxonomy at all. If taxonomy and structure do not exist, if they are replaced by an "I'll get it from my phone/Google," then the information loses its very meaning. After all, syntactically, words only mean in the presence of other words, and no data means except in the presence of other data.

      The problem with a pictographic system is that it is a 1:1 substitution. It is a non-interactive cognitive operation. The relationship is a preterite rather than a metaphor (sorry for being so abstract). Instead of creating tensions and ironies, they obliterate one object for another. This is inarguably less rich. There is nothing wrong with having it, except there is a problem with it dominating.

      Your last point, that this too is ephemeral, is a dangerous assumption. We always project based on our own living past, and we have seen rapid change and thus presume there will continue to be. However, the truly frenetic change occurred when there were competing standards and OS's. As the largest corporations have settled their scores and consolidated, there has been far, far less underlying change lately. More chilling is the fact that their dream has not changed since 1995. The Boca Raton CEO meeting where the Cable Town guys gushed about how much they were going to get for a la carte everything is still warming them at night.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Tue May 29, 2012 at 06:02:51 PM PDT

      [ Parent ]

      •  I disagree that we are losing taxonomy (1+ / 0-)
        Recommended by:
        The Geogre

        Yes, the trend over the past 10-15 years toward internet search and away from directories as a way to find information may have caused some people to lose some ability to classify information effectively or to think in terms of classification. However, I think non-taxonomic search is beginning to lose its effectiveness due to the sheer overwhelming volume of unstructured information, and that in the next decade, taxonomy/directory/category-based informational structuring will begin to make a comeback. Much of the categorization of information will increasingly be performed by algorithms and crowdsourcing, however, rather than by individuals or small groups of editors.
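
        To make the "algorithms and crowdsourcing" half of that concrete, here is one minimal way it could work -- a purely hypothetical sketch in Python, not a description of any real directory or service: users propose category labels for an item, and the directory files the item only when a clear majority agrees. The threshold and the sample data are invented for illustration.

        from collections import Counter

        def crowd_category(votes, threshold=0.5):
            """votes: category labels proposed by different users for one item."""
            if not votes:
                return None
            label, count = Counter(votes).most_common(1)[0]
            # File the item only when a clear majority agrees on a single label.
            return label if count / len(votes) > threshold else None

        item_votes = {
            "The Shallows": ["technology", "technology", "psychology", "technology"],
            "random blog post": ["politics", "humor", "technology"],
        }
        for item, votes in item_votes.items():
            print(item, "->", crowd_category(votes))
        # "The Shallows" gets filed under "technology"; the contested item stays unfiled.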

        Eric Stetson -- Entrepreneur and Visionary. www.ericstetson.com

        by Eric Stetson on Tue May 29, 2012 at 06:15:18 PM PDT

        [ Parent ]

        •  Possible, but wasn't that Google's idea? (1+ / 0-)
          Recommended by:
          gatorcog

          Yet Another Hierarchically Organized Organization (YAHOO) had an indexed search on the honor system, and Lycos, Altavista, and the rest used crawlers. Of those two approaches, Yahoo's was better if a person was looking for, say, "The Thresher's Lament by Stephen Duck 1736," and the crawlers were better if one were looking for "Hot girls!!!" Google was going to use academic-style citations and links to build a crowd-sourced usefulness index.
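
          Since the mechanism matters to the point, the "citations and links" idea can be sketched in a few lines. What follows is a toy Python power iteration over a made-up link graph -- the flavor of the original link-as-citation ranking, and emphatically not a claim about what Google actually runs today. The page names, damping factor, and iteration count are all illustrative assumptions.

          def rank_pages(links, damping=0.85, iterations=50):
              """links: dict mapping a page to the list of pages it links to."""
              pages = set(links) | {p for targets in links.values() for p in targets}
              n = len(pages)
              rank = {p: 1.0 / n for p in pages}
              for _ in range(iterations):
                  new_rank = {p: (1.0 - damping) / n for p in pages}
                  for page, targets in links.items():
                      # A page with no outgoing links spreads its weight evenly.
                      receivers = targets if targets else list(pages)
                      share = damping * rank[page] / len(receivers)
                      for receiver in receivers:
                          new_rank[receiver] += share
                  rank = new_rank
              return rank

          # Invented example: the page "cited" by several others outranks the rest.
          toy_web = {
              "hub": ["duck_poem", "hot_girls"],
              "syllabus": ["duck_poem"],
              "anthology": ["duck_poem"],
              "duck_poem": [],
              "hot_girls": [],
          }
          for page, score in sorted(rank_pages(toy_web).items(), key=lambda kv: -kv[1]):
              print(page, round(score, 3))

          The only point of the toy: ranking by who cites whom is a different animal from ranking by what a company would like to sell you, which is the drift I am complaining about below.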

          I don't know what the heck they're doing now, but that ain't it. Now, trying to find an e-text with a precisely entered search term gives me junk, because Google is interested in "helping" me find hints and cheats for console video games and new products for my television, and they really, desperately want a cell phone number for my Gmail to work.

          I think it's quite possible that Google's own success as a publicly traded company will kill its utility as a search engine. Now that they have stockholders, they have people with money screaming for more. As Google wants to own phones (for Goodness!) and books and the like, they need money, so the search gets worse and worse.

          However, none of that will really change what I was writing about. What I think is killing taxonomy is not the flat search itself, but the fact that information for the end user requires neither parsing nor organizing to be found. This allows (and therefore causes) the mind to cease to go to the trouble of putting concepts into particular orders that themselves must be remembered. The consequence of that is memory loss.

          I know there are a lot of theories of memory. I can't pretend to be an expert, but I will say that the model that rings truest to me is the "card catalog" model. Memory stores away, and then we have a set of indexes to the memories. This is the difference between memory and recall. Imagine the recall facility as being like the FAT on an old hard disk. Language is necessary, or at least primary, for this catalog. Organization is a key method by which we can self-correct our recall facility: "No... not Duck... the other poor poet that they went gaga over... woman... the milkmaid poet... oh, who was she... became a real pain in the behind... Mary Collier!"

          Recall often leans on the logical system of indexing we got from taxonomy. What happens when it's, "I'll get it from the search?" That's my fear.
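
          An aside for the programmers reading along: the card-catalog picture maps neatly onto a store-plus-index structure, and a loose Python sketch of the analogy might look like the following. It is only an illustration of the analogy, with names and cues borrowed from this comment; it is not a model from any actual memory research.

          class CardCatalogMemory:
              """Storage is one thing; the index used for recall is another."""

              def __init__(self):
                  self.shelf = []      # the stored "memories" themselves
                  self.catalog = {}    # cue -> positions on the shelf (the FAT-like index)

              def remember(self, fact, cues):
                  position = len(self.shelf)
                  self.shelf.append(fact)
                  for cue in cues:     # the filing work happens at storage time
                      self.catalog.setdefault(cue, []).append(position)

              def recall(self, cue):
                  # Recall works only through the index; a fact filed under no cue
                  # still sits on the shelf, but it is effectively forgotten.
                  return [self.shelf[i] for i in self.catalog.get(cue, [])]

          memory = CardCatalogMemory()
          memory.remember("Mary Collier", ["poet", "milkmaid", "pain in the behind"])
          print(memory.recall("milkmaid"))       # ['Mary Collier']
          print(memory.recall("washerwoman"))    # [] -- never filed, so never recalled

          In this picture, "I'll get it from the search" skips the filing step entirely, and the filing step is the part that atrophies.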

          Every reductio ad absurdum will seem like a good idea to some fool or another.

          by The Geogre on Wed May 30, 2012 at 05:09:12 AM PDT

          [ Parent ]

      •  Also interesting. (0+ / 0-)

        One meta, tho: I think you overestimate the effective (and effectual) literacy of the Masses.  Most people don't know words.  They barely know concepts.  And usually those concepts have to be concrete enough to form mental pictures or they don't get it. This is one reason why storytelling never died.

        I disagree that the hardware changes.  Sure, the synaptic connections alter, but they always do (tho the major ones are established in pre-K ages iirc, and so are not likely affected by these media).  Now, you are likely correct that visual media will bias towards certain kinds of connections or networks that are different from the written.  I simply don't see that as all that different for most people from what it is and has been since the 60s (i.e., the TV age).  Even when they read, most people don't read for information; they read for trash.  You might as well argue movies would kill the novel (some did argue that).  They didn't.  But both are often just as trashy as any other mass media content.

        And ultimately, we're still just hairless apes with 100k-year-old hardware (itself built on millions-of-years-old hardware).  From what I've read, the basic workings of the brain pretty much haven't changed during that time.  Neolithic people were just as mentally capable as the average modern person.

        I think you might have missed my pt on the 'forgetting'.  The history of 'intellectual' thought, such as it is (intellect, that is), since the Enlightenment has been one of a change in the context in which info was viewed. This had more to do with the printing press, which in turn had more to do with the availability of only 1 book: the Bible.  The minute enough people started reading it rather than just taking the priests' word, the ossified non-classical view was doomed.  It is no coincidence that the scientific revolution advanced faster in Protestant nations.  I believe it had little to do with how the masses mentally classified information and more to do with the breakdown of Catholicism as the arbiter of truth.  After all, classical Greeks and Romans knew quite a bit of what we now consider 'modern' knowledge.  One IMO cogent example: they anticipated calculus 1700 or so years before Newton.

        IOW, the critical change IMO was political, and not some technological change in how information was mentally accessed, analyzed, and classified.  The reason I think that important is that I find the same political cause at work today: i.e., as I said, it's not how info is accessed at the initiation of the analytic process (pictorially or hyper-linked written), it's who owns and controls the access to info.  Just as the Church's control of the pipeline (and other aspects of society thru its wealth and mystical monopoly) had the Masses rejecting Proof by Evidence for Authority for 1500 years.

        In that sense we both agree: communications monopolies are very, very dangerous to free thought, enlightenment and progress - not to mention democracy.  But, IMO it's not a function of the 'platform'.  The same thing would have happened in the print age if one group controlled all the papers and other mass media.  See Hearst, the Spanish-American War, and the Gilded Age.

        As for ephemeral-ness, my pt was not that their goal is ephemeral, but that their control of the platform is.  I am serious: barring catastrophic civilizational failure, energy and tech knowledge and access to materials will soon be widespread and cheap enough that 'pirate' cell networks will arise.  Indeed, pirate satellites likely will as well.  Consider the hacking of the iPhone SIM card.  Failing that, what nation awaits to trump the monopolies with their own comsats as soon as they're unhappy with their cut of the monopolies' action? Beyond that, there will also be non-broadcast communications: e.g., consider a communication tech based on quantum entanglement.  I don't think it's the lack of OSs or competing standards that's the problem, any more than Beta dying gave one company control of all movies.  It's the consolidation of the media industry.  Which is to say, capitalism itself.  Which is why I say this is an antitrust (i.e., political in the small sense) problem until the knowledge and availability of the tech to build the networks and receivers catches up.  Once that happens, the media corps we have now will go the way of the newspaper and magazine giants.

        I also tend to agree with Eric, but again think you're speaking of elites who generally even now continue using written sources regardless of the interface the Masses use.

        All that said tho, I do think you raise some very interesting and well-taken pts.  And of course, these are just my opinions. I certainly could be wrong.

  •  I find that I have more time to be analytical (2+ / 0-)
    Recommended by:
    nickrud, The Geogre

    now that information is so quickly and easily at my fingertips on the internet. I work more efficiently and can make better and faster decisions because of not having to spend as much time doing research. I can spend more time thinking about issues or problems I need to solve, rather than just gathering information. Internet search has been a great help in my life and my work.

    So, I don't think this new technology is destroying the human mind or reducing our capacity for analytical thought. Quite the contrary. I think that people who are inclined toward deep thinking can spend more time doing that now, whereas people who never were inclined to think deeply in the first place have more ways to fill their time now with technological gadgetry and thus avoid any depth of cogitation. It may just amplify the natural differences between the intelligent and the dull.

    As for cell phones, specifically, I don't like them much, and only use one for the convenience of being able to place calls wherever I happen to be located, if necessary. I don't like the fact that cell phones have become, for many people, like an electronic leash, and I refuse to use mine that way. I actually keep it turned off a lot of the time. Since I'm self-employed and don't have a boss who expects me to be on the cell phone 24 hours a day, fortunately I can get away with my preferred pattern of use.

    Eric Stetson -- Entrepreneur and Visionary. www.ericstetson.com

    by Eric Stetson on Tue May 29, 2012 at 06:09:09 PM PDT

    •  You're a model (0+ / 0-)

      You're a model of the sort of person who can use the technology. I don't know your age, but the serious changes seem to be happening with those for whom the cell phone is "normal" and "natural" and a 13th-birthday present.

      For either of us to use the Internet as a search tool, we have to remember the obscure details that we want to find. I keep a large set of reference books, myself, and enjoy looking up material there, for the flip-and-discover value of it.

      My argument is not Carr's. I think Carr is right, in general, but what I was trying to say was that we need to add all of these forces together. Take the thin mind of Carr's premise and neurological research into task switching, add the cell phone as tensor, add the instant message as a collapse of mental space and place, and add in the capital investment in metered payment, and we've got a heavy situation where a generation's ability to think is under siege. Unless they themselves become ascetic, I don't know that there is much to be done.

      Educators can introduce renewed drills on hierarchy, mnemonics, and such, but not with regulations demanding teaching to the test.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Wed May 30, 2012 at 05:18:05 AM PDT

      [ Parent ]

  •  I'm coming back to your diary and the comments (2+ / 0-)
    Recommended by:
    The Geogre, rightiswrong

    after some sleep.  Been a long day.  However, reading your diary, I remembered an event in my past that might give you a chuckle.

    I worked at Motorola during the late '80s and was part of the team involved with the implementation of the digital system across Europe.  As a member of management, I was given one of the first flip phones (versus the original "bricks").  Sooo, that meant I had one of those super-cool phones clipped to one hip and my pager on the other.

    The first day I was so outfitted, of course, I had to use the restroom. And of course, as I went into the stall, both of the damn things went off.  Disgusted, I shut them both off!  I then went back to my area, took both off, and asked my assistant (we still had people in those roles back then) to mind the two things.  At that time, I could not imagine why in the world anyone would want to have their lives so intruded upon by the damn things -- and I worked in one of THE tech companies that was rolling them out!

    Today, I get teased about how often I lose my phone around the house or work.  If someone really wants me, they can leave a voice message or send an email -- I don't do texting -- and I get back with them in my own time.

    I wonder if I'd even have a sense of having a right to my "own time" if I'd grown up thinking that it was normal to be buzzed while using the restroom.

    Plutocracy (noun) Greek ploutokratia, from ploutos wealth; 1) government by the wealthy; 2) 21st c. U.S.A.; 3) 22nd c. The World

    by bkamr on Tue May 29, 2012 at 08:37:25 PM PDT

    •  Fantastic (4+ / 0-)

      I can tell you where that device has gone.

      Warning: True Story
      I was in the men's room. In our building there are three urinals at different heights. I call them Papa Bear, Mama Bear, and Baby Bear. Everyone goes for the middle, because... you know... as George Carlin said, everyone's average. As I was doing my bit for recycling, a kid came in with a cell phone in his hand.

      He walked up to Mama Bear, and he did his entire business without ceasing his texting.

      For any women, or non-visual thinkers, let me spell that out: he did the entire operation one-handed. He had to unzip, fish, produce, relieve, replace, and zip and flush.
      conclusion of the foregoing

      He never looked up from the screen, either.

      So, here's my thought: that TOOK PRACTICE.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Wed May 30, 2012 at 05:24:00 AM PDT

      [ Parent ]

    •  I have worked in IT for 37 years! Yikes! (1+ / 0-)
      Recommended by:
      The Geogre

      When the first "hobbyist" computers were first being sold in kit form (S-100 bus / 8080) from the back pages of Popular Electronics, my co-workers and I scoffed at the idea of having a computer at home. "What the hell would you ever want to do with that?"

      (Secretly, I wanted one very badly but couldn't afford it.)

      •  (Not much, in truth) (1+ / 0-)
        Recommended by:
        rightiswrong

        I was a co-conspirator to a Sinclair build. We thought building it was impossible, but actually having to create instructions for it to do something... like add... was really impossible.

        By the time Quentin, the man with the Sinclair, had it able to do a lunar lander simulation, the 8088 and CP/M were out.

        Every reductio ad absurdum will seem like a good idea to some fool or another.

        by The Geogre on Thu May 31, 2012 at 05:32:08 AM PDT

        [ Parent ]

  •  Wow, this is interesting (3+ / 0-)
    Recommended by:
    ozsea1, tardis10, rovertheoctopus

    "I think there is some serious extension to the intersect of capitalism, corporatism, executive growth, and the simultaneous technologic erosion of self, place, and time. It took the one to help the other, and the other aids the one, and the two together really seem to portend narcosis."

    Some big ideas to be unpacked here. "Technologic erosion" and capitalism unite to "portend narcosis" ...

    If words are fingers pointing to the moon, then this is some deeply confusing pointing going on. I can't predict the future, but, for some strange reason, your words make me think of the Borg. We will all be assimilated into a great profit center.

    I think chemistry, biology and the Laws of Physics will have the final "word" on all this. The Dictatorship of Mathematics rules us all.

    muddy water can best be cleared by leaving it alone

    by veritas curat on Tue May 29, 2012 at 09:04:18 PM PDT

    •  I suspect Soylent Soma (1+ / 0-)
      Recommended by:
      gatorcog

      I would never attribute intention to social conjunctions. These forces are combining and aiding one another, and left wingers, civil libertarians, and labor people are howling at the lumpen mass and wondering.

      Well, first, I don't think the mass is lumpen, but I do think that it's getting harder and harder to remember.
      1. As capitalism moves into a phase of corporatism over capitalist magnates, whereby there is never an owner nor moral hazard, it naturally extends its own false consciousness (that there are "consumers" and that "government" stands in opposition) into politics and culture.
      2. Every political executive believes his power is benign, whether it's George W. Bush or Barack Obama. Executive power has been on the rise since 1945. (Bomb Power is a must read.)
      3. I either established or failed to establish my case that the cell phone erodes the walls of place and time.
      So, big money accelerates the alienation of the individual from structure, and executive functions (literally, not politically) keep getting taken.

      If I follow this essay up as I think I might in a few weeks, then the logical next step is to argue that Neil Postman was wrong: 1984 can occur through the means of Brave New World. I see us taking the happy pills and not worrying about why we've been purged from the voter rolls by the Scott/Walker/Brewer Triumvirate.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Wed May 30, 2012 at 05:35:03 AM PDT

      [ Parent ]

    •  Are you man or are you hu man? (0+ / 0-)

      Not you in particular.
      What makes you human?  Your role in society is a construction, a utilitarian function of you making your way through it.  You really are none of that, and the good news is that you'll never find what you really are in the context of your relationship to your environment or society.  Mathematics is ultimately a construction of mind.

      Well, I guess I don't know what you mean by "equal justice under the law." - Bushy McSpokesperson

      by gatorcog on Wed May 30, 2012 at 11:48:26 AM PDT

      [ Parent ]

      •  Ideology (0+ / 0-)

        Althusser said that ideology is not the worker's place in relation to the productive/ownership relationship, but the worker's imagined position. Ideology is that very construction of who you think you are in society.

        So, if "you" are a cell phone, where all places are no places and the same place, if capitalists have moved away from morality in favor of a deferral into a system ("I would never want to fire a whole bunch of people at Ampad, but it's good for stock price, the abstraction, to do this, and my business model, the abstraction, dictates that I generate returns to investors, an abstraction, and so I'm going to 'trim the fat,' a metaphor, to achieve market position"), while politically each executive officer is given the brief that "there are things to scary and horrible to survive voting or ethics," then ideology can finally be managed into a negation.

        Every reductio ad absurdum will seem like a good idea to some fool or another.

        by The Geogre on Thu May 31, 2012 at 05:39:08 AM PDT

        [ Parent ]

  •  The idea that we must remain mindful of how (2+ / 0-)
    Recommended by:
    ozsea1, The Geogre

    we interact with technology, given that the profit impulse is not concerned with what's best for us, is ever more important.  The precautionary principle ought to guide us in this manner, advocating careful consideration of potential impacts of consumer tech prior to enga--HOLY FUCKING SHIT, IT CAN DO 3D PIX!!!!1!!

    Before elections have their consequences, Activism has consequences for elections.

    by Leftcandid on Tue May 29, 2012 at 09:16:09 PM PDT

  •  Too many words (0+ / 0-)

    Can you shorten this piece to a syllabyte or two?  heh
    Seriously, good piece.  I think narcissism has changed forms throughout history but remains the same thing, if you know what I mean.
    The Self, that's another matter.  From dualistic experience, where most people live, there is the 'I' and there is everything else.  Living non-dualistically, 'I' cannot be found, nor can it be separate from its environment; at the same time, 'I' is not that which can be thought, nor can it be perceived.
    I have no point in this as it relates to the diary, yet at the same time I think it's related.  For instance, I don't text, not for any particular dislike of texting, it's just I have no purpose or need to.  I just talk to people on the phone or email them, not out of devotion to preserving a way of being, but just what I'm used to.  I don't feel left out; if anything I feel my life is less uselessly cluttered with bullshit, but I wouldn't resist texting if I felt I needed to or if it's useful.  I'm 50 yo, if that matters.
    The primary point I get is that people experience each other live in person, in "meat space" if you will, less and less.  This circumstance disconnects us from our humanity leading to the question, kind of as an aside - does it contribute to the cause of the rise in autism in youth?  Or are these types of afflictions invented by pharmaceutical companies?  
    Fascinating diary, thanks.  I read every word, BTW.

    Well, I guess I don't know what you mean by "equal justice under the law." - Bushy McSpokesperson

    by gatorcog on Wed May 30, 2012 at 11:41:07 AM PDT

  •  Fantastic writing (0+ / 0-)

    As Benjamin Barber pointed out in "Consumed", the tyranny of these modern times restricts liberty not so much in the physical sense, as with kings and empires, as in the mental sense, with capitalism on its endless growth trajectory. Cell phones may have been innovated to incorporate more of the fragmented living the Internet provides with each iteration placed on the market, but by no means have they necessarily enhanced the human experience. Certainly access to information has risen exponentially, but by shortening the distance in a virtual space, it has also enabled behaviors like multitasking and has confused the presence of information with the ability to correctly glean it. Remember, misinformation now travels just as fast as factual information.

    The paradox of increased leisure through less physical work is also apparent, for the work is now placed on the brain rather than on your back and your bare hands. I have known people whose entire existence seems endlessly consumed by the inevitability of some e-mail that needs to be answered over the weekend or while on a personal vacation. They assure themselves that, because it is not physical, the work is benign. Nevertheless, it has further adjusted expectations of productivity. Your life has been enabled to speed up; so, too, should your output. And in effect, cell phones are ersatz tools for life itself. Virtual instead of tangible. It's not the machine itself that's the problem; it's its application.

    "Growth for the sake of growth is the ideology of the cancer cell." ~Edward Abbey ////\\\\ "To be a poor man is hard, but to be a poor race in a land of dollars is the very bottom of hardships." ~W.E.B. DuBois

    by rovertheoctopus on Wed May 30, 2012 at 01:44:35 PM PDT

  •  I know this isn't what you're talking about, but (1+ / 0-)
    Recommended by:
    The Geogre

    I want to do real Boolean searches on Google, or any search engine, and I just can't get it to stop dishing garbage my way. Example (from memory so may not be exactly right):

    management "six sigma" -signal

    will still return web pages that have the word "signal" and that do NOT have the exact phrase "six sigma". (Note that I've found more than I want to read on "six sigma," thank you very much; this was just an example.)

    So the - (meaning "not") doesn't work, the "" (meaning "exact phrase") doesn't work, etc. Google and other search engines dish up stuff I don't want to see because it has decided that I probably do.

    This makes what's already a mind-numbing exercise much worse: trying to find what I'm really looking for on the web.

    I know you're talking about how the internet is demolishing culture, place and time etc., but surely this sort of distraction doesn't help a bit!

    •  Recent stories: "Google thinks you're dumb" (0+ / 0-)

      I didn't note the links, but apparently Google's study of people using its service has led it to conclude that users are morons.

      Shocking, right?

      However, if they decided, as apparently they did, to change their service to suit and manage the herd of cattle that they have discovered, then the "power users" and actual users of search won't want to use Google at all.

      Wonkette pointed out that the fun search was not "Ground Zero Mosque," for example, but "Ground Zero Mosk." That was before Google decided to treat the searches as identical.

      Again, people lose the ability to stratify because they lay memory off onto the web, and others never learn it, and that leads to stupid searches, and then the search giant makes the search less stratified, and that frustrates us more.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Thu May 31, 2012 at 05:46:25 AM PDT

      [ Parent ]

  •  It is the culture... (1+ / 0-)
    Recommended by:
    The Geogre

    Interesting post, in my opinion.  There is also no real question that Americans are incredibly narcissistic.  The question about the technology is one of directionality: does the technology foster narcissism, or the other way around?  I would argue that it is the narcissism that drives the technology.  Americans expect that their individual perceptions, opinions, ideas, and beliefs are all legitimate and worthy of attention.  Dkos is an example of the successful implementation of an opportunity for Americans to express their need for expression and response.  Consider the "meta" diaries, as Kos calls them: between those who criticize Obama and those who support him, between those who argue for some adherence to principle and those who argue that only pragmatism matters and anything is better than the other guys.  What these diaries are really about, however, is competing assertions of individualism.

    The individualism so basic to American culture is expressed through making choices.  McLuhan was only half correct in asserting that the medium is the message.  Look what the "medium" has turned into: an increasing array of choices (how many channels are there?) that provide Americans with the opportunity to exercise their individuality (not unlike the never-to-be-taken-away choice to purchase a .38, .45, .50 caliber whatever).  McLuhan underestimated the power of American culture to drive the medium: Americans have imposed choice on the medium; the medium has not imposed uniformity on Americans.  In fact, the current punditry decry that choice by asserting that the current fracturing of the polity and the "partisanship" that dominates politics are a result of Americans having and using the choice of the medium (TV) and the internet to pick only what affirms their beliefs (and of course lamenting that they themselves are not chosen as much as they wish).  So we use our gadgets to exercise increasing (even if illusory) choices (and a basic homo sapiens desire to tell something else what to do and have it do it) through, for example, texting even as we (well, they; I do not do this and find it horribly irritating) drive.  Cell phones allow Americans to do what they do best: to be about me, me, me, all the time.  People who need people are the luckiest people in the world…. Somebody who needs me…. I did it MY way…..  Even before TV the medium was the message (AM, FM, and all those stations).

    So, basic point: it is the culture (I suggest a reading of all of Richard Slotkin's work) and not the gizmos that drives narcissism.  And no, I do not think there is anything wrong with this; it is what people do (I am an anthropologist).

    •  The non-narcissist Narcissus (0+ / 0-)

      I think McLuhan actually anticipated the profusion of choice without distinction. The central premise is that what we are choosing is an image of our selves improved. We fall in love with this vision of ourselves, not ourselves, and so we pay attention to this thing we believe is an object of beauty and never realize it is us and that we are being changed by the romance. In that sense, the television was bound to offer more and more channels with no difference between them, because the tool/medium had to remain the same but the love affair weakened.

      My worry is that the cultural structures are vanishing in depth. Various historians have noted that the PowerPoint and e-mail world marks the end of primary-source history. Material culture, too, ceases to have a mark of production in transnational production.

      I agree, of course, that humans do as humans do, and there will be anthropology, but I have to agree with McLuhan that we have inadvertently recreated village life, on the one hand, and, on the other, I think we have moved into a new thing -- a mind without memory or place. Some reaction will occur, but, until then, we're kind of boned.

      Every reductio ad absurdum will seem like a good idea to some fool or another.

      by The Geogre on Thu May 31, 2012 at 05:56:42 AM PDT

      [ Parent ]

      •  I think your final conclusion... (1+ / 0-)
        Recommended by:
        The Geogre

        that until then we are.... The major reason is that as people stare into the gadget-mirrors, others are making off with everything worth taking.  It will not be until the power goes off that they look up and realize the Romneys, Blankfeins, and Dimons of the world have hoovered everything up.  I see someone like Matt Taibbi as a good example to follow.  He posts just enough on his blog to keep people coming while he spends much more time on serious in-depth reporting on the bankers.  Unfortunately, too many others just put out stuff daily or more to get a response, so they can respond to something, so they can get a response.... and on it goes.  Meanwhile there's that Hoover whooshing in the background.

  •  Why I disappeared (0+ / 0-)

    Work. Storms. Work + storms + Internet outage for the whole campus.

    Every reductio ad absurdum will seem like a good idea to some fool or another.

    by The Geogre on Thu May 31, 2012 at 05:27:58 AM PDT
