The ramblings of an Eternal Student of Life     
. . . still studying and learning how to be grateful and make the best of it
 
 
Sunday, July 13, 2008
Brain / Mind ... Current Affairs ... Society ... Technology ...

Two thoughts, both of which aren’t all that pleasant:

FIRST: next month (August 11), al-Qaeda celebrates its 20th birthday. Some analysts (such as Bruce Hoffman of Georgetown U. and RAND) think that they might have something big planned. Nine-eleven, then eight-eleven? Let’s hope not.

SECOND: This is a longer-term concern. I’ve been a student of the mind-body issue for several years now, and one of the biggest and most interesting questions on that topic is whether machines can ever become conscious and self-aware. I’ve been pondering that question lately in light of my readings and hazily emerging understanding of “neural networks”, i.e. computer simulations of various forms of brain activity. I think that the best answer comes in two parts. First off, with regard to self-awareness, I do indeed believe that computer systems will eventually achieve that. So yes, the Terminator scenario regarding “Skynet” might indeed be plausible, from what I’ve read regarding the capabilities of neural networks.

But the second part regards “consciousness”, as we humans know it. I honestly don’t think that machines can ever attain a human-like form of consciousness. And that is where the “Skynet” problem comes in. Human consciousness was honed by the forces of nature over billions of years of evolution and natural selection process. Despite the seeming randomness and cruelty of these processes, I believe that as consciousness emerged from them, something of an appreciation for being and natural creation came about. This appreciation manifested itself in our attraction to beauty, to songs and rhythms, and to a deeper appreciation of the senses (the smell of flowers, the taste of fresh food, the warmth of sunshine, the coldness of water, etc.). And once aided by our thinking capacity, it inspired ideas such as justice and morals.

Machines will never go through such processes. They are created by humans, mostly by the human “left brain”, the thinking and rational faculty. Computers are not inspired by and are hardly relevant to the human “right brain”, the poetic side, the side that is tied more closely to nature and our evolutionary heritage. As such, a self-aware computer will not have the “lessons of nature” wired into it, as most people do (to varying extents). Once we let them think on their own, computer thinking will be different from ours. In some ways that will be good; but at bottom, they really won’t understand us. So if we let them start making big decisions, we may not always like what they decide. Yes, just like HAL killing astronaut Frank Poole in the movie 2001: A Space Odyssey.

The other problem is that humans will become more machine-like in the future, especially if we keep letting our machines run more and more of our lives. Over a century or two, humans may well be bred to forget the right brain stuff and get on with living in a strictly rational way. Yes, I know that science fiction stories like that have been around for a long time now. I understand that I’m not saying anything new here. But I never took those stories very seriously — until now.

Because I am becoming aware of what neural-networked computer systems can do, it’s really starting to seem possible that the human race could ‘sleepwalk’ into a situation where the machines eventually remake their creators. By ‘sleepwalking’, I mean letting computers and machines do more and more things and make more and more decisions. It’s already happening — no doubt about that; computers and machines make businesses more profitable, war more winnable, and daily life more pleasant for many folk. So why not continue down this road? Pretty soon, even the call centers in India will be out of business, as machines become intelligent enough to answer phones for Dell and Amazon and Sears and your local dentist.

It would take a long time, maybe 200 years, to really change us. Despite our notions of civilization, we humans are still a pretty wild bunch. But if this trend continues, I predict that humankind will eventually go thru some major changes. People will be more rational, more orderly, more robot-like. There may no longer be any crime, any wars, any starvation, and a lot less disease. But there might also then be no more poetry, no more song, no more art, no more sex. It’s amazing what kind of worlds we could sleepwalk into, now that our scientists are unlocking some of the computing secrets of the brain (and our entrepreneurs, generals and political leaders are starting to make daily use of them). Time perhaps to dust off some of those yellowed, dog-eared science fiction paperbacks up in the attic.

◊   posted by Jim G @ 12:56 pm       Read Comments (2) / Leave a Comment
 
 
Thursday, July 10, 2008
Personal Reflections ... Photo ...

I’ve been around long enough now to have known a guy who has been turned into bronze. This fellow was a gentleman named Charles Cummings, a former citizen and librarian of the City of Newark, NJ. Prior to his passing, Charles was the designated city historian. He was also a member of the Episcopal parish of Grace Church in downtown Newark. And that’s how our paths crossed.

I spent seven or eight years trying to feel at home in Grace Church, an old “Anglo-Catholic” congregation with deep historical roots. Those roots made Charles feel right at home. And in my attempt to likewise feel at home there, I saw Charles on most Sunday mornings. After the Mass (this was high-church English-style, complete with incense and sung gospel), I would encounter Charles at the coffee hour, exchanging polite greetings and sometimes a few lines of conversation. I knew that Charles was the city historian, but strangely enough he almost never talked about city history while at Grace. I never heard him proffer any interesting facts or stories about Newark’s past. He seemed mostly interested in the personal matters of the congregants: who was sick, who was well, who had a son graduating high school, who had been to Florida recently, who the rector (a rather touchy fellow) was upset with, etc.

I left Grace Church in the late 90s after my best friend there, Roger the elderly “sexton” (Episcopalian word for ‘live-in church building keeper’), was brutally murdered. Interestingly, it was Charles Cummings who called me with the news. I remember Charles’ expression of deep regret at the terrible tragedy: “poor, poor Roger”. Yes indeed. Not too long after Roger left the world so horribly, I stopped going to Grace, and I never saw Charles again. It turned out that Charles met a more peaceful ending a few years later, in 2005.

Not long afterward, the City and the County raised some funds so as to have a bronze bust of Charles made, for display at the renovated County complex in Newark (where I work). Just a few weeks ago, the bust was hoisted onto a pedestal and was dedicated in memory of Charles. I finally got around to spending a few moments with the bronze version of Charles this week. Here are shots of the bust and the engraving on the monument:

Well, seeing Charles cast in bronze was rather weird at first. There’s something about a bronze statue that captures a Roman emperor better than a kindly old librarian. It didn’t seem like the Charles that I remember. The above bust photo seems a bit too angular, a bit too contrasty and bold; something more in keeping with a Hannibal or an Alexander the Great.

But, having some belief in the photographer’s creed, I decided to keep shooting at different angles until the spirit of Charles was found. The two shots below come close, I think. The bust seems to convey Charles as he probably was in his early 40s; by contrast, I remember him as a gracefully aging 60-year-old. Nonetheless, these two shots better convey the patient and kindly, but somewhat distant and proud nature of his personality — as I experienced it.

Unfortunately, as can be seen in the last shot, the local birds have little respect for bronzed busts, heroic or not. But then again, I’m sure that Charles would have been patient with them. Charles appreciated the great themes AND the more quotidian elements (such as pigeons and starlings) of the city that he loved.

◊   posted by Jim G @ 8:51 pm       Read Comments (2) / Leave a Comment
 
 
Sunday, July 6, 2008
Current Affairs ... Foreign Relations/World Affairs ...

We’re almost 7 years now from that terrible day in September of 2001 when a band of Islamic jihadist firebrands, supported by a shadowy but potent terror network based in the Middle East, managed to kill over 3,000 Americans and injure our financial and military infrastructure. And since then . . . . nothing. Not on domestic soil, anyway. So, are we doing something right? Or have we mostly been lucky?

That’s the big question, isn’t it . . . I’ve read a number of articles from reputable sources claiming that al Qaeda has been seriously wounded and that Bin Laden’s idea of a pan-Islamic assault on the west never caught fire amidst its intended audience. Arguably, there are too many Islamic emigrants living in Europe and North America who have learned to like the economic opportunities available here. There are millions of them, making money and sending it back to the relatives living in the poor and stagnant economies of Pakistan, Yemen, Syria, Egypt, etc. So there may not be a very large pool of volunteers ready to fly to America and don TNT vests for suicide missions in crowded subways or shopping malls. The “Arab street” might be opting for a reasonable, moral interpretation of the Qur’an and Islamic history, over the hazy promise of black-eyed virgins in paradise and eventual glorious victory over the western infidels. And the US military has had recent success in talking the Sunnis in Iraq out of their al Qaeda sympathies.

At the same time, there is renewed evidence that the “social-mental infection” of modern jihadism remains potent within the Islamic world. I just read an interesting book review for a title that you may not find at your local Barnes and Noble; but this book is allegedly getting attention in places like Riyadh and Tunis and Karachi. It’s called “Governance in the Wilderness” (Edarat al-Wahsh), and was recently written by Sheik Abu-Bakar Naji, allegedly a high-level religious theoretician in al Qaeda. Bottom line, the Sheik says that it’s time for al Qaeda to renew its focus on making life hell (i.e., “wilderness”) for the USA and France and England. He admits that the jihadists probably cannot repeat the “glories” of September 11, but they can bring us to our knees by a long term campaign based on smaller incidents targeted at crowded public places, akin to what the Israelis have to put up with.

So instead of snuffing out 3,000 infidels in a day and then getting shut out by a high-tech “homeland security” response, the new al Qaeda campaign [according to this book] should be happy with getting 30 or so office employees or tourists or delayed travelers lined up at airports, on a more regular basis. Admittedly, Israel manages to thrive despite this kind of thing; but the Israelis are tough cookies, having a social / historical / religious narrative just as compelling as any Palestinian suicidalist has. Here in the USA, especially in the well-off “blue states”, we really don’t have anything so transcendent to latch on to if and when bloody warfare comes knocking at our doors. There would be a lot of social and economic disruption — which is not what we need as we currently struggle with home foreclosures and unemployment and unending increases in food and fuel prices. Under Abu-Bakar Naji’s plan, there would be no succor from the shopping malls, as President Bush prescribed in the days following Nine Eleven.

OK, that article appeared in the NY Post — a Rupert Murdoch rag. Admittedly, Abu-Bakar Naji had a 2005 book called Management of Barbarism and had a lot of other previous writings in the same vein; so another tome on hatred and vengeance by him isn’t really a surprise. But on the same day, the NY Times posted an article about our lack of progress in the Pakistani “north-west frontier”, where Osama Bin Laden is thought to be hiding. The Times believes that al Qaeda has reestablished a network of training camps there not unlike what it had in Afghanistan up until late 2001. Because of Pakistani politics, we can’t just go in there with our Delta units and take them out. We are monitoring and harassing them with our airborne Predators (the pilotless aircraft equipped with cameras and missiles), but according to the Times, we don’t have enough of them to do real damage, due to demands in Iraq.

So — are we safe again? Or is this the calm following the first thunderclap, the pause before the real storm begins? I like to play the role of the gloomy prophet and thus get in an occasional “told-ya-so” when one of my predictions turns out by chance to come true. But on this one, I’d be perfectly happy to look back five years from now and admit just how silly and off-base my worries were. So check with me in 2013; I look forward to saying ‘yea, I was all wrong’.

◊   posted by Jim G @ 10:57 am       Read Comments (5) / Leave a Comment
 
 
Thursday, July 3, 2008
Science ...

When I was a kid, I used to come up with crazy theories based on partial, ultimately inaccurate understandings of things; my father used to cringe when I explained my theories, and would then dismiss my musings with a “Noooooooooooooooo . . .”

But hey, despite my father’s discouragement, I never lost the knack for coming up with bogus, half-assed theories. So in light of the recent publicity regarding the world of high-energy physics and the search for the Higgs boson (the particle that would intermediate the field interactions that give mass to most elementary particles), I’m at it again.

OK, I know this ain’t right, but — the principle of supersymmetry requires a symmetric partner for every particle; it represents a bigger version of the not-quite-super-symmetry between particles and antiparticles (which have opposite charge characteristics). So, what if there are super-symmetric particles relative to the Higgs boson (or bosons — might be more than one kind)? Would they have some sort of “opposite” characteristic relative to the Higgs field interaction? If so, would they involve something of an anti-Higgs field, whereby particles would gain a sort of “anti-mass”?

And if so, would the characteristics of such “anti-mass” be an opposite form of gravity? All particles with mass attract each other via gravity. So, would particles endowed with the supersymmetric “anti-mass” thus act according to anti-gravity, a repulsion force? And if so, would that have anything to do with the “dark energy” that is unexpectedly accelerating the universe’s expansion, in opposition to the slowdown expected by gravity?

An anti-mass property might also have to react to force in the opposite way — instead of resisting acceleration, it would fly off at the slightest touch. So just where would all this “supersymmetric stuff” be? Because of its reverse-gravity and reverse-force properties, would it already have been scattered out to the fringes of the time-space manifold which encompasses our universe? Would these particles in effect push against the edge of timespace, slowly reviving the inflation process and thus stoking the mysterious runaway expansion?

Nah, probably not. Sorry for all the pseudo-scientific gibberish. Alexander Pope famously wrote that “a little learning is a dangerous thing; drink deep, or taste not the Pierian spring”. I definitely have not drunk deep from the springs of particle physics and modern cosmology. But I still think it’s harmless fun coming up with such wacky ideas. In fact, here’s another guy who enjoys this kind of thing!

◊   posted by Jim G @ 9:27 pm       Read Comments (2) / Leave a Comment
 
 
Sunday, June 29, 2008
Public Policy ...

There’s an article in the local paper about the big 2008 federal cutbacks in support for basic research in science. This is not good. The cuts follow years of declining federal support, and the scientific research establishment is feeling the pinch. Labs are turning from basic research to more short-term, profit-oriented work, and young PhD students are avoiding basic science because of declines in job opportunities. There’s a brain drain going on here, and it’s going to hurt America. There has never been a time when we needed high-level scientific research more. We’re in the middle of a growing energy crisis and a food crisis and a global warming crisis, and we need as much technology as we can get to keep these things from bringing America down from its role as the big economic power of the world. Sure, basic research does not have an immediate payoff. But without the basic research in electronics and computer science that occurred in the 1970s and 1980s, we might not have the Internet as we know it today.

Well, bottom line is that this is really stupid, and whoever gets elected President this November (hopefully it will be settled in November — there are possible tie scenarios that would be even worse than 2000) had better do something about it. Or else America is going to be heading into a tailspin by the time that either Obama or McCain concludes his second term. A tailspin that’s a whole lot worse than the one we seem to be in now. Just as Toyota overtook GM as the biggest carmaker, it’s not impossible that India and China could some day surpass the USA in science and technology.

◊   posted by Jim G @ 10:23 pm       Read Comments (2) / Leave a Comment
 
 
Friday, June 27, 2008
Personal Reflections ...

The older I get, the more I appreciate the movie 2010. You space movie fans out there might remember that 2010 was the mid-80s sequel to Stanley Kubrick’s classic 2001. In 2010, a Russian space ship blasts off for Jupiter to find out just what all the weirdness was about between Discovery and HAL 9000 (which are still orbiting Jupiter at the start of 2010). Well, the joint Russian-American crew gets out to the big gas planet and figures things out, more or less, after an encounter with an “alien force” being channeled through David Bowman (the astronaut who disappeared through the monolith at the end of 2001). But there isn’t enough fuel left between Discovery and the Russian ship to get everything back to Earth. So they have to improvise, using Discovery and HAL as a booster stage.

Yea, the older you get, the more you realize that we’ve only got so much fuel, so to speak. We can only get so much done. Are you expecting me to add the usual clause here, i.e. “so make the best of it” ?? Nope, I ain’t gonna say that. Because who knows what the best really is. We’re here to learn to appreciate that question, but not really to figure out the answer. By the time we would figure it out anyway, it’s probably too late! All we can do is hold out hope that there is something more than what we can sense, and that our trials and travails will have some meaning in that broader context.

Well, that’s my two metaphysical cents for tonight.

◊   posted by Jim G @ 10:16 pm       Read Comments (2) / Leave a Comment
 
 
Monday, June 23, 2008
Art & Entertainment ... Current Affairs ... History ... Personal Reflections ...

Just a few random notes that came to mind today.

First off, the passing of comedian George Carlin. I wasn’t a huge fan of his. A lot of his humor stemmed from the ubiquitous striving among comedians to be the “dirtiest”, the most outrageous, and the most ribald joke teller. But Carlin was one of the wittier ones. He could also work into his routine a delightful, almost innocent weirdness. So it was with some regret on driving to work this morning that I recalled, after some inner confusion about the issue, that I never did see him live. I almost did. He did a show at my college (New Jersey Institute of Tech) back when I was a sophomore, right about this time of year. I wanted to go, but it turned out to have been on the night before a final exam in an important course. So I stayed home and studied. And I don’t regret it. What did upset me was that for a minute or so today, I DID think that I had seen him. It took some effort to break thru the early morning fog in my mind as I was waiting at a traffic light in Newark, listening to “Morning Edition” on NPR.

Second. I found out today that the Roman Catholic priest who baptized me had passed away earlier this month. I never knew Father Ed as a child, as he left my home parish while I was still a toddler. But thru some odd coincidences, I got to meet him about 16 years ago. He seemed a bit upset about the fact that I had moved over to the Anglican side of Christianity. Perhaps he would have been even more upset had he known that I would later give up all forms of organized religion. But that never meant that I don’t take seriously the ideas and ideals of theology and faith. And today we found out that at least 1 in 5 atheists also do so! (I.e., the just-released Pew Forum’s Religious Landscape Survey indicates that 21% of those who call themselves ‘atheists’ also claim to believe in God).

Third. This past Thursday was June 19, or “Juneteenth”, a traditional African American day of remembrance marking the end of slavery in America (it took until June 19, 1865 for Union enforcement of Lincoln’s Emancipation to reach Galveston, Texas, one of the last corners of the former Confederacy to receive the news). I rather expected Barack Obama to take advantage of the fortuitous proximity between this date (an official holiday in 29 states) and his de facto nomination as the Democratic Presidential candidate to make a significant speech on race, history and the American future. His Philadelphia speech made back in March, however candid by political standards, still only scratched the surface. There’s yet a whole lot remaining that whites, blacks and everyone in between needs to hear and say. And Barack Obama appears to be in a very good position to keep the discussion going.

Well, there is a brief note in barackobama.com acknowledging Juneteenth. But it looks as though Senator Obama had bigger fish to fry that day . . . such as opting out of the public Presidential campaign financing system, because it got in the way of his fundraising juggernaut. Is this change we can believe in? Or is it just the usual brand of change, change that forgets the past and is ultimately condemned to repeat it?

◊   posted by Jim G @ 8:49 pm       Read Comments (2) / Leave a Comment
 
 
Friday, June 20, 2008
Politics ...

So now we see Barack Obama’s true colors. The federal campaign finance system has many faults, admittedly. It needs revision. And a sitting US Senator would have had a decent chance to fix it. But instead of doing so during his time in the Senate, Senator Obama decided to build a killer fundraising application for his presidential bid. And once he had proof that it worked, he decided to trash the previous reforms with the excuse that they have their problems. Ah yes, the truest of true Chicago politicians. The guy who revives the old patronage situation and calls it “real reform”. The guy who asks us to throw away the system and put our trust in him instead. Ah yes, the classic urban politician, with an “I am the law” mantra. Something like that also happened in Germany during the 1930s. OK, that’s extreme, but I am getting worried about the demagogue effect that Obama is having on a large chunk of the American populace. The guy is starting to scare me.

Really. I’m not kidding. There’s something about Barack Obama that reminds me of some cheezy science fiction story, the one where the aliens manage to create various copies of the perfect human leader (one for each powerful nation; i.e., an American version, a Russian version, a Chinese version, etc.). Then they sneak their guys down here to earth and bide their time while their humanoid agents rise in power and gain the trust of the unsuspecting human race. When the time is finally right, the big saucers show up in the sky and announce the formation of a ‘new world order’ (i.e., we’re taking over your planet). And their judas goat leaders on earth urge the people of the world to stay calm and cooperate, saying that it will be good for everyone. Yea, I definitely could not picture Barack Obama leading the revolt in Independence Day!

So don’t blame me if Obama is elected and we find out that he isn’t such a do-gooder after all. I’m not voting for him.

Now as to McCain and his flip-flop regarding domestic drilling in environmentally sensitive areas: Let me make it clear that I won’t vote for John McCain either. McCain is definitely pandering to the GOP powers-that-be, after a career where he gained fame for thumbing his nose at them. Perhaps McCain is as much of a phony as Barack Obama has turned out to be.

As to the actual idea of drilling offshore and in northern Alaska: it might make some sense, but only as part of a compromise. (A compromise which neither Obama nor McCain would seem to appreciate). Offshore and northern Alaska oil is perhaps the United States’ last significant untapped oil resource. If exploited, those fields wouldn’t come close to satisfying our need for oil, but they might make things a bit better for our economy for a decade. They might lower domestic energy prices by 5 or 10% over that period. But after those fields dry up (as happened to Great Britain roughly 20 years after the big North Sea oil finds of the late 1970s), we’re back in the same bind that we are now in — or worse. The USA has an awful track record in terms of energy foresight. So I have no doubt that if McCain were to allow an oil drilling free-for-all and gasoline prices started coming down over time, we’d go back to the same wasteful ways for another decade, with big houses and big cars, etc.

So if we are going to use our last untapped oil reserves, it would seem to me that a lot of conditions should be placed on the whole enterprise. First off, extreme caution and oversight regarding environmental despoliation is necessary. I realize that oil drilling technology has advanced since the bans on off-shore and Alaskan drilling were imposed back in the 1980s, and the specter of big oil spills is a good bit less likely now. Second off, the public would have to realize that we are NOT going back to the days of cheap gasoline again. One good way of doing that would be to set a federal taxation plan such that the prices of oil products would not go down, even if increased supplies would otherwise cause that. The increased taxation revenues could be put to good use, perhaps to pay for the various wars that we are now involved in and also pay down some of the federal debt. If anything is left over, it could be used to fix our crumbling roadways, railroads, bridges, schools, etc., and to increase energy conservation and alternate energy research.

I was around back in the energy crisis of the mid-1970s and in the early 1980s. Commentators were saying that by the turn of the century, perhaps new technology would allow our economy freedom from petroleum dependence. Well, here we are, almost 10 years into the new century, and the end of our petroleum dependency is nowhere in sight. So if we are going to use up our last shots of domestic oil, we’d better do it very smartly and carefully. Otherwise, the SUV era may come back for a few years, but the angst that will result once it ends will make today’s pain at the gas pump seem like a mosquito bite. Too bad that McCain isn’t independent enough to say something like that out loud.

Bottom line: the choices for the 44th President of the US aren’t looking too good right now. Especially considering the huge problems that our nation will face over the next eight years. Blame it on the men themselves or the political system, but no one is able to say AND DO the things that will need to be done if the USA is to maintain its strength and leadership throughout the 21st Century.

◊   posted by Jim G @ 10:10 pm       Read Comments (3) / Leave a Comment
 
 
Monday, June 16, 2008
Brain / Mind ... Science ...

I’m currently reading a book called “Microcognition” by British philosopher Andy Clark. It’s basically about human efforts to model human intelligence using computers. Until a few years ago, the popular term for this effort was “artificial intelligence”. But today, that term isn’t very popular. Our scientists have made a lot of progress in understanding just what it is about our brains that makes humankind intelligent (well, at least in certain situations . . . . ). And in doing so, strangely enough, the stuff of our own brains has taught us a new way to compute. Through the 1950s and 60s and into the 1970s, our scientists attacked the problem of intelligence using standard computer programming, the classic realm of do-loops and if-then logic based on combinations of AND-OR gates (in the tradition of mathematician John von Neumann). They came up with some wonderful inventions, like those powerful chess-playing computers that no human can beat.

But those super computers couldn’t and still can’t do what people really do in life, i.e. figure out how to survive in confusing and changing circumstances and learn from their experiences and mistakes. The “AI” programmers couldn’t figure out how to make a computer form an “abstraction”, e.g. how to derive the common-ground concept of ‘cold’ from varied examples such as ice, Arctic fronts, and refrigeration. Over time, they finally considered the actual structure of the brain and noticed that it really wasn’t set up like a digital computer. Instead it was like a spider web of intricately connected little things, i.e. neurons, each of which is relatively dumb in itself. What we slowly learned from the brain was that if such webs of relatively simple information processing objects were set up in the right way (through trial and error, the general process of nature), then abstract ideas and creative re-combinations of them could “emerge”, almost as if by magic. I.e., we found a way to mimic abstraction and creativity.
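To make that “web of simple dumb units” idea concrete, here’s a little Python toy of my own — not anything from Clark’s book, and all the sizes and numbers are just illustrative. It hand-wires a tiny network of sigmoid “neurons” and trains it by trial-and-error weight nudging (backpropagation) on XOR, the textbook function that no single threshold unit can compute, but a small web of them can:

```python
import math
import random

# A toy "web of simple units": 2 inputs, 4 hidden sigmoid neurons, 1 output,
# trained by backpropagation to learn XOR -- a function that no single
# linear-threshold unit can compute on its own.
random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

H = 4  # number of hidden units (arbitrary choice for this sketch)
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b_h = [random.uniform(-1, 1) for _ in range(H)]
w_ho = [random.uniform(-1, 1) for _ in range(H)]
b_o = random.uniform(-1, 1)

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sigmoid(w_ih[j][0] * x[0] + w_ih[j][1] * x[1] + b_h[j]) for j in range(H)]
    o = sigmoid(sum(w_ho[j] * h[j] for j in range(H)) + b_o)
    return h, o

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

initial_error = mse()

lr = 0.5  # learning rate
for epoch in range(20000):
    for x, t in data:
        h, o = forward(x)
        d_o = (o - t) * o * (1 - o)  # output-unit error signal
        for j in range(H):
            d_h = d_o * w_ho[j] * h[j] * (1 - h[j])  # hidden-unit error signal
            w_ho[j] -= lr * d_o * h[j]
            w_ih[j][0] -= lr * d_h * x[0]
            w_ih[j][1] -= lr * d_h * x[1]
            b_h[j] -= lr * d_h
        b_o -= lr * d_o

final_error = mse()
print(f"error before training: {initial_error:.3f}, after: {final_error:.3f}")
for x, t in data:
    print(x, "->", forward(x)[1])
```

Before training, the outputs just hover near 0.5 for every input; after thousands of tiny nudges, the hidden units carve out the abstraction (roughly, “the inputs differ”) that no single unit could hold. On most random seeds it settles onto the right answers, though like any trial-and-error process it can occasionally get stuck — which is itself rather true to nature.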

Today, “parallel processing” and “neural nets” and object-oriented programming are hot items in computer science; they are allowing all kinds of advances such as voice recognition that really works. We haven’t yet been able to do what our brains do in terms of flexible thinking, but our machines are certainly getting better. Once we decided to put aside the “old fashioned artificial intelligence” approach based on man-made rules, and started listening to “mother nature”, computer science progressed by leaps and bounds.

The brain still hasn’t yielded all of its secrets, and I hope it will be a while yet before humankind figures them out. But it is certainly humbling to see another example of how we ain’t so smart after all; and that whatever smartness we do have is not our own invention, but was a gift from nature. Hopefully we will learn to use that gift wisely enough so as not to continue punishing and exploiting its source in the quest for wealth and independence. Nature is still smarter than we are, and if we keep pushing her past her limits, she may well find the need — and the means — to shut us down. And all the parallel-distributed silicon chips in the world won’t be able to stop that.

◊   posted by Jim G @ 8:51 pm       Read Comments (2) / Leave a Comment
 
 
Saturday, June 14, 2008
Society ... Technology ...

Here’s a quick review of my “interesting article of the week” for the second week of June (the one with Friday the Thirteenth in it). The article is from the July/August 08 Atlantic, titled “Is Google Making Us Stupid?”, by Nicholas Carr. Mr. Carr is worried that the Internet is changing things for our youth and for our society, in terms of how they get their information and how they do their thinking. He’s worried that people, especially young folk, are relying too much on Google searches and hyperlinks and video clips. They are getting too accustomed to skimming massive volumes of information, flitting from site to site and subject to subject, instead of sitting back and reading deeply on one topic from one author. Carr thinks that perhaps our brains will be re-wired because of this. Because of the social forces and corresponding biological factors set off by modern information technology and its close cousins, the electronic media and the entertainment industry, there will be no going back to the good old days of reading (and finishing) books and long magazine articles. Except for old timers like myself who grew up in the days of libraries with paper card catalogs, no one will even have the ability to sit back and deeply ponder things such as the effects of racism on the deindustrialization of American cities during the second half of the 20th Century.

Well now, there certainly seems to be a lot of truth to this. Blog sites that provide short information blips every hour on the hour seem to be a lot more popular than those publishing longer essays every week or so (which helps to explain why this blog never made it!). But then again, the book isn’t dead yet. Amazon still sells a lot of them online. Technology still hasn’t come up with a substitute for that good, comfortable feeling that you get when you sit down with an interesting book. I think it’s much nicer to read from something that comes from other living beings, i.e. paper from trees. It’s just not very cozy and comfortable reading from an electronic screen, no matter how light and portable screens have become. You just can’t curl up with a good flatscreen and while away a rainy afternoon.

So the book is not dead yet; it might be around for decades to come. But still, the statistical trends regarding book sales are somewhat disturbing. I checked out the annual sales estimates from the Association of American Publishers (www.publishers.org) going back to 1992 (with the help of the “Wayback Machine” on archive.org). Anyway, in 1992, the estimated net sales for the book industry in the US were $9.46 billion. Five years later, in 1997, they were at $17.2 billion. So the average annual growth rate in sales from ’92 to ’97 was 12.7%. Sales for 2002 were $22.40 billion; so the average annual growth rate for the next five years was 5.4%. In 2007, net sales were estimated at $24.96 billion. Sounds good, but the average annual growth rate from ’02 to ’07 slowed down to 2.2%. Remember, these are nominal dollars; during this time, inflation was chugging along at around 3% per year. So, after 2002, there isn’t any “real growth” in book revenues. Anyone want to bet that nominal sales will go flat and real sales decline from ’07 to 2012? (I’m surely not betting against it!). You can see why Amazon is expanding into music downloads, electronic goods, and all kinds of other household stuff and personal items.
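For the numerically inclined, the annualized growth rates above are easy enough to check with a few lines of Python. The sales figures are the AAP estimates quoted above; the 3% inflation rate is just my own rough assumption, not an official CPI number:

```python
# Check the compound annual growth rates quoted above.
# Sales figures: AAP net-sales estimates, in billions of nominal dollars.
# The 3% inflation rate is a rough assumption, not an official figure.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

sales = {1992: 9.46, 1997: 17.2, 2002: 22.40, 2007: 24.96}

print(f"1992-1997: {cagr(sales[1992], sales[1997], 5):.1%}")  # 12.7%
print(f"1997-2002: {cagr(sales[1997], sales[2002], 5):.1%}")  # 5.4%
print(f"2002-2007: {cagr(sales[2002], sales[2007], 5):.1%}")  # 2.2%

# Deflating the 2002-2007 nominal rate by ~3% annual inflation
# leaves real growth slightly negative -- i.e. no "real growth".
real_growth_02_07 = (1 + cagr(sales[2002], sales[2007], 5)) / 1.03 - 1
print(f"2002-2007 real: {real_growth_02_07:.1%}")
```

Nothing fancy, but it confirms the arithmetic: nominal growth decelerating each five-year period, and flat-to-shrinking in real terms after 2002.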

Carr says that “as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.” Personally, I don’t think that “good old-fashioned intelligence” is done for. But it may become a rarer and rarer trait over the next 80 to 100 years. The masses are already increasingly enthralled with entertaining technologies provided and controlled by a small band of international media corporations and cooperative big governments; meanwhile a small class of really smart people direct those corporations and governments. Yep, sounds much like science fiction. According to such fiction, most of those really smart people will get together over time and figure out a way to gain totalitarian control of the brainwashed masses. Meanwhile, a small band of loners and rebels will realize what’s going on, and will seek to “unplug” people from “the net” so as to fight back against the powers that would otherwise keep them contented. It’s The Matrix without the body vats.

Perhaps that won’t happen; it’s just little old me trying to be dramatic. But if it does, and if somehow my little scribblings floating on the vast digital seas of the Internet are preserved and readable in 100 years (which I doubt, given that Google hardly takes my site seriously), well then, don’t say that Mr. Carr and I didn’t warn you!

◊   posted by Jim G @ 12:13 pm
 
 