Category Archives: Confessional

Forget oxycodone. The most addictive drug is Google. And we’re past the point at which it’s just a ‘choice.’


Back on this post from yesterday, we were having the usual argument about the intrusiveness of private companies vs. the government, and as usual someone said “my use of Google Maps is voluntary,” an assertion which I questioned.

My use of Google Maps and other Google products is no longer in the realm of what I consider to be “voluntary.”

Google is as much a part of the daily infrastructure of my life, and the things I need to get done, as the streets I drive on. Its services are something I rely on in a more direct, frequent and ubiquitous manner than the direct services of the police.

I don’t see how to engage modern life without it — or something exactly like it. I couldn’t get through a day of ADCO work without it, much less publish this blog. Without Google, both of my active email accounts go away, and my browser (the instantaneous searches that occur when you type into the URL field, making it unnecessary to know the address of anything, are indispensable) disappears; there’s no YouTube, no really utilitarian Maps program, and no end of other useful things like Google Books and Translate (no longer can I just say, Well, that’s French and I don’t understand French… no excuse). Without Google Images, I have to fall back on my highly flawed memory for names and faces.

One can attempt to drop off the grid and no longer use Google, just as one can drop out of society at large — quit paying one’s taxes, go live in the wilderness off the land. Theoretically, at least.

But the cost of doing either is pretty high…

Yes, there are other services that do these things. But that’s not the point. If Yahoo or AOL had succeeded in being what Google is, or if Facebook were to succeed in being what it wants to be, then it would be the same thing; we’d just be calling it something different. And why ever use competing services for any of these functions, when the very fact that they are all knit together seamlessly magnifies their utility exponentially? I would no more want to switch platforms than I would want to try to leave the roads and drive on a railroad track in my car.

Kathryn writes, “Google is a gateway drug.”

Yes. And more addictive than most.

I always had trouble with being distracted by looking things up. It was just too seductive. A dictionary on my desk was a dangerous thing. I couldn’t look up a word without running across several other words on the way that fascinated me, and each of them led to other words, and on and on.

Fortunately, I had a good vocabulary, and seldom really needed to look up a word.

But now that I can, instantly, look up anything, I cannot stop doing it. A thought about a word or a fact that causes my brain to wonder or doubt even slightly (something I have always done, constantly; it’s just that for the first decades of my life it was harder to scratch that itch) sends me on an immediate search.

For instance, last night I watched “Looper.” Almost immediately, I wondered who the protagonist was. It looked remotely like Joseph Gordon-Levitt, but the expression and even the facial structure were wrong. (It was him, but he wore extensive makeup to make himself look like a young Bruce Willis.) Then I thought, “Isn’t Bruce Willis in this? Why haven’t I seen him?” So I checked, and yeah, he was coming up. I see Emily Blunt’s in it. Isn’t she the girl who… ? Yes, she is. She’s really something. Jeff Daniels is surprisingly good in this. What’s his character’s name again? And so forth… (By the way, the movie wasn’t very satisfying.)

OK, so most of that was IMDb, and IMDb isn’t Google. Yet. But the fact is, I often use Google to flesh out what I find in the movie database, because the info there is pretty sketchy. I like depth in my trivia. I used to do this with my phone, which is always clipped to my belt. Now, I usually have the iPad within reach as well.

In any case, now that it’s possible to look things up constantly, I can’t stop.

You can point to this as a character flaw (or perhaps an illness), and you have a good argument. But aside from the compulsive aspect, a certain amount of this is necessary to practically everything I do, everywhere I go.

Let’s say that a person only really needs to use these services a tenth as much as I do. I could concede that. But if a person doesn’t at least use them that tenth amount, he’s not going to be able to keep pace with the world and interact with other people at the pace that society demands — at least, not in anything I’ve ever done for a living. (Yes, I know that lots and lots of jobs today are still not information-based.)

That puts Google into the realm of essential infrastructure, again like the roads that are a function of government.

It at least puts any assertion that one is not forced to deal with Google (or, for the sake of argument, with some other “private” entity that’s just as useful) on fairly thin ice.

A ‘SmartCard’ is of little use to a stupid driver


A couple of years back, tired of getting tickets whenever I found myself without spare change for the meters (which was often, since I conduct few transactions with cash these days), I took the advice of one of y’all — I think it was Kathryn — and got myself a SmartCard.

I carry it with me always, and top it up whenever it gets low.

But you know what? It’s of no help at all if you don’t actually use it.

This morning, I went for my usual breakfast, and sat there eating and reading my iPad, and right about the time I decided to have a second cup because I hadn’t gotten around to reading all the papers yet (I’d gotten sidetracked trading comments with some of y’all while eating), it suddenly struck me — I hadn’t slipped the card into the meter.

Sure enough, I had an $8 ticket when I got down to the street.

This probably happened because I’ve had relatively early (I say “relatively” because I still work roughly the hours I did as an editor at a morning newspaper, which makes a meeting at 8 or even 9 “early” for me) appointments all week, which means I was done with breakfast and gone well before they start checking the meters at 9, so I didn’t have to use the card before today.

But that’s a poor excuse. I’m pretty irritated with myself over this…

Top 12 Songs I Either Missed Entirely, or Didn’t Fully Appreciate at the Time

This started out as a Top Five list, but there were just too many, even for a Top Ten. Maybe I should have split it into two lists (or even three), or simply been more selective. But I did none of those things. To carry on…

There are gaps in my musical memory.

For instance, there’s the late ’70s, when I was too busy starting a family and launching my late lamented newspaper career. Popular music of that time was in the very distant background for me, so I didn’t discover the Clash, or even Elvis Costello (just about my all-time favorite), until years later.

Music videos pulled me back in in the early ’80s — first TBS’ Night Tracks on the weekends, then later MTV. I loved the medium. Video may have killed the radio star, but it turned me on to so much music I would otherwise have missed. Madness, for instance. For a time, I told everyone, in all honesty, that if I could figure out how to become a director of music videos, I would give up newspapers for that.

Then, in the early to mid-’90s, MTV quit showing videos. Or at least, quit showing them all the time (video killed the radio star, and reality TV killed MTV). I had this routine in which I’d go down to the basement gym at The State and turn the tube onto MTV and watch videos while I worked out. This kept me current up to Soundgarden and Smashing Pumpkins and Nirvana, but then all of a sudden, the videos disappeared and I lost touch.

Consequently, I missed a lot of great stuff by Radiohead and Weezer and Green Day and Death Cab for Cutie and Me First and the Gimme Gimmes and Adele, to mention a few of those I’ve been listening to on Pandora or Spotify or YouTube lately. (At first, I was put off by Pandora’s way of refusing to play the song I asked for, and playing other things like it, but that has introduced or reintroduced me to a lot of great stuff I would not consciously have sought out).

Then, there are songs that came out at a time when I thought I was paying attention to music, but that went right by me — either because I was listening to the wrong stations, or my tastes hadn’t matured to the point that I fully appreciated them.

Here’s a list that gives just a sample of the stuff I find myself appreciating lately, while wondering how these songs got past me the first time (by the way, I’m not ranking these 1-12; I’m not sure I could, but I don’t mind naming them as, more or less, the top 12 in the category):

  1. “Love and Happiness,” Al Green. I could just as well have named “Tired of Being Alone,” or “Let’s Stay Together.” I was actually living in Memphis at the time that he was recording these classics, but was too focused on The Rolling Stones, James Taylor, Elton John, Paul Simon, Bob Dylan, The Band and other international stars to pick up on the richness that was right there at my feet. I choose “Love and Happiness” because I actually heard it recently and thought, I’ve heard that forever and never appreciated how awesome it is, and even thought, who is that?, before realizing a split-second later that of course, of course, it was the Rev. Al. And kicking myself.
  2. “Creep,” Radiohead. I had half-heard this many times before really listening to it once and realizing how good it was, how it qualified as rock in a way that so little other new music I’d heard in recent years did. This was two or three years ago. Then, when we visited Oxford and stayed on Abingdon Road, I somehow became aware that Radiohead was from Abingdon, and resolved to look into them further. Much later, I did, and now listen to my Radiohead “station” on Pandora as much as any other. Best bit: When the soft opening is first interrupted by the stuttering “CHA-chunk, CHA-chunk” of distorted guitar, and the whole nature of the song changes.
  3. “You Can’t Hurry Love,” The Supremes. Or almost any of their greatest hits, really. In the couple of years after I returned to this country in 1965 (after 2.5 years in South America), when I just could not get enough of American popular culture, the Supremes were always there — on the Sullivan show, everywhere. But I wasn’t into them. I was the stereotypical little white boy, into English guitar bands and Americans who imitated English guitar bands, with an occasional side trek into Herb Alpert or whatever. I just wasn’t that into those three elegant black women dressed like old people going to a formal affair. It was decades later before I realized how deeply they had imprinted their sound into my fondest memories of the period. Maybe it was Phil Collins’ special-effects tribute to this particular song in the early, exciting days of MTV that made me look back and consciously realize how amazing the Supremes were. Or the effective way “China Beach” used “Reflections” to, well, reflect the era. (I never actually watched the show, but I can remember pausing on the channel long enough to listen to the intro a number of times.)
  4. “Life on Mars?” David Bowie. I could swear to you that this song did not exist before I first watched, on DVD, the British time-travel-cop show of the same name (sans question mark). I had zero memory of it. Of course, I wasn’t at all into Bowie in his initial iteration, but still — I had heard and enjoyed “Space Oddity” and heard other songs of his in the background. But I had completely missed this. Even now, I’m not sure if it’s just that the song itself is so great (which it may be; a critic in The Telegraph listed it as the single greatest song of all time, with “Let It Be,” one of my personal favorites, in second place) or it’s just the way it shaped the wonderful opening scene in which the protagonist of the TV show is transported back to 1973 that imprinted it so favorably on my mind. (Wonderful touch — the song begins the scene playing on the character’s early iPod, which itself now looks dated, then ends up on an 8-track.) In any case, I listen to it a lot now. Oh, a word in your shell-like: Don’t bother putting the American series based on this into your Netflix queue (despite the presence of Harvey Keitel in the cast); just watch the original. (Best bit: 37 seconds into this clip, as the character “wakes up” into 1973 and the music reaches its climax.)
  5. “Say It Ain’t So,” Weezer. In this position I could put Green Day’s “Basket Case,” or any one of a number of super-catchy way-post-punk, post-grunge tunes. But I’ll just pick this one, because I’ve been listening to my new Weezer station on Pandora a lot the last couple of weeks. I got into them through their relatively recent hit, “I Want You To,” which has everything a pop song should have, despite the feckless theme of the lyrics (assuming it’s up to a girl to make the first move — although, when you see Weezer, you understand this better).
  6. “I Want You to Want Me,” Cheap Trick. Not much to say about this except that the title of Weezer’s “I Want You To” got me to thinking about it, and wondering who had played it — because I assure you, I had never been a Cheap Trick fan (I was way too busy in 1975). But I finally recognized it as a very well-crafted pop song, which has a classic feel to it after all these years. It’s sort of anthem-y. After all, doesn’t the title rather economically state what most pop songs are about?
  7. “Hallelujah,” Leonard Cohen. I wrote about this before, wondering at the alchemy that made it so hypnotically beautiful. Ever since I was quite young, I had known that cool people were supposed to be into Cohen — it was more of a measure of coolness even than being into Jeff Beck. But I was aware that I was not cool, and was satisfied not even to try to listen to him. It was the use of this song in the love scene of “Watchmen” that made me focus on it finally (which followed on a cover effectively used in “Shrek”), and I’m glad I did.
  8. “Such Great Heights,” The Postal Service. First, I heard the cover by Iron and Wine, which was on a copy of the soundtrack of “Garden State” belonging to my daughter. She referred to it as “that Postal Service song,” and later she persuaded me to spend a gift certificate for Best Buy on “Give Up” — which is probably the last complete album that I have bought and really, completely gotten into, to the point of listening to it scores of times, the way I did, say, with Abbey Road in 1969.
  9. “I’ve Been Loving You Too Long (To Stop Now),” Otis Redding. How did I live all the way through the 1960s thinking that Otis Redding was just that guy who had sung “Dock of the Bay”? Yes, that was a magnificent song, and no one could have done it better, but it wasn’t even really representative of what he did. I didn’t learn how wrong I was until I borrowed a greatest-hits CD from my brother (which I fear I never returned). This song moved me more than any other, but I could just as well have chosen “Try a Little Tenderness,” “That’s How Strong My Love Is,” or “Mr. Pitiful.” One weekend recently, my wife and I were walking up St. Philip Street in Charleston, and heard the echoing sound of “I’ve Been Loving You Too Long” coming out of one of the old Charleston-style houses turned into apartments. It made my day, and made me think highly of the tastes of the person who lived in that apartment (probably a very young person, since everyone in that neighborhood seemed to be about 24 — in which case, it’s someone who is much more hip to what’s good than I was at that age).
  10. “Bring It On Home to Me,” the Animals. After what I said about the Supremes, I guess I shouldn’t pick the version of this by a white guitar group, but hey, Eric Burdon wanted to be black more than any other white boy ever to come out of Newcastle. And this version is pretty soulful, I think. I think I like it better than the Sam Cooke version (which, I hate to say, seems to have most of the soul bleached out of it, in the deliberate effort to create a “crossover” hit). Anyway, I didn’t get into it until I bought an Animals greatest-hits compilation on CD, sometime in the past decade. Eventually, I put the song on the playlist of my band — you know, for when I get around to starting a band.
  11. “Goin’ Down,” The Monkees. I’ve mentioned this before. It’s sort of a special category. I thought it was cool when it came out (I had the album), but as I got older peer pressure brainwashed me into thinking that nothing by the Monkees could possibly be cool. Then it was used as background for a frantic meth-cooking montage on “Breaking Bad,” and I couldn’t place it for a moment, then recognized it. At first, I thought it was maybe someone else doing it, not Micky Dolenz, it was just so good. Finally, I realized I had been right the first time, when I was 13 years old — it was a great song, very well done. And the “Pre-Fab Four” should probably get more respect than they did. (Oh, and yes, I know that The Rutles actually called themselves the Prefab Four, but the sobriquet was applied to the Monkees first, intended as an insult.)
  12. “Mas que Nada,” Sergio Mendes and Brasil ’66. So yes, I heard this at the time, but it was in the background, and I never could have named it, the way I could have Mendes’ covers of Beatles and Simon & Garfunkel tunes. After I’d seen “Austin Powers” — which uses the song very effectively to evoke the period — several times, I set out to learn what that signature Brasil ’66 song was called. (It’s interesting to me how a Sergio Mendes or Herb Alpert or Petula Clark song, played in the background, can evoke the ’60s more effectively than a Beatles or Rolling Stones song can do.) Since figuring it out, I’ve listened to it a lot.


I give up — what’s that bright thing in the sky? (Oh. Sirius.)

Image from my app.

Any astronomers out there? Because I’ve got a question that’s been bugging me.

Some time ago, I picked up one of those cards at Starbucks that provides a code that lets you download a free iPhone app. This one was called “Star Walk,” and it was very cool.

Basically, you hold your phone up to the sky, and it gives you a labeled diagram of what you’re looking at. For that matter, you can use it inside, and it will tell you exactly where the planets and the constellations and major satellites are in relation to where you’re standing. You can even hold it toward the ground and see where the heavenly bodies are when they’re not in the visible sky, on the other side of our planet.
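For anyone wondering how an app can pull this off: as best I understand it, it combines the phone’s GPS position, clock and orientation sensors with a star catalog, and the math for placing a given star in your local sky is surprisingly compact. Here’s a toy sketch of the standard coordinate conversion — my guess at the general approach, certainly not Star Walk’s actual code:

```python
import math

def altitude_azimuth(ra_deg, dec_deg, lat_deg, lst_deg):
    """Convert a star's catalog coordinates (right ascension and
    declination) to where it appears in the observer's local sky
    (altitude above the horizon, azimuth from due north), given
    the observer's latitude and local sidereal time in degrees."""
    # Hour angle: how far the star is from the local meridian
    ha = math.radians(lst_deg - ra_deg)
    dec = math.radians(dec_deg)
    lat = math.radians(lat_deg)

    # Standard spherical-trig conversion
    sin_alt = (math.sin(dec) * math.sin(lat)
               + math.cos(dec) * math.cos(lat) * math.cos(ha))
    alt = math.asin(sin_alt)

    # Azimuth measured from due north, increasing toward the east
    az = math.atan2(-math.cos(dec) * math.sin(ha),
                    math.sin(dec) - math.sin(lat) * sin_alt)
    return math.degrees(alt), math.degrees(az) % 360

# Sirius sits at roughly RA 101.3 deg, Dec -16.7 deg. A negative
# altitude means the star is below the horizon -- which is how the
# app can "show" you objects on the other side of the planet.
alt, az = altitude_azimuth(101.3, -16.7, 34.0, 60.0)
```

A negative result is exactly the case the app handles when you point the phone at the ground: the star is simply below your horizon.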

I appreciate it because I’ve always felt particularly ignorant about what’s out there. I read those novels I love about Jack Aubrey and Dr. Maturin, and Jack is always so dumbfounded by how little his friend the doctor knows about the planets and stars or anything else having to do with seamanship. And I’m no genius with languages like the doctor, so I feel particularly stupid.

Before the app, if I saw a particularly bright object in the sky, I assumed it was Venus, unless it had a reddish tint, in which case I assumed it was Mars. But I really had no idea.

I’m not that much brighter now, but I’ve picked up a couple of things. I can look up right away and say, “There’s Jupiter.” And at this time of night, I can pick out Orion pretty clearly.

But there’s something that’s been perplexing me in recent weeks.

Jupiter is off to the right of Orion. Fine, I can see that. But there’s something a roughly equal distance off to the left of Orion, at about the same elevation, that’s just about as bright as Jupiter. And what with light pollution from streetlamps and such, that object is the only thing bright enough to see in that part of the sky.

There’s nothing on Star Walk’s celestial map to indicate that there’s anything that really stands out in that part of the sky. There’s Sirius, and…

You know what? I just looked up Canis Major, which I know to be to the left of Orion, and according to Wikipedia Sirius is the brightest star in that constellation by far. In fact, I see elsewhere that it’s the brightest star in the sky other than our own Sol. So, you know. Duh.

(Yes, all of you who know something about astronomy; I am abysmally ignorant. No way would they let me be master and commander of any vessel in Nelson’s Navy.)

Don’t know why my app didn’t indicate that. (It makes it look like Murzim and Betelgeuse and Bellatrix and Rigel are all just as bright, which they’re not.) But hey, it’s a free app. And what it does do is pretty cool.

It’s got to be Sirius. So never mind. Unless you know I’m wrong, in which case please tell me…

First-person shooter: What games did Loughner play?

This is a post I wrote back in early 2011, and didn’t publish. Recent discussions of gun violence bring it back to the fore, so here it is…

In my Monday Wall Street Journal (the only edition I received after coming back from England until late Wednesday, which was really frustrating), I read the following about the Arizona shooter:

“All he did was play video games and play music,” said Tommy Marriotti, a high school friend.

And that got me to wondering: What sort of games did he play? Since initially reading that, I see he recently played Earth Empires, a strategy game. But I suspect that at some point — maybe back in high school, maybe later — he played another sort of game.

I find myself wondering whether he was into first-person shooter games…

I have two reasons for wondering that. First, there are the theories of Lt. Col. Dave Grossman (ret.). Col. Grossman is the foremost expert in the field of “killology,” a term he coined. He wrote a fascinating book, On Killing: The Psychological Cost of Learning to Kill in War and Society, which I recommend. It discusses the psychology of killing, mostly within the context of war. He explains that for most of military history, as long as we’ve had projectile weapons in the hands of the average soldier, the overwhelming majority of soldiers did not shoot to kill. Frequently, they didn’t fire their weapons at all, and when they did, they tended to fire over the heads of their enemies — to engage in a sort of threat display, rather than use deadly force.

They did this because for most humans, the reluctance to kill is deep and strong.

The U.S. military, realizing this (on the basis of extensive studies during and after WWII), began conditioning that reluctance out of soldiers starting with the Vietnam era (or perhaps a little earlier; it’s been a while since I’ve read it). Soldiers started to be trained to quickly acquire the human target and fire accurately before thinking about it too much. The result is that the U.S. military is, soldier for soldier, the most deadly fighting force in the world, perhaps in history. (Probably the most dramatic demonstration of this was the battle of Mogadishu in 1993, in which elite soldiers faced mobs of Somali militias with a tendency to fire randomly and wildly with their AK-47s — the result was 18 dead Americans, but about 1,000 dead Somalis.) But soldiers who shoot now often pay a profound psychological cost later, and that was what Col. Grossman was motivated to study.

He has also ventured into related peacetime phenomena, such as the popularity and increasing sophistication of FPS games, which train the reflexes of the kids who play them to shoot quickly and accurately, without reluctance. He asserts that it’s not a bit surprising that we have Columbines given the ubiquity of such games. Kids have had conditioned out of them the hesitation that affected trained soldiers through most of history.

You may say Col. Grossman exaggerates. And indeed, some experts are far more phlegmatic about such games. I don’t think he does, but that’s because of the other reason I was interested: I’ve played these games myself. A decade or so ago, I had a copy of an early version of Wolfenstein. The violence was non-stop, but it was also cartoonish and unconvincing, only a step or two beyond Space Invaders. Now, it’s different…

Two years ago, I got myself a copy of Call of Duty: World at War. I was fascinated by the premise, which was to put the player in realistic scenarios from the Pacific and Eastern fronts in the Second World War. (Some of them weirdly realistic. When I saw some of the scenes from the Peleliu campaign in “The Pacific” recently, I thought, I’ve been there… It was weird.) But I was completely unprepared for two things: First, the realism. When I first booted up the game on my computer (and I had to get a more sophisticated video card to run it, even though my computer was almost new), I thought I was watching a video prologue — I didn’t realize the game had started. I couldn’t believe the graphics were that realistic, that high-res.

Second, the emotional manipulation, which was stunning. There are two story lines: In one, you are a U.S. Marine named Miller, fighting your way across the Pacific. In the other, you are a Red Army soldier. The designers of the game came up with their own way of overcoming any reluctance the player might have to shooting the enemy. The Marine scenario begins with Miller being a prisoner of the Japanese. As Miller, you watch the Japanese torture and kill your buddy, before one of them moves toward you with a knife, prepared to serve you in the same way — before he is stopped by the commandos who have come to rescue you. Your rescuers hand you a weapon, and by this point, you’re expected to know what to do with it.

In the start of the Russian scenario, you are lying still among dead and dying comrades in Stalingrad. As you lie there (the game won’t let you move at first), you watch German soldiers step around you, casually shooting the wounded as you watch helplessly. Somehow they overlook you. As the enemy moves away, a grizzled Red Army sergeant who was also playing dead whispers to you to follow him, and he will show you how to get your vengeance on the fascists, who, as he keeps reminding you, are raping your homeland. He hands you a sniper rifle…

Creepy, huh? At this point, you’d like me to tell you I didn’t go on and play the game, but I did. I’ve played it all the way through a number of times. It’s very seductive, because it’s challenging. But I wouldn’t argue if you were to say, “Yes, of course it is — like other forms of pornography.” I expect those of you who’ve never played such games will have all sorts of critical things to say about me for playing it, and I won’t argue with those assertions, either. I know how it looks. When my wife enters the room when I’m playing, I hastily shut it down. Because she is my conscience.

But that’s not the really creepy thing: Over time, I played the game less. I had mastered the easier levels, and the harder ones were just ridiculous. Also, well, I’ve tried to spend less of my life in nonproductive pursuits. But a number of months ago, I got curious about something: I had never played the “multiplayer” option, in which you fight against other players over the Internet. So I tried that.

And I discovered that either the world is full of unsuspected super-soldiers, with reflexes that are not to be believed, or there are a lot of geeks out there who spend WAY too much time getting ridiculously good at playing these games. The latter, of course, is most likely. And hardly surprising. But I discovered one thing that positively sent chills down my spine. I quickly accepted that I could not survive more than a few seconds against people whose reflexes were so finely honed to aggressive play of the game. Fine — I have trouble with basketball, too. And I figured that the guys who spend a lot of time on these games are 20-something, and an old guy like me can’t hope to keep up. But what got me was when I encountered a few players who had activated the feature that enabled them to speak with each other in real time as they shot and stabbed their way across the landscape.

The thing that got me was when I heard their voices.

They were little boys. They sounded like they were about 10. And they were very, very efficient, hyperaggressive and unhesitating virtual killers.

I quit playing at that point.

Anyway, that’s why I wonder — what sorts of games did Loughner play?

What does a ‘like’ mean, as we slouch toward post-verbalism (if that’s what we’re doing)?

The top of my main Pinterest page.

Some years ago — it could have been 20 — I read an article by Umberto Eco that seems appropriate to this topic. I don’t remember all the particulars of the piece, or even in which magazine it appeared. But I seem to recall that the semiotician and novelist set forth the notion that we might be moving, beyond a post-literate society, to becoming post-verbal, returning to means of communication common in medieval days when, say, a pub called the Rose and Crown would be identified by a hanging sign showing pictures of those things, rather than words.

The premise would seem excessively alarmist, or at least premature: the decades since I read that piece have seen an explosion of the written word on the Web. More people are writing, and reading, a greater profusion of words than at any time in the history of this planet.

But sometimes, we are faced with images alone, and words fail us. On friends’ Facebook pages, I’m occasionally confronted with images that just beg for accompanying text to explain them, but nary a word is offered.

And recently, I found myself in a world that brought the Eco piece back powerfully.

I was going to (and eventually did) write a light item for the ADCO blog about the addictiveness of Pinterest, which has hooked a couple of my co-workers. The spark was a study indicating that 20 percent of women who are online were into the site.

At first, I supposed that only women could possibly get into it, for as I perused the boards created by my female co-workers, I was overwhelmed by all the images of food and housewares and decorating ideas. As I said in that ADCO blog post, those screens looked like “the result of Edward Scissorhands going to town on a 10-foot-high stack of old copies of Better Homes and Gardens and Southern Living.”

But as I went through the little signup ritual for creating my own account, I saw how quickly the screen would morph into something that more interested me.

Here’s what happens: You sign in to the site. You are offered a screen full of slightly-bigger-than-thumbnail images. You are asked to “like” the ones that appeal to you. What you “like” affects what you see as you continue to scroll down. It’s rather fascinating to watch as the algorithm does its work. For a time, for a long time, the wave of images coming at you seems never-ending. The scroll bar on the right will seem to be approaching the bottom, then suddenly it will glide back up toward the middle as a new load of images arrives.
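If you’re curious how that algorithm might work under the hood, the simplest version is easy to sketch: every image carries tags, every “like” bumps your weight for those tags, and the feed re-sorts by tag match. The images and tags below are invented for illustration; Pinterest’s real system is surely far more elaborate.

```python
from collections import Counter

def record_like(image, weights):
    """A 'like' bumps the viewer's weight for every tag on that image."""
    for tag in image["tags"]:
        weights[tag] += 1

def rank_feed(images, weights):
    """Order images by how well their tags match the viewer's
    accumulated tag weights (best match first)."""
    score = lambda img: sum(weights[t] for t in img["tags"])
    return sorted(images, key=score, reverse=True)

# Hypothetical images and tags, purely for illustration
feed = [
    {"name": "b-26", "tags": ["wwii", "aircraft"]},
    {"name": "casserole", "tags": ["food", "recipe"]},
    {"name": "marilyn", "tags": ["movie-star", "1950s"]},
]
likes = Counter()
record_like(feed[0], likes)      # "like" the warplane once...
ranked = rank_feed(feed, likes)  # ...and WWII imagery rises to the top
```

This toy version also shows why the system overreacts the way I describe below: one “like” on a warplane, and everything sharing its tags leapfrogs the rest of the feed.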

I saw a lot of images that interested me a great deal, but I couldn’t decide whether to “like” them or not. I mean, what does it say if you click “like” on a picture of a B-26 going down in flames? I don’t like that it’s going down, with American airmen dying in it. But I do want the program to know that I find images of WWII warplanes interesting.

Or what about a picture of Michael Caine as spy Harry Palmer? Will it think I like the raincoat, or “The Ipcress File?” This is a place where words would help.

And what does it mean when I “like” a picture of Marilyn Monroe? I mean, have you ever seen a picture of her you didn’t like, on some level or other? I haven’t. And yet, after I liked one or two of them, they kept coming in a profusion that suggested that Pinterest thought I had some kind of Elton-John-like celebrity fetish centered on her. I continued to “like” them, because that was my honest and uncomplicated answer. But I didn’t want it to offer me nothing but movie-star pictures going forward.

Just because I like Sean Connery doesn’t mean I want to see pictures of Rock Hudson (not that there’s anything wrong with that). And my liking a picture of Natalie Wood doesn’t mean I want to see Robert Wagner. And what’s with these Jodie Foster pictures you keep throwing at me? I haven’t liked a single one, and they keep coming. Who do you think I am, John Hinckley? And just because I click on an interesting diagram of old military headgear doesn’t mean I want to look at one Confederate kepi after another!

So here’s where you end up, or where I ended up anyway: Pinterest now “knows” me well enough that one out of 10 or 12 things it throws at me will be mildly interesting. Which I guess is an achievement for a computer program.

But the language of social media — “like” and “friend” and other terms that so often don’t exactly describe the relationship in a given case — still needs work. Let’s not give up on words just yet.

Below are some of the pictures I “liked” as they were thrown at me. But really: What does it mean to “like” a picture of Bonnie and Clyde?

Enjoying reading about the last time I was this ticked off

At my desk at The State, evincing one of those moods./file photo from 2007

Today, grumpily wondering whether I’ll find the Democratic Convention next week as vapid, monotonous, insulting and obnoxious as I did the sliver of the GOP convention I listened to last night, I was reminded of a column I wrote four years ago.

If past is prologue, it would seem the answer to my dreary question is “yes.”

That column, which ran on Aug. 31, 2008, was headlined, “Yelling at the television.” If you go back and read it, it will tell you what the rest of this week and all of next week will be like, if you find the parties as disgusting as I do.

A favorite excerpt:

What sets me off? Oh, take your pick — the hyperbole, the self-importance, the us-against-them talk, the stuff that Huck Finn called “tears and flapdoodle.”

Take, for instance, this typical bit from Hillary Clinton’s speech:

My friends, it is time to take back the country we love. And whether you voted for me or you voted for Barack, the time is now to unite as a single party with a single purpose. We are on the same team. And none of us can afford to sit on the sidelines. This is a fight for the future. And it’s a fight we must win together. I haven’t spent the past 35 years in the trenches… to see another Republican in the White House squander our promise…

Let’s deconstruct that a bit.

Take back the country? From whom? Did I miss something? Did the Russians roll right on through Gori and into Washington? No? You say Americans are still in charge, just the “wrong” Americans, of the wrong party? But your party controls Congress! Take it back from whom?

… a single party with a single purpose. Now there you’ve hit on the biggest lie propagated by each of the major parties, the conceit that there is something coherent and consistent about such loose confederations of often-incompatible interest groups. Did you not just spend the last few months playing with all the force you could muster upon those very differences, those very tensions — between feminists and black voters, between the working class and the wine and cheese set? What single purpose, aside from winning an election?

This is a fight… No, it isn’t, however much you love to say that. Again, I refer you to what the Russians are doing in Georgia — that’s a fight, albeit a one-sided one.

… that we must win together. Actually, that raises a particularly pertinent point, which is that the only “fights” that “must” be won are the ones in which “together” is defined as all Americans, or all freedom-loving peoples, whereas such divisive factions as your party and that other one that will meet in St. Paul militate against our being able to win such fights together.

I haven’t spent the past 35 years in the trenches… You’re absolutely right; you haven’t. So spare us the war metaphors.

… to see another Republican in the White House squander our promise… Like that’s what matters, the stupid party label. Like there isn’t more difference between you and Barack Obama in terms of philosophy and goals and experience and what you would bring to office than there is between John McCain and Joe Biden. Come on! Please!…

Sigh. Fume. Mutter.

Yep. I was thinking almost identical thoughts last night watching this convention.

I was pretty disgusted back then. Now, I enjoy reading about how disgusted I was. I always find that my writing improves with distance…

Where do kids listen to their pop music today? (All I know is, it better not be on my lawn…)

Spotify informs me that Darla Moore has subscribed to “my” playlist, “NPR Songs of Summer.” Of course, it’s not “my” playlist. It’s NPR’s.

For a moment I thought I’d discovered what Darla had been up to since Nikki bumped her from the USC board of trustees — listening to Adele, LMFAO, Taio Cruz, Gnarls Barkley, Simon and Garfunkel and the Stones. But then I realized it was another Darla Moore altogether — but one, it should be said, with pretty good taste, who also listens to Emeli Sandé, Kate Bush, R.E.M., Loudon Wainwright III, Beck, the Velvet Underground and the Psychedelic Furs, among many others, according to her public profile.

Which is aside from my point. The point is, I have a confession to share.

After having played them over a bunch more times, I realize I was wrong about some of those songs on the NPR list. Some of the recent songs I rated really low on my zero-to-five-stars scale are a lot better than I thought they were when I first rated them.

For instance… I wake up in the morning with LMFAO’s “Party Rock Anthem” in my head. It has really grown on me.

And more dramatically, I originally rated Adele’s “Rolling in the Deep” at two stars, which was ridiculous. I now consider it to be worth at least four, if not five. It’s amazing. I didn’t come to this decision because of seeing two of my older (male, amazingly enough) cousins dancing to it with abandon at a wedding a couple of weeks back — doing something that looked very like an Indian rain or war dance, which the song’s driving rhythm tends to abet.

No, I’ve come to that conclusion from listening to it over and over. And eventually going, wow. You know how I posed the question of what, exactly, makes Leonard Cohen’s “Hallelujah” so mysteriously stirring? What, I asked, is the music doing to the ear, the brain, the soul in that part that “goes like this, the fourth, the fifth/ The minor fall and the major lift…?”

Well, something comparably awesome happens, building irresistibly, and then exploding, every time, when Adele sings this part:

The scars of your love remind me of us
They keep me thinking that we almost had it all
The scars of your love, they leave me breathless
I can’t help feeling

We could have had it ALLLLLLL…

It’s just amazing.

But it took time for me to fully realize it.

And it occurs to me that that is a large part of the difference, in terms of my appreciation, between recent songs and something like, for instance, “Honky Tonk Women,” with which I was saturated during the summer of 1969. (When I hear it, it brings one particular memory specifically to mind… driving down Highway 17 between Myrtle Beach and Surfside, passing by right where Tad’s used to be, telling my Uncle Woody — who’s just a little older than I am, and therefore sort of like an older brother — that that was just the best driving song ever. This was possibly influenced by the fact that I had just started driving.)

It’s not that I’m an old fogy — although I’m sure some of you will have your own opinions as to that. The thing is, I react to music much the same as I did in my youth. I certainly feel the same inside when I hear it.

But back in the day, we heard the songs so often, and they had a much better chance of growing on us. On TV, on the radio, walking down the street, coming from a juke box. Music was so common, and shared, and unavoidable. Grownups were able to mock The Beatles’ “yeah, yeah, yeah” because they heard it, everywhere.

There was one Top 40, and everybody was exposed to it. Now… music is more diverse, and specialized, and broken down. And I have the sense that you have to go out and seek it more than you once did. Even if it’s only clicking on a link from a friend via social media, you sort of have to seek it out.

Yeah, maybe it’s just because I’m not invited to those kinds of parties, but music just doesn’t seem as public and as ubiquitous as it once did. Is that a misperception? I don’t know.

I do know that music took a shift toward the private and esoteric and fragmented in the 70s, as we all became “album-oriented.” But then it came back together, became more democratic, in the 80s with MTV, to where most of us have a shared soundtrack for that era.

Now, just as people can choose highly specialized TV channels to watch — rather than having to be satisfied with three networks — they are more empowered to choose a specific musical direction, and have it be private, through their ear buds. Yes, it’s shared, but more person-to-person, rather than communally.

Or so it seems. As I say, I don’t go to parties where current pop music is being played, assuming such parties still exist. But then, I was a pretty antisocial kid, and didn’t go to all that many parties.

So what’s different? How do y’all see, or rather hear, the music scene today?

The music used to be so public, and unavoidable.

The alleged Top Ten best films of all time

There are things that run through my mind when I see Kim Novak. "Great actress" isn't one of them.

Roger Ebert brings my attention to this report by Alexander Hull on this decade’s Sight & Sound Top 10 Greatest Movies of all Time. Hull starts out:

The recent unveiling of Sight & Sound‘s 2012 list of the Top 10 Greatest Movies of all Time brings with it the inevitable chatter that accompanies most lists taking authoritative stabs at qualifying the best of, well, anything. Cinephiles scan for snubs, ranking quirks, and whatever consistencies and trends they can glean from the list. Released every ten years since 1952 and voted upon by hundreds of critics and industry professionals, Sight & Sound has long been seen as a definitive voice in cinema-culture consensus. This time around, though, there’s one gleaming omission from the Sight & Sound list: modern films. The top 10 doesn’t include any movie made in the last 44 years, and the Top 50 only features 13 films since the 1970s (only six since the 1980s)….

To be sure, there’s something obviously preposterous about saying that the decades after the release of 2001: A Space Odyssey in 1968 have produced no films worthy of inclusion in the top 10. If a movie is a masterpiece, it should be ranked as a classic, regardless of how old or young it is—right? Since 1968 (or the 1970s if you’re looking at the Top-50 list) cinema has offered countless great, widely acclaimed films. The critical question, as voiced by New Statesman‘s Ryan Gilbey: “Are those who voted paralysed by history or are the finest films really located in the distant past?”

But I’d argue that the voters are not as paralyzed as some might suspect. The new Sight & Sound list actually does represent a move—a small move—towards the modern. Citizen Kane lost its top spot to Vertigo, a movie 17 years its junior. And compared to the 2002 version, this year’s top-50 breakdown features fewer works from the years between 1920 and 1950 and more from the years between 1960 and today. These incremental shifts towards the new (well, newer) certainly suggest change is happening and that modern films are becoming canonized. It just also suggests that the canonization process is very, very slow.

Personally, I’d suggest that the methodology of this survey is lacking. This comes across like the consensus opinions, reflecting a discernment process lasting centuries, of the Old Ones in Heinlein’s Stranger in a Strange Land. (And by the way, why hasn’t that been made into a movie yet?)

Other reports have noted the fact that “Citizen Kane” has been toppled from the No. 1 spot. Which would be fine with me — I find the constant ranking of that admittedly excellent film on the tops of such lists rather monotonous — if only it were replaced by something awesome.

But instead, it’s replaced by Hitchcock’s “Vertigo” — a film that, to be honest, I can’t remember whether I’ve seen. Film buffs aren’t supposed to admit things like this, but frankly, some of Hitchcock’s films run together in my mind. Of course, if it’s the best movie of all time, certainly I haven’t seen it, or I’d remember, right?

But then, my tastes are seldom those of the kinds of people who assemble these lists. For instance, there’s the overabundance of foreign films, which too few Americans are regularly exposed to. Yes, there’s Netflix now, and I do order foreign DVDs (how else could I have been exposed to the wonderful “The Lives of Others?”). But it’s not like I’ve seen it 10 times in theaters, starting when I was young — which I suspect is the case with New York or Los Angeles-based critics. Because those are the kinds of movies they seem to be into — ones that prove themselves over and over. I don’t know if I’m explaining myself well. But I’ve often thought that maybe if I were exposed to “Citizen Kane” more often, I’d realize how awesome it is. But I haven’t been, and I don’t.

Here’s what I think of the films in this new list:

  1. “Vertigo” — OK, so I’ll put it on my Netflix list to make sure I’ve seen it. I’ll only pass on something my wife said last night. “Pal Joey” was on the tube while we were getting ready to have dinner, and she said something like, “What made anyone put ‘Kim Novak’ and ‘acting’ together?” I couldn’t answer her.
  2. “Citizen Kane” — Again, maybe if I watch it over and over I’ll get hypnotized into thinking it’s awesome, but it might be too late. It’s been the butt of too many jokes playing on elements of the film that have become cliches. But it did produce some awesome b/w stills, I’ll say that.
  3. “Tokyo Story” — Since the article doesn’t tell me, I don’t even know what it is about.
  4. “La Règle du jeu” — Ditto. Another one for the Netflix queue, I guess.
  5. “Sunrise: A Song of Two Humans” — Same deal. This is getting monotonous.
  6. “2001: A Space Odyssey” — A masterpiece, all right, although not one of my faves. I do happen to own it on Blu-Ray — it’s one of the first discs I went out and got when I first got a Blu-Ray player — and watched it again recently. The cinematography in the early scenes of the Pan Am flight to the moon is great — the ballet of the spheres, and so forth — as are the scenes between Dave and Hal, as the quiet tension builds. But something struck me, as happens sometimes with pre-MTV films: the pace is slow, and while I’m impressed with all the majesty, I get a little antsy.
  7. “The Searchers” — Another I’ll have to see again, and try, try to understand why so many critics rate it above “Stagecoach,” “My Darling Clementine,” or “High Noon.” Probably something esoteric.
  8. “Man with a Movie Camera” — As Soviet films go, I’ve at least heard of “Battleship Potemkin.” This, no.
  9. “The Passion of Joan of Arc” — Yeahhh… that’s one of those I kind of knew I should probably see sometime, but haven’t quite gotten around to…
  10. “8 1/2” — OK, now this one I think I started to watch once, out of a sense of duty, but I didn’t finish it. Guess I should try again.

Basically, I think those who contributed to this list have achieved their goal: They’ve made me feel like an uncultured boob.

Now, for a regular ol’ unpretentious, red-blooded, All-American, pure vanilla Top Ten list. I’ll give my reasons for my picks some other day:

  1. It’s a Wonderful Life
  2. The Godfather
  3. Casablanca
  4. The Graduate
  5. High Noon
  6. Saving Private Ryan
  7. The Natural
  8. Hoosiers
  9. His Girl Friday
  10. Mean Streets

And as a bonus, here are five more to chew on:

  1. The Year of Living Dangerously
  2. Gran Torino
  3. In the Line of Fire
  4. Young Frankenstein
  5. Goodfellas

Alla you foreign film buffs, get offa my lawn!

What my Paul Harris Fellowship means to me

Today, I was one of a group of Rotarians called up to the front of the room and honored for becoming “Paul Harris Fellows.”

Let me try to explain, simply, what that means to Rotary: It means the “fellow” has contributed $1,000 to the Rotary Foundation. Although I’ve been told probably 100 times what the Rotary Foundation does, I can’t seem to remember. According to this website, the Foundation’s mission “is to enable Rotarians to advance world understanding, goodwill, and peace through the improvement of health, the support of education, and the alleviation of poverty.”

Which is kind of general and vague, bearing a marked resemblance to a response given by a Miss America contestant. But a recent note of thanks from Rotary International for a contribution of $9 (I have no memory of ever having given precisely $9 to the Foundation on any occasion) offered an elaboration:

On behalf of the mother who will receive prenatal care, the father who will have access to fresh water for his family, and the children who will learn to read and write in their newly furnished school, thank you for your gift to The Rotary Foundation’s Annual Fund. Your contributions provide immediate funding to projects that assist these individuals, these families, these communities.

If the first statement was too general, those examples were a little too specific, too retail, for me to get a clear idea of what the Foundation does. But that doesn’t matter much to me. I belong to Rotary for the fellowship of the specific people who are in the Columbia Rotary Club, and Rotary International remains to me not much more than a remote concept. Giving to the Foundation is just something Rotarians do.

Now… all of that said, my purpose in this post is not to communicate what the fellowship means to Rotary, but what it means to me, which is not the same thing at all. Oh, another thing I’m not doing — I’m not trying to get you to think I’m a swell guy for giving a thousand dollars to advance world peace, end poverty and so forth. It was pretty painless. In fact, most of the money I gave wasn’t even mine.

To get to my point…

A little more than 11 years ago, my wife was diagnosed with breast cancer. It had already spread to her liver when it was found. We found this out in a quick series of shocks: First the lump, then the exploratory surgery that found that the nodes were involved, then the biopsy that found multiple tumors in her liver. Stage four cancer. It is a brutally blunt understatement to say that her survival chances weren’t good.

We lived the next few months in a fog of anxiety mixed with urgent determination to do whatever we could. When 9/11 happened, it had little emotional impact on me; I was too wrapped up in this (I wrote about that in a column at the time). There was the quick series of interviews to find the right oncologist (we found the best in Bill Butler). Then the biopsies, and one bad report after another. Then a massive round of chemo. Then the surgery. Then a brief period of recovery, followed by another devastating round of chemo. Followed, after another brief time for recovery, by radiation. Then, the beginning of a routine of milder chemo treatments every three weeks for the next eight years.

One night, early in the process, I was watching television, and for a moment, had stopped thinking about this horrible thing. My wife, who had been on the Internet where she spent so much of her time during that period, walked in and said she had good news — she had found a site that said she might live for five years if everything went right. That, she said, was easily the most optimistic assessment she had found. I was devastated. That might, in fact, have been my low point. I had not actually internalized, in a quantitative sense, how bad things were until that moment. And my shock was exacerbated by guilt, for having for a moment forgotten about this thing hanging over us. Watching stupid television.

We got through this time through the prayers and concern of many, through determination, through the skillful guidance of the folks at S.C. Oncology Associates, with the helping hands of friends (all sorts of folks brought us dinners during that period). One evening our pastor, Monsignor Leigh Lehocky, visited and spoke with us. I don’t remember all that he said, but I came out of that meeting with a particular focus on something Jesus told his followers more than once: Think about today; don’t get wrapped up in worrying about tomorrow. Sufficient unto the day is the evil thereof. Ask for your bread daily, not for storehouses that will supply you for life. Storehouses just keep you up nights.

So for my part, that’s what I did. I drew a line. I did not think about tomorrow, because it didn’t bear thinking about. I just focused on what we needed to do today to fight this threat.

But then one Monday, early in the crisis — sometime in the summer of ’01, I think — someone at Rotary spoke about how everyone in the club should try to become a Paul Harris fellow. The speaker — I don’t recall who it was now, although I can remember where I was sitting in that room at Seawell’s — said you don’t have to write a check for $1,000, although some in the club would do that. He or she said we could just commit ourselves to giving $25 a quarter, and in 10 years, we’d have accomplished the goal.

I sat, staring down at the carpet, almost shaking I was so upset. I was holding myself back from shouting, Don’t TALK to me about ten years from now! I don’t want to THINK about ten years from now! You’ve got no business, no right, trying to make me do that!

I don’t think anyone noticed what was happening to me, and I was glad for that. But I was shaken.

As much as I resented that pitch, at some point I started making the payments. It wasn’t about me; it was about the mission of Rotary, and I was in Rotary, so…

In any case, it wasn’t me doing the paying. I was in Rotary because my publisher (Fred Mott at the time) had told me to join (and because Jack Van Loan was recruiting me). The newspaper completely paid my way as a member. So, as the executive in charge of the editorial division budget — and as a member of the newspaper’s contributions committee, back in those days when we still had money to distribute in the community — I made the decision that if I were to be a member in good standing, the cost of contributing to the Foundation should be added to those quarterly payments I signed off on. It was a justifiable expense.

When I got laid off in 2009, I had a couple of decisions to make, among many others: One was whether to stay in Rotary, given that I had to pay for it myself now. The other was whether to keep making the Foundation payments. I’ve made these decisions over again every quarter when the bills come. Each time — so far — I’ve answered “yes” to both. So I guess a little over $300 of that thousand has come from me, in small increments. I sort of figured, I had come this far… and by this time, all members were expected to at least be working on becoming fellows. It really wasn’t seen as optional.

Since that first $25 payment, a lot has happened to us in our personal lives. Our children, three of whom still lived at home in 2001, have gone through all sorts of passages — graduations, and weddings for two of them. Most wonderfully, four more grandchildren have come into our lives.

My wife was first told she was definitely in remission early in 2002. In 2010, Dr. Butler said he thought it safe to take her off chemo altogether (for years, the regimen she was on didn’t have enough of a track record to give him a guide on when it would be safe to stop it).

For the past four-and-a-half years, she has spent most of her waking hours taking care of our four youngest grandchildren. She is their Nonni, and it would be impossible to overestimate how much she means to them. She is an irreplaceable part of their world, as she is of mine, and our children’s.

Last year, we spent 11 days in England, after delivering our eldest granddaughter to her Dad, who was studying at Oxford. Aside from one trip to Disney World with our two youngest daughters some years back, it was the first time we’d ever been able to go anywhere together other than the beach, or to visit family. We had a wonderful time together. Now, inspired in part by a whirlwind European tour our youngest daughter just returned from, we’re working on coming up with an excuse to go visit Wales and Ireland next summer. We may just go anyway, excuse or no.

So this is what the Paul Harris Fellowship means to me: It’s not about world peace or ending poverty, as wonderful as those things are. It’s not about standing up there today and having my fellow Rotarians applaud and congratulate me and the others, as kind as their intentions are.

What it means is that, even when things are at their darkest, the future is a thing worth investing in. Maybe you won’t make it to the end of the next decade; there are no guarantees in this life. But you might. And it’s worth a try.

People are hiding from me! On purpose! In 2012, when all info is supposed to be easily accessible!

Yes, we live in amazing times, even though we still don’t have flying cars.

Michael Rodgers is NOT hiding.

Just one example, from today, of the sort of miracle we take for granted, but which would have sounded like the wildest sort of science fiction back in, say, 1987 — the year I came to Columbia to become governmental affairs editor.

We were kicking around an idea for a TV commercial for a potential client, and suddenly I had a sort of half-memory of having seen an ad, long ago, that did something familiar. I whipped my iPhone from its holster (and if I wanted to see Cleavon Little say, “Just let me whip dis out” in “Blazing Saddles” within a few seconds, I’d do the same), and found a reference to the ad I was thinking of within 30 seconds. Within another 30 — still using my phone (my own personal phone that goes everywhere I go, which was conceivable in 1987 but still fantastic) — I was watching that ad on YouTube. An ad that last ran in — get ready for it — 1987.

If, in 1987, I had wanted to find out about an ad from 1962, I would have had to spend half the day or more at the library. Whether I even found a reference to it would have depended on some pretty tedious guesswork with a periodicals index, and then I would have had to cross my fingers that the library stocked that particular publication, and kept issues going back 25 years.

Kathryn Fenner is NOT hiding.

As for actually seeing the ad, without a trip to New York or L.A. and a pretty tedious search once I got there — well, I would have been s__t out of luck, to use the technical term. Oh, maybe if I reached the right person on the phone in one of those places, and they were willing to make me a VHS tape and mail it to me, I might get to see it within a week. But it would have been iffy at best.

Anyway, I say all this to express my appreciation for all the things we can so easily find and experience now, right at our fingertips.

But this post is about the things we can’t, and how frustrating that is.

Phillip Bush is NOT hiding.

Today, the very day of the 1987 ad miracle, I was looking for a mug shot for my contacts list. You know how Google Contacts and iPhones and even Blackberries and Palms allow you to attach a picture of a person to their contact info? Well, I try to take advantage of that whenever I create a new contact. It usually only takes a few seconds. (It took me maybe a minute total to find the four mugs you see here, using Google Images.) I do this because I’m terrible at keeping names and faces straight — I know, or sorta know, too many people for that. By having this feature in widely used software, we are encouraged to do this. It’s normal. (If you had tried it in 1962 — the way the dwarf character did to Mel Gibson’s character in “The Year of Living Dangerously,” keeping a dossier on him and other friends — it would have creeped people out.)

For instance, if you Google me, you get a lot of pictures that are not me, just people associated with me; but in the first couple of pages of results, there are about 17 images of yours truly. That’s a lot, on account of my blog and my long association with the newspaper, but not extraordinary; I get similar results with plenty of people on my contacts list.

But then… every once in a while… there’s someone I can’t find. Sometimes it’s understandable. They are quiet people who work in some private business that doesn’t require a lot of public interaction. But sometimes… it’s like Winston Smith and the gang in 1984 have expunged the person from existence.

Today, it was someone who actually leads a very public organization that advocates on behalf of a very hot local political issue. I had that person’s contact info, from an email, and while I could sort of picture the person in my mind from past interactions, I wanted the crutch of having the mug shot there in case memory failed me at a critical moment.

Doug Ross is NOT hiding.

And I could not find this person anywhere. Eventually, I set my pride aside and tried her Facebook page, which for me is really last-ditch (and feels, even in 2012, even for an unreconstructed journalist, a bit like prying sometimes). And discovered that this was one of those people who not only doesn’t have her own face as her profile picture, but doesn’t have a single image in which she appears among any of her Facebook photos.

At which point I started hearing that little dee-dee-DEE-dee music from “The Twilight Zone.”

Yeah, I realize, some people are just private, as anachronistic as that is in 2012. But I don’t see how a person who is heavily involved in the community manages to disappear so completely.

Thoughts about this? Does this happen to you? Does it drive you nuts? It does me. Maybe it shouldn’t, but it does. Information is normally so extremely accessible that when it isn’t, it just seems wrong.

Binge TV watching: “You don’t want to know”

Never mind that noise. It's just a bunch of walkers...

The “walker” down in the well was making that characteristic noise they make — the half-strangled, wheezy, snuffling snarl — as the survivors debated what to do about getting it out of there. And I guess I had it turned up a little louder than usual.

My wife called from another part of the house, “What is that noise?”

I hesitated.

Then, I said, “You don’t want to know.”

She laughed, and dropped it. I breathed a sigh of relief that I didn’t have to answer further, which would have embarrassed me. I was deep into a “Walking Dead” binge. And I knew that my wife had the exact attitude toward the series that I did before I got hooked on it — I couldn’t imagine wasting my time watching something that gross.

I had recorded the entire second season when they showed it in marathon form a weekend or two back, and was well into it now. This was to prepare me for the premiere of the third season, which is in… October! How’m I going to wait that long? Especially if this season leaves things hanging the way it almost certainly will, with this unresolved conflict between Rick and Shane, and Hershel likely to throw everybody off the farm any minute? And of course, walkers all over the place, and ammunition in finite supply.

At the back of my mind lingers another worry — my DVR is running out of space. Do I go ahead and erase this when I’ve watched them all? And what about the most recent seasons of “Breaking Bad” and “Mad Men,” which aren’t on Netflix yet? OK, I can erase “Breaking Bad” Sunday, when Season 5 premieres and Season 4 goes up on Netflix, but what about the rest? It’s a tough call.

So it is that I could identify with this piece in The Wall Street Journal this morning, headlined, “Binge Viewing: TV’s Lost Weekends,” which is all about how, “Using streaming and DVRs, TV viewers are increasingly gobbling up entire seasons of shows in marathon sessions: How that’s changing the game for media companies, advertisers and show creators.”

An excerpt:

With the new season of “Breaking Bad” starting Sunday night, Chad Rohrbacher plopped down on his couch recently to catch up on some past episodes of the show about a chemistry teacher turned drug kingpin. Twenty-two hours passed before he clicked off the set. Pausing only for bathroom breaks, sandwiches and occasional comments of disbelief from his wife, he watched two entire seasons in one go. “It just kind of snowballed,” the 40-year-old novelist and college professor recalls.

When last season ended with a showdown between the bespectacled antihero and his drug boss, Mr. Rohrbacher watched the climax alone in his Greensboro, N.C., living room at 4 a.m. “I could barely see,” he says. The next day, “he was there in body, not in mind,” says his wife, Melanie, who did not participate. “I have a house and kids to take care of.”…

I feel for you, Chad. Been there.

Used to be, I had Netflix — and my DVR — just for the movies. In fact, that was pretty much the only reason I had a TV. But now, that’s changed. And AMC has played a big role in making that happen. They seem to be the best at making must-keep-watching, high-quality TV shows. And now, you can watch them all at once.

So you’re saying that you’re a grownup, and you have your head on straight and a solid set of priorities, so this doesn’t affect you. Well, yes it does, if you watch TV at all — because it’s altering what’s being offered. Writers are having to write differently for this kind of audience. And it’s changing the whole business:

The industry ramifications are bigger than the occasional weekend lost to “Lost.” Bingeing breaks habits that have long supported the TV business, built on advertising and syndicated reruns. TV executives are torn by the development: gratified that people are gorging on their product, frustrated because it’s a TV party that all-important advertisers aren’t invited to. For middlemen like Amazon Instant Video, Hulu Plus and Netflix, it’s a godsend, boosting their quest to attract and retain subscribers. Writers and producers are just starting to confront the challenges of creating TV for an audience that may digest an entire season in one sitting.

Vince Gilligan, the creator of “Breaking Bad,” got his start in the writers room for “The X-Files.” There, he was schooled by creator Chris Carter in building cliffhangers to lure the audience through commercial breaks and into the next episode. Now, besides unspooling a narrative week by week, year by year, writers must also keep in mind fans who take the story “in a giant inhalation,” says Mr. Gilligan.

He describes his show as “hyper-serialized,” in the way writers try to close the loop on every character and plot point. A homemade poison that methamphetamine whiz Walter White (Bryan Cranston) tried to use on a drug dealer in season two ended up driving a big plot twist in season four. “We use every bit of the buffalo,” says Mr. Gilligan. He now believes fans who devour multiple seasons in short order are “more rewarded” because their memories of all the story threads are fresher. (Others disagree: TV critics are arguing about whether serialized TV is better when savored between episodes.)

Of course, the good news is, if I’m caught up on all three of the AMC series I watch (there would be four, if they hadn’t cancelled “Rubicon”), and I haven’t gone off on another tangent, like that period several months ago when I was plunging through all six seasons of “Lost,” I’m relatively free to live my life — interact with actual humans, or even read a book.

But when I’m in the middle of a season, it can be bad. The WSJ piece warns, “Brain chemistry plays a role in bingeing. ‘We get into something akin to a trance with great storytelling,’ says psychiatrist Norman Doidge, author of ‘The Brain That Changes Itself.’ Viewers identify with characters on screen and subconsciously begin to mimic their emotions—be it sadness or triumph or anxiety—and each emotional state triggers different brain chemicals, which linger.”

No kidding. I remember, in the first couple of seasons of “Breaking Bad,” when I’d stayed up long after my wife had gone to bed, I’d hesitate to go in and hit the sack, because I felt guilty. She might wake up and ask questions. No, I hadn’t personally been running a mobile meth lab and killing off competitors and keeping it a secret from my wife. That was Walter White. But I’d been so wrapped up in his tension, trying to keep it from his wife, that my nerves were on edge. I felt like an accomplice. I felt like I was descending into the sordid depths as surely as he was.

This feeling eased up in the most recent season, as Walter became more ruthless and started agonizing less over his actions. So… does that mean I’m as bad as he is now? Aye, there’s the rub.

But there’s always the fallback. If there are any questions, I’ll just say, “You don’t want to know.”

Nothing, dear! I’m just cooking up a batch of… stuff…

Being too ‘smart’ to see your own errors

Bart brings my attention to this thought-provoking piece:

Jonah Lehrer’s new post at The New Yorker details some worrying research on cognition and thinking through biases, indicating that “intelligence seems to make [such] things worse.” This is because, as Richard West and colleagues concluded in their study, “people who were aware of their own biases were not better able to overcome them.” Being smarter does not make you better at transcending unjustified views and bad beliefs, all of which naturally then play into your life. Smarter people are better able to narrate themselves, internally, out of inconsistencies, blunders and obvious failures at rationality, whereas they would probably be highly critical of others who demonstrated similar blunders.

I am reminded of Michael Shermer’s view, when he’s asked why smart people believe weird things, like creationism, ghosts and (as with Sir Arthur Conan Doyle) fairies: “Smart people are very good at rationalizing things they came to believe for non-smart reasons.” If you’ve ever argued with a smart person about an obviously flawed belief, like ghosts or astrology, you’ll recognise this: their justifications often involve obfuscation, deep conjecture into areas you probably haven’t considered (and that probably aren’t) relevant, and are all tied together neatly and eloquently because she’s a smart person…

Interesting proposition.

Here’s how I responded to it…

Well, I think there is little doubt that smart people are better at rationalizing a bad position.

I’ll also agree with the proposition that it is harder to argue a smart person out of a position if he is wrong. But only because it is harder to argue a smart person out of a position whether he is wrong OR right.

I’m going to strain your credulity by using myself as an example, even though that requires you, solely for the sake of argument, to consider me to be a smart person (but hey, I consider this to be a community of smart people — dumb people would be watching TV rather than debating ideas in writing, right?).

People — smart people — on my blog get frustrated sometimes with their inability to talk me out of a position. It’s not that I’m incapable of changing my mind on something. I sometimes do so quite abruptly. But usually not on the kinds of things we talk about on the blog. That’s because I have spent SO much time over the years honing my positions on those issues. And much of that time has been spent thinking about, and one by one knocking down, the arguments that might be offered in an effort to change my mind.

It’s not that I’m smarter than any of y’all. It’s that it was my job, every day for many years, to write my opinions for publication. When you do that, you take much greater care than most people do with their opinions. (I was very surprised to realize, over time, how much more carefully considered my positions on issues were after a couple of years on the editorial board. Before that, my opinions were private, and therefore largely untested. After I joined the board, every opinion I had went through the wringer before and after being expressed, and I took greater care accordingly.) You obsess about everything that could be wrong in your position, and raise every possible objection that you can think of that the hundreds of thousands of folks out there likely to read your opinion — including people more knowledgeable than you about the particular subject under discussion — might raise to knock it down. You work through each and every one of them before you finish writing and editing your opinion piece. Add to that the fact that it won’t get into the paper until it’s been read, and potentially challenged, by other people who do the same thing for a living, and go through the same daily exercises.

It makes for positions that, once fully formed, are hard to shake — whether they are wrong or right. I also believe that the process helps one be right, but whether wrong or right, shaking it takes some doing.

So basically, I admit that I could be wrong. It’s just that the process I went through in arriving at my wrong answer was sufficiently rigorous that even if you’re smarter than I am, you probably aren’t willing to invest the time it would take to dismantle the constructs upon which my position rests.

But I do hope you’ll keep trying. I like to think there’s hope for me…

Rielle Hunter: The assignment I declined to take

Sunday, I was having Father’s Day dinner with my two sons (all my daughters being out of town), my daughter-in-law and two of my grandchildren at Yesterday’s, and my phone rang.

It was The New York Post. If you’ll recall, I represented that paper at the infamous Mark Sanford confessional press conference in June 2009, and they have called me since then from time to time when pursuing a story in SC. I’m generally glad to help when I can. Working with those folks can be an interesting change of pace.

This time, when the editor heard I was having Father’s Day dinner, he said he’d let me go, which I appreciated. But my curiosity was piqued.

A couple of hours later, he called me again, and asked if I could do a job for them. I asked what job.

Basically, they had heard that someone in this part of the country had advance copies of Rielle Hunter’s book. They wanted me to obtain a copy, read it quickly, and file a story that night.

I declined, and offered them someone else who might be interested in doing the story for them.

Part of it was that I was behind on some stuff I had meant to get done during my week off, and needed to get done before heading back to the office on Monday.

But part of it, I confess, was… well, you know the adage, You couldn’t pay me enough to do that? I’d rather have my gums scraped with a rusty screwdriver than read a single page of a tell-all book by Rielle Hunter. Every second I would spend doing that, I’d be acutely aware of all the good books out there that I probably won’t have time to read in my lifetime, and the sense of wasted time would be like a physical pain. I don’t even want to spend time passively listening to someone sum up her book in 25 words or less, much less spend any of my finite time on this planet reading anything that she might have to say. Just the thought of the exertion required to pick up such a book made me recoil.

I see they got somebody to do the story. Good. Especially since it wasn’t me.

I hope this doesn’t mean they won’t call me when they have something I would jump at (like the Sanford thing — I was burning with curiosity to know where our gov had been). But if so, having dodged any contact with a book by Rielle Hunter would have been worth the loss.

The worst thing that my generation of journalists did to America

In advance of the 40th anniversary of the Watergate break-in, Kathleen Parker wrote a column about the harm that Nixon did to the American spirit. The part of the column that spoke most to me, though, was this:

Not incidentally, Watergate also created something else of significance — the celebrity journalist and a generation of wannabe Woodwards and Bernsteins. Those of us who found our way to newsrooms all wanted the big story, if not necessarily the movie with attendant fame and fortune. What most realized rather quickly was that journalism was more like laying bricks than leaping tall buildings. Deep Throat was just a disgusting porn flick, and The Big Story was more likely a city council debate over tax millage rates.

We couldn’t all be Woodwards and Bernsteins, it turned out, but the presumption of corruption and government as the enemy was a pervasive, defining force in newsrooms across the nation. And this force in turn helped shape a relentless cynicism that persists today even as it morphs into something else.

It has been my belief for some time that the adversarial relationship toward government — the gotcha approach — that characterized this generation of journalists did two things to tear down this republic:

  1. It fed the corrosive distrust and even hatred of government that has come to characterize so much of our politics. There has been, ever since the 1770s, a strong anti-government strain in this country’s character. But after Watergate, journalists did all they could to pour gasoline on that smoldering wariness. From Ronald Reagan to the Tea Party (and on the Naderite left), it became increasingly respectable to think of government as inherently a bad thing. Since the media were saturated with stories of bad politicians and bureaucrats, readers and viewers came to believe that all who served in government were like that, which was very far from true. The process was complete when it became common for Americans to defend the wrongdoing of the pols they liked by saying, “They all do that.” Which they don’t.
  2. Government actually DID get worse, because increasingly good candidates refused to run for office. Normal, well-intentioned people simply will not subject themselves or their families to the perpetual third degree, a state of being in which a large portion of the world never trusts them, perpetually accuses them, and magnifies their flaws (which we all have) to an absurd degree.

What happened was a matter of degree. It has always been a legitimate part of the journalist’s job in this country to hold government accountable. It is an essential function of the Fourth Estate. And perpetual skepticism — captured in the adage “If your mother says she loves you, check it out” — was a legendary feature of the journalistic character long before 1972.

But it’s also a part of a journalist’s job to provide perspective. If it’s unusual for a politician to be a crook, and you give the public the impression that all politicians are crooks, you haven’t done your job, because you’ve presented a false picture to the reader.

At this point my colleagues will protest (as I often have myself): We’re in the news business. If it’s unusual for a politician to be a crook, then when I find a politician who IS a crook, it’s my job to report that. It is not my job to report on the 99 percent of pols who are NOT crooks, because they are not unusual, and therefore not news.

True. And even back when newspapers had a lot of space and people to fill it, resources were finite. You were pretty much out of room after you had reported on the news; you didn’t have space for the vast majority of information that was not news.

And it was ever thus. Newspaper exposés always fed a certain amount of cynicism among the public.

But as I said, it’s a matter of degree. Healthy skepticism took a slight, nasty turn after Watergate. From being a healthy part of the American character, it became a situation in which Americans thought their government was so bad that the attitude itself became a self-fulfilling prophecy. Look at South Carolina. Today, hating government is generally considered a prerequisite for getting elected to office. And once elected, those candidates go about showing just how bad government can be.

We can blame Nixon. But the journalists who were inspired by Woodward and Bernstein played their role.

I still don’t understand how ANYONE was fooled by John Edwards, at any point in time

Here is an explanation by one accomplished professional (Walter Shapiro) who was completely taken in. Excerpts:

About three weeks after Sept. 11, 2001, my wife, Meryl Gordon, and I had an off-the-record dinner with John and Elizabeth Edwards at the Washington restaurant Olives. The dinner was at the blurry intersection of Washington life—ostensibly social (Meryl had bonded with Elizabeth after writing an Elle magazine profile of her husband in 2001) but at its core professional (I was a columnist for USA Today and Edwards had White House dreams). Everyone was in a shell-shocked daze after the terrorist attacks, but my only clear memory of that dinner was Edwards’ palpable dislike for John Kerry, an obvious rival for the 2004 Democratic presidential nomination.

That was the beginning of a political-journalistic courtship that now makes me cringe. With Edwards on trial in North Carolina on charges of violating federal campaign-finance laws—after the disgrace of being caught with a mistress and denying being the father of her baby—I wish I had befriended a comparatively more honorable political figure like Eliot Spitzer or Mark Sanford…

In hindsight, I feel like the jaded city slicker, bristling with self-confidence that he can never be fooled, who ends up hoodwinked by the smiling rural Southern confidence man. Please understand: I did not deliberately put a thumb on the scale when I wrote about Edwards. It was more that I was convinced by Edwards’ sincerity when he talked passionately about poverty and the Two Americas. And I especially believed (because I spent so much time with Elizabeth) the romantic myth of the Edwards marriage.

Many Edwards insiders from the 2004 campaign say the vice-presidential nomination (bestowed by, yes, John Kerry) changed him. The entourage, the plane, the Secret Service detail and the frenzy of a fall campaign all supposedly fueled Edwards’ self-importance and sense of entitlement. But as I struggle to understand my own entanglement with a scandal-scarred presidential contender, I wonder if this arbitrary division between pre-veep Edwards and post-veep Edwards is too glib.

The danger signs and character flaws were always there, and I failed to notice them. I was certainly not alone in my blindness. David Axelrod, for example, was Edwards’ first media consultant during the 2004 primary campaign. Even after Axelrod drifted away to concentrate on a long-shot Senate race for a candidate named Barack Obama in Illinois, he returned for Edwards’ last stand in the Wisconsin primary. I recall running into Axelrod in the Pfister Hotel in Milwaukee on primary day and hearing him say of Edwards, “He’ll be president someday.”…

Yes, the “danger signs and character flaws WERE always there,” and they stuck out a mile. While I hadn’t reached the point of completely dismissing him in print as a phony, you can see my uneasiness with him in this column from 2003:

… There are few things more unbecoming than a millionaire trial lawyer presenting himself to a crowd as the ultimate populist. Huey Long could pull it off; he had the common touch. So did George Wallace. But John Edwards is one of those “sleek-headed” men that Shakespeare wrote of in “Julius Caesar.” He may be lean, but he hath not the hungry look. Mr. Edwards is decidedly lacking in rough edges. Not even age can stick to him.
His entrance was predictably corny. Other speakers had unobtrusively climbed the back steps onto the platform. Mr. Edwards snuck around to the back of the crowd, then leaped out of his hiding place with a huge grin and his hand out, looking for all the world like he was surprised to find himself among all these supporters. He hand-shook his way through the audience to the podium, a la Bill Clinton, thereby signifying that he comes “from the people.” Watch for that shot in upcoming TV commercials.
His speech was laced with populist non-sequiturs. For instance, he went way over the top exhibiting his incredulity at Bush’s “jobless recovery,” chuckling with his audience at such an oxymoron – as though the current administration had invented the term. (A computer scan found the phrase 641 times in major news sources during calendar year 1993; so much for novelty.)…

(The point of the column was to say that some protesters who were there to picket Edwards were even worse than he was. But first I had to establish what I’d thought of him. This incident formed part of my better-known “phony” column in 2007, in which I particularly concentrated on a detail I had not used in this piece — because it involved such a subjective impression that I didn’t have the confidence to attach importance to it until I’d had more experience with him.)

I’m not smug for having been put off, from the first time I saw him in person, by what seems to have taken in others. I’m just surprised that they didn’t see it, too.

If we could just suspend the Constitution long enough to take care of this ONE thing…

I spent Easter weekend at the beach with some of my family. We stayed at the beach house that my grandfather built in Surfside Beach. It’s on a small, narrow freshwater lake — the twins call it “the river” because it is so long and narrow that that’s what it looks like — with the ocean a couple of blocks beyond.

I don’t get there often, but we made it this time. The weather was beautiful. We had the whole day free Sunday (at least, until we had to drive back), having attended Easter Vigil Mass at my cousin’s church in Conway.

Between the Easter egg hunt with the twins and dinner, I managed to find a few quiet moments to lie in the hammock right next to the lake. Conditions were perfect. The breeze was perfect. I put some early Steve Miller Band on my iPhone and set it next to my head (I had no earbuds) as I prepared to snooze. I was right on the verge of doing something unusual for me — thoroughly enjoying the outdoors.

Just then, the roar of a riding lawnmower started up about 20 feet from my head, over on the lot next door. The guy on the mower was not the owner of that property, but a stranger. He was doing this for pay, on Easter Sunday.

I went inside fuming about it, but took solace from the fact that, with that riding mower and him being a professional, it couldn’t take more than 10 minutes for him to mow that yard. I even found myself making excuses for him in my mind: Poor fellow must be desperate for the income to be mowing lawns today…

Then I headed back outside and resolved to escape the noise the way Huck Finn escaped the things that he didn’t like about “sivilization.” I got into the jon boat and pushed off with an oar to drift across the lake. It’s only about 40 yards across. As I drifted, I realized to my horror that some unseen fiend was using a leaf blower on the far side. Which sounded even worse.

I turned back as the first guy stopped mowing, only to see that the heathen mercenary had started using a weedeater.

I changed course again and started rowing perpendicular to the line between these two abominations.

As the first guy put down his weedeater and picked up a leaf-blower of his own — to blow the yard trash he had cut out onto the surface of the lake — I paused to write the following on Twitter:

Forget the Constitution: Anyone operating obnoxious power tools on such a beautiful Easter Sunday should be drawn and quartered, then fined.

I was reTweeted and received supportive replies from several folks (one who totally got into the Swiftian spirit of the thing wrote, “And then punished in a manner that could be considered cruel and unusual for such a crime.”). Good to know there are some sane people left in this world.

I should have known it would be Moe

If there’s one thing an INTP should know, it’s to go with his gut.

Which I did not do yesterday.

From the start of the campaign for the District 3 seat on Columbia City Council, I had thought Moe Baddourah was the guy to beat. Yes, partly that was because of all the yard signs. Months ago (way early in terms of conventional yard sign theory), I saw about 10 on one block of Wheat Street. Everywhere you went in the district: “Moe!”

Beyond that, there was his convincing assurance that he had learned a lot from Seth Rose in losing to him two years ago, and was applying the lessons.

So until very recently, I was sure that it would be Moe and someone else in a runoff. Either Daniel Coble or Jenny Isgett.

But then, in the last days of the campaign, I heard that there were polling data out there indicating that Jenny Isgett would come in either first or second, with Daniel Coble being in a runoff with her. It was counterintuitive, but then I thought, “Hey, we’re talking small number of voters here, so tiny fluctuations can make a difference and overwhelm the factors that you’re seeing out there.” That caused me to overthink what I was seeing. I started thinking, “Moe peaked too soon.” (Of course, I was aware that with such a small number of voters, even the best polling data could be negated by relatively small shifts on Election Day — which was why I hedged my prediction.)

Well, we saw what happened. My gut was right all along.

As it usually is. I should have known better than to be so influenced by one hearsay data point.

Always trust the gut, absent overwhelming evidence to the contrary…

Anyway, now it’s Moe and Daniel in a tight runoff. Right now, either of them could win. My gut is telling me that Coble should be able to win over more Isgett voters than Baddourah, but it’s also telling me it isn’t sure yet. It’s still collecting cosmic waves, or whatever. And it has fresh reason not to count Moe out.

When it’s sure, I may tell you. Then again, I may not. This latest experience is reminding me why I avoided making predictions for so many years.

Why I am not an Eagle Scout

On a previous post about the Carolina Cup, after I had expressed my aversion to being trapped somewhere far from my car and adequate sanitary facilities, Steven Davis II asked whether that meant I had not been a Boy Scout. I answered as follows…

Actually, I was, but not for all that long.

I was really active in a troop in Ecuador, made up of expatriate gringo kids. I had finished Cub Scouts there, and made it through Webelos, and was really pumped about becoming a full-fledged Scout. Ever since I was a really little kid I had read my uncle’s Scout Handbook, which I took to be The Guide to Life for Guys. I was excited about the opportunity to apply some of those things I’d learned about.

My troop went on one camping trip, to an undeveloped beach near the town of Salinas.

There were zero facilities, of course. It was like a beach on the surface of another planet, with surf pounding against sandstone formations that framed little patches of sandy beach. We carried in our own water in canteens, and washed our mess kits in the surf, scrubbing them with sand. We had brought along some ice and some new metal trash cans. We put our water and perishable food in the trash cans, and buried them up to the lids in the sand just above the high-water mark. You know, for the insulation, to keep things cool.

That night — the darkest night I’ve ever experienced (no moonlight or starlight that I recall, and definitely no manmade light) — we lay in our tents and told the scariest stories we could make up (I was a big Poe fan at the time). The one that stuck in my mind as I tried to get to sleep, listening to the unseen surf, went like this — a ghost ship of undead Vikings lands on our stretch of beach and hacks us all to death before slipping away, and NO ONE ever knows what happened to those Boy Scouts. I lay there thinking that it was the height of irrationality to pay any heed to a ridiculous story that a bunch of 11-year-olds had just moments before collaboratively made up, while at the same time constantly hearing, above the surf, the keel of a Viking longboat grounding itself on the sand mere yards from our tent.

Anyway, during the night, some jerk went to the garbage can and dumped out a lot of people’s water, including mine. Why? You’ve got me.

The next morning, I completed the five-mile hike requirement for my Second Class badge. We marched out along the beach to a distant point sticking out into the sea, and back. In the equatorial sun. Without water.

I was a pretty scrawny little kid anyway, without a lot of water in my flesh to begin with. Very wiry. It didn’t take that much to wring out what moisture was in me.

It had rained slightly during the night, just enough to dampen the driftwood we had collected for our fires, so I had a hard time cooking my lunch, and finally gave up because between the heat I was able to generate with the coals and the sun beating down on my back, I was about ready to pass out.

On the long drive home that afternoon, I got a bad case of the runs. The van we were in would pull over to the side of the road (facilities? in the third world? are you kidding?) and I would assume the position right there with my fellow scouts watching.

When I got home, I was clinically dehydrated, with my skin starting to wrinkle up here and there.

Later, my Dad was transferred to New Orleans, where our troop leader often didn’t show up for meetings and was extremely disorganized when he did show, and I never could get the paperwork done for the Second Class badge I had earned in South America.

I retired from the Scouts as a Tenderfoot.

And my enthusiasm for camping never really recovered from that experience.

So THAT’s why I don’t feel as smart as I used to

This is an interesting piece brought to my attention by Stan Dubinsky:

SPEAKING two languages rather than just one has obvious practical benefits in an increasingly globalized world. But in recent years, scientists have begun to show that the advantages of bilingualism are even more fundamental than being able to converse with a wider range of people. Being bilingual, it turns out, makes you smarter. It can have a profound effect on your brain, improving cognitive skills not related to language and even shielding against dementia in old age.

This view of bilingualism is remarkably different from the understanding of bilingualism through much of the 20th century. Researchers, educators and policy makers long considered a second language to be an interference, cognitively speaking, that hindered a child’s academic and intellectual development.

They were not wrong about the interference: there is ample evidence that in a bilingual’s brain both language systems are active even when he is using only one language, thus creating situations in which one system obstructs the other. But this interference, researchers are finding out, isn’t so much a handicap as a blessing in disguise. It forces the brain to resolve internal conflict, giving the mind a workout that strengthens its cognitive muscles….

Set aside the fact that this NYT piece is written by one Yudhijit Bhattacharjee, who probably speaks at least two languages, since the piece is in English. It fits with what I’ve read and heard elsewhere — aside from the fact that it stands to reason.

It also gives me a clue as to why I used to feel so much smarter when I was a kid than I do now. When I was a kid, I spoke Spanish as easily and smoothly as English. I thought in Spanish, I dreamed in Spanish. I learned the language at what was probably the last possible moment for learning it as easily as I did — when I was 9.

I learned it the best way, in a sense — from being forced to speak it. From the time my family arrived in Guayaquil, Ecuador, most of the people I encountered spoke no English. I did take Spanish as a course in school, but that had little effect, as I recall. Probably a bigger factor was that I took half of my courses in Spanish — including history, geography and science. That was at the Colegio Americano. I was in the Clase Especial, which didn’t quite mean what it means here. There, it meant I was in the one class in my grade that was for native English speakers, and that the classes I took in Spanish were actually a grade level behind my English classes. Near as I could tell, that didn’t put me behind my peers when I got back to the states. And I certainly knew a lot more than the other kids back home about Latin American history. Not that anybody up here cares about that.

I learned a lot of my Spanish at home as well. My Dad at the time was a lieutenant commander in the Navy, which made us modestly middle class back in the states. But there, we had two maids, one of whom lived with us 24 hours a day. And no, it wasn’t like Downton Abbey. But the maids had no English, and I interacted with them constantly — I had to, to get through the day. The first word I remember learning from them by way of context came in the first couple of days we were in the country. One of the maids started working for us while we were still staying in the Humboldt Hotel on the waterfront. She took us for a walk one day along the quay (with me probably fuming because, at 9, I felt no need for a babysitter), holding my little brother’s hand. He was only 3, and of course he wanted to touch everything. She would pull him away, saying in an urgent, admonitory tone, “Sucio!” It wasn’t hard to figure out that that meant “dirty.”

Anyway, when we came back to the states two-and-a-half years later, I had this ability that I was seldom called upon to use. I only took Spanish once in school subsequently, and of course aced the course — even though my grammar going in wasn’t so hot (the result of having learned the language naturalistically, and sometimes from people whose own language skills weren’t the best). When I went to college, my skills were still good enough for me to test out of having to take any foreign language at all.

But since then… it’s been slipping away from me.

About a decade or so ago, we started having masses in Spanish at St. Peter’s. I became one of those who would read the Gospel in Spanish at mass. To do this, I read it aloud multiple times before I leave home, just to warm up the necessary muscles in my tongue and mouth — otherwise, I can’t do the accent. My accent still isn’t perfect when I get up there and read (to my critical ear), but it’s better than that of people who learned as adults. It’s good enough that folks who have no English come up to me after Mass and ask me questions, which only embarrasses me and causes me to say, “Lo siento, pero necesitas hablar con María…” and refer them to our Hispanic Minister.

Because the thing is, I can hardly understand a word they’re saying to me. When I do speak the language, it’s very halting — and I only fully understand what I’m reading if I look up some of the words. And to my mortification, whether speaking or listening, I have to translate the words or idiomatic phrases in my head — which would never have been necessary when I was a kid.

So I think being bilingual made me smarter — I remember the couple of years after I came back as a time when everything, from school subjects to popular culture, gave me a fantastic rush in my brain as I soaked it all up.

But I don’t think I’m that smart any more.