On Rachael Ray, Fame, and the Most Beautiful Beach You Will Never See

“Do you think Rachael Ray regrets her life,” I asked during the penultimate episode of “Food Network Star,” the reality TV series that has produced exactly one true star in 10 increasingly convoluted, laborious seasons. That was Guy Fieri, the frosted-hair purveyor of faux Americana and creator of, possibly, the Worst Restaurant in America. And Fieri won his way into the “Food Network Family” – as well as America’s arteries – way, way back in Season 2.

But back to my question, which was not unserious. Rachael Ray grew up in and around food – her family had four restaurants on Cape Cod; she managed a local pub in upstate New York; she once worked the fresh foods counter at Macy’s (which, in retrospect, seems completely unreal). Eventually, she started teaching cooking classes based on the “30-minute meals” concept, which led to a regular segment on a local New York CBS affiliate, which led to an appearance on the Today Show, which led to the Food Network, which led to spectacular, mind-blowing success. By my count, Rachael Ray has at least four television shows, a magazine that bears her name, a line of cookware, and more money than anyone reading this will ever have. Given her upbringing and her personal history with food, it is difficult to say that Rachael Ray has been anything but an unqualified success.

But at this particular moment, watching this particular episode, I genuinely wondered if she regretted her life. She looked tired and wary – as if she wanted no part in determining which of the four remaining contestants would, after 13 weeks/episodes, “win” the stardom she had spent nearly her entire life earning. She sounded worse – hoarse, mannish – as if she had just chugged a carton of cigarettes, or as if she had been talking nonstop for almost 15 years, which, as it turns out, she had. (It should be noted that Ray had vocal cord surgery about 5 years ago to remove a benign cyst.) Given the time commitments of her various endeavors, I have to wonder when Rachael Ray last cooked anything at all. I know she “cooks” on her various TV shows, but anyone who watches “Food Network Star,” or the network in general, knows that this is not cooking in any practical sense of the word. It is far closer to performance art – faux cooking for mass consumption.

That episode aired at the end of July this summer. Less than two weeks later, Robin Williams killed himself in his California home. What followed almost immediately were fond remembrances from those who knew him personally and three basic types of media accounts. The initial stories were primarily about how someone as beloved as Robin Williams, someone who made millions of people laugh, someone so talented, who had won an Oscar as well as multiple Emmys and Grammys, someone who had a family, who was The Genie in Aladdin and Peter Pan in Hook, could possibly kill himself. It was stunning. There were also the stories about the links between comedians and suicide, stories about how comedy frequently involves being emotionally naked and vulnerable to your audience on stage, which can mean being emotionally naked and vulnerable to the world, stories about how comedy frequently involves being able to laugh at yourself, which can mean allowing others to laugh at you. And finally, there were plenty of stories about the state of mental illness in our country, and the need for greater awareness of the problem – that it could happen to anyone, even Robin Williams. I found this one in Slate, written by someone who suffers from mental illness, particularly telling:

So many people have commented on how all the money and fame in the world couldn’t save Williams from depression. Not being famous, I wouldn’t know, though I can’t even imagine how celebrity complicates a mental illness. The mentally ill must wage a fairly constant internal battle. It’s exhausting, even without the public spotlight.

At the time of Williams’ death, I was in Maui, where there are vast expanses of beauty unlike anything I have ever seen before. A few millennia of volcanic activity have left its coastline littered with hard, jagged, coal-black rocks and towering structures that look like peanut brittle made of bronzed red sandstone. The geology, contrasted with the translucent, turquoise water of the Pacific Ocean, makes much of the place look almost extraterrestrial. There are swaths of dense forest, which open to sweeping valleys, acres upon acres of places where no humans ever walk. And, of course, there are beaches – beautiful, pristine places – almost everywhere you look. Tucked away in a U-shaped alcove, about a half-mile from the Keawala’i Church, Maluaka Beach was one such place. It did not have the breathtaking breadth of Makena Beach nor the inviting, communal feel of the various Kamaole Beaches along Maui’s western shore. Instead, it was an unassuming little place, like the beach lazily decided one day to nuzzle up to the water line and spoon for a little while. But the water was warm, the waves were modest, and there was a nice reef close to the shoreline where you could snorkel with minimal effort. It was about as perfect as perfect can be. Directly across from this spot, a developer was building four luxury residences – thousands of square feet of mahogany wood, open floor plans, top-of-the-line appliances, sweeping views of the Pacific, and infinity pools. The price? Just north of $8 million. So, there they stood, towering edifices of irony: The few people who could afford to pay such a price – celebrities, athletes, CEOs of tech companies – were the same people who probably could not enjoy the perfect little public beach only a few steps away, the very thing that made the homes so desirable.

We live in a culture that is seemingly centered on creating and celebrating fame. We have televised mechanisms to produce famous chefs, fashion designers, singers of all types, ballerina moms, Jersey Shore bros, stupid, vapid housewives, and plenty more. Fame consumes many of those who seek it like a voracious velociraptor and often leaves them a pulpy, bloody mess as it passes by, looking for its next meal. It’s temporary and transient, fickle and capricious.

I’m not famous, but I find rich irony in something that can provide almost unimaginable freedom, and leave you free to do nothing; that can make you loved by millions, but feel completely alone; that means you need infinity when infinity is outside your front door.


In- Consequence (We) Shall…

Recently, a good friend and I exchanged links to stories about the sinister underbelly of our On Demand culture. And in case you missed it, the iPhone 5 was released to the frenzied fanfare that greets many Apple products nowadays. It is hardly surprising that the former stories seem destined to be emptied from the recycling bins of history permanently, while the latter event was greeted, by some, as a national holiday.

By On Demand culture, I mean a lot of things. It’s an amalgamation of the greater interconnectedness that the Internet has allowed, more powerful technology, mobile computing, and globalization. It is a culture in which thousands of users produce millions of by-the-second, 140-character messages on Twitter, creating a living, breathing stream of consciousness for the world. It’s a culture in which 955 million active Facebook users share minute-by-minute details of their social lives. It’s a culture in which Google has made any answer to any question available in seconds. It’s a culture in which an almost unfathomable amount of consumer goods are available on Amazon.com, and elsewhere, ready to ship to your front door in two days. Perhaps most tellingly, it’s a culture in which all of these things are available on a $600 mobile device, just an app and a tap away. The On Demand culture is the sense of immediacy we have about the world around us. Actually, it’s more than that. It is the expectation of immediacy we have about the world and our collective exasperation when that expectation is not met. Louis C. K. does this funny joke about people visibly frustrated by their cell phones not doing a task as fast as the user wants; paraphrasing, the punch line is: “That’s right, it’s only sending information through the air to a satellite thousands of miles in space, and it’s taking five seconds to do that. Stupid cell phone.”

In a world like this, it’s not hard to see how stories like the ones I mentioned at the beginning can be easily forgotten. In fact, the On Demand world is designed precisely so that they are forgotten.  

The first story takes place at a thinly-disguised Amazon mega-warehouse. The reporter goes undercover as a warehouse picker, the person responsible for locating and packing the thousands of products stored in the massive facility, in middle-of-nowhere America. “Well, it’s a job,” you may think, “and one in America at that.” This much is true. But it’s a job with a temporary staffing agency so that the employee does not receive benefits and the retailer does not have its name associated with the absurdly demanding conditions under which the workers are employed. The employees are paid $11 an hour. It is estimated that, in a day, they walk 12 miles on pure concrete, bending, reaching, and stretching for items in ways that would give OSHA executives nightmares. They have insane time targets for locating items in this massive expanse of concrete and steel – targets provided by electronic scanners that monitor every move they make. Supervisors threaten that if they don’t meet the targets they will be replaced by one of the hundreds of other people who want the same job. If they are absent in their first week for any reason at all, they are fired immediately. Any subsequent shortcoming or, for lack of a better word, demerit earns the employee negative points. Six and a half points and you are terminated without exception. My last Amazon order was an extra-large beach umbrella for $49.98 with free two-day shipping.

The second story also takes place, not coincidentally, in the world of massive, anonymous, industrial boxes – this time filled not with mountains of consumer goods but rather with rows and rows of computer servers. Yes, computer servers, the little boxes that power Internet giants like Facebook and Google and Yahoo. “What’s so wrong with computer servers?” you might ask. Nothing per se. Except there are thousands and thousands of warehouses filled with computer servers – so-called data centers, which sounds so clean! – just sucking energy off the grid to the tune of 30 billion watts of electricity, or the equivalent of the output of 30 nuclear power plants. It is estimated that 90 percent of the energy used by data centers is wasted because companies run their servers at full capacity on the off chance that there is a large increase in traffic to their website. As a further precaution, the companies use backup generators to make sure the sites never go down, even when the power goes out. These generators emit diesel exhaust, causing many data centers to appear on something called the Toxic Air Contaminant Inventory. Why the need for all this juice? According to the New York Times:

The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.

Eight people liked my post today.

My friend and I had a brief conversation about these two stories. I explained that I had never read Michael Pollan’s In Defense of Food (it is on my reading list), but I imagine that the book makes a point about consumer decision making with respect to food similar to this: As long as the chicken breast is $1.29/pound at the local supermarket, the consumer doesn’t really care where it came from or how it got there. (Pollan, I am guessing, argues this is a very dangerous phenomenon). My friend suggested that if America were not so large, geographically, there might not be enough anonymous places to hide these activities from the consumer’s view. I thought that was an interesting perspective, and that she was right, except that there are always places to hide them in China. That’s where this factory, which makes many Apple products, is located. Apple products like, say, the iPhone 5. In a facility described as military-like and authoritarian. Where workers receive no benefits and are forced to work overtime assembling technological marvels for the rest of the world.

In business school, we occasionally talked about externalities – costs outside of the traditional supply/demand, price/purchase economic models, costs usually borne by third parties. Air pollution is the classic example. It’s a cost associated with buying a car or most manufacturing activity that is not reflected in the price of the good itself. It’s basically a hidden cost. Our On Demand culture has become particularly adept at creating and hiding externalities, things like inhumane working conditions or energy-sucking computer complexes (or, Pollan would say, disgusting industrial food facilities). We don’t really have the time or desire to think about them.

We don’t like consequences. We hide them away in massive, anonymous, industrial boxes, and we hope no one will notice that they are there.


On Nostalgia and New Kids

On June 11, 2011, New Kids on the Block, the (seminal? uber?? heart-throbbing???) 1980s boy band, played a sold-out show at Fenway Park. Enough time has passed now that I believe this occurrence can be discussed in a rational, emotionless manner – unless you are a woman, of a certain age, who happened to be at this concert. Then, all bets are off. If I were wagering in 1995, or even in 2001, I probably could have received better odds on the Red Sox winning the World Series before the end of time than on NKOTB playing a sold-out show at Fenway in 2011, or as the year is otherwise known, 20 years after New Kids on the Block first became popular. It was that improbable, that far removed from the potential landscape. But this was not simply one isolated show, a send-off of sorts for a group formed in The City of Boston. No, the group also played sold-out shows at places like Foxwoods and The Comcast Center. And apparently (improbably?) has more shows later this year in places like Orlando and London. If this were not unbelievable enough, then consider that these shows were played with the Backstreet Boys – in some sort of perverse, boy-band gang bang, mind you – a group that became popular two years after NKOTB broke up. So, the question that has to be asked is: How did this all happen?

The music industry, particularly the concert tour scene, has always relied on established fan bases to fill seats. That’s why you still see bands, or perverse versions of bands, like Led Zeppelin, the Grateful Dead and even Sublime performing. That’s why I recently received a Live Nation email advertising concerts by Huey Lewis & The News, Motley Crue, and Peter Frampton. To wit, here’s the list of the top-grossing concert tours for 2010:

Bon Jovi – $201 million
AC/DC – $177 million
U2 – $161 million
Lady Gaga – $134 million
Metallica – $110 million
Michael Buble – $104 million
Paul McCartney – $93 million
The Eagles – $92 million
Roger Waters – $90 million

Other than Lady Gaga, none of those musicians or bands has been relevant in at least a decade; even U2 only clears that bar because it released All That You Can’t Leave Behind in 2000 after being, more or less, irrelevant for the previous decade. So why is there an inverse relationship between cultural relevance and the ability to sell expensive concert seats? That is, why is it that as a band becomes less popular, it also becomes more popular (at least in some way)? The answer would seem to be nostalgia. (I realize that there’s a demographic explanation here – that older people with more advanced careers and higher incomes can afford to spend more on discretionary items like concert tickets than the teenagers to whom most modern music is marketed; however, it also means that those same people are willing to spend more on bands that have not been relevant for decades).

Nostalgia is an interesting elixir – a sort of all-purpose, all-body cleanser that washes away pieces of the past and leaves us feeling clean and whole and good. It is a time machine and a particle decelerator, able to transport you to a time when everything in the world coalesced in a perfect moment, and you could see everything clearly, all at once, like Keanu Reeves in The Matrix, like streaming fluorescent green strings of computer code (sort of), perfectly, as if you were seeing the world for the first time, and that moment was the only moment that ever existed or mattered. Nostalgia may also, simply, be a psychological survival mechanism. For better or worse, we tend to remember the good experiences and good people, and forget – or at least suppress – the bad ones. Nostalgia helps us forget things that we might be better off not remembering.

Here’s what I tell people I remember about my senior year at UMass: the origins of this website, for one; busy afternoons spent putting out a decent newspaper with people I liked and respected and, occasionally, engaged in office wiffle ball; endless nights spent with good friends at crowded bars; the freedom to take whatever classes I wanted, having completed the requirements of my major; spending time with my sister and my best friend at a place I loved; pick-up basketball games.

I also remember being emotionally crippled for the first two months of my senior year, having endured a heart-rending breakup during the previous summer, and a series of, in retrospect, ludicrous circumstances that thereafter strained (and broke) longtime friendships. It was a complete downward spiral. I know this because in the Fall 2000 Back to School issue of our newspaper, I wrote a column which concluded with this sentence: “I am 20 years old, and I have lost all faith in humanity.” And it was the truest thing I have ever written.

I don’t think that this discrepancy is some form of agency problem – that there’s a conflict between what actually happened and what I want people to believe happened. It’s not even that I prefer to remember the good things over the bad things. It’s simply that I do remember the bullet-stopping, green computer code moments more than the two-month pile of shit. On balance, I am nostalgic for that time period.  

Chuck Klosterman recently suggested (in this podcast, I think) that nostalgic (he didn’t use this word) music fans will attend a concert, even of a band that they did not like originally, as long as the band playing was popular/relevant during their teenage years. It’s an interesting, and unprovable, assertion; though, anecdotally, it would help explain why our list above has such a heavy ’80s focus to it – those fans having achieved the necessary levels of personal career success and nostalgia to shell out consistently large dollars for concert tickets. A thought process like the following is pretty remarkable: In 1983, I never liked this band, nor the music they played, nor, perhaps, even the genre of music they embodied, but now, almost 30 years later, I want to see them perform music live to be reminded of 1983. It speaks, yes, to the deep emotional connection we have to music and, yes, to the fondness we associate with adolescence – primarily experiencing defining moments of self-identity, glimpsing our first hints of independence and hormones – but also to the allure of nostalgia. No matter how great any adolescence is, it is inevitably marred by clashes with parents, serious self-doubts about personal appearance, fear of rejection (and many times, things much worse than these). Yet, evidence suggests people are willing to pay lots of money to be reminded of music from their adolescence – perhaps even music they never listened to during adolescence.

During one of my favorite scenes in Garden State, Zach Braff’s character is talking about growing older and becoming more removed – physically and spiritually – from his sense of home; he explains that at some point, the place where you grew up, perhaps even the place where your parents still live, is no longer home. He concludes by saying, “Maybe that’s all family is – a group of people who miss the same imaginary place.” I think maybe the same is true about nostalgia.

So, here’s a concession: My original question does not have a good answer. There is no good explanation for how New Kids on the Block and the Backstreet Boys sold out Fenway Park in the year 2011, at least not for me. I couldn’t understand it; in fact, it still kind of fries my brain. But that’s the thing about nostalgia: it’s irrational and emotionally-charged and personalized. To get it, you had to be there (again).


On Sport, Place, Myth and The Most Hated Man in America

John Connolly, 18, of South Boston, Mass. is the most hated man in America. He just graduated from Boston Public Schools, where he earned above-average grades and athletic honors in football and baseball. He’s about to enjoy his last, great summer before he’ll be expected to start thinking about internships, building his resume and THE REAL WORLD. But, for the next three months, John and his friends will have fun just being teenagers in the neighborhood, and in the city, where they grew up. In September, JoCo – that’s what Connolly’s friends call him – will attend Fairfield University, where he’ll quickly learn that America suddenly hates Boston sports fans, and, by extension, people from Boston.

Once rumored to be the new location of Fenway Park (in 1999, when John was 7), South Boston has changed more than most people could have imagined in his lifetime. Large companies like Fidelity made significant capital investments in the South Boston waterfront, creating a commercial district that few had ever imagined possible. The City built a massive convention center there that came with brand new hotels to service out-of-town guests. Southie has become a destination living spot for twenty- and thirty-somethings, with fashionable restaurants and bars. Almost any observer of the city would say it’s been an improbable rise for South Boston over John’s 18 years. The same could be said about Boston sports teams, and about the City itself.

When John was 8 years old, the Patriots won their first Super Bowl ever with a team of largely unknown players and a coach who had been labeled a failure for his shortcomings in Cleveland. They went on to win two of the next three Super Bowls, and enjoyed one of the most dominant stretches the NFL has ever seen in 2003 and 2004 (for good measure, the Patriots finished the regular season undefeated when John was 14, and came within one insane Super Bowl catch of the second perfect season in NFL history). Nine months after the Patriots won their second Super Bowl, the Red Sox improbably came back from a 3-0 series deficit against the Yankees, their arch-rivals, and won the team’s first World Series in 86 years; they went on to win another in 2007. A year later, the Celtics won the NBA Championship for the first time in John’s lifetime, and played in another staggeringly close Finals against the Lakers two years after that. Then, finally, about a week ago, the Bruins won the Stanley Cup for the first time since 1972. Seven major sports championships in 10 years. And that is why, when John Connolly arrives at Fairfield University, his roommate is assuredly going to hate him.

To understand why any of this matters to John Connolly and to Boston is partly to understand why we watch sports and why sports matter. It is partly to respond to derisive comments about grown men playing children’s games or throwing leather balls through iron rims or knocking each other senseless repeatedly. We watch sports, those of us who do, because it reinforces some of our basic assumptions about life – that if you work hard for a long period of time, you will succeed; that talent and skill are rewarded; that teamwork is important; that anyone can rise to an occasion and be a hero (Frankie Cabrera!). We watch sports because we like the certainty and finality that they give us when life usually cannot; knowing that one team, the best team, will win and one team will lose, even if, as in life, we often discount luck too much. We watch because, like a good Greek play, there are heroes and villains. We watch because sometimes the completely improbable becomes entirely possible. Sports is theater, sure, performed for sums of money that are incomprehensible, sure, but it can be the highest form of theater we have.

To grow up a Boston sports fan is to grow up hearing the same stories and the same names over and over. That type of oral history might be true of a sports fan in any city, but it’s particularly true here because of Boston’s historical presence in all four major sports. John Connolly heard those stories, the same way I did – from my father and coaches, from friends, on sports radio talk shows and in newspaper columns. For Celtics fans, it was stories of Larry Bird, Kevin McHale and Robert Parish, and before them Havlicek, Russell and Cousy, wars fought with the Lakers in the ’80s, the Bill Laimbeer clothesline. For Bruins fans, it was Bobby Orr, Phil Esposito and this goal. For Patriots fans, it was the ’86 Super Bowl and letting The Fridge score a touchdown. For Red Sox fans, it was Ted Williams, Yaz, Fred Lynn, Carlton Fisk, stories of the Impossible Dream season, the ’75 Series, Bucky *bleeping* Dent, and most of all THE CURSE OF THE BAMBINO. These stories and names were repeated so often, to so many people, that they became myth – again in the Greek sense – more than historical accounts, parables about greatness and destiny and flying too close to the sun. Those were the Boston myths, inextricably tied up with the City, because so many in the city were inextricably tied up with them. For the Boston sports fan, they were all-encompassing and inescapable.

When the Bruins finally won the Stanley Cup, more than one commentator described the team as “gritty” and “a reflection of the City of Boston.” It was, truly, an absurd premise. Were we really to believe that because a lot of the team resides in the North End that somehow these players – from Europe and Canada and other parts of this country – were imbued with the grittiness of a kid from the streets of Charlestown or Dorchester or the South Boston of John Connolly’s childhood? Of course not. But, the City is certainly a reflection of its sports teams. How could it not be? Those myths are rooted deep inside Bostonians, influencing our impulses, our moods, our behaviors like all those Greek morality plays were written to do. It’s not our fault as much as a byproduct of being raised as sports fans here, in this city. So, it’s easy now to call Boston The City of Champions, given our teams’ recent successes. It’s an easy story to write (note: I’m aware the smug tone of that article kind of undermines my point here, especially coming from someone who profited off of the fucking Curse of the Bambino. Screw you).

The individual fan experience is not nearly so universal. My dad’s experience as a Boston sports fan – 40-odd years before this winning blitz started – is different than mine (I was 21 when the Patriots won their first Super Bowl), which is different than John Connolly’s. For me, I don’t think any sports moment will ever be better than the 2004 Red Sox – that epic comeback against the Yankees, the heroics of David Ortiz and Curt Schilling, winning that first World Series. That was the most meaningful for me (along with the Braves’ World Series victory in 1995, another story for another time). The stretch of Patriots dominance in 2003-2004 would be second; the clinical precision with which those teams dismantled the league, and Peyton Manning particularly, was awesome. Then the Celtics: watching Ray Allen play when he’s on is really fun, and I was happy for Paul Pierce, whose loyalty to Boston paid off in the end, and for Kevin Garnett. Then, finally, least meaningful to me, was the Bruins. I’m happy they won, but I never grew up playing hockey, nor do I watch regular season hockey games. I’m happier for my friends who did grow up playing and for whom the Bruins win was the spiritual equivalent of my 2004 Red Sox. Someone else would have a different order, and John Connolly yet another one. That’s the thing about stories and myths: we self-select the ones that are important and meaningful to us.

A city doesn’t quite work like that. Boston will slowly change – with new stories and new myths, from a place shaped by my father’s experiences, and mine, to one shaped by John Connolly’s and people of his generation; maybe that process has already begun. If anyone can tell you about how a place can change, and shift, and become something it never was before, it’s John Connolly. It’s just too bad he’s going to be hated at Fairfield.


On Convergence, Bill Simmons and Twitter

I first heard the word “convergence” 10 years ago. Actually, 10 years is probably my personal form of anchoring bias around even, neat-sounding time frames. It could have been 7 years ago. It could have been 15. There would be a way to pinpoint it – the exact first time I heard the word convergence – by researching the history of consumer technology. I could Google mobile technologies or search the digital archives of Wired to find the exact moment when convergence started becoming completely plausible and stopped sounding completely insane, which is how it sounded when I first heard the word.

Convergence is the concept of separate mediums or forms of telecommunication merging onto the same platform or same device. Today, the novelty of that idea is as interesting as listening to Congress debate raising the debt ceiling. But at the time, at the time I heard convergence described for the first time, it was completely insane, I promise you. Why would I want to watch TV on my computer? My TV was for watching South Park and The Daily Show. My computer was for bloodlusting Ogres… or was it downloading MP3s? There was no way I would want to watch TV on my computer. They were separate things. One was kept in the living room and one in the office. Not only were they separate physically, but I thought about them as separate things, each with its own place and function. In fact, I would say I probably wanted to keep them separate altogether… or so I thought.

You can’t do much of anything today without encountering convergence. Watching TV on your computer is pedestrian. You can surf the Internet on your TV. You can watch TV AND surf the Internet on almost every commercially-available cell phone, excuse me, smartphone. Movies are delivered seamlessly over high-speed cable lines into homes – sometimes before they are even released in theaters (and really, how long will those be around?) – or streamed through video game systems, or even picked up at vending machines in grocery stores. I can’t imagine what “convergence is completely insane” me would have said about movies in grocery stores. WHY would I want movies in a GROCERY store?

If you think about convergence on a grander scale – cosmically speaking, if you will – we have far exceeded even those basic notions of the concept. Facebook is the convergence of the Internet with real-life social interactions. Same with Foursquare. Twitter is the convergence of newspapers (what are those again?), or at least newsgathering, with text-messaging, itself the convergence of instant messaging and cell phones. And Google is the convergence of the Internet with literally everything.

For a while now, one of my favorite writers has been Bill Simmons, formerly of ESPN.com, now of the more expositional Grantland.com. I admire his glib sense of humor and sports-fan sensibilities as much as his self-made career; he started his own website long before the convergence between newspapers and the Internet was inevitable or even probable. I think if someone were to ask Simmons why he has been so successful (and for non-sports fans, he has achieved tremendous success), he would answer that he was among the first to write from the perspective of the sports fan rather than the sports reporter people grew up reading. I’m pretty sure Simmons has said as much in one of his columns or one of his two books. I have a different take, though. Simmons has been successful because of convergence. He was smart enough to recognize the convergence between sports and pop culture, and savvy enough to capitalize on it long before Emmitt Smith appeared on Dancing With The Stars. He was a part – some part – of the shift from the passive, sit-on-the-couch, watch-the-game, read-the-next-day’s-newspaper sports fan to the active, write-about-the-game, bypass-traditional-media-outlets sports fan. And that has been a huge convergence, not just in sports. Joe Public is no longer a passive receiver of telecommunications; he has become an active broadcaster and participant. That is where we are going – as a society, and in this column.

I recently got the fancy new Droid cellphone from Verizon, complete with 4G data speeds, an AMOLED touch screen, an 8-megapixel camera, 32GB of on-board memory and voice recognition. I say “navigate home” and it does. It’s about 10 times more powerful than any computer “convergence is completely insane” me ever owned or used. I joked to a friend that with all that technology I could practically run a television studio from it. What struck me most about the phone, though, was just how quickly it tracked the digital imprints of my life and embedded them in the device like electronic branding. Within a matter of seconds, my work email, calendar, and contacts, Google email, calendar, and contacts, Facebook friends, newsfeed, and pictures, Foursquare check-ins, and Twitter feeds and hashtags collided in a maelstrom of convergence. Everything was right there, in my hand, a swipe away. Now even these new technologies, things that were themselves the products of technological convergence, were converging into this universal, all-encapsulating, multi-platform message machine. Now these things that I had encountered separately, that I had thought of and learned about separately, were merged into one inexorable stream of collective consciousness. Did I need my Facebook newsfeed and Foursquare check-ins in the same place? Did I want my Facebook status updates and personal tweets to be the same? Were they not different platforms with different purposes? Shouldn’t they be separate? Or… should they?

I’m a new Twitter user (follow me @cosmicspeaking, thanks), mostly because, for the longest time, I could not discern the difference between tweets and Facebook status updates. The lone exception, as far as I could tell, was that you could follow famous people on Twitter, which I did not care about doing. Or that’s what I thought when I first heard about Twitter. What I’ve discovered, however, is that all my interests are congregated in one place – it’s interest convergence, if you will. I follow my Top Chef/Food Network people, my poker people, my ESPN reporters, my Bill Simmons crew, my Boston-centric gang, my general news sites and my finance/investing prognosticators. I can go to Twitter and, within a few moments, catch up on almost everything that interests me. There is no boundary between transmission and reception (it’s a fluid, ever-changing conversation), no boundary between user (passive) and content creator (active), no boundary between Joe Public and Tom Hanks. For these reasons, Twitter is the quintessential convergence experience. Millions of its users self-select these worlds of intersecting interests, receiving messages, videos and pictures across mobile and desktop platforms, usually from complete strangers, and sometimes broadcasting the same back. Twitter is the ultimate aggregator, a perfect tool for a world in which convergence is ubiquitous.

As a social phenomenon of some import, Twitter is clearly here to stay in some form. The number of Twitter followers a user has is currently the ultimate form of social currency. The business side of the picture is less clear for Twitter (and Facebook and Foursquare and other convergence-based ideas). How to turn all of those aggregated eyes, aggregated messages, and aggregated content into aggregated dollars is a problem without a perfect solution yet. That means the future of convergence is equally unclear. Real social currency? Perhaps. Maybe it won’t be long until someone actually is broadcasting television from their mobile supercomputer. Whatever happens with convergence from here on out, though, it certainly won’t be completely insane. Almost anything seems entirely possible. Ten years ago, I never would have thought that.
