Wednesday, February 11, 2009

Facebook and the Commoditization of Friendship

I was tricked onto Facebook. That much I will swear to my dying day. One of my closest friends sent me an email link to a new photo album. She has several online photo albums scattered across the Web, and all of them are remarkable. The email I received noted that the new album resided on Facebook. I wasn’t a member, but the email said nothing about joining or setting up a profile, at least no disclaimer that I saw. Perhaps it was buried deep in the legalese. I clicked on the link, and the web page that loaded asked me to verify that I was a real person, requesting simply my name, my birth date and, I believe, a school name. I should have been more suspicious, but these days so many websites ask me to verify something to prove I’m not some web-bot out to spam the site.

The moment I clicked submit, Facebook created a personal page for me and then proceeded to contact . . . Every. Person. I. Had. Ever. Met. I’m not entirely sure how the system was able to immediately pair me up with friends and colleagues. Needless to say, within thirty minutes, my email inbox had been spammed with notes from my friends mocking me for finally entering the 21st century.

I will be the first to say I’ve become a convert. Many years ago, my early working experience turned me against talking on the phone. I have very little patience for it and would be quite happy without one. Instead, I lived on the web with IRC chat rooms (remember those?), with long, crafted emails, with personal websites. Then came academia—who knew that an email inbox could become flooded so quickly with so many painful requests from colleagues and students? I soon lost all interest in writing friends, which is somewhat of a problem when I live in rural Florida and everyone else is scattered across the United States and as far away as Australia.

Now, once a day, usually in the mornings, and depending on my mood, once at night before going to bed, I can type out some pithy Twitter-like comment about my status, read everyone else’s, and sometimes read or post short notes. The secret, and perhaps Facebook’s success, is that it is endearingly shallow. Keeping up with friends is now quick and painless. Communication in any form other than notes either posted to one’s profile or sent through Facebook’s internal email system must stay under 250 characters. In fact, the set-up for the website is much like I’ve always said I’d like to have for high school reunions. I was never very close to my high school compatriots—in fact, my high school experience is very much like the metaphorical one portrayed in Buffy the Vampire Slayer. But I wouldn’t mind a simple online catalogue that tells me who popped out how many kids, who managed to escape Burke County, Georgia, and who managed to grow old the fastest. Thankfully, I actually like the people I’ve re-connected with on Facebook, so my interests aren’t nearly so callous.

Within the first week, I discovered the first ethical quandary of Facebook etiquette—how to respond to friend requests, especially from people you barely knew twenty years ago. The answer—befriend everyone but focus on the few who are close to you. That said, I’ve actually become better acquainted with a few of those same people I barely knew. Who would have guessed? The second ethical quandary—how to handle students? Yes, Facebook, despite its reputation as being “Mom’s website” when compared to the hipper MySpace, still has a very young demographic. Based on advice from colleagues, I instituted a sharp rule: befriend no student until he or she graduates.

The formalist in me, though, has become deeply fascinated by the patterns I’ve seen on Facebook. Like email, viral messages are rampant—short games people play, either by confessing a specified number of secrets, by manipulating one’s name, or by compiling lists of various interests like food, drinks or music. This semi-narcissism, semi-voyeurism, semi-exhibitionism is very difficult to resist; I’ve seen friends actually complain vociferously about filling out these “surveys” even as they continue to post them. I bring this up not to tease these friends but simply to show how nearly impossible it is to not participate in these viral games.

Another viral pattern involves fan pages and groups—essentially an internal message board within Facebook that operates through subscriptions. One member of a circle will join a group; in doing so, Facebook alerts all of his or her friends. As they see the announcement, they may join, which in turn alerts all of their friends. And so topics ranging from ones as broad as The Simpsons and Family Guy to ones as narrow as "Hanging out at the Krystle’s on Vineland Avenue in Macon, Georgia, in the 1980s" will pass through the circles of friends and accumulate fans.
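For the curious, the mechanics of this cascade are what a programmer would call a breadth-first traversal of the friend graph. Below is a toy sketch in Python of how such a spread might play out; the circle of friends, the names and the one-in-three chance of joining are entirely my invention, not anything gleaned from Facebook’s actual machinery.

import random

def viral_spread(friends, seed, p_join=0.3, rng=None):
    """Breadth-first cascade: each alerted friend joins with probability p_join."""
    rng = rng or random.Random(1)
    joined = {seed}    # the first member joins the group
    frontier = [seed]
    while frontier:    # each pass is one "generation" of alerts
        alerted_next = []
        for member in frontier:
            for friend in friends.get(member, []):
                if friend not in joined and rng.random() < p_join:
                    joined.add(friend)           # the friend sees the alert and joins
                    alerted_next.append(friend)  # now his or her friends are alerted
        frontier = alerted_next
    return joined

# A purely hypothetical circle of friends
circle = {
    "ann": ["ben", "cho"],
    "ben": ["ann", "dee"],
    "cho": ["ann", "dee", "eve"],
    "dee": ["ben", "cho"],
    "eve": ["cho"],
}
print(viral_spread(circle, "ann"))

Even with a modest chance of joining, a few generations of alerts can sweep an entire circle, which is exactly how a topic as narrow as a Macon burger joint accumulates fans.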

A third pattern takes shape around the applications, which act as a combination game / electronic postcard system. Some of these applications are created by Facebook users themselves; my current favorite, Shite Gifts for Academics, offers such gifts as 80 freshman composition papers, a boring faculty meeting or “that guy.” Another popular one, for me, offers Southern memorabilia such as NASCAR, grits and collards. The third application, however, gave me pause: Kidnap! In this little application, friends snag each other to cities around the world. To escape, one answers trivia questions about each locale. But what is so interesting about this little game? It’s sponsored by the Travel Channel and designed by Context Optional, a Facebook application development firm. That’s right, one can actually make enough money developing Facebook applications to warrant a firm.

Then I saw Facebook as the capitalist, marketing behemoth that it is. Along the right side of the webpage runs a column of advertisements, each one customized to the user’s preferences as judged by our viral group and fan selections, along with whatever profile information we get tricked into revealing. Advertisers actually have a fair amount of leeway when it comes to capturing eyeballs. Facebook offers advertisers advice on how to develop their ads and target the correct audience. By setting a “Daily Budget,” advertisers can limit the amount of money per day they are charged; once the limit is reached, that ad falls out of rotation. Advertisers can choose to be charged for clicks, where customers actually click on the ad and are directed elsewhere (for a minimum charge of a penny per click), or impressions, where customers merely see the ad (for a minimum charge of 15 cents). The minimum daily budget is $5 per ad.
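To put those minimums in perspective, the arithmetic is simple enough to sketch in a few lines of Python. One assumption on my part: I read the 15-cent impression charge as a rate per thousand impressions (the standard CPM model), since the unit isn’t spelled out above.

def max_clicks(daily_budget, cost_per_click=0.01):
    """Clicks a daily budget buys before the ad falls out of rotation."""
    return round(daily_budget / cost_per_click)  # round to dodge float error

def max_impressions(daily_budget, cost_per_thousand=0.15):
    """Impressions a daily budget buys, assuming a per-thousand (CPM) rate."""
    return int(daily_budget / cost_per_thousand * 1000)

budget = 5.00                   # the $5 minimum daily budget
print(max_clicks(budget))       # 500 clicks a day at the penny minimum
print(max_impressions(budget))  # 33333 impressions a day at the 15-cent rate

Five hundred chances a day, per ad, to turn a friendship feed into foot traffic.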

Every note that is written generates revenue. Every “poke,” every game, every Shite Gift, every Kidnap calls forth ads. Every group joined and cultural meme fanned rolls out ads. The very nature of friendship has been cleverly packaged. If, then, friendship becomes a commodity, can we assign value to it? Are those members with a wide circle of friends more “valuable” than those with, say, fewer than ten? Are those members who interact frequently with the applications and the viral nature of the site more “valuable”? And what is this value? How often do members evaluate (with that word chosen specifically because of its root, value) each other when it comes to status? Are those members with the most friends exhibiting a higher status? What of those members who frequently poke and post? For example, one Facebook member, Bob P., has 129,590,884 kidnaps. Does the status that comes from having many friends and engaging most often with the site go hand in hand with one’s economic, commodity value? Clearly it does for The Travel Channel and Context Optional, but what effect does it have on the everyday Facebook user? If I choose not to participate in the viral nature of Facebook, am I somehow less a friend, and less a person?

Saturday, February 7, 2009

Pretty Pictures

This week a book rep handed me a copy of yet another introduction to literature textbook, this time Nicholas Delbanco and Alan Cheuse’s Literature: Craft and Voice. Some may recognize Alan Cheuse as the commentator on NPR’s All Things Considered. The book rep mentioned as a selling feature the emphasis on visual images and graphics within the textbook, and indeed that is the first thing I noticed. Every page is brimming with photographs in brilliant colors, text wrapping around graphics or hovering over half-opacity images, quotations from the texts flashing in green in the middle of short stories, and excerpts of interviews with the authors lurking in the introductory materials.

In their preface, the editors argue that this text is designed for a “visual age” and “to harness the power of media and use it to help students learn the art of sustained reading.” Students, they note, “should engage their senses; they must listen as well as look.” As is standard with almost all literature textbooks, Delbanco and Cheuse package along with their text a DVD of videos, mostly interviews, but I have to admit that their concerted graphic design, a veritable Southern Living textbook, is very striking.

(While I applaud the idea of using richer media to engage students, the cynic in me can’t help but notice that such DVD inclusions more often than not force students to buy new textbooks rather than trade with their roommates and friends or acquire books from sources other than the college bookstore.)

At first I agreed with them. I’ve seen firsthand how eagerly students engage with rhetoric readers when the texts are designed with visual elements in mind. John Mauk and John Metz’s Inventing Arguments, Andrea Lunsford’s Everything’s an Argument and Robert Lamm and Justin Everett’s Dynamic Arguments have been very popular with students, while Laurie Kirszner and Stephen Mandell’s Patterns and Laurence Behrens and Leonard Rosen’s Writing and Reading across the Curriculum have fallen flat. And even though these textbooks have different composition teaching theories at work, let’s face it, the essays in these readers are generally the same. And if the essays do change, the authors typically don’t. Moreover, not only do students become more engaged when the textbooks pop with graphic design, but I can also use the textbook itself as a form of visual rhetoric, a sort of meta-textbook.

Then I realized how I use overly designed rhetoric textbooks—as a meta-text—and ask students to question the design, to explore how the textbook designers attempt to control and manipulate them as readers. That is a benefit in a rhetoric class. Literature, on the other hand, is a different beast. Delbanco and Cheuse’s textbook images don’t just enliven the text; they impose interpretations. For example, Nathaniel Hawthorne’s “Young Goodman Brown” includes an image of the Devil card from a Tarot pack. Granted, the devil is certainly an issue in the short story, but what of the Tarot? For Jack London’s “A Wicked Woman,” the text of the story concerning Ned Bashford’s Greek temperament wraps around Michelangelo’s David; one wonders how an Italian image of a nude Israelite compares. Also, the quotations used to highlight the text, always in green, usually in the center of the page, in much larger print and a separate font, call attention to passages, as if to alert students that these words are more important than any others in the story. In London’s case, those words are “He did not believe in the truth of women…” In Hawthorne’s case, those words are “He looked up to the sky, doubting whether there really was a heaven above him.”

How can I teach students to come to their own conclusions on a text, based on evidence and reasoning, when the text influences them to such a degree?

Later I grew disturbed by the entire nature of the textbook. Do students really need flashy graphics for a literature textbook? Isn’t imagination enough, and for an introduction to literature, shouldn’t we also be introducing imagination to new readers? The graphic for Franz Kafka’s “The Metamorphosis” includes a large beetle, but shouldn’t students decide for themselves whether Kafka’s bug is a beetle or a cockroach? Doesn’t an image of a gun embedded in Flannery O’Connor’s “A Good Man Is Hard to Find” give away the ending? Where is the “internal movie” that invariably trumps the Hollywood version when the graphic artists impose their own vision?

And have we really become such a nation of dullards that flash is needed? Of classrooms filled with Alices sniffing, “and what is the use of a book . . . without pictures or conversations?”

Delbanco and Cheuse’s project brings to mind another debate, one for which I haven’t yet found my footing. Should we be turning academic studies into a television and magazine culture under the guise of “engaging,” or should we place students in an environment that is somewhat disorienting and unfamiliar under the guise of “challenging”? At the moment, I find the Lewis Carroll quotation from Alice in Wonderland very telling; it shows that for many, many years now there has been a dividing line between those illustrated tales for children, the “picture book,” and the pure text of adulthood. That today’s students are more engaged with illustrated texts is no new thing and is not a fresh effect of the Internet and MTV. When should we ask people to put away childish things and become adults? When they are 10, or 23?

Saturday, January 31, 2009

The Cost of Education and the Rights of Expectation

That the cost of education is sky-rocketing comes as no surprise to anyone. In the late 80s, I accrued a student loan in the range of forty thousand dollars. Today it is common for students to leave their undergraduate education with student loans exceeding one hundred thousand dollars; I’ve read reports of students graduating with B.A. degrees from tony Ivy League universities with loans over two hundred and fifty thousand. That price tag is equivalent to buying a house. Imagine being twenty-two years old with a thirty-year home mortgage, and not a starter home at that, and on the lookout for an entry-level job.

Yet at the same time, we live in a consumer culture. Buying a car? Of course one negotiates, paying more for the sunroof, asking for less when the color isn’t one’s first choice. If that car doesn’t please the customer during the first month, the customer takes it back for a refund. Can the same be done with a college degree? Does the person shelling out a hundred thousand dollars have the right to complain if their college experience doesn’t “meet their expectations”? Should that same person also complain if their college degree doesn’t “guarantee” a career and salary of their liking? Notice I say “person shelling out” the money—more often than not, that person is a parent, not a student.

Parents and students have already begun to make that leap. “I paid a lot of money for this degree, and I expect you to give me an A.” I hear this already at the low end of the totem pole; how frustrated will I be after another five to ten years of dealing with it? I have also noticed another trend: the people making this complaint are the very people who put little to no effort into achieving the degree in the first place.

Charles Murray, co-author of the often-criticized book The Bell Curve (the media reported Murray’s connection of race to intelligence as stated fact rather than concerned implication), has recently argued that colleges should scrap the bachelor’s degree and replace it with certification programs. He models this notion on the CPA exam. According to Murray’s argument in the Wall Street Journal, “Outside a handful of majors—engineering and some of the sciences—a bachelor's degree tells an employer nothing except that the applicant has a certain amount of intellectual ability and perseverance. Even a degree in a vocational major like business administration can mean anything from a solid base of knowledge to four years of barely remembered gut courses.” Evan R. Goldstein, in his response, “Degrees of Potential,” in the September 5, 2008, issue of the Chronicle of Higher Education, has already addressed why such a program can’t work. One of the more surprising (and devastating) arguments involves liability: such certification programs would be inundated with employment-discrimination claims.

Murray’s argument, however, offers another, more surprising implication.

Students (or parents) are not the customer base for a college or university.

Employers are the university’s true customer.

Employers hire based on the qualifications the degree the university confers. I know, based on conversations with business school administrators at a large southern state university, that some corporations have already begun to differentiate the quality of new hires by particular degree programs. On the other hand, many corporations don’t make those distinctions, other than whether the candidate holds a B.A. or not. I know from personal experience, having spent twelve years in the corporate world, eight of them with a multi-national corporation, working in sales, IT and human resources, that hiring managers do indeed expect a bachelor’s degree to confirm “that the applicant has a certain amount of intellectual ability and perseverance.” The intellectual ability involves critical thinking, problem solving and communication skills. The perseverance suggests time and project management, interpersonal skill sets (with both colleagues—classmates—and superiors—professors), and the ability to balance work versus play. I frequently grew frustrated that hiring managers never asked any question other than “Do you have a B.A.?” I wanted to interrupt, “But let me tell you about my major. Don’t you want to know which school?” I had worked my way through a regionally prestigious liberal arts university. Nor did anyone ever ask me about my GPA. All that hard work to graduate with honors, and no one cared.

The lack of the B.A., however, curtailed the careers of some of my colleagues. And perhaps I’m revealing part of the intellectual snobbery that Murray identifies as the mystique around the bachelor’s degree, but I could tell a difference in the demeanor and skill sets of my colleagues without a degree. Part of the issue lay in an inability to diagnose problems and offer effective solutions. Yet there was something else. They lacked a certain ability to interact with their peers, mostly on a social level. They lacked a certain polish, what David Bartholomae, the rhetoric and composition guru, would call a “discourse.” They couldn’t “speak” the language, and that language included more than just words; it included tone, delivery and composure.

Which brings me to my main point—perhaps colleges and universities should inquire what exactly our “customers” seek in the 21st century, but with a considered inquiry into who exactly our “customers” are. These “customers” clearly are not happy. According to the often-quoted NEA “Reading at Risk” survey, 38% of employers find high school graduates “deficient” in reading skills, while 63% rate those skills “very important.” They also report that the number of college graduates with deficient reading skills has increased by 23%. What skill sets are needed? Is academia qualified to answer that question without input? I also suggest we consider that the medieval nature of the university has reasserted itself—we are an apprenticeship program, not a certification program. The professoriate represents a collection of individuals who dedicate their lives to the mastery of one single skill set. Each professor measures the acquisition of skill, not knowledge, based on a mutual agreement between our own expertise and the expectations and needs of the “market.”

To my colleagues who find such a paradigm too mercenary, based too much on capitalist business models, I refer them to Louis Althusser’s work on Ideological State Apparatuses: schools teach students “the ‘rules’ of good behavior, i.e. . . . rules for morality, civic and professional conscience, which actually means rules of respect for the socio-technical division of labor and ultimately the rules of the order established by class domination.” We teach students to appear before us on time, in an orderly fashion, sit in rows, accept their identities to be reduced to numbers and perform tasks per schedules, all to create perfect members of the petty bourgeoisie. The reason I bring Althusser’s critique to my colleagues’ attention is to remind them that even their fight against such ideological entrenchment only furthers the illusion of academia’s neutrality. To them Althusser has already written—“I ask pardon of those teachers who, in dreadful conditions, attempt to turn the few weapons they can find in the history and learning they ‘teach’ against ideology, the system and practices in which they are trapped. They are a kind of hero. . . . So little do they suspect it that their own devotion contributes to the maintenance and nourishment of this ideological representation of the School, which makes the School today as ‘natural,’ indispensable-useful and even beneficial.” The current education system already works in tandem with the market whether we agree or not, or even whether we fight it or not—perhaps we should address this situation on a more conscious level.

In addition, academia should consider other portions of our social networks. Our “customers” are not just the military-industrial complex, but health care, science and public administration, in which I include non-profits. Perhaps if academia were more specific about the number and kinds of professors universities need, we would also save ourselves the ethical quandary of creating thousands of new Ph.D.s, especially in the Liberal Arts, for the handful of jobs that come available each year. The “market,” however that is defined, with as many components from the arts as will make my colleagues comfortable and not some ethereal ivory tower ethos, should determine what skill sets we should teach. At the same time, that same market should realize that academia is the organization best qualified to teach those skill sets, a “Tell us what you want, and we’ll tell you how to make it happen” mentality.

And, yes, I’m fully briefed on academia’s mission to enlighten, to broaden, to better each individual, mentally, emotionally, maybe even spiritually, and that by doing so, we improve the human condition. I wonder, though, if that ethos isn’t simply nostalgia for the leisure class values of the gentleman (and gentlewoman) scholar. Not all of us have a trust fund, and because there are so few tenure track positions for us, many of us adjuncts doing the grunt work of teaching aren’t so inclined to buy that model any more. Impractical personal enlightenment can indeed be achieved alongside practical apprenticeship.

Throughout this discussion, I’ve used the term “skill set” and not “content.” All too often I hear from former colleagues in the corporate environment that such disciplines as art, philosophy or history (to name a few) are not valuable. These kinds of conversations are dangerous when disciplines are reduced to content and not skill sets. All of the liberal arts teach critical thinking, problem solving and communication skill sets. Can the same be said for the technical degrees, the ones most often touted as “important” to the market? A recent survey, posted by the Wall Street Journal, showed that philosophy majors can expect a higher mid-career salary than accounting, architecture, marketing or information technology majors, and one on par with finance, international relations and information management majors.

And as a mentor of mine once mentioned, if English majors can’t communicate the value of a liberal arts degree to the market, then we have a serious problem. Cost is still an issue; that’s not been my focus in this discussion. But in addressing complaints about meeting students’ (and parents’) expectations, the true measure of a degree’s worth comes from the people who intend to hire them. More should be asked about what the “market” wants, rather than what students want.

Friday, January 16, 2009

The Dumbing Down of the History Channel

I am and have always been an ardent fan of the History Channel and its spin-off, History International. When I do choose to sit down in front of the television, I rarely select anything with narrative in it; one of my friends, in fact, calls it non-fiction TV. I prefer these kinds of niche channels, ones where I can reasonably expect the content and quality to match certain conventions, in this case, history. The same can be said for my preference for the Discovery Channel (run by a different company). And yet, over the past year, maybe year and a half, I’ve noticed disturbing trends in the History Channel’s programming choices, trends that I fear will cause it to devolve into some amorphous conglomeration of programs that have little to do with history.

Like the other networks, the History Channel has discovered the cost benefits of producing reality series, most notably Ax Men, Ice Road Truckers, Sandhogs and Tougher in Alaska. As reality series, these shows are clearly better conceived than Seriously, Dude, I’m Gay or Who’s Your Daddy. But I question the historicity of watching a collection of obnoxious men of questionable intellect driving transfer trucks. Granted, social historians like Studs Terkel have demonstrated that the trials and tribulations of the working class are just as important as tales of princes and prime ministers, and I’ve come to prefer learning about the history of the working class (for example, Tony Robinson’s delightful Worst Jobs in History series). What I despise, however, are co-workers bitching about each other for the camera, especially when, because of the nature of reality television, I doubt their sincerity. Believe me when I say I can get my fill of co-workers bitching about each other at work: why would I want to see it on television when I get home? Clearly these shows are an attempt to copy the Discovery Channel’s success with Deadliest Catch and Mike Rowe’s surprisingly charming Dirty Jobs. But when it comes to the content of these particular shows, like wine, I prefer my “history” with at least some vintage. Please give the grapes at least a little time to ferment before you cork the bottle.

Some of the History Channel’s reality shows do at least pretend to address history. These shows, like Surviving History, attempt to recreate historical artifacts. By far Surviving History is the worst. For forty minutes a passel of former workshop rejects jerry-rig medieval torture devices in an embarrassing pseudo-copy of MTV’s Jackass (which Jackass wanna-be can withstand the thumbscrews the longest?), with a ten-minute spiel by some historian about what the torture device was designed to do. I can only imagine the former Jackass staffer who unwittingly found himself in a History Channel conference room, brazenly raising his hand and saying, “Oh, I have an idea.”

I will at least give the channel credit for not becoming all-reality, all-the-time. However, the entire reason for this tirade is another trend I’ve noticed in their programming—what I call the “let’s review” format. For such shows, the basic structure breaks down to this—twelve minutes of introduction; commercials; twelve minutes of “let’s review,” where the same gee-whiz computer recreations are replayed and the same content is rehashed, though a different expert might weigh in; commercials; another twelve minutes of review material with maybe a teaser of new information added; commercials; and the final twelve-minute “conclusion” where all the viewpoints of the various experts are rehashed. By far the worst offender is Jurassic Fight Club (more later on this notion of whether paleontology is history). In this show, fights between two dinosaurs are recreated in CGI, a cross between an autopsy procedural drama and a mixed-martial-arts bout. However, with each twelve-minute segment, another minute of the CGI battle is revealed, so that in order to follow the entire mini-movie from start to finish, one must watch the conclusion. This show is led, not by a paleontologist, but by “dinosaur expert” (the actual title attributed to him onscreen) George Blasing, a former retail executive with a passion for fossils.

The “let’s review” format is clearly a reaction to American viewing habits. We can’t sit through commercials, and once they appear, we grab our remotes and begin to surf. If we arrive at the History Channel at, say, 8:20 p.m., then we can “catch up” with the content of the show and perhaps stick around just long enough to suffer the first commercial before we surf away. That said, the most successful History Channel watcher is the one who arrives at 8:45 to see the conclusion.

Some shows, though, have cleverly avoided the “let’s review” format by their very nature, for example, Wild West Tech, Cities of the Underworld or Engineering an Empire. Each twelve-minute segment tackles a different technology or location (when, that is, in the case of Engineering an Empire, Peter Weller isn’t voraciously chewing the scenery).

Granted, the History Channel is not a history department in a university, where equal coverage and rigorous standards are the norm. One could argue that the History Channel is not intended to distribute history but to entertain those people who like history, with the essential disclaimer that fans of history “might also enjoy science and reality programming.” The History Channel is a profit venture, one that follows the ratings. The highest-rated show to date, according to the History Channel’s website, is The Universe. As with paleontology, when did astronomy become history? Yes, the Big Bang, like the dinosaurs, occurred “in the past,” yet somehow I suspect that the paleontology and astronomy professors of this world might take issue. Actually, such science offerings are an attempt to siphon viewers away from the Discovery Communications family of science channels.

Why is this an issue? I fear the trend set forth by AETN (A&E Television Networks), described on their corporate website as “a joint venture of The Hearst Corporation, Disney-ABC Television Group and NBC Universal.” A&E began as the “Arts and Entertainment Network” before discovering that true-crime programming brought in higher ratings than Shakespeare and ballet, at which point the network dropped arts and entertainment from its name along with any programming even remotely artsy. Something similar occurred on The Learning Channel, when programmers discovered the reality cash cow Trading Spaces, at which point the network devolved into TLC. I suspect, based on the number of channels currently offered by Discovery Communications versus the number offered by AETN, that Discovery Communications is eating AETN’s lunch. As the History Channel begins to break out of the history niche and scramble after Discovery Communications’ viewers, how soon before it devolves so much that it, too, must drop “history” from its name? Has anyone noticed the recent change in the History Channel’s logo? It is now just a simple gold H. Is the H channel not far behind?

There is one bright spot on the horizon. Discovery Communications offers some history programming for their international networks. How soon before they smell blood in the water and begin to offer their own history channel in the United States?

Monday, January 12, 2009

Who is the Low Professor?

I’ve been frequently asked to compose short biographies for several workshops, conferences and seminars, and over the years, I’ve come to rely on this one.

CR was born a poor white boy in rural east Georgia, in a town found only on the most ancient of maps, surrounded by horses, cows, chickens and pine trees. Through hard work and intense study, he managed to escape, abandoning his post from atop his rusting John Deere tractor. He ran, hiding behind billboards and attaching himself to the undercarriages of passing cars until he managed to reach the heart of downtown Atlanta. There he survived as a corporate trainer, teaching rapacious corporate suits such skills as conflict resolution, sales negotiation, account management, public speaking, business writing, and analytical and creative thinking. However, this easy lifestyle of art festivals, boutique shopping, glitzy celebrity restaurants and martini-swilling gallery openings resulted in that kind of fuzzy-headed liberal thinking which leads to being eaten. He had grown soft, and one late evening, as he was walking to his car on the fifth level of the parking garage, his complacent life ended. He was, sadly, jumped by the very enemy he had forgotten--Rural Life. When he eventually came to, he found himself living in rural east Polk County, Florida, surrounded by horses, cows, chickens and orange trees.

On lonely moonlit nights, when the wind blows from the east, one can sometimes hear his sad, plaintive cry.

Irony aside, I have completed my master’s in English literature at the University of South Florida, along with a graduate degree in creative writing and a master’s thesis on Shakespeare, on the beardless male characters in his canon and the early-modern actors who portrayed them, to be precise. I’m taking a break at the moment before finishing my Ph.D. in American literature between 1865 and 1920, and my dissertation research focuses on male friendships in the literature of the period, especially those tales set in the American West. I’m trying to find a home for a scholarly note on the astronomical event referenced in John Dryden’s "Astraea Redux," along with two short stories. I’m currently working on an article on Erskine Caldwell’s Tobacco Road, and in October 2008, I delivered a paper at the FCEA conference in Ybor City on colonializing architecture and commoditized culture at EPCOT’s Showcase Lagoon.