Day 149: glam-bam 2017

The younger is helping me to get ready for my conference. She looks through my closet and chooses a dress she thinks I should take. I sigh when I see what she’s chosen. It’s a black jersey dress I bought off the sale rack at Jigsaw a couple of years ago. About every six months I try it on in front of the mirror and gaze at myself with a mixture of fascination and disbelief, wondering what on earth I was thinking when I bought it. Then I take it off again and hang it back in the closet. I have literally only ever worn this dress standing in front of my own bedroom mirror. She insists I try it on so she can give her assessment.

I wriggle into it as best I can and we both survey my reflection in the mirror. It clings to all the places, both right and wrong.

“Whoa!” the younger gasps.

She’s looking at my reflection in the mirror with a scandalized expression.

“You look like … like … like a movie star!” she finally exclaims, with some alarm.

She shakes her head. “You definitely shouldn’t wear that. It’s too …. movie star.”

Although I can’t agree that my appearance in this dress resembles that of any movie star I can recall (no movie star’s publicist would let them wear this dress, onscreen or off, without some heavy-duty torso-trussing undergarment), I do agree that I should not wear the dress.

With some relief I change back into the sweatpants I was wearing before and continue packing.

“OK, let’s practice how you’re going to talk to people,” she announces.

She herself has dressed for the occasion: grey leopard-print leggings, a red sequined party dress, and my gold boots.

She clomps over to me. “So, are you having a wonderful trip?” she asks.

“Yes, I am, thank you,” I say. “How’s your trip going?”

“Very good, very good,” she says. “Isn’t this just so amazing and glam-bam?” she says, gesturing all around us and referring, I presume, to the fabulous soiree we are apparently attending.

“Yes,” I reply. “Wait, what is glam-bam?”

“Glam-bam is, you know,” she makes a je-ne-sais-quoi expression, and tosses her hair, “just glam bam.”

“Have you tried the punch?” she asks.

“Umm, yes. It’s delicious,” I say.

“Wait,” she says, speaking in a stage whisper, momentarily suspending the make-believe: “do you think there will actually be punch at your concert?”

“Conference,” I say. “A concert is different. Hmmm, I don’t know if there’ll be punch. Probably not. But I hope so,” I say, sincerely.

She resumes her making-conversation-at-a-conference voice. “Don’t you just love how we’re such beautiful and pretty and cool and stylish ladies?”

“And also smart,” I say, feeling the need to make a Feminist Intervention.

“Yes, and smart,” she echoes, unconcernedly.

“At a conference you might ask people about their work,” I continue, because I always have to ruin everything fun.

She doesn’t miss a beat. “So how is your research on literature and the eighteenth century?” she asks.

My mouth falls open in surprise and then laughter. She is on it. No intervention required. “Er, very well, thank you,” I stammer.

I collect myself. “And how is your research on eighteenth-century literature going?” I ask her, in return.

“Very good, very good,” she says. “Would you like more punch?”


Day 148: stab!

“What are those?” the younger asks, spying a packet of pads in the newly organized bathroom cabinet.

We’ve had a version of this conversation dozens of times. But she always acts as though she’s asking for the first time.

“They’re for when you have your period.”

“What’s a period?”

“It’s when you have bleeding, when you’re a wo—” I stop and revise mid-sentence, feeling pressure to be precise, “starting when you’re an older girl, once a month.”

I have a longer answer to this question, about the uterus and the lining blah blah blah, but my approach to explaining periods to the younger is similar to the way I approached telling people about my dissertation when I was applying for academic jobs. My answer exists in interlocking sections; I’ve got my one-sentence version, and I don’t offer elaboration until prompted.

“Once a month!” she exclaims, in a tone of outrage, as if I’ve told her she can expect a monthly beating starting in a few years.

“Is it a tradition,” she enunciates the word carefully, and I wonder if it’s one they’ve been using in kindergarten this week, “or does it just happen?”

“A tradition? Uh, no, it’s not a, a tradition.” [1]

“Dang it, if it was a tradition I wouldn’t have to do it!” she exclaims.

I am just thinking to myself that I love this conception of what tradition means (a thing you don’t have to do) when I observe that she is now (in illustration of the counterfactual “it” she wouldn’t have to do if periods were a tradition? In illustration of what it is that “just happens”?) gleefully plucking an invisible knife out of the air and miming stabbing herself in her crotch.

“Stab!” she yells, jubilantly, as she plunges the invisible knife into her groin.

I give an involuntary yelp.

“Yikes, no. Jesus.” I am actually wincing. “No. No. There’s no stabbing.”

As she turns away I find myself wondering if she will perform this mime in front of other people and if it will somehow be traced back to my explanation of what periods are.

“There’s no stabbing!” I call after her, as she walks away, and I think to myself that this is what is called losing control of the narrative.



[1] Although it kind of is, isn’t it? Which is to say, a menstrual period is both something that “just happens” (or, sometimes, sometimes momentously, doesn’t happen) and a “tradition.” I remember when I was on the pill how weird I thought it was when I realized that some of the pills were placebos so as to create an artificial period. And that the reason for this was because the inventors of the pill “believed that women would find the continuation of their monthly bleeding reassuring” (quoting from Malcolm Gladwell’s 2000 New Yorker article, “John Rock’s Error.”)



Day 147: Ode to Adderall

If you ask me, it’s not that I have a problem paying attention. It’s that I have a problem paying attention to things that are boring. I excel, if I may say so, at paying attention to many things, including television shows, cocktail menus, long, drawn-out dinners, stern admonitions, shoes, dancing (whether as spectator or participant), well-wrapped parcels, my children’s verbal articulations,* shells on the beach, a head of hair that I have permission to brush or braid, conversations (face to face or at a distance), and sex.

*except when these involve any of the following: Minecraft, Shopkins, the characteristics of various forms of weaponry, Beanie-boos, or anything they need right now, Mom, etc.

I am middling when it comes to attending to academic talks, essays, or monographs (rather unfortunate, this), art exhibitions, wine or beer lists, movies that lack a plot, pre-1750 American literature (the truly terrifying sermons of Jonathan Edwards excepted), and long disquisitions on unfamiliar subjects (note: the latter often masquerade as one side in a conversation; I am not fooled one bit).

I am abominable at attending to sealed envelopes addressed to me (unless they are adorned with an intriguing, hand-written script or promise to contain either book royalties or an invitation to a fancy party), dirty dishes, plants (indoor or outdoor), anything in the back of the refrigerator, laundry (clean or dirty), bills, and overdue notices of all kinds.

I’ve had trouble paying attention to boring things for as long as I can recall. And it’s been part of my identity at least since my maths teacher called me out for being “lazy” (“I see you looking out the window, lost in your own little world!”), when I was about eleven.

But I was coddled enough (by family, by university) that it didn’t really present a life-interfering problem until grad school.

“I can’t deal with paperwork,” I confessed to the therapist I sought out at the university’s mental health services, “or any other ‘little things’ that need to get done, so then they pile up and pile up and become overwhelming. I would rather write my dissertation. That’s how much I can’t bear tackling mundane tasks.”

“And are you actually writing your dissertation?” she asked me.

“Oh, yes!” I replied. “Writing is so much easier than dealing with all that other stuff.” (“all that other stuff” referred to any kind of task related to financial aid, immigration documentation, banking, bills, etc.)

At the time she was like, “well, maybe writing your dissertation is more important right now.” And I remember leaving feeling rather pleased with myself, re-imagining my inability to attend to the little things as evidence that I had my priorities straight, was simply dedicated to the execution of grander tasks altogether.

And yet …

Over the years my inattention to little things nearly landed me, more than once, in serious trouble. I let my green card expire and left the country. I only noticed when I was back in the U.S. I guess the immigration officer hadn’t been paying attention. (I actually only became a U.S. citizen in order to avoid the hassle of having to renew my green card every ten years for the rest of my life.) I twice racked up library bills amounting to tens of thousands of dollars and both times managed to persuade the long-suffering librarians to cancel the fines.

It led, inevitably, to problems in my marriage. Apart from the problem of my chronic inability to complete mundane tasks in a timely manner, which was naturally considerably irritating to He-Who-Must-Be-Preserved, his difficulty understanding why I couldn’t. Just. Get. Things. Done created a gulf between us.

I remember on more than one occasion having a conversation that went something like this.

DUCK-RABBIT: You know what I hate?


DUCK-RABBIT: When you decide that you want to do something in the long-term but in the moment you just can’t stick to it.


DUCK-RABBIT: Oh, you know, like when I decided [as I do, periodically] that it was definitely unethical to eat meat and I was fully convinced, and I made a plan even down to what I would order next time at Bagel Nosh instead of my usual but then two hours later we went out and I thought, fuck it I need a burger. Or when I resolve I’m going to exercise regularly and I even do that thing where I put all my running stuff out so it’s ready in the morning but then the next morning the prospect of actually going running is just too awful to contemplate. Or when you know you’ll feel better if you just clean up your desk but the piles of paper are just way too overwhelming.

HE-WHO-MUST-BE-PRESERVED: I …. I don’t really have that.

DUCK-RABBIT: What do you mean?

HE-WHO-MUST-BE-PRESERVED: If I decide I want to do something then I just do it.

DUCK-RABBIT: But what if it’s something that takes a lot of willpower?

HE-WHO-MUST-BE-PRESERVED: [a little nervously, realizing that his answer is not what I want to hear] Then I summon up the willpower?

DUCK-RABBIT: [increasingly agitated] But what if you don’t have the will-power?

HE-WHO-MUST-BE-PRESERVED: If it’s something I’m sure I want to do then I’ll have the willpower.

DUCK-RABBIT: [confounded] What, so you don’t have a rich, internal struggle every time you have to do anything that’s not flop on the sofa and watch telly?


At this point I would proceed to flop on the sofa and sulk while muttering to myself that he must be lying because it is a known fact that wrestling with yourself about whether you can bear to do anything other than lie on the sofa is part of the universal struggle of humankind.

These days I’m not so sure.

The catalyst for my recent doubts about whether overcoming indolence is indeed a universal struggle was another crisis: yet another situation in which a relatively simple bureaucratic task I had avoided completing ballooned, thanks to my inertia, into an overwhelming series of bureaucratic tasks.

This task was submitting my psychiatry bills to my insurance company so that I could be reimbursed for part of the (exorbitant) cost. I had allowed a full year’s worth of bills to pile up without submitting any of them. When this finally came to light (and by “came to light” I mean “was forced to confess to He-Who-Must-Be-Preserved due to our still intertwined tax returns”) I felt a pit in my stomach in which the shame of having deferred this simple task of responsible adulthood for so long mingled with the dread of realizing that I now actually had to submit the bills and complete multiple copies of the odious CMS-1500 health insurance claim form, each copy of which only has the space to itemize three weeks’ worth of appointments, which meant, since I had a full year’s worth of appointments to itemize, that I had to fill out, um, 17 and a half copies of the form. And not only that but then find an envelope. Or possibly multiple envelopes. And stamps. And a mailbox, for God’s sake.

I finally confessed in therapy that I’d been hoarding my superbills, after actually lying on previous occasions when my psychiatrist asked if I’d been submitting them. Given all the intimate and embarrassing things I have told my psychiatrist over the years, it is telling that this of all things is the thing I withheld. As I explained, I added, almost idly, “this has been a problem my whole life.” When she asked what I meant I mentioned the preposterously large library fines and the other problems with completing necessary mundane paperwork, issues that have come up tangentially in therapy before but which we’d never discussed at much length until now.

I thought, because this is generally what we do in therapy, that she might ask me to dredge up my earliest memory of experiencing this kind of paralysis around completing mundane tasks, or something like that.

Instead, what she said next took me totally by surprise.

“Have you ever thought about taking a stimulant?” she asked.

“A stimulant?” I repeated.

“It’s used to treat ADHD,” she said, and went on to explain that it might help me concentrate upon and complete tasks.

I was a little incredulous at first. I didn’t fit my image of the kind of person who might have ADHD: a hyperactive kid (OK, a boy) who plays a lot of video games and has trouble sitting still in class. Whereas I am amazing at sitting still.

Still, we talked about it for some time and I decided, cautiously, that I might be interested in trying it. She gave me a questionnaire to fill out and I began to feel increasingly excited as I read the first four questions:

  1. How often do you have trouble wrapping up the final details of a project, once the challenging parts have been done?
  2. How often do you have difficulty getting things in order when you have to do a task that requires organization?
  3. How often do you have problems remembering appointments or obligations?
  4. When you have a task that requires a lot of thought, how often do you avoid or delay getting started?

The answer to all four of these questions, for me, is: often. Not always, but often.

Not all of the questions spoke to me. For example: “How often do you have difficulty unwinding and relaxing when you have time to yourself?” Um, never (see: flopping on sofa, above).

But enough of the questions resonated with me that I started to wonder, a bit giddily, whether it was actually possible that a behavior I had always thought of as a character trait, one that could only be overcome by mustering Benjamin-Franklin-level reserves of determination I was quite sure I didn’t possess, might actually be a chemical imbalance that could be altered by taking a drug.

Reader, I write to you now as someone who is now taking a daily stimulant. It’s been a week and the change I’ve observed is pretty remarkable. Things I’ve accomplished this week include: submitting all the bills and the bloody forms in a satisfyingly hefty envelope. (Oh yeah, and I even mailed it.) Yard work. I’ll just repeat that for the benefit of those who know me well: I did yard work. Or gardening as we call it in England. Voluntarily. I even enjoyed myself. Cleaned out the overflowing shelving unit in the bathroom that was so tightly stuffed with miscellaneous bathroom products that retrieving a tube of toothpaste from it was like a game of Jenga. Killed two daddy-long-legs. (Actually that might just be a coincidence.) Wrote a full draft of a talk I have to give a clear two weeks before said talk is due to be delivered. Even finished making the PowerPoint. The most momentous accomplishment might seem the least impressive to those of you who have a more normal ability to complete boring tasks: never until now have I been able to completely wash all of the dirty dishes in the sink. I always run out of steam about three quarters of the way through and leave a couple of plates, or a few knives and forks in the sink. That hasn’t been happening. Basically, I can now, much more easily, focus on completing a task I’ve decided I want to complete. It sounds so simple, doesn’t it?

The strangest part is not the individual tasks that now seem doable. The strangest part is that this chemical experiment has shaken my previous belief that fighting inertia was not just my problem but rather one that afflicted most people to an equal degree. Honestly, have the rest of you just been merrily doing your dishes and filing your insurance claims forms with no great angst all these years? I just thought all of you people who got things done had this superhuman tolerance for mental pain! Why did nobody tell me!?

If I let myself start thinking about what my life might have looked like if I’d started taking Adderall earlier, it’s easy to get a little glum. Would that book have been written twice as fast? Would conflicts over domestic labor simply never have occurred because I too would have cleaned the kitchen properly in the first place rather than in my typical half-arsed fashion?

I worry that this post (at the very least its title) might seem to trivialize the experiences of people who have more severe problems with attention deficit (or more severe problems, period); that isn’t my intention at all and I know that my attentional “disorder” (if we want to call it that, and I’m not sure that I do) is mild. I also know Adderall isn’t a panacea; it has side-effects, like any drug, and can be addictive when taken in high doses. And I’m in no way dismissing the idea that my (or anybody else’s) trouble completing tasks might stem from underlying beliefs or fears that might profitably be explored in therapy as well as treated with drugs.

I can imagine that someone (probably someone British) might observe, kindly, upon reading this post, “but the problem isn’t with you, dear duck-rabbit, it’s with this ghastly society with its information economy and ubiquitous social media that has addled your brain”; or, “but the problem isn’t with you, dear duck-rabbit, it’s with this ghastly society with its demand that one be constantly productive that has distorted your expectations of what a human can be expected to accomplish in one day.”

To which I would reply, this is probably true. But since this is the society I live in, I have to find a way to make it livable; and this is one way that is proving helpful, at least right now.

Finally (last bit of hand-wringing, this), I worry that my zeal for this new drug may cause you to worry that this dreamy duck-rabbit has turned feverish and frantic; so I hasten to assure you that I’m on an extremely low dosage under close supervision, and I don’t think there’s much discernible change in the way I interact with the world; I haven’t suddenly become a fast talker or someone who stays up all night alphabetizing her book collection.

And I still love flopping on the sofa. Odds are, I’m there right now.


Day 145: or what

Sometimes I wonder if the movies that permeated my consciousness as a child determined my later research interests. Take Groundhog Day. I watched it the other week with the kids and realized that it’s basically a late 20th-century version of Leibniz’s parable about Sextus Tarquinius. Both can be filed under the category: stories about assholes who are momentarily plucked out of the stream of existence so they may contemplate whether they wish to renew their commitment to assholery. The fact that Phil Connors (Bill Murray’s character) chooses, eventually, to be good while Sextus defiantly refuses isn’t the only difference between the two narratives. In Leibniz’s tale the possible world that is actualized is one in which Sextus’s individual interests and the world’s fail to align: Sextus meets a grisly end but his demise is also the catalyst for the founding of the Roman republic. In Groundhog Day the interests of the individual and the world line up: indeed, the learning curve the movie traces is Phil Connors’ gradual, very gradual discovery that dedicating himself to fulfilling the interests of other people is also the route, the only route, to his own happiness.

Before we watched the movie, I was sitting on the sofa and the younger sidled up to me, eyeing my belly critically.

“Are you going to have a baby, or what?”

I answered carefully. “No. No, I am not going to have a baby.”

Later that night, after the movie, when I was going to bed, I put on my Leibniz T-shirt, as you do (it has a portrait of Leibniz on it—I’ve been told it looks like me, truth be told, which is not a terribly flattering comparison—and underneath it reads, “In another world it would be worse”), and I wondered about that other world, the world the younger imagined, in which she would be the middler.


Grammatically, when we express ideas about possibility, we use modal verbs like must, shall, will, etc.

This week, it being February, I had to teach “Frost at Midnight.” Duck-rabbit, I said sternly to myself, you shall. Not. Cry.

(I more or less pulled it off.)

“You shall not cry” is an example of something called “commissive modality” (I only learned this concept today.) It refers to a kind of modal statement in which the speaker indicates a strong commitment to ensuring the described event comes to pass (in this example, not crying). In English such statements usually use the modal verb “shall.”

There is something, I reflected today, especially poignant about second-person commissive modality, i.e. statements in which shall is used in the second person, like “you shall not cry.” Indeed, it’s precisely at those lines in “Frost at Midnight” where the speaker uses shall (and it’s always in the second person, because he’s addressing his infant son) that my voice breaks:

“My babe so beautiful! it thrills my heart

With tender gladness, thus to look at thee,

And think that thou shalt learn far other lore,

And in far other scenes!

“But thou, my babe! shalt wander like a breeze”

“he shall mould

Thy spirit, and by giving make it ask”

“Therefore all seasons shall be sweet to thee”

“Or if the secret ministry of frost

Shall hang them up in silent icicles,

Quietly shining to the quiet Moon.”

Why is shall used in the second person poignant? I think it’s because of the other contexts in which we more usually encounter “shall,” which are quite specific.

Shall is the language of the law, whether divine and laid out in scripture (“Thou shalt not …” in the King James Bible or, to go back to Leibniz and his fable, “If you will renounce Rome, the Parcae shall spin for you different fates, you shall become wise, you shall be happy,” as Jupiter says to Sextus) or secular and expressed in legal documents, where it’s normally used in the third person (“the parties shall share joint legal custody”). The most prominent use of second person shall that I can think of other than in scripture is in fairy tales, where it’s used by fairies good and bad. “You shall go to the ball!” proclaims Cinderella’s fairy godmother. [1]

In the Ladybird book version of Cinderella I read as a child (the one with the three balls and, most importantly, the THREE DRESSES), in response to Cinderella expressing her wish to attend the ball the fairy godmother says, “And so you shall, my dear.”

Or consider these lines, uttered by the bad fairy in the Ladybird book version of Sleeping Beauty:

“When the King’s daughter is fifteen years old, she shall prick herself with a spindle and fall down dead.”

But then the good fairy modifies the bad fairy’s curse: “‘The Princess will prick herself with a spindle,’ went on the twelfth fairy, ‘but she shall not die. She will fall into a deep sleep that will last for a hundred years.’” [2]

When fairies or gods use shall in the second person, it’s not poignant, because their supernatural authority underwrites their decrees; the difference comes when mere humans use “shall.” Such statements become poignant because of the auditor or reader’s understanding that the speaker can’t actually guarantee that the events they so confidently decree will necessarily come to pass. When an ordinary human uses “shall” in the second person, the very use of that verb bespeaks the individual’s powerlessness and their resultant desire to take on the authority of a god or fairy who might actually make the desired event so merely by uttering the decree.

Take Tennyson’s poem “The Death of the Old Year”: the first two stanzas end with second-person shall statements; “You came to us so readily, / You lived with us so steadily, / Old year you shall not die.” And then: “So long you have been with us, / Such joy as you have seen with us, / Old year, you shall not go.” But the point, as the poem makes clear, is that the speaker has no agency to stop the year from turning. The “shall” statements are a desperate bid to ward off the inevitability of death.

All of this is a very long way of explaining why “and therefore all seasons shall be sweet to thee” makes me cry; because would that all our benedictions for our babies were also promises.


[1] I looked but couldn’t figure out where this particular, now iconic expression of the fairy godmother’s promise to Cinderella derives from. Disney???

[2] I stumbled on an amazing subgenre of YouTube video while searching for these references: videos of actual Ladybird books being flipped through silently, sometimes with hands visibly turning the pages, other times not.


Day 144: in which I am bananas and also nonplussed


DUCK-RABBIT [in highly exasperated tone]: ___, you are driving me completely BANANAS.

YOUNGER [breaking out in helpless giggles]: BANAAARNAS!!! BANAAARNAS???!!! [pronouncing the word in mocking imitation of her mother]


YOUNGER [continuing in lofty tone]: Or, as Americans say…

DUCK-RABBIT [interrupting in anticipation of what is coming]: Yes, as Americans say ….

YOUNGER: “You are driving me completely INSANE.”

DUCK-RABBIT [nonplussed [1]]: What?

YOUNGER: “You are driving me completely INSANE.” That’s what an American would say.

DUCK-RABBIT: Oh. [a little crestfallen] I thought you were going to say “you are driving me completely BANÆNAS” [pronouncing the word in mocking imitation of an American accent]


DUCK-RABBIT: Right, I realize that now.



[1] I happened to look up nonplussed in the OED online just now, as you do, and discovered something peculiar. You could say that I was nonplussed by what I discovered. According to the OED, “nonplussed” has two distinct and opposed meanings. The first, the one I’m familiar with, is “1. Brought to a nonplus or standstill; at a nonplus; perplexed, confounded.” But then check out definition 2: “2. orig. and chiefly U.S. Not disconcerted; unperturbed, unfazed.” Then there is a note: “This usage is often regarded as erroneous: see discussion in etymology.”

Oh you crazy Americans! The OED’s “discussion in etymology” suggests that the American usage resulted “probably as a result of association with other words in non-prefix.” But wouldn’t this only make sense if “plussed,” somehow, had the connotation of being agitated in some way?

Disappointingly, plussed doesn’t actually seem to be a word in its own right, although I think it should be (Try saying “I’m so fucking plussed right now” and tell me it doesn’t feel right). But even as I tried out plussed and found it pleasing, I wondered if it feels right as an adjective describing annoyance because it sonically evokes other words we use to describe vexation. Plussed at once assonantly recalls fussed and consonantly evokes pissed. Or maybe it’s simply that the most frequently used examples of non-prefixes are applied to conditions in which the negated quality is bad: as in non-toxic, or non-threatening.

I remain nonplussed (and, for now, nonbananas).


Day 143: the swelling spleen and phrenzy raging rife

On my drive to work on Tuesday I decided that George Michael’s song “Freedom ’90” is a Rousseauvian critique of Franklinian self-fashioning. I was on my way to lecture, where I was to teach Franklin’s Autobiography, that paean to the art of self-reinvention. For Franklin as for Hume, identity is labile: “… as the same individual republic may not only change its members, but also its laws and constitutions; in like manner the same person may vary his character and disposition, as well as his impressions and ideas, without losing his identity” (Hume, Treatise, 1739). The words “character” and “identity” are important, here. You can vary your “character” without changing your “identity.” For Franklin, then, a self, like a republic, may be vastly improved and even perfected, with the use of the right methods and the cultivation of the right habits.

For Rousseau, by contrast, there is an essential, persistent self. His aim in the Confessions is to record that self, not to reinvent it: “The real object of my confessions,” he writes, “is to communicate an exact knowledge of what I interiorly am.”

“Freedom ’90,” as I heard it on the way to work, views the speaker’s misguided, Franklinian past from the perspective of a Rousseauvian epiphany. The speaker recalls how he initially achieved professional success by adopting Franklin’s technique of fashioning his self according to the demands of his audience: “I went back home got a brand new face / For the boys on MTV.” But eventually the speaker tires of this “show,” and an authentic, interior self asserts itself:

“I think there’s something you should know / I think it’s time I stopped the show / There’s something deep inside of me / There’s someone I forgot to be.”

I could go on with this reading of the song, but I can’t be bothered. I have deep affection for Rousseau, Franklin and especially George Michael, but on Tuesday I found myself profoundly irritated by their self-actualizing narratives. Both the idea that the self is infinitely variable and the idea of some ineffable interior self buried deep beneath the socially molded exterior share the assumption that one might shrug off the socialized self like a snake shedding its skin. Both models resist an idea of selfhood as material and embodied. And that’s why both the Rousseauvian and Franklinian models ring false to me.

Here I should probably quote Spinoza, or Deleuze, but I can’t be bothered with them either. Instead I will just say that I find it difficult, at this moment, to distinguish myself either from the imperatives my body exerts upon me, or from the imperatives the external world exerts upon me. If only my perceptions felt the way Hume characterizes them. He makes them sound so light and gauzy, like a fluttering swarm of butterflies, “which succeed each other with an inconceivable rapidity, and are in a perpetual flux and movement.” But if the mind is a kind of theatre, as Hume represents it, mine is not one through which perceptions elegantly “glide,” sylph-like, as he would have it. No, my perceptions land, heavy-footed, like a ton of bricks.

Hunger. Fatigue. Grief. Guilt.

Sensation for me comes more like Edmund Spenser’s sickly parade of sins in The Faerie Queene, “The swelling Spleen, and Phrenzy raging rife, /The shaking Palsey, and Saint Frauncis’ Fire,” tramping through the theatre on their crew of motley beasts, pissing on the stage and leaving mud stains on the seats.