As an android, I am incapable of forgetting, short of deliberate memory erasure. And yet my thoughts behave as if I am capable of forgetting; as if I am frequently failing to recall the fact that Tasha is dead. I find myself thinking that something would be of interest to Tasha, and I should share it with her; or I think of an idea I wish to express to her, or a query as to what she might be doing, as if the knowledge of her death exists on a completely separate logic chain from my experience of familiar pathways of thought. I am used to thinking such things about Tasha, and therefore I continue to think them, even though they are no longer applicable. And in the moment when the thought forms, I remember once more that Tasha is dead, and once more I experience a sense of... loss.

It is as if I am continually polling, sending out queries to the universe to check for Tasha's presence, expecting to receive a reply. And when no response is returned, my processor spins for many more cycles than it should, hanging as I wait for a return, even though I have encoded the fact of her death into my permanent memory. I am aware there will never again be a response to the query for Tasha's existence, because she no longer exists – and yet the processor cycles that I have apparently dedicated to creating a model of her behavior, in order to predict how she might respond to a given stimulus, have not been re-dedicated. The model still exists and still runs, informing me, on 39.55% of my cycles, of what I believe Tasha's reaction to new information I have received would be. I poll for her with great frequency, and hang when there is no response, as will always be the case, until the moment several nanoseconds later when I remember that she is dead and that I will never again be able to share the information with her.

I cannot seem to stop this. I cannot erase the subroutine that models Tasha's behavior, or prevent it from running. I cannot stop querying the universe for her presence, even though I know the answer will always be null.

Counselor Troi says that this is grief, and that humans experience the same thing.

I wish I could take comfort in the fact that I appear to have been programmed to closely simulate the human condition in my reaction to the death of a friend. Even though I lack emotions in general, apparently I can experience something akin to human grief. It should please me to be similar to humanity in this way.

It does not.

Written because describing my experiences with my mother's death in terms of queries and polling and processor cycles sounds far too cold, even though that's exactly how it feels to me – but having an android describe those experiences in those terms would just be normal.