This story was inspired by An Cailin Rua's wicked humor piece, "By the Pundreds." (Look her up on my Favorite Authors page.) The key claim coming out of that story was that Cybertronian has no puns. I found this to be intriguing, but possibly problematic, though I hadn't imagined I'd find a way of fictively staging my reservation!
Add a few months, a late night discussion with a friend doing his master's in mathematics, throw in the outrageously supersized robots, and you have this story, posted with An Cailin Rua's gracious permission and beta-reading. Special thanks to Riko for the pizza, beer, and mathematical disquisition, and also for being geek enough to read this over for me to check for flagrant implausibility. Thanks also to Marta for casting a logical eye over this piece, despite it not being her fandom or her area of mathematical study. Needless to say, all flaws are mine.
It had begun innocently enough. Maggie hadn't gotten where she was in life without a healthy dose – more than healthy, according to her superiors, friends, boyfriends, and her mother – of curiosity. Not that it took all that much to be interested in the aliens who had literally fallen in with humanity – who wouldn't want a word with them?
Maggie, at least, had a job that allowed for the occasional conversation, to say nothing of access to what was arguably some of the best (and occasionally the most horrifying) scuttlebutt on earth. So she'd heard about the pun incident. She hadn't seen Keller that white-faced and wide-eyed since they'd been 'introduced' to Megatron, just before the battle for the Allspark had begun. Apparently, Cybertronians didn't do anything by halves, whether it was interplanetary conquest or bad humor.
But while everyone else had laughed and shaken their heads, Maggie had quietly gone back to her desk and begun pulling up her favorite math websites. And when she had gone home that evening, she had made straight for the shelf reserved for books of mathematical theory, selected three of them, and then curled up with a beer on her favorite purple couch to read and think.
Everyone had hobbies, after all.
But not everyone's hobbies had quite the consequences that hers did, and over the course of a week and several more books and articles, as Maggie had compiled her argument, her unease had grown. When, finally, she had gotten it all laid out on her coffee table, scattered over about twenty different sticky notes, she'd spent some time considering who she might tell about this. Or if she should tell anyone. She read the news, after all – fences were going up all over the world, even in the "free" world. Xenophobia wasn't exactly wanting in excuses to rear its head.
But she couldn't let it go, either. She'd spent another week going back over what she'd concluded, wondering whether she could spot a bad inference or whether the data might mysteriously change. And in the meantime, she'd had a few run-ins with Cybertronians at work – Prime and Ratchet were both in D.C. for some discussion on the probable environmental impact a species of enormous robots would have. That had given her the opportunity to test her theory, though she'd tried to be subtle about it. She wasn't sure whether she'd actually been subtle or whether they simply hadn't picked up on cues that would've given her away to another human being, or perhaps they had simply been avoiding potential conflict.
But by the end of those two weeks, between books and her little conversational experiments, she'd come to the conclusion that she couldn't just keep her theory to herself. Which was how she'd come to be waiting around in the quiet little park about a block from work, holding down a lunch table as she kept one eye on the sidewalk and the other on her watch. And: Glenn, don't flake out on me! she thought, stirring her cup of instant ramen noodles with a pair of chopsticks.
Finally, however, she spied a familiar figure ambling her way, and she waved eagerly to him. Glenn gave her a nod and ambled a little faster, lunch in hand – his version of brown-bagging it being a stop off at McDonald's.
"Long time, no see," he greeted her as he dropped the red and white bag on the table and plunked himself down across from her. "So what's up?"
"Some pretty weird stuff," she replied, and he snorted as he peered into the depths of the bag.
"Maggie, the day it ain't 'pretty weird stuff,' I'm gonna have your head checked," he declared, as he came up with a cheeseburger and began unwrapping it.
Maggie rolled her eyes, but then she leaned forward and said, "Seriously, Glenn – just listen, and please, for the love of God, don't repeat this. Like, not to anyone, all right?"
Once upon a time, Glenn would've pooh-poohed the idea that he would repeat anything she told him – it was a part of the hacker's creed, after all: Look, don't touch or tell. Unless, of course, it was a matter of safety a hacker couldn't in conscience keep silent about, but generally, you didn't tell. Or at least, you didn't tell anyone who would tell anyone with the authority to crack down on hackers. It was like the cyber-version of the Hippocratic Oath: First, do no harm… to fellow hackers.
Glenn and Maggie both understood that, but they'd never had it put to the test until six weeks ago, and admittedly, that little scene in the FBI holding cell had put a bit of a… strain… on their friendship. In fact, they hadn't talked as much since then – not even online.
It wasn't that Maggie was angry, precisely – Glenn had come through for her and for them all when it counted. Nevertheless, that unhappy bit of history lingered like the summer heat. Judging from the look Glenn was giving her, he was feeling the awkwardness of the past six weeks and wondering just what this sudden request for secrecy might mean, this side of Mission City. Maggie, however, said nothing, and just raised an expectant brow.
"Ok," he said after a few moments. "You got it. Nobody'll hear it from me. Promise."
"I mean it, Glenn. This could be…" Maggie paused a moment, seeking the right word, which, unfortunately, seemed to be eluding her. And so she finished instead: "I don't know what it could be, but it could be big."
"You aren't carrying Pentagon secrets in your make-up again, are you?" Glenn asked, warily, glancing around for ear-bugged suits. "Nothing in your lipstick case?"
"No. Not yet. It's not anyone's secret yet – except mine, and maybe also theirs."
"Whose?" Glenn asked.
"Autobots," Maggie hissed. Glenn's eyebrows crawled up his face, and he gave a low whistle.
"Not that I'm not interested – you get the goods, girl! – but you're tellin' me this why?"
"Because," she replied, and gave him a meaningful look, "who else would I tell?" Glenn paused with the burger halfway to his mouth, and he stared at her a moment before he huffed a bit and hastily took a bite of his lunch.
"So," he asked, when he'd swallowed and cleared his throat, "what's the deal?"
"All right," Maggie said, as she followed his lead and began eating some of her rapidly cooling lunch. And as she ate, she spoke: "So last week, Keller had a teleconference meeting with the Autobots, and afterwards, it was all over the Pentagon that Cybertronians didn't have puns in their language."
"How'd that come out?" Glenn demanded, wide-eyed, though his mouth was twitching. Clearly, he was trying to imagine robots who could casually look over rooftops making puns, and succeeding a little too well, if scuttlebutt were true. Maggie sighed.
"Never mind that. Listen," she urged. "You're a computer hacker – you're good at codes, at logic. Think about this a minute: they're a race of aliens who have been around for, God, millions of years, according to what they've told us, and they don't have puns in their language. None."
"'Cuz they're, like, giant robots?" Glenn supplied around a mouthful of hamburger, in a tone that clearly implied 'duh.' Nevertheless, it gave her her opening, and she seized on it.
"Exactly! They're giant intelligent robots – real AI, right? But they have a language that's purely logical."
"So?" Glenn shrugged.
"'So?'" Maggie repeated, incredulously, then demanded: "Glenn, when was the last time you heard of a purely logical language working like this? For years, cyberneticists here have been trying to break the language barrier, and if a purely logical language could cut it, they'd have done it already!"
"Yeah, but these guys aren't from our labs," Glenn retorted, attention still clearly on lunch more than anything else. He bit into the burger once more with relish and heaved a contented sigh. "Man, you may have saved my life! Grandmama's been on a health food kick like you wouldn't believe!"
With a sigh of far lesser contentment, Maggie set her chopsticks aside and reached into her purse to pull out a book.
"Have you studied Gödel's theorem?" she asked, as she slid the book across the table. Glenn shoved his glasses up on his nose as he leaned over the book, craning his neck at the upside-down title.
"Aren't you taking mathematical theory right now?" she demanded.
Glenn made a derisive noise, giving a dismissive wave of his hand. "I'm taking some bonehead calculus class that's supposed to get me out of other bonehead calculus classes. Can't believe they want this stuff just to let me get to programming I already know how to do! I'm tellin' you, man, they want us to go to school, they gotta knock this crap out!" He shook his head, disgusted. "Anyhow, we don't theorize, we calculate."
Maggie rolled her eyes. "Well haven't you ever looked beyond the code to the theory some of it uses?"
"Why bother?" Glenn shrugged. "Code works, right?"
"Because if you bother, then you're one up on everyone who doesn't," Maggie shot back. But then she leaned her elbows on the table, settling into her lecture mode. "Ok, look, Gödel's theorem is based on sets. You know about sets, right?" And when Glenn shot her a glare for the obvious pedantry, she continued. "So it's about being able to prove the truth value of sentences that are based on the elements of a set. Ok?"
"Okaaaay." Glenn drawled it out, clearly waiting for the punch line. "So what?"
"So here's what: if you take a language and you use prime factorization, you can assign every word and logical operator in a language a unique natural number. Every single operator can be matched up with a number that refers to it and no other. So '2' for the operator 'parentheses,' and '3' for the operator 'and', and so on until you've covered every word and every logical operator that forms relations between words – you follow me?"
"Yeah, sure." He nodded, and popped the last bit of his burger into his mouth, sinking a little further into a complacent slouch. Maggie arched a brow at this, exasperated, but she pressed on regardless – if she could just get to the point, she'd have him, she was sure of it.
"So now you have everything assigned a unique symbol. Everything in the language has a totally unique 'word' – it's a consistent language. No repetitions, no sound-alikes, if you were going to put this in natural language terms," she said. Glenn grunted and waved one hand, while with the other, he dug a few fries out of his bag.
"All right, I get it," he answered. "You've got a language with no puns, then, like you were saying."
"Yes. You've got a language with no puns or homophones or homonyms – no polysemy – and the point," Maggie said emphatically, when Glenn's attention threatened to drift off into another McDonald's meditation, "is that we can do that – it's just a number generator and an assignment routine that'll be able to prove whether a sentence is true or false. We even know the code. But think about the cost to language for a minute," she urged.
Glenn frowned, absently munching on French fries, and Maggie waited, holding her breath. "Ok, so it'd be limited, right?" he said after a long minute. "It's just logic, one-to-one correspondence, under a domain of … uh… oh." Glenn blinked, and she could see revelation steal across his face.
"You see it?" Eagerly, Maggie held out her hands, as if to welcome him to her world. "You get it, don't you?" she demanded, excitedly.
"I think so. It's just logic..."
"Exactly, it's just logic. And logic needs unique symbolization because it has to be perfectly consistent, which means," she said, emphatically, "every meaningful statement has to be absolutely distinct from every other and it has to have only one of two values. Every sentence in a purely logical language has a meaning only because it's referred to a horizon of true or false. It's all propositional in that every sentence in it has no other meaning than 'it's true' or 'it's false.'"
"But that's… wait." Glenn wiped greasy fingers on a napkin and leaned his head in his hand, scowling ferociously now. After a few more moments' furious thought, he glanced up at her and protested, "But they do talk about other things than just what's true and what's false." A pause, then: "Don't they?"
"Well, there's the problem," Maggie replied, deflating into a bit of a slouch herself now. "I'm just not sure they do."
"But I've heard them," Glenn protested.
"What did you hear?"
"Well, they know things like… like freedom!"
"You heard someone say 'freedom is the right of all sentient beings,' didn't you?"
"Yeah. And ok, ok," he admitted, "so you can say that's got a truth value, but… seriously?"
"I've been trying to talk with them since I figured this out," Maggie replied, taking up her chopsticks once more, and stirring meditatively at her by now quite cool ramen. "I've been listening more carefully, too. I haven't asked outright – it just feels, well, rude. But I asked Ratchet once – I had on these new earrings. Big, bangly ones," she said, smiling a bit at the memory of them. "I asked him whether they were beautiful."
"What'd he say?"
"He said 'yes.'"
"Well there you go," Glenn replied, sounding satisfied. But Maggie shook her head.
"That's a true-false question, still – either they are, or they aren't, according to a standard he could have gotten from a commercial or a website or just picked up by hearing someone else say 'Earrings are beautiful,' or something like that. So I changed the topic a little – I started talking about how I was thinking of making jewelry as a hobby," she explained. "And I asked him what it was that made something beautiful."
"Aw, c'mon, that's not even fair," Glenn started to object, but Maggie cut him off.
"Is it? Is it really so unfair?" One pale brow arched skeptically. "If I asked you, wouldn't you answer?"
"Depends," he said warily. "You askin' me about something you're wearing?"
"I mean in general – what is it that makes something beautiful, Glenn?"
"Nobody knows the answer to that!"
"No. But we all can answer the question, can't we?" Maggie said, pinning him with a look. "We may not agree, but everyone knows 'beauty' when they see it and can come up with a general way of identifying it, right?"
"So what did Ratchet say?" Glenn demanded, avoiding an answer.
"He looked at me," Maggie said, brow furrowing, as in her mind's eye, she saw those glowing blue eyes fix unreadably upon her. "He looked at me, and then he asked what I thought made things beautiful."
"Well what's that prove? Maybe he got a cattle prod up his tailpipe or somethin' for getting outta line with a girl-bot on whether she was pretty," Glenn muttered. Maggie narrowed her eyes at that.
"'Girl-bot'?" she repeated, coolly.
"I'm just sayin' it's possible, ok? Whoever says it ain't, hasn't seen the giant alien robots in the first place!" Glenn declared, folding his arms defensively over his chest.
"It's possible," she conceded. But even as Glenn drew himself up just a little smugly, she continued with quiet force: "Or Ratchet might not have known what I meant because until someone assigns beauty a truth-testable reference, what I asked him wouldn't make sense in his purely logical language." Maggie shook her head. "You can't generate a meaning for beauty by referring to 'true' and 'false.' If it can't be defined on an object in those terms, you'll never be able to work with the idea as anything but a totally empty concept."
Glenn stared at her. "You don't think they understand anything until they can say it's true or false?"
"Well, I've been trying to ask questions like the one about beauty – about things that don't refer to truth or falsity to make sense. And the replies they've given have all been… evasive, after a certain point," she said finally. She pushed a hand through her hair, then slid it down the chain of the pendant she was wearing, 'til she had the green stone in her hand, and she ran her fingers about its edges – a pensive, nervous habit.
"Are you saying you think they just assign a number to 'beauty', and that's it? That they, what?" Glenn paused, hands sketching helpless incomprehension in the air, "That they don't see beauty? They just have a word that doesn't mean anything?"
"Not just that it doesn't mean anything, but it couldn't mean anything to them. A logical language, if it were possible, would stall on any number of things," she replied. "Things that you can't say are true or false if you really want to understand them – like 'fun' or 'fear'… or 'love.'"
"Dude," Glenn said after a moment. "That's messed up!"
For a while, the two of them sat there, thinking, with the cooling remains of lunch between them. Maggie looked around at the trees, at the roses and the little fountain in the middle of the lawn and thought of how hard people fought to have just a little bit of garden in their lives. Not because it could feed them, but just because it was beautiful, and no one needed to tell them what that was…
"I could be wrong, you know," she said after a time. "They did 'get' puns finally."
"And what about music?" Glenn said suddenly. "Didn't Sam say Bumblebee likes spinning tunes? What's Gödel got to say about that?"
"Well, a lot, actually," Maggie admitted. "In fact, it's his main point. You see, he came up with his theorem to prove that a language couldn't be both consistent and complete – that there's always a sentence within the set that proves the set can't do what it aims to do: be a complete, closed set of meanings that is consistent and so has no 'polysemic' variability."
"And this killer sentence would be?" Glenn asked.
"The claim: 'I cannot be proved,'" Maggie replied, arching a brow at him as she watched him digest this. "You can cover every relationship and word in the sentence, and if the language is consistent, the sentence is true – but the set itself can never prove it. What it tells you is that the set undoes itself as a closed set."
"So the only way to have that kind of logical language – " Glenn began slowly.
" – is based on there being an exception that can generate a contradiction," Maggie confirmed. "They're called Gödel sentences. They're a sort of… structural requirement of consistent languages like logic or, apparently, Cybertronian. But it puts a limit on what Cybertronian can talk about, if it really is a consistent language."
"But they liked the puns?" Glenn pressed, and Maggie nodded. "Then maybe when Ratchet asked you what you thought was beautiful, he really was just trying to figure us out before diving off the deep end, or something?" he hazarded.
"Maybe. And admittedly, I'm inclined to think that any being that can laugh has a broader horizon for meaning than just truth and falsity," she replied.
"I'll say. I mean, man, if they don't know what words like 'love' or 'hate' mean unless they can say it's true or false that someone loves or hates, what are they on about all this time with the war and everything?" Glenn shook his head. "They can't just have a totally logical language!"
"Well, that's the question, isn't it? They say they lack puns, which means they lack polysemy; they say their language is logical. But then they laugh. I've been thinking about whether you can reconcile it all," Maggie said, reaching across the table to snag a fry, which she began tearing pieces from and throwing to the sparrow that she'd noticed hungrily circling their table. It darted in to peck at the potato bits, and Maggie, watching it, grimaced.
"I suppose you could say that 'laughter' is a systematic response when they're confronted with a paradox not allowed for in their language," she sighed. "But that doesn't fit what I've heard or what I've seen."
"Wouldn't it just be easier to say they've gotta be working off something other than a logically consistent set?" Glenn demanded.
"Easier's one thing, correct is another."
"No kidding! And I'm saying they're just wrong," Glenn retorted. "Wouldn't be the first time an intelligent species has been wrong about itself!"
"That's one option," Maggie replied, stressing the word.
"You got another one?"
"Look, we're dealing with artificial intelligence – or what appears to be intelligence, based primarily off a logical language, if we accept what they say as true," she replied. "With Gödel's theorem, that leaves you to go two ways, as I see it: firstly, it could mean that they're wrong, like you said, and they have an inconsistent language, just like we do, whatever its peculiarities. Or it could be that they're in some sense telling the truth: that they do have a logical language. But you can't speak from a logical language like that – AI labs have tried that already, we know it doesn't work. It doesn't give you AI like what we're seeing."
"Um, Maggie – "
"No, seriously, Glenn," Maggie cut him off. "It's been a question since at least the eighties whether you could ever prove that AI could do what it claims to do: create intelligence on a computer. Now here it is – the 'real' thing! But is it really? What if we could explain everything we've seen as a sort of modified Chinese room?" And when Glenn just stared at her, she prompted, "You know, a machine that gives the appearance – "
" – of intelligent thought, yeah, I know, because it has some little guy in it matching up different sets of symbols according to some code book he doesn't understand," her friend said quickly. "But that's not the same – "
"No, it isn't, but it makes sense of everything we've seen – and of everything we could ever see. What if," Maggie asked, leaning forward intently, "the Cybertronian base operating system – their base 'language' – is a core logical language, just like any other computer, but their 'interface' with the environment is basically a Chinese room? And since this is a new environment for them, what if they haven't got a complete rulebook?"
"Is that even possible?"
"I don't know. But look, let's at least say they might not have a very extensive rulebook yet," Maggie said, hurrying past that, and with it, several long arguments. "They've only been here six weeks. Well, four years for Bumblebee, but –"
"But he wasn't exactly stopping to smell the roses," Glenn finished the thought, and she nodded.
"So what if they're just now starting to need to expand the rulebook?" Maggie asked.
Glenn frowned. "You think that's what Ratchet was doing, asking you what you thought made things beautiful?"
"It would make sense. If the normal operating language is logical, he'd be looking for a new rule that would let him add a page to the rulebook. He could've kept asking until he got an answer that could signify as 'true' or 'false,' then used that as his yardstick for finding appropriate responses to that kind of question. He wouldn't have to have any real idea about what 'beauty' is himself, he'd just need to know what traits made something beautiful to me," Maggie replied, leaning forward once more, as was her habit when intent upon an idea. "It's possible, at least theoretically, I think, and it would take account of what we see and what they say." She paused a moment, then asked, "What if that's it? What if that's why they've been able to fool us this long into thinking they think like we do?"
"Ok, brain's hurting, here," Glenn said, and pressed his fingers to his temples, massaging gently. Then he held up one splayed hand, as if to halt the rush of speculation, and said, "Let's leave aside the fact that the Chinese room is totally theoretical and that people've been arguing about it for what? Twenty years or thereabouts? You're saying the reason Cybertronians seem to be speaking the same kind of language we do most of the time is that they're all walking, talking Chinese rooms? And that the whole thing with the puns is just a sort of compilation error that ends up looking like a translation problem 'cuz that's what the rulebook says to do? Put it in terms of a logical language issue?"
Maggie nodded.
"But that still doesn't work!" Glenn protested. "I mean, ok, fine, let's say it did. Let's say that's what they've got and it makes sense of the whole pun issue and everything else they've said or done. The point of the Chinese room – the thing that makes it cool – is that it's a fake! It's a way of seeming to be really thinking, but it's nothing but some guy who just matches up codes by following rules, without understanding anything!"
"I know," Maggie said softly, tossing another morsel to the sparrow, which was beginning to risk hopping a bit closer. But Glenn was on a roll now, and didn't seem inclined to acknowledge her as he continued:
"I mean, that's crazy, right? I don't care how big they are, they don't have a bunch of little green men running around in there!"
Maggie sighed. "I imagine not."
"It'd have to be like a machine in a machine, or code within a code, to pull it off. And the processing speed… We don't even know if that's possible!"
"It's true, we don't," she conceded, even as her friend wound toward his conclusion, having shown no sign he was really listening to her at this point.
"But if you're right about the puns, and they're telling the truth, then we can't rule it out. But if we can't, then that's… man, that's whacked!" Glenn declared, shaking his head dazedly. "That'd mean we've gotta deal with the possibility of a couple tons of machinery running around that are just… just pretending…"
And since he trailed off, she finished quietly for him: "Pretending to be people, when there's no one home?" And when he nodded, wordlessly, she grimaced. "I know. And that's what worries me."
Glenn snorted. "'Worried' is my grandmama thinkin' I'm goin' blind from sittin' too close to the screen," he declared. "This here is like Ghost in the Shell, except these guys ain't no tachikomas!" He fell silent then, picking dispiritedly at his fries. After a few moments, though, he glanced up at her and asked, "Just outta curiosity, what are you worried about?"
"Well, if I'm right, and the Chinese room idea checks out, then that raises a lot of questions about them, doesn't it?" Maggie replied. "Just to start with, why would there be a lot of machines that can 'pass' for intelligent, and that seem so perfectly suited to hiding in certain kinds of environments – like ours, or that of any other technological species?"
"Somebody wanted 'em made that way?" Glenn hazarded, though with manifest reluctance.
Maggie tipped her head toward him, acknowledging the point. "But if so, you've got to wonder what they wanted with machines like that – and where they are, the ones who made the machine that made them in the first place. Then, too, this whole thing, with the Decepticons and the Autobots – what's it really all about if they aren't really… well, thinking beings?" Maggie demanded.
"Something tells me I don't wanna know the answer to that," Glenn muttered. "Couldn't we just say it was some logic error and leave it at that?"
"Is that really better? That they could war themselves into extinction over a logical mistake? Just a computational error?" Maggie cocked her head at him, and Glenn sighed.
"You know," her friend said, "I'm beginning to think I like it better when you're smuggling top secret information in your rouge!"
"Come on, Glenn – if you want to hack top-class data, you've got to be ready to deal with the content," Maggie chided.
"Yeah, but you're talking about these guys as either some kinda spy drone things or machines that got violent over somebody misplacing a logical operator or somethin'!"
"Which is why what I'm worried about is what we're going to do," Maggie replied, dragging the conversation back around to the point that had been needling her insistently since last week. She frowned, absently playing with her pendant once again. "This whole thing with the puns and what they've said – even if I'm wrong about everything and they really do have a language we could qualify as natural, a language you could think from, the error isn't obvious. I've gone over this for a week, Glenn – I'm pretty sure the argument works out."
"You're worried because the argument checks out?"
"No, I'm worried because I might be wrong, but if I am, I can't see the mistake! We're talking about whether or not to treat these… beings… as people or not, Glenn. We can't be wrong about this," she shot back, and glared.
"But you're thinking we might be," Glenn replied, unhappily.
"I'm thinking they might've just given us the royal road to a royally wrong conclusion – logic!" Maggie sighed, putting her head in her hands, as she continued in a pained undertone: "Because they're not safe, Glenn. Even when they don't mean to be a danger, they're still dangerous. And people are so scared about so many things, lately." Hands fisted in her hair, as she twined curls about her fingers, tugging gently before looking up once more.
"If," she said ominously, "this got out, and people started thinking of Cybertronians as just these machines that only act like people, they wouldn't care about the argument. All they'd care about is that these are creatures – creations, rather – that can take out a building just by turning around, even when they're not deliberately blowing holes in them with weapons that you can't disarm from them without cutting off the equivalent of body parts!" She paused, raised a brow. "You see where this could go?"
"'Do I see?' Who you think you're talkin' to, Vanilla Ice?" Glenn demanded, gesturing to himself and shaking his head derisively. "'Do I see?' 'Course I do! People start thinking they're not really people, just a lot of trouble – boom!" He mimed pressing a button. "Control-alt-delete, man, with extreme prejudice and a side of napalm! You got your high tech lynch mob ready to roll with a phone call from your pal, Keller. Be a lot easier that way." He paused a moment, then asked, "But what if you're not wrong?"
Maggie considered this a long moment, before she replied, slowly, "The whole concept of the Chinese room relies on the assumption of a blind mechanism that, from any external perspective, is indistinguishable from a person who really thinks and speaks. That's the paradox – you can describe the Chinese room, but if one existed, you couldn't ever recognize it for what it was. You couldn't ever know that you weren't dealing with a person who had intelligence."
"Then maybe," Glenn said, and shrugged, "that's your counter-proof right there. If they were some kinda Chinese room done up in computer coding, you wouldn't have had a reason to think they were."
"That's not the point," Maggie insisted, but before she could go any further, his face clouded over, and he sucked in a breath. "What?" she asked, frowning.
"Autobot, ten o'clock," he muttered, and Maggie looked quickly over her shoulder to see a chartreuse S&R Hummer with a non-standard logo pull into the parking lot. As soon as it had rolled to a stop, it transformed smoothly into the Autobots' CMO. Ratchet stood there a moment, looking about, then carefully sidled between two trees and out onto the lawn.
"Hey, Ratchet," Glenn said, just a little too cheerfully to come off as innocent. Or comfortable. Fortunately, Ratchet did not seem to notice.
"Glenn," the Autobot greeted him in return, and then turned to Maggie: "Hello, Maggie."
"Hi," she answered, trying to seem a little less nervous than her friend. She glanced around, looking for flame-painted semis, and seeing none, asked, "Weren't you and Optimus Prime supposed to be meeting with those scientists today?"
"They need to refuel – eat," Ratchet answered. "Optimus opted to stay and talk with Keller."
"And you came out to see the sights or something?" Glenn asked.
"To see some sights," Ratchet replied, somewhat enigmatically, cocking his head at the sparrow, which had apparently decided that a three-ton robot was somehow less frightening than human beings, and had landed on one of his lights.
"Careful, man, those things aren't too friendly to cars," Glenn advised.
"They are certainly messy at times," Ratchet agreed, but made no move to dislodge the bird.
Maggie glanced at Glenn, then down at the unappetizing remains of her lunch, and she stood, hurriedly stuffing the book back into her purse.
"I've got to run," she excused herself. "My lunch break is nearly over, and I've got a lot of analyses stacked on my desk." Turning to Glenn, she said, "Maybe we could meet again next week for lunch?"
"Sure," Glenn replied, and she smiled at him as he, too, rose. "I'll walk you back?"
"That would be lovely," she replied.
But they hadn't gone far when Maggie stopped, and turned back toward the Autobot who stood silently surveying the little park and its beds of roses. And despite herself, she just had to ask. "Ratchet?"
"What did you come out here to see?" she asked, aware of Glenn listening intently at her side.
Ratchet did not answer at once, and appeared to be considering her question. But at length, he replied, still staring at the roses, "Beautiful things." Then, turning toward her, he seemed to gauge her response – or lack of it – before adding, by way of ending the conversation: "Have a good afternoon."
"Yeah, you too, man," Glenn said from just beyond her elbow. "Catch you later!" She felt his fingers curl about her arm, as he urged, "C'mon, Maggie."
But Maggie didn't move – not immediately. She stood there, staring up at Ratchet, who stared back unblinkingly through bright blue eyes that radiated opacity in a face that was not so much closed as devoid of expression. Of expressiveness, and of a sudden, it was as if she could see every join and rivet and wire – all of it just an enigmatic assembly dissembling the suggestion of a face…
It was just a moment, and then she blinked; the uncanny image dissolved, and she was faced with Ratchet once more. The CMO canted an optical ridge at her, apparently uncertain what to make of her silence. And so hurriedly, she answered:
"Ah, thanks. You, too. I hope you find some – beautiful things, that is." To which, Ratchet gave a soft rumble, then wordlessly turned back to his contemplation of the roses, while she and Glenn hurried on their way. It wasn't until the two of them were halfway down the long city block that Glenn asked, in an undertone:
"What was that about?"
Still feeling rather unnerved, Maggie took her time in answering, but when she did, it was very deliberately that she replied, "Nothing. Just thinking."
And then before Glenn could think that one over, she tucked his arm in hers, and asked, determinedly, "You know, I'm going to take up a new hobby – making jewelry. So tell me – what do you think would be beautiful…?"
Brief, Unsightly Bibliography and Disclaimer
And now that you've gone through all that, this would be the point where I throw in the delayed disclaimer – I'm not a mathematician or a computer scientist. When I wrote the first draft, I was going off one night's conversation and old memories of Roger Penrose on Turing machines. Subsequent research has turned up quite the furor over just what Gödel's theorem does and does not prove. Long story short, this is the kind of argument that lasts no less than fifty years in philosophical journals, and I'm the newbie on the scene who never signed on to the vision of philosophical purpose that logical empiricism/positivism articulated and bequeathed to contemporary analytic philosophy.
The basic claim that spurred me to write is this: a language without the ability to pun is a language that cannot be spoken while exhibiting all the features its speakers in fact exhibit in their thinking and reacting – not, at any rate, and still have those features be meaningful to those same speakers. Gödel's theorem may be able to support me on this point, although it's still arguable. At the point where I introduce a dichotomy – to the degree that they are meaningful, said speakers are not speaking from a consistent language, whereas to the degree that they speak from a purely consistent language, they are not true speakers in the sense of 'intelligent subjects' – things get much more contentious, and I end up going a route that is highly contested, mostly for reasons external to the technical justifications a philosophical argument of this type would be concerned with. I bring up the issue of the possibility of AI because it connects to an issue that is dramatically interesting, and that speaks to what I assume would be a real problem if a lot of heavily armed alien robots, fresh from a genocidal war, showed up one fine day.
In any case, although the philosophical 'school' that argues over Gödel's theorem is not one that I generally agree with, it does give me a handy set of concepts that fit the AI/computer science context with which I can expect Maggie and Glenn to be familiar and comfortable. Whether or not I agree with the assumptions is less important than that this kind of argument seems far more consistent with these characters than other kinds of arguments. Finally, writing this was a great way of becoming a little less ignorant about this branch of philosophy.
For those interested, I found the following to be interesting and sometimes helpful:
Edis, Taner. "How Gödel's Theorem Supports The Possibility of Machine Intelligence." Minds and Machines. Vol. 8., 1998. 251-262. Gödel's theorem supports machine intelligence, because the continuity of physical algorithms is an empirical claim we do not know how to test. Should it be established, however, that there is continuity, then the distinction between human minds and machine processes would collapse. But it is just this sort of distinction on which we rely when we argue that Gödel's theorem excludes the possibility of machine intelligence.
Friedman, Michael. Reconsidering Logical Positivism. Cambridge, UK: Cambridge University Press, 1999. Especially chapter 7 – Why Carnap's metalanguage dies a painful death at the hands of Gödel and his theorems. Blessedly free of jargon and with minimal amounts of formal symbolic logic.
Friedman, Michael. "The Re-evaluation of logical positivism." Journal of Philosophy. Vol. 88, 1991. 505-519. Reviewing the myths about logical positivism. And then showing where its deepest philosophical insight ran fatally aground when it could not answer the question "What is the purpose of philosophy?" without putting it in such a way as to leave it wide open to Gödel's work.
Gaifman, Haim. "What Gödel's Incompleteness Result Does and Does Not Show." Journal of Philosophy. Vol. 97(8), 2000. 462-470. – Gaifman argues that using Gödel's theorem to rule out the possibility of an intelligent computer rests on our inability to prove the consistency of a formal system. At this point Gödel's second theorem – that no formal logical system can prove its own consistency – intervenes, preventing us from establishing, within any given system, that that system is consistent, and at the same time rendering undecidable the difference between human reflective reasoning and a mechanical reasoner that simply incorporates consistency proofs one level 'higher' than our own. Since I learned about the second theorem's existence only after the whole story had been written and rewritten, and since it does not lead as nicely to the moral dilemma, I opted not to include it, although there are some obvious places where it could have fit.
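For readers who want the theorems themselves rather than my paraphrases of them, a standard textbook formulation (my own summary, not drawn from any of the sources listed here) runs roughly as follows:

```latex
% Gödel's incompleteness theorems, informally stated for a consistent,
% recursively axiomatizable theory T that includes basic arithmetic.
%
% First theorem: there is a sentence G_T (the "Gödel sentence") such that
%   T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T,
% where G_T is constructed, via the provability predicate Prov_T, so that
%   T \vdash G_T \leftrightarrow \neg \mathrm{Prov}_T(\ulcorner G_T \urcorner),
% i.e. G_T "says of itself" that it is not provable in T.
%
% Second theorem: letting Con(T) abbreviate the arithmetical sentence
% expressing "T is consistent", namely
%   \mathrm{Con}(T) := \neg \mathrm{Prov}_T(\ulcorner 0 = 1 \urcorner),
% we have
%   T \nvdash \mathrm{Con}(T).
% No such theory can prove its own consistency.
```

Both results presuppose the consistency of T, which is exactly the hinge on which Gaifman's and Raatikainen's objections to the anti-AI argument turn.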
Jacquette, Dale. "A Turing Test Conversation." Philosophy: The Journal of the Royal Institute of Philosophy. Vol. 68, 1993. 231-233. – A short, clever attempt to dramatize the way in which Gödel's theorem can be used to distinguish a machine imitating thought from a person who is thinking. Basically, this short dialogue contests that the conditions of the Chinese room – which include passing a Turing test – could ever be met, and uses Gödel's theorem as its wedge, in a move Gaifman would likely reject as drawing an invalid conclusion.
Raatikainen, Panu. "On the Philosophical Relevance of Gödel's Incompleteness Theorems." Revue Internationale de Philosophie. Vol. 59 (4), 2005. 513-534. – A useful orientation to several uses of Gödel's theorem, including the anti-AI thesis. Raatikainen follows Gaifman and just about everyone else listed here in rejecting this thesis as unsupportable; all that can be supported is that the question depends upon demonstrating the consistency of the language, which may be a more or less complex task depending on the logical language in question. Because the truth value of a Gödel sentence depends on establishing consistency, and because human beings may be incapable of recognizing a consistent system, the more complex the logic, the less certain we can be that it is consistent. But Gödel's theorem applies only to consistent systems; ergo, if we cannot establish that a machine is operating with a consistent language, then we cannot know whether Gödel's theorem applies in such a way as to have any relevance for conclusions about the possibility of mechanical intelligence.