Short post today – I’ve got the flu/cold/whatever that it seems everybody in the country has. And, I’m told it lasts two weeks. (Yeah, I got a flu shot, but there’s always something else that goes around each year anyway). No post Wednesday.
Any idea what finite-state morphology is? I’d never heard of it – though it turns out I use its products every day, and you probably do, as well. (It’s the first time I’ve ever failed to find a Wikipedia entry on something.) I came across it in reading articles I picked from Computational Linguistics for what I hoped would be sufficiently layman-friendly language. I was reading one of the Lifetime Achievement Award speeches, this one from Lauri Karttunen, currently at PARC at Stanford. I found myself completely lost, trying to glean the meaning of statements such as:
Generative phonologists of that time described morphological alternations by means of ordered rewrite rules introduced by Chomsky and Halle (1968). These rules are of the form α → β / γ _ δ, where α, β, γ, and δ can be arbitrarily complex strings or feature matrices. It was not understood how such rules could be used for analysis… Johnson observed that although the same context-sensitive rule could be applied several times recursively to its own output, phonologists have always assumed implicitly that the site of application moves to the right or to the left in the string after each application. For example, if the rule α → β / γ _ δ is used to rewrite the string γαδ as γβδ, any subsequent application of the same rule must leave the β part unchanged, affecting only γ or δ. Johnson demonstrated that the effect of this constraint is that the pairs of inputs and outputs produced by a phonological rewrite rule can be modeled by a finite-state transducer.
Okay, you lost me. Until later in the article, when I could finally extract some shade of meaning when I discovered that finite-state morphology is responsible for… the spell checker. (And, less successfully, the grammar checker.) Realizing that a lifetime achievement speech would probably be delivered to people who didn’t need a 101 lesson in the subject, I dug around the Internet (failing, shockingly, on Wikipedia) and found this blurb for one of Karttunen’s books, called, helpfully, Finite State Morphology:
Natural-language words are typically formed of morphemes concatenated together, as in un+guard+ed+ly and over+critic+al, but some languages also exhibit non-concatenative processes such as interdigitation and reduplication. When morphemes are combined together into new words, they often display alternations in their pronunciation or spelling, as when swim+ing becomes swimming, take+ing becomes taking and die+ing becomes dying. Finite-state morphology assumes that both the word-formation rules (morphotactics) and the morpho-phonological alternation rules can be modeled as finite-state machines.
A finite-state machine being, in essence, a process that steps through a limited set of states according to fixed rules. (Here’s the best plain-English definition I could find; the word “machine” here really means what we’d call “process.” And they’re all “finite state” because a machine with infinitely many states is purely theoretical.) So the fruits of Computational Linguistics are working in the background every time Word, in my case, underlines a misspelled word and offers a correction. (Not always brilliantly; for instance, spell check offers to correct “morphotactics” in the quote above to “morph tactics,” which sounds like something from a video game.)
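To make the blurb’s two-part idea a little more concrete, here’s a toy sketch of my own in Python – nothing like the real finite-state tools Karttunen’s book describes, which compile such rules into transducers rather than running pattern replacements – showing morphemes glued together (morphotactics) and then alternation rules cleaning up the spelling at the seam:

```python
import re

# Toy morpho-phonological alternation rules, applied at the morpheme seam
# (the "+"). Each is a (pattern, replacement) pair, tried in order; a real
# finite-state system would compile these into a single transducer.
ALTERNATIONS = [
    (re.compile(r"ie\+ing$"), "ying"),                            # die+ing  -> dying
    (re.compile(r"e\+ing$"), "ing"),                              # take+ing -> taking
    (re.compile(r"([aeiou])([bdgmnprt])\+ing$"), r"\1\2\2ing"),   # swim+ing -> swimming
    (re.compile(r"\+"), ""),                                      # default: just drop the seam
]

def realize(stem: str, suffix: str) -> str:
    """Concatenate two morphemes, then apply the first alternation that matches."""
    word = f"{stem}+{suffix}"
    for pattern, replacement in ALTERNATIONS:
        if pattern.search(word):
            return pattern.sub(replacement, word)
    return word

for stem in ("swim", "take", "die", "guard"):
    print(realize(stem, "ing"))
# prints: swimming, taking, dying, guarding
```

The ordering matters – “die+ing” would wrongly become “diing” if the e-deletion rule ran first – which is exactly the “ordered rewrite rules” business from the quote above.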
[I also found this book online, which I’m looking forward to – 200 pages on CompLing written mostly in “plain English” – I know, I know, if only I hadn’t listened to Fran Lebowitz all those years ago when she said in “Tips for Teens,” “Stand firm in your refusal to remain conscious during algebra. In real life, I assure you, there is no such thing as algebra.” After all, I was going to be a novelist; what use had I for math? For years I’ve had Russell and Norvig’s textbook on AI (the text, it appears; MIT’s OpenCourseWare class on AI uses it), and been so daunted by the incomprehensible mathematical formulas that I’ve hardly cracked it. I’m not opposed to math; it just never came naturally. And it hasn’t held me back, until now, when α → β / γ _ δ is Greek to me, and I need to admit my failings in this department and address them, at least to a remedial level, if I’m going to write this novel.]
Karttunen’s article makes it plain that getting a computer to understand “plain English” is a major project, given its range of ambiguity. For instance, take these two examples he gives:
a. Deena did not wait to talk to anyone. Instead, she ran home.
b. It hurt like hell, but I’m glad she didn’t wait to tell me.
(a) implies Deena did not talk to anyone. But (b) implies she told me something right away.
Question 1: How does it come about that X didn’t wait to do Y means either that X did Y right away or that X didn’t do Y at all?
We glean the meanings immediately – Deena “didn’t wait” to get an apology, an explanation, or an invitation to dance; something bad happened, internally or externally, because she “ran” home before the situation could change. The “she” in the second sentence had something that was painful for her to say (or, maybe, for the listener to hear), which is the kind of thing you might “wait” to say (i.e., maybe never say it, until you’re caught out withholding information and construct a “plan” in which you were “waiting for the right time to tell”).
The layers of meaning that can be extracted by a machine are still far fewer than the layers we extract intuitively, unconsciously, from bushels and bales and sheaves of experience. The greatest difficulty isn’t coding the rules of language – since there is a “finite” number of rules, however large, they can eventually all be nailed down. The difficulty is coding the exceptions, the shades of meaning, the emotional response we have to certain language patterns that is crucial to a full understanding of what’s just been said. Building the Intuition Machine will be the final frontier of real machine “intelligence” when it comes to language.