Usages people keep telling you are wrong but which are actually standard in English.
Split infinitives
For the hyper-critical, “to boldly go where no man has gone before” should be “to go boldly. . . .” It is good to be aware that inserting one or more words between “to” and a verb is not strictly speaking an error, and is often more expressive and graceful than moving the intervening words elsewhere; but so many people are offended by split infinitives that it is better to avoid them except when the alternatives sound strained and awkward.
Ending a sentence with a preposition
A fine example of an artificial “rule” which ignores standard usage. The famous witticism usually attributed to Winston Churchill makes the point well: “This is the sort of English up with which I will not put.” Jack Lynch has some sensible comments on this issue.
Beginning a sentence with a conjunction
It offends those who wish to confine English usage in a logical straitjacket that writers often begin sentences with “and” or “but.” True, one should be aware that many such sentences would be improved by becoming clauses in compound sentences, but there are many effective and traditional uses for beginning sentences thus. One example is the reply to a previous assertion in a dialogue: “But, my dear Watson, the criminal obviously wore expensive boots or he would not have taken such pains to scrape them clean.” Make it a rule to consider whether your conjunction would repose more naturally within the previous sentence or would lose in useful emphasis by being demoted from its position at the head of a new sentence.
Using “between” for only two, “among” for more
The “-tween” in “between” is clearly linked to the number two, but, as the Oxford English Dictionary notes, “In all senses, between has, from its earliest appearance, been extended to more than two.” We’re talking about Anglo-Saxon here—early. Pedants have labored to enforce “among” when there are three or more objects under discussion, but largely in vain. Very few speakers naturally say, “A treaty has been negotiated among Britain, France, and Germany.”
Over vs. more than.
Some people claim that “over” cannot be used to signify “more than,” as in “Over a thousand baton-twirlers marched in the parade.” “Over,” they insist, always refers to something physically higher: say, the blimp hovering over the parade route. This absurd distinction ignores the role metaphor plays in language. If I write 1 on the blackboard and 10 beside it, 10 is still the “higher” number. “Over” has been used in the sense of “more than” for over a thousand years.
Gender vs. sex
Feminists eager to remove references to sexuality from discussions of females and males not involving mating or reproduction revived an older meaning of “gender,” which had come to refer in modern times chiefly to language, as a synonym for “sex” in phrases such as “Our goal is to achieve gender equality.” Americans, always nervous about sex, eagerly embraced this usage, which is now standard. In some scholarly fields, “sex” is now used to label biologically determined aspects of maleness and femaleness (reproduction, etc.) while “gender” refers to their socially determined aspects (behavior, attitudes, etc.), but in ordinary speech this distinction is not always maintained. It is disingenuous to pretend that people who use “gender” in the new senses are making an error, just as it is disingenuous to maintain that “Ms.” means “manuscript” (that’s “MS”). Nevertheless, I must admit I was startled to discover that the tag on my new trousers describes not only their size and color, but their “gender.”
Using “who” for people, “that” for animals and inanimate objects
In fact there are many instances in which the most conservative usage is to refer to a person using “that”: “All the politicians that were at the party later denied even knowing the host” is actually somewhat more traditional than the more popular “politicians who.” An aversion to “that” referring to human beings as somehow diminishing their humanity may be praiseworthily sensitive, but it cannot claim the authority of tradition. In some sentences, “that” is clearly preferable to “who”: “She is the only person I know of that prefers whipped cream on her granola.” In the following example, to exchange “that” for “who” would be absurd: “Who was it that said, ‘A woman without a man is like a fish without a bicycle’?”*
*Commonly attributed to Gloria Steinem, but she attributes it to Irina Dunn.
“Since” cannot mean “because.”
“Since” need not always refer to time. Since the 14th century, when it was often spelled “syn,” it has also meant “seeing that” or “because.”
Hopefully
This word has meant “it is to be hoped” for a very long time, and those who insist it can only mean “in a hopeful fashion” display more hopefulness than realism.
Momentarily
“The plane will be landing momentarily,” says the flight attendant, and the grumpy grammarian in seat 36B thinks to himself, “So we’re going to touch down for just a moment?” Everyone else thinks, “Just a moment now before we land.” Back in the 1920s, when this use of “momentarily” was first spreading on both sides of the Atlantic, one might have been accused of misusing the word, but by now it’s listed without comment as one of the standard definitions in most dictionaries.
Lend vs. loan
“Loan me your hat” was just as correct everywhere as “lend me your ears” until the British made “lend” the preferred verb, relegating “loan” to the thing being lent. However, as in so many cases, Americans kept the older pattern, which in its turn has influenced modern British usage so that those insisting that “loan” can only be a noun are in the minority.
Near miss
It is futile to protest that “near miss” should be “near collision.” This expression is a condensed version of something like “a miss that came very near to being a collision” and is similar to “narrow escape.” Everyone knows what is meant by it and almost everyone uses it. It should be noted that the expression can also be used in the sense of almost succeeding in striking a desired target: “His Cointreau soufflé was a near miss.”
“None” singular vs. plural
Some people insist that since “none” is derived from “no one” it should always be singular: “None of us is having dessert.” However, in standard usage, the word is most often treated as a plural. “None of us are having dessert” will do just fine.
Scan vs. skim.
Those who insist that “scan” can never be a synonym of “skim” have lost the battle. It is true that the word originally meant “to scrutinize,” but it has now evolved into one of those unfortunate words with two opposite meanings: to examine closely (now rare) and to glance at quickly (much more common). It would be difficult to say which of these two meanings is more prominent in the computer-related usage, to “scan a document.”
That said, it’s more appropriate to use “scan” to label a search for specific information in a text, and “skim” to label a hasty reading aimed at getting the general gist of a text.
Off of
For most Americans, the natural thing to say is “Climb down off of [pronounced “offa”] that horse, Tex, with your hands in the air”; but many UK authorities urge that the “of” should be omitted as redundant. Where British English reigns you may want to omit the “of” as superfluous, but common usage in the US has rendered “off of” so standard as to generally pass unnoticed, though some American authorities also discourage it in formal writing. But if “onto” makes sense, so does “off of.” However, “off of” meaning “from” in phrases like “borrow five dollars off of Clarice” is definitely nonstandard.
Till vs. ’til.
Since it looks like an abbreviation for “until,” some people argue that this word should always be spelled “’til” (though not all insist on the apostrophe). However, “till” has regularly occurred as a spelling of this word for over 800 years and it’s actually older than “until.” It is perfectly good English.
Teenage vs. teenaged.
Some people object that the word should be “teenaged,” but unlike the still nonstandard “ice tea,” “scramble eggs,” and “stain glass,” “teenage” is almost universally accepted now.
Don’t use “reference” to mean “cite.”
Nouns are often turned into verbs in English, and “reference” in the sense “to provide references or citations” has become so widespread that it’s generally acceptable, though some teachers and editors still object.
Feel bad vs. feel badly
“I feel bad” is standard English, as in “This t-shirt smells bad” (not “badly”). “I feel badly” is an incorrect hyper-correction by people who think they know better than the masses. People who are happy can correctly say they feel good, but if they say they feel well, we know they mean to say they’re healthy.
Unquote vs. endquote.
Some people get upset at the common pattern by which speakers frame a quotation by saying “quote . . . unquote,” insisting that the latter word should logically be “endquote”; but illogical as it may be, “unquote” has been used in this way for about a century, and “endquote” is nonstandard.
Persuade vs. convince.
Some people like to distinguish between these two words by insisting that you persuade people until you have convinced them, but “persuade” as a synonym for “convince” goes back at least to the 16th century. It can mean both to attempt to convince and to succeed. It is no longer common to say things like “I am persuaded that you are an illiterate fool,” but even this usage is not in itself wrong.
Normalcy vs. normality.
The word “normalcy” had been around for more than half a century when President Warren G. Harding was assailed in the newspapers for having used it in a 1921 speech. Some folks are still upset, but in the US “normalcy” is a perfectly normal—if uncommon—synonym for “normality.”
Aggravate vs. irritate.
Some people claim that “aggravate” can only mean “make worse” and should not be used to mean “irritate,” but the latter has been a valid use of the word for four centuries, and “aggravation” means almost exclusively “irritation.”
You shouldn’t pronounce the “e” in “not my forte.”
Some people insist that it’s an error to pronounce the word “forte” in the expression “not my forte” as if French-derived “forte” were the same as the Italian musical term for “loud”: “for-tay.” But the original French expression is pas mon fort, which not only has no “e” on the end to pronounce—it has a silent “t” as well. It’s too bad that when we imported this phrase we mangled it so badly, but it’s too late to do anything about it now. If you go around saying what sounds like “that’s not my fort,” people won’t understand what you mean.
However, those who use the phrase to mean “not to my taste” (“Wagnerian opera is not my forte”) are definitely mistaken. Your forte is what you’re good at, not just stuff you like.
“Preventive” is the adjective, “preventative” the noun.
I must say I like the sound of this distinction, but in fact the two are interchangeable as both nouns and adjectives, though many prefer “preventive” as being shorter and simpler. “Preventative” used as an adjective dates back to the 17th century, as does “preventive” as a noun.
People should say a book is titled such-and-such rather than entitled.
No less a writer than Chaucer is cited by the Oxford English Dictionary as having used “entitled” in this sense, the very first meaning of the word listed by the OED. It may be a touch pretentious, but it’s not wrong.
People are healthy; vegetables are healthful.
Logic and tradition are on the side of those who make this distinction, but I’m afraid phrases like “part of a healthy breakfast” have become so widespread that they are rarely perceived as erroneous except by the hyper-correct. On a related though slightly different subject, it is interesting to note that in English adjectives connected to sensations in the perceiver of an object or event are often transferred to the object or event itself. In the 19th century it was not uncommon to refer, for instance, to a “grateful shower of rain,” and we still say “a gloomy landscape,” “a cheerful sight” and “a happy coincidence.”
Dinner is done; people are finished.
I pronounce this an antiquated distinction rarely observed in modern speech. Nobody really supposes the speaker is saying he or she has been roasted to a turn. In older usage people said, “I have done” to indicate they had completed an action. “I am done” is not really so very different.
Crops are raised; children are reared.
Old-fashioned writers insist that you raise crops and rear children, but in modern American English children are usually “raised.”
“You’ve got mail” should be “you have mail.”
The “have” contracted in phrases like this is merely an auxiliary verb, not an expression of possession. It is not a redundancy. Compare: “You’ve sent the mail.”
It’s “cut the muster,” not “cut the mustard.”
This etymology seems plausible at first. Its proponents often trace it to the American Civil War. We do have the analogous expression “to pass muster,” which probably first suggested this alternative; but although the origins of “cut the mustard” are somewhat obscure, “mustard” is definitely the form used in all sorts of writing throughout the twentieth century. Common sense would suggest that a person cutting a muster is not someone being selected as fit, but someone eliminating the unfit.
Here is the article on “cut the mustard” from the FAQ of the Usenet newsgroup alt.usage.english:
This expression meaning “to achieve the required standard” is first recorded in an O. Henry story of 1902: “So I looked around and found a proposition [a woman] that exactly cut the mustard.”
It may come from a cowboy expression, “the proper mustard”, meaning “the genuine thing”, and a resulting use of “mustard” to denote the best of anything. O. Henry in Cabbages and Kings (1904) called mustard “the main attraction”: “I’m not headlined in the bills, but I’m the mustard in the salad dressing, just the same.” Figurative use of “mustard” as a positive superlative dates from 1659 in the phrase “keen as mustard”, and use of “cut” to denote rank (as in “a cut above”) dates from the 18th century.
Other theories are that it is a corruption of the military phrase “to pass muster” (“muster”, from Latin _monstrare_ = “to show”, means “to assemble (troops), as for inspection”); that it refers to the practice of adding vinegar to ground-up mustard seed to “cut” the bitter taste; that it literally means “cut mustard” as an example of a difficult task, mustard being a relatively tough crop that grows close to the ground; and that it literally means “cut mustard” as an example of an easy task (via the negative expression “can’t even cut the mustard”), mustard being easier to cut at the table than butter.
The more-or-less synonymous expression “cut it” (as in “‘sorry’ doesn’t cut it”) seems to be more recent and may derive from “cut the mustard”.
It’s “carrot on a stick,” not “carrot or stick.”
Authoritative dictionaries agree that the original expression refers to offering to reward a stubborn mule or donkey with a carrot or threatening to beat it with a stick, and not to a carrot being dangled from a stick. This and other popular etymologies fit under the heading the English aptly call “too clever by half.”
“Spitting image” should be “spit and image.”
According to the Oxford English Dictionary, the earlier form was “spitten image,” which may indeed have evolved from “spit and image.” It’s a crude figure of speech: someone else is enough like you to have been spat out by you, made of the very stuff of your body. In the early 20th century the spelling and pronunciation gradually shifted to the less logical “spitting image,” which is now standard. It’s too late to go back. There is no historical basis for the claim sometimes made that the original expression was “spirit and image.”
“Lion’s share” means all of something, not the larger part of something.
Even though the original meaning of this phrase reflected the idea that the lion can take whatever he wants—typically all of the slaughtered game, leaving nothing for anyone else—in modern usage the meaning has shifted to “the largest share.” This makes great sense if you consider the way hyenas and vultures swarm over the leftovers from a typical lion’s kill.
“Connoisseur” should be spelled “connaisseur.”
When we borrowed this word from the French in the 18th century, it was spelled “connoisseur.” Is it our fault the French later decided to shift the spelling of many OI words to the more phonetically accurate AI? To those Francophone purists who insist we should follow their example I say, let ’em eat bifteck.
See also Commonly Made Suggestions