and focus on what they can.41 The medical profession's eyes are
frequently blinded by this phenomenon.42
One good example is the medical community's former refusal to
acknowledge the existence of adolescent depression. Frederick K.
Goodwin, scientific director of the National Institute of Mental Health,
explains that the psychotherapeutic community only recognized the
existence of severe depression in teenagers and children in the mid-
'70s. Says Goodwin, "Until maybe 10 years ago, we believed that
severe depression was solely an illness of adults. Adolescents didn't
develop 'real' depression--they just had 'adolescent adjustment
problems,' so most psychiatrists didn't and still don't think to look for it
in kids. Now, however, we know that idea is dead wrong."43
In the '50s or '60s, you might have had a fourteen-year-old who
couldn't sleep, who'd stopped eating, who was spending a good deal
of his or her time weeping, who may have been a good student--bright,
cheerful and sociable--but now seemed cut off from other people. If
you'd taken that child to a psychiatrist or psychologist, the doctor
would almost certainly have dismissed the symptoms as merely a
phase.
Meanwhile, in 1951, medical researchers developed a new drug
(iproniazid) to treat tuberculosis. When the substance produced
strange side effects--making patients euphoric and unusually
energetic--it was slated for the medical trash can. Then, in 1957, a few
psychiatrists discovered that iproniazid could be used to relieve
depression in patients who hadn't responded to any other form of
therapy. But it wasn't until the '70s that the use of these miraculous
new pills spread through the medical community.44
When antidepressants began to spill from the pharmaceutical
factories, a phenomenon which hadn't existed in the minds of doctors
emerged from the shadows of denial. The physician was suddenly
willing to see a set of symptoms which a few years earlier he had airily
dismissed. Today, the good doctor sagely pronounces the new