informed consent, in medicine, a patient's written consent to a surgical or medical procedure or other course of treatment, given after the physician has informed the patient of the potential benefits, risks, and alternatives involved. Informed consent is also required for participation in clinical studies. The concept of informed consent is based on the principle that a physician has a duty to disclose to a patient the information needed to make a reasonable decision regarding his or her own treatment. There is debate over whether members of special populations, such as children and the mentally ill, can truly give informed consent.