It's a puzzle to scientists, but a new study suggests that the main cause of deadly skin cancer -- sunlight -- might also help protect against the disease.
The key could lie in the amount of ultraviolet B (UVB) light the skin absorbs -- enough to stimulate a healthy, vitamin D-linked immune response in the skin but not so much that it boosts skin cancer risk.
"I do think that a little bit of sunlight is good for people, but I think that one of the problems that the American Cancer Society and dermatologists have is, how do you define what a little bit is?" said skin cancer researcher Marianne Berwick, chief of epidemiology at the University of New Mexico's Cancer Research and Treatment Center. "How do you tell people that it's OK to have a little bit of sunlight but not too much?"
In 2005, Berwick's team published a controversial study that found that melanoma patients with higher levels of daily sun exposure actually had better survival than patients who spent less time in the sun.
"I've been searching for an explanation for that ever since," she said.
Now, findings from a group led by immunologists at Stanford University may provide an answer. The study, led by professor of pathology Eugene Butcher, is expected to be published in the March issue of Nature Immunology.
In its study, the Stanford team worked with cells in the lab and discovered a biochemical chain of events that appears to link sunlight exposure to the skin's own immune defenses.
The researchers started from the notion that an inactive precursor of vitamin D, called vitamin D3, "is generated in the skin in response to sun exposure." That's been known for years. Specifically, a short-wavelength form of UV light, called UVB, is responsible for D3 generation.
On its own, D3 is biologically inactive. Enzymes in the liver and kidneys convert it into an active compound called 1,25(OH)2D3.
And that's where the immune-system connection kicks in, the Stanford authors said.
In their experiments, they found that the new compound "signaled (immune) T-cells," pushing them to migrate back to specific sites in the skin's epidermis. Once there, these powerful immune system agents stand on guard against infection and even cancer, the researchers said.
"So, the same wavelengths of sunlight that are most potent in inducing skin cancer -- UVB -- are also the wavelengths that produce this vitamin D precursor, D3," said Dr. Martin Weinstock, chairman of the skin cancer advisory group at the American Cancer Society. And it's D3 that starts the whole chain of events rolling.
Weinstock stressed that the Stanford study is far from conclusive, however, and should not be seen as an excuse to bake in the sun.
"We know that the sun is the major avoidable cause of skin cancer," he said. "This study is interesting and points to a productive area of research, both to confirm this in other settings and to flesh out the implications of the finding. But does it really relate to skin cancers in real live people? We don't know."
"So, avoiding intense sun, protecting yourself when you're out in intense sun -- that's still our [cancer society] recommendation, and this is not going to change that," said Weinstock, who is also professor of dermatology and community health at Brown University.
Kathleen Egan, a professor of epidemiology at the H. Lee Moffitt Cancer Center and Research Institute in Tampa, Fla., agreed that the study findings are "tantalizing" but need further study.
Especially since the release of Berwick's melanoma study, "there's been an awful lot of questions about how -- or if -- vitamin D has a part to play in potentially offering some [cancer] protection under some circumstances," she said. "But it's very difficult to tease out, because the main human source of vitamin D is, in fact, sunlight exposure, which is also the most important risk factor for melanoma."
Nutritionists have known for decades that sunlight stimulates vitamin D production in the skin. In fact, this natural process is the body's major source of the nutrient. A proper amount of vitamin D is crucial to bone health, "and there's also a bunch of evidence that vitamin D may have a role in preventing colon cancer, although there's still some controversy about that," Weinstock said.
So, how much sunlight is enough to get the ideal amount of vitamin D?
Katharine Tallmadge, a Washington, D.C., dietitian and a spokeswoman for the American Dietetic Association, suggests that most people can probably get the U.S. Department of Agriculture's recommended 400 daily IUs of vitamin D by spending a half-hour to an hour outside per day.
Egan agreed. She said it's not difficult for people to soak up the sun's goodness without boosting their cancer risk. In response to even a moderate amount of sunlight, "the skin actually creates an amazing amount of vitamin D," Egan said. "It doesn't take much exposure to make enough of the vitamin D that's certainly needed to preserve bone health, for example."
By E. J. Mundell, HealthDay News, January 29, 2007
Two are better than one when it comes to performing a skin exam for melanoma and spotting it at the earliest, most treatable stage, a new study shows.
Men and women at high risk of developing melanoma who underwent skin self exam training with their live-in partner were more likely to perform the exams than those who trained solo, Dr. June K. Robinson of Northwestern University Feinberg School of Medicine in Chicago and colleagues found.
People trained to perform a skin self exam also seek treatment at an earlier stage of melanoma and are less likely to die from it, Robinson and her team note in the Archives of Dermatology. They hypothesized that training couples in skin self exams would be even more effective than training individuals because couples may encourage each other to do the exams and help one another perform them.
To investigate, Robinson's group randomly assigned 130 people to undergo a 10-minute training on skin self exams alone or with their live-in partner.
Four months after the training, 45 of the 65 people who underwent solo training had not examined their skin, compared with 23 of the 65 who trained with their partner. Of the paired learners, 19 checked their skin at least once and 13 checked it several times, compared with 9 and 4 of the solo learners, respectively.
The men and women who learned in pairs also were more likely to perceive the exams as important and to feel confident in their ability to perform the exams, the researchers found.
The study "affirms the role of partners in health care and extends it to promoting health behaviors," the researchers conclude.
Reuters Health, January 16, 2007
Older Americans taking shots of human growth hormone in an effort to turn back the clock will likely be disappointed.
As an anti-aging treatment, the hormones appear to offer few benefits but significant health risks, a review of the research finds.
Stanford University researchers came to this conclusion after analyzing 31 studies that included a total of more than 500 relatively healthy elderly people.
The only clearly positive effect found from taking the hormones was a slight improvement in lean body mass.
On the negative side, participants who took human growth hormones were significantly more likely to develop joint swelling and pain, and carpal tunnel syndrome.
There was also a suggestion of an increased risk of diabetes and prediabetes, but that association did not reach statistical significance.
Authors of the review say better studies are needed to understand the risks and benefits of human growth hormone as an anti-aging treatment.
But they say studies do not support the use of human growth hormones for this reason.
"If the benefits truly are minimal, and the risks are not, this is not a therapy that should be used for anti-aging purposes," Hau Liu, MD, MBA, MPH tells WebMD.
Use Growing Among Elderly
Growth hormone is naturally produced in the pituitary gland at the base of the brain, but its levels decline with age.
Promoters of synthetic growth hormone as an anti-aging treatment claim the hormones can do everything from firm sagging skin to boost a sagging libido.
According to government figures, between 25,000 and 30,000 Americans used growth hormones for aging purposes in 2004. That is a tenfold increase in about a decade, Thomas T. Perls, MD, tells WebMD.
"The cost of this treatment can be $12,000 a year or more, but even if you take the cost out of the equation, there is still a huge potential for causing harm," Perls says. "The people promoting this stuff have absolutely no idea what the long-term health effects are."
Because human growth hormone has not been approved for use as an anti-aging treatment by federal regulators, Perls argues that doctors who prescribe it for this purpose are breaking the law.
He first made that charge in a report published in The Journal of the American Medical Association in late 2005.
Perls' report prompted Liu and colleagues to conduct their review of the research on human growth hormone as an anti-aging treatment.
No Fountain of Youth
The researchers limited their review to randomized, controlled clinical trials that included relatively healthy elderly people.
The participants used growth hormone for an average of about six months.
While growth hormone did appear to increase lean body mass and reduce body fat by an average of just over 4 pounds, it did not appear to affect other health measures, including bone density, cholesterol and lipid levels.
"From our review, there's not data to suggest that growth hormone prolongs life, and none of the studies make that claim," Liu says.
Liu tells WebMD he was surprised to find so little research has been done on the use of growth hormones in the elderly population—especially since so many claims have been made about the treatment's benefits.
But he says he understands why people believe the hype.
"Elderly people today are very health conscious and they are trying to do all they can to take care of themselves," Liu says. "But our conclusion is that growth hormone does not represent a magic bullet or the fountain of youth."
By Salynn Boyles, WebMD Medical News, January 16, 2007
"SunSafe in the Middle School Years" was a middle school research project conducted to improve awareness and educate teenagers about the prevention of skin cancer and the need for sun protection. The research study has been published in the January issue of Pediatrics.
The study included a two-year follow-up period, which showed that teens who participated in the program were significantly more likely to use sun protection than teens who had not participated.
The "SunSafe" project involved the cooperation of school staff, recreational sports program coaches, parents and health care professionals. According to the study results, this may be the intervention needed to help teens do a better job of protecting themselves from the sun's harmful ultraviolet rays.
The research took place within 10 communities in Vermont and New Hampshire. Funding was provided by the National Cancer Institute and was directed by pediatrician Ardis Olsen, MD and colleagues.
According to the "Primary Care Practice Manual" produced through the project, reducing sun exposure could prevent 90 percent of the skin cancers that currently strike 1 in 5 Americans.
The middle school years were noted as an especially important time to make sure teenagers receive information about the risks of sun exposure and the precautions against it. The project establishes that the teenage years are influential because this is when children begin to form their own health habits. Shaping those habits in a healthy direction could spare many individuals from skin cancer later in life.
The study also found that only about 30 percent of middle school students protect themselves from the sun, and that 70 percent of the children surveyed had suffered a sunburn during the previous summer.
Statistics indicate that even one blistering sunburn before 20 years of age doubles the risk of getting skin cancer.
Children have three times as much exposure to the sun as adults. Therefore, statistically, the majority of lifetime sun exposure occurs by the age of 18 years.
Health care providers share responsibility with parents, teachers and coaches for discussing sun exposure with teenagers. Yet this study found that only one-third of physicians had spoken to their patients about the subject.
When pediatricians incorporated the "SunSafe" message into their patient visits, the proportion of informed teenagers rose by nearly 10 percent.
The approach of the SunSafe project was not to simply use classroom instruction, but to include poster contests, buttons and other means of promoting the sun-safety message. A medical device was also used to allow children to see skin changes that are not visible to the naked eye in normal light.
According to the researchers, the multipronged public health approach used by SunSafe shows promise for changing adolescent behavior and reducing skin cancer risk.
By Patricia Shehan, All Headline News, January 11, 2007
The U.S. Food and Drug Administration has approved a new treatment for moderate-to-severe facial wrinkles and folds, such as nasolabial folds -- the creases that run from the nose to the corners of the mouth.
BioForm Medical issued a statement describing its Radiesse as a longer-lasting alternative to existing wrinkle fillers. The company said its calcium-based microsphere technology not only fills in facial folds and depressions, but also stimulates the body to produce collagen, the fibrous protein that gives the face its structure and fullness.
The product was also newly approved to improve the appearance of people with HIV, the virus that causes AIDS, who have significant facial fat loss (lipoatrophy), the San Mateo, Calif.-based company said.
Radiesse was first FDA approved in 2002 for use in facial reconstructive surgery.
HealthDay News, December 28, 2006
Giving infants "light therapy" to treat their jaundice may boost their risk of skin moles during childhood, French researchers report.
Some types of moles can raise risks for melanoma skin cancer, the team pointed out.
Jaundice affects between 45% and 60% of healthy newborns and as many as 80% of premature babies, according to background information in the article, which is published in the December Archives of Dermatology.
In this study, researchers at Saint-Antoine Hospital, Paris, examined 58 children, ages 8 or 9, for the presence of melanocytic nevi (moles). Eighteen of the children had phototherapy when they were newborns.
Overall, 37 of the children had moles that were 2 millimeters or larger, with an average of 2.08 moles per child. The children who'd had phototherapy (light therapy) had more moles than the other children (an average of 3.5 vs. 1.45). When the analysis was limited to moles between 2 millimeters and 5 millimeters, the association between phototherapy and moles was stronger, the study said.
Moles smaller than 2 millimeters in diameter "may represent more recent nevi, whereas those nevi due to [an] early event should be larger," the researchers wrote. "Nevi larger than 5 millimeters probably are congenital nevi and are most probably associated with genetic predisposition."
"Higher numbers of acquired benign nevi are associated with increased risk of melanoma," the study authors concluded. "A detailed examination of the factors responsible for the development of nevi in children would be useful to identify high-risk groups to be targeted for prevention. The link between melanoma and phototherapy should be the focus of such a study."
HealthDay News, December 20, 2006
Laura Bush's skin cancer came with a classic symptom, a slow-healing sore.
That made it hard to ignore, a good thing: Remove skin cancer early, and it's easy to cure.
Better still is preventing skin cancer altogether, and the key is protecting yourself—and your children, starting when they're tots—from the sun. Sunburns early in life are considered the most dangerous.
Too few heed that advice. Skin cancer strikes over 1 million Americans annually, and is on the rise.
The toll probably won't drop "until this generation that started using sunscreen in childhood grows up," predicts Dr. Clifford Perlis, a dermatologist at Fox Chase Cancer Center in Philadelphia.
Between 1 million and 1.2 million Americans are diagnosed each year with basal or squamous cell carcinoma, the most common and easy-to-treat skin cancers.
The first lady had a squamous cell carcinoma excised from her right shin in November.
Melanoma is the most lethal skin cancer, and strikes about 62,000 Americans a year. Of the 10,700 skin-cancer deaths annually, almost 8,000 are due to melanoma. Yet if caught before it has spread, even melanoma is survivable.
Most at risk for all skin cancers are people with fair skin, difficulty tanning, or a history of excessive sun exposure. For melanoma, major risk factors include a relative with the disease and having lots of moles.
Specialists urge all adults to examine their skin regularly for suspicious changes, such as a new growth or change in an old one.
Associated Press, December 20, 2006
Findings from a new study confirm that tea extracts applied to the skin promote the repair of damage from radiotherapy, and shed light on the mechanisms involved in the injury...
Scientists have long sought to learn whether and how stress can lead to skin problems. A new study in mice shows that a stress-triggered hormone could worsen or even cause skin disorders like psoriasis and eczema.
The scientists found that blocking hormones called glucocorticoids—which increase in stressful times—resulted in better skin.
Understanding how glucocorticoids work could help scientists come up with ways to prevent human skin problems triggered by psychological stress, said lead researcher Kenneth Feingold of the Veterans Affairs Medical Center, San Francisco and the University of California at San Francisco.
"Here you have things going on in your mind that affect what's going on in your skin," Feingold told LiveScience.
The outermost layer of your skin, the epidermis, is composed of dead skin cells, which form a permeability barrier to prevent water loss. Every day tens of thousands of these dead cells slough off as tiny flakes. Typically, cells at the bottom of the epidermis grow, move to the surface and differentiate into skin cells to replace the lost flakes.
Previous research showed that psychological stress decreases cell growth and inhibits differentiation into skin cells.
In the new study, scientists subjected hairless mice to stress while either blocking the production of glucocorticoids or blocking the action of the hormone. Some mice weren't treated at all. The stress was created by placing the mice in small cages in constant light with a radio playing for 48 hours.
The two groups of mice treated with a glucocorticoid blocker showed much better skin function than the stressed mice that received no treatment.
While the researchers hope the study will lead to a way to treat people who suffer from these skin conditions, there is still a long way to go. Besides the need to test the effect in people, blocking glucocorticoids could have side effects worse than the skin flare-ups it is meant to prevent.
The research is detailed in the December issue of the American Journal of Physiology-Regulatory, Integrative and Comparative Physiology.
By Jeanna Bryner, LiveScience Staff Writer, December 7, 2006