Saturday, January 13, 2018

BRAF MUTATION AS PROGNOSTIC FACTOR IN THYROID CANCER

In a study reported in the Journal of Clinical Oncology, Shen et al found evidence that BRAF V600E mutation status explains the long-recognized increased mortality risk associated with age at diagnosis in patients with papillary thyroid cancer.
Study Details
The study involved data from 2,638 patients (623 men and 2,015 women) from 11 medical centers in 6 countries. Patients had a median age of 46 years at diagnosis (interquartile range = 35–58 years) and median follow-up of 58 months (interquartile range = 26–107 months). A total of 1,094 patients (41%) had BRAF V600E mutation.
Median age was 48 years among patients with BRAF V600E mutation and 44 years among those with wild-type BRAF. Women accounted for 75% and 77% of the two groups, respectively.
Age-Associated Risk
A linear association between patient age and mortality was observed among patients with BRAF V600E mutation but not in those with wild-type BRAF, with the mortality rate remaining low and flat with increasing age in the latter group. The correlation between age and mortality rate was strongly positive in BRAF V600E patients (r = 0.94, P = .002) but not in wild-type BRAF patients (r = 0.41, P = .36). Kaplan-Meier survival curves rapidly declined with increasing age in the BRAF V600E mutation group but not in the wild-type BRAF group, even among patients aged > 75 years.
In analysis adjusting for clinicopathologic characteristics, hazard ratios for mortality for age 50 years, 60 years, 70 years, and 80 years vs age 45 years were 1.47, 4.03, 12.92, and 42.35 among patients with BRAF V600E mutation and 1.56, 1.92, 1.64, and 1.38 among BRAF wild-type patients. Similar results were observed when analysis was limited to patients with the conventional variant of papillary thyroid cancer.
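In statistical terms, the pattern described above corresponds to an age-by-BRAF interaction in a Cox proportional hazards model. The sketch below uses the lifelines library and entirely synthetic data (none of the study's patients, variables, or effect sizes) to show how such an interaction might be examined; it illustrates the general approach, not the investigators' actual analysis.

```python
# Illustrative only: synthetic data, not the study's dataset or effect sizes.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(20, 85, n)
braf = rng.binomial(1, 0.41, n)                 # 1 = BRAF V600E, 0 = wild type

# Assume, for illustration, that increasing age raises the hazard only when BRAF is mutated.
log_hazard = 0.06 * (age - 45) * braf
time = rng.exponential(scale=120 / np.exp(log_hazard))
event = (time < 60).astype(int)                 # administrative censoring at 60 months
time = np.minimum(time, 60)

df = pd.DataFrame({"time": time, "event": event,
                   "age": age - 45,             # centered at the reference age of 45
                   "braf": braf})
df["age_x_braf"] = df["age"] * df["braf"]       # interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])
# A positive, significant age_x_braf coefficient indicates that the age effect on
# mortality is confined to (or much stronger in) the mutation-positive group.
```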
The investigators concluded, “The long-observed age-associated mortality risk in [papillary thyroid cancer] is dependent on BRAF status; age is a strong, continuous, and independent mortality risk factor in patients with BRAF V600E mutation but not in patients with wild-type BRAF. These results question the conventional general use of patient age as a high-risk factor in [papillary thyroid cancer] and call for differentiation between patients with BRAF V600E and wild-type BRAF when applying age to risk stratification and management.”  
The study was supported by grants from the US National Institutes of Health, Polish National Center of Research and Development, Menzies Health Institute, Griffith University, Queensland Cancer Council and Queensland Smart State Fellowship, and others.


Michael Mingzhao Xing, MD, PhD, of The Johns Hopkins University School of Medicine, is the corresponding author for the Journal of Clinical Oncology article. 

DESMOPLASTIC MELANOMAS HIGHLY RESPONSIVE TO IMMUNOTHERAPY

Desmoplastic melanoma is a rare subtype of melanoma that is commonly found on sun-exposed areas, such as the head and neck, and usually seen in older patients. Treatment is difficult, as these tumors are often resistant to chemotherapy and lack actionable mutations commonly found in other types of melanoma that are targeted by specific drugs. However, findings reported by Eroglu et al in Nature showed that patients with desmoplastic melanoma are more responsive to immune-activating anti–programmed cell death protein 1 (PD-1)/programmed cell death ligand 1 (PD-L1) therapies than previously assumed.
Scientists previously believed that the tissue architecture of desmoplastic melanomas would reduce the ability of immune cells to infiltrate the tumor area and limit the effectiveness of immune-activating drugs. However, based on anecdotal reports of favorable responses, a group of researchers hypothesized that patients with desmoplastic melanoma may be more responsive to anti–PD-1/PD-L1 therapies than previously assumed, and explored this in the largest group of immunotherapy-treated desmoplastic melanoma patients studied to date.
Study Details
In a collaborative effort involving 10 U.S. and international cancer centers, including Moffitt Cancer Center and the University of California, Los Angeles, researchers worked to determine the biologic reasons why patients with desmoplastic melanoma may benefit from drugs that target PD-1 or PD-L1. They first confirmed that desmoplastic melanomas have high levels of DNA mutations, consistent with their strong association with ultraviolet light–induced DNA damage from sun exposure. Mutations in NF1 were the most common genetic driver event.
They also demonstrated that desmoplastic melanomas have the preexisting immune cells and proteins necessary to mount an immune response against cancer cells. Researchers compared tissue biopsies from patients with desmoplastic melanoma and nondesmoplastic melanoma. They discovered that desmoplastic melanomas have more cells with high levels of the PD-L1 protein within both the tumor and the invading edges of the tumor. Desmoplastic melanomas also have high levels of CD8 T cells that are critical for immune-activating drugs to be effective.
Study Findings
To test their hypothesis, the researchers analyzed 60 patients with advanced/metastatic desmoplastic melanoma who were previously treated with a drug that targets either PD-1 or PD-L1.
Forty-two of the 60 patients (70%) had a significant response to treatment. Approximately half of these patients had a complete response, and the remainder had a partial response with significant reduction of their tumors. Seventy-four percent of patients were still alive more than 2 years after beginning treatment. This 70% response rate is one of the highest reported for anti–PD-1/PD-L1 therapies to date, and is even higher than response rates commonly observed in patients with other subtypes of melanoma, which are approximately 35% to 40%.
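As a quick check of the arithmetic, 42 of 60 corresponds to the 70% figure cited above. The sketch below adds an exact binomial confidence interval, which is not reported in the article, simply to show the precision a 60-patient cohort allows.

```python
# Quick arithmetic check of the reported response rate (42 of 60 = 70%),
# plus an exact (Clopper-Pearson) 95% CI that is NOT reported in the article
# and is shown only to illustrate the precision of a 60-patient cohort.
from scipy.stats import beta

responders, n = 42, 60
rate = responders / n                                        # 0.70
lower = beta.ppf(0.025, responders, n - responders + 1)      # ~0.57
upper = beta.ppf(0.975, responders + 1, n - responders)      # ~0.81
print(f"response rate = {rate:.0%}, 95% CI {lower:.0%}-{upper:.0%}")
```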
“Our findings challenge the previous school of thought that immunotherapy would offer little benefit to patients with desmoplastic melanoma due to the dense tissue architecture of these tumors. These tumors in fact have the necessary biologic ingredients to be very effective targets for anti–PD-1 drugs,” said first author Zeynep Eroglu, MD, Assistant Member of the Cutaneous Oncology Department at Moffitt. 
“Often, combinations of two immunotherapy drugs are used to treat patients with melanoma to try to improve tumor response rates and survival above current reported rates. However, these combinations can lead to a significantly higher rate of severe side effects than treatment with anti–PD-1 therapy alone. Our data suggest that single-agent anti–PD-1 therapy may well be sufficient for patients with desmoplastic melanoma, potentially sparing them the increased toxicities generally observed with combinations of immunotherapies,” he added.

OLAPARIB APPROVED FOR BRCAm BREAST CANCER

Today, the U.S. Food and Drug Administration (FDA) granted regular approval to olaparib (Lynparza), a poly (ADP-ribose) polymerase (PARP) inhibitor, for the treatment of patients with deleterious or suspected deleterious germline BRCA-mutated, HER2-negative metastatic breast cancer who have previously been treated with chemotherapy in the neoadjuvant, adjuvant, or metastatic setting.
This is the first FDA-approved treatment for patients with germline BRCA-mutated, HER2-negative metastatic breast cancer. Patients with hormone receptor (HR)-positive breast cancer should have been treated with a prior endocrine therapy or be considered inappropriate for endocrine treatment. Patients must be selected for therapy based on the FDA-approved companion diagnostic for olaparib, BRACAnalysis CDx.
“This class of drugs has been used to treat advanced, BRCA-mutated ovarian cancer and has now shown efficacy in treating certain types of BRCA-mutated breast cancer,” said Richard Pazdur, MD, Director of the FDA’s Oncology Center of Excellence and Acting Director of the Office of Hematology and Oncology Products in the FDA’s Center for Drug Evaluation and Research. “This approval demonstrates the current paradigm of developing drugs that target the underlying genetic causes of a cancer, often across cancer types.”
OlympiAD Trial
Approval was based on data from the OlympiAD trial, an open-label, multicenter clinical trial that randomized 302 patients with germline BRCA-mutated, HER2-negative metastatic breast cancer (2:1) to olaparib at 300 mg orally twice daily or physician’s choice of chemotherapy (capecitabine, vinorelbine, or eribulin [Halaven]). All patients had to have a known deleterious or suspected deleterious germline BRCA mutation and must have received prior chemotherapy in the neoadjuvant, adjuvant, or metastatic setting to be randomized. Randomization was stratified by prior use of chemotherapy for metastatic disease; hormone receptor status (HR-positive vs triple-negative); and previous use of platinum-based chemotherapy.
The primary efficacy outcome was progression-free survival assessed by blinded independent central review. Estimated median progression-free survival was 7.0 and 4.2 months in the olaparib and chemotherapy arms, respectively (hazard ratio = 0.58, 95% confidence interval = 0.43–0.80; P = .0009).
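For context on how the confidence interval and P value relate, the standard error of the log hazard ratio can be recovered from the reported 95% CI. The Wald P value below will not exactly match the published .0009, which comes from the trial's own prespecified test; this is only a rough consistency check.

```python
# Rough consistency check: recover the standard error of the log hazard ratio
# from the reported 95% CI, then form a Wald P value. The published P = .0009
# comes from the trial's own prespecified test, so an exact match is not expected.
import math
from scipy.stats import norm

hr, lo, hi = 0.58, 0.43, 0.80
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # ~0.158
z = math.log(hr) / se                             # ~ -3.4
p = 2 * norm.sf(abs(z))                           # ~0.0006
print(f"SE(log HR) = {se:.3f}, Wald P = {p:.4f}")
```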
The most common adverse reactions reported in at least 20% of patients taking olaparib in clinical trials were anemia, nausea, fatigue (including asthenia), vomiting, neutropenia, leukopenia, nasopharyngitis/upper respiratory tract infection/influenza, respiratory tract infection, diarrhea, arthralgia/myalgia, dysgeusia, headache, dyspepsia, decreased appetite, constipation, and stomatitis.
About the BRACAnalysis CDx
The FDA also granted marketing authorization for the BRACAnalysis CDx test for use as an aid in identifying patients with breast cancer with a deleterious or suspected deleterious germline BRCA mutation who may be eligible for olaparib. The effectiveness of the BRACAnalysis CDx test was established based on the OlympiAD trial population, for whom deleterious or suspected deleterious germline BRCA-mutated status was confirmed with prospective or retrospective testing with the BRACAnalysis CDx test.
The recommended dose of olaparib is 300 mg (two 150 mg tablets) taken orally twice daily with or without food. Full prescribing information is available at https://www.accessdata.fda.gov/drugsatfda_docs/label/2018/208558s001lbl.pdf.

INCREASING BREAST CANCER SURVIVAL

Novel treatments and improvements in screening are responsible for the decline in breast cancer mortality from 2000 to 2012, a new study concludes.
The study was published online January 9 in JAMA.
It reports that 37% of the overall reduction in breast cancer mortality in 2012 was due to screening and that the rest was due to treatment, although these percentages vary with different molecular subtypes of breast cancer.
Of the 63% mortality reduction associated with treatment, 31% was attributable to chemotherapy, 27% to hormone therapy, and 4% to treatment with trastuzumab (multiple brands).
"These numbers represent very positive news for breast cancer patients," commented lead author Sylvia K. Plevritis, PhD, Departments of Radiology and Biomedical Data Science, James H. Clark Center, Stanford, California.
"Advances in screening and treatment are saving patients' lives, and this paper quantifies just how much of a difference these advances are making," she said in a statement.
Senior author Jeanne Mandelblatt, MD, MPH, professor of oncology and medicine at Georgetown University School of Medicine, Washington, DC, added: "Newer drugs, particularly ones that are molecularly targeted, are associated with a greater reduction in breast cancer mortality than screening.
"However, screening is still having a significant effect in reducing breast cancer deaths," she added.
Approached for comment, Deborah J. Rhodes, MD, professor of medicine at the Mayo Clinic, Rochester, Minnesota, said that the study is "valuable," particularly inasmuch as the models agreed with each other with respect to general trends.
As such, she believes that the data in the study are important, "not only to assess how far we've come but also to look at opportunities in the future."
However, Dr Rhodes thinks that the article "has the potential to be easily misinterpreted.
"One of the things that I'm struck by in this paper [is], frankly, how little of the gains were attributed to advances in imaging," she said.
A previous study by the same authors found a higher contribution from screening in earlier years. In a study published in 2005, Dr Plevritis and colleagues reported that from 1970 to 2000, breast cancer screening contributed a median of 46% to the overall reduction in breast cancer mortality; developments in adjuvant treatment contributed the rest.
Since 2000, mammography has evolved from film to digital technology, and treatments such as aromatase inhibitors and trastuzumab have been added to the arsenal of adjuvant therapy, so Dr Plevritis and colleagues decided to conduct another study.
"There have been many investments in screening and treatment," Dr Plevritis explained, adding: "We want to know what impact those investments have had in "It also helps us think about the future and how to make sure technologies and drugs that are making the biggest difference are disseminated most widely," she said.

Study Details

For the current study, six independent research teams from across the United States added data covering the years 2000 to 2012 to the original dataset.
Each center developed a Cancer Intervention and Surveillance Modeling Network (CISNET) model that incorporated updated breast cancer incidence estimates, estrogen receptor (ER)/HER2 survival trends, screening use, and treatment patterns, as well as US mortality trends.
The team calculated that for women aged 30 to 79 years, screening and treatment were associated with a reduction in overall breast cancer mortality in 2000 of 37%, against an estimated baseline mortality rate of 64 deaths per 100,000 women.
Of this reduction, 44% was attributable to screening and 56% to treatment.
In 2012, the overall reduction in breast cancer mortality with screening and treatment was 49%, against an estimated baseline mortality rate of 63 deaths per 100,000 women. Of this reduction, 37% was attributable to screening, and 63% to treatment.
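Read literally, those percentages translate into deaths averted per 100,000 women. A rough back-of-the-envelope calculation on the 2012 figures (rounded, and assuming the reported baseline and reduction apply to the same population):

```python
# Back-of-the-envelope reading of the 2012 figures above (deaths per 100,000
# women aged 30 to 79); values are rounded, so totals are approximate.
baseline = 63              # estimated deaths per 100,000 absent screening and treatment
overall_reduction = 0.49   # combined effect of screening plus treatment in 2012

averted = baseline * overall_reduction   # ~31 deaths averted per 100,000
from_screening = averted * 0.37          # ~11 per 100,000
from_treatment = averted * 0.63          # ~19 per 100,000
observed = baseline - averted            # ~32 deaths per 100,000 remaining
print(round(averted, 1), round(from_screening, 1), round(from_treatment, 1), round(observed, 1))
```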

Variation by Breast Cancer Subtype

The relative contribution of screening and treatment to reductions in breast cancer mortality varied by the molecular subtype of the disease.
For women with ER+/HER2- breast cancer, screening contributed 36% of the mortality reduction, vs 64% for treatment.
For women with ER+/HER2+ disease, the figures were 31% and 69%, respectively; for ER-/HER2+ disease, they were 40% and 60%, respectively; and for ER-/HER2- disease, they were 48% and 52%, respectively.
Speaking to Medscape Medical News, Dr Plevritis said that to build on these findings, there is a "need to work towards not only predicting the risk of an individual woman developing breast cancer but also trying to predict the subtype.
"So if we know that some women are at risk for a specific subtype of cancer, we may have a very personalized approach in terms of more or less screening," she said.
"The community has been thinking that this is necessary, and I think this study adds another dimension to that issue by showing the differences by molecular subtype."
Dr Plevritis explained that although there is "a lot of activity" to determine the risk for a given subtype, particularly for women carrying a BRCA mutation, "for the vast majority of the population, we still are trying to determine what is the likely subtype of cancer they may develop if they develop breast cancer."
She emphasized that breast cancer "is, of course, not just one disease; it's a complex set of diseases, and they are really different from each other.
"They respond differently to different treatments, and it seems that we really do need to think about different screening strategies for women, depending on whether or not they are at risk for one subtype vs another."
Dr Plevritis added: "In terms of policy, I think that that's an important direction, that we really need to collect more data in order to understand that patient-specific risk by subtype."
In an interview with Medscape Medical News, Dr Rhodes commented on the fact that screening in recent years appears to be making a smaller contribution to the overall reduction in breast cancer mortality than it had in the past. But it would be misleading to conclude that screening is having less of an impact because of inherent limitations of screening itself, she said; rather, the smaller contribution reflects a continuing reliance on outdated technologies with poor yields, particularly for women with dense breasts.
Dr Rhodes pointed out that as far back as 2010, she said in a TED Talk that, compared with film-based techniques, digital mammography "was a giant leap for imaging companies but a very tiny step for womenkind."

DURABLE COMPLETE RESPONSE AFTER DISCONTINUATION OF IMMUNOTHERAPY

As reported in the Journal of Clinical Oncology by Robert and colleagues, a high proportion of patients with metastatic melanoma achieving complete response on pembrolizumab (Keytruda) in the phase Ib KEYNOTE-001 trial maintained complete response for prolonged durations after treatment discontinuation.
Study Details
The current analysis included 105 (16.0%) patients who achieved a complete response among the 655 patients with ipilimumab (Yervoy)-naive or ipilimumab-treated advanced/metastatic melanoma who received one of three pembrolizumab dose regimens (2 mg/kg every 3 weeks, 10 mg/kg every 3 weeks, or 10 mg/kg every 2 weeks) in the trial. Eligible patients who received pembrolizumab for ≥ 6 months and at least two doses beyond confirmed complete response could discontinue therapy.
In the trial, response was centrally assessed every 12 weeks using RECIST version 1.1. For the current analysis, complete response was defined by investigator assessment using immune-related response criteria.
Prolonged Complete Responses
At data cutoff, with a median follow-up of 43 months, 92 (87.6%) of the 105 patients had an ongoing complete response, with a median follow-up of 30 months from first complete response. The median duration of treatment in these patients was 24 months (range = 1–53 months).
Fourteen patients (13.3%) continued to receive treatment for a median of ≥ 40 months; treatment was discontinued in 91 patients (86.7%), including 67 (63.8%) who were followed by observation without additional anticancer therapy. Among these 67 patients, median time to overall response was 3 months (range = 0.5–11 months); median time to complete response was 13 months (range = 3–36 months); median duration of treatment was 23 months (range = 8–44 months); and median duration of treatment after complete response was 7 months (range = 0.5–41 months).
As of data cutoff, only 7 of 105 patients with a complete response had confirmed progressive disease—2 while receiving initial pembrolizumab treatment, 4 after stopping pembrolizumab and proceeding to observation, and 1 who was reported as still receiving pembrolizumab. All 7 remained alive.
Among all 105 patients who had a complete response, estimated 24-month disease-free survival rate from time of complete response was 90.9% (95% confidence interval [CI] = 82.5%–95.4%). Among the 67 patients who discontinued pembrolizumab after complete response for observation, estimated 24-month disease-free survival from treatment discontinuation was 89.9%. Among all 89 patients who discontinued pembrolizumab after complete response for reasons other than progressive disease, estimated 24-month disease-free survival from treatment discontinuation was 85.8%.
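For readers unfamiliar with how such landmark estimates are produced, the sketch below shows how a 24-month disease-free survival probability is read off a Kaplan-Meier fit that accounts for censoring; the durations are synthetic and the lifelines usage is illustrative, not the trial's actual analysis.

```python
# Synthetic illustration only: how a 24-month disease-free survival estimate is
# read off a Kaplan-Meier curve that accounts for censoring. The durations below
# are invented and do not represent the KEYNOTE-001 patients.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 67                                             # size chosen to mirror the observation cohort
relapse = rng.exponential(scale=250, size=n)       # hypothetical months to relapse
follow_up = rng.uniform(12, 43, size=n)            # hypothetical follow-up per patient

observed = relapse <= follow_up                    # relapse seen before censoring?
duration = np.minimum(relapse, follow_up)

kmf = KaplanMeierFitter()
kmf.fit(duration, event_observed=observed)
print(kmf.survival_function_at_times(24))          # estimated probability of remaining disease free at 24 months
```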
Potential Factors in Response
On univariate analysis among 459 patients with baseline data on tumor programmed cell death ligand 1 (PD-L1) status and tumor size, complete response occurred in 42.7% of those with smaller (1–5 cm) PD-L1–positive tumors (≥ 1% staining in tumor cells and mononuclear inflammatory cells). Patients with larger tumors (5–90 cm) had a lower complete response rate (< 10%), except for those with tumors measuring 5 to 10 cm with positive PD-L1 expression, who had a complete response rate of 20.5%; this rate was similar to the rate of 20.7% among those with smaller tumors (1–5 cm) who were PD-L1–negative.
The investigators concluded: “Patients with metastatic melanoma can have durable complete remission after discontinuation of pembrolizumab, and the low incidence of relapse after median follow-up of approximately 2 years from discontinuation provides hope for a cure for some patients. The mechanisms underlying durable [complete response] require further investigation.”
The study was supported by Merck & Co.


Caroline Robert, MD, PhD, of Institut Gustave Roussy and Paris Sud University, is the corresponding author for the Journal of Clinical Oncology article.

TREATING CANCER CACHEXIA

Currently, there is no licensed treatment and no standard of care for cancer cachexia. Given that cachexia impairs the delivery of anti-cancer therapy (through increased side-effects, treatment delays and dose reductions),[1] causes marked distress to patients and their families, and is associated with reduced survival, there remains an urgency to progress the research agenda in cancer cachexia.[2]
Currow et al. are to be commended for rising to this challenge and herein they present long-term safety and efficacy data on the ghrelin agonist, anamorelin.[3] Preceding this trial, two large double-blind, placebo-controlled randomised trials (ROMANA 1 and 2) have reported that anamorelin improves lean body mass and symptoms of cancer cachexia.[4] The present safety extension study (ROMANA 3) also reinforces the fact that anamorelin is well tolerated with prolonged use (24 weeks) and continues to improve body weight. Therefore, in the search for the holy grail that is a treatment of cancer cachexia, the present study and the accompanying body of work examining anamorelin are encouraging.
Discussions with the regulators are ongoing in relation to a licence for anamorelin, based on the fact that the co-primary end points were not both met; indeed, whilst there was an improvement in lean body mass, the accompanying end point assessing function (hand grip strength) was not met.[4] Independent of this, however, the programme of work examining anamorelin, through early and late phase trials, will evolve with a focus on body weight, appetite and quality of life.
This challenge of developing a treatment of cancer cachexia is likely to continue to deter major pharma from entering this arena. Indeed, in the last three decades, the failure to develop treatments for cancer cachexia is in part due to the lack of engagement from pharmaceutical companies. Like the developers of anamorelin, there have been others who admirably have developed treatments for cancer cachexia that have not been translated into the clinical setting, for reasons which include clinical end points not being met.
For example, the selective androgen receptor modulator enobosarm underwent clinical trials and reported beneficial effects on lean body mass but failed to show any consistent benefit on functional outcomes (using stair climb power or hand grip strength).[5] Whilst as cachexia researchers we can press regulators for appropriate guidance and consensus on optimal end points for cachexia trials, experts in this field have to provide the most informed guidance about measures of function that are both clinically meaningful and valid in patients with advanced cancer.[6] Researchers also need to be mindful that changes in muscle mass will not necessarily equate to changes in function. Measures of function such as hand grip strength or stair-climb power may not be optimal end points in cachexia trials, as they can be affected by numerous other cancer- and non-cancer-related factors rather than loss of muscle and/or muscle function alone. The critical area of clinical end points requires more thought. It may be that a more patient-centred approach to muscle function, with a focus on Activities of Daily Living rather than hand grip strength, would be a more clinically useful end point. Similarly, a realistic measure of physical activity could be captured in clinical trials using a research grade accelerometer, such as the Actigraph.
It also follows that without robust characterisation of the clinical phenotype in cancer cachexia studies, including, as a minimum, stratification for the systemic inflammatory response (SIR), results will be challenging to interpret. Indeed, in the ROMANA trials, the median time from diagnosis to trial entry was ~8 months. In the population studied (stage III or IV non-small cell lung cancer), such patients would be in the late stages of their illness and as such may have cachexia which is refractory to treatment. To optimise the treatment of cancer cachexia, any interventions should be initiated as early as possible to maximise any benefits and attenuate the development of cachexia.
We must also be mindful that it is challenging to conduct clinical trials in cancer cachexia when debate continues regarding the definition and classification.[7,8] Recently, Vanhoutte et al. have further advocated the role of the SIR in the genesis of cachexia.[9] This is congruent with evidence of the association of the SIR with quality of life in advanced cancer and its relationship to physical function.[10] In the design of future trials in cachexia, the SIR should be a key stratification factor and may identify subgroups where treatments may be more effective.[11]
There are, however, grounds for optimism, and the study by Currow et al. demonstrates progress in researching a treatment of cachexia. The complex pathophysiology of cancer cachexia will almost certainly require therapy that targets its multiple mechanisms, an approach termed multimodal therapy.[12,13] To treat cancer cachexia optimally, it could be argued that a foundation of optimal cachexia care is needed using multimodal therapy which targets the SIR, reduced food intake and decreased physical function. A phase II trial examining such an approach has recently been conducted and the encouraging findings mean that a phase III trial is underway.[14] It could be argued that novel agents such as anamorelin may have maximal benefits when used in combination with a background of optimal cachexia care. This may provide the best platform for novel therapies to reach their true potential.

ELECTROMAGNETIC FIELDS FOR GLIOBLASTOMA

Final results of a phase III trial reported by Stupp and colleagues in JAMA indicate that adding antimitotic treatment with tumor-treating fields to maintenance temozolomide is associated with improved progression-free and overall survival in patients with glioblastoma who had completed initial chemoradiotherapy. The tumor-treating fields modality interferes with glioblastoma cell division and organelle assembly by delivery of low-intensity alternating electric fields to the tumor.
Study Details
In this open-label trial, 695 patients from 83 sites in North America, Europe, the Republic of Korea, and Israel were randomized 2:1 between July 2009 and July 2014 to receive tumor-treating fields plus maintenance temozolomide (n = 466) or temozolomide alone (n = 229). Patients had to have undergone resection or biopsy, in addition to having completed chemoradiotherapy.
Tumor-treating fields treatment consisted of low-intensity, 200-kHz frequency, alternating electric fields delivered ≥ 18 h/d via a portable device connected to transducer arrays on the shaved scalp. Temozolomide treatment consisted of 150 to 200 mg/m2 for 5 days in 28-day cycles for 6 to 12 cycles.
The primary endpoint was progression-free survival in the intent-to-treat population. Patients had a median age of 56 years, 68% were male, 89% were white, and 49% were from the United States.
Survival Outcomes
A preplanned interim analysis reported in 2015 including the first 315 randomized patients showed improved progression-free and overall survival in the tumor-treating fields group. For the current report, patients were followed through December 2016. The median duration of tumor-treating fields treatment was 8.2 months.
Median follow-up was 40 months and minimum follow-up was 24 months. Median progression-free survival was 6.7 months in the tumor-treating fields group vs 4.0 months in the temozolomide-alone group (hazard ratio [HR] = 0.63, P < .001). Median overall survival was 20.9 vs 16.0 months (HR = 0.63, P < .001) in the two groups, respectively. In exploratory analysis, overall survival was 43% vs 31% at 2 years, 26% vs 16% at 3 years, and 13% vs 5% at 5 years.
Adverse Events
Grade 3 or 4 systemic adverse events occurred in 48% of patients in the tumor-treating fields group and 44% of the temozolomide-alone group, with the most common being nervous system disorders (24% vs 20%, seizure in 6% vs 6%), and blood and lymphatic system disorders (13% vs 11%, thrombocytopenia in 9% vs 5%). Mild to moderate skin toxicity underneath the transducer arrays occurred in 52% of the tumor-treating fields group.
The investigators concluded, “In the final analysis of this randomized clinical trial of patients with glioblastoma who had received standard radiochemotherapy, the addition of [tumor-treating fields] to maintenance temozolomide chemotherapy vs maintenance temozolomide alone, resulted in statistically significant improvement in progression-free survival and overall survival. These results are consistent with the previous interim analysis.”

NEOADJUVANT CHEMOTHERAPY FOR ADVANCED OVARIAN CANCER

The adoption of neoadjuvant chemotherapy is associated with reduced mortality in women with advanced epithelial ovarian cancer, researchers report.
"There are gynecologic oncologists who believe that the increasing use of neoadjuvant chemotherapy in the United States may be harming patients,” Dr. Alexander Melamed from Massachusetts General Hospital, Boston, told Reuters Health by email. “Our study suggests a decline in the use of primary debulking surgery in favor of neoadjuvant chemotherapy resulted in improvement in both perioperative and long-term survival.”
“In two randomized trials, neoadjuvant chemotherapy followed by surgery was found to be noninferior in terms of long term survival compared with upfront surgery, but not superior,” he explained. “Furthermore, prior observational studies showed that patients selected to receive neoadjuvant chemotherapy have shorter survival than those selected to undergo primary surgery, although such studies cannot adequately account for baseline differences between patients due to strong selection.”
In the current quasi-experimental study, Dr. Melamed and colleagues used the U.S. National Cancer Database to assess whether greater utilization of neoadjuvant chemotherapy (NACT) is associated with better survival in advanced epithelial ovarian cancer.
They compared outcomes between two groups: women in New England and the “east south central” region (two areas where NACT use increased by 27% between 2011 and 2012) and control women in the south Atlantic, west north central, and east north central regions (where NACT use remained unchanged from 2011 to 2012).
The two groups did not differ significantly in age, race/ethnicity, stage, histologic type, grade, or comorbidities.
In the two regions with rapid adoption of NACT after 2011, all-cause mortality was 19% lower in 2012 than in 2011, whereas mortality did not change significantly in the other regions, according to the January 3 BMJ online report.
Similarly, Kaplan-Meier survival curves showed superior survival in New England and the east south central region after the abrupt increase in NACT use, whereas survival remained unchanged in the control regions.
Between 2011 and 2012, the proportion of women who died within 30 days after surgery declined from 3.1% to 1.8% in the rapidly adopting regions, whereas the proportion did not change significantly in the control regions.
During the same interval, 90-day postoperative mortality declined to a significantly greater extent in the rapidly adopting regions than in the control regions.
Moreover, the proportion of women who did not receive surgery and chemotherapy declined from 20.0% in 2011 to 17.4% in 2012 in the rapidly adopting regions, compared with a slight uptick from 19.0% to 19.5% in the control regions.
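The regional comparison described above is, in essence, a difference-in-differences contrast. A minimal sketch of that logic, using only the treatment-receipt percentages quoted in the preceding paragraph (the paper's mortality estimates come from its own regression models, which are not reproduced here):

```python
# Difference-in-differences on the quoted treatment-receipt figures: change in the
# rapidly adopting regions minus change in the control regions.
adopting_2011, adopting_2012 = 20.0, 17.4   # % who did not receive surgery and chemotherapy
control_2011, control_2012 = 19.0, 19.5     # % in the control regions

did = (adopting_2012 - adopting_2011) - (control_2012 - control_2011)
print(f"difference-in-differences: {did:.1f} percentage points")   # about -3.1
```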
“Physicians should be reassured that using neoadjuvant chemotherapy for women with advanced ovarian cancer is unlikely to harm long-term outcomes,” Dr. Melamed said. “I hope that this finding will encourage providers to choose neoadjuvant chemotherapy for ovarian cancer patients who are unlikely to benefit from upfront surgery, like elderly patients, and those for whom resection of all visible disease is either not feasible or would require a very extensive surgical procedure.”
Dr. Alon Altman from the University of Manitoba, in Canada, who recently reviewed the selection criteria of neoadjuvant chemotherapy patients, told Reuters Health by email, "Patients that have advanced disease that cannot be adequately debulked to microscopic disease, or are too sick to undergo surgery, likely do better with NACT followed by interval surgery. Those patients that are healthy, and the surgeon believes that they can be optimally debulked, likely still have better outcomes from primary surgery. The balance is always morbidity and mortality.”
“The correct balance of NACT versus primary surgery is a difficult point to determine and is influenced by surgeon comfort, patient health/comorbidities, distance traveled, and wait times (especially in publicly funded systems),” he said. “For example, if you have to wait for 4 weeks, 6 weeks, 8 weeks for surgery, is NACT a better upfront option? Not sure we know the answer to this question, yet.”
“I think the two camps of oncologists will argue about these results,” Dr. Altman concluded. “The pro-NACT camp will support the study and say that there is good RCT (data) and now retrospective data that shows it is safe. The anti-NACT camp will likely say that there still exists a difference in surgical ‘aggression’ and this difference may be present between these two centers.”
SOURCE: http://bit.ly/2AycqFI
BMJ 2018.

NIGHT SHIFTS INCREASE BREAST CANCER RISK

A meta-analysis of international data confirms a positive association between long-term night shift work and an increased overall risk for cancer in women, particularly breast cancer.
In North America and Europe, working the night shift was associated with a 32% increased risk for breast cancer overall (odds ratio [OR], 1.316), the authors report.
But the risk was even higher in one specific group: Night nurses were found to have a "remarkable" 58% increased risk (OR, 1.577) for breast cancer.
Breast cancer risk was also elevated in a dose-response manner, consistent with earlier studies. For every 5 years a woman spent working nights, breast cancer risk increased by 3.3% (OR, 1.033), the study authors say.
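If the per-5-year figure is extrapolated log-linearly in duration (an assumption made here for illustration only; the meta-analysis reports just the per-5-year estimate), the implied odds ratio compounds multiplicatively with years worked:

```python
# Compounding the per-5-year odds ratio of 1.033 under an assumed log-linear
# dose-response; these extrapolations are illustrative, not reported results.
per_5yr_or = 1.033

for years in (5, 10, 20, 30):
    implied = per_5yr_or ** (years / 5)
    print(f"{years:>2} years of night shifts -> implied OR ~ {implied:.2f}")
# 10 years ~1.07, 20 years ~1.14, 30 years ~1.21 under this assumption.
```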
The review, published online January 8 in Cancer Epidemiology, Biomarkers & Prevention, was led by Xuelei Ma, PhD, from the West China Medical Center of Sichuan University, Chengdu.
"By systematically integrating a multitude of previous data, we found that night shift work was positively associated with several common cancers in women," said Dr Ma in a statement. "Given the expanding prevalence of shift work worldwide and the heavy public burden of cancers, we initiated this study to draw public attention to this issue so that more large cohort studies will be conducted to confirm these associations."
More research is needed to understand the mechanisms behind this association and to better protect women working night shifts against increased cancer risk, Dr Ma told Medscape Medical News. 
"Breast cancer is the most diagnosed cancer among women worldwide, with higher incidence in developed regions," he said. "These results might help establish and implement effective measures to protect female night shifters. Long-term night shift workers should have regular physical examinations and cancer screenings."

Demands of the Modern World 

The productivity demands of the modern world call for increasing numbers of employees in the food production, entertainment, healthcare, and transportation industries to work across time zones, the investigators note.
Large numbers of people are being exposed to night shift work, which brings [a] huge detrimental impact on health. Ma et al
The third European Union survey showed that as early as 2000, 76% of employees regularly worked beyond normal working hours, the study authors point out. Up to 21.9% of men and 10.7% of women said they were exposed to shift work, and 7% permanently worked a night shift.
A 2004 European survey revealed that regularly working overtime was the most common form of "flexible" working hours and that this was linked to negative effects on stress, sleep, and social and mental health.
In 2012, a ground-breaking Danish study of female military workers reported by Medscape Medical News showed that the risks to health of working night shifts were greater in women who considered themselves "morning persons" than in women who considered themselves "night owls."

Details of the Meta-analysis 

The meta-analysis conducted by Dr Ma and colleagues looked at the association between long-term night shift work and the risk for 11 types of cancer using data from 26 cohort studies, 24 case-control studies, and 11 nested case-control studies. The 61 studies, updated to October 2016, included nearly 4 million women with cancers of the breast, lung, skin, and digestive and reproductive systems.
The overall risk for cancer increased by 19% among women working night shifts, compared with those who did not, in North America, Europe, Australia, and Asia. In addition to the increased risk for breast cancer already noted above, the risk for skin cancer also increased by 41% and risk for gastrointestinal cancer by 18% in women who worked night shifts.  
Significant heterogeneity was observed in the breast cancer (I² = 80.4%; P < .001), skin cancer (I² = 64.7%; P = .009), and uterine cancer (I² = 59.6%; P = .042) groups. There was no evidence of heterogeneity in the other cancer groups.
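For reference, I² is derived from Cochran's Q statistic. The Q and study-count values in the sketch below are illustrative, since they are not reported in this summary:

```python
# I^2 = max(0, (Q - df) / Q) x 100, where Q is Cochran's Q and df is the number of
# studies minus 1. The inputs below are illustrative; they are not reported values.
def i_squared(q: float, n_studies: int) -> float:
    df = n_studies - 1
    return max(0.0, (q - df) / q) * 100.0

# For example, Q ~ 102 across 21 studies gives I^2 ~ 80%, i.e. most of the observed
# variation reflects between-study differences rather than chance.
print(round(i_squared(102.0, 21), 1))
```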
The association between working the night shift and increased breast cancer risk in women was seen only in those living in North America and Europe, a finding that Dr Ma said surprised the investigators. "It is possible that women in these locations have higher sex hormone levels, which have been positively associated with hormone-related cancers such as breast cancer," he suggested.
"The positive relationship between endogenous hormone levels and risk of breast cancer supports therapeutic strategies that target estrogen signaling," Dr Ma added, noting that in women with estrogen receptor–positive breast cancer, the clinical benefits of adjuvant endocrine therapy "are well established."

Nurses Appear Most Vulnerable 

Of all the professions occupied by women, nursing appeared to be the most vulnerable to the carcinogenic effects of regular night shift work. A secondary analysis carried out by Dr Ma and colleagues looked at long-term night shift work in female nurses and the risk for six types of cancer. It showed that night nursing was also associated with a 35% increased risk for gastrointestinal cancer (OR, 1.350) and a 28% increased risk for lung cancer (OR, 1.280).
A nonsignificant effect was observed for ovarian cancer (OR, 1.135), and no effect was seen for cervical cancer (OR, 0.980), the investigators report.
"Nurses who worked the night shift were of a medical background and may have been more likely to undergo screening examinations," Dr Ma noted. "Another possible explanation for the increased cancer risk in this population may relate to the job requirements of night shift nursing, such as more intensive shifts."
Working at night leads to disruption in circadian rhythm and suppression of nocturnal melatonin secretion, the study authors note. Over the short term, this results in what's commonly referred to as "jet lag." Symptoms include sleep disorders, digestive troubles, fatigue, emotional fluctuation, and reduced physical activity.
Over the long term, however, research shows that circadian rhythm disruption and nocturnal melatonin suppression function as carcinogens that increase tumor incidence. Prolonged circadian disturbance has also been associated with increased risk for cardiovascular disease, as well as neuropsychiatric and endocrine system disorders.
An impact on urinary melatonin could play a role in the sex hormone increases thought to be associated with hormone-dependent cancers, they note. Although night shift work was strongly associated with higher risk for breast cancer in women, this review found no such effect for other cancers in women that are hormone dependent, such as ovarian and uterine cancer, the researchers say.
Lifestyle changes, such as irregular eating hours, reduced physical activity, and work-related stress, may also contribute to this increased cancer risk, Dr Ma told Medscape Medical News.
When asked whether smoking could be a contributing factor, he noted that previous studies that adjusted for smoking "reported that ever-smokers counted a larger percentage among cases compared to noncases."
However, the current analysis was limited by a "lack of consistency regarding confounding factors," Dr Ma pointed out. In addition, "no stratified analysis was performed on smoking, thus no clear association was identified between smoking and cancer risk among night shift nurses."
Other study limitations include the lack of a consistent definition for night shift work across studies, as well as substantial between-study heterogeneity that could weaken the association between working at night and cancer risk, Dr Ma and colleagues say.
The study authors have disclosed no relevant financial relationships.
Cancer Epidemiol Biomarkers Prev. Published online January 8, 2018. Abstract

GASTRIC CANCER PROGNOSTIC TOOL

The Yonsei University Gastric Cancer Prediction Model accurately predicts 5-year overall survival in U.S. gastric cancer patients, according to a validation study.
An international collaborative group developed this prediction model using data from Korean gastric cancer patients and validated it using data sets from East Asia, but the model had not yet been tested in an equivalent Western population.
In the current study, Dr. Woo Jin Hyung from Yonsei University College of Medicine, Seoul, South Korea, and colleagues used data from the Surveillance, Epidemiology, and End Results (SEER) program of the U.S. National Cancer Institute to assess the new model’s accuracy and to compare it with the TNM (tumor-node-metastasis) staging system prognostic index.
The researchers identified 15,483 patients with gastric adenocarcinoma. Their ages ranged from 20 to 101 years (mean, 65.5 years), on average 9.6 years older than the Korean cohort. The findings were reported online December 22 in the Journal of the American College of Surgeons.
Compared with Korean patients, fewer U.S. patients had stage 1 disease (8.2% vs. 47.1%) and more U.S. patients had M0 disease with lymph node dissections that were inadequate for proper staging (52.1% vs. 2.4%).
Despite these and other differences, the predictive accuracy of the Yonsei University Gastric Cancer Prediction Model was 76.2% in this population, significantly better than that of the TNM staging system 7th edition (70.4%) and consistent with the findings of the original study.
“While more accurate for survival than the TNM staging system and more inclusive than other nomograms, [the] Yonsei University gastric cancer prediction tool’s limitation is that, much like the other clinical nomograms, these tools only account for patients who have undergone surgery,” the researchers caution. “Additionally, [the] majority of the gastric cancer patients outside of South Korea and Japan present with advanced gastric cancer [and] will require multimodality treatments, and the impact of chemotherapy, radiation therapy, or targeted therapy will need to be accounted for to improve the accuracy of future prediction models.”
“A globally applicable prediction tool which can provide accurate gastric cancer prognosis remains paramount in the care of gastric cancer patients worldwide,” they conclude.
Dr. Hyung did not respond to a request for comment.
SOURCE: http://bit.ly/2m5GQeb

ONCOTYPE DX NOT SO COST EFFECTIVE

In the real world, as opposed to the controlled environment of clinical trials, use of the Oncotype DX (Genomic Health) test may not be as cost-effective as was thought. The  gene expression profile (GEP) test is used in patients with early breast cancer to help make decisions about further treatment, in particular whether chemotherapy can be avoided.  
A  new study, which looked at use in a community "real-world setting," found that the likely cost-effectiveness ratio for Oncotype DX testing was higher than the ratios for the most commonly accepted diagnostic and preventive interventions.
The study was published January 8 in the Journal of Clinical Oncology.
Under ideal conditions, such as clinical trial settings, this test has been found to be cost-effective, but when it is used in a less ideal, real-world setting, the cost-effectiveness is different, explained senior author, Jeanne S. Mandelblatt, MD, MPH, professor in the departments of oncology and medicine at Georgetown University School of Medicine and a member of Georgetown Lombardi Comprehensive Cancer Center, Washington, DC.
"The point of our study was that it is important to understand the actual use of new technology, and if it's going to have the intended effectiveness," Dr Mandelblatt told Medscape Medical News.
"When analyzing data about a treatment in a clinical trial, you get a particular magnitude that is sufficiently significant without undue toxicity, and it receives FDA [US Food and Drug Administration] approval," she said. "But then when it's used in the general population, you don't see the same effect. This is because it may be used differently than in the trial, or the patients in the trial were healthier and more select than what is seen in general community practice."
"This is an issue," she noted. "Clinical trials are the best source of evidence that we have, but then the next step — and a really critical one — is to assess the true population impact."
Dr Mandelblatt explained that the actual impact seen in their trial was different than in past analyses. "It's not that they were wrong and we are right," she emphasized. "These trials showed the maximum possible impact at the most reasonable cost, but the actual implementation in the real world was different than what was assessed in those analyses. They were correct for what their purpose was, but then you need to follow through and see how it is actually implemented to understand the real cost-effectiveness."

Variance in Testing Patterns

GEP testing is now considered standard of care for supporting treatment decision making for patients with early-stage, node-negative, estrogen receptor–positive, human epidermal growth factor receptor 2–negative cancers. Previous analyses of GEP have assessed hypothetical cohorts under ideal conditions and concluded that testing had low costs relative to its benefits.
For their study, Dr Mandelblatt and colleagues developed a model that  would reflect actual use in the community.
Their simulation model compared 25-year incremental costs and quality-adjusted life-years (QALYs) for Oncotype DX use in the community from 2005 to 2012 with costs and QALYs of usual care in the time period before testing (2000 to 2004).
They ran 100 million simulations to reduce Monte Carlo error when estimating costs and effects, using national data, published research, and electronic records from Kaiser Permanente Northern California that linked registry data, treatment, and GEP testing.
About a quarter (24.2%) of all patients received GEP testing, and 30% received chemotherapy. The patients who underwent testing were younger than those not tested (mean age, 56.2 years vs 60.7 years) and were more likely to have stage I than stage II disease.
Patients who underwent testing and who were younger than age 50 years had lower chemotherapy rates than patients in the same age group who were not tested (53.0% vs 63.6%).
In contrast, older patients who were tested had higher rates of chemotherapy compared with the untested cohort (age 50 to 64 years: 36.5% vs 30.8%; age ≥ 65 years: 17.6% vs 8.2%).
These testing patterns showed that a greater proportion of tested than untested patients who were destined to have distant recurrences received chemotherapy (55.3% vs 30.4%).

Preference-Driven Decisions

The overall cost-effectiveness ratio for testing vs usual care was $188,125 per QALY, but under more ideal conditions, it was $39,496 per QALY, which was closer to earlier estimates.
In another scenario, if the Oncotype DX costs were decreased from current Medicare reimbursement rates of $3416 to $2657, the ratio of community practice vs usual care decreased to $71,250 per QALY.
Finally, if the test properties of the assay improved, the incremental cost-effectiveness ratio decreased to $28,947 per QALY.
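These figures are incremental cost-effectiveness ratios: incremental cost divided by incremental quality-adjusted life-years gained. The sketch below illustrates the arithmetic with placeholder inputs, since the model's underlying cost and QALY totals are not given in this summary; the $4,700 and 0.025 values are hypothetical and chosen only to reproduce the order of magnitude of the community-practice ratio.

```python
# ICER = incremental cost / incremental QALYs. The inputs below are hypothetical
# placeholders; the study's underlying cost and QALY totals are not given here.
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Cost per QALY gained for one strategy relative to another."""
    return delta_cost / delta_qaly

print(f"${icer(4700, 0.025):,.0f} per QALY")   # hypothetical inputs: ~$188,000 per QALY
```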
The accuracy of the test is important for cost-effectiveness, but the analysis also assumes that patients will act on the test's recommendations, explained Dr Mandelblatt. "If they are low risk they may forgo chemotherapy while those with high-risk results are recommended to receive chemotherapy, and that makes the cost-effectiveness more favorable."
But in the real-world setting, these are very much preference-driven decisions. "The earlier analyses assumed that everyone at low risk would forgo chemotherapy and everyone at high risk would receive it," she said. "In clinical practice, the reality is that some women at low risk are still worried about recurrence and want the chemotherapy, while others at high risk refuse it."
The test is also not perfect in predicting whose disease will recur; there are women in the low-risk group who will have distant metastases and those at high risk who will never experience a recurrence.
Dr Mandelblatt also pointed out that they selected Oncotype DX for their analysis because it is the most commonly used GEP test in the United States and was used in prior economic analyses.
Another test, MammaPrint (Agendia), is also available in the United States and was cleared for use by the FDA in 2007. The American Society of Clinical Oncology recently outlined  specific recommendations for its use. But Dr Mandelblatt noted that it is more commonly used in Europe.
"The results for MammaPrint could be more or less similar, but that would depend on if it is more accurate than Oncotype DX," she said. "I haven't evaluated it, but it may be prone to the same issues that we found while assessing Oncotype DX."
The study was supported by grants from the National Cancer Institute, an American Cancer Society Mentored Research Scholar Grant, a Lombardi Comprehensive Cancer Center American Cancer Society Young Investigator Award, a Cancer Prevention Research Fellowship sponsored by the American Society of Preventive Oncology, and the Breast Cancer Research Foundation.  The authors have disclosed no  relevant financial relationships.
J Clin Oncol. Published January 8, 2018. Abstract

5FU CREAM FOR SKIN CANCER PREVENTION

In a study conducted in veterans who had a history of basal cell carcinoma (BCC) or squamous cell carcinoma (SCC), patients who used a cream containing 5% fluorouracil (5-FU) (multiple brands) on their face and ears twice daily for 2 to 4 weeks subsequently underwent substantially less surgery for SCC than a similar group of patients who used a control cream.
The need for surgery was reduced by 75% over the year after treatment, researchers report in a study published online January 3 in JAMA Dermatology.
Topical 5-FU has been on the market for many years. It is used in the treatment of actinic keratosis, which is a precursor of SCC, explained lead author Martin Weinstock, MD, professor of dermatology, Warren Alpert Medical School of Brown University, Providence, Rhode Island, in an interview posted on JAMA Dermatology along with the study.
"We wanted to see whether it might be effective for preventing keratinocyte carcinoma, be it basal or squamous cell carcinoma," added Dr Weinstock, who is also chief of dermatology at the Providence Veterans Affairs Medical Center in Providence.
"We found that patients who were randomized to receive the 5-FU cream were substantially less likely to develop a new SCC on the area to which the cream was applied, namely, the face and ears, over the first year," he added.
"So this strongly suggests that 5-FU may have a role as an active chemopreventive agent in clinical practice in patients at very high risk for SCC," Dr Weinstock observed.

Details of the Study

The Veterans Affairs Keratinocyte Carcinoma Chemoprevention (VAKCC) trial randomly allocated 932 participants (median age, 70 years) to receive either the 5-FU cream or a control cream. Both creams were applied twice a day to the patients' face and ears for 2 to 4 weeks.
All participants had a history of at least three keratinocyte carcinomas during the past 5 years. Almost all the patients were white men.
The population was specifically chosen because veterans are exposed to a substantial amount of sunlight during and after their military service and are thus likely to have a history of keratinocyte carcinomas or are at high risk of developing them.
Indeed, during the 5 years before enrolling in the study, 93% of participants had had at least one BCC; 44% had developed at least three BCCs; 39% had developed at least one invasive SCC; and 18% had developed at least one SCC in situ.
All participants also received a sunscreen of sun protection factor 30 and were counseled on its use.
"Study end points were surgically treated BCC or SCC on the face and ears," the authors note.
Patients were followed for 4 years; the median follow-up was 2.8 years.
During the full 4-year study, "there was no difference between treatment groups in time to first keratinocyte, basal cell, or squamous cell carcinoma," the researchers report.
On the other hand, during the first year following short-term use of the 5-FU cream, only 1% of participants in the active treatment group developed an SCC, compared to 4% in the control group.
This translated into a 75% reduction in the need to treat SCCs surgically between the two treatment groups (P = .002), the researchers note.
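For readers following the arithmetic, the 75% figure is the relative risk reduction implied by the 1% vs 4% one-year rates quoted above; a minimal check:

```python
# Relative risk and relative risk reduction implied by the 1% vs 4% one-year SCC rates.
risk_treated, risk_control = 0.01, 0.04
relative_risk = risk_treated / risk_control          # 0.25
relative_risk_reduction = 1 - relative_risk          # 0.75, i.e. 75%
print(relative_risk, relative_risk_reduction)
```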
For BCC, the 5-FU cream reduced future risk by 11% during the first year, but this was not statistically significant relative to controls, the authors point out.
The investigators also assessed the risk of participants requiring treatment with Mohs surgery for either BCC or SCC during the study period. They point out that Mohs surgery is the most resource-intensive and therefore expensive of the treatments used for BCC and SCC and was the most common modality used for the treatment of keratinocyte carcinomas among study participants.
Using Mohs surgery as an end point, the investigators documented a 49% reduction in the number of participants who received Mohs surgery for BCCs and keratinocyte carcinomas in the 5-FU group compared with controls, at least during the first year following application of the active cream.
Table. Keratinocyte Carcinomas (KCs) Treated With Mohs Surgery Over the Trial Period

                                                         5-FU    Control Cream    Relative Risk
KCs Treated With Mohs Surgery
   Year 1                                                  20          41               0.72
   Overall study period                                   153         149               1.07
Participants With ≥ 1 KC Treated With Mohs Surgery
   Year 1                                                  16          31               0.51
   Overall study period                                   101          92               1.09
BCCs Treated With Mohs Surgery
   Year 1                                                  17          36               0.56
   Overall study period                                   120         118               1.04
Participants With ≥ 1 BCC Treated With Mohs Surgery
   Year 1                                                  14          27               0.51
   Overall study period                                    87          79               1.09
The 5-FU cream is associated with some adverse effects (AEs), including erythema, which occurred in more than 90% of the participants in the active treatment group.
Twenty-one percent of participants in the active treatment arm also rated the AEs they experienced as "severe" 6 months after starting treatment; 40% of them rated the AEs as "moderate."
In contrast, more than three quarters of control patients reported no AEs at the 6-month assessment point.
On the other hand, 87% of veterans who received the 5-FU cream indicated they would be willing to treat themselves again if the cream was proven to reduce future skin cancers.
Indeed, the investigators suggest that given the waning effectiveness of the cream in protecting patients against future SCCs after 1 year, it would be "reasonable" to consider repeat treatment every year to reduce the need for Mohs surgery for recurrent SCCs in high-risk populations.
"Squamous cell carcinomas are very, very common, so if we could take a big chunk out of the incidence of SCC, it would be a major public health benefit," Dr Weinstock concluded.
Asked by Medscape Medical News to comment on the study, David Leffell, MD, David Paige Smith Professor of Dermatology and Surgery, Yale School of Medicine, New Haven, Connecticut, pointed out that it is well known that 5-FU reduces the incidence of actinic keratoses, which are considered precursors to SCC, so "it is not surprising that there would be no effect on basal cell cancer," he said in an email.
"The real purpose of the study was to see if the use of a relatively inexpensive topical agent could reduce the number of cases of SCC that require Mohs surgery," Dr Leffell emphasized.
The finding that the cream reduced the need for Mohs surgery may be slightly confounded by the fact that the decision to use Mohs surgery to treat SCC always depends on the judgment of the referring physicians, so a reduction in this end point may not be as cut and dried as it might first appear, Dr Leffell suggested.
"In practice, repeat courses of 5-FU are needed to manage people with multiple actinic keratoses," Dr Leffell pointed out. He indicated that repeat courses of 5-FU may be needed to manage SCC as well.
"However, if the study findings are real, this approach could become a standard in the population at risk," Dr Leffell said.
The study was supported in part by the Cooperative Studies Program of the Office of Research and Development, US Department of Veterans Affairs. Dr Weinstock is employed by Brown Dermatology, Inc (Brown University Department of Dermatology's faculty practice) and has served as a consultant to AbbVie, Castle, and Celgene. 
JAMA Dermatol. Published online January 3, 2018. Full text