Dr Linda Calabresi

GP; Medical Editor, Healthed
Dr Linda Calabresi is an Australian-based health professional. She is trained as a GP (general practitioner) and practises in North Ryde and Artarmon.

Clinical Articles

The physical health of mentally ill patients is a "massive problem and we are doing very badly at it," psychiatrist Dr Matthew Warden told doctors at a recent Healthed evening seminar in Sydney. In particular, the prevalence of high cardiovascular risk among patients with a history of psychosis makes this population a "ticking time bomb", said Dr Warden, who is the Director of Acute Inpatient Services for Mental Health at St Vincent's Hospital in Melbourne.

Even without antipsychotic medication, a disproportionate number of people with a history of psychosis are overweight or obese, do very little if any physical exercise, and smoke. And it is well known that the metabolic side-effects associated with antipsychotic medications increase this cardiovascular risk enormously. Consequently, there has been growing pressure on psychiatrists to assess, monitor and manage the physical health of their patients with psychosis. But, Dr Warden said, realistically this also needs to be done by GPs, as they will usually be managing these patients long-term and "they are better at it."

Baseline metabolic measurements need to be taken at the first episode of psychosis, including weight, BMI, blood pressure, lipid levels, fasting blood sugar and smoking status. Weight in particular needs to be monitored carefully after antipsychotic medication is commenced, as weight gain is extremely common, especially with olanzapine, which is the most commonly prescribed antipsychotic Australia-wide. In answer to a GP's question following his talk, Dr Warden said it is extremely difficult to avoid or reverse this medication-induced weight gain with diet and exercise alone. In addition, weight loss pharmacotherapy such as phentermine is contraindicated in people with a history of psychosis.

Key to managing weight gain is to choose an antipsychotic with the least long-term side effects from the outset. Olanzapine and clozapine are associated with the greatest weight gain, while lurasidone and the partial agonists aripiprazole and ziprasidone have the least effect on weight. Alternatively, for patients who have been started on olanzapine or similar, swap to a more weight-neutral medication at the first sign they are gaining weight or developing other metabolic side-effects. A person who has gained weight on olanzapine is more likely to lose that weight if switched to a weight-neutral medication early. The longer that patient stays on olanzapine and the weight gain is sustained, the harder it will be to shift even if the medication is changed, Dr Warden said.

In addition to managing weight gain in mentally ill patients, Dr Warden also encouraged GPs to offer smoking cessation advice and help. Even though this population is often considered among the most dependent and heaviest smokers, his own research had found a significant number of patients could successfully quit, or at least cut down, given the right advice and assistance. While most smoking cessation pharmacotherapies can be used, Dr Warden suggested that varenicline (Champix) was probably best avoided in these patients.

At St Vincent's Hospital in Melbourne, patients receiving antipsychotic therapy have their metabolic markers assessed at admission and at regular intervals after that, including measurement of their serum prolactin.
"Hyperprolactinaemia is a significant problem and should be monitored every six months. If it is elevated or increasing, particularly if there are symptoms, then either reduce the dose, change antipsychotic, or add in low-dose aripiprazole, which will lower prolactin levels," Dr Warden explained.

Dr Matthew Warden spoke on the "Management of Metabolic Dysregulation in Patients on Antipsychotics" at the Healthed Mental Health in General Practice Evening Seminar held in Sydney in June 2018.
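The baseline work-up described above lends itself to a simple structured record. The following is a minimal sketch in Python, assuming hypothetical class and field names (they are not drawn from any clinical software), with BMI derived from the recorded height and weight.

```python
from dataclasses import dataclass

@dataclass
class BaselineMetabolicScreen:
    """Baseline measurements suggested at a first episode of psychosis.

    Field names are illustrative only.
    """
    weight_kg: float
    height_m: float
    systolic_bp: int
    diastolic_bp: int
    total_cholesterol_mmol_l: float
    fasting_glucose_mmol_l: float
    smoker: bool

    @property
    def bmi(self) -> float:
        # BMI = weight (kg) divided by height (m) squared
        return round(self.weight_kg / (self.height_m ** 2), 1)

# Example: record a baseline screen and derive the BMI for ongoing comparison
screen = BaselineMetabolicScreen(
    weight_kg=92.0, height_m=1.75, systolic_bp=135, diastolic_bp=85,
    total_cholesterol_mmol_l=5.8, fasting_glucose_mmol_l=5.4, smoker=True,
)
print(screen.bmi)  # 30.0
```

Repeating the same measurements after an antipsychotic is started makes any early weight gain or lipid change easy to spot, which is the point Dr Warden makes about switching early.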

Clinical Articles

Why are Australians having rehabilitation as inpatients after their total knee replacements, rather than as outpatients, at a rate higher than any other country in the world? And why are our rates of inpatient rehabilitation, as opposed to community- or home-based rehab, increasing? That's what researchers were investigating in a study just published in the MJA. Could it be that inpatient rehab was associated with better health outcomes for the patient than the other options? Or were patients too complex, living too far away or needing greater supervision to have their rehab off-site?

As it turns out, the reason inpatient rehabilitation rates are increasing has much more to do with private hospitals being able to access funding than with any patient factors.

According to the study authors, more than 50,000 total knee replacement operations were performed in Australia in 2016, about 70% of which took place in a private hospital. In that year, 45% of patients underwent inpatient rehabilitation following surgery, a substantial increase from the 31% who had the same inpatient service back in 2009. This bucks an international trend. "Inpatient rehabilitation rates in the United States decreased from a peak of 35% in 2003 to 11% in 2009, with a mean rate during 2009-2014 of 15%," the researchers said.

Randomised controlled trials have failed to show that the functional improvements achieved through inpatient rehabilitation are superior to those achieved with home- or community-based rehabilitation. The cost, however, is significantly greater: a recent analysis including almost 260 privately insured patients at 12 Australian hospitals put the cost differential at an average of $9500.

And even though the mean age of patients undergoing inpatient rehab was slightly higher than that of those who did not (71.0 vs 67.3 years), and they were more likely to have comorbidities and live alone, the study authors said these differences didn't explain the wide variation in admission rates from hospital to hospital. "Patients in hospitals with high rates of inpatient rehabilitation were similar to those in hospitals with low rates, eliminating patient complexity as the reason," they said.

It seems the greatest determinant of whether a person had inpatient rehabilitation was the hospital in which the total knee replacement took place. "This factor was substantially more important than the clinical profile of the patient," the study authors said. They suggested some private hospitals were encouraging inpatient rehabilitation because they were able to access per-day funding for the rehab, in addition to the payment received for the knee surgery. The study authors concede it is an attractive business model, but while these hospitals may be offering excellent rehab in terms of services and facilities, it all comes at a cost "that, for many patients, is not justified by better outcomes."

They suggest the proportion of patients receiving inpatient rehabilitation after a total knee replacement could be reduced, improving health care efficiency without harming health outcomes. "Reducing low value care will require system-level changes to guidelines and incentives for hospitals, as hospital-related factors are the major driver of variation in inpatient rehabilitation practices," they concluded.

Reference: Schilling C, Keating C, Barker A, Wilson SF, Petrie D. Predictors of inpatient rehabilitation after total knee replacement: an analysis of private hospital claims data. Med J Aust. 2018 Aug 27; 209(5): 222-7. Available from: https://www.mja.com.au/journal/2018/209/5/predictors-inpatient-rehabilitation-after-total-knee-replacement-analysis doi: 10.5694/mja17.01231

Clinical Articles

GPs may have to correct some patients' misunderstanding following reports in the general media suggesting that testing for high-risk cancer genes is now available to everyone free of charge. Writing in the latest issue of the MJA, Australian genetics experts say that testing for specific high-risk genetic mutations, especially BRCA1 and BRCA2, has been available to appropriate patients free of charge (but not Medicare-rebated) through genetic specialists in public clinics for over 20 years. What's new is that these tests now attract a Medicare rebate and you don't have to be a genetic specialist to order them, but they are still only available to selected patients.

"Testing is appropriate when there is at least a 10% chance of identifying a gene mutation responsible for the personal or family history of cancer," the authors of the article wrote. There are a number of algorithms available to help clinicians calculate whether the likelihood of having one of these cancer-causing genetic mutations is at least 10%. Usually testing is initially considered for patients who have been diagnosed with either breast or ovarian cancer and who, because of their young age and/or strong family history, are considered highly likely to have a genetic mutation that explains their condition. The new item numbers (73295, 73296 and 73297) cover testing for heritable germline mutations in seven genes, including BRCA1 and BRCA2. If such a mutation is found, then at-risk adult relatives will be justified in also accessing testing.

However, as the article authors point out, there are limitations with this type of genetic testing. Firstly, most breast and ovarian cancers occur in people without an identifiable underlying genetic variant. "Only 5% of female breast cancers, 15% of invasive epithelial ovarian cancers and up to 14% of male breast cancers are related to BRCA1 or BRCA2 mutations, thus, most patients with breast cancer do not need, nor will they benefit from, a genetic test," they said.

That's not to say the absence of BRCA1 or BRCA2, or one of the other rarer high-risk mutations currently tested for, excludes the possibility that the patient has inherited a predisposition to the cancer. Families that appear to have a high prevalence of these types of cancer may indeed have an inherited genetic mutation; it is just that, because of the limitations of technology and knowledge, it is yet to be isolated. What's more, the sensitivity of the current testing methods means that a number of incidental genetic mutations may be noted, but the significance of these is as yet unknown.

It is critical that when testing is requested for a relative of an affected patient, the laboratory is informed of the exact genetic variation found in the original affected patient, to ensure pathologists distinguish between the disease-causing mutation and variants of undetermined significance. The authors also suggest confining testing to only the most likely variant or variants rather than requesting testing for mutations in multiple genes. "[T]he testing of multiple genes may uncover unclassified variants, variants outside the usual clinical context, variants unrelated to the current cancer, or unexpected important variants for which the patient has not been well prepared," they said. They also suggest education and counselling be given to patients considering this genetic testing, and written consent obtained.

The new Medicare item numbers represent a major step forward in terms of genetic and genomic testing becoming mainstream but, as the incorrect media headlines demonstrate, this transition is going to require information and education. Clinicians who order these tests are likely to benefit from establishing close ties with genetic services and specialists to ensure best and appropriate practice in this ever-expanding area of medicine.

Reference: Med J Aust. 2018; 209(5): 193-196. doi: 10.5694/mja17.01124

Clinical Articles

Adolescent boys who struggle to understand how basic machines work, and young girls who have difficulty remembering words, are at increased risk of developing dementia when they're older, new research has found. According to the longitudinal study published in The Journal of the American Medical Association, lower mechanical reasoning in adolescence in boys was associated with a 17% higher risk of having dementia at age 70. In girls, it was lower memory for words in adolescence that increased the odds of developing the degenerative disease.

It has been known for some time that the smarter you are throughout life, even as a child, the less likely you are to develop dementia. Not a guarantee of protection, just a general trend. It has to do with cognitive reserve, the US researchers explain. "Based on the cognitive reserve hypothesis, high levels of cognitive functioning and reserve accumulated throughout the life course may protect against brain pathology and clinical manifestations of dementia," they wrote. This theory has been supported by a number of studies, such as the Scottish Mental Health Survey, which showed that lower mental ability at age 11 increased the risk of dementia down the track.

But what had been less well defined was whether there were any particular aspects of intelligence in young people that were better predictors of (or protectors against) dementia than others. This study goes some way to addressing that. Researchers were able to link sociobehavioural data collected from high school students back in 1960 with Medicare claims data over 50 years later that identified those people who had been diagnosed with Alzheimer's disease and related disorders. Interestingly, poor adolescent performance in other areas of intelligence, such as mathematics and visualisation, was also associated with dementia, but not nearly to the extent of mechanical reasoning and word memory.

So why is this so? The study authors say there are a few possible explanations. Maybe the poor performance in adolescence reflected poor brain development earlier in life, a risk factor for dementia. Or maybe these adolescents are more susceptible to neuropathology as they get older? Or maybe they are the adolescents who adopt poor health behaviours such as smoking and doing little exercise? "Regardless of mechanism, our findings emphasise that early-life risk stretches across the life course," they said.

And what can be done about it? That's the million-dollar question. The researchers say the hope is that if we know the at-risk group, we can get aggressive with preventive management early. "Efforts to promote cognitive reserve-building experiences and positive health behaviours throughout the life course may prevent or delay clinical symptoms of Alzheimer's disease and related disorder."

An accompanying editorial takes this concept a little further. Dr Tom Russ, a Scottish psychiatrist, says interventional research has identified a number of factors that can potentially influence cognitive reserve. These include modifiable health factors, education, social support, positive affect, stimulating activities and/or novel experiences, and cognitive training. As Dr Russ says, you can't necessarily change all of these risk factors, and even the ones you can change may become less modifiable later in life. But as this study demonstrates, you may be able to work on a person's cognitive reserve at different stages throughout their life to ultimately lower their risk of dementia.

References
  1. Huang AR, Strombotne KL, Horner EM, Lapham SJ, Adolescent Cognitive Aptitudes and Later-in-Life Alzheimer Disease and Related Disorders. JAMA Network Open [Internet]. 2018 Sep; 1(5): e181726. Available from: https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2701735 doi:10.1001/jamanetworkopen.2018.1726.
  2. Russ TC, Intelligence, Cognitive Reserve, and Dementia: Time for Intervention? JAMA Network Open [Internet]. 2018 Sep; 1(5): e181724. Available from: https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2701735 doi:10.1001/jamanetworkopen.2018.1724.

Clinical Articles

The value of omega-3 fatty acids has come under fire lately. But now a new systematic review suggests they might have benefits beyond the previous therapeutic targets of depression, cardiac health, eye health and arthritis. Researchers have found that omega-3 polyunsaturated fatty acids (PUFAs) might reduce the symptoms of clinical anxiety, particularly among people with a specific clinical condition, be it medical (such as Parkinson disease) or psychological (such as premenstrual syndrome).

"This systematic review…provides the first meta-analytic evidence, to our knowledge, that omega-3 PUFA treatment may be associated with anxiety reduction, which might not only be due to a potential placebo effect, but also from some associations of treatment with reduced anxiety symptoms," the review authors said in JAMA.

The finding is likely to be welcome news for patients with this condition. Be it the potential side-effects of medications or the cost and accessibility of psychological therapy, patients with anxiety, especially those with comorbid medical conditions, are keen for alternative, or at least supplementary, safe, evidence-based treatments for their symptoms.

Previous research, in both human and animal studies, had found that a lack of omega-3 PUFAs could induce various behavioural and neuropsychiatric disorders. What had not been shown was whether taking this supplement was effective in reducing specific anxiety symptoms. The review involved an extensive literature search through a wide range of databases, including PubMed and Cochrane, looking for trials that had assessed the anxiolytic effects of these fatty acids in humans. In the end, 19 trials matched the eligibility criteria, which allowed researchers to analyse the effect of supplementation in just over 1200 participants and compare it with about 1000 matched controls who didn't take the fatty acids.

Overall, they found "there was a significantly better association of treatment with reduced anxiety symptoms in patients receiving omega-3 PUFA treatment than in those not receiving it." Subgroup analysis also showed that those taking at least 2000mg of the omega-3 PUFA treatment were more likely to have reduced anxiety. And somewhat surprisingly, those patients receiving supplements containing less than 60% EPA did better than those taking formulations with a greater concentration of EPA.

The studies in the review included very different cohorts, and because of this and the limited number of studies included, the authors understandably say the results need to be interpreted with caution. However, while bigger, better studies are still needed to prove the benefit of omega-3 PUFAs in patients with clinical anxiety, this research certainly does suggest that higher-dose formulations with less than 60% EPA might have a role as at least adjunctive treatment to standard therapy.

Reference: Su KP, Tseng PT, Lin PY, Okubo R, Chen TY, Chen YW, et al. Association of Use of Omega-3 Polyunsaturated Fatty Acids With Changes in Severity of Anxiety Symptoms: A Systematic Review and Meta-analysis. JAMA Network Open [Internet]. 2018 Sep; 1(5): e182327. Available from: https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2701735 doi: 10.1001/jamanetworkopen.2018.2327
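As a rough illustration of the subgroup findings above, the sketch below flags whether a given regimen falls into the groups the meta-analysis associated with reduced anxiety (a dose of at least 2000mg of omega-3 PUFA and a formulation containing less than 60% EPA). The function and parameter names are assumptions for the example; this is not a dosing tool.

```python
def matches_benefit_subgroups(dose_mg: float, epa_fraction: float) -> dict:
    """Check a regimen against the two subgroup findings reported in the review.

    dose_mg: omega-3 PUFA dose in milligrams.
    epa_fraction: proportion of the formulation that is EPA (0.0 to 1.0).
    """
    return {
        "dose_at_least_2000mg": dose_mg >= 2000,
        "epa_below_60_percent": epa_fraction < 0.60,
    }

# Example: a 2400mg formulation that is 50% EPA meets both subgroup criteria
print(matches_benefit_subgroups(2400, 0.50))
# {'dose_at_least_2000mg': True, 'epa_below_60_percent': True}
```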

Clinical Articles

GPs can make a significant difference in curbing the rising rates of liver cancer deaths in Australia, experts say. According to an analysis of over 270 newly diagnosed cases of hepatocellular cancer presenting at seven Melbourne tertiary hospitals over one year, researchers say survival rates could be improved with earlier diagnosis of cirrhosis and better adherence to recommended screening schedules among those known to be at high risk.

"[T]he number of liver cancer-related deaths has been the most rapid for any cancer type in Australia over the past 40 years," the study authors said in the MJA. And of all the types of liver cancer, hepatocellular carcinoma is by far the most common, accounting for 82%. Even though treatments, both curative and palliative, are available, survival remains very poor, with the Australian 12-month survival rate estimated to be only 62%. In this particular study, conducted over 2012-2013, the mean survival was only 18 months.

As one would expect, the patients who did better, and who generally survived the longest, were those whose tumours were detected at an earlier stage. These were usually the patients who were known to be at high risk of developing liver cancer and were participating in a surveillance program. But this was only 40% of the 272 cases, even though 89% would have qualified for surveillance based on their risk factors.

Why was this? Well, firstly, many of these people did not know they were at risk. And that's where GPs fit in. In the study, the most common risk factors for liver cancer were found to be hepatitis C infection (41%), alcohol-related liver disease (39%), hepatitis B infection (22%) and non-alcoholic fatty liver disease (14%). Many had more than one risk factor. Most telling was the finding that, even though the vast majority of patients (83%) had cirrhosis when they were diagnosed with hepatocellular carcinoma, for one third of them that was the first they knew of it.

The study authors suggest clinicians need to be alert for risk factors for chronic liver disease such as excess alcohol use, chronic HCV and HBV infections and even non-alcoholic fatty liver disease in certain groups. In these people, checking for cirrhosis is likely to be worthwhile. "An aspartate transaminase to platelet ratio index (APRI) value greater than 1.0 predicts cirrhosis with 76% sensitivity and 72% specificity, and the test is simple to undertake," the researchers said.

The other major barrier to the earlier detection of liver cancer identified in the study was poor adherence to surveillance among those people identified as being at high risk. Researchers found patients with alcohol-related liver disease or decompensated liver disease were the least likely to get regular monitoring. A surveillance program for this particular cancer involves a six-monthly liver ultrasound and serum alpha-fetoprotein assessment. The study authors are advocating a national hepatocellular cancer surveillance program for those at high risk of developing the disease, which would include all patients with cirrhosis, Asian men over 40, women over 50, Africans over 20 years of age, and patients with a family history of [hepatocellular carcinoma] without cirrhosis but with chronic HBV infections.

A national program to screen for hepatocellular carcinoma amongst this particular group would be worthwhile, the researchers said, as the incidence of the cancer is high, the screening is non-invasive and inexpensive and, perhaps most importantly, early detection has been shown to improve survival. However, until such a national program is developed, researchers are encouraging GPs to ensure that their at-risk patients are enrolled in a surveillance program in order to improve their health outcomes.

Reference: Hong TP, Gow PJ, Fink M, Dev A, Roberts SK, Nicoll A, et al. Surveillance improves survival of patients with hepatocellular carcinoma: a prospective population-based study. Med J Aust [Internet]. 2018 Sep 24; 209(8): 1-7. Available from: https://www.mja.com.au/journal/2018/209/8/surveillance-improves-survival-patients-hepatocellular-carcinoma-prospective doi: 10.5694/mja18.00373
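The APRI threshold quoted above is straightforward to calculate. Below is a minimal sketch using the standard APRI formula (AST divided by the laboratory's upper limit of normal for AST, divided by the platelet count in 10^9/L, multiplied by 100); the variable names and the example values are illustrative only.

```python
def apri(ast_iu_l: float, ast_upper_limit_normal: float, platelets_10e9_l: float) -> float:
    """Aspartate transaminase to platelet ratio index (APRI).

    APRI = ((AST / upper limit of normal for AST) / platelet count (10^9/L)) * 100
    A value greater than 1.0 was cited as predicting cirrhosis with
    76% sensitivity and 72% specificity.
    """
    return round((ast_iu_l / ast_upper_limit_normal) / platelets_10e9_l * 100, 2)

# Illustrative values: AST 80 IU/L with an upper limit of normal of 40 IU/L
print(apri(80, 40, 150))  # 1.33 -> above the 1.0 threshold
print(apri(30, 40, 250))  # 0.3  -> below the threshold
```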

Clinical Articles

Vaccination in immunosuppressed adult patients has many facets and can be challenging for GPs who don't deal with these cases regularly. But there are a few key considerations that can help guide clinicians, says Associate Professor Michael Woodward, Melbourne-based geriatrician, writer, researcher and passionate advocate for health promotion.

Firstly, not all immunosuppression is equal. It is important to ascertain the degree of immunosuppression, as some people may be unnecessarily denied vaccines because they are taking medication that suppresses the immune system only at higher doses or in different formulations. "For instance, someone who is on inhaled corticosteroids for their asthma or on low dose (less than 20mg) prednisolone daily for just a few weeks is not significantly immunosuppressed and can be vaccinated in the same way as other people," said Professor Woodward in an interview following his presentation at Healthed's recent Annual Women's and Children's Health Update in Perth.

However, those on higher doses of steroids or on steroids longer term, as well as people who have conditions associated with immunosuppression such as haematological malignancy, do need special consideration when it comes to vaccination. Most importantly, live vaccines are not to be given to this group. This includes the new herpes zoster vaccine (Zostavax), which is absolutely contraindicated in severely immunocompromised patients. The consequences of inadvertently administering this vaccine to an immunosuppressed patient hit the headlines some months ago, highlighting the importance of this guideline.

The other question often asked is whether patients who are known to be immunosuppressed, and therefore at greater risk of significant infections, actually need more or stronger doses of the vaccines they are able to have. In some cases that is a very real and worthwhile consideration if you want to achieve the objective of immunoprotection, Professor Woodward said. For example, you might consider giving an immunosuppressed patient the conjugate pneumococcal vaccine (Prevenar 13) as opposed to the polysaccharide pneumococcal vaccine (Pneumovax 23). "The conjugate vaccine is generally slightly more likely to produce an immune response [than the polysaccharide vaccine]," he said.

The other scenario where GPs might need to consider vaccination in association with immunosuppression is in patients who are scheduled for an elective splenectomy. The lack of a spleen is known to be associated with a reduction in the body's ability to respond to a vaccine, so it is currently recommended that people who are about to undergo a splenectomy have the influenza, pneumococcal and the newer zoster vaccines. In addition, they should be vaccinated against Haemophilus influenzae type b and receive the two meningococcal vaccines currently available. All of these are detailed as part of the pre-splenectomy recommendations on the spleen.org.au website, with the exception of the zoster vaccine, as the guidelines have yet to be updated. However, Professor Woodward says most health professionals in this area are advocating the inclusion of the zoster vaccine. Some of these vaccinations may also be given shortly after the removal of the spleen in cases where the splenectomy has been urgent, but this is generally not the remit of the GP.

In general, the question of vaccination in the immunosuppressed patient can be complicated. It is a highly specialised area, and Professor Woodward suggested that, if in doubt, GPs might want to seek input from a specialist in this area, such as an immunologist or a rheumatologist.
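The central decision Professor Woodward describes, whether a patient's degree of immunosuppression rules out live vaccines such as Zostavax, can be sketched as a simple rule. The snippet below is a teaching illustration only, using the steroid thresholds quoted in the interview; the function names are assumptions for the example, and it is no substitute for formal guidelines or specialist advice.

```python
def significantly_immunosuppressed(
    immunosuppressive_condition: bool,      # e.g. haematological malignancy
    high_dose_or_long_term_steroids: bool,  # as opposed to inhaled corticosteroids or
                                            # <20mg prednisolone daily for a few weeks
) -> bool:
    """Rough triage of immunosuppression status for vaccination planning.

    Mirrors the examples quoted above: inhaled corticosteroids or a short course
    of low-dose prednisolone are not treated as significant immunosuppression,
    whereas higher-dose or long-term steroids, or conditions such as
    haematological malignancy, are.
    """
    return immunosuppressive_condition or high_dose_or_long_term_steroids

def live_vaccines_permitted(is_significantly_immunosuppressed: bool) -> bool:
    # Live vaccines, including the live zoster vaccine (Zostavax), should not be
    # given to significantly immunosuppressed patients.
    return not is_significantly_immunosuppressed

# Example: long-term, high-dose steroids -> the live zoster vaccine is contraindicated
print(live_vaccines_permitted(significantly_immunosuppressed(False, True)))  # False
```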

Clinical Articles

Which bacteria are colonising your gut is becoming increasingly important, Australian researchers say. More and more evidence suggests the gut microbiota has a significant role in both the cause and the cure of a wide range of gastrointestinal and hepatic diseases and conditions. According to a review in The Medical Journal of Australia, research shows that particular types of bacteria colonising the gut have been associated with the development of inflammatory bowel disease (IBD), metabolic syndrome, non-alcoholic fatty liver disease (NAFLD), non-alcoholic steatohepatitis (NASH), obesity and diabetes.

We know that the bowel starts to be colonised by bacteria in utero. The make-up of an individual's gut microbiota then depends on factors such as mode of delivery, breastfeeding, diet, illness and exposure to antibiotics. By the age of three, the gut microbiota resembles that of an adult.

Observational studies have suggested a strong association between alterations in the microbiota caused by environmental factors and an increased risk of disease. For instance, IBD was very rare in traditional Chinese populations, but studies have shown exposure to Western diets and medicines from a young age has increased the prevalence of this disease. "Asian adults who migrate from countries of low prevalence to countries of high prevalence do not have an increased risk of developing IBD, but their children experience the IBD incidence of their new country of residence," the review authors said.

Stronger evidence comes from studies into the exact nature of the bacteria colonising the gut. It appears both the specific bacteria and the diversity of bacteria are important in disease pathogenesis. "For example, the presence of Proteus species at the time of Crohn's disease resection is associated with early disease recurrence, while the presence of Faecalibacterium prausnitzii is protective against recurrence," they said. Researchers have also found significant differences in the microbiota of people who develop severe alcoholic hepatitis and those who maintain normal hepatic function despite drinking the same amount of alcohol.

Most important in the investigation of the role of the gut microbiota is the emerging evidence that by altering the bacterial colonies in the gut we can alter the course of disease. The classic example of this, of course, was the discovery of Helicobacter pylori as the cause of peptic ulcer disease, with treatment dramatically changing health outcomes. But since then much of the research focus has been on faecal microbiota transplants (FMT, or poo transplants as they are commonly known). Mouse studies have shown that a mouse will become fat if given a transplant of faecal bacteria from a fat donor mouse. In humans, a single trial of FMT from lean donors to obese recipients lowered triglyceride levels and increased insulin sensitivity. FMT has also been shown to be an effective therapy for recurrent C. difficile infection and in active ulcerative colitis.

And while the application of FMT as a treatment continues to be explored, investigators are also looking at how the microbiota can be changed through dietary means and how this can be used therapeutically. While probiotics have not been shown to be effective in the majority of inflammatory diseases, an anti-inflammatory diet combined with liquid formulated enteral nutrition has shown some success in Crohn's disease.

In short, the review authors suggest that the current interest in the gut microbiome is justified and has the potential to provide important therapeutic options in the future. "Microbial manipulation is an effective therapy, likely to have broadening implications," they concluded.

Reference: White LS, Van den Bogaerde J, Kamm M. The gut microbiota: cause and cure of gut diseases. Med J Aust. 2018 Oct 1; 209(7): 312-16. Available from: https://www.mja.com.au/journal/2018/209/7/gut-microbiota-cause-and-cure-gut-diseases doi: 10.5694/mja17.01067

Clinical Articles

It wasn't that long ago that vitamin D appeared to be the panacea for everything from preventing MS to reducing the risk of diabetes. But the one area where we thought the benefit of this vitamin was not up for debate was bone health. It has been proven that lack of vitamin D causes rickets. It has been proven that vitamin D is important in bone metabolism and turnover. And it has been proven that people with low bone density are more likely to experience fractures. Therefore, add vitamin D and improve bones, right? Wrong!

The latest meta-analysis, of more than 80 randomised controlled trials, shows that vitamin D supplementation does not prevent fractures or falls, and does not have any consistently clinically relevant effect on bone mineral density. This comes as a bit of a surprise, to say the least. According to the systematic review, vitamin D had no effect on total fractures, hip fractures or falls among the 53,000 participants in the pooled analysis. And it didn't matter whether higher or lower doses of vitamin D were used, the New Zealand researchers reported in The Lancet Diabetes & Endocrinology.

In looking for a reason for the lack of an effect from supplementation, previous explanations, such as the baseline 25OHD of trial participants being too high, the supplement dose being too low, or the trials being done in the wrong populations, just don't hold water. The sheer number and variety of trials included in this meta-analysis means all of these possible confounders have been accounted for. "The trials we included have a broad range of study designs and populations, but there are consistently neutral results for all endpoints, including the surrogate endpoint of bone mineral density," they said.

Consequently, the researchers said future trials were unlikely to alter these conclusions. "There is little justification to use vitamin D supplements to maintain or improve musculoskeletal health," they stated. And while they acknowledge the clear exception to this is the prevention or treatment of rickets and osteomalacia, in general clinical guidelines should not be recommending vitamin D supplementation for bone health.

The conclusion appears quite emphatic and definitive, and it is supported in an accompanying commentary by a leading US endocrinologist. "The authors should be complimented on an important updated analysis on musculoskeletal health," said Dr Chris Gallagher from Creighton University Medical Centre, Omaha, in the US. But he suggests many vitamin D supporters will still be flying the flag for supplementation, pointing to the multiple potential non-bony benefits. "Within three years, we might have that answer because there are approximately 100,000 participants currently enrolled in randomised, placebo-controlled trials of vitamin D supplementation," he said. "I look forward to those studies giving us the last word on vitamin D."

References

Bolland MJ, Grey A, Avenell A. Effects of vitamin D supplementation on musculoskeletal health: a systematic review, meta-analysis, and trial sequential analysis. Lancet Diabetes Endocrinol. 2018 Oct 4 [epub ahead of print]. Available from: http://dx.doi.org/10.1016/S2213-8587(18)30265-1
Gallagher JC. Vitamin D and bone density, fractures, and falls: the end of the story? Lancet Diabetes Endocrinol. 2018 Oct 4 [epub ahead of print]. Available from: http://dx.doi.org/10.1016/S2213-8587(18)30269-9

Clinical Articles

Is chronic pancreatitis being underdiagnosed? And does it matter? The answer to both these questions is yes, according to Dr Darren Pavey, gastroenterologist and senior lecturer at the University of NSW. Speaking at the Healthed General Practice Education seminar in Sydney recently, Dr Pavey said there was good international research suggesting that many cases of chronic pancreatitis were going undiagnosed and that the condition was far more prevalent than previously recognised. Overseas studies including cohorts of randomly selected adult patients suggest a prevalence of between 6% and 12%, with the condition being more likely among patients with recent-onset type 2 diabetes, those with excess alcohol intake, smokers, and those over 40 years of age, he said.

And in response to the question of whether it is important to diagnose this condition, Dr Pavey said chronic pancreatitis not only causes immediate symptoms, usually including pain, diarrhoea and weight loss, but commonly has longer-term consequences such as pancreatic exocrine insufficiency (where there is less than 10% pancreatic function) and an increased risk of diabetes, malnutrition and even pancreatic cancer. Certainly an incentive to diagnose and treat earlier rather than later.

Part of the challenge in recognising the condition is that the classic triad of symptoms (abdominal pain, diarrhoea and weight loss) is common to a variety of medical conditions, including IBD and IBS. What's more, abdominal pain, which many doctors would have thought had to be present with pancreatitis, does not always occur in chronic pancreatitis, especially when it is idiopathic, which is the more common variety. In fact, pain is only present in about half the cases of idiopathic chronic pancreatitis. Idiopathic pancreatitis constitutes 55% of all cases, the other 45% being alcohol-related. Abdominal pain tends to be a more consistent feature of alcoholic chronic pancreatitis.

So if you have a patient in the right age group (about 40 to 60 years) who has chronic diarrhoea, weight loss and maybe abdominal pain, and you suspect they might have chronic pancreatitis, what do you do? The most common screening test for chronic pancreatitis is now a faecal elastase-1 stool test, requiring a single formed stool sample, said Dr Pavey. The test has a high specificity and sensitivity (both over 90%) and is readily available to Australian GPs, although it does not attract a Medicare rebate and costs approximately $60. The test is positive if the concentration of faecal elastase is less than 200mcg/g. In terms of imaging, CT is usually the option of first choice, with signs of calcification and atrophy being pathognomonic of significant chronic pancreatitis.

Aside from the need to stop drinking and smoking, treatment revolves around replacement of the pancreatic enzymes, which is available as a capsule taken orally (Creon). The deficiency of these enzymes is the chief cause of the diarrhoea, malabsorption and weight loss, so replacing them not only alleviates the symptoms but will also help prevent some of the significant sequelae associated with this ongoing condition. Interestingly, a study of patients newly diagnosed with pancreatic cancer showed that 66% had pancreatic exocrine insufficiency at diagnosis, and after two months this prevalence grew to 93%.

Dr Pavey advises starting patients with known chronic pancreatitis on 25,000 lipase units (Creon) with every meal and 10,000 units with every snack, and recommends patients eat six smaller meals during the day rather than three larger meals. This replacement therapy would then be titrated up to 40,000 units with a meal and 25,000 units for a snack. For those whose need is greater, replacement could even be increased to 80,000 units per meal. There is no need to put patients on a reduced-fat diet when they are on pancreatic enzyme replacement therapy; however, they often have a highly acidic upper gastrointestinal environment and require acid suppression treatment.

In conclusion, Dr Pavey advises clinicians to have a high index of suspicion for this poorly recognised but important condition. "[Doctors] should be aware of the problem of underdiagnosing this condition and have a low threshold for checking faecal elastase and assessing pancreatic insufficiency," he said.
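The screening cut-off and the dose titration Dr Pavey outlines can be summarised programmatically. The sketch below simply encodes the faecal elastase-1 threshold (<200mcg/g) and the starting, titrated and maximum lipase doses quoted above; the function name and dose labels are assumptions for the example, and this is not prescribing guidance.

```python
def faecal_elastase_suggests_insufficiency(elastase_mcg_per_g: float) -> bool:
    """A faecal elastase-1 concentration below 200mcg/g was described as a positive screen."""
    return elastase_mcg_per_g < 200

# Lipase units of pancreatic enzyme replacement (Creon) per meal or snack,
# as quoted in the talk: starting dose, titrated dose and maximum per-meal dose.
ENZYME_DOSING_LIPASE_UNITS = {
    "starting": {"meal": 25_000, "snack": 10_000},
    "titrated": {"meal": 40_000, "snack": 25_000},
    "maximum":  {"meal": 80_000},
}

# Example: an elastase of 140mcg/g screens positive, and a patient commenced on
# therapy would start at 25,000 units with each of six smaller meals a day.
print(faecal_elastase_suggests_insufficiency(140))     # True
print(ENZYME_DOSING_LIPASE_UNITS["starting"]["meal"])  # 25000
```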

Clinical Articles

New NHMRC guidelines put age and family history front and centre in determining who should be screened for bowel cancer with colonoscopy and who needs iFOBT. It has been known for some time that family history can influence the risk of developing bowel cancer, Australia's second most common cause of cancer death. But it is also known that specific, identified genetic mutations causing conditions such as Lynch syndrome or familial adenomatous polyposis are rare, accounting for less than 5% of all bowel cancers diagnosed. At most, the researchers say, this only explains half of the reason why family history is a risk factor for bowel cancer. "The remainder of the observed increases in familial risk could be due in part to mutations in yet to be discovered colorectal cancer susceptibility genes, polygenic factors such as single-nucleotide polymorphisms, or dietary and other lifestyle factors shared by family members," the guideline authors said in the Medical Journal of Australia.

The researchers, led by Professor Mark Jenkins, director of the Centre for Epidemiology and Biostatistics in the University of Melbourne's School of Population and Global Health, therefore analysed all the available cohort studies to determine the risk of developing colorectal cancer based on age and family history. They categorised people into one of three levels of risk, which determined at what age screening would be worth starting and which screening method was most appropriate. The screening guidelines exclude people with a known or suspected cancer-causing genetic syndrome, as these people require much more intensive screening and should be managed in a family cancer clinic.

The majority of Australians (90%) fall into the lowest risk category, category 1, which puts their risk at age 40 of developing colorectal cancer in the next 10 years at about 0.25% (one in 400). As with most other cancers, age is a risk factor, so it is unsurprising that at age 50 this risk has risen to 0.9%. Screening for this category 1 group should be the two-yearly iFOBT that is currently available via the National Bowel Cancer Screening Program for adults between the ages of 50 and 74 years. Interestingly, people aged 75 and older still develop bowel cancer, but there have been no studies determining the cost-effectiveness or the balance of benefit and risk of screening in this age group, which is why the program and the guideline recommendations stop at 74 years. One of the differences in these new guidelines, a revision of the previous ones published back in 2005, is that people with a first-degree relative who has, or has had, bowel cancer diagnosed at age 55 or older are still considered at average risk (category 1). However, people with this history might consider starting iFOBT screening at a younger age (45 years), the guideline authors suggest.

Category 2 includes people with a moderately increased risk of developing colorectal cancer, 3-6 times higher than average. This means having a first-degree relative diagnosed with bowel cancer before the age of 55, or having two first-degree relatives who developed bowel cancer at any age (or one first-degree and two second-degree relatives). Category 2 people are recommended to have iFOBT every two years for the decade between ages 40 and 50, and then switch to five-yearly colonoscopies until the age of 75.

Finally, the high-risk category 3 is for those patients without a genetic syndrome whose family history is even stronger than that of people in category 2. Their risk is 7-10 times higher than average. This includes people with at least three first-degree relatives who have been diagnosed with colorectal cancer at any age, or people who have multiple relatives with the cancer including at least one diagnosed before age 55. These high-risk people need to start screening earlier, with the guidelines recommending iFOBT every two years starting at age 35 and continuing for 10 years, and then a colonoscopy every five years between the ages of 45 and 75.

Of note is that the revised guidelines have deleted the reference in the previous guidelines to starting screening 10 years before the earliest age at which colorectal cancer was diagnosed in a first-degree relative. "There have been no studies conducted to determine the utility of beginning screening 10 years before the earliest diagnosis in the family, which was a recommendation in the 2005 guidelines and, therefore, it is not included in these guidelines," they said. The new guidelines aim not only to more strongly define risk based on the latest evidence, but also to determine the most appropriate screening method based on that risk, taking into consideration cost-effectiveness and rationalisation of available services, in particular colonoscopies.

Reference: Jenkins MA, Ouakrim DA, Boussioutas A, Hopper JL, Ee HC, Emery JD, et al. Revised Australian national guidelines for colorectal cancer screening: family history. Med J Aust. 2018 Oct 29. doi: 10.5694/mja18.00142 [epub ahead of print]
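The age- and family-history-based recommendations above map naturally onto a small lookup. The sketch below is a simplified summary of the three categories as described in this article; it ignores the genetic-syndrome exclusions and other nuances in the full guideline, and the dictionary keys and labels are assumptions for the example.

```python
# Screening schedules by risk category, as summarised in this article.
# Each tuple is (start age, stop age) for the test named in the key.
SCREENING_BY_CATEGORY = {
    1: {  # average risk (~90% of Australians); about 0.25% 10-year risk at age 40
        "ifobt_every_2_years": (50, 74),  # via the National Bowel Cancer Screening Program
        "note": "may consider starting iFOBT at 45 if a first-degree relative was diagnosed at 55 or older",
    },
    2: {  # moderately increased risk (3-6 times average)
        "ifobt_every_2_years": (40, 50),
        "colonoscopy_every_5_years": (50, 75),
    },
    3: {  # high risk without a known genetic syndrome (7-10 times average)
        "ifobt_every_2_years": (35, 45),
        "colonoscopy_every_5_years": (45, 75),
    },
}

def recommendation(category: int) -> dict:
    """Return the screening schedule described in the article for a given category."""
    return SCREENING_BY_CATEGORY[category]

# Example: a category 2 patient has two-yearly iFOBT from 40 to 50,
# then five-yearly colonoscopy until age 75.
print(recommendation(2))
```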

Clinical Articles

It appears we might still be failing some of our poorer migrant women, with a new study finding that they have higher rates of stillbirth than Australian-born mothers. Analysing data from stillbirths that occurred in Western Australia over the period 2005 to 2013, researchers found that while stillbirth rates overall were low, and often much lower than in these migrant women's countries of birth, they were higher in non-Australian-born women, especially those born in Africa.

Published recently in The Medical Journal of Australia, the study also took note of whether the deaths occurred in the antepartum period (from 20 weeks' gestation up until labour commences) or the intrapartum period (after labour has started), in an attempt to determine when, and in whom, intervention might be warranted. Researchers found the key factor was the woman's country of birth rather than her ethnic origin, as there appeared to be no difference in stillbirth rates between white and non-white Australian-born women. However, women born in Africa were twice as likely to have a stillbirth in the weeks before going into labour as Australian-born women, and Indian-born women were 70% more likely. Migrant women born in other countries collectively had an increased risk of antepartum stillbirth of about 40%.

And frighteningly, the rates of stillbirth occurring once labour had started were also much higher than those in Australian-born women: almost twice the risk for most migrant women, and more than double that again for African-born women. "That the rate [of] intrapartum stillbirth was twice as high among African women is especially worrying, as intrapartum stillbirth is regarded as preventable and indicative of inadequate quality of care," the study authors wrote.

So why is this happening, the researchers asked. Why is it that, despite access to the same standard of healthcare as the rest of the Australian population, these women are more at risk of losing their babies, especially African-born women, and especially so late in the pregnancy? The study authors suggest cultural issues may play a major role. They point to statistics showing that African-born women are more likely to have pregnancies lasting 42 weeks or more, a well-recognised risk factor for stillbirth. Qualitative studies have also found that there is often, particularly among African-born women, a deeply held suspicion of interventions in pregnancy, which are believed to interfere with the natural process of childbirth and possibly have long-term repercussions. Consequently, there is not only poorer attention to antenatal care but also resistance to procedures such as induction of labour and caesarean section.

"More in-depth investigation of the patterns of health service use, pregnancy, and labour care for migrant women, particularly African migrants, is warranted," the researchers said. They suggest education is the most likely solution, but changing what are likely to be long-held and culturally associated attitudes will need both sensitivity and intelligence. "Culturally appropriate antenatal engagement and educational programs about the risk of stillbirth and the indications for and the safety of induction and related interventions may be useful preventive strategies," they concluded.

Reference

Mozooni M, Preen DB, Pennell CE. Stillbirth in Western Australia, 2005-2013: the influence of maternal migration and ethnic origin. Med J Aust. 2018; 209(9): 394-400. doi: 10.5694/mja18.00362

The physical health of mentally ill patients is a "massive problem and we are doing very badly at it,” psychiatrist Dr Matthew Warden told doctors at a recent Healthed evening seminar in Sydney. In particular, the prevalence of high cardiovascular risk among patients with a history of psychosis, means this population was a "ticking time bomb", said Dr Warden, who is the Director of Acute Inpatient Services for Mental Health at St Vincent’s Hospital in Melbourne. Even without antipsychotic medication, a disproportionate number of people with a history of psychosis are overweight or obese, do very little if any physical exercise and smoke. And it is well-known that the metabolic side-effects associated with antipsychotic medications increases this cardiovascular risk enormously. Consequently, there has been growing pressure on psychiatrists to assess, monitor and manage the physical health of their patients with psychosis, but Dr Warden said, realistically this needs to be also done by GPs as they will usually be managing these patients long-term and "they are better at it.” Baseline metabolic measurements need to be taken at first episode of psychosis, including weight, BMI, BP, lipid levels, fasting blood sugar and smoking status. Weight, in particular needs to be monitored carefully following the commencement of antipsychotic medication, as weight gain is extremely common, especially with olanzapine which, Australia-wide is the most commonly prescribed antipsychotic. In answer to a GP’s question following his talk, Dr Ward said it is extremely difficult to avoid or reverse this medication-induced weight gain with diet and exercise alone. In addition, weight loss pharmacotherapy such as phentermine is contraindicated in people with a history of psychosis. Key to managing the weight gain issue was to choose an antipsychotic with the least long-term side effects from the outset. Olanzapine and clozapine are associated with the greatest weight gain while lurasidone and the partial agonists, aripiprazole and ziprasidone have the least effect on weight. Alternatively, for patients who may have been started on olanzapine or similar, swap to a more weight-neutral medication at the first sign they were gaining weight or developing other metabolic side-effects. It is more likely that a person who as gained weight on olanzapine, will lose that weight if switched to another weight-neutral medication early. The longer that patient stays on olanzapine and the weight gain is sustained, the harder it will be to shift even if the medication is changed, Dr Warden said. In addition to managing weight gain in mentally ill patients, Dr Warden also encouraged GPs to offer smoking cessation advice and help. Even though this population were often considered among the most dependent and heaviest smokers, his own research had found a significant number of patients could successfully quit or at the least cut down given the right advice and assistance. While most smoking cessation pharmacotherapy could be used, Dr Warden suggested that varenicline (Champix) was probably best avoided in these patients. At St Vincent’s Hospital in Melbourne, patients receiving antipsychotic therapy have their metabolic markers assessed at admission and at regular intervals after that, including measuring their serum prolactin. 
“Hyperprolactinaemia is a significant problem and should be monitored every six months if it is elevated or increasing particularly if there are symptoms then either reduce the dose or change antipsychotic or add in low dose aripiprazole which will lower prolactin levels,” Dr Warden explained.   Dr Matthew Warden spoke on the “Management of Metabolic Dysregulation in Patients on Antipsychotics” at the Healthed, Mental Health in General Practice Evening Seminar held in Sydney in June, 2018.

Clinical Articles iconClinical Articles

Why are Australians having rehabilitation as an inpatient after their total knee replacements rather than as an outpatient at a rate higher than any other country in the world? And why are our rates of inpatient rehabilitation as opposed to community or home-based rehab increasing? That’s what researchers were investigating in a study just published in the MJA. Could it be inpatient rehab was associated with better health outcomes for the patient than the other options? Or were patients too complex, lived too far away or needed greater supervision to allow them to have their rehab off-site? As it turns out, the reason inpatient rehabilitation rates are increasing has much more to do with private hospitals being able to access funding than any patient factors. According to the study authors, more than 50,000 total knee replacement operations were performed in Australia in 2016, about 70% of which took place in a private hospital. In that year, 2016, 45% of patients underwent inpatient rehabilitation following surgery. This represents a substantial increase from the 31% who had the same inpatient service back in 2009. This bucks an international trend. “Inpatient rehabilitation rates in the United States decreased from a peak of 35% in 2003 to 11% in 2009, with a mean rate during 2009-2014 of 15%,” the researchers said. Randomised controlled trials have failed to show the functional improvements achieved through inpatient rehabilitation are superior to those achieved with home- or community-based rehabilitation. However, the cost was significantly more. A recent analysis including almost 260 privately insured patients at 12 Australian hospitals put the cost differential at an average of $9500. And even though the mean age for patients undergoing inpatient rehab was slightly higher than for those who did not (71.0 vs 67.3 years), and they were more likely to have comorbidities and live alone, the study authors said the differences didn’t explain the wide variation in admission rates from hospital to hospital. “Patients in hospitals with high rates of inpatient rehabilitation were similar to those in hospitals with low rates, eliminating patient complexity as the reason,” they said. It seems the greatest determinant of whether a person had inpatient rehabilitation was the hospital in which the total knee replacement took place. “This factor was substantially more important than the clinical profile of the patient,” the study authors said. They suggested some private hospitals were encouraging inpatient rehabilitation because they were able to access funding on a per day basis for the rehab, in addition to the payment received for the knee surgery. The study authors concede it is an attractive business model, but while these hospitals may be offering excellent rehab in terms of services and facilities, it all comes at a cost ‘that, for many patients, is not justified by better outcomes.’ They suggest the proportion of patients receiving inpatient rehabilitation after a total knee replacement could be reduced, improving health care efficiency without harming health outcomes. “Reducing low value care will require system-level changes to guidelines and incentives for hospitals, as hospital-related factors are the major driver of variation in inpatient rehabilitation practices,” they concluded.   Reference: Schilling C, Keating C, Barker A, Wilson SF, Petrie D,  Predictors of inpatient rehabilitation after total knee replacement: an analysis of private hospital claims data. Med J Aust. 
2018 August 27. 209(5): 222-7. Available from: https://www.mja.com.au/journal/2018/209/5/predictors-inpatient-rehabilitation-after-total-knee-replacement-analysis doi:10.5694/mja17.01231  

Clinical Articles iconClinical Articles

GPs may have to correct some patients’ misunderstanding following reports in the general media suggesting that testing for high risk cancer genes was now available to everyone free of charge. Writing in the latest issue of the MJA, Australian genetics experts say that testing for specific high- risk genetic mutations, especially BRCA1 and BRCA2 has been available to appropriate patients free of charge (but not Medicare-rebated) by genetic specialists in public clinics for over 20 years. What’s new is that these tests now attract a Medicare rebate and you don’t have to be a genetic specialist to order them, but they are still only available to selected patients. “Testing is appropriate when there is at least a 10% chance of identifying a gene mutation responsible for the personal or family history of cancer,” the authors of the article wrote. There are a number of algorithms available to help clinicians calculate whether the likelihood of having one of these cancer-causing genetic mutations is at least 10%. Usually testing is initially considered for patients who have been diagnosed with either breast or ovarian cancer, and because of their young age and/or their strong family history are considered at high possibility of having a genetic mutation that explains their condition. The new item numbers (73295,73296 and 73297) cover testing for heritable germline mutations in seven genes including BRCA1 and BRCA2. If such a mutation is found, then at risk adult relatives will be justified in also accessing testing. However, as the article authors point out there are limitations with this type of genetic testing. Firstly most breast and ovarian cancers occur in people without an identifiable underlying genetic variant. “Only 5% of female breast cancers, 15% of invasive epithelial ovarian cancers and up to 14% of male breast cancers are related to BRCA1 or BRCA2 mutations, thus, most patients with breast cancer do not need, nor will they benefit from, a genetic test,” they said. That’s not to say the absence of BRCA1 or BRCA2, or one of the other rarer high-risk mutations currently tested for, excludes the possibility that the patient has inherited a predisposition to the cancer. Families that appear to have a high prevalence of these types of cancer may indeed have an inherited genetic mutation, it is just that because of limitations of technology and knowledge it is yet to be isolated. What’s more, the sensitivity of the current testing methods, means that a number of incidental genetic mutations may be noted, but the significance of these is as yet unknown. It is critical that when testing is requested for a relative of an affected patient, the laboratory is informed of the exact genetic variation found in the original affected patient, to ensure pathologists distinguish between the disease-causing mutation and variants of undetermined significance. The authors also suggest confining testing to only the most likely variant/s rather than requesting testing for mutations in multiple genes. “[T]he testing of multiple genes may uncover unclassified variants, variants outside the usual clinical context, variants unrelated to the current cancer, or unexpected important variants for which the patient has not been well prepared,” they said. They also suggest education and counselling be given to patients considering this genetic testing, and written consent obtained. 
The new Medicare item numbers represent a major step towards genetic and genomic testing becoming mainstream but, as the recent inaccurate media headlines demonstrate, this transition will require information and education. Clinicians who order these tests are likely to benefit from establishing close ties with genetic services and specialists to ensure best and appropriate practice in this ever-expanding area of medicine. Ref: Med J Aust. 2018; 209(5): 193-196. doi: 10.5694/mja17.01124

Clinical Articles

Adolescent boys who struggle to understand how basic machines work, and young girls who have difficulty remembering words, are at increased risk of developing dementia when they are older, new research has found. According to the longitudinal study published in JAMA Network Open, lower mechanical reasoning ability in adolescent boys was associated with a 17% higher risk of dementia at around age 70. For girls, it was lower word memory in adolescence that increased the odds of developing the degenerative disease. It has been known for some time that the smarter you are throughout life, even as a child, the less likely you are to develop dementia. Not a guarantee of protection, just a general trend. It has to do with cognitive reserve, the US researchers explain. “Based on the cognitive reserve hypothesis, high levels of cognitive functioning and reserve accumulated throughout the life course may protect against brain pathology and clinical manifestations of dementia,” they wrote. This theory has been supported by a number of studies, such as the Scottish Mental Health Survey, which showed that lower mental ability at age 11 increased the risk of dementia down the track. What had been less well defined was whether any particular aspects of intelligence in young people were better predictors (or protectors) of dementia than others. This study goes some way to addressing that question. Researchers were able to link sociobehavioural data collected from high school students back in 1960 with Medicare claims data more than 50 years later that identified those people who had been diagnosed with Alzheimer’s disease and related disorders. Interestingly, poor adolescent performance in other areas of intelligence, such as mathematics and visualisation, was also associated with dementia, but not nearly to the same extent as mechanical reasoning and word memory. Why might this be? The study authors offer a few possible explanations. Perhaps the poor performance in adolescence reflected poor brain development earlier in life, itself a risk factor for dementia. Perhaps these adolescents are more susceptible to neuropathology as they get older. Or perhaps they are the adolescents who go on to adopt poor health behaviours such as smoking and physical inactivity. “Regardless of mechanism, our findings emphasise that early-life risk stretches across the life course,” they said. And what can be done about it? That’s the million-dollar question. The researchers hope that if the at-risk group can be identified, aggressive preventive management can begin early. “Efforts to promote cognitive reserve-building experiences and positive health behaviours throughout the life course may prevent or delay clinical symptoms of Alzheimer's disease and related disorder.” An accompanying editorial takes this concept a little further. Dr Tom Russ, a Scottish psychiatrist, says interventional research has identified a number of factors that can potentially influence cognitive reserve. These include modifiable health factors, education, social support, positive affect, stimulating activities and/or novel experiences, and cognitive training. As Dr Russ notes, not all of these factors can be changed, and even those that can may become less modifiable later in life. But as this study demonstrates, it may be possible to work on a person’s cognitive reserve at different stages throughout their life to ultimately lower their risk of dementia.
References
  1. Huang AR, Strombotne KL, Horner EM, Lapham SJ. Adolescent Cognitive Aptitudes and Later-in-Life Alzheimer Disease and Related Disorders. JAMA Network Open [Internet]. 2018 Sep; 1(5): e181726. doi: 10.1001/jamanetworkopen.2018.1726
  2. Russ TC. Intelligence, Cognitive Reserve, and Dementia: Time for Intervention? JAMA Network Open [Internet]. 2018 Sep; 1(5): e181724. doi: 10.1001/jamanetworkopen.2018.1724

Clinical Articles

The value of omega-3 fatty acids has come under fire lately. But now a new systematic review suggests they might have benefits beyond the previous therapeutic targets of depression, cardiac health, eye health and arthritis. Researchers have found that omega-3 polyunsaturated fatty acids (PUFAs) might reduce the symptoms of clinical anxiety, particularly among people who had a specific clinical condition, be it medical (such as Parkinson’s disease) or psychological (such as premenstrual syndrome). “This systematic review…provides the first meta-analytic evidence, to our knowledge, that omega-3 PUFA treatment may be associated with anxiety reduction, which might not only be due to a potential placebo effect, but also from some associations of treatment with reduced anxiety symptoms,” the review authors said in JAMA Network Open. The finding is likely to be welcome news for patients with this condition. Whether because of the potential side-effects of medications or the cost and accessibility of psychological therapy, patients with anxiety, especially those with comorbid medical conditions, are keen for alternative, or at least supplementary, safe and evidence-based treatments for their symptoms. Previous research, in both human and animal studies, had found that a lack of omega-3 PUFAs could induce various behavioural and neuropsychiatric disorders. What had not been shown was whether taking this supplement was effective in reducing anxiety symptoms specifically. The review involved an extensive literature search across a wide range of databases, including PubMed and Cochrane, looking for trials that had assessed the anxiolytic effects of these fatty acids in humans. In the end, 19 trials met the eligibility criteria, allowing the researchers to analyse the effect of supplementation in just over 1200 participants and compare it with about 1000 matched controls who did not take the fatty acids. Overall, they found “there was a significantly better association of treatment with reduced anxiety symptoms in patients receiving omega-3 PUFA treatment than in those not receiving it.” Subgroup analysis also showed that those taking at least 2000mg of omega-3 PUFAs were more likely to have reduced anxiety. And, somewhat surprisingly, patients receiving supplements containing less than 60% EPA did better than those taking formulations with a greater concentration of EPA. The studies in the review included very different cohorts, and because of this and the limited number of studies included, the authors understandably say the results need to be interpreted with caution. However, while bigger, better studies are still needed to prove the benefit of omega-3 PUFAs in patients with clinical anxiety, this research certainly does suggest that higher-dose formulations with an EPA concentration below 60% might have a role as at least an adjunct to standard therapy.   Reference: Su KP, Tseng PT, Lin PY, Okubo R, Chen TY, Chen YW, et al. Association of Use of Omega-3 Polyunsaturated Fatty Acids With Changes in Severity of Anxiety Symptoms: A Systematic Review and Meta-analysis. JAMA Network Open [Internet]. 2018 Sep; 1(5): e182327. doi: 10.1001/jamanetworkopen.2018.2327
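As a simple illustration of how the two subgroup findings reported above fit together, the sketch below flags whether a given regimen matches both the dose (at least 2000mg of omega-3 PUFAs) and the formulation (less than 60% EPA) associated with benefit in the meta-analysis. The function and parameter names are hypothetical.

```python
# Illustrative check against the two subgroup findings reported above
# (a dose of at least 2000 mg of omega-3 PUFAs and a formulation containing
# less than 60% EPA). Function and parameter names are hypothetical.

def matches_benefit_subgroups(dose_mg: float, epa_fraction: float) -> bool:
    """True if a regimen matches both subgroups associated with reduced anxiety."""
    return dose_mg >= 2000 and epa_fraction < 0.60


print(matches_benefit_subgroups(dose_mg=2400, epa_fraction=0.50))  # True
```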

Clinical Articles

GPs can make a significant difference in curbing the rising rate of liver cancer deaths in Australia, experts say. According to an analysis of over 270 newly diagnosed cases of hepatocellular cancer presenting at seven Melbourne tertiary hospitals over one year, researchers say survival rates could be improved with earlier diagnosis of cirrhosis and better adherence to recommended screening schedules among those known to be at high risk. “[T]he number of liver cancer-related deaths has been the most rapid for any cancer type in Australia over the past 40 years,” the study authors said in the MJA. And of all the types of liver cancer, hepatocellular carcinoma is by far the most common, accounting for 82% of cases. Even though both curative and palliative treatments are available, survival remains very poor, with the Australian 12-month survival rate estimated to be only 62%. In this particular study, conducted over 2012-2013, the mean survival was only 18 months. As one would expect, the patients who did best, and generally survived longest, were those whose tumours were detected at an earlier stage. These were usually patients who were known to be at high risk of developing liver cancer and were participating in a surveillance program. But this was only 40% of the 272 cases, even though 89% would have qualified for surveillance based on their risk factors. Why? Firstly, many of these people did not know they were at risk. And that’s where GPs fit in. In the study, the most common risk factors for liver cancer were hepatitis C infection (41%), alcohol-related liver disease (39%), hepatitis B infection (22%) and non-alcoholic fatty liver disease (14%); many patients had more than one risk factor. Most telling was the finding that, even though the vast majority of patients (83%) had cirrhosis when they were diagnosed with hepatocellular carcinoma, for one third of them the cancer diagnosis was the first they knew of it. The study authors suggest clinicians need to be alert for risk factors for chronic liver disease, such as excess alcohol use, chronic HCV and HBV infection, and non-alcoholic fatty liver disease in certain groups. In these people, checking for cirrhosis is likely to be worthwhile. “An aspartate transaminase to platelet ratio index (APRI) value greater than 1.0 predicts cirrhosis with 76% sensitivity and 72% specificity, and the test is simple to undertake,” the researchers said. The other major barrier to earlier detection of liver cancer identified in the study was poor adherence to surveillance among those identified as being at high risk; patients with alcohol-related liver disease or decompensated liver disease were the least likely to get regular monitoring. A surveillance program for this particular cancer involves a six-monthly liver ultrasound and serum alpha-fetoprotein assessment. The study authors advocate a national hepatocellular cancer surveillance program for those at high risk of developing the disease, which would include all patients with cirrhosis, Asian men over 40, women over 50, Africans over 20 years of age, and patients with a family history of [hepatocellular carcinoma] without cirrhosis but with chronic HBV infection.
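The article quotes only the APRI cut-off; for readers unfamiliar with the index, the sketch below applies the commonly used APRI formula (AST divided by its upper limit of normal, multiplied by 100, then divided by the platelet count in 10^9/L). The default upper limit of normal and the example values are assumptions included for illustration, not figures from the study.

```python
# A sketch of the commonly used APRI formula (the MJA article quotes only the
# >1.0 cut-off; the formula, the default AST upper limit of normal and the
# example values below are assumptions, not taken from the article).

def apri(ast_iu_l: float, platelets_10e9_l: float, ast_uln_iu_l: float = 40.0) -> float:
    """APRI = (AST / AST upper limit of normal) x 100 / platelet count (10^9/L)."""
    return (ast_iu_l / ast_uln_iu_l) * 100 / platelets_10e9_l


score = apri(ast_iu_l=80, platelets_10e9_l=150)
print(f"APRI = {score:.2f}")  # values above 1.0 predicted cirrhosis (76% sensitivity, 72% specificity)
```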
A national program to screen for hepatocellular carcinoma among this group would be worthwhile, the researchers said, as the incidence of the cancer is high, the screening is non-invasive and inexpensive and, perhaps most importantly, early detection has been shown to improve survival. However, until such a national program is developed, the researchers are encouraging GPs to ensure that their at-risk patients are enrolled in a surveillance program, in the hope of improving their health outcomes.   Reference: Hong TP, Gow PJ, Fink M, Dev A, Roberts SK, Nicoll A, et al. Surveillance improves survival of patients with hepatocellular carcinoma: a prospective population-based study. Med J Aust [Internet]. 2018 Sep 24; 209(8): 1-7. Available from: https://www.mja.com.au/journal/2018/209/8/surveillance-improves-survival-patients-hepatocellular-carcinoma-prospective doi: 10.5694/mja18.00373

Clinical Articles

Vaccination of immunosuppressed adult patients has many facets and can be challenging for GPs who don’t deal with these cases regularly. But there are a few key considerations that can help guide clinicians, says Associate Professor Michael Woodward, Melbourne-based geriatrician, writer, researcher and passionate advocate for health promotion. Firstly, not all immunosuppression is equal. It is important to ascertain the degree of immunosuppression, as some people may be unnecessarily denied vaccines simply because they are taking a medication that suppresses the immune system only at higher doses or in different formulations. “For instance, someone who is on inhaled corticosteroids for their asthma or on low dose (less than 20mg) prednisolone daily for just a few weeks is not significantly immunosuppressed and can be vaccinated in the same way as other people,” said Professor Woodward in an interview following his presentation at Healthed’s recent Annual Women's and Children’s Health Update in Perth. However, those on higher doses of steroids or on steroids longer term, as well as people who have conditions associated with immunosuppression such as haematological malignancy, do need special consideration when it comes to vaccination. Most importantly, live vaccines must not be given to this group. This includes the herpes zoster vaccine (Zostavax), which is absolutely contraindicated in severely immunocompromised patients. The consequences of inadvertently administering this vaccine to an immunosuppressed patient hit the headlines some months ago, highlighting the importance of this guideline. The other question often asked is whether patients who are known to be immunosuppressed, and therefore at greater risk of significant infections, actually need more or stronger doses of the vaccines they are able to have. In some cases that is a very real and worthwhile consideration if the objective of immunoprotection is to be achieved, Professor Woodward said. For example, an immunosuppressed patient might be given the conjugate pneumococcal vaccine (Prevenar 13) rather than the polysaccharide pneumococcal vaccine (Pneumovax 23). “The conjugate vaccine is generally slightly more likely to produce an immune response [than the polysaccharide vaccine],” he said. The other scenario where GPs might need to consider vaccination in association with immunosuppression is in patients who are scheduled for an elective splenectomy. The lack of a spleen is known to be associated with a reduced ability to respond to vaccines, so it is currently recommended that people who are about to undergo a splenectomy receive the influenza, pneumococcal and the newer zoster vaccines. In addition, they should be vaccinated against Haemophilus influenzae type b and receive the two meningococcal vaccines currently available. All of these are detailed in the pre-splenectomy recommendations on the spleen.org.au website, with the exception of the zoster vaccine, as the guidelines have yet to be updated; however, Professor Woodward says most health professionals in this area advocate its inclusion. Some of these vaccinations may also be given shortly after the removal of the spleen in cases where the splenectomy has been urgent, but this is generally not the remit of the GP. In general, the question of vaccination in the immunosuppressed patient can be complicated.
It is a highly specialised area, and Professor Woodward suggested that, if in doubt, GPs might want to seek input from a specialist in this field, such as an immunologist or a rheumatologist.
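For illustration only, the decision points Professor Woodward described can be sketched roughly as follows. The thresholds mirror his examples (inhaled corticosteroids, or prednisolone under 20mg daily for a few weeks, do not count as significant immunosuppression); the four-week cut-off and all names are assumptions, and this is not a clinical decision tool.

```python
# Hypothetical decision sketch of the points made above. The thresholds follow
# the talk; the four-week cut-off for "a few weeks" is an assumption.

def significantly_immunosuppressed(prednisolone_mg_daily: float,
                                   course_weeks: int,
                                   inhaled_steroids_only: bool,
                                   other_immunosuppressive_condition: bool = False) -> bool:
    """Rough flag for 'significant' immunosuppression as described in the talk."""
    if other_immunosuppressive_condition:  # e.g. haematological malignancy
        return True
    if inhaled_steroids_only:
        return False
    return prednisolone_mg_daily >= 20 or course_weeks > 4  # assumed cut-off for 'a few weeks'


def vaccine_advice(suppressed: bool) -> list:
    if not suppressed:
        return ["Vaccinate as for the general population."]
    return ["Avoid live vaccines such as Zostavax.",
            "Consider the conjugate pneumococcal vaccine (Prevenar 13) over Pneumovax 23."]


print(vaccine_advice(significantly_immunosuppressed(5, 2, inhaled_steroids_only=False)))
```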

Clinical Articles

Which bacteria are colonising your gut is becoming increasingly important, Australian researchers say. More and more evidence suggests the gut microbiota plays a significant role in both the cause and the cure of a wide range of gastrointestinal and hepatic diseases and conditions. According to a review in The Medical Journal of Australia, research shows that particular types of bacteria colonising the gut have been associated with the development of inflammatory bowel disease (IBD), metabolic syndrome, non-alcoholic fatty liver disease (NAFLD), non-alcoholic steatohepatitis (NASH), obesity and diabetes. We know that the bowel starts to be colonised by bacteria in utero. The make-up of an individual’s gut microbiota then depends on factors such as mode of delivery, breastfeeding, diet, illness and exposure to antibiotics. By the age of three, the gut microbiota resembles that of an adult. Observational studies have suggested a strong association between alterations in the microbiota caused by environmental factors and an increased risk of disease. For instance, IBD was very rare in traditional Chinese populations, but studies have shown that exposure to Western diets and medicines from a young age has increased the prevalence of this disease. “Asian adults who migrate from countries of low prevalence to countries of high prevalence do not have an increased risk of developing IBD, but their children experience the IBD incidence of their new country of residence,” the review authors said. Stronger evidence comes from studies into the exact nature of the bacteria colonising the gut. It appears both the specific bacteria and the diversity of bacteria are important in disease pathogenesis. “For example, the presence of Proteus species at the time of Crohn’s disease resection is associated with early disease recurrence, while the presence of Faecalibacterium prausnitzii is protective against recurrence,” they said. Researchers have also found significant differences in the microbiota of people who develop severe alcoholic hepatitis and those who maintain normal hepatic function despite drinking the same amount of alcohol. Most important in the investigation of the role of the gut microbiota is the emerging evidence that by altering the bacterial colonies in the gut we can alter the course of disease. The classic example of this, of course, was the discovery of Helicobacter pylori as the cause of peptic ulcer disease, with treatment dramatically changing health outcomes. But since then much of the research focus has been on faecal microbiota transplants (FMT, or poo transplants as they are commonly known). Mouse studies have shown that a mouse will become fat if given a transplant of faecal bacteria from a fat donor mouse. A similar result has been shown in a single trial of FMT from lean to obese humans, which lowered triglyceride levels and increased insulin sensitivity. FMT has also been shown to be an effective therapy for recurrent C. difficile infection and in active ulcerative colitis. And while the application of FMT as a treatment continues to be explored, investigators are also looking at how the microbiota can be changed through dietary means and how this can be used therapeutically. While probiotics have not been shown to be effective in the majority of inflammatory diseases, an anti-inflammatory diet combined with liquid formulated enteral nutrition has shown some success in Crohn’s disease.
In short, the review authors suggest that the current interest in the gut microbiome is justified and has the potential to provide important therapeutic options in the future. “Microbial manipulation is an effective therapy, likely to have broadening implications,” they concluded.   Reference: White LS, Van den Bogaerde J, Kamm M. The gut microbiota: cause and cure of gut diseases. Med J Aust. 2018 Oct 1; 209(7): 312-16. Available from: https://www.mja.com.au/journal/2018/209/7/gut-microbiota-cause-and-cure-gut-diseases doi: 10.5694/mja17.01067

Clinical Articles

It wasn’t that long ago that vitamin D appeared to be the panacea for everything from preventing MS to reducing the risk of diabetes. But the one area where we thought the benefit of this vitamin was not up for debate was bone health. It has been proven that lack of vitamin D causes rickets. It has been proven that vitamin D is important in bone metabolism and turnover. And it has been proven that people with low bone density are more likely to experience fractures. Therefore, add vitamin D and improve bones, right? Wrong! The latest meta-analysis of more than 80 randomised controlled trials shows that vitamin D supplementation does not prevent fractures or falls, and does not have any consistent, clinically relevant effect on bone mineral density. This comes as a bit of a surprise, to say the least. According to the systematic review, vitamin D had no effect on total fractures, hip fractures or falls among the 53,000 participants in the pooled analysis. And it didn’t matter whether higher or lower doses of vitamin D were used, the New Zealand researchers reported in The Lancet Diabetes & Endocrinology. In looking for a reason for the lack of effect from supplementation, previous explanations, such as the baseline 25OHD levels of trial participants being too high, the supplement dose being too low, or the trials being done in the wrong populations, just don’t hold water. The sheer number and variety of trials included in this meta-analysis means all of these possible confounders have been accounted for. “The trials we included have a broad range of study designs and populations, but there are consistently neutral results for all endpoints, including the surrogate endpoint of bone mineral density,” they said. Consequently, the researchers said future trials were unlikely to alter these conclusions. “There is little justification to use vitamin D supplements to maintain or improve musculoskeletal health,” they stated. And while they acknowledge the clear exception is the prevention or treatment of rickets and osteomalacia, in general clinical guidelines should not be recommending vitamin D supplementation for bone health. The conclusion appears quite emphatic and definitive, and it is supported in an accompanying commentary by a leading US endocrinologist. “The authors should be complimented on an important updated analysis on musculoskeletal health,” said Dr Chris Gallagher from Creighton University Medical Centre in Omaha, in the US. But he suggests many vitamin D supporters will still be flying the flag for supplementation, pointing to its multiple potential non-bony benefits. “Within three years, we might have that answer because there are approximately 100,000 participants currently enrolled in randomised, placebo-controlled trials of vitamin D supplementation,” he said. “I look forward to those studies giving us the last word on vitamin D.”

References

Bolland MJ, Grey A, Avenell A. Effects of vitamin D supplementation on musculoskeletal health: a systematic review, meta-analysis, and trial sequential analysis. Lancet Diabetes Endocrinol. 2018 Oct 4. [Epub ahead of print]. Available from: http://dx.doi.org/10.1016/S2213-8587(18)30265-1
Gallagher JC. Vitamin D and bone density, fractures, and falls: the end of the story? Lancet Diabetes Endocrinol. 2018 Oct 4. [Epub ahead of print]. Available from: http://dx.doi.org/10.1016/S2213-8587(18)30269-9

Clinical Articles

The answer to both these questions is yes, according to Dr Darren Pavey, gastroenterologist and senior lecturer at the University of NSW. Speaking at the Healthed General Practice Education seminar in Sydney recently, Dr Pavey said there was good international research suggesting that many cases of chronic pancreatitis go undiagnosed and that the condition is far more prevalent than previously recognised. Overseas studies, including cohorts of randomly selected adult patients, suggest a prevalence of between 6% and 12%, with the condition more likely among patients with recent-onset type 2 diabetes, excess alcohol intake, smokers and those over 40 years of age, he said. And in response to the question of whether it is important to diagnose this condition, Dr Pavey said chronic pancreatitis not only causes immediate symptoms, usually pain, diarrhoea and weight loss, but commonly has longer-term consequences such as pancreatic exocrine insufficiency (where there is less than 10% of pancreatic function) and an increased risk of diabetes, malnutrition and even pancreatic cancer. Certainly an incentive to diagnose and treat earlier rather than later. Part of the challenge in recognising the condition is that the classic triad of symptoms, namely abdominal pain, diarrhoea and weight loss, is common to a variety of medical conditions, including IBD and IBS. What’s more, abdominal pain, which many doctors would have assumed had to be present, does not always occur in chronic pancreatitis, especially when it is idiopathic, the more common variety. In fact, pain is only present in about half the cases of idiopathic chronic pancreatitis. Idiopathic pancreatitis constitutes about 55% of all cases, with the other 45% being alcohol-related; abdominal pain tends to be a more consistent feature of alcoholic chronic pancreatitis. So if you have a patient in the right age group (about 40 to 60 years) who has chronic diarrhoea, weight loss and perhaps abdominal pain, and you suspect they might have chronic pancreatitis, what do you do? The most common screening test for chronic pancreatitis is now the faecal elastase-1 stool test, which requires a single formed stool sample, said Dr Pavey. The test has high specificity and sensitivity (both over 90%) and is readily available to Australian GPs, although it does not attract a Medicare rebate and costs approximately $60. The test is positive if the concentration of faecal elastase is less than 200mcg/g. In terms of imaging, CT is usually the first choice, with signs of calcification and atrophy being pathognomonic of significant chronic pancreatitis. Aside from the need to stop drinking and smoking, treatment revolves around replacement of the pancreatic enzymes, which is available as a capsule taken orally (Creon). The deficiency of these enzymes is the chief cause of the diarrhoea, malabsorption and weight loss, so replacing them not only alleviates the symptoms but also helps prevent some of the significant sequelae associated with this ongoing condition. Interestingly, in a study of patients newly diagnosed with pancreatic cancer, 66% had pancreatic exocrine insufficiency at diagnosis, and after two months this prevalence had grown to 93%. Dr Pavey advises starting patients with known chronic pancreatitis on 25,000 lipase units (Creon) with every meal and 10,000 units with every snack, and recommends patients eat six smaller meals during the day rather than three larger meals.
This replacement therapy would then be titrated up to 40,000 units with each meal and 25,000 units with each snack. For those whose need is greater, replacement can be increased to as much as 80,000 units per meal. There is no need to put patients on a reduced-fat diet while they are on pancreatic enzyme replacement therapy; however, they often have a highly acidic upper gastrointestinal environment and require acid suppression treatment. In conclusion, Dr Pavey advised clinicians to have a high index of suspicion for this poorly recognised but important condition. “[Doctors] should be aware of the problem of underdiagnosing this condition and have a low threshold for checking faecal elastase and assessing pancreatic insufficiency,” he said.
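The numbers Dr Pavey quoted can be collected into a small sketch for illustration: a faecal elastase-1 result below 200mcg/g is a positive screen, and enzyme replacement starts at 25,000 lipase units per meal and 10,000 per snack, titrating upwards as needed. The structure and names are illustrative, and the snack dose at the highest step is an assumption, as only the 80,000-unit per-meal figure was quoted.

```python
# A sketch of the figures quoted by Dr Pavey; structure and names are illustrative.

FAECAL_ELASTASE_CUTOFF = 200  # mcg/g; below this the screen is positive

DOSING_STEPS = [           # (lipase units per meal, per snack)
    (25_000, 10_000),      # usual starting dose
    (40_000, 25_000),      # first titration step
    (80_000, 25_000),      # greater need per meal; snack dose assumed unchanged
]


def elastase_positive(faecal_elastase_mcg_g: float) -> bool:
    """Positive screen for pancreatic exocrine insufficiency."""
    return faecal_elastase_mcg_g < FAECAL_ELASTASE_CUTOFF


def dose_at(step: int) -> tuple:
    """Return the (meal, snack) dose for a titration step, capped at the top step."""
    return DOSING_STEPS[min(step, len(DOSING_STEPS) - 1)]


if elastase_positive(120):
    meal, snack = dose_at(0)
    print(f"Start enzyme replacement: {meal} units per meal, {snack} units per snack")
```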

Clinical Articles

New NHMRC guidelines put age and family history front and centre in determining who should be screened for bowel cancer with colonoscopy and who needs iFOBT. It has been known for some time that family history can influence the risk of developing bowel cancer, Australia’s second most common cause of cancer death. But it is also known that the specific, identified genetic mutations causing conditions such as Lynch syndrome or familial adenomatous polyposis are rare, accounting for less than 5% of all bowel cancers diagnosed. At most, the researchers say, this explains only half of the reason family history is a risk factor for bowel cancer. “The remainder of the observed increases in familial risk could be due in part to mutations in yet to be discovered colorectal cancer susceptibility genes, polygenic factors such as single-nucleotide polymorphisms, or dietary and other lifestyle factors shared by family members,” the guideline authors said in the Medical Journal of Australia. The researchers, led by Professor Mark Jenkins, director of the Centre for Epidemiology and Biostatistics in the University of Melbourne’s School of Population and Global Health, therefore analysed all the available cohort studies to determine the risk of developing colorectal cancer based on age and family history. They categorised people into one of three levels of risk, which determined at what age screening is worth starting and which screening method is most appropriate. The screening guidelines exclude people with a known or suspected cancer-causing genetic syndrome, as these people require much more intensive screening and should be managed in a family cancer clinic. The majority of Australians (90%) fall into the lowest risk category, category 1, which puts their risk at age 40 of developing colorectal cancer in the next 10 years at about 0.25% (one in 400). As with most other cancers, age is a risk factor, so it is unsurprising that by age 50 this risk has risen to 0.9%. Screening for this category 1 group should be the two-yearly iFOBT currently available through the National Bowel Cancer Screening Program for adults between the ages of 50 and 74 years. Interestingly, people aged 75 and older still develop bowel cancer, but there have been no studies to determine the cost-effectiveness or the balance of benefit and risk of screening in this age group, which is why the program and the guideline recommendations stop at 74 years. One of the differences in these new guidelines, a revision of the previous ones published back in 2005, is that people with a first-degree relative who has had bowel cancer diagnosed at age 55 or older are still considered at average risk (category 1). However, people with this history might consider starting iFOBT screening at a younger age (45 years), the guideline authors suggest. Category 2 includes people with a moderately increased risk of developing colorectal cancer, 3-6 times higher than average. This means having a first-degree relative diagnosed with bowel cancer before the age of 55, or having two first-degree relatives who developed bowel cancer at any age (or one first-degree and two second-degree relatives). Category 2 people are recommended to have iFOBT every two years for the decade between ages 40 and 50 and then switch to five-yearly colonoscopy until the age of 75.
Finally, the high-risk category 3 covers those patients without a genetic syndrome whose family history is even stronger than that of category 2; their risk is 7-10 times higher than average. This includes people with at least three first-degree relatives diagnosed with colorectal cancer at any age, or people with multiple relatives with the cancer including at least one diagnosed before the age of 55. These high-risk people need to start screening earlier, with the guidelines recommending iFOBT every two years starting at age 35 and continuing for 10 years, followed by a colonoscopy every five years between the ages of 45 and 75. Of note, the revised guidelines have deleted the previous recommendation to begin screening 10 years before the earliest age at which colorectal cancer was diagnosed in a first-degree relative. “There have been no studies conducted to determine the utility of beginning screening 10 years before the earliest diagnosis in the family, which was a recommendation in the 2005 guidelines and, therefore, it is not included in these guidelines,” they said. The new guidelines aim not only to define risk more precisely based on the latest evidence, but also to match the screening method to that risk, taking into consideration cost-effectiveness and the rationalisation of available services, in particular colonoscopy.   Reference: Jenkins MA, Ouakrim DA, Boussioutas A, Hopper JL, Ee HC, Emery JD, et al. Revised Australian national guidelines for colorectal cancer screening: family history. Med J Aust. 2018 Oct 29. doi: 10.5694/mja18.00142. [Epub ahead of print]
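For illustration, the screening schedules described above can be summarised in a short sketch. Assigning a patient to a category still requires the full guideline criteria (and anyone with a known or suspected genetic syndrome is managed separately), so the category is simply passed in; all names are illustrative.

```python
# A simplified sketch of the screening schedules summarised above. Category
# assignment requires the full guideline criteria, so it is passed in.

def screening_plan(category: int) -> list:
    """Screening approach for the three family-history risk categories."""
    if category == 1:   # average risk (about 90% of Australians)
        return ["iFOBT every 2 years from age 50 to 74",
                "consider starting iFOBT at 45 if a first-degree relative was diagnosed at 55 or older"]
    if category == 2:   # moderately increased risk (3-6 times average)
        return ["iFOBT every 2 years from age 40 to 50",
                "colonoscopy every 5 years from age 50 to 75"]
    if category == 3:   # high risk (7-10 times average), no known genetic syndrome
        return ["iFOBT every 2 years from age 35 to 45",
                "colonoscopy every 5 years from age 45 to 75"]
    raise ValueError("Known or suspected genetic syndromes are managed via a family cancer clinic")


for step in screening_plan(2):
    print(step)
```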

Clinical Articles

It appears we might still be failing some of our poorer migrant women, with a new study finding that they have higher rates of stillbirth than Australian-born mothers. Analysing data from stillbirths that occurred in Western Australia over the period 2005 to 2013, researchers found that while stillbirth rates overall were low, and often much lower than in the migrant women’s countries of birth, they were higher among women born overseas, especially those born in Africa. Published recently in The Medical Journal of Australia, the study also took note of whether the deaths occurred in the antepartum period (from 20 weeks’ gestation until the onset of labour) or the intrapartum period (after labour has started), in an attempt to determine when, and in whom, intervention might be warranted. Researchers found the key factor was the woman’s country of birth rather than her ethnic origin, as there appeared to be no difference in stillbirth rates between white and non-white Australian-born women. However, women born in Africa were twice as likely to have a stillbirth in the weeks before going into labour as Australian-born women, and Indian-born women were 70% more likely. Migrant women born in other countries collectively had an increased risk of antepartum stillbirth of about 40%. And, frighteningly, the rates of stillbirth occurring once labour had started were also much higher than those in Australian-born women: almost twice the risk for most migrant women, and more than double that again for African-born women. “That the rate [of] intrapartum stillbirth was twice as high among African women is especially worrying, as intrapartum stillbirth is regarded as preventable and indicative of inadequate quality of care,” the study authors wrote. So why is this happening, the researchers asked. Why is it that, despite access to the same standard of healthcare as the rest of the Australian population, these women are more at risk of losing their babies, especially African-born women and especially so late in pregnancy? The study authors suggest cultural issues may play a major role. They point to statistics showing that African-born women are more likely to have pregnancies lasting 42 weeks or more, a well-recognised risk factor for stillbirth. Qualitative studies have also found that there is often, particularly among African-born women, a deeply held suspicion of interventions in pregnancy, which are believed to interfere with the natural process of childbirth and possibly have long-term repercussions. Consequently, there is not only poorer engagement with antenatal care but also resistance to procedures such as induction of labour and caesarean section. “More in-depth investigation of the patterns of health service use, pregnancy, and labour care for migrant women, particularly African migrants, is warranted,” the researchers said. They suggest education is the most likely solution, but changing what are likely to be long-held, culturally embedded attitudes will need both sensitivity and intelligence. “Culturally appropriate antenatal engagement and educational programs about the risk of stillbirth and the indications for and the safety of induction and related interventions may be useful preventive strategies,” they concluded.

Reference

Mozooni M, Preen DB, Pennell CE. Stillbirth in Western Australia, 2005–2013: the influence of maternal migration and ethnic origin. Med J Aust. 2018; 209(9): 394-400. doi: 10.5694/mja18.00362
Clinical Articles