The Brain

While we know more than we ever have, there is still a lot to learn, meaning that, in an ironic sense, the brain is still something we can’t fully get our heads around.

Someone once said that if the human brain were so simple that we could understand it then we would be so simple that we couldn’t. I would have to agree. The brain is our vastly complicated seat of consciousness and individuality, controlling most functions of the body, some of which we are aware of and some of which are on a more subconscious level.

If one were to zoom in to see it under a microscope, one would find literally billions of nerve cells, or neurons, forming a densely interconnected network, signalling to each other using electrical pulses and chemical messengers. There are around 86 billion of these neurons in the adult brain, meaning that if you were to pick an area of the brain the size of a small grain of sand you might find as many as 100,000 neurons in just that one area. What’s more, each one connects to around 1,000 others via connections known as synapses.

As we zoom out again, we see that the brain has a wrinkled surface that, if stretched out flat, would cover the area of about four A4 sheets of paper. It is cushioned and bathed by a layer of clear cerebrospinal fluid that, as the name suggests, runs all the way down around the spinal cord as well. Weighing in at around 2% of our body weight, the brain directs almost every function of the body, which is a pretty good return. It follows, therefore, that it needs a fairly good power supply and, indeed, it has an extensive network of blood vessels that supply it with oxygenated blood and nutrients – it uses around 20% of the body’s energy.

With such a complex make-up and so many responsibilities, it is no wonder that the brain is regarded with such intense interest, and yet it remains relatively poorly understood compared with other organs of the body. As with much scientific endeavour, though, great progress has been made in understanding it over the last century.

Take the frontal lobe, for example. As part of the quest to understand the brain in more detail, scientists identify areas according to their perceived function. The frontal lobe is thought to be involved in executive functions such as judgement, decision-making, planning and control of behaviour – functions whose importance became clear following an accident involving a railway worker named Phineas Gage in which, rather unfortunately, an iron rod was driven up through his cheek and out of the top of his skull, damaging his frontal lobe. Though he survived this ordeal, the once calm and understated worker famously showed a marked change in personality towards aggression and surliness.

Had the rod passed through his occipital lobe instead, he might have had trouble with his vision and, if it had pierced the temporal lobe, he would potentially have had trouble processing sound, using his memory and understanding speech.

The point is that certain areas of the brain are involved in particular tasks. This can become apparent when someone has a stroke. Most strokes happen when the blood supply to an area of brain tissue is interrupted. The result can be, for example, loss of motor function on one side of the body. If there is a problem in one half of the brain, then the problem (when talking about motor function – moving an arm, say) manifests on the other side of the body. This is because the nerve fibres from each side of the brain cross over at a certain point before descending through the spinal cord to the rest of the body.

We know that the brain performs a vast range of functions. It allows us to move, to smell, to hear and to sense temperature. It also enables us to think. While this complexity is admirable, when it goes wrong the consequences can often be very distressing. Infection, head injury and cardiovascular disease all affect the brain’s health, as do conditions such as Parkinson’s disease, which affects the production of dopamine (a chemical normally used to help regulate our movement), resulting in involuntary shaking, slow movement and stiff muscles.

Most significant of all, as our population grows older, dementia is becoming the largest cause of mortality in the UK and one of the leading causes across the developed world. Research is ongoing and we still have a long way to go, both in understanding the processes involved and in treating the effects. It should be mentioned that dementia is not a single disease, but rather a term describing the symptoms that occur when there is a decline in brain function.

Alzheimer’s disease is the biggest cause of dementia. Though not fully understood, it is believed to be related firstly to the build-up of amyloid plaques and secondly to neurofibrillary tangles made up of a protein called tau. As more of these build up, the ability of the neurons in the brain to transmit information gradually diminishes. Research is currently focusing on the processes involved in the development of these two features. Just as importantly, the search is on for biomarkers (markers that we can sample in the blood or spinal fluid) that might give us an idea of whether someone is developing a dementing condition, giving a greater opportunity to take early steps to manage the condition and also to study disease progression over longer periods of time. Memory problems can be difficult to face and are often slow to present, but if you have any concerns it is important to see your GP: there is often support available, and the cause may turn out to be a more benign and treatable condition (for example, low vitamin B12 levels or an underlying infection).

When concentrating on the more physical workings of the brain, it is sometimes easy to overlook the deeper thought processes involved in our mental health. Much of our individuality comes from the environment in which we grow up. Just as we form new connections and synapses in our brains through repetition – as we learn an instrument or practise our times-tables, for example – it is thought that personality traits develop, to some extent, in the same way. For any number of reasons, however, our minds can be fragile, and depression and anxiety can be extremely damaging. Often there are so many different factors, both social and physical, that such emotional issues are difficult not only to treat but also to recognise. Chemical imbalance plays its part, for example in relation to levels of serotonin in the brain, and in such cases there can be a role for medication. More recently, there has been a push for increased awareness of mental health conditions in an attempt to remove the stigma attached to something that can cause a lot of problems if left unaddressed.

How do we look after our brains? Staying happy is a good start, and there is plenty of support available for people for whom this is not the case. Keeping your mind busy helps to maintain your ‘neural plasticity’ – it ensures you keep creating new synapses by learning new things. Maintaining healthy social networks is just as important.

Regular exercise is vital for brain health: it increases the blood supply to the neurons, reduces blood pressure, helps to balance blood sugar, improves cholesterol levels and reduces mental stress.

Getting enough sleep each night is important (8 hours being the aim).

Your diet can also give you benefits. Anything rich in omega-3, such as oily fish, is useful, and a ‘Mediterranean-style’ diet is a good start. Blueberries are rich in antioxidants, thought potentially to reduce the inflammation involved in plaque formation in the brain, while dark leafy greens, such as kale and spinach, are good sources of vitamins C and E and folate – all thought potentially to reduce the risk of Alzheimer’s.

There is some evidence to suggest that certain people may benefit from medications like statins and aspirin but it’s always a good idea to come in to discuss any medication with your GP or pharmacist. And don’t forget not to smoke or drink too much alcohol.

While we know more than we ever have, there is still a lot to learn, meaning that, in an ironic sense, the brain is still something we can’t fully get our heads around.


Medicine: A work in progress

Our job is to sift through all of the research that is carried out (and there is a lot) and utilise the research that makes sense.

As I sat down to write this article, my initial aim was to try to pinpoint the biggest advances in medicine over the past year. In doing so, I rather suspect I made a rod for my own back. It turns out that pinpointing specific advances that aren’t incredibly specialised and, frankly, mundane for the uninitiated is quite difficult. Headlines from certain sections of the media pronouncing grand new breakthroughs every other day would have you believe that vast strides are frequently made overnight. Talk of “miracle cures” and the like is all too common and, while grand discoveries may not in reality be as frequent as they were perhaps a century ago, this is not to say that dramatic advances are not being made. Rather, the process behind these advances is simply more gradual and far more intricate.

This led me to reflect upon one of the most interesting aspects of medicine, and I hope that by the end of this article I will have conveyed the numerous ways in which medicine, and the way we practise it, remains a work in progress.

Throughout the last century, people felt justifiably reassured by the steady advance of medical know-how. The twentieth century saw some incredible breakthroughs in the organisation of medical care, the understanding of disease and the implementation of effective treatments. (Antibiotics, public health, surgery, pharmaceuticals… the list is almost endless.) A lot of this was built on a new approach – evidence-based medicine – which I will come to later.

Even now, after all these advances, it is important to acknowledge that we don’t know everything and must constantly strive to improve and develop existing treatments as well as being on the lookout for new ones. Part of this involves adapting to changes in demand, which may vary from one decade to another. Thankfully, as you read this, that is exactly what many people are working on in order to stay up to date and push the boundaries to make treatments more effective. Not only is medical research important, it is – for better or for worse – big business. As such, a phenomenal amount of money is invested in research every year. In the UK alone, the Industrial Strategy Challenge Fund has set aside £146m of government money over the next four years for life sciences. Add to that the countless charities working on medical research, along with the pharmaceutical companies, and one can see how much activity there is in this field.

It is inevitable, therefore, that we see headlines almost daily about rumoured miracle treatments for this and that, and warnings about things to avoid that at first glance seem perfectly innocuous (burnt toast causing cancer, for example).

Our job is to sift through all of the research that is carried out (and there is a lot) and utilise the research that makes sense. Often this is done via panels that do that work for us and produce guidelines, though it must be said there is frequent disagreement amongst professionals about even these. Needless to say, there is considerable variation in the quality of research and some of it must be taken with a pinch of salt.

If we consider the development of a new drug, for example, one of the most important questions is naturally whether or not it is effective. To answer it, studies must be carried out to trial the drug on, ideally, as many people as possible in order to iron out statistical flukes. The longer the trial goes on the better, for the same reason. Add to that the complicated task of removing as much bias as possible from those carrying out the study, and one will find that, of the thousands of studies carried out each year, very few have enough statistical power to draw totally reliable conclusions.

Unfortunately, even the most unfounded conclusions end up as headlines. Here’s an example: “Tattoos could give you cancer, new research suggests”. This was based on a study in which four out of six donors were found to have ink particles in their lymph nodes at post mortem. There was no information about whether the donors had cancer or not. And yet, for many, that headline is enough. For this reason, we all have a responsibility to be wary about what we take from the news, no matter where it is published. It is so easy to fall foul of misinformation – even health ministers are not immune.

It is important to add that some studies, although they do not come up with firm conclusions, still add to the body of research out there. If people didn’t at least try to generate evidence, progress would be much slower. For example, last summer a UK study hit the headlines with its claim that the age-old notion of finishing a course of antibiotics may be outdated, suggesting that doing so actually contributes to antibiotic resistance. Quite rightly, the study did not sway official advice – to finish the course of antibiotics even if you begin to feel better before they run out – because the way it was carried out left too much scope for bias from the organisers. It did, however, raise the question, and will no doubt encourage further, more powerful studies in the future that will give us a better idea of what we should be doing.

So this is what I mean by an evidence-based approach, as mentioned earlier. It has become the cornerstone of modern medicine, and for good reason. While it may not have given us a list of show-stopping breakthroughs of late, it has given us a valuable and active research community that is perpetually coming up with improvements and suggestions, however large or small.

To finish, I must stress that the development of medicine is not just about medications and treatments. It is vital that we are able to use these treatments in the best and most effective way possible. Technological advances are becoming more prominent (artificial pancreases for type 1 diabetics and drones delivering medical supplies, for example) but, with current levels of demand and the well-documented pecuniary squeeze in mind, for me the biggest advance in 2017 has been the provision of locally available care. As hospitals come under more strain, a big drive is afoot to treat more people in the community through minor injury units, intermediate care and rapid assessment units. Having services like this makes a huge difference, and I must highlight what a positive these additions have been. The more people are aware of the services available, the easier it is for the health service to spread the load. After all, if the strain on our health services becomes too difficult to sustain even at the most basic level, it may be even harder to make the clinical breakthroughs of the future a reality.

The Problem with Antibiotics

Our honeymoon period with antibiotics and their undeniable benefits ended long ago, but since their inception we have created a deep-seated culture of dependence.

Many thanks for your responses to my article last time around – keep those suggestions for topics coming. Shortly after I had finished the article about the common cold, I developed a cold of my own, so I have decided to postpone indefinitely my planned article on smallpox.

Our topic this week is antibiotics – a subject which is garnering more and more attention in the media. Since 2015, there has even been an annual ‘World Antibiotic Awareness Week’ which, appropriately, was last week.

Why the fuss? Well, I am sure most people by now have heard all sorts of stories in the news about antibiotic resistance and the emergence of ominously named ‘superbugs’. This is all for good reason, as I will explain.

To begin with, let’s focus on antibiotics and what they actually are. Prior to their discovery and development in the first half of the twentieth century, we had no really effective ways of treating bacterial infections. Historically, all manner of approaches were used, from the rather dramatic process of blood-letting (thought to restore the balance of the perceived four humours: blood, phlegm, yellow bile and black bile) to the use of remedies like willow bark by the ancient Greeks for fevers and pains. (Willow bark actually contains salicin, which is chemically related to modern-day aspirin.)

Things all changed when the Scottish bacteriologist Alexander Fleming returned to his laboratory in 1928 after a family holiday and noticed that mould had grown in his petri dishes of staphylococci bacteria. The mould in question (Penicillium) had killed off the bacteria in the surrounding area, prompting Fleming’s famous response – ‘That’s funny’.

The rest, as they say, is history and since then many different families of antibiotics have been developed to fight off bacterial infections that had once been, at best, troublesome and, at worst, fatal. As we approach a century of antibiotic use, we can look back upon a vast improvement in our ability to treat infections such as pneumonia, syphilis, tuberculosis, meningitis and many more. This has no doubt had a vast social and economic impact. However, now we come to the problem.

Antibiotic resistance is a process that has been developing from the very beginning. In broad terms, let us consider a group of bacteria exposed to an antibiotic. In any reproducing population, there will always be random mutations that occur in the genes of certain individual bacterial cells. Sometimes these mutations happen to protect the bacteria from the effects of an antibiotic. Bacteria without that protection die, leaving the resistant bacteria free to multiply without competition. Over time, these populations spread from person to person, meaning that, when the same antibiotic is used repeatedly, it becomes less and less effective in controlling these bacteria. That’s it in a nutshell.

We are now at a stage where no new class of antibiotic has been discovered since 1987 and there are thought to be around 12,000 deaths each year in the UK as a result of bacteria resistant to antibiotic treatment. If this trend continues without further action, the World Health Organisation (WHO) warns that global mortality from such infections could reach 10 million people a year by 2050. Achievements of modern medicine such as chemotherapy, organ transplants and routine operations like caesarean sections and hip replacements – all of which rely heavily on the availability of effective antibiotics – are now potentially at risk.

The development of resistance is, and always was, a natural and unavoidable process, but the way we use antibiotics has unequivocally made things worse than they could have been. In 2015, it is thought that around 25% of antibiotic courses taken in the UK were unnecessary. When you factor in unregulated use of antibiotics in farming and the availability of antibiotics over the counter in some countries, one begins to see how much of a global issue this is.

On a personal note, I have certainly seen strikingly inappropriate use of strong antibiotics prescribed in other countries for even the most trivial of ailments. There is most definitely a responsibility on us as healthcare professionals to monitor what we are prescribing. Having said that, surveys suggest that up to 90% of GPs have experienced pressure from patients to prescribe antibiotics even when this was not appropriate and would serve no purpose. While this obviously differs from area to area (and, to be fair, you’re a pretty good bunch), we all share a certain responsibility in tackling this issue.

I don’t want to sound too gloomy, and thankfully there has been some international recognition of the issue. The WHO endorsed a global action plan in 2015 (though lamentably it will now have to make do without the help of Robert Mugabe) and since then 193 countries have given further political endorsement via the UN to introduce tighter regulation and encourage further research into new antibiotic classes.

As is often the case with such gradual phenomena, the effects of a crisis like this are not always immediately apparent. In this case, however, the signs have been there for a long time – Fleming himself warned about the potential for resistance. Now those signs are becoming ever more obvious and we must face up to the inconvenient truth. We stand to lose a lot if we refuse to do so.

Hygiene, both in the community and in hospitals, is vital to prevent the spread of bacteria. Responsible and restrained prescribing by doctors, both here and all over the world, is also required. Research into new antimicrobial agents is ongoing but slow, and techniques to bolster our existing agents are important for the short-term management of the more serious infections. Crucially, educating people as to why it is often inappropriate to prescribe an antibiotic is just as important – after all, we’re all in this together.

Our honeymoon period with antibiotics and their undeniable benefits ended long ago, but since their inception we have created a deep-seated culture of dependence. This will be difficult to withdraw from, especially considering the advances we have built around it. Over the coming years, we must consider whether an even more dramatic shift in how we use these medicines is required before nature takes the matter out of our hands.