I’m writing this waiting for the drizzle to clear so I can go to the apiary and make up some nucs for swarm control. Without implementing some form of swarm control it’s inevitable that my large colonies will swarm 1.
Swarming is an inherently risky process for a colony. Over 75% of natural swarms perish, often because they do not build up strongly enough to overwinter successfully.
As a mechanism for reproduction swarming is somewhat unusual in that the intact colony is split into two not fully functional ‘halves’ 2.
By not fully functional I mean that neither the swarmed colony, nor the swarm are guaranteed to survive.
The swarmed colony lacks a queen, but has ample stores.
The swarm has a queen but has only the stores carried in the bellies of the workers.
The swarmed colony needs to rear a new queen. The swarm needs to find a new nest site, move there, build comb, rear brood, forage etc.
That seems like the very opposite of intelligent design, but it’s the way evolution has made things work. Consequently, swarming involves a whole range of compromises and quick fixes.
One of these involves the memory of worker bees, which is what this post is about.
A range of events within the hive – which for reasons that will become obvious I will term the original nest site – trigger the urge to swarm. I discussed some of these when covering swarm prevention. Swarming is then essentially a two-stage process.
The two-stage process of swarming
The first stage is the swarm leaving the original nest site and establishing a bivouac nearby. This is the classic cluster of bees hanging from a branch.
The bivouac sends out scout bees to search the nearby area for potential new nest sites. After ‘discussion’ between the scouts (comprehensively covered by Thomas Seeley in Honeybee Democracy) they reach a consensus on the best site.
The second stage is the relocation of the bivouacked colony to the new nest site. For example, this could be the church tower, a hollow tree or a bait hive. This site is likely to be within a few hundred metres of the original nest site, but can be further away.
All of which should raise some questions in the minds of beekeepers who are familiar with the “less than 3 feet or more than 3 miles” rule.
Have these bees not read the rules?
If you want to move a hived colony of bees you’ll often be told, or have read, that you need to move them either less than 3 feet or more than 3 miles.
Worker bees have a foraging range of about 3 miles. Within this range they have an uncanny ability to return to the hive location using features of the landscape to orientate themselves. The ‘final approach’ uses scent from the hive entrance.
Therefore, if you move a colony 3 feet they’ll still find the general location using landscape features, and then orientate to the hive entrance using scent.
If you move a colony 10 miles away everything is new to them and they’ll embark on some orientation flights to learn the new landscape features.
But if you move the colony a mile they’ll use the landscape features to return to the site of the original hive … to find it gone 🙁 3
Swarms break all these rules.
The bivouac is (in my experience) always more than 3 feet from the hive entrance. If the scout bees make the choice (e.g. selecting a bait hive to occupy), the swarm always relocates to a new nest site less than three miles from the site it left 4.
And a beekeeper who drops a bivouacked colony into a skep can move it wherever she wants, even back to the same hive stand it recently vacated.
If the swarm followed the rules, the majority of the workers would return from the bivouacked swarm to the original nest site.
At least they would if they had orientated to the original nest site in the first instance.
Are the bees naive?
The first half of a worker bee’s life is spent inside the hive building comb, nursing larvae or cleaning cells. Only then does she venture out; roughly the second half of her life is spent as a forager collecting water, pollen or nectar.
Therefore, one possibility is that the bees present in a swarm have no knowledge of the hive location because they’ve never before left the hive.
We know that the proportion of workers that leave the colony when it swarms is about 75%. This has been determined in a number of independent studies and is remarkably consistent, irrespective of the size of the colony that swarms.
If 75% of the workers leave the colony when it swarms it is mathematically impossible for the swarm not to include older foragers (assuming the laying rate of the queen is steady).
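The arithmetic is easy to check. Assuming (purely for illustration) that foragers make up about half the adult workers, a minimal sketch shows why a 75% exodus must take foragers with it:

```python
# Back-of-envelope sketch. The forager fraction is an assumption for
# illustration; the 75% swarm fraction is from the studies cited above.
forager_fraction = 0.5   # assumed: ~half the adult workers are foragers
swarm_fraction = 0.75    # measured: ~75% of the workers leave with the swarm

# Even if every single non-forager joined the swarm, the remainder of the
# swarm must be made up of foragers.
non_foragers = 1 - forager_fraction                    # 0.5 of the colony
min_foragers_in_swarm = swarm_fraction - non_foragers  # 0.25 of the colony

print(f"At least {min_foragers_in_swarm / swarm_fraction:.0%} of the swarm must be foragers")
```

So even on the most generous assumptions, at least a third of the swarm has foraged before, and has therefore already orientated to the original nest site.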
In fact, we don’t need to resort to any underhand mathematics as the age classes of bees in a swarm have been measured. I’ve discussed this before when comparing natural and artificial swarms.
Age distribution of bees in swarms
The median age of adult bees in the hive is 19 days. The median age of bees in a swarm is 10 days. Therefore swarms do contain younger bees, but not exclusively so.
One of the reasons for this bias towards younger bees must be the relatively short lifespan of foragers. Many of the older bees in the swarm will have perished long before the new brood laid by the queen emerges.
The bivouacked swarm doesn’t dwindle in size as the older foragers drift back to the original nest site. Other than a few hundred scout bees, the majority of the bivouacked swarm huddle together to protect the queen, buried somewhere in the centre, from the elements.
They don’t fly or forage … they’re waiting for the signal from the scout bees that a new nest site has been located.
And, once they relocate to the church tower, the hollow tree or a bait hive, the older foragers stay in the new nest location. It’s as though the bees in a swarm that previously knew where the original nest site was have amnesia.
And this makes sense. If they did return to the original nest site the swarm (whether bivouacked or relocated) would shrink in size and it’s chances of surviving would be severely diminished. Other than a full belly of honey a swarm can rely on nothing. They need as many bees as possible to take on all the roles needed to establish a new colony – comb builders, nurses, foragers etc.
But have they really forgotten the original nest site?
It turns out that swarms do retain a memory of their original nest site.
In 1993 Gene Robinson and colleagues demonstrated that a swarm shaken out from its new nest site preferentially returns to the original nest site, rather than to an equidistant alternate 5.
This ability must rely on the memory of the foragers in the swarm. Therefore it is likely to be lost in a relatively short time (days, not weeks), for two reasons 6.
Firstly, the foragers will be busy reorienting to the new nest site, effectively overwriting the memory of the original nest location. In good weather this takes just a couple of days.
Secondly, these ageing bees don’t have long to live, so there will be ever-decreasing numbers of them to lead a shaken out swarm back to the original location.
Rain stops play
Sometimes the bivouacked colony never relocates to a new nest site. Either the scouts never achieve a consensus or – more likely – bad weather forces the swarm to hunker down.
When you hive a bivouacked swarm you will often find a small crescent or two of new wax on the branch they were clinging to. If the bees are trapped by bad weather I think this comb building simply continues. It’s not unusual to find comb in hedgerows near apiaries, built by swarms that never relocated and ended up trying to make a new nest where they were.
Natural comb …
What does the memory – or lack of it – of swarms mean for practical beekeeping?
The (temporary) amnesia of swarms means you can collect a bivouacked swarm and move it wherever you want. A swarm that relocates to your bait hive can also be moved, but don’t wait too long. Within just a few days of a swarm arriving the bees will have reoriented to their new location. I always try to move bait hives to their final location within three days of a swarm appearing.
The drizzle stopped and I spent the entire day finding queens and making up nucs.
Note to self … a super-strong colony with no queen cells, wall-to-wall brood and no very young larvae or eggs probably has a faulty queen excluder 🙁
Second note to self … Sod’s law dictates that the colony with the faulty queen excluder probably has supers filled with drone comb 🙁
Almost every article or review on chronic bee paralysis virus 1 starts with a reference to Aristotle describing the small, black, hairless ‘thieves’, which he observed in the hives of beekeepers on Lesbos over 2300 years ago 2.
Although Aristotle was a great observer of nature, he didn’t get everything right.
And when it came to bees, he got quite a bit wrong.
He appreciated the concept of a ‘ruling’ bee in the hive, but thought that the queen was actually a king 3. He also recognised different castes, though he thought that drones (which he said “is the largest of them all, has no sting and is stupid”) were a different species.
He also reported that bees stored noises in earthenware jars (!) and carried stones on windy days to avoid getting blown away 4.
However, over subsequent millennia, a disease involving black, hairless honey bees has been recognised by beekeepers around the world, so in this instance Aristotle was probably correct.
Little blacks, maladie noire, schwarzsucht
The names given to the symptomatic bees or the disease include little blacks or black robbers in the UK, mal nero in Italy, maladie noire in France and schwarzsucht (black addiction) in Germany. Sensibly, the Americans termed the disease hairless black syndrome. All describe the characteristic appearance of individual diseased bees.
Evidence that the disease had a viral aetiology came from Burnside in the 1940s, who demonstrated that the symptoms could be recapitulated in caged bees by injecting, feeding or spraying them with bacteria-free extracts of paralysed bees. Twenty years later, Leslie Bailey isolated and characterised the first two viruses from honey bees. One of these, chronic bee paralysis virus (CBPV), caused the characteristic symptoms described first by Aristotle 5.
CBPV is the causative agent of chronic bee paralysis (CBP).
CBPV infection is reported to present with two different types of symptoms, or syndromes. The first is the hairless, black, often shiny or greasy-looking bees described above 6. The second is more typically abnormal shivering or trembling of the wings, often associated with abdominal bloating 7. These bees are often found on the top bars of the frames during an inspection. Both symptoms can occur in the same hive 8.
CBP onset appears rapid and the first thing many beekeepers know about it is a large pile (literally handfuls) of dead bees beneath the hive entrance.
It’s a distressing sight.
Despite thousands of bees succumbing to the disease, the colony usually survives, though it may not build up strongly enough again to overwinter successfully.
Until recently, CBP was a disease most beekeepers rarely actually encountered.
Emerging and re-emerging disease
I’ve got a few hundred hive years’ worth 9 of beekeeping experience but have only twice seen CBP in a normally-managed colony. One was mine; another was in my association apiary a few years later.
A beekeeper managing 2 to 3 colonies might well never see the disease.
A bee farmer running two or three hundred (or thousand) colonies is much more likely to have seen the disease.
As will become clear, it is increasingly likely for bee farmers to see CBP in their colonies.
Virologists define viral diseases as emerging if they are new in a population. Covid-19, or more correctly SARS-CoV-2 (the virus), is an emerging virus. A disease is termed re-emerging if it is already known but increasing in incidence.
Ebola is a re-emerging disease. It was first discovered in humans in 1976 and caused a few dozen sporadic outbreaks 10 until the 2013-16 epidemic in West Africa which killed over 11,000 people.
Often the terms are used interchangeably.
Sporadic and rare … but increasing?
Notwithstanding the apparently sporadic and relatively rare incidence of CBP in the UK (and elsewhere; the virus has a global distribution) anecdotal evidence suggested that cases of disease were increasing.
In particular, bee farmers were reporting increasing numbers of hives afflicted with the disease, and academic contacts overseas involved in monitoring bee health also reported increased prevalence.
Something can be rare but definitely increasing if you’re certain about the numbers you are dealing with. If you only have anecdotal evidence to go on you cannot be certain about anything very much.
If the numbers are small but not increasing there are probably other things more important to worry about.
However, if the numbers are small but definitely increasing you might have time to develop strategies to prevent further spread.
Far better you identify and define an increasing threat before it increases too much.
With research grant support from the UKRI/BBSRC (the Biotechnology and Biological Sciences Research Council) to the Universities of Newcastle (Principal Investigator, Prof. Giles Budge) and St Andrews, and additional backing from the BFA (Bee Farmers’ Association), we set out to determine whether CBPV really was increasing and, if so, what the increase correlated with (if anything).
This component of the study, entitled Chronic bee paralysis as a serious emerging threat to honey bees, was published in Nature Communications last Friday (Budge et al., Nat. Comms. 11:2164 https://doi.org/10.1038/s41467-020-15919-0).
The paper is Open Access and can be downloaded by anyone without charge.
There are additional components of the study involving the biology of CBPV, changes in virus virulence, other factors (e.g. environmental) that contribute to disease, and ways to mitigate and potentially treat disease. These are all ongoing and will be published when complete.
Is chronic bee paralysis disease increasing?
We ‘mined’ the National Bee Unit’s BeeBase database for references to CBPV, or to the symptoms associated with CBP disease. The data in BeeBase reflects the thousands of apiary visits, either by call-out or at random, by dedicated (and usually overworked) bee inspectors. In total we reviewed almost 80,000 apiary visits in the period from 2006 to 2017.
There were no cases of CBPV in 2006. In the 11 years from 2007 to 2017 the CBP cases (recorded symptomatically) in BeeBase increased exponentially, with almost twice as much disease reported in commercial apiaries. The majority of this increase in commercial apiaries occurred in the last 3 years of data surveyed.
Apiaries recorded with chronic bee paralysis between 2006 and 2017.
BeeBase covers England and Wales only. By 2017 CBPV was being reported in 80% of English and Welsh counties.
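For readers who like to see what an ‘exponential increase’ looks like numerically, here is a minimal sketch of the sort of log-linear fit used to estimate an annual growth rate. The case counts below are invented for illustration; they are emphatically not the BeeBase figures:

```python
import math

# Hypothetical annual case counts (NOT the BeeBase data), growing ~1.5x/year
years = list(range(2007, 2018))
cases = [1, 1, 2, 3, 5, 7, 11, 17, 26, 40, 62]

# Least-squares fit of log(cases) = a + b*x; exp(b) is the annual growth factor
xs = [y - years[0] for y in years]
ys = [math.log(c) for c in cases]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)

print(f"Estimated annual growth factor: {math.exp(b):.2f}")
```

A growth factor comfortably above 1 on the log scale is what distinguishes genuine exponential spread from year-to-year noise.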
During the same period several other countries (the USA, several in Europe and China) have also reported increases in CBPV incidence. This looks like a global trend of increased disease.
But is this disease caused by CBPV?
It should be emphasised that BeeBase records symptoms of disease – black, hairless bees; shaking/shivering bees, piles of bees at the hive entrance etc.
How can we be sure that the reports filed by the many different bee inspectors 11 are actually caused by chronic bee paralysis virus?
Or indeed, any virus?
To do this we asked bee inspectors to collect samples of bees with CBPV-like symptoms during their 2017 apiary visits. We then screened these samples with an exquisitely sensitive and specific qPCR (quantitative polymerase chain reaction) assay.
Almost 90% of colonies that were symptomatically positive for CBP were also found to have very high levels of CBPV present. We are therefore confident that the records of symptoms in the historic BeeBase database really do reflect an exponential increase of chronic bee paralysis disease in England and Wales since 2007.
Interestingly, about 25% of the asymptomatic colonies also tested positive for CBPV. The assay used was very sensitive and specific and allowed the quantity of CBPV to be determined. The amount of virus present in symptomatic bees was 235,000 times higher than in those without symptoms.
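For those unfamiliar with qPCR, fold differences like this fall out of the cycle threshold (Ct), the PCR cycle at which the target is first detected. Each cycle roughly doubles the template, so a difference of n cycles corresponds to a 2^n fold difference. The Ct values below are hypothetical, chosen only to show how a ~235,000-fold difference arises:

```python
# Hypothetical Ct values for illustration only (not figures from the paper).
ct_symptomatic = 12.0     # lots of template: virus detected early
ct_asymptomatic = 29.84   # little template: detected ~18 cycles later

# Each PCR cycle doubles the target, so the fold difference is 2**delta_ct
delta_ct = ct_asymptomatic - ct_symptomatic
fold_difference = 2 ** delta_ct

print(f"~{fold_difference:,.0f}-fold more virus in symptomatic bees")
```

A gap of under 18 cycles is routine for a qPCR assay, which is why such enormous fold differences can be measured reliably.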
Further work will be needed to determine whether CBPV is routinely present in similar proportions of ‘healthy’ bees, and whether these go on and develop or transmit disease.
Using the geospatial and temporal (where and when) data associated with the BeeBase records we investigated whether CBPV symptomatic apiaries were clustered.
For example, in any year were cases more likely to be near other cases?
Across all years of data analysed together, or for individual years, there was good evidence for spatial clustering of cases.
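The formal analysis in the paper is considerably more sophisticated, but the underlying idea can be sketched with a toy Monte Carlo test: compare the mean nearest-neighbour distance between case apiaries with that of the same number of points scattered at random. All the coordinates below are invented:

```python
import random

random.seed(42)

def mean_nn_distance(points):
    """Mean distance from each point to its nearest neighbour."""
    total = 0.0
    for i, (x1, y1) in enumerate(points):
        nearest = min(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                      for j, (x2, y2) in enumerate(points) if i != j)
        total += nearest
    return total / len(points)

# Hypothetical case apiaries (km): two tight clusters of ten cases each
cases = ([(1 + random.random(), 1 + random.random()) for _ in range(10)]
         + [(8 + random.random(), 8 + random.random()) for _ in range(10)])
observed = mean_nn_distance(cases)

# Null model: 20 apiaries scattered uniformly over the same 10 km square
null = [mean_nn_distance([(10 * random.random(), 10 * random.random())
                          for _ in range(20)]) for _ in range(200)]
p = sum(d <= observed for d in null) / len(null)

print(f"Observed mean NN distance {observed:.2f} km, p ~ {p:.3f}")
```

An observed distance much smaller than almost every random simulation is evidence of spatial clustering.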
We also looked at whether cases in one year clustered in the same geographic region in subsequent years.
They did not.
Clustering of CBPV – spatial and temporal analysis.
This was particularly interesting. It appears as though there were increasing numbers of individual clustered outbreaks each year, but that the clusters were not necessarily in the same geographic region as those in previous or subsequent years.
The disease appears somewhere, increases locally and then disappears again.
Apiary-level disease risk factors
The metadata associated with BeeBase records is relatively sparse. Details of specific colony management methods are not recorded. Local environmental factors – OSR (oil-seed rape), borage, the June gap etc. – are also missing. Inevitably, some of the factors that may be associated with increased risk are not recorded.
A relatively rare disease that is spatially but not temporally clustered is a tricky problem for which to define risk factors. Steve Rushton, the senior author on the paper, did a sterling job of analysing the data that was available.
The two strongest apiary-level factors that contributed to disease risk were:
Commercial beekeeping – apiaries run by bee farmers had a 1.5 times greater risk of recording CBP disease.
Importing bees – apiaries which had imported bees in the two preceding years had a 1.8 times greater risk of recording CBP disease.
Bee farming is often very different from amateur beekeeping. The colony management strategies are altered for the scale of the operation and for the particular nectar sources being exploited. For example, colonies may already be booming to exploit the early season OSR. This may provide ideal conditions for CBPV transmission which is associated with very strong hives and/or confinement.
Bee imports does not mean disease imports
There are good records of honey bees imported through official channels. This includes queens, packages and nucleus colonies. Between 2007 and 2017 there were over 130,000 imports, 90% of which were queens.
An increased risk of CBP disease in apiaries with imported bees does not mean that the imported bees were the source of the disease.
With the data available it is not possible to distinguish between the following two hypotheses:
imported honey bees are carriers of CBPV or the source of a new more virulent strain(s) of the virus, or
imported honey bees are susceptible to CBPV strain(s) endemic in the UK which they were not exposed to in their native country.
There are ways to tease these two possibilities apart … which is obviously something we are keen to complete.
All publicity is good publicity …
… but not necessarily accurate publicity 🙁
We prepared a press release to coincide with the publication of the paper. Typically this is used verbatim by some reporters whereas others ask for an interview and then include additional quotes.
Some more accurately than others 🙁
The Times, perhaps reflecting the current zeitgeist, seemed to suggest a directionality to the disease that we certainly cannot be sure of:
Its sister publication, The Sun, “bigged it up” to indicate – again – that bees are being wiped out.
And the comments included these references to the current Covid-19 pandemic:
“Guess its beevid – 19. I no shocking”
“It’s the radiation from 5g..google it”
“Local honey is supposed to carry antibodies of local virus and colds – it helps humans to eat the stuff or so they say. So it could be that the bees are actually infected by covid. No joke.“
All of which I found deeply worrying, on a number of levels.
The Telegraph also used the ‘wiped out’ reference (not a quote, though it looks like one). They combined it with a picture of – why am I not surprised? – a bumble bee. D’oh!
The Daily Mail (online) had a well-illustrated and pretty extensive article but still slipped in “The lethal condition, which is likely spread from imports of queen bees from overseas …”. The unmoderated comments – 150 and counting – repeatedly refer to the dangers of 5G and EMFs (electric and magnetic fields).
I wonder how many of the comments were posted from a mobile phone on a cellular data or WiFi network?
CBPV is causing increasing incidence of CBP disease in honey bees, both in the UK and abroad. In the UK the risk factors associated with CBP disease are commercial bee farming and bee imports. We do not know whether similar risk factors apply outside the UK.
Knowing that CBP disease is increasing significantly is important. It means that resources – essentially time and money – can be dedicated knowing it is a real issue. It’s felt real to some bee farmers for several years, but we now have a much better idea of the scale of the problem.
We also know that commercial bee farming and bee imports are both somehow involved. How they are involved is the subject of ongoing research.
Practical solutions to mitigate the development of CBP disease can be developed once we understand the disease better.
I am an author on the paper discussed here and am the Principal Investigator on one of the two research grants that fund the study. Discussion is restricted to the published study, without too much speculation on broader aspects of the work. I am not going to discuss unpublished or ongoing aspects of the work (including in any answers to comments or questions that are posted). To do so would compromise our ability to publish future studies and, consequently, jeopardise the prospects of the early career researchers in the Universities of St Andrews and Newcastle who are doing all the hard work.
This work was funded jointly by BBSRC grants BB/R00482X/1 (Newcastle University) and BB/R00305X/1 (University of St Andrews) in partnership with The Bee Farmers’ Association and the National Bee Unit of the Animal and Plant Health Agency.
It’s late May. Outside it’s dark, so you’re trapped inside until sunrise. Inside it’s warm, dark and humid. You and your sisters are crowded together with barely enough space to turn around.
And your mother keeps laying more eggs … perhaps 2000 a day. If it wasn’t for the fact that about 2000 of your sisters perish each day you’d have no space at all.
Most of them die out in the fields. Missing in action.
I counted them all out and I didn’t count them all back, as the late Brian Hanrahan did not say in 1982 😉
But some die inside. And in the winter, or during prolonged periods of poor weather, your sisters all die inside.
Which means there’s some housekeeping to do.
Bring out your dead
Dead bees accumulating in the hive are a potential source of disease, particularly if they decompose. Unless these are removed from the colony there’s a chance the overall health of the colony will be threatened.
Not all bees die of old age. Many succumb to disease. The older bees in the colony may have a higher pathogen load, reinforcing the importance of removing their corpses before disease can spread and before the corpses decompose.
Honey bees, like many other social insects, exhibit temporal polyethism, i.e. they perform different tasks at different ages.
One of the tasks they perform is removing the corpses from the colony.
The bees that perform this task are appropriately termed the undertaker bees.
Gene Robinson at Cornell conducted observational studies on marked cohorts of bees, in which he identified the roles and activities of the undertaker bees. At any one time only 1-2% of the bees in the colony are undertakers 1.
These are ‘middle aged’ bees, i.e. 2-3 weeks after eclosion, similar in age to guard bees. Although called undertakers, they do not exclusively remove corpses. Rather, they are generalists that are simply more likely to remove corpses, usually depositing them 50-100 m from the hive and then returning.
They preferentially occupy the lower regions of the hive – presumably because gravity means the corpses accumulate there – where they also perform general hive cleansing roles e.g. removing debris.
Bees, like all of us, are getting older all the time. Some bees may spend only one day as undertakers before moving on to foraging duties. Presumably – I don’t think we know this yet – the time a bee remains as an undertaker is influenced by the colony’s need for this activity, the laying rate of the queen and, possibly, the numbers of other bees performing this role 2.
No no he’s not dead, he’s, he’s restin’!
In Monty Python’s Dead Parrot sketch Mr. Praline (John Cleese) argues with the shop owner (Michael Palin) that the Norwegian Blue parrot he’d purchased was, in fact, dead.
The shop owner tries to persuade Mr. Praline that the parrot is resting.
Or pining for the fjords.
The inference here is that it’s actually rather difficult to determine whether something is dead or not 3.
So if you struggle with an unresponsive parrot how do you determine if a bee is dead?
More specifically, how do undertaker bees in a dark, warm, humid hive determine that the body they’ve just tripped over is a corpse?
Almost forty years ago Kirk Visscher at Cornell studied necrophoresis (removal of the dead) in honey bees 5.
He noted that it had two distinct characteristics: it happened rapidly (up to 70 times faster than debris removal), and dead bees that were solvent-washed or coated in paraffin wax were removed very much more slowly.
Kirk Visscher concluded that the undertaker bees “probably use chemical cues appearing very rapidly after the death of a bee” to identify the corpses.
Visscher studied honey bees, Apis mellifera. I’m not aware of any recent studies in A. mellifera that have better defined these ‘chemical cues’. However, a very recent preprint posted on bioRχiv describes how undertakers of the closely related Eastern honey bee, Apis cerana, identify the dead.
As an aside, bioRχiv (pronounced bioarkive) is a preprint server for biology. Manuscripts published there have not been peer reviewed and will potentially be revised and/or withdrawn. They might even be wrong. Many scientists increasingly use bioRχiv to post completed manuscripts that have been submitted for publication elsewhere. The peer review and publication process is increasingly tortuous and long-winded. By posting preprints on bioRχiv other scientists can read and benefit from the study well before full publication elsewhere.
It’s also used as a ‘marker’ … we did this first 😉
Death recognition in honey bees is rapid. Visscher demonstrated that a dead worker bee was usually removed within 30 minutes, well before it would have started producing the pong associated with the processes of decay.
Corpse recognition occurs in the dark and in the presence of lots of other bees, so both visual and tactile signals are unlikely candidates. Logically, an odour of some sort might be used for identification.
In searching for the odour or chemical cues (the term used by Visscher), Wen Ping made some assumptions based on prior studies in social insects. In Argentine ants a reduction in dolichodial and iridomyrmecin (a dialdehyde and a monoterpene, respectively) is associated with corpse recognition, and addition of these compounds prevented necrophoresis.
Conversely, some social insects produce signals associated with death or disease. Dead termites give off a mix of 3-octanone and 3-octanol, and the combination of β-ocimene and oleic acid is a marker of diseased brood in honey bees.
What else could be assumed about the chemicals involved? Corpse removal is an individual effort. There’s only one pallbearer. Therefore the chemical, whatever it is, doesn’t need to be a recruitment signal (unlike the alarm pheromone for example).
Finally, the signal needs to operate over a very short range. There’s no point in flooding the hive with a persistent long-range chemical as that would make the detection of the corpse impossible.
Cuticular hydrocarbons (CHCs) are widely used in insect communication. They are long chain hydrocarbons (chemicals composed solely of carbon and hydrogen) that have many of the characteristics expected of a ‘death chemical’.
Nonacosane – a long chain CHC with 29 carbons and 60 hydrogen atoms
They are generally short-range, low volatility compounds. Honey bees use CHCs for communication during the waggle dance, and guard bees use them to distinguish colony mates from intruders. They also have structural roles, being a major component of wax comb, and, in the cuticle, they help maintain water balance in bees.
As would be expected of chemicals with such a wide variety of roles, there’s a huge range of CHCs. Taking all the above together, Wen Ping searched for CHCs that function during necrophoresis.
Cool corpses and cuticular hydrocarbons
Wen studied undertakers removing segments of dead bees and determined that the chemical signal was most probably a component of the cuticle.
Living bees in his studies had a body temperature of ~44°C. In contrast, dead bees rapidly cooled to ambient temperature. Wen demonstrated that corpse removal was significantly delayed if the corpses were warmed to ~44°C, but then occurred rapidly once they were allowed to cool. Finally, dead bees washed with hexane (which removes CHCs) were removed even if the corpse was warm.
Taken together, these results suggest that a cuticular hydrocarbon that was produced and released from warm bees, but reduced or absent in cold bees, was a likely candidate for the necrophoresis signal.
But which one?
A gas chromatograph separates and analyses volatile compounds. Essentially, vaporised sample is carried through a long, thin, coated tube; different compounds interact with the coating to different extents and so elute at different times. It’s a very precise technique that allows the components of a mixture to be identified by comparison with known standards.
Gas chromatography of volatiles from live (red) and dead (blue) bees.
Wen studied the volatile CHCs in the airspace immediately surrounding dead bees or live bees using gas chromatography. There were some significant differences, shown by the absence of peaks in the blue trace of gases from the cold, dead bees. All of the peaks were identified, and nine of the twelve were CHCs.
CHCs with chain lengths of 27 or 29 carbons exhibited the greatest difference between warm live bees and cool dead bees. Synthetic versions of these and the other CHCs were then tested to see which, upon addition, delayed the removal of dead bees.
Three had a significant impact in the dead bee removal assay, with chain lengths of 21, 27 and 29 carbons. These include heptacosane (C27H56) and nonacosane (C29H60).
The results section rather fizzles out in the manuscript posted to bioRχiv and I wouldn’t be surprised to see modifications to this part of the paper in a peer reviewed submission.
The overall story can be summarised like this. Live bees are warm and produce a range of CHC’s. Dead bees cool rapidly and some of the volatile CHC levels decrease in the immediate vicinity of the corpse. The undertaker bees specifically monitor the levels of (at least) heptacosane and nonacosane 6 as a means of discriminating between live and dead bees. Within 30 minutes of death local heptacosane and nonacosane levels have dropped below a level associated with life and the undertaker bee removes the corpse.
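My mental model of this (a toy of my own, not anything from the preprint) is a simple threshold: CHC release stops at death, the local concentration disperses, and ‘alive’ is just ‘CHC level above a cutoff’. The half-life and threshold below are invented:

```python
import math

# Toy model: CHC release stops at death and the local level then decays
# exponentially as the compounds disperse. All numbers are invented.
def chc_level(minutes_since_death, half_life_min=10.0):
    """Relative local CHC concentration (1.0 at the moment of death)."""
    return math.exp(-math.log(2) * minutes_since_death / half_life_min)

THRESHOLD = 0.15  # hypothetical 'smells alive' cutoff

def looks_dead(minutes_since_death):
    return chc_level(minutes_since_death) < THRESHOLD

# A very fresh corpse still 'smells alive'; half an hour later it does not
print(looks_dead(5), looks_dead(30))
```

On these (invented) numbers the corpse crosses the threshold within half an hour, which is at least consistent with Visscher’s observed removal times.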
One final point worth making again. This study was conducted on Apis cerana. Our honey bees, A. mellifera, may use the same necrophoresis signals. Alternatively, they might use different chemicals in the same way.
Or they might do something else entirely.
Personally, I bet it’s a similar mechanism, potentially using different chemicals.
There are mixed species colonies of A. mellifera and A. cerana. Do the undertakers only remove same-species corpses?
Global warming and hive cooling
The discussion of the bioRχiv paper raises two interesting points, both of which are perhaps a little contrived but still worth mentioning.
We’re living in a warming world.
Temperatures are rising
As dead bees cool to ambient temperature the release of CHCs falls. If global temperatures rise, so will the ambient temperature, potentially reducing the drop in CHC levels i.e. the dead bees might not look (er, smell!) quite so dead. This could slow corpse removal, with the concomitant potential for increased pathogen exposure.
I suspect that we’ll have much bigger problems to worry about than undertaker bees if the global temperatures rise that high …
But Wen also points out that rising global temperatures are associated with more extreme weather, including very cold spells. Perhaps cold-anaesthetised or weak bees will be prematurely removed from the hive under these conditions because their CHC levels have dropped below a critical threshold?
Finally, do dead bees lying on open mesh floors (OMFs) cool more rapidly and so trigger more efficient undertaking? Perhaps OMFs contribute more to hive hygiene than just allowing unwanted Varroa to drop through?
Even the most careful hive manipulations sometimes result in bees getting rolled between frames, or worse, crushed when reassembling the hive. Some beekeepers clip one wing of the queen to reduce the chance of losing a swarm, or uncap drone brood in the search for Varroa.
All of these activities can cause temporary or permanent damage, or may even kill, bees. A careful beekeeper should try and minimise this damage, but have you ever considered whether these damaged bees suffer pain?
Before considering the scientific evidence it’s important to understand the distinction between the detection of, for example, tissue damage and the awareness that the damage is painful and causes suffering.
Detection is a physiological response present in most animal species; the pain associated with it may not be.
What is pain?
Tissue damage, through chemical, mechanical or thermal stimuli, triggers a signal in the sensory nervous system that travels along nerve fibres to the brain. Or to whatever the animal has that serves as the equivalent of the brain 1.
This response is termed nociception (from the Latin nocēre, meaning ‘to harm’) and has been recorded in mammals, other vertebrates and in all sorts of invertebrates including leeches, worms and fruit flies. It has presumably evolved to detect damaging stimuli and to help the animal avoid or escape them.
But nociception is not pain.
Pain is a subjective experience that may result from the nociceptive response and can be defined as ‘an aversive sensation or feeling associated with actual or potential tissue damage’.
Most humans, being sentient, experience pain following the triggering of a nociceptive response and, understandably, conflate the two.
But they are separate and distinct. How do we know? Perhaps the first hint is that different people experience different levels of pain following the same harmful experience; an excruciatingly painful experience for one might be ‘just a scratch’ to another.
‘Tis but a scratch
With people it’s easy to demonstrate the distinction between nociception and pain – you simply ask them.
Can you feel that?
Does that hurt?
For the same stimuli you may receive a range of answers to the second question, depending upon their subjective experience of pain.
But you cannot ask a leech, or a worm or a fruit fly or – for the purpose of this post – a bee, whether a particular stimulus hurts.
Well, OK, you can ask but you won’t get an answer 😉
You can determine whether they ‘feel’ the stimulus. Since this is a simple physiological response you can measure all sorts of features of the electrical signal that passes from the nociceptors (the receptors in the tissue that detect damaging events) through the nerve fibres to the brain. This involves electrophysiology, a well-established experimental science.
But how can we determine whether animals feel pain?
What do you do when you have a bad headache?
You take a painkiller – an aspirin or paracetamol. You self-medicate to relieve the pain.
Actually, even before you reach for the paracetamol, your body is already self-medicating by the release of endogenous opioids which help suppress the pain.
In cases of extreme pain injection of the opiate morphine may be necessary. Morphine is a very strong painkiller, or analgesic. Opioids bind to opioid receptors and this binding is blocked by a chemical called naloxone, an opiate antagonist. I’ll come back to naloxone in a minute.
But first, back to the unhelpfully unresponsive bee that may or may not feel pain …
It is self-medication with analgesics that forms the basis of the standard experiment to determine whether an animal feels pain.
The principle is straightforward. Two identical foods are prepared, one containing a suitable analgesic (e.g. morphine) and the other a placebo. If an animal is in pain it will preferentially eat the food containing the morphine.
Conversely, if they do not feel pain they will – on average – eat both types of food equally 2.
But this experiment will only work if morphine ‘works’ in bees.
Does morphine ‘work’ in bees?
An unpleasant or harmful stimulus induces a nociceptive response which might include taking defensive action like retreating or flying away. Studies have shown that the magnitude of this defensive action in honey bees is reduced or blocked altogether by prior injection with morphine.
This is a dose-response effect. The more morphine injected the smaller the nociceptive response by the bee. Importantly we know it’s the morphine that is having the effect because it can be counteracted by injection with naloxone.
We can therefore test whether bees choose to self-medicate with morphine to determine whether they feel pain.
And this is precisely what Julie Groening and colleagues from the University of Queensland did, and published three years ago in Scientific Reports. The full reference is Groening, J., Venini, D. & Srinivasan, M. In search of evidence for the experience of pain in honeybees: A self-administration study. Sci Rep 7, 45825 (2017); https://doi.org/10.1038/srep45825
Ouch … or not?
The experiment was very simple. Bees were subjected to one of two different injuries; a continuous pinch to the hind leg, or the amputation of part of the middle leg. They were then offered sugar syrup alone and sugar syrup containing morphine.
The hypothesis proposed was that if bees felt pain they would be expected to consume more of the sugar syrup containing morphine.
To ensure statistically relevant results they used lots of bees. Half were injured and half were uninjured and used as controls. If syrup laced with morphine tasted unpleasant you would expect the control group to demonstrate this by eating less.
Throughout the experiments the authors were therefore looking for a difference in the consumption of syrup alone, or of syrup with morphine, between the injured bees and the uninjured controls.
All of the experiments produced broadly similar results so I’ll just show one data figure.
Relative consumption of morphine (M) and pure sucrose solution (S) by injured (i; amputated) or control (c) bees.
Both groups of bees preferred the pure syrup (the two box plots on the right labelled S_c or S_i) over the morphine-laced syrup (M). However, the bees with the amputation did not consume any more of the morphine-containing syrup (M_i) than the controls (M_c).
Therefore they did not self-medicate.
Very similar results were obtained with the bees carrying the hind leg clip (recapitulating an attack by a competing forager or predator, which often target the rear legs). The injured bees consumed statistically similar amounts of plain or morphine-laced syrup as the control group.
The one significant difference observed was that bees with amputations consumed about 20% more syrup overall than those with the rear leg ‘pinch’ injury. The authors interpreted this as indicating that the amputation likely induced the innate immune system, necessitating the production of additional proteins (like the antimicrobial peptides that fight infection), so leading to elevated energy needs. Speculation, but it seems reasonable to me.
Feeling no pain
This study, using a pretty standard and well-accepted experimental strategy, strongly suggests that bees do not feel pain.
It does not prove that bees do not feel pain. It strongly supports the theory that they do not. You cannot prove things with science, you can only disprove them. Evidence either supports or refutes a hypothesis; in this case the evidence (no self-medication) supports the hypothesis that bees do not feel pain because, as has been demonstrated with several other animals, they would self-medicate if they did.
In the discussion of the paper the authors suggest that further work is necessary. Scientists often make that kind of sweeping statement to:
encourage funders to provide money in the future 😉
allow them to incorporate additional, perhaps contradictory, evidence that could be interpreted in a different way to their own results.
Skinning a cat
That is painful … but the proverb There’s more than one way to skin a cat 4 means that there is more than one way to do something.
And there are other ways of interpreting behavioural responses as an indication that animals feel pain.
For example, rather than measuring self-medication with an analgesic, you could look at avoidance learning or protective motor reactions as indicators of pain.
Protective motor reactions include things like preferential and prolonged grooming of regions of the body which have been injured 5. There is no evidence that bees do this.
However, there is evidence that bees exhibit avoidance learning. This is a behavioural trait in which they learn to avoid a harmful stimulus that might cause injury.
If a forager is attacked by a predator at a food source (and survives) it stops other bees dancing to advertise that food source when it returns to the hive 6.
Whilst avoidance learning does not indicate that bees feel pain, it does imply central processing rather than a simple nociceptive response. It shows that bees are able to weigh up the risk vs. reward of something good (a rich source of nectar) with something bad (the chance of being eaten when collecting the nectar). This type of decision making demonstrates a cognitive capacity that might make pain experience more likely.
We’re now getting into abstruse areas of neuropsychology … dangerous territory.
Let’s assume, as I do based upon the science presented here and in earlier work, that bees do not feel pain. What, if anything, does this mean for practical beekeeping?
It certainly does not mean we should not attempt to conduct hive manipulations in a slow, gentle and controlled manner. Just because rolled bees are not hurting, or crushed bees are not feeling pain, doesn’t give us carte blanche to be heavy handed.
One of the nociceptive responses is the production of alarm pheromones (sting and mandibular) which are part of the defensive response. Alarm pheromones agitate the hive and make the colony aggressive, much more likely to sting and much more difficult to inspect carefully.
So we should conduct inspections carefully, not because we are hurting the bees, but because they might hurt us.
But there are other reasons that care is needed as well. Crushed bees are a potential source of disease in the hive. One reason undertaker bees remove the corpses is to reduce the likelihood of disease spreading in the hive. If bees are crushed the heady mix of viruses, bacteria and Nosema they contain is smeared around all over the place, putting other hive members at risk.
And, as we’re all learning at the moment, good hygiene can be a life-saver.
This is the first post written under ‘lockdown’. It’s a little bit later than usual as it has had to travel a v e r y l o n g way along the fibre to ‘the internet’. It’s going to be a very different beekeeping season to anything that has gone before.
About 11,000 years ago nomadic hunter-gatherers living near the river Tigris discovered they could collect the seeds from wild grasses and, by scattering them around on the bare soil, reduce the distance they had to travel to collect more grain the following year.
This was the start of the agricultural revolution.
They couldn’t do much more than clear the ground of competing ‘weeds’ and throw out handfuls of collected seed. The plough wasn’t invented for a further 6,000 years and wouldn’t have been much use anyway as they had no means of dragging it through the baked-hard soil.
But they could grow enough grains and cereals to settle down, doing less hunting and more gathering. Some grains grew better than others, with ‘ears’ that remained intact when they were picked, making harvesting easier. The neophyte farmers preferentially selected these and, about 10,000 years ago, the first domesticated wheat was produced.
Einkorn wheat (Triticum monococcum), one of the first domesticated cereals
Since they were less nomadic and more dependent upon the annual grain harvest they took increasing care to protect it. They were helped with this by the hunting dogs domesticated from wolves several thousand years earlier. The dogs protected the crops and kept the wild animals, primarily big, cloven-hooved ungulates and the native wild sheep and goats, at a distance.
But those that got too close were trapped and were remarkably good to eat.
And since it was easier to keep animals penned up to avoid the need to actively hunt them it was inevitable that sheep and goats were eventually domesticated (~9,000 years ago) … and the nomadic hunter-gatherers became settled farmers practising recognisably mixed agriculture.
Domestication of cattle
The sheep and goats were a bit weak and scrawny. The large ungulates, the aurochs, gaur, banteng, yak and buffalo 1 had a lot more meat on them.
Inevitably, first the aurochs (now extinct) and then other wild ungulates were independently domesticated to produce the cattle still farmed today. This process started about 8,000 years ago.
Aurochs bull (left) and modern domesticated bull (right). Aurochs were big, strong (tasty) animals.
Cattle were great. Not only did they taste good, but they could be managed to produce milk and were strong enough to act as beasts of burden.
The plough was invented and crop yields improved dramatically because the grain germinated better in the cleared, tilled soil. Loosely knit families and groups started to build settled communities in the most fertile regions.
Bigger farms supported more people. Scattered dwellings coalesced and became villages.
Not everyone needed to farm the land. The higher yields (of grain and meat) allowed a division of labour. Some people could help defend the crops from marauders from neighbouring villages, some focused on weaving wool (from the sheep) into textiles while others taught the children the skills they would need as adults.
Communities got larger and villages expanded to form towns.
Hunter-gatherers had previously had relatively limited contact with animals 2. In contrast, the domestication of dogs, sheep, goats and cattle put humans in daily contact with animals.
Many of these animals carried diseases that were unknown in the human population. The so-called zoonotic diseases jumped species and infected humans.
There’s a direct relationship between the length of time a species has been domesticated and the number of diseases we share with it.
Domestication and shared zoonotic diseases (years, X-axis)
The emergence of new diseases requires that the pathogen has both the opportunity to jump from one species to another and that the recipient species (humans in this case) transmits the disease effectively from individual to individual.
The nomadic hunter-gatherers had been exposed to many of these diseases as well but, even if they had jumped species, their communities were too small and dispersed to support extensive human-to-human transmission.
Rinderpest and measles
Until relatively recently rinderpest was the scourge of wild and domesticated cattle across much of the globe. Rinderpest is a virus that causes a wide range of severe symptoms in cattle (and wild animals such as warthog, giraffe and antelope) including fever, nasal and eye discharges, diarrhoea and, eventually, death. In naïve populations the case fatality rate approaches 100%.
Rinderpest outbreak in South Africa, 1896
Animals that survive infection are protected for life by the resulting immune response.
Rinderpest is closely related to canine distemper virus and measles virus. Virologically they are essentially the same virus that has evolved to be specific for humans (measles), dogs (canine distemper) or cattle (rinderpest).
Measles evolved from rinderpest, probably 1,500 to 2,000 years ago, and became a human disease.
Rinderpest was almost certainly transmitted repeatedly from cattle to humans in the 6,000 years since aurochs or banteng were domesticated. However, the virus failed to establish an endemic infection in the human population as the communities were too small.
However, by about 1,500 – 2,000 years ago the largest towns had populations of ~250,000 people. Subsequent studies have demonstrated that you need a population of this size to produce enough naïve hosts (i.e. babies) a year to maintain the disease within the population.
This is because, like rinderpest, measles induces lifelong immunity in individuals that survive infection.
Measles is a devastating disease in an unprotected community. Case fatality rates of 10-30% or higher are not unusual. It is also highly infectious, spreading very widely in the community 3. Survivors may suffer brain damage or a range of other serious sequelae.
Measles subsequently changed the course of history, being partially responsible (along with smallpox) for Cortés’ defeat of the Aztec empire in the 16th Century.
John Enders, Maurice Hilleman and Andrew Wakefield
In the late 1950s John Enders developed an attenuated live measles vaccine. When administered it provided long-lasting protection. It was an excellent vaccine. Maurice Hilleman, in the early 1970s, combined an improved strain of the measles vaccine with vaccines for mumps and rubella to create the MMR vaccine.
Widespread use of the measles and MMR vaccines dramatically reduced the incidence of measles – in the UK from >500,000 cases a year to a few thousand.
Incidence of measles in England and Wales
If vaccine coverage of 92% of the population is achieved then the disease is eliminated from the community. This is due to so-called ‘herd immunity’ 4 in which there are insufficient naïve individuals for the disease to be maintained in the population.
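That ~92% figure can be sketched from the standard herd immunity relationship: if each case would otherwise infect R0 others, immunising a fraction 1 − 1/R0 of the population blocks sustained spread. The R0 range used below (~12–18 for measles) is a widely quoted estimate, not a figure from this post:

```python
def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune to block sustained spread."""
    return 1 - 1 / r0

# Measles is among the most transmissible human viruses, with R0 commonly
# estimated at 12-18, putting the threshold in the low-to-mid 90s (%).
for r0 in (12, 15, 18):
    print(f"R0 = {r0}: threshold = {herd_immunity_threshold(r0):.1%}")
```

At the lower end of that range (R0 = 12) the threshold is already ~91.7%, which is why coverage in the low 90s is needed in practice.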
Measles cases (and deaths) continued to fall everywhere the vaccine was used.
There was a realistic possibility that the vaccines would – like rinderpest 5 – allow the global eradication of measles.
And then in 1998 Andrew Wakefield published a paper in the Lancet suggesting a causative link between the MMR vaccine and autism in children.
Subsequent studies showed that this was a deeply flawed and biased study. And totally wrong.
There is not and never was a link between autism and measles vaccination 6. But that didn’t stop a largely uncritical press and subsequently even less critical social media picking up the story and disseminating it widely.
Measles and the anti-vaccine movement
Measles vaccination rates dropped because a subset of parents refused to have their kids vaccinated with the ‘dangerous’ measles vaccine.
Several successive birth cohorts had significantly lower than optimal vaccination rates. Measles vaccine coverage dropped to 84% by 2002 in the UK, with regional levels (e.g. parts of London) as low as 61%. By 2006, eight years after the thoroughly discredited (and now retracted) Lancet paper, vaccination rates were still hovering around the mid-80% level.
As immunisation rates dropped below the critical threshold, measles started to circulate again in the population; case numbers rose from 56 in 1998 to ~450 in the first 6 months of 2006. In that year there was also the first death from measles for many years – an entirely avoidable tragedy.
In 2008 measles was again declared endemic (i.e. circulating in the population) in the UK.
Similar increases in measles, mumps and rubella were occurring across the globe in countries where these diseases were unknown for a generation due to previous widespread vaccination.
The distrust of the MMR vaccine was triggered by the Wakefield paper but is part of a much wider ‘anti-vaccination movement’.
“Vaccines are dangerous, vaccines themselves cause disease, there are too many vaccines and the immune system is overloaded, vaccines contain preservatives (thiomersal) that are toxic, vaccines cause sterility etc.”
None of these claims stand up to even rudimentary scientific scrutiny.
All have been totally debunked by very extensive scientific analysis.
The World Health Organisation consider the anti-vaccine movement (anti-vaxxers) one of the top ten threats to global health. Vaccination levels are lower than they need to be to protect the population. Diseases – not just measles – that should be almost eradicated now kill children every year.
Where are the bees in this beekeeping blog?
Bear with me … before getting to the bees I want to move from fact (all of the above) to fantasy. The following few paragraphs (fortunately) describe events that have not happened (and to emphasise the point they are all italicised). However, they are no more illogical than the claims already being made by the anti-vaccine movement.
The inexorable rise of internet misinformation and social media strengthened the anti-vaxxers’ beliefs further. Their claims that vaccines damage the vaccinees were so widespread and, for the uncritical, the naturally suspicious or the easily influenced who simply wanted to protect their kids, so persuasive that vaccination rates dropped further. They refused to consider the scientific arguments for the benefits of vaccines, and refused to acknowledge the detrimental effects diseases were having on the community.
The obvious causative link to the inevitable increase in disease rates was not missed by either the anti-vaxxers or those promoting vaccination. However, the solutions each side chose were very different. Measles remained of particular concern as kids were now regularly dying from this once near-forgotten disease. The symptoms were very obvious and outbreaks spread like wildfire in the absence of herd immunity 7.
The anti-vaxxers were aware that population size was a key determinant of the ability of measles to be maintained in the population. Small populations, such as those on islands or in very isolated regions, had too few new births annually to maintain measles as an endemic disease.
With the increase in remote working – enabled by the same thing (the internet) responsible for lots of the vaccine misinformation – groups of anti-vaxxers started to establish remote closed communities. Contact with the outside world was restricted, as was the size of the community itself.
A quarter of a million was the cutoff … any more than that and there was a chance that measles could get established in the unprotected population.
Small communities 8 work very well for some things, but very badly for others. Efficiencies of scale, in education, industry, farming and trade became a problem, leading to increased friction. When disease did occur in these unprotected communities it wreaked havoc. Countless numbers of people suffered devastating disease because of the lack of vaccination.
In due course this led to further fragmentation of the groups. They lived apart, leading isolated lives, flourishing in good years but struggling (or failing completely) when times were hard, or when disease was introduced. Some communities died out altogether.
They chose not to travel because, being unvaccinated, they were susceptible to diseases that were widespread in the environment. Movement and contact between villages, hamlets and then individual farm settlements was restricted further over time.
The benefits of large communities, the division of labour, the economies and efficiencies of scale, were all lost.
They didn’t even enjoy particularly good health.
They had ‘evolved’ into subsistence farmers … again.
OK, that’s enough! Where are the bees?
Anyone who has bothered to read this far and who read Darwinian beekeeping last week will realise that this is meant to be allegorical.
The introduction of Varroa to the honey bee population resulted from the globalisation of beekeeping as an activity, and the consequent juxtaposition of Apis mellifera with Apis cerana colonies.
Without beekeepers it is unlikely that the species jump would have occurred.
Apis cerana worker
Undoubtedly once the jump had occurred transmission of mites between colonies was facilitated by beekeepers keeping colonies close together. We do this for convenience and for the delivery of effective pollination services.
The global spread of mites has been devastating for the honey bee population, for wild bees and for beekeeping.
But (like the introduction of measles to humans) it is an irreversible event.
However, it’s an irreversible event that, by use of effective miticides, can at least be partially mitigated.
Miticides do not do long-term harm to honey bees in the same way that vaccines don’t overload the immune response or introduce toxins or cause autism.
There can be short term side effects – Apiguard stinks and often stops the queen laying. Dribbled oxalic acid damages open brood.
But the colony benefits overall.
Many of the miticides now available are organic acids, acceptable in organic farming and entirely natural (even being part of our regular diet). Some of the hard chemicals used (e.g. the lipid-soluble pyrethroids in Apistan) may accumulate in comb, but I’d argue that there are more effective miticides that should be used instead (e.g. Apivar).
I’m not aware that there is any evidence that miticides ‘weaken’ colonies or individual bees. There’s no suggestion that miticide treatment makes a colony more susceptible to other diseases like the foulbroods or Nosema.
Of course, miticides are not vaccines (though vaccines are being developed) – they are used transiently and provide short to medium term protection from the ravages of the mite and the viruses it transmits.
By the time they are needed again the only bee likely to have been previously exposed is the queen. They benefit the colony and they indirectly benefit the environment. The colony remains strong and healthy, with a populous worker community available for nectar-gathering and pollination.
The much reduced mite load in the colony protects the environment. Mites cannot be spread far and wide when bees drift or through robbing. Other honey bee colonies sharing the environment therefore also benefit.
The genie is out of the bottle and will not go back
Beekeepers (inadvertently) created the Varroa problem and they will not solve it by stopping treatment. Varroa will remain in the environment, in feral colonies and in the stocks of beekeepers who choose to continue treating their colonies.
And in the many colonies of Apis mellifera still kept in the area that overlaps the natural (and currently expanding) range of Apis cerana.
Treatment-free beekeepers may be able to select colonies with partial resistance or tolerance to Varroa, but the mite will remain.
So perhaps the answer is to ban treatment altogether?
What would happen if no colonies anywhere were treated with miticides? What if all beekeepers followed the principles of Darwinian (bee-centric, bee friendly, ‘natural’) beekeeping – well-spaced colonies, allowed to swarm freely, killed off if mite levels become dangerously high?
Surely you’d end up with resistant stocks?
Yes … possibly … but at what cost?
Commercial beekeeping would stop. Honey would become even scarcer than it already is 9. Pollination contracts would be abandoned. The entire $5bn/yr Californian almond crop would fail, as would numerous other commercial agricultural crops that rely upon pollination by honey bees. There would be major shortages in the food supply chain. Less fruit, more cereals.
Pollination and honey production require strong, healthy populous colonies … and the published evidence indicates that naturally mite resistant/tolerant colonies are small, swarmy and only exist at low density in the environment.
Like the anti-vaxxers opting to live as isolated subsistence farmers again, we would lose an awful lot for the highly questionable ‘benefits’ brought by abandoning treatment.
And like the claims made by the anti-vaxxers, in my view the detrimental consequences of treating colonies with miticides are nebulous and unlikely to stand up to scientific scrutiny.
Does anyone seriously suggest we should abandon vaccination and select a resistant strain of humans that are better able to tolerate measles?
It is an inauspicious day … Friday the 13th (unlucky for some) with a global pandemic of a new zoonotic viral disease threatening millions. As I write this the UK government is gradually imposing restrictions on movement and meetings. Governments across Europe have already established draconian regional or even national movement bans. Other countries, most notably the USA, and much of Africa, have tested so few people that the extent of Covid-19 is completely unknown, though the statistics on cases and deaths look extremely serious.
What’s written above is allegorical … and crudely so in places. It seemed an appropriate piece for the current situation. The development of our globalised society has exposed us – and our livestock – to a range of new diseases. We cannot ‘turn the clock back’ without disassembling what created these new opportunities for pathogens in the first place. And doing so would have knock-on consequences that many do not properly consider.
Keep washing your hands, self-isolate when (not if) necessary, practise social distancing (no handshakes) and remember that your bees are not at risk. There are no coronaviruses of honey bees.
A fortnight ago I reviewed the first ten chapters of Thomas Seeley’s recent book The Lives of Bees. This is an excellent account of how honey bees survive in ‘the wild’ i.e. without help or intervention from beekeepers.
Seeley demonstrates an all-too-rare combination of good experimental science with exemplary communication skills.
It’s a book non-beekeepers could appreciate and in which beekeepers will find a wealth of entertaining and informative observations about their bees.
The final chapter, ‘Darwinian beekeeping’, includes an outline of practical beekeeping advice based around what Seeley (and others) understand about how colonies survive in the wild.
The chapter starts with a very brief review of about twenty differences between wild-living and managed colonies. These differences have already been introduced in the preceding chapters and so are just reiterated here to set the scene for what follows.
The differences defined by Seeley as distinguishing ‘wild’ and ‘beekeepers’ colonies cover everything from placement in the wider landscape (forage, insecticides), the immediate environment of the nest (volume, insulation), the management of the colony (none, invasive) and the parasites and pathogens to which the bees are exposed.
Some of the differences identified are somewhat contrived. For example, ‘wild’ colonies are defined as fixed in a single location, whereas managed colonies may be moved to exploit alternative forage.
In reality I suspect the majority of beekeepers do not move their colonies. Whether this is right or not, Seeley presents moving colonies as a negative. He supports this with studies showing reduced nectar gathering by colonies that are moved, presumably because the bees have to learn about their new location.
However, the main reason beekeepers move colonies is to exploit abundant sources of nectar. Likewise, a static ‘wild’ colony may have to find alternative forage when a particularly good local source dries up.
If moving colonies to exploit a rich nectar source did not usually lead to increased nectar gathering it would be a pretty futile exercise.
Of course, some of the differences are very real.
Beekeepers site colonies close together to facilitate their management. In contrast, wild colonies are naturally hundreds of metres apart 1. I’ve previously discussed the influence of colony separation on pathogen transmission 2; it’s clear that widely spaced colonies are less susceptible to drifting and robbing from adjacent hives, both processes being associated with mite and virus acquisition 3.
50 metres? … I thought you said 50 centimetres. Can we use the next field as well?
The other very obvious difference is that wild colonies are not treated with miticides but managed colonies (generally) are. As a consequence – Seeley contends – beekeepers have interfered with the ‘arms race’ between the host and its parasites and pathogens. Effectively beekeepers have ‘weaken[ed] the natural selection for disease resistance’.
Whilst I don’t necessarily disagree with this general statement, I am not convinced that simply letting natural selection run its (usually rather brutal) course is a rational strategy.
But I’m getting ahead of myself … what is Darwinian beekeeping?
Evolution is probably the most powerful force in nature. It has created all of the fantastic wealth of life forms on earth – from the tiniest viroid to the largest living thing, Armillaria ostoyae 4. The general principles of Darwinian evolution are exquisitely simple – individuals of a species are not identical; traits are passed from generation to generation; more offspring are born than can survive; and only the survivors of the competition for resources will reproduce.
I emphasised ‘survivors of the competition’ as it’s particularly relevant to what is to follow. In terms of hosts and pathogens, you could extend this competition to include whether the host survives the pathogen (and so reproduces) or whether the pathogen replicates and spreads, but in doing so kills the host.
Remember that evolution is unpredictable and essentially directionless … we don’t know what it is likely to produce next.
Seeley doesn’t provide a precise definition of Darwinian beekeeping (which he also terms natural, apicentric or beefriendly beekeeping). However, it’s basically the management of colonies in a manner that more closely resembles how colonies live in the wild.
This is presumably unnatural beekeeping
In doing so, he claims that colonies will have ‘less stressful and therefore more healthful’ lives.
I’ll come back to this point at the end. It’s an important one. But first, what does Darwinian mean in terms of practical beekeeping?
Practical Darwinian beekeeping
Having highlighted the differences between wild and managed colonies you won’t be surprised to learn that Darwinian beekeeping means some 5 or all of the following: 6
Use splits and the emergency queen response for queen rearing i.e. allow the colony to choose larvae for the preparation of new queens – I’ve discussed splits several times and have recently posted on the interesting observation that colonies choose very rare patrilines for queens.
Refrain from treating with miticides – this is the biggy. Do not treat colonies. Instead kill any colonies with very high mite levels to prevent them infesting other nearby colonies as they collapse and are robbed out.
Good and not so good advice
A lot of what Seeley recommends is very sound advice. Again, I’m not going to paraphrase his hard work – you should buy the book and make your own mind up.
Sourcing local bees, using splits to make increase, housing bees in well insulated hives etc. all works very well.
High altitude bait hive …
Some of the advice is probably impractical, like the siting of hives 50 metres apart. A full round of inspections in my research apiary already takes a long time without having to walk a kilometre to the furthest hive.
The prospect of inspecting hives situated at altitude is also not appealing. Negotiating stairs with heavy supers is bad enough. In my travels I’ve met beekeepers keeping hives on shed roofs, accessed by a wobbly step ladder. An accident waiting to happen?
And finally, I think the advice to use small hives and to cull mite-infested colonies is poor. I understand the logic behind both suggestions but, for different reasons, think they are likely to be to the significant detriment of bees, bee health and beekeeping.
Let’s deal with them individually.
Small hives – one brood and one super
When colonies run out of space for the queen to lay they are likely to swarm. The Darwinian beekeeping proposed by Seeley appears to exclude any form of swarm prevention strategy. Hive manipulation is minimal and queens are not clipped.
They’ll run out of space and swarm.
Even my darkest, least prolific colonies need more space than the ~60 litres offered by a brood and super.
Seeley doesn’t actually say ‘allow them to swarm’, but it’s an inevitability of the management and space available. Of course, the reason he encourages it is (partly – there are other reasons) to shed the ~35% of mites that leave with the swarm, and to give an enforced brood break to the original colony as it requeens.
These are untreated colonies. At least when starting the selection strategy implicit in Darwinian beekeeping these are likely to have a very significant level of mite infestation.
These mites, when the colony swarms, disappear over the fence with the swarm. If the swarm survives long enough to establish a new nest it will potentially act as a source of mites far and wide (through drifting and robbing, and possibly – though it’s unlikely as it will probably die – when it subsequently swarms).
A small swarm … possibly riddled with mites
Thanks a lot!
Lost swarms – and the assumption is that many are ‘lost’ – choose all sorts of awkward locations to establish a new nest site. Sure, some may end up in hollow trees, but many cause a nuisance to non-beekeepers and additional work for the beekeepers asked to recover them.
In my view allowing uncontrolled swarming of untreated colonies is irresponsible. It is to the detriment of the health of bees locally and to beekeepers and beekeeping.
Kill heavily mite infested colonies
How many beekeepers reading this have deliberately killed an entire colony? Probably not many. It’s a distressing thing to have to do for anyone who cares about bees.
The logic behind the suggestion goes like this. The colony is heavily mite infested because it has not developed resistance (or tolerance). If it is allowed to collapse it will be robbed out by neighbouring colonies, spreading the mites far and wide. Therefore, tough love is needed. Time for the petrol, soapy water, insecticide or whatever your choice of colony culling treatment.
In fairness to Seeley he also suggests that you could requeen with known mite-resistant/tolerant stock.
But most beekeepers tempted by Darwinian ‘treatment free’ natural beekeeping will not have a queen bank stuffed with known mite-resistant mated queens ‘ready to go’.
But they also won’t have the ‘courage’ to kill the colony.
They’ll procrastinate, they’ll prevaricate.
Eventually they’ll either decide that shaking the colony out is OK and a ‘kinder thing to do’ … or the colony will get robbed out before they act and carpet bomb every strong colony for a mile around.
Killing the colony, shaking it out or letting it get robbed out have the same overall impact on the mite-infested colony, but only slaying them prevents the mites from being spread far and wide.
And, believe me, killing a colony is a distressing thing to do if you care about bees.
In my view beefriendly beekeeping should not involve slaughtering the colony.
Less stress and better health
This is the goal of Darwinian beekeeping. It is a direct quote from the final chapter of the book (p. 286).
The suggestion is that unnatural beekeeping – swarm prevention and control, mite management, harvesting honey (or beekeeping as some people call it 😉 ) – stresses the bees.
And that this stress is detrimental for the health of the bees.
I’m not sure there’s any evidence that this is the case.
How do we measure stress in bees? Actually, there are suggested ways to measure stress in bees, but I’m not sure anyone has systematically developed these experimentally and compared the stress levels of wild-living and managed colonies.
I’ll explore this topic a bit more in the future.
I do know how to measure bee health … at least in terms of the parasites and pathogens they carry. I also know that there have been comparative studies of managed and feral colonies.
Unsurprisingly for an unapologetic unnatural beekeeper like me ( 😉 ), the feral colonies had higher levels of parasites and pathogens (Catherine Thompson’s PhD thesis [PDF] and Thompson et al., 2014 Parasite Pressures on Feral Honey Bees). By any measurable definition these feral colonies were less healthy.
Less stress and better health sounds good, but I’m not actually sure it’s particularly meaningful.
I’ll wrap up with two closing thoughts.
One of the characteristics of a healthy and unstressed population is that it is numerous, productive and reproduces well. These are all characteristics of strong and well-managed colonies.
Finally, persistently elevated levels of pathogens are detrimental to the individual and the population. It’s one of the reasons we vaccinate … which will be a big part of the post next week.
The untold story of the honey bee in the wild by Thomas D. Seeley, Professor of Biology at Cornell University.
Well, not quite untold, but this is a highly informative and entertaining book about the biology of honey bees living wild, primarily in the Arnot Forest, near Ithaca in the Finger Lakes region of New York.
Thomas Seeley conducts simple, elegant experiments to address interesting or important questions about bees. He then presents the studies and the conclusions in an easily understandable form, unencumbered by statistical mumbo-jumbo or extensive caveats and qualifications.
This makes the work very accessible, even for those with no scientific training. You don’t even need extensive knowledge of honey bees; he explains the background to the experiments in sufficient detail that they are comprehensible without lots of prior knowledge.
For this reason, this is an ideal book to introduce a new beekeeper to the biology of bees.
However, for reasons to be covered separately, I think the suggestions it makes about practical beekeeping are very poor advice for the new beekeeper 🙁
A three part story
Essentially the book is in three parts, divided into eleven chapters.
After a general introduction there are three chapters that provide a historical perspective to the bees in the Arnot Forest and, more generally, to beekeeping. Not the practical aspects of beekeeping, but the interaction of humans and bees over tens of thousands of years.
The Beekeepers and the Birdnester by Pieter Bruegel (c. 1568)
Chapters 5 to 10 cover key aspects of the biology of the colony. These are:
the features that influence selection of a nest site
an overview of the annual cycle; spring build up, overwintering etc.
colony reproduction i.e. swarming
thermoregulation of the colony
collection of pollen, nectar and water – the food and stores needed for survival
defence of the colony – from microscopic viruses to (distinctly) macroscopic black bears
The final chapter – Darwinian beekeeping – contains Seeley’s suggestions for changes to beekeeping practice, informed by the observations presented in the six preceding chapters.
I’ll discuss Darwinian beekeeping another time as it deserves a post of its own.
Something for everyone
Each chapter is accompanied by a couple of pages of explanatory notes and there is a 19 page bibliography should the reader want to consult the primary sources.
An interested lay person could spend hours enjoyably reading about the biology of wild-living honey bees without ever consulting the notes or references. These don’t litter the text, making the book very much more accessible to those unused to the sort of cite-every-statement-to-avoid-offending-the-peer-reviewers style of writing that plagues most reviews (Bloggs et al., 1929b).
Alternatively, if you really do want to find out the original source you usually can, by consulting the notes and the references. Inevitably some things are missed, but that’s the nature of an eminently readable tome covering about a million years of Apis mellifera biology, 4500 years of beekeeping and at least 300 years of scientific observations about bees.
One of the great aspects of Seeley’s writing is that things are often presented with reference to some long-lost study which would otherwise have been forgotten.
A couple of weeks ago I discussed the importance of checking hive weights at this time of the year. The rate of stores usage increases significantly as more brood is reared. How do we know this increased rate of stores usage is due to increased brood rearing, rather than just correlating with it?
Seeley presents his data on colony weight changes, but does so with reference to Clayton Farrar’s 1930s study of brood rearing by colonies lacking pollen. These colonies used only half as much of their stores, because brood rearing needs pollen. Farrar’s study was published in the American Bee Journal in 1936.
There are several examples in the book where modern molecular studies are juxtaposed with some of the great observational science of the first part of the 20th Century. As someone involved daily at the gene-jockey end of science, this historical perspective alone makes the book worth purchasing.
Wild vs. domesticated bees
Throughout the book Seeley focuses on bees living in the wild i.e. without help or intervention from beekeepers. His contention is that it is only by studying bees in their natural habitat that we’ll be able to properly understand what they need to survive and thrive when managed.
Seeley has studied bees in the Arnot Forest for at least 40 years. He can therefore provide a ‘before and after’ view of the impact of the introduction of Varroa, which probably occurred in the early 1990’s. Surprisingly, the overall number and density of colonies living in the forest in the 1970’s is about the same as it is now. This is discussed in several places in the book.
How can wild bees cope with the mites that, uncontrolled, generally destroy a hived colony within a year or two? His explanation of this is the underlying thread running through much of the book and the primary topic of the final chapter.
Are bees domesticated? This topic gets an entire chapter of its own. The genetic changes that species undergo during domestication 1 are not seen in honey bees.
Although perhaps not ‘domesticated’, through environmental manipulation we have significantly changed our relationship with bees. We now determine the size of the colony (or at least the space it has). By moving or manipulating the hive we influence what it produces (e.g. propolis, Royal Jelly, heather honey). We also control whether or not it reproduces. Indeed, most beekeepers try to stop their colonies reproducing (swarming) as it results in the loss of bees, and honey.
Throughout the book comparisons are made between the choices ‘wild’ bees make and the choices made for them by beekeepers. For example, the thermal conductivity of the hives used by beekeepers compared with a nest in a tree trunk.
The strapline on the front of the book indicates that this is the untold story of the honey bee in the wild.
In reality it’s not.
More accurately it’s a very readable compendium of studies published by Seeley and others over the last century or so.
But that’s hardly going to make copies of this £25 book fly off the shelves, so ‘untold’ it is.
In fact, several aspects of the biology of the wild-living honey bee will be familiar to readers of this site. I’ve covered studies by Seeley in discussion of bait hives, drifting, robbing, polyandry and mites in swarms. A quick search turns up ~25 posts in which he gets a mention.
In addition, anyone who is fortunate enough to have already read Honeybee Democracy will be familiar with many parts of the chapter that covers nest site selection. Similarly, the bee lining methods used to locate nests in the Arnot Forest have been described in exhaustive detail in his previous book Following the Wild Bees.
Don’t let this put you off.
Honeybee Democracy takes ~250 pages to describe in exhaustive (but still entertaining) detail how swarms choose new nest sites. This topic, together with all sorts of fascinating stuff on comb building and propolis, takes just part of the 40 page ‘Nest’ chapter of The Lives of Bees.
Bee-lining box, in cutaway view to show construction detail.
Similarly, the mechanics of bee lining don’t really get described in the new book, but the wild-living nests discovered using this method feature throughout.
Absolutely. It’s an excellent book.
But be aware that, in addition to a comprehensive account of how bees live in the wild, there’s an agenda here as well.
The sleeve notes (does anyone really read these?) include the words ” … and how wild honey bees may hold the key to reversing the alarming die-off of the planet’s managed honey bee population”.
Global beehive numbers 1968 – 2018
What alarming die off?
The graph above is of the global total of beehive numbers over the last 50 years or so. During this period the number has increased by ~1.7 times.
Of course, there are more beekeepers now than there were 50 years ago (and the global population has more than doubled). These additional beekeepers are having to work harder to maintain (and increase) the stocks they manage.
It is therefore both inaccurate and an oversimplification to claim that there’s an alarming die-off in honey bee colonies.
Perhaps the sleeve notes are just to help boost sales?
Something a bit spicy to entice the browser to think that the book they are holding contains the ‘untold’ secrets to ‘saving the bees’?
Save the bees … save humanity
It’s not the first time ‘Save the bees … save humanity’ has been used as a marketing ploy 3. Here’s a graphic I regularly use to introduce my talks on rational Varroa control.
Save the bees …
Pity the image is of a wasp 🙂
Again, don’t let these minor errors in the sleeve blurb put you off.
Whatever the relevance to practical beekeeping (or reversing the “alarming die-off”), the first ten chapters provide the best overview of the lives of wild-living honey bees written by an acknowledged master of science communication.
I read a lot of stuff about honey bees, for work and pleasure.
The Lives of Bees had a wealth of information I was unaware of.
Buy it, or borrow it from your library … you won’t be disappointed.
Brace yourselves. There’s some heavyweight science this week.
I’m going to discuss a very recent publication 1 on vaccinating bees against parasites and pathogens.
The paper involves a whole swathe of general concepts many readers will have some familiarity with – vaccines, immunity, infections, parasites, the gut microbiota 2 – which, because the paper is about bees, differ in the details in ways that may be barely recognisable.
And the devil is in the detail.
The paper appears to offer considerable promise … but I’ll return to that later.
To start with, let’s begin with measles.
Measles is a virus. It is highly contagious – typically being transmitted by coughing or sneezing – and causes a characteristic rash. Complications associated with measles infections – pneumonias, encephalitis and other respiratory and neurological conditions – are responsible for a case fatality rate of ~0.3% in the USA, or up to 30% in populations that are malnourished or have high levels of immune dysfunction.
Sixteenth century Aztec drawing of a measles victim
In 1980, 2.6 million people globally died of measles. That’s about five people (mainly kids) a minute 3.
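As a quick back-of-envelope check (my arithmetic, not a figure from the source), the ‘five a minute’ claim follows directly from the annual total:

```python
# Sanity check of the 'five people a minute' figure for 1980.
deaths_1980 = 2_600_000
minutes_per_year = 365 * 24 * 60      # 525,600 minutes in a (non-leap) year
deaths_per_minute = deaths_1980 / minutes_per_year
print(round(deaths_per_minute, 1))    # just under five per minute
```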
By 2014 this figure had dropped to 73,000 due to a global vaccination campaign.
The measles vaccine is excellent. It is an attenuated (weakened) strain of the virus that is injected. When it replicates it produces all of the measles virus proteins. These are not naturally found in the human body, so the vaccinee 4 recognises them as foreign and produces an immune response that eventually stops the vaccine growing.
The really important thing about the immune response is that it lasts i.e. it has a memory. If the vaccinated individual is exposed to a virulent strain of measles in the future the immune response ‘wakes up’ and stops the virus replicating.
This immune response is effectively lifelong.
One important component of the immune response is antibodies. These are proteins that specifically recognise the measles virus, bind to it and lead to its destruction.
If you’ve been vaccinated (or have survived a previous infection) and subsequently get infected, your body produces lots of antibodies which destroy the incoming strain of the measles virus, so protecting you. But this immune response is very specific … the response to measles does not protect you from poliomyelitis or coronavirus or mumps.
The point about the stuff on measles was to introduce the principles of a protective immune response.
It has several characteristic features, including:
specificity – the response recognises one particular pathogen and not others
longevity – the immunological memory described above
destruction of the incoming pathogen
In humans, all of the above are provided by antibodies 6.
Bees don’t have antibodies, but they do have an immune response which has all of the characteristic features listed above.
The immune response of bees uses nucleic acids 7 which are common chemical molecules found in the bodies of all living things. Specifically, bees use ribonucleic acid (RNA) that interferes with the nucleic acids of invading pathogens.
To make ribonucleic acid (RNA) that interferes easier to say it is abbreviated to RNAi 8.
RNA is made up of individual building blocks called nucleotides. There are four nucleotides, with names abbreviated to A, C, G and U. These join together in long strands, e.g. ACGUUGUGCAG … the order (or sequence) of which has all sorts of important biological functions we don’t need to worry about for the purposes of vaccinating bees.
Pairs of nucleotides in different strands have the ability to bind together – A binds to U, G binds to C or U. Individually, these bonds are weak. When lots occur close together they are much stronger and therefore very specific.
For example, the sequence ACGUUGUGCAG binds very well to UGCGACGCGUU – the two strands can form a weak bond at every position. In contrast, it binds very much less well to CGUUAGCAUUG, where fewer than half the positions can pair.
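As a toy illustration of these pairing rules – a crude sketch of my own, not a model of real RNA thermodynamics – you can score the strands above in a few lines of Python:

```python
# Toy illustration: score how well two RNA strands pair, position by
# position, using the rules in the text (A pairs with U; G pairs with C or U).
ALLOWED_PAIRS = {("A", "U"), ("U", "A"),
                 ("G", "C"), ("C", "G"),
                 ("G", "U"), ("U", "G")}

def pairing_score(strand_a, strand_b):
    """Count the positions at which the two strands can form a base pair."""
    return sum((a, b) in ALLOWED_PAIRS for a, b in zip(strand_a, strand_b))

print(pairing_score("ACGUUGUGCAG", "UGCGACGCGUU"))  # 11: pairs at every position
print(pairing_score("ACGUUGUGCAG", "CGUUAGCAUUG"))  # 4: pairs at only a few
```

Lots of adjacent pairs means tight, specific binding; a handful of scattered pairs does not.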
Finally, these short RNAs interfere when they bind very well to their target sequence.
What does that mean?
In the cartoon above, imagine the text in red represents the RNAi and the text in blue represents part of the RNA genome of deformed wing virus (DWV), the most significant viral pathogen of honey bees 9.
The specific binding of RNAi to its target sequence recruits enzymes that result in either the destruction of the target, or the impairment of its functionality.
RNAi binding to DWV results in the inactivation and eventual destruction of the virus genome.
Virus replication is therefore stopped.
This is a ‘good thing’.
Before we get on to vaccinating bees I have one final thing to explain.
How does the bee ‘know’ it is infected with DWV (or a similar viral pathogen) and how is the RNAi actually made?
OK, that’s two things, but they’re actually closely related to each other.
I said earlier that our bodies recognise the proteins that the measles virus (or vaccine) produces as ‘foreign’ i.e. something not normally present in the body. It turns out that many organisms – including bees – have evolved specific ways of detecting double stranded RNA as a ‘foreign’ entity.
Double stranded RNA (dsRNA) is made when RNA viruses replicate, but it is never normally present in the cells of a healthy bee. Therefore if the bee detects dsRNA it ‘knows’ it is infected and it induces an immune response … specifically an RNAi-mediated immune response.
The dsRNA is recognised by a protein called Dicer, which cuts the double-stranded RNA into smaller duplex RNAi molecules. One strand of each duplex then associates with additional proteins (including Argonaute; Ago 10) to form the RNA-induced silencing complex (RISC).
RISC, which includes the RNAi, binds to the specific target e.g. the genome of other DWV viruses, and chops it up and destroys it.
The mechanism of RNAi-mediated silencing
Finally, because RNAi is a small molecule it can easily move from cell to cell. So RNAi made in one cell can move to regions of the bee some distance away.
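To summarise the pathway, here is a deliberately crude sketch in Python. The fragment length is realistic (Dicer produces duplexes of roughly 21-23 nucleotides) but everything else – the sequence, the exact-match rule standing in for complementary binding – is my simplification for illustration only:

```python
# Highly simplified sketch of the RNAi pathway described above.
# Real RISC guides bind the *complementary* strand; because dsRNA contains
# both strands, 'guide is a substring of the genome' is a crude stand-in.

def dicer(dsrna, fragment_len=21):
    """Chop a long double-stranded RNA into short guide fragments."""
    return [dsrna[i:i + fragment_len]
            for i in range(0, len(dsrna) - fragment_len + 1, fragment_len)]

def risc_silences(guides, target_genome):
    """RISC 'silences' the target if any guide matches it."""
    return any(g in target_genome for g in guides)

viral_dsrna = "AUGGCUAGCUAGGCUAACGGAUUCGGAUCCGAUGGCAUCGAUGGCAUCG"
guides = dicer(viral_dsrna)
print(risc_silences(guides, viral_dsrna))         # True: viral RNA recognised
print(risc_silences(guides, "UUUUCCCCAAAAGGGG"))  # False: unrelated RNA ignored
```

The two prints capture the key property of the system: it destroys what it has ‘seen’ (specificity) and leaves everything else alone.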
Phew … OK, that’s the end of the whistle-stop introduction to RNAi and insect immunity 11.
It’s been known for some time that you can directly introduce RNAi into bees and reduce the levels of some of the viruses present.
Frankly the data on DWV has not been great, but there are reasonably compelling studies of reductions in Israeli Acute Paralysis Virus (IAPV) levels and even field trials showing benefits at the colony level.
In these studies you either inject individual bees with RNAi, or you feed them large amounts of sugar syrup containing huge (and expensive) quantities of RNAi.
Neither of these routes is practically or financially viable.
Injecting individual bees takes a very long time 12. You need to anaesthetise the bee with CO2 or by chilling it on ice. It’s pretty tough on the bee and not all survive the anaesthetic or the injection. You need good lighting, good eyesight and a very small needle. It’s obviously a non-starter.
What about feeding? Syrup feeding is incompatible with honey production. It’s also a rather inefficient way to deliver RNAi. RNA is a fragile molecule and is easily degraded. If it has to sit around in syrup for a few days, get collected by the bee, stored in the honey stomach, regurgitated and passed to another bee etc. there’s a risk it will be inactivated.
And it’s very expensive to produce …
The gut microbiota
Which, in a really roundabout way, brings us to this recent study by Leonard et al., published at the end of January in the prestigious journal Science.
Leonard et al., (2020) Science 367, 573-576
In this study, the authors have modified a harmless bacterium normally present in the honey bee gut so that it produces double-stranded RNA specific for DWV. This bacterium, Snodgrassella alvi, is present in the gut of all honey bees. It is a core member of the gut microbiota, the bacterial population present in the honey bee gut.
The concept is relatively simple, but the science is pretty cool.
The bacterium sits around in the honey bee gut producing DWV-specific RNAi. If the bee gets infected – through feeding or injection, for example by Varroa – the RNAi (which has diffused around the body of the bee) is ready and waiting to ‘silence’, through RNA interference, the replicating DWV genome.
But there’s more … Snodgrassella alvi is presumably passed from bee to bee during feeding (of larvae or adult workers). Therefore the RNAi-expressing version should naturally spread through a colony, protecting all the bees. In addition, because it is present throughout the life of the bee, a genetically engineered form of the bacterium should provide the longevity that is characteristic of a protective immune response.
So, does it work?
The paper includes lots of introductory studies. These include:
demonstrating that engineered Snodgrassella alvi – which for pretty obvious reasons I’ll abbreviate to S. alvi for the rest of this post – colonises the bee gut and can spread from bee to bee
that the introduced bacterium produces the double-stranded RNA (dsRNA) precursors of the RNAi response
that dsRNA produced in the bee gut spreads to other areas of the bee’s body
that the presence of dsRNA upregulates components of the immune response
and that it is possible to control host gene expression using this dsRNA 14
I’m going to return to some of these points in a future post (this one is already too long) as there are both promising and disturbing features buried within the data.
Let’s cut to the chase …
Symbiont-produced RNAi can improve honey bee survival after viral injection.
Seven day old adult worker bees were fed with S. alvi expressing RNAi to DWV or to an irrelevant target (GFP). Seven days later some were injected with DWV (solid lines in the graph above), others were injected with buffer alone (dashed lines).
In the 10 days after injection about 25% of the bees injected with buffer died. This reflects the ageing of the bees and the attrition rate due to handling in the laboratory.
About 75% of the bees ‘vaccinated’ with S. alvi expressing GFP RNAi or no RNAi died after DWV challenge over the 10 day period.
In contrast, only ~60% of the bees carrying S. alvi expressing DWV-specific RNAi died. This is a relatively small difference, but – because the experiment was conducted with lots of bees – it is statistically significant.
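Why can a modest difference in mortality still be statistically significant? A two-proportion z-test with invented counts makes the point. To be clear: the counts below are hypothetical, chosen only to match the approximate percentages quoted; the paper’s actual group sizes and statistical methods differ.

```python
# Hypothetical illustration: ~60% vs ~75% mortality is a modest difference,
# but with enough bees per group it is highly significant.
# The counts (200 bees per group) are invented, not from the paper.
from math import sqrt, erfc

def two_proportion_z(dead1, n1, dead2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1, p2 = dead1 / n1, dead2 / n2
    pooled = (dead1 + dead2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))   # p-value from the normal tails

# 60% mortality in 200 'vaccinated' bees vs 75% in 200 controls
z, p = two_proportion_z(120, 200, 150, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented numbers the p-value comes out well below 0.01, despite the survival curves sitting only ~15 percentage points apart.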
The results presented above are promising but the authors also explored the logical extension of this work.
If the RNAi produced by the engineered S. alvi becomes widely distributed in the honey bee, perhaps it could also be taken up when Varroa feeds on the bee?
In which case, if you engineered S. alvi to produce Varroa-specific RNAi’s, perhaps this would help kill mites.
Symbiont-produced RNAi kills Varroa mites feeding on honey bees
Using a similar ‘vaccination’ schedule as above, only ~25% of mites exposed to bees carrying S. alvi expressing Varroa-specific RNAi’s survived 10 days, whereas 50% of mites survived when feeding on bees carrying non-specific engineered strains of S. alvi.
Again, this is encouraging.
Yes, at the moment, only encouraging.
Don’t get me wrong, this is pretty fancy technology and the results represent a lot of very laborious and elegant experiments.
At 2100 words this post is already too long … so here are a few things to think about which help justify my qualified enthusiasm for the paper.
Although I didn’t show the data, transmission of engineered S. alvi between bees was rather inefficient. Over 5 days, only 33% of naive co-housed bees demonstrated infection with the modified symbiont. Why might this be an issue? Alternatively, is transmission between adult bees important? When might it be important to not transmit between adult bees?
None of the experiments included any virus quantification. Did the bees that didn’t die after DWV injection challenge have lower DWV levels? If not, why not? What is the mechanism of protection?
Actually, there were some virus quantification studies buried in the Supplementary data. In these the authors showed that virus levels were lower in all bees carrying engineered S. alvi, even those expressing the GFP negative control RNAi. This suggests a non-specific up-regulation of the immune response.
All the challenge experiments were done with 7 day old worker bees. Are these the bees we really need to protect from DWV? Why didn’t they do any studies with larvae and pupae? These are much easier to handle and very much easier to inoculate. And very much more relevant in terms of virus-mediated colony losses.
What other species sharing the environment with honey bees carry S. alvi? Why should this matter? Snodgrassella is a gut symbiont of honey bees and lives in the ileum. Is it present in honey bee faeces?
I’ll post a follow-up in the next few weeks to discuss some of these in further detail.
Congratulations to those of you who have got this far … don’t get rid of your Apivar and oxalic acid stocks just yet 😉
I’ve recently discussed the importance and influence of polyandry for honey bee colonies. Briefly, polyandry – the mating of the queen with multiple (~12-18) drones – is critical for colony fitness e.g. ability to resist disease, forage efficiently or overwinter successfully.
Hyperpolyandry, for example resulting from instrumental insemination of the queen with sperm from 30+ drones, further increases colony fitness and disease resistance.
How do you measure polyandry?
Essentially, you genetically analyse the worker bees in the colony to determine the range of patrilines present. Patrilines are genetically distinct offspring fathered by different drones. Essentially they are subfamilies within the colony.
With a finite number of patrilines – which there must be, because the queen does not mate with an infinite number of drones – there will be a point at which the more workers you screen the fewer new patrilines will be detected.
Search and ye shall find – detecting rare patrilines
The more you screen, the more you are likely to have detected all the patrilines present.
However, the queen uses sperm randomly when fertilising worker eggs. This compounds the difficulty in determining the full range of different patrilines present in a population. In particular, it makes detecting very rare patrilines difficult.
For example, if 20% of workers belong to one patriline you don’t need to sample many bees to detect it. In contrast, if another patriline is represented by 0.0001% of randomly selected workers you would probably have to screen thousands to be sure of detecting it.
Consequently, rare patrilines in the honey bee worker population are very difficult to detect. Inevitably this means that the number of drones the queen mates with (~12-18) is probably an underestimate of the actual number 1.
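A quick back-of-envelope sketch (my own, not from any of the papers) shows why rare patrilines evade detection. The chance of sampling at least one worker from a patriline at frequency f when screening n workers at random is 1 − (1 − f)ⁿ:

```python
# Back-of-envelope: probability of detecting at least one worker from a
# patriline of frequency f when n workers are screened at random.
def p_detect(f, n):
    """Chance of detecting a patriline of frequency f in n sampled workers."""
    return 1 - (1 - f) ** n

# A common patriline (20% of workers) is found almost immediately ...
print(p_detect(0.20, 20))        # ~0.99 from just 20 workers

# ... but a very rare one is essentially invisible at realistic sample sizes
print(p_detect(0.000001, 1000))  # ~0.001 even after screening 1000 workers
```

So a patriline representing one worker in a million would, on average, need a sample of hundreds of thousands of bees before you could be reasonably confident of finding it.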
Half-sisters and super-sisters
Worker bees are often described as ‘half-sisters’ to each other. They share the same mother (the queen), but different fathers.
Actually, as you should now realise, that’s an oversimplification because – with only ~12-18 different fathers contributing to the genetics of the colony – some workers are going to be more related to each other because they share the same father and mother.
Half-sisters share the same mother but have different fathers and share about 25% of their genes.
Super-sisters share the same mother and father and so share about 75% of their genes (25% from the queen and 50% from the drone).
Super-sisters are more likely to help each other in the colony 2.
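The arithmetic behind those relatedness figures follows directly from haplodiploidy, and can be sketched in a few lines (my own illustration, not from the paper):

```python
# Relatedness arithmetic for haplodiploid sisters.
# The queen is diploid: each daughter inherits a random half of her genome,
# so two sisters share 0.5 * 0.5 = 0.25 of their genes via the mother.
# Drones are haploid: super-sisters inherit the *same* paternal genome
# (adding a full 0.5), while half-sisters have unrelated fathers (adding 0).
maternal_share = 0.5 * 0.5           # 0.25 via the queen

half_sister = maternal_share + 0.0   # different, unrelated fathers
super_sister = maternal_share + 0.5  # identical haploid father

print(half_sister, super_sister)     # 0.25 0.75
```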
Emergency queens and nepotism
What’s the most important decision a colony makes?
If the queen is killed (or removed) the workers rear new queens under the so-called ’emergency response’. They feed selected young larvae copious amounts of Royal Jelly to rear a replacement queen.
Arguably, the most important decision the workers make is the selection of the day-old larvae to rear as new queens.
If they get it wrong the colony is doomed. If they get it right the colony will flourish 3.
But as described above, workers are more or less related to each other genetically.
To ensure the continued propagation of at least some of their genes it might be expected that the nurse bees making this selection 4 would choose larvae more closely related to themselves.
Do worker bees exhibit nepotism when rearing emergency queens?
If workers were nepotistic you’d expect the most common patrilines in the nurse/worker bee population would also predominate in the queens reared.
However, for at least 20 years evidence has been accumulating that indicates bees are not nepotistic. On the contrary, emergency queens appear to be reared from some of the rare patrilines in the colony.
A recent paper from James Withrow and David Tarpy has provided some of the best evidence for the existence of these so-called royal patrilines in honey bee colonies 5.
Evidence for these goes back to at least 1997 6, with about half a dozen publications in the intervening period. Essentially all used broadly the same approach; they genetically screened worker bees and the emergency queens they reared to determine which patrilines were present in the two groups.
With certain caveats (size of study, number of microsatellites screened, colony numbers etc.) all concluded that colonies rear emergency queens from some of the rarest patrilines in the colony.
The recent study by Withrow and Tarpy is well explained and probably the most comprehensive, so I’ll use that to flesh out the details.
Six double-brood colonies were each split into three separate colonies; a queenright single-brood colony and two five-frame nucs. The latter contained eggs and young larvae and so reared emergency queens.
Seven days later the developing emergency queens were all harvested for future analysis. One or two frames from the nucs were then exchanged with frames containing eggs and day-old larvae from the matched queenright colony.
The nucs then started rearing new queens … again.
And again … and again.
This process was repeated until the nucs failed.
In total over 500 queens were reared (to 7 days old) from these six original colonies. These queens were analysed genetically by microsatellite analysis, as were over 500 workers from the same colonies.
Within the 6 experimental colonies the authors identified a total of 327 patrilines (or subfamilies as Withrow and Tarpy describe them), ranging from 34-77 per colony. 108 patrilines (4-40 per colony) were exclusively detected in worker bees and 130 patrilines (5-55 per colony) were exclusively detected in queens.
Cryptic “royal” subfamilies
Over 40% of queens raised per colony were produced from the patrilines exclusively detected in the queen population.
Subfamily distribution per colony.
As shown in the figure above, many queens (black bars) were reared from subfamilies (patrilines) not represented in the worker bee population (grey bars, sorted left to right by abundance).
Since there were different numbers of patrilines per colony (34-77), the bias towards the rarer patrilines is more apparent if you instead split them into tertiles (thirds) based upon worker abundance.
Are the queens predominantly reared from the most common tertile, the intermediate tertile or the rarest tertile?
Frequency distribution of subfamilies.
It’s very clear from this graph that workers select queens from the rarest patrilines within the colony.
It is therefore equally clear that worker bees do not exhibit nepotism when choosing which larvae to rear emergency queens from.
Implications for our understanding of honey bee reproduction
Two points are immediately apparent:
there is a cryptic population of queen-biased patrilines that have largely been overlooked in genetic studies of honey bee polyandry
honey bee queens mate with more drones than conventional studies of worker bee patrilines indicate
Colony 5 had at least 77 distinct subfamilies (there might have been more detected had they screened more than the 94 workers and 135 queens from this colony). By extrapolation it is possible to determine that the effective queen mating frequency (me; the number of drones the queen had mated with) was ~32 if all the samples (worker and queen) were taken into account. If only the worker or queen samples were used for this calculation the effective queen mating frequency would be ~12 or ~65 respectively.
The average effective queen mating frequency over the six colonies was ~33 (total), significantly higher than the oft-quoted (including at the top of this page) me of ~12-18.
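For readers curious how such figures are derived, the standard effective-paternity estimator is mₑ = 1 / Σ pᵢ², where pᵢ is the proportion of offspring sired by drone i. A minimal sketch (the sire frequencies below are invented, not taken from the paper):

```python
# Sketch of the usual effective paternity estimator:
#     m_e = 1 / sum(p_i^2)
# where p_i is the proportion of offspring sired by drone i.
# Skewed paternity lowers m_e below the raw count of fathers.
def effective_mating_frequency(proportions):
    assert abs(sum(proportions) - 1.0) < 1e-9   # proportions must sum to 1
    return 1.0 / sum(p * p for p in proportions)

# 10 drones contributing equally -> m_e equals the raw count
print(round(effective_mating_frequency([0.1] * 10), 6))   # 10.0

# Same 10 drones, but one sires half the offspring -> m_e drops sharply
skewed = [0.5] + [0.5 / 9] * 9
print(round(effective_mating_frequency(skewed), 1))       # 3.6
```

This is why sampling workers alone (dominated by the common patrilines) and sampling queens alone (enriched for the rare ones) give such different mₑ estimates for the same colony.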
So perhaps honey bees really are hyperpolyandrous … or even extremely hyperpolyandrous as the authors suggest.
It’s worth noting in passing that routine mating frequencies over 30 are almost never quoted for honey bees 7, but that the ‘normal’ me ~12-18 is rather low when compared with other species within the genus Apis. The giant honey bee, Apis dorsata, exhibits mating frequencies of greater than 60.
Who’s the daddy?
So, when it comes to emergency queens, although we might not know precisely who the daddy is, we can be pretty certain the particular patriline selected by the workers is most likely to be one of the rare ones in the colony.
Mechanistically, what accounts for this?
Are these larvae selected solely because they are rare?
That seems unlikely, not least because it would require some sort of surveying or screening by nurse bees. Not impossible perhaps, though I’m not sure how this would be achieved.
Perhaps it is not even worker selection?
An alternative way to view it is larval competition. A better-competing larva would be fed Royal Jelly and would be much more likely to pass on her genes to the next generation.
We don’t know the answers to these questions … yet.
Or whether they’re the wrong questions entirely.
Swarming and supercedure
The colony rears a new queen under three conditions; enforced queenlessness (as described above) which induces emergency queen rearing, prior to swarming and during supersedure.
These are fundamentally different processes in terms of the larvae used for queen rearing.
During swarming and supersedure 8 the queen lays the egg in a ‘play cup’ which is subsequently engineered into a queen cell in which the new queen develops.
However, it is known that the patrilines of queens reared during the swarming response are similar to those of workers in the same colony 9, implying that there is no overt selection by the workers (or the parental queen).
Does this insight into how bees rear new queens have any implications for how beekeepers rear new queens?
There are about as many queen rearing methods as there are adult workers in a double-brood colony in late June. Many exploit the emergency queen rearing response by a colony rendered temporarily or permanently queenless.
Beekeepers often comment on the differential ‘take’ of grafted larvae presented to queenless cell raising colonies.
Sometimes you get very good acceptance of the grafted larvae, other times less so.
Of course, we only show the ones that worked well!
3 day old QCs …
Differential ‘take’ is often put down to the state of the cell raising colony or the nectar flow (or the cackhandedness of the grafter, or the phase of the moon, or about 100 other things).
I have never heard of beekeepers comparing the ‘take’ of larvae originating from the cell raising colony with those from another colony. The latter are always going to be ‘rare’ if you consider the patrilines present in the cell raising colony. However, grafts taken from the same colony as used for cell raising 10 are likely to reflect the predominant patrilines.
Are these accepted less well by the nurse bees?
I suspect not … but it is testable should anyone want to try.
My expectation would be that the presentation of larvae in a vertically oriented cell bar frame would likely override any genetic selectivity by the colony. They’re desperate to raise a new queen and – thank goodness – here’s a few that might do.
Alternatively, differential acceptance is more likely to reflect use of larvae of an unsuitable age, or that have been damaged during grafting.
As I listen to the wind howling outside it seems like a very long time until I can test any of these ideas … 🙁
Ray Winstone (as Carlin) 1979
Who’s the daddy? is British slang for who, or what, is the best. It originated in a line by Ray Winstone’s character Carlin from the 1979 film Scum. This was not a romantic comedy and I’m certainly not recommending viewing it. Nevertheless, the phrase became widely used over the subsequent couple of decades and seemed appropriate here because the colony is dependent on selecting high-quality larvae for colony survival.
Essentially, amitraz binds and activates receptors that are critically important in a range of important aspects of the Varroa activity and behaviour … amitraz changes [this] behaviour and so exhibits miticidal activity. It has additional activities as well … these multiple routes of action may explain why resistance to amitraz is slow to develop.
I made the point in a subsequent post that amitraz resistance was very well documented … in cattle ticks 2 but that there was only anecdotal or incompletely documented evidence of resistance in Varroa in the USA, Argentina and Europe.
Apivar strip – fit and (don’t) forget
Amitraz has been used for mite control in honey bees for over twenty years. Considering its widespread use, the concentrations it is used at, and the relatively high replication rate of Varroa it is surprising that there has not been better evidence of resistance.
But that is no longer the case 🙁
Do you want the good news or the bad news first?
The bad news
A very recent paper 3 has clearly documented amitraz resistant Varroa in several commercial beekeeping operations in the USA.
I’ll discuss the key results of this paper first and then make some general comments on the implications for beekeepers and beekeeping.
The study had three components:
Determine the sensitivity of Varroa never treated with amitraz to the chemical. This forms the baseline sensitivity against which field samples from commercial beekeepers could be tested.
Screen Varroa from hives maintained by commercial beekeepers (with a multi-year history of Apivar usage) for amitraz resistance.
Validate that the reduced efficacy of Apivar correlates with the observed amitraz resistance.
Essentially it involved harvesting live Varroa from colonies by a large-scale dusting with icing sugar 4. The Varroa were then tested to determine whether they showed resistance to amitraz, and the sensitivity was compared with the baseline sample of mites from colonies never treated.
Finally, an Apivar sensitivity test was conducted to determine the proportion of mites killed in a standardised assay in a set time period, again compared with the control (baseline sample).
Not all the apiaries tested yielded sufficient mites to screen for Apivar resistance. This is part of ‘the good news’ which I’ll get to shortly … but first the science.
Of those apiaries that did, amitraz resistance (determined by the LC50 – the Lethal Concentration required to kill 50% of the mites) ranged from similar to that seen in the baseline samples to ~20-fold greater than the controls.
Two apiaries showed a more than 10-fold increase in the resistance ratio (the observed LC50 divided by the baseline LC50), with some individual colonies having high levels of Varroa infestation despite an active application of amitraz.
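As a minimal illustration of how the resistance ratio is calculated and interpreted (the LC50 values and apiary labels below are invented, not the paper's data):

```python
# Illustrative only: the resistance ratio is simply the LC50 of
# field-sampled mites divided by the LC50 of never-treated baseline mites.
def resistance_ratio(lc50_observed, lc50_baseline):
    return lc50_observed / lc50_baseline

baseline_lc50 = 1.0                                   # arbitrary units
for apiary, lc50 in [("X", 1.2), ("Y", 20.0), ("Z", 12.0)]:
    ratio = resistance_ratio(lc50, baseline_lc50)
    flag = "resistant" if ratio > 10 else "sensitive"  # 10-fold threshold
    print(apiary, round(ratio, 1), flag)
```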
Apivar kills mites very quickly. Using a known number of mites trapped in a cage with a single small square of Apivar it is possible to ‘count the corpses’ and plot a kill curve over time. Sensitive mites from the control colonies were all killed within 3 hours.
Time course of Apivar efficacy in amitraz-susceptible Varroa
Using this as the baseline control it was then possible to determine the efficacy of Apivar in killing the mites (in the same 3 hour timeframe) from apiaries exhibiting resistance.
Apivar efficacy in commercial beekeeping apiaries.
Two apiaries (B and C, above) contained mites that exhibited high levels of resistance to Apivar, reflected in low Apivar efficacy. In these apiaries, fewer than 80% of mites (on average) were killed within the 3 hour assay.
Finally, the author demonstrated a correlation between Apivar efficacy and amitraz resistance. Unsurprising, but a necessary concluding point for the experimental data.
Within apiary variation
It was interesting that the author notes that the range of Apivar efficacy was much greater in colonies from apiaries with clear evidence of amitraz resistance.
For example, colonies in apiary B exhibited Apivar efficacies ranging from 28% to 97%, with an average (plotted above) of 68%. Whilst this is clearly an unacceptably low level, it is interesting that mites in some colonies within the same apiary were killed with an efficacy (>90%) similar to, or better than, apiaries A2 and A4 in the graph above.
I’ve re-plotted the primary data of Apivar efficacy vs. mite counts from individual colonies to emphasise this point.
Variation of Apivar efficacy vs mite infestation levels in individual colonies from commercial apiaries
Apiaries B and C (red markers) could be considered as ‘failing apiaries’ as the average Apivar efficacy of each was below 80% (see bar chart). Together the average mite load and Apivar efficacy for these two apiaries was 6.75 mites/100 bees and 72% respectively.
However, of the 16 colonies screened from these two apiaries (8 from each):
One colony had insufficient detectable mites to be included in the full analysis.
Eight dropped less than 3 mites/100 bees during the sugar dusting analysis (the average over the 63 colonies screened was 5.33 mites/100 bees).
Four colonies exhibited ≥90% Apivar efficacy.
One colony from apiary B was a clear outlier, with >50 mites/100 bees ( 😯 ) and only ~28% Apivar efficacy. Inevitably this sample skews the averages …
Clearly the average figures presented in the bar chart above hide a very significant level of within-apiary variation.
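To illustrate how badly one outlier can distort an apiary average (all the figures below are invented, except the >50 mites/100 bees outlier echoing that colony in apiary B):

```python
# One heavily infested colony drags the apiary mean well above the level
# typical of its neighbours, while the median barely moves.
counts = [2.1, 1.8, 2.9, 0.9, 2.4, 1.5, 3.0, 52.0]  # mites per 100 bees

mean = sum(counts) / len(counts)
middle = sorted(counts)[len(counts) // 2]  # upper-middle value for even n

print(round(mean, 1), middle)  # mean ~8.3 vs a 'typical' colony at 2.4
```

Which is why a single summary figure per apiary, whether of infestation or treatment efficacy, needs reading with care.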
I commented recently on the variation in mite levels during midwinter treatment of colonies with OA/Api-Bioxal. I attributed this – with little supporting evidence (!) – to different rates of late-season brood production. Colonies brooding late into the autumn were expected to have higher midwinter mite levels.
However, the variation seen here is different.
With the exception of that one heavily infested colony from apiary B, the mite levels in the ‘failing apiaries’ (B and C) are actually less than the average of the remainder of the study group (3.88 vs. 4.85).
What differs is the efficacy of Apivar treatment, not the resulting mite levels.
Frankly, this is a bit weird … on two counts:
If Apivar treatment had been failing for a long time in apiaries B and C I would have expected much higher than average mite levels.
Considering the amount of drifting and robbing that goes on between juxtaposed colonies I would have also expected Apivar-resistant mites to be very widely distributed within the ‘failing apiaries’.
Caveat on the mite counts – Apiaries in Louisiana, New York and South Dakota were analysed in this study. Louisiana apiaries were sampled in April, the others in July and August. I don’t know enough about the climate or mite-replication kinetics in these states to know how much this would have influenced the mite infestation levels (or prior or ongoing treatment regimes, which would also influence mite numbers). Unfortunately, the locations of the apiaries (A, B, C etc.) are not provided, other than the control apiary which is in Baton Rouge, LA. If the study had been done in the UK mite drops in April and August would have been wildly different depending upon the location.
Apivar resistance does appear to have arisen in some of these colonies, but it does not appear to have become widely distributed within the apiary.
I don’t actually think we have enough information to work with. The paper contains almost no additional background details – Apivar treatment history, use of other treatments, colony loss data etc.
But that won’t stop me speculating a little bit 😉
Do Apivar-resistant mites stop bees from drifting? Probably not, but it would explain why resistance was not widespread in the apiary 5.
More sensibly, perhaps Apivar resistance is detrimental in the absence of selection.
In the colonies in which resistance evolves it gives the mites a significant advantage. The ongoing infestation could encourage prolonged or repeated treatment, so selecting for yet more resistant mites. Eventually the colony succumbs to the resulting high viral load.
In other colonies, treatment is withdrawn (or forgotten … remember, we have zero information here!) and the Apivar-resistant mites are then at a disadvantage to their sisters.
This isn’t unheard of.
Apistan resistance appears to be detrimental in the absence of selection. There are some relatively straightforward molecular explanations for this type of phenotype.
You would have to assume differential colony treatments within apiaries B and C for this to be part of the explanation (and to account for drifting). Let’s hope the colony records are less shambolic than mine … or those that many beekeepers keep 😉
Until a clearer picture emerges of the management history of these colonies all we’re left with is the slightly (or very) confusing observation that Apivar resistance is a hive-specific phenomenon.
As the author states:
This colony level resolution suggests that each colony may act as an island of resistance with its own distinct Varroa population. Beekeepers have reported inconsistency in amitraz treatment efficacy among colonies within an apiary and this variation seems to support those anecdotal observations.
And the good news?
I think there are two ‘encouraging’ observations in this paper (though of course I’d be happier if there was no resistance).
About half of the commercial apiaries surveyed (5 of 11 that had a long history of Apivar usage) had too few mites detectable to screen for amitraz resistance. Clearly Apivar works, and often works very well indeed.
Apivar resistance is not widespread in the apiaries within which it had arisen. For whatever reason, resistant mite populations appear restricted to individual colonies.
And these, in turn, have implications for practical beekeeping.
Implications for practical beekeeping
How does Apivar resistance evolve? Classically, misuse or overuse of treatments results in their eventual failure. Antibiotics are a good example of this.
I’ve been told by commercial beekeepers that some use a half dose of Apivar midseason to knock mite levels back sufficiently for the late season nectar flows. This is a typical example of misuse. It may not result in the development of resistance and it may not be a strategy used by the bee farmers managing apiaries B and C, but it is not the correct way to use Apivar.
What about overuse? Mites still dropping after 6 weeks of Apivar? Go on, slip another couple of strips in for another month or two. An (expensive) example of overuse.
Used Apivar strips
Or what about the Apivar strip found lying on the bottom of the hive at the first spring inspection? Again, overuse as there are likely to be lingering traces of Apivar present in the colony all winter 6.
So the first implication for practical beekeeping is to use Apivar correctly to help avoid the development of resistance. Don’t overdose or underdose, remove after 6-10 weeks, do not leave in over the winter.
Secondly, use alternate treatments to knock back the mite population. This is again a classic strategy to avoid selecting for resistance.
For example, use Apivar in late summer and Api-Bioxal in midwinter.
The mechanism of action of these two treatments is fundamentally different, so resistance to one will not confer resistance to the other (and there are no documented cases of oxalic acid resistance I’m aware of).
If you don’t treat midwinter (and you probably should 7) then use Apiguard one year and Apivar the next. Again, totally different mechanisms of action.
Finally, do not rely on individual colonies within an apiary being indicative of all colonies. I know some beekeepers who only conduct mite drop counts in one colony as a ‘sentinel’ 8.
If the drop is high then treatment is needed.
Or vice versa … no mites, so no treatment needed.
There’s a lot of colony to colony variation so it’s worth monitoring them all 9. And this is probably even more important with the colony level Apivar resistance reported in this paper.