Understanding when and where a group of species evolved tells a story of the world they emerged into.
This past week, I had an interesting new scientific paper come across my desk, on the systematics of fir trees – genus Abies in the Pine family. I realize fir tree systematics is not what the average person might consider compelling reading, but if you look past the statistical tests and lines of code, systematics can be great reading for the imagination. In this paper, the researchers reconstructed the speciation pulse (a period of time when a bunch of species came into existence fairly quickly) of fir trees native to the Mediterranean Basin, then dated it using fossils. The main finding of the paper was that this burst had actually happened millions of years earlier than had been supposed.
The really neat part is how they then used this finding to paint a picture of the world when fir trees were colonizing the Mediterranean. At the time these fir trees diversified, what we now think of as the Mediterranean climate, temperate and favourable to biodiversity, didn’t exist, and wouldn’t for tens of millions of years yet. At this time, the Oligocene-Miocene boundary, around 23 million years ago, the global climate had been cooling for some time, which favoured the expansion of gymnosperms like firs. A single ancestor came to the Mediterranean from Asia and quickly (for trees) spread through the whole of the Mediterranean Basin, leaving a dozen or so new species in its wake. This was going on at a time when the first apes were evolving and grasslands were just forming for the first time. The Andes didn’t exist yet and Australia was moving northward and drying out. What we now know about Mediterranean fir trees adds a new detail to this picture of a world very different from, but turning into, our own.
In my own work on the systematics of the Dialiinae, I’ve always been captivated by a genus called Labichea. It’s a group of about 14 shrubs native to Queensland, Australia. The earlier-evolving species of the genus have pinnate leaves with broad leaflets, sort of like walnut trees (but smaller and more oval). From this, there is an evolutionary progression to species with fewer leaflets, which become long and thin, covered with hairs to keep the wind from pulling water away from the surface. They become tougher and more leathery to the touch. In one species, the leaflets have narrowed and curled up on themselves so much that they are, for all intents and purposes, needles now. Lining these species up next to one another is like watching a plant evolve and adapt to an increasingly hot, dry Australian outback in real time.
Studying the morphology – the physical form – of a plant will tell you a lot about the challenges of the place it evolved and what it had to survive to make it to today, but reconstructing the evolutionary tree of a group creates a sort of speculative saga of a group of species changing as the world changed and it pushed out to new territories and new niches.
I think that’s the difference between finding science a collection of dry facts and finding it utterly compelling to learn about… knowing that you need to see the forest rather than just the trees, and to find the story of life that all those facts come together to tell you.
[Note to sharp readers: yes, I know those are not fir trees in the image. But those are definitely the mists of time you’re seeing between them.]
Palm trees have had to develop some creative strategies to survive drought and cool climates.
My recent work with Botany One writing news briefs for newly published botany research has had me reading a lot of scientific articles, and it just so happens that twice in the last couple of weeks, they’ve dealt with new research on palms. Having worked mostly on legumes as a researcher, I’d never given much thought to these fascinating plants, beyond the fact that they look good on a tropical beach. But there’s a lot to them, as I’ve been finding out lately, and I thought I’d write a little post to share what I’ve learned.
Palms are members of the Arecaceae family, which has around 2600 species spread through the world’s tropical and subtropical regions. They are monocots, like grasses or orchids. The arborescent, or tree-like, members of the palm family – what we’d call a palm tree – are unique among tall trees in that they have no vascular cambium. This is the cell layer in the trunk of a tree that allows it to widen year upon year, and is also responsible for tree growth rings. If you cut a palm tree down, there are no annual rings in its trunk, because that trunk didn’t continue to grow. (It’s also why their trunks look so cylindrical, as opposed to the usual tapering you see in a tree trunk.) This imposes some interesting restrictions on the tree. For instance, the tree’s vasculature cannot be renewed, as it is in other trees. The cells making up the tubes that transport water and nutrients through the trunk must last the entire life of the tree, which can be upwards of 100 years in some species.
While more than 90% of palms are restricted to tropical rainforests, some also occur in cool, high-altitude regions and arid deserts. Unlike most of the plants that live in cool and dry habitats, palms lack dormancy mechanisms, such as dropping their leaves, that would help them to survive these conditions. What’s more, like all monocots, palms have no central tap root that would allow them to reach deeper reserves of soil water. So they’ve had to develop some creative survival strategies. Under drought conditions, which some palm trees endure regularly due to their arid habitat, the greatest danger to a plant is vascular embolism. This happens when the water column that runs through the plant breaks because there’s not enough water, and air bubbles form and expand through the xylem tubes. Once a certain amount of air is present in the tube, it will never function again and the tissue it feeds will die. To help counter this, palm trees store water in parenchyma cells adjacent to the xylem, so that when an embolism is imminent, more water can be shifted into the column. Their anatomy also encourages any embolisms that do occur to form closer to the tip of the leaf, as opposed to near or inside the trunk, where they would do greater damage.
Palms have a neat survival trick to help their seeds germinate at low temperatures. Most palms store oil in their seeds to provide sustenance for the seedling when it germinates. This oil is usually high in saturated fats, which aren’t liquid at low temperatures. This would mean that seeds either couldn’t germinate under cool conditions, or would risk starvation if they did. New research has found that palms growing in cooler climates have evolved their own oil blend rich in unsaturated fats, which are liquid at lower temperatures, to help their seeds thrive in those habitats.
Speaking of oil storage, palms have been hugely important to human beings since before the dawn of civilization, all thanks to those oils, which can occur in both the seed and the fruit, and provide a high calorie food source. The best known is coconut, Cocos nucifera, with its greasy, delicious seed, which we eat as a fruit. In fact, the fruit of a coconut isn’t a nut at all, it’s a drupe. But while coconut is perhaps the most familiar palm food, the most economically important is certainly the oil palm, genus Elaeis. The oil that comes from this palm is high in saturated fat, making it useful for deep-frying (and bio-fuel), if not the best for your health. The use of palm oil is controversial, because of the environmental and human rights abuses linked to its production, yet production is ongoing in regions of Africa, Asia, and the Americas. Outside of their oil production, palms are also the source of dates, palm syrup, carnauba wax, and wood.
Recent research has found that the seeds with the greatest oil storage are all grouped in the tribe Cocoseae, but that palms with oily fruits and moderately oily seeds abound throughout the family. This suggests there may yet be nutritionally and economically valuable species waiting to be discovered, though whether the further exploitation of these resources is a welcome development is debatable.
How Evolution Made Baby-faced Humans & Adorable Dogs
Who among us hasn’t looked at the big round eyes of a child or a puppy gazing up at us and wished that they’d always stay young and cute like that? You might be surprised to know that this wish has already been partially granted. Both you as an adult and your full-grown dog are examples of what’s referred to in developmental biology as paedomorphosis (“pee-doh-mor-fo-sis”), or the retention of juvenile traits into adulthood. Compared to closely related and ancestral species, both humans and dogs look a bit like overgrown children. There are a number of interesting reasons this can happen. Let’s start with dogs.
When dogs were domesticated, humans began to breed them with an eye to minimizing the aggression that naturally exists in wolves. Dogs that retained the puppy-like quality of being unaggressive and playful were preferentially bred. This caused certain other traits associated with juvenile wolves to appear, including shorter snouts, wider heads, bigger eyes, floppy ears, and tail wagging. (For anyone who’s interested in a technical explanation of how traits can be linked like this, here’s a primer on linkage disequilibrium from Discover. It’s a slightly tricky, but very interesting concept.) All of these are seen in young wolves, but disappear as the animal matures. Domesticated dogs, however, will retain these characteristics throughout their lives. What began as a mere by-product of wanting non-aggressive dogs has now been reinforced for its own sake. We love dogs that look cute and puppy-like, and are now breeding for that very trait, which can cause it to be carried to extremes, as in breeds such as the Cavalier King Charles spaniel, leading to breed-wide health problems.
Foxes, another type of wild dog, have been experimentally domesticated by scientists interested in the genetics of domestication. Here, too, as the foxes are bred over numerous generations to be friendlier and less aggressive, individuals with floppy ears and wagging tails – traits not usually seen in adult foxes – are beginning to appear.
But I mentioned this happening in humans, too, didn’t I? Well, similarly to how dogs resemble juvenile versions of their closest wild relative, humans bear a certain resemblance to juvenile chimpanzees. Like young apes, we possess flat faces with small jaws, sparse body hair, and relatively short arms. Scientists aren’t entirely sure what caused paedomorphosis in humans, but there are a couple of interesting theories. One is that, because our brains are best able to learn new skills prior to maturity (you can’t teach an old ape new tricks, I guess), delayed maturity, and the suite of traits that come with it, allowed greater learning and was therefore favoured by evolution. Another possibility has to do with the fact that juvenile traits – the same ones that make babies seem so cute and cuddly – have been shown to elicit more helping behaviour from others. So the more subtly “baby-like” a person looks, the more help and altruistic behaviour they’re likely to get from those around them. Since this kind of help can contribute to survival, it became selected for.
Of course, dogs and humans aren’t the only animals to exhibit paedomorphosis. In nature, the phenomenon is usually linked to the availability of food or other resources. Interestingly, both abundance and scarcity can be the cause. Aphids, for example, are small insects that suck sap out of plants as a food source. Under competitive conditions in which food is scarce, the insects possess wings and are able to travel in search of new food sources. When food is abundant, however, travel is unnecessary and wingless young are produced which grow into adulthood still resembling juveniles. Paedomorphosis is here induced by abundant food. Conversely, in some salamanders, it is brought on by a lack of food. Northwestern salamanders are typically amphibious as juveniles and terrestrial as adults, having lost their gills. At high elevations where the climate is cooler and a meal is harder to come by, many of these salamanders remain amphibious, keeping their gills throughout their lives because aquatic environments represent a greater chance for survival. In one salamander species, the axolotl (which we’ve discussed on this blog before), metamorphosis has been lost completely, leaving them fully aquatic and looking more like weird leggy fish than true salamanders.
So paedomorphosis, this strange phenomenon of retaining juvenile traits into adulthood, can be induced by a variety of factors, but it’s a nice demonstration of the plasticity of developmental programs in living creatures. Maturation isn’t always a simple trip from point A to point B in a set amount of time. There are many, many genes at play, and if nature can tweak some of them for a better outcome, evolution will ensure that the change sticks around.
Martin & Gordon (1995) Journal of Evolutionary Biology 8:339-354
We think of scientific progress as working like building blocks constantly being added to a growing structure, but sometimes a scientific discovery can actually lead us to realize that we know less than we thought we did. Take vision, for instance. Vertebrates (animals with backbones) have complex, highly-developed “camera” eyes, which include a lens and an image-forming retina, while our invertebrate evolutionary ancestors had only eye spots, which are comparatively very simple and can only sense changes in light level.
At some point between vertebrates and their invertebrate ancestors, primitive patches of light-sensitive cells, which served only to alert their owners to day/night cycles and perhaps the passing of dangerous shadows, evolved into an incredibly intricate organ capable of forming clear, sharp images; distinguishing minute movements; and detecting minor shifts in light intensity.
In order for evolutionary biologists to fully understand when and how this massive leap in complexity was made, we need an intermediate stage. Intermediates usually come in the form of transitional fossils; that is, remains of organisms that are early examples of a new lineage, and don’t yet possess all of the features that would later evolve in that group. An intriguing and relatively recent example is Tiktaalik, a creature discovered on Ellesmere Island (Canada) in 2004, which appears to be an ancestor of all terrestrial vertebrates, and which possesses intermediate characteristics between fish and tetrapods (animals with four limbs, the earliest of which still lived in the water), such as wrist joints and primitive lungs. The discovery of this fossil has enabled biologists to see what key innovations allowed vertebrates to move onto land, and to precisely date when it happened.
There are also species which are referred to as “living fossils”, organisms which bear a striking resemblance to their ancient ancestors, and which are believed to have physically changed little since that time. (We’ve actually covered a number of interesting living fossils on this blog, including lungfish, Welwitschia, aardvarks, the platypus, and horseshoe crabs.) In the absence of the right fossil, or in the case of soft body parts that aren’t usually well-preserved in fossils, these species can sometimes answer important questions. While we can’t be certain that an ancient ancestor was similar in every respect to a living fossil, assuming so can be a good starting point until better (and possibly contradictory) evidence comes along.
So where does that leave us with the evolution of eyes? Well, eyes being made of soft tissue, they are rarely well preserved in the fossil record, so this was one case in which looking at a living fossil was both possible and made sense.
Hagfish, which look like a cross between a snake and an eel, sit at the base of the vertebrate family tree (although they are not quite vertebrates themselves), a sort of “proto-vertebrate.” Hagfish are considered to be a living fossil of their ancient, jawless fish ancestors, appearing remarkably similar to those examined from fossils. They also have primitive eyes. Assuming that contemporary hagfishes were representative of their ancient progenitors, this indicated that the first proto-vertebrates did not yet have complex eyes, and gave scientists an earliest possible date for the development of this feature. If proto-vertebrates didn’t have them, but all later, true vertebrates did, then complex eyes were no more than 530 million years old, corresponding to the time of the common ancestor of hagfish and vertebrates. Or so we believed.
This past summer, a new piece of research was published which upended our assumptions. A detailed electron microscope and spectral analysis of fossilized Mayomyzon (the hagfish ancestor) has indicated the presence of pigment-bearing organelles called melanosomes, which are themselves indicative of a retina. Previously, these melanosomes, which appear in the fossil as dark spots, had been interpreted as either microbes or a decay-resistant material such as cartilage.
This new finding suggests that the simple eyes of living hagfish are not a trait passed down unchanged through the ages, but the result of degeneration over time, perhaps due to their no longer being needed for survival (much like the sense of smell in primates). What’s more, science has now lost its anchor point for the beginning of vertebrate-type eyes. If an organism with pigmented cells and a retina existed 530 million years ago, then these structures must have begun to develop significantly earlier, although until a fossil is discovered that shows an intermediate stage between Mayomyzon and primitive invertebrate eyes, we can only speculate as to how much earlier.
This discovery is intriguing because it shows how new evidence can sometimes remove some of those already-placed building blocks of knowledge, and how something as apparently minor as tiny dark spots on a fossil can cause us to have to reevaluate long-held assumptions.
Gabbott et al. (2016) Proc. R. Soc. B. 283: 20161151
Lamb et al. (2007) Nature Rev. Neuroscience 8: 960-975
*The image at the top of the page is of Pacific hagfish at 150 m depth, California, Cordell Bank National Marine Sanctuary, taken and placed in the public domain by Linda Snook.
“Now, here, you see, it takes all the running you can do to keep in the same place.”
From a simple reproductive perspective, males are not a good investment. With apologies to my Y chromosome-bearing readers, let me explain. Consider for a moment a population of clones. Let’s go with lizards, since this actually occurs in lizards. So we have our population of lizard clones. They are all female, and are all able to reproduce, giving them twice the potential for creating new individuals compared with a species that reproduces sexually, in which only 50% of the members can bear young. Males require all the same resources to survive to maturity, but cannot directly produce young. From this viewpoint alone, the population of clones should out-compete a bunch of sexually-reproducing lizards every time. Greater growth potential. What’s more, the clonal lizards can better exploit a well-adapted set of genes (a “genotype”); if one of them is well-suited to survive in its environment, they all are.
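The growth-rate argument above can be sketched with a little arithmetic. This is a toy model of my own, not taken from any of the papers cited below: assume every female produces two offspring per generation, and compare an all-female clonal population with a sexual one in which only half the members are female.

```python
# Toy model of the "twofold cost of sex" (hypothetical numbers):
# only females reproduce, and each produces two offspring per generation.

def grow(pop, female_fraction, offspring_per_female=2, generations=10):
    """Return population size after the given number of generations."""
    for _ in range(generations):
        # Only the female fraction of the population bears young.
        pop = pop * female_fraction * offspring_per_female
    return pop

clonal = grow(100, female_fraction=1.0)  # all-female clones
sexual = grow(100, female_fraction=0.5)  # half the members are male

print(clonal)  # the clonal population grows ~1000-fold in 10 generations
print(sexual)  # the sexual population merely replaces itself
```

Under these (deliberately simplified) assumptions, the clonal population doubles every generation while the sexual one only holds steady, which is exactly why the persistence of males demands an explanation.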
Now consider a parasite that preys upon our hypothetical lizards. The parasites themselves have different genotypes, and a given parasite genotype can attack certain host (i.e. lizard) genotypes, like keys that fit certain locks. Over time, they will evolve to be able to attack the most common host genotype, because that results in their best chance of survival. If there’s an abundance of host type A, but not much B or C, then more A-type parasites will succeed in reproducing, and over time, there will be more A-type parasites overall. This is called a selection pressure, in favour of A-type parasites. In a population of clones, however, there is only one genotype, and once the parasites have evolved to specialise in attacking it, the clones have met their match. They are all equally vulnerable.
The sexual species, however, presents a moving target. This is where males become absolutely worth the resources it takes to create and maintain their existence (See? No hard feelings). Each time a sexual species mates, its genes are shuffled and recombined in novel ways. There are both common and rare genotypes in a sexual population. The parasite population will evolve to be able to attack the most common genotype, as they do with the clones, but in this case, it will be a far smaller portion of the total host population. And as soon as that particular genotype starts to die off and become less common, a new genotype, once rare (and now highly successful due to its current resistance to parasites), will fill the vacuum and become the new ‘most common’ genotype. And so on, over generations and generations.
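The chase described above can be sketched in a few lines of code. This is a hypothetical toy model, not something from the references below: parasites always adapt to attack the currently most common host genotype, which knocks that genotype back and lets a once-rare genotype take its place.

```python
# Toy model of negative frequency-dependent selection (hypothetical):
# whichever host genotype is most common suffers the heaviest parasite load.

def next_generation(freqs, penalty=0.5):
    """Penalize the most common genotype, then renormalize frequencies."""
    target = max(freqs, key=freqs.get)   # parasites specialise on this one
    freqs = dict(freqs)
    freqs[target] *= penalty             # the common genotype is hit hardest
    total = sum(freqs.values())
    return {g: f / total for g, f in freqs.items()}

freqs = {"A": 0.7, "B": 0.2, "C": 0.1}
for gen in range(6):
    common = max(freqs, key=freqs.get)
    print(f"generation {gen}: most common genotype = {common}")
    freqs = next_generation(freqs)
```

Run it and the identity of the "most common" genotype keeps changing from generation to generation: no genotype stays on top for long, which is the Red Queen's moving target in miniature.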
Both species, parasite and host, must constantly evolve simply to maintain the status quo. This is where the Red Queen hypothesis gets its name: in Wonderland, the Red Queen tells Alice, “here, you see, it takes all the running you can do to keep in the same place.” For many years, evolution was thought of as a journey with an endpoint: species would evolve until they were optimally adapted to their environment, and then stay that way until the environment changed in some fashion. If this was the case, however, we would expect that a given species would be less likely to go extinct the longer it had existed, because it would be better and better adapted over time. And yet, the evidence didn’t seem to bear this prediction out. The probability of extinction seemed to stay the same regardless of the species’ age. We now know that this is because the primary driver of evolution isn’t the environment, but competition between species. And that’s a game you can lose at any time.
Now the parasite attacking the lizards was just a (very plausible) hypothetical scenario, but there are many interesting cases of the Red Queen at work in nature. And it’s not all subtly shifting genotypes, either; sometimes it’s a full-on arms race. Behold the passionflower. In the time of the dinosaurs, passionflowers developed a mutually beneficial pollinator relationship with longwing butterflies. The flowers got pollinated, the butterflies got nectar. But then, over time, the butterflies began to lay their eggs on the vines’ leaves. Once the eggs hatched, the young would devour the leaves, leaving the plant much the worse for wear. In response, the passionflowers evolved to produce cyanide in their leaves, poisoning the butterfly larvae. The butterflies then turned the situation to their advantage by evolving the ability to not only eat the poisonous leaves, but to sequester the cyanide in their bodies and use it to themselves become poisonous to their predators, such as birds. The plants’ next strategy was to mimic the butterflies’ eggs. Longwing butterflies will not lay their eggs on a leaf which is already holding eggs, so the passionflowers evolved nectar glands of the same size and shape as a butterfly egg. After aeons of this back and forth, the butterflies are currently laying their eggs on the tendrils of the passionflower vines rather than the leaves, and we might expect that passionflowers will next develop tendrils which appear to have butterfly eggs on them. These sorts of endless, millennia-spanning arms races are common in nature. Check out my article on cuckoos for a much more murderous example.
Had the passionflowers in this example been a clonal species, they likely wouldn’t have stood a chance. Innovations upon which defences can be built, such as higher-than-average levels of cyanide or slightly more bulbous nectar glands, come from uncommon genotypes – genotypes produced by the shuffling of genes that occurs in every generation in sexual species.
And that, kids, is why sex is such a fantastic innovation. (Right?) Every time an illness goes through your workplace, and everybody seems to get it but you, you’ve probably got the Red Queen (and your uncommon genotype) to thank.
Brockhurst et al. (2014) Proc. R. Soc. B 281: 20141382.
Lively (2010) Journal of Heredity 101 (suppl.): S13-S20 [See this paper for a very interesting full explanation of the links between the Red Queen hypothesis and the story by Lewis Carroll.]
Vanderplank (1996) Passion Flowers, 2nd Ed. Cambridge: MIT Press.
*The illustration at the top of the page is by Sir John Tenniel for Lewis Carroll’s “Through the Looking Glass,” and is now in the public domain.
Throughout evolution, there have been, time and time again, key biological innovations that have utterly changed history thereafter. Perhaps the most obvious is the one you’re using to read this: the human brain. The development of the anatomically modern human brain has profoundly changed the face of the planet and allowed humans to colonize nearly every part of the globe. But an equally revolutionary innovation from an earlier time stares us in the face each day and goes largely unremarked upon. Flowers. (Stay with me here, guys… ) We think of them as mere window dressing in our lives. Decorations for the kitchen table. But the advent of the flowering plants, or “angiosperms”, has changed the world profoundly, including allowing those magnificent human brains to evolve in the first place.
Having arisen sometime around the late Jurassic to early Cretaceous (150-190 million years ago), angiosperms come in every form from delicate little herbs to vines and shrubs, to towering rainforest canopy trees. They exist on every continent, including Antarctica, which even humans have failed to develop permanent homes on, and in every type of climate and habitat. They exploded from obscurity to the dominant form of plant life on Earth so fast that Darwin himself called their evolution an “abominable mystery”, and biologists to this day are unable to nail down exactly why they’ve been so incredibly successful. Nearly 90% of all terrestrial plant species alive today are angiosperms. If we measure success by the number of species that exist in a given group, there are two routes by which it can be improved – by increasing the rate at which new species arise (“speciation”), or by decreasing the rate at which those species go extinct. Let’s take a look at a couple of the features of flowers that have likely made the biggest difference to those metrics.
Picture a world without flowers. The early forests are a sea of green, dominated by ferns, seed ferns, and especially, gymnosperms (that is, conifers and other related groups). Before the angiosperms, reproduction in plants was a game of chance. Accomplished almost exclusively by wind or water, fertilization was haphazard and required large energy inputs to produce huge amounts of spores or pollen grains, of which relatively few would ever make their way to the desired destination. It was both slow and inefficient.
The appearance of flowers drew animals into the plant reproduction game as carriers for pollen – not for the first time, as a small number of gymnosperms are known to be insect pollinated – but at a level of control and specificity never before seen. Angiosperms have recruited ants, bees, wasps, butterflies, moths, flies, beetles, birds, and even small mammals such as bats and lemurs to do their business for them. The stunning variety of shapes, sizes, colours, and odours of flowers in the world today have arisen to seduce and retain this range of pollinators. Some plant species are generalists, while others have evolved to attract a single pollinator species, as in the case of bee orchids, or plants using buzz pollination, in which a bumblebee must vibrate the pollen loose with its flight muscles. In return, of course, the pollinators are rewarded with nectar or nutritious excess pollen. Or are at least tricked into thinking they will be. Angiosperms are paying animals to do their reproductive work for them, and thanks to incentivisation, the animals are doing so with gusto. Having a corps of workers whose survival is linked to their successful pollination has allowed the flowering plants to breed and expand their populations and territory quickly, like the invading force they are, and has lowered extinction rates in this group well below that of their competitors. But what happens when you expand into new territory to find that your pollinators don’t exist there? Or members of your own species are simply too few and far between for effective breeding?
Another unique feature that came with flowers is the ability to self-fertilise. “Selfing”, as it’s called, is a boon to the survival of plants in areas where pollinators can be hard to come by, such as very high latitudes or elevations; pollen simply fertilises its own flower or another flower on the same plant. Selfing can also aid sparse populations of plants that are moving into new territories, since another member of its species doesn’t need to be nearby for reproductive success. It even saves on energy, since the flower doesn’t have to produce pleasant odours or nectar rewards to attract pollinators. Around half of all angiosperms can self-fertilise, although only 10-15% do so as their primary means of reproduction. Why, you may ask, since it’s such an effective strategy? Well, it’s an effective short-term strategy. Because the same genetic material keeps getting reused, essentially, in each successive generation (it is inbreeding, after all), over time the diversity in a population goes down, and harmful mutations creep in that can’t be purged via the genetic mix-and-match that goes on in normal sexual reproduction. Selfing as a sole means of procreation is a slow ticket to extinction, which is why most plants that do it use a dual strategy of outbreeding when possible and inbreeding when necessary. As a short-term strategy, however, it can allow a group of new colonists to an area to survive long enough to build up a breeding population and, in cases where that population stays isolated from the original group, eventually develop into a new species of its own. This is how angiosperms got to be practically everywhere… they move into new areas and use special means to survive there until they can turn into something new. I’m greatly simplifying here, of course, and there are additional mechanisms at play, but this starts to give an idea of what an unstoppable force our pretty dinnertable centrepieces really are.
Angiosperms are, above all, adaptable. Their history of utilising all possible avenues to ensure reproductive success is unparalleled. As I mentioned, we have the humble flower to thank for our own existence. Angiosperms are the foundation of the human – and most mammal – diets. Both humans and their livestock are nourished primarily on grasses (wheat, rice, corn, etc.), one of the latest-evolving groups of angiosperms (with tiny, plain flowers that you barely notice and which, just to complicate the point I’m trying to make here, are wind-pollinated). Not to mention that every fruit, and nearly every other type of plant matter you’ve ever eaten also come from angiosperms. They are everywhere. So the next time you buy flowers for that special someone, spare a moment to appreciate this world-changing sexual revolution in the palm of your hand.
Armbruster (2014) AoB Plants 6: plu003
Chanderbali et al. (2016) Genetics 202: 1255-1265
Crepet & Niklas (2009) American Journal of Botany 96(1): 366-381
Endress (2011) Annals of Botany 107: 1465-1489
Sicard & Lenhard (2011) Annals of Botany 107: 1433-1443
Wright et al. (2013) Proc. Biol. Sci. 280(1760): 20130133
All species likely descended from the same ancestor
Common ancestor lived 50-100 million years ago
Found: Fresh water bodies of any size, on every continent, including Antarctica
It Does What?!
Here’s a creature that truly exhibits questionable evolution- as in, the kind that tends to make you go extinct in a hurry. Bdelloid rotifers (the ‘B’ is silent) are microscopic animals found in all kinds of moist, freshwater habitats- puddles, ponds, mossy areas; you name it, they’re probably there. What’s so unusual about these guys is that they’re entirely asexual, and have been for a very, very long time. In fact, bdelloid rotifers are all female, a consequence of how they reproduce.
Now, asexual reproduction isn’t so uncommon. If you look at a field of dandelions, chances are, they’re all clones derived from asexual reproduction in a single common ancestor- no second parent needed. Even such advanced creatures as Komodo dragons do this periodically- a baby dragon is formed from an unfertilized egg inside the mother. What differentiates bdelloid rotifers from other asexual reproducers is that it’s all they’ve done for the last 50 million years or more. Outside of our friends the rotifers, a species must either have sex from time to time, or face extinction.
Why? Because sex solves two major problems in life (your individual results may vary…). First, it weeds out errors which tend to accumulate in DNA over time. Unlike asexuals, which pass on a copy of a copy of a copy (etc.) of their genes, sperm and egg cells contain DNA which has been mixed and matched via a process called meiosis. The gist of this is that an organism can procreate without necessarily passing on any genetic errors it may have to the next generation. Second, this same process of mixing and matching creates new combinations of DNA sequences, which in turn create the natural variation between individuals that evolution can select for or against.
For example, a genetic combination which caused a polar bear to be born with a white nose would be selected for, since it would make for more effective camouflage when hunting. On the other hand, a combination which gave polar bears big black patches on their fur would be selected against, because they’d have a harder time hunting and would therefore starve more often. Asexuals, however, can neither quickly generate useful new combinations, nor purge their populations of harmful mutations.
So on the surface, it comes as a surprise to biologists that bdelloid rotifers have been able to survive for such an epic amount of time with no sex (in addition to the absence of males, genetic tests are able to show that meiosis hasn’t occurred). However, the rotifers have two impressive ways of dealing with this. First, when times get tough, they already have a pretty good defence mechanism worked out- they just dry up. The rotifer dehydrates itself and forms a dormant cyst, in which state it can remain until conditions improve. This is called anhydrobiosis.
Second, and more importantly, they steal genes. This is the true secret to the successful asexual lifestyle. When a rotifer emerges from dormancy and needs to patch itself up, it’s actually able to incorporate random genetic material from its environment into its own genome. A nearby bacterium, some fungus, a passing bit of rotting leaf? All fair game, apparently. Researchers have found genes from each of these three groups in the rotifer genome. Incorporating these new bits of sequence seems to give rotifers the variation they need to develop new traits and stay off the evolutionary chopping block. In fact, given the success of the bdelloid rotifers – they’ve evolved into over 300 species since giving up sex – and the ease of asexual procreation – no need to find a partner – an argument could be made that when it comes to new genes, theft really is better than sex.
Gladyshev et al. (2008) Science 320(5880): 1210-1213