Climbing grades and lichen taxonomy

I’m enjoying Robbie Phillips’ (and everyone else’s) grade rants on Instagram at the moment, and it’s got me thinking about some of the similarities between climbing grades and taxonomy – the science of biological classification.

Whereas climbing has numerous grading systems depending on what you’re climbing and where in the world you’re climbing it, biology has one. Based upon a system popularised by Carl Linnaeus in seventeen-hundred-and-something, organisms are named according to levels of increasing inclusivity. Species are at the finest end of the scale, with closely related species being included in the same genus; this is the basis of the ‘Latin’ name that we’re all used to seeing. As an example, the lichen genus (pl. genera) Parmelia contains several related species: Parmelia sulcata, Parmelia saxatilis, Parmelia omphalodes etc. Related genera are grouped together into families, families into orders, orders into classes, classes into phyla, and phyla into kingdoms. Understanding how organisms are organised within these different ranks is a key part of how we characterise biological diversity. It’s when we try to decide where any given organism fits within this classification system that the similarities to climbing grades become apparent.
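To make the nesting concrete, here is the full stack of ranks for one of those species written out as a named character vector in R (a quick sketch; the ranks follow the usual placement of Parmelia sulcata among the fungi):

```r
# The ranked classification of Parmelia sulcata, from finest (species)
# to most inclusive (kingdom), as a named character vector
parmelia_sulcata <- c(
  species = "Parmelia sulcata",
  genus   = "Parmelia",
  family  = "Parmeliaceae",
  order   = "Lecanorales",
  class   = "Lecanoromycetes",
  phylum  = "Ascomycota",
  kingdom = "Fungi"
)

# Pull out a single rank by name
parmelia_sulcata["genus"]
```

Parmelia saxatilis and Parmelia omphalodes would share every entry here except the species name, which is exactly what makes them congeners.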

Both grades and taxonomy can be, to a greater or lesser extent, subjective. Whether something fits into one level or another only really makes sense within the context of whatever classification system it is that you’re using, and where the boundaries fall will depend entirely on the rules that you use. 7b+ or 7c if you’re short. E9 if you go one way, E7 if you go the other. HS if you can jam, E3 if you can’t. E1? HVS with pads mate. While the aim is to reach some sort of consensus grade there will always be folks who disagree, and defining species can be just as controversial. Although the levels of biological classification are more-or-less fixed, where the edges between them actually lie is up for debate and can be influenced by how you go about defining a species (what’s called a species concept). Do you define species by whether they interbreed (biological species concept), have similar DNA (phylogenetic species concept), look similar (morphological species concept), or live in similar places (ecological species concept)? In some tidy situations species might be well defined by all of these criteria, but often the answer changes depending on how you approach the question.

Me on Non-Metallic Silver at Irish Man’s Wall. E2 in the old guide, E1 in the new one although it was definitely E2 when I did it because it was my birthday.

Classification systems tend to evolve over time, and just as climbing grades vary from one edition of the guide to another, so species names change. Lichen taxonomy in particular has undergone extensive revision in recent years. Lobaria virens has turned into Ricasolia virens (main picture). Alectoria nigricans became Gowardia nigricans, only to revert to its original name a few years later. The entire family Teloschistaceae was broken up, leading to new names for a remarkable 31 genera, the validity of which is still to be determined.

Alectoria (not Gowardia) nigricans

Like in climbing, debates about taxonomy can become heated. Just as a first ascensionist might not be happy about their route being downgraded, so a taxonomist could find themselves having to defend their new species circumscription against suggestions that it might be wrong. Once a new grade or revised species is finally settled on, the outcome can have repercussions for anyone using those classification systems. It can be demoralising to find that the route you’ve proudly been citing as the hardest thing you’ve ever done has been downgraded, and nothing frustrates a lichenologist like being told that you shouldn’t refer to your favourite lichen by the name you’ve been using quite happily for the past twenty-five years. Both grades and taxonomy can be cruel.

It’s all, of course, meaningless. A lichen doesn’t know what species it is any more than a rock knows how hard it is to climb, their names and grades imposed on them by us as a means of better interpreting our world. All part of this weird human desire to classify things. Yet meaningless as it is, I do wonder whether there’s some deep-rooted evolutionary reason for this sort of behaviour. Non-human animals have been classifying things as food or not-food for hundreds of millions of years, and an innate desire to put things in boxes might literally be written in our DNA. Or it might not. At the end of the day, I just enjoy climbing rocks and looking at lichens. The rest is just academic.

NC

A.I. Lichens

I’ve recently been convinced to start using ChatGPT a bit more. I’ll confess, I don’t really ‘get’ A.I. yet. The actual nuts and bolts of the computational stuff involved are way over my head. I’m sure that it’ll revolutionise the world, but what about my own work? Despite some initial reservations I’m beginning to think there’s scope for using A.I. as a tool, but maybe not in the ways that I would have expected.

Unusually for a dry Saturday I’m not out climbing or lichen hunting, but sat at home nursing some lingering new year flu symptoms. In the interests of convalescing productively I thought I’d get ChatGPT to draw some lichens for me in R by asking the following question:

“Write me some code that will draw me a picture of the lichens Cladonia pyxidata, Xanthoria parietina, Rhizocarpon geographicum, and Lichina pygmaea using ggplot2.”

Me

ChatGPT was perfectly willing to carry out my request, but did offer me a word of caution:

“Creating detailed representations of multiple lichens in a single ggplot2 plot is quite complex due to the intricate structures of lichens. However, I can provide a basic example that includes a simplified representation of each lichen: Cladonia pyxidata, Xanthoria parietina, Rhizocarpon geographicum, and Lichina pygmaea.”

ChatGPT 3.5

Fair enough. The code generated was the following:

# Install and load ggplot2 if not already installed
if (!requireNamespace("ggplot2", quietly = TRUE)) {
  install.packages("ggplot2")
}
library(ggplot2)

# Function to draw simplified representations of multiple lichens
draw_lichen_community <- function() {
  ggplot() +
    # Cladonia pyxidata
    geom_polygon(
      aes(x = c(2, 4, 5, 6, 7, 8, 9), y = c(2, 3, 4, 3, 4, 3, 2)),
      fill = "green", color = "black"
    ) +
    geom_point(
      aes(x = c(4, 5, 6, 7, 8), y = c(3, 4, 3, 4, 3)),
      size = 3, color = "brown"
    ) +
    
    # Xanthoria parietina
    geom_polygon(
      aes(x = c(11, 12, 13, 14, 13), y = c(4, 5, 6, 5, 4)),
      fill = "orange", color = "black"
    ) +
    geom_point(
      aes(x = 13, y = 5),
      size = 3, color = "yellow"
    ) +
    
    # Rhizocarpon geographicum
    geom_polygon(
      aes(x = c(18, 19, 20, 21, 20), y = c(3, 4, 5, 4, 3)),
      fill = "gray", color = "black"
    ) +
    geom_point(
      aes(x = c(19, 20, 21), y = c(4, 5, 4)),
      size = 3, color = "orange"
    ) +
    
    # Lichina pygmaea
    geom_polygon(
      aes(x = c(25, 26, 27, 26), y = c(2, 3, 2, 1)),
      fill = "purple", color = "black"
    ) +
    geom_point(
      aes(x = 26, y = 2),
      size = 3, color = "pink"
    ) +
    
    # Set axis limits
    coord_cartesian(xlim = c(0, 30), ylim = c(0, 7)) +
    # Remove axis labels and ticks
    theme_void()
}

# Call the function to draw a simplified representation of multiple lichens
draw_lichen_community()

So far so good. I like the fact that it decided to refer to my plot as a lichen ‘community’; there’s obviously enough in its training material to tell it that this is a word commonly used when referring to assemblages of different species (although the habitat that supported a community containing all of these species together would be a rather odd one indeed). ChatGPT handed off with one final nugget:

“This code attempts to represent Cladonia pyxidata, Xanthoria parietina, Rhizocarpon geographicum, and Lichina pygmaea in a single plot. Adjustments and refinements may be necessary to better represent the specific features of each lichen. Keep in mind that this is a highly stylized and abstract representation for illustrative purposes.”

ChatGPT 3.5

This was the result:

A lichen community in ggplot2, as imagined by ChatGPT. Left to right: Cladonia pyxidata, Xanthoria parietina, Rhizocarpon geographicum, Lichina pygmaea.

At first glance this could appear either awful or amazing depending on your frame of mind, but there are some really interesting things here. First up is how colour has been chosen. The Cladonia is greenish and the Xanthoria has come out in the expected orange. Grey-ish components of the Rhizocarpon geographicum thallus have been interpreted as dominant, while Lichina pygmaea is a fetching purple. Of course none of these colours are true to life, but as stylized choices I think they all work pretty well (with the exception of R. geographicum, which I’ll come back to in a minute).

There’s also been recognition that apothecia are different things to the thallus, being shown as different coloured points on top of the main thallus polygon. This has come out best on the Cladonia, with clear red/brown points contrasting with a green body, just like (well, sort of) the real thing. Xanthoria and Lichina are different shades of the thallus colour, which is fair. Interestingly, while the two-tone nature of Rhizocarpon has been recognised, it’s a bit muddled: in real life the thallus is mostly yellow with black apothecia, but the plot shows the inverse, with a black (grey) thallus and yellow (orange) apothecia. While wrong, you can see why this could happen; R. geographicum thalli in real life are pretty variable and I think the uninitiated human would probably struggle to differentiate between which part was which. In its defence, ChatGPT isn’t a lichenologist and doesn’t really know what a thallus or an apothecium is anyway.

The last thing to note is general morphology, which really hasn’t been depicted at all, apart from in the ‘spikiness’ of the Cladonia, which could be interpreted as multiple stalks.

The same lichens, but real. Left to right: Cladonia pyxidata, Xanthoria parietina, Rhizocarpon geographicum, Lichina pygmaea.

Some things to think about: ggplot2 is not the tool for drawing life-like representations of anything, let alone lichens, so I was already giving ChatGPT a pretty severe handicap. By doing so, it’s had an attempt at isolating some key features that it can use to answer the question. Interestingly, these are the key characteristics that we as people are drawn to as well: colour and shape. I often say to people that one of the reasons that lichens can be so hard for beginners is that there’s no archetypal lichen, no basic form that our minds jump to when we say the word ‘lichen’. Ask people what they think a lichen looks like and many will visualise Cladonia cups, while others think of a crust on a rock. There are no obvious straightforward rules that apply to all lichens as there are for something like trees (i.e. roots, trunk, branches, leaves). We might then use ChatGPT’s choices of what to put in these lichen plots as a way of identifying some rules of our own – that lichens can have two major parts, the thallus and the apothecia, and that lichens can come in lots of different colours.
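Those two rules are simple enough to apply by hand in the same ggplot2 style: a minimal sketch (the coordinates and colours below are entirely made up for illustration) of a generic lichen as one thallus polygon with contrasting apothecia points on top:

```r
library(ggplot2)

# A generic 'rule-based' lichen: one polygon for the thallus,
# with a few contrasting points on top for the apothecia
thallus   <- data.frame(x = c(1, 2, 3, 4, 5, 4, 3, 2),
                        y = c(1, 2, 3, 2, 1, 0, -1, 0))
apothecia <- data.frame(x = c(2.5, 3, 3.5),
                        y = c(1, 1.5, 0.5))

generic_lichen <- ggplot() +
  geom_polygon(data = thallus, aes(x = x, y = y),
               fill = "grey70", colour = "black") +
  geom_point(data = apothecia, aes(x = x, y = y),
             size = 4, colour = "brown") +
  theme_void()

generic_lichen
```

Swap the fill and point colours and you have a different ‘species’; that, in essence, is all ChatGPT was doing above.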

As a test, to see how things went with what would, in my mind, be a simpler problem, I thought I’d feed ChatGPT the same question but with some trees:

“Write me some code that will draw me a picture of the trees Quercus robur, Fraxinus excelsior, Pinus sylvestris, and Picea abies using ggplot2.”

Me

In its response ChatGPT automatically gave me back the common names of the species that I’d asked for, which was nice of it (I can see it struggling to do the same with lichens):

“Drawing realistic representations of trees using ggplot2 in R is quite challenging, as it’s primarily a tool for statistical and data visualization rather than detailed illustrations. However, I can provide a basic example that uses geometric shapes to represent simplified versions of oak (Quercus robur), ash (Fraxinus excelsior), pine (Pinus sylvestris), and spruce (Picea abies).”

ChatGPT 3.5

For anyone interested in the code, here it is:

# Install and load ggplot2 if not already installed
if (!requireNamespace("ggplot2", quietly = TRUE)) {
  install.packages("ggplot2")
}
library(ggplot2)

# Function to draw simplified representations of multiple trees
draw_tree_community <- function() {
  ggplot() +
    # Quercus robur (Oak)
    geom_rect(
      aes(xmin = 2, xmax = 3, ymin = 1, ymax = 5),
      fill = "brown", color = "black"
    ) +
    
    # Fraxinus excelsior (Ash)
    geom_rect(
      aes(xmin = 7, xmax = 8, ymin = 1, ymax = 5),
      fill = "gray", color = "black"
    ) +
    
    # Pinus sylvestris (Pine)
    geom_segment(
      aes(x = 12, y = 1, xend = 12, yend = 5),
      color = "green", size = 2
    ) +
    geom_polygon(
      aes(x = c(11, 12, 13), y = c(5, 8, 5)),
      fill = "green", color = "black"
    ) +
    
    # Picea abies (Spruce)
    geom_segment(
      aes(x = 17, y = 1, xend = 17, yend = 5),
      color = "darkgreen", size = 2
    ) +
    geom_polygon(
      aes(x = c(16, 17, 18), y = c(5, 8, 5)),
      fill = "darkgreen", color = "black"
    ) +
    
    # Set axis limits
    coord_cartesian(xlim = c(0, 20), ylim = c(0, 10)) +
    # Remove axis labels and ticks
    theme_void()
}

# Call the function to draw a simplified representation of multiple trees
draw_tree_community()

Again, ChatGPT has its disclaimer:

“This code uses rectangles and segments to represent simplified versions of Quercus robur, Fraxinus excelsior, Pinus sylvestris, and Picea abies. Adjustments and refinements may be necessary for a more accurate representation or to capture specific features of each tree. Keep in mind that this is a highly stylized and abstract representation for illustrative purposes. For detailed botanical illustrations of trees, specialized graphic tools and techniques may be more appropriate.”

ChatGPT 3.5

Some trees by ChatGPT in ggplot2. L-R: oak, ash, pine, spruce

Stylized is definitely the word. I was hoping that it would draw the broadleaf trees with a circle on top and the conifers as a triangle, and while the latter part of this happened, both the oak and the ash have just been shown as trunks. It seems that ChatGPT isn’t really that interested in an archetypal anything but is just making it up as it goes along. I tried again, this time specifying that it show trunks and leaves:

The same trees with trunks and leaves. L-R: oak, ash, pine, spruce

Better, kind of, but still not quite what I was expecting. I wasn’t sure how much I believed that ChatGPT really knew what it was doing here, so I asked the exact same question again but switched up the order of the trees to oak, spruce, ash, pine. This is what it came up with:

The same trees with trunks and leaves. L-R: oak, spruce, pine, ash

My confidence that ChatGPT was using any comprehensible logic to inform these plots at all was dwindling rapidly.

Whatever, in all of these cases we’ve gone in with a priori knowledge about what we’re looking at, which has really affected our interpretation (indeed, over-interpretation) of what ChatGPT has done. If someone showed you the lichen picture without telling you it was lichen, would you make that link? Probably not. The Cladonia could just as easily be a collapsed Christmas tree, or a bunch of basketballs on a grassy mountainside. Would the tree pictures fare any better? Perhaps, particularly with the addition of the ‘trunk and branches’ rule, but these are still highly abstract views. ChatGPT isn’t really telling us how to draw trees and lichens, but instead supplying an amalgamation of information about weirdly filtered copies of copies of something that somebody said was a tree or a lichen over infinite iterations, through the lens of 1,000 different pieces of code that were written to do something else entirely…. you get the idea.

Finally, I asked ChatGPT to draw me a lichen and a tree. ChatGPT thought I asked for a lichen on a tree and the outcome, a rusty rectangular Sputnik sporting a single tiny green orb, speaks for itself.

A lichen on a tree, apparently.

So what to conclude from all this? ChatGPT is good for many things, but it isn’t good at making things up. It needs prompts, both from you and from within the source material it’s drawing from, in order to generate content. The less information it has on either side, the worse it’s going to do, but at the same time it doesn’t know what to do with too much information. It’s also not particularly consistent; rules that might seem obvious to us aren’t automatically applied, and even if we supply a set of rules to follow, they likely won’t be adhered to in ways we expect. Arguably, the most useful thing that ChatGPT has done in this instance is to have made me write this blog and think about things in a way I wouldn’t have done otherwise. All scientific and artistic innovation throughout history has come about as a result of both the advances and limitations of the technology available at the time, with many interesting questions arising at the boundaries of what is possible. I think a particularly useful thing that can come out of AI is not specific answers to questions, but insight into how we actually frame those questions ourselves.

NC